WorldWideScience

Sample records for complex event processing

  1. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

Streaming model transformations represent a novel class of transformations for manipulating models whose elements are continuously produced or modified in high volume and at a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations for streaming model transformations by innovatively integrating incremental model queries, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.
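As an illustrative sketch only (not code from the paper), the following Python snippet shows the kind of ordered event-pattern matching a CEP engine performs over a stream of model change events; the event types and window size are hypothetical:

```python
from collections import deque

def match_sequence(stream, pattern, window=5):
    """Report windows of the stream that contain the event types of `pattern`
    in order (a much-simplified CEP sequence operator)."""
    matches, buf = [], deque(maxlen=window)
    for ev in stream:
        buf.append(ev)
        it = iter(buf)  # ordered-subsequence test over the current window
        if all(any(e["type"] == p for e in it) for p in pattern):
            matches.append(list(buf))
            buf.clear()
    return matches

# hypothetical model change events produced by incremental queries
stream = [{"type": "create"}, {"type": "noise"},
          {"type": "modify"}, {"type": "delete"}]
hits = match_sequence(stream, ["create", "modify", "delete"])
```

A real engine would also handle timing constraints and partial matches; this sketch only captures the ordered-occurrence idea.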

  2. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision processes model with predicted states is proposed as the sequential decision model.

  3. Real-time monitoring of clinical processes using complex event processing and transition systems.

    Science.gov (United States)

    Meinecke, Sebastian

    2014-01-01

Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events, identified via the behaviour of IT systems, using Complex Event Processing. Furthermore, we map these events onto transition systems to monitor crucial clinical processes in real time, in order to prevent and detect erroneous situations.

  4. Compliance with Environmental Regulations through Complex Geo-Event Processing

    Directory of Open Access Journals (Sweden)

    Federico Herrera

    2017-11-01

Full Text Available In a context of e-government, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is further increased by data coming from autonomous sensors (e.g., reporting carbon emissions in protected areas) and from citizens voluntarily providing information (e.g., reports of illegal dumping). Complex Event Processing (CEP) technologies allow processing large amounts of event data and detecting patterns within them. However, they do not provide native support for the geographic dimension of events, which is essential for monitoring requirements that apply to specific geographic areas. This paper proposes a geospatial extension for CEP that allows monitoring environmental requirements while considering the geographic location of the processed data. We extend an existing platform-independent, model-driven approach for CEP, adding geographic location to events and specifying patterns using geographic operators. The use and technical feasibility of the proposal are shown through the development of a case study and the implementation of a prototype.
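A minimal sketch of the kind of geographic operator the paper motivates, assuming hypothetical event fields `lon`/`lat`: a ray-casting point-in-polygon test used as a geo-filter before CEP pattern evaluation (illustrative Python, not the authors' implementation):

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test; poly is a list of (x, y) vertices."""
    inside, n = False, len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def within(events, area):
    """Geo-filter: keep only events whose location falls inside the area."""
    return [e for e in events if point_in_polygon(e["lon"], e["lat"], area)]

# hypothetical protected area and sensor events
protected_area = [(0, 0), (10, 0), (10, 10), (0, 10)]
events = [{"lon": 5, "lat": 5, "type": "emission"},
          {"lon": 20, "lat": 5, "type": "emission"}]
```

In a full system this filter would sit in front of the CEP pattern operators, so that only events inside the monitored area reach the compliance rules.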

  5. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision processes model with predicted states is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
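The Q-learning component can be sketched on a toy traffic MDP (a two-state illustration of our own devising, not the paper's networked distributed model):

```python
import random

random.seed(0)  # reproducible toy run

def q_learning(transitions, rewards, n_states, n_actions,
               episodes=200, steps=20, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a deterministic toy MDP.
    transitions[s][a] -> next state, rewards[s][a] -> immediate reward."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = random.randrange(n_states)  # random start state
        for _ in range(steps):
            a = (random.randrange(n_actions) if random.random() < eps
                 else max(range(n_actions), key=lambda a: Q[s][a]))
            s2, r = transitions[s][a], rewards[s][a]
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

# Toy model: state 0 = free flow, state 1 = congested.
# Action 0 = keep signal timing, action 1 = adapt timing.
transitions = [[0, 0], [1, 0]]   # adapting while congested restores free flow
rewards     = [[1, 0], [-1, 0]]  # congestion is penalized
Q = q_learning(transitions, rewards, n_states=2, n_actions=2)
policy = [max(range(2), key=lambda a: Q[s][a]) for s in range(2)]
```

The learned greedy policy keeps the timing while traffic flows freely and adapts it under congestion, which is the qualitative behavior the paper's controller aims for at much larger scale.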

  6. Real-time complex event processing for cloud resources

    Science.gov (United States)

    Adam, M.; Cordeiro, C.; Field, L.; Giordano, D.; Magnoni, L.

    2017-10-01

The ongoing integration of clouds into the WLCG raises the need for detailed health and performance monitoring of virtual resources in order to prevent degraded service and interruptions due to undetected failures. When working at scale, the existing monitoring diversity can lead to a metric overflow, whereby operators need to manually collect and correlate data from several monitoring tools and frameworks, resulting in tens of different metrics to be constantly interpreted and analyzed per virtual machine. In this paper we present an Esper-based standalone application which is able to process complex monitoring events coming from various sources and automatically interpret the data in order to issue alarms on the status of the resources, without interfering with the actual resources and data sources. We describe how this application has been used with both commercial and non-commercial cloud activities, allowing operators to be alerted quickly and to react to misbehaving VMs and LHC experiments’ workflows. We present the pattern-analysis mechanisms being used, as well as the surrounding Elastic and REST API interfaces where the alarms are collected and served to users.
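A minimal Python sketch (not the actual Esper EPL used in the application) of the kind of rule such a system evaluates: raise an alarm when a per-VM metric's sliding-window average crosses a threshold; the resource name and threshold are hypothetical:

```python
from collections import defaultdict, deque

class MetricAlarm:
    """CEP-style rule sketch: alarm when the average of the last `n` samples
    of a metric exceeds `threshold` for a given resource."""
    def __init__(self, n=3, threshold=90.0):
        self.windows = defaultdict(lambda: deque(maxlen=n))
        self.threshold = threshold

    def on_event(self, resource, value):
        w = self.windows[resource]
        w.append(value)
        # only evaluate once the window is full, as a window-based CEP rule would
        if len(w) == w.maxlen and sum(w) / len(w) > self.threshold:
            return f"ALARM {resource}: avg={sum(w) / len(w):.1f}"
        return None

alarm = MetricAlarm(n=3, threshold=90.0)
results = [alarm.on_event("vm-42", v) for v in (85, 95, 97, 99)]
```

The real application layers many such rules and correlates across sources; the sketch only shows the per-metric windowed evaluation.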

  7. Compliance with Environmental Regulations through Complex Geo-Event Processing

    OpenAIRE

    Federico Herrera; Laura González; Daniel Calegari; Bruno Rienzi

    2017-01-01

In a context of e-government, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is further increased by data coming from autonomous sensors (e.g., reporting carbon emissions in protected areas) and from citizens voluntarily providing information (e.g., reports of illegal dumping). Complex Event Processing (CEP) technologies allow processing large amounts of event data and detecting patterns within them. However, they do not provide native support for the geographic dimension of events, which is essential for monitoring requirements that apply to specific geographic areas. This paper proposes a geospatial extension for CEP that allows monitoring environmental requirements while considering the geographic location of the processed data.

  8. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are pre-decided by experts, increases the difficulty of effective complex event processing. It inevitably involves various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limitations of human subjective judgment. D numbers are a recent mathematical tool for modeling such uncertainty, since they relax the condition that elements of the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
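A compact sketch of the TOPSIS step the method builds on (plain TOPSIS in Python, without the D-numbers extension; the criteria and weights below are made up for illustration):

```python
def topsis(matrix, weights, benefit):
    """TOPSIS ranking. matrix: alternatives x criteria; benefit[j] is True if
    criterion j is to be maximized. Returns closeness scores in (0, 1)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the criterion weights
    norms = [sum(matrix[i][j] ** 2 for i in range(m)) ** 0.5 for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [(max if benefit[j] else min)(V[i][j] for i in range(m)) for j in range(n)]
    worst = [(min if benefit[j] else max)(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = sum((V[i][j] - ideal[j]) ** 2 for j in range(n)) ** 0.5
        d_neg = sum((V[i][j] - worst[j]) ** 2 for j in range(n)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal
    return scores

# two hypothetical pattern rules scored on (match confidence, processing cost)
scores = topsis([[0.9, 2.0], [0.6, 1.0]], weights=[0.7, 0.3], benefit=[True, False])
```

The paper replaces the crisp criterion values with D numbers to carry the experts' uncertainty through this ranking.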

  9. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features: easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams into event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundary that exists between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  10. Survey of Applications of Complex Event Processing (CEP) in Health Domain

    Directory of Open Access Journals (Sweden)

    Nadeem Mahmood

    2017-12-01

Full Text Available It is always difficult to manage the production of huge amounts of data coming from multiple sources and to extract meaningful information from them in order to make appropriate decisions. When data come from various input sources, Complex Event Processing (CEP), one of the strong functionalities of Business Intelligence (BI), is an appropriate solution for obtaining the required streams of events from this complex input network. Real-time processing, pattern matching, stream processing, big data management, sensor data processing and many more are application areas of CEP. The health domain is itself multi-dimensional, covering hospital supply chains, OPD management, disease diagnostics, in-patient and out-patient management, emergency care, etc. In this paper, the main focus is to discuss the application areas of Complex Event Processing (CEP) in the health domain using sensor devices: how CEP manipulates health data-set events coming from sensors such as blood pressure, heart rate, fall detection, sugar level, temperature or any other vital signs, and how these systems respond to such events as quickly as possible. Different existing models and applications using CEP are discussed and summarized according to their characteristics.
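As a hypothetical illustration (not from any surveyed system), a composite vital-sign rule of the kind CEP engines evaluate might look like this in Python: flag a risk when a high heart rate and a low systolic pressure co-occur within a time window; all thresholds and field names are invented:

```python
def detect_shock_risk(events, window=60):
    """Hypothetical composite CEP rule: alert when a high heart rate (>120 bpm)
    and a low systolic pressure (<90 mmHg) occur within `window` seconds."""
    alerts, recent = [], []
    for ev in sorted(events, key=lambda e: e["t"]):
        recent = [e for e in recent if ev["t"] - e["t"] <= window]
        recent.append(ev)
        high_hr = any(e["sensor"] == "hr" and e["value"] > 120 for e in recent)
        low_bp = any(e["sensor"] == "bp_sys" and e["value"] < 90 for e in recent)
        if high_hr and low_bp:
            alerts.append(ev["t"])
            recent = []  # consume the matched events
    return alerts

# hypothetical sensor readings with logical timestamps in seconds
events = [{"t": 0,   "sensor": "hr",     "value": 130},
          {"t": 30,  "sensor": "bp_sys", "value": 85},
          {"t": 500, "sensor": "hr",     "value": 80}]
```

The combination-within-a-window structure is the common core of the health-domain CEP rules the survey discusses.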

  11. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  12. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2013-10-09

Emerging Big Data applications in areas like e-commerce and the energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high-performance online query scheme which in particular deals with the velocity aspect of the 3-Vs of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over real-time data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques to achieve interactive query processing over real-time data, featuring efficient online querying, dynamic stream-data persistence and on-demand access. The system architecture is presented and the current implementation status is reported.

  13. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  14. Why Rules Matter in Complex Event Processing...and Vice Versa

    Science.gov (United States)

    Vincent, Paul

Many commercial and research CEP solutions are moving beyond simple stream query languages to more complete definitions of "process", and thence to "decisions" and "actions". As event processing capabilities increase, there is a growing realization that the humble "rule" is as relevant to the event cloud as it is to specific services. Less obvious is how much event processing has to offer process and rule execution and management technologies. Does event processing change the way we should manage businesses, processes and services, together with their embedded (and hopefully managed) rulesets?

  15. ASPIE: A Framework for Active Sensing and Processing of Complex Events in the Internet of Manufacturing Things

    Directory of Open Access Journals (Sweden)

    Shaobo Li

    2018-03-01

Full Text Available Rapid perception and processing of critical monitoring events are essential to ensure the healthy operation of Internet of Manufacturing Things (IoMT)-based manufacturing processes. In this paper, we propose a framework, the active sensing and processing architecture (ASPIE), for active sensing and processing of critical events in IoMT-based manufacturing, based on the characteristics of the IoMT architecture as well as its perception model. A relation model of complex events in manufacturing processes, together with related operators and unified XML-based semantic definitions, is developed to effectively process the large volumes of complex event data. A template-based processing method for complex events is further introduced to conduct complex event matching using the Apriori frequent-item mining algorithm. To evaluate the proposed models and methods, we developed a software platform based on ASPIE for a local chili sauce manufacturing company, which demonstrated the feasibility and effectiveness of the proposed methods for active perception and processing of complex events in IoMT-based manufacturing.
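The Apriori step can be sketched as follows (a generic, unpruned implementation in Python; the event types are invented for illustration and are not from the ASPIE case study):

```python
def apriori(transactions, min_support=2):
    """Classic Apriori sketch (no subset pruning): return frequent event-type
    itemsets and their support counts."""
    items = {i for t in transactions for i in t}
    frequent = {}
    k, k_sets = 1, [frozenset([i]) for i in sorted(items)]
    while k_sets:
        counts = {s: sum(1 for t in transactions if s <= t) for s in k_sets}
        survivors = {s: c for s, c in counts.items() if c >= min_support}
        frequent.update(survivors)
        k += 1
        # candidates: unions of surviving itemsets that grow the size by one
        k_sets = list({a | b for a in survivors for b in survivors
                       if len(a | b) == k})
    return frequent

# hypothetical machine-event logs, one frozenset of event types per time window
logs = [frozenset(t) for t in
        [("overheat", "vibration"), ("overheat", "vibration", "jam"), ("overheat",)]]
freq = apriori(logs, min_support=2)
```

Frequent itemsets like {overheat, vibration} then become candidate templates for complex event matching, which is the role Apriori plays in the paper's pipeline.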

  16. A Validation System for the Complex Event Processing Directives of the ATLAS Shifter Assistant Tool

    CERN Document Server

    Anders, Gabriel; The ATLAS collaboration; Kazarov, Andrei; Lehmann Miotto, Giovanna; Santos, Alejandro; Soloviev, Igor

    2015-01-01

Complex Event Processing (CEP) is a methodology that combines data from different sources in order to identify events or patterns that need particular attention. It has gained a lot of momentum in the computing world in the past few years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions and to guide the experiment’s operators. This technology is very powerful if experts regularly insert and update their knowledge about the system’s behaviour into the CEP engine. Nevertheless, writing or modifying CEP directives is not trivial, since the programming paradigm used is quite different from what developers are normally familiar with. In order to help experts verify that the directives work as expected, we have developed a complete testing and validation environment. This system consists of three main parts: the first is the persistent storage of all relevant data streams that are produced during data taking, the second is a...

  17. A Validation System for the Complex Event Processing Directives of the ATLAS Shifter Assistant Tool

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration; Avolio, Giuseppe; Kazarov, Andrei; Lehmann Miotto, Giovanna; Soloviev, Igor

    2015-01-01

Complex Event Processing (CEP) is a methodology that combines data from many sources in order to identify events or patterns that need particular attention. It has gained a lot of momentum in the computing world in the past few years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions and to guide the experiment’s operators. This technology is very powerful if experts regularly insert and update their knowledge about the system’s behaviour into the CEP engine. Nevertheless, writing or modifying CEP rules is not trivial, since the programming paradigm used is quite different from what developers are normally familiar with. In order to help experts verify that the directives work as expected, we have developed a complete testing and validation environment. This system consists of three main parts: the first is the data reader from existing storage of all relevant data streams that are produced during data taking, the second...

  18. Prediction of Increasing Production Activities using Combination of Query Aggregation on Complex Events Processing and Neural Network

    Directory of Open Access Journals (Sweden)

    Achmad Arwan

    2016-07-01

Full Text Available Production, orders, sales, and shipments are series of interrelated events within the manufacturing industry, and the results of these events are recorded in an event log. Complex event processing is a method used to analyze whether certain combinations of events (opportunities/threats) occur in a system, so that they can be addressed quickly and appropriately. An artificial neural network is the method we use to classify production-increase activities. The series of events that cause an increase in production are used as training data to obtain the activation function of the neural network. Aggregated counts from the event log are fed into the neural network inputs to compute the activation value. When the activation value exceeds a given threshold, the system emits a signal to increase production; otherwise, it continues monitoring events. Experimental results show that the accuracy of this method is 77% over 39 series of event streams.
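The aggregation-plus-activation step can be sketched as a single sigmoid neuron (the weights and counts below are invented, standing in for values learned from the event log):

```python
import math

def predict_increase(event_counts, weights, bias, threshold=0.5):
    """Single-neuron sketch: weighted aggregate event counts pass through a
    sigmoid; signal a production increase when the activation beats a threshold."""
    z = sum(w * x for w, x in zip(weights, event_counts)) + bias
    activation = 1.0 / (1.0 + math.exp(-z))
    return activation > threshold, activation

# hypothetical learned weights; counts are (orders, sales, shipments) per window
fire, a = predict_increase([40, 55, 35], weights=[0.02, 0.03, 0.01], bias=-2.5)
```

In the paper the weights come from training on labeled event-log windows; the thresholded activation is what triggers the increase-production signal.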

  19. Pre-attentive processing of spectrally complex sounds with asynchronous onsets: an event-related potential study with human subjects.

    Science.gov (United States)

    Tervaniemi, M; Schröger, E; Näätänen, R

    1997-05-23

Neuronal mechanisms involved in the processing of complex sounds with asynchronous onsets were studied in reading subjects. The sound onset asynchrony (SOA) between the leading partial and the remaining complex tone was varied between 0 and 360 ms. Infrequently occurring deviant sounds (in which one out of 10 harmonics differed in pitch from the frequently occurring standard sound) elicited the mismatch negativity (MMN), a change-specific cortical event-related potential (ERP) component. This indicates that the pitch of the standard stimuli had been pre-attentively encoded by sensory-memory traces. Moreover, when the complex-tone onset fell within the temporal integration window initiated by the leading-partial onset, the deviants also elicited the N2b component, indicating that an involuntary attention switch towards the sound change had occurred. In summary, the present results support the existence of a pre-perceptual integration mechanism of 100-200 ms duration and emphasize its importance in switching attention towards a stimulus change.

  20. Using Complex Event Processing (CEP) and vocal synthesis techniques to improve comprehension of sonified human-centric data

    Science.gov (United States)

    Rimland, Jeff; Ballora, Mark

    2014-05-01

    The field of sonification, which uses auditory presentation of data to replace or augment visualization techniques, is gaining popularity and acceptance for analysis of "big data" and for assisting analysts who are unable to utilize traditional visual approaches due to either: 1) visual overload caused by existing displays; 2) concurrent need to perform critical visually intensive tasks (e.g. operating a vehicle or performing a medical procedure); or 3) visual impairment due to either temporary environmental factors (e.g. dense smoke) or biological causes. Sonification tools typically map data values to sound attributes such as pitch, volume, and localization to enable them to be interpreted via human listening. In more complex problems, the challenge is in creating multi-dimensional sonifications that are both compelling and listenable, and that have enough discrete features that can be modulated in ways that allow meaningful discrimination by a listener. We propose a solution to this problem that incorporates Complex Event Processing (CEP) with speech synthesis. Some of the more promising sonifications to date use speech synthesis, which is an "instrument" that is amenable to extended listening, and can also provide a great deal of subtle nuance. These vocal nuances, which can represent a nearly limitless number of expressive meanings (via a combination of pitch, inflection, volume, and other acoustic factors), are the basis of our daily communications, and thus have the potential to engage the innate human understanding of these sounds. Additionally, recent advances in CEP have facilitated the extraction of multi-level hierarchies of information, which is necessary to bridge the gap between raw data and this type of vocal synthesis. We therefore propose that CEP-enabled sonifications based on the sound of human utterances could be considered the next logical step in human-centric "big data" compression and transmission.
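The basic parameter-mapping step underlying such sonifications (before any CEP layering or vocal synthesis) can be sketched as a linear value-to-pitch map; the numeric ranges here are arbitrary:

```python
def sonify(value, v_min, v_max, f_min=220.0, f_max=880.0):
    """Linearly map a data value onto a pitch range in Hz (two octaves above
    A3 by default), clamping out-of-range values."""
    v = max(v_min, min(v_max, value))  # clamp so outliers stay audible
    frac = (v - v_min) / (v_max - v_min)
    return f_min + frac * (f_max - f_min)

# map sample readings on a 0-100 scale to frequencies
freqs = [sonify(v, 0, 100) for v in (0, 50, 100, 150)]
```

The authors' proposal goes further, letting CEP-derived hierarchies drive many such parameters of a speech synthesizer at once rather than a single pitch.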

  1. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    J. King

    2004-01-01

The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  2. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  3. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    P. Sanchez

    2004-11-08

The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either "Included" or "Excluded," is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  4. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    P. Sanchez

    2004-01-01

The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either "Included" or "Excluded," is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  5. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

Full Text Available The aim of this paper is to find and compare existing complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed decision-making method. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: processing in real time, the ability to process high-volume data from multiple sources, platform independence, the possibility of integration with a user solution, and an open license. First we discuss existing CEP tools and their specific uses in practice. Then we present the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on two platforms which seem to be the best fit for integrating our solution, and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated decision-support method.

  6. Controlling extreme events on complex networks

    Science.gov (United States)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed.

  7. Complex Event Extraction using DRUM

    Science.gov (United States)

    2015-10-01

    [Only extraction fragments of this report's text survive.] Figure 9: evaluation results for eleven teams; the diamond represents the results of our system. The fragments also reference the Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora (EMNLP/VLC-2000) and the UniProt database.

  8. LHCb Online event processing and filtering

    Science.gov (United States)

    Alessio, F.; Barandela, C.; Brarda, L.; Frank, M.; Franek, B.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Köstner, S.; Moine, G.; Neufeld, N.; Somogyi, P.; Stoica, R.; Suman, S.

    2008-07-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed.
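The event-parallel filtering described here can be caricatured in a few lines (a toy model with an invented selection function; the real high-level trigger runs reconstruction and physics selections on a farm of PC servers): events fan out to workers, each decides accept/reject independently, and roughly 1 event in 500 survives to the formatting layer.

```python
from concurrent.futures import ThreadPoolExecutor

def filter_event(event):
    """Invented stand-in for the trigger decision: keep ~1 event in 500."""
    return event if event["id"] % 500 == 0 else None

def run_filter_farm(events, workers=4):
    """Events are independent, so filtering parallelises on an event basis
    (the real farm uses many PC servers rather than threads)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [e for e in pool.map(filter_event, events) if e is not None]

events = [{"id": i} for i in range(10_000)]
accepted = run_filter_farm(events)
print(len(accepted))  # 20 events survive the 1/500 reduction
```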

  9. LHCb Online event processing and filtering

    International Nuclear Information System (INIS)

    Alessio, F; Barandela, C; Brarda, L; Frank, M; Gaspar, C; Herwijnen, E v; Jacobsson, R; Jost, B; Koestner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S; Franek, B; Galli, D

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed

  10. CRITICAL EVENTS IN CONSTRUCTION PROCESS

    DEFF Research Database (Denmark)

    Jørgensen, Kirsten; Rasmussen, Grane Mikael Gregaard

    2009-01-01

    Function failures, defects and poor communication are major problems in the construction industry. These failures and defects are caused by a row of critical events in the construction process. The purpose of this paper is to define "critical events" in the construction process and to investigate the cause-effects of failures and defects in the construction industry using an analytical approach (the bowtie model) developed in accident research. Using this model clarifies the relationships within the chain of failures that causes critical events with undesirable consequences. In this way the causes of failures and the relationships between various failures are rendered visible. A large construction site was observed from start to finish as the empirical element in the research. The research focuses on all kinds of critical events identified throughout every phase during

  11. LHCb Online event processing and filtering

    CERN Document Server

    Alessio, F; Brarda, L; Frank, M; Franek, B; Galli, D; Gaspar, C; Van Herwijnen, E; Jacobsson, R; Jost, B; Köstner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. ...

  12. Complexity rating of abnormal events and operator performance

    International Nuclear Information System (INIS)

    Oeivind Braarud, Per

    1998-01-01

    The complexity of the work situation during abnormal situations is a major topic in any discussion of the safety aspects of nuclear power plants. An understanding of complexity and its impact on operator performance in abnormal situations is important. One way to enhance understanding is to look at the dimensions that constitute complexity for NPP operators, and how those dimensions can be measured. A further step is to study how dimensions of event complexity are related to operator performance. One aspect of complexity is the operator's subjective experience of the difficulty of the event. Another related aspect is subject-matter experts' ratings of the complexity of the event. A definition and a measure of this part of complexity are being investigated at the OECD Halden Reactor Project in Norway. This paper focuses on the results from a study of simulated scenarios carried out in the Halden Man-Machine Laboratory, which is a full-scope PWR simulator. Six crews of two licensed operators each performed in 16 scenarios (simulated events). Before the experiment, subject-matter experts rated the complexity of the scenarios using a Complexity Profiling Questionnaire, which contains eight previously identified dimensions associated with complexity. After completing the scenarios, the operators received a questionnaire containing 39 questions about perceived complexity; this questionnaire was used to develop a measure of subjective complexity. The results indicated that process experts' ratings of scenario complexity, using the Complexity Profiling Questionnaire, predicted crew performance quite well. The results further indicated that a measure of subjective complexity related to crew performance could be developed. Subjective complexity was found to be related to subjective workload. (author)

  13. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  14. Complexity in Evolutionary Processes

    International Nuclear Information System (INIS)

    Schuster, P.

    2010-01-01

    Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. The occurrence and nature of such phase transitions depend critically on fitness landscapes. The fitness landscape, being tantamount to a mapping from sequence or genotype space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to provide a major challenge for understanding evolutionary processes (author)
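The localization threshold can be illustrated numerically with Eigen's quasispecies model on a single-peak fitness landscape (a standard textbook setting, not code from this paper; sequence length and fitness values are chosen for illustration): below the critical per-site error rate, roughly ln(σ)/L for master fitness advantage σ, the stationary population concentrates on the master sequence; above it, the population delocalizes over sequence space.

```python
import itertools
import numpy as np

L, sigma = 8, 10.0                      # sequence length, master fitness advantage
seqs = list(itertools.product([0, 1], repeat=L))
master = seqs[0]
fitness = np.array([sigma if s == master else 1.0 for s in seqs])

def master_frequency(u):
    """Stationary master-sequence frequency of the mutation-selection matrix."""
    # Q[i, j]: probability that replicating sequence j yields sequence i,
    # with independent per-site copying errors at rate u.
    d = np.array([[sum(a != b for a, b in zip(si, sj)) for sj in seqs]
                  for si in seqs])
    Q = (u ** d) * ((1 - u) ** (L - d))
    W = Q * fitness                     # column j weighted by fitness of j
    vals, vecs = np.linalg.eig(W)
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    return (v / v.sum())[0]

u_c = np.log(sigma) / L                 # error-threshold estimate, ~0.29 here
print(master_frequency(0.05), master_frequency(0.45))
```

The two printed frequencies straddle the transition: well below u_c the master dominates the quasispecies, well above it the master is no more frequent than a generic sequence.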

  15. Construction and updating of event models in auditory event processing.

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. The ATLAS Event Service: A New Approach to Event Processing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00070566; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre­staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabi...

  17. Buffer of Events as a Markovian Process

    International Nuclear Information System (INIS)

    Berdugo, J.; Casaus, J.; Mana, C.

    2001-01-01

    In Particle and Astro-Particle Physics experiments, the events which get through the detectors are read and processed on-line before they are stored for more detailed processing and future physics analysis. Since the events are read and, usually, processed sequentially, the time involved in these operations can lead to a significant loss of events, which is, to some extent, reduced by using buffers. We present an estimate of the optimum buffer size and the fraction of events lost for a simple experimental condition, which serves as an introductory example to the use of Markov chains. (Author)
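The buffer can be modeled as a birth-death Markov chain, i.e. an M/M/1/K queue (the rates below are illustrative, not those of the paper): events arrive at rate λ, are processed at rate μ, and an arriving event is lost when the buffer of size K is full. The chain's stationary distribution gives the loss fraction in closed form.

```python
def buffer_loss_fraction(lam, mu, K):
    """Loss probability of an M/M/1/K queue: the stationary probability that
    an arriving event finds the buffer full (by the PASTA property)."""
    rho = lam / mu
    weights = [rho ** n for n in range(K + 1)]   # unnormalized stationary probs
    total = sum(weights)
    return weights[K] / total

# Loss fraction shrinks as the buffer grows (lam=0.8, mu=1.0 are illustrative)
for K in (1, 4, 16):
    print(K, buffer_loss_fraction(0.8, 1.0, K))
```

This is the kind of trade-off the paper estimates: the smallest K for which the loss fraction is acceptable is the optimum buffer size.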

  18. Buffer of Events as a Markovian Process

    Energy Technology Data Exchange (ETDEWEB)

    Berdugo, J.; Casaus, J.; Mana, C.

    2001-07-01

    In Particle and Astro-Particle Physics experiments, the events which get through the detectors are read and processed on-line before they are stored for more detailed processing and future physics analysis. Since the events are read and, usually, processed sequentially, the time involved in these operations can lead to a significant loss of events, which is, to some extent, reduced by using buffers. We present an estimate of the optimum buffer size and the fraction of events lost for a simple experimental condition, which serves as an introductory example to the use of Markov chains. (Author)

  19. Service Processes as a Sequence of Events

    NARCIS (Netherlands)

    P.C. Verhoef (Peter); G. Antonides (Gerrit); A.N. de Hoog

    2002-01-01

    textabstractIn this paper the service process is considered as a sequence of events. Using theory from economics and psychology a model is formulated that explains how the utility of each event affects the overall evaluation of the service process. In this model we especially account for the

  20. Third Dutch Process Security Control Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2009-01-01

    On June 4th, 2009, the third Dutch Process Control Security Event took place in Amsterdam. The event, organised by the Dutch National Infrastructure against Cybercrime (NICC), attracted both Dutch process control experts and members of the European SCADA and Control Systems Information Exchange

  1. Mining process performance from event logs

    NARCIS (Netherlands)

    Adriansyah, A.; Buijs, J.C.A.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In systems where process executions are not strictly enforced by a predefined process model, obtaining reliable performance information is not trivial. In this paper, we analyzed an event log of a real-life process, taken from a Dutch financial institute, using process mining techniques. In

  2. Post-event processing in social anxiety.

    Science.gov (United States)

    Dannahy, Laura; Stopa, Lusia

    2007-06-01

    Clark and Wells' [1995. A cognitive model of social phobia. In: R. Heimberg, M. Liebowitz, D.A. Hope, & F.R. Schneier (Eds.) Social phobia: Diagnosis, assessment and treatment (pp. 69-93). New York: Guildford Press.] cognitive model of social phobia proposes that following a social event, individuals with social phobia will engage in post-event processing, during which they conduct a detailed review of the event. This study investigated the relationship between self-appraisals of performance and post-event processing in individuals high and low in social anxiety. Participants appraised their performance immediately after a conversation with an unknown individual and prior to an anticipated second conversation task 1 week later. The frequency and valence of post-event processing during the week following the conversation was also assessed. The study also explored differences in the metacognitive processes of high and low socially anxious participants. The high socially anxious group experienced more anxiety, predicted worse performance, underestimated their actual performance, and engaged in more post-event processing than low socially anxious participants. The degree of negative post-event processing was linked to the extent of social anxiety and negative appraisals of performance, both immediately after the conversation task and 1 week later. Differences were also observed in some metacognitive processes. The results are discussed in relation to current theory and previous research.

  3. Epidemic processes in complex networks

    OpenAIRE

    Pastor Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-01-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The t...

  4. Historical events of the Chemical Processing Department

    Energy Technology Data Exchange (ETDEWEB)

    Lane, W.A.

    1965-11-12

    The purpose of this report is to summarize and document the significant historical events pertinent to the operation of the Chemical Processing facilities at Hanford. The report covers, in chronological order, the major construction activities and historical events from 1944 to September, 1965. Also included are the production records achieved and a history of the department's unit cost performance.

  5. First Dutch Process Control Security Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2008-01-01

    On May 21st , 2008, the Dutch National Infrastructure against Cyber Crime (NICC) organised their first Process Control Security Event. Mrs. Annemarie Zielstra, the NICC programme manager, opened the event. She welcomed the over 100 representatives of key industry sectors. “Earlier studies in the

  6. Fourth Dutch Process Security Control Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.; Zielstra, A.

    2010-01-01

    On December 1st, 2009, the fourth Dutch Process Control Security Event took place in Baarn, The Netherlands. The security event with the title ‘Manage IT!’ was organised by the Dutch National Infrastructure against Cybercrime (NICC). Mid of November, a group of over thirty people participated in the

  7. Recurrent process mining with live event data

    NARCIS (Netherlands)

    Syamsiyah, A.; van Dongen, B.F.; van der Aalst, W.M.P.; Teniente, E.; Weidlich, M.

    2018-01-01

    In organizations, process mining activities are typically performed in a recurrent fashion, e.g. once a week, an event log is extracted from the information systems and a process mining tool is used to analyze the process’ characteristics. Typically, process mining tools import the data from a

  8. The ATLAS Event Service: A new approach to event processing

    Science.gov (United States)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
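The fine-grained, event-level bookkeeping described above can be sketched as a toy model (all class and function names are invented for illustration; the real Event Service uses PanDA, pilots, and remote object stores): each event's output is committed as soon as it is produced, so a worker that vanishes mid-job loses at most the event in flight, and a replacement worker resumes from the recorded state.

```python
# Toy sketch of event-level checkpointing (names invented for illustration).
class ObjectStore:
    """Stands in for a remote object store; keys are (job_id, event_id)."""
    def __init__(self):
        self.data = {}
    def put(self, key, value):
        self.data[key] = value
    def done_events(self, job_id):
        return {ev for (j, ev) in self.data if j == job_id}

def run_worker(store, job_id, events, process, crash_after=None):
    """Process events one at a time, committing each output immediately.
    Skips events already recorded, so restarts are idempotent."""
    done = store.done_events(job_id)
    for i, ev in enumerate(e for e in events if e not in done):
        if crash_after is not None and i >= crash_after:
            return False                      # simulate the resource vanishing
        store.put((job_id, ev), process(ev))
    return True

store = ObjectStore()
events = list(range(10))
run_worker(store, "job1", events, lambda e: e * e, crash_after=4)  # dies early
run_worker(store, "job1", events, lambda e: e * e)                 # picks up
print(sorted(store.done_events("job1")))  # all 10 events completed exactly once
```

The design choice mirrors the abstract: because state lives in the store rather than on the worker, a transient resource disappearing costs almost nothing.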

  9. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.
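The kind of time-based correlation a CEP engine performs can be sketched in a few lines (a generic illustration, not Esper's actual EPL or API; the metric and thresholds are invented): keep a sliding time window over a metric stream and raise an alert when the windowed average crosses a threshold.

```python
from collections import deque

class SlidingWindowDetector:
    """Alert when the average of events in the last `window` seconds
    exceeds `threshold` -- a minimal complex-event-processing pattern."""
    def __init__(self, window, threshold):
        self.window, self.threshold = window, threshold
        self.events = deque()       # (timestamp, value) pairs
        self.total = 0.0
    def on_event(self, ts, value):
        self.events.append((ts, value))
        self.total += value
        while self.events and self.events[0][0] <= ts - self.window:
            _, old = self.events.popleft()
            self.total -= old
        avg = self.total / len(self.events)
        return avg > self.threshold  # True => emit an alert

det = SlidingWindowDetector(window=60, threshold=100.0)
stream = [(0, 50), (10, 80), (20, 200), (30, 250)]  # queue-depth samples
alerts = [ts for ts, v in stream if det.on_event(ts, v)]
print(alerts)  # alerts fire once the 60 s average exceeds 100
```

In Esper the same pattern is a one-line declarative statement over a time window; the point of a CEP engine is that such correlations are expressed as queries rather than hand-rolled state machines like this one.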

  10. Top event prevention in complex systems

    International Nuclear Information System (INIS)

    Youngblood, R.W.; Worrell, R.B.

    1995-01-01

    A key step in formulating a regulatory basis for licensing complex and potentially hazardous facilities is identification of a collection of design elements that is necessary and sufficient to achieve the desired level of protection of the public, the workers, and the environment. Here, such a collection of design elements will be called a "prevention set." At the design stage, identifying a prevention set helps to determine what elements to include in the final design. Separately, a prevention-set argument could be used to limit the scope of regulatory oversight to a subset of design elements. This step can be taken during initial review of a design, or later as part of an effort to justify relief from regulatory requirements that are burdensome but provide little risk reduction. This paper presents a systematic approach to the problem of optimally choosing a prevention set.
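In fault-tree terms, a prevention set is a hitting set of the top event's minimal cut sets: keeping at least one element of every cut set functional blocks every path to the top event. A brute-force sketch (the cut sets and element names are invented for illustration; real analyses use far larger fault trees and smarter algorithms):

```python
from itertools import combinations

def minimal_prevention_sets(cut_sets):
    """Smallest sets of design elements that intersect every minimal cut set,
    so that guaranteeing those elements prevents the top event."""
    elements = sorted(set().union(*cut_sets))
    for size in range(1, len(elements) + 1):
        hits = [set(c) for c in combinations(elements, size)
                if all(set(c) & cs for cs in cut_sets)]
        if hits:
            return hits                 # all minimum-cardinality hitting sets

# Each cut set is a combination of failures sufficient for the top event.
cut_sets = [{"pump_A", "pump_B"}, {"pump_A", "valve"}, {"power"}]
print(minimal_prevention_sets(cut_sets))
```

Here "power" must be in every prevention set (it is a single-element cut set), and "pump_A" alone covers the two remaining cut sets.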

  11. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs
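The standard's 1-in-100-over-10,000-years criterion can be made concrete with a Poisson-process sketch (the event names and annual rates are invented for illustration): for an event with annual rate λ, the probability of at least one occurrence in T years is 1 − exp(−λT), which determines which of the standard's two release limits applies.

```python
import math

def occurrence_probability(annual_rate, horizon_years=10_000):
    """P(at least one event within the horizon) for a Poisson process."""
    return 1.0 - math.exp(-annual_rate * horizon_years)

# Illustrative annual rates for hypothetical disruptive events
for name, rate in [("fault movement", 1e-5), ("inadvertent intrusion", 1e-7)]:
    p = occurrence_probability(rate)
    limit = "strict limit" if p > 0.01 else "relaxed limit"
    print(f"{name}: P = {p:.4f} -> {limit}")
```

The hard part, as the abstract notes, is estimating λ for geologic processes at all; the arithmetic above is the easy step once a rate is defended.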

  12. Epidemic processes in complex networks

    Science.gov (United States)

    Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-07-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.
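A minimal discrete-time SIS (susceptible-infected-susceptible) simulation on a random graph conveys the kind of process the review analyzes (the graph, rates, and seed set are illustrative; the review itself treats these models analytically):

```python
import random

def sis_step(adj, infected, beta, mu, rng):
    """One synchronous SIS update: infected nodes transmit along each edge
    with probability beta, then recover with probability mu."""
    new = set(infected)
    for i in infected:
        for j in adj[i]:
            if j not in infected and rng.random() < beta:
                new.add(j)
    for i in infected:
        if rng.random() < mu:
            new.discard(i)
    return new

rng = random.Random(1)
n = 200
# Erdos-Renyi graph with mean degree ~6
adj = {i: [] for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if rng.random() < 6 / n:
            adj[i].append(j)
            adj[j].append(i)

infected = set(range(10))               # seed a few infected nodes
for _ in range(200):
    infected = sis_step(adj, infected, beta=0.2, mu=0.3, rng=rng)
print(len(infected))  # endemic state: a sizeable fraction of nodes infected
```

With these rates the effective spreading ratio exceeds the epidemic threshold, so the process settles into an endemic steady state rather than dying out, which is the regime the mean-field theories in the review describe.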

  13. An algebra of discrete event processes

    Science.gov (United States)

    Heymann, Michael; Meyer, George

    1991-01-01

    This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.
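The concurrency operators discussed above compose processes that synchronize on shared events. The standard synchronous product of two finite automata (a textbook construction, sketched here for illustration; it is not the report's new operator) captures the basic idea: shared events must fire jointly, private events interleave.

```python
def sync_product(t1, a1, t2, a2):
    """Synchronous product of two automata given as transition dicts
    {(state, event): next_state} with alphabets a1, a2."""
    shared = a1 & a2
    def step(s, event):
        s1, s2 = s
        if event in shared:
            if (s1, event) in t1 and (s2, event) in t2:
                return (t1[(s1, event)], t2[(s2, event)])
        elif event in a1:
            if (s1, event) in t1:
                return (t1[(s1, event)], s2)
        elif event in a2:
            if (s2, event) in t2:
                return (s1, t2[(s2, event)])
        return None                      # event disabled in the product
    return step

# Machine M cycles start->finish; buffer B accepts finish, then empty_out.
tM = {("idle", "start"): "busy", ("busy", "finish"): "idle"}
tB = {("empty", "finish"): "full", ("full", "empty_out"): "empty"}
step = sync_product(tM, {"start", "finish"}, tB, {"finish", "empty_out"})

state = ("idle", "empty")
for ev in ["start", "finish", "empty_out"]:
    state = step(state, ev)
print(state)  # back to ('idle', 'empty')
```

Because "finish" is shared, the machine can only finish a part when the buffer can accept it; this blocking-on-shared-events behavior is exactly what discrete event control exploits.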

  14. The analysis of a complex fire event using multispaceborne observations

    Directory of Open Access Journals (Sweden)

    Andrei Simona

    2018-01-01

    Full Text Available This study documents a complex fire event that occurred in October 2016 in a belligerent area of the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  15. The analysis of a complex fire event using multispaceborne observations

    Science.gov (United States)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a belligerent area of the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  16. Atmospheric processes over complex terrain

    Science.gov (United States)

    Banta, Robert M.; Berri, G.; Blumen, William; Carruthers, David J.; Dalu, G. A.; Durran, Dale R.; Egger, Joseph; Garratt, J. R.; Hanna, Steven R.; Hunt, J. C. R.

    1990-06-01

    A workshop on atmospheric processes over complex terrain, sponsored by the American Meteorological Society, was convened in Park City, Utah from 24 to 28 October 1988. The overall objective of the workshop was one of interaction and synthesis--interaction among atmospheric scientists carrying out research on a variety of orographic flow problems, and a synthesis of their results and points of view into an assessment of the current status of topical research problems. The final day of the workshop was devoted to an open discussion on the research directions that could be anticipated in the next decade because of new and planned instrumentation and observational networks, the recent emphasis on development of mesoscale numerical models, and continual theoretical investigations of thermally forced flows, orographic waves, and stratified turbulence. This monograph represents an outgrowth of the Park City Workshop. The authors have contributed chapters based on their lecture material. Workshop discussions indicated interest in both the remote sensing and predictability of orographic flows. These chapters were solicited following the workshop in order to provide a more balanced view of current progress and future directions in research on atmospheric processes over complex terrain.

  17. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-04-19

    The primary purpose of this analysis is to evaluate System Level features, events, and processes (FEPs). The System Level FEPs typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analysis and model reports. The System Level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. This evaluation determines which of the System Level FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the information presented in analysis reports, model reports, direct input, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  18. Complex Event Detection via Multi Source Video Attributes (Open Access)

    Science.gov (United States)

    2013-10-03

    Complex Event Detection via Multi-Source Video Attributes. Zhigang Ma, Yi Yang, Zhongwen Xu, Shuicheng Yan, Nicu Sebe, Alexander G. Hauptmann. ...under its International Research Centre @ Singapore Funding Initiative and administered by the IDM Programme Office, and the Intelligence Advanced

  19. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, ''Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation'' (BSC 2005 [DIRS 174995]), ''Clad Degradation--FEPs Screening Arguments'' (BSC 2004 [DIRS 170019]), and ''Waste-Form Features, Events, and Processes'' (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes resulted from a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project

  20. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    International Nuclear Information System (INIS)

    Jaros, W.

    2005-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, ''Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation'' (BSC 2005 [DIRS 174995]), ''Clad Degradation--FEPs Screening Arguments'' (BSC 2004 [DIRS 170019]), and ''Waste-Form Features, Events, and Processes'' (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes resulted from a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  1. Waste Form Features, Events, and Processes

    International Nuclear Information System (INIS)

    R. Schreiner

    2004-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of the waste form features, events and processes (FEPs) with respect to modeling used to support the Total System Performance Assessment for License Application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical bases for screening decisions. This information is required by the Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs addressed in this report deal with the issues related to the degradation and potential failure of the waste form and the migration of the waste form colloids. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA, (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical bases for exclusion from TSPA-LA (i.e., why the FEP is excluded). This revision addresses the TSPA-LA FEP list (DTN: MO0407SEPFEPLA.000 [DIRS 170760]). The primary purpose of this report is to identify and document the analyses and resolution of the features, events, and processes (FEPs) associated with the waste form performance in the repository. Forty FEPs were identified that are associated with the waste form performance. This report has been prepared to document the screening methodology used in the process of FEP inclusion and exclusion. The analyses documented in this report are for the license application (LA) base case design (BSC 2004 [DIRS 168489]). In this design, a drip shield is placed over the waste package and no backfill is placed over the drip shield (BSC 2004 [DIRS 168489]). Each FEP may include one or more specific issues that are collectively described by a FEP name and a FEP description. The FEP description may encompass a single feature, process or event, or a few closely related or coupled processes if the entire FEP can be addressed by a single specific screening argument or TSPA-LA disposition. The FEPs are

  2. Waste Form Features, Events, and Processes

    Energy Technology Data Exchange (ETDEWEB)

    R. Schreiner

    2004-10-27

    The purpose of this report is to evaluate and document the inclusion or exclusion of the waste form features, events and processes (FEPs) with respect to modeling used to support the Total System Performance Assessment for License Application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical bases for screening decisions. This information is required by the Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs addressed in this report deal with the issues related to the degradation and potential failure of the waste form and the migration of the waste form colloids. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA, (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical bases for exclusion from TSPA-LA (i.e., why the FEP is excluded). This revision addresses the TSPA-LA FEP list (DTN: MO0407SEPFEPLA.000 [DIRS 170760]). The primary purpose of this report is to identify and document the analyses and resolution of the features, events, and processes (FEPs) associated with the waste form performance in the repository. Forty FEPs were identified that are associated with the waste form performance. This report has been prepared to document the screening methodology used in the process of FEP inclusion and exclusion. The analyses documented in this report are for the license application (LA) base case design (BSC 2004 [DIRS 168489]). In this design, a drip shield is placed over the waste package and no backfill is placed over the drip shield (BSC 2004 [DIRS 168489]). Each FEP may include one or more specific issues that are collectively described by a FEP name and a FEP description. The FEP description may encompass a single feature, process or event, or a few closely related or coupled processes if the entire FEP can be addressed by a single specific screening argument or TSPA-LA disposition. The FEPs are

  3. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
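    The declarative, rule-based complex-event framework summarized above can be illustrated with a minimal sketch. The read format, rule, and function names here are illustrative assumptions, not the authors' actual system: a SEQ(shelf, exit) pattern with a time window is matched over a time-ordered stream of RFID reads.

    ```python
    from dataclasses import dataclass

    @dataclass
    class RFIDRead:
        tag_id: str      # unique EPC of the tagged object
        location: str    # reader location
        ts: float        # read timestamp (seconds)

    def detect_moves(reads, src, dst, window):
        """Detect the complex event 'tag moved from src to dst within window
        seconds' by matching a SEQ(src, dst) pattern over the read stream."""
        last_seen_at_src = {}
        events = []
        for r in sorted(reads, key=lambda r: r.ts):
            if r.location == src:
                last_seen_at_src[r.tag_id] = r.ts
            elif r.location == dst:
                t0 = last_seen_at_src.get(r.tag_id)
                if t0 is not None and r.ts - t0 <= window:
                    events.append((r.tag_id, t0, r.ts))
                    del last_seen_at_src[r.tag_id]
        return events

    reads = [
        RFIDRead("EPC-1", "shelf", 0.0),
        RFIDRead("EPC-1", "exit", 30.0),
        RFIDRead("EPC-2", "shelf", 5.0),
        RFIDRead("EPC-2", "exit", 400.0),  # outside the window: no match
    ]
    print(detect_moves(reads, "shelf", "exit", 60.0))  # [('EPC-1', 0.0, 30.0)]
    ```

    A production system would express such patterns declaratively (as the paper's rule language does) rather than hand-coding each one, but the temporal join over per-tag state is the core of the matching.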

  4. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than the previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region, with 5 W, 10 W, 50 W and 100 W for emulating the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  5. Event-Driven Process Chains (EPC)

    Science.gov (United States)

    Mendling, Jan

    This chapter provides a comprehensive overview of Event-driven Process Chains (EPCs) and introduces a novel definition of EPC semantics. EPCs became popular in the 1990s as a conceptual business process modeling language in the context of reference modeling. Reference modeling refers to the documentation of generic business operations in a model such as service processes in the telecommunications sector, for example. It is claimed that reference models can be reused and adapted as best-practice recommendations in individual companies (see [230, 168, 229, 131, 400, 401, 446, 127, 362, 126]). The roots of reference modeling can be traced back to the Kölner Integrationsmodell (KIM) [146, 147] that was developed in the 1960s and 1970s. In the 1990s, the Institute of Information Systems (IWi) in Saarbrücken worked on a project with SAP to define a suitable business process modeling language to document the processes of the SAP R/3 enterprise resource planning system. There were two results from this joint effort: the definition of EPCs [210] and the documentation of the SAP system in the SAP Reference Model (see [92, 211]). The extensive database of this reference model contains almost 10,000 sub-models: 604 of them non-trivial EPC business process models. The SAP Reference model had a huge impact with several researchers referring to it in their publications (see [473, 235, 127, 362, 281, 427, 415]) as well as motivating the creation of EPC reference models in further domains including computer integrated manufacturing [377, 379], logistics [229] or retail [52]. The wide-spread application of EPCs in business process modeling theory and practice is supported by their coverage in seminal text books for business process management and information systems in general (see [378, 380, 49, 384, 167, 240]). EPCs are frequently used in practice due to a high user acceptance [376] and extensive tool support. Some examples of tools that support EPCs are ARIS Toolset by IDS

  6. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  7. Features, Events, and Processes: System Level

    International Nuclear Information System (INIS)

    D. McGregor

    2004-01-01

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760)

  8. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    International Nuclear Information System (INIS)

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1 - 1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1 - 1). The objective of this analysis was to develop the BDCFs for the

  9. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
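    The event-condition-action (ECA) style of reaction rule surveyed above can be sketched in a few lines. This is a generic illustration of the paradigm, not any specific system from the survey; the decorator-based registration is an implementation choice of the sketch.

    ```python
    # Minimal event-condition-action (ECA) reaction rule engine: each rule
    # states the condition under which an action is invoked for an event type.
    rules = []

    def rule(event_type, condition):
        """Register a reaction rule: event type + guard condition + action."""
        def register(action):
            rules.append((event_type, condition, action))
            return action
        return register

    def fire(event):
        """Dispatch an event: run every rule whose type and condition match."""
        results = []
        for etype, cond, action in rules:
            if event["type"] == etype and cond(event):
                results.append(action(event))
        return results

    @rule("temperature", condition=lambda e: e["value"] > 100)
    def raise_alarm(e):
        return f"ALARM: temperature {e['value']}"

    print(fire({"type": "temperature", "value": 120}))  # ['ALARM: temperature 120']
    print(fire({"type": "temperature", "value": 80}))   # []
    ```

    Real reaction rule systems add what the sketch omits: complex event detection over streams (not single events), conflict resolution among matching rules, and transactional action execution.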

  10. Optimizing access to conditions data in ATLAS event data processing

    CERN Document Server

    Rinaldi, Lorenzo; The ATLAS collaboration

    2018-01-01

    The processing of ATLAS event data requires access to conditions data which is stored in database systems. This data includes, for example, alignment, calibration, and configuration information that may be characterized by large volumes, diverse content, and/or information which evolves over time as refinements are made in those conditions. Additional layers of complexity are added by the need to provide this information across the world-wide ATLAS computing grid and the sheer number of simultaneously executing processes on the grid, each demanding a unique set of conditions to proceed. Distributing this data to all the processes that require it in an efficient manner has proven to be an increasing challenge with the growing needs and number of event-wise tasks. In this presentation, we briefly describe the systems in which we have collected information about the use of conditions in event data processing. We then proceed to explain how this information has been used to refine not only reconstruction software ...

  11. Consolidation of Complex Events via Reinstatement in Posterior Cingulate Cortex

    Science.gov (United States)

    Keidel, James L.; Ing, Leslie P.; Horner, Aidan J.

    2015-01-01

    It is well-established that active rehearsal increases the efficacy of memory consolidation. It is also known that complex events are interpreted with reference to prior knowledge. However, comparatively little attention has been given to the neural underpinnings of these effects. In healthy adult humans, we investigated the impact of effortful, active rehearsal on memory for events by showing people several short video clips and then asking them to recall these clips, either aloud (Experiment 1) or silently while in an MRI scanner (Experiment 2). In both experiments, actively rehearsed clips were remembered in far greater detail than unrehearsed clips when tested a week later. In Experiment 1, highly similar descriptions of events were produced across retrieval trials, suggesting a degree of semanticization of the memories had taken place. In Experiment 2, spatial patterns of BOLD signal in medial temporal and posterior midline regions were correlated when encoding and rehearsing the same video. Moreover, the strength of this correlation in the posterior cingulate predicted the amount of information subsequently recalled. This is likely to reflect a strengthening of the representation of the video's content. We argue that these representations combine both new episodic information and stored semantic knowledge (or “schemas”). We therefore suggest that posterior midline structures aid consolidation by reinstating and strengthening the associations between episodic details and more generic schematic information. This leads to the creation of coherent memory representations of lifelike, complex events that are resistant to forgetting, but somewhat inflexible and semantic-like in nature. SIGNIFICANCE STATEMENT Memories are strengthened via consolidation. We investigated memory for lifelike events using video clips and showed that rehearsing their content dramatically boosts memory consolidation. Using MRI scanning, we measured patterns of brain activity while

  12. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1 - 1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1 - 1). The

  13. Consolidation of Complex Events via Reinstatement in Posterior Cingulate Cortex.

    Science.gov (United States)

    Bird, Chris M; Keidel, James L; Ing, Leslie P; Horner, Aidan J; Burgess, Neil

    2015-10-28

    It is well-established that active rehearsal increases the efficacy of memory consolidation. It is also known that complex events are interpreted with reference to prior knowledge. However, comparatively little attention has been given to the neural underpinnings of these effects. In healthy adult humans, we investigated the impact of effortful, active rehearsal on memory for events by showing people several short video clips and then asking them to recall these clips, either aloud (Experiment 1) or silently while in an MRI scanner (Experiment 2). In both experiments, actively rehearsed clips were remembered in far greater detail than unrehearsed clips when tested a week later. In Experiment 1, highly similar descriptions of events were produced across retrieval trials, suggesting a degree of semanticization of the memories had taken place. In Experiment 2, spatial patterns of BOLD signal in medial temporal and posterior midline regions were correlated when encoding and rehearsing the same video. Moreover, the strength of this correlation in the posterior cingulate predicted the amount of information subsequently recalled. This is likely to reflect a strengthening of the representation of the video's content. We argue that these representations combine both new episodic information and stored semantic knowledge (or "schemas"). We therefore suggest that posterior midline structures aid consolidation by reinstating and strengthening the associations between episodic details and more generic schematic information. This leads to the creation of coherent memory representations of lifelike, complex events that are resistant to forgetting, but somewhat inflexible and semantic-like in nature. Copyright © 2015 Bird, Keidel et al.

  14. Information processing in complex networks

    OpenAIRE

    Quax, R.

    2013-01-01

    Initial results of Rick Quax's research suggest that a combination of information theory, network theory, and statistical mechanics can lead to a promising theory for predicting the behavior of complex networks. As yet there is little theory about the behavior of dynamical units connected in a network, such as neurons in a brain network or genes in a gene-regulation network. Quax combines information theory, network theory, and statistical mechanics ...

  15. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.
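    The dual view of events sketched above — change agents with consequences, and information objects with attributes and associations to multiple participants — translates directly into classes. The transfer event below is a hypothetical example of my own, not one taken from the chapter:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Participant:
        name: str
        balance: float = 0.0

    @dataclass
    class TransferEvent:
        """An event as an information object: attributes (amount, timestamp),
        associations to multiple participants, and an operation that applies
        the change the event represents."""
        amount: float
        source: Participant
        target: Participant
        applied: bool = field(default=False)

        def apply(self):
            # the event acts as a change agent: executing it has consequences
            # for every associated participant
            self.source.balance -= self.amount
            self.target.balance += self.amount
            self.applied = True

    a, b = Participant("a", 100.0), Participant("b", 0.0)
    ev = TransferEvent(25.0, a, b)
    ev.apply()
    print(a.balance, b.balance)  # 75.0 25.0
    ```

    Keeping the event as a first-class object (rather than a bare method call) is what makes the history of changes queryable, which is the point of modeling events as information objects.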

  16. Investigating source processes of isotropic events

    Science.gov (United States)

    Chiang, Andrea

    This dissertation demonstrates the utility of the complete waveform regional moment tensor inversion for nuclear event discrimination. I explore the source processes and associated uncertainties for explosions and earthquakes under the effects of limited station coverage, compound seismic sources, assumptions in velocity models and the corresponding Green's functions, and the effects of shallow source depth and free-surface conditions. The motivation to develop better techniques to obtain reliable source mechanisms and assess uncertainties is not limited to nuclear monitoring: such techniques also provide quantitative information about the characteristics of seismic hazards, local and regional tectonics, and in-situ stress fields of the region. This dissertation begins with the analysis of three sparsely recorded events: the 14 September 1988 US-Soviet Joint Verification Experiment (JVE) nuclear test at the Semipalatinsk test site in Eastern Kazakhstan, and two nuclear explosions at the Chinese Lop Nor test site. We utilize a regional distance seismic waveform method fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long period waveforms and first motion observations provides unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We examine the effects of the free surface on the moment tensor via synthetic testing, and apply the moment tensor based discrimination method to well-recorded chemical explosions. These shallow chemical explosions represent rather severe source-station geometry in terms of the vanishing traction issues. We show that the combined waveform and first motion method enables the unique discrimination of these events, even though the data include unmodeled single force components resulting from the collapse and blowout of the quarry face immediately following the initial

  17. The underlying event in hard scattering processes

    International Nuclear Information System (INIS)

    Field, R.

    2002-01-01

    The authors study the behavior of the underlying event in hard scattering proton-antiproton collisions at 1.8 TeV and compare with the QCD Monte-Carlo models. The underlying event is everything except the two outgoing hard scattered jets and receives contributions from the beam-beam remnants plus initial- and final-state radiation. The data indicate that neither ISAJET nor HERWIG produces enough charged particles (with p_T > 0.5 GeV/c) from the beam-beam remnant component and that ISAJET produces too many charged particles from initial-state radiation. PYTHIA, which uses multiple parton scattering to enhance the underlying event, does the best job of describing the data.

  18. Hydrometallurgical processes for mineral complexes

    International Nuclear Information System (INIS)

    Barskij, L.A.; Danil'chenko, L.M.

    1977-01-01

    Requirements for ore-processing technology, including that for uranium ores, and the principal stages in the development of technological schemes are briefly described. Reference data are given on commercial minerals and ores, including uranium-thorium ores, and on their classification according to physical, chemical, and surface properties, which forms the basis for ore-concentration processes. Also presented are the classification of minerals, including uranium minerals, by flotation ability; flotation regimes of minerals; structural-textural characteristics of ores; genetic types of ore formations and their amenability to concentration; and the algorithmization of the a priori evaluation of concentration and of the technological diagnostics of ore processing. A classification of ore-concentration techniques is suggested

  19. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom, to illustrate their practical value. © 2011, The International Biometric Society.
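    A simple diagnostic in the spirit of the second-order tests described above (a generic illustration, not the authors' proposed procedure) is the index of dispersion of interval counts: for a homogeneous Poisson process the variance of counts in equal-width intervals equals their mean, so the index is near 1, while clustered (overdispersed) processes give values above 1.

    ```python
    def dispersion_index(event_times, t_end, n_bins):
        """Index of dispersion of interval counts: ~1 for a homogeneous
        Poisson process, >1 for clustered events, <1 for regular events."""
        width = t_end / n_bins
        counts = [0] * n_bins
        for t in event_times:
            counts[min(int(t // width), n_bins - 1)] += 1
        mean = sum(counts) / n_bins
        var = sum((c - mean) ** 2 for c in counts) / n_bins
        return var / mean

    # two bursts of events: strongly clustered within the follow-up window
    clustered = [0.1, 0.2, 0.3, 0.4, 5.1, 5.2, 5.3, 5.4]
    # perfectly regular events: one per unit interval
    evenly = [i + 0.5 for i in range(10)]
    print(dispersion_index(evenly, 10.0, 10))          # 0.0 (underdispersed)
    print(dispersion_index(clustered, 10.0, 10) > 1)   # True (overdispersed)
    ```

    The paper's procedures are considerably more refined (they handle semiparametric rate functions and within-process dependence formally), but this statistic conveys what "second-order" information about a point process looks like.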

  20. Alternating event processes during lifetimes: population dynamics and statistical inference.

    Science.gov (United States)

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2018-01-01

    In the literature studying recurrent event data, a large amount of work has focused on univariate recurrent event processes, where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the features of the process, because patients experience nontrivial durations associated with each event. This results in an alternating event process, where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time since onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and the within-process structure, the paper provides a new and general way to study alternating event processes.
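    The alternating structure can be made concrete with a small sketch (a minimal illustration, not the paper's estimators): an individual alternates between exacerbation and remission segments, and point prevalence at a calendar time is the fraction of individuals currently in an exacerbation.

```python
# Minimal alternating event process sketch: each individual is described by a
# list of alternating segment durations, starting in exacerbation.

def in_exacerbation(t, durations):
    """durations alternates [exac1, rem1, exac2, rem2, ...]; even segments are exacerbations."""
    elapsed = 0.0
    for i, d in enumerate(durations):
        if elapsed <= t < elapsed + d:
            return i % 2 == 0
        elapsed += d
    return False

def prevalence(ts, all_durations):
    """Point prevalence: fraction of individuals in exacerbation at each time in ts."""
    return [sum(in_exacerbation(t, d) for d in all_durations) / len(all_durations)
            for t in ts]
```

    With long observation, the time-averaged prevalence approaches mean(exacerbation duration) / mean(cycle duration).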

  1. Post-Event Processing in Children with Social Phobia

    Science.gov (United States)

    Schmitz, Julian; Kramer, Martina; Blechert, Jens; Tuschen-Caffier, Brunna

    2010-01-01

    In the aftermath of a distressing social event, adults with social phobia (SP) engage in a review of this event with a focus on its negative aspects. To date, little is known about this post-event processing (PEP) and its relationship with perceived performance in SP children. We measured PEP in SP children (n = 24) and healthy controls (HC; n =…

  2. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

    Full Text Available Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of a vector-sensor array. The weight vector update of the CQLMS algorithm is derived based on the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
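    The flavor of such LMS-style weight updates can be sketched with ordinary complex arithmetic (a deliberate simplification: the paper's CQLMS operates on complex quaternions and a vector-sensor array model, which this toy does not attempt).

```python
# Illustrative complex-valued LMS filter: adapt weights w so that the
# output y = w^H x tracks a desired signal d, using the standard update
# w <- w + mu * x * conj(e).

def clms(x, d, mu=0.1, n_taps=2):
    """x, d: lists of complex samples; returns final weights and |error| history."""
    w = [0j] * n_taps
    errors = []
    for k in range(n_taps - 1, len(x)):
        xk = x[k - n_taps + 1:k + 1][::-1]                  # most recent sample first
        y = sum(wi.conjugate() * xi for wi, xi in zip(w, xk))
        e = d[k] - y
        w = [wi + mu * xi * e.conjugate() for wi, xi in zip(w, xk)]
        errors.append(abs(e))
    return w, errors
```

    On a constant training signal the error shrinks geometrically as the weights adapt.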

  3. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by consolidating event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  4. A review for identification of initiating events in event tree development process on nuclear power plants

    International Nuclear Information System (INIS)

    Riyadi, Eko H.

    2014-01-01

    An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. The selection of initiating events consists of two steps: first, the definition of possible events, e.g. by a comprehensive engineering evaluation and by constructing a top-level logic model; second, the grouping of identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is therefore to discuss the identification of initiating events in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also draws on past operating experience, reviews of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By studying in detail the categorization used in traditional US PSAs, the important initiating events can be obtained and categorized into LOCAs, transients and external events.
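    The second selection step, grouping identified initiating events, amounts to a simple partition of the event list. The sketch below is purely illustrative (the event names and categories are invented, not taken from a real PSA):

```python
# Group identified initiating events by the challenged safety-function
# category; the example events and category labels are hypothetical.

def group_by_category(events):
    groups = {}
    for name, category in events:
        groups.setdefault(category, []).append(name)
    return groups

initiating_events = [
    ("large-break LOCA", "LOCA"),
    ("small-break LOCA", "LOCA"),
    ("loss of offsite power", "transient"),
    ("turbine trip", "transient"),
    ("seismic event", "external"),
]
```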

  6. Effective Complexity of Stationary Process Realizations

    Directory of Open Access Journals (Sweden)

    Arleta Szkoła

    2011-06-01

    Full Text Available The concept of the effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, which are probability distributions on finite binary strings. In our previous paper [1] we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that, under not too strong conditions, long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, a modification of the original notion of effective complexity that needs fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow.

  7. Out-of-order event processing in kinetic data structures

    DEFF Research Database (Denmark)

    Abam, Mohammad; de Berg, Mark; Agrawal, Pankaj

    2011-01-01

    We study the problem of designing kinetic data structures (KDS’s for short) when event times cannot be computed exactly and events may be processed in a wrong order. In traditional KDS’s this can lead to major inconsistencies from which the KDS cannot recover. We present more robust KDS’s for the maintenance of several fundamental structures such as kinetic sorting and kinetic tournament trees, which overcome the difficulty by employing a refined event scheduling and processing technique. We prove that the new event scheduling mechanism leads to a KDS that is correct except for finitely many short…
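    The event-queue machinery underlying such structures can be sketched with a toy (non-robust) kinetic sort: each adjacent pair holds a certificate that fails when the pair swaps order, and failures are processed from a priority queue. This only illustrates standard KDS event scheduling, not the robust variant the paper develops.

```python
import heapq

# Toy kinetic sort for points moving linearly as x_i(t) = a + b*t.

def swap_time(p, q, now):
    """Earliest t > now at which p and q exchange order, or None."""
    (a1, b1), (a2, b2) = p, q
    if b1 == b2:
        return None
    t = (a2 - a1) / (b1 - b2)
    return t if t > now else None

def kinetic_sort_order(points, t_end):
    """Return the order (indices into points) holding just before t_end."""
    order = sorted(range(len(points)), key=lambda i: points[i][0])  # order at t=0
    events = []

    def schedule(pos, now):
        t = swap_time(points[order[pos]], points[order[pos + 1]], now)
        if t is not None and t < t_end:
            heapq.heappush(events, (t, pos, order[pos], order[pos + 1]))

    for pos in range(len(order) - 1):
        schedule(pos, 0.0)
    while events:
        t, pos, i, j = heapq.heappop(events)
        if order[pos] != i or order[pos + 1] != j:
            continue                        # stale certificate, skip it
        order[pos], order[pos + 1] = j, i   # neighbours swap at time t
        for p in (pos - 1, pos, pos + 1):   # reschedule affected certificates
            if 0 <= p < len(order) - 1:
                schedule(p, t)
    return order
```

    Stale certificates (scheduled before a neighbouring swap invalidated them) are simply discarded when popped.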

  8. A Process for Predicting Manhole Events in Manhattan

    OpenAIRE

    Isaac, Delfina F.; Ierome, Steve; Dutta, Haimonti; Radeva, Axinia; Passonneau, Rebecca J.; Rudin, Cynthia

    2009-01-01

    We present a knowledge discovery and data mining process developed as part of the Columbia/Con Edison project on manhole event prediction. This process can assist with real-world prioritization problems that involve raw data in the form of noisy documents requiring significant amounts of pre-processing. The documents are linked to a set of instances to be ranked according to prediction criteria. In the case of manhole event prediction, which is a new application for machine learning, the goal...

  9. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  11. Event processing for business organizing the real-time enterprise

    CERN Document Server

    Luckham, David C

    2011-01-01

    Find out how Event Processing (EP) works and how it can work for you. Business Event Processing: An Introduction and Strategy Guide thoroughly describes what EP is, how to use it, and how it relates to other popular information technology architectures such as Service Oriented Architecture. Explains how sense and response architectures are being applied with tremendous results to businesses throughout the world and shows businesses how they can get started implementing EP. Shows how to choose business event processing technology to suit your specific business needs and how to keep the costs of adopting it…

  12. Evaluating and predicting overall process risk using event logs

    NARCIS (Netherlands)

    Pika, A.; Van Der Aalst, W.M.P.; Wynn, M.T.; Fidge, C.J.; Ter Hofstede, A.H.M.

    2016-01-01

    Companies standardise and automate their business processes in order to improve process efficiency and minimise operational risks. However, it is difficult to eliminate all process risks during the process design stage due to the fact that processes often run in complex and changeable environments

  13. Phonological Processes in Complex and Compound Words

    Directory of Open Access Journals (Sweden)

    Alieh Kord Zaferanlu Kambuziya

    2016-02-01

    Full Text Available Abstract This research aims at making a comparison between phonological processes in complex and compound Persian words. Data were gathered from a 40,000-word Persian dictionary; 4,034 complex words and 1,464 compound ones were chosen, and Excel software was used to count the data. Some results of the research are: 1- Insertion is the usual phonological process in complex words. More than half of the different insertions belong to the consonant /g/; /y/ and // are in the second and third order, and the consonant /v/ has the lowest percentage of all. The highest percentage of vowel insertion belongs to /e/; the vowels /a/ and /o/ are in the second and third order. Deletion in complex words can only be seen in the consonant /t/ and the vowel /e/. 2- The most frequent phonological process in compounds is consonant deletion. Seven different consonants are involved in this process: /t/, //, /m/, /r/, /ǰ/, /d/ and /c/. The only deleted vowel is /e/. In both groups, complex and compound, /t/ deletion can be observed. A sequence of three consonants paves the way for the deletion of one of them; if one of the sequence is a sonorant like /n/, the deletion process rarely happens. 3- In complex words, consonant deletion results in a lighter syllable weight, whereas vowel deletion results in a heavier one, so both processes lead to bi-moraic weight. 4- The production of bi-moraic syllables in Persian takes precedence over the Syllable Contact Law, so specific rules have precedence over universals. 5- Vowel insertion can be seen in both complex and compound words. In complex words, /e/ insertion plays the most fundamental part, with the vowels /a/ and /o/ in second and third place. Whenever there is a sequence of two ultra-heavy syllables, vowel insertion breaks the first syllable into two light syllables. The compounds that are influenced by vowel insertion can also be pronounced without any insertion.

  14. Towards a methodology for the engineering of event-driven process applications

    NARCIS (Netherlands)

    Baumgrass, A.; Botezatu, M.; Di Ciccio, C.; Dijkman, R.M.; Grefen, P.W.P.J.; Hewelt, M.; Mendling, J.; Meyer, A.; Pourmirza, S.; Völzer, H.; Reijers, H.; Reichert, M.

    2016-01-01

    Successful applications of the Internet of Things such as smart cities, smart logistics, and predictive maintenance, build on observing and analyzing business-related objects in the real world for business process execution and monitoring. In this context, complex event processing is increasingly

  15. Verification and Planning for Stochastic Processes with Asynchronous Events

    National Research Council Canada - National Science Library

    Younes, Hakan L

    2005-01-01

    .... The most common assumption is that of history-independence: the Markov assumption. In this thesis, the author considers the problems of verification and planning for stochastic processes with asynchronous events, without relying on the Markov assumption...

  16. The Process of Solving Complex Problems

    Science.gov (United States)

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  17. Self-Complexity, Daily Events, and Perceived Quality of Life.

    Science.gov (United States)

    Kardash, CarolAnne M.; Okun, Morris A.

    Recent research has demonstrated that self-cognitions can play an important role in physical and emotional well-being. One important aspect of self-cognition concerns the complexity of self-representations. This study tested the hypothesis that self-complexity, as assessed by Linville's self-trait sorting task, would moderate the effects of…

  18. Consequence Prioritization Process for Potential High Consequence Events (HCE)

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Sarah G. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-10-31

    This document describes the process for Consequence Prioritization, the first phase of the Consequence-Driven Cyber-Informed Engineering (CCE) framework. The primary goal of Consequence Prioritization is to identify potential disruptive events that would significantly inhibit an organization's ability to provide the critical services and functions deemed fundamental to its business mission. These disruptive events, defined as High Consequence Events (HCE), include events that have occurred or that could be realized through an attack on critical infrastructure owners' assets. While other efforts, such as Presidential Policy Directive 41 (PPD-41), have been initiated to identify and mitigate disruptive events at the national security level, this process is intended to be used by individual organizations to evaluate events that fall below the national-security threshold. Described another way, Consequence Prioritization considers threats greater than those addressable by standard cyber hygiene and includes consideration of events that go beyond a traditional continuity of operations (COOP) perspective. Finally, Consequence Prioritization is most successful when organizations adopt a multi-disciplinary approach, engaging both cyber security and engineering expertise, as in-depth engineering perspectives are required to recognize, characterize, and mitigate HCEs. Figure 1 provides a high-level overview of the prioritization process.

  19. Designing and Securing an Event Processing System for Smart Spaces

    Science.gov (United States)

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  20. Software for event oriented processing on multiprocessor systems

    International Nuclear Information System (INIS)

    Fischler, M.; Areti, H.; Biel, J.; Bracker, S.; Case, G.; Gaines, I.; Husby, D.; Nash, T.

    1984-08-01

    Computing intensive problems that require the processing of numerous essentially independent events are natural customers for large scale multi-microprocessor systems. This paper describes the software required to support users with such problems in a multiprocessor environment. It is based on experience with and development work aimed at processing very large amounts of high energy physics data

  1. Process cubes : slicing, dicing, rolling up and drilling down event data for process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2013-01-01

    Recent breakthroughs in process mining research make it possible to discover, analyze, and improve business processes based on event data. The growth of event data provides many opportunities but also imposes new challenges. Process mining is typically done for an isolated well-defined process in
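    The slicing and dicing of event data can be illustrated in a few lines (the field names below are invented for illustration; the paper's process cube notion is richer, also covering roll-up and drill-down):

```python
# OLAP-style "dice" over an event log: keep the events matching given
# values on one or more dimensions; a "slice" is a dice on one dimension.

def dice(log, **criteria):
    """Keep events whose attributes match all given dimension values."""
    return [e for e in log if all(e.get(k) == v for k, v in criteria.items())]

log = [
    {"case": 1, "activity": "register", "dept": "sales",     "year": 2013},
    {"case": 1, "activity": "ship",     "dept": "logistics", "year": 2013},
    {"case": 2, "activity": "register", "dept": "sales",     "year": 2012},
]
```

    Each resulting sub-log can then be handed to an ordinary process discovery algorithm.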

  2. Complexity of deciding detectability in discrete event systems

    Czech Academy of Sciences Publication Activity Database

    Masopust, Tomáš

    2018-01-01

    Roč. 93, July (2018), s. 257-261 ISSN 0005-1098 Institutional support: RVO:67985840 Keywords: discrete event systems * finite automata * detectability Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 5.451, year: 2016 https://www.sciencedirect.com/science/article/pii/S0005109818301730

  4. [Emerging infectious diseases: complex, unpredictable processes].

    Science.gov (United States)

    Guégan, Jean-François

    2016-01-01

    In the light of a double approach, at first empirical and later theoretical and comparative, illustrated by the example of Buruli ulcer and its mycobacterial agent Mycobacterium ulcerans, on which I have focused my research activity over the last ten years by studying the determinants and factors of emerging infectious or parasitic diseases, the complexity of the events explaining disease emergence is presented. The cascade of events occurring at various levels of spatiotemporal scale and of the organization of life, which leads to the numerous emergences observed nowadays, requires taking better into account the interactions between host(s), pathogen(s) and the environment, including the behavior of both individuals and populations. In numerous research studies on emerging infectious diseases, it is the microbial hazard that is described rather than the infectious disease risk, the latter resulting from the confrontation between an association of threatening phenomena, or hazards, and a susceptible population. Beyond this, the theme of emerging infectious diseases and its links with global environmental and societal changes leads us to reconsider some well-established knowledge in infectiology and parasitology. © Société de Biologie, 2017.

  5. Management of a Complex Open Channel Network During Flood Events

    Science.gov (United States)

    Franchini, M.; Valiani, A.; Schippa, L.; Mascellani, G.

    2003-04-01

    Most of the area around Ferrara (Italy) lies below mean sea level, and an extensive drainage system combined with several pumping stations allows the use of this area for urban development as well as for industrial and agricultural activities. The three main channels of this hydraulic system constitute the Ferrara Inland Waterway (total length approximately 70 km), which connects the Po river near Ferrara to the sea. Because of the level difference between the upstream and downstream ends of the waterway, three locks are located along it, each of them combined with a set of gates to control the water levels. During rainfall events, most of the water of the basin flows into the waterway, and heavy precipitation sometimes causes flooding in several areas. This is due to the insufficient dimensions of the channel network and to inadequate manual operation of the gates. This study presents a hydrological-hydraulic model for the entire Ferrara basin together with a system of rules for operating the gates. In particular, gate opening is designed to be regulated in real time by monitoring the water level at several sections along the channels. Besides flood peak attenuation, this operation strategy also contributes to maintaining a constant water level for irrigation and fluvial navigation during dry periods. With reference to the flood event of May 1996, it is shown that this floodgate operation policy, unlike the one actually adopted during that event, would have led to a significant flood peak attenuation, avoiding flooding in the area upstream of Ferrara.
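    A real-time level-based gate rule of the kind described can be sketched as a simple proportional opening law (the target level and the excess over which the gate goes fully open are invented here; the study derives its rules from a calibrated hydrological-hydraulic model):

```python
# Illustrative gate rule: closed at or below the target level, opening
# linearly with the monitored excess, fully open at target + full_open_excess.

def gate_opening(level, target, full_open_excess=0.5):
    """Return the fractional gate opening in [0, 1] for a monitored water level."""
    excess = level - target
    return min(max(excess / full_open_excess, 0.0), 1.0)
```

    A proportional law like this reacts smoothly to rising levels instead of the on/off behaviour typical of manual operation.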

  6. Notification Event Architecture for Traveler Screening: Predictive Traveler Screening Using Event Driven Business Process Management

    Science.gov (United States)

    Lynch, John Kenneth

    2013-01-01

    Using an exploratory model of the 9/11 terrorists, this research investigates the linkages between Event Driven Business Process Management (edBPM) and decision making. Although the literature on the role of technology in efficient and effective decision making is extensive, research has yet to quantify the benefit of using edBPM to aid the…

  7. Radiochemical data collected on events from which radioactivity escaped beyond the borders of the Nevada test range complex

    International Nuclear Information System (INIS)

    Hicks, H.G.

    1981-01-01

    This report identifies all nuclear events in Nevada that are known to have sent radioactivity beyond the borders of the test range complex. There have been 177 such tests, representing seven different types: nuclear detonations in the atmosphere, nuclear excavation events, nuclear safety events, underground nuclear events that inadvertently seeped or vented to the atmosphere, dispersion of plutonium and/or uranium by chemical high explosives, nuclear rocket engine tests, and nuclear ramjet engine tests. The source term for each of these events is given, together with the data base from which it was derived (except where the data are classified). The computer programs used for organizing and processing the data base and calculating radionuclide production are described and included, together with the input and output data and details of the calculations. This is the basic information needed to make computer modeling studies of the fallout from any of these 177 events.

  8. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. Through readers and RFID middleware systems, information on the movements of tagged objects can be used to trigger business transactions. These features change the way business applications deal with the physical world, from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logic into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.
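    A double-buffering scheme of this general kind can be sketched as follows (a hypothetical illustration of two-block buffering, not the paper's actual mechanism): new readings accumulate in an active block while queries scan a frozen snapshot, so queries never contend with the incoming tag-read stream.

```python
# Hypothetical two-block buffer for RFID readings: writes go to the active
# block; queries see only the frozen block, which is swapped in periodically.

class TwoBlockBuffer:
    def __init__(self):
        self.active = []   # receives incoming readings
        self.frozen = []   # snapshot served to queries

    def on_reading(self, tag_event):
        self.active.append(tag_event)

    def swap(self):
        """Publish the active block to queries and start a fresh one."""
        self.frozen = self.active
        self.active = []

    def query(self, tag_id):
        return [e for e in self.frozen if e[0] == tag_id]
```

    The swap would typically be driven by a timer or batch-size threshold at the RFID edge.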

  9. Cognitive complexity of the medical record is a risk factor for major adverse events.

    Science.gov (United States)

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
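    The core measurement idea can be sketched in a few lines (a loose illustration: the actual Complexity Ruler tool and its empirically derived cutoffs are more involved than counting serialized bytes, and the cutoff below is invented):

```python
# Illustrative CCMR-style score: bits of data in a serialized record,
# with a risk flag above a (hypothetical) cutoff.

def ccmr_bits(record_text):
    """Amount of data in the record, measured in bits of its UTF-8 serialization."""
    return len(record_text.encode("utf-8")) * 8

def at_risk(record_text, cutoff_bits):
    return ccmr_bits(record_text) > cutoff_bits
```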

  10. Complex plasmochemical processing of solid fuel

    Directory of Open Access Journals (Sweden)

    Vladimir Messerle

    2012-12-01

    Full Text Available A technology for the complex plasmochemical processing of solid fuel, demonstrated on Ecibastuz bituminous and Turgay brown coals, is presented. Thermodynamic and experimental studies of the technology were carried out. Use of this technology allows the production of synthesis gas from the organic mass of coal and of valuable components (technical silicon, ferrosilicon, aluminum and silicon carbide) and rare-metal microelements (uranium, molybdenum, vanadium, etc.) from the mineral mass of coal. The high-calorific synthesis gas produced can be used for methanol synthesis, as a high-grade reducing gas instead of coke, and as an energy gas in thermal power plants.

  11. Complex diffusion process for noise reduction

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Barari, A.

    2014-01-01

    The use of partial differential equations (PDEs) in image restoration and de-noising has prompted many researchers to search for improvements in the technique. In this paper, a new method is presented for signal de-noising, based on PDEs and Schrödinger equations, named the complex diffusion process (CDP). This method assumes that variations… for signal de-noising. To evaluate the performance of the proposed method, a number of experiments have been performed using sinusoid, multi-component and FM signals cluttered with noise. The results indicate that the proposed method outperforms the signal de-noising approaches known in the prior art.
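    The basic mechanics of complex diffusion can be sketched in one dimension (an illustration of the general idea with invented parameters, not the paper's CDP scheme): the signal is evolved under u_t = c * u_xx with a complex coefficient c = exp(i*theta) for small theta, so the real part is progressively smoothed.

```python
import cmath

# 1-D complex diffusion sketch: explicit time stepping of u_t = c * u_xx
# with c on the unit circle near 1; boundaries are clamped (Neumann-like).

def complex_diffusion(signal, steps=20, dt=0.1, theta=0.05):
    u = [complex(s) for s in signal]
    c = cmath.exp(1j * theta)
    for _ in range(steps):
        lap = [u[max(i - 1, 0)] - 2 * u[i] + u[min(i + 1, len(u) - 1)]
               for i in range(len(u))]
        u = [ui + dt * c * li for ui, li in zip(u, lap)]
    return [ui.real for ui in u]
```

    Applied to a spike, the peak is flattened and its mass spreads to the neighbours, which is the smoothing behaviour used for de-noising.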

  12. Offside Decisions by Expert Assistant Referees in Association Football: Perception and Recall of Spatial Positions in Complex Dynamic Events

    Science.gov (United States)

    Gilis, Bart; Helsen, Werner; Catteeuw, Peter; Wagemans, Johan

    2008-01-01

    This study investigated the offside decision-making process in association football. The first aim was to capture the specific offside decision-making skills in complex dynamic events. Second, we analyzed the type of errors to investigate the factors leading to incorrect decisions. Federation Internationale de Football Association (FIFA; n = 29)…

  13. Theory of mind for processing unexpected events across contexts.

    Science.gov (United States)

    Dungan, James A; Stepanovic, Michael; Young, Liane

    2016-08-01

    Theory of mind, or mental state reasoning, may be particularly useful for making sense of unexpected events. Here, we investigated unexpected behavior across both social and non-social contexts in order to characterize the precise role of theory of mind in processing unexpected events. We used functional magnetic resonance imaging to examine how people respond to unexpected outcomes when initial expectations were based on (i) an object's prior behavior, (ii) an agent's prior behavior and (iii) an agent's mental states. Consistent with prior work, brain regions for theory of mind were preferentially recruited when people first formed expectations about social agents vs non-social objects. Critically, unexpected vs expected outcomes elicited greater activity in dorsomedial prefrontal cortex, which also discriminated in its spatial pattern of activity between unexpected and expected outcomes for social events. In contrast, social vs non-social events elicited greater activity in precuneus across both expected and unexpected outcomes. Finally, given prior information about an agent's behavior, unexpected vs expected outcomes elicited an especially robust response in right temporoparietal junction, and the magnitude of this difference across participants correlated negatively with autistic-like traits. Together, these findings illuminate the distinct contributions of brain regions for theory of mind for processing unexpected events across contexts. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  14. Client-Side Event Processing for Personalized Web Advertisement

    Science.gov (United States)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    The market for Web advertisement is growing continuously and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate highly personalized ads for the user currently visiting a particular Web page. They mainly build a profile based on the content of that page or on a long-term user profile, without taking the user's current preferences into account. We argue that by discovering a user's interests from his or her current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user's interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by processing complex events on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.
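
    The core idea of the record above — combining simple client-side events into a complex event that signals user interest — can be sketched as follows. The event kinds, targets, and time window are hypothetical illustrations, not the rules used by the system described:

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str     # simple event type, e.g. "hover" or "click" (hypothetical)
    target: str   # identifier of the page element involved
    ts: float     # timestamp in seconds

def detect_interest(events, window=2.0):
    """Derive a complex 'interest' event whenever a hover on an element is
    followed by a click on the same element within `window` seconds."""
    last_hover = {}       # target -> timestamp of the most recent hover
    complex_events = []
    for ev in sorted(events, key=lambda e: e.ts):
        if ev.kind == "hover":
            last_hover[ev.target] = ev.ts
        elif ev.kind == "click":
            t0 = last_hover.get(ev.target)
            if t0 is not None and ev.ts - t0 <= window:
                complex_events.append(("interest", ev.target))
    return complex_events
```

    A real client-side implementation would attach such logic to DOM event listeners; the sketch only shows the pattern-combination step.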

  15. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the data in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses… for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs…

  16. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p, AR(p), is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the user's prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.
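
    The sequential construction mentioned above parameterises AR(p) through its partial autocorrelations. The standard Levinson-Durbin recursion maps those partial autocorrelations back to AR coefficients; a minimal sketch (not the authors' code):

```python
def pacf_to_ar(pacf):
    """Convert partial autocorrelations (reflection coefficients) phi_kk,
    k = 1..p, into AR(p) coefficients via the Levinson-Durbin recursion.
    The process is stationary if and only if every |phi_kk| < 1."""
    coeffs = []
    for phi_kk in pacf:
        # Update the order-(k-1) coefficients, then append phi_kk itself.
        coeffs = [a - phi_kk * b for a, b in zip(coeffs, reversed(coeffs))] + [phi_kk]
    return coeffs
```

    This makes the "AR(p) relative to AR(p-1)" base-model idea concrete: setting the last partial autocorrelation to zero recovers the AR(p-1) base model exactly.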

  18. Processing ser and estar to locate objects and events

    Science.gov (United States)

    Dussias, Paola E.; Contemori, Carla; Román, Patricia

    2016-01-01

    In Spanish locative constructions, a different form of the copula is selected in relation to the semantic properties of the grammatical subject: sentences that locate objects require estar while those that locate events require ser (both translated in English as ‘to be’). In an ERP study, we examined whether second language (L2) speakers of Spanish are sensitive to the selectional restrictions that the different types of subjects impose on the choice of the two copulas. Twenty-four native speakers of Spanish and two groups of L2 Spanish speakers (24 beginners and 18 advanced speakers) were recruited to investigate the processing of ‘object/event + estar/ser’ permutations. Participants provided grammaticality judgments on correct (object + estar; event + ser) and incorrect (object + ser; event + estar) sentences while their brain activity was recorded. In line with previous studies (Leone-Fernández, Molinaro, Carreiras, & Barber, 2012; Sera, Gathje, & Pintado, 1999), the results of the grammaticality judgment for the native speakers showed that participants correctly accepted object + estar and event + ser constructions. In addition, while ‘object + ser’ constructions were considered grossly ungrammatical, ‘event + estar’ combinations were perceived as unacceptable to a lesser degree. For these same participants, ERP recording time-locked to the onset of the critical word ‘en’ showed a larger P600 for the ser predicates when the subject was an object than when it was an event (*La silla es en la cocina vs. La fiesta es en la cocina). This P600 effect is consistent with syntactic repair of the defining predicate when it does not fit with the adequate semantic properties of the subject. For estar predicates (La silla está en la cocina vs. *La fiesta está en la cocina), the findings showed a central-frontal negativity between 500–700 ms. Grammaticality judgment data for the L2 speakers of Spanish showed that beginners were significantly less

  19. Self-Exciting Point Process Modeling of Conversation Event Sequences

    Science.gov (United States)

    Masuda, Naoki; Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo

    Self-exciting processes of Hawkes type have been used to model various phenomena including earthquakes, neural activities, and views of online videos. Studies of temporal networks have revealed that sequences of social interevent times for individuals are highly bursty. We examine some basic properties of event sequences generated by the Hawkes self-exciting process to show that it generates bursty interevent times for a wide parameter range. Then, we fit the model to the data of conversation sequences recorded in company offices in Japan. In this way, we can estimate relative magnitudes of the self excitement, its temporal decay, and the base event rate independent of the self excitation. These variables highly depend on individuals. We also point out that the Hawkes model has an important limitation that the correlation in the interevent times and the burstiness cannot be independently modulated.
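
    A univariate Hawkes process with exponential kernel, as fitted in the record above, can be simulated with Ogata's thinning algorithm. The parameter values below are illustrative, not those estimated from the conversation data:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    via Ogata's thinning algorithm. Stationarity requires alpha/beta < 1."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while t < t_max:
        # Upper bound on the intensity until the next event (it only decays).
        lam_bar = mu + alpha + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= t_max:
            break
        lam_t = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:   # accept with probability lam_t / lam_bar
            events.append(t)
    return events
```

    Here mu is the base event rate, alpha the magnitude of self-excitement, and beta its temporal decay — the three quantities the study estimates per individual.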

  20. Perceptual processing of a complex musical context

    DEFF Research Database (Denmark)

    Quiroga Martinez, David Ricardo; Hansen, Niels Christian; Højlund, Andreas

    …play a fundamental role in music perception. The mismatch negativity (MMN) is a brain response that offers a unique insight into these processes. The MMN is elicited by deviants in a series of repetitive sounds and reflects the perception of change in physical and abstract sound regularities. Therefore, it is regarded as a prediction error signal and a neural correlate of the updating of predictive perceptual models. In music, the MMN has been particularly valuable for the assessment of musical expectations, learning and expertise. However, the MMN paradigm has an important limitation: its ecological validity… To this aim we will develop a new paradigm using more real-sounding stimuli. Our stimuli will be two-part music excerpts made by adding a melody to a previous design based on the Alberti bass (Vuust et al., 2011). Our second goal is to determine how the complexity of this context affects the predictive…

  1. Mapping stochastic processes onto complex networks

    International Nuclear Information System (INIS)

    Shirazi, A H; Reza Jafari, G; Davoudi, J; Peinke, J; Reza Rahimi Tabar, M; Sahimi, Muhammad

    2009-01-01

    We introduce a method by which stochastic processes are mapped onto complex networks. As examples, we construct the networks for such time series as those for free-jet and low-temperature helium turbulence, the German stock market index (the DAX), and white noise. The networks are further studied by contrasting their geometrical properties, such as the mean length, diameter, clustering, and average number of connections per node. By comparing the network properties of the original time series investigated with those for the shuffled and surrogate series, we are able to quantify the effect of the long-range correlations and the fatness of the probability distribution functions of the series on the networks constructed. Most importantly, we demonstrate that the time series can be reconstructed with high precision by means of a simple random walk on their corresponding networks
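
    One simple way to map a time series onto a network — an illustrative coarse-graining sketch, not necessarily the exact construction used in the paper — is to treat binned values as nodes, transitions between consecutive samples as weighted directed edges, and then regenerate a series by a random walk on the resulting network:

```python
import random
from collections import defaultdict

def series_to_network(series, n_bins):
    """Coarse-grain a time series into n_bins value states (network nodes)
    and add a weighted directed edge for each transition between the states
    of consecutive samples."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0
    states = [min(int((x - lo) / width), n_bins - 1) for x in series]
    edges = defaultdict(int)             # (u, v) -> transition count
    for u, v in zip(states, states[1:]):
        edges[(u, v)] += 1
    return states, edges

def random_walk(edges, start, steps, seed=0):
    """Regenerate a coarse-grained series by walking the network, choosing
    successors with probability proportional to observed transition counts."""
    rng = random.Random(seed)
    successors = defaultdict(list)
    for (u, v), count in edges.items():
        successors[u].extend([v] * count)
    walk, node = [start], start
    for _ in range(steps):
        if not successors[node]:         # dead end: no outgoing transitions
            break
        node = rng.choice(successors[node])
        walk.append(node)
    return walk
```

    The geometrical properties the authors compare (mean length, diameter, clustering, mean degree) would then be computed on the `edges` graph.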

  2. Event-Related Potentials and Emotion Processing in Child Psychopathology

    Directory of Open Access Journals (Sweden)

    Georgia Chronaki

    2016-04-01

    In recent years there has been increasing interest in the neural mechanisms underlying altered emotional processes in children and adolescents with psychopathology. This review provides a brief overview of the most up-to-date findings in the field of event-related potentials (ERPs) to facial and vocal emotional expressions in the most common child psychopathological conditions. With regard to externalising behaviour (i.e., ADHD, CD), ERP studies show enhanced early components to anger, reflecting enhanced sensory processing, followed by reductions in later components to anger, reflecting reduced cognitive-evaluative processing. With regard to internalising behaviour, research supports models of increased processing of threat stimuli, especially at later, more elaborate and effortful stages. Finally, in autism spectrum disorders, abnormalities have been observed at early visual-perceptual stages of processing. An affective neuroscience framework for understanding child psychopathology can be valuable in elucidating underlying mechanisms and informing preventive interventions.

  3. Stressful life events and psychological dysfunction in complex regional pain syndrome type I

    NARCIS (Netherlands)

    Geertzen, JHB; de Bruijn-Kofman, AT; de Bruijn, HP; van de Wiel, HBM; Dijkstra, PU

    Objective: To determine to what extent stressful life events and psychological dysfunction play a role in the pathogenesis of Complex Regional Pain Syndrome type I (CRPS). Design: A comparative study between a CRPS group and a control group. Stressful life events and psychological dysfunction

  4. Aridity and decomposition processes in complex landscapes

    Science.gov (United States)

    Ossola, Alessandro; Nyman, Petter

    2015-04-01

    Decomposition of organic matter is a key biogeochemical process contributing to nutrient cycles, carbon fluxes and soil development. The activity of decomposers depends on microclimate, with temperature and rainfall being major drivers. In complex terrain, the fine-scale variation in microclimate (and hence water availability) as a result of slope orientation is caused by differences in incoming radiation and surface temperature. Aridity, measured as the long-term balance between net radiation and rainfall, is a metric that can be used to represent variations in water availability within the landscape. Since aridity metrics can be obtained at fine spatial scales, they could theoretically be used to investigate how decomposition processes vary across complex landscapes. In this study, four research sites were selected in tall open sclerophyll forest along an aridity gradient (Budyko dryness index ranging from 1.56 to 2.22) where microclimate, litter moisture and soil moisture were monitored continuously for one year. Litter bags were packed to estimate decomposition rates (k) using leaves of a tree species not present in the study area (Eucalyptus globulus) in order to avoid home-field advantage effects. Litter mass loss was measured to assess the activity of macro-decomposers (6 mm litter bag mesh size), meso-decomposers (1 mm mesh), microbes above-ground (0.2 mm mesh) and microbes below-ground (2 cm depth, 0.2 mm mesh). Four replicates for each set of bags were installed at each site and bags were collected at 1, 2, 4, 7 and 12 months after installation. We first tested whether differences in microclimate due to slope orientation have significant effects on decomposition processes. Then the dryness index was related to decomposition rates to evaluate whether small-scale variation in decomposition can be predicted using readily available information on rainfall and radiation. Decomposition rates (k), calculated by fitting single-pool negative exponential models, generally
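
    The single-pool negative exponential model used above, m(t)/m0 = exp(-k*t), can be fitted to litter-bag mass loss by log-linear least squares. A minimal sketch; the mass fractions below are hypothetical, not the study's data:

```python
import math

def fit_decomposition_rate(times, fraction_remaining):
    """Least-squares fit of the decay constant k in the single-pool negative
    exponential model m(t)/m0 = exp(-k*t), done on log-transformed data with
    the intercept forced through ln(1) = 0 at t = 0."""
    num = sum(t * math.log(m) for t, m in zip(times, fraction_remaining))
    den = sum(t * t for t in times)
    return -num / den

# Collection months and fraction of litter mass remaining (hypothetical data).
times = [1, 2, 4, 7, 12]
frac = [0.93, 0.86, 0.74, 0.60, 0.42]
k = fit_decomposition_rate(times, frac)   # decay constant per month
```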

  5. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.
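
    Deducing implicit events from explicit ones via domain-knowledge rules can be illustrated with a toy forward-chaining sketch. The rule and the triple below are invented for illustration and are not taken from the system described:

```python
def apply_inference_rules(events, rules):
    """Forward-chain inference rules over explicitly extracted events until
    no new (implicit) events can be deduced. Events are (agent, relation,
    theme) triples; each rule maps one relation to an entailed relation."""
    known = set(events)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for agent, relation, theme in list(known):
                if relation == premise and (agent, conclusion, theme) not in known:
                    known.add((agent, conclusion, theme))
                    changed = True
    return known

# Hypothetical rule and triple, invented for illustration:
# "binds the promoter of" entails "regulates the transcription of".
rules = [("binds_promoter_of", "regulates_transcription_of")]
explicit = {("FNR", "binds_promoter_of", "narG")}
all_events = apply_inference_rules(explicit, rules)
```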

  7. Managing the Organizational and Cultural Precursors to Major Events — Recognising and Addressing Complexity

    International Nuclear Information System (INIS)

    Taylor, R. H.; Carhart, N.; May, J.; Wijk, L. G. A. van

    2016-01-01

    illustration will be given of the use of Hierarchical Process Modelling (HPM) to develop a vulnerability tool using the question sets. However, to understand the issues involved more fully, requires the development of models and associated tools which recognise the complexity and interactive nature of the organizational and cultural issues involved. Various repeating patterns of system failure appear in most of the events studied. Techniques such as System Dynamics (SD) can be used to ‘map’ these processes and capture the complexity involved. This highlights interdependencies, incubating vulnerabilities and the impact of time lags within systems. Two examples will be given. In almost all of the events studied, there has been a strong disconnect between the knowledge and aspirations of senior management and those planning and carrying out operations. There has, for example, frequently been a failure to ensure that information flows up and down the management chain are effective. It has often led to conflicts between the need to maintain safety standards through exercising a cautious and questioning attitude in the light of uncertainty and the need to meet production and cost targets. Business pressures have led to shortcuts, failure to provide sufficient oversight so that leaders are aware of the true picture of process and nuclear safety at operational level (often leading to organizational ‘drift’), normalisation of risks, and the establishment of a ‘good news culture’. The development of this disconnect and its consequences have been shown to be interdependent, dynamic and complex. A second example is that of gaining a better appreciation of the deeper factors involved in managing the supply chain and, in particular, of the interface with contractors. 
Initiating projects with unclear accountabilities and to unrealistic timescales, together with a lack of clarity about the cost implications when safety-related concerns are reported and need to be addressed, have

  8. Event processing in X-IFU detector onboard Athena.

    Science.gov (United States)

    Ceballos, M. T.; Cobos, B.; van der Kuurs, J.; Fraga-Encinas, R.

    2015-05-01

    The X-ray Observatory ATHENA was proposed in April 2014 as the mission to implement the science theme "The Hot and Energetic Universe" selected by ESA for L2 (the second Large-class mission in ESA's Cosmic Vision science programme). One of the two X-ray detectors designed to be onboard ATHENA is X-IFU, a cryogenic microcalorimeter based on Transition Edge Sensor (TES) technology that will provide spatially resolved high-resolution spectroscopy. X-IFU will be developed by a consortium of European research institutions, currently from France (leadership), Italy, The Netherlands, Belgium, UK, Germany and Spain. From Spain, IFCA (CSIC-UC) is involved in the Digital Readout Electronics (DRE) unit of the X-IFU detector, in particular in the Event Processor Subsystem. We at IFCA are in charge of the development and implementation in the DRE unit of the Event Processing algorithms, designed to recognize, from a noisy signal, the intensity pulses generated by the absorption of the X-ray photons, and subsequently to extract their main parameters (coordinates, energy, arrival time, grade, etc.). Here we present the design and performance of the algorithms developed for event recognition (adjusted derivative) and pulse grading/qualification, as well as progress on the algorithms designed to extract the energy content of the pulses (pulse optimal filtering). IFCA will finally have the responsibility for the on-board implementation in the (TBD) FPGAs or micro-processors of the DRE unit, where this Event Processing part will take place, to fit into the limited telemetry of the instrument.
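
    The event-recognition step can be illustrated with a simple derivative-threshold detector — a sketch in the spirit of the "adjusted derivative" approach mentioned above, not the flight algorithm itself. Threshold and refractory values are hypothetical:

```python
def detect_pulses(samples, threshold, refractory=5):
    """Flag pulse onsets where the first difference of the sampled signal
    exceeds a noise threshold; a refractory window (in samples) prevents
    retriggering on the same rising edge."""
    onsets, last = [], -refractory
    for i in range(1, len(samples)):
        if samples[i] - samples[i - 1] > threshold and i - last >= refractory:
            onsets.append(i)
            last = i
    return onsets
```

    The real system must additionally grade pulses by proximity (pile-up) and estimate their energy by optimal filtering, which this sketch omits.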

  9. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    International Nuclear Information System (INIS)

    D.L. McGregor

    2000-01-01

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  11. MEG event-related desynchronization and synchronization deficits during basic somatosensory processing in individuals with ADHD

    Directory of Open Access Journals (Sweden)

    Wang Frank

    2008-02-01

    Background: Attention-Deficit/Hyperactivity Disorder (ADHD) is a prevalent, complex disorder which is characterized by symptoms of inattention, hyperactivity, and impulsivity. Convergent evidence from neurobiological studies of ADHD identifies dysfunction in fronto-striatal-cerebellar circuitry as the source of behavioural deficits. Recent studies have shown that regions governing basic sensory processing, such as the somatosensory cortex, show abnormalities in those with ADHD, suggesting that these processes may also be compromised. Methods: We used event-related magnetoencephalography (MEG) to examine patterns of cortical rhythms in the primary (SI) and secondary (SII) somatosensory cortices in response to median nerve stimulation, in 9 adults with ADHD and 10 healthy controls. Stimuli were brief (0.2 ms) non-painful electrical pulses presented to the median nerve in two counterbalanced conditions: unpredictable and predictable stimulus presentation. We measured changes in strength, synchronicity, and frequency of cortical rhythms. Results: The healthy comparison group showed strong event-related desynchrony and synchrony in SI and SII. By contrast, those with ADHD showed significantly weaker event-related desynchrony and event-related synchrony in the alpha (8–12 Hz) and beta (15–30 Hz) bands, respectively. This was most striking during random presentation of median nerve stimulation. Adults with ADHD showed significantly shorter duration of beta rebound in both SI and SII except when the onset of the stimulus event could be predicted. In this case, the rhythmicity of SI (but not SII) in the ADHD group did not differ from that of controls. Conclusion: Our findings suggest that somatosensory processing is altered in individuals with ADHD. MEG constitutes a promising approach to profiling patterns of neural activity during the processing of sensory input (e.g., detection of a tactile stimulus, stimulus predictability) and facilitating our

  12. Features, events and processes evaluation catalogue for argillaceous media

    International Nuclear Information System (INIS)

    Mazurek, M.; Pearson, F.J.; Volckaert, G.; Bock, H.

    2003-01-01

    The OECD/NEA Working Group on the Characterisation, the Understanding and the Performance of Argillaceous Rocks as Repository Host Formations for the disposal of radioactive waste (known as the 'Clay Club') launched a project called FEPCAT (Features, Events and Processes Catalogue for argillaceous media) in late 1998. The present report provides the results of work performed by an expert group to develop a FEPs database related to argillaceous formations, whether soft or indurated. It describes the methodology used for the work performed, provides a list of relevant FEPs and summarises the knowledge on each of them. It also provides general conclusions and identifies priorities for future work. (authors)

  13. ALADDIN: a neural model for event classification in dynamic processes

    International Nuclear Information System (INIS)

    Roverso, Davide

    1998-02-01

    ALADDIN is a prototype system which combines fuzzy clustering techniques and artificial neural network (ANN) models in a novel approach to the problem of classifying events in dynamic processes. The main motivation for the development of such a system derived originally from the problem of finding new principled methods to perform alarm structuring/suppression in a nuclear power plant (NPP) alarm system. One such method consists in basing the alarm structuring/suppression on a fast recognition of the event generating the alarms, so that a subset of alarms sufficient to efficiently handle the current fault can be selected to be presented to the operator, minimizing in this way the operator's workload in a potentially stressful situation. The scope of application of a system like ALADDIN goes however beyond alarm handling, to include diagnostic tasks in general. The eventual application of the system to domains other than NPPs was also taken into special consideration during the design phase. In this document we report on the first phase of the ALADDIN project which consisted mainly in a comparative study of a series of ANN-based approaches to event classification, and on the proposal of a first system prototype which is to undergo further tests and, eventually, be integrated in existing alarm, diagnosis, and accident management systems such as CASH, IDS, and CAMS. (author)

  14. RNA assemblages orchestrate complex cellular processes

    DEFF Research Database (Denmark)

    Nielsen, Finn Cilius; Hansen, Heidi Theil; Christiansen, Jan

    2016-01-01

    Eukaryotic mRNAs are monocistronic, and therefore mechanisms exist that coordinate the synthesis of multiprotein complexes in order to obtain proper stoichiometry at the appropriate intracellular locations. RNA-binding proteins containing low-complexity sequences are prone to generate liquid drop...

  15. Responses of diatom communities to hydrological processes during rainfall events

    Science.gov (United States)

    Wu, Naicheng; Faber, Claas; Ulrich, Uta; Fohrer, Nicola

    2015-04-01

    The importance of diatoms as a tracer of hydrological processes has recently been recognized (Pfister et al. 2009, Pfister et al. 2011, Tauro et al. 2013). However, diatom variations on a short time scale (e.g., sub-daily) during rainfall events have not yet been well documented. In this study, rainfall event-based diatom samples were taken at the outlet of the Kielstau catchment (50 km²), a lowland catchment in northern Germany. A total of nine rainfall events were sampled from May 2013 to April 2014. Non-metric multidimensional scaling (NMDS) revealed that diatom communities of different events were well separated along NMDS axes I and II, indicating a remarkable temporal variation. By correlating water level (a proxy for discharge) with different diatom indices, close relationships were found. For example, species richness, biovolume (μm³), Shannon diversity and moisture index01 (%, classified according to van Dam et al. 1994) were positively related to water level at the beginning phase of the rainfall (i.e., the rising limb of the discharge peak). However, during the recession limb of the discharge peak, diatom indices showed distinct responses to water level declines in different rainfall events. These preliminary results indicate that diatom indices are highly related to hydrological processes. The next steps will include identifying the possible mechanisms behind these patterns and exploring the contributions of abiotic variables (e.g., hydrologic indices, nutrients) to diatom community patterns. Based on this and ongoing studies (Wu et al. unpublished data), we will incorporate diatom data into End Member Mixing Analysis (EMMA) and select the tracer set best suited for separating different runoff components in our study catchment. Keywords: Diatoms, Rainfall event, Non-metric multidimensional scaling, Hydrological process, Indices. References: Pfister L, McDonnell JJ, Wrede S, Hlúbiková D, Matgen P, Fenicia F, Ector L, Hoffmann L
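
    The simplest case of the end-member mixing analysis mentioned above is two-component hydrograph separation by tracer mass balance. A minimal sketch; the tracer concentrations used in the example are hypothetical:

```python
def event_water_fraction(c_stream, c_event, c_pre):
    """Two-component end-member mixing: the fraction of streamflow made of
    event water, from the tracer mass balance
    c_stream = f * c_event + (1 - f) * c_pre."""
    if c_event == c_pre:
        raise ValueError("end members must differ in tracer concentration")
    f = (c_stream - c_pre) / (c_event - c_pre)
    return min(max(f, 0.0), 1.0)   # clip to the physically meaningful range
```

    With more tracers (e.g., adding diatom-derived indicators), EMMA generalises this to a least-squares solution over several end members.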

  16. Features, Events and Processes in UZ Flow and Transport

    International Nuclear Information System (INIS)

    P. Persoff

    2005-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA

  17. Features, Events, and Processes in UZ Flow and Transport

    International Nuclear Information System (INIS)

    Persoff, P.

    2004-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either "Included" or "Excluded", is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA.

  18. Features, Events and Processes in UZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    P. Persoff

    2005-08-04

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA.

  19. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  20. Complex active regions as the main source of extreme and large solar proton events

    Science.gov (United States)

    Ishkov, V. N.

    2013-12-01

    A study of solar proton sources indicated that solar flare events responsible for ≥2000 pfu proton fluxes mostly occur in complex active regions (CARs), i.e., in transition structures between active regions and activity complexes. Different classes of similar structures and their relation to solar proton events (SPEs) and evolution, depending on the origination conditions, are considered. Arguments in favor of the fact that sunspot groups with extreme dimensions are CARs are presented. An analysis of the flare activity in a CAR resulted in the detection of "physical" boundaries, which separate magnetic structures of the same polarity and are responsible for the independent development of each structure.

  1. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    Full Text Available This work concerns an information model of an educational complex that includes several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered, and a matrix management structure is suggested. The basic management information processes of the educational complex were conceptualized.

  2. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested both on business-like event logs recorded in a higher-education enterprise resource planning system and on a real case scenario involving a set of Dutch municipalities.
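The greedy strategy described above can be illustrated with a toy sketch, under a deliberately simplified notion of a configurable model: each configurable node is an optional activity that may be enabled or disabled, and fitness is the fraction of log traces the configured model can replay. All activity names are hypothetical; the strategies in the paper operate on much richer models.

```python
# Hypothetical minimal model: each configurable node is an optional
# activity; a trace "fits" the derived model if every activity it
# contains is mandatory or enabled.
MANDATORY = {"register", "decide"}
CONFIGURABLE = ["check_extra", "archive", "notify"]

def fitness(enabled, log):
    """Fraction of traces fully replayable with the given configuration."""
    allowed = MANDATORY | enabled
    return sum(set(t) <= allowed for t in log) / len(log)

def greedy_configure(log):
    """Decide each configurable node in turn, keeping a choice only if
    it improves replay fitness on the event log (greedy heuristic)."""
    enabled = set()
    for node in CONFIGURABLE:
        if fitness(enabled | {node}, log) > fitness(enabled, log):
            enabled.add(node)
    return enabled

log = [("register", "check_extra", "decide"),
       ("register", "decide"),
       ("register", "check_extra", "decide")]
print(greedy_configure(log))  # {'check_extra'}
```

The exhaustive-search strategy would instead score all 2^3 configurations, which is exactly the exponential blow-up the abstract mentions.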

  3. Discrete event simulation of the Defense Waste Processing Facility (DWPF) analytical laboratory

    International Nuclear Information System (INIS)

    Shanahan, K.L.

    1992-02-01

    A discrete event simulation of the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) analytical laboratory has been constructed in the GPSS language. It was used to estimate laboratory analysis times at process analytical hold points and to study the effect of sample number on those times. Typical results are presented for three different simulations representing increasing levels of complexity, and for different sampling schemes. Example equipment utilization time plots are also included. SRS DWPF laboratory management and chemists found the simulations very useful for resource and schedule planning.
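The original GPSS model is not reproduced here, but the core discrete-event idea can be sketched in a few lines of Python: samples queue at a hold point for a pool of analyzers, and the simulation tracks when the last analysis finishes. The arrival and service assumptions are invented for illustration.

```python
import heapq
import random

def simulate_lab(n_samples, n_analyzers=1, mean_service=2.0, seed=0):
    """Minimal discrete-event sketch of an analytical lab: samples
    arrive together at a hold point and queue for analyzers.
    Returns the time the last analysis finishes (hypothetical units)."""
    rng = random.Random(seed)
    # Each analyzer becomes free at time 0; the heap orders by free time.
    free_at = [0.0] * n_analyzers
    heapq.heapify(free_at)
    finish = 0.0
    for _ in range(n_samples):
        start = heapq.heappop(free_at)              # earliest free analyzer
        service = rng.expovariate(1.0 / mean_service)
        done = start + service
        finish = max(finish, done)
        heapq.heappush(free_at, done)
    return finish

# More samples at a hold point -> longer hold-point analysis time,
# the effect the study quantified.
print(simulate_lab(5) < simulate_lab(20))  # True
```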

  4. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of the satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
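One simple way to compute systematic and random error components like those mentioned above is to split the satellite-minus-reference residuals into their mean (bias) and their spread about that bias; both the convention and the rainfall depths below are illustrative, not taken from the study.

```python
import numpy as np

def error_components(sat, ref):
    """Decompose satellite-vs-reference error into a systematic part
    (mean bias) and a random part (std of residuals about the bias).
    One simple convention among several used in the literature."""
    err = np.asarray(sat, float) - np.asarray(ref, float)
    systematic = err.mean()
    random_part = err.std(ddof=1)
    return systematic, random_part

# Hypothetical event rainfall depths (mm); the satellite underestimates.
ref = np.array([40.0, 55.0, 30.0, 70.0])
sat = np.array([32.0, 45.0, 27.0, 58.0])
bias, spread = error_components(sat, ref)
print(bias < 0)  # True: cumulative depth mostly underestimated
```

Applying the same decomposition to simulated versus reference runoff would expose the dampening effect the abstract describes, i.e. smaller error components in the hydrograph than in the hyetograph.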

  5. Dynamic anticipatory processing of hierarchical sequential events: a common role for Broca's area and ventral premotor cortex across domains?

    Science.gov (United States)

    Fiebach, Christian J; Schubotz, Ricarda I

    2006-05-01

    This paper proposes a domain-general model for the functional contribution of ventral premotor cortex (PMv) and adjacent Broca's area to perceptual, cognitive, and motor processing. We propose to understand this frontal region as a highly flexible sequence processor, with the PMv mapping sequential events onto stored structural templates and Broca's Area involved in more complex, hierarchical or hypersequential processing. This proposal is supported by reference to previous functional neuroimaging studies investigating abstract sequence processing and syntactic processing.

  6. Features, Events, and Processes in UZ and Transport

    Energy Technology Data Exchange (ETDEWEB)

    P. Persoff

    2004-11-06

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either ''Included'' or ''Excluded'', is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA.

  7. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    Science.gov (United States)

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
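The kind of scenario testing described, reducing the variability and length of oncologist-related delays, can be mimicked with a small Monte Carlo sketch; the stage durations and distributions here are invented, not the study's calibrated Arena inputs.

```python
import random
import statistics

def planning_time(rng, onc_mean, onc_sd):
    """Total planning time = fixed stages plus a variable
    oncologist-related delay (all numbers hypothetical, in days)."""
    other_stages = 5.0                        # deterministic here
    delay = max(0.0, rng.gauss(onc_mean, onc_sd))
    return other_stages + delay

def mean_time(onc_mean, onc_sd, n=10000, seed=1):
    """Estimate mean planning time over n simulated patients."""
    rng = random.Random(seed)
    return statistics.fmean(planning_time(rng, onc_mean, onc_sd)
                            for _ in range(n))

baseline = mean_time(onc_mean=4.0, onc_sd=3.0)
improved = mean_time(onc_mean=2.0, onc_sd=1.0)   # shorter, less variable
print(improved < baseline)  # True
```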

  8. Fostering Organizational Innovation based on modeling the Marketing Research Process through Event-driven Process Chain (EPC)

    Directory of Open Access Journals (Sweden)

    Elena Fleacă

    2016-11-01

    Full Text Available Enterprises competing in today's business environment are required to win and maintain their competitiveness through flexibility, fast reaction, and adaptation to changing customer needs, based on innovation of work related to products, services, and internal processes. The paper addresses these challenges, which become more tightly interlinked under high pressure for innovation. The methodology commences with a literature review of the current knowledge on innovation through business process management. Secondly, the Event-driven Process Chain tool from the scientific literature was applied to model the variables of the marketing research process. The findings highlight the benefits of a marketing research workflow that enhances the value of market information while reducing the cost of obtaining it, in a coherent way.
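A defining well-formedness rule of EPC notation is that events and functions alternate along the control flow, with events at the start and end. The sketch below checks this for a linear chain; the marketing-research element names are hypothetical, not taken from the paper's model.

```python
# Minimal sketch of a linear Event-driven Process Chain (EPC):
# events ("E:") and functions ("F:") must alternate, and a chain
# starts and ends with an event. Element names are hypothetical.
chain = ["E: information need identified",
         "F: define research problem",
         "E: problem defined",
         "F: collect market data",
         "E: data collected"]

def is_valid_epc_chain(chain):
    """Check the event/function alternation rule for a linear chain."""
    if not chain or not (chain[0].startswith("E:")
                         and chain[-1].startswith("E:")):
        return False
    kinds = [step[:2] for step in chain]
    return all(a != b for a, b in zip(kinds, kinds[1:]))

print(is_valid_epc_chain(chain))  # True
```

Real EPC models also use AND/OR/XOR connectors for branching; a full validator would check those as well.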

  9. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes where CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints for a CUR:CD inclusion complex that maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Academic writing development: a complex, dynamic process

    NARCIS (Netherlands)

    Penris, Wouter; Verspoor, Marjolijn; Pfenniger, Simone; Navracsics, Judit

    2017-01-01

    Traditionally we look at learning outcomes by examining single outcomes. A new and future direction is to look at the actual process of development. Imagine an advanced, 17-year-old student of English (L2) who has just finished secondary school in the Netherlands and wants to become an English

  11. APACS: Monitoring and diagnosis of complex processes

    International Nuclear Information System (INIS)

    Kramer, B.M.; Mylopoulos, J.; Cheng Wang

    1994-01-01

    This paper describes APACS - a new framework for a system that detects, predicts and identifies faults in industrial processes. The APACS framework provides a structure in which a heterogeneous set of programs can share a common view of the problem and a common model of the domain. (author). 17 refs, 2 figs

  12. Features, Events, and Processes in SZ Flow and Transport

    International Nuclear Information System (INIS)

    Economy, K.

    2004-01-01

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either "Included" or "Excluded", is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), (f) (DIRS 156605). This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  13. Features, Events, and Processes in SZ Flow and Transport

    International Nuclear Information System (INIS)

    S. Kuzio

    2005-01-01

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114(d), (e), (f) [DIRS 173273]. This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  14. Features, Events, and Processes in SZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    K. Economy

    2004-11-16

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either "Included" or "Excluded", is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), (f) (DIRS 156605). This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  15. Features, Events, and Processes in SZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    S. Kuzio

    2005-08-20

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114(d), (e), (f) [DIRS 173273]. This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  16. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow the specification, distributed execution, and verification of pervasive event...
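The core execution rule of the DCR Graphs model underlying PEPL can be sketched compactly: an event is enabled when all of its condition events have been executed, and executing it makes its response events pending. The two-event example below is a toy illustration (inclusion/exclusion relations are omitted) and is not taken from the paper.

```python
# Minimal sketch of Dynamic Condition Response (DCR) Graph execution.
conditions = {"pay": {"order"}}     # "order" is a condition for "pay"
responses = {"order": {"pay"}}      # ordering demands a later payment

executed, pending = set(), set()

def enabled(e):
    """An event is enabled when all its conditions have been executed."""
    return conditions.get(e, set()) <= executed

def execute(e):
    """Execute an enabled event; its responses become pending."""
    assert enabled(e), f"{e} is not enabled"
    executed.add(e)
    pending.discard(e)
    pending.update(responses.get(e, set()))

print(enabled("pay"))    # False: its condition "order" not yet executed
execute("order")         # now "pay" is pending (a required response)
execute("pay")
print(pending == set())  # True: accepting state, no pending responses
```

A run is accepting when no pending event remains, which is what makes verification of such graphs tractable.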

  17. EFFECTIVE COMPLEX PROCESSING OF RAW TOMATOES

    Directory of Open Access Journals (Sweden)

    AIDA M. GADZHIEVA

    2018-03-01

    Full Text Available Tomatoes grown in the central and southern parts of the country, which contain 5 - 6 % of solids, including 0.13 % of pectin, 0.86 % of fat, 0.5 % of organic acids, 0.5 % minerals, etc. are used as research material. These tomatoes, grown in the mountains, on soils with high salinity, contain high amounts of valuable components and have long term preservation. For the extraction of valuable components from dried tomato pomace, the CO2 extraction method is applied. The technological and environmental feasibility of graded tomato drying in the atmosphere of an inert gas and in a solar drier is evaluated; the scheme of dried tomatoes production is improved; a system for tomato pomace drying is developed; a scheme of tomato powder production from pulp, skin and seeds is developed. The combined method of tomato pomace drying involves the simultaneous use of electromagnetic field of low and ultra-high frequency and blowing hot nitrogen on the product surface. Conducting the drying process in the atmosphere of nitrogen intensifies the process of removing moisture from tomatoes. The expediency of using tomato powder as an enriching additive is proved. Based on the study of the chemical composition of the tomato powder made from the Dagestan varieties, and on the organoleptic evaluation and physicochemical analysis of finished products, we prove the best degree of recoverability of tomato powder in the production of reconstituted juice and tomato beverages.

  18. COMPLEX PROCESSING TECHNOLOGY OF TOMATO RAW MATERIALS

    Directory of Open Access Journals (Sweden)

    A. M. Gadzhieva

    2015-01-01

    Full Text Available Tomatoes grown in the central and southern parts of the country, which contain 5-6 % of solids, including 0.13 % of pectin, 0.86 % of fat, 0.5 % of organic acids, 0.5 % of minerals, etc., were used as the subject of research. These tomatoes, grown in the mountains on soils with high salinity, contain high amounts of valuable components and have long-term preservation. For the extraction of valuable components from dried tomato pomace, the CO2 extraction method was applied. The technological and environmental feasibility of stage drying of tomatoes in an inert gas atmosphere in a solar drying kiln was evaluated; the production scheme for dried tomatoes is improved; a system for tomato pomace drying is developed; a production scheme for powders of the pulp, skin and seeds of tomatoes is developed. The combined method of tomato pomace drying involves the simultaneous use of an electromagnetic field of low and ultra-high frequency and blowing hot nitrogen over the product surface. Conducting the drying process in an inert nitrogen atmosphere intensified the process of moisture removal from tomatoes. The expediency of using tomato powder as an enriching additive was proved. Based on the study of the chemical composition of the tomato powder made from Dagestan varieties of tomatoes, and on the organoleptic evaluation and physico-chemical studies of finished products, we have proved the best degree of recoverability of tomato powder during the production of reconstituted juice and tomato beverages.

  19. Entropy type complexity of quantum processes

    International Nuclear Information System (INIS)

    Watanabe, Noboru

    2014-01-01

    von Neumann entropy represents the amount of information in the quantum state, and this was extended by Ohya for general quantum systems [10]. Umegaki first defined the quantum relative entropy for σ-finite von Neumann algebras, which was extended by Araki and Uhlmann to general von Neumann algebras and *-algebras, respectively. In 1983 Ohya introduced the quantum mutual entropy by using compound states; this describes the amount of information correctly transmitted through the quantum channel, and it was also extended by Ohya for general quantum systems. In this paper, we briefly explain Ohya's S-mixing entropy and the quantum mutual entropy for general quantum systems. By using structure equivalence classes, we introduce entropy-type functionals based on quantum information theory to improve the treatment of the Gaussian communication process. (paper)
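The abstract's starting point, von Neumann entropy, is straightforward to compute from the eigenvalues of a density matrix; the sketch below is a generic illustration of that definition, not of Ohya's generalized S-mixing entropy.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of a
    density matrix (Hermitian, trace one, positive semidefinite)."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]      # 0 log 0 = 0 by convention
    # max(...) clips the floating-point -0.0 artifact for pure states.
    return max(0.0, float(-(vals * np.log(vals)).sum()))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2                        # maximally mixed qubit
print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # ≈ 0.693 (ln 2, in nats)
```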

  20. Features, Events and Processes for the Used Fuel Disposition Campaign

    International Nuclear Information System (INIS)

    Blink, J.A.; Greenberg, H.R.; Caporuscio, F.A.; Houseworth, J.E.; Freeze, G.A.; Mariner, P.; Cunnane, J.C.

    2010-01-01

    The Used Fuel Disposition (UFD) Campaign within DOE-NE is evaluating storage and disposal options for a range of waste forms and a range of geologic environments. To assess the potential performance of conceptual repository designs for the combinations of waste form and geologic environment, a master set of Features, Events, and Processes (FEPs) has been developed and evaluated. These FEPs are based on prior lists developed by the Yucca Mountain Project (YMP) and the international repository community. The objective of the UFD FEPs activity is to identify and categorize FEPs that are important to disposal system performance for a variety of disposal alternatives (i.e., combinations of waste forms, disposal concepts, and geologic environments). FEP analysis provides guidance for the identification of (1) important considerations in disposal system design, and (2) gaps in the technical bases. The UFD FEPs also support the development of performance assessment (PA) models to evaluate the long-term performance of waste forms in the engineered and geologic environments of candidate disposal system alternatives. For the UFD FEP development, five waste form groups and seven geologic settings are being considered. A total of 208 FEPs have been identified, categorized by the physical components of the waste disposal system as well as cross-cutting physical phenomena. The combination of 35 waste-form/geologic environments and 208 FEPs is large; however, some FEP evaluations can cut across multiple waste/environment combinations, and other FEPs can be categorized as not-applicable for some waste/environment combinations, making the task of FEP evaluation more tractable. A FEP status tool has been developed to document progress. The tool emphasizes three major areas whose status can be scored numerically. FEP Applicability documents whether the FEP is pertinent to a waste/environment combination. FEP Completion Status documents the progress of the evaluation for the FEP
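The status bookkeeping described for the tool can be pictured as a matrix over (FEP, waste form, geologic environment) combinations; the sketch below is purely illustrative, with invented FEP names and a made-up screening decision, not data from the campaign.

```python
# Hypothetical sketch of FEP status bookkeeping: applicability and
# completion tracked per (FEP, waste form, environment) combination.
from itertools import product

waste_forms = ["commercial SNF", "HLW glass"]
environments = ["clay", "salt", "granite"]
feps = ["F1.2.01 climate change", "F2.1.08 colloid transport"]

status = {(f, w, e): {"applicable": True, "completion": 0}
          for f, (w, e) in product(feps, product(waste_forms, environments))}

# Screen one FEP out for one combination (illustrative decision only).
status[("F2.1.08 colloid transport", "HLW glass", "salt")]["applicable"] = False

applicable = sum(s["applicable"] for s in status.values())
print(len(status), applicable)  # 12 combinations, 11 still applicable
```

Marking FEPs not-applicable for some combinations shrinks the evaluation workload, which is the tractability point the abstract makes.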

  1. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Directory of Open Access Journals (Sweden)

    Zhong Yi Wan

    Full Text Available The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model consists of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more

  2. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model consists of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in
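The projection step underlying the reduced-order model can be sketched with a proper orthogonal decomposition (POD) via the SVD; the LSTM correction itself is omitted. The snapshot data below are synthetic, built to lie on a two-dimensional subspace, and stand in for attractor data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "snapshots": a 50-dimensional state that really lives on
# a 2-dimensional subspace plus small noise.
t = np.linspace(0, 10, 200)
modes = np.vstack([np.sin(t), np.cos(2 * t)])   # 2 latent signals
basis_true = rng.standard_normal((50, 2))
snapshots = basis_true @ modes + 0.01 * rng.standard_normal((50, 200))

# POD: dominant left singular vectors span the reduced subspace.
u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
phi = u[:, :2]                                   # reduced basis

# Project and reconstruct; the error is small for rank-2 dynamics.
reduced = phi.T @ snapshots                      # low-dim coordinates
recon = phi @ reduced
rel_err = np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots)
print(rel_err < 0.05)  # True: two modes capture the dynamics
```

In the paper's hybrid, the residual dynamics that such a projection misses is exactly what the LSTM-RNN learns from the data-streams in the reduced-order space.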

  3. The dimerization of the yeast cytochrome bc1 complex is an early event and is independent of Rip1.

    Science.gov (United States)

    Conte, Annalea; Papa, Benedetta; Ferramosca, Alessandra; Zara, Vincenzo

    2015-05-01

    In Saccharomyces cerevisiae the mature cytochrome bc1 complex exists as an obligate homo-dimer in which each monomer consists of ten distinct protein subunits inserted into or bound to the inner mitochondrial membrane. Among them, the Rieske iron-sulfur protein (Rip1), besides its catalytic role in electron transfer, may be implicated in bc1 complex dimerization. Indeed, the globular domain of Rip1, which contains the catalytic center, resides in one monomer, while its transmembrane helix interacts with the adjacent monomer. In addition, the lack of Rip1 leads to the accumulation of an immature bc1 intermediate that is only loosely associated with cytochrome c oxidase. In this study we investigated the biogenesis of the yeast cytochrome bc1 complex using epitope-tagged proteins to purify native assembly intermediates. We showed that dimerization is an early event during bc1 complex biogenesis and that, contrary to previous proposals, the presence of Rip1 is not essential for this process. We also investigated the multi-step model of bc1 assembly, lending further support to the existence of bona fide subcomplexes during bc1 maturation in the inner mitochondrial membrane. Finally, a new model of cytochrome bc1 complex assembly, in which distinct intermediates sequentially interact during bc1 maturation, is proposed. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Features, Events, and Processes in UZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    J.E. Houseworth

    2001-04-10

    Unsaturated zone (UZ) flow and radionuclide transport is a component of the natural barriers that affects potential repository performance. The total system performance assessment (TSPA) model, and underlying process models, of this natural barrier component capture some, but not all, of the associated features, events, and processes (FEPs) as identified in the FEPs Database (Freeze, et al. 2001 [154365]). This analysis and model report (AMR) discusses all FEPs identified as associated with UZ flow and radionuclide transport. The purpose of this analysis is to give a comprehensive summary of all UZ flow and radionuclide transport FEPs and their treatment in, or exclusion from, TSPA models. The scope of this analysis is to provide a summary of the FEPs associated with the UZ flow and radionuclide transport and to provide a reference roadmap to other documentation where detailed discussions of these FEPs, treated explicitly in TSPA models, are offered. Other FEPs may be screened out from treatment in TSPA by direct regulatory exclusion or through arguments concerning low probability and/or low consequence of the FEPs on potential repository performance. Arguments for exclusion of FEPs are presented in this analysis. Exclusion of specific FEPs from the UZ flow and transport models does not necessarily imply that the FEP is excluded from the TSPA. Similarly, in the treatment of included FEPs, only the way in which the FEPs are included in the UZ flow and transport models is discussed in this document. This report has been prepared in accordance with the technical work plan for the unsaturated zone subproduct element (CRWMS M&O 2000 [153447]). The purpose of this report is to document that all FEPs are either included in UZ flow and transport models for TSPA, or can be excluded from UZ flow and transport models for TSPA on the basis of low probability or low consequence. Arguments for exclusion are presented in this analysis. 
Exclusion of specific FEPs from UZ flow and

  5. Features, Events, and Processes in UZ Flow and Transport

    International Nuclear Information System (INIS)

    Houseworth, J.E.

    2001-01-01

    Unsaturated zone (UZ) flow and radionuclide transport is a component of the natural barriers that affects potential repository performance. The total system performance assessment (TSPA) model, and underlying process models, of this natural barrier component capture some, but not all, of the associated features, events, and processes (FEPs) as identified in the FEPs Database (Freeze, et al. 2001 [154365]). This analysis and model report (AMR) discusses all FEPs identified as associated with UZ flow and radionuclide transport. The purpose of this analysis is to give a comprehensive summary of all UZ flow and radionuclide transport FEPs and their treatment in, or exclusion from, TSPA models. The scope of this analysis is to provide a summary of the FEPs associated with the UZ flow and radionuclide transport and to provide a reference roadmap to other documentation where detailed discussions of these FEPs, treated explicitly in TSPA models, are offered. Other FEPs may be screened out from treatment in TSPA by direct regulatory exclusion or through arguments concerning low probability and/or low consequence of the FEPs on potential repository performance. Arguments for exclusion of FEPs are presented in this analysis. Exclusion of specific FEPs from the UZ flow and transport models does not necessarily imply that the FEP is excluded from the TSPA. Similarly, in the treatment of included FEPs, only the way in which the FEPs are included in the UZ flow and transport models is discussed in this document. This report has been prepared in accordance with the technical work plan for the unsaturated zone subproduct element (CRWMS M&O 2000 [153447]). The purpose of this report is to document that all FEPs are either included in UZ flow and transport models for TSPA, or can be excluded from UZ flow and transport models for TSPA on the basis of low probability or low consequence. Arguments for exclusion are presented in this analysis. Exclusion of specific FEPs from UZ flow

  6. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  7. Yucca Mountain Feature, Event, and Process (FEP) Analysis

    International Nuclear Information System (INIS)

    Freeze, G.

    2005-01-01

    A Total System Performance Assessment (TSPA) model was developed for the U.S. Department of Energy (DOE) Yucca Mountain Project (YMP) to help demonstrate compliance with applicable postclosure regulatory standards and support the License Application (LA). Two important precursors to the development of the TSPA model were (1) the identification and screening of features, events, and processes (FEPs) that might affect the Yucca Mountain disposal system (i.e., FEP analysis), and (2) the formation of scenarios from screened in (included) FEPs to be evaluated in the TSPA model (i.e., scenario development). YMP FEP analysis and scenario development followed a five-step process: (1) Identify a comprehensive list of FEPs potentially relevant to the long-term performance of the disposal system. (2) Screen the FEPs using specified criteria to identify those FEPs that should be included in the TSPA analysis and those that can be excluded from the analysis. (3) Form scenarios from the screened in (included) FEPs. (4) Screen the scenarios using the same criteria applied to the FEPs to identify any scenarios that can be excluded from the TSPA, as appropriate. (5) Specify the implementation of the scenarios in the computational modeling for the TSPA, and document the treatment of included FEPs. This paper describes the FEP analysis approach (Steps 1 and 2) for YMP, with a brief discussion of scenario formation (Step 3). Details of YMP scenario development (Steps 3 and 4) and TSPA modeling (Step 5) are beyond scope of this paper. The identification and screening of the YMP FEPs was an iterative process based on site-specific information, design, and regulations. The process was iterative in the sense that there were multiple evaluation and feedback steps (e.g., separate preliminary, interim, and final analyses). 
The initial YMP FEP list was compiled from an existing international list of FEPs from other radioactive waste disposal programs and was augmented by YMP site- and design

  8. Psychological distress and stressful life events in pediatric complex regional pain syndrome

    Science.gov (United States)

    Wager, Julia; Brehmer, Hannah; Hirschfeld, Gerrit; Zernikow, Boris

    2015-01-01

    BACKGROUND: There is little knowledge regarding the association between psychological factors and complex regional pain syndrome (CRPS) in children. Specifically, it is not known which factors precipitate CRPS and which result from the ongoing painful disease. OBJECTIVES: To examine symptoms of depression and anxiety as well as the experience of stressful life events in children with CRPS compared with children with chronic primary headaches and functional abdominal pain. METHODS: A retrospective chart study examined children with CRPS (n=37) who received intensive inpatient pain treatment between 2004 and 2010. They were compared with two control groups (chronic primary headaches and functional abdominal pain; each n=37), who also received intensive inpatient pain treatment. Control groups were matched with the CRPS group with regard to admission date, age and sex. Groups were compared on symptoms of depression and anxiety as well as stressful life events. RESULTS: Children with CRPS reported lower anxiety and depression scores compared with children with abdominal pain. A higher number of stressful life events before and after the onset of the pain condition was observed for children with CRPS. CONCLUSIONS: Children with CRPS are not particularly prone to symptoms of anxiety or depression. Importantly, children with CRPS experienced more stressful life events than children with chronic headaches or abdominal pain. Prospective long-term studies are needed to further explore the potential role of stressful life events in the etiology of CRPS. PMID:26035287

  9. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted into the stability problem of a class of reaction-diffusion complex dynamical systems with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally rules out Zeno behavior. Finally, a numerical example is given to verify the obtained results.
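The event-triggering idea, sampling only when the state has drifted far enough from the last held sample, can be sketched on a hypothetical scalar plant (not the paper's reaction-diffusion network; gains and threshold here are invented):

```python
# Event-triggered feedback on a scalar unstable node (illustrative only).
a, k, dt = 0.5, 2.0, 0.01      # plant gain, feedback gain, Euler step
threshold = 0.05               # trigger when |x - x_held| exceeds this

x, x_held = 1.0, 1.0           # state and last sampled (held) state
updates = 0
for _ in range(2000):
    if abs(x - x_held) > threshold:   # event-trigger condition
        x_held = x                    # sample: update the controller
        updates += 1
    u = -k * x_held                   # control uses the held sample only
    x += dt * (a * x + u)             # Euler step of dx/dt = a*x + u

print(abs(x) < 0.1, updates < 2000)   # stabilized with far fewer updates
```

The controller is refreshed only at trigger instants rather than at every sampling step, which is the source of the reduced communication load; the fixed threshold also keeps consecutive trigger times separated, the informal analogue of excluding Zeno behavior.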

  10. Minimized state complexity of quantum-encoded cryptic processes

    Science.gov (United States)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  11. Cognitive and emotional reactions to daily events: the effects of self-esteem and self-complexity.

    Science.gov (United States)

    Campbell, J D; Chew, B; Scratchley, L S

    1991-09-01

    In this article we examine the effects of self-esteem and self-complexity on cognitive appraisals of daily events and emotional lability. Subjects (n = 67) participated in a 2-week diary study; each day they made five mood ratings, described the most positive and negative events of the day, and rated these two events on six appraisal measures. Neither self-esteem nor self-complexity was related to an extremity measure of mood variability. Both traits were negatively related to measures assessing the frequency of mood change, although the effect of self-complexity dissipated when self-esteem was taken into account. Self-esteem (but not self-complexity) was also related to event appraisals: Subjects with low self-esteem rated their daily events as less positive and as having more impact on their moods. Subjects with high self-esteem made more internal, stable, global attributions for positive events than for negative events, whereas subjects low in self-esteem made similar attributions for both types of events and viewed their negative events as being more personally important than did subjects high in self-esteem. Despite these self-esteem differences in subjects' views of their daily events, naive judges (n = 63) who read the event descriptions and role-played their appraisals of them generally did not distinguish between the events that had been experienced by low self-esteem versus high self-esteem diary subjects.

  12. A Hybrid Methodology for Modeling Risk of Adverse Events in Complex Health-Care Settings.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali; Dierks, Meghan

    2017-03-01

    In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain, this can be extremely challenging due to the wide variability in the way that health-care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors and also physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AE. 
BBNs are networks of probabilities that can capture probabilistic relations

  13. Emotional Granularity Effects on Event-Related Brain Potentials during Affective Picture Processing.

    Science.gov (United States)

    Lee, Ja Y; Lindquist, Kristen A; Nam, Chang S

    2017-01-01

    There is debate about whether emotional granularity, the tendency to label emotions in a nuanced and specific manner, is merely a product of labeling abilities, or a systematic difference in the experience of emotion during emotionally evocative events. According to the Conceptual Act Theory of Emotion (CAT) (Barrett, 2006), emotional granularity is due to the latter and is a product of on-going temporal differences in how individuals categorize and thus make meaning of their affective states. To address this question, the present study investigated the effects of individual differences in emotional granularity on electroencephalography-based brain activity during the experience of emotion in response to affective images. Event-related potentials (ERP) and event-related desynchronization and synchronization (ERD/ERS) analysis techniques were used. We found that ERP responses during the very early (60-90 ms), middle (270-300 ms), and later (540-570 ms) moments of stimulus presentation were associated with individuals' level of granularity. We also observed that highly granular individuals, compared to lowly granular individuals, exhibited relatively stable desynchronization of alpha power (8-12 Hz) and synchronization of gamma power (30-50 Hz) during the 3 s of stimulus presentation. Overall, our results suggest that emotional granularity is related to differences in neural processing throughout emotional experiences and that high granularity could be associated with access to executive control resources and a more habitual processing of affective stimuli, or a kind of "emotional complexity." Implications for models of emotion are also discussed.

  14. Inclusive Education as Complex Process and Challenge for School System

    Directory of Open Access Journals (Sweden)

    Al-Khamisy Danuta

    2015-08-01

    Education may be considered as a number of processes, actions and effects affecting the human being; as the state or level of the results of these processes; or as the modification of functions, institutions and social practice roles which, as a result of inclusion, become a new, integrated system. It is thus a very complex process. Nowadays, complexity appears to be one of the most significant terms both in science and in philosophy. It seems that, despite the search for simple rules, strategies and solutions, everything remains ever more complex. The environment is complex, as is the organism living in it and exploring it, and the exploration itself is a complex phenomenon, much more so than might initially appear.

  15. Y-12 National Security Complex Emergency Management Hazards Assessment (EMHA) Process; FINAL

    International Nuclear Information System (INIS)

    Bailiff, E.F.; Bolling, J.D.

    2001-01-01

    This document establishes requirements and standard methods for the development and maintenance of the Emergency Management Hazards Assessment (EMHA) process used by the lead and all event contractors at the Y-12 Complex for emergency planning and preparedness. The EMHA process provides the technical basis for the Y-12 emergency management program. The instructions provided in this document include methods and requirements for performing the following emergency management activities at Y-12: (1) hazards identification; (2) hazards survey; and (3) hazards assessment.

  16. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  17. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    The automatic discrimination of rock fracture and blast events is complex and challenging due to their similar waveform characteristics. To solve this problem, a new method based on signal complexity analysis and machine learning has been proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect the complexity of the signals and are assembled into a feature vector set. Second, based on the feature vector set, a back-propagation neural network (BPNN) is applied as the machine-learning stage to establish a discriminator for rock fracture and blast events. Then, to evaluate the classification performance of the new method, the classification accuracies of a support vector machine (SVM), a naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC) curves are also analyzed. The results show that the new method obtains the best classification performance. In addition, the influence of different scale factors q and numbers of training samples n on the discrimination results is discussed. It is found that the classification accuracy of the new method reaches its highest value when q = 8–15 or 8–20 and n = 140.
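The feature-extraction stage, permutation entropy computed at several scale factors, can be sketched as follows. The two signals and the scale range are invented stand-ins for recorded waveforms, and the BPNN training step is omitted; the sketch only shows that the entropy features separate a regular, blast-like waveform from an irregular, fracture-like one.

```python
import math
from itertools import permutations

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    # Normalized Bandt-Pompe permutation entropy of a 1-D signal:
    # count ordinal patterns of length `order` sampled at the given delay.
    counts = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1
    probs = np.array([c for c in counts.values() if c > 0]) / n
    return float(-np.sum(probs * np.log(probs)) / math.log(math.factorial(order)))

def feature_vector(signal, scales=range(1, 6), order=3):
    # Entropy at several scale (delay) factors forms the feature vector
    # that would feed a classifier such as a BPNN (training omitted).
    return [permutation_entropy(signal, order=order, delay=s) for s in scales]

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
blast_like = np.sin(20 * t) * np.exp(-0.3 * t)   # regular: low entropy
fracture_like = rng.standard_normal(2000)        # irregular: high entropy
print(feature_vector(blast_like)[0] < feature_vector(fracture_like)[0])
```

Normalization by log(order!) keeps every feature in [0, 1], so the vectors can be fed to a neural network without further scaling.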

  18. Focused process improvement events: sustainability of impact on process and performance in an academic radiology department.

    Science.gov (United States)

    Rosenkrantz, Andrew B; Lawson, Kirk; Ally, Rosina; Chen, David; Donno, Frank; Rittberg, Steven; Rodriguez, Joan; Recht, Michael P

    2015-01-01

    To evaluate sustainability of impact of rapid, focused process improvement (PI) events on process and performance within an academic radiology department. Our department conducted PI during 2011 and 2012 in CT, MRI, ultrasound, breast imaging, and research billing. PI entailed participation by all stakeholders, facilitation by the department chair, collection of baseline data, meetings during several weeks, definition of performance metrics, creation of an improvement plan, and prompt implementation. We explore common themes among PI events regarding initial impact and durability of changes. We also assess performance in each area pre-PI, immediately post-PI, and at the time of the current study. All PI events achieved an immediate improvement in performance metrics, often entailing both examination volumes and on-time performance. IT-based solutions, process standardization, and redefinition of staff responsibilities were often central in these changes, and participants consistently expressed improved internal leadership and problem-solving ability. Major environmental changes commonly occurred after PI, including a natural disaster with equipment loss, a change in location or services offered, and new enterprise-wide electronic medical record system incorporating new billing and radiology informatics systems, requiring flexibility in the PI implementation plan. Only one PI team conducted regular post-PI follow-up meetings. Sustained improvement was frequently, but not universally, observed: in the long-term following initial PI, measures of examination volume showed continued progressive improvements, whereas measures of operational efficiency remained stable or occasionally declined. Focused PI is generally effective in achieving performance improvement, although a changing environment influences the sustainability of impact. Thus, continued process evaluation and ongoing workflow modifications are warranted. Copyright © 2015 American College of Radiology

  19. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste complex (HLW) at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that demonstrate significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes

  20. Procurement of complex performance in public infrastructure: a process perspective

    OpenAIRE

    Hartmann, Andreas; Roehrich, Jens; Davies, Andrew; Frederiksen, Lars; Davies, J.; Harrington, T.; Kirkwood, D.; Holweg, M.

    2011-01-01

    The paper analyzes the process of transitioning from procuring single products and services to procuring complex performance in public infrastructure. The aim is to examine the change in the interactions between buyer and supplier, the emergence of value co-creation and the capability development during the transition process. Based on a multiple, longitudinal case study the paper proposes three generic transition stages towards increased performance and infrastructural complexity. These stag...

  1. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
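The categorization task described above can be sketched with a toy bag-of-words classifier. The four category names come from the abstract, but the training reports, vocabulary, and the naive Bayes model below are invented stand-ins for the authors' NLP models:

```python
import math
from collections import Counter

# Invented free-text reports for the four categories named in the abstract.
TRAIN = [
    ("medication arrived late from pharmacy", "delivery delay"),
    ("pharmacy delivery was delayed overnight", "delivery delay"),
    ("wrong dose dispensed by pharmacy", "dispensing error"),
    ("dispensed incorrect medication strength", "dispensing error"),
    ("pyxis count did not match the record", "pyxis discrepancy"),
    ("discrepancy found during pyxis inventory", "pyxis discrepancy"),
    ("prescriber ordered the wrong drug", "prescriber error"),
    ("order entered by prescriber had wrong frequency", "prescriber error"),
]

def train(examples):
    # Per-label word counts and label frequencies.
    word_counts, label_counts = {}, Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts.setdefault(label, Counter()).update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    # Multinomial naive Bayes with add-one smoothing.
    vocab = {w for c in word_counts.values() for w in c}
    best, best_score = None, -math.inf
    for label, counts in word_counts.items():
        total = sum(counts.values())
        score = math.log(label_counts[label])
        for w in text.lower().split():
            score += math.log((counts[w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

model = train(TRAIN)
print(classify("pharmacy delivery delayed again", *model))  # 'delivery delay'
```

A production pipeline would of course use far richer features and models (as the AUC comparison in the abstract implies), but the sketch shows how free-text event descriptions map onto discrete categories for dashboard review.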

  2. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    Science.gov (United States)

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, Business Intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain process knowledge, check compliance and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes it possible to apply these methods to all IHE-based information systems.
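The transformation can be sketched as mapping audit records onto the XES trace/event structure. The records below are simplified, invented examples rather than the full ATNA message schema; the XES attribute keys (`concept:name`, `time:timestamp`) are the standard extension keys used by process-mining tools.

```python
import xml.etree.ElementTree as ET

# Simplified stand-ins for ATNA audit records (illustrative fields only).
audit_trail = [
    {"patient": "P-001", "action": "Patient Record Opened", "time": "2015-01-05T10:00:00"},
    {"patient": "P-001", "action": "Order Placed",          "time": "2015-01-05T10:12:00"},
]

log = ET.Element("log", {"xes.version": "1.0"})
traces = {}
for rec in audit_trail:
    # One XES trace per patient (the case notion), one XES event per record.
    if rec["patient"] not in traces:
        trace = ET.SubElement(log, "trace")
        ET.SubElement(trace, "string",
                      {"key": "concept:name", "value": rec["patient"]})
        traces[rec["patient"]] = trace
    event = ET.SubElement(traces[rec["patient"]], "event")
    ET.SubElement(event, "string", {"key": "concept:name", "value": rec["action"]})
    ET.SubElement(event, "date", {"key": "time:timestamp", "value": rec["time"]})

xes = ET.tostring(log, encoding="unicode")
print(xes.count("<event>"))  # one XES event per audit record
```

The resulting XES document can then be loaded into standard process-mining tools for discovery and conformance checking.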

  3. Knowing what, where, and when: event comprehension in language processing.

    Science.gov (United States)

    Kukona, Anuenue; Altmann, Gerry T M; Kamide, Yuki

    2014-10-01

    We investigated the retrieval of location information, and the deployment of attention to these locations, following (described) event-related location changes. In two visual world experiments, listeners viewed arrays with containers like a bowl, jar, pan, and jug, while hearing sentences like "The boy will pour the sweetcorn from the bowl into the jar, and he will pour the gravy from the pan into the jug. And then, he will taste the sweetcorn". At the discourse-final "sweetcorn", listeners fixated context-relevant "Target" containers most (jar). Crucially, we also observed two forms of competition: listeners fixated containers that were not directly referred to but associated with "sweetcorn" (bowl), and containers that played the same role as Targets (goals of moving events; jug), more than distractors (pan). These results suggest that event-related location changes are encoded across representations that compete for comprehenders' attention, such that listeners retrieve, and fixate, locations that are not referred to in the unfolding language, but related to them via object or role information. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Event processing time prediction at the CMS experiment of the Large Hadron Collider

    International Nuclear Information System (INIS)

    Cury, Samir; Gutsche, Oliver; Kcira, Dorian

    2014-01-01

    The physics event reconstruction is one of the biggest challenges for the computing of the LHC experiments. Among the different tasks that the computing systems of the CMS experiment perform, reconstruction takes most of the available CPU resources. The reconstruction time of single collisions varies according to event complexity. Measurements were made to determine this correlation quantitatively, creating the means to predict it based on the data-taking conditions of the input samples. Currently, the data processing system splits tasks into groups with the same number of collisions and does not account for variations in processing time. These variations can be large and can lead to a considerable increase in the time it takes for CMS workflows to finish. The goal of this study was to use estimates of processing time to split the workflow into jobs more efficiently. By considering the CPU time needed for each job, the spread of the job-length distribution in a workflow is reduced.
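The splitting idea can be sketched as a greedy grouping of events by predicted per-event CPU time instead of by equal event counts. The per-event times below are invented for illustration (e.g. pile-up-heavy events taking longer):

```python
def split_by_time(event_times, target_seconds):
    # Greedily pack consecutive events into jobs so that each job's
    # predicted CPU time stays near the target, rather than fixing
    # the number of events per job.
    jobs, current, current_time = [], [], 0.0
    for i, t in enumerate(event_times):
        if current and current_time + t > target_seconds:
            jobs.append(current)
            current, current_time = [], 0.0
        current.append(i)
        current_time += t
    if current:
        jobs.append(current)
    return jobs

# Hypothetical sample: 50 light events then 10 heavy (pile-up-rich) ones.
times = [1.0] * 50 + [5.0] * 10   # predicted seconds per event
jobs = split_by_time(times, target_seconds=25.0)
print([sum(times[i] for i in job) for job in jobs])  # → [25.0, 25.0, 25.0, 25.0]
```

Equal-count splitting of the same sample would produce jobs ranging from 25 s to 125 s of CPU time; time-based splitting keeps the job-length distribution narrow, which is exactly the reduction in spread the study targets.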

  5. Efficient rare-event simulation for multiple jump events in regularly varying random walks and compound Poisson processes

    NARCIS (Netherlands)

    B. Chen (Bohan); J. Blanchet; C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2017-01-01

We propose a class of strongly efficient rare event simulation estimators for random walks and compound Poisson processes with a regularly varying increment/jump-size distribution in a general large deviations regime. Our estimator is based on an importance sampling strategy that hinges

  6. Attention - Control in the Frequentistic Processing of Multidimensional Event Streams.

    Science.gov (United States)

    1980-07-01

Human memory. Annual Review of Psychology, 1979, 30, 63-702. Craik, F. I. M., & Lockhart, R. S. Levels of processing: A framework for memory research...1979; Jacoby & Craik, 1979). Thus, the notions of memorability (or retrievability) and levels of processing are tied closely in the sense that the...differing levels and degrees of elaborateness (Jacoby & Craik, 1979). Decisions as to which attributes receive elaborated processing and how they are

  7. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held in October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany, in the city of Wurzburg in Bavaria.

  8. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  9. The Effects of Syntactic Complexity on Processing Sentences in Noise

    Science.gov (United States)

    Carroll, Rebecca; Ruigendijk, Esther

    2013-01-01

    This paper discusses the influence of stationary (non-fluctuating) noise on processing and understanding of sentences, which vary in their syntactic complexity (with the factors canonicity, embedding, ambiguity). It presents data from two RT-studies with 44 participants testing processing of German sentences in silence and in noise. Results show a…

  10. Low latitude ionospheric TEC responses to dynamical complexity quantifiers during transient events over Nigeria

    Science.gov (United States)

    Ogunsua, Babalola

    2018-04-01

In this study, the values of chaoticity and dynamical complexity parameters for selected storm periods in the years 2011 and 2012 have been computed. This was done using detrended TEC data sets measured at the Birnin-Kebbi, Torro and Enugu global positioning system (GPS) receiver stations in Nigeria. It was observed that the significance of difference (SD) values were mostly greater than 1.96, but surprisingly lower than 1.96 on September 29, 2011. The computed SD values were also found to be reduced in most cases just after the geomagnetic storm, with immediate recovery a day after the main phase of the storm, while the values of the Lyapunov exponent and Tsallis entropy remained reduced under the influence of geomagnetic storms. It was also observed that the Lyapunov exponent and Tsallis entropy showed similar variation patterns during storm periods in most cases. Surprisingly, lower values of these dynamical quantifiers were also recorded during the solar flare events of August 8th and 9th of the year 2011. The possible mechanisms responsible for these observations are discussed in this work. Our observations show that the ionospheric effects of other transient events besides geomagnetic storms can also be revealed by the variation of chaoticity and dynamical complexity.

  11. A novel approach for modelling complex maintenance systems using discrete event simulation

    International Nuclear Information System (INIS)

    Alrabghi, Abdullah; Tiwari, Ashutosh

    2016-01-01

Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach builds on the wide success DES has achieved in manufacturing by allowing integration with models that are closely related to maintenance, such as production and spare parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation-based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights into non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems. - Highlights: • This research is one of the first to optimise maintenance strategies simultaneously. • New insights into non-conflicting objectives in maintenance systems. • The approach can be used to optimise industrial maintenance systems.
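The idea of driving maintenance behaviour from the simulator's event queue can be illustrated with a minimal discrete event simulation: a priority queue holds timestamped failure and repair events for each unit. The two-unit system and the constant failure/repair times below are illustrative assumptions, not the paper's model:

```python
# Minimal DES sketch: a heap serves as the event queue; each unit cycles
# fail -> repair within a finite horizon, and downtime is accumulated.
import heapq

def simulate(horizon=100.0, mttf=20.0, repair=5.0, units=2):
    """Return total downtime per unit over the horizon (toy corrective-maintenance model)."""
    queue = []  # event queue entries: (time, unit, kind)
    downtime = [0.0] * units
    for u in range(units):
        heapq.heappush(queue, (mttf, u, "fail"))
    while queue:
        t, u, kind = heapq.heappop(queue)
        if t > horizon:
            break
        if kind == "fail":
            downtime[u] += min(repair, horizon - t)
            heapq.heappush(queue, (t + repair, u, "repaired"))
        else:  # unit restored; schedule its next failure
            heapq.heappush(queue, (t + mttf, u, "fail"))
    return downtime

print(simulate())  # → [20.0, 20.0]: four 5-hour repairs per unit in 100 hours
```

A maintenance strategy would plug in here by inspecting or reordering the queued events (e.g. grouping repairs), which is the kind of event-queue access the paper exploits.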

  12. Gradation of complexity and predictability of hydrological processes

    Science.gov (United States)

    Sang, Yan-Fang; Singh, Vijay P.; Wen, Jun; Liu, Changming

    2015-06-01

Quantification of the complexity and predictability of hydrological systems is important for evaluating the impact of climate change on hydrological processes and for guiding water activities. In the literature, the focus seems to have been on describing the complexity of the spatiotemporal distribution of hydrological variables, but little attention has been paid to the study of complexity gradation, because the degree of absolute complexity of hydrological systems cannot be objectively evaluated. Here we show that the complexity and predictability of hydrological processes can be graded into three ranks (low, middle, and high). The gradation is based on the difference between the energy distribution of a hydrological series and that of white noise at multiple temporal scales. It reflects the different energy concentration levels and contents of deterministic components of hydrological series in the three ranks. A higher energy concentration level reflects lower complexity and higher predictability, whereas a scattered energy distribution similar to that of white noise has the highest complexity and is almost unpredictable. We conclude that the three ranks (low, middle, and high) approximately correspond to deterministic, stochastic, and random hydrological systems, respectively. The result of complexity gradation can guide hydrological observations and modeling, and the identification of similarity patterns among different hydrological systems.
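As a rough illustration of the gradation idea (not the paper's exact method), one can measure how concentrated a series' spectral energy is relative to white noise, whose energy is spread nearly evenly across frequencies. The three synthetic series below stand in for deterministic, stochastic, and random hydrological records:

```python
# Sketch: fraction of spectral energy held by the top 10% of frequency bins.
# High concentration ~ deterministic (predictable); flat spectrum ~ random.
import numpy as np

def energy_concentration(x, top_frac=0.1):
    """Fraction of total spectral energy in the top `top_frac` of frequency bins."""
    power = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    k = max(1, int(top_frac * len(power)))
    return np.sort(power)[::-1][:k].sum() / power.sum()

rng = np.random.default_rng(0)
t = np.arange(512)
deterministic = np.sin(2 * np.pi * t / 32)               # low complexity
stochastic = deterministic + 0.5 * rng.standard_normal(512)  # middle
random_series = rng.standard_normal(512)                 # high complexity
for name, x in [("deterministic", deterministic), ("stochastic", stochastic),
                ("random", random_series)]:
    print(name, round(energy_concentration(x), 2))
```

The concentration decreases from the deterministic to the random series, matching the low/middle/high complexity ranks described above.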

  13. Remote Sensing of Surficial Process Responses to Extreme Meteorological Events

    Science.gov (United States)

    Brakenridge, G. Robert

    1997-01-01

    Changes in the frequency and magnitude of extreme meteorological events are associated with changing environmental means. Such events are important in human affairs, and can also be investigated by orbital remote sensing. During the course of this project, we applied ERS-1, ERS-2, Radarsat, and an airborne sensor (AIRSAR-TOPSAR) to measure flood extents, flood water surface profiles, and flood depths. We established a World Wide Web site (the Dartmouth Flood Observatory) for publishing remote sensing-based maps of contemporary floods worldwide; this is also an online "active archive" that presently constitutes the only global compilation of extreme flood events. We prepared an article for EOS concerning SAR imaging of the Mississippi Valley flood; an article for the International Journal of Remote Sensing on measurement of a river flood wave using ERS-2, began work on an article (since completed and published) on the Flood Observatory for a Geoscience Information Society Proceedings volume, and presented lectures at several Geol. Soc. of America Natl. Meetings, an Assoc. of Amer. Geographers Natl. Meeting, and a Binghamton Geomorphology Symposium (all on SAR remote sensing of the Mississippi Valley flood). We expanded in-house modeling capabilities by installing the latest version of the Army Corps of Engineers RMA two-dimensional hydraulics software and BYU Engineering Graphics Lab's Surface Water Modeling System (finite elements based pre- and post-processors for RMA work) and also added watershed modeling software. We are presently comparing the results of the 2-d flow models with SAR image data. The grant also supported several important upgrades of pc-based remote sensing infrastructure at Dartmouth. During work on this grant, we collaborated with several workers at the U.S. 
Army Corps of Engineers, Remote Sensing/GIS laboratory (for flood inundation mapping and modeling; particularly of the Illinois River using the AIRSAR/TOPSAR/ERS-2 combined data), with Dr

  14. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy.

    Science.gov (United States)

    Fidler, Andrew F; Singh, Ved P; Long, Phillip D; Dahlberg, Peter D; Engel, Gregory S

    2013-10-21

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex.

  15. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Fidler, Andrew F.; Singh, Ved P.; Engel, Gregory S. [Department of Chemistry, The Institute for Biophysical Dynamics, and The James Franck Institute, The University of Chicago, Chicago, Illinois 60637 (United States); Long, Phillip D.; Dahlberg, Peter D. [Graduate Program in the Biophysical Sciences, The University of Chicago, Chicago, Illinois 60637 (United States)

    2013-10-21

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex.

  16. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy

    International Nuclear Information System (INIS)

    Fidler, Andrew F.; Singh, Ved P.; Engel, Gregory S.; Long, Phillip D.; Dahlberg, Peter D.

    2013-01-01

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex

  17. An Event-Related Potential Study on the Effects of Cannabis on Emotion Processing

    Science.gov (United States)

    Troup, Lucy J.; Bastidas, Stephanie; Nguyen, Maia T.; Andrzejewski, Jeremy A.; Bowers, Matthew; Nomi, Jason S.

    2016-01-01

The effect of cannabis on emotional processing was investigated using event-related potential (ERP) paradigms. ERPs associated with emotional processing of cannabis users and non-using controls were recorded and compared during an implicit and explicit emotional expression recognition and empathy task. Comparisons of P3 component mean amplitudes were made between cannabis users and controls. Results showed a significant decrease in P3 amplitude in cannabis users compared to controls. Specifically, cannabis users showed reduced P3 amplitudes for implicit compared to explicit processing over centro-parietal sites, an effect which reversed, and was enhanced, at fronto-central sites. Cannabis users also showed a decreased P3 to happy faces, with an increase to angry faces, compared to controls. These effects appeared greatest among participants who self-reported the highest levels of cannabis consumption. Those cannabis users with the greatest consumption rates showed the largest P3 deficits for explicit processing and negative emotions. These data suggest that there is a complex relationship between cannabis consumption and emotion processing that appears to be modulated by attention. PMID:26926868

  18. Event-related potential evidence for the processing efficiency theory.

    Science.gov (United States)

    Murray, N P; Janelle, C M

    2007-01-15

    The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.

  19. Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores.

    Directory of Open Access Journals (Sweden)

    Lorea Flores

Full Text Available Habitat complexity can influence predation rates (e.g. by providing refuge) but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants), in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures, and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, 'habitat complexity' by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems.

  20. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thoughts that pertains to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration, and potential reconstruction, of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  1. Event Processing and Variable Part of Sample Period Determining in Combined Systems Using GA

    Science.gov (United States)

    Strémy, Maximilián; Závacký, Pavol; Jedlička, Martin

    2011-01-01

This article deals with combined dynamic systems and the use of modern techniques for handling them, focusing particularly on sampling period design, cyclic processing tasks, and related processing algorithms in combined event management systems using genetic algorithms.

  2. Profiling event logs to configure risk indicators for process delays

    NARCIS (Netherlands)

    Pika, A.; Aalst, van der W.M.P.; Fidge, C.J.; Hofstede, ter A.H.M.; Wynn, M.T.; Salinesi, C.; Norrie, M.C.; Pastor, O.

    2013-01-01

    Risk identification is one of the most challenging stages in the risk management process. Conventional risk management approaches provide little guidance and companies often rely on the knowledge of experts for risk identification. In this paper we demonstrate how risk indicators can be used to

  3. Event-driven processing for hardware-efficient neural spike sorting

    Science.gov (United States)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

Objective. The prospect of real-time, on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide a new, efficient means for hardware implementation that is completely activity-dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. Considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be performed with comparable or better accuracy than reference methods, whilst requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
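The level-crossing sampling idea can be sketched in a few lines: an event is emitted only when the signal moves by one quantization step from the last emitted level, so flat baseline segments produce no data at all. The threshold and the toy spike-like waveform below are illustrative, not the paper's encoder:

```python
# Sketch of continuous-time level-crossing sampling: activity-dependent
# events (sample index, crossing direction) instead of fixed-rate samples.

def level_crossing_sample(signal, delta=0.5):
    """Emit (index, direction) whenever the signal crosses the next level
    spaced `delta` apart; quiescent segments emit nothing."""
    events = []
    last_level = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        while x - last_level >= delta:   # upward crossings
            last_level += delta
            events.append((i, +1))
        while last_level - x >= delta:   # downward crossings
            last_level -= delta
            events.append((i, -1))
    return events

# Flat baseline with one spike-like excursion: events cluster at the spike.
waveform = [0.0] * 5 + [0.6, 1.8, 2.4, 1.1, 0.2] + [0.0] * 5
events = level_crossing_sample(waveform, delta=0.5)
print(events)
```

All eight events fall on the five spike samples, while the ten baseline samples generate no output, which is the activity-dependent data-rate saving the abstract describes.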

  4. Brain Signals of Face Processing as Revealed by Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Ela I. Olivares

    2015-01-01

    Full Text Available We analyze the functional significance of different event-related potentials (ERPs as electrophysiological indices of face perception and face recognition, according to cognitive and neurofunctional models of face processing. Initially, the processing of faces seems to be supported by early extrastriate occipital cortices and revealed by modulations of the occipital P1. This early response is thought to reflect the detection of certain primary structural aspects indicating the presence grosso modo of a face within the visual field. The posterior-temporal N170 is more sensitive to the detection of faces as complex-structured stimuli and, therefore, to the presence of its distinctive organizational characteristics prior to within-category identification. In turn, the relatively late and probably more rostrally generated N250r and N400-like responses might respectively indicate processes of access and retrieval of face-related information, which is stored in long-term memory (LTM. New methods of analysis of electrophysiological and neuroanatomical data, namely, dynamic causal modeling, single-trial and time-frequency analyses, are highly recommended to advance in the knowledge of those brain mechanisms concerning face processing.

  5. Fine grained event processing on HPCs with the ATLAS Yoda system

    CERN Document Server

    Calafiura, Paolo; The ATLAS collaboration; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

    High performance computing facilities present unique challenges and opportunities for HENP event processing. The massive scale of many HPC systems means that fractionally small utilizations can yield large returns in processing throughput. Parallel applications which can dynamically and efficiently fill any scheduling opportunities the resource presents benefit both the facility (maximal utilization) and the (compute-limited) science. The ATLAS Yoda system provides this capability to HENP-like event processing applications by implementing event-level processing in an MPI-based master-client model that integrates seamlessly with the more broadly scoped ATLAS Event Service. Fine grained, event level work assignments are intelligently dispatched to parallel workers to sustain full utilization on all cores, with outputs streamed off to destination object stores in near real time with similarly fine granularity, such that processing can proceed until termination with full utilization. The system offers the efficie...

  6. Ferric and cobaltous hydroacid complexes for forward osmosis (FO) processes

    KAUST Repository

    Ge, Qingchun; Fu, Fengjiang; Chung, Neal Tai-Shung

    2014-01-01

    Cupric and ferric hydroacid complexes have proven their advantages as draw solutes in forward osmosis in terms of high water fluxes, negligible reverse solute fluxes and easy recovery (Ge and Chung, 2013. Hydroacid complexes: A new class of draw solutes to promote forward osmosis (FO) processes. Chemical Communications 49, 8471-8473.). In this study, cobaltous hydroacid complexes were explored as draw solutes and compared with the ferric hydroacid complex to study the factors influencing their FO performance. The solutions of the cobaltous complexes produce high osmotic pressures due to the presence of abundant hydrophilic groups. These solutes are able to dissociate and form a multi-charged anion and Na+ cations in water. In addition, these complexes have expanded structures which lead to negligible reverse solute fluxes and provide relatively easy approaches in regeneration. These characteristics make the newly synthesized cobaltous complexes appropriate as draw solutes. The FO performance of the cobaltous and ferric-citric acid (Fe-CA) complexes were evaluated respectively through cellulose acetate membranes, thin-film composite membranes fabricated on polyethersulfone supports (referred as TFC-PES), and polybenzimidazole and PES dual-layer (referred as PBI/PES) hollow fiber membranes. Under the conditions of DI water as the feed and facing the support layer of TFC-PES FO membranes (PRO mode), draw solutions at 2.0M produced relatively high water fluxes of 39-48 LMH (Lm-2hr-1) with negligible reverse solute fluxes. A water flux of 17.4 LMH was achieved when model seawater of 3.5wt.% NaCl replaced DI water as the feed and 2.0M Fe-CA as the draw solution under the same conditions. The performance of these hydroacid complexes surpasses those of the synthetic draw solutes developed in recent years. This observation, along with the relatively easy regeneration, makes these complexes very promising as a novel class of draw solutes. © 2014 Elsevier Ltd.

  7. Ferric and cobaltous hydroacid complexes for forward osmosis (FO) processes

    KAUST Repository

    Ge, Qingchun

    2014-07-01

    Cupric and ferric hydroacid complexes have proven their advantages as draw solutes in forward osmosis in terms of high water fluxes, negligible reverse solute fluxes and easy recovery (Ge and Chung, 2013. Hydroacid complexes: A new class of draw solutes to promote forward osmosis (FO) processes. Chemical Communications 49, 8471-8473.). In this study, cobaltous hydroacid complexes were explored as draw solutes and compared with the ferric hydroacid complex to study the factors influencing their FO performance. The solutions of the cobaltous complexes produce high osmotic pressures due to the presence of abundant hydrophilic groups. These solutes are able to dissociate and form a multi-charged anion and Na+ cations in water. In addition, these complexes have expanded structures which lead to negligible reverse solute fluxes and provide relatively easy approaches in regeneration. These characteristics make the newly synthesized cobaltous complexes appropriate as draw solutes. The FO performance of the cobaltous and ferric-citric acid (Fe-CA) complexes were evaluated respectively through cellulose acetate membranes, thin-film composite membranes fabricated on polyethersulfone supports (referred as TFC-PES), and polybenzimidazole and PES dual-layer (referred as PBI/PES) hollow fiber membranes. Under the conditions of DI water as the feed and facing the support layer of TFC-PES FO membranes (PRO mode), draw solutions at 2.0M produced relatively high water fluxes of 39-48 LMH (Lm-2hr-1) with negligible reverse solute fluxes. A water flux of 17.4 LMH was achieved when model seawater of 3.5wt.% NaCl replaced DI water as the feed and 2.0M Fe-CA as the draw solution under the same conditions. The performance of these hydroacid complexes surpasses those of the synthetic draw solutes developed in recent years. This observation, along with the relatively easy regeneration, makes these complexes very promising as a novel class of draw solutes. © 2014 Elsevier Ltd.

  8. Mesoscale Convective Complexes (MCCs) over the Indonesian Maritime Continent during the ENSO events

    Science.gov (United States)

    Trismidianto; Satyawardhana, H.

    2018-05-01

This study analyzed mesoscale convective complexes (MCCs) over the Indonesian Maritime Continent (IMC) during El Niño/Southern Oscillation (ENSO) events for the 15-year period from 2001 to 2015. The MCCs were identified by infrared satellite imagery obtained from the Himawari generation satellite data. The frequencies of MCC occurrences during El Niño and La Niña were higher than under neutral conditions during DJF. The peak of MCC occurrences during DJF under La Niña and neutral conditions is in February, while under El Niño it is in January. ENSO strongly affects the occurrence of MCCs during the DJF season. The MCCs were also accompanied by increased rainfall intensity at the locations of their occurrence for all ENSO events. During JJA seasons, MCC occurrences are always found under neutral, El Niño and La Niña conditions in the Indian Ocean. MCCs occurring during the JJA season under El Niño and neutral conditions lasted much longer on average than during the DJF season. In contrast, MCCs occurring under La Niña conditions during the JJA season dissipated more rapidly than during DJF. This indicates that the influence of MCCs under La Niña is stronger during the DJF season than during the JJA season.

  9. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    Directory of Open Access Journals (Sweden)

    Xiaoqing Hao

    Full Text Available To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiment: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed the sentiment mode complex network of online public opinions (SMCOP), with modes as nodes and the conversion relations in chronological order between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys a power law. Most posts' sentiments are weakly positive or neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups, with ppppp and ooooo as the core modes, respectively. Few modes have large betweenness centrality values, and most modes convert to each other through these high-betweenness modes as mediums. Therefore, relevant persons or institutions can take measures to guide people's sentiments regarding online hot events according to this sentiment diffusion mechanism.
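The coarse-graining and network-construction steps described in this abstract can be sketched as follows. This is a minimal illustration with invented sentiment labels and an assumed five-label window for modes; the paper's dictionary-based sentiment scoring and k-plex analysis are not reproduced.

```python
from collections import Counter

# Sentiment label of each post, in chronological order (illustrative data).
labels = list("PpooNnpPooopppooPnoo")

# Coarse-grain the label sequence into modes: sliding windows of 5 labels.
window = 5
modes = ["".join(labels[i:i + window]) for i in range(len(labels) - window + 1)]

# Edges connect each mode to its successor in chronological order;
# repeated transitions accumulate as edge weights.
edges = Counter(zip(modes, modes[1:]))

# Node strength = total weight of the edges incident to the node.
strength = Counter()
for (a, b), w in edges.items():
    strength[a] += w
    strength[b] += w

print(strength.most_common(3))
```

Measures such as clustering coefficient and betweenness centrality would then be computed on the resulting weighted graph.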

  10. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    Science.gov (United States)

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies that must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze the keywords and word relationships of online Chinese news headlines, using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events from the search results of Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we developed an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly.

  11. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    Directory of Open Access Journals (Sweden)

    Huajiao Li

    Full Text Available Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies that must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze the keywords and word relationships of online Chinese news headlines, using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events from the search results of Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we developed an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly.

  12. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation … understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug…

  13. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    Science.gov (United States)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.

  14. Can complex cellular processes be governed by simple linear rules?

    Science.gov (United States)

    Selvarajoo, Kumar; Tomita, Masaru; Tsuchiya, Masa

    2009-02-01

    Complex living systems have shown remarkably well-orchestrated, self-organized, robust, and stable behavior under a wide range of perturbations. However, despite the recent generation of high-throughput experimental datasets, basic cellular processes such as division, differentiation, and apoptosis still remain elusive. One of the key reasons is the lack of understanding of the governing principles of complex living systems. Here, we have reviewed the success of perturbation-response approaches, where without the requirement of detailed in vivo physiological parameters, the analysis of temporal concentration or activation response unravels biological network features such as causal relationships of reactant species, regulatory motifs, etc. Our review shows that simple linear rules govern the response behavior of biological networks in an ensemble of cells. It is daunting to know why such simplicity could hold in a complex heterogeneous environment. Provided physical reasons can be explained for these phenomena, major advancement in the understanding of basic cellular processes could be achieved.

  15. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    Science.gov (United States)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  16. Some considerations on Bible translation as complex process | Van ...

    African Journals Online (AJOL)

    It is argued that translation is a complex process: meaning is "created" by decoding the source text on several levels (for instance, grammatical; structural; literary; and socio-cultural levels). This "meaning" must then be encoded into the target language by means of the linguistic, literary, and cultural conventions of the target ...

  17. Managing complexity in process digitalisation with dynamic condition response graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Debois, Søren; Slaats, Tijs

    2017-01-01

    … Sadly, it is also witnessed by a number of expensive failed digitalisation projects. In this paper we point to two key problems in state-of-the-art BPM technologies: 1) the use of rigid flow diagrams as the "source code" of process digitalisation is not suitable for managing the complexity of knowledge…

  18. Cueing Complex Animations: Does Direction of Attention Foster Learning Processes?

    Science.gov (United States)

    Lowe, Richard; Boucheix, Jean-Michel

    2011-01-01

    The time course of learners' processing of a complex animation was studied using a dynamic diagram of a piano mechanism. Over successive repetitions of the material, two forms of cueing (standard colour cueing and anti-cueing) were administered either before or during the animated segment of the presentation. An uncued group and two other control…

  19. Emotional processing and psychopathic traits in male college students: An event-related potential study.

    Science.gov (United States)

    Medina, Amy L; Kirilko, Elvira; Grose-Fifer, Jillian

    2016-08-01

    Emotional processing deficits are often considered a hallmark of psychopathy. However, there are relatively few studies that have investigated how the late positive potential (LPP) elicited by both positive and negative emotional stimuli is modulated by psychopathic traits, especially in undergraduates. Attentional deficits have also been posited to be associated with emotional blunting in psychopathy; consequently, results from previous studies may have been influenced by task demands. Therefore, we investigated the relationship between the neural correlates of emotional processing and psychopathic traits by measuring event-related potentials (ERPs) during a task with a relatively low cognitive load. A group of male undergraduates were classified as having either high or low levels of psychopathic traits according to their total scores on the Psychopathic Personality Inventory - Revised (PPI-R). A subgroup of these participants then passively viewed complex emotional and neutral images from the International Affective Picture System (IAPS) while their EEGs were recorded. As hypothesized, the late LPP elicited by emotional pictures was in general significantly reduced for participants with high Total PPI-R scores relative to those with low scores, especially for pictures that were rated as less emotionally arousing. Our data suggest that male undergraduates with high, but subclinical, levels of psychopathic traits did not maintain continued higher-order processing of affective information, especially when it was perceived to be less arousing in nature. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Capturing connectivity and causality in complex industrial processes

    CERN Document Server

    Yang, Fan; Shah, Sirish L; Chen, Tongwen

    2014-01-01

    This brief reviews concepts of inter-relationship in modern industrial processes, biological and social systems. Specifically, ideas of connectivity and causality within and between elements of a complex system are treated; these ideas are of great importance in analysing and influencing mechanisms, structural properties and their dynamic behaviour, especially for fault diagnosis and hazard analysis. Fault detection and isolation for industrial processes being concerned with root causes and fault propagation, the brief shows that process connectivity and causality information can be captured in two ways:
    · from process knowledge: structural modeling based on first-principles structural models can be merged with adjacency/reachability matrices or topology models obtained from process flow-sheets described in standard formats; and
    · from process data: cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian ne...

  1. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Splines (NURBS) and multiple levels of detail (Mixed and Reverse LoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of the modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  2. Visual perception of complex shape-transforming processes.

    Science.gov (United States)

    Schmidt, Filipp; Fleming, Roland W

    2016-11-01

    Morphogenesis-or the origin of complex natural form-has long fascinated researchers from practically every branch of science. However, we know practically nothing about how we perceive and understand such processes. Here, we measured how observers visually infer shape-transforming processes. Participants viewed pairs of objects ('before' and 'after' a transformation) and identified points that corresponded across the transformation. This allowed us to map out in spatial detail how perceived shape and space were affected by the transformations. Participants' responses were strikingly accurate and mutually consistent for a wide range of non-rigid transformations including complex growth-like processes. A zero-free-parameter model based on matching and interpolating/extrapolating the positions of high-salience contour features predicts the data surprisingly well, suggesting observers infer spatial correspondences relative to key landmarks. Together, our findings reveal the operation of specific perceptual organization processes that make us remarkably adept at identifying correspondences across complex shape-transforming processes by using salient object features. We suggest that these abilities, which allow us to parse and interpret the causally significant features of shapes, are invaluable for many tasks that involve 'making sense' of shape. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  4. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, and improving the curative and diagnostic process and the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, due to the use of the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, speedy training of young specialists is being achieved, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At big hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a general hospital network of display connections and a general hospital data bank is the most promising form of computerization.

  5. Complex processing of rubber waste through energy recovery

    Directory of Open Access Journals (Sweden)

    Roman Smelík

    2015-12-01

    Full Text Available This article deals with applied energy recovery solutions for the complex processing of rubber waste. It deals specifically with a solution that could maximize the use of all rubber waste while creating no additional waste whose disposal would be expensive and dangerous for the environment. The project is economically viable and energy self-sufficient. The outputs of the process could replace natural gas and crude oil products. Another part of the process is the separation of metals, which can be returned to metallurgical secondary production.

  6. Component Neural Systems for the Creation of Emotional Memories during Free Viewing of a Complex, Real-World Event.

    Science.gov (United States)

    Botzung, Anne; Labar, Kevin S; Kragel, Philip; Miles, Amanda; Rubin, David C

    2010-01-01

    To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously-viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally-relevant event sequences.

  7. Component neural systems for the creation of emotional memories during free viewing of a complex, real-world event

    Directory of Open Access Journals (Sweden)

    Anne Botzung

    2010-05-01

    Full Text Available To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously-viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally-relevant event sequences.

  8. Iterative Calibration: A Novel Approach for Calibrating the Molecular Clock Using Complex Geological Events.

    Science.gov (United States)

    Loeza-Quintana, Tzitziki; Adamowicz, Sarah J

    2018-02-01

    During the past 50 years, the molecular clock has become one of the main tools for providing a time scale for the history of life. In the era of robust molecular evolutionary analysis, clock calibration is still one of the most basic steps needing attention. When fossil records are limited, well-dated geological events are the main resource for calibration. However, biogeographic calibrations have often been used in a simplistic manner, for example assuming simultaneous vicariant divergence of multiple sister lineages. Here, we propose a novel iterative calibration approach to define the most appropriate calibration date by seeking congruence between the dates assigned to multiple allopatric divergences and the geological history. Exploring patterns of molecular divergence in 16 trans-Bering sister clades of echinoderms, we demonstrate that the iterative calibration is predominantly advantageous when using complex geological or climatological events-such as the opening/reclosure of the Bering Strait-providing a powerful tool for clock dating that can be applied to other biogeographic calibration systems and further taxa. Using Bayesian analysis, we observed that evolutionary rate variability in the COI-5P gene is generally distributed in a clock-like fashion for Northern echinoderms. The results reveal a large range of genetic divergences, consistent with multiple pulses of trans-Bering migrations. A resulting rate of 2.8% pairwise Kimura-2-parameter sequence divergence per million years is suggested for the COI-5P gene in Northern echinoderms. Given that molecular rates may vary across latitudes and taxa, this study provides a new context for dating the evolutionary history of Arctic marine life.

  9. Direct risk standardisation: a new method for comparing casemix adjusted event rates using complex models.

    Science.gov (United States)

    Nicholl, Jon; Jacques, Richard M; Campbell, Michael J

    2013-10-29

    Comparison of outcomes between populations or centres may be confounded by any casemix differences and standardisation is carried out to avoid this. However, when the casemix adjustment models are large and complex, direct standardisation has been described as "practically impossible", and indirect standardisation may lead to unfair comparisons. We propose a new method of directly standardising for risk rather than standardising for casemix which overcomes these problems. Using a casemix model which is the same model as would be used in indirect standardisation, the risk in individuals is estimated. Risk categories are defined, and event rates in each category for each centre to be compared are calculated. A weighted sum of the risk category specific event rates is then calculated. We have illustrated this method using data on 6 million admissions to 146 hospitals in England in 2007/8 and an existing model with over 5000 casemix combinations, and a second dataset of 18,668 adult emergency admissions to 9 centres in the UK and overseas and a published model with over 20,000 casemix combinations and a continuous covariate. Substantial differences between conventional directly casemix standardised rates and rates from direct risk standardisation (DRS) were found. Results based on DRS were very similar to Standardised Mortality Ratios (SMRs) obtained from indirect standardisation, with similar standard errors. Direct risk standardisation using our proposed method is as straightforward as using conventional direct or indirect standardisation, always enables fair comparisons of performance to be made, can use continuous casemix covariates, and was found in our examples to have similar standard errors to the SMR. It should be preferred when there is a risk that conventional direct or indirect standardisation will lead to unfair comparisons.
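The four DRS steps summarised in this abstract (estimate individual risk from a casemix model, bin patients into risk categories, compute category-specific event rates per centre, then take a weighted sum) can be sketched on toy data. The patient records, risk estimates and category cut-points below are illustrative assumptions, not the authors' actual casemix model.

```python
# Toy data: (centre, estimated_risk_from_casemix_model, event_occurred)
patients = [
    ("A", 0.05, 0), ("A", 0.10, 0), ("A", 0.40, 1), ("A", 0.70, 1),
    ("B", 0.05, 0), ("B", 0.20, 1), ("B", 0.45, 0), ("B", 0.80, 1),
]

cutpoints = [0.15, 0.5]  # illustrative risk-category boundaries

def category(risk):
    # 0 = low, 1 = medium, 2 = high risk
    return sum(risk >= c for c in cutpoints)

# Weights: the overall share of patients falling in each risk category.
n = len(patients)
weights = {k: sum(category(r) == k for _, r, _ in patients) / n for k in range(3)}

def drs_rate(centre):
    """Directly risk-standardised event rate for one centre."""
    rate = 0.0
    for k, w in weights.items():
        grp = [e for c, r, e in patients if c == centre and category(r) == k]
        if grp:  # skip risk categories with no patients at this centre
            rate += w * (sum(grp) / len(grp))
    return rate

for centre in ("A", "B"):
    print(centre, round(drs_rate(centre), 4))
```

Unlike indirect standardisation, the same category weights are applied to every centre, so the resulting rates are directly comparable.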

  10. The Mega Events in the processes of foundation and transformation of the city

    Directory of Open Access Journals (Sweden)

    Mario Coletta

    2012-12-01

    Full Text Available Every city arises from an event, an individual decision supported by a collective engagement, disciplined by rules defining technical and legal norms for implementing and managing, breeding customs, traditions, rituals and shared behaviours, as roots of culture and civilization. The origin of the foundation city, both in the ancient time and in the medieval and modern ages, represents the first major event for the city which resumes, in its physical and management setting-up, the matrix characteristics of urban planning complexity, putting into a dialectic comparison not only “where” (place and space, “when” (age and time and “how” (form and behaviour, but also “why” and “for whom”, turning out as the dominant subjects of the residential making-process since the beginning of civilisations. “Why” resumes the ambit of needs, material and spiritual instances, concrete and abstract instances, strictly connected with the ambit of will, wishes and ambitions, leading decisions and policies at the basis of plans, programmes and projects dominated by “must-can” binomial dialectics. “For whom” determines the transition from the material to the immaterial, from the concreteness of actions to relative aims, from the object to the subject, from the operator to the addressee of the operation, recalling the “ethical reason” which finds its deepest roots in the ideology of “idea” sublimation, fulfilled by new linguistic assumptions, symbolic messages aiming at exalting the membership extension from the family, brotherhood and tribe to the native land, to the territory and the city. In this perspective, the pyramid verticalisation of social relationships, disciplining the urban order, finds a convenient and comfortable acceptance for a faithful commonality (which guarantees trust relations and a progressive process of “civic sense”, which makes the general particular and the particular general, connecting rationality

  11. Integrating Continuous-Time and Discrete-Event Concepts in Process Modelling, Simulation and Control

    NARCIS (Netherlands)

    Beek, van D.A.; Gordijn, S.H.F.; Rooda, J.E.; Ertas, A.

    1995-01-01

    Currently, modelling of systems in the process industry requires the use of different specification languages for the specification of the discrete-event and continuous-time subsystems. In this way, models are restricted to individual subsystems of either a continuous-time or discrete-event nature.

  12. Evaluation of extreme temperature events in northern Spain based on process control charts

    Science.gov (United States)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
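A conventional attribute (p) control chart for the annual fraction of extreme days, as used in this abstract, can be sketched as follows. The temperature counts are invented, and this sketch uses plain binomial three-sigma limits; it omits the autocorrelation between extreme days that the authors handle with their binomial Markov extended process.

```python
import math

# Number of extreme-temperature days per year (illustrative series), n = 365.
n = 365
extreme_days = [18, 22, 15, 20, 25, 19, 23, 30, 28, 40]
p_hat = sum(extreme_days) / (n * len(extreme_days))  # baseline fraction

# Three-sigma control limits for a p-chart.
sigma = math.sqrt(p_hat * (1 - p_hat) / n)
ucl = p_hat + 3 * sigma
lcl = max(0.0, p_hat - 3 * sigma)

# Flag years whose fraction of extreme days falls outside the limits.
signals = [year for year, d in enumerate(extreme_days, start=1)
           if not (lcl <= d / n <= ucl)]
print(f"p={p_hat:.4f}, UCL={ucl:.4f}, LCL={lcl:.4f}, signal years: {signals}")
```

A run of years above the upper control limit would be evidence of an increase in extreme temperature events rather than ordinary year-to-year variation.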

  13. Strategy for introduction of rainwater management facility considering rainfall event applied on new apartment complex

    Science.gov (United States)

    KIM, H.; Lee, D. K.; Yoo, S.

    2014-12-01

    As regional torrential rains become frequent due to climate change, urban flooding happens very often, which is why it is necessary to prepare integrated measures against a wide range of rainfall. This study proposes the introduction of effective rainwater management facilities to maximize rainwater runoff reduction and recover natural water circulation under unpredictable extreme rainfall at the apartment-complex scale. The study site is a new apartment complex in Hanam, located east of Seoul, Korea. It has an area of 7.28 ha and is analysed using the EPA-SWMM and STORM models. First, the flood-reduction efficiency of green infrastructure (GI) was analysed for various rainfall events and soil characteristics, and the most effective values of the variables were derived. For the rainfall events, 15-minute data from the last 10 years were used. A comparison between event A (686 mm of rainfall over 22 days) and event B (661 mm over 4 days) showed that soil infiltration is 17.08% of rainfall for A and 5.48% for B, and the reduction of runoff after introduction of the GI is 24.76% for A and 6.56% for B. These results mean that GI is effective at small rainfall intensities, while an artificial rainwater retarding reservoir is needed for extreme rainfall. Second, a target year was set for the recovery of the pre-development hydrological cycle, and the amounts of infiltration, evaporation and surface runoff for the target year and the present were analysed on the basis of land coverage and the arrangement of LID facilities. Third, rainwater management scenarios were established and simulated with SWMM-LID. Rainwater management facilities include GI (green roof, porous pavement, vegetative swale, ecological pond, and raingarden) and artificial rainwater retarding reservoirs. Design scenarios fall into five types: 1) no GI, 2) conventional GI design (current design), 3) intensive GI design, 4) GI design + rainwater retarding reservoir, 5) maximized rainwater retarding reservoir. Intensive GI design is to have attribute value to

  14. A dataflow meta-computing framework for event processing in the H1 experiment

    International Nuclear Information System (INIS)

    Campbell, A.; Gerhards, R.; Mkrtchyan, T.; Levonian, S.; Grab, C.; Martyniak, J.; Nowak, J.

    2001-01-01

Linux-based networked PC clusters are replacing both the VME non-uniform direct memory access systems and the SMP shared-memory systems used previously for online event filtering and reconstruction. To allow optimal use of the distributed resources of PC clusters, an open software framework is presently being developed based on a dataflow paradigm for event processing. This framework allows the data of physics events and associated calibration data to be distributed to multiple computers from multiple input sources for processing, and the processed events to be subsequently collected at multiple outputs. The basis of the system is the event repository, essentially a first-in first-out event store which may be read and written in a manner similar to sequential file access. Events are stored in and transferred between repositories as suitably large sequences to enable high throughput. Multiple readers can read simultaneously from a single repository to receive event sequences, and multiple writers can insert event sequences into a repository; hence repositories are used for both event distribution and collection. To support synchronisation of the event flow, the repository implements barriers. A barrier must be written by all the writers of a repository before any reader can read it, and a reader must read a barrier before it may receive data from behind it. Only after all readers have read the barrier is it removed from the repository. A barrier may also have attached data; in this way calibration data can be distributed to all processing units. The repositories are implemented as multi-threaded CORBA objects in C++, and CORBA is used for all data transfers. Job setup scripts are written in Python, and interactive status and histogram display is provided by a Java program. Jobs run under the PBS batch system, providing shared use of resources for online triggering, offline mass reprocessing and user analysis jobs.
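The barrier semantics described above can be made concrete with a small sketch. The class below is a single-process toy model with hypothetical names (the real framework uses multi-threaded CORBA objects, none of which is reproduced here): a barrier becomes readable only after all writers have written it, and it is removed only after all readers have read it.

```python
from collections import deque

class EventRepository:
    """Toy model of the FIFO event repository with barriers (illustrative
    API only, not the actual H1 framework)."""

    def __init__(self, n_writers, n_readers):
        self.n_writers = n_writers
        self.n_readers = n_readers
        self.queue = deque()          # FIFO of event sequences and barriers
        self.barrier_writes = 0       # writers that have written the pending barrier
        self.pending_payload = None   # e.g. calibration data attached to it
        self.barrier_readers = set()  # readers that have seen the head barrier

    def write_events(self, seq):
        self.queue.append(("events", seq))

    def write_barrier(self, payload=None):
        # The barrier is enqueued (becomes visible) only once ALL writers
        # have written it.
        self.barrier_writes += 1
        if payload is not None:
            self.pending_payload = payload
        if self.barrier_writes == self.n_writers:
            self.queue.append(("barrier", self.pending_payload))
            self.barrier_writes = 0
            self.pending_payload = None

    def read(self, reader_id):
        """Next item for this reader, or None if it must wait."""
        if not self.queue:
            return None
        kind, payload = self.queue[0]
        if kind == "events":
            # Event sequences are distributed: consumed by one reader.
            self.queue.popleft()
            return payload
        # Head is a barrier: every reader must read it before removal,
        # and no reader may pass it twice.
        if reader_id in self.barrier_readers:
            return None
        self.barrier_readers.add(reader_id)
        if len(self.barrier_readers) == self.n_readers:
            self.queue.popleft()
            self.barrier_readers.clear()
        return ("barrier", payload)
```

With two writers and two readers, the barrier (carrying, say, calibration data) only appears after both writers wrote it, and reader A cannot read past it until reader B has also read it.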

  15. Accident and Off-Normal Response and Recovery from Multi-Canister Overpack (MCO) Processing Events

    International Nuclear Information System (INIS)

    ALDERMAN, C.A.

    2000-01-01

In the process of removing spent nuclear fuel (SNF) from the K Basins through its subsequent packaging, drying, transportation and storage steps, the SNF Project must be able to respond to all anticipated or foreseeable off-normal and accident events that may occur. Response procedures and recovery plans need to be in place, and personnel training established and implemented, to ensure the project will be capable of appropriate action. To establish suitable project planning, these events must first be identified and analyzed for their expected impact on the project. This document assesses all off-normal and accident events for their potential cross-facility or Multi-Canister Overpack (MCO) process-reversal impact. Table 1 provides the methodology for establishing the event planning level, and these events are given in Table 2 along with the general response and recovery planning. Accidents and off-normal events of the SNF Project have been evaluated and are identified in the appropriate facility Safety Analysis Report (SAR) or in the transportation Safety Analysis Report for Packaging (SARP). Hazards and accidents are summarized from these safety analyses and listed in separate tables for each facility and the transportation system in Appendix A, along with identified off-normal events. The tables identify the general response time required to ensure a stable state after the event, the governing response documents, and the events with potential cross-facility or SNF process-reversal impacts. Event closure is predicated on stable-state response time, impact on operations, and the mitigated annual occurrence frequency of the event as developed in the hazard analysis process.

  16. Complex service recovery processes: how to avoid triple deviation

    OpenAIRE

    Edvardsson, Bo; Tronvoll, Bård; Höykinpuro, Ritva

    2011-01-01

    Purpose – This article seeks to develop a new framework to outline factors that influence the resolution of unfavourable service experiences as a result of double deviation. The focus is on understanding and managing complex service recovery processes. Design/methodology/approach – An inductive, explorative and narrative approach was selected. Data were collected in the form of narratives from the field through interviews with actors at various levels in organisations as well as with custo...

  17. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

Previously described algorithms for the automated processing of complex α-spectra of actinides were implemented on an Ehlektronika D3-28 computer connected to an ICA-070 multichannel amplitude pulse analyzer. The developed program calculates peak intensities and relative isotope content, performs energy calibration of spectra, calculates peak centres of gravity and energy resolution, and performs integral counting over a selected part of the spectrum. The error of the automated processing method depends on the degree of spectrum complexity and lies within 1-12%. 8 refs.; 4 figs.; 2 tabs
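Two of the quantities named above, the peak centre of gravity and the energy resolution, are simple to compute for a single isolated peak. The following is a toy sketch (illustrative function name and method, not the original D3-28 program): the centroid is the count-weighted mean channel, and the resolution is estimated as the full width at half maximum (FWHM).

```python
import numpy as np

def peak_centroid_and_fwhm(channels, counts):
    """Centre of gravity and FWHM of a single isolated peak.

    channels: channel numbers (or energies after calibration).
    counts:   counts per channel for the peak region.
    """
    channels = np.asarray(channels, dtype=float)
    counts = np.asarray(counts, dtype=float)
    # Centre of gravity: count-weighted mean channel.
    centroid = np.sum(channels * counts) / np.sum(counts)
    # FWHM: extent of the region where counts reach half the peak maximum.
    half_max = counts.max() / 2.0
    above = channels[counts >= half_max]
    fwhm = above.max() - above.min()
    return centroid, fwhm
```

For a symmetric (Gaussian-like) peak the centroid coincides with the peak position, and the FWHM relates to the Gaussian width as FWHM ≈ 2.355σ, up to channel discretization.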

  18. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product is confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal quenching mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.

  19. GET controller and UNICORN: event-driven process execution and monitoring in logistics

    NARCIS (Netherlands)

    Baumgrass, A.; Di Ciccio, C.; Dijkman, R.M.; Hewelt, M; Mendling, J.; Meyer, Andreas; Pourmirza, S.; Weske, M.H.; Wong, T.Y.

    2015-01-01

    Especially in logistics, process instances often interact with their real-world environment during execution. This is challenging due to the fact that events from this environment are often heterogeneous, lack process instance information, and their import and visualisation in traditional process

  20. Discovering block-structured process models from event logs containing infrequent behaviour

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Aalst, van der W.M.P.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Given an event log describing observed behaviour, process discovery aims to find a process model that ‘best’ describes this behaviour. A large variety of process discovery algorithms has been proposed. However, no existing algorithm returns a sound model in all cases (free of deadlocks and other

  1. Temporal integration: intentional sound discrimination does not modulate stimulus-driven processes in auditory event synthesis.

    Science.gov (United States)

    Sussman, Elyse; Winkler, István; Kreuzer, Judith; Saher, Marieke; Näätänen, Risto; Ritter, Walter

    2002-12-01

Our previous study showed that the auditory context can influence whether two successive acoustic changes occurring within the temporal integration window (approximately 200 ms) are pre-attentively encoded as a single auditory event or as two discrete events (Cogn Brain Res 12 (2001) 431). The aim of the current study was to assess whether top-down processes can influence these stimulus-driven processes in determining what constitutes an auditory event. The electroencephalogram (EEG) was recorded from 11 scalp electrodes in response to frequently occurring standard and infrequently occurring deviant sounds. Within the stimulus blocks, deviants occurred either only in pairs (successive feature changes) or both singly and in pairs. Event-related potential indices of change detection and target detection, the mismatch negativity (MMN) and the N2b component, respectively, were compared with the simultaneously measured performance in discriminating the deviants. Even though subjects could voluntarily distinguish the two successive auditory feature changes from each other, which was also indicated by the elicitation of the N2b target-detection response, top-down processes did not modify the event organization reflected by the MMN response. Top-down processes can extract elemental auditory information from a single integrated acoustic event, but the extraction occurs at a later processing stage than the one whose outcome is indexed by MMN. Initial processes of auditory event-formation are thus fully governed by the context within which the sounds occur: perception of the deviants as two separate sound events (the top-down effect) did not change their initial neural representation as one event (indexed by the MMN), which followed the stimulus-driven sound organization.

  2. Processing data communications events by awakening threads in parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2016-03-15

    Processing data communications events in a parallel active messaging interface (`PAMI`) of a parallel computer that includes compute nodes that execute a parallel application, with the PAMI including data communications endpoints, and the endpoints are coupled for data communications through the PAMI and through other data communications resources, including determining by an advance function that there are no actionable data communications events pending for its context, placing by the advance function its thread of execution into a wait state, waiting for a subsequent data communications event for the context; responsive to occurrence of a subsequent data communications event for the context, awakening by the thread from the wait state; and processing by the advance function the subsequent data communications event now pending for the context.
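The wait-and-awaken pattern this abstract describes maps naturally onto a condition variable. The sketch below is a generic illustration with hypothetical class and method names (it is not the actual PAMI API): the advance function finds no actionable events pending for its context, places its thread into a wait state, and is awakened when another thread posts a data-communications event for that context.

```python
import threading
from collections import deque

class Context:
    """Toy context with an advance function that sleeps instead of
    spinning when no events are pending (illustrative names only)."""

    def __init__(self):
        self.events = deque()
        self.cond = threading.Condition()
        self.processed = []

    def post(self, event):
        # Called from another thread when a data-communications event
        # occurs for this context; awakens the waiting advancer.
        with self.cond:
            self.events.append(event)
            self.cond.notify()

    def advance(self):
        # Determine whether any actionable events are pending; if not,
        # place this thread of execution into a wait state.
        with self.cond:
            while not self.events:
                self.cond.wait()
            event = self.events.popleft()
        # Process the event now pending for the context.
        self.processed.append(event)
        return event
```

The `while not self.events` loop guards against spurious wakeups, the standard discipline when waiting on a condition variable.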

  3. Processing communications events in parallel active messaging interface by awakening thread from wait state

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-10-22

    Processing data communications events in a parallel active messaging interface (`PAMI`) of a parallel computer that includes compute nodes that execute a parallel application, with the PAMI including data communications endpoints, and the endpoints are coupled for data communications through the PAMI and through other data communications resources, including determining by an advance function that there are no actionable data communications events pending for its context, placing by the advance function its thread of execution into a wait state, waiting for a subsequent data communications event for the context; responsive to occurrence of a subsequent data communications event for the context, awakening by the thread from the wait state; and processing by the advance function the subsequent data communications event now pending for the context.

  4. Novel Complexity Indicator of Manufacturing Process Chains and Its Relations to Indirect Complexity Indicators

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2017-01-01

Manufacturing systems can be considered as networks of machines/workstations, where parts are produced in a flow shop or job shop environment, respectively. Such a network of machines/workstations can be depicted as a graph, with machines as nodes and material flows between the nodes as links. The aim of this paper is to use sequences of operations and the machine network to measure the static complexity of manufacturing processes. To this end, existing approaches to measuring the static complexity of manufacturing systems are analyzed and compared. The competing complexity indicators were then tested on two different manufacturing layout examples. A subsequent analysis showed the relevant potential of the proposed method.
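To make the graph view concrete, here is one generic static-complexity indicator on such a machine network (an assumed, illustrative metric, not the indicator proposed in the paper): the Shannon entropy of the out-degree distribution of the material-flow graph, which is zero for a single dominant branching point and grows as flow splits more evenly across machines.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (bits) of the out-degree distribution of a
    directed machine network given as (source, target) material flows.

    Illustrative indicator only; the paper's own metric is different.
    """
    out_deg = Counter(src for src, _ in edges)
    total = sum(out_deg.values())
    probs = [d / total for d in out_deg.values()]
    return -sum(p * math.log2(p) for p in probs)
```

For a pure flow shop A → B → C → D, every machine with outgoing flow has out-degree 1, so the three machines contribute equal probabilities and the entropy is log2(3) ≈ 1.585 bits.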

  5. Episodic events in long-term geological processes: A new classification and its applications

    Directory of Open Access Journals (Sweden)

    Dmitry A. Ruban

    2018-03-01

Long-term geological processes are usually described with curves reflecting continuous changes in characteristic parameters through geological history, and such curves can be employed directly for the recognition of episodic (relatively long-term) events linked to these changes. Episodic events can be classified into several categories according to their scale (ordinary and anomalous events), "shape" (positive, negative, and neutral events), and relation to long-term trend change (successive, interruptive, facilitative, stabilizing, transformative, increasing, and decreasing events). Many types of events can be defined depending on the combination of the above-mentioned patterns; spatial rank, duration, and origin can also be considered in their description. The proposed classification can be applied to events in real long-term geological processes, including global sea-level changes, biodiversity dynamics, lithospheric plate number changes, and palaeoclimate changes. Several case examples prove the usefulness of the classification. It is established that the Early Valanginian (Early Cretaceous) eustatic lowstand (the lowest position of the sea level in the entire Cretaceous) was a negative but ordinary and merely interruptive event. In another case, it becomes clear that only the end-Ordovician and Permian/Triassic mass extinctions transformed the trends of biodiversity dynamics (from increase to decrease and from decrease to increase, respectively), and that only the Cretaceous/Paleogene mass extinction was a truly anomalous event on the Phanerozoic biodiversity curve. New palaeontological data are employed to reconstruct the diversity dynamics of brachiopods in Germany (without the Alps) and the Swiss Jura Mountains. Further interpretation of both diversity curves implies that the Early Toarcian mass extinction affected the regional brachiopod faunas strongly, but that this event was only decreasing

  6. Integrating technology into complex intervention trial processes: a case study.

    Science.gov (United States)

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. 
A combination of commercially available hardware and a bespoke online database

  7. Complex Parts, Complex Data: Why You Need to Understand What Radiation Single Event Testing Data Does and Doesn't Show and the Implications Thereof

    Science.gov (United States)

    LaBel, Kenneth A.; Berg, Melanie D.

    2015-01-01

Electronic parts (integrated circuits) have grown in complexity such that determining all failure modes and risks from single-particle event testing is impossible. In this presentation, the authors explain why this is so and provide some realism about what it means. It's all about understanding actual risks and not making assumptions.

  8. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of means and/or variances and of conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  9. Crystallization process of a three-dimensional complex plasma

    Science.gov (United States)

    Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.

    2018-05-01

Characteristic timescales and length scales for phase transitions of real materials lie in ranges where direct visualization is unfeasible, so model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma under gravity conditions is considered, where the system extends to a large degree into the bulk plasma. Time-resolved measurements reveal the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method shows a fractal dimension of df ≈ 2.72 for the clusters. This value hints that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
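The box-counting dimension quoted above is straightforward to estimate for any point cloud. A minimal sketch (illustrative function name; the paper's actual analysis pipeline is not described): count the boxes occupied at several box sizes ε and fit the slope of log N(ε) against log(1/ε).

```python
import numpy as np

def box_counting_dimension(points, epsilons):
    """Estimate the box-counting (fractal) dimension of a point cloud.

    points:   (n, d) array of coordinates scaled into the unit cube.
    epsilons: iterable of box edge lengths, e.g. [1/2, 1/4, ..., 1/32].
    """
    pts = np.asarray(points, dtype=float)
    counts = []
    for eps in epsilons:
        # Assign each point to a box index; count distinct occupied boxes.
        boxes = np.floor(pts / eps).astype(int)
        counts.append(len({tuple(b) for b in boxes}))
    # d_f is the slope of log N(eps) versus log(1/eps).
    log_inv_eps = np.log(1.0 / np.asarray(list(epsilons)))
    slope, _intercept = np.polyfit(log_inv_eps, np.log(counts), 1)
    return slope
```

As a sanity check, a dense planar point set embedded in 3-D space yields a dimension close to 2, while a space-filling cluster approaches 3; the measured df ≈ 2.72 sits between these limits, consistent with mixed epitaxial and diffusion-limited growth.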

  10. Modelling of the quenching process in complex superconducting magnet systems

    International Nuclear Information System (INIS)

    Hagedorn, D.; Rodriguez-Mateos, F.

    1992-01-01

This paper reports that the superconducting twin-bore dipole magnet for the proposed Large Hadron Collider (LHC) at CERN has a complex winding structure consisting of eight compact layers, each electromagnetically and thermally coupled with the others. This magnet is only one part of an electrical circuit; test and operating conditions are characterized by different circuits. In order to study the quenching process in this complex system, design adequate protection schemes, and provide a basis for dimensioning protection devices such as heaters, current breakers and dump resistors, a general simulation tool called QUABER has been developed using the analog system analysis program SABER. A complete set of electro-thermal models has been created for the propagation of normal regions. Any network extension or modification is easy to implement without rewriting the whole set of differential equations.

  11. Quantum-information processing in disordered and complex quantum systems

    International Nuclear Information System (INIS)

    Sen, Aditi; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2006-01-01

We study quantum information processing in complex disordered many-body systems that can be implemented using lattices of ultracold atomic gases and trapped ions. We demonstrate, first in the short-range case, the generation of entanglement and the local realization of quantum gates in a disordered magnetic model describing a quantum spin glass. We show that in this case it is possible to achieve fidelities of quantum gates higher than in the classical case. Complex systems with long-range interactions, such as ion chains or dipolar atomic gases, can be used to model neural-network Hamiltonians. For such systems, where both long-range interactions and disorder appear, it is possible to generate long-range bipartite entanglement. We provide an efficient analytical method to calculate the time evolution of a given initial state, which in turn allows us to calculate its quantum correlations.

  12. Using high complexity analysis to probe the evolution of organic aerosol during pollution events in Beijing

    Science.gov (United States)

    Hamilton, J.; Dixon, W.; Dunmore, R.; Squires, F. A.; Swift, S.; Lee, J. D.; Rickard, A. R.; Sun, Y.; Xu, W.

    2017-12-01

There is increasing evidence that exposure to air pollution results in significant impacts on human health. In Beijing, home to over 20 million inhabitants, particulate matter levels are very high by international standards, with official estimates of an annual mean PM2.5 concentration in 2014 of 86 μg m-3, nearly 9 times the WHO guideline. Changes in particle composition during pollution events provide key information on sources and can be used to inform strategies for pollution mitigation and health benefits. The organic fraction of PM is an extremely complex mixture reflecting the diversity of sources to the atmosphere. In this study we attempt to harness the chemical complexity of OA by developing an extensive database of over 700 mass spectra, built using literature data and source-specific tracers (e.g. diesel emission characterisation experiments and SOA generated in chamber simulations). Using a high-throughput analysis method (15 min), involving UHPLC coupled to Orbitrap mass spectrometry, chromatograms are integrated and compared to the library, and a list of identified compounds is produced. Purpose-built software based on R is used to automatically produce time series, alongside common aerosol metrics and data visualisation techniques, dramatically reducing analysis times. Offline measurements of organic aerosol composition were made as part of the Sources and Emissions of Air Pollutants in Beijing project, a collaborative programme between leading UK and Chinese research groups. Rather than studying only a small number of 24 h PM samples, we collected 250 filter samples at a range of time resolutions, from 30 minutes to 12 hours, depending on the time of day and PM loadings. In total 643 species were identified based on their elemental formula and retention time, with species ranging from C2 to C22 and between 1 and 13 oxygens. A large fraction of the OA species observed were organosulfates and/or nitrates. Here we will present

  13. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that astronauts "were awakened again", as they had been the day before. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Major projects everywhere present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across boundaries of countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short-term and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing.
We anticipate more of them, as the results of global warming

  14. Dyadic Event Attribution in Social Networks with Mixtures of Hawkes Processes.

    Science.gov (United States)

    Li, Liangda; Zha, Hongyuan

    2013-01-01

In many applications of social network analysis, it is important to model the interactions and infer the influence between pairs of actors, leading to the problem of dyadic event modeling, which has attracted increasing interest recently. In this paper we focus on the problem of dyadic event attribution, an important missing-data problem in dyadic event modeling where one needs to infer the missing actor-pairs of a subset of dyadic events based on their observed timestamps. Existing works either use fixed model parameters and heuristic rules for event attribution, or assume the dyadic events across actor-pairs are independent. To address these shortcomings we propose a probabilistic model based on mixtures of Hawkes processes that simultaneously tackles event attribution and network parameter inference, taking into consideration the dependency among dyadic events that share at least one actor. We also investigate using additive models to incorporate regularization and avoid overfitting. Our experiments on both synthetic and real-world data sets on international armed conflicts suggest that the proposed method significantly improves accuracy compared with the state of the art for dyadic event attribution.
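To illustrate the machinery behind such models, here is a minimal sketch of a univariate Hawkes process with an exponential kernel and the standard branching-structure "responsibilities" used to attribute an event either to the background rate or to a triggering earlier event. The function names and parameters are illustrative; this is the textbook construction, not the mixture model proposed in the paper.

```python
import math

def hawkes_intensity(t, history, mu, alpha, beta):
    """Conditional intensity lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))
    over earlier event times t_i < t (exponential self-excitation kernel)."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in history if ti < t)

def attribute_event(t, history, mu, alpha, beta):
    """Responsibilities for the event at time t: the probability it was
    background-generated, and the probability each earlier event triggered it.
    These are the quantities an EM-style inference scheme would iterate on."""
    excitations = [alpha * math.exp(-beta * (t - ti))
                   for ti in history if ti < t]
    lam = mu + sum(excitations)
    p_background = mu / lam
    p_triggered = [e / lam for e in excitations]
    return p_background, p_triggered
```

By construction the background and triggering probabilities sum to one, and more recent events receive larger responsibility under the exponential kernel.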

  15. Complex Ornament Machining Process on a CNC Router

    Directory of Open Access Journals (Sweden)

    Camelia COŞEREANU

    2014-03-01

The paper investigates the CNC routing possibilities of three wood species, namely ash (Fraxinus excelsior), lime (Tilia cordata) and fir (Abies alba), in order to obtain accurate surfaces on Art Nouveau sculptured ornaments. Given the complexity of the CNC tool path needed to obtain the wavy shapes of Art Nouveau decorations, the choice of processing parameters for each wood species requires laborious research to correlate these parameters. Two Art Nouveau ornaments are proposed for the investigation; they are CNC routed using two types of cutting tools. The processing parameters, namely spindle speed, feed speed and depth of cut, were the three variables of the machining process for the three wood species, combined so as to provide a good surface finish as the quality attribute. In total, forty-six combinations of the processing parameters were applied in CNC routing of samples made of the three wood species. Finally, an optimum combination of the processing parameters is recommended for each wood species.

  16. Uncertainty Reduction for Stochastic Processes on Complex Networks

    Science.gov (United States)

    Radicchi, Filippo; Castellano, Claudio

    2018-05-01

    Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.

  17. Cognitive processing in non-communicative patients: what can event-related potentials tell us?

    Directory of Open Access Journals (Sweden)

    Zulay Rosario Lugo

    2016-11-01

Event-related potentials (ERPs) have been proposed to improve the differential diagnosis of non-responsive patients. We investigated the potential of the P300 as a reliable marker of conscious processing in patients with locked-in syndrome (LIS). Eleven chronic LIS patients and ten healthy subjects (HS) listened to a complex-tone auditory oddball paradigm, first in a passive condition (listening to the sounds) and then in an active condition (counting the deviant tones). Seven out of nine HS displayed a P300 waveform in the passive condition, and all did in the active condition; HS showed statistically significant changes in peak and area amplitude between conditions. Three out of seven LIS patients showed the P3 waveform in the passive condition and five of seven in the active condition. No changes in peak amplitude, and a significant difference in area amplitude at only one electrode, were observed in this group between conditions. We conclude that, despite retaining full consciousness and intact or nearly intact cortical functions, LIS patients present less reliable results than HS when tested with ERPs, specifically in the passive condition. We thus strongly recommend applying ERP paradigms in an active condition when evaluating consciousness in non-responsive patients.

  18. Characterisation of sub-micron particle number concentrations and formation events in the western Bushveld Igneous Complex, South Africa

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2012-05-01

    Full Text Available South Africa holds significant mineral resources, with a substantial fraction of these reserves occurring and being processed in a large geological structure termed the Bushveld Igneous Complex (BIC). The area is also highly populated, with informal, semi-formal and formal residential developments. However, knowledge of air quality and research related to the atmosphere is still very limited in the area. In order to investigate the characteristics and processes affecting sub-micron particle number concentrations and formation events, air ion and aerosol particle size distributions and number concentrations, together with meteorological parameters, trace gases and particulate matter (PM), were measured for over two years at Marikana in the heart of the western BIC. The observations showed that trace gas (i.e. SO2, NOx, CO) and black carbon concentrations were relatively high, but in general within the limits of local air quality standards. The area was characterised by a very high condensation sink due to background aerosol particles, and by high PM10 and O3 concentrations. The results indicated that large numbers of Aitken and accumulation mode particles originated from domestic burning for heating and cooking in the morning and evening, while during daytime SO2-based nucleation, followed by growth through condensation of vapours from industrial, residential and natural sources, was the most probable source of the large number concentrations of nucleation and Aitken mode particles. Nucleation event day frequency was extremely high, i.e. 86% of the analysed days, which to the knowledge of the authors is the highest frequency ever reported. The air mass back trajectory and wind direction analyses showed that the secondary particle formation was influenced by both local and regional pollution and vapour sources. Therefore, our observation of the annual cycle and magnitude of the particle formation and growth rates during

  19. The preparation of reports of a significant event at a uranium processing or uranium handling facility

    International Nuclear Information System (INIS)

    1988-08-01

    Licenses to operate uranium processing or uranium handling facilities require that certain events be reported to the Atomic Energy Control Board (AECB) and to other regulatory authorities. Reports of a significant event describe unusual events which had, or could have had, a significant impact on the safety of facility operations, the worker, the public or the environment. The purpose of this guide is to suggest an acceptable method of reporting a significant event to the AECB and to describe the information that should be included. Reports of a significant event are made available to the public in accordance with the provisions of the Access to Information Act and the AECB's policy on public access to licensing information.

  20. Process variant comparison: using event logs to detect differences in behavior and business rules

    NARCIS (Netherlands)

    Bolt, A.; de Leoni, M.; van der Aalst, W.M.P.

    2018-01-01

    This paper addresses the problem of comparing different variants of the same process. We aim to detect relevant differences between processes based on what was recorded in event logs. We use transition systems to model behavior and to highlight differences. Transition systems are annotated with

  1. Event (error and near-miss) reporting and learning system for process improvement in radiation oncology.

    Science.gov (United States)

    Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin

    2010-09-01

    The value of near-miss and error reporting processes in many industries is well appreciated and can typically be supported with data collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in an RT department and to compare it to the paper-based reporting system it replaced. A purpose-built web-based system for reporting individual events in RT was designed and clinically implemented in 2007. An event was defined as any occurrence that could have resulted, or had resulted, in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so that individual events could be reported quickly and easily without disrupting clinical work. This was very important because use of the system was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on the functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas which needed process and safety improvements.
The reported data were also useful for the

  2. Dystrophic Cardiomyopathy: Complex Pathobiological Processes to Generate Clinical Phenotype

    Directory of Open Access Journals (Sweden)

    Takeshi Tsuda

    2017-09-01

    Full Text Available Duchenne muscular dystrophy (DMD), Becker muscular dystrophy (BMD), and X-linked dilated cardiomyopathy (XL-DCM) constitute a unique clinical entity, the dystrophinopathies, which are due to variable mutations in the dystrophin gene. Dilated cardiomyopathy (DCM) is a common complication of dystrophinopathies, but the onset, progression, and severity of heart disease differ among these subgroups. Extensive molecular genetic studies have been conducted to assess genotype-phenotype correlation in DMD, BMD, and XL-DCM to understand the underlying mechanisms of these diseases, but the results are not always conclusive, suggesting the involvement of complex, multi-layered pathological processes that generate the final clinical phenotype. Dystrophin protein is part of the dystrophin-glycoprotein complex (DGC), which is localized in skeletal muscles, myocardium, smooth muscles, and neuronal tissues. The diversity of cardiac phenotypes in dystrophinopathies suggests multiple layers of pathogenetic mechanisms in forming dystrophic cardiomyopathy. In this review article, we review the complex molecular interactions involved in the pathogenesis of dystrophic cardiomyopathy, including primary gene mutations and loss of structural integrity, secondary cellular responses, and certain epigenetic and other factors that modulate gene expression. Involvement of epigenetic gene regulation appears to lead to specific cardiac phenotypes in dystrophic hearts.

  3. A computer interface for processing multi-parameter data of multiple event types

    International Nuclear Information System (INIS)

    Katayama, I.; Ogata, H.

    1980-01-01

    A logic circuit called a 'Raw Data Processor' (RDP), which functions as an interface between ADCs and the PDP-11 computer, has been developed at RCNP, Osaka University for general use. It enables simultaneous data processing for up to 16 event types, and an arbitrary combination of up to 14 ADCs can be assigned to each event type by means of a pinboard matrix. The details of the RDP and its application are described. (orig.)

  4. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERPs) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, and provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  5. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  6. Multiple-predators-based capture process on complex networks

    International Nuclear Information System (INIS)

    Sharafat, Rajput Ramiz; Pu Cunlai; Li Jie; Chen Rongbin; Xu Zhongqi

    2017-01-01

    The predator/prey (capture) problem is a prototype of many network-related applications. We study the capture process on complex networks by considering multiple predators from multiple sources. In our model, some lions start from multiple sources simultaneously to capture the lamb by biased random walks, which are controlled with a free parameter α. We derive the distribution of the lamb's lifetime and the expected lifetime 〈T〉. Through simulation, we find that the expected lifetime drops substantially with an increasing number of lions. Moreover, we study how the underlying topological structure affects the capture process, and find that locating on small-degree nodes is better than on large-degree nodes for prolonging the lifetime of the lamb. Dense or homogeneous network structures work against the survival of the lamb. We also discuss how to improve the capture efficiency in our model. (paper)
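    The model described above can be sketched in a few lines of code. This is our own illustrative simulation, not the authors' implementation: the test graph (a ring with a hub), the walk parameter value, and all names are assumptions. Each lion performs a degree-biased random walk, stepping to neighbour j with probability proportional to deg(j)**alpha, while the lamb sits on a fixed node.

```python
import random

# Illustrative sketch of the multiple-predator capture process (the graph,
# parameter values, and names are our assumptions, not the paper's code).

def biased_step(graph, node, alpha, rng):
    """Move to a neighbour j with probability proportional to deg(j)**alpha."""
    nbrs = graph[node]
    weights = [len(graph[j]) ** alpha for j in nbrs]
    return rng.choices(nbrs, weights=weights)[0]

def capture_time(graph, lion_starts, lamb, alpha, rng, max_steps=100_000):
    """Steps until any lion lands on the lamb's (fixed) node."""
    lions = list(lion_starts)
    for t in range(1, max_steps + 1):
        lions = [biased_step(graph, x, alpha, rng) for x in lions]
        if lamb in lions:
            return t
    return max_steps

# Small test graph: a 12-node ring where every node also links to a hub.
n = 12
graph = {i: [(i - 1) % n, (i + 1) % n, n] for i in range(n)}
graph[n] = list(range(n))

def mean_lifetime(n_lions, trials=300, alpha=1.0, seed=7):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        starts = rng.sample(range(1, n), n_lions)  # lions start away from the lamb
        total += capture_time(graph, starts, lamb=0, alpha=alpha, rng=rng)
    return total / trials

t1, t4 = mean_lifetime(1), mean_lifetime(4)
print(t1, t4)  # the expected lifetime drops as the number of lions grows
```

    With alpha > 0 the walkers favour hubs and with alpha < 0 they avoid them, which is consistent with the abstract's observation that a lamb sitting on a small-degree node survives longer.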

  7. Complex analyses on clinical information systems using restricted natural language querying to resolve time-event dependencies.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2018-06-01

    This paper reports on a generic framework that provides clinicians with the ability to conduct complex analyses on elaborate research topics, using cascaded queries to resolve internal time-event dependencies in the research questions, as an extension to the proposed Clinical Data Analytics Language (CliniDAL). A cascaded query model is proposed to resolve internal time-event dependencies in queries, with up to five levels of criteria: a query to define the subjects to be admitted into a study, followed by a query to define the time span of the experiment; three more cascaded queries can then be required to define control groups, control variables and output variables, which together simulate a real scientific experiment. Depending on the complexity of the research question, the cascaded query model has the flexibility to merge some lower-level queries for simple research questions, or to add a nested query to each level to compose more complex ones. Three different scenarios (one of which contains two studies) are described and used to evaluate the proposed solution. CliniDAL's complex analysis solution enables answering complex queries with time-event dependencies in at most a few hours, whereas doing so manually would take many days. An evaluation of the results of the research studies, based on a comparison between the CliniDAL and SQL solutions, reveals the high usability and efficiency of CliniDAL's solution. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Epigenetics and Shared Molecular Processes in the Regeneration of Complex Structures

    Directory of Open Access Journals (Sweden)

    Labib Rouhana

    2016-01-01

    Full Text Available The ability to regenerate complex structures is broadly represented in both the plant and animal kingdoms. Although regenerative abilities vary significantly amongst metazoans, cumulative studies have identified cellular events that are broadly observed during regenerative events. For example, structural damage is recognized and wound healing initiated upon injury, which is followed by programmed cell death in the vicinity of damaged tissue and a burst in proliferation of progenitor cells. Sustained proliferation and localization of progenitor cells to the site of injury give rise to an assembly of differentiating cells known as the regeneration blastema, which fosters the development of new tissue. Finally, preexisting tissue rearranges and integrates with newly differentiated cells to restore proportionality and function. While heterogeneity exists in the basic processes displayed during regenerative events in different species—most notably in the cellular source contributing to formation of new tissue—activation of conserved molecular pathways is imperative for proper regulation of cells during regeneration. Perhaps the most fundamental of such molecular processes entails chromatin rearrangements, which prime large changes in gene expression required for differentiation and/or dedifferentiation of progenitor cells. This review provides an overview of known contributions to regenerative processes by noncoding RNAs and chromatin-modifying enzymes involved in epigenetic regulation.

  9. Leveraging the BPEL Event Model to Support QoS-aware Process Execution

    Science.gov (United States)

    Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf

    Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault handling mechanisms to handle functional faults, such as invalid message types, it still lacks a flexible native mechanism for handling non-functional exceptions associated with violations of the QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, in which expected QoS levels and the necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach with an extension to an open-source BPEL engine.
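    The ECA pattern itself is easy to sketch outside BPEL. The event name, the 600 ms SLA bound, and the recovery action below are invented for illustration; they are not part of the BPEL event model described in the abstract.

```python
from dataclasses import dataclass
from typing import Callable

# Minimal Event-Condition-Action (ECA) sketch for QoS-aware recovery.
# Event names, thresholds, and actions are illustrative assumptions only.

@dataclass
class Rule:
    event: str                         # event type the rule listens for
    condition: Callable[[dict], bool]  # predicate over the event payload
    action: Callable[[dict], str]      # recovery action to execute

rules = [
    Rule(
        event="invocation_completed",
        condition=lambda e: e["response_ms"] > 600,  # assumed SLA bound
        action=lambda e: f"substitute service for {e['partner']}",
    ),
]

def dispatch(event_type, payload):
    """Fire every matching rule whose condition holds; return actions taken."""
    return [r.action(payload) for r in rules
            if r.event == event_type and r.condition(payload)]

# A slow invocation triggers the recovery action; a fast one triggers none.
slow = dispatch("invocation_completed", {"partner": "Quote", "response_ms": 900})
fast = dispatch("invocation_completed", {"partner": "Quote", "response_ms": 120})
print(slow, fast)
```

    In the paper's setting, the events in such rules would be drawn from BPEL's standard event model rather than from hand-crafted dictionaries.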

  10. COMPLEX PROCESSING OF CELLULOSE WASTE FROM POULTRY AND SUGAR PRODUCTION

    Directory of Open Access Journals (Sweden)

    E. V. Sklyadnev

    2015-01-01

    Full Text Available To address the disposal of the huge volumes of cellulose waste from sugar production (beet pulp) and from poultry farms (poultry manure), the joint use of two methods of thermal waste processing, pyrolysis and gasification, is proposed. The applicability of pyrolysis to this waste is confirmed by experimental results. Based on laboratory studies of the properties of the by-products resulting from thermal processing of the feedstock, a complex processing scheme is proposed that yields useful products to be sold as marketable goods while meeting the process's own energy needs. The developed flow diagram for the integrated processing of this waste comprises three sections, in which the following steps are carried out in sequence: pyrolytic decomposition of the feedstock to obtain secondary products in the form of solid, liquid and gas fractions; gasification of the solids to obtain combustible gas; and separation of the liquid fraction by distillation to obtain valuable products. The main equipment in the first section is the pyrolysis reactor with a cascade of condensers; in the second section, gasifiers of the layer and stream type; in the third, one or more distillation columns with the necessary piping. The installation's own power supply is provided by using the heat released during combustion of the synthesis gas for heating and for the gasification reactor. Heat-balance calculations for the developed scheme support the energy efficiency of the proposed disposal process. The work was carried out in the framework of a project that won the Youth Prize Competition of the Government of Voronezh region to support youth programs in 2014-2015.

  11. Detection of unusual events and trends in complex non-stationary data streams

    International Nuclear Information System (INIS)

    Charlton-Perez, C.; Perez, R.B.; Protopopescu, V.; Worley, B.A.

    2011-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for diverse applications, ranging from power plant operation to homeland security. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden events inside intermittent signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.
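    As a toy illustration of the Hilbert part of such an analysis (a full Hilbert-Huang treatment would first decompose the signal into intrinsic mode functions, which we skip here), a local disturbance can be flagged as a spike in the analytic-signal envelope. The signal shape and the event location below are our assumptions.

```python
import numpy as np

# Toy sketch: detect a local signal disturbance ("unusual event") as a spike
# in the analytic-signal envelope. This only illustrates the Hilbert step;
# the empirical mode decomposition of the full method is omitted.

def analytic_signal(x):
    """Analytic signal via the standard FFT construction of the Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

t = np.arange(2000) / 1000.0                                 # 2 s at 1 kHz
carrier = np.sin(2 * np.pi * 50 * t)                         # baseline dynamics
burst = 3.0 * np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2))    # short event at t = 0.5 s
signal = carrier + burst * np.sin(2 * np.pi * 120 * t)

envelope = np.abs(analytic_signal(signal))
peak = int(np.argmax(envelope))
print(peak)  # the envelope peaks near sample 500, where the event was injected
```

    On real multi-component, noisy data the decomposition step matters: the envelope of the raw mixture is much harder to threshold than the envelope of a single intrinsic mode.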

  12. Stress reaction process-based hierarchical recognition algorithm for continuous intrusion events in optical fiber prewarning system

    Science.gov (United States)

    Qu, Hongquan; Yuan, Shijiao; Wang, Yanping; Yang, Dan

    2018-04-01

    To improve the recognition performance of optical fiber prewarning systems (OFPS), this study proposes a hierarchical recognition algorithm (HRA). Traditional methods employ a single complex algorithm, combining multiple extracted features with complex classifiers, to increase the recognition rate at a considerable cost in recognition speed; HRA instead exploits the continuity of intrusion events to create a staged recognition flow inspired by the stress reaction, and is thus expected to achieve high recognition accuracy with less time consumption. This work first analyzes the continuity of intrusion events and then presents the algorithm based on the mechanism of stress reaction. Finally, the time consumption is verified through theoretical analysis and experiments, and the recognition accuracy is obtained through experiments. Experimental results show that HRA processes data 3.3 times faster than a traditional complicated algorithm while achieving a similar recognition rate of 98%. The study is of great significance for fast intrusion event recognition in OFPS.

  13. R-process enrichment from a single event in an ancient dwarf galaxy.

    Science.gov (United States)

    Ji, Alexander P; Frebel, Anna; Chiti, Anirudh; Simon, Joshua D

    2016-03-31

    Elements heavier than zinc are synthesized through the rapid (r) and slow (s) neutron-capture processes. The main site of production of the r-process elements (such as europium) has been debated for nearly 60 years. Initial studies of trends in chemical abundances in old Milky Way halo stars suggested that these elements are produced continually, in sites such as core-collapse supernovae. But evidence from the local Universe favours the idea that r-process production occurs mainly during rare events, such as neutron star mergers. The appearance of a plateau of europium abundance in some dwarf spheroidal galaxies has been suggested as evidence for rare r-process enrichment in the early Universe, but only under the assumption that no gas accretes into those dwarf galaxies; gas accretion favours continual r-process enrichment in these systems. Furthermore, the universal r-process pattern has not been cleanly identified in dwarf spheroidals. The smaller, chemically simpler, and more ancient ultrafaint dwarf galaxies assembled shortly after the first stars formed, and are ideal systems with which to study nucleosynthesis events such as the r-process. Reticulum II is one such galaxy. The abundances of non-neutron-capture elements in this galaxy (and others like it) are similar to those in other old stars. Here, we report that seven of the nine brightest stars in Reticulum II, observed with high-resolution spectroscopy, show strong enhancements in heavy neutron-capture elements, with abundances that follow the universal r-process pattern beyond barium. The enhancement seen in this 'r-process galaxy' is two to three orders of magnitude higher than that detected in any other ultrafaint dwarf galaxy. This implies that a single, rare event produced the r-process material in Reticulum II. The r-process yield and event rate are incompatible with the source being ordinary core-collapse supernovae, but consistent with other possible sources, such as neutron star mergers.

  14. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenbing [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Wang, Zidong [Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH (United Kingdom); Liu, Yurong, E-mail: yrliu@yzu.edu.cn [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ding, Derui [Shanghai Key Lab of Modern Optical System, Department of Control Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093 (China); Alsaadi, Fuad E. [Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2017-01-05

    The paper is concerned with the state estimation problem for a class of time-delayed complex networks with an event-triggering communication protocol. A novel event generator function, which depends not only on the measurement output but also on a predefined positive constant, is proposed in the hope of reducing the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, some sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and the estimator gains are then obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof shows that the designed triggering condition is free of the Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator.
    Highlights:
    • An event-triggered estimator is designed for complex networks with time-varying delays.
    • A novel event generator function is proposed to reduce the communication burden.
    • The comparison principle is utilized to derive the sufficient conditions.
    • The designed triggering condition is shown to be free of the Zeno behavior.
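    An event generator of the kind described, depending on the measurement output plus a predefined positive constant, can be sketched as a relative-plus-absolute threshold on the drift since the last transmission. The sigma and delta values and the test signal below are our assumptions, not the paper's.

```python
import math

# Sketch of an event-triggered transmission rule: send the measurement only
# when it has drifted from the last transmitted value by more than a relative
# term (sigma * y^2) plus a predefined positive constant delta. Parameter
# values and the test signal are illustrative assumptions.

def run_trigger(outputs, sigma=0.05, delta=0.01):
    sent = [0]                # indices at which the estimator receives data
    last = outputs[0]
    for k, y in enumerate(outputs[1:], start=1):
        if (y - last) ** 2 > sigma * y * y + delta:  # event generator function
            sent.append(k)
            last = y
    return sent

# Smooth decaying oscillation as a stand-in for a node's measurement output.
ys = [math.exp(-0.01 * k) * math.cos(0.1 * k) for k in range(400)]
events = run_trigger(ys)
print(len(events), "transmissions out of", len(ys), "samples")
```

    The strictly positive delta is what rules out chatter when the output sits near zero; in this discrete sketch it guarantees a minimum drift between consecutive transmissions, mirroring the Zeno-freeness argument in the paper.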

  15. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  16. Processing of complex auditory patterns in musicians and nonmusicians.

    Science.gov (United States)

    Boh, Bastiaan; Herholz, Sibylle C; Lappe, Claudia; Pantev, Christo

    2011-01-01

    In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) we recorded MMN responses to deviant tones that could occur at any position of standard tone patterns composed of four, six or eight tones during passive, distracted listening. Whereas there was a reliable MMN response to deviant tones in the four-tone pattern in both musicians and nonmusicians, only some individuals showed MMN responses to the longer patterns. This finding of a reliable capacity of the short-term auditory store underlying the MMN response is in line with estimates of a three to five item capacity of the short-term memory trace from behavioural studies, although pitch and contour complexity covaried with sequence length, which might have led to an understatement of the reported capacity. Whereas there was a tendency for an enhancement of the pattern MMN in musicians compared to nonmusicians, a strong advantage for musicians could be shown in an accompanying behavioural task of detecting the deviants while attending to the stimuli for all pattern lengths, indicating that long-term musical training differentially affects the memory capacity of auditory short-term memory for complex tone patterns with and without attention. Also, a left-hemispheric lateralization of MMN responses in the six-tone pattern suggests that additional networks that help structuring the patterns in the temporal domain might be recruited for demanding auditory processing in the pitch domain.

  17. Processing of complex auditory patterns in musicians and nonmusicians.

    Directory of Open Access Journals (Sweden)

    Bastiaan Boh

    Full Text Available In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) we recorded MMN responses to deviant tones that could occur at any position of standard tone patterns composed of four, six or eight tones during passive, distracted listening. Whereas there was a reliable MMN response to deviant tones in the four-tone pattern in both musicians and nonmusicians, only some individuals showed MMN responses to the longer patterns. This finding of a reliable capacity of the short-term auditory store underlying the MMN response is in line with estimates of a three to five item capacity of the short-term memory trace from behavioural studies, although pitch and contour complexity covaried with sequence length, which might have led to an understatement of the reported capacity. Whereas there was a tendency for an enhancement of the pattern MMN in musicians compared to nonmusicians, a strong advantage for musicians could be shown in an accompanying behavioural task of detecting the deviants while attending to the stimuli for all pattern lengths, indicating that long-term musical training differentially affects the memory capacity of auditory short-term memory for complex tone patterns with and without attention. Also, a left-hemispheric lateralization of MMN responses in the six-tone pattern suggests that additional networks that help structuring the patterns in the temporal domain might be recruited for demanding auditory processing in the pitch domain.

  18. Usefulness of surgical complexity classification index in cataract surgery process.

    Science.gov (United States)

    Salazar Méndez, R; Cuesta García, M; Llaneza Velasco, M E; Rodríguez Villa, S; Cubillas Martín, M; Alonso Álvarez, C M

    2016-06-01

    To evaluate the usefulness of the surgical complexity classification index (SCCI) in predicting the degree of surgical difficulty in cataract surgery. This retrospective study includes data collected between January 2013 and December 2014 from patients who underwent cataract extraction by phacoemulsification at our hospital. A sample size of 159 patients was obtained by simple random sampling (P=.5, 10% accuracy, 95% confidence). The main variables were: the recording and value of the SCCI in the electronic medical record (EMR), the presence of exfoliation syndrome (XFS), the criteria for inclusion in the surgical waiting list (SWL), and functional results. The SCCI was classified into 7 categories (range: 1-4) according to predictors of technical difficulty, which was indirectly estimated in terms of surgical time (ST). All statistical analyses were performed using SPSS v15.0 statistical software. The prevalence of XFS was 18.2% (95%CI: 11.9-24.5). In terms of quality indicators in the cataract surgery process, 96.8% of patients met at least one of the criteria to be included in the SWL, and 98.1% gained ≥2 Snellen lines. The SCCI was recorded in the EMR of 98.1% of patients, and it was grouped for study into 2 categories: high and low surgical complexity. Statistically significant differences in the distribution of ST were found depending on the assigned SCCI. Copyright © Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  19. [Effect of process parameters on the microencapsulation of piroxicam by complex coacervation].

    Science.gov (United States)

    Lamoudi, L; Chaumeil, J-C; Daoud, K

    2015-01-01

    The gelatin-acacia system is used for the microencapsulation of piroxicam by complex coacervation. The effects of several formulation and process parameters, namely the gelatin/gum acacia ratio, the core/wall ratio, the concentration of crosslinking agent and the crosslinking time, are studied, and the properties of the microcapsules are evaluated. The results showed that the microcapsules have a spherical shape, a coacervation efficiency greater than 70%, an average diameter of less than 250 microns and good stability; the best values were obtained for a gelatin/acacia ratio of 5/3, a core/wall ratio of 1/4, 2 mL of crosslinking agent and a crosslinking time of 60 minutes. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  20. EEG-based cognitive load of processing events in 3D virtual worlds is lower than processing events in 2D displays.

    Science.gov (United States)

    Dan, Alex; Reiner, Miriam

    2017-12-01

    Interacting with 2D displays, such as computer screens, smartphones, and TV, is currently a part of our daily routine; however, our visual system is built for processing 3D worlds. We examined the cognitive load associated with a simple and a complex task of learning paper-folding (origami) by observing 2D or stereoscopic 3D displays. While connected to an electroencephalogram (EEG) system, participants watched a 2D video of an instructor demonstrating the paper-folding tasks, followed by a stereoscopic 3D projection of the same instructor (a digital avatar) illustrating identical tasks. We recorded the power of alpha and theta oscillations and calculated the cognitive load index (CLI) as the ratio of the average power of frontal theta (Fz) and parietal alpha (Pz). The results showed a significantly higher cognitive load index associated with processing the 2D projection as compared to the 3D projection; additionally, changes in the average theta Fz power were larger for the 2D conditions as compared to the 3D conditions, while alpha average Pz power values were similar for 2D and 3D conditions for the less complex task and higher in the 3D state for the more complex task. The cognitive load index was lower for the easier task and higher for the more complex task in 2D and 3D. In addition, participants with lower spatial abilities benefited more from the 3D compared to the 2D display. These findings have implications for understanding cognitive processing associated with 2D and 3D worlds and for employing stereoscopic 3D technology over 2D displays in designing emerging virtual and augmented reality applications. Copyright © 2016 Elsevier B.V. All rights reserved.
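    The cognitive load index described here, frontal theta power divided by parietal alpha power, is straightforward to compute. Below is a hedged sketch on synthetic signals, not the authors' pipeline: band power is taken from a plain FFT periodogram, the band limits (theta 4-8 Hz, alpha 8-13 Hz) are common conventions rather than values taken from the paper, and the channel data are simulated.

```python
import numpy as np

def band_power(x, fs, lo, hi):
    """Mean power in [lo, hi] Hz from a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

def cognitive_load_index(fz, pz, fs):
    """CLI = frontal theta (4-8 Hz) power / parietal alpha (8-13 Hz) power."""
    return band_power(fz, fs, 4, 8) / band_power(pz, fs, 8, 13)

# Synthetic 10 s recording: a theta-dominated Fz channel, a weaker alpha Pz.
fs = 250
t = np.arange(0, 10, 1.0 / fs)
fz = 3.0 * np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)
pz = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 6 * t)
cli = cognitive_load_index(fz, pz, fs)  # well above 1: theta dominates alpha
```

    In a real analysis the periodogram would normally be replaced by an averaged estimator (e.g. Welch's method) over artifact-rejected epochs.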

  1. Initiating events identification of the IS process using the master logic diagram

    International Nuclear Information System (INIS)

    Cho, Nam Chul; Jae, Moo Sung; Yang, Joon Eon

    2005-01-01

    Hydrogen is very attractive as a future secondary energy carrier considering environmental problems. It is important to produce hydrogen from water by use of a carbon-free primary energy source. The thermochemical water decomposition cycle is one of the methods for producing hydrogen from water. The Japan Atomic Energy Research Institute (JAERI) has been carrying out R and D on the IS (iodine-sulfur) process, first proposed by GA (General Atomic Co.), focusing on demonstrating 'closed-cycle' continuous hydrogen production, on developing a feasible and efficient scheme for the HI processing, and on screening and/or developing materials of construction to be used in the corrosive process environment. Successful continuous operation of the IS process has been demonstrated, and this process is one of the thermochemical processes closest to being industrialized. Currently, Korea has also started research on the IS process, and construction of an IS process system is planned. In this study, for risk analysis of the IS process, initiating events of the IS process are identified by using the Master Logic Diagram (MLD), a method for initiating event identification.

  2. Journaling about stressful events: effects of cognitive processing and emotional expression.

    Science.gov (United States)

    Ullrich, Philip M; Lutgendorf, Susan K

    2002-01-01

    The effects of two journaling interventions, one focusing on emotional expression and the other on both cognitive processing and emotional expression, were compared during 1 month of journaling about a stressful or traumatic event. One hundred twenty-two students were randomly assigned to one of three writing conditions: (a) focusing on emotions related to a trauma or stressor, (b) focusing on cognitions and emotions related to a trauma or stressor, or (c) writing factually about media events. Writers focusing on cognitions and emotions developed greater awareness of the positive benefits of the stressful event than the other two groups. This effect was apparently mediated by greater cognitive processing during writing. Writers focusing on emotions alone reported more severe illness symptoms during the study than those in other conditions. This effect appeared to be mediated by a greater focus on negative emotional expression during writing.

  3. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    Science.gov (United States)

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  4. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    Geometric process was first introduced by Lam [10,11]. A stochastic process {X_i, i = 1, 2, …} is called a geometric process (GP) if, for some a > 0, {a^(i-1) X_i, i = 1, 2, …} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best among these four models in analyzing the data from a series of events.
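    The defining property above, that {a^(i-1) X_i} is a renewal process, implies that E[ln X_i] falls linearly with i, so the ratio a can be recovered from a log-scale least-squares fit. The sketch below illustrates this under a lognormal noise assumption; it is not the nonparametric estimator of the paper, and the parameter values are hypothetical.

```python
import math
import random

def simulate_gp(a, n, mu=1.0, sigma=0.25, rng=None):
    """Simulate a geometric process: X_i = Y_i / a**(i-1), where the Y_i
    are i.i.d. lognormal renewal intervals (an assumed noise model)."""
    rng = rng or random.Random(42)
    return [rng.lognormvariate(math.log(mu), sigma) / a ** i for i in range(n)]

def estimate_ratio(xs):
    """Fit ln X_i = c - (i-1) ln a by least squares: the slope of ln X_i
    against (i-1) estimates -ln a."""
    n = len(xs)
    t = list(range(n))                       # t = i - 1
    y = [math.log(x) for x in xs]
    tbar, ybar = sum(t) / n, sum(y) / n
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) \
        / sum((ti - tbar) ** 2 for ti in t)
    return math.exp(-slope)                  # estimate of a

xs = simulate_gp(a=1.05, n=500)
a_hat = estimate_ratio(xs)                   # close to the true a = 1.05
```

    With a > 1 the inter-event times shrink geometrically (a deteriorating system); a < 1 models improvement, and a = 1 reduces to an ordinary renewal process.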

  5. Effects of Grammatical Categories on Children's Visual Language Processing: Evidence from Event-Related Brain Potentials

    Science.gov (United States)

    Weber-Fox, Christine; Hart, Laura J.; Spruill, John E., III

    2006-01-01

    This study examined how school-aged children process different grammatical categories. Event-related brain potentials elicited by words in visually presented sentences were analyzed according to seven grammatical categories with naturally varying characteristics of linguistic functions, semantic features, and quantitative attributes of length and…

  6. Event-related Potentials Reflecting the Processing of Phonological Constraint Violations

    NARCIS (Netherlands)

    Domahs, Ulrike; Kehrein, Wolfgang; Knaus, Johannes; Wiese, Richard; Schlesewsky, Matthias

    2009-01-01

    How are violations of phonological constraints processed in word comprehension? The present article reports the results of an event-related potentials (ERP) study on a phonological constraint of German that disallows identical segments within a syllable or word (CC(i)VC(i)). We examined three

  7. Event-related potentials reflecting the processing of phonological constraint violations

    NARCIS (Netherlands)

    Domahs, U.; Kehrein, W.; Knaus, J.; Wiese, R.; Schlesewsky, M.

    2009-01-01

    How are violations of phonological constraints processed in word comprehension? The present article reports the results of an event-related potentials (ERP) study on a phonological constraint of German that disallows identical segments within a syllable or word (CC(i)VC(i)). We examined three types of

  8. Early referential context effects in sentence processing: Evidence from event-related brain potentials

    NARCIS (Netherlands)

    Berkum, J.J.A. van; Brown, C.M.; Hagoort, P.

    1999-01-01

    An event-related brain potentials experiment was carried out to examine the interplay of referential and structural factors during sentence processing in discourse. Subjects read (Dutch) sentences beginning like “David told the girl that … ” in short story contexts that had introduced either one or

  9. Preparing for novel versus familiar events: shifts in global and local processing

    NARCIS (Netherlands)

    Förster, J.; Liberman, N.; Shapiro, O.

    2009-01-01

    Six experiments examined whether novelty versus familiarity influences global versus local processing styles. Novelty and familiarity were manipulated by either framing a task as new versus familiar or by asking participants to reflect upon novel versus familiar events prior to the task (i.e.,

  10. Working memory processes show different degrees of lateralization : Evidence from event-related potentials

    NARCIS (Netherlands)

    Talsma, D; Wijers, A.A.; Klaver, P; Mulder, G.

    This study aimed to identify different processes in working memory, using event-related potentials (ERPs) and response times. Abstract polygons were presented for memorization and subsequent recall in a delayed matching-to-sample paradigm. Two polygons were presented bilaterally for memorization and

  11. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    International Nuclear Information System (INIS)

    Perez, Rafael B.; Protopopescu, Vladimir A.; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.
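    As a rough illustration of the detection idea, one sifting pass in the spirit of empirical mode decomposition can isolate a fast component whose local amplitude flags intermittent events. This is a toy sketch, not the Hilbert-Huang implementation used in the paper: the envelopes are linearly interpolated (full EMD uses cubic splines and repeated sifting), and the threshold rule is an assumption.

```python
import numpy as np

def sift_once(x):
    """One EMD-style sifting pass: subtract the mean of the upper and
    lower envelopes (linear interpolation through local extrema)."""
    i = np.arange(len(x))
    maxima = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    minima = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    upper = np.interp(i, maxima, x[maxima])
    lower = np.interp(i, minima, x[minima])
    return x - (upper + lower) / 2.0         # fast, locally zero-mean component

def flag_bursts(c, win=50, k=3.0):
    """Flag samples where the local RMS of the fast component exceeds
    k times its median level: a crude intermittent-event detector."""
    rms = np.sqrt(np.convolve(c ** 2, np.ones(win) / win, mode="same"))
    return rms > k * np.median(rms)

# Slow baseline + noise, with a short high-frequency burst hidden inside.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
signal = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)
signal[1000:1060] += 2.0 * np.sin(2 * np.pi * 25 * t[1000:1060])
mask = flag_bursts(sift_once(signal))        # True around the burst only
```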

  12. A tool for aligning event logs and prescriptive process models through automated planning

    OpenAIRE

    De Leoni, M.; Lanciano, G.; Marrella, A.

    2017-01-01

    In Conformance Checking, alignment is the problem of detecting and repairing nonconformity between the actual execution of a business process, as recorded in an event log, and the model of the same process. Literature proposes solutions for the alignment problem that are implementations of planning algorithms built ad-hoc for the specific problem. Unfortunately, in the era of big data, these ad-hoc implementations do not scale sufficiently compared with well-established planning systems. In th...

  13. Event recognition by detrended fluctuation analysis: An application to Teide-Pico Viejo volcanic complex, Tenerife, Spain

    International Nuclear Information System (INIS)

    Del Pin, Enrico; Carniel, Roberto; Tarraga, Marta

    2008-01-01

    In this work we investigate the application of detrended fluctuation analysis (DFA) to seismic data recorded on the island of Tenerife (Canary Islands, Spain) during the month of July 2004, in a phase of possible unrest of the Teide-Pico Viejo volcanic complex. Tectonic events recorded in the area are recognized and located by the Spanish national agency Instituto Geografico Nacional (IGN), and its catalogue is the only currently available dataset, whose completeness unfortunately suffers from the strong presence of anthropogenic noise. In this paper we propose the use of DFA to help automatically identify events. The evaluation of this case study proves DFA to be a promising tool for rapidly screening large seismic datasets and highlighting time windows with the potential presence of discrete events.
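    A minimal DFA in the spirit of the method named above, assuming first-order (linear) detrending: integrate the signal into a profile, split it into boxes, remove a linear trend per box, and fit the log-log slope of the fluctuation function. For an uncorrelated signal the fitted exponent should come out near 0.5; persistent signals give larger values. A sketch, not the authors' implementation.

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis (DFA-1): returns the scaling
    exponent alpha from the fit F(n) ~ n**alpha."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    fluct = []
    for n in scales:
        boxes = len(y) // n
        f2 = 0.0
        for b in range(boxes):
            seg = y[b * n:(b + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # linear detrending per box
            f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
        fluct.append(np.sqrt(f2 / boxes))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(1)
alpha = dfa_exponent(rng.standard_normal(4096))  # near 0.5 for white noise
```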

  14. Supporting change processes in design: Complexity, prediction and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Eckert, Claudia M. [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: cme26@cam.ac.uk; Keller, Rene [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: rk313@cam.ac.uk; Earl, Chris [Open University, Department of Design and Innovation, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)]. E-mail: C.F.Earl@open.ac.uk; Clarkson, P. John [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: pjc10@cam.ac.uk

    2006-12-15

    Change to existing products is fundamental to design processes. New products are often designed through change or modification to existing products. Specific parts or subsystems are changed to similar ones whilst others are directly reused. Design by modification applies particularly to safety critical products, where the reuse of existing working parts and subsystems can reduce cost and risk. However, change is rarely a matter of just reusing or modifying parts. Changing one part can propagate through the entire design, leading to costly rework or jeopardising the integrity of the whole product. This paper characterises product change based on studies in the aerospace and automotive industries and introduces tools to aid designers in understanding the potential effects of change. Two ways of supporting designers are described: probabilistic prediction of the effects of change and visualisation of change propagation through product connectivities. Change propagation has uncertainties which are amplified by the choices designers make in practice as they implement change. Change prediction and visualisation are discussed with reference to complexity in three areas of product development: the structural backcloth of connectivities in the existing product (and its processes), the descriptions of the product used in design and the actions taken to carry out changes.
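    The probabilistic prediction of change effects can be illustrated with a toy connectivity model: treat each simple path between two components as an independent propagation route and combine the routes. This is a sketch of the general idea only; the component names and direct-change likelihoods below are hypothetical, and the independence assumption is a simplification of published change prediction methods.

```python
def simple_paths(edges, src, dst, seen=None):
    """Yield the edge-likelihood lists of all simple paths src -> dst
    in a directed connectivity graph."""
    seen = (seen or set()) | {src}
    if src == dst:
        yield []
    for nxt, p in edges.get(src, {}).items():
        if nxt not in seen:
            for rest in simple_paths(edges, nxt, dst, seen):
                yield [p] + rest

def propagation_likelihood(edges, src, dst):
    """Combine paths as independent routes: P = 1 - prod(1 - P(path)),
    where P(path) is the product of direct change likelihoods along it."""
    combined_miss = 1.0
    for path in simple_paths(edges, src, dst):
        p_path = 1.0
        for p in path:
            p_path *= p
        combined_miss *= 1.0 - p_path
    return 1.0 - combined_miss

# Hypothetical connectivity: engine -> {mount, wiring} -> chassis.
edges = {
    "engine": {"mount": 0.5, "wiring": 0.2},
    "mount": {"chassis": 0.4},
    "wiring": {"chassis": 0.1},
}
risk = propagation_likelihood(edges, "engine", "chassis")
# Two routes: 0.5*0.4 = 0.2 and 0.2*0.1 = 0.02, combining to 0.216.
```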

  15. WAITING TIME DISTRIBUTION OF SOLAR ENERGETIC PARTICLE EVENTS MODELED WITH A NON-STATIONARY POISSON PROCESS

    International Nuclear Information System (INIS)

    Li, C.; Su, W.; Fang, C.; Zhong, S. J.; Wang, L.

    2014-01-01

    We present a study of the waiting time distributions (WTDs) of solar energetic particle (SEP) events observed with the spacecraft WIND and GOES. The WTDs of both solar electron events (SEEs) and solar proton events (SPEs) display a power-law tail of ~Δt^(−γ). The SEEs display a broken power-law WTD: the power-law index is γ1 = 0.99 for short waiting times (<70 hr) and γ2 = 1.92 for large waiting times (>100 hr). The break in the WTD of SEEs is probably due to the modulation of the corotating interaction regions. The power-law index γ ~ 1.82 is derived for the WTD of the SPEs, which is consistent with the WTD of type II radio bursts, indicating a close relationship between the shock wave and the production of energetic protons. The WTDs of SEP events can be modeled with a non-stationary Poisson process, which was proposed to understand the waiting time statistics of solar flares. We generalize the method and find that, if the SEP event rate λ = 1/Δt varies with a rate distribution f(λ) = Aλ^(−α) exp(−βλ), the time-dependent Poisson distribution can produce a power-law tail WTD of ~Δt^(α−3), where 0 ≤ α < 2.
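    The Δt^(α−3) tail quoted above follows from a short calculation (a sketch of the standard argument, with normalization constants dropped). For a piecewise-stationary Poisson process whose rate λ is distributed as f(λ), the waiting-time density is a rate-weighted mixture of exponentials: one factor of λ is the exponential density λe^(−λΔt), and the second weights each rate by how many intervals it contributes. The remaining integral is an ordinary gamma integral:

```latex
P(\Delta t) \propto \int_0^\infty \lambda^2 e^{-\lambda \Delta t} f(\lambda)\, \mathrm{d}\lambda
            = A \int_0^\infty \lambda^{2-\alpha} e^{-\lambda(\beta + \Delta t)}\, \mathrm{d}\lambda
            = A\,\Gamma(3-\alpha)\,(\beta + \Delta t)^{\alpha-3}
            \sim \Delta t^{\alpha-3} \qquad (\Delta t \gg \beta)
```

    so the exponent α − 3 is recovered directly; convergence of the gamma integral requires α < 3, consistent with the stated range 0 ≤ α < 2.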

  16. A digital pixel cell for address event representation image convolution processing

    Science.gov (United States)

    Camunas-Mesa, Luis; Acosta-Jimenez, Antonio; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate events according to their information levels. Neurons with more information (activity, derivative of activities, contrast, motion, edges, ...) generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. AER technology has been used and reported for the implementation of various types of image sensors or retinae: luminance with local AGC, contrast retinae, motion retinae, ... Also, there has been a proposal for realizing programmable kernel image convolution chips. Such convolution chips would contain an array of pixels that perform weighted addition of events. Once a pixel has added sufficient event contributions to reach a fixed threshold, the pixel fires an event, which is then routed out of the chip for further processing. Such convolution chips have been proposed to be implemented using pulsed current-mode mixed analog and digital circuit techniques. In this paper we present a fully digital pixel implementation to perform the weighted additions and fire the events. This way, for a given technology, there is a fully digital implementation reference against which to compare the mixed-signal implementations. We have designed, implemented and tested a fully digital AER convolution pixel. This pixel will be used to implement a full AER convolution chip for programmable kernel image convolution processing.
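    The accumulate-and-fire behavior described for the pixel array can be sketched in software. This is a hedged behavioral model, not the chip's circuit design; the class name, kernel and threshold values are hypothetical.

```python
class AERConvolutionArray:
    """Behavioral model of a digital AER convolution pixel array: each
    input address event adds the kernel, centred on the event address,
    into per-pixel accumulators; a pixel whose accumulator reaches the
    threshold fires an output event and resets."""

    def __init__(self, width, height, kernel, threshold):
        self.w, self.h = width, height
        self.kernel = kernel              # 2D list of weights (odd size)
        self.kc = len(kernel) // 2        # kernel half-size
        self.threshold = threshold
        self.acc = [[0.0] * width for _ in range(height)]

    def input_event(self, x, y):
        """Process one input event; return the output events fired."""
        fired = []
        for dy, row in enumerate(self.kernel):
            for dx, w in enumerate(row):
                px, py = x + dx - self.kc, y + dy - self.kc
                if 0 <= px < self.w and 0 <= py < self.h:
                    self.acc[py][px] += w
                    if self.acc[py][px] >= self.threshold:
                        self.acc[py][px] = 0.0   # fire and reset
                        fired.append((px, py))
        return fired

# Three identical input events: the centre pixel (weight 2) reaches the
# threshold of 4 on the second event and fires exactly once.
chip = AERConvolutionArray(8, 8, [[0, 1, 0], [1, 2, 1], [0, 1, 0]], threshold=4)
out = []
for _ in range(3):
    out += chip.input_event(4, 4)
```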

  17. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces workload on the offshore organisation and, most important, reduces discharge to air and sea. There are a number of mechanisms and tools available in order to achieve high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, all of these tools will only be effective if quick and proper analysis of fluids and deposits is carried out. In fact, analytical backup is a powerful tool used to maintain optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required by production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems will be presented: 1) deposition in a complex platform processing system; 2) contaminated production chemicals; 3) improved monitoring of scale inhibitor, suspended solids and ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author) (tk)

  18. A Cognition-based View of Decision Processes in Complex Social-Ecological Systems

    Directory of Open Access Journals (Sweden)

    Kathi K. Beratan

    2007-06-01

    Full Text Available This synthesis paper is intended to provide an overview of individual and collective decision-making processes that might serve as a theoretical foundation for a complexity-based approach to environmental policy design and natural resource management planning. Human activities are the primary drivers of change in the Earth's biosphere today, so efforts to shift the trajectory of social-ecological systems must focus on changes in individual and collective human behavior. Recent advances in understanding the biological basis of thought and memory offer insights of use in designing management and planning processes. The human brain has evolved ways of dealing with complexity and uncertainty, and is particularly attuned to social information. Changes in an individual's schemas, reflecting changes in the patterns of neural connections that are activated by particular stimuli, occur primarily through nonconscious processes in response to experiential learning during repeated exposure to novel situations, ideas, and relationships. Discourse is an important mechanism for schema modification, and thus for behavior change. Through discourse, groups of people construct a shared story - a collective model - that is useful for predicting likely outcomes of actions and events. In effect, good stories are models that filter and organize distributed knowledge about complex situations and relationships in ways that are readily absorbed by human cognitive processes. The importance of discourse supports the view that collaborative approaches are needed to effectively deal with environmental problems and natural resource management challenges. Methods derived from the field of mediation and dispute resolution can help us take advantage of the distinctly human ability to deal with complexity and uncertainty. This cognitive view of decision making supports fundamental elements of resilience management and adaptive co-management, including fostering social learning

  19. Source complexity of the May 20, 2012, Mw 5.9, Ferrara (Italy) event

    Directory of Open Access Journals (Sweden)

    Davide Piccinini

    2012-10-01

    Full Text Available A Mw 3.9 foreshock on May 19, 2012, at 23:13 UTC, was followed at 02:03 on May 20, 2012, by a Mw 5.9 earthquake that hit a densely populated area in the Po Plain, west of the city of Ferrara, Italy (Figure 1). Over the subsequent 13 days, six Mw >5 events occurred; of these, the most energetic was a Mw 5.8 earthquake on May 29, 2012, 12 km WSW of the main shock. The tragic toll of this sequence was 17 casualties, hundreds of injured, and severe damage to the historical and cultural heritage of the area. From a seismological point of view, the 2012 earthquake was not an outstanding event in its regional context. The same area was hit in 1996 by a Mw 5.4 earthquake [Selvaggi et al. 2001], and previously in 1986 and in 1967 (DBMI11 [Locati et al. 2011]). The most destructive historical event was the 1570, Imax 8 event, which struck the town of Ferrara [Guidoboni et al. 2007, Rovida et al. 2011]. The 2012 seismic sequence lasted for several weeks and probably developed on a well-known buried thrust fault [Basili et al. 2008, Toscani et al. 2009, DISS Working Group 2010], at depths between 2 km and 10-12 km. […

  20. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large scale data collection of driver, vehicle, and environment information in real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data is exponentially growing over time, this reviewing procedure may not be viable anymore in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reaction may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state of the art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential

  1. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    Science.gov (United States)

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

    Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in the light of the overall goal of MTI of recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Starting from the results of optimized single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013), which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm and ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps in MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding of eight events into the female RP and seven into the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP. Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity
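    The "next-generation seed needs" mentioned above can be illustrated with a standard selfing calculation: a plant heterozygous at k unlinked event loci produces a progeny homozygous for all k events with probability (1/4)^k, so the seed count required for a given confidence follows from the complement rule. This is a hedged sketch of the textbook arithmetic, not the authors' actual planning numbers.

```python
import math

def seeds_needed(k_events, confidence=0.95):
    """Progeny required so that, with the given confidence, at least one
    selfed individual is homozygous for all k events (per-individual
    probability (1/4)**k, assuming independent, unlinked loci)."""
    p = 0.25 ** k_events
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

n1 = seeds_needed(1)   # a single event: a small family suffices
n3 = seeds_needed(3)   # a three-event stack: roughly two hundred seeds
```

    The exponential growth of n with k is one reason doubled haploid approaches (which fix all loci in a single generation) become attractive for larger stacks.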

  2. A bioactive molecule in a complex wound healing process: platelet-derived growth factor.

    Science.gov (United States)

    Kaltalioglu, Kaan; Coskun-Cevher, Sule

    2015-08-01

    Wound healing is considered to be particularly important after surgical procedures, and the most important wounds related to surgical procedures are incisional, excisional, and punch wounds. Research is ongoing to identify methods to heal non-closed wounds or to accelerate wound healing; however, wound healing is a complex process that includes many biological and physiological events, and it is affected by various local and systemic factors, including diabetes mellitus, infection, ischemia, and aging. Different cell types (such as platelets, macrophages, and neutrophils) release growth factors during the healing process, and platelet-derived growth factor is a particularly important mediator in most stages of wound healing. This review explores the relationship between platelet-derived growth factor and wound healing. © 2014 The International Society of Dermatology.

  3. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis module. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
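    The execution model described, popping events in time order until the event queue is emptied while each event's action may schedule further events, is the classic discrete event simulation loop. Below is a minimal sketch of that loop, not the disclosed tool's design; the valve example stands in for "continuous behavior defined discretely as invocation, effect, and time delay".

```python
import heapq

class Simulator:
    """Minimal discrete event simulator: events are (time, action) pairs
    in a priority queue; run() pops them in time order until the queue
    is emptied, and each action may schedule further events."""

    def __init__(self):
        self.queue = []
        self.now = 0.0
        self._seq = 0                    # tie-breaker for equal times

    def schedule(self, delay, action):
        heapq.heappush(self.queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self.queue:
            self.now, _, action = heapq.heappop(self.queue)
            action(self)

# Hypothetical component: a valve opens at t = 1 s and schedules its own
# closing 5 s later (continuous behavior reduced to two discrete events).
log = []

def valve_opens(sim):
    log.append(("open", sim.now))
    sim.schedule(5.0, valve_closes)

def valve_closes(sim):
    log.append(("close", sim.now))

sim = Simulator()
sim.schedule(1.0, valve_opens)
sim.run()
```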

  4. Food processing as an agricultural countermeasure after an accidental contamination event

    International Nuclear Information System (INIS)

    Igreja, Eduardo; Rochedo, Elaine R.R.; Prado, Nadya M.P.D.; Silva, Diogo N.G.

    2013-01-01

Food processing allows a significant reduction in the radionuclide contamination of foodstuffs. The effects of processing on contaminated food depend on the radionuclide, the type of foodstuff, and the method of processing. The effectiveness of radionuclide removal from raw material during processing can vary widely; however, processing of raw materials of vegetable and animal origin is often considered one of the most effective countermeasures for reducing the radioactive contamination of foodstuffs to or below permissible levels, and it can be applied both domestically and in the industrial processing of food. The food processing retention factor, Fr, is the fraction of radionuclide activity that is retained in the food after processing; it is the product of two quantities: the processing efficiency, Pe, which is the ratio of the fresh weight of the processed food to the weight of the original raw material, and the processing factor, Pf, which is the ratio of the radionuclide activity concentrations in the processed food and in the raw material. The objective of this work was to investigate the reduction in dose due to food processing after a nuclear or radiological accident. The radionuclides considered were Cs-137, Sr-90, and I-131. The effect on the total diet of individuals was investigated for a typical diet of the Southeast region, where the Brazilian Nuclear Power Plants are located. The effect was analyzed considering the use of the processing technologies after contamination events occurring in different seasons of the year. (author)
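The retention factor defined in this record follows directly from its two components. A small sketch of the arithmetic (the weights and activity concentrations below are made-up illustrative numbers, not data from the study):

```python
def processing_efficiency(processed_weight_kg, raw_weight_kg):
    """Pe: ratio of processed-food weight to original raw-material weight."""
    return processed_weight_kg / raw_weight_kg

def processing_factor(conc_processed_bq_kg, conc_raw_bq_kg):
    """Pf: ratio of activity concentrations (processed / raw)."""
    return conc_processed_bq_kg / conc_raw_bq_kg

def retention_factor(pe, pf):
    """Fr = Pe * Pf: fraction of radionuclide activity retained."""
    return pe * pf

# Hypothetical example: 10 kg of raw material yields 8 kg of processed
# food, and the Cs-137 concentration drops from 100 to 50 Bq/kg.
pe = processing_efficiency(8.0, 10.0)   # 0.8
pf = processing_factor(50.0, 100.0)     # 0.5
print(retention_factor(pe, pf))         # 0.4
```

So even though the concentration is halved (Pf = 0.5), 40% of the original activity remains in the processed food because most of the mass is retained.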

  5. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  6. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applications.

  7. An integrative process model of leadership: examining loci, mechanisms, and event cycles.

    Science.gov (United States)

    Eberly, Marion B; Johnson, Michael D; Hernandez, Morela; Avolio, Bruce J

    2013-09-01

    Utilizing the locus (source) and mechanism (transmission) of leadership framework (Hernandez, Eberly, Avolio, & Johnson, 2011), we propose and examine the application of an integrative process model of leadership to help determine the psychological interactive processes that constitute leadership. In particular, we identify the various dynamics involved in generating leadership processes by modeling how the loci and mechanisms interact through a series of leadership event cycles. We discuss the major implications of this model for advancing an integrative understanding of what constitutes leadership and its current and future impact on the field of psychological theory, research, and practice. © 2013 APA, all rights reserved.

  8. Relationship between single-event upset immunity and fabrication processes of recent memories

    International Nuclear Information System (INIS)

    Nemoto, N.; Shindou, H.; Kuboyama, S.; Matsuda, S.; Itoh, H.; Okada, S.; Nashiyama, I.

    1999-01-01

Single-event upset (SEU) immunity of recent commercial memory devices was evaluated by irradiation tests using high-energy heavy ions, in order to find the relationship between the SEU rate and the devices' structures and fabrication processes. We present the test results and show that changes in the process parameters strongly affect the SEU rate of the devices. (authors)

  9. A modeling process to understand complex system architectures

    Science.gov (United States)

    Robinson, Santiago Balestrini

    2009-12-01

In recent decades, several tools have been developed by the armed forces, and their contractors, to test the capability of a force. These campaign-level analysis tools, often characterized as constructive simulations, are generally expensive to create and execute, and at best they are extremely difficult to verify and validate. This central observation, that analysts are relying more and more on constructive simulations to predict the performance of future networks of systems, leads to the two central objectives of this thesis: (1) to enable the quantitative comparison of architectures in terms of their ability to satisfy a capability without resorting to constructive simulations, and (2) when constructive simulations must be created, to quantitatively determine how to spend the modeling effort amongst the different system classes. The first objective led to Hypothesis A, the first main hypothesis, which states that by studying the relationships between the entities that compose an architecture, one can infer how well it will perform a given capability. The method used to test the hypothesis is based on two assumptions: (1) that the capability can be defined as a cycle of functions, and (2) that it must be possible to estimate the probability that a function-based relationship occurs between any two types of entities. If these two requirements are met, then by creating random functional networks, different architectures can be compared in terms of their ability to satisfy a capability. In order to test this hypothesis, a novel process for creating representative functional networks of large-scale system architectures was developed. The process, named Digraph Modeling for Architectures (DiMA), was tested by comparing its results to those of complex constructive simulations. Results indicate that if the inputs assigned to DiMA are correct (in the tests they were based on time-averaged data obtained from the ABM), DiMA is able to identify which of any two
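The random-functional-network idea in this record can be illustrated with a toy Monte Carlo calculation (our sketch, not DiMA itself; the entity types, the two-link capability cycle, and the probabilities are all hypothetical):

```python
import random

# Sketch: given the probability that a function-based relationship
# occurs between two entity types, sample many random networks and
# estimate how often the whole capability cycle is satisfied.
P = {("sensor", "processor"): 0.9,   # hypothetical "detect" link
     ("processor", "shooter"): 0.6}  # hypothetical "engage" link

def cycle_closes(p, rng):
    # The capability is satisfied only if every link in the cycle occurs.
    return all(rng.random() < p[edge] for edge in p)

rng = random.Random(42)
trials = 10_000
hits = sum(cycle_closes(P, rng) for _ in range(trials))
print(hits / trials)  # close to 0.9 * 0.6 = 0.54
```

With independent links the estimate converges to the product of the link probabilities, which is why such architectures can be ranked without running a full constructive simulation.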

  10. [Biohydrometallurgical technology of a complex copper concentrate process].

    Science.gov (United States)

    Murav'ev, M I; Fomchenko, N V; Kondrat'eva, T F

    2011-01-01

Leaching of sulfide-oxidized copper concentrate of the Udokan deposit ore with a copper content of 37.4% was studied. In the course of treatment in a sulfuric acid solution with pH 1.2, the copper leaching rate was 6.9 g/kg h for 22 h, which allowed extraction of 40.6% of the copper. As a result of subsequent chemical leaching at 80 degrees C for 7 h with a ferric iron sulfate solution obtained after bio-oxidation by an association of microorganisms, the rate of copper recovery was 52.7 g/kg h. The total copper recovery was 94.5% (over 29 h). Regeneration of the Fe3+ ions was carried out by an association of moderately thermophilic microorganisms, including bacteria of the genus Sulfobacillus and the archaeon Ferroplasma acidiphilum, at 1.0 g/l h at 40 degrees C in the presence of 3% solids obtained by chemical leaching of copper concentrate. A technological scheme for processing a complex copper concentrate with the use of bacterial-chemical leaching is proposed.

  11. Simulating flaring events in complex active regions driven by observed magnetograms

    Science.gov (United States)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2011-05-01

Context. We interpret solar flares as events originating in active regions that have reached the self-organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self-organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self-organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self-organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has reached the self-organized critical state.
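The loading/relaxation cycle described in the Methods section can be illustrated with a toy sandpile-style automaton (a generic self-organized-criticality sketch, not the authors' magnetic-field model; the grid size, threshold, and redistribution rule are our choices):

```python
import random

# Toy SOC loop: repeatedly "load" a random cell, then "relax" every cell
# exceeding the threshold by diffusing to its neighbours (grains fall off
# the open boundaries), and record the size of each avalanche.
N, THRESH = 30, 2
grid = [0] * N
rng = random.Random(1)

def relax(grid):
    topplings = 0
    while True:
        hot = [i for i in range(N) if grid[i] >= THRESH]
        if not hot:
            return topplings
        for i in hot:
            grid[i] -= 2
            if i > 0:
                grid[i - 1] += 1
            if i < N - 1:
                grid[i + 1] += 1
            topplings += 1

sizes = []
for _ in range(5000):
    grid[rng.randrange(N)] += 1          # loading step
    sizes.append(relax(grid))            # relaxation step
print(max(grid) < THRESH, max(sizes) > 0)  # True True
```

After many cycles the system hovers near the threshold everywhere, so a single loading event can trigger avalanches of widely varying size; the distribution of `sizes` is what is examined for scaling laws in the study.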

  12. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  13. Modeling associations between latent event processes governing time series of pulsing hormones.

    Science.gov (United States)

    Liu, Huayu; Carlson, Nichole E; Grunwald, Gary K; Polotsky, Alex J

    2017-10-31

    This work is motivated by a desire to quantify relationships between two time series of pulsing hormone concentrations. The locations of pulses are not directly observed and may be considered latent event processes. The latent event processes of pulsing hormones are often associated. It is this joint relationship we model. Current approaches to jointly modeling pulsing hormone data generally assume that a pulse in one hormone is coupled with a pulse in another hormone (one-to-one association). However, pulse coupling is often imperfect. Existing joint models are not flexible enough for imperfect systems. In this article, we develop a more flexible class of pulse association models that incorporate parameters quantifying imperfect pulse associations. We propose a novel use of the Cox process model as a model of how pulse events co-occur in time. We embed the Cox process model into a hormone concentration model. Hormone concentration is the observed data. Spatial birth and death Markov chain Monte Carlo is used for estimation. Simulations show the joint model works well for quantifying both perfect and imperfect associations and offers estimation improvements over single hormone analyses. We apply this model to luteinizing hormone (LH) and follicle stimulating hormone (FSH), two reproductive hormones. Use of our joint model results in an ability to investigate novel hypotheses regarding associations between LH and FSH secretion in obese and non-obese women. © 2017, The International Biometric Society.
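The idea of imperfectly coupled latent pulse events can be sketched by thinning a shared Poisson process (an illustration of the modeling idea only, not the authors' Cox-process estimation procedure; the window, rate, and expression probabilities are invented):

```python
import random

# Candidate pulse times come from one shared (latent) Poisson process;
# each hormone "expresses" a candidate pulse with its own probability,
# so the two observed pulse trains are associated but imperfectly so.
rng = random.Random(7)
T, rate = 1000.0, 0.05           # observation window (min), pulses/min
p_lh, p_fsh = 0.9, 0.6           # per-pulse expression probabilities

t, candidates = 0.0, []
while True:
    t += rng.expovariate(rate)   # exponential inter-arrival times
    if t > T:
        break
    candidates.append(t)

lh  = [u for u in candidates if rng.random() < p_lh]
fsh = [u for u in candidates if rng.random() < p_fsh]
shared = set(lh) & set(fsh)      # pulses that co-occur in both hormones
print(len(candidates), len(lh), len(fsh), len(shared))
```

Setting both expression probabilities to 1 recovers the one-to-one coupling assumed by earlier joint models; values below 1 give the imperfect association the article argues for.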

  14. Modeling the Process of Event Sequence Data Generated for Working Condition Diagnosis

    Directory of Open Access Journals (Sweden)

    Jianwei Ding

    2015-01-01

Full Text Available Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance focuses on analyzing these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all the telemetry data to understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of WCM, we are able to analyze how event sequence data behave in different working modes and, meanwhile, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied WCM to illustrative applications such as the automated detection of an anomalous event sequence during the runtime of equipment. Our experimental results on real data sets demonstrate the effectiveness of the model.

  15. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g., Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications.
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement, and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
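The network-wide Bayesian combination described in this record can be sketched as follows (our simplification under an independence assumption, not the ProbDet implementation; the prior and the per-station likelihood ratios below are invented):

```python
import math

# Combine per-station evidence for "event" vs "noise" with Bayes' rule
# in log-odds space. Each station contributes a likelihood ratio
# P(data | event) / P(data | noise); stations are assumed independent.
def combine(prior, likelihood_ratios):
    """Posterior P(event) given a prior and per-station likelihood ratios."""
    log_odds = math.log(prior / (1.0 - prior))
    log_odds += sum(math.log(lr) for lr in likelihood_ratios)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Three detecting stations (LR > 1), then the same network with one
# non-detecting station (LR < 1): the non-detection still contributes
# information instead of being ignored.
print(combine(1e-3, [50.0, 20.0, 10.0]))
print(combine(1e-3, [50.0, 20.0, 10.0, 0.1]))
```

This is the sense in which a formalized probabilistic framework uses non-detecting stations: a quiet station lowers the posterior rather than being dropped from the association.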

  16. Information structure influences depth of syntactic processing: event-related potential evidence for the Chomsky illusion.

    Science.gov (United States)

    Wang, Lin; Bastiaansen, Marcel; Yang, Yufang; Hagoort, Peter

    2012-01-01

    Information structure facilitates communication between interlocutors by highlighting relevant information. It has previously been shown that information structure modulates the depth of semantic processing. Here we used event-related potentials to investigate whether information structure can modulate the depth of syntactic processing. In question-answer pairs, subtle (number agreement) or salient (phrase structure) syntactic violations were placed either in focus or out of focus through information structure marking. P600 effects to these violations reflect the depth of syntactic processing. For subtle violations, a P600 effect was observed in the focus condition, but not in the non-focus condition. For salient violations, comparable P600 effects were found in both conditions. These results indicate that information structure can modulate the depth of syntactic processing, but that this effect depends on the salience of the information. When subtle violations are not in focus, they are processed less elaborately. We label this phenomenon the Chomsky illusion.

  17. Attenuation of deep semantic processing during mind wandering: an event-related potential study.

    Science.gov (United States)

    Xu, Judy; Friedman, David; Metcalfe, Janet

    2018-03-21

Although much research shows that early sensory and attentional processing is affected by mind wandering, the effect of mind wandering on deep (i.e. semantic) processing is relatively unexplored. To investigate this relation, we recorded event-related potentials as participants studied English-Spanish word pairs, one at a time, while being intermittently probed for whether they were 'on task' or 'mind wandering'. Both perceptual processing, indexed by the P2 component, and deep processing, indexed by a late, sustained slow wave maximal at parietal electrodes, were attenuated during periods preceding participants' mind wandering reports. The pattern when participants were on task, rather than mind wandering, is similar to the subsequent memory or difference in memory effect. These results support previous findings of sensory attenuation during mind wandering, and extend them to a long-duration slow wave by suggesting that the deeper and more sustained levels of processing are also disrupted.

  18. Available processing resources influence encoding-related brain activity before an event

    OpenAIRE

    Galli, Giulia; Gebert, A. Dorothea; Otten, Leun J.

    2013-01-01

    Effective cognitive functioning not only relies on brain activity elicited by an event, but also on activity that precedes it. This has been demonstrated in a number of cognitive domains, including memory. Here, we show that brain activity that precedes the effective encoding of a word into long-term memory depends on the availability of sufficient processing resources. We recorded electrical brain activity from the scalps of healthy adult men and women while they memorized intermixed visual ...

  19. An Event-related Brain Potential Study of English Morphosyntactic Processing in Japanese Learners of English

    OpenAIRE

    Tatsuta, Natsuko

    2014-01-01

    This dissertation investigated the neural mechanisms underlying English morphosyntactic processing in Case, subject-verb agreement, and past tense inflection in Japanese learners of English (JLEs) using event-related brain potentials (ERPs) in terms of the effects of the age of second language (L2) acquisition (the age of learning English), L2 proficiency level (the English proficiency level), and native/first language (L1) transfer. Researchers have debated for a number of years the question...

  20. Investigation of Microphysical Parameters within Winter and Summer Type Precipitation Events over Mountainous [Complex] Terrain

    International Nuclear Information System (INIS)

    Stalker, James R.; Bossert, James E.

    1997-10-01

In this study we investigate complex terrain effects on precipitation with RAMS for both winter and summer cases from a microphysical perspective. We consider a two-dimensional east-west topographic cross section in New Mexico, representative of the Jemez mountains on the west and the Sangre de Cristo mountains on the east. Located between these two ranges is the Rio Grande Valley. In these two-dimensional experiments, variations in drop size distributions (DSDs) are considered to simulate total precipitation that closely duplicates observed precipitation.

  1. Prioritisation process for decommissioning of the Iraq former nuclear complex

    International Nuclear Information System (INIS)

    Jarjies, Adnan; Abbas, Mohammed; Fernandes, Horst M.; Coates, Roger

    2008-01-01

    There are a number of sites in Iraq which have been used for nuclear activities and which contain potentially significant amounts of radioactive waste. The principal nuclear site is Al-Tuwaitha, the former nuclear research centre. Many of these sites suffered substantial physical damage during the Gulf Wars and have been subjected to subsequent looting. All require decommissioning in order to ensure both radiological and non-radiological safety. However, it is not possible to undertake the decommissioning of all sites and facilities at the same time. Therefore, a prioritization methodology has been developed in order to aid the decision-making process. The methodology comprises three principal stages of assessment: 1) a quantitative surrogate risk assessment, 2) a range of sensitivity analyses and 3) the inclusion of qualitative modifying factors. A group of five Tuwaitha facilities presented the highest evaluated risk, followed by a middle ranking grouping of Tuwaitha facilities and some other sites, with a relatively large number of lower risk facilities and sites comprising a third group. This initial risk-based order of priority is changed when modifying factors are taken into account. It is necessary to take account of Iraq's isolation from the international nuclear community over the last two decades and the lack of experienced personnel. Therefore it is appropriate to initiate decommissioning operations on selected low risk facilities at Tuwaitha in order to build capacity/experience and prepare for work to be carried out in more complex and potentially high hazard facilities. In addition it is appropriate to initiate some prudent precautionary actions relating to some of the higher risk facilities. (author)

  2. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

Full Text Available Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, these complexities are inevitably enhanced in terms of their degrees or intensities because of the usage of the communication networks. Therefore, it would be interesting to (1) examine how this kind of network-enhanced complexity affects the control or filtering performance and (2) develop some suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances on performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, the developments of the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated by using a thorough literature review and some possible future research directions are highlighted.

  3. FLCNDEMF: An Event Metamodel for Flood Process Information Management under the Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Nengcheng Chen

    2015-06-01

Full Text Available Significant economic losses, large affected populations, and serious environmental damage caused by recurrent natural disaster events (NDE) worldwide indicate insufficiency in emergency preparedness and response. The barrier of full life cycle data preparation and information support is one of the main reasons. This paper adopts the method of integrated environmental modeling, incorporates information from existing event protocols, languages, and models, analyzes observation demands from different event stages, and forms the abstract full life cycle natural disaster event metamodel (FLCNDEM) based on the meta-object facility. Then a task library and knowledge base for floods are built to instantiate FLCNDEM, forming the FLCNDEM for floods (FLCNDEMF). FLCNDEMF is formalized according to the Event Pattern Markup Language, and a prototype system, Natural Disaster Event Manager, is developed to assist in template-based modeling and management. The flood in Liangzi (LZ) Lake of Hubei, China on 16 July 2010 is adopted to illustrate how to apply FLCNDEM in real scenarios. FLCNDEM-based modeling is realized, and candidate remote sensing (RS) datasets for different observing missions are provided for the LZ Lake flood. Taking the mission of flood area extraction as an example, the appropriate RS data are selected via the Simplified General Perturbations 4 (SGP4) model, and the flood areas in different phases are calculated and displayed on the map. The phase-based modeling and visualization intuitively display the spatial-temporal distribution and the evolution process of the LZ Lake flood, which is of great significance for flood response. In addition, through the extension mechanism, FLCNDEM can also be applied in other environmental applications, providing important support for full life cycle information sharing and rapid response.

  4. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contained a fence crossing event. This resulted in a reduction of 54.8% in the images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
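The background-subtraction idea in this record can be sketched as follows (a toy illustration, not the authors' program; the thresholds and the synthetic frames are invented):

```python
import numpy as np

# Compare each still camera-trap frame against a background frame and
# flag it when enough pixels change by more than a difference threshold.
def flag_image(frame, background, diff_thresh=30, frac_thresh=0.02):
    """Return True when the frame likely contains a crossing event."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    changed = (diff > diff_thresh).mean()   # fraction of changed pixels
    return changed > frac_thresh

rng = np.random.default_rng(0)
background = rng.integers(0, 40, size=(120, 160), dtype=np.uint8)

empty = background.copy()                   # no animal: identical frame
animal = background.copy()
animal[40:80, 60:100] = 200                 # bright blob crossing the scene

print(flag_image(empty, background), flag_image(animal, background))
```

A real pipeline would add the histogram rules the abstract mentions (e.g. rejecting global illumination shifts), but the pixel-change fraction already captures the core data-reduction step.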

  5. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

Highlights: • We present a description of EDICAM's operation principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent event-driven processing, able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and readout of smaller regions of interest can be done in the 1–140 kHz range even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link which ensures very fast communication and data transfer between the PC and the camera, enabling two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. With the first version of the firmware, this camera hardware successfully proved able to monitor the plasma in several fusion devices, for example at ASDEX Upgrade, KSTAR and COMPASS. A new firmware and software package is under development. It allows predefined events to be detected in real time, so the camera is capable of changing its own operation or of giving warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) at a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents the key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of the tests of the different software elements.

  6. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are
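
    Ascribing detections automatically to known industrial sources amounts to a nearest-source search with a distance tolerance. A minimal sketch of the idea follows; the coordinates and the 25 km tolerance are illustrative assumptions, not values from the report.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance on a spherical Earth (radius ~6371 km).
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2.0 * 6371.0 * math.asin(math.sqrt(a))

def associate(lat, lon, known_sources, max_km=25.0):
    """Return the name of the nearest known source within max_km, else None
    (the event then needs analyst attention)."""
    name, slat, slon = min(known_sources,
                           key=lambda s: haversine_km(lat, lon, s[1], s[2]))
    return name if haversine_km(lat, lon, slat, slon) <= max_km else None

# Hypothetical mining-area coordinates in Fennoscandia / NW Russia:
mines = [("Kiruna", 67.86, 20.23), ("Khibiny", 67.67, 33.73)]
assert associate(67.90, 20.30, mines) == "Kiruna"   # routine industrial event
assert associate(60.00, 10.00, mines) is None       # unknown source
```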

  7. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    International Nuclear Information System (INIS)

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-01

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  8. Process evaluation of complex interventions: Medical Research Council guidance

    OpenAIRE

    Moore, G.F.; Audrey, S.; Barker, M.; Bond, L.; Bonell, C.; Hardeman, W.; Moore, L.; O'Cathain, A.; Tinati, T.; Wight, D.; Baird, J.

    2015-01-01

    Attempts to tackle problems such as smoking and obesity increasingly use complex interventions. These are commonly defined as interventions that comprise multiple interacting components, although additional dimensions of complexity include the difficulty of their implementation and the number of organisational levels they target.1 Randomised controlled trials are regarded as the gold standard for establishing the effectiveness of interventions, when randomisation is feasible. However, effect ...

  9. From products to processes: Academic events to foster interdisciplinary and iterative dialogue in a changing climate

    Science.gov (United States)

    Addor, Nans; Ewen, Tracy; Johnson, Leigh; Çöltekin, Arzu; Derungs, Curdin; Muccione, Veruska

    2015-08-01

    In the context of climate change, both climate researchers and decision makers deal with uncertainties, but these uncertainties differ in fundamental ways. They stem from different sources, cover different temporal and spatial scales, might or might not be reducible or quantifiable, and are generally difficult to characterize and communicate. Hence, a mutual understanding between current and future climate researchers and decision makers must evolve for adaptation strategies and planning to progress. Iterative two-way dialogue can help to improve the decision making process by bridging current top-down and bottom-up approaches. One way to cultivate such interactions is by providing venues for these actors to interact and exchange on the uncertainties they face. We use a workshop-seminar series involving academic researchers, students, and decision makers as an opportunity to put this idea into practice and evaluate it. Seminars, case studies, and a round table allowed participants to reflect upon and experiment with uncertainties. An opinion survey conducted before and after the workshop-seminar series allowed us to qualitatively evaluate its influence on the participants. We find that the event stimulated new perspectives on research products and communication processes, and we suggest that similar events may ultimately contribute to the midterm goal of improving support for decision making in a changing climate. Therefore, we recommend integrating bridging events into university curriculum to foster interdisciplinary and iterative dialogue among researchers, decision makers, and students.

  10. APNEA list mode data acquisition and real-time event processing

    Energy Technology Data Exchange (ETDEWEB)

    Hogle, R.A.; Miller, P. [GE Corporate Research & Development Center, Schenectady, NY (United States); Bramblett, R.L. [Lockheed Martin Specialty Components, Largo, FL (United States)

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) a List Mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
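
    The real-time Event Accumulation step described above (counting timestamped events into short and long time bins following a trigger) can be sketched as follows; the bin edges and timestamps are invented, and this is not the APNEA code itself.

```python
import bisect

def bin_events(timestamps, triggers, edges):
    """Count list-mode events into time bins relative to each trigger.
    `edges` are bin boundaries in the same units as the timestamps
    (e.g. microseconds); events outside [edges[0], edges[-1]) are ignored."""
    counts = [0] * (len(edges) - 1)
    for trig in triggers:
        for t in timestamps:
            dt = t - trig
            if edges[0] <= dt < edges[-1]:
                counts[bisect.bisect_right(edges, dt) - 1] += 1
    return counts

# One trigger at t=0 and bins of 0-10, 10-100 and 100-1000 microseconds:
assert bin_events([3, 7, 50, 400, 2000], [0], [0, 10, 100, 1000]) == [2, 1, 1]
```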

  11. Climate Variability Reveals Complex Events for Tularemia Dynamics in Man and Mammals

    Directory of Open Access Journals (Sweden)

    Thomas R. Palo

    2005-06-01

    Full Text Available Tularemia is caused by the bacterium Francisella tularensis, but the natural reservoir is unknown and environmental conditions for outbreaks in mammals and man are poorly understood. The present study analyzed the synchrony between the North Atlantic Oscillation (NAO) index, the number of human cases of tularemia reported in Sweden, and the density of hares. Climate variation at a lag of 2 yr explained, as a single factor, ~27% of the variation in the number of tularemia cases over time. A low NAO index, indicating cold winters, and low water flow in rivers during the coming summer were associated with high numbers of human cases of tularemia 2 yr later. The number of mountain hares was not related to NAO or to the number of cases of tularemia. The change in mountain hare numbers was negatively associated with the number of human cases, showing the sensitivity of this species to the disease. Low turnover in water environments may at some point in time trigger a chain of events leading to increased replication of F. tularensis via unknown reservoirs and/or vectors that affect humans and mammals. A possible increase in the NAO index with a future warmer climate would not be expected to facilitate a higher frequency of tularemia outbreaks in Sweden.
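
    The figure that climate variation at a 2-yr lag explains ~27% of the variation corresponds to the squared Pearson correlation between the lagged index and the case counts. A minimal sketch, using made-up numbers rather than the study's data:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def lagged_r2(index, cases, lag=2):
    """Variance in case counts explained by the index `lag` years earlier."""
    return pearson(index[:-lag], cases[lag:]) ** 2

nao   = [-1.0, 0.5, -0.8, 1.2, -0.3, 0.9]   # invented NAO index values
cases = [10, 12, 30, 15, 28, 9]             # invented annual case counts
r2 = lagged_r2(nao, cases, lag=2)           # low NAO -> more cases 2 yr later
assert 0.9 < r2 <= 1.0
```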

  12. Complex Socio-Ecological Dynamics driven by extreme events in the Amazon

    Science.gov (United States)

    Pinho, P. F.

    2015-12-01

    Several years with extreme floods or droughts in the past decade have caused human suffering in remote communities of the Brazilian Amazon. Despite documented local knowledge and practices for coping with the high seasonal variability characteristic of the region's hydrology (e.g. a 10 m change in river levels between dry and flood seasons), and despite Civil Defense interventions by various levels of government, the more extreme years seem to have exceeded the communities' coping capacity. In this paper, we explore whether there is a real increase in variability, whether the communities perceive that recent extreme events are outside the experience which shapes their responses to 'normal' levels of variability, and what science-based policy could contribute to greater local resilience. Hydrological analyses suggest that variability is indeed increasing, in line with expectations from future climate change. However, current measures of hydrological regimes do not predict years with social hardship very well. Interviewees in two regions are able to express their strategies for dealing with 'normal' variability very well, but also identify ways in which abnormal years exceed their ability to cope. Current Civil Defense arrangements struggle to deliver emergency assistance in a sufficiently timely and locally appropriate fashion. Combining these insights in the context of social-ecological change, we suggest how better integration of science, policy and local knowledge could improve resilience to future trends, and identify some contributions science could make to such an arrangement.

  13. Web processing service for climate impact and extreme weather event analyses. Flyingpigeon (Version 1.0)

    Science.gov (United States)

    Hempelmann, Nils; Ehbrecht, Carsten; Alvarez-Castro, Carmen; Brockmann, Patrick; Falk, Wolfgang; Hoffmann, Jörg; Kindermann, Stephan; Koziol, Ben; Nangini, Cathy; Radanovics, Sabine; Vautard, Robert; Yiou, Pascal

    2018-01-01

    Analyses of extreme weather events and their impacts often require big data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the data files "at home" with their own analysis processes. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services such as data analysis processes that are accessible over the web, and can be installed close to the data archives. We developed a WPS named 'flyingpigeon' that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC), to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the processes currently available in flyingpigeon, both commonly-used ones (preprocessing steps, spatial subsets at continent, country or region level, and climate indices) and methods for specific climate data analysis (weather regimes, analogues of circulation, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.
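
    In the key-value-pair encoding defined by the OGC WPS 1.0.0 standard, a service call is simply an HTTP GET URL with standard parameters. A minimal sketch follows; the host and process identifier below are placeholders, not a real flyingpigeon deployment.

```python
from urllib.parse import urlencode

def wps_url(base, request, **params):
    """Build an OGC WPS 1.0.0 key-value-pair request URL."""
    query = {"service": "WPS", "version": "1.0.0", "request": request}
    query.update(params)
    return base + "?" + urlencode(query)

url = wps_url("https://example.org/wps", "DescribeProcess",
              identifier="subset_countries")
assert url.startswith("https://example.org/wps?service=WPS")
assert "request=DescribeProcess" in url
```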

  14. Geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin

    Science.gov (United States)

    Vibhava, F.; Graham, W. D.; Maxwell, R. M.

    2012-12-01

    Streamflow at any given location and time is representative of surface and subsurface contributions from various sources. The ability to fully identify the factors controlling these contributions is key to successfully understanding the transport of contaminants through the system. In this study we developed a fully integrated 3D surface water-groundwater-land surface model, PARFLOW, to evaluate geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin in North Central Florida. In addition to traditional model evaluation criterion, such as comparing field observations to model simulated streamflow and groundwater elevations, we quantitatively evaluated the model's predictions of surface-groundwater interactions over space and time using a suite of binary end-member mixing models that were developed using observed specific conductivity differences among surface and groundwater sources throughout the domain. Analysis of model predictions showed that geologic heterogeneity exerts a strong control on both streamflow generation processes and land atmospheric fluxes in this watershed. In the upper basin, where the karst aquifer is overlain by a thick confining layer, approximately 92% of streamflow is "young" event flow, produced by near stream rainfall. Throughout the upper basin the confining layer produces a persistent high surficial water table which results in high evapotranspiration, low groundwater recharge and thus negligible "inter-event" streamflow. In the lower basin, where the karst aquifer is unconfined, deeper water tables result in less evapotranspiration. Thus, over 80% of the streamflow is "old" subsurface flow produced by diffuse infiltration through the epikarst throughout the lower basin, and all surface contributions to streamflow originate in the upper confined basin. Climatic variability provides a secondary control on surface-subsurface and land-atmosphere fluxes, producing significant seasonal and
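
    A binary end-member mixing model of the kind used above reduces to a two-component mass balance on a conservative tracer such as specific conductivity. A minimal sketch; the end-member values are invented for illustration.

```python
def event_fraction(c_stream, c_event, c_pre):
    """Fraction of streamflow that is 'young' event water, from a
    two-component (binary end-member) mixing model:
        f = (C_stream - C_pre) / (C_event - C_pre)
    where C_event and C_pre are the tracer values of the event-water and
    pre-event (groundwater) end members."""
    if c_event == c_pre:
        raise ValueError("end members must differ")
    f = (c_stream - c_pre) / (c_event - c_pre)
    return min(max(f, 0.0), 1.0)   # clamp to the physical range [0, 1]

# Rainfall ~20 uS/cm, groundwater ~300 uS/cm, stream dominated by event water:
assert round(event_fraction(45.0, 20.0, 300.0), 3) == 0.911
```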

  15. Simulation and Analysis of Complex Biological Processes: an Organisation Modelling Perspective

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2005-01-01

    This paper explores how the dynamics of complex biological processes can be modelled and simulated as an organisation of multiple agents. This modelling perspective identifies organisational structure occurring in complex decentralised processes and handles complexity of the analysis of the dynamics

  16. Spatial and Semantic Processing between Audition and Vision: An Event-Related Potential Study

    Directory of Open Access Journals (Sweden)

    Xiaoxi Chen

    2011-10-01

    Full Text Available Using a crossmodal priming paradigm, this study investigated how the brain binds spatial and semantic features in multisensory processing. The visual stimuli (pictures of animals) were presented after the auditory stimuli (sounds of animals), and the stimuli from the different modalities could match spatially (or semantically) or not. Participants were required to detect the head orientation of the visual target (an oddball paradigm). The event-related potentials (ERPs) to the visual stimuli were enhanced by spatial attention (150–170 ms) irrespective of semantic information. The early crossmodal attention effect for the visual stimuli was more negative in the spatial-congruent condition than in the spatial-incongruent condition. By contrast, the later spatial ERP effects were significant only for the semantic-congruent condition (250–300 ms). These findings indicated that spatial attention modulated early visual processing, and that semantic and spatial features were simultaneously used to orient attention and modulate later processing stages.

  17. Are affective events richly recollected or simply familiar? The experience and process of recognizing feelings past.

    Science.gov (United States)

    Ochsner, K N

    2000-06-01

    The author used the remember/know paradigm and the dual process recognition model of A. P. Yonelinas, N. E. A. Kroll, I. Dobbins, M. Lazzara, and R. T. Knight (1998) to study the states of awareness accompanying recognition of affective images and the processes of recollection and familiarity that may underlie them. Results from all experiments showed that (a) negative stimuli tended to be remembered, whereas positive stimuli tended to be known; (b) recollection, but not familiarity, was boosted for negative or highly arousing and, to a lesser extent, positive stimuli; and (c) across experiments, variations in depth of encoding did not influence these patterns. These data suggest that greater recollection for affective events leads them to be more richly experienced in memory, and they are consistent with the idea that the states of remembering and knowing are experientially exclusive, whereas the processes underlying them are functionally independent.
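
    Under the independence assumption of the dual-process model, "remember" and "know" response rates are commonly converted into recollection and familiarity estimates (the IRK correction). A minimal sketch with invented rates; this is not necessarily the exact estimation procedure used in the paper.

```python
def irk_estimates(p_remember, p_know):
    """Independence remember/know estimates: recollection is the 'remember'
    rate, and familiarity is the 'know' rate rescaled to the trials on
    which recollection failed."""
    recollection = p_remember
    familiarity = p_know / (1.0 - p_remember)
    return recollection, familiarity

rec, fam = irk_estimates(0.40, 0.30)
assert rec == 0.40
assert round(fam, 2) == 0.50
```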

  18. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  19. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study

    Directory of Open Access Journals (Sweden)

    Jin Zhishuai

    2017-01-01

    Full Text Available Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  20. Complex reassortment events of unusual G9P[4] rotavirus strains in India between 2011 and 2013.

    Science.gov (United States)

    Doan, Yen Hai; Suzuki, Yoshiyuki; Fujii, Yoshiki; Haga, Kei; Fujimoto, Akira; Takai-Todaka, Reiko; Someya, Yuichi; Nayak, Mukti K; Mukherjee, Anupam; Imamura, Daisuke; Shinoda, Sumio; Chawla-Sarkar, Mamta; Katayama, Kazuhiko

    2017-10-01

    Rotavirus A (RVA) is the predominant etiological agent of acute gastroenteritis in young children worldwide. Recently, unusual G9P[4] rotavirus strains emerged with high prevalence in many countries. Such intergenogroup reassortant strains highlight the ongoing spread of unusual rotavirus strains throughout Asia. This study was undertaken to determine the whole genomes of eleven unusual G9P[4] strains detected in India during 2011-2013, and to compare them with other human and animal global RVAs to understand the exact origin of the unusual G9P[4] strains circulating in India and other countries worldwide. Of these 11 RVAs, four G9P[4] strains were double-reassortants with the G9-VP7 and E6-NSP4 genes on a DS-1-like genetic backbone (G9-P[4]-I2-R2-C2-M2-A2-N2-T2-E6-H2). The other strains showed a complex genetic constellation, likely derived from a triple reassortment event with the G9-VP7, N1-NSP2 and E6-NSP4 genes on a DS-1-like genetic backbone (G9-P[4]-I2-R2-C2-M2-A2-N1-T2-E6-H2). Presumably, these unusual G9P[4] strains were generated after several reassortment events between the contemporary co-circulating human rotavirus strains. Moreover, the point mutation S291L at the interaction site between the inner and outer capsid proteins of the VP6 gene may be important in the rapid spread of this unusual strain. The complex reassortment events within the G9P[4] strains may be related to the high prevalence of mixed infections in India, as reported in this study and other previous studies. Copyright © 2017. Published by Elsevier B.V.

  1. Relationship between early and late stages of information processing: an event-related potential study

    Directory of Open Access Journals (Sweden)

    Claudio Portella

    2012-11-01

    Full Text Available The brain is capable of elaborating and executing different stages of information processing. However, exactly how these stages are processed in the brain remains largely unknown. This study aimed to analyze the possible correlation between early and late stages of information processing by assessing the latency to, and amplitude of, early and late event-related potential (ERP) components, including P200, N200, premotor potential (PMP) and P300, in healthy participants in the context of a visual oddball paradigm. We found a moderate positive correlation among the latencies of P200 (electrode O2), N200 (electrode O2), PMP (electrode C3), P300 (electrode PZ) and the reaction time (RT). In addition, a moderate negative correlation between the amplitude of P200 and the latencies of N200 (electrode O2), PMP (electrode C3) and P300 (electrode PZ) was found. Therefore, we propose that if the secondary processing of visual input (P200 latency) occurs faster, the following will also happen sooner: the discrimination and classification of this input (N200 latency), motor response processing (PMP latency), reorganization of attention and working memory update (P300 latency), and the RT. N200, PMP, and P300 latencies are also anticipated when the activation level of occipital areas involved in the secondary processing of visual input rises (P200 amplitude).

  2. A hierarchy of event-related potential markers of auditory processing in disorders of consciousness

    Directory of Open Access Journals (Sweden)

    Steve Beukema

    2016-01-01

    Full Text Available Functional neuroimaging of covert perceptual and cognitive processes can inform the diagnoses and prognoses of patients with disorders of consciousness, such as the vegetative and minimally conscious states (VS; MCS). Here we report an event-related potential (ERP) paradigm for detecting a hierarchy of auditory processes in a group of healthy individuals and patients with disorders of consciousness. Simple cortical responses to sounds were observed in all 16 patients; 7/16 (44%) patients exhibited markers of the differential processing of speech and noise; and 1 patient produced evidence of the semantic processing of speech (i.e. the N400 effect). In several patients, the level of auditory processing evident from ERPs was higher than the abilities evident from behavioural assessment, indicating a greater sensitivity of ERPs in some cases. However, there were no differences in auditory processing between the VS and MCS patient groups, indicating a lack of diagnostic specificity for this paradigm. Reliably detecting semantic processing by means of the N400 effect in passively listening single subjects is a challenge. Multiple assessment methods are needed in order to fully characterise the abilities of patients with disorders of consciousness.

  3. Relationship between early and late stages of information processing: an event-related potential study

    Science.gov (United States)

    Portella, Claudio; Machado, Sergio; Arias-Carrión, Oscar; Sack, Alexander T.; Silva, Julio Guilherme; Orsini, Marco; Leite, Marco Antonio Araujo; Silva, Adriana Cardoso; Nardi, Antonio E.; Cagy, Mauricio; Piedade, Roberto; Ribeiro, Pedro

    2012-01-01

    The brain is capable of elaborating and executing different stages of information processing. However, exactly how these stages are processed in the brain remains largely unknown. This study aimed to analyze the possible correlation between early and late stages of information processing by assessing the latency to, and amplitude of, early and late event-related potential (ERP) components, including P200, N200, premotor potential (PMP) and P300, in healthy participants in the context of a visual oddball paradigm. We found a moderate positive correlation among the latencies of P200 (electrode O2), N200 (electrode O2), PMP (electrode C3), P300 (electrode PZ) and the reaction time (RT). In addition, a moderate negative correlation between the amplitude of P200 and the latencies of N200 (electrode O2), PMP (electrode C3) and P300 (electrode PZ) was found. Therefore, we propose that if the secondary processing of visual input (P200 latency) occurs faster, the following will also happen sooner: the discrimination and classification of this input (N200 latency), motor response processing (PMP latency), reorganization of attention and working memory update (P300 latency), and the RT. N200, PMP, and P300 latencies are also anticipated when the activation level of occipital areas involved in the secondary processing of visual input rises (P200 amplitude). PMID:23355929

  4. Zinc removal from wastewater by complexation-microfiltration process

    Directory of Open Access Journals (Sweden)

    Trivunac Katarina

    2012-01-01

    Full Text Available As a result of its wide industrial application, zinc has become an important contaminant in the aquatic environment: it is a toxic heavy metal, and some of its compounds, such as zinc arsenate and zinc cyanide, may be extremely hazardous. Therefore, there is a growing need for simple methods capable of separating and recovering trace zinc from environmental waters. Nowadays, ultra- and microfiltration methods for trace metal removal, in which water-soluble polymers are added to the aqueous solution, have become a significant research area. The choice of water-soluble macroligand remains important for developing this technology. Sodium carboxymethyl cellulose (Na-CMC) was selected as the complexing agent. The microfiltration experiments were carried out in a stirred dead-end cell. To separate the formed polymer-metal complex, Versapor membranes were used. The concentration of heavy metal ions in the aqueous solution after microfiltration was determined using atomic absorption spectroscopy (AAS). The effects of the amount of complexing agent, pH value, type of anion, ionic strength and operating pressure on the flux (J) and rejection coefficient (R) were investigated. Experimental results indicate a considerable influence of pH, ionic strength and anion type on the rejection coefficient, while the effect of the amount of complexing agent is relatively insignificant. The Na-CMC used in the research proved to be very effective, as supported by the high rejection coefficients obtained (99%).
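
    The quantities J and R follow from standard membrane-filtration definitions. A minimal sketch; the numbers are invented except for the ~99% rejection figure quoted in the abstract.

```python
def rejection(c_feed, c_permeate):
    """Rejection coefficient R = 1 - C_permeate / C_feed (as a fraction)."""
    return 1.0 - c_permeate / c_feed

def flux(volume_l, area_m2, time_h):
    """Permeate flux J in L m^-2 h^-1."""
    return volume_l / (area_m2 * time_h)

assert round(rejection(10.0, 0.1), 2) == 0.99        # a 99% rejection
assert abs(flux(0.5, 0.01, 2.0) - 25.0) < 1e-9       # 25 L m^-2 h^-1
```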

  5. Autopoietic Automata: Complexity Issues in Offspring-Producing Evolving Processes

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    2007-01-01

    Vol. 383, No. 2-3 (2007), pp. 260-269. ISSN 0304-3975. R&D Projects: GA AV ČR 1ET100300517. Institutional research plan: CEZ:AV0Z10300504. Keywords: autopoietic automata * self-reproducing automata * interactive Turing machine * simulation * computational complexity. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.735, year: 2007

  6. Phenylketonuria and Complex Spatial Visualization: An Analysis of Information Processing.

    Science.gov (United States)

    Brunner, Robert L.; And Others

    1987-01-01

    The study of the ability of 16 early treated phenylketonuric (PKU) patients (ages 6-23 years) to solve complex spatial problems suggested that choice of problem-solving strategy, attention span, and accuracy of mental representation may be affected in PKU patients, despite efforts to maintain well-controlled phenylalanine concentrations in the…

  7. Can Models Capture the Complexity of the Systems Engineering Process?

    Science.gov (United States)

    Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.

    Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003; Cullen, 2004), and many others have been late, well over budget, or have failed: the Hilton/Marriott/American Airlines system for hotel reservations and flights; 1988-1992; 125 million; "scrapped"

  8. Complex source mechanisms of mining-induced seismic events - implications for surface effects

    Science.gov (United States)

    Orlecka-Sikora, B.; Cesca, S.; Lasocki, S.; Rudzinski, L.; Lizurek, L.; Wiejacz, P.; Urban, P.; kozlowska, M.

    2012-04-01

    The seismicity of the Legnica-Głogów Copper District (LGCD) is induced by mining activities in three mines: Lubin, Rudna and Polkowice-Sieroszowice. Ground motion caused by strong tremors might affect local infrastructure; the "Żelazny Most" tailings pond, the biggest structure of this type in Europe, is of special concern here. To protect surface objects, Rudna Mine has been running ground motion monitoring for several years. From June 2010 to June 2011, an unusually strong and extensive surface impact was observed for 6 mining tremors induced in one of Rudna's mining sections. The observed peak ground accelerations (PGA) for both the horizontal and vertical components were at or even beyond the 99% confidence interval for prediction. The aim of this paper is to analyze the reasons for such unusual ground motion. On the basis of recordings from the Rudna Mine seismological network and records from the Polish Seismological Network operated by the Institute of Geophysics, Polish Academy of Sciences (IGF PAN), the source mechanisms of these 6 tremors were calculated using time-domain moment tensor inversion. Furthermore, a kinematic analysis of the seismic source was performed in order to determine the rupture plane orientations and rupture directions. These results showed that, in the case of the investigated tremors, the point source models and shear fault mechanisms most often assumed in mining seismology are invalid. All analyzed events indicate extended sources with non-shear mechanisms. The rupture planes have small dip angles, and the rupture starts at the tremor's hypocenter and propagates in the direction opposite to the plane dip. The tensional component also plays a big role here. These source mechanisms explain the observed strong ground motion well, and the calculated synthetic PGA values correlate well with the observed ones. The relationships between the mining tremors were also investigated. All subsequent tremors occurred in the area of increased stress due to

  9. Does ego development increase during midlife? The effects of openness and accommodative processing of difficult events.

    Science.gov (United States)

    Lilgendahl, Jennifer Pals; Helson, Ravenna; John, Oliver P

    2013-08-01

    Although Loevinger's model of ego development is a theory of personality growth, there are few studies that have examined age-related change in ego level over developmentally significant periods of adulthood. To address this gap in the literature, we examined mean-level change and individual differences in change in ego level over 18 years of midlife. In this longitudinal study, participants were 79 predominantly White, college-educated women who completed the Washington University Sentence Completion Test in early (age 43) and late (age 61) midlife as well as measures of the trait of Openness (ages 21, 43, 52, and 61) and accommodative processing (assessed from narratives of difficult life events at age 52). As hypothesized, the sample overall showed a mean-level increase in ego level from age 43 to age 61. Additionally, a regression analysis showed that both the trait of Openness at age 21 and accommodative processing of difficult events that occurred during (as opposed to prior to) midlife were each predictive of increasing ego level from age 43 to age 61. These findings counter prior claims that ego level remains stable during adulthood and contribute to our understanding of the underlying processes involved in personality growth in midlife. © 2012 Wiley Periodicals, Inc.

  10. The light-makeup advantage in facial processing: Evidence from event-related potentials

    OpenAIRE

    Tagai, Keiko; Shimakura, Hitomi; Isobe, Hiroko; Nittono, Hiroshi

    2017-01-01

    The effects of makeup on attractiveness have been evaluated using mainly subjective measures. In this study, event-related brain potentials (ERPs) were recorded from a total of 45 Japanese women (n = 23 and n = 22 for Experiment 1 and 2, respectively) to examine the neural processing of faces with no makeup, light makeup, and heavy makeup. To have the participants look at each face carefully, an identity judgement task was used: they were asked to judge whether the two faces presented in succ...

  11. Subliminal Emotional Words Impact Syntactic Processing: Evidence from Performance and Event-Related Brain Potentials

    Directory of Open Access Journals (Sweden)

    Laura Jiménez-Ortega

    2017-04-01

    Recent studies demonstrate that syntactic processing can be affected by emotional information and that subliminal emotional information can also affect cognitive processes. In this study, we explore whether unconscious emotional information may also impact syntactic processing. In an event-related brain potential (ERP) study, positive, neutral and negative subliminal adjectives were inserted within neutral sentences, just before the presentation of the supraliminal adjective. The sentences could either be correct (50%) or contain a morphosyntactic violation (number or gender disagreement). Larger error rates were observed for incorrect sentences than for correct ones, in contrast to most studies using supraliminal information. Strikingly, emotional adjectives affected the conscious syntactic processing of sentences containing morphosyntactic anomalies. The neutral condition elicited a left anterior negativity (LAN) followed by a P600 component. However, a lack of anterior negativity and an early P600 onset were found for the negative condition, probably as a result of the negative subliminal correct adjective capturing early syntactic resources. Positive masked adjectives in turn prompted an N400 component in response to morphosyntactic violations, probably reflecting the induction of a heuristic processing mode involving access to lexico-semantic information to solve agreement anomalies. Our results add to recent evidence on the impact of emotional information on syntactic processing, while showing that this can occur even when the reader is unaware of the emotional stimuli.

  12. Initiating events study of the first extraction cycle process in a model reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Renze; Zhang, Jian Gang; Zhuang, Dajie; Feng, Zong Yang [China Institute for Radiation Protection, Taiyuan (China)

    2016-06-15

    Definition and grouping of initiating events (IEs) are important foundations of probabilistic safety assessment (PSA). An IE in a spent fuel reprocessing plant (SFRP) is an event that may lead to the release of dangerous material and jeopardize workers, the public and the environment. The main difference between SFRPs and nuclear power plants (NPPs) is that hazardous materials are spread diffusely in a SFRP, and radioactive material is just one kind of hazardous material. Since research on IEs for NPPs is in-depth around the world, there are several general methods to identify IEs: reference to existing lists, review of experience feedback, qualitative analysis, and deductive analysis. Failure mode and effect analysis (FMEA) is an important qualitative analysis method, while the master logic diagram (MLD) method is a deductive analysis method. IE identification in SFRPs should draw on the experience of NPPs, but the differences between SFRPs and NPPs must be considered seriously. The plutonium uranium reduction extraction (Purex) process is adopted in a model reprocessing plant, and the first extraction cycle (FEC) is the pivotal process within it. Whether the FEC functions safely and steadily directly influences the production process, and thus the product quality, of the whole plant. Important facilities of the FEC are installed in equipment cells (ECs). In this work, IEs in the FEC process were identified and categorized by the two methods, FMEA and MLD, based on the fact that the ECs serve as containments in the plant. The results show that only two ECs in the FEC do not require particular attention to safety problems, and that criticality, fire and red oil explosion are the IEs that should be analyzed with emphasis. The results are consistent with the references.

  13. Commentary: Competency restoration research--complicating an already complex process.

    Science.gov (United States)

    Rotter, Merrill; Greenspan, Michael

    2011-01-01

    Predicting restorability in individuals found not competent to stand trial is an enduring focus of interest among forensic clinicians and academicians. In our commentary, we suggest that to understand this area even more comprehensively, we must look further. We must build on existing research on fitness to stand trial, move beyond diagnosis and a binary competence variable, and include the complex interplay between symptoms and fitness-related capacities that may be associated with lack of adjudicative competence and challenges to restorability.

  14. Processing of Complex Auditory Patterns in Musicians and Nonmusicians

    OpenAIRE

    Boh, Bastiaan; Herholz, Sibylle C.; Lappe, Claudia; Pantev, Christo

    2011-01-01

    In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) w...

  15. Exploiting global information in complex network repair processes

    Institute of Scientific and Technical Information of China (English)

    Tianyu WANG; Jun ZHANG; Sebastian WANDELT

    2017-01-01

    Robustness of complex networks has been studied for decades, with a particular focus on network attack. Research on network repair, on the other hand, has been conducted only very recently, given its even higher complexity and the absence of an effective evaluation metric. A recently proposed network repair strategy is self-healing, which aims to repair networks for larger components at a low cost using only local information. In this paper, we discuss the effectiveness and efficiency of self-healing, which casts network repair as a multi-objective optimization problem and makes its optimality difficult to measure. This leads us to a new network repair evaluation metric. Since the time complexity of its computation is very high, we devise a greedy ranking strategy. Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy. Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.
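    A greedy ranking repair of the general kind described can be sketched as follows. This is an illustrative assumption, not the paper's actual algorithm or cost model: at each step, restore the broken edge that most enlarges the largest connected component.

    ```python
    # Hypothetical greedy edge-restoration sketch (not the paper's method):
    # repeatedly re-add the broken edge giving the biggest gain in the size
    # of the largest connected component.

    def largest_component(nodes, edges):
        """Size of the largest connected component (iterative DFS)."""
        adj = {n: set() for n in nodes}
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        seen, best = set(), 0
        for n in nodes:
            if n in seen:
                continue
            stack, size = [n], 0
            seen.add(n)
            while stack:
                x = stack.pop()
                size += 1
                for y in adj[x]:
                    if y not in seen:
                        seen.add(y)
                        stack.append(y)
            best = max(best, size)
        return best

    def greedy_repair(nodes, intact_edges, broken_edges, budget):
        """Greedily pick up to `budget` broken edges to restore."""
        current = list(intact_edges)
        candidates = list(broken_edges)
        repaired = []
        for _ in range(budget):
            if not candidates:
                break
            base = largest_component(nodes, current)
            gains = [largest_component(nodes, current + [e]) - base
                     for e in candidates]
            best = max(range(len(candidates)), key=gains.__getitem__)
            repaired.append(candidates.pop(best))
            current.append(repaired[-1])
        return repaired

    nodes = [1, 2, 3, 4, 5, 6]
    intact = [(1, 2), (3, 4), (5, 6)]
    broken = [(2, 3), (4, 5), (1, 6)]
    print(greedy_repair(nodes, intact, broken, budget=2))
    ```

    Each greedy step costs one full component computation per candidate edge, which is exactly the kind of expense that motivates a cheaper ranking heuristic.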

  16. The Development of Narrative Productivity, Syntactic Complexity, Referential Cohesion and Event Content in Four- to Eight-Year-Old Finnish Children

    Science.gov (United States)

    Mäkinen, Leena; Loukusa, Soile; Nieminen, Lea; Leinonen, Eeva; Kunnari, Sari

    2014-01-01

    This study focuses on the development of narrative structure and the relationship between narrative productivity and event content. A total of 172 Finnish children aged between four and eight participated. Their picture-elicited narrations were analysed for productivity, syntactic complexity, referential cohesion and event content. Each measure…

  17. Automating the Simulation of SME Processes through a Discrete Event Parametric Model

    Directory of Open Access Journals (Sweden)

    Francesco Aggogeri

    2015-02-01

    At the factory level, the manufacturing system can be described as a group of processes governed by complex weaves of engineering strategies and technologies. Decision-making processes involve a lot of information, driven by managerial strategies, technological implications and layout constraints. Many factors affect decisions, and their combination must be carefully managed to determine the best solutions for optimizing performance. Advanced simulation tools could support the decision processes of many SMEs in this way, but the accessibility of these tools is limited by knowledge, cost, data availability and development time, and they should be used to support strategic decisions rather than specific situations. In this paper, a novel approach is proposed that aims to facilitate the simulation of manufacturing processes through fast modelling and evaluation. The idea is to realize a model that can automatically adapt itself to the user's specific needs. The model must be characterized by a high degree of flexibility, configurability and adaptability in order to automatically simulate multiple, heterogeneous industrial scenarios. In this way, even an SME can easily access a complex tool, perform thorough analyses and be supported in making strategic decisions. The parametric DES model is part of a greater software platform developed during the EU-funded COPERNICO project.

  18. Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.

    2006-01-01

    Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.
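    The DES style contrasted here with time-stepped execution can be illustrated with a minimal CPU-side event loop for a toy random-walk diffusion. This is an assumed sketch, not the paper's GPGPU implementation: events carry timestamps and are processed in time order from a priority queue, instead of advancing all state by a fixed time step.

    ```python
    # Minimal discrete event simulation loop (CPU sketch, illustrative only):
    # each walker's next hop is an event scheduled after an exponentially
    # distributed waiting time; events are consumed in timestamp order.

    import heapq
    import random

    def simulate(n_walkers=3, t_end=10.0, seed=42):
        rng = random.Random(seed)
        positions = [0] * n_walkers                     # walkers start at 0
        events = [(rng.expovariate(1.0), i) for i in range(n_walkers)]
        heapq.heapify(events)                           # pending-event queue
        processed = 0
        while events:
            t, walker = heapq.heappop(events)
            if t > t_end:                               # horizon reached
                break
            positions[walker] += rng.choice((-1, 1))    # unbiased hop
            processed += 1
            # schedule this walker's next hop
            heapq.heappush(events, (t + rng.expovariate(1.0), walker))
        return positions, processed

    positions, n_events = simulate()
    print(n_events, positions)
    ```

    A time-stepped version would instead loop over fixed increments of simulated time and touch every walker each step, which is what maps most naturally onto a GPGPU's stream-based interface.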

  19. Sexual Abuse Exposure Alters Early Processing of Emotional Words: Evidence from Event-Related Potentials

    Science.gov (United States)

    Grégoire, Laurent; Caparos, Serge; Leblanc, Carole-Anne; Brisson, Benoit; Blanchette, Isabelle

    2018-01-01

    This study aimed to compare the time course of emotional information processing between trauma-exposed and control participants, using electrophysiological measures. We conceived an emotional Stroop task with two types of words: trauma-related emotional words and neutral words. We assessed the evoked cerebral responses of sexual abuse victims without post-traumatic stress disorder (PTSD) and no abuse participants. We focused particularly on an early wave (C1/P1), the N2pc, and the P3b. Our main result indicated an early effect (55–165 ms) of emotionality, which varied between non-exposed participants and sexual abuse victims. This suggests that potentially traumatic experiences modulate early processing of emotional information. Our findings showing neurobiological alterations in sexual abuse victims (without PTSD) suggest that exposure to highly emotional events has an important impact on neurocognitive function even in the absence of psychopathology. PMID:29379428

  20. Sexual Abuse Exposure Alters Early Processing of Emotional Words: Evidence from Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Laurent Grégoire

    2018-01-01

    This study aimed to compare the time course of emotional information processing between trauma-exposed and control participants, using electrophysiological measures. We conceived an emotional Stroop task with two types of words: trauma-related emotional words and neutral words. We assessed the evoked cerebral responses of sexual abuse victims without post-traumatic stress disorder (PTSD) and no-abuse participants. We focused particularly on an early wave (C1/P1), the N2pc, and the P3b. Our main result indicated an early effect (55–165 ms) of emotionality, which varied between non-exposed participants and sexual abuse victims. This suggests that potentially traumatic experiences modulate early processing of emotional information. Our findings showing neurobiological alterations in sexual abuse victims (without PTSD) suggest that exposure to highly emotional events has an important impact on neurocognitive function even in the absence of psychopathology.

  1. Comparing the temporal dynamics of thematic and taxonomic processing using event-related potentials.

    Directory of Open Access Journals (Sweden)

    Olivera Savic

    We report the results of a study comparing the temporal dynamics of thematic and taxonomic knowledge activation in a picture-word priming paradigm using event-related potentials. Although we found no behavioral differences between thematic and taxonomic processing, ERP data revealed distinct patterns of N400 and P600 amplitude modulation for thematic and taxonomic priming. Thematically related target stimuli elicited less negativity than taxonomic targets between 280-460 ms after stimulus onset, suggesting easier semantic processing of thematic than taxonomic relationships. Moreover, P600 mean amplitude was significantly increased for taxonomic targets between 520-600 ms, consistent with a greater need for stimulus reevaluation in that condition. These results offer novel evidence in favor of a dissociation between thematic and taxonomic thinking in the early phases of conceptual evaluation.

  2. Comparing the temporal dynamics of thematic and taxonomic processing using event-related potentials.

    Science.gov (United States)

    Savic, Olivera; Savic, Andrej M; Kovic, Vanja

    2017-01-01

    We report the results of a study comparing the temporal dynamics of thematic and taxonomic knowledge activation in a picture-word priming paradigm using event-related potentials. Although we found no behavioral differences between thematic and taxonomic processing, ERP data revealed distinct patterns of N400 and P600 amplitude modulation for thematic and taxonomic priming. Thematically related target stimuli elicited less negativity than taxonomic targets between 280-460 ms after stimulus onset, suggesting easier semantic processing of thematic than taxonomic relationships. Moreover, P600 mean amplitude was significantly increased for taxonomic targets between 520-600 ms, consistent with a greater need for stimulus reevaluation in that condition. These results offer novel evidence in favor of a dissociation between thematic and taxonomic thinking in the early phases of conceptual evaluation.

  3. Upwelling events, coastal offshore exchange, links to biogeochemical processes - Highlights from the Baltic Sea Science Congress

    Directory of Open Access Journals (Sweden)

    Bogdan Ołdakowski

    2008-03-01

    The Baltic Sea Science Congress was held at Rostock University, Germany, from 19 to 22 March 2007. In the session entitled "Upwelling events, coastal offshore exchange, links to biogeochemical processes", 20 presentations were given, including 7 talks and 13 posters related to the theme of the session. This paper summarises new findings of the upwelling-related studies reported in the session. It deals with investigations based on the use of in situ and remote sensing measurements as well as numerical modelling tools. The biogeochemical implications of upwelling are also discussed. Our knowledge of the fine structure and dynamics of upwelling has increased in recent decades with the advent of high-resolution modern measurement techniques and modelling studies. The forcing and the overall structure, duration and intensity of upwelling events are understood quite well. However, the quantification of related transports and of the contribution of upwelling to overall mixing requires further research. Furthermore, our knowledge of the links between upwelling and biogeochemical processes is still incomplete. Numerical modelling has advanced to the extent that horizontal resolutions of c. 0.5 nautical miles can now be applied, which allows the complete spectrum of meso-scale features to be described. Even the development of filaments can be described realistically in comparison with high-resolution satellite data. But the effect of upwelling at the basin scale, and possible changes under changing climatic conditions, remain open questions.

  4. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    Science.gov (United States)

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Age-related differences in event-related potentials for early visual processing of emotional faces.

    Science.gov (United States)

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

    With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  6. Thinking back about a positive event: The impact of processing style on positive affect

    Directory of Open Access Journals (Sweden)

    Sabine eNelis

    2015-03-01

    The manner in which individuals recall an autobiographical positive life event has affective consequences. Two studies addressed the processing styles adopted during positive memory recall in a non-clinical sample. Participants retrieved a positive memory that was either self-generated (Study 1, n = 70) or experimenter-chosen (i.e., academic achievement; Study 2, n = 159), followed by the induction of one of three processing styles (between subjects): Study 1 compared a 'concrete/imagery' vs. an 'abstract/verbal' processing style, while Study 2 compared 'concrete/imagery', 'abstract/verbal', and 'comparative/verbal' processing styles. Processing a personal memory in a concrete/imagery-based way led to a larger increase in positive affect than abstract/verbal processing in Study 1, as well as than comparative/verbal thinking in Study 2. The results of Study 2 further suggest that it is making unfavourable verbal comparisons that may hinder the affective benefits of positive memories (rather than general abstract/verbal processing per se). The comparative/verbal thinking style failed to improve positive affect, and with increasing levels of depressive symptoms it had a more negative impact on change in positive affect. We found no evidence that participants' tendency to have dampening thoughts in response to positive affect in daily life contributed to the affective impact of positive memory recall. The results support the potential of current trainings for boosting positive memories and mental imagery, and underline the search for parameters that determine the at-times deleterious outcomes of abstract/verbal memory processing in the face of positive information.

  7. Integrating complex business processes for knowledge-driven clinical decision support systems.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  8. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
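    The combination described above can be sketched as a hybrid loop in which a continuous (system dynamics) variable is integrated between discrete task-completion events. All names, rates and the fatigue model below are hypothetical illustrations, not the paper's actual model:

    ```python
    # Hybrid sketch (assumed, illustrative): workforce productivity evolves
    # continuously via Euler integration (the system dynamics piece), while
    # task completions fire as discrete events whenever accumulated work
    # crosses a whole task (the discrete event piece).

    def hybrid_run(tasks=100, p0=10.0, floor=5.0, fatigue=0.02, dt=0.1):
        t, done, work, productivity = 0.0, 0, 0.0, p0
        completion_times = []
        while done < tasks and t < 1000.0:
            # continuous piece: productivity decays toward a fatigue floor
            productivity += -fatigue * (productivity - floor) * dt
            work += productivity * dt
            t += dt
            # discrete piece: emit a completion event per whole task done
            while work >= 1.0 and done < tasks:
                work -= 1.0
                done += 1
                completion_times.append(round(t, 1))
        return t, completion_times

    t_final, events = hybrid_run()
    print(f"{len(events)} tasks finished by t = {t_final:.1f}")
    ```

    The key design point is that the continuous state feeds the event-generation logic each step, so a policy change that slows productivity growth shows up directly as a stretched schedule of discrete completions.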

  9. Anticancer Activity of Metal Complexes: Involvement of Redox Processes

    Science.gov (United States)

    Jungwirth, Ute; Kowol, Christian R.; Keppler, Bernhard K.; Hartinger, Christian G.; Berger, Walter; Heffeter, Petra

    2012-01-01

    Cells require tight regulation of the intracellular redox balance and consequently of reactive oxygen species for proper redox signaling and maintenance of metal (e.g., of iron and copper) homeostasis. In several diseases, including cancer, this balance is disturbed. Therefore, anticancer drugs targeting the redox systems, for example, glutathione and thioredoxin, have entered focus of interest. Anticancer metal complexes (platinum, gold, arsenic, ruthenium, rhodium, copper, vanadium, cobalt, manganese, gadolinium, and molybdenum) have been shown to strongly interact with or even disturb cellular redox homeostasis. In this context, especially the hypothesis of “activation by reduction” as well as the “hard and soft acids and bases” theory with respect to coordination of metal ions to cellular ligands represent important concepts to understand the molecular modes of action of anticancer metal drugs. The aim of this review is to highlight specific interactions of metal-based anticancer drugs with the cellular redox homeostasis and to explain this behavior by considering chemical properties of the respective anticancer metal complexes currently either in (pre)clinical development or in daily clinical routine in oncology. PMID:21275772

  10. Identification of Hazardous Events for Drinking Water Production Process Using Managed Aquifer Recharge in the Nakdong River Delta, Korea

    International Nuclear Information System (INIS)

    Sang-Il, L.; Ji, H.W.

    2016-01-01

    Various hazardous events can cause chemical, microbial or physical hazards in a water supply system. The World Health Organization (WHO) and some countries have introduced hazardous event analysis to identify potential events that may be harmful to the safety of drinking water. This study extends the application of hazardous event analysis to drinking water production using managed aquifer recharge (MAR). MAR is a way of using an aquifer to secure water resources by storing freshwater for future use and pumping it whenever necessary. The entire drinking water production process, from the catchment area to the consumer, is subjected to the analysis. The hazardous event analysis incorporates site-specific data as well as common issues occurring in the process of drinking water production. The hazardous events are classified by their chemical, microbial or physical characteristics; likelihood and severity values are assigned, and multiplying them yields a quantitative risk. The study site is located in a coastal area in the delta of the Nakdong River, South Korea. The site has suffered from salt water intrusion and from surface water pollution arriving from upstream. Nine major hazardous events were identified out of a total of 114 events across 10 drinking water production processes. These major hazardous events will provide useful information on what needs to be done to secure the quality of water produced by a new water supply method. (author)
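    The quantitative risk described (likelihood multiplied by severity) can be sketched as follows. The 1-5 scales and the example events are assumptions for illustration, not the study's actual scales or data:

    ```python
    # Risk scoring sketch: risk = likelihood x severity on assumed 1-5 scales.
    # Event names and scores below are hypothetical, not the study's data.

    def risk_score(likelihood: int, severity: int) -> int:
        if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
            raise ValueError("likelihood and severity must be on a 1-5 scale")
        return likelihood * severity

    events = {
        "salt water intrusion":     (4, 4),
        "upstream chemical spill":  (2, 5),
        "pump failure":             (3, 2),
    }
    ranked = sorted(events, key=lambda e: risk_score(*events[e]), reverse=True)
    print(ranked[0])  # salt water intrusion (score 16)
    ```

    Ranking events by this product is what lets an analysis like the one above reduce 114 candidate events to a short list of major hazards.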

  11. Safety case for the disposal of spent nuclear fuel at Olkiluoto. Features, events and processes 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-12-15

    Features, Events and Processes sits within Posiva Oy's Safety Case 'TURVA-2012' portfolio and has the objective of presenting the main features, events and processes (FEPs) that are considered to be potentially significant for the long-term safety of the planned KBS-3V repository for spent nuclear fuel at Olkiluoto. The primary purpose of this report is to support Performance Assessment, Formulation of Radionuclide Release Scenarios, Assessment of the Radionuclide Release Scenarios for the Repository System and Biosphere Assessment by ensuring that the scenarios are comprehensive and take account of all significant FEPs. The main FEPs potentially affecting the disposal system are described for each relevant subsystem component or barrier (i.e. the spent nuclear fuel, the canister, the buffer and tunnel backfill, the auxiliary components, the geosphere and the surface environment). In addition, a small number of external FEPs that may potentially influence the evolution of the disposal system are described. The conceptual understanding and operation of each FEP is described, together with the main features (variables) of the disposal system that may affect its occurrence or significance. Olkiluoto-specific issues are considered when relevant. The main uncertainties (conceptual and parameter/data) associated with each FEP that may affect understanding are also documented. Indicative parameter values are provided, in some cases, to illustrate the magnitude or rate of a process, but it is not the intention of this report to provide the complete set of numerical values that are used in the quantitative safety assessment calculations. Many of the FEPs are interdependent and, therefore, the descriptions also identify the most important direct couplings between the FEPs. This information is used in the formulation of scenarios to ensure the conceptual models and calculational cases are both comprehensive and representative. (orig.)

  12. Safety case for the disposal of spent nuclear fuel at Olkiluoto. Features, events and processes 2012

    International Nuclear Information System (INIS)

    2012-12-01

    Features, Events and Processes sits within Posiva Oy's Safety Case 'TURVA-2012' portfolio and has the objective of presenting the main features, events and processes (FEPs) that are considered to be potentially significant for the long-term safety of the planned KBS-3V repository for spent nuclear fuel at Olkiluoto. The primary purpose of this report is to support Performance Assessment, Formulation of Radionuclide Release Scenarios, Assessment of the Radionuclide Release Scenarios for the Repository System and Biosphere Assessment by ensuring that the scenarios are comprehensive and take account of all significant FEPs. The main FEPs potentially affecting the disposal system are described for each relevant subsystem component or barrier (i.e. the spent nuclear fuel, the canister, the buffer and tunnel backfill, the auxiliary components, the geosphere and the surface environment). In addition, a small number of external FEPs that may potentially influence the evolution of the disposal system are described. The conceptual understanding and operation of each FEP is described, together with the main features (variables) of the disposal system that may affect its occurrence or significance. Olkiluoto-specific issues are considered when relevant. The main uncertainties (conceptual and parameter/data) associated with each FEP that may affect understanding are also documented. Indicative parameter values are provided, in some cases, to illustrate the magnitude or rate of a process, but it is not the intention of this report to provide the complete set of numerical values that are used in the quantitative safety assessment calculations. Many of the FEPs are interdependent and, therefore, the descriptions also identify the most important direct couplings between the FEPs. This information is used in the formulation of scenarios to ensure the conceptual models and calculational cases are both comprehensive and representative. (orig.)

  14. Analyzing the evolution of beta-endorphin post-translational processing events: studies on reptiles.

    Science.gov (United States)

    Shoureshi, Pezhman; Baron, Andrea; Szynskie, Laura; Dores, Robert M

    2007-01-01

In many cartilaginous fishes, most ray-finned fishes, lungfishes, and amphibians, the post-translational processing of POMC includes the monobasic cleavage of beta-endorphin to yield an opioid that is eight to ten amino acids in length. The amino acid motif within the beta-endorphin sequence required for a monobasic cleavage event is -E-R-(S/G)-Q-. Mammals and birds lack this motif and, as a result, beta-endorphin(1-8) is not an end-product in either group. Since both mammals and birds were derived from ancestors with reptilian origins, an analysis of beta-endorphin sequences from extant groups of reptiles should provide insights into the manner in which beta-endorphin post-translational processing mechanisms have evolved in amniotes. To this end a POMC cDNA was cloned from the pituitary of the turtle, Chrysemys scripta. The beta-endorphin sequence in this species was compared to other reptile beta-endorphin sequences (i.e., Chinese soft-shell turtle and gecko) and to known bird and mammal sequences. This analysis indicated that either the loss of the arginine residue at the cleavage site (the two turtle species, chick, and human) or a substitution at the glutamine position in the consensus sequence (gecko and ostrich) would account for the loss of the monobasic cleavage reaction in that species. Since amphibians are capable of performing the beta-endorphin monobasic cleavage reaction, it would appear that the amino acid substitutions that eliminated this post-translational processing event in reptilian-related tetrapods must have occurred in the ancestral amniotes.

  15. Complex life forms may arise from electrical processes

    Directory of Open Access Journals (Sweden)

    Elson Edward C

    2010-06-01

There is still no appealing and testable model to explain how single-celled organisms, usually following fusion of male and female gametes, proceed to grow and evolve into multi-cellular, complexly differentiated systems, with a particular species following a virtually invariant and unique growth pattern. An intrinsic electrical oscillator, resembling the cardiac pacemaker, may explain the process. Highly auto-correlated, it could operate independently of ordinary thermodynamic processes, which mandate increasing disorder, and could coordinate the growth and differentiation of organ anlage.

  16. Methods of Complex Data Processing from Technical Means of Monitoring

    Directory of Open Access Journals (Sweden)

    Serhii Tymchuk

    2017-03-01

The problem of processing information from different types of monitoring equipment is examined. As a possible solution, the paper proposes generalized information-processing methods based on clustering of territorially combined monitoring information sources and on a frame-based knowledge-base model for identifying monitored objects. The clustering methods are built on the Lance-Williams hierarchical agglomerative procedure with the Ward metric. The frame model of the knowledge base is built using object-oriented modeling tools.
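The Lance-Williams procedure named in the abstract updates inter-cluster distances recursively after each merge; with the Ward coefficients it minimizes the increase in within-cluster variance. A didactic sketch (all names are ours, and real monitoring data would of course be preprocessed first):

```python
def ward_agglomerative(points):
    """Hierarchical agglomerative clustering with the Ward criterion,
    using the Lance-Williams distance update (didactic sketch)."""
    clusters = {i: 1 for i in range(len(points))}  # cluster id -> size
    # squared Euclidean distances between all singleton clusters
    d = {}
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d[(i, j)] = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
    merges, next_id = [], len(points)
    while len(clusters) > 1:
        (i, j), dij = min(d.items(), key=lambda kv: kv[1])  # closest pair
        ni, nj = clusters.pop(i), clusters.pop(j)
        # Lance-Williams recurrence with Ward coefficients
        for k in list(clusters):
            nk = clusters[k]
            dik = d.pop((min(i, k), max(i, k)))
            djk = d.pop((min(j, k), max(j, k)))
            n = ni + nj + nk
            d[(k, next_id)] = ((ni + nk) * dik + (nj + nk) * djk - nk * dij) / n
        # drop the stale entry for the merged pair itself
        d = {pair: v for pair, v in d.items() if i not in pair and j not in pair}
        merges.append((i, j, dij))
        clusters[next_id] = ni + nj
        next_id += 1
    return merges

# two nearby points merge first, then the distant one joins them
merges = ward_agglomerative([(0, 0), (0, 1), (10, 0)])
```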

  17. Mastication accelerates Go/No-go decisional processing: An event-related potential study.

    Science.gov (United States)

    Sakamoto, Kiwako; Nakata, Hiroki; Yumoto, Masato; Sadato, Norihiro; Kakigi, Ryusuke

    2015-11-01

The purpose of the present study was to investigate the effect of mastication on Go/No-go decisional processing using event-related potentials (ERPs). Thirteen normal subjects underwent seven sessions of a somatosensory Go/No-go paradigm lasting approximately 4 min each: Pre, and Post 1, 2, 3, 4, 5, and 6. The Control condition included the same seven sessions. The reaction time (RT) and its standard deviation were recorded, and the peak amplitude and latency of the N140 and P300 components were analyzed. The RT was significantly shorter in Mastication than in Control at Post 1-3 and 4-6. The peak latency of N140 was earlier in Mastication than in Control at Post 4-6. The latency of N140 was shortened by repeated sessions in Mastication, but not by those in Control. The peak latency of P300 was significantly shorter in Mastication than in Control at Post 4-6. The peak latency of P300 was significantly longer in Control with repeated sessions, but not in Mastication. These results suggest that mastication may influence response execution processing in Go trials, as well as response inhibition processing in No-go trials. Mastication accelerated Go/No-go decisional processing in the human brain. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  18. False memory and level of processing effect: an event-related potential study.

    Science.gov (United States)

    Beato, Maria Soledad; Boldini, Angela; Cadavid, Sara

    2012-09-12

Event-related potentials (ERPs) were used to determine the effects of level of processing on true and false memory, using the Deese-Roediger-McDermott (DRM) paradigm. In the DRM paradigm, lists of words highly associated with a single nonpresented word (the 'critical lure') are studied and, in a subsequent memory test, critical lures are often falsely remembered. Lists with three critical lures per list were presented auditorily to participants, who studied them with either a shallow (saying whether the word contained the letter 'o') or a deep (creating a mental image of the word) processing task. The visual presentation modality was used in the final recognition test. True recognition of studied words was significantly higher after deep encoding, whereas false recognition of nonpresented critical lures was similar in both experimental groups. At the ERP level, true and false recognition showed similar patterns: no FN400 effect was found, whereas comparable left parietal and late right frontal old/new effects were found for true and false recognition in both experimental conditions. Items studied under shallow encoding conditions elicited more positive ERPs than items studied under deep encoding conditions in the 1000-1500 ms interval. These ERP results suggest that true and false recognition share some common underlying processes. Differential effects of level of processing on true and false memory were found only at the behavioral level, not at the ERP level.

  19. Sex differences in humor processing: An event-related potential study.

    Science.gov (United States)

    Chang, Yi-Tzu; Ku, Li-Chuan; Chen, Hsueh-Chih

    2018-02-01

    Numerous behavioral studies and a handful of functional neuroimaging studies have reported sex differences in humor. However, no study to date has examined differences in the time-course of brain activity during multistage humor processing between the sexes. The purpose of this study was to compare real-time dynamics related to humor processing between women and men, with reference to a proposed three-stage model (involving incongruity detection, incongruity resolution, and elaboration stages). Forty undergraduate students (20 women) underwent event-related potential recording while subjectively rating 30 question-answer-type jokes and 30 question-answer-type statements in a random order. Sex differences were revealed by analyses of the mean amplitudes of difference waves during a specific time window between 1000 and 1300 ms poststimulus onset (P1000-1300). This indicates that women recruited more mental resources to integrate cognitive and emotional components at this late stage. In contrast, men recruited more automated processes during the transition from the cognitive operations of the incongruity resolution stage to the emotional response of the humor elaboration stage. Our results suggest that sex differences in humor processing lie in differences in the integration of cognitive and emotional components, which are closely linked and interact reciprocally, particularly in women. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design processes, traditional approaches to process design may no longer suffice. The design literature offers quite a few design process models. As

  1. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  2. Epidemic Processes on Complex Networks : Modelling, Simulation and Algorithms

    NARCIS (Netherlands)

    Van de Bovenkamp, R.

    2015-01-01

Local interactions on a graph will lead to global dynamic behaviour. In this thesis we focus on two types of dynamic processes on graphs: the Susceptible-Infected-Susceptible (SIS) virus spreading model, and gossip-style epidemic algorithms. The largest part of this thesis is devoted to the SIS
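The SIS model named above is easy to sketch: each infected node recovers with some probability per step, and each susceptible node can be infected by each infected neighbour. A minimal discrete-time variant with hypothetical parameters (the thesis itself analyses the continuous-time Markov formulation):

```python
import random

def sis_step(adj, infected, beta, delta, rng):
    """One discrete time step of the SIS model on a graph.
    adj: dict node -> list of neighbours; infected: set of infected nodes.
    beta: per-contact infection probability; delta: curing probability."""
    new_infected = set()
    for node in adj:
        if node in infected:
            if rng.random() >= delta:          # node fails to recover
                new_infected.add(node)
        else:
            # each infected neighbour independently transmits with prob. beta
            for nb in adj[node]:
                if nb in infected and rng.random() < beta:
                    new_infected.add(node)
                    break
    return new_infected

# usage: a 5-node ring graph seeded with one infected node
adj = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
rng = random.Random(1)
state = {0}
for _ in range(10):
    state = sis_step(adj, state, beta=0.6, delta=0.2, rng=rng)
```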

  3. Complex Dynamics in Academics' Developmental Processes in Teaching

    Science.gov (United States)

    Trautwein, Caroline; Nückles, Matthias; Merkt, Marianne

    2015-01-01

    Improving teaching in higher education is a concern for universities worldwide. This study explored academics' developmental processes in teaching using episodic interviews and teaching portfolios. Eight academics in the context of teaching development reported changes in their teaching and change triggers. Thematic analyses revealed seven areas…

  4. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

A process is described for automatically analysing, in real time, separate signal-processing channels (one channel per signal) in a facility where signals from transducers at selected points vary in time against significant background noise, for the continuous monitoring of the operating condition of the various components of the installation. The signals serve to detect potential failures, draw conclusions as to the severity of those potential failures, and indicate to an operator the measures to be taken in consequence. The process comprises the automatic, successive selection of each channel for spectral analysis; the automatic processing of each selected channel's signal to yield energy spectral density data at predetermined frequencies; the automatic comparison of each channel's energy spectral density data against predetermined, frequency-dependent sets of limits; and the automatic indication to the operator of the condition of the installation components associated with each channel and of the measures to be taken depending on the set of limits exceeded. [fr]
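The per-channel check described (spectral analysis, then comparison against frequency-dependent limits) can be sketched as follows; a single-frequency DFT projection stands in for the patent's full spectral analysis, and all names and thresholds are illustrative:

```python
import math

def band_power(signal, fs, freq):
    """Normalized power of `signal` (sampled at fs Hz) at one frequency,
    via a direct DFT projection -- a stand-in for full spectral analysis."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq * k / fs) for k, x in enumerate(signal))
    im = sum(-x * math.sin(2 * math.pi * freq * k / fs) for k, x in enumerate(signal))
    return (re * re + im * im) / (n * n)

def check_channel(signal, fs, limits):
    """Compare spectral power at predetermined frequencies against
    frequency-dependent limits; return the frequencies that exceed them."""
    return [f for f, lim in limits.items() if band_power(signal, fs, f) > lim]

# usage: a pure 50 Hz tone sampled at 1000 Hz for 1 s
fs = 1000
sig = [math.sin(2 * math.pi * 50 * k / fs) for k in range(1000)]
alarms = check_channel(sig, fs, {50: 0.1, 200: 0.1})
# the 50 Hz component exceeds its limit; the 200 Hz one does not
```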

  5. Nonlinear signal processing for ultrasonic imaging of material complexity

    Czech Academy of Sciences Publication Activity Database

    Dos Santos, S.; Vejvodová, Šárka; Převorovský, Zdeněk

    2010-01-01

Vol. 59, No. 2 (2010), pp. 108-117. ISSN 1736-6046. Institutional research plan: CEZ:AV0Z20760514. Keywords: nonlinear signal processing * TR-NEWS * symmetry analysis * DORT. Subject RIV: BI - Acoustics. Impact factor: 0.464, year: 2010. www.eap.ee/proceedings

  6. ALGORITHM OF CARDIO COMPLEX DETECTION AND SORTING FOR PROCESSING THE DATA OF CONTINUOUS CARDIO SIGNAL MONITORING.

    Science.gov (United States)

    Krasichkov, A S; Grigoriev, E B; Nifontov, E M; Shapovalov, V V

The paper presents an algorithm for cardio complex classification as part of processing the data of continuous cardiac monitoring. R-wave detection concurrent with cardio complex sorting is discussed. The core of this approach is the use of prior information about cardio complex forms, segmental structure, and degree of kindness. Results of testing the sorting algorithm are provided.
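As a toy illustration of the R-wave detection step (the paper's actual algorithm additionally exploits prior information on cardio complex forms and segmental structure), a naive threshold-plus-refractory-period detector might look like this; all names and parameters are ours:

```python
def detect_r_peaks(ecg, threshold, refractory):
    """Naive R-wave detector: local maxima above `threshold`, at least
    `refractory` samples apart. Real monitors add filtering and
    template matching against known cardio complex forms."""
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if not peaks or i - peaks[-1] >= refractory:
                peaks.append(i)
    return peaks

# usage: a synthetic trace with two spikes
ecg = [0.0] * 50
ecg[10], ecg[30] = 1.0, 1.2
peaks = detect_r_peaks(ecg, threshold=0.5, refractory=5)
```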

  7. Certain aspects of the reactivity of carotenoids. Redox processes and complexation

    International Nuclear Information System (INIS)

    Polyakov, Nikolay E; Leshina, Tatyana V

    2006-01-01

    The published data on the redox reactions of carotenoids, their supramolecular inclusion complexes and the composition, properties and practical application of these complexes are generalised. Special attention is given to the effect of complexation on radical processes involving carotenoids and on the antioxidant activity of carotenoids.

  8. Anxiety symptoms mediate the relationship between exposure to stressful negative life events and depressive symptoms: A conditional process modelling of the protective effects of resilience.

    Science.gov (United States)

    Anyan, Frederick; Worsley, Lyn; Hjemdal, Odin

    2017-10-01

Resilience has provided a useful framework that elucidates the effects of protective factors in overcoming psychological adversities, but studies that address the potential contingencies of resilience to protect against direct and indirect negative effects are lacking. These gaps have also resulted in oversimplification of complex processes that can be clarified by moderated mediation associations. This study examines a conditional process modelling of the protective effects of resilience against indirect effects. Two separate samples were recruited in a cross-sectional survey from Australia and Norway to complete the Patient Health Questionnaire-9, Generalized Anxiety Disorder scale, Stressful Negative Life Events Questionnaire and the Resilience Scale for Adults. The final sample sizes were 206 (females = 114; males = 91; other = 1) and 210 (females = 155; males = 55) for Australia and Norway, respectively. Moderated mediation analyses were conducted across the samples. Anxiety symptoms mediated the relationship between exposure to stressful negative life events and depressive symptoms in both samples. Conditional indirect effects of exposure to stressful negative life events on depressive symptoms mediated by anxiety symptoms showed that the high-resilience subgroup was associated with a smaller effect of exposure to stressful negative life events, through anxiety symptoms, on depressive symptoms than the low-resilience subgroup. As a cross-sectional survey, the present study does not answer questions about causal processes, despite the use of conditional process modelling. These findings support the view that resilience protective resources can protect against psychological adversities both directly and indirectly (through other channels). Copyright © 2017 Elsevier B.V. All rights reserved.
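For readers unfamiliar with conditional process modelling: at its core lies the indirect effect, the product of the predictor-to-mediator slope (a) and the mediator-to-outcome slope (b). A deliberately simplified, unmoderated sketch with hypothetical data (the study's actual model adds the moderator, resilience, plus covariates, and estimates b controlling for the predictor):

```python
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def indirect_effect(x, m, y):
    """Unmoderated indirect effect a*b: a = slope of the mediator on the
    predictor, b = slope of the outcome on the mediator (sketch only)."""
    return slope(x, m) * slope(m, y)

# hypothetical illustration: stress -> anxiety (a = 2) -> depression (b = 0.5)
stress  = [1, 2, 3, 4, 5, 6]
anxiety = [2, 4, 6, 8, 10, 12]
depress = [1, 2, 3, 4, 5, 6]
effect = indirect_effect(stress, anxiety, depress)   # a * b = 1.0
```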

  9. Older Adults' Coping with Negative Life Events: Common Processes of Managing Health, Interpersonal, and Financial/Work Stressors

    Science.gov (United States)

    Moos, Rudolf H.; Brennan, Penny L.; Schutte, Kathleen K.; Moos, Bernice S.

    2006-01-01

    This study examined how older adults cope with negative life events in health, interpersonal, and financial/work domains and whether common stress and coping processes hold across these three domains. On three occasions, older adults identified the most severe negative event they faced in the last year and described how they appraised and coped…

  10. The visual illustration of complex process information during abnormal incidents

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Kautto, A.; Norros, L.; Ranta, J.

    1985-01-01

One of the proposed solutions to the man-process interface problem in nuclear power plants is the integration of a system in the control room that provides the operator with a display of a minimum set of critical plant parameters defining the safety status of the plant. Such a system was experimentally validated using the Loviisa training simulator during the fall of 1982. The project was a joint effort between Combustion Engineering Inc., the Halden Reactor Project, Imatran Voima Oy and VTT. Alarm systems are used in nuclear power plants to tell the control room operators that an unexpected change in the plant operating state has occurred. One difficulty in using the alarms for checking the actions of the operator is that the conventional way of realizing alarm systems means that several alarms are active even during normal operation. The coding and representation of alarm information are discussed in the paper. An important trend in control room design is the move away from direct, concrete indication of process parameters towards the use of more abstract/logical representations of information as a basis for plant supervision. Recent advances in computer graphics make it possible that, in the future, visual information will be utilized to make the essential dynamics of the process more intelligible. A set of criteria for the use of visual information will be necessary. The paper discusses practical aspects of realising such criteria in the context of a nuclear power plant. Criteria for decomposing process information with respect to the sub-goals of safety and availability, as well as tentative results of conceptualizing a PWR process, are also discussed.

  11. Electrospray ionization mass spectrometry for the hydrolysis complexes of cisplatin: implications for the hydrolysis process of platinum complexes.

    Science.gov (United States)

    Feifan, Xie; Pieter, Colin; Jan, Van Bocxlaer

    2017-07-01

Non-enzyme-dependent hydrolysis of the drug cisplatin is important for its mode of action and toxicity. However, to date, the hydrolysis process of cisplatin is still not completely understood. In the present study, the hydrolysis of cisplatin in an aqueous solution was systematically investigated by using electrospray ionization mass spectrometry coupled to liquid chromatography. A variety of previously unreported hydrolysis complexes corresponding to monomeric, dimeric and trimeric species were detected and identified. The characteristics of the Pt-containing complexes were investigated by using collision-induced dissociation (CID). The hydrolysis complexes demonstrate distinctive and correlative CID characteristics, which provides tools for an informative identification. The most frequently observed dissociation mechanism was sequential loss of NH3, H2O and HCl. Loss of the Pt atom was observed as the final step during the CID process. The formation mechanisms of the observed complexes were explored and experimentally examined. The strongly bound dimeric species, which existed in solution, are assumed to be formed from the clustering of the parent compound and its monohydrated or dihydrated complexes. The role of the electrospray process in the formation of some of the observed ions was also evaluated, and the electrospray ionization-related cold clusters were identified. The previously reported hydrolysis equilibria were tested and subsequently refined via a hydrolysis study, resulting in a refined mechanistic equilibrium system for cisplatin proposed from our results. Copyright © 2017 John Wiley & Sons, Ltd.
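The sequential-loss pattern reported (NH3, then H2O, then HCl) translates directly into expected fragment m/z values by subtracting monoisotopic masses of the neutral losses. A hypothetical annotation helper (the precursor mass is illustrative, not a measured cisplatin species):

```python
# monoisotopic masses (u) of the neutral losses observed in the CID spectra
NH3, H2O, HCL = 17.02655, 18.01056, 35.97668

def neutral_loss_series(precursor_mz, losses):
    """Expected m/z after each sequential neutral loss from a singly
    charged precursor (hypothetical helper for annotating CID peaks)."""
    series, mz = [], precursor_mz
    for loss in losses:
        mz -= loss
        series.append(mz)
    return series

# a hypothetical singly charged precursor at m/z 300 losing NH3, then H2O, then HCl
series = neutral_loss_series(300.0, [NH3, H2O, HCL])
```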

  12. Social anxiety and post-event processing among African-American individuals.

    Science.gov (United States)

    Buckner, Julia D; Dean, Kimberlye E

    2017-03-01

    Social anxiety is among the most prevalent psychiatric conditions, yet little attention has been paid to whether putative cognitive vulnerability factors related to social anxiety in predominantly White samples are related to social anxiety among historically underrepresented groups. We tested whether one such vulnerability factor, post-event processing (PEP; detailed review of social event that can increase state social anxiety) was related to social anxiety among African-American (AA; n = 127) persons, who comprise one of the largest underrepresented racial groups in the U.S. Secondarily, we tested whether AA participants differed from non-Hispanic White participants (n = 127) on PEP and social anxiety and whether race moderated the relation between PEP and social anxiety. Data were collected online among undergraduates. PEP was positively correlated with social anxiety among AA participants, even after controlling for depression and income, pr = .30, p = .001. AA and White participants did not differ on social anxiety or PEP, β = -1.57, 95% CI: -5.11, 1.96. The relation of PEP to social anxiety did not vary as a function of race, β = 0.00, 95% CI: -0.02, 0.02. PEP may be an important cognitive vulnerability factor related to social anxiety among AA persons suffering from social anxiety.

  13. Social Anxiety and Post-Event Processing: The Impact of Race

    Science.gov (United States)

    Buckner, Julia D.; Dean, Kimberlye E.

    2016-01-01

    Background Social anxiety is among the most prevalent psychiatric conditions, yet little attention has been paid to whether putative cognitive vulnerability factors related to social anxiety in predominantly White samples are related to social anxiety among historically underrepresented groups. Design We tested whether one such vulnerability factor, post-event processing (PEP; detailed review of social event that can increase state social anxiety) was related to social anxiety among African American (AA; n=127) persons, who comprise one of the largest underrepresented racial groups in the U.S. Secondarily, we tested whether AA participants differed from non-Hispanic White participants (n=127) on PEP and social anxiety and whether race moderated the relation between PEP and social anxiety. Method Data were collected online among undergraduates. Results PEP was positively correlated with social anxiety among AA participants, even after controlling for depression and income, pr=.30, p=.001. AA and White participants did not differ on social anxiety or PEP, β=−1.57, 95% C.I.: −5.11, 1.96. The relation of PEP to social anxiety did not vary as a function of race, β=0.00, 95% C.I.: −0.02, 0.02. Conclusions PEP may be an important cognitive vulnerability factor related to social anxiety among AA persons suffering from social anxiety. PMID:27576610

  14. Event-related delta, theta, alpha and gamma correlates to auditory oddball processing during Vipassana meditation

    Science.gov (United States)

    Delorme, Arnaud; Polich, John

    2013-01-01

Long-term Vipassana meditators sat in meditation and control (instructed mind wandering) states for 25 min each while electroencephalography (EEG) was recorded; condition order was counterbalanced. For the last 4 min, a three-stimulus auditory oddball series was presented through headphones during both meditation and control periods, with no task imposed. Time-frequency analysis demonstrated that meditation, relative to the control condition, evinced decreased evoked delta (2–4 Hz) power to distracter stimuli concomitantly with a greater event-related reduction of late (500–900 ms) alpha-1 (8–10 Hz) activity, which indexed altered dynamics of attentional engagement to distracters. Additionally, standard stimuli were associated with increased early event-related alpha phase synchrony (inter-trial coherence) and evoked theta (4–8 Hz) phase synchrony, suggesting enhanced processing of the habituated standard background stimuli. Finally, during meditation, there was a greater differential early-evoked gamma power to the different stimulus classes. Correlation analysis indicated that this effect stemmed from a meditation state-related increase in early distracter-evoked gamma power and phase synchrony specific to longer-term expert practitioners. The findings suggest that Vipassana meditation evokes a brain state of enhanced perceptual clarity and decreased automated reactivity. PMID:22648958
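Inter-trial coherence (the phase-synchrony measure used in this study) is the magnitude of the mean unit-length phasor across trials at a given time-frequency point: 1 for perfectly phase-locked trials, near 0 for random phase. A minimal sketch, assuming per-trial phases have already been extracted (e.g., from wavelet coefficients):

```python
import cmath
import math

def inter_trial_coherence(phases):
    """Inter-trial phase coherence: |mean of exp(i*phase)| across trials.
    1.0 = perfectly phase-locked; ~0 = uniformly random phase."""
    return abs(sum(cmath.exp(1j * p) for p in phases)) / len(phases)

# identical phases across trials -> full coherence
locked = inter_trial_coherence([0.7] * 10)
# phases spread evenly around the circle -> zero coherence
spread = inter_trial_coherence([2 * math.pi * k / 8 for k in range(8)])
```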

  15. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    Science.gov (United States)

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to a mature geometry and mesh handling external software (Comsol Multiphysics) provides for a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines.
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at
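
The reaction-diffusion master equation (RDME) formalism that URDME builds on can be sketched independently of the framework: molecules live in mesh voxels, diffusion is treated as a jump "reaction" between neighboring voxels, and the coupled system is sampled exactly with Gillespie's direct method. The sketch below is not URDME's API; the species, rates, and 1D three-voxel "mesh" are invented for illustration:

```python
import random

def rdme_ssa(voxels, d, k_deg, t_end, seed=1):
    """Direct-method SSA for a reaction-diffusion master equation on a 1D
    chain of voxels. One species A diffuses (rate d per molecule per
    neighbor) and degrades (A -> 0 at rate k_deg). Returns final counts."""
    rng = random.Random(seed)
    n = list(voxels)
    t = 0.0
    while t < t_end:
        # Build propensities: diffusion left/right per voxel, plus degradation.
        props = []
        for i, ni in enumerate(n):
            if i > 0:
                props.append(('diff', i, i - 1, d * ni))
            if i < len(n) - 1:
                props.append(('diff', i, i + 1, d * ni))
            props.append(('deg', i, i, k_deg * ni))
        a0 = sum(p[3] for p in props)
        if a0 == 0.0:
            break                        # no molecules left, nothing can fire
        t += rng.expovariate(a0)         # exponentially distributed waiting time
        if t >= t_end:
            break
        r = rng.uniform(0.0, a0)         # choose an event proportional to propensity
        for kind, src, dst, a in props:
            r -= a
            if r <= 0.0:
                n[src] -= 1
                if kind == 'diff':
                    n[dst] += 1
                break
    return n

final = rdme_ssa([100, 0, 0], d=1.0, k_deg=0.1, t_end=5.0)
```

On an unstructured mesh the same algorithm applies; only the neighbor lists and per-edge jump rates (derived from the finite-element discretization) change.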

  16. Complex Event Processing for Content-Based Text, Image, and Video Retrieval

    Science.gov (United States)

    2016-06-01

    use by intelligence analysts when exploiting social media sources. The group is composed of Open Source Intelligence ( OSINT ) practitioners and...defense scientists, each having specific roles in achieving the following goals. For OSINT practitioners, the need from social media was identified as...selected tool/script, or methodology, using the US-provided ground- truth data set to determine how well the social media platform met OSINT needs

  17. Complex event processing for content-based text, image, and video retrieval

    NARCIS (Netherlands)

    Bowman, E.K.; Broome, B.D.; Holland, V.M.; Summers-Stay, D.; Rao, R.M.; Duselis, J.; Howe, J.; Madahar, B.K.; Boury-Brisset, A.C.; Forrester, B.; Kwantes, P.; Burghouts, G.; Huis, J. van; Mulayim, A.Y.

    2016-01-01

    This report summarizes the findings of an exploratory team of the North Atlantic Treaty Organization (NATO) Information Systems Technology panel into Content-Based Analytics (CBA). The team carried out a technical review into the current status of theoretical and practical developments of methods,

  18. Complex media from processing of agricultural crops for microbial fermentation

    DEFF Research Database (Denmark)

    Thomsen, M.H.

    2005-01-01

    , is converted to a basic, universal fermentation medium by lactic acid fermentation, is outlined. The resulting all-round fermentation medium can be used for the production of many useful fermentation products when added a carbohydrate source, which could possibly be another agricultural by-product. Two...... examples of such products-polylactic acid and L-lysine-are given. A cost calculation shows that this fermentation medium can be produced at a very low cost approximate to 1.7 Euro cent/kg, when taking into account that the green crop industry has expenses amounting to 270,000 Euro/year for disposal...... of the brown juice. A newly built lysine factory in Esbjerg, Denmark, can benefit from this process by buying a low price medium for the fermentation process instead of more expensive traditional fermentation liquids such as corn steep liquor....

  19. Specific processes in solvent extraction of radionuclide complexes

    International Nuclear Information System (INIS)

    Macasek, F.

    1982-01-01

    The doctoral thesis discusses the consequences of radioactive beta transformation in liquid-liquid and liquid-ion exchanger systems, and the effect of the chemical composition of liquid-liquid systems on the distribution of radionuclide traces. A model is derived of radiolysis in two-phase liquid-liquid systems used in nuclear chemical technology. The obtained results are used to suggest the processing of radioactive wastes using the Purex process. For solvent extraction the following radionuclides were used: 59Fe, 95Zr-95Nb, 99Mo, 99mTc, 99Tc, 103Pd, 137Cs, 141Ce, 144Ce-144Pr, 234Th, and 233Pa. Extraction was carried out at laboratory temperature. 60Co was used as the radiation source. Mainly scintillation spectrometry equipment was used for radiometric analysis. (E.S.)

  20. [Event-related brain potentials during conjugation of Russian verbs: on the problem of language processing modularity].

    Science.gov (United States)

    Dan'ko, S G; Boĭtsova, Iu A; Solov'eva, M L; Chernigovskaia, T V; Medvedev, S V

    2014-01-01

    In the light of alternative conceptions of "two-system" and "single-system" models of language processing, efforts have been undertaken to study brain mechanisms for the generation of regular and irregular forms of Russian verbs. Nineteen EEG channels of evoked activity were registered along with casual alternations of the speech morphology operations to be compared. Verbs of imperfective aspect in the form of an infinitive, belonging either to a group of productive verbs (default, conventionally regular class) or to an unproductive group of verbs (conventionally irregular class), were presented to healthy subjects. The subjects were requested to produce first-person present-tense forms of these verbs. Results of an analysis of event-related potentials (ERP) for a group of 22 persons are presented. Statistically reliable ERP amplitude distinctions between the verb groups are found only in the latencies 600-850 ms in central and parietal zones of the cortex. In these latencies, ERP values associated with the presentation of irregular verbs are negative relative to ERP values associated with the presentation of regular verbs. The received results are interpreted as a consequence of the varying complexity of mental work with verbs of these different groups and presumably do not support a hypothesis of universality of the "two-system" brain mechanism for processing of regular and irregular language forms.

  1. Processing of visual semantic information to concrete words: temporal dynamics and neural mechanisms indicated by event-related brain potentials.

    Science.gov (United States)

    van Schie, Hein T; Wijers, Albertus A; Mars, Rogier B; Benjamins, Jeroen S; Stowe, Laurie A

    2005-05-01

    Event-related brain potentials were used to study the retrieval of visual semantic information to concrete words, and to investigate possible structural overlap between visual object working memory and concreteness effects in word processing. Subjects performed an object working memory task that involved 5 s retention of simple 4-angled polygons (load 1), complex 10-angled polygons (load 2), and a no-load baseline condition. During the polygon retention interval subjects were presented with a lexical decision task to auditory presented concrete (imageable) and abstract (nonimageable) words, and pseudowords. ERP results are consistent with the use of object working memory for the visualisation of concrete words. Our data indicate a two-step processing model of visual semantics in which visual descriptive information of concrete words is first encoded in semantic memory (indicated by an anterior N400 and posterior occipital positivity), and is subsequently visualised via the network for object working memory (reflected by a left frontal positive slow wave and a bilateral occipital slow wave negativity). Results are discussed in the light of contemporary models of semantic memory.

  2. The Impact of Cognitive Behavioral Therapy on Post Event Processing Among Those with Social Anxiety Disorder

    Science.gov (United States)

    Price, Matthew; Anderson, Page L.

    2011-01-01

    Individuals with social anxiety are prone to engage in post event processing (PEP), a post mortem review of a social interaction that focuses on negative elements. The extent to which PEP is affected by cognitive behavioral therapy (CBT), and the relation between PEP and change during treatment, have yet to be evaluated in a controlled study. The current study used multilevel modeling to determine whether PEP decreased as a result of treatment and whether PEP limits treatment response for two types of cognitive behavioral treatment: a group-based cognitive behavioral intervention and individually based virtual reality exposure. These hypotheses were evaluated using 91 participants diagnosed with social anxiety disorder. The findings suggested that PEP decreased as a result of treatment, and that social anxiety symptoms for individuals reporting greater levels of PEP improved at a slower rate than for those with lower levels of PEP. Further research is needed to understand why PEP attenuates response to treatment. PMID:21159328

  3. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    National Research Council Canada - National Science Library

    Neu, Charles R; Davenport, Jon; Smith, William R

    2007-01-01

    This paper uses discrete-event simulation modeling, inventory-reduction, and process improvement concepts to identify and analyze possibilities for improving the training continuum at the Marine Corps...

  4. The time course of implicit processing of erotic pictures: an event-related potential study.

    Science.gov (United States)

    Feng, Chunliang; Wang, Lili; Wang, Naiyi; Gu, Ruolei; Luo, Yue-Jia

    2012-12-13

    The current study investigated the time course of the implicit processing of erotic stimuli using event-related potentials (ERPs). ERPs elicited by erotic pictures were compared with those elicited by three other types of pictures: non-erotic positive, negative, and neutral pictures. We observed that erotic pictures evoked enhanced neural responses compared with other pictures at both early (P2/N2) and late (P3/positive slow wave) temporal stages. These results suggested that erotic pictures selectively captured individuals' attention at early stages and evoked deeper processing at late stages. More importantly, the amplitudes of P2, N2, and P3 only discriminated between erotic and non-erotic (i.e., positive, neutral, and negative) pictures. That is, no difference was revealed among non-erotic pictures, although these pictures differed in both valence and arousal. Thus, our results suggest that the processing of erotic pictures goes beyond valence and arousal. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Specific neural basis of Chinese idioms processing: an event-related functional MRI study

    International Nuclear Information System (INIS)

    Chen Shaoqi; Zhang Yanzhen; Xiao Zhuangwei; Zhang Xuexin

    2007-01-01

    Objective: To address the neural basis of Chinese idioms processing with different kinds of stimuli using an event-related fMRI design. Methods: Sixteen native Chinese speakers were asked to perform a semantic decision task during fMRI scanning. Three kinds of stimuli were used: real idioms (real-idiom condition); literally plausible phrases (pseudo-idiom condition, the last character of a real idiom was replaced by a character with similar meaning); and literally implausible strings (non-idiom condition, the last character of a real idiom was replaced by a character with unrelated meaning). Reaction time and correct rate were recorded at the same time. Results: The error rate was 2.6%, 5.2% and 0.9% (F=3.51, P 0.05) for real idioms, pseudo-idioms and wrong idioms, respectively. A similar neural network was activated in all three conditions. However, the right hippocampus was only activated in the real-idiom condition, and significant activations were found in the anterior portion of the left inferior frontal gyrus (BA47) in the real- and pseudo-idiom conditions, but not in the non-idiom condition. Conclusion: The right hippocampus plays a specific role in the particular wording of Chinese idioms, and the left anterior inferior frontal gyrus (BA47) may be engaged in the semantic processing of Chinese idioms. The results support the notion that there are specific neural bases for Chinese idioms processing. (authors)

  6. THE DEVELOPMENT OF THE YUCCA MOUNTAIN PROJECT FEATURE, EVENT, AND PROCESS (FEP) DATABASE

    International Nuclear Information System (INIS)

    Freeze, G.; Swift, P.; Brodsky, N.

    2000-01-01

    A Total System Performance Assessment for Site Recommendation (TSPA-SR) has recently been completed (CRWMS M&O, 2000b) for the potential high-level waste repository at the Yucca Mountain site. The TSPA-SR is an integrated model of scenarios and processes relevant to the postclosure performance of the potential repository. The TSPA-SR scenarios and model components in turn include representations of all features, events, and processes (FEPs) identified as being relevant (i.e., screened in) for analysis. The process of identifying, classifying, and screening potentially relevant FEPs thus provides a critical foundation for scenario development and TSPA analyses for the Yucca Mountain site (Swift et al., 1999). The objectives of this paper are to describe (a) the identification and classification of the comprehensive list of FEPs potentially relevant to the postclosure performance of the potential Yucca Mountain repository, and (b) the development, structure, and use of an electronic database for storing and retrieving screening information about the inclusion and/or exclusion of these Yucca Mountain FEPs in TSPA-SR. The FEPs approach to scenario development is not unique to the Yucca Mountain Project (YMP). General systematic approaches are summarized in NEA (1992). The application of the FEPs approach in several other international radioactive waste disposal programs is summarized in NEA (1999).

  7. Available processing resources influence encoding-related brain activity before an event.

    Science.gov (United States)

    Galli, Giulia; Gebert, A Dorothea; Otten, Leun J

    2013-09-01

    Effective cognitive functioning not only relies on brain activity elicited by an event, but also on activity that precedes it. This has been demonstrated in a number of cognitive domains, including memory. Here, we show that brain activity that precedes the effective encoding of a word into long-term memory depends on the availability of sufficient processing resources. We recorded electrical brain activity from the scalps of healthy adult men and women while they memorized intermixed visual and auditory words for later recall. Each word was preceded by a cue that indicated the modality of the upcoming word. The degree to which processing resources were available before word onset was manipulated by asking participants to make an easy or difficult perceptual discrimination on the cue. Brain activity before word onset predicted later recall of the word, but only in the easy discrimination condition. These findings indicate that anticipatory influences on long-term memory are limited in capacity and sensitive to the degree to which attention is divided between tasks. Prestimulus activity that affects later encoding can only be engaged when the necessary cognitive resources can be allocated to the encoding process. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Process evaluation for complex interventions in health services research: analysing context, text trajectories and disruptions.

    Science.gov (United States)

    Murdoch, Jamie

    2016-08-19

    Process evaluations assess the implementation and sustainability of complex healthcare interventions within clinical trials, with well-established theoretical models available for evaluating intervention delivery within specific contexts. However, there is a need to translate conceptualisations of context into analytical tools which enable the dynamic relationship between context and intervention implementation to be captured and understood. In this paper I propose an alternative approach to the design, implementation and analysis of process evaluations for complex health interventions through a consideration of trial protocols as textual documents, distributed and enacted at multiple contextual levels. As an example, I conduct retrospective analysis of a sample of field notes and transcripts collected during the ESTEEM study - a cluster randomised controlled trial of primary care telephone triage. I draw on theoretical perspectives associated with Linguistic Ethnography to examine the delivery of ESTEEM through staff orientation to different texts. In doing so I consider what can be learned from examining the flow and enactment of protocols for notions of implementation and theoretical fidelity (i.e. intervention delivered as intended and whether congruent with the intervention theory). Implementation of the triage intervention required staff to integrate essential elements of the protocol within everyday practice, seen through the adoption and use of different texts that were distributed across staff and within specific events. Staff were observed deploying texts in diverse ways (e.g. reinterpreting scripts, deviating from standard operating procedures, difficulty completing decision support software), providing numerous instances of disruption to maintaining intervention fidelity. Such observations exposed tensions between different contextual features in which the trial was implemented, offering theoretical explanations for the main trial findings. 
The value of

  9. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements gained in this direction are briefly reviewed using the example of domestic NPPs. Two models are outlined for the problem of distributing functions between the operator and technical means. The processes subjected to automation are enumerated. Development of optimal methods of automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations becomes especially important during start-up, shutdown, or in emergency situations. [ru]

  10. Effect of task complexity on intelligence and neural efficiency in children: an event-related potential study.

    Science.gov (United States)

    Zhang, Qiong; Shi, Jiannong; Luo, Yuejia; Liu, Sainan; Yang, Jie; Shen, Mowei

    2007-10-08

    The present study investigates the effects of task complexity, intelligence and neural efficiency on children's performance on an Elementary Cognitive Task. Twenty-three children were divided into two groups on the basis of their Raven Progressive Matrix scores and were then asked to complete a choice reaction task with two test conditions. We recorded the electroencephalogram and calculated the peak latencies and amplitudes for anteriorly distributed P225, N380 and late positive component. Our results suggested shorter late positive component latencies in brighter children, possibly reflecting a higher processing speed in these individuals. Increased P225 amplitude and increased N380 amplitudes for brighter children may indicate a more efficient allocation of attention for brighter children. No moderating effect of task complexity on brain-intelligence relationship was found.

  11. Geologic factors in the isolation of nuclear waste: evaluation of long-term geomorphic processes and events

    International Nuclear Information System (INIS)

    Mara, S.J.

    1979-01-01

    In this report the rate, duration, and magnitude of changes from geomorphic processes and events in the Southwest and the Gulf Coast over the next million years are projected. The projections were made by reviewing the pertinent literature; evaluating the geomorphic history of each region, especially that during the Quaternary Period; identifying the geomorphic processes and events likely to be significant in the two regions of interest; and estimating the average and worst-case conditions expected over the next million years

  12. Space Launch System Complex Decision-Making Process

    Science.gov (United States)

    Lyles, Garry; Flores, Tim; Hundley, Jason; Monk, Timothy; Feldman, Stuart

    2012-01-01

    The Space Shuttle program has ended and elements of the Constellation Program have either been cancelled or transitioned to new NASA exploration endeavors. The National Aeronautics and Space Administration (NASA) has worked diligently to select an optimum configuration for the Space Launch System (SLS), a heavy lift vehicle that will provide the foundation for future beyond low earth orbit (LEO) large-scale missions for the next several decades. From Fall 2010 until Spring 2011, an SLS decision-making framework was formulated, tested, fully documented, and applied to multiple SLS vehicle concepts at NASA from previous exploration architecture studies. This was a multistep process that involved performing figure of merit (FOM)-based assessments, creating Pass/Fail gates based on draft threshold requirements, performing a margin-based assessment with supporting statistical analyses, and performing sensitivity analysis on each. This paper focuses on the various steps and methods of this process (rather than specific data) that allowed for competing concepts to be compared across a variety of launch vehicle metrics in support of the successful completion of the SLS Mission Concept Review (MCR) milestone.
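
The multistep down-selection described above (Pass/Fail gates derived from draft threshold requirements, followed by a figure-of-merit comparison) can be sketched generically. Everything below, including concept names, the payload gate, FOM names, weights, and numbers, is invented for illustration and is not NASA data:

```python
def downselect(concepts, gates, fom_weights):
    """Screen competing concepts through Pass/Fail gates, then rank the
    survivors by a weighted figure-of-merit (FOM) score."""
    survivors = [c for c in concepts if all(gate(c) for gate in gates)]
    def score(c):
        # Higher FOM scores are better; weights express relative importance.
        return sum(c['foms'][name] * w for name, w in fom_weights.items())
    return sorted(survivors, key=score, reverse=True)

concepts = [
    {'name': 'A', 'payload_t': 70, 'foms': {'cost': 0.6, 'reliability': 0.8}},
    {'name': 'B', 'payload_t': 95, 'foms': {'cost': 0.4, 'reliability': 0.9}},
    {'name': 'C', 'payload_t': 50, 'foms': {'cost': 0.9, 'reliability': 0.7}},
]
gates = [lambda c: c['payload_t'] >= 60]       # hypothetical threshold requirement
weights = {'cost': 0.5, 'reliability': 0.5}
ranked = downselect(concepts, gates, weights)  # concept C fails the gate
```

The margin-based assessment and sensitivity analyses mentioned in the paper would then probe how stable this ranking is to changes in the weights.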

  13. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    Science.gov (United States)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined into its essential components. First, three essential dimensions of static complexity are investigated, including interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed upon which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety originating from cybernetic theory is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement for static complexity using the Single Multi-Attribute Ranking Technique (SMART).  Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
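
The aggregation step can be sketched as a SMART-style weighted sum over the three dimensions. The dimension names, scores, and weights below are invented for illustration; the thesis's own scales and data are not reproduced here:

```python
def smart_score(scores, weights):
    """Aggregate per-dimension complexity scores into one static-complexity
    figure via a weighted sum (SMART-style). Weights are normalized so they
    sum to 1 before combining."""
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] / total_w for k in scores)

# Hypothetical 0-100 scores for one candidate lunar oxygen production process.
scores = {'interconnective': 60, 'strength': 40, 'variety': 75}
weights = {'interconnective': 0.5, 'strength': 0.3, 'variety': 0.2}
print(smart_score(scores, weights))  # 57.0 (= 30 + 12 + 15)
```

Repeating this for each candidate process yields a single comparable number per process, which is what makes a ranking of the four notional processes possible.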

  14. Tube formation by complex cellular processes in Ciona intestinalis notochord.

    Science.gov (United States)

    Dong, Bo; Horie, Takeo; Denker, Elsa; Kusakabe, Takehiro; Tsuda, Motoyuki; Smith, William C; Jiang, Di

    2009-06-15

    In the course of embryogenesis multicellular structures and organs are assembled from constituent cells. One structural component common to many organs is the tube, which consists most simply of a luminal space surrounded by a single layer of epithelial cells. The notochord of ascidian Ciona forms a tube consisting of only 40 cells, and serves as a hydrostatic "skeleton" essential for swimming. While the early processes of convergent extension in ascidian notochord development have been extensively studied, the later phases of development, which include lumen formation, have not been well characterized. Here we used molecular markers and confocal imaging to describe tubulogenesis in the developing Ciona notochord. We found that during tubulogenesis each notochord cell established de novo apical domains, and underwent a mesenchymal-epithelial transition to become an unusual epithelial cell with two opposing apical domains. Concomitantly, extracellular luminal matrix was produced and deposited between notochord cells. Subsequently, each notochord cell simultaneously executed two types of crawling movements bi-directionally along the anterior/posterior axis on the inner surface of notochordal sheath. Lamellipodia-like protrusions resulted in cell lengthening along the anterior/posterior axis, while the retraction of trailing edges of the same cell led to the merging of the two apical domains. As a result, the notochord cells acquired endothelial-like shape and formed the wall of the central lumen. Inhibition of actin polymerization prevented the cell movement and tube formation. Ciona notochord tube formation utilized an assortment of common and fundamental cellular processes including cell shape change, apical membrane biogenesis, cell/cell adhesion remodeling, dynamic cell crawling, and lumen matrix secretion.

  15. The investigation of form and processes in the coastal zone under extreme storm events - the case study of Rethymno, Greece

    Science.gov (United States)

    Afentoulis, Vasileios; Mohammadi, Bijan; Tsoukala, Vasiliki

    2017-04-01

    The coastal zone is a distinctive geographical region, since it gathers a wide range of social and human activities and constitutes a complex and fragile system of natural variables. Coastal communities are increasingly at risk from serious coastal hazards, such as shoreline erosion and flooding related to extreme hydro-meteorological events: storm surges, heavy precipitation, tsunamis and tides. In order to investigate the impact of these extreme events on the coastal zone, it is necessary to describe the driving mechanisms which contribute to its destabilization, and more precisely the interaction between wave forces and sediment transport. The aim of the present study is to examine the capability of numerical models to simulate coastal zone processes under extreme wave events in the coastal area of Rethymno, Greece. Rethymno is one of the eleven case study areas of the PEARL (Preparing for Extreme And Rare events in coastal regions) project, an EU-funded research project which aims at developing adaptive risk management strategies for coastal communities, focusing on extreme hydro-meteorological events with a multidisciplinary approach integrating social, environmental and technical research and innovation, so as to increase the resilience of coastal regions all over the world. Within this framework, three different numerical models have been used: MIKE 21 (DHI), the XBeach model, and a numerical formulation for sea bed evolution developed by Afaf Bouharguane and Bijan Mohammadi (2013). For the determination of the wave and hydrodynamic conditions, as well as the assessment of the sediment transport components, the MIKE 21 SW and MIKE 21 FM modules have been applied, and the bathymetry of Rethymno is arranged into a 2D unstructured mesh. This method of digitalization was selected because of its ability to easily represent the complex geometry of the coastal zone. It allows smaller scale wave characteristics to be

  16. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks.

    Science.gov (United States)

    Naveros, Francisco; Garrido, Jesus A; Carrillo, Richard R; Ros, Eduardo; Luque, Niceto R

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranged from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications which constitute the main contribution of this study systematically outperform the traditional event- and time-driven techniques under
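
A time-driven simulation of the simplest model mentioned, the leaky integrate-and-fire (LIF) neuron, illustrates the family: the membrane equation tau dV/dt = -(V - V_rest) + R*I is advanced with a fixed step, and the neuron spikes and resets on threshold crossing. This is a minimal sketch with illustrative parameter values; the bi-fixed-step method proposed in the paper additionally adapts the step size to the stiffness of the model dynamics, which is not shown here:

```python
def lif_time_driven(i_ext, dt=0.1, tau=10.0, v_rest=-65.0,
                    v_th=-50.0, v_reset=-65.0, r=10.0):
    """Time-driven LIF neuron: forward-Euler steps of the membrane equation
    tau*dV/dt = -(V - v_rest) + r*I at a fixed dt (ms). Returns spike times.
    Parameter values are illustrative, not taken from the paper."""
    v = v_rest
    spikes = []
    for step, i in enumerate(i_ext):
        v += dt / tau * (-(v - v_rest) + r * i)  # integrate one fixed step
        if v >= v_th:                            # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset                          # reset membrane potential
    return spikes

spikes = lif_time_driven([2.0] * 1000)  # constant 2.0 nA input for 100 ms
```

An event-driven variant would instead pre-compute the interval to the next threshold crossing (or read it from a look-up table, as the paper describes) and jump directly between spikes, trading per-step work for table storage and interpolation error.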

  17. ELECTRIC FACTORS INFLUENCING THE COMPLEX EROSION PROCESSING BY INTRODUCING THE ELECTROLYTE THROUGH THE TRANSFER OBJECT

    Directory of Open Access Journals (Sweden)

    Alin Nioata

    2014-05-01

    The electric and electrochemical complex erosion process is influenced by a great number of factors acting in tight interdependence, mutually influencing one another in achieving the stability of the process and the final technological characteristics. The quantities that take part in the fundamental phenomena of the complex erosion mechanism and contribute to the definition of the technological characteristics are its factors. The paper presents the potential difference U and the current intensity I as determining factors of the complex erosion process, as well as other factors deriving from them: the current density and the power of the supply source.
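
The derived factors named in the abstract follow from the two primary ones by the standard definitions: current density j = I/A and supply power P = U*I. A small sketch with invented values:

```python
def erosion_electric_factors(u_volts, i_amps, electrode_area_cm2):
    """Derived electric factors for an erosion process from the two primary
    ones: current density j = I / A and supply power P = U * I.
    The numeric values used below are illustrative only."""
    j = i_amps / electrode_area_cm2   # current density, A/cm^2
    p = u_volts * i_amps              # power drawn from the supply, W
    return j, p

j, p = erosion_electric_factors(u_volts=24.0, i_amps=12.0, electrode_area_cm2=6.0)
# j = 2.0 A/cm^2, p = 288.0 W
```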

  18. Extrinsic and intrinsic complexities of the Los Alamos plutonium processing facility

    International Nuclear Information System (INIS)

    Bearse, R.C.; Roberts, N.J.; Longmire, V.L.

    1985-01-01

    Analysis of the data obtained in one year of plutonium accounting at Los Alamos reveals significant complexity. Much of this complexity arises from the complexity of the processes themselves. Additional complexity is induced by errors in the data entry process. It is important to note that there is no evidence that this complexity is adversely affecting the accounting in the plant. The authors have been analyzing transaction data from fiscal year 1983 processing. This study involved 62,595 transactions. The data have been analyzed using the relational database program INGRES on a VAX 11/780 computer. This software allows easy manipulation of the original data and subsets drawn from it. The authors have been attempting for several years to understand the global features of the TA-55 accounting data. This project has underscored several of the system's complexities

  19. Extrinsic and intrinsic complexities of the Los Alamos Plutonium Processing Facility

    International Nuclear Information System (INIS)

    Bearse, R.C.; Longmire, V.L.; Roberts, N.J.

    1985-01-01

    Analysis of the data obtained in one year of plutonium accounting at Los Alamos reveals significant complexity. Much of this complexity arises from the complexity of the processes themselves. Additional complexity is induced by errors in the data entry process. It is important to note that there is no evidence that this complexity is adversely affecting the accounting in the plant. We have been analyzing transaction data from fiscal year 1983 processing. This study involved 62,595 transactions. The data have been analyzed using the relational database program INGRES on a VAX 11/780 computer. This software allows easy manipulation of the original data and subsets drawn from it. We have been attempting for several years to understand the global features of the TA-55 accounting data. This project has underscored several of the system's complexities. Examples that will be reported here include audit trails, lot-name multiplicity, etc

  20. Low Level Event and Near Miss Process for Nuclear Power Plants: Best Practices

    International Nuclear Information System (INIS)

    2012-01-01

    The IAEA programme on the operational safety of nuclear power plants gives priority to the development and promotion of the proper use of IAEA safety standards through the provision of assistance to Member States in the application of safety standards, the performance of safety review missions and the conduct of training activities based on safety standards. A number of IAEA safety standards and nuclear safety publications discuss the processes that need to be put into place for the feedback and analysis of operating experience (OE) at nuclear power plants. These include: Fundamental Safety Principles (IAEA Safety Standards Series No. SF-1), Safety of Nuclear Power Plants: Commissioning and Operation (IAEA Safety Standards Series No. SSR-2/2), Application of the Management System for Facilities and Activities (IAEA Safety Standards Series No. GS-G-3.1) and A System for the Feedback of Experience from Events in Nuclear Installations (IAEA Safety Standards Series No. NS-G-2.11). Additionally, several IAEA TECDOCs cover many aspects of the establishment, conduct and continuous improvement of an OE programme at nuclear power plants, including the consideration of low level events (LLEs) and near misses (NMs). Although these IAEA safety standards and nuclear safety publications have been in existence for several years, 70 per cent of the IAEA Operational Safety Review Team (OSART) missions carried out at nuclear power plants between 2006 and 2010 identified weaknesses in the reporting and analysis process for LLEs and NMs. In fact, this has been one of the recurring issues most often identified in the area of OE during these missions. These weaknesses have been further confirmed by most of the IAEA Peer Review of the Operational Safety Performance Experience (PROSPER) missions that have been conducted to date. Finally, the IAEA International Nuclear Safety Group, in their report entitled Improving the International System for Operating Experience Feedback (INSAG-23

  1. Sociality Mental Modes Modulate the Processing of Advice-Giving: An Event-Related Potentials Study

    Directory of Open Access Journals (Sweden)

    Jin Li

    2018-02-01

    Full Text Available People have different motivations to get along with others in different sociality mental modes (i.e., the communal mode and the market mode), which might affect social decision-making. The present study examined how these two types of sociality mental modes affect the processing of advice-giving using event-related potentials (ERPs). After being primed with the communal mode or the market mode, participants were instructed to decide whether or not to give advice (profitable or damnous) to a stranger, without any feedback. The behavioral results showed that participants were slower to give the profitable advice to the stranger than the damnous advice, but this difference was only observed in the market mode condition. The ERP results indicated that participants demonstrated a more negative N1 amplitude for the damnous advice than for the profitable advice, and a larger P300 was elicited in the market mode relative to both the communal mode and the control group. More importantly, participants in the market mode demonstrated a larger P300 for the profitable advice than for the damnous advice, whereas this difference was not observed in the communal mode or the control group. These findings are consistent with a dual-process account of decision-making and suggest that the market mode may lead to deliberate calculation of costs and benefits when giving profitable advice to others.

  2. The light-makeup advantage in facial processing: Evidence from event-related potentials.

    Science.gov (United States)

    Tagai, Keiko; Shimakura, Hitomi; Isobe, Hiroko; Nittono, Hiroshi

    2017-01-01

    The effects of makeup on attractiveness have been evaluated using mainly subjective measures. In this study, event-related brain potentials (ERPs) were recorded from a total of 45 Japanese women (n = 23 and n = 22 for Experiment 1 and 2, respectively) to examine the neural processing of faces with no makeup, light makeup, and heavy makeup. To have the participants look at each face carefully, an identity judgement task was used: they were asked to judge whether the two faces presented in succession were of the same person or not. The ERP waveforms in response to the first faces were analyzed. In two experiments with different stimulus probabilities, the amplitudes of N170 and vertex positive potential (VPP) were smaller for faces with light makeup than for faces with heavy makeup or no makeup. The P1 amplitude did not differ between facial types. In a subsequent rating phase, faces with light makeup were rated as more attractive than faces with heavy makeup and no makeup. The results suggest that the processing fluency of faces with light makeup is one of the reasons why light makeup is preferred to heavy makeup and no makeup in daily life.

  3. Impaired Empathy Processing in Individuals with Internet Addiction Disorder: An Event-Related Potential Study

    Directory of Open Access Journals (Sweden)

    Can Jiao

    2017-10-01

    Full Text Available Internet addiction disorder (IAD) is associated with deficits in social communication and avoidance of social contact. It has been hypothesized that people with IAD may have an impaired capacity for empathy. The purpose of the current study was to examine the processing of empathy for others' pain in individuals with IAD. Event-related potentials produced in response to pictures showing others in painful and non-painful situations were recorded in 16 IAD subjects and 16 healthy controls (HCs). The N1, P2, N2, P3, and late positive potential components were compared between the two groups. Robust picture × group interactions were observed for N2 and P3. The painful pictures elicited larger N2 and P3 amplitudes than the non-painful pictures did in the HC group but not in the IAD group. The results of this study suggest that both the early automatic and the later cognitive processes of pain empathy may be impaired in IAD. This study provides psychophysical evidence of empathy deficits in association with IAD. Further studies combining multidimensional measurements of empathy are needed to confirm these findings.

  4. Lexical ambiguity resolution during sentence processing in Parkinson's disease: An event-related potential study.

    Directory of Open Access Journals (Sweden)

    Anthony J Angwin

    Full Text Available Event-related potentials (ERPs) were recorded to investigate lexical ambiguity resolution during sentence processing in 16 people with Parkinson's disease (PD) and 16 healthy controls. Sentences were presented word-by-word on a computer screen, and participants were required to decide whether a subsequent target word was related to the meaning of the sentence. The task consisted of related, unrelated and ambiguous trials. For the ambiguous trials, the sentence ended with an ambiguous word and the target was related to one of the meanings of that word, but not the one captured by the sentence context (e.g., 'He dug with the spade', target 'ACE'). Both groups demonstrated slower reaction times and lower accuracy for the ambiguous condition relative to the unrelated condition; however, accuracy was impacted by the ambiguous condition to a larger extent in the PD group. These results suggest that PD patients experience increased difficulties with contextual ambiguity resolution. The ERP results did not reflect increased ambiguity resolution difficulties in PD, as a similar N400 effect was evident for the unrelated and ambiguous conditions in both groups. However, the magnitude of the N400 for these conditions was correlated with a measure of inhibition in the PD group, but not in the control group. The ERP results suggest that semantic processing may be more compromised in PD patients with greater response inhibition deficits.

  5. Feedback processing in adolescence: an event-related potential study of age and gender differences.

    Science.gov (United States)

    Grose-Fifer, Jillian; Migliaccio, Renee; Zottoli, Tina M

    2014-01-01

    Adolescence has frequently been characterized as a period of increased risk taking, which may be largely driven by maturational changes in neural areas that process incentives. To investigate age- and gender-related differences in reward processing, we recorded event-related potentials (ERPs) from 80 participants in a gambling game, in which monetary wins and losses were either large or small. We measured two ERP components: the feedback-related negativity (FRN) and the feedback P3 (fP3). The FRN was sensitive to the size of a win in both adult (aged 23-35 years) and adolescent (aged 13-17 years) males, but not in females. Small wins appeared to be less rewarding for males than for females, which may in part explain more approach-driven behavior in males in general. Furthermore, adolescent boys showed both delayed FRNs to high losses and less differentiation in FRN amplitude between wins and losses in comparison to girls. The fP3, which is thought to index the salience of the feedback at a more conscious level than the FRN, was also larger in boys than in girls. Taken together, these results imply that higher levels of risk taking that are commonly reported in adolescent males may be driven both by hypersensitivity to high rewards and insensitivity to punishment or losses. © 2014 S. Karger AG, Basel.

  6. An Address Event Representation-Based Processing System for a Biped Robot

    Directory of Open Access Journals (Sweden)

    Uziel Jaramillo-Avila

    2016-02-01

    Full Text Available In recent years, several important advances have been made in the fields of both biologically inspired sensorial processing and locomotion systems, such as Address Event Representation-based cameras (or Dynamic Vision Sensors) and human-like robot locomotion, e.g., the walking of a biped robot. However, making these fields merge properly is not an easy task. In this regard, Neuromorphic Engineering is a fast-growing research field, the main goal of which is the biologically inspired design of hybrid hardware systems that mimic neural architectures and process information in the manner of the brain. However, few robotic applications exist to illustrate it. The main goal of this work is to demonstrate, by creating a closed-loop system using only bio-inspired techniques, how such applications can work properly. We present an algorithm using Spiking Neural Networks (SNNs) for a biped robot equipped with a Dynamic Vision Sensor, which is designed to follow a line drawn on the floor. Line following is a commonly used method for demonstrating control techniques; most implementations are fairly simple to build without very sophisticated components, yet the task can still serve as a good test in more elaborate circumstances. In addition, the proposed locomotion system is able to coordinately control the six DOFs of a biped robot when switching between basic forms of movement. The latter has been implemented as an FPGA-based neuromorphic system. Numerical tests and hardware validation are presented.
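    The closed-loop idea in this record can be illustrated with a minimal leaky integrate-and-fire (LIF) sketch. Everything here is an assumption for illustration: the constants, the event counts, and the two-neuron steering scheme are invented, not the paper's actual SNN, DVS data, or FPGA design.

```python
# Event counts from the left/right halves of a Dynamic Vision Sensor feed
# two LIF neurons; the side that spikes more would steer the biped back
# toward the line. Leak and threshold values are illustrative.
def lif_spikes(event_counts, leak=0.9, threshold=3.0):
    """Leaky integration of per-timestep event counts; 1 = spike (then reset)."""
    v, spikes = 0.0, []
    for c in event_counts:
        v = leak * v + c          # leaky integration of input events
        if v >= threshold:
            spikes.append(1)
            v = 0.0               # reset membrane potential after a spike
        else:
            spikes.append(0)
    return spikes

# Line drifting toward the left half of the sensor: more left-side events.
left = lif_spikes([2, 2, 2, 2])
right = lif_spikes([0, 1, 0, 1])
print(left, right)  # [0, 1, 0, 1] [0, 0, 0, 0] -> steer left
```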

  7. [The strategy and process of out-hospital emergency care of acute cardiovascular events].

    Science.gov (United States)

    Sun, Gang; Wu, Li-e; Li, Qian-ying; Yang, Ye; Wang, Zi-chao; Zhang, Jing-yin; Li, Shu-jun; Yan, Xu-long; Wang, Ming; Zhang, Wen-xiang; Huang, Guan-hua

    2009-06-01

    To study the strategy and process of out-hospital emergency care of acute cardiovascular events. One hundred and eighty-three patients in the Second Affiliated Hospital of Baotou Medical College were prospectively studied. The patients were divided into two groups according to the mode of out-hospital care: a first-aid group of patients who received first-aid care after calling "120" (94 cases), and a self-aid group of patients sent to hospital by relatives (89 cases). The proportions of persons with more than a high school education and with better knowledge of emergency care for patients with heart disease were higher in the first-aid group than in the self-aid group (50.0% vs. 29.2% and 83.0% vs. 60.7%, both P<0.05). After arrival at the emergency room, all patients were treated according to our standard procedure and then registered. All patients were followed up at the end of the first and third month after illness. Cardiovascular events were mainly myocardial infarction (61.7%) among the 183 patients. There were statistically significant differences between the two groups in self-aid response time, first disposal time and out-hospital rescuing time [(32.3+/-5.6) minutes vs. (89.6+/-8.4) minutes, (47.3+/-7.3) minutes vs. (149.8+/-13.5) minutes, and (61.7+/-8.3) minutes vs. (149.8+/-13.5) minutes, all P<0.05]. Morbidity rate was lower in the first-aid group than in the self-aid group in the 1st and 3rd month, respectively (2.1% vs. 9.0%, 4.2% vs. 12.4%, both P<0.05). An out-hospital emergency system and procedure can shorten initial disposal time and out-hospital rescuing time, thus improving patients' prognosis. The education level and health knowledge of patients and their relatives directly affect their mode of arriving at hospital and their prognosis.

  8. Transport mechanisms of soil-bound mercury in the erosion process during rainfall-runoff events.

    Science.gov (United States)

    Zheng, Yi; Luo, Xiaolin; Zhang, Wei; Wu, Xin; Zhang, Juan; Han, Feng

    2016-08-01

    Soil contamination by mercury (Hg) is a global environmental issue. In watersheds with a significant soil Hg storage, soil erosion during rainfall-runoff events can result in nonpoint source (NPS) Hg pollution and therefore, can extend its environmental risk from soils to aquatic ecosystems. Nonetheless, transport mechanisms of soil-bound Hg in the erosion process have not been explored directly, and how different fractions of soil organic matter (SOM) impact transport is not fully understood. This study investigated transport mechanisms based on rainfall-runoff simulation experiments. The experiments simulated high-intensity and long-duration rainfall conditions, which can produce significant soil erosion and NPS pollution. The enrichment ratio (ER) of total mercury (THg) was the key variable in exploring the mechanisms. The main study findings include the following: First, the ER-sediment flux relationship for Hg depends on soil composition, and no uniform ER-sediment flux function exists for different soils. Second, depending on soil composition, significantly more Hg could be released from a less polluted soil in the early stage of large rainfall events. Third, the heavy fraction of SOM (i.e., the remnant organic matter coating on mineral particles) has a dominant influence on the enrichment behavior and transport mechanisms of Hg, while clay mineral content exhibits a significant, but indirect, influence. The study results imply that it is critical to quantify the SOM composition in addition to total organic carbon (TOC) for different soils in the watershed to adequately model the NPS pollution of Hg and spatially prioritize management actions in a heterogeneous watershed. Copyright © 2016 Elsevier Ltd. All rights reserved.
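    The enrichment-ratio variable at the center of this record can be made concrete. The sketch assumes the conventional definition (THg concentration in eroded sediment divided by that in the source soil); all concentrations are invented for illustration, not the study's measurements.

```python
# ER = THg in sediment / THg in source soil (conventional definition, assumed).
def enrichment_ratio(c_sediment_ng_g: float, c_soil_ng_g: float) -> float:
    return c_sediment_ng_g / c_soil_ng_g

# Illustration of the typical pattern: early, low-flux runoff preferentially
# carries fine, organic-rich (Hg-enriched) particles, so ER starts above 1
# and declines as sediment flux grows.
sediment_thg = [150.0, 120.0, 100.0]   # ng/g, at increasing sediment flux
soil_thg = 100.0                       # ng/g in the source soil
ers = [enrichment_ratio(c, soil_thg) for c in sediment_thg]
print(ers)  # [1.5, 1.2, 1.0]
```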

  9. Self-Referential Processing in Adolescents: Stability of Behavioral and Event-Related Potential Markers

    Science.gov (United States)

    Auerbach, Randy P.; Bondy, Erin; Stanton, Colin H.; Webb, Christian A.; Shankman, Stewart A.; Pizzagalli, Diego A.

    2016-01-01

    The self-referential encoding task (SRET), an implicit measure of self-schema, has been used widely to probe cognitive biases associated with depression, including among adolescents. However, research testing the stability of its behavioral and electrocortical effects is sparse. Therefore, the current study sought to evaluate the stability over time of behavioral markers and event-related potentials (ERPs) elicited by the SRET in healthy female adolescents (n = 31). At baseline, participants were administered a diagnostic interview and a self-report measure of depression severity. In addition, they completed the SRET while 128-channel ERP data were recorded to examine early (P1) and late (late positive potential [LPP]) components. Three months later, participants were re-administered the depression self-report measure and the SRET in conjunction with ERPs. First, results revealed that healthy adolescents endorsed, recalled, and recognized more positive and fewer negative words at each assessment, and these effects were stable over time (rs = 0.44–0.83). Similarly, they showed a faster reaction time when endorsing self-relevant positive words, as opposed to negative words, at both the initial and follow-up assessments (r = 0.82). Second, ERP responses, specifically potentiated P1 and late LPP positivity to positive versus negative words, were consistent over time (rs = 0.56–0.83), and the internal reliability of the ERPs was robust at each time point (rs = 0.52–0.80). As a whole, these medium-to-large effects suggest that the SRET is a reliable behavioral and neural probe of self-referential processing. PMID:27302282
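    The stability estimates reported in this record (rs between repeated sessions) amount to test-retest correlations. A sketch of that computation follows, with invented scores rather than the study's data:

```python
import math

# Pearson correlation between a baseline measure and its 3-month retest.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

baseline  = [12, 15, 9, 20, 14, 11]   # e.g., positive words endorsed, time 1
follow_up = [13, 14, 10, 18, 15, 10]  # same participants at follow-up
r = pearson_r(baseline, follow_up)
print(round(r, 2))  # 0.95 -> high test-retest stability
```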

  10. MATHEMATICAL MODELS OF PROCESSES AND SYSTEMS OF TECHNICAL OPERATION FOR ONBOARD COMPLEXES AND FUNCTIONAL SYSTEMS OF AVIONICS

    Directory of Open Access Journals (Sweden)

    Sergey Viktorovich Kuznetsov

    2017-01-01

    Full Text Available Modern aircraft are equipped with complicated systems and complexes of avionics. The technical operation process of an aircraft and its avionics is observed as a process with changing operation states. Mathematical models of avionics processes and systems of technical operation are represented as Markov chains, Markov and semi-Markov processes. The purpose is to develop graph-models of avionics technical operation processes, describing their work in flight as well as during maintenance on the ground in the various systems of technical operation. The graph-models of processes and systems of on-board complexes and functional avionics systems in flight are proposed. They are based on the state tables. The models are specified for the various technical operation systems: the system with control of the reliability level, the system with parameters control and the system with resource control. The events which cause the avionics complexes and functional systems to change their technical state are failures and faults of built-in test equipment. The avionics system of technical operation with reliability level control is applicable for objects with a constant or slowly varying failure rate. The avionics system of technical operation with resource control is mainly used for objects with a failure rate that increases over time. The avionics system of technical operation with parameters control is used for objects with a failure rate that increases over time and with generalized parameters, which can provide forecasting and assign the borders of before-failure technical states. The proposed formal graphical approach to designing models of avionics complexes and systems is the basis for constructing models of complex systems and facilities, both for a single aircraft and for an airline aircraft fleet, or even for the entire fleet of some specific aircraft type. The ultimate graph-models for avionics in various systems of technical operation permit the beginning of
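    The Markov-chain view of technical-operation states described in this record can be sketched as a small simulation. The state set and transition probabilities below are illustrative assumptions, not values from the paper:

```python
import random

# Discrete-time Markov chain over hypothetical technical-operation states.
P = {
    "operating":     {"operating": 0.90, "built-in test": 0.05, "failed": 0.05},
    "built-in test": {"operating": 0.70, "maintenance": 0.30},
    "maintenance":   {"operating": 1.00},
    "failed":        {"maintenance": 1.00},
}

def step(state, rng):
    """Sample the next state from the current state's transition row."""
    r, acc = rng.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if r < acc:
            return nxt
    return state  # numerical safety net; rows sum to 1.0

rng = random.Random(0)       # fixed seed for a reproducible run
state = "operating"
visits = {s: 0 for s in P}
for _ in range(10_000):
    visits[state] += 1
    state = step(state, rng)
print(max(visits, key=visits.get))  # the chain spends most of its time operating
```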

  11. Effects of Age and Working Memory Load on Syntactic Processing: An Event-Related Potential Study

    Directory of Open Access Journals (Sweden)

    Graciela C. Alatorre-Cruz

    2018-05-01

    Full Text Available Cognitive changes in aging include working memory (WM) decline, which may hamper language comprehension. An increase in WM demands in older adults would probably provoke poorer sentence processing performance in this age group. One way to increase the WM load is to separate two lexical units in an agreement relation (i.e., adjective and noun) in a given sentence. To test this hypothesis, event-related potentials (ERPs) were collected from Spanish speakers (30 older adults, mean age = 66.06 years; and 30 young adults, mean age = 25.7 years) who read sentences to detect grammatical errors. The sentences varied with regard to (1) the gender agreement of the noun and adjective, where the gender of the adjective either agreed or disagreed with the noun, and (2) the WM load (i.e., the number of words between the noun and adjective in the sentence). No significant behavioral differences between groups were observed in response accuracy, but older adults showed longer reaction times regardless of WM load condition. Compared with young participants, older adults showed a different pattern of ERP components, characterized by smaller amplitudes of the LAN, P600a, and P600b effects when the WM load was increased. A smaller LAN effect probably reflects greater difficulties in processing the morpho-syntactic features of the sentence, while smaller P600a and P600b effects could be related to difficulties in recovering and mapping all sentence constituents. We conclude that the ERP pattern in older adults reveals subtle problems in syntactic processing when the WM load is increased, which were not sufficient to affect response accuracy but were observed as longer reaction times.

  12. The development of control processes supporting source memory discrimination as revealed by event-related potentials.

    Science.gov (United States)

    de Chastelaine, Marianne; Friedman, David; Cycowicz, Yael M

    2007-08-01

    Improvement in source memory performance throughout childhood is thought to be mediated by the development of executive control. As postretrieval control processes may be better time-locked to the recognition response rather than the retrieval cue, the development of processes underlying source memory was investigated with both stimulus- and response-locked event-related potentials (ERPs). These were recorded in children, adolescents, and adults during a recognition memory exclusion task. Green- and red-outlined pictures were studied, but were tested in black outline. The test requirement was to endorse old items shown in one study color ("targets") and to reject new items along with old items shown in the alternative study color ("nontargets"). Source memory improved with age. All age groups retrieved target and nontarget memories as reflected by reliable parietal episodic memory (EM) effects, a stimulus-locked ERP correlate of recollection. Response-locked ERPs to targets and nontargets diverged in all groups prior to the response, although this occurred at an increasingly earlier time point with age. We suggest these findings reflect the implementation of attentional control mechanisms to enhance target memories and facilitate response selection with the greatest and least success, respectively, in adults and children. In adults only, response-locked ERPs revealed an early-onsetting parietal negativity for nontargets, but not for targets. This was suggested to reflect adults' ability to consistently inhibit prepotent target responses for nontargets. The findings support the notion that the development of source memory relies on the maturation of control processes that serve to enhance accurate selection of task-relevant memories.

  13. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making is benefited from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
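    The hybrid time-/event-driven decision making described in this record can be sketched very simply. The signal bounds, the event vocabulary, and the expected operation sequence below are invented for illustration; the point is only that an anomaly verdict can draw on both layers at once.

```python
# Time-driven layer: is the measured signal within its expected band?
def time_driven_ok(samples, low, high):
    return all(low <= s <= high for s in samples)

# Event-driven layer: do the observed discrete events contain the expected
# operation sequence, in order (other events may be interleaved)?
EXPECTED_SEQUENCE = ["load", "heat", "separate", "unload"]

def event_driven_ok(events):
    it = iter(events)
    # `step in it` consumes the iterator, enforcing in-order matching.
    return all(step in it for step in EXPECTED_SEQUENCE)

def anomalous(samples, events, low=20.0, high=80.0):
    return not (time_driven_ok(samples, low, high) and event_driven_ok(events))

print(anomalous([45.0, 50.2, 61.3], ["load", "heat", "separate", "unload"]))  # False
print(anomalous([45.0, 95.0, 61.3], ["load", "heat", "separate", "unload"]))  # True (signal out of band)
print(anomalous([45.0, 50.2, 61.3], ["load", "separate", "heat", "unload"]))  # True (sequence violated)
```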

  14. Fungal Iron Availability during Deep Seated Candidiasis Is Defined by a Complex Interplay Involving Systemic and Local Events

    Science.gov (United States)

    Potrykus, Joanna; Stead, David; MacCallum, Donna M.; Urgast, Dagmar S.; Raab, Andrea; van Rooijen, Nico; Feldmann, Jörg; Brown, Alistair J. P.

    2013-01-01

    Nutritional immunity – the withholding of nutrients by the host – has long been recognised as an important factor that shapes bacterial-host interactions. However, the dynamics of nutrient availability within local host niches during fungal infection are poorly defined. We have combined laser ablation-inductively coupled plasma mass spectrometry (LA-ICP MS), MALDI imaging and immunohistochemistry with microtranscriptomics to examine iron homeostasis in the host and pathogen in the murine model of systemic candidiasis. Dramatic changes in the renal iron landscape occur during disease progression. The infection perturbs global iron homeostasis in the host leading to iron accumulation in the renal medulla. Paradoxically, this is accompanied by nutritional immunity in the renal cortex as iron exclusion zones emerge locally around fungal lesions. These exclusion zones correlate with immune infiltrates and haem oxygenase 1-expressing host cells. This local nutritional immunity decreases iron availability, leading to a switch in iron acquisition mechanisms within mature fungal lesions, as revealed by laser capture microdissection and qRT-PCR analyses. Therefore, a complex interplay of systemic and local events influences iron homeostasis and pathogen-host dynamics during disease progression. PMID:24146619

  15. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    Science.gov (United States)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, accurate prediction of these water resources is further complicated by frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally increased soil moisture, thus increasing plant growth that in turn impacts subsequent snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high-resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold-region hydrological systems. Key features include an efficient terrain representation that allows simulations at various spatial scales with reduced computational overhead, and a modular process representation that allows for an alternative-hypothesis framework. Using both physics-based and conceptual process representations, sourced from long-term process studies and the current cold-regions literature, allows for comparison of process representations and, importantly, of their ability to produce emergent behaviours. Examining the system in this holistic, process-based manner can hopefully yield important insights and aid in the development of improved process representations.

  16. BPMN as a Communication Language for the Process- and Event-Oriented Perspectives in Fact-Oriented Conceptual Models

    Science.gov (United States)

    Bollen, Peter

    In this paper we will show how the OMG specification of BPMN (Business Process Modeling Notation) can be used to model the process- and event-oriented perspectives of an application subject area. We will illustrate how the fact-oriented conceptual models for the information-, process- and event perspectives can be used in a 'bottom-up' approach for creating a BPMN model in combination with other approaches, e.g. the use of a textual description. We will use the common doctor's office example as a running example in this article.

  17. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model, that complex problem solving teams pass through, and integrating the relevant teamwork skills for interdependently working teams into the model and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. Therefore, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes, like coordination, such as emergent states like collective orientation, cohesion, and trust and that dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making, and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, which takes a maximum of 15 min each. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships between trust and action processes and team performance exist. Likewise, no relationships were found for cohesion. Only collective orientation of team members positively influences team performance in complex environments mediated by action processes such as

  18. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

Complex problem solving is a challenging, high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional dimension has to be considered, as teamwork processes increase the demands already placed on individual team members. After introducing an idealized teamwork process model that complex problem-solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study in complex situations are presented. The study focused on factors that influence action processes such as coordination, in particular emergent states (collective orientation, cohesion, and trust) that dynamically enable effective teamwork in complex situations. Before the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min. All teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated from a log-file analysis. The results show no relationship between trust and either action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination.

  19. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Directory of Open Access Journals (Sweden)

    Vera Hagemann

    2017-09-01

Full Text Available Complex problem solving is a challenging, high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional dimension has to be considered, as teamwork processes increase the demands already placed on individual team members. After introducing an idealized teamwork process model that complex problem-solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study in complex situations are presented. The study focused on factors that influence action processes such as coordination, in particular emergent states (collective orientation, cohesion, and trust) that dynamically enable effective teamwork in complex situations. Before the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min. All teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated from a log-file analysis. The results show no relationship between trust and either action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination.
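The median-split assignment described in this study can be sketched as follows. The questionnaire scores below are invented for illustration, and the tie-breaking rule (participants exactly at the median go to the low group) is an assumption, since the paper does not specify one.

```python
from statistics import median

def median_split(scores):
    """Split participants into high/low groups at the sample median.

    Ties at the median are assigned to the low group (an assumption;
    the paper does not state its tie-breaking rule).
    """
    m = median(scores.values())
    high = {p for p, s in scores.items() if s > m}
    low = {p for p, s in scores.items() if s <= m}
    return high, low

# Hypothetical collective-orientation questionnaire scores (1-5 scale).
scores = {"P01": 4.2, "P02": 2.1, "P03": 3.8, "P04": 2.9, "P05": 4.6, "P06": 3.1}
high, low = median_split(scores)
```

In the study itself the split was applied to the full participant pool before forming two-person teams within each group.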

  20. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

Full Text Available Abstract Background The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  1. Ionic flotation of complexing oxyanions. Thermodynamics of uranyl complexation in a sulfuric medium. Definition of selectivity conditions of the process

    International Nuclear Information System (INIS)

    Bouzat, G.

    1987-01-01

Oxyanion ionic flotation with dodecylamine hydrochloride as collector is studied through flotation and filtration recovery curves, suspension turbidity, conductimetric measurements, and the solubility of ionic species. The process is controlled by chemical precipitation reactions and by adsorption of the surfactant, which confers hydrophobic or hydrophilic surface properties on the solid phase when one or two monolayers of surfactant, respectively, are successively adsorbed. Equilibrium constants (in terms of activity) for the complexation of uranium(VI) with sulphate oxyanions are calculated from a Raman spectroscopic study of uranyl sulphate aqueous solutions. Complexation shifts the symmetrical stretching vibration band of UO2 to lower wavenumbers and increases its cross section. The solubility curves of ionic species in the precipitation of uranyl-sulphate complexes by dodecylamine hydrochloride are modelled. Knowledge of the solubility products, the activity coefficients of the species, and the critical micellar concentration of the surfactant allows the flotation recovery curves to be modelled. Temperature and collector structure modifications are studied as optimization parameters, and uranium extraction from mine drainage water is carried out. The residual concentration of surfactant is considerably lowered by adsorption on montmorillonite.
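The precipitation control described in this abstract can be illustrated with a minimal solubility-product check. The 1:1 stoichiometry, unit activity coefficients, and the numeric Ksp below are assumptions for illustration only, not values from the study (which uses measured activity coefficients).

```python
def precipitates(cation_activity, anion_activity, ksp):
    """True if the ion activity product of a 1:1 salt exceeds its
    solubility product, i.e. a solid phase is expected to form.
    Activities are approximated by concentrations here (assumption)."""
    return cation_activity * anion_activity > ksp

KSP = 1.0e-10  # hypothetical solubility product of the amine-oxyanion salt
forms_solid = precipitates(1e-4, 1e-4, KSP)          # IAP = 1e-8 > Ksp
stays_dissolved = not precipitates(1e-6, 1e-6, KSP)  # IAP = 1e-12 < Ksp
```

Comparing the ion activity product against Ksp in this way is what, in the paper, fixes the residual metal concentration left after flotation.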

  2. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    International Nuclear Information System (INIS)

    J. J. Tappen

    2003-01-01

    The purpose of this revision of ''Evaluation of the Applicability of Biosphere-Related Features, Events, and Processes (FEPs)'' (BSC 2001) is to document the screening analysis of biosphere-related primary FEPs, as identified in ''The Development of Information Catalogued in REV00 of the YMP FEP Database'' (Freeze et al. 2001), in accordance with the requirements of the final U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR Part 63. This database is referred to as the Yucca Mountain Project (YMP) FEP Database throughout this document. Those biosphere-related primary FEPs that are screened as applicable will be used to develop the conceptual model portion of the biosphere model, which will in turn be used to develop the mathematical model portion of the biosphere model. As part of this revision, any reference to the screening guidance or criteria provided either by Dyer (1999) or by the proposed NRC regulations at 64 FR 8640 has been removed. The title of this revision has been changed to more accurately reflect the purpose of the analyses. In addition, this revision will address Item Numbers 19, 20, 21, 25, and 26 from Attachment 2 of ''U.S. Nuclear Regulatory Commission/U.S. Department of Energy Technical Exchange and Management Meeting on Total System Performance Assessment and Integration (August 6 through 10, 2001)'' (Reamer 2001). This Scientific Analysis Report (SAR) does not support the current revision to the YMP FEP Database (Freeze et al. 2001). Subsequent to the release of the YMP FEP Database (Freeze et al. 2001), a series of reviews was conducted on both the FEP processes used to support Total System Performance Assessment for Site Recommendation and to develop the YMP FEP Database. In response to observations and comments from these reviews, particularly the NRC/DOE TSPA Technical Exchange in August 2001 (Reamer 2001), several Key Technical Issue (KTI) Agreements were developed. ''The Enhanced Plan for Features, Events and Processes

  3. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    J. J. Tappen

    2003-02-16

    The purpose of this revision of ''Evaluation of the Applicability of Biosphere-Related Features, Events, and Processes (FEPs)'' (BSC 2001) is to document the screening analysis of biosphere-related primary FEPs, as identified in ''The Development of Information Catalogued in REV00 of the YMP FEP Database'' (Freeze et al. 2001), in accordance with the requirements of the final U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR Part 63. This database is referred to as the Yucca Mountain Project (YMP) FEP Database throughout this document. Those biosphere-related primary FEPs that are screened as applicable will be used to develop the conceptual model portion of the biosphere model, which will in turn be used to develop the mathematical model portion of the biosphere model. As part of this revision, any reference to the screening guidance or criteria provided either by Dyer (1999) or by the proposed NRC regulations at 64 FR 8640 has been removed. The title of this revision has been changed to more accurately reflect the purpose of the analyses. In addition, this revision will address Item Numbers 19, 20, 21, 25, and 26 from Attachment 2 of ''U.S. Nuclear Regulatory Commission/U.S. Department of Energy Technical Exchange and Management Meeting on Total System Performance Assessment and Integration (August 6 through 10, 2001)'' (Reamer 2001). This Scientific Analysis Report (SAR) does not support the current revision to the YMP FEP Database (Freeze et al. 2001). Subsequent to the release of the YMP FEP Database (Freeze et al. 2001), a series of reviews was conducted on both the FEP processes used to support Total System Performance Assessment for Site Recommendation and to develop the YMP FEP Database. In response to observations and comments from these reviews, particularly the NRC/DOE TSPA Technical Exchange in August 2001 (Reamer 2001), several Key Technical Issue (KTI) Agreements were developed

  4. Potentially disruptive hydrologic features, events and processes at the Yucca Mountain Site, Nevada

    International Nuclear Information System (INIS)

    Hoxie, D.T.

    1995-01-01

    Yucca Mountain, Nevada, has been selected by the United States to be evaluated as a potential site for the development of a geologic repository for the disposal of spent nuclear fuel and high-level radioactive waste. If the site is determined to be suitable for repository development and construction is authorized, the repository at the Yucca Mountain site is planned to be constructed in unsaturated tuff at a depth of about 250 meters below land surface and at a distance of about 250 meters above the water table. The intent of locating a repository in a thick unsaturated-zone geohydrologic setting, such as occurs at Yucca Mountain under the arid to semi-arid climatic conditions that currently prevail in the region, is to provide a natural setting for the repository system in which little ground water will be available to contact emplaced waste or to transport radioactive material from the repository to the biosphere. In principle, an unsaturated-zone repository will be vulnerable to water entry from both above and below. Consequently, a major effort within the site-characterization program at the Yucca Mountain site is concerned with identifying and evaluating those features, events, and processes, such as increased net infiltration or water-table rise, whose presence or future occurrence could introduce water into a potential repository at the site in quantities sufficient to compromise the waste-isolation capability of the repository system

  5. Cannabis-Related Problems and Social Anxiety: The Mediational Role of Post-Event Processing.

    Science.gov (United States)

    Ecker, Anthony H; Buckner, Julia D

    2018-01-02

Cannabis is the most commonly used illicit drug in the US, and is associated with a range of psychological, social, and physical health-related problems. Individuals who endorse elevated levels of social anxiety are especially at risk for experiencing cannabis-related problems, including cannabis use disorder, despite not using cannabis more often than those with more normative social anxiety. Identification of mechanisms that underlie the relationship between social anxiety and cannabis-related problems may inform treatment and prevention efforts. Post-event processing (PEP, i.e., cognitively reviewing past social interactions/performances) is a social anxiety-related phenomenon that may be one such mechanism. The current study sought to test PEP as a mediator of the relationship between social anxiety and cannabis-related problems, adjusting for cannabis use frequency. Cannabis-using (past 3-month) undergraduate students recruited in 2015 (N = 244; 76.2% female; 74.2% Non-Hispanic Caucasian) completed an online survey of cannabis use, cannabis-related problems, social anxiety, and PEP. The bootstrap estimate of the indirect effect of social anxiety through PEP was significant, suggesting that PEP mediates the relationship between social anxiety and cannabis-related problems. Conclusions/Importance: Treatment and prevention efforts may benefit from targeting PEP among individuals with elevated social anxiety and cannabis-related problems.
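The bootstrap test of an indirect effect can be sketched as follows. This is a simplified product-of-slopes version on simulated data; the study's actual estimate additionally adjusts for cannabis use frequency, and all numbers here are synthetic.

```python
import random

def _slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return num / sum((xi - mx) ** 2 for xi in x)

def indirect_effect(sa, pep, prob):
    """Product-of-slopes indirect effect of social anxiety (sa) on
    problems (prob) through post-event processing (pep): a * b."""
    return _slope(sa, pep) * _slope(pep, prob)

def bootstrap_ci(sa, pep, prob, n_boot=2000, seed=7):
    """Percentile bootstrap 95% CI: resample cases with replacement
    and take the 2.5th and 97.5th percentiles of the estimates."""
    rng = random.Random(seed)
    n = len(sa)
    estimates = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        estimates.append(indirect_effect([sa[i] for i in idx],
                                         [pep[i] for i in idx],
                                         [prob[i] for i in idx]))
    estimates.sort()
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]

# Synthetic data with a built-in indirect effect of roughly 0.6 * 0.5 = 0.3.
gen = random.Random(1)
sa = [gen.gauss(0, 1) for _ in range(100)]
pep = [0.6 * x + gen.gauss(0, 0.5) for x in sa]
prob = [0.5 * m + gen.gauss(0, 0.5) for m in pep]
lo, hi = bootstrap_ci(sa, pep, prob)  # a CI excluding zero indicates mediation
```

A confidence interval that does not contain zero is the criterion reported as "significant" in the abstract.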

  6. Altered processing of visual emotional stimuli in posttraumatic stress disorder: an event-related potential study.

    Science.gov (United States)

    Saar-Ashkenazy, Rotem; Shalev, Hadar; Kanthak, Magdalena K; Guez, Jonathan; Friedman, Alon; Cohen, Jonathan E

    2015-08-30

Patients with posttraumatic stress disorder (PTSD) display abnormal emotional processing and bias towards emotional content. Most neurophysiological studies in PTSD found higher amplitudes of event-related potentials (ERPs) in response to trauma-related visual content. Here we aimed to characterize brain electrical activity in PTSD subjects in response to non-trauma-related emotion-laden pictures (positive, neutral and negative). A combined behavioral-ERP study was conducted in 14 severe PTSD patients and 14 controls. Response time in PTSD patients was slower compared with that in controls, irrespective of emotional valence. In both PTSD and controls, response time to negative pictures was slower compared with that to neutral or positive pictures. Upon ranking, both control and PTSD subjects similarly discriminated between pictures with different emotional valences. ERP analysis revealed three distinctive components (at ~300, ~600 and ~1000 ms post-stimulus onset) for emotional valence in control subjects. In contrast, PTSD patients displayed a similar brain response across all emotional categories, resembling the response of controls to negative stimuli. We interpret these findings as a brain-circuit response tendency towards negative overgeneralization in PTSD. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. The impact of cognitive behavioral therapy on post event processing among those with social anxiety disorder.

    Science.gov (United States)

    Price, Matthew; Anderson, Page L

    2011-02-01

Individuals with social anxiety are prone to engage in post-event processing (PEP), a post-mortem review of a social interaction that focuses on its negative elements. The extent to which PEP is affected by cognitive behavioral therapy (CBT), and the relation between PEP and change during treatment, have yet to be evaluated in a controlled study. The current study used multilevel modeling to determine whether PEP decreased as a result of treatment and whether PEP limits treatment response for two types of cognitive behavioral treatment: a group-based cognitive behavioral intervention and individually based virtual reality exposure. These hypotheses were evaluated using 91 participants diagnosed with social anxiety disorder. The findings suggested that PEP decreased as a result of treatment, and that social anxiety symptoms for individuals reporting greater levels of PEP improved at a slower rate than those with lower levels of PEP. Further research is needed to understand why PEP attenuates response to treatment. Copyright © 2010 Elsevier Ltd. All rights reserved.
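The growth-model logic can be illustrated with a crude two-stage stand-in for multilevel modeling: fit an ordinary least-squares slope for each participant's PEP trajectory, then average the slopes. The trajectories below are simulated; the study itself models all participants jointly in a proper multilevel framework, which this sketch only approximates.

```python
import random

def ols_slope(t, y):
    """Ordinary least-squares slope of y on t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    num = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
    return num / sum((ti - mt) ** 2 for ti in t)

gen = random.Random(3)
sessions = list(range(8))           # measurement occasions over treatment
slopes = []
for person in range(30):
    base = gen.gauss(60, 5)         # simulated starting PEP level
    decline = gen.gauss(-2.0, 0.5)  # person-specific change per session
    y = [base + decline * t + gen.gauss(0, 2) for t in sessions]
    slopes.append(ols_slope(sessions, y))

mean_slope = sum(slopes) / len(slopes)  # negative => PEP decreases over treatment
```

A reliably negative mean slope corresponds to the paper's finding that PEP decreased as a result of treatment.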

  8. Hemispheric Lateralization of Event-Related Brain Potentials in Different Processing Phases during Unimanual Finger Movements

    Directory of Open Access Journals (Sweden)

    Yi-Wen Li

    2008-04-01

Full Text Available Previous functional MRI and brain electrophysiology studies have examined left-right differences during tapping tasks and found that activation of the left hemisphere was more pronounced than that of the right hemisphere. In this study, we sought to delineate this lateralization phenomenon not only in the execution phase but also in other processing phases, such as the early visual, pre-executive and post-executive phases. We designed a finger-tapping task to delineate the left-right differences of event-related potentials (ERPs) to right finger movement in sixteen right-handed college students. The mean amplitudes of the ERPs were analyzed to examine the left-right dominance of cortical activity in the phases of early visual processing (75-120 ms), pre-execution (175-260 ms), execution (310-420 ms) and post-execution (420-620 ms). In the execution phase, ERPs at the left electrodes were significantly more pronounced than those at the right electrodes (F3 > F4, C3 > C4, P3 > P4, O1 > O2), excluding the central electrodes (Fz, Cz, Pz, and Oz) from the comparison. No difference was found between left and right electrodes in the other three phases, except that C3 remained more dominant than C4 in the pre- and post-execution phases. In conclusion, brain lateralization occurs mainly in the execution phase. The central area also showed lateralization in the pre- and post-execution phases, demonstrating its unique lateralized contribution to unilateral simple finger movements.
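The windowed mean amplitudes analyzed in this study can be computed as follows. The latency windows come from the abstract; the sampling rate and the toy waveform are assumptions for illustration.

```python
def mean_amplitude(samples, srate_hz, t0_ms, t1_ms):
    """Average amplitude (e.g. in microvolts) of an epoch between two
    post-stimulus latencies; sample 0 is taken as stimulus onset."""
    i0 = round(t0_ms * srate_hz / 1000)
    i1 = round(t1_ms * srate_hz / 1000)
    window = samples[i0:i1]
    return sum(window) / len(window)

SRATE = 1000  # Hz (assumed; the paper does not state its sampling rate here)
# Toy epoch: 0 uV everywhere except a 5 uV plateau during 310-420 ms.
epoch = [5.0 if 310 <= t < 420 else 0.0 for t in range(700)]

windows = {"early visual": (75, 120), "pre-execution": (175, 260),
           "execution": (310, 420), "post-execution": (420, 620)}
means = {name: mean_amplitude(epoch, SRATE, a, b) for name, (a, b) in windows.items()}
```

In the study such per-window means are computed per electrode (F3/F4, C3/C4, etc.) and then compared between hemispheres.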

  9. Disentangling the complex evolutionary history of the Western Palearctic blue tits (Cyanistes spp.) - phylogenomic analyses suggest radiation by multiple colonization events and subsequent isolation.

    Science.gov (United States)

    Stervander, Martin; Illera, Juan Carlos; Kvist, Laura; Barbosa, Pedro; Keehnen, Naomi P; Pruisscher, Peter; Bensch, Staffan; Hansson, Bengt

    2015-05-01

    Isolated islands and their often unique biota continue to play key roles for understanding the importance of drift, genetic variation and adaptation in the process of population differentiation and speciation. One island system that has inspired and intrigued evolutionary biologists is the blue tit complex (Cyanistes spp.) in Europe and Africa, in particular the complex evolutionary history of the multiple genetically distinct taxa of the Canary Islands. Understanding Afrocanarian colonization events is of particular importance because of recent unconventional suggestions that these island populations acted as source of the widespread population in mainland Africa. We investigated the relationship between mainland and island blue tits using a combination of Sanger sequencing at a population level (20 loci; 12 500 nucleotides) and next-generation sequencing of single population representatives (>3 200 000 nucleotides), analysed in coalescence and phylogenetic frameworks. We found (i) that Afrocanarian blue tits are monophyletic and represent four major clades, (ii) that the blue tit complex has a continental origin and that the Canary Islands were colonized three times, (iii) that all island populations have low genetic variation, indicating low long-term effective population sizes and (iv) that populations on La Palma and in Libya represent relicts of an ancestral North African population. Further, demographic reconstructions revealed (v) that the Canary Islands, conforming to traditional views, hold sink populations, which have not served as source for back colonization of the African mainland. Our study demonstrates the importance of complete taxon sampling and an extensive multimarker study design to obtain robust phylogeographical inferences. © 2015 John Wiley & Sons Ltd.

  10. Study of multiple hard-scatter processes from different p p interactions in the same ATLAS event

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    Given the large integrated luminosity and the large average pileup in the 2017 ATLAS dataset, the probability of having two leptonically decaying Z bosons originating from separate $pp$ interactions in the same LHC bunch crossing becomes non-negligible. Such events are searched for and the number observed compared with the expectation. These types of events (also for the case involving other hard scatter processes, such as W, photon or top quark production, in the same event) may cause additional backgrounds for particular physics analyses, and therefore this background must be accounted for when relevant.
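Because separate pp interactions in a bunch crossing are independent, the chance of two such hard scatters in the same event follows Poisson statistics. A minimal sketch, with illustrative numbers that are assumptions, not ATLAS's actual values:

```python
import math

def prob_two_or_more(mu_pileup, p_per_interaction):
    """Poisson probability that at least 2 interactions in one bunch
    crossing each produce the hard-scatter signature (e.g. Z -> ll):
    with rate lam = mu * p, P(>=2) = 1 - exp(-lam) * (1 + lam)."""
    lam = mu_pileup * p_per_interaction
    return 1.0 - math.exp(-lam) * (1.0 + lam)

# Hypothetical numbers for illustration only:
mu = 38        # assumed average pileup interactions per crossing, 2017-like
p_z = 1.5e-6   # assumed chance a single pp interaction yields a leptonic Z
p2 = prob_two_or_more(mu, p_z)
```

Even a per-crossing probability this small becomes a visible event count once multiplied by the ~10^15 bunch crossings in a year's dataset, which is why the note searches for such events.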

  11. Children's Verbal Working Memory: Role of Processing Complexity in Predicting Spoken Sentence Comprehension

    Science.gov (United States)

    Magimairaj, Beula M.; Montgomery, James W.

    2012-01-01

    Purpose: This study investigated the role of processing complexity of verbal working memory tasks in predicting spoken sentence comprehension in typically developing children. Of interest was whether simple and more complex working memory tasks have similar or different power in predicting sentence comprehension. Method: Sixty-five children (6- to…

  12. Using the calculational simulating complexes when making the computer process control systems for NPP

    International Nuclear Information System (INIS)

    Zimakov, V.N.; Chernykh, V.P.

    1998-01-01

The problems of creating calculational simulating complexes (CSC) and applying them in the development of software and software-hardware systems for computer-aided process control at NPPs are considered. The above complex is based on an all-mode real-time mathematical model running on a dedicated suite of computer equipment.

  13. The Effects of Visual Complexity for Japanese Kanji Processing with High and Low Frequencies

    Science.gov (United States)

    Tamaoka, Katsuo; Kiyama, Sachiko

    2013-01-01

    The present study investigated the effects of visual complexity for kanji processing by selecting target kanji from different stroke ranges of visually simple (2-6 strokes), medium (8-12 strokes), and complex (14-20 strokes) kanji with high and low frequencies. A kanji lexical decision task in Experiment 1 and a kanji naming task in Experiment 2…

  14. Role of ion chromatograph in nuclear fuel fabrication process at Nuclear Fuel Complex

    International Nuclear Information System (INIS)

    Balaji Rao, Y.; Prasada Rao, G.; Prahlad, B.; Saibaba, N.

    2012-01-01

The present paper discusses the different applications of ion chromatography used in the nuclear fuel fabrication process at the Nuclear Fuel Complex. Further applications of IC for the characterization of nuclear materials, currently at various stages of method development at the Control Laboratory, Nuclear Fuel Complex, are also highlighted.

  15. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek; P. Rogers

    2004-10-27

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of biosphere features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for the license application (LA). A screening decision, either ''Included'' or ''Excluded'', is given for each FEP along with the corresponding technical basis for the excluded FEPs and the descriptions of how the included FEPs were incorporated in the biosphere model. This information is required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs addressed in this report concern characteristics of the reference biosphere, the receptor, and the environmental transport and receptor exposure pathways for the groundwater and volcanic ash exposure scenarios considered in biosphere modeling. This revision provides the summary of the implementation of included FEPs in TSPA-LA, (i.e., how the FEP is included); for excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report is one of the 10 documents constituting the biosphere model documentation suite. A graphical representation of the documentation hierarchy for the biosphere model is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling. The ''Biosphere Model Report'' describes in detail the biosphere conceptual model and mathematical model. The input parameter reports shown to the right of the ''Biosphere Model Report'' contain detailed descriptions of the model input parameters and their development. Outputs from these six reports are used in the ''Nominal Performance Biosphere Dose Conversion Factor Analysis and Disruptive Event Biosphere Dose Conversion Factor Analysis

  16. Isotopic constraints on contamination processes in the Tonian Goiás Stratiform Complex

    Science.gov (United States)

    Giovanardi, Tommaso; Mazzucchelli, Maurizio; Lugli, Federico; Girardi, Vicente A. V.; Correia, Ciro T.; Tassinari, Colombo C. G.; Cipriani, Anna

    2018-06-01

The Tonian Goiás Stratiform Complex (TGSC, Goiás, central Brazil) is one of the largest mafic-ultramafic layered complexes in the world, emplaced during the geotectonic events that led to the accretion of Gondwana. In this study, we present trace-element and in-situ U/Pb and Lu-Hf analyses of zircons and 87Sr/86Sr ratios of plagioclases from anorthosites and gabbros of the TGSC. Although formed by three isolated bodies (Cana Brava, Niquelândia and Barro Alto), and characterized by a Lower and Upper Sequence (LS and US), our new U/Pb zircon data confirm recent geochemical, geochronological, and structural evidence that the TGSC originated from a single intrusive body in the Neoproterozoic. New Hf and Sr isotope ratios reveal a complex contamination history for the TGSC, with different geochemical signatures in the two sequences. The low Hf and high Sr isotope ratios of the Lower Sequence (εHf(t) from -4.2 down to -27.5; 87Sr/86Sr = 0.706605-0.729226) suggest the presence of a crustal component and are consistent with contamination from meta-pelitic and calc-silicate rocks found as xenoliths within the sequence. The more radiogenic Hf isotope ratios and low Sr isotope composition of the Upper Sequence (εHf(t) from 11.3 down to -8.4; 87Sr/86Sr = 0.702368-0.702452) suggest contamination from mantle-derived metabasalts, in agreement with the occurrence of amphibolite xenoliths in the US stratigraphy. The differential contamination of the two sequences is explained by the intrusion of the TGSC into a stratified crust dominated by metasedimentary rocks in its deeper part and metavolcanics at shallower levels. Moreover, the differential thermal gradient in the two crystallizing sequences might have contributed to the preservation and recrystallization of inherited zircon grains in the US and the total dissolution or magmatic overgrowth of the LS zircons via melt/rock reaction processes.
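The εHf(t) values quoted in this abstract follow the standard epsilon notation: the deviation, in parts per 10^4, of a zircon's initial 176Hf/177Hf from the chondritic reservoir (CHUR) at its crystallization age. A sketch with commonly used CHUR parameters (treated here as assumptions) and a purely hypothetical Tonian zircon:

```python
import math

# Commonly used reference parameters (assumptions for this sketch):
LAMBDA_LU176 = 1.867e-11   # 176Lu decay constant, 1/yr
CHUR_HF = 0.282785         # present-day 176Hf/177Hf of CHUR
CHUR_LU = 0.0336           # present-day 176Lu/177Hf of CHUR

def epsilon_hf(hf_ratio, lu_ratio, t_yr):
    """epsilon-Hf(t): deviation (parts per 1e4) of the sample's initial
    176Hf/177Hf from CHUR at crystallization age t. Both sample and CHUR
    are corrected back in time for 176Lu decay."""
    growth = math.exp(LAMBDA_LU176 * t_yr) - 1.0
    sample_t = hf_ratio - lu_ratio * growth
    chur_t = CHUR_HF - CHUR_LU * growth
    return (sample_t / chur_t - 1.0) * 1.0e4

# Hypothetical zircon with a Tonian age (~800 Ma); ratios are illustrative only.
eps = epsilon_hf(hf_ratio=0.28245, lu_ratio=0.001, t_yr=800e6)
```

Positive εHf(t) indicates a more depleted (mantle-like) source than CHUR, negative values an evolved crustal component, which is how the LS and US signatures above are read.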

  17. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

metadata, task XML object, and WMS/WFS cartographical services interconnects the metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of calculation tasks running on the remote servers, as well as working with WMS/WFS cartographical services, including: obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of such popular desktop GIS applications as uDIG, QuantumGIS, etc. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
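Layer retrieval from a WMS service, as used by the GUI tier described above, reduces to building GetMap requests. A minimal sketch of assembling such a request with the standard WMS 1.1.1 parameters; the endpoint and layer name are hypothetical.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(800, 600),
                   srs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.1.1 GetMap request URL for a single layer.
    bbox is (min_lon, min_lat, max_lon, max_lat) in the given SRS."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "SRS": srs, "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1], "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only:
url = wms_getmap_url("https://geo.example.org/wms", "tmean_anomaly",
                     bbox=(60.0, 40.0, 120.0, 80.0))
```

Swapping FORMAT (e.g. to image/geotiff where the server supports it) is how the export options listed above map onto the same request.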

  18. Complex processing of antimony-mercury gold concentrates of Dzhizhikrut Deposit

    International Nuclear Information System (INIS)

    Abdusalyamova, M.N.; Gadoev, S.A.; Dreisinger, D.; Solozhenkin, P.M.

    2013-01-01

This article is devoted to the complex processing of antimony-mercury gold concentrates from the Dzhizhikrut Deposit. The aim of the research was to obtain metallic mercury and antimony, with subsequent extraction of gold and thallium.

  19. Enhancement of Hydrodynamic Processes in Oil Pipelines Considering Rheologically Complex High-Viscosity Oils

    Science.gov (United States)

    Konakhina, I. A.; Khusnutdinova, E. M.; Khamidullina, G. R.; Khamidullina, A. F.

    2016-06-01

    This paper describes a mathematical model of flow-related hydrodynamic processes for rheologically complex high-viscosity bitumen oil and oil-water suspensions and presents methods to improve the design and performance of oil pipelines.

  20. Investigation of complex formation processes of hydroxypropylmethylcellulose and polymethacrylic acid in aqueous solutions

    Directory of Open Access Journals (Sweden)

    M. Katayeva

    2012-12-01

    Full Text Available The complex formation process of hydroxypropylcellulose (HPC) with polymethacrylic acid (PMA) has been studied using turbidimetric and viscosimetric titration. The position of the maximum takes different values depending on polymer concentration and the molecular mass of the polysaccharide.

  1. Investigation of complex formation processes of hydroxypropylmethylcellulose and polymethacrylic acid in aqueous solutions

    OpenAIRE

    M. Katayeva; R. Mangazbayeva; R. Abdykalykova

    2012-01-01

    The complex formation process of hydroxypropylcellulose (HPC) with polymethacrylic acid (PMA) has been studied using turbidimetric and viscosimetric titration. The position of the maximum takes different values depending on polymer concentration and the molecular mass of the polysaccharide.

  2. A stable isotope approach for source apportionment of chlorinated ethene plumes at a complex multi-contamination events urban site

    Science.gov (United States)

    Nijenhuis, Ivonne; Schmidt, Marie; Pellegatti, Eleonora; Paramatti, Enrico; Richnow, Hans Hermann; Gargini, Alessandro

    2013-10-01

    The stable carbon isotope composition of chlorinated aliphatic compounds such as chlorinated methanes, ethanes and ethenes was examined as an intrinsic fingerprint for the apportionment of sources. A complex field site located in Ferrara (Italy), with a more than 50-year history of use of chlorinated aliphatic compounds, was investigated in order to assess contamination sources. Several contamination plumes were found in a complex alluvial sandy multi-aquifer system close to the river Po; the sources are represented by uncontained former industrial and municipal dump sites as well as by spills at industrial areas. The carbon stable isotope signature made it possible to distinguish two major sources of contaminants. One source of chlorinated aliphatic contaminants was strongly depleted in 13C (-40‰), as is commonly observed in recent production of chlorinated solvents. The degradation processes in the plumes could be traced by interpreting the isotope enrichment and depletion of parent and daughter compounds, respectively. We demonstrate that, under specific production conditions, namely when highly chlorinated ethenes are produced as a by-product of chloromethane production, a 13C-depleted fingerprint of the contaminants can be obtained, which can be used to track sources and identify the party responsible for the pollution in urban areas.

  3. Leadership within Emergent Events in Complex Systems: Micro-Enactments and the Mechanisms of Organisational Learning and Change

    Science.gov (United States)

    Hazy, James K.; Silberstang, Joyce

    2009-01-01

    One tradition within the complexity paradigm considers organisations as complex adaptive systems in which autonomous individuals interact, often in complex ways, with difficult-to-predict, non-linear outcomes. Building upon this tradition, and more specifically following the complex systems leadership theory approach, we describe the ways in which…

  4. Dogs cannot bark: event-related brain responses to true and false negated statements as indicators of higher-order conscious processing.

    Science.gov (United States)

    Herbert, Cornelia; Kübler, Andrea

    2011-01-01

    The present study investigated event-related brain potentials elicited by true and false negated statements to evaluate if discrimination of the truth value of negated information relies on conscious processing and requires higher-order cognitive processing in healthy subjects across different levels of stimulus complexity. The stimulus material consisted of true and false negated sentences (sentence level) and prime-target expressions (word level). Stimuli were presented acoustically and no overt behavioral response of the participants was required. Event-related brain potentials to target words preceded by true and false negated expressions were analyzed both within group and at the single subject level. Across the different processing conditions (word pairs and sentences), target words elicited a frontal negativity and a late positivity in the time window from 600-1000 msec post target word onset. Amplitudes of both brain potentials varied as a function of the truth value of the negated expressions. Results were confirmed at the single-subject level. In sum, our results support recent suggestions according to which evaluation of the truth value of a negated expression is a time- and cognitively demanding process that cannot be solved automatically, and thus requires conscious processing. Our paradigm provides insight into higher-order processing related to language comprehension and reasoning in healthy subjects. Future studies are needed to evaluate if our paradigm also proves sensitive for the detection of consciousness in non-responsive patients.

  5. Dogs cannot bark: event-related brain responses to true and false negated statements as indicators of higher-order conscious processing.

    Directory of Open Access Journals (Sweden)

    Cornelia Herbert

    Full Text Available The present study investigated event-related brain potentials elicited by true and false negated statements to evaluate if discrimination of the truth value of negated information relies on conscious processing and requires higher-order cognitive processing in healthy subjects across different levels of stimulus complexity. The stimulus material consisted of true and false negated sentences (sentence level) and prime-target expressions (word level). Stimuli were presented acoustically and no overt behavioral response of the participants was required. Event-related brain potentials to target words preceded by true and false negated expressions were analyzed both within group and at the single subject level. Across the different processing conditions (word pairs and sentences), target words elicited a frontal negativity and a late positivity in the time window from 600-1000 msec post target word onset. Amplitudes of both brain potentials varied as a function of the truth value of the negated expressions. Results were confirmed at the single-subject level. In sum, our results support recent suggestions according to which evaluation of the truth value of a negated expression is a time- and cognitively demanding process that cannot be solved automatically, and thus requires conscious processing. Our paradigm provides insight into higher-order processing related to language comprehension and reasoning in healthy subjects. Future studies are needed to evaluate if our paradigm also proves sensitive for the detection of consciousness in non-responsive patients.

  6. The N400 and Late Positive Complex (LPC) Effects Reflect Controlled Rather than Automatic Mechanisms of Sentence Processing

    Directory of Open Access Journals (Sweden)

    Boris Kotchoubey

    2012-08-01

    Full Text Available This study compared automatic and controlled cognitive processes that underlie event-related potential (ERP) effects during speech perception. Sentences were presented to French native speakers, and the final word could be congruent or incongruent and presented at one of four levels of degradation (using modulation with pink noise): no degradation, mild degradation (two levels), or strong degradation. We assumed that degradation impairs controlled more than automatic processes. The N400 and Late Positive Complex (LPC) effects were defined as the differences between the corresponding wave amplitudes to incongruent minus congruent words. Under mild degradation, where controlled sentence-level processing could still occur (as indicated by behavioral data), both N400 and LPC effects were delayed and the latter effect was reduced. Under strong degradation, where sentence processing was rather automatic (as indicated by behavioral data), no ERP effect remained. These results suggest that ERP effects elicited in complex contexts, such as sentences, reflect controlled rather than automatic mechanisms of speech processing. These results differ from those of experiments that used word-pair or word-list paradigms.
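
The effect definition used here, incongruent-minus-congruent amplitude averaged over a fixed latency window, can be sketched as follows; the array shapes and toy data are illustrative assumptions, not the study's actual recordings:

```python
import numpy as np

def erp_effect(incongruent, congruent, times, window):
    """ERP effect = mean amplitude of (incongruent - congruent) grand averages
    within a latency window. Inputs are trials x samples arrays; times in ms."""
    lo, hi = window
    mask = (times >= lo) & (times <= hi)
    diff = incongruent.mean(axis=0) - congruent.mean(axis=0)  # difference wave
    return diff[mask].mean()

times = np.arange(0, 1200, 2)  # 2 ms sampling grid
inc = np.zeros((20, times.size)); inc[:, (times >= 600) & (times <= 1000)] = 4.0
con = np.zeros((20, times.size)); con[:, (times >= 600) & (times <= 1000)] = 1.0
print(erp_effect(inc, con, times, (600, 1000)))  # 3.0 µV effect in the window
```

The same helper, applied per subject instead of across trials, would support the single-subject confirmation mentioned in the preceding record.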

  7. Impact of background noise and sentence complexity on cognitive processing demands

    DEFF Research Database (Denmark)

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions imposes cognitive processing demands. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded...... in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results showed......

  8. WORK ALLOCATION IN COMPLEX PRODUCTION PROCESSES: A METHODOLOGY FOR DECISION SUPPORT

    OpenAIRE

    de Mello, Adriana Marotti; School of Economics, Business and Accounting at the University of São Paulo; Marx, Roberto; Polytechnic School, University of São Paulo; Zilbovicius, Mauro; Polytechnic School – University of São Paulo

    2013-01-01

    This article presents the development of a decision-support methodology for work allocation in complex production processes. It is known that this decision is frequently taken empirically and that the methodologies available to support it are few and restricted in their conceptual basis. Time and motion study is one such methodology, but its applicability is restricted in more complex production processes. The method presented here was developed as a result of...

  9. Functional definition of the N450 event-related brain potential marker of conflict processing: a numerical stroop study

    OpenAIRE

    Szűcs, Denes; Soltész, F

    2012-01-01

    BACKGROUND: Several conflict processing studies have aimed to dissociate neuroimaging phenomena related to stimulus and response conflict processing. However, previous studies typically did not include a paradigm-independent measure of either stimulus or response conflict. Here we combined electromyography (EMG) with event-related brain potentials (ERPs) in order to determine whether a particularly robust marker of conflict processing, the N450 ERP effect usually related to the activity of t...

  10. Impacts of natural events and processes on groundwater flow conditions: a case study in the Horonobe Area, Hokkaido, Northern Japan

    International Nuclear Information System (INIS)

    Niizato, T.; Yasue, K.I.; Kurikami, H.

    2009-01-01

    In order to assess the long-term stability of the geological environment over several hundred thousand years, it is important to consider the influence of natural events and processes, such as uplift, subsidence, denudation and climate change, on the geological environment, especially in a tectonically active region such as Japan. This study presents a conceptual model of future natural events and processes with potential impacts on the groundwater flow conditions in the Horonobe area, Hokkaido, northern Japan, on the basis of neotectonics, palaeogeography, palaeoclimate, the historical development of landforms, and the present state of groundwater flow conditions. We conclude that it is important to consider interactions among natural events and processes when describing the best possible approximation of the time variation of the geological environment. (authors)

  11. Reliability of the peer-review process for adverse event rating.

    Directory of Open Access Journals (Sweden)

    Alan J Forster

    Full Text Available Adverse events are poor patient outcomes caused by medical care. Their identification requires the peer review of poor outcomes, which may be unreliable. Combining physician ratings might improve the accuracy of adverse event classification. Objectives: to evaluate the variation in peer-reviewer ratings of adverse outcomes; to determine the impact of this variation on estimates of reviewer accuracy; and to determine the number of reviewers judging that an adverse event occurred that is required to ensure that the true probability of an adverse event exceeds 50%, 75% or 95%. Thirty physicians rated 319 case reports giving details of poor patient outcomes following hospital discharge. They rated whether medical management caused the outcome using a six-point ordinal scale. We conducted latent class analyses to estimate the prevalence of adverse events as well as the sensitivity and specificity of each reviewer. We used this model and Bayesian calculations to determine the probability that an adverse event truly occurred for each patient as a function of the number of positive ratings. The overall median score on the 6-point ordinal scale was 3 (IQR 2-4), but the individual rater median score ranged from a minimum of 1 (in four reviewers) to a maximum median score of 5. The overall percentage of cases rated as an adverse event was 39.7% (3798/9570). The median kappa for all pair-wise combinations of the 30 reviewers was 0.26 (IQR 0.16-0.42; Min = -0.07, Max = 0.62). Reviewer sensitivity and specificity for adverse event classification ranged from 0.06 to 0.93 and 0.50 to 0.98, respectively. The estimated prevalence of adverse events using a latent class model with a common sensitivity and specificity for all reviewers (0.64 and 0.83, respectively) was 47.6%. For patients to have a 95% chance of truly having an adverse event, at least 3 of 3 reviewers are required to deem the outcome an adverse event. Adverse event classification is unreliable. To be certain that a case
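
The Bayesian step described above can be sketched with the reported common sensitivity (0.64), specificity (0.83) and latent prevalence (47.6%), assuming conditionally independent reviewers; the helper below is an illustrative reconstruction, not the authors' code:

```python
def posterior_adverse_event(k, n, sens=0.64, spec=0.83, prev=0.476):
    """P(adverse event | k of n independent reviewers rate it positive).
    Binomial likelihood under each hypothesis; the binomial coefficient
    cancels in the posterior ratio."""
    p_pos = sens**k * (1 - sens)**(n - k) * prev          # event truly occurred
    p_neg = (1 - spec)**k * spec**(n - k) * (1 - prev)    # no event
    return p_pos / (p_pos + p_neg)

# with these parameters, only k = 3 of 3 positive ratings clears the 95% threshold
for k in (1, 2, 3):
    print(k, round(posterior_adverse_event(k, 3), 3))
```

This reproduces the record's conclusion that 3 of 3 positive reviews are needed before the true probability of an adverse event exceeds 95%.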

  12. Social dimension and complexity differentially influence brain responses during feedback processing.

    Science.gov (United States)

    Pfabigan, Daniela M; Gittenberger, Marianne; Lamm, Claus

    2017-10-30

    Recent research emphasizes the importance of social factors during performance monitoring. The current study therefore investigated the impact of social stimuli, such as communicative gestures, on feedback processing. Moreover, it addressed a shortcoming of previous studies, which failed to consider stimulus complexity as a potential confounding factor. Twenty-four volunteers performed a time estimation task while their electroencephalogram was recorded. Either social complex, social non-complex, non-social complex, or non-social non-complex stimuli were used to provide performance feedback. No effects of social dimension or complexity were found for task performance. In contrast, Feedback-Related Negativity (FRN) and P300 amplitudes were sensitive to both factors, with larger FRN and P300 amplitudes after social compared to non-social stimuli, and larger FRN amplitudes after complex positive than non-complex positive stimuli. P2 amplitudes were solely sensitive to feedback valence and social dimension. Subjectively, social complex stimuli were rated as more motivating than non-social complex ones. Independently of each other, social dimension and visual complexity influenced amplitude variation during performance monitoring. Social stimuli seem to be perceived as more salient, as corroborated by the P2, FRN and P300 results as well as by the subjective ratings. This may be explained by their relevance in everyday social interactions.

  13. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    Science.gov (United States)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
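
A schedule simulation of the kind described can be sketched as a Monte Carlo over a serial task chain with stochastic slips; this toy model (the task durations, slip probability and slip magnitude are invented for illustration) is not the F6 tool itself:

```python
import random

def simulate_schedule(tasks, p_delay, delay, n=10000, seed=1):
    """Monte Carlo over a serial development schedule: each task may slip by
    `delay` time units with probability `p_delay`. Returns the mean completion
    time over n runs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        t = 0.0
        for dur in tasks:
            t += dur + (delay if rng.random() < p_delay else 0.0)
        total += t
    return total / n

# three development phases of 10, 20 and 15 time units; 20% chance of a 5-unit
# slip in each phase -> analytic expectation 45 + 3 * 0.2 * 5 = 48
print(simulate_schedule([10, 20, 15], 0.2, 5.0))
```

In a full discrete-event treatment, tasks would also carry precedence constraints and resource contention; the Monte Carlo outer loop is what lets such a model feed value distributions into trade studies.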

  14. On risk analysis for repositories in northern Switzerland: extent and probability of geological processes and events

    International Nuclear Information System (INIS)

    Buergisser, H.M.; Herrnberger, V.

    1981-01-01

    The literature study assesses, in the form of an expert analysis, geological processes and events for a 1200 km² area of northern Switzerland, with regard to repositories for medium- and high-active waste (depth 100 to 600 m and 600 to 2500 m, respectively) over the next 10⁶ years. The area, which comprises parts of the Tabular Jura, the Folded Jura and the Molasse Basin, the latter two being parts of the Alpine orogen, has undergone a non-uniform geologic development since the Oligocene. Within the next 10⁴ to 10⁵ years a maximum earthquake intensity of VIII-IX (MSK scale) has been predicted. After this period, particularly in the southern and eastern parts of the area, glaciations will probably occur, with associated erosion of possibly 200 to 300 m. Fluvial erosion as a response to an uplift could reach similar values after 10⁵ to 10⁶ years; however, there are no data on the recent relative vertical crustal movements of the area. The risk of a meteorite impact is considered small compared to that of these factors. Seismic activity and the position and extent of faults are so poorly known within the area that the faulting probability cannot be derived at present. Flooding by the sea, intrusion of magma, diapirism, metamorphism and volcanic eruptions are not considered to be risk factors for final repositories in northern Switzerland. For the shallow-type repositories, the risks of denudation and landslides have to be judged once locality-bound projects have been proposed. (Auth.)

  15. Event-related potential studies of outcome processing and feedback-guided learning

    Directory of Open Access Journals (Sweden)

    René eSan Martín

    2012-11-01

    Full Text Available In order to control behavior in an adaptive manner the brain has to learn how some situations and actions predict positive or negative outcomes. During the last decade cognitive neuroscientists have shown that the brain is able to evaluate and learn from outcomes within a few hundred milliseconds of their occurrence. This research has been primarily focused on the feedback-related negativity (FRN) and the P3, two event-related potential (ERP) components that are elicited by outcomes. The FRN is a frontally distributed negative-polarity ERP component that typically reaches its maximal amplitude 250 ms after outcome presentation and tends to be larger for negative than for positive outcomes. The FRN has been associated with activity in the anterior cingulate cortex. The P3 (~300-600 ms) is a parietally distributed positive-polarity ERP component that tends to be larger for large-magnitude than for small-magnitude outcomes. The neural sources of the P3 are probably distributed over different regions of the cortex. This paper examines the theories that have been proposed to explain the functional role of these two ERP components during outcome processing. Special attention is paid to extant literature addressing how these ERP components are modulated by outcome valence (negative vs. positive), outcome magnitude (large vs. small), outcome probability (unlikely vs. likely) and behavioral adjustment. The literature offers few generalizable conclusions, but is beset with a number of inconsistencies across studies. This paper discusses the potential reasons for these inconsistencies and points out some challenges that will shape the field over the next decade.

  16. Risk analysis for repositories in north Switzerland. Extent and probability of geologic processes and events

    Energy Technology Data Exchange (ETDEWEB)

    Buergisser, H M; Herrnberger, V

    1981-07-01

    The literature study assesses, in the form of an expert analysis, geological processes and events for a 1200 km² area of northern Switzerland, with regard to repositories for medium- and high-active waste (depth 100 to 600 m and 600 to 2500 m, respectively) over the next 10⁶ years. The area, which comprises parts of the Tabular Jura, the Folded Jura and the Molasse Basin, the latter two being parts of the Alpine orogen, has undergone a non-uniform geologic development since the Oligocene. Within the next 10⁴ to 10⁵ years a maximum earthquake intensity of VIII-IX (MSK scale) has been predicted. After this period, particularly in the southern and eastern parts of the area, glaciations will probably occur, with associated erosion of possibly 200 to 300 m. Fluvial erosion as a response to an uplift could reach similar values after 10⁵ to 10⁶ years; however, there are no data on the recent relative vertical crustal movements of the area. The risk of a meteorite impact is considered small compared to that of these factors. Seismic activity and the position and extent of faults are so poorly known within the area that the faulting probability cannot be derived at present. Flooding by the sea, intrusion of magma, diapirism, metamorphism and volcanic eruptions are not considered to be risk factors for final repositories in northern Switzerland. For the shallow-type repositories, the risks of denudation and landslides have to be judged once locality-bound projects have been proposed.

  17. A systems engineering approach to manage the complexity in sustainable chemical product-process design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper provides a perspective on model-data based solution approaches for chemical product-process design, which consists of finding the identity of the candidate chemical product, designing the process that can sustainably manufacture it and verifying the performance of the product during...... framework can manage the complexity associated with product-process problems very efficiently. Three specific computer-aided tools (ICAS, Sustain-Pro and VPPDLab) have been presented and their applications to product-process design, highlighted....

  18. Locating seismicity on the Arctic plate boundary using multiple-event techniques and empirical signal processing

    Science.gov (United States)

    Gibbons, S. J.; Harris, D. B.; Dahl-Jensen, T.; Kværna, T.; Larsen, T. B.; Paulsen, B.; Voss, P. H.

    2017-12-01

    The oceanic boundary separating the Eurasian and North American plates between 70° and 84° north hosts large earthquakes which are well recorded teleseismically, and many more seismic events at far lower magnitudes that are well recorded only at regional distances. Existing seismic bulletins have considerable spread and bias resulting from limited station coverage and deficiencies in the velocity models applied. This is particularly acute for the lower magnitude events which may only be constrained by a small number of Pn and Sn arrivals. Over the past two decades there has been a significant improvement in the seismic network in the Arctic: a difficult region to instrument due to the harsh climate, a sparsity of accessible sites (particularly at significant distances from the sea), and the expense and difficult logistics of deploying and maintaining stations. New deployments and upgrades to stations on Greenland, Svalbard, Jan Mayen, Hopen, and Bjørnøya have resulted in a sparse but stable regional seismic network which results in events down to magnitudes below 3 generating high-quality Pn and Sn signals on multiple stations. A catalogue of several hundred events in the region since 1998 has been generated using many new phase readings on stations on both sides of the spreading ridge in addition to teleseismic P phases. A Bayesian multiple event relocation has resulted in a significant reduction in the spread of hypocentre estimates for both large and small events. Whereas single event location algorithms minimize vectors of time residuals on an event-by-event basis, the Bayesloc program finds a joint probability distribution of origins, hypocentres, and corrections to traveltime predictions for large numbers of events. The solutions obtained favour those event hypotheses resulting in time residuals which are most consistent over a given source region. The relocations have been performed with different 1-D velocity models applicable to the Arctic region and

  19. The Enhanced Plan for Features, Events, and Processes (FEPs) at Yucca Mountain

    International Nuclear Information System (INIS)

    G. Freeze

    2002-01-01

    A performance assessment is required to demonstrate compliance with the post-closure performance objectives for the Yucca Mountain Project (YMP), as stated in 10 CFR Part 63.113 (66 FR 55732, p. 55807). A performance assessment is defined in 10 CFR 63.2 (66 FR 55732, p. 55794) as an analysis that: (1) identifies the features, events, and processes (FEPs) that might affect the potential geologic repository; (2) examines the effects of those FEPs upon the performance of the potential geologic repository; and (3) estimates the expected dose incurred by a specified reasonably maximally exposed individual as a result of releases caused by significant FEPs. The performance assessment must also provide the technical basis for inclusion or exclusion of specific FEPs in the performance assessment as stated in 10 CFR 63.114 (66 FR 55732, p. 55807). An initial approach for FEP development, in support of the Total System Performance Assessment for the Site Recommendation (TSPA-SR) (CRWMS M and O 2000e), was documented in Freeze et al. (2001). The development of a comprehensive list of FEPs potentially relevant to the post-closure performance of the potential Yucca Mountain repository is an ongoing, iterative process based on site-specific information, design, and regulations. Although comprehensiveness of the FEPs list cannot be proven with absolute certainty, confidence can be gained through a combination of formal and systematic reviews (both top-down and bottom-up), audits, and comparisons with other FEP lists and through the application of more than one classification scheme. To support TSPA-SR, DOE used a multi-step approach for demonstrating comprehensiveness of the initial list of FEPs. Input was obtained from other international radioactive waste disposal programs as compiled by the Nuclear Energy Agency (NEA) of the Organization for Economic Co-operation and Development (OECD) to establish a general list of FEPs. The list was subsequently refined to include YMP

  20. The Enhanced Plan for Features, Events, and Processes (FEPS) at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    G. Freeze

    2002-03-25

    A performance assessment is required to demonstrate compliance with the post-closure performance objectives for the Yucca Mountain Project (YMP), as stated in 10 CFR Part 63.113 (66 FR 55732, p. 55807). A performance assessment is defined in 10 CFR 63.2 (66 FR 55732, p. 55794) as an analysis that: (1) identifies the features, events, and processes (FEPs) that might affect the potential geologic repository; (2) examines the effects of those FEPs upon the performance of the potential geologic repository; and (3) estimates the expected dose incurred by a specified reasonably maximally exposed individual as a result of releases caused by significant FEPs. The performance assessment must also provide the technical basis for inclusion or exclusion of specific FEPs in the performance assessment as stated in 10 CFR 63.114 (66 FR 55732, p. 55807). An initial approach for FEP development, in support of the Total System Performance Assessment for the Site Recommendation (TSPA-SR) (CRWMS M&O 2000e), was documented in Freeze et al. (2001). The development of a comprehensive list of FEPs potentially relevant to the post-closure performance of the potential Yucca Mountain repository is an ongoing, iterative process based on site-specific information, design, and regulations. Although comprehensiveness of the FEPs list cannot be proven with absolute certainty, confidence can be gained through a combination of formal and systematic reviews (both top-down and bottom-up), audits, and comparisons with other FEP lists and through the application of more than one classification scheme. To support TSPA-SR, DOE used a multi-step approach for demonstrating comprehensiveness of the initial list of FEPs. Input was obtained from other international radioactive waste disposal programs as compiled by the Nuclear Energy Agency (NEA) of the Organization for Economic Co-operation and Development (OECD) to establish a general list of FEPs. The list was subsequently refined to include YMP

  1. Emotional Picture and Word Processing: An fMRI Study on Effects of Stimulus Complexity

    Science.gov (United States)

    Schlochtermeier, Lorna H.; Kuchinke, Lars; Pehrs, Corinna; Urton, Karolina; Kappelhoff, Hermann; Jacobs, Arthur M.

    2013-01-01

    Neuroscientific investigations regarding aspects of emotional experiences usually focus on one stimulus modality (e.g., pictorial or verbal). Similarities and differences in the processing between the different modalities have rarely been studied directly. The comparison of verbal and pictorial emotional stimuli often reveals a processing advantage of emotional pictures in terms of larger or more pronounced emotion effects evoked by pictorial stimuli. In this study, we examined whether this picture advantage refers to general processing differences or whether it might partly be attributed to differences in visual complexity between pictures and words. We first developed a new stimulus database comprising valence and arousal ratings for more than 200 concrete objects representable in different modalities including different levels of complexity: words, phrases, pictograms, and photographs. Using fMRI we then studied the neural correlates of the processing of these emotional stimuli in a valence judgment task, in which the stimulus material was controlled for differences in emotional arousal. No superiority for the pictorial stimuli was found in terms of emotional information processing with differences between modalities being revealed mainly in perceptual processing regions. While visual complexity might partly account for previously found differences in emotional stimulus processing, the main existing processing differences are probably due to enhanced processing in modality specific perceptual regions. We would suggest that both pictures and words elicit emotional responses with no general superiority for either stimulus modality, while emotional responses to pictures are modulated by perceptual stimulus features, such as picture complexity. PMID:23409009

  2. Application of complex inoculants in improving the process-ability of grey cast iron for cylinder blocks

    Directory of Open Access Journals (Sweden)

    LIU Wei-ming

    2006-05-01

    Full Text Available The effects of several complex inoculants on the mechanical properties, process-ability and sensitivity of grey cast iron used in cylinder blocks were investigated. The experimental results showed that grey cast iron treated with the 60%FeSi75+40%RE complex inoculant has a tensile strength consistently at about 295 MPa, along with good hardness and improved metallurgical quality, while grey cast iron inoculated with the 20%FeSi75+80%Sr compound inoculant has the best process-ability, the lowest cross-section sensitivity and the smallest microhardness difference. The wear of the drill increases with the microhardness difference of the matrix structure, indicating the great effect that the homogeneity of the matrix structure has on the machinability of the grey cast iron.

  3. Using Simple Manipulatives to Improve Student Comprehension of a Complex Biological Process: Protein Synthesis

    Science.gov (United States)

    Guzman, Karen; Bartlett, John

    2012-01-01

    Biological systems and living processes involve a complex interplay of biochemicals and macromolecular structures that can be challenging for undergraduate students to comprehend and, thus, misconceptions abound. Protein synthesis, or translation, is an example of a biological process for which students often hold many misconceptions. This article…

  4. Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks

    Science.gov (United States)

    Renkewitz, Frank; Jahn, Georg

    2012-01-01

    We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…

  5. Complex Networks Dynamics Based on Events-Phase Synchronization and Intensity Correlation Applied to The Anomaly Patterns and Extremes in The Tropical African Climate System

    Science.gov (United States)

    Oluoch, K.; Marwan, N.; Trauth, M.; Loew, A.; Kurths, J.

    2012-04-01

    The African continent lies almost entirely within the tropics, and as such its climate systems are predominantly governed by the heterogeneous spatial and temporal variability of the Hadley and Walker circulations. Variability in these meridional and zonal circulations intensifies or suppresses the intensity, duration and frequency of the Inter-Tropical Convergence Zone (ITCZ) migration, the trade winds, the subtropical high-pressure regions and the continental monsoons. These features play a central role in determining the spatial and temporal variability patterns of African rainfall. The current understanding of these climate features and their influence on rainfall patterns is not sufficient. Like many real-world systems, atmospheric-oceanic processes exhibit non-linear properties that can be better explored using non-linear (NL) methods of time-series analysis. Over recent years, the complex network approach has evolved into a powerful new tool for understanding the spatio-temporal dynamics and evolution of complex systems. Together with NL techniques, it continues to find new applications in many areas of science and technology, including climate research. We would like to use these two powerful methods to understand the spatial structure and dynamics of African rainfall anomaly patterns and extremes. The method of event synchronization (ES), developed by Quiroga et al., 2002 and first applied to climate networks by Malik et al., 2011, measures correlations with a dynamic time lag and is thus a more natural way to correlate a complex and heterogeneous system like a climate network than the fixed time delay most commonly used. On the other hand, the shortcomings of ES are its lack of a rigorous test statistic for the significance level of the correlations, and the fact that only the events' time indices are synchronized, discarding all information about how the relative intensities propagate within network
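The event-synchronization measure referenced above (Quiroga et al., 2002) can be sketched in a few lines. This is a simplified illustration, not the authors' code: `event_sync` is a hypothetical helper, the normalization here uses only the interior events of each train, and real climate-network studies add significance testing on top.

```python
import numpy as np

def event_sync(tx, ty):
    """Event synchronization Q: fraction of quasi-simultaneous event
    pairs, judged with a dynamic (local) time lag rather than a fixed
    delay, as in Quiroga et al. (2002)."""
    tx, ty = np.asarray(tx, float), np.asarray(ty, float)

    def count(a, b):
        c = 0.0
        for i in range(1, len(a) - 1):
            for j in range(1, len(b) - 1):
                d = a[i] - b[j]
                # dynamic lag: half the smallest of the four neighbouring gaps
                tau = 0.5 * min(a[i + 1] - a[i], a[i] - a[i - 1],
                                b[j + 1] - b[j], b[j] - b[j - 1])
                if 0 < d <= tau:
                    c += 1.0
                elif d == 0:
                    c += 0.5   # exactly simultaneous events count half each way
        return c

    m = np.sqrt((len(tx) - 2) * (len(ty) - 2))
    return (count(tx, ty) + count(ty, tx)) / m

q = event_sync([0, 10, 20, 30, 40], [0, 10, 20, 30, 40])
print(q)  # identical event trains give full synchronization, Q = 1.0
```

Because the lag adapts to the local event density, the same code handles both densely and sparsely clustered rainfall-event series without choosing a delay window in advance.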

  6. [Process study on hysteresis of vegetation cover influencing sand-dust events].

    Science.gov (United States)

    Xu, Xing-Kui; Wang, Xiao-Tao; Zhang, Feng

    2009-02-15

    Analysis of satellite and weather-station data from 1982-2000 shows that a nonlinear relationship between vegetation cover and sand-dust events is present in most parts of China. The summer vegetation cover ratio significantly affects the frequency of sand-dust storms from winter to spring in the source regions of sand-dust events, but the mechanism behind this hysteresis, by which summer vegetation cover influences sand-dust events in the following winter and spring, is not well understood. A quasi-geostrophic barotropic model was used with three magnitudes of frictional coefficient to investigate the cause of the hysteresis. For an initial wind velocity of 10 m/s, the wind velocity declines by at most 90% within 72 h for the frictional coefficient between atmosphere and water surface, by 100% within 18 h for the frictional coefficient between atmosphere and bare soil, and by 100% within 1 h for the frictional coefficient between atmosphere and vegetation cover. Observation and simulation show that residual roots and stems from summer vegetation are one of the factors influencing sand-dust events during winter and spring, and that the obstruction of airflow by these residual roots and stems is the most important reason for the hysteresis with which vegetation cover influences sand-dust events.
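The decay figures quoted above are consistent with a simple linear-friction model, v(t) = v0·exp(-k·t). The sketch below back-derives a friction coefficient from the quoted "90% decline over 72 h" water-surface case; the coefficients and the factor of 10 for vegetation are illustrative assumptions, not values from the paper.

```python
import math

def fraction_lost(k_per_hour, hours):
    """Fraction of the initial wind speed lost after `hours` hours,
    assuming linear-friction decay v(t) = v0 * exp(-k * t)."""
    return 1.0 - math.exp(-k_per_hour * hours)

# friction coefficient implied by a 90% decline over 72 h (water surface)
k_water = -math.log(0.10) / 72.0
print(round(fraction_lost(k_water, 72), 2))   # 0.9

# a hypothetical 10x larger coefficient (rough vegetated surface)
# drives the wind to a near-total loss far sooner
k_veg = 10 * k_water
print(round(fraction_lost(k_veg, 18), 4))     # 0.9968
```

A pure exponential never reaches exactly 100%, so the paper's "100% reduction" cases correspond here to losses indistinguishable from total within the stated time.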

  7. Event-related cortical processing in neuropathic pain under long-term spinal cord stimulation.

    Science.gov (United States)

    Weigel, Ralf; Capelle, H Holger; Flor, Herta; Krauss, Joachim K

    2015-01-01

    Several mechanisms have been suggested in the past to explain the beneficial effect of spinal cord stimulation (SCS) in patients suffering from neuropathic pain, but little is known about potential supraspinal mechanisms. In this study, the cortical signaling of patients with neuropathic pain under successful long-term treatment with SCS was analyzed. Observational study. University hospital, neurosurgical department, outpatient clinic for movement disorders and pain, institute for cognitive and clinical neuroscience. Nine patients with neuropathic pain of a lower extremity and a lasting response to chronic SCS were included. Cortical activity was analyzed using event-related potentials of the electroencephalogram after non-painful and painful stimulation. Each patient was tested under the effect of long-term SCS and again 24 hours after cessation of SCS. The cortical areas involved in the peaks of the evoked potentials were localized using a source localization method based on a fixed dipole model. Detection threshold and intensity of non-painful stimulation did not differ significantly between the two sides. The pain threshold was significantly lower on the neuropathic side under the effect of SCS (P = 0.03). Bilateral pain thresholds were significantly lower (P = 0.03 healthy side, P = 0.003 neuropathic side) in 5 patients with increased pain after cessation of SCS. Under the effect of SCS, cortical negativities (N1, N2, N3) and positivities (P1) demonstrated bilaterally comparable amplitudes. After cessation of SCS, the decreased threshold for peripheral stimulation resulted in lowered negativities on both sides. The positivity P1 was differentially regulated and was reduced more contralateral to the unaffected side. N2 was localized at the sensory representation of the leg within the homunculus. The main vector of P1 was localized within the cingulate cortex (CC) and moved more anteriorly under the effect of SCS. The exact time span over which SCS continues to have an effect is not known. However, due to patient

  8. Processing of visual semantic information to concrete words : temporal dynamics and neural mechanisms indicated by event-related brain potentials

    NARCIS (Netherlands)

    van Schie, Hein T.; Wijers, Albertus A.; Mars, Rogier B.; Benjamins, Jeroen S.; Stowe, Laurie A.

    2005-01-01

    Event-related brain potentials were used to study the retrieval of visual semantic information to concrete words, and to investigate possible structural overlap between visual object working memory and concreteness effects in word processing. Subjects performed an object working memory task that

  9. Processing of visual semantic information to concrete words: temporal dynamics and neural mechanisms indicated by event-related brain potentials

    NARCIS (Netherlands)

    Schie, H.T. van; Wijers, A.A.; Mars, R.B.; Benjamins, J.S.; Stowe, L.A.

    2005-01-01

    Event-related brain potentials were used to study the retrieval of visual semantic information to concrete words, and to investigate possible structural overlap between visual object working memory and concreteness effects in word processing. Subjects performed an object working memory task that

  10. Social Information Processing of Positive and Negative Hypothetical Events in Children with ADHD and Conduct Problems and Controls

    Science.gov (United States)

    Andrade, Brendan F.; Waschbusch, Daniel A.; Doucet, Amelie; King, Sara; MacKinnon, Maura; McGrath, Patrick J.; Stewart, Sherry H.; Corkum, Penny

    2012-01-01

    Objective: This study examined social information processing (SIP) of events with varied outcomes in children with ADHD and conduct problems (CPs; defined as oppositional defiant disorder [ODD] or conduct disorder [CD]) and controls. Method: Participants were 64 children (46 boys, 18 girls) aged 6 to 12, including 39 with ADHD and 25 controls.…

  11. Observation of hard processes in rapidity gap events in {gamma}p interactions at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, T.; Aid, S.; Andreev, V.; Andrieu, B.; Appuhn, R.D.; Arpagaus, M.; Babaev, A.; Baehr, J.; Ban, J.; Baranov, P.; Barrelet, E.; Bartel, W.; Barth, M.; Bassler, U.; Beck, H.P.; Behrend, H.J.; Belousov, A.; Berger, C.; Bergstein, H.; Bernardi, G.; Bernet, R.; Bertrand-Coremans, G.; Besancon, M.; Beyer, R.; Biddulph, P.; Bizot, J.C.; Blobel, V.; Borras, K.; Botterweck, F.; Boudry, V.; Braemer, A.; Brasse, F.; Braunschweig, W.; Brisson, V.; Bruncko, D.; Brune, C.; Buchholz, R.; Buengener, L.; Buerger, J.; Buesser, F.W.; Buniatian, A.; Burke, S.; Buschhorn, G.; Campbell, A.J.; Carli, T.; Charles, F.; Clarke, D.; Clegg, A.B.; Clerbaux, B.; Colombo, M.; Contreras, J.G.; Coughlan, J.A.; Courau, A.; Coutures, C.; Cozzika, G.; Criegee, L.; Cussans, D.G.; Cvach, J.; Dagoret, S.; Dainton, J.B.; Danilov, M.; Dau, W.D.; Daum, K.; David, M.; Deffur, E.; Delcourt, B.; Del Buono, L.; De Roeck, A.; De Wolf, E.A.; Di Nezza, P.; Dollfus, C.; Dowell, J.D.; Dreis, H.B.; Droutskoi, V.; Duboc, J.; Duellmann, D.; Duenger, O.; Duhm, H.; Ebert, J.; Ebert, T.R.; Eckerlin, G.; Efremenko, V.; Egli, S.; Ehrlichmann, H.; Eichenberger, S.; Eichler, R.; Eisele, F.; Eisenhandler, E.; Ellison, R.J.; Elsen, E.; Erdmann, M.; Erdmann, W.; Evrard, E.; Favart, L.; Fedotov, A.; Feeken, D.; Felst, R.; Feltesse, J.; Ferencei, J.; Ferrarotto, F.; Flamm, K.; Fleischer, M.; Flieser, M.; Fluegge, G.; Fomenko, A.; Fominykh, B.; Forbush, M.; Formanek, J.; Foster, J.M.; Franke, G.; Fretwurst, E.; Gabathuler, E.; Gabathuler, K.; Gamerdinger, K.; Garvey, J.; Gayler, J.; Gebauer, M.; Gellrich, A.; Genzel, H.; Gerhards, R.; Goerlach, U.; Goerlich, L.; Gogitidze, N.; Goldberg, M.; Goldner, D.; Gonzalez-Pineiro, B.; Goodall, A.M.; Gorelov, I.; Goritchev, P.; Grab, C.; Graessler, H.; Graessler, R.; Greenshaw, T.; Grindhammer, G.; Gruber, A.; Gruber, C.; Haack, J.; Haidt, D.; Hajduk, L.; Hamon, O.; Hampel, M.; Hanlon, E.M.; Hapke, M.; Haynes, W.J.; Heatherington, J.; Heinzelmann, G.; Henderson, R.C.W.; H1 
Collaboration

    1995-02-06

    Events with no hadronic energy flow in a large interval of pseudo-rapidity in the proton direction are observed in photon-proton interactions at an average centre of mass energy ⟨√(s_γp)⟩ of 200 GeV. These events are interpreted as photon diffractive dissociation. Evidence for hard scattering in photon diffractive dissociation is demonstrated using inclusive single-particle spectra, thrust as a function of transverse energy, and the observation of jet production. The data can be described by a Monte Carlo calculation including hard photon-pomeron scattering.

  12. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-10-09

    The purpose of this report is to document the evaluation of biosphere features, events, and processes (FEPs) that relate to the license application (LA) process as required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The evaluation determines whether specific biosphere-related FEPs should be included in or excluded from consideration in the Total System Performance Assessment (TSPA). This analysis documents the technical basis for screening decisions as required at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. For FEPs that are included in the TSPA, this analysis provides a TSPA disposition, which summarizes how the FEP has been included and addressed in the TSPA model, and cites the analysis and model reports that provide the technical basis and description of its disposition. For FEPs that are excluded from the TSPA, this analysis provides a screening argument, which identifies the basis for the screening decision (i.e., low probability, low consequence, or by regulation) and discusses the technical basis that supports that decision. In cases where a FEP covers multiple technical areas and is shared with other FEP analysis reports, this analysis may provide only a partial technical basis for its screening; the full technical basis for such shared FEPs is addressed collectively by all FEP analysis reports covering the technical disciplines that share them. FEPs must be included in the TSPA unless they can be excluded by low probability, low consequence, or regulation. A FEP can be excluded from the TSPA by low probability per 10 CFR 63.114(d) [DIRS 156605], by showing that it has less than one chance in 10,000 of occurring over 10,000 years (or an approximately equivalent annualized probability of 10^-8). A FEP can be excluded from the TSPA by low consequence per 10 CFR 63.114 (e or f) [DIRS 156605], by showing that omitting the FEP would not significantly change the magnitude and
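The low-probability screening threshold quoted above can be checked with a few lines of arithmetic: an annual probability p such that the chance of at least one occurrence over 10,000 years equals 1 in 10,000 works out to approximately 10^-8 per year.

```python
# "Less than one chance in 10,000 of occurring over 10,000 years"
P_total, years = 1e-4, 10_000

# solve 1 - (1 - p)**years = P_total for the annual probability p
p_annual = 1 - (1 - P_total) ** (1 / years)
print(f"{p_annual:.1e}")  # 1.0e-08
```

For probabilities this small, the compounding is effectively linear, which is why the regulation can treat 10^-4 over 10^4 years and 10^-8 per year as equivalent.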

  13. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    International Nuclear Information System (INIS)

    Wasiolek, M. A.

    2003-01-01

    The purpose of this report is to document the evaluation of biosphere features, events, and processes (FEPs) that relate to the license application (LA) process as required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The evaluation determines whether specific biosphere-related FEPs should be included in or excluded from consideration in the Total System Performance Assessment (TSPA). This analysis documents the technical basis for screening decisions as required at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. For FEPs that are included in the TSPA, this analysis provides a TSPA disposition, which summarizes how the FEP has been included and addressed in the TSPA model, and cites the analysis and model reports that provide the technical basis and description of its disposition. For FEPs that are excluded from the TSPA, this analysis provides a screening argument, which identifies the basis for the screening decision (i.e., low probability, low consequence, or by regulation) and discusses the technical basis that supports that decision. In cases where a FEP covers multiple technical areas and is shared with other FEP analysis reports, this analysis may provide only a partial technical basis for its screening; the full technical basis for such shared FEPs is addressed collectively by all FEP analysis reports covering the technical disciplines that share them. FEPs must be included in the TSPA unless they can be excluded by low probability, low consequence, or regulation. A FEP can be excluded from the TSPA by low probability per 10 CFR 63.114(d) [DIRS 156605], by showing that it has less than one chance in 10,000 of occurring over 10,000 years (or an approximately equivalent annualized probability of 10^-8). A FEP can be excluded from the TSPA by low consequence per 10 CFR 63.114 (e or f) [DIRS 156605], by showing that omitting the FEP would not significantly change the magnitude and

  14. A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.

    Science.gov (United States)

    Dubljević, Veljko; Racine, Eric

    2014-10-01

    The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Formal and general in nature, we contextualize the process described as "inherence heuristic" in a web of domain-specific heuristics (e.g., agent specific; action specific; consequences specific).

  15. Effects of emotional tone and visual complexity on processing health information in prescription drug advertising.

    Science.gov (United States)

    Norris, Rebecca L; Bailey, Rachel L; Bolls, Paul D; Wise, Kevin R

    2012-01-01

    This experiment explored how the emotional tone and visual complexity of direct-to-consumer (DTC) drug advertisements affect the encoding and storage of specific risk and benefit statements about each of the drugs in question. Results are interpreted under the limited capacity model of motivated mediated message processing framework. Findings suggest that DTC drug ads should be pleasantly toned and high in visual complexity in order to maximize encoding and storage of risk and benefit information.

  16. Process modeling of the platform choice for development of the multimedia educational complex

    Directory of Open Access Journals (Sweden)

    Ірина Олександрівна Бондар

    2016-10-01

    Full Text Available The article presents a methodical approach to platform choice as the technological basis for building an open and functional structure and for the further implementation of the substantive content of the modules of a network multimedia complex for a discipline. The proposed approach is implemented through the use of mathematical tools. The result of the process modeling is the selection of the most appropriate platform for development of the multimedia complex.

  17. Implicit Phonological and Semantic Processing in Children with Developmental Dyslexia: Evidence from Event-Related Potentials

    Science.gov (United States)

    Jednorog, K.; Marchewka, A.; Tacikowski, P.; Grabowska, A.

    2010-01-01

    Dyslexia is characterized by a core phonological deficit, although recent studies indicate that semantic impairment also contributes to this condition. In this study, event-related potentials (ERP) were used to examine whether the N400 wave in dyslexic children is modulated by phonological or semantic priming, similarly to age-matched controls.…

  18. Semantic ambiguity processing in sentence context: Evidence from event-related fMRI

    NARCIS (Netherlands)

    Zempleni, Monika-Zita; Renken, Remco; Hoeks, John C. J.; Hoogduin, Johannes M.; Stowe, Laurie A.

    2007-01-01

    Lexical semantic ambiguity is the phenomenon in which a word has multiple meanings (e.g. 'bank'). The aim of this event-related functional MRI study was to identify the brain areas that are involved in contextually driven ambiguity resolution. Ambiguous words were selected which have a most

  19. Improved motion contrast and processing efficiency in OCT angiography using complex-correlation algorithm

    International Nuclear Information System (INIS)

    Guo, Li; Li, Pei; Pan, Cong; Cheng, Yuxuan; Ding, Zhihua; Li, Peng; Liao, Rujia; Hu, Weiwei; Chen, Zhong

    2016-01-01

    Complex-based OCT angiography (Angio-OCT) offers high motion contrast by combining both intensity and phase information. However, due to involuntary bulk tissue motion, complex-valued OCT raw data are processed sequentially with different algorithms for correcting bulk image shifts (BISs), compensating global phase fluctuations (GPFs), and extracting flow signals. Such a complicated procedure results in a massive computational load. To mitigate this problem, in this work we present an inter-frame complex-correlation (CC) algorithm. The CC algorithm is suitable for parallel processing of both flow-signal extraction and BIS correction, and it does not need GPF compensation. This method provides high processing efficiency and superior motion contrast. The feasibility and performance of the proposed CC algorithm are demonstrated using both flow-phantom and live-animal experiments. (paper)
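The core of an inter-frame complex correlation can be sketched numerically. This is a minimal illustration, not the authors' implementation: it computes a single correlation coefficient between two frames (the paper's per-pixel kernels and BIS search are omitted), and `complex_correlation` is a hypothetical helper. Taking the magnitude of the numerator discards any global phase factor, which is why no separate GPF-compensation step is needed.

```python
import numpy as np

def complex_correlation(f1, f2):
    """Magnitude of the complex correlation coefficient between two
    complex OCT frames. Static tissue keeps the frames correlated
    (|CC| near 1); flow decorrelates them, so 1 - |CC| serves as the
    motion/flow contrast."""
    num = np.abs(np.sum(f1 * np.conj(f2)))
    den = np.sqrt(np.sum(np.abs(f1) ** 2) * np.sum(np.abs(f2) ** 2))
    return num / den

rng = np.random.default_rng(0)
static = rng.standard_normal(64) + 1j * rng.standard_normal(64)

# a bulk global phase fluctuation does not reduce |CC| for static tissue
print(round(complex_correlation(static, static * np.exp(0.7j)), 6))  # 1.0
```

An independent (flow-like) second frame would instead give a small |CC|, and thresholding 1 - |CC| yields the angiographic signal.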

  20. The Distribution of the Interval between Events of a Cox Process with Shot Noise Intensity

    Directory of Open Access Journals (Sweden)

    Angelos Dassios

    2008-01-01

    Full Text Available Applying the theory of piecewise deterministic Markov processes, the probability generating function of a Cox process incorporating a shot noise process as the claim intensity is obtained. We also derive the Laplace transform of the distribution of the shot noise process at claim jump times, using the stationarity assumption on the shot noise process. Based on this Laplace transform and on the probability generating function of a Cox process with shot noise intensity, we obtain the distribution of the interval between events of a Cox process with shot noise intensity for insurance claims, together with its moments, that is, the mean and variance.
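A shot-noise Cox process of the kind described above is easy to simulate, which gives an empirical check on the interval moments the paper derives analytically. The sketch below is a crude time-discretized simulation under assumed parameters (Poisson shot rate `rho`, exponential jump sizes with mean `mu`, decay rate `delta`); it is illustrative only, not the paper's method.

```python
import math
import random

random.seed(42)

def simulate_cox(T=1000.0, dt=0.01, rho=0.5, delta=1.0, mu=2.0):
    """Cox process whose intensity is a shot noise process: shots
    arrive at Poisson rate rho, each adds an Exp jump (mean mu) to the
    intensity, and the intensity decays exponentially at rate delta.
    Returns the observed claim (event) times."""
    lam, t, events = 0.0, 0.0, []
    while t < T:
        if random.random() < rho * dt:            # a new shot arrives
            lam += random.expovariate(1.0 / mu)   # jump size, mean mu
        if random.random() < lam * dt:            # claim event, given lam
            events.append(t)
        lam *= math.exp(-delta * dt)              # exponential decay
        t += dt
    return events

events = simulate_cox()
gaps = [b - a for a, b in zip(events, events[1:])]
# the stationary mean intensity is rho*mu/delta = 1.0, so the sample
# mean inter-event interval should come out near 1.0
print(round(sum(gaps) / len(gaps), 2))
```

Because the intensity clusters events around shots, the interval variance exceeds that of a plain Poisson process with the same mean, which is exactly the overdispersion the paper's moment formulas quantify.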