WorldWideScience

Sample records for complex event processing

  1. Real-time monitoring of clinical processes using complex event processing and transition systems.

    Science.gov (United States)

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
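The monitoring idea described above can be illustrated with a toy sketch: clinical events drive a transition system, and any event with no legal transition from the current state is flagged as an erroneous situation. All state and event names below are hypothetical, not taken from the paper.

```python
# Hypothetical clinical workflow modelled as a transition system.
# (state, event) -> next state; anything else is an erroneous situation.
ALLOWED = {
    ("admitted", "order_lab"): "awaiting_lab",
    ("awaiting_lab", "lab_result"): "diagnosed",
    ("diagnosed", "prescribe"): "treated",
    ("treated", "discharge"): "discharged",
}

def monitor(events, start="admitted"):
    """Replay events against the transition system; report the first violation."""
    state = start
    for ev in events:
        nxt = ALLOWED.get((state, ev))
        if nxt is None:
            return state, ev          # violation: event illegal in this state
        state = nxt
    return state, None                # all transitions were legal

# A compliant trace ends in 'discharged'; a prescription before the
# lab result is flagged immediately.
print(monitor(["order_lab", "lab_result", "prescribe", "discharge"]))
print(monitor(["order_lab", "prescribe"]))
```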

  2. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Efficient matching of high-volume incoming events against persistent queries is fundamental to complex event processing systems, and rule-based pattern matching is a core feature of any CEP engine. However, pattern rules are pre-decided by experts, and the intrinsic uncertainty in these rules complicates effective complex event processing: human subjective judgment inevitably introduces imprecision, fuzziness, and incompleteness. D numbers are a recently proposed mathematical tool for modelling such uncertainty, since they relax the condition that elements on the frame of discernment must be mutually exclusive. To address these issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
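The TOPSIS technique underlying the proposed method can be sketched as follows. This is plain TOPSIS (normalize, weight, measure distance to the ideal and anti-ideal solutions), not the paper's D-number extension, and the decision matrix used below is purely illustrative.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution (TOPSIS).

    matrix  : rows = alternatives, columns = criteria
    weights : one weight per criterion
    benefit : True if higher is better for that criterion, else False
    """
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in c)) for c in cols]
    # Vector-normalize each column, then apply the criterion weights.
    weighted = [[w * v / n for v, n, w in zip(row, norms, weights)]
                for row in matrix]
    cols = list(zip(*weighted))
    ideal = [max(c) if b else min(c) for c, b in zip(cols, benefit)]
    worst = [min(c) if b else max(c) for c, b in zip(cols, benefit)]
    scores = []
    for row in weighted:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, worst)   # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Alternative 0 dominates alternative 1 on both benefit criteria.
print(topsis([[7, 9], [3, 4]], [0.5, 0.5], [True, True]))
```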

  3. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations that manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations for streaming model transformations that integrate incremental model queries, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.

  4. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over realtime and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams into event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundaries that exist between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  5. Survey of Applications of Complex Event Processing (CEP) in Health Domain

    Directory of Open Access Journals (Sweden)

    Nadeem Mahmood

    2017-12-01

    It is always difficult to manage the production of huge amounts of data coming from multiple sources and to extract meaningful information from them to make appropriate decisions. When data arrives from various input sources, Complex Event Processing (CEP), one of the strong functionalities of Business Intelligence (BI), is the appropriate solution for extracting the required streams of events from this complex input network. Real-time processing, pattern matching, stream processing, big data management, and sensor data processing are among the application areas of CEP. The health domain is itself multi-dimensional, covering hospital supply chains, OPD management, disease diagnostics, in-patient and out-patient management, emergency care, and more. In this paper, the main focus is on the application areas of CEP in the health domain using sensor devices: how CEP manipulates health event data coming from sensors measuring blood pressure, heart rate, fall detection, sugar level, temperature, or other vital signs, and how these systems respond to such events as quickly as possible. Different existing models and applications using CEP are discussed and summarized according to their characteristics.
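A minimal illustration of a CEP-style pattern rule over vital-sign events: a composite alarm fires when abnormal heart-rate and blood-pressure readings co-occur within a sliding time window. The thresholds, event names, and window length are hypothetical, not from any of the surveyed systems.

```python
from collections import deque

def make_detector(window_s=60):
    """Correlate abnormal heart-rate and systolic-BP events in a sliding window."""
    recent = deque()  # (timestamp, kind) of abnormal readings still in the window

    def on_event(ts, kind, value):
        # Illustrative abnormality thresholds (not clinical guidance).
        abnormal = (kind == "heart_rate" and value > 120) or \
                   (kind == "systolic_bp" and value > 180)
        if abnormal:
            recent.append((ts, kind))
        # Expire readings older than the window.
        while recent and ts - recent[0][0] > window_s:
            recent.popleft()
        kinds = {k for _, k in recent}
        # Composite alarm: both abnormal kinds seen within the window.
        return {"heart_rate", "systolic_bp"} <= kinds

    return on_event
```

Each call feeds one sensor event and returns whether the composite alarm is currently active.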

  6. ASPIE: A Framework for Active Sensing and Processing of Complex Events in the Internet of Manufacturing Things

    Directory of Open Access Journals (Sweden)

    Shaobo Li

    2018-03-01

    Rapid perception and processing of critical monitoring events are essential to ensure the healthy operation of Internet of Manufacturing Things (IoMT)-based manufacturing processes. In this paper, we propose ASPIE, an active sensing and processing architecture for critical events in IoMT-based manufacturing, designed around the characteristics of the IoMT architecture and its perception model. A relation model of complex events in manufacturing processes, together with related operators and unified XML-based semantic definitions, is developed to effectively process complex event big data. A template-based processing method for complex events is further introduced, which performs complex event matching using the Apriori frequent-itemset mining algorithm. To evaluate the proposed models and methods, we developed a software platform based on ASPIE for a local chili sauce manufacturing company, which demonstrated the feasibility and effectiveness of the proposed methods for active perception and processing of complex events in IoMT-based manufacturing.
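The Apriori step mentioned above can be sketched as follows: each event-log entry is treated as a transaction, and frequent event combinations are mined by iteratively joining surviving itemsets. The toy log is invented for illustration, not the paper's data.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Mine frequent event sets from event-log transactions (Apriori sketch)."""
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    frequent = {}
    # Start from all singleton event sets that occur in the log.
    candidates = list({frozenset([e]) for t in transactions for e in t})
    while candidates:
        survivors = [c for c in candidates if support(c) >= min_support]
        frequent.update({c: support(c) for c in survivors})
        # Join step: combine surviving k-sets into candidate (k+1)-sets.
        candidates = list({a | b for a, b in combinations(survivors, 2)
                           if len(a | b) == len(a) + 1})
    return frequent

# Hypothetical machine-event log: each transaction is one monitoring episode.
log = [frozenset({"overheat", "vibration", "stop"}),
       frozenset({"overheat", "vibration"}),
       frozenset({"overheat"})]
freq = apriori(log, min_support=0.6)
```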

  7. Compliance with Environmental Regulations through Complex Geo-Event Processing

    Directory of Open Access Journals (Sweden)

    Federico Herrera

    2017-11-01

    In an e-government context, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is further increased by data coming from autonomous sensors (e.g. reporting carbon emissions in protected areas) and from citizens providing information (e.g. illegal dumping) in a voluntary way. Complex Event Processing (CEP) technologies allow processing large amounts of event data and detecting patterns in them. However, they do not provide native support for the geographic dimension of events, which is essential for monitoring requirements that apply to specific geographic areas. This paper proposes a geospatial extension for CEP that allows monitoring environmental requirements considering the geographic location of the processed data. We extend an existing platform-independent, model-driven approach for CEP, adding geographic location to events and specifying patterns using geographic operators. The use and technical feasibility of the proposal are shown through the development of a case study and the implementation of a prototype.
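A geospatial event filter of the kind described above might look like this sketch: a ray-casting point-in-polygon test keeps only events located inside a monitored area before pattern matching. The polygon and event fields are hypothetical, not the paper's model-driven operators.

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count edge crossings of a horizontal ray from (x, y) to the right.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# Illustrative protected area: a 4x4 square.
PROTECTED_AREA = [(0, 0), (4, 0), (4, 4), (0, 4)]

def geo_filter(events, area):
    """Keep only events whose location falls inside the monitored area."""
    return [e for e in events if point_in_polygon(e["x"], e["y"], area)]
```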

  8. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become the key part of Internet of Things (IoT). Proactive CEP can predict future system states and execute some actions to avoid unwanted states which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  9. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  10. Prediction of Increasing Production Activities using Combination of Query Aggregation on Complex Events Processing and Neural Network

    Directory of Open Access Journals (Sweden)

    Achmad Arwan

    2016-07-01

    Productions, orders, sales, and shipments are series of interrelated events within the manufacturing industry, and the results of these events are recorded in an event log. Complex Event Processing (CEP) is used to analyse whether certain combinations of events (opportunities or threats) occur in a system, so that they can be addressed quickly and appropriately. An artificial neural network is used to classify production-increase activities: recorded sequences of processes that led to a production increase serve as training data for the network's activation function. Aggregated event-log counts are fed into the network's input to compute the activation value; when the activation exceeds a given threshold, the system emits a signal to increase production, otherwise it keeps monitoring events. Experimental results show that the accuracy of this method is 77% over 39 event streams. Keywords: complex event processing, event, artificial neural network, production increase prediction, process.
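The classification step can be sketched as a single sigmoid neuron over aggregated event counts: when the activation exceeds a threshold, a production-increase signal is emitted. The weights and bias below are hypothetical stand-ins for values a trained network would supply.

```python
import math

def predict_increase(event_counts, weights, bias, threshold=0.5):
    """Aggregate event counts -> sigmoid activation -> production-increase signal."""
    z = sum(w * c for w, c in zip(weights, event_counts)) + bias
    activation = 1.0 / (1.0 + math.exp(-z))
    return activation, activation > threshold

# Hypothetical weights over counts of (production, order, sale, shipment) events.
W, B = [0.05, 0.05, 0.05, 0.05], -4.0

# High aggregated counts push the activation above the threshold;
# low counts keep the system in monitoring mode.
print(predict_increase([30, 25, 20, 18], W, B))
print(predict_increase([5, 4, 3, 2], W, B))
```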

  11. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision process model with predicted states is proposed as the sequential decision model, and a Q-learning method is proposed for this model. Experimental evaluations show that the method works well when used to control congestion in intelligent transportation systems.
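The Q-learning component can be illustrated with a tabular sketch on a toy congestion MDP (three congestion levels, two signal actions). The environment, rewards, and hyperparameters are invented for illustration and are far simpler than the paper's networked distributed model.

```python
import random

def q_learning(step, n_states, n_actions, episodes=2000,
               alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning: learn action values from sampled transitions."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = rng.randrange(n_states)
        for _ in range(20):
            # Epsilon-greedy action selection.
            if rng.random() < eps:
                a = rng.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: Q[s][x])
            s2, r = step(s, a)
            # Temporal-difference update toward the bootstrapped target.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

# Toy environment: states 0..2 are congestion levels; action 1 ("extend green")
# reduces congestion, action 0 increases it; reward is negative congestion.
def step(s, a):
    s2 = max(0, s - 1) if a == 1 else min(2, s + 1)
    return s2, -s2

Q = q_learning(step, n_states=3, n_actions=2)
policy = [max(range(2), key=lambda a: Q[s][a]) for s in range(3)]
print(policy)  # the learned policy should prefer action 1 in every state
```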

  12. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2013-10-09

    Emerging Big Data applications in areas like e-commerce and the energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high-performance online query scheme which in particular deals with the velocity aspect of the 3 Vs of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques to achieve interactive query processing over realtime data, featuring efficient online querying, dynamic stream data persistence and on-demand access. The system architecture is presented and the current implementation status reported.
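The hybrid online/on-demand idea can be sketched minimally: a bounded sliding window serves continuous (online) queries while the full persisted stream serves ad hoc (on-demand) queries. Class and method names here are hypothetical, not the H2O API.

```python
from collections import deque

class HybridStore:
    """Sketch of hybrid querying: online window stats + on-demand archive scans."""

    def __init__(self, window=5):
        self.window = deque(maxlen=window)   # recent events, queried online
        self.archive = []                    # persisted stream, queried on demand

    def ingest(self, event):
        self.window.append(event)            # deque drops the oldest automatically
        self.archive.append(event)

    def online_avg(self):
        """Continuous query: aggregate over the current sliding window."""
        return sum(self.window) / len(self.window)

    def on_demand(self, predicate):
        """Ad hoc query: scan the full persisted history."""
        return [e for e in self.archive if predicate(e)]
```

A usage sketch: ingest a stream of numeric readings, read the window average online, and later run an ad hoc filter over the whole history.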

  13. Why Rules Matter in Complex Event Processing...and Vice Versa

    Science.gov (United States)

    Vincent, Paul

    Many commercial and research CEP solutions are moving beyond simple stream query languages to more complete definitions of "process" and thence to "decisions" and "actions". As event processing capabilities increase, there is a growing realization that the humble "rule" is as relevant to the event cloud as it is to specific services. Less obvious is how much event processing has to offer process and rule execution and management technologies. Does event processing change the way we should manage businesses, processes and services, together with their embedded (and hopefully managed) rulesets?

  14. Real-time complex event processing for cloud resources

    Science.gov (United States)

    Adam, M.; Cordeiro, C.; Field, L.; Giordano, D.; Magnoni, L.

    2017-10-01

    The ongoing integration of clouds into the WLCG raises the need for detailed health and performance monitoring of the virtual resources in order to prevent degraded service and interruptions due to undetected failures. When working at scale, the existing monitoring diversity can lead to a metric overflow whereby operators need to manually collect and correlate data from several monitoring tools and frameworks, resulting in tens of different metrics to be constantly interpreted and analyzed per virtual machine. In this paper we present an Esper-based standalone application which is able to process complex monitoring events coming from various sources and automatically interpret the data in order to issue alarms on the resources' statuses, without interfering with the actual resources and data sources. We describe how this application has been used with both commercial and non-commercial cloud activities, allowing operators to be alarmed quickly and to react to misbehaving VMs and LHC experiments' workflows. We also present the pattern analysis mechanisms being used, as well as the surrounding Elastic and REST API interfaces where the alarms are collected and served to users.
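A simplified, Esper-free sketch of the kind of pattern such an application evaluates: raise an alarm when a VM exceeds a CPU limit for several consecutive samples. Metric names, thresholds, and the streak rule are invented for illustration.

```python
def alarm_stream(samples, cpu_limit=95.0, beats_required=3):
    """Raise one alarm per VM that exceeds cpu_limit for N consecutive samples.

    samples: iterable of (vm_id, cpu_percent) events in arrival order.
    """
    streak = {}   # vm_id -> current run length of over-limit samples
    alarms = []
    for vm, cpu in samples:
        streak[vm] = streak.get(vm, 0) + 1 if cpu > cpu_limit else 0
        if streak[vm] == beats_required:
            alarms.append(vm)   # alarm exactly once when the streak completes
    return alarms

# VM "a" stays over the limit long enough to alarm; "b" never does.
print(alarm_stream([("a", 99), ("a", 99), ("b", 50),
                    ("a", 99), ("b", 99), ("a", 10)]))
```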

  15. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  16. Complexity rating of abnormal events and operator performance

    International Nuclear Information System (INIS)

    Oeivind Braarud, Per

    1998-01-01

    The complexity of the work situation during abnormal situations is a major topic in discussions of the safety aspects of nuclear power plants. An understanding of complexity and its impact on operator performance in abnormal situations is important. One way to enhance understanding is to look at the dimensions that constitute complexity for NPP operators, and how those dimensions can be measured. A further step is to study how dimensions of event complexity are related to operator performance. One aspect of complexity is the operator's subjective experience of the difficulties of the event. Another related aspect is subject matter experts' ratings of the complexity of the event. A definition and a measure of this part of complexity are being investigated at the OECD Halden Reactor Project in Norway. This paper focuses on the results from a study of simulated scenarios carried out in the Halden Man-Machine Laboratory, which is a full-scope PWR simulator. Six crews of two licensed operators each performed in 16 scenarios (simulated events). Before the experiment, subject matter experts rated the complexity of the scenarios using a Complexity Profiling Questionnaire, which contains eight previously identified dimensions associated with complexity. After completing the scenarios, the operators received a questionnaire containing 39 questions about perceived complexity, which was used to develop a measure of subjective complexity. The results from the study indicated that process experts' ratings of scenario complexity, using the Complexity Profiling Questionnaire, predicted crew performance quite well. The results further indicated that a measure of subjective complexity related to crew performance could be developed. Subjective complexity was found to be related to subjective workload. (author)

  17. LHCb Online event processing and filtering

    CERN Document Server

    Alessio, F; Brarda, L; Frank, M; Franek, B; Galli, D; Gaspar, C; Van Herwijnen, E; Jacobsson, R; Jost, B; Köstner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. ...

  18. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed decision-making method, described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: real-time processing, the ability to process high-volume data from multiple sources, platform independence, support for integrating a user solution, and an open license. First we discuss existing CEP tools and their specific uses in practice. Then we describe the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on the two platforms which seem to be the best fit for integrating our solution and list the main pros and cons of each approach. The next part is devoted to benchmarking platforms for CEP, and the final part to experimental measurements of a platform with the integrated decision-support method.

  19. Controlling extreme events on complex networks

    Science.gov (United States)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. Developing effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding for current areas such as cybersecurity are discussed.

  20. A Validation System for the Complex Event Processing Directives of the ATLAS Shifter Assistant Tool

    CERN Document Server

    Anders, Gabriel; The ATLAS collaboration; Kazarov, Andrei; Lehmann Miotto, Giovanna; Santos, Alejandro; Soloviev, Igor

    2015-01-01

    Complex Event Processing (CEP) is a methodology that combines data from different sources in order to identify events or patterns that need particular attention. It has gained a lot of momentum in the computing world in the past few years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions and to guide the experiment’s operators. This technology is very powerful, if experts regularly insert and update their knowledge about the system’s behaviour into the CEP engine. Nevertheless, writing or modifying CEP directives is not trivial, since the programming paradigm used is quite different from what developers are normally familiar with. In order to help experts verify that the directives work as expected, we have thus developed a complete testing and validation environment. This system consists of three main parts: the first is the persistent storage of all relevant data streams that are produced during data taking, the second is a...

  1. A Validation System for the Complex Event Processing Directives of the ATLAS Shifter Assistant Tool

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration; Avolio, Giuseppe; Kazarov, Andrei; Lehmann Miotto, Giovanna; Soloviev, Igor

    2015-01-01

    Complex Event Processing (CEP) is a methodology that combines data from many sources in order to identify events or patterns that need particular attention. It has gained a lot of momentum in the computing world in the past few years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions and to guide the experiment’s operators. This technology is very powerful, if experts regularly insert and update their knowledge about the system’s behaviour into the CEP engine. Nevertheless, writing or modifying CEP rules is not trivial, since the programming paradigm used is quite different from what developers are normally familiar with. In order to help experts verify that the directives work as expected, we have thus developed a complete testing and validation environment. This system consists of three main parts: the first is the data reader from existing storage of all relevant data streams that are produced during data taking, the second...

  2. LHCb Online event processing and filtering

    Science.gov (United States)

    Alessio, F.; Barandela, C.; Brarda, L.; Frank, M.; Franek, B.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Köstner, S.; Moine, G.; Neufeld, N.; Somogyi, P.; Stoica, R.; Suman, S.

    2008-07-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed.

  3. LHCb Online event processing and filtering

    International Nuclear Information System (INIS)

    Alessio, F; Barandela, C; Brarda, L; Frank, M; Gaspar, C; Herwijnen, E v; Jacobsson, R; Jost, B; Koestner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S; Franek, B; Galli, D

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed

  4. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.

  5. Pre-attentive processing of spectrally complex sounds with asynchronous onsets: an event-related potential study with human subjects.

    Science.gov (United States)

    Tervaniemi, M; Schröger, E; Näätänen, R

    1997-05-23

    Neuronal mechanisms involved in the processing of complex sounds with asynchronous onsets were studied in reading subjects. The stimulus onset asynchrony (SOA) between the leading partial and the remaining complex tone was varied between 0 and 360 ms. Infrequently occurring deviant sounds (in which one out of 10 harmonics differed in pitch from the frequently occurring standard sound) elicited the mismatch negativity (MMN), a change-specific cortical event-related potential (ERP) component. This indicates that the pitch of the standard stimuli had been pre-attentively coded by sensory-memory traces. Moreover, when the complex-tone onset fell within the temporal integration window initiated by the leading-partial onset, the deviants also elicited the N2b component, indicating that an involuntary attention switch towards the sound change occurred. In summary, the present results support the existence of a pre-perceptual integration mechanism of 100-200 ms duration and emphasize its importance in switching attention towards a stimulus change.

  6. Towards a methodology for the engineering of event-driven process applications

    NARCIS (Netherlands)

    Baumgrass, A.; Botezatu, M.; Di Ciccio, C.; Dijkman, R.M.; Grefen, P.W.P.J.; Hewelt, M.; Mendling, J.; Meyer, A.; Pourmirza, S.; Völzer, H.; Reijers, H.; Reichert, M.

    2016-01-01

    Successful applications of the Internet of Things such as smart cities, smart logistics, and predictive maintenance, build on observing and analyzing business-related objects in the real world for business process execution and monitoring. In this context, complex event processing is increasingly

  7. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  8. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    J. King

    2004-01-01

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report

  9. Optimizing access to conditions data in ATLAS event data processing

    CERN Document Server

    Rinaldi, Lorenzo; The ATLAS collaboration

    2018-01-01

The processing of ATLAS event data requires access to conditions data which is stored in database systems. This data includes, for example, alignment, calibration, and configuration information that may be characterized by large volumes, diverse content, and/or information which evolves over time as refinements are made in those conditions. Additional layers of complexity are added by the need to provide this information across the world-wide ATLAS computing grid and the sheer number of simultaneously executing processes on the grid, each demanding a unique set of conditions to proceed. Distributing this data to all the processes that require it in an efficient manner has proven to be an increasing challenge with the growing needs and number of event-wise tasks. In this presentation, we briefly describe the systems in which we have collected information about the use of conditions in event data processing. We then proceed to explain how this information has been used to refine not only reconstruction software ...

  10. Construction and updating of event models in auditory event processing.

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating of the current event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste complex (HLW) at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that demonstrate significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes

  12. Post-event processing in social anxiety.

    Science.gov (United States)

    Dannahy, Laura; Stopa, Lusia

    2007-06-01

Clark and Wells' [1995. A cognitive model of social phobia. In: R. Heimberg, M. Liebowitz, D.A. Hope, & F.R. Schneier (Eds.) Social phobia: Diagnosis, assessment and treatment (pp. 69-93). New York: Guilford Press.] cognitive model of social phobia proposes that following a social event, individuals with social phobia will engage in post-event processing, during which they conduct a detailed review of the event. This study investigated the relationship between self-appraisals of performance and post-event processing in individuals high and low in social anxiety. Participants appraised their performance immediately after a conversation with an unknown individual and prior to an anticipated second conversation task 1 week later. The frequency and valence of post-event processing during the week following the conversation was also assessed. The study also explored differences in the metacognitive processes of high and low socially anxious participants. The high socially anxious group experienced more anxiety, predicted worse performance, underestimated their actual performance, and engaged in more post-event processing than low socially anxious participants. The degree of negative post-event processing was linked to the extent of social anxiety and negative appraisals of performance, both immediately after the conversation task and 1 week later. Differences were also observed in some metacognitive processes. The results are discussed in relation to current theory and previous research.

  13. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
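The event-condition-action pattern that reaction rules follow can be illustrated with a toy rule engine; the rule, event fields, and action below are invented for illustration and do not come from any system surveyed in the record.

```python
class RuleEngine:
    """Toy ECA (event-condition-action) engine: each registered rule
    fires its action when an event of the matching type satisfies its
    condition, mirroring 'on Event if Condition do Action'."""

    def __init__(self):
        self.rules = []  # list of (event_type, condition, action)

    def on(self, event_type, condition, action):
        self.rules.append((event_type, condition, action))

    def dispatch(self, event):
        fired = []
        for etype, cond, action in self.rules:
            if event["type"] == etype and cond(event):
                fired.append(action(event))
        return fired

engine = RuleEngine()
# Hypothetical rule: large orders must be reviewed.
engine.on("order", lambda e: e["amount"] > 1000,
          lambda e: f"review order {e['id']}")
results = engine.dispatch({"type": "order", "id": 7, "amount": 2500})
```

Real reaction-rule systems add rule priorities, conflict resolution, and temporal event algebras on top of this basic dispatch loop.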

  14. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than that previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region, with 5 W, 10 W, 50 W and 100 W, for emulating the thermal phenomena occurring during SEB processes while avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  15. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  16. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

Advance of sensor and RFID technology provides significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carrying implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, significantly reducing the cost of RFID data integration.
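The kind of temporally ordered, rule-based pattern the authors describe — for example, flagging a tagged object seen at an exit without a prior checkpoint read — can be sketched over timestamped readings. The reader names, tag IDs, and rule below are hypothetical illustrations, not the paper's data model.

```python
def missed_checkpoint(readings, checkpoint, exit_reader):
    """Given (timestamp, reader, tag_id) RFID readings, return the
    (timestamp, tag_id) pairs of tags observed at the exit reader
    without an earlier read at the checkpoint reader."""
    seen_at_checkpoint = set()
    violations = []
    for ts, reader, tag in sorted(readings):   # enforce temporal order
        if reader == checkpoint:
            seen_at_checkpoint.add(tag)
        elif reader == exit_reader and tag not in seen_at_checkpoint:
            violations.append((ts, tag))
    return violations

readings = [
    (1, "dock", "tag-A"), (2, "checkout", "tag-A"), (3, "exit", "tag-A"),
    (4, "dock", "tag-B"), (5, "exit", "tag-B"),   # tag-B skipped checkout
]
alerts = missed_checkpoint(readings, "checkout", "exit")
```

A declarative framework of the sort described would express this as a rule over event sequences rather than as hand-written scanning code.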

  17. Third Dutch Process Security Control Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2009-01-01

    On June 4th, 2009, the third Dutch Process Control Security Event took place in Amsterdam. The event, organised by the Dutch National Infrastructure against Cybercrime (NICC), attracted both Dutch process control experts and members of the European SCADA and Control Systems Information Exchange

  18. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  19. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  20. Buffer of Events as a Markovian Process

    International Nuclear Information System (INIS)

    Berdugo, J.; Casaus, J.; Mana, C.

    2001-01-01

In Particle and Astro-Particle Physics experiments, the events which get through the detectors are read and processed on-line before they are stored for more detailed processing and future Physics analysis. Since the events are read and, usually, processed sequentially, the time involved in these operations can lead to a significant loss of events which is, to some extent, reduced by using buffers. We present an estimate of the optimum buffer size and the fraction of events lost for a simple experimental condition which serves as an introductory example of the use of Markov Chains. (Author)
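The record does not give the model details, but a standard Markov-chain treatment of a finite event buffer is the M/M/1/K queue, whose stationary loss (blocking) probability has a closed form. The sketch below uses that assumption — Poisson arrivals at rate ρ relative to the processing rate, capacity K — to find the smallest buffer that keeps the fraction of lost events under a target; the numbers are illustrative.

```python
def loss_probability(rho, k):
    """Stationary fraction of events lost in an M/M/1/K queue:
    P_loss = (1 - rho) * rho**k / (1 - rho**(k+1)), or 1/(k+1) at rho=1."""
    if rho == 1.0:
        return 1.0 / (k + 1)
    return (1.0 - rho) * rho**k / (1.0 - rho**(k + 1))

def smallest_buffer(rho, target_loss, k_max=10_000):
    """Smallest capacity k whose stationary loss probability is below
    target_loss (loss decreases monotonically in k for fixed rho)."""
    for k in range(1, k_max + 1):
        if loss_probability(rho, k) < target_loss:
            return k
    return None

# Events arrive at 90% of the processing rate; keep losses under 0.1%.
k = smallest_buffer(rho=0.9, target_loss=1e-3)
```

This closed-form scan replaces what would otherwise be a simulation of the buffer's birth-death chain.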

  1. Buffer of Events as a Markovian Process

    Energy Technology Data Exchange (ETDEWEB)

    Berdugo, J.; Casaus, J.; Mana, C.

    2001-07-01

In Particle and Astro-Particle Physics experiments, the events which get through the detectors are read and processed on-line before they are stored for more detailed processing and future Physics analysis. Since the events are read and, usually, processed sequentially, the time involved in these operations can lead to a significant loss of events which is, to some extent, reduced by using buffers. We present an estimate of the optimum buffer size and the fraction of events lost for a simple experimental condition which serves as an introductory example of the use of Markov Chains. (Author)

  2. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
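The authors' second-order procedures are more elaborate, but the basic idea of checking Poisson behaviour can be illustrated with a simple dispersion index: for a homogeneous Poisson process, counts in equal intervals have variance equal to their mean, so a variance-to-mean ratio far from 1 suggests a non-Poisson process. This is a generic textbook check, not the paper's method; the data are invented.

```python
def dispersion_index(event_times, t_end, n_bins):
    """Variance-to-mean ratio of event counts in equal bins over [0, t_end).
    Approximately 1 for a homogeneous Poisson process; >1 indicates
    clustering (within-process dependence), <1 indicates regularity."""
    width = t_end / n_bins
    counts = [0] * n_bins
    for t in event_times:
        counts[min(int(t / width), n_bins - 1)] += 1
    mean = sum(counts) / n_bins
    var = sum((c - mean) ** 2 for c in counts) / (n_bins - 1)
    return var / mean

# Perfectly regular events are maximally under-dispersed (index 0).
regular = [i + 0.5 for i in range(20)]
idx = dispersion_index(regular, t_end=20.0, n_bins=10)
```

Second-order analysis refines this by examining the full dependence structure of event pairs rather than a single summary ratio.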

  3. Cognitive complexity of the medical record is a risk factor for major adverse events.

    Science.gov (United States)

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.

  4. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    P. Sanchez

    2004-01-01

The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either "Included" or "Excluded," is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  5. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    P. Sanchez

    2004-11-08

The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either "Included" or "Excluded," is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  6. The ATLAS Event Service: A New Approach to Event Processing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00070566; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabi...

  7. CRITICAL EVENTS IN CONSTRUCTION PROCESS

    DEFF Research Database (Denmark)

    Jørgensen, Kirsten; Rasmussen, Grane Mikael Gregaard

    2009-01-01

Function failures, defects and poor communication are major problems in the construction industry. These failures and defects are caused by a row of critical events in the construction process. The purpose of this paper is to define "critical events" in the construction process and to investigate cause-effects of failures and defects in the construction industry by using an analytical approach (the bowtie model) which is developed in accident research. Using this model clarifies the relationships within the chain of failures that causes critical events with undesirable consequences. In this way the causes of failures and the relationships between various failures are rendered visible. A large construction site was observed from start to finish as the empirical element in the research. The research focuses on all kinds of critical events identified throughout every phase during...

  8. Managing the Organizational and Cultural Precursors to Major Events — Recognising and Addressing Complexity

    International Nuclear Information System (INIS)

    Taylor, R. H.; Carhart, N.; May, J.; Wijk, L. G. A. van

    2016-01-01

An illustration will be given of the use of Hierarchical Process Modelling (HPM) to develop a vulnerability tool using the question sets. However, understanding the issues involved more fully requires the development of models and associated tools which recognise the complexity and interactive nature of the organizational and cultural issues involved. Various repeating patterns of system failure appear in most of the events studied. Techniques such as System Dynamics (SD) can be used to ‘map’ these processes and capture the complexity involved. This highlights interdependencies, incubating vulnerabilities and the impact of time lags within systems. Two examples will be given. In almost all of the events studied, there has been a strong disconnect between the knowledge and aspirations of senior management and those planning and carrying out operations. There has, for example, frequently been a failure to ensure that information flows up and down the management chain are effective. It has often led to conflicts between the need to maintain safety standards through exercising a cautious and questioning attitude in the light of uncertainty and the need to meet production and cost targets. Business pressures have led to shortcuts, failure to provide sufficient oversight so that leaders are aware of the true picture of process and nuclear safety at operational level (often leading to organizational ‘drift’), normalisation of risks, and the establishment of a ‘good news culture’. The development of this disconnect and its consequences have been shown to be interdependent, dynamic and complex. A second example is that of gaining a better appreciation of the deeper factors involved in managing the supply chain and, in particular, of the interface with contractors.
Initiating projects with unclear accountabilities and to unrealistic timescales, together with a lack of clarity about the cost implications when safety-related concerns are reported and need to be addressed, have

9. Fostering Organizational Innovation based on modeling the Marketing Research Process through Event-driven Process Chain (EPC)

    Directory of Open Access Journals (Sweden)

    Elena Fleacă

    2016-11-01

Full Text Available Enterprises competing in the current business environment are required to win and maintain their competitiveness through flexibility, fast reaction and adaptation to changing customers' needs, based on innovation in products, services, and internal processes. The paper addresses these challenges, which become more complex in cases of high pressure for innovation. The methodology commences with a literature review of the current knowledge on innovation through business process management. Secondly, the Event-driven Process Chain tool from the scientific literature is applied to model the variables of the marketing research process. The findings highlight the benefits of a marketing research workflow that enhances the value of market information while reducing the cost of obtaining it, in a coherent way.

  10. Cognitive and emotional reactions to daily events: the effects of self-esteem and self-complexity.

    Science.gov (United States)

    Campbell, J D; Chew, B; Scratchley, L S

    1991-09-01

    In this article we examine the effects of self-esteem and self-complexity on cognitive appraisals of daily events and emotional lability. Subjects (n = 67) participated in a 2-week diary study; each day they made five mood ratings, described the most positive and negative events of the day, and rated these two events on six appraisal measures. Neither self-esteem nor self-complexity was related to an extremity measure of mood variability. Both traits were negatively related to measures assessing the frequency of mood change, although the effect of self-complexity dissipated when self-esteem was taken into account. Self-esteem (but not self-complexity) was also related to event appraisals: Subjects with low self-esteem rated their daily events as less positive and as having more impact on their moods. Subjects with high self-esteem made more internal, stable, global attributions for positive events than for negative events, whereas subjects low in self-esteem made similar attributions for both types of events and viewed their negative events as being more personally important than did subjects high in self-esteem. Despite these self-esteem differences in subjects' views of their daily events, naive judges (n = 63) who read the event descriptions and role-played their appraisals of them generally did not distinguish between the events that had been experienced by low self-esteem versus high self-esteem diary subjects.

  11. Service Processes as a Sequence of Events

    NARCIS (Netherlands)

    P.C. Verhoef (Peter); G. Antonides (Gerrit); A.N. de Hoog

    2002-01-01

In this paper the service process is considered as a sequence of events. Using theory from economics and psychology a model is formulated that explains how the utility of each event affects the overall evaluation of the service process. In this model we especially account for the

  12. Alternating event processes during lifetimes: population dynamics and statistical inference.

    Science.gov (United States)

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2018-01-01

In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the feature of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and within-process structure, the paper provides a new and general way to study alternating event processes.

  13. MEG event-related desynchronization and synchronization deficits during basic somatosensory processing in individuals with ADHD

    Directory of Open Access Journals (Sweden)

    Wang Frank

    2008-02-01

Full Text Available Abstract Background Attention-Deficit/Hyperactivity Disorder (ADHD) is a prevalent, complex disorder which is characterized by symptoms of inattention, hyperactivity, and impulsivity. Convergent evidence from neurobiological studies of ADHD identifies dysfunction in fronto-striatal-cerebellar circuitry as the source of behavioural deficits. Recent studies have shown that regions governing basic sensory processing, such as the somatosensory cortex, show abnormalities in those with ADHD, suggesting that these processes may also be compromised. Methods We used event-related magnetoencephalography (MEG) to examine patterns of cortical rhythms in the primary (SI) and secondary (SII) somatosensory cortices in response to median nerve stimulation, in 9 adults with ADHD and 10 healthy controls. Stimuli were brief (0.2 ms) non-painful electrical pulses presented to the median nerve in two counterbalanced conditions: unpredictable and predictable stimulus presentation. We measured changes in strength, synchronicity, and frequency of cortical rhythms. Results The healthy comparison group showed strong event-related desynchrony and synchrony in SI and SII. By contrast, those with ADHD showed significantly weaker event-related desynchrony and event-related synchrony in the alpha (8–12 Hz) and beta (15–30 Hz) bands, respectively. This was most striking during random presentation of median nerve stimulation. Adults with ADHD showed significantly shorter duration of beta rebound in both SI and SII except for when the onset of the stimulus event could be predicted. In this case, the rhythmicity of SI (but not SII) in the ADHD group did not differ from that of controls. Conclusion Our findings suggest that somatosensory processing is altered in individuals with ADHD. MEG constitutes a promising approach to profiling patterns of neural activity during the processing of sensory input (e.g., detection of a tactile stimulus, stimulus predictability) and facilitating our

  14. Mining process performance from event logs

    NARCIS (Netherlands)

    Adriansyah, A.; Buijs, J.C.A.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In systems where process executions are not strictly enforced by a predefined process model, obtaining reliable performance information is not trivial. In this paper, we analyzed an event log of a real-life process, taken from a Dutch financial institute, using process mining techniques. In

  15. First Dutch Process Control Security Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2008-01-01

    On May 21st, 2008, the Dutch National Infrastructure against Cyber Crime (NICC) organised their first Process Control Security Event. Mrs. Annemarie Zielstra, the NICC programme manager, opened the event and welcomed over 100 representatives of key industry sectors. “Earlier studies in the

  16. Discrete event simulation of the Defense Waste Processing Facility (DWPF) analytical laboratory

    International Nuclear Information System (INIS)

    Shanahan, K.L.

    1992-02-01

    A discrete event simulation of the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) analytical laboratory has been constructed in the GPSS language. It was used to estimate laboratory analysis times at process analytical hold points and to study the effect of sample number on those times. Typical results are presented for three different simulations representing increasing levels of complexity, and for different sampling schemes. Example equipment utilization time plots are also included. SRS DWPF laboratory management and chemists found the simulations very useful for resource and schedule planning.
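The original model was written in GPSS; as a rough illustration of the same idea, a single-analyzer laboratory queue can be simulated in Python with a priority queue of timed events. All names and parameters below are illustrative, not taken from the DWPF study.

```python
import heapq
import random

def simulate_lab(n_samples, mean_interarrival, mean_analysis, seed=0):
    """Toy discrete-event simulation of a single-analyzer lab queue.

    Samples arrive at random intervals; one analyzer processes them in
    FIFO order. Returns the mean turnaround time per sample.
    """
    rng = random.Random(seed)
    events = []  # heap of (time, kind, sample_id)
    t = 0.0
    for i in range(n_samples):
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, "arrive", i))

    analyzer_free_at = 0.0
    arrival_time, turnaround = {}, {}
    while events:
        time, kind, sid = heapq.heappop(events)
        if kind == "arrive":
            arrival_time[sid] = time
            start = max(time, analyzer_free_at)
            finish = start + rng.expovariate(1.0 / mean_analysis)
            analyzer_free_at = finish
            heapq.heappush(events, (finish, "done", sid))
        else:  # "done": record how long the sample spent in the lab
            turnaround[sid] = time - arrival_time[sid]
    return sum(turnaround.values()) / len(turnaround)
```

With a fixed seed a run is reproducible, so scenarios (for example, longer analysis times at a hold point) can be compared directly.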

  17. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  18. Fourth Dutch Process Security Control Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.; Zielstra, A.

    2010-01-01

    On December 1st, 2009, the fourth Dutch Process Control Security Event took place in Baarn, The Netherlands. The security event with the title ‘Manage IT!’ was organised by the Dutch National Infrastructure against Cybercrime (NICC). In mid-November, a group of over thirty people participated in the

  19. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  20. Using Complex Event Processing (CEP) and vocal synthesis techniques to improve comprehension of sonified human-centric data

    Science.gov (United States)

    Rimland, Jeff; Ballora, Mark

    2014-05-01

    The field of sonification, which uses auditory presentation of data to replace or augment visualization techniques, is gaining popularity and acceptance for analysis of "big data" and for assisting analysts who are unable to utilize traditional visual approaches due to either: 1) visual overload caused by existing displays; 2) concurrent need to perform critical visually intensive tasks (e.g. operating a vehicle or performing a medical procedure); or 3) visual impairment due to either temporary environmental factors (e.g. dense smoke) or biological causes. Sonification tools typically map data values to sound attributes such as pitch, volume, and localization to enable them to be interpreted via human listening. In more complex problems, the challenge is in creating multi-dimensional sonifications that are both compelling and listenable, and that have enough discrete features that can be modulated in ways that allow meaningful discrimination by a listener. We propose a solution to this problem that incorporates Complex Event Processing (CEP) with speech synthesis. Some of the more promising sonifications to date use speech synthesis, which is an "instrument" that is amenable to extended listening, and can also provide a great deal of subtle nuance. These vocal nuances, which can represent a nearly limitless number of expressive meanings (via a combination of pitch, inflection, volume, and other acoustic factors), are the basis of our daily communications, and thus have the potential to engage the innate human understanding of these sounds. Additionally, recent advances in CEP have facilitated the extraction of multi-level hierarchies of information, which is necessary to bridge the gap between raw data and this type of vocal synthesis. We therefore propose that CEP-enabled sonifications based on the sound of human utterances could be considered the next logical step in human-centric "big data" compression and transmission.
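As a minimal illustration of the parameter-mapping step that sonification tools perform (before any CEP layering or vocal synthesis), a data value can be mapped linearly onto a pitch range. The 220–880 Hz range and the function name are illustrative assumptions, not values from the paper.

```python
def sonify_value(x, x_min, x_max, f_low=220.0, f_high=880.0):
    """Map a data value linearly onto a pitch range (Hz).

    220-880 Hz spans two octaves starting at A3, a comfortable
    listening range; the endpoints are arbitrary illustrative choices.
    """
    if x_max == x_min:
        return (f_low + f_high) / 2.0
    frac = (x - x_min) / (x_max - x_min)
    frac = min(1.0, max(0.0, frac))  # clamp out-of-range values
    return f_low + frac * (f_high - f_low)

# Map a short data series to a sequence of pitches.
series = [3, 7, 5, 9, 1]
pitches = [sonify_value(v, min(series), max(series)) for v in series]
```

The same pattern extends to volume or localization; CEP would sit upstream, deciding *which* derived events are worth sonifying at all.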

  1. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    Science.gov (United States)

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
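The study itself used Arena; the core effect it exploits, that reducing the variability of oncologist-related delays shortens planning time, can be illustrated with a toy single-server queue using the Lindley recursion. The arrival and service rates below are invented for illustration only.

```python
import random

def mean_wait(service_times, interarrivals):
    """Lindley recursion: average queueing delay in a single-server
    FIFO queue, given per-job service times and interarrival gaps."""
    w, total = 0.0, 0.0
    for s, a in zip(service_times, interarrivals):
        total += w
        w = max(0.0, w + s - a)
    return total / len(service_times)

rng = random.Random(1)
n = 20000
gaps = [rng.expovariate(1.0) for _ in range(n)]            # ~1 plan arriving per hour
variable = [rng.expovariate(1.0 / 0.8) for _ in range(n)]  # review: mean 0.8 h, high variance
constant = [0.8] * n                                       # same mean review time, no variance

wait_variable = mean_wait(variable, gaps)
wait_constant = mean_wait(constant, gaps)
```

With the same mean service time, the high-variance scenario produces markedly longer average waits, which is the queueing-theoretic reason that taming delay variability, not just delay length, improves planning throughput.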

  2. The analysis of a complex fire event using multispaceborne observations

    Directory of Open Access Journals (Sweden)

    Andrei Simona

    2018-01-01

    Full Text Available This study documents a complex fire event that occurred in October 2016 in a belligerent area of the Middle East. Two fire outbreaks were detected by different spacecraft monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using ERA Interim Reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  3. The analysis of a complex fire event using multispaceborne observations

    Science.gov (United States)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a belligerent area of the Middle East. Two fire outbreaks were detected by different spacecraft monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using ERA Interim Reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  4. Event-driven processing for hardware-efficient neural spike sorting

    Science.gov (United States)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

    Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide a new, efficient means for hardware implementation that is completely activity dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented in a low power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining the signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting accuracies can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
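A send-on-delta form of the level-crossing sampling the paper investigates can be sketched as follows: an event is emitted each time the signal moves one quantization step `delta` away from the last recorded level, so a quiet channel produces no data at all. This sketch is a simplified illustration, not the authors' hardware implementation.

```python
def level_crossing_encode(signal, delta):
    """Encode a sampled signal as (index, direction) events, one event
    per quantization step delta crossed since the last recorded level."""
    events = []
    last = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        while x - last >= delta:   # signal rose at least one step
            last += delta
            events.append((i, +1))
        while last - x >= delta:   # signal fell at least one step
            last -= delta
            events.append((i, -1))
    return events
```

The resulting event stream is completely activity dependent: the data rate tracks the signal's dynamics rather than a fixed sampling clock, which is what makes downstream event-driven spike processing cheap.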

  5. Consolidation of Complex Events via Reinstatement in Posterior Cingulate Cortex

    Science.gov (United States)

    Keidel, James L.; Ing, Leslie P.; Horner, Aidan J.

    2015-01-01

    It is well-established that active rehearsal increases the efficacy of memory consolidation. It is also known that complex events are interpreted with reference to prior knowledge. However, comparatively little attention has been given to the neural underpinnings of these effects. In healthy adult humans, we investigated the impact of effortful, active rehearsal on memory for events by showing people several short video clips and then asking them to recall these clips, either aloud (Experiment 1) or silently while in an MRI scanner (Experiment 2). In both experiments, actively rehearsed clips were remembered in far greater detail than unrehearsed clips when tested a week later. In Experiment 1, highly similar descriptions of events were produced across retrieval trials, suggesting a degree of semanticization of the memories had taken place. In Experiment 2, spatial patterns of BOLD signal in medial temporal and posterior midline regions were correlated when encoding and rehearsing the same video. Moreover, the strength of this correlation in the posterior cingulate predicted the amount of information subsequently recalled. This is likely to reflect a strengthening of the representation of the video's content. We argue that these representations combine both new episodic information and stored semantic knowledge (or “schemas”). We therefore suggest that posterior midline structures aid consolidation by reinstating and strengthening the associations between episodic details and more generic schematic information. This leads to the creation of coherent memory representations of lifelike, complex events that are resistant to forgetting, but somewhat inflexible and semantic-like in nature. SIGNIFICANCE STATEMENT Memories are strengthened via consolidation. We investigated memory for lifelike events using video clips and showed that rehearsing their content dramatically boosts memory consolidation. Using MRI scanning, we measured patterns of brain activity while

  6. Event processing time prediction at the CMS experiment of the Large Hadron Collider

    International Nuclear Information System (INIS)

    Cury, Samir; Gutsche, Oliver; Kcira, Dorian

    2014-01-01

    The physics event reconstruction is one of the biggest challenges for the computing of the LHC experiments. Among the different tasks that the computing systems of the CMS experiment perform, the reconstruction takes most of the available CPU resources. The reconstruction time of single collisions varies according to event complexity. Measurements were done in order to determine this correlation quantitatively, creating means to predict it based on the data-taking conditions of the input samples. Currently the data processing system splits tasks into groups with the same number of collisions and does not account for variations in the processing time. These variations can be large and can lead to a considerable increase in the time it takes for CMS workflows to finish. The goal of this study was to use estimates of processing time to split the workflow into jobs more efficiently. By considering the CPU time needed for each job, the spread of the job-length distribution in a workflow is reduced.
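The proposed splitting strategy can be sketched as a greedy grouping by estimated CPU time rather than by a fixed number of collisions. `split_by_time` and `est_seconds` are hypothetical names for illustration, not CMS workflow-management code.

```python
def split_by_time(work_units, target_job_seconds, est_seconds):
    """Greedily group work units (e.g. luminosity sections) into jobs so
    that each job's estimated CPU time stays near a target, instead of
    putting a fixed number of units in every job."""
    jobs, current, current_time = [], [], 0.0
    for unit in work_units:
        t = est_seconds(unit)
        if current and current_time + t > target_job_seconds:
            jobs.append(current)          # close the current job
            current, current_time = [], 0.0
        current.append(unit)
        current_time += t
    if current:
        jobs.append(current)
    return jobs
```

Because each job is sized by predicted CPU time, the job-length distribution tightens even when per-collision reconstruction times vary widely with event complexity.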

  7. The ATLAS Event Service: A new approach to event processing

    Science.gov (United States)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  8. Process cubes : slicing, dicing, rolling up and drilling down event data for process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2013-01-01

    Recent breakthroughs in process mining research make it possible to discover, analyze, and improve business processes based on event data. The growth of event data provides many opportunities but also imposes new challenges. Process mining is typically done for an isolated well-defined process in

  9. Radiochemical data collected on events from which radioactivity escaped beyond the borders of the Nevada test range complex

    International Nuclear Information System (INIS)

    Hicks, H.G.

    1981-01-01

    This report identifies all nuclear events in Nevada that are known to have sent radioactivity beyond the borders of the test range complex. There have been 177 such tests, representing seven different types: nuclear detonations in the atmosphere, nuclear excavation events, nuclear safety events, underground nuclear events that inadvertently seeped or vented to the atmosphere, dispersion of plutonium and/or uranium by chemical high explosives, nuclear rocket engine tests, and nuclear ramjet engine tests. The source term for each of these events is given, together with the data base from which it was derived (except where the data are classified). The computer programs used for organizing and processing the data base and calculating radionuclide production are described and included, together with the input and output data and details of the calculations. This is the basic information needed to make computer modeling studies of the fallout from any of these 177 events

  10. Complex Event Detection via Multi Source Video Attributes (Open Access)

    Science.gov (United States)

    2013-10-03

    Complex Event Detection via Multi-Source Video Attributes. Zhigang Ma, Yi Yang, Zhongwen Xu, Shuicheng Yan, Nicu Sebe, Alexander G. Hauptmann. … under its International Research Centre @ Singapore Funding Initiative and administered by the IDM Programme Office, and the Intelligence Advanced

  11. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.
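The deduction of implicit events from explicit ones can be sketched as simple forward chaining over event tuples. The rule and event names below are invented examples in the spirit of transcription regulation, not the authors' actual rule set.

```python
def apply_inference(events, rules):
    """Forward-chain: repeatedly apply rules that map one explicit
    event to an implicit one, until no new events are produced."""
    known = set(events)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for ev in list(known):
                new = rule(ev)
                if new is not None and new not in known:
                    known.add(new)
                    changed = True
    return known

# Hypothetical domain rule: a protein binding the promoter of a gene
# implies a transcription-regulation event for that gene.
def binding_implies_regulation(ev):
    if ev[0] == "binds_promoter":
        return ("regulates_transcription", ev[1], ev[2])
    return None

facts = {("binds_promoter", "CRP", "lacZ")}
inferred = apply_inference(facts, [binding_implies_regulation])
```

In the paper's setting the explicit events come from text mining and the rules encode ontology-backed domain knowledge; the chaining mechanism itself is the simple part, as above.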

  12. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  13. Y-12 National Security Complex Emergency Management Hazards Assessment (EMHA) Process; FINAL

    International Nuclear Information System (INIS)

    Bailiff, E.F.; Bolling, J.D.

    2001-01-01

    This document establishes requirements and standard methods for the development and maintenance of the Emergency Management Hazards Assessment (EMHA) process used by the lead and all event contractors at the Y-12 Complex for emergency planning and preparedness. The EMHA process provides the technical basis for the Y-12 emergency management program. The instructions provided in this document include methods and requirements for performing the following emergency management activities at Y-12: (1) hazards identification; (2) hazards survey, and (3) hazards assessment

  14. Stress reaction process-based hierarchical recognition algorithm for continuous intrusion events in optical fiber prewarning system

    Science.gov (United States)

    Qu, Hongquan; Yuan, Shijiao; Wang, Yanping; Yang, Dan

    2018-04-01

    To improve the recognition performance of optical fiber prewarning system (OFPS), this study proposed a hierarchical recognition algorithm (HRA). Compared with traditional methods, which employ only a complex algorithm that includes multiple extracted features and complex classifiers to increase the recognition rate with a considerable decrease in recognition speed, HRA takes advantage of the continuity of intrusion events, thereby creating a staged recognition flow inspired by stress reaction. HRA is expected to achieve high-level recognition accuracy with less time consumption. First, this work analyzed the continuity of intrusion events and then presented the algorithm based on the mechanism of stress reaction. Finally, it verified the time consumption through theoretical analysis and experiments, and the recognition accuracy was obtained through experiments. Experiment results show that the processing speed of HRA is 3.3 times faster than that of a traditional complicated algorithm and has a similar recognition rate of 98%. The study is of great significance to fast intrusion event recognition in OFPS.
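The staged "stress reaction" flow can be sketched as a two-stage cascade: a cheap per-frame check gates the expensive multi-feature classifier, so most frames exit on the fast path. All names, features, and thresholds here are illustrative, not from the OFPS implementation.

```python
def hierarchical_recognize(frame, cheap_score, full_classify, threshold=0.5):
    """Stage 1: cheap feature check on every frame.
    Stage 2: run the expensive classifier only on flagged candidates."""
    if cheap_score(frame) < threshold:
        return "no-event"        # fast path: most frames exit here
    return full_classify(frame)  # slow path: full feature extraction

calls = {"full": 0}

def cheap_score(frame):
    # Stage-1 feature: peak amplitude, cheap to compute per frame.
    return max(abs(x) for x in frame)

def full_classify(frame):
    # Stand-in for the expensive multi-feature classifier.
    calls["full"] += 1
    return "digging" if sum(frame) > 0 else "walking"

quiet = hierarchical_recognize([0.01, -0.02, 0.0], cheap_score, full_classify)
loud = hierarchical_recognize([0.9, 0.8, 0.7], cheap_score, full_classify)
```

Because continuous intrusions persist across many frames, a candidate missed by one cheap check is usually caught on a later frame, which is what lets the cascade keep accuracy while cutting average processing time.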

  15. Event processing for business organizing the real-time enterprise

    CERN Document Server

    Luckham, David C

    2011-01-01

    Find out how Event Processing (EP) works and how it can work for you. Business Event Processing: An Introduction and Strategy Guide thoroughly describes what EP is, how to use it, and how it relates to other popular information technology architectures such as Service Oriented Architecture. Explains how sense and response architectures are being applied with tremendous results to businesses throughout the world and shows businesses how they can get started implementing EP. Shows how to choose business event processing technology to suit your specific business needs and how to keep costs of adopting it

  16. Offside Decisions by Expert Assistant Referees in Association Football: Perception and Recall of Spatial Positions in Complex Dynamic Events

    Science.gov (United States)

    Gilis, Bart; Helsen, Werner; Catteeuw, Peter; Wagemans, Johan

    2008-01-01

    This study investigated the offside decision-making process in association football. The first aim was to capture the specific offside decision-making skills in complex dynamic events. Second, we analyzed the type of errors to investigate the factors leading to incorrect decisions. Federation Internationale de Football Association (FIFA; n = 29)…

  17. Epigenetics and Shared Molecular Processes in the Regeneration of Complex Structures

    Directory of Open Access Journals (Sweden)

    Labib Rouhana

    2016-01-01

    Full Text Available The ability to regenerate complex structures is broadly represented in both plant and animal kingdoms. Although regenerative abilities vary significantly amongst metazoans, cumulative studies have identified cellular events that are broadly observed during regenerative events. For example, structural damage is recognized and wound healing initiated upon injury, which is followed by programmed cell death in the vicinity of damaged tissue and a burst in proliferation of progenitor cells. Sustained proliferation and localization of progenitor cells to site of injury give rise to an assembly of differentiating cells known as the regeneration blastema, which fosters the development of new tissue. Finally, preexisting tissue rearranges and integrates with newly differentiated cells to restore proportionality and function. While heterogeneity exists in the basic processes displayed during regenerative events in different species—most notably the cellular source contributing to formation of new tissue—activation of conserved molecular pathways is imperative for proper regulation of cells during regeneration. Perhaps the most fundamental of such molecular processes entails chromatin rearrangements, which prime large changes in gene expression required for differentiation and/or dedifferentiation of progenitor cells. This review provides an overview of known contributions to regenerative processes by noncoding RNAs and chromatin-modifying enzymes involved in epigenetic regulation.

  18. Stressful life events and psychological dysfunction in complex regional pain syndrome type I

    NARCIS (Netherlands)

    Geertzen, JHB; de Bruijn-Kofman, AT; de Bruijn, HP; van de Wiel, HBM; Dijkstra, PU

    Objective: To determine to what extent stressful life events and psychological dysfunction play a role in the pathogenesis of Complex Regional Pain Syndrome type I (CRPS). Design: A comparative study between a CRPS group and a control group. Stressful life events and psychological dysfunction

  19. A Process for Predicting Manhole Events in Manhattan

    OpenAIRE

    Isaac, Delfina F.; Ierome, Steve; Dutta, Haimonti; Radeva, Axinia; Passonneau, Rebecca J.; Rudin, Cynthia

    2009-01-01

    We present a knowledge discovery and data mining process developed as part of the Columbia/Con Edison project on manhole event prediction. This process can assist with real-world prioritization problems that involve raw data in the form of noisy documents requiring significant amounts of pre-processing. The documents are linked to a set of instances to be ranked according to prediction criteria. In the case of manhole event prediction, which is a new application for machine learning, the goal...

  20. Recurrent process mining with live event data

    NARCIS (Netherlands)

    Syamsiyah, A.; van Dongen, B.F.; van der Aalst, W.M.P.; Teniente, E.; Weidlich, M.

    2018-01-01

    In organizations, process mining activities are typically performed in a recurrent fashion, e.g. once a week, an event log is extracted from the information systems and a process mining tool is used to analyze the process’ characteristics. Typically, process mining tools import the data from a

  1. Consolidation of Complex Events via Reinstatement in Posterior Cingulate Cortex.

    Science.gov (United States)

    Bird, Chris M; Keidel, James L; Ing, Leslie P; Horner, Aidan J; Burgess, Neil

    2015-10-28

    It is well-established that active rehearsal increases the efficacy of memory consolidation. It is also known that complex events are interpreted with reference to prior knowledge. However, comparatively little attention has been given to the neural underpinnings of these effects. In healthy adult humans, we investigated the impact of effortful, active rehearsal on memory for events by showing people several short video clips and then asking them to recall these clips, either aloud (Experiment 1) or silently while in an MRI scanner (Experiment 2). In both experiments, actively rehearsed clips were remembered in far greater detail than unrehearsed clips when tested a week later. In Experiment 1, highly similar descriptions of events were produced across retrieval trials, suggesting a degree of semanticization of the memories had taken place. In Experiment 2, spatial patterns of BOLD signal in medial temporal and posterior midline regions were correlated when encoding and rehearsing the same video. Moreover, the strength of this correlation in the posterior cingulate predicted the amount of information subsequently recalled. This is likely to reflect a strengthening of the representation of the video's content. We argue that these representations combine both new episodic information and stored semantic knowledge (or "schemas"). We therefore suggest that posterior midline structures aid consolidation by reinstating and strengthening the associations between episodic details and more generic schematic information. This leads to the creation of coherent memory representations of lifelike, complex events that are resistant to forgetting, but somewhat inflexible and semantic-like in nature. Copyright © 2015 Bird, Keidel et al.

  2. Out-of-order event processing in kinetic data structures

    DEFF Research Database (Denmark)

    Abam, Mohammad; de Berg, Mark; Agrawal, Pankaj

    2011-01-01

    We study the problem of designing kinetic data structures (KDS’s for short) when event times cannot be computed exactly and events may be processed in a wrong order. In traditional KDS’s this can lead to major inconsistencies from which the KDS cannot recover. We present more robust KDS’s for the maintenance of several fundamental structures such as kinetic sorting and kinetic tournament trees, which overcome the difficulty by employing a refined event scheduling and processing technique. We prove that the new event scheduling mechanism leads to a KDS that is correct except for finitely many short…
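For background, a standard kinetic sorting structure (with exact event times, i.e. without the robustness machinery this paper adds) maintains one certificate per adjacent pair of points and processes certificate failures from a priority queue. The sketch below keeps points moving linearly as `(a, b)` with position `a + b*t`; stale certificates are detected and skipped at pop time.

```python
import heapq

def failure_time(p, q, now):
    """Time > now at which point p overtakes q, or None if it never does."""
    (ap, bp), (aq, bq) = p, q
    if bp <= bq:
        return None  # p is not faster, so the pair's order certificate holds
    t = (aq - ap) / (bp - bq)
    return t if t > now else None

def kinetic_sort(points, t_end):
    """Maintain points (a, b) sorted by position a + b*t from t=0 to t_end,
    processing swap events from a priority queue of certificate failures."""
    pts = sorted(points, key=lambda p: p[0])  # sorted at t = 0
    queue = []
    for i in range(len(pts) - 1):
        ft = failure_time(pts[i], pts[i + 1], 0.0)
        if ft is not None:
            heapq.heappush(queue, (ft, i))
    while queue and queue[0][0] <= t_end:
        t, i = heapq.heappop(queue)
        # Lazy deletion: a certificate may be stale after earlier swaps.
        ft = failure_time(pts[i], pts[i + 1], t - 1e-12)
        if ft is None or abs(ft - t) > 1e-9:
            continue
        pts[i], pts[i + 1] = pts[i + 1], pts[i]  # process the swap event
        for j in (i - 1, i, i + 1):              # reschedule affected pairs
            if 0 <= j < len(pts) - 1:
                nft = failure_time(pts[j], pts[j + 1], t)
                if nft is not None:
                    heapq.heappush(queue, (nft, j))
    return pts
```

The paper's contribution is precisely what this sketch assumes away: when `failure_time` can only be approximated, events may fire in the wrong order, and the refined scheduling keeps the structure correct except for short periods around each event.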

  3. Post-Event Processing in Children with Social Phobia

    Science.gov (United States)

    Schmitz, Julian; Kramer, Martina; Blechert, Jens; Tuschen-Caffier, Brunna

    2010-01-01

    In the aftermath of a distressing social event, adults with social phobia (SP) engage in a review of this event with a focus on its negative aspects. To date, little is known about this post-event processing (PEP) and its relationship with perceived performance in SP children. We measured PEP in SP children (n = 24) and healthy controls (HC; n =…

  4. Consequence Prioritization Process for Potential High Consequence Events (HCE)

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Sarah G. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-10-31

    This document describes the process for Consequence Prioritization, the first phase of the Consequence-Driven Cyber-Informed Engineering (CCE) framework. The primary goal of Consequence Prioritization is to identify potential disruptive events that would significantly inhibit an organization's ability to provide the critical services and functions deemed fundamental to its business mission. These disruptive events, defined as High Consequence Events (HCE), include both events that have occurred and events that could be realized through an attack on critical infrastructure owner assets. While other efforts have been initiated to identify and mitigate disruptive events at the national security level, such as Presidential Policy Directive 41 (PPD-41), this process is intended to be used by individual organizations to evaluate events that fall below the threshold of a national security concern. Described another way, Consequence Prioritization considers threats greater than those addressable by standard cyber-hygiene and includes the consideration of events that go beyond a traditional continuity of operations (COOP) perspective. Finally, Consequence Prioritization is most successful when organizations adopt a multi-disciplinary approach, engaging both cyber security and engineering expertise, as in-depth engineering perspectives are required to recognize, characterize, and mitigate HCEs. Figure 1 provides a high-level overview of the prioritization process.

  5. Historical events of the Chemical Processing Department

    Energy Technology Data Exchange (ETDEWEB)

    Lane, W.A.

    1965-11-12

    The purpose of this report is to summarize and document the significant historical events pertinent to the operation of the Chemical Processing facilities at Hanford. The report covers, in chronological order, the major construction activities and historical events from 1944 to September, 1965. Also included are the production records achieved and a history of the department's unit cost performance.

  6. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
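    The combination described above can be illustrated with a toy sketch: a discrete-event task loop (the process-model side) whose service times depend on a continuously evolving productivity state (the system-dynamics side). All names and parameter values here are hypothetical, chosen only to show how the two modeling styles interlock, not to reproduce the authors' model.

```python
import heapq

def simulate(n_tasks=100, base_rate=1.0, fatigue=0.002):
    """Toy hybrid model: a discrete-event task queue whose service
    rate is modulated by a continuously evolving productivity state."""
    clock, productivity = 0.0, base_rate
    events = [(0.0, i) for i in range(n_tasks)]  # (ready_time, task_id)
    heapq.heapify(events)
    while events:
        ready, task = heapq.heappop(events)
        clock = max(clock, ready)
        duration = 1.0 / productivity                # discrete-event part
        clock += duration
        # continuous part: productivity erodes slowly over worked time
        productivity = max(0.1, productivity - fatigue * duration)
    return clock, productivity

total_time, final_p = simulate()
```

Because productivity decays as work accumulates, the 100 unit-sized tasks take noticeably longer than 100 time units, which is exactly the kind of feedback effect a pure discrete-event model would miss.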

  7. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    Science.gov (United States)

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, Business Intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain process knowledge, check compliance, and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE-based information systems.

  8. Fine grained event processing on HPCs with the ATLAS Yoda system

    CERN Document Server

    Calafiura, Paolo; The ATLAS collaboration; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

    High performance computing facilities present unique challenges and opportunities for HENP event processing. The massive scale of many HPC systems means that fractionally small utilizations can yield large returns in processing throughput. Parallel applications which can dynamically and efficiently fill any scheduling opportunities the resource presents benefit both the facility (maximal utilization) and the (compute-limited) science. The ATLAS Yoda system provides this capability to HENP-like event processing applications by implementing event-level processing in an MPI-based master-client model that integrates seamlessly with the more broadly scoped ATLAS Event Service. Fine grained, event level work assignments are intelligently dispatched to parallel workers to sustain full utilization on all cores, with outputs streamed off to destination object stores in near real time with similarly fine granularity, such that processing can proceed until termination with full utilization. The system offers the efficie...

  9. An algebra of discrete event processes

    Science.gov (United States)

    Heymann, Michael; Meyer, George

    1991-01-01

    This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.

  10. Dynamic anticipatory processing of hierarchical sequential events: a common role for Broca's area and ventral premotor cortex across domains?

    Science.gov (United States)

    Fiebach, Christian J; Schubotz, Ricarda I

    2006-05-01

    This paper proposes a domain-general model for the functional contribution of ventral premotor cortex (PMv) and adjacent Broca's area to perceptual, cognitive, and motor processing. We propose to understand this frontal region as a highly flexible sequence processor, with the PMv mapping sequential events onto stored structural templates and Broca's area involved in more complex, hierarchical or hypersequential processing. This proposal is supported by reference to previous functional neuroimaging studies investigating abstract sequence processing and syntactic processing.

  11. Emotional Granularity Effects on Event-Related Brain Potentials during Affective Picture Processing.

    Science.gov (United States)

    Lee, Ja Y; Lindquist, Kristen A; Nam, Chang S

    2017-01-01

    There is debate about whether emotional granularity, the tendency to label emotions in a nuanced and specific manner, is merely a product of labeling abilities, or a systematic difference in the experience of emotion during emotionally evocative events. According to the Conceptual Act Theory of Emotion (CAT) (Barrett, 2006), emotional granularity is due to the latter and is a product of on-going temporal differences in how individuals categorize and thus make meaning of their affective states. To address this question, the present study investigated the effects of individual differences in emotional granularity on electroencephalography-based brain activity during the experience of emotion in response to affective images. Event-related potentials (ERP) and event-related desynchronization and synchronization (ERD/ERS) analysis techniques were used. We found that ERP responses during the very early (60-90 ms), middle (270-300 ms), and later (540-570 ms) moments of stimulus presentation were associated with individuals' level of granularity. We also observed that highly granular individuals, compared to lowly granular individuals, exhibited relatively stable desynchronization of alpha power (8-12 Hz) and synchronization of gamma power (30-50 Hz) during the 3 s of stimulus presentation. Overall, our results suggest that emotional granularity is related to differences in neural processing throughout emotional experiences and that high granularity could be associated with access to executive control resources and a more habitual processing of affective stimuli, or a kind of "emotional complexity." Implications for models of emotion are also discussed.

  12. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally rules out Zeno behavior. Finally, a numerical example is given to verify the obtained results.
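    The triggering idea can be illustrated independently of the reaction-diffusion setting with a minimal scalar sketch, assuming a simple linear plant: the controller's held sample is refreshed only when the measurement error exceeds a threshold, so far fewer control updates than integration steps are needed. The gain, threshold, and dynamics below are illustrative, not taken from the paper.

```python
def event_triggered_sim(x0=1.0, k=0.5, eps=0.05, dt=0.001, T=5.0):
    """Scalar sketch of event-triggered control: the control input is
    recomputed only when the measurement error |x - x_held| exceeds a
    threshold eps, instead of at every sampling instant."""
    x, x_held, events = x0, x0, 0
    steps = int(T / dt)
    for _ in range(steps):
        if abs(x - x_held) > eps:   # event-triggering condition
            x_held = x              # refresh the controller's sample
            events += 1
        u = -k * x_held
        x += dt * (-x + u)          # Euler step of dx/dt = -x + u
    return x, events, steps

x_final, n_events, n_steps = event_triggered_sim()
```

The state still converges to a small neighborhood of the origin, while the number of triggered updates is orders of magnitude smaller than the number of integration steps, which is the communication saving the abstract refers to.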

  13. Minimized state complexity of quantum-encoded cryptic processes

    Science.gov (United States)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  14. Complex active regions as the main source of extreme and large solar proton events

    Science.gov (United States)

    Ishkov, V. N.

    2013-12-01

    A study of solar proton sources indicated that solar flare events responsible for ≥2000 pfu proton fluxes mostly occur in complex active regions (CARs), i.e., in transition structures between active regions and activity complexes. Different classes of similar structures and their relation to solar proton events (SPEs) and evolution, depending on the origination conditions, are considered. Arguments in favor of the fact that sunspot groups with extreme dimensions are CARs are presented. An analysis of the flare activity in a CAR resulted in the detection of "physical" boundaries, which separate magnetic structures of the same polarity and are responsible for the independent development of each structure.

  15. Waste Form Features, Events, and Processes

    International Nuclear Information System (INIS)

    R. Schreiner

    2004-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of the waste form features, events and processes (FEPs) with respect to modeling used to support the Total System Performance Assessment for License Application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical bases for screening decisions. This information is required by the Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs addressed in this report deal with the issues related to the degradation and potential failure of the waste form and the migration of the waste form colloids. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA, (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical bases for exclusion from TSPA-LA (i.e., why the FEP is excluded). This revision addresses the TSPA-LA FEP list (DTN: MO0407SEPFEPLA.000 [DIRS 170760]). The primary purpose of this report is to identify and document the analyses and resolution of the features, events, and processes (FEPs) associated with the waste form performance in the repository. Forty FEPs were identified that are associated with the waste form performance. This report has been prepared to document the screening methodology used in the process of FEP inclusion and exclusion. The analyses documented in this report are for the license application (LA) base case design (BSC 2004 [DIRS 168489]). In this design, a drip shield is placed over the waste package and no backfill is placed over the drip shield (BSC 2004 [DIRS 168489]). Each FEP may include one or more specific issues that are collectively described by a FEP name and a FEP description. The FEP description may encompass a single feature, process or event, or a few closely related or coupled processes if the entire FEP can be addressed by a single specific screening argument or TSPA-LA disposition. The FEPs are

  16. Waste Form Features, Events, and Processes

    Energy Technology Data Exchange (ETDEWEB)

    R. Schreiner

    2004-10-27

    The purpose of this report is to evaluate and document the inclusion or exclusion of the waste form features, events and processes (FEPs) with respect to modeling used to support the Total System Performance Assessment for License Application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical bases for screening decisions. This information is required by the Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs addressed in this report deal with the issues related to the degradation and potential failure of the waste form and the migration of the waste form colloids. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA, (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical bases for exclusion from TSPA-LA (i.e., why the FEP is excluded). This revision addresses the TSPA-LA FEP list (DTN: MO0407SEPFEPLA.000 [DIRS 170760]). The primary purpose of this report is to identify and document the analyses and resolution of the features, events, and processes (FEPs) associated with the waste form performance in the repository. Forty FEPs were identified that are associated with the waste form performance. This report has been prepared to document the screening methodology used in the process of FEP inclusion and exclusion. The analyses documented in this report are for the license application (LA) base case design (BSC 2004 [DIRS 168489]). In this design, a drip shield is placed over the waste package and no backfill is placed over the drip shield (BSC 2004 [DIRS 168489]). Each FEP may include one or more specific issues that are collectively described by a FEP name and a FEP description. The FEP description may encompass a single feature, process or event, or a few closely related or coupled processes if the entire FEP can be addressed by a single specific screening argument or TSPA-LA disposition. The FEPs are

  17. Brain Signals of Face Processing as Revealed by Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Ela I. Olivares

    2015-01-01

    Full Text Available We analyze the functional significance of different event-related potentials (ERPs as electrophysiological indices of face perception and face recognition, according to cognitive and neurofunctional models of face processing. Initially, the processing of faces seems to be supported by early extrastriate occipital cortices and revealed by modulations of the occipital P1. This early response is thought to reflect the detection of certain primary structural aspects indicating the presence grosso modo of a face within the visual field. The posterior-temporal N170 is more sensitive to the detection of faces as complex-structured stimuli and, therefore, to the presence of its distinctive organizational characteristics prior to within-category identification. In turn, the relatively late and probably more rostrally generated N250r and N400-like responses might respectively indicate processes of access and retrieval of face-related information, which is stored in long-term memory (LTM. New methods of analysis of electrophysiological and neuroanatomical data, namely, dynamic causal modeling, single-trial and time-frequency analyses, are highly recommended to advance in the knowledge of those brain mechanisms concerning face processing.

  18. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.

  19. Client-Side Event Processing for Personalized Web Advertisement

    Science.gov (United States)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    The market for Web advertisement is continuously growing and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate highly personalized ads for the current Web user visiting particular Web content. They mainly try to develop a profile based on the content of that Web page or on a long-term user profile, without taking the user's current preferences into account. We argue that by discovering a user's interest from his current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies in order to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by performing complex event processing on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.

  20. Epidemic processes in complex networks

    OpenAIRE

    Pastor Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-01-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The t...

  1. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    Science.gov (United States)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
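    For the simpler, fully Markovian case (not the hidden semi-Markov models treated in the paper), the entropy rate has the familiar closed form h = -Σ_i π_i Σ_j P_ij log₂ P_ij, where π is the stationary distribution. A minimal sketch:

```python
import math

def stationary(P, iters=1000):
    """Stationary distribution by power iteration (assumes an ergodic chain)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """h = -sum_i pi_i sum_j P_ij log2 P_ij for a finite Markov chain."""
    pi = stationary(P)
    return -sum(pi[i] * sum(p * math.log2(p) for p in P[i] if p > 0)
                for i in range(len(P)))

# A fair coin as a trivial two-state chain: entropy rate is exactly 1 bit/symbol.
P = [[0.5, 0.5], [0.5, 0.5]]
h = entropy_rate(P)
```

The semi-Markov case in the paper is harder precisely because dwell times are not geometrically distributed, so no finite-state closed form of this kind applies directly.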

  2. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
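    A minimal illustration of the event-based style, using Python's standard xml.sax module rather than the Java SAX framework the paper targets: the handler reacts to start-element events as they stream past, without ever materializing a document tree.

```python
import io
import xml.sax

class TagCounter(xml.sax.ContentHandler):
    """Minimal SAX handler: counts start-element events as a stream,
    never building an in-memory tree."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def startElement(self, name, attrs):
        self.counts[name] = self.counts.get(name, 0) + 1

xml_doc = "<root><item>a</item><item>b</item></root>"
handler = TagCounter()
xml.sax.parse(io.StringIO(xml_doc), handler)
# handler.counts -> {'root': 1, 'item': 2}
```

The static-analysis problem the paper addresses arises because output produced inside such callbacks is interleaved with the input stream, so well-formedness of the output cannot be checked locally.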

  3. Effective Complexity of Stationary Process Realizations

    Directory of Open Access Journals (Sweden)

    Arleta Szkoła

    2011-06-01

    Full Text Available The concept of effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, which are probability distributions on finite binary strings. In our previous paper [1] we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that, under not too strong conditions, long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, which is a modification of the original notion of effective complexity that needs fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow.

  4. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in

  5. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Directory of Open Access Journals (Sweden)

    Zhong Yi Wan

    Full Text Available The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more

  6. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete-event systems across different branches of large organizations: they unify commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms, and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested both on business-like event logs as recorded in a higher educational enterprise resource planning system and on a real case scenario involving a set of Dutch municipalities.
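    A greedy heuristic of the kind mentioned above can be caricatured in a few lines, assuming each configurable node can be scored independently against the event log. The node names, options, and fitness function below are entirely hypothetical; the real strategies in the paper account for interactions between configuration choices.

```python
def greedy_configure(config_options, fitness):
    """Greedy heuristic sketch: pick, for each configurable node
    independently, the option with the best fitness against the log."""
    return {node: max(options, key=lambda o: fitness(node, o))
            for node, options in config_options.items()}

# Hypothetical example: two configurable nodes, with fitness taken from
# observed option frequencies in an event log.
log_freq = {("approve", "manual"): 10, ("approve", "auto"): 90,
            ("notify", "email"): 70, ("notify", "sms"): 30}
choice = greedy_configure(
    {"approve": ["manual", "auto"], "notify": ["email", "sms"]},
    lambda n, o: log_freq.get((n, o), 0))
# choice -> {'approve': 'auto', 'notify': 'email'}
```

Because the greedy choice ignores cross-node dependencies, it explores only a linear slice of the exponential search space, which is exactly the trade-off against the exhaustive and genetic strategies.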

  7. An Event-Related Potential Study on the Effects of Cannabis on Emotion Processing

    Science.gov (United States)

    Troup, Lucy J.; Bastidas, Stephanie; Nguyen, Maia T.; Andrzejewski, Jeremy A.; Bowers, Matthew; Nomi, Jason S.

    2016-01-01

    The effect of cannabis on emotional processing was investigated using event-related potential paradigms (ERPs). ERPs associated with emotional processing of cannabis users, and non-using controls, were recorded and compared during an implicit and explicit emotional expression recognition and empathy task. Comparisons in P3 component mean amplitudes were made between cannabis users and controls. Results showed a significant decrease in the P3 amplitude in cannabis users compared to controls. Specifically, cannabis users showed reduced P3 amplitudes for implicit compared to explicit processing over centro-parietal sites which reversed, and was enhanced, at fronto-central sites. Cannabis users also showed a decreased P3 to happy faces, with an increase to angry faces, compared to controls. These effects appear to increase with those participants that self-reported the highest levels of cannabis consumption. Those cannabis users with the greatest consumption rates showed the largest P3 deficits for explicit processing and negative emotions. These data suggest that there is a complex relationship between cannabis consumption and emotion processing that appears to be modulated by attention. PMID:26926868

  8. Phonological Processes in Complex and Compound Words

    Directory of Open Access Journals (Sweden)

    Alieh Kord Zaferanlu Kambuziya

    2016-02-01

    Full Text Available Abstract This research aims at making a comparison between phonological processes in complex and compound Persian words. Data are gathered from a 40,000-word Persian dictionary; 4,034 complex words and 1,464 compound ones were chosen, and the "excel" software was used to count the data. Some results of the research are: 1- "Insertion" is the usual phonological process in complex words. More than half of the different insertions belong to the consonant /g/. Then /y/ and // are in the second and the third order. The consonant /v/ has the least percentage of all. The highest percentage of vowel insertion belongs to /e/. The vowels /a/ and /o/ are in the second and third order. Deletion in complex words can only be seen in the consonant /t/ and the vowel /e/. 2- The most frequent phonological process in compounds is consonant deletion. In this process, seven different consonants are deleted, including /t/, //, /m/, /r/, /ǰ/, /d/, and /c/. The only deleted vowel is /e/. In both groups of complex and compound words, /t/ deletion can be observed. A sequence of three consonants paves the way for the deletion of one of the consonants; if one of the consonants in the sequence is a sonorant like /n/, the deletion process rarely happens. 3- In complex words, consonant deletion causes a lighter syllable weight, whereas vowel deletion causes a heavier syllable weight. So, both of the processes lead to bi-moraic weight. 4- The production of a bi-moraic syllable in Persian is preferable to the Syllable Contact Law. So, specific rules have precedence over universals. 5- Vowel insertion can be seen in both groups of complex and compound words. In complex words, /e/ insertion has the most fundamental part. The vowels /a/ and /o/ are in the second and third place. Whenever there is a sequence of two ultra-heavy syllables, vowel insertion breaks the first syllable into two light syllables. The compounds that are influenced by vowel insertion can be, and are, pronounced without any insertion.

  9. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  10. A review for identification of initiating events in event tree development process on nuclear power plants

    International Nuclear Information System (INIS)

    Riyadi, Eko H.

    2014-01-01

    An initiating event is defined as any event, either internal or external to the nuclear power plants (NPPs), that perturbs the steady state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or loss of coolant accident (LOCA) within the NPPs. These initiating events trigger sequences of events that challenge plant control and safety systems whose failure could potentially lead to core damage or large early release. Selection of initiating events consists of two steps: first, definition of possible events, such as by performing a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. Therefore, the purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By applying this method to a detailed study of the traditional US PSA categorization, the important initiating events, categorized into LOCA, transients and external events, could be obtained.
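As a rough illustration of how an event tree expands an initiating event into accident sequences, here is a minimal sketch. The initiator frequency, the two safety systems, their failure probabilities, and the independence assumption are all hypothetical numbers invented for this example, not values from the paper.

```python
from itertools import product

def event_tree(initiating_event, systems):
    """Enumerate accident sequences: each safety system either succeeds or
    fails, and a sequence's frequency is the initiator frequency times the
    per-branch probabilities (branch failures assumed independent here)."""
    name, freq = initiating_event
    sequences = []
    for outcome in product((True, False), repeat=len(systems)):
        p = freq
        labels = []
        for (sys_name, fail_prob), ok in zip(systems, outcome):
            p *= (1 - fail_prob) if ok else fail_prob
            labels.append(sys_name + ("-ok" if ok else "-fail"))
        sequences.append((name, labels, p))
    return sequences

# Illustrative numbers only: a small-LOCA initiator and two safety systems.
seqs = event_tree(("small-LOCA", 1e-3),
                  [("emergency-cooling", 1e-2), ("containment-spray", 5e-2)])
```

The sequence frequencies sum back to the initiator frequency, which is a quick sanity check on any event-tree enumeration.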

  11. Software for event oriented processing on multiprocessor systems

    International Nuclear Information System (INIS)

    Fischler, M.; Areti, H.; Biel, J.; Bracker, S.; Case, G.; Gaines, I.; Husby, D.; Nash, T.

    1984-08-01

    Computing intensive problems that require the processing of numerous essentially independent events are natural customers for large scale multi-microprocessor systems. This paper describes the software required to support users with such problems in a multiprocessor environment. It is based on experience with and development work aimed at processing very large amounts of high energy physics data

  12. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

    Full Text Available Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of a vector-sensor array. The weight vector update of the CQLMS algorithm is derived based on the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
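The CQLMS algorithm itself operates on complex quaternions; as a simplified illustration of the underlying LMS principle only, the sketch below implements ordinary complex-valued LMS beamforming for a uniform linear array. The array geometry, step size, and noise level are assumptions for this sketch, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 4                       # sensors in a uniform linear array
theta = np.deg2rad(20)      # direction of arrival of the desired signal
# Steering vector for half-wavelength element spacing.
steer = np.exp(-1j * np.pi * np.arange(M) * np.sin(theta))

N = 2000
s = np.exp(1j * 2 * np.pi * 0.05 * np.arange(N))             # reference signal
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
x = np.outer(steer, s) + noise                               # array snapshots

w = np.zeros(M, dtype=complex)   # beamformer weights
mu = 0.01                        # LMS step size
errs = []
for n in range(N):
    y = np.vdot(w, x[:, n])          # beamformer output y = w^H x
    e = s[n] - y                     # error against the reference signal
    w += mu * x[:, n] * np.conj(e)   # complex LMS weight update
    errs.append(abs(e) ** 2)
```

After convergence the weight vector approximately matches the steering direction and the squared error settles near the residual noise floor.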

  13. Compliance with Environmental Regulations through Complex Geo-Event Processing

    OpenAIRE

    Federico Herrera; Laura González; Daniel Calegari; Bruno Rienzi

    2017-01-01

    In a context of e-government, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is also increased with ...

  14. A Cognition-based View of Decision Processes in Complex Social-Ecological Systems

    Directory of Open Access Journals (Sweden)

    Kathi K. Beratan

    2007-06-01

    Full Text Available This synthesis paper is intended to provide an overview of individual and collective decision-making processes that might serve as a theoretical foundation for a complexity-based approach to environmental policy design and natural resource management planning. Human activities are the primary drivers of change in the Earth's biosphere today, so efforts to shift the trajectory of social-ecological systems must focus on changes in individual and collective human behavior. Recent advances in understanding the biological basis of thought and memory offer insights of use in designing management and planning processes. The human brain has evolved ways of dealing with complexity and uncertainty, and is particularly attuned to social information. Changes in an individual's schemas, reflecting changes in the patterns of neural connections that are activated by particular stimuli, occur primarily through nonconscious processes in response to experiential learning during repeated exposure to novel situations, ideas, and relationships. Discourse is an important mechanism for schema modification, and thus for behavior change. Through discourse, groups of people construct a shared story - a collective model - that is useful for predicting likely outcomes of actions and events. In effect, good stories are models that filter and organize distributed knowledge about complex situations and relationships in ways that are readily absorbed by human cognitive processes. The importance of discourse supports the view that collaborative approaches are needed to effectively deal with environmental problems and natural resource management challenges. Methods derived from the field of mediation and dispute resolution can help us take advantage of the distinctly human ability to deal with complexity and uncertainty. This cognitive view of decision making supports fundamental elements of resilience management and adaptive co-management, including fostering social learning

  15. The dimerization of the yeast cytochrome bc1 complex is an early event and is independent of Rip1.

    Science.gov (United States)

    Conte, Annalea; Papa, Benedetta; Ferramosca, Alessandra; Zara, Vincenzo

    2015-05-01

    In Saccharomyces cerevisiae the mature cytochrome bc1 complex exists as an obligate homo-dimer in which each monomer consists of ten distinct protein subunits inserted into or bound to the inner mitochondrial membrane. Among them, the Rieske iron-sulfur protein (Rip1), besides its catalytic role in electron transfer, may be implicated in the bc1 complex dimerization. Indeed, Rip1 has the globular domain containing the catalytic center in one monomer while the transmembrane helix interacts with the adjacent monomer. In addition, the lack of Rip1 leads to the accumulation of an immature bc1 intermediate, only loosely associated with cytochrome c oxidase. In this study we have investigated the biogenesis of the yeast cytochrome bc1 complex using epitope-tagged proteins to purify native assembly intermediates. We showed that the dimerization process is an early event during bc1 complex biogenesis and that, contrary to previous proposals, the presence of Rip1 is not essential for this process. We also investigated the multi-step model of bc1 assembly, thereby lending further support to the existence of bona fide subcomplexes during bc1 maturation in the inner mitochondrial membrane. Finally, a new model of cytochrome bc1 complex assembly, in which distinct intermediates sequentially interact during bc1 maturation, has been proposed. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    International Nuclear Information System (INIS)

    Jaros, W.

    2005-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, ''Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation'' (BSC 2005 [DIRS 174995]), ''Clad Degradation--FEPs Screening Arguments'' (BSC 2004 [DIRS 170019]), and ''Waste-Form Features, Events, and Processes'' (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted from a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  17. A review for identification of initiating events in event tree development process on nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Riyadi, Eko H., E-mail: e.riyadi@bapeten.go.id [Center for Regulatory Assessment of Nuclear Installation and Materials, Nuclear Energy Regulatory Agency (BAPETEN), Jl. Gajah Mada 8 Jakarta 10120 (Indonesia)

    2014-09-30

    An initiating event is defined as any event, either internal or external to the nuclear power plants (NPPs), that perturbs the steady state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or loss of coolant accident (LOCA) within the NPPs. These initiating events trigger sequences of events that challenge plant control and safety systems whose failure could potentially lead to core damage or large early release. Selection of initiating events consists of two steps: first, definition of possible events, such as by performing a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. Therefore, the purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By applying this method to a detailed study of the traditional US PSA categorization, the important initiating events, categorized into LOCA, transients and external events, could be obtained.

  18. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  19. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes where CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints for a CUR:CD inclusion complex that maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Complex analyses on clinical information systems using restricted natural language querying to resolve time-event dependencies.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2018-06-01

    This paper reports on a generic framework to provide clinicians with the ability to conduct complex analyses on elaborate research topics using cascaded queries to resolve internal time-event dependencies in the research questions, as an extension to the proposed Clinical Data Analytics Language (CliniDAL). A cascaded query model is proposed to resolve internal time-event dependencies in the queries, which can have up to five levels of criteria, starting with a query to define subjects to be admitted into a study, followed by a query to define the time span of the experiment. Three more cascaded queries can be required to define control groups, control variables and output variables, which all together simulate a real scientific experiment. According to the complexity of the research questions, the cascaded query model has the flexibility of merging some lower level queries for simple research questions or adding a nested query to each level to compose more complex queries. Three different scenarios (one of them containing two studies) are described and used for evaluation of the proposed solution. CliniDAL's complex analyses solution enables answering complex queries with time-event dependencies in at most a few hours, whereas doing so manually would take many days. An evaluation of the results of the research studies, based on a comparison between the CliniDAL and SQL solutions, reveals the high usability and efficiency of CliniDAL's solution. Copyright © 2018 Elsevier Inc. All rights reserved.
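The cascaded query model can be pictured as successive filters over clinical event data. The sketch below is a plain-Python illustration with hypothetical record fields and values; it shows only three of the five levels described in the abstract (cohort definition, time span, and output variables), not CliniDAL's actual query language.

```python
# Hypothetical patient-event records; field names and values are illustrative.
events = [
    {"patient": 1, "type": "admission", "time": 0},
    {"patient": 1, "type": "lab_glucose", "time": 5, "value": 11.2},
    {"patient": 2, "type": "admission", "time": 2},
    {"patient": 2, "type": "lab_glucose", "time": 3, "value": 5.1},
]

# Level 1: cohort query -- patients admitted into the study.
cohort = {e["patient"] for e in events if e["type"] == "admission"}

# Level 2: time-span query -- restrict each patient to events at or after
# their admission time (a time-event dependency on the level-1 result).
admitted_at = {e["patient"]: e["time"] for e in events if e["type"] == "admission"}
in_span = [e for e in events
           if e["patient"] in cohort and e["time"] >= admitted_at[e["patient"]]]

# Final level: output variables -- glucose measurements within the span.
outputs = [(e["patient"], e["value"]) for e in in_span if e["type"] == "lab_glucose"]
```

Each level consumes the previous level's result, which is the essence of resolving time-event dependencies by cascading rather than by one monolithic query.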

  1. Epidemic processes in complex networks

    Science.gov (United States)

    Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-07-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.
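A minimal sketch of one of the paradigmatic models reviewed here, the discrete-time SIS (susceptible-infected-susceptible) process on a network. The ring topology, rates, and update scheme below are illustrative choices, not the review's specific formulation.

```python
import random

def sis_step(adj, infected, beta, mu, rng):
    """One discrete-time SIS update: each infected node recovers with
    probability mu, and each susceptible node is infected by each infected
    neighbour independently with probability beta."""
    new = set()
    for node in adj:
        if node in infected:
            if rng.random() > mu:          # fails to recover this step
                new.add(node)
        else:
            for nb in adj[node]:
                if nb in infected and rng.random() < beta:
                    new.add(node)
                    break
    return new

# Toy example on a 10-node ring network, seeded from a single infected node.
adj = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
rng = random.Random(42)
infected = {0}
for _ in range(50):
    infected = sis_step(adj, infected, beta=0.8, mu=0.1, rng=rng)
```

On heterogeneous networks the same dynamics exhibit the degree-dependent epidemic thresholds that are a central theme of the review.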

  2. Theory of mind for processing unexpected events across contexts.

    Science.gov (United States)

    Dungan, James A; Stepanovic, Michael; Young, Liane

    2016-08-01

    Theory of mind, or mental state reasoning, may be particularly useful for making sense of unexpected events. Here, we investigated unexpected behavior across both social and non-social contexts in order to characterize the precise role of theory of mind in processing unexpected events. We used functional magnetic resonance imaging to examine how people respond to unexpected outcomes when initial expectations were based on (i) an object's prior behavior, (ii) an agent's prior behavior and (iii) an agent's mental states. Consistent with prior work, brain regions for theory of mind were preferentially recruited when people first formed expectations about social agents vs non-social objects. Critically, unexpected vs expected outcomes elicited greater activity in dorsomedial prefrontal cortex, which also discriminated in its spatial pattern of activity between unexpected and expected outcomes for social events. In contrast, social vs non-social events elicited greater activity in precuneus across both expected and unexpected outcomes. Finally, given prior information about an agent's behavior, unexpected vs expected outcomes elicited an especially robust response in right temporoparietal junction, and the magnitude of this difference across participants correlated negatively with autistic-like traits. Together, these findings illuminate the distinct contributions of brain regions for theory of mind for processing unexpected events across contexts. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  3. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks.

    Science.gov (United States)

    Naveros, Francisco; Garrido, Jesus A; Carrillo, Richard R; Ros, Eduardo; Luque, Niceto R

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranging from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications, which constitute the main contribution of this study, systematically outperform the traditional event- and time-driven techniques under
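As a sketch of the time-driven family discussed above, here is a forward-Euler leaky integrate-and-fire (LIF) simulation with a fixed step size. The parameters are typical textbook values, not those of the study, and this does not implement the paper's bi-fixed-step method, which additionally adapts the step size to the model's stiffness.

```python
import numpy as np

def lif_time_driven(I, dt=1e-4, tau=0.02, v_rest=-0.07,
                    v_th=-0.05, v_reset=-0.07, R=1e7):
    """Time-driven simulation of a leaky integrate-and-fire neuron:
    the membrane equation tau*dv/dt = -(v - v_rest) + R*I is stepped
    iteratively with forward Euler; crossing v_th emits a spike event."""
    v = v_rest
    spikes = []
    for n, i_n in enumerate(I):
        dv = (-(v - v_rest) + R * i_n) / tau
        v += dt * dv
        if v >= v_th:                # threshold crossing -> spike, then reset
            spikes.append(n * dt)
            v = v_reset
    return spikes

# A constant suprathreshold current makes the neuron fire regularly:
# 5 nA drives the steady-state voltage well above threshold.
I = np.full(10000, 5e-9)            # 1 s of simulated time at dt = 0.1 ms
spikes = lif_time_driven(I)
```

An event-driven simulator would instead solve for the next threshold-crossing time analytically (or from a pre-compiled look-up table) and jump directly between spikes, avoiding the per-step iteration entirely.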

  4. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, ''Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation'' (BSC 2005 [DIRS 174995]), ''Clad Degradation--FEPs Screening Arguments (BSC 2004 [DIRS 170019]), and Waste-Form Features, Events, and Processes'' (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted due to a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project

  5. Evaluating and predicting overall process risk using event logs

    NARCIS (Netherlands)

    Pika, A.; Van Der Aalst, W.M.P.; Wynn, M.T.; Fidge, C.J.; Ter Hofstede, A.H.M.

    2016-01-01

    Companies standardise and automate their business processes in order to improve process efficiency and minimise operational risks. However, it is difficult to eliminate all process risks during the process design stage due to the fact that processes often run in complex and changeable environments

  6. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of the satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
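Systematic and random error components of matched event pairs can be computed along these lines. The specific definitions below (a multiplicative bias factor and the spread of residual log-ratios) are one common decomposition, not necessarily the metrics used in this study, and the depth values are invented.

```python
import math

def error_components(pairs):
    """Split the error of (simulated, reference) event pairs into a
    systematic part (mean multiplicative bias) and a random part
    (standard deviation of the residual log-ratios)."""
    logratios = [math.log(sim / ref) for sim, ref in pairs]
    n = len(logratios)
    mean = sum(logratios) / n
    var = sum((r - mean) ** 2 for r in logratios) / n
    return math.exp(mean), math.sqrt(var)  # bias factor, random spread

# Matched (simulated, reference) event depths in mm -- illustrative values
# with a uniform 20% underestimation, so the random component vanishes.
bias, spread = error_components([(40, 50), (36, 45), (60, 75)])
```

Working in log-ratios keeps the two components separable: a constant multiplicative bias shifts the mean without contributing to the spread.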

  7. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    Full Text Available The automatic discrimination of rock fracture and blast events is complex and challenging due to their similar waveform characteristics. To solve this problem, a new method based on signal complexity analysis and machine learning has been proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect the complexity of the signals and are assembled into a feature vector set. Secondly, based on the feature vector set, a back-propagation neural network (BPNN) as a means of machine learning is applied to establish a discriminator for rock fracture and blast events. Then, to evaluate the classification performances of the new method, the classifying accuracies of the support vector machine (SVM), the naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC) curves are also analyzed. The results show the new method obtains the best classification performances. In addition, the influence of different scale factor q and number of training samples n on discrimination results is discussed. It is found that the classifying accuracy of the new method reaches the highest value when q = 8–15 or 8–20 and n=140.
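The feature extraction step described above rests on permutation entropy, which can be sketched compactly. The normalization and windowing below follow the standard Bandt-Pompe construction; the multiple scale factors and the BPNN classifier from the paper are omitted.

```python
import math

def permutation_entropy(signal, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal: the Shannon entropy
    of the distribution of ordinal patterns of length `order`, divided by
    its maximum log(order!) so the result lies in [0, 1]."""
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        window = signal[i:i + order * delay:delay]
        # Ordinal pattern = argsort of the window (ties broken by index).
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))
```

A monotone ramp produces a single ordinal pattern and scores 0, while a highly irregular signal uses many patterns and approaches 1, which is why the measure separates smooth fracture waveforms from more complex blast records.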

  8. Event processing in X-IFU detector onboard Athena.

    Science.gov (United States)

    Ceballos, M. T.; Cobos, B.; van der Kuurs, J.; Fraga-Encinas, R.

    2015-05-01

    The X-ray Observatory ATHENA was proposed in April 2014 as the mission to implement the science theme "The Hot and Energetic Universe" selected by ESA for L2 (the second Large-class mission in ESA's Cosmic Vision science programme). One of the two X-ray detectors designed to be onboard ATHENA is X-IFU, a cryogenic microcalorimeter based on Transition Edge Sensor (TES) technology that will provide spatially resolved high-resolution spectroscopy. X-IFU will be developed by a consortium of European research institutions, currently from France (leadership), Italy, The Netherlands, Belgium, UK, Germany and Spain. From Spain, IFCA (CSIC-UC) is involved in the Digital Readout Electronics (DRE) unit of the X-IFU detector, in particular in the Event Processor Subsystem. We at IFCA are in charge of the development and implementation in the DRE unit of the Event Processing algorithms, designed to recognize, from a noisy signal, the intensity pulses generated by the absorption of the X-ray photons, and then extract their main parameters (coordinates, energy, arrival time, grade, etc.). Here we will present the design and performance of the algorithms developed for the event recognition (adjusted derivative) and pulse grading/qualification, as well as the progress in the algorithms designed to extract the energy content of the pulses (pulse optimal filtering). IFCA will finally have the responsibility of the implementation on board in the (TBD) FPGAs or micro-processors of the DRE unit, where this Event Processing part will take place, to fit into the limited telemetry of the instrument.
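The "adjusted derivative" trigger is not specified in detail in this abstract; as a simplified stand-in, the sketch below shows plain derivative-threshold event recognition on a synthetic noisy record. The pulse shape, noise level, and threshold are invented for illustration and are not X-IFU values.

```python
import numpy as np

def detect_pulses(signal, threshold):
    """Flag samples where the first difference exceeds `threshold`,
    collapsing runs of consecutive triggers into a single event onset."""
    d = np.diff(signal)
    above = d > threshold
    # Keep only rising edges of the boolean trigger train.
    onsets = np.flatnonzero(above & ~np.r_[False, above[:-1]])
    return onsets + 1   # +1 because diff[i] spans samples i..i+1

# Synthetic record: noisy flat baseline plus two abrupt exponential pulses.
rng = np.random.default_rng(3)
x = rng.normal(0.0, 0.05, 1000)
for start in (200, 650):
    x[start:start + 50] += np.exp(-np.arange(50) / 10.0)  # fast-rise pulse
onsets = detect_pulses(x, threshold=0.5)
```

The fast rising edge makes the first difference far larger than the noise, so a single threshold cleanly separates photon pulses from baseline fluctuations; energy estimation would then proceed per detected pulse (e.g., by optimal filtering).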

  9. A bioactive molecule in a complex wound healing process: platelet-derived growth factor.

    Science.gov (United States)

    Kaltalioglu, Kaan; Coskun-Cevher, Sule

    2015-08-01

    Wound healing is considered to be particularly important after surgical procedures, and the most important wounds related to surgical procedures are incisional, excisional, and punch wounds. Research is ongoing to identify methods to heal non-closed wounds or to accelerate wound healing; however, wound healing is a complex process that includes many biological and physiological events, and it is affected by various local and systemic factors, including diabetes mellitus, infection, ischemia, and aging. Different cell types (such as platelets, macrophages, and neutrophils) release growth factors during the healing process, and platelet-derived growth factor is a particularly important mediator in most stages of wound healing. This review explores the relationship between platelet-derived growth factor and wound healing. © 2014 The International Society of Dermatology.

  10. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-04-19

    The primary purpose of this analysis is to evaluate System Level features, events, and processes (FEPs). The System Level FEPs typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem level analyses and models reports. The System Level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. This evaluation determines which of the System Level FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the information presented in analysis reports, model reports, direct input, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  11. A dataflow meta-computing framework for event processing in the H1 experiment

    International Nuclear Information System (INIS)

    Campbell, A.; Gerhards, R.; Mkrtchyan, T.; Levonian, S.; Grab, C.; Martyniak, J.; Nowak, J.

    2001-01-01

    Linux-based networked PC clusters are replacing both the VME non-uniform direct memory access systems and the SMP shared-memory systems previously used for online event filtering and reconstruction. To allow optimal use of the distributed resources of PC clusters, an open software framework is presently being developed based on a dataflow paradigm for event processing. This framework allows for the distribution of the data of physics events and associated calibration data to multiple computers from multiple input sources for processing, and the subsequent collection of the processed events at multiple outputs. The basis of the system is the event repository, essentially a first-in first-out event store which may be read and written in a manner similar to sequential file access. Events are stored in and transferred between repositories as suitably large sequences to enable high throughput. Multiple readers can read simultaneously from a single repository to receive event sequences, and multiple writers can insert event sequences into a repository. Hence repositories are used for event distribution and collection. To support synchronisation of the event flow the repository implements barriers. A barrier must be written by all the writers of a repository before any reader can read the barrier. A reader must read a barrier before it may receive data from behind it. Only after all readers have read the barrier is the barrier removed from the repository. A barrier may also have attached data; in this way calibration data can be distributed to all processing units. The repositories are implemented as multi-threaded CORBA objects in C++, and CORBA is used for all data transfers. Job setup scripts are written in Python, and interactive status and histogram display is provided by a Java program. Jobs run under the PBS batch system, providing shared use of resources for online triggering, offline mass reprocessing and user analysis jobs.
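The repository and barrier semantics described above can be caricatured in a few lines. The real system is multi-threaded CORBA in C++; this single-process sketch, with class and method names of my own choosing, only illustrates the two barrier rules (visible only after all writers wrote it, removed only after all readers read it):

```python
from collections import deque

class EventRepository:
    """Toy single-process sketch of a FIFO event repository with
    barrier synchronisation; barriers may carry calibration payloads."""

    def __init__(self, n_writers, n_readers):
        self.n_writers, self.n_readers = n_writers, n_readers
        self.fifo = deque()
        self.pending_barrier_writes = 0
        self.barrier_readers_done = set()

    def write(self, writer_id, seq):
        self.fifo.append(('events', seq))       # an event sequence

    def write_barrier(self, writer_id, payload=None):
        self.pending_barrier_writes += 1
        # The barrier appears only once every writer has written it.
        if self.pending_barrier_writes == self.n_writers:
            self.fifo.append(('barrier', payload))
            self.pending_barrier_writes = 0

    def read(self, reader_id):
        if not self.fifo:
            return None
        kind, data = self.fifo[0]
        if kind == 'barrier':
            # A barrier blocks the queue until every reader has seen it
            # (a set, so re-reads by the same reader are not double-counted).
            self.barrier_readers_done.add(reader_id)
            if len(self.barrier_readers_done) == self.n_readers:
                self.fifo.popleft()
                self.barrier_readers_done.clear()
        else:
            self.fifo.popleft()                 # events go to one reader only
        return kind, data

repo = EventRepository(n_writers=2, n_readers=2)
repo.write(0, [1, 2, 3])
repo.write_barrier(0, payload='calib-v1')
repo.write_barrier(1, payload='calib-v1')   # barrier becomes visible now
print(repo.read(0))   # ('events', [1, 2, 3])
print(repo.read(0))   # ('barrier', 'calib-v1') -- stays until reader 1 reads it
print(repo.read(1))   # ('barrier', 'calib-v1') -- now removed from the FIFO
```

In the real framework readers block on unread barriers instead of polling; that detail is elided here.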

  12. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  13. Evaluation of extreme temperature events in northern Spain based on process control charts

    Science.gov (United States)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
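As a rough illustration of an attribute control chart for the annual fraction of extreme days, the classical binomial p-chart limits can be computed as below. Note that the paper's binomial Markov extension additionally accounts for autocorrelation between extreme days, which this plain sketch deliberately ignores:

```python
import math

def p_chart_limits(fractions, n):
    """Three-sigma control limits for a fraction nonconforming (p-chart).
    `fractions` are annual fractions of extreme days from a reference
    period; `n` is the number of days per year."""
    pbar = sum(fractions) / len(fractions)          # centre line
    sigma = math.sqrt(pbar * (1 - pbar) / n)
    return max(0.0, pbar - 3 * sigma), pbar, min(1.0, pbar + 3 * sigma)

ref = [0.05, 0.04, 0.06, 0.05, 0.05]    # illustrative reference years
lcl, centre, ucl = p_chart_limits(ref, n=365)
print(round(centre, 3), round(ucl, 3))  # 0.05 0.084
```

A year whose fraction of extreme days falls above the upper control limit would then be flagged as evidence of a change in the extreme-temperature regime.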

  14. Psychological distress and stressful life events in pediatric complex regional pain syndrome

    Science.gov (United States)

    Wager, Julia; Brehmer, Hannah; Hirschfeld, Gerrit; Zernikow, Boris

    2015-01-01

    BACKGROUND: There is little knowledge regarding the association between psychological factors and complex regional pain syndrome (CRPS) in children. Specifically, it is not known which factors precipitate CRPS and which result from the ongoing painful disease. OBJECTIVES: To examine symptoms of depression and anxiety as well as the experience of stressful life events in children with CRPS compared with children with chronic primary headaches and functional abdominal pain. METHODS: A retrospective chart study examined children with CRPS (n=37) who received intensive inpatient pain treatment between 2004 and 2010. They were compared with two control groups (chronic primary headaches and functional abdominal pain; each n=37), who also received intensive inpatient pain treatment. Control groups were matched with the CRPS group with regard to admission date, age and sex. Groups were compared on symptoms of depression and anxiety as well as stressful life events. RESULTS: Children with CRPS reported lower anxiety and depression scores compared with children with abdominal pain. A higher number of stressful life events before and after the onset of the pain condition was observed for children with CRPS. CONCLUSIONS: Children with CRPS are not particularly prone to symptoms of anxiety or depression. Importantly, children with CRPS experienced more stressful life events than children with chronic headaches or abdominal pain. Prospective long-term studies are needed to further explore the potential role of stressful life events in the etiology of CRPS. PMID:26035287

  15. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. Through readers and RFID middleware systems, information about the movements of tagged objects can be used to trigger business transactions. These features change the way business applications deal with the physical world from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logic into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.
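A double-buffering scheme of the general kind mentioned can be sketched as follows. This is a hypothetical rendering only; the paper's actual two-block mechanism, its event-calculus rules, and all names below are not taken from the abstract:

```python
class TwoBlockBuffer:
    """Sketch of double buffering for an RFID event stream: new tag reads
    accumulate in an active block while queries run against the sealed
    block, then the blocks swap (the previous sealed block is dropped)."""

    def __init__(self):
        self.active, self.sealed = [], []

    def on_read(self, tag_event):
        self.active.append(tag_event)       # ingest never blocks queries

    def swap(self):
        self.sealed, self.active = self.active, []

    def query(self, predicate):
        return [e for e in self.sealed if predicate(e)]

buf = TwoBlockBuffer()
buf.on_read(('tag42', 'dock-door'))
buf.on_read(('tag7', 'shelf'))
buf.swap()
buf.on_read(('tag9', 'dock-door'))          # lands in the new active block
print(buf.query(lambda e: e[1] == 'dock-door'))  # [('tag42', 'dock-door')]
```

The design choice this illustrates is that queries see a stable snapshot while high-rate tag reads continue uninterrupted.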

  16. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and the length of the free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication-related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication-related patient safety events, and the models were compared. Well-performing NLP models were generated to categorize medication-related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors, with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication-related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
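To make the categorisation task concrete, here is a minimal bag-of-words classifier over invented report snippets. The paper's actual models and features are not described in this abstract, so the classifier choice (naive Bayes), example texts, and labels below are all illustrative assumptions:

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

class NaiveBayes:
    """Minimal multinomial naive Bayes with Laplace smoothing, as a
    stand-in for an NLP model that routes free-text event reports."""

    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.prior = Counter(labels)
        self.word_counts = {c: Counter() for c in self.classes}
        self.totals = defaultdict(int)
        self.vocab = set()
        for text, c in zip(docs, labels):
            for w in tokenize(text):
                self.word_counts[c][w] += 1
                self.totals[c] += 1
                self.vocab.add(w)
        return self

    def predict(self, text):
        def logp(c):
            lp = math.log(self.prior[c])
            for w in tokenize(text):
                lp += math.log((self.word_counts[c][w] + 1) /
                               (self.totals[c] + len(self.vocab)))
            return lp
        return max(self.classes, key=logp)

reports = ["pharmacy delivery delayed by two hours",
           "wrong dose dispensed to patient",
           "pyxis count discrepancy on night shift",
           "prescriber ordered incorrect medication"]
labels = ["delay", "dispensing", "pyxis", "prescriber"]
clf = NaiveBayes().fit(reports, labels)
print(clf.predict("medication delivery from pharmacy delayed"))  # delay
```

A real system would train on thousands of labelled reports and evaluate with ROC curves, as the study does; this sketch only shows the routing mechanics.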

  17. [Emerging infectious diseases: complex, unpredictable processes].

    Science.gov (United States)

    Guégan, Jean-François

    2016-01-01

    Drawing on a twofold approach, first empirical, then theoretical and comparative, and illustrated by the example of Buruli ulcer and its mycobacterial agent Mycobacterium ulcerans, on which I focused my research activity over the last ten years while studying the determinants and factors of emerging infectious or parasitic diseases, this article presents the complexity of the events that explain disease emergence. The cascade of events occurring at various levels of spatiotemporal scale and organization of life, which leads to the numerous observed emergences, nowadays requires better taking into account the interactions between host(s), pathogen(s) and the environment, including the behavior of both individuals and populations. In numerous research studies on emerging infectious diseases, it is the microbial hazard that is described rather than the infectious disease risk, the latter resulting from the confrontation between an association of threatening phenomena, or hazards, and a susceptible population. More broadly, the theme of emerging infectious diseases and its links with global environmental and societal changes leads us to reconsider some well-established knowledge in infectiology and parasitology. © Société de Biologie, 2017.

  18. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy.

    Science.gov (United States)

    Fidler, Andrew F; Singh, Ved P; Long, Phillip D; Dahlberg, Peter D; Engel, Gregory S

    2013-10-21

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex.

  19. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Fidler, Andrew F.; Singh, Ved P.; Engel, Gregory S. [Department of Chemistry, The Institute for Biophysical Dynamics, and The James Franck Institute, The University of Chicago, Chicago, Illinois 60637 (United States); Long, Phillip D.; Dahlberg, Peter D. [Graduate Program in the Biophysical Sciences, The University of Chicago, Chicago, Illinois 60637 (United States)

    2013-10-21

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex.

  20. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy

    International Nuclear Information System (INIS)

    Fidler, Andrew F.; Singh, Ved P.; Engel, Gregory S.; Long, Phillip D.; Dahlberg, Peter D.

    2013-01-01

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex

  1. Accident and Off-Normal Response and Recovery from Multi-Canister Overpack (MCO) Processing Events

    International Nuclear Information System (INIS)

    ALDERMAN, C.A.

    2000-01-01

    In the process of removing spent nuclear fuel (SNF) from the K Basins through its subsequent packaging, drying, transportation and storage steps, the SNF Project must be able to respond to all anticipated or foreseeable off-normal and accident events that may occur. Response procedures and recovery plans need to be in place, and personnel training established and implemented, to ensure the project will be capable of appropriate actions. To establish suitable project planning, these events must first be identified and analyzed for their expected impact on the project. This document assesses all off-normal and accident events for their potential cross-facility or Multi-Canister Overpack (MCO) process-reversal impact. Table 1 provides the methodology for establishing the event planning level, and these events are provided in Table 2 along with the general response and recovery planning. Accidents and off-normal events of the SNF Project have been evaluated and are identified in the appropriate facility Safety Analysis Report (SAR) or in the transportation Safety Analysis Report for Packaging (SARP). Hazards and accidents are summarized from these safety analyses and listed in separate tables for each facility and the transportation system in Appendix A, along with identified off-normal events. The tables identify the general response time required to ensure a stable state after the event, the governing response documents, and the events with potential cross-facility or SNF process-reversal impacts. The event closure is predicated on the stable-state response time, the impact on operations, and the mitigated annual occurrence frequency of the event as developed in the hazard analysis process.

  2. Designing and Securing an Event Processing System for Smart Spaces

    Science.gov (United States)

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  3. Verification and Planning for Stochastic Processes with Asynchronous Events

    National Research Council Canada - National Science Library

    Younes, Hakan L

    2005-01-01

    .... The most common assumption is that of history-independence: the Markov assumption. In this thesis, the author considers the problems of verification and planning for stochastic processes with asynchronous events, without relying on the Markov assumption...

  4. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    Full Text Available This work concerns an information model of an educational complex that includes several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is examined, and a matrix management structure is proposed. The basic management information processes of the educational complex are conceptualized.

  5. Visual perception of complex shape-transforming processes.

    Science.gov (United States)

    Schmidt, Filipp; Fleming, Roland W

    2016-11-01

    Morphogenesis-or the origin of complex natural form-has long fascinated researchers from practically every branch of science. However, we know practically nothing about how we perceive and understand such processes. Here, we measured how observers visually infer shape-transforming processes. Participants viewed pairs of objects ('before' and 'after' a transformation) and identified points that corresponded across the transformation. This allowed us to map out in spatial detail how perceived shape and space were affected by the transformations. Participants' responses were strikingly accurate and mutually consistent for a wide range of non-rigid transformations including complex growth-like processes. A zero-free-parameter model based on matching and interpolating/extrapolating the positions of high-salience contour features predicts the data surprisingly well, suggesting observers infer spatial correspondences relative to key landmarks. Together, our findings reveal the operation of specific perceptual organization processes that make us remarkably adept at identifying correspondences across complex shape-transforming processes by using salient object features. We suggest that these abilities, which allow us to parse and interpret the causally significant features of shapes, are invaluable for many tasks that involve 'making sense' of shape. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
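The matching-and-interpolation idea in the model can be sketched with a toy scheme. The paper's zero-free-parameter model matches high-salience contour features and interpolates/extrapolates along the shape; the inverse-distance weighting, generic landmarks, and all names below are my own illustrative substitutes:

```python
def interpolate_correspondence(landmarks_before, landmarks_after, point):
    """Estimate where an arbitrary point maps to across a transformation
    by inverse-distance-weighted interpolation of matched landmark
    displacements."""
    weights, dx, dy = [], 0.0, 0.0
    for (bx, by), (ax, ay) in zip(landmarks_before, landmarks_after):
        d2 = (point[0] - bx) ** 2 + (point[1] - by) ** 2
        w = 1.0 / (d2 + 1e-9)            # nearer landmarks dominate
        weights.append(w)
        dx += w * (ax - bx)
        dy += w * (ay - by)
    wsum = sum(weights)
    return point[0] + dx / wsum, point[1] + dy / wsum

before = [(0, 0), (10, 0)]
after = [(0, 2), (10, 2)]                # a uniform upward translation
print(interpolate_correspondence(before, after, (5, 0)))  # -> (5.0, 2.0)
```

The point between the landmarks inherits their shared displacement, mirroring how observers appear to infer correspondences for unmarked regions relative to key landmarks.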

  6. Self-Exciting Point Process Modeling of Conversation Event Sequences

    Science.gov (United States)

    Masuda, Naoki; Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo

    Self-exciting processes of Hawkes type have been used to model various phenomena including earthquakes, neural activities, and views of online videos. Studies of temporal networks have revealed that sequences of social interevent times for individuals are highly bursty. We examine some basic properties of event sequences generated by the Hawkes self-exciting process to show that it generates bursty interevent times for a wide parameter range. Then, we fit the model to data on conversation sequences recorded in company offices in Japan. In this way, we can estimate the relative magnitudes of the self-excitation, its temporal decay, and the base event rate independent of the self-excitation. These variables depend strongly on individuals. We also point out an important limitation of the Hawkes model: the correlation in the interevent times and the burstiness cannot be independently modulated.
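A Hawkes process with exponential kernel, the standard form used in studies of this kind, can be simulated by Ogata's thinning algorithm. The parameter values below are arbitrary illustrations (stationarity requires the branching ratio alpha/beta < 1):

```python
import math
import random

def hawkes_thinning(mu, alpha, beta, t_max, seed=1):
    """Event times of a Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)),
    simulated by Ogata's thinning algorithm."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # Intensity just after t upper-bounds the (decaying) intensity ahead.
        lam_bar = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)        # candidate next event
        if t >= t_max:
            return events
        lam_t = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:  # accept with prob lambda(t)/bound
            events.append(t)

events = hawkes_thinning(mu=0.5, alpha=0.8, beta=1.2, t_max=200.0)
print(len(events))   # typically a few hundred events with these parameters
```

The clustering of the resulting event times is what produces the bursty interevent-time distributions the paper examines.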

  7. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    Science.gov (United States)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

  8. A novel approach for modelling complex maintenance systems using discrete event simulation

    International Nuclear Information System (INIS)

    Alrabghi, Abdullah; Tiwari, Ashutosh

    2016-01-01

    Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach utilises the wide success DES has achieved in manufacturing by allowing integration with models that are closely related to maintenance such as production and spare parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights for non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems. - Highlights: • This research is one of the first to optimise maintenance strategies simultaneously. • New insights for non-conflicting objectives in maintenance systems. • The approach can be used to optimise industrial maintenance systems.
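To give a flavour of what a DES model of a multi-unit maintenance system looks like, here is a corrective-maintenance toy built on a future-event list. The distributions, parameter values, and the assumption of unlimited repair crews are all illustrative, not the paper's model:

```python
import heapq
import random

def simulate(n_units, mtbf, repair_time, horizon, seed=0):
    """Minimal discrete-event simulation of corrective maintenance:
    units fail with exponential time-between-failures and are restored
    after a fixed repair delay; returns accumulated downtime per unit."""
    rng = random.Random(seed)
    fel = []                                  # future event list
    for u in range(n_units):
        heapq.heappush(fel, (rng.expovariate(1 / mtbf), u, 'fail'))
    downtime = [0.0] * n_units
    while fel:
        t, u, kind = heapq.heappop(fel)       # advance to next event
        if t >= horizon:
            break                             # heap is time-ordered
        if kind == 'fail':
            downtime[u] += min(repair_time, horizon - t)
            heapq.heappush(fel, (t + repair_time, u, 'repair'))
        else:                                 # repaired: next failure
            heapq.heappush(fel, (t + rng.expovariate(1 / mtbf), u, 'fail'))
    return downtime

down = simulate(n_units=3, mtbf=50.0, repair_time=5.0, horizon=1000.0)
print([round(d, 1) for d in down])            # downtime accumulated per unit
```

Accessing the event queue directly, as the abstract describes, is what lets maintenance strategies interact with production and spare-parts events in a fuller model.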

  9. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenbing [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Wang, Zidong [Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH (United Kingdom); Liu, Yurong, E-mail: yrliu@yzu.edu.cn [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ding, Derui [Shanghai Key Lab of Modern Optical System, Department of Control Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093 (China); Alsaadi, Fuad E. [Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2017-01-05

    The paper is concerned with the state estimation problem for a class of time-delayed complex networks with an event-triggering communication protocol. A novel event generator function, which is dependent not only on the measurement output but also on a predefined positive constant, is proposed in the hope of reducing the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, some sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and then the estimator gains are obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof is given to show that the designed triggering condition is free of Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator. - Highlights: • An event-triggered estimator is designed for complex networks with time-varying delays. • A novel event generator function is proposed to reduce the communication burden. • The comparison principle is utilized to derive the sufficient conditions. • The designed triggering condition is shown to be free of Zeno behavior.
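To illustrate the flavour of such an event generator function (the paper's exact form is not reproduced in this abstract; sigma and delta below are illustrative constants, and it is the positive constant delta that enforces a minimum error before any transmission, which is what rules out Zeno behavior):

```python
def should_transmit(y, y_last, sigma=0.5, delta=0.1):
    """Trigger when the squared deviation of the current measurement y
    from the last transmitted one exceeds sigma*||y||^2 plus a positive
    constant delta (sigma and delta are illustrative values)."""
    err = sum((a - b) ** 2 for a, b in zip(y, y_last))
    return err > sigma * sum(a * a for a in y) + delta

sent = [(0.0, 0.0)]                          # last transmitted measurement
for y in [(0.1, 0.0), (0.2, 0.1), (1.5, 1.0), (1.6, 1.0)]:
    if should_transmit(y, sent[-1]):
        sent.append(y)                       # estimator is updated only here
print(sent)  # [(0.0, 0.0), (1.5, 1.0)]: only the large jump is transmitted
```

Small fluctuations never fire the trigger, so the communication burden drops while the estimator still sees every significant change.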

  10. Discovering block-structured process models from event logs containing infrequent behaviour

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Aalst, van der W.M.P.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Given an event log describing observed behaviour, process discovery aims to find a process model that ‘best’ describes this behaviour. A large variety of process discovery algorithms has been proposed. However, no existing algorithm returns a sound model in all cases (free of deadlocks and other

  11. Gradation of complexity and predictability of hydrological processes

    Science.gov (United States)

    Sang, Yan-Fang; Singh, Vijay P.; Wen, Jun; Liu, Changming

    2015-06-01

Quantification of the complexity and predictability of hydrological systems is important for evaluating the impact of climate change on hydrological processes and for guiding water activities. In the literature, the focus seems to have been on describing the complexity of the spatiotemporal distribution of hydrological variables, but little attention has been paid to the study of complexity gradation, because the degree of absolute complexity of hydrological systems cannot be objectively evaluated. Here we show that the complexity and predictability of hydrological processes can be graded into three ranks (low, middle, and high). The gradation is based on the difference between the energy distribution of a hydrological series and that of white noise across multiple temporal scales. It reflects the different energy concentration levels and deterministic-component contents of the hydrological series in the three ranks. A higher energy concentration level reflects lower complexity and higher predictability, whereas a scattered energy distribution similar to that of white noise indicates the highest complexity and near-unpredictability. We conclude that the three ranks (low, middle, and high) approximately correspond to deterministic, stochastic, and random hydrological systems, respectively. The result of complexity gradation can guide hydrological observation and modeling, and the identification of similarity patterns among different hydrological systems.
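
A toy version of the energy-distribution idea: measure how concentrated the spectral energy of a series is, since white noise spreads energy evenly across frequencies. The top-k concentration measure and the grading thresholds below are our own illustration, not the authors' method.

```python
import cmath

def energy_concentration(series, top_k=3):
    """Fraction of spectral energy carried by the top_k strongest non-DC
    frequencies. Near 1.0 -> concentrated (deterministic-looking); a flat,
    white-noise-like spectrum gives a value near top_k / (n // 2)."""
    n = len(series)
    mean = sum(series) / n
    centered = [x - mean for x in series]
    energies = []
    for k in range(1, n // 2 + 1):
        coeff = sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(centered))
        energies.append(abs(coeff) ** 2)
    total = sum(energies) or 1.0
    return sum(sorted(energies, reverse=True)[:top_k]) / total

def grade_complexity(series, low=0.9, mid=0.6):
    """Grade into three ranks; the 0.9 / 0.6 cutoffs are illustrative."""
    c = energy_concentration(series)
    if c >= low:
        return "low complexity / high predictability"
    if c >= mid:
        return "middle"
    return "high complexity / low predictability"
```

A pure sinusoid concentrates essentially all its energy in one frequency bin and lands in the low-complexity rank.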

  12. Event (error and near-miss) reporting and learning system for process improvement in radiation oncology.

    Science.gov (United States)

    Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin

    2010-09-01

The value of near-miss and error reporting processes in many industries is well appreciated and can typically be supported with data collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in an RT department and to compare it to the paper-based reporting system it replaced. A web-based system was specifically designed for reporting individual events in RT and was clinically implemented in 2007. An event was defined as any occurrence that could have resulted, or had resulted, in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be reported quickly and easily without disrupting clinical work. This was very important because use of the system was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on the functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas which needed process and safety improvements. The reported data were also useful for the

  13. Leveraging the BPEL Event Model to Support QoS-aware Process Execution

    Science.gov (United States)

    Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf

Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault handling mechanisms for functional faults like invalid message types, it still lacks a flexible native mechanism for handling non-functional exceptions associated with violations of the QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, where expected QoS levels and the necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach with an extension to an open source BPEL engine.
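
The Event-Condition-Action idea from this abstract can be sketched as rules that listen on process events and fire a recovery action when a QoS condition is violated. The event type, field names, SLA bound, and recovery action below are all hypothetical, not BPEL's actual event model.

```python
# Hypothetical ECA-rule sketch: a rule matches an event type, evaluates a
# QoS condition over the event payload, and returns a recovery action.
class EcaRule:
    def __init__(self, event_type, condition, action):
        self.event_type = event_type
        self.condition = condition
        self.action = action

def dispatch(event, rules):
    """Run every matching rule; return the recovery actions triggered."""
    actions = []
    for rule in rules:
        if event["type"] == rule.event_type and rule.condition(event):
            actions.append(rule.action(event))
    return actions

# Example rule: retry a partner invocation when it exceeds an (assumed)
# 2-second SLA response-time bound.
sla_rule = EcaRule(
    event_type="invokeCompleted",
    condition=lambda e: e["response_ms"] > 2000,
    action=lambda e: f"retry {e['partner']}",
)
```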

  14. The multitalented Mediator complex.

    Science.gov (United States)

    Carlsten, Jonas O P; Zhu, Xuefeng; Gustafsson, Claes M

    2013-11-01

    The Mediator complex is needed for regulated transcription of RNA polymerase II (Pol II)-dependent genes. Initially, Mediator was only seen as a protein bridge that conveyed regulatory information from enhancers to the promoter. Later studies have added many other functions to the Mediator repertoire. Indeed, recent findings show that Mediator influences nearly all stages of transcription and coordinates these events with concomitant changes in chromatin organization. We review the multitude of activities associated with Mediator and discuss how this complex coordinates transcription with other cellular events. We also discuss the inherent difficulties associated with in vivo characterization of a coactivator complex that can indirectly affect diverse cellular processes via changes in gene transcription. Copyright © 2013 Elsevier Ltd. All rights reserved.

15. The N400 and Late Positive Complex (LPC) Effects Reflect Controlled Rather than Automatic Mechanisms of Sentence Processing

    Directory of Open Access Journals (Sweden)

    Boris Kotchoubey

    2012-08-01

This study compared automatic and controlled cognitive processes that underlie event-related potential (ERP) effects during speech perception. Sentences were presented to French native speakers; the final word could be congruent or incongruent and was presented at one of four levels of degradation (using modulation with pink noise): no degradation, mild degradation (two levels), or strong degradation. We assumed that degradation impairs controlled more than automatic processes. The N400 and Late Positive Complex (LPC) effects were defined as the differences between the corresponding wave amplitudes to incongruent words minus congruent words. Under mild degradation, where controlled sentence-level processing could still occur (as indicated by behavioral data), both the N400 and LPC effects were delayed and the latter effect was reduced. Under strong degradation, where sentence processing was rather automatic (as indicated by behavioral data), no ERP effect remained. These results suggest that ERP effects elicited in complex contexts, such as sentences, reflect controlled rather than automatic mechanisms of speech processing. These results differ from those of experiments that used word-pair or word-list paradigms.
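
The "incongruent minus congruent" definition of the N400 and LPC effects is just a difference wave between condition averages, as in the following sketch (trial data and window indices are invented for illustration):

```python
def effect_wave(incongruent_trials, congruent_trials):
    """Difference wave: mean(incongruent) - mean(congruent), per sample."""
    def mean_wave(trials):
        n = len(trials)
        return [sum(vals) / n for vals in zip(*trials)]
    inc = mean_wave(incongruent_trials)
    con = mean_wave(congruent_trials)
    return [i - c for i, c in zip(inc, con)]

def mean_amplitude(wave, start, end):
    """Mean amplitude in the sample window [start, end), e.g. an N400
    window around 400 ms or a later LPC window."""
    window = wave[start:end]
    return sum(window) / len(window)
```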

  16. Temporal integration: intentional sound discrimination does not modulate stimulus-driven processes in auditory event synthesis.

    Science.gov (United States)

    Sussman, Elyse; Winkler, István; Kreuzer, Judith; Saher, Marieke; Näätänen, Risto; Ritter, Walter

    2002-12-01

Our previous study showed that the auditory context could influence whether two successive acoustic changes occurring within the temporal integration window (approximately 200 ms) were pre-attentively encoded as a single auditory event or as two discrete events (Cogn Brain Res 12 (2001) 431). The aim of the current study was to assess whether top-down processes could influence the stimulus-driven processes in determining what constitutes an auditory event. An electroencephalogram (EEG) was recorded from 11 scalp electrodes to frequently occurring standard and infrequently occurring deviant sounds. Within the stimulus blocks, deviants either occurred only in pairs (successive feature changes) or both singly and in pairs. Event-related potential indices of change and target detection, the mismatch negativity (MMN) and the N2b component, respectively, were compared with the simultaneously measured performance in discriminating the deviants. Even though subjects could voluntarily distinguish the two successive auditory feature changes from each other, as also indicated by the elicitation of the N2b target-detection response, top-down processes did not modify the event organization reflected by the MMN response. Top-down processes can extract elemental auditory information from a single integrated acoustic event, but the extraction occurs at a later processing stage than the one whose outcome is indexed by MMN. Initial processes of auditory event-formation are fully governed by the context within which the sounds occur. Perception of the deviants as two separate sound events (the top-down effect) occurred without a change in the initial neural representation of the same deviants as one event (indexed by the MMN), that is, without a corresponding change in the stimulus-driven sound organization.

  17. R-process enrichment from a single event in an ancient dwarf galaxy.

    Science.gov (United States)

    Ji, Alexander P; Frebel, Anna; Chiti, Anirudh; Simon, Joshua D

    2016-03-31

    Elements heavier than zinc are synthesized through the rapid (r) and slow (s) neutron-capture processes. The main site of production of the r-process elements (such as europium) has been debated for nearly 60 years. Initial studies of trends in chemical abundances in old Milky Way halo stars suggested that these elements are produced continually, in sites such as core-collapse supernovae. But evidence from the local Universe favours the idea that r-process production occurs mainly during rare events, such as neutron star mergers. The appearance of a plateau of europium abundance in some dwarf spheroidal galaxies has been suggested as evidence for rare r-process enrichment in the early Universe, but only under the assumption that no gas accretes into those dwarf galaxies; gas accretion favours continual r-process enrichment in these systems. Furthermore, the universal r-process pattern has not been cleanly identified in dwarf spheroidals. The smaller, chemically simpler, and more ancient ultrafaint dwarf galaxies assembled shortly after the first stars formed, and are ideal systems with which to study nucleosynthesis events such as the r-process. Reticulum II is one such galaxy. The abundances of non-neutron-capture elements in this galaxy (and others like it) are similar to those in other old stars. Here, we report that seven of the nine brightest stars in Reticulum II, observed with high-resolution spectroscopy, show strong enhancements in heavy neutron-capture elements, with abundances that follow the universal r-process pattern beyond barium. The enhancement seen in this 'r-process galaxy' is two to three orders of magnitude higher than that detected in any other ultrafaint dwarf galaxy. This implies that a single, rare event produced the r-process material in Reticulum II. The r-process yield and event rate are incompatible with the source being ordinary core-collapse supernovae, but consistent with other possible sources, such as neutron star mergers.

  18. Dyadic Event Attribution in Social Networks with Mixtures of Hawkes Processes.

    Science.gov (United States)

    Li, Liangda; Zha, Hongyuan

    2013-01-01

In many applications in social network analysis, it is important to model the interactions and infer the influence between pairs of actors, leading to the problem of dyadic event modeling, which has attracted increasing interest recently. In this paper we focus on the problem of dyadic event attribution, an important missing-data problem in dyadic event modeling where one needs to infer the missing actor-pairs of a subset of dyadic events based on their observed timestamps. Existing works either use fixed model parameters and heuristic rules for event attribution, or assume the dyadic events across actor-pairs are independent. To address those shortcomings, we propose a probabilistic model based on mixtures of Hawkes processes that simultaneously tackles event attribution and network parameter inference, taking into consideration the dependency among dyadic events that share at least one actor. We also investigate using additive models to incorporate regularization to avoid overfitting. Our experiments on both synthetic and real-world data sets on international armed conflicts suggest that the proposed method is capable of significantly improving accuracy when compared with the state of the art for dyadic event attribution.
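
A Hawkes process is self-exciting: each past event raises the intensity of future events, with exponential decay. The sketch below shows the standard intensity function and a simple maximum-intensity attribution heuristic; it is a stand-in for the paper's probabilistic mixture inference, with invented parameters.

```python
import math

def hawkes_intensity(t, history, mu, alpha, beta):
    """lambda(t) = mu + alpha * sum_i exp(-beta * (t - t_i)) over past events."""
    return mu + alpha * sum(math.exp(-beta * (t - ti))
                            for ti in history if ti < t)

def attribute_event(t, histories, params):
    """Attribute an unlabeled event at time t to the actor-pair whose Hawkes
    intensity at t is highest -- a simple heuristic, not the paper's joint
    attribution/inference procedure."""
    best_pair, best_lam = None, -1.0
    for pair, history in histories.items():
        mu, alpha, beta = params[pair]
        lam = hawkes_intensity(t, history, mu, alpha, beta)
        if lam > best_lam:
            best_pair, best_lam = pair, lam
    return best_pair
```

The intuition matches the attribution problem: an actor-pair with a burst of recent events is the most plausible source of an unlabeled event that follows shortly after.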

  19. Inclusive Education as Complex Process and Challenge for School System

    Directory of Open Access Journals (Sweden)

    Al-Khamisy Danuta

    2015-08-01

Education may be considered as a number of processes, actions, and effects affecting a human being; as the state or level of the results of these processes; or as the modification of functions, institutions, and social practice roles, which as a result of inclusion become a new, integrated system. It is thus a very complex process. Nowadays, complexity appears to be one of the most significant terms both in science and in philosophy. It appears that despite the search for simple rules, strategies, and solutions, everything is still more complex. The environment is complex, as is the organism living in it and exploring it, and the exploration itself is a complex phenomenon, much more so than might initially seem to be the case.

  20. Journaling about stressful events: effects of cognitive processing and emotional expression.

    Science.gov (United States)

    Ullrich, Philip M; Lutgendorf, Susan K

    2002-01-01

    The effects of two journaling interventions, one focusing on emotional expression and the other on both cognitive processing and emotional expression, were compared during 1 month of journaling about a stressful or traumatic event. One hundred twenty-two students were randomly assigned to one of three writing conditions: (a) focusing on emotions related to a trauma or stressor, (b) focusing on cognitions and emotions related to a trauma or stressor, or (c) writing factually about media events. Writers focusing on cognitions and emotions developed greater awareness of the positive benefits of the stressful event than the other two groups. This effect was apparently mediated by greater cognitive processing during writing. Writers focusing on emotions alone reported more severe illness symptoms during the study than those in other conditions. This effect appeared to be mediated by a greater focus on negative emotional expression during writing.

  1. Notification Event Architecture for Traveler Screening: Predictive Traveler Screening Using Event Driven Business Process Management

    Science.gov (United States)

    Lynch, John Kenneth

    2013-01-01

    Using an exploratory model of the 9/11 terrorists, this research investigates the linkages between Event Driven Business Process Management (edBPM) and decision making. Although the literature on the role of technology in efficient and effective decision making is extensive, research has yet to quantify the benefit of using edBPM to aid the…

  2. A digital pixel cell for address event representation image convolution processing

    Science.gov (United States)

    Camunas-Mesa, Luis; Acosta-Jimenez, Antonio; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabe

    2005-06-01

Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate events according to their information levels. Neurons with more information (activity, derivatives of activity, contrast, motion, edges, ...) generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. AER technology has been used and reported for the implementation of various types of image sensors or retinae: luminance with local AGC, contrast retinae, motion retinae, and so on. There has also been a proposal for realizing programmable-kernel image convolution chips. Such convolution chips contain an array of pixels that perform weighted addition of events. Once a pixel has added sufficient event contributions to reach a fixed threshold, the pixel fires an event, which is then routed out of the chip for further processing. Such convolution chips have previously been proposed using pulsed current-mode mixed analog and digital circuit techniques. In this paper we present a fully digital pixel implementation to perform the weighted additions and fire the events. This way, for a given technology, there is a fully digital implementation reference against which to compare the mixed-signal implementations. We have designed, implemented, and tested a fully digital AER convolution pixel. This pixel will be used to implement a full AER convolution chip for programmable-kernel image convolution processing.
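
The integrate-and-fire behaviour of the convolution pixel described above can be modelled in a few lines. This is a behavioural sketch only (a real design also handles signed kernel weights, leakage, and reset policy, which we omit):

```python
class DigitalAerPixel:
    """Behavioural model of a digital AER convolution pixel: each incoming
    address event adds its kernel weight to an accumulator; when the
    accumulator reaches the threshold, the pixel fires an output event.
    Simplified: keeps the remainder on firing, ignores signed weights."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.acc = 0

    def receive(self, weight):
        """Integrate one input event; return True if the pixel fires."""
        self.acc += weight
        if self.acc >= self.threshold:
            self.acc -= self.threshold   # fire and keep the remainder
            return True
        return False
```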

  3. MATHEMATICAL MODELS OF PROCESSES AND SYSTEMS OF TECHNICAL OPERATION FOR ONBOARD COMPLEXES AND FUNCTIONAL SYSTEMS OF AVIONICS

    Directory of Open Access Journals (Sweden)

    Sergey Viktorovich Kuznetsov

    2017-01-01

Modern aircraft are equipped with complicated systems and complexes of avionics. The technical operation process of an aircraft and its avionics is observed as a process with changing operation states. Mathematical models of avionics processes and systems of technical operation are represented as Markov chains and Markov and semi-Markov processes. The purpose is to develop graph-models of avionics technical operation processes, describing their work in flight as well as during maintenance on the ground in the various systems of technical operation. Graph-models of the processes and systems of onboard complexes and functional avionics systems in flight are proposed. They are based on state tables. The models are specified for the various technical operation systems: the system with control of the reliability level, the system with parameter control, and the system with resource control. The events which cause the avionics complexes and functional systems to change their technical state are failures and faults of built-in test equipment. The avionics system of technical operation with reliability level control is applicable for objects with a constant or slowly varying failure rate. The avionics system of technical operation with resource control is mainly used for objects with a failure rate that increases over time. The avionics system of technical operation with parameter control is used for objects with a failure rate that increases over time and with generalized parameters, which can provide forecasting and assign the borders of before-failure technical states. The proposed formal graphical approach to designing models of avionics complexes and systems is the basis for constructing models of complex systems and facilities, both for a single aircraft and for an airline's aircraft fleet, or even for the entire fleet of some specific aircraft type. The ultimate graph-models for avionics in various systems of technical operation permit the beginning of
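
The state-transition view described above can be illustrated with a minimal discrete-time Markov chain. The states and transition probabilities below are invented for illustration and are not taken from the paper:

```python
import random

def simulate_states(transition, start, steps, seed=0):
    """Simulate a discrete-time Markov chain over technical-operation
    states, drawing each next state from the current row of the
    transition table."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        next_states, probs = zip(*transition[state].items())
        state = rng.choices(next_states, weights=probs)[0]
        path.append(state)
    return path

# Toy state space loosely inspired by the abstract's operation states
# (hypothetical numbers, for illustration only):
transition = {
    "operating":   {"operating": 0.95, "failed": 0.05},
    "failed":      {"maintenance": 1.0},
    "maintenance": {"operating": 0.9, "failed": 0.1},
}
```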

  4. Event-Driven Process Chains (EPC)

    Science.gov (United States)

    Mendling, Jan

    This chapter provides a comprehensive overview of Event-driven Process Chains (EPCs) and introduces a novel definition of EPC semantics. EPCs became popular in the 1990s as a conceptual business process modeling language in the context of reference modeling. Reference modeling refers to the documentation of generic business operations in a model such as service processes in the telecommunications sector, for example. It is claimed that reference models can be reused and adapted as best-practice recommendations in individual companies (see [230, 168, 229, 131, 400, 401, 446, 127, 362, 126]). The roots of reference modeling can be traced back to the Kölner Integrationsmodell (KIM) [146, 147] that was developed in the 1960s and 1970s. In the 1990s, the Institute of Information Systems (IWi) in Saarbrücken worked on a project with SAP to define a suitable business process modeling language to document the processes of the SAP R/3 enterprise resource planning system. There were two results from this joint effort: the definition of EPCs [210] and the documentation of the SAP system in the SAP Reference Model (see [92, 211]). The extensive database of this reference model contains almost 10,000 sub-models: 604 of them non-trivial EPC business process models. The SAP Reference model had a huge impact with several researchers referring to it in their publications (see [473, 235, 127, 362, 281, 427, 415]) as well as motivating the creation of EPC reference models in further domains including computer integrated manufacturing [377, 379], logistics [229] or retail [52]. The wide-spread application of EPCs in business process modeling theory and practice is supported by their coverage in seminal text books for business process management and information systems in general (see [378, 380, 49, 384, 167, 240]). EPCs are frequently used in practice due to a high user acceptance [376] and extensive tool support. Some examples of tools that support EPCs are ARIS Toolset by IDS

  5. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

Event-related potentials (ERPs) are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial highlights several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered are also discussed.
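
The core of the signal-processing pipeline mentioned above is epoching, baseline correction, and averaging across trials. A minimal single-channel sketch (sample counts and data are illustrative):

```python
def extract_epochs(eeg, event_samples, pre, post):
    """Cut fixed-length epochs: `pre` samples before to `post` samples
    after each event marker, skipping markers too close to the edges."""
    return [eeg[s - pre:s + post] for s in event_samples
            if s - pre >= 0 and s + post <= len(eeg)]

def baseline_correct(epoch, pre):
    """Subtract the mean of the pre-stimulus interval from the epoch."""
    base = sum(epoch[:pre]) / pre
    return [x - base for x in epoch]

def average_erp(eeg, event_samples, pre, post):
    """Average the baseline-corrected epochs into a single ERP waveform."""
    epochs = [baseline_correct(e, pre)
              for e in extract_epochs(eeg, event_samples, pre, post)]
    n = len(epochs)
    return [sum(vals) / n for vals in zip(*epochs)]
```

Averaging works because the stimulus-locked response repeats across trials while background EEG does not, so noise cancels as trial count grows.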

  6. GET controller and UNICORN: event-driven process execution and monitoring in logistics

    NARCIS (Netherlands)

    Baumgrass, A.; Di Ciccio, C.; Dijkman, R.M.; Hewelt, M; Mendling, J.; Meyer, Andreas; Pourmirza, S.; Weske, M.H.; Wong, T.Y.

    2015-01-01

    Especially in logistics, process instances often interact with their real-world environment during execution. This is challenging due to the fact that events from this environment are often heterogeneous, lack process instance information, and their import and visualisation in traditional process

  7. EEG-based cognitive load of processing events in 3D virtual worlds is lower than processing events in 2D displays.

    Science.gov (United States)

    Dan, Alex; Reiner, Miriam

    2017-12-01

Interacting with 2D displays, such as computer screens, smartphones, and TVs, is currently a part of our daily routine; however, our visual system is built for processing 3D worlds. We examined the cognitive load associated with a simple and a complex task of learning paper-folding (origami) by observing 2D or stereoscopic 3D displays. While connected to an electroencephalogram (EEG) system, participants watched a 2D video of an instructor demonstrating the paper-folding tasks, followed by a stereoscopic 3D projection of the same instructor (a digital avatar) illustrating identical tasks. We recorded the power of alpha and theta oscillations and calculated the cognitive load index (CLI) as the ratio of the average power of frontal theta (Fz) to parietal alpha (Pz). The results showed a significantly higher cognitive load index associated with processing the 2D projection as compared to the 3D projection; additionally, changes in the average theta Fz power were larger for the 2D conditions as compared to the 3D conditions, while average alpha Pz power values were similar for 2D and 3D conditions in the less complex task and higher in the 3D state for the more complex task. The cognitive load index was lower for the easier task and higher for the more complex task in both 2D and 3D. In addition, participants with lower spatial abilities benefited more from the 3D than from the 2D display. These findings have implications for understanding cognitive processing associated with 2D and 3D worlds and for employing stereoscopic 3D technology over 2D displays in designing emerging virtual and augmented reality applications. Copyright © 2016 Elsevier B.V. All rights reserved.
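
The cognitive load index defined in this abstract is a simple power ratio, sketched below (band-power extraction itself is assumed done upstream; the numbers in the usage are invented):

```python
def cognitive_load_index(theta_fz_power, alpha_pz_power):
    """CLI as described in the abstract: ratio of frontal theta power at
    Fz to parietal alpha power at Pz."""
    return theta_fz_power / alpha_pz_power

def cli_over_trials(theta_fz_powers, alpha_pz_powers):
    """Condition-level CLI from per-trial band powers: ratio of the
    average theta power to the average alpha power."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(theta_fz_powers) / mean(alpha_pz_powers)
```

Higher frontal theta and lower parietal alpha both push the ratio up, which is why the index tracks mental effort.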

  8. Processing communications events in parallel active messaging interface by awakening thread from wait state

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-10-22

    Processing data communications events in a parallel active messaging interface (`PAMI`) of a parallel computer that includes compute nodes that execute a parallel application, with the PAMI including data communications endpoints, and the endpoints are coupled for data communications through the PAMI and through other data communications resources, including determining by an advance function that there are no actionable data communications events pending for its context, placing by the advance function its thread of execution into a wait state, waiting for a subsequent data communications event for the context; responsive to occurrence of a subsequent data communications event for the context, awakening by the thread from the wait state; and processing by the advance function the subsequent data communications event now pending for the context.
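
The wait/awaken pattern this patent abstract describes maps naturally onto a condition variable: with no actionable events pending, the advance thread blocks; posting an event awakens it. The class and method names below are ours, not PAMI's actual API.

```python
import threading

class EventContext:
    """Rough analogue of a context whose advance function waits for events."""
    def __init__(self):
        self.events = []
        self.cond = threading.Condition()

    def post(self, event):
        """Make an event pending for the context and awaken a waiter."""
        with self.cond:
            self.events.append(event)
            self.cond.notify()

    def advance(self):
        """Block while no events are pending (the wait state), then
        process (here: return) the next pending event."""
        with self.cond:
            while not self.events:
                self.cond.wait()
            return self.events.pop(0)
```

The `while not self.events` loop guards against spurious wakeups, the standard idiom for condition-variable waits.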

  9. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large-scale collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data is growing exponentially over time, this reviewing procedure may not be viable for much longer. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state-of-the-art subjective review procedures for identifying safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential
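
A kinematic trigger of the kind the abstract describes is just a threshold test on estimated acceleration. The threshold value below is illustrative; as the abstract notes, such triggers have low sensitivity and specificity, which is why video review remains necessary.

```python
def kinematic_triggers(speeds_mps, dt, decel_threshold=-3.5):
    """Flag sample indices where the finite-difference longitudinal
    acceleration drops below a harsh-braking threshold (m/s^2).
    The -3.5 m/s^2 default is an assumed, illustrative value."""
    triggers = []
    for i in range(1, len(speeds_mps)):
        accel = (speeds_mps[i] - speeds_mps[i - 1]) / dt
        if accel < decel_threshold:
            triggers.append(i)
    return triggers
```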

  10. Atmospheric processes over complex terrain

    Science.gov (United States)

    Banta, Robert M.; Berri, G.; Blumen, William; Carruthers, David J.; Dalu, G. A.; Durran, Dale R.; Egger, Joseph; Garratt, J. R.; Hanna, Steven R.; Hunt, J. C. R.

    1990-06-01

A workshop on atmospheric processes over complex terrain, sponsored by the American Meteorological Society, was convened in Park City, Utah from 24 to 28 October 1988. The overall objective of the workshop was one of interaction and synthesis--interaction among atmospheric scientists carrying out research on a variety of orographic flow problems, and a synthesis of their results and points of view into an assessment of the current status of topical research problems. The final day of the workshop was devoted to an open discussion on the research directions that could be anticipated in the next decade because of new and planned instrumentation and observational networks, the recent emphasis on development of mesoscale numerical models, and continual theoretical investigations of thermally forced flows, orographic waves, and stratified turbulence. This monograph represents an outgrowth of the Park City Workshop. The authors have contributed chapters based on their lecture material. Workshop discussions indicated interest in both the remote sensing and predictability of orographic flows. These chapters were solicited following the workshop in order to provide a more balanced view of current progress and future directions in research on atmospheric processes over complex terrain.

  11. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    Energy Technology Data Exchange (ETDEWEB)

    D.L. McGregor

    2000-12-20

The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  12. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    International Nuclear Information System (INIS)

    D.L. McGregor

    2000-01-01

The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  13. Complexity in Evolutionary Processes

    International Nuclear Information System (INIS)

    Schuster, P.

    2010-01-01

    Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. Occurrence and nature of such phase transitions depend critically on fitness landscapes. The fitness landscape, being tantamount to a mapping from sequence or genotype space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to provide a major challenge for understanding evolutionary processes (author)

  14. Simulation and Analysis of Complex Biological Processes: an Organisation Modelling Perspective

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2005-01-01

    This paper explores how the dynamics of complex biological processes can be modelled and simulated as an organisation of multiple agents. This modelling perspective identifies organisational structure occurring in complex decentralised processes and handles complexity of the analysis of the dynamics

  15. Responses of diatom communities to hydrological processes during rainfall events

    Science.gov (United States)

    Wu, Naicheng; Faber, Claas; Ulrich, Uta; Fohrer, Nicola

    2015-04-01

    The importance of diatoms as a tracer of hydrological processes has recently been recognized (Pfister et al. 2009, Pfister et al. 2011, Tauro et al. 2013). However, short-term (e.g., sub-daily) diatom variations during rainfall events have not yet been well documented. In this study, rainfall event-based diatom samples were taken at the outlet of the Kielstau catchment (50 km2), a lowland catchment in northern Germany. A total of nine rainfall events were captured from May 2013 to April 2014. Non-metric multidimensional scaling (NMDS) revealed that diatom communities of different events were well separated along NMDS axes I and II, indicating a remarkable temporal variation. Close relationships were found by correlating water level (a proxy of discharge) with different diatom indices. For example, species richness, biovolume (μm3), Shannon diversity and moisture index01 (%, classified according to van Dam et al. 1994) were positively related to water level during the beginning phase of the rainfall (i.e., the rising limb of the discharge peak). During the recession limb of the discharge peak, in contrast, diatom indices showed distinct responses to water-level declines in different rainfall events. These preliminary results indicate that diatom indices are highly related to hydrological processes. The next steps will include identifying the possible mechanisms behind these patterns and exploring the contributions of abiotic variables (e.g., hydrologic indices, nutrients) to diatom community patterns. Based on this and ongoing studies (Wu et al. unpublished data), we will incorporate diatom data into End Member Mixing Analysis (EMMA) and select the tracer set best suited for separating different runoff components in our study catchment. Keywords: Diatoms, Rainfall event, Non-metric multidimensional scaling, Hydrological process, Indices References: Pfister L, McDonnell JJ, Wrede S, Hlúbiková D, Matgen P, Fenicia F, Ector L, Hoffmann L

  16. Episodic events in long-term geological processes: A new classification and its applications

    Directory of Open Access Journals (Sweden)

    Dmitry A. Ruban

    2018-03-01

    Full Text Available Long-term geological processes are usually described with curves reflecting continuous changes in the characteristic parameters through geological history, and such curves can be employed directly for recognition of episodic (relatively short-term) events linked to these changes. The episodic events can be classified into several categories according to their scale (ordinary and anomalous events), "shape" (positive, negative, and neutral events), and relation to long-term trend change (successive, interruptive, facilitative, stabilizing, transformative, increasing, and decreasing events). Many types of these events can be defined depending on the combination of the above-mentioned patterns. Of course, spatial rank, duration, and origin can also be considered in describing these events. The proposed classification can be applied to events in some real long-term geological processes, including global sea-level changes, biodiversity dynamics, lithospheric plate number changes, and palaeoclimate changes. Several case examples prove the usefulness of the classification. It is established that the Early Valanginian (Early Cretaceous) eustatic lowstand (the lowest position of the sea level in the entire Cretaceous) was a negative but ordinary and only interruptive event. In the other case, it becomes clear that only the end-Ordovician and the Permian/Triassic mass extinctions transformed the trends of the biodiversity dynamics (from increase to decrease and from decrease to increase, respectively), and only the Cretaceous/Paleogene mass extinction was a really anomalous event on the Phanerozoic biodiversity curve. The new palaeontological data are employed to reconstruct the diversity dynamics of brachiopods in Germany (without the Alps) and the Swiss Jura Mountains. The further interpretation of both diversity curves implies that the Early Toarcian mass extinction affected the regional brachiopod faunas strongly, but this event was only decreasing

  17. The Development of Narrative Productivity, Syntactic Complexity, Referential Cohesion and Event Content in Four- to Eight-Year-Old Finnish Children

    Science.gov (United States)

    Mäkinen, Leena; Loukusa, Soile; Nieminen, Lea; Leinonen, Eeva; Kunnari, Sari

    2014-01-01

    This study focuses on the development of narrative structure and the relationship between narrative productivity and event content. A total of 172 Finnish children aged between four and eight participated. Their picture-elicited narrations were analysed for productivity, syntactic complexity, referential cohesion and event content. Each measure…

  18. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    Science.gov (United States)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  19. Event dependent sampling of recurrent events

    DEFF Research Database (Denmark)

    Kvist, Tine Kajsa; Andersen, Per Kragh; Angst, Jules

    2010-01-01

    The effect of event-dependent sampling of processes consisting of recurrent events is investigated when analyzing whether the risk of recurrence increases with event count. We study the situation where processes are selected for study if an event occurs in a certain selection interval. Motivation...... retrospective and prospective disease course histories are used. We examine two methods to correct for the selection depending on which data are used in the analysis. In the first case, the conditional distribution of the process given the pre-selection history is determined. In the second case, an inverse...

  20. Initiating events identification of the IS process using the master logic diagram

    International Nuclear Information System (INIS)

    Cho, Nam Chul; Jae, Moo Sung; Yang, Joon Eon

    2005-01-01

    Hydrogen is very attractive as a future secondary energy carrier considering environmental problems. It is important to produce hydrogen from water by use of a carbon-free primary energy source. The thermochemical water decomposition cycle is one of the methods for the hydrogen production process from water. The Japan Atomic Energy Research Institute (JAERI) has been carrying out R&D on the IS (iodine-sulfur) process, first proposed by GA (General Atomic Co.), focusing on demonstrating 'closed-cycle' continuous hydrogen production, on developing a feasible and efficient scheme for HI processing, and on screening and/or developing materials of construction to be used in the corrosive process environment. The successful continuous operation of the IS process was demonstrated, and this process is among the thermochemical processes closest to being industrialized. Currently, Korea has also started research on the IS process, and the construction of an IS process system is planned. In this study, for risk analysis of the IS process, initiating events of the IS process are identified by using the Master Logic Diagram (MLD), a method for initiating event identification

  1. Limitations of red noise in analysing Dansgaard-Oeschger events

    Directory of Open Access Journals (Sweden)

    H. Braun

    2010-02-01

    Full Text Available During the last glacial period, climate records from the North Atlantic region exhibit a pronounced spectral component corresponding to a period of about 1470 years, which has attracted much attention. This spectral peak is closely related to the recurrence pattern of Dansgaard-Oeschger (DO) events. In previous studies a red noise random process, more precisely a first-order autoregressive (AR1) process, was used to evaluate the statistical significance of this peak, with a reported significance of more than 99%. Here we use a simple mechanistic two-state model of DO events, which itself was derived from a much more sophisticated ocean-atmosphere model of intermediate complexity, to numerically evaluate the spectral properties of random (i.e., solely noise-driven) events. This way we find that the power spectral density of random DO events differs fundamentally from that of a simple red noise random process. These results question the applicability of linear spectral analysis for estimating the statistical significance of highly non-linear processes such as DO events. More precisely, to enhance our scientific understanding of the trigger of DO events, we must not consider simple "straw men" such as the AR1 random process, but rather test against realistic alternative descriptions.
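
The AR1 null model criticized above is easy to make concrete. The sketch below is not from the paper; it is a plain-Python illustration with an assumed coefficient phi = 0.7 that simulates x_t = phi * x_(t-1) + eps_t and then recovers phi from the lag-1 autocorrelation, the defining property that such red-noise significance tests rely on:

```python
import random

def simulate_ar1(phi, sigma, n, seed=0):
    """Simulate a first-order autoregressive process: x_t = phi * x_(t-1) + eps_t."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation; for an AR1 process this estimates phi."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

series = simulate_ar1(phi=0.7, sigma=1.0, n=20000, seed=42)
phi_hat = lag1_autocorr(series)  # close to the true phi = 0.7
```

The power spectral density of such a process falls off smoothly with frequency; the abstract's argument is that solely noise-driven DO-like events produce a fundamentally different spectral baseline, so significance measured against this null can be misleading.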

  2. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Full Text Available Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time-delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, the complexities are inevitably enhanced in terms of their degrees or intensities because of the usage of communication networks. Therefore, it would be interesting to (1) examine how this kind of network-enhanced complexity affects the control or filtering performance; and (2) develop some suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances on performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, the developments of the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated by using a thorough literature review and some possible future research directions are highlighted.

  3. Integrating complex business processes for knowledge-driven clinical decision support systems.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  4. Efficient rare-event simulation for multiple jump events in regularly varying random walks and compound Poisson processes

    NARCIS (Netherlands)

    B. Chen (Bohan); J. Blanchet; C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2017-01-01

    textabstractWe propose a class of strongly efficient rare event simulation estimators for random walks and compound Poisson processes with a regularly varying increment/jump-size distribution in a general large deviations regime. Our estimator is based on an importance sampling strategy that hinges

  5. APNEA list mode data acquisition and real-time event processing

    Energy Technology Data Exchange (ETDEWEB)

    Hogle, R.A.; Miller, P. [GE Corporate Research & Development Center, Schenectady, NY (United States); Bramblett, R.L. [Lockheed Martin Specialty Components, Largo, FL (United States)

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight {sup 3}He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real-time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.

  6. Event recognition by detrended fluctuation analysis: An application to Teide-Pico Viejo volcanic complex, Tenerife, Spain

    International Nuclear Information System (INIS)

    Del Pin, Enrico; Carniel, Roberto; Tarraga, Marta

    2008-01-01

    In this work we investigate the application of detrended fluctuation analysis (DFA) to seismic data recorded on the island of Tenerife (Canary Islands, Spain) during the month of July 2004, in a phase of possible unrest of the Teide-Pico Viejo volcanic complex. Tectonic events recorded in the area are recognized and located by the Spanish national agency Instituto Geografico Nacional (IGN), whose catalogue is the only currently available dataset; its completeness unfortunately suffers from the strong presence of anthropogenic noise. In this paper we propose the use of DFA to help identify events automatically. The evaluation of this case study proves DFA to be a promising tool for rapidly screening large seismic datasets and highlighting time windows with the potential presence of discrete events
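
For readers unfamiliar with DFA, a minimal first-order implementation is sketched below. This is the generic textbook algorithm, not the authors' code: integrate the mean-subtracted signal, detrend it linearly in non-overlapping windows of size s, compute the fluctuation function F(s), and fit the scaling exponent alpha from log F(s) versus log s. The window sizes and white-noise test data are arbitrary choices; uncorrelated noise should give alpha near 0.5:

```python
import math
import random

def dfa_exponent(x, scales):
    """First-order DFA: return the slope of log F(s) versus log s."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:                      # integrate the mean-subtracted signal
        acc += v - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        n_win = len(profile) // s
        sq = 0.0
        for w in range(n_win):       # linear detrend in each window, sum squared residuals
            seg = profile[w * s:(w + 1) * s]
            t = list(range(s))
            tbar, ybar = (s - 1) / 2.0, sum(seg) / s
            denom = sum((ti - tbar) ** 2 for ti in t)
            slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, seg)) / denom
            inter = ybar - slope * tbar
            sq += sum((yi - (slope * ti + inter)) ** 2 for ti, yi in zip(t, seg))
        log_s.append(math.log(s))
        log_f.append(math.log(math.sqrt(sq / (n_win * s))))
    mx, my = sum(log_s) / len(log_s), sum(log_f) / len(log_f)
    return sum((a - mx) * (b - my) for a, b in zip(log_s, log_f)) / \
        sum((a - mx) ** 2 for a in log_s)

rng = random.Random(3)
white = [rng.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_exponent(white, [8, 16, 32, 64, 128])  # expected near 0.5 for white noise
```

Windows with correlated, event-like waveforms push alpha away from 0.5, which is the kind of contrast that makes DFA usable for screening long seismic records.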

  7. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
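
The core of any such discrete event simulator is a time-ordered event queue. The fragment below is a generic minimal kernel, with names like `DiscreteEventSimulator` and `heartbeat` invented for illustration rather than taken from the disclosed tool: actions are invoked at scheduled times and may re-schedule themselves, which is how continuous behaviour gets a discrete definition via invocation statements and time delays:

```python
import heapq

class DiscreteEventSimulator:
    """Minimal event-queue kernel: schedule (time, action) pairs, execute in time order."""
    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker keeps insertion order at equal times
        self.now = 0.0
        self.log = []

    def schedule(self, delay, name, action=None):
        heapq.heappush(self._queue, (self.now + delay, self._counter, name, action))
        self._counter += 1

    def run(self):
        while self._queue:           # drain the queue until it is emptied
            self.now, _, name, action = heapq.heappop(self._queue)
            self.log.append((self.now, name))
            if action:
                action(self)

sim = DiscreteEventSimulator()

def heartbeat(s):
    # A self-rescheduling process: a discrete stand-in for continuous behaviour.
    if s.now < 3:
        s.schedule(1.0, "heartbeat", heartbeat)

sim.schedule(0.0, "heartbeat", heartbeat)
sim.schedule(2.5, "sensor-read")
sim.run()
# sim.log interleaves both event streams in time order
```

Mode transitions and mode-dependent processes, as described in the abstract, would be layered on top of exactly this kind of kernel by having actions switch which follow-up events they schedule.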

  8. Integrating Continuous-Time and Discrete-Event Concepts in Process Modelling, Simulation and Control

    NARCIS (Netherlands)

    Beek, van D.A.; Gordijn, S.H.F.; Rooda, J.E.; Ertas, A.

    1995-01-01

    Currently, modelling of systems in the process industry requires the use of different specification languages for the specification of the discrete-event and continuous-time subsystems. In this way, models are restricted to individual subsystems of either a continuous-time or discrete-event nature.

  9. Event-related Potentials Reflecting the Processing of Phonological Constraint Violations

    NARCIS (Netherlands)

    Domahs, Ulrike; Kehrein, Wolfgang; Knaus, Johannes; Wiese, Richard; Schlesewsky, Matthias

    2009-01-01

    How are violations of phonological constraints processed in word comprehension? The present article reports the results of an event-related potentials (ERP) study on a phonological constraint of German that disallows identical segments within a syllable or word (CiVCi). We examined three

  10. The Process of Solving Complex Problems

    Science.gov (United States)

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  11. Complex Parts, Complex Data: Why You Need to Understand What Radiation Single Event Testing Data Does and Doesn't Show and the Implications Thereof

    Science.gov (United States)

    LaBel, Kenneth A.; Berg, Melanie D.

    2015-01-01

    Electronic parts (integrated circuits) have grown in complexity such that determining all failure modes and risks from single particle event testing is impossible. In this presentation, the authors will present why this is so and provide some realism on what this means. It's all about understanding actual risks and not making assumptions.

  12. Extrinsic and intrinsic complexities of the Los Alamos plutonium processing facility

    International Nuclear Information System (INIS)

    Bearse, R.C.; Roberts, N.J.; Longmire, V.L.

    1985-01-01

    Analysis of the data obtained in one year of plutonium accounting at Los Alamos reveals significant complexity. Much of this complexity arises from the complexity of the processes themselves. Additional complexity is induced by errors in the data entry process. It is important to note that there is no evidence that this complexity is adversely affecting the accounting in the plant. The authors have been analyzing transaction data from fiscal year 1983 processing. This study involved 62,595 transactions. The data have been analyzed using the relational database program INGRES on a VAX 11/780 computer. This software allows easy manipulation of the original data and subsets drawn from it. The authors have been attempting for several years to understand the global features of the TA-55 accounting data. This project has underscored several of the system's complexities

  13. Event-related potentials reflecting the processing of phonological constraint violations

    NARCIS (Netherlands)

    Domahs, U.; Kehrein, W.; Knaus, J.; Wiese, R.; Schlesewsky, M.

    2009-01-01

    How are violations of phonological constraints processed in word comprehension? The present article reports the results of an event-related potentials (ERP) study on a phonological constraint of German that disallows identical segments within a syllable or word (CiVCi). We examined three types of

  14. The Mega Events in the processes of foundation and transformation of the city

    Directory of Open Access Journals (Sweden)

    Mario Coletta

    2012-12-01

    Full Text Available Every city arises from an event: an individual decision supported by a collective engagement, disciplined by rules defining technical and legal norms for implementation and management, breeding customs, traditions, rituals and shared behaviours as the roots of culture and civilization. The origin of the foundation city, in the ancient world as in the medieval and modern ages, represents the first major event for the city, which resumes, in its physical and management setting-up, the matrix characteristics of urban planning complexity, putting into dialectic comparison not only "where" (place and space), "when" (age and time) and "how" (form and behaviour), but also "why" and "for whom", which have been the dominant questions of the residential making-process since the beginning of civilisation. "Why" encompasses the ambit of needs, of material and spiritual instances, concrete and abstract, strictly connected with the ambit of will, wishes and ambitions, leading the decisions and policies at the basis of plans, programmes and projects dominated by the "must-can" dialectic. "For whom" determines the transition from the material to the immaterial, from the concreteness of actions to their aims, from the object to the subject, from the operator to the addressee of the operation, recalling the "ethical reason" which finds its deepest roots in the ideology of the sublimation of the "idea", fulfilled by new linguistic assumptions and symbolic messages aiming at extending the sense of membership from the family, brotherhood and tribe to the native land, the territory and the city. In this perspective, the pyramid verticalisation of social relationships disciplining the urban order finds a convenient and comfortable acceptance in a faithful commonality (which guarantees relations of trust) and a progressive process of "civic sense", which makes the general particular and the particular general, connecting rationality

  15. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    Geometric process was first introduced by Lam [10,11]. A stochastic process {X_i, i = 1, 2, …} is called a geometric process (GP) if, for some a > 0, {a^(i-1) X_i, i = 1, 2, …} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best of these four models in analyzing the data from a series of events.
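
The defining relation, {a^(i-1) X_i} forming a renewal process, suggests both a simple simulation and a natural estimator for the ratio a: taking logs gives log X_i = log Y_i - (i-1) log a, so an ordinary least-squares slope of log X_i against i-1 estimates -log a. The sketch below illustrates this with exponential renewal increments; it is a regression-based estimator in the spirit of, but not necessarily identical to, the paper's nonparametric method:

```python
import math
import random

def simulate_gp(a, n, seed=1):
    """X_i = Y_i / a**(i-1) with Y_i i.i.d. Exp(1), so {a**(i-1) X_i} is a renewal process."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0) / a ** i for i in range(n)]

def estimate_ratio(x):
    """OLS slope of log X_i on (i-1) estimates -log a; return exp(-slope)."""
    t = list(range(len(x)))
    logx = [math.log(v) for v in x]
    tbar = sum(t) / len(t)
    ybar = sum(logx) / len(logx)
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, logx)) / \
        sum((ti - tbar) ** 2 for ti in t)
    return math.exp(-slope)

xs = simulate_gp(a=1.05, n=2000, seed=7)
a_hat = estimate_ratio(xs)  # recovers a value close to the true a = 1.05
```

A ratio a > 1 models inter-event times that shrink geometrically (the risk of recurrence grows), a < 1 models improvement, and a = 1 collapses the GP to an ordinary renewal process.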

  16. Event-Related Potentials and Emotion Processing in Child Psychopathology

    Directory of Open Access Journals (Sweden)

    Georgia eChronaki

    2016-04-01

    Full Text Available In recent years there has been increasing interest in the neural mechanisms underlying altered emotional processes in children and adolescents with psychopathology. This review provides a brief overview of the most up-to-date findings in the field of event-related potentials (ERPs) to facial and vocal emotional expressions in the most common child psychopathological conditions. With regard to externalising behaviour (i.e., ADHD, CD), ERP studies show enhanced early components to anger, reflecting enhanced sensory processing, followed by reductions in later components to anger, reflecting reduced cognitive-evaluative processing. With regard to internalising behaviour, research supports models of increased processing of threat stimuli, especially at later, more elaborate and effortful stages. Finally, in autism spectrum disorders abnormalities have been observed at early visual-perceptual stages of processing. An affective neuroscience framework for understanding child psychopathology can be valuable in elucidating underlying mechanisms and informing preventive intervention.

  17. Modeling the Process of Event Sequence Data Generated for Working Condition Diagnosis

    Directory of Open Access Journals (Sweden)

    Jianwei Ding

    2015-01-01

    Full Text Available Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance focuses on analyzing these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all the telemetry data to understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of the WCM, we are able to analyze how event sequence data behave in different working modes and meanwhile to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied the WCM to illustrative applications such as automated detection of an anomalous event sequence during equipment runtime. Our experimental results on real data sets demonstrate the effectiveness of the model.
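
The abstract does not specify the WCM's internals, but its use can be illustrated with a deliberately simplified stand-in: give each working mode its own event-emission distribution, generate sequences from one mode, and diagnose the mode of an observed sequence by maximum log-likelihood. All mode names and probabilities below are invented for illustration:

```python
import math
import random

# Hypothetical per-mode event-emission distributions (the real WCM is richer).
MODES = {
    "normal": {"start": 0.7, "run": 0.25, "warn": 0.05},
    "faulty": {"start": 0.2, "run": 0.3, "warn": 0.5},
}

def generate(mode, n, seed=0):
    """Simulate the generative process: emit n events from the mode's distribution."""
    rng = random.Random(seed)
    events, probs = zip(*MODES[mode].items())
    return rng.choices(events, weights=probs, k=n)

def diagnose(seq):
    """Working condition diagnosis: pick the mode maximizing sequence log-likelihood."""
    def loglik(mode):
        return sum(math.log(MODES[mode][e]) for e in seq)
    return max(MODES, key=loglik)

seq = generate("faulty", 200, seed=11)
mode_hat = diagnose(seq)
```

Anomaly detection follows the same logic in reverse: a sequence whose likelihood is low under every known mode is flagged as anomalous.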

  18. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making is benefited from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
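
The hybrid time-/event-driven idea can be caricatured in a few lines: a time-driven layer checks that measured signals stay within control limits, an event-driven layer checks that the observed event order is legal, and the system-centric verdict combines both. The transition table, event names, and thresholds below are hypothetical, not from the paper:

```python
# Hypothetical legal event orderings for a processing step (illustrative only).
ALLOWED_NEXT = {
    "load": {"dissolve"},
    "dissolve": {"transfer"},
    "transfer": {"load"},
}

def event_layer_ok(events):
    """Event-driven layer: every observed transition must be a legal one."""
    return all(b in ALLOWED_NEXT[a] for a, b in zip(events, events[1:]))

def time_layer_ok(signal, lo, hi):
    """Time-driven layer: the measured signal stays inside control limits."""
    return all(lo <= v <= hi for v in signal)

def assess(signal, events, lo=0.0, hi=10.0):
    """System-centric verdict: normal only if both layers agree."""
    if time_layer_ok(signal, lo, hi) and event_layer_ok(events):
        return "normal"
    return "anomaly"

state = assess([4.2, 5.1, 9.8], ["load", "dissolve", "transfer"])
```

The paper's point is that fusing the two layers catches anomalies that either a pure time-series check or a pure event-sequence check would miss, and lets sensor and data-processing resources be balanced against that joint detection goal.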

  19. Assessment of Sustainability of Sports Events (Slovenia

    Directory of Open Access Journals (Sweden)

    Aleksandra Golob

    2015-04-01

    Full Text Available The events industry plays an important role in today's economy and can have a substantial impact on business success; in addition, sustainability-oriented tourism is becoming an important component of the development and planning of a tourist destination. Thus, organizing sustainability-oriented events is crucial and should focus on zero-waste event management and consider as many elements of sustainable development as possible. The same holds for organizing sports events. The aim of this paper was to find out to what extent the organizers of existing sports events in Slovenia take into account different domains of sustainable development. Answering a common questionnaire, the organizers gave us feedback on four main areas: environmental, social, cultural, and economic criteria. The plan was to determine the level of sustainability of three sports events, to compare them to each other according to the outstanding areas, and to draw attention to the importance of organizing sustainability-oriented sports events and minimizing their negative effects. Since the field of research is complex, dynamic, and interdisciplinary in character, the results were obtained using the DEX software, which supports a qualitative approach and allows the modelling of complex decision-making processes with a large number of parameters and alternatives. Such a methodology enables the input of a preliminary set of sustainability criteria and can be used as support when deciding on the evaluation of the sustainability of events in general.

  20. Focused process improvement events: sustainability of impact on process and performance in an academic radiology department.

    Science.gov (United States)

    Rosenkrantz, Andrew B; Lawson, Kirk; Ally, Rosina; Chen, David; Donno, Frank; Rittberg, Steven; Rodriguez, Joan; Recht, Michael P

    2015-01-01

    To evaluate the sustainability of impact of rapid, focused process improvement (PI) events on process and performance within an academic radiology department. Our department conducted PI during 2011 and 2012 in CT, MRI, ultrasound, breast imaging, and research billing. PI entailed participation by all stakeholders, facilitation by the department chair, collection of baseline data, meetings over several weeks, definition of performance metrics, creation of an improvement plan, and prompt implementation. We explore common themes among PI events regarding initial impact and durability of changes. We also assess performance in each area pre-PI, immediately post-PI, and at the time of the current study. All PI events achieved an immediate improvement in performance metrics, often entailing both examination volumes and on-time performance. IT-based solutions, process standardization, and redefinition of staff responsibilities were often central to these changes, and participants consistently expressed improved internal leadership and problem-solving ability. Major environmental changes commonly occurred after PI, including a natural disaster with equipment loss, a change in location or services offered, and a new enterprise-wide electronic medical record system incorporating new billing and radiology informatics systems, requiring flexibility in the PI implementation plan. Only one PI team conducted regular post-PI follow-up meetings. Sustained improvement was frequently, but not universally, observed: in the long term following initial PI, measures of examination volume showed continued progressive improvements, whereas measures of operational efficiency remained stable or occasionally declined. Focused PI is generally effective in achieving performance improvement, although a changing environment influences the sustainability of impact. Thus, continued process evaluation and ongoing workflow modifications are warranted. Copyright © 2015 American College of Radiology.

  1. Extrinsic and intrinsic complexities of the Los Alamos Plutonium Processing Facility

    International Nuclear Information System (INIS)

    Bearse, R.C.; Longmire, V.L.; Roberts, N.J.

    1985-01-01

    Analysis of the data obtained in one year of plutonium accounting at Los Alamos reveals significant complexity. Much of this complexity arises from the complexity of the processes themselves. Additional complexity is induced by errors in the data entry process. It is important to note that there is no evidence that this complexity is adversely affecting the accounting in the plant. We have been analyzing transaction data from fiscal year 1983 processing. This study involved 62,595 transactions. The data have been analyzed using the relational database program INGRES on a VAX 11/780 computer. This software allows easy manipulation of the original data and subsets drawn from it. We have been attempting for several years to understand the global features of the TA-55 accounting data. This project has underscored several of the system's complexities. Examples reported here include audit trails, lot-name multiplicity, etc.

  2. Processing ser and estar to locate objects and events

    Science.gov (United States)

    Dussias, Paola E.; Contemori, Carla; Román, Patricia

    2016-01-01

    In Spanish locative constructions, a different form of the copula is selected in relation to the semantic properties of the grammatical subject: sentences that locate objects require estar while those that locate events require ser (both translated in English as ‘to be’). In an ERP study, we examined whether second language (L2) speakers of Spanish are sensitive to the selectional restrictions that the different types of subjects impose on the choice of the two copulas. Twenty-four native speakers of Spanish and two groups of L2 Spanish speakers (24 beginners and 18 advanced speakers) were recruited to investigate the processing of ‘object/event + estar/ser’ permutations. Participants provided grammaticality judgments on correct (object + estar; event + ser) and incorrect (object + ser; event + estar) sentences while their brain activity was recorded. In line with previous studies (Leone-Fernández, Molinaro, Carreiras, & Barber, 2012; Sera, Gathje, & Pintado, 1999), the results of the grammaticality judgment for the native speakers showed that participants correctly accepted object + estar and event + ser constructions. In addition, while ‘object + ser’ constructions were considered grossly ungrammatical, ‘event + estar’ combinations were perceived as unacceptable to a lesser degree. For these same participants, ERP recording time-locked to the onset of the critical word ‘en’ showed a larger P600 for the ser predicates when the subject was an object than when it was an event (*La silla es en la cocina vs. La fiesta es en la cocina). This P600 effect is consistent with syntactic repair of the defining predicate when it does not fit with the adequate semantic properties of the subject. For estar predicates (La silla está en la cocina vs. *La fiesta está en la cocina), the findings showed a central-frontal negativity between 500–700 ms. Grammaticality judgment data for the L2 speakers of Spanish showed that beginners were significantly less

  3. Dogs cannot bark: event-related brain responses to true and false negated statements as indicators of higher-order conscious processing.

    Science.gov (United States)

    Herbert, Cornelia; Kübler, Andrea

    2011-01-01

    The present study investigated event-related brain potentials elicited by true and false negated statements to evaluate if discrimination of the truth value of negated information relies on conscious processing and requires higher-order cognitive processing in healthy subjects across different levels of stimulus complexity. The stimulus material consisted of true and false negated sentences (sentence level) and prime-target expressions (word level). Stimuli were presented acoustically and no overt behavioral response of the participants was required. Event-related brain potentials to target words preceded by true and false negated expressions were analyzed both within group and at the single subject level. Across the different processing conditions (word pairs and sentences), target words elicited a frontal negativity and a late positivity in the time window from 600-1000 msec post target word onset. Amplitudes of both brain potentials varied as a function of the truth value of the negated expressions. Results were confirmed at the single-subject level. In sum, our results support recent suggestions according to which evaluation of the truth value of a negated expression is a time- and cognitively demanding process that cannot be solved automatically, and thus requires conscious processing. Our paradigm provides insight into higher-order processing related to language comprehension and reasoning in healthy subjects. Future studies are needed to evaluate if our paradigm also proves sensitive for the detection of consciousness in non-responsive patients.

  4. Dogs cannot bark: event-related brain responses to true and false negated statements as indicators of higher-order conscious processing.

    Directory of Open Access Journals (Sweden)

    Cornelia Herbert

    Full Text Available The present study investigated event-related brain potentials elicited by true and false negated statements to evaluate if discrimination of the truth value of negated information relies on conscious processing and requires higher-order cognitive processing in healthy subjects across different levels of stimulus complexity. The stimulus material consisted of true and false negated sentences (sentence level) and prime-target expressions (word level). Stimuli were presented acoustically and no overt behavioral response of the participants was required. Event-related brain potentials to target words preceded by true and false negated expressions were analyzed both within group and at the single subject level. Across the different processing conditions (word pairs and sentences), target words elicited a frontal negativity and a late positivity in the time window from 600-1000 msec post target word onset. Amplitudes of both brain potentials varied as a function of the truth value of the negated expressions. Results were confirmed at the single-subject level. In sum, our results support recent suggestions according to which evaluation of the truth value of a negated expression is a time- and cognitively demanding process that cannot be solved automatically, and thus requires conscious processing. Our paradigm provides insight into higher-order processing related to language comprehension and reasoning in healthy subjects. Future studies are needed to evaluate if our paradigm also proves sensitive for the detection of consciousness in non-responsive patients.

  5. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  6. Features, Events, and Processes: System Level

    International Nuclear Information System (INIS)

    D. McGregor

    2004-01-01

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760)

  7. An experimental evaluation of passage-based process discovery

    NARCIS (Netherlands)

    Verbeek, H.M.W.; Aalst, van der W.M.P.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the area of process mining, the ILP Miner is known for the fact that it always returns a Petri net that perfectly fits a given event log. Like for most process discovery algorithms, its complexity is linear in the size of the event log and exponential in the number of event classes (i.e.,

  8. Low latitude ionospheric TEC responses to dynamical complexity quantifiers during transient events over Nigeria

    Science.gov (United States)

    Ogunsua, Babalola

    2018-04-01

    In this study, the values of chaoticity and dynamical complexity parameters were computed for selected storm periods in 2011 and 2012, using detrended TEC data sets measured at the Birnin-Kebbi, Torro and Enugu global positioning system (GPS) receiver stations in Nigeria. The significance of difference (SD) values were mostly greater than 1.96, but surprisingly lower than 1.96 on September 29, 2011. The computed SD values were also found to be reduced in most cases just after the geomagnetic storm, with immediate recovery a day after the main phase of the storm, while the values of the Lyapunov exponent and Tsallis entropy remained reduced due to the influence of geomagnetic storms. The Lyapunov exponent and Tsallis entropy were also observed to follow similar variation patterns during storm periods in most cases. Surprisingly, lower values of these dynamical quantifiers were also recorded during the solar flare events of August 8 and 9, 2011. The possible mechanisms responsible for these observations are discussed in this work. Overall, our observations show that the ionospheric effects of some transient events other than geomagnetic storms can also be revealed by variations in chaoticity and dynamical complexity.
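The Tsallis entropy used as a complexity quantifier in this record has a simple closed form, S_q = (1 − Σ p_i^q)/(q − 1), where the p_i are bin probabilities of the signal. A minimal sketch (the entropic index q and the bin count are illustrative choices, not those of the study):

```python
# Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1) of a histogram-binned
# signal; q and the number of bins are illustrative, not the study's values.
import math

def tsallis_entropy(signal, q=2.0, bins=10):
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0          # avoid zero width for flat signals
    counts = [0] * bins
    for x in signal:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    probs = [c / len(signal) for c in counts if c]
    if q == 1.0:                             # Boltzmann-Gibbs limit
        return -sum(p * math.log(p) for p in probs)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

uniform = [i / 99 for i in range(100)]       # spread-out signal: high entropy
constant = [5.0] * 100                       # degenerate signal: zero entropy
```

A drop in this quantity after a storm, as reported above, corresponds to the signal's probability mass concentrating in fewer bins.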

  9. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process, and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, owing to the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, speedy training of young specialists is being achieved, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At large hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.

  10. Modeling associations between latent event processes governing time series of pulsing hormones.

    Science.gov (United States)

    Liu, Huayu; Carlson, Nichole E; Grunwald, Gary K; Polotsky, Alex J

    2017-10-31

    This work is motivated by a desire to quantify relationships between two time series of pulsing hormone concentrations. The locations of pulses are not directly observed and may be considered latent event processes. The latent event processes of pulsing hormones are often associated. It is this joint relationship we model. Current approaches to jointly modeling pulsing hormone data generally assume that a pulse in one hormone is coupled with a pulse in another hormone (one-to-one association). However, pulse coupling is often imperfect. Existing joint models are not flexible enough for imperfect systems. In this article, we develop a more flexible class of pulse association models that incorporate parameters quantifying imperfect pulse associations. We propose a novel use of the Cox process model as a model of how pulse events co-occur in time. We embed the Cox process model into a hormone concentration model. Hormone concentration is the observed data. Spatial birth and death Markov chain Monte Carlo is used for estimation. Simulations show the joint model works well for quantifying both perfect and imperfect associations and offers estimation improvements over single hormone analyses. We apply this model to luteinizing hormone (LH) and follicle stimulating hormone (FSH), two reproductive hormones. Use of our joint model results in an ability to investigate novel hypotheses regarding associations between LH and FSH secretion in obese and non-obese women. © 2017, The International Biometric Society.
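The imperfect pulse coupling this record models can be illustrated with a toy simulation (the pulse rate, coupling probability kappa, lag distribution, and all names are invented for illustration, not the authors' model): pulses of one hormone arrive as a Poisson process, and each triggers a pulse of the second hormone only with probability kappa, so the association is deliberately not one-to-one.

```python
# Toy simulation of imperfectly coupled pulse event processes: hormone A
# pulses arrive as a homogeneous Poisson process; each A-pulse triggers a
# B-pulse only with probability kappa (rates/kappa invented for illustration).
import random

def simulate_pulses(rate=2.0, horizon=24.0, kappa=0.8, seed=7):
    rng = random.Random(seed)
    t, a_times, b_times = 0.0, [], []
    while True:
        t += rng.expovariate(rate)          # exponential inter-pulse gaps
        if t > horizon:
            break
        a_times.append(t)
        if rng.random() < kappa:            # imperfect coupling
            b_times.append(t + rng.uniform(0.0, 0.5))   # small secretion lag
    return a_times, b_times

a, b = simulate_pulses()
coupling_fraction = len(b) / len(a)          # empirical estimate of kappa
```

In the paper's framework the latent pulse locations are unobserved and kappa-like association parameters are estimated from concentration data; the sketch only shows the generative idea of imperfect one-to-one coupling.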

  11. Processing data communications events by awakening threads in parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2016-03-15

    Processing data communications events in a parallel active messaging interface (`PAMI`) of a parallel computer that includes compute nodes that execute a parallel application, with the PAMI including data communications endpoints, and the endpoints are coupled for data communications through the PAMI and through other data communications resources, including determining by an advance function that there are no actionable data communications events pending for its context, placing by the advance function its thread of execution into a wait state, waiting for a subsequent data communications event for the context; responsive to occurrence of a subsequent data communications event for the context, awakening by the thread from the wait state; and processing by the advance function the subsequent data communications event now pending for the context.
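The claimed mechanism, an advance function that places its thread into a wait state when no events are pending and is awakened by a subsequent event, maps naturally onto a condition variable. A minimal sketch (class and method names are illustrative, not the PAMI API):

```python
# Sketch of an "advance" loop that sleeps when no events are pending and
# is awakened when an event arrives (illustrative names, not the PAMI API).
import threading

class Context:
    def __init__(self):
        self.events = []
        self.processed = []
        self.cond = threading.Condition()
        self.done = False

    def post_event(self, ev):
        with self.cond:
            self.events.append(ev)
            self.cond.notify()              # awaken a waiting advance thread

    def shutdown(self):
        with self.cond:
            self.done = True
            self.cond.notify_all()

    def advance(self):
        while True:
            with self.cond:
                while not self.events and not self.done:
                    self.cond.wait()        # thread enters wait state
                if self.done and not self.events:
                    return
                ev = self.events.pop(0)
            self.processed.append(ev)       # process outside the lock

ctx = Context()
t = threading.Thread(target=ctx.advance)
t.start()
for ev in ["put", "get", "barrier"]:
    ctx.post_event(ev)
ctx.shutdown()
t.join()
```

The design point the patent abstract makes is the middle branch: instead of spinning when no actionable events are pending, the advance thread blocks and is woken only by the arrival of a subsequent event for its context.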

  12. Rare event simulation using Monte Carlo methods

    CERN Document Server

    Rubino, Gerardo

    2009-01-01

    In a probabilistic model, a rare event is an event with a very small probability of occurrence. Forecasting rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or a nuclear power plant, or the failure of an information processing system in a bank or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, the simulation of corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields, ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
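Importance sampling, one of the two tools the book presents, can be sketched in a few lines: to estimate the rare tail probability P(X > c) for a standard normal X, sample from a normal shifted into the rare region and reweight each hit by the likelihood ratio (the threshold c = 4 and the shifted-mean proposal are illustrative choices):

```python
# Importance sampling estimate of the rare event P(X > c), X ~ N(0, 1),
# by sampling from N(c, 1) and reweighting with the likelihood ratio.
import math
import random

def tail_prob_is(c=4.0, n=100_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(c, 1.0)               # proposal centred on the rare region
        if y > c:
            # likelihood ratio phi(y) / phi(y - c) = exp(c^2/2 - c*y)
            total += math.exp(0.5 * c * c - c * y)
    return total / n

exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))   # P(X > 4), about 3.2e-05
estimate = tail_prob_is()
```

With crude Monte Carlo, roughly 3 million samples would be needed to see even 100 occurrences of this event; under the shifted proposal about half of all samples land in the rare region, which is exactly the variance reduction the book's importance-sampling chapters formalize.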

  13. Characterisation of sub-micron particle number concentrations and formation events in the western Bushveld Igneous Complex, South Africa

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2012-05-01

    Full Text Available South Africa holds significant mineral resources, with a substantial fraction of these reserves occurring and being processed in a large geological structure termed the Bushveld Igneous Complex (BIC). The area is also highly populated by informal, semi-formal and formal residential developments. However, knowledge of air quality and research related to the atmosphere is still very limited in the area. In order to investigate the characteristics and processes affecting sub-micron particle number concentrations and formation events, air ion and aerosol particle size distributions and number concentrations, together with meteorological parameters, trace gases and particulate matter (PM), were measured for over two years at Marikana in the heart of the western BIC. The observations showed that trace gas (i.e. SO2, NOx, CO) and black carbon concentrations were relatively high, but in general within the limits of local air quality standards. The area was characterised by very high condensation sink due to background aerosol particles, PM10 and O3 concentration. The results indicated that high amounts of Aitken and accumulation mode particles originated from domestic burning for heating and cooking in the morning and evening, while during daytime SO2-based nucleation followed by the growth by condensation of vapours from industrial, residential and natural sources was the most probable source for large number concentrations of nucleation and Aitken mode particles. Nucleation event day frequency was extremely high, i.e. 86% of the analysed days, which to the knowledge of the authors is the highest frequency ever reported. The air mass back trajectory and wind direction analyses showed that the secondary particle formation was influenced both by local and regional pollution and vapour sources. Therefore, our observation of the annual cycle and magnitude of the particle formation and growth rates during

  14. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

    Full Text Available Abstract Background The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  15. Procurement of complex performance in public infrastructure: a process perspective

    OpenAIRE

    Hartmann, Andreas; Roehrich, Jens; Davies, Andrew; Frederiksen, Lars; Davies, J.; Harrington, T.; Kirkwood, D.; Holweg, M.

    2011-01-01

    The paper analyzes the process of transitioning from procuring single products and services to procuring complex performance in public infrastructure. The aim is to examine the change in the interactions between buyer and supplier, the emergence of value co-creation and the capability development during the transition process. Based on a multiple, longitudinal case study the paper proposes three generic transition stages towards increased performance and infrastructural complexity. These stag...

  16. Managing unforeseen events in production scheduling and control

    DEFF Research Database (Denmark)

    Arica, E.; Falster, Peter; Hvolby, H. H.

    2016-01-01

    The production planning and control process is performed within complex and dynamic organizations made up of customer expectations, equipment, materials, people, information, and technologies. Changes in both internal and external factors can create a variety of unforeseen events, which make initial plans unfeasible or obsolete during production execution. How to effectively handle the unscheduled events and take corrective actions still remains a central question to academics and practitioners. In this paper, we explore this issue through a review of the relevant literature and an in...

  17. Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores.

    Directory of Open Access Journals (Sweden)

    Lorea Flores

    Full Text Available Habitat complexity can influence predation rates (e.g. by providing refuge) but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants), in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, 'habitat complexity' by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems.

  18. The cost of conservative synchronization in parallel discrete event simulations

    Science.gov (United States)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approach the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.
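The synchronous conservative protocol analyzed in this record advances each logical process (LP) only up to a horizon that every LP agrees is safe, derived from the earliest pending event plus the model's lookahead. A heavily simplified sketch of one such simulation loop (the two-LP event lists and the lookahead value are invented; real protocols also generate new events and exchange null messages):

```python
# Sketch of a synchronous conservative protocol: in each round, every
# logical process (LP) may only execute events up to the shared safe
# horizon, min(earliest pending event over all LPs) + lookahead.
import heapq

def run_conservative(event_queues, lookahead):
    """event_queues: {lp: list of event timestamps}. Returns execution order."""
    heaps = {lp: list(times) for lp, times in event_queues.items()}
    for h in heaps.values():
        heapq.heapify(h)
    executed = []
    while any(heaps.values()):
        # Synchronization point: compute the horizon all LPs agree is safe.
        next_times = [h[0] for h in heaps.values() if h]
        horizon = min(next_times) + lookahead
        for lp, h in sorted(heaps.items()):
            while h and h[0] <= horizon:        # each LP works independently
                executed.append((lp, heapq.heappop(h)))
    return executed

order = run_conservative({"A": [1.0, 5.0], "B": [2.0, 9.0]}, lookahead=1.5)
```

The per-round synchronization barrier is the overhead whose asymptotic cost the paper analyzes: as the volume of events per round grows, the fixed cost of computing the horizon is amortized toward the per-event cost of a serial simulation.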

  19. A computer interface for processing multi-parameter data of multiple event types

    International Nuclear Information System (INIS)

    Katayama, I.; Ogata, H.

    1980-01-01

    A logic circuit called a 'Raw Data Processor' (RDP), which functions as an interface between ADCs and the PDP-11 computer, has been developed at RCNP, Osaka University, for general use. It enables simultaneous data processing for up to 16 event types, and an arbitrary combination of any number of ADCs up to 14 can be assigned to each event type by means of a pinboard matrix. The details of the RDP and its application are described. (orig.)
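The pinboard matrix described here is, in software terms, a 16 × 14 boolean routing matrix: one row per event type, one column per ADC. A sketch of how such routing could be represented with one 14-bit mask per event type (the particular assignments are invented for illustration):

```python
# Sketch of a pinboard matrix routing up to 14 ADCs to up to 16 event types:
# one 14-bit mask per event type (assignments invented for illustration).
PINBOARD = {
    0: 0b00000000000011,   # event type 0 reads ADC 0 and ADC 1
    1: 0b00000000011100,   # event type 1 reads ADCs 2-4
    2: 0b11000000000000,   # event type 2 reads ADCs 12-13
}

def adcs_for_event(event_type):
    """Return the list of ADC indices wired to this event type."""
    mask = PINBOARD.get(event_type, 0)
    return [i for i in range(14) if mask >> i & 1]
```

An acquisition loop would use `adcs_for_event` to know which ADC words to read out for each tagged event, which is exactly the flexibility the hardware pinboard provided.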

  20. Events and mega events: leisure and business in tourism

    Directory of Open Access Journals (Sweden)

    Ricardo Alexandre Paiva

    2015-12-01

    Full Text Available The promotion of events and mega events mobilizes at the same time, in a concatenated way or not, leisure and business practices, which are captured by the tourism industry as a stimulus for the reproduction of capitalism, by the amount of other activities which raise (primary, secondary and tertiary , placing the architecture and the city as protagonists in contemporary urban development. In this sense, the article analyzes the articulation of events and mega events to the provision of architecture and urban infrastructure, as well as the construction of the tourist image of the places, motivated by leisure and business activities. The methodological procedures have theoretical and exploratory character and have multidisciplinary intentions. This will be discussed, in a historical perspective, the concepts of leisure and business activities that raise as moving or traveling; next it will be delimited similarities and differences between tourism events and business tourism, entering after the analysis of the distinctions between events and mega events, highlighting the complexity and the role of mega-events as a major symptom of globalization; finally it will be presented the spatial scale developments in architecture and the city in the realization of (mega events, as well as its impact on the city's image. As a synthesis, it is important to notice that spatial developments business tourism, events and mega events are manifested in various scales and with different levels of complexity, revealing the strengths and / or weaknesses of the places. The urban planning, architecture and urbanism are important objects of knowledge and spatial intervention to ensure infrastructure and urban and architectural structures appropriate for events, which should be sensitive to the demands of tourists and host communities.

  1. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    Directory of Open Access Journals (Sweden)

    Xiaoqing Hao

    Full Text Available To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiments: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed sentiment mode complex network of online public opinions (SMCOP) with modes as nodes and the conversion relation in chronological order between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys power law. Most posts' sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core mode, respectively. Few modes have larger betweenness centrality values and most modes convert to each other with these higher betweenness centrality modes as mediums. Therefore, the relevant person or institutes can take measures to lead people's sentiments regarding online hot events according to the sentiment diffusion mechanism.
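The coarse-graining step in this record, turning a chronological sequence of per-post sentiment symbols into five-symbol modes and linking consecutive modes, can be sketched directly (the symbol sequence below is invented; only the standard library is used, no graph package):

```python
# Build a sentiment-mode network: slide a window of 5 sentiment symbols
# over the post sequence; nodes are modes, directed edges link consecutive
# modes in chronological order, weighted by transition frequency.
from collections import Counter

SYMBOLS = {"P", "p", "o", "n", "N"}   # strongly/weakly positive, neutral, etc.

def mode_network(sentiments, window=5):
    assert set(sentiments) <= SYMBOLS
    modes = ["".join(sentiments[i:i + window])
             for i in range(len(sentiments) - window + 1)]
    edges = Counter(zip(modes, modes[1:]))     # directed, weighted edges
    strength = Counter()                       # node strength = sum of weights
    for (u, v), w in edges.items():
        strength[u] += w
        strength[v] += w
    return edges, strength

posts = list("pppppooooopppp")                 # invented post sentiment sequence
edges, strength = mode_network(posts)
```

On a real corpus, the `strength` distribution is what the authors report as obeying a power law, with modes such as `ppppp` and `ooooo` emerging as subgroup cores.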

  2. A unified approach of catastrophic events

    Directory of Open Access Journals (Sweden)

    S. Nikolopoulos

    2004-01-01

    Full Text Available Although there is an accumulated body of theoretical, computational, and numerical work, such as catastrophe theory, bifurcation theory, and stochastic and deterministic chaos theory, there is a persistent feeling that these approaches do not completely cover the physics of real catastrophic events. Recent studies have suggested that a large variety of complex processes, including earthquakes, heartbeats, and neuronal dynamics, exhibit statistical similarities. Here we study, in terms of complexity and non-linear techniques, whether isomorphic signatures emerge that indicate the transition from the normal state to a shock, in both geological and biological systems. In the last 15 years, the study of complex systems has emerged as a recognized field in its own right, although a good definition of what a complex system actually is remains elusive. A basic reason for our interest in complexity is the striking similarity in behaviour close to irreversible phase transitions among systems that are otherwise quite different in nature. It is by now recognized that pre-seismic electromagnetic time series contain valuable information about the earthquake preparation process, which cannot be extracted without the use of substantial computational power, probably in connection with computer algebra techniques. This paper presents an analysis whose aim is to indicate the approach of the global instability in the pre-focal area. Non-linear characteristics are studied by applying two techniques, namely Correlation Dimension Estimation and Approximate Entropy. These two non-linear techniques yield coherent conclusions, and could cooperate with an independent fractal spectral analysis to provide a detection of the emergence of the nucleation phase of the impending catastrophic event. 
In the context of a similar mathematical background, it would be interesting to augment this description of pre-seismic electromagnetic anomalies in order to cover biological
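Of the two techniques named in this record, Approximate Entropy has a compact standard definition (the Pincus formulation: ApEn = Φ^m(r) − Φ^{m+1}(r) over template matches under the Chebyshev distance). The sketch below is a generic implementation for illustration, not the authors' code; the parameters m and r are conventional defaults, not values from the paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate Entropy ApEn(m, r) of a 1-D series:
    Phi(m) - Phi(m+1), where Phi(m) is the average log-frequency of
    m-length template matches under the Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                    # common rule of thumb for the tolerance
    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # pairwise Chebyshev distances between all templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = np.sum(dist <= r, axis=1) / n   # self-match included, as in Pincus
        return np.mean(np.log(counts))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.tile([1.0, 2.0], 50)   # perfectly periodic series: ApEn near 0
noisy = rng.standard_normal(100)    # irregular series: markedly higher ApEn
```

A regular signal yields ApEn close to zero while an irregular one yields a clearly positive value, which is the contrast such analyses exploit when looking for a transition toward the catastrophic event.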

  3. Anxiety symptoms mediate the relationship between exposure to stressful negative life events and depressive symptoms: A conditional process modelling of the protective effects of resilience.

    Science.gov (United States)

    Anyan, Frederick; Worsley, Lyn; Hjemdal, Odin

    2017-10-01

    Resilience has provided a useful framework that elucidates the effects of protective factors in overcoming psychological adversities, but studies that address the potential contingencies of resilience to protect against direct and indirect negative effects are lacking. These gaps have also resulted in oversimplification of complex processes that can be clarified by moderated mediation associations. This study examines a conditional process modelling of the protective effects of resilience against indirect effects. Two separate samples were recruited in a cross-sectional survey from Australia and Norway to complete the Patient Health Questionnaire-9, Generalized Anxiety Disorder scale, Stressful Negative Life Events Questionnaire and the Resilience Scale for Adults. The final sample sizes were 206 (females=114; males=91; other=1) and 210 (females=155; males=55) for Australia and Norway, respectively. Moderated mediation analyses were conducted across the samples. Anxiety symptoms mediated the relationship between exposure to stressful negative life events and depressive symptoms in both samples. Conditional indirect effects of exposure to stressful negative life events on depressive symptoms, mediated by anxiety symptoms, showed that the high-resilience subgroup was associated with a smaller effect of exposure to stressful negative life events, through anxiety symptoms, on depressive symptoms than the low-resilience subgroup. As a cross-sectional survey, the present study does not answer questions about causal processes despite the use of a conditional process modelling. These findings support the view that resilience protective resources can protect against psychological adversities both directly and indirectly, through other channels. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Replication, refinement & reachability: complexity in dynamic condition-response graphs

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas T.; Slaats, Tijs

    2017-01-01

    We explore the complexity of reachability and run-time refinement under safety and liveness constraints in event-based process models. Our study is framed in the DCR? process language, which supports modular specification through a compositional operational semantics. DCR? encompasses the “Dynami...

  5. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Splines (NURBS) and multiple levels of detail (Mixed and Reverse LoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted for use in parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure in the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  6. Attentional Mechanisms in Sports via Brain-Electrical Event-Related Potentials

    Science.gov (United States)

    Hack, Johannes; Memmert, Daniel; Rup, Andre

    2009-01-01

    In this study, we examined attention processes in complex, sport-specific decision-making tasks without interdependencies from anticipation. Psychophysiological and performance data recorded from advanced and intermediate level basketball referees were compared. Event-related potentials obtained while judging game situations in foul recognition…

  7. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. 
Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
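The traversal described in this record can be illustrated with a small sketch. This is illustrative only, not the EVNTRE source; the tree encoding, branch labels, and cutoff value are assumptions. Each question contributes split fractions, a path's frequency is the product of the initial frequency and the fractions taken along it, and exploration is pruned below a cutoff tolerance.

```python
def enumerate_paths(questions, init_freq=1.0, cutoff=1e-6):
    """Enumerate event-tree leaves depth-first.
    `questions` is a list of branch lists; each branch is a
    (label, split_fraction) pair, with fractions summing to 1 per question.
    Paths whose running frequency falls below `cutoff` are pruned."""
    leaves = []
    def walk(level, path, freq):
        if freq < cutoff:
            return                      # cutoff tolerance: stop exploring this path
        if level == len(questions):
            leaves.append((tuple(path), freq))
            return
        for label, frac in questions[level]:
            walk(level + 1, path + [label], freq * frac)
    walk(0, [], init_freq)
    return leaves

# hypothetical two-question tree (labels and fractions invented)
tree = [
    [("pump-fails", 0.01), ("pump-ok", 0.99)],
    [("valve-fails", 0.05), ("valve-ok", 0.95)],
]
leaves = enumerate_paths(tree)
```

Because the split fractions of each question sum to unity, the leaf frequencies sum to the initial frequency whenever nothing is pruned; path-dependent fractions and binning via Boolean expressions, as in EVNTRE, would be layered on top of this core traversal.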

  8. MadEvent: automatic event generation with MadGraph

    International Nuclear Information System (INIS)

    Maltoni, Fabio; Stelzer, Tim

    2003-01-01

    We present a new multi-channel integration method and its implementation in the multi-purpose event generator MadEvent, which is based on MadGraph. Given a process, MadGraph automatically identifies all the relevant subprocesses, generates both the amplitudes and the mappings needed for an efficient integration over the phase space, and passes them to MadEvent. As a result, a process-specific, stand-alone code is produced that allows the user to calculate cross sections and produce unweighted events in a standard output format. Several examples are given for processes that are relevant for physics studies at present and forthcoming colliders. (author)

  9. Component Neural Systems for the Creation of Emotional Memories during Free Viewing of a Complex, Real-World Event.

    Science.gov (United States)

    Botzung, Anne; Labar, Kevin S; Kragel, Philip; Miles, Amanda; Rubin, David C

    2010-01-01

    To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously-viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally-relevant event sequences.

  10. Component neural systems for the creation of emotional memories during free viewing of a complex, real-world event

    Directory of Open Access Journals (Sweden)

    Anne Botzung

    2010-05-01

    Full Text Available To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously-viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan’s perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally-relevant event sequences.

  11. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    Science.gov (United States)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and importantly, their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner can hopefully derive important insights and aid in development of improved process representations.

  12. ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events

    CERN Document Server

    Oliveira Damazio, Denis; The ATLAS collaboration

    2009-01-01

    The ATLAS detector is undergoing an intense commissioning effort with cosmic rays in preparation for the first LHC collisions next spring. Combined runs with all of the ATLAS subsystems are being taken in order to evaluate the detector performance. This is a unique opportunity for the trigger system, too, to be studied under different detector operation modes, such as different event rates and detector configurations. The ATLAS trigger starts with a hardware-based system which tries to identify detector regions where interesting physics objects may be found (e.g. large energy depositions in the calorimeter system). An approved event will be further processed by more complex software algorithms at the second level, where detailed features are extracted (full-granularity detector data for small portions of the detector are available). Events accepted at this level will be further processed at the so-called event filter level. Full detector data at full granularity are available for offline-like processing with complete calib...

  13. Emotional processing and psychopathic traits in male college students: An event-related potential study.

    Science.gov (United States)

    Medina, Amy L; Kirilko, Elvira; Grose-Fifer, Jillian

    2016-08-01

    Emotional processing deficits are often considered a hallmark of psychopathy. However, there are relatively few studies that have investigated how the late positive potential (LPP) elicited by both positive and negative emotional stimuli is modulated by psychopathic traits, especially in undergraduates. Attentional deficits have also been posited to be associated with emotional blunting in psychopathy; consequently, results from previous studies may have been influenced by task demands. Therefore, we investigated the relationship between the neural correlates of emotional processing and psychopathic traits by measuring event-related potentials (ERPs) during a task with a relatively low cognitive load. A group of male undergraduates were classified as having either high or low levels of psychopathic traits according to their total scores on the Psychopathic Personality Inventory - Revised (PPI-R). A subgroup of these participants then passively viewed complex emotional and neutral images from the International Affective Picture System (IAPS) while their EEGs were recorded. As hypothesized, the late LPP elicited by emotional pictures was in general significantly reduced for participants with high Total PPI-R scores relative to those with low scores, especially for pictures that were rated as less emotionally arousing. Our data suggest that male undergraduates with high, but subclinical, levels of psychopathic traits did not maintain continued higher-order processing of affective information, especially when it was perceived to be less arousing in nature. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation and understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug release.

  15. Features, Events, and Processes in UZ and Transport

    Energy Technology Data Exchange (ETDEWEB)

    P. Persoff

    2004-11-06

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either "Included" or "Excluded", is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA.

  16. Uncertainty Reduction for Stochastic Processes on Complex Networks

    Science.gov (United States)

    Radicchi, Filippo; Castellano, Claudio

    2018-05-01

    Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.

  17. Certain aspects of the reactivity of carotenoids. Redox processes and complexation

    International Nuclear Information System (INIS)

    Polyakov, Nikolay E; Leshina, Tatyana V

    2006-01-01

    The published data on the redox reactions of carotenoids, their supramolecular inclusion complexes and the composition, properties and practical application of these complexes are generalised. Special attention is given to the effect of complexation on radical processes involving carotenoids and on the antioxidant activity of carotenoids.

  18. Novel Complexity Indicator of Manufacturing Process Chains and Its Relations to Indirect Complexity Indicators

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2017-01-01

    Full Text Available Manufacturing systems can be considered as a network of machines/workstations, where parts are produced in a flow shop or job shop environment, respectively. Such a network of machines/workstations can be depicted as a graph, with machines as nodes and material flow between the nodes as links. The aim of this paper is to use sequences of operations and the machine network to measure the static complexity of manufacturing processes. To this end, existing approaches to measuring the static complexity of manufacturing systems are analyzed and subsequently compared. For this purpose, the analyzed competing complexity indicators were tested on two different manufacturing layout examples. A subsequent analysis showed the relevant potential of the proposed method.
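The abstract does not spell out the proposed indicator itself. As a loose illustration of how a static complexity index can be computed over such a machine network, the sketch below uses a simple degree-share entropy; this is not the paper's indicator, and the machine names and material flows are invented.

```python
import math
from collections import defaultdict

def structural_complexity(edges):
    """Entropy-based static complexity of a machine network:
    treat each machine's share of total (in+out) degree as a
    probability and return the Shannon entropy of that
    distribution, in bits."""
    degree = defaultdict(int)
    for a, b in edges:                 # each edge is a directed material flow
        degree[a] += 1
        degree[b] += 1
    total = sum(degree.values())
    return -sum((d / total) * math.log2(d / total) for d in degree.values())

# flow shop: a simple chain of machines
flow_shop = [("M1", "M2"), ("M2", "M3"), ("M3", "M4")]
# job shop: the same machines with additional cross-flows
job_shop = flow_shop + [("M1", "M3"), ("M2", "M4"), ("M4", "M1")]
```

On these toy layouts the job-shop network scores higher than the flow-shop chain, matching the intuition that richer flow structure means higher static complexity, though a serious indicator would also account for operation sequences as the paper does.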

  19. Event Processing and Variable Part of Sample Period Determining in Combined Systems Using GA

    Science.gov (United States)

    Strémy, Maximilián; Závacký, Pavol; Jedlička, Martin

    2011-01-01

    This article deals with combined dynamic systems and usage of modern techniques in dealing with these systems, focusing particularly on sampling period design, cyclic processing tasks and related processing algorithms in the combined event management systems using genetic algorithms.

  20. EMiT: a process mining tool

    NARCIS (Netherlands)

    Dongen, van B.F.; Aalst, van der W.M.P.; Cortadella, J.; Reisig, W.

    2004-01-01

    Process mining offers a way to distill process models from event logs originating from transactional systems in logistics, banking, e-business, health-care, etc. The algorithms used for process mining are complex and in practice large logs are needed to derive a high-quality process model. To
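Process-mining algorithms typically start from the ordering of events within each case. As a hedged illustration (not EMiT's own algorithm; the log contents are invented), the sketch below extracts the directly-follows relation from an event log, the usual first step before a model-discovery algorithm such as the alpha algorithm is applied.

```python
from collections import defaultdict

def directly_follows(log):
    """Given an event log as {case_id: [activity, ...]} with each trace
    in chronological order, count how often activity a is directly
    followed by activity b across all cases."""
    dfg = defaultdict(int)
    for trace in log.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

# toy transactional log (case ids and activities are invented)
log = {
    "case1": ["register", "check", "approve", "archive"],
    "case2": ["register", "check", "reject", "archive"],
    "case3": ["register", "check", "approve", "archive"],
}
dfg = directly_follows(log)
```

The resulting counts already show why large logs matter: with only a few traces, rare but valid orderings are easily missed, and the discovered model will be correspondingly incomplete.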

  1. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The

  2. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    International Nuclear Information System (INIS)

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  3. Significant events in psychotherapy: An update of research findings.

    Science.gov (United States)

    Timulak, Ladislav

    2010-11-01

    Significant events research represents a specific approach to studying client-identified important moments in the therapy process. The current study provides an overview of the significant events research conducted, the methodology used together with findings and implications. PsychInfo database was searched with keywords such as significant events, important events, significant moments, important moments, and counselling or psychotherapy. The references of the selected studies were also searched. This process led to the identification of 41 primary studies that used client-identified significant event(s) as a main or secondary focus of the study. These were consequently reviewed with regard to their methodology and findings. The findings are presented according to type of study conducted. The impacts of helpful events reported by clients are focused on contributions to therapeutic relationship and to in-session outcomes. Hindering events focus on some client disappointment with the therapist or therapy. The group therapy modality highlighted additional helpful impacts (like learning from others). Perspectives on what is significant in therapy differ between clients and therapists. The intensive qualitative studies reviewed confirm that the processes involved in significant events are complex and ambiguous. Studies show that the helpful events may also contain many hindering elements and that specific events are deeply contextually embedded in the preceding events of therapy. Some studies suggest that helpful significant events are therapeutically productive although this may need to be established further. Specific intensive studies show that the clients' perceptions in therapy may differ dramatically from that of the therapist. Furthermore, the relational and emotional aspects of significant moments may be more important for the clients than the cognitive aspects of therapy which are frequently stressed by therapists. 2010 The British Psychological Society.

  4. Processing of complex shapes with single-mode resonant frequency microwave applicators

    International Nuclear Information System (INIS)

    Fellows, L.A.; Delgado, R.; Hawley, M.C.

    1994-01-01

    Microwave processing is an alternative to conventional composite processing techniques. Single-mode microwave applicators efficiently couple microwave energy into the composite. The application of the microwave energy is greatly affected by the geometry of the composite. In the single-mode microwave applicator, two types of modes are available. These modes are best suited to processing flat planar samples or cylindrical samples with geometries that align with the electric fields. Mode-switching alternates between different electromagnetic modes, with intelligent selection of the modes to alleviate undesirable temperature profiles. This method has improved the microwave heating profiles of materials with complex shapes that do not align with either type of electric field. Parts with two different complex geometries were fabricated from a vinyl toluene/vinyl ester resin with a continuous glass fiber reinforcement, by autoclaving and by microwave techniques. The flexural properties of the microwave-processed samples were compared to the flexural properties of autoclaved samples. The trends of the mechanical properties for the complex shapes were consistent with the results of experiments with flat panels. This demonstrated that mode-switching techniques are as applicable to complex shapes as they are to the simpler flat panel geometry.

  5. Relationship between single-event upset immunity and fabrication processes of recent memories

    International Nuclear Information System (INIS)

    Nemoto, N.; Shindou, H.; Kuboyama, S.; Matsuda, S.; Itoh, H.; Okada, S.; Nashiyama, I.

    1999-01-01

    Single-event upset (SEU) immunity of commercial devices was evaluated by irradiation tests using high-energy heavy ions. We show the test results and describe the relationship between the observed SEUs and the device structures/fabrication processes. We evaluated the SEU tolerance of recent commercial memory devices using high-energy heavy ions in order to find the relationship between SEU rate and fabrication process. It was revealed that changes in the process parameters have a large effect on the SEU rate of the devices. (authors)

  6. Dual RNA Processing Roles of Pat1b via Cytoplasmic Lsm1-7 and Nuclear Lsm2-8 Complexes

    Directory of Open Access Journals (Sweden)

    Caroline Vindry

    2017-08-01

    Full Text Available Pat1 RNA-binding proteins, enriched in processing bodies (P bodies), are key players in cytoplasmic 5′ to 3′ mRNA decay, activating decapping of mRNA in complex with the Lsm1-7 heptamer. Using co-immunoprecipitation and immunofluorescence approaches coupled with RNAi, we provide evidence for a nuclear complex of Pat1b with the Lsm2-8 heptamer, which binds to the spliceosomal U6 small nuclear RNA (snRNA). Furthermore, we establish the set of interactions connecting Pat1b/Lsm2-8/U6 snRNA/SART3 and additional U4/U6.U5 tri-small nuclear ribonucleoprotein particle (tri-snRNP) components in Cajal bodies, the site of snRNP biogenesis. RNA sequencing following Pat1b depletion revealed the preferential upregulation of mRNAs normally found in P bodies and enriched in 3′ UTR AU-rich elements. Changes in >180 alternative splicing events were also observed, characterized by skipping of regulated exons with weak donor sites. Our data demonstrate the dual role of a decapping enhancer in pre-mRNA processing as well as in mRNA decay via distinct nuclear and cytoplasmic Lsm complexes.

  7. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  8. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T.; Gibbons. S.J.; Ringdal, F; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  9. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    International Nuclear Information System (INIS)

    Kvaerna, T.; Gibbons. S.J.; Ringdal, F; Harris, D.B.

    2007-01-01

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  10. Application of the RES methodology for identifying features, events and processes (FEPs) for near-field analysis of copper-steel canister

    International Nuclear Information System (INIS)

    Vieno, T.; Hautojaervi, A.; Raiko, H.; Ahonen, L.; Salo, J.P.

    1994-12-01

    Rock Engineering Systems (RES) is an approach for discovering the important characteristics and interactions of a complex problem. Recently, RES has been applied to identify features, events and processes (FEPs) for performance analysis of nuclear waste repositories. The RES methodology was applied here to identify FEPs for the near-field analysis of the copper-steel canister for spent fuel disposal. The aims of the exercise were, firstly, to learn and test the RES methodology and, secondly, to find out how much the results differ when RES is applied by two different groups to the same problem. A similar exercise was previously carried out by an SKB group. A total of 90 potentially significant FEPs were identified. The exercise showed that the RES methodology is a practicable tool for obtaining a comprehensive and transparent picture of a complex problem. The approach is easy to learn and use. It reveals the important characteristics and interactions and organizes them in a format that is easy to understand. (9 refs., 5 figs., 3 tabs.)

  11. Disaster Response Modeling Through Discrete-Event Simulation

    Science.gov (United States)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.
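
    The discrete-event approach described in this record can be illustrated with a minimal event loop; this is a generic, hypothetical sketch in plain Python (not Simio, and not the study's actual model). Events wait in a priority queue ordered by time, and each handler may schedule follow-up events:

```python
import heapq
from itertools import count

_seq = count()  # tie-breaker so simultaneous events never compare handlers

def schedule(queue, time, label, handler):
    heapq.heappush(queue, (time, next(_seq), label, handler))

def run(queue, horizon):
    """Minimal discrete-event loop: pop the earliest event, log it, and
    let its handler schedule follow-up events on the same queue."""
    log = []
    while queue:
        time, _, label, handler = heapq.heappop(queue)
        if time > horizon:
            break
        log.append((time, label))
        for t, lbl, h in handler(time):
            schedule(queue, t, lbl, h)
    return log

# A hypothetical relief-mission flow: each sortie takes off, then lands 2 h later.
no_op = lambda t: []
takeoff = lambda t: [(t + 2.0, "land", no_op)]

q = []
schedule(q, 0.0, "takeoff", takeoff)
schedule(q, 1.0, "takeoff", takeoff)
print(run(q, horizon=10.0))
# [(0.0, 'takeoff'), (1.0, 'takeoff'), (2.0, 'land'), (3.0, 'land')]
```

    Processing events strictly in time order is what lets such a model explore resource placement and capability gaps before changes are made in the real process.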

  12. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  13. Process variant comparison: using event logs to detect differences in behavior and business rules

    NARCIS (Netherlands)

    Bolt, A.; de Leoni, M.; van der Aalst, W.M.P.

    2018-01-01

    This paper addresses the problem of comparing different variants of the same process. We aim to detect relevant differences between processes based on what was recorded in event logs. We use transition systems to model behavior and to highlight differences. Transition systems are annotated with
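
    The idea of modeling behavior with transition systems mined from event logs can be sketched as follows; this is a generic illustration, not the authors' algorithm, and both the state abstraction (last activity seen) and the activity names are hypothetical. Edges are annotated with frequencies so two process variants can be compared edge by edge:

```python
from collections import Counter

def transition_system(log):
    """Build a simple transition system from an event log: states are the
    last activity seen (None marks the start state), and each edge
    (state, activity) is annotated with its observed frequency."""
    edges = Counter()
    for trace in log:
        state = None
        for activity in trace:
            edges[(state, activity)] += 1
            state = activity
    return edges

# Hypothetical event log of one process variant (each trace is one case):
variant_a = [["register", "check", "pay"], ["register", "pay"]]
edges = transition_system(variant_a)
print(edges[("register", "check")])  # 1
print(edges[("register", "pay")])    # 1
```

    Differences between variants then show up as edges whose frequencies (or very existence) differ between the two annotated transition systems.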

  14. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem-solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes, such as coordination, and on emergent states such as collective orientation, cohesion, and trust that dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each of which takes a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as

  15. Preparing for novel versus familiar events: shifts in global and local processing

    NARCIS (Netherlands)

    Förster, J.; Liberman, N.; Shapiro, O.

    2009-01-01

    Six experiments examined whether novelty versus familiarity influences global versus local processing styles. Novelty and familiarity were manipulated by either framing a task as new versus familiar or by asking participants to reflect upon novel versus familiar events prior to the task (i.e.,

  16. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow the specification, distributed execution and verification of pervasive event...

  17. Geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin

    Science.gov (United States)

    Vibhava, F.; Graham, W. D.; Maxwell, R. M.

    2012-12-01

    Streamflow at any given location and time is representative of surface and subsurface contributions from various sources. The ability to fully identify the factors controlling these contributions is key to successfully understanding the transport of contaminants through the system. In this study we developed a fully integrated 3D surface water-groundwater-land surface model, PARFLOW, to evaluate geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin in North Central Florida. In addition to traditional model evaluation criteria, such as comparing field observations to model simulated streamflow and groundwater elevations, we quantitatively evaluated the model's predictions of surface-groundwater interactions over space and time using a suite of binary end-member mixing models that were developed using observed specific conductivity differences among surface and groundwater sources throughout the domain. Analysis of model predictions showed that geologic heterogeneity exerts a strong control on both streamflow generation processes and land atmospheric fluxes in this watershed. In the upper basin, where the karst aquifer is overlain by a thick confining layer, approximately 92% of streamflow is "young" event flow, produced by near stream rainfall. Throughout the upper basin the confining layer produces a persistent high surficial water table which results in high evapotranspiration, low groundwater recharge and thus negligible "inter-event" streamflow. In the lower basin, where the karst aquifer is unconfined, deeper water tables result in less evapotranspiration. Thus, over 80% of the streamflow is "old" subsurface flow produced by diffuse infiltration through the epikarst throughout the lower basin, and all surface contributions to streamflow originate in the upper confined basin. Climatic variability provides a secondary control on surface-subsurface and land-atmosphere fluxes, producing significant seasonal and
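
    The binary end-member mixing models mentioned in this record follow from a conservative-tracer mass balance: if streamflow is a mixture of two sources, the fraction of "new" event water is f = (C_stream − C_pre-event) / (C_event − C_pre-event). A minimal sketch, with conductivity values invented purely for illustration:

```python
def event_water_fraction(c_stream, c_event, c_pre_event):
    """Binary end-member mixing: fraction of streamflow supplied by 'new'
    event water, given the tracer concentration (e.g. specific conductivity)
    of the stream and of the two end-members. Assumes the tracer is
    conservative and the two end-members are distinct."""
    return (c_stream - c_pre_event) / (c_event - c_pre_event)

# Hypothetical specific conductivities in uS/cm:
f_new = event_water_fraction(c_stream=300, c_event=50, c_pre_event=400)
print(round(f_new, 3))  # 0.286
```

    Computing this fraction through time at each gauge is one way a model's predicted surface-groundwater partitioning can be checked against observations.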

  18. Complex processing of rubber waste through energy recovery

    Directory of Open Access Journals (Sweden)

    Roman Smelík

    2015-12-01

    Full Text Available This article deals with applied solutions for the complex processing of rubber waste through energy recovery. It deals specifically with a solution that could maximize the possible use of all rubber waste while creating no additional waste whose disposal would be expensive and dangerous for the environment. The project is economically viable and energy self-sufficient. The outputs of the process could replace natural gas and crude oil products. Another part of the process is the separation of metals, which can be returned to metallurgical secondary production.

  19. Application of complex inoculants in improving the process-ability of grey cast iron for cylinder blocks

    Directory of Open Access Journals (Sweden)

    LIU Wei-ming

    2006-05-01

    Full Text Available The effects of several complex inoculants on the mechanical properties, process-ability and sensibility of grey cast iron used in cylinder blocks were investigated. The experimental results showed that the grey cast iron treated with the 60%FeSi75+40%RE complex inoculant has a tensile strength consistently at about 295 MPa along with good hardness and improved metallurgical quality, while the grey cast iron inoculated with the 20%FeSi75+80%Sr compound inoculant has the best process-ability, the lowest cross-section sensibility and the least microhardness difference. The wear amount of the drill increases correspondingly with the increase of the microhardness difference of the matrix structure, indicating the great effect of the homogeneousness of the matrix structure on the machinability of the grey cast iron.

  20. New U-Pb ages in the Diablillos Intrusive Complex, Southern Puna, Argentina: A long magmatic event in the Paleozoic Arc, SW Gondwana

    International Nuclear Information System (INIS)

    Ortiz, Agustin; Hauser, Natalia

    2015-01-01

    The Puna geological region comprises the Salta, Jujuy and Catamarca provinces, northwestern Argentina. This high-plateau region, 4000 meters above sea level, lies within the Central Argentinian Andes. The Puna basement in the central Andes consists of Proterozoic–Paleozoic metamorphic rocks and granitoids. Diverse authors proposed different models to explain the origin of the basement, in which two orogenic events are recognized: the Pampean (Upper Precambrian–Lower Cambrian) and the Famatinian (Upper Cambrian–Lower Silurian) (e.g. Ramos et al., 1986; Ramos, 1988; Loewy et al., 2004; for opposite points of view see Becchio et al., 1999; Bock et al., 2000; Buttner et al., 2005). Hence, Lucassen et al. (2000) proposed for the Central Andean basement an evolution in a mobile belt, where the Pampean and Famatinian cycles are not distinct events but one single, non-differentiable event from 600 to 400 Ma. The mobile belt culminated in low-P/high-T metamorphism at approximately 525-500 Ma. This was followed by a long-lasting high-thermal-gradient regime in the mid-crust until Silurian times. Becchio et al. (2011) defined the Diablillos Intrusive Complex (CID, by its Spanish name), emplaced in the Inca Viejo Range. This range splits the Salares Ratones-Centenario from the Salar Diablillos (Fig.1). This Complex is located in the Eastern Magmatic Belt, Southern Puna, Argentina. Here we present new zircon U-Pb ages obtained by LA-MC-ICPMS in the Diablillos Intrusive Complex, contributing to understanding the magmatic event in the lower Paleozoic arc, SW Gondwana. (author)

  1. New U-Pb ages in the Diablillos Intrusive Complex, Southern Puna, Argentina: A long magmatic event in the Paleozoic Arc, SW Gondwana

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, Agustin; Hauser, Natalia [Universidade de Brasilia (UnB), DF (Brazil). Instituto de Geociencias. Lab. de Geocronologia; Becchio, Raul; Nieves, Alexis; Suzano, Nestor [Universidad Nacional de Salta (UNSa)-CONICET, Salta (Argentina)

    2015-07-01

    The Puna geological region comprises the Salta, Jujuy and Catamarca provinces, northwestern Argentina. This high-plateau region, 4000 meters above sea level, lies within the Central Argentinian Andes. The Puna basement in the central Andes consists of Proterozoic–Paleozoic metamorphic rocks and granitoids. Diverse authors proposed different models to explain the origin of the basement, in which two orogenic events are recognized: the Pampean (Upper Precambrian–Lower Cambrian) and the Famatinian (Upper Cambrian–Lower Silurian) (e.g. Ramos et al., 1986; Ramos, 1988; Loewy et al., 2004; for opposite points of view see Becchio et al., 1999; Bock et al., 2000; Buttner et al., 2005). Hence, Lucassen et al. (2000) proposed for the Central Andean basement an evolution in a mobile belt, where the Pampean and Famatinian cycles are not distinct events but one single, non-differentiable event from 600 to 400 Ma. The mobile belt culminated in low-P/high-T metamorphism at approximately 525-500 Ma. This was followed by a long-lasting high-thermal-gradient regime in the mid-crust until Silurian times. Becchio et al. (2011) defined the Diablillos Intrusive Complex (CID, by its Spanish name), emplaced in the Inca Viejo Range. This range splits the Salares Ratones-Centenario from the Salar Diablillos (Fig.1). This Complex is located in the Eastern Magmatic Belt, Southern Puna, Argentina. Here we present new zircon U-Pb ages obtained by LA-MC-ICPMS in the Diablillos Intrusive Complex, contributing to understanding the magmatic event in the lower Paleozoic arc, SW Gondwana. (author)

  2. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

    Highlights: • Description of EDICAM's operation principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent event-driven processing, able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and readout of smaller regions of interest can be done in the 1–140 kHz range even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link which ensures very fast communication and data transfer between the PC and the camera, enabling two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. This camera hardware has successfully proven able to monitor the plasma in several fusion devices, for example at ASDEX Upgrade, KSTAR and COMPASS with the first version of the firmware. A new firmware and software package is under development. It allows predefined events to be detected in real time, and therefore the camera is capable of changing its own operation or giving warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) at a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of the tests of the different software elements.

  3. Complex diffusion process for noise reduction

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Barari, A.

    2014-01-01

    equations (PDEs) in image restoration and de-noising prompted many researchers to search for an improvement in the technique. In this paper, a new method is presented for signal de-noising, based on PDEs and Schrodinger equations, named the complex diffusion process (CDP). This method assumes that variations...... for signal de-noising. To evaluate the performance of the proposed method, a number of experiments have been performed using Sinusoid, multi-component and FM signals cluttered with noise. The results indicate that the proposed method outperforms the approaches for signal de-noising known in prior art....
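
    The complex diffusion idea can be sketched in one dimension as an explicit finite-difference iteration with a complex diffusion coefficient; this is a generic illustration of the class of scheme the record describes, not the authors' exact method, and the parameter values are illustrative only:

```python
import cmath

def complex_diffusion(signal, steps=50, dt=0.1, theta=0.05):
    """1D complex diffusion sketch: evolve u_t = c * u_xx with c = exp(i*theta).
    For small theta the real part behaves like Gaussian smoothing, while the
    imaginary part approximates a smoothed second derivative (an edge
    indicator) -- the property that makes complex diffusion useful for
    de-noising. Explicit scheme; stable here since dt <= 0.5."""
    c = cmath.exp(1j * theta)
    u = [complex(x) for x in signal]
    n = len(u)
    for _ in range(steps):
        # Discrete Laplacian; endpoints are held fixed (zero Laplacian).
        lap = [0j] * n
        for i in range(1, n - 1):
            lap[i] = u[i - 1] - 2 * u[i] + u[i + 1]
        u = [u[i] + dt * c * lap[i] for i in range(n)]
    return [x.real for x in u]

noisy = [0.0, 1.0] * 10          # a deliberately jagged test signal
smooth = complex_diffusion(noisy)
```

    After iteration the total variation of the signal drops sharply, i.e. the high-frequency noise has been diffused away while the endpoints are preserved.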

  4. Event recognition using signal spectrograms in long pulse experiments

    International Nuclear Information System (INIS)

    Gonzalez, J.; Ruiz, M.; Barrera, E.; Arcas, G.; Lopez, J. M.; Vega, J.

    2010-01-01

    As discharge duration increases, real-time complex analysis of the signal becomes more important. In this context, data acquisition and processing systems must provide models for designing experiments which use event-oriented plasma control. One example of advanced data analysis is signal classification. The off-line statistical analysis of a large number of discharges provides information to develop algorithms for the determination of plasma parameters from measurements of magnetohydrodynamic waves, for example, to detect density fluctuations induced by the Alfven cascades using morphological patterns. The need to apply different algorithms to the signals, and to select subsequent processing algorithms based on previous results, necessitates the use of an event-based experiment. The Intelligent Test and Measurement System platform is an example of an architecture designed to implement distributed data acquisition and real-time processing systems. The processing algorithm sequence is modeled using an event-based paradigm. The adaptive capacity of this model is based on the logic defined by the use of state machines in SCXML. The Intelligent Test and Measurement System platform mixes a local multiprocessing model with a distributed deployment of services based on Jini.
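
    The event-based sequencing of processing algorithms that this record attributes to SCXML state machines can be illustrated with a minimal table-driven state machine; the states and events below are hypothetical, not those of the Intelligent Test and Measurement System platform:

```python
def run_chain(transitions, start, events):
    """Drive a processing-chain state machine: each (state, event) pair maps
    to the next state; events with no matching transition leave the state
    unchanged. Returns the full state history."""
    state = start
    history = [state]
    for ev in events:
        state = transitions.get((state, ev), state)
        history.append(state)
    return history

# Hypothetical chain: acquire -> detect -> classify, driven by data events.
transitions = {
    ("acquiring", "data_ready"): "detecting",
    ("detecting", "pattern_found"): "classifying",
    ("detecting", "no_pattern"): "acquiring",
    ("classifying", "done"): "acquiring",
}
print(run_chain(transitions, "acquiring",
                ["data_ready", "pattern_found", "done"]))
# ['acquiring', 'detecting', 'classifying', 'acquiring']
```

    Encoding the sequence as data rather than control flow is what makes such a chain adaptive: the next algorithm to run is chosen by the events produced by the previous one.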

  5. The underlying event in hard scattering processes

    International Nuclear Information System (INIS)

    Field, R.

    2002-01-01

    The authors study the behavior of the underlying event in hard scattering proton-antiproton collisions at 1.8 TeV and compare with the QCD Monte-Carlo models. The underlying event is everything except the two outgoing hard scattered jets and receives contributions from the beam-beam remnants plus initial- and final-state radiation. The data indicate that neither ISAJET nor HERWIG produces enough charged particles (with p T > 0.5 GeV/c) from the beam-beam remnant component and that ISAJET produces too many charged particles from initial-state radiation. PYTHIA, which uses multiple parton scattering to enhance the underlying event, does the best job of describing the data.

  6. An integrative process model of leadership: examining loci, mechanisms, and event cycles.

    Science.gov (United States)

    Eberly, Marion B; Johnson, Michael D; Hernandez, Morela; Avolio, Bruce J

    2013-09-01

    Utilizing the locus (source) and mechanism (transmission) of leadership framework (Hernandez, Eberly, Avolio, & Johnson, 2011), we propose and examine the application of an integrative process model of leadership to help determine the psychological interactive processes that constitute leadership. In particular, we identify the various dynamics involved in generating leadership processes by modeling how the loci and mechanisms interact through a series of leadership event cycles. We discuss the major implications of this model for advancing an integrative understanding of what constitutes leadership and its current and future impact on the field of psychological theory, research, and practice. © 2013 APA, all rights reserved.

  7. WORK ALLOCATION IN COMPLEX PRODUCTION PROCESSES: A METHODOLOGY FOR DECISION SUPPORT

    OpenAIRE

    de Mello, Adriana Marotti; School of Economics, Business and Accounting at the University of São Paulo; Marx, Roberto; Polytechnic School, University of São Paulo; Zilbovicius, Mauro; Polytechnic School – University of São Paulo

    2013-01-01

    This article presents the development of a Methodology of Decision Support for Work Allocation in complex production processes. It is known that this decision is frequently taken empirically and that the methodologies available to support it are few and restricted in terms of their conceptual basis. The study of Times and Motion is one of these methodologies, but its applicability is restricted in cases of more complex production processes. The method presented here was developed as a result of...

  8. Reconstructing events, from electronic signals to tracks

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Reconstructing tracks in the events taken by LHC experiments is one of the most challenging and computationally expensive software tasks to be carried out in the data processing chain. A typical LHC event is composed of multiple p-p interactions, each leaving signals from many charged particles in the detector, thus building up an environment of unprecedented complexity. In the lecture I will give an overview of event reconstruction in a typical High Energy Physics experiment. After an introduction to particle tracking detectors I will discuss the concepts and techniques required to master the tracking challenge at the LHC. I will explain how track propagation in a realistic detector works, and present different techniques for track fitting and track finding. At the end we will see how all of these techniques play together in the ATLAS track reconstruction application.
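
    In its very simplest form (a straight-line track, no magnetic field or material effects), the track fitting mentioned in this record reduces to a least-squares fit of hit positions across detector layers; the sketch below is a didactic illustration, not the ATLAS reconstruction, and the hit values are invented:

```python
def fit_straight_track(hits):
    """Least-squares straight-line track fit: hits are (z, x) measurements
    on successive detector layers; returns (slope, intercept) of the model
    x = slope * z + intercept. This is the simplest possible track model,
    ignoring magnetic bending and multiple scattering."""
    n = len(hits)
    sz = sum(z for z, _ in hits)
    sx = sum(x for _, x in hits)
    szz = sum(z * z for z, _ in hits)
    szx = sum(z * x for z, x in hits)
    slope = (n * szx - sz * sx) / (n * szz - sz * sz)
    intercept = (sx - slope * sz) / n
    return slope, intercept

# Hits on five layers from a track near x = 0.5*z + 1, with small residuals:
hits = [(1, 1.51), (2, 2.02), (3, 2.49), (4, 3.01), (5, 3.47)]
slope, intercept = fit_straight_track(hits)
```

    Real track fits generalize this idea: a Kalman filter propagates a richer track state (curvature, direction, position) layer by layer, updating it with each measured hit.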

  9. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
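
    The Bayesian combination of per-station conditional probabilities described in this record can be sketched as a naive-Bayes fusion in log-odds space; this is a generic illustration of the formulation, not the ProbDet algorithm, and the prior and station probabilities are invented. Working in log-odds makes the role of non-detecting stations explicit: a station reporting a probability below the prior pulls the network-level probability down:

```python
import math

def combine_station_probabilities(p_stations, prior=0.01):
    """Fuse per-station conditional event probabilities under a
    conditional-independence (naive-Bayes) assumption. Each station
    contributes its likelihood ratio relative to the prior; the sum of
    log-odds is mapped back to a probability with the logistic function."""
    prior_log_odds = math.log(prior / (1 - prior))
    log_odds = prior_log_odds
    for p in p_stations:
        p = min(max(p, 1e-9), 1 - 1e-9)  # guard against exact 0 or 1
        log_odds += math.log(p / (1 - p)) - prior_log_odds
    return 1 / (1 + math.exp(-log_odds))

# Three stations see a likely signal, one sees nothing unusual:
print(combine_station_probabilities([0.30, 0.40, 0.25, 0.01]))
```

    Three moderately confident stations dominate the single quiet one, so the combined probability ends up far above any individual station's value; a station reporting exactly the prior contributes nothing, which is the formalized treatment of non-detecting stations.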

  10. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem-solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes, such as coordination, and on emergent states such as collective orientation, cohesion, and trust that dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each of which takes a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as

  11. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Directory of Open Access Journals (Sweden)

    Vera Hagemann

    2017-09-01

    Full Text Available Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on emergent states, such as collective orientation, cohesion, and trust, that influence action processes like coordination and dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios, each taking a maximum of 15 min, had been developed. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships between trust and action processes and team performance exist. Likewise, no relationships were found for cohesion. Only collective orientation of team members positively influences team performance in complex environments, mediated by action processes

  12. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  13. Event-driven simulation of neural population synchronization facilitated by electrical coupling.

    Science.gov (United States)

    Carrillo, Richard R; Ros, Eduardo; Barbour, Boris; Boucheny, Christian; Coenen, Olivier

    2007-02-01

    Most neural communication and processing tasks are driven by spikes. This has enabled the application of event-driven simulation schemes. However, the simulation of spiking neural networks based on complex models that cannot be simplified to analytical expressions (and therefore require numerical calculation) is very time consuming. Here we briefly describe an event-driven simulation scheme that uses pre-calculated table-based neuron characterizations to avoid numerical calculations during a network simulation, allowing the simulation of large-scale neural systems. More concretely, we explain how electrical coupling can be simulated efficiently within this computation scheme, reproducing synchronization processes observed in detailed simulations of neural populations.
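The table-based event-driven idea described above can be sketched in a few lines. The following toy model is only an illustration of the scheme, not the paper's simulator: the membrane constant, weight, and threshold are invented, spikes are drained from a priority queue, and run-time exponentials are replaced by a pre-calculated decay lookup table.

```python
import heapq
import math

TAU = 20.0   # membrane time constant (ms); illustrative value
DT = 0.1     # table resolution (ms)
# Pre-calculated characterization: exp(-t/tau) sampled up to 1000 ms.
DECAY = [math.exp(-i * DT / TAU) for i in range(10000)]

def decay_factor(elapsed_ms):
    """Look up exp(-t/tau) instead of computing it during the simulation."""
    idx = min(int(elapsed_ms / DT), len(DECAY) - 1)
    return DECAY[idx]

def simulate(spikes, weight=1.5, threshold=2.0):
    """Event-driven update of one neuron: state changes only at spike events."""
    queue = list(spikes)          # (time_ms, source) pairs
    heapq.heapify(queue)
    v, last_t, out = 0.0, 0.0, []
    while queue:
        t, _src = heapq.heappop(queue)
        v = v * decay_factor(t - last_t) + weight   # decay since last event, add input
        last_t = t
        if v >= threshold:                          # fire and reset
            out.append(t)
            v = 0.0
    return out

print(simulate([(1.0, 0), (2.0, 1), (50.0, 0), (51.0, 1)]))
```

Two spikes close in time summate and trigger an output spike, while the long gap in between decays the membrane back toward rest without any intermediate time steps.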

  14. Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.

    2006-01-01

    Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.

  15. Integrating technology into complex intervention trial processes: a case study.

    Science.gov (United States)

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. 
A combination of commercially available hardware and a bespoke online database

  16. Geologic factors in the isolation of nuclear waste: evaluation of long-term geomorphic processes and events

    International Nuclear Information System (INIS)

    Mara, S.J.

    1979-01-01

    In this report the rate, duration, and magnitude of changes from geomorphic processes and events in the Southwest and the Gulf Coast over the next million years are projected. The projections were made by reviewing the pertinent literature; evaluating the geomorphic history of each region, especially during the Quaternary Period; identifying the geomorphic processes and events likely to be significant in the two regions of interest; and estimating the average and worst-case conditions expected over the next million years.

  17. Impact of background noise and sentence complexity on cognitive processing demands

    DEFF Research Database (Denmark)

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions imposes cognitive processing demands. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results showed...

  18. Impact of background noise and sentence complexity on cognitive processing effort

    DEFF Research Database (Denmark)

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions imposes cognitive processing demands. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results...

  19. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  20. Capturing connectivity and causality in complex industrial processes

    CERN Document Server

    Yang, Fan; Shah, Sirish L; Chen, Tongwen

    2014-01-01

    This brief reviews concepts of inter-relationship in modern industrial processes and in biological and social systems. Specifically, ideas of connectivity and causality within and between elements of a complex system are treated; these ideas are of great importance in analysing and influencing mechanisms, structural properties and their dynamic behaviour, especially for fault diagnosis and hazard analysis. Because fault detection and isolation for industrial processes are concerned with root causes and fault propagation, the brief shows that process connectivity and causality information can be captured in two ways: from process knowledge, where structural modeling based on first-principles models can be merged with adjacency/reachability matrices or topology models obtained from process flow-sheets described in standard formats; and from process data, via cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian ne...
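As a minimal illustration of the data-driven route, the sketch below (not one of the brief's actual algorithms) infers the direction of influence between two synthetic signals from their lagged cross-correlation: if x drives y with a delay, correlating x against a future-shifted y is strong, while the reverse direction stays near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=500)          # driving signal (white noise)
lag = 3                           # x influences y three samples later
y = np.roll(x, lag) + 0.1 * rng.normal(size=500)
y[:lag] = 0.0                     # discard samples wrapped around by roll

def xcorr_at(a, b, k):
    """Correlation of a[t] with b[t+k], for a positive lag k."""
    return np.corrcoef(a[:-k], b[k:])[0, 1]

forward = xcorr_at(x, y, lag)     # x[t] vs y[t+lag]: should be strong
backward = xcorr_at(y, x, lag)    # y[t] vs x[t+lag]: should be near zero
print(f"x->y: {forward:.2f}, y->x: {backward:.2f}")
```

Real methods such as Granger causality fit predictive models rather than raw correlations, but the asymmetry exploited is the same.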

  1. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution and to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.
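The general PC-prior construction the article builds on can be summarised as follows. This is the generic form only; the paper's AR(p) treatment adds the sequential parameterisation via partial autocorrelations.

```latex
% A flexible model f is penalised by its distance from the base model f_0,
% measured through the Kullback-Leibler divergence:
d(\xi) = \sqrt{2\,\mathrm{KLD}\bigl(f(\cdot\mid\xi)\,\big\|\,f_0\bigr)},
\qquad
\mathrm{KLD}\bigl(f\,\big\|\,f_0\bigr)
  = \int f(x\mid\xi)\,\log\frac{f(x\mid\xi)}{f_0(x)}\,\mathrm{d}x .
% An exponential prior on the distance penalises complexity at a constant
% rate lambda and transforms back to a prior on the model parameter xi:
\pi(\xi) = \lambda\, e^{-\lambda\, d(\xi)}
           \left|\frac{\partial d(\xi)}{\partial \xi}\right| .
```

The base model (d = 0) thus carries maximal prior mass decay, and lambda controls how strongly deviations from it are shrunk.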

  2. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution and to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  3. Identification of Hazardous Events for Drinking Water Production Process Using Managed Aquifer Recharge in the Nakdong River Delta, Korea

    International Nuclear Information System (INIS)

    Sang-Il, L.; Ji, H.W.

    2016-01-01

    Various hazardous events can cause chemical, microbial or physical hazards in a water supply system. The World Health Organization (WHO) and some countries have introduced hazardous event analysis for identifying potential events which may be harmful to the safety of drinking water. This study extends the application of hazardous event analysis to drinking water production using managed aquifer recharge (MAR). MAR is a way of using an aquifer to secure water resources by storing freshwater for future use and pumping it whenever necessary. The entire drinking water production process, from the catchment area to the consumer, is subjected to the analysis. Hazardous event analysis incorporates site-specific data as well as common issues occurring in the process of drinking water production. The hazardous events are classified based on chemical, microbial or physical characteristics. Likelihood and severity values are assigned, and a quantitative risk is obtained by multiplying them. The study site is located in a coastal area in the delta of the Nakdong River, South Korea. The site has suffered from salt water intrusion and surface water pollution from upstream. Nine major hazardous events were identified out of a total of 114 events from 10 drinking water production processes. These major hazardous events will provide useful information on what needs to be done to secure the quality of water produced by a new water supply method. (author)
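The screening logic described above, risk as the product of likelihood and severity followed by ranking against a threshold, can be sketched in a few lines. The event names, scores, and threshold below are invented for illustration, not taken from the study.

```python
# Hypothetical hazardous-event register: likelihood and severity on 1-5 scales.
events = [
    {"name": "salt water intrusion at intake", "type": "chemical",
     "likelihood": 4, "severity": 5},
    {"name": "pathogen breakthrough after rainfall", "type": "microbial",
     "likelihood": 3, "severity": 4},
    {"name": "pump failure", "type": "physical",
     "likelihood": 2, "severity": 2},
]

# Quantitative risk = likelihood x severity.
for e in events:
    e["risk"] = e["likelihood"] * e["severity"]

# Keep only "major" events above a screening threshold, highest risk first.
THRESHOLD = 10
major = sorted((e for e in events if e["risk"] >= THRESHOLD),
               key=lambda e: e["risk"], reverse=True)
for e in major:
    print(e["name"], e["risk"])
```

The same pass over a full register (114 events in the study) yields the shortlist of major hazardous events to address.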

  4. ALADDIN: a neural model for event classification in dynamic processes

    International Nuclear Information System (INIS)

    Roverso, Davide

    1998-02-01

    ALADDIN is a prototype system which combines fuzzy clustering techniques and artificial neural network (ANN) models in a novel approach to the problem of classifying events in dynamic processes. The main motivation for the development of such a system derived originally from the problem of finding new principled methods to perform alarm structuring/suppression in a nuclear power plant (NPP) alarm system. One such method consists in basing the alarm structuring/suppression on a fast recognition of the event generating the alarms, so that a subset of alarms sufficient to efficiently handle the current fault can be selected to be presented to the operator, minimizing in this way the operator's workload in a potentially stressful situation. The scope of application of a system like ALADDIN goes, however, beyond alarm handling to include diagnostic tasks in general. The eventual application of the system to domains other than NPPs was also taken into special consideration during the design phase. In this document we report on the first phase of the ALADDIN project, which consisted mainly of a comparative study of a series of ANN-based approaches to event classification, and on the proposal of a first system prototype which is to undergo further tests and, eventually, be integrated into existing alarm, diagnosis, and accident management systems such as CASH, IDS, and CAMS. (author)
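The recognition task ALADDIN addresses can be caricatured as prototype matching. The sketch below is emphatically not ALADDIN's fuzzy-clustering/ANN pipeline; it only illustrates the idea of mapping an observed alarm pattern to the nearest known event class, after which a relevant alarm subset could be selected for the operator. Event names and patterns are invented.

```python
import math

# Hypothetical event prototypes: which of four alarms each event typically raises.
PROTOTYPES = {
    "loss_of_feedwater": [1, 1, 0, 0],
    "steam_leak":        [0, 1, 1, 0],
    "pump_trip":         [0, 0, 1, 1],
}

def classify(alarms):
    """Return the event whose prototype is nearest (Euclidean) to the pattern."""
    return min(PROTOTYPES, key=lambda name: math.dist(alarms, PROTOTYPES[name]))

# A noisy observation of the first pattern is still recognized.
print(classify([1, 1, 0.2, 0]))
```

A real classifier would learn these prototypes (e.g. via fuzzy clustering) and output graded memberships rather than a hard nearest match.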

  5. Short template switch events explain mutation clusters in the human genome.

    Science.gov (United States)

    Löytynoja, Ari; Goldman, Nick

    2017-06-01

    Resequencing efforts are uncovering the extent of genetic variation in humans and provide data to study the evolutionary processes shaping our genome. One recurring puzzle in both intra- and inter-species studies is the high frequency of complex mutations comprising multiple nearby base substitutions or insertion-deletions. We devised a generalized mutation model of template switching during replication that extends existing models of genome rearrangement and used this to study the role of template switch events in the origin of short mutation clusters. Applied to the human genome, our model detects thousands of template switch events during the evolution of human and chimp from their common ancestor and hundreds of events between two independently sequenced human genomes. Although many of these are consistent with a template switch mechanism previously proposed for bacteria, our model also identifies new types of mutations that create short inversions, some flanked by paired inverted repeats. The local template switch process can create numerous complex mutation patterns, including hairpin loop structures, and explains multinucleotide mutations and compensatory substitutions without invoking positive selection, speculative mechanisms, or implausible coincidence. Clustered sequence differences are challenging for current mapping and variant calling methods, and we show that many erroneous variant annotations exist in human reference data. Local template switch events may have been neglected as an explanation for complex mutations because of biases in commonly used analyses. Incorporation of our model into reference-based analysis pipelines and comparisons of de novo assembled genomes will lead to improved understanding of genome variation and evolution. © 2017 Löytynoja and Goldman; Published by Cold Spring Harbor Laboratory Press.

  6. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  7. Cognitive processing in non-communicative patients: what can event-related potentials tell us?

    Directory of Open Access Journals (Sweden)

    Zulay Rosario Lugo

    2016-11-01

    Full Text Available Event-related potentials (ERPs) have been proposed to improve the differential diagnosis of non-responsive patients. We investigated the potential of the P300 as a reliable marker of conscious processing in patients with locked-in syndrome (LIS). Eleven chronic LIS patients and ten healthy subjects (HS) listened to a complex-tone auditory oddball paradigm, first in a passive condition (listening to the sounds) and then in an active condition (counting the deviant tones). Seven out of nine HS displayed a P300 waveform in the passive condition and all in the active condition. HS showed statistically significant changes in peak and area amplitude between conditions. Three out of seven LIS patients showed the P300 waveform in the passive condition and 5 of 7 in the active condition. No changes in peak amplitude, and only a significant difference at one electrode in area amplitude, were observed in this group between conditions. We conclude that, in spite of keeping full consciousness and intact or nearly intact cortical functions, LIS patients present less reliable results than HS when tested with ERPs, specifically in the passive condition. We thus strongly recommend applying ERP paradigms in an active condition when evaluating consciousness in non-responsive patients.
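The core ERP logic behind such studies, averaging many stimulus-locked epochs so that the time-locked response survives while background EEG cancels out, can be sketched with synthetic data. All signals and parameters below are invented for illustration; they are not from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                                  # sampling rate (Hz)
t = np.arange(0, 0.8, 1 / fs)             # one 0-800 ms epoch
# Synthetic P300-like component: a 5 uV bump centred near 300 ms.
p300 = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

# 200 epochs of signal buried in 10 uV background "EEG" noise.
epochs = np.array([p300 + 10e-6 * rng.normal(size=t.size) for _ in range(200)])
erp = epochs.mean(axis=0)                 # grand-average ERP

# Peak amplitude and latency within a 250-500 ms search window.
win = (t >= 0.25) & (t <= 0.5)
peak_amp = erp[win].max()
peak_lat = t[win][erp[win].argmax()]
print(f"peak {peak_amp * 1e6:.1f} uV at {peak_lat * 1000:.0f} ms")
```

A single epoch is dominated by noise (SNR 0.5 here), but averaging 200 epochs reduces the noise by a factor of about 14, which is why the bump becomes measurable.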

  8. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    Science.gov (United States)

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible enough to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry- and mesh-handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model-building interface and written in a low-level language for computational efficiency. The connection to the geometry-handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines.
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at
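At the heart of the Reaction-Diffusion Master Equation formalism is a stochastic simulation of reaction events per mesh voxel. The following well-mixed (single-voxel) Gillespie sketch shows the basic event loop; the species and rate constants are invented for illustration and are not taken from URDME.

```python
import math
import random

def ssa(a_init, b_init, k_make=1.0, k_decay=0.1, t_end=50.0, seed=7):
    """Gillespie SSA for two reactions: A -> A + B (production), B -> 0 (decay)."""
    rng = random.Random(seed)
    t, a, b = 0.0, a_init, b_init
    while t < t_end:
        r1 = k_make * a          # propensity of producing a B
        r2 = k_decay * b         # propensity of a B decaying
        total = r1 + r2
        if total == 0:
            break
        t += -math.log(rng.random()) / total   # exponential waiting time
        if rng.random() * total < r1:
            b += 1
        else:
            b -= 1
    return b

# Stationary mean of B is k_make * A / k_decay = 1.0 * 5 / 0.1 = 50.
print(ssa(5, 0))
```

URDME extends this kind of kernel across an unstructured mesh, adding diffusion jumps between voxels as additional events with propensities derived from the mesh geometry.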

  9. BPMN as a Communication Language for the Process- and Event-Oriented Perspectives in Fact-Oriented Conceptual Models

    Science.gov (United States)

    Bollen, Peter

    In this paper we will show how the OMG specification of BPMN (Business Process Modeling Notation) can be used to model the process- and event-oriented perspectives of an application subject area. We will illustrate how the fact-oriented conceptual models for the information-, process- and event perspectives can be used in a 'bottom-up' approach for creating a BPMN model in combination with other approaches, e.g. the use of a textual description. We will use the common doctor's office example as a running example in this article.

  10. Electrospray ionization mass spectrometry for the hydrolysis complexes of cisplatin: implications for the hydrolysis process of platinum complexes.

    Science.gov (United States)

    Feifan, Xie; Pieter, Colin; Jan, Van Bocxlaer

    2017-07-01

    Non-enzyme-dependent hydrolysis of the drug cisplatin is important for its mode of action and toxicity. However, up until today, the hydrolysis process of cisplatin is still not completely understood. In the present study, the hydrolysis of cisplatin in an aqueous solution was systematically investigated by using electrospray ionization mass spectrometry coupled to liquid chromatography. A variety of previously unreported hydrolysis complexes corresponding to monomeric, dimeric and trimeric species were detected and identified. The characteristics of the Pt-containing complexes were investigated by using collision-induced dissociation (CID). The hydrolysis complexes demonstrate distinctive and correlative CID characteristics, which provides tools for an informative identification. The most frequently observed dissociation mechanism was sequential loss of NH₃, H₂O and HCl. Loss of the Pt atom was observed as the final step during the CID process. The formation mechanisms of the observed complexes were explored and experimentally examined. The strongly bound dimeric species, which existed in solution, are assumed to be formed from the clustering of the parent compound and its monohydrated or dihydrated complexes. The role of the electrospray process in the formation of some of the observed ions was also evaluated, and the electrospray ionization-related cold clusters were identified. The previously reported hydrolysis equilibria were tested and subsequently refined via a hydrolysis study, resulting in a renewed mechanistic equilibrium system of cisplatin as proposed from our results. Copyright © 2017 John Wiley & Sons, Ltd.

  11. On-the-fly auditing of business processes

    NARCIS (Netherlands)

    Hee, van K.M.; Hidders, A.J.H.; Houben, G.J.P.M.; Paredaens, J.; Thiran, P.A.P.; Jensen, K.; Donatelli, S.; Koutny, M.

    2010-01-01

    Information systems supporting business processes are usually very complex. If we have to ensure that certain business rules are enforced in a business process, it is often easier to design a separate system, called a monitor, that collects the events of the business processes and verifies whether
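A monitor in the sense described above can be sketched as a small function that consumes the event stream of a business process and checks one business rule. The rule and event names below are invented for illustration: "every order that is shipped must have been paid first".

```python
def monitor(events):
    """Collect events (kind, order_id) and return ids that violate the rule."""
    paid, violations = set(), []
    for kind, order_id in events:
        if kind == "paid":
            paid.add(order_id)
        elif kind == "shipped" and order_id not in paid:
            violations.append(order_id)   # shipped before payment
    return violations

stream = [("paid", 1), ("shipped", 1), ("shipped", 2), ("paid", 2)]
print(monitor(stream))
```

Keeping the monitor separate from the information system, as the abstract suggests, means the rule can be verified on the fly without touching the (complex) system that produces the events.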

  12. Complex plasmochemical processing of solid fuel

    Directory of Open Access Journals (Sweden)

    Vladimir Messerle

    2012-12-01

    Full Text Available Technology for complex plasmochemical processing of solid fuel, exemplified by Ecibastuz bituminous and Turgay brown coals, is presented. Thermodynamic and experimental studies of the technology were performed. Use of this technology allows the production of synthesis gas from the organic mass of coal, as well as valuable components (technical silicon, ferrosilicon, aluminum and silicon carbide) and microelements of rare metals (uranium, molybdenum, vanadium, etc.) from the mineral mass of coal. The produced high-calorific synthesis gas can be used for methanol synthesis, as a high-grade reducing gas instead of coke, and as an energy gas in thermal power plants.

  13. Features, events and processes evaluation catalogue for argillaceous media

    International Nuclear Information System (INIS)

    Mazurek, M.; Pearson, F.J.; Volckaert, G.; Bock, H.

    2003-01-01

    The OECD/NEA Working Group on the Characterisation, the Understanding and the Performance of Argillaceous Rocks as Repository Host Formations for the disposal of radioactive waste (known as the 'Clay Club') launched a project called FEPCAT (Features, Events and Processes Catalogue for argillaceous media) in late 1998. The present report provides the results of work performed by an expert group to develop a FEPs database related to argillaceous formations, whether soft or indurated. It describes the methodology used for the work performed, provides a list of relevant FEPs and summarises the knowledge on each of them. It also provides general conclusions and identifies priorities for future work. (authors)

  14. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for the creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...
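A minimal reading of this characterization of events (short-duration processes with participants, consequences, and properties) can be expressed as a data structure. The field names and the example are our own illustration, not the paper's formal model.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """An event: a short-duration process with participants,
    consequences, and properties."""
    name: str
    participants: list
    properties: dict = field(default_factory=dict)
    consequences: list = field(default_factory=list)

# Hypothetical example instance.
checkout = Event(
    name="check-out",
    participants=["guest", "receptionist"],
    properties={"duration_min": 5},
    consequences=["room becomes vacant", "invoice issued"],
)
print(checkout.name, checkout.participants)
```

The consequences list is the hook toward dynamic process models, while the properties dict maps naturally onto static information structures, mirroring the integration the paper argues for.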

  15. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    Science.gov (United States)

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  16. Features, Events and Processes in UZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    P. Persoff

    2005-08-04

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA.

  17. Features, Events and Processes in UZ Flow and Transport

    International Nuclear Information System (INIS)

    P. Persoff

    2005-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA

  18. Features, Events, and Processes in UZ Flow and Transport

    International Nuclear Information System (INIS)

    Persoff, P.

    2004-01-01

    The purpose of this report is to evaluate and document the inclusion or exclusion of the unsaturated zone (UZ) features, events, and processes (FEPs) with respect to modeling that supports the total system performance assessment (TSPA) for license application (LA) for a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either 'Included' or 'Excluded', is given for each FEP, along with the technical basis for the screening decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) in 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The FEPs deal with UZ flow and radionuclide transport, including climate, surface water infiltration, percolation, drift seepage, and thermally coupled processes. This analysis summarizes the implementation of each FEP in TSPA-LA (that is, how the FEP is included) and also provides the technical basis for exclusion from TSPA-LA (that is, why the FEP is excluded). This report supports TSPA-LA

  19. The preparation of reports of a significant event at a uranium processing or uranium handling facility

    International Nuclear Information System (INIS)

    1988-08-01

    Licenses to operate uranium processing or uranium handling facilities require that certain events be reported to the Atomic Energy Control Board (AECB) and to other regulatory authorities. Reports of a significant event describe unusual events which had or could have had a significant impact on the safety of facility operations, the worker, the public or on the environment. The purpose of this guide is to suggest an acceptable method of reporting a significant event to the AECB and to describe the information that should be included. The reports of a significant event are made available to the public in accordance with the provisions of the Access to Information Act and the AECB's policy on public access to licensing information

  20. Mesoscale Convective Complexes (MCCs) over the Indonesian Maritime Continent during the ENSO events

    Science.gov (United States)

    Trismidianto; Satyawardhana, H.

    2018-05-01

    This study analyzed mesoscale convective complexes (MCCs) over the Indonesian Maritime Continent (IMC) during El Niño/Southern Oscillation (ENSO) events for the 15-year period from 2001 to 2015. The MCCs were identified from infrared imagery obtained from the Himawari generation of satellites. The study found that the frequency of MCC occurrence during El Niño and La Niña conditions was higher than during neutral conditions in DJF. MCC occurrence during DJF peaked in February under La Niña and neutral conditions, and in January under El Niño; ENSO thus strongly affects MCC occurrence during the DJF season. MCC occurrences were also accompanied by increased rainfall intensity at the MCC locations for all ENSO phases. During the JJA season, MCCs were found over the Indian Ocean under neutral, El Niño and La Niña conditions. MCCs occurring during JJA under El Niño and neutral conditions lasted, on average, much longer than during DJF; in contrast, MCCs occurring under La Niña conditions dissipated more rapidly during JJA than during DJF. This indicates that the influence of MCCs under La Niña is stronger during the DJF season than during the JJA season.

  1. Can complex cellular processes be governed by simple linear rules?

    Science.gov (United States)

    Selvarajoo, Kumar; Tomita, Masaru; Tsuchiya, Masa

    2009-02-01

    Complex living systems have shown remarkably well-orchestrated, self-organized, robust, and stable behavior under a wide range of perturbations. However, despite the recent generation of high-throughput experimental datasets, basic cellular processes such as division, differentiation, and apoptosis still remain elusive. One of the key reasons is the lack of understanding of the governing principles of complex living systems. Here, we have reviewed the success of perturbation-response approaches, where without the requirement of detailed in vivo physiological parameters, the analysis of temporal concentration or activation response unravels biological network features such as causal relationships of reactant species, regulatory motifs, etc. Our review shows that simple linear rules govern the response behavior of biological networks in an ensemble of cells. It is daunting to know why such simplicity could hold in a complex heterogeneous environment. Provided physical reasons can be explained for these phenomena, major advancement in the understanding of basic cellular processes could be achieved.
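    The linear perturbation-response rule reviewed above can be illustrated with a minimal first-order relaxation model; the rate constant, step size and function names here are illustrative assumptions, not taken from the review:

```python
def linear_response(stimulus, rate=0.5, dt=0.1, steps=100):
    """Simulate a first-order linear response dx/dt = s - k*x with x(0) = 0,
    returning the time course of x under a constant stimulus s."""
    x = 0.0
    trace = []
    for _ in range(steps):
        x += dt * (stimulus - rate * x)  # forward-Euler step of dx/dt = s - k*x
        trace.append(x)
    return trace

# Linearity: doubling the perturbation doubles the response at every time point.
low = linear_response(1.0)
high = linear_response(2.0)
print(abs(high[-1] - 2 * low[-1]) < 1e-9)  # True
```

    Because the update rule is linear in the stimulus, the response scales proportionally — the kind of simple linear input-output relationship the review reports for ensemble-averaged cellular responses.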

  2. The Effects of Syntactic Complexity on Processing Sentences in Noise

    Science.gov (United States)

    Carroll, Rebecca; Ruigendijk, Esther

    2013-01-01

    This paper discusses the influence of stationary (non-fluctuating) noise on processing and understanding of sentences, which vary in their syntactic complexity (with the factors canonicity, embedding, ambiguity). It presents data from two RT-studies with 44 participants testing processing of German sentences in silence and in noise. Results show a…

  3. Working memory processes show different degrees of lateralization : Evidence from event-related potentials

    NARCIS (Netherlands)

    Talsma, D; Wijers, A.A.; Klaver, P; Mulder, G.

    This study aimed to identify different processes in working memory, using event-related potentials (ERPs) and response times. Abstract polygons were presented for memorization and subsequent recall in a delayed matching-to-sample paradigm. Two polygons were presented bilaterally for memorization and

  4. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design-processes traditional approaches to process design may no longer suffice. The design literature gives quite some design process models. As

  5. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    Science.gov (United States)

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

    Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in the light of the overall goal of MTI of recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Using the results to optimize single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013) which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm with ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps in MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding of eight events into the female RP and seven in the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP. Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity

  6. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  7. On-the-fly auditing of business processes

    NARCIS (Netherlands)

    Hee, van K.M.; Hidders, A.J.H.; Houben, G.J.P.M.; Paredaens, J.; Thiran, P.A.P.

    2009-01-01

    Information systems supporting business process are mostly very complex. If we have to ensure that certain business rules are enforced in a business process, it is often easier to design a separate system, called a monitor, that collects the events of the business processes and verifies whether the

  8. A tool for aligning event logs and prescriptive process models through automated planning

    OpenAIRE

    De Leoni, M.; Lanciano, G.; Marrella, A.

    2017-01-01

    In Conformance Checking, alignment is the problem of detecting and repairing nonconformity between the actual execution of a business process, as recorded in an event log, and the model of the same process. The literature proposes solutions for the alignment problem that are implementations of planning algorithms built ad hoc for the specific problem. Unfortunately, in the era of big data, these ad-hoc implementations do not scale sufficiently compared with well-established planning systems. In th...
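    The alignment problem described above can be illustrated, in miniature, as a shortest-edit computation between an observed trace and a model run, where each "log move" (skipped logged event) or "model move" (inserted modeled event) costs one unit. This is a simplified stand-in for intuition only, not the planning-based method the paper describes:

```python
def alignment_cost(trace, run):
    """Minimal number of moves needed to align an observed trace with a
    model run: matching events are free synchronous moves; a log move
    skips a logged event, a model move inserts a modeled event."""
    m, n = len(trace), len(run)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # only log moves remain
    for j in range(n + 1):
        dp[0][j] = j  # only model moves remain
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if trace[i - 1] == run[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]  # synchronous move, no cost
            else:
                dp[i][j] = 1 + min(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

# Trace "abcd" against model run "abd": one log move (the extra "c").
print(alignment_cost("abcd", "abd"))  # 1
```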

  9. Complex reassortment events of unusual G9P[4] rotavirus strains in India between 2011 and 2013.

    Science.gov (United States)

    Doan, Yen Hai; Suzuki, Yoshiyuki; Fujii, Yoshiki; Haga, Kei; Fujimoto, Akira; Takai-Todaka, Reiko; Someya, Yuichi; Nayak, Mukti K; Mukherjee, Anupam; Imamura, Daisuke; Shinoda, Sumio; Chawla-Sarkar, Mamta; Katayama, Kazuhiko

    2017-10-01

    Rotavirus A (RVA) is the predominant etiological agent of acute gastroenteritis in young children worldwide. Recently, unusual G9P[4] rotavirus strains emerged with high prevalence in many countries. Such intergenogroup reassortant strains highlight the ongoing spread of unusual rotavirus strains throughout Asia. This study was undertaken to determine the whole genomes of eleven unusual G9P[4] strains detected in India during 2011-2013, and to compare them with other human and animal global RVAs to understand the exact origin of the unusual G9P[4] strains circulating in India and other countries worldwide. Of these 11 RVAs, four G9P[4] strains were double reassortants with the G9-VP7 and E6-NSP4 genes on a DS-1-like genetic backbone (G9-P[4]-I2-R2-C2-M2-A2-N2-T2-E6-H2). The other strains showed a complex genetic constellation, likely derived from a triple reassortment event, with the G9-VP7, N1-NSP2 and E6-NSP4 genes on a DS-1-like genetic backbone (G9-P[4]-I2-R2-C2-M2-A2-N1-T2-E6-H2). Presumably, these unusual G9P[4] strains were generated after several reassortment events between contemporary co-circulating human rotavirus strains. Moreover, the point mutation S291L at the interaction site between the inner and outer capsid proteins of the VP6 gene may be important in the rapid spread of this unusual strain. The complex reassortment events within the G9P[4] strains may be related to the high prevalence of mixed infections in India, as reported in this study and other previous studies. Copyright © 2017. Published by Elsevier B.V.

  10. Process evaluation for complex interventions in health services research: analysing context, text trajectories and disruptions.

    Science.gov (United States)

    Murdoch, Jamie

    2016-08-19

    Process evaluations assess the implementation and sustainability of complex healthcare interventions within clinical trials, with well-established theoretical models available for evaluating intervention delivery within specific contexts. However, there is a need to translate conceptualisations of context into analytical tools which enable the dynamic relationship between context and intervention implementation to be captured and understood. In this paper I propose an alternative approach to the design, implementation and analysis of process evaluations for complex health interventions through a consideration of trial protocols as textual documents, distributed and enacted at multiple contextual levels. As an example, I conduct retrospective analysis of a sample of field notes and transcripts collected during the ESTEEM study - a cluster randomised controlled trial of primary care telephone triage. I draw on theoretical perspectives associated with Linguistic Ethnography to examine the delivery of ESTEEM through staff orientation to different texts. In doing so I consider what can be learned from examining the flow and enactment of protocols for notions of implementation and theoretical fidelity (i.e. intervention delivered as intended and whether congruent with the intervention theory). Implementation of the triage intervention required staff to integrate essential elements of the protocol within everyday practice, seen through the adoption and use of different texts that were distributed across staff and within specific events. Staff were observed deploying texts in diverse ways (e.g. reinterpreting scripts, deviating from standard operating procedures, difficulty completing decision support software), providing numerous instances of disruption to maintaining intervention fidelity. Such observations exposed tensions between different contextual features in which the trial was implemented, offering theoretical explanations for the main trial findings. 
The value of

  11. ATLAS TDAQ/DCS Event Filter Event Handler Requirements

    CERN Document Server

    Bee, C P; Meessen, C; Qian, Z; Touchard, F; Green, P; Pinfold, J L; Wheeler, S; Negri, A; Scannicchio, D A; Vercesi, V

    2002-01-01

    The second iteration of the Software Development Process of the ATLAS Event Filter has been launched. A summary of the design phase of the first iteration is given in the introduction. The document gives constraints, use cases, functional and non-functional requirements for the Event Handler sub-system of the Event Filter.

  12. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    metadata, task XML object, and WMS/WFS cartographical services interconnects metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of calculation tasks running on remote servers, and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of popular desktop GIS applications such as uDIG, QuantumGIS, etc. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.

  13. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for automated processing of complex α-spectra of actinides were implemented on an Ehlektronika D3-28 computer line connected to an ICA-070 multichannel amplitude pulse analyzer. The program developed enables calculation of peak intensities and relative isotope content, energy calibration of spectra, calculation of peak centers of gravity and energy resolution, and integral counting in a selected part of the spectrum. The error of the automated processing method depends on the degree of spectral complexity and lies within the limits of 1-12%. 8 refs.; 4 figs.; 2 tabs
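    Two of the quantities named in the abstract — the integral count and the peak center of gravity — reduce to simple channel-weighted sums. The sketch below assumes a spectrum stored as a plain list of channel counts, with an illustrative background-free test peak:

```python
def peak_centroid_and_area(counts, lo, hi):
    """Integral counts and center of gravity (in channel units) of a
    peak occupying channels [lo, hi) of a spectrum."""
    region = counts[lo:hi]
    area = sum(region)  # integral counting over the selected region
    # Center of gravity: channel positions weighted by their counts.
    centroid = sum((lo + i) * c for i, c in enumerate(region)) / area
    return centroid, area

# Symmetric test peak centered on channel 12.
spectrum = [0] * 10 + [1, 4, 9, 4, 1] + [0] * 10
centroid, area = peak_centroid_and_area(spectrum, 10, 15)
print(centroid, area)  # 12.0 19
```

    A production routine would additionally subtract background and fit overlapping peaks, which is where the 1-12% error quoted above comes into play.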

  14. Episodes, events, and models

    Directory of Open Access Journals (Sweden)

    Sangeet Khemlani

    2015-10-01

    Full Text Available We describe a novel computational theory of how individuals segment perceptual information into representations of events. The theory is inspired by recent findings in the cognitive science and cognitive neuroscience of event segmentation. In line with recent theories, it holds that online event segmentation is automatic, and that event segmentation yields mental simulations of events. But it posits two novel principles as well: first, discrete episodic markers track perceptual and conceptual changes, and can be retrieved to construct event models. Second, the process of retrieving and reconstructing those episodic markers is constrained and prioritized. We describe a computational implementation of the theory, as well as a robotic extension of the theory that demonstrates the processes of online event segmentation and event model construction. The theory is the first unified computational account of event segmentation and temporal inference. We conclude by demonstrating how neuroimaging data can constrain and inspire the construction of process-level theories of human reasoning.

  15. Early referential context effects in sentence processing: Evidence from event-related brain potentials

    NARCIS (Netherlands)

    Berkum, J.J.A. van; Brown, C.M.; Hagoort, P.

    1999-01-01

    An event-related brain potentials experiment was carried out to examine the interplay of referential and structural factors during sentence processing in discourse. Subjects read (Dutch) sentences beginning like “David told the girl that … ” in short story contexts that had introduced either one or

  16. Discovering deviating cases and process variants using trace clustering

    NARCIS (Netherlands)

    Hompes, B.F.A.; Buijs, J.C.A.M.; van der Aalst, W.M.P.; Dixit, P.M.; Buurman, J.

    2015-01-01

    Information systems supporting business processes generate event data which provide the starting point for a range of process mining techniques. Lion's share of real-life processes are complex and ad-hoc, which creates problems for traditional process mining techniques, that cannot deal with such

  17. Investigating source processes of isotropic events

    Science.gov (United States)

    Chiang, Andrea

    This dissertation demonstrates the utility of complete-waveform regional moment tensor inversion for nuclear event discrimination. I explore the source processes and associated uncertainties for explosions and earthquakes under the effects of limited station coverage, compound seismic sources, assumptions in velocity models and the corresponding Green's functions, and the effects of shallow source depth and free-surface conditions. The motivation to develop better techniques for obtaining reliable source mechanisms and assessing uncertainties is not limited to nuclear monitoring: such techniques also provide quantitative information about the characteristics of seismic hazards, local and regional tectonics, and in-situ stress fields of the region. This dissertation begins with the analysis of three sparsely recorded events: the 14 September 1988 US-Soviet Joint Verification Experiment (JVE) nuclear test at the Semipalatinsk test site in Eastern Kazakhstan, and two nuclear explosions at the Chinese Lop Nor test site. We utilize a regional-distance seismic waveform method, fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We examine the effects of the free surface on the moment tensor via synthetic testing, and apply the moment-tensor-based discrimination method to well-recorded chemical explosions. These shallow chemical explosions represent rather severe source-station geometry in terms of vanishing-traction issues. We show that the combined waveform and first-motion method enables unique discrimination of these events, even though the data include unmodeled single-force components resulting from the collapse and blowout of the quarry face immediately following the initial

  18. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  19. Processing of visual semantic information to concrete words: temporal dynamics and neural mechanisms indicated by event-related brain potentials.

    Science.gov (United States)

    van Schie, Hein T; Wijers, Albertus A; Mars, Rogier B; Benjamins, Jeroen S; Stowe, Laurie A

    2005-05-01

    Event-related brain potentials were used to study the retrieval of visual semantic information to concrete words, and to investigate possible structural overlap between visual object working memory and concreteness effects in word processing. Subjects performed an object working memory task that involved 5 s retention of simple 4-angled polygons (load 1), complex 10-angled polygons (load 2), and a no-load baseline condition. During the polygon retention interval subjects were presented with a lexical decision task to auditory presented concrete (imageable) and abstract (nonimageable) words, and pseudowords. ERP results are consistent with the use of object working memory for the visualisation of concrete words. Our data indicate a two-step processing model of visual semantics in which visual descriptive information of concrete words is first encoded in semantic memory (indicated by an anterior N400 and posterior occipital positivity), and is subsequently visualised via the network for object working memory (reflected by a left frontal positive slow wave and a bilateral occipital slow wave negativity). Results are discussed in the light of contemporary models of semantic memory.

  20. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    National Research Council Canada - National Science Library

    Neu, Charles R; Davenport, Jon; Smith, William R

    2007-01-01

    This paper uses discrete-event simulation modeling, inventory-reduction, and process improvement concepts to identify and analyze possibilities for improving the training continuum at the Marine Corps...

  1. Available processing resources influence encoding-related brain activity before an event

    OpenAIRE

    Galli, Giulia; Gebert, A. Dorothea; Otten, Leun J.

    2013-01-01

    Effective cognitive functioning not only relies on brain activity elicited by an event, but also on activity that precedes it. This has been demonstrated in a number of cognitive domains, including memory. Here, we show that brain activity that precedes the effective encoding of a word into long-term memory depends on the availability of sufficient processing resources. We recorded electrical brain activity from the scalps of healthy adult men and women while they memorized intermixed visual ...

  2. Food processing as an agricultural countermeasure after an accidental contamination event

    International Nuclear Information System (INIS)

    Igreja, Eduardo; Rochedo, Elaine R.R.; Prado, Nadya M.P.D.; Silva, Diogo N.G.

    2013-01-01

    Food processing allows a significant reduction in the radionuclide contamination of foodstuffs. The effects of processing on contaminated food depend on the radionuclide, the type of foodstuff and the method of processing. The effectiveness of radionuclide removal from raw material during processing can vary widely; however, processing of raw materials of vegetable and animal origin is often considered one of the most effective countermeasures for reducing the radioactive contamination of the foodstuff to or below permissible levels, and can be applied both domestically and in industrial processing of food. The food processing retention factor, Fr, is the fraction of radionuclide activity that is retained in the food after processing; it is obtained as the product of two quantities: the processing efficiency, Pe, which is the ratio of the fresh weight of the processed food to the weight of the original raw material, and the processing factor, Pf, which is the ratio of the radionuclide activity concentrations in the processed food and in the raw material. The objective of this work was to investigate the reduction in dose due to food processing after a nuclear or radiological accident. The radionuclides considered were Cs-137, Sr-90 and I-131. The effect on the total diet of individuals was investigated for a typical diet of the Southeast region, where the Brazilian nuclear power plants are located. The effect was analyzed considering the use of the processing technologies after contamination events occurring in different seasons of the year. (author)
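    The retention-factor relationship defined in the abstract (Fr = Pe × Pf) can be sketched in a few lines; the processing step and the numerical values below are illustrative assumptions, not results from the study:

```python
def retention_factor(processing_efficiency, processing_factor):
    """Fraction of radionuclide activity retained after processing:
    Fr = Pe * Pf, where Pe is the processed/raw fresh-weight ratio and
    Pf is the processed/raw ratio of activity concentrations."""
    return processing_efficiency * processing_factor

# Hypothetical example: a milling step keeps 70% of the fresh weight
# (Pe = 0.7) and halves the Cs-137 activity concentration (Pf = 0.5),
# so 35% of the original activity remains in the processed food.
print(retention_factor(0.7, 0.5))  # 0.35
```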

  3. Organotypic three-dimensional culture model of mesenchymal and epithelial cells to examine tissue fusion events.

    Science.gov (United States)

    Tissue fusion during early mammalian development requires coordination of multiple cell types, the extracellular matrix, and complex signaling pathways. Fusion events during processes including heart development, neural tube closure, and palatal fusion are dependent on signaling ...

  4. Complex Ornament Machining Process on a CNC Router

    Directory of Open Access Journals (Sweden)

    Camelia COŞEREANU

    2014-03-01

Full Text Available The paper investigates CNC routing possibilities for three species of wood, namely ash (Fraxinus excelsior), lime wood (Tilia cordata) and fir wood (Abies alba), in order to obtain accurate surfaces for Art Nouveau sculptured ornaments. Given the complexity of the CNC tool path for producing the wavy shapes of Art Nouveau decorations, choosing the processing parameters for each species of wood requires laborious research work to correlate these parameters. Two Art Nouveau ornaments are proposed for the investigation. They are CNC routed using two types of cutting tools. The processing parameters, namely spindle speed, feed speed and depth of cut, were the three variables of the machining process for the three species of wood, and were combined so as to provide good surface finish as a quality attribute. In total, forty-six combinations of the processing parameters were applied in CNC routing of samples made of the three species of wood. Finally, an optimum combination of the processing parameters is recommended for each species of wood.

  5. Complex processing of antimony-mercury gold concentrates of Dzhizhikrut Deposit

    International Nuclear Information System (INIS)

    Abdusalyamova, M.N.; Gadoev, S.A.; Dreisinger, D.; Solozhenkin, P.M.

    2013-01-01

    Present article is devoted to complex processing of antimony-mercury gold concentrates of Dzhizhikrut Deposit. The purpose of research was obtaining the metallic mercury and antimony with further gold and thallium extraction.

  6. Constructing Dynamic Event Trees from Markov Models

    International Nuclear Information System (INIS)

    Paolo Bucci; Jason Kirschenbaum; Tunc Aldemir; Curtis Smith; Ted Wood

    2006-01-01

    In the probabilistic risk assessment (PRA) of process plants, Markov models can be used to model accurately the complex dynamic interactions between plant physical process variables (e.g., temperature, pressure, etc.) and the instrumentation and control system that monitors and manages the process. One limitation of this approach that has prevented its use in nuclear power plant PRAs is the difficulty of integrating the results of a Markov analysis into an existing PRA. In this paper, we explore a new approach to the generation of failure scenarios and their compilation into dynamic event trees from a Markov model of the system. These event trees can be integrated into an existing PRA using software tools such as SAPHIRE. To implement our approach, we first construct a discrete-time Markov chain modeling the system of interest by: (a) partitioning the process variable state space into magnitude intervals (cells), (b) using analytical equations or a system simulator to determine the transition probabilities between the cells through the cell-to-cell mapping technique, and, (c) using given failure/repair data for all the components of interest. The Markov transition matrix thus generated can be thought of as a process model describing the stochastic dynamic behavior of the finite-state system. We can therefore search the state space starting from a set of initial states to explore all possible paths to failure (scenarios) with associated probabilities. We can also construct event trees of arbitrary depth by tracing paths from a chosen initiating event and recording the following events while keeping track of the probabilities associated with each branch in the tree. As an example of our approach, we use the simple level control system often used as benchmark in the literature with one process variable (liquid level in a tank), and three control units: a drain unit and two supply units. 
Each unit includes a separate level sensor to observe the liquid level in the tank.
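
The scenario search described above (a Markov transition model searched from an initial state for all paths to failure, with associated probabilities) can be sketched as follows; the three-state chain and its transition probabilities are invented for illustration and are not the benchmark level-control system:

```python
# Hypothetical discrete-time Markov chain over coarse system states.
P = {
    "nominal":  {"nominal": 0.90, "degraded": 0.09, "failed": 0.01},
    "degraded": {"nominal": 0.20, "degraded": 0.70, "failed": 0.10},
    "failed":   {"failed": 1.0},
}

def failure_scenarios(start, max_depth):
    """Yield (path, probability) for each path reaching 'failed' within max_depth transitions."""
    stack = [([start], 1.0)]
    while stack:
        path, prob = stack.pop()
        state = path[-1]
        if state == "failed":
            yield path, prob          # a complete failure scenario (event-tree branch)
            continue
        if len(path) > max_depth:     # bound the event-tree depth
            continue
        for nxt, p in P[state].items():
            stack.append((path + [nxt], prob * p))

scenarios = list(failure_scenarios("nominal", 3))
```

Each yielded path corresponds to one branch of a dynamic event tree, with the branch probability given by the product of the transition probabilities along it.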

  7. COMPLEX SIMULATION MODEL OF TRAIN BREAKING-UP PROCESS AT THE HUMPS

    Directory of Open Access Journals (Sweden)

    E. B. Demchenko

    2015-11-01

    Full Text Available Purpose. One of the priorities of station sorting complex functioning improvement is the breaking-up process energy consumptions reduction, namely: fuel consumption for train pushing and electric energy consumption for cut braking. In this regard, an effective solution of the problem of energy consumption reduction at breaking-up subsystem requires a comprehensive handling of train pushing and cut rolling down processes. At the same time, the analysis showed that the current task of pushing process improvement and cut rolling down effectiveness increase are solved separately. To solve this problem it is necessary to develop the complex simulation model of train breaking up process at humps. Methodology. Pushing process simulation was done based on adapted under the shunting conditions traction calculations. In addition, the features of shunting locomotives work at the humps were taken into account. In order to realize the current pushing mode the special algorithm of hump locomotive controlling, which along with the safety shunting operation requirements takes into account behavioral factors associated with engineer control actions was applied. This algorithm provides train smooth acceleration and further movement with speed, which is close to the set speed. Hump locomotive fuel consumptions were determined based on the amount of mechanical work performed by locomotive traction. Findings. The simulation model of train pushing process was developed and combined with existing cut rolling down model. Cut initial velocity is determined during simulation process. The obtained initial velocity is used for further cut rolling process modeling. In addition, the modeling resulted in sufficiently accurate determination of the fuel rates consumed for train breaking-up. Originality. 
A simulation model of the train breaking-up process at humps was developed which, in contrast to existing models, reproduces all elements of this process in detail and as a single complex.

  8. Study of multiple hard-scatter processes from different p p interactions in the same ATLAS event

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    Given the large integrated luminosity and the large average pileup in the 2017 ATLAS dataset, the probability of having two leptonically decaying Z bosons originating from separate $pp$ interactions in the same LHC bunch crossing becomes non-negligible. Such events are searched for and the number observed compared with the expectation. These types of events (also for the case involving other hard scatter processes, such as W, photon or top quark production, in the same event) may cause additional backgrounds for particular physics analyses, and therefore this background must be accounted for when relevant.

  9. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    Science.gov (United States)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other less-detailed models. The DES team continues to innovate and expand.
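
As a minimal illustration of the DES idea described above (system state changing at discrete points in time, with random-variable inputs), the following sketch simulates a single hypothetical processing facility; all names and parameters are assumptions for illustration and are not part of the NASA models:

```python
import heapq
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate(n_jobs, mean_duration):
    """Process n_jobs one at a time; return (job, completion_time) pairs."""
    events = [(0.0, i) for i in range(n_jobs)]   # (arrival_time, job id); all arrive at t = 0
    heapq.heapify(events)                        # event list ordered by time
    facility_free_at = 0.0
    done = []
    while events:
        arrival, job = heapq.heappop(events)
        start = max(arrival, facility_free_at)   # wait if the facility is busy
        duration = random.expovariate(1.0 / mean_duration)  # random task duration
        facility_free_at = start + duration      # state changes at a discrete point in time
        done.append((job, facility_free_at))
    return done

completions = simulate(5, mean_duration=2.0)
```

Replacing the single facility with several contended resources, and the exponential durations with Delphi-estimated distributions, is the kind of extension the abstract describes.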

  10. Membrane Processes Based on Complexation Reactions of Pollutants as Sustainable Wastewater Treatments

    Directory of Open Access Journals (Sweden)

    Teresa Poerio

    2009-11-01

Full Text Available Water is today considered a vital and limited resource due to industrial development and population growth. Developing appropriate water treatment techniques to ensure sustainable management represents a key point in worldwide strategies. Removing both organic and inorganic species with techniques that couple membrane processes and appropriate complexing agents to bind pollutants is an important alternative to classical separation processes in water treatment. Supported Liquid Membrane (SLM) and Complexation Ultrafiltration (CP-UF) based processes meet the sustainability criteria because they require low amounts of energy compared to pressure-driven membrane processes and low amounts of complexing agents, and they allow recovery of water and some pollutants (e.g., metals). A more interesting process, from the application point of view, is the Stagnant Sandwich Liquid Membrane (SSwLM), introduced as an SLM implementation. It has been studied in the separation of the drug gemfibrozil (GEM) and of copper(II) as organic and inorganic pollutants in water. The obtained results showed in both cases the higher efficiency of SSwLM with respect to the SLM configuration. Indeed, higher stability (335.5 vs. 23.5 hours for GEM; 182.7 vs. 49.2 for copper(II)) and higher fluxes (0.662 vs. 0.302 mmol·h-1·m-2 for GEM; 43.3 vs. 31.0 for copper(II)) were obtained by using the SSwLM. Concerning the CP-UF process, its feasibility was studied in the separation of metals from waters (e.g., from soil washing), giving particular attention to process sustainability aspects such as water and polymer recycling and the recovery of free metal and water. The selectivity of the CP-UF process was also validated in the separate removal of copper(II) and nickel(II), both contained in synthetic and real aqueous effluents. Thus, the complexation reactions involved in the SSwLM and CP-UF processes play a key role in meeting the sustainability criteria.

  11. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  12. Transient paralysis during acupuncture therapy: a case report of an adverse event.

    Science.gov (United States)

    Beable, Anne

    2013-09-01

    A patient with apparently well-controlled epilepsy with a painful musculoskeletal condition was treated successfully with two sessions of acupuncture. However, 4 h after the first treatment and during the second, an adverse event involving impairment of consciousness occurred. The patient subsequently experienced an increased frequency of complex partial seizures resulting in the loss of his driving licence. A detailed retrospective review of the past medical history indicated that the patient probably had comorbidities in the form of rapid eye movement sleep behaviour disorder and dysfunctional somatosensory/vestibular processing. Acupuncture may have triggered the adverse event via shared neurosubstrates. This adverse event raises possible implications regarding safe clinical acupuncture practice.

  13. The Emotional Impact of Traditional and New Media in Social Events

    Science.gov (United States)

    Salcudean, Minodora; Muresan, Raluca

    2017-01-01

    In past times, media were the sole vector to reflect in their entire complexity the events surrounding major world tragedies. Nowadays, social media are an essential component of the media process and classical press channels are connected to the social networking flow, where they can find information and, at the same time, tap into the emotional…

  14. Blind Source Separation of Event-Related EEG/MEG.

    Science.gov (United States)

    Metsomaa, Johanna; Sarvas, Jukka; Ilmoniemi, Risto Juhani

    2017-09-01

    Blind source separation (BSS) can be used to decompose complex electroencephalography (EEG) or magnetoencephalography data into simpler components based on statistical assumptions without using a physical model. Applications include brain-computer interfaces, artifact removal, and identifying parallel neural processes. We wish to address the issue of applying BSS to event-related responses, which is challenging because of nonstationary data. We introduce a new BSS approach called momentary-uncorrelated component analysis (MUCA), which is tailored for event-related multitrial data. The method is based on approximate joint diagonalization of multiple covariance matrices estimated from the data at separate latencies. We further show how to extend the methodology for autocovariance matrices and how to apply BSS methods suitable for piecewise stationary data to event-related responses. We compared several BSS approaches by using simulated EEG as well as measured somatosensory and transcranial magnetic stimulation (TMS) evoked EEG. Among the compared methods, MUCA was the most tolerant one to noise, TMS artifacts, and other challenges in the data. With measured somatosensory data, over half of the estimated components were found to be similar by MUCA and independent component analysis. MUCA was also stable when tested with several input datasets. MUCA is based on simple assumptions, and the results suggest that MUCA is robust with nonideal data. Event-related responses and BSS are valuable and popular tools in neuroscience. Correctly designed BSS is an efficient way of identifying artifactual and neural processes from nonstationary event-related data.
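
A minimal sketch of the per-latency covariance estimates that MUCA jointly diagonalizes, assuming multitrial data indexed as trials[trial][channel][time]; the toy numbers below are invented, and the joint-diagonalization step itself is omitted:

```python
def covariance_at_latency(trials, t):
    """Channel-by-channel covariance at latency t, estimated across trials."""
    n_ch = len(trials[0])
    mean = [sum(trial[c][t] for trial in trials) / len(trials) for c in range(n_ch)]
    cov = [[0.0] * n_ch for _ in range(n_ch)]
    for trial in trials:
        for i in range(n_ch):
            for j in range(n_ch):
                cov[i][j] += (trial[i][t] - mean[i]) * (trial[j][t] - mean[j])
    n = len(trials) - 1                 # unbiased normalization
    return [[v / n for v in row] for row in cov]

# Toy data: 2 trials, 2 channels, 2 time samples (values invented).
trials = [[[1.0, 0.0], [2.0, 0.0]],
          [[3.0, 0.0], [6.0, 0.0]]]
cov0 = covariance_at_latency(trials, 0)
```

One such matrix per latency gives the set of covariance matrices whose approximate joint diagonalization yields the unmixing in MUCA.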

  15. Prediction of Biomolecular Complexes

    KAUST Repository

    Vangone, Anna; Oliva, Romina; Cavallo, Luigi; Bonvin, Alexandre M. J. J.

    2017-01-01

Almost all processes in living organisms occur through specific interactions between biomolecules. Any dysfunction of those interactions can lead to pathological events. Understanding such interactions is therefore a crucial step in the investigation of biological systems and a starting point for drug design. In recent years, experimental studies have been devoted to unravel the principles of biomolecular interactions; however, due to experimental difficulties in solving the three-dimensional (3D) structure of biomolecular complexes, the number of available, high-resolution experimental 3D structures does not fulfill the current needs. Therefore, complementary computational approaches to model such interactions are necessary to assist experimentalists, since a full understanding of how biomolecules interact (and consequently how they perform their function) only comes from 3D structures, which provide crucial atomic details about binding and recognition processes. In this chapter we review approaches to predict biomolecular complexes, introducing the concept of molecular docking, a technique which uses a combination of geometric, steric and energetics considerations to predict the 3D structure of a biological complex starting from the individual structures of its constituent parts. We provide a mini-guide about docking concepts, its potential and challenges, along with post-docking analysis and a list of related software.

  17. ELECTRIC FACTORS INFLUENCING THE COMPLEX EROSION PROCESSING BY INTRODUCING THE ELECTROLYTE THROUGH THE TRANSFER OBJECT

    Directory of Open Access Journals (Sweden)

    Alin Nioata

    2014-05-01

Full Text Available Electric and electrochemical complex erosion processing is influenced by a great number of factors that act in tight interdependence and mutually influence one another, both in achieving a stable machining process and in achieving the final technological characteristics. The quantities that take part in the fundamental phenomena of the complex erosion mechanism, and that contribute to defining the technological characteristics, are its factors. The paper presents the potential difference U and the current intensity I as determining factors of the complex erosion process, as well as other factors deriving from them: the current density and the power of the supply source.

  18. Sex differences in humor processing: An event-related potential study.

    Science.gov (United States)

    Chang, Yi-Tzu; Ku, Li-Chuan; Chen, Hsueh-Chih

    2018-02-01

    Numerous behavioral studies and a handful of functional neuroimaging studies have reported sex differences in humor. However, no study to date has examined differences in the time-course of brain activity during multistage humor processing between the sexes. The purpose of this study was to compare real-time dynamics related to humor processing between women and men, with reference to a proposed three-stage model (involving incongruity detection, incongruity resolution, and elaboration stages). Forty undergraduate students (20 women) underwent event-related potential recording while subjectively rating 30 question-answer-type jokes and 30 question-answer-type statements in a random order. Sex differences were revealed by analyses of the mean amplitudes of difference waves during a specific time window between 1000 and 1300 ms poststimulus onset (P1000-1300). This indicates that women recruited more mental resources to integrate cognitive and emotional components at this late stage. In contrast, men recruited more automated processes during the transition from the cognitive operations of the incongruity resolution stage to the emotional response of the humor elaboration stage. Our results suggest that sex differences in humor processing lie in differences in the integration of cognitive and emotional components, which are closely linked and interact reciprocally, particularly in women. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence of a particular sequence of images containing a fence crossing event. This resulted in a reduction of 54.8% of images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
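
The background-subtraction idea can be sketched as follows; the thresholds and frames below are hypothetical, and the authors' histogram rules are not reproduced:

```python
def candidate_event(image, background, pixel_threshold=30, fraction_threshold=0.02):
    """Flag an image when the fraction of pixels differing from the background
    frame by more than pixel_threshold exceeds fraction_threshold."""
    changed = sum(
        1
        for row_img, row_bg in zip(image, background)
        for p, b in zip(row_img, row_bg)
        if abs(p - b) > pixel_threshold
    )
    total = sum(len(row) for row in image)
    return changed / total > fraction_threshold

# Toy 8x8 grayscale frames (values invented).
background = [[100] * 8 for _ in range(8)]
empty_frame = [[102] * 8 for _ in range(8)]       # only sensor noise: below threshold
animal_frame = [row[:] for row in background]
for r in range(3, 6):                             # a bright blob standing in for an animal
    for c in range(2, 6):
        animal_frame[r][c] = 200
```

Images that never trip the flag across a sequence would fall into a low-confidence category and could be dropped from manual review, which is the data-reduction effect the abstract reports.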

  20. [Event-related brain potentials when Russian verbs being conjugated: to the problem of language processing modularity].

    Science.gov (United States)

    Dan'ko, S G; Boĭtsova, Iu A; Solov'eva, M L; Chernigovskaia, T V; Medvedev, S V

    2014-01-01

In the light of the alternative "two-system" and "single-system" models of language processing, efforts have been undertaken to study the brain mechanisms for generating regular and irregular forms of Russian verbs. Evoked activity was recorded from 19 EEG channels while the speech morphology operations to be compared alternated randomly. Verbs of imperfective aspect in the infinitive, belonging either to a group of productive verbs (the default, conventionally regular class) or to a group of unproductive verbs (the conventionally irregular class), were presented to healthy subjects. The subjects were asked to produce the first-person present-tense forms of these verbs. Results of an analysis of event-related potentials (ERPs) for a group of 22 persons are presented. Statistically reliable ERP amplitude differences between the verb groups were found only in the 600-850 ms latency range in central and parietal zones of the cortex. In these latencies, ERP values associated with the presentation of irregular verbs are negative relative to ERP values associated with the presentation of regular verbs. The results are interpreted as a consequence of the differing complexity of mental work with the verbs of these two groups, and presumably do not support the hypothesis of a universal "two-system" brain mechanism for processing regular and irregular language forms.

  1. Improved motion contrast and processing efficiency in OCT angiography using complex-correlation algorithm

    International Nuclear Information System (INIS)

    Guo, Li; Li, Pei; Pan, Cong; Cheng, Yuxuan; Ding, Zhihua; Li, Peng; Liao, Rujia; Hu, Weiwei; Chen, Zhong

    2016-01-01

    The complex-based OCT angiography (Angio-OCT) offers high motion contrast by combining both the intensity and phase information. However, due to involuntary bulk tissue motions, complex-valued OCT raw data are processed sequentially with different algorithms for correcting bulk image shifts (BISs), compensating global phase fluctuations (GPFs) and extracting flow signals. Such a complicated procedure results in massive computational load. To mitigate such a problem, in this work, we present an inter-frame complex-correlation (CC) algorithm. The CC algorithm is suitable for parallel processing of both flow signal extraction and BIS correction, and it does not need GPF compensation. This method provides high processing efficiency and shows superiority in motion contrast. The feasibility and performance of the proposed CC algorithm is demonstrated using both flow phantom and live animal experiments. (paper)
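
A common form of inter-frame complex correlation (assumed here for illustration; the paper's exact estimator may differ) can be sketched as follows: static tissue keeps a stable phase between frames and gives |CC| near 1, while flow decorrelates the signal and lowers |CC|:

```python
import cmath
import random

def complex_correlation(frame_a, frame_b):
    """Magnitude of the normalized complex inner product of two frames."""
    num = sum(a * b.conjugate() for a, b in zip(frame_a, frame_b))
    den = (sum(abs(a) ** 2 for a in frame_a) *
           sum(abs(b) ** 2 for b in frame_b)) ** 0.5
    return abs(num) / den

random.seed(0)
# Toy complex-valued OCT samples (invented): stable phase ramp vs. random phase.
static = [cmath.rect(1.0, 0.3 * k) for k in range(64)]                              # static tissue
moving = [cmath.rect(1.0, random.uniform(-cmath.pi, cmath.pi)) for _ in range(64)]  # flow-decorrelated
```

Because both intensity and phase enter the numerator, a rigid bulk shift of the whole frame affects static and flowing regions alike, which is consistent with the abstract's claim that the estimator tolerates bulk motion without separate phase compensation.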

  2. Automating the Simulation of SME Processes through a Discrete Event Parametric Model

    Directory of Open Access Journals (Sweden)

    Francesco Aggogeri

    2015-02-01

Full Text Available At the factory level, the manufacturing system can be described as a group of processes governed by complex weaves of engineering strategies and technologies. Decision-making processes involve a lot of information, driven by managerial strategies, technological implications and layout constraints. Many factors affect decisions, and their combination must be carefully managed to determine the best solutions to optimize performances. In this way, advanced simulation tools could support the decisional process of many SMEs. The accessibility of these tools is limited by knowledge, cost, data availability and development time. These tools should be used to support strategic decisions rather than specific situations. In this paper, a novel approach is proposed that aims to facilitate the simulation of manufacturing processes by fast modelling and evaluation. The idea is to realize a model that is able to be automatically adapted to the user’s specific needs. The model must be characterized by a high degree of flexibility, configurability and adaptability in order to automatically simulate multiple/heterogeneous industrial scenarios. In this way, even a SME can easily access a complex tool, perform thorough analyses and be supported in taking strategic decisions. The parametric DES model is part of a greater software platform developed during the COPERNICO EU funded project.

  3. Social dimension and complexity differentially influence brain responses during feedback processing.

    Science.gov (United States)

    Pfabigan, Daniela M; Gittenberger, Marianne; Lamm, Claus

    2017-10-30

Recent research emphasizes the importance of social factors during performance monitoring. Thus, the current study investigated the impact of social stimuli, such as communicative gestures, on feedback processing. Moreover, it addressed a shortcoming of previous studies, which failed to consider stimulus complexity as a potential confounding factor. Twenty-four volunteers performed a time estimation task while their electroencephalogram was recorded. Either social complex, social non-complex, non-social complex, or non-social non-complex stimuli were used to provide performance feedback. No effects of social dimension or complexity were found for task performance. In contrast, Feedback-Related Negativity (FRN) and P300 amplitudes were sensitive to both factors, with larger FRN and P300 amplitudes after social compared to non-social stimuli, and larger FRN amplitudes after complex positive than non-complex positive stimuli. P2 amplitudes were solely sensitive to feedback valence and social dimension. Subjectively, social complex stimuli were rated as more motivating than non-social complex ones. Independently of each other, social dimension and visual complexity influenced amplitude variation during performance monitoring. Social stimuli seem to be perceived as more salient, which is corroborated by the P2, FRN and P300 results, as well as by subjective ratings. This could be explained by their relevance during everyday social interactions.

  4. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    Science.gov (United States)

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headlines' keywords and words relationships in online Chinese news using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and considered adjacent relations as edges to construct word networks both using the whole sample and at the monthly level. Finally, we develop an integrated mechanism to analyze the features of words' networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly.
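
The headline-to-network construction described above (words as nodes, adjacency relations as edges) can be sketched as follows; English tokens stand in for the segmented Chinese words, and the headlines are invented:

```python
from collections import Counter

# Hypothetical pre-segmented headlines (one list of words per headline).
headlines = [
    ["bohai", "bay", "oil", "spill", "cleanup"],
    ["oil", "spill", "hits", "bohai", "coast"],
    ["gulf", "oil", "spill", "report"],
]

nodes = set()
edges = Counter()                        # directed edge -> adjacency count
for words in headlines:
    nodes.update(words)                  # every word is a node
    for a, b in zip(words, words[1:]):   # adjacent word pairs become edges
        edges[(a, b)] += 1
```

Edge weights then record how often two words appear adjacently across the corpus, so the heaviest edges surface the recurring keyword combinations that track an event's coverage over time.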

  6. Some considerations on Bible translation as complex process | Van ...

    African Journals Online (AJOL)

    It is argued that translation is a complex process: meaning is "created" by decoding the source text on several levels (for instance, grammatical; structural; literary; and socio-cultural levels). This "meaning" must then be encoded into the target language by means of the linguistic, literary, and cultural conventions of the target ...

  7. Frontend event selection with an MBD using Q

    International Nuclear Information System (INIS)

    Amann, J.F.

    1981-01-01

    A problem common to many complex experiments in Nuclear Physics is the need to provide for event selection at a level beyond that readily available in a fast hardware trigger. This may be desirable as a means of reducing the amount of unwanted data going to tape, or be needed to reduce system deadtime so as not to miss an infrequent good event. The latter criterion is particularly important at low-duty-factor accelerators such as LAMPF, where instantaneous trigger rates may be quite high. The need for such an event selection mechanism has arisen in conjunction with the installation of a polarimeter in the focal plane of the High Resolution Spectrometer (HRS) at LAMPF. It has been met using a combination of buffered CAMAC electronics and an enhancement to the LAMPF standard Q data acquisition system. The enhancement to Q allows the experimenter to specify, at runtime, a set of simple tests to be performed on each event as it is processed by the MBD, before the event is passed to the PDP-11 for taping and further analysis.

  8. Older Adults' Coping with Negative Life Events: Common Processes of Managing Health, Interpersonal, and Financial/Work Stressors

    Science.gov (United States)

    Moos, Rudolf H.; Brennan, Penny L.; Schutte, Kathleen K.; Moos, Bernice S.

    2006-01-01

    This study examined how older adults cope with negative life events in health, interpersonal, and financial/work domains and whether common stress and coping processes hold across these three domains. On three occasions, older adults identified the most severe negative event they faced in the last year and described how they appraised and coped…

  9. A simple conceptual model of abrupt glacial climate events

    Directory of Open Access Journals (Sweden)

    H. Braun

    2007-11-01

    Full Text Available Here we use a very simple conceptual model in an attempt to reduce essential parts of the complex nonlinearity of abrupt glacial climate changes (the so-called Dansgaard-Oeschger events) to a few simple principles, namely (i) the existence of two different climate states, (ii) a threshold process and (iii) an overshooting in the stability of the system at the start and the end of the events, which is followed by a millennial-scale relaxation. By comparison with a so-called Earth system model of intermediate complexity (CLIMBER-2), in which the events represent oscillations between two climate states corresponding to two fundamentally different modes of deep-water formation in the North Atlantic, we demonstrate that the conceptual model captures fundamental aspects of the nonlinearity of the events in that model. We use the conceptual model in order to reproduce and reanalyse nonlinear resonance mechanisms that were already suggested in order to explain the characteristic time scale of Dansgaard-Oeschger events. In doing so we identify a new form of stochastic resonance (i.e. an overshooting stochastic resonance) and provide the first explicitly reported manifestation of ghost resonance in a geosystem, i.e. of a mechanism which could be relevant for other systems with thresholds and with multiple states of operation. Our work enables us to explicitly simulate realistic probability measures of Dansgaard-Oeschger events (e.g. waiting time distributions), which are a prerequisite for statistical analyses on the regularity of the events by means of Monte-Carlo simulations. We thus think that our study is an important advance in order to develop more adequate methods to test the statistical significance and the origin of the proposed glacial 1470-year climate cycle.
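
    The conceptual ingredients named in the abstract (two states, a threshold, stochastic forcing with a millennial-scale periodic component) can be caricatured in a few lines. This is a hedged toy sketch, not the authors' model: the parameter values, noise level, and flip-back rule below are invented for illustration only.

```python
import math
import random

random.seed(1)

period = 1470.0   # years; the proposed glacial climate cycle
threshold = 1.0   # warm-state trigger level (invented)
dt = 10.0         # time step in years
t, state = 0.0, 0
events = []       # onset times of simulated warm (DO-like) events

while t < 50000.0:
    # weak periodic forcing plus noise; a state flip occurs only when the
    # combined signal crosses the threshold (the threshold principle)
    forcing = 0.6 * math.cos(2.0 * math.pi * t / period) + random.gauss(0.0, 0.45)
    if state == 0 and forcing > threshold:
        state = 1
        events.append(t)
    elif state == 1 and forcing < 0.0:
        state = 0  # relax back to the cold state
    t += dt

waits = [b - a for a, b in zip(events, events[1:])]
print(len(events), waits[:3])
```

    Histogramming `waits` over many random seeds is the kind of waiting-time statistic the abstract refers to as a prerequisite for Monte-Carlo significance tests.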

  10. SENTINEL EVENTS

    Directory of Open Access Journals (Sweden)

    Andrej Robida

    2004-09-01

    Full Text Available Background. The objective of the article is to present two-year statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitude of hospital leaders and staff are also included, and some recommendations regarding patient safety and the handling of sentinel events are given. Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system on sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To show the results of the first year, a conference for hospital directors and medical directors was organized. A survey was conducted among the participants with the purpose of gathering information about their view on sentinel events. One hundred questionnaires were distributed. Results. Sentinel events. There were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths: 5 of these were in-hospital suicides, 6 were due to an adverse event, and 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization with absolute alcohol into a femoral artery, and a physical attack on a physician by a patient. Analysis of the root causes of sentinel events showed that in most cases processes were inadequate. Survey. One quarter of those surveyed did not know about the sentinel event reporting system, 16% had actual problems when reporting events, and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media. 
The majority of

  11. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    Science.gov (United States)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
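
    A discrete-event simulation of the kind described reduces to an event queue processed in time order. The sketch below is a minimal illustration, not DARPA's F6 tool; the milestones, durations, and failure distribution are invented.

```python
import heapq
import random

random.seed(42)

# Event queue of (time_in_months, description); heapq pops in time order.
events = [(0.0, "development start"), (18.0, "launch")]
heapq.heapify(events)
# an uncertain on-orbit failure sometime in the 3 years after launch
heapq.heappush(events, (18.0 + random.uniform(0.0, 36.0), "on-orbit failure"))

log = []
while events:
    t, what = heapq.heappop(events)
    log.append((round(t, 1), what))
    if what == "launch":
        # processing one event may schedule follow-up events
        heapq.heappush(events, (t + 1.0, "commissioning complete"))

for t, what in log:
    print(f"t={t:6.1f} mo  {what}")
```

    Re-running the loop over many random draws yields distributions of schedule and on-orbit outcomes, which is the basis for the value-centric trade studies the abstract describes.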

  12. Maximizing information exchange between complex networks

    International Nuclear Information System (INIS)

    West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo

    2008-01-01

    Science is not merely the smooth progressive interaction of hypothesis, experiment and theory, although it sometimes has that form. More realistically the scientific study of any given complex phenomenon generates a number of explanations, from a variety of perspectives, that eventually requires synthesis to achieve a deep level of insight and understanding. One such synthesis has created the field of out-of-equilibrium statistical physics as applied to the understanding of complex dynamic networks. Over the past forty years the concept of complexity has undergone a metamorphosis. Complexity was originally seen as a consequence of memory in individual particle trajectories, in full agreement with a Hamiltonian picture of microscopic dynamics and, in principle, macroscopic dynamics could be derived from the microscopic Hamiltonian picture. The main difficulty in deriving macroscopic dynamics from microscopic dynamics is the need to take into account the actions of a very large number of components. The existence of events such as abrupt jumps, considered by the conventional continuous time random walk approach to describing complexity was never perceived as conflicting with the Hamiltonian view. Herein we review many of the reasons why this traditional Hamiltonian view of complexity is unsatisfactory. We show that as a result of technological advances, which make the observation of single elementary events possible, the definition of complexity has shifted from the conventional memory concept towards the action of non-Poisson renewal events. We show that the observation of crucial processes, such as the intermittent fluorescence of blinking quantum dots as well as the brain's response to music, as monitored by a set of electrodes attached to the scalp, has forced investigators to go beyond the traditional concept of complexity and to establish closer contact with the nascent field of complex networks. 
Complex networks form one of the most challenging areas of modern

  13. Maximizing information exchange between complex networks

    Science.gov (United States)

    West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo

    2008-10-01

    Science is not merely the smooth progressive interaction of hypothesis, experiment and theory, although it sometimes has that form. More realistically the scientific study of any given complex phenomenon generates a number of explanations, from a variety of perspectives, that eventually requires synthesis to achieve a deep level of insight and understanding. One such synthesis has created the field of out-of-equilibrium statistical physics as applied to the understanding of complex dynamic networks. Over the past forty years the concept of complexity has undergone a metamorphosis. Complexity was originally seen as a consequence of memory in individual particle trajectories, in full agreement with a Hamiltonian picture of microscopic dynamics and, in principle, macroscopic dynamics could be derived from the microscopic Hamiltonian picture. The main difficulty in deriving macroscopic dynamics from microscopic dynamics is the need to take into account the actions of a very large number of components. The existence of events such as abrupt jumps, considered by the conventional continuous time random walk approach to describing complexity was never perceived as conflicting with the Hamiltonian view. Herein we review many of the reasons why this traditional Hamiltonian view of complexity is unsatisfactory. We show that as a result of technological advances, which make the observation of single elementary events possible, the definition of complexity has shifted from the conventional memory concept towards the action of non-Poisson renewal events. We show that the observation of crucial processes, such as the intermittent fluorescence of blinking quantum dots as well as the brain’s response to music, as monitored by a set of electrodes attached to the scalp, has forced investigators to go beyond the traditional concept of complexity and to establish closer contact with the nascent field of complex networks. Complex networks form one of the most challenging areas of

  14. Maximizing information exchange between complex networks

    Energy Technology Data Exchange (ETDEWEB)

    West, Bruce J. [Mathematical and Information Science, Army Research Office, Research Triangle Park, NC 27708 (United States); Physics Department, Duke University, Durham, NC 27709 (United States)], E-mail: bwest@nc.rr.com; Geneston, Elvis L. [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Physics Department, La Sierra University, 4500 Riverwalk Parkway, Riverside, CA 92515 (United States); Grigolini, Paolo [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Istituto di Processi Chimico Fisici del CNR, Area della Ricerca di Pisa, Via G. Moruzzi, 56124, Pisa (Italy); Dipartimento di Fisica 'E. Fermi', Università di Pisa, Largo Pontecorvo 3, 56127 Pisa (Italy)

    2008-10-15

    Science is not merely the smooth progressive interaction of hypothesis, experiment and theory, although it sometimes has that form. More realistically the scientific study of any given complex phenomenon generates a number of explanations, from a variety of perspectives, that eventually requires synthesis to achieve a deep level of insight and understanding. One such synthesis has created the field of out-of-equilibrium statistical physics as applied to the understanding of complex dynamic networks. Over the past forty years the concept of complexity has undergone a metamorphosis. Complexity was originally seen as a consequence of memory in individual particle trajectories, in full agreement with a Hamiltonian picture of microscopic dynamics and, in principle, macroscopic dynamics could be derived from the microscopic Hamiltonian picture. The main difficulty in deriving macroscopic dynamics from microscopic dynamics is the need to take into account the actions of a very large number of components. The existence of events such as abrupt jumps, considered by the conventional continuous time random walk approach to describing complexity was never perceived as conflicting with the Hamiltonian view. Herein we review many of the reasons why this traditional Hamiltonian view of complexity is unsatisfactory. We show that as a result of technological advances, which make the observation of single elementary events possible, the definition of complexity has shifted from the conventional memory concept towards the action of non-Poisson renewal events. We show that the observation of crucial processes, such as the intermittent fluorescence of blinking quantum dots as well as the brain's response to music, as monitored by a set of electrodes attached to the scalp, has forced investigators to go beyond the traditional concept of complexity and to establish closer contact with the nascent field of complex networks. Complex networks form one of the most challenging areas of

  15. Emotional Picture and Word Processing: An fMRI Study on Effects of Stimulus Complexity

    Science.gov (United States)

    Schlochtermeier, Lorna H.; Kuchinke, Lars; Pehrs, Corinna; Urton, Karolina; Kappelhoff, Hermann; Jacobs, Arthur M.

    2013-01-01

    Neuroscientific investigations regarding aspects of emotional experiences usually focus on one stimulus modality (e.g., pictorial or verbal). Similarities and differences in the processing between the different modalities have rarely been studied directly. The comparison of verbal and pictorial emotional stimuli often reveals a processing advantage of emotional pictures in terms of larger or more pronounced emotion effects evoked by pictorial stimuli. In this study, we examined whether this picture advantage refers to general processing differences or whether it might partly be attributed to differences in visual complexity between pictures and words. We first developed a new stimulus database comprising valence and arousal ratings for more than 200 concrete objects representable in different modalities including different levels of complexity: words, phrases, pictograms, and photographs. Using fMRI we then studied the neural correlates of the processing of these emotional stimuli in a valence judgment task, in which the stimulus material was controlled for differences in emotional arousal. No superiority for the pictorial stimuli was found in terms of emotional information processing, with differences between modalities being revealed mainly in perceptual processing regions. While visual complexity might partly account for previously found differences in emotional stimulus processing, the main existing processing differences are probably due to enhanced processing in modality specific perceptual regions. We would suggest that both pictures and words elicit emotional responses with no general superiority for either stimulus modality, while emotional responses to pictures are modulated by perceptual stimulus features, such as picture complexity. PMID:23409009

  16. Crystallization process of a three-dimensional complex plasma

    Science.gov (United States)

    Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.

    2018-05-01

    Characteristic timescales and length scales for phase transitions of real materials are in ranges where direct visualization is infeasible. Therefore, model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma under gravity conditions is considered, where the system extends to a large extent into the bulk plasma. Time-resolved measurements exhibit the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method shows a fractal dimension of df≈2.72 for the clusters. This value gives a hint that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
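
    The box-counting method mentioned in the abstract estimates a fractal dimension from the slope of log(box count) versus log(1/box size). Below is a minimal 2-D sketch with an invented point set; the paper works in 3-D on particle clusters. For the smooth toy curve used here the expected dimension is 1, whereas the paper's clusters give about 2.72.

```python
import math

# Toy point set: a smooth arc sampled finely.
points = [(x / 1000.0, (x / 1000.0) ** 2) for x in range(1001)]

sizes = [0.1, 0.05, 0.025, 0.0125]
counts = []
for s in sizes:
    # count the grid boxes of side s that contain at least one point
    boxes = {(int(px / s), int(py / s)) for px, py in points}
    counts.append(len(boxes))

# least-squares slope of log(count) against log(1/size)
xs = [math.log(1.0 / s) for s in sizes]
ys = [math.log(c) for c in counts]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / sum((a - mx) ** 2 for a in xs)
print(f"estimated box-counting dimension: {slope:.2f}")
```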

  17. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    Science.gov (United States)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined into its essential components. First, three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed upon which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety originating from cybernetic theory is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity, using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
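
    The final aggregation step can be illustrated with a simple weighted-sum sketch in the spirit of SMART. The dimension scores, weights, and process names below are invented for illustration, not the thesis's notional data.

```python
# Invented dimension scores (0..1, higher = more complex) and expert weights.
weights = {"interconnective": 0.5, "strength": 0.3, "variety": 0.2}

processes = {
    "hydrogen reduction":  {"interconnective": 0.40, "strength": 0.55, "variety": 0.30},
    "molten electrolysis": {"interconnective": 0.70, "strength": 0.60, "variety": 0.50},
}

def smart_score(scores):
    """Weighted-sum aggregation of the three static-complexity dimensions."""
    return sum(weights[dim] * scores[dim] for dim in weights)

# rank candidate processes from least to most statically complex
for name in sorted(processes, key=lambda p: smart_score(processes[p])):
    print(f"{name}: {smart_score(processes[name]):.3f}")
```

    With real inputs, the per-dimension scores would come from the Q-connectivity and requisite-variety analyses the thesis develops, and the weights from expert elicitation.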

  18. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb-Argument Information on Predictive Processing in Aphasia.

    Science.gov (United States)

    Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa

    2016-12-01

    This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.

  19. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb–Argument Information on Predictive Processing in Aphasia

    Science.gov (United States)

    Dickey, Michael Walsh; Warren, Tessa

    2016-01-01

    Purpose This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951

  20. Event Management of RFID Data Streams: Fast Moving Consumer Goods Supply Chains

    Science.gov (United States)

    Mo, John P. T.; Li, Xue

    Radio Frequency Identification (RFID) is a wireless communication technology that uses radio-frequency waves to transfer information between tagged objects and readers without line of sight. This creates tremendous opportunities for linking real world objects into a world of "Internet of things". Application of RFID to the Fast Moving Consumer Goods sector will introduce billions of RFID tags in the world. Almost everything is tagged for tracking and identification purposes. This phenomenon will impose a new challenge not only to network capacity but also to the scalability of processing of RFID events and data. This chapter uses two national demonstrator projects in Australia as case studies to introduce an event management framework to process high-volume RFID data streams in real time and automatically transform physical RFID observations into business-level events. The model handles various temporal event patterns, both simple and complex, with temporal constraints. The model can be implemented in a data management architecture that allows global RFID item tracking and enables fast, large-scale RFID deployment.
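
    The core idea of turning raw RFID observations into business-level events under a temporal constraint can be sketched as a small pattern matcher. This is a hypothetical illustration: the locations, the 60-second window, and the "putaway complete" event are invented, not taken from the Australian demonstrator projects.

```python
WINDOW = 60.0  # seconds; temporal constraint for the complex event (invented)

def detect_putaway(reads):
    """Turn raw reads (timestamp, tag_id, location) into business-level events:
    a tag seen at the dock and then at a shelf within WINDOW seconds."""
    last_dock = {}       # tag_id -> last time the tag was read at the dock
    complex_events = []
    for t, tag, loc in sorted(reads):  # process observations in time order
        if loc == "dock":
            last_dock[tag] = t
        elif loc == "shelf" and tag in last_dock:
            if t - last_dock[tag] <= WINDOW:
                complex_events.append((t, tag, "putaway complete"))
            del last_dock[tag]
    return complex_events

reads = [
    (0.0, "tagA", "dock"), (12.5, "tagA", "shelf"),
    (5.0, "tagB", "dock"), (300.0, "tagB", "shelf"),  # outside the window
]
print(detect_putaway(reads))
```

    A production engine would evaluate many such patterns concurrently over unbounded streams, but the window-plus-state-machine structure is the same.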

  1. Information spread of emergency events: path searching on social networks.

    Science.gov (United States)

    Dai, Weihui; Hu, Hongzhi; Wu, Tunan; Dai, Yonghui

    2014-01-01

    Emergencies have attracted global attention from government and the public, and an emergency can easily trigger a series of serious social problems if it is not supervised effectively in the dissemination process. In the Internet world, people communicate with each other and form various virtual communities based on social networks, which leads to a complex and fast information spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help us to guide and control the information dissemination of emergency events for early warning.
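
    The paper's improved IBF algorithm is not specified in the abstract; for illustration, a standard Dijkstra search finds a shortest spread path on a weighted social graph. The node names and weights below are invented (weights could encode transmission delay between communities).

```python
import heapq

# Invented social graph; the paper's actual network is built from crawled data.
graph = {
    "source":    {"forum": 1.0, "microblog": 0.3},
    "microblog": {"forum": 0.2, "portal": 0.8},
    "forum":     {"portal": 0.5},
    "portal":    {},
}

def shortest_path(graph, start, goal):
    """Standard Dijkstra search returning (cost, path)."""
    pq = [(0.0, start, [start])]
    done = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in done:
            continue
        done.add(node)
        for nxt, w in graph[node].items():
            if nxt not in done:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = shortest_path(graph, "source", "portal")
print(cost, path)
```

    Here the cheapest spread path runs source -> microblog -> forum -> portal, beating the direct source -> forum hop, which is the kind of key path such an analysis would surface.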

  2. Information Spread of Emergency Events: Path Searching on Social Networks

    Directory of Open Access Journals (Sweden)

    Weihui Dai

    2014-01-01

    Full Text Available Emergencies have attracted global attention from government and the public, and an emergency can easily trigger a series of serious social problems if it is not supervised effectively in the dissemination process. In the Internet world, people communicate with each other and form various virtual communities based on social networks, which leads to a complex and fast information spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help us to guide and control the information dissemination of emergency events for early warning.

  3. Toward understanding the thermodynamics of TALSPEAK process. Medium effects on actinide complexation

    International Nuclear Information System (INIS)

    Zalupski, Peter R.; Martin, Leigh R.; Nash, Ken; Nakamura, Yoshinobu; Yamamoto, Masahiko

    2009-01-01

    The ingenious combination of lactate and diethylenetriamine-N,N,N′,N″,N″-pentaacetic acid (DTPA) as an aqueous actinide-complexing medium forms the basis of the successful separation of americium and curium from lanthanides known as the TALSPEAK process. While numerous reports in the prior literature have focused on the optimization of this solvent extraction system, considerably less attention has been devoted to the understanding of the basic thermodynamic features of the complex fluids responsible for the separation. The available thermochemical information of both lactate and DTPA protonation and metal complexation reactions are representative of the behavior of these ions under idealized conditions. Our previous studies of medium effects on lactate protonation suggest that significant departures from the speciation predicted based on reported thermodynamic values should be expected in the TALSPEAK aqueous environment. Thermodynamic parameters describing the separation chemistry of this process thus require further examination at conditions significantly removed from conventional ideal systems commonly employed in fundamental solution chemistry. Such thermodynamic characterization is the key to predictive modelling of TALSPEAK. Improved understanding will, in principle, allow process technologists to more efficiently respond to off-normal conditions during large scale process operation. In this report, the results of calorimetric and potentiometric investigations of the effects of aqueous electrolytes on the thermodynamic parameters for lactate protonation and lactate complexation of americium and neodymium will be presented. Studies on the lactate protonation equilibrium will clearly illustrate distinct thermodynamic variations between strong electrolyte aqueous systems and buffered lactate environment.

  4. European accelerator facilities for single event effects testing

    Energy Technology Data Exchange (ETDEWEB)

    Adams, L; Nickson, R; Harboe-Sorensen, R [ESA-ESTEC, Noordwijk (Netherlands); Hajdas, W; Berger, G

    1997-03-01

    Single event effects are an important hazard to spacecraft and payloads. The advances in component technology, with shrinking dimensions and increasing complexity will give even more importance to single event effects in the future. The ground test facilities are complex and expensive and the complexities of installing a facility are compounded by the requirement that maximum control is to be exercised by users largely unfamiliar with accelerator technology. The PIF and the HIF are the result of experience gained in the field of single event effects testing and represent a unique collaboration between space technology and accelerator experts. Both facilities form an essential part of the European infrastructure supporting space projects. (J.P.N.)

  5. Complexity measures of music

    Science.gov (United States)

    Pease, April; Mahmoodi, Korosh; West, Bruce J.

    2018-03-01

    We present a technique to search for the presence of crucial events in music, based on the analysis of the music volume. Earlier work on this issue was based on the assumption that crucial events correspond to the change of music notes, with the interesting result that the complexity index of the crucial events is mu ~ 2, which is the same inverse power-law index as the dynamics of the brain. The search technique analyzes music volume and confirms the results of the earlier work, thereby contributing to the explanation as to why the brain is sensitive to music, through the phenomenon of complexity matching. Complexity matching has recently been interpreted as the transfer of multifractality from one complex network to another. For this reason we also examine the multifractality of music, with the observation that the multifractal spectrum of a computer performance is significantly narrower than the multifractal spectrum of a human performance of the same musical score. We conjecture that although crucial events are demonstrably important for information transmission, they alone are not sufficient to define musicality, which is more adequately measured by the multifractal spectrum.

  6. Social anxiety and post-event processing among African-American individuals.

    Science.gov (United States)

    Buckner, Julia D; Dean, Kimberlye E

    2017-03-01

    Social anxiety is among the most prevalent psychiatric conditions, yet little attention has been paid to whether putative cognitive vulnerability factors related to social anxiety in predominantly White samples are related to social anxiety among historically underrepresented groups. We tested whether one such vulnerability factor, post-event processing (PEP; detailed review of social event that can increase state social anxiety) was related to social anxiety among African-American (AA; n = 127) persons, who comprise one of the largest underrepresented racial groups in the U.S. Secondarily, we tested whether AA participants differed from non-Hispanic White participants (n = 127) on PEP and social anxiety and whether race moderated the relation between PEP and social anxiety. Data were collected online among undergraduates. PEP was positively correlated with social anxiety among AA participants, even after controlling for depression and income, pr = .30, p = .001. AA and White participants did not differ on social anxiety or PEP, β = -1.57, 95% CI: -5.11, 1.96. The relation of PEP to social anxiety did not vary as a function of race, β = 0.00, 95% CI: -0.02, 0.02. PEP may be an important cognitive vulnerability factor related to social anxiety among AA persons suffering from social anxiety.

  7. The Associative Structure of Memory for Multi-Element Events

    Science.gov (United States)

    2013-01-01

    The hippocampus is thought to be an associative memory “convergence zone,” binding together the multimodal elements of an experienced event into a single engram. This predicts a degree of dependency between the retrieval of the different elements comprising an event. We present data from a series of studies designed to address this prediction. Participants vividly imagined a series of person–location–object events, and memory for these events was assessed across multiple trials of cued retrieval. Consistent with the prediction, a significant level of dependency was found between the retrieval of different elements from the same event. Furthermore, the level of dependency was sensitive both to retrieval task, with higher dependency during cued recall than cued recognition, and to subjective confidence. We propose a simple model, in which events are stored as multiple pairwise associations between individual event elements, and dependency is captured by a common factor that varies across events. This factor may relate to between-events modulation of the strength of encoding, or to a process of within-event “pattern completion” at retrieval. The model predicts the quantitative pattern of dependency in the data when changes in the level of guessing with retrieval task and confidence are taken into account. Thus, we find direct behavioral support for the idea that memory for complex multimodal events depends on the pairwise associations of their constituent elements and that retrieval of the various elements corresponding to the same event reflects a common factor that varies from event to event. PMID:23915127

  8. Reliability of the peer-review process for adverse event rating.

    Directory of Open Access Journals (Sweden)

    Alan J Forster

    Full Text Available Adverse events are poor patient outcomes caused by medical care. Their identification requires the peer review of poor outcomes, which may be unreliable. Combining physician ratings might improve the accuracy of adverse event classification. Our aims were to evaluate the variation in peer-reviewer ratings of adverse outcomes; determine the impact of this variation on estimates of reviewer accuracy; and determine the number of reviewers judging that an adverse event occurred that is required to ensure that the true probability of an adverse event exceeded 50%, 75% or 95%. Thirty physicians rated 319 case reports giving details of poor patient outcomes following hospital discharge. They rated whether medical management caused the outcome using a six-point ordinal scale. We conducted latent class analyses to estimate the prevalence of adverse events as well as the sensitivity and specificity of each reviewer. We used this model and Bayesian calculations to determine the probability that an adverse event truly occurred for each patient as a function of the number of positive ratings. The overall median score on the 6-point ordinal scale was 3 (IQR 2, 4), but the individual rater median score ranged from a minimum of 1 (in four reviewers) to a maximum median score of 5. The overall percentage of cases rated as an adverse event was 39.7% (3798/9570). The median kappa for all pair-wise combinations of the 30 reviewers was 0.26 (IQR 0.16, 0.42; Min = -0.07, Max = 0.62). Reviewer sensitivity and specificity for adverse event classification ranged from 0.06 to 0.93 and from 0.50 to 0.98, respectively. The estimated prevalence of adverse events using a latent class model with a common sensitivity and specificity for all reviewers (0.64 and 0.83, respectively) was 47.6%. For patients to have a 95% chance of truly having an adverse event, at least 3 of 3 reviewers are required to deem the outcome an adverse event. Adverse event classification is unreliable. To be certain that a case
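    The Bayesian step described above can be sketched as follows, treating reviewers as conditionally independent given the true state and reusing the abstract's common sensitivity (0.64), specificity (0.83) and prevalence (47.6%); this is a minimal illustration, not the authors' latent class code:

```python
from math import comb

def posterior_ae(k, n, sens=0.64, spec=0.83, prev=0.476):
    """Probability that an adverse event truly occurred, given that
    k of n conditionally independent reviewers rated it positive."""
    like_ae = comb(n, k) * sens**k * (1 - sens)**(n - k)   # P(ratings | AE)
    like_no = comb(n, k) * (1 - spec)**k * spec**(n - k)   # P(ratings | no AE)
    return prev * like_ae / (prev * like_ae + (1 - prev) * like_no)

print(round(posterior_ae(3, 3), 3))  # 3 of 3 positive ratings
print(round(posterior_ae(2, 3), 3))  # 2 of 3 positive ratings
```

    With these inputs, 3 of 3 positive ratings push the posterior above 0.95, while 2 of 3 do not, matching the abstract's conclusion.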

  9. Knowledge-based inspection:modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is ever more a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and verification activities, used to offset potential misuse of nuclear technology, will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized, enhancing

  10. Complexity of deciding detectability in discrete event systems

    Czech Academy of Sciences Publication Activity Database

    Masopust, Tomáš

    2018-01-01

    Roč. 93, July (2018), s. 257-261 ISSN 0005-1098 Institutional support: RVO:67985840 Keywords : discrete event systems * finite automata * detectability Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 5.451, year: 2016 https://www.sciencedirect.com/science/article/pii/S0005109818301730

  12. Event detection intelligent camera development

    International Nuclear Information System (INIS)

    Szappanos, A.; Kocsis, G.; Molnar, A.; Sarkozi, J.; Zoletnik, S.

    2008-01-01

    A new camera system, the 'event detection intelligent camera' (EDICAM), is being developed for the video diagnostics of the W-7X stellarator; it consists of 10 distinct, standalone measurement channels, each holding a camera. Different operation modes will be implemented for both continuous and triggered readout. Hardware-level trigger signals will be generated from real-time image processing algorithms optimized for digital signal processor (DSP) and field programmable gate array (FPGA) architectures. At full resolution, a camera sends 12-bit sampled 1280 x 1024 pixel frames at 444 fps, which amounts to 1.43 terabytes over half an hour. Analysing such a huge amount of data is time consuming and computationally complex; we plan to overcome this problem with EDICAM's preprocessing concepts. The EDICAM camera system integrates the advantages of CMOS sensor chip technology and fast network connections. EDICAM is built up from three modules with two interfaces: a sensor module (SM) with reduced hardware and functional elements, for a small, compact size and robust operation in a harmful environment; an image processing and control unit (IPCU) module, which handles the user-predefined events and runs image processing algorithms to generate trigger signals; and a 10 Gigabit Ethernet compatible image readout card that functions as the network interface for the PC. In this contribution all the concepts of EDICAM and the functions of the distinct modules are described
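    The quoted data rate can be checked directly (interpreting the original "Terabyte" figure as the binary unit, TiB):

```python
frame_pixels = 1280 * 1024   # sensor resolution
bits_per_px = 12             # 12-bit sampling = 1.5 bytes per pixel
fps = 444                    # frames per second at full resolution
seconds = 30 * 60            # half an hour

bytes_per_second = frame_pixels * bits_per_px / 8 * fps
total_tib = bytes_per_second * seconds / 2**40
print(round(total_tib, 2))   # ~1.43, matching the abstract
```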

  13. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.

  14. Attenuation of deep semantic processing during mind wandering: an event-related potential study.

    Science.gov (United States)

    Xu, Judy; Friedman, David; Metcalfe, Janet

    2018-03-21

    Although much research shows that early sensory and attentional processing is affected by mind wandering, the effect of mind wandering on deep (i.e. semantic) processing is relatively unexplored. To investigate this relation, we recorded event-related potentials as participants studied English-Spanish word pairs, one at a time, while being intermittently probed as to whether they were 'on task' or 'mind wandering'. Both perceptual processing, indexed by the P2 component, and deep processing, indexed by a late, sustained slow wave maximal at parietal electrodes, were attenuated during periods preceding participants' mind-wandering reports. The pattern when participants were on task, rather than mind wandering, is similar to the subsequent memory or difference in memory effect. These results support previous findings of sensory attenuation during mind wandering, and extend them to a long-duration slow wave, suggesting that deeper and more sustained levels of processing are also disrupted.

  15. Computing return times or return periods with rare event algorithms

    Science.gov (United States)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events, which so far have typically computed probabilities rather than return times. The approach we propose provides a computationally efficient way to estimate the return times of rare events for a dynamical system, reducing computational cost by several orders of magnitude. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
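    A minimal sketch of the block-maxima estimator described above, applied to a brute-force sample of a discretized Ornstein–Uhlenbeck process (the parameter choices are illustrative, and the rare-event sampling algorithms themselves are beyond a short example):

```python
import math
import random

def ou_series(n, dt=0.01, theta=1.0, sigma=1.0, seed=1):
    """Euler-Maruyama sample path of an Ornstein-Uhlenbeck process
    with stationary standard deviation sigma."""
    rng = random.Random(seed)
    s = sigma * math.sqrt(2 * theta * dt)
    x, out = 0.0, []
    for _ in range(n):
        x += -theta * x * dt + s * rng.gauss(0, 1)
        out.append(x)
    return out

def return_time(series, dt, block_len, a):
    """Block-maxima estimator: with p the fraction of blocks whose
    maximum exceeds a, r(a) = -T_b / ln(1 - p), T_b = block duration."""
    t_b = block_len * dt
    maxima = [max(series[i:i + block_len])
              for i in range(0, len(series) - block_len + 1, block_len)]
    p = sum(m > a for m in maxima) / len(maxima)
    if p in (0.0, 1.0):
        raise ValueError("threshold outside the sampled range of maxima")
    return -t_b / math.log(1 - p)
```

    Return times grow steeply with the threshold; the rare-event algorithms discussed in the paper replace the brute-force sampling step, not the estimator.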

  16. Isotopic constraints on contamination processes in the Tonian Goiás Stratiform Complex

    Science.gov (United States)

    Giovanardi, Tommaso; Mazzucchelli, Maurizio; Lugli, Federico; Girardi, Vicente A. V.; Correia, Ciro T.; Tassinari, Colombo C. G.; Cipriani, Anna

    2018-06-01

    The Tonian Goiás Stratiform Complex (TGSC, Goiás, central Brazil) is one of the largest mafic-ultramafic layered complexes in the world, emplaced during the geotectonic events that led to the Gondwana accretion. In this study, we present trace elements and in-situ U/Pb-Lu-Hf analyses of zircons and 87Sr/86Sr ratios of plagioclases from anorthosites and gabbros of the TGSC. Although the TGSC comprises three isolated bodies (Cana Brava, Niquelândia and Barro Alto) and is characterized by a Lower and an Upper Sequence (LS and US), our new U/Pb zircon data confirm recent geochemical, geochronological and structural evidence that the TGSC originated from a single intrusive body in the Neoproterozoic. New Hf and Sr isotope ratios indicate a complex contamination history for the TGSC, with different geochemical signatures in the two sequences. The low Hf and high Sr isotope ratios of the Lower Sequence (εHf(t) from -4.2 down to -27.5; 87Sr/86Sr = 0.706605-0.729226) suggest the presence of a crustal component and are consistent with contamination from meta-pelitic and calc-silicate rocks found as xenoliths within the sequence. The more radiogenic Hf isotope ratios and low Sr isotope composition of the Upper Sequence (εHf(t) from 11.3 down to -8.4; 87Sr/86Sr = 0.702368-0.702452) suggest contamination from mantle-derived metabasalts, in agreement with the occurrence of amphibolite xenoliths in the US stratigraphy. The differential contamination of the two sequences is explained by the intrusion of the TGSC into a stratified crust dominated by metasedimentary rocks in its deeper part and metavolcanics at shallower levels. Moreover, the differential thermal gradient in the two crystallizing sequences might have contributed to the preservation and recrystallization of inherited zircon grains in the US and the total dissolution or magmatic overgrowth of the LS zircons via melt/rock reaction processes.

  17. ATM-Dependent Phosphorylation of All Three Members of the MRN Complex: From Sensor to Adaptor.

    Science.gov (United States)

    Lavin, Martin F; Kozlov, Sergei; Gatei, Magtouf; Kijas, Amanda W

    2015-10-23

    The recognition, signalling and repair of DNA double strand breaks (DSBs) involves the participation of a multitude of proteins and post-translational events that ensure maintenance of genome integrity. Amongst the proteins involved are several which, when mutated, give rise to genetic disorders characterised by chromosomal abnormalities, cancer predisposition, neurodegeneration and other pathologies. ATM (mutated in ataxia-telangiectasia, A-T) and members of the Mre11/Rad50/Nbs1 (MRN) complex play key roles in this process. The MRN complex rapidly recognises and localises to DNA DSBs, where it acts to recruit and assist in ATM activation. ATM, in the company of several other DNA damage response proteins, in turn phosphorylates all three members of the MRN complex to initiate downstream signalling. While ATM has hundreds of substrates, members of the MRN complex play a pivotal role in mediating the downstream signalling events that give rise to cell cycle control, DNA repair and ultimately cell survival or apoptosis. Here we focus on the interplay between ATM and the MRN complex in initiating signalling of breaks and, more specifically, on the adaptor role of the MRN complex in mediating ATM signalling to downstream substrates to control different cellular processes.

  18. Self-Complexity, Daily Events, and Perceived Quality of Life.

    Science.gov (United States)

    Kardash, CarolAnne M.; Okun, Morris A.

    Recent research has demonstrated that self-cognitions can play an important role in physical and emotional well-being. One important aspect of self-cognition concerns the complexity of self-representations. This study tested the hypothesis that self-complexity, as assessed by Linville's self-trait sorting task, would moderate the effects of…

  19. Study of the degradation of organic molecules complexing radionuclides by using Advanced Oxidation Processes

    International Nuclear Information System (INIS)

    Rekab, K.

    2014-01-01

    This research thesis reports the study of the application of two AOPs (Advanced Oxidation Processes) to degrade and mineralise organic molecules which complex radio-elements, and thus to allow their concentration by trapping on mineral matrices. EDTA (ethylene diamine tetraacetic acid) is chosen as the reference organic complexing agent for preliminary tests performed with inactive cobalt-59 before addressing actual nuclear effluents with active cobalt-60. The author first presents the industrial context (existing nuclear wastes, notably liquid effluents and their processing) and proposes an overview of the state of the art on adsorption and precipitation of cobalt (natural and radioactive isotopes). Then, the author presents the characteristics of the various studied oxides, the photochemical reactor used to perform tests, experimental techniques and operational modes. Results are then presented regarding various issues: adsorption of EDTA and the Co-EDTA complex, and cobalt precipitation; determination of the lamp photon flux by chemical actinometry and by the Keitz method; efficiency of different processes (UV, UV/TiO2, UV/H2O2) to degrade EDTA and the Co-EDTA complex; and processing of a nuclear effluent coming from the La Hague pools, with determination of decontamination factors

  20. Cascade of links in complex networks

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Yeqian; Sun, Bihui [Department of Management Science, School of Government, Beijing Normal University, 100875 Beijing (China); Zeng, An, E-mail: anzeng@bnu.edu.cn [School of Systems Science, Beijing Normal University, 100875 Beijing (China)

    2017-01-30

    Cascading failure is an important process which has been widely used to model catastrophic events such as blackouts and financial crises in real systems. However, so far most of the studies in the literature focus on the cascading process on nodes, leaving the possibility of link cascades overlooked. In many real cases, the catastrophic events are actually formed by the successive disappearance of links. Examples exist in financial systems, where the firms and banks (i.e. nodes) still exist but many financial trades (i.e. links) are gone during a crisis, and in air transportation systems, where the airports (i.e. nodes) are still functional but many airlines (i.e. links) stop operating during bad weather. In this letter, we develop a link cascade model in complex networks. With this model, we find that both artificial and real networks tend to collapse even if only a few links are initially attacked. However, the link cascading process can be effectively terminated by setting a few strong nodes in the network which do not respond to any link reduction. Finally, a simulated annealing algorithm is used to optimize the location of these strong nodes, which significantly improves the robustness of the networks against the link cascade. - Highlights: • We propose a link cascade model in complex networks. • Both artificial and real networks tend to collapse even if a few links are initially attacked. • The link cascading process can be effectively terminated by setting a few strong nodes. • A simulated annealing algorithm is used to optimize the location of these strong nodes.
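    A deliberately simplified toy version of a link cascade (not the authors' exact model): here a non-strong node that loses more than a tolerance fraction of its initial degree sheds its remaining links, and strong nodes never respond to link reduction:

```python
def link_cascade(edge_list, seed_links, tol=0.7, strong=frozenset()):
    """Toy link cascade: after the seed links fail, any non-strong node
    whose remaining degree drops below tol * its initial degree sheds
    all of its remaining links; repeat until nothing changes.
    Returns the set of surviving links."""
    links = {frozenset(e) for e in edge_list}
    d0 = {}                                   # initial degrees
    for e in links:
        for n in e:
            d0[n] = d0.get(n, 0) + 1
    links -= {frozenset(e) for e in seed_links}
    changed = True
    while changed:
        changed = False
        deg = dict.fromkeys(d0, 0)            # current degrees
        for e in links:
            for n in e:
                deg[n] += 1
        for n in d0:
            if n not in strong and 0 < deg[n] < tol * d0[n]:
                links = {e for e in links if n not in e}
                changed = True
    return links

ring = [(i, (i + 1) % 10) for i in range(10)]
print(len(link_cascade(ring, [(0, 1)])))      # prints 0: full collapse
print(len(link_cascade(ring, [(0, 1)],        # prints 9: strong nodes stop
                       strong=frozenset(range(10)))))  # the cascade
```

    On a ring, a single failed link collapses the whole network, while declaring every node strong confines the damage to the seed link, mirroring the qualitative result reported in the abstract.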

  2. Events as spaces for upgrading : Automotive events in Shanghai

    NARCIS (Netherlands)

    E. van Tuijl (Erwin); K. Dittrich (Koen)

    2015-01-01

    textabstractThis study contributes to the literature dealing with upgrading of the Chinese automotive industry by analysing the role of events in the upgrading process. By combining literature on temporary clusters with that of knowledge sourcing and upgrading, we investigate how firms use events

  3. WAITING TIME DISTRIBUTION OF SOLAR ENERGETIC PARTICLE EVENTS MODELED WITH A NON-STATIONARY POISSON PROCESS

    International Nuclear Information System (INIS)

    Li, C.; Su, W.; Fang, C.; Zhong, S. J.; Wang, L.

    2014-01-01

    We present a study of the waiting time distributions (WTDs) of solar energetic particle (SEP) events observed with the spacecraft WIND and GOES. The WTDs of both solar electron events (SEEs) and solar proton events (SPEs) display a power-law tail of ∼Δt^(−γ). The SEEs display a broken power-law WTD, with index γ1 = 0.99 for short waiting times (<70 hr) and γ2 = 1.92 for long waiting times (>100 hr). The break in the WTD of SEEs is probably due to the modulation of the corotating interaction regions. A power-law index γ ∼ 1.82 is derived for the WTD of the SPEs, which is consistent with the WTD of type II radio bursts, indicating a close relationship between the shock wave and the production of energetic protons. The WTDs of SEP events can be modeled with a non-stationary Poisson process, which was proposed to understand the waiting time statistics of solar flares. We generalize the method and find that, if the SEP event rate λ = 1/Δt varies with a rate distribution f(λ) = Aλ^(−α)exp(−βλ), the time-dependent Poisson distribution can produce a power-law tail WTD of ∼Δt^(α−3), where 0 ≤ α < 2
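    The quoted tail exponent follows from a short calculation, assuming the standard convention in which each rate λ is weighted by the number of events it contributes (one factor of λ from the exponential interval density λe^(−λΔt), and a second from the event-rate weighting):

```latex
P(\Delta t) \;\propto\; \int_0^\infty \lambda^2\, e^{-\lambda \Delta t}\, f(\lambda)\, \mathrm{d}\lambda
\;=\; A \int_0^\infty \lambda^{2-\alpha}\, e^{-\lambda(\Delta t + \beta)}\, \mathrm{d}\lambda
\;=\; A\,\Gamma(3-\alpha)\,(\Delta t + \beta)^{\alpha-3}
\;\sim\; \Delta t^{\alpha-3}, \qquad \Delta t \gg \beta .
```

    Convergence of the integral requires α < 3, which is satisfied throughout the quoted range 0 ≤ α < 2.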

  4. Application of IAEA's International Nuclear Event Scale to events at testing/research reactors in Japan

    International Nuclear Information System (INIS)

    Nozawa, Masao; Watanabe, Norio

    1999-01-01

    The International Nuclear Event Scale (INES) is a means for providing prompt, clear and consistent information related to nuclear events and facilitating communication between the nuclear community, the media and the public on such events. This paper describes the INES rating process for events at testing/research reactors and nuclear fuel processing facilities and experience on the application of the INES scale in Japan. (author)

  5. The light-makeup advantage in facial processing: Evidence from event-related potentials

    OpenAIRE

    Tagai, Keiko; Shimakura, Hitomi; Isobe, Hiroko; Nittono, Hiroshi

    2017-01-01

    The effects of makeup on attractiveness have been evaluated using mainly subjective measures. In this study, event-related brain potentials (ERPs) were recorded from a total of 45 Japanese women (n = 23 and n = 22 for Experiment 1 and 2, respectively) to examine the neural processing of faces with no makeup, light makeup, and heavy makeup. To have the participants look at each face carefully, an identity judgement task was used: they were asked to judge whether the two faces presented in succ...

  6. Impacts of natural events and processes on groundwater flow conditions: a case study in the Horonobe Area, Hokkaido, Northern Japan

    International Nuclear Information System (INIS)

    Niizato, T.; Yasue, K.I.; Kurikami, H.

    2009-01-01

    In order to assess the long-term stability of geological environments over several hundred thousand years, it is important to consider the influence of natural events and processes, such as uplift, subsidence, denudation and climate change, especially in a tectonically active region such as Japan. This study presents a conceptual model of the future natural events and processes with potential impacts on groundwater flow conditions in the Horonobe area, Hokkaido, northern Japan, on the basis of the neo-tectonics, palaeogeography, palaeo-climate, historical development of landforms, and present state of groundwater flow conditions. We conclude that it is important to consider interactions among natural events and processes when describing the best possible approximation of the time variation of the geological environment. (authors)

  7. A Hybrid Methodology for Modeling Risk of Adverse Events in Complex Health-Care Settings.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali; Dierks, Meghan

    2017-03-01

    In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain, this can be extremely challenging due to the wide variability in the way that health-care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors and also physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AE. 
BBNs are networks of probabilities that can capture probabilistic relations
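    As a toy illustration of the BBN side of such a hybrid model, consider a hypothetical two-parent fragment; all node names and probabilities below are invented for illustration and are not taken from the study:

```python
# Assumed priors on two risk factors (invented values)
p_workload_high = 0.3
p_acuity_high = 0.4

# Assumed conditional probability table: P(AE | workload_high, acuity_high)
p_ae = {
    (True, True): 0.12,
    (True, False): 0.06,
    (False, True): 0.05,
    (False, False): 0.02,
}

# Marginalize over the parents to get the baseline adverse event risk
marginal = sum(
    p_ae[(w, a)]
    * (p_workload_high if w else 1 - p_workload_high)
    * (p_acuity_high if a else 1 - p_acuity_high)
    for w in (True, False) for a in (True, False)
)
print(round(marginal, 4))  # baseline P(AE) = 0.0476
```

    In the full methodology, the SD layer would feed time-varying organizational states into priors like these, so the marginal risk evolves as policies and workload change.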

  8. Synthesis of uranium and thorium dioxides by Complex Sol-Gel Processes (CSGP). Synthesis of uranium oxides by Complex Sol-Gel Processes (CSGP)

    International Nuclear Information System (INIS)

    Deptula, A.; Brykala, M.; Lada, W.; Olczak, T.; Wawszczak, D.; Chmielewski, A.G.; Modolo, G.; Daniels, H.

    2010-01-01

    In the Institute of Nuclear Chemistry and Technology (INCT), a new method for the synthesis of uranium and thorium dioxides by an original variant of the sol-gel method, the Complex Sol-Gel Process (CSGP), has been developed. The main modification is the formation of nitrate-ascorbate sols from components alkalized by aqueous ammonia. These sols were gelled into: - irregular agglomerates, by evaporation of water; - medium-sized microspheres (diameter <150), by the IChTJ variant of the sol-gel process, in which water is extracted from drops of sols emulsified in 2-ethylhexanol-1 by this solvent. Uranium dioxide was obtained by reduction of the gels with hydrogen at temperatures >700 deg. C, while thorium dioxide was obtained by simple calcination in air. (authors)

  9. Mapping stochastic processes onto complex networks

    International Nuclear Information System (INIS)

    Shirazi, A H; Reza Jafari, G; Davoudi, J; Peinke, J; Reza Rahimi Tabar, M; Sahimi, Muhammad

    2009-01-01

    We introduce a method by which stochastic processes are mapped onto complex networks. As examples, we construct the networks for such time series as those for free-jet and low-temperature helium turbulence, the German stock market index (the DAX), and white noise. The networks are further studied by contrasting their geometrical properties, such as the mean length, diameter, clustering, and average number of connections per node. By comparing the network properties of the original time series investigated with those for the shuffled and surrogate series, we are able to quantify the effect of the long-range correlations and the fatness of the probability distribution functions of the series on the networks constructed. Most importantly, we demonstrate that the time series can be reconstructed with high precision by means of a simple random walk on their corresponding networks
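    One simple way to realize such a mapping (not necessarily the construction used by the authors) is to coarse-grain amplitudes into bins that become nodes, count observed transitions as weighted edges, and reconstruct a surrogate series as a random walk on the resulting network:

```python
import random

def series_to_network(series, n_bins=10):
    """Map a time series onto a weighted transition network:
    nodes are amplitude bins, edges count observed transitions."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0
    bins = [min(int((x - lo) / width), n_bins - 1) for x in series]
    edges = {}
    for a, b in zip(bins, bins[1:]):
        edges[(a, b)] = edges.get((a, b), 0) + 1
    return edges

def random_walk(edges, start, steps, seed=0):
    """Reconstruct a surrogate series as a random walk on the network,
    stepping to neighbours with probability proportional to edge weight."""
    rng = random.Random(seed)
    node, path = start, [start]
    for _ in range(steps):
        out = [(b, w) for (a, b), w in edges.items() if a == node]
        if not out:
            break                      # dead end: no outgoing transitions
        nodes, weights = zip(*out)
        node = rng.choices(nodes, weights=weights)[0]
        path.append(node)
    return path

series = [((i * 37) % 100) / 10 for i in range(500)]  # illustrative series
net = series_to_network(series)
surrogate = random_walk(net, start=0, steps=100)
```

    Shuffled or surrogate series, as in the paper, would produce networks with visibly different clustering and degree statistics under the same mapping.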

  10. Application of declarative modeling approaches for external events

    International Nuclear Information System (INIS)

    Anoba, R.C.

    2005-01-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and the subsequent IPE/IPEEE assessments, the NRC has issued several Regulatory Guides, such as RG 1.174, to describe the use of PSA in risk-informed regulation activities. Most PSAs have the capability to address internal events, including internal floods. As more demands are placed on using the PSA to support risk-informed applications, there has been a growing need to integrate other external events (seismic, fire, etc.) into the logic models. Most external events involve spatial dependencies and usually impact the logic models at the component level. Therefore, manual insertion of external event impacts into a complex integrated fault tree model may be too cumbersome for routine uses of the PSA. Within the past year, a declarative modeling approach has been developed to automate the injection of external events into the PSA. The intent of this paper is to introduce the concept of declarative modeling in the context of external event applications. A declarative modeling approach involves the definition of rules for the injection of external event impacts into the fault tree logic. A software tool such as EPRI's XInit program can be used to interpret the pre-defined rules and automatically inject external event elements into the PSA. The injection process can easily be repeated, as required, to address plant changes, sensitivity issues, changes in boundary conditions, etc. External event elements may include fire initiating events, seismic initiating events, seismic fragilities, fire-induced hot short events, special human failure events, etc. This approach has been applied at a number of US nuclear power plants including a nuclear power plant in Romania. (authors)
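    The rule-driven injection idea can be sketched in miniature; the tree encoding, rule format, and event names below are invented for illustration and are unrelated to the actual XInit rule language:

```python
# Hypothetical sketch: a fault tree is a nested dict {gate: [children]},
# leaves are basic event names; a rule (pattern, event) OR-s an
# external-event basic event into every leaf whose name matches.

def inject(tree, rules):
    """Return a copy of the fault tree with an external-event basic event
    OR-ed in at every basic event whose name matches a rule pattern."""
    if isinstance(tree, str):                      # basic event leaf
        extra = [ev for pattern, ev in rules if pattern in tree]
        return {"OR": [tree] + extra} if extra else tree
    (gate, children), = tree.items()               # single-gate node
    return {gate: [inject(c, rules) for c in children]}

pump_tree = {"AND": ["PUMP-A-FAILS", "PUMP-B-FAILS"]}
rules = [("PUMP", "SEISMIC-ANCHOR-FAILURE")]       # pattern -> event
print(inject(pump_tree, rules))
```

    Because the rules live outside the tree, re-running the injection after a plant change or a boundary-condition update is a single pass, which is the repeatability the abstract highlights.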

  11. Ionic flotation of complexing oxyanions. Thermodynamics of uranyl complexation in a sulfuric medium. Definition of selectivity conditions of the process

    International Nuclear Information System (INIS)

    Bouzat, G.

    1987-01-01

    The oxyanion ionic flotation process with dodecylamine hydrochloride as collector is studied by investigation of flotation and filtration recovery curves, suspension turbidity, conductimetric measurements, and the solubility of ionic species. The process is controlled by chemical precipitation reactions and by adsorption of surfactant, conferring hydrophobic or hydrophilic surface properties to the solid phase as one or two monolayers of surfactant are successively adsorbed. Equilibrium constants (in terms of activity) of the uranium(VI) complexation with sulphate oxyanions are calculated through a Raman spectroscopic study of uranyl sulphate aqueous solutions. The complexation results in a shift of the symmetrical stretching vibration band of UO2(2+) to lower wavenumbers and an increase of its cross-section. The solubility curves of ionic species in the precipitation of uranyl-sulphate complexes by dodecylamine hydrochloride are modelled. Knowledge of the solubility products, the activity coefficients of the species, and the critical micellar concentration of the surfactant allows the modelling of flotation recovery curves. Temperature and collector structure modifications are studied as optimization parameters, and uranium extraction from mine drainage water is carried out. The residual concentration of surfactant is considerably lowered by adsorption on montmorillonite.

  12. Children's Verbal Working Memory: Role of Processing Complexity in Predicting Spoken Sentence Comprehension

    Science.gov (United States)

    Magimairaj, Beula M.; Montgomery, James W.

    2012-01-01

    Purpose: This study investigated the role of processing complexity of verbal working memory tasks in predicting spoken sentence comprehension in typically developing children. Of interest was whether simple and more complex working memory tasks have similar or different power in predicting sentence comprehension. Method: Sixty-five children (6- to…

  13. Role of ion chromatograph in nuclear fuel fabrication process at Nuclear Fuel Complex

    International Nuclear Information System (INIS)

    Balaji Rao, Y.; Prasada Rao, G.; Prahlad, B.; Saibaba, N.

    2012-01-01

    The present paper discusses the different applications of ion chromatography (IC) in the nuclear fuel fabrication process at Nuclear Fuel Complex. Further applications of IC for the characterization of nuclear materials, currently at different stages of method development at the Control Laboratory, Nuclear Fuel Complex, are also highlighted.

  14. Social Anxiety and Post-Event Processing: The Impact of Race

    Science.gov (United States)

    Buckner, Julia D.; Dean, Kimberlye E.

    2016-01-01

    Background Social anxiety is among the most prevalent psychiatric conditions, yet little attention has been paid to whether putative cognitive vulnerability factors related to social anxiety in predominantly White samples are related to social anxiety among historically underrepresented groups. Design We tested whether one such vulnerability factor, post-event processing (PEP; detailed review of social event that can increase state social anxiety) was related to social anxiety among African American (AA; n=127) persons, who comprise one of the largest underrepresented racial groups in the U.S. Secondarily, we tested whether AA participants differed from non-Hispanic White participants (n=127) on PEP and social anxiety and whether race moderated the relation between PEP and social anxiety. Method Data were collected online among undergraduates. Results PEP was positively correlated with social anxiety among AA participants, even after controlling for depression and income, pr=.30, p=.001. AA and White participants did not differ on social anxiety or PEP, β=−1.57, 95% C.I.: −5.11, 1.96. The relation of PEP to social anxiety did not vary as a function of race, β=0.00, 95% C.I.: −0.02, 0.02. Conclusions PEP may be an important cognitive vulnerability factor related to social anxiety among AA persons suffering from social anxiety. PMID:27576610
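
    The partial correlation reported above (pr = .30, controlling for depression and income) can be illustrated with one common residual-based definition: correlate the two variables after regressing the covariates out of each. The sketch below uses synthetic data; variable names and coefficients are invented for illustration.

    ```python
    # Partial correlation via residuals: Pearson r between the parts of x and y
    # not explained (by OLS) by the covariates. Data are synthetic.
    import numpy as np

    def partial_corr(x, y, covariates):
        """Pearson r between residuals of x and y after OLS on the covariates."""
        Z = np.column_stack([np.ones(len(x))] + list(covariates))
        rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
        return float(np.corrcoef(rx, ry)[0, 1])

    rng = np.random.default_rng(0)
    dep = rng.normal(size=200)                        # covariate (e.g. depression)
    pep = 0.5 * dep + rng.normal(size=200)            # PEP, partly driven by the covariate
    anx = 0.4 * pep + 0.3 * dep + rng.normal(size=200)
    pr = partial_corr(pep, anx, [dep])                # PEP-anxiety link net of the covariate
    ```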

  15. Effects of Grammatical Categories on Children's Visual Language Processing: Evidence from Event-Related Brain Potentials

    Science.gov (United States)

    Weber-Fox, Christine; Hart, Laura J.; Spruill, John E., III

    2006-01-01

    This study examined how school-aged children process different grammatical categories. Event-related brain potentials elicited by words in visually presented sentences were analyzed according to seven grammatical categories with naturally varying characteristics of linguistic functions, semantic features, and quantitative attributes of length and…

  16. Context processing in adolescents with autism spectrum disorder: How complex could it be?

    Science.gov (United States)

    Ben-Yosef, Dekel; Anaki, David; Golan, Ofer

    2017-03-01

    The ability of individuals with Autism Spectrum Disorder (ASD) to process context has long been debated: According to the Weak Central Coherence theory, ASD is characterized by poor global processing, and consequently-poor context processing. In contrast, the Social Cognition theory argues individuals with ASD will present difficulties only in social context processing. The complexity theory of autism suggests context processing in ASD will depend on task complexity. The current study examined this controversy through two priming tasks, one presenting human stimuli (facial expressions) and the other presenting non-human stimuli (animal faces). Both tasks presented visual targets, preceded by congruent, incongruent, or neutral auditory primes. Local and global processing were examined by presenting the visual targets in three spatial frequency conditions: High frequency, low frequency, and broadband. Tasks were administered to 16 adolescents with high functioning ASD and 16 matched typically developing adolescents. Reaction time and accuracy were measured for each task in each condition. Results indicated that individuals with ASD processed context for both human and non-human stimuli, except in one condition, in which human stimuli had to be processed globally (i.e., target presented in low frequency). The task demands presented in this condition, and the performance deficit shown in the ASD group as a result, could be understood in terms of cognitive overload. These findings provide support for the complexity theory of autism and extend it. Our results also demonstrate how associative priming could support intact context processing of human and non-human stimuli in individuals with ASD. Autism Res 2017, 10: 520-530. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  17. Complement-mediated solubilization of immune complexes and their interaction with complement C3 receptors

    DEFF Research Database (Denmark)

    Petersen, Ivan; Baatrup, Gunnar; Jepsen, H H

    1985-01-01

    Some of the molecular events in the complement (C)-mediated solubilization of immune complexes (IC) have been clarified in recent years. The solubilization is primarily mediated by alternative C pathway proteins, whereas factors in the classical pathway accelerate the process. Knowledge of the cellular localization, expression and structure of the C3 receptors, especially the C3b (CR1) receptor, has been considerably extended in the last few years, whereas our understanding of the physiological role of these receptors is still fragmentary. However, it is becoming increasingly evident...

  18. FLCNDEMF: An Event Metamodel for Flood Process Information Management under the Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Nengcheng Chen

    2015-06-01

    Full Text Available Significant economic losses, large affected populations, and serious environmental damage caused by recurrent natural disaster events (NDE) worldwide indicate insufficiency in emergency preparedness and response. The barrier of full life cycle data preparation and information support is one of the main reasons. This paper adopts the method of integrated environmental modeling, incorporates information from existing event protocols, languages, and models, analyzes observation demands from different event stages, and forms the abstract full life cycle natural disaster event metamodel (FLCNDEM) based on the meta-object facility. A task library and knowledge base for floods are then built to instantiate FLCNDEM, forming the FLCNDEM for floods (FLCNDEMF). FLCNDEMF is formalized according to the Event Pattern Markup Language, and a prototype system, Natural Disaster Event Manager, is developed to assist in template-based modeling and management. The flood in Liangzi (LZ) Lake of Hubei, China on 16 July 2010 is adopted to illustrate how to apply FLCNDEM in real scenarios. FLCNDEM-based modeling is realized, and candidate remote sensing (RS) datasets for different observing missions are provided for the LZ Lake flood. Taking the mission of flood area extraction as an example, the appropriate RS data are selected via the model of simplified general perturbation version 4, and the flood areas in different phases are calculated and displayed on the map. The phase-based modeling and visualization intuitively display the spatial-temporal distribution and the evolution process of the LZ Lake flood, which is of great significance for flood response. In addition, through the extension mechanism, FLCNDEM can also be applied in other environmental applications, providing important support for full life cycle information sharing and rapid response.

  19. Complex Event Extraction using DRUM

    Science.gov (United States)

    2015-10-01

    [Abstract text fragmentary in this record.] Figure 9 caption: Evaluation results for eleven teams; the diamond (◆) represents the results of our system. The remaining fragments reference the Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora (EMNLP/VLC-2000) and the UniProt database.

  20. 10th International Conference on Dependability and Complex Systems

    CERN Document Server

    Mazurkiewicz, Jacek; Sugier, Jarosław; Walkowiak, Tomasz; Kacprzyk, Janusz

    2015-01-01

    Building upon a long tradition of scientific conferences dealing with problems of reliability in technical systems, in 2006 the Department of Computer Engineering at Wrocław University of Technology established the DepCoS-RELCOMEX series of events in order to promote a comprehensive approach to the evaluation of system performability, which is now commonly called dependability. Contemporary complex systems integrate a variety of technical, information, software and human (users, administrators and management) resources. Their complexity comes not only from the technical and organizational structures involved but mainly from the complexity of the information processes that must be implemented in a specific operational environment (data processing, monitoring, management, etc.). In such a case, traditional methods of reliability evaluation focused mainly on technical levels are insufficient, and more innovative, multidisciplinary methods of dependability analysis must be applied. The selection of submissions for these proceedings exemplifies di...

  1. Numerical simulations of an advection fog event over Shanghai Pudong International Airport with the WRF model

    Science.gov (United States)

    Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun

    2017-10-01

    A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.

  2. Managing complexity in process digitalisation with dynamic condition response graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Debois, Søren; Slaats, Tijs

    2017-01-01

    ...Sadly, it is also witnessed by a number of expensive failed digitalisation projects. In this paper we point to two key problems in state-of-the-art BPM technologies: 1) the use of rigid flow diagrams as the "source code" of process digitalisation is not suitable for managing the complexity of knowledge...

  3. Simulation of Weak Signals of Nanotechnology Innovation in Complex System

    Directory of Open Access Journals (Sweden)

    Sun Hi Yoo

    2018-02-01

    Full Text Available It is especially indispensable for new businesses or industries to predict the innovation of new technologies. This requires an understanding of how the complex process of innovation, which is accomplished through more efficient products, processes, services, technologies, or ideas, is adopted and diffused in the market, government, and society. Furthermore, detecting "weak signals" (signs of change) in science and technology (S&T) is also important to foretell events associated with innovations in technology. Thus, we explore the dynamic behavior of weak signals of a specific technological innovation using the agent-based simulation tool NetLogo. This study provides a deeper understanding of the early stages of complex technology innovation, and the models are capable of analyzing the initial complex interaction structures between components of technologies and between agents engaged in collective invention.
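
    The kind of agent-based diffusion dynamics explored above (the study itself uses NetLogo) can be illustrated with a minimal Python analogue. This is only a toy model of innovation adoption, not the authors' simulation; all parameter values are invented.

    ```python
    # Toy agent-based diffusion: each non-adopting agent adopts a "signal"
    # either spontaneously (innovation) or by imitating the current adopting
    # fraction. Produces the S-shaped curve typical of innovation diffusion.
    import random

    def simulate(n_agents=500, p_innovate=0.01, p_imitate=0.3, steps=60, seed=1):
        rng = random.Random(seed)
        adopted = [False] * n_agents
        history = []                       # adopting fraction after each step
        for _ in range(steps):
            frac = sum(adopted) / n_agents
            for i in range(n_agents):
                if not adopted[i] and rng.random() < p_innovate + p_imitate * frac:
                    adopted[i] = True
            history.append(sum(adopted) / n_agents)
        return history

    curve = simulate()
    ```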

  4. The First Prototype for the FastTracker Processing Unit

    CERN Document Server

    Andreani, A; The ATLAS collaboration; Beretta, M; Bogdan, M; Citterio, M; Alberti, F; Giannetti, P; Lanza, A; Magalotti, D; Piendibene, M; Shochet, M; Stabile, A; Tang, J; Tompkins, L

    2012-01-01

    Modern experiments search for extremely rare processes hidden in much larger background levels. As experiment complexity and accelerator backgrounds and luminosity increase, we need increasingly complex and exclusive selections. We present the first prototype of a new Processing Unit, the core of the FastTracker processor for ATLAS, whose computing power is such that a couple of hundred of them will be able to reconstruct all the tracks with transverse momentum above 1 GeV in ATLAS events up to Phase II instantaneous luminosities (5×10³⁴ cm⁻² s⁻¹) with an event input rate of 100 kHz and a latency below hundreds of microseconds. We plan extremely powerful, very compact and low-consumption units for the far future, essential to increase the efficiency and purity of the Level 2 selected samples through the intensive use of tracking. This strategy requires massive computing power to minimize the online execution time of complex tracking algorithms. The time-consuming pattern recognition problem, generall...

  5. Iterative Calibration: A Novel Approach for Calibrating the Molecular Clock Using Complex Geological Events.

    Science.gov (United States)

    Loeza-Quintana, Tzitziki; Adamowicz, Sarah J

    2018-02-01

    During the past 50 years, the molecular clock has become one of the main tools for providing a time scale for the history of life. In the era of robust molecular evolutionary analysis, clock calibration is still one of the most basic steps needing attention. When fossil records are limited, well-dated geological events are the main resource for calibration. However, biogeographic calibrations have often been used in a simplistic manner, for example assuming simultaneous vicariant divergence of multiple sister lineages. Here, we propose a novel iterative calibration approach to define the most appropriate calibration date by seeking congruence between the dates assigned to multiple allopatric divergences and the geological history. Exploring patterns of molecular divergence in 16 trans-Bering sister clades of echinoderms, we demonstrate that the iterative calibration is predominantly advantageous when using complex geological or climatological events, such as the opening/reclosure of the Bering Strait, providing a powerful tool for clock dating that can be applied to other biogeographic calibration systems and further taxa. Using Bayesian analysis, we observed that evolutionary rate variability in the COI-5P gene is generally distributed in a clock-like fashion for Northern echinoderms. The results reveal a large range of genetic divergences, consistent with multiple pulses of trans-Bering migrations. A resulting rate of 2.8% pairwise Kimura-2-parameter sequence divergence per million years is suggested for the COI-5P gene in Northern echinoderms. Given that molecular rates may vary across latitudes and taxa, this study provides a new context for dating the evolutionary history of Arctic marine life.
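
    A calibrated rate like the one reported above (2.8% pairwise K2P divergence per million years) supports simple back-of-the-envelope dating of a sister pair. The sketch below applies it; the 9.8% divergence value is purely illustrative and not taken from the study.

    ```python
    # Back-of-the-envelope molecular-clock dating from a calibrated pairwise rate.
    RATE_PCT_PER_MYR = 2.8  # % K2P divergence accumulated per million years (COI-5P)

    def divergence_time_myr(k2p_percent, rate=RATE_PCT_PER_MYR):
        """Estimated split time (Myr) for a sister pair with the given % divergence."""
        return k2p_percent / rate

    # A hypothetical sister pair showing 9.8% pairwise K2P divergence:
    t = divergence_time_myr(9.8)  # = 3.5 Myr
    ```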

  6. Erosion processes by water in agricultural landscapes: a low-cost methodology for post-event analyses

    Science.gov (United States)

    Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo

    2015-04-01

    Throughout the world, agricultural landscapes assume a great importance, especially for supplying food and a livelihood. Among land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger the people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and represent, in this way, a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To estimate soil losses due to bank erosion and rill processes quantitatively, area-based measurements of surface changes are necessary but, sometimes, they may be difficult to realize. In fact, surface changes due to short-term events have to be represented with fine resolution, and their monitoring may entail too much money and time. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique, which may even rely on smartphones, for the post-event analyses of i) bank erosion affecting agricultural channels, and ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and the Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smartphone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved effective for estimating erosion volumes quantitatively and, in the bank erosion case, deposited materials as well. SfM applied to pictures taken by smartphones is useful for the analysis of the topography...
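
    Volume estimates from SfM-derived DTMs such as those above are typically obtained by differencing pre- and post-event surfaces (a generic DEM-of-difference computation, not necessarily the authors' exact workflow). A minimal sketch with synthetic grids:

    ```python
    # DEM of difference: subtract aligned pre- and post-event elevation grids
    # and integrate elevation change over the cell area. Grids are synthetic.
    import numpy as np

    def erosion_deposition_volumes(dtm_before, dtm_after, cell_size_m):
        """Return (eroded, deposited) volumes in m^3 from two aligned DTMs."""
        dz = dtm_after - dtm_before
        cell_area = cell_size_m ** 2
        eroded = -dz[dz < 0].sum() * cell_area     # lowered cells -> material lost
        deposited = dz[dz > 0].sum() * cell_area   # raised cells -> material gained
        return float(eroded), float(deposited)

    before = np.zeros((10, 10))
    after = before.copy()
    after[2:5, 2:5] -= 0.1     # a 3x3-cell rill incised 10 cm on a 0.5 m grid
    eroded, deposited = erosion_deposition_volumes(before, after, cell_size_m=0.5)
    ```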

  7. Nonlinear impacts of small-scale natural events on Nineteenth Century human decision-making

    Science.gov (United States)

    McCormack, S. M.; Schlichting, K. M.; Urbanova, T.; Allen, T. L.; Ruffing, C. M.; Hermans, C. M.

    2009-12-01

    Natural climatological events that occurred throughout the nineteenth century, such as floods, droughts and hurricanes, had long-lived, far-reaching consequences for the human decision-making processes occurring in the northeast United States. These events impacted the hydrological cycle, both directly (through the building of various structures) and indirectly (through an increased understanding of science and the changing relationship between humans and their environment). This paper examines these events and associated processes through: 1) identifying specific natural events throughout the time period, occurring globally, with initial conditions conducive to long-lived consequences; 2) examining the relationship between scientific enquiry, natural events and the proliferation of dams in the northeast landscape; and 3) the growth of public health concerns, awareness of bacteriology, and municipal water supply systems. Results of this research indicate that the relationship between knowledge systems, natural events and subsequent engineering or technological fixes is complex and highly dependent on initial conditions. It highlights the time period in which humans became increasingly dependent on engineered solutions to environmental problems, many of which still hold fast in our contemporary landscape. It is relevant to natural, social and governance structures in place today. The principles behind the occurrence of the natural phenomena and subsequent research and design have not changed, and understanding key events or stages in the past is essential for making predictions about the future.

  8. Ferric and cobaltous hydroacid complexes for forward osmosis (FO) processes

    KAUST Repository

    Ge, Qingchun; Fu, Fengjiang; Chung, Neal Tai-Shung

    2014-01-01

    Cupric and ferric hydroacid complexes have proven their advantages as draw solutes in forward osmosis in terms of high water fluxes, negligible reverse solute fluxes and easy recovery (Ge and Chung, 2013. Hydroacid complexes: A new class of draw solutes to promote forward osmosis (FO) processes. Chemical Communications 49, 8471-8473.). In this study, cobaltous hydroacid complexes were explored as draw solutes and compared with the ferric hydroacid complex to study the factors influencing their FO performance. The solutions of the cobaltous complexes produce high osmotic pressures due to the presence of abundant hydrophilic groups. These solutes are able to dissociate and form a multi-charged anion and Na+ cations in water. In addition, these complexes have expanded structures which lead to negligible reverse solute fluxes and provide relatively easy approaches to regeneration. These characteristics make the newly synthesized cobaltous complexes appropriate as draw solutes. The FO performance of the cobaltous and ferric-citric acid (Fe-CA) complexes was evaluated through cellulose acetate membranes, thin-film composite membranes fabricated on polyethersulfone supports (referred to as TFC-PES), and polybenzimidazole and PES dual-layer (referred to as PBI/PES) hollow fiber membranes. With DI water as the feed and facing the support layer of the TFC-PES FO membranes (PRO mode), draw solutions at 2.0 M produced relatively high water fluxes of 39-48 LMH (L m⁻² hr⁻¹) with negligible reverse solute fluxes. A water flux of 17.4 LMH was achieved when model seawater of 3.5 wt.% NaCl replaced DI water as the feed, with 2.0 M Fe-CA as the draw solution under the same conditions. The performance of these hydroacid complexes surpasses that of the synthetic draw solutes developed in recent years. This observation, along with the relatively easy regeneration, makes these complexes very promising as a novel class of draw solutes. © 2014 Elsevier Ltd.

  10. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female) performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance, with adults showing the best performance and 7-year-olds the poorest. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in the cognitive processes that jointly contribute to prospective memory perfor...
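
    The general idea of a multinomial processing tree (MPT) model can be sketched as follows. This toy model is much simpler than, and not identical to, the formal model used in the study; the parameters p (noticing the prospective-memory target) and m (recalling the intended action given noticing) are hypothetical stand-ins for prospective and retrospective components.

    ```python
    # Toy MPT: observable response categories arise from latent branch
    # probabilities, and parameters are fit by multinomial likelihood.
    from math import log

    def category_probs(p, m):
        """Probabilities of: PM hit, notice-but-forget, outright miss."""
        return {"hit": p * m, "forget": p * (1 - m), "miss": 1 - p}

    def log_likelihood(counts, p, m):
        """Multinomial log-likelihood (up to a constant) of observed counts."""
        probs = category_probs(p, m)
        return sum(n * log(probs[cat]) for cat, n in counts.items())

    probs = category_probs(0.8, 0.9)                            # example parameter values
    ll = log_likelihood({"hit": 72, "forget": 8, "miss": 20}, 0.8, 0.9)
    ```

    Decomposing group differences then amounts to comparing fitted parameter values (here p and m) across age groups rather than raw accuracy.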

  11. On improved understanding of plasma-chemical processes in complex low-temperature plasmas

    Science.gov (United States)

    Röpcke, Jürgen; Loffhagen, Detlef; von Wahl, Eric; Nave, Andy S. C.; Hamann, Stephan; van Helden, Jean-Piere H.; Lang, Norbert; Kersten, Holger

    2018-05-01

    Over recent years, chemical sensing using optical emission spectroscopy (OES) in the visible spectral range has been combined with methods of mid-infrared laser absorption spectroscopy (MIR-LAS) in the molecular fingerprint region from 3 to 20 μm, which contains strong rotational-vibrational absorption bands of a large variety of gaseous species. This optical approach has established powerful in situ diagnostic tools to study plasma-chemical processes in complex low-temperature plasmas. The methods of MIR-LAS make it possible to detect stable and transient molecular species in ground and excited states and to measure the concentrations and temperatures of reactive species in plasmas. Since kinetic processes are inherent to discharges ignited in molecular gases, high time resolution on sub-second timescales is frequently desired for fundamental studies as well as for process monitoring in applied research and industry. In addition to high sensitivity and good temporal resolution, the capacity for broad spectral coverage enabling multi-component detection is further expanding the use of OES and MIR-LAS techniques. Based on selected examples, this paper reports on recent achievements in the understanding of complex low-temperature plasmas. Recently, a link with chemical modeling of the plasma has been provided, which is the ultimate objective for a better understanding of the chemical and reaction-kinetic processes occurring in the plasma. Contribution to the Topical Issue "Fundamentals of Complex Plasmas", edited by Jürgen Meichsner, Michael Bonitz, Holger Fehske, Alexander Piel.
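
    The concentration measurements underlying absorption methods like MIR-LAS rest on the Beer-Lambert law, I = I0 · exp(-σ·N·L). A minimal sketch of the retrieval, with all numerical values illustrative rather than taken from the paper:

    ```python
    # Beer-Lambert retrieval: absorber number density from measured transmission,
    # an effective absorption cross-section, and the optical path length.
    from math import log

    def number_density(I0, I, sigma_cm2, path_cm):
        """Absorber number density (cm^-3) from incident/transmitted intensity."""
        return log(I0 / I) / (sigma_cm2 * path_cm)

    # e.g. 5% absorption over a 100 cm path, effective cross-section 1e-18 cm^2
    N = number_density(1.0, 0.95, 1e-18, 100.0)
    ```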

  12. Learning from Nature - Mapping of Complex Hydrological and Geomorphological Process Systems for More Realistic Modelling of Hazard-related Maps

    Science.gov (United States)

    Chifflard, Peter; Tilch, Nils

    2010-05-01

    Introduction: Hydrological and geomorphological processes in nature are often very diverse and complex. This is partly due to regional characteristics which vary over time and space, as well as changeable process-initiating and -controlling factors. Despite awareness of this complexity, such aspects are usually neglected in the modelling of hazard-related maps for several reasons. But particularly when it comes to creating more realistic maps, this would be an essential component to consider. The first important step towards solving this problem would be to collect data on regional conditions which vary over time and geographical location, along with indicators of complex processes. Data should be acquired promptly during and after events, and subsequently digitally combined and analysed.

    Study area: In June 2009, considerable damage occurred in the residential area of Klingfurth (Lower Austria) as a result of great pre-event wetness and repeatedly heavy rainfall, leading to flooding, debris-flow deposits and gravitational mass movement. One of the causes is the fact that the meso-scale watershed (16 km²) of the Klingfurth stream is characterised by adverse geological and hydrological conditions. Additionally, the river network with its discharge concentration within the residential zone contributes considerably to flooding, particularly during excessive rainfall across the entire region, as the flood peaks from different parts of the catchment area are superposed.

    First results of mapping: Hydro(geo)logical surveys across the entire catchment area have shown that over 600 gravitational mass movements of various types and stages have occurred. 516 of those have acted as a bed load source, while 325 mass movements had not reached the final stage yet and could thus supply bed load in the future. It should be noted that large mass movements in the initial or intermediate stage were predominantly found in clayey-silty areas and weathered material

  13. Event management for large scale event-driven digital hardware spiking neural networks.

    Science.gov (United States)

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
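
    The event-management problem described above is, at its core, a priority queue ordered by delivery time. The paper's structured heap queue is a pipelined hardware data structure; the sketch below only mirrors its priority-queue semantics in software, where a binary heap likewise gives O(log n) insertion and extraction.

    ```python
    # Software analogue of spike-event management: a binary heap keeps pending
    # events ordered by delivery time, so the earliest event is always popped
    # first regardless of insertion order.
    import heapq

    class EventQueue:
        def __init__(self):
            self._heap = []        # entries: (delivery_time, seq, neuron_id)
            self._seq = 0          # tie-breaker for events with equal timestamps

        def push(self, time, neuron_id):
            heapq.heappush(self._heap, (time, self._seq, neuron_id))
            self._seq += 1

        def pop(self):
            time, _, neuron_id = heapq.heappop(self._heap)
            return time, neuron_id

    q = EventQueue()
    for t, n in [(5.0, 42), (1.5, 7), (3.2, 99)]:
        q.push(t, n)
    first = q.pop()    # the earliest event (t=1.5) is delivered first
    ```

    The hardware structure trades this sequential O(log n) work for pipelining, which is how it achieves the constant per-event processing time reported in the abstract.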

  14. The complexity of patient safety reporting systems in UK dentistry.

    Science.gov (United States)

    Renton, T; Master, S

    2016-10-21

    Since the 'Francis Report', UK regulation focusing on patient safety has significantly changed. Healthcare workers are increasingly involved in NHS England patient safety initiatives aimed at improving reporting and learning from patient safety incidents (PSIs). Unfortunately, dentistry remains 'isolated' from these main events and continues to have a poor record for reporting and learning from PSIs and other events, thus limiting improvement of patient safety in dentistry. The reasons for this situation are complex. This paper provides a review of the complexities of the existing systems and procedures in relation to patient safety in dentistry. It highlights the conflicting advice which is available and which further complicates an overly burdensome process. Recommendations are made to address these problems with systems and procedures supporting patient safety development in dentistry.

  15. Cueing Complex Animations: Does Direction of Attention Foster Learning Processes?

    Science.gov (United States)

    Lowe, Richard; Boucheix, Jean-Michel

    2011-01-01

    The time course of learners' processing of a complex animation was studied using a dynamic diagram of a piano mechanism. Over successive repetitions of the material, two forms of cueing (standard colour cueing and anti-cueing) were administered either before or during the animated segment of the presentation. An uncued group and two other control…

  16. Available processing resources influence encoding-related brain activity before an event.

    Science.gov (United States)

    Galli, Giulia; Gebert, A Dorothea; Otten, Leun J

    2013-09-01

    Effective cognitive functioning not only relies on brain activity elicited by an event, but also on activity that precedes it. This has been demonstrated in a number of cognitive domains, including memory. Here, we show that brain activity that precedes the effective encoding of a word into long-term memory depends on the availability of sufficient processing resources. We recorded electrical brain activity from the scalps of healthy adult men and women while they memorized intermixed visual and auditory words for later recall. Each word was preceded by a cue that indicated the modality of the upcoming word. The degree to which processing resources were available before word onset was manipulated by asking participants to make an easy or difficult perceptual discrimination on the cue. Brain activity before word onset predicted later recall of the word, but only in the easy discrimination condition. These findings indicate that anticipatory influences on long-term memory are limited in capacity and sensitive to the degree to which attention is divided between tasks. Prestimulus activity that affects later encoding can only be engaged when the necessary cognitive resources can be allocated to the encoding process.

  17. Dynamic Event Tree Analysis Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One disadvantage of these methods is that the timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies for performing safety assessment of Nuclear Power Plants (NPPs). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new thermal-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining-based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.

  18. Dystrophic Cardiomyopathy: Complex Pathobiological Processes to Generate Clinical Phenotype

    Directory of Open Access Journals (Sweden)

    Takeshi Tsuda

    2017-09-01

    Duchenne muscular dystrophy (DMD), Becker muscular dystrophy (BMD), and X-linked dilated cardiomyopathy (XL-DCM) constitute a unique clinical entity, the dystrophinopathies, which are due to variable mutations in the dystrophin gene. Dilated cardiomyopathy (DCM) is a common complication of dystrophinopathies, but the onset, progression, and severity of heart disease differ among these subgroups. Extensive molecular genetic studies have been conducted to assess genotype-phenotype correlation in DMD, BMD, and XL-DCM to understand the underlying mechanisms of these diseases, but the results are not always conclusive, suggesting the involvement of complex, multi-layered pathological processes that generate the final clinical phenotype. Dystrophin protein is a part of the dystrophin-glycoprotein complex (DGC) that is localized in skeletal muscles, myocardium, smooth muscles, and neuronal tissues. The diversity of cardiac phenotypes in dystrophinopathies suggests multiple layers of pathogenetic mechanisms in forming dystrophic cardiomyopathy. In this review article, we review the complex molecular interactions involved in the pathogenesis of dystrophic cardiomyopathy, including primary gene mutations and loss of structural integrity, secondary cellular responses, and certain epigenetic and other factors that modulate gene expression. The involvement of epigenetic gene regulation appears to lead to specific cardiac phenotypes in dystrophic hearts.

  19. A First-level Event Selector for the CBM Experiment at FAIR

    International Nuclear Information System (INIS)

    Cuveland, J de; Lindenstruth, V

    2011-01-01

    The CBM experiment at the upcoming FAIR accelerator aims to create the highest baryon densities in nucleus-nucleus collisions and to explore the properties of super-dense nuclear matter. Event rates of 10 MHz are needed for high-statistics measurements of rare probes, while event selection requires complex global triggers like secondary vertex search. To meet these demands, the CBM experiment uses self-triggered detector front-ends and a data-push readout architecture. The First-level Event Selector (FLES) is the central physics selection system in CBM. It receives all hits and performs online event selection on the 1 TByte/s input data stream. The event selection process requires high-throughput event building and full event reconstruction using fast, vectorized track reconstruction algorithms. The current FLES architecture foresees a scalable high-performance computer. To achieve the required throughput and computation efficiency, all available computing devices will have to be used, in particular FPGAs at the first stages of the system and heterogeneous many-core architectures such as CPUs for efficient track reconstruction. A high-throughput network infrastructure and flow control in the system are other key aspects. In this paper, we present the foreseen architecture of the First-level Event Selector.

  20. Analyzing the evolution of beta-endorphin post-translational processing events: studies on reptiles.

    Science.gov (United States)

    Shoureshi, Pezhman; Baron, Andrea; Szynskie, Laura; Dores, Robert M

    2007-01-01

    In many cartilaginous fishes, most ray-finned fishes, lungfishes, and amphibians, the post-translational processing of POMC includes the monobasic cleavage of beta-endorphin to yield an opioid that is eight to ten amino acids in length. The amino acid motif within the beta-endorphin sequence required for a monobasic cleavage event is -E-R-(S/G)-Q-. Mammals and birds lack this motif, and as a result beta-endorphin(1-8) is not an end-product in either group. Since both mammals and birds were derived from ancestors with reptilian origins, an analysis of beta-endorphin sequences from extant groups of reptiles should provide insights into the manner in which beta-endorphin post-translational processing mechanisms have evolved in amniotes. To this end, a POMC cDNA was cloned from the pituitary of the turtle, Chrysemys scripta. The beta-endorphin sequence in this species was compared to other reptile beta-endorphin sequences (i.e., Chinese soft-shell turtle and gecko) and to known bird and mammal sequences. This analysis indicated that either the loss of the arginine residue at the cleavage site (the two turtle species, chick, and human) or a substitution at the glutamine position in the consensus sequence (gecko and ostrich) would account for the loss of the monobasic cleavage reaction in that species. Since amphibians are capable of performing the beta-endorphin monobasic cleavage, it would appear that the amino acid substitutions that eliminated this post-translational processing event in reptilian-related tetrapods must have occurred in the ancestral amniotes.

  1. Complex service recovery processes: how to avoid triple deviation

    OpenAIRE

    Edvardsson, Bo; Tronvoll, Bård; Höykinpuro, Ritva

    2011-01-01

    Purpose – This article seeks to develop a new framework to outline factors that influence the resolution of unfavourable service experiences as a result of double deviation. The focus is on understanding and managing complex service recovery processes. Design/methodology/approach – An inductive, explorative and narrative approach was selected. Data were collected in the form of narratives from the field through interviews with actors at various levels in organisations as well as with custo...

  2. Quantum-information processing in disordered and complex quantum systems

    International Nuclear Information System (INIS)

    Sen, Aditi; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2006-01-01

    We study quantum information processing in complex disordered many body systems that can be implemented by using lattices of ultracold atomic gases and trapped ions. We demonstrate, first in the short range case, the generation of entanglement and the local realization of quantum gates in a disordered magnetic model describing a quantum spin glass. We show that in this case it is possible to achieve fidelities of quantum gates higher than in the classical case. Complex systems with long range interactions, such as ions chains or dipolar atomic gases, can be used to model neural network Hamiltonians. For such systems, where both long range interactions and disorder appear, it is possible to generate long range bipartite entanglement. We provide an efficient analytical method to calculate the time evolution of a given initial state, which in turn allows us to calculate its quantum correlations

  3. Aridity and decomposition processes in complex landscapes

    Science.gov (United States)

    Ossola, Alessandro; Nyman, Petter

    2015-04-01

    Decomposition of organic matter is a key biogeochemical process contributing to nutrient cycles, carbon fluxes and soil development. The activity of decomposers depends on microclimate, with temperature and rainfall being major drivers. In complex terrain, the fine-scale variation in microclimate (and hence water availability) as a result of slope orientation is caused by differences in incoming radiation and surface temperature. Aridity, measured as the long-term balance between net radiation and rainfall, is a metric that can be used to represent variations in water availability within the landscape. Since aridity metrics can be obtained at fine spatial scales, they could theoretically be used to investigate how decomposition processes vary across complex landscapes. In this study, four research sites were selected in tall open sclerophyll forest along an aridity gradient (Budyko dryness index ranging from 1.56 to 2.22) where microclimate, litter moisture and soil moisture were monitored continuously for one year. Litter bags were packed to estimate decomposition rates (k) using leaves of a tree species not present in the study area (Eucalyptus globulus) in order to avoid home-field advantage effects. Litter mass loss was measured to assess the activity of macro-decomposers (6 mm litter bag mesh size), meso-decomposers (1 mm mesh), microbes above-ground (0.2 mm mesh) and microbes below-ground (2 cm depth, 0.2 mm mesh). Four replicates for each set of bags were installed at each site and bags were collected at 1, 2, 4, 7 and 12 months after installation. We first tested whether differences in microclimate due to slope orientation have significant effects on decomposition processes. The dryness index was then related to decomposition rates to evaluate whether small-scale variation in decomposition can be predicted using readily available information on rainfall and radiation. Decomposition rates (k), calculated by fitting single-pool negative exponential models, generally

  4. Visual pattern discovery in timed event data

    Science.gov (United States)

    Schaefer, Matthias; Wanner, Franz; Mansmann, Florian; Scheible, Christian; Stennett, Verity; Hasselrot, Anders T.; Keim, Daniel A.

    2011-01-01

    Business processes have tremendously changed the way large companies conduct their business: the integration of information systems into the workflows of their employees ensures a high service level and thus high customer satisfaction. One core aspect of business process engineering is the events that steer the workflows and trigger internal processes. The strict requirements on interval-scaled temporal patterns, which are common in time series, are thereby relaxed through the ordinal character of such events. It is this additional degree of freedom that opens unexplored possibilities for visualizing event data. In this paper, we present a flexible and novel system to find significant events, event clusters and event patterns. Each event is represented as a small rectangle, which is colored according to categorical, ordinal or interval-scaled metadata. Depending on the analysis task, different layout functions are used to highlight either the ordinal character of the data or temporal correlations. The system has built-in features for ordering customers or event groups according to the similarity of their event sequences, temporal gap alignment and stacking of co-occurring events. Two characteristically different case studies dealing with business process events and news articles demonstrate the capabilities of our system to explore event data.

  5. Localizing components of a complex task : sentence processing and working memory

    NARCIS (Netherlands)

    Stowe, L.A.; Broere, C.A.J.; Paans, A.MJ; Wijers, A.A.; Mulder, G.; Vaalburg, W.; Zwarts, Frans

    1998-01-01

    Three areas of the left hemisphere play different roles in sentence comprehension. An area of posterior middle and superior temporal gyrus shows activation correlated with the structural complexity of a sentence, suggesting that this area supports processing of sentence structure. The lateral

  6. The Capabilities of Chaos and Complexity

    Directory of Open Access Journals (Sweden)

    David L. Abel

    2009-01-01

    To what degree could chaos and complexity have organized a Peptide or RNA World of crude yet necessarily integrated protometabolism? How far could such protolife evolve in the absence of a heritable linear digital symbol system that could mutate, instruct, regulate, optimize and maintain metabolic homeostasis? To address these questions, chaos, complexity, self-ordered states, and organization must all be carefully defined and distinguished. In addition, their cause-and-effect relationships and mechanisms of action must be delineated. Are there any formal (non-physical, abstract, conceptual, algorithmic) components to chaos, complexity, self-ordering and organization, or are they entirely physicodynamic (physical, mass/energy interaction) alone? Chaos and complexity can produce some fascinating self-ordered phenomena. But can spontaneous chaos and complexity steer events and processes toward pragmatic benefit, select function over non-function, optimize algorithms, integrate circuits, produce computational halting, organize processes into formal systems, and control and regulate existing systems toward greater efficiency? We pursue the question of whether there might be some yet-to-be-discovered new law of biology that will elucidate the derivation of prescriptive information and control. “System” will be rigorously defined. Can a low-informational rapid succession of Prigogine’s dissipative structures self-order into bona fide organization?

  7. Multiple-predators-based capture process on complex networks

    International Nuclear Information System (INIS)

    Sharafat, Rajput Ramiz; Pu Cunlai; Li Jie; Chen Rongbin; Xu Zhongqi

    2017-01-01

    The predator/prey (capture) problem is a prototype of many network-related applications. We study the capture process on complex networks by considering multiple predators from multiple sources. In our model, some lions start from multiple sources simultaneously to capture the lamb by biased random walks, which are controlled with a free parameter α. We derive the distribution of the lamb’s lifetime and the expected lifetime ⟨T⟩. Through simulation, we find that the expected lifetime drops substantially with the increasing number of lions. Moreover, we study how the underlying topological structure affects the capture process, and find that locating on small-degree nodes is better than on large-degree nodes for prolonging the lifetime of the lamb. Dense or homogeneous network structures work against the survival of the lamb. We also discuss how to improve the capture efficiency in our model.
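As a hedged illustration of the model described above, the sketch below simulates lions performing degree-biased random walks (move probability proportional to k^α) on a toy 12-node ring; the lamb is kept stationary, and the graph, start nodes and trial counts are all invented for the example:

```python
import random

def biased_step(graph, node, alpha, rng):
    """Move to a neighbour chosen with probability proportional to k^alpha."""
    nbrs = list(graph[node])
    weights = [len(graph[n]) ** alpha for n in nbrs]
    return rng.choices(nbrs, weights=weights)[0]

def capture_time(graph, lion_starts, lamb, alpha=0.0, rng=None, max_steps=10_000):
    """Steps until any lion lands on the (stationary) lamb's node."""
    rng = rng or random.Random()
    lions = list(lion_starts)
    for t in range(1, max_steps + 1):
        lions = [biased_step(graph, x, alpha, rng) for x in lions]
        if lamb in lions:
            return t
    return max_steps

# Toy network: a ring of 12 nodes; lamb sits at node 6.
ring = {i: [(i - 1) % 12, (i + 1) % 12] for i in range(12)}
rng = random.Random(42)
one  = sum(capture_time(ring, [0], 6, rng=rng) for _ in range(300)) / 300
many = sum(capture_time(ring, [0, 3, 9], 6, rng=rng) for _ in range(300)) / 300
# Expected lifetime drops substantially with more lions: many < one.
```

Comparing the two mean capture times reproduces, in miniature, the abstract's observation that the expected lifetime drops as the number of lions grows.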

  8. Business Event Notification Service (BENS)

    Data.gov (United States)

    Department of Veterans Affairs — BENS provides a notification of pre-defined business events to applications, portals, and automated business processes. Such events are defined in the Event Catalog,...

  9. The "Mozart effect": an electroencephalographic analysis employing the methods of induced event-related desynchronization/synchronization and event-related coherence.

    Science.gov (United States)

    Jausovec, Norbert; Habe, Katarina

    2003-01-01

    The event-related responses of 18 individuals were recorded while they were listening to 3 music clips of 6 s duration which were repeated 30 times each. The music clips differed in the level of their complex structure, induced mood, musical tempo and prominent frequency. They were taken from Mozart's sonata (K. 448), and Brahms' Hungarian dance (no. 5). The third clip was a simplified version of the theme taken from Haydn's symphony (no. 94) played by a computer synthesizer. Significant differences in induced event-related desynchronization between the 3 music clips were only observed in the lower-1 alpha band which is related to attentional processes. A similar pattern was observed for the coherence measures. While respondents listened to the Mozart clip, coherence in the lower alpha bands increased more, whereas in the gamma band a less pronounced increase was observed as compared with the Brahms and Haydn clips. The clustering of the three clips based on EEG measures distinguished between the Mozart clip on the one hand, and the Haydn and Brahms clips on the other, even though the Haydn and Brahms clips were at the opposite extremes with regard to the mood they induced in listeners, musical tempo, and complexity of structure. This would suggest that Mozart's music--with no regard to the level of induced mood, musical tempo and complexity--influences the level of arousal. It seems that modulations in the frequency domain of Mozart's sonata have the greatest influence on the reported neurophysiological activity.

  10. Molecular study of the dentin-pulp complex responses to caries progression

    Directory of Open Access Journals (Sweden)

    Yani Corvianindya Rahayu

    2007-03-01

    The dentin-pulp complex exhibits various responses to caries, including events of injury, defense, and repair. The overall response depends on pulp cell activity and the signaling processes which regulate the behavior of these cells. The signals for tissue repair are thought to be mediated by dentin-bound growth factors released during caries progression. Growth factors are key molecules responsible for signaling a variety of cellular processes following dental injury. The endogenous proteolytic enzymes (matrix metalloproteinases, MMPs) present in the dentin matrix might also participate in releasing bioactive molecules. Several members of the MMP family are found in the soft and hard tissue compartments of the dentin-pulp complex. Besides their presumed role in many physiological processes during the development and maintenance of the dentin-pulp complex, they may also contribute to the pathogenesis of dentin caries and the responses elicited by caries.

  11. Composition and microstructure alteration of triticale grain surface after processing by enzymes of cellulase complex

    Directory of Open Access Journals (Sweden)

    Elena Kuznetsova

    2016-01-01

    It is found that the pericarp tissues of grain have considerable strength and stiffness, which has an adverse effect on the quality of whole-grain bread. There is therefore a need for preliminary chemical and biochemical processing of the durable cell walls before industrial use. Triticale, an artificial hybrid of the traditional grain crops wheat and rye whose grain has high nutritional value, is increasingly used in the production of bread. The purpose of this research was to evaluate the influence of cellulase complex enzymes (from Penicillium canescens) on the composition and microstructure of the triticale grain surface, for grain used in baking. Triticale grain was processed by cellulolytic enzyme preparations of different composition (producer: Penicillium canescens). The experiment found that processing triticale grain with cellulase complex enzymes increases the content of water-soluble pentosans by 36.3-39.2%. The total amount of low-molecular sugars increased by 3.8-10.5%. Studies show that under the influence of the enzymes the microstructure of the triticale grain surface changes. Microphotographs characterising the alteration of the grain surface structure over time (every 2 hours during 10 hours of substrate hydrolysis) are shown. It is found that the depth and direction of the destruction process for non-starch polysaccharides of the grain integument are determined by the composition of the enzyme complex preparation and the duration of exposure. Xylanase is involved in the modification of hemicellulose fibres having both longitudinal and radial orientation. Hydrolysis of non-starch polysaccharides from the grain shells led to increased antioxidant activity. Ferulic acid was identified in an alcoholic extract of triticale grain after enzymatic hydrolysis under the influence of a complex preparation containing cellulase, xylanase and β-glucanase. Grain processing by independent enzymes containing in complex

  12. Fault detection and diagnosis for complex multivariable processes using neural networks

    International Nuclear Information System (INIS)

    Weerasinghe, M.

    1998-06-01

    Development of a reliable fault diagnosis method for large-scale industrial plants is laborious and often difficult to achieve due to the complexity of the targeted systems. The main objective of this thesis is to investigate the application of neural networks to the diagnosis of non-catastrophic faults in an industrial nuclear fuel processing plant. The proposed methods were initially developed by application to a simulated chemical process prior to further validation on real industrial data. The diagnosis of faults at a single operating point is first investigated. Statistical data conditioning methods of data scaling and principal component analysis are investigated to facilitate fault classification and reduce the complexity of the neural networks. Successful fault diagnosis was achieved with significantly smaller networks than when using all process variables as network inputs. Industrial processes often manufacture at various operating points, but demonstrated applications of neural networks for fault diagnosis usually consider only a single (primary) operating point. Developing a standard neural network scheme for fault diagnosis at all operating points would usually be impractical due to the unavailability of suitable training data for less frequently used (secondary) operating points. To overcome this problem, the application of a single neural network for the diagnosis of faults at different operating points is investigated. The data conditioning followed the same techniques as used for fault diagnosis at a single operating point. The results showed that a single neural network can successfully diagnose faults at operating points other than the one it was trained for, and that the data conditioning significantly improved the classification. Artificial neural networks have been shown to be an effective tool for process fault diagnosis. However, a main criticism is that details of the procedures taken to reach the fault diagnosis decisions are embedded in
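The thesis's exact conditioning pipeline is not reproduced in this record; a minimal sketch of the two techniques it names, variable scaling followed by principal component analysis (here via SVD), shows how the dimensionality of the network inputs can be reduced. The data and the component count are invented:

```python
import numpy as np

def condition(X, n_components):
    """Scale process variables to zero mean / unit variance, then
    project onto the leading principal components (via SVD)."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_components].T   # reduced network inputs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))        # 200 samples, 12 process variables
Z = condition(X, 3)
# Z.shape == (200, 3): three network inputs instead of twelve.
```

Feeding Z rather than X to the classifier is what allows the "significantly smaller networks" the abstract reports.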

  13. Detection of unusual events and trends in complex non-stationary data streams

    International Nuclear Information System (INIS)

    Charlton-Perez, C.; Perez, R.B.; Protopopescu, V.; Worley, B.A.

    2011-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for diverse applications, ranging from power plant operation to homeland security. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden events inside intermittent signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.
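A full Hilbert-Huang decomposition involves repeated sifting with cubic-spline envelopes and a stopping criterion; the numpy-only sketch below performs a single, simplified sifting pass (linear envelopes) to separate a fast oscillation from a slow carrier, which is the core idea behind isolating a trend from baseline dynamics. The test signal is invented:

```python
import numpy as np

def sift_once(x):
    """One simplified EMD sifting pass: envelopes through local extrema
    (linear interpolation instead of the usual cubic splines);
    candidate IMF = signal minus envelope mean, residue = envelope mean."""
    idx = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] < x[i + 1]]
    upper = np.interp(idx, maxima, x[maxima])
    lower = np.interp(idx, minima, x[minima])
    residue = (upper + lower) / 2.0        # slow carrier / trend estimate
    return x - residue, residue            # (fast component, residue)

# A slow trend with a fast oscillation riding on it.
t = np.linspace(0.0, 1.0, 500)
x = np.sin(2 * np.pi * t) + 0.3 * np.sin(2 * np.pi * 20 * t)
imf, residue = sift_once(x)
# By construction imf + residue reconstructs x exactly.
```

Real HHT iterates this sift until the candidate satisfies the IMF conditions, then repeats on the residue to peel off successive modes.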

  14. Design process and philosophy of TVA's latest advance control room complex

    International Nuclear Information System (INIS)

    Owens, G.R.; Masters, D.W.

    1979-01-01

    TVA's latest nuclear power plant control room design includes a greater emphasis on human factors as compared to their earlier plant designs. This emphasis has resulted in changes in the overall design philosophy and design process. This paper discusses some of the prominent design features of both the control room and the surrounding control room complex. In addition, it also presents some of the important activities involved in the process of developing the advanced control room design

  15. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product was confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal mode of hardening. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
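The actual plasma-chemical model is not given in this record; a hypothetical single-equation sketch (explicit Euler integration, invented kinetic constants) illustrates why the quenching rate matters for the final amount of target product:

```python
import math

def final_product(q, p0=1.0, T0=3000.0, A=5.0, Ea=6000.0, dt=1e-3, t_end=5.0):
    """Euler integration of product decomposition during quenching:
    dP/dt = -A * exp(-Ea / T) * P,  with temperature T(t) = T0 * exp(-q * t).
    All constants are invented for illustration."""
    P, t = p0, 0.0
    while t < t_end:
        T = T0 * math.exp(-q * t)            # quench: T decays at rate q
        P += dt * (-A * math.exp(-Ea / T) * P)  # Arrhenius-type loss
        t += dt
    return P

slow, fast = final_product(q=0.5), final_product(q=5.0)
# Faster quenching freezes the mixture sooner, preserving more product.
```

Sweeping `q` in such a sketch is the toy version of using the integrated equations to find the optimal hardening mode.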

  16. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewer, telecommunication, and electrical and gas networks often form highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network systems under catastrophic events such as earthquakes, and to support proper emergency management actions in such situations, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for the two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions, and is applied to a benchmark gas transmission pipe network in Memphis, TN to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.
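The paper's RDA itself is not reproduced here; as a small illustration of the same non-simulation-based family of methods, the sketch below computes exact two-terminal network reliability by recursive factoring on edges (condition on one edge working, contract it; or failing, delete it). The edge lists, probabilities and node labels are invented:

```python
def reliability(edges, probs, s, t):
    """Exact probability that nodes s and t are connected, given
    independent edge success probabilities, by recursive factoring:
    R = p_e * R(e contracted) + (1 - p_e) * R(e deleted)."""
    if s == t:
        return 1.0
    if not edges:
        return 0.0
    (u, v), pe = edges[0], probs[0]
    rest_e, rest_p = edges[1:], probs[1:]
    if u == v:                       # self-loop: its state is irrelevant
        return reliability(rest_e, rest_p, s, t)
    # Edge works: contract v into u (relabel v everywhere).
    merged = [(u if a == v else a, u if b == v else b) for a, b in rest_e]
    works = reliability(merged, rest_p, u if s == v else s, u if t == v else t)
    # Edge fails: simply delete it.
    fails = reliability(rest_e, rest_p, s, t)
    return pe * works + (1 - pe) * fails

# Sanity checks: two 0.9-edges in series and in parallel.
series   = reliability([(0, 1), (1, 2)], [0.9, 0.9], s=0, t=2)
parallel = reliability([(0, 1), (0, 1)], [0.9, 0.9], s=0, t=1)
# series == 0.81, parallel == 0.99
```

The two sanity checks recover the textbook values 0.9 × 0.9 = 0.81 and 1 − 0.1² = 0.99; the RDA improves on naive factoring by decomposing into disjoint link sets, but the conditioning idea is the same.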

  17. Initiating events study of the first extraction cycle process in a model reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Renze; Zhang, Jian Gang; Zhuang, Dajie; Feng, Zong Yang [China Institute for Radiation Protection, Taiyuan (China)

    2016-06-15

    Definition and grouping of initiating events (IEs) are important foundations for probabilistic safety assessment (PSA). An IE in a spent fuel reprocessing plant (SFRP) is an event that can lead to the release of dangerous material and so jeopardize workers, the public and the environment. The main difference between SFRPs and nuclear power plants (NPPs) is that hazardous materials are dispersed throughout a SFRP, and radioactive material is just one kind of hazardous material. Since research on IEs for NPPs is well established worldwide, there are several general methods to identify IEs: reference to existing lists, review of experience feedback, qualitative analysis, and deductive analysis. Failure mode and effect analysis (FMEA) is an important qualitative analysis method, while the master logic diagram (MLD) method is a deductive analysis method. IE identification in SFRPs should draw on the experience of NPPs, but the differences between SFRPs and NPPs must be considered seriously. The model reprocessing plant adopts the plutonium uranium reduction extraction (Purex) process, in which the first extraction cycle (FEC) is the pivotal step. Whether the FEC functions safely and steadily directly influences the production process and product quality of the whole plant. Important facilities of the FEC are installed in the equipment cells (ECs). In this work, IEs in the FEC process were identified and categorized by the FMEA and MLD methods, based on the fact that the ECs act as containments in the plant. The results show that only two ECs in the FEC need no particular safety attention, and that criticality, fire and red-oil explosion are the IEs that should be analyzed most thoroughly. The results are consistent with the references.

  18. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate a system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One problem with this approach is that it treats the model as a "black box" and explains model behavior by analyzing the input-output relationship alone. Since these models have a high degree of non-linearity, understanding how an input affects an output can be an extremely difficult task. Operationally, applying this technique can also be challenging because complex process-based models generally have a large parameter space. To overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in
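The process-level focus described above can be illustrated with a small sketch. The model, its two processes, and all parameter values below are hypothetical stand-ins invented for this illustration (they are not taken from the abstract): each whole process is scaled by a small relative factor and the normalized change in model output is recorded, without opening up the parameter space inside each process.

```python
import numpy as np

# Hypothetical two-process ecosystem model (illustrative only).
def photosynthesis(light, scale=1.0):
    return scale * 2.0 * np.sqrt(light)

def respiration(temp, scale=1.0):
    return scale * 0.1 * temp ** 1.5

def model_output(light, temp, scales):
    # Net output emerges from the interaction of the component processes.
    return (photosynthesis(light, scales["photosynthesis"])
            - respiration(temp, scales["respiration"]))

def process_sensitivity(light, temp, process, delta=0.05):
    """Normalized sensitivity of the model output to a +delta relative
    perturbation applied to one whole process (not a single parameter)."""
    base_scales = {"photosynthesis": 1.0, "respiration": 1.0}
    base = model_output(light, temp, base_scales)
    scales = dict(base_scales)
    scales[process] += delta
    perturbed = model_output(light, temp, scales)
    return ((perturbed - base) / abs(base)) / delta

for p in ("photosynthesis", "respiration"):
    print(p, round(process_sensitivity(100.0, 20.0, p), 3))
```

Ranking processes by the magnitude of this index identifies which process dominates the output, which is the kind of process-level question the method above targets.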

  19. Reconstruction efficiency and precision for the events detected by the BIS-2 installation using the Perun pattern recognition program

    International Nuclear Information System (INIS)

    Burilkov, D.T.; Genchev, V.I.; Markov, P.K.; Likhachev, M.F.; Takhtamyshev, G.G.; Todorov, P.T.; Trayanov, P.K.

    1982-01-01

Results of studying the efficiency and accuracy of track and event reconstruction with the Perun pattern recognition program, used in the experiments carried out at the BIS-2 installation, are presented. The Monte Carlo method is used to simulate the processes of neutron interactions with matter. It is concluded that the Perun program reconstructs complex multiparticle events correctly, with good accuracy and high efficiency.

  20. A robust approach to extract biomedical events from literature.

    Science.gov (United States)

    Bui, Quoc-Chinh; Sloot, Peter M A

    2012-10-15

    The abundance of biomedical literature has attracted significant interest in novel methods to automatically extract biomedical relations from the literature. Until recently, most research was focused on extracting binary relations such as protein-protein interactions and drug-disease relations. However, these binary relations cannot fully represent the original biomedical data. Therefore, there is a need for methods that can extract fine-grained and complex relations known as biomedical events. In this article we propose a novel method to extract biomedical events from text. Our method consists of two phases. In the first phase, training data are mapped into structured representations. Based on that, templates are used to extract rules automatically. In the second phase, extraction methods are developed to process the obtained rules. When evaluated against the Genia event extraction abstract and full-text test datasets (Task 1), we obtain results with F-scores of 52.34 and 53.34, respectively, which are comparable to the state-of-the-art systems. Furthermore, our system achieves superior performance in terms of computational efficiency. Our source code is available for academic use at http://dl.dropbox.com/u/10256952/BioEvent.zip.

  1. Numerical simulation of complex part manufactured by selective laser melting process

    Science.gov (United States)

    Van Belle, Laurent

    2017-10-01

The Selective Laser Melting (SLM) process, belonging to the family of Additive Manufacturing (AM) technologies, enables parts to be built layer by layer from metallic powder and a CAD model. The physical phenomena that occur in the process raise the same issues as conventional welding: thermal gradients generate significant residual stresses and distortions in the parts, and large, complex parts accentuate these undesirable effects. It is therefore essential for manufacturers to gain a better understanding of the process and to ensure reliable production of parts with high added value. This paper focuses on the simulation of manufacturing a turbine by the SLM process in order to calculate residual stresses and distortions. Numerical results will be presented.

  2. The investigation of form and processes in the coastal zone under extreme storm events - the case study of Rethymno, Greece

    Science.gov (United States)

    Afentoulis, Vasileios; Mohammadi, Bijan; Tsoukala, Vasiliki

    2017-04-01

The coastal zone is a distinctive and significant geographical region: it hosts a wide range of human activities and constitutes a complex yet fragile system of natural variables. Coastal communities are increasingly at risk from serious coastal hazards, such as shoreline erosion and flooding related to extreme hydro-meteorological events: storm surges, heavy precipitation, tsunamis and tides. In order to investigate the impact of these extreme events on the coastal zone, it is necessary to describe the driving mechanisms which contribute to its destabilization, and more precisely the interaction between wave forces and sediment transport. The aim of the present study is to examine the capability of numerical models to simulate coastal zone processes under extreme wave events in the coastal area of Rethymno, Greece. Rethymno city is one of the eleven case study areas of the PEARL (Preparing for Extreme And Rare events in coastal regions) project, an EU-funded research project which aims at developing adaptive risk management strategies for coastal communities, focusing on extreme hydro-meteorological events with a multidisciplinary approach that integrates social, environmental and technical research and innovation so as to increase the resilience of coastal regions all over the world. Within this framework, three different numerical models have been used: MIKE 21 - DHI, the XBeach model, and a numerical formulation for sea bed evolution developed by Afaf Bouharguane and Bijan Mohammadi (2013). For the determination of the wave and hydrodynamic conditions, as well as the assessment of the sediment transport components, the MIKE 21 SW and MIKE 21 FM modules have been applied, and the bathymetry of Rethymno is arranged into a 2D unstructured mesh. This method of digitalization was selected because of its ability to easily represent the complex geometry of the coastal zone. It allows smaller scale wave characteristics to be

  3. Disentangling the complex evolutionary history of the Western Palearctic blue tits (Cyanistes spp.) - phylogenomic analyses suggest radiation by multiple colonization events and subsequent isolation.

    Science.gov (United States)

    Stervander, Martin; Illera, Juan Carlos; Kvist, Laura; Barbosa, Pedro; Keehnen, Naomi P; Pruisscher, Peter; Bensch, Staffan; Hansson, Bengt

    2015-05-01

    Isolated islands and their often unique biota continue to play key roles for understanding the importance of drift, genetic variation and adaptation in the process of population differentiation and speciation. One island system that has inspired and intrigued evolutionary biologists is the blue tit complex (Cyanistes spp.) in Europe and Africa, in particular the complex evolutionary history of the multiple genetically distinct taxa of the Canary Islands. Understanding Afrocanarian colonization events is of particular importance because of recent unconventional suggestions that these island populations acted as source of the widespread population in mainland Africa. We investigated the relationship between mainland and island blue tits using a combination of Sanger sequencing at a population level (20 loci; 12 500 nucleotides) and next-generation sequencing of single population representatives (>3 200 000 nucleotides), analysed in coalescence and phylogenetic frameworks. We found (i) that Afrocanarian blue tits are monophyletic and represent four major clades, (ii) that the blue tit complex has a continental origin and that the Canary Islands were colonized three times, (iii) that all island populations have low genetic variation, indicating low long-term effective population sizes and (iv) that populations on La Palma and in Libya represent relicts of an ancestral North African population. Further, demographic reconstructions revealed (v) that the Canary Islands, conforming to traditional views, hold sink populations, which have not served as source for back colonization of the African mainland. Our study demonstrates the importance of complete taxon sampling and an extensive multimarker study design to obtain robust phylogeographical inferences. © 2015 John Wiley & Sons Ltd.

  4. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

In this research a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. The first part of this research discusses existing practice for production process design in shipbuilding and emphasizes its shortcomings and problems. In continuation, the discrete event simulation modelling method, as the basis of the sugge...

  5. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

Full Text Available Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws on the hierarchical way the brain processes images and speech. In the field of NLP, topic models are one of the important ways of modeling documents, but they are built on a generative model that clearly does not match the way humans write. In this paper, we propose the Event Model, which is unsupervised and based on the language processing mechanisms described by neurolinguistics, to model documents. In the Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in those events. The Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words with deep learning. Dimensionality reduction represents a document as a low-dimensional vector through a linear model that is completely different from topic models. The Event Model achieves state-of-the-art results on document retrieval tasks.

  6. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    Science.gov (United States)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

A phenomenon is called "complex" when it arises in a system whose global phenomenological laws are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of complex systems holds that the dynamics of such systems are founded on universal principles that may be used to describe disparate problems, ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those of economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation of universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that all three dynamical systems' observables can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each one of these different systems present striking quantitative

  7. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applications.
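As one concrete instance of such techniques, a principal-component monitoring scheme with Hotelling's T² and squared prediction error (SPE) statistics can be sketched in a few lines. The training data, the number of retained components, and the injected fault below are all simulated assumptions of this sketch, not an example taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "in-control" training data: 5 correlated process variables
# driven by 2 latent factors (in practice this would be historical data).
n, p, k = 500, 5, 2
latent = rng.normal(size=(n, k))
mixing = rng.normal(size=(k, p))
X = latent @ mixing + 0.1 * rng.normal(size=(n, p))

mu, sigma = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sigma

# PCA of the scaled data via SVD; retain k components.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:k].T                    # loadings, shape (p, k)
lam = (S[:k] ** 2) / (n - 1)    # variances of the retained scores

def t2_and_spe(x):
    """Hotelling T^2 and squared prediction error (SPE) for one sample."""
    z = (x - mu) / sigma
    t = z @ P                   # projection onto the retained subspace
    residual = z - t @ P.T      # part of the sample the model cannot explain
    return float(np.sum(t ** 2 / lam)), float(residual @ residual)

normal_sample = latent[0] @ mixing     # obeys the learned correlation structure
faulty_sample = normal_sample.copy()
faulty_sample[2] += 4.0                # a fault breaks the correlation

print("normal:", t2_and_spe(normal_sample))
print("faulty:", t2_and_spe(faulty_sample))
```

A large T² signals an unusual but structure-consistent operating point, while a large SPE signals a sample that violates the correlation structure itself, which is the classic fault signature.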

  8. Top event prevention in complex systems

    International Nuclear Information System (INIS)

    Youngblood, R.W.; Worrell, R.B.

    1995-01-01

A key step in formulating a regulatory basis for licensing complex and potentially hazardous facilities is identification of a collection of design elements that is necessary and sufficient to achieve the desired level of protection of the public, the workers, and the environment. Here, such a collection of design elements will be called a "prevention set." At the design stage, identifying a prevention set helps to determine what elements to include in the final design. Separately, a prevention-set argument could be used to limit the scope of regulatory oversight to a subset of design elements. This step can be taken during initial review of a design, or later as part of an effort to justify relief from regulatory requirements that are burdensome but provide little risk reduction. This paper presents a systematic approach to the problem of optimally choosing a prevention set.
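A minimal sketch of the combinatorial core of this idea: if the top event's minimal cut sets are known from a fault tree, then a prevention set must intersect every cut set, i.e. it is a hitting set. The cut sets and element names below are invented for illustration, and brute-force search stands in for whatever optimized selection method the paper actually develops.

```python
from itertools import combinations

# Toy minimal cut sets: each set of basic failures is sufficient, on its
# own, to cause the top event (hypothetical design elements A..E).
cut_sets = [frozenset("AB"), frozenset("AC"), frozenset("BD"), frozenset("E")]

def smallest_prevention_set(cut_sets):
    """Smallest set of design elements intersecting every cut set, so that
    guaranteeing those elements blocks every path to the top event."""
    elements = sorted(set().union(*cut_sets))
    for size in range(1, len(elements) + 1):
        for cand in combinations(elements, size):
            if all(set(cand) & cs for cs in cut_sets):
                return set(cand)
    return set(elements)

print(smallest_prevention_set(cut_sets))
```

Brute force is exponential; real prevention-set tools operate on the fault-tree structure directly, but the success criterion is the same: every cut set must be hit.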

  9. Processing a Complex Architectural Sampling with Meshlab: the Case of Piazza della Signoria

    Science.gov (United States)

    Callieri, M.; Cignoni, P.; Dellepiane, M.; Ranzuglia, G.; Scopigno, R.

    2011-09-01

The paper presents a recent 3D scanning project performed with long-range scanning technology, showing how a complex sampled dataset can be processed with the features available in MeshLab, an open source mesh processing system. MeshLab is a portable and extensible system aimed at helping the processing of the typical not-so-small unstructured models that arise in 3D scanning, providing a set of tools for editing, cleaning, processing, inspecting, rendering and converting meshes. The MeshLab system started in late 2005 as part of a university course and has evolved considerably since then, thanks to the effort of the Visual Computing Lab and the support of several funded EC projects. MeshLab has so far gained excellent visibility and distribution, with several thousand downloads every month, and a continuous evolution. The aim of this scanning campaign was to sample the façades of the buildings located in Piazza della Signoria (Florence, Italy). This digital 3D model was required, in the framework of a Regional Project, as a basic background model for presenting a complex set of images using a virtual navigation metaphor (following the PhotoSynth approach). Processing of complex datasets, such as the ones produced by long-range scanners, often requires specialized, difficult-to-use and costly software packages. We show in the paper how it is possible to process this kind of data inside an open source tool, thanks to the many new features recently introduced in MeshLab for the management of large sets of sampled points.

  10. Developmentally regulated expression and complex processing of barley pri-microRNAs

    Directory of Open Access Journals (Sweden)

    Kruszka Katarzyna

    2013-01-01

Full Text Available Abstract Background MicroRNAs (miRNAs) regulate gene expression via mRNA cleavage or translation inhibition. In spite of barley being a cereal of great economic importance, very little data is available concerning its miRNA biogenesis. There are 69 barley miRNA and 67 pre-miRNA sequences available in miRBase (release 19). However, no barley pri-miRNA and MIR gene structures have been shown experimentally. In the present paper, we examine the biogenesis of selected barley miRNAs and the developmental regulation of their pri-miRNA processing to learn more about miRNA maturation in barley. Results To investigate the organization of barley microRNA genes, nine microRNAs - 156g, 159b, 166n, 168a-5p/168a-3p, 171e, 397b-3p, 1120, and 1126 - were selected. Two of the studied miRNAs originate from one MIR168a-5p/168a-3p gene. The presence of all miRNAs was confirmed using a Northern blot approach. The miRNAs are encoded by genes with diverse organizations, representing mostly independent transcription units with or without introns. The intron-containing miRNA transcripts undergo complex splicing events to generate various spliced isoforms. We identified miRNAs that are encoded within introns of the noncoding genes MIR156g and MIR1126. Interestingly, the intron that encodes miR156g is spliced less efficiently than the intron encoding miR1126 from their specific precursors. miR397b-3p was detected in barley as a most probable functional miRNA, in contrast to rice, where it has been identified as a complementary partner miRNA*. In the case of miR168a-5p/168a-3p, we found the generation of stable, mature molecules from both pre-miRNA arms, confirming evolutionary conservation of the stability of both species, as shown in rice and maize. We suggest that miR1120, located within the 3′ UTR of a protein-coding gene and described as a functional miRNA in wheat, may represent a siRNA generated from a mariner-like transposable element. Conclusions Seven of the

  11. Assessing traumatic event exposure: general issues and preliminary findings for the Stressful Life Events Screening Questionnaire.

    Science.gov (United States)

    Goodman, L A; Corcoran, C; Turner, K; Yuan, N; Green, B L

    1998-07-01

    This article reviews the psychometric properties of the Stressful Life Events Screening Questionnaire (SLESQ), a recently developed trauma history screening measure, and discusses the complexities involved in assessing trauma exposure. There are relatively few general measures of exposure to a variety of types of traumatic events, and most of those that exist have not been subjected to rigorous psychometric evaluation. The SLESQ showed good test-retest reliability, with a median kappa of .73, adequate convergent validity (with a lengthier interview) with a median kappa of .64, and good discrimination between Criterion A and non-Criterion A events. The discussion addresses some of the challenges of assessing traumatic event exposure along the dimensions of defining traumatic events, assessment methodologies, reporting consistency, and incident validation.
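The agreement figures quoted above are Cohen's kappa values, which correct raw agreement for chance. The computation is simple enough to sketch directly; the ten test-retest responses below are fabricated for illustration and are not SLESQ data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two ratings of the same items, e.g. test vs.
    retest endorsement of one event category on a screening measure."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum(freq_a[l] * freq_b[l] for l in labels) / (n * n)
    return (observed - expected) / (1 - expected)

# Fabricated test-retest responses ("yes"/"no") for one screening item.
t1 = ["yes", "yes", "no", "no", "no", "yes", "no", "no", "no", "no"]
t2 = ["yes", "yes", "no", "no", "no", "no", "no", "no", "no", "yes"]
print(round(cohens_kappa(t1, t2), 3))
```

Because rare events make chance agreement on "no" high, kappa can sit well below raw percent agreement, which is one reason it is the preferred reliability statistic for trauma screening items.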

  12. Modular-block complex on preparation and processing of oil slurry, spilled and raw petroleum

    International Nuclear Information System (INIS)

    Pak, V.V.

    1999-01-01

The use of non-serial small petroleum equipment for the development of remote, low-output petroleum deposits and for the collection and processing of spilled petroleum is an urgent issue. The joint-stock company Montazhengineering has developed and mastered the production of small modular-block complexes for the preparation and processing of petroleum. A complex can include the following modules: preparation of raw petroleum for producing commodity petroleum; a petroleum processing installation for producing gas, diesel fuel and black oil; an installation for refining naphtha; an installation for cleaning petroleum products of mercaptans; an installation for producing base oil; an installation for producing bitumen and mastic; and an installation for processing spilled petroleum and oil slurry. Each module can work separately or in various combinations, depending on the input and the required assortment of commodity petroleum. One of the urgent ecological problems in Kazakhstan's petroleum-processing regions is the large number of barns with spilled petroleum and oil slurry; their processing would solve both ecological and economic problems. By processing spilled petroleum and oil slurry in the developed installations, it is possible to obtain commodity petroleum and petroleum products.

  13. A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.

    Science.gov (United States)

    Dubljević, Veljko; Racine, Eric

    2014-10-01

The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Because the process described as the "inherence heuristic" is formal and general in nature, we contextualize it within a web of domain-specific heuristics (e.g., agent-specific, action-specific, consequence-specific).

  14. Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks

    Science.gov (United States)

    Renkewitz, Frank; Jahn, Georg

    2012-01-01

    We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…

  15. Using the calculational simulating complexes when making the computer process control systems for NPP

    International Nuclear Information System (INIS)

    Zimakov, V.N.; Chernykh, V.P.

    1998-01-01

The problems of creating calculational-simulating complexes (CSC) and of applying them in developing the software and software-hardware means for computer-aided process control systems at NPPs are considered. The complex is based on an all-mode real-time mathematical model running on a special complex of computerized means.

  16. Gaussian random-matrix process and universal parametric correlations in complex systems

    International Nuclear Information System (INIS)

    Attias, H.; Alhassid, Y.

    1995-01-01

We introduce the framework of the Gaussian random-matrix process as an extension of Dyson's Gaussian ensembles and use it to discuss the statistical properties of complex quantum systems that depend on an external parameter. We classify the Gaussian processes according to the short-distance diffusive behavior of their energy levels and demonstrate that all parametric correlation functions become universal upon the appropriate scaling of the parameter. The class of differentiable Gaussian processes is identified as the relevant one for most physical systems. We reproduce the known spectral correlators and compute eigenfunction correlators in their universal form. Numerical evidence from both a chaotic model and a weakly disordered model confirms our predictions.
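To make the ensemble side of this concrete, here is a small numerical sketch. The matrix dimension, sample count, and the crude central-window normalization used in place of proper spectral unfolding are choices of this illustration, not of the paper; it draws Gaussian Orthogonal Ensemble (GOE) matrices and exhibits the level repulsion that distinguishes their spacing statistics from an uncorrelated (Poisson) sequence.

```python
import numpy as np

rng = np.random.default_rng(42)

def goe_matrix(n):
    """Draw one matrix from the Gaussian Orthogonal Ensemble."""
    a = rng.normal(size=(n, n))
    return (a + a.T) / 2.0

# Nearest-neighbour spacings from the central part of each spectrum,
# normalized to unit mean (a crude stand-in for full unfolding).
spacings = []
for _ in range(200):
    ev = np.linalg.eigvalsh(goe_matrix(50))   # ascending eigenvalues
    central = ev[15:35]
    s = np.diff(central)
    spacings.extend(s / s.mean())

spacings = np.array(spacings)
# Level repulsion: very small spacings are rare, unlike for Poisson levels.
print("fraction of spacings below 0.1:", np.mean(spacings < 0.1))
```

For a Poisson sequence roughly 10% of unit-mean spacings fall below 0.1, whereas the Wigner surmise for the GOE predicts under 1%, which the sketch reproduces.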

  17. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    International Nuclear Information System (INIS)

    Perez, Rafael B.; Protopopescu, Vladimir A.; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method
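The notion of an unusual event as a local disturbance riding on non-stationary baseline dynamics can be illustrated without the Hilbert-Huang machinery. The sketch below deliberately substitutes a much simpler technique, a rolling median/MAD detector, for the empirical mode decomposition the paper investigates, and every signal parameter in it is invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic non-stationary stream: slow drift plus noise, with one short
# local disturbance (the "unusual event") injected at samples 1200-1209.
t = np.arange(2000)
baseline = np.sin(2 * np.pi * t / 1500) + 0.002 * t
signal = baseline + 0.3 * rng.normal(size=t.size)
signal[1200:1210] += 4.0

def rolling_mad_events(x, window=101, thresh=6.0):
    """Flag samples far from a rolling median, scaled by the rolling MAD
    (a robust local z-score, tolerant of drift and outliers)."""
    half = window // 2
    flags = np.zeros(x.size, dtype=bool)
    for i in range(half, x.size - half):
        w = x[i - half:i + half + 1]
        med = np.median(w)
        mad = np.median(np.abs(w - med)) + 1e-12
        flags[i] = abs(x[i] - med) > thresh * 1.4826 * mad
    return flags

flags = rolling_mad_events(signal)
print("flagged samples:", np.flatnonzero(flags))
```

Because both the center and the scale are estimated locally, the detector tracks the non-stationary baseline and flags only the intermittent disturbance, which is the behaviour the paper's far more general decomposition approach aims at for multi-component signals.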

  18. Simulating flaring events in complex active regions driven by observed magnetograms

    Science.gov (United States)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2011-05-01

    Context. We interpret solar flares as events originating in active regions that have reached the self organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has
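The loading/relaxation loop described above can be miniaturized into a toy cellular automaton in the spirit of sandpile-type flare models. The grid size, threshold, drive amplitude, and redistribution rule below are arbitrary illustrative choices, a scalar field stands in for the extrapolated vector magnetic field, and no claim is made that this reproduces the paper's observationally driven setup.

```python
import numpy as np

rng = np.random.default_rng(1)

N, Zc = 16, 1.0          # grid size and instability threshold (arbitrary)
B = np.zeros((N, N))     # scalar stand-in for the magnetic field
avalanche_sizes = []     # relaxation events per "flare"

for step in range(4000):
    # Loading: small random increment at a random interior site.
    i, j = rng.integers(1, N - 1, size=2)
    B[i, j] += rng.uniform(0.0, 0.2)
    # Relaxation: any site whose discrete-Laplacian "discontinuity"
    # exceeds Zc diffuses field to its four neighbours until stable.
    size = 0
    unstable = True
    while unstable:
        unstable = False
        for a in range(1, N - 1):
            for b in range(1, N - 1):
                dB = B[a, b] - (B[a-1, b] + B[a+1, b]
                                + B[a, b-1] + B[a, b+1]) / 4.0
                if abs(dB) > Zc:
                    B[a, b] -= 0.8 * dB
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        B[a + da, b + db] += 0.2 * dB
                    size += 1
                    unstable = True
        # Open boundaries: field reaching the edge leaves the system.
        B[0, :] = B[-1, :] = B[:, 0] = B[:, -1] = 0.0
    if size:
        avalanche_sizes.append(size)

print(len(avalanche_sizes), "avalanches; largest:", max(avalanche_sizes))
```

After a transient loading phase the grid hovers near marginal stability, and the recorded avalanche sizes are the quantities whose distribution functions are fitted for power-law behaviour in studies like the one above.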

  19. RNA assemblages orchestrate complex cellular processes

    DEFF Research Database (Denmark)

    Nielsen, Finn Cilius; Hansen, Heidi Theil; Christiansen, Jan

    2016-01-01

    Eukaryotic mRNAs are monocistronic, and therefore mechanisms exist that coordinate the synthesis of multiprotein complexes in order to obtain proper stoichiometry at the appropriate intracellular locations. RNA-binding proteins containing low-complexity sequences are prone to generate liquid drop...

  20. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    International Nuclear Information System (INIS)

    Taylor, J'Tia Patrice; Shropshire, David E.

    2009-01-01

    This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in reduction of carbon emissions by reducing demands for foreign derived fossil fuels. The paper begins with an overview of nuclear reactors and process application for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near term high-temperature, and far term high temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time dependent dynamic analysis is required. The paper identifies critical issues arising from dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. 
Economic issues include the cost differential arising due to an integrated system

  1. Features, Events, and Processes in SZ Flow and Transport

    International Nuclear Information System (INIS)

    Economy, K.

    2004-01-01

This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either "Included" or "Excluded", is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), (f) (DIRS 156605). This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  2. Features, Events, and Processes in SZ Flow and Transport

    International Nuclear Information System (INIS)

    S. Kuzio

    2005-01-01

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), (f) [DIRS 173273]. This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  3. Features, Events, and Processes in SZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    K. Economy

    2004-11-16

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either "Included" or "Excluded", is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), (f) (DIRS 156605). This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  4. Features, Events, and Processes in SZ Flow and Transport

    Energy Technology Data Exchange (ETDEWEB)

    S. Kuzio

    2005-08-20

    This analysis report evaluates and documents the inclusion or exclusion of the saturated zone (SZ) features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment (TSPA) for license application (LA) of a nuclear waste repository at Yucca Mountain, Nevada. A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for the decision. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), (f) [DIRS 173273]. This scientific report focuses on FEP analysis of flow and transport issues relevant to the SZ (e.g., fracture flow in volcanic units, anisotropy, radionuclide transport on colloids, etc.) to be considered in the TSPA model for the LA. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded).

  5. National Institutes of Health–Sponsored Clinical Islet Transplantation Consortium Phase 3 Trial: Manufacture of a Complex Cellular Product at Eight Processing Facilities

    Science.gov (United States)

    Balamurugan, A.N.; Szot, Gregory L.; Kin, Tatsuya; Liu, Chengyang; Czarniecki, Christine W.; Barbaro, Barbara; Bridges, Nancy D.; Cano, Jose; Clarke, William R.; Eggerman, Thomas L.; Hunsicker, Lawrence G.; Kaufman, Dixon B.; Khan, Aisha; Lafontant, David-Erick; Linetsky, Elina; Luo, Xunrong; Markmann, James F.; Naji, Ali; Korsgren, Olle; Oberholzer, Jose; Turgeon, Nicole A.; Brandhorst, Daniel; Chen, Xiaojuan; Friberg, Andrew S.; Lei, Ji; Wang, Ling-jia; Wilhelm, Joshua J.; Willits, Jamie; Zhang, Xiaomin; Hering, Bernhard J.; Posselt, Andrew M.; Stock, Peter G.; Shapiro, A.M. James

    2016-01-01

    Eight manufacturing facilities participating in the National Institutes of Health–sponsored Clinical Islet Transplantation (CIT) Consortium jointly developed and implemented a harmonized process for the manufacture of allogeneic purified human pancreatic islet (PHPI) product evaluated in a phase 3 trial in subjects with type 1 diabetes. Manufacturing was controlled by a common master production batch record, standard operating procedures that included acceptance criteria for deceased donor organ pancreata and critical raw materials, PHPI product specifications, certificate of analysis, and test methods. The process was compliant with Current Good Manufacturing Practices and Current Good Tissue Practices. This report describes the manufacturing process for 75 PHPI clinical lots and summarizes the results, including lot release. The results demonstrate the feasibility of implementing a harmonized process at multiple facilities for the manufacture of a complex cellular product. The quality systems and regulatory and operational strategies developed by the CIT Consortium yielded product lots that met the prespecified characteristics of safety, purity, potency, and identity and were successfully transplanted into 48 subjects. No adverse events attributable to the product and no cases of primary nonfunction were observed. PMID:27465220

  6. Human Performance Event Database

    International Nuclear Information System (INIS)

    Trager, E. A.

    1998-01-01

    The purpose of this paper is to describe several aspects of a Human Performance Event Database (HPED) that is being developed by the Nuclear Regulatory Commission. These include the background, the database structure and the basis for that structure, the process for coding and entering event records, the results of preliminary analyses of information in the database, and plans for the future. In 1992, the Office for Analysis and Evaluation of Operational Data (AEOD) within the NRC decided to develop a database for information on human performance during operating events. The database was needed to classify and categorize this information and to feed operating experience back to licensees and others. An NRC interoffice working group prepared a list of human performance information that should be reported for events; the list was based on the Human Performance Investigation Process (HPIP), which had been developed by the NRC as an aid in investigating events. The structure of the HPED was based on that list. The HPED currently includes data on events described in augmented inspection team (AIT) and incident investigation team (IIT) reports from 1990 through 1996, AEOD human performance studies from 1990 through 1993, recent NRR special team inspections, and licensee event reports (LERs) that were prepared for the events. (author)

  7. Spatial and Semantic Processing between Audition and Vision: An Event-Related Potential Study

    Directory of Open Access Journals (Sweden)

    Xiaoxi Chen

    2011-10-01

    Using a crossmodal priming paradigm, this study investigated how the brain binds spatial and semantic features in multisensory processing. The visual stimuli (pictures of animals) were presented after the auditory stimuli (sounds of animals), and the stimuli from different modalities could match spatially (or semantically) or not. Participants were required to detect the head orientation of the visual target (an oddball paradigm). The event-related potentials (ERPs) to the visual stimuli were enhanced by spatial attention (150–170 ms) irrespective of semantic information. The early crossmodal attention effect for the visual stimuli was more negative in the spatial-congruent condition than in the spatial-incongruent condition. By contrast, the later ERP effects were significant only for the semantic-congruent condition (250–300 ms). These findings indicated that spatial attention modulated early visual processing, and that semantic and spatial features were simultaneously used to orient attention and modulate later processing stages.

  8. Probing the lifetimes of auditory novelty detection processes.

    Science.gov (United States)

    Pegado, Felipe; Bekinschtein, Tristan; Chausson, Nicolas; Dehaene, Stanislas; Cohen, Laurent; Naccache, Lionel

    2010-08-01

    Auditory novelty detection can be fractionated into multiple cognitive processes associated with their respective neurophysiological signatures. In the present study we used high-density scalp event-related potentials (ERPs) during an active version of the auditory oddball paradigm to explore the lifetimes of these processes by varying the stimulus onset asynchrony (SOA). We observed that early MMN (90-160 ms) decreased when the SOA increased, confirming the evanescence of this echoic memory system. Subsequent neural events including late MMN (160-220 ms) and P3a/P3b components of the P3 complex (240-500 ms) did not decay with SOA, but showed a systematic delay effect supporting a two-stage model of accumulation of evidence. On the basis of these observations, we propose a distinction within the MMN complex of two distinct events: (1) an early, pre-attentive and fast-decaying MMN associated with generators located within superior temporal gyri (STG) and frontal cortex, and (2) a late MMN more resistant to SOA, corresponding to the activation of a distributed cortical network including fronto-parietal regions. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  9. Functional definition of the N450 event-related brain potential marker of conflict processing: a numerical stroop study

    OpenAIRE

    Szűcs, Denes; Soltész, F

    2012-01-01

    BACKGROUND: Several conflict processing studies aimed to dissociate neuroimaging phenomena related to stimulus and response conflict processing. However, previous studies typically did not include a paradigm-independent measure of either stimulus or response conflict. Here we have combined electro-myography (EMG) with event-related brain potentials (ERPs) in order to determine whether a particularly robust marker of conflict processing, the N450 ERP effect usually related to the activity of t...

  10. Deterministic ripple-spreading model for complex networks.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model, which is inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, where it is often observed that the influence of a few local events spreads out through nodes, and then largely determines the final network topology. Obviously, this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic models, i.e., with a given input, they cannot output a unique topology. Differently, the proposed ripple-spreading model can uniquely determine the final network topology, and at the same time, the stochastic feature of complex networks is captured by randomly initializing ripple-spreading related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading related parameters to precisely describe a network topology, which is more memory efficient when compared with traditional adjacency matrix or similar memory-expensive data structures. (iv) The ripple-spreading model has a very good potential for both extensions and applications.
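The deterministic character of the model (point ii above) can be illustrated with a short sketch. Everything below is a toy construction, not the paper's formulation: node positions and ripple energies are randomly initialized from a seed, and a link forms wherever a ripple still exceeds a hypothetical threshold after linear decay over distance, so a given parameter set always yields the same topology.

```python
import math
import random

def ripple_spreading_network(n, seed=0, threshold=0.4, decay=1.5):
    """Toy ripple-spreading network generator (simplified sketch).
    The ripple-spreading parameters are randomly initialized from the
    seed; after that, the topology is uniquely determined."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    energy = [rng.uniform(0.5, 1.0) for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(pts[i], pts[j])
            # a link forms when either node's ripple still carries enough
            # energy on reaching the other node (linear decay with distance)
            if energy[i] - decay * d >= threshold or energy[j] - decay * d >= threshold:
                edges.add((i, j))
    return edges
```

Note how the stochastic feature of real networks is captured entirely in the seeded initialization, while the spreading rule itself is deterministic, in keeping with advantage (ii) above.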

  11. Primary photochemical processes for Pt(iv) diazido complexes prospective in photodynamic therapy of tumors.

    Science.gov (United States)

    Shushakov, Anton A; Pozdnyakov, Ivan P; Grivin, Vjacheslav P; Plyusnin, Victor F; Vasilchenko, Danila B; Zadesenets, Andrei V; Melnikov, Alexei A; Chekalin, Sergey V; Glebov, Evgeni M

    2017-07-25

    Diazido diamino complexes of Pt(iv) are considered as prospective prodrugs in oxygen-free photodynamic therapy (PDT). Primary photophysical and photochemical processes for the cis,trans,cis-[Pt(N3)2(OH)2(NH3)2] and trans,trans,trans-[Pt(N3)2(OH)2(NH3)2] complexes were studied by means of stationary photolysis, nanosecond laser flash photolysis and ultrafast kinetic spectroscopy. The photolysis is a multistage process. The first stage is the photosubstitution of an azide ligand by a water molecule. This process was shown to be a chain reaction involving redox stages. The Pt(iv) and Pt(iii) intermediates responsible for chain propagation were recorded using ultrafast kinetic spectroscopy and nanosecond laser flash photolysis. A mechanism for the photosubstitution is proposed.

  12. Soundscapes, events, resistance

    Directory of Open Access Journals (Sweden)

    Andrea Mubi Brighenti

    2008-12-01

    To put it bluntly, a soundscape is the sonic counterpart, or component, of landscape. From this minimal assumption, some interesting consequences follow: just as landscape is far from being a simple stage-set upon which events take place, soundscape, too, is itself evental, i.e., it consists of events. Not only because its nature, far from being acoustics, is always ‘psychoacoustics’, as Murray Schafer (1977/1994) first argued. Processes of environmental perception are, of course, there.

  13. RAVEN. Dynamic Event Tree Approach Level III Milestone

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-07-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (DPRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used DPRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed to perform two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, the control logic infrastructure is used to model stochastic events, such as component failures, and to perform uncertainty propagation. Such stochastic modeling is deployed using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining-based algorithms. This report focuses on the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a DPRA analysis, using DET, of a simplified pressurized water reactor for a Station Black-Out (SBO) scenario is presented.

  14. RAVEN: Dynamic Event Tree Approach Level III Milestone

    Energy Technology Data Exchange (ETDEWEB)

    Andrea Alfonsi; Cristian Rabiti; Diego Mandelli; Joshua Cogliati; Robert Kinoshita

    2013-07-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (DPRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used DPRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed to perform two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, the control logic infrastructure is used to model stochastic events, such as component failures, and to perform uncertainty propagation. Such stochastic modeling is deployed using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining-based algorithms. This report focuses on the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a DPRA analysis, using DET, of a simplified pressurized water reactor for a Station Black-Out (SBO) scenario is presented.
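The core DET idea described in these two records — forking the simulation at stochastic events and tracking each sequence's cumulative probability — can be sketched generically. This is not RAVEN's API; the branch times, outcome labels, and probability cutoff below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Branch:
    time: float     # simulation time at which the stochastic event occurs
    outcomes: list  # (label, probability) pairs for the event's outcomes

def dynamic_event_tree(branches, prob_cutoff=1e-3):
    """Expand branching points in time order, forking every surviving
    scenario at each one and pruning sequences whose cumulative
    probability falls below the cutoff."""
    sequences = [([], 1.0)]
    for b in sorted(branches, key=lambda b: b.time):
        expanded = []
        for path, p in sequences:
            for label, bp in b.outcomes:
                q = p * bp
                if q >= prob_cutoff:  # truncate improbable sequences
                    expanded.append((path + [(b.time, label)], q))
        sequences = expanded
    return sequences

# station-blackout-like toy: emergency diesels, then an auxiliary pump
sequences = dynamic_event_tree([
    Branch(10.0, [("diesel-start", 0.95), ("diesel-fail", 0.05)]),
    Branch(60.0, [("pump-on", 0.9), ("pump-fail", 0.1)]),
])
```

Unlike a conventional static event tree, each branching here carries a timestamp, so the timing/sequencing of events is preserved in every generated scenario.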

  15. Event Investigation

    International Nuclear Information System (INIS)

    Korosec, D.

    2000-01-01

    Events in the nuclear industry are investigated both from the licensee's point of view and from the regulatory side. The importance of event investigation is well known. One of the main goals of such investigation is to prevent recurrence of the circumstances leading to the event and of its consequences. Protection of nuclear workers against nuclear hazards, and of the general public against the dangerous effects of an event, can be achieved by a systematic approach to event investigation. Both the nuclear safety regulatory body and the licensee shall ensure that operationally significant events are investigated in a systematic and technically sound manner to gather information pertaining to the probable causes of the event. One of the results should be appropriate feedback of the lessons learned to the regulatory body, the nuclear industry, and the general public. In the present paper a general description of a systematic approach to event investigation is presented. The systematic approach works best where there is cooperation among the different divisions of the nuclear facility or regulatory body. By involving management and supervisors, the safety office can usually improve its efforts across the whole process. The end result shall be a program which serves to prevent events and to reduce the time and effort spent resolving the root cause of each event. Selection of the proper method for the investigation, and an adequate review of the findings and conclusions, lead to a higher level of overall nuclear safety. (author)

  16. CLARA: A Contemporary Approach to Physics Data Processing

    Energy Technology Data Exchange (ETDEWEB)

    V Gyurjyan, D Abbott, J Carbonneau, G Gilfoyle, D Heddle, G Heyes, S Paul, C Timmer, D Weygand, E Wolin

    2011-12-01

    In traditional physics data processing (PDP) systems, data location is static and is accessed by analysis applications. In comparison, CLARA (CLAS12 Reconstruction and Analysis framework) is an environment where data processing algorithms filter continuously flowing data. In CLARA's domain of loosely coupled services, data is not stored, but rather flows from one service to another, mutating constantly along the way. Agents, performing event processing, can then subscribe to particular data/events at any stage of the data transformation, and make intricate decisions (e.g. particle ID) by correlating events from multiple, parallel data streams and/or services. This paper presents a PDP application development framework based on service oriented and event driven architectures. This system allows users to design (Java, C++, and Python languages are supported) and deploy data processing services, as well as dynamically compose PDP applications using available services. The PDP service bus provides a layer on top of a distributed pub-sub middleware implementation, which allows complex service composition and integration without writing code. Examples of service creation and deployment, along with the CLAS12 track reconstruction application design will be presented.
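The flow-through, publish/subscribe style of processing described above can be miniaturized as follows. This is a single-process sketch under assumed names (`ServiceBus`, topics `raw`/`calibrated`), not CLARA's actual distributed service bus or middleware.

```python
from collections import defaultdict

class ServiceBus:
    """Minimal in-process publish/subscribe bus. Illustrative only:
    CLARA's service bus sits on distributed pub-sub middleware and
    supports dynamic service composition, none of which is shown here."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subs[topic]:
            handler(event)

bus = ServiceBus()
reconstructed = []
# loosely coupled services: data is not stored, it flows and mutates
bus.subscribe("raw", lambda ev: bus.publish("calibrated", {**ev, "calibrated": True}))
bus.subscribe("calibrated", lambda ev: reconstructed.append({**ev, "tracks": []}))
bus.publish("raw", {"event_id": 1})
```

The design point this illustrates is the one the abstract makes: agents subscribe to data at any stage of the transformation, so new consumers can be attached without modifying upstream services.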

  17. Imaging transient blood vessel fusion events in zebrafish by correlative volume electron microscopy.

    Directory of Open Access Journals (Sweden)

    Hannah E J Armer

    The study of biological processes has become increasingly reliant on obtaining high-resolution spatial and temporal data through imaging techniques. As researchers demand molecular resolution of cellular events in the context of whole organisms, correlation of non-invasive live-organism imaging with electron microscopy in complex three-dimensional samples becomes critical. The developing blood vessels of vertebrates form a highly complex network which cannot be imaged at high resolution using traditional methods. Here we show that the point of fusion between growing blood vessels of transgenic zebrafish, identified in live confocal microscopy, can subsequently be traced through the structure of the organism using Focused Ion Beam/Scanning Electron Microscopy (FIB/SEM) and Serial Block Face/Scanning Electron Microscopy (SBF/SEM). The resulting data give unprecedented microanatomical detail of the zebrafish and, for the first time, allow visualization of the ultrastructure of a time-limited biological event within the context of a whole organism.

  18. Comparing the temporal dynamics of thematic and taxonomic processing using event-related potentials.

    Directory of Open Access Journals (Sweden)

    Olivera Savic

    We report the results of a study comparing the temporal dynamics of thematic and taxonomic knowledge activation in a picture-word priming paradigm using event-related potentials. Although we found no behavioral differences between thematic and taxonomic processing, ERP data revealed distinct patterns of N400 and P600 amplitude modulation for thematic and taxonomic priming. Thematically related target stimuli elicited less negativity than taxonomic targets between 280-460 ms after stimulus onset, suggesting easier semantic processing of thematic than taxonomic relationships. Moreover, P600 mean amplitude was significantly increased for taxonomic targets between 520-600 ms, consistent with a greater need for stimulus reevaluation in that condition. These results offer novel evidence in favor of a dissociation between thematic and taxonomic thinking in the early phases of conceptual evaluation.

  19. Comparing the temporal dynamics of thematic and taxonomic processing using event-related potentials.

    Science.gov (United States)

    Savic, Olivera; Savic, Andrej M; Kovic, Vanja

    2017-01-01

    We report the results of a study comparing the temporal dynamics of thematic and taxonomic knowledge activation in a picture-word priming paradigm using event-related potentials. Although we found no behavioral differences between thematic and taxonomic processing, ERP data revealed distinct patterns of N400 and P600 amplitude modulation for thematic and taxonomic priming. Thematically related target stimuli elicited less negativity than taxonomic targets between 280-460 ms after stimulus onset, suggesting easier semantic processing of thematic than taxonomic relationships. Moreover, P600 mean amplitude was significantly increased for taxonomic targets between 520-600 ms, consistent with a greater need for stimulus reevaluation in that condition. These results offer novel evidence in favor of a dissociation between thematic and taxonomic thinking in the early phases of conceptual evaluation.

  20. A hierarchy of event-related potential markers of auditory processing in disorders of consciousness

    Directory of Open Access Journals (Sweden)

    Steve Beukema

    2016-01-01

    Functional neuroimaging of covert perceptual and cognitive processes can inform the diagnoses and prognoses of patients with disorders of consciousness, such as the vegetative and minimally conscious states (VS; MCS). Here we report an event-related potential (ERP) paradigm for detecting a hierarchy of auditory processes in a group of healthy individuals and patients with disorders of consciousness. Simple cortical responses to sounds were observed in all 16 patients; 7/16 (44%) patients exhibited markers of the differential processing of speech and noise; and 1 patient produced evidence of the semantic processing of speech (i.e. the N400 effect). In several patients, the level of auditory processing that was evident from ERPs was higher than the abilities that were evident from behavioural assessment, indicating a greater sensitivity of ERPs in some cases. However, there were no differences in auditory processing between the VS and MCS patient groups, indicating a lack of diagnostic specificity for this paradigm. Reliably detecting semantic processing by means of the N400 effect in passively listening single subjects is a challenge. Multiple assessment methods are needed in order to fully characterise the abilities of patients with disorders of consciousness.

  1. The Role of Episodic Memory in Controlled Evaluative Judgments about Attitudes: An Event-Related Potential Study

    Science.gov (United States)

    Johnson, Ray, Jr.; Simon, Elizabeth J.; Henkell, Heather; Zhu, John

    2011-01-01

    Event-related potentials (ERPs) are unique in their ability to provide information about the timing of activity in the neural networks that perform complex cognitive processes. Given the dearth of extant data from normal controls on the question of whether attitude representations are stored in episodic or semantic memory, the goal here was to…

  2. Event boundaries and anaphoric reference.

    Science.gov (United States)

    Thompson, Alexis N; Radvansky, Gabriel A

    2016-06-01

    The current study explored the finding that parsing a narrative into separate events impairs anaphor resolution. According to the Event Horizon Model, when a narrative event boundary is encountered, a new event model is created. Information associated with the prior event model is removed from working memory. So long as the event model containing the anaphor referent is currently being processed, this information should still be available when there is no narrative event boundary, even if reading has been disrupted by a working-memory-clearing distractor task. In those cases, readers may reactivate their prior event model, and anaphor resolution would not be affected. Alternatively, comprehension may not be as event oriented as this account suggests. Instead, any disruption of the contents of working memory during comprehension, event related or not, may be sufficient to disrupt anaphor resolution. In this case, reading comprehension would be more strongly guided by other, more basic language processing mechanisms and the event structure of the described events would play a more minor role. In the current experiments, participants were given stories to read in which we included, between the anaphor and its referent, either the presence of a narrative event boundary (Experiment 1) or a narrative event boundary along with a working-memory-clearing distractor task (Experiment 2). The results showed that anaphor resolution was affected by narrative event boundaries but not by a working-memory-clearing distractor task. This is interpreted as being consistent with the Event Horizon Model of event cognition.

  3. The Effects of Visual Complexity for Japanese Kanji Processing with High and Low Frequencies

    Science.gov (United States)

    Tamaoka, Katsuo; Kiyama, Sachiko

    2013-01-01

    The present study investigated the effects of visual complexity for kanji processing by selecting target kanji from different stroke ranges of visually simple (2-6 strokes), medium (8-12 strokes), and complex (14-20 strokes) kanji with high and low frequencies. A kanji lexical decision task in Experiment 1 and a kanji naming task in Experiment 2…

  4. A digital processing method for the analysis of complex nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Abani, M.C.; Bairi, B.R.

    1994-01-01

    This paper describes a digital processing method using frequency power spectra for the analysis of complex nuclear spectra. The power spectra were estimated by employing modified discrete Fourier transform. The method was applied to observed spectral envelopes. The results for separating closely-spaced doublets in nuclear spectra of low statistical precision compared favorably with those obtained by using a popular peak fitting program SAMPO. The paper also describes limitations of the peak fitting methods. It describes the advantages of digital processing techniques for type II digital signals including nuclear spectra. A compact computer program occupying less than 2.5 kByte of memory space was written in BASIC for the processing of observed spectral envelopes. (orig.)
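The central step — estimating the frequency power spectrum of an observed spectral envelope — can be sketched with a plain FFT. The paper's modified discrete Fourier transform is not detailed in the abstract, so the following uses the standard DFT on a synthetic closely spaced doublet.

```python
import numpy as np

def power_spectrum(envelope):
    """Frequency power spectrum of an observed spectral envelope.
    (Plain DFT via FFT; the paper's modified discrete Fourier
    transform is not reproduced here.)"""
    coeffs = np.fft.rfft(envelope - np.mean(envelope))
    return np.abs(coeffs) ** 2 / len(envelope)

# toy envelope: a closely spaced doublet of Gaussian peaks
x = np.arange(128)
doublet = (np.exp(-0.5 * ((x - 60) / 3.0) ** 2)
           + 0.7 * np.exp(-0.5 * ((x - 68) / 3.0) ** 2))
ps = power_spectrum(doublet)
```

Because the envelope is smooth, its power concentrates at low frequencies; the separation of unresolved peaks shows up as structure in this power spectrum, which is the handle such frequency-domain methods exploit.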

  5. Mathematical foundations of event trees

    International Nuclear Information System (INIS)

    Papazoglou, Ioannis A.

    1998-01-01

    A mathematical foundation from first principles of event trees is presented. The main objective of this formulation is to offer a formal basis for developing automated computer assisted construction techniques for event trees. The mathematical theory of event trees is based on the correspondence between the paths of the tree and the elements of the outcome space of a joint event. The concept of a basic cylinder set is introduced to describe joint event outcomes conditional on specific outcomes of basic events or unconditional on the outcome of basic events. The concept of outcome space partition is used to describe the minimum amount of information intended to be preserved by the event tree representation. These concepts form the basis for an algorithm for systematic search for and generation of the most compact (reduced) form of an event tree consistent with the minimum amount of information the tree should preserve. This mathematical foundation allows for the development of techniques for automated generation of event trees corresponding to joint events which are formally described through other types of graphical models. Such a technique has been developed for complex systems described by functional blocks and it is reported elsewhere. On the quantification issue of event trees, a formal definition of a probability space corresponding to the event tree outcomes is provided. Finally, a short discussion is offered on the relationship of the presented mathematical theory with the more general use of event trees in reliability analysis of dynamic systems
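The correspondence between tree paths and elements of the joint outcome space, and the basic cylinder sets, can be made concrete in a few lines. The basic events and probabilities below are invented purely for illustration.

```python
from itertools import product

def event_tree_paths(basic_events):
    """Each path of the (unreduced) event tree corresponds to one element
    of the outcome space of the joint event: the Cartesian product of the
    basic events' outcome sets, carrying the product of probabilities."""
    paths = []
    for combo in product(*basic_events.values()):
        prob = 1.0
        for _, p in combo:
            prob *= p
        paths.append((tuple(label for label, _ in combo), prob))
    return paths

def cylinder_set(paths, names, fixed):
    """Basic cylinder set: the joint outcomes conditional on specific
    outcomes of some basic events, unconditional on all the others."""
    idx = {n: i for i, n in enumerate(names)}
    return [(o, p) for o, p in paths
            if all(o[idx[n]] == v for n, v in fixed.items())]

events = {
    "pump":  [("works", 0.99), ("fails", 0.01)],
    "valve": [("opens", 0.95), ("stuck", 0.05)],
}
paths = event_tree_paths(events)
failed_pump = cylinder_set(paths, list(events), {"pump": "fails"})
```

A reduced tree, in these terms, would merge paths that the chosen outcome-space partition does not distinguish, which is the compaction the algorithm in the paper searches for.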

  6. Ionospheric effects during severe space weather events seen in ionospheric service data products

    Science.gov (United States)

    Jakowski, Norbert; Danielides, Michael; Mayer, Christoph; Borries, Claudia

    Space weather effects are closely related to complex perturbation processes in the magnetosphere-ionosphere-thermosphere systems, initiated by enhanced solar energy input. To understand and model complex space weather processes, different views on the same subject are helpful. One of the ionosphere key parameters is the Total Electron Content (TEC) which provides a first order approximation of the ionospheric range error in Global Navigation Satellite System (GNSS) applications. Additionally, horizontal gradients and time rate of change of TEC are important for estimating the perturbation degree of the ionosphere. TEC maps can effectively be generated using ground based GNSS measurements from global receiver networks. Whereas ground based GNSS measurements provide good horizontal resolution, space based radio occultation measurements can complete the view by providing information on the vertical plasma density distribution. The combination of ground based TEC and vertical sounding measurements provides essential information on the shape of the vertical electron density profile by computing the equivalent slab thickness at the ionosonde station site. Since radio beacon measurements at 150/400 MHz are well suited to trace the horizontal structure of Travelling Ionospheric Disturbances (TIDs), these data products essentially complete GNSS based TEC mapping results. Radio scintillation data products, characterising small scale irregularities in the ionosphere, are useful to estimate the continuity and availability of transionospheric radio signals. The different data products are addressed while discussing severe space weather events in the ionosphere, e.g. the events in October/November 2003. The complementary view of different near real time service data products is helpful to better understand the complex dynamics of ionospheric perturbation processes and to forecast the development of parameters customers are interested in.
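
    The equivalent slab thickness mentioned above is the standard ratio of TEC to the F2-layer peak electron density; a minimal sketch with illustrative values:

```python
def slab_thickness(tec, nmf2):
    """Equivalent slab thickness (m): TEC (electrons/m^2) divided by the
    F2-layer peak electron density NmF2 (electrons/m^3)."""
    return tec / nmf2

# Illustrative daytime mid-latitude values: 20 TECU, NmF2 = 6e11 el/m^3
tau = slab_thickness(20e16, 6e11)  # on the order of a few hundred km
```

    A change in slab thickness at constant TEC indicates a change in the shape of the vertical electron density profile, which is why combining GNSS TEC with ionosonde NmF2 is informative during perturbations.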

  7. Process modeling of the platform choice for development of the multimedia educational complex

    Directory of Open Access Journals (Sweden)

    Ірина Олександрівна Бондар

    2016-10-01

    Full Text Available The article presents a methodical approach to the choice of a platform as the technological basis for building an open, functional structure and for the further implementation of the substantive content of the modules of the network multimedia complex for a discipline. The proposed approach is implemented through the use of mathematical tools. The result of the process modeling is the selection of the most appropriate platform for development of the multimedia complex.

  8. Developing future precipitation events from historic events: An Amsterdam case study.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events are expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailor climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather' and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands, in the summer of 2014 and resulted in blocked highways, disruption of air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, in which the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate. 
The two
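
    A minimal sketch of such a delta transformation, assuming an illustrative super-Clausius-Clapeyron rate of about 14% per degree of dew point temperature increase (the abstract does not give the actual scaling function used):

```python
def scale_precipitation(intensity, delta_t_dew, rate=0.14):
    """Scale an observed precipitation intensity to a warmer climate,
    compounding `rate` per degree of dew point temperature increase.
    The 14 %/degree rate is an assumption, not taken from the abstract."""
    return intensity * (1.0 + rate) ** delta_t_dew

# Observed 30 mm/h peak intensity, +2 degrees dew point temperature
future_intensity = scale_precipitation(30.0, 2.0)
```

    Applying such a transformation to every time step of the observed event preserves its spatial and temporal structure while intensifying it, which is the appeal of the 'Future Weather' concept to stakeholders.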

  9. Management of a Complex Open Channel Network During Flood Events

    Science.gov (United States)

    Franchini, M.; Valiani, A.; Schippa, L.; Mascellani, G.

    2003-04-01

    Most of the area around Ferrara (Italy) is below mean sea level, and an extensive drainage system combined with several pump stations allows the use of this area for urban development as well as industrial and agricultural activities. The three main channels of this hydraulic system constitute the Ferrara Inland Waterway (total length approximately 70 km), which connects the Po river near Ferrara to the sea. Because of the level difference between the upstream and downstream ends of the waterway, three locks are located along it, each of them combined with a set of gates to control the water levels. During rainfall events, most of the water of the basin flows into the waterway, and heavy precipitation sometimes causes flooding in several areas. This is due to the insufficient dimensions of the channel network and inadequate manual operation of the gates. This study presents a hydrological-hydraulic model for the entire Ferrara basin and a system of rules for operating the gates. In particular, their opening is designed to be regulated in real time by monitoring the water level in several sections along the channels. Besides flood peak attenuation, this operation strategy also contributes to the maintenance of a constant water level for irrigation and fluvial navigation during dry periods. With reference to the flood event of May 1996, it is shown that this floodgate operation policy, unlike the one actually adopted during that event, would lead to a significant flood peak attenuation, avoiding flooding in the area upstream of Ferrara.

  10. Safety related events at nuclear installations in 1995

    DEFF Research Database (Denmark)

    Korsbech, Uffe C C

    1996-01-01

    Nuclear safety related events of significance at least corresponding to level 2 of the International Nuclear Event Scale are described. In 1995 only two events occurred at nuclear power plants, and four events occurred at plants using ionizing radiation for processing or research.

  11. A reconfigurable hybrid supervisory system for process control

    International Nuclear Information System (INIS)

    Garcia, H.E.; Ray, A.; Edwards, R.M.

    1994-01-01

    This paper presents a reconfigurable approach to decision and control systems for complex dynamic processes. The proposed supervisory control system is a reconfigurable hybrid architecture structured into three functional levels of hierarchy, namely execution, supervision, and coordination. While the bottom execution level is constituted by either reconfigurable continuously varying or discrete event systems, the top two levels are necessarily governed by reconfigurable sets of discrete event decision and control systems. Based on the process status, the active set of control and supervisory algorithms is chosen. The reconfigurable hybrid system is briefly described along with a discussion of its implementation at the Experimental Breeder Reactor II of Argonne National Laboratory. A process control application of this hybrid system is presented and evaluated in an in-plant experiment.
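
    The status-based selection of the active algorithm set can be sketched as a simple dispatch table; the statuses and control actions below are hypothetical placeholders, not taken from the paper:

```python
# Dispatch table mapping process status to the active control algorithm;
# statuses and actions are hypothetical placeholders.
controllers = {
    "startup": lambda measurement: "ramp setpoint",
    "nominal": lambda measurement: "hold setpoint",
    "fault": lambda measurement: "safe shutdown",
}

def supervise(status, measurement):
    """Top-level supervisor: pick and run the controller that matches
    the current process status."""
    return controllers[status](measurement)
```

    In the hybrid architecture described above, the transitions between statuses would themselves be governed by a discrete event system rather than set by hand.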

  13. NMD Classifier: A reliable and systematic classification tool for nonsense-mediated decay events.

    Directory of Open Access Journals (Sweden)

    Min-Kung Hsu

    Full Text Available Nonsense-mediated decay (NMD) degrades mRNAs that include premature termination codons to avoid the translation and accumulation of truncated proteins. This mechanism has been found to participate in gene regulation and a wide spectrum of biological processes. However, the evolutionary and regulatory origins of NMD-targeted transcripts (NMDTs) have been less studied, partly because of the complexity of analyzing NMD events. Here we report NMD Classifier, a tool for the systematic classification of NMD events for either annotated or de novo assembled transcripts. This tool is based on the assumption of minimal evolution/regulation: the event that leads to the least change is the most likely to occur. Our simulation results indicate that NMD Classifier can correctly identify an average of 99.3% of the NMD-causing transcript structural changes, particularly exon inclusions/exclusions and exon boundary alterations. Researchers can apply NMD Classifier to evolutionary and regulatory studies by comparing NMD events across different biological conditions or organisms.
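
    For background, a common heuristic for flagging NMD targets (not NMD Classifier's own algorithm, which compares transcript structural changes between isoforms) is the 50-55 nt rule, sketched here:

```python
def is_nmd_target(stop_pos, junction_positions, rule_nt=55):
    """Heuristic 50-55 nt rule: a termination codon located more than
    `rule_nt` nucleotides upstream of the last exon-exon junction is
    predicted to trigger NMD. Positions are transcript coordinates."""
    if not junction_positions:
        return False  # single-exon transcript: no downstream junction
    return max(junction_positions) - stop_pos > rule_nt

# Stop codon at nt 300, last junction at nt 400: 100 nt upstream -> NMD
flagged = is_nmd_target(300, [150, 400])
```

    Exon inclusions/exclusions and boundary alterations matter precisely because they can move a stop codon relative to the last exon-exon junction and thereby flip this prediction.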

  14. Strategy for introduction of rainwater management facility considering rainfall event applied on new apartment complex

    Science.gov (United States)

    KIM, H.; Lee, D. K.; Yoo, S.

    2014-12-01

    As regional torrential rains become frequent due to climate change, urban flooding happens very often. It is therefore necessary to prepare integrated measures against a wide range of rainfall. This study proposes the introduction of effective rainwater management facilities to maximize rainwater runoff reduction and recover the natural water circulation under unpredictable extreme rainfall at the apartment-complex scale. The study site is a new apartment complex in Hanam, located east of Seoul, Korea. It has an area of 7.28 ha and is analysed using the EPA-SWMM and STORM models. First, the flood reduction efficiency of green infrastructure (GI) is analysed for various rainfall events and soil characteristics, and the most effective values of the variables are derived. For the rainfall events, the last 10 years of 15-minute rainfall data were used. A comparison between event A (686 mm of rainfall over 22 days) and event B (661 mm over 4 days) showed that soil infiltration is 17.08% of the rainfall for A and 5.48% for B. The reduction of runoff after introduction of the GI is 24.76% for A and 6.56% for B. These results mean that GI is effective at small rainfall intensities, while an artificial rainwater retarding reservoir is needed for extreme rainfall. Second, a target year is set for the recovery of the pre-development hydrological cycle, and the amounts of infiltration, evaporation, and surface runoff for the target year and for present conditions are analysed on the basis of land coverage and the arrangement of LID facilities. Third, rainwater management scenarios are established and simulated with SWMM-LID. The rainwater management facilities include GI (green roof, porous pavement, vegetative swale, ecological pond, and raingarden) and an artificial rainwater retarding reservoir. The design scenarios fall into five types: 1) no GI, 2) conventional GI design (current design), 3) intensive GI design, 4) GI design plus rainwater retarding reservoir, 5) maximized rainwater retarding reservoir. 
Intensive GI design is to have attribute value to

  15. New strategies in actinide separation - water-soluble complexing agents for the innovative SANEX process

    International Nuclear Information System (INIS)

    Ruff, Christian M.; Muellich, Udo; Geist, Andreas; Panak, Petra J.

    2012-01-01

    Reduction of the radiotoxicity and thermal output of radioactive wastes prior to their permanent disposal is a topic of extreme interest for the issue of final nuclear waste disposal. One possibility to this end is a process referred to as actinide separation. This process can be optimised by means of a newly developed water-soluble molecule, as has been shown in studies on the molecule's complex chemistry using ultra-modern laser-based spectroscopy methods under process-relevant reaction conditions. Through the use of curium (III) and europium (III), which as members of the trivalent actinides and lanthanides family have excellent spectroscopic properties, it has been possible to generate spectroscopic and thermodynamic data which will facilitate our understanding of the complex chemistry and extraction chemistry of this molecule family.

  16. Modelling of the quenching process in complex superconducting magnet systems

    International Nuclear Information System (INIS)

    Hagedorn, D.; Rodriguez-Mateos, F.

    1992-01-01

    This paper reports that the superconducting twin bore dipole magnet for the proposed Large Hadron Collider (LHC) at CERN shows a complex winding structure consisting of eight compact layers, each of them electromagnetically and thermally coupled with the others. This magnet is only one part of an electrical circuit; test and operation conditions are characterized by different circuits. In order to study the quenching process in this complex system, design adequate protection schemes, and provide a basis for the dimensioning of protection devices such as heaters, current breakers and dump resistors, a general simulation tool called QUABER has been developed using the analog system analysis program SABER. A complete set of electro-thermal models has been created for the propagation of normal regions. Any network extension or modification is easy to implement without rewriting the whole set of differential equations.

  17. Does ego development increase during midlife? The effects of openness and accommodative processing of difficult events.

    Science.gov (United States)

    Lilgendahl, Jennifer Pals; Helson, Ravenna; John, Oliver P

    2013-08-01

    Although Loevinger's model of ego development is a theory of personality growth, there are few studies that have examined age-related change in ego level over developmentally significant periods of adulthood. To address this gap in the literature, we examined mean-level change and individual differences in change in ego level over 18 years of midlife. In this longitudinal study, participants were 79 predominantly White, college-educated women who completed the Washington University Sentence Completion Test in early (age 43) and late (age 61) midlife as well as measures of the trait of Openness (ages 21, 43, 52, and 61) and accommodative processing (assessed from narratives of difficult life events at age 52). As hypothesized, the sample overall showed a mean-level increase in ego level from age 43 to age 61. Additionally, a regression analysis showed that both the trait of Openness at age 21 and accommodative processing of difficult events that occurred during (as opposed to prior to) midlife were each predictive of increasing ego level from age 43 to age 61. These findings counter prior claims that ego level remains stable during adulthood and contribute to our understanding of the underlying processes involved in personality growth in midlife. © 2012 Wiley Periodicals, Inc.

  18. Complex Networks Dynamics Based on Events-Phase Synchronization and Intensity Correlation Applied to The Anomaly Patterns and Extremes in The Tropical African Climate System

    Science.gov (United States)

    Oluoch, K.; Marwan, N.; Trauth, M.; Loew, A.; Kurths, J.

    2012-04-01

    The African continent lies almost entirely within the tropics, and as such its climate systems are predominantly governed by the heterogeneous spatial and temporal variability of the Hadley and Walker circulations. The variabilities in these meridional and zonal circulations lead to intensification or suppression of the intensities, durations and frequencies of the Inter-Tropical Convergence Zone (ITCZ) migration, the trade winds, the subtropical high-pressure regions and the continental monsoons. These features play a central role in determining the spatial and temporal variability patterns of African rainfall. These climate features and their influence on rainfall patterns are not yet sufficiently understood. Like many real-world systems, atmospheric-oceanic processes exhibit non-linear properties that can be better explored using non-linear (NL) methods of time-series analysis. Over recent years, the complex network approach has evolved as a powerful new player in understanding the spatio-temporal dynamics and evolution of complex systems. Together with NL techniques, it is continuing to find new applications in many areas of science and technology, including climate research. We would like to use these two powerful methods to understand the spatial structure and dynamics of African rainfall anomaly patterns and extremes. The method of event synchronization (ES), developed by Quiroga et al., 2002 and first applied to climate networks by Malik et al., 2011, looks at correlations with a dynamic time lag and as such is a more intuitive way to correlate a complex and heterogeneous system like a climate network than the fixed time delay most commonly used. On the other hand, the shortcomings of ES are its lack of rigorous test statistics for the significance level of the correlations, and the fact that only the events' time indices are synchronized, while all information about how the relative intensities propagate within the network
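
    A fixed-window variant of the event synchronization measure of Quiroga et al. (2002) can be sketched as follows; the original formulation uses an adaptive local coincidence window rather than the fixed `tau` assumed here:

```python
import math

def event_sync(tx, ty, tau):
    """Event synchronization Q between two event-time lists, simplified
    to a fixed coincidence window `tau`; Q = 1 means full synchrony."""
    def coincidences(a, b):
        c = 0.0
        for ta in a:
            for tb in b:
                d = ta - tb
                if 0 < d <= tau:
                    c += 1.0  # event in `a` shortly follows one in `b`
                elif d == 0:
                    c += 0.5  # simultaneous events split between directions
        return c
    total = coincidences(tx, ty) + coincidences(ty, tx)
    return total / math.sqrt(len(tx) * len(ty))

# Two nearly coincident rainfall-event series at different grid points
q = event_sync([2, 7, 13], [2, 8, 13], tau=2)
```

    Computing Q for every pair of grid points yields the adjacency weights of a climate network; note that, as the abstract points out, this uses only event timing and discards the relative intensities.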

  19. Complex rupture process of the Mw 7.8, 2016, Kaikoura earthquake, New Zealand, and its aftershock sequence

    Science.gov (United States)

    Cesca, S.; Zhang, Y.; Mouslopoulou, V.; Wang, R.; Saul, J.; Savage, M.; Heimann, S.; Kufner, S.-K.; Oncken, O.; Dahm, T.

    2017-11-01

    The M7.8 Kaikoura earthquake that struck the northeastern South Island, New Zealand, on November 14, 2016 (local time) is one of the largest earthquakes ever instrumentally recorded in New Zealand. It occurred at the southern termination of the Hikurangi subduction margin, where the subducting Pacific Plate transitions into the dextral Alpine transform fault. The earthquake produced significant distributed uplift along the north-eastern part of the South Island, reaching a peak amplitude of ∼8 m, which was accompanied by large (≥10 m) horizontal coseismic displacements at the ground surface along discrete active faults. The seismic waveform expression of the main shock indicates a complex rupture process. Early automated centroid moment tensor solutions indicated a strong non-double-couple term, which supports a complex rupture involving multiple faults. The hypocentral distribution of aftershocks, which appears diffuse over a broad region, clusters spatially along lineaments with different orientations. A key question of global interest is to shed light on the mechanism by which such a complex rupture occurred, and whether the underlying plate interface was involved in the rupture. The consequences of such distributed, shallow faulting for seismic hazard are important to assess. We perform a broad seismological analysis, combining regional and teleseismic seismograms, GPS and InSAR, to determine the rupture process of the main shock and moment tensors of 118 aftershocks down to Mw 4.2. The joint interpretation of the main rupture and aftershock sequence allows reconstruction of the geometry and suggests sequential activation and slip distribution on at least three major active fault domains. We find that the rupture nucleated as a weak strike-slip event along the Humps Fault, which progressively propagated northward onto a shallow reverse fault, where most of the seismic moment was released, before it triggered slip on a second set of strike

  20. Using Simple Manipulatives to Improve Student Comprehension of a Complex Biological Process: Protein Synthesis

    Science.gov (United States)

    Guzman, Karen; Bartlett, John

    2012-01-01

    Biological systems and living processes involve a complex interplay of biochemicals and macromolecular structures that can be challenging for undergraduate students to comprehend and, thus, misconceptions abound. Protein synthesis, or translation, is an example of a biological process for which students often hold many misconceptions. This article…