WorldWideScience

Sample records for human event analysis

  1. Defining Human Failure Events for Petroleum Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  2. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Full Text Available Background & objective: Human error is one of the main causes of accidents. Given the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage human errors likely to occur in such operations. Methods: This study was performed at a demining site in war zones located in the west of Iran. After acquiring an initial familiarity with the operations, methods, and tools of clearing minefields, the job tasks related to clearing landmines were specified. These tasks were then studied using hierarchical task analysis (HTA), and the associated possible errors were assessed using ATHEANA. Results: The de-mining task was composed of four main operations: primary detection, technical identification, investigation, and neutralization. Four main causes of accidents in such operations were found: walking on mines, leaving mines with no action taken, errors in the neutralizing operation, and environmental explosion. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in de-mining operations can be attributed to factors such as poor weather and operating conditions (e.g., outdoor work), inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient available time. To reduce the probability of human error in de-mining operations, these factors should be managed properly.

  3. Top-down and bottom-up definitions of human failure events in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States)]

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  4. Weighted symbolic analysis of human behavior for event detection

    Science.gov (United States)

    Rosani, A.; Boato, G.; De Natale, F. G. B.

    2013-03-01

    Automatic video analysis and understanding has become a high interest research topic, with applications to video browsing, content-based video indexing, and visual surveillance. However, the automation of this process is still a challenging task, due to clutters produced by low-level processing operations. This common problem can be solved by embedding significant contextual information into the data, as well as using simple syntactic approaches to perform the matching between actual sequences and models. In this context we propose a novel framework that employs a symbolic representation of complex activities through sequences of atomic actions based on a weighted Context-Free Grammar.

  5. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    OpenAIRE

    Marko Čepin

    2008-01-01

    Human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis ...

  6. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  7. Coordination activities of human planners during rescheduling: Case analysis and event handling procedure

    OpenAIRE

    De Snoo, Cees; Van Wezel, Wout; Wortmann, Hans; Gaalman, Gerard J.C.

    2010-01-01

    Abstract This paper addresses the process of event handling and rescheduling in manufacturing practice. Firms are confronted with many diverse events, like new or changed orders, machine breakdowns, and material shortages. These events influence the feasibility and optimality of schedules, and thus induce rescheduling. In many manufacturing firms, schedules are created by several human planners. Coordination between them is needed to respond to events adequately. In this paper,...

  8. Empirical analysis of collective human behavior for extraordinary events in blogosphere

    CERN Document Server

    Sano, Yukie; Watanabe, Hayafumi; Takayasu, Hideki; Takayasu, Misako

    2011-01-01

    To explain collective human behavior in the blogosphere, we survey more than 1.8 billion entries and observe statistical properties of word appearance. We first estimate the basic properties of the number fluctuation of ordinary words that appear almost uniformly. We then focus on words that show dynamic growth with a tendency to diverge on a certain day, and on news words, typical keywords for natural disasters, which grow suddenly with the occurrence of an event and decay gradually with time. In both cases, the functional forms of growth and decay are generally approximated by power laws with exponents around -1 over a period of about 80 days. Our empirical analysis can be applied to the prediction of word frequency in the blogosphere.
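
The power-law decay described above can be checked numerically by fitting a straight line on log-log axes. The synthetic daily counts, the lognormal noise model, and the 80-day window below are assumptions for illustration, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily word counts decaying as t**-1 after an event,
# mimicking the ~80-day power-law decay reported in the abstract
days = np.arange(1, 81)
counts = 1e4 * days**-1.0 * rng.lognormal(0.0, 0.2, days.size)

# Estimate the decay exponent by least squares on log-log axes
slope, intercept = np.polyfit(np.log(days), np.log(counts), 1)
print(f"estimated exponent: {slope:.2f}")   # close to -1
```

The fitted slope recovers the exponent despite the multiplicative noise, which is why log-log regression is the standard first check for power-law behavior.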

  9. A Mid-Layer Model for Human Reliability Analysis: Understanding the Cognitive Causes of Human Failure Events

    Energy Technology Data Exchange (ETDEWEB)

    Stacey M. L. Hendrickson; April M. Whaley; Ronald L. Boring; James Y. H. Chang; Song-Hua Shen; Ali Mosleh; Johanna H. Oxstrand; John A. Forester; Dana L. Kelly; Erasmia L. Lois

    2010-06-01

    The Office of Nuclear Regulatory Research (RES) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method’s middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.

  10. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available ... (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  11. A novel human ex vivo model for the analysis of molecular events during lung cancer chemotherapy

    Directory of Open Access Journals (Sweden)

    Lang Dagmar S

    2007-06-01

    Full Text Available Abstract Background Non-small cell lung cancer (NSCLC) causes most cancer-related deaths in humans and is characterized by poor prognosis regarding the efficiency of chemotherapeutic treatment and long-term survival of patients. The purpose of the present study was the development of a human ex vivo tissue culture model and the analysis of the effects of conventional chemotherapy, which can then serve as a tool to test new chemotherapeutic regimens in NSCLC. Methods In a short-term tissue culture model designated STST (Short-Term Stimulation of Tissues), in combination with the novel HOPE-fixation and paraffin embedding method, we examined the responsiveness of 41 human NSCLC tissue specimens to the individual cytotoxic drugs carboplatin, vinorelbine or gemcitabine. Viability was analyzed by LIVE/DEAD assay, TUNEL staining and colorimetric MTT assay. Expression of Ki-67 protein and BrdU (bromodeoxyuridine) uptake as markers for proliferation, and of cleaved (activated) effector caspase-3 as an indicator of late-phase apoptosis, were assessed by immunohistochemistry. Transcription of caspase-3 was analyzed by RT-PCR. Flow cytometry was utilized to determine caspase-3 in human cancer cell lines. Results Viability, proliferation and apoptosis of the tissues were moderately affected by cultivation. In human breast cancer, small-cell lung cancer (SCLC) and human cell lines (CPC-N, HEK), proliferative capacity was clearly reduced by all 3 chemotherapeutic agents in a very similar manner. Cleavage of caspase-3 was induced in the chemo-sensitive types of cancer (breast cancer, SCLC). Drug-induced effects in human NSCLC tissues were less evident than in the chemo-sensitive tumors, with more pronounced effects in adenocarcinomas as compared to squamous cell carcinomas. Conclusion Although there was high heterogeneity among the individual tumor tissue responses, as expected, we clearly demonstrate specific multiple drug-induced effects simultaneously. Thus, STST

  12. Live imaging analysis of human gastric epithelial spheroids reveals spontaneous rupture, rotation and fusion events.

    Science.gov (United States)

    Sebrell, T Andrew; Sidar, Barkan; Bruns, Rachel; Wilkinson, Royce A; Wiedenheft, Blake; Taylor, Paul J; Perrino, Brian A; Samuelson, Linda C; Wilking, James N; Bimczok, Diane

    2018-02-01

    Three-dimensional cultures of primary epithelial cells including organoids, enteroids and epithelial spheroids have become increasingly popular for studies of gastrointestinal development, mucosal immunology and epithelial infection. However, little is known about the behavior of these complex cultures in their three-dimensional culture matrix. Therefore, we performed extended time-lapse imaging analysis (up to 4 days) of human gastric epithelial spheroids generated from adult tissue samples in order to visualize the dynamics of the spheroids in detail. Human gastric epithelial spheroids cultured in our laboratory grew to an average diameter of 443.9 ± 34.6 μm after 12 days, with the largest spheroids reaching diameters of >1000 μm. Live imaging analysis revealed that spheroid growth was associated with cyclic rupture of the epithelial shell at a frequency of 0.32 ± 0.1/day, which led to the release of luminal contents. Spheroid rupture usually resulted in an initial collapse, followed by spontaneous re-formation of the spheres. Moreover, spheroids frequently rotated around their axes within the Matrigel matrix, possibly propelled by basolateral pseudopodia-like formations of the epithelial cells. Interestingly, adjacent spheroids occasionally underwent luminal fusion, as visualized by injection of individual spheroids with FITC-Dextran (4 kDa). In summary, our analysis revealed unexpected dynamics in human gastric spheroids that challenge our current view of cultured epithelia as static entities and that may need to be considered when performing spheroid infection experiments.

  13. Coordination activities of human planners during rescheduling : case analysis and event handling procedure

    NARCIS (Netherlands)

    de Snoo, C.; van Wezel, W.M.C.; Wortmann, J.C.; Gaalman, G.J.C.

    2011-01-01

    This paper addresses the process of event handling and rescheduling in manufacturing practice. Firms are confronted with many diverse events, such as new or changed orders, machine breakdowns, and material shortages. These events influence the feasibility and optimality of schedules, and thus induce

  14. Event-based sampling for reducing communication load in realtime human motion analysis by wireless inertial sensor networks

    Directory of Open Access Journals (Sweden)

    Laidig Daniel

    2016-09-01

    Full Text Available We examine the usefulness of event-based sampling approaches for reducing communication in inertial-sensor-based analysis of human motion. To this end we consider realtime measurement of the knee joint angle during walking, employing a recently developed sensor fusion algorithm. We simulate the effects of different event-based sampling methods on a large set of experimental data with ground truth obtained from an external motion capture system. This results in a reduced wireless communication load at the cost of a slightly increased error in the calculated angles. The proposed methods are compared in terms of best balance of these two aspects. We show that the transmitted data can be reduced by 66% while maintaining the same level of accuracy.
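
The send-on-delta idea behind event-based sampling can be sketched in a few lines: transmit a sample only when it deviates from the last transmitted value by more than a threshold, and hold the last value in between. The joint-angle-like test signal and the 2-degree threshold below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def send_on_delta(signal, delta):
    """Event-based sampling: transmit a sample only when it differs from
    the last transmitted value by more than `delta`."""
    transmitted = [0]                 # the first sample is always sent
    last = signal[0]
    for i in range(1, len(signal)):
        if abs(signal[i] - last) > delta:
            transmitted.append(i)
            last = signal[i]
    return transmitted

def reconstruct(signal, indices):
    """Zero-order-hold reconstruction from the transmitted samples only."""
    out = np.empty_like(signal)
    for start, stop in zip(indices, indices[1:] + [len(signal)]):
        out[start:stop] = signal[start]
    return out

# Simulated knee-joint-angle-like trace (hypothetical parameters)
t = np.linspace(0.0, 10.0, 1000)
angle = 30.0 + 25.0 * np.sin(2 * np.pi * 0.8 * t)

idx = send_on_delta(angle, delta=2.0)
approx = reconstruct(angle, idx)
print(f"sent {len(idx)} of {len(angle)} samples, "
      f"max error {np.max(np.abs(angle - approx)):.2f} deg")
```

By construction the zero-order-hold error never exceeds the threshold, so the threshold directly trades communication load against angle accuracy — the same trade-off the abstract quantifies.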

  15. Time-frequency analysis of chemosensory event-related potentials to characterize the cortical representation of odors in humans.

    Directory of Open Access Journals (Sweden)

    Caroline Huart

    Full Text Available BACKGROUND: The recording of olfactory and trigeminal chemosensory event-related potentials (ERPs) has been proposed as an objective and non-invasive technique to study the cortical processing of odors in humans. Until now, the responses have been characterized mainly using across-trial averaging in the time domain. Unfortunately, chemosensory ERPs, in particular olfactory ERPs, exhibit a relatively low signal-to-noise ratio. Hence, although the technique is increasingly used in basic research as well as in clinical practice to evaluate people suffering from olfactory disorders, its current clinical relevance remains very limited. Here, we used a time-frequency analysis based on the wavelet transform to reveal EEG responses that are not strictly phase-locked to the onset of the chemosensory stimulus. We hypothesized that this approach would significantly enhance the signal-to-noise ratio of the EEG responses to chemosensory stimulation because, as compared to conventional time-domain averaging, (1) it is less sensitive to temporal jitter and (2) it can reveal non-phase-locked EEG responses such as event-related synchronization and desynchronization. METHODOLOGY/PRINCIPAL FINDINGS: EEG responses to selective trigeminal and olfactory stimulation were recorded in 11 normosmic subjects. A Morlet wavelet was used to characterize the elicited responses in the time-frequency domain. We found that this approach markedly improved the signal-to-noise ratio of the obtained EEG responses, in particular following olfactory stimulation. Furthermore, the approach allowed characterizing non-phase-locked components that could not be identified using conventional time-domain averaging. CONCLUSION/SIGNIFICANCE: By providing a more robust and complete view of how odors are represented in the human brain, our approach could constitute the basis for a robust tool to study olfaction, both in basic research and in clinical practice.
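
The core idea — averaging wavelet-derived power rather than raw traces, so that responses with trial-to-trial phase jitter survive — can be sketched as follows. The hand-rolled Morlet convolution, the toy "EEG" burst, and all parameters are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def morlet_tf(signal, fs, freqs, n_cycles=5):
    """Time-frequency power via convolution with complex Morlet wavelets.
    Returns an array of shape (len(freqs), len(signal))."""
    power = np.empty((len(freqs), len(signal)))
    for k, f in enumerate(freqs):
        sigma = n_cycles / (2 * np.pi * f)          # Gaussian envelope width
        wt = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-wt**2 / (2 * sigma**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit energy
        power[k] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

# Toy "EEG": a 10 Hz burst whose phase jitters across trials would cancel
# in time-domain averaging, but survives when power is averaged instead.
fs, n = 250, 500
rng = np.random.default_rng(0)
freqs = np.array([5.0, 10.0, 20.0])
avg_power = np.zeros((len(freqs), n))
t = np.arange(n) / fs
for _ in range(20):                                 # 20 jittered trials
    burst = np.where((t > 1.0) & (t < 1.5),
                     np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi)),
                     0.0)
    avg_power += morlet_tf(burst + 0.1 * rng.standard_normal(n), fs, freqs)
avg_power /= 20
print("dominant frequency row:", freqs[np.argmax(avg_power.max(axis=1))])
```

Because power discards phase, the jittered bursts reinforce rather than cancel, which is the signal-to-noise advantage the abstract attributes to the time-frequency approach.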

  16. EVENT PLANNING USING FUNCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  17. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  18. Whole genome analysis of selected human and animal rotaviruses identified in Uganda from 2012 to 2014 reveals complex genome reassortment events between human, bovine, caprine and porcine strains.

    Science.gov (United States)

    Bwogi, Josephine; Jere, Khuzwayo C; Karamagi, Charles; Byarugaba, Denis K; Namuwulya, Prossy; Baliraine, Frederick N; Desselberger, Ulrich; Iturriza-Gomara, Miren

    2017-01-01

    Rotaviruses of species A (RVA) are a common cause of diarrhoea in children and the young of various other mammals and birds worldwide. To investigate possible interspecies transmission of RVAs, whole genomes of 18 human and 6 domestic animal RVA strains identified in Uganda between 2012 and 2014 were sequenced using the Illumina HiSeq platform. The backbone of the human RVA strains had either a Wa- or a DS-1-like genetic constellation. One human strain was a Wa-like mono-reassortant containing a DS-1-like VP2 gene of possible animal origin. All eleven genes of one bovine RVA strain were closely related to those of human RVAs. One caprine strain had a mixed genotype backbone, suggesting that it emerged from multiple reassortment events involving different host species. The porcine RVA strains had mixed genotype backbones with possible multiple reassortment events with strains of human and bovine origin. Overall, whole genome characterisation of rotaviruses found in domestic animals in Uganda strongly suggested the presence of human-to-animal RVA transmission, with concomitant circulation of multi-reassortant strains potentially derived from complex interspecies transmission events. However, whole genome data from the human RVA strains causing moderate and severe diarrhoea in under-fives in Uganda indicated that they were primarily transmitted from person-to-person.

  19. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
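
The rejection-sampling reinterpretation at the heart of BUS can be illustrated on a toy reliability problem: draw from the prior, accept each draw with probability proportional to its likelihood, and estimate the rare-event probability from the accepted set. The prior, the single observation, and the load threshold below are invented for illustration and are not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Prior on a component capacity R (hypothetical numbers)
def prior(n):
    return rng.normal(5.0, 1.0, n)

# One noisy observation of the capacity
obs, obs_std = 4.2, 0.5
def likelihood(r):
    # Gaussian measurement kernel, maximum value 1 at r == obs
    return np.exp(-0.5 * ((obs - r) / obs_std) ** 2)

# Rejection sampling: accept r with probability L(r)/c, with c = 1 >= max L
r = prior(200_000)
accept = rng.uniform(size=r.size) < likelihood(r)
posterior = r[accept]

# Rare event: capacity below the applied load
load = 3.0
print("prior     P(R < load) ~", np.mean(prior(200_000) < load))
print("posterior P(R < load) ~", np.mean(posterior < load))
```

The observation pulls the posterior toward 4.2 while sharpening it, so the updated rare-event probability drops — the data-driven refinement of rare-event estimates that the abstract describes. (In practice FORM, importance sampling, or Subset Simulation replace this brute-force acceptance step, which is exactly the leverage BUS provides.)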

  20. Parasitic Events in Envelope Analysis

    Directory of Open Access Journals (Sweden)

    J. Doubek

    2001-01-01

    Full Text Available Envelope analysis allows fast fault location in individual gearboxes and bearing parts by determining the repetition frequency of the mechanical catch in an amplitude-modulated signal. Systematic faults arise when envelope analysis is applied to a signal with strong changes. The source of these events is the range of the function definition used in the convolution integral that defines the Hilbert image of the analyzed signal. These faults result in overshoots, similar to the Gibbs events seen on a synthetic signal composed using the Fourier series. The overshoots are caused by parasitic spectral lines in the frequency domain, which can produce a faulty diagnostic analysis. This paper describes the systematic faults arising during numerical signal calculation using envelope analysis with the Hilbert transform, and goes on to offer a mathematical analysis of these systematic faults.
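
The basic envelope-analysis pipeline the abstract builds on can be sketched with the analytic signal: compute the Hilbert-transform-based analytic signal, take its magnitude as the envelope, and look for the fault repetition frequency in the envelope spectrum. The bearing-like test signal and its frequencies are hypothetical:

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000                       # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)

# Synthetic bearing signal: a 1 kHz resonance amplitude-modulated at a
# 120 Hz fault repetition frequency (illustrative numbers), plus noise
fault_f = 120.0
carrier = np.sin(2 * np.pi * 1000 * t)
modulation = 1.0 + 0.8 * np.sin(2 * np.pi * fault_f * t)
x = modulation * carrier + 0.05 * np.random.default_rng(2).standard_normal(t.size)

# Envelope = magnitude of the analytic signal (via the Hilbert transform)
envelope = np.abs(hilbert(x))

# The fault repetition frequency appears as a peak in the envelope spectrum
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(spec)]
print(f"dominant envelope frequency: {peak:.0f} Hz")
```

The parasitic spectral lines the paper analyzes arise when this numerical pipeline is applied to signals with strong transients; on a clean stationary signal like this one, the envelope spectrum recovers the modulation frequency directly.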

  1. Parallel Event Analysis Under Unix

    Science.gov (United States)

    Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.

    The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA, only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits a near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.

  2. Human Rights Event Detection from Heterogeneous Social Media Graphs.

    Science.gov (United States)

    Chen, Feng; Neill, Daniel B

    2015-03-01

    Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.

  3. Whole-genome sequence analysis of a Korean G11P[25] rotavirus strain identifies several porcine-human reassortant events.

    Science.gov (United States)

    Than, Van Thai; Park, Jong-Hwa; Chung, In Sik; Kim, Jong Bum; Kim, Wonyong

    2013-11-01

    A rare rotavirus, RVA/Human-wt/KOR/CAU12-2/2012/G11P[25], was isolated from a 16-year-old female with fever and diarrhea during the 2012 rotavirus surveillance in South Korea using a cell culture system, and its full genome sequence was determined and analyzed. Strain CAU12-2 exhibited a G11-P[25]-I12-R1-C1-M1-A1-N1-T1-E1-H1 genotype constellation. Phylogenetic analysis of this strain revealed that it is a human-porcine reassortant of two distant relatives of the G11 strains circulating in the world. The VP7 and VP4 genes are most closely related to those of human G11P[25] viruses (Dhaka6, KTM368, and N-38 strains) identified in South Asia, whereas the VP1 gene originated from a porcine G11P[7] virus (YM strain) that was identified in South America. The VP6 gene was found to belong to the new genotype I12. This study indicates that the G11-P[25]-I12 genotype was introduced into the South Korean population by interspecies transmissions of human and animal rotaviruses, followed by multiple reassortment events.

  4. Event Shape Analysis in ALICE

    CERN Document Server

    Ortiz Velasquez, Antonio

    2009-01-01

    Jets are the final-state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on the use of shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS in order to identify specific events like jets. Through the correlation between the quantities thrust and recoil, calculated in minimum-bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method for selecting specific topologies and high multiplicity. The presented results were obtained both at generator level and after reconstruction. It remains that with any kind of jet reconstruction algorithm one will in general be confronted with overlapping jets. The present meth...

  5. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants]

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of existing practices, identifying and removing the existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of the event investigation practices currently typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback from information based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective route to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  6. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique. [CETAT computer program]

    Energy Technology Data Exchange (ETDEWEB)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time-consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program.

  7. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly higher overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display to Industry data source for SMS predictions.

  8. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  9. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H. [Tampere Univ. of Technology (Finland)

    1996-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage.

  10. Whole Genomic Analysis of an Unusual Human G6P[14] Rotavirus Strain Isolated from a Child with Diarrhea in Thailand: Evidence for Bovine-To-Human Interspecies Transmission and Reassortment Events.

    Science.gov (United States)

    Tacharoenmuang, Ratana; Komoto, Satoshi; Guntapong, Ratigorn; Ide, Tomihiko; Haga, Kei; Katayama, Kazuhiko; Kato, Takema; Ouchi, Yuya; Kurahashi, Hiroki; Tsuji, Takao; Sangkitporn, Somchai; Taniguchi, Koki

    2015-01-01

    An unusual rotavirus strain, SKT-27, with the G6P[14] genotypes (RVA/Human-wt/THA/SKT-27/2012/G6P[14]), was identified in a stool specimen from a hospitalized child aged eight months with severe diarrhea. In this study, we sequenced and characterized the complete genome of strain SKT-27. On whole genomic analysis, strain SKT-27 was found to have a unique genotype constellation: G6-P[14]-I2-R2-C2-M2-A3-N2-T6-E2-H3. The non-G/P genotype constellation of this strain (I2-R2-C2-M2-A3-N2-T6-E2-H3) is commonly shared with rotavirus strains from artiodactyls such as cattle. Phylogenetic analysis indicated that nine of the 11 genes of strain SKT-27 (VP7, VP4, VP6, VP2-3, NSP1, NSP3-5) appeared to be of artiodactyl (likely bovine) origin, while the remaining VP1 and NSP2 genes were assumed to be of human origin. Thus, strain SKT-27 was found to have a bovine rotavirus genetic backbone, and thus is likely to be of bovine origin. Furthermore, strain SKT-27 appeared to be derived through interspecies transmission and reassortment events involving bovine and human rotavirus strains. Of note is that the VP7 gene of strain SKT-27 was located in G6 lineage-5 together with those of bovine rotavirus strains, away from the clusters comprising other G6P[14] strains in G6 lineages-2/6, suggesting the occurrence of independent bovine-to-human interspecies transmission events. To our knowledge, this is the first report on full genome-based characterization of human G6P[14] strains that have emerged in Southeast Asia. Our observations will provide important insights into the origin of G6P[14] strains, and into dynamic interactions between human and bovine rotavirus strains.

  12. HUMAN FAILURE EVENT DEPENDENCE: WHAT ARE THE LIMITS

    Energy Technology Data Exchange (ETDEWEB)

    Herberger, Sarah M.; Boring, Ronald L.

    2016-10-01

    Objectives: This paper discusses the differences between classical human reliability analysis (HRA) dependence and the full spectrum of probabilistic dependence. Positive influence means that an error increases the likelihood of subsequent errors, or that success increases the likelihood of subsequent success. Currently, the typical treatment of dependence in HRA implements the positive-dependence equations of the Technique for Human Error Rate Prediction (THERP). This assumes that the dependence between two human failure events varies at discrete levels between zero and complete dependence (as defined by THERP); dependence in THERP does not consistently span values between 0 and 1. In contrast, probabilistic dependence employs Bayes' law and addresses a continuous range of dependence. Methods: Under the laws of probability, complete dependence and maximum positive dependence do not always agree. Maximum dependence occurs when two events overlap to the fullest possible extent; maximum negative dependence is the smallest amount by which two events can overlap. When the minimum possible overlap of two events is less than under independence, negative dependence occurs. For example, negative dependence arises when an operator fails to actuate Pump A and thereby increases his or her chance of actuating Pump B: the initial error actually increases the chance of subsequent success. Results: Comparing THERP and probability theory yields different results in certain scenarios, with the latter also addressing negative dependence. Given that most human failure events are rare, the minimum overlap is typically 0, and when the second event is less probable than the first, the maximum dependence is less than 1, as defined by Bayes' law. Alternative dependence equations are therefore provided, along with a look-up table defining the maximum and maximum negative dependence given the probabilities of two events.
    Conclusions: THERP dependence has been used ubiquitously for decades, and has provided approximations of
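    The probabilistic bounds discussed above follow directly from the Fréchet inequalities on a joint probability. The THERP level equations below are the standard published ones; the bound functions and the example probabilities are an illustrative sketch, not the paper's look-up table:

```python
def frechet_bounds(pa, pb):
    """Range of P(A and B) allowed by probability theory (Frechet bounds)."""
    lo = max(0.0, pa + pb - 1.0)   # maximum negative dependence
    hi = min(pa, pb)               # maximum positive dependence
    return lo, hi

def conditional_bounds(pa, pb):
    """Implied range of the conditional probability P(B | A)."""
    lo, hi = frechet_bounds(pa, pb)
    return lo / pa, hi / pa

def therp(p, level):
    """THERP's discrete positive-dependence levels for P(B | A)."""
    return {"zero": p, "low": (1 + 19 * p) / 20, "moderate": (1 + 6 * p) / 7,
            "high": (1 + p) / 2, "complete": 1.0}[level]

# Two rare human failure events: P(A) = 1e-2, P(B) = 1e-3.
pa, pb = 1e-2, 1e-3
lo, hi = conditional_bounds(pa, pb)
# Here hi is 0.1: because P(B) < P(A), P(B|A) can never reach 1, so
# THERP's "complete dependence" level (below) is not Bayes-consistent.
print(lo, hi)
print(therp(pb, "complete"))  # 1.0
```

The sketch illustrates the paper's central point: the minimum overlap of two rare events is typically 0, while Bayes' law caps the maximum dependence below 1 whenever the second event is the rarer one.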

  13. Negated bio-events: analysis and identification

    Science.gov (United States)

    2013-01-01

    Background: Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results: We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions: Recently, in the field of biomedical text mining, the development and enhancement of event-based systems have received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The
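    A minimal, purely illustrative sketch of cue-based negation flagging is shown below. The cue list, the window size, and the example sentences are invented here; the framework in the paper uses machine-learned features over gold-standard event annotations rather than a fixed rule like this:

```python
# Toy cue-based negation flagger (illustrative only).
NEGATION_CUES = {"not", "no", "failed", "fails", "unable", "absence",
                 "lack", "without", "never"}

def is_negated(sentence, trigger, window=4):
    """Flag an event trigger word as negated if a cue appears within a
    few tokens before it -- a crude stand-in for feature-based learning."""
    tokens = sentence.lower().replace(",", " ").split()
    if trigger not in tokens:
        return False
    i = tokens.index(trigger)
    return any(t in NEGATION_CUES for t in tokens[max(0, i - window):i])

print(is_negated("IL-2 did not induce expression of CD25", "induce"))  # True
print(is_negated("IL-2 induced expression of CD25", "induced"))        # False
```

Cue selection, richer features (syntactic paths, trigger type), and the learning algorithm are exactly the aspects the paper analyses to go beyond this baseline.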

  14. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  15. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    Full Text Available A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm² s sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. Of these, 231 proton events are flare-related and only 22 are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with western flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  16. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given time-to-event characteristic of interest. Real genetic longevity studies based on female animals of different species (sows, dairy cows, and sheep) exemplify the use of the methods. Moreover, these studies allow some genetic mechanisms related to the length of the productive life of the animals to be understood.

  17. Event analysis in a primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P.; Paulasaari, H. [Tampere Univ. of Technology (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The target of the project was to develop applications which observe the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: communication with the SCADA, the internal clock to produce time stamped event data, and the capability to register certain values during the fault. Using the above features, some new functions for event analysis were developed in the project.

  18. Analysis of Future Event Set Algorithms for Discrete Event Simulation

    OpenAIRE

    McCormack, William M.; Sargent, Robert G.

    1980-01-01

    This work reports on new analytical and empirical results on the performance of algorithms for handling the future event set in discrete event simulation. These results provide a clear insight to the factors affecting algorithm performance; evaluate the "hold" model, often used to study future event set algorithms; and determine the best algorithm(s) to use.
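    A minimal future event set built on a binary heap, one of the classic candidate structures such comparisons cover (alongside linked lists, calendar queues, and tree variants), can be sketched as follows. This is an illustrative sketch, not the authors' implementation:

```python
import heapq
import itertools

class FutureEventSet:
    """Future event set for discrete event simulation, kept as a heap."""

    def __init__(self):
        self._heap = []
        self._tie = itertools.count()  # FIFO tie-break for simultaneous events

    def schedule(self, time, event):
        # O(log n) insertion, the operation the "hold" model exercises.
        heapq.heappush(self._heap, (time, next(self._tie), event))

    def next_event(self):
        # O(log n) removal of the earliest scheduled event.
        time, _, event = heapq.heappop(self._heap)
        return time, event

fes = FutureEventSet()
fes.schedule(5.0, "depart")
fes.schedule(2.5, "arrive")
fes.schedule(2.5, "log")
print(fes.next_event())  # (2.5, 'arrive')
```

The "hold" model mentioned above stresses exactly this pair of operations: repeatedly remove the earliest event and schedule a new one, which is why data-structure choice dominates simulator performance.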

  19. Attack Graph Construction for Security Events Analysis

    Directory of Open Access Journals (Sweden)

    Andrey Alexeevich Chechulin

    2014-09-01

    Full Text Available The paper is devoted to the investigation of attack graph construction and analysis for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper contains a description of techniques for building, modifying and analysing attack graphs, as well as an overview of an implemented prototype for network security analysis based on the attack graph approach.

  20. Aversive Life Events Enhance Human Freezing Responses

    NARCIS (Netherlands)

    Hagenaars, M.A.; Stins, J.F.; Roelofs, K.

    2012-01-01

    In the present study, we investigated the effect of prior aversive life events on freezing-like responses. Fifty healthy females were presented neutral, pleasant, and unpleasant images from the International Affective Picture System while standing on a stabilometric platform and wearing a polar band

  2. Multistate Demography and Event History Analysis

    OpenAIRE

    Hannan, M. T.

    1982-01-01

    In this paper, Michael Hannan explores a merger of two methodologies for the purpose of analyzing the direct and indirect long-run implications of behavioral responses to public policies: multistate demography and life history or event history analysis. He argues that such a combined approach allows one to project levels of well-being in heterogeneous populations facing changing social policies.

  3. Dynamic Event Tree Analysis Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, developed by INL as well. RAVEN performs two main tasks: 1) control logic driver for the new thermo-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of control logic laws (user defined) monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability.
    As an example, the Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.

  4. Analysis of organisation sport event of event Red Bull Crashed Ice 2009

    OpenAIRE

    Rak, Zdeněk

    2009-01-01

    Title: Analysis of organisation sport event of event Red Bull Crashed Ice 2009 Work goal: Analysis of organisation of event... Methods: Descriptive analysis, SWOT analysis, Interview with experts. Annotation: Organisation of Red Bull Crashed Ice includes possible areas of improvement with concrete suggestions to activities of organisational process of this sport event. Introduced proposals result from results of SWOT analysis and interviews with professionals. Results: Conclusion and advices ...

  5. Multistate event history analysis with frailty

    Directory of Open Access Journals (Sweden)

    Govert Bijwaard

    2014-05-01

    Full Text Available Background: In survival analysis there is a large literature using frailty models, or models with unobserved heterogeneity. In the growing literature and modelling on multistate models, this issue is still in its infancy. Ignoring frailty can, however, produce incorrect results. Objective: This paper presents how frailties can be incorporated into multistate models, with an emphasis on semi-Markov multistate models with a mixed proportional hazard structure. Methods: First, the aspects of frailty modeling in univariate (proportional hazard, Cox) and multivariate event history models are addressed. The implications of choosing shared or correlated frailty are highlighted. The relevant differences with recurrent events data are covered next. Multistate models are event history models that can have both multivariate and recurrent events. Incorporating frailty in multistate models therefore brings all the previously addressed issues together. Assuming a discrete frailty distribution allows for a very general correlation structure among the transition hazards in a multistate model. Although some estimation procedures are covered, the emphasis is on conceptual issues. Results: The importance of multistate frailty modeling is illustrated with data on labour market and migration dynamics of recent immigrants to the Netherlands.

  6. Effects of Human Management Events on Conspecific Aggression in Captive Rhesus Macaques (Macaca mulatta).

    Science.gov (United States)

    Theil, Jacob H; Beisner, Brianne A; Hill, Ashley E; McCowan, Brenda

    2017-03-01

    Conspecific aggression in outdoor-housed rhesus macaques (Macaca mulatta) at primate research facilities is a leading source of trauma and can potentially influence animal wellbeing and research quality. Although aggression between macaques is a normal part of daily social interactions, human presence might affect the frequency of various behaviors and instigate increases in conspecific aggression. We sought to determine how and which human management events affect conspecific aggression both immediately after an event and throughout the course of a day. From June 2008 through December 2009, we recorded agonistic encounters among macaques living in 7 social groups in large outdoor field cages. Behavioral data were then synchronized with specific management events (for example, feeding, enclosure cleaning, animal catching) that occurred within or near the enclosure. By using an Information Theoretical approach, 2 generalized linear mixed models were developed to estimate the effects of human management events on 1) aggression after individual management events and 2) daily levels of aggression. Univariate analysis revealed an increase in the rate of aggression after a management event occurred. The best predictor of aggression in a cage was the type of management event that occurred. Various factors including the number of daily management events, the total time of management events, the technicians involved, reproductive season, and their interactions also showed significant associations with daily aggression levels. Our findings demonstrate that human management events are associated with an increase in conspecific aggression between rhesus macaques and thus have implications regarding how humans manage primates in research facilities.

  7. Event time analysis of longitudinal neuroimage data.

    Science.gov (United States)

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
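    The cascading tiers described above can be sketched as a breadth-first expansion over an "outage of line X may overload line Y" relation. The adjacency below is invented for illustration; in the paper such links are derived from N-k contingency analysis and sequential power-flow simulation, not assumed:

```python
# Hypothetical adjacency: outage of a line may overload its neighbours.
MAY_OVERLOAD = {
    "L1": ["L2", "L3"],
    "L2": ["L4"],
    "L3": ["L4", "L5"],
    "L4": [],
    "L5": ["L6"],
    "L6": [],
}

def cascading_tiers(initiating_line):
    """Group potentially affected lines into tiers by distance from the
    initiating outage (tier 0 is the initiating event itself)."""
    tiers, seen = [], {initiating_line}
    frontier = [initiating_line]
    while frontier:
        tiers.append(sorted(frontier))
        nxt = []
        for line in frontier:
            for other in MAY_OVERLOAD[line]:
                if other not in seen:
                    seen.add(other)
                    nxt.append(other)
        frontier = nxt
    return tiers

print(cascading_tiers("L1"))  # [['L1'], ['L2', 'L3'], ['L4', 'L5'], ['L6']]
```

Ranking initiating events by the size and depth of their tier structure is one way to visualize which outages are most likely to seed a self-sustaining cascade.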

  9. Bi-Level Semantic Representation Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Chang, Xiaojun; Ma, Zhigang; Yang, Yi; Zeng, Zhiqiang; Hauptmann, Alexander G

    2017-05-01

    Multimedia event detection has been one of the major endeavors in video event analysis. A variety of approaches have been proposed recently to tackle this problem. Among others, using semantic representation has been credited for its promising performance and desirable ability to support human-understandable reasoning. To generate semantic representation, we usually utilize several external image/video archives and apply the concept detectors trained on them to the event videos. Due to the intrinsic differences between these archives, the resulting representations presumably have different predictive capabilities for a certain event. Notwithstanding, little work is available on assessing the efficacy of semantic representation at the source level. On the other hand, it is plausible that some concepts are noisy for detecting a specific event. Motivated by these two shortcomings, we propose a bi-level semantic representation analysis method. At the source level, our method learns weights for the semantic representations obtained from different multimedia archives. Meanwhile, it restrains the negative influence of noisy or irrelevant concepts at the concept level. In addition, we particularly focus on efficient multimedia event detection with few positive examples, which is highly relevant in real-world scenarios. We perform extensive experiments on the challenging TRECVID MED 2013 and 2014 datasets with encouraging results that validate the efficacy of our proposed approach.

  10. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in table 4-2 of the HEAF SAR (ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failure of administrative controls (procedural or human error events) or failure of engineered controls (hardware, software or equipment failure) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events required to cause an accident. Event trees can address statistical dependency of events, such as a sequence of human error events performed by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency would be when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
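    The closing calculation can be sketched in a few lines. The operation frequency and branch probabilities below are hypothetical placeholders, not values from the HEAF SAR:

```python
def end_state_frequency(initiator_per_year, branch_probs):
    """End-state frequency of one event-tree path: the initiating-event
    frequency times the probability of each branch along the path."""
    f = initiator_per_year
    for p in branch_probs:
        f *= p
    return f

# Hypothetical path: an operation performed 50 times a year, with a
# procedural error (1e-2), a failed engineered control (1e-3), and a
# dependent second human error (1e-1 given the first).
print(end_state_frequency(50, [1e-2, 1e-3, 1e-1]))  # ~5e-05 per year
```

When the first event is a condition rather than an operation, the same function applies with a probability in place of the annual frequency, yielding a probability rather than a frequency.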

  11. Burnout as a risk factor for antidepressant treatment - a repeated measures time-to-event analysis of 2936 Danish human service workers.

    Science.gov (United States)

    Madsen, Ida E H; Lange, Theis; Borritz, Marianne; Rugulies, Reiner

    2015-06-01

    Burnout is a state of emotional exhaustion, feelings of reduced personal accomplishment, and withdrawal from work, thought to occur as a consequence of prolonged occupational stress. The condition is not included in the diagnostic classifications, but is considered likely to develop into depressive disorder in some cases. We examined the prospective association between burnout and antidepressant treatment, as an indicator of clinically significant mental disorder. We further investigated potential effect-modifiers of the association, to identify factors that may prevent this progression of burnout. We used questionnaire data from a three-wave study of Danish human service workers conducted during 1999-2005, linked with national register data on purchases of antidepressants (ATC: N06A). We included 4788 observations from 2936 individuals (81% women) and analysed data by Aalen's additive hazards modeling, examining the risk of entering antidepressant treatment in relation to the level of work-related burnout measured by the Copenhagen Burnout Inventory. As effect-modifiers we examined both sociodemographic factors and a range of psychosocial work environment factors. The level of burnout predicted antidepressant treatment. This association was modified by sex: in men, burnout was associated with a 5% increased risk of antidepressant treatment per year of follow-up, whereas this risk difference was 1% for women. Due to the sex-specific patterns, we restricted effect-modification analyses to women. We found no effect-modification by the examined work environment factors, though a sensitivity analysis indicated a possibly stronger association in women of lower occupational position. In conclusion, burnout predicted antidepressant treatment, with a stronger association in men than women. We found no evidence of effect-modification by any of the examined psychosocial work environment factors. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during a volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  13. Event driven adaptation, land use and human coping strategies

    DEFF Research Database (Denmark)

    Reenberg, Anette; Birch-Thomsen, Torben; Fog, Bjarne

    and the concept of coupled human-environmental timelines. Secondly, with point of departure in a baseline characterization of Bellona Island derived from a comprehensive survey in the late 1960s and recent fieldwork in late 2006, we present the case of Bellona Island. Key issues addressed concern climatic events...... perceive cause-effect relationships between societal and environmental events and their individual and collective management of resources. The coupled human-environment timelines are used to discuss ways in which the local communities' adaptive resource management strategies have been employed in the face...... of main drivers of change, incl. climatic and socio-economic changes in the recent past....

  14. Classification system for reporting events involving human malfunctions

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Pedersen, O.M.; Mancini, G.

    1981-01-01

    The report describes a set of categories for reporting industrial incidents and events involving human malfunction. The classification system aims at ensuring information adequate for improvement of human work situations and man-machine interface systems and for attempts to quantify "human error......" rates. The classification system has a multifaceted non-hierarchical structure and its compatibility with Ispra's ERDS classification is described. The collection of the information in general and for quantification purposes are discussed. 24 categories, 12 of which being human factors oriented...

  15. Human Auditory Processing: Insights from Cortical Event-related Potentials

    Directory of Open Access Journals (Sweden)

    Alexandra P. Key

    2016-04-01

    Full Text Available Human communication and language skills rely heavily on the ability to detect and process auditory inputs. This paper reviews possible applications of the event-related potential (ERP) technique to the study of cortical mechanisms supporting human auditory processing, including speech stimuli. Following a brief introduction to the ERP methodology, the remaining sections focus on demonstrating how ERPs can be used in humans to address research questions related to cortical organization, maturation, and plasticity, as well as the effects of sensory deprivation and multisensory interactions. The review is intended to serve as a primer for researchers interested in using ERPs to study the human auditory system.

  16. Investigation of tissue-specific human orthologous alternative splice events in pig

    DEFF Research Database (Denmark)

    Hillig, Ann-Britt Nygaard; Jørgensen, Claus Bøttcher; Salicio, Susanna Cirera

    2010-01-01

    investigated alternative splice events detected in humans, in orthologous pig genes. A total of 17 genes with predicted exon skipping events were selected for further studies. The splice events for the selected genes were experimentally verified using real-time quantitative PCR analysis (qPCR) with splice......Alternative splicing of pre-mRNA can contribute to differences between tissues or cells either by regulating gene expression or creating proteins with various functions encoded by one gene. The number of investigated alternative splice events in pig has so far been limited. In this study we have...

  17. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis". The objective of this

  18. [Dealing with competing events in survival analysis].

    Science.gov (United States)

    Béchade, Clémence; Lobbedez, Thierry

    2015-04-01

    Survival analyses focus on the occurrence of an event of interest, in order to determine risk factors and estimate a risk. Competing events prevent the event of interest from being observed, and their presence can bias the estimation of risk. The aim of this article is to explain why the Cox model is not appropriate when there are competing events, and to present the Fine and Gray model, which can help when dealing with competing risks. Copyright © 2015 Association Société de néphrologie. Published by Elsevier SAS. All rights reserved.
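    The problem described above can be made concrete with a minimal, dependency-free sketch of a nonparametric cumulative-incidence estimate (in the spirit of the Aalen-Johansen estimator) that accounts for a competing event. The data below are made up for illustration; event type 1 is the event of interest, type 2 a competing event, and 0 censoring.

```python
# Illustrative (time, event type) records; 0 = censored, 1 = interest, 2 = competing.
data = [(1, 1), (2, 2), (3, 1), (4, 0), (5, 2)]

def cumulative_incidence(data, cause, t_max):
    """Cumulative incidence of `cause` by t_max, accounting for competing events.

    At each event time, the hazard of `cause` is weighted by the all-cause
    Kaplan-Meier survival just before that time (Aalen-Johansen style).
    """
    records = sorted(data)
    at_risk = len(records)
    surv = 1.0   # all-cause survival just before the current time
    cif = 0.0
    for time, ev in records:
        if time > t_max:
            break
        if ev == cause:
            cif += surv * (1 / at_risk)
        if ev != 0:                  # any event (interest or competing)
            surv *= 1 - 1 / at_risk  # reduces the all-cause survival
        at_risk -= 1
    return cif

print(round(cumulative_incidence(data, cause=1, t_max=5), 4))
```

    Naively treating the competing events as censoring (the "1 - Kaplan-Meier" shortcut) would overestimate this quantity, which is the bias the article warns about; the Fine and Gray model addresses the same issue in a regression setting.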

  19. Integrating human factors into process hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kariuki, S.G. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany); Loewe, K. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany)]. E-mail: katharina.loewe@tu-berlin.de

    2007-12-15

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore treats human error as a top event; the combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not treat operator error as the sole contributor to human failure within a system, but as a combination of all underlying factors.

  20. Economic Multipliers and Mega-Event Analysis

    OpenAIRE

    Victor Matheson

    2004-01-01

    Critics of economic impact studies that purport to show that mega-events such as the Olympics bring large benefits to the communities “lucky” enough to host them frequently cite the use of inappropriate multipliers as a primary reason why these impact studies overstate the true economic gains to the hosts of these events. This brief paper shows in a numerical example how mega-events may lead to inflated multipliers and exaggerated claims of economic benefits.

  1. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    Science.gov (United States)

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  2. Analysis of large Danube flood events at Vienna since 1700

    Science.gov (United States)

    Kiss, Andrea; Blöschl, Günter; Hohensinner, Severin; Perdigao, Rui

    2014-05-01

    Whereas Danube water level measurements are available in Vienna from 1820 onwards, documentary evidence plays a significant role in the long-term understanding of Danube hydrological processes. Based on contemporary documentary evidence and early instrumental measurements, in the present paper we aim to provide an overview and a hydrological analysis of major Danube flood events and of the changes that occurred in flood behaviour in Vienna over the last 300 years. Historical flood events are discussed and analysed according to type, seasonality, frequency, and magnitude. For historical flood events we apply a classification into five-scaled indices that considers height, magnitude, length, and impacts. The rich data coverage in Vienna, both in terms of documentary evidence and early instrumental measurements, allows us to create a relatively long overlap between documentary evidence and instrumental measurements. This makes it possible to evaluate and, to some extent, improve the index reconstruction. In detecting causes of changes in the flood regime, we aim to provide an overview of the atmospheric background through some characteristic examples, selected great flood events (e.g. 1787). Moreover, we also ask how early (pre-instrumental period) human impacts, such as river regulation and urban development, changed flood behaviour in the town, and how much they might have affected flood classification.

  3. Human Genomic Deletions Generated by SVA-Associated Events.

    Science.gov (United States)

    Lee, Jungnam; Ha, Jungsu; Son, Seung-Yeol; Han, Kyudong

    2012-01-01

    Mobile elements are responsible for half of the human genome. Among these elements, L1 and Alu are the most ubiquitous; they use the L1 enzymatic machinery to move within their host genomes. A significant amount of research has been conducted on these two elements. The results showed that they have played important roles in generating genomic variation between the human and chimpanzee lineages, and even within a species, through various mechanisms. SVA elements are a third type of mobile element that uses the L1 enzymatic machinery to propagate in the human genome but has not been studied much relative to the other elements. Here, we attempt the first identification of human genomic deletions caused by SVA elements, through comparison of the human and chimpanzee genome sequences. We identified 13 SVA recombination-associated deletions (SRADs) and 13 SVA insertion-mediated deletions (SIMDs) in the human genome and characterized them, focusing on deletion size and the mechanisms causing the events. The results showed that the SRADs and SIMDs have deleted 15,752 and 30,785 bp, respectively, in the human genome since the divergence of human and chimpanzee, and that SRADs were caused by two different mechanisms: nonhomologous end joining and nonallelic homologous recombination.

  4. A Fourier analysis of extreme events

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Zhao, Yuwei

    2014-01-01

    The extremogram is an asymptotic correlogram for extreme events constructed from a regularly varying stationary sequence. In this paper, we define a frequency domain analog of the correlogram: a periodogram generated from a suitable sequence of indicator functions of rare events. We derive basic ...... properties of the periodogram such as the asymptotic independence at the Fourier frequencies and use this property to show that weighted versions of the periodogram are consistent estimators of a spectral density derived from the extremogram....
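    The construction described above (a periodogram built from indicator functions of rare events rather than from the raw series) can be sketched with a tiny, dependency-free example. The data and threshold are illustrative assumptions, not from the paper.

```python
import cmath
import math

# Illustrative series; "rare events" are exceedances of a high threshold.
x = [0.1, 5.2, 0.3, 0.2, 4.8, 0.1, 0.2, 6.1]
threshold = 4.0
ind = [1.0 if v > threshold else 0.0 for v in x]  # indicator sequence

n = len(ind)
mean = sum(ind) / n

def periodogram(series, freq):
    """Mean-corrected periodogram ordinate of `series` at angular frequency `freq`."""
    s = sum((v - mean) * cmath.exp(-1j * freq * t)
            for t, v in enumerate(series))
    return abs(s) ** 2 / (2 * math.pi * n)

# Evaluate at the Fourier frequencies 2*pi*j/n, j = 1..n//2.
for j in range(1, n // 2 + 1):
    print(j, round(periodogram(ind, 2 * math.pi * j / n), 4))
```

    The paper's result is that suitably weighted (smoothed) versions of such ordinates consistently estimate a spectral density derived from the extremogram; the raw ordinates themselves are not consistent, just as with an ordinary periodogram.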

  5. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Thereafter, based on statistical results from simulation of the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, is reliable, and could serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to characterize the application of this approach, and the results are worthy of trust.

  6. Analysis of catchments response to severe drought event for ...

    African Journals Online (AJOL)

    The Run Sum analysis method and the Low Flow Frequency Analysis using Weibull distribution were used to characterise the drought event. The analyses firstly noted that the two catchments under study responded differently to rainfall events. The Low flow frequency analysis was used to identify a threshold value, below ...

  7. Whole-Genome Analysis of Gene Conversion Events

    Science.gov (United States)

    Hsu, Chih-Hao; Zhang, Yu; Hardison, Ross; Miller, Webb

    Gene conversion events are often overlooked in analyses of genome evolution. In a conversion event, an interval of DNA sequence (not necessarily containing a gene) overwrites a highly similar sequence. The event creates relationships among genomic intervals that can confound attempts to identify orthologs and to transfer functional annotation between genomes. Here we examine 1,112,202 paralogous pairs of human genomic intervals, and detect conversion events in about 13.5% of them. Properties of the putative gene conversions are analyzed, such as the lengths of the paralogous pairs and the spacing between their sources and targets. Our approach is illustrated using conversion events in the beta-globin gene cluster.

  8. Analysis of loss of offsite power events reported in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Volkanovski, Andrija, E-mail: Andrija.VOLKANOVSKI@ec.europa.eu [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Ballesteros Avila, Antonio; Peinador Veira, Miguel [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Kančev, Duško [Kernkraftwerk Goesgen-Daeniken AG, CH-4658 Daeniken (Switzerland); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) gGmbH, Schwertnergasse 1, 50667 Köln (Germany); Stephan, Jean-Luc [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 – 92262 Fontenay-aux-Roses Cedex (France)

    2016-10-15

    Highlights: • Loss of offsite power events were identified in four databases. • Engineering analysis of relevant events was done. • The dominant root cause for LOOP is human failure. • Improved maintenance procedures can decrease the number of LOOP events. - Abstract: This paper presents the results of an analysis of loss of offsite power (LOOP) events in four databases of operational events. The screened databases include: the Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, the IAEA International Reporting System for Operating Experience (IRS), and the U.S. Licensee Event Reports (LER). In total, 228 relevant loss of offsite power events were identified in the IRSN database, 190 in the GRS database, 120 in the U.S. LER, and 52 in the IRS database. Identified events were classified into predefined categories. The results show that the largest percentage of LOOP events was registered during the on-power operational mode and lasted for two minutes or more. Plant-centered events are the main contributor to the LOOP events identified in the IRSN, GRS, and IAEA IRS databases, whereas switchyard-centered events are the main contributor among events registered in the NRC LER database. The main type of failed equipment is switchyard failures in IRSN and IAEA IRS, main or secondary lines in NRC LER, and busbar failures in the GRS database. The dominant root cause of the LOOP events is human failure during test, inspection, and maintenance, followed by human failure due to insufficient or wrong procedures. The largest number of LOOP events resulted in a reactor trip followed by EDG start. Actions that can reduce the number of LOOP events and minimize the consequences for plant safety are identified and presented.
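    The classification-and-tally step described above can be sketched in a few lines. The sample records below are invented for illustration; they are not the actual database entries.

```python
from collections import Counter

# Hypothetical LOOP event records, each tagged with database, category, and root cause.
events = [
    {"db": "IRSN", "category": "plant centered", "cause": "maintenance error"},
    {"db": "GRS",  "category": "plant centered", "cause": "wrong procedure"},
    {"db": "LER",  "category": "switchyard centered", "cause": "maintenance error"},
    {"db": "IRS",  "category": "grid centered", "cause": "weather"},
]

# Tally root causes across all databases to find the dominant one.
by_cause = Counter(e["cause"] for e in events)
print(by_cause.most_common(1)[0])  # ('maintenance error', 2)
```

    The same `Counter` pattern applies per database or per category, which is how contributor rankings like those in the abstract are obtained from classified event records.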

  9. The Human Brain Encodes Event Frequencies While Forming Subjective Beliefs

    Science.gov (United States)

    d’Acremont, Mathieu; Schultz, Wolfram; Bossaerts, Peter

    2015-01-01

    To make adaptive choices, humans need to estimate the probability of future events. Based on a Bayesian approach, it is assumed that probabilities are inferred by combining a priori, potentially subjective, knowledge with factual observations, but the precise neurobiological mechanism remains unknown. Here, we study whether neural encoding centers on subjective posterior probabilities, and data merely lead to updates of posteriors, or whether objective data are encoded separately alongside subjective knowledge. During fMRI, young adults acquired prior knowledge regarding uncertain events, repeatedly observed evidence in the form of stimuli, and estimated event probabilities. Participants combined prior knowledge with factual evidence using Bayesian principles. Expected reward inferred from prior knowledge was encoded in striatum. BOLD response in specific nodes of the default mode network (angular gyri, posterior cingulate, and medial prefrontal cortex) encoded the actual frequency of stimuli, unaffected by prior knowledge. In this network, activity increased with frequencies and thus reflected the accumulation of evidence. In contrast, Bayesian posterior probabilities, computed from prior knowledge and stimulus frequencies, were encoded in bilateral inferior frontal gyrus. Here activity increased for improbable events and thus signaled the violation of Bayesian predictions. Thus, subjective beliefs and stimulus frequencies were encoded in separate cortical regions. The advantage of such a separation is that objective evidence can be recombined with newly acquired knowledge when a reinterpretation of the evidence is called for. Overall this study reveals the coexistence in the brain of an experience-based system of inference and a knowledge-based system of inference. PMID:23804108
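    The Bayesian combination described above can be illustrated with a Beta-Binomial update, which is a standard way to merge prior knowledge with observed event frequencies. This is a generic sketch with made-up numbers, not the study's exact model.

```python
# Prior knowledge expressed as Beta pseudo-counts (assumed values).
prior_a, prior_b = 3.0, 7.0
# Observed evidence: how often the event occurred among the stimuli.
successes, trials = 8, 10

# Conjugate update: add observed counts to the prior pseudo-counts.
post_a = prior_a + successes
post_b = prior_b + (trials - successes)

prior_mean = prior_a / (prior_a + prior_b)  # prior belief alone
freq = successes / trials                   # raw frequency (default-network-like signal)
post_mean = post_a / (post_a + post_b)      # Bayesian posterior (frontal-like signal)

print(prior_mean, freq, post_mean)  # 0.3 0.8 0.55
```

    The study's point maps onto the three quantities: the raw frequency and the posterior probability are distinct signals, and keeping them separate lets the evidence be recombined later with revised prior knowledge.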

  10. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the validity of the methods.

  11. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years: there is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  12. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

    Full Text Available This article considers one of the important directions of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. Today the event-management market in Russia is still taking shape, so its impact on the socio-economic development of the regions and of Russia as a whole is minimal, and analysis of this influence is not performed. The problem comes to the fore in the regions of Russia that specialize in creating event-oriented tourist-recreational clusters. The article analyses the existing event-management market and the functions of event tourism, offers ways to improve the efficiency of event management, and provides recommendations for organizers of events in the regions. It shows the specific role of event tourism in national tourism and gives directions for the development of organizational and methodical recommendations on its formation in the regions of Russia and on the creation of an effective management system at the regional level. The purpose of this article is to analyze the emerging event-tourism market in Russia and its specifics. On the basis of these studies, the patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used in this article. Conclusions/significance. The practical importance of this article lies in resolving contradictions existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and people speak of tourist trips to events, yet event tourism does not exist as such. In all regions there is domestic and inbound tourism, but it has nothing to do with event tourism. The article's practical conclusions demonstrate the need to adapt the

  13. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    Science.gov (United States)

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most existing research has mainly focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both visual and semantic analysis is a natural way to approach video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. To compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  14. Chromothripsis: how does such a catastrophic event impact human reproduction?

    Science.gov (United States)

    Pellestor, Franck

    2014-03-01

The recent discovery of a new kind of massive chromosomal rearrangement, baptized chromothripsis (chromo for chromosomes, thripsis for shattering into pieces), greatly modifies our understanding of the molecular mechanisms implicated in the repair of DNA damage and the genesis of complex chromosomal rearrangements. Initially described in cancers, and then in constitutional rearrangements, chromothripsis is characterized by the shattering of segments of one (or a few) chromosome(s) followed by a chaotic reassembly of the chromosomal fragments, occurring during one unique cellular event. The diversity and the high complexity of chromothripsis events raise questions about their origin, their ties to chromosome instability and their impact in pathology. Several causative mechanisms, involving abortive apoptosis, telomere erosion, mitotic errors, micronuclei formation and p53 inactivation, have been proposed. The remarkable point is that all these mechanisms have been identified in the field of human reproduction as causal factors for reproductive failures and chromosomal abnormalities. Consequently, it seems important to consider this unexpected catastrophic phenomenon in the context of fertilization and early embryonic development in order to discuss its potential impact on human reproduction.

  15. A Dynamic Approach to Modeling Dependence Between Human Failure Events

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory

    2015-09-01

In practice, most HRA methods use direct dependence from THERP—the notion that error begets error, and one human failure event (HFE) may increase the likelihood of subsequent HFEs. In this paper, we approach dependence from a simulation perspective in which the effects of human errors are dynamically modeled. Three key concepts play into this modeling: (1) Errors are driven by performance shaping factors (PSFs). In this context, error propagation is not a result of the presence of an HFE yielding overall increases in subsequent HFEs; rather, it is shared PSFs that cause dependence. (2) PSFs have qualities of lag and latency. These two qualities are not currently considered in HRA methods that use PSFs. Yet, to model the effects of PSFs, it is not simply a matter of identifying the discrete effects of a particular PSF on performance. The effects of PSFs must be considered temporally, as the PSFs will have a range of effects across the event sequence. (3) Finally, there is the concept of error spilling: when PSFs are activated, they not only have temporal effects but also lateral effects on other PSFs, leading to emergent errors. This paper presents the framework for tying together these dynamic dependence concepts.
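
A minimal sketch of concepts (1) and (2), with all numbers and the exponential decay form invented for illustration (not taken from the paper): a shared PSF activates at one subtask and then decays over time (latency), inflating the HEPs of later subtasks, so dependence arises through the shared PSF rather than through the earlier failure itself.

```python
import math

NOMINAL_HEP = 1e-3  # invented nominal error probability

def psf_multiplier(t, t_activated, peak=5.0, decay=0.5):
    """Multiplier of a PSF activated at t_activated, decaying afterwards (latency)."""
    if t < t_activated:
        return 1.0
    return 1.0 + (peak - 1.0) * math.exp(-decay * (t - t_activated))

def hep_at(t, activations):
    """HEP of a subtask at time t under all activated PSFs."""
    m = 1.0
    for t0 in activations:
        m *= psf_multiplier(t, t0)
    return min(1.0, NOMINAL_HEP * m)

# A stressor activates at t=2; subtasks occur at t=1..5. Subtasks after t=2
# have elevated HEPs that relax back toward nominal as the PSF decays.
heps = [hep_at(t, activations=[2]) for t in range(1, 6)]
```

The point of the sketch is the shape, not the numbers: the HEP is elevated for every subtask sharing the activated PSF, with the elevation fading over the event sequence.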

  16. The flood event explorer - a web based framework for rapid flood event analysis

    Science.gov (United States)

    Schröter, Kai; Lüdtke, Stefan; Kreibich, Heidi; Merz, Bruno

    2015-04-01

Flood disaster management, recovery and reconstruction planning benefit from rapid evaluations of flood events and expected impacts. The near real-time, in-depth analysis of flood causes and key drivers of flood impacts requires close monitoring and documentation of hydro-meteorological and socio-economic factors. Within CEDIM's Rapid Flood Event Analysis project, a flood event analysis system is being developed which enables the near real-time evaluation of large-scale floods in Germany. The analysis system includes functionalities to compile event-related hydro-meteorological data, to evaluate the current flood situation, to assess hazard intensity and to estimate flood damage to residential buildings. A German flood event database is under development, which contains various hydro-meteorological information (in the future also impact information) for all large-scale floods since 1950. This database comprises data on historic flood events which allow the classification of ongoing floods in terms of triggering processes and pre-conditions, critical controls and drivers for flood losses. The flood event analysis system has been implemented in a database system which automatically retrieves and stores data from more than 100 online discharge gauges on a daily basis. The current discharge observations are evaluated in a long-term context in terms of flood frequency analysis. The web-based frontend visualizes the current flood situation in comparison to any past flood from the flood catalogue. The regional flood database for Germany contains hydro-meteorological data and aggregated severity indices for a set of 76 historic large-scale flood events in Germany. This database has been used to evaluate the key drivers of the flood in June 2013.
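
The long-term classification of current discharge observations rests on flood frequency analysis. One common textbook approach, sketched here with invented discharge values (a Gumbel distribution fitted to annual maxima by the method of moments; not necessarily the system's actual implementation):

```python
import math
import statistics

# Invented annual maximum discharges (m^3/s) at one gauge
annual_maxima = [820, 950, 700, 1100, 890, 1300, 760, 1020, 980, 870]

mean = statistics.mean(annual_maxima)
std = statistics.stdev(annual_maxima)
beta = std * math.sqrt(6) / math.pi   # Gumbel scale parameter
mu = mean - 0.5772 * beta             # Gumbel location (Euler-Mascheroni constant)

def return_period(q):
    """Return period (years) of a discharge q under the fitted Gumbel distribution."""
    p_non_exceed = math.exp(-math.exp(-(q - mu) / beta))
    return 1.0 / (1.0 - p_non_exceed)

# A current observation can then be classified, e.g. as a "100-year" flood
# if return_period(q) is near 100.
t_current = return_period(1300.0)
```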

  17. RESEARCH ON VISUAL ANALYSIS METHODS OF TERRORISM EVENTS

    Directory of Open Access Journals (Sweden)

    W. Guo

    2016-06-01

Full Text Available As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' governing ability. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the “5W” theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments have been carried out using the data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.
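
A sketch of what a “5W”-based event record might look like, together with one of the proposed analyses (temporal distribution). The field names and sample records are invented for illustration; the Global Terrorism Database uses its own, much richer schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TerrorEvent:
    who: str       # perpetrator group
    what: str      # attack type
    when: date     # date of the event
    where: tuple   # (latitude, longitude)
    why: str       # stated motive / target category

# Invented sample records
events = [
    TerrorEvent("unknown", "bombing", date(2014, 5, 1), (33.3, 44.4), "sectarian"),
    TerrorEvent("unknown", "armed assault", date(2014, 6, 7), (34.0, 43.1), "political"),
    TerrorEvent("unknown", "bombing", date(2014, 6, 21), (33.9, 44.0), "sectarian"),
]

# Temporal distribution by month, of the kind a statistical chart would display
by_month = {}
for e in events:
    by_month[e.when.month] = by_month.get(e.when.month, 0) + 1
```

The `where` coordinates would similarly feed a thematic map, and grouping by `who`/`why` gives the correlation views.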

  18. Multistate event history analysis with frailty

    NARCIS (Netherlands)

    Bijwaard, G.E.

    2014-01-01

    BACKGROUND In survival analysis a large literature using frailty models, or models with unobserved heterogeneity, exists. In the growing literature and modelling on multistate models, this issue is only in its infant phase. Ignoring frailty can, however, produce incorrect results. OBJECTIVE This

  19. A Fourier analysis of extremal events

    DEFF Research Database (Denmark)

    Zhao, Yuwei

    is the extremal periodogram. The extremal periodogram shares numerous asymptotic properties with the periodogram of a linear process in classical time series analysis: the asymptotic distribution of the periodogram ordinates at the Fourier frequencies have a similar form and smoothed versions of the periodogram...

  20. Visual Analysis of Humans

    CERN Document Server

    Moeslund, Thomas B

    2011-01-01

    This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes

  1. Video analysis of motor events in REM sleep behavior disorder.

    Science.gov (United States)

    Frauscher, Birgit; Gschliesser, Viola; Brandauer, Elisabeth; Ulmer, Hanno; Peralta, Cecilia M; Müller, Jörg; Poewe, Werner; Högl, Birgit

    2007-07-30

In REM sleep behavior disorder (RBD), several studies focused on electromyographic characterization of motor activity, whereas video analysis has remained more general. The aim of this study was to undertake a detailed and systematic video analysis. Nine polysomnographic records from 5 Parkinson patients with RBD were analyzed and compared with sex- and age-matched controls. Each motor event in the video during REM sleep was classified according to duration, type of movement, and topographical distribution. In RBD, a mean of 54 +/- 23.2 events/10 minutes of REM sleep (total 1392) were identified and visually analyzed. Seventy-five percent of all motor events lasted

  2. Sovereign Default Analysis through Extreme Events Identification

    Directory of Open Access Journals (Sweden)

    Vasile George MARICA

    2015-06-01

Full Text Available This paper investigates contagion in international credit markets through the use of a novel jump detection technique proposed by Chan and Maheu (2002). This econometric methodology is preferred because it is non-linear by definition and not subject to volatility bias. Also, the identified jumps in CDS premiums are considered as outliers positioned beyond any stochastic movement that can be, and already is, modelled through well-known linear analysis. Though contagion is hard to define, we show that extreme discrete movements in default probabilities inferred from CDS premiums can lead to sound economic conclusions about the risk profile of sovereign nations in international bond markets. We find evidence of investor sentiment clustering for countries with unstable political regimes or that are engaged in armed conflict. Countries that have in their recent history faced currency or financial crises are less vulnerable to external unexpected shocks. First we present a brief history of sovereign defaults, with an emphasis on their increased frequency and geographical reach as financial markets become more and more integrated. We then review the most important definitions of contagion and discuss what quantitative methods are available to detect its presence. The paper continues with the details of the methodology of jump detection through non-linear modelling and its use in the field of contagion identification. In the last sections we present the estimation results for simultaneous jumps between emerging-market CDS and draw conclusions on the difference in behavior in times of extreme movement versus tranquil periods.
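
For illustration only, here is a much simpler jump filter than the cited Chan and Maheu model (which is a GARCH-jump specification): flag daily CDS premium changes that exceed k rolling standard deviations as candidate jumps beyond ordinary stochastic movement. The series and thresholds are invented.

```python
import statistics

def detect_jumps(premiums, window=5, k=3.0):
    """Indices of premium observations whose daily change exceeds k rolling std devs."""
    changes = [b - a for a, b in zip(premiums, premiums[1:])]
    jumps = []
    for i in range(window, len(changes)):
        sigma = statistics.stdev(changes[i - window:i]) or 1e-9
        if abs(changes[i]) > k * sigma:
            jumps.append(i + 1)  # index into the premium series
    return jumps

# Invented quiet CDS premium series (basis points) with one abrupt repricing at the end
cds = [100, 101, 100, 102, 101, 100, 101, 102, 101, 140]
jump_days = detect_jumps(cds)
```

Simultaneous flags across several countries' series would be the analogue of the co-jumps the paper uses as a contagion signal.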

  3. A Quantitative Index to Support Recurrence Prevention Plans of Human-Related Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea [KAERI, Daejeon (Korea, Republic of); Kim, Do Sam; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

In Korea, HuRAM+ (Human related event Root cause Analysis Method plus) was developed to scrutinize the causes of human-related events. The information on the human-related events investigated by the HuRAM+ method has also been managed by a database management system, R-tracer. Accumulating data on human error causes is obviously meant to support plans that reduce recurrences of similar events. However, in spite of the efforts put into the development of the human error database, it was indicated that the database does not provide a useful empirical basis for the establishment of recurrence prevention plans, because a framework to interpret the collected data and apply the insights from the data to the prevention plans has not yet been developed. In this paper, in order to support the establishment of recurrence prevention plans, a quantitative index, the Human Error Repeat Interval (HERI), is proposed, and its applications to human error prevention and the statistics of HERIs are introduced. These estimations can be employed to evaluate the effects of recurrence prevention plans on human errors. If a mean HERI score is low and the linear trend is not positive, it can be suspected that the recurrence prevention plans applied after every human-related event have not been effectively propagated; for reducing repetitive error causes, the system design or operational culture can be reviewed. If there is a strong negative trend, systematic investigation of the root causes behind it is required. Likewise, we expect that the HERI index will provide a significant basis for establishing or adjusting prevention plans for human errors. Accurate estimation and application of HERI scores is expected after accumulating more data. When a scatter plot of HERIs is fitted by two or more models, a statistical model selection method can be employed. Some criteria have been introduced by
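
A sketch of how such an index could be computed, assuming the HERI is the elapsed time between consecutive events sharing the same error cause (the dates, the cause label, and this exact definition are invented here for illustration; the paper defines the index in its own terms):

```python
from datetime import date

# Invented event log: (error cause, event date)
events = [
    ("procedure_deviation", date(2010, 3, 1)),
    ("procedure_deviation", date(2010, 9, 10)),
    ("procedure_deviation", date(2011, 11, 2)),
    ("procedure_deviation", date(2013, 6, 20)),
]

def heri_days(events, cause):
    """Intervals in days between consecutive events sharing the same cause."""
    dates = sorted(d for c, d in events if c == cause)
    return [(b - a).days for a, b in zip(dates, dates[1:])]

def trend_slope(xs):
    """Least-squares slope of interval length versus occurrence index."""
    n = len(xs)
    mx = (n - 1) / 2
    my = sum(xs) / n
    num = sum((i - mx) * (x - my) for i, x in enumerate(xs))
    den = sum((i - mx) ** 2 for i in range(n))
    return num / den

intervals = heri_days(events, "procedure_deviation")
slope = trend_slope(intervals)  # positive slope: recurrences are spacing out
```

In the paper's reading, a low mean interval with a non-positive slope would flag prevention plans that are not taking hold for this cause.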

  4. Adverse events with use of antiepileptic drugs: a prescription and event symmetry analysis

    DEFF Research Database (Denmark)

    Tsiropoulos, Ioannis; Andersen, Morten; Hallas, Jesper

    2009-01-01

    PURPOSE: To assess adverse events with use of antiepileptic drugs (AEDs) by the method of sequence symmetry analysis. METHODS: We used data from two population-based sources in Funen County, Denmark (population 2006: 479 000); prescription data from Odense University Pharmacoepidemiological Datab...

  5. Serious adverse events with infliximab: analysis of spontaneously reported adverse events.

    Science.gov (United States)

    Hansen, Richard A; Gartlehner, Gerald; Powell, Gregory E; Sandler, Robert S

    2007-06-01

    Serious adverse events such as bowel obstruction, heart failure, infection, lymphoma, and neuropathy have been reported with infliximab. The aims of this study were to explore adverse event signals with infliximab by using a long period of post-marketing experience, stratifying by indication. The relative reporting of infliximab adverse events to the U.S. Food and Drug Administration (FDA) was assessed with the public release version of the adverse event reporting system (AERS) database from 1968 to third quarter 2005. On the basis of a systematic review of adverse events, Medical Dictionary for Regulatory Activities (MedDRA) terms were mapped to predefined categories of adverse events, including death, heart failure, hepatitis, infection, infusion reaction, lymphoma, myelosuppression, neuropathy, and obstruction. Disproportionality analysis was used to calculate the empiric Bayes geometric mean (EBGM) and corresponding 90% confidence intervals (EB05, EB95) for adverse event categories. Infliximab was identified as the suspect medication in 18,220 reports in the FDA AERS database. We identified a signal for lymphoma (EB05 = 6.9), neuropathy (EB05 = 3.8), infection (EB05 = 2.9), and bowel obstruction (EB05 = 2.8). The signal for granulomatous infections was stronger than the signal for non-granulomatous infections (EB05 = 12.6 and 2.4, respectively). The signals for bowel obstruction and infusion reaction were specific to patients with IBD; this suggests potential confounding by indication, especially for bowel obstruction. In light of this additional evidence of risk of lymphoma, neuropathy, and granulomatous infections, clinicians should stress this risk in the shared decision-making process.
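
A hedged sketch of disproportionality analysis: the study above uses the empirical Bayes geometric mean (EBGM), while the proportional reporting ratio (PRR) shown below is a simpler member of the same family of measures. The 2x2 counts are invented purely for illustration, not taken from AERS.

```python
def prr(a, b, c, d):
    """
    Proportional reporting ratio from a 2x2 report table:
      a: target event, suspect drug     b: other events, suspect drug
      c: target event, other drugs      d: other events, other drugs
    """
    return (a / (a + b)) / (c / (c + d))

# Invented counts: 40 reports of the target event among 2,000 reports for the
# suspect drug, versus 100 among 100,000 reports for all other drugs.
signal = prr(40, 1960, 100, 99900)
```

A PRR well above 1 plays the same role as a high EB05 bound: the event is reported disproportionately often with the drug, which is a signal for investigation, not proof of causation.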

  6. SIMULATED HUMAN ERROR PROBABILITY AND ITS APPLICATION TO DYNAMIC HUMAN FAILURE EVENTS

    Energy Technology Data Exchange (ETDEWEB)

    Herberger, Sarah M.; Boring, Ronald L.

    2016-10-01

Objectives: Human reliability analysis (HRA) methods typically analyze human failure events (HFEs) at the overall task level. For dynamic HRA, it is important to model human activities at the subtask level. There exists a disconnect between the dynamic subtask level and the static task level that presents issues when modeling dynamic scenarios. For example, the SPAR-H method is typically used to calculate the human error probability (HEP) at the task level. As demonstrated in this paper, quantification in SPAR-H does not translate to the subtask level. Methods: Two different discrete distributions were generated for each SPAR-H Performance Shaping Factor (PSF) to define the frequency of PSF levels. The first was a uniform, or uninformed, distribution that assumed the frequency of each PSF level was equally likely. The second distribution took the frequency of each PSF level as identified from an assessment of the HERA database. These two approaches were created to identify the resulting distribution of the HEP; the resulting HEP distribution that appears closer to the known distribution, a log-normal centered on 1E-3, is the more desirable. Each approach then has median, average and maximum HFE calculations applied. To calculate these three values, three events, A, B and C, are generated from the PSF level frequencies of the component subtasks. The median HFE selects the median PSF level from each PSF and calculates the HEP; the average HFE takes the mean PSF level, and the maximum takes the maximum PSF level. The same data set of subtask HEPs yields starkly different HEPs when aggregated to the HFE level in SPAR-H. Results: Assuming that each PSF level in each HFE is equally likely creates an unrealistic distribution of the HEP that is centered at 1. Next, the observed frequency of PSF levels was applied, with the resulting HEP behaving log-normally with a majority of the values under 2.5% HEP. The median, average and maximum HFE calculations did yield
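
A minimal SPAR-H-style sketch of the aggregation problem: the HEP is the nominal HEP scaled by the product of PSF multipliers, and choosing the median, mean, or maximum PSF level across subtasks yields starkly different task-level HEPs. The multiplier levels below are invented; only the general HEP = nominal × product-of-multipliers form follows SPAR-H.

```python
import math
import statistics

NOMINAL_HEP = 1e-3  # nominal HEP assumed for illustration

def hep(multipliers):
    """SPAR-H-style HEP: nominal HEP times the product of PSF multipliers."""
    return min(1.0, NOMINAL_HEP * math.prod(multipliers))

# Invented multiplier levels of a single PSF observed across five subtasks
psf_levels = [1, 1, 2, 5, 10]

hep_median = hep([statistics.median(psf_levels)])  # median PSF level
hep_mean = hep([statistics.mean(psf_levels)])      # average PSF level
hep_max = hep([max(psf_levels)])                   # worst-case PSF level
```

Even this toy case spreads the task-level HEP across half an order of magnitude, which is the disconnect between subtask and task quantification that the paper examines.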

  7. Statistical Analysis of Loss of Offsite Power Events

    Directory of Open Access Journals (Sweden)

    Andrija Volkanovski

    2016-01-01

Full Text Available This paper presents the results of the statistical analysis of loss of offsite power (LOOP) events registered in four reviewed databases. The reviewed databases include the IRSN (Institut de Radioprotection et de Sûreté Nucléaire) SAPIDE database and the GRS (Gesellschaft für Anlagen- und Reaktorsicherheit mbH) VERA database, reviewed over the period from 1992 to 2011. The US NRC (Nuclear Regulatory Commission) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS) database were screened for relevant events registered over the period from 1990 to 2013. The number of LOOP events in each year of the analysed period and the mode of operation were assessed during the screening. The LOOP frequencies obtained for the French and German nuclear power plants (NPPs) during critical operation are of the same order of magnitude, with plant-related events as the dominant contributor. A frequency of one LOOP event per shutdown year is obtained for German NPPs in the shutdown mode of operation. For the US NPPs, the obtained LOOP frequency for critical and shutdown modes is comparable to the one assessed in NUREG/CR-6890. A decreasing trend is obtained for the LOOP events registered in three databases (IRSN, GRS, and NRC).

  8. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC system (GEC-ELLIOTT System Crate) and shares CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for the execution of CAMAC cycles, communication with the Nord-10S computer, and DMA transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events, and the results of various checks are appended to the event. In the case of spurious triggers or clear beam-gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  9. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

Vaccines are one of the greatest inventions of modern medicine, having contributed most to the relief of human misery and the exciting increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating mankind with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Thanks to vaccination worldwide, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies have been used, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  10. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

Human reliability is the probability that a person correctly performs a system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Several first-generation HRA methods were developed, such as THERP (Technique for human error rate prediction). Now, an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for human event analysis). The ergonomics approach uses ergonomic work analysis as its tool. It focuses on the study of operators' activities in physical and mental form, considering at the same time the observed characteristics of the operators and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of the operators of industrial plant control rooms, using a framework that includes the approaches used by ATHEANA, THERP and ergonomic work analysis. (author)

  11. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    Science.gov (United States)

    Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.

    2009-09-01

This communication describes a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events, developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at the hydrologic time steps. Radar data is therefore the only way to access the space-time organization of the rainfall, but the quality of the radar data may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point: heavy rainfall is associated with convection, implying better visibility and less bright-band contamination compared with more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter; (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects) for non-attenuating wavelengths; and (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first proposed. Then the method is implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999), with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied
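
Step (3) can be sketched as follows, with invented reflectivity and gauge values: holding the exponent b of Z = aR^b fixed, the prefactor a is adjusted (here by simple bisection, which is an assumption, not the authors' optimization scheme) until the radar rainfall total matches the gauge total, removing the mean field bias.

```python
def rain_from_z(z_dbz, a, b=1.6):
    """Rain rate (mm/h) from reflectivity (dBZ) via Z = a * R^b."""
    z_lin = 10 ** (z_dbz / 10.0)      # dBZ -> linear reflectivity
    return (z_lin / a) ** (1.0 / b)

def calibrate_a(z_obs, gauge_total, b=1.6, a_lo=50.0, a_hi=1000.0):
    """Bisect on the prefactor a until the radar total matches the gauge total."""
    for _ in range(60):
        a = 0.5 * (a_lo + a_hi)
        if sum(rain_from_z(z, a, b) for z in z_obs) > gauge_total:
            a_lo = a   # larger a gives less rain, so raise the lower bound
        else:
            a_hi = a
    return 0.5 * (a_lo + a_hi)

# Invented radar observations (dBZ) over gauge locations and the gauge total (mm/h)
z_obs = [35, 40, 42, 38]
a_eff = calibrate_a(z_obs, gauge_total=30.0)
```

The resulting `a_eff` defines an "effective" Z-R relationship that is unbiased against the gauges for this particular event.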

  12. Post-event human decision errors: operator action tree/time reliability correlation

    Energy Technology Data Exchange (ETDEWEB)

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.
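
A sketch of a time reliability correlation of the kind described, assuming (as is common in TRC work, though the report's actual curve may differ) a lognormal distribution of crew decision times; the median and spread below are invented.

```python
import math

def p_non_response(t, median=5.0, sigma=0.8):
    """P(decision time > t minutes) under a lognormal; median and sigma are invented."""
    if t <= 0:
        return 1.0
    z = (math.log(t) - math.log(median)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# The longer the time available before the action is needed, the smaller the
# probability that the crew has not yet made the required decision.
p30 = p_non_response(30.0)
```

Hung on the branches of an operator action tree, such probabilities quantify the decision-error branches in an event tree/fault tree analysis.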

  13. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram....

  14. Deconstructive Misalignment: Archives, Events, and Humanities Approaches in Academic Development

    Directory of Open Access Journals (Sweden)

    Trevor M. Holmes

    2015-06-01

Full Text Available Using poetry, role play, readers’ theatre, and creative manipulations of space through yarn and paper weaving, a workshop in 2008 challenged one of educational development’s most pervasive and least questioned notions (“constructive alignment”, associated most often with the work of John Biggs). This paper describes the reasoning behind using humanities approaches specifically in this case and more generally in the Challenging Academic Development Collective’s work, as well as problematising the notions of “experiment” and “results” by unarchiving and re-archiving such a nonce-event. The critical stakes of using an anti-empirical method are broached, and readers are encouraged to experience their own version of the emergent truths of such approaches by drawing their own conclusions.

  15. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  16. The distribution of events in the human menstrual cycle.

    Science.gov (United States)

    Udry, J R; Morris, N M

    1977-11-01

    Daily reports of 85 married couples concerning their sexual behaviour for about 3 menstrual cycles per couple were organized according to menstrual events by using six techniques of aggregation. While there were some similarities among the different displays, including an apparent peak about 6 days before mid-cycle, different methods of aggregation produce widely different frequency curves. It is concluded that there is no single method of display of events of the menstrual cycle which will fit all investigations.

  17. An analysis of the 2016 Hitomi breakup event

    Science.gov (United States)

    Flegel, Sven; Bennett, James; Lachut, Michael; Möckel, Marek; Smith, Craig

    2017-04-01

    The breakup of Hitomi (ASTRO-H) on 26 March 2016 is analysed. Debris from the fragmentation is used to estimate the time of the event by propagating backwards and estimating the close approach with the parent object. Based on this method, the breakup event is predicted to have occurred at approximately 01:42 UTC on 26 March 2016. The Gaussian variation of parameters equations based on the instantaneous orbits at the predicted time of the event are solved to gain additional insight into the on-orbit position of Hitomi at the time of the event and to test an alternate approach of determining the event epoch and location. A conjunction analysis is carried out between Hitomi and all catalogued objects which were in orbit around the estimated time of the anomaly. Several debris objects have close approaches with Hitomi; however, there is no evidence to support that the breakup was caused by a catalogued object. Debris from both of the largest fragmentation events—the Iridium 33-Cosmos 2251 conjunction in 2009 and the intentional destruction of Fengyun 1C in 2007—is involved in close approaches with Hitomi, indicating the persistent threat these events pose to subsequent space missions. To quantify the magnitude of a potential conjunction, the fragmentation resulting from a collision with the debris is modelled using the EVOLVE-4 breakup model. The debris characteristics are estimated from two-line element data. This analysis is indicative of the threat to space assets that mission planners face due to the growing debris population. The impact of the actual event on the environment is investigated based on the debris associated with Hitomi which is currently contained in the United States Strategic Command's catalogue. A look at the active missions in the orbital vicinity of Hitomi reveals that the Hubble Space Telescope is among the spacecraft which may be immediately affected by the new debris.

  18. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the data in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  19. Human exposure and sensitivity to globally extreme wildfire events.

    Science.gov (United States)

    Bowman, David M J S; Williamson, Grant J; Abatzoglou, John T; Kolden, Crystal A; Cochrane, Mark A; Smith, Alistair M S

    2017-02-06

    Extreme wildfires have substantial economic, social and environmental impacts, but there is uncertainty whether such events are inevitable features of the Earth's fire ecology or a legacy of poor management and planning. We identify 478 extreme wildfire events, defined as the daily clusters of fire radiative power (FRP) from MODIS, within a global 10 × 10 km lattice, between 2002 and 2013, which exceeded the 99.997th percentile of over 23 million cases of ΣFRP 100 km⁻² in the MODIS record. These events are globally distributed across all flammable biomes, and are strongly associated with extreme fire weather conditions. Extreme wildfire events reported as being economically or socially disastrous (n = 144) were concentrated in suburban areas in flammable-forested biomes of the western United States and southeastern Australia, noting potential biases in reporting and the absence of globally comprehensive data of fire disasters. Climate change projections suggest an increase in days conducive to extreme wildfire events by 20 to 50% in these disaster-prone landscapes, with sharper increases in the subtropical Southern Hemisphere and European Mediterranean Basin.

  20. Identification and Analysis of Full Scale Ventilation Events

    Directory of Open Access Journals (Sweden)

    Luca Savio

    2012-01-01

    Full Text Available The present paper deals with propeller ventilation in full scale. The paper is based on full scale monitoring data from an offshore supply ship during normal operation. The data was collected by the on-line monitoring system HeMoS, developed by Rolls Royce Marine. The data covering one year and a half of ship operations were made available within the framework of the Era-Net Martec project PropSeas. The ventilation events are identified by means of an analysis procedure based on fuzzy logic. The paper contains both a basic introduction to fuzzy logic and a detailed description of the analysis procedure. The analysis procedure is then adopted to process the available data, find ventilation events, and form a set which is further analyzed including weather observations.
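The fuzzy-logic identification procedure described in this record can be illustrated with a minimal sketch. The symptom variables (a sudden torque drop and high shaft speed), the thresholds, and the triangular membership shapes are invented for illustration; they are not taken from the PropSeas analysis.

```python
def trimf(x, a, b, c):
    """Triangular fuzzy membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Invented sample readings: a drop in shaft torque relative to its recent
# mean, and shaft speed, as plausible ventilation symptoms.
torque_drop_pct = 35.0
shaft_speed_rpm = 160.0

mu_big_drop = trimf(torque_drop_pct, 10, 40, 70)    # "large torque drop"
mu_high_rpm = trimf(shaft_speed_rpm, 100, 150, 200)  # "high shaft speed"

# Mamdani-style AND: the rule fires to the degree of its weakest antecedent.
ventilation_degree = min(mu_big_drop, mu_high_rpm)
print(f"degree of 'ventilation event' = {ventilation_degree:.2f}")
```

A time series of such degrees can then be thresholded to mark candidate ventilation events for further analysis.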

  1. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20–22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  2. Complex epithelial remodeling underlies the fusion event in early fetal development of the human penile urethra.

    Science.gov (United States)

    Shen, Joel; Overland, Maya; Sinclair, Adriane; Cao, Mei; Yue, Xuan; Cunha, Gerald; Baskin, Laurence

    We recently described a two-step process of urethral plate canalization and urethral fold fusion to form the human penile urethra. Canalization ("opening zipper") opens the solid urethral plate into a groove, and fusion ("closing zipper") closes the urethral groove to form the penile urethra. We hypothesize that failure of canalization and/or fusion during human urethral formation can lead to hypospadias. Herein, we use scanning electron microscopy (SEM) and analysis of transverse serial sections to better characterize development of the human fetal penile urethra as contrasted to the development of the human fetal clitoris. Eighteen 7-13 week human fetal external genitalia specimens were analyzed by SEM, and fifteen additional human fetal specimens were sectioned for histologic analysis. SEM images demonstrate canalization of the urethral/vestibular plate in the developing male and female external genitalia, respectively, followed by proximal to distal fusion of the urethral folds in males only. The fusion process during penile development occurs sequentially in multiple layers and through the interlacing of epidermal "cords". Complex epithelial organization is also noted at the site of active canalization. The demarcation between the epidermis of the shaft and the glans becomes distinct during development, and the epithelial tag at the distal tip of the penile and clitoral glans regresses as development progresses. In summary, SEM analysis of human fetal specimens supports the two-zipper hypothesis of formation of the penile urethra. The opening zipper progresses from proximal to distal along the shaft of the penis and clitoris into the glans in identical fashion in both sexes. The closing zipper mechanism is active only in males and is not a single process but rather a series of layered fusion events, uniquely different from the simple fusion of two epithelial surfaces as occurs in formation of the palate and neural tube. 

  3. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory]; Joe, Jeffrey Clark [Idaho National Laboratory]

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  4. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  5. Distinctiveness enhances long-term event memory in non-human primates, irrespective of reinforcement.

    Science.gov (United States)

    Lewis, Amy; Call, Josep; Berntsen, Dorthe

    2017-08-01

    Non-human primates are capable of recalling events that occurred as long as 3 years ago, and are able to distinguish between similar events; akin to human memory. In humans, distinctiveness enhances memory for events; however, it is unknown whether the same occurs in non-human primates. As such, we tested three great ape species on their ability to remember an event that varied in distinctiveness. Across three experiments, apes witnessed a baiting event in which one of three identical containers was baited with food. After a delay of 2 weeks, we tested their memory for the location of the baited container. Apes failed to recall the baited container when the event was undistinctive (Experiment 1), but were successful when it was distinctive (Experiment 2), although performance was equally good in a less-distinctive condition. A third experiment (Experiment 3) confirmed that distinctiveness, independent of reinforcement, was a consistent predictor of performance. These findings suggest that distinctiveness may enhance memory for events in non-human primates in the same way as in humans, and provide further evidence of basic similarities between the ways apes and humans remember past events. © 2017 Wiley Periodicals, Inc.

  6. Integrating pedestrian simulation, tracking and event detection for crowd analysis

    OpenAIRE

    Butenuth, Matthias; Burkert, Florian; Kneidl, Angelika; Borrmann, Andre; Schmidt, Florian; Hinz, Stefan; Sirmacek, Beril; Hartmann, Dirk

    2011-01-01

    In this paper, an overall framework for crowd analysis is presented. Detection and tracking of pedestrians as well as detection of dense crowds is performed on image sequences to improve simulation models of pedestrian flows. Additionally, graph-based event detection is performed by using Hidden Markov Models on pedestrian trajectories utilizing knowledge from simulations. Experimental results show the benefit of our integrated framework using simulation and real-world data for crowd anal...

  7. Practical guidance for statistical analysis of operational event data

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  8. Event-by-event analysis of high multiplicity Pb(158 GeV/nucleon)-Ag/Br collisions

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, M.L.; Deines-Jones, P. [Louisiana State University, Baton Rouge (United States)]; Dabrowska, A. [Institute of Nuclear Physics, Cracow (Poland)] [and others]; KLM Collaboration

    1998-08-01

    High multiplicity nucleus-nucleus collisions are studied on an event-by-event basis. Different methods of analysis of individual collision events are presented and their ability to reveal anomalous features of the events is discussed. This study is based on full acceptance measurements of particle production in the interactions of 158 GeV/nucleon Pb with the heavy target nuclei in nuclear emulsion. No events are observed with global characteristics that differ significantly from expectations based on either Monte Carlo simulations, or the characteristics of the entire sample of events. On the other hand, it is shown that systematic analysis of particle density fluctuations in phase space domains of varying size, performed in terms of factorial moments, can be used as an effective trigger for events with large dynamical fluctuations. (author) 17 refs, 7 figs, 3 tabs

  9. Empirical Green's function analysis of recent moderate events in California

    Science.gov (United States)

    Hough, S.E.

    2001-01-01

    I use seismic data from portable digital stations and the broadband Terrascope network in southern California to investigate radiated earthquake source spectra and discuss the results in light of previous studies on both static stress drop and apparent stress. Applying the empirical Green's function (EGF) method to two sets of M 4-6.1 events, I obtain deconvolved source-spectra estimates and corner frequencies. The results are consistent with an ω⁻² source model and constant Brune stress drop. However, consideration of the raw spectral shapes of the largest events provides evidence for a high-frequency decay more shallow than ω⁻². The intermediate (∼f⁻¹) slope cannot be explained plausibly with attenuation or site effects and is qualitatively consistent with a model incorporating directivity effects and a fractional stress-drop rupture process, as suggested by Haddon (1996). However, the results obtained in this study are not consistent with the model of Haddon (1996) in that the intermediate slope is not revealed with EGF analysis. This could reflect either bandwidth limitations inherent in EGF analysis or perhaps a rupture process that is not self-similar. I show that a model with an intermediate spectral decay can also reconcile the apparent discrepancy between the scaling of static stress drop and that of apparent stress drop for moderate-to-large events.

  10. A Dendrochronological Analysis of Mississippi River Flood Events

    Science.gov (United States)

    Therrell, M. D.; Bialecki, M. B.; Peters, C.

    2012-12-01

    We used a novel tree-ring record of anatomically anomalous "flood rings" preserved in Oak (Quercus sp.) trees growing downstream of the Mississippi and Ohio River confluence to identify spring (MAM) flood events on the lower Mississippi River from C.E. 1694-2009. Our chronology includes virtually all of the observed high-magnitude spring floods of the 20th century as well as similar flood events in prior centuries occurring on the Mississippi River adjacent to the Birds Point-New Madrid Floodway. A response index analysis indicates that over half of the floods identified caused anatomical injury to well over 50% of the sampled trees and many of the greatest flood events are recorded by more than 80% of the trees at the site including 100% of the trees in the great flood of 1927. Twenty-five of the 40 floods identified as flood rings in the tree-ring record, occur during the instrumental observation period at New Madrid, Missouri (1879-2009), and comparison of the response index with average daily river stage height values indicates that the flood ring record can explain significant portions of the variance in both stage height (30%) and number of days in flood (40%) during spring flood events. The flood ring record also suggests that high-magnitude spring flooding is episodic and linked to basin-scale pluvial events driven by decadal-scale variability of the Pacific/North American pattern (PNA). This relationship suggests that the tree-ring record of flooding may also be used as a proxy record of atmospheric variability related to the PNA and related large-scale forcing.

  11. Probability distribution analysis of observational extreme events and model evaluation

    Science.gov (United States)

    Yu, Q.; Lau, A. K. H.; Fung, J. C. H.; Tsang, K. T.

    2016-12-01

    Earth's surface temperatures were the warmest in 2015 since modern record-keeping began in 1880, according to the latest study. In contrast, cold weather occurred in many regions of China in January 2016, bringing Guangzhou, the capital city of Guangdong province, its first snowfall in 67 years. To understand changes in extreme weather events and to project their future scenarios, this study applies statistical models to multiple climate datasets. We first use a Granger-causality test to identify the attribution of the global mean temperature rise and extreme temperature events to CO2 concentration. The four statistical moments (mean, variance, skewness, kurtosis) of the daily maximum temperature distribution are investigated for global climate observational and reanalysis (1961-2010) data and model data (1961-2100). Furthermore, we introduce a new tail index based on the four moments, which is a more robust index for measuring extreme temperatures. Our results show that the CO2 concentration can provide information about the time series of mean and extreme temperature, but not vice versa. Based on our new tail index, we find that, in addition to the mean and variance, skewness is an important indicator that should be considered when estimating extreme temperature changes and evaluating models. Among the 12 climate models we investigate, the fourth version of the Community Climate System Model (CCSM4) from the National Center for Atmospheric Research performs well on the new index we introduce, indicating that the model has substantial capability to project future changes of extreme temperature in the 21st century. The method also shows its ability to measure extreme precipitation/drought events. In the future we will introduce a new diagram to systematically evaluate the performance of the four statistical moments in climate model output; moreover, the human and economic impacts of extreme weather events will also be assessed.
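The four statistical moments named in this record are straightforward to compute; a minimal sketch on a synthetic daily-maximum-temperature series (the series and its parameters are invented, not climate data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for 50 years of daily maximum temperature (deg C).
tmax = rng.normal(loc=25.0, scale=5.0, size=365 * 50)

mean = tmax.mean()
variance = tmax.var(ddof=1)          # sample variance
skewness = stats.skew(tmax)          # third standardized moment
kurt = stats.kurtosis(tmax)          # excess kurtosis (0 for a normal)

print(f"mean={mean:.2f}  var={variance:.2f}  "
      f"skew={skewness:.3f}  kurt={kurt:.3f}")
```

For a normal series the skewness and excess kurtosis are near zero; departures in the upper tail are what a tail index built from these moments would flag.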

  12. EventView - The Design Behind an Analysis Framework

    CERN Document Server

    Cranmer, K; Shibata, A

    2007-01-01

    The development of software used to process petabytes of data per year is an elaborate project. The complexity of the detector means components of very diverse nature are required to process the data, and one needs well defined frameworks that are both flexible and maintainable. Modern programming architecture based on object-oriented component design supports desirable features of such frameworks. The principle has been applied in almost all sub-systems of ATLAS software and its robustness has benefited the collaboration. An implementation of such a framework for physics analysis, however, did not exist before the work presented in this paper. As it turns out, the realisation of an object-oriented analysis framework is closely related to the design of the event data object. In this paper, we will review the design behind the analysis framework developed around a data class called ``EventView''. It is a highly integrated part of the ATLAS software framework and is now becoming a standard platform for physics analysis...

  13. Dysbiotic Events in Gut Microbiota: Impact on Human Health

    Science.gov (United States)

    Schippa, Serena; Conte, Maria Pia

    2014-01-01

    The human body is colonized by a large number of microbes coexisting peacefully with their host. The most colonized site is the gastrointestinal tract (GIT). More than 70% of all the microbes in the human body are in the colon. The microorganism population is 10 times larger than the total number of our somatic and germ cells. Two bacterial phyla, accounting for more than 90% of the bacterial cells, dominate the healthy adult intestine: Firmicutes and Bacteroidetes. Considerable variability in microbiota composition between people is found at the taxonomic level of species, and of strains within species. It is possible to assert that the human microbiota could be compared to a fingerprint. The microbiota acts as a barrier against pathogens, exerts important metabolic functions, and regulates the inflammatory response by stimulating the immune system. Gut microbial imbalance (dysbiosis) has been linked to important human diseases such as inflammation-related disorders. The present review summarizes our knowledge of the gut microbiota in a healthy context, and examines intestinal dysbiosis in inflammatory bowel disease (IBD) patients, the most frequently reported disease proven to be associated with changes in the gut microbiota. PMID:25514560

  14. Dysbiotic Events in Gut Microbiota: Impact on Human Health

    Directory of Open Access Journals (Sweden)

    Serena Schippa

    2014-12-01

    Full Text Available The human body is colonized by a large number of microbes coexisting peacefully with their host. The most colonized site is the gastrointestinal tract (GIT). More than 70% of all the microbes in the human body are in the colon. The microorganism population is 10 times larger than the total number of our somatic and germ cells. Two bacterial phyla, accounting for more than 90% of the bacterial cells, dominate the healthy adult intestine: Firmicutes and Bacteroidetes. Considerable variability in microbiota composition between people is found at the taxonomic level of species, and of strains within species. It is possible to assert that the human microbiota could be compared to a fingerprint. The microbiota acts as a barrier against pathogens, exerts important metabolic functions, and regulates the inflammatory response by stimulating the immune system. Gut microbial imbalance (dysbiosis) has been linked to important human diseases such as inflammation-related disorders. The present review summarizes our knowledge of the gut microbiota in a healthy context, and examines intestinal dysbiosis in inflammatory bowel disease (IBD) patients, the most frequently reported disease proven to be associated with changes in the gut microbiota.

  15. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down to events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
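The event-list mechanism at the heart of discrete event simulation can be sketched in a few lines. The single-machine queue, arrival rate, and service rate below are invented stand-ins for illustration, not SIMMEK's job-shop model:

```python
import heapq
import random

# Minimal discrete-event simulation of one machine with exponential
# inter-arrival times (rate 1.0) and service times (rate 1.25).
random.seed(1)
ARRIVE, FINISH = 0, 1
events = [(random.expovariate(1.0), ARRIVE)]   # priority queue of (time, kind)
queue, busy, done, now = 0, False, 0, 0.0

while done < 1000:
    now, kind = heapq.heappop(events)          # next event in time order
    if kind == ARRIVE:
        # Schedule the following arrival, then seize the machine or wait.
        heapq.heappush(events, (now + random.expovariate(1.0), ARRIVE))
        if busy:
            queue += 1
        else:
            busy = True
            heapq.heappush(events, (now + random.expovariate(1.25), FINISH))
    else:  # FINISH
        done += 1
        if queue:                              # start the next waiting job
            queue -= 1
            heapq.heappush(events, (now + random.expovariate(1.25), FINISH))
        else:
            busy = False

print(f"processed {done} jobs by t={now:.1f}")
```

Collecting queue lengths or job sojourn times from such a run is how throughput and WIP effects are estimated before changing the real system.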

  16. Event-scale power law recession analysis: quantifying methodological uncertainty

    Science.gov (United States)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. 
In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
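The event-scale fitting choices discussed in this record can be illustrated with one common combination: estimating -dQ/dt by finite differences and fitting the power law recession model dQ/dt = -aQ^b by least squares in log-log space. The recession limb and parameter values below are synthetic, not catchment data:

```python
import numpy as np

# Synthetic recession limb from the analytic solution of dQ/dt = -a*Q**b
# (for b != 1): Q(t) = (Q0**(1-b) + (b-1)*a*t)**(1/(1-b)).
a_true, b_true = 0.05, 1.5
q0, days = 20.0, 30
t = np.arange(days)
q = (q0**(1 - b_true) + (b_true - 1) * a_true * t) ** (1 / (1 - b_true))

# One methodological choice: first-order finite differences for -dQ/dt,
# midpoint discharge, and an ordinary least-squares fit in log-log space:
# log(-dQ/dt) = log(a) + b*log(Q).
dq = -(q[1:] - q[:-1])            # daily decrement, positive on a recession
q_mid = 0.5 * (q[1:] + q[:-1])    # midpoint discharge for each step
b_hat, log_a_hat = np.polyfit(np.log(q_mid), np.log(dq), 1)
print(f"fitted b = {b_hat:.2f} (true 1.5), a = {np.exp(log_a_hat):.3f}")
```

Swapping the differencing scheme, the recession-start definition, or the regression method is exactly the kind of choice whose effect on a and b the paper quantifies.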

  17. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  18. Parallel evolutionary events in the haptoglobin gene clusters of rhesus monkey and human

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, L.M.; Maeda, N. [Univ. of North Carolina, Chapel Hill, NC (United States)]

    1994-08-01

    Parallel occurrences of evolutionary events in the haptoglobin gene clusters of rhesus monkeys and humans were studied. We found six different haplotypes among 11 individuals from two rhesus monkey families. The six haplotypes include two types of haptoglobin gene clusters: one type with a single gene and the other with two genes. DNA sequence analysis indicates that the one-gene and the two-gene clusters were both formed by unequal homologous crossovers between two genes of an ancestral three-gene cluster, near exon 5, the longest exon of the gene. This exon is also the location where a separate unequal homologous crossover occurred in the human lineage, forming the human two-gene haptoglobin gene cluster from an ancestral three-gene cluster. The occurrence of independent homologous unequal crossovers in rhesus monkey and in human within the same region of DNA suggests that the evolutionary history of the haptoglobin gene cluster in primates is the consequence of frequent homologous pairings facilitated by the longest and most conserved exon of the gene. 27 refs., 7 figs., 1 tab.

  19. The Consequential Problems of Unexpected Events for Human Element and Construction Organizations

    OpenAIRE

    Amir Khosravi; Abdul Hakim Bin Mohammed

    2013-01-01

    Unexpected events are unpredictable or beyond human control. The aim of this study was to identify the consequential problems of unexpected events faced by construction managers and project managers. In undertaking this investigation, we used an exploratory semi-structured interview and a questionnaire survey method. The results of this research showed that the consequential problems of unexpected events were frequently of the wicked, wicked-mess, and mess types. These wicked, ...

  20. Human kinematics and event control: On-line movement registration as a means for experimental manipulation

    NARCIS (Netherlands)

    Oudejans, R.R.D.; Coolen, H.

    2003-01-01

    In human movement and sports science, manipulations of perception and action are common and often comprise the control of events, such as opening or closing liquid crystal goggles. Most of these events are externally controlled, independent of the actions of the participants. Less common, although

  1. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; OR [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), but not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in
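As a sketch of the pooling step named in the abstract, the Mantel-Haenszel fixed-effects odds ratio combines per-trial 2x2 tables by weighting each trial's odds ratio. The trial counts below are hypothetical, not data from the meta-analysis:

```python
# Mantel-Haenszel fixed-effects pooled odds ratio across trials.
# Each trial is (events_treated, n_treated, events_control, n_control);
# the counts below are invented for illustration only.
def mantel_haenszel_or(trials):
    num = den = 0.0
    for a, n1, c, n2 in trials:
        b, d = n1 - a, n2 - c   # non-events in treated and control arms
        n = n1 + n2
        num += a * d / n        # weighted "exposed cases * unexposed non-cases"
        den += b * c / n        # weighted "exposed non-cases * unexposed cases"
    return num / den

trials = [(13, 200, 15, 200), (7, 150, 9, 148), (22, 400, 20, 395)]
or_mh = mantel_haenszel_or(trials)
print(round(or_mh, 3))
```

An OR near 1, as here, would correspond to the "no significant effect" findings reported above.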

  2. Event by Event Analysis of High Multiplicity Events Produced in 158 A GeV/c 208 Pb- 208 Pb Collisions

    CERN Document Server

    Ahmad, Shakeel; Kumar, Ashwini; Chaturvedi, O S K; Ahmad, A; Zafar, M; Irfan, M; Singh, B K

    2015-01-01

An extensive analysis of individual high-multiplicity events produced in 158 A GeV/c 208Pb-208Pb collisions is carried out by adopting different methods to examine the anomalous behavior of these rare events. A method of selecting events with densely populated narrow regions, or spikes, out of a given sample of collision events is discussed. Employing this approach, two events with large spikes in their eta- and phi-distributions are selected for further analysis. For the sake of comparison, another two events which do not exhibit such spikes are simultaneously analyzed. The findings suggest that systematic studies of particle density fluctuations in one- and two-dimensional phase spaces, and comparison with those obtained from studies of correlation-free Monte Carlo events, would be useful for identifying events with large dynamical fluctuations. The formation of clusters or jet-like phenomena in multihadronic final states in individual events is also discussed, and the experimental findings are compare...

  3. Human motion analysis and modeling

    Science.gov (United States)

    Prussing, Keith; Cathcart, J. Michael; Kocher, Brian

    2011-06-01

    Georgia Tech has investigated methods for the detection and tracking of personnel in a variety of acquisition environments. This research effort focused on a detailed phenomenological analysis of human physiology and signatures with the subsequent identification and characterization of potential observables. As a fundamental part of this research effort, Georgia Tech collected motion capture data on an individual for a variety of walking speeds, carrying loads, and load distributions. These data formed the basis for deriving fundamental properties of the individual's motion and supported the development of a physiologically-based human motion model. Subsequently this model aided the derivation and analysis of motion-based observables, particularly changes in the motion of various body components resulting from load variations. This paper will describe the data acquisition process, development of the human motion model, and use of the model in the observable analysis. Video sequences illustrating the motion data and modeling results will also be presented.

  4. Multivariate cluster analysis of forest fire events in Portugal

    Science.gov (United States)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to the present is the highest; likewise, with respect to the burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and the size of the area burnt by each fire, while the date/time information related to the fire ignition is restricted to the year of occurrence. In terms of statistical formalism, wildfires can be associated with a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. Analysis of the spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure to discover predisposing factors as well as for prevention and forecasting purposes. These kinds of studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify whether, in the case of wildfires in Portugal, space and time act independently or whether, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allows checking for a possibly different distribution between small and large fires. The final objective is to elaborate a 3D
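The spatio-temporal K-function mentioned above can be sketched as a naive pair-counting estimator (no edge correction, which a real analysis would need); the coordinates, years, and study window below are synthetic, not the ICNF data:

```python
# Naive space-time K-function estimate for a point pattern of fire centroids.
# All data here are synthetic illustrations, not the ICNF geodatabase.
import numpy as np

rng = np.random.default_rng(0)
n = 200
xy = rng.uniform(0.0, 100.0, size=(n, 2))   # centroids in a 100 km x 100 km window
yr = rng.integers(1990, 2013, size=n)       # year of occurrence (the only time info)

def st_k(xy, t, r, tau, area, t_span):
    """Scaled count of event pairs closer than r in space and tau in time."""
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    dt = np.abs(t[:, None] - t[None, :])
    pairs = ((d <= r) & (dt <= tau) & ~np.eye(n, dtype=bool)).sum()
    return area * t_span * pairs / (n * (n - 1))

k = st_k(xy, yr, r=10.0, tau=3, area=100.0 * 100.0, t_span=23)
# Under complete randomness with space and time independent, K(r, tau) is
# roughly pi * r**2 * (2 * tau); values well above that indicate clustering.
```

Comparing the estimate against the independence baseline is what lets one tell whether neighbouring events are also closer in time.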

  5. Nonstochastic Analysis of Manufacturing Systems Using Timed-Event Graphs

    DEFF Research Database (Denmark)

    Hulgaard, Henrik; Amon, Tod

    1996-01-01

Using automated methods to analyze the temporal behavior of manufacturing systems has proven to be essential and quite beneficial. Popular methodologies include queueing networks, Markov chains, simulation techniques, and discrete event systems (such as Petri nets). These methodologies are primarily stochastic. Performance evaluation mandates results which are probabilistic in nature (such as the average rate of part deliveries) and relies on probabilistic inputs (such as the probability of breakdown, or the distributions associated with a manufacturing process). This paper examines non-stochastic analysis, which we argue can be useful for verifying correct operation. We model manufacturing systems using timed event graphs, which are similar to decision-free Petri nets augmented with timing information, and present an example that demonstrates the efficacy of non-stochastic analysis.
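A standard non-stochastic result for timed event graphs is the cycle-time bound: the minimum achievable period is set by the circuit with the largest ratio of total delay to tokens. The two-machine example below is hypothetical, not from the paper:

```python
# Cycle time of a timed event graph: the throughput bound is the maximum,
# over all circuits, of (sum of transition delays) / (tokens in the circuit).
# The circuits of this small two-machine line are listed explicitly;
# the delays and token counts are invented for illustration.
circuits = [
    {"delays": [3, 2], "tokens": 1},        # machine A's processing loop
    {"delays": [4], "tokens": 1},           # machine B's processing loop
    {"delays": [3, 2, 4, 1], "tokens": 2},  # shared-buffer circuit
]
cycle_time = max(sum(c["delays"]) / c["tokens"] for c in circuits)
print(cycle_time)  # minimum period between successive part completions
```

Because the bound is deterministic, it verifies correct (deadline-meeting) operation without any probabilistic inputs, which is the contrast the abstract draws with stochastic methods.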

  6. Flooding in river mouths: human caused or natural events? Five centuries of flooding events in the SW Netherlands, 1500-2000

    NARCIS (Netherlands)

    de Kraker, A.M.J.

    2015-01-01

    This paper looks into flood events of the past 500 years in the SW Netherlands, addressing the issue of what kind of flooding events have occurred and which ones have mainly natural causes and which ones are predominantly human induced. The flood events are classified into two major categories: (a)

7. Discussion of Comments from a Peer Review of A Technique for Human Event Analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Wreathall, J.

    1999-01-28

In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of draft NUREG-1624, a peer review of the new human reliability analysis (HRA) method, its documentation, and the results of an initial test of the method was held over a two-day period in June 1998 in Seattle, Washington. Four internationally known and respected experts in HRA or probabilistic risk assessment were selected to serve as the peer reviewers. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. They were asked to focus on two major aspects: Are the basic premises of ATHEANA on solid ground, and is the conceptual basis adequate? Is the ATHEANA implementation process adequate, given the description of the intended users in the documentation? The four peer reviewers asked questions and provided oral comments during the peer review meeting and provided written comments approximately two weeks after the completion of the meeting. This paper discusses their major comments.

  8. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy)

    Science.gov (United States)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela; Perrotta, Piero; Russo, Luigi; Tansi, Carlo

    2017-11-01

Calabria (southern Italy) is a flood-prone region, due to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to performing a comparative analysis of two events affecting the same area 15 years apart, by collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite hit by a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed one man in a flood. The comparative analysis highlights that, although the triggering daily rain was more exceptional in the 2015 event, the damage caused by the 2000 event to both infrastructure and property was greater, and was made far worse by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in hazardous situations due to landslides or floods, contributed to reducing the number of victims.

  9. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events

    Science.gov (United States)

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-01-01

Increasing evidence has shown that sex differences exist in Adverse Drug Events (ADEs). Identifying those sex differences in ADEs could reduce the experience of ADEs for patients and could be conducive to the development of personalized medicine. In this study, we analyzed a normalized US Food and Drug Administration Adverse Event Reporting System (FAERS). A chi-squared test was conducted to discover which treatment regimens or drugs had sex differences in adverse events. Moreover, the reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect of the baseline sex difference of the events. Among 668 drugs of the 20 most frequent treatment regimens in the United States, we detected that 307 drugs have sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences. After removing the confounding effect of the baseline sex difference of the events, 266 combinations remained. Drug labels or previous studies verified some of them, while others warrant further investigation. PMID:27102014
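A minimal sketch of the signal metric named above: the reporting odds ratio (ROR) with a 95% confidence interval for one drug-event pair, comparing female and male report counts. The counts are invented for illustration, not FAERS data:

```python
# Reporting odds ratio (ROR) for a sex difference in one drug-event pair.
# a: female reports with the event, b: female reports without it,
# c: male reports with the event,  d: male reports without it.
# The counts below are hypothetical.
import math

def ror(a, b, c, d):
    return (a * d) / (b * c)

def ror_ci95(a, b, c, d):
    """Wald 95% CI on the log scale, the usual approximation for RORs."""
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_ror = math.log(ror(a, b, c, d))
    return math.exp(log_ror - 1.96 * se), math.exp(log_ror + 1.96 * se)

a, b, c, d = 120, 880, 60, 940
print(round(ror(a, b, c, d), 2), [round(x, 2) for x in ror_ci95(a, b, c, d)])
```

A CI whose lower bound exceeds 1 would flag the combination as a candidate sex-difference signal, subject to the confounding adjustment the abstract describes.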

  10. Bayesian analysis for extreme climatic events: A review

    Science.gov (United States)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events: detection of abrupt regime shifts, clustering of tropical cyclone tracks, and statistical forecasting of seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to demonstrate the posterior probability of shifts over time. For the data layer, a Poisson process with a gamma-distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change points are considered. To calculate the posterior probability for each hypothesis and its associated parameters, we developed an exact analytical formula, a Markov Chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov Chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a mixture Gaussian model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression-based approach to forecasting seasonal tropical cyclone frequency in a region is developed. Specifically, by adopting large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed toward a binary classification problem.
With a non
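The data layer described above (a Poisson process with a gamma-distributed rate) admits a closed-form conjugate update, which is what makes the exact analytical formula tractable. The annual counts and prior below are illustrative, not the paper's series:

```python
# Conjugate gamma-Poisson update for an annual extreme-event count series.
# With a Gamma(a0, b0) prior (shape, rate) on the Poisson rate, the posterior
# after observing the counts is Gamma(a0 + sum(counts), b0 + n_years).
# The counts and prior here are hypothetical.
counts = [2, 3, 1, 4, 2, 5, 3, 2]   # invented annual typhoon counts
a0, b0 = 2.0, 1.0                   # a weakly informative prior

a_post = a0 + sum(counts)           # shape + total events observed
b_post = b0 + len(counts)           # rate + number of years observed
posterior_mean = a_post / b_post    # posterior-mean estimate of the rate
print(round(posterior_mean, 3))
```

Running this update separately on the segments either side of a candidate change point, and comparing marginal likelihoods, is the essence of the change-point detection the review describes.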

  11. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

Indonesia is very vulnerable to flood disasters because it experiences heavy rainfall events throughout the year. Floods are categorized as the most important hazard because they cause social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets, to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) preceding flood disasters in areas with monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution; the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution; and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  12. Analysis of SUSY Heavy Higgs events at CLIC

    CERN Document Server

    Quevillon, J

    2009-01-01

This paper reports the results of a study of the supersymmetric neutral heavy Higgs boson production channel e+e− → H0A0 → bb̄bb̄ at √s = 3 TeV. Reconstruction of data simulated at generator level shows a significant degradation of the SUSY heavy Higgs signal caused by the γγ → hadrons background at √s = 3 TeV. The importance of analysis procedures such as event cuts and transverse momentum cuts during jet clustering to reduce the impact of the hadron background is underlined. Reconstruction at both the generator level and at the level of a full detector simulation forces us to introduce cuts to improve the quality of the results. This note describes a preliminary study of the SUSY heavy Higgs at CLIC; a more detailed paper on an extended study is in preparation.

  13. Formal Analysis of BPMN Models Using Event-B

    Science.gov (United States)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  14. Nonlinear impacts of small-scale natural events on Nineteenth Century human decision-making

    Science.gov (United States)

    McCormack, S. M.; Schlichting, K. M.; Urbanova, T.; Allen, T. L.; Ruffing, C. M.; Hermans, C. M.

    2009-12-01

Natural climatological events that occurred throughout the Nineteenth Century, such as floods, droughts and hurricanes, had long-lived, far-reaching consequences for the human decision-making processes occurring in the northeast United States. These events impacted the hydrological cycle both directly, through the building of various structures, and indirectly, through an increased understanding of science and the changing relationship between humans and their environment. This paper examines these events and associated processes by: 1) identifying specific natural events throughout the time period, occurring globally, with initial conditions conducive to long-lived consequences; 2) examining the relationship between scientific enquiry, natural events and the proliferation of dams in the northeast landscape; and 3) tracing the growth of public health concerns, awareness of bacteriology, and municipal water supply systems. Results of this research indicate that the relationship between knowledge systems, natural events and subsequent engineering or technological fixes is complex and highly dependent on initial conditions. It highlights the time period when humans became increasingly dependent on engineered solutions to environmental problems, many of which still hold fast in our contemporary landscape. It is relevant to natural, social and governance structures in place today. The principles behind the occurrence of the natural phenomena and subsequent research and design have not changed, and understanding key events or stages in the past is essential for making predictions about the future.

  15. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: A separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data. The user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code. A simple way of augmenting and reducing the information saved for different data objects. This makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need. A persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...

  16. Changes in extreme events and the potential impacts on human health.

    Science.gov (United States)

    Bell, Jesse E; Brown, Claudia Langford; Conlon, Kathryn; Herring, Stephanie; Kunkel, Kenneth E; Lawrimore, Jay; Luber, George; Schreck, Carl; Smith, Adam; Uejio, Christopher

    2017-11-29

Extreme weather and climate-related events affect human health by causing death, injury, and illness, as well as having large socioeconomic impacts. Climate change has caused changes in extreme event frequency, intensity and geographic distribution, and will continue to be a driver for change in the future. Some of these events include heat waves, droughts, wildfires, dust storms, flooding rains, coastal flooding, storm surge, and hurricanes. The pathways connecting extreme events to health outcomes and economic losses can be diverse and complex. The difficulty in predicting these relationships comes from the local societal and environmental factors that affect disease burden. More information is needed about the impacts of climate change on public health and economies to effectively plan for and adapt to climate change. This article describes some of the ways extreme events are changing and provides examples of the potential impacts on human health and infrastructure. It also identifies key research gaps to be addressed to improve the resilience of public health to extreme events in the future.

  17. Meaningful-Experience Creation and Event Management A Post-Event Analysis of Copenhagen Carnival 2009

    Directory of Open Access Journals (Sweden)

    Sarah Holst Kjær

    2011-06-01

A carnival is a cultural event within the experience economy, and can be considered an activity that adds value to a city by creating place-awareness for tourists and residents. Judging from EU, as well as Nordic, cultural policy reports, 'culture' is used as a way to regenerate post-industrial and run-down places, though this might be too much to expect from the cultural sector. Among other external factors, cultural policy ideals co-create and affect the experiential content of an event in various ways. Thus, when studying a carnival, one has to include external and internal factors in order to evaluate their meaningfulness in the total experience of the event. One way to investigate what a meaningful experience is can be to apply a cultural consumer perspective. How different consumer segments directly and indirectly inform the event organisation, and how consumers' cultural preconceptions judge the event, is vital when an event organisation designs and improves its experience concepts and experience settings. The way the carnival's venue and activities are culturally received is thus closely linked to the management of the organisation's external and internal resources. The goal of an event organisation is to produce meaningful and appealing experience concepts and perform them in real time. But how is this organised in practice? This article evaluates the production of the Copenhagen Carnival 2009 and is based on ethnographic material. Through the Value Framework for Experience Production model by the Dutch experience economists Albert Boswijk, Thomas Thijssen & Ed Peelen (2007), I analyse how the practical organisation, technical solutions and cultural assumptions of a carnival are part of an event organisation's work process when creating a spectacle. Furthermore, the organisation of voluntary professional culture workers and the navigation in a metropolitan, political and institutional context is examined through the

  18. Monitoring adverse events following immunisation in developing countries: experience from human papillomavirus vaccination demonstration projects.

    Science.gov (United States)

    Jain, Kriti M; Paul, Proma; LaMontagne, D Scott

    2013-03-01

    Surveillance of adverse events following immunisation (AEFIs) is important for maintaining trust in vaccination. This paper discusses retrospective reports by parents and guardians of girls experiencing AEFIs during human papillomavirus (HPV) vaccine demonstration projects in Uganda and Vietnam. A secondary analysis of data from a population-based survey measuring HPV vaccine coverage of eligible girls and acceptability among parents and guardians was conducted. Survey data from parents were analysed for frequency and type of AEFI and actions taken. Of the 1700 eligible households contacted, all responded to the survey; of those, 1313 respondents had an eligible child who had received at least one dose of the HPV vaccine. Data were missing from 49 respondents, resulting in 1264 surveys. Twenty-five percent reported an AEFI, with fever (29.1%) and pain or swelling at the injection site (62.0%) being the most common. Events totalled 386 (10.5%) of the 3684 doses administered. Most parents reported that they took no action (63.9%) or cared for girls at home (16.1%) following an AEFI. Thirty-three parents sought advice from health workers or attended a clinic for 46 events (0.8% of all doses). Frequency of reporting varied by respondent identity, geographic location and vaccination location. AEFIs reported were similar to Phase III vaccine trials. Most parents reporting AEFIs took no action or treated the girl at home, suggesting that most AEFIs were not serious enough to contact the health system. AEFI reports were more frequent when solicited in surveys compared with reports from routine monitoring.

  19. Feature extraction of event-related potentials using wavelets: an application to human performance monitoring

    Science.gov (United States)

    Trejo, L. J.; Shensa, M. J.

    1999-01-01

    This report describes the development and evaluation of mathematical models for predicting human performance from discrete wavelet transforms (DWT) of event-related potentials (ERP) elicited by task-relevant stimuli. The DWT was compared to principal components analysis (PCA) for representation of ERPs in linear regression and neural network models developed to predict a composite measure of human signal detection performance. Linear regression models based on coefficients of the decimated DWT predicted signal detection performance with half as many free parameters as comparable models based on PCA scores. In addition, the DWT-based models were more resistant to model degradation due to over-fitting than PCA-based models. Feed-forward neural networks were trained using the backpropagation algorithm to predict signal detection performance based on raw ERPs, PCA scores, or high-power coefficients of the DWT. Neural networks based on high-power DWT coefficients trained with fewer iterations, generalized to new data better, and were more resistant to overfitting than networks based on raw ERPs. Networks based on PCA scores did not generalize to new data as well as either the DWT network or the raw ERP network. The results show that wavelet expansions represent the ERP efficiently and extract behaviorally important features for use in linear regression or neural network models of human performance. The efficiency of the DWT is discussed in terms of its decorrelation and energy compaction properties. In addition, the DWT models provided evidence that a pattern of low-frequency activity (1 to 3.5 Hz) occurring at specific times and scalp locations is a reliable correlate of human signal detection performance. Copyright 1999 Academic Press.
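A one-level Haar DWT is a minimal stand-in for the decimated DWT used in the models above, and can be written with NumPy alone. The sine-wave "ERP" is synthetic; real ERPs and the report's multi-level wavelet family are not reproduced here:

```python
# One decimation level of the Haar DWT: the kind of coefficients that the
# regression and neural-network models above take as features.
# The input signal is synthetic, standing in for a recorded ERP.
import numpy as np

def haar_dwt(x):
    """Return (approximation, detail) coefficients for one decimation level."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: slow activity
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: fast activity
    return s, d

t = np.linspace(0.0, 1.0, 256, endpoint=False)
erp = np.sin(2 * np.pi * 2.0 * t)          # slow component, like the 1-3.5 Hz band
approx, detail = haar_dwt(erp)

# The Haar basis is orthonormal, so signal energy is preserved exactly,
# which is the energy-compaction property the report discusses.
print(np.allclose((approx ** 2).sum() + (detail ** 2).sum(), (erp ** 2).sum()))
```

For a slow signal like this one, nearly all energy lands in the approximation coefficients, illustrating why a handful of high-power DWT coefficients can stand in for the whole ERP.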

20. Cluster analysis of intermediate-depth events in the southeastern Aegean

    Science.gov (United States)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2015-04-01

The Hellenic subduction zone (HSZ) is the seismically most active region in Europe, where the oceanic African lithosphere is subducting beneath the continental Aegean plate. Although there are numerous studies of seismicity in the HSZ, very few focus on the eastern HSZ and the Wadati-Benioff zone of the subducting slab in that part of the HSZ. In order to gain a better understanding of the geodynamic processes in the region, a dense local seismic network is required. From September 2005 to March 2007, the temporary seismic network EGELADOS was deployed covering the entire HSZ. It consisted of 56 onshore and 23 offshore broadband stations, with the addition of 19 stations from GEOFON, NOA and MedNet to complete the network. Here, we focus on a cluster of intermediate-depth seismicity recorded by the EGELADOS network within the subducting African slab in the region of the Nisyros volcano. The cluster consists of 159 events at 80 to 190 km depth with magnitudes between 0.2 and 4.1 that were located using the nonlinear location tool NonLinLoc. A double-difference earthquake relocation using the HypoDD software is performed with both manual readings of onset times and differential traveltimes obtained by separate cross-correlation of P- and S-waveforms. Single-event locations are compared to relative relocations. The event hypocenters fall into a thin zone close to the top of the slab, defining its geometry with an accuracy of a few kilometers. At intermediate depth the slab is dipping towards the NW at an angle of about 30°, i.e. more steeply than in the western part of the HSZ. The edge of the slab is clearly defined by an abrupt disappearance of intermediate-depth seismicity towards the NE, found approximately beneath the Turkish coastline. Furthermore, results of a cluster analysis based on the cross-correlation of three-component waveforms are shown as a function of frequency, and the spatio-temporal migration of the seismic activity is analysed.

  1. Chain of events analysis for a scuba diving fatality.

    Science.gov (United States)

    Lippmann, John; Stevenson, Christopher; McD Taylor, David; Williams, Jo; Mohebbi, Mohammadreza

    2017-09-01

    A scuba diving fatality usually involves a series of related events culminating in death. Several studies have utilised a chain of events-type analysis (CEA) to isolate and better understand the accident sequence in order to facilitate the creation of relevant countermeasures. The aim of this research was to further develop and better define a process for performing a CEA, to reduce potential subjectivity and increase consistency between analysts. To develop more comprehensive and better-defined criteria, existing criteria were modified and a template was created and tested using a CEA. Modifications comprised the addition of a category for predisposing factors, expansion of criteria for the triggers and disabling agents present during the incident, and more specific inclusion criteria to better encompass a dataset of 56 fatalities. Four investigators (raters) used both the previous criteria and this template, in randomly assigned order, to examine a sample of 13 scuba diver deaths. Individual results were scored against the group consensus for the CEA. Consistency of raters' agreement was compared using the index of concordance and intra-class correlation coefficients (ICCs). The template is presented. The index of concordance between the raters increased from 62% (194⁄312) using the previous criteria to 82% (257⁄312) with this template, indicating substantially higher inter-rater agreement when allocating criteria. Agreement between scoring with and without the template was also quantified by ICCs, which were generally low, indicating a substantial change in scoring consistency after template use. The template for a CEA of a scuba diving fatality improves consistency of interpretation between users and may improve comparability of diving fatality reports.

  2. Balzac and human gait analysis.

    Science.gov (United States)

    Collado-Vázquez, S; Carrillo, J M

    2015-05-01

    People have been interested in movement analysis in general, and gait analysis in particular, since ancient times. Aristotle, Hippocrates, Galen, Leonardo da Vinci and Honoré de Balzac all used observation to analyse the gait of human beings. The purpose of this study is to compare Honoré de Balzac's writings with a scientific analysis of human gait, drawing on his Theory of walking and other works by the author that refer to gait. Balzac had an interest in gait analysis, as demonstrated by his descriptions of characters, which often include references to their way of walking. He also wrote a treatise entitled Theory of walking (Théorie de la démarche) in which he employed his keen observation skills to define gait in a literary style. He stated that the walking process is divided into phases and listed the factors that influence gait, such as personality, mood, height, weight, profession and social class, and he also provided a description of the correct way of walking. Balzac considered gait analysis to be very important, and this is reflected both in his character descriptions and in Theory of walking, his analytical observation of gait. In our own technology-dominated times, this serves as a reminder of the importance of observation. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  3. The use of Kolmogorov-Smirnov test in event-by-event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tomasik, Boris [Univerzita Mateja Bela, Tajovskeho 40, 97401 Banska Bystrica (Slovakia); Czech Technical University in Prague, FNSPE, Brehova 11, 11519 Prague (Czech Republic); Melo, Ivan [Zilinska Univerzita, Univerzitna 1, 01026 Zilina (Slovakia); Torrieri, Giorgio [FIAS, Goethe-Universitaet, Ruth-Moufang-Str. 1, 60438 Frankfurt (Germany); Vogel, Sascha; Bleicher, Marcus [Institut fuer Theoretische Physik, Goethe-Universitaet, Max-von-Laue-Str. 1, 60438 Frankfurt (Germany)

    2009-11-01

    We propose to use the Kolmogorov-Smirnov test to uncover non-statistical differences between events created in heavy ion collisions within the same centrality class. The advantage of the method over other approaches currently in use is that it is sensitive to any difference between the events and is not restricted to simple moments of the distribution of hadrons. The particular application examined here is the identification of the fireball decay due to spinodal fragmentation and/or a sudden rise of the bulk viscosity.
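
    As a minimal illustration of the statistic involved (not the authors' code), the two-sample Kolmogorov-Smirnov statistic is the maximum distance between two empirical CDFs; here an event-wise observable, e.g. hadron transverse momenta, is compared between two events. The sample values are invented.

    ```python
    def ks_statistic(x, y):
        """Maximum distance between the two empirical CDFs (pure Python)."""
        xs, ys = sorted(x), sorted(y)
        nx, ny = len(xs), len(ys)
        d = 0.0
        for v in sorted(set(xs + ys)):
            cdf_x = sum(1 for t in xs if t <= v) / nx
            cdf_y = sum(1 for t in ys if t <= v) / ny
            d = max(d, abs(cdf_x - cdf_y))
        return d

    event_a = [0.3, 0.5, 0.7, 1.1, 1.4, 2.0]   # e.g. pT values in event A
    event_b = [0.4, 0.9, 1.2, 1.6, 2.3, 2.9]   # e.g. pT values in event B
    print(f"D = {ks_statistic(event_a, event_b):.3f}")
    ```

    Because the statistic uses the full CDF, it responds to any distributional difference between events, not only to shifts in a particular moment.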

  4. Nurses' critical event risk assessments: a judgement analysis.

    Science.gov (United States)

    Thompson, Carl; Bucknall, Tracey; Estabrookes, Carole A; Hutchinson, Alison; Fraser, Kim; de Vos, Rien; Binnecade, Jan; Barrat, Gez; Saunders, Jane

    2009-02-01

    To explore and explain nurses' use of readily available clinical information when deciding whether a patient is at risk of a critical event. Half of inpatients who suffer a cardiac arrest have documented but unacted upon clinical signs of deterioration in the 24 hours prior to the event. Nurses appear to be both misinterpreting and mismanaging the nursing-knowledge 'basics' such as heart rate, respiratory rate and oxygenation. Whilst many medical interventions originate from nurses, up to 26% of nurses' responses to abnormal signs result in delays of between one and three hours. A double system judgement analysis using Brunswik's lens model of cognition was undertaken with 245 Dutch, UK, Canadian and Australian acute care nurses. Nurses were asked to judge the likelihood of a critical event, 'at-risk' status, and whether they would intervene in response to 50 computer-presented clinical scenarios in which data on heart rate, systolic blood pressure, urine output, oxygen saturation, conscious level and oxygenation support were varied. Nurses were also presented with a protocol recommendation and placed under time pressure for some of the scenarios. The ecological criterion was the predicted level of risk from the Modified Early Warning Score assessments of 232 UK acute care inpatients. Despite receiving identical information, nurses varied considerably in their risk assessments. The differences can be partly explained by variability in the weightings given to information. Time and protocol recommendations were given more weighting than clinical information for key dichotomous choices such as classifying a patient as 'at risk' and deciding to intervene. Nurses' weighting of cues did not mirror the same information's contribution to risk in real patients. Nurses synthesized information in non-linear ways that contributed little to decisional accuracy. The low-to-moderate achievement (Ra) statistics suggest that nurses' assessments of risk were largely inaccurate.

  5. ANALYSIS OF A MEDIA EVENT. CASE STUDY: EUROVISION 2012

    Directory of Open Access Journals (Sweden)

    Nicoleta CIACU

    2012-01-01

    Full Text Available The objective of this study is to identify the characteristics of a media event and to analyze the specific features of a major event in Europe, the Eurovision Song Contest. The research design was based on a theoretical presentation of the media-event concept, related to the interpretation of the specific features of this year's edition. The case study starts by framing the event in the restorative event category, because the event itself is the result of over-exposure, both pre- and post-event and especially during it. Another aspect that gives Eurovision the label of a 'media event' is its interrupting nature, given by the mobilization of the public, who abandoned their daily activities and participated in the event on the ground in Baku or in front of the TV. The anticipated nature of the event is reflected in the frequency with which it has taken place from 1956 to the present, and in its over-exposure as well, it being the longest-running programme in television history, with the largest international audience of any non-sporting broadcast.

  6. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and the length of the free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication-related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication-related patient safety events, and the models were compared. Well-performing NLP models were generated to categorize medication-related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors, with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief description without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text-inclusion strategies at categorizing medication-related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
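
    A toy bag-of-words categorizer conveys the flavor of this kind of free-text triage. Everything below (the categories, training snippets, and scoring rule) is invented for illustration; the study's actual models and data are not reproduced here.

    ```python
    from collections import Counter

    # Hypothetical labelled report snippets for two of the categories.
    training = {
        "pharmacy delivery delay": ["medication arrived late from pharmacy",
                                    "delivery of drug delayed several hours"],
        "dispensing error":        ["wrong dose dispensed by pharmacy",
                                    "dispensed incorrect medication strength"],
    }

    def tokenize(text):
        return text.lower().split()

    # One word-count "centroid" per category, pooled over its training docs.
    centroids = {cat: Counter(w for doc in docs for w in tokenize(doc))
                 for cat, docs in training.items()}

    def classify(report):
        """Assign the category whose centroid shares the most word mass."""
        words = Counter(tokenize(report))
        scores = {cat: sum(min(words[w], c[w]) for w in words)
                  for cat, c in centroids.items()}
        return max(scores, key=scores.get)

    print(classify("pharmacy dispensed the wrong strength"))
    ```

    Real systems would use richer features (n-grams, TF-IDF) and a trained classifier evaluated by ROC AUC, as in the study, but the pipeline shape (tokenize, featurize, score per category) is the same.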

  7. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
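
    The central extraction step the tutorial describes, averaging many time-locked epochs so background EEG cancels while the event-related response survives, can be shown in a toy simulation. All signal parameters here (epoch length, noise level, ERP shape) are invented for the sketch.

    ```python
    import math
    import random

    random.seed(0)
    n_trials, n_samples = 200, 50

    def erp_template(t):
        # A single positive deflection peaking mid-epoch (arbitrary shape).
        return math.exp(-((t - 25) ** 2) / 40.0)

    # Each simulated trial is the ERP buried in much larger random activity.
    noise_scale = 2.0
    trials = [[erp_template(t) + random.gauss(0, noise_scale)
               for t in range(n_samples)]
              for _ in range(n_trials)]

    # Averaging across trials shrinks the noise by ~1/sqrt(n_trials).
    average = [sum(trial[t] for trial in trials) / n_trials
               for t in range(n_samples)]

    peak = max(range(n_samples), key=lambda t: average[t])
    print(f"averaged waveform peaks at sample {peak}")
    ```

    A single trial here is dominated by noise, yet the 200-trial average recovers the deflection near its true latency, which is why averaging precedes most ERP component measurements.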

  8. Analysis of Multi Muon Events in the L3 Detector

    CERN Document Server

    Schmitt, Volker

    2000-01-01

    The muon density distribution in air showers initiated by cosmic particles is sensitive to the chemical composition of cosmic rays. The density can be measured via the multiplicity distribution in a finite-size detector, as is L3. At a shallow depth of 30 meters underground, the detector provides an excellent facility to measure a high muon rate while being shielded from the hadronic and electronic shower components. The subject of this thesis is the description of the L3 Cosmics experiment (L3+C), which has been taking data since May 1999, and the analysis of muon bundles in the large magnetic spectrometer of L3. The new cosmic trigger and readout system is briefly described. The influence of different primaries on the multiplicity distribution has been investigated using Monte Carlo event samples generated with the CORSIKA program. The simulation results showed that L3+C measures in the region of the "knee" of the primary spectrum of cosmic rays. A new pattern recognition has been developed and added to the reconstruction code, which ...

  9. Analysis of the human Alu Ye lineage

    Directory of Open Access Journals (Sweden)

    Jurka Jerzy

    2005-02-01

    Full Text Available Abstract Background Alu elements are short (~300 bp) interspersed elements that amplify in primate genomes through a process termed retroposition. The expansion of these elements has had a significant impact on the structure and function of primate genomes. Approximately 10% of the mass of the human genome is comprised of Alu elements, making them the most abundant short interspersed element (SINE) in our genome. The majority of Alu amplification occurred early in primate evolution, and the current rate of Alu retroposition is at least 100-fold slower than the peak of amplification that occurred 30–50 million years ago. Alu elements are therefore a rich source of inter- and intra-species primate genomic variation. Results A total of 153 Alu elements from the Ye subfamily were extracted from the draft sequence of the human genome. Analysis of these elements resulted in the discovery of two new Alu subfamilies, Ye4 and Ye6, complementing the previously described Ye5 subfamily. DNA sequence analysis of each of the Alu Ye subfamilies yielded average age estimates of ~14, ~13 and ~9.5 million years for the Alu Ye4, Ye5 and Ye6 subfamilies, respectively. In addition, 120 Alu Ye4, Ye5 and Ye6 loci were screened using polymerase chain reaction (PCR) assays to determine their phylogenetic origin and levels of human genomic diversity. Conclusion The Alu Ye lineage appears to have started amplifying relatively early in primate evolution and continued propagating at a low level, as many of its members are found in a variety of hominoid (human, greater ape and lesser ape) genomes. Detailed sequence analysis of several Alu pre-integration sites indicated that multiple types of events had occurred, including gene conversions, near-parallel independent insertions of different Alu elements and Alu-mediated genomic deletions. A potential hotspot for Alu insertion in the Fer1L3 gene on chromosome 10 was also identified.
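
    Subfamily ages of the kind quoted above are commonly obtained from a simple molecular-clock argument: age ≈ mean divergence from the subfamily consensus divided by the neutral substitution rate. The 0.15%-per-million-years rate and the 1.4% divergence figure below are illustrative assumptions, not values taken from the study.

    ```python
    def subfamily_age_myr(mean_divergence, rate_per_myr=0.0015):
        """Back-of-the-envelope age estimate (millions of years) from the
        mean divergence of subfamily members to their consensus sequence,
        assuming a constant neutral substitution rate."""
        return mean_divergence / rate_per_myr

    age = subfamily_age_myr(0.014)   # 1.4% average divergence (assumed)
    print(f"estimated subfamily age ≈ {age:.1f} Myr")
    ```

    With these assumed numbers the estimate lands near 9 Myr, the same order as the Ye6 figure in the abstract; published analyses additionally correct for CpG hypermutability, which this sketch ignores.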

  10. Chapter 12: Human microbiome analysis.

    Directory of Open Access Journals (Sweden)

    Xochitl C Morgan

    Full Text Available Humans are essentially sterile during gestation, but during and after birth, every body surface, including the skin, mouth, and gut, becomes host to an enormous variety of microbes, bacterial, archaeal, fungal, and viral. Under normal circumstances, these microbes help us to digest our food and to maintain our immune systems, but dysfunction of the human microbiota has been linked to conditions ranging from inflammatory bowel disease to antibiotic-resistant infections. Modern high-throughput sequencing and bioinformatic tools provide a powerful means of understanding the contribution of the human microbiome to health and its potential as a target for therapeutic interventions. This chapter will first discuss the historical origins of microbiome studies and methods for determining the ecological diversity of a microbial community. Next, it will introduce shotgun sequencing technologies such as metagenomics and metatranscriptomics, the computational challenges and methods associated with these data, and how they enable microbiome analysis. Finally, it will conclude with examples of the functional genomics of the human microbiome and its influences upon health and disease.

  11. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2017-11-01

    Full Text Available Calabria (southern Italy) is a flood-prone region, owing both to its rough orography and to the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach for the comparative analysis of two events affecting the same area 15 years apart, collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite hit by a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed one man in a flood. The comparative analysis highlights that, although the triggering daily rain was more exceptional in the 2015 event, the 2000 event caused greater damage to both infrastructure and property, aggravated above all by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in situations made hazardous by landslides or floods, contributed to reducing the number of victims.

  12. Nurses' critical event risk assessments: a judgement analysis

    NARCIS (Netherlands)

    Thompson, Carl; Bucknall, Tracey; Estabrookes, Carole A.; Hutchinson, Alison; Fraser, Kim; de Vos, Rien; Binnecade, Jan; Barrat, Gez; Saunders, Jane

    2009-01-01

    To explore and explain nurses' use of readily available clinical information when deciding whether a patient is at risk of a critical event. Half of inpatients who suffer a cardiac arrest have documented but unacted upon clinical signs of deterioration in the 24 hours prior to the event. Nurses

  13. Novel adverse events of vortioxetine: A disproportionality analysis in USFDA adverse event reporting system database.

    Science.gov (United States)

    Subeesh, Viswam; Singh, Hemendra; Maheswari, Eswaran; Beulah, Elsa

    2017-12-01

    Signal detection is one of the most advanced and rapidly emerging fields in pharmacovigilance. It is a modern method of detecting a new reaction (desired or undesired) to a drug, and it facilitates early detection of adverse drug reactions (ADRs), enabling health professionals to identify adverse events that may not have been identified in pre-marketing clinical trials. Vortioxetine, the first mixed serotonergic antidepressant, was initially approved by the US Food and Drug Administration (USFDA) on September 30, 2013 for the treatment of adults with Major Depressive Disorder (MDD). This study aimed to identify the signal strength for vortioxetine-associated ADRs using data mining techniques in the USFDA Adverse Event Reporting System (AERS) database. Three of the most commonly used data mining algorithms, the Reporting Odds Ratio (ROR), the Proportional Reporting Ratio (PRR) and the Information Component (IC), were selected for the study and applied retrospectively to the USFDA AERS database from 2015Q1 to 2016Q3. Values of ROR − 1.96SE > 1, PRR ≥ 2 and IC − 2SD > 0 were considered a positive signal. A study population of 6,122,000 reports from all over the world was analysed, among which 3481 reactions were associated with vortioxetine, comprising 632 unique events including 27 clinically relevant reactions. ROR, PRR and IC showed positive signals for weight loss, agitation, anger, ketoacidosis, insomnia and abnormal dreams. The present study suggests that vortioxetine may result in these adverse events. Further pharmacoepidemiologic studies are necessary to confirm this conclusion and to improve the precision of the prevalence and/or the risk factors of these ADRs. Copyright © 2017 Elsevier B.V. All rights reserved.
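
    The three disproportionality measures named above are all computed from a standard 2x2 contingency table of spontaneous reports. The counts below are invented, and the IC here is a simplified point estimate (the study's IC − 2SD criterion additionally requires the credibility interval, which this sketch omits).

    ```python
    import math

    # a: target drug & target event   b: target drug & other events
    # c: other drugs & target event   d: other drugs & other events
    a, b, c, d = 40, 960, 200, 98800   # illustrative counts

    ror = (a / b) / (c / d)
    se_ln_ror = math.sqrt(1/a + 1/b + 1/c + 1/d)
    ror_lower = math.exp(math.log(ror) - 1.96 * se_ln_ror)   # 95% CI lower bound

    prr = (a / (a + b)) / (c / (c + d))

    n = a + b + c + d
    expected = (a + b) * (a + c) / n        # expected count under independence
    ic = math.log2(a / expected)            # simplified information component

    print(f"ROR={ror:.2f} (lower {ror_lower:.2f}), PRR={prr:.2f}, IC={ic:.2f}")
    print("signal" if ror_lower > 1 and prr >= 2 and ic > 0 else "no signal")
    ```

    Each measure contrasts how often the event is reported with the drug against how often it is reported with everything else; a signal is flagged only when the lower confidence bounds clear their thresholds.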

  14. Psychiatric adverse events during treatment with brodalumab: Analysis of psoriasis clinical trials.

    Science.gov (United States)

    Lebwohl, Mark G; Papp, Kim A; Marangell, Lauren B; Koo, John; Blauvelt, Andrew; Gooderham, Melinda; Wu, Jashin J; Rastogi, Shipra; Harris, Susan; Pillai, Radhakrishnan; Israel, Robert J

    2018-01-01

    Individuals with psoriasis are at increased risk for psychiatric comorbidities, including suicidal ideation and behavior (SIB). To distinguish between the underlying risk and potential for treatment-induced psychiatric adverse events in patients with psoriasis being treated with brodalumab, a fully human anti-interleukin 17 receptor A monoclonal antibody. Data were evaluated from a placebo-controlled, phase 2 clinical trial; the open-label, long-term extension of the phase 2 clinical trial; and three phase 3, randomized, double-blind, controlled clinical trials (AMAGINE-1, AMAGINE-2, and AMAGINE-3) and their open-label, long-term extensions of patients with moderate-to-severe psoriasis. The analysis included 4464 patients with 9161.8 patient-years of brodalumab exposure. The follow-up time-adjusted incidence rates of SIB events were comparable between the brodalumab and ustekinumab groups throughout the 52-week controlled phases (0.20 vs 0.60 per 100 patient-years). In the brodalumab group, 4 completed suicides were reported, 1 of which was later adjudicated as indeterminate; all patients had underlying psychiatric disorders or stressors. There was no comparator arm past week 52. Controlled study periods were not powered to detect differences in rare events such as suicide. Comparison with controls and the timing of events do not indicate a causal relationship between SIB and brodalumab treatment. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  15. Simulating the physiology of athletes during endurance sports events: Modelling human energy conversion and metabolism

    NARCIS (Netherlands)

    Beek, J.H.G.M. van; Supandi, F.; Gavai, A.K.; Graaf, A.A. de; Binsl, T.W.; Hettling, H.

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work, the rest is converted to heat. We

  16. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    NARCIS (Netherlands)

    J.W. van Strien (Jan); L.A. Isbell (Lynne A.)

    2017-01-01

    textabstractStudies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to

  17. Eliciting Children's Recall of Events: How Do Computers Compare with Humans?

    Science.gov (United States)

    Powell, Martine B.; Wilson, J. Clare; Thomson, Donald M.

    2002-01-01

    Describes a study that investigated the usefulness of an interactive computer program in eliciting children's reports about an event. Compared results of interviews by computer with interviews with humans with children aged five through eight that showed little benefit in computers over face-to-face interviews. (Author/LRW)

  18. Effects of stimulus repetitions on the event-related potential of humans and rats

    NARCIS (Netherlands)

    Sambeth, A.; Maes, J.H.R.; Quian Quiroga, R.; Coenen, A.M.L.

    2004-01-01

    The present study compared the effects of repeated stimulus presentations on the event-related potential (ERP) of humans and rats. Both species were presented with a total of 100 auditory stimuli, divided into four blocks of 25 stimuli. By means of wavelet denoising, single-trial ERPs were

  19. Genetics of the human electroencephalogram (EEG) and event-related brain potentials (ERPs): a review

    NARCIS (Netherlands)

    van Beijsterveldt, C.E.M.; Boomsma, D.I.

    1994-01-01

    Twin and family studies of normal variation in the human electroencephalogram (EEG) and event related potentials (ERPs) are reviewed. Most of these studies are characterized by small sample sizes. However, by summarizing these studies in one paper, we may be able to gain some insight into the

  20. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  1. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and/or classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
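
    HEART's quantification step scales a generic task's nominal error probability by each applicable error-producing condition (EPC), weighted by the assessor's proportion of affect (APOA). The formula below is the standard HEART combination rule; the nominal HEP and EPC multipliers are illustrative values, not entries from the HEART tables or from this study.

    ```python
    def heart_hep(nominal_hep, epcs):
        """Assessed HEP = nominal HEP x product of ((EPC - 1) * APOA + 1).
        epcs: list of (max_multiplier, assessed_proportion_of_affect)."""
        hep = nominal_hep
        for multiplier, apoa in epcs:
            hep *= (multiplier - 1.0) * apoa + 1.0
        return min(hep, 1.0)   # a probability cannot exceed 1

    # e.g. a routine task (assumed nominal HEP 0.003) performed under time
    # pressure and distraction, with assumed multipliers and proportions:
    hep = heart_hep(0.003, [(11, 0.4), (5, 0.25)])
    print(f"assessed HEP = {hep:.4f}")
    ```

    Each EPC inflates the nominal probability only in proportion to how strongly the assessor judges it to apply, which is what makes the method sensitive to the contributing factors HFACS helps identify.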

  2. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and/or classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  4. Incidence of patient safety events and process-related human failures during intra-hospital transportation of patients: retrospective exploration from the institutional incident reporting system.

    Science.gov (United States)

    Yang, Shu-Hui; Jerng, Jih-Shuin; Chen, Li-Chin; Li, Yu-Tsu; Huang, Hsiao-Fang; Wu, Chao-Ling; Chan, Jing-Yuan; Huang, Szu-Fen; Liang, Huey-Wen; Sun, Jui-Sheng

    2017-11-03

    Intra-hospital transportation (IHT) might compromise patient safety because of changing care settings and greater demands on human performance. Reports on the incidence of IHT-related patient safety events and human failures remain limited. We performed a retrospective analysis of IHT-related events, human failures and unsafe acts, drawing on the hospital-wide IHT process and the database of the institutional incident reporting system in a medical centre in Taiwan. All eligible IHT-related patient safety events between January 2010 and December 2015 were included. Outcome measures were the incidence rate of IHT-related patient safety events, human failure modes, and types of unsafe acts. There were 206 patient safety events in 2,009,013 IHT sessions (102.5 per 1,000,000 sessions). Most events (n=148, 71.8%) did not involve patient harm, and process events (n=146, 70.9%) were most common. Events at the location of arrival (n=101, 49.0%) were most frequent; this location accounted for 61.0% and 44.2% of events with patient harm and those without harm, respectively. The process step most often involved was the preparation of the transportation team (n=91, 48.9%). Contributing unsafe acts included perceptual errors (n=14, 7.5%), decision errors (n=56, 30.1%), skill-based errors (n=48, 25.8%), and non-compliance (n=68, 36.6%). Multivariate analysis showed that human failure in the arrival and hand-off sub-process (OR 4.84) was associated with patient harm, suggesting the need to improve the hand-off process at the location of arrival and to prevent errors other than omissions. Long-term monitoring of IHT-related events is also warranted. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
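
    The headline incidence rate is a simple proportion scaled to a per-million denominator, which can be checked directly from the reported counts:

    ```python
    # 206 events observed across 2,009,013 transport sessions,
    # expressed per 1,000,000 sessions.
    events, sessions = 206, 2_009_013
    rate_per_million = events / sessions * 1_000_000
    print(f"{rate_per_million:.1f} per 1,000,000 sessions")   # matches the reported 102.5
    ```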

  5. Identification of human-induced initiating events in the low power and shutdown operation using the commission error search and assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Chan; Kim, Jong Hyun [KEPCO International Nuclear Graduate School (KINGS), Ulsan (Korea, Republic of)

    2015-03-15

    Human-induced initiating events, also called Category B actions in human reliability analysis, are operator actions that may lead directly to initiating events. Conventional probabilistic safety analyses typically assume that the frequency of initiating events also includes the probability of human-induced initiating events. However, some regulatory documents require Category B actions to be specifically analyzed and quantified in probabilistic safety analysis. Explicit modeling of Category B actions could also yield important insights into human performance in terms of safety. However, there is no standard procedure for identifying Category B actions. This paper describes a systematic procedure to identify Category B actions for low power and shutdown conditions. The procedure comprises several steps to determine operator actions that may lead to initiating events in the low power and shutdown stages: the selection of initiating events, the selection of systems or components, the screening of unlikely operator actions, and the quantification of initiating events. The procedure also provides detailed instructions for each step, such as the operator actions, the information required, the screening rules, and the outputs. Finally, the applicability of the suggested approach is investigated through application to an example plant.

  6. Fifty Years of THERP and Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of NUREG/CR-1278 in 1983. THERP is now 50 years old and remains the best known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and on personal communication and interviews with Swain, and outlines its significance. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, its longevity is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain’s pioneering work.
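One concrete THERP contribution mentioned above, the dependence model, is simple enough to show directly. NUREG/CR-1278 gives closed-form equations for the conditional error probability of a task given failure of the preceding task, one for each of five dependence levels:

```python
# THERP's standard dependence equations (NUREG/CR-1278): the conditional
# human error probability of a task, given failure of the preceding task,
# where p is the task's basic (unconditional) HEP.
def conditional_hep(p, level):
    return {
        "zero":     p,                    # tasks independent
        "low":      (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high":     (1 + p) / 2,
        "complete": 1.0,                  # failure propagates with certainty
    }[level]

for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, round(conditional_hep(0.01, level), 4))
# zero 0.01, low 0.0595, moderate 0.1514, high 0.505, complete 1.0
```

Even a small basic HEP is pulled strongly upward once dependence is non-zero, which is why the dependence assessment dominates many THERP quantifications.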

  7. Time to tenure in Spanish universities: an event history analysis.

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards to loyalty, in addition to some measurements of performance and the quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.
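The workhorse of event history analyses of this kind is nonparametric survival estimation. A minimal Kaplan-Meier sketch on toy time-to-tenure data (the numbers are invented, not the study's):

```python
# Minimal Kaplan-Meier estimator for time-to-tenure data (toy numbers).
# Each record is (years from PhD to tenure, event observed?); censored
# records (still untenured at last observation) have event=False.
def kaplan_meier(records):
    surv, s = [], 1.0
    for t in sorted({u for u, e in records if e}):
        n = sum(1 for u, _ in records if u >= t)        # at risk at time t
        d = sum(1 for u, e in records if u == t and e)  # tenure events at t
        s *= 1 - d / n
        surv.append((t, s))
    return surv

data = [(4, True), (5, True), (5, False), (6, True), (8, False), (9, True)]
for t, s in kaplan_meier(data):
    print(t, round(s, 3))
# 4 0.833, 5 0.667, 6 0.444, 9 0.0
```

Covariate effects such as mobility or productivity would then enter through a regression model (e.g. Cox proportional hazards) rather than this plain estimator.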

  8. Time to tenure in Spanish universities: an event history analysis.

    Directory of Open Access Journals (Sweden)

    Luis Sanz-Menéndez

    Full Text Available Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards to loyalty, in addition to some measurements of performance and the quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.

  9. Time to Tenure in Spanish Universities: An Event History Analysis

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards to loyalty, in addition to some measurements of performance and the quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  10. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants, based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run for more than 1 hour). There is an adverse trend in LOOP durations; the previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer, and switchyard-centered LOOPs predominantly in winter and spring, while plant-centered and weather-related LOOPs show no statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.
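The basic quantity behind such a report, a per-category LOOP frequency, is simply events divided by exposure. A sketch with invented event counts and an invented reactor-critical-year exposure (the report's actual data are not reproduced here):

```python
# Hypothetical sketch of the per-category LOOP frequency calculation:
# event counts per category divided by the observation period's exposure
# in reactor-critical-years. All numbers below are illustrative only.
from collections import Counter

events = ["grid", "grid", "weather", "switchyard", "plant", "grid", "weather"]
reactor_critical_years = 1800.0  # invented exposure for 1997-2015

counts = Counter(events)
for category in ("plant", "switchyard", "grid", "weather"):
    freq = counts[category] / reactor_critical_years
    print(f"{category}: {counts[category]} events, {freq:.2e} per rcy")
```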

  11. Individual Differences in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    While human reliability analysis (HRA) methods include uncertainty in quantification, the nominal model of human error in HRA typically assumes that operator performance does not vary significantly when operators are given the same initiating event, indicators, procedures, and training, and that any differences in operator performance are simply aleatory (i.e., random). While this assumption generally holds true for routine actions, variability in operator response has been observed in multiple studies, especially in complex situations that go beyond training and procedures. As such, complexity can lead to differences in operator performance (e.g., in operator understanding and decision-making). Furthermore, psychological research has shown that there are a number of known antecedents (i.e., attributable causes) that consistently contribute to observable and systematically measurable (i.e., not random) differences in behavior. This paper reviews examples of individual differences taken from operational experience and the psychological literature. The impact of these differences on human behavior and their implications for HRA are then discussed. We propose that individual differences should not be treated as aleatory, but rather as epistemic. Ultimately, by understanding the sources of individual differences, it is possible to remove some epistemic uncertainty from analyses.

  12. Human Reliability Analysis for Small Modular Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs involves tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise to complete a quality HRA. Although no current HRA method is optimized for SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provide preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, it is ultimately desirable to formally incorporate SMR considerations into the methods, which may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs; such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to deliver a valid and accurate analysis of plant personnel performance in SMRs.

  13. Adverse Drug Event Ontology: Gap Analysis for Clinical Surveillance Application

    Science.gov (United States)

    Adam, Terrence J.; Wang, Jin

    2015-01-01

    Adverse drug event (ADE) identification and management constitute an important patient safety problem, given the potential for event prevention. Previous efforts have established structured data methods for population-level identification of adverse drug events, but important gaps in coverage remain; these identification gaps contribute to suboptimal and inefficient event identification. To address the ADE identification problem, a gap assessment was completed and a comprehensive ontology was proposed, built on a Minimal Clinical Data Set framework and incorporating existing identification approaches, the clinical literature, and a large set of inpatient clinical data. The new ontology was developed and tested using the National Inpatient Sample database, with validation results demonstrating expanded ADE identification capacity. In addition, the newly proposed ontology elements are associated with significant inpatient mortality, above-median inpatient costs, and a longer length of stay when compared to existing ADE ontology elements and to patients without ADE exposure. PMID:26306223

  14. Meta-analysis framework for exact inferences with application to the analysis of rare events.

    Science.gov (United States)

    Yang, Guang; Liu, Dungang; Wang, Junyuan; Xie, Min-Ge

    2016-12-01

    The usefulness of meta-analysis has been recognized in the evaluation of drug safety, as a single trial usually yields few adverse events and offers limited information. For rare events, conventional meta-analysis methods may yield an invalid inference, as they often rely on large sample theories and require empirical corrections for zero events. These problems motivate research in developing exact methods, including Tian et al.'s method of combining confidence intervals (2009, Biostatistics, 10, 275-281) and Liu et al.'s method of combining p-value functions (2014, JASA, 109, 1450-1465). This article shows that these two exact methods can be unified under the framework of combining confidence distributions (CDs). Furthermore, we show that the CD method generalizes Tian et al.'s method in several aspects. Given that the CD framework also subsumes the Mantel-Haenszel and Peto methods, we conclude that the CD method offers a general framework for meta-analysis of rare events. We illustrate the CD framework using two real data sets collected for the safety analysis of diabetes drugs. © 2016, The International Biometric Society.
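A common concrete instance of CD combining (following the general Xie-Singh recipe; this is a hedged sketch, not necessarily the exact estimator of the cited papers) maps each study's confidence distribution through the normal quantile, sums with weights, and renormalises:

```python
# Generic confidence-distribution combining sketch:
#   Hc(theta) = Phi( sum_i w_i * Phi^{-1}(H_i(theta)) / sqrt(sum_i w_i^2) )
# where H_i(theta) is study i's CD evaluated at a common parameter value.
from statistics import NormalDist

_PHI = NormalDist()

def combine_cds(cd_values, weights=None):
    """cd_values: each study's H_i(theta) at the same theta, in (0, 1)."""
    if weights is None:
        weights = [1.0] * len(cd_values)
    num = sum(w * _PHI.inv_cdf(h) for w, h in zip(weights, cd_values))
    den = sum(w * w for w in weights) ** 0.5
    return _PHI.cdf(num / den)

# Agreeing studies reinforce one another: H_i = 0.8 alone is mild evidence
# that theta exceeds the null value; combined over four studies it sharpens.
print(round(combine_cds([0.8, 0.8, 0.8, 0.8]), 3))  # 0.954
```

Because no large-sample approximation is applied to each study individually, this style of combination is what lets exact per-study inferences for rare events be pooled without zero-event corrections.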

  15. The 'dirty downside' of global sporting events: focus on human trafficking for sexual exploitation.

    Science.gov (United States)

    Finkel, R; Finkel, M L

    2015-01-01

    Human trafficking is a complex human rights and public health issue. The extent of human trafficking for sexual exploitation at large global sporting events has proven elusive given the clandestine nature of the industry. This piece examines the issue from a public health perspective through a literature review of the 'most comprehensive' studies published on the topic. A PubMed search was done using the MeSH terms 'human trafficking', 'sex trafficking' and 'human rights abuses', with the subheadings 'statistics and numerical data', 'legislation and jurisprudence', 'prevention and control', and 'therapy'. Only papers published in English were reviewed. The search showed that very few well-designed empirical studies have been conducted on the topic, and only one pertinent systematic review was identified. Findings show a high prevalence of physical violence among trafficked women compared to non-trafficked women. Sexually transmitted infections and HIV/AIDS are prevalent, and preventive care is virtually non-existent. Quantifying human trafficking for sexual exploitation at large global sporting events remains elusive, but this is not to say that human trafficking for sex, or forced sexual exploitation, does not occur; it almost certainly exists, and to what extent is the big question. It is a hidden problem on a global scale, in plain view, with tremendous public health implications. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  16. Human Modeling for Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the plans for human modeling for future spacecraft designs.

  17. Human Modeling For Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Tran, Donald; Stambolian, Damon; Henderson, Gena; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft and launch vehicles. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the different types of human modeling used currently and in the past at Kennedy Space Center (KSC), and to explain the plans for human modeling for future spacecraft designs.

  18. Exploring the Human Ecology of the Younger Dryas Extraterrestrial Impact Event

    Science.gov (United States)

    Kennett, D. J.; Erlandson, J. M.; Braje, T. J.; Culleton, B. J.

    2007-05-01

    Several lines of evidence now exist for a major extraterrestrial impact event in North America at 12.9 ka (the YDB). This impact partially destabilized the Laurentide and Cordilleran ice sheets, triggered abrupt Younger Dryas cooling and extensive wildfires, and contributed to megafaunal extinction. This event also occurred soon after the well-established colonization of the Americas by anatomically modern humans. Confirmation of this event would represent the first near-time extraterrestrial impact with significant effects on human populations. These effects likely included widespread, abrupt human mortality, population displacement, migration into less affected or newly established habitats, loss of cultural traditions, and resource diversification in the face of the massive megafaunal extinction and population reductions in surviving animal populations. Ultimately, these transformations established the context for the special character of plant and animal domestication and the emergence of agricultural economies in North America. We explore the Late Pleistocene archaeological record in North America within the context of the documented major biotic changes associated with the YDB and of the massive ecological effects hypothesized for this event.

  19. Statistical language analysis for automatic exfiltration event detection.

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity, so the ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. The limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk to support decision making when responding to suspicious activity.

  20. Human motion analysis and characterization

    Science.gov (United States)

    Cathcart, J. Michael; Prussing, Keith; Kocher, Brian

    2011-06-01

    Georgia Tech has investigated methods for the detection and tracking of personnel in a variety of acquisition environments. This research effort focused on a detailed phenomenological analysis of human physiology and signatures, with the subsequent identification and characterization of potential observables; both aspects are needed to support the development of personnel detection and tracking algorithms. As a fundamental part of this effort, Georgia Tech collected motion capture data on an individual for a variety of walking speeds, carrying loads, and load distributions. These data formed the basis for deriving fundamental properties of the individual's motion and motion-based observables, and for characterizing changes in these properties arising from load variations. Analyses were conducted to characterize the motion properties of various body components such as leg swing, arm swing, head motion, and full body motion. This paper describes the data acquisition process, the extraction of motion characteristics, and the analysis of these data. Video sequences illustrating the motion data and analysis results are also presented.

  1. 'It pushed me back into the human race': evaluative findings from a community Christmas event.

    Science.gov (United States)

    Collins, Tracy; Kenney, Christine; Hesk, Gabrielle

    2017-09-01

    Many older people in Britain spend Christmas day alone. The Christmas period may be especially difficult for older people who are socially isolated, living with dementia or living with physical impairments, and who may feel particularly marginalised at this time of year. This paper draws on evaluative research findings from a community Christmas event held in December 2014 at the University of Salford for older people, and their carers, who would otherwise be on their own on Christmas day. A multi-method approach was employed: seven guests took part in semi-structured interviews to explore their experiences and perceptions of the event, and seven staff and student volunteers participated in a group interview to explore and discuss their participation in the event. Data collection took place during April and May 2015, and interview transcripts were subjected to thematic analysis. Three overarching themes were identified from the interviews: 'reasons for participants attending the event', 'a different Christmas day: the impact on guests and volunteers', and 'learning, planning and moving forwards'. The findings illustrate that a range of people participated in the Christmas day event for a variety of reasons, and that the event itself had a positive impact, including the shared experience of social belonging, for all involved. There are also tangible longer-term benefits as a result of the event, such as ongoing contact between participants and the development of supportive networks in the local community. © 2016 The Authors. Health and Social Care in the Community Published by John Wiley & Sons Ltd.

  2. Adaptation to climate change: Using nighttime lights satellite data to explore human response to flood events

    Science.gov (United States)

    Mård, Johanna; Di Baldassarre, Giuliano

    2017-04-01

    To better understand the impact of climate change, we need to uncover how (and to what extent) societies respond and adapt to it. Yet the dynamics resulting from two-way feedbacks between nature and society remain largely unknown. Here we present an interdisciplinary study aiming to uncover one of the least quantified aspects of human-nature interactions, the spatial-temporal distribution of demographic changes following the occurrence of extreme events. To this end, we use nighttime light satellite data in four contrasting case studies in both low- and high-income countries (Lower Limpopo River in Mozambique, Mekong River in Vietnam and Cambodia, Brisbane River in Australia and Mississippi River at St. Louis in USA), and explore the interplay between flooding events and changes in population distribution in the period 1992-2013. Our study shows the challenges and opportunities of nighttime lights in unraveling the way humans adapt to climate change. Specific results show that population distribution of societies that strongly rely on structural measures ("fighting floods" policies) is not significantly affected by the occurrence of flood events. Conversely, learning dynamics emerge in societies that mainly rely on non-structural measures ("living with floods" policies) in terms of relative population in floodplain areas, i.e. reduced human proximity to rivers. Lastly, we propose the development of a novel approach to exploit the growing availability of worldwide information, such as nighttime lights satellite data, to uncover human adaptation to climate change across scales and along gradients of social and natural conditions.

  3. Synoptic analysis of heavy rainfall events over the Mumbai Region

    Science.gov (United States)

    Roth, G.; Lomazzi, M.; Entekhabi, D.; Pinto, J.; Rudari, R.

    2009-12-01

    Over the Indian Subcontinent, almost 75% of the annual precipitation falls during the South Asia Monsoon (SAM) season, conventionally defined as June 1 to September 30. While precipitation patterns show very strong spatial heterogeneity, the maximum annual values, mainly associated with orographic forcing, occur on the western coast of the Indian Peninsula. This is a relatively narrow strip (generally less than 100 km wide) bounded to the west by the North Indian Ocean and to the east by the Western Ghats mountain range. The interaction between SAM rainfall events and relatively short, steep rivers makes this area particularly prone to flash floods, which cause great damage in terms of both lives and economic losses. This work aims at identifying the large-scale SAM meteorological patterns associated with the triggering of extreme rainfall events affecting the Mumbai region (approximately 18-20°N, 72.5-73.5°E), an area of great economic importance for India and a very highly populated one (around 20 million people). To this aim, seventy years of daily rainfall data are analyzed and compared to a database of damage-causing precipitation events. Event days are selected with a twin-threshold function based on daily rainfall depth and soil moisture content. To detect typical large-scale features, event days are compared to non-event days by analyzing MSLP, SST, and vertical wind profiles. Further, the storm-related processes are analyzed through moisture sources (via back-tracing) and moisture flux convergence fields. First results show that event days are typically characterized by remote moisture sources (the southwestern Arabian Sea) and strengthened lower-level westerly winds, which cause enhanced moisture flux convergence and increased precipitable water.
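The twin-threshold selection of event days can be sketched directly. The thresholds and daily records below are illustrative, not the study's values (though the 944 mm figure recalls the actual 26 July 2005 Mumbai rainfall):

```python
# Sketch of twin-threshold event-day selection: a day qualifies when
# daily rainfall exceeds one threshold AND an antecedent soil-moisture
# proxy exceeds another. Thresholds and records are illustrative.
RAIN_MM = 120.0    # daily rainfall threshold (assumed, not the study's)
SOIL_FRAC = 0.75   # soil-moisture saturation threshold (assumed)

days = [
    # (date, daily rainfall in mm, soil-moisture fraction)
    ("2005-07-26", 944.0, 0.90),
    ("2005-07-27", 150.0, 0.95),
    ("2005-08-02",  60.0, 0.80),  # wet soil but rainfall below threshold
    ("2005-09-10", 130.0, 0.40),  # heavy rain on dry soil
]

event_days = [d for d, rain, soil in days
              if rain > RAIN_MM and soil > SOIL_FRAC]
print(event_days)  # ['2005-07-26', '2005-07-27']
```

Requiring both conditions filters out heavy rain falling on dry catchments, which is less likely to produce the flash floods the study targets.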

  4. Identification of recurrent regulated alternative splicing events across human solid tumors

    Science.gov (United States)

    Danan-Gotthold, Miri; Golan-Gerstl, Regina; Eisenberg, Eli; Meir, Keren; Karni, Rotem; Levanon, Erez Y.

    2015-01-01

    Cancer is a complex disease that involves aberrant gene expression regulation. Discriminating the modified expression patterns driving tumor biology from the many that have no or little contribution is important for understanding cancer molecular basis. Recurrent deregulation patterns observed in multiple cancer types are enriched for such driver events. Here, we studied splicing alterations in hundreds of matched tumor and normal RNA-seq samples of eight solid cancer types. We found hundreds of cassette exons for which splicing was altered in multiple cancer types and identified a set of highly frequent altered splicing events. Specific splicing regulators, including RBFOX2, MBNL1/2 and QKI, appear to account for many splicing alteration events in multiple cancer types. Together, our results provide a first global analysis of regulated splicing alterations in cancer and identify common events with a potential causative role in solid tumor development. PMID:25908786
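Cassette-exon alterations of the kind surveyed above are typically quantified as percent spliced-in (PSI). A simplified count-based sketch (real pipelines additionally length-normalise the junction read counts; the counts here are invented):

```python
# Percent spliced-in (PSI) for a cassette exon, in the simplified
# count-based form PSI = inclusion / (inclusion + exclusion).
def psi(inclusion_reads, exclusion_reads):
    return inclusion_reads / (inclusion_reads + exclusion_reads)

# Delta-PSI between matched tumor and normal samples flags altered events;
# recurrence of a large delta across cancer types marks candidate drivers.
normal = psi(inclusion_reads=80, exclusion_reads=20)  # exon mostly included
tumor = psi(inclusion_reads=30, exclusion_reads=70)   # exon mostly skipped
print(round(tumor - normal, 2))  # -0.5
```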

  5. Analysis of human collagen sequences.

    Science.gov (United States)

    Nassa, Manisha; Anand, Pracheta; Jain, Aditi; Chhabra, Aastha; Jaiswal, Astha; Malhotra, Umang; Rani, Vibha

    2012-01-01

    The extracellular matrix is fast emerging as an important component mediating cell-cell interactions, along with its established role as a scaffold for cell support. Collagen, the principal component of the extracellular matrix, has been implicated in a number of pathological conditions. However, collagens are complex protein structures belonging to a large family of 28 members in humans, and in-depth information about their structural features is lacking. Annotating and appreciating the functions of these proteins is possible with the help of the numerous biocomputational tools currently available. This study reports a comparative analysis and characterization of the alpha-1 chain of human collagen sequences. Physico-chemical, secondary structural, functional and phylogenetic classification was carried out, on the basis of which collagens 12, 14 and 20, which belong to the FACIT collagen family, have been identified as potential players in disease conditions, owing to certain atypical properties such as a very high aliphatic index, low percentages of glycine and proline residues, and their proximity in evolutionary history. These collagen molecules may be important candidates for further investigation of their role in skeletal disorders.
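Two of the properties used above to flag the atypical FACIT collagens are straightforward to compute from a sequence: the aliphatic index (Ikai's formula, as implemented in common protein-parameter tools) and the glycine/proline content. The demo fragment below is invented, not a real collagen sequence:

```python
# Aliphatic index (Ikai, 1980): AI = X(Ala) + 2.9*X(Val) + 3.9*(X(Ile)+X(Leu)),
# where X(aa) is the mole percent of each residue in the sequence.
def aliphatic_index(seq):
    seq = seq.upper()
    x = lambda aa: 100.0 * seq.count(aa) / len(seq)  # mole percent
    return x("A") + 2.9 * x("V") + 3.9 * (x("I") + x("L"))

def residue_percent(seq, aa):
    return 100.0 * seq.upper().count(aa) / len(seq)

demo = "GGGPPPAVIL"  # invented toy fragment
print(round(aliphatic_index(demo), 1))                         # 117.0
print(residue_percent(demo, "G"), residue_percent(demo, "P"))  # 30.0 30.0
```

A true fibrillar collagen chain is roughly one-third glycine with abundant proline, so sequences scoring high on aliphatic index and low on Gly/Pro stand out as atypical.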

  6. Analysis of human vergence dynamics.

    Science.gov (United States)

    Tyler, Christopher W; Elsaid, Anas M; Likova, Lora T; Gill, Navdeep; Nicholas, Spero C

    2012-10-25

    Disparity vergence is commonly viewed as being controlled by at least two mechanisms, an open-loop vergence-specific burst mechanism analogous to the ballistic drive of saccades, and a closed-loop feedback mechanism controlled by the disparity error. We show that human vergence dynamics for disparity jumps of a large textured field have a typical time course consistent with predominant control by the open-loop vergence-specific burst mechanism, although various subgroups of the population show radically different vergence behaviors. Some individuals show markedly slow divergence responses, others slow convergence responses, others slow responses in both vergence directions, implying that the two vergence directions have separate control mechanisms. The faster time courses usually had time-symmetric velocity waveforms implying open-loop burst control, while the slow response waveforms were usually time-asymmetric implying closed-loop feedback control. A further type of behavior seen in a distinct subpopulation was a compound anomalous divergence response consisting of an initial convergence movement followed by a large corrective divergence movement with time courses implying closed-loop feedback control. This analysis of the variety of human vergence responses thus contributes substantially to the understanding of the oculomotor control mechanisms underlying the generation of vergence movements [corrected].

  7. Utilization of Concept Mapping Program at the Root Cause Analysis of Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bae-Joo; Kim, Gwang-Bong [Nuclear Power Education Institute, Ulsan (Korea, Republic of)

    2008-05-15

    KHNP introduced the Corrective Action Program (CAP) as part of its nuclear operation innovation. The key component of the CAP is the Root Cause Analysis (RCA) program. RCA is a technique for extracting the root causes of an event that management does not want to recur, and for taking actions to prevent its recurrence. KHNP establishes a temporary team for the RCA of each such event and must assign human resources to it. KNPEI introduced an RCA training program from the Comanche Peak Steam Electric Station in 2005 and began training engineers in 2006, but the RCA program does not operate well at the stations, for two reasons. KNPEI performed a research project from March 2006 to September 2007 to capture experiential knowledge from senior staff and transfer it to junior staff. As part of this research, KNPEI introduced a Concept Mapping Program and set up a Concept Mapping server to capture experiential knowledge. Originally, the Concept Mapping Program was intended for teaching conceptual knowledge remotely, but it has characteristics that make it usable in root cause analysis. The purpose of this report is to suggest how the Concept Mapping Program can be utilized in root cause analysis.

  8. Performance analysis of immigration operation by discrete event ...

    African Journals Online (AJOL)

    Discrete event modelling and simulation were used to analyse the performance of immigration operation in Botswana. The relationships between length of queues of immigrants, queuing time, service time and engagement of duty officer were investigated. Data collected by direct observation and clock-timing of processing ...
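The relationships the study investigates (queue length, waiting time, service time, officer utilisation) can be sketched analytically with the standard steady-state M/M/1 queueing formulas; the arrival and service rates below are illustrative assumptions, not data from the paper.

```python
# Steady-state M/M/1 metrics (illustrative rates, not the paper's data)
def mm1_metrics(lam, mu):
    """lam: arrival rate, mu: service rate (per minute); requires lam < mu."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu                 # duty-officer utilisation
    L = rho / (1 - rho)            # mean number of immigrants in the system
    W = 1 / (mu - lam)             # mean time in system (Little's law: L = lam * W)
    Lq = rho ** 2 / (1 - rho)      # mean queue length (excluding the one in service)
    Wq = Lq / lam                  # mean waiting time before service
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

m = mm1_metrics(lam=0.5, mu=1.0)   # e.g. one arrival per 2 min, 1 min mean service
```

Little's law (L = lam * W) ties the queue-length and waiting-time measurements together, which is why a simulation can cross-check observed queues against clocked processing times.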

  9. Event-time analysis of reproductive traits in dairy heifers.

    NARCIS (Netherlands)

    Vargas, B.; Lende, van der T.; Baaijen, M.; Arendonk, van J.A.M.

    1998-01-01

    Data on the reproductive traits of dairy heifers were analyzed using event-time techniques. Traits analyzed were age at first calving (n = 4631), days to first breeding, and days open (n = 1992) during the first lactation. A proportional hazard model was used that included fixed effects of

  10. Analysis of catchments response to severe drought event for ...

    African Journals Online (AJOL)

    Nafiisah

    high deficit in rainfall. The dry spell started at the beginning of November 1998 and ended in January 1999, provoking serious drought conditions in the country and affecting all ... detail by determining both duration and deficit volume of the event (Figure 1). Accordingly ... commercial and domestic purposes. Rainfall ...

  11. Descriptive Analysis of Air Force Non-Fatal Suicide Events

    Science.gov (United States)

    2006-07-01

    In support of the Air Force Suicide Prevention Program, the Suicide Event Surveillance System (SESS) was developed in response to a peak in 1994 in the number of

  12. Events in time: Basic analysis of Poisson data

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for the case in which the rate of occurrence varies randomly. Examples and SAS programs are given.
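The point estimate and exact confidence interval for a Poisson occurrence rate described above can be sketched as follows; this is a minimal illustration of the standard chi-square interval, not the report's SAS code.

```python
from scipy.stats import chi2

def poisson_rate_ci(n_events, exposure, conf=0.95):
    """MLE and exact (chi-square) confidence interval for a Poisson rate."""
    alpha = 1.0 - conf
    rate = n_events / exposure                     # point estimate: events per unit time
    # exact bounds via the chi-square / Poisson relationship
    lower = chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure) if n_events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * n_events + 2) / (2 * exposure)
    return rate, lower, upper

rate, lo, hi = poisson_rate_ci(n_events=10, exposure=5.0)  # 10 events in 5 time units
```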

  13. Sustained fecal-oral human-to-human transmission following a zoonotic event

    NARCIS (Netherlands)

    M.T. de Graaf (Marieke); Beck, R. (Relja); S. Caccio (Simone); B. Duim; P.L.A. Fraaij (Pieter); Le Guyader, F.S. (Françoise S.); Lecuit, M. (Marc); Le Pendu, J. (Jacques); E. de Wit (Emmie); C. Schultsz (Constance)

    2017-01-01

    textabstractBacterial, viral and parasitic zoonotic pathogens that transmit via the fecal-oral route have a major impact on global health. However, the mechanisms underlying the emergence of such pathogens from the animal reservoir and their persistence in the human population are poorly understood.

  14. Sustained fecal-oral human-to-human transmission following a zoonotic event

    NARCIS (Netherlands)

    de Graaf, Miranda; Beck, Relja; Caccio, Simone M; Duim, Birgitta|info:eu-repo/dai/nl/143855352; Fraaij, Pieter LA; Le Guyader, Françoise S; Lecuit, Marc; Le Pendu, Jacques; de Wit, Emmie; Schultsz, Constance

    2016-01-01

    Bacterial, viral and parasitic zoonotic pathogens that transmit via the fecal-oral route have a major impact on global health. However, the mechanisms underlying the emergence of such pathogens from the animal reservoir and their persistence in the human population are poorly understood. Here, we

  15. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
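The distribution-fitting step can be sketched with SciPy on synthetic data; the sample size and scales below are arbitrary assumptions, not the NREL measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-ins for two of the frequency metrics (Hz deviations, assumed scales):
f_bc = rng.normal(loc=0.0, scale=0.02, size=5000)   # settling minus nadir: ~normal
f_ca = 0.02 * rng.standard_cauchy(5000)             # nadir minus prefault: heavy-tailed

# Maximum-likelihood fits; robust starting values help the Cauchy optimizer
mu, sigma = stats.norm.fit(f_bc)
q25, q75 = np.percentile(f_ca, [25, 75])
loc, scale = stats.cauchy.fit(f_ca, loc=np.median(f_ca), scale=(q75 - q25) / 2)
```

For a Cauchy distribution the half-interquartile range equals the scale parameter, which is why it makes a sensible starting guess where sample moments (badly distorted by the heavy tails) do not.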

  16. Analysis of Chain of Events in Major Historic Power Outages

    Directory of Open Access Journals (Sweden)

    HUANG, T.

    2014-08-01

    Contemporary power systems face increasingly intricate conditions that were never considered when the infrastructure was initially designed, such as malicious threats and the accommodation of smart grids. As a consequence, blackouts, albeit seldom, stubbornly keep appearing around the world, and demonstrate their devastating capability to inflict vast damage on both power systems and society at large. Patterns have emerged in how blackouts evolve from the first triggering events to the final system state. This paper proposes a framework based on a coding system to capture the common features of system evolution during the development of cascades. With these codes, the cascades in a blackout can be tracked as a chain of events, and the framework can readily be used to build a knowledge base of blackouts. By applying the proposed framework to 31 selected historic blackouts, the most frequent events, effects and origins are identified; the findings provide useful information for grid designers and security experts when ranking the most imminent issues in their studies.

  17. Analysis of strong wind events around Adelie Land, East Antarctica

    Directory of Open Access Journals (Sweden)

    G. Mastrantonio

    2003-06-01

    Strong wind events at Dumont d'Urville (DdU), an East Antarctic coastal station, and Dome C, an interior station, were studied to determine whether the wind along the Adelie Land coast increases with the approach of a depression from the west of the site or after its passage to the east of it. The events for the year 1993 were studied using synoptic observations, mean sea level pressure charts and composite infrared satellite images. It was found that the winds are enhanced with the approach of a depression from the west towards the DdU coast. The wind increases in response to the decreasing pressure at the coastal site and the increasing downslope pressure difference (dp). The wind starts decreasing once the system moves to the east of DdU and the pressure at DdU starts building up, as reported in some earlier studies. The response of the wind to an approaching depression is not the same for all events but depends on the downslope pressure difference and the movement of the depression, which is often conditioned by the presence of a blocking high to the northeast. The wind decreases if the system starts penetrating inland due to the presence of the high-pressure ridge to the northeast and decreasing dp. It is observed that the winds at Dome C increase to as high as 17 m s-1 with the inland penetration of the depression.

  18. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    Lexis diagram; current status; prevalent cohort; interim analysis; pharmaco-epidemiology; inverse probability weighting

  19. ERPWAVELAB A toolbox for multi-channel analysis of time-frequency transformed event related potentials

    DEFF Research Database (Denmark)

    Mørup, Morten; Hansen, Lars Kai; Arnfred, Sidse M.

    2006-01-01

    The toolbox 'ERPWAVELAB' is developed for multi-channel time-frequency analysis of event related activity of EEG and MEG data. The toolbox provides tools for data analysis and visualization of the most commonly used measures of time-frequency transformed event related data as well as data...

  20. Thermodynamical analysis of human thermal comfort

    OpenAIRE

    Prek, Matjaž

    2015-01-01

    Traditional methods of human thermal comfort analysis are based on the first law of thermodynamics. These methods use an energy balance of the human body to determine heat transfer between the body and its environment. By contrast, the second law of thermodynamics introduces the useful concept of exergy. It enables the determination of the exergy consumption within the human body dependent on human and environmental factors. Human body exergy consumption varies with the combination of environ...

  1. A Human Body Analysis System

    Directory of Open Access Journals (Sweden)

    Girondel Vincent

    2006-01-01

    This paper describes a fast and completely automatic system for human body analysis (segmentation, tracking, face/hands localisation, posture recognition) from a single view. The system first extracts low-level data and uses part of the data for high-level interpretation. It can detect and track several persons even if they merge or are completely occluded by another person from the camera's point of view. For the high-level interpretation step, static posture recognition is performed using a belief theory-based classifier. Belief theory is considered here as a new approach for performing posture recognition and classification using imprecise and/or conflicting data. Four different static postures are considered: standing, sitting, squatting, and lying. The aim of this paper is to give a global view and an evaluation of the performance of the entire system and to describe each of its processing steps in detail, whereas our previous publications focused on single parts of the system. The efficiency and the limits of the system have been highlighted on a database of more than fifty video sequences in which a dozen different individuals appear. The system allows real-time processing and aims at monitoring elderly people in video surveillance applications or at mixing real and virtual worlds in ambient intelligence systems.
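The belief-theory classifier can be illustrated with Dempster's rule of combination; the mass assignments and feature names below are invented for illustration and are not the values used by the authors.

```python
from itertools import product

POSTURES = frozenset({"standing", "sitting", "squatting", "lying"})

def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dict: frozenset -> mass) via Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                                  # compatible hypotheses reinforce
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:                                      # incompatible evidence is conflict mass
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical evidence from two image features (e.g. bounding-box ratio, head height):
m1 = {frozenset({"standing"}): 0.6, frozenset({"standing", "sitting"}): 0.3, POSTURES: 0.1}
m2 = {frozenset({"standing"}): 0.5, frozenset({"lying"}): 0.2, POSTURES: 0.3}
m = dempster_combine(m1, m2)
best = max((s for s in m if len(s) == 1), key=lambda s: m[s])  # most-believed single posture
```

Assigning mass to sets of postures (e.g. {standing, sitting}) is what lets the classifier represent imprecise evidence, while the conflict term absorbs contradictory evidence.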

  2. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) device during a Single Event Burnout (SEB) process. To study these effects, we propose a more detailed simulation structure than those previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat-generation region with 5 W, 10 W, 50 W and 100 W to emulate the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  3. Analysis of earth-moving systems using discrete-event

    Directory of Open Access Journals (Sweden)

    K.M. Shawki

    2015-09-01

    Discrete-event simulation has been a widely used technique for analyzing construction operations over the past three decades, owing to its great effect on optimizing cost and productivity. In this paper we present Arena as a tool for simulating earthwork operations; the advantage of Arena is its ease and flexibility in simulating most kinds of models in different areas of construction. A case study is presented in which a model is built and results are obtained to demonstrate these objectives.
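A minimal discrete-event sketch of an earthmoving cycle (one loader shared by several trucks, deterministic durations) illustrates the kind of model the paper builds in Arena; all durations and counts below are assumptions, not the case-study data.

```python
import heapq

LOAD, TRAVEL, HORIZON = 5.0, 20.0, 480.0   # minutes: load time, haul+dump+return, shift
N_TRUCKS = 3

# Event list holds (arrival time at the loader, truck id); all trucks start queued.
events = [(0.0, i) for i in range(N_TRUCKS)]
heapq.heapify(events)
loader_free, loads_done = 0.0, 0

while events:
    arrival, truck = heapq.heappop(events)
    start = max(arrival, loader_free)          # wait in queue if the loader is busy
    finish = start + LOAD
    if finish > HORIZON:
        continue                               # shift over; this truck stops cycling
    loads_done += 1
    loader_free = finish
    heapq.heappush(events, (finish + TRAVEL, truck))  # truck returns after its round trip

utilisation = loads_done * LOAD / HORIZON       # fraction of the shift the loader works
```

Replacing the constant durations with random draws (and replicating runs) is the usual next step toward the cost/productivity analysis the paper describes.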

  4. Adverse clinical events reported during Invisalign treatment: Analysis of the MAUDE database.

    Science.gov (United States)

    Allareddy, Veerasathpurush; Nalliah, Romesh; Lee, Min Kyeong; Rampa, Sankeerth; Allareddy, Veerajalandhar

    2017-11-01

    The objectives of this study were to examine adverse clinical events after the use of the Invisalign system and to provide an overview of the actions taken by the manufacturer to address these events. A retrospective analysis of the Manufacturer and User Facility Device Experience database of the United States Food and Drug Administration was used. All medical device reports pertaining to products of Align Technology reported to the United States Food and Drug Administration from November 1, 2006, to November 30, 2016, were analyzed. Qualitative content analysis was conducted of event descriptions and manufacturer narrative reports. A total of 173 medical device reports were reported in the Manufacturer and User Facility Device Experience database: 169 (97.7%) were designated as adverse event reports, and 45 (26%) were deemed by the treating doctor to be serious or life threatening. The largest number of medical device reports describing a serious or life-threatening event occurred in 2014 (50%). The most frequently reported adverse event was difficulty breathing (56 events), followed by sore throat (35 events), swollen throat (34 events), swollen tongue (31 events), hives and itchiness (31 events), anaphylaxis (30 events), swollen lips (27 events), and feeling of throat closing/tight airway/airway obstruction/laryngospasm (24 events). Serious or life-threatening events could be associated with use of Invisalign systems. Health care providers should be aware of these events and know how to handle them if they arise in their practices. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  5. Humans can integrate feedback of discrete events in their sensorimotor control of a robotic hand.

    Science.gov (United States)

    Cipriani, Christian; Segil, Jacob L; Clemente, Francesco; ff Weir, Richard F; Edin, Benoni

    2014-11-01

    Providing functionally effective sensory feedback to users of prosthetics is a largely unsolved challenge. Traditional solutions require high bandwidths for providing feedback for the control of manipulation and yet have been largely unsuccessful. In this study, we have explored a strategy that relies on temporally discrete sensory feedback that is technically simple to provide. According to the Discrete Event-driven Sensory feedback Control (DESC) policy, motor tasks in humans are organized in phases delimited by means of sensory encoded discrete mechanical events. To explore the applicability of DESC for control, we designed a paradigm in which healthy humans operated an artificial robot hand to lift and replace an instrumented object, a task that can readily be learned and mastered under visual control. Assuming that the central nervous system of humans naturally organizes motor tasks based on a strategy akin to DESC, we delivered short-lasting vibrotactile feedback related to events that are known to forcefully affect progression of the grasp-lift-and-hold task. After training, we determined whether the artificial feedback had been integrated with the sensorimotor control by introducing short delays, and we indeed observed that the participants significantly delayed subsequent phases of the task. This study thus gives support to the DESC policy hypothesis. Moreover, it demonstrates that humans can integrate temporally discrete sensory feedback while controlling an artificial hand, and invites further studies in which inexpensive, noninvasive technology could be used in clever ways to provide physiologically appropriate sensory feedback in upper limb prosthetics with much lower bandwidth requirements than with traditional solutions.

  6. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    Science.gov (United States)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than the currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and assembling a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations borne out by the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed-events rate is cut by 42%, hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  7. Evaluation of 6 and 10 Year-Old Child Human Body Models in Emergency Events.

    Science.gov (United States)

    Gras, Laure-Lise; Stockman, Isabelle; Brolin, Karin

    2017-01-01

    Emergency events can influence a child's kinematics prior to a car crash, and thus its interaction with the restraint system. Numerical Human Body Models (HBMs) can help in understanding the behaviour of children in emergency events. The kinematic responses of two child HBMs (MADYMO 6 and 10 year-old models) were evaluated and compared with child volunteer data during emergency events (braking and steering), with a focus on forehead and sternum displacements. The response of the 6 year-old HBM was similar to that of the 10 year-old HBM; however, both models responded differently from the volunteers. The forward and lateral displacements were within the range of volunteer data up to approximately 0.3 s, but then the HBMs' head and sternum moved significantly downwards, while the volunteers experienced smaller displacements and tended to return to their initial posture. Therefore, these HBMs, originally intended for crash simulations, are not too stiff and could properly reproduce emergency events if, for instance, postural control were added.

  8. ANALYSIS OF HUMAN RESOURCES MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Anis Cecilia - Nicoleta

    2010-07-01

    Along with material and financial resources, human resources are an indispensable element of every work process. The concept of a human resource derives precisely from the fact that it is limited in nature and is consumed by use in the workplace. No work process can be carried out without the labour factor. Work is essentially a conscious activity specific to humans, through which people act upon objects of labour and transform them according to their needs.

  9. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    Science.gov (United States)

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2017-12-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM allows the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure has often been identified only by arrhythmic events, not by impedance abnormalities. The aim of this study was to compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center at Okayama University Hospital, where all transmitted data are analyzed and summarized. From April 2009 to March 2016, 1,873 patients were followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic events 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none experienced inappropriate therapy. RM can detect lead failure early, before clinical adverse events. However, CIEDs often diagnose lead failure as mere arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.
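The reported 95% confidence intervals for proportions can be reproduced with an exact (Clopper-Pearson) binomial interval; the interval type is an assumption here, as the abstract does not name the method used.

```python
from scipy.stats import beta

def clopper_pearson(x, n, conf=0.95):
    """Exact two-sided binomial confidence interval for a proportion x/n."""
    alpha = 1.0 - conf
    lower = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lower, upper

# 32 of 42 lead failures were detected only by arrhythmic events (76.2%)
lo, hi = clopper_pearson(32, 42)
```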

  10. Analysis of ionospheric parameters during Solar events and geomagnetic storms

    Directory of Open Access Journals (Sweden)

    Mandrikova Oksana

    2016-01-01

    The paper presents new methods for the analysis of ionospheric and magnetic data that apply the multicomponent construction models (MCM) developed by the authors. Based on ground-station data, ionospheric and magnetic data during periods of increased solar activity were analyzed.

  11. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    Science.gov (United States)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for finding papers, books, and experts. However, academic events (conferences, workshops, international symposiums, etc.) are an important driving force in moving cooperation among research communities forward. We present an SNA-based approach to the academic event recommendation problem. Scientific community analysis and visualization are performed to provide insight into the communities behind event series. A prototype was implemented based on data from DBLP and EventSeer.net, and the results were examined in order to validate the approach.

  12. The use of geoinformatic data and spatial analysis to predict faecal pollution during extreme precipitation events

    Science.gov (United States)

    Ward, Ray; Purnell, Sarah; Ebdon, James; Nnane, Daniel; Taylor, Huw

    2013-04-01

    be a major factor contributing to increased levels of FIO. This study identifies areas within the catchment that are likely to demonstrate elevated erosion rates during extreme precipitation events, which are likely to result in raised levels of FIO. The results also demonstrate that increases in the human faecal marker were associated with the discharge points of wastewater treatment works, and that levels of the marker increased whenever the works discharged untreated wastewaters during extreme precipitation. Spatial analysis also highlighted locations where human faecal pollution was present in areas away from wastewater treatment plants, highlighting the potential significance of inputs from septic tanks and other un-sewered domestic wastewater systems. Increases in the frequency of extreme precipitation events in many parts of Europe are likely to result in increased levels of water pollution from both point- and diffuse-sources, increasing the input of pathogens into surface waters, and elevating the health risks to downstream consumers of abstracted drinking water. This study suggests an approach that integrates water microbiology and geoinformatic data to support a 'prediction and prevention' strategy, in place of the traditional focus on water quality monitoring. This work may therefore make a significant contribution to future European water resource management and health protection.

  13. Identification and analysis of alternative splicing events in Phaseolus vulgaris and Glycine max.

    Science.gov (United States)

    Iñiguez, Luis P; Ramírez, Mario; Barbazuk, William B; Hernández, Georgina

    2017-08-22

    The vast diversification of proteins in eukaryotic cells has been related to the production of multiple transcript isoforms from a single gene through alternative splicing (AS) of primary transcripts. Analysis of RNA sequencing data from expressed sequence tags and next-generation RNA sequencing has been crucial for AS identification and genome-wide AS studies. For the identification of AS events in the related legume species Phaseolus vulgaris and Glycine max, 157 and 88 publicly available RNA-seq libraries, respectively, were analyzed. We identified 85,570 AS events in 72% of expressed genes from P. vulgaris and 134,316 AS events in 70% of expressed genes from G. max. These were categorized into seven AS event types, with intron retention the most abundant, followed by alternative acceptor and alternative donor, together representing ~75% of all AS events in both plants. Conservation of AS events in homologous genes between the two species was analyzed, revealing an overrepresentation of AS affecting 5'UTR regions for certain types of AS events. The conservation of AS events was experimentally validated for 8 selected genes through RT-PCR analysis. The different types of AS events also varied by relative position within the genes. The results were consistent in both species. The identification and analysis of AS events are first steps toward understanding their biological relevance. The results presented here from two related legume species reveal high conservation over ~15-20 MY of divergence and may point to the biological relevance of AS.

  14. Human error analysis of commercial aviation accidents using the human factors analysis and classification system (HFACS)

    Science.gov (United States)

    2001-02-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework : originally developed and tested within the U.S. military as a tool for investigating and analyzing the human : causes of aviation accidents. Based upon ...

  15. Human errors identification using the human factors analysis and classification system technique (HFACS

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Results: In this study, 158 accident reports from the Ahvaz steel industry were analyzed using the HFACS technique. The analysis showed that most human errors were related to skill-based errors at the first level, to the physical environment at the second level, to inadequate supervision at the third level, and to resource management at the fourth level. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can be effective in preventing the repetition of such mishaps. It can also be used as a basis for developing strategies to prevent future events in steel industries.

  16. Combining regional climate and national human development scenarios to estimate future vulnerability to extreme climate and weather events

    Science.gov (United States)

    Patt, A.; Nussbaumer, P.

    2009-04-01

    Extreme climate and weather events such as droughts, floods, and tropical cyclones account for over 60% of the loss of life, and over 90% of total impacts, from natural disasters. Both observed trends and global climate models (GCMs) suggest that the frequency and intensity of extreme events is increasing, and will continue to increase as a result of climate change. Among planners and policy-makers at both national and international levels there is thus concern that this rise in extreme events will lead to greater losses in the future. Since low levels of development are associated with greater numbers of people killed and needing emergency assistance from natural disasters, the concern is most pronounced for least developed countries. If, however, these countries make substantial improvements in their levels of human development, as leading forecasters suggest may be the case over the coming decades, then their vulnerability to extreme events may fall. In this study, we examine the potential combined effects of increased extreme event frequency and improved levels of human development, to generate scenarios of risk levels into the second half of the century. It is the African continent for which these results may be the most relevant, since it is widely viewed as most vulnerable to increased risks from climate change; we focus on the particular country of Mozambique, which has experienced high losses from droughts, floods, and tropical cyclones in recent decades, and stands out as being among the most vulnerable in Africa. To assess the change in risk levels from the present until 2060, we pull together three pieces of analysis. The first is a statistical analysis of the losses from 1990-2007 from climate-related disasters, using national level data from the Centre for Research on the Epidemiology of Disasters (CRED) and the United Nations. From this analysis, we establish statistical relationships between several drivers of vulnerability—including country size

  17. Quantifying the effect of interannual ocean variability on the attribution of extreme climate events to human influence

    Science.gov (United States)

    Risser, Mark D.; Stone, Dáithí A.; Paciorek, Christopher J.; Wehner, Michael F.; Angélil, Oliver

    2017-11-01

    In recent years, the climate change research community has become highly interested in describing the anthropogenic influence on extreme weather events, commonly termed "event attribution." Limitations in the observational record and in computational resources motivate the use of uncoupled, atmosphere/land-only climate models with prescribed ocean conditions run over a short period, leading up to and including an event of interest. In this approach, large ensembles of high-resolution simulations can be generated under factual observed conditions and counterfactual conditions that might have been observed in the absence of human interference; these can be used to estimate the change in probability of the given event due to anthropogenic influence. However, using a prescribed ocean state ignores the possibility that estimates of attributable risk might be a function of the ocean state. Thus, the uncertainty in attributable risk is likely underestimated, implying an over-confidence in anthropogenic influence. In this work, we estimate the year-to-year variability in calculations of the anthropogenic contribution to extreme weather based on large ensembles of atmospheric model simulations. Our results both quantify the magnitude of year-to-year variability and categorize the degree to which conclusions of attributable risk are qualitatively affected. The methodology is illustrated by exploring extreme temperature and precipitation events for the northwest coast of South America and northern-central Siberia; we also provide results for regions around the globe. While it remains preferable to perform a full multi-year analysis, the results presented here can serve as an indication of where and when attribution researchers should be concerned about the use of atmosphere-only simulations.

  18. Analysis of recurrent events: a systematic review of randomised controlled trials of interventions to prevent falls.

    Science.gov (United States)

    Donaldson, Meghan G; Sobolev, Boris; Cook, Wendy L; Janssen, Patti A; Khan, Karim M

    2009-03-01

    There are several well-developed statistical methods for analysing recurrent events. Although there are guidelines for reporting the design and methodology of randomised controlled trials (RCTs), analysis guidelines do not exist to guide the analysis for RCTs with recurrent events. Application of statistical methods that do not account for recurrent events may provide erroneous results when used to test the efficacy of an intervention. It is unknown what proportion of RCTs of falls prevention studies have utilised statistical methods that incorporate recurrent events. We conducted a systematic review of RCTs of interventions to prevent falls in community-dwelling older persons. We searched Medline from 1994 to November 2006. We determined the proportion of studies that reported using three statistical methods appropriate for the analysis of recurrent events (negative binomial regression, Andersen-Gill extension of the Cox model and the WLW marginal model). Fewer than one-third of 83 papers that reported falls as an outcome utilised any appropriate statistical method (negative binomial regression, Andersen-Gill extension of the Cox model and Cox marginal model) to analyse recurrent events, and fewer than 15% utilised graphical methods to represent falls data. RCTs that have a recurrent-event end-point should include an analysis appropriate for recurrent event data, such as negative binomial regression, the Andersen-Gill extension of the Cox model and/or the WLW marginal model. We recommend that researchers and clinicians seek consultation with a statistician with expertise in recurrent event methodology.
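    As a toy illustration (hypothetical data, standard library only, not from the review), the sketch below contrasts a binary faller/non-faller endpoint with an event-rate summary; rate-based methods such as negative binomial regression model the event-rate scale computed by `rate_ratio` here:

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical two-arm falls trial: each participant has a monthly chance of
# falling over a 12-month follow-up, so per-person counts are recurrent events.
control = [sum(random.random() < 0.100 for _ in range(12)) for _ in range(200)]
treated = [sum(random.random() < 0.067 for _ in range(12)) for _ in range(200)]

def prop_fallers(counts):
    """Binary endpoint: proportion with at least one fall (discards recurrence)."""
    return sum(c > 0 for c in counts) / len(counts)

def rate_ratio(treated_counts, control_counts):
    """Recurrent-event endpoint: ratio of mean falls per person-year."""
    return mean(treated_counts) / mean(control_counts)
```

By construction the underlying rate ratio is about 0.67, while the binary contrast compresses the same effect into a smaller difference in proportions; this is why the review recommends methods that use the full event history.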

  19. Human Capital Development: Comparative Analysis of BRICs

    Science.gov (United States)

    Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera

    2012-01-01

    Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…

  20. A review of the use of human factors classification frameworks that identify causal factors for adverse events in the hospital setting.

    Science.gov (United States)

    Mitchell, R J; Williamson, A M; Molesworth, B; Chung, A Z Q

    2014-01-01

    Various human factors classification frameworks have been used to identify causal factors for clinical adverse events. A systematic review was conducted to identify human factors classification frameworks that identified the causal factors (including human error) of adverse events in a hospital setting. Six electronic databases were searched, identifying 1997 articles, 38 of which met inclusion criteria. Most studies included causal contributing factors as well as error and error type, but the nature of coding varied considerably between studies. The ability of human factors classification frameworks to provide information on specific causal factors for an adverse event enables the focus of preventive attention on areas where improvements are most needed. This review highlighted some areas needing considerable improvement in order to meet this need, including better definition of terms, more emphasis on assessing reliability of coding and greater sophistication in analysis of results of the classification. Practitioner Summary: Human factors classification frameworks can be used to identify causal factors of clinical adverse events. However, this review suggests that existing frameworks are diverse, limited in their identification of the context of human error and have poor reliability when used by different individuals.

  1. Metagenomic Analysis of the Human Gut Microbiome

    DEFF Research Database (Denmark)

    dos Santos, Marcelo Bertalan Quintanilha

    Understanding the link between the human gut microbiome and human health is one of the biggest scientific challenges in our decade. Because 90% of our cells are bacteria, and the microbial genome contains 200 times more genes than the human genome, the study of the human microbiome has the potential to impact many areas of our health. This PhD thesis is the first study to generate a large amount of experimental data on the DNA and RNA of the human gut microbiome. This was made possible by our development of a human gut microbiome array capable of profiling any human gut microbiome. Analysis of our results changes the way we link the gut microbiome with diseases. Our results indicate that inflammatory diseases will affect the ecological system of the human gut microbiome, reducing its diversity. Classification analysis of healthy and unhealthy individuals demonstrates that unhealthy…

  2. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionative data. Among various social media, Twitter gained popularity as it offers a fast and effective way of sharing users' perspectives on various critical and other issues in different domains. As this data is generated at huge scale in the cloud, it has opened doors for researchers in the field of data science and analysis. There are various domains, such as the 'Political' domain, the 'Entertainment' domain and the 'Business' domain. There are also various APIs that Twitter provides for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collecting user profiles, friends and followers; 3) the Streaming API, which collects details like tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. For this we focus on the 'Entertainment' domain, especially 'Sports', as the IPL T20 is currently a trending, ongoing event. We collect this large volume of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using different techniques, such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the event's temporal data and ranking players or teams by popularity, helping people understand key influencers on the social media platform.
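    A minimal sketch (standard library only, invented example tweets) of the two analyses the abstract describes: time-series binning of tweet volume and term-frequency counting over JSON-shaped documents like those stored in MongoDB:

```python
from collections import Counter
from datetime import datetime

# Hypothetical tweet documents, shaped like the stored JSON records.
tweets = [
    {"created_at": "2017-04-19T14:01:10", "text": "What a six by Kohli! #IPL"},
    {"created_at": "2017-04-19T14:01:40", "text": "Kohli on fire tonight #IPL"},
    {"created_at": "2017-04-19T14:07:05", "text": "Great over from Bumrah #IPL"},
]

def minute_bins(docs):
    """Tweet volume per minute; spikes mark 'interesting moments' in the event."""
    bins = Counter()
    for d in docs:
        t = datetime.fromisoformat(d["created_at"])
        bins[t.replace(second=0, microsecond=0)] += 1
    return bins

def term_freq(docs, stop=frozenset({"a", "on", "from", "what", "by"})):
    """Filtered term frequencies; top terms proxy for player/team popularity."""
    tf = Counter()
    for d in docs:
        for w in d["text"].lower().split():
            w = w.strip("!#.,")
            if w and w not in stop:
                tf[w] += 1
    return tf
```

The stop-word list and field names are assumptions for the sketch; a real pipeline would pull the same fields from the Streaming API payload.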

  3. Spatial and Temporal Considerations for Analysis of Single-Event Mechanisms in FinFET Technologies

    Science.gov (United States)

    2017-03-01

    Patrick Nsengiyumva, Lloyd W. Massengill… Abstract: This paper examines the sensitivity of single-event simulation results to spatial and temporal single-event model parameter values… represent the charge generated by an ion are based upon Gaussian spatial and temporal representations. The Gaussian model and basis for model parameter…

  4. Event-based media processing and analysis: A survey of the literature

    OpenAIRE

    Tzelepis, Christos; Ma, Zhigang; MEZARIS, Vasileios; Ionescu, Bogdan; Kompatsiaris, Ioannis; Boato, Giulia; Sebe, Nicu; Yan, Shuicheng

    2016-01-01

    Research on event-based processing and analysis of media is receiving an increasing attention from the scientific community due to its relevance for an abundance of applications, from consumer video management and video surveillance to lifelogging and social media. Events have the ability to semantically encode relationships of different informational modalities, such as visual-audio-text, time, involved agents and objects, with the spatio-temporal component of events being a key feature for ...

  5. Assessing a Sport/Cultural Events Network: An Application of Social Network Analysis

    OpenAIRE

    Ziakas, V; Costa, CA

    2009-01-01

    The purpose of this study was to assess the complexity of a sport/cultural events network. To that intent, a social network analysis was conducted in a small community in the US. The study had three main objectives: (1) Examine relationships among organisations involved in planning and implementing sport and cultural events based on their communication, exchange of resources, and assistance; (2) Identify the most important actors within the events network and their relationships; (3) Investig...

  6. [Analysis on the adverse events of cupping therapy in the application].

    Science.gov (United States)

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    A detailed analysis was performed on the cases of adverse events and common injuries from cupping therapy encountered in recent years, in terms of manipulation and the patient's constitution. The adverse events of cupping therapy are commonly caused by medical practitioners' improper manipulation and by ignoring contraindications and the patient's constitution. Clinical practitioners should use cupping therapy cautiously, strictly follow the rules of standard manipulation and the core medical system, pay attention to the contraindications and take strict precautions against the occurrence of adverse events.

  7. What can we learn from the deadly flash floods? Post Event Review Capability (PERC) analysis of the Bavaria and Baden-Wurttemberg flood events in Summer 2016

    Science.gov (United States)

    Szoenyi, Michael

    2017-04-01

    In May/June 2016, stationary low-pressure systems brought intense rainfall with record-breaking intensities of well above 100 mm of rain in a few hours locally in the southern states of Baden-Wurttemberg and Bavaria, Germany. In steep terrain, small channels and creeks became devastating torrents impacting, among others, the villages of Simbach/Inn, Schwäbisch-Gmünd and Braunsbach. Just a few days prior, France had also seen devastating rainfall and flooding. Damage in Germany alone is estimated at 2.8 M USD, of which less than 50% is insured. The loss of life was significant, with 18 fatalities reported across the events. This new forensic event analysis, part of Zurich's Post Event Review Capability (PERC), investigates the flash flood events following these record rainfalls in Southern Germany and tries to answer the following questions holistically, across the five capitals (5C) and the full disaster risk management (DRM) cycle, which are key to understanding how to become more resilient to such flood events: - Why have these intense rainfall events led to such devastating consequences? The EU Floods Directive and its implementation in the various member states, as well as the 2002 and 2013 Germany floods, have focused on larger rivers and the main asset concentrations. The pathways and mechanisms of the 2016 floods are very different and need to be better understood. Flash floods and surface flooding may need to become the new focus and be much better communicated to people at risk, as awareness of such perils has been identified as low. - How can the prevalence of such flash floods be better identified and mapped? Research indicated that affected people and decision makers alike regard the occurrence of such flash floods as arbitrary, but we argue that hotspots can and must be identified based on an overlay of rainfall intensity maps, topography leading to flash flood processes, and vulnerable assets. In Germany, there are currently no comprehensive hazard

  8. Genome-Wide Analysis of Polyadenylation Events in Schmidtea mediterranea

    Directory of Open Access Journals (Sweden)

    Vairavan Lakshmanan

    2016-10-01

    In eukaryotes, 3′ untranslated regions (UTRs) play important roles in regulating posttranscriptional gene expression. The 3′UTR is defined by regulated cleavage/polyadenylation of the pre-mRNA. The advent of next-generation sequencing technology has now enabled us to identify these events on a genome-wide scale. In this study, we used poly(A)-position profiling by sequencing (3P-Seq) to capture all poly(A) sites across the genome of the freshwater planarian, Schmidtea mediterranea, an ideal model system for exploring the process of regeneration and stem cell function. We identified the 3′UTRs for ∼14,000 transcripts and thus improved the existing gene annotations. We found 97 transcripts which are polyadenylated within an internal exon, resulting in the shrinking of the ORF and loss of a predicted protein domain. Around 40% of the transcripts in planaria were alternatively polyadenylated (ApA), resulting either in an altered 3′UTR or a change in coding sequence. We identified specific ApA transcript isoforms that were subjected to miRNA-mediated gene regulation using degradome sequencing. In this study, we also confirmed a tissue-specific expression pattern for alternate polyadenylated transcripts. The insights from this study highlight the potential role of ApA in regulating the gene expression essential for planarian regeneration.

  9. Genome-Wide Analysis of Polyadenylation Events in Schmidtea mediterranea.

    Science.gov (United States)

    Lakshmanan, Vairavan; Bansal, Dhiru; Kulkarni, Jahnavi; Poduval, Deepak; Krishna, Srikar; Sasidharan, Vidyanand; Anand, Praveen; Seshasayee, Aswin; Palakodeti, Dasaradhi

    2016-10-13

    In eukaryotes, 3' untranslated regions (UTRs) play important roles in regulating posttranscriptional gene expression. The 3'UTR is defined by regulated cleavage/polyadenylation of the pre-mRNA. The advent of next-generation sequencing technology has now enabled us to identify these events on a genome-wide scale. In this study, we used poly(A)-position profiling by sequencing (3P-Seq) to capture all poly(A) sites across the genome of the freshwater planarian, Schmidtea mediterranea, an ideal model system for exploring the process of regeneration and stem cell function. We identified the 3'UTRs for ∼14,000 transcripts and thus improved the existing gene annotations. We found 97 transcripts, which are polyadenylated within an internal exon, resulting in the shrinking of the ORF and loss of a predicted protein domain. Around 40% of the transcripts in planaria were alternatively polyadenylated (ApA), resulting either in an altered 3'UTR or a change in coding sequence. We identified specific ApA transcript isoforms that were subjected to miRNA mediated gene regulation using degradome sequencing. In this study, we also confirmed a tissue-specific expression pattern for alternate polyadenylated transcripts. The insights from this study highlight the potential role of ApA in regulating the gene expression essential for planarian regeneration. Copyright © 2016 Lakshmanan et al.

  10. Time-resolved human kinome RNAi screen identifies a network regulating mitotic-events as early regulators of cell proliferation.

    Directory of Open Access Journals (Sweden)

    Jitao David Zhang

    Analysis of biological processes is frequently performed with the help of phenotypic assays, where data is mostly acquired in single end-point analysis. Alternative phenotypic profiling techniques are desired where time-series information is essential to the biological question, for instance to differentiate early and late regulators of cell proliferation in loss-of-function studies. So far there has been no study addressing this question, despite high unmet interest, mostly due to the limitations of conventional end-point assay technologies. We present the first human kinome screen with a real-time cell analysis system (RTCA) to capture dynamic RNAi phenotypes, employing time-resolved monitoring of cell proliferation via electrical impedance. RTCA allowed us to investigate the dynamics of cell proliferation phenotypes instead of using conventional end-point analysis. By introducing data transformation with a first-order derivative, i.e. the cell-index growth rate, we demonstrate this system to be suitable for high-throughput screening (HTS). The screen validated previously identified inhibitor genes and, additionally, identified activators of cell proliferation. With the information on time kinetics available, we could establish a network of mitotic-event-related genes as being among the first to display inhibiting effects after RNAi knockdown. The time-resolved screen captured the kinetics of cell proliferation caused by RNAi targeting the human kinome, serving as a resource for researchers. Our work establishes RTCA technology as a novel robust tool with biological and pharmacological relevance amenable to high-throughput screening.
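    The first-order-derivative transformation mentioned in the abstract is just a finite difference of the impedance-derived cell index; a minimal sketch (invented numbers, not the paper's data):

```python
def growth_rate(times, cell_index):
    """Forward finite difference: cell-index growth rate between time points."""
    return [(cell_index[i + 1] - cell_index[i]) / (times[i + 1] - times[i])
            for i in range(len(times) - 1)]

# Toy trace: proliferation (rising cell index) followed by arrest (flat).
rates = growth_rate([0, 1, 2, 3], [1.0, 2.0, 4.0, 4.0])  # → [1.0, 2.0, 0.0]
```

The growth rate drops to zero where proliferation stalls, the kind of early signal a time-resolved screen can detect before an end-point assay would.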

  11. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  12. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data were also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
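    A sketch of the bucketing implied by the reported thresholds (under 2 s for search/filtering, over 10 s for evaluation); the timestamps are invented, while the cut-offs come directly from the abstract:

```python
def classify_dwell(timestamps_ms):
    """Bucket inter-event dwell times into search (<2 s), intermediate,
    and evaluation (>10 s) activity."""
    durations = [(b - a) / 1000.0
                 for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    buckets = {"search": 0, "intermediate": 0, "evaluate": 0}
    for d in durations:
        if d < 2:
            buckets["search"] += 1
        elif d > 10:
            buckets["evaluate"] += 1
        else:
            buckets["intermediate"] += 1
    return buckets

# Toy workstation event stream (millisecond timestamps).
profile = classify_dwell([0, 500, 1500, 14000, 20000])
```

The resulting distribution over buckets is one simple form of the "cognitive signature" the authors describe.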

  13. Stochastic Analysis of Rainfall Events in Ilorin, Nigeria | Ogunlela ...

    African Journals Online (AJOL)

    ... lognormal, log-Pearson type III, exponential and extreme value type I distributions – were used in this study because of their desirable properties. The analysis was based on 41 years of daily and monthly rainfall data (1955-1995) for Ilorin, with peak values computed for each year. Weibull plotting positions and the number ...

  14. Statistical analysis of geodetic networks for detecting regional events

    Science.gov (United States)

    Granat, Robert

    2004-01-01

    We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality.
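    The paper fits HMMs with a regularized deterministic-annealing EM algorithm; as a much smaller illustration of the underlying idea of recovering hidden regimes from a time series, here is a log-space Viterbi decoder on an invented two-state model (quiescent background motion vs. an "event" regime in which large daily displacements are more likely). All probabilities and state names below are assumptions for the toy, not values from the study:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden-state path (log-space to avoid underflow)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            prev, lp = max(((p, V[-2][p] + math.log(trans_p[p][s]))
                            for p in states), key=lambda t: t[1])
            V[-1][s] = lp + math.log(emit_p[s][o])
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Invented two-regime model with discretized displacement observations.
states = ("quiet", "event")
start = {"quiet": 0.9, "event": 0.1}
trans = {"quiet": {"quiet": 0.95, "event": 0.05},
         "event": {"quiet": 0.10, "event": 0.90}}
emit = {"quiet": {"small": 0.9, "large": 0.1},
        "event": {"small": 0.2, "large": 0.8}}

decoded = viterbi(["small", "small", "large", "large", "large", "small"],
                  states, start, trans, emit)
# The run of large displacements is flagged as the "event" regime.
```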

  15. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...
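    The book works in S/R; as a taste of the nonparametric side it covers, here is a minimal Kaplan-Meier estimator sketched in Python with invented data (`events[i]` is 1 for an observed death, 0 for right-censoring):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: list of (time, S(t)) at each death time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda pair: pair[0]):
        grp = list(grp)
        deaths = sum(e for _, e in grp)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= len(grp)  # deaths and censorings both leave the risk set
    return curve

# Invented data: deaths at t = 1, 2, 3, 4 and one censoring at t = 2.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 1])
```

Note how the censored observation at t = 2 shrinks the risk set without triggering a drop of its own; that is the whole trick of the estimator.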

  16. An analysis of fog events at Belgrade International Airport

    Science.gov (United States)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
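    The reported 51.2% peak probability corresponds to one specific combination of criteria; a toy boolean encoding of that combination follows (thresholds taken from the abstract; the function name and the numeric cut-off for "calm or weak" wind are assumptions):

```python
def fog_likely(rh_pct, dewpoint_depression_c, cloud_base_m, wind_ms):
    """True when all four of the abstract's strongest fog criteria hold
    one hour before potential onset."""
    return (rh_pct > 97
            and dewpoint_depression_c == 0
            and cloud_base_m < 50
            and wind_ms <= 2)  # assumed threshold for "calm or weak" wind
```

A real forecasting rule would of course return a probability rather than a boolean, but the structure of the compound criterion is the same.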

  17. [Intraoperative adverse events in minor oral surgery. Risk analysis].

    Science.gov (United States)

    Reich, W; Maurer, P; Schubert, J

    2005-11-01

    The aim of this prospective study was to evaluate oral surgical procedures performed as day surgery under local anesthesia. We examined patients' general condition, and besides checking for intraoperative complications we analyzed postoperative bleeding in patients with hemostatic disorders. The patient population consisted of 1540 patients (797 female, 743 male), who underwent a total of 2055 minor oral surgical procedures over a 5-year period (1998-2002). Before the treatment started a data file was made for each patient, which contained information on his or her past medical history, concomitant medication, why the operation was indicated, premedication, anesthetic and surgical techniques applied, and postoperative treatment. Systemic pathologies influencing surgical decisions were found in 316 patients (20.5%), affecting 676 interventions (32.9%). In 109 patients (5.3% of the 2055) altered hemostasis was found. The surgical procedures recorded were: (operative) tooth extraction (n=394), interventions for surgical conservation of teeth (n=272), treatment for cysts (n=140), surgical revisions (n=46) and preprosthetic surgery (n=19). Transient complications, mostly systemic in nature, occurred during 27 sessions of local anesthesia (1.3%). There were 87 adverse events intraoperatively (4.2%), most of which were confined to the surgical field; specifically, 15% of these complications took the form of hemorrhage. We observed no significant correlation between the occurrence of intraoperative complications and patients' gender, predisposing systemic pathologies including bleeding disorders, or age. Postoperative hemorrhage was observed significantly more frequently in patients with impaired hemostasis and required admission to hospital for inpatient treatment in 2 cases. According to our investigation, oral surgery can be performed in patients with a compromised general condition with as few intraoperative complications as in patients with no general medical problems.

  18. Analysis of human emotion in human-robot interaction

    Science.gov (United States)

    Blar, Noraidah; Jafar, Fairul Azni; Abdullah, Nurhidayu; Muhammad, Mohd Nazrin; Kassim, Anuar Muhamed

    2015-05-01

    There is vast application of robots in human work, such as in industry, hospitals, etc. It is therefore believed that humans and robots can collaborate well to achieve optimal results. The objectives of this project are to analyze human-robot collaboration and to understand human feelings (kansei factors) when dealing with a robot, to which the robot should adapt. Researchers are currently exploring the area of human-robot interaction with the intention of reducing problems that exist in today's society. Studies have found that to achieve good interaction between human and robot, it is first necessary to understand the abilities of each. Kansei Engineering in robotics was used to carry out the project. The project experiments were conducted by distributing questionnaires to students and technicians. The questionnaire results were then analyzed using SPSS. Results from the analysis show that five feelings are significant to humans in human-robot interaction: anxious, fatigued, relaxed, peaceful, and impressed.

  19. The human impact of tsunamis: a historical review of events 1900-2009 and systematic literature review.

    Science.gov (United States)

    Doocy, Shannon; Daniels, Amy; Dick, Anna; Kirsch, Thomas D

    2013-04-16

    Introduction. Although rare, tsunamis have the potential to cause considerable loss of life and injury as well as widespread damage to the natural and built environments. The objectives of this review were to describe the impact of tsunamis on human populations in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of tsunamis were compiled using two methods: a historical review from 1900 to mid-2009 of tsunami events from multiple databases, and a systematic literature review of publications to October 2012. Analysis included descriptive statistics and bivariate tests for associations between tsunami mortality and characteristics using STATA 11. Findings. There were 255,195 deaths (range 252,619-275,784) and 48,462 injuries (range 45,466-51,457) as a result of tsunamis from 1900 to 2009. The majority of deaths (89%) and injuries reported during this time period were attributed to a single event, the 2004 Indian Ocean tsunami. Findings from the systematic literature review indicate that the primary cause of tsunami-related mortality is drowning, and that females, children and the elderly are at increased mortality risk. The few studies that reported on tsunami-related injury suggest that males and young adults are at increased injury risk. Conclusions. Early warning systems may help mitigate tsunami-related loss of life.

  20. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism.

    Science.gov (United States)

    van Beek, Johannes H G M; Supandi, Farahaniza; Gavai, Anand K; de Graaf, Albert A; Binsl, Thomas W; Hettling, Hannes

    2011-11-13

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work, the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible.

  1. ALFA detector, Background removal and analysis for elastic events

    CERN Document Server

    Belaloui, Nazim

    2017-01-01

    I worked on the ALFA project, which aims to measure the total cross section in pp collisions as a function of t, the momentum transfer, by measuring the scattering angle of the protons. This measurement is done for all available energies; so far 7, 8 and 13 TeV. There are many analysis steps, and we have focused on enhancing the signal-to-noise ratio. First of all, I familiarized myself with ROOT and worked on understanding the code used to access the data and plot histograms, then on cutting off the background.

  2. The analysis of terminal endpoint events in stepped wedge designs.

    Science.gov (United States)

    Zhan, Zhuozhao; de Bock, Geertruida H; Wiggers, Theo; van den Heuvel, Edwin

    2016-10-30

    The stepped wedge design is a unique clinical trial design that allows for a sequential introduction of an intervention. However, the statistical analysis is unclear when this design is applied to survival data. The time-dependent introduction of the intervention, in combination with terminal endpoints and interval censoring, makes the analysis more complicated. In this paper, a time-on-study scale discrete survival model was constructed. Simulations were conducted primarily to study the performance of our model for different settings of the stepped wedge design. Secondarily, we compared our approach to a continuous-time Cox proportional hazards model. The results show that the discrete survival model estimates the intervention effects without bias. If the length of the censoring interval is increased, the precision of the estimates is decreased. Without left truncation and late entry, increasing the number of steps improves the precision of the estimates, whereas in combination with left truncation and late entry, increasing the number of steps decreases the precision. Given the same number of participants and clusters, a parallel group design has higher precision than a stepped wedge design. Copyright © 2016 John Wiley & Sons, Ltd.
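
    The person-period expansion that underlies a discrete-time survival analysis of a stepped wedge trial can be sketched as follows; the cluster crossover schedule and the field layout are illustrative assumptions, not the authors' model:

```python
# Sketch: expand survival records into person-period rows for a
# discrete-time survival analysis of a stepped wedge trial. Each cluster
# crosses over to the intervention at its own step, so the intervention
# covariate is time-dependent. Names and schedule are illustrative.

def person_periods(subjects, n_periods, crossover):
    """subjects: list of (cluster, event_period or None for censored).
    crossover[cluster] = first period in which the cluster is treated.
    Returns rows (cluster, period, treated, event), the long format that
    a discrete-time (e.g. complementary log-log) model is fit on."""
    rows = []
    for cluster, event_period in subjects:
        last = event_period if event_period is not None else n_periods
        for t in range(1, last + 1):
            treated = int(t >= crossover[cluster])
            event = int(event_period is not None and t == event_period)
            rows.append((cluster, t, treated, event))
    return rows

rows = person_periods([("A", 3), ("B", None)], n_periods=4,
                      crossover={"A": 2, "B": 4})
```

    Here subject A (cluster crossing over at step 2) has the event in period 3, while subject B stays event-free through all four periods; the expansion yields one row per at-risk period.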

  3. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    National Research Council Canada - National Science Library

    Harris, Daniel M; Westfall, John M; Fernald, Douglas H; Duclos, Christine W; West, David R; Niebauer, Linda; Marr, Linda; Quintela, Javan; Main, Deborah S

    2005-01-01

    .... This paper presents a mixed methods approach to analyzing narrative error event reports. Mixed methods studies integrate one or more qualitative and quantitative techniques for data collection and analysis...

  4. Twelve Tips for Promoting Significant Event Analysis To Enhance Reflection in Undergraduate Medical Students.

    Science.gov (United States)

    Henderson, Emma; Berlin, Anita; Freeman, George; Fuller, Jon

    2002-01-01

    Points out the importance of the facilitation of reflection and development of reflective abilities in professional development and describes 12 tips for undergraduate medical students to increase their abilities of writing reflective and creative event analysis. (Author/YDS)

  5. LINEBACKER: LINE-speed Bio-inspired Analysis and Characterization for Event Recognition

    Energy Technology Data Exchange (ETDEWEB)

    Oehmen, Christopher S.; Bruillard, Paul J.; Matzke, Brett D.; Phillips, Aaron R.; Star, Keith T.; Jensen, Jeffrey L.; Nordwall, Douglas J.; Thompson, Seth R.; Peterson, Elena S.

    2016-08-04

    The cyber world is a complex domain, with digital systems mediating a wide spectrum of human and machine behaviors. While this is enabling a revolution in the way humans interact with each other and with data, it also exposes previously unreachable infrastructure to a worldwide set of actors. Existing signature-focused solutions for intrusion detection and prevention typically seek to detect anomalous and/or malicious activity for the sake of preventing or mitigating negative impacts. But a growing interest in behavior-based detection is driving new forms of analysis that move the emphasis from static indicators (e.g. rule-based alarms or tripwires) to behavioral indicators that accommodate a wider contextual perspective. Like cyber systems, biosystems have always existed in resource-constrained hostile environments where behaviors are tuned by context, so we look to biosystems as an inspiration for addressing behavior-based cyber challenges. In this paper, we introduce LINEBACKER, a behavior-model-based approach to recognizing anomalous events in network traffic, and present the design of this approach, in which bio-inspired and statistical models work in tandem to produce individualized alerting for a collection of systems. Preliminary results of these models operating on historic data are presented, along with a plugin to support real-world cyber operations.
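
    The shift from static rules to per-system behavioral baselines can be illustrated with a toy z-score alert. This is a deliberately simple stand-in for the idea of individualized alerting, not LINEBACKER's bio-inspired models:

```python
# Toy behavioral-baseline alert illustrating the move from static rules
# to per-system behavioral indicators. A simple z-score baseline only;
# host names and the traffic metric are invented for illustration.
import statistics

def build_baseline(history):
    """Per-host mean/stdev of an activity metric (e.g. flows per minute)."""
    return {h: (statistics.mean(v), statistics.stdev(v))
            for h, v in history.items()}

def alerts(baseline, observed, threshold=3.0):
    """Flag hosts whose current metric deviates > threshold sigmas
    from their own historical behavior."""
    out = []
    for host, value in observed.items():
        mean, sd = baseline[host]
        if sd > 0 and abs(value - mean) / sd > threshold:
            out.append(host)
    return out

base = build_baseline({"db01": [40, 42, 38, 41, 39],
                       "web01": [5, 6, 5, 7, 6]})
print(alerts(base, {"db01": 41, "web01": 60}))  # → ['web01']
```

    The same absolute value of 60 flows per minute would be unremarkable for `db01` but is anomalous for `web01`; that per-system context is what a static rule cannot express.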

  6. Solar Energetic Particles Events and Human Exploration: Measurements in a Space Habitat

    Science.gov (United States)

    Narici, L.; Berrilli, F.; Casolino, M.; Del Moro, D.; Forte, R.; Giovannelli, L.; Martucci, M.; Mergè, M.; Picozza, P.; Rizzo, A.; Scardigli, S.; Sparvoli, R.; Zeitlin, C.

    2016-12-01

    Solar activity is the source of Space Weather disturbances. Flares, CMEs and coronal holes modulate the physical conditions of circumterrestrial and interplanetary space and ultimately the fluxes of high-energy ionized particles, i.e., solar energetic particles (SEPs) and the galactic cosmic ray (GCR) background. This ionizing radiation affects spacecraft and biological systems and is therefore an important issue for human exploration of space. During a deep space voyage (for example, a trip to Mars) radiation risk thresholds may well be exceeded by the crew, so mitigation countermeasures must be employed. Solar particle events (SPEs) constitute high risks due to their impulsive high dose rates. Forecasting of SPEs is needed, tailored specifically to the needs of human exploration. Understanding the parameters of the SPEs that produce events leading to higher health risks for astronauts in deep space is therefore a first-priority issue. Measurements of SPE effects with active devices in LEO inside the ISS can produce important information for the specific SEP measured, relative to the specific detector location in the ISS (in a human habitat with shielding typical of manned spacecraft). Active detectors can select data from specific geomagnetic regions along the orbits, allowing geomagnetic selections that best mimic deep space radiation. We present results from data acquired in 2010 - 2012 by the detector system ALTEA inside the ISS (18 SPEs detected). We compare these data with data from the detector PAMELA on a LEO satellite, with the RAD data during Curiosity's journey to Mars, with GOES data and with several solar physical parameters. While several features of the radiation modulation are easily understood as effects of the geomagnetic field (as an example, we report a proportionality between the flux in the ISS and the energetic proton flux measured by GOES), some features appear more difficult to interpret. The final goal of this work is to find the

  7. Integration of human reliability analysis into the high consequence process

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.; Morzinski, J.

    1998-12-01

    When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated into the hazards analysis events. This paper will discuss the integration of HRA into a HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the phases of the HRA methodology. The first phase is to perform a task analysis. The second phase is the identification of the potential human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative one in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture, and the role the HRA played in that decision, will be discussed.
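
    The final quantification phase can be sketched with a SPAR-H-style adjustment of a nominal human error probability (HEP) by performance shaping factor (PSF) multipliers; the multiplier values below are assumed for illustration and are not taken from this paper's analysis:

```python
# Sketch of the quantification phase: a nominal human error probability
# (NHEP) adjusted by performance shaping factor (PSF) multipliers, in
# the style of SPAR-H. The PSF multiplier values are assumptions chosen
# for illustration only.

def adjusted_hep(nhep, psf_multipliers):
    """SPAR-H-style adjustment that keeps the result a valid probability:
    HEP = NHEP * PSFc / (NHEP * (PSFc - 1) + 1), PSFc = product of PSFs."""
    psfc = 1.0
    for m in psf_multipliers:
        psfc *= m
    return nhep * psfc / (nhep * (psfc - 1.0) + 1.0)

# Assumed task: high stress (x2) and moderately complex task (x2)
hep = adjusted_hep(0.01, [2.0, 2.0])
print(round(hep, 4))  # → 0.0388
```

    The denominator is what prevents a stack of unfavorable PSFs from pushing the adjusted probability above 1, which a naive product of multipliers would do.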

  8. The Human Impact of Floods: a Historical Review of Events 1980-2009 and Systematic Literature Review

    Science.gov (United States)

    Doocy, Shannon; Daniels, Amy; Murray, Sarah; Kirsch, Thomas D.

    2013-01-01

    Background. Floods are the most common natural disaster and the leading cause of natural disaster fatalities worldwide. The risk of catastrophic losses due to flooding is significant given deforestation and the increasing proximity of large populations to coastal areas, river basins and lakeshores. The objectives of this review were to describe the impact of flood events on human populations in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of floods were compiled using two methods: a historical review of flood events from 1980 to 2009 from multiple databases, and a systematic literature review of publications through October 2012. Analysis included descriptive statistics, bivariate tests for associations, and multinomial logistic regression of flood characteristics and mortality using Stata 11.0. Findings. There were 539,811 deaths (range: 510,941 to 568,680), 361,974 injuries and 2,821,895,005 people affected by floods between 1980 and 2009. Inconsistent reporting suggests this is an underestimate, particularly in terms of the injured and affected populations. The primary cause of flood-related mortality is drowning; in developed countries being in a motor vehicle and male gender are associated with increased mortality, whereas female gender may be linked to higher mortality in low-income countries. Conclusions. Expanded monitoring of floods, improved mitigation measures, and effective communication with civil authorities and vulnerable populations have the potential to reduce loss of life in future flood events. PMID:23857425

  9. Quantitative analysis of human behavior.

    Science.gov (United States)

    Iacovitti, G

    2010-01-01

    Many aspects of individual as well as social behaviours of human beings can be analyzed in a quantitative way using typical scientific methods, based on empirical measurements and mathematical inference. Measurements are made possible today by the large variety of sensing devices, while formal models are synthesized using modern system and information theories.

  10. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care.

    Science.gov (United States)

    Bowie, Paul; McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested "guiding tools" based on human factors principles. Mixed-methods development of guiding tools (Personal Booklet-to help with emotional demands and apply a human factors analysis at the individual level; Desk Pad-to guide a team-based systems analysis; and a written Report Format) by a multiprofessional "expert" group and testing with Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through questionnaire, telephone interviews, and thematic analysis of SEA reports. Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement.

  11. A Specification Patterns System for Discrete Event Systems Analysis

    Directory of Open Access Journals (Sweden)

    Jose Creissac Campos

    2013-08-01

    Full Text Available As formal verification tools gain popularity, the problem arises of making them more accessible to engineers. A correct understanding of the logics used to express the properties of a system's behaviour is needed in order to guarantee that properties correctly encode the intent of the verification process. Writing appropriate properties, in a logic suitable for verification, is a skilful process. Errors in this step of the process can create serious problems since a false sense of safety is gained from the analysis. However, when compared to the effort put into developing and applying modelling languages, little attention has been devoted to the process of writing properties that accurately capture verification requirements. In this paper we illustrate how a collection of property patterns can help in simplifying the process of generating logical formulae from informally expressed requirements.
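
    As an illustration of such property patterns, a minimal finite-trace check of the classic "response" pattern (in LTL, G(p → F q): every p is eventually followed by q, as in the Dwyer et al. pattern catalogue) might look like this; the trace encoding is an assumption made for the sketch:

```python
# Sketch: checking one classic specification pattern, "response"
# (globally, p is always eventually followed by q), over a finite trace.
# The pattern name follows the well-known property-pattern catalogue;
# the trace encoding (a list of sets of true propositions) is an
# illustrative assumption.

def holds_response(trace, p, q):
    """trace: list of sets of atomic propositions true at each step.
    True if every occurrence of p is discharged by a same-or-later
    occurrence of q before the trace ends."""
    pending = False
    for state in trace:
        if p in state:
            pending = True
        if q in state:   # q discharges any pending p (same step counts)
            pending = False
    return not pending

trace_ok = [{"p"}, set(), {"q"}, {"p", "q"}]
trace_bad = [{"p"}, set(), {"q"}, {"p"}]
print(holds_response(trace_ok, "p", "q"),
      holds_response(trace_bad, "p", "q"))  # → True False
```

    Packaging such patterns behind named templates is exactly what spares engineers from writing raw temporal-logic formulae by hand, which is where the mis-specification errors discussed above tend to creep in.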

  12. Analysis of meiosis regulators in human gonads

    DEFF Research Database (Denmark)

    Jørgensen, Anne; Nielsen, John E; Jensen, Martin Blomberg

    2012-01-01

    The mitosis-meiosis switch is a key event in the differentiation of germ cells. In humans, meiosis is initiated in fetal ovaries, whereas in testes meiotic entry is inhibited until puberty. The purpose of this study was to examine the expression pattern of meiosis regulators in human gonads...... and to investigate a possible role of DMRT1 in the regulation of meiotic entry. The expression pattern of DMRT1, STRA8, SCP3, DMC1, NANOS3, CYP26B1 and NANOS2 was investigated by RT-PCR and immunohistochemistry in a series of human testis samples from fetal life to adulthood, and in fetal ovaries. DMRT1...... with their role in initiation and progression of meiosis. The putative meiosis inhibitors, CYP26B1 and NANOS2, were primarily expressed in Leydig cells and spermatocytes, respectively. In conclusion, the expression pattern of the investigated meiotic regulators is largely conserved in the human gonads compared...

  13. Human Reliability Analysis for Digital Human-Machine Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2014-06-01

    This paper addresses the fact that existing human reliability analysis (HRA) methods do not provide guidance on digital human-machine interfaces (HMIs). Digital HMIs are becoming ubiquitous in nuclear power operations, whether through control room modernization or new-build control rooms. Legacy analog technologies like instrumentation and control (I&C) systems are costly to support, and vendors no longer develop or support analog technology, which is considered technologically obsolete. Yet, despite the inevitability of digital HMI, no current HRA method provides guidance on how to treat human reliability considerations for digital technologies.

  14. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    Energy Technology Data Exchange (ETDEWEB)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.
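
    How individual accident-sequence frequencies roll up into a point-estimate core damage frequency can be sketched as follows; the sequences and numbers are invented for illustration and bear no relation to the Peach Bottom results:

```python
# Sketch: a point-estimate core damage frequency (CDF) aggregates
# accident sequences, each being an initiating-event frequency times
# the failure probabilities along its event-tree branch. All sequences
# and values below are invented for illustration only.

def sequence_frequency(initiator_per_yr, failure_probs):
    f = initiator_per_yr
    for p in failure_probs:
        f *= p
    return f

sequences = [
    # (initiating-event frequency /yr, [branch failure probabilities])
    (0.1, [1e-3, 5e-2]),   # e.g. a transient with two mitigation failures
    (0.02, [2e-4]),        # e.g. a station-blackout-type contributor
]
cdf = sum(sequence_frequency(f, ps) for f, ps in sequences)
print(f"{cdf:.1e}")  # → 9.0e-06
```

    In a real PRA the branch probabilities come from fault trees with common-cause and human-error contributions, which is why, as the abstract notes, uncertainty in those inputs dominates the result.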

  15. Space Mission Human Reliability Analysis (HRA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...

  16. ARSENIC SPECIATION ANALYSIS IN HUMAN SALIVA

    Science.gov (United States)

    Background: Determination of arsenic species in human saliva is potentially useful for biomonitoring of human exposure to arsenic and for studying arsenic metabolism. However, there is no report on the speciation analysis of arsenic in saliva. Methods: Arsenic species in saliva ...

  17. Analysis of human reliability in the APS of fire. Application of NUREG-1921; Analisis de Fiabilidad Humana en el APS de Incendios. Aplicacion del NUREG-1921

    Energy Technology Data Exchange (ETDEWEB)

    Perez Torres, J. L.; Celaya Meler, M. A.

    2014-07-01

    An analysis of human reliability in a fire probabilistic safety analysis (APS) aims to identify, describe, analyze and quantify, in a traceable manner, the human actions that can affect the mitigation of an initiating event caused by a fire. (Author)

  18. Integrative analysis of 111 reference human epigenomes

    Science.gov (United States)

    Kundaje, Anshul; Meuleman, Wouter; Ernst, Jason; Bilenky, Misha; Yen, Angela; Kheradpour, Pouya; Zhang, Zhizhuo; Heravi-Moussavi, Alireza; Liu, Yaping; Amin, Viren; Ziller, Michael J; Whitaker, John W; Schultz, Matthew D; Sandstrom, Richard S; Eaton, Matthew L; Wu, Yi-Chieh; Wang, Jianrong; Ward, Lucas D; Sarkar, Abhishek; Quon, Gerald; Pfenning, Andreas; Wang, Xinchen; Claussnitzer, Melina; Coarfa, Cristian; Harris, R Alan; Shoresh, Noam; Epstein, Charles B; Gjoneska, Elizabeta; Leung, Danny; Xie, Wei; Hawkins, R David; Lister, Ryan; Hong, Chibo; Gascard, Philippe; Mungall, Andrew J; Moore, Richard; Chuah, Eric; Tam, Angela; Canfield, Theresa K; Hansen, R Scott; Kaul, Rajinder; Sabo, Peter J; Bansal, Mukul S; Carles, Annaick; Dixon, Jesse R; Farh, Kai-How; Feizi, Soheil; Karlic, Rosa; Kim, Ah-Ram; Kulkarni, Ashwinikumar; Li, Daofeng; Lowdon, Rebecca; Mercer, Tim R; Neph, Shane J; Onuchic, Vitor; Polak, Paz; Rajagopal, Nisha; Ray, Pradipta; Sallari, Richard C; Siebenthall, Kyle T; Sinnott-Armstrong, Nicholas; Stevens, Michael; Thurman, Robert E; Wu, Jie; Zhang, Bo; Zhou, Xin; Beaudet, Arthur E; Boyer, Laurie A; De Jager, Philip; Farnham, Peggy J; Fisher, Susan J; Haussler, David; Jones, Steven; Li, Wei; Marra, Marco; McManus, Michael T; Sunyaev, Shamil; Thomson, James A; Tlsty, Thea D; Tsai, Li-Huei; Wang, Wei; Waterland, Robert A; Zhang, Michael; Chadwick, Lisa H; Bernstein, Bradley E; Costello, Joseph F; Ecker, Joseph R; Hirst, Martin; Meissner, Alexander; Milosavljevic, Aleksandar; Ren, Bing; Stamatoyannopoulos, John A; Wang, Ting; Kellis, Manolis

    2015-01-01

    The reference human genome sequence set the stage for studies of genetic variation and its association with human disease, but a comparable reference has been lacking for epigenomic studies. To address this need, the NIH Roadmap Epigenomics Consortium generated the largest collection to date of human epigenomes for primary cells and tissues. Here, we describe the integrative analysis of 111 reference human epigenomes generated as part of the program, profiled for histone modification patterns, DNA accessibility, DNA methylation, and RNA expression. We establish global maps of regulatory elements, define regulatory modules of coordinated activity, and identify their likely activators and repressors. We show that disease- and trait-associated genetic variants are enriched in tissue-specific epigenomic marks, revealing biologically relevant cell types for diverse human traits, and providing a resource for interpreting the molecular basis of human disease. Our results demonstrate the central role of epigenomic information for understanding gene regulation, cellular differentiation, and human disease. PMID:25693563

  19. Analysis of "never events" following adult cardiac surgical procedures in the United States.

    Science.gov (United States)

    Robich, Michael P; Krafcik, Brianna M; Shah, Nishant K; Farber, Alik; Rybin, Denis; Siracuse, Jeffrey J

    2017-10-01

    This study was conducted to determine the risk factors, nature, and outcomes of "never events" following open adult cardiac surgical procedures. Understanding of these events can reduce their occurrence, and thereby improve patient care, quality metrics, and cost reduction. "Never events" for patients included in the Nationwide Inpatient Sample who underwent coronary artery bypass graft, heart valve repair/replacement, or thoracic aneurysm repair between 2003 and 2011 were documented. These events included air embolism, catheter-based urinary tract infection (UTI), pressure ulcer, falls/trauma, blood incompatibility, vascular catheter infection, poor glucose control, foreign object retention, wrong site surgery and mediastinitis. Analysis included characterization of preoperative demographics, comorbidities and outcomes for patients sustaining never events, and multivariate analysis of predictive risk factors and outcomes. A total of 588,417 patients meeting inclusion criteria were identified. Of these, never events occurred in 4377 cases. The majority of events were in-hospital falls, vascular catheter infections, and complications of poor glucose control. Rates of falls, catheter-based UTIs, and glucose control complications increased in 2009-2011 as compared with 2003-2008. Analysis revealed increased hospital length of stay, hospital charges, and mortality in patients who suffered a never event as compared to those who did not. This study establishes a baseline never event rate after cardiac surgery. Adverse patient outcomes and increased resource utilization resulting from never events emphasize the need for quality improvement surrounding them. A better understanding of individual patient characteristics for those at risk can help in developing protocols to decrease occurrence rates.

  20. Focus on: human adverse events to companion animal spot-ons and sprays.

    Science.gov (United States)

    2015-01-03

    Recently, the Veterinary Products Committee has taken great interest in the number of human adverse events reported following the use of companion animal products that are applied topically to prevent and treat parasite infestations. One particular question it has is whether the legal category of some of these products means that current point of sale advice is insufficient to influence pet owner behaviour in preventing these incidents. This article by the Veterinary Medicines Directorate (VMD) seeks to respond to these concerns, and to remind veterinary professionals of their responsibility to inform clients how to use the products supplied to them in a manner that is safe, not only for their pets, but also for themselves. British Veterinary Association.

  1. Time-compressed preplay of anticipated events in human primary visual cortex.

    Science.gov (United States)

    Ekman, Matthias; Kok, Peter; de Lange, Floris P

    2017-05-23

    Perception is guided by the anticipation of future events. It has been hypothesized that this process may be implemented by pattern completion in early visual cortex, in which a stimulus sequence is recreated after only a subset of the visual input is provided. Here we test this hypothesis using ultra-fast functional magnetic resonance imaging to measure BOLD activity at precisely defined receptive field locations in visual cortex (V1) of human volunteers. We find that after familiarizing subjects with a spatial sequence, flashing only the starting point of the sequence triggers an activity wave in V1 that resembles the full stimulus sequence. This preplay activity is temporally compressed compared to the actual stimulus sequence and remains present even when attention is diverted from the stimulus sequence. Preplay might therefore constitute an automatic prediction mechanism for temporal sequences in V1.

  2. Time scales of representation in the human brain: weighing past information to predict future events.

    Science.gov (United States)

    Harrison, Lee M; Bestmann, Sven; Rosa, Maria Joao; Penny, William; Green, Gary G R

    2011-01-01

    The estimates that humans make of statistical dependencies in the environment and therefore their representation of uncertainty crucially depend on the integration of data over time. As such, the extent to which past events are used to represent uncertainty has been postulated to vary over the cortex. For example, primary visual cortex responds to rapid perturbations in the environment, while frontal cortices involved in executive control encode the longer term contexts within which these perturbations occur. Here we tested whether primary and executive regions can be distinguished by the number of past observations they represent. This was based on a decay-dependent model that weights past observations from a Markov process and Bayesian Model Selection to test the prediction that neuronal responses are characterized by different decay half-lives depending on location in the brain. We show distributions of brain responses for short and long term decay functions in primary and secondary visual and frontal cortices, respectively. We found that visual and parietal responses are released from the burden of the past, enabling an agile response to fluctuations in events as they unfold. In contrast, frontal regions are more concerned with average trends over longer time scales within which local variations are embedded. Specifically, we provide evidence for a temporal gradient for representing context within the prefrontal cortex and possibly beyond to include primary sensory and association areas.
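
    The decay-weighted use of past observations described above can be sketched with an exponential half-life weighting; the observation series is invented for illustration, and this is a caricature of the paper's Markov/Bayesian model, not a reimplementation:

```python
# Sketch of a decay-weighted estimate: past observations are weighted by
# an exponential decay with a given half-life, so a short half-life
# tracks recent fluctuations ("sensory-like") while a long one follows
# the average trend ("frontal-like"). The data series is invented.

def decay_weighted_mean(observations, half_life):
    """Weight observation i (0 = oldest) by 2 ** (-(age / half_life))."""
    n = len(observations)
    weights = [2.0 ** (-(n - 1 - i) / half_life) for i in range(n)]
    total = sum(w * x for w, x in zip(weights, observations))
    return total / sum(weights)

series = [1.0] * 20 + [5.0] * 2   # a long stable context, then a jolt
short = decay_weighted_mean(series, half_life=1.0)
long_ = decay_weighted_mean(series, half_life=50.0)
print(short > long_)  # → True: the short half-life chases the jolt
```

    Comparing models with different half-lives against measured responses, as the authors do with Bayesian model selection, is what lets one assign an effective integration time scale to each brain region.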

  3. Time scales of representation in the human brain: weighing past information to predict future events

    Directory of Open Access Journals (Sweden)

    Lee eHarrison

    2011-04-01

    Full Text Available The estimates that humans make of statistical dependencies in the environment and therefore their representation of uncertainty crucially depend on the integration of data over time. As such, the extent to which past events are used to represent uncertainty has been postulated to vary over the cortex. For example, primary visual cortex responds to rapid perturbations in the environment, while frontal cortices involved in executive control encode the longer term contexts within which these perturbations occur. Here we tested whether primary and executive regions can be distinguished by the number of past observations they represent. This was based on a decay-dependent model that weights past observations from a Markov process and Bayesian Model Selection (BMS) to test the prediction that neuronal responses are characterised by different decay half-lives depending on location in the brain. We show distributions of brain responses for short and long term decay functions in primary and secondary visual and frontal cortices, respectively. We found that visual and parietal responses are released from the burden of the past, enabling an agile response to fluctuations in events as they unfold. In contrast, frontal regions are more concerned with average trends over longer time scales within which local variations are embedded. Specifically, we provide evidence for a temporal gradient for representing context within the prefrontal cortex and possibly beyond to include primary sensory and association areas.

  4. Analysis and Prediction of Exon Skipping Events from RNA-Seq with Sequence Information Using Rotation Forest

    Directory of Open Access Journals (Sweden)

    Xiuquan Du

    2017-12-01

    Full Text Available In bioinformatics, exon skipping (ES) event prediction is an essential part of alternative splicing (AS) event analysis. Although many methods have been developed to predict ES events, a solution has yet to be found. In this study, given the limitations of machine learning algorithms with RNA-Seq data or genome sequences, a new feature set, called RS (RNA-Seq and sequence features), was constructed. These features include RNA-Seq features derived from the RNA-Seq data and sequence features derived from genome sequences. We propose a novel Rotation Forest classifier to predict ES events with the RS features (RotaF-RSES). To validate the efficacy of RotaF-RSES, a dataset from two human tissues was used, and RotaF-RSES achieved an accuracy of 98.4%, a specificity of 99.2%, a sensitivity of 94.1%, and an area under the curve (AUC) of 98.6%. When compared to the other available methods, the results indicate that RotaF-RSES is efficient and can predict ES events with RS features.
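
    The characteristic Rotation Forest step (random feature subsets, PCA per subset, a block-diagonal rotation applied before training each tree) can be sketched as follows. Only the rotation is shown; the base trees and all names are assumptions for illustration, not the RotaF-RSES implementation:

```python
# Sketch of the Rotation Forest rotation step: features are split into
# random subsets, PCA is run on each subset, and the per-subset loadings
# are assembled into a block-diagonal rotation applied to the data before
# training each base tree. Base classifiers are omitted; data and names
# are invented for illustration.
import numpy as np

def rotation_matrix(X, n_subsets, rng):
    """Build a block-diagonal rotation from per-subset PCA loadings."""
    n_features = X.shape[1]
    idx = rng.permutation(n_features)
    subsets = np.array_split(idx, n_subsets)
    R = np.zeros((n_features, n_features))
    for cols in subsets:
        Xs = X[:, cols] - X[:, cols].mean(axis=0)
        # PCA loadings via eigendecomposition of the covariance matrix
        _, vecs = np.linalg.eigh(np.cov(Xs, rowvar=False))
        R[np.ix_(cols, cols)] = vecs
    return R

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
R = rotation_matrix(X, n_subsets=2, rng=rng)
X_rot = X @ R   # each tree in the ensemble gets its own rotation
print(R.shape, np.allclose(R.T @ R, np.eye(6)))  # → (6, 6) True
```

    Because each tree sees a differently rotated feature space, the ensemble gains diversity while each rotation remains invertible (orthonormal), preserving all the information in the original features.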

  5. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    Science.gov (United States)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis is constituted by a consolidated family of techniques that allows one to study the frequency and the statistical properties of high-intensity meteorological events. This kind of technique is well established and comprises standard approaches like fitting the GEV (Generalized Extreme Value) or TCEV (Two-Component Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to extend the analysis over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed in order to statistically characterize the rainfall extremes in a given region, based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate, continuous in time and space, of cells whose rainfall height value is above a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and characterize them with a number of statistics, such as their total volume, maximum spatial extension, duration, average intensity, etc. Thus, the population of events so obtained constitutes the input of a novel extreme-value characterization technique: given a certain spatial scale, a moving-window analysis is performed and all the events that fall in the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extension, maximum intensity and maximum duration are all considered for an extreme analysis and the corresponding probability distributions are fitted. The analysis thus makes it possible to statistically characterize the most intense events and, at the same time, to spatialize these rain characteristics, exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
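
    The event definition above (a connected aggregate of cells exceeding a threshold) can be sketched for a single rain map with a breadth-first labelling; the grid values and the 4-connectivity choice are illustrative assumptions (the paper's events are additionally connected in time):

```python
# Sketch: on one rain map, an "event" is a connected aggregate of cells
# whose rainfall exceeds a threshold. A BFS over 4-connected neighbours
# labels the aggregates and computes per-event statistics. Grid values
# are invented; the full method also connects events across time steps.
from collections import deque

def rain_events(grid, threshold):
    """Return per-event statistics (cell count, total volume, peak)."""
    rows, cols = len(grid), len(grid[0])
    seen, events = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or grid[r][c] < threshold:
                continue
            cells, queue = [], deque([(r, c)])
            seen.add((r, c))
            while queue:
                i, j = queue.popleft()
                cells.append(grid[i][j])
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < rows and 0 <= nj < cols
                            and (ni, nj) not in seen
                            and grid[ni][nj] >= threshold):
                        seen.add((ni, nj))
                        queue.append((ni, nj))
            events.append({"cells": len(cells), "volume": sum(cells),
                           "peak": max(cells)})
    return events

grid = [[0, 8, 9, 0],
        [0, 7, 0, 0],
        [0, 0, 0, 6]]
print(rain_events(grid, threshold=5))
```

    Per-event statistics like these (extent, volume, peak) are exactly the annual maxima that the moving-window extreme-value fits are then applied to.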

  6. Catchment process affecting drinking water quality, including the significance of rainfall events, using factor analysis and event mean concentrations.

    Science.gov (United States)

    Cinque, Kathy; Jayasuriya, Niranjali

    2010-12-01

    To ensure the protection of drinking water, an understanding of the catchment processes which can affect water quality is important, as it enables targeted catchment management actions to be implemented. In this study, factor analysis (FA) and comparison of event mean concentrations (EMCs) with baseline values were used to assess the relationships between water quality parameters and to link those parameters to processes within an agricultural drinking water catchment. FA found that 55% of the variance in the water quality data could be explained by the first factor, which was dominated by parameters usually associated with erosion. Inclusion of pathogenic indicators in an additional FA showed that Enterococcus and Clostridium perfringens (C. perfringens) were also related to the erosion factor. Analysis of the EMCs found that most parameters were significantly higher during periods of rainfall runoff. This study shows that the most dominant processes in an agricultural catchment are surface runoff and erosion. It also shows that it is these processes which mobilise pathogenic indicators and are therefore most likely to influence the transport of pathogens. Catchment management efforts need to focus on reducing the effect of these processes on water quality.
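The EMC comparison rests on a simple flow-weighted average, EMC = Σ(C·Q·Δt) / Σ(Q·Δt); a minimal sketch with hypothetical numbers (not the catchment data of the study):

```python
import numpy as np

# Hypothetical discharge (L/s) and concentration (mg/L) samples over one
# runoff event at a fixed sampling interval; values are illustrative only.
q = np.array([10.0, 40.0, 80.0, 50.0, 20.0])   # discharge
c = np.array([5.0, 12.0, 20.0, 15.0, 8.0])     # constituent concentration
dt = 600.0                                     # sampling interval, s

# Event mean concentration: flow-weighted average over the event.
emc = np.sum(c * q * dt) / np.sum(q * dt)
```

Because high concentrations coincide with high flows here, the EMC (15.2 mg/L) exceeds the simple time-average of the samples, which is exactly the kind of runoff-driven elevation the study compares against baseline values.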

  7. Culture Representation in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Gertman; Julie Marble; Steven Novack

    2006-12-01

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood, state-of-the-art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country-specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.

  8. Autonomous Gait Event Detection with Portable Single-Camera Gait Kinematics Analysis System

    Directory of Open Access Journals (Sweden)

    Cheng Yang

    2016-01-01

    Full Text Available Laboratory-based nonwearable motion analysis systems have significantly advanced with robust objective measurement of limb motion, resulting in quantified, standardized, and reliable outcome measures compared with traditional, semisubjective, observational gait analysis. However, the requirement for large laboratory space and operational expertise makes these systems impractical for gait analysis at local clinics and homes. In this paper, we focus on autonomous gait event detection with our bespoke, relatively inexpensive, and portable single-camera gait kinematics analysis system. Our proposed system includes video acquisition with camera calibration, Kalman filter + Structural-Similarity-based marker tracking, autonomous knee angle calculation, video-frame-identification-based autonomous gait event detection, and result visualization. The only operational effort required is the marker-template selection for tracking initialization, aided by an easy-to-use graphical user interface. The knee angle validation on 10 stroke patients and 5 healthy volunteers against a gold standard optical motion analysis system indicates very good agreement. The autonomous gait event detection shows high detection rates for all gait events. Experimental results demonstrate that the proposed system can automatically measure the knee angle and detect gait events with good accuracy and thus offer an alternative, cost-effective, and convenient solution for clinical gait kinematics analysis.
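The knee-angle calculation reduces to the angle at the knee marker between the thigh and shank vectors; a minimal geometric sketch (the marker coordinates are made up, and this is not the paper's exact pipeline):

```python
import numpy as np

def knee_angle(hip, knee, ankle):
    """Angle (degrees) at the knee between thigh and shank segments,
    from 2D marker image coordinates. Illustrative reimplementation only."""
    v1 = np.asarray(hip, dtype=float) - np.asarray(knee, dtype=float)
    v2 = np.asarray(ankle, dtype=float) - np.asarray(knee, dtype=float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

straight = knee_angle((0, 2), (0, 1), (0, 0))  # fully extended leg: 180 deg
flexed = knee_angle((0, 2), (0, 1), (1, 1))    # shank horizontal: 90 deg
```

Applied per video frame to the tracked marker positions, this yields the knee-angle time series from which gait events are then detected.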

  9. Analysis of events with isolated leptons and missing transverse momentum in ep collisions at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, G.

    2007-02-07

    A study of events with isolated leptons and missing transverse momentum in ep collisions is presented. Within the Standard Model (SM) such topologies are expected mainly from production of real W bosons with subsequent leptonic decay. This thesis continues the analysis of such events done in the HERA-1 period, where an excess over the SM prediction was observed for events with high hadronic transverse momentum P{sup X}{sub T}>25 GeV. New data of the HERA-2 period are added. The analysed data sample recorded in e{sup +}p collisions corresponds to an integrated luminosity of 220 pb{sup -1}, which is a factor of two more with respect to the HERA-1 analysis. The e{sup -}p data correspond to 186 pb{sup -1}, which is a factor of 13 more with respect to HERA-1. All three lepton generations (electrons, muons and tau leptons) are analysed. In the electron and muon channels a total of 53 events are observed in 406 pb{sup -1}. This compares well to the SM expectation of 53.7{+-}6.5 events, dominated by W production. However, a difference in the event rate is observed for different electron beam charges. In e{sup +}p data the excess of events with P{sup X}{sub T}>25 GeV is sustained, while the e{sup -}p data agree with the SM. In the tau channel 18 events are observed in all HERA data, with 20{+-}3 expected from the SM. The events are dominated by irreducible background from charged currents. The contribution from W production amounts to about 22%. One event with P{sup X}{sub T}>25 GeV is observed, where 1.4{+-}0.3 are expected from the SM. (orig.)

  10. Analysis of geohazards events along Swiss roads from autumn 2011 to present

    Science.gov (United States)

    Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    In Switzerland, roads and railways are threatened throughout the year by several natural hazards. Some of these events reach transport infrastructure many times per year, leading to the closure of transportation corridors, loss of access, travel deviations and sometimes infrastructure damage and loss of human lives (3 fatalities during the period considered). The aim of this inventory of events is to investigate the number of natural events affecting roads and railways in Switzerland from autumn 2011 until now. Natural hazards affecting roads and railways can be classified in five categories: rockfalls, landslides, debris flows, snow avalanches and floods. They potentially cause several important direct damages to transportation infrastructure (roads, railways), vehicles (slightly or severely damaged) or human life (slightly or seriously injured persons, deaths). These direct damages can be easily evaluated from press articles or from Swiss police press releases. Indirect damages such as deviation costs are not taken into account in this work. During the last two and a half years, about 50 events affecting Swiss road and railway infrastructure were inventoried. The proportion of events due to rockfalls is 45%, to landslides 25%, to debris flows 15%, to snow avalanches 10% and to floods 5%. During this period, there were three fatalities and two injured persons, while 23 vehicles (cars, trains and a coach) and 24 roads and railways were damaged. We can see that floods occur mainly on the Swiss Plateau whereas rockfalls, debris flows, snow avalanches and landslides are mostly located in the Alpine area. Most events occur on secondary mountain roads and railways. The events are well distributed over the whole Alpine area except for the Gotthard hotspot, where an important European North-South motorway (hit in 2003 with two fatalities) and railway (hit three times in 2012 with one fatality) are more frequently affected. According to the observed events in border regions of

  11. Discrimination and numerical analysis of human pathogenic ...

    African Journals Online (AJOL)

    Discrimination and numerical analysis of human pathogenic Candida albicans strains based on SDSPAGE protein profiles. ... obtaining a correct identification, both the commercial yeast kit system and the numerical analysis of whole-cell protein patterns can be useful for the more reliable identification of C. albicans strains.

  12. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    Science.gov (United States)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2017-04-01

    We analyze a cluster of intermediate deep events in the eastern part of the Hellenic subduction zone (HSZ), recorded during the deployment of the temporary seismic network EGELADOS, in order to gain a better understanding of geodynamic processes in the HSZ, in particular in its eastern part. The cluster consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. Using three-component similarity analysis, both spatial and temporal clustering of the recorded events is studied. Waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The cross-correlation coefficients at the single stations show a decrease in similarity with increasing epicentral distance as well as the effect of local heterogeneities at particular stations, causing noticeable differences in waveform similarities. However, highly similar events tend to occur in the preferred depth range between 120 and 150 km. The double-difference earthquake relocation software HypoDD was used to perform the event relocation. The results are compared with previously obtained single-event locations, which were calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential traveltimes obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after the relocation the inter-event distance for highly similar events has been reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, where the event rate, number of aftershocks and occurrence time of the aftershocks are varied, it is shown that the event-time distribution follows an almost random Poisson time distribution with a slightly increasing event rate, without indications of substantial inter-event triggering. The spatial distribution of the cluster can be modelled by a two
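The waveform similarity measure underlying such cluster analyses is typically the maximum normalized cross-correlation coefficient over all lags; a sketch on synthetic waveforms (illustrative only, not the EGELADOS processing chain):

```python
import numpy as np

def max_cc(x, y):
    """Maximum normalized cross-correlation coefficient between two
    equal-length waveforms, searched over all lags (illustrative sketch)."""
    x = (x - x.mean()) / (x.std() * len(x))
    y = (y - y.mean()) / y.std()
    return float(np.max(np.correlate(x, y, mode="full")))

t = np.linspace(0.0, 1.0, 500)
sig = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)   # synthetic "event" waveform
shifted = np.roll(sig, 20)                          # same event, time-shifted
noise = np.random.default_rng(1).standard_normal(t.size)

cc_same = max_cc(sig, shifted)   # highly similar pair: coefficient near 1
cc_diff = max_cc(sig, noise)     # dissimilar pair: coefficient near 0
```

Thresholding such coefficients, station by station, is what separates "highly similar" event pairs from the rest before relocation.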

  13. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
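One common way to split satellite-vs-reference event errors into systematic and random components is via the mean and spread of per-event ratios; the numbers below are hypothetical, and this particular metric choice is an assumption, not necessarily the study's exact formulation:

```python
import numpy as np

# Hypothetical matched event pairs: reference vs. satellite-derived values
# (e.g., event cumulative rainfall depth, mm); illustrative numbers only.
ref = np.array([55.0, 120.0, 80.0, 42.0, 95.0])
sat = np.array([48.0, 100.0, 70.0, 40.0, 78.0])

ratio = sat / ref
systematic = float(np.mean(ratio))          # < 1 indicates underestimation
random_err = float(np.std(ratio, ddof=1))   # spread around the systematic bias
```

Applying the same decomposition to the rainfall inputs and to the simulated runoff is what reveals the dampening effect described above.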

  14. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    Science.gov (United States)

    Van Strien, Jan W.; Isbell, Lynne A.

    2017-01-01

    Studies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to partially exposed snake models and scale patterns on the snake skin. Here, we examined whether snake skin patterns and partially exposed snakes elicit a larger EPN in humans. In Task 1, we employed pictures with close-ups of snake skins, lizard skins, and bird plumage. In Task 2, we employed pictures of partially exposed snakes, lizards, and birds. Participants watched a random rapid serial visual presentation of these pictures. The EPN was scored as the mean activity (225–300 ms after picture onset) at occipital and parieto-occipital electrodes. Consistent with previous studies, and with the Snake Detection Theory, the EPN was significantly larger for snake skin pictures than for lizard skin and bird plumage pictures, and for lizard skin pictures than for bird plumage pictures. Likewise, the EPN was larger for partially exposed snakes than for partially exposed lizards and birds. The results suggest that the EPN snake effect is partly driven by snake skin scale patterns which are otherwise rare in nature. PMID:28387376

  15. Striking life events associated with primary breast cancer susceptibility in women: a meta-analysis study

    OpenAIRE

    Lin, Yan; Wang, Changjun; Zhong, Ying; Huang, Xin; Peng, Li; Shan, Guangliang; Wang, Ke; Sun, Qiang

    2013-01-01

    Purpose The association between striking life events, an important stress and acute anxiety disorder, and the occurrence of primary breast cancer is unclear. The current meta-analysis was designed to assess the relationship between striking life events and primary breast cancer incidence in women. Methods Systematic computerized searching of the PubMed, ScienceDirect, Embase, and BMJ databases with the combinations of controlled descriptors from Mesh, including breast cancer, breast tumor, ca...

  16. Attribution of extreme events in the western US to human activities

    Science.gov (United States)

    Mera, R. J.

    2015-12-01

    A project to investigate the role of human activities in the changing nature of extreme events in the western US began as part of a CLIVAR-sponsored Postdocs Applying Climate Expertise (PACE) project. The climate institution was Oregon State University and the application partner was the Oregon Department of Land Conservation and Development (DLCD). DLCD was interested in changes in weather extremes in the Pacific Northwest, specifically extreme rainfall, flooding, and droughts. The project employs very large ensembles of regional model simulations through volunteer computing resources and allows for probabilistic event attribution (PEA), an important climate research technique. The model was found to have good representation of atmospheric rivers, a major source of extreme precipitation in the Pacific Northwest. The model domain also encompasses California and Nevada. One of the studies focused on attribution of extreme heat in relation to vulnerable populations in California's Central Valley, where heat waves have become progressively more severe due to increasing nighttime temperatures. Specifically, we found that (1) simulations of the hottest summer days during the 2000s were twice as likely to occur using observed levels of greenhouse gases than in a counterfactual world without major human activities, (2) detrimental impacts of heat on public health-relevant variables, such as the number of days above 40°C, can be quantified and attributed to human activities using PEA, and (3) PEA can serve as a tool for addressing climate justice concerns of populations within developed nations. The research conducted through the PACE program has also provided a framework for a pioneering climate attribution study at the Union of Concerned Scientists (UCS). The UCS project takes advantage of new research that shows that nearly two-thirds of carbon pollution released into the atmosphere, reported as carbon dioxide equivalent with hundred-year global warming

  17. Human Papillomavirus Status and the Risk of Cerebrovascular Events Following Radiation Therapy for Head and Neck Cancer.

    Science.gov (United States)

    Addison, Daniel; Seidelmann, Sara B; Janjua, Sumbal A; Emami, Hamed; Staziaki, Pedro V; Hallett, Travis R; Szilveszter, Bálint; Lu, Michael T; Cambria, Richard P; Hoffmann, Udo; Chan, Annie W; Wirth, Lori J; Neilan, Tomas G

    2017-08-30

    Radiation therapy (RT) is a standard treatment for head and neck cancer; however, it is associated with inflammation, accelerated atherosclerosis, and cerebrovascular events (CVEs; stroke or transient ischemic attack). Human papillomavirus (HPV) is found in nearly half of head and neck cancers and is associated with inflammation and atherosclerosis. Whether HPV confers an increased risk of CVEs after RT is unknown. Using an institutional database, we identified all consecutive patients treated with RT from 2002 to 2012 for head and neck cancer who were tested for HPV. The outcome of interest was the composite of ischemic stroke and transient ischemic attack, and the association between HPV and CVEs was assessed using Cox proportional hazard models, competing risk analysis, and inverse probability weighting. Overall, 326 participants who underwent RT for head and neck cancer were tested for HPV (age 59±12 years, 75% were male, 9% had diabetes mellitus, 45% had hypertension, and 61% were smokers), of which 191 (59%) were tumor HPV positive. Traditional risk factors for CVEs were similar between HPV-positive and -negative patients. Over a median follow-up of 3.4 years, there were 18 ischemic strokes and 5 transient ischemic attacks (event rate of 1.8% per year). The annual event rate was higher in the HPV-positive patients compared with the HPV-negative patients (2.6% versus 0.9%, P=0.002). In a multivariable model, HPV-positive status was associated with a >4 times increased risk of CVEs (hazard ratio: 4.4; 95% confidence interval, 1.5-13.2; P=0.008). In this study, HPV-positive status is associated with an increased risk of stroke or transient ischemic attack following RT for head and neck cancer. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  18. Leveraging the World Cup: Mega Sporting Events, Human Rights Risk, and Worker Welfare Reform in Qatar

    Directory of Open Access Journals (Sweden)

    Sarath Ganji

    2016-12-01

    Full Text Available Qatar will realize its decades-long drive to host a mega sporting event when, in 2022, the opening ceremony of the Fédération Internationale de Football Association (FIFA) World Cup commences. By that time, the Qatari government will have invested at least $200 billion in real estate and development projects, employing anywhere between 500,000 and 1.5 million foreign workers to do so. The scale of these preparations is staggering — and not necessarily positive. Between 2010 and 2013, more than 1,200 labor migrants working in Qatar’s construction sector died, with another 4,000 deaths projected by the start of the event. Foreign workers are subject to conditions of forced labor, human trafficking, and indefinite detention. Advocacy groups cite deplorable living and working conditions, coupled with lax legal protections for workers, as the main culprits. Absent significant improvements in worker welfare, Qatar’s World Cup will be remembered as a human rights tragedy. This article examines whether it is possible for Qatar’s World Cup to forge a different legacy, as an agent of change on behalf of worker welfare reform. In examining the issue, the article takes a two-fold approach. First, it locates the policy problem of worker welfare abuses in the context of the migration life cycle. The migration life cycle represents the range of activities that mediate the relationship between an individual migrant and the labor migration system — from the time the migrant first considers working overseas to his employment abroad to his eventual return to the home country. An understanding of worker welfare abuses in Qatar does not begin or end with reports of migrant deaths. A much broader pattern of abuse exists that, if ignored, will undermine effective policy responses. Second, the article frames worker welfare as a matter that lies at the intersection of business and human rights. Mega events are large-scale, internationally recognized activities

  19. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the mentioned event was reported, the details of the incident were elaborated through reviewing records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the mentioned event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  20. The human impact of volcanoes: a historical review of events 1900-2009 and systematic literature review.

    Science.gov (United States)

    Doocy, Shannon; Daniels, Amy; Dooling, Shayna; Gorokhovich, Yuri

    2013-04-16

    Introduction. More than 500 million people live within the potential exposure range of a volcano. The risk of catastrophic losses in future eruptions is significant given population growth, proximities of major cities to volcanoes, and the possibility of larger eruptions. The objectives of this review are to describe the impact of volcanoes on the human population, in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of volcanoes were compiled using two methods, a historical review of volcano events from 1900 to 2009 from multiple databases and a systematic literature review of publications ending in October 2012. Analysis included descriptive statistics and bivariate tests for associations between volcano mortality and characteristics using STATA 11. Findings. There were a total of 91,789 deaths (range: 81,703-102,372), 14,068 injuries (range 11,541-17,922), and 4.72 million people affected by volcanic events between 1900 and 2008. Inconsistent reporting suggests this is an underestimate, particularly in terms of numbers injured and affected. The primary causes of mortality in recent volcanic eruptions were ash asphyxiation, thermal injuries from pyroclastic flow, and trauma. Mortality was concentrated with the ten deadliest eruptions accounting for more than 80% of deaths; 84% of fatalities occurred in four locations (the Island of Martinique (France), Colombia, Indonesia, and Guatemala). Conclusions. Changes in land use practices and population growth provide a background for increasing risk; in conjunction with increasing urbanization in at risk areas, this poses a challenge for future volcano preparedness and mitigation efforts.

  1. Global Gene Expression Profiling and Alternative Splicing Events during the Chondrogenic Differentiation of Human Cartilage Endplate-Derived Stem Cells

    Directory of Open Access Journals (Sweden)

    Jin Shang

    2015-01-01

    Full Text Available Low back pain (LBP) is a very prevalent disease, and degenerative disc diseases (DDDs) usually account for LBP. However, the pathogenesis of DDDs is complicated and difficult to elucidate. Alternative splicing is a sophisticated regulatory process which greatly increases the cellular complexity and phenotypic diversity of eukaryotic organisms. In addition, cartilage endplate-derived stem cells have been discovered and identified by our research group. In this paper, we continue to investigate gene expression profiling and alternative splicing events during chondrogenic differentiation of cartilage endplate-derived stem cells. We adopted the Affymetrix Human Transcriptome Array 2.0 (HTA 2.0) to compare the transcriptional and splicing changes between the control and differentiated samples. RT-PCR and quantitative PCR were used to validate the microarray results. GO and KEGG pathway analyses were also performed. After bioinformatics analysis of the data, we detected 1953 differentially expressed genes. In terms of alternative splicing, the Splicing Index algorithm was used to select alternatively spliced genes, and we detected 4411 alternatively spliced genes. GO and KEGG pathway analysis also revealed several functionally involved biological processes and signaling pathways. To our knowledge, this is the first study to investigate alternative splicing mechanisms in the chondrogenic differentiation of stem cells on a genome-wide scale.

  2. MODELING HUMAN RELIABILITY ANALYSIS USING MIDAS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Donald D. Dudenhoeffer; Bruce P. Hallbert; Brian F. Gore

    2006-05-01

    This paper summarizes an emerging collaboration between Idaho National Laboratory and NASA Ames Research Center regarding the utilization of high-fidelity MIDAS simulations for modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (i) the estimation of human error with novel control room equipment and configurations, (ii) the investigative determination of risk significance in recreating past event scenarios involving control room operating crews, and (iii) the certification of novel staffing levels in control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of risk in next generation control rooms.

  3. Sources of Error and the Statistical Formulation of M S: m b Seismic Event Screening Analysis

    Science.gov (United States)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H 0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m b) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m b greater than 3.5. The Rayleigh wave magnitude (denoted M S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (physical correction model) for path and distance effects between event and station. Relative to m b, earthquakes generally have a larger M S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M S and m b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H 0: explosion characteristics.
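The screening test can be sketched as a one-sided z-test on the statistic D = M_S - m_b against an explosion population; the values of mu0 and sigma below are hypothetical placeholders, not the article's calibrated numbers (the article's point is precisely that sigma should include a model-inadequacy term):

```python
def screen_event(ms, mb, mu0=-1.2, sigma=0.34):
    """One-sided test of H0: explosion characteristics on D = MS - mb.
    mu0 is a hypothetical explosion-population mean of D and sigma a
    hypothetical standard error (including model inadequacy).
    Large positive z favors rejecting H0, i.e. earthquake-like."""
    d = ms - mb
    return (d - mu0) / sigma

z_eq = screen_event(ms=4.8, mb=4.5)   # strong surface waves: earthquake-like
z_ex = screen_event(ms=3.5, mb=4.5)   # weak surface waves: explosion-like
```

With a one-sided 5% level (critical z of about 1.645), the first event would be screened out as an earthquake while the second, like the 2009 DPRK test in the article, fails to reject H0.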

  4. Dynamic Human Reliability Analysis: Benefits and Challenges of Simulating Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    To date, there has been considerable work on dynamic event trees and other areas related to dynamic probabilistic safety assessment (PSA). The counterpart to these efforts in human reliability analysis (HRA) has centered on the development of specific methods to account for the dynamic nature of human performance. In this paper, the author posits that the key to dynamic HRA is not in the development of specific methods but in the utilization of cognitive modeling and simulation to produce a framework of data that may be used in quantifying the likelihood of human error. This paper provides an overview of simulation approaches to HRA; reviews differences between first, second, and dynamic generation HRA; and outlines potential benefits and challenges of this approach.

  5. Human brain EEG indices of emotions: delineating responses to affective vocalizations by measuring frontal theta event-related synchronization.

    Science.gov (United States)

    Bekkedal, Marni Y V; Rossi, John; Panksepp, Jaak

    2011-10-01

    At present there is no direct brain measure of basic emotional dynamics from the human brain. EEG provides non-invasive approaches for monitoring brain electrical activity in response to emotional stimuli. Event-related desynchronization/synchronization (ERD/ERS) analysis, based on power shifts in specific frequency bands, has some potential as a method for differentiating responses to basic emotions as measured during brief presentations of affective stimuli. Although there appears to be fairly consistent theta ERS in frontal regions of the brain during the earliest phases of processing affective auditory stimuli, the patterns do not readily distinguish between specific emotions. To date it has not been possible to consistently differentiate brain responses to emotion-specific affective states or stimuli, and some evidence suggests that the theta ERS more likely measures general arousal processes rather than yielding veridical indices of specific emotional states. Perhaps cortical EEG patterns will never be able to be used to distinguish discrete emotional states from the surface of the brain. The implications and limitations of such approaches for understanding human emotions are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
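ERD/ERS is conventionally quantified as the percent change in band power of a post-stimulus epoch relative to a pre-stimulus baseline; a minimal sketch on synthetic data (the injected 6 Hz component stands in for a theta ERS, and none of this reproduces the study's recordings):

```python
import numpy as np

def band_power(x, fs, lo, hi):
    """Mean spectral power of signal x in the [lo, hi] Hz band (FFT sketch)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.mean(psd[mask]))

fs = 250.0
t = np.arange(0, 1.0, 1.0 / fs)
baseline = np.random.default_rng(2).standard_normal(t.size)  # pre-stimulus EEG
# Post-stimulus epoch with an added 6 Hz (theta) oscillation -- synthetic.
post = baseline + 2.0 * np.sin(2 * np.pi * 6.0 * t)

# ERS expressed as percent power increase in the 4-8 Hz theta band.
base_pow = band_power(baseline, fs, 4.0, 8.0)
post_pow = band_power(post, fs, 4.0, 8.0)
ers_percent = 100.0 * (post_pow - base_pow) / base_pow
```

A positive value indicates synchronization (ERS), a negative value desynchronization (ERD); the frontal theta effect discussed above corresponds to a positive percentage in this convention.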

  6. The human impact of tropical cyclones: a historical review of events 1980-2009 and systematic literature review.

    Science.gov (United States)

    Doocy, Shannon; Dick, Anna; Daniels, Amy; Kirsch, Thomas D

    2013-04-16

    Background. Cyclones have significantly affected populations in Southeast Asia, the Western Pacific, and the Americas over the past quarter of a century. Future vulnerability to cyclones will increase due to factors including population growth, urbanization, increasing coastal settlement, and global warming. The objectives of this review were to describe the impact of cyclones on human populations in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of cyclones were compiled using two methods, a historical review from 1980 to 2009 of cyclone events from multiple databases and a systematic literature review of publications ending in October 2012. Analysis included descriptive statistics and bivariate tests for associations between cyclone characteristics and mortality using Stata 11.0. Findings. There were 412,644 deaths, 290,654 injured, and 466.1 million people affected by cyclones between 1980 and 2009, and the mortality and injury burden was concentrated in less developed nations of Southeast Asia and the Western Pacific. Inconsistent reporting suggests this is an underestimate, particularly in terms of the injured and affected populations. The primary cause of cyclone-related mortality is drowning; in developed countries male gender was associated with increased mortality risk, whereas females experienced higher mortality in less developed countries. Conclusions. Additional attention to preparedness and early warning, particularly in Asia, can lessen the impact of future cyclones.

  7. Human Motion Analysis for Creating Immersive Experiences

    OpenAIRE

    Abedan Kondori, Farid

    2012-01-01

    From an early age, people display the ability to quickly and effortlessly interpret the orientation and movement of human body parts, thereby allowing one to infer the intentions of others who are nearby and to comprehend an important nonverbal form of communication. The ease with which one accomplishes this task belies the difficulty of a problem that has challenged computational systems for decades: human motion analysis. Technological developments over the years have resulted in many systems...

  8. Patents and human rights: a heterodox analysis.

    Science.gov (United States)

    Gold, E Richard

    2013-01-01

    Much international debate over access to medicines focuses on whether patent law accords with international human rights law. This article argues that this is the wrong question to ask. Following an analysis of both patent and human rights law, this article suggests that the better approach is to focus on national debates over the best calibration of patent law to achieve national objectives. © 2013 American Society of Law, Medicine & Ethics, Inc.

  9. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Science.gov (United States)

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response analyses, as well as heterogeneity, publication bias, subgroup, and meta-regression analyses, were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97 per 0.1 mEq/L; P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
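
The pooled relative risks quoted above come from a meta-analysis of per-study effect sizes; the DerSimonian-Laird random-effects estimator is the classic way to compute such a pooled estimate. A sketch on the log-RR scale with invented study inputs (the paper's actual study-level data are not reproduced here):

```python
import math

def dl_pool(log_rr, se):
    """DerSimonian-Laird random-effects pooling of per-study log relative
    risks. Returns (pooled RR, 95% CI lower, 95% CI upper)."""
    w = [1.0 / s ** 2 for s in se]                      # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)        # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical log relative risks and standard errors for three studies:
rr, lo, hi = dl_pool([-0.22, -0.11, -0.36], [0.08, 0.10, 0.15])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

When the between-study variance estimate is zero, the result coincides with fixed-effect inverse-variance pooling.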

  10. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS) as prescribed in DOE Order 232.2 “Occurrence Reporting and Processing of Operations Information” requires a quarterly analysis of events, both reportable and not reportable for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at the Idaho National Laboratory (INL) during the period of October 2012 through September 2013.

  11. FFTF Event Fact Sheet root cause analysis calendar year 1985 through 1988

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, G.B.

    1988-12-01

    The Event Fact Sheets written from January 1985 through mid-August 1988 were reviewed to determine their root causes. The review group represented many of the technical disciplines present in plant operation. The review was initiated as an internal critique aimed at maximizing the "lessons learned" from the event reporting system. The root causes were subjected to a Pareto analysis to determine the significant causal factor groups. Recommendations for correction of the high-frequency causal factors were then developed and presented to the FFTF Plant management. In general, the distributions of the causal factors were found to closely follow the industry averages. The impacts of the events were also studied, and it was determined that we generally report events of a level of severity below that of the available studies. It is therefore concluded that the recommendations for corrective action are ones to improve the overall quality of operations and not to correct significant operational deficiencies. 17 figs.

  12. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of human reliability analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics: a systems analysis process that employs cogent heuristics when using opinion and tempers itself with rational debate over the weight given to subjective and empirical probabilities.
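
The Bayesian approach the author advocates can be made concrete with the standard conjugate Beta-Binomial update, in which a subjective prior on an error probability is tempered by empirical observations. The prior parameters and trial counts below are purely hypothetical:

```python
def update_hep(alpha, beta, errors, trials):
    """Beta-Binomial update of a human error probability (HEP): a Beta(alpha,
    beta) prior (e.g. elicited expert opinion) combined with `errors` observed
    in `trials` task executions. Returns posterior parameters and mean."""
    a = alpha + errors
    b = beta + (trials - errors)
    return a, b, a / (a + b)

# Hypothetical: experts put the HEP near 0.05 (Beta(1, 19) prior);
# simulator data then show 2 errors in 100 task executions.
a, b, mean = update_hep(1, 19, 2, 100)
print(f"posterior mean HEP = {mean:.3f}")  # prior mean 0.05 pulled toward the 0.02 data rate
```

The posterior mean sits between the subjective prior mean and the empirical error rate, weighted by how much data is available, which is exactly the tempering of opinion by evidence that the abstract describes.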

  13. Vibrational microspectroscopy analysis of human lenses

    Science.gov (United States)

    Paluszkiewicz, C.; Piergies, N.; Sozańska, A.; Chaniecki, P.; Rękas, M.; Miszczyk, J.; Gajda, M.; Kwiatek, W. M.

    2018-01-01

    In this study we present vibrational analysis of healthy (non-affected by cataract) and cataractous human lenses by means of Raman and FTIR spectroscopy methods. The performed analysis provides complex information about the secondary structure of the proteins and conformational changes of the amino acid residues due to the formation of opacification of human lens. Briefly, the changes in the conformation of the Tyr and Trp residues and the protein secondary structure between the healthy and cataractous samples, were recognized. Moreover, the observed spectral pattern suggests that the process of cataract development does not occur uniformly over the entire volume of the lens.

  14. One Health and Cyanobacteria in Freshwater Systems: Animal Illnesses and Deaths Are Sentinel Events for Human Health Risks

    Directory of Open Access Journals (Sweden)

    Elizabeth D. Hilborn

    2015-04-01

    Harmful cyanobacterial blooms have adversely impacted human and animal health for thousands of years. Recently, the health impacts of harmful cyanobacteria blooms are becoming more frequently detected and reported. However, reports of human and animal illnesses or deaths associated with harmful cyanobacteria blooms tend to be investigated and reported separately. Consequently, professionals working in human or in animal health do not always communicate findings related to these events with one another. Using the One Health concept of integration and collaboration among health disciplines, we systematically review the existing literature to discover where harmful cyanobacteria-associated animal illnesses and deaths have served as sentinel events to warn of potential human health risks. We find that illnesses or deaths among livestock, dogs and fish are all potentially useful as sentinel events for the presence of harmful cyanobacteria that may impact human health. We also describe ways to enhance the value of reports of cyanobacteria-associated illnesses and deaths in animals to protect human health. Efficient monitoring of environmental and animal health in a One Health collaborative framework can provide vital warnings of cyanobacteria-associated human health risks.

  16. Analysis of events related to cracks and leaks in the reactor coolant pressure boundary

    Energy Technology Data Exchange (ETDEWEB)

    Ballesteros, Antonio, E-mail: Antonio.Ballesteros-Avila@ec.europa.eu [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Sanda, Radian; Peinador, Miguel; Zerger, Benoit [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Negri, Patrice [IRSN: Institut de Radioprotection et de Sûreté Nucléaire (France); Wenke, Rainer [GRS: Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH (Germany)

    2014-08-15

    Highlights: • The important role of Operating Experience Feedback is emphasised. • Events relating to cracks and leaks in the reactor coolant pressure boundary are analysed. • A methodology for event investigation is described. • Some illustrative results of the analysis of events for specific components are presented. - Abstract: The presence of cracks and leaks in the reactor coolant pressure boundary may jeopardise the safe operation of nuclear power plants. Analysis of crack- and leak-related events is an important task for the prevention of their recurrence, and it should be performed in the context of activities on Operating Experience Feedback. In response to this concern, the EU Clearinghouse operated by the JRC-IET supports and develops technical and scientific work to disseminate the lessons learned from past operating experience. In particular, concerning cracks and leaks, the studies carried out in collaboration with IRSN and GRS have made it possible to identify the areas of the plant primary system most sensitive to degradation and to elaborate recommendations for upgrading the maintenance, ageing management and inspection programmes. An overview of the methodology used in the analysis of crack- and leak-related events is presented in this paper, together with the relevant results obtained in the study.

  17. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, together with human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We also developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for the 79 cases induced by human error, time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed, as was MARSTEC, a multimedia authoring and representation system for trip information, and techniques for detecting human error in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  18. Human milk metagenome: a functional capacity analysis

    Science.gov (United States)

    2013-01-01

    Background Human milk contains a diverse population of bacteria that likely influences colonization of the infant gastrointestinal tract. Recent studies, however, have been limited to characterization of this microbial community by 16S rRNA analysis. In the present study, a metagenomic approach using Illumina sequencing of a pooled milk sample (ten donors) was employed to determine the genera of bacteria and the types of bacterial open reading frames in human milk that may influence bacterial establishment and stability in this primal food matrix. The human milk metagenome was also compared to that of breast-fed and formula-fed infants' feces (n = 5, each) and mothers' feces (n = 3) at the phylum level and at a functional level using open reading frame abundance. Additionally, immune-modulatory bacterial-DNA motifs were searched for within human milk. Results The bacterial community in human milk contained over 360 prokaryotic genera, with sequences aligning predominantly to the phyla of Proteobacteria (65%) and Firmicutes (34%), and the genera of Pseudomonas (61.1%), Staphylococcus (33.4%) and Streptococcus (0.5%). From assembled human milk-derived contigs, 30,128 open reading frames were annotated and assigned to functional categories. When compared to the metagenome of infants' and mothers' feces, the human milk metagenome was less diverse at the phylum level, and contained more open reading frames associated with nitrogen metabolism, membrane transport and stress response (P < 0.05). The human milk metagenome also contained a similar occurrence of immune-modulatory DNA motifs to that of infants' and mothers' fecal metagenomes. Conclusions Our results further expand the complexity of the human milk metagenome and enforce the benefits of human milk ingestion on the microbial colonization of the infant gut and immunity. Discovery of immune-modulatory motifs in the metagenome of human milk indicates more exhaustive analyses of the functionality of the human...

  19. Erectile dysfunction and cardiovascular events in diabetic men: a meta-analysis of observational studies.

    Directory of Open Access Journals (Sweden)

    Tomohide Yamada

    BACKGROUND: Several studies have shown that erectile dysfunction (ED) influences the risk of cardiovascular events (CV events). However, a meta-analysis of the overall risk of CV events associated with ED in patients with diabetes has not been performed. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE and the Cochrane Library for pertinent articles (including references) published between 1951 and April 22, 2012. English-language reports of original observational cohort studies and cross-sectional studies were included. Pooled effect estimates were obtained by random-effects meta-analysis. A total of 3,791 CV events were reported in 3 cohort studies and 9 cross-sectional studies (covering 22,586 subjects). Across the cohort studies, the overall odds ratio (OR) of diabetic men with ED versus those without ED was 1.74 (95% confidence interval [CI]: 1.34-2.27; P < 0.05). Moreover, meta-regression analysis found no relationship between the method used to assess ED (questionnaire or interview), mean age, mean hemoglobin A(1c), mean body mass index, or mean duration of diabetes and the risk of CV events or CHD. In the cross-sectional studies, the OR of diabetic men with ED versus those without ED was 3.39 (95% CI: 2.58-4.44; P<0.001) for CV events (N = 9), 3.43 (95% CI: 2.46-4.77; P<0.001) for CHD (N = 7), and 2.63 (95% CI: 1.41-4.91; P = 0.002) for peripheral vascular disease (N = 5). CONCLUSION/SIGNIFICANCE: ED was associated with an increased risk of CV events in diabetic patients. Prevention and early detection of cardiovascular disease are important in the management of diabetes, especially in view of the rapid increase in its prevalence.

  20. Brain Network Activation Analysis Utilizing Spatiotemporal Features for Event Related Potentials Classification

    Directory of Open Access Journals (Sweden)

    Yaki Stern

    2016-12-01

    The purpose of this study was to introduce an improved tool for automated classification of event-related potentials (ERPs) using spatiotemporally parcellated events incorporated into a functional brain network activation (BNA) analysis. The auditory oddball ERP paradigm was selected to demonstrate and evaluate the improved tool. Methods: The ERPs of each subject were decomposed into major dynamic spatiotemporal events. Then, a set of spatiotemporal events representing the group was generated by aligning and clustering the spatiotemporal events of all individual subjects. The temporal relationship between the common group events generated a network, which is the spatiotemporal reference BNA model. Scores were derived by comparing each subject's spatiotemporal events to the reference BNA model and were then entered into a support vector machine classifier to classify subjects into relevant subgroups. The reliability of the BNA scores (test-retest repeatability using intraclass correlation) and their utility as a classification tool were examined in the context of Target-Novel classification. Results: BNA intraclass correlation values of repeatability ranged between 0.51 and 0.82 for the known ERP components N100, P200 and P300. Classification accuracy was high when the trained data were validated on the same subjects for different visits (AUCs 0.93 and 0.95). The classification accuracy remained high for a test group recorded at a different clinical center with a different recording system (AUCs 0.81 and 0.85 for 2 visits). Conclusion: The improved spatiotemporal BNA analysis demonstrates high classification accuracy. The BNA analysis method holds promise as a tool for diagnosis, follow-up and drug development associated with different neurological conditions.
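
The classification accuracy above is reported as AUC, which for a two-class problem equals the probability that a randomly chosen member of one class receives a higher classifier score than a randomly chosen member of the other (the Mann-Whitney formulation). A small sketch with made-up scores standing in for the BNA-derived classifier outputs:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen positive outscores a randomly chosen
    negative, with ties counting one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for Target vs. Novel subjects:
target = [0.91, 0.84, 0.78, 0.66]
novel = [0.70, 0.55, 0.48, 0.30]
print(auc(target, novel))  # 0.9375: only the pair (0.66, 0.70) is misordered
```

An AUC of 1.0 means the two score distributions are perfectly separated; 0.5 means the classifier is no better than chance.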

  1. Review of evoked and event-related delta responses in the human brain.

    Science.gov (United States)

    Güntekin, Bahar; Başar, Erol

    2016-05-01

    In the last decade, the brain's oscillatory responses have invaded the literature. Studies of delta (0.5-3.5 Hz) oscillatory responses in humans upon application of cognitive paradigms have shown that delta oscillations are related to cognitive processes, mainly decision making and attentional processes. The present manuscript comprehensively reviews the studies on delta oscillatory responses upon cognitive stimulation in healthy subjects and in different pathologies, namely Alzheimer's disease, Mild Cognitive Impairment (MCI), bipolar disorder, schizophrenia and alcoholism. Delta oscillatory responses upon presentation of faces, facial expressions, and affective pictures are also reviewed. The relationship between pre-stimulus delta activity and post-stimulus evoked and event-related responses and/or oscillations is discussed. Cross-frequency couplings of delta oscillations with higher frequency windows are also included in the review. This review concludes with several important remarks: delta oscillatory responses are involved in cognitive and emotional processes; a decrease of delta oscillatory responses could be a general electrophysiological marker for cognitive dysfunction (Alzheimer's disease, MCI, bipolar disorder, schizophrenia and alcoholism); and pre-stimulus activity (phase or amplitude changes in delta activity) has an effect on post-stimulus EEG responses. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Directed proteomic analysis of the human nucleolus

    DEFF Research Database (Denmark)

    Andersen, Jens S; Lyon, Carol E; Fox, Archa H

    2002-01-01

    of their structure and function remain uncharacterized. RESULTS: We report a proteomic analysis of human nucleoli. Using a combination of mass spectrometry (MS) and sequence database searches, including online analysis of the draft human genome sequence, 271 proteins were identified. Over 30% of the nucleolar proteins were encoded by novel or uncharacterized genes, while the known proteins included several unexpected factors with no previously known nucleolar functions. MS analysis of nucleoli isolated from HeLa cells in which transcription had been inhibited showed that a subset of proteins was enriched. These data highlight the dynamic nature of the nucleolar proteome and show that proteins can either associate with nucleoli transiently or accumulate only under specific metabolic conditions. CONCLUSIONS: This extensive proteomic analysis shows that nucleoli have a surprisingly large protein complexity...

  3. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    Science.gov (United States)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in the analysis of spike train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically, or even dynamically, interrelated) or not, without the need to select an additional parameter in the form of a maximally tolerable delay between these events. This consideration is conceptually justified for the original application to spike trains in electroencephalogram (EEG) recordings, where inter-spike intervals show relatively narrow distributions at high temporal sampling rates. In climate studies, however, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctly different distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. To study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
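
Of the two measures named above, event coincidence rates are the simpler to state: the fraction of events in one series that are matched by an event in the other series within the tolerable delay. A sketch of the precursor variant, assuming event times are given as day indices; the rainfall series below are invented:

```python
def event_coincidence_rate(events_a, events_b, delta_t):
    """Fraction of events in `events_a` that are preceded (or matched) by an
    event in `events_b` within `delta_t` time steps: the precursor event
    coincidence rate. Event series are sequences of time indices (e.g. days)."""
    if not events_a:
        return 0.0
    hits = sum(
        1 for ta in events_a
        if any(0 <= ta - tb <= delta_t for tb in events_b)
    )
    return hits / len(events_a)

# Hypothetical extreme-rainfall days at two grid cells:
site_a = [3, 10, 22, 40]
site_b = [2, 9, 35]
print(event_coincidence_rate(site_a, site_b, delta_t=2))  # 0.5: days 3 and 10 are matched
```

Unlike event synchronization, the delay tolerance `delta_t` is an explicit parameter here, which is precisely the methodological trade-off the paragraph discusses.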

  4. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    OpenAIRE

    Paniz Tavakoli; Ken Campbell

    2015-01-01

    Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from s...

  5. [ORION®: a simple and effective method for systemic analysis of clinical events and precursors occurring in hospital practice].

    Science.gov (United States)

    Debouck, F; Rieger, E; Petit, H; Noël, G; Ravinet, L

    2012-05-01

    Morbimortality review is now recommended by the French Health Authority (Haute Autorité de santé [HAS]) in all hospital settings. It can be complemented by Comités de retour d'expérience (CREX), which perform systemic analysis of event precursors that may potentially result in medical damage. Given the demands of their current practice, medical teams may not favour systemic analysis of events occurring in their setting; they require an easy-to-use method that is more or less intuitive and easy to learn. This is why ORION(®) was set up. ORION(®) is based on experience acquired in aeronautics, the main precursor in risk management, since aircraft crashes are considered unacceptable even though mortality from aircraft crashes is extremely low compared with mortality from medical errors in hospital settings. The systemic analysis is divided into six steps: (i) collecting data, (ii) rebuilding the chronology of facts, (iii) identifying the gaps, (iv) identifying contributing and influential factors, (v) proposing actions to put in place, (vi) writing the analysis report. When identifying contributing and influential factors, four kinds of factors favouring the event are considered: the technical domain, the working environment, organisation and procedures, and human factors. Although they are essential, human factors are not always considered correctly. The systemic analysis is done by a pilot, chosen among people trained in the method, who gathers information from all categories of people working in the setting. ORION(®) is now used in more than 400 French hospital settings for systemic analysis of either morbimortality cases or event precursors. In particular, it is used in 145 radiotherapy centres to support CREX. Being very simple and quasi-intuitive to use, ORION(®) is an asset for reaching the objectives defined by the HAS: to set up effective morbi-mortality reviews (RMM) and CREX for improving the quality of care in hospital settings. By helping the

  6. Analysis of instability event in Oskarshamn-3, Feb. 8, 1998, with SIMULATE-3K

    Energy Technology Data Exchange (ETDEWEB)

    Kruners, M. [Studsvik Scandpower AB, Varberg (Sweden)

    1998-12-01

    This report describes the analysis of the instability event of Feb. 8, 1998, in the BWR reactor Oskarshamn-3, performed with the Studsvik Scandpower kinetic nodal code SIMULATE-3K. On Feb. 8, 1998, after a short shut-down for maintenance, the reactor was in "power run-up" and operating at reduced power and flow (60% power, 34% flow) when an automatic scram on high power occurred. The plant's analysis of the event indicated that the overpower protection system was triggered and scrammed the reactor because of a strong and intense power oscillation in the core. The analysis has been performed using data delivered from OKG (the operator of the Oskarshamn-3 NPP) and recalculated and extended at Studsvik Scandpower AB for this project. As part of the project, SIMULATE-3K has been validated against a set of stability measurements from previous operating cycles of the same reactor. The results from the analysis of Oskarshamn-3 are divided into two sets: 1. Validation of SIMULATE-3K against previous stability measurements. 2. SIMULATE-3K analysis of the event. The conclusions from the analysis are that the dominant and major contribution to the arising instability event is the power distribution in the core. The root cause of the event can be assigned to the control rod sequence used and the power distribution created as a result of inserted and withdrawn control rods in the core. With this power distribution, the normal fluctuations in the operating point (neutron flux, RC flow, inlet temperature to the core, etc.) finally caused the core to become unstable. No contribution to the instability was necessary from reactor peripheral systems or from adaptive control system modes.

  7. 'It was a freak accident': an analysis of the labelling of injury events in the US press.

    Science.gov (United States)

    Smith, Katherine C; Girasek, Deborah C; Baker, Susan P; Manganello, Jennifer A; Bowman, Stephen M; Samuels, Alicia; Gielen, Andrea C

    2012-02-01

    Given that the news media shape our understanding of health issues, a study was undertaken to examine the use by the US media of the expression 'freak accident' in relation to injury events. This analysis is intended to contribute to the ongoing consideration of the lay conceptualisation of injuries as 'accidents'. LexisNexis Academic was used to search three purposively selected US news sources (Associated Press, New York Times and Philadelphia Inquirer) for the expression 'freak accident' over 5 years (2005-2009). Textual analysis included both structured and open coding. Coding included measures for who used the expression within the story, the nature of the injury event and the injured person(s) being reported upon, and the incorporation of prevention information within the story, together with a phenomenological consideration of the uses and meanings of the expression within the story context. Results: The search yielded a dataset of 250 human injury stories incorporating the term 'freak accident'. Injuries sustained by professional athletes dominated coverage (61%). Fewer than 10% of stories provided a clear and explicit injury prevention message. Stories in which journalists employed the expression 'freak accident' were less likely to include prevention information than stories in which the expression was used by people quoted in the story. Journalists who frame injury events as freak accidents may be an appropriate focus for advocacy efforts. Effective prevention messages should be developed and disseminated to accompany injury reporting in order to educate and protect the public.

  8. Advancing Usability Evaluation through Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety-critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
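
    The quantification step described above can be sketched as follows; the nominal error probability, heuristic names and multiplier values here are illustrative placeholders, not the published SPAR-H tables.

```python
# Sketch of SPAR-H-style quantification adapted to usability heuristics.
# The nominal error probability and the multipliers below are illustrative
# placeholders, not the published SPAR-H values.

NOMINAL_HEP = 0.001  # assumed nominal human error probability for the task

# Hypothetical usability heuristics acting as performance shaping factors
# (PSFs), each mapped to a multiplier: >1 degrades performance, 1.0 is nominal.
psf_multipliers = {
    "visibility_of_system_status": 2.0,   # status feedback is poor
    "match_with_real_world": 1.0,         # terminology is adequate
    "error_prevention": 5.0,              # no confirmation on destructive actions
}

def usability_error_probability(nominal, multipliers):
    """Multiply the nominal error probability by every PSF multiplier."""
    uep = nominal
    for m in multipliers.values():
        uep *= m
    return min(uep, 1.0)  # a probability cannot exceed 1

uep = usability_error_probability(NOMINAL_HEP, psf_multipliers)
print(f"UEP = {uep:.4f}")  # 0.001 * 2 * 1 * 5 = 0.0100
```

    Ranking tasks by UEP, together with a consequence matrix, then gives the prioritization of usability issues described above.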

  9. Post-transcriptional exon shuffling events in humans can be evolutionarily conserved and abundant.

    Science.gov (United States)

    Al-Balool, Haya H; Weber, David; Liu, Yilei; Wade, Mark; Guleria, Kamlesh; Nam, Pitsien Lang Ping; Clayton, Jake; Rowe, William; Coxhead, Jonathan; Irving, Julie; Elliott, David J; Hall, Andrew G; Santibanez-Koref, Mauro; Jackson, Michael S

    2011-11-01

    In silico analyses have established that transcripts from some genes can be processed into RNAs with rearranged exon order relative to genomic structure (post-transcriptional exon shuffling, or PTES). Although known to contribute to transcriptome diversity in some species, to date the structure, distribution, abundance, and functional significance of human PTES transcripts remain largely unknown. Here, using high-throughput transcriptome sequencing, we identify 205 putative human PTES products from 176 genes. We validate 72 out of 112 products analyzed using RT-PCR, and identify additional PTES products structurally related to 61% of validated targets. Sequencing of these additional products reveals GT-AG dinucleotides at >95% of the splice junctions, confirming that they are processed by the spliceosome. We show that most PTES transcripts are expressed in a wide variety of human tissues, that they can be polyadenylated, and that some are conserved in mouse. We also show that they can extend into 5' and 3' UTRs, consistent with formation via trans-splicing of independent pre-mRNA molecules. Finally, we use real-time PCR to compare the abundance of PTES exon junctions relative to canonical exon junctions within the transcripts from seven genes. PTES exon junctions are present at up to 90% of the levels of canonical junctions, with transcripts from MAN1A2, PHC3, TLE4, and CDK13 exhibiting the highest levels. This is the first systematic experimental analysis of PTES in humans, and it suggests both that the phenomenon is much more widespread than previously thought and that some PTES transcripts could be functional.

  10. Techniques for the Analysis of Human Movement.

    Science.gov (United States)

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  11. Leveraging Researcher Reflexivity to Consider a Classroom Event over Time: Reflexive Discourse Analysis of "What Counts"

    Science.gov (United States)

    Anderson, Kate T.

    2017-01-01

    This article presents a reflexive and critical discourse analysis of classroom events that grew out of a cross-cultural partnership with a secondary school teacher in Singapore. I aim to illuminate how differences between researcher and teacher assumptions about what participation in classroom activities should look like came into high relief when…

  12. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Science.gov (United States)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  13. Time-to-event analysis of mastitis at first-lactation in Valle del Belice ewes

    NARCIS (Netherlands)

    Portolano, B.; Firlocchiaro, R.; Kaam, van J.B.C.H.M.; Riggio, V.; Maizon, D.O.

    2007-01-01

    A time-to-event study for mastitis at first-lactation in Valle del Belice ewes was conducted, using survival analysis with an animal model. The goals were to evaluate the effect of lambing season and level of milk production on the time from lambing to the day when a ewe experienced a test-day with

  14. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    Science.gov (United States)

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  15. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

    Since the human genome draft sequence was first made public in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples of large-scale studies of human genome variation: 1) HapMap data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) data (940 individuals) (http://www.hagsc.org/hgdp/files.html), 3) 1000 Genomes data (2,504 individuals) (http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/). If we can integrate all three into a single volume of data, we should be able to conduct a more detailed analysis of human genome variations for a total of 4,861 individuals (= 1,417 + 940 + 2,504). In fact, we successfully integrated these three data sets by use of information on the reference human genome sequence, and we conducted the big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that were not identifiable by analysis of each of the three data sets alone. Here, we report the outcome of this big data analysis and discuss the evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.
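
    The integration step can be illustrated with a minimal sketch that intersects call sets on shared reference-genome coordinates; the toy genotype tables below are invented, and real HapMap/HGDP/1000 Genomes files require format-specific parsing.

```python
# Minimal sketch of integrating genotype call sets from multiple projects by
# anchoring them to shared reference-genome coordinates. The toy data below
# are invented; real files need format-specific parsers (e.g. VCF readers).

hapmap = {("chr1", 1000): {"ind1": "AA", "ind2": "AG"}}
hgdp = {("chr1", 1000): {"ind3": "AG"}, ("chr1", 2000): {"ind3": "CC"}}
kgp = {("chr1", 1000): {"ind4": "GG"}, ("chr2", 500): {"ind4": "TT"}}

def integrate(*datasets):
    """Keep only sites typed in every dataset and pool genotypes per site."""
    shared = set(datasets[0])
    for d in datasets[1:]:
        shared &= set(d)
    merged = {}
    for site in sorted(shared):
        merged[site] = {}
        for d in datasets:
            merged[site].update(d[site])
    return merged

merged = integrate(hapmap, hgdp, kgp)
# Only chr1:1000 is typed in all three sets, so the merged table pools the
# genotypes of all four individuals at that single shared site.
```

    A distance matrix over the pooled genotypes of such shared sites is what a genome-level phylogenetic tree of the combined individuals would be built from.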

  16. Space Mission Human Reliability Analysis (HRA) Project

    Science.gov (United States)

    Boyer, Roger

    2014-01-01

    The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for missions as long as 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications are that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long-duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.

  17. Factors involved in the inflammatory events of cervical ripening in humans

    Directory of Open Access Journals (Sweden)

    Wang Hong

    2004-10-01

    decrease in GR levels in human cervix at parturition. Concomitantly there is an increase of factors such as NFkappaB, PAF-R, COX-1 and COX-2, suggesting that they may participate in the sequence of events leading to the final cervical ripening.

  18. A multiprocessor system for the analysis of pictures of nuclear events

    CERN Document Server

    Bacilieri, P; Matteuzzi, P; Sini, G P; Zanotti, U

    1979-01-01

    The pictures of nuclear events obtained from the bubble chambers such as Gargamelle and BEBC at CERN and others from Serpukhov are geometrically processed at CNAF (Centro Nazionale Analysis Photogrammi) in Bologna. The analysis system includes an Erasme table and a CRT flying spot digitizer. The difficulties connected with the pictures of the four stereoscopic views of the bubble chambers are overcome by the choice of a strong interactive system. (0 refs).

  19. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the significance of events relevant to safety. However, in the case of events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not always fully appreciate the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analyzed are selected from the Organisation for Economic Cooperation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  20. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    Directory of Open Access Journals (Sweden)

    Irene Bighelli

    It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may induce patients to conclude that they have been randomized to the active arm of the trial, leading to the breaking of blind. This may enhance the expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressant treatment and efficacy. Fourteen placebo-controlled antidepressant trials were included in the analysis. We found that: (a) antidepressant treatment was significantly associated with better treatment response (β = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressant treatment was not associated with adverse events (β = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (β = -0.035, 95% CI -0.06 to -0.05, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressant treatment and treatment response remained statistically significant (β = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials may lead to the breaking of blind and to an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact, and therefore reflects a genuine effect that doctors can expect to replicate under real-world conditions.
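
    The Baron and Kenny approach mentioned above proceeds through a series of regressions; a minimal sketch on simulated data (all variable names, effect sizes and noise levels invented for illustration) might look like:

```python
# Sketch of the Baron and Kenny mediation steps using ordinary least squares.
# All data below are simulated: "treatment" stands for antidepressant vs
# placebo arm, the candidate mediator for adverse-event perception, and
# "outcome" for treatment response. Effect sizes are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 200
treatment = rng.integers(0, 2, n).astype(float)
mediator = 0.5 * treatment + rng.normal(0, 1, n)      # partly caused by treatment
outcome = 0.3 * treatment + 0.4 * mediator + rng.normal(0, 1, n)

def ols(y, *predictors):
    """Coefficients of y regressed on an intercept plus the given predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, b1, b2, ...]

total = ols(outcome, treatment)[1]             # step 1: total effect of treatment
a_path = ols(mediator, treatment)[1]           # step 2: treatment -> mediator
direct = ols(outcome, treatment, mediator)[1]  # step 3: effect adjusting for mediator

# Mediation is suggested when a_path is sizeable and the direct effect is
# clearly smaller than the total effect; here the true values are 0.5, 0.5, 0.3.
```

    In the study above, step (b) failed (treatment was not associated with adverse events), which is why mediation by adverse events was rejected.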

  1. Root cause analysis of serious adverse events among older patients in the Veterans Health Administration.

    Science.gov (United States)

    Lee, Alexandra; Mills, Peter D; Neily, Julia; Hemphill, Robin R

    2014-06-01

    Preventable adverse events are more likely to occur among older patients because of the clinical complexity of their care. The Veterans Health Administration (VHA) National Center for Patient Safety (NCPS) stores data about serious adverse events when a root cause analysis (RCA) has been performed. A primary objective of this study was to describe the types of adverse events occurring among older patients (age ≥ 65 years) in Department of Veterans Affairs (VA) hospitals. Secondary objectives were to determine the underlying reasons for the occurrence of these events and report on effective action plans that have been implemented in VA hospitals. In a retrospective, cross-sectional review, RCA reports were reviewed and outcomes reported using descriptive statistics for all VA hospitals that conducted an RCA for a serious geriatric adverse event from January 2010 to January 2011 that resulted in sustained injury or death. The search produced 325 RCA reports on VA patients (age ≥ 65 years). Falls (34.8%), delays in diagnosis and/or treatment (11.7%), unexpected death (9.9%), and medication errors (9.0%) were the most commonly reported adverse events among older VA patients. Communication was the most common underlying reason for these events, representing 43.9% of reported root causes. Approximately 40% of implemented action plans were judged by local staff to be effective. The RCA process identified falls and communication as important themes in serious adverse events. Concrete actions, such as process standardization and changes to communication, were reported by teams to yield some improvement. However, fewer than half of the action plans were reported to be effective. Further research is needed to guide development and implementation of effective action plans.

  2. A cross-sectional analysis of pharmaceutical industry-funded events for health professionals in Australia.

    Science.gov (United States)

    Fabbri, Alice; Grundy, Quinn; Mintzes, Barbara; Swandari, Swestika; Moynihan, Ray; Walkom, Emily; Bero, Lisa A

    2017-06-30

    To analyse patterns and characteristics of pharmaceutical industry sponsorship of events for Australian health professionals and to understand the implications of recent changes in transparency provisions that no longer require reporting of payments for food and beverages. Cross-sectional analysis. 301 publicly available company transparency reports downloaded from the website of Medicines Australia, the pharmaceutical industry trade association, covering the period from October 2011 to September 2015. Forty-two companies sponsored 116 845 events for health professionals, on average 608 per week with 30 attendees per event. Events typically included a broad range of health professionals: 82.0% included medical doctors, including specialists and primary care doctors, and 38.3% trainees. Oncology, surgery and endocrinology were the most frequent clinical areas of focus. Most events (64.2%) were held in a clinical setting. The median cost per event was $A263 (IQR $A153-1195) and over 90% included food and beverages. Over this 4-year period, industry-sponsored events were widespread and pharmaceutical companies maintained a high frequency of contact with health professionals. Most events were held in clinical settings, suggesting a pervasive commercial presence in everyday clinical practice. Food and beverages, known to be associated with changes to prescribing practice, were almost always provided. New Australian transparency provisions explicitly exclude meals from the reporting requirements; thus, a large proportion of potentially influential payments from pharmaceutical companies to health professionals will disappear from public view. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  3. Application of human reliability analysis methodology of second generation; Aplicacion de metodologia de analisis de confiabilidad humana de segunda generacion

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz S, T. de J.; Nelson E, P. F. [Facultad de Ingenieria, Departamento de Sistemas Energeticos, UNAM, Paseo Cuauhnahuac 8532, 62550 Jiutepec, Morelos (Mexico)], e-mail: trs@cie.unam.mx

    2009-10-15

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. Its main contribution in nuclear power plants is the identification and characterization of the factors that combine to produce an error in the human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, analyses of various historical accidents have found the human component to be a contributing factor. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Subsequently, methods were developed that include additional performance shaping factors, and the interactions between them, in their models. By the mid-1990s came what are considered the second-generation methodologies. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. Gathering the new human error probabilities therefore involves quantifying the nominal scenario and the cases of significant deviations considered for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the specific factors with the highest contribution to the human error probabilities. (Author)

  4. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we use the concept of the rain event. In fact, the discrete and intermittent nature of rain processes makes the definition of some features inadequate when defined over a fixed duration: long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data with great sensitivity to detector characteristics. Analysing whole rain events instead of individual short samples of fixed duration clarifies relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability partly due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organizing map is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving as much as possible the initial space topology, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows removing redundant variables, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature. This confirms
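
    A minimal SOM of the kind used for this feature analysis can be sketched as follows; the grid size, learning schedule and toy rain-event vectors are illustrative choices, not the authors' settings.

```python
# Minimal self-organizing map (SOM) sketch in NumPy, illustrating the kind of
# 2-D topology-preserving mapping used to relate rain-event features. The grid
# size, learning schedule, and toy feature vectors are illustrative choices.
import numpy as np

rng = np.random.default_rng(42)
grid_h, grid_w, n_features = 5, 5, 5  # five features per event, as in the study
weights = rng.random((grid_h, grid_w, n_features))

# Toy "rain events": duration, rate peak, depth, rate std, rate variation,
# all pre-scaled to [0, 1].
events = rng.random((100, n_features))

def train(weights, data, epochs=20, lr0=0.5, radius0=2.0):
    h, w, _ = weights.shape
    rows, cols = np.indices((h, w))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in data:
            # Best-matching unit: node whose weight vector is closest to x.
            dist = ((weights - x) ** 2).sum(axis=2)
            bi, bj = np.unravel_index(dist.argmin(), dist.shape)
            # Gaussian neighborhood pulls nearby nodes toward x.
            grid_d2 = (rows - bi) ** 2 + (cols - bj) ** 2
            influence = np.exp(-grid_d2 / (2 * radius ** 2))
            weights += lr * influence[:, :, None] * (x - weights)
    return weights

weights = train(weights, events)
# After training, neighboring nodes hold similar feature profiles, so
# dependencies between features can be read off the planes weights[:, :, k].
```

    Inspecting the per-feature component planes of such a map is what reveals redundant variables: two features whose planes look alike carry largely the same information.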

  5. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Science.gov (United States)

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment; that involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.

  6. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes possible automated verification of large industrial designs with the use of only modest resources (less than 5 minutes on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATE™.
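
    The Mealy machine underlying such state/event models can be sketched minimally; the machine below (a toggle switch) is an invented single-machine example, whereas the models checked in the paper compose many such machines concurrently.

```python
# Minimal sketch of a Mealy machine, the building block of state/event models:
# the output depends on both the current state and the input event.
# The toggle-switch machine below is an invented example.

transitions = {
    # (state, event) -> (next_state, output)
    ("off", "press"): ("on", "light_on"),
    ("on", "press"): ("off", "light_off"),
}

def run(machine, state, events):
    """Feed a sequence of events through the machine, collecting outputs."""
    outputs = []
    for event in events:
        state, out = machine[(state, event)]
        outputs.append(out)
    return state, outputs

state, outputs = run(transitions, "off", ["press", "press", "press"])
print(state, outputs)  # on ['light_on', 'light_off', 'light_on']
```

    In a full state/event model, many machines of this kind run in parallel and synchronize on events, which is what makes compositional checking and dependency analysis pay off.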

  7. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
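
    The event-queue schema described above can be sketched generically; this hypothetical example models the mode transitions of a tank and is not the patent's actual implementation.

```python
# Minimal discrete event simulation sketch: a priority queue of (time, event)
# pairs is drained in time order, and each handled event may schedule
# follow-on events after a delay, until the queue empties.
import heapq

def simulate(initial_events, handlers, until=100.0):
    """Run events until the queue empties or simulated time exceeds `until`."""
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, name = heapq.heappop(queue)
        if time > until:
            break
        log.append((time, name))
        for delay, follow_on in handlers.get(name, []):
            heapq.heappush(queue, (time + delay, follow_on))
    return log

# Continuous behavior modeled discretely as mode transitions with time delays:
# a tank that fills, then drains.
handlers = {
    "start_fill": [(5.0, "full")],  # filling takes 5 time units
    "full": [(3.0, "empty")],       # draining takes 3 time units
}
log = simulate([(0.0, "start_fill")], handlers)
print(log)  # [(0.0, 'start_fill'), (5.0, 'full'), (8.0, 'empty')]
```

    The returned log plays the role of the tool's log files: a time-ordered record of what happened, ready for post-run analysis.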

  8. A Probabilistic Framework for Risk Analysis of Widespread Flood Events: A Proof-of-Concept Study.

    Science.gov (United States)

    Schneeberger, Klaus; Huttenlau, Matthias; Winter, Benjamin; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2017-07-27

    This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as a temporal extrapolation of observed flood events. These are combined with cumulative flood impact indicators, such as building damages, to finally derive time series of damages for risk estimation. Therefore, a multivariate modeling procedure that is able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management of, for example, risk analysts and policymakers or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately. © 2017 Society for Risk Analysis.
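
    The final risk-estimation step, in which a large synthetic event sample is turned into expected annual damage and a low-probability damage level, can be sketched as follows; the event frequencies and damage distribution are purely illustrative, not the article's calibrated model chain.

```python
# Sketch of estimating expected annual damage (EAD) and a low-probability
# damage level from a large synthetic sample of flood-event years. The event
# frequencies and the damage distribution below are purely illustrative.
import random

random.seed(1)
n_years = 10_000  # synthetic years, standing in for the temporal extrapolation

annual_damages = []
for _ in range(n_years):
    # Number of damaging flood events in the year (illustrative probabilities).
    n_events = random.choices([0, 1, 2], weights=[0.7, 0.25, 0.05])[0]
    # Damage per event drawn from an illustrative heavy-tailed distribution.
    total = sum(random.expovariate(1 / 2.0e6) for _ in range(n_events))
    annual_damages.append(total)

expected_annual_damage = sum(annual_damages) / n_years
annual_damages.sort()
damage_100yr = annual_damages[int(0.99 * n_years)]  # ~1% annual exceedance

print(f"EAD = {expected_annual_damage:,.0f}, 100-year damage = {damage_100yr:,.0f}")
```

    In the article's framework the per-event damages come from spatially coherent simulated events and impact indicators rather than a simple univariate draw, but the aggregation into EAD and exceedance levels follows this pattern.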

  9. A joint analysis of wave and surge conditions for past and present extrem events in the south-western Baltic Sea

    Science.gov (United States)

    Groll, Nikolaus; Gaslikova, Lidia

    2017-04-01

    Extreme marine events in the south-western Baltic Sea, like the historic storm in 1872, are rare but have large impacts on human safety and coastal infrastructure. The storm of 1872 cost over 250 human lives, left severely damaged infrastructure and caused land loss due to coastal erosion. Recent extreme events have also had drastic impacts on coastal regions. Using results from numerical wave and hydrodynamic model simulations, we will present a joint analysis of wave and water level conditions for selected extreme events. For the historic event, the numerical models were forced by wind and pressure fields reconstructed from pressure readings. Simulated atmospheric conditions from reanalysis were used for the more recent events. The height of the water level due to possible previous inflow of water masses into the Baltic Sea basin, as well as possible seiche and swell effects, have been incorporated in the simulations. We will discuss similarities and differences between the historic and the more recent marine hazard events.

  10. Signature Based Detection of User Events for Post-mortem Forensic Analysis

    Science.gov (United States)

    James, Joshua Isaac; Gladyshev, Pavel; Zhu, Yuandong

    This paper introduces a novel approach to user event reconstruction by showing the practicality of generating and implementing signature-based analysis methods to reconstruct high-level user actions from a collection of low-level traces found during a post-mortem forensic analysis of a system. Traditional forensic analysis, and the inferences an investigator normally makes when given digital evidence, are examined. It is then demonstrated that this natural process of inferring high-level events from low-level traces may be encoded using signature-matching techniques. Simple signatures using the defined method are created and applied for three popular Windows-based programs as a proof of concept.
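
    The idea of encoding investigator inferences as signatures can be sketched as follows; the signature definitions and trace names are invented for illustration and are not the authors' actual signatures.

```python
# Sketch of signature-based event reconstruction: each high-level user action
# is described by a set of low-level traces that must all be present on the
# system. The signatures and trace names below are invented for illustration.

signatures = {
    "file_opened_with_program_X": {
        "registry_mru_entry",      # most-recently-used list updated
        "prefetch_entry",          # program execution trace
        "file_access_timestamp",
    },
    "usb_device_attached": {
        "setupapi_log_entry",
        "registry_usbstor_key",
    },
}

def reconstruct(found_traces, signatures):
    """Return the high-level events whose every required trace was found."""
    found = set(found_traces)
    return sorted(event for event, required in signatures.items()
                  if required <= found)

traces = ["prefetch_entry", "registry_mru_entry", "file_access_timestamp",
          "setupapi_log_entry"]
print(reconstruct(traces, signatures))  # ['file_opened_with_program_X']
```

    Real signatures would also constrain trace timestamps and values rather than mere presence, but the matching principle is the same.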

  11. HLA DNA sequence variation among human populations: molecular signatures of demographic and selective events.

    Directory of Open Access Journals (Sweden)

    Stéphane Buhler

    to explore the genetic history of human populations, and that their analysis allows a more thorough investigation of human MHC molecular evolution.

  12. Analysis of unintended events in hospitals : inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Groenewegen, P.P.; Timmermans, D.

    2009-01-01

    Background. Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  13. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, de H.C.W.; Zwaan, L.; Timmermans, D.R.M.; Groenewegen, P.P.; Wagner, C.

    2009-01-01

    BACKGROUND: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  14. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes.

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Timmermans, D.; Groenewegen, P.; Wagner, C.

    2009-01-01

    Background: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  15. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trial patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events, defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest-incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than that of the young adult group (20-29 years) at 20.76%. The second-highest group was the 50-59 years-old group, with an incidence rate of 30.09%, also significantly higher than the reference group. Adverse event diversity also increased with patient age: clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events than studies of younger participants.
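    The cross-age-group incidence comparison described in this record can be sketched as follows. The counts are hypothetical stand-ins, not the study's data, and a two-proportion z-test is used here in place of the authors' t tests on summarized data.

    ```python
    # Sketch of comparing an age group's adverse event incidence against a
    # reference group. Counts are illustrative, not the study's data.
    from math import sqrt, erf

    def two_proportion_z(x1, n1, x2, n2):
        """Two-sided z-test for the difference between two incidence proportions."""
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)          # pooled proportion
        se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return p1, p2, z, p_value

    # Hypothetical affected/enrolled counts for two age groups.
    rate_child, rate_ref, z, p = two_proportion_z(3141, 10000, 2076, 10000)
    print(f"0-9 yrs: {rate_child:.2%}, 20-29 yrs: {rate_ref:.2%}, p={p:.3g}")
    ```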

  16. Antipsychotics, glycemic disorders, and life-threatening diabetic events: a Bayesian data-mining analysis of the FDA adverse event reporting system (1968-2004).

    Science.gov (United States)

    DuMouchel, William; Fram, David; Yang, Xionghu; Mahmoud, Ramy A; Grogg, Amy L; Engelhart, Luella; Ramaswamy, Krishnan

    2008-01-01

    This analysis compared diabetes-related adverse events associated with use of different antipsychotic agents. A disproportionality analysis of the US Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) was performed. Data from the FDA postmarketing AERS database (1968 through first quarter 2004) were evaluated. Drugs studied included aripiprazole, clozapine, haloperidol, olanzapine, quetiapine, risperidone, and ziprasidone. Fourteen Medical Dictionary for Regulatory Activities (MedDRA) Primary Terms (MPTs) were chosen to identify diabetes-related adverse events; 3 groupings into higher-level descriptive categories were also studied. Three methods of measuring drug-event associations were used: proportional reporting ratio, the empirical Bayes data-mining algorithm known as the Multi-Item Gamma Poisson Shrinker, and logistic regression (LR) analysis. Quantitative measures of association strength, with corresponding confidence intervals, between drugs and specified adverse events were computed and graphed. Some of the LR analyses were repeated separately for reports from patients under and over 45 years of age. Differences in association strength were declared statistically significant if the corresponding 90% confidence intervals did not overlap. Association with various glycemic events differed for different drugs. On average, the rankings of association strength agreed with the following ordering: low association, ziprasidone, aripiprazole, haloperidol, and risperidone; medium association, quetiapine; and strong association, clozapine and olanzapine. The median rank correlation between the above ordering and the 17 sets of LR coefficients (1 set for each glycemic event) was 93%. Many of the disproportionality measures were significantly different across drugs, and ratios of disproportionality factors of 5 or more were frequently observed. There are consistent and substantial differences between atypical antipsychotic drugs in the strength of their association with glycemic adverse events.
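    The proportional reporting ratio (PRR) used in this disproportionality analysis, together with the 90% confidence interval the authors relied on for significance comparisons, can be sketched roughly as follows; the 2x2 report counts are hypothetical.

    ```python
    # Minimal sketch of the proportional reporting ratio (PRR) with its 90% CI.
    # Counts below are hypothetical, not drawn from AERS.
    from math import log, exp, sqrt

    def prr_with_ci(a, b, c, d, z=1.645):
        """PRR from a 2x2 drug/event contingency table.
        a: target drug & target event, b: target drug & other events,
        c: other drugs & target event, d: other drugs & other events."""
        prr = (a / (a + b)) / (c / (c + d))
        se_log = sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
        lo, hi = exp(log(prr) - z * se_log), exp(log(prr) + z * se_log)
        return prr, lo, hi

    prr, lo, hi = prr_with_ci(a=120, b=4880, c=600, d=94400)
    print(f"PRR={prr:.2f}, 90% CI=({lo:.2f}, {hi:.2f})")
    ```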

  17. The use of significant event analysis and personal development plans in developing CPD: a pilot study.

    Science.gov (United States)

    Wright, P D; Franklin, C D

    2007-07-14

    This paper describes the work undertaken by the Postgraduate Primary Care Trust (PCT) Dental Tutor for South Yorkshire and East Midlands Regional Postgraduate Dental Education Office during the first year of a two-year pilot. The tutor has special responsibility for facilitating the writing of Personal Development Plans (PDPs) and the introduction of Significant Event Analysis to the 202 general dental practitioners in the four Sheffield PCTs. Data were collected on significant events and the educational needs highlighted as a result. A hands-on workshop format was used in small practice groups and 45% of Sheffield general dental practitioners now have written PDPs compared with a 16% national average. A library of significant events has also been collated from the data collected.

  18. Tailoring a Human Reliability Analysis to Your Industry Needs

    Science.gov (United States)

    DeMott, D. L.

    2016-01-01

    Companies at risk of accidents caused by human error that result in catastrophic consequences include: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training and procedures, 4) type and availability of data, 5) how the industry views risk and reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determining the primary risk factors contributing to a potential human error, a more detailed analysis method may be employed.

  19. The logic of surveillance guidelines: an analysis of vaccine adverse event reports from an ontological perspective.

    Directory of Open Access Journals (Sweden)

    Mélanie Courtot

    Full Text Available BACKGROUND: When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. METHODS AND FINDINGS: Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the ontology's classification of vaccine adverse event reports against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems and in the terminological standards in use. CONCLUSIONS: By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. AVAILABILITY: Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero.

  20. When do young adults stop practising a sport? An event history analysis on the impact of four major life events

    NARCIS (Netherlands)

    Houten, J.M.A. van; Kraaykamp, G.L.M.; Breedveld, K.

    2017-01-01

    This article investigates the relationship between four major life events and stopping sport participation in young adulthood. We employ a neo-Weberian theoretical framework related to changes in temporal and social resources to explain how beginning to work, starting to live on one's own, starting

  1. [Meta-analysis of blood system adverse events of Tripterygium wilfordii].

    Science.gov (United States)

    Li, Zhi-xia; Ma, Dong-mei; Yang, Xing-hua; Sun, Feng; Yu, Kai; Zhan, Si-yan

    2015-01-01

    A systematic review was undertaken, including studies that evaluated the incidence of blood system adverse events of Tripterygium wilfordii (TWP). Medline, Embase and the Cochrane Library were searched for relevant studies, including RCTs, cohort studies and case series, of patients treated with TWP published in English and Chinese from inception up until May 25th, 2013, with keywords including "Tripterygium wilfordii", "toxicity", "reproductive", "side effect", "adverse", "safety" and "tolerability". Relevant information was extracted and the incidence of blood system adverse events was pooled with MetaAnalyst software. In addition, subgroup and sensitivity analyses were performed based on age, mode of medicine, observation time and disease system. According to the inclusion and exclusion criteria, a total of 49 articles were included in the meta-analysis; these were split into 54 studies incorporated in the analysis. Given the large degree of heterogeneity among the studies, the data were analyzed using a random-effects model, and the summary estimate of the incidence of blood system adverse events was 6.1%. The weighted combined incidences of the three major blood system adverse events were white blood cell decrease 5.6% (95% CI, 4.3% - 7.3%), hemoglobin decrease 1.7% (95% CI, 0.5% - 5.0%) and platelet decrease 1.8% (95% CI, 1.0% - 3.1%), respectively. Sensitivity analyses based on the 45 studies with high quality showed the combined value was close to the summary estimate of the total 54 studies. The current evidence indicates that the incidence of blood system adverse events induced by TWP is high; attention should be paid to the prevention and treatment of these adverse events.
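    The random-effects pooling behind the 6.1% summary incidence above can be sketched with the DerSimonian-Laird estimator on study-level proportions; the proportions and sample sizes below are illustrative, not the 54 included studies.

    ```python
    # Sketch of DerSimonian-Laird random-effects pooling of event proportions.
    # Input values are illustrative, not the meta-analysis data.
    def dersimonian_laird(props, ns):
        """Pool per-study event proportions under a random-effects model."""
        k = len(props)
        var = [p * (1 - p) / n for p, n in zip(props, ns)]   # within-study variance
        w = [1 / v for v in var]
        p_fixed = sum(wi * pi for wi, pi in zip(w, props)) / sum(w)
        q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, props))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)                   # between-study variance
        w_re = [1 / (v + tau2) for v in var]
        return sum(wi * pi for wi, pi in zip(w_re, props)) / sum(w_re)

    pooled = dersimonian_laird([0.04, 0.08, 0.06, 0.10], [200, 150, 300, 120])
    print(f"pooled incidence: {pooled:.1%}")
    ```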

  2. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    Science.gov (United States)

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  3. A hydrological analysis of the 4 November 2011 event in Genoa

    Directory of Open Access Journals (Sweden)

    F. Silvestro

    2012-09-01

    Full Text Available On 4 November 2011 a flash flood event hit the area of Genoa with dramatic consequences. Such an event represents, from the meteorological and hydrological perspective, a paradigm of flash floods in the Mediterranean environment.

    The hydro-meteorological probabilistic forecasting system for small and medium size catchments in use at the Civil Protection Centre of Liguria region exhibited excellent performance during the event, predicting, 24–48 h in advance, the potential level of risk associated with the forecast. It greatly helped the decision makers in issuing a timely and correct alert.

    In this work we present the operational outputs of the system provided during the Liguria events and the post-event hydrological modelling analysis that has been carried out, also accounting for crowd-sourced information and data. We discuss the benefit of the implemented probabilistic systems for decision-making under uncertainty, highlighting how, in this case, the multi-catchment approach used for predicting floods in small basins has been crucial.

  4. Painful and provocative events scale and fearlessness about death among Veterans: Exploratory factor analysis.

    Science.gov (United States)

    Poindexter, Erin K; Nazem, Sarra; Forster, Jeri E

    2017-01-15

    The interpersonal theory of suicide suggests three proximal risk factors for suicide: perceived burdensomeness, thwarted belongingness, and acquired capability. Previous literature indicates that repetitive exposure to painful and provocative events is related to increased acquired capability for suicide. Despite this, research related to the assessment of painful and provocative events has been insufficient. Research has inconsistently administered the Painful and Provocative Events Scale (PPES; a painful and provocative events assessment), and no study has examined the factor structure of the English PPES. This study explored the factor structure of the PPES and the relation between factors and fearlessness about death. The sample was a cross-sectional, self-report study composed of 119 Veterans (mean age = 46.5 years, SD = 13.5). Findings from an exploratory factor analysis indicated a four-factor solution for the PPES; however, no factor from the PPES significantly related to fearlessness about death (measured by the Acquired Capability for Suicide Scale - Fearlessness About Death Scale; all p > .21). The study was limited by its cross-sectional design and small Veteran sample. Findings suggest that the PPES lacks the psychometric properties necessary to reliably investigate painful and provocative factors. Consequently, this measure may not reliably capture and explain how painful and provocative events relate to fearlessness about death, which is a barrier to improving suicide risk assessment and prediction. Recommendations for the construction of a new PPES are offered. Published by Elsevier B.V.
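    A common first step in an exploratory factor analysis like the one above is choosing the number of factors, for example via the Kaiser criterion (retain factors whose correlation-matrix eigenvalues exceed 1). The item responses below are simulated stand-ins, not PPES data.

    ```python
    # Sketch of the Kaiser criterion for factor retention on simulated items.
    import numpy as np

    rng = np.random.default_rng(0)
    # 119 "respondents" x 6 "items": two latent factors drive items 0-2 and 3-5.
    f1, f2 = rng.normal(size=(119, 1)), rng.normal(size=(119, 1))
    noise = 0.5 * rng.normal(size=(119, 6))
    items = np.hstack([f1.repeat(3, axis=1), f2.repeat(3, axis=1)]) + noise

    # Retain factors with correlation-matrix eigenvalues > 1 (Kaiser criterion).
    eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
    n_factors = int((eigvals > 1.0).sum())
    print(n_factors)
    ```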

  5. Bridging Resilience Engineering and Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2010-06-01

    There has been strong interest in the new and emerging field called resilience engineering. This field has been quick to align itself with many existing safety disciplines, but it has also distanced itself from the field of human reliability analysis. To date, the discussion has been somewhat one-sided, with much discussion about the new insights afforded by resilience engineering. This paper presents an attempt to address resilience engineering from the perspective of human reliability analysis (HRA). It is argued that HRA shares much in common with resilience engineering and that, in fact, it can help strengthen nascent ideas in resilience engineering. This paper seeks to clarify and ultimately refute the arguments that have served to divide HRA and resilience engineering.

  6. Metagenomic Analysis of Airborne Bacterial Community and Diversity in Seoul, Korea, during December 2014, Asian Dust Event.

    Directory of Open Access Journals (Sweden)

    Seho Cha

    Full Text Available Asian dust or yellow sand events in East Asia are a major environmental contamination and human health issue, causing increasing concern. A high amount of dust particles, known as particulate matter 10 (PM10), is transported by the wind from arid and semi-arid tracts to the Korean peninsula, bringing a bacterial population that alters the terrestrial and atmospheric microbial communities. In this study, we aimed to explore the bacterial populations of Asian dust samples collected during November-December 2014. The dust samples were collected using the impinger method, and the hypervariable regions of the 16S rRNA gene were amplified using PCR followed by pyrosequencing. Analysis of the sequencing data was performed using Mothur software. The data showed that the number of operational taxonomic units and the diversity index during Asian dust events were higher than those during non-Asian dust events. At the phylum level, the proportions of Proteobacteria, Actinobacteria, and Firmicutes differed between Asian dust and non-Asian dust samples. At the genus level, the proportions of the genera Bacillus (6.9%), Arthrobacter (3.6%), Blastocatella (2%), and Planomicrobium (1.4%) were increased in Asian dust samples compared to non-Asian dust samples. This study showed a significant difference between the bacterial populations of Asian dust and non-Asian dust samples in Korea, which could significantly affect the microbial population in the environment.

  7. Metagenomic Analysis of Airborne Bacterial Community and Diversity in Seoul, Korea, during December 2014, Asian Dust Event.

    Science.gov (United States)

    Cha, Seho; Srinivasan, Sathiyaraj; Jang, Jun Hyeong; Lee, Dongwook; Lim, Sora; Kim, Kyung Sang; Jheong, Weonhwa; Lee, Dong-Won; Park, Eung-Roh; Chung, Hyun-Mi; Choe, Joonho; Kim, Myung Kyum; Seo, Taegun

    2017-01-01

    Asian dust or yellow sand events in East Asia are a major environmental contamination and human health issue, causing increasing concern. A high amount of dust particles, known as particulate matter 10 (PM10), is transported by the wind from arid and semi-arid tracts to the Korean peninsula, bringing a bacterial population that alters the terrestrial and atmospheric microbial communities. In this study, we aimed to explore the bacterial populations of Asian dust samples collected during November-December 2014. The dust samples were collected using the impinger method, and the hypervariable regions of the 16S rRNA gene were amplified using PCR followed by pyrosequencing. Analysis of the sequencing data was performed using Mothur software. The data showed that the number of operational taxonomic units and the diversity index during Asian dust events were higher than those during non-Asian dust events. At the phylum level, the proportions of Proteobacteria, Actinobacteria, and Firmicutes differed between Asian dust and non-Asian dust samples. At the genus level, the proportions of the genera Bacillus (6.9%), Arthrobacter (3.6%), Blastocatella (2%), and Planomicrobium (1.4%) were increased in Asian dust samples compared to non-Asian dust samples. This study showed a significant difference between the bacterial populations of Asian dust and non-Asian dust samples in Korea, which could significantly affect the microbial population in the environment.
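    The OTU diversity comparison reported above can be illustrated with the Shannon index, one of the diversity measures Mothur computes from OTU counts; the genus-level abundances below are hypothetical, not the study's data.

    ```python
    # Shannon diversity index from OTU counts; abundances are hypothetical.
    from math import log

    def shannon(counts):
        """Shannon diversity H' = -sum(p_i * ln p_i) over nonzero OTU counts."""
        total = sum(counts)
        return -sum((c / total) * log(c / total) for c in counts if c > 0)

    asian_dust = [69, 36, 20, 14, 861]     # e.g. Bacillus, Arthrobacter, ... (made up)
    non_asian_dust = [5, 2, 1, 1, 991]
    print(f"H'(dust)={shannon(asian_dust):.2f}, H'(control)={shannon(non_asian_dust):.2f}")
    ```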

  8. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others]

    2003-07-01

    This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of Korea Standard Nuclear Plants (KSNPs). A Level I PSA delineates the accident sequences leading to core damage and estimates their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key, foundational part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps for Level I PSA: plant familiarization, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSA for NPPs. A particular aim is to promote a standardized framework, terminology, and form of documentation for PSAs. This report would also be useful for managers or regulators involved in risk-informed regulation, and for conducting PSAs in other industries.
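    The accident sequence quantification step listed above can be sketched as follows: each sequence frequency is the initiating-event frequency multiplied by the failure or success probabilities along its event-tree branches. The numbers are illustrative, not from any plant PSA.

    ```python
    # Minimal event-tree sequence quantification; all values are illustrative.
    def sequence_frequency(init_freq, branches):
        """branches: list of (system_name, failure_prob, failed?) along one path."""
        f = init_freq
        for _, p_fail, failed in branches:
            f *= p_fail if failed else (1 - p_fail)
        return f

    # Hypothetical sequence: loss of offsite power initiator, emergency power
    # fails, but auxiliary feedwater succeeds.
    cdf_contrib = sequence_frequency(
        1e-1,  # initiating-event frequency per reactor-year (made up)
        [("EmergencyPower", 1e-3, True), ("AuxFeedwater", 1e-2, False)],
    )
    print(f"{cdf_contrib:.3e} /yr")
    ```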

  9. Ontology-based time information representation of vaccine adverse events in VAERS for temporal analysis

    Directory of Open Access Journals (Sweden)

    Tao Cui

    2012-12-01

    Full Text Available Abstract Background The U.S. FDA/CDC Vaccine Adverse Event Reporting System (VAERS) provides a valuable data source for post-vaccination adverse event analyses. The structured data in the system has been widely used, but the information in the write-up narratives is rarely included in these kinds of analyses. In fact, the unstructured nature of the narratives makes the data embedded in them difficult to use for any further studies. Results We developed an ontology-based approach to represent the data in the narratives in a "machine-understandable" way, so that it can be easily queried and further analyzed. Our focus is the time aspect of the data for time-trending analysis. The Time Event Ontology (TEO), Ontology of Adverse Events (OAE), and Vaccine Ontology (VO) are leveraged for the semantic representation of this purpose. A VAERS case report is presented as a use case for the ontological representations. The advantages of using our ontology-based Semantic Web representation and data analysis are emphasized. Conclusions We believe that representing both the structured data and the data from write-up narratives in an integrated, unified, and "machine-understandable" way can improve research for vaccine safety analyses, causality assessments, and retrospective studies.

  10. Traumatic events and depressive symptoms among youth in Southwest Nigeria: a qualitative analysis.

    Science.gov (United States)

    Omigbodun, Olayinka; Bakare, Kofoworola; Yusuf, Bidemi

    2008-01-01

    Traumatic experiences have dire consequences for the mental health of young persons. Despite high rates of traumatic experiences in some African cities, there are no reports for Nigerian youth. To investigate the pattern of traumatic events and their association with depressive symptoms among youth in Southwest Nigeria, this descriptive cross-sectional study examined randomly selected youth in urban and rural schools in Southwest Nigeria. They completed self-reports on traumatic events and depressive symptoms using the Street Children's Project Questionnaire and the Youth DISC Predictive Scale (DPS). Of the 1,768 responses (88.4% response rate) entered into the analysis, 34% reported experiencing a traumatic situation. Following interpretative phenomenological analysis, 13 themes emerged. Frequently occurring traumatic events were 'road traffic accidents' (33.0%), 'sickness' (17.1%), 'lost or trapped' (11.2%) and 'armed robbery attack' (9.7%). A bad dream was described by 3.7%. Traumatic experiences were more common in males (36.2%) than in females (31.6%) (χ² = 4.2; p = .041). Experiencing a traumatic event was associated with depressive symptoms (χ² = 37.98). Interventions addressing traumatic experiences and depression among young people are essential.

  11. Sequencing biological and physical events affects specific frequency bands within the human premotor cortex: an intracerebral EEG study.

    Directory of Open Access Journals (Sweden)

    Fausto Caruana

    Full Text Available Evidence that the human premotor cortex (PMC) is activated by cognitive functions involving the motor domain is classically explained as the reactivation of a motor program decoupled from its executive functions and exploited for different purposes by means of a motor simulation. In contrast, the evidence that PMC contributes to the sequencing of non-biological events cannot be explained by the simulationist theory. Here we investigated how motor simulation and event sequencing coexist within the PMC and how these mechanisms interact when both functions are executed. We asked patients with depth electrodes implanted in the PMC to passively observe a randomized arrangement of images depicting biological actions and physical events and, in a second block, to sequence them in the correct order. This task allowed us to disambiguate between the simple observation of actions, their sequencing (recruiting different motor simulation processes), and the sequencing of non-biological events (recruiting a sequencer mechanism not dependent on motor simulation). We analysed the response of the gamma, alpha and beta frequency bands to evaluate the contribution of each brain rhythm to the observation and sequencing of both biological and non-biological stimuli. We found that motor simulation (biological>physical) and event sequencing (sequencing>observation) differently affect the three investigated frequency bands: motor simulation was reflected in the gamma and, partially, in the beta band, but not in the alpha band. In contrast, event sequencing was also reflected in the alpha band.

  12. Anatomy of a media event: how arguments clashed in the 2001 human cloning debate.

    Science.gov (United States)

    Nerlich, Brigitte; Clarke, David D

    2003-04-01

    This paper studies the distinctive role that staged media events play in the public understanding of genetics: they can focus the attention of the media, scientists and the public on the risks and benefits of genetic advances, in our case, cloning; they can accelerate policy changes by exposing scientific, legal and ethical uncertainties; the use of images, metaphors, cliches, and cultural narratives by scientists and the media engaged in this event can reinforce stereotypical representations of cloning, but can also expose fundamental clashes in arguments about cloning. The media event staged by two fertility experts in 2001 is here analysed as a case study.

  13. Ontology-based combinatorial comparative analysis of adverse events associated with killed and live influenza vaccines.

    Directory of Open Access Journals (Sweden)

    Sirarat Sarntivijai

    Full Text Available Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding adverse event (AE) profiles is a crucial step in identifying serious AEs. Two different types of seasonal influenza vaccine have been on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). The different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on data drawn from the CDC Vaccine Adverse Event Reporting System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel combinatorial, ontology-based detection of AE method (CODAE). CODAE detects AEs using the Proportional Reporting Ratio (PRR), the Chi-square significance test, and base-level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR>2, Chi-square score >4, and number of cases >0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen of the 48 TIV-enriched AEs were related to neurological and muscular processing, such as paralysis, movement disorders, and muscular weakness. In contrast, 15 of the 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There was evidence of two severe adverse events (Guillain-Barré Syndrome and paralysis) in TIV. Although these severe adverse events occurred at a low incidence rate, they were significantly more enriched in TIV-vaccinated than in LAIV-vaccinated patients. Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had a lower chance of inducing these two severe adverse events.
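    The CODAE-style detection thresholds quoted above (PRR > 2 and Chi-square score > 4 on a 2x2 vaccine/event table) can be sketched as follows; the report counts are hypothetical, not VAERS data.

    ```python
    # Sketch of a PRR + Pearson chi-square signal filter on a 2x2 table.
    # All counts are hypothetical.
    def prr(a, b, c, d):
        """Proportional reporting ratio for a 2x2 vaccine/event table."""
        return (a / (a + b)) / (c / (c + d))

    def chi_square(a, b, c, d):
        """Pearson chi-square for a 2x2 table (no continuity correction)."""
        n = a + b + c + d
        return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    def is_signal(a, b, c, d, prr_min=2.0, chi2_min=4.0):
        return prr(a, b, c, d) > prr_min and chi_square(a, b, c, d) > chi2_min

    # Hypothetical: event reported 30 times among 3,707 LAIV reports vs
    # 150 times among 37,621 TIV reports.
    print(is_signal(a=30, b=3677, c=150, d=37471))
    ```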

  14. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    Science.gov (United States)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on a regional and continental scale. The identification of major aerosol constituents for basic source apportionment and air quality issues requires a high analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on a time scale of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time and related sample amount necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complementary technique to bulk analytical methods, chemical imaging offers a new way to study air pollution events by obtaining major aerosol constituents with single particle features at high temporal resolution and with small sample volumes. The analysis of chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy-dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry) with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles will be demonstrated by examples of air pollution events in Vienna: exceptional episodic events, such as the transformation of Saharan dust by the impact of the city of Vienna, will be discussed and compared to samples obtained at a high alpine site.

  15. Interlocus gene conversion events introduce deleterious mutations into at least 1% of human genes associated with inherited disease.

    Science.gov (United States)

    Casola, Claudio; Zekonyte, Ugne; Phillips, Andrew D; Cooper, David N; Hahn, Matthew W

    2012-03-01

    Establishing the molecular basis of DNA mutations that cause inherited disease is of fundamental importance to understanding the origin, nature, and clinical sequelae of genetic disorders in humans. The majority of disease-associated mutations constitute single-base substitutions and short deletions and/or insertions resulting from DNA replication errors and the repair of damaged bases. However, pathological mutations can also be introduced by nonreciprocal recombination events between paralogous sequences, a phenomenon known as interlocus gene conversion (IGC). IGC events have thus far been linked to pathology in more than 20 human genes. However, the large number of duplicated gene sequences in the human genome implies that many more disease-associated mutations could originate via IGC. Here, we have used a genome-wide computational approach to identify disease-associated mutations derived from IGC events. Our approach revealed hundreds of known pathological mutations that could have been caused by IGC. Further, we identified several dozen high-confidence cases of inherited disease mutations resulting from IGC in ∼1% of all genes analyzed. About half of the donor sequences associated with such mutations are functional paralogous genes, suggesting that epistatic interactions or differential expression patterns will determine the impact upon fitness of specific substitutions between duplicated genes. In addition, we identified thousands of hitherto undescribed and potentially deleterious mutations that could arise via IGC. Our findings reveal the extent of the impact of interlocus gene conversion upon the spectrum of human inherited disease.

  16. Cryogenic dark matter search (CDMS II): Application of neural networks and wavelets to event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Attisha, Michael J. [Brown U.

    2006-01-01

    The Cryogenic Dark Matter Search (CDMS) experiment is designed to search for dark matter in the form of Weakly Interacting Massive Particles (WIMPs) via their elastic scattering interactions with nuclei. This dissertation presents the CDMS detector technology and the commissioning of two towers of detectors at the deep underground site in Soudan, Minnesota. CDMS detectors comprise crystals of Ge and Si at temperatures of 20 mK which provide ~keV energy resolution and the ability to perform particle identification on an event-by-event basis. Event identification is performed via a two-fold interaction signature: an ionization response and an athermal phonon response. Photons and charged particles result in electron recoils in the crystal, while neutrons and WIMPs result in nuclear recoils. Since the ionization response is quenched by a factor ~3 (~2) in Ge (Si) for nuclear recoils compared to electron recoils, the relative amplitude of the two detector responses allows discrimination between recoil types. The primary source of background events in CDMS arises from electron recoils in the outer 50 µm of the detector surface which have a reduced ionization response. We develop a quantitative model of this ‘dead layer’ effect and successfully apply the model to Monte Carlo simulation of CDMS calibration data. Analysis of data from the two-tower run of March-August 2004 is performed, resulting in the world’s most sensitive limits on the spin-independent WIMP-nucleon cross-section, with a 90% C.L. upper limit of 1.6 × 10^-43 cm^2 on Ge for a 60 GeV WIMP. An approach to performing surface event discrimination using neural networks and wavelets is developed. A Bayesian methodology for classifying surface events using neural networks is found to provide an optimized method based on minimization of the expected dark matter limit. The discrete wavelet analysis of CDMS phonon pulses improves surface event discrimination in conjunction with the neural
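    The event-by-event discrimination described above reduces, in essence, to a cut on the ionization yield (the ratio of ionization to phonon energy). A minimal sketch, assuming the ~3x quenching in Ge quoted in the abstract; the energies and the cut value are invented for illustration, and this is not CDMS analysis code:

```python
# Illustrative sketch (not CDMS analysis code): discriminating electron from
# nuclear recoils by ionization yield. The cut value of 0.6 is an invented
# example, not a calibrated threshold.

def ionization_yield(e_ionization_kev, e_phonon_kev):
    """Ratio of ionization to phonon (recoil) energy: ~1 for electron
    recoils, ~1/3 in Ge for nuclear recoils."""
    return e_ionization_kev / e_phonon_kev

def classify_recoil(e_ionization_kev, e_phonon_kev, cut=0.6):
    """Events below the yield cut are nuclear-recoil (WIMP/neutron) candidates."""
    y = ionization_yield(e_ionization_kev, e_phonon_kev)
    return "nuclear" if y < cut else "electron"
```

    For example, an event with 10 keV ionization and 30 keV phonon energy (yield ≈ 0.33) classifies as a nuclear-recoil candidate, while 28 keV ionization at the same phonon energy classifies as an electron recoil.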

  17. Leading order analysis of neutrino induced dimuon events in the CHORUS experiment

    CERN Document Server

    Kayis-Topaksu, A; Van Dantzig, R; De Jong, M; Oldeman, R G C; Güler, M; Kama, S; Köse, U; Serin-Zeyrek, M; Tolun, P; Catanesi, M G; Muciaccia, M T; Bülte, A; Winter, Klaus; Van de Vyver, B; Vilain, P; Wilquet, G; Saitta, B; Di Capua, E; Ogawa, S; Shibuya, H; Hristova, I R; Kawamura, T; Kolev, D; Litmaath, M; Meinhard, H; Panman, J; Rozanov, A; Tsenov, R; Uiterwijk, J W E; Zucchelli, P; Goldberg, J; Chikawa, M; Song, J S; Yoon, C S; Kodama, K; Ushida, N; Aoki, S; Hara, T; Delbar, T; Favart, D; Grégoire, G; Kalinin, S; Makhlioueva, I; Artamonov, A; Gorbunov, P; Khovansky, V; Shamanov, V; Tsukerman, I; Bruski, N; Frekers, D; Rondeshagen, D; Wolff, T; Hoshino, K; Kawada, J; Komatsu, M; Miyanishi, M; Nakamura, M; Nakano, T; Narita, K; Niu, K; Niwa, K; Nonaka, N; Sato, O; Toshito, T; Buontempo, S; Cocco, A G; D'Ambrosio, N; De Lellis, G; De Rosa, G; Di Capua, F; Fiorillo, G; Marotta, A; Messina, M; Migliozzi, P; Santorelli, R; Scotto-Lavina, L; Strolin, P; Tioukov, V; Okusawa, T; Dore, U; Loverre, P F; Ludovici, L; Rosa, G; Santacesaria, R; Satta, A; Spada, F R; Barbuto, E; Bozza, C; Grella, G; Romano, G; Sirignano, C; Sorrentino, S; Sato, Y; Tezuka, I

    2008-01-01

    We present a leading order QCD analysis of a sample of neutrino induced charged-current events with two muons in the final state originating in the lead-scintillating fibre calorimeter of the CHORUS detector. The results are based on a sample of 8910 neutrino and 430 antineutrino induced opposite-sign dimuon events collected during the exposure of the detector to the CERN Wide Band Neutrino Beam between 1995 and 1998. The analysis yields a value of the charm quark mass of $m_c = (1.26 \pm 0.16 \pm 0.09)$ GeV/c$^2$ and a value of the ratio of the strange to non-strange sea in the nucleon of $\kappa = 0.33 \pm 0.05 \pm 0.05$, improving the results obtained in similar analyses by previous experiments.

  18. Through the eyes of the other: using event analysis to build cultural competence.

    Science.gov (United States)

    Kozub, Mary L

    2013-07-01

    Cultural competence requires more than the accumulation of information about cultural groups. An awareness of the nurse's own culture, beliefs, and values is considered by several transcultural nursing theorists to be essential to the development of cultural competence and the provision of quality patient care. Using Transformational Learning Theory, this article describes event analysis, an active learning tool that uses the nurse's own practice to explore multiple perspectives of an experience, with the goal of transforming the nurse's approach to diversity from an ethnocentric stance, to one of tolerance and consideration for the patient's needs, values, and beliefs with regard to quality of care. Furthermore, the application of the event analysis to multiple settings, including inpatient, educational, and administrative environments, is discussed.

  19. Novel data-mining methodologies for adverse drug event discovery and analysis.

    Science.gov (United States)

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Data-mining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis.

  20. 'Haymaking on Rajac Mt': Tourist event analysis according to gender and age structure

    OpenAIRE

    Brankov Jovana; Bjeljac Željko; Popović Ivan B.

    2009-01-01

    This work, using previous researches of socio-demographic influences upon tourist market, studies habits and behavior of tourists (visitors) on tourist event Haymaking on Rajac mountain, as well as the use of advertising means for the purposes of gaining new information. The poll was conducted among 352 people selected by chance whereas the data analysis was carried out according to gender and aging structure. The aim of this research is to determine if there is a significant difference in th...

  1. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    Science.gov (United States)

    2012-03-01

    Master's thesis by Smile T. Ludovice, Air Force Institute of Technology, Department of Engineering and Management (AFIT/EN), 2950 Hobson Way, Building 640, WPAFB OH 45433-7765.

  2. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  3. Human reliability analysis (HRA) techniques and observational clinical HRA.

    Science.gov (United States)

    Cuschieri, Alfred; Tang, B

    2010-01-01

    This review explains the nature of human reliability analysis (HRA) methods developed and used for predicting safety in high-risk human activities. HRA techniques have evolved over the years and have become less subjective as a result of the inclusion of (i) cognitive factors in the man-machine interface and (ii) high and low dependency levels between human failure events (HFEs). All, however, remain probabilistic in the assessment of safety. When these techniques, developed to assess the safety of high-risk industries (nuclear, aerospace, etc.) in which catastrophic failures of the complex man-machine interface are fortunately rare, are translated to clinical operative surgery (with its high incidence of human errors), the system loses its subjectivity, since HFEs can be assessed and studied prospectively on the basis of an objective data capture of errors enacted during a defined clinical activity. The observational clinical HRA (OC-HRA) was developed specifically for this purpose, initially for laparoscopic general surgery. It has, however, been used by other surgical specialties. OC-HRA has the additional merit of objectively determining the proficiency of a surgeon in executing specific interventions, and is adaptable to the evaluation of safety and proficiency in clinical activities within the preoperative and postoperative periods.

  4. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    Science.gov (United States)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate among disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where differences in opinion among response team members contribute to the definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. 
However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
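    The arithmetic behind such an event tree is simply the multiplication of consensus conditional probabilities along a branch. A minimal sketch with invented numbers (not drawn from any real crisis):

```python
# Minimal sketch of event-tree arithmetic: consensus conditional
# probabilities (hypothetical values, not from any real volcano crisis)
# multiply along a branch to give the probability of the terminal outcome.

branch = {
    "unrest": 0.9,                  # P(unrest continues)
    "magmatic | unrest": 0.5,       # P(unrest is magmatic, given unrest)
    "eruption | magmatic": 0.3,     # P(eruption, given magmatic unrest)
}

def branch_probability(conditionals):
    """Multiply the conditional probabilities along one branch of the tree."""
    p = 1.0
    for prob in conditionals.values():
        p *= prob
    return p

p_eruption = branch_probability(branch)  # 0.9 * 0.5 * 0.3 = 0.135
```

    Updating the tree as a crisis proceeds amounts to revising the conditional probabilities and recomputing the branch products.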

  5. Analysis and Modelling of Taste and Odour Events in a Shallow Subtropical Reservoir

    Directory of Open Access Journals (Sweden)

    Edoardo Bertone

    2016-08-01

    Full Text Available Understanding and predicting Taste and Odour events is as difficult as it is critical for drinking water treatment plants. Following a number of events in recent years, a comprehensive statistical analysis of data from Lake Tingalpa (Queensland, Australia) was conducted. Historical manual sampling data, as well as data remotely collected by a vertical profiler, were collected; regression analysis and self-organising maps were then used to determine correlations between Taste and Odour compounds and potential input variables. Results showed that the predominant Taste and Odour compound was geosmin. Although one of the main predictors was the occurrence of cyanobacteria blooms, it was noticed that the cyanobacteria species was also critical. Additionally, water temperature, reservoir volume and oxidised nitrogen availability were key inputs determining the occurrence and magnitude of the geosmin peak events. Based on the results of the statistical analysis, a predictive regression model was developed to provide indications of the potential occurrence, and magnitude, of peaks in geosmin concentration. Additionally, it was found that the blue green algae probe of the lake’s vertical profiler has the potential to be used as one of the inputs for an automated geosmin early warning system.
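    A regression model of the kind described can be sketched as ordinary least squares on the reported predictors (water temperature, reservoir volume, oxidised nitrogen); the data and coefficients below are synthetic and do not reproduce the paper's model:

```python
import numpy as np

# Hedged sketch of a multiple linear regression for geosmin concentration.
# All numbers are synthetic; predictor units and values are invented.

# Synthetic design matrix: [temperature (deg C), volume (GL), NOx (mg/L)]
X = np.array([
    [24.0, 28.0, 0.02],
    [27.0, 25.0, 0.01],
    [21.0, 30.0, 0.05],
    [29.0, 22.0, 0.01],
    [23.0, 27.0, 0.03],
])
y = np.array([12.0, 35.0, 4.0, 55.0, 10.0])  # geosmin (ng/L), synthetic

# Add an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(temp, volume, nox):
    """Predicted geosmin concentration for one set of inputs."""
    return float(coef @ np.array([1.0, temp, volume, nox]))
```

    An early-warning system would evaluate `predict` on live profiler readings and flag predictions above an action threshold.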

  6. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Directory of Open Access Journals (Sweden)

    Matthew Bucknor

    2017-03-01

    Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  7. Advanced reactor passive system reliability demonstration analysis for an external event

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  8. Rockfall events in the European Alps: analysis of topography and permafrost conditions at the detachment zones

    Science.gov (United States)

    Naegeli, Barbara; Nötzli, Jeannette; Fischer, Luzia

    2010-05-01

    The occurrence and changes of permafrost can influence the stability of high mountain rock walls, among other factors such as topography, geology or hydrology. Knowledge of the connection between rock fall and permafrost is still limited, but several indications link past rock fall events and the warming of permafrost: for example, the observation of numerous rock fall events in the unusually dry and hot summer of 2003, the presence of ice in detachment zones, and the demonstrated reduction of shear strength in ice-filled fractures with warming temperatures. The main objective of the presented study is the analysis of past rock fall starting zones in the European Alps, with focus on permafrost and topography, in order to learn about the conditions under which such instabilities develop. An inventory of recent rock fall events in the European Alps (mainly the past 100 years) has been established and the collected data has been analysed. The work presented builds on and extends similar earlier studies of rock fall starting zones in permafrost areas. In a first step, a descriptive statistical analysis of topographic and geological characteristics of the detachment zones was conducted; subsequently, the permafrost conditions in the detachment zones were investigated: a) the evaluation of the elevation and the aspect for each detachment zone and the relative comparison with two different permafrost boundary estimations from regional permafrost models; b) the relation of the mean annual air temperature and the potential solar radiation for each detachment zone; c) the assessment of the topographical situation of each detachment zone. Despite uncertainties in the raw data, results corroborate findings from earlier studies that the majority of the rock fall events in the inventory originated from areas with potentially warm permafrost. Interestingly, a large proportion of events originates from areas below ridges and peaks.

  9. ANALYSIS OF SHIP ARRIVAL FUNCTIONS IN DISCRETE EVENT SIMULATION MODELS OF AN IRON ORE EXPORT TERMINAL

    Directory of Open Access Journals (Sweden)

    Romeu Rodrigues

    2016-04-01

    Full Text Available This work evaluates the results of different distribution functions for ship arrivals in the simulation of an iron ore export terminal under implementation. The analysis is based on a real case: a terminal designed for installation by a Brazilian company. The analysis was carried out with a discrete event simulation model. Variations of up to 566% in the predicted demurrage paid to ships are detected. A database of 2,518 ship arrivals demonstrates that, contrary to the recommendations of the traditional literature, the Pearson 6 function represents the arrival distribution at an iron ore shipping terminal better than the Exponential, Erlang and Weibull functions.
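    The role of the arrival distribution can be illustrated with a minimal single-berth discrete-event simulation in which interarrival times follow a Pearson 6 (beta prime) distribution, generated here as a ratio of two gamma variates; the shape, scale and service-time values are invented for illustration and are not the paper's fitted parameters:

```python
import random

# Sketch of a single-berth queue with Pearson 6 (beta prime) interarrival
# times. A beta prime variate is the ratio of two independent gamma variates.
# All parameter values below are illustrative, not fitted to real data.

def pearson6(alpha1, alpha2, scale, rng):
    """Draw one Pearson 6 (beta prime) variate, scaled to hours."""
    return scale * rng.gammavariate(alpha1, 1.0) / rng.gammavariate(alpha2, 1.0)

def simulate_queue(n_ships, alpha1=2.0, alpha2=3.0, scale=30.0,
                   service_h=20.0, seed=42):
    """Return total ship waiting time (a proxy for demurrage exposure), hours."""
    rng = random.Random(seed)
    t, berth_free, waiting = 0.0, 0.0, 0.0
    for _ in range(n_ships):
        t += pearson6(alpha1, alpha2, scale, rng)   # next arrival time
        start = max(t, berth_free)                  # wait if berth is busy
        waiting += start - t
        berth_free = start + service_h              # berth occupied for loading
    return waiting
```

    Swapping `pearson6` for an exponential or Weibull generator while holding the mean fixed is the kind of experiment that produces the large demurrage differences the abstract reports.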

  10. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the sequence of walking images. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features’ dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.
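    The pipeline described above (silhouette shift-correlation, DFT features, normalization, PCA) can be sketched on a synthetic silhouette sequence as follows; the array sizes and the one-pixel shift are illustrative choices, not the paper's exact parameters:

```python
import numpy as np

# Hedged sketch of the gait feature pipeline on synthetic data: correlate
# each binary silhouette with a shifted copy, take DFT magnitudes as raw
# features, normalise, then reduce dimensions with PCA (via SVD).

rng = np.random.default_rng(0)
seq = (rng.random((16, 32, 24)) > 0.7).astype(float)  # (t, y, x) silhouettes

# Correlate each silhouette with a copy shifted by one pixel along t, y and x.
shifted = np.roll(seq, shift=(1, 1, 1), axis=(0, 1, 2))
corr = (seq * shifted).sum(axis=(1, 2))               # one value per frame

# DFT magnitude as the raw gait feature, then zero-mean/unit-variance scaling.
feat = np.abs(np.fft.rfft(corr))
feat = (feat - feat.mean()) / feat.std()

# PCA via SVD on a (subjects x features) matrix of such vectors
# (here: noisy copies standing in for different walking sequences).
F = np.stack([feat + 0.01 * rng.standard_normal(feat.shape) for _ in range(8)])
Fc = F - F.mean(axis=0)
_, _, Vt = np.linalg.svd(Fc, full_matrices=False)
reduced = Fc @ Vt[:3].T                               # keep 3 components
```

    Recognition would then compare the reduced vectors of a probe sequence against a gallery, e.g. by nearest neighbour.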

  11. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    Science.gov (United States)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step to understand the process, but also to drive proper decisions on the protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that caused, on the 14th of October 2014, the formation of a little debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that a great part of the equations developed for the computation of the bedload transport capacity, like for example those of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, that are not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations for assessing the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. Consequently, the following experimental bedload transport equation has been applied (D'Agostino and Lenzi, 1999), which is valid for ordinary flood events (q: unit water discharge; qc: unit discharge of bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q − qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
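    The D'Agostino and Lenzi (1999) relation quoted in the abstract, qs = 0.04 (q − qc) S^(3/2), translates directly into code; the discharge values in the usage example are illustrative, not the back-calculated Rio Vanti figures:

```python
# Direct transcription of the bedload relation quoted above:
#   q_s = 0.04 * (q - q_c) * S^(3/2)
# with q the unit water discharge, q_c the unit discharge at bedload
# transport initiation, and S the thalweg slope.

def unit_bedload_rate(q, q_c, slope):
    """Unit bedload rate q_s for unit discharge q above the initiation
    discharge q_c, on thalweg slope S (m/m)."""
    if q <= q_c:
        return 0.0          # no transport below the initiation discharge
    return 0.04 * (q - q_c) * slope ** 1.5
```

    Integrating this rate over the reconstructed hydrograph (summing q_s over time steps and multiplying by channel width) gives the total transported volume that is compared against the DoD estimate.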

  12. Cost analysis of adverse events associated with non-small cell lung cancer management in France

    Directory of Open Access Journals (Sweden)

    Chouaid C

    2017-07-01

    , anemia (€5,752 per event), dehydration (€5,207 per event) and anorexia (€4,349 per event). Costs were mostly driven by hospitalization costs. Conclusion: Among the AEs identified, a majority appeared to have an important economic impact, with a management cost of at least €2,000 per event, mainly driven by hospitalization costs. This study may be of interest for economic evaluations of new interventions in NSCLC. Keywords: non-small cell lung cancer, adverse events, cost analysis, chemotherapy, immunotherapy

  13. Proteomic Analysis of the Human Olfactory Bulb.

    Science.gov (United States)

    Dammalli, Manjunath; Dey, Gourav; Madugundu, Anil K; Kumar, Manish; Rodrigues, Benvil; Gowda, Harsha; Siddaiah, Bychapur Gowrishankar; Mahadevan, Anita; Shankar, Susarla Krishna; Prasad, Thottethodi Subrahmanya Keshava

    2017-08-01

    The importance of olfaction to human health and disease is often underappreciated. Olfactory dysfunction has been reported in association with a host of common complex diseases, including neurological diseases such as Alzheimer's disease and Parkinson's disease. Olfaction, or the sense of smell, is also important for most mammals' optimal engagement with their environment. Indeed, animals have developed sophisticated olfactory systems to detect and interpret the rich information presented to them to assist in day-to-day activities such as locating food sources, differentiating food from poisons, identifying mates, promoting reproduction, avoiding predators, and averting death. In this context, the olfactory bulb is a vital component of the olfactory system, receiving sensory information from the axons of the olfactory receptor neurons located in the nasal cavity, and it is the first place where the olfactory information is processed. We report in this study original observations on the human olfactory bulb proteome in healthy subjects, using a high-resolution mass spectrometry-based proteomic approach. We identified 7750 nonredundant proteins from human olfactory bulbs. Bioinformatics analysis of these proteins showed their involvement in biological processes associated with signal transduction, metabolism, transport, and olfaction. These new observations provide a crucial baseline molecular profile of the human olfactory bulb proteome, and should assist the future discovery of biomarker proteins and novel diagnostics associated with diseases characterized by olfactory dysfunction.

  14. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data.

    Science.gov (United States)

    Saramago, Pedro; Chuang, Ling-Hsiang; Soares, Marta O

    2014-09-10

    Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest.
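    The joint synthesis idea can be sketched as a single Weibull log-likelihood in which individual patient-level records contribute density (event) or survival (censored) terms, and an aggregate arm contributes a binomial term through the Weibull CDF at its follow-up time. The parameterization and the data below are invented for illustration; the paper's actual model is a full network meta-analysis, not this single-arm sketch:

```python
import math

# Hedged sketch: joint Weibull log-likelihood for mixed individual
# patient-level (IPD) and aggregate time-to-event data, assuming a common
# underlying time-to-healing distribution. Data and parameters are invented.

def weibull_logpdf(t, shape, scale):
    """Log density of a Weibull(shape, scale) distribution at time t."""
    return (math.log(shape / scale) + (shape - 1) * math.log(t / scale)
            - (t / scale) ** shape)

def weibull_logsurv(t, shape, scale):
    """Log survival function: log P(T > t)."""
    return -(t / scale) ** shape

def joint_loglik(ipd, aggregate, shape, scale):
    """ipd: list of (time, event_flag); aggregate: (events, n, follow_up)."""
    ll = 0.0
    for t, event in ipd:
        # Observed events contribute the density; censored times the survival.
        ll += weibull_logpdf(t, shape, scale) if event \
              else weibull_logsurv(t, shape, scale)
    # Aggregate arm: k of n healed by follow_up ~ Binomial(n, CDF(follow_up)).
    k, n, follow_up = aggregate
    p = 1.0 - math.exp(weibull_logsurv(follow_up, shape, scale))
    ll += (k * math.log(p) + (n - k) * math.log(1.0 - p)
           + math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1))
    return ll
```

    Maximizing this likelihood (or embedding it in a Bayesian sampler) over shape and scale is what makes the absolute effectiveness estimates sensitive to the distributional assumption, as the abstract notes.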

  15. Climate Change, Extreme Weather Events, and Human Health Implications in the Asia Pacific Region.

    Science.gov (United States)

    Hashim, Jamal Hisham; Hashim, Zailina

    2016-03-01

    The Asia Pacific region is regarded as the most disaster-prone area of the world. Since 2000, 1.2 billion people have been exposed to hydrometeorological hazards alone through 1215 disaster events. The impacts of climate change on meteorological phenomena and environmental consequences are well documented. However, the impacts on health are more elusive. Nevertheless, climate change is believed to alter weather patterns on the regional scale, giving rise to extreme weather events. The impacts from extreme weather events are definitely more acute and traumatic in nature, leading to deaths and injuries, as well as debilitating and fatal communicable diseases. Extreme weather events include heat waves, cold waves, floods, droughts, hurricanes, tropical cyclones, heavy rain, and snowfalls. Globally, within the 20-year period from 1993 to 2012, more than 530 000 people died as a direct result of almost 15 000 extreme weather events, with losses of more than US$2.5 trillion in purchasing power parity. © 2015 APJPH.

  16. On Event/Time Triggered and Distributed Analysis of a WSN System for Event Detection, Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Sofia Maria Dima

    2016-01-01

    Full Text Available Event detection in realistic WSN environments is a critical research domain, and environmental monitoring comprises one of its most pronounced applications. Although efforts related to environmental applications have been presented in the current literature, there is a significant lack of investigation into the performance of such systems when applied in wireless environments. Aiming to address this shortage, in this paper an advanced multimodal approach is followed based on fuzzy logic. The proposed fuzzy inference system (FIS) is implemented on TelosB motes and evaluates the probability of fire detection while aiming towards power conservation. In addition to a straightforward centralized approach, a distributed implementation of the above FIS is also proposed, aiming towards network congestion reduction while optimally distributing the energy consumption among network nodes so as to maximize network lifetime. Moreover, this work proposes an event-based execution of the aforementioned FIS, aiming to further reduce the computational as well as the communication cost compared to a periodic, time-triggered FIS execution. As a final contribution, performance metrics acquired from all the proposed FIS implementation techniques are thoroughly compared and analyzed with respect to critical network conditions, aiming to offer a realistic evaluation and thus the extraction of objective conclusions.
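    A fuzzy inference step of the general kind described can be sketched with triangular membership functions and two hand-written rules; the membership breakpoints, rule weights and input choices are invented and are not the TelosB implementation:

```python
# Minimal fuzzy-inference sketch: two inputs (temperature, smoke obscuration)
# mapped through triangular membership functions and two invented rules to a
# fire-likelihood score in [0, 1]. Not the paper's actual rule base.

def tri(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fire_probability(temp_c, smoke_pct):
    hot = tri(temp_c, 40.0, 70.0, 100.0)
    smoky = tri(smoke_pct, 10.0, 50.0, 90.0)
    # Rule 1: hot AND smoky -> high fire likelihood (min as fuzzy AND)
    # Rule 2: hot alone     -> moderate likelihood (down-weighted)
    r1 = min(hot, smoky)
    r2 = 0.4 * hot
    return max(r1, r2)   # max as rule aggregation
```

    An event-triggered variant, as proposed above, would evaluate `fire_probability` only when a sensor reading crosses a threshold, rather than on every sampling period.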

  17. Characterization of a Flood Event through a Sediment Analysis: The Tescio River Case Study

    Directory of Open Access Journals (Sweden)

    Silvia Di Francesco

    2016-07-01

    Full Text Available This paper presents the hydrological analysis and grain size characteristics of fluvial sediments in a river basin and their combination to characterize a flood event. The overall objective of the research is the development of a practical methodology based on experimental surveys to reconstruct the hydraulic history of ungauged river reaches on the basis of the modifications detected on the riverbed during the dry season. The grain size analysis of fluvial deposits usually requires considerable technical and economic effort, and traditional sieving based on physical sampling is not appropriate to adequately represent the spatial distribution of sediments in a wide area of a riverbed with a reasonable number of samples. The use of photographic sampling techniques, on the other hand, allows for the quick and effective determination of the grain size distribution in large river stretches, through the use of a digital camera and specific graphical algorithms. Photographic sampling is employed to characterize the riverbed in a 3 km ungauged reach of the Tescio River, a tributary of the Chiascio River, located in central Italy and representative of many rivers in the same geographical area. To this end, the particle size distribution is reconstructed through the analysis of digital pictures of the sediments taken on the riverbed in dry conditions. The sampling was performed after a flood event of known duration, which allowed for the identification of the removal of the armor in one section along the river reach under investigation. The volume and composition of the eroded sediments made it possible to calculate the average flow rate associated with the flood event which caused the erosion, by means of sediment transport laws and the hydrological analysis of the river basin. A hydraulic analysis of the river stretch under investigation was employed to verify the validity of the proposed procedure.
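The characteristic grain sizes that such a photographic survey feeds into sediment transport laws can be computed directly from the digitized diameters. The helper and sample values below are illustrative, not the paper's data:

```python
# Characteristic grain sizes (d50, d84) from a sample of particle
# diameters, e.g. b-axis lengths digitized from riverbed photographs.

def percentile_diameter(diams, p):
    """Linearly interpolated p-th percentile (p in percent) of the sample."""
    s = sorted(diams)
    k = (len(s) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

diams_mm = [12, 18, 22, 25, 31, 36, 44, 52, 60, 75]  # hypothetical sample
d50 = percentile_diameter(diams_mm, 50)  # median grain size
d84 = percentile_diameter(diams_mm, 84)  # coarse-tail size used in roughness laws
```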

  18. Spatial-Temporal Feature Analysis on Single-Trial Event Related Potential for Rapid Face Identification.

    Science.gov (United States)

    Jiang, Lei; Wang, Yun; Cai, Bangyu; Wang, Yueming; Wang, Yiwen

    2017-01-01

    The event-related potential (ERP) is the brain response measured in electroencephalography (EEG), which reflects the process of human cognitive activity. ERP has been introduced into brain computer interfaces (BCIs) to communicate the subject's intention to the computer. Due to the low signal-to-noise ratio of EEG, most ERP studies are based on grand-averaging over many trials. Recently, single-trial ERP detection has attracted more attention, as it enables real-time processing tasks such as rapid face identification. All the targets to be retrieved may appear only once, and there is no knowledge of target labels for averaging. More interestingly, how the features contribute temporally and spatially to single-trial ERP detection has not been fully investigated. In this paper, we propose to implement a local-learning-based (LLB) feature extraction method to investigate the importance of spatial-temporal components of ERP in a task of rapid face identification using single-trial detection. Compared to previous methods, the LLB method preserves the nonlinear structure of the EEG signal distribution and analyzes the importance of the original spatial-temporal components via optimization in feature space. As a data-driven method, the weighting of the spatial-temporal components does not depend on the ERP detection method. The importance weights are optimized by making the targets more distinct from non-targets in feature space, and a regularization penalty is introduced in the optimization to obtain sparse weights. This spatial-temporal feature extraction method is evaluated on the EEG data of 15 participants performing a face identification task using a rapid serial visual presentation paradigm. Compared with other methods, the proposed spatial-temporal analysis method uses sparser features (only 10% of the total) and achieves comparable single-trial ERP detection performance (98%) to using the whole feature set, across different detection methods.
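A simple stand-in for the spatial-temporal weighting idea is to score each component by class separation between target and non-target trials and keep a sparse top fraction. This Fisher-score sketch is a simplification, not a reproduction of the paper's local-learning-based optimization:

```python
import numpy as np

# Rank spatio-temporal ERP components by a Fisher-style separation
# score between target and non-target trials, then keep a sparse
# top fraction (e.g. 10%) of components.

def fisher_scores(targets, nontargets):
    """targets, nontargets: (n_trials, n_features) arrays."""
    mt, mn = targets.mean(axis=0), nontargets.mean(axis=0)
    vt, vn = targets.var(axis=0), nontargets.var(axis=0)
    return (mt - mn) ** 2 / (vt + vn + 1e-12)

def top_fraction(scores, frac=0.10):
    k = max(1, int(len(scores) * frac))
    return np.argsort(scores)[::-1][:k]  # indices of the best components
```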

  19. Incident stressful and traumatic life events and human immunodeficiency virus sexual transmission risk behaviors in a longitudinal, multisite cohort study.

    Science.gov (United States)

    Pence, Brian Wells; Raper, James L; Reif, Susan; Thielman, Nathan M; Leserman, Jane; Mugavero, Michael J

    2010-09-01

    To assess the association between incident stressful life events (e.g., sexual and physical assault; housing instability; and major financial, employment, and legal difficulties) and unprotected anal or vaginal sexual intercourse (unprotected sex) among people living with human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (PLWHA). We assessed incident stressful events and unprotected sex over 27 months in 611 participants in an eight-site, five-state study in the Southeast United States. Using mixed-effects logistic models and separately estimating between-person and within-person associations, we assessed the association of incident stressful events with unprotected sex with all partners, HIV-positive partners, and HIV-negative/serostatus-unknown partners. Incident stressful events reported at one third or more of interviews included major illness, injury or accident (non-HIV-related); major illness of a family member/close friend; death of a family member/close friend; financial stresses; and relationship stresses. In multivariable models, each additional moderately stressful event an individual experienced at a given time point above his or her norm (within-person association) was associated with a 24% to 27% increased odds of unprotected sex for each partner type. Risk reduction among PLWHA remains a major focus of efforts to combat the HIV epidemic. Incident stressful events are exceedingly common in the lives of PLWHA and are associated with increased unprotected sex. Efforts to either prevent the occurrence of such events (e.g., financial or relationship counseling) or address their sequelae (e.g., coping skills or other behavioral counseling) may help reduce secondary HIV transmission.
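The between-/within-person split used in such mixed-effects models amounts to person-mean centering of the stressor count. The data structure and numbers below are illustrative, not the study's data:

```python
# Decompose each person's stressful-event count into a between-person
# component (their mean across interviews) and a within-person
# component (the interview's deviation from that mean).

def decompose(visits):
    """visits: {person_id: [event count at each interview]}."""
    out = {}
    for pid, counts in visits.items():
        mean = sum(counts) / len(counts)
        out[pid] = [(mean, c - mean) for c in counts]
    return out

# A person reporting 2 then 4 events has between = 3.0 at both visits
# and within = -1.0 then +1.0; both columns enter the logistic model.
parts = decompose({"p1": [2, 4]})
```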

  20. Meta-Analysis of Relation of Vital Exhaustion to Cardiovascular Disease Events.

    Science.gov (United States)

    Cohen, Randy; Bavishi, Chirag; Haider, Syed; Thankachen, Jincy; Rozanski, Alan

    2017-04-15

    To assess the net impact of vital exhaustion on cardiovascular events and all-cause mortality, we conducted a systematic search of PubMed, EMBASE, and PsycINFO (through April 2016) to identify all studies which investigated the relation between vital exhaustion (VE) and health outcomes. Inclusion criteria were as follows: (1) a cohort study (prospective cohort or historical cohort) consisting of adults (>18 years); (2) at least 1 self-reported or interview-based assessment of VE or exhaustion; (3) evaluated the association between vital exhaustion or exhaustion and relevant outcomes; and (4) reported adjusted risk estimates of vital exhaustion/exhaustion for outcomes. Maximally adjusted effect estimates with 95% CIs, along with variables used for adjustment in multivariate analysis, were also abstracted. The primary study outcome was cardiovascular events. Secondary outcomes were stroke and all-cause mortality. Seventeen studies (19 comparisons) with a total of 107,175 participants were included in the analysis. Mean follow-up was 6 years. VE was significantly associated with an increased risk for cardiovascular events (relative risk 1.53, 95% CI 1.28 to 1.83, p exhaustion, such as occupational burnout. Copyright © 2017 Elsevier Inc. All rights reserved.
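A pooled estimate like RR 1.53 (95% CI 1.28 to 1.83) is obtained by inverse-variance weighting of per-study log relative risks. Below is a DerSimonian-Laird random-effects sketch with invented study values, not the studies from this meta-analysis:

```python
import math

# DerSimonian-Laird random-effects pooling of relative risks, given
# each study's RR and 95% CI; the per-study SE of log-RR is recovered
# from the CI width.

def pool_random_effects(rrs, ci_los, ci_his):
    logs = [math.log(r) for r in rrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
           for lo, hi in zip(ci_los, ci_his)]
    w = [1 / se**2 for se in ses]                 # fixed-effect weights
    mu = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - mu)**2 for wi, li in zip(w, logs))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c)     # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]
    mu_re = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(mu_re),
            math.exp(mu_re - 1.96 * se_re),
            math.exp(mu_re + 1.96 * se_re))
```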

  1. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    Science.gov (United States)

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, a descriptive analysis was performed, and national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, together with reporting rate ratios and their respective 95% confidence intervals, were calculated for first-time vaccinees stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events generally have a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of risk of infection by yellow fever virus. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Quadrivalent human papillomavirus vaccine and autoimmune adverse events: a case-control assessment of the vaccine adverse event reporting system (VAERS) database.

    Science.gov (United States)

    Geier, David A; Geier, Mark R

    2017-02-01

    Gardasil is a quadrivalent human papillomavirus (HPV4) vaccine that was approved for use by the US Food and Drug Administration in June 2006. HPV4 vaccine is routinely recommended by the Advisory Committee on Immunization Practices for administration to women in the USA who are 11-12 years old. Previous studies suggested that HPV4 vaccine administration was associated with autoimmune diseases. As a consequence, an epidemiological assessment of the vaccine adverse event reporting system database was undertaken for adverse event reports associated with vaccines administered from 2006 to 2014 to 6-39 year-old recipients with a listed US residence and a specified female gender. Cases with the serious autoimmune adverse event (SAAE) outcomes of gastroenteritis (odds ratio (OR) 4.627, 95% confidence interval (CI) 1.892-12.389), rheumatoid arthritis (OR 5.629, 95% CI 2.809-12.039), thrombocytopenia (OR 2.178, 95% CI 1.222-3.885), systemic lupus erythematosus (OR 7.626, 95% CI 3.385-19.366), vasculitis (OR 3.420, 95% CI 1.211-10.408), alopecia (OR 8.894, 95% CI 6.255-12.914), CNS demyelinating conditions (OR 1.585, 95% CI 1.129-2.213), ovarian damage (OR 14.961, 95% CI 6.728-39.199), or irritable bowel syndrome (OR 10.021, 95% CI 3.725-33.749) were significantly more likely than controls to have received HPV4 vaccine (median onset of initial symptoms ranged from 3 to 37 days post-HPV4 vaccination). Cases with the outcome of Guillain-Barre syndrome (OR 0.839, 95% CI 0.601-1.145) were no more likely than controls to have received HPV4 vaccine. In addition, cases with the known HPV4-related outcome of syncope were significantly more likely than controls to have received HPV4 vaccine (OR 5.342, 95% CI 4.942-5.777). Cases with the general health outcomes of infection (OR 0.765, 95% CI 0.428-1.312), conjunctivitis (OR 1.010, 95% CI 0.480-2.016), diarrhea (OR 0.927, 95% CI 0.809-1.059), or pneumonia (OR 0.785, 95% CI 0.481-1.246) were no more likely than controls to have received HPV4 vaccine.
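Each OR and CI in such a case-control assessment comes from a 2x2 exposure table. A sketch of the computation with Woolf's confidence interval; the counts are invented, not VAERS figures:

```python
import math

# Odds ratio and Woolf 95% CI from a 2x2 case-control table.

def odds_ratio(a, b, c, d):
    """a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```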

  3. Analysis of mutual events of Galilean satellites observed from VBO during 2014-2015

    Science.gov (United States)

    Vasundhara, R.; Selvakumar, G.; Anbazhagan, P.

    2017-06-01

    Results of analysis of 23 events of the 2014-2015 mutual event series from the Vainu Bappu Observatory are presented. Our intensity distribution model for the eclipsed/occulted satellite is based on the criterion that it simulates a rotational light curve that matches the ground-based light curve. Dichotomy in the scattering characteristics of the leading and trailing sides explains the basic shape of the rotational light curves of Europa, Ganymede and Callisto. In the case of Io, the albedo map (courtesy United States Geological Survey) along with global values of scattering parameters works well. Mean values of residuals in (O - C) along and perpendicular to the track are found to be -3.3 and -3.4 mas, respectively, compared to 'L2' theory for the seven 2E1/2O1 events. The corresponding rms values are 8.7 and 7.8 mas, respectively. For the five 1E3/1O3 events, the along and perpendicular to the track mean residuals are 5.6 and 3.2 mas, respectively. The corresponding rms residuals are 6.8 and 10.5 mas, respectively. We compare the results using the chosen model (Model 1) with a uniform but limb-darkened disc (Model 2). The residuals with Model 2 of the 2E1/2O1 and 1E3/1O3 events indicate a bias along the satellite track. The extent and direction of bias are consistent with the shift of the light centre from the geometric centre. Results using Model 1, which intrinsically takes into account the intensity distribution, show no such bias.
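The quoted mean and rms residual figures are computed straightforwardly from the per-event (O - C) values; the sample residuals below are made up:

```python
import math

# Mean and root-mean-square of astrometric (O - C) residuals
# in milliarcseconds (mas).

def mean_and_rms(residuals_mas):
    n = len(residuals_mas)
    mean = sum(residuals_mas) / n
    rms = math.sqrt(sum(r * r for r in residuals_mas) / n)
    return mean, rms

mean, rms = mean_and_rms([-3.0, 3.0, -9.0, 9.0])  # hypothetical residuals
```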

  4. Human action analysis with randomized trees

    CERN Document Server

    Yu, Gang; Liu, Zicheng

    2014-01-01

    This book will provide a comprehensive overview of human action analysis with randomized trees. It will cover both supervised and unsupervised random trees. When a sufficient amount of labeled data is available, supervised random trees provide a fast method for space-time interest point matching. When labeled data is minimal, as in the case of example-based action search, unsupervised random trees are used to leverage the unlabeled data. We describe how randomized trees can be used for action classification, action detection, action search, and action prediction.

  5. Fracturing tests on reservoir rocks: Analysis of AE events and radial strain evolution

    CERN Document Server

    Pradhan, S; Fjær, E; Stenebråten, J; Lund, H K; Sønstebø, E F; Roy, S

    2015-01-01

    Fracturing in reservoir rocks is an important issue for the petroleum industry, as productivity can be enhanced by a controlled fracturing operation. Fracturing also has a major impact on CO2 storage, geothermal installations and gas production at and from reservoir rocks. Therefore, understanding the fracturing behavior of different types of reservoir rocks is a basic need for planning field operations towards these activities. In our study, the fracturing of rock samples is monitored by Acoustic Emission (AE) and post-experiment Computer Tomography (CT) scans. The fracturing experiments have been performed on hollow cylinder cores of different rocks - sandstones and chalks. Our analysis shows that the amplitudes and energies of acoustic events clearly indicate the initiation and propagation of the main fractures. The amplitudes of AE events follow an exponential distribution while the energies follow a power law distribution. The time evolution of the radial strain measured in the fracturing test will later be compared ...
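The exponent of a power-law energy distribution like the one reported here can be estimated by maximum likelihood (the Hill estimator); the energies and cutoff below are illustrative:

```python
import math

# MLE of the exponent alpha in p(x) ~ x^(-alpha) for x >= xmin,
# applied to acoustic emission event energies above a cutoff.

def powerlaw_alpha(energies, xmin):
    tail = [x for x in energies if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)
```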

  6. Joint analysis of panel count data with an informative observation process and a dependent terminal event.

    Science.gov (United States)

    Zhou, Jie; Zhang, Haixiang; Sun, Liuquan; Sun, Jianguo

    2016-07-23

    Panel count data occur in many clinical and observational studies, and in many situations, the observation process may be informative and also there may exist a terminal event such as death which stops the follow-up. In this article, we propose a new joint model for the analysis of panel count data in the presence of both an informative observation process and a dependent terminal event via two latent variables. For the inference on the proposed models, a class of estimating equations is developed and the resulting estimators are shown to be consistent and asymptotically normal. In addition, a lack-of-fit test is provided for assessing the adequacy of the models. Simulation studies suggest that the proposed approach works well for practical situations. A real example from a bladder cancer clinical trial is used to illustrate the proposed methods.

  7. The May 17, 2012 solar event: back-tracing analysis and flux reconstruction with PAMELA

    Science.gov (United States)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, E. C.; De Donato, C.; de Nolfo, G. A.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Lee, M.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Ryan, J. M.; Sarkar, R.; Scotti, V.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stochaj, S.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G. I.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Zverev, V. G.

    2016-02-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  8. Landslide Event on 24 June in Sichuan Province, China: Preliminary Investigation and Analysis

    Directory of Open Access Journals (Sweden)

    Wanlin Meng

    2018-01-01

    Full Text Available This paper reports on a massive landslide event, in which 8 million cubic meters of earth and rocks slid down from the top of a mountain in the village of Xinmo, located in the county of Maoxian, in the province of Sichuan, China, on 24 June 2017. This landslide resulted in 10 fatalities, and 73 people were reported as missing. This paper details the preliminary investigation, the joint-force rescue activity, and the analysis of the nearby topography, rainfall, and seismic fracture zone. The combined effects of large amounts of rainwater, steep topography, a deep-seated sliding interface, and the significant altitude difference between the highest point of the mountain and the Xinmo villagers' houses are considered the main factors that triggered this landslide. For future development in geological disaster-prone areas, this paper provides four main recommendations to reduce casualties and environmental impacts.

  9. Wavelet analysis of EEG for three-dimensional mapping of epileptic events.

    Science.gov (United States)

    Senhadji, L; Dillenseger, J L; Wendling, F; Rocha, C; Kinie, A

    1995-01-01

    This paper is aimed at understanding epileptic patient disorders through the analysis of surface electroencephalograms (EEG). It deals with the detection of spikes or spike-waves based on a nonorthogonal wavelet transform. A multilevel structure is described that locates the temporal segments where abnormal events occur. These events are then visually interpreted by means of a 3D mapping technique. This 3D display makes use of a ray tracing scheme and combines both the functional (the EEG but also its wavelet representation) and the morphological data (acquired from computed tomography [CT] or magnetic resonance imaging [MRI] devices). The results show that a significant reduction of the clinical workload is obtained while the most important episodes are better reviewed and analyzed.
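The detection step can be sketched with a non-orthogonal (Mexican-hat) wavelet: convolve the signal at a spike-like scale and flag samples whose response exceeds a robust threshold. The synthetic trace, scale, and threshold rule below are illustrative assumptions, not the paper's multilevel structure:

```python
import numpy as np

# Spike detection via convolution with a Mexican-hat wavelet; samples
# whose wavelet response exceeds a robust (median-based) threshold are
# flagged as candidate epileptic events.

def mexican_hat(scale, width=8.0):
    t = np.arange(-width * scale, width * scale + 1) / scale
    return (1 - t**2) * np.exp(-t**2 / 2)

def detect_spikes(signal, scale=4, k=5.0):
    resp = np.convolve(signal, mexican_hat(scale), mode="same")
    sigma = np.median(np.abs(resp)) / 0.6745   # robust noise estimate
    return np.where(np.abs(resp) > k * sigma)[0]

rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, 1000)   # synthetic background activity
eeg[500] += 25.0                   # inject one artificial spike
hits = detect_spikes(eeg)
```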

  10. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
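The 4.2% figure is a "proportion mediated": the indirect (via pathological complete response) share of the total treatment effect on a common additive scale. The numbers below are chosen only to reproduce that share; they are not the trial's estimates:

```python
# Proportion of a treatment effect transmitted through a mediator,
# with direct and indirect effects on an additive (e.g. log hazard
# ratio) scale, so that total = direct + indirect.

def proportion_mediated(direct, indirect):
    return indirect / (direct + indirect)

share = proportion_mediated(direct=-0.46, indirect=-0.02)  # about 4.2%
```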

  11. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    Energy Technology Data Exchange (ETDEWEB)

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J. (EG&G Idaho, Inc., Idaho Falls, ID (USA))

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses, which include short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transients without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment bypass nature. After being benchmarked against the APETs in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.
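An event tree of this kind is evaluated by multiplying branch probabilities along each path to an end state. The two-question tree and the probabilities below are invented for illustration, not Sequoyah values:

```python
# Evaluate a simplified event tree: each node maps an outcome to a
# (branch probability, subtree) pair, with subtree = None at a leaf.
# End-state frequencies are the products of branch probabilities
# along each path.

def expand(tree, prob=1.0, path=()):
    leaves = {}
    for outcome, (p, sub) in tree.items():
        if sub is None:
            leaves[path + (outcome,)] = prob * p
        else:
            leaves.update(expand(sub, prob * p, path + (outcome,)))
    return leaves

scet = {
    "h2_burn": (0.3, {"containment_fails": (0.2, None),
                      "containment_holds": (0.8, None)}),
    "no_burn": (0.7, None),
}
end_states = expand(scet)
```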

  12. The effects of the 1996–2012 summer heat events on human mortality in Slovakia

    Directory of Open Access Journals (Sweden)

    Výberči Dalibor

    2015-09-01

    Full Text Available The impacts of summer heat events on the mortality of the Slovak population, both in total and for selected population sub-groups, are the foci of this study. This research is the first of its kind for this population, and therefore one priority was to create a knowledge base for the issue and to evaluate the existing conditions for the heat-mortality relationship in Slovakia. This article also aims to fill a void in current research on these issues in Europe. In addition to overall effects, we focused individually on the major historical heat events which occurred in the summers of 2007, 2010 and 2012. During the heat events, a non-negligible negative response in mortality was recorded, and fatal effects were more pronounced during particularly strong heat events and periods which lasted for two or more days. In general, females and the elderly were the most sensitive groups in the population, and mortality was characterized by several specific effects in individual population groups. The most extreme heat periods were commonly followed by a deficit in mortality, corresponding to a short-term mortality displacement, the pattern of which varied in specific cases. In general, displaced mortality appeared to compensate for a large part of heat-induced excess deaths.

  13. Representations in human visual short-term memory : an event-related brain potential study

    NARCIS (Netherlands)

    Klaver, P; Smid, HGOM; Heinze, HJ

    1999-01-01

    Behavioral measures and event-related potentials (ERPs) were recorded from 12 subjects while performing three delayed matching-to-sample tasks. The task instructions indicated whether stimulus locations, shapes or conjunctions of locations and shapes had to be memorized and matched against a probe.

  14. Interleukin-1 receptor antagonist (IL1RN) is associated with suppression of early carcinogenic events in human oral malignancies.

    Science.gov (United States)

    Shiiba, Masashi; Saito, Kengo; Yamagami, Hitomi; Nakashima, Dai; Higo, Morihiro; Kasamatsu, Atsushi; Sakamoto, Yosuke; Ogawara, Katsunori; Uzawa, Katsuhiro; Takiguchi, Yuichi; Tanzawa, Hideki

    2015-05-01

    Inflammatory abnormalities have been implicated in the pathogenesis of various human diseases, including cancer. Interleukin-1 receptor antagonist (IL1RN) is a potent anti-inflammatory molecule that modulates the biological activity of the proinflammatory cytokine, interleukin-1. The aim of this study was to examine the expression of IL1RN in oral squamous cell carcinomas (OSCCs), and to determine its clinical significance. Expression levels of IL1RN in matched normal and tumor specimens from 39 OSCCs were evaluated using real-time quantitative polymerase chain reaction methods and immunohistochemical analysis. Protein expression of IL1RN was also examined in 18 oral premalignant lesions (OPLs). Expression of IL1RN mRNA was significantly downregulated in OSCCs compared with normal tissues. Decreased expression of IL1RN protein was also observed in OPLs and OSCCs. The IL1RN expression level was lower in the OPL cases with severe dysplasia compared to those with mild/moderate dysplasia. Significantly downregulated IL1RN expression was observed in all OSCC lesion sites examined when compared with the matched normal tissues. However, the decreased level of IL1RN expression did not correspond with tumor progression. Notably, IL1RN expression was higher in the advanced OSCC cases (T3/T4) compared to early cases (T1/T2). Among OSCC samples, relatively higher IL1RN expression was associated with active tumor development in the OSCCs occurring in the buccal mucosa, oral floor, fauces and gingiva, but not the tongue. These data suggest that IL1RN may exhibit opposing characteristics in oral malignancies depending on the stage of cancer development, suppressing early carcinogenic events, yet promoting tumor development in some lesion sites. Thus, IL1RN could represent a reliable biomarker for the early diagnosis of OSCCs. Furthermore, IL1RN may possess unknown and complex functions in developed OSCC.

  15. Personal significance is encoded automatically by the human brain: an event-related potential study with ringtones.

    Science.gov (United States)

    Roye, Anja; Jacobsen, Thomas; Schröger, Erich

    2007-08-01

    In this human event-related brain potential (ERP) study, we have used one's personal ringtone, relative to another person's, presented in a two-deviant passive oddball paradigm to investigate the long-term memory effects of the self-selected personal significance of a sound on the automatic deviance detection and involuntary attention system. Our findings extend the knowledge of long-term effects usually reported in group approaches in the domains of speech, music and environmental sounds. In addition to the usual mismatch negativity (MMN) and P3a component elicited by deviants in contrast to standard stimuli, we observed a posterior ERP deflection directly following the MMN for the personally significant deviant only. This specific impact of personal significance started around 200 ms after sound onset and involved neural generators that were different from the mere physical deviance detection mechanism. Whereas the early part of the P3a component was unaffected by personal significance, the late P3a was enhanced in the ERPs to the personally significant deviant, suggesting that this stimulus was more powerful in attracting attention involuntarily. Following the involuntary attention switch, the personally significant stimulus elicited a widely distributed negative deflection, probably reflecting further analysis of the significant sound involving evaluation of relevance or reorienting to the primary task. Our data show that the personal significance of mobile phone and text message technology, which has developed into a major medium of communication in our modern world, prompts the formation of individual memory representations, which affect the processing of sounds that are not in the focus of attention.

  16. The DNA sequence, annotation and analysis of human chromosome 3

    DEFF Research Database (Denmark)

    Muzny, Donna M; Scherer, Steven E; Kaul, Rajinder

    2006-01-01

    After the completion of a draft human genome sequence, the International Human Genome Sequencing Consortium has proceeded to finish and annotate each of the 24 chromosomes comprising the human genome. Here we describe the sequencing and analysis of human chromosome 3, one of the largest human chromosomes.

  17. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interest, since the quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on the behaviors of Internet users. Six large-scale systems are studied in our experiments, including movie-watching in Netflix and MovieLens, transactions in Ebay, bookmark-collecting in Delicious, and posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of a user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strongly positive correlation between a user's activity and the total number of the user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method can to some extent eliminate the differences in statistics among users caused by their different activity levels, yet the effectiveness depends on the data sets.
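The quantities behind these distributions, a user's activity and interevent times, follow directly from the action timestamps. The timestamps below are illustrative (seconds):

```python
# A user's interevent times (gaps between consecutive actions) and
# activity (actions per unit time), the basic statistics whose
# distributions are examined for heavy tails.

def interevent_times(timestamps):
    ts = sorted(timestamps)
    return [b - a for a, b in zip(ts, ts[1:])]

actions = [0, 5, 6, 30, 31, 200]          # hypothetical action times
taus = interevent_times(actions)
activity = len(actions) / (actions[-1] - actions[0])
```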

  18. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    Science.gov (United States)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days on which the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days was identified in the Klang Valley region during the study period. A total of 126,822 deaths from natural causes was recorded, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events has both immediate and delayed effects on mortality.
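
For reference, an odds ratio with a Wald 95% confidence interval can be computed from a 2x2 exposure table as below. This is a deliberately simplified, unmatched sketch; the study itself uses a time-stratified case-crossover design (conditional logistic regression), and the counts here are invented:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
                exposed  unexposed
    cases          a         b
    controls       c         d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts purely for illustration.
or_, lo, hi = odds_ratio_ci(30, 70, 20, 80)
```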

  19. Two Extreme Climate Events of the Last 1000 Years Recorded in Himalayan and Andean Ice Cores: Impacts on Humans

    Science.gov (United States)

    Thompson, L. G.; Mosley-Thompson, E. S.; Davis, M. E.; Kenny, D. V.; Lin, P.

    2013-12-01

    historically in South America, is concomitant with major droughts in India, the collapse of the Yuan Dynasty and the Black Death that eliminated roughly one third of the global population. Understanding the characteristics and drivers of these 'natural' events is critical to design adaptive measures for a world with over seven billion people and a climate system now influenced by human activities.

  20. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    Energy Technology Data Exchange (ETDEWEB)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Science Applications International Corp., Albuquerque, NM (USA); Sandia National Labs., Albuquerque, NM (USA))

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  1. Parallel Factor Analysis as an exploratory tool for wavelet transformed event-related EEG

    DEFF Research Database (Denmark)

    Mørup, Morten; Hansen, Lars Kai; Herrmann, Christoph S.

    2006-01-01

    In the decomposition of multi-channel EEG signals, principal component analysis (PCA) and independent component analysis (ICA) have widely been used. However, as both methods are based on handling two-way data, i.e. two-dimensional matrices, multi-way methods might improve the interpretation of frequency transformed multi-channel EEG of channel x frequency x time data. The multi-way decomposition method Parallel Factor (PARAFAC), also named Canonical Decomposition (CANDECOMP), was recently used to decompose the wavelet transformed ongoing EEG of channel x frequency x time (Miwakeichi, F., Martinez-Montes, E., Valdes-Sosa, P.A., Nishiyama, N., Mizuhara, H., Yamaguchi, Y., 2004. Decomposing EEG data into space-time-frequency components using parallel factor analysis. Neuroimage 22, 1035-1045). In this article, PARAFAC is used for the first time to decompose wavelet transformed event-related EEG given...
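
A minimal PARAFAC/CANDECOMP via alternating least squares for a three-way array can be sketched in NumPy as follows. The dimensions and the synthetic rank-2 demo are invented for illustration; production toolboxes add normalization, convergence checks, and non-negativity or orthogonality constraints:

```python
import numpy as np

def parafac_als(X, rank, n_iter=200, seed=0):
    """Minimal PARAFAC via alternating least squares for a 3-way array X
    (e.g. channel x frequency x time). A sketch only: no normalization,
    line search, or convergence control."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X0 = X.reshape(I, J * K)                       # mode-1 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)    # mode-2 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)    # mode-3 unfolding
    # Khatri-Rao (column-wise Kronecker) product.
    kr = lambda U, V: (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])
    for _ in range(n_iter):
        A = X0 @ np.linalg.pinv(kr(B, C).T)
        B = X1 @ np.linalg.pinv(kr(A, C).T)
        C = X2 @ np.linalg.pinv(kr(A, B).T)
    return A, B, C

# Demo: recover a synthetic noiseless rank-2 tensor.
rng = np.random.default_rng(1)
At, Bt, Ct = (rng.standard_normal((4, 2)), rng.standard_normal((5, 2)),
              rng.standard_normal((6, 2)))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = parafac_als(X, rank=2)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(X - Xhat) / np.linalg.norm(X)
```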

  2. Genome-Wide Analysis of DNA Methylation in Human Amnion

    Directory of Open Access Journals (Sweden)

    Jinsil Kim

    2013-01-01

    The amnion is a specialized tissue in contact with the amniotic fluid, which is in a constantly changing state. To investigate the importance of epigenetic events in this tissue in the physiology and pathophysiology of pregnancy, we performed genome-wide DNA methylation profiling of human amnion from term (with and without labor) and preterm deliveries. Using the Illumina Infinium HumanMethylation27 BeadChip, we identified genes exhibiting differential methylation associated with normal labor and preterm birth. Functional analysis of the differentially methylated genes revealed biologically relevant enriched gene sets. Bisulfite sequencing analysis of the promoter region of the oxytocin receptor (OXTR) gene detected two CpG dinucleotides showing significant methylation differences among the three groups of samples. Hypermethylation of the CpG island of the solute carrier family 30 member 3 (SLC30A3) gene in preterm amnion was confirmed by methylation-specific PCR. This work provides preliminary evidence that DNA methylation changes in the amnion may be at least partially involved in the physiological process of labor and the etiology of preterm birth and suggests that DNA methylation profiles, in combination with other biological data, may provide valuable insight into the mechanisms underlying normal and pathological pregnancies.

  3. [Characteristics analysis of human tongue reflectance spectra].

    Science.gov (United States)

    Zhao, Jing; Liu, Ming; Lu, Xiao-zuo; Li, Gang

    2014-08-01

    This paper presents a spectroscopic analysis method. Eighty spectra of tongue areas with and without coating were collected with an Ocean Optics USB4000 spectrometer. Comparison of the spectra from the different parts of the tongue showed a relation between the spectral characteristics and the tongue coating; further analysis showed a large difference between the two parts within the wavelength range of 500 to 600 nm, with the largest difference appearing at 579.39 nm. Different colors of tongue coating were also compared, and the spectra differed markedly with the color and thickness of the coating. The experimental results show that different color, thickness, and dryness of the human tongue coating lead to different spectral characteristics, and, compared with the current colorimetric method of tongue characterization, spectral reflectance can reflect more physiological and pathological information. The results also indicate that the different spectral characteristics of the tongue body and tongue coating can be used to further separate these two parts and to provide an objective index for qualitative and quantitative analysis of tongue coating, thereby promoting the objectivity of traditional Chinese medicine (TCM) diagnosis.

  4. An event-related analysis of P300 by simultaneous EEG/fMRI

    Science.gov (United States)

    Wang, Li-qun; Wang, Mingshi; Mizuhara, Hiroaki

    2006-09-01

    In this study, P300 induced by visual stimuli was examined with simultaneous EEG/fMRI. Event-related analysis served this methodological trial of combining the best temporal resolution with the best spatial resolution to estimate brain function. A 64-channel MRI-compatible EEG amplifier (BrainAmp, Brain Products GmbH, Germany) was used during fMRI scanning. The reference channel was between Fz, Cz and Pz. The sampling rate of the raw EEG was 5 kHz, and MRI noise reduction was performed. EEG recording was synchronized with the MRI scan by our original stimulus system, and an oddball paradigm (four-oriented Landolt ring presentation) was performed in the standard manner. After P300 segmentation, the timing of P300 was exported to event-related analysis of the fMRI data with SPM99 software. In a single-subject study, significant activations appeared in the left superior frontal region, Broca's area, and both sides of the parietal lobule when P300 occurred. This suggests that P300 may reflect an integration carried out by top-down signals from frontal to parietal regions, regulating an attention and logical-judgment process. Compared with other current methods, event-related analysis by simultaneous EEG/fMRI excels in describing cognitive processes by unifying temporal and spatial information. Examination and demonstration of the obtained results should further promote this powerful method.

  5. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    Science.gov (United States)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors, not only on hazard, but also on exposure and vulnerability. In order to reach a better understanding of the relation of these complex factors, their spatial pattern and underlying processes, the spatial dependency between values of damage recorded at sites of different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009) first steps of point pattern analysis have been carried out. The most severe events have been selected (severe, very severe and catastrophic, according to GEES classification, a total number of 784 damage points) and Ripley's K-test and L-test have been performed, amongst others. For this purpose, R's library spatstat has been used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to prevalence of damages near watercourses and also to rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type: flood/debris flow vs landslide. This close relation points to a coupling between slope and fluvial processes, connectivity between small-size and middle-size catchments and the influence of spatial distribution of precipitation, temperature (snow melt and snow line) and other predisposing factors such as soil moisture, land-cover and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from watercourse or land use. The final goal will be to perform a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above mentioned covariates.
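
A naive version of Ripley's K, the statistic behind the K-test mentioned above, can be written in Python as follows. This sketch omits the edge corrections that R's spatstat applies, and both point patterns are synthetic:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley's K estimate (no edge correction):
    K(r) = area / (n*(n-1)) * number of ordered pairs within distance r."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    pairs = (d <= r).sum() - n        # drop the self-pairs on the diagonal
    return area * pairs / (n * (n - 1))

# Under complete spatial randomness (CSR), K(r) is roughly pi*r^2;
# clustered patterns give substantially larger values.
rng = np.random.default_rng(0)
csr = rng.uniform(0, 1, size=(500, 2))                   # CSR in unit square
clustered = 0.5 + 0.02 * rng.standard_normal((500, 2))   # one tight cluster
k_csr = ripley_k(csr, 0.1, area=1.0)
k_clu = ripley_k(clustered, 0.1, area=1.0)
```

In practice one would compare the empirical K (or its variance-stabilized form L) against simulation envelopes, as spatstat does.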

  6. Pinda: a web service for detection and analysis of intraspecies gene duplication events.

    Science.gov (United States)

    Kontopoulos, Dimitrios-Georgios; Glykos, Nicholas M

    2013-09-01

    We present Pinda, a Web service for the detection and analysis of possible duplications of a given protein or DNA sequence within a source species. Pinda fully automates the whole gene duplication detection procedure, from performing the initial similarity searches, to generating the multiple sequence alignments and the corresponding phylogenetic trees, to bootstrapping the trees and producing a Z-score-based list of duplication candidates for the input sequence. Pinda has been cross-validated using an extensive set of known and bibliographically characterized duplication events. The service facilitates the automatic and dependable identification of gene duplication events, using some of the most successful bioinformatics software to perform an extensive analysis protocol. Pinda will prove of use for the analysis of newly discovered genes and proteins, thus also assisting the study of recently sequenced genomes. The service's location is http://orion.mbg.duth.gr/Pinda. The source code is freely available via https://github.com/dgkontopoulos/Pinda/. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
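
The Z-score ranking step can be illustrated as follows. This is a toy sketch: Pinda's actual scores come from bootstrapped phylogenetic trees, and the input values and threshold here are invented:

```python
import numpy as np

def zscore_candidates(scores, threshold=2.0):
    """Rank similarity-search hits by Z-score and keep the outliers,
    loosely mirroring a Z-score-based candidate list."""
    s = np.asarray(scores, dtype=float)
    z = (s - s.mean()) / s.std()
    order = np.argsort(z)[::-1]          # highest Z-scores first
    return [(int(i), float(z[i])) for i in order if z[i] >= threshold]

# Nine background hits and one strong outlier (index 9).
cands = zscore_candidates([10] * 9 + [50])
```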

  7. Using internet search queries to predict human mobility in social events

    DEFF Research Database (Denmark)

    Borysov, Stanislav; Lourenco, Mariana; Rodrigues, Filipe

    2016-01-01

    While our transport systems are generally designed for habitual behavior, the dynamics of large and mega cities systematically push them to their limits. In particular, transport planning and operations for large events are well known to be a challenge. Not only do such events stress the system on an irregular basis, their associated mobility behavior is also difficult to predict. Previous studies have shown a strong correlation between the number of public transport arrivals and semi-structured data mined from online announcement websites. However, these models tend to be complex in form and demand... natural language form, we employ a supervised topic model to correlate it with real measurements of transport usage. In this way, the proposed approach is more generic, and a transit agency can start planning ahead as early as the event is announced on the web. The results show that using information mined...

  8. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-06-01

    The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treatment of side effects which may theoretically occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012, and for the first analysis we selected those Programs where the same medicinal products were used. We based the selection of adverse events on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper presents the results for the pulmonology area. The events described as very common had an incidence of ≥ 1/10, and the common ones ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated the direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treatment of side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost incurred by both the public payer and the patient.

  9. Impacts of extreme temperature events on mortality: analysis over individual seasons

    Science.gov (United States)

    Kysely, J.; Plavcova, E.; Kyncl, J.; Kriz, B.; Pokorna, L.

    2009-04-01

    Extreme temperature events influence human society in many ways, including impacts on morbidity and mortality. While the effects of hot summer periods are relatively direct in mid-latitudinal regions, much less is known and little consensus has been achieved about possible consequences of both positive and negative temperature extremes in other parts of the year. The study examines links between spells of hot and cold temperature anomalies and daily all-cause (total) mortality and mortality due to cardiovascular diseases in the population of the Czech Republic (central Europe) in individual seasons (DJF, MAM, JJA, SON). The datasets cover the period 1986-2006. Hot (cold) spells are defined in terms of anomalies of average daily temperature from the mean annual cycle as periods of at least 2 successive days on which the anomalies are above (below) the 95% (5%) quantile of the empirical distribution of the anomalies. Excess daily mortality is established by calculating deviations between the observed number of deaths and the expected number of deaths, which takes into account effects of long-term changes in mortality and the annual cycle. Periods when mortality is affected by influenza and acute respiratory infection outbreaks have been identified and excluded from the datasets before the analysis. The study is carried out for several population groups in order to identify the dependence of the mortality impacts on age and gender; in particular, we focus on differences in the impacts on the elderly (70+ yrs) and younger age groups (0-69 yrs). Although results for hot- and cold-related mortality are less conclusive in the seasons outside summer, significant links are found in several cases. The analysis reveals that: the largest effects of either hot or cold spells are observed for hot spells in JJA, with a 14% (16%) increase in mortality for the 1-day lag for all ages (70+ yrs); much smaller but still significant effects are associated with hot spells in MAM; the
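
The spell definition used above (at least two successive days beyond the 95% anomaly quantile) translates directly into a run-detection routine. The sketch below uses invented names and a synthetic anomaly series, not the study's data:

```python
import numpy as np

def find_spells(anomalies, q=0.95, min_len=2):
    """Identify hot spells as runs of >= min_len consecutive days with the
    temperature anomaly above the q-quantile. For cold spells, negate the
    input and use 1-q. Returns (start, end) index pairs, end inclusive."""
    a = np.asarray(anomalies, dtype=float)
    hot = a > np.quantile(a, q)
    spells, start = [], None
    for i, flag in enumerate(hot):
        if flag and start is None:
            start = i                          # run begins
        elif not flag and start is not None:
            if i - start >= min_len:
                spells.append((start, i - 1))  # run long enough to count
            start = None
    if start is not None and len(hot) - start >= min_len:
        spells.append((start, len(hot) - 1))   # run reaching the series end
    return spells

# Demo: a 3-day exceedance run qualifies; an isolated hot day does not.
a = np.zeros(100)
a[50:53] = 10.0
a[80] = 10.0
spells = find_spells(a)
```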

  10. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) was logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  11. Progression and Iteration in Event Semantics — An LTAG Analysis Using Hybrid Logic and Frame Semantics

    OpenAIRE

    Kallmeyer, Laura; Osswald, Rainer; Pogodalla, Sylvain

    2015-01-01

    Colloque de Syntaxe et Sémantique à Paris (CSSP 2015). In this paper, we propose to use Hybrid Logic (HL) as a means to combine frame-based lexical semantics with quantification. We integrate this into an LTAG syntax-semantics interface and show that this architecture allows a fine-grained description of event structures by quantifying for instance over subevents. As a case study we provide an analysis of iteration and progression in combination with for-adverbials. Wi...

  12. Hidden Markov analysis of trajectories in single-molecule experiments and the effects of missed events.

    Science.gov (United States)

    Stigler, Johannes; Rief, Matthias

    2012-03-01

    The ever more complex fluctuation patterns discovered by single molecule experiments require statistical methods to analyze multi-state hopping traces of long lengths. Hidden Markov modeling is a statistical tool that offers the scalability to analyze even complex data and extract kinetic information. We give an introduction on how to implement hidden Markov modeling for the analysis of single molecule force spectroscopic traces, deal with missed events, and test the method on a calcium binding protein. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
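
As an illustration of the decoding side of hidden Markov modeling, a minimal Viterbi pass over a two-state hopping trace might look like this. The parameters are invented; the article's workflow also covers parameter estimation and the correction for missed events, neither of which is shown here:

```python
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Most likely hidden state path for a discrete-emission HMM."""
    n_states = log_start.shape[0]
    T = len(obs)
    delta = np.empty((T, n_states))            # best log-prob ending in state
    psi = np.zeros((T, n_states), dtype=int)   # backpointers
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # (from_state, to_state)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):             # trace the backpointers
        path[t] = psi[t + 1, path[t + 1]]
    return path

# Two-state hopper: state 0 mostly emits symbol 0, state 1 mostly symbol 1.
log_start = np.log(np.array([0.5, 0.5]))
log_trans = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
log_emit = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
path = viterbi([0, 0, 0, 1, 1, 1], log_start, log_trans, log_emit)
```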

  13. Meteorological factors and timing of the initiating event of human parturition

    Science.gov (United States)

    Hirsch, Emmet; Lim, Courtney; Dobrez, Deborah; Adams, Marci G.; Noble, William

    2011-03-01

    The aim of this study was to determine whether meteorological factors are associated with the timing of either onset of labor with intact membranes or rupture of membranes prior to labor—together referred to as `the initiating event' of parturition. All patients delivering at Evanston Hospital after spontaneous labor or rupture of membranes at ≥20 weeks of gestation over a 6-month period were studied. Logistic regression models of the initiating event of parturition using clinical variables (maternal age, gestational age, parity, multiple gestation and intrauterine infection) with and without the addition of meteorological variables (barometric pressure, temperature and humidity) were compared. A total of 1,088 patients met the inclusion criteria. Gestational age, multiple gestation and chorioamnionitis were associated with timing of initiation of parturition ( P < 0.01). The addition of meteorological to clinical variables generated a statistically significant improvement in prediction of the initiating event; however, the magnitude of this improvement was small (less than 2% difference in receiver-operating characteristic score). These observations held regardless of parity, fetal number and gestational age. Meteorological factors are associated with the timing of parturition, but the magnitude of this association is small.

  14. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    Science.gov (United States)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  15. Review of Findings for Human Performance Contribution to Risk in Operating Events

    Science.gov (United States)

    2002-03-01

    EOP: emergency operating procedure; EPRI: Electric Power Research Institute; ERAT: emergency reserve auxiliary transformer; ESFAS: engineered safety features actuation system. ... replacement refueling outage. During the outage, obsolete engineered safety features actuation system (ESFAS) bistables were replaced to improve Human

  16. Mechanical Engineering Safety Note: Analysis and Control of Hazards Associated with NIF Capacitor Module Events

    Energy Technology Data Exchange (ETDEWEB)

    Brereton, S

    2001-08-01

    the total free oil available in a capacitor (approximately 10,900 g), on the order of 5% or less. The estimates of module pressure were used to estimate the potential overpressure in the capacitor bays after an event. It was shown that the expected capacitor bay overpressure would be less than the structural tolerance of the walls. Thus, it does not appear necessary to provide any pressure relief for the capacitor bays. The ray tracing analysis showed the new module concept to be 100% effective at containing fragments generated during the events. The analysis demonstrated that all fragments would impact an energy absorbing surface on the way out of the module. Thus, there is high confidence that energetic fragments will not escape the module. However, since the module was not tested, it was recommended that a form of secondary containment on the walls of the capacitor bays (e.g., 1.0 inch of fire-retardant plywood) be provided. Any doors to the exterior of the capacitor bays should be of equivalent thickness of steel or suitably armed with a thickness of plywood. Penetrations in the ceiling of the interior bays (leading to the mechanical equipment room) do not require additional protection to form a secondary barrier. The mezzanine and the air handling units (penetrations lead directly to the air handling units) provide a sufficient second layer of protection.

  17. Trimpi occurrence and geomagnetic activity: Analysis of events detected at Comandante Ferraz Brazilian Antarctic Station (L=2.25)

    OpenAIRE

    Fernandez, JH; Piazza, LR; Kaufmann, P

    2003-01-01

    We present an analysis of the occurrence of Trimpi events observed at Comandante Ferraz Brazilian Antarctic Station (EACF), at L = 2.25, as observed by the amplitude of very low frequency (VLF) signals transmitted from Hawaii (NPM 21.4 kHz) from April 1996 to August 1999. The event parameters (total duration, amplitude variation, time incidence, and type (negative or positive)) were analyzed for 4394 events detected in the first year (solar minimum and relatively low Trimpi activity). ...

  18. Post-transcriptional exon shuffling events in humans can be evolutionarily conserved and abundant

    OpenAIRE

    Al-Balool, Haya H.; Weber, David; Liu, Yilei; Wade, Mark; Guleria, Kamlesh; Nam, Pitsien Lang Ping; Clayton, Jake; Rowe, William; Coxhead, Jonathan; Irving, Julie; Elliott, David J.; Hall, Andrew G.; Santibanez-Koref, Mauro; Jackson, Michael S.

    2011-01-01

    In silico analyses have established that transcripts from some genes can be processed into RNAs with rearranged exon order relative to genomic structure (post-transcriptional exon shuffling, or PTES). Although known to contribute to transcriptome diversity in some species, to date the structure, distribution, abundance, and functional significance of human PTES transcripts remains largely unknown. Here, using high-throughput transcriptome sequencing, we identify 205 putative human PTES produc...

  19. Genome-Wide Transcriptome Analysis Reveals Extensive Alternative Splicing Events in the Protoscoleces of Echinococcus granulosus and Echinococcus multilocularis.

    Science.gov (United States)

    Liu, Shuai; Zhou, Xiaosu; Hao, Lili; Piao, Xianyu; Hou, Nan; Chen, Qijun

    2017-01-01

    Alternative splicing (AS), as one of the most important topics in the post-genomic era, has been extensively studied in numerous organisms. However, little is known about the prevalence and characteristics of AS in Echinococcus species, which can cause significant health problems to humans and domestic animals. Based on high-throughput RNA-sequencing data, we performed a genome-wide survey of AS in two major pathogens of echinococcosis-Echinococcus granulosus and Echinococcus multilocularis. Our study revealed that the prevalence and characteristics of AS in protoscoleces of the two parasites were generally consistent with each other. A total of 6,826 AS events from 3,774 E. granulosus genes and 6,644 AS events from 3,611 E. multilocularis genes were identified in protoscolex transcriptomes, indicating that 33-36% of genes were subject to AS in the two parasites. Strikingly, intron retention instead of exon skipping was the predominant type of AS in Echinococcus species. Moreover, analysis of the Kyoto Encyclopedia of Genes and Genomes pathway indicated that genes that underwent AS events were significantly enriched in multiple pathways mainly related to metabolism (e.g., purine, fatty acid, galactose, and glycerolipid metabolism), signal transduction (e.g., Jak-STAT, VEGF, Notch, and GnRH signaling pathways), and genetic information processing (e.g., RNA transport and mRNA surveillance pathways). The landscape of AS obtained in this study will not only facilitate future investigations on transcriptome complexity and AS regulation during the life cycle of Echinococcus species, but also provide an invaluable resource for future functional and evolutionary studies of AS in platyhelminth parasites.

  20. Analysis of extrinsic and intrinsic factors affecting event related desynchronization production.

    Science.gov (United States)

    Takata, Yohei; Kondo, Toshiyuki; Saeki, Midori; Izawa, Jun; Takeda, Kotaro; Otaka, Yohei; Ito, Koji

    2012-01-01

    Recently there has been an increase in the number of stroke patients with motor paralysis. Appropriate re-afferent sensory feedback synchronized with voluntary motor intention would be effective for promoting neural plasticity in stroke rehabilitation. Therefore, BCI technology is considered a promising approach in neuro-rehabilitation. To estimate human motor intention, event-related desynchronization (ERD), a feature of the electroencephalogram (EEG) evoked by motor execution or motor imagery, is usually used. However, various factors affect ERD production, and its neural mechanism is still an open question. As a preliminary stage, we evaluate the mutual effects of intrinsic (voluntary motor imagery) and extrinsic (visual and somatosensory stimuli) factors on ERD production. Experimental results indicate that these three factors do not always interact additively in their effect on ERD production.

  1. Analysis of the variation of the 0°C isothermal altitude during rainfall events

    Science.gov (United States)

    Zeimetz, Fränz; Garcìa, Javier; Schaefli, Bettina; Schleiss, Anton J.

    2016-04-01

    In numerous countries of the world (USA, Canada, Sweden, Switzerland,…), dam safety verifications for extreme floods refer to the so-called Probable Maximum Flood (PMF). According to the World Meteorological Organization (WMO), the PMF is determined from the PMP (Probable Maximum Precipitation); the PMF estimate is obtained by routing the PMP through a hydrological simulation model. PMP-PMF simulation is normally event based; therefore, if no further information is available, the simulation requires assumptions about initial conditions such as soil saturation and snow cover. Temperature series are also of interest for PMP-PMF simulations. Temperatures can be deduced not only from direct measurements but also, via the temperature gradient method, from the 0°C isothermal altitude, which yields temperature estimates at ground level. For practitioners, referring to temperature via the isothermal altitude is convenient and simple, because a single value characterizes a large region under the assumption of a given temperature gradient. The analysis of the evolution of the 0°C isothermal altitude during rainfall events presented here is based on meteorological soundings from the two sounding stations Payerne (CH) and Milan (I); in addition, hourly rainfall and temperature data are available from 110 pluviometers spread over the Swiss territory. The evolution of the 0°C isothermal altitude is analyzed for different precipitation durations based on the measurements mentioned above. The results show that, on average, the isothermal altitude tends to decrease during rainfall events and that the duration of the altitude loss is correlated with the duration of the rainfall. A significant difference in altitude loss appears when the soundings from Payerne and Milan are compared.
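
    The temperature gradient method described above can be sketched in a few lines: fit a linear profile T(z) to sounding data by least squares and solve for the altitude where T = 0°C. The sounding values below are synthetic, not Payerne or Milan data:

```python
# Sketch of the temperature-gradient method: fit T = a + b*z to sounding
# data by least squares and solve for the 0°C crossing altitude.
# Synthetic sounding values; a real input would be a radiosonde profile.

def fit_lapse_rate(altitudes_m, temps_c):
    """Least-squares fit of T = a + b*z; returns (a, b)."""
    n = len(altitudes_m)
    mean_z = sum(altitudes_m) / n
    mean_t = sum(temps_c) / n
    sxy = sum((z - mean_z) * (t - mean_t) for z, t in zip(altitudes_m, temps_c))
    sxx = sum((z - mean_z) ** 2 for z in altitudes_m)
    b = sxy / sxx              # lapse rate in °C per metre (negative aloft)
    a = mean_t - b * mean_z    # intercept: temperature at z = 0
    return a, b

def zero_isotherm_altitude(altitudes_m, temps_c):
    """Altitude (m) at which the fitted profile crosses 0°C."""
    a, b = fit_lapse_rate(altitudes_m, temps_c)
    return -a / b

# Synthetic profile: 15°C at sea level, -6.5°C/km lapse rate
levels = [0.0, 500.0, 1000.0, 1500.0, 2000.0, 3000.0]
temps = [15.0, 11.75, 8.5, 5.25, 2.0, -4.5]
z0 = zero_isotherm_altitude(levels, temps)   # ~2308 m
```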

  2. Wood anatomical analysis of Alnus incana and Betula pendula injured by a debris-flow event.

    Science.gov (United States)

    Arbellay, Estelle; Stoffel, Markus; Bollschweiler, Michelle

    2010-10-01

    Vessel chronologies in ring-porous species have been successfully employed in the past to extract the climate signal from tree rings. Environmental signals recorded in vessels of ring-porous species have also been used in previous studies to reconstruct discrete events of drought, flooding and insect defoliation. However, very little is known about the ability of diffuse-porous species to record environmental signals in their xylem cells. Moreover, time series of wood anatomical features have only rarely been used to reconstruct former geomorphic events. This study was therefore undertaken to characterize the wood anatomical response of diffuse-porous Alnus incana (L.) Moench and Betula pendula Roth to debris-flow-induced wounding. Tree microscopic response to wounding was assessed through the analysis of wood anatomical differences between injured rings formed in the debris-flow event year and uninjured rings formed in the previous year. The two ring types were examined close and opposite to the injury in order to determine whether wound effects on xylem cells decrease with increasing tangential distance from the injury. Image analysis was used to measure vessel parameters as well as fiber and parenchyma cell (FPC) parameters. The results of this study indicate that injured rings are characterized by smaller vessels as compared with uninjured rings. By contrast, FPC parameters were not found to significantly differ between injured and uninjured rings. Vessel and FPC parameters mainly remained constant with increasing tangential distance from the injury, except for a higher proportion of vessel lumen area opposite to the injury within A. incana. This study highlights the existence of anatomical tree-ring signatures-in the form of smaller vessels-related to past debris-flow activity and addresses a new methodological approach to date injuries inflicted on trees by geomorphic processes.

  3. Reconstructing the 1771 Great Yaeyama Tsunami Event by using Impact Intensity Analysis and Volume Flux Method

    Science.gov (United States)

    Wu, Han; Wu, Tso-Ren; Lee, Chun-Juei; Tsai, Yu-Lin; Li, Pei-Yu

    2017-04-01

    The 1771 Ishigaki earthquake (Japan) induced a large tsunami with a recorded runup height of 80 meters. Several reef boulders transported by the huge tsunami waves were found along the coast at elevations of about 30 meters. Considering the short distance between the Yaeyama Islands and Taiwan, this study aimed to understand the behavior of tsunami propagation and the potential hazard to Taiwan. Reconstructing the 1771 event and validating the result against the field survey is the first step. To analyze the hazard from potential tsunami sources around the event area, we adopted the Impact Intensity Analysis (IIA), which has been presented at EGU 2016 and many other international conferences. Going beyond the IIA method, we further developed a new method called the Volume Flux Method (VFM), which keeps the accuracy of the IIA while improving its efficiency significantly. The analyzed results showed that the source of the 1771 Great Yaeyama Tsunami was most likely located offshore south of Ishigaki Island. The wave height and inundation area matched the survey map (Geospatial Information Authority of Japan, 1994). The tsunami threat to Taiwan was also simulated. The results indicate that the tsunami height would not exceed 1 meter on the east coast of Taiwan if the source is located nearshore around Ishigaki Island. However, it is noteworthy that the northeast coast of Taiwan is under tsunami threat if the source is located farther south offshore, on the Ryukyu Trench. We will present the detailed results at EGU 2017.

  4. A regional frequency analysis of extreme rainfall events over Southern France

    Science.gov (United States)

    Najib, K.; Neppel, L.; Tramblay, Y.

    2010-09-01

    Reliable estimates of extreme rainfall events are required for several hydrological purposes. However, the reliability of statistical inference tools based on Extreme Value Theory is poor when applied to short time series; these well-established statistical procedures are relevant only when applied to relatively long data records. Therefore, regional estimation methods that "trade space for time" by including several at-site data records in the frequency analysis are efficient tools to improve the reliability of extreme quantile estimates. Regional frequency analysis methods also allow for the estimation of extreme rainfall quantiles at sites with no data. However, all regionalization procedures require an extra step: the construction of homogeneous regions. For this critical step, an original neighbourhood-type approach that provides a specific statistically homogeneous region for each site of interest is proposed in the present study. Both the Hosking and Wallis heterogeneity measure, based on L-moment ratios, and the non-parametric Anderson-Darling homogeneity test are applied therein. A pooling scheme is also proposed to avoid the effects of intersite correlation. This regionalization method, based on an index-value type procedure, is applied to extreme rainfall events in Southern France. The study uses 1219 daily rainfall stations belonging to the French weather service rain gauge network: 601 stations have more than 20 years of daily data and 222 stations more than 50 years (from 1950 to 2008). A calibration-validation procedure was performed to evaluate the descriptive and predictive accuracy and the robustness of this regionalization method. Finally, this study provides a comparison between local and regional estimation methods for mapping extreme rainfall events in Southern France.
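
    The Hosking and Wallis heterogeneity measure is built from sample L-moments computed at each site. A minimal sketch of the first two sample L-moments and the L-CV ratio they compare across sites (toy series, not the French rain gauge records):

```python
# First two sample L-moments and the L-CV ratio t = l2/l1, the building
# blocks of L-moment-based homogeneity testing. Toy series, not gauge data.

def sample_l_moments(data):
    """Unbiased sample estimates of the first two L-moments."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    # b1 = (1/n) * sum over ranks j=2..n of x_(j) * (j-1)/(n-1)
    b1 = sum(x[j] * j / (n - 1) for j in range(1, n)) / n
    l1 = b0              # L-location (the mean)
    l2 = 2 * b1 - b0     # L-scale
    return l1, l2

def l_cv(data):
    """L-coefficient of variation, t = l2 / l1."""
    l1, l2 = sample_l_moments(data)
    return l2 / l1

site_maxima = [34.0, 41.0, 28.0, 55.0, 47.0, 62.0, 39.0]  # annual maxima (mm)
t_ratio = l_cv(site_maxima)
```

    In the Hosking-Wallis procedure, the spread of these at-site t ratios around their regional average is compared with the spread expected in a homogeneous region simulated from a kappa distribution.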

  5. The analysis of competing events like cause-specific mortality--beware of the Kaplan-Meier method

    NARCIS (Netherlands)

    Verduijn, Marion; Grootendorst, Diana C.; Dekker, Friedo W.; Jager, Kitty J.; le Cessie, Saskia

    2011-01-01

    Kaplan-Meier analysis is a popular method for analysing time-to-event data. In the case of competing event analyses, such as that of cardiovascular and non-cardiovascular mortality, however, the Kaplan-Meier method profoundly overestimates the cumulative mortality probabilities for each of the competing events.
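
    The overestimation can be shown on toy data: the Kaplan-Meier complement (treating competing events as censoring) exceeds the cumulative incidence from the Aalen-Johansen estimator, the standard competing-risks alternative. A minimal sketch with hypothetical event times:

```python
# Toy demonstration: the Kaplan-Meier complement (treating competing
# events as censoring) overstates cause-specific risk relative to the
# Aalen-Johansen cumulative incidence function. Hypothetical data only.

def km_complement(times, events, cause):
    """1 - KM estimate at the last time, censoring all other causes."""
    at_risk = len(times)
    surv = 1.0
    for i in sorted(range(len(times)), key=lambda k: times[k]):
        if events[i] == cause:
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return 1.0 - surv

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of P(event of this cause by the last time)."""
    at_risk = len(times)
    overall_surv = 1.0   # KM survival from *any* event, used as a weight
    cif = 0.0
    for i in sorted(range(len(times)), key=lambda k: times[k]):
        if events[i] != 0:                      # 0 codes censoring
            if events[i] == cause:
                cif += overall_surv / at_risk
            overall_surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return cif

times = [1, 2, 3, 4, 5, 6]
events = [1, 2, 1, 2, 1, 2]    # cause 1 vs. competing cause 2, no censoring
naive = km_complement(times, events, cause=1)           # 0.6875
correct = cumulative_incidence(times, events, cause=1)  # 0.5
```

    With half the cohort experiencing each cause, the true cause-1 probability is 0.5; the naive Kaplan-Meier complement inflates it to 0.6875.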

  6. Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft

    Science.gov (United States)

    Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderson, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of human factors and timeline analysis to achieve a more efficient and effective processing flow. The approach involved developing a written timeline of events covering each activity within each functional flow block; each activity had computer-animation videos and pictures of the people and hardware involved. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements, and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., the short stack pallet) is shown and explained.

  7. Analysis of RPE morphometry in human eyes.

    Science.gov (United States)

    Bhatia, Shagun K; Rashid, Alia; Chrenek, Micah A; Zhang, Qing; Bruce, Beau B; Klein, Mitchel; Boatright, Jeffrey H; Jiang, Yi; Grossniklaus, Hans E; Nickerson, John M

    2016-01-01

    To describe the RPE morphometry of healthy human eyes with respect to age and topographic location, using modern computational methods with high accuracy and objectivity. We tested whether there were regional and age-related differences in RPE cell area and shape. Human cadaver donor eyes of varying ages were dissected, and the RPE flatmounts were immunostained for F-actin with AF635-phalloidin, their nuclei stained with propidium iodide, and imaged with confocal microscopy. Image analysis was performed using ImageJ (NIH) and CellProfiler software. Quantitative parameters, including cell density, cell area, polygonality of cells, number of neighboring cells, and measures of cell shape, were obtained from these analyses to characterize individual and groups of RPE cells. Measurements were taken from selected areas spanning the length of the temporal retina through the macula and the mid-periphery to the far periphery. Nineteen eyes from 14 Caucasian donors ranging in age from 29 to 80 years were used. Along a horizontal nasal-to-temporal meridian, there were differences in several cell shape and size characteristics. Generally, cell area and shape were relatively constant and regular except in the far periphery; in the outer third of the retina, cell area and shape differed statistically significantly from the inner two-thirds. In the macula and the far periphery, an overall decreasing trend in RPE cell density, percent hexagonal cells, and form factor was observed with increasing age. We also found a trend toward increasing cell area and eccentricity with age in the macula and the far periphery. When individuals were divided into two age groups, differences in RPE morphometry between the age groups were found in the mid-periphery. Human cadaver RPE cells differ mainly in area and shape in the outer one-third of the temporal retina compared to the inner two-thirds. RPE cells become less dense and larger, lose their typical hexagonal shape, and become more oval with increasing age.

  8. Leveraging the World Cup: Mega Sporting Events, Human Rights Risk, and Worker Welfare Reform in Qatar

    OpenAIRE

    Sarath Ganji

    2016-01-01

    Qatar will realize its decades-long drive to host a mega sporting event when, in 2022, the opening ceremony of the Fédération Internationale de Football Association (FIFA) World Cup commences. By that time, the Qatari government will have invested at least $200 billion in real estate and development projects, employing anywhere between 500,000 and 1.5 million foreign workers to do so. The scale of these preparations is staggering — and not necessarily positive. Between 2010 and 2013, more tha...

  9. Initiating oncogenic event determines gene-expression patterns of human breast cancer models

    OpenAIRE

    Desai, Kartiki V; Xiao, Nianqing; Wang, Weili; Gangi, Lisa; Greene, John; Powell, John I.; Dickson, Robert; Furth, Priscilla; Hunter, Kent; Kucherlapati, Raju; Simon, Richard; Liu, Edison T; Green, Jeffrey E

    2002-01-01

    Molecular expression profiling of tumors initiated by transgenic overexpression of c-myc, c-neu, c-ha-ras, polyoma middle T antigen (PyMT) or simian virus 40 T/t antigen (T-ag) targeted to the mouse mammary gland have identified both common and oncogene-specific events associated with tumor formation and progression. The tumors shared great similarities in their gene-expression profiles as compared with the normal mammary gland with an induction of cell-cycle regulators, metabolic regulators,...

  10. AN ANALYSIS OF RISK EVENTS IN THE OIL-TANKER MAINTENANCE BUSINESS

    Directory of Open Access Journals (Sweden)

    Roque Rabechini Junior

    2012-12-01

    Full Text Available This work presents the results of an investigation into risk events and their respective causes, carried out in ship-maintenance undertakings in the logistics sector of the Brazilian oil industry. Its theoretical and conceptual positioning concerns risk management of such undertakings as an instrument of support for executive decision making in the tanker-maintenance business. The case-study method was used, with a qualitative approach of an exploratory nature and a descriptive format for presenting the data. Through the analysis of 75 risk events in tanker-docking projects it was possible to identify the eight of greatest relevance. The risk analysis facilitated the identification of actions aimed at their mitigation. In conclusion, it was possible to propose a risk-framework model with four categories, HSE (health, safety and the environment), technical, externalities and management, designed to provide tanker-docking business executives and administrators with evidence-based actions to assist their decision-making processes. Finally, the authors identify proposals for further study and the principal limitations of this one.

  11. Multivariate Regression Analysis of Winter Ozone Events in the Uinta Basin of Eastern Utah, USA

    Science.gov (United States)

    Mansfield, M. L.

    2012-12-01

    I report on a regression analysis of a number of variables that are involved in the formation of winter ozone in the Uinta Basin of Eastern Utah. One goal of the analysis is to develop a mathematical model capable of predicting the daily maximum ozone concentration from values of a number of independent variables. The dependent variable is the daily maximum ozone concentration at a particular site in the basin. Independent variables are (1) daily lapse rate, (2) daily "basin temperature" (defined below), (3) snow cover, (4) midday solar zenith angle, (5) monthly oil production, (6) monthly gas production, and (7) the number of days since the beginning of a multi-day inversion event. Daily maximum temperature and daily snow cover data are available at ten or fifteen different sites throughout the basin. The daily lapse rate is defined operationally as the slope of the linear least-squares fit to the temperature-altitude plot, and the "basin temperature" is defined as the value assumed by the same least-squares line at an altitude of 1400 m. A multi-day inversion event is defined as a set of consecutive days for which the lapse rate remains positive. The standard deviation in the accuracy of the model is about 10 ppb. The model has been combined with historical climate and oil & gas production data to estimate historical ozone levels.
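
    A regression model of this general shape can be sketched with ordinary least squares. The variables below mimic three of the predictors named above, but the data and coefficients are synthetic stand-ins, not the Uinta Basin measurements:

```python
# Synthetic stand-in for a multivariate regression of daily max ozone on
# inversion-related predictors. All numbers are invented; only the
# residual scale (~10 ppb) mirrors the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 200
lapse_rate = rng.normal(0.002, 0.003, n)               # K/m; positive = inversion
basin_temp = rng.normal(-5.0, 4.0, n)                  # °C at the 1400 m reference
inversion_days = rng.integers(0, 10, n).astype(float)  # days into inversion event

ozone = (45.0 + 8000.0 * lapse_rate - 1.2 * basin_temp
         + 3.0 * inversion_days + rng.normal(0.0, 10.0, n))

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), lapse_rate, basin_temp, inversion_days])
beta, *_ = np.linalg.lstsq(X, ozone, rcond=None)
residual_sd = float(np.sqrt(np.mean((ozone - X @ beta) ** 2)))
```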

  12. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in the literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  13. Analysis of Loss-of-Offsite-Power Events 1998–2013

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk Assessment and Management Services Dept.

    2015-02-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. Risk analyses suggest that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are therefore important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants, based on operating experience during calendar years 1997 through 2013. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator failure modes considered are failure to start, failure to load and run, and failure to run for more than 1 hour. Component reliability estimates and reliability data are trended for the most recent 10-year period, while yearly reliability estimates are provided for the entire active period. No statistically significant trends in LOOP frequencies over the 1997–2013 period were identified, although a significant trend in grid-related LOOP frequency may exist that is not easily detected by a simple analysis. Statistically significant increases in recovery times after grid- and switchyard-related LOOPs were identified.

  14. Considering historical flood events in flood frequency analysis: Is it worth the effort?

    Science.gov (United States)

    Schendel, Thomas; Thongwichian, Rossukon

    2017-07-01

    Information about historical floods can be useful in reducing uncertainties in flood frequency estimation. Since the start of the historical record is often defined by the first known flood, the length of the true historical period M remains unknown. We have expanded a previously published method of estimating M to the case of several known floods within the historical period. We performed a systematic evaluation of the usefulness of including historical flood events into flood frequency analysis for a wide range of return periods and studied bias as well as relative root mean square error (RRMSE). Since we used the generalized extreme value distribution (GEV) as parent distribution, we were able to investigate the impact of varying the skewness on RRMSE. We confirmed the usefulness of historical flood data regarding the reduction of RRMSE, however we found that this reduction is less pronounced the more positively skewed the parent distribution was. Including historical flood information had an ambiguous effect on bias: depending on length and number of known floods of the historical period, bias was reduced for large return periods, but increased for smaller ones. Finally, we customized the test inversion bootstrap for estimating confidence intervals to the case that historical flood events are taken into account into flood frequency analysis.
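
    For context, the GEV return level (flood quantile) used in such analyses has a closed form; the sketch below also shows how a positive shape parameter, i.e. a more positively skewed parent distribution, stretches the upper tail. Parameter values are illustrative only:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Quantile with return period T for GEV(mu, sigma, xi).
    xi > 0 gives a heavier (more positively skewed) upper tail;
    xi = 0 is the Gumbel limit."""
    y = -math.log(1.0 - 1.0 / T)        # reduced variate for probability 1/T
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# 100-year return levels: the positively skewed case sits far higher
gumbel_100 = gev_return_level(0.0, 1.0, 0.0, 100.0)   # ~4.60
skewed_100 = gev_return_level(0.0, 1.0, 0.2, 100.0)   # ~7.55
```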

  15. Spousal communication and contraceptive use in rural Nepal: an event history analysis.

    Science.gov (United States)

    Link, Cynthia F

    2011-06-01

    This study analyzes longitudinal data from couples in rural Nepal to investigate the influence of spousal communication about family planning on their subsequent contraceptive use. The study expands current understanding of the communication-contraception link by (a) exploiting monthly panel data to conduct an event history analysis, (b) incorporating both wives' and husbands' perceptions of communication, and (c) distinguishing effects of spousal communication on the use of four contraceptive methods. The findings provide new evidence of a strong positive impact of spousal communication on contraceptive use, even when controlling for confounding variables. Wives' reports of communication are substantial explanatory factors in couples' initiation of all contraceptive methods examined. Husbands' reports of communication predict couples' subsequent use of male-controlled methods. This analysis advances our understanding of how marital dynamics, as well as husbands' perceptions of these dynamics, influence fertility behavior, and should encourage policies to promote greater integration of men into family planning programs.
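
    Event history analysis on monthly panel data typically works with discrete-time hazards: the probability of initiating contraception in month m among couples still at risk at the start of m. A minimal life-table sketch with hypothetical couple records (months observed, whether initiation occurred in the final month):

```python
# Discrete-time (life-table) hazard sketch for monthly panel data:
# hazard(m) = initiations in month m / couples still at risk entering m.
# Records are hypothetical: (months observed, initiated in final month?).

records = [(3, True), (5, False), (2, True), (5, False), (4, True), (1, False)]

def monthly_hazard(records, month):
    at_risk = sum(1 for dur, _ in records if dur >= month)
    events = sum(1 for dur, ev in records if ev and dur == month)
    return events / at_risk if at_risk else 0.0

hazards = [monthly_hazard(records, m) for m in range(1, 6)]
```

    In a full analysis each hazard would be modeled (e.g., with a logistic regression) as a function of time-varying covariates such as reported spousal communication; the cumulative initiation probability is one minus the product of (1 - hazard) over months.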

  16. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    Science.gov (United States)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re extreme events database and the Uppsala Conflict Data Program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may help to identify hot-spot regions for potential climate-triggered violent social conflicts.
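
    A basic event coincidence analysis of the kind described can be sketched as follows: count the extremes followed by a conflict onset within a window ΔT, then compare the count against a Poisson null model. The event times below are hypothetical, not from the Munich Re or Uppsala databases:

```python
import math

def coincidence_rate(extremes, conflicts, delta_t):
    """Fraction of climate-extreme times followed by at least one
    conflict onset within delta_t (a precursor coincidence rate)."""
    hits = sum(1 for t in extremes
               if any(t <= c <= t + delta_t for c in conflicts))
    return hits / len(extremes)

def poisson_null_pvalue(extremes, conflicts, delta_t, period):
    """P(at least the observed number of coincidences) if conflict
    onsets formed a homogeneous Poisson process over `period`."""
    lam = len(conflicts) / period
    p_hit = 1.0 - math.exp(-lam * delta_t)   # P(>=1 onset in one window)
    n = len(extremes)
    k = round(coincidence_rate(extremes, conflicts, delta_t) * n)
    return sum(math.comb(n, i) * p_hit**i * (1.0 - p_hit)**(n - i)
               for i in range(k, n + 1))

extremes = [0.0, 10.0, 20.0]     # hypothetical extreme-event times (years)
conflicts = [1.0, 25.0]          # hypothetical conflict-onset times
rate = coincidence_rate(extremes, conflicts, delta_t=2.0)
pval = poisson_null_pvalue(extremes, conflicts, delta_t=2.0, period=30.0)
```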

  17. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman; Katya Le Blanc

    2011-09-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  18. Event rates, hospital utilization, and costs associated with major complications of diabetes: a multicountry comparative analysis.

    Directory of Open Access Journals (Sweden)

    Philip M Clarke

    2010-02-01

    Full Text Available Diabetes imposes a substantial burden globally in terms of premature mortality, morbidity, and health care costs. Estimates of economic outcomes associated with diabetes are essential inputs to policy analyses aimed at prevention and treatment of diabetes. Our objective was to estimate and compare event rates, hospital utilization, and costs associated with major diabetes-related complications in high-, middle-, and low-income countries. Incidence and history of diabetes-related complications, hospital admissions, and length of stay were recorded in 11,140 patients with type 2 diabetes participating in the Action in Diabetes and Vascular Disease (ADVANCE) study (mean age at entry 66 y). The probability of hospital utilization and number of days in hospital for major events associated with coronary disease, cerebrovascular disease, congestive heart failure, peripheral vascular disease, and nephropathy were estimated for three regions (Asia, Eastern Europe, and Established Market Economies) using multiple regression analysis. The resulting estimates of days spent in hospital were multiplied by regional estimates of the costs per hospital bed-day from the World Health Organization to compute annual acute and long-term costs associated with the different types of complications. To assist comparability, costs are reported in international dollars (Int$), a hypothetical currency that allows the same quantities of goods or services to be purchased regardless of country, standardized on purchasing power in the United States. A cost calculator accompanying this paper enables the estimation of costs for individual countries and translation of these costs into local currency units. The probability of attending a hospital following an event was highest for heart failure (93%-96% across regions) and lowest for nephropathy (15%-26%). The average numbers of days in hospital given at least one admission were greatest for stroke (17-32 d across

  19. An overview of the evolution of human reliability analysis in the context of probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Bley, Dennis C. (Buttonwood Consulting Inc., Oakton, VA); Lois, Erasmia (U.S. Nuclear Regulatory Commission, Washington, DC); Kolaczkowski, Alan M. (Science Applications International Corporation, Eugene, OR); Forester, John Alan; Wreathall, John (John Wreathall and Co., Dublin, OH); Cooper, Susan E. (U.S. Nuclear Regulatory Commission, Washington, DC)

    2009-01-01

    Since the Reactor Safety Study in the early 1970s, human reliability analysis (HRA) has been evolving towards a better ability to account for the factors and conditions that can lead humans to take unsafe actions, and thereby provide better estimates of the likelihood of human error for probabilistic risk assessments (PRAs). The purpose of this paper is to provide an overview of recent reviews of operational events and advances in the behavioral sciences that have impacted the evolution of HRA methods and contributed to improvements. The paper discusses the importance of human errors in complex human-technical systems, examines why humans contribute to accidents and unsafe conditions, and discusses how lessons learned over the years have changed the perspective and approach for modeling human behavior in PRAs of complicated domains such as nuclear power plants. It is argued that it has become increasingly more important to understand and model the more cognitive aspects of human performance and to address the broader range of factors that have been shown to influence human performance in complex domains. The paper concludes by addressing the current ability of HRA to adequately predict human failure events and their likelihood.

  20. Donating to disaster victims: responses to natural and humanly caused events

    OpenAIRE

    Zagefka, Hanna; Noor, Masi; Brown, Rupert; Randsley de Moura, Georgina; Hopthrow, Tim

    2011-01-01

    The effect of the cause of a disaster, i.e. whether it was perceived to be caused by human or natural factors, on willingness to donate money to disaster victims was examined. In Study 1 (N=76), the cause of a fictitious disaster was experimentally varied. In Study 2 (N=219), participants were asked about their views regarding donations to two real-life disasters, one of which was perceived to be naturally caused while the other one was perceived to be caused by humans. In Study 3 (N=115), th...

  1. Modeling propensity to move after job change using event history analysis and temporal GIS

    Science.gov (United States)

    Vandersmissen, Marie-Hélène; Séguin, Anne-Marie; Thériault, Marius; Claramunt, Christophe

    2009-03-01

    The research presented in this paper analyzes the emergent residential behaviors of individual actors in a context of profound social changes in the work sphere. It incorporates a long-term view in the analysis of the relationships between social changes in the work sphere and these behaviors. The general hypothesis is that social changes produce complex changes in the long-term dynamics of residential location behavior. More precisely, the objective of this paper is to estimate the propensity of professional workers to move house after a change of workplace. Our analysis draws on data from a biographical survey using a retrospective questionnaire that enables a posteriori reconstitution of the familial, professional and residential lifelines of professional workers since their departure from their parents’ home. The survey was conducted in 1996 in the Quebec City Metropolitan Area, which, much like other Canadian cities, has experienced a substantial increase in “unstable” work, even for professionals. The approach is based on event history analysis, a Temporal Geographic Information System and exploratory spatial analysis of the model’s residuals. Results indicate that 48.9% of respondents moved after a job change and that the most important factors influencing the propensity to move house after a job change are home tenure (for lone adults as well as for couples) and number of children (for couples only). We also found that moving is associated with changing neighborhood for owners, while tenants or co-tenants tend to stay in the same neighborhood. The probability of moving 1 year after a job change is 0.10 for lone adults and couples, while after 2 years the household structure seems to have an impact: the probability increases to 0.23 for lone adults and to 0.21 for couples. The outcome of this research contributes to furthering our understanding of a familial decision (to move) following a professional event (change of job), controlling for household structure

  2. How Event Managers Lead: applying competency school theory to event management

    OpenAIRE

    Abson, Emma

    2017-01-01

    A lack of research into the human resource development, managerial skillsets, and leadership practices of event managers has resulted in widespread assumptions about the nature of leadership within events that are unsupported by primary research. This qualitative research, based on semi-structured interviews, focused on event managers working within the business events industry. Data analysis using thematic analysis and a ranking list establishes six key leadership practices—engaging communication,...

  3. Vertically integrated analysis of human DNA. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Olson, M.

    1997-10-01

    This project has been oriented toward improving the vertical integration of the sequential steps associated with the large-scale analysis of human DNA. The central focus has been on an approach to the preparation of “sequence-ready” maps, referred to as multiple-complete-digest (MCD) mapping, primarily directed at cosmid clones. MCD mapping relies on simple experimental steps, supported by advanced image-analysis and map-assembly software, to produce extremely accurate restriction-site and clone-overlap maps. We believe that MCD mapping is one of the few high-resolution mapping systems with the potential for high-level automation. Successful automation of this process would be a landmark event in genome analysis, and the approach could then be extended to other higher organisms, paving the way for cost-effective sequencing of these genomes. Critically, MCD mapping has the potential to provide built-in quality control for sequencing accuracy and to make possible a highly integrated end product even if there are large numbers of discontinuities in the actual sequence.

  4. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.
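    The maximum-likelihood step of such an analysis can be sketched for the simplest case: estimating a single isotope's half-life from event-mode decay times. Everything below is an assumption for illustration (the 30 s half-life, the sample size, and a bounded scalar search standing in for the paper's metaheuristic optimizer).

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)

# Simulated event-mode data: decay times (s) for a single isotope with an
# assumed true half-life of 30 s (hypothetical, not the paper's isotopes).
true_half_life = 30.0
times = rng.exponential(true_half_life / np.log(2), size=5000)

def neg_log_likelihood(half_life):
    """Negative log-likelihood of exponential decay times."""
    lam = np.log(2) / half_life          # decay constant
    return -np.sum(np.log(lam) - lam * times)

# A bounded scalar search stands in for the metaheuristic optimizer.
fit = minimize_scalar(neg_log_likelihood, bounds=(1.0, 300.0), method="bounded")
est_half_life = fit.x
```

    For this one-parameter model the MLE has the closed form ln(2) times the mean decay time, which makes the numerical fit easy to check; the metaheuristic layer in the paper matters once several overlapping decay components are fitted at once.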

  5. A search engine for retrieval and inspection of events with 48 human actions in realistic videos

    NARCIS (Netherlands)

    Burghouts, G.J.; Penning, H.L.H. de; Hove, R.J.M. ten; Landsmeer, S.; Broek, S.P. van den; Hollander, R.J.M.; Hanckmann, P.; Kruithof, M.C.; Leeuwen, C.J. van; Korzec, S.; Bouma, H.; Schutte, K.

    2013-01-01

    The contribution of this paper is a search engine that recognizes and describes 48 human actions in realistic videos. The core algorithms have been published recently, from the early visual processing (Bouma, 2012), discriminative recognition (Burghouts, 2012) and textual description (Hankmann,

  6. Two monoclonal anti-CD3 antibodies can induce different events in human T lymphocyte activation

    NARCIS (Netherlands)

    Roosnek, E. E.; van Lier, R. A.; Aarden, L. A.

    1987-01-01

    Two monoclonal antibodies, WT32 and CLB-T3/4.2a, directed against the CD3 complex were used to study the mechanism of activation of human peripheral T lymphocytes. WT32, a mouse monoclonal IgG2a antibody with a low avidity (much less than OKT3) for the CD3 complex, effectively induces mitogenesis of

  7. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  8. HMPAS: Human Membrane Protein Analysis System.

    Science.gov (United States)

    Kim, Min-Sung; Yi, Gwan-Su

    2013-11-07

    Membrane proteins perform essential roles in diverse cellular functions and are regarded as major pharmaceutical targets. The significance of membrane proteins has led to the development of dozens of resources related to membrane proteins. However, most of these resources are built for specific well-known membrane protein groups, making it difficult to find common and specific features of various membrane protein groups. We collected human membrane proteins from the dispersed resources and predicted novel membrane protein candidates by using ortholog information and our membrane protein classifiers. The membrane proteins were classified according to the type of interaction with the membrane, subcellular localization, and molecular function. We also built a new feature dataset to characterize the membrane proteins in various aspects, including membrane protein topology, domain, biological process, disease, and drug. Moreover, protein structure and ICD-10-CM-based integrated disease and drug information were newly included. To analyze the comprehensive information of membrane proteins, we implemented analysis tools to identify novel sequence and functional features of the classified membrane protein groups and to extract features from protein sequences. We constructed HMPAS with 28,509 collected known membrane proteins and 8,076 newly predicted candidates. This system provides integrated information on human membrane proteins individually and in groups organized by 45 subcellular locations and 1,401 molecular functions. As a case study, we identified associations between the membrane proteins and diseases and show that membrane proteins are promising targets for diseases related to the nervous and circulatory systems. A web-based interface for this system was constructed to enable researchers not only to retrieve organized information on individual proteins but also to use the tools to analyze the membrane proteins.
HMPAS provides comprehensive information about

  9. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    Energy Technology Data Exchange (ETDEWEB)

    Barriere, M.T.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., Reston, VA (United States); Bley, D.C. [PLG, Inc., Newport Beach, CA (United States); Ramey-Smith, A. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February 1992 at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts, including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development are also discussed.

  10. Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies

    DEFF Research Database (Denmark)

    Thompson, Simon; Kaptoge, Stephen; White, Ian

    2010-01-01

    Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure-risk relationships, but involves a number of analytical challenges....

  11. BOLIVAR-tool for analysis and simulation of metocean extreme events

    Science.gov (United States)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by the combination of multivariate and multiscale processes which depend on each other at different scales (due to short-term, synoptic, annual, and year-to-year variability). There is no simple method for their estimation with controllable tolerance. Thus, in practice, extreme analysis is sometimes reduced to the exploration of various methods and models with respect to decreasing the uncertainty of estimates. A researcher therefore needs multifaceted computational tools that cover the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who explore extreme environmental conditions to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods for extreme analysis, and a set of modules for the stochastic and hydrodynamic simulation of metocean processes. In this sense BOLIVAR is a Problem Solving Environment (PSE). BOLIVAR is designed for extreme event analysis and contains computational modules for the IDM, AMS, POT, MENU, and SINTEF methods, and a set of modules for stochastic simulation of metocean processes at various scales. BOLIVAR is a tool to simplify the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. There are field ARMA models for short-term variability, a spatial-temporal random pulse model for synoptic variability (the alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above-mentioned modules and data sources allows estimation of: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and evaluation of their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for
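    Of the methods listed, peaks-over-threshold (POT) is the most compact to sketch: fit a generalized Pareto distribution to threshold exceedances and extrapolate a return level. The wave-height data, the 95% threshold, and the `events_per_year` figure below are all assumptions for illustration, not BOLIVAR internals.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Hypothetical significant-wave-height observations (m); not BOLIVAR data.
hs = rng.gumbel(loc=2.0, scale=0.6, size=20000)

# Peaks-over-threshold: fit a generalized Pareto to threshold exceedances.
threshold = np.quantile(hs, 0.95)
excess = hs[hs > threshold] - threshold
shape, loc, scale = genpareto.fit(excess, floc=0.0)

# T-year return level, assuming `events_per_year` observations per year.
events_per_year = 1000

def return_level(T):
    p_exceed = len(excess) / len(hs)      # P(X > threshold)
    m = T * events_per_year               # observations in T years
    return threshold + genpareto.ppf(1 - 1.0 / (m * p_exceed),
                                     shape, loc=0.0, scale=scale)

rl100 = return_level(100)                 # 100-"year" design level
```

    The threshold choice is the delicate step in a real POT analysis (bias-variance trade-off); a toolbox like BOLIVAR exists precisely because such choices need to be explored systematically.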

  12. Political regime and human capital: A cross-country analysis

    NARCIS (Netherlands)

    Klomp, J.G.; Haan, de J.

    2013-01-01

    We examine the relationship between different dimensions of the political regime in place and human capital using a two-step structural equation model. In the first step, we employ factor analysis on 16 human capital indicators to construct two new human capital measures (basic and advanced human

  13. Political regime and human capital : A cross-country analysis

    NARCIS (Netherlands)

    Klomp, J.G.; de Haan, J.

    We examine the relationship between different dimensions of the political regime in place and human capital using a two-step structural equation model. In the first step, we employ factor analysis on 16 human capital indicators to construct two new human capital measures (basic and advanced human

  14. Political Regime and Human Capital: A Cross-Country Analysis

    Science.gov (United States)

    Klomp, Jeroen; de Haan, Jakob

    2013-01-01

    We examine the relationship between different dimensions of the political regime in place and human capital using a two-step structural equation model. In the first step, we employ factor analysis on 16 human capital indicators to construct two new human capital measures (basic and advanced human capital). In the second step, we estimate the…

  15. A Study to Model Human Behavior in Discrete Event Simulation (DES) using Simkit

    Science.gov (United States)

    2007-12-01

    predictor Agent Personality Types: Extrovert, Introvert and Neutral. The result from the El Farol DEMAS is expected to differ from the original El... it was also realized that the analysis process taxed available work hours with tedious data comparison. The formulae involved in the analysis

  16. Human error and the problem of causality in analysis of accidents

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

    Present technology is characterized by complexity, rapid change and growing size of technical systems. This has caused increasing concern with the human involvement in system safety. Analyses of the major accidents during recent decades have concluded that human errors on the part of operators, designers or managers have played a major role. There are, however, several basic problems in analysis of accidents and identification of human error. This paper addresses the nature of causal explanations and the ambiguity of the rules applied for identification of the events to include in analysis and for termination of the search for ‘causes’. In addition, the concept of human error is analysed and its intimate relation with human adaptation and learning is discussed. It is concluded that identification of errors as a separate class of behaviour is becoming increasingly difficult in modern work environments...

  17. Collaborative human-machine analysis using a controlled natural language

    Science.gov (United States)

    Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave

    2015-05-01

    A key aspect of an analyst's task in providing relevant information from data is reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English used to represent an analyst's domain knowledge and reasoning in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".

  18. Changes of Extreme Climate Events in Latvia

    OpenAIRE

    Avotniece, Z; Klavins, M; Rodinovs, V

    2012-01-01

    Extreme climate events are increasingly recognized as a threat to human health, agriculture, forestry and other sectors. To assess the occurrence and impacts of extreme climate events, we have investigated the changes of indexes characterizing positive and negative temperature extremes and extreme precipitation as well as the spatial heterogeneity of extreme climate events in Latvia. Trend analysis of long–term changes in the frequency of extreme climate events demonst...

  19. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  20. The Identification of Seismo and Volcanomagnetic Events Using Non-stationary Analysis of Geomagnetic Field Variations.

    Science.gov (United States)

    Fedi, M.; Gonçalves, P.; Johnston, M.; La Manna, M.

    Many studies have shown a clear correlation between volcanic and/or seismic activity and time variations of local geomagnetic fields, called seismomagnetic (SM) and/or volcanomagnetic (VM) effects. SM and VM effects can be generated by various physical processes, such as piezomagnetism, tectonomagnetism and electrokinetism. Relevant parameters are the event duration, the event magnitude and the magnetometer sample rate. Here, we present some results obtained from a non-stationary analysis of geomagnetic time series that focuses on automatic detection of possible SM and VM events. Several approaches are considered. The first one, based on the continuous wavelet transform, provides us with a multiresolution reading of the signal, expanded in time-scale space. The second uses a time-variant adaptive algorithm (RLS) that allows the detection of time intervals where important statistical variations of the signal occur. Finally, we investigate a third technique relying on multifractal analysis, which allows estimation of the local regularity of a time series path in order to detect unusual singularities. Different multifractal models, such as multifractional Brownian motions (mBm), were used for testing the methodology before applying it to synthetic simulations of geomagnetic signals. In our simulations, we took into account theoretical SM and/or VM effects deriving from fault rupture and overpressured magma chambers. We applied these methodologies to two different real-world data sets, recorded on Mt Etna (volcanic area) during the volcanic activity that occurred in 1981, and in North Palm Springs (seismic area) during the earthquake of July 8th, 1986, respectively. In both cases, all techniques were effective in automatically identifying the geomagnetic time variations likely induced by volcanic and/or seismic activity, and the results are in good agreement with the indices provided by real volcanic and seismic measurements.
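    The detection task can be approximated, far more crudely than with the wavelet, RLS, or multifractal machinery above, by flagging samples that deviate strongly from a trailing baseline window. The series, the injected pulse, and the detection threshold below are synthetic assumptions, meant only to show the shape of an automatic anomaly detector.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic geomagnetic-like series: smooth background plus noise, with a
# short SM/VM-style pulse injected at sample 600 (illustrative only).
n = 1000
t = np.arange(n)
signal = 10.0 * np.sin(2 * np.pi * t / 500) + rng.normal(0, 0.3, n)
signal[600:610] += 5.0                       # injected anomaly

def rolling_zscore(x, window=50):
    """Z-score of each sample against the preceding window of samples."""
    z = np.zeros_like(x)
    for i in range(window, len(x)):
        w = x[i - window:i]
        z[i] = (x[i] - w.mean()) / w.std()
    return z

z = rolling_zscore(signal)
detections = np.where(np.abs(z) > 4.0)[0]    # threshold is an assumption
```

    A trailing-window z-score tolerates the slow background variation (it inflates the local standard deviation along steep trends) while still flagging the abrupt pulse; the wavelet and multifractal methods in the paper do this across many scales at once.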

  1. Sequential Events in the Irreversible Thermal Denaturation of Human Brain-Type Creatine Kinase by Spectroscopic Methods

    Directory of Open Access Journals (Sweden)

    Yan-Song Gao

    2010-06-01

    The non-cooperative or sequential events which occur during protein thermal denaturation are closely correlated with protein folding, stability, and physiological functions. In this research, the sequential events of human brain-type creatine kinase (hBBCK) thermal denaturation were studied by differential scanning calorimetry (DSC), CD, and intrinsic fluorescence spectroscopy. DSC experiments revealed that the thermal denaturation of hBBCK was calorimetrically irreversible. The existence of several endothermic peaks suggested that the denaturation involved stepwise conformational changes, which were further verified by the discrepancy in the transition curves obtained from various spectroscopic probes. During heating, the disruption of the active site structure occurred prior to the secondary and tertiary structural changes. The thermal unfolding and aggregation of hBBCK was found to occur through sequential events. This is quite different from that of muscle-type CK (MMCK). The results herein suggest that BBCK and MMCK undergo quite dissimilar thermal unfolding pathways, although they are highly conserved in the primary and tertiary structures. A minor difference in structure might endow the isoenzymes with dissimilar local stabilities in structure, which further contribute to isoenzyme-specific thermal stabilities.

  2. Frequency of Extreme Heat Event as a Surrogate Exposure Metric for Examining the Human Health Effects of Climate Change.

    Directory of Open Access Journals (Sweden)

    Crystal Romeo Upperman

    Epidemiological investigation of the impact of climate change on human health, particularly chronic diseases, is hindered by the lack of exposure metrics that can be used as a marker of climate change and that are compatible with health data. Here, we present a surrogate exposure metric created using a 30-year baseline (1960-1989) that allows users to quantify long-term changes in exposure to the frequency of extreme heat events with near unabridged spatial coverage, at a scale that is compatible with national/state health outcome data. We evaluate the exposure metric by decade, seasonality, area of the country, and its ability to capture long-term changes in weather (climate), including natural climate modes. Our findings show that this generic exposure metric is potentially useful for monitoring trends in the frequency of extreme heat events across varying regions because it captures long-term changes; is sensitive to natural climate modes (ENSO events); responds well to spatial variability; and is amenable to spatial/temporal aggregation, making it useful for epidemiological studies.
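    A baseline-referenced exceedance metric of this kind reduces to counting days above a percentile threshold of the baseline period. The sketch below uses invented temperatures and an assumed 95th-percentile definition of an extreme-heat day; the paper's metric is constructed differently in detail.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily summer maximum temperatures (deg C) for one location:
# a 1960-1989 baseline and a warmer later decade (illustrative values only).
baseline = rng.normal(30.0, 3.0, size=30 * 92)   # 30 summers x 92 days
decade = rng.normal(31.5, 3.0, size=10 * 92)     # 10 later summers

# Extreme-heat threshold: 95th percentile of the baseline distribution
# (the percentile choice is an assumption for this sketch).
threshold = np.percentile(baseline, 95)

# Exposure metric: mean number of extreme-heat days per summer.
baseline_rate = (baseline > threshold).sum() / 30
decade_rate = (decade > threshold).sum() / 10
```

    Fixing the threshold to the 1960-1989 baseline is what makes the later-decade count interpretable as a change in exposure rather than a change in definition.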

  3. Cluster Analysis of the Wind Events and Seasonal Wind Circulation Patterns in the Mexico City Region

    Directory of Open Access Journals (Sweden)

    Susana Carreón-Sierra

    2015-07-01

    The residents of Mexico City face serious problems of air pollution. Identifying the most representative scenarios for the transport and dispersion of air pollutants requires knowledge of the main wind circulation patterns. In this paper, a simple method to recognize and characterize the wind circulation patterns in a given region is proposed and applied to the Mexico City winds (2001–2006). This method uses a lattice wind approach to model the local wind events at the meso-β scale, and hierarchical cluster analysis to recognize their agglomerations in phase space. Data from the meteorological network of Mexico City were used as input for the lattice wind model. Ward’s clustering algorithm with Euclidean distance was applied to organize the model wind events into seasonal clusters for each year of the period. Comparison of the hourly population trends of these clusters permitted the recognition and detailed description of seven circulation patterns. These patterns resemble the qualitative descriptions of the Mexico City wind circulation modes reported by other authors. Our method, however, also permitted their quantitative characterization in terms of the wind attributes of velocity, divergence and vorticity, and an estimation of their seasonal and annual occurrence probabilities, which had never before been quantified.
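    The clustering step described above maps directly onto SciPy's hierarchical clustering. The sketch below uses synthetic wind events with assumed attribute vectors (zonal and meridional wind, divergence, vorticity); the real inputs come from the paper's lattice wind model of the Mexico City network.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Each row is one hourly "wind event" described by four attributes:
# mean zonal wind, meridional wind, divergence, vorticity (synthetic).
calm = rng.normal([0.5, 0.5, 0.0, 0.0], 0.2, size=(40, 4))
northerly = rng.normal([0.0, -4.0, -0.5, 0.1], 0.2, size=(40, 4))
westerly = rng.normal([4.0, 0.0, 0.3, -0.2], 0.2, size=(40, 4))
events = np.vstack([calm, northerly, westerly])

# Ward's algorithm with Euclidean distance, as in the paper.
Z = linkage(events, method="ward", metric="euclidean")
labels = fcluster(Z, t=3, criterion="maxclust")   # cut the tree at 3 clusters
```

    In practice the number of clusters is not known in advance; the paper inspects seasonal cluster populations rather than fixing `t=3` as done here for illustration.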

  4. Development of single-event-effects analysis system at the IMP microbeam facility

    Science.gov (United States)

    Guo, Jinlong; Du, Guanghua; Bi, Jinshun; Liu, Wenjing; Wu, Ruqun; Chen, Hao; Wei, Junze; Li, Yaning; Sheng, Lina; Liu, Xiaojun; Ma, Shuyi

    2017-08-01

    Single-event effects (SEEs) in integrated circuits (ICs) caused by galactic single ions are a major cause of anomalies in spacecraft. The main strategies for decreasing radiation failures in spacecraft are using less SEE-sensitive devices and designing radiation-hardened ICs. A high-energy ion microbeam is one of the most powerful tools for obtaining spatial information on SEEs in ICs and for guiding radiation-hardening design. The microbeam facility at the Institute of Modern Physics (IMP), Chinese Academy of Sciences (CAS) can meet both the linear energy transfer (LET) and ion range requirements for ground-based SEE simulation experiments. In order to study the SEE characteristics of ICs at this microbeam platform, an SEE analysis system was developed. This system can target and irradiate ICs with single ions to micrometer-scale accuracy, while acquiring multi-channel SEE signals and mapping the SEE-sensitive regions online. A 4-Mbit NOR Flash memory was tested with this system using 2.2 GeV Kr ions; the radiation-sensitive peripheral circuit regions for 1-to-0 and 0-to-1 upsets, multi-bit upsets and single-event latchup were obtained.

  5. Kickoff to conflict: a sequence analysis of intra-state conflict-preceding event structures.

    Directory of Open Access Journals (Sweden)

    Vito D'Orazio

    While many studies have suggested or assumed that the periods preceding the onset of intra-state conflict are similar across time and space, few have empirically tested this proposition. Using the Integrated Crisis Early Warning System's domestic event data in Asia from 1998-2010, we subject this proposition to empirical analysis. We code the similarity of government-rebel interactions in sequences preceding the onset of intra-state conflict to those preceding further periods of peace using three different metrics: Euclidean, Levenshtein, and mutual information. These scores are then used as predictors in a bivariate logistic regression to forecast whether we are likely to observe conflict in neither, one, or both of the states. We find that our model accurately classifies cases where both sequences precede peace, but struggles to distinguish between cases in which one sequence escalates to conflict and where both sequences escalate to conflict. These findings empirically suggest that generalizable patterns exist between event sequences that precede peace.
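    Of the three similarity metrics named, Levenshtein distance is the one that treats event sequences as ordered symbol strings. A minimal implementation is below; the event codes are hypothetical stand-ins, not the ICEWS coding scheme.

```python
def levenshtein(a, b):
    """Edit distance between two event sequences (any sequences of symbols)."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical interaction codes for two government-rebel sequences.
pre_conflict = ["demand", "protest", "threaten", "fight"]
pre_peace = ["demand", "consult", "agree"]
d = levenshtein(pre_conflict, pre_peace)
```

    Because it counts insertions and deletions as well as substitutions, the metric compares sequences of unequal length directly, which is why it suits event data with irregular reporting density.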

  6. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier eLopez-Calderon

    2014-04-01

    ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.
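    The core averaging and difference-wave operations that ERPLAB wraps are simple array reductions. The sketch below reproduces them in plain NumPy on synthetic single-channel epochs (the trial counts, noise level, and injected "P3-like" bump are assumptions, and this is not ERPLAB's code).

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic single-channel EEG epochs (trials x samples) for two event
# codes; a 2 uV bump over samples 150-199 is added to condition A only.
n_trials, n_samples = 200, 300
p3 = np.zeros(n_samples)
p3[150:200] = 2.0
cond_a = rng.normal(0, 10, (n_trials, n_samples)) + p3
cond_b = rng.normal(0, 10, (n_trials, n_samples))

# Averaging epochs yields the ERPs; subtracting them, the difference wave.
erp_a = cond_a.mean(axis=0)
erp_b = cond_b.mean(axis=0)
diff_wave = erp_a - erp_b
```

    With 200 trials the single-trial noise (10 uV) averages down to well under the 2 uV effect, which is the statistical point of ERP averaging; ERPLAB adds the bookkeeping (event sorting, artifact rejection, filtering) around this core step.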

  7. SENTINEL EVENTS

    Directory of Open Access Journals (Sweden)

    Andrej Robida

    2004-09-01

    Background. The objective of this article is to present two years of statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitudes of hospital leaders and staff are also included. Some recommendations regarding patient safety and the handling of sentinel events are given. Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system for sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To present the results of the first year, a conference for hospital directors and medical directors was organized. A survey was conducted among the participants with the purpose of gathering information about their views on sentinel events. One hundred questionnaires were distributed. Results. Sentinel events: There were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths: 5 of these were in-hospital suicides, 6 were due to an adverse event, and 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization of a femoral artery with absolute alcohol, and a physical attack on a physician by a patient. Analysis of the root causes of sentinel events showed that in most cases processes were inadequate. Survey: One quarter of those surveyed did not know about the sentinel event reporting system. 16% had actual problems when reporting events and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media. 
The majority of

  8. Investigation of Lab Fire Prevention Management System of Combining Root Cause Analysis and Analytic Hierarchy Process with Event Tree Analysis

    Directory of Open Access Journals (Sweden)

    Cheng-Chan Shih

    2016-01-01

    This paper proposed a new approach, combining root cause analysis (RCA), analytic hierarchy process (AHP), and event tree analysis (ETA) in a loop to systematically evaluate various laboratory safety prevention strategies. First, 139 fire accidents were reviewed to identify the root causes and draw out prevention strategies. Most fires were caused by runaway reactions, operation error and equipment failure, or flammable material release, and mostly occurred in workplaces with no prompt fire protection. We also used AHP to evaluate the priority of these strategies and found that the chemical fire prevention strategy is the most important control element, and strengthening maintenance and safety inspection intensity is the most important action. Together with our survey results, we propose that equipment design is also critical for fire prevention. A technical improvement was therefore propounded: installing a fire detector, automatic sprinkler, and manual extinguisher in the lab hood as proactive fire protection. ETA was then used as a tool to evaluate laboratory fire risks. The results indicated that the total risk of a fire occurring decreases from 0.0351 without to 0.0042 with the protective equipment in place. Establishing such a system can help the Environment, Health and Safety (EH&S) office not only analyze and prioritize fire prevention policies more practically, but also demonstrate how much effective protective equipment improvement can achieve and the probabilities of the initiating event developing into a serious accident or being controlled by the existing safety system.
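    The ETA step amounts to multiplying branch probabilities along the escalation paths of an event tree and summing them. The sketch below uses a minimal two-barrier tree with invented probabilities, not the study's figures (its 0.0351 and 0.0042 come from a larger tree).

```python
# Minimal event tree for a lab fire, mirroring the paper's approach with
# hypothetical branch probabilities (assumptions, not the study's values).

P_IGNITION = 0.05      # initiating event frequency per year (assumed)
P_DETECT = 0.9         # fire detector works
P_SPRINKLER = 0.8      # automatic sprinkler works, given detection
P_MANUAL = 0.6         # manual extinguisher succeeds

def serious_fire_risk():
    """Probability per year that an ignition escalates past every barrier."""
    detect_fail = 1 - P_DETECT
    sprinkler_fail = 1 - P_SPRINKLER
    manual_fail = 1 - P_MANUAL
    # Escalation requires every protective layer on a branch to fail:
    # either the fire goes undetected and manual response fails, or it is
    # detected but both the sprinkler and the manual response fail.
    p_escalate = (detect_fail * manual_fail
                  + P_DETECT * sprinkler_fail * manual_fail)
    return P_IGNITION * p_escalate

risk = serious_fire_risk()
```

    Comparing `risk` with the detector and sprinkler probabilities set to zero reproduces the paper's style of without/with comparison for proposed equipment.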

  9. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data of the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive towards heavier and more frequent precipitation can be determined giving valuable advice for planning and managing mountain protection zones.

  10. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be confirmed for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that strong influence of atmospheric
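The threshold-exceedance viewpoint behind extreme value theory can be sketched briefly: instead of flagging fixed deviations from a Gaussian mean, one studies exceedances over a high empirical quantile. The ozone values (in Dobson units) and the quantile level below are invented, not Arosa data, and a full analysis would go on to fit a generalized Pareto distribution to the exceedances.

```python
# Minimal peaks-over-threshold extraction over a synthetic ozone series.

def quantile(xs, q):
    """Empirical q-quantile (simple order-statistic version)."""
    s = sorted(xs)
    return s[min(len(s) - 1, int(q * len(s)))]

def exceedances(series, q):
    """Return (threshold, [(index, value), ...]) above the q-quantile."""
    u = quantile(series, q)
    return u, [(i, x) for i, x in enumerate(series) if x > u]

ozone = [300] * 18 + [330, 360, 345, 390]   # synthetic daily column ozone
u, peaks = exceedances(ozone, q=0.9)
print(u, len(peaks))                        # → 345 2
```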

  11. Emergency Load Shedding Strategy Based on Sensitivity Analysis of Relay Operation Margin against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun Sun

    2012-01-01

    In order to prevent long-term voltage instability and induced cascading events, a load shedding strategy based on the sensitivity of relay operation margin to load powers is discussed and proposed in this paper. The operation margin of the critical impedance backup relay is defined to identify the runtime emergent states of the related system components. Based on sensitivity analysis between the relay operation margin and power system state variables, an optimal load shedding strategy is applied to adjust the emergent states in a timely manner before the unwanted relay operation. Load dynamics is also taken into account to compensate the load shedding amount calculation, and multi-agent technology is applied for the whole strategy implementation. A test system built in a real-time digital simulator (RTDS) has demonstrated the effectiveness of the proposed strategy.

  12. System-level analysis of single event upset susceptibility in RRAM architectures

    Science.gov (United States)

    Liu, Rui; Barnaby, Hugh J.; Yu, Shimeng

    2016-12-01

    In this work, the single event upset susceptibility of a resistive random access memory (RRAM) system with 1-transistor-1-resistor (1T1R) and crossbar architectures to heavy ion strikes is investigated from the circuit-level to the system-level. From a circuit-level perspective, the 1T1R is only susceptible to single-bit-upset (SBU) due to the isolation of cells, while in the crossbar, multiple-bit-upsets may occur because ion-induced voltage spikes generated on drivers may propagate along rows or columns. Three factors are considered to evaluate system-level susceptibility: the upset rate, the sensitive area, and the vulnerable time window. Our analysis indicates that the crossbar architecture has a smaller maximum bit-error-rate per day as compared to the 1T1R architecture for a given sub-array size, I/O width and susceptible time window.

  13. Analysis of Shop Floor Performance through Discrete Event Simulation: A Case Study

    Directory of Open Access Journals (Sweden)

    Yeong Wei Ng

    2014-01-01

    Full Text Available Shop floor performance management is a method to ensure the effective utilization of people, processes, and equipment. Changes on the shop floor might have a positive or negative effect on production performance. Therefore, optimal shop floor operation is required to enhance shop floor performance and to ensure the long-term efficiency of the production process. This work presents a case study from the semiconductor industry: the punching department is modeled to investigate the effect of shop floor changes on production performance through discrete event simulation. The effects on the throughput rate, machine utilization, and labor utilization are studied by adjusting the volume of parts, number of operators, and flow pattern of parts in a series of models. Simulation results are tested and analyzed using analysis of variance (ANOVA). The best model under the shop floor changes is identified during the exploration of alternative scenarios.
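The discrete event simulation idea can be illustrated with a toy single-station model; the real study models a whole department with commercial simulation software, while the arrival times and deterministic service time here are hypothetical.

```python
# A toy discrete-event simulation of one punching station: parts arrive,
# queue FIFO, and are processed one at a time.
import heapq

def simulate(arrival_times, service_time):
    """Single-server FIFO station; returns {part_id: completion_time}."""
    queue = [(t, i) for i, t in enumerate(arrival_times)]
    heapq.heapify(queue)            # next event = earliest arrival
    free_at, done = 0, {}
    while queue:
        t, i = heapq.heappop(queue)
        start = max(t, free_at)     # wait if the machine is busy
        free_at = start + service_time
        done[i] = free_at
    return done

completion = simulate(arrival_times=[0, 1, 2, 10], service_time=3)
print(completion)   # → {0: 3, 1: 6, 2: 9, 3: 13}
```

Changing the arrival volume or service time and re-running the model is exactly the kind of scenario exploration the case study performs, with ANOVA applied across replications.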

  14. Superposed ruptile deformational events revealed by field and VOM structural analysis

    Science.gov (United States)

    Kumaira, Sissa; Guadagnin, Felipe; Keller Lautert, Maiara

    2017-04-01

    Virtual outcrop models (VOM) are becoming an important tool in the analysis of geological structures due to the possibility of obtaining the geometry, and in some cases kinematic aspects, of the analyzed structures in a three-dimensional photorealistic space. These data are used to gain quantitative information on deformational features which, coupled with numeric models, can assist in understanding deformational processes. Old basement units commonly register superposed deformational events, either ductile or ruptile, along their evolution. The Porongos Belt, located in southern Brazil, has a complex deformational history registering at least five ductile and ruptile deformational events. In this study, we present a structural analysis of a quarry in the Porongos Belt, coupling field and VOM structural information to understand the processes involved in the last two deformational events. Field information was acquired using traditional structural methods for the analysis of ruptile structures, such as description, drawing, acquisition of orientation vectors and kinematic analysis. The VOM was created with the image-based modeling method through photogrammetric data acquisition and orthorectification. Photogrammetric data were acquired using a Sony a3500 camera; a total of 128 photographs were taken from ca. 10-20 m from the outcrop in different orientations. Thirty-two ground control point coordinates were acquired using a combination of RTK dGPS surveying and total station work, providing a precision of a few millimeters in x, y and z. Photographs were imported into the PhotoScan software to create a 3D dense point cloud from a structure-from-motion algorithm, which was triangulated and textured to generate the VOM. The VOM was georeferenced (oriented and scaled) using the ground control points, and later analyzed in the OpenPlot software to extract structural information. Data were imported into the Wintensor software to obtain tensor orientations, and the Move software to process and

  15. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual's communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE – Retrospective Analysis of Communications Events – a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  16. An Entry Point for Formal Methods: Specification and Analysis of Event Logs

    Directory of Open Access Journals (Sweden)

    Howard Barringer

    2010-03-01

    Full Text Available Formal specification languages have long languished, due to the grave scalability problems faced by complete verification methods. Runtime verification promises to use formal specifications to automate part of the more scalable art of testing, but has not been widely applied to real systems, and often falters due to the cost and complexity of instrumentation for online monitoring. In this paper we discuss work in progress to apply an event-based specification system to the logging mechanism of the Mars Science Laboratory mission at JPL. By focusing on log analysis, we exploit the "instrumentation" already implemented and required for communicating with the spacecraft. We argue that this work both shows a practical method for using formal specifications in testing and opens interesting research avenues, including a challenging specification learning problem.

  17. Efficacy of forensic statement analysis in distinguishing truthful from deceptive eyewitness accounts of highly stressful events.

    Science.gov (United States)

    Morgan, Charles A; Colwell, Kevin; Hazlett, Gary A

    2011-09-01

    Laboratory-based deception detection research suggests that truthful statements differ from deceptive ones. This non-laboratory study tested whether forensic statement analysis (FSA) methods would distinguish genuine from false eyewitness accounts of exposure to a highly stressful event. A total of 35 military participants were assigned to truthful or deceptive eyewitness conditions. Genuine eyewitnesses reported truthfully about exposure to interrogation stress. Deceptive eyewitnesses studied transcripts of genuine eyewitnesses for 24 h and falsely claimed they had been interrogated. Cognitive Interviews were recorded, transcribed, and assessed by FSA raters blind to the status of participants. Genuine accounts contained more unique words, external and contextual referents, and a greater total word count than did deceptive statements. The type-token ratio was lower in genuine statements. The classification accuracy using FSA techniques was 82%. FSA methods may be effective in real-world circumstances and have relevance to professionals in law enforcement, security, and criminal justice. © 2011 American Academy of Forensic Sciences.
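Two of the FSA cues reported above, total word count and type-token ratio, are straightforward to compute; the sample statement below is invented for illustration.

```python
# Token count and type-token ratio (unique words / total words) for a
# statement, two lexical cues used in forensic statement analysis.
import re

def word_stats(statement):
    """Return (token_count, type_token_ratio) for a statement."""
    words = re.findall(r"[a-z']+", statement.lower())
    return len(words), len(set(words)) / len(words)

genuine = "he asked my name then he asked it again and again until I stopped answering"
n, ratio = word_stats(genuine)
print(n, round(ratio, 2))   # → 15 0.8
```

A lower ratio means more repeated words; the study found genuine accounts had lower type-token ratios but higher total word counts than deceptive ones.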

  18. Analysis of severe convective events from two operational dual polarisation doppler radars

    Directory of Open Access Journals (Sweden)

    M. Celano

    2006-01-01

    Full Text Available The recent gradual increase in the use of polarimetric radars opens the way for possible improvements in the estimation of precipitation and the identification of the prevailing hydrometeor type. An analysis of different convection episodes (20 May 2003, 4 and 7 May 2004) is conducted in order to explore the attenuation effects at C band and their consequences on rainfall field estimation, using two polarimetric radars in the Po Valley, Italy, located about 90 km apart. A hydrometeor classification scheme, developed at the National Severe Storms Laboratory (NSSL), is used to detect the microphysical structure of the different cloud systems. The work focuses on the reconstruction of the 3-D organisation of the convective events, highlighting how the two radar systems "see" the storms from different points of view. Furthermore, the two distinct observations and the temperature field are used to correct the effect of attenuation.

  19. A historical analysis of hazardous events in the Alps – the case of Hindelang (Bavaria, Germany

    Directory of Open Access Journals (Sweden)

    F. Barnikel

    2003-01-01

    Full Text Available A historical analysis of natural hazards for the Hindelang area in the Bavarian Alps is carried out by researching and assessing data from different archives. The focus is on an evaluation of historical hazards on a local scale, working with written documents only. Data are compiled from the archives of governmental departments, local authorities, private collections and state archives. The bandwidth of the assessed hazards includes floods, mass movements and snow avalanches. So far we have collected more than 400 references to events in the Hindelang area, some of which refer to times or places where natural hazards used to be thought unlikely or unknown. Our aim was to collect all written data for this area and to deduce as much information as possible on the hazardous effects on the environment, thereby enhancing our knowledge about past climatic and geomorphic dynamics in the Alps.

  20. Guest Editorial: Analysis and Retrieval of Events/Actions and Workflows in Video Streams

    DEFF Research Database (Denmark)

    Doulamis, Anastasios; Doulamis, Nikolaos; Bertini, Marco

    2016-01-01

    Cognitive video supervision and event analysis in video sequences is a critical task in many multimedia applications. Methods, tools, and algorithms that aim to detect and recognize high-level concepts and their respective spatiotemporal and causal relations in order to identify semantic video activities, actions, and procedures have been in the focus of the research community over the last years. This research area has strong impact on many real-life applications such as service quality assurance, compliance to the designed procedures in industrial plants, surveillance of people-dense areas (e.g., thematic parks, critical public infrastructures), crisis management in public service areas (e.g., train stations, airports), security (detection of abnormal behaviors in surveillance videos), semantic characterization, and annotation of video streams in various domains (e.g., broadcast or user

  1. 'Haymaking on Rajac Mt': Tourist event analysis according to gender and age structure

    Directory of Open Access Journals (Sweden)

    Brankov Jovana

    2009-01-01

    Full Text Available Drawing on previous research on socio-demographic influences on the tourist market, this work studies the habits and behavior of tourists (visitors) at the tourist event 'Haymaking on Rajac mountain', as well as their use of advertising media for gaining new information. The poll was conducted among 352 randomly selected people, and the data were analyzed by gender and age structure. The aim of this research is to determine whether there are significant differences in the number of trips and motives for traveling, the choice of transport means, the amount of funds intended for the stay and the way money is spent, as well as in the use of different means of tourist propaganda.

  2. Migration Experience and Premarital Sexual Initiation in Urban Kenya: An Event History Analysis

    Science.gov (United States)

    Luke, Nancy; Xu, Hongwei; Mberu, Blessing U.; Goldberg, Rachel E.

    2013-01-01

    Migration during the formative adolescent years can affect important life-course transitions, including the initiation of sexual activity. In this study, we use life history calendar data to investigate the relationship between changes in residence and timing of premarital sexual debut among young people in urban Kenya. By age 18, 64 percent of respondents had initiated premarital sex, and 45 percent had moved at least once between the ages of 12 and 18. Results of the event history analysis show that girls and boys who move during early adolescence experience the earliest onset of sexual activity. For adolescent girls, however, other dimensions of migration provide protective effects, with greater numbers of residential changes and residential changes in the last one to three months associated with later sexual initiation. To support young people’s ability to navigate the social, economic, and sexual environments that accompany residential change, researchers and policymakers should consider how various dimensions of migration affect sexual activity. PMID:23175950

  3. Migration experience and premarital sexual initiation in urban Kenya: an event history analysis.

    Science.gov (United States)

    Luke, Nancy; Xu, Hongwei; Mberu, Blessing U; Goldberg, Rachel E

    2012-06-01

    Migration during the formative adolescent years can affect important life-course transitions, including the initiation of sexual activity. In this study, we use life history calendar data to investigate the relationship between changes in residence and timing of premarital sexual debut among young people in urban Kenya. By age 18, 64 percent of respondents had initiated premarital sex, and 45 percent had moved at least once between the ages of 12 and 18. Results of the event history analysis show that girls and boys who move during early adolescence experience the earliest onset of sexual activity. For adolescent girls, however, other dimensions of migration provide protective effects, with greater numbers of residential changes and residential changes in the last one to three months associated with later sexual initiation. To support young people's ability to navigate the social, economic, and sexual environments that accompany residential change, researchers and policymakers should consider how various dimensions of migration affect sexual activity.
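The event history analysis in the two records above rests on survival-analysis machinery; its simplest ingredient is the Kaplan-Meier estimator sketched here. The ages are invented: time is age at premarital sexual debut, and observed = 0 marks respondents censored before debuting.

```python
# Kaplan-Meier survival estimate over right-censored event times.

def kaplan_meier(times, observed):
    """Return [(event_time, survival_probability)] pairs."""
    s, out = 1.0, []
    for t in sorted(set(times)):
        events = sum(1 for ti, e in zip(times, observed) if ti == t and e)
        at_risk = sum(1 for ti in times if ti >= t)
        if events:
            s *= 1 - events / at_risk
            out.append((t, round(s, 3)))
    return out

ages    = [14, 15, 15, 16, 17, 18, 18, 18]   # hypothetical respondents
debuted = [ 1,  1,  0,  1,  1,  1,  0,  0]   # 1 = debut observed, 0 = censored
print(kaplan_meier(ages, debuted))
# → [(14, 0.875), (15, 0.75), (16, 0.6), (17, 0.45), (18, 0.3)]
```

The published study uses regression-based event history models (hazards conditional on migration covariates), for which this estimator is the non-parametric starting point.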

  4. Human recognition memory and conflict control: An event-related potential study.

    Science.gov (United States)

    Liu, T; Liu, X; Xiao, T; Shi, J

    2016-01-28

    The relationship between recognition memory and cognitive control is an important research topic. The current study investigated how conflict control influences an individual's emotional memory. During the encoding phase, participants were required to judge the affective valence of a Chinese Chengyu word (either positive or negative) in a modified Simon paradigm and to remember the word. Half of the words were presented in the congruent condition and the other half were displayed in the incongruent condition. During the retrieval phase, participants were instructed to make an 'old/new judgment' and decide whether the word had been presented previously. Electrophysiological responses were recorded using the event-related potential (ERP) technique. The behavioral results of retrieval processes showed that participants remembered more positive than negative words when they were encoded in the congruent condition. The electrophysiological results revealed that the retrieval of words encoded in the incongruent condition elicited less negative frontal negativity (FN) and early posterior negativity (EPN) amplitudes than those encoded in the congruent condition. The retrieval of words encoded in the incongruent condition induced greater late positive complex (LPC) amplitudes, relative to those encoded in the congruent condition on the left hemisphere. It was also observed that the recognition of positive words induced faster LPC responses than negative words when they were encoded in the incongruent condition. The present electrophysiological study illustrates that emotional memory processes may be affected by conflict control. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  5. A database of high-impact weather events in Greece: a descriptive impact analysis for the period 2001–2011

    Directory of Open Access Journals (Sweden)

    K. Papagiannaki

    2013-03-01

    Full Text Available This paper introduces the development of a database of high-impact weather events that have occurred in Greece since 2001. The selected events are related to the occurrence of floods, flash floods, hail, snow/frost, tornados, windstorms, heat waves and lightning with adverse consequences (excluding those related to agriculture). The database includes, among others, the geographical distribution of the recorded events, relevant meteorological data, a brief description of the induced impacts and references in the press. The paper further offers an extensive analysis of the temporal and spatial distribution of high-impact weather events for the period 2001–2011, taking into account the intensity of weather conditions and the consequent impact on society. Analysis of the monthly distribution of high-impact weather events showed that they are more frequent during October and November. More than 80 people lost their lives, half of them due to flash floods. Concerning the spatial distribution of high-impact weather events, among the 51 prefectures of the country, Attica, Thessaloniki, Elia and Halkidiki were the most frequently affected areas, mainly by flash floods. The share of tornados in Elia, of windstorms in Attica, of lightning and hail events in Halkidiki, and of snow/frost events in Thessaloniki was also significant.

  6. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    Science.gov (United States)

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
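The core object in ASSET's visualization step can be sketched directly: an intersection matrix whose entry (i, j) counts the neurons active in both time bins i and j, so that repeated synchronous sequences appear as off-diagonal structures of high overlap. The binned spike data below are made up.

```python
# Intersection matrix over binned parallel spike trains.

def intersection_matrix(bins):
    """bins: one set of active neuron ids per time bin."""
    n = len(bins)
    return [[len(bins[i] & bins[j]) for j in range(n)] for i in range(n)]

# The sequence {1, 2} -> {3, 4} fires twice (bins 0-1 and 3-4), so the
# entries linking the two repeats carry high overlap counts.
bins = [{1, 2}, {3, 4}, {5}, {1, 2}, {3, 4}]
M = intersection_matrix(bins)
print(M[0][3], M[1][4], M[0][1])   # → 2 2 0
```

ASSET's contribution is to replace visual inspection of this matrix with a statistical procedure that flags and clusters the diagonal structures automatically.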

  7. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies for all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of that time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper air fields in order to shed light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  8. Analysis and Prediction of West African Moist Events during the Boreal Spring of 2009

    Science.gov (United States)

    Mera, Roberto Javier

    Weather and climate in Sahelian West Africa are dominated by two major wind systems, the southwesterly West African Monsoon (WAM) and the northeasterly (Harmattan) trade winds. In addition to the agricultural benefit of the WAM, the public health sector is affected given the relationship between the onset of moisture and end of meningitis outbreaks. Knowledge and prediction of moisture distribution during the boreal spring is vital to the mitigation of meningitis by providing guidance for vaccine dissemination. The goal of the present study is to (a) develop a climatology and conceptual model of the moisture regime during the boreal spring, (b) investigate the role of extra-tropical and Convectively-coupled Equatorial Waves (CCEWs) on the modulation of westward moving synoptic waves and (c) determine the efficacy of a regional model as a tool for predicting moisture variability. Medical reports during 2009, along with continuous meteorological observations at Kano, Nigeria, showed that the advent of high humidity correlated with cessation of the disease. Further analysis of the 2009 boreal spring elucidated the presence of short-term moist events that modulated surface moisture on temporal scales relevant to the health sector. The May moist event (MME) provided insight into interplays among climate anomalies, extra-tropical systems, equatorially trapped waves and westward-propagating synoptic disturbances. The synoptic disturbance initiated 7 May and traveled westward to the coast by 12 May. There was a marked, semi-stationary moist anomaly in the precipitable water field (kg m-2) east of 10°E through late April and early May, that moved westward at the time of the MME. Further inspection revealed a mid-latitude system may have played a role in increasing the latitudinal amplitude of the MME. CCEWs were also found to have an impact on the MME. A coherent Kelvin wave propagated through the region, providing increased monsoonal flow and heightened convection. A

  9. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    Energy Technology Data Exchange (ETDEWEB)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Science Applications International Corp., Albuquerque, NM (USA); Sandia National Labs., Albuquerque, NM (USA))

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  10. Analysis of flash flood parameters and human impacts in the US from 2006 to 2012

    Science.gov (United States)

    Špitalar, Maruša; Gourley, Jonathan J.; Lutoff, Celine; Kirstetter, Pierre-Emmanuel; Brilly, Mitja; Carr, Nicholas

    2014-11-01

    Several factors external to the natural hazard of flash flooding can contribute to the type and magnitude of its resulting damages. Human exposure, vulnerability, and fatality and injury rates can be minimized by identifying and then mitigating the causative factors for human impacts. A database of flash flooding was used for statistical analysis of human impacts across the U.S.: 21,549 flash flood events were analyzed over the 6-year period from October 2006 to 2012. Based on the information available in the database, physical parameters were introduced and then correlated to the reported human impacts. The probability density function (PDF) of the frequency of flash flood events and the PDF of occurrences weighted by the number of injuries and fatalities were used to describe the influence of each parameter. The factors that emerged as the most influential on human impacts are short flood durations, small catchment sizes in rural areas, vehicles, and nocturnal events with low visibility. Analyzing and correlating a diverse range of parameters to human impacts gives important insight into what contributes to fatalities and injuries and raises further questions on how to manage them.
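The impact-weighted frequency idea above amounts to tallying the same events twice: once per event, and once weighted by casualties. A minimal sketch, with invented (duration, casualties) records:

```python
# Plain versus casualty-weighted event frequencies by flood duration.
from collections import Counter

events = [(1, 3), (1, 0), (2, 1), (2, 0), (6, 0), (6, 0)]  # hypothetical

per_event = Counter(duration for duration, _ in events)
weighted = Counter()
for duration, casualties in events:
    weighted[duration] += casualties

# Short-duration events dominate once casualties are used as weights.
print(dict(per_event), dict(weighted))   # → {1: 2, 2: 2, 6: 2} {1: 3, 2: 1, 6: 0}
```

Comparing the two tallies per parameter bin is what lets the study separate "frequent" from "dangerous" flash flood characteristics.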

  11. Cardiopulmonary resuscitation in the elderly: analysis of the events in the emergency department

    Directory of Open Access Journals (Sweden)

    Augusto Tricerri

    2013-10-01

    Full Text Available With the increasing number of old people in all western countries and increasing life expectancy at birth, many seniors spend the last period of their life with various afflictions that may lead to cardiac arrest. Bystander cardiopulmonary resuscitation (CPR) increases survival rates. Octogenarians are the fastest growing segment of the population, and despite empirical evidence that CPR is of questionable effectiveness in seniors with comorbidities, it is still the only treatment among life-sustaining ones. Cardiopulmonary resuscitation is frequently unsuccessful, but if survival is achieved, a fairly good quality of life can be expected. Various papers have analyzed the effect of CPR on hospitalized patients or on cardiac arrests occurring at home or in public places, while less is known about events occurring in the emergency room (ER). We performed a retrospective analysis of cardiac arrest events that occurred in the ER over 54 months: we analyzed 415,001 records of ER visits (from 01/01/1999 to 30/06/2003) in San Giovanni Addolorata Hospital. Data were analyzed in terms of age and outcome. We identified 475 records with the outcome of death in the ER or death on arrival. Of these, we selected 290 medical records that had sufficient data to be analyzed. Of the 290 patients evaluated, 225 died in the ER, 18 were deemed dead on arrival, and 47 survived the cardiac arrest and were admitted to an intensive care unit (ICU). The overall mortality was 0.11%, while the incidence of the selected events was 0.072%. The mean age of the analyzed population was 71.3 years. Often the only possible diagnosis was cardiac arrest, though in most cases the diagnosis could be specified and grouped more precisely. Analysis of the procedures showed that cardiac arrest treated by direct current (DC) shock was similarly distributed across age groups, and no difference was detectable between the two groups. The mean age of the patients who underwent tracheal intubation (TI) was
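The reported rates follow from simple proportions over the figures quoted in the abstract; a quick check (note that 290/415,001 gives ~0.070%, slightly below the 0.072% quoted, so the abstract's incidence presumably reflects a marginally different event count):

```python
# Quick check of the rates quoted in the abstract (figures taken from the text).
er_visits = 415_001          # ER visits, 01/01/1999 to 30/06/2003
deaths_in_er_or_doa = 475    # records: death in ER or death on arrival
analyzed = 225 + 18 + 47     # records with sufficient data = 290

overall_mortality = deaths_in_er_or_doa / er_visits * 100
event_incidence = analyzed / er_visits * 100

print(f"overall mortality ~ {overall_mortality:.2f}%")   # ~0.11%, as reported
print(f"analyzed-event incidence ~ {event_incidence:.3f}%")
```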

  12. Audio visual information fusion for human activity analysis

    OpenAIRE

    Thagadur Shivappa, Shankar

    2010-01-01

    Human activity analysis in unconstrained environments using far-field sensors is a challenging task. The fusion of audio and visual cues enables us to build robust and efficient