WorldWideScience

Sample records for event data recorders

  1. The Forensics Aspects of Event Data Recorders

    Directory of Open Access Journals (Sweden)

    Jeremy S. Daily

    2008-09-01

    Full Text Available The proper generation and preservation of digital data from Event Data Recorders (EDRs) can provide invaluable evidence to automobile crash reconstruction investigations. However, data collected from the EDR can be difficult to use and authenticate, complicating the presentation of such information as evidence in legal proceedings. Indeed, current techniques for removing and preserving such data do not meet the court’s standards for electronic evidence. Experimentation with an EDR unit from a 2001 GMC Sierra pickup truck highlighted particular issues with repeatability of results. Fortunately, advances in the digital forensics field and memory technology can be applied to EDR analysis in order to provide more complete and usable data. The presented issues should assist in the identification and development of a model for forensically sound collection and investigation techniques for EDRs.

  2. 77 FR 48492 - Event Data Recorders

    Science.gov (United States)

    2012-08-14

    ... that [cir] Involve side or side curtain/tube air bags such that EDR data would only need to be locked... deployable restraints other than frontal, side or side/curtain air bags such that EDR data would not need to... definitions to alleviate any uncertainties in multiple event crashes; Revised certain sensor ranges and...

  3. 76 FR 47478 - Event Data Recorders

    Science.gov (United States)

    2011-08-05

    ... increase the cost of memory for storage of acceleration data. It further commented that the revised... Requirements of Part 563 Part 563 specifies that if the EDR records acceleration data "in non-volatile memory... protocols to better reflect current accelerometer technologies. [4] See Docket number NHTSA-2004-18029. [5]...

  4. 77 FR 47552 - Event Data Recorders

    Science.gov (United States)

    2012-08-09

    ... as percentages. We also believed the change would better address state-of-the-art active steering... are recorded. We believe that section 563.9(b) is clear that when a memory buffer is available, EDRs... memory buffers are full, manufacturers may either overwrite any previous data that does not involve...

  5. Event metadata records as a testbed for scalable data mining

    International Nuclear Information System (INIS)

    Gemmeren, P van; Malon, D

    2010-01-01

    At a data rate of 200 hertz, event metadata records ('TAGs,' in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise 'data mining,' but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.
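
    The fixed-schema export mentioned in this abstract is straightforward to sketch. Below is a minimal, hypothetical example of writing TAG-like records to HDF5 with NumPy and h5py and running a simple selection over them; the field names are illustrative and do not reproduce the actual ATLAS TAG schema.

        import numpy as np
        import h5py

        # Illustrative TAG-like records: one fixed, flat schema per event.
        # Field names are hypothetical, not the real ATLAS TAG attributes.
        tag_dtype = np.dtype([
            ("run_number", np.uint32),
            ("event_number", np.uint64),
            ("n_muons", np.uint16),
            ("missing_et", np.float32),  # GeV
        ])

        rng = np.random.default_rng(0)
        n_events = 100_000
        tags = np.zeros(n_events, dtype=tag_dtype)
        tags["run_number"] = 195847
        tags["event_number"] = np.arange(n_events)
        tags["n_muons"] = rng.poisson(0.2, n_events)
        tags["missing_et"] = rng.exponential(25.0, n_events)

        # A fixed schema maps directly onto one HDF5 compound dataset,
        # which parallel analysis tools can read in contiguous chunks.
        with h5py.File("tags.h5", "w") as f:
            f.create_dataset("tags", data=tags, chunks=True, compression="gzip")

        # A trivial selection pass, standing in for a data-mining kernel.
        with h5py.File("tags.h5", "r") as f:
            data = f["tags"][:]
            print((data["missing_et"] > 100.0).sum(), "events pass the MET cut")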

  2. THE USE OF EVENT DATA RECORDER (EDR – BLACK BOX)

    Directory of Open Access Journals (Sweden)

    Gabriel Nowacki

    2014-03-01

    Full Text Available The paper concerns the registration of road events by a modern device called the EDR – black box, applicable to all types of motor vehicles. The device records data concerning the vehicle's technical condition, the way it was driven, and RTS. The recorder may be used in private and commercial cars, taxis, buses and trucks. It may serve as a neutral witness for the police, courts and insurance firms, facilitating the reconstruction of road accidents and providing proof against those who caused them. The device should promote efficient driving, which will significantly contribute to decreasing the number of road accidents and limiting environmental pollution. Finally, in the last year the German parliament backed a proposal to the European Commission to put black boxes, which gather information from vehicles involved in accidents, in all new cars from 2015 on.

  7. Development of requirements and functional specifications for crash event data recorders : final report

    Science.gov (United States)

    2004-12-01

    The U.S. DOT has conducted research on the requirements for a Crash Event Data Recorder to facilitate the reconstruction of commercial motor vehicle crashes. This report documents the work performed on the Development of Requirements and Functiona...

  8. Effect of a data buffer on the recorded distribution of time intervals for random events

    Energy Technology Data Exchange (ETDEWEB)

    Barton, J C [Polytechnic of North London (UK)]

    1976-03-15

    The use of a data buffer enables the distribution of the time intervals between events to be studied for times less than the recording system dead-time but the usual negative exponential distribution for random events has to be modified. The theory for this effect is developed for an n-stage buffer followed by an asynchronous recorder. Results are evaluated for the values of n from 1 to 5. In the language of queueing theory the system studied is of type M/D/1/n+1, i.e. with constant service time and a finite number of places.
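
    The system described is an M/D/1/n+1 queue, so its recorded-interval distribution is easy to cross-check by simulation. The sketch below is a Monte Carlo stand-in for the paper's analytic results, with arbitrary rate and dead-time values: it generates Poisson arrivals, passes them through an n-stage buffer feeding a recorder with constant service time, and returns the intervals between accepted events. Intervals shorter than the dead-time now appear, but with a modified, non-exponential distribution.

        import numpy as np

        def buffered_recorder_intervals(rate=1.0, dead_time=0.5, n_stages=3,
                                        n_arrivals=200_000, seed=1):
            """Simulate an M/D/1/(n+1) queue: Poisson arrivals, an n-stage
            buffer plus one recorder with constant service (dead) time.
            Returns intervals between successive accepted (recorded) events."""
            rng = np.random.default_rng(seed)
            arrivals = np.cumsum(rng.exponential(1.0 / rate, n_arrivals))
            accepted, departures = [], []
            for t in arrivals:
                while departures and departures[0] <= t:
                    departures.pop(0)                 # completed services leave
                if len(departures) < n_stages + 1:    # room in buffer + server?
                    start = departures[-1] if departures else t
                    departures.append(max(start, t) + dead_time)
                    accepted.append(t)
                # else: the event is lost because the buffer is full
            return np.diff(accepted)

        intervals = buffered_recorder_intervals()
        print("fraction of recorded intervals below the dead time:",
              np.mean(intervals < 0.5))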

  9. Video event data recording of a taxi driver used for diagnosis of epilepsy

    Directory of Open Access Journals (Sweden)

    Kotaro Sakurai

    2014-01-01

    Full Text Available A video event data recorder (VEDR) in a motor vehicle records images before and after a traffic accident. This report describes a taxi driver whose seizures were recorded by VEDR, which was extremely useful for the diagnosis of epilepsy. The patient was a 63-year-old right-handed Japanese male taxi driver. He collided with a streetlight. Two years prior to this incident, he raced an engine for a long time while parked. The VEDR enabled confirmation that the accidents resulted from epileptic seizures, and he was diagnosed with symptomatic localization-related epilepsy. The VEDR is useful not only for traffic accident evidence; it might also contribute to a driver's health care and road safety.

  10. Event Recording Data Acquisition System and Experiment Data Management System for Neutron Experiments at MLF, J-PARC

    Science.gov (United States)

    Nakatani, T.; Inamura, Y.; Moriyama, K.; Ito, T.; Muto, S.; Otomo, T.

    Neutron scattering can be a powerful probe in the investigation of many phenomena in the materials and life sciences. The Materials and Life Science Experimental Facility (MLF) at the Japan Proton Accelerator Research Complex (J-PARC) is a leading center of experimental neutron science and boasts one of the most intense pulsed neutron sources in the world. The MLF currently has 18 experimental instruments in operation that support a wide variety of users from across a range of research fields. The instruments include optical elements, sample environment apparatus and detector systems that are controlled and monitored electronically throughout an experiment. Signals from these components and those from the neutron source are converted into a digital format by the data acquisition (DAQ) electronics and recorded as time-tagged event data in the DAQ computers using "DAQ-Middleware". Operating in event mode, the DAQ system produces extremely large data files (~GB) under various measurement conditions. Simultaneously, the measurement meta-data indicating each measurement condition is recorded in XML format by the MLF control software framework "IROHA". These measurement event data and meta-data are collected in the MLF common storage and cataloged by the MLF Experimental Database (MLF EXP-DB) based on a commercial XML database. The system provides a web interface for users to manage and remotely analyze experimental data.
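
    As a rough illustration of the two record types described above, the sketch below writes a handful of time-tagged events to a binary file and the corresponding measurement metadata to XML. Both layouts are hypothetical; the actual DAQ-Middleware event format and IROHA metadata schema are not reproduced here.

        import struct
        import time
        import xml.etree.ElementTree as ET

        # Hypothetical time-tagged event: (detector pixel id, time-of-flight in ns).
        events = [(17, 1200), (17, 1350), (42, 2048)]
        with open("run0001.evt", "wb") as f:
            for pixel, tof_ns in events:
                f.write(struct.pack("<IQ", pixel, tof_ns))  # uint32 + uint64

        # Minimal measurement metadata in XML, in the spirit of IROHA's records.
        meta = ET.Element("measurement", run="0001")
        ET.SubElement(meta, "instrument").text = "BL99"  # hypothetical beamline
        ET.SubElement(meta, "start_time").text = time.strftime("%Y-%m-%dT%H:%M:%S")
        ET.SubElement(meta, "sample_temperature", unit="K").text = "300"
        ET.ElementTree(meta).write("run0001.xml")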

  11. Common data elements for secondary use of electronic health record data for clinical trial execution and serious adverse event reporting

    Directory of Open Access Journals (Sweden)

    Philipp Bruland

    2016-11-01

    Full Text Available Abstract Background Data capture is one of the most expensive phases during the conduct of a clinical trial and the increasing use of electronic health records (EHR) offers significant savings to clinical research. To facilitate these secondary uses of routinely collected patient data, it is beneficial to know what data elements are captured in clinical trials. Therefore our aim here is to determine the most commonly used data elements in clinical trials and their availability in hospital EHR systems. Methods Case report forms for 23 clinical trials in differing disease areas were analyzed. Through an iterative and consensus-based process of medical informatics professionals from academia and trial experts from the European pharmaceutical industry, data elements were compiled for all disease areas and with special focus on the reporting of adverse events. Afterwards, data elements were identified and statistics acquired from hospital sites providing data to the EHR4CR project. Results The analysis identified 133 unique data elements. Fifty elements were congruent with a published data inventory for patient recruitment and 83 new elements were identified for clinical trial execution, including adverse event reporting. Demographic and laboratory elements lead the list of available elements in hospital EHR systems. For the reporting of serious adverse events only very few elements could be identified in the patient records. Conclusions Common data elements in clinical trials have been identified and their availability in hospital systems elucidated. Several elements, often those related to reimbursement, are frequently available whereas more specialized elements are ranked at the bottom of the data inventory list. Hospitals that want to obtain the benefits of reusing data for research from their EHR are now able to prioritize their efforts based on this common data element list.

  12. Common data elements for secondary use of electronic health record data for clinical trial execution and serious adverse event reporting.

    Science.gov (United States)

    Bruland, Philipp; McGilchrist, Mark; Zapletal, Eric; Acosta, Dionisio; Proeve, Johann; Askin, Scott; Ganslandt, Thomas; Doods, Justin; Dugas, Martin

    2016-11-22

    Data capture is one of the most expensive phases during the conduct of a clinical trial and the increasing use of electronic health records (EHR) offers significant savings to clinical research. To facilitate these secondary uses of routinely collected patient data, it is beneficial to know what data elements are captured in clinical trials. Therefore our aim here is to determine the most commonly used data elements in clinical trials and their availability in hospital EHR systems. Case report forms for 23 clinical trials in differing disease areas were analyzed. Through an iterative and consensus-based process of medical informatics professionals from academia and trial experts from the European pharmaceutical industry, data elements were compiled for all disease areas and with special focus on the reporting of adverse events. Afterwards, data elements were identified and statistics acquired from hospital sites providing data to the EHR4CR project. The analysis identified 133 unique data elements. Fifty elements were congruent with a published data inventory for patient recruitment and 83 new elements were identified for clinical trial execution, including adverse event reporting. Demographic and laboratory elements lead the list of available elements in hospital EHR systems. For the reporting of serious adverse events only very few elements could be identified in the patient records. Common data elements in clinical trials have been identified and their availability in hospital systems elucidated. Several elements, often those related to reimbursement, are frequently available whereas more specialized elements are ranked at the bottom of the data inventory list. Hospitals that want to obtain the benefits of reusing data for research from their EHR are now able to prioritize their efforts based on this common data element list.

  13. 77 FR 74144 - Federal Motor Vehicle Safety Standards; Event Data Recorders

    Science.gov (United States)

    2012-12-13

    ... strategies were used during the event. Additionally, the data can be used to assess whether the vehicle was... the agency regarding their 2010 vehicles and then weighting using 2010 corporate-level vehicle... advance notice of proposed rulemaking in the near future to explore the potential for, and future utility...

  14. Analysis of Driver Evasive Maneuvering Prior to Intersection Crashes Using Event Data Recorders.

    Science.gov (United States)

    Scanlon, John M; Kusano, Kristofer D; Gabler, Hampton C

    2015-01-01

    Intersection crashes account for over 4,500 fatalities in the United States each year. Intersection Advanced Driver Assistance Systems (I-ADAS) are emerging vehicle-based active safety systems that have the potential to help drivers safely navigate across intersections and prevent intersection crashes and injuries. The performance of an I-ADAS is expected to be highly dependent upon driver evasive maneuvering prior to an intersection crash. Little has been published, however, on the detailed evasive kinematics followed by drivers prior to real-world intersection crashes. The objective of this study was to characterize the frequency, timing, and kinematics of driver evasive maneuvers prior to intersection crashes. Event data recorders (EDRs) downloaded from vehicles involved in intersection crashes were investigated as part of NASS-CDS years 2001 to 2013. A total of 135 EDRs with precrash vehicle speed and braking application were downloaded to investigate evasive braking. A smaller subset of 59 EDRs that collected vehicle yaw rate was additionally analyzed to investigate evasive steering. Each vehicle was assigned to one of 3 precrash movement classifiers (traveling through the intersection, completely stopped, or rolling stop) based on the vehicle's calculated acceleration and observed velocity profile. To ensure that any significant steering input observed was an attempted evasive maneuver, the analysis excluded vehicles at intersections that were turning, driving on a curved road, or performing a lane change. Braking application at the last EDR-recorded time point was assumed to indicate evasive braking. A vehicle yaw rate greater than 4° per second was assumed to indicate an evasive steering maneuver. Drivers executed crash avoidance maneuvers in four-fifths of intersection crashes. A more detailed analysis of evasive braking frequency by precrash maneuver revealed that drivers performing complete or rolling stops (61.3%) braked less often than drivers
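
    The decision rules quoted in this abstract (brake application at the last recorded sample; yaw rate above 4° per second) translate directly into code. Below is a toy classifier over one EDR precrash record, under assumed array layouts; the cutoff separating complete from rolling stops is illustrative, not the study's exact criterion.

        import numpy as np

        def classify_precrash(speed_mps, brake_on, yaw_rate_dps, yaw_thresh=4.0):
            """Toy version of the paper's rules for one precrash record:
            arrays sampled over the seconds before impact, most recent last."""
            evasive_braking = bool(brake_on[-1])        # braking at last sample
            evasive_steering = bool(np.any(np.abs(yaw_rate_dps) > yaw_thresh))
            vmin = float(np.min(speed_mps))
            if vmin < 0.1:
                movement = "completely stopped"
            elif vmin < 2.0:                            # illustrative cutoff
                movement = "rolling stop"
            else:
                movement = "traveling through"
            return movement, evasive_braking, evasive_steering

        # One fabricated record: the driver slows but never stops, then swerves.
        speed = np.array([13.4, 12.1, 10.5, 8.9, 7.2])
        brake = np.array([0, 0, 1, 1, 1])
        yaw = np.array([0.3, 0.5, 1.2, 5.1, 7.8])
        print(classify_precrash(speed, brake, yaw))
        # -> ('traveling through', True, True)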

  15. Use and Customization of Risk Scores for Predicting Cardiovascular Events Using Electronic Health Record Data.

    Science.gov (United States)

    Wolfson, Julian; Vock, David M; Bandyopadhyay, Sunayan; Kottke, Thomas; Vazquez-Benitez, Gabriela; Johnson, Paul; Adomavicius, Gediminas; O'Connor, Patrick J

    2017-04-24

    Clinicians who are using the Framingham Risk Score (FRS) or the American College of Cardiology/American Heart Association Pooled Cohort Equations (PCE) to estimate risk for their patients based on electronic health data (EHD) face 4 questions. (1) Do published risk scores applied to EHD yield accurate estimates of cardiovascular risk? (2) Are FRS risk estimates, which are based on data that are up to 45 years old, valid for a contemporary patient population seeking routine care? (3) Do the PCE make the FRS obsolete? (4) Does refitting the risk score using EHD improve the accuracy of risk estimates? Data were extracted from the EHD of 84 116 adults aged 40 to 79 years who received care at a large healthcare delivery and insurance organization between 2001 and 2011. We assessed calibration and discrimination for 4 risk scores: published versions of FRS and PCE and versions obtained by refitting models using a subset of the available EHD. The published FRS was well calibrated (calibration statistic K=9.1, miscalibration ranging from 0% to 17% across risk groups), but the PCE displayed modest evidence of miscalibration (calibration statistic K=43.7, miscalibration from 9% to 31%). Discrimination was similar in both models (C-index=0.740 for FRS, 0.747 for PCE). Refitting the published models using EHD did not substantially improve calibration or discrimination. We conclude that published cardiovascular risk models can be successfully applied to EHD to estimate cardiovascular risk; the FRS remains valid and is not obsolete; and model refitting does not meaningfully improve the accuracy of risk estimates. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
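
    The two quantities at the heart of this comparison are easy to compute from scored electronic health data. The sketch below groups patients by predicted risk and contrasts observed with expected event counts via a Hosmer-Lemeshow-type statistic (a stand-in for the paper's calibration statistic K, whose exact definition is not given in the abstract) and computes discrimination as a C-index; censoring is ignored in this toy version.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        def calibration_and_discrimination(pred_risk, outcome, n_groups=10):
            """Observed-vs-expected calibration across risk groups, plus AUC."""
            order = np.argsort(pred_risk)
            k_stat = 0.0
            for g in np.array_split(order, n_groups):
                obs, exp = outcome[g].sum(), pred_risk[g].sum()
                k_stat += (obs - exp) ** 2 / (exp * (1.0 - exp / len(g)) + 1e-9)
            return k_stat, roc_auc_score(outcome, pred_risk)

        # Toy cohort: predicted risks and a slightly miscalibrated "truth".
        rng = np.random.default_rng(7)
        risk = rng.beta(2, 18, 50_000)
        event = rng.binomial(1, np.clip(1.2 * risk, 0, 1))
        k, c = calibration_and_discrimination(risk, event)
        print(f"K = {k:.1f}, C-index = {c:.3f}")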

  16. First ATLAS Events Recorded Underground

    CERN Multimedia

    Teuscher, R

    As reported in the CERN Bulletin, Issue No.30-31, 25 July 2005 The ATLAS barrel Tile calorimeter has recorded its first events underground using a cosmic ray trigger, as part of the detector commissioning programme. This is not a simulation! A cosmic ray muon recorded by the barrel Tile calorimeter of ATLAS on 21 June 2005 at 18:30. The calorimeter has three layers and a pointing geometry. The light trapezoids represent the energy deposited in the tiles of the calorimeter depicted as a thick disk. On the evening of June 21, the ATLAS detector, now being installed in the underground experimental hall UX15, reached an important psychological milestone: the barrel Tile calorimeter recorded the first cosmic ray events in the underground cavern. An estimated million cosmic muons enter the ATLAS cavern every 3 minutes, and the ATLAS team decided to make good use of some of them for the commissioning of the detector. Although only 8 of the 128 calorimeter slices ('superdrawers') were included in the trigg...

  17. RECORDS REACHING RECORDING DATA TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    G. W. L. Gresik

    2013-07-01

    Full Text Available The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. This is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is ensured by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  18. Records Reaching Recording Data Technologies

    Science.gov (United States)

    Gresik, G. W. L.; Siebe, S.; Drewello, R.

    2013-07-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. This is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is ensured by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  19. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    Science.gov (United States)

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real-world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on the general linear model (GLM) least square fit analysis for the Automatic IDentification of functional Events (or AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof-of-principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions respectively (condition 1, social: interact with a person; condition 2, non-social: interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2 respectively. The identified functional events were then matched to behavioural data from the video recordings of the movements and actions of the participant. Our results suggest that "brain-first" rather than "behaviour-first" analysis is
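
    The core of a GLM least-squares onset search is compact enough to sketch. The code below is an illustration of the general idea, not the published AIDE algorithm: it slides a boxcar regressor convolved with a canonical double-gamma haemodynamic response across a single fNIRS channel and keeps the onset whose GLM fit has the smallest residual.

        import numpy as np
        from scipy.stats import gamma

        def canonical_hrf(t):
            """Double-gamma haemodynamic response (SPM-like shape parameters)."""
            return gamma.pdf(t, 6.0) - gamma.pdf(t, 16.0) / 6.0

        def find_event_onset(signal, fs, duration_s=10.0):
            """Return the onset time (s) whose boxcar*HRF regressor best fits
            the channel in a least-squares sense. Brute-force, single-event."""
            n = len(signal)
            hrf = canonical_hrf(np.arange(0.0, 30.0, 1.0 / fs))
            width = int(duration_s * fs)
            best_rss, best_onset = np.inf, 0
            for onset in range(n - width):
                box = np.zeros(n)
                box[onset:onset + width] = 1.0
                reg = np.convolve(box, hrf)[:n]
                X = np.column_stack([reg, np.ones(n)])   # regressor + intercept
                beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
                rss = float(np.sum((signal - X @ beta) ** 2))
                if rss < best_rss:
                    best_rss, best_onset = rss, onset
            return best_onset / fs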

  20. Factors contributing to commercial vehicle rear-end conflicts in China: A study using on-board event data recorders.

    Science.gov (United States)

    Bianchi Piccinini, Giulio; Engström, Johan; Bärgman, Jonas; Wang, Xuesong

    2017-09-01

    In the last 30 years, China has undergone a dramatic increase in vehicle ownership and a resulting escalation in the number of road crashes. Although crash figures are decreasing today, they remain high; it is therefore important to investigate crash causation mechanisms to further improve road safety in China. To shed more light on the topic, naturalistic driving data was collected in Shanghai as part of the evaluation of a behavior-based safety service. The data collection included instrumenting 47 vehicles belonging to a commercial fleet with data acquisition systems. From the overall sample, 91 rear-end crash or near-crash (CNC) events, triggered by 24 drivers, were used in the analysis. The CNC were annotated by three researchers, through an expert assessment methodology based on videos and kinematic variables. The results show that the main factor behind the rear-end CNC was the adoption of very small safety margins. In contrast to results from previous studies in the US, the following vehicles' drivers typically had their eyes on the road and reacted quickly in response to the evolving conflict in most events. When delayed reactions occurred, they were mainly due to driving-related visual scanning mismatches (e.g., mirror checks) rather than visual distraction. Finally, the study identified four main conflict scenarios that represent the typical development of rear-end conflicts in this data. The findings of this study have several practical applications, such as informing the specifications of in-vehicle safety measures and automated driving and providing input into the design of coaching/training procedures to improve the driving habits of drivers. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Data processing of natural and induced events recorded at the seismic station Ostrava-Krásné Pole (OKC)

    Directory of Open Access Journals (Sweden)

    Novák Josef

    2001-09-01

    Full Text Available The operation of the seismic station Ostrava-Krásné Pole (OKC) (φ = 49.8352°N; λ = 18.1422°E), which is situated at present in an experimental gallery near the Ostrava planetarium, started in the year 1983, the station being equipped initially with analogue instrumentation. Modernization of the instrumentation was aimed at the installation of a new digital data acquisition system and the respective software packages for data interpretation and transmission. The VISTEC data acquisition system is based on a PC and enables continuous recording of three-component short-period and medium-period systems with a sampling frequency of 20 Hz. The Linux operating system adopted allows remote access (telnet) and transmission of the recorded data (ftp). Possible troubles in the station's operation can be quickly detected (even automatically), and all recorded data are available with minimum delay. Remote access also makes it possible to change the parameters of the measuring set-up. The standard form of the output data allows the application of standard software packages for visualisation and evaluation. The following formats are available: GSE2/CM6, GSE2/INT and MiniSEED. The output data sets can be compressed by a special procedure. For interactive interpretation of digital seismic data, the software package EVENT developed at the Geophysical Institute AS CR and the package WAVE developed at the Institute of Geonics AS CR are used. Experimental operation of digital seismographs at the OKC station confirmed the justification of its incorporation into the Czech national seismological network (CNSN). Preliminary analysis of the digital data showed that the following groups of seismic events are recorded: earthquakes, induced seismic events from Polish copper and coal mines, induced seismic events from the Ostrava-Karviná Coal Basin, quarry blasts and weak regional seismic events of the

  2. The intelligent data recorder

    International Nuclear Information System (INIS)

    Kojima, Mamoru; Hidekuma, Sigeru.

    1985-01-01

    The intelligent data recorder has been developed for data acquisition from a microwave interferometer. The standard 'RS-232C' interface is used for data transmission to the host computer, so it is easy to connect to any computer that has a general-purpose serial port. In this report, the characteristics of the intelligent data recorder and the way the software was developed are described. (author)

  3. Data analysis of event tape and connection

    International Nuclear Information System (INIS)

    Gong Huili

    1995-01-01

    The data analysis on the VAX-11/780 computer is briefly described; the data come from event tapes recorded by the JUHU data acquisition system on the PDP-11/44 computer. The connection of the recorded event tapes of the XSYS data acquisition system on the VAX computer is also introduced.

  4. Financial impact of inaccurate Adverse Event recording post Hip Fracture surgery: Addendum to 'Adverse event recording post hip fracture surgery'.

    Science.gov (United States)

    Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian

    2018-02-15

    A 2011 study (Doody et al., Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In that study, a single-centre university hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month (August-September 2011) period, with 55 adverse events recorded prospectively in contrast to the HIPE record of 13 (23.6%) adverse events. With the recent change in the Irish hospital funding model from a block grant to 'activity-based funding' on the basis of case load and case complexity, the hospital financial allocation is dependent on accurate case complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' over 2 months was calculated when the ward-based, prospectively collected data were compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.

  5. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    Science.gov (United States)

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, (3) splitting those observations into two observations: one where the event occurs and one where the event does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. Copyright © 2016 Elsevier Inc. All rights reserved.
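
    The IPCW recipe itself is simple to express. Below is a minimal sketch, not the paper's exact implementation, that estimates the censoring survival function with a Kaplan-Meier fit, keeps subjects whose 5-year status is known, and weights them by the inverse probability of remaining uncensored; it assumes the lifelines and scikit-learn packages.

        import numpy as np
        from lifelines import KaplanMeierFitter
        from sklearn.ensemble import RandomForestClassifier

        def fit_ipcw_classifier(X, time, event, horizon=5.0):
            """Binary classifier for P(event by `horizon`) from right-censored
            data, via inverse probability of censoring weighting (IPCW)."""
            # Kaplan-Meier estimate of the censoring survival function G(t):
            # censoring is treated as the "event" here.
            kmf = KaplanMeierFitter()
            kmf.fit(time, event_observed=1 - event)

            # Status is known if the event occurred before the horizon or the
            # subject was still observed at the horizon; early-censored subjects
            # are dropped, and the rest are up-weighted by 1/G to compensate.
            known = ((event == 1) & (time <= horizon)) | (time >= horizon)
            label = ((event == 1) & (time <= horizon)).astype(int)
            g = kmf.survival_function_at_times(np.minimum(time, horizon)).to_numpy()
            weights = 1.0 / np.clip(g, 1e-3, None)

            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            clf.fit(X[known], label[known], sample_weight=weights[known])
            return clf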

  6. Major events in Neogene oxygen isotopic records

    International Nuclear Information System (INIS)

    Kennett, J.P.; Hodell, D.A.

    1986-01-01

    Changes in oxygen isotopic ratios of foraminiferal calcite during the Cainozoic have been one of the primary tools for investigating the history of Arctic and Antarctic glaciation, although interpretations of the oxygen isotopic record differ markedly. The ambiguity in interpretation results mainly from the difficulty of partitioning temperature from ice-volume effects in δ18O changes. Oxygen isotopic records for the Cainozoic show an increase in δ18O values towards the present, reflecting gradual cooling and increased glaciation of the Earth's climate since the late Cretaceous. A variety of core material from the South Atlantic and South-west Pacific oceans is investigated. This composite data set is one of the most complete available with which to evaluate the evolution of glaciation during the Neogene. Expansion of ice shelves in Antarctica undoubtedly accompanied the increased glaciation of the northern hemisphere, since eustatic sea-level lowering would positively reinforce ice growth on Antarctica

  7. NPP unusual events: data, analysis and application

    International Nuclear Information System (INIS)

    Tolstykh, V.

    1990-01-01

    The subject of the paper is the IAEA's cooperative treatment of unusual-event data and the utilization of operating-safety experience feedback. The Incident Reporting System (IRS) and the Analysis of Safety Significant Event Team (ASSET) are discussed. The IRS methodology for the collection, handling, assessment and dissemination of data on NPP unusual events (deviations, incidents and accidents) occurring during operation, surveillance and maintenance is outlined through the report gathering and issuing practice, the expert assessment procedures and the parameters of the system. After 7 years of existence the IAEA-IRS contains over 1000 reports and receives 1.5-4% of the total information on unusual events. The author considers the reports only as detailed technical 'records' of events requiring assessment. The ASSET approach, which implies an in-depth analysis of occurrences directed towards level-1 PSA utilization, is commented on. The experts evaluated root causes for the reported events, and some trends are presented. Generally, internal events due to unexpected paths of water in the nuclear installations, occurrences related to the integrity of the primary heat transport systems, events associated with the engineered safety systems and events involving human factors represent the large groups deserving close attention. Personal recommendations on how to use event-related information for NPP safety improvement are given. 2 tabs (R.Ts)

  8. 49 CFR 229.135 - Event recorders.

    Science.gov (United States)

    2010-10-01

    ...) Distance; (v) Throttle position; (vi) Applications and operations of the train automatic air brake; (vii... automatic air brake, including emergency applications. The system shall record, or provide a means of... responsive to a command originating from or executed by an on-board computer (e.g., electronic braking system...

  9. Text mining electronic health records to identify hospital adverse events

    DEFF Research Database (Denmark)

    Gerdes, Lars Ulrik; Hardahl, Christian

    2013-01-01

    Manual reviews of health records to identify possible adverse events are time consuming. We are developing a method based on natural language processing to quickly search electronic health records for common triggers and adverse events. Our results agree fairly well with those obtained using manu...

  10. Web-based online system for recording and examining of events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    Occurrence of events in power plants could result in serious drawbacks in the generation of power, which makes online recording and examining of events highly important. In this paper an online web-based system is introduced, which records and examines events in power plants. Throughout the paper, the procedures for the design and implementation of this system, its features and the results gained are explained. This system provides predefined levels of online access to all event data for all its users in power plants, dispatching, regional utilities and top-level management. By implementation of an electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online recording and examining system for events offers the following advantages: online recording of events in power plants; examining of events in regional utilities; access to event data; preparation of managerial reports.

  11. Fire Event Data from Licensee Event Reports

    Data.gov (United States)

    Nuclear Regulatory Commission — The purpose of this study data is to provide a metric with which to assess the effectiveness of improvements to the U.S. NRC's fire protection regulations in support...

  12. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  13. Digital event recorder capable of simple computations and with ...

    African Journals Online (AJOL)

    C.W. Way-Jones, Department of Physics and Electronics, Rhodes University. ... In sensitive or critical experiments it is frequently necessary to have ... while decreasing their size and cost. ... A cheap method of recording behavioural events, ...

  14. Late Eocene impact events recorded in deep-sea sediments

    Science.gov (United States)

    Glass, B. P.

    1988-01-01

    Raup and Sepkoski proposed that mass extinctions have occurred every 26 Myr during the last 250 Myr. In order to explain this 26 Myr periodicity, it was proposed that the mass extinctions were caused by periodic increases in cometary impacts. One method to test this hypothesis is to determine if there were periodic increases in impact events (based on crater ages) that correlate with mass extinctions. A way to test the hypothesis that mass extinctions were caused by periodic increases in impact cratering is to look for evidence of impact events in deep-sea deposits. This method allows direct observation of the temporal relationship between impact events and extinctions as recorded in the sedimentary record. There is evidence in the deep-sea record for two (possibly three) impact events in the late Eocene. The younger event, represented by the North American microtektite layer, is not associated with an Ir anomaly. The older event, defined by the cpx spherule layer, is associated with an Ir anomaly. However, neither of the two impact events recorded in late Eocene deposits appears to be associated with an unusual number of extinctions. Thus there is little evidence in the deep-sea record for an impact-related mass extinction in the late Eocene.

  15. Big Data tools as applied to ATLAS event data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2017-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and to...

  16. SMOS data and extreme events

    Science.gov (United States)

    Kerr, Yann; Wigneron, Jean-Pierre; Ferrazzoli, Paolo; Mahmoodi, Ali; Al-Yaari, Amen; Parrens, Marie; Bitar, Ahmad Al; Rodriguez-Fernandez, Nemesio; Bircher, Simone; Molero-rodenas, Beatriz; Drusch, Matthias; Mecklenburg, Susanne

    2017-04-01

    The SMOS (Soil Moisture and Ocean Salinity) satellite was successfully launched in November 2009. This ESA-led Earth Observation mission is dedicated to providing soil moisture over continental surfaces (with an accuracy goal of 0.04 m³/m³), vegetation water content over land, and ocean salinity. These geophysical features are important as they control the energy balance between the surface and the atmosphere. Their knowledge at a global scale is of interest for climate and weather research, in particular for improving model forecasts. The Soil Moisture and Ocean Salinity mission has now been collecting data for over 7 years. The whole data set has been reprocessed (version 620 for levels 1 and 2 and version 3 for level 3 CATDS), operational near-real-time soil moisture data is now available, and assimilation of SMOS data in NWP has proved successful. After 7 years it seems important to start using the data to look at anomalies and see how they relate to large-scale events. We have also produced a 15-year soil moisture data set by merging SMOS and AMSR using a neural network approach. The purpose of this communication is to present the mission results after more than seven years in orbit in a climatic-trend perspective, as anomalies can be detected over such a period. Thereby we benefit from consistent data sets provided through the latest reprocessing using the most recent algorithm enhancements. Using the above-mentioned products it is possible to follow large events such as the evolution of droughts in North America, or water fraction evolution over the Amazonian basin. On this occasion we will focus on the analysis of SMOS and ancillary product anomalies to reveal two climatic trends: the temporal evolution of water storage over the Indian subcontinent in relation to rainfall anomalies, and the global impact of El Niño type events on the general water storage distribution. This presentation shows in detail the use of long-term data sets

  17. Data processing of natural and induced events recorded at the seismic station Ostrava-Krásné Pole (OKC)

    OpenAIRE

    Novák Josef; Rušajová Jana; Holub Karel; Kejzlík Jaromír

    2001-01-01

    The operation of the seismic station Ostrava-Krásné Pole (OKC) (φ = 49.8352°N; λ = 18.1422°E), which is situated at present in an experimental gallery nearby the Ostrava planetarium, started in the year 1983, being equipped initially with analogue instrumentation. Modernization of instrumentation at the station was aimed at the installation of a new digital data acquisition system and the respective software packages for data interpretation and transmission. Data acquisition system VISTEC is b...

  18. Fast event recorder utilizing a CCD analog shift register

    International Nuclear Information System (INIS)

    Ducar, R.J.; McIntyre, P.M.

    1978-01-01

    A system of electronics has been developed to allow the capture and recording of relatively fast, low-amplitude analog events. The heart of the system is a dual 455-cell analog shift register charge-coupled device, Fairchild CCD321ADC-3. The CCD is operated in a dual clock mode. The input is sampled at a selectable clock rate of 0.25-20 MHz. The stored analog data is then clocked out at a slower rate, typically about 0.25 MHz. The time base expansion of the analog data allows for analog-to-digital conversion and memory storage using conventional medium-speed devices. The digital data is sequentially loaded into a static RAM and may then be block transferred to a computer. The analog electronics are housed in a single-width NIM module, and the RAM memory in a single-width CAMAC module. Each pair of modules provides six parallel channels. Cost is about $200.00 per channel. Applications are described for ionization imaging (TPC, IRC) and long-drift calorimetry in liquid argon
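
    The dual-clock numbers quoted above fix the time-base expansion directly. As a worked check, assuming the full 455-cell register, the fastest 20 MHz input clock and the typical 0.25 MHz readout clock:

        $t_{\mathrm{capture}} = \frac{455}{20\,\mathrm{MHz}} \approx 22.8\,\mu\mathrm{s}, \qquad t_{\mathrm{readout}} = \frac{455}{0.25\,\mathrm{MHz}} = 1.82\,\mathrm{ms}, \qquad \frac{f_{\mathrm{in}}}{f_{\mathrm{out}}} = \frac{20\,\mathrm{MHz}}{0.25\,\mathrm{MHz}} = 80$

    so a roughly 23 µs analog window is stretched eighty-fold, slow enough for conventional medium-speed ADCs and memory.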

  19. Insertable cardiac event recorder in detection of atrial fibrillation after cryptogenic stroke: an audit report.

    Science.gov (United States)

    Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas

    2013-07-01

    Atrial fibrillation (AF) is the most frequent risk factor in ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder in the detection of AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke and eligibility for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up for 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients, an event recorder was inserted. After 1 year, in 6 of 22 patients (27.3%), AF was detected. These preliminary data show that insertion of a cardiac event recorder was feasible in approximately one third of patients with cryptogenic stroke and detected new AF in approximately one quarter of these patients.

  20. Extreme Drought Events Revealed in Amazon Tree Ring Records

    Science.gov (United States)

    Jenkins, H. S.; Baker, P. A.; Guilderson, T. P.

    2010-12-01

    The Amazon basin is a center of deep atmospheric convection and thus acts as a major engine for global hydrologic circulation. Yet despite its significance, a full understanding of Amazon rainfall variability remains elusive due to a poor historical record of climate. Temperate tree rings have been used extensively to reconstruct climate over the last thousand years, however less attention has been given to the application of dendrochronology in tropical regions, in large part due to a lower frequency of tree species known to produce annual rings. Here we present a tree ring record of drought extremes from the Madre de Dios region of southeastern Peru over the last 190 years. We confirm that tree ring growth in species Cedrela odorata is annual and show it to be well correlated with wet season precipitation. This correlation is used to identify extreme dry (and wet) events that have occurred in the past. We focus on drought events identified in the record as drought frequency is expected to increase over the Amazon in a warming climate. The Cedrela chronology records historic Amazon droughts of the 20th century previously identified in the literature and extends the record of drought for this region to the year 1816. Our analysis shows that there has been an increase in the frequency of extreme drought (mean recurrence interval = 5-6 years) since the turn of the 20th century and both Atlantic and Pacific sea surface temperature (SST) forcing mechanisms are implicated.

  1. Multi-jet event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a high-multiplicity collision event observed by the CMS detector in the search for microscopic black holes, in collision data recorded in 2015. The event contains 12 jets with transverse momenta greater than 50 GeV each, and the mass of this system is 6.4 TeV. The scalar sum of the transverse energies of all energetic objects in the event (including missing transverse energy) is 5.4 TeV.

  2. A diary after dinner: How the time of event recording influences later accessibility of diary events.

    Science.gov (United States)

    Szőllősi, Ágnes; Keresztes, Attila; Conway, Martin A; Racsmány, Mihály

    2015-01-01

    Recording the events of a day in a diary may help improve their later accessibility. An interesting question is whether improvements in long-term accessibility will be greater if the diary is completed at the end of the day, or after a period of sleep, the following morning. We investigated this question using an internet-based diary method. On each of five days, participants (n = 109) recorded autobiographical memories for that day or for the previous day. Recording took place either in the morning or in the evening. Following a 30-day retention interval, the diary events were free recalled. We found that participants who recorded their memories in the evening before sleep had best memory performance. These results suggest that the time of reactivation and recording of recent autobiographical events has a significant effect on the later accessibility of those diary events. We discuss our results in the light of related findings that show a beneficial effect of reduced interference during sleep on memory consolidation and reconsolidation.

  3. Big Data Tools as Applied to ATLAS Event Data

    Science.gov (United States)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge, the Tracking challenge, Event viewers (VP1, ATLANTIS, ATLASrift), and still to be developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs having containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of standard model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files, the performance in simple cut flow type of data analysis, and will present preliminary results on its scaling
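
    A cut-flow style query over indexed events is easy to sketch against Elasticsearch. The example below uses illustrative field names and assumes a local cluster and the elasticsearch-py 8.x client; it is not the actual xAOD-derived document mapping. It bulk-indexes toy event summaries and counts those passing a jet-multiplicity and missing-energy selection.

        from elasticsearch import Elasticsearch, helpers

        es = Elasticsearch("http://localhost:9200")

        # Illustrative per-event document mapping, not the real xAOD schema.
        es.indices.create(index="events", mappings={"properties": {
            "run":        {"type": "integer"},
            "event":      {"type": "long"},
            "n_jets":     {"type": "integer"},
            "missing_et": {"type": "float"},   # GeV
        }})

        docs = ({"_index": "events",
                 "_source": {"run": 300800, "event": i,
                             "n_jets": i % 7, "missing_et": 10.0 * (i % 30)}}
                for i in range(10_000))
        helpers.bulk(es, docs)

        # One cut-flow step: events with >= 4 jets and MET > 100 GeV.
        resp = es.count(index="events", query={"bool": {"filter": [
            {"range": {"n_jets":     {"gte": 4}}},
            {"range": {"missing_et": {"gt": 100.0}}},
        ]}})
        print(resp["count"], "events pass the selection")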

  4. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  5. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis

  6. Visual pattern discovery in timed event data

    Science.gov (United States)

    Schaefer, Matthias; Wanner, Franz; Mansmann, Florian; Scheible, Christian; Stennett, Verity; Hasselrot, Anders T.; Keim, Daniel A.

    2011-01-01

    Business processes have tremendously changed the way large companies conduct their business: The integration of information systems into the workflows of their employees ensures a high service level and thus high customer satisfaction. One core aspect of business process engineering is events that steer the workflows and trigger internal processes. The ordinal character of such events relaxes the strict requirements on interval-scaled temporal patterns that are common in time series. It is this additional degree of freedom that opens unexplored possibilities for visualizing event data. In this paper, we present a flexible and novel system to find significant events, event clusters and event patterns. Each event is represented as a small rectangle, which is colored according to categorical, ordinal or interval-scaled metadata. Depending on the analysis task, different layout functions are used to highlight either the ordinal character of the data or temporal correlations. The system has built-in features for ordering customers or event groups according to the similarity of their event sequences, temporal gap alignment and stacking of co-occurring events. Two characteristically different case studies dealing with business process events and news articles demonstrate the capabilities of our system to explore event data.

  7. An open ocean record of the Toarcian oceanic anoxic event

    Directory of Open Access Journals (Sweden)

    D. R. Gröcke

    2011-11-01

    Full Text Available Oceanic anoxic events were time intervals in the Mesozoic characterized by widespread distribution of marine organic matter-rich sediments (black shales) and significant perturbations in the global carbon cycle. These perturbations are globally recorded in sediments as carbon isotope excursions irrespective of lithology and depositional environment. During the early Toarcian, black shales were deposited on the epi- and pericontinental shelves of Pangaea, and these sedimentary rocks are associated with a pronounced (ca. 7‰) negative (organic) carbon isotope excursion (CIE) which is thought to be the result of a major perturbation in the global carbon cycle. For this reason, the lower Toarcian is thought to represent an oceanic anoxic event (the T-OAE). If the T-OAE was indeed a global event, an isotopic expression of this event should be found beyond the epi- and pericontinental Pangaean localities. To address this issue, the carbon isotope composition of organic matter (δ13Corg) of lower Toarcian organic matter-rich cherts from Japan, deposited in the open Panthalassa Ocean, was analysed. The results show the presence of a major (>6‰) negative excursion in δ13Corg that, based on radiolarian biostratigraphy, is a correlative of the lower Toarcian negative CIE known from Pangaean epi- and pericontinental strata. A smaller negative excursion in δ13Corg (ca. 2‰) is recognized lower in the studied succession. This excursion may, within the current biostratigraphic resolution, represent the excursion recorded in European epicontinental successions close to the Pliensbachian/Toarcian boundary. These results from the open ocean realm suggest, in conjunction with other previously published datasets, that these Early Jurassic carbon cycle perturbations affected the active global reservoirs of the exchangeable carbon cycle (deep marine, shallow marine, atmospheric).

  8. The Great Oxidation Event Recorded in Paleoproterozoic Rocks from Fennoscandia

    Directory of Open Access Journals (Sweden)

    Dmitry V. Rychanchik

    2010-04-01

    With support of the International Continental Scientific Drilling Program (ICDP) and other funding organizations, the Fennoscandia Arctic Russia – Drilling Early Earth Project (FAR-DEEP) operations were successfully completed during 2007. A total of 3650 meters of core have been recovered from fifteen holes drilled through sedimentary and volcanic formations in Fennoscandia (Fig. 1), recording several global environmental changes spanning the time interval 2500–2000 Ma, including the Great Oxidation Event (GOE) (Holland, 2002). The core has since been curated and archived in Trondheim, Norway, and has been sampled by an international team of scientists.

  9. Bidirectional RNN for Medical Event Detection in Electronic Health Records.

    Science.gov (United States)

    Jagannatha, Abhyuday N; Yu, Hong

    2016-06-01

    Sequence labeling for extraction of medical events and their attributes from unstructured text in Electronic Health Record (EHR) notes is a key step towards semantic understanding of EHRs. It has important applications in health informatics, including pharmacovigilance and drug surveillance. The state-of-the-art supervised machine learning models in this domain are based on Conditional Random Fields (CRFs) with features calculated from fixed context windows. In this application, we explored recurrent neural network frameworks and show that they significantly outperformed the CRF models.
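
    As a rough sketch of the kind of model the abstract compares against CRFs, the following is a minimal bidirectional LSTM sequence labeler in PyTorch; the vocabulary size, layer widths and tag count are invented, and the paper's exact architecture and features are not reproduced.

        import torch
        import torch.nn as nn

        class BiLSTMTagger(nn.Module):
            def __init__(self, vocab_size, n_tags, emb_dim=64, hidden=128):
                super().__init__()
                self.emb = nn.Embedding(vocab_size, emb_dim)
                # reads the note both left-to-right and right-to-left
                self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                    bidirectional=True)
                self.out = nn.Linear(2 * hidden, n_tags)  # one score per BIO tag

            def forward(self, token_ids):            # token_ids: (batch, seq_len)
                h, _ = self.lstm(self.emb(token_ids))
                return self.out(h)                   # (batch, seq_len, n_tags)

        tagger = BiLSTMTagger(vocab_size=10000, n_tags=5)
        scores = tagger(torch.randint(0, 10000, (2, 30)))  # random token ids
        print(scores.shape)  # torch.Size([2, 30, 5])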

  10. Event displays in 13 TeV data

    CERN Document Server

    CMS Collaboration

    2016-01-01

    This performance note presents some illustrative event displays at a center-of-mass energy of 13 TeV. The data set consists of the first proton-proton collision data recorded by the CMS detector in 2015 with a magnetic field of 3.8 T.

  11. Event displays in 13 TeV data

    CERN Document Server

    CMS Collaboration

    2016-01-01

    This performance note presents some illustrative event displays together with kinematic quantities for diboson production candidates at a center-of-mass energy of 13 TeV. The data set consists of the proton-proton collision data recorded by the CMS detector in 2016 with a magnetic field of 3.8 T.

  12. Event displays at 13 TeV of 2016 data

    CERN Document Server

    CMS Collaboration

    2017-01-01

    This performance note presents some illustrative event displays at a center-of-mass energy of 13 TeV. The data set consists of the proton-proton collision data recorded by the CMS detector in 2016 with a magnetic field of 3.8 T.

  13. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  14. Cognitive complexity of the medical record is a risk factor for major adverse events.

    Science.gov (United States)

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
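
    The paper does not publish the Complexity Ruler's implementation; as a rough stand-in for "amount of data in the record, in bits", one could use the compressed size of the record text, since compression tracks information content better than raw character counts. The function name and sample note below are hypothetical.

        import zlib

        def ccmr_bits(record_text: str) -> int:
            # compressed length approximates the information content of the note
            return 8 * len(zlib.compress(record_text.encode("utf-8")))

        print(ccmr_bits("Na 140 mmol/L, K 4.1 mmol/L; afebrile; plan: continue abx."))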

  15. Feasibility study on the acquisition of licensee event data

    International Nuclear Information System (INIS)

    Kato, W.Y.; Hall, R.E.; Teichmann, T.; Taylor, J.; Luckas, W.J. Jr.; Saha, P.; Samanta, P.; Fragola, J.

    1983-01-01

    The objective of the study was to assess the feasibility of modifying the LER reporting system as proposed by NRC-AEOD, and/or of developing an alternative plan that would in addition collect information about significant events amenable to statistical analysis, such as multi-case, multivariate analysis. The study indicated that the LERs constitute reports on a large variety of events, which in most cases involve many different plant parameters, both measured and currently unmeasured, that characterize the event. In order to determine the event-specific plant parameters required for statistical and deterministic analysis, a data matrix approach was used to identify those parameters which are currently being recorded, those which could be measured and recorded, and those which are required for certain types of events; events involving thermal-hydraulics and neutronics were taken as illustrative of events requiring in-depth analysis. Also included in the study was a review of INPO's Nuclear Plant Reliability Data System; NASA's Problem Reporting and Corrective Action (PRACA) program; Electricite de France's KIT system, an automatic computer-based reactor parameter monitoring and recording system; and the regulatory relationship between the FAA and the commercial airline industry

  16. Proxy records of Holocene storm events in coastal barrier systems: Storm-wave induced markers

    Science.gov (United States)

    Goslin, Jérôme; Clemmensen, Lars B.

    2017-10-01

    Extreme storm events in the coastal zone are one of the main forcing agents of short-term coastal system behavior. As such, storms represent a major threat to human activities concentrated along the coasts worldwide. In order to better understand the frequency of extreme events like storms, climate science must rely on records longer than the century-scale records of instrumental weather data. Proxy records of storm-wave or storm-wind induced activity in coastal barrier system deposits have been widely used worldwide in recent years to document past storm events during the last millennia. This review provides a detailed state-of-the-art compilation of the proxies available from coastal barrier systems to reconstruct Holocene storm chronologies (paleotempestology). The present paper aims (i) to describe the erosional and depositional processes caused by storm-wave action in barrier and back-barrier systems (i.e. beach ridges, storm scarps and washover deposits), (ii) to understand how storm records can be extracted from barrier and back-barrier sedimentary bodies using stratigraphical, sedimentological, micro-paleontological and geochemical proxies and (iii) to show how to obtain chronological control on past storm events recorded in the sedimentary successions. The challenges that paleotempestology studies still face in the reconstruction of representative and reliable storm chronologies using these various proxies are discussed, and future research prospects are outlined.

  17. Data Bookkeeping Service 3 - Providing event metadata in CMS

    CERN Document Server

    Giffels, Manuel; Riley, Daniel

    2014-01-01

    The Data Bookkeeping Service 3 provides a catalog of event metadata for Monte Carlo and recorded data of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN, Geneva. It comprises all necessary information for tracking datasets, their processing history and associations between runs, files and datasets, on a large scale of about 200,000 datasets and more than 40 million files, which adds up to around 700 GB of metadata. The DBS is an essential part of the CMS Data Management and Workload Management (DMWM) systems; all kinds of data processing, such as Monte Carlo production, processing of recorded event data and physics analysis by users, rely heavily on the information stored in DBS.

  18. Data Bookkeeping Service 3 - Providing Event Metadata in CMS

    Energy Technology Data Exchange (ETDEWEB)

    Giffels, Manuel [CERN; Guo, Y. [Fermilab; Riley, Daniel [Cornell U.

    2014-01-01

    The Data Bookkeeping Service 3 provides a catalog of event metadata for Monte Carlo and recorded data of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN, Geneva. It comprises all necessary information for tracking datasets, their processing history and associations between runs, files and datasets, on a large scale of about 200,000 datasets and more than 40 million files, which adds up to around 700 GB of metadata. The DBS is an essential part of the CMS Data Management and Workload Management (DMWM) systems [1]; all kinds of data processing, such as Monte Carlo production, processing of recorded event data and physics analysis by users, rely heavily on the information stored in DBS.

  19. Recordable storage medium with protected data area

    NARCIS (Netherlands)

    2005-01-01

    The invention relates to a method of storing data on a rewritable data storage medium, to a corresponding storage medium, to a corresponding recording apparatus and to a corresponding playback apparatus. Copy-protective measures require that on rewritable storage media some data must be stored which

  20. Evaluation of Data Recording at Teaching Hospitals

    Directory of Open Access Journals (Sweden)

    Hasan Karbasi

    2009-02-01

    Background and purpose: Medical records of patients have an undeniable role in education, research and evaluation of health care delivery, and can also be used as reliable documents in cases of patients’ legal complaints. This study was done to evaluate medical data recording at the teaching hospitals of Birjand University of Medical Sciences in 2004. Methods: In this descriptive-analytic study, 527 records of patients who had been discharged from general wards of the hospitals after 24 hours of hospitalization were randomly selected. Eighteen standard record titles included in each patient’s record were evaluated using checklists. Data were analyzed using frequency distribution tables, independent t-test and Chi-square test. Results: Items on the record titles were completed in a range of 0-100%. The titles for neonates and nursing care, with 96% completeness, were the most complete. The titles for recovery, pre-delivery care, medical history, summary, and progress notes, with 50% to 74% completeness, were categorized as moderately complete; and the titles for vital signs, pre-operation care and operation reports were weak. Records from the infectious diseases ward were the most complete (68%), and the least complete were from the ophthalmology ward (35.8%). There were significant differences between the hospitals and between different wards. Conclusion: The results of this study show the need for further education on record writing, taking medical histories, and order writing, and more importantly the need for a system of continuous monitoring of the records. Keywords: MEDICAL RECORD, TEACHING HOSPITAL, EVALUATION

  1. Reporting, Recording, and Transferring Contingency Demand Data

    National Research Council Canada - National Science Library

    Smith, Bernard

    2000-01-01

    .... In this report, we develop a standard set of procedures for reporting and recording demand data at the contingency location and transferring contingency demand data to the home base - ensuring proper level allocation and valid worldwide peacetime operating stock (POS) and readiness spares package (RSP) requirements.

  2. Recurrent process mining with live event data

    NARCIS (Netherlands)

    Syamsiyah, A.; van Dongen, B.F.; van der Aalst, W.M.P.; Teniente, E.; Weidlich, M.

    2018-01-01

    In organizations, process mining activities are typically performed in a recurrent fashion, e.g. once a week, an event log is extracted from the information systems and a process mining tool is used to analyze the process’ characteristics. Typically, process mining tools import the data from a

  3. Electronic Health Record-Related Events in Medical Malpractice Claims.

    Science.gov (United States)

    Graber, Mark L; Siegal, Dana; Riah, Heather; Johnston, Doug; Kenyon, Kathy

    2015-11-06

    There is widespread agreement that the full potential of health information technology (health IT) has not yet been realized and of particular concern are the examples of unintended consequences of health IT that detract from the safety of health care or from the use of health IT itself. The goal of this project was to obtain additional information on these health IT-related problems, using a mixed methods (qualitative and quantitative) analysis of electronic health record-related harm in cases submitted to a large database of malpractice suits and claims. Cases submitted to the CRICO claims database and coded during 2012 and 2013 were analyzed. A total of 248 cases (<1%) involving health IT were identified and coded using a proprietary taxonomy that identifies user- and system-related sociotechnical factors. Ambulatory care accounted for most of the cases (146 cases). Cases were most typically filed as a result of an error involving medications (31%), diagnosis (28%), or a complication of treatment (31%). More than 80% of cases involved moderate or severe harm, although lethal cases were less likely in cases from ambulatory settings. Etiologic factors spanned all of the sociotechnical dimensions, and many recurring patterns of error were identified. Adverse events associated with health IT vulnerabilities can cause extensive harm and are encountered across the continuum of health care settings and sociotechnical factors. The recurring patterns provide valuable lessons that both practicing clinicians and health IT developers could use to reduce the risk of harm in the future. The likelihood of harm seems to relate more to a patient's particular situation than to any one class of error.

  4. Sequence Synopsis: Optimize Visual Summary of Temporal Event Data.

    Science.gov (United States)

    Chen, Yuanzhe; Xu, Panpan; Ren, Liu

    2018-01-01

    Event sequence analysis plays an important role in many application domains such as customer behavior analysis, electronic health record analysis and vehicle fault diagnosis. Real-world event sequence data is often noisy and complex with high event cardinality, making it a challenging task to construct concise yet comprehensive overviews for such data. In this paper, we propose a novel visualization technique based on the minimum description length (MDL) principle to construct a coarse-level overview of event sequence data while balancing the information loss in it. The method addresses a fundamental trade-off in visualization design: reducing visual clutter vs. increasing the information content in a visualization. The method enables simultaneous sequence clustering and pattern extraction and is highly tolerant to noises such as missing or additional events in the data. Based on this approach we propose a visual analytics framework with multiple levels-of-detail to facilitate interactive data exploration. We demonstrate the usability and effectiveness of our approach through case studies with two real-world datasets. One dataset showcases a new application domain for event sequence visualization, i.e., fault development path analysis in vehicles for predictive maintenance. We also discuss the strengths and limitations of the proposed method based on user feedback.
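
    The MDL trade-off the abstract describes can be caricatured as follows, under strongly simplifying assumptions (costs counted in events rather than true bits, event order ignored): a summary is a set of patterns plus per-sequence corrections, and the preferred summary minimizes their combined cost. All names and data are invented.

        # Toy MDL score: cost of the patterns + cost of per-sequence corrections.
        def description_length(patterns, sequences, assign):
            # assign[i] is the index of the pattern summarizing sequences[i]
            model_cost = sum(len(p) for p in patterns)
            corrections = 0
            for i, seq in enumerate(sequences):
                p = patterns[assign[i]]
                # events present in one but not the other must be spelled out
                corrections += len(set(seq) ^ set(p))
            return model_cost + corrections

        seqs = [list("abcd"), list("abce"), list("abcf")]
        coarse = description_length([list("abc")], seqs, [0, 0, 0])
        fine = description_length(seqs, seqs, [0, 1, 2])
        print(coarse, fine)  # 6 12: the coarse summary wins despite corrections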

  5. Changes in record-breaking temperature events in China and projections for the future

    Science.gov (United States)

    Deng, Hanqing; Liu, Chun; Lu, Yanyu; He, Dongyan; Tian, Hong

    2017-06-01

    As global warming intensifies, more record-breaking (RB) temperature events are reported in many places around the world where temperatures are higher than ever before. The RB temperatures have caused severe impacts on ecosystems and human society. Here, we address changes in RB temperature events occurring over China in the past (1961-2014) as well as future projections (2006-2100) using observational data and the newly available simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5). The number of RB events has a significant multi-decadal variability in China, and the intensity shows a strong decrease from 1961 to 2014. However, more frequent RB events occurred in mid-eastern and northeastern China over the last 30 years (1981-2010). Comparisons with observational data indicate that multi-model ensemble (MME) simulations from the CMIP5 models perform well in simulating RB events for the historical run period (1961-2005). The CMIP5 MME shows a relatively large uncertainty for the change in intensity. From 2051 to 2100, fewer RB events are projected to occur in most parts of China under the RCP 2.6 scenario. Over the longer period from 2006 to 2100, a remarkable increase is expected for the entire country under the RCP 8.5 scenario, with the maximum number of RB events increasing by approximately 600 per year at the end of the twenty-first century.
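
    A hedged sketch of the counting behind RB statistics: a year is record-breaking if its value exceeds every earlier value in the series. The temperature series below is synthetic, not the observational or CMIP5 data used in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        years = np.arange(1961, 2015)
        # synthetic annual means with a warming trend (illustrative only)
        temps = 0.02 * (years - 1961) + rng.normal(0, 0.5, years.size)

        running_max = -np.inf
        rb_years = []
        for y, t in zip(years, temps):
            if t > running_max:      # strictly higher than ever before
                rb_years.append(y)   # (the first year is trivially a record)
                running_max = t
        print(len(rb_years), rb_years[:5])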

  6. Analyzing time-ordered event data with missed observations.

    Science.gov (United States)

    Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P

    2017-09-01

    A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which conveys information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance of missing arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated values. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and spurious rate differences between sites, which are introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event data.
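
    Under simplifying assumptions (exponential true inter-event intervals, each event independently missed with probability p), the observed-interval density mentioned above becomes a mixture: an observed gap spanning k true intervals has weight (1-p)*p**(k-1), and the sum of k exponential intervals is gamma distributed. A sketch, not the authors' code:

        import numpy as np
        from scipy.stats import gamma

        def observed_interval_pdf(t, rate, p_miss, k_max=20):
            """pdf of intervals between *detected* events."""
            t = np.asarray(t, dtype=float)
            pdf = np.zeros_like(t)
            for k in range(1, k_max + 1):
                weight = (1 - p_miss) * p_miss ** (k - 1)   # k-1 events missed
                pdf += weight * gamma.pdf(t, a=k, scale=1 / rate)
            return pdf

        t = np.linspace(0.01, 10, 200)
        print(observed_interval_pdf(t, rate=1.0, p_miss=0.3).max())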

  7. Ocean Color and Earth Science Data Records

    Science.gov (United States)

    Maritorena, S.

    2014-12-01

    The development of consistent, high quality time series of biogeochemical products from a single ocean color sensor is a difficult task that involves many aspects related to pre- and post-launch instrument calibration and characterization, stability monitoring and the removal of the contribution of the atmosphere which represents most of the signal measured at the sensor. It is even more challenging to build Climate Data Records (CDRs) or Earth Science Data Records (ESDRs) from multiple sensors as design, technology and methodologies (bands, spectral/spatial resolution, Cal/Val, algorithms) differ from sensor to sensor. NASA MEaSUREs, ESA Climate Change Initiative (CCI) and IOCCG Virtual Constellation are some of the underway efforts that investigate or produce ocean color CDRs or ESDRs from the recent and current global missions (SeaWiFS, MODIS, MERIS). These studies look at key aspects of the development of unified data records from multiple sensors, e.g. the concatenation of the "best" individual records vs. the merging of multiple records or band homogenization vs. spectral diversity. The pros and cons of the different approaches are closely dependent upon the overall science purpose of the data record and its temporal resolution. While monthly data are generally adequate for biogeochemical modeling or to assess decadal trends, higher temporal resolution data records are required to look into changes in phenology or the dynamics of phytoplankton blooms. Similarly, short temporal resolution (daily to weekly) time series may benefit more from being built through the merging of data from multiple sensors while a simple concatenation of data from individual sensors might be better suited for longer temporal resolution (e.g. monthly time series). Several Ocean Color ESDRs were developed as part of the NASA MEaSUREs project. Some of these time series are built by merging the reflectance data from SeaWiFS, MODIS-Aqua and Envisat-MERIS in a semi-analytical ocean color

  8. Analysis of event-mode data with Interactive Data Language

    International Nuclear Information System (INIS)

    De Young, P.A.; Hilldore, B.B.; Kiessel, L.M.; Peaslee, G.F.

    2003-01-01

    We have developed an analysis package for event-mode data based on Interactive Data Language (IDL) from Research Systems Inc. This high-level language is high-speed, array-oriented, object-oriented, and has extensive visual (multi-dimensional plotting) and mathematical functions. We have developed a general framework, written in IDL, for the analysis of a variety of experimental data that does not require significant customization for each analysis. Unlike many traditional analysis packages, spectra and gates are applied after the data are read and are easily changed as the analysis proceeds, without rereading the data. The events are not sequentially processed into predetermined arrays subject to predetermined gates.
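
    The gating idea translates directly to other array-oriented languages; the sketch below (Python rather than IDL) keeps raw events in memory as arrays and applies gates as boolean masks after reading, so cuts can be redefined without rereading the data. Field names and distributions are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        # event-mode data: one record per event, e.g. two detector energies
        events = {"e1": rng.exponential(100, 100000),
                  "e2": rng.exponential(80, 100000)}

        gate = (events["e1"] > 50) & (events["e2"] < 200)  # redefinable anytime
        hist, edges = np.histogram(events["e1"][gate], bins=64, range=(0, 500))
        print(gate.sum(), hist[:5])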

  9. Data recording and processing in mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    McKown, H. [International Atomic Energy Agency, Vienna (Austria)

    1978-12-15

    When a mass spectrometer is to be acquired, it must be specified for a particular task. It follows that the data recording system must be designed to work satisfactorily with the hardware that produces the ion current or currents. The author describes two systems: the AVCO mass spectrometer and the tandem mass spectrometer.

  10. Predictive modeling of structured electronic health records for adverse drug event detection.

    Science.gov (United States)

    Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik

    2015-01-01

    The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both data types, in isolation and
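
    As a loose illustration of the setup described (sparse clinical-code features combined with numeric measurements, filtered by feature selection before classification), here is a scikit-learn sketch on random stand-in data; it is not the study's pipeline, and the chi-squared filter requires non-negative inputs, hence the absolute values.

        import numpy as np
        from scipy.sparse import csr_matrix, hstack
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import SelectKBest, chi2

        rng = np.random.default_rng(2)
        n = 500
        codes = csr_matrix((rng.random((n, 2000)) < 0.01).astype(float))  # drug/diagnosis codes
        measurements = np.abs(rng.normal(size=(n, 30)))                   # lab values, made non-negative
        y = rng.integers(0, 2, n)                                         # ADE yes/no label

        X = hstack([codes, csr_matrix(measurements)]).tocsr()
        X_sel = SelectKBest(chi2, k=200).fit_transform(X, y)   # reduce dimensionality and sparsity
        clf = RandomForestClassifier(n_estimators=100).fit(X_sel, y)
        print(clf.score(X_sel, y))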

  11. Persistent Data Layout and Infrastructure for Efficient Selective Retrieval of Event Data in ATLAS

    CERN Document Server

    INSPIRE-00084279; Malon, David

    2011-01-01

    The ATLAS detector at CERN has completed its first full year of recording collisions at 7 TeV, resulting in billions of events and petabytes of data. At these scales, physicists must have the capability to read only the data of interest to their analyses, with the importance of efficient selective access increasing as data taking continues. ATLAS has developed a sophisticated event-level metadata infrastructure and supporting I/O framework allowing event selections by explicit specification, by back navigation, and by selection queries to a TAG database via an integrated web interface. These systems and their performance have been reported on elsewhere. The ultimate success of such a system, however, depends significantly upon the efficiency of selective event retrieval. Supporting such retrieval can be challenging, as ATLAS stores its event data in column-wise orientation using ROOT trees for a number of reasons, including compression considerations, histogramming use cases, and more. For 2011 data, ATLAS wi...

  12. Mining Electronic Health Records using Linked Data.

    Science.gov (United States)

    Odgers, David J; Dumontier, Michel

    2015-01-01

    Meaningful Use guidelines have pushed the United States Healthcare System to adopt electronic health record systems (EHRs) at an unprecedented rate. Hospitals and medical centers are providing access to clinical data via clinical data warehouses such as i2b2, or Stanford's STRIDE database. In order to realize the potential of using these data for translational research, clinical data warehouses must be interoperable with standardized health terminologies, biomedical ontologies, and growing networks of Linked Open Data such as Bio2RDF. Applying the principles of Linked Data, we transformed a de-identified version of STRIDE into a semantic clinical data warehouse containing visits, labs, diagnoses, prescriptions, and annotated clinical notes. We demonstrate the utility of this system through basic cohort selection, phenotypic profiling, and identification of disease genes. This work is significant in that it demonstrates the feasibility of using semantic web technologies to directly exploit existing biomedical ontologies and Linked Open Data.

  13. A volcanic event forecasting model for multiple tephra records, demonstrated on Mt. Taranaki, New Zealand

    Science.gov (United States)

    Damaschke, Magret; Cronin, Shane J.; Bebbington, Mark S.

    2018-01-01

    Robust time-varying volcanic hazard assessments are difficult to develop, because they depend upon having a complete and extensive eruptive activity record. Missing events in eruption records are endemic, due to poor preservation or erosion of tephra and other volcanic deposits. Even with many stratigraphic studies, underestimation or overestimation of eruption numbers is possible due to mis-matching tephras with similar chemical compositions or problematic age models. It is also common to have gaps in event coverage due to sedimentary records not being available in all directions from the volcano, especially downwind. Here, we examine the sensitivity of probabilistic hazard estimates using a suite of four new and two existing high-resolution tephra records located around Mt. Taranaki, New Zealand. Previous estimates were made using only single, or two correlated, tephra records. In this study, tephra data from six individual sites in lake and peat bogs covering an arc of 120° downwind of the volcano provided an excellent temporal high-resolution event record. The new data confirm a previously identified semi-regular pattern of variable eruption frequency at Mt. Taranaki. Eruption intervals exhibit a bimodal distribution, with eruptions being an average of 65 years apart, and in 2% of cases, centuries separate eruptions. The long intervals are less common than seen in earlier studies, but they have not disappeared with the inclusion of our comprehensive new dataset. Hence, the latest long interval of quiescence, since AD 1800, is unusual, but not out of character with the volcano. The new data also suggest that one of the tephra records (Lake Rotokare) used in earlier work had an old carbon effect on age determinations. This shifted ages of the affected tephras so that they were not correlated to other sites, leading to an artificially high eruption frequency in the previous combined record. New modelled time-varying frequency estimates suggest a 33

  14. Tracking the El Nino events from Antarctic ice core records

    International Nuclear Information System (INIS)

    Keskin, S.S.; Oelmez, I.

    2004-01-01

    Sodium and chlorine measurements were made by instrumental neutron activation analysis (INAA) on stratigraphically dated ice core samples from Byrd Station, Antarctica, for the last three centuries. The time period between 1969 and 1989 showed an enhanced impact on the Antarctic ice sheets from oceans in the form of marine aerosols. A disturbed ocean-atmosphere interface due to El Niño Southern Oscillation (ENSO) events seems to be a candidate explanation for this observation in Antarctica. (author)

  15. NPOESS Environmental Data Record (EDR) Production

    Science.gov (United States)

    Hughes, R.; Grant, K. D.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes NPOESS satellite data to provide environmental data products (aka, Environmental Data Records or EDRs) to NOAA and DoD processing centers operated by the United States government. The IDPS will process EDRs beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the EDRs, including their quality aspects. Together, IDPS and A&DP must support the calibration, validation, and data quality improvement initiatives of the NPOESS program to ensure the production of atmospheric and environmental products that meet strict requirements for accuracy and precision. In support of this activity, A&DP and IDPS continually update the estimated performance of the NPOESS system with respect to both latency and data quality, using the latest operational implementation of the data processing software and information from instrument test activities. This presentation will illustrate and describe the processing chains that create the data products, as well as describe the

  16. R2R Eventlogger: Community-wide Recording of Oceanographic Cruise Science Events

    Science.gov (United States)

    Maffei, A. R.; Chandler, C. L.; Stolp, L.; Lerner, S.; Avery, J.; Thiel, T.

    2012-12-01

    Methods used by researchers to track science events during a science research cruise - and to note when and where these occur - vary widely. Handwritten notebooks, printed forms, watch-keeper logbooks, data-logging software, and customized software have all been employed. The quality of scientific results is affected by the consistency and care with which such events are recorded, and integration of multi-cruise results is hampered because recording methods vary widely from cruise to cruise. The Rolling Deck to Repository (R2R) program has developed an Eventlogger system that will eventually be deployed on most vessels in the academic research fleet. It is based on the open-source software package ELOG (http://midas.psi.ch/elog/), originally authored by Stefan Ritt and enhanced by our team. Lessons have been learned in its development and use on several research cruises. We have worked hard to find approaches that encourage cruise participants to use tools like the eventlogger. We examine these lessons and several eventlogger datasets from past cruises. We further describe how the R2R Science Eventlogger works in concert with the other R2R program elements to help knit research vessels into a coordinated mobile observing fleet. Making use of data collected on different research cruises is enabled by adopting common ways of describing science events, the science instruments employed, the data collected, etc. The use of controlled vocabularies and the practice of mapping these local vocabularies to accepted oceanographic community vocabularies helps to bind shipboard research events from different cruises into a more cohesive set of fleet-wide events that can be queried and examined in a cross-cruise manner. Examples of the use of the eventlogger during multi-cruise oceanographic research programs along with examples of resultant eventlogger data will be presented. Additionally we will highlight the importance of vocabulary use strategies to the success of the

  17. Optimizing access to conditions data in ATLAS event data processing

    CERN Document Server

    Rinaldi, Lorenzo; The ATLAS collaboration

    2018-01-01

    The processing of ATLAS event data requires access to conditions data which is stored in database systems. This data includes, for example, alignment, calibration, and configuration information that may be characterized by large volumes, diverse content, and/or evolution over time as refinements are made in those conditions. Additional layers of complexity are added by the need to provide this information across the world-wide ATLAS computing grid and by the sheer number of simultaneously executing processes on the grid, each demanding a unique set of conditions to proceed. Distributing this data to all the processes that require it in an efficient manner has proven to be an increasing challenge with the growing needs and number of event-wise tasks. In this presentation, we briefly describe the systems in which we have collected information about the use of conditions in event data processing. We then proceed to explain how this information has been used to refine not only reconstruction software ...

  18. Paleo-event data standards for dendrochronology

    Science.gov (United States)

    Elaine Kennedy Sutherland; P. Brewer; W. Gross

    2017-01-01

    Extreme environmental events, such as storm winds, landslides, insect infestations, and wildfire, cause loss of life, resources, and human infrastructure. Disaster risk-reduction analysis can be improved with information about past frequency, intensity, and spatial patterns of extreme events. Tree-ring analyses can provide such information: tree rings reflect events as...

  19. Analyzing time-ordered event data with missed observations

    NARCIS (Netherlands)

    Dokter, Adriaan M.; van Loon, E. Emiel; Fokkema, Wimke; Lameris, Thomas K.; Nolet, Bart A.; van der Jeugd, Henk P.

    2017-01-01

    A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally

  20. Low-cost automatic activity data recording system

    Directory of Open Access Journals (Sweden)

    Moraes M.F.D.

    1997-01-01

    We describe a low-cost, high-quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving other purposes as well, is used to record the circadian activity rhythm pattern of rats over time in an automated, computerized fashion using minimal-cost computer equipment (IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC, which records the time (hours-minutes-seconds) when the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized in terms of actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as the Fast Fourier Transform (FFT) or cosinor. In order to demonstrate method validation, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). Results show a biological validation of the method, since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.
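
    The analysis pipeline described above (hourly detection counts, then FFT to expose the circadian component) can be sketched as follows, with simulated detections standing in for the recorded touch-release events:

        import numpy as np

        rng = np.random.default_rng(3)
        hours = np.arange(0, 24 * 14)                # two weeks of hourly bins
        # simulated activity: more touch-release events at night
        rate = 5 + 4 * np.sin(2 * np.pi * hours / 24)
        counts = rng.poisson(rate)

        spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
        freqs = np.fft.rfftfreq(counts.size, d=1.0)  # cycles per hour
        period = 1 / freqs[spectrum.argmax()]
        print(f"dominant period ~ {period:.1f} h")   # expect ~24 h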

  1. Developing NOAA's Climate Data Records From AVHRR and Other Data

    Science.gov (United States)

    Privette, J. L.; Bates, J. J.; Kearns, E. J.

    2010-12-01

    As part of the provisional NOAA Climate Service, NOAA is providing leadership in the development of authoritative, measurement-based information on climate change and variability. NOAA’s National Climatic Data Center (NCDC) recently initiated a satellite Climate Data Record Program (CDRP) to provide sustained and objective climate information derived from meteorological satellite data that NOAA has collected over the past 30+ years - particularly from its Polar Orbiting Environmental Satellites (POES) program. These are the longest sustained global measurement records in the world and represent billions of dollars of investment. NOAA is now applying advanced analysis methods -- which have improved remarkably over the last decade -- to the POES AVHRR and other instrument data. Data from other satellite programs, including NASA and international research programs and the Defense Meteorological Satellite Program (DMSP), are also being used. This process will unravel the underlying climate trend and variability information and return new value from the records. In parallel, NCDC will extend these records by applying the same methods to present-day and future satellite measurements, including the Joint Polar Satellite System (JPSS) and Jason-3. In this presentation, we will describe the AVHRR-related algorithm development activities that CDRP recently selected and funded through open competitions. We will particularly discuss some of the technical challenges related to adapting and using AVHRR algorithms with the VIIRS data that should become available with the launch of the NPOESS Preparatory Project (NPP) satellite in early 2012. We will also describe IT system development activities that will provide data processing and reprocessing, storage and management. We will also outline the maturing Program framework, including the strategies for coding and development standards, community reviews, independent program oversight, and research-to-operations algorithm

  2. Detection of explosive cough events in audio recordings by internal sound analysis.

    Science.gov (United States)

    Rocha, B M; Mendes, L; Couceiro, R; Henriques, J; Carvalho, P; Paiva, R P

    2017-07-01

    We present a new method for the discrimination of explosive cough events, based on a combination of spectral content descriptors and pitch-related features. After the removal of near-silent segments, a vector of event boundaries is obtained and a proposed set of 9 features is extracted for each event. Two data sets, recorded using electronic stethoscopes and comprising a total of 46 healthy subjects and 13 patients, were employed to evaluate the method. The proposed feature set is compared to three other sets of descriptors: a baseline, a combination of both sets, and an automatic selection of the best 10 features from both sets. The combined feature set yields good results on the cross-validated database, attaining a sensitivity of 92.3±2.3% and a specificity of 84.7±3.3%. Moreover, this feature set seems to generalize well when it is trained on a small data set of patients, with a variety of respiratory and cardiovascular diseases, and tested on a bigger data set of mostly healthy subjects: a sensitivity of 93.4% and a specificity of 83.4% are achieved in those conditions. These results demonstrate that complementing the proposed feature set with a baseline set is a promising approach.
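
    For flavor, two descriptors of the kinds the method combines, one spectral (the spectral centroid) and one pitch-related in spirit (the zero-crossing rate), can be computed per segmented event as below; the paper's actual nine-feature set is not reproduced.

        import numpy as np

        def event_features(x, fs):
            """x: 1-D audio samples of one segmented event; fs: sampling rate (Hz)."""
            spec = np.abs(np.fft.rfft(x * np.hanning(x.size)))
            freqs = np.fft.rfftfreq(x.size, d=1 / fs)
            centroid = (freqs * spec).sum() / spec.sum()   # spectral centroid
            crossings = np.count_nonzero(np.diff(np.sign(x)))
            zcr = crossings / (x.size / fs)                # crossings per second
            return centroid, zcr

        rng = np.random.default_rng(4)
        burst = rng.normal(size=4410) * np.hanning(4410)   # noisy 0.1 s "event"
        print(event_features(burst, fs=44100))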

  3. The HepMC C++ Monte Carlo Event Record for High Energy Physics

    CERN Document Server

    Dobbs, M

    2000-01-01

    HepMC is an object-oriented event record written in C++ for High Energy Physics Monte Carlo Event Generators. Many extensions from HEPEVT, the Fortran HEP standard, are supported: the number of entries is unlimited, spin density matrices can be stored with each vertex, flow patterns (such as colour) can be stored and traced, random number generator states can be stored, and an arbitrary number of event weights can be included. Particles and vertices are stored separately in a graph structure, reflecting the evolution of a physics event. The added information supports the modularisation of event generators. The event record has been kept as simple as possible with minimal internal/external dependencies. Event information is accessed by means of iterators supplied with HepMC.
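
    To convey the design rather than the API (HepMC itself is C++), here is a toy Python rendering of the structure the record describes: particles and vertices stored separately and linked into a graph, with an arbitrary list of event weights. All class and field names are illustrative, not HepMC's.

        from dataclasses import dataclass, field

        @dataclass
        class Particle:
            pdg_id: int
            momentum: tuple                 # (px, py, pz, E)
            end_vertex: "Vertex" = None     # None for final-state particles

        @dataclass
        class Vertex:
            particles_in: list = field(default_factory=list)
            particles_out: list = field(default_factory=list)

        @dataclass
        class Event:
            vertices: list = field(default_factory=list)
            weights: list = field(default_factory=list)  # arbitrary number allowed

        ev = Event(weights=[1.0])
        v = Vertex()
        beam = Particle(2212, (0, 0, 7000, 7000), end_vertex=v)  # proton into vertex
        v.particles_in.append(beam)
        v.particles_out.append(Particle(21, (1.2, -0.3, 40.0, 40.1)))  # gluon out
        ev.vertices.append(v)
        print(len(ev.vertices), ev.weights)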

  4. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by, on average, distances of 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera (version 9.1), set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G) in the stationary sensors and a lens (model AF-S ED 24 mm 1:1.4) in the mobile sensor. All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four days of manual RAMMER operation during the 2012 and 2013 campaigns. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate imprecision in the optical analysis; therefore this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  5. SOCIAL INTERACTIONS AND FEELINGS OF INFERIORITY AMONG CORRECTIONAL OFFICERS - A DAILY EVENT-RECORDING APPROACH

    NARCIS (Netherlands)

    PEETERS, MCW; BUUNK, BP; SCHAUFELI, WB

    1995-01-01

    A daily event-recording method, referred to as the Daily Interaction Record in Organizations (DIRO), was employed for assessing the influence of three types of social interaction on negative affect at work. For this purpose, 38 correctional officers (COs) completed forms, for a 1-week period, that

  6. Continuing the Total and Spectral Solar Irradiance Climate Data Record

    Science.gov (United States)

    Coddington, O.; Pilewskie, P.; Kopp, G.; Richard, E. C.; Sparn, T.; Woods, T. N.

    2017-12-01

    Radiative energy from the Sun establishes the basic climate of the Earth's surface and atmosphere and defines the terrestrial environment that supports all life on the planet. External solar variability on a wide range of scales ubiquitously affects the Earth system, and combines with internal forcings, including anthropogenic changes in greenhouse gases and aerosols, natural modes such as ENSO, and volcanic forcing, to define past, present, and future climates. Understanding these effects requires continuous measurements of total and spectrally resolved solar irradiance that meet the stringent requirements of climate-quality accuracy and stability over time. The current uninterrupted 39-year total solar irradiance (TSI) climate data record is the result of several overlapping instruments flown on different missions. Measurement continuity, required to link successive instruments to the existing data record to discern long-term trends, makes this important climate data record susceptible to loss in the event of a gap in measurements. While improvements in future instrument accuracy will reduce the risk of a gap, the 2017 launch of TSIS-1 ensures continuity of the solar irradiance record into the next decade. There are scientific and programmatic motivations for addressing the challenges of maintaining the solar irradiance data record beyond TSIS-1. The science rests on well-founded requirements for establishing a trusted climate observing network that can monitor trends in fundamental climate variables. Programmatically, the long-term monitoring of solar irradiance must be balanced within the broader goals of NASA Earth Science. New concepts for a low-risk, cost-efficient observing strategy are a priority. New highly capable small spacecraft, low-cost launch vehicles and a multi-decadal plan to provide overlapping TSI and SSI data records are components of a low-risk/high-reliability plan with lower annual cost than past implementations. This paper provides the

  7. Using weather data to determine dry and wet periods relative to ethnographic records

    Science.gov (United States)

    Felzer, B. S.; Jiang, M.; Cheng, R.; Ember, C. R.

    2017-12-01

    Ethnographers record flood or drought events that affect a society's food supply and can be interpreted in terms of a society's ability to adapt to extreme events. Using daily weather station data from the Global Historical Climatology Network for wet events, and monthly gridded climatic data from the Climatic Research Unit for drought events, we determine whether it is possible to relate these measured data to the ethnographic records. We explore several drought and wetness indices based on temperature and precipitation, as well as the Colwell method, to determine the predictability, seasonality, and variability of these extreme indices. Initial results indicate that while it is possible to capture the events recorded in the ethnographic records, there are many more "false" captures of events that are not recorded in these records. Although extreme precipitation is a poor indicator of floods due to antecedent moisture conditions, even using streamflow for selected sites produces false captures. Relating drought indices to actual food supply, as measured by crop yield, succeeded only for minimum crop yield and only in half the cases. Further mismatches between extreme precipitation and drought indices and the ethnographic records may reflect the fact that only extreme events that affect food supply are recorded in the ethnographic records, or that not all events are recorded by the ethnographers. We will present new results on how predictability measures relate to the ethnographic disasters. Despite the highlighted technical challenges, our results provide a historic perspective linking environmental stressors with socio-economic impacts, which in turn will underpin the current efforts of risk assessment in a changing environment.
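
    One simple way such dry/wet classification can work (illustrative only, not the paper's exact indices) is percentile thresholds on a running precipitation total, sketched here on synthetic data:

        import numpy as np

        rng = np.random.default_rng(5)
        monthly_precip = rng.gamma(shape=2.0, scale=40.0, size=600)     # 50 synthetic years
        totals = np.convolve(monthly_precip, np.ones(3), mode="valid")  # 3-month sums

        dry = totals < np.percentile(totals, 10)   # driest decile -> drought flag
        wet = totals > np.percentile(totals, 90)   # wettest decile -> wet flag
        print(dry.sum(), wet.sum())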

  8. A survey of flux transfer events recorded by the UKS spacecraft magnetometer

    International Nuclear Information System (INIS)

    Southwood, D.J.; Saunders, M.A.; Dunlop, M.W.; Mier-Jedrzejowicz, W.A.C.; Rijnbeek, R.P.

    1986-01-01

    The UKS spacecraft operated from August 1984 through to January 1985. During that time, it made multiple crossings of the magnetopause in local time sectors extending from mid-afternoon to just behind the dawn meridian. We have surveyed the magnetometer records from these magnetopause encounters and have compiled a catalogue of flux transfer events. Using the catalogue, we find that the FTE occurrence determined from the UKS data set is substantially lower than that detected using data from the early ISEE 1/2 spacecraft orbits. The UKS data set shows a correlation between FTE occurrence and southward external magnetic field, but there are several instances of passes in which no FTEs are detected but for which the external field was unambiguously southward. The passes with the largest number of events are those for which the field outside the magnetopause has a large B_M component. We conclude that the lower latitude of the UKS encounters is responsible for the discrepancy with the ISEE occurrence. The most likely source region appears to be near the subsolar region. (author)

  9. EOP TDRs (Temperature-Depth-Recordings) Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Temperature-depth-recorders (TDRs) were attached to commercial longline and research Cobb trawl gear to obtain absolute depth and temperature measurement during...

  10. VA Personal Health Record Sample Data

    Data.gov (United States)

    Department of Veterans Affairs — My HealtheVet (www.myhealth.va.gov) is a Personal Health Record portal designed to improve the delivery of health care services to Veterans, to promote health and...

  11. Boosting joint models for longitudinal and time-to-event data

    DEFF Research Database (Denmark)

    Waldmann, Elisabeth; Taylor-Robinson, David; Klein, Nadja

    2017-01-01

    Joint models for longitudinal and time-to-event data have gained a lot of attention in the last few years, as they are a helpful technique in clinical studies where longitudinal outcomes are recorded alongside event times. Those two processes are often linked and the two outcomes should thus be modelled

  12. ATLAS EventIndex Data Collection Supervisor and Web Interface

    CERN Document Server

    Garcia Montoro, Carlos; The ATLAS collaboration

    2016-01-01

    The EventIndex project consists of the development and deployment of a complete catalogue of events for the ATLAS experiment at the LHC accelerator at CERN. In 2015 the ATLAS experiment produced 12 billion real events in 1 million files, and 5 billion simulated events in 8 million files. The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing it in a central Hadoop infrastructure. A subset of this information is copied to an Oracle relational database. These slides present two components of the ATLAS EventIndex: its data collection supervisor and its web interface partner.

  13. Duplicate Record Elimination in Large Data Files.

    Science.gov (United States)

    1981-08-01

    Computer Sciences Department, University of Wisconsin. ...we propose a combinatorial model for use in the analysis of algorithms for duplicate elimination. We contend that this model can serve as a... duplicates in a multiset of records, knowing the size of the multiset and the number of distinct records in it. 3. Algorithms for Duplicate Elimination

  14. Genomic inferences of domestication events are corroborated by written records in Brassica rapa.

    Science.gov (United States)

    Qi, Xinshuai; An, Hong; Ragsdale, Aaron P; Hall, Tara E; Gutenkunst, Ryan N; Chris Pires, J; Barker, Michael S

    2017-07-01

    Demographic modelling is often used with population genomic data to infer the relationships and ages among populations. However, relatively few analyses are able to validate these inferences with independent data. Here, we leverage written records that describe distinct Brassica rapa crops to corroborate demographic models of domestication. Brassica rapa crops are renowned for their outstanding morphological diversity, but the relationships and order of domestication remain unclear. We generated genomewide SNPs from 126 accessions collected globally using high-throughput transcriptome data. Analyses of more than 31,000 SNPs across the B. rapa genome revealed evidence for five distinct genetic groups and supported a European-Central Asian origin of B. rapa crops. Our results supported the traditionally recognized South Asian and East Asian B. rapa groups with evidence that pak choi, Chinese cabbage and yellow sarson are likely monophyletic groups. In contrast, the oil-type B. rapa subsp. oleifera and brown sarson were polyphyletic. We also found no evidence to support the contention that rapini is the wild type or the earliest domesticated subspecies of B. rapa. Demographic analyses suggested that B. rapa was introduced to Asia 2,400-4,100 years ago, and that Chinese cabbage originated 1,200-2,100 years ago via admixture of pak choi and European-Central Asian B. rapa. We also inferred significantly different levels of founder effect among the B. rapa subspecies. Written records from antiquity that document these crops are consistent with these inferences. The concordance between our age estimates of domestication events with historical records provides unique support for our demographic inferences.

  15. Environmental response to the cold climate event 8200 years ago as recorded at Hoejby Soe, Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Peter (Geological Survey of Denmark and Greenland, Copenhagen (Denmark)); Ulfeldt Hede, M.; Noe-Nygaard, N. (Univ. of Copenhagen, Dept. of Geography and Geology, Copenhagen (Denmark)); Clarke, A.L. (APEM Manchester Lab., Stockport (United Kingdom)); Vinebrooke, R.D. (Univ. of Alberta, Dept. of Biological Science - Freshwater Biodiversity Lab., Edmonton (Canada))

    2008-07-15

    The need for accurate predictions of future environmental change under conditions of global warming has led to great interest in the most pronounced climate change known from the Holocene: an abrupt cooling event around 8200 years before present (present = A.D. 1950), also known as the '8.2 ka cooling event' (ka = kilo-annum = 1000 years). This event has been recorded as a negative δ18O excursion in the central Greenland ice cores (lasting 160 years, with the lowest temperature at 8150 B.P.) and in a variety of other palaeoclimatic archives including lake sediments, ocean cores, speleothems, tree rings, and glacier oscillations from most of the Northern Hemisphere. In Greenland the maximum cooling was estimated to be 6 ± 2 deg. C, while in southern Fennoscandia and the Baltic countries pollen-based quantitative temperature reconstructions indicate a maximum annual mean temperature decrease of around 1.5 deg. C. Today there is a general consensus that the primary cause of the cooling event was the final collapse of the Laurentide ice sheet near Hudson Bay and the associated sudden drainage of the proglacial Lake Agassiz into the North Atlantic Ocean around 8400 B.P. This freshwater outflow, estimated to amount to c. 164,000 km3 of water, reduced the strength of the North Atlantic thermohaline circulation and thereby the heat transported to the North Atlantic region, resulting in an atmospheric cooling. The climatic consequences of this meltwater flood are assumed to be a good geological analogue for future climate-change scenarios, as a freshening of the North Atlantic is projected by almost all global-warming models and is also currently being registered in the region. In an ongoing project, the influence of the 8.2 ka cooling event on a Danish terrestrial and lake ecosystem is being investigated using a variety of biological and geochemical proxy data from a sediment core extracted from Hoejby Soe, north-west Sjaelland. Here we present data on...

  16. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of that time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intensely negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to provide some insight into the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over...

  17. Radiation scanning system for data recording media

    International Nuclear Information System (INIS)

    Gucza, E.

    1975-01-01

    The scanner of an encoded record support operates by the reflection principle. The record support has tracks broken down into individual fields which are assigned light-dark markers for encoding purposes. The support consists of a light, non-transparent card which can be pulled over a slot by a guide attached to the scanner. The slot is arranged at an oblique angle relative to the card and emits radiation, for instance light. This radiation is reflected by the tracks, the empty fields reflecting more radiation than the blackened ones, and then, after having been transformed into signals, impinges upon phototransistors through openings. The number of openings corresponds to the number of tracks. The light can be made diffuse prior to exposure of the card by means of a red transparent plastic foil. (DG/RF) [de

  18. Understanding extreme rainfall events in Australia through historical data

    Science.gov (United States)

    Ashcroft, Linden; Karoly, David John

    2016-04-01

    Historical climate data recovery is still an emerging field in the Australian region. The majority of Australia's instrumental climate analyses begin in 1900 for rainfall and 1910 for temperature, particularly those focussed on extreme event analysis. This sparsity of data for the past in turn limits our understanding of long-term climate variability, constraining efforts to predict the impact of future climate change. To address this need for improved historical data in Australia, a new network of recovered climate observations has recently been developed, centred on the highly populated southeastern Australian region (Ashcroft et al., 2014a, 2014b). The dataset includes observations from more than 39 published and unpublished sources and extends from British settlement in 1788 to the formation of the Australian Bureau of Meteorology in 1908. Many of these historical sources provide daily temperature and rainfall information, providing an opportunity to improve understanding of the multidecadal variability of Australia's extreme events. In this study we combine the historical data for three major Australian cities - Melbourne, Sydney and Adelaide - with modern observations to examine extreme rainfall variability over the past 174 years (1839-2013). We first explore two case studies, combining instrumental and documentary evidence to support the occurrence of severe storms in Sydney in 1841 and 1844. These events appear to be at least as extreme as Sydney's modern 24-hour rainfall record. Next we use a suite of rainfall indices to assess the long-term variability of rainfall in southeastern Australia. In particular, we focus on the stationarity of the teleconnection between the El Niño-Southern Oscillation (ENSO) phenomenon and extreme rainfall events. Using ENSO reconstructions derived from both palaeoclimatic and documentary sources, we determine the historical relationship between extreme rainfall in southeastern Australia and ENSO, and examine whether or not this...

  19. The ATLAS EventIndex: data flow and inclusion of other metadata

    CERN Document Server

    Prokoshin, Fedor; The ATLAS collaboration; Cardenas Zarate, Simon Ernesto; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2016-01-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information obtained from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event, as well as trigger decision information. The main use cases for the EventIndex are event picking, providing information for the Event Service, and data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data man...
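
    The abstract spells out what an event record carries: identification parameters, file pointers and trigger information. A hedged sketch of such a record as a data structure follows; the field names are assumptions drawn from the abstract, not the actual ATLAS schema:

    ```python
    # Sketch of an EventIndex-style event record; field names are assumed
    # from the abstract, not taken from the real ATLAS implementation.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class EventRecord:
        run_number: int                                        # event identification
        event_number: int                                      # parameters
        file_guids: List[str] = field(default_factory=list)   # files holding the event
        trigger_decisions: List[str] = field(default_factory=list)  # trigger info

    # Event picking then reduces to a lookup by (run, event) number:
    index = {}
    rec = EventRecord(358031, 4168642, ["guid-0001"], ["HLT_example"])
    index[(rec.run_number, rec.event_number)] = rec
    print(index[(358031, 4168642)].file_guids)
    ```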

  20. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  1. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
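
    The closing claim is concrete enough to state in code: with k observed events in n trials the MLE is k/n, and for the zero-event case the proposed estimator is approximated by 1/(2.5n). A minimal sketch of that rule, assuming nothing beyond the abstract:

    ```python
    # Zero-failure estimate per the abstract: MLE k/n when events occurred,
    # approximately 1/(2.5 * n) when no events were observed in n trials.
    def event_probability(k: int, n: int) -> float:
        if n <= 0:
            raise ValueError("n must be a positive number of trials")
        return k / n if k > 0 else 1.0 / (2.5 * n)

    print(event_probability(3, 100))  # 0.03, the ordinary MLE
    print(event_probability(0, 100))  # 0.004 instead of an implausible 0.0
    ```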

  2. Recording the LHCb data and software dependencies

    Science.gov (United States)

    Trisovic, Ana; Couturier, Ben; Gibson, Val; Jones, Chris

    2017-10-01

    In recent years awareness of the importance of preserving the experimental data and scientific software at CERN has been rising. To support this effort, we are presenting a novel approach to structuring dependencies of the LHCb data and software to make them more accessible in the long-term future. In this paper, we detail the implementation of a graph database of these dependencies. We list the implications that can be deduced from the graph mining (such as a search for the legacy software), with emphasis on data preservation. Furthermore, we introduce a methodology for recreating the LHCb data, thus supporting reproducible research and data stewardship. Finally, we describe how this information is made available to the users on a web portal that promotes data and analysis preservation and good practice with analysis documentation.
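
    The core idea is a graph whose vertices are data and software artefacts and whose edges are dependencies, which can then be mined (for example, for legacy software or for the closure needed to recreate a dataset). A toy sketch of that structure, with invented artefact names and networkx standing in for the paper's graph database:

    ```python
    # Toy dependency graph in the spirit of the abstract; artefact names are
    # invented, and networkx stands in for the actual graph database.
    import networkx as nx

    g = nx.DiGraph()
    # An edge points from an artefact to something it depends on.
    g.add_edge("ntuple_v2", "DaVinci_v36r1")
    g.add_edge("DaVinci_v36r1", "LHCb_v38r6")
    g.add_edge("ntuple_v2", "Stripping21_dataset")

    # Recreating "ntuple_v2" requires its full transitive dependency closure:
    print(nx.descendants(g, "ntuple_v2"))
    # {'DaVinci_v36r1', 'LHCb_v38r6', 'Stripping21_dataset'}
    ```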

  3. Component event data bank fast breeder reactors

    International Nuclear Information System (INIS)

    Cambi, G.; Righini, R.; Sola, P.G.; Zappellini, G.

    1986-01-01

    ENEA, the Italian Committee for Research and Development of Nuclear Energy, has created a data bank expressly concerning FBRs. The structure of the bank is similar to that of the previous CEDB for LWRs, but it has been adapted to the new requirements typical of FBRs and to the different conditions of the industry. (DG)

  4. The ATLAS EventIndex: data flow and inclusion of other metadata

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00064378; Cardenas Zarate, Simon Ernesto; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Prokoshin, Fedor; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2016-01-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information collected from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event, as well as trigger decision information. The main use cases for the EventIndex are event picking and data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data management system and information on p...

  5. Fundamental constraints on some event data

    International Nuclear Information System (INIS)

    Watson, I.A.

    1986-01-01

    A modified version of Searle's theory of the structure of human action has been explained and applied to man-machine interaction. The comprehensiveness of the theory has been demonstrated, in particular its explanation of human performance and its consistency with current theories of human error, for which it provides an overall setting. The importance of the mental component of human error is highlighted, as are the constraints that this puts on the collection, analysis and use of human error data. Examples have been given to illustrate and apply the theory, ranging from considerations of the tenuousness of the link between safety goals and data to simple valve operations. Two approaches which recognise the constraints shown by the theory have been explained. (orig./DG)

  6. Continuous event recorders did not affect anxiety or quality of life in patients with palpitations

    NARCIS (Netherlands)

    Hoefman, Emmy; Boer, Kimberly R.; van Weert, Henk C. P. M.; Reitsma, Johannes B.; Koster, Rudolf W.; Bindels, Patrick J. P.

    2007-01-01

    OBJECTIVES: Palpitations can generate feelings of anxiety and decrease quality of life (QoL) due to fear of a cardiac abnormality. Continuous event recorders (CERs) have proven to be successful in diagnosing causes of palpitations but may affect patient QoL and anxiety. The aim is to determine...

  7. Information retrieval for nonstationary data records

    Science.gov (United States)

    Su, M. Y.

    1971-01-01

    A review and a critical discussion are made of the existing methods for the analysis of nonstationary time series, and a new algorithm for splitting nonstationary time series is applied to the analysis of sunspot data.

  8. NIMS EXPERIMENT DATA RECORDS: SL-9 COMET IMPACT WITH JUPITER

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  9. NIMS EXPERIMENT DATA RECORDS: GASPRA/IDA ENCOUNTERS

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  10. High density data recording for SSCL linac

    International Nuclear Information System (INIS)

    VanDeusen, A.L.; Crist, C.

    1993-01-01

    The Superconducting Super Collider Laboratory and AlliedSignal Aerospace have collaboratively developed a high density data monitoring system for beam diagnostic activities. The 128-channel data system is based on a custom multi-channel high speed digitizer card for the VXI bus. The card is referred to as a Modular Input VXI (MIX) digitizer. Multiple MIX cards are used in the complete system to meet the necessary high channel density requirements. Each MIX digitizer card also contains programmable signal conditioning, and enough local memory to complete an entire beam scan without assistance from the host processor

  11. Motivation and intention to integrate physical activity into daily school life: the JAM World Record event.

    Science.gov (United States)

    Vazou, Spyridoula; Vlachopoulos, Symeon P

    2014-11-01

    Research on the motivation of stakeholders to integrate physical activity into daily school life is limited. The purpose was to examine the motivation of stakeholders to participate in a world record physical activity event and whether motivation was associated with future intention to use activity breaks during the daily school life and future participation in a similar event. After the 2012 JAM (Just-a-Minute) World Record event, 686 adults (591 women; 76.1% participated for children) ... measures of motivation and amotivation for participation in the next event were reported. Hierarchical regression analysis, controlling for age, gender, and occupation, showed that intrinsic forms of motivation positively predicted, whereas amotivation negatively predicted, future intention to participate in the event and use the activity breaks. Multivariate analyses of variance revealed that school-related participants were more intrinsically motivated and intended to use the activity breaks and repeat the event more than those who were not affiliated with a school. Nonschool participants reported higher extrinsic motivation and amotivation than school-related participants. © 2014 Society for Public Health Education.

  12. Backdating of events in electronic primary health care data: should one censor at the date of last data collection.

    Science.gov (United States)

    Sammon, Cormac J; Petersen, Irene

    2016-04-01

    Studies using primary care databases often censor follow-up at the date data are last collected from clinical computer systems (last collection date (LCD)). We explored whether this results in the selective exclusion of events entered in the electronic health records after their date of occurrence, that is, backdated events. We used data from The Health Improvement Network (THIN). Using two versions of the database, we identified events that were entered into a later (THIN14) but not an earlier version of the database (THIN13) and investigated how the number of entries changed as a function of time since LCD. Times between events and the dates they were recorded were plotted as a function of time since the LCD in an effort to determine appropriate points at which to censor follow-up. There were 356 million eligible events in THIN14 and 355 million eligible events in THIN13. When comparing the two data sets, the proportion of missing events in THIN13 was highest in the month prior to the LCD (9.6%), decreasing to 5.2% at 6 months and 3.4% at 12 months. The proportion of missing events was largest for events typically diagnosed in secondary care such as neoplasms (28% in the month prior to LCD) and negligible for events typically diagnosed in primary care such as respiratory events (2% in the month prior to LCD). Studies using primary care databases, particularly those investigating events typically diagnosed outside primary care, should censor follow-up prior to the LCD to avoid underestimation of event rates. Copyright © 2016 John Wiley & Sons, Ltd.
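
    The month-by-month comparison behind these figures is easy to reproduce on any event table that stores both an event date and an entry date. A small pandas sketch (the dates are invented) of the proportion of events that would be missing if follow-up were cut at the LCD:

    ```python
    # For events occurring m months before the last collection date (LCD),
    # what fraction were entered only after the LCD (i.e. backdated and
    # therefore missing from a database cut at the LCD)? Dates are invented.
    import pandas as pd

    events = pd.DataFrame({
        "event_date": pd.to_datetime(["2015-11-20", "2015-12-10", "2015-12-28"]),
        "entry_date": pd.to_datetime(["2015-11-21", "2016-02-01", "2016-01-15"]),
    })
    lcd = pd.Timestamp("2016-01-01")

    before = events[events.event_date < lcd].copy()
    before["months_before_lcd"] = (lcd - before.event_date).dt.days // 30
    before["missing_at_lcd"] = before.entry_date > lcd
    print(before.groupby("months_before_lcd")["missing_at_lcd"].mean())
    ```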

  13. ATLAS EventIndex Data Collection Supervisor and Web Interface

    CERN Document Server

    Garcia Montoro, Carlos; The ATLAS collaboration; Sanchez, Javier

    2016-01-01

    The EventIndex project consists of the development and deployment of a complete catalogue of events for the ATLAS experiment [1][2] at the LHC accelerator at CERN. In 2015 the ATLAS experiment produced 12 billion real events in 1 million files, and 5 billion simulated events in 8 million files. The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing it in a central Hadoop infrastructure. A subset of this information is copied to an Oracle relational database. This paper presents two components of the ATLAS EventIndex [3]: its data collection supervisor and its web interface.

  14. Client and event driven data hub system at CDF

    International Nuclear Information System (INIS)

    Kilminster, Ben; McFarland, Kevin; Vaiciulis, Tony; Matsunaga, Hiroyuki; Shimojima, Makoto

    2001-01-01

    The Consumer-Server Logger (CSL) system at the Collider Detector at Fermilab is a client- and event-driven data hub capable of receiving physics events from multiple connections and logging them to multiple streams while distributing them to multiple online analysis programs (consumers). Its multiple-partition design allows data flowing through different paths of the detector sub-systems to be processed separately. The CSL system, using a set of internal memory buffers and message queues mapped to the location of events within its programs, and running on an SGI 2200 Server, is able to process at least the required 20 MB/s of constant event logging (75 Hz of 250 KB events) while also filtering up to 10 MB/s to consumers requesting specific types of events

  15. A novel method for inferring RFID tag reader recordings into clinical events.

    Science.gov (United States)

    Chang, Yung-Ting; Syed-Abdul, Shabbir; Tsai, Chung-You; Li, Yu-Chuan

    2011-12-01

    Nosocomial infections (NIs) are among the important indicators used for evaluating patient safety and hospital performance during accreditation of hospitals. The NI rate is higher in Intensive Care Units (ICUs) than in general wards because patients require intense care involving both invasive and non-invasive clinical procedures. The emergence of superbugs is motivating health providers to enhance infection control measures. Contact behaviour between health caregivers and patients is one of the main causes of cross infection. In this technology-driven era, remote monitoring of patients and caregivers in the hospital setting can be performed reliably, and thus is in demand. Proximity sensing using radio frequency identification (RFID) technology can be helpful in capturing and keeping track of all contact history between health caregivers and patients, for example. This study intended to extend the use of proximity sensing with radio frequency identification technology by proposing a model for inferring RFID tag reader recordings into clinical events. The aims of the study are twofold. The first aim is to set up a Contact History Inferential Model (CHIM) between health caregivers and patients. The second is to verify CHIM with real-time observation done in the ICU ward. A pre-study was conducted, followed by two study phases. During the pre-study, proximity sensing with RFID was tested and the system was deployed in the Clinical Skill Center in one of the medical centers in Taiwan. We simulated clinical events and developed CHIM using variables such as duration of time, frequency, and identity (tag) numbers assigned to caregivers. All clinical proximity events are classified into close-in events, contact events and invasive events. During the first phase, three observers were recruited to do real-time recordings of all clinical events in the Clinical Skill Center with the deployed automated RFID interaction recording system. The observations were used to verify...
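
    The abstract names the ingredients of CHIM (duration, frequency and caregiver tag identity) and its three output classes. A hedged sketch of such a rule-based classifier follows; the thresholds are invented for illustration, since the abstract does not give the model's actual cut-offs:

    ```python
    # Rule-based classification of caregiver-patient proximity episodes in
    # the spirit of CHIM; the duration thresholds below are invented.
    from dataclasses import dataclass

    @dataclass
    class ProximityEpisode:
        caregiver_tag: str
        patient_id: str
        duration_s: float   # time the caregiver tag was read near the patient
        reads: int          # number of tag-reader recordings in the episode

    def classify(ep: ProximityEpisode) -> str:
        if ep.duration_s < 30:
            return "close-in event"   # brief approach only
        if ep.duration_s < 300:
            return "contact event"    # sustained bedside presence
        return "invasive event"       # long, procedure-like interaction

    print(classify(ProximityEpisode("RN-07", "P-123", duration_s=420, reads=84)))
    ```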

  16. Data Discovery and Access via the Heliophysics Events Knowledgebase (HEK)

    Science.gov (United States)

    Somani, A.; Hurlburt, N. E.; Schrijver, C. J.; Cheung, M.; Freeland, S.; Slater, G. L.; Seguin, R.; Timmons, R.; Green, S.; Chang, L.; Kobashi, A.; Jaffey, A.

    2011-12-01

    The HEK is an integrated system which helps direct scientists to solar events and data from a variety of providers. The system is fully operational and adoption of HEK has been growing since the launch of NASA's SDO mission. In this presentation we describe the different components that comprise HEK. The Heliophysics Events Registry (HER) and Heliophysics Coverage Registry (HCR) form the two major databases behind the system. The HCR allows the user to search on coverage event metadata for a variety of instruments. The HER allows the user to search on annotated event metadata for a variety of instruments. Both the HCR and HER are accessible via a web API which can return search results in machine-readable formats (e.g. XML and JSON). A variety of SolarSoft services are also provided to allow users to search the HEK as well as obtain and manipulate data. Other components include - the Event Detection System (EDS) continually runs feature finding algorithms on SDO data to populate the HER with relevant events, - A web form for users to request SDO data cutouts for multiple AIA channels as well as HMI line-of-sight magnetograms, - iSolSearch, which allows a user to browse events in the HER and search for specific events over a specific time interval, all within a graphical web page, - Panorama, which is the software tool used for rapid visualization of large volumes of solar image data in multiple channels/wavelengths. The user can also easily create WYSIWYG movies and launch the Annotator tool to describe events and features. - EVACS, which provides a JOGL powered client for the HER and HCR. EVACS displays the searched-for events on a full disk magnetogram of the sun while displaying more detailed information for events.
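
    Since the HER and HCR expose a web API with machine-readable results, queries can be scripted. The sketch below uses the sunpy HEK client; sunpy is not part of the system described above and is assumed here purely for illustration (the registries can equally be queried through their raw web API):

    ```python
    # Hedged sketch of an HER query via the sunpy HEK client (an assumption;
    # not one of the HEK components described in the record above).
    from sunpy.net import hek

    client = hek.HEKClient()
    results = client.search(
        hek.attrs.Time("2011/08/09 07:00", "2011/08/09 13:00"),
        hek.attrs.EventType("FL"),   # flare events annotated in the HER
    )
    for event in results:
        print(event["event_starttime"], event["frm_name"])
    ```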

  17. Program package for data preparation of RISK events

    International Nuclear Information System (INIS)

    Denes, E.; Wagner, I.; Nagy, J.

    1980-01-01

    A FORTRAN program package written for the CDC-6500 computer is presented. The SMHV program is designed to transform data obtained from events of the RISK streamer chamber, measured with SAMET or PUOS devices, into the HEVAS data format needed by the geometrical reconstruction program. Such a transformation provides standardization of measurement data processing within the RISK collaboration and the capability of direct input into the program for geometrical reconstruction of events registered on film of the RISK streamer chamber

  18. Digital voice recording: An efficient alternative for data collection

    Science.gov (United States)

    Mark A. Rumble; Thomas M. Juntti; Thomas W. Bonnot; Joshua J. Millspaugh

    2009-01-01

    Study designs are usually constrained by logistical and budgetary considerations that can affect the depth and breadth of the research. Little attention has been paid to increasing the efficiency of data recording. Digital voice recording and translation may offer improved efficiency of field personnel. Using this technology, we increased our data collection by 55...

  19. Health Data Recording, Reporting and Utilization Practices Among ...

    African Journals Online (AJOL)

    Health Data Recording, Reporting and Utilization Practices Among Primary Health Care Workers in Enugu State, South Eastern Nigeria. ... of PHC workers used notepads, 52.3% used notebooks while only 47.7% used health management information system (HMIS) forms to record data.

  20. Gravity Data for Indiana-over 10,000 records

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (10,629 records) were compiled by Purdue University. This data base was received in December 1989. Principal gravity parameters include Free-air...

  1. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  2. Expression and cut parser for CMS event data

    International Nuclear Information System (INIS)

    Lista, Luca; Jones, Christopher D; Petrucciani, Giovanni

    2010-01-01

    We present a parser to evaluate expressions and Boolean selections that is applied to CMS event data for event filtering and analysis purposes. The parser is based on a Boost Spirit grammar definition and uses Reflex dictionaries for class introspection. The parser allows for a natural definition of expressions and cuts in users' configurations, and provides good runtime performance compared to other existing parsers.
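
    The CMS parser itself is C++ built on Boost Spirit; as a language-agnostic illustration of what such a cut evaluator does, the following Python sketch parses a selection string into a restricted syntax tree and evaluates it against an event's variables:

    ```python
    # Illustrative cut evaluator (not the CMS/Boost Spirit implementation):
    # parse a selection string into a restricted AST and evaluate it on an
    # event represented as a dict of variable names to values.
    import ast

    ALLOWED = (ast.Expression, ast.BoolOp, ast.And, ast.Or, ast.UnaryOp,
               ast.Not, ast.USub, ast.Compare, ast.Lt, ast.Gt, ast.LtE,
               ast.GtE, ast.Eq, ast.NotEq, ast.BinOp, ast.Add, ast.Sub,
               ast.Mult, ast.Div, ast.Name, ast.Load, ast.Constant, ast.Call)

    def passes(cut: str, event: dict) -> bool:
        source = cut.replace("&&", " and ").replace("||", " or ")
        tree = ast.parse(source, mode="eval")
        for node in ast.walk(tree):
            if not isinstance(node, ALLOWED):
                raise ValueError(f"disallowed syntax: {type(node).__name__}")
        return bool(eval(compile(tree, "<cut>", "eval"), {"abs": abs}, dict(event)))

    print(passes("pt > 20 && abs(eta) < 2.4", {"pt": 27.4, "eta": -1.9}))  # True
    ```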

  3. Out-of-order event processing in kinetic data structures

    DEFF Research Database (Denmark)

    Abam, Mohammad; de Berg, Mark; Agrawal, Pankaj

    2011-01-01

    We study the problem of designing kinetic data structures (KDS's for short) when event times cannot be computed exactly and events may be processed in a wrong order. In traditional KDS's this can lead to major inconsistencies from which the KDS cannot recover. We present more robust KDS's for the maintenance of several fundamental structures such as kinetic sorting and kinetic tournament trees, which overcome the difficulty by employing a refined event scheduling and processing technique. We prove that the new event scheduling mechanism leads to a KDS that is correct except for finitely many short...

  4. Adverse-drug-event data provided by pharmaceutical companies.

    Science.gov (United States)

    Cudny, Magdalena E; Graham, Angie S

    2008-06-01

    Pharmaceutical company drug information center (PCDIC) responses to queries about adverse drug events (ADEs) were studied to determine whether PCDICs search sources other than the prescribing information on the package insert (PI) and whether the PCDICs' approach differs according to whether an ADE is listed in the PI (labeled) or not (unlabeled). Companies were selected from a list of PCDICs in the Physicians' Desk Reference. One oral or injectable prescription drug from each company was selected. For each drug, a labeled ADE and an unlabeled ADE about which to query the PCDICs were randomly selected from the index of an annual publication on ADEs. The investigators telephoned the PCDICs with an open-ended inquiry about the incidence, timing, and management of the ADE as reported in the literature and the company's internal data; they clarified that the request did not concern a specific patient. Whether or not information was provided, the source searched was recorded (PI, literature, internal database), and the percentages of PCDICs that used each source for labeled and for unlabeled ADEs were analyzed. Results were obtained from 100 companies to questions about 100 drugs (200 ADEs). For ADEs overall, 80% used the PI, 50% the medical literature, and 38% internal data. For labeled versus unlabeled ADEs, respectively, the PI was used by 84% and 76%; literature, both 50%; and internal data, 35% and 41%. The PCDIC specialists referencing the PI did not always provide accurate or up-to-date information. Some specialists, when asked to query internal databases, said that was not an option. For both labeled and unlabeled ADEs, the PI was the primary source used by PCDICs to answer safety questions about their products, and internal data were the least-used source. Most resources used by PCDICs are readily available to practicing pharmacists.

  5. Validating administrative data for the detection of adverse events in older hospitalized patients

    Directory of Open Access Journals (Sweden)

    Ackroyd-Stolarz S

    2014-08-01

    Full Text Available Stacy Ackroyd-Stolarz,1,2 Susan K Bowles,3–5 Lorri Giffin6 1Performance Excellence Portfolio, Capital District Health Authority, Halifax, Nova Scotia, Canada; 2Department of Emergency Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; 3Geriatric Medicine, Capital District Health Authority, Halifax, Nova Scotia, Canada; 4College of Pharmacy and Division of Geriatric Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; 5Department of Pharmacy at Capital District Health Authority, Halifax, Nova Scotia, Canada; 6South Shore Family Health, Bridgewater, Nova Scotia, Canada Abstract: Older hospitalized patients are at risk of experiencing adverse events including, but not limited to, hospital-acquired pressure ulcers, fall-related injuries, and adverse drug events. A significant challenge in monitoring and managing adverse events is the lack of readily accessible information on their occurrence. Purpose: The objective of this retrospective cross-sectional study was to validate diagnostic codes for pressure ulcers, fall-related injuries, and adverse drug events found in routinely collected administrative hospitalization data. Methods: All patients 65 years of age or older discharged between April 1, 2009 and March 31, 2011 from a provincial academic health sciences center in Canada were eligible for inclusion in the validation study. For each of the three types of adverse events, a random sample of 50 patients whose records were positive and 50 patients whose records were not positive for an adverse event was sought for review in the validation study (n=300 records in total). A structured health record review was performed independently by two health care providers with experience in geriatrics, both of whom were unaware of the patient's status with respect to adverse event coding. A physician reviewed 40 records (20 reviewed by each health care provider) to establish interrater agreement. Results: A total of 39 pressure ulcers, 56 fall...

  6. Hospital staff should use more than one method to detect adverse events and potential adverse events: incident reporting, pharmacist surveillance and local real‐time record review may all have a place

    Science.gov (United States)

    Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles

    2007-01-01

    Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203

  7. Development of Software for dose Records Data Base Access

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was individual dose follow-up control and data handling for epidemiological studies. Within the Data Base management scheme, software was developed to allow searching of individual dose records by authorised external users. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records of workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of the codes and subroutines developed. (Author) 2 refs

  8. Cosmic ray event in 994 C.E. recorded in radiocarbon from Danish oak

    Science.gov (United States)

    Fogtmann-Schulz, A.; Østbø, S. M.; Nielsen, S. G. B.; Olsen, J.; Karoff, C.; Knudsen, M. F.

    2017-08-01

    We present measurements of radiocarbon in annual tree rings from the time period 980-1006 Common Era (C.E.), thereby covering the cosmic ray event in 994 C.E. The new radiocarbon record from Danish oak is based on both earlywood and latewood fractions of the tree rings, which makes it possible to study seasonal variations in 14C production. The measurements show a rapid increase of ˜10‰ from 993 to 994 C.E. in latewood, followed by a modest decline and relatively high values over the ensuing ˜10 years. This rapid increase occurs from 994 to 995 C.E. in earlywood, suggesting that the cosmic ray event most likely occurred during the period between April and June 994 C.E. Our new record from Danish oak shows strong agreement with existing Δ14C records from Japan, thus supporting the hypothesis that the 994 C.E. cosmic ray event was uniform throughout the Northern Hemisphere and therefore can be used as an astrochronological tie point to anchor floating chronologies of ancient history.

  9. Design of double tape recorder data acquisition system

    International Nuclear Information System (INIS)

    Guo Tianrui; Du Yifei

    1995-01-01

    In data acquisition systems based on a microcomputer tape recorder, the acquisition speed is often limited by the low speed of the tape recorder, so a double tape recorder system was designed. In this system, two tape recorders are used simultaneously in the on-line acquisition system. One DMA channel is the one designed for the floppy disk driver; the other is the one reserved for the user. In this way, the tape writing speed can be nearly doubled. In order to prevent data confusion, the authors open two data buffers in the system, write a different mark in each buffer, and then write the data blocks to the two tape recorders according to the marks. The system complies with the principle: 'Double write, Double read'
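
    The scheme is straightforward to state in code: blocks are tagged with a mark before being written to the recorder whose turn it is, and the marks let the two tapes be re-interleaved on reading. A minimal sketch of that "Double write, Double read" idea, with lists standing in for the two tape recorders:

    ```python
    # "Double write, Double read" sketch: alternate marked blocks between
    # two recorders, then merge by mark on reading. Lists stand in for tapes.
    def double_write(blocks, tape_a, tape_b):
        for seq, block in enumerate(blocks):
            tape = tape_a if seq % 2 == 0 else tape_b
            tape.append((seq, block))          # the mark travels with the data

    def double_read(tape_a, tape_b):
        return [block for _, block in sorted(tape_a + tape_b)]

    a, b = [], []
    double_write([f"blk{i}" for i in range(6)], a, b)
    assert double_read(a, b) == [f"blk{i}" for i in range(6)]
    ```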

  10. Dedicated data recording video system for Spacelab experiments

    Science.gov (United States)

    Fukuda, Toshiyuki; Tanaka, Shoji; Fujiwara, Shinji; Onozuka, Kuniharu

    1984-04-01

    A feasibility study of video tape recorder (VTR) modification to add the capability of data recording etc. was conducted. This is an on-board system to support Spacelab experiments as a dedicated video system and a dedicated data recording system, operating independently of the normal operation of the Orbiter, Spacelab and the other experiments. It continuously records the video image signals together with the acquired data, status and operator's voice on one cassette video tape. Things such as the crews' actions, animals' behavior, microscopic views and melting materials in a furnace are recorded. It is thus expected that experimenters can make a very easy and convenient analysis of the synchronized video, voice and data signals in their post-flight analysis.

  11. Online event filtering in the JADE data acquisition system

    International Nuclear Information System (INIS)

    Mills, H.E.

    1986-01-01

    The data acquisition system developed for the JADE experiment at PETRA, DESY includes the facility to use software to filter out background events. The design, implementation, testing and experience gained are discussed. (orig.)

  12. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases...

  13. Predicting mining collapse: Superjerks and the appearance of record-breaking events in coal as collapse precursors

    Science.gov (United States)

    Jiang, Xiang; Liu, Hanlong; Main, Ian G.; Salje, Ekhard K. H.

    2017-08-01

    The quest for predictive indicators for the collapse of coal mines has led to a robust criterion from scale-model tests in the laboratory. Mechanical collapse under uniaxial stress forms avalanches with a power-law probability distribution function of radiated energy, P(E) ~ E^(-ε), with exponent ε = 1.5. Impending major collapse is preceded by a reduction of the energy exponent to the mean-field value ε = 1.32. Concurrently, the crackling noise increases in intensity and the waiting time between avalanches is reduced when the major collapse is approaching. These latter criteria were so far deemed too unreliable for safety assessments in coal mines. We report a reassessment of previously collected extensive collapse data sets using "record-breaking analysis," based on the statistical appearance of "superjerks" within a smaller spectrum of collapse events. Superjerks are defined as avalanche signals with energies that surpass those of all previous events. The final major collapse is one such superjerk, but other "near collapse" events equally qualify. In this way a very large data set of events is reduced to a sparse sequence of superjerks (21 in our coal sample). The main collapse can be anticipated from the sequence of energies and waiting times of superjerks, ignoring all weaker events. Superjerks are excellent indicators for the temporal evolution, and reveal clear nonstationarity of the crackling noise at constant loading rate, as well as self-similarity in the energy distribution of superjerks as a function of the number of events so far in the sequence, E_sj ~ n^δ with δ = 1.79. They are less robust in identifying the precise time of the final collapse, however, than the shift of the energy exponents in the whole data set, which occurs only over a short time interval just before the major event. Nevertheless, they provide additional diagnostics that may increase the reliability of such forecasts.
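
    Extracting the superjerk sequence from an avalanche catalogue is a one-pass record-breaking scan. A minimal sketch follows (the energies are invented; the paper's data are not reproduced here):

    ```python
    # Record-breaking ("superjerk") extraction: keep only events whose
    # energy exceeds that of every earlier event. Energies are invented.
    def superjerks(energies):
        best, records = float("-inf"), []
        for i, e in enumerate(energies):
            if e > best:
                best = e
                records.append((i, e))   # (event index, record energy)
        return records

    catalogue = [3.1, 1.2, 7.8, 4.0, 7.9, 2.2, 55.0, 9.1]
    print(superjerks(catalogue))  # the full catalogue collapses to 3 records
    ```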

  14. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.

  15. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  16. New ATLAS event generator tunes to 2010 data

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

    This note describes the Monte Carlo event generator tunings for the Pythia 6 and Herwig/Jimmy generators in the ATLAS MC11 simulation production. New tunes have been produced for these generators, making maximal use of available published data from ATLAS and from the Tevatron and LEP experiments. Particular emphasis has been placed on improvement of the description of e+ e− event shape and jet rate data, and on description of hadron collider event shape observables in Pythia, as well as the established procedure of tuning the multiple parton interactions of both models to describe underlying event and minimum bias data. The tuning of Pythia is provided at this time for the MRST LO** PDF, while the purely MPI tune of Herwig/Jimmy is performed for ten different PDFs.

  17. Microprocessor event analysis in parallel with Camac data acquisition

    International Nuclear Information System (INIS)

    Cords, D.; Eichler, R.; Riege, H.

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a Camac system (GEC-ELLIOTT System Crate) and shares the Camac access with a Nord-10S computer. Interfaces have been designed and tested for execution of Camac cycles, communication with the Nord-10S computer and DMA transfer from Camac to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the result of various checks is appended to the event. In case of spurious triggers or clear beam-gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (orig.)

  18. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC System (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer and DMA-transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition-system at PETRA where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the results of various checks are appended to the event. In case of spurious triggers or clear beam gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  19. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, so it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
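
    One concrete consequence of a temporal, location-oriented model is that raw reader observations (tag seen at reader R at time t) are coalesced into stay intervals (tag at location from t_start to t_end). A small sketch of that derivation, with an invented gap threshold:

    ```python
    # Coalesce raw RFID reads (t, tag, location) into stay intervals
    # (tag, location, t_start, t_end); max_gap is an invented parameter.
    def stays(reads, max_gap=5.0):
        """reads must be sorted by time t."""
        closed, open_stay = [], {}
        for t, tag, loc in reads:
            cur = open_stay.get(tag)
            if cur and cur[0] == loc and t - cur[2] <= max_gap:
                open_stay[tag] = (loc, cur[1], t)       # extend current stay
            else:
                if cur:
                    closed.append((tag, cur[0], cur[1], cur[2]))
                open_stay[tag] = (loc, t, t)            # begin a new stay
        closed.extend((tag, *s) for tag, s in open_stay.items())
        return closed

    reads = [(0.0, "T1", "dock"), (2.0, "T1", "dock"), (20.0, "T1", "shelf3")]
    print(stays(reads))  # [('T1', 'dock', 0.0, 2.0), ('T1', 'shelf3', 20.0, 20.0)]
    ```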

  20. Court Upholds Confidentiality of Research Records/Data.

    Science.gov (United States)

    Florio, David H.

    1980-01-01

    Reviews the background of the Forsham v Harris case and discusses the implications of the Supreme Court's ruling that research records and data of federally funded grantees are not considered federal agency records subject to disclosure under the Freedom of Information Act. (Author/GC)

  1. 40 CFR 1065.202 - Data updating, recording, and control.

    Science.gov (United States)

    2010-07-01

    ... Table 1 of § 1065.202 (excerpt), minimum command and control frequency and minimum recording frequency by measurement: § 1065.510, speed and torque during an engine step-map - command and control 1 Hz, recording 1 mean value per step; § 1065.510, speed and torque during an engine sweep-map - command and control 5 Hz, recording 1 Hz. ... POLLUTION CONTROLS, ENGINE-TESTING PROCEDURES, Measurement Instruments, § 1065.202 Data updating, recording...

  2. Good Laboratory Practice. Part 2. Recording and Retaining Raw Data

    Science.gov (United States)

    Wedlich, Richard C.; Libera, Agata E.; Pires, Amanda; Tellarini, Cassandra

    2013-01-01

    A clear understanding of how "raw data" is defined, recorded, and retained in the laboratory record is essential to the chemist employed in the laboratory compliant with the Good Laboratory Practices regulations. This article is intended to provide an understanding by drawing upon examples taken from the modern pharmaceutical analysis…

  3. Data compression systems for home-use digital video recording

    NARCIS (Netherlands)

    With, de P.H.N.; Breeuwer, M.; van Grinsven, P.A.M.

    1992-01-01

    The authors focus on image data compression techniques for digital recording. Image coding for storage equipment covers a large variety of systems because the applications differ considerably in nature. Video coding systems suitable for digital TV and HDTV recording and digital electronic still...

  4. Record transfer of data between CERN and California

    CERN Document Server

    2003-01-01

    A data transfer record has been broken by transmitting at a rate of 2.38 gigabits per second for more than one hour between CERN and Sunnyvale in California, a distance of more than 10,000 km. This record-breaking performance was achieved in the framework of tests to develop a high-speed global network for the future computing grid.

  5. Data Mining of NASA Boeing 737 Flight Data: Frequency Analysis of In-Flight Recorded Data

    Science.gov (United States)

    Butterfield, Ansel J.

    2001-01-01

    Data recorded during flights of the NASA Trailblazer Boeing 737 have been analyzed to ascertain the presence of aircraft structural responses from various excitations such as the engine, aerodynamic effects, wind gusts, and control system operations. The NASA Trailblazer Boeing 737 was chosen as a focus of the study because of a large quantity of its flight data records. The goal of this study was to determine if any aircraft structural characteristics could be identified from flight data collected for measuring non-structural phenomena. A number of such data were examined for spatial and frequency correlation as a means of discovering hidden knowledge of the dynamic behavior of the aircraft. Data recorded from on-board dynamic sensors over a range of flight conditions showed consistently appearing frequencies. Those frequencies were attributed to aircraft structural vibrations.
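
    The basic operation behind such a study is a spectral estimate per recorded channel. A generic numpy sketch follows (the signal below is synthetic; the Trailblazer records themselves are not reproduced here):

    ```python
    # Estimate the dominant frequency in one recorded channel via the FFT.
    # Sampling rate and test signal are synthetic stand-ins.
    import numpy as np

    fs = 200.0                              # assumed sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    channel = np.sin(2 * np.pi * 12.5 * t) + 0.3 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(channel * np.hanning(t.size)))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    print(f"dominant frequency: {freqs[spectrum.argmax()]:.1f} Hz")  # ~12.5
    ```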

  6. Earth Science Data Fusion with Event Building Approach

    Science.gov (United States)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of the SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  7. A study of data representation in Hadoop to optimize data storage and search performance for the ATLAS EventIndex

    Science.gov (United States)

    Baranowski, Z.; Canali, L.; Toebbicke, R.; Hrivnac, J.; Barberis, D.

    2017-10-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each of which consists of ∼100 bytes, all having the same probability to be searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We also report on the use of HBase for the EventIndex, focussing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface, and the optimizations for data warehouse workloads and reports.
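
    Comparing on-disk formats and compression codecs of the kind listed above can be prototyped outside Hadoop with pyarrow (an assumption for illustration; the paper's measurements were made on the production Hadoop system, and the field names below are invented):

    ```python
    # Write EventIndex-like ~100-byte records as Parquet under several
    # compression codecs; pyarrow and the schema are illustrative only.
    import pyarrow as pa
    import pyarrow.parquet as pq

    table = pa.table({
        "run_number":   [358031, 358031, 358031],
        "event_number": [4168640, 4168641, 4168642],
        "file_guid":    ["guid-0", "guid-1", "guid-2"],
    })

    for codec in ("snappy", "gzip", "zstd"):
        pq.write_table(table, f"events_{codec}.parquet", compression=codec)
    ```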

  8. A study of data representation in Hadoop to optimise data storage and search performance for the ATLAS EventIndex

    CERN Document Server

    AUTHOR|(CDS)2078799; The ATLAS collaboration; Canali, Luca; Toebbicke, Rainer; Hrivnac, Julius; Barberis, Dario

    2017-01-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each of which consists of ∼100 bytes, all having the same probability of being searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats, including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We also report on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface and its optimizations for data warehouse workloads and reports.

  9. A study of data representations in Hadoop to optimize data storage and search performance of the ATLAS EventIndex

    CERN Document Server

    Baranowski, Zbigniew; The ATLAS collaboration

    2016-01-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each consisting of ~100 bytes, all having the same probability of being searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats, including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. This paper reports on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface and its optimizations for data warehouse workloads and reports.
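
    The three EventIndex records above all turn on the same mechanics: packing small, fixed-size event records into a columnar format so that counting and searching touch as little data as possible. The following is a minimal, hypothetical sketch of that idea in Python with pyarrow; the field names and values are illustrative, not the ATLAS schema, and this is not the collaboration's code.

    ```python
    import pyarrow as pa
    import pyarrow.parquet as pq

    # Toy event records mirroring the ~100-byte EventIndex layout; the field
    # names are illustrative, not the actual ATLAS schema.
    records = pa.table({
        "run_number":   pa.array([358031, 358031, 358096], type=pa.int32()),
        "lumi_block":   pa.array([55, 55, 17], type=pa.int32()),
        "event_number": pa.array([1001, 1002, 2001], type=pa.int64()),
        "guid":         ["A1B2-0001", "A1B2-0001", "C3D4-0002"],
    })

    # A columnar format plus a lightweight codec is the kind of combination
    # the papers benchmark for storage footprint.
    pq.write_table(records, "eventindex.parquet", compression="snappy")

    # Counting or searching touches only the columns a query references.
    subset = pq.read_table("eventindex.parquet", columns=["run_number"])
    print(subset.num_rows)  # -> 3
    ```

    Columnar layouts pay off exactly in the access pattern the papers describe: a query that counts events per run reads one small column instead of whole ~100-byte records.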

  10. The National Extreme Events Data and Research Center (NEED)

    Science.gov (United States)

    Gulledge, J.; Kaiser, D. P.; Wilbanks, T. J.; Boden, T.; Devarakonda, R.

    2014-12-01

    The Climate Change Science Institute at Oak Ridge National Laboratory (ORNL) is establishing the National Extreme Events Data and Research Center (NEED), with the goal of transforming how the United States studies and prepares for extreme weather events in the context of a changing climate. NEED will encourage the myriad, distributed extreme events research communities to move toward the adoption of common practices and will develop a new database compiling global historical data on weather- and climate-related extreme events (e.g., heat waves, droughts, hurricanes, etc.) and related information about impacts, costs, recovery, and available research. Currently, extreme event information is not easy to access and is largely incompatible and inconsistent across web sites. NEED's database development will take into account differences in time frames, spatial scales, treatments of uncertainty, and other parameters and variables, and leverage informatics tools developed at ORNL (i.e., the Metadata Editor [1] and Mercury [2]) to generate standardized, robust documentation for each database along with a web-searchable catalog. In addition, NEED will facilitate convergence on commonly accepted definitions and standards for extreme events data and will enable integrated analyses of coupled threats, such as hurricanes/sea-level rise/flooding and droughts/wildfires. Our goal and vision is that NEED will become the premier integrated resource for the general study of extreme events. References: [1] Devarakonda, Ranjeet, et al. "OME: Tool for generating and managing metadata to handle BigData." Big Data (Big Data), 2014 IEEE International Conference on. IEEE, 2014. [2] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94.

  11. Tablet computers for recording tuberculosis data at a community ...

    African Journals Online (AJOL)

    Tablet computers for recording tuberculosis data at a community health centre in King Sabata Dalindyebo Local Municipality, ... South African Family Practice ... They expressed a desire to extend the use of tablets to other areas of their work.

  12. Usage reporting on recorded lectures using educational data mining

    NARCIS (Netherlands)

    Gorissen, Pierre; Van Bruggen, Jan; Jochems, Wim

    2012-01-01

    Gorissen, P., Van Bruggen, J., & Jochems, W. M. G. (2012). Usage reporting on recorded lectures using educational data mining. International Journal of Learning Technology, 7, 23-40. doi:10.1504/IJLT.2012.046864

  13. MRO CRISM MULTISPECTRAL REDUCED DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset contains CRISM Multispectral Reduced Data Records (MRDRs). MRDRs are organized into 30 subdirectories named by the Mars Chart containing the MRDR, e.g....

  14. Physicists set new record for network data transfer

    CERN Multimedia

    2007-01-01

    "An international team of physicists, computer scientists, and network engineers joined forces to set new records for sustained data transfer between storage systems durint the SuperComputing 2006 (SC06) Bandwidth Challenge (BWC). (3 pages)

  15. Platform links clinical data with electronic health records

    Science.gov (United States)

    To make data gathered from patients in clinical trials available for use in standard care, NCI has created a new computer tool to support interoperability between clinical research and electronic health record systems. This new software represents an inno

  16. [Assessing the economic impact of adverse events in Spanish hospitals by using administrative data].

    Science.gov (United States)

    Allué, Natalia; Chiarello, Pietro; Bernal Delgado, Enrique; Castells, Xavier; Giraldo, Priscila; Martínez, Natalia; Sarsanedas, Eugenia; Cots, Francesc

    2014-01-01

    To evaluate the incidence and costs of adverse events registered in an administrative dataset in Spanish hospitals from 2008 to 2010. A retrospective study was carried out that estimated the incremental cost per episode, depending on the presence of adverse events. Costs were obtained from the database of the Spanish Network of Hospital Costs. This database contains data from 12 hospitals that have costs per patient records based on activities and clinical records. Adverse events were identified through the Patient Safety Indicators (validated in the Spanish Health System) created by the Agency for Healthcare Research and Quality together with indicators of the EuroDRG European project. This study included 245,320 episodes with a total cost of 1,308,791,871€. Approximately 17,000 patients (6.8%) experienced an adverse event, representing 16.2% of the total cost. Adverse events, adjusted by diagnosis-related groups, added a mean incremental cost of between €5,260 and €11,905. Six of the 10 adverse events with the highest incremental cost were related to surgical interventions. The total incremental cost of adverse events was € 88,268,906, amounting to an additional 6.7% of total health expenditure. Assessment of the impact of adverse events revealed that these episodes represent significant costs that could be reduced by improving the quality and safety of the Spanish Health System.

  17. RECORD OF THE BINARY DATA ON SD CARD ARDUINO DUE

    Directory of Open Access Journals (Sweden)

    V. G. Mikhailov

    2016-01-01

    A short review of the Arduino microcontroller family, its characteristics, and its fields of application is given. Recording the parameters of the object under study is important for debugging control systems built on Arduino microcontrollers. The only built-in way to log parameters in the Arduino family is to write them to an SD card in character mode using the print() and write() functions. The problems connected with writing binary data to an SD card on the Arduino Due microcontroller are considered: memory that is not cleared of the previous program, which can lead to duplication of data on the SD card; the erroneous view that the volume of recorded data is restricted; and the supposed need to use obsolete SD cards. Ways of eliminating these shortcomings are considered, and the speed of various approaches to writing data to an SD card is estimated. On the basis of this work, an approach is proposed that multiplexes the written information by converting the binary data, byte by byte, into an ASCII character array without increasing its volume and writing it in blocks of 240 bytes. This makes maximal use of the standard Arduino write() function and of the memory organization of SD cards, and increases the write speed more than 1100-fold compared with character-mode writing one byte at a time. It is noted that the workarounds for data duplication proposed on forums do not eliminate the problem completely; on the Arduino Due, clearing the memory requires either a special programmer or the installation of a new boot program.

  18. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  19. High-Speed Data Recorder for Space, Geodesy, and Other High-Speed Recording Applications

    Science.gov (United States)

    Taveniku, Mikael

    2013-01-01

    A high-speed data recorder and replay equipment has been developed for reliable high-data-rate recording to disk media. It solves problems with slow or faulty disks, multiple disk insertions, high-altitude operation, reliable performance using COTS hardware, and long-term maintenance and upgrade path challenges. The current generation data recorders used within the VLBI community are aging, special-purpose machines that are both slow (do not meet today's requirements) and very expensive to maintain and operate. Furthermore, they are not easily upgraded to take advantage of commercial technology development, and are not scalable to the multiple 10s of Gbit/s data rates required by new applications. The innovation provides a software-defined, high-speed data recorder that is scalable with technology advances in the commercial space. It maximally utilizes current technologies without being locked to a particular hardware platform. The innovation also provides a cost-effective way of streaming large amounts of data from sensors to disk, enabling many applications to store raw sensor data and perform post and signal processing offline. This recording system will be applicable to many applications needing real-world, high-speed data collection, including electronic warfare, software-defined radar, signal history storage of multispectral sensors, development of autonomous vehicles, and more.

  20. Development of a Method to Compensate for Signal Quality Variations in Repeated Auditory Event-Related Potential Recordings

    Science.gov (United States)

    Paukkunen, Antti K. O.; Leminen, Miika M.; Sepponen, Raimo

    2010-01-01

    Reliable measurements are mandatory in clinically relevant auditory event-related potential (AERP)-based tools and applications. The comparability of the results gets worse as a result of variations in the remaining measurement error. A potential method is studied that allows optimization of the length of the recording session according to the concurrent quality of the recorded data. In this way, the sufficiency of the trials can be better guaranteed, which enables control of the remaining measurement error. The suggested method is based on monitoring the signal-to-noise ratio (SNR) and remaining measurement error which are compared to predefined threshold values. The SNR test is well defined, but the criterion for the measurement error test still requires further empirical testing in practice. According to the results, the reproducibility of average AERPs in repeated experiments is improved in comparison to a case where the number of recorded trials is constant. The test-retest reliability is not significantly changed on average but the between-subject variation in the value is reduced by 33–35%. The optimization of the number of trials also prevents excessive recordings which might be of practical interest especially in the clinical context. The efficiency of the method may be further increased by implementing online tools that improve data consistency. PMID:20407635
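
    The stopping rule this abstract describes (keep recording until the concurrent data quality passes a threshold) can be made concrete. Below is a minimal Python sketch assuming a plus-minus-average noise estimate; the SNR estimator and the threshold values are placeholder assumptions for illustration, not the authors' criteria.

    ```python
    import numpy as np

    def snr_estimate(trials: np.ndarray) -> float:
        """Crude SNR of the across-trial average for an (n_trials, n_samples) array.

        The plus-minus average (sign-alternated mean) cancels the event-locked
        signal, leaving a noise estimate with the same averaging factor.
        """
        avg = trials.mean(axis=0)                       # signal + residual noise
        signs = np.where(np.arange(trials.shape[0]) % 2 == 0, 1.0, -1.0)
        noise = (trials * signs[:, None]).mean(axis=0)  # noise only
        return float(np.var(avg) / np.var(noise)) - 1.0 # subtract the noise share

    def record_until_reliable(trial_stream, snr_min=3.0, n_min=50, n_max=1000):
        """Accept trials until the concurrent SNR passes the threshold."""
        kept = []
        for trial in trial_stream:
            kept.append(np.asarray(trial))
            if len(kept) >= n_min and snr_estimate(np.stack(kept)) >= snr_min:
                break
            if len(kept) >= n_max:                      # cap excessive recording
                break
        return np.stack(kept)
    ```

    The point of the design is the same as in the abstract: the session length adapts to the concurrent quality of the data, so short sessions suffice for clean recordings while noisy ones continue until the remaining measurement error is controlled.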

  1. Data-driven approach for creating synthetic electronic medical records.

    Science.gov (United States)

    Buczak, Anna L; Babin, Steven; Moniz, Linda

    2010-10-14

    New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results, for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases. The pilot synthetic background records were in the 4-11 yr age group.

  2. Tool for Constructing Data Albums for Significant Weather Events

    Science.gov (United States)

    Kulkarni, A.; Ramachandran, R.; Conover, H.; McEniry, M.; Goodman, H.; Zavodsky, B. T.; Braun, S. A.; Wilson, B. D.

    2012-12-01

    Case study analysis and climatology studies are common approaches used in Atmospheric Science research. Research based on case studies involves a detailed description of specific weather events using data from different sources, to characterize physical processes in play for a given event. Climatology-based research tends to focus on the representativeness of a given event, by studying the characteristics and distribution of a large number of events. Gathering relevant data and information for case studies and climatology analysis is tedious and time-consuming; current Earth Science data systems are not suited to assembling multi-instrument, multi-mission datasets around specific events. For example, in hurricane science, finding airborne or satellite data relevant to a given storm requires searching through web pages and data archives. Background information related to damages, deaths, and injuries requires extensive online searches for news reports and official storm summaries. We will present a knowledge synthesis engine to create curated "Data Albums" to support case study analysis and climatology studies. The technological challenges in building such a reusable and scalable knowledge synthesis engine are several. First, how to encode domain knowledge in a machine-usable form? This knowledge must capture what information and data resources are relevant and the semantic relationships between the various fragments of information and data. Second, how to extract semantic information from various heterogeneous sources, including unstructured texts, using the encoded knowledge? Finally, how to design a structured database from the encoded knowledge to store all information and to support querying? The structured database must allow both knowledge overviews of an event as well as drill-down capability needed for detailed analysis. An application-ontology-driven framework is being used to design the knowledge synthesis engine. The knowledge synthesis engine is being

  3. Pooling overdispersed binomial data to estimate event rate.

    Science.gov (United States)

    Young-Xu, Yinong; Chan, K Arnold

    2008-08-19

    The beta-binomial model is one of the methods that can be used to validly combine event rates from overdispersed binomial data. Our objective is to provide a full description of this method and to update and broaden its applications in clinical and public health research. We describe the statistical theories behind the beta-binomial model and the associated estimation methods. We supply information about statistical software that can provide beta-binomial estimations. Using a published example, we illustrate the application of the beta-binomial model when pooling overdispersed binomial data. In an example regarding the safety of oral antifungal treatments, we had 41 treatment arms with event rates varying from 0% to 13.89%. Using the beta-binomial model, we obtained a summary event rate of 3.44% with a standard error of 0.59%. The parameters of the beta-binomial model took the values of 1.24 for alpha and 34.73 for beta. The beta-binomial model can provide a robust estimate for the summary event rate by pooling overdispersed binomial data from different studies. The explanation of the method and the demonstration of its applications should help researchers incorporate the beta-binomial method as they aggregate probabilities of events from heterogeneous studies.

  4. Pooling overdispersed binomial data to estimate event rate

    Directory of Open Access Journals (Sweden)

    Chan K Arnold

    2008-08-01

    Background: The beta-binomial model is one of the methods that can be used to validly combine event rates from overdispersed binomial data. Our objective is to provide a full description of this method and to update and broaden its applications in clinical and public health research. Methods: We describe the statistical theories behind the beta-binomial model and the associated estimation methods. We supply information about statistical software that can provide beta-binomial estimations. Using a published example, we illustrate the application of the beta-binomial model when pooling overdispersed binomial data. Results: In an example regarding the safety of oral antifungal treatments, we had 41 treatment arms with event rates varying from 0% to 13.89%. Using the beta-binomial model, we obtained a summary event rate of 3.44% with a standard error of 0.59%. The parameters of the beta-binomial model took the values of 1.24 for alpha and 34.73 for beta. Conclusion: The beta-binomial model can provide a robust estimate for the summary event rate by pooling overdispersed binomial data from different studies. The explanation of the method and the demonstration of its applications should help researchers incorporate the beta-binomial method as they aggregate probabilities of events from heterogeneous studies.
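
    The numbers reported in the two records above are internally consistent and worth making explicit. Under the beta-binomial model, each arm's event count follows a binomial distribution whose rate is drawn from a Beta distribution, and the pooled rate is the Beta mean:

    \[ x_i \sim \mathrm{Binomial}(n_i, p_i), \qquad p_i \sim \mathrm{Beta}(\alpha, \beta), \qquad \hat{p} = \frac{\hat{\alpha}}{\hat{\alpha} + \hat{\beta}} = \frac{1.24}{1.24 + 34.73} \approx 3.45\%, \]

    which matches the reported 3.44% summary rate up to rounding. The fitted overdispersion (intra-arm correlation) is \( \hat{\rho} = 1/(\hat{\alpha} + \hat{\beta} + 1) \approx 0.027 \), which is what allows arm-to-arm rates to range from 0% to 13.89% around a 3.44% mean.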

  5. Sharing adverse drug event data using business intelligence technology.

    Science.gov (United States)

    Horvath, Monica M; Cozart, Heidi; Ahmad, Asif; Langman, Matthew K; Ferranti, Jeffrey

    2009-03-01

    Duke University Health System uses computerized adverse drug event surveillance as an integral part of medication safety at 2 community hospitals and an academic medical center. This information must be swiftly communicated to organizational patient safety stakeholders to find opportunities to improve patient care; however, this process is encumbered by highly manual methods of preparing the data. Following the examples of other industries, we deployed a business intelligence tool to provide dynamic safety reports on adverse drug events. Once data were migrated into the health system data warehouse, we developed census-adjusted reports with user-driven prompts. Drill down functionality enables navigation from aggregate trends to event details by clicking report graphics. Reports can be accessed by patient safety leadership either through an existing safety reporting portal or the health system performance improvement Web site. Elaborate prompt screens allow many varieties of reports to be created quickly by patient safety personnel without consultation with the research analyst. The reduction in research analyst workload because of business intelligence implementation made this individual available to additional patient safety projects thereby leveraging their talents more effectively. Dedicated liaisons are essential to ensure clear communication between clinical and technical staff throughout the development life cycle. Design and development of the business intelligence model for adverse drug event data must reflect the eccentricities of the operational system, especially as new areas of emphasis evolve. Future usability studies examining the data presentation and access model are needed.

  6. A 230 ka record of glacial and interglacial events from Aurora Cave, Fiordland, New Zealand

    International Nuclear Information System (INIS)

    Williams, P.W.

    1996-01-01

    Caves overrun by glaciers are known to accumulate dateable evidence of past glacial and interglacial events. Results are reported from an investigation of Aurora Cave on the slopes above Lake Te Anau in Fiordland. The cave commenced to form before c. 230 ka B.P. Sequences of glaciofluvial sediments interbedded with speleothems are evidence of the number and timing of glacial advances and the status of intervals between them. Twenty-six uranium series dates on speleothems underpin a chronology of seven glacial advances in the last 230 ka, with the peak of the late Otira glaciation, Aurora 3 advance, at c. 19 ka B.P. With five advances in the Otiran, the last glaciation is more complex than previously recognised. Comparison of the record with that recorded offshore from DSDP Site 594 reveals little matching, but the correspondence of the Aurora sequence with that interpreted from other onshore deposits is more convincing. Glacial deposits on slopes above the cave for a further 660 m may be evidence of the 'missing' glacial events of the mid-early Pleistocene. (author). 44 refs., 12 figs., 5 tabs

  7. An overview of European efforts in generating climate data records

    NARCIS (Netherlands)

    Su, Z.; Timmermans, W.J.; Zeng, Y.; Schulz, J.; John, V.O.; Roebeling, R.A.; Poli, P.; Tan, D.; Kaspar, F.; Kaiser-Weiss, A.; Swinnen, E.; Tote, C.; Gregow, H.; Manninen, T.; Riihela, A.; Calvet, J.C.; Ma, Yaoming; Wen, Jun

    2018-01-01

    The Coordinating Earth Observation Data Validation for Reanalysis for Climate Services project (CORE-CLIMAX) aimed to substantiate how Copernicus observations and products can contribute to climate change analyses. CORE-CLIMAX assessed the European capability to provide climate data records (CDRs)

  8. Data Recording in Performance Management: Trouble With the Logics

    Science.gov (United States)

    Groth Andersson, Signe; Denvall, Verner

    2017-01-01

    In recent years, performance management (PM) has become a buzzword in public sector organizations. Well-functioning PM systems rely on valid performance data, but critics point out that conflicting rationale or logic among professional staff in recording information can undermine the quality of the data. Based on a case study of social service…

  9. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi...

  10. Events in time: Basic analysis of Poisson data

    International Nuclear Information System (INIS)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given

  11. Events in time: Basic analysis of Poisson data

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
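
    For reference, the point estimate and the exact confidence interval for a Poisson rate, the quantities these two records describe, are standard; the worked numbers below are illustrative, not taken from the report. With \(n\) events observed in exposure time \(t\) and confidence level \(1-\alpha\),

    \[ \hat{\lambda} = \frac{n}{t}, \qquad \left[ \frac{\chi^2_{\alpha/2,\,2n}}{2t},\; \frac{\chi^2_{1-\alpha/2,\,2(n+1)}}{2t} \right]. \]

    For example, n = 5 events in t = 1000 operating hours gives \(\hat{\lambda} = 0.005\) per hour, with a 95% interval of \([3.247/2000,\; 23.337/2000] \approx [0.0016,\; 0.0117]\) per hour.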

  12. Identification of major cardiovascular events in patients with diabetes using primary care data.

    Science.gov (United States)

    Pouwels, Koen Bernardus; Voorham, Jaco; Hak, Eelko; Denig, Petra

    2016-04-02

    Routine primary care data are increasingly being used for evaluation and research purposes but there are concerns about the completeness and accuracy of diagnoses and events captured in such databases. We evaluated how well patients with major cardiovascular disease (CVD) can be identified using primary care morbidity data and drug prescriptions. The study was conducted using data from 17,230 diabetes patients of the GIANTT database and Dutch Hospital Data register. To estimate the accuracy of the different measures, we analyzed the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) relative to hospitalizations and/or records with a diagnosis indicating major CVD, including ischaemic heart diseases and cerebrovascular events. Using primary care morbidity data, 43% of major CVD hospitalizations could be identified. Adding drug prescriptions to the search increased the sensitivity up to 94%. A proxy of at least one prescription of either a platelet aggregation inhibitor, vitamin k antagonist or nitrate could identify 85% of patients with a history of major CVD recorded in primary care, with an NPV of 97%. Using the same proxy, 57% of incident major CVD recorded in primary or hospital care could be identified, with an NPV of 99%. A substantial proportion of major CVD hospitalizations was not recorded in primary care morbidity data. Drug prescriptions can be used in addition to diagnosis codes to identify more patients with major CVD, and also to identify patients without a history of major CVD.
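
    For reference, the accuracy measures quoted in this record are the standard contingency-table quantities, with TP/FP/TN/FN the true/false positives/negatives against the hospital reference standard:

    \[ \text{sensitivity} = \frac{TP}{TP + FN}, \quad \text{specificity} = \frac{TN}{TN + FP}, \quad \text{PPV} = \frac{TP}{TP + FP}, \quad \text{NPV} = \frac{TN}{TN + FN}. \]

    Read this way, the prescription proxy's 85% sensitivity and 97% NPV mean that 85 of every 100 patients with a recorded major-CVD history are flagged, while 97 of every 100 unflagged patients indeed have no recorded history.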

  13. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, Electric Power Research Institute Fire Loss Data and then collated into one database developed in the personal computer database management system, dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire such as: location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are

  14. Identification of unusual events in multi-channel bridge monitoring data

    Science.gov (United States)

    Omenzetter, Piotr; Brownjohn, James Mark William; Moyo, Pilate

    2004-03-01

    Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to replace visual inspection for assessment of condition and soundness of civil infrastructure such as bridges. However, converting large amounts of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localising sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.

  15. Identification of unusual events in multichannel bridge monitoring data using wavelet transform and outlier analysis

    Science.gov (United States)

    Omenzetter, Piotr; Brownjohn, James M. W.; Moyo, Pilate

    2003-08-01

    Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to replace visual inspection for assessment of condition and soundness of civil infrastructure. However, converting large amounts of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure in Singapore and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localizing sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones, wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
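
    The two bridge-monitoring records above combine a wavelet transform (to separate abrupt strain changes from slow drift) with multivariate outlier detection (to find and localize events across sensors). A minimal Python sketch of that combination follows; the library choices, wavelet, and chi-square threshold are assumptions for illustration, not the authors' implementation.

    ```python
    import numpy as np
    import pywt
    from scipy.stats import chi2

    def abrupt_event_scores(strains: np.ndarray, wavelet: str = "db4") -> np.ndarray:
        """Score abrupt multi-sensor changes in hourly static strain records.

        strains: (n_samples, n_sensors) array. Returns the Mahalanobis distance
        of the level-1 wavelet detail coefficients at each coefficient index;
        detail coefficients respond to sharp changes but not to slow drift.
        """
        details = np.stack(
            [pywt.downcoef("d", strains[:, j], wavelet, level=1)
             for j in range(strains.shape[1])],
            axis=1,
        )
        diff = details - details.mean(axis=0)
        inv_cov = np.linalg.inv(np.cov(details, rowvar=False))
        return np.einsum("ij,jk,ik->i", diff, inv_cov, diff)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        strains = rng.normal(size=(2000, 6))   # six synthetic strain channels
        strains[1200:, 2] += 8.0               # inject a large sudden shift
        scores = abrupt_event_scores(strains)
        cutoff = chi2.ppf(0.999, df=strains.shape[1])
        # Indices near the injected jump (plus occasional false alarms).
        print(np.flatnonzero(scores > cutoff))
    ```

    The design choice mirrors the abstracts: the wavelet detail coefficients are insensitive to slowly varying strain, so the outlier statistic fires on sudden events rather than on seasonal or thermal drift.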

  16. The Boltysh crater record of rapid vegetation change during the Dan-C2 hyperthermal event.

    Science.gov (United States)

    Jolley, D. W.; Daly, R.; Gilmour, I.; Gilmour, M.; Kelley, S. P.

    2012-04-01

    Analysis of a cored borehole drilled through the sedimentary fill of the 24 km wide Boltysh meteorite crater, Ukraine, has yielded a unique, high-resolution record based on pollen and algae. These records reflect environmental change from the K/Pg boundary [1] to the post-Dan-C2 Danian. Leading into the CIE, warm temperate gymnosperm-angiosperm-fern communities are replaced by precipitation-limited (winterwet) plant communities within the negative CIE. Winterwet plant communities dominate the negative CIE, but are replaced within the isotope recovery stage by warm temperate floras. These in turn give way to cooler temperate floras in the post-positive-CIE section of the uppermost crater fill. The distribution of temperate taxa about the negative CIE represents the broadest scale of oscillatory variation in the palynofloras. Shorter-frequency oscillations are evident from diversity and botanical group distributions, reflecting changes in moisture availability over several thousand years. Detailed analysis of variability within one of these oscillations records plant community cyclicity across the inception of the negative CIE. This short-term cyclicity provides evidence that the replacement of warm temperate by winterwet floras occurred in a stepwise manner at the negative CIE, suggesting cumulative atmospheric forcing. At <1 mm scale, laminations within the negative CIE showed no obvious lithological or colour differences, and are not seasonal couplets. However, palynofloral analysis of laminations from within the negative CIE has yielded evidence of annual variation, identifying the potential for recording changes in 'paleoweather' across a major hyperthermal event. [1] Jolley, D. W. et al. (2010) Geology 38, 835-838.

  17. Relational databases for conditions data and event selection in ATLAS

    International Nuclear Information System (INIS)

    Viegas, F; Hawkings, R; Dimitrov, G

    2008-01-01

    The ATLAS experiment at LHC will make extensive use of relational databases in both online and offline contexts, running to O(TBytes) per year. Two of the most challenging applications in terms of data volume and access patterns are conditions data, making use of the LHC conditions database, COOL, and the TAG database, that stores summary event quantities allowing a rapid selection of interesting events. Both of these databases are being replicated to regional computing centres using Oracle Streams technology, in collaboration with the LCG 3D project. Database optimisation, performance tests and first user experience with these applications will be described, together with plans for first LHC data-taking and future prospects

  18. Relational databases for conditions data and event selection in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Viegas, F; Hawkings, R; Dimitrov, G [CERN, CH-1211 Geneve 23 (Switzerland)

    2008-07-15

    The ATLAS experiment at LHC will make extensive use of relational databases in both online and offline contexts, running to O(TBytes) per year. Two of the most challenging applications in terms of data volume and access patterns are conditions data, making use of the LHC conditions database, COOL, and the TAG database, that stores summary event quantities allowing a rapid selection of interesting events. Both of these databases are being replicated to regional computing centres using Oracle Streams technology, in collaboration with the LCG 3D project. Database optimisation, performance tests and first user experience with these applications will be described, together with plans for first LHC data-taking and future prospects.

  19. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-01-01

    Aim: The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. Methods: A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. Results: From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites, with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for 'symptom'-related and 'mild' adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Conclusions: Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. PMID:26271492

  20. Building clinical data groups for electronic medical record in China.

    Science.gov (United States)

    Tu, Haibo; Yu, Yingtao; Yang, Peng; Tang, Xuejun; Hu, Jianping; Rao, Keqin; Pan, Feng; Xu, Yongyong; Liu, Danhong

    2012-04-01

    This article aims at building clinical data groups for Electronic Medical Records (EMR) in China. These data groups can be reused as basic information units in building the medical sheets of Electronic Medical Record Systems (EMRS) and serve as part of its implementation guideline. The results were based on medical sheets, the forms used in hospitals, which were collected from hospitals. To categorize the information in these sheets into data groups, we adopted the Health Level 7 Clinical Document Architecture Release 2 Model (HL7 CDA R2 Model). The regulations and legal documents concerning health informatics and related standards in China were implemented. A set of 75 data groups with 452 data elements was created. These data elements were atomic items that comprised the data groups. Medical sheet items contained clinical records information and could be described by standard data elements that exist in current health document protocols. These data groups match different units of the CDA model. Twelve data groups with 87 standardized data elements described EMR headers, and 63 data groups with 405 standardized data elements constituted the body. The latter 63 data groups in fact formed the sections of the model. The data groups had two levels. Those at the first level contained both the second-level data groups and the standardized data elements. The data groups were basically reusable information units that served as guidelines for building EMRS and that were used to rebuild a medical sheet and serve as templates for the clinical records. As a pilot study of health information standards in China, the development of EMR data groups combined international standards with Chinese national regulations and standards, and this was the most critical part of the research. The original medical sheets from hospitals contain first-hand medical information, and some of their items reveal the data types characteristic of the Chinese socialist national health system.

  1. The role of attributions in the cognitive appraisal of work-related stressful events : An event-recording approach

    NARCIS (Netherlands)

    Peeters, MCW; Schaufeli, WB; Buunk, BP

    1995-01-01

    This paper describes a micro-analysis of the cognitive appraisal of daily stressful events in a sample of correctional officers (COs). More specifically, the authors examined whether three attribution dimensions mediated the relationship between the occurrence of stressful events and the

  2. Di-photon event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with a photon pair observed by the CMS detector in proton-proton collision data collected in 2015. The mass of the di-photon system is 750 GeV. Both photon candidates, with transverse momenta of 400 GeV and 230 GeV respectively, are reconstructed in the barrel region. The candidates are consistent with the expectation that they are prompt isolated photons.

  3. The use of holographic techniques for recording high-speed events

    International Nuclear Information System (INIS)

    Stepanov, B.M.; Filenko, Yu.I.

    The methods resulting from studies carried out using the commercial holographic device UIG-I are described. The device is intended for recording and investigating moving scenes and high-speed events by a holographic method. It consists of a quantum generator with a two-stage amplifier whose radiation energy in single-mode operation is 0.7 J, with a pulse width of 40 nsec for passive Q-switching. Hologram portrait making was one of the experiments that illustrate the possible applications of the device. Hologram portraits, such as group portraits and those that can be reconstructed in white light, were obtained on Micrat BP-2 and Agfa Gevaert plates

  4. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino, E-mail: martino@cardiol.br; Peixoto, Giselle de Lima; Lima, José Jayme Galvão de; Siqueira, Sérgio Freitas de; Costa, Roberto; Gowdak, Luís Henrique Wolff [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Paula, Flávio Jota de [Unidade de Transplante Renal - Divisão de Urologia do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Kalil Filho, Roberto; Ramires, José Antônio Franchini [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil)

    2015-11-15

    The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  5. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Directory of Open Access Journals (Sweden)

    Rodrigo Tavares Silva

    2015-11-01

    Background: The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective: We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods: A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results: During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002) and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions: In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  6. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Science.gov (United States)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino; Peixoto, Giselle de Lima; de Lima, José Jayme Galvão; de Siqueira, Sérgio Freitas; Costa, Roberto; Gowdak, Luís Henrique Wolff; de Paula, Flávio Jota; Kalil Filho, Roberto; Ramires, José Antônio Franchini

    2015-01-01

    Background The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT. PMID:26351983

  7. Data-driven approach for creating synthetic electronic medical records

    Directory of Open Access Journals (Sweden)

    Moniz Linda

    2010-10-01

    Background: New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. Methods: This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. Results: We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results, for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. Conclusions: A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases.

  8. Iridium abundance measurements across bio-event horizons in the geological record

    Science.gov (United States)

    Orth, C. J.; Attrep, M., Jr.

    1988-01-01

    Geochemical studies have been performed on thousands of rock samples collected across bio-event horizons in the fossil record, using INAA for about 40 common and trace elements and radiochemical isolation procedures for Os, Ir, Pt, and Au on selected samples. These studies were begun soon after the Alvarez team announced their discovery of the Cretaceous-Tertiary (K-T) Ir anomaly in marine rock sequences in Europe. With their encouragement, the authors searched for the anomaly in nearby continental (freshwater coal swamp) deposits. In collaboration with scientists from the U.S.G.S. in Denver, the anomaly was located and it was observed that a floral crisis occurred at the same stratigraphic position as the Ir spike. Further work in the Raton Basin has turned up numerous well-preserved K-T boundary sections. Although the authors have continued to study the K-T boundary and provide geochemical measurements for other groups trying to precisely locate it, the primary effort has turned to examining other bio-events in the Phanerozoic, especially those that are older than the terminal Cretaceous. A list of horizons examined in collaboration with paleontologists and geologists is given. Results are also given and discussed.

  9. Bridging data models and terminologies to support adverse drug event reporting using EHR data.

    Science.gov (United States)

    Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health record (EHR) data for post-marketing drug surveillance. An important component of this toolkit is a drug-related adverse events (AE) reporting system designed to facilitate and accelerate the reporting process using automatic prepopulation mechanisms. The objective is to demonstrate the SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Standard (e.g. HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via the SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g. ICD-9-CM, ICD-10, LOINC or SNOMED-CT) to the target terminology MedDRA, which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. The percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e. when the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually. In addition, most of these data elements do not need to be filled in every report. The SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could contribute to improving current spontaneous reporting practices and to reducing under-reporting, currently one of the major obstacles in the acquisition of pharmacovigilance data.
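
    The terminology-reasoning step described above reduces, at its simplest, to translating a source code recorded in the EHR (e.g. an ICD-10 code) into the MedDRA term an E2B report expects. The sketch below is a toy illustration with hypothetical pairings; real SALUS mappings are validated terminology-service lookups, not hard-coded dictionaries.

    ```python
    from typing import Optional

    # Toy excerpt of a source-to-target terminology mapping (hypothetical
    # pairings for illustration; not a validated SALUS mapping set).
    ICD10_TO_MEDDRA = {
        "R42": "Dizziness",           # ICD-10 "Dizziness and giddiness"
        "T78.4": "Hypersensitivity",  # ICD-10 "Allergy, unspecified"
    }

    def prepopulate_reaction_term(icd10_code: str) -> Optional[str]:
        """Return the MedDRA reaction term for an EHR diagnosis code, if mapped."""
        return ICD10_TO_MEDDRA.get(icd10_code)

    print(prepopulate_reaction_term("R42"))  # -> Dizziness
    ```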

  10. Lidar data assimilation for improved analyses of volcanic aerosol events

    Science.gov (United States)

    Lange, Anne Caroline; Elbern, Hendrik

    2014-05-01

    Observations of hazardous events involving the release of aerosols are hardly analyzable by today's data assimilation algorithms without producing an attenuating bias. Skillful forecasts of unexpected aerosol events are essential for protecting human health and preventing the exposure of infirm persons and aircraft, with possibly catastrophic outcomes. Typical cases include mineral dust outbreaks, mostly from large desert regions, wild fires, and sea salt uplifts, while the focus here is on volcanic eruptions. In general, numerical chemistry and aerosol transport models cannot simulate such events without manual adjustments. The concept of data assimilation is able to correct the analysis, as long as it is operationally implemented in the model system. However, the tangent-linear approximation, a substantial precondition for today's cutting-edge data assimilation algorithms, is not valid during unexpected aerosol events. As part of the European COPERNICUS (earth observation) project MACC II and the national ESKP (Earth System Knowledge Platform) initiative, we developed a module that enables the assimilation of aerosol lidar observations, even during unforeseeable incidences of extreme emissions of particulate matter. Thereby, the influence of the background information has to be reduced adequately. Advanced lidar instruments address on the one hand the aspect of radiative transfer within the atmosphere, and on the other hand they can deliver a detailed quantification of the detected aerosols. For the assimilation of maximally exploited lidar data, an appropriate lidar observation operator is constructed, compatible with the EURAD-IM (European Air Pollution and Dispersion - Inverse Model) system. The observation operator is able to map the modeled chemical and physical state onto lidar attenuated backscatter, transmission, aerosol optical depth, as well as onto the extinction and backscatter coefficients. Further, it has the ability to process the observed discrepancies with lidar

  11. Intra-event isotope and raindrop size data of tropical rain reveal effects concealed by event averaged data

    Science.gov (United States)

    Managave, S. R.; Jani, R. A.; Narayana Rao, T.; Sunilkumar, K.; Satheeshkumar, S.; Ramesh, R.

    2016-08-01

    Evaporation of rain is known to contribute water vapor, a potent greenhouse gas, to the atmosphere. Stable oxygen and hydrogen isotopic compositions (δ18O and δD, respectively) of precipitation, usually measured/presented as values integrated over rain events or as monthly means, are important tools for detecting evaporation effects. The slope ~8 of the linear relationship between such time-averaged values of δD and δ18O (called the meteoric water line) is widely accepted as a proof of condensation under isotopic equilibrium and absence of evaporation of rain during atmospheric fall. Here, through a simultaneous investigation of the isotopic and drop size distributions of seventeen rain events sampled on an intra-event scale at Gadanki (13.5°N, 79.2°E), southern India, we demonstrate that the evaporation effects, not evident in the time-averaged data, are significantly manifested in the sub-samples of individual rain events. We detect this through (1) slopes significantly less than 8 for the δD-δ18O relation on the intra-event scale and (2) significant positive correlations between deuterium excess (d-excess = δD - 8*δ18O; lower values in rain indicate evaporation) and the mass-weighted mean diameter of the raindrops (Dm). An estimated ~44% of rain is influenced by evaporation. This study also reveals a signature of isotopic equilibration of rain with the cloud-base vapor, a process important for modeling the isotopic composition of precipitation. d-excess values of rain are modified by post-condensation processes, and the present approach offers a way to identify the d-excess values least affected by such processes. Isotope-enabled global circulation models could be improved by incorporating intra-event isotopic data and raindrop-size-dependent isotopic effects.
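
    The two diagnostics used here, the intra-event δD-δ18O slope and d-excess, are straightforward to compute. A minimal sketch with made-up sub-sample values (real inputs would be sequential fractions collected within one rain event):

```python
import numpy as np

# Made-up intra-event sub-samples, in per mil.
d18O = np.array([-4.1, -5.3, -6.0, -7.2, -8.5])
dD = np.array([-14.6, -21.5, -26.0, -33.4, -41.0])

slope, intercept = np.polyfit(d18O, dD, 1)  # slope < 8 suggests evaporation
d_excess = dD - 8.0 * d18O                  # d-excess = deltaD - 8*delta18O
print(f"intra-event slope: {slope:.2f}")    # ~6 for these synthetic values
print("d-excess:", np.round(d_excess, 1))
```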

  12. Practical guidance for statistical analysis of operational event data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  13. The potential of satellite data to study individual wildfire events

    Science.gov (United States)

    Benali, Akli; López-Saldana, Gerardo; Russo, Ana; Sá, Ana C. L.; Pinto, Renata M. S.; Koutsias, Nikos; Price, Owen; Pereira, Jose M. C.

    2014-05-01

    Large wildfires have important social, economic and environmental impacts. In order to minimize their impacts, understand their main drivers and study their dynamics, different approaches have been used. The reconstruction of individual wildfire events is usually done by collecting field data, conducting interviews and implementing fire spread simulations. All these methods have clear limitations in terms of spatial and temporal coverage, accuracy, subjectivity of the collected information and lack of objective independent validation information. In this sense, remote sensing is a promising tool with the potential to provide relevant information for stakeholders and the research community, by complementing or filling gaps in existing information and providing independent, accurate, quantitative information. In this work we show the potential of satellite data to provide relevant information regarding the dynamics of individual large wildfire events, filling an important gap in wildfire research. We show how MODIS active-fire data, acquired up to four times per day, and satellite-derived burnt perimeters can be combined to extract relevant information on wildfire events, by describing the methods involved and presenting results for four regions of the world: Portugal, Greece, SE Australia and California. The information that can be retrieved encompasses the start and end date of a wildfire event and its ignition area. We perform an evaluation of the information retrieved by comparing the satellite-derived parameters with national databases, highlighting the strengths and weaknesses of both and showing how the former can complement the latter, leading to more complete and accurate datasets. We also show how the spatio-temporal distribution of wildfire spread dynamics can be reconstructed using satellite-derived active fires and how relevant descriptors can be extracted. Applying graph theory to satellite active-fire data, we define the major fire spread paths that yield
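
    The graph-theoretic step can be caricatured as linking each active-fire detection to later, nearby detections and extracting long paths. A toy sketch using networkx; the coordinates, time step, and distance threshold are invented:

```python
import networkx as nx

# Toy active-fire detections: (id, x_km, y_km, t_hours); real inputs are MODIS pixels.
detections = [(0, 0.0, 0.0, 0), (1, 1.0, 0.5, 6), (2, 2.2, 0.8, 12), (3, 1.2, 2.0, 12)]

G = nx.DiGraph()
G.add_nodes_from(d[0] for d in detections)
for i, xi, yi, ti in detections:
    for j, xj, yj, tj in detections:
        # Link detection i to a later detection j within 2.5 km (arbitrary thresholds).
        if 0 < tj - ti <= 6 and (xj - xi) ** 2 + (yj - yi) ** 2 <= 2.5 ** 2:
            G.add_edge(i, j, dt=tj - ti)

# One candidate "major spread path": the longest chain through the detection DAG.
print(nx.dag_longest_path(G))
```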

  14. Using the FAIMS Mobile App for field data recording

    Science.gov (United States)

    Ballsun-Stanton, Brian; Klump, Jens; Ross, Shawn

    2016-04-01

    Multiple people creating data in the field poses a hard technical problem: our ``web 2.0'' environment presumes constant connectivity, data ``authority'' held by centralised servers, and sees mobile devices as tools for presentation rather than origination. A particular design challenge is the remoteness of the sampling locations, hundreds of kilometres away from network access. The alternative, however, is hand collection with a lengthy, error prone, and expensive digitisation process. This poster will present a field-tested open-source solution to field data recording. This solution, originally created by a community of archaeologists, needed to accommodate diverse recording methodologies. The community could not agree on standard vocabularies, workflows, attributes, or methodologies, but most agreed that an app to ``record data in the field'' was desirable. As a result, the app is generalised for field data collection; not only can it record a range of data types, but it is deeply customisable. The NeCTAR / ARC funded FAIMS Project, therefore, created an app which allows for arbitrary data collection in the field. In order to accomplish this ambitious goal, FAIMS relied heavily on OSS projects including: spatialite and gdal (for GIS support), sqlite (for a lightweight key-attribute-value datastore), Javarosa and Beanshell (for UI and scripting), Ruby, and Linux. Only by standing on the shoulders of giants was FAIMS able to make a flexible and highly generalisable field data collection system that CSIRO geoscientists were able to customise to suit most of their completely unanticipated needs. While single-task apps (i.e. those commissioned by structural geologists to take strikes and dips) will excel in their domains, other geoscientists (palaeoecologists, palaeontologists, anyone taking samples) likely cannot afford to commission domain- and methodology-specific recording tools for their custom recording needs. FAIMS shows the utility of OSS software

  15. Smart responsive phosphorescent materials for data recording and security protection.

    Science.gov (United States)

    Sun, Huibin; Liu, Shujuan; Lin, Wenpeng; Zhang, Kenneth Yin; Lv, Wen; Huang, Xiao; Huo, Fengwei; Yang, Huiran; Jenkins, Gareth; Zhao, Qiang; Huang, Wei

    2014-04-07

    Smart luminescent materials that are responsive to external stimuli have received considerable interest. Here we report ionic iridium (III) complexes simultaneously exhibiting mechanochromic, vapochromic and electrochromic phosphorescence. These complexes share the same phosphorescent iridium (III) cation with a N-H moiety in the N^N ligand and contain different anions, including hexafluorophosphate, tetrafluoroborate, iodide, bromide and chloride. The anionic counterions cause a variation in the emission colours of the complexes from yellow to green by forming hydrogen bonds with the N-H proton. The electronic effect of the N-H moiety is sensitive towards mechanical grinding, solvent vapour and electric field, resulting in mechanochromic, vapochromic and electrochromic phosphorescence. On the basis of these findings, we construct a data-recording device and demonstrate data encryption and decryption via fluorescence lifetime imaging and time-gated luminescence imaging techniques. Our results suggest that rationally designed phosphorescent complexes may be promising candidates for advanced data recording and security protection.

  16. Data quality of seismic records from the Tohoku, Japan earthquake as recorded across the Albuquerque Seismological Laboratory networks

    Science.gov (United States)

    Ringler, A.T.; Gee, L.S.; Marshall, B.; Hutt, C.R.; Storm, T.

    2012-01-01

    Great earthquakes recorded across modern digital seismographic networks, such as the recent Tohoku, Japan, earthquake on 11 March 2011 (Mw = 9.0), provide unique datasets that ultimately lead to a better understanding of the Earth's structure (e.g., Pesicek et al. 2008) and earthquake sources (e.g., Ammon et al. 2011). For network operators, such events provide the opportunity to look at the performance across their entire network using a single event, as the ground motion records from the event will be well above every station's noise floor.

  17. Record transfer of data between CERN and California

    CERN Multimedia

    Maximilien Brice

    2003-01-01

    On 27 February 2003 the California Institute of Technology (Caltech), CERN, the Los Alamos National Laboratory (LANL) and the Stanford Linear Accelerator Center (SLAC) broke a data transfer record by transmitting 1 terabyte of data in less than an hour across the 10,000 kilometres between CERN and Sunnyvale in California. The team sustained a transmission rate of 2.38 gigabits per second for over an hour, which is equivalent to transferring 26 CDs per minute. The record-breaking performance was achieved in the framework of tests directly linked to the DataGrid project, which involves the creation of a network of distributed computers able to deliver the unprecedented computing power and data management capacity that will be needed by the data-intensive experiments at the LHC. CERN's participation in these high-speed data transfer tests is led by IT division's External Networking team in the framework of the CERN-led European DataTAG project. Pictured here are some of the members of the CERN DataTAG project te...
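
    As a quick consistency check on the quoted figures (assuming a 700 MB CD): 2.38 Gb/s × 3600 s ≈ 8568 Gb ≈ 1.07 TB per hour, and 2.38 Gb/s ÷ 8 bits per byte ≈ 0.30 GB/s ≈ 17.9 GB per minute, or roughly 26 CDs per minute.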

  18. Cooled CCDs for recording data from electron microscopes

    CERN Document Server

    Faruqi, A R

    2000-01-01

    A cooled-CCD camera based on a low-noise scientific-grade device, used for recording images in a 120 kV electron microscope, is described in this paper. The primary use of the camera is for recording electron diffraction patterns from two-dimensionally ordered arrays of proteins at liquid-nitrogen temperatures, leading to structure determination at atomic or near-atomic resolution. The traditional method for recording data in the microscope is with electron-sensitive film, but electronic detection methods offer the following advantages over film: the data are immediately available in a digital format which can be displayed on a monitor screen for visual inspection, whereas a film record needs to be developed and digitised, a lengthy process taking at least several hours, prior to inspection; and the dynamic range of CCD detectors is about two orders of magnitude greater, with better linearity. The accuracy of measurements is also higher for CCDs, particularly for weak signals, due to inherent fog levels in film. ...

  19. Dansgaard-Oeschger events recorded in a terrestrial sequence in central British Columbia, Canada

    Science.gov (United States)

    Ward, B. C.; Geertsema, M.; Telka, A.; Mathewes, R.

    2012-12-01

    Dansgaard-Oeschger events recorded in the GISP2 Greenland Ice Core. The increasingly dry and cold conditions indicated by the macrofossil assemblage likely reflect the growth of ice in the Coast Mountains that would reduce the availability of moisture to the Interior Plateau from Pacific air masses. This is confirmed by reconstruction of the growth of the Cordilleran Ice Sheet during the Late Wisconsinan based on published radiocarbon dates.

  20. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities (MedDRA) Preferred Terms. A small sample of 100 labels was then selected for further analysis. On these 100 labels, ETHER achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
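
    The reported figures follow the standard set-based definitions of precision and recall. A minimal sketch of the per-label evaluation; the term sets are invented examples, not data from the study:

```python
# Hypothetical extracted vs. gold-standard MedDRA Preferred Terms for one label.
extracted = {"Nausea", "Headache", "Rash", "Dizziness"}
gold = {"Nausea", "Headache", "Rash", "Vomiting"}

true_positives = len(extracted & gold)
precision = true_positives / len(extracted)  # fraction of extracted terms that are correct
recall = true_positives / len(gold)          # fraction of gold terms that were found
print(f"precision={precision:.2f}, recall={recall:.2f}")
```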

  1. Benchmarking dairy herd health status using routinely recorded herd summary data.

    Science.gov (United States)

    Parker Gaddis, K L; Cole, J B; Clay, J S; Maltecca, C

    2016-02-01

    Genetic improvement of dairy cattle health through the use of producer-recorded data has been determined to be feasible. Low estimated heritabilities indicate that genetic progress will be slow. Variation observed in lowly heritable traits can largely be attributed to nongenetic factors, such as the environment. More rapid improvement of dairy cattle health may be attainable if herd health programs incorporate environmental and managerial aspects. More than 1,100 herd characteristics are regularly recorded on farm test-days. We combined these data with producer-recorded health event data, and parametric and nonparametric models were used to benchmark herd and cow health status. Health events were grouped into 3 categories for analyses: mastitis, reproductive, and metabolic. Both herd incidence and individual incidence were used as dependent variables. Models implemented included stepwise logistic regression, support vector machines, and random forests. At both the herd and individual levels, random forest models attained the highest accuracy for predicting health status in all health event categories when evaluated with 10-fold cross-validation. Accuracy (SD) ranged from 0.61 (0.04) to 0.63 (0.04) when using random forest models at the herd level. Accuracy of prediction (SD) at the individual cow level ranged from 0.87 (0.06) to 0.93 (0.001) with random forest models. Highly significant variables and key words from logistic regression and random forest models were also investigated. All models identified several of the same key factors for each health event category, including movement out of the herd, size of the herd, and weather-related variables. We concluded that benchmarking health status using routinely collected herd data is feasible. Nonparametric models were better suited to handle this complex data with numerous variables. These data mining techniques were able to perform prediction of health status and could add evidence to personal experience in herd
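
    The modeling step maps naturally onto scikit-learn. A minimal sketch of a 10-fold cross-validated random forest, with random placeholders standing in for the herd test-day variables and health-event labels:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))     # placeholder for herd test-day characteristics
y = rng.integers(0, 2, size=500)   # placeholder for health-event incidence (0/1)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=10)  # 10-fold cross-validation, as in the study
print(f"accuracy: {scores.mean():.2f} (SD {scores.std():.2f})")
```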

  2. Ethics and subsequent use of electronic health record data.

    Science.gov (United States)

    Lee, Lisa M

    2017-07-01

    The digital health landscape in the United States is evolving and electronic health record data hold great promise for improving health and health equity. Like many scientific and technological advances in health and medicine, there exists an exciting narrative about what we can do with the new technology, as well as reflection about what we should do with it based on what we value. Ethical reflections about the use of EHR data for research and quality improvement have considered the important issues of privacy and informed consent for subsequent use of data. Additional ethical aspects are important in the conversation, including data validity, patient obligation to participate in the learning health system, and ethics integration into training for all personnel who interact with personal health data. Attention to these ethical issues is paramount to our realizing the benefits of electronic health data. Published by Elsevier Inc.

  3. VLSI-based video event triggering for image data compression

    Science.gov (United States)

    Williams, Glenn L.

    1994-02-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.
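
    The core triggering idea, firing when the image stream changes beyond a threshold, can be sketched without the VLSI or fuzzy-logic hardware. A toy frame-difference trigger on synthetic frames; the threshold is an arbitrary, application-specific assumption:

```python
import numpy as np

def frame_trigger(prev, curr, threshold=1.0):
    """Fire when the mean absolute pixel change between frames exceeds a threshold."""
    return float(np.abs(curr.astype(float) - prev.astype(float)).mean()) > threshold

rng = np.random.default_rng(1)
prev = rng.integers(0, 200, (480, 640))   # synthetic previous frame
curr = prev.copy()
curr[200:280, 300:400] += 50              # simulate an object appearing in a region
print(frame_trigger(prev, curr))          # -> True: the change trips the trigger
```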

  4. Di-muon event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with the largest-mass muon pair so far observed by the CMS detector in proton-collision data collected in 2015. The mass of the di-muon system is 2.4 TeV. One muon, with a transverse momentum of 0.7 TeV, goes through the Drift Tubes in the central region, while the second, with a transverse momentum of 1.0 TeV, hits the Cathode Strip Chambers in the forward region. Both muons satisfy the high-transverse-momentum muon selection criteria.

  5. Records of Mesoproterozoic taphrogenic events in the eastern basement of the Araçuaí Orogen, southeast Brazil

    Directory of Open Access Journals (Sweden)

    Tobias Maia Rabelo Fonte-Boa

    Full Text Available ABSTRACT: The history of palaeocontinents alternates long fragmentation-to-drift periods with relatively short agglutination intervals. One of the products of a Rhyacian-Orosirian orogeny was a palaeocontinent that brought together the basement of the Araçuaí-West Congo orogen (AWCO) with regions now located in the São Francisco and Congo cratons. From ca. 2 Ga to ca. 0.7 Ga, this large region of the São Francisco-Congo palaeocontinent was spared of orogenic events, but underwent at least five taphrogenic events recorded by anorogenic magmatism and/or sedimentation. The taphrogenic events are well documented in the AWCO proximal portions and neighboring cratonic regions, but lack evidence in the AWCO high-grade core. Our studies on amphibolites intercalated in the Rhyacian Pocrane complex, basement of the Rio Doce magmatic arc, allowed the recognition of two Mesoproterozoic taphrogenic episodes. The oldest one, a Calymmian episode, is recorded by amphibolites with a zircon magmatic crystallization age at 1529 ± 37 Ma (U-Pb SHRIMP), and lithochemical signature of basaltic magmatism related to continental intraplate settings. Another set of amphibolite bodies records the youngest taphrogenic episode, a Stenian event, with a zircon magmatic crystallization age at 1096 ± 20 Ma (U-Pb SHRIMP), and lithochemical signature similar to mature magmatism of continental rift setting. The Calymmian episode (ca. 1.5 Ga) correlates to the Espinhaço II basin stage and mafic dikes of the northern Espinhaço, Chapada Diamantina and Curaçá domains, while the Stenian episode (ca. 1.1 Ga) correlates to the Espinhaço III basin stage. We also present U-Pb data for 87 detrital zircon grains from a quartzite lens intercalated in the Pocrane complex, the Córrego Ubá quartzite. Its age spectrum shows main peaks at 1176 ± 21 Ma (35%), 1371 ± 30 Ma (18%), 1536 ± 22 Ma (19%), 1803 ± 36 Ma (17%) and 1977 ± 38 Ma (12%), suggesting a Stenian (ca. 1176 Ma) maximum

  6. Depth Discrimination Using Rg-to-Sg Spectral Amplitude Ratios for Seismic Events in Utah Recorded at Local Distances

    Energy Technology Data Exchange (ETDEWEB)

    Tibi, Rigobert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Koper, Keith D. [Univ. of Utah, Salt Lake City, UT (United States). Dept. of Geology and Geophysics; Pankow, Kristine L. [Univ. of Utah, Salt Lake City, UT (United States). Dept. of Geology and Geophysics; Young, Christopher J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2018-03-20

    Short-period fundamental-mode Rayleigh waves (Rg) are commonly observed on seismograms of anthropogenic seismic events and shallow, naturally occurring tectonic earthquakes (TEs) recorded at local distances. In the Utah region, strong Rg waves traveling with an average group velocity of about 1.8 km/s are observed at ~1 Hz on waveforms from shallow events (depth < 10 km) recorded at distances up to about 150 km. At these distances, Sg waves, which are direct shear waves traveling in the upper crust, are generally the dominant signals for TEs. In this study, we leverage the well-known observation that Rg amplitude decreases dramatically with increasing event depth to propose a new depth discriminant based on Rg-to-Sg spectral amplitude ratios. The approach is successfully used to discriminate shallow events (both earthquakes and anthropogenic events) from deeper TEs in the Utah region recorded at local distances (<150 km) by the University of Utah Seismograph Stations (UUSS) regional seismic network. Using Mood's median test, we obtained probabilities of nearly zero that the median Rg-to-Sg spectral amplitude ratios are the same between shallow events on the one hand (including both shallow TEs and anthropogenic events) and deeper earthquakes on the other, suggesting that there is a statistically significant difference in the estimated Rg-to-Sg ratios between the two populations. We also observed consistent disparities between the different types of shallow events (e.g., mining blasts vs. mining-induced earthquakes), implying that it may be possible to separate the subpopulations that make up this group. Lastly, this suggests that using local-distance Rg-to-Sg spectral amplitude ratios one can not only discriminate shallow events from deeper events but may also be able to discriminate among different populations of shallow events.
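
    The discriminant itself is simple: compare spectral amplitudes of the Rg and Sg windows in a common band near 1 Hz. A toy sketch with synthetic windows; the sampling rate, band, and waveforms are invented:

```python
import numpy as np

def band_amplitude(window, fs, f_lo=0.5, f_hi=1.5):
    """Mean spectral amplitude of a waveform window in [f_lo, f_hi] Hz."""
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[band].mean()

fs = 40.0                                       # Hz, assumed sampling rate
t = np.arange(0, 30, 1.0 / fs)
rg_window = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 20)        # toy ~1 Hz Rg
sg_window = 0.4 * np.random.default_rng(2).normal(size=t.size)   # toy Sg window

ratio = band_amplitude(rg_window, fs) / band_amplitude(sg_window, fs)
print(f"Rg/Sg spectral amplitude ratio: {ratio:.1f}")  # large ratio -> shallow source
```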

  7. Multi-Channel Data Recording of Marx switch closures

    International Nuclear Information System (INIS)

    Lockwood, G.J.; Ruggles, L.E.; Ziska, G.R.

    1984-01-01

    The authors have measured the optical signals associated with switch closure on the Demon Marx at Sandia National Laboratories. Using the High Speed Multi-Channel Data Recorder (HSMCDR), they have recorded the time histories of the optical signals from the thirty switches in the Marx generator. All thirty switches were fiber-connected to the HSMCDR. The HSMCDR consists of a high-speed streak camera and a microcomputer-based video digitizing system. Since the thirty signals are recorded on a single streak, the time sequence can be determined with great accuracy. The appearance of a given signal can be determined to within two samples of the 256 samples that make up the time streak. The authors have found that the light intensity and time history of any given switch varied over a large range from shot to shot. Thus, the ability to record the entire optical signal as a function of time for each switch on every shot is necessary if accurate timing results are required.

  8. Analyzing a 35-Year Hourly Data Record: Why So Difficult?

    Science.gov (United States)

    Lynnes, Chris

    2014-01-01

    At the Goddard Distributed Active Archive Center, we have recently added a 35-year record of output data from the North American Land Data Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers a variety of data summarization and visualization options to users that operate at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data have proven surprisingly resistant to application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations both at the algorithm and system level for 35 years of hourly data. Failures arose, sometimes unexpectedly, from command line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These serve as an early warning sign for the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid these issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving the user communities that try to scale up their current techniques to analyze Big Data.

  9. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large-scale collection of driver, vehicle, and environment data in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data sets are growing exponentially over time, this reviewing procedure may not be viable for much longer. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state-of-the-art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential

  10. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relies on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
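
    ProbDet's internals are not spelled out here, but the described Bayesian combination of per-station conditional probabilities can be caricatured as summing per-station log-likelihood ratios for a trial source hypothesis. A toy sketch with invented probabilities:

```python
import numpy as np

# Invented per-station probabilities of the observed waveforms under an
# event hypothesis (trial location/origin time) versus a noise hypothesis.
p_event = np.array([0.90, 0.75, 0.60, 0.55])
p_noise = np.array([0.10, 0.20, 0.40, 0.50])

# Naive combination across the network; non-detecting stations would enter
# the same product, with probabilities favoring the noise hypothesis.
log_lr = np.sum(np.log(p_event) - np.log(p_noise))
posterior = 1.0 / (1.0 + np.exp(-log_lr))  # posterior with even prior odds
print(f"network log-likelihood ratio: {log_lr:.2f}, posterior: {posterior:.3f}")
```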

  11. Neurophysiological Effects of Meditation Based on Evoked and Event Related Potential Recordings.

    Science.gov (United States)

    Singh, Nilkamal; Telles, Shirley

    2015-01-01

    Evoked potentials (EPs) are a relatively noninvasive method to assess the integrity of sensory pathways. As the neural generators for most of the components are relatively well worked out, EPs have been used to understand the changes occurring during meditation. Event-related potentials (ERPs) yield useful information about the response to tasks, usually assessing attention. A brief review of the literature yielded eleven studies on EPs and seventeen on ERPs from 1978 to 2014. The EP studies covered short, mid, and long latency EPs, using both auditory and visual modalities. ERP studies reported the effects of meditation on tasks such as the auditory oddball paradigm, the attentional blink task, mismatched negativity, and affective picture viewing among others. Both EP and ERPs were recorded in several meditations detailed in the review. Maximum changes occurred in mid latency (auditory) EPs suggesting that maximum changes occur in the corresponding neural generators in the thalamus, thalamic radiations, and primary auditory cortical areas. ERP studies showed meditation can increase attention and enhance efficiency of brain resource allocation with greater emotional control.

  13. Earth System Data Records of Mass Transport from Time-Variable Gravity Data

    Science.gov (United States)

    Zlotnicki, V.; Talpe, M.; Nerem, R. S.; Landerer, F. W.; Watkins, M. M.

    2014-12-01

    Satellite measurements of time-variable gravity have revolutionized the study of Earth, by measuring the ice losses of Greenland, Antarctica and land glaciers, changes in groundwater including unsustainable losses due to extraction of groundwater, and the mass and currents of the oceans and their redistribution during El Niño events, among other findings. Satellite measurements of gravity have been made primarily by four techniques: satellite tracking from land stations using either lasers or Doppler radio systems, satellite positioning by GNSS/GPS, satellite-to-satellite tracking over distances of a few hundred km using microwaves, and through a gravity gradiometer (radar altimeters also measure the gravity field, but over the oceans only). We discuss the challenges in the measurement of gravity by different instruments, especially time-variable gravity. A special concern is how to bridge a possible gap in time between the end of life of the current GRACE satellite pair, launched in 2002, and a future GRACE Follow-On pair to be launched in 2017. One challenge in combining data from different measurement systems consists of their different spatial and temporal resolutions and the different ways in which they alias short-time-scale signals. Typically, satellite measurements of gravity are expressed in spherical harmonic coefficients (although expansion in terms of 'mascons', the masses of small spherical caps, has certain advantages). Taking advantage of correlations among spherical harmonic coefficients described by empirical orthogonal functions and derived from GRACE data, it is possible to localize the otherwise coarse spatial resolution of the laser- and Doppler-derived gravity models. This presentation discusses the issues facing a climate data record of time-variable mass flux using these different data sources, including its validation.

  14. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study, survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection
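
    In generic notation (ours, not necessarily the authors' exact parameterization), a cure model splits the population survival function into a cured fraction π and a susceptible remainder:

```latex
S(t) = \pi + (1 - \pi)\, S_u(t)
```

    where S_u(t) is the survival function of susceptible individuals; in the sequential-binary-record formulation, each time interval contributes a Bernoulli observation whose risk is governed by the endurance liability, while π is governed by the susceptibility liability.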

  15. Satellite Climate Data Records: Development, Applications, and Societal Benefits

    Directory of Open Access Journals (Sweden)

    Wenze Yang

    2016-04-01

    Full Text Available This review paper discusses how to develop, produce, sustain, and serve satellite climate data records (CDRs) in the context of transitioning research to operation (R2O). Requirements and critical procedures for producing various CDRs, including Fundamental CDRs (FCDRs), Thematic CDRs (TCDRs), Interim CDRs (ICDRs), and climate information records (CIRs), are discussed in detail, covering radiance/reflectance and the essential climate variables (ECVs) of land, ocean, and atmosphere. Major international CDR initiatives, programs, and projects are summarized. Societal benefits of CDRs in various user sectors, including agriculture, forestry, fisheries, energy, health, water, transportation, and tourism, are also briefly discussed. The challenges and opportunities for CDR development, production and service are also addressed. It is essential to maintain credible CDR products by allowing free access to products and keeping the production process transparent by making source code and documentation available with the dataset.

  16. Long-Term Memory: A Natural Mechanism for the Clustering of Extreme Events and Anomalous Residual Times in Climate Records

    Science.gov (United States)

    Bunde, Armin; Eichner, Jan F.; Kantelhardt, Jan W.; Havlin, Shlomo

    2005-01-01

    We study the statistics of the return intervals between extreme events above a certain threshold in long-term persistent records. We find that the long-term memory leads (i) to a stretched exponential distribution of the return intervals, (ii) to a pronounced clustering of extreme events, and (iii) to an anomalous behavior of the mean residual time to the next event that depends on the history and increases with the elapsed time in a counterintuitive way. We present an analytical scaling approach and demonstrate that all these features can be seen in long climate records. The phenomena should also occur in heartbeat records, Internet traffic, and stock market volatility and have to be taken into account for an efficient risk evaluation.
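
    In generic form (notation ours), the reported result is that for a long-term correlated record with correlation exponent γ < 1, the return intervals r with mean R follow a stretched exponential rather than the simple exponential of uncorrelated data:

```latex
P(r) \sim \exp\!\left[ -\left( r/R \right)^{\gamma} \right]
```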

  17. APNEA list mode data acquisition and real-time event processing

    Energy Technology Data Exchange (ETDEWEB)

    Hogle, R.A.; Miller, P. [GE Corporate Research & Development Center, Schenectady, NY (United States); Bramblett, R.L. [Lockheed Martin Specialty Components, Largo, FL (United States)

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
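
    The post-processing described, turning list-mode timestamps into event accumulations, amounts to histogramming event times relative to each trigger. A minimal sketch with synthetic timestamps; the trigger times and bin widths are invented (the real system bins at tens of microseconds to seconds):

```python
import numpy as np

rng = np.random.default_rng(3)
triggers = np.array([0.0, 1.0, 2.0])                 # trigger times, in seconds
events = np.sort(rng.uniform(0.0, 3.0, size=5000))   # list-mode event timestamps

# Accumulate events into 100 bins of 10 ms following each trigger.
edges = np.linspace(0.0, 1.0, 101)
counts = sum(np.histogram(events - t0, bins=edges)[0] for t0 in triggers)
print(counts[:5], counts.sum())
```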

  18. ATLAS event at 13 TeV - Multijet Exotics Search Event Display - 2015 data

    CERN Multimedia

    ATLAS Collaboration

    2015-01-01

    Run 279984, Event 1079767163: a 10-jet event selected in the search for strong gravity in multijet final states (CERN-PH-EP-2015-312); the scalar sum of jet transverse momenta (HT) of the event is 4.4 TeV. Run 282712, Event 474587238: the event with the largest scalar sum of jet transverse momenta (HT) selected in the search for strong gravity in multijet final states (CERN-PH-EP-2015-312); the HT of the event is 6.4 TeV.

  19. Continuity of Climate Data Records derived from Microwave Observations

    Science.gov (United States)

    Mears, C. A.; Wentz, F. J.; Brewer, M.; Meissner, T.; Ricciardulli, L.

    2017-12-01

    Remote Sensing Systems (www.remss.com) has been producing and distributing microwave climate data products from microwave imagers (SSMI, TMI, AMSR, WindSat, GMI, Aquarius, SMAP) over the global oceans since the launch of the first SSMI in 1987. Interest in these data products has been significant, as researchers around the world have downloaded the approximate equivalent of 1 million satellite-years of processed data. Users including NASA, NOAA, US National Laboratories, the US Navy, the UK Met Office, ECMWF, JAXA, JMA, CMC, and the Australian Bureau of Meteorology, as well as many hundreds of other agencies and universities, routinely access these microwave data products. The quality of these data records has increased as more observations have become available and inter-calibration techniques have improved. The impending end of missions for WindSat, AMSR-2, and the remaining SSMIs will have significant impact on the quality and continuity of long-term microwave climate data records. In addition to the problem of reduced numbers of observations, there is a real danger of losing overlapping observations. Simultaneous operation of satellites, especially when the observations are at similar local crossing times, provides a significant benefit in the effort to inter-calibrate satellites to yield accurate and stable long-term records. The end of WindSat and AMSR-2 will leave us without microwave SSTs in cold water, as there will be no microwave imagers with C-band channels. Microwave SSTs have a crucial advantage over IR SSTs, which cannot be retrieved in the presence of clouds or aerosols. The gap in ocean wind vectors will be somewhat mitigated, as the European ASCAT C-band scatterometer mission on MetOp is continuing. Nonetheless, the anticipated end of several microwave satellite radiometer missions retrieving ocean winds in the coming years will lead to a significant gap in temporal coverage. Atmospheric water vapor, cloud liquid water, and rain rate are all important climate

  20. An optical age chronology of late Quaternary extreme fluvial events recorded in Ugandan dambo soils

    Science.gov (United States)

    Mahan, S.A.; Brown, D.J.

    2007-01-01

    There is little geochronological data on sedimentation in dambos (seasonally saturated, channel-less valley floors) found throughout Central and Southern Africa. Radiocarbon dating is problematic for dambos due to (i) oxidation of organic materials during dry seasons; and (ii) the potential for contemporary biological contamination of near-surface sediments. However, for luminescence dating the equatorial site and semi-arid climate facilitate grain bleaching, while the gentle terrain ensures shallow water columns, low turbidity, and relatively long surface exposures for transported grains prior to deposition and burial. For this study, we focused on dating sandy strata (indicative of high-energy fluvial events) at various positions and depths within a second-order dambo in central Uganda. Blue-light quartz optically stimulated luminescence (OSL) ages were compared with infrared stimulated luminescence (IRSL) and thermoluminescence (TL) ages from finer grains in the same sample. A total of 8 samples were dated, with 6 intervals obtained at ~35, 33, 16, 10.4, 8.4, and 5.9 ka. In general, luminescence ages were stratigraphically, geomorphically and ordinally consistent, and most blue-light OSL ages could be correlated with well-dated climatic events registered either in Greenland ice cores or Lake Victoria sediments. Based upon OSL age correlations, we theorize that extreme fluvial dambo events occur primarily during relatively wet periods, often preceding humid-to-arid transitions. The optical ages reported in this study provide the first detailed chronology of dambo sedimentation, and we anticipate that further dambo work could provide a wealth of information on the paleohydrology of Central and Southern Africa. © 2006 Elsevier Ltd. All rights reserved.

  1. THREE-DIMENSIONAL DATA AND THE RECORDING OF MATERIAL STRUCTURE

    Directory of Open Access Journals (Sweden)

    R. Parenti

    2012-09-01

    Full Text Available The “description” of a material structure requires a high degree of objectivity to serve the scientific interests of certain disciplines (archeological documentation, conservation and restoration, safeguarding of cultural assets and heritage). Geometric data and photographic documentation of surfaces are thus the best instruments for efficacious, clear and objective recording of architectural objects and other anthropic manifestations. In particular, the completeness and diachrony of photographic documentation has always proven essential in recording the material structure of historical buildings. The aim of our contribution is to show the results of several projects carried out with the aid of survey methodologies that utilize digital photographic images to generate RGB (ZScan) point clouds of architectural monuments (urban standing buildings, monuments in archaeological areas, etc.) and art objects. These technologies allow us to capture data using digital photogrammetric techniques; although not based on laser scanners, they can nonetheless create dense 3D point clouds, simply by using images that have been obtained via digital camera. The results are comparable to those achieved with laser scanner technology, although the procedures are simpler, faster and cheaper. We intend to try to adapt these technologies to the requirements and needs of scientific research and the conservation of cultural heritage. Furthermore, we will present protocols and procedures for data recording, processing and transfer in the cultural heritage field, especially with regard to historical buildings. Cooperation among experts from different disciplines (archaeology, engineering and photogrammetry) will allow us to develop technologies and proposals for a widely adoptable workflow in the application of such technologies, in order to build an integrated system that can be used throughout the scientific community. Toward this end, open formats and integration will be

  2. Implementation of the ATLAS Run 2 event data model

    CERN Document Server

    Buckley, Andrew; Elsing, Markus; Gillberg, Dag Ingemar; Koeneke, Karsten; Krasznahorkay, Attila; Moyse, Edward; Nowak, Marcin; Snyder, Scott; van Gemmeren, Peter

    2015-01-01

    During the 2013--2014 shutdown of the Large Hadron Collider, ATLAS switched to a new event data model for analysis, called the xAOD. A key feature of this model is the separation of the object data from the objects themselves (the `auxiliary store'). Rather than being stored as member variables of the analysis classes, all object data are stored separately, as vectors of simple values. Thus, the data are stored in a `structure of arrays' format, while the user still can access it as an `array of structures'. This organization allows for on-demand partial reading of objects, the selective removal of object properties, and the addition of arbitrary user-defined properties in a uniform manner. It also improves performance by increasing the locality of memory references in typical analysis code. The resulting data structures can be written to ROOT files with data properties represented as simple ROOT tree branches. This talk will focus on the design and implementation of the auxiliary store and its interaction with RO...
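
    The auxiliary-store idea is language-independent and easy to caricature: storage is a structure of arrays, while user code sees an array of structures. A toy Python sketch of the concept only; the real xAOD implementation is C++ and far more elaborate:

```python
# Structure-of-arrays storage: each object property is one contiguous vector.
aux_store = {"pt": [52.1, 33.4, 21.0], "eta": [0.3, -1.2, 2.1]}

class Particle:
    """Array-of-structures view over the SoA payload for a single index."""
    def __init__(self, store, index):
        self._store, self._index = store, index
    def __getattr__(self, name):
        return self._store[name][self._index]  # on-demand read of one property

particles = [Particle(aux_store, i) for i in range(3)]
print(particles[0].pt, particles[2].eta)       # AoS-style access

aux_store["is_good"] = [True, False, True]     # decorate with a user-defined property
print(particles[1].is_good)
```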

  3. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: a separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data (the user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code); a simple way of augmenting and reducing the information saved for different data objects (this makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need); and a persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries, which allows fast interactive naviga...

  4. Implementation of the ATLAS Run 2 event data model

    Science.gov (United States)

    Buckley, A.; Eifert, T.; Elsing, M.; Gillberg, D.; Koeneke, K.; Krasznahorkay, A.; Moyse, E.; Nowak, M.; Snyder, S.; van Gemmeren, P.

    2015-12-01

    During the 2013-2014 shutdown of the Large Hadron Collider, ATLAS switched to a new event data model for analysis, called the xAOD. A key feature of this model is the separation of the object data from the objects themselves (the ‘auxiliary store’). Rather than being stored as member variables of the analysis classes, all object data are stored separately, as vectors of simple values. Thus, the data are stored in a ‘structure of arrays’ format, while the user still can access it as an ‘array of structures’. This organization allows for on-demand partial reading of objects, the selective removal of object properties, and the addition of arbitrary user-defined properties in a uniform manner. It also improves performance by increasing the locality of memory references in typical analysis code. The resulting data structures can be written to ROOT files with data properties represented as simple ROOT tree branches. This paper focuses on the design and implementation of the auxiliary store and its interaction with ROOT.

  5. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study

    Directory of Open Access Journals (Sweden)

    Yuan Zhou

    2014-02-01

    Full Text Available Background The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No study to date, however, examines the effect of interoperability quantitatively using discrete event simulation techniques. Objective To estimate the impact of EHR systems with various levels of interoperability on day-to-day tasks and operations of ambulatory physician offices. Methods Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. Results High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. Conclusion This study suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
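
    A toy version of such a patient-flow model can be written with the open-source simpy library (not necessarily the tool the authors used); the arrival rate, service time, and single-physician resource are invented stand-ins for the interview-derived workflows:

```python
import random
import simpy

def patient(env, physician, waits):
    arrived = env.now
    with physician.request() as request:      # queue for the physician
        yield request
        waits.append(env.now - arrived)
        # Service time stands in for the visit plus EHR tasks (lab orders,
        # prescriptions, referrals) that higher interoperability would shorten.
        yield env.timeout(random.expovariate(1 / 15.0))

def arrivals(env, physician, waits):
    while True:
        yield env.timeout(random.expovariate(1 / 20.0))  # ~1 arrival per 20 min
        env.process(patient(env, physician, waits))

random.seed(0)
env = simpy.Environment()
physician = simpy.Resource(env, capacity=1)
waits = []
env.process(arrivals(env, physician, waits))
env.run(until=8 * 60)                         # one 8-hour clinic day, in minutes
print(f"{len(waits)} patients seen, mean wait {sum(waits) / len(waits):.1f} min")
```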

  6. Curve Number Estimation for a Small Urban Catchment from Recorded Rainfall-Runoff Events

    Directory of Open Access Journals (Sweden)

    Banasik Kazimierz

    2014-12-01

    Full Text Available Runoff estimation is a key component in various hydrological considerations. Estimation of storm runoff is especially important for the effective design of hydraulic and road structures, for flood flow management, as well as for the analysis of land use changes, i.e. urbanization or low impact development of urban areas. The curve number (CN) method, developed by the Soil Conservation Service (SCS) of the U.S. Department of Agriculture for predicting the flood runoff depth from ungauged catchments, has been in continuous use for ca. 60 years. This method has not been extensively tested in Poland, especially in small urban catchments, because of a lack of data. In this study, 39 rainfall-runoff events, collected during four years (2009–2012) in a small (A = 28.7 km2) urban catchment of Służew Creek in the southwest part of Warsaw, were used with the aim of determining the CNs and checking the method's applicability to ungauged urban areas. The empirically estimated CN values vary from 65.1 to 95.0, decreasing with rainfall size and, when rainfall and runoff are sorted separately, reaching values from 67 to 74 for large rainfall events.
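
    For reference, the standard SCS-CN relations behind such estimates are compact enough to state directly; a minimal implementation follows (the 40 mm storm and CN of 74 below are an invented example using the large-event value reported above):

```python
def scs_runoff_mm(p_mm, cn, lam=0.2):
    """SCS-CN direct runoff depth (mm) for rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0     # potential maximum retention S, in mm
    ia = lam * s                 # initial abstraction, conventionally 0.2*S
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(f"{scs_runoff_mm(40.0, 74):.1f} mm of direct runoff")  # -> 4.4 mm
```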

  7. Improving Metadata Compliance for Earth Science Data Records

    Science.gov (United States)

    Armstrong, E. M.; Chang, O.; Foster, D.

    2014-12-01

    One of the recurring challenges of creating earth science data records is to ensure a consistent level of metadata compliance at the granule level, where important details of contents, provenance, producer, and data references are necessary to obtain a sufficient level of understanding. These details are important not just for individual data consumers but also for autonomous software systems. Two of the most popular metadata standards at the granule level are the Climate and Forecast (CF) Metadata Conventions and the Attribute Conventions for Dataset Discovery (ACDD). Many data producers have implemented one or both of these models, including the Group for High Resolution Sea Surface Temperature (GHRSST) for their global SST products and the Ocean Biology Processing Group for NASA ocean color and SST products. While both the CF and ACDD models contain various levels of metadata richness, the actual "required" attributes are quite small in number. Metadata at the granule level becomes much more useful when recommended or optional attributes are implemented that document spatial and temporal ranges, lineage and provenance, sources, keywords, and references etc. In this presentation we report on a new open source tool to check the compliance of netCDF and HDF5 granules to the CF and ACDD metadata models. The tool, written in Python, was originally implemented to support metadata compliance for netCDF records as part of NOAA's Integrated Ocean Observing System. It outputs standardized scoring for metadata compliance for both CF and ACDD, produces an objective summary weight, and can be implemented for remote records via OPeNDAP calls. Originally a command-line tool, we have extended it to provide a user-friendly web interface. Reports on metadata testing are grouped in hierarchies that make it easier to track flaws and inconsistencies in the record. We have also extended it to support explicit metadata structures and semantic syntax for the GHRSST project that can be
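
    A stripped-down granule-level check of this kind is easy to express with the netCDF4 package; the handful of ACDD-style global attribute names below is a small illustrative subset, and the scoring is far simpler than the tool's weighted output:

```python
from netCDF4 import Dataset

# A small subset of ACDD recommended global attributes to score against.
ACDD_ATTRS = ["title", "summary", "keywords", "time_coverage_start",
              "time_coverage_end", "geospatial_lat_min", "geospatial_lat_max"]

def acdd_score(path):
    """Return (present, total) counts of ACDD global attributes in one granule."""
    with Dataset(path) as nc:
        present = [attr for attr in ACDD_ATTRS if attr in nc.ncattrs()]
    return len(present), len(ACDD_ATTRS)

# Usage, with a hypothetical file name:
# print("%d of %d ACDD attributes present" % acdd_score("granule.nc"))
```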

  8. Utilization of Satellite Data to Identify and Monitor Changes in Frequency of Meteorological Events

    Science.gov (United States)

    Mast, J. C.; Dessler, A. E.

    2017-12-01

    Increases in temperature and climate variability due to human-induced climate change are increasing the frequency and magnitude of extreme heat events (i.e., heatwaves). This will have a detrimental impact on the health of human populations and the habitability of certain land locations. Here we seek to utilize satellite data records to identify and monitor extreme heat events. We analyze satellite data sets (MODIS and AIRS land surface temperatures (LST) and water vapor profiles (WV)) because of their global coverage and stable calibration. Heat waves are identified based on the frequency of maximum daily temperatures above a threshold, determined as follows. Land surface temperatures are gridded into uniform latitude/longitude bins. Maximum daily temperatures per bin are determined, and probability density functions (PDFs) of these maxima are constructed monthly and seasonally. For each bin, a threshold is calculated at the 95th percentile of the PDF of maximum temperatures. For each bin, an extreme heat event is defined based on the frequency of monthly and seasonal days exceeding the threshold. To account for the decreased ability of the human body to thermoregulate with increasing moisture, and to assess the lethality of the heat events, we determine the wet-bulb temperature at the locations of extreme heat events. Preliminary results will be presented.
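
    The thresholding step lends itself to a compact sketch: per grid bin, take the 95th percentile of daily maximum temperatures and count exceedance days. The array below stands in for gridded LST retrievals; shapes and values are assumptions.

```python
# Per-bin 95th-percentile threshold and exceedance count for heat-event
# detection. Synthetic data stands in for gridded MODIS/AIRS LST retrievals.
import numpy as np

rng = np.random.default_rng(0)
# daily_tmax[d, i, j]: daily max LST (K) for day d in lat/lon bin (i, j)
daily_tmax = 300 + 8 * rng.standard_normal((3650, 18, 36))

# 95th-percentile threshold per bin, over the full record
threshold = np.nanpercentile(daily_tmax, 95, axis=0)        # shape (18, 36)

# Frequency of extreme-heat days per bin (days above the local threshold)
extreme_days = (daily_tmax > threshold[None, :, :]).sum(axis=0)
print(extreme_days.shape, extreme_days.mean())
```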

  9. Data-mining of medication records to improve asthma management.

    Science.gov (United States)

    Bereznicki, Bonnie J; Peterson, Gregory M; Jackson, Shane L; Walters, E Haydn; Fitzmaurice, Kimbra D; Gee, Peter R

    2008-07-07

    To use community pharmacy medication records to identify patients whose asthma may not be well managed and then implement and evaluate a multidisciplinary educational intervention to improve asthma management. We used a multisite controlled study design. Forty-two pharmacies throughout Tasmania ran a software application that "data-mined" medication records, generating a list of patients who had received three or more canisters of inhaled short-acting beta(2)-agonists in the preceding 6 months. The patients identified were allocated to an intervention or control group. Pre-intervention data were collected for the period May to November 2006 and post-intervention data for the period December 2006 to May 2007. Intervention patients were contacted by the community pharmacist via mail, and were sent educational material and a letter encouraging them to see their general practitioner for an asthma management review. Pharmacists were blinded to the control patients' identities until the end of the post-intervention period. Dispensing ratio of preventer medication (inhaled corticosteroids [ICSs]) to reliever medication (inhaled short-acting beta(2)-agonists). Thirty-five pharmacies completed the study, providing 702 intervention and 849 control patients. The intervention resulted in a threefold increase in the preventer-to-reliever ratio in the intervention group compared with the control group (P < 0.01) and a higher proportion of patients in the intervention group using ICS therapy than in the control group (P < 0.01). Community pharmacy medication records can be effectively used to identify patients with suboptimal asthma management, who can then be referred to their GP for review. The intervention should be trialled on a national scale to determine the effects on clinical, social, emotional and economic outcomes for people in the Australian community, with a longer follow-up to determine sustainability of the improvements noted.
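
    The record-mining step can be sketched in a few lines of pandas: sum reliever canisters per patient over a 6-month window, flag patients at or above the threshold, and compute the preventer-to-reliever dispensing ratio. Column names and the toy records are assumptions, not the study's schema.

```python
# Sketch of the dispensing-record "data-mining" step: flag patients with
# three or more reliever (SABA) canisters in 6 months and compute the
# preventer-to-reliever ratio used as the outcome measure.
import pandas as pd

disp = pd.DataFrame({
    "patient_id": [1, 1, 1, 1, 2, 2, 3],
    "date": pd.to_datetime(["2006-06-01", "2006-07-15", "2006-09-03",
                            "2006-10-20", "2006-08-11", "2006-09-30",
                            "2006-07-07"]),
    "class": ["SABA", "SABA", "SABA", "ICS", "SABA", "ICS", "SABA"],
    "canisters": [1, 1, 1, 1, 2, 1, 1],
})

window_end = pd.Timestamp("2006-11-30")
window = disp[disp["date"].between(window_end - pd.DateOffset(months=6),
                                   window_end)]

saba = window[window["class"] == "SABA"].groupby("patient_id")["canisters"].sum()
flagged = saba[saba >= 3].index.tolist()          # candidates for GP review
print("flagged patients:", flagged)

ics = window[window["class"] == "ICS"].groupby("patient_id")["canisters"].sum()
ratio = (ics / saba).fillna(0)                    # preventer-to-reliever ratio
print(ratio)
```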

  10. Physicists set new record for network data transfer

    CERN Multimedia

    2006-01-01

    "An internatinal team of physicists, computer scientists, and network engineers led by the California Institute of Technology, CERN and the University of Michigan and partners at the University of Florida and Vanderbilt, as well as participants from Brazil (Rio de Janeiro State University, UERJ, and the State Universities of Sao Paulo, USP and UNESP) and Korea (Kyungpook National University, KISTI) joined forces to set new records for sustained data transfer between storage systems during the SuperComputing 2006 (SC06) Bandwidth Challenge (BWC)." (2 pages)

  11. A late Holocene record of arid events from the Cuzco region, Peru

    Science.gov (United States)

    Chepstow-Lusty, Alex; Frogley, Michael R.; Bauer, Brian S.; Bush, Mark. B.; Tupayachi Herrera, Alfredo

    2003-09-01

    The small recently infilled lake basin of Marcacocha (13°13′S, 72°12′W, 3355 m) in the Cuzco region of Peru has a morphology and location that render it extremely sensitive to environmental change. A record of vegetation, human impact and climatic change during the past 4200 yr has been obtained from a highly organic core taken from the centre of the basin. Sustained arid episodes that affected the Peruvian Andes may be detectable using the proxy indicator of sedge (Cyperaceae) pollen abundances. As the lake level was lowered during sustained drier conditions, the local catchment was colonised by Cyperaceae, whereas during lake floods they retreated or were submerged and pollen production was correspondingly reduced. Drier episodes during prehistoric times occurred around 900 BC, 500 BC, AD 100 and AD 550, with a longer dry episode occurring from AD 900 to 1800. Evidence from the independently derived Quelccaya ice-core record and the archaeological chronology for the Cuzco region appears to support the climatic inferences derived from the sedge data. Many of these aridity episodes appear to correspond with important cultural changes in the Cuzco region and elsewhere in the Central Andes.

  12. Validation Data Acquisition in HTTF during PCC Events

    Energy Technology Data Exchange (ETDEWEB)

    Bardet, Philippe [George Washington Univ., Washington, DC (United States)

    2018-02-07

    A validation experimental campaign was conducted in an Integral Effect Test (IET) facility for High Temperature Gas Reactors (HTGRs), the High-Temperature Test Facility (HTTF) at Oregon State University (OSU). The HTTF simulates Depressurized and Pressurized Conduction Cooldowns (DCC and PCC). This campaign required the development of a new laser spectroscopic diagnostic to measure velocity in the challenging conditions of the HTTF: low-speed (~1 m/s) gas flows at HTGR-prototypical temperature and 1/10th pressure. This was a collaborative effort between co-PIs at The George Washington University (GW), Oregon State University (OSU), and NASA Langley Research Center. The main accomplishments of this project include a record dynamic range for velocimetry, the highest accuracy obtained with this technique, and successful deployment in an IET, leading to a new validation matrix for CFD. These are detailed below and in the manuscripts appended to this executive summary. For this project, we introduced a new class of laser spectroscopic diagnostics to thermal-hydraulics to measure gas velocity; furthermore, the diagnostic was demonstrated in situ in an IET during DCC events. In such scenarios, particles used in mainstream techniques like Particle Image Velocimetry (PIV) are not appropriate, as they settle too rapidly and also contaminate the experimental facility. Molecular tracers stay mixed within the working gas and can seed the flow in a technique called Molecular Tagging Velocimetry (MTV). In MTV, a molecular tracer is photo-dissociated by a first (write) laser into radicals or molecules. The pattern created by the write laser is then interrogated with planar laser-induced fluorescence (PLIF), the read pulse(s), which is recorded with a camera. The pattern is probed and matched at two times (separated by the interval or probe time, dt), resulting in a time-of-flight velocimetry technique. This project demonstrated a new application range for MTV in gases. MTV has been extensively used
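
    The time-of-flight idea behind MTV reduces to recovering the displacement of the tagged pattern between the write and read instants and dividing by the probe time. A toy one-dimensional version, with invented pixel pitch and probe interval, might look like this:

```python
# Toy illustration of MTV's time-of-flight principle: a tagged line pattern
# is imaged at t0 and again after a probe interval dt; the spatial shift is
# recovered by cross-correlation, and velocity follows as shift/dt.
# Signal shape, dt, and pixel pitch are assumptions for illustration only.
import numpy as np

pixel_pitch = 20e-6        # m per pixel (assumed)
dt = 50e-6                 # probe interval in s (assumed)
x = np.arange(512)

profile_t0 = np.exp(-0.5 * ((x - 200) / 6.0) ** 2)     # tagged line at t0
true_shift_px = 3
profile_t1 = np.roll(profile_t0, true_shift_px)        # pattern after dt

corr = np.correlate(profile_t1 - profile_t1.mean(),
                    profile_t0 - profile_t0.mean(), mode="full")
shift_px = corr.argmax() - (len(x) - 1)                # lag of the peak
velocity = shift_px * pixel_pitch / dt
print(f"recovered shift: {shift_px} px -> velocity ≈ {velocity:.2f} m/s")
```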

  13. Views of CMS Event Data Objects, Files, Collections, Virtual Data Products

    CERN Document Server

    Holtman, Koen

    2001-01-01

    The CMS data grid system will store many types of data maintained by the CMS collaboration. An important type of data is the event data, which is defined in this note as all data that directly represents simulated, raw, or reconstructed CMS physics events. Many views on this data will exist simultaneously. To a CMS physics code implementer this data will appear as C++ objects, to a tape robot operator the data will appear as files. This note identifies different views that can exist, describes each of them, and interrelates them by placing them into a vertical stack. This particular stack integrates several existing architectural structures, and is therefore a plausible basis for further prototyping and architectural work. This document is intended as a contribution to, and as common (terminological) reference material for, the CMS architectural efforts and for the Grid projects PPDG, GriPhyN, and the EU DataGrid.

  14. Accuracy of Laboratory Data Communication on ICU Daily Rounds Using an Electronic Health Record.

    Science.gov (United States)

    Artis, Kathryn A; Dyer, Edward; Mohan, Vishnu; Gold, Jeffrey A

    2017-02-01

    Accurately communicating patient data during daily ICU rounds is critically important since data provide the basis for clinical decision making. Despite its importance, high-fidelity data communication during interprofessional ICU rounds is assumed, yet unproven. We created a robust but simple methodology to measure the prevalence of inaccurately communicated (misrepresented) data and to characterize data communication failures by type. We also assessed how commonly the rounding team detected data misrepresentation and whether data communication was impacted by environmental, human, and workflow factors. Direct observation of verbalized laboratory data during daily ICU rounds, compared with data within the electronic health record and on presenters' paper prerounding notes. Twenty-six-bed academic medical ICU with a well-established electronic health record. ICU rounds presenter (medical student or resident physician), interprofessional rounding team. None. During 301 observed patient presentations including 4,945 audited laboratory results, presenters used a paper prerounding tool for 94.3% of presentations, but the tools contained only 78% of available electronic health record laboratory data. Ninety-six percent of patient presentations included at least one laboratory misrepresentation (mean, 6.3 per patient), and 38.9% of all audited laboratory data were inaccurately communicated. Most misrepresentation events were omissions. Only 7.8% of all laboratory misrepresentations were detected. Despite a structured interprofessional rounding script and a well-established electronic health record, clinician laboratory data retrieval and communication during ICU rounds at our institution was poor, prone to omissions and inaccuracies, yet largely unrecognized by the rounding team. This highlights an important patient safety issue that is likely widely prevalent, yet underrecognized.

  15. Two Extreme Climate Events of the Last 1000 Years Recorded in Himalayan and Andean Ice Cores: Impacts on Humans

    Science.gov (United States)

    Thompson, L. G.; Mosley-Thompson, E. S.; Davis, M. E.; Kenny, D. V.; Lin, P.

    2013-12-01

    In the last few decades numerous studies have linked pandemic influenza, cholera, malaria, and viral pneumonia, as well as droughts, famines and global crises, to the El Niño-Southern Oscillation (ENSO). Two annually resolved ice core records, one from Dasuopu Glacier in the Himalaya and one from the Quelccaya Ice Cap in the tropical Peruvian Andes, provide an opportunity to investigate these relationships on opposite sides of the Pacific Basin over the last 1000 years. The Dasuopu record provides an annual history from 1440 to 1997 CE and a decadally resolved record from 1000 to 1440 CE, while the Quelccaya ice core provides annual resolution over the last 1000 years. Major ENSO events are often recorded in the oxygen isotope, insoluble dust, and chemical records from these cores. Here we investigate outbreaks of diseases, famines and global crises during two of the largest events recorded in the chemistry of these cores, particularly large peaks in the concentrations of chloride (Cl-) and fluoride (F-). One event is centered on 1789 to 1800 CE and the second begins abruptly in 1345 and tapers off after 1360 CE. These Cl- and F- peaks represent major droughts and reflect the abundance of continental atmospheric dust, derived in part from dried lake beds in drought-stricken regions upwind of the core sites. For Dasuopu the likely sources are in India, while for Quelccaya the sources would be the Andean Altiplano. Both regions are subject to drought conditions during the El Niño phase of the ENSO cycle. These two events persist longer (10 to 15 years) than today's typical ENSO events in the Pacific Ocean Basin. The 1789 to 1800 CE event was associated with a very strong El Niño event and was coincidental with the Boji Bara famine resulting from extended droughts that led to over 600,000 deaths in central India by 1792. Similarly extensive droughts are documented in Central and South America. Likewise, the 1345 to 1360 CE event, although poorly documented

  16. Recording real case data of earth faults in distribution lines

    Energy Technology Data Exchange (ETDEWEB)

    Haenninen, S. [VTT Energy, Espoo (Finland)

    1996-12-31

    The most common fault type in electrical distribution networks is the single phase to earth fault. According to earlier studies, for instance in the Nordic countries, about 80 % of all faults are of this type. To develop protection and fault location systems, it is important to obtain real case data of disturbances and faults which occur in the networks. For example, the earth fault initial transients can be used for earth fault location. The aim of this project was to collect and analyze real case data of earth fault disturbances in medium voltage distribution networks (20 kV). Therefore, data of fault occurrences were recorded at two substations, of which one has an unearthed and the other a compensated neutral, measured as follows: (a) the phase currents and neutral current for each line in the case of low fault resistance; (b) the phase voltages and neutral voltage from the voltage measuring bay in the case of low fault resistance; (c) the neutral voltage and its 50 Hz components at the substation in the case of high fault resistance. In addition, basic data on the fault occurrences were collected (data on the line, fault location, cause and so on). The data will be used in the development of fault location and earth fault protection systems.
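
    A basic indicator that can be computed from such recordings is the residual current 3I0 = Ia + Ib + Ic, which is near zero in normal operation and rises during a single-phase-to-earth fault. The sketch below, with synthetic waveforms and an arbitrary threshold, illustrates the idea:

```python
# Earth-fault indication from recorded phase currents via the residual
# current 3I0. Waveforms, magnitudes and the threshold are illustrative.
import numpy as np

fs = 5000                      # sampling rate, Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)  # 0.2 s record
w = 2 * np.pi * 50             # 50 Hz system

ia = 100 * np.sin(w * t)
ib = 100 * np.sin(w * t - 2 * np.pi / 3)
ic = 100 * np.sin(w * t + 2 * np.pi / 3)
ic[t >= 0.1] += 30 * np.sin(w * t[t >= 0.1])   # earth fault on phase C at 0.1 s

residual = ia + ib + ic                         # 3I0, ~0 in healthy operation
# RMS of the residual over a one-cycle sliding window
cycle = fs // 50
rms = np.sqrt(np.convolve(residual ** 2, np.ones(cycle) / cycle, "same"))
fault_samples = np.flatnonzero(rms > 5.0)       # 5 A threshold (assumed)
print(f"fault detected from t ≈ {t[fault_samples[0]]:.3f} s")
```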

  18. Record of palaeoenvironmental changes in the Mid-Polish Basin during the Valanginian Event

    Science.gov (United States)

    Morales, Chloé; Kujau, Ariane; Heimhofer, Ulrich; Mutterlose, Joerg; Spangenberg, Jorge; Adatte, Thierry; Ploch, Isabela; Föllmi, Karl B.

    2013-04-01

    The Valanginian stage displays the first major perturbation of the carbon cycle of the Cretaceous period. The Valanginian Weissert episode is associated with a positive carbon isotope excursion (CIE) in δ13Ccarb and δ13Corg values, and with a crisis in pelagic and neritic carbonate production (Weissert et al., 1998; Erba, 2004; Föllmi et al., 2007). As for Cretaceous oceanic anoxic events (OAEs), the carbon anomaly is explained by the intensification of continental biogeochemical weathering, triggering an increase in marine primary productivity and organic-matter preservation. However, in contrast to the OAEs, the organic matter trapped in the Tethyan Ocean during the Valanginian is both marine and continental, and the occurrence of a widespread anoxia could not be evidenced (Westermann et al., 2010; Kujau et al., 2012). The resulting marine Corg burial rates were probably not sufficient to explain the shift in δ13C values, and an alternative scheme has been proposed by Westermann et al. (2010): the carbonate platform crisis combined with the storage of organic matter on the continent may be the major trigger of the δ13C positive shift. We present the results of an analysis of the Wawal drilling core (Mid-Polish Trough), which is of particular interest because of its near-coastal setting and its exceptional preservation, demonstrated by the presence of up to 17 wt.% aragonite. The section consists of marine silty to sandy clays deposited on top of a lower Berriasian karstified limestone. It covers the Early and early Late Valanginian, and displays the onset of the positive excursion. The lack of anoxia is evidenced by trace-element and Rock-Eval data. Two intervals of phosphogenesis are emphasised that appear equivalent in time to the condensed horizons of the northern Tethyan region (Helvetic Alps). A rapid climate change toward less humid and seasonally contrasted conditions, similar to the northern Tethyan areas, is observed.

  19. Characterization of a Mediterranean flash flood event using rain gauges, radar, GIS and lightning data

    Directory of Open Access Journals (Sweden)

    M. Barnolas

    2008-06-01

    Full Text Available Flash flood events are very common in Catalonia, generating a high impact on society, including loss of life almost every year. They are produced by the overflowing of ephemeral rivers in narrow and steep basins close to the sea. This kind of flood is associated with convective events producing high rainfall intensities. The aim of the present study is to analyse the 12–14 September 2006 flash flood event within the framework of the characteristics of flood events in the Internal Basins of Catalonia (IBC). To achieve this purpose, all flood events that occurred between 1996 and 2005 have been analysed. Rainfall and radar data have been introduced into a GIS, and a classification of the events has been made. A distinction between episodes has been made considering the spatial coverage of accumulated rainfall in 24 h and the degree of convective precipitation registered. The study case can be considered a highly convective one, with rainfall covering all the IBC on 13 September. On that day, 215.9 mm/24 h were recorded, with maximum intensities above 130 mm/h. A complete meteorological study of this event is also presented. In addition, as this is an episode with high lightning activity, it has been chosen to be studied within the framework of the FLASH project. In this way, a comparison between this information and raingauge data has been developed, with the goal of finding a relation between lightning density, radar echoes and amounts of precipitation. Furthermore, these studies improve our knowledge about thunderstorm systems.

  20. FDA Adverse Event Reporting System (FAERS): Latest Quarterly Data Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — The FDA Adverse Event Reporting System (FAERS) is a database that contains information on adverse event and medication error reports submitted to FDA. The database...

  1. On the interpretation of rare events recorded by Kamiokande II and IMB detectors in association with the occurrence of supernova 1987 A

    International Nuclear Information System (INIS)

    Krivoruchenko, M.I.

    1989-01-01

    A statistical analysis of the angular distribution of neutrino events observed in the Kamiokande II and IMB detectors from supernova SN 1987 A is carried out. The Neyman-Pearson test is applied to each of the events in testing the hypothesis ν̄e p → e⁺n against the alternative ν e⁻ → ν e⁻. The confidence level of the hypothesis that the recorded events all represent ν̄e p → e⁺n inelastic scatterings, against possible alternatives, is found with the use of the Kolmogorov and Mises tests to be 2% and 0.9%, respectively. The number of ν e⁻ → ν e⁻ events is estimated to be from 3 to 11 with probability ≥ 0.9. The current supernova models fail to give a satisfactory account of the angular distribution data.
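
    The flavor of such a goodness-of-fit test can be sketched as follows: under the inverse-beta-decay hypothesis the positron directions are nearly isotropic, so cos θ relative to the supernova direction should be approximately uniform on [-1, 1], and a Kolmogorov-Smirnov test quantifies the disagreement. The sample below is invented; the paper's Neyman-Pearson construction is not reproduced here.

```python
# Sketch of a goodness-of-fit test of event angles against isotropy.
# The cos(theta) sample is made up: mostly isotropic events plus a few
# forward-peaked ones mimicking possible elastic-scattering contamination.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cos_theta = np.concatenate([rng.uniform(-1, 1, size=12),
                            rng.uniform(0.7, 1.0, size=4)])

# Kolmogorov-Smirnov test against the isotropic (uniform on [-1, 1]) model
stat, p_value = stats.kstest(cos_theta, stats.uniform(loc=-1, scale=2).cdf)
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
```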

  2. Homogeneity of a Global Multisatellite Soil Moisture Climate Data Record

    Science.gov (United States)

    Su, Chun-Hsu; Ryu, Dongryeol; Dorigo, Wouter; Zwieback, Simon; Gruber, Alexander; Albergel, Clement; Reichle, Rolf H.; Wagner, Wolfgang

    2016-01-01

    Climate Data Records (CDR) that blend multiple satellite products are invaluable for climate studies, trend analysis and risk assessments. Knowledge of any inhomogeneities in the CDR is therefore critical for making correct inferences. This work proposes a methodology to identify the spatiotemporal extent of the inhomogeneities in a 36-year, global multisatellite soil moisture CDR as the result of changing observing systems. Inhomogeneities are detected at up to 24 percent of the tested pixels with spatial extent varying with satellite changeover times. Nevertheless, the contiguous periods without inhomogeneities at changeover times are generally longer than 10 years. Although the inhomogeneities have measurable impact on the derived trends, these trends are similar to those observed in ground data and land surface reanalysis, with an average error less than 0.003 cubic meters per cubic meter per year. These results strengthen the basis of using the product for long-term studies and demonstrate the necessity of homogeneity testing of multisatellite CDRs in general.
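
    A minimal sketch of an inhomogeneity test at a known satellite changeover date is a two-sample comparison of the record before and after the date; a real analysis like the one above works per pixel with dedicated break-detection tests. The data and level shift below are synthetic.

```python
# Two-sample test for a level shift at a known changeover date in a soil
# moisture anomaly series. Series length, noise and shift are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 360                                    # 30 years of monthly anomalies
sm = 0.02 * rng.standard_normal(n)
changeover = 200
sm[changeover:] += 0.01                    # artificial level shift (m3/m3)

t_stat, p = stats.ttest_ind(sm[:changeover], sm[changeover:], equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p:.4f} -> "
      f"{'break detected' if p < 0.05 else 'homogeneous'}")
```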

  3. Applying Metrological Techniques to Satellite Fundamental Climate Data Records

    Science.gov (United States)

    Woolliams, Emma R.; Mittaz, Jonathan PD; Merchant, Christopher J.; Hunt, Samuel E.; Harris, Peter M.

    2018-02-01

    Quantifying long-term environmental variability, including climatic trends, requires decadal-scale time series of observations. The reliability of such trend analysis depends on the long-term stability of the data record and on understanding the sources of uncertainty in historic, current and future sensors. We give a brief overview of how metrological techniques can be applied to historical satellite data sets. In particular, we discuss the implications of error correlation at different spatial and temporal scales, the forms such correlation can take, and how uncertainty is propagated under partial correlation. We give a form of the Law of Propagation of Uncertainties that accounts for the propagation of uncertainties associated with common errors, yielding the covariance associated with Earth observations in different spectral channels.
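
    The form in question is, in essence, the standard law of propagation of uncertainty extended with an explicit correlation term; a generic statement (not the paper's exact notation) is:

```latex
% Law of propagation of uncertainty for y = f(x_1, ..., x_N) with partially
% correlated inputs; c_i are sensitivity coefficients and r(x_i, x_j) is the
% error correlation between inputs.
u_c^2(y) \;=\; \sum_{i=1}^{N} c_i^2\, u^2(x_i)
\;+\; 2 \sum_{i=1}^{N-1} \sum_{j=i+1}^{N} c_i c_j\, r(x_i, x_j)\, u(x_i)\, u(x_j),
\qquad c_i = \frac{\partial f}{\partial x_i}
```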

  4. Identical event-related potentials to target and frequent stimuli of visual oddball task recorded by intracerebral electrodes

    Czech Academy of Sciences Publication Activity Database

    Kukleta, M.; Brázdil, M.; Roman, R.; Jurák, Pavel

    2003-01-01

    Roč. 114, č. 7 (2003), s. 1292 - 1297 ISSN 1388-2457 Institutional research plan: CEZ:AV0Z2065902 Keywords: event-related potential * intra-cerebral EEG recording in humans * oddball task Subject RIV: FA - Cardiovascular Diseases incl. Cardiothoracic Surgery Impact factor: 2.485, year: 2003

  5. ATLAS event at 13 TeV - Highest mass dijets resonance event in 2015 data

    CERN Multimedia

    ATLAS Collaboration

    2015-01-01

    The highest-mass, central dijet event passing the dijet resonance selection collected in 2015 (Event 1273922482, Run 280673): the two central high-pT jets have an invariant mass of 6.9 TeV, and the two leading jets have a pT of 3.2 TeV. The missing transverse momentum in this event is 46 GeV.

  6. ATLAS event at 13 TeV - Highest mass dijets angular event in 2015 data

    CERN Multimedia

    ATLAS Collaboration

    2015-01-01

    The highest-mass dijet event passing the angular selection collected in 2015 (Event 478442529, Run 280464): the two central high-pT jets have an invariant mass of 7.9 TeV, and the three leading jets have a pT of 1.99, 1.86 and 0.74 TeV, respectively. The missing transverse momentum in this event is 46 GeV.

  7. Multilevel recording of complex amplitude data pages in a holographic data storage system using digital holography.

    Science.gov (United States)

    Nobukawa, Teruyoshi; Nomura, Takanori

    2016-09-05

    A holographic data storage system using digital holography is proposed to record and retrieve multilevel complex amplitude data pages. Digital holographic techniques are capable of modulating and detecting complex amplitude distribution using current electronic devices. These techniques allow the development of a simple, compact, and stable holographic storage system that mainly consists of a single phase-only spatial light modulator and an image sensor. As a proof-of-principle experiment, complex amplitude data pages with binary amplitude and four-level phase are recorded and retrieved. Experimental results show the feasibility of the proposed holographic data storage system.
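
    A toy numerical version of such a data page, binary amplitude combined with four-level phase, and its decoding by magnitude thresholding and phase quantization, is sketched below; the optics (SLM encoding, holographic recording, interferometric readout) are not simulated, and the page size and symbol mapping are assumptions.

```python
# Toy encoding/decoding of a complex-amplitude data page with binary
# amplitude and four-level phase, as in the proof-of-principle experiment.
import numpy as np

rng = np.random.default_rng(5)
amp_bits = rng.integers(0, 2, (8, 8))            # binary amplitude data
phase_symbols = rng.integers(0, 4, (8, 8))       # 2-bit phase data per pixel

# Each pixel carries amplitude in {0, 1} and phase in {0, 90, 180, 270} deg
page = amp_bits * np.exp(1j * phase_symbols * np.pi / 2)

# Retrieval: threshold the magnitude, quantize the phase to the nearest level
amp_rx = (np.abs(page) > 0.5).astype(int)
phase_rx = np.mod(np.round(np.angle(page) / (np.pi / 2)), 4).astype(int)

assert np.array_equal(amp_rx, amp_bits)
# Phase is undefined where the amplitude is 0, so compare lit pixels only
lit = amp_bits == 1
assert np.array_equal(phase_rx[lit], phase_symbols[lit])
print("amplitude and phase data recovered on all lit pixels")
```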

  8. Unbiased analysis of geomagnetic data sets and comparison of historical data with paleomagnetic and archeomagnetic records

    Science.gov (United States)

    Arneitz, Patrick; Egli, Ramon; Leonhardt, Roman

    2017-03-01

    Reconstructions of the past geomagnetic field provide fundamental constraints for understanding the dynamics of the Earth's interior, as well as serving as a basis for magnetostratigraphic and archeomagnetic dating tools. Such reconstructions, when extending over epochs that precede the advent of instrumental measurements, rely exclusively on magnetic records from archeological artifacts and, further in the past, from rocks and sediments. The most critical component of such indirect records is field intensity, because of possible biases introduced by material properties and by laboratory protocols, which do not reproduce exactly the original field recording conditions. Large biases are usually avoided by the use of appropriate checking procedures; however, smaller ones can remain undetected in individual studies and might significantly affect field reconstructions. We introduce a new general approach for analyzing geomagnetic databases in order to investigate the reliability of indirect records. This approach is based on the comparison of historical records with archeomagnetic and volcanic data, considering temporal and spatial mismatches with adequate weighting functions and error estimation. A good overall agreement is found between indirect records and historical measurements, while systematic bias is detected for several subsets (e.g., inclination shallowing of lava records). We also demonstrate that simple approaches to analyzing highly inhomogeneous and internally correlated paleomagnetic data sets can lead to incorrect conclusions about the efficiency of quality checks and corrections. Consistent criteria for selecting and weighting data are presented in this review and can be used to improve current geomagnetic field modeling techniques.

  9. Medication errors: an analysis comparing PHICO's closed claims data and PHICO's Event Reporting Trending System (PERTS).

    Science.gov (United States)

    Benjamin, David M; Pendrak, Robert F

    2003-07-01

    Clinical pharmacologists are all dedicated to improving the use of medications and decreasing medication errors and adverse drug reactions. However, quality improvement requires that some significant parameters of quality be categorized, measured, and tracked to provide benchmarks to which future data (performance) can be compared. One of the best ways to accumulate data on medication errors and adverse drug reactions is to look at medical malpractice data compiled by the insurance industry. Using data from PHICO insurance company, PHICO's Closed Claims Data, and PHICO's Event Reporting Trending System (PERTS), this article examines the significance and trends of the claims and events reported between 1996 and 1998. Those who misread history are doomed to repeat the mistakes of the past. From a quality improvement perspective, the categorization of the claims and events is useful for reengineering integrated medication delivery, particularly in a hospital setting, and for redesigning drug administration protocols on low therapeutic index medications and "high-risk" drugs. Demonstrable evidence of quality improvement is being required by state laws and by accreditation agencies. The state of Florida requires that quality improvement data be posted quarterly on the Web sites of the health care facilities. Other states have followed suit. The insurance industry is concerned with costs, and medication errors cost money. Even excluding costs of litigation, an adverse drug reaction may cost up to $2500 in hospital resources, and a preventable medication error may cost almost $4700. To monitor costs and assess risk, insurance companies want to know what errors are made and where the system has broken down, permitting the error to occur. Recording and evaluating reliable data on adverse drug events is the first step in improving the quality of pharmacotherapy and increasing patient safety. Cost savings and quality improvement evolve on parallel paths. The PHICO data

  10. Integrating cancer genomic data into electronic health records

    Directory of Open Access Journals (Sweden)

    Jeremy L. Warner

    2016-10-01

    Full Text Available The rise of genomically targeted therapies and immunotherapy has revolutionized the practice of oncology in the last 10–15 years. At the same time, new technologies and the electronic health record (EHR) in particular have permeated the oncology clinic. Initially designed as billing and clinical documentation systems, EHR systems have not anticipated the complexity and variety of genomic information that needs to be reviewed, interpreted, and acted upon on a daily basis. Improved integration of cancer genomic data with EHR systems will help guide clinician decision making, support secondary uses, and ultimately improve patient care within oncology clinics. Some of the key factors relating to the challenge of integrating cancer genomic data into EHRs include: the bioinformatics pipelines that translate raw genomic data into meaningful, actionable results; the role of human curation in the interpretation of variant calls; and the need for consistent standards with regard to genomic and clinical data. Several emerging paradigms for integration are discussed in this review, including: non-standardized efforts between individual institutions and genomic testing laboratories; “middleware” products that portray genomic information, albeit outside of the clinical workflow; and application programming interfaces that have the potential to work within clinical workflow. The critical need for clinical-genomic knowledge bases, which can be independent or integrated into the aforementioned solutions, is also discussed.

  11. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the

  12. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches while describing their differences at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.
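
    A toy sketch of the greedy strategy is given below: configurable nodes are fixed one at a time, keeping the variant that best fits the event log. The model is reduced to a dictionary of allowed variants, and "fitness" to a simple activity-frequency score, a stand-in for the replay fitness a real implementation would compute against a process model; all names are invented.

```python
# Greedy configuration of a toy configurable model against an event log.
from itertools import chain

config_model = {                      # hypothetical configurable nodes
    "register": ["online_form", "desk_visit"],
    "payment": ["invoice", "direct_debit"],
    "notify": ["email", "letter"],
}
log = [                               # hypothetical traces from the system
    ["online_form", "invoice", "email"],
    ["online_form", "direct_debit", "email"],
    ["desk_visit", "invoice", "email"],
]

def fitness(choice_by_node, traces):
    # Fraction of logged events covered by the chosen variants (toy score).
    chosen = set(choice_by_node.values())
    events = list(chain.from_iterable(traces))
    return sum(e in chosen for e in events) / len(events)

config = {}
for node, variants in config_model.items():   # greedy: fix one node at a time
    best = max(variants, key=lambda v: fitness({**config, node: v}, log))
    config[node] = best
print(config)  # {'register': 'online_form', 'payment': 'invoice', 'notify': 'email'}
```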

  13. Phenological records as a complement to aerobiological data

    Science.gov (United States)

    Tormo, Rafael; Silva, Inmaculada; Gonzalo, Ángela; Moreno, Alfonsa; Pérez, Remedios; Fernández, Santiago

    2011-01-01

    Phenological studies in combination with aerobiological studies enable one to observe the relationship between the release of pollen and its presence in the atmosphere. To obtain a suitable comparison between the daily variation of airborne pollen concentrations and flowering, the level of accuracy of both sets of data needs to be as similar as possible. To analyse the correlation between locally observed flowering data and pollen counts in pollen traps, and thereby support pollen forecasting, pollen was sampled using a Burkard volumetric pollen trap operating continuously from May 1993. For the phenological study, we selected the main pollen sources of the six pollen types most abundant in our area: Cupressaceae, Platanus, Quercus, Plantago, Olea, and Poaceae, with a total of 35 species. We selected seven sites to register flowering or pollination, two with semi-natural vegetation, the rest being urban sites. The sites were visited weekly from March to June in 2007, and from January to June in 2008 and 2009. Pollen shedding was checked at each visit and recorded as the percentage of flowers or microsporangia in that state. There was an association between flowering phenology and airborne pollen records for some of the pollen types (Platanus, Quercus, Olea and Plantago). Nevertheless, for the other types (Cupressaceae and Poaceae), the flowering and airborne pollen peaks did not coincide, with up to 1 week difference in phase. Some arguments are put forward in explanation of this phenomenon. Phenological studies have shown that airborne pollen results from both local and distant sources, although the pollen peaks usually appear when local sources are shedding the greatest amounts of pollen. Resuspension phenomena are probably more important than long-distance transport in explaining the presence of airborne pollen outside the flowering period. This information could be used to improve pollen forecasts.

  14. Analysis of strong scintillation events by using GPS data at low latitudes

    Science.gov (United States)

    Forte, Biagio; Jakowski, Norbert; Wilken, Volker

    2010-05-01

    Drifting structures characterised by inhomogeneities in the spatial electron density distribution at ionospheric heights give rise to scintillation of radio waves propagating through them. The fractional electron density fluctuations and the corresponding scintillation levels may reach extreme values at low latitudes during high solar activity. Strong scintillation events have disruptive effects on a number of technological applications. In particular, operations and services based on GPS signals and receivers may experience severe disruption due to a significant degradation of the signal-to-noise ratio, eventually leading to loss of signal lock. Experimental scintillation data collected in the Asian sector at low latitudes by means of a dual-frequency GPS receiver under moderate solar activity (2006) have been analysed. The receiver firmware is specially modified to record power estimates on the C/A code as well as on the L1 and L2 carriers. Strong scintillation activity is recorded in the post-sunset period (saturating S4, and SI as high as 20 dB). An overview of these events is presented, taking into account the scintillation impact on signal intensity, phase, and dynamics. In particular, an interpretation of these events based on a refined scattering theory is provided, with possible consequences for standard scintillation models.
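
    The amplitude scintillation index S4 referred to above is the normalized standard deviation of signal intensity over a short window, S4 = sqrt((⟨I²⟩ − ⟨I⟩²)/⟨I⟩²), commonly computed over 60 s. A sketch with synthetic intensity series:

```python
# S4 amplitude scintillation index over a fixed window. The synthetic
# intensity series and sampling rate are assumptions for illustration.
import numpy as np

def s4_index(intensity):
    """S4 = sqrt((<I^2> - <I>^2) / <I>^2) over the given window."""
    m = intensity.mean()
    return np.sqrt(max(np.mean(intensity ** 2) - m ** 2, 0.0)) / m

rng = np.random.default_rng(2)
fs = 50                                    # 50 Hz intensity samples (typical)
quiet = 1 + 0.05 * rng.standard_normal(60 * fs)
disturbed = 1 + 0.8 * rng.standard_normal(60 * fs)
disturbed = np.clip(disturbed, 0.01, None)   # intensity must stay positive

print(f"S4 quiet     = {s4_index(quiet):.2f}")       # weak scintillation
print(f"S4 disturbed = {s4_index(disturbed):.2f}")   # strong scintillation
```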

  15. The use of photography to record geologic data

    International Nuclear Information System (INIS)

    McClay, P.L.

    1985-01-01

    Although photography is generally used today by geologists to record important data and features of interest, no strong effort has been made to systematically photo-document preliminary investigations and siting of critical facilities such as nuclear power plants, Liquefied Natural Gas (LNG) terminals, or dams. At a time when the safe siting of critical facilities is coming under ever closer scrutiny by regulatory agencies and the public, the importance and usefulness of photographic evidence and authentication is clear. Photography by no means replaces the accurate, detailed log or map. However, when used together, the photograph and graphic log or map can provide a clearer, more understandable representation of geologic data. This can be extremely important to the non-technical reviewer or decision maker. A simple method of presenting documentary photographs has been used for the proposed LNG facility at Little Cojo Bay, near Point Conception, California. This method combines both geologic data and photographic images through the use of clear mylar or acetate overlays.

  16. Analyzing System on A Chip Single Event Upset Responses using Single Event Upset Data, Classical Reliability Models, and Space Environment Data

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth; Campola, Michael; Xapsos, Michael

    2017-01-01

    We are investigating the application of classical reliability performance metrics combined with standard single event upset (SEU) analysis data. We expect to relate SEU behavior to system performance requirements. Our proposed methodology will provide better prediction of SEU responses in harsh radiation environments with confidence metrics. Keywords: single event upset (SEU), single event effect (SEE), field programmable gate array devices (FPGAs).
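
    One common way to connect SEU test data to a classical reliability metric is a back-of-the-envelope rate estimate: a measured per-bit upset cross-section multiplied by the in-orbit particle flux and the bit count gives a device upset rate, from which Poisson survival probabilities follow. All numbers below are illustrative assumptions, not values from the presentation.

```python
# Rough SEU rate and survival estimate from a per-bit cross-section.
import math

sigma_bit = 1e-14      # upset cross-section per bit, cm^2/bit (assumed)
flux = 1e5             # particle flux, particles/cm^2/day (assumed orbit)
n_bits = 20e6          # configuration + user bits in the FPGA (assumed)

rate = sigma_bit * flux * n_bits          # expected upsets per device-day
mtbu = 1 / rate                           # mean time between upsets, days
p_clean_30d = math.exp(-rate * 30)        # P(no upset in a 30-day window)

print(f"upset rate          : {rate:.3f} /day")
print(f"MTBU                : {mtbu:.1f} days")
print(f"P(no upset in 30 d) : {p_clean_30d:.2%}")
```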

  17. Concordance between maternal recall of birth complications and data from obstetrical records.

    Science.gov (United States)

    Keenan, Kate; Hipwell, Alison; McAloon, Rose; Hoffmann, Amy; Mohanty, Arpita; Magee, Kelsey

    2017-02-01

    Prenatal complications are associated with poor outcomes in the offspring. Access to medical records is limited in the United States, and investigators often rely on maternal report of prenatal complications. We tested concordance between maternal recall and birth records in a community-based sample of mothers participating in a longitudinal study, in order to determine the accuracy of maternal recall of perinatal complications. Participants were 151 biological mothers who were interviewed about gestational age at birth, birthweight, and the most commonly occurring birth complications, nuchal cord and meconium aspiration, when the female child was on average 6 years old, and for whom birth records were obtained. Concordance between reports was assessed using one-way random intra-class coefficients for continuous measures and kappa coefficients for dichotomous outcomes. Associations between maternal demographic and psychological factors and discrepancies also were tested. Concordance was excellent for continuously measured birthweight (ICC = 0.85, p < 0.001). For nuchal cord, discrepancies most often involved presence according to the birth record and absence according to maternal recall. Receipt of public assistance was associated with a decrease in discrepancy in report of nuchal cord. Concordance between maternal retrospective report and medical birth records varies across different types of perinatal events. There was little evidence that demographic or psychological factors increased the risk of discrepancies. Maternal recall based on continuous measures of perinatal factors may yield more valid data than dichotomous outcomes.

  18. An alluvial record of El Niño events from northern coastal Peru

    Science.gov (United States)

    Wells, Lisa E.

    1987-12-01

    Overbank flood deposits of northern coastal Peru provide the potential for the development of a late Quaternary chronology of El Niño events. Alluvial deposits from the 1982-1983 El Niño event are the basis for establishing a type El Niño deposit. Sedimentary structures suggest depositional processes ranging from sheet flows to debris flows, with sheet flood deposits being the most common. The 1982-1983 deposits are characterized by a 50- to 100-cm-thick basal gravel, overlain by a 10- to 100-cm-thick sand bed, grading into a 1- to 10-cm-thick silty sand bed and capped by a very thin layer of silt or clay. The surface of the deposit commonly displays the original shear flow lines crosscut by postdepositional mud cracks and footprints (human and animal). Stacked sequences of flood deposits are present in Pleistocene and Holocene alluvial fill, suggesting that El Niño-type events likely occurred throughout the late Quaternary. A relative chronology of the deposits is developed based on terrace and soil stratigraphy and on the degree of preservation of surficial features. A minimum of 15 El Niño events occurred during the Holocene; a minimum of 21 events occurred during the late Pleistocene. Timing of the Holocene events is bracketed by isochrons derived from the archaeologic stratigraphy. Corrected radiocarbon ages from included detrital wood provide the following absolute dates for El Niño events: 1720 ± 60 A.D., 1460 ± 20 A.D., 1380 ± 140 A.D. (error overlaps with the A.D. 1460 event; these may represent a single event), and 1230 ± 60 B.C.

  19. A training manual for event history data management using Health and Demographic Surveillance System data.

    Science.gov (United States)

    Bocquier, Philippe; Ginsburg, Carren; Herbst, Kobus; Sankoh, Osman; Collinson, Mark A

    2017-06-26

    The objective of this research note is to introduce a training manual for event history data management. The manual provides the first comprehensive guide to longitudinal Health and Demographic Surveillance System (HDSS) data management, giving a step-by-step description of the process of structuring and preparing a dataset for the calculation of demographic rates and event history analysis. The research note provides some background information on the INDEPTH Network and the iShare data repository, and describes the need for a manual to guide users in correctly handling HDSS datasets. The approach outlined in the manual is flexible and can be applied to other longitudinal data sources. It facilitates the development of standardised longitudinal data management and the harmonization of datasets to produce a comparative set of results.

  20. Evaluating and Extending the Ocean Wind Climate Data Record

    Science.gov (United States)

    Ricciardulli, Lucrezia; Rodriguez, Ernesto; Stiles, Bryan W.; Bourassa, Mark A.; Long, David G.; Hoffman, Ross N.; Stoffelen, Ad; Verhoef, Anton; O'Neill, Larry W.; Farrar, J. Tomas; Vandemark, Douglas; Fore, Alexander G.; Hristova-Veleva, Svetla M.; Turk, F. Joseph; Gaston, Robert; Tyler, Douglas

    2017-01-01

    Satellite microwave sensors, both active scatterometers and passive radiometers, have been systematically measuring near-surface ocean winds for nearly 40 years, establishing an important legacy in studying and monitoring weather and climate variability. As an aid to such activities, the various wind datasets are being intercalibrated and merged into consistent climate data records (CDRs). The ocean wind CDRs (OW-CDRs) are evaluated by comparisons with ocean buoys and intercomparisons among the different satellite sensors and among the different data providers. Extending the OW-CDR into the future requires exploiting all available datasets, such as OSCAT-2, scheduled to launch in July 2016. Three planned methods of calibrating the OSCAT-2 σo measurements include 1) direct Ku-band σo intercalibration to QuikSCAT and RapidScat; 2) multisensor wind speed intercalibration; and 3) calibration to stable rainforest targets. Unfortunately, RapidScat failed in August 2016 and cannot be used to directly calibrate OSCAT-2. A particular future continuity concern is the absence of scheduled new or continuation radiometer missions capable of measuring wind speed. Specialized model assimilations provide 30-year-long high temporal/spatial resolution wind vector grids that composite the satellite wind information from OW-CDRs of multiple satellites viewing the Earth at different local times.

  1. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we use the concept of the rain event. The discrete and intermittent nature of rain processes makes some features ill-defined when they are computed over a fixed duration: long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data that are highly sensitive to detector characteristics. Analysing whole rain events instead of individual fixed-duration samples clarifies relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses part of the intra-event variability due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organizing map is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving as much as possible of the initial space topology, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows removing redundant variables, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis shows clearly the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature. This confirms
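
    The event framing can be sketched as follows: split a rain-rate series into events separated by a minimum dry gap, then compute the five named features per event. The series, the dry-gap rule and the exact definition of the order-0.5 variation are assumptions made here for illustration.

```python
# Segment a rain-rate series into events and compute per-event features.
import numpy as np

dt_min = 1.0                      # time step of the series, minutes
min_dry_gap = 30                  # dry minutes required to separate events
rng = np.random.default_rng(4)
rain = np.zeros(600)
rain[100:160] = rng.gamma(2.0, 2.0, 60)     # synthetic event 1 (mm/h)
rain[400:430] = rng.gamma(1.5, 1.0, 30)     # synthetic event 2

def split_events(r, gap):
    """Return (start, end) index pairs of events separated by >= gap dry steps."""
    events, start, dry = [], None, 0
    for i, v in enumerate(r):
        if v > 0:
            if start is None:
                start = i
            dry = 0
        elif start is not None:
            dry += 1
            if dry >= gap:
                events.append((start, i - dry + 1))
                start, dry = None, 0
    if start is not None:
        events.append((start, len(r)))
    return events

for s, e in split_events(rain, min_dry_gap):
    r = rain[s:e]
    depth = r.sum() * dt_min / 60               # mm, from mm/h at 1-min steps
    var05 = np.sum(np.abs(np.diff(r)) ** 0.5)   # order-0.5 absolute variation
    print(f"duration={e - s} min  peak={r.max():.1f} mm/h  "
          f"depth={depth:.1f} mm  std={r.std():.1f}  var0.5={var05:.1f}")
```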

  2. Simple procedure for evaluating earthquake response spectra of large-event motions based on site amplification factors derived from smaller-event records

    International Nuclear Information System (INIS)

    Dan, Kazuo; Miyakoshi, Jun-ichi; Yashiro, Kazuhiko.

    1996-01-01

    A primitive procedure was proposed for evaluating earthquake response spectra of large-event motions to make use of records from smaller events. The result of the regression analysis of the response spectra was utilized to obtain the site amplification factors in the proposed procedure, and the formulation of the seismic-source term in the regression analysis was examined. A linear form of the moment magnitude, Mw, is good for scaling the source term of moderate earthquakes with Mw of 5.5 to 7.0, while a quadratic form of Mw and the ω-square source-spectrum model is appropriate for scaling the source term of smaller and greater earthquakes, respectively. (author). 52 refs

  3. Record completeness and data concordance in an anesthesia information management system using context-sensitive mandatory data-entry fields.

    Science.gov (United States)

    Avidan, Alexander; Weissman, Charles

    2012-03-01

    Use of an anesthesia information management system (AIMS) does not ensure record completeness and data accuracy. Mandatory data-entry fields can be used to assure data completeness. However, they are not suited for data that are mandatory only in certain clinical situations (context-sensitive). For example, information on equal breath sounds should be mandatory with tracheal intubation, but not with mask ventilation. It was hypothesized that employing context-sensitive mandatory data-entry fields can ensure high data completeness and accuracy while maintaining usability. A commercial off-the-shelf AIMS was enhanced using its built-in VBScript programming tool to build event-driven forms with context-sensitive mandatory data-entry fields. One year after introduction of the system, all anesthesia records were reviewed for data completeness. Data concordance, used as a proxy for accuracy, was evaluated using verifiable age-related data. Additionally, an anonymous satisfaction survey on general acceptance and usability of the AIMS was performed. During the initial 12 months of AIMS use, 12,241 (99.6%) of 12,290 anesthesia records had complete data. Concordances of entered data (weight, size of tracheal tubes, laryngoscopy blades and intravenous catheters) with patients' ages were 98.7-99.9%. The AIMS implementation was deemed successful by 98% of the anesthesiologists. Users rated the AIMS usability in general as very good and the data-entry forms in particular as comfortable. Due to the complexity and the high costs of implementing an anesthesia information management system, it was not possible to compare various system designs (for example, with or without context-sensitive mandatory data-entry fields). Therefore, it is possible that a different or simpler design would have yielded the same or even better results. This refers also to the evaluation of usability, since users did not have the opportunity to work with different design approaches or even different
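
    The core mechanism, mandatory fields whose presence depends on what has been documented so far, can be sketched in a few lines; field and context names below are invented, while the study itself implemented the same logic in the AIMS's built-in VBScript.

```python
# Context-sensitive mandatory-field validation: which entries are required
# depends on the documented airway technique. All names are hypothetical.
ALWAYS_REQUIRED = {"patient_id", "asa_class"}
CONTEXT_REQUIRED = {
    "intubation": {"tube_size", "breath_sounds_equal"},
    "mask_ventilation": set(),            # breath-sound check not forced here
    "regional_block": {"block_site", "local_anesthetic"},
}

def missing_fields(record: dict) -> set:
    """Return the fields that must be completed before the form may close."""
    required = set(ALWAYS_REQUIRED)
    required |= CONTEXT_REQUIRED.get(record.get("airway_event", ""), set())
    return {f for f in required if not record.get(f)}

rec = {"patient_id": "12345", "asa_class": "II",
       "airway_event": "intubation", "tube_size": 7.5}
print(missing_fields(rec))   # {'breath_sounds_equal'} -> form refuses to close
```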

  4. Archetype-based data warehouse environment to enable the reuse of electronic health record data.

    Science.gov (United States)

    Marco-Ruiz, Luis; Moner, David; Maldonado, José A; Kolstrup, Nils; Bellika, Johan G

    2015-09-01

    The reuse of data captured during health care delivery is essential to satisfy the demands of clinical research and clinical decision support systems. A main barrier to reuse is the existence of legacy data formats and the high granularity of the data when stored in an electronic health record (EHR) system. Thus, we need mechanisms to standardize, aggregate, and query data concealed in the EHRs, to allow their reuse whenever needed. To create a data warehouse infrastructure using archetype-based technologies, standards and query languages to enable the interoperability needed for data reuse. The work presented makes use of best-of-breed archetype-based data transformation and storage technologies to create a workflow for the modeling, extraction, transformation and load of EHR proprietary data into standardized data repositories. We converted legacy data and performed patient-centered aggregations via archetype-based transformations. Later, specific-purpose aggregations were performed at the query level for particular use cases. Laboratory test results of a population of 230,000 patients belonging to Troms and Finnmark counties in Norway, requested between January 2013 and November 2014, have been standardized. Test record normalization has been performed by defining transformation and aggregation functions between the laboratory records and an archetype. These mappings were used to automatically generate openEHR-compliant data. These data were loaded into an archetype-based data warehouse. Once loaded, we defined indicators linked to the data in the warehouse to monitor the test activity of Salmonella and Pertussis using the Archetype Query Language. Archetype-based standards and technologies can be used to create a data warehouse environment that enables data from EHR systems to be reused in clinical research and decision support systems. With this approach, existing EHR data becomes available in a standardized and interoperable format, thus opening a world

  5. Vegetation Earth System Data Record from DSCOVR EPIC Observations

    Science.gov (United States)

    Knyazikhin, Y.; Song, W.; Yang, B.; Mottus, M.; Rautiainen, M.; Stenberg, P.

    2017-12-01

    NASA's Earth Polychromatic Imaging Camera (EPIC) onboard NOAA's Deep Space Climate Observatory (DSCOVR) mission was launched on February 11, 2015 to the Sun-Earth Lagrangian L1 point, where it began to collect radiance data of the entire sunlit Earth every 65 to 110 min in June 2015. It provides imagery in near-backscattering directions, with the scattering angle between 168° and 176°, at ten ultraviolet to near-infrared (NIR) narrow spectral bands centered at 317.5 (band width 1.0) nm, 325.0 (2.0) nm, 340.0 (3.0) nm, 388.0 (3.0) nm, 433.0 (3.0) nm, 551.0 (3.0) nm, 680.0 (3.0) nm, 687.8 (0.8) nm, 764.0 (1.0) nm and 779.5 (2.0) nm. This poster presents the current status of the Vegetation Earth System Data Record of global Leaf Area Index (LAI), solar zenith angle-dependent Sunlit Leaf Area Index (SLAI), Fraction of vegetation-absorbed Photosynthetically Active Radiation (FPAR) and Normalized Difference Vegetation Index (NDVI) derived from the DSCOVR EPIC observations. Whereas LAI is a standard product of many satellite missions, SLAI is a new satellite-derived parameter. Sunlit and shaded leaves exhibit different radiative responses to incident Photosynthetically Active Radiation (400-700 nm), which in turn trigger various physiological and physical processes required for the functioning of plants. FPAR, LAI and SLAI are key state parameters in most ecosystem productivity and carbon/nitrogen cycle models. The product, on a 10 km sinusoidal grid and at 65 to 110 min temporal frequency, together with accompanying Quality Assessment (QA) variables, will be publicly available from the NASA Langley Atmospheric Science Data Center. The Algorithm Theoretical Basis Document (ATBD) and product validation strategy are also discussed in this poster.
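
    NDVI itself is a simple band ratio; taking EPIC's 680 nm channel as red and its 779.5 nm channel as NIR, a sketch with illustrative reflectances reads:

```python
# NDVI from red and NIR reflectances; the input values are illustrative
# arrays, not real EPIC data.
import numpy as np

red = np.array([0.08, 0.10, 0.05])     # 680 nm surface reflectance (assumed)
nir = np.array([0.40, 0.30, 0.45])     # 779.5 nm surface reflectance (assumed)

ndvi = (nir - red) / (nir + red)
print(ndvi)    # dense vegetation pushes values toward 1
```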

  6. Higgs and associated vector boson event recorded by CMS (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

    Real proton-proton collision event at 13 TeV in the CMS detector in which two high-energy electrons (green lines), two high-energy muons (red lines), and two high-energy jets (dark yellow cones) are observed. The event shows characteristics expected from the production of a Higgs boson in association with a vector boson, with the Higgs boson decaying into four leptons and the vector boson decaying into two jets, and is also consistent with background standard model physics processes.

  7. Predicting 30-Day Pneumonia Readmissions Using Electronic Health Record Data.

    Science.gov (United States)

    Makam, Anil N; Nguyen, Oanh Kieu; Clark, Christopher; Zhang, Song; Xie, Bin; Weinreich, Mark; Mortensen, Eric M; Halm, Ethan A

    2017-04-01

    Readmissions after hospitalization for pneumonia are common, but the few risk-prediction models have poor to modest predictive ability. Data routinely collected in the electronic health record (EHR) may improve prediction. To develop pneumonia-specific readmission risk-prediction models using EHR data from the first day and from the entire hospital stay ("full stay"). Observational cohort study using stepwise-backward selection and cross-validation. Consecutive pneumonia hospitalizations from 6 diverse hospitals in north Texas from 2009-2010. All-cause nonelective 30-day readmissions, ascertained from 75 regional hospitals. Of 1463 patients, 13.6% were readmitted. The first-day pneumonia-specific model included sociodemographic factors, prior hospitalizations, thrombocytosis, and a modified pneumonia severity index; the full-stay model included disposition status, vital sign instabilities on discharge, and an updated pneumonia severity index calculated using values from the day of discharge as additional predictors. The full-stay pneumonia-specific model outperformed the first-day model (C statistic 0.731 vs 0.695; P = 0.02; net reclassification index = 0.08). Compared to a validated multi-condition readmission model, the Centers for Medicare and Medicaid Services pneumonia model, and 2 commonly used pneumonia severity of illness scores, the full-stay pneumonia-specific model had better discrimination (C statistic range 0.604-0.681; P < 0.05 for all comparisons) for predicting 30-day readmissions among patients hospitalized for pneumonia. This approach outperforms a first-day pneumonia-specific model, the Centers for Medicare and Medicaid Services pneumonia model, and 2 commonly used pneumonia severity of illness scores. Journal of Hospital Medicine 2017;12:209-216. © 2017 Society of Hospital Medicine

  8. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is

  9. A fifty year record of winter glacier melt events in southern Chile, 38°–42°S

    International Nuclear Information System (INIS)

    Brock, Ben W; Burger, Flavia; Montecinos, Aldo; Rivera, Andrés

    2012-01-01

    Little is known about the frequency and potential mass balance impact of winter glacier melt events. In this study, daily atmospheric temperature soundings from the Puerto Montt radiosonde (41.43°S) are used to reconstruct winter melting events at the glacier equilibrium line altitude in the 38°–42°S region of southern Chile, between 1960 and 2010. The representativeness of the radiosonde temperatures to near-surface glacier temperatures is demonstrated using meteorological records from close to the equilibrium line on two glaciers in the region over five winters. Using a degree-day model we estimate an average of 0.28 m of melt and 21 melt days in the 15 June–15 September period each year, with high inter-annual variability. The majority of melt events are associated with midlatitude migratory high pressure systems crossing Chile and northwesterly flows, that force adiabatic compression and warm advection, respectively. There are no trends in the frequency or magnitude of melt events over the period of record, but the annual frequency of winter melt days shows a significant, although rather weak and probably non-linear, relationship to late winter and early spring values of a multivariate El Niño Southern Oscillation Index (MEI). (letter)
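
    The degree-day model used in the study relates melt to the sum of positive near-surface temperatures. A minimal Python sketch, with an illustrative degree-day factor rather than the value calibrated in the paper:

        # Hedged sketch of a classical degree-day melt model: melt is the sum
        # of positive daily mean temperatures times a degree-day factor (DDF).
        # The DDF below is illustrative, not the one calibrated in the study.
        def seasonal_melt(daily_temps_c, ddf_m_per_degday=0.004):
            pdd = sum(t for t in daily_temps_c if t > 0)   # positive degree-days
            return ddf_m_per_degday * pdd                  # melt in metres w.e.

        winter_temps = [-3.0, 1.5, 2.0, -0.5, 4.0]         # toy 15 Jun-15 Sep series
        print(seasonal_melt(winter_temps))                 # 0.03 m for this toy series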

  10. Records of climatic changes and volcanic events in an ice core from ...

    Indian Academy of Sciences (India)

    the volcanic event that occurred in 1815 AD has been identified based on electrical conductance ... tions and accumulation rates of ice, climatic and ... The peak saturated values of currents (µA) at about 5 and 30 m depths identify the past volcanic episodes Agung ... in promoting the scientific activities by allowing us.

  11. Comparison of dementia recorded in routinely collected hospital admission data in England with dementia recorded in primary care.

    Science.gov (United States)

    Brown, Anna; Kirichek, Oksana; Balkwill, Angela; Reeves, Gillian; Beral, Valerie; Sudlow, Cathie; Gallacher, John; Green, Jane

    2016-01-01

    Electronic linkage of UK cohorts to routinely collected National Health Service (NHS) records provides virtually complete follow-up for cause-specific hospital admissions and deaths. The reliability of dementia diagnoses recorded in NHS hospital data is not well documented. For a sample of Million Women Study participants in England we compared dementia recorded in routinely collected NHS hospital data (Hospital Episode Statistics: HES) with dementia recorded in two separate sources of primary care information: a primary care database [Clinical Practice Research Datalink (CPRD), n = 340] and a survey of study participants' General Practitioners (GPs, n = 244). Dementia recorded in HES fully agreed both with CPRD and with GP survey data for 85% of women; it did not agree for 1 and 4%, respectively. Agreement was uncertain for the remaining 14 and 11%, respectively; and among those classified as having uncertain agreement in CPRD, non-specific terms compatible with dementia, such as 'memory loss', were recorded in the CPRD database for 79% of the women. Agreement was significantly better, and dementia was recorded earlier, in primary care (CPRD) than in hospital (HES) data. Age-specific rates for dementia based on the hospital admission data were lower than the rates based on the primary care data, but were similar if the delay in recording in HES was taken into account. Dementia recorded in routinely collected NHS hospital admission data for women in England agrees well with primary care records of dementia assessed separately from two different sources, and is sufficiently reliable for epidemiological research.

  12. On the predictability of extreme events in records with linear and nonlinear long-range memory: Efficiency and noise robustness

    Science.gov (United States)

    Bogachev, Mikhail I.; Bunde, Armin

    2011-06-01

    We study the predictability of extreme events in records with linear and nonlinear long-range memory in the presence of additive white noise using two different approaches: (i) the precursory pattern recognition technique (PRT), which exploits solely the information about short-term precursors, and (ii) the return interval approach (RIA), which exploits the long-range memory incorporated in the elapsed time after the last extreme event. We find that the PRT always performs better when only linear memory is present. In the presence of nonlinear memory, both methods demonstrate comparable efficiency in the absence of white noise. When additional white noise is present in the record (which is the case in most observational records), the efficiency of the PRT decreases monotonically with increasing noise level. In contrast, the RIA shows an abrupt transition between a phase of low-level noise, where the prediction is as good as in the absence of noise, and a phase of high-level noise, where the prediction becomes poor. In the phases of low and intermediate noise the RIA predicts considerably better than the PRT, which explains our recent findings in physiological and financial records.

  13. Asynchronous sampled-data approach for event-triggered systems

    Science.gov (United States)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between control and event-condition design, optimisation of the available resources such as control, communication and computation power, and time-delays due to computation and the communication network. With this motivation, the paper presents separate designs of the control and event-triggering mechanisms, thus simplifying the overall analysis; an asynchronous linear quadratic Gaussian controller, which tackles delays and the aperiodic nature of transmissions; and a novel event mechanism, which compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control and the results show significant improvement over the periodic counterpart.

  14. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. Through readers and RFID middleware systems, the information and the movements of tagged objects can be used to trigger business transactions. These features change the way business applications deal with the physical world, from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logic into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.
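
    Double buffering of the kind a two-block mechanism suggests can be sketched generically in Python; this illustrates the general idea only, and the paper's actual design details are not reproduced here:

        # Hedged sketch of a generic two-block (double) buffering scheme for
        # event streams: one block absorbs incoming RFID reads while the other
        # serves queries; swapping the roles avoids read/write contention.
        class TwoBlockBuffer:
            def __init__(self):
                self._write = []   # absorbs incoming reads
                self._read = []    # served to queries

            def append(self, event):
                self._write.append(event)

            def swap(self):
                # At a query boundary: expose the filled block, start a fresh one.
                self._read, self._write = self._write, []

            def query(self, predicate):
                return [e for e in self._read if predicate(e)]

        buf = TwoBlockBuffer()
        buf.append({"tag": "EPC-1", "reader": "dock-door", "t": 0})
        buf.swap()
        print(buf.query(lambda e: e["reader"] == "dock-door"))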

  15. Climatic and environmental events over the Last Termination, as recorded in The Netherlands: a review

    NARCIS (Netherlands)

    Hoek, W.Z.; Bohncke, S.J.P.

    The Last Termination, or Weichselian Lateglacial (ca 15-10 ka cal. BP), is a time period with rapid changes in climate and environment. The oxygen-isotope records of the Greenland ice-cores are regarded as the most complete climate proxy for the North Atlantic region. In The Netherlands several

  16. A distributed real-time system for event-driven control and dynamic data acquisition on a fusion plasma experiment

    International Nuclear Information System (INIS)

    Sousa, J.; Combo, A.; Batista, A.; Correia, M.; Trotman, D.; Waterhouse, J.; Varandas, C.A.F.

    2000-01-01

    A distributed real-time trigger and timing system, designed in a tree-type topology and implemented in VME and CAMAC versions, has been developed for a magnetic confinement fusion experiment. It provides sub-microsecond time latencies for the transport of small data objects, allowing event-driven discharge control with failure counteraction, dynamic pre-trigger sampling and event recording, as well as accurate simultaneous triggers and synchronism on all nodes with acceptable optimality and predictability of timeliness. This paper describes the technical characteristics of the hardware components (a central unit composed of one or more reflector crates, event and synchronism reflector cards, event and pulse node modules, fan-out and fan-in modules) as well as the software for both testing and integration into a global data acquisition system. The results of laboratory operation for several configurations and the overall performance of the system are presented and analysed

  17. Visual exploration of movement and event data with interactive time masks

    Directory of Open Access Journals (Sweden)

    Natalia Andrienko

    2017-03-01

    Full Text Available We introduce the concept of a time mask, which is a type of temporal filter suitable for selecting multiple disjoint time intervals in which some query conditions are fulfilled. Such a filter can be applied to time-referenced objects, such as events and trajectories, to select those objects or segments of trajectories that fit in one of the selected time intervals. The selected subsets of objects or segments are dynamically summarized in various ways, and the summaries are represented visually on maps and/or other displays to enable exploration. Time mask filtering can be especially helpful in the analysis of disparate data (e.g., event records, positions of moving objects, and time series of measurements), which may come from different sources. To detect relationships between such data, the analyst may set query conditions on the basis of one dataset and investigate the subsets of objects and values in the other datasets that co-occurred in time with these conditions. We describe the desired features of an interactive tool for time mask filtering and present a possible implementation of such a tool. Using the example of two real-world data collections related to aviation and maritime traffic, we show how time masks can be used in combination with other types of filters and demonstrate the utility of time mask filtering. Keywords: Data visualization, Interactive visualization, Interaction technique
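
    The core mechanics of a time mask can be sketched briefly: compute the disjoint intervals in which a condition holds, then retain only the events falling inside them. A minimal Python sketch; the actual tool adds the interactive and visual layers described above:

        # Hedged sketch of time-mask filtering: derive disjoint time intervals
        # in which a query condition holds, then keep only the events that
        # fall inside one of those intervals.
        def time_mask(times, values, condition):
            intervals, start = [], None
            for t, v in zip(times, values):
                if condition(v) and start is None:
                    start = t
                elif not condition(v) and start is not None:
                    intervals.append((start, t))
                    start = None
            if start is not None:
                intervals.append((start, times[-1]))
            return intervals

        def filter_events(events, intervals):
            return [e for e in events if any(a <= e["t"] <= b for a, b in intervals)]

        mask = time_mask(times=[0, 1, 2, 3, 4], values=[1, 5, 6, 2, 7],
                         condition=lambda v: v > 4)
        print(mask)                                           # [(1, 3), (4, 4)]
        print(filter_events([{"t": 2.5}, {"t": 3.5}], mask))  # [{'t': 2.5}]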

  18. 14 CFR 135.152 - Flight data recorders.

    Science.gov (United States)

    2010-01-01

    ... roll or the rotorcraft begins the lift-off until the airplane has completed the landing roll or the...) Heading—primary flight crew reference (if selectable, record discrete, true or magnetic); (5) Normal...

  19. Moment magnitude determination of local seismic events recorded at selected Polish seismic stations

    Science.gov (United States)

    Wiejacz, Paweł; Wiszniowski, Jan

    2006-03-01

    The paper presents the method of local magnitude determination used at Polish seismic stations to report events originating in one of the four regions of induced seismicity in Poland or their immediate vicinity. The method is based on recalculating the seismic moment into magnitude, where the seismic moment is obtained from spectral analysis. The method was introduced at Polish seismic stations in the late 1990s but had not yet been described in full because magnitude discrepancies had been found between the results of the individual stations. The authors present statistics of these differences, provide their explanation and calculate station corrections for each station and each event source region. The limitations of the method are also discussed. The method is found to be a good and reliable method of local magnitude determination provided the limitations are observed and station corrections applied.
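
    For reference, a standard way to recalculate a seismic moment M0 (in N·m) into moment magnitude is the Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1). A minimal Python sketch with a schematic station-correction term; the paper derives its corrections per station and per source region, and its exact calibration may differ:

        import math

        # Hedged sketch using the standard Hanks-Kanamori relation between
        # seismic moment M0 (in N*m) and moment magnitude Mw; the correction
        # term is schematic (the paper computes one per station and region).
        def moment_magnitude(m0_newton_metres, station_correction=0.0):
            return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1) + station_correction

        print(round(moment_magnitude(1.1e15), 2))   # prints 3.96 for M0 = 1.1e15 N*m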

  20. Discrete Event Modeling and Simulation-Driven Engineering for the ATLAS Data Acquisition Network

    CERN Document Server

    Bonaventura, Matias Alejandro; The ATLAS collaboration; Castro, Rodrigo Daniel

    2016-01-01

    We present an iterative and incremental development methodology for simulation models in network engineering projects. Driven by the DEVS (Discrete Event Systems Specification) formal framework for modeling and simulation, we assist network design, test, analysis and optimization processes. A practical application of the methodology is presented for a case study in the ATLAS particle physics detector, the largest scientific experiment built by man, where scientists around the globe search for answers about the origins of the universe. The ATLAS data network conveys real-time information produced by the physics detectors as beams of particles collide. The produced sub-atomic evidence must be filtered and recorded for further offline scrutiny. Due to the criticality of the transported data, networks and applications undergo careful engineering processes with stringent quality-of-service requirements. A tight project schedule imposes time pressure on design decisions, while rapid technology evolution widens the palett...

  1. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21-25 October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus, «The Montagu's Harrier in Europe. Status. Threats. Protection»), organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  2. Super instrumental El Niño events recorded by a Porites coral from the South China Sea

    Science.gov (United States)

    Wang, Xijie; Deng, Wenfeng; Liu, Xi; Wei, Gangjian; Chen, Xuefei; Zhao, Jian-xin; Cai, Guanqiang; Zeng, Ti

    2018-03-01

    The 2-7-year periodicities recorded in fossil coral records have been widely used to identify paleo-El Niño events. However, the reliability of this approach in the South China Sea (SCS) has not been assessed in detail. Therefore, this paper presents monthly resolution geochemical records covering the period 1978-2015 obtained from a Porites coral recovered from the SCS to test the reliability of this method. The results suggest that the SCS coral reliably recorded local seawater conditions and the super El Niño events that occurred over the past 3 decades, but does not appear to have been sensitive enough to record all the other El Niños. In detail, the Sr/Ca series distinctly documents only the two super El Niños of 1997-1998 and 2014-2016 as obvious low values, but does not match the Oceanic Niño Index well. The super El Niño of 1982-1983 was identified by the growth hiatus caused by the coral bleaching and subsequent death of the coral. Three distinct stepwise variations occur in the δ13C series that are coincident with the three super El Niños, which may be related to a substantial decline in endosymbiotic zooxanthellae density caused by the increase in temperature during an El Niño, or to the selective utilization of the different zooxanthellae required to survive in the extreme environment. The increase in rainfall and temperatures over the SCS during El Niños counteracts the effects on seawater δ18O (δ18Osw) and salinity; consequently, coral Δδ18O series can be used as a proxy for δ18Osw and salinity, but are not appropriate for identifying El Niño activity. The findings presented here suggest that the method of identifying paleo-El Niño activity based on the 2-7-year periodicities preserved in SCS corals might not be reliable, because the SCS is on the edge of El Niño anomalies due to its great distance from the central equatorial Pacific, and the imprints of weak and medium strength El Niño events may not be recorded by the

  3. Combining Satellite and in Situ Data with Models to Support Climate Data Records in Ocean Biology

    Science.gov (United States)

    Gregg, Watson

    2011-01-01

    The satellite ocean color data record spans multiple decades and, like most long-term satellite observations of the Earth, comes from many sensors. Unfortunately, global and regional chlorophyll estimates from the overlapping missions show substantial biases, limiting their use in combination to construct consistent data records. SeaWiFS and MODIS-Aqua differed by 13% globally in overlapping time segments, 2003-2007. For perspective, the maximum change in annual means over the entire SeaWiFS mission era was about 3%, and this included an El Niño to La Niña transition. These discrepancies lead to different estimates of trends depending upon whether one uses SeaWiFS alone for 1998-2007 (no significant change), or whether MODIS is substituted for the 2003-2007 period (18% decline, P less than 0.05). Understanding the effects of climate change on the global oceans is difficult if different satellite data sets cannot be brought into conformity. The differences arise from two causes: 1) different sensors see chlorophyll differently, and 2) different sensors see different chlorophyll. In the first case, differences in sensor band locations, bandwidths, sensitivity, and time of observation lead to different estimates of chlorophyll even from the same location and day. In the second, differences in orbit and sensitivities to aerosols lead to sampling differences. A new approach to ocean color using in situ data from the public archives forces the different satellite data to agree to within interannual variability. The global difference between SeaWiFS and MODIS is 0.6% for 2003-2007 using this approach. It also produces a trend using the combination of SeaWiFS and MODIS that agrees with SeaWiFS alone for 1998-2007. This is a major step toward reducing errors produced by the first cause, sensor-related discrepancies. For differences that arise from sampling, data assimilation is applied. The underlying geographically complete fields derived from a free-running model are unaffected

  4. [ELGA--the electronic health record in the light of data protection and data security].

    Science.gov (United States)

    Ströher, Alexander; Honekamp, Wilfried

    2011-07-01

    The introduction of an electronic health record (ELGA) has been discussed for a long time in Austria. Another big step toward ELGA was made at the end of 2010 with the pilot project on e-medication in three model regions; other projects should follow. In addition, projects on the ELGA infrastructure are being sped up by ELGA GmbH to install the basis of a functioning electronic health record. Unfortunately, many of these initiatives take place, so to speak, secretly, so that concerns about the protection and security of such storage of health data keep arising in the consciousness of the general public - and that includes not only patients but also physicians and other healthcare providers. In this article the bases of the planned act are discussed, taking into account data protection and data security.

  5. A long record of extreme wave events in coastal Lake Hamana, Japan

    Science.gov (United States)

    Boes, Evelien; Yokoyama, Yusuke; Schmidt, Sabine; Riedesel, Svenja; Fujiwara, Osamu; Nakamura, Atsunori; Garrett, Ed; Heyvaert, Vanessa; Brückner, Helmut; De Batist, Marc

    2017-04-01

    Coastal Lake Hamana is located near the convergent tectonic boundary of the Nankai-Suruga Trough, along which the Philippine Sea slab is subducted underneath the Eurasian Plate, giving rise to repeated tsunamigenic megathrust earthquakes (Mw ≥ 8). A good understanding of the earthquake- and tsunami-triggering mechanisms is crucial in order to better estimate the complexity of seismic risks. Thanks to its accommodation space, Lake Hamana may represent a good archive for past events, such as tsunamis and tropical storms (typhoons), also referred to as "extreme wave" events. Characteristic event layers, consisting of sediment entrained by these extreme waves and their backwash, are witnesses of past marine incursions. By applying a broad range of surveying methods (reflection-seismic profiling, gravity coring, piston coring), sedimentological analyses (CT-scanning, XRF-scanning, multi-sensor core logging, grain size, microfossils etc.) and dating techniques (210Pb/137Cs, 14C, OSL, tephrochronology), we attempt to trace extreme wave event deposits in a multiproxy approach. Seismic imagery shows a vertical stacking of stronger reflectors, interpreted to be coarser-grained sheets deposited by highly energetic waves. Systematic sampling of lake bottom sediments along a transect from ocean-proximal to ocean-distal sites enables us to evaluate vertical and lateral changes in stratigraphy. Ocean-proximal, we observe a sequence of eight sandy units separated by silty background sediments, up to a depth of 8 m into the lake bottom. These sand layers quickly thin out and become finer-grained land-inward. Seismic-to-core correlations show a good fit between the occurrence of strong reflectors and sandy deposits, hence confirming presumptions based on acoustic imagery alone. Sand-rich intervals typically display a higher magnetic susceptibility, density and stronger X-ray attenuation. However, based on textural and structural differences, we can make the distinction between

  6. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  7. VIIRS Climate Raw Data Record (C-RDR) from Suomi NPP, Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Suomi NPP Climate Raw Data Record (C-RDR) developed at the NOAA NCDC is an intermediate product processing level (NOAA Level 1b) between a Raw Data Record (RDR)...

  8. Data Records derived from GEOSAT Geodetic Mission (GM) and Exact Repeat Mission (ERM) data from 30 March 1985 to 31 December 1989

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This collection contains Sensor Data Records (SDRs), Geodetic Data Records (GDRs), Waveform Data Records (WDRs), and Crossover Difference data Records (XDRs) from...

  9. A software oscilloscope for DOS computers with an integrated remote control for a video tape recorder. The assignment of acoustic events to behavioural observations.

    Science.gov (United States)

    Höller, P

    1995-12-01

    With only a little knowledge of programming IBM-compatible computers in Basic, it is possible to create a digital software oscilloscope with sampling rates up to 17 kHz (depending on the CPU and bus speed). The only additional hardware requirement is a common sound card compatible with the Soundblaster. The system presented in this paper was built to analyse the direction a flying bat is facing during sound emission. For this reason the system works with some additional hardware devices, in order to monitor video sequences on the computer screen, overlaid by an online oscillogram. Using an RS232 interface to a Panasonic video tape recorder, both the oscillogram and the video tape recorder can be controlled simultaneously and, moreover, analysed frame by frame. Not only acoustical events, but also APs, myograms, EEGs and other physiological data can be digitized and analysed in combination with the behavioural data of an experimental subject.
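
    A present-day analogue of this approach is straightforward to sketch in Python, assuming the third-party sounddevice package in place of the original Basic program; this illustrates the sound-card-as-digitiser idea only, not the authors' system:

        # Hedged modern analogue of the paper's idea: use an ordinary sound
        # card as a low-rate digitiser. Assumes the third-party "sounddevice"
        # package (pip install sounddevice).
        import sounddevice as sd

        fs = 17000                                   # ~17 kHz, as in the paper
        samples = sd.rec(int(0.5 * fs), samplerate=fs, channels=1)
        sd.wait()                                    # block until recording ends
        print(samples.max(), samples.min())          # crude "oscilloscope" readout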

  10. Microseismic data records fault activation before and after a Mw 4.1 induced earthquake

    Science.gov (United States)

    Eyre, T.; Eaton, D. W. S.

    2017-12-01

    Several large earthquakes (Mw ~4) have been observed in the vicinity of the town of Fox Creek, Alberta. These events have been determined to be induced earthquakes related to hydraulic fracturing in the region. The largest of these has a magnitude Mw = 4.1 and is associated with a hydraulic-fracturing treatment close to Crooked Lake, about 30 km west of Fox Creek. The underlying factors that lead to the localization of the high numbers of hydraulic-fracturing-induced events in this area remain poorly understood. The treatment that is associated with the Mw 4.1 event was monitored by 93 shallow three-level borehole sensor arrays. Here we analyze the temporal and spatial evolution of the microseismic and seismic data recorded during the treatment. Contrary to the expected microseismic event clustering parallel to the principal horizontal stress (NE-SW), the events cluster along obvious fault planes that align both NNE-SSW and N-S. As the treatment well is oriented N-S, it appears that each stage of the treatment intersects a new portion of the fracture network, causing seismicity to occur. Focal-plane solutions support strike-slip failure along these faults, with nodal planes aligning with the microseismic cluster orientations. Each fault segment is activated with a cluster of microseismicity in the centre, gradually extending along the fault as time progresses. Once a portion of a fault is active, further seismicity can be induced, regardless of whether the current stage is distant from the fault. However, the large events seem to occur in regions with a gap in the microseismicity. Interestingly, most of the seismicity is located above the reservoir, including the larger events. Although a shallow-well array is used, these results are believed to have relatively high depth resolution, as the perforation shots are correctly located with an average error of 26 m in depth. This information contradicts previously held views that large induced earthquakes occur primarily

  11. Pattern of presenting complaints recorded as near-drowning events in emergency departments: a national surveillance study from Pakistan.

    Science.gov (United States)

    He, Siran; Lunnen, Jeffrey C; Zia, Nukhba; Khan, Uzma; Shamim, Khusro; Hyder, Adnan A

    2015-01-01

    Drowning is a heavy burden on the health systems of many countries, including Pakistan. To date, no effective large-scale surveillance has been in place to estimate rates of drowning and near-drowning in Pakistan. The Pakistan National Emergency Department Surveillance (Pak-NEDS) study aimed to fill this gap. Patients who presented with a complaint of "near-drowning" were analyzed to explore patterns of true near-drowning (unintentional) and intentional injuries that led to the "near-drowning" complaint. Bivariate analysis was done to establish patterns among patients treated in emergency departments, including socio-demographic information, injury-related information, accompanying injuries, and emergency department resource utilization. A total of 133 patients (0.2% of all injury patients) with "near-drowning" as the presenting complaint were recorded by the Pak-NEDS system. True near-drownings (50.0%) and intentional injuries that led to "near-drowning" complaints (50.0%) differed in the nature of the injuries. The highest proportion of true near-drowning incidents occurred among patients aged between 25-44 years (47.5%), and among males (77.5%). True near-drowning patients usually had other accompanying complaints, such as lower limb injury (40.0%). Very few patients were transported by ambulance (5.0%), and triage was done for 15% of patients. Eleven (27.5%) true near-drowning patients received cardiopulmonary resuscitation. There was major under-reporting of drowning and near-drowning cases in the surveillance study. The etiology of near-drowning cases should be further studied. Patients who experienced non-fatal drownings were more commonly sent for medical care due to other accompanying conditions, rather than the near-drowning event itself. There is also a need to recognize true near-drowning incidents. The results of this study provide information on data source selection, site location, emergency care standardization, and multi-sector collaboration for future drowning

  12. Historical Chronology of ENSO Events Based Upon Documentary Data From South America: Strengths and Limitations

    Science.gov (United States)

    Luc, O.

    2007-05-01

    The first reconstructions of past El Niño occurrences were proposed by W. Quinn twenty years ago. They were based on documentary evidence of anomalous rainfall episodes, destructive floods and other possible impacts of El Niño conditions in Peru and other South American countries. It was later shown that the El Niño chronological sequence covering the last four and a half centuries produced by Quinn needed a thorough revision, since many so-called EN events had not occurred while some others had been overlooked. Besides the classical methodological problems met in historical climatology studies (reliability of data, confidence in the sources, primary and secondary information), the reconstruction of former EN events faces specific difficulties dealing with the significance of the indicators and their spatial location. For instance, strong precipitation anomalies during summer in Southern Ecuador and northern Peru and precipitation excess recorded in the preceding winter in central Chile constitute quite reliable proxies of El Niño conditions in modern times. However this observed teleconnection pattern, which is useful to reinforce the interpretation of past EN occurrences, seems to have been inoperative before the early nineteenth century. It is interpreted that atmospheric circulation features during the Little Ice Age interfered with the teleconnection system linking the EN impacts in northern Peru and central Chile. As a consequence, how should the significance of documented winter precipitation excess in central Chile be evaluated in years for which there is drought evidence in northern Peru, during the sixteenth to eighteenth centuries? And vice versa, is former evidence of precipitation excess in northern Peru (prior to the nineteenth century) a quite reliable indicator of EN conditions, even if the preceding winter was dry in the Valparaiso-Santiago region? Other specific problems met in the building-up of a consolidated EN chronological

  13. Validation of multisource electronic health record data: an application to blood transfusion data.

    Science.gov (United States)

    Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B

    2017-07-14

    Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. Therefore we developed a practical approach for validating routinely collected data from multiple sources and applied it to a blood transfusion data warehouse to evaluate its usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) and internal consistency concepts which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to a transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and in some cases adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products was initially 2.2%, but was reduced to 0.17% by adjusting the data extraction criteria. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, this contributes to increased transparency and greater reliability of research based on routinely collected electronic health records.
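
    One of the quantifiable internal-consistency outcomes mentioned above, the proportion of transfused products that cannot be linked to issued products, reduces to a simple set comparison. A minimal Python sketch with hypothetical identifiers:

        # Hedged sketch of one quantifiable validity outcome from the paper:
        # the fraction of transfused products with no corresponding issued
        # product. Record identifiers here are hypothetical.
        def unlinked_fraction(transfused_ids, issued_ids):
            issued = set(issued_ids)
            unlinked = [t for t in transfused_ids if t not in issued]
            return len(unlinked) / len(transfused_ids)

        transfused = ["U001", "U002", "U003", "U004"]
        issued = ["U001", "U002", "U004"]
        print(unlinked_fraction(transfused, issued))   # 0.25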

  14. Constraints on early events in Martian history as derived from the cratering record

    International Nuclear Information System (INIS)

    Barlow, N.G.

    1990-01-01

    The shapes and densities of crater size-frequency distribution curves are used to constrain two major events early in Martian history: the termination of high obliteration rates and the viability of the multiple-impact origin of the crustal dichotomy. Distribution curves of fresh craters superposed on uplands, intercrater plains, and ridged plains display shapes and densities indicative of formation prior to the end of heavy bombardment. This observation correlates with other geologic evidence, suggesting a major change in the erosional regime following the last major basin-size impact (i.e., Argyre). In addition, the multisloped nature of the curves supports the idea that the downturn in the crater size-frequency distribution curves reflects the size-frequency distribution of the impactors rather than being the result of erosion. The crustal dichotomy formed prior to the heavy bombardment intermediate epoch based on distribution curves of knobby terrain; if the dichotomy resulted from a single gigantic impact, this observation places constraints on when this event happened. An alternate theory for dichotomy formation, the multiple-impact basin idea, is questioned: since distribution curves of large basins as well as heavy bombardment era units are not represented by a -3 differential power law function, this study finds fewer basins missing on Mars compared to the Moon and Mercury than previously reported. The area covered by these missing basins is less than that covered by the northern plains

  15. Records Management Handbook; Source Data Automation Equipment Guide.

    Science.gov (United States)

    National Archives and Records Service (GSA), Washington, DC. Office of Records Management.

    A detailed guide to selecting appropriate source data automation equipment is presented. Source data automation equipment is used to prepare data for electronic data processing or computerized recordkeeping. The guide contains specifications, performance data, cost, and pictures of the major types of machines used in source data automation.…

  16. Gas expulsions and biological activity recorded offshore Molene Island, Brittany (France): video supervised recording of OBS data and analogue modelling

    Science.gov (United States)

    Klingelhoefer, F.; Géli, L.; Dellong, D.; Evangelia, B.; Tary, J. B.; Bayrakci, G.; Lantéri, N.; Lin, J. Y.; Chen, Y. F.; Chang, E. T. Y.

    2016-12-01

    Ocean bottom seismometers (OBS) commonly record signals from Short Duration Events (SDEs), having characteristics that are very different from those produced by tectonic earthquakes, e.g. durations of less than a second. Two such instruments were deployed offshore Brittany within the field of view of the EMSO-Molene underwater observatory, at a water depth of 12 m. The camera images and the recordings reveal the presence of crabs, octopus and several species of fish. Other acoustic signals can be related to the presence of moving algae or to the influence of bad weather. Tides produce characteristic curves in the noise recorded on the geophones. SDEs that may well have been caused by gas expulsions from the seabed into the water have been recorded on both instruments. In order to verify this hypothesis, an aquarium was filled with water overlying an even-grain-sized quartz sand layer. A constant air supply through a narrow tube produced gas bubbles in a regular manner and an immersed ocean bottom geophone recorded the resulting acoustic signals. The bubbles tend to have a uniform size and to produce a waveform very close to those found on the OBSs. By comparing the number of SDEs and the volume of escaped air, estimates can be made regarding the volume of gas escaping the seafloor in different environments.

  17. Electronic Health Records Data and Metadata: Challenges for Big Data in the United States.

    Science.gov (United States)

    Sweet, Lauren E; Moulaison, Heather Lea

    2013-12-01

    This article, written by researchers studying metadata and standards, represents a fresh perspective on the challenges of electronic health records (EHRs) and serves as a primer for big data researchers new to health-related issues. Primarily, we argue for the importance of the systematic adoption of standards in EHR data and metadata as a way of promoting big data research and benefiting patients. EHRs have the potential to include a vast amount of longitudinal health data, and metadata provides the formal structures to govern that data. In the United States, electronic medical records (EMRs) are part of the larger EHR. EHR data is submitted by a variety of clinical data providers and potentially by the patients themselves. Because data input practices are not necessarily standardized, and because of the multiplicity of current standards, basic interoperability in EHRs is hindered. Some of the issues with EHR interoperability stem from the complexities of the data they include, which can be both structured and unstructured. A number of controlled vocabularies are available to data providers. The continuity of care document standard will provide interoperability in the United States between the EMR and the larger EHR, potentially making data input by providers directly available to other providers. The data involved is nonetheless messy. In particular, the use of competing vocabularies such as the Systematized Nomenclature of Medicine-Clinical Terms, MEDCIN, and locally created vocabularies inhibits large-scale interoperability for structured portions of the records, and unstructured portions, although potentially not machine readable, remain essential. Once EMRs for patients are brought together as EHRs, the EHRs must be managed and stored. Adequate documentation should be created and maintained to assure the secure and accurate use of EHR data. There are currently a few notable international standards initiatives for EHRs. Organizations such as Health Level Seven

  18. The timing, two-pulsed nature, and variable climatic expression of the 4.2 ka event: A review and new high-resolution stalagmite data from Namibia

    Science.gov (United States)

    Railsback, L. Bruce; Liang, Fuyuan; Brook, G. A.; Voarintsoa, Ny Riavo G.; Sletten, Hillary R.; Marais, Eugene; Hardt, Ben; Cheng, Hai; Edwards, R. Lawrence

    2018-04-01

    The climatic event between 4.2 and 3.9 ka BP known as the "4.2 ka event" is commonly considered to be a synchronous global drought that happened as one pulse. However, careful comparison of records from around the world shows that synchrony is possible only if the published chronologies of the various records are shifted to the extent allowed by the uncertainties of their age data, that several records suggest a two-pulsed event, and that some records suggest a wet rather than dry event. The radiometric ages constraining those records have uncertainties of several decades if not hundreds of years, and in some records the event is represented by only one or two analyses. This paper reports a new record from Stalagmite DP1 from northeastern Namibia in which high 230Th/232Th activity ratios allow small age uncertainties ranging between only 10-28 years, and the event is documented by more than 35 isotopic analyses and by petrographic observation of a surface of dissolution. The ages from Stalagmite DP1 combine with results from 11 other records from around the world to suggest an event centered at about 4.07 ka BP with bracketing ages of 4.15 to 3.93 ka BP. The isotopic and petrographic results suggest a two-pulsed wet event in northeastern Namibia, which is in the Southern Hemisphere's summer rainfall zone where more rain presumably fell with southward migration of the Inter-Tropical Convergence Zone as the result of cooling in the Northern Hemisphere. Comparison with other records from outside the region of dryness from the Mediterranean to eastern Asia suggests that multiple climatic zones similarly moved southward during the event, in some cases bringing wetter conditions that contradict the notion of global drought.

  19. The first confirmed breeding record and new distribution data for ...

    African Journals Online (AJOL)

    On 17 November 2007 during fieldwork for the Tanzania Birds Atlas in western Tanzania, we were ... The nest was about 5 m high on the end of a thin downward branch and could not be reached to check the ... Region of W Tanzania at 5-7ºS.” The Tanzania Bird Atlas currently holds 72 records for this species for all.

  20. Financial Record Checking in Surveys: Do Prompts Improve Data Quality?

    Science.gov (United States)

    Murphy, Joe; Rosen, Jeffrey; Richards, Ashley; Riley, Sarah; Peytchev, Andy; Lindblad, Mark

    2016-01-01

    Self-reports of financial information in surveys, such as wealth, income, and assets, are particularly prone to inaccuracy. We sought to improve the quality of financial information captured in a survey conducted by phone and in person by encouraging respondents to check records when reporting on income and assets. We investigated whether…

  1. Sequence of eruptive events in the Vesuvio area recorded in shallow-water Ionian Sea sediments

    Directory of Open Access Journals (Sweden)

    C. Taricco

    2008-01-01

    Full Text Available The dating of the cores we drilled from the Gallipoli terrace in the Gulf of Taranto (Ionian Sea), previously obtained by tephroanalysis, is checked by applying a method to objectively recognize volcanic events. This automatic statistical procedure allows identifying pulse-like features in a series and evaluating quantitatively the confidence level at which the significant peaks are detected. We applied it to the 2000-year-long pyroxene series of the GT89-3 core, on which the dating is based. The method confirms the dating previously performed by detecting at a high confidence level the peaks originally used and indicates a few possible undocumented eruptions. Moreover, a spectral analysis, focussed on the long-term variability of the pyroxene series and performed by several advanced methods, reveals that the volcanic pulses are superimposed on a millennial trend and a 400-year oscillation.
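
    A generic stand-in for such an objective pulse-detection step can be written with scipy; this is only a sketch of the idea, as the authors' procedure also attaches a formal confidence level to each detected peak:

        import numpy as np
        from scipy.signal import find_peaks

        # Hedged stand-in for the paper's pulse-detection procedure: flag
        # pyroxene-concentration peaks rising well above a background level
        # estimated from the series itself.
        series = np.array([1.0, 1.1, 0.9, 4.2, 1.0, 1.2, 5.0, 1.1, 0.8])
        background, noise = np.median(series), series.std()
        peaks, _ = find_peaks(series, height=background + 2 * noise)
        print(peaks)   # indices of candidate volcanic pulses, here [3 6]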

  2. Sequence of eruptive events in the Vesuvio area recorded in shallow-water Ionian Sea sediments

    Science.gov (United States)

    Taricco, C.; Alessio, S.; Vivaldo, G.

    2008-01-01

    The dating of the cores we drilled from the Gallipoli terrace in the Gulf of Taranto (Ionian Sea), previously obtained by tephroanalysis, is checked by applying a method to objectively recognize volcanic events. This automatic statistical procedure allows identifying pulse-like features in a series and evaluating quantitatively the confidence level at which the significant peaks are detected. We applied it to the 2000-year-long pyroxene series of the GT89-3 core, on which the dating is based. The method confirms the dating previously performed by detecting at a high confidence level the peaks originally used and indicates a few possible undocumented eruptions. Moreover, a spectral analysis, focussed on the long-term variability of the pyroxene series and performed by several advanced methods, reveals that the volcanic pulses are superimposed on a millennial trend and a 400-year oscillation.

  3. In-Flight Observations of Long-Term Single Event Effect(SEE)Performance on Orbview-2 and Xray Timing Explorer(XTE)Solid State Recorders (SSR)

    Science.gov (United States)

    Poivey, Christian; Barth, Janet L.; LaBel, Ken A.; Gee, George; Safren, Harvey

    2003-01-01

    This paper presents Single Event Effect (SEE) in-flight data on Solid State Recorders (SSR) that have been collected over a long period of time for two NASA spacecraft: Orbview-2 and XTE. SEE flight data on solid-state memories give an opportunity to study the behavior in space of SEE-sensitive commercial devices. The actual Single Event Upset (SEU) rates can be compared with the calculated rates based on environment models and ground test data. The SEE mitigation schemes can also be evaluated in actual implementation. A significant amount of data has already been published concerning observed SEEs on memories in space. However, most of the data presented cover either a short period of time or a small number of devices. The data presented here have been collected on a large number of devices during several years. This allows statistically significant analysis of the effect of space weather fluctuations on SEU rates and of the effectiveness of the SEE countermeasures used. Only Orbview-2 data is presented in this summary. XTE data will be included in the final paper.

  4. Seasonal variability of stream water quality response to storm events captured using high-frequency and multi-parameter data

    Science.gov (United States)

    Fovet, O.; Humbert, G.; Dupas, R.; Gascuel-Odoux, C.; Gruau, G.; Jaffrezic, A.; Thelusma, G.; Faucheux, M.; Gilliet, N.; Hamon, Y.; Grimaldi, C.

    2018-04-01

    The response of stream chemistry to storms is of major interest for understanding the export of dissolved and particulate species from catchments. The related challenge is the identification of the hydrological flow paths active during these events and of the sources of the chemical elements for which these events are hot moments of export. An original four-year data set that combines high-frequency records of stream flow, turbidity, nitrate and dissolved organic carbon concentrations, and piezometric levels was used to characterize storm responses in a headwater agricultural catchment. The data set was used to test to what extent the shallow groundwater affects the variability of storm responses. A total of 177 events were described using a set of quantitative and functional descriptors related to precipitation, stream and groundwater pre-event status and event dynamics, and to the relative dynamics between water quality parameters and flow via hysteresis indices. This approach identified different types of response for each water quality parameter, whose occurrence can be quantified and related to the seasonal functioning of the catchment. This study demonstrates that high-frequency records of water quality are unique in their ability to reveal the variability of catchment storm responses.

  5. CareTrack Kids—part 3. Adverse events in children's healthcare in Australia: study protocol for a retrospective medical record review

    Science.gov (United States)

    Hibbert, Peter D; Hallahan, Andrew R; Muething, Stephen E; Lachman, Peter; Hooper, Tamara D; Wiles, Louise K; Jaffe, Adam; White, Les; Wheaton, Gavin R; Runciman, William B; Dalton, Sarah; Williams, Helena M; Braithwaite, Jeffrey

    2015-01-01

    Introduction A high-quality health system should deliver care that is free from harm. Few large-scale studies of adverse events have been undertaken in children's healthcare internationally, and none in Australia. The aim of this study is to measure the frequency and types of adverse events encountered in Australian paediatric care in a range of healthcare settings. Methods and analysis A form of retrospective medical record review, the Institute of Healthcare Improvement's Global Trigger Tool, will be modified to collect data. Records of children aged <16 years managed during 2012 and 2013 will be reviewed. We aim to review 6000–8000 records from a sample of healthcare practices (hospitals, general practices and specialists). Ethics and dissemination Human Research Ethics Committee approvals have been received from the Sydney Children's Hospital Network, Children's Health Queensland Hospital and Health Service, and the Women's and Children's Hospital Network in South Australia. An application is under review with the Royal Australian College of General Practitioners. The authors will submit the results of the study to relevant journals and undertake national and international oral presentations to researchers, clinicians and policymakers. PMID:25854978

  6. Large-mass di-jet event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with the largest-mass jet pair fulfilling all analysis requirements observed so far by the CMS detector in collision data collected in 2015. The mass of the di-jet system is 6.14 TeV. Both jets are reconstructed in the barrel region and have transverse momenta of about 3 TeV each.

  7. Enhancing Breast Cancer Recurrence Algorithms Through Selective Use of Medical Record Data.

    Science.gov (United States)

    Kroenke, Candyce H; Chubak, Jessica; Johnson, Lisa; Castillo, Adrienne; Weltzien, Erin; Caan, Bette J

    2016-03-01

    The utility of data-based algorithms in research has been questioned because of errors in the identification of cancer recurrences. We adapted previously published breast cancer recurrence algorithms, selectively using medical record (MR) data to improve classification. We evaluated second breast cancer event (SBCE) and recurrence-specific algorithms previously published by Chubak and colleagues in 1535 women from the Life After Cancer Epidemiology (LACE) and 225 women from the Women's Health Initiative cohorts, and compared classification statistics to published values. We also sought to improve classification with minimal MR examination. We selected pairs of algorithms (one with high sensitivity/high positive predictive value (PPV) and another with high specificity/high PPV), using MR information to resolve discrepancies between algorithms and properly classify events based on review; we called this "triangulation." Finally, in LACE, we compared associations between breast cancer survival risk factors and recurrence using MR data, single Chubak algorithms, and triangulation. The SBCE algorithms performed well in identifying SBCE and recurrences. The recurrence-specific algorithms performed more poorly than published, except for the high-specificity/high-PPV algorithm, which performed well. The triangulation method (sensitivity = 81.3%, specificity = 99.7%, PPV = 98.1%, NPV = 96.5%) improved recurrence classification over the two single algorithms (sensitivity = 57.1%, specificity = 95.5%, PPV = 71.3%, NPV = 91.9%; and sensitivity = 74.6%, specificity = 97.3%, PPV = 84.7%, NPV = 95.1%), with 10.6% MR review. Triangulation performed well in survival risk factor analyses vs analyses using MR-identified recurrences. The use of multiple recurrence algorithms in administrative data, in combination with selective examination of MR data, may improve recurrence data quality and reduce research costs. © The Author 2015. Published by Oxford University Press. All rights reserved. For
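
    The triangulation rule itself is simple: accept the two algorithms' concordant classifications and resolve discordant ones by MR review. A minimal Python sketch, where chart_review stands in for the manual adjudication step:

        # Hedged sketch of the "triangulation" idea: trust the two algorithms
        # when they agree, and send only discordant cases to medical-record
        # (MR) review; chart_review stands in for manual adjudication.
        def triangulate(high_sens_flag, high_spec_flag, chart_review):
            if high_sens_flag == high_spec_flag:
                return high_sens_flag          # concordant: accept without MR review
            return chart_review()              # discordant: resolve from the MR

        print(triangulate(True, True, chart_review=lambda: False))   # True
        print(triangulate(True, False, chart_review=lambda: False))  # False, via MR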

  8. Fitting experimental data by using weighted Monte Carlo events

    International Nuclear Information System (INIS)

    Stojnev, S.

    2003-01-01

    A method for fitting experimental data using modified Monte Carlo (MC) sample is developed. It is intended to help when a single finite MC source has to fit experimental data looking for parameters in a certain underlying theory. The extraction of the searched parameters, the errors estimation and the goodness-of-fit testing is based on the binned maximum likelihood method
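
    A minimal sketch of the approach described, under stated assumptions: a single Monte Carlo sample generated at a reference parameter value is reweighted event-by-event to represent any trial value of the theory parameter, and a binned Poisson maximum likelihood is minimized. The exponential pdf shapes and parameter names are illustrative, not taken from the paper.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(0)
        mc = rng.exponential(scale=1.0, size=100_000)   # one finite MC sample (tau = 1)
        data = rng.exponential(scale=1.3, size=5_000)   # "experimental" data, unknown tau

        edges = np.linspace(0.0, 6.0, 31)
        n_obs, _ = np.histogram(data, bins=edges)

        def neg_log_likelihood(tau):
            # Event weight = pdf(x; tau) / pdf(x; 1), so the same MC sample
            # models every trial value of tau.
            w = np.exp(mc * (1.0 - 1.0 / tau)) / tau
            nu, _ = np.histogram(mc, bins=edges, weights=w)
            nu *= len(data) / w.sum()           # normalize to the data yield
            nu = np.clip(nu, 1e-9, None)        # guard against empty bins
            return np.sum(nu - n_obs * np.log(nu))  # binned Poisson -log L

        fit = minimize_scalar(neg_log_likelihood, bounds=(0.5, 3.0), method="bounded")
        print(f"fitted tau = {fit.x:.3f}")        # close to the true value 1.3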

  9. Compilation of data concerning known and suspected water hammer events in nuclear power plants, CY 1969

    International Nuclear Information System (INIS)

    Chapman, R.L.; Christensen, D.D.; Dafoe, R.E.; Hanner, O.M.; Wells, M.E.

    1981-05-01

    This report compiles data concerning known and suspected water hammer events reported by BWR and PWR power plants in the United States from January 1, 1969, to May 1, 1981. This information is summarized for each event and is tabulated for all events by plant, plant type, year of occurrence, type of water hammer, system affected, basis/cause for the event, and damage incurred. Information is also included from other events not specifically identified as water hammer related. These other events involved vibration and/or system components similar to those involved in the water hammer events. The other events are included to ensure completeness of the report, but are not used to point out particular facts or trends. This report does not evaluate findings abstracted from the data

  10. Tablet computers for recording tuberculosis data at a community ...

    African Journals Online (AJOL)

    Don O’Mahony

    2014-08-20

    Aug 20, 2014 ... There are essentially two data collection systems at CHCs. The first pertains to ... The second pertains to patient management. Patient data and ... operating system for clinical applications on tablet devices. Based on the above ... tool for data collection as it supports a total process and environment to help ...

  11. To what extent are adverse events found in patient records reported by patients and healthcare professionals via complaints, claims and incident reports?

    Directory of Open Access Journals (Sweden)

    van der Wal Gerrit

    2011-02-01

    Full Text Available Abstract Background Patient record review is believed to be the most useful method for estimating the rate of adverse events among hospitalised patients. However, the method has some practical and financial disadvantages. Some of these disadvantages might be overcome by using existing reporting systems in which patient safety issues are already reported, such as incidents reported by healthcare professionals and complaints and medico-legal claims filed by patients or their relatives. The aim of the study is to examine to what extent the hospital reporting systems cover the adverse events identified by patient record review. Methods We conducted a retrospective study using a database from a record review study of 5375 patient records in 14 hospitals in the Netherlands. Trained nurses and physicians using a method based on the protocol of The Harvard Medical Practice Study previously reviewed the records. Four reporting systems were linked with the database of reviewed records: (1) informal and (2) formal complaints by patients/relatives, (3) medico-legal claims by patients/relatives and (4) incident reports by healthcare professionals. For each adverse event identified in patient records the equivalent was sought in these reporting systems by comparing dates and descriptions of the events. The study focussed on the number of adverse event matches, overlap of adverse events detected by different sources, preventability and severity of consequences of reported and non-reported events and sensitivity and specificity of reports. Results In the sample of 5375 patient records, 498 adverse events were identified. Only 18 of the 498 (3.6%) adverse events identified by record review were found in one or more of the four reporting systems. There was some overlap: one adverse event had an equivalent in both a complaint and incident report and in three cases a patient/relative used two or three systems to complain about an adverse event. Healthcare professionals

  12. NOAA Climate Data Record (CDR) of Northern Hemisphere (NH) Snow Cover Extent (SCE), Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This NOAA Climate Data Record (CDR) is a record for the Northern Hemisphere (NH) Snow Cover Extent (SCE) spanning from October 4, 1966 to present, updated monthly...

  13. Processing data communications events by awakening threads in parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2016-03-15

    Processing data communications events in a parallel active messaging interface (`PAMI`) of a parallel computer that includes compute nodes that execute a parallel application, with the PAMI including data communications endpoints, and the endpoints are coupled for data communications through the PAMI and through other data communications resources, including determining by an advance function that there are no actionable data communications events pending for its context, placing by the advance function its thread of execution into a wait state, waiting for a subsequent data communications event for the context; responsive to occurrence of a subsequent data communications event for the context, awakening by the thread from the wait state; and processing by the advance function the subsequent data communications event now pending for the context.
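
    The wait/awaken pattern in this abstract maps naturally onto a condition variable. Below is a hedged Python analogue of that pattern only; it is not the PAMI C API, and the class and method names are invented for illustration.

        import threading
        from collections import deque

        class Context:
            """Toy analogue of a PAMI context: an event queue plus an
            advance() that sleeps while nothing is actionable and is
            awakened when an event arrives."""

            def __init__(self):
                self._events = deque()
                self._cond = threading.Condition()

            def post(self, event):
                # Called by a communications thread on event arrival.
                with self._cond:
                    self._events.append(event)
                    self._cond.notify()        # awaken a waiting advance()

            def advance(self):
                # No actionable events pending: place the thread into a
                # wait state; on wake-up, process the pending event.
                with self._cond:
                    while not self._events:
                        self._cond.wait()
                    return self._events.popleft()

        ctx = Context()
        threading.Timer(0.1, ctx.post, args=("send-complete",)).start()
        print(ctx.advance())   # blocks ~0.1 s, then prints 'send-complete'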

  14. Sharing Neuron Data: Carrots, Sticks, and Digital Records.

    Directory of Open Access Journals (Sweden)

    Giorgio A Ascoli

    2015-10-01

    Full Text Available Routine data sharing is greatly benefiting several scientific disciplines, such as molecular biology, particle physics, and astronomy. Neuroscience data, in contrast, are still rarely shared, greatly limiting the potential for secondary discovery and the acceleration of research progress. Although the attitude toward data sharing is non-uniform across neuroscience subdomains, widespread adoption of data sharing practice will require a cultural shift in the community. Digital reconstructions of axonal and dendritic morphology constitute a particularly "sharable" kind of data. The popularity of the public repository NeuroMorpho.Org demonstrates that data sharing can benefit both users and contributors. Increased data availability is also catalyzing the grassroots development and spontaneous integration of complementary resources, research tools, and community initiatives. Even in this rare successful subfield, however, more data are still unshared than shared. Our experience as developers and curators of NeuroMorpho.Org suggests that greater transparency regarding the expectations and consequences of sharing (or not sharing) data, combined with public disclosure of which datasets are shared and which are not, may expedite the transition to community-wide data sharing.

  15. Pipe break prediction based on evolutionary data-driven methods with brief recorded data

    International Nuclear Information System (INIS)

    Xu Qiang; Chen Qiuwen; Li Weifeng; Ma Jinfeng

    2011-01-01

    Pipe breaks often occur in water distribution networks, imposing great pressure on utility managers to secure stable water supply. However, pipe breaks are hard to detect by conventional methods. It is therefore necessary to develop reliable and robust pipe break models to assess the probability that a pipe will fail and then to optimize the pipe break detection scheme. In the absence of deterministic physical models for pipe break, data-driven techniques provide a promising approach to investigate the principles underlying pipe break. In this paper, two data-driven techniques, namely Genetic Programming (GP) and Evolutionary Polynomial Regression (EPR), are applied to develop pipe break models for the water distribution system of Beijing City. The comparison with the recorded pipe break data from 1987 to 2005 showed that the models have great capability to obtain reliable predictions. The models can be used to prioritize pipes for break inspection and then improve detection efficiency.

  16. Genome-wide association study for ketosis in US Jerseys using producer-recorded data.

    Science.gov (United States)

    Parker Gaddis, K L; Megonigal, J H; Clay, J S; Wolfe, C W

    2018-01-01

    Ketosis is one of the most frequently reported metabolic health events in dairy herds. Several genetic analyses of ketosis in dairy cattle have been conducted; however, few have focused specifically on Jersey cattle. The objectives of this research included estimating variance components for susceptibility to ketosis and identification of genomic regions associated with ketosis in Jersey cattle. Voluntary producer-recorded health event data related to ketosis were available from Dairy Records Management Systems (Raleigh, NC). Standardization was implemented to account for the various acronyms used by producers to designate an incidence of ketosis. Events were restricted to the first reported incidence within 60 d after calving in first through fifth parities. After editing, there were a total of 42,233 records from 23,865 cows. A total of 1,750 genotyped animals were used for genomic analyses using 60,671 markers. Because of the binary nature of the trait, a threshold animal model was fitted using THRGIBBS1F90 (version 2.110) using only pedigree information, and genomic information was incorporated using a single-step genomic BLUP approach. Individual single nucleotide polymorphism (SNP) effects and the proportion of variance explained by 10-SNP windows were calculated using postGSf90 (version 1.38). Heritability of susceptibility to ketosis was 0.083 [standard deviation (SD) = 0.021] and 0.078 (SD = 0.018) in pedigree-based and genomic analyses, respectively. The marker with the largest associated effect was located on chromosome 10 at 66.3 Mbp. The 10-SNP window explaining the largest proportion of variance (0.70%) was located on chromosome 6 beginning at 56.1 Mbp. Gene Ontology (GO) and Medical Subject Heading (MeSH) enrichment analyses identified several overrepresented processes and terms related to immune function. Our results indicate that there is a genetic component related to ketosis susceptibility in Jersey cattle and, as such, genetic selection for
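
    The "proportion of variance explained by 10-SNP windows" can be sketched directly. The following mirrors the windowing idea only, not the postGSf90 implementation; the genotype matrix and SNP effects are synthetic stand-ins.

        import numpy as np

        def window_variance_share(genotypes, snp_effects, window=10):
            """Share of genetic variance explained by each sliding window of
            `window` adjacent SNPs: Var(Z_w @ u_w) / Var(Z @ u), with Z the
            centered genotype matrix and u the estimated SNP effects."""
            Z = genotypes - genotypes.mean(axis=0)       # center each SNP
            total = np.var(Z @ snp_effects)
            shares = [np.var(Z[:, s:s + window] @ snp_effects[s:s + window]) / total
                      for s in range(Z.shape[1] - window + 1)]
            return np.array(shares)

        rng = np.random.default_rng(4)
        geno = rng.integers(0, 3, size=(1750, 500)).astype(float)  # 0/1/2 allele counts
        effects = rng.normal(0.0, 0.01, size=500)
        shares = window_variance_share(geno, effects)
        print(f"top window starts at SNP {shares.argmax()}, "
              f"explaining {100 * shares.max():.2f}% of the variance")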

  17. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.
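
    Since PyWEED is built on ObsPy, the same event-oriented workflow can be sketched in a few lines of ObsPy. This is a minimal illustration, not PyWEED itself; the network/station codes and the choice of a 10-minute window starting at the origin time are arbitrary (PyWEED can window relative to predicted phase arrivals instead).

        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("IRIS")

        # Browse the event catalog, as in the WILBER/PyWEED event map.
        events = client.get_events(starttime=UTCDateTime("2017-09-01"),
                                   endtime=UTCDateTime("2017-09-30"),
                                   minmagnitude=7.0)

        for event in events:
            origin = event.preferred_origin() or event.origins[0]
            # One event/station pair: 10 minutes of data from the origin time.
            st = client.get_waveforms("IU", "ANMO", "00", "BHZ",
                                      origin.time, origin.time + 600,
                                      attach_response=True)
            st.remove_response(output="VEL")          # instrument correction
            st.write(f"{origin.time.date}_ANMO.mseed", format="MSEED")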

  18. Telling data: The accountancy record of a Chinese farmer

    NARCIS (Netherlands)

    Zhao, Y.; Ploeg, van der J.D.

    2014-01-01

    This paper presents and analyses the notebook of a Chinese farmer. The notebook contains a wealth of farm accountancy data. The data and the many interrelations contained in them, are used to describe the structure and dynamics of farming in NW China. The availability of former notebooks that played

  19. Visualizing Research Data Records for their Better Management

    DEFF Research Database (Denmark)

    Ball, Alexander; Darlington, Mansur; Howard, Thomas J.

    2014-01-01

    As academia in general, and research funders in particular, place ever greater importance on data as an output of research, so the value of good research data management practices becomes ever more apparent. In response to this, the Innovative Design and Manufacturing Research Centre (IdMRC) at the ... with the associations between them. This method, called Research Activity Information Development (RAID) Modelling, is based on the Unified Modelling Language (UML) for portability. It is offered to the wider research community as an intuitive way for researchers both to keep track of their own data and to communicate...

  20. Mining Staff Assignment Rules from Event-Based Data

    NARCIS (Netherlands)

    Ly, Linh Thao; Rinderle, Stefanie; Dadam, Peter; Reichert, Manfred; Bussler, Christoph J.; Haller, Armin

    2006-01-01

    Process mining offers methods and techniques for capturing process behaviour from log data of past process executions. Although many promising approaches on mining the control flow have been published, no attempt has been made to mine the staff assignment situation of business processes. In this

  1. Continuous data recording on fast real-time systems

    Energy Technology Data Exchange (ETDEWEB)

    Zabeo, L., E-mail: lzabeo@jet.u [Euratom-CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Sartori, F. [Euratom-CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Neto, A. [Associacao Euratom-IST, Instituto de Plasmas e Fusao Nuclear, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Piccolo, F. [Euratom-CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Alves, D. [Associacao Euratom-IST, Instituto de Plasmas e Fusao Nuclear, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Vitelli, R. [Dipartimento di Informatica, Sistemi e Produzione, Universita di Roma, Tor Vergata, Via del Politecnico, 1-00133 Roma (Italy); Barbalace, A. [Euratom-ENEA Association, Consorzio RFX, 35127 Padova (Italy); De Tommasi, G. [Associazione EURATOM/ENEA/CREATE, Universita di Napoli Federico II, Napoli (Italy)

    2010-07-15

    The PCU-Project launched for the enhancement of the vertical stabilisation system at JET required the design of a new real-time control system with the challenging specifications of 2 Gops and a cycle time of 50 μs. The RTAI-based architecture running on x86 multi-core processor technology proved to be the best platform for meeting the high requirements. Moreover, on this architecture, thanks to the smart allocation of the interrupts, it was possible to demonstrate simultaneous data streaming at 50 MB/s on Ethernet while handling a real-time 100 kHz interrupt source with a maximum jitter of just 3 μs. Because of the memory limitation imposed by the 32-bit version of Linux running in kernel mode, the RTAI-based new controller allows a maximum practical data storage of 800 MB per pulse. While this amount of data can be accepted for JET normal operation, it posed some limitations in the debugging and commissioning of the system. In order to increase the data acquisition capability of the system, we have designed a mechanism that allows continuous full-bandwidth (56 MB/s) data streaming from the real-time task (running in kernel mode) to either a data collector (running in user mode) or an external data acquisition server. The architecture involves a peer-to-peer mechanism in which the sender, running in RTAI kernel mode, broadcasts large chunks of data using UDP packets, implemented using the 'fcomm' RTAI extension, to a receiver that will store the data. The paper will present the results of the initial RTAI operating system tests, the design of the streaming architecture and the first experimental results.

  2. Using Electronic Health Records to Build an Ophthalmologic Data Warehouse and Visualize Patients' Data.

    Science.gov (United States)

    Kortüm, Karsten U; Müller, Michael; Kern, Christoph; Babenko, Alexander; Mayer, Wolfgang J; Kampik, Anselm; Kreutzer, Thomas C; Priglinger, Siegfried; Hirneiss, Christoph

    2017-06-01

    To develop a near-real-time data warehouse (DW) in an academic ophthalmologic center to gain scientific use of increasing digital data from electronic medical records (EMR) and diagnostic devices. Database development. Specific macular clinic user interfaces within the institutional hospital information system were created. Orders for imaging modalities were sent by an EMR-linked picture-archiving and communications system to the respective devices. All data of 325 767 patients since 2002 were gathered in a DW running on an SQL database. A data discovery tool was developed. An exemplary search for patients with age-related macular degeneration, performed cataract surgery, and at least 10 intravitreal (excluding bevacizumab) injections was conducted. Data related to those patients (3 142 204 diagnoses [including diagnoses from other fields of medicine], 720 721 procedures [eg, surgery], and 45 416 intravitreal injections) were stored, including 81 274 optical coherence tomography measurements. A web-based browsing tool was successfully developed for data visualization and filtering data by several linked criteria, for example, minimum number of intravitreal injections of a specific drug and visual acuity interval. The exemplary search identified 450 patients with 516 eyes meeting all criteria. A DW was successfully implemented in an ophthalmologic academic environment to support and facilitate research by using increasing EMR and measurement data. The identification of eligible patients for studies was simplified. In future, software for decision support can be developed based on the DW and its structured data. The improved classification of diseases and semiautomatic validation of data via machine learning are warranted. Copyright © 2017 Elsevier Inc. All rights reserved.
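
    The exemplary cohort search reduces to a grouped join once the warehouse tables are in place. Below is a sketch against a deliberately simplified, hypothetical schema (the actual DW schema is not described in this abstract); H35.3* is used as the ICD-10 code family for age-related macular degeneration.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE diagnoses  (patient_id INT, icd10 TEXT);
        CREATE TABLE procedures (patient_id INT, code TEXT);
        CREATE TABLE injections (patient_id INT, drug TEXT, injected_on DATE);
        """)

        # AMD diagnosis, performed cataract surgery, and at least 10
        # intravitreal injections excluding bevacizumab.
        query = """
        SELECT d.patient_id
        FROM diagnoses d
        JOIN procedures p ON p.patient_id = d.patient_id
        JOIN injections i ON i.patient_id = d.patient_id
        WHERE d.icd10 LIKE 'H35.3%'
          AND p.code = 'cataract_surgery'
          AND i.drug <> 'bevacizumab'
        GROUP BY d.patient_id
        HAVING COUNT(DISTINCT i.rowid) >= 10;
        """
        for (patient_id,) in con.execute(query):
            print(patient_id)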

  3. Evaluating the data completeness in the Electronic Health Record after the Implementation of an Outpatient Electronic Health Record.

    Science.gov (United States)

    Soto, Mauricio; Capurro, Daniel; Catalán, Silvia

    2015-01-01

    Electronic health records (EHRs) present an opportunity for quality improvement in healthcare organizations, particularly at the primary care level. However, EHR implementation impacts clinical workflows, and physicians frequently prefer to document in a non-structured way, which ultimately hinders the ability to measure quality indicators. We present an assessment of data completeness, a key data quality indicator, during the first 12 months after the implementation of an EHR at a teaching outpatient center in Santiago, Chile.

  4. Erosion of the Alps: use of Rb-Sr isotopic data from molassic sediments to identify the ages of the metamorphism recorded by the eroded rocks

    International Nuclear Information System (INIS)

    Henry, P.; Deloule, E.

    1994-01-01

    Rb-Sr isotopic data from Oligocene and Miocene peri-Alpine molassic sediments allow us to identify the different periods for which the eroded rocks have or have not recorded an Alpine metamorphism. The Chattian and the Burdigalian sediments result from the erosion of rocks for which the latest metamorphic event was Variscan, while the Stampian, Aquitanian and "Helvetian" sediments show evidence for the erosion of rocks which have recorded Alpine metamorphic events. The application of this method to old detrital sediments could permit determination of the ages of the tectonic events which occurred in the sediment source regions. (authors). 18 refs., 6 figs

  5. EVALUATING RISK-PREDICTION MODELS USING DATA FROM ELECTRONIC HEALTH RECORDS.

    Science.gov (United States)

    Wang, L E; Shaw, Pamela A; Mathelier, Hansie M; Kimmel, Stephen E; French, Benjamin

    2016-03-01

    The availability of data from electronic health records facilitates the development and evaluation of risk-prediction models, but estimation of prediction accuracy could be limited by outcome misclassification, which can arise if events are not captured. We evaluate the robustness of prediction accuracy summaries, obtained from receiver operating characteristic curves and risk-reclassification methods, if events are not captured (i.e., "false negatives"). We derive estimators for sensitivity and specificity if misclassification is independent of marker values. In simulation studies, we quantify the potential for bias in prediction accuracy summaries if misclassification depends on marker values. We compare the accuracy of alternative prognostic models for 30-day all-cause hospital readmission among 4548 patients discharged from the University of Pennsylvania Health System with a primary diagnosis of heart failure. Simulation studies indicate that if misclassification depends on marker values, then the estimated accuracy improvement is also biased, but the direction of the bias depends on the direction of the association between markers and the probability of misclassification. In our application, 29% of the 1143 readmitted patients were readmitted to a hospital elsewhere in Pennsylvania, which reduced prediction accuracy. Outcome misclassification can result in erroneous conclusions regarding the accuracy of risk-prediction models.
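
    The direction-of-bias point can be reproduced with a small simulation. This is a sketch in the spirit of the study, not its code: a single marker, true outcomes from a logistic model, and events missed ("false negatives") at a rate that either is constant or increases with the marker value.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n = 20_000
        marker = rng.normal(size=n)                      # risk score
        p_event = 1 / (1 + np.exp(-(marker - 1.5)))      # true event probability
        truth = rng.random(n) < p_event

        def observed_auc(p_miss_low, p_miss_high):
            """AUC against outcomes where some events go uncaptured; the
            miss rate may depend on the marker value."""
            p_miss = np.where(marker > 0, p_miss_high, p_miss_low)
            captured = truth & (rng.random(n) >= p_miss)
            return roc_auc_score(captured, marker)

        print(f"true AUC:                      {roc_auc_score(truth, marker):.3f}")
        print(f"marker-independent miss rate:  {observed_auc(0.3, 0.3):.3f}")
        print(f"miss rate rising with marker:  {observed_auc(0.1, 0.5):.3f}")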

  6. Analysis of spectral data with rare events statistics

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    We consider the case of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis about persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis about the presence of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs.
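
    The stated scaling can be written out explicitly. A sketch, assuming each of the K independent runs yields a peak significance S_i (in standard deviations) and the runs are combined in quadrature:

        % combined significance of K independent runs
        S_{\mathrm{comb}} \;=\; \sqrt{\sum_{i=1}^{K} S_i^{2}}
        \;\approx\; S\sqrt{K} \qquad \text{when } S_i \approx S .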

  7. Geochemical and palynological records for the end-Triassic Mass-Extinction Event in the NE Paris Basin (Luxemburg)

    Science.gov (United States)

    Kuhlmann, Natascha; van de Schootbrugge, Bas; Thein, Jean; Fiebig, Jens; Franz, Sven-Oliver; Hanzo, Micheline; Colbach, Robert; Faber, Alain

    2016-04-01

    The End-Triassic mass-extinction event is one of the "big five" mass extinctions in Earth's history. Large-scale flood basalt volcanism associated with the break-up of Pangaea, which resulted in the opening of the central Atlantic Ocean, is considered the leading cause. In addition, an asteroid impact in Rochechouart (France; 201 ± 2 Ma) may have had a local influence on ecosystems and sedimentary settings. The Luxembourg Embayment, in the NE Paris Basin, offers a rare chance to study both effects in a range of settings from deltaic to lagoonal. A multidisciplinary study (sedimentology, geochemistry, palynology) has been carried out on a number of outcrops and cores that span from the Norian to the lower Hettangian. Combined geochemical and palynological records from the Boust core drilled in the NE Paris Basin provide evidence for paleoenvironmental changes associated with the end-Triassic mass-extinction event. The Triassic-Jurassic stratigraphy of the Boust core is well constrained by palynomorphs showing the disappearance of typical Triassic pollen taxa (e.g. Ricciisporites tuberculatus) and the occurrence of the marker species Polypodiisporites polymicroforatus within the uppermost Rhaetian, prior to the Hettangian dominance of Classopollis pollen. The organic carbon stable isotope record (δ13Corg) spanning the Norian to Hettangian shows a series of prominent negative excursions within the middle Rhaetian, followed by a trend towards more positive values (approximately -24 per mille) within the uppermost Rhaetian Argiles de Levallois Member. The lowermost Hettangian is characterized by a major negative excursion, reaching -30 per mille, that occurs in organic-rich sediments. This so-called "main negative excursion" is well known from other locations, for example from Mariental in Northern Germany, from St Audrie's Bay in England, and from Stenlille in Denmark. Based on redox-sensitive trace element records (V, Cr, Ni, Co, Th, U) the lowermost Hettangian in most of

  8. The application of an event data store to safe operation

    International Nuclear Information System (INIS)

    Bowen, J.H.

    1977-01-01

    The problem considered is how those responsible in industry could attempt to demonstrate the degree of safety which the First Report of the Major Hazard Committee in the U.K. indicated as a minimum for the public at large. Reliance on Codes of Practice appears inappropriate at this point in time. Documentary evidence on the lines of a Hazard Analysis, quantified with generalised failure data as obtainable from a Data Bank, is likewise rejected as not conveying sufficient confidence. The recommended approach would include Hazard Analysis as an essential feature, but would envisage a feedback of information from the early operations of the actual plant to update the estimates in Bayesian fashion, and to revise and review major assumptions about plant management and control, using techniques of multivariate analysis. Correlation of failure information from many 'similar' plants is recommended. The extension of Data Banks to include information of a more operational type is recommended; and, preferably in conjunction with this, techniques of operational research should be applied to test correct parametrization. Progress on all these aspects is reported.

  9. Characterization of System Level Single Event Upset (SEU) Responses using SEU Data, Classical Reliability Models, and Space Environment Data

    Science.gov (United States)

    Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael

    2017-01-01

    We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.

  10. NOAA/NSIDC Climate Data Record of Passive Microwave Sea Ice Concentration

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set provides a Climate Data Record (CDR) of sea ice concentration from passive microwave data. It provides a consistent, daily and monthly time series of...

  11. MESSENGER H XRS 5 REDUCED DATA RECORD (RDR) FOOTPRINTS V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — Abstract ======== This data set consists of the MESSENGER XRS reduced data record (RDR) footprints which are derived from the navigational meta-data for each...

  12. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Science.gov (United States)

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  13. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    Science.gov (United States)

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.

  14. Process cubes: slicing, dicing, rolling up and drilling down event data for process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2013-01-01

    Recent breakthroughs in process mining research make it possible to discover, analyze, and improve business processes based on event data. The growth of event data provides many opportunities but also imposes new challenges. Process mining is typically done for an isolated well-defined process in

  15. Di-photon events recorded by the CMS detector (Run 2, 13 TeV, 0 T)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

    This image shows a collision event with a photon pair observed by the CMS detector in proton-collision data collected in 2015 with no magnetic field present. The energy deposits of the two photons are represented by the two large green towers. The mass of the di-photon system is between 700 and 800 GeV. The candidates are consistent with what is expected for prompt isolated photons.

  16. Large-mass di-jet event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

    This image shows a collision event with the largest-mass jet pair fulfilling all analysis requirements observed so far by the CMS detector in proton-proton collision data collected in 2016. The mass of the di-jet system is 7.7 TeV. Both jets are reconstructed in the barrel region and each have transverse momenta of over 3 TeV.

  17. GPS location data enhancement in electronic traffic records.

    Science.gov (United States)

    2013-01-01

    In this project we developed a new GPS-based Geographical Information Exchange : Framework (GIEF) to improve the correctness and accuracy of location data reported on : electronic police forms in Oklahoma. A second major goal was to provide a base le...

  18. Complex life histories of fishes revealed through natural information storage devices: case studies of diadromous events as recorded by otoliths

    International Nuclear Information System (INIS)

    Elfman, M.; Limburg, K.E.; Kristiansson, P.; Svedaeng, H.; Westin, L.; Wickstroem, H.; Malmqvist, K.; Pallon, J.

    2000-01-01

    Diadromous fishes - species that move across salinity gradients as part of their life repertoire - form a major part of coastal and inland fisheries. Conventional mark-recapture techniques have long been used to track their movements, but give incomplete information at best. On the other hand, otoliths (ear-stones) of fishes can provide a complete record of major life history events, as reflected both in their microstructure and elemental composition. Strontium, which substitutes for calcium in the aragonite matrix of otoliths, is a powerful tracer of salinity histories in many migratory fishes. We measured Sr and Ca with a nuclear microprobe (PIXE) and show examples (eel, Anguilla anguilla; brown trout, Salmo trutta; American shad, Alosa sapidissima) of how the technique has solved several mysteries within fisheries biology

  19. Event-Driven Technology to Generate Relevant Collections of Near-Realtime Data

    Science.gov (United States)

    Graves, S. J.; Keiser, K.; Nair, U. S.; Beck, J. M.; Ebersole, S.

    2017-12-01

    Getting the right data when it is needed continues to be a challenge for researchers and decision makers. Event-Driven Data Delivery (ED3), funded by the NASA Applied Science program, is a technology that allows researchers and decision makers to pre-plan what data, information and processes they need to have collected or executed in response to future events. The Information Technology and Systems Center at the University of Alabama in Huntsville (UAH) has developed the ED3 framework in collaboration with atmospheric scientists at UAH, scientists at the Geological Survey of Alabama, and other federal, state and local stakeholders to meet the data preparedness needs for research, decisions and situational awareness. The ED3 framework provides an API that supports the addition of loosely-coupled, distributed event handlers and data processes. This approach allows the easy addition of new events and data processes so the system can scale to support virtually any type of event or data process. Using ED3's underlying services, applications have been developed that monitor for alerts of registered event types and automatically trigger subscriptions that match new events, providing users with a living "album" of results that can continue to be curated as more information for an event becomes available. This capability can allow users to improve capacity for the collection, creation and use of data and real-time processes (data access, model execution, product generation, sensor tasking, social media filtering, etc.) in response to disaster (and other) events, by preparing in advance for the data and information needs of future events. This presentation will provide an update on the ED3 developments and deployments, and further explain the applicability of utilizing near-realtime data in hazards research, response and situational awareness.
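
    The pre-planned subscription pattern described here can be sketched as a small publish/subscribe registry. This is a toy analogue, not the ED3 API; all class, event and handler names are invented.

        from collections import defaultdict

        class ED3Like:
            """Users register handlers for future event types; when an alert
            for a registered type arrives, every matching subscription fires
            and its products accumulate in a per-event "album"."""

            def __init__(self):
                self._handlers = defaultdict(list)
                self.albums = defaultdict(list)

            def subscribe(self, event_type, handler):
                self._handlers[event_type].append(handler)

            def on_alert(self, event_type, event):
                for handler in self._handlers[event_type]:
                    self.albums[event["id"]].append(handler(event))

        broker = ED3Like()
        # Pre-plan: for any future flood alert, fetch imagery and run a model.
        broker.subscribe("flood", lambda e: f"satellite imagery near {e['where']}")
        broker.subscribe("flood", lambda e: f"inundation model run for {e['where']}")

        broker.on_alert("flood", {"id": "flood-2017-001", "where": "Houston"})
        print(broker.albums["flood-2017-001"])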

  20. Changing Requirements for Archiving Climate Data Records Derived From Remotely Sensed Data

    Science.gov (United States)

    Fleig, A. J.; Tilmes, C.

    2007-05-01

    With the arrival of long-term sets of measurements of remotely sensed data it becomes important to improve the standard practices associated with archival of information needed to allow creation of climate data records (CDRs) from individual sets of measurements. Several aspects of the production of CDRs suggest that there should be changes in standard best practices for archival. A fundamental requirement for understanding long-term trends in climate data is that changes with time shown by the data reflect changes in actual geophysical parameters rather than changes in the measurement system. Even well developed and validated data sets from remotely sensed measurements contain artifacts. If the nature of the measurement and the algorithm is consistent over time, these artifacts may have little impact on trends derived from the data. However, data sets derived with different algorithms created with different assumptions are likely to introduce non-physical changes in trend data. Yet technology for making measurements and analyzing data improves with time and this must be accounted for. To do this for an ongoing long-term data set based on multiple instruments it is important to understand exactly how the preceding data was produced. But we are reaching the point where the scientists and engineers who developed the initial measurements and algorithms are no longer available to explain and assist in adapting today's systems for use with future measurement systems. In an era where tens to hundreds of man-years are involved in calibrating an instrument and producing and validating a set of geophysical measurements from the calibrated data, we have long passed the time when it was reasonable to say "just give me the basic measurement and a bright graduate student and I can produce anything I need in a year." Examples of problems encountered and alternative solutions will be provided based on developing and reprocessing data sets from long term measurements of

  1. Estimating the Probability of Wind Ramping Events: A Data-driven Approach

    OpenAIRE

    Wang, Cheng; Wei, Wei; Wang, Jianhui; Qiu, Feng

    2016-01-01

    This letter proposes a data-driven method for estimating the probability of wind ramping events without exploiting the exact probability distribution function (PDF) of wind power. Actual wind data validates the proposed method.
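
    A minimal sketch of such a distribution-free estimate: count how often the power swing over a fixed look-ahead window exceeds a ramp threshold, directly from historical samples. The threshold, window length and synthetic series are illustrative only.

        import numpy as np

        def ramp_probability(power, window, threshold):
            """Empirical probability that normalized wind power changes by
            more than `threshold` within `window` samples; no PDF assumed."""
            power = np.asarray(power, dtype=float)
            swings = np.abs(power[window:] - power[:-window])
            return float(np.mean(swings > threshold))

        rng = np.random.default_rng(2)
        t = np.linspace(0.0, 60.0, 10_000)
        wind = np.clip(0.5 + 0.3 * np.sin(t) + rng.normal(0, 0.05, t.size), 0, 1)
        print(f"P(ramp > 30% of capacity within 12 steps) = "
              f"{ramp_probability(wind, window=12, threshold=0.3):.4f}")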

  2. Record-low primary productivity and high plant damage in the Nordic Arctic Region in 2012 caused by multiple weather events and pest outbreaks

    International Nuclear Information System (INIS)

    Bjerke, Jarle W; Jepsen, Jane U; Lovibond, Sarah; Tømmervik, Hans; Rune Karlsen, Stein; Arild Høgda, Kjell; Malnes, Eirik; Vikhamar-Schuler, Dagrun

    2014-01-01

    The release of cold temperature constraints on photosynthesis has led to increased productivity (greening) in significant parts (32–39%) of the Arctic, but much of the Arctic shows stable (57–64%) or reduced productivity (browning, <4%). Summer drought and wildfires are the best-documented drivers causing browning of continental areas, but factors dampening the greening effect of more maritime regions have remained elusive. Here we show how multiple anomalous weather events severely affected the terrestrial productivity during one water year (October 2011–September 2012) in a maritime region north of the Arctic Circle, the Nordic Arctic Region, and contributed to the lowest mean vegetation greenness (normalized difference vegetation index) recorded this century. Procedures for field data sampling were designed during or shortly after the events in order to assess both the variability in effects and the maximum effects of the stressors. Outbreaks of insect and fungal pests also contributed to low greenness. Vegetation greenness in 2012 was 6.8% lower than the 2000–11 average and 58% lower in the worst affected areas that were under multiple stressors. These results indicate the importance of events (some being mostly neglected in climate change effect studies and monitoring) for primary productivity in a high-latitude maritime region, and highlight the importance of monitoring plant damage in the field and including frequencies of stress events in models of carbon economy and ecosystem change in the Arctic. Fourteen weather events and anomalies and 32 hypothesized impacts on plant productivity are summarized as an aid for directing future research. (letter)

  3. Interval-Censored Time-to-Event Data Methods and Applications

    CERN Document Server

    Chen, Ding-Geng

    2012-01-01

    Interval-Censored Time-to-Event Data: Methods and Applications collects the most recent techniques, models, and computational tools for interval-censored time-to-event data. Top biostatisticians from academia, biopharmaceutical industries, and government agencies discuss how these advances are impacting clinical trials and biomedical research. Divided into three parts, the book begins with an overview of interval-censored data modeling, including nonparametric estimation, survival functions, regression analysis, multivariate data analysis, competing risks analysis, and other models for interva

  4. Using gamification to drive patient’s personal data validation in a Personal Health Record

    Directory of Open Access Journals (Sweden)

    Guido Giunti

    2015-10-01

    Full Text Available Gamification is a term used to describe using game elements in non-game environments to enhance user experience. It has been incorporated with commercial success into several platforms (LinkedIn, Badgeville, Facebook); this has made some researchers theorize that it could also be used in education as a tool to increase student engagement and to drive desirable learning behaviors in them. While in the past years some game elements have been incorporated into healthcare, there is still little evidence on how effective they are. Game elements provide engagement consistent with various theories of motivation, positive psychology (e.g., flow), and also provide instant feedback. Feedback is more effective when it provides sufficient and specific information for goal achievement and is presented relatively close in time to the event being evaluated. Feedback can reference individual progress, can make social comparisons, or can refer to task criteria. Electronic personal health record systems (PHRs) support patient centered healthcare by making medical records and other relevant information accessible to patients, thus assisting patients in health self-management. A particularly difficult data set to capture is that regarding social and cultural background information. This data set is not only useful to help better healthcare system management, it is also relevant as it is used for epidemiological and preventive purposes. We used gamified mechanics that involve instant feedback to test if they would increase patients' personal data validation and completion in our PHR as well as overall PHR use. In our presentation we will describe our results and the story behind them.

  5. Real-time digital filtering, event triggering, and tomographic reconstruction of JET soft x-ray data (abstract)

    Science.gov (United States)

    Edwards, A. W.; Blackler, K.; Gill, R. D.; van der Goot, E.; Holm, J.

    1990-10-01

    Based upon the experience gained with the present soft x-ray data acquisition system, new techniques are being developed which make extensive use of digital signal processors (DSPs). Digital filters make 13 further frequencies available in real time from the input sampling frequency of 200 kHz. In parallel, various algorithms running on further DSPs generate triggers in response to a range of events in the plasma. The sawtooth crash can be detected, for example, with a delay of only 50 μs from the onset of the collapse. The trigger processor interacts with the digital filter boards to ensure data of the appropriate frequency is recorded throughout a plasma discharge. An independent link is used to pass 780 and 24 Hz filtered data to a network of transputers. A full tomographic inversion and display of the 24 Hz data is carried out in real time using this 15 transputer array. The 780 Hz data are stored for immediate detailed playback following the pulse. Such a system could considerably improve the quality of present plasma diagnostic data which is, in general, sampled at one fixed frequency throughout a discharge. Further, it should provide valuable information towards designing diagnostic data acquisition systems for future long pulse operation machines when a high degree of real-time processing will be required, while retaining the ability to detect, record, and analyze events of interest within such long plasma discharges.
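
    The bank of "13 further frequencies" derived from one 200 kHz input is, in essence, a cascade of anti-aliased decimation stages. Below is a hedged sketch of that idea with SciPy (the divide-by-4 stages are chosen arbitrarily; the actual JET DSP filters are not reproduced here):

        import numpy as np
        from scipy.signal import decimate

        fs = 200_000                            # input sampling frequency, Hz
        t = np.arange(0, 0.1, 1 / fs)
        x = np.sin(2 * np.pi * 440 * t) + 0.01 * np.random.randn(t.size)

        # Derive lower-rate channels by repeated anti-aliased decimation,
        # analogous to the multiple real-time outputs of the filter boards.
        rates, channels = [fs], [x]
        for _ in range(5):
            channels.append(decimate(channels[-1], 4))   # low-pass + /4
            rates.append(rates[-1] // 4)

        for f, ch in zip(rates, channels):
            print(f"{f:>8} Hz: {ch.size} samples")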

  6. Regression analysis of mixed recurrent-event and panel-count data.

    Science.gov (United States)

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20: , 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, the data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace the development of these events in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation and a social event (Pope Francis’ visit to the US in the New York City—Washington, DC regions. By investigating multiple features of Twitter data (message, author, time, and geographic location information, this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  8. How to integrate proxy data from two informants in life event assessment in psychological autopsy.

    Science.gov (United States)

    Zhang, Jie; Wang, Youqing; Fang, Le

    2018-04-27

    Life event assessment is an important part of psychological autopsy, and how to integrate proxy data from two informants is a major unresolved methodological issue. In total, 416 living subjects and their two informants were interviewed by psychological autopsy, and life events were assessed with Paykel's Interview for Recent Life Events. Validities of integrated proxy data using six psychological autopsy information reconstruction methods were evaluated, with living subjects' self-reports used as gold-standard criteria. For all the life events, the average value of Youden Indexes for proxy data by the type C information reconstruction method (choosing the positive value from two informants) was larger than those of the other five methods. For family life related events, proxy data by the type 1st information reconstruction method were not significantly different from living subjects' self-reports (P = 0.828). For all other life events, proxy data by the type C information reconstruction method were not significantly different from the gold standard. Choosing the positive value is a relatively better method for integrating dichotomous (positive vs. negative) proxy data from two informants in life event assessment in psychological autopsy, except for family life related events. In that case, using information provided by 1st informants (mainly family members) is recommended.
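
    The type C ("choosing positive value") integration is simply a logical OR over the two informants' reports, scored here with Youden's index against the subjects' self-reports. A minimal sketch with synthetic reports; the hit and false-alarm rates are invented.

        import numpy as np

        def youden_index(reported, truth):
            """Youden's J = sensitivity + specificity - 1."""
            reported, truth = np.asarray(reported, bool), np.asarray(truth, bool)
            sens = np.mean(reported[truth])
            spec = np.mean(~reported[~truth])
            return sens + spec - 1

        def integrate_type_c(informant1, informant2):
            """An event counts as present if either informant reports it."""
            return np.asarray(informant1, bool) | np.asarray(informant2, bool)

        rng = np.random.default_rng(3)
        truth = rng.random(416) < 0.3                  # subjects' self-reports
        inf1 = (truth & (rng.random(416) < 0.7)) | (~truth & (rng.random(416) < 0.05))
        inf2 = (truth & (rng.random(416) < 0.6)) | (~truth & (rng.random(416) < 0.05))

        print(f"informant 1 alone:  J = {youden_index(inf1, truth):.2f}")
        print(f"choosing positive:  J = {youden_index(integrate_type_c(inf1, inf2), truth):.2f}")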

  9. Is detection of adverse events affected by record review methodology? an evaluation of the "Harvard Medical Practice Study" method and the "Global Trigger Tool".

    Science.gov (United States)

    Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin

    2013-04-15

    There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process in a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by registered nurses. Records containing a potential adverse event were forwarded to physicians for review in stage 2. Physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage all adverse events that were found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. More adverse events were identified using the "Harvard Medical Practice Study

  10. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB.

    Science.gov (United States)

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED

  11. Constructing the Web of Events from Raw Data in the Web of Things

    Directory of Open Access Journals (Sweden)

    Yunchuan Sun

    2014-01-01

    Full Text Available An exciting paradise of data is emerging into our daily life along with the development of the Web of Things. Nowadays, volumes of heterogeneous raw data are continuously generated and captured by trillions of smart devices like sensors, smart controls, readers and other monitoring devices, while various events occur in the physical world. It is hard for users, including people and smart things, to master the valuable information hidden in the massive data, which is more useful and understandable than raw data for users to get the crucial points for problem-solving. Thus, how to automatically and actively extract the knowledge of events and their internal links from the big data is one key challenge for the future Web of Things. This paper proposes an effective approach to extract events and their internal links from large scale data leveraging predefined event schemas in the Web of Things, which starts with grasping the critical data for useful events by filtering data with well-defined event types in the schema. A case study in the context of a smart campus is presented to show the application of the proposed approach for the extraction of events and their internal semantic links.

  12. Data Management for a Climate Data Record in an Evolving Technical Landscape

    Science.gov (United States)

    Moore, K. D.; Walter, J.; Gleason, J. L.

    2017-12-01

    For nearly twenty years, NASA Langley Research Center's Clouds and the Earth's Radiant Energy System (CERES) Science Team has been producing a suite of data products that forms a persistent climate data record of the Earth's radiant energy budget. Many of the team's physical scientists and key research contributors have been with the team since the launch of the first CERES instrument in 1997. This institutional knowledge is irreplaceable and its longevity and continuity are among the reasons that the team has been so productive. Such legacy involvement, however, can also be a limiting factor. Some CERES scientists-cum-coders might possess skills that were state-of-the-field when they were emerging scientists but may now be outdated with respect to developments in software development best practices and supporting technologies. Both programming languages and processing frameworks have evolved significantly in the past twenty years, and updating one of these factors warrants consideration of updating the other. With the imminent launch of a final CERES instrument and the good health of those in flight, the CERES data record stands to continue far into the future. The CERES Science Team is, therefore, undergoing a re-architecture of its codebase to maintain compatibility with newer data processing platforms and technologies and to leverage modern software development best practices. This necessitates training our staff and consequently presents several challenges, including: Development continues immediately on the next "edition" of research algorithms upon release of the previous edition. How can code be rewritten at the same time that the science algorithms are being updated and integrated? With limited time to devote to training, how can we update the staff's existing skillset without slowing progress or introducing new errors? The CERES Science Team is large and complex, much like the current state of its codebase. How can we identify, in a breadth-wise manner

  13. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns and to engines that match only events arriving after a query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine, with innovative support for semantic event pattern specification, and allows seamless detection of such patterns over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing from event streams into event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundary between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
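
    The temporal-boundary issue the abstract raises can be illustrated with a small Python sketch; the stores, field names and boundary rule below are hypothetical rather than SCEPter's actual design:

        import itertools

        # Hypothetical stores: an archive holding events up to t_boundary, and a
        # live stream of events arriving afterwards. Querying both naively can
        # report an event twice (or miss it), so results are split on timestamp.
        archive = [{"t": 10, "kind": "voltage_sag"}, {"t": 55, "kind": "overload"}]
        stream = iter([{"t": 55, "kind": "overload"}, {"t": 70, "kind": "voltage_sag"}])

        def unified_query(archive, stream, t_boundary, kind):
            past = (e for e in archive if e["t"] <= t_boundary and e["kind"] == kind)
            future = (e for e in stream if e["t"] > t_boundary and e["kind"] == kind)
            return itertools.chain(past, future)

        for e in unified_query(archive, stream, t_boundary=55, kind="overload"):
            print(e)  # the t=55 event is reported once, from the archive side only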

  14. Quality of record linkage in a highly automated cancer registry that relies on encrypted identity data

    Directory of Open Access Journals (Sweden)

    Schmidtmann, Irene

    2016-06-01

    Full Text Available Objectives: In the absence of unique ID numbers, cancer and other registries in Germany and elsewhere rely on identity data to link records pertaining to the same patient. These data are often encrypted to ensure privacy, and some record linkage errors unavoidably occur. These errors were quantified for the cancer registry of North Rhine-Westphalia, which uses encrypted identity data. Methods: A sample of records, including record linkage information, was drawn from the registry. In parallel, plain-text data for these records were retrieved to generate a gold standard. Record linkage error frequencies in the cancer registry were determined by comparing the results of the routine linkage with the gold standard, and error rates were projected to larger registries. Results: In the sample studied, the homonym error rate was 0.015% and the synonym error rate was 0.2%; the F-measure was 0.9921. Projection to larger databases indicated that, under realistic growth, the homonym error rate will be around 1% and the synonym error rate around 2%. Conclusion: The observed error rates are low, which shows that effective methods to standardize and improve the quality of the input data have been implemented. This is crucial to keeping error rates low as the registry's database grows. The planned inclusion of unique health insurance numbers is likely to further improve record linkage quality. Cancer registration based entirely on electronic notification of records can process large amounts of data with a high quality of record linkage.
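
    As a worked illustration of the reported metrics, the following Python snippet computes an F-measure and error rates from confusion counts in a linkage-versus-gold-standard comparison; the counts and the choice of rate denominators are illustrative assumptions, not the registry's actual figures:

        tp = 9900  # record pairs correctly linked
        fp = 15    # homonym errors: different patients wrongly linked together
        fn = 20    # synonym errors: records of one patient left unlinked

        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f_measure = 2 * precision * recall / (precision + recall)

        homonym_rate = fp / (tp + fn)  # relative to true-match pairs (one convention)
        synonym_rate = fn / (tp + fn)
        print(f"F = {f_measure:.4f}, homonym {homonym_rate:.3%}, synonym {synonym_rate:.3%}")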

  15. 78 FR 47210 - National Practitioner Data Bank and Privacy Act; Exempt Records System; Technical Correction

    Science.gov (United States)

    2013-08-05

    ... reference cited in the Privacy Act regulations. The National Practitioner Data Bank (NPDB) system of records... DEPARTMENT OF HEALTH AND HUMAN SERVICES 45 CFR Part 5b RIN 0906-AA97 National Practitioner Data Bank and Privacy Act; Exempt Records System; Technical Correction AGENCY: Health Resources and Services...

  16. 42 CFR 417.806 - Financial records, statistical data, and cost finding.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Financial records, statistical data, and cost... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Health Care Prepayment Plans § 417.806 Financial records, statistical data, and cost finding. (a) The principles specified in § 417.568 apply to HCPPs, except those in...

  17. An event-oriented database for continuous data flows in the TJ-II environment

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E. [Asociacion Euratom/CIEMAT para Fusion Madrid, 28040 Madrid (Spain)], E-mail: edi.sanchez@ciemat.es; Pena, A. de la; Portas, A.; Pereira, A.; Vega, J. [Asociacion Euratom/CIEMAT para Fusion Madrid, 28040 Madrid (Spain); Neto, A.; Fernandes, H. [Associacao Euratom/IST, Centro de Fusao Nuclear, Avenue Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    A new database for storing data related to the TJ-II experiment has been designed and implemented. It allows the storage of raw data not acquired during plasma shots, i.e. data collected continuously or between plasma discharges while testing subsystems (e.g. during neutral beam test pulses). This new database complements the existing ones by permitting the storage of raw data that are not classified by shot number. Rather, these data are indexed according to a more general entity, the event. An event is defined as any occurrence relevant to the TJ-II environment. Such occurrences are registered, thus allowing relationships to be established between data acquisition, TJ-II control-system and diagnostic control-system actions. In the new database, raw data are stored in files on the TJ-II UNIX central server disks while metadata are stored in Oracle tables, thereby permitting fast data searches according to different criteria. In addition, libraries for registering data/events in the database from different subsystems within the laboratory local area network have been developed. Finally, a Shared Data Access System has been implemented for external access to the data. It permits both the new event-indexed data and the old data (indexed by shot number) to be read from a common event perspective.

  18. Paleoclimate Records from New Zealand Maar Lakes, Insights into ENSO Teleconnections and Climatic Events in the South (West) Pacific.

    Science.gov (United States)

    Shulmeister, J.; Nobes, D. C.; Striewski, B.

    2008-05-01

    The maar craters of the New Zealand Auckland Volcanic Field (36.5°S, 174.5°E) contain some of the highest-resolution late-Quaternary paleoclimate records in the Southern Hemisphere. Here we integrate laminae-count results from recent drilling in the Hopua Crater with existing records from the nearby Onepoto Crater (Pepper et al., 2004). In total these records cover many thousands of years between the onset of the last glacial maximum and the early mid-Holocene. The cores are strongly laminated. Individual laminae in both craters are very fine (sub-mm to mm scale) and form couplets comprising a darker mineralogenic-rich layer and a lighter diatomaceous layer. In places these couplets are annual, and may reflect seasonal algal blooms, but in other sections of the record, notably through the late-Glacial and Holocene, the couplets are deposited at inter-annual time scales. Spectral analyses of couplet-thickness counts using a fast Fourier transform (FFT) with 64- to 256-year running windows and a 50 per cent overlap indicate strong spectral power during the LGM and markedly weaker power during both the deglaciation and the early Holocene; in fact there is no spectral strength for most of these periods. Three brief (centennial-duration) events punctuate this extended period of low spectral power. These occur at c. 16 ka, c. 14.8 ka and during the early Holocene. They display spectral power in the 5-7 yr ENSO window and also at longer time intervals that may be consistent with the Pacific Decadal Oscillation. We infer the local switching on (or up) of ENSO and PDO teleconnections and suspect these are embedded in circum-polar circulation changes. In addition to these spectral power episodes, there is a general increase in the number of couplet cycles per century between the deglaciation and the early mid-Holocene. This matches observations from Ecuador and Peru and suggests that trans-Pacific ENSO responses are in phase between western tropical South America and New Zealand.

  19. NOAA Climate Data Record (CDR) of Atmospheric Layer Temperatures, Version 3.3

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Atmospheric Layer Temperature Climate Data Record (CDR) dataset is a monthly analysis of the tropospheric and stratospheric data using temperature sounding...

  20. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Sensor Data Record (SDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sensor Data Records (SDRs), or Level 1b data, from the Visible Infrared Imaging Radiometer Suite (VIIRS) are the calibrated and geolocated radiance and reflectance...

  1. JUNO JUPITER MWR 2 EXPERIMENT DATA RECORDS V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Juno MWR EDR data sets will ultimately include all uncalibrated MWR science data records for the entire Juno mission. The set in this volume will contain only...

  2. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Mask Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains a high quality Environmental Data Record (EDR) of cloud masks from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument onboard...

  3. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  4. NOAA Climate Data Record (CDR) of Passive Microwave Sea Ice Concentration, Version 1.0

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset version has been superseded by version 2. This data set provides a Climate Data Record (CDR) of passive microwave sea ice concentration based on the...

  5. MGN V RDRS 5 GLOBAL DATA RECORD TOPOGRAPHIC V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the Magellan Global Topographic Data Record (GTDR). The range to surface is derived by fitting altimeter echoes from the fan-beam altimetry...

  6. MGN V RDRS 5 GLOBAL DATA RECORD SLOPE V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the Magellan Global Slope Data Record (GSDR). The surface meter-scale slopes are derived by fitting altimeter echoes from the fan-beam...

  7. Barriers to retrieving patient information from electronic health record data: failure analysis from the TREC Medical Records Track.

    Science.gov (United States)

    Edinger, Tracy; Cohen, Aaron M; Bedrick, Steven; Ambert, Kyle; Hersh, William

    2012-01-01

    Secondary use of electronic health record (EHR) data relies on the ability to retrieve accurate and complete information about desired patient populations. The Text Retrieval Conference (TREC) 2011 Medical Records Track was a challenge evaluation allowing comparison of systems and algorithms for retrieving patients eligible for clinical studies from a corpus of de-identified medical records grouped by patient visit. Participants retrieved cohorts of patients relevant to 35 different clinical topics, and visits were judged for relevance to each topic. This study identified the most common barriers to identifying specific clinical populations in the test collection. Using the runs from track participants and the judged visits, we analyzed the five non-relevant visits most often retrieved and the five relevant visits most often overlooked. Categories were developed iteratively to group the reasons for incorrect retrieval for each of the 35 topics. Reasons fell into nine categories for non-relevant visits and five categories for relevant visits. Non-relevant visits were most often retrieved because they contained a non-relevant reference to the topic terms. Relevant visits were most often overlooked because they used a synonym for a topic term. This failure analysis provides insight into areas for future improvement in EHR-based retrieval, such as more widespread and complete use of standardized terminology in retrieval and data-entry systems.

  8. Relating tilt measurements recorded at Mponeng Gold Mine, South Africa to the rupture of an M 2.2 event

    CSIR Research Space (South Africa)

    Share, P

    2013-09-01

    Full Text Available Analysis of the event by Naoi et al. (2011) produced a seismic moment of 2.9 × 10¹² N·m. In contrast, calculations using the same data by the Institute of Mining Seismology (IMS; Hofmann 2012, pers. comm.) gave a seismic moment of 9.875 × 10¹¹ N·m, a corner frequency...

  9. Highest-mass di-photon event recorded by CMS as of Dec '15 (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with the largest-mass photon pair observed so far by the CMS detector in collision data collected in 2015. The mass of the di-photon system is 1.5 TeV. One photon candidate, with a transverse momentum of 530 GeV, is reconstructed in the endcap region, while the second, with a transverse momentum of 400 GeV, is reconstructed in the barrel region. Both photon candidates are consistent with the expectation that they are prompt isolated photons.

  10. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations.

    Science.gov (United States)

    Arndt, Brian G; Beasley, John W; Watkinson, Michelle D; Temte, Jonathan L; Tuan, Wen-Jan; Sinsky, Christine A; Gilchrist, Valerie J

    2017-09-01

    Primary care physicians spend nearly 2 hours on electronic health record (EHR) tasks per hour of direct patient care. Demand for non-face-to-face care, such as communication through a patient portal and administrative tasks, is increasing and contributing to burnout. The goal of this study was to assess time allocated by primary care physicians within the EHR as indicated by EHR user-event log data, both during clinic hours (defined as 8:00 am to 6:00 pm Monday through Friday) and outside clinic hours. We conducted a retrospective cohort study of 142 family medicine physicians in a single system in southern Wisconsin. All Epic (Epic Systems Corporation) EHR interactions were captured from "event logging" records over a 3-year period for both direct patient care and non-face-to-face activities, and were validated by direct observation. EHR events were assigned to 1 of 15 EHR task categories and allocated to either during or after clinic hours. Clinicians spent 355 minutes (5.9 hours) of an 11.4-hour workday in the EHR per weekday per 1.0 clinical full-time equivalent: 269 minutes (4.5 hours) during clinic hours and 86 minutes (1.4 hours) after clinic hours. Clerical and administrative tasks including documentation, order entry, billing and coding, and system security accounted for nearly one-half of the total EHR time (157 minutes, 44.2%). Inbox management accounted for another 85 minutes (23.7%). Primary care physicians spend more than one-half of their workday, nearly 6 hours, interacting with the EHR during and after clinic hours. EHR event logs can identify areas of EHR-related work that could be delegated, thus reducing workload, improving professional satisfaction, and decreasing burnout. Direct time-motion observations validated EHR-event log data as a reliable source of information regarding clinician time allocation. © 2017 Annals of Family Medicine, Inc.
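
    A minimal Python sketch of the bookkeeping such an analysis requires: assigning each raw event-log entry to a task category and to a during- or after-clinic-hours bucket. The event names and category mapping are hypothetical; only the 8:00 am to 6:00 pm weekday definition comes from the study:

        from datetime import datetime

        CATEGORY = {"NOTE_EDIT": "documentation", "ORDER_SIGN": "order entry",
                    "INBOX_READ": "inbox management"}

        log = [("NOTE_EDIT", "2017-03-06 17:55", 12.0),   # (event, start, minutes)
               ("INBOX_READ", "2017-03-06 20:10", 6.5),
               ("ORDER_SIGN", "2017-03-07 09:02", 3.0)]

        totals = {}
        for event, start, minutes in log:
            t = datetime.strptime(start, "%Y-%m-%d %H:%M")
            # Clinic hours: 8:00 am to 6:00 pm, Monday through Friday.
            in_clinic = t.weekday() < 5 and 8 <= t.hour < 18
            key = (CATEGORY[event], "clinic" if in_clinic else "after-hours")
            totals[key] = totals.get(key, 0.0) + minutes

        for (cat, when), mins in sorted(totals.items()):
            print(f"{cat:18s} {when:12s} {mins:5.1f} min")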

  11. Organizational needs for managing and preserving geospatial data and related electronic records

    Directory of Open Access Journals (Sweden)

    R R Downs

    2006-01-01

    Full Text Available Government agencies and other organizations are required to manage and preserve the records that they create and use so as to facilitate future access and reuse. The increasing use of geospatial data and related electronic records presents new challenges for these organizations, which have relied on traditional practices for managing and preserving records in printed form. This article reports on an investigation of current and future needs for managing and preserving geospatial electronic records on the part of local- and state-level organizations in the New York City metropolitan region. It introduces the study and describes the organizational needs observed, including needs for organizational coordination and interorganizational cooperation throughout the entire data lifecycle.

  12. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how can routine on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations be efficiently utilized to detect water quality contamination events? This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station and its dataset. Single-station models are detached from network hydraulic insights and, as a result, can be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating an understanding of local and spatial hydraulic data into the event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.
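
    One plausible reading of the fusion idea, sketched in Python: each station's local anomaly score is combined with the scores of hydraulically connected stations, so an alarm needs support from stations that the simulated flow actually links. The scores and connectivity weights are hypothetical; in practice the weights would come from a hydraulic simulation:

        local_scores = {"S1": 0.9, "S2": 0.7, "S3": 0.1}  # per-station anomaly scores

        # connectivity[a][b]: how strongly the simulated flow links stations a and b
        connectivity = {"S1": {"S2": 0.8, "S3": 0.0},
                        "S2": {"S1": 0.8, "S3": 0.1},
                        "S3": {"S1": 0.0, "S2": 0.1}}

        def spatial_score(station):
            neighbours = connectivity[station]
            support = sum(w * local_scores[s] for s, w in neighbours.items())
            weight_sum = sum(neighbours.values())
            # Blend the local evidence with hydraulically weighted neighbour evidence.
            return 0.5 * local_scores[station] + 0.5 * support / max(weight_sum, 1e-9)

        for s in local_scores:
            print(s, round(spatial_score(s), 3))
        # S1's alarm is reinforced by the hydraulically linked S2; S3 scores lowest.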

  13. Analyzing the reliability of volcanic and archeomagnetic data by comparison with historical records

    Science.gov (United States)

    Arneitz, Patrick; Egli, Ramon; Leonhardt, Roman

    2017-04-01

    Records of the past geomagnetic field are obtained from historical observations (direct records) on the one hand, and by the magnetization acquired by archeological artifacts, rocks and sediments (indirect records) on the other hand. Indirect records are generally less reliable than direct ones due to recording mechanisms that cannot be fully reproduced in the laboratory, age uncertainties and alteration problems. Therefore, geomagnetic field modeling approaches must deal with random and systematic errors of field values and age estimates that are hard to assess. Here, we present a new approach to investigate the reliability of volcanic and archeomagnetic data, which is based on comparisons with historical records. Temporal and spatial mismatches between data are handled by the implementation of weighting functions and error estimates derived from a stochastic model of secular variation. Furthermore, a new strategy is introduced for the statistical analysis of inhomogeneous and internally correlated data sets. Application of these new analysis tools to an extended database including direct and indirect records shows an overall good agreement between different record categories. Nevertheless, some biases exist between selected material categories, laboratory procedures, and quality checks/corrections (e.g., inclination shallowing of volcanic records). These findings can be used to obtain a better understanding of error sources affecting indirect records, thereby facilitating more reliable reconstructions of the geomagnetic past.

  14. ADEPt, a semantically-enriched pipeline for extracting adverse drug events from free-text electronic health records.

    Directory of Open Access Journals (Sweden)

    Ehtesham Iqbal

    Full Text Available Adverse drug events (ADEs) are unintended responses to medical treatment. They can greatly affect a patient's quality of life and present a substantial burden on healthcare. Although electronic health records (EHRs) document a wealth of information relating to ADEs, it is frequently stored in unstructured or semi-structured free-text narrative, requiring Natural Language Processing (NLP) techniques to mine the relevant information. Here we present a rule-based ADE detection and classification pipeline built and tested on a large psychiatric corpus comprising 264k patients, using the de-identified EHRs of four UK-based psychiatric hospitals. The pipeline uses characteristics specific to psychiatric EHRs to guide the annotation process, and distinguishes: (a) the temporal value associated with the ADE mention (whether it is historical or present), (b) the categorical value of the ADE (whether it is assertive, hypothetical, retrospective or a general discussion) and (c) the implicit contextual value, where the status of the ADE is deduced from surrounding indicators rather than explicitly stated. We manually created the rulebase in collaboration with clinicians and pharmacists by studying ADE mentions in various types of clinical notes. We evaluated the open-source Adverse Drug Event annotation Pipeline (ADEPt) using 19 ADEs specific to antipsychotic and antidepressant medication. The ADEs chosen vary in severity, regularity and persistence. The average F-measure and accuracy achieved by our tool across all tested ADEs were both 0.83. In addition to its annotation power, the ADEPt pipeline presents an improvement to ConText, the state-of-the-art context-discerning algorithm.
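
    A toy Python sketch in the spirit of such rule-based context classification; the trigger-term lists and the two-attribute output are illustrative simplifications, not ADEPt's actual rulebase:

        import re

        # Toy trigger terms deciding (a) temporal and (b) categorical values.
        HISTORICAL = re.compile(r"\b(previous|history of|in the past)\b", re.I)
        HYPOTHETICAL = re.compile(r"\b(if|should|risk of|warned about)\b", re.I)

        def classify_mention(sentence):
            temporal = "historical" if HISTORICAL.search(sentence) else "present"
            status = "hypothetical" if HYPOTHETICAL.search(sentence) else "assertive"
            return temporal, status

        notes = [
            "History of akathisia on aripiprazole.",
            "Warned about risk of weight gain with olanzapine.",
            "Patient reports tremor since starting lithium.",
        ]
        for n in notes:
            print(classify_mention(n), "-", n)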

  15. A comparison of recording modalities of P300 event-related potentials (ERP) for brain-computer interface (BCI) paradigm.

    Science.gov (United States)

    Mayaud, L; Congedo, M; Van Laghenhove, A; Orlikowski, D; Figère, M; Azabou, E; Cheliout-Heraut, F

    2013-10-01

    A brain-computer interface aims at restoring communication and control in severely disabled people by identifying and classifying EEG features such as event-related potentials (ERPs). The aim of this study is to compare different modalities of EEG recording for the extraction of ERPs. The first comparison evaluates the performance of six disc electrodes against that of the EMOTIV headset, while the second evaluates three different electrode types (disc, needle, and large squared electrode). Ten healthy volunteers gave informed consent and were randomized to try the traditional EEG system (six disc electrodes with gel and skin preparation) or the EMOTIV headset first. Together with the six disc electrodes, a needle and a square electrode of larger surface were simultaneously recording near lead Cz. Each modality was evaluated over three sessions of auditory P300 separated by one hour. No statistically significant effect was found for the electrode type, nor for the interaction between electrode type and session number. There was no statistically significant difference in performance between the EMOTIV and the six traditional EEG disc electrodes, although there was a trend towards worse performance of the EMOTIV headset. However, the modality-session interaction was highly significant (P<0.001), showing that, while the performance of the six disc electrodes stayed constant over sessions, the performance of the EMOTIV headset dropped dramatically between 2 and 3 h of use. Finally, the evaluation of comfort by participants revealed increasing discomfort with the EMOTIV headset starting with the second hour of use. Our study does not recommend one modality over another based on performance, but suggests the choice should be made on more practical considerations such as the expected length of use, the availability of skilled labor for system setup and, above all, patient comfort. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  16. Variability in recording and scoring of respiratory events during sleep in Europe: a need for uniform standards.

    Science.gov (United States)

    Arnardottir, Erna S; Verbraecken, Johan; Gonçalves, Marta; Gjerstad, Michaela D; Grote, Ludger; Puertas, Francisco Javier; Mihaicuta, Stefan; McNicholas, Walter T; Parrino, Liborio

    2016-04-01

    Uniform standards for the recording and scoring of respiratory events during sleep are lacking in Europe, although many centres follow the published recommendations of the American Academy of Sleep Medicine. The aim of this study was to assess the practice for the diagnosis of sleep-disordered breathing throughout Europe. A specially developed questionnaire was sent to representatives of the 31 national sleep societies in the Assembly of National Sleep Societies of the European Sleep Research Society, and a total of 29 countries completed the questionnaire. Polysomnography was considered the primary diagnostic method for sleep apnea diagnosis in 10 (34.5%), whereas polygraphy was used primarily in six (20.7%) European countries. In the remaining 13 countries (44.8%), no preferred methodology was used. Fifteen countries (51.7%) had developed some type of national uniform standards, but these standards varied significantly in terms of scoring criteria, device specifications and quality assurance procedures between countries. Only five countries (17.2%) had published these standards. Most respondents supported the development of uniform recording and scoring criteria for Europe, which might be based partly on the existing American Academy of Sleep Medicine rules, but also take into account differences in European practice when compared to North America. This survey highlights the current varying approaches to the assessment of patients with sleep-disordered breathing throughout Europe and supports the need for the development of practice parameters in the assessment of such patients that would be suited to European clinical practice. © 2015 European Sleep Research Society.

  17. Detecting Smoking Events Using Accelerometer Data Collected Via Smartwatch Technology: Validation Study.

    Science.gov (United States)

    Cole, Casey A; Anshari, Dien; Lambert, Victoria; Thrasher, James F; Valafar, Homayoun

    2017-12-13

    Smoking is the leading cause of preventable death in the world today. Ecological research on smoking in context currently relies on self-reported smoking behavior. Emerging smartwatch technology may more objectively measure smoking behavior by automatically detecting smoking sessions using robust machine learning models. This study aimed to examine the feasibility of detecting smoking behavior using smartwatches, and to compare the success of observing smoking behavior with smartwatches to that of conventional self-reporting. A convenience sample of smokers was recruited for this study. Participants (N=10) recorded 12 hours of accelerometer data using a mobile phone and smartwatch. During these 12 hours, they engaged in various daily activities, including smoking, and logged the beginning and end of each smoking session. Raw data were classified as either smoking or nonsmoking using a machine learning model for pattern recognition. The accuracy of the model was evaluated by comparing the output with a detailed description of a modeled smoking session. In total, 120 hours of data were collected from participants and analyzed. The accuracy of self-reported smoking was approximately 78% (96/123). Our model successfully detected 100 of 123 (81%) smoking sessions recorded by participants. After eliminating sessions from participants who did not adhere to study protocols, the true positive rate of the smartwatch-based detection increased to more than 90%. During the 120 hours of combined observation time, only 22 false positive smoking sessions were detected, a 2.8% false positive rate. Smartwatch technology can provide an accurate, nonintrusive means of monitoring smoking behavior in natural contexts. The use of machine learning algorithms for passively detecting smoking sessions may enrich ecological momentary assessment protocols and cessation intervention studies that often rely on self-report.
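
    A minimal Python sketch of the windowing-and-classification pipeline such a system implies; the window sizes, features and threshold rule are assumptions, and in practice a trained machine learning model would replace the hand-written rule:

        import math

        def windows(samples, size=50, step=25):
            for start in range(0, len(samples) - size + 1, step):
                yield samples[start:start + size]

        def features(window):
            mags = [math.sqrt(x*x + y*y + z*z) for x, y, z in window]
            mean = sum(mags) / len(mags)
            var = sum((m - mean) ** 2 for m in mags) / len(mags)
            return mean, var

        def looks_like_puff(window, mean_lo=0.9, mean_hi=1.3, var_hi=0.05):
            # Hand raised and held steady: magnitude near 1 g with low variance.
            mean, var = features(window)
            return mean_lo < mean < mean_hi and var < var_hi

        signal = [(0.1, 0.2, 0.97)] * 100  # synthetic "hand near mouth" samples
        flags = [looks_like_puff(w) for w in windows(signal)]
        print(sum(flags), "of", len(flags), "windows flagged")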

  18. CMS Higgs Search in 2011 and 2012 data: candidate photon-photon event (8 TeV)

    CERN Multimedia

    McCauley, Thomas

    2013-01-01

    Event recorded with the CMS detector in 2012 at a proton-proton centre of mass energy of 8 TeV. The event shows characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers). The event could also be due to known standard model background processes.

  19. Regression analysis of mixed panel count data with dependent terminal events.

    Science.gov (United States)

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types above, may occur, and furthermore, as with the first two types, there may exist a dependent terminal event that precludes further occurrences of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating-equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Punctuated Sediment Input into Small Subpolar Ocean Basins During Heinrich Events and Preservation in the Stratigraphic Record

    Science.gov (United States)

    Hesse, R.

    2006-12-01

    generated from fresh-water discharges into the sea that can produce reversed buoyancy, as is well known from experiments. When the flows have traveled long enough, their tops will have lost enough sediment by settling such that their density decreases below that of the ambient seawater causing the current tops to lift up. The turbid fresh-water clouds buoyantly rise out of the turbidity current to a level of equal density, presumably the pycnocline, where they spread out laterally, even up-current, and generate interflows that deposit graded layers. The process is slow enough to allow incorporation into the graded layers of debris melting out of drifting icebergs. The observed lofted depositional facies is exclusively found in Heinrich layers. The most likely candidates for the parent currents from which lofting occurred were the sandy flows that formed the sand abyssal plain. Through this stratigraphic relationship the lofted facies ties the main pulses of Late Pleistocene sediment supply in the Labrador Basin to Heinrich events. Dating of pelagic interlayers during future ocean drilling may provide the proof that packages of sand turbidites underlying the abyssal plain are correlated to individual Heinrich events. The correlation may thus be documented in the stratigraphic record. Similar situations may exist in the Bering Sea or along the Maury Channel System in North Atlantic.

  1. Single event monitoring system based on Java 3D and XML data binding

    International Nuclear Information System (INIS)

    Wang Liang; Chinese Academy of Sciences, Beijing; Zhu Kejun; Zhao Jingwei

    2007-01-01

    Online single event monitoring is important to the BESIII DAQ system. Java 3D is an extension of the Java language for 3D graphics, and XML data binding handles XML documents more efficiently than SAX or DOM. This paper introduces the implementation of the BESIII single event monitoring system with Java 3D and XML data binding, and its interface to the track-fitting software via JNI. (authors)

  2. A computer interface for processing multi-parameter data of multiple event types

    International Nuclear Information System (INIS)

    Katayama, I.; Ogata, H.

    1980-01-01

    A logic circuit called a 'Raw Data Processor' (RDP), which functions as an interface between ADCs and the PDP-11 computer, has been developed at RCNP, Osaka University for general use. It enables simultaneous data processing for up to 16 event types, and an arbitrary combination of up to 14 ADCs can be assigned to each event type by means of a pinboard matrix. The details of the RDP and its application are described. (orig.)

  3. Integrating phenotypic data from electronic patient records with molecular level systems biology

    DEFF Research Database (Denmark)

    Brunak, Søren

    2011-01-01

    Electronic patient records remain a rather unexplored, but potentially rich data source for discovering correlations between diseases. We describe a general approach for gathering phenotypic descriptions of patients from medical records in a systematic and non-cohort dependent manner. By extracti...... Classification of Disease ontology and is therefore in principle language independent. As a use case we show how records from a Danish psychiatric hospital lead to the identification of disease correlations, which subsequently are mapped to systems biology frameworks....

  4. Joint Models for Longitudinal and Time-to-Event Data With Applications in R

    CERN Document Server

    Rizopoulos, Dimitris

    2012-01-01

    In longitudinal studies it is often of interest to investigate how a marker that is repeatedly measured in time is associated with a time to an event of interest, e.g., prostate cancer studies where longitudinal PSA level measurements are collected in conjunction with the time-to-recurrence. Joint Models for Longitudinal and Time-to-Event Data: With Applications in R provides a full treatment of random effects joint models for longitudinal and time-to-event outcomes that can be utilized to analyze such data. The content is primarily explanatory, focusing on applications of joint modeling, but

  5. Machine learning algorithms for meteorological event classification in the coastal area using in-situ data

    Science.gov (United States)

    Sokolov, Anton; Gengembre, Cyril; Dmitriev, Egor; Delbarre, Hervé

    2017-04-01

    We consider the problem of classifying local atmospheric meteorological events in coastal areas, such as sea breezes, fogs and storms. In-situ meteorological data such as wind speed and direction, temperature, humidity and turbulence are used as predictors. Local atmospheric events from 2013-2014 in the coastal area of the English Channel at Dunkirk (France) were analysed manually to train the classification algorithms, with ultrasonic anemometer data and LIDAR wind-profiler data used as predictors. Several algorithms were applied to determine meteorological events from local data: a decision tree, a nearest-neighbour classifier and a support vector machine. The classification algorithms were compared, and the most important predictors for each event type were determined. In more than 80 percent of the cases, the machine learning algorithms detected the meteorological class correctly. We expect that this methodology could also be applied to classify events in climatological in-situ data or in model output, allowing the frequency of each event type to be estimated in the context of climate change.
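
    A small Python sketch of the comparison described, using the three classifier families named in the abstract on synthetic stand-in data (the real study used labelled 2013-2014 observations from Dunkirk):

        from sklearn.tree import DecisionTreeClassifier
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score
        import numpy as np

        rng = np.random.default_rng(0)
        # Columns: wind speed (m/s), wind direction (deg), temperature (C), humidity (%)
        X_breeze = rng.normal([4, 270, 18, 70], [1, 20, 2, 5], size=(60, 4))
        X_storm = rng.normal([15, 200, 12, 85], [3, 40, 3, 5], size=(60, 4))
        X = np.vstack([X_breeze, X_storm])
        y = np.array(["sea_breeze"] * 60 + ["storm"] * 60)

        # Compare the three classifier families by cross-validated accuracy.
        for clf in (DecisionTreeClassifier(max_depth=4),
                    KNeighborsClassifier(n_neighbors=5),
                    SVC(kernel="rbf", gamma="scale")):
            score = cross_val_score(clf, X, y, cv=5).mean()
            print(type(clf).__name__, round(score, 2))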

  6. The U.S. Army Person-Event Data Environment: A Military-Civilian Big Data Enterprise.

    Science.gov (United States)

    Vie, Loryana L; Scheier, Lawrence M; Lester, Paul B; Ho, Tiffany E; Labarthe, Darwin R; Seligman, Martin E P

    2015-06-01

    This report describes a groundbreaking military-civilian collaboration that benefits from an Army and Department of Defense (DoD) big data business intelligence platform called the Person-Event Data Environment (PDE). The PDE is a consolidated data repository that contains unclassified but sensitive manpower, training, financial, health, and medical records covering U.S. Army personnel (Active Duty, Reserve, and National Guard), civilian contractors, and military dependents. These unique data assets provide a veridical timeline capturing each soldier's military experience from entry to separation from the armed forces. The PDE was designed to afford unprecedented cost-efficiencies by bringing researchers and military scientists to a single computerized repository rather than porting vast data resources to individual laboratories. With funding from the Robert Wood Johnson Foundation, researchers from the University of Pennsylvania Positive Psychology Center joined forces with the U.S. Army Research Facilitation Laboratory, forming the scientific backbone of the military-civilian collaboration. This unparalleled opportunity was necessitated by a growing need to learn more about relations between psychological and health assets and health outcomes, including healthcare utilization and costs-issues of major importance for both military and civilian population health. The PDE represents more than 100 times the population size and many times the number of linked variables covered by the nation's leading sources of population health data (e.g., the National Health and Nutrition Examination Survey). Following extensive Army vetting procedures, civilian researchers can mine the PDE's trove of information using a suite of statistical packages made available in a Citrix Virtual Desktop. A SharePoint collaboration and governance management environment ensures user compliance with federal and DoD regulations concerning human subjects' protections and also provides a secure

  7. An informatics structure for the component event data bank of the ERDS feasibility project

    International Nuclear Information System (INIS)

    Capobianchi, S.; Borella, A.

    1980-01-01

    The development of the ERDS involves complex problems in organisation and data processing, and it has therefore been decided to proceed by means of pilot experiments in order to test the proposed solutions in practice. The first experiment concerns the development of a computerised model for collecting, handling and retrieving raw event data through an experimental Component Event Data Bank (CEDB). The CEDB contains organised information related to events such as failures, repairs and maintenance actions concerning major LWR components whose technical specifications, operational requirements and environmental conditions are specified in detail. This pilot experiment is indeed the most challenging in the framework of the ERDS feasibility project. It is foreseen that the raw data will be supplied by national European data banks using forms and codes proper to each of them. The conversion and standardisation of these data into homogeneous 'European' codes and classifications is for the most part performed automatically. (author)

  8. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
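
    A minimal Python simulation of the edge-event idea in the simplest setting, a single bidirectional link between two integrators; the drift-based event rule and its threshold are illustrative choices, not the paper's exact detecting rules:

        def simulate(x1=0.0, x2=5.0, dt=0.01, steps=2000, threshold=0.2):
            x1_s, x2_s = x1, x2  # states sampled at the last edge event
            events = 0
            for _ in range(steps):
                # Controllers use the sampled (not current) neighbour data.
                u1, u2 = x2_s - x1_s, x1_s - x2_s
                x1, x2 = x1 + dt * u1, x2 + dt * u2
                # Edge event: the disagreement has drifted from its sampled value,
                # so both endpoints of the link resample each other's state.
                if abs((x1 - x2) - (x1_s - x2_s)) > threshold:
                    x1_s, x2_s, events = x1, x2, events + 1
            return x1, x2, events

        x1, x2, events = simulate()
        print(f"final states {x1:.3f}, {x2:.3f} after {events} edge events")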

  9. Using machine-coded event data for the micro-level study of political violence

    Directory of Open Access Journals (Sweden)

    Jesse Hammond

    2014-07-01

    Full Text Available Machine-coded datasets likely represent the future of event data analysis. We assess the use of one of these datasets—Global Database of Events, Language and Tone (GDELT—for the micro-level study of political violence by comparing it to two hand-coded conflict event datasets. Our findings indicate that GDELT should be used with caution for geo-spatial analyses at the subnational level: its overall correlation with hand-coded data is mediocre, and at the local level major issues of geographic bias exist in how events are reported. Overall, our findings suggest that due to these issues, researchers studying local conflict processes may want to wait for a more reliable geocoding method before relying too heavily on this set of machine-coded data.

  10. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
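
    A sketch of the two-stage idea in Python, with DBSCAN standing in for the paper's (unspecified) clustering algorithm and synthetic data in place of satellite observations:

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(1)
        background = rng.normal(0, 1, size=(500, 3))       # (x, y, t) normal data
        event = rng.normal([6, 6, 10], 0.3, size=(25, 3))  # a compact anomaly
        data = np.vstack([background, event])

        # Stage 1: cluster everything; the largest cluster is taken as the
        # normal background, everything else (noise, minor clusters) is anomalous.
        labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(data)
        counts = {k: int((labels == k).sum()) for k in set(labels) if k != -1}
        normal = max(counts, key=counts.get)
        anomalous = data[labels != normal]

        # Stage 2: group anomalous points into spatio-temporal events and rank
        # them by size, a simple stand-in for the user-defined "interestingness".
        event_labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(anomalous)
        sizes = {k: int((event_labels == k).sum()) for k in set(event_labels) if k != -1}
        print(sorted(sizes.items(), key=lambda kv: -kv[1]))  # largest events first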

  11. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed the Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment comprised a patient monitor, the anaesthesia machine, and the bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, a regional oximeter, and a rapid infusion device were added as required. The automatic recording option was used exclusively, and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) of cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.

  12. OSCAR experiment high-density network data report: Event 3 - April 16-17, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Dana, M.T.; Easter, R.C.; Thorp, J.M.

    1984-12-01

    The OSCAR (Oxidation and Scavenging Characteristics of April Rains) experiment, conducted during April 1981, was a cooperative field investigation of wet removal in cyclonic storm systems. The high-density component of OSCAR was located in northeast Indiana and included sequential precipitation chemistry measurements on a 100 by 100 km network, as well as airborne air chemistry and cloud chemistry measurements, surface air chemistry measurements, and supporting meteorological measurements. Four separate storm events were studied during the experiment. This report summarizes data taken by Pacific Northwest Laboratory (PNL) during the third storm event, April 16-17. The report contains the high-density network precipitation chemistry data, air chemistry and cloud chemistry data from the PNL aircraft, and meteorological data for the event, including standard National Weather Service products and radar and rawindsonde data from the network. 4 references, 76 figures, 6 tables.

  13. OSCAR experiment high-density network data report: Event 1 - April 8-9, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Dana, M.T.; Easter, R.C.; Thorp, J.M.

    1984-12-01

    The OSCAR (Oxidation and Scavenging Characteristics of April Rains) experiment, conducted during April 1981, was a cooperative field investigation of wet removal in cyclonic storm systems. The high-density component of OSCAR was located in northeast Indiana and included sequential precipitation chemistry measurements on a 100 by 100 km network, as well as airborne air chemistry and cloud chemistry measurements, surface air chemistry measurements, and supporting meteorological measurements. Four separate storm events were studied during the experiment. This report summarizes data taken by Pacific Northwest Laboratory (PNL) during the first storm event, April 8-9. The report contains the high-density network precipitation chemistry data, air chemistry data from the PNL aircraft, and meteorological data for the event, including standard National Weather Service products and radar data from the network. 4 references, 72 figures, 5 tables.

  14. Design of a medical record review study on the incidence and preventability of adverse events requiring a higher level of care in Belgian hospitals

    Directory of Open Access Journals (Sweden)

    Vlayen Annemie

    2012-08-01

    Full Text Available Background: Adverse events are unintended patient injuries arising from healthcare management that result in disability, prolonged hospital stay or death. Adverse events that require intensive care admission imply a considerable financial burden to the healthcare system. The epidemiology of adverse events in Belgian hospitals has never been assessed systematically. Findings: A multistage retrospective review study of patients requiring a transfer to a higher level of care will be conducted in six hospitals in the province of Limburg. Patient records are reviewed starting from January 2012 by a clinical team consisting of a research nurse, a physician and a clinical pharmacist. Besides the incidence and the level of causation and preventability, the type of adverse events and their consequences (patient harm, mortality and length of stay) will be assessed. Moreover, the adequacy of the patient records and the quality and usefulness of the method of medical record review will be evaluated. Discussion: This paper describes the rationale for a retrospective review study of adverse events that necessitate a higher level of care. More specifically, we are particularly interested in increasing our understanding of the preventability and root causes of these events in order to implement improvement strategies. Attention is paid to the strengths and limitations of the study design.

  15. Manual editing of automatically recorded data in an anesthesia information management system.

    Science.gov (United States)

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  16. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Snow Cover Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of snow cover from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument...

  17. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Active Fires Environmental Data Record (EDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational environmental data record (EDR) that contains pinpoint locations of active fires (AF) as identified by an algorithm...

  18. NOAA Climate Data Record (CDR) of Ocean Near Surface Atmospheric Properties, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Ocean Surface Bundle (OSB) Climate Data Record (CDR) consist of three parts: sea surface temperature; near-surface wind speed, air temperature, and specific...

  19. NOAA Climate Data Record (CDR) of Ocean Heat Fluxes, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Ocean Surface Bundle (OSB) Climate Data Record (CDR) consist of three parts: sea surface temperature; near-surface wind speed, air temperature, and specific...

  20. NOAA Climate Data Record (CDR) of Sea Surface Temperature - WHOI, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Ocean Surface Bundle (OSB) Climate Data Record (CDR) consist of three parts: sea surface temperature, near-surface atmospheric properties, and heat fluxes....

  1. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Aerosol Detection Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of suspended matter from the Visible Infrared Imaging Radiometer Suite (VIIRS)...

  2. NOAA Climate Data Record (CDR) of Total Solar Irradiance (TSI), NRLTSI Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) contains total solar irradiance (TSI) as a function of time created with the Naval Research Laboratory model for spectral and total...

  3. NOAA Climate Data Record (CDR) of Solar Spectral Irradiance (SSI), NRLSSI Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) contains solar spectral irradiance (SSI) as a function of time and wavelength created with the Naval Research Laboratory model for...

  4. A new prosthetic alignment device to read and record prosthesis alignment data.

    Science.gov (United States)

    Pirouzi, Gholamhossein; Abu Osman, Noor Azuan; Ali, Sadeeq; Davoodi Makinejad, Majid

    2017-12-01

    Prosthetic alignment is an essential process in the rehabilitation of patients with amputations. This study presents, for the first time, a device to read and record prosthesis alignment data. The digital device consists of seven main parts: the trigger, internal shaft, shell, sensor adjustment button, digital display, sliding shell, and tip. The alignment data were read and recorded by the user or a computer, so that a prosthesis adjustment can be replicated later, or the sequence of changes in alignment and its effect on the posture of the patient examined. Alignment data were recorded at the anterior/posterior and medial/lateral positions for five patients. Results show a high level of confidence in recording alignment data and replicating adjustments. The device therefore helps patients readjust their prosthesis by themselves, and helps prosthetists perform adjustments for patients and analyze the effects of malalignment.

  5. NOAA Climate Data Record (CDR) of Passive Microwave Sea Ice Concentration, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Passive Microwave Sea Ice Concentration Climate Data Record (CDR) dataset is generated using daily gridded brightness temperatures from the Defense...

  6. NOAA Climate Data Record (CDR) of AVHRR Polar Pathfinder Extended (APP-X) Cryosphere

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA Climate Data Record (CDR) of the extended AVHRR Polar Pathfinder (APP-x) cryosphere contains 19 geophysical variables over the Arctic and Antarctic for the...

  7. Idaho: basic data for thermal springs and wells as recorded in GEOTHERM, Part A

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, J.D.

    1983-07-01

    All chemical data for geothermal fluids in Idaho available as of December 1981 are maintained in GEOTHERM, a computerized information system. This report presents summaries and sources of records for Idaho. 7 refs. (ACR)

  8. NOAA Climate Data Records (CDR) of AMSU-A/B and MHS Hydrological Properties, Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Hydrological Properties for Applications Thematic Climate Data Record (TCDR) consists of Advanced Microwave Sounding Unit-A (AMSU-A), Advanced Microwave...

  9. NOAA Climate Data Record (CDR) of Monthly Outgoing Longwave Radiation (OLR), Version 2.2-1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) of monthly mean High Resolution Infrared Radiation Sounder (HIRS) Outgoing Longwave Radiation (OLR) flux at the top of the atmosphere...

  10. NOAA Climate Data Record (CDR) of Daily Outgoing Longwave Radiation (OLR), Version 1.2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) contains the daily mean Outgoing Longwave Radiation (OLR) time series in global 1 degree x 1 degree equal-angle gridded maps spanning...

  11. An Open Architecture Scaleable Maintainable Software Defined Commodity Based Data Recorder And Correlator, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project addresses the need for the higher data-rate recording capability, increased correlation speed, and flexibility needed for next-generation VLBI systems. The...

  12. NOAA Climate Data Record (CDR) of Normalized Difference Vegetation Index (NDVI), Version 4

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains gridded daily Normalized Difference Vegetation Index (NDVI) derived from the NOAA Climate Data Record (CDR) of Advanced Very High Resolution...

  13. NUCAPS: NOAA Unique Combined Atmospheric Processing System Environmental Data Record (EDR) Products

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of numerous retrieved estimates of hydrological variables and trace gases as Environmental Data Record (EDR) products from the NOAA Unique...

  14. MRO CRISM MAP-PROJECTED TARGETED REDUCED DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This volume contains the CRISM Map-projected Targeted Reduced Data Record (MTRDR) archive, a collection of multiband image cubes derived from targeted (gimbaled)...

  15. NOAA Fundamental Climate Data Record (CDR) of AMSU-B and MHS Brightness Temperature, Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Climate Data Record (CDR) of Advanced Microwave Sounding Unit-B (AMSU-B) and Microwave Humidity Sounder (MHS) brightness temperature (Tb) in "window...

  16. NOAA Climate Data Record (CDR) of AVHRR Polar Pathfinder (APP) Cryosphere

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This NOAA Climate Data Record (CDR) contains the AVHRR Polar Pathfinder (APP) product. APP is a fundamental CDR comprised of calibrated and navigated AVHRR channel...

  17. Unified Sea Ice Thickness Climate Data Record Collection Spanning 1947-2012

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Unified Sea Ice Thickness Climate Data Record is the result of a concerted effort to collect as many observations as possible of Arctic sea-ice draft, freeboard,...

  18. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Active Fires Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) that contains pinpoint locations of active fires (AF) as identified by an algorithm...

  19. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Directory of Open Access Journals (Sweden)

    Zhong Yi Wan

    Full Text Available The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more

  20. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in
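
    The blended forecasting idea lends itself to a compact illustration. The following minimal sketch is not the authors' implementation: it assumes TensorFlow/Keras, and rom_step (the imperfect reduced-order propagator), d (reduced-space dimension), and k (history length) are hypothetical stand-ins. An LSTM is trained on the mismatch between the imperfect model's one-step forecast and the observed reduced-order state, and the learned correction is added back at prediction time.

        import numpy as np
        import tensorflow as tf

        def make_windows(states, rom_step, k):
            """Build (history window, model-data mismatch) training pairs."""
            X, y = [], []
            for t in range(k, len(states) - 1):
                X.append(states[t - k:t])                      # k past reduced states
                y.append(states[t + 1] - rom_step(states[t]))  # mismatch the LSTM must learn
            return np.array(X), np.array(y)

        d, k = 8, 10                    # hypothetical reduced dimension and history length
        rom_step = lambda z: 0.95 * z   # stand-in for the projected (imperfect) equations
        states = np.random.randn(2000, d).astype("float32")  # stand-in reduced-order data stream

        X, y = make_windows(states, rom_step, k)
        model = tf.keras.Sequential([
            tf.keras.layers.LSTM(32, input_shape=(k, d)),
            tf.keras.layers.Dense(d),
        ])
        model.compile(optimizer="adam", loss="mse")
        model.fit(X, y, epochs=5, batch_size=64, verbose=0)

        # Blended forecast: imperfect model plus learned correction.
        hist = states[-k:]
        z_next = rom_step(hist[-1]) + model.predict(hist[None, ...], verbose=0)[0]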

  1. The Albian oceanic anoxic events record in central and northern Tunisia: Geochemical data and paleotectonic controls

    OpenAIRE

    Khalifa, Zina; Affouri, Hassene; Rigane, Adel; Jacob, Jérémy

    2018-01-01

    The Albian organic-rich successions of the lower part of the Fahdene Formation (Albian to Cenomanian, Tunisia) were studied using sedimentology (analysis of carbonate contents and observation of thin sections), bulk organic geochemistry (Rock-Eval pyrolysis), and molecular biomarker distributions. The selected outcrops cover different structural domains from western central Tunisia (Jebel Hamra) to the Diapir zone or the Tunisian Trough (Koudiat Berkouchia, Jebel Ghazo...

  2. The Cadmium Isotope Record of the Great Oxidation Event from the Turee Creek Group, Hamersley Basin, Australia

    Science.gov (United States)

    Abouchami, W.; Busigny, V.; Philippot, P.; Galer, S. J. G.; Cheng, C.; Pecoits, E.

    2016-12-01

    The evolution of the ocean, atmosphere and biosphere throughout Earth's history has impacted the biogeochemistry of some key trace metals that are of particular importance in regulating the exchange between Earth's reservoirs. Several geochemical proxies exhibit isotopic shifts that have been linked to major changes in the oxygenation levels of the ancient oceans during the Great Oxygenation Event (GOE) between 2.45 and 2.2 Ga and the Neoproterozoic Oxygenation Event at ca. 0.6 Ga. Studies of the modern marine biogeochemical cycle of the transition metal cadmium have shown that stable Cd isotope fractionation is mainly driven by biological uptake of light Cd into marine phytoplankton in surface waters, leaving behind seawater enriched in the heavy Cd isotopes. Here we use the potential of this novel proxy to trace ancient biological productivity, which remains an enigma, particularly during the early stages of Earth history. The Turee Creek Group in the Hamersley Basin, Australia, provides a continuous stratigraphic sedimentary section covering the GOE and at least two glacial events, offering a unique opportunity to examine the changes that took place during these periods and possibly constrain the evolution, timing and onset of oxygenic photosynthesis. Stable Cd isotope data were obtained on samples from the Boolgeeda Iron Fm. (BIFs), the siliciclastic and carbonate successions of Kungara (including the Meteorite Bore Member) and the Kazputt Fm., using a double-spike technique by TIMS (ThermoFisher Triton), and Cd concentrations were determined by isotope dilution. The Boolgeeda BIFs have generally low Cd concentrations varying between 8 and 50 ppb, with two major excursions marked by an increase in Cd content, reaching levels similar to those in the overlying Kungarra Fm. (≥150 ppb). These variations are associated with a large range in ɛ112/110Cd values (-2 to +2), with the most negative values typically found in the organic and Cd-rich shales and

  3. Citizen Science for Data Rescue: Recovering Historical Climate Records with a Network of 20,000 Volunteers.

    Science.gov (United States)

    Brohan, P.

    2014-12-01

    Recent years have seen many extreme and damaging weather events - for example the low Arctic sea-ice of 2012, and the severe winter of 2013/4 in North America and the UK. To understand these events, and to judge whether they represent environmental change, we need to compare today's weather to the long-term historical record. Our long-term historical record of the weather is based on the billions of observations, from scientists, explorers, mariners, and others, that have been made, across the world, over the last few centuries. Many of these records are still dark: they exist only as hand-written paper documents in various archives and libraries, and are inaccessible to science. As a result our historical weather reconstructions have major gaps, where we do not know how the climate has varied. oldWeather.org is a citizen science project rescuing these observations. By providing a web interface to scans of paper records, we enable volunteers around the world to contribute to the task of rescuing the observations. So far a community of around 20,000 volunteers has read well over 1 million pages of paper records and contributed millions of recovered weather observations to international climate datasets. As well as learning about past weather, we are also learning what it takes to build a successful volunteer science project in this area: building a community, breaking down the task into manageable steps, feeding success back to the volunteers, and enabling committed volunteers to take on more responsibilities were all vital to our success. We are currently using those lessons to build a new version of oldWeather that can rescue even more data.

  4. Protocol for Validation of the Land Surface Reflectance Fundamental Climate Data Record using AERONET: Application to the Global MODIS and VIIRS Data Records

    Science.gov (United States)

    Roger, J. C.; Vermote, E.; Holben, B. N.

    2014-12-01

    The land surface reflectance is a fundamental climate data record at the basis of the derivation of other climate data records (albedo, LAI/FPAR, vegetation indices) and a key parameter in the understanding of land-surface-climate processes. It is essential that a careful validation of its uncertainties is performed on a global and continuous basis. One approach is the direct comparison of this product with ground measurements, but that approach presents several issues related to scale, the episodic nature of ground measurements, and global representativeness. An alternative is to compare the surface reflectance product to reference reflectances determined from top-of-atmosphere reflectance corrected using an accurate radiative transfer code and very detailed measurements of the atmosphere obtained over the AERONET sites (Vermote et al., 2014, RSE), which allows testing over a large range of aerosol characteristics, these being important inputs for atmospheric correction. However, the application of this method necessitates the definition of a very detailed protocol for the use of AERONET data, especially as far as size distribution and absorption are concerned, so that alternative validation methods or protocols can be compared. This paper describes the protocol we have been working on, based on our experience with the AERONET data, and its application to the MODIS and VIIRS records.

  5. A guide to collect data from abnormal events in industrial radiography

    International Nuclear Information System (INIS)

    Martins, M.M.; Silva, F.C.; Tahuata, L.

    1996-01-01

    The review of abnormal radiological events provides important information for evaluating their causes. The IAEA and other institutions have dedicated special attention to this subject, studying mainly radiological accidents that affected members of the public and exposed workers. According to UNSCEAR, industrial radiography and other radiographic techniques are responsible for a great number of overexposure events. This paper can be used by health physicists and other professionals as a guide to extracting the most important information related to abnormal events that happen in industrial radiography. This guide was used in 1992 for the information registration data base (1976-1992) of the Brazilian Nuclear Energy Commission (CNEN), where 175 events were identified with the minimal amount of information needed for analysis. The collected data are also presented. (authors). 6 refs., 1 ill

  6. A Compound-Specific Hydrogen Isotope Record at the Onset of Ocean Anoxic Event 2, Kaiparowits Plateau, Southern Utah

    Science.gov (United States)

    Todes, J.; Jones, M. M.; Sageman, B. B.; Osburn, M. R.

    2017-12-01

    Rhythmic lithologic variations (limestone-shale couplets) interpreted to reflect Milankovitch cycles occur at the onset of Ocean Anoxic Event 2 (OAE2) in deposits of the Western Interior Seaway. These couplets have been interpreted to reflect climate cycles; however, the physical mechanism(s) through which climate cycles were translated to the sedimentary record during peak greenhouse conditions remain unsettled. Although glacioeustasy has been considered, variance in surface ocean temperature, ocean circulation, or local hydrology may be more plausible options. Compound-specific hydrogen isotope ratios (δ2H) of n-alkanes and other biomarkers may provide a means to evaluate such mechanisms. Since sedimentary alkanes are direct products of plants and membrane lipid diagenesis and are resistant to secondary hydrogen exchange during thermal maturation at low temperatures, chain length distributions suggest low thermal maturity and the possible preservation of primary δ2H values. Short- and long-chain n-alkanes are potentially sourced from planktonic biomass and terrestrial plants, respectively, enabling a comparison of climatic processes between marine and terrestrial settings. Biomarkers, including both steranes and hopanes, are also preserved and reflect putative source organisms and local paleoenvironmental conditions. Facies-specific δ2H analysis will allow for evaluation of changes in the dominant source of atmospheric moisture in the Western Interior during orbitally-forced climate cycles. Organic matter deposited during periods of northerly Boreal influence would have a depleted 2H-isotope composition relative to that deposited during periods of more southerly Tethys influence. In this model, these variations are reflected by lithology: limestone deposition would occur during warm, evaporative Tethys-dominated times, while cooler, wetter Boreal periods would promote shale deposition.

  7. Detection of unusual events and trends in complex non-stationary data streams

    International Nuclear Information System (INIS)

    Charlton-Perez, C.; Perez, R.B.; Protopopescu, V.; Worley, B.A.

    2011-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for diverse applications, ranging from power plant operation to homeland security. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden events inside intermittent signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.
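
    As a concrete, hedged illustration of this approach: the sketch below assumes the community PyEMD package (PyPI name "EMD-signal") and a synthetic signal standing in for real plant data; the burst location and the 5-sigma threshold are illustrative choices, not the authors'. The signal is decomposed into intrinsic mode functions (IMFs), and samples where the highest-frequency mode exceeds a robust threshold are flagged as unusual events.

        import numpy as np
        from PyEMD import EMD   # assumed: community PyEMD package ("EMD-signal")

        t = np.linspace(0, 10, 2000)
        signal = np.sin(2 * np.pi * t) + 0.5 * np.random.randn(t.size)
        signal[1200:1210] += 3.0            # injected local disturbance ("unusual event")

        imfs = EMD()(signal)                # empirical mode decomposition into IMFs
        detail = imfs[0]                    # highest-frequency mode carries the burst

        # Robust threshold: median absolute deviation scaled to approximate sigma.
        mad = 1.4826 * np.median(np.abs(detail - np.median(detail)))
        events = np.where(np.abs(detail) > 5 * mad)[0]
        print("flagged sample indices:", events)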

  8. The REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement.

    Directory of Open Access Journals (Sweden)

    Eric I Benchimol

    2015-10-01

    Full Text Available Routinely collected health data, obtained for administrative and clinical purposes without specific a priori research goals, are increasingly used for research. The rapid evolution and availability of these data have revealed issues not addressed by existing reporting guidelines, such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE). The REporting of studies Conducted using Observational Routinely collected health Data (RECORD) statement was created to fill these gaps. RECORD was created as an extension to the STROBE statement to address reporting items specific to observational studies using routinely collected health data. RECORD consists of a checklist of 13 items related to the title, abstract, introduction, methods, results, and discussion section of articles, and other information required for inclusion in such research reports. This document contains the checklist and explanatory and elaboration information to enhance the use of the checklist. Examples of good reporting for each RECORD checklist item are also included herein. This document, as well as the accompanying website and message board (http://www.record-statement.org), will enhance the implementation and understanding of RECORD. Through implementation of RECORD, authors, journal editors, and peer reviewers can encourage transparency of research reporting.

  9. Data book of the component failure rate stored in the RECORD

    International Nuclear Information System (INIS)

    Oikawa, Testukuni; Sasaki, Shinobu; Hikawa, Michihiro; Higuchi, Suminori.

    1989-04-01

    The Japan Atomic Energy Research Institute (JAERI) has developed a computerized component reliability data base and its retrieval system, RECORD, for failure rates collected from the published literature, in order to promote the convenience and efficiency of systems reliability analysis in PSA (Probabilistic Safety Assessment). In order to represent the collected failure rates in a uniform format, codes are defined for component category, failure mode, data source, unit of failure rate and statistical parameter. Up to now, approximately 11,500 pieces of component failure rate data from about 35 open-literature sources have been stored in RECORD. This report provides the failure rates stored in the RECORD data base for use by systems analysts, as well as brief descriptions of the data base structure and how to use this data book. (author)

  10. Abstracting ICU Nursing Care Quality Data From the Electronic Health Record.

    Science.gov (United States)

    Seaman, Jennifer B; Evans, Anna C; Sciulli, Andrea M; Barnato, Amber E; Sereika, Susan M; Happ, Mary Beth

    2017-09-01

    The electronic health record is a potentially rich source of data for clinical research in the intensive care unit setting. We describe the iterative, multi-step process used to develop and test a data abstraction tool, used for collection of nursing care quality indicators from the electronic health record, for a pragmatic trial. We computed Cohen's kappa coefficient (κ) to assess interrater agreement or reliability of data abstracted using preliminary and finalized tools. In assessing the reliability of study data (n = 1,440 cases) using the finalized tool, 108 randomly selected cases (10% of first half sample; 5% of last half sample) were independently abstracted by a second rater. We demonstrated mean κ values ranging from 0.61 to 0.99 for all indicators. Nursing care quality data can be accurately and reliably abstracted from the electronic health records of intensive care unit patients using a well-developed data collection tool and detailed training.
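
    The reliability computation itself is standard and easy to reproduce. A minimal sketch, assuming scikit-learn and hypothetical binary abstractions of a single care-quality indicator scored by two independent raters:

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical binary abstractions of one indicator for 12 cases,
        # scored independently by two raters from the same EHR records.
        rater_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1]
        rater_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1]

        kappa = cohen_kappa_score(rater_a, rater_b)
        print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance level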

  11. Effects of various event building techniques on data acquisition system architectures

    International Nuclear Information System (INIS)

    Barsotti, E.; Booth, A.; Bowden, M.

    1990-04-01

    The preliminary specifications for various new detectors throughout the world, including those at the Superconducting Super Collider (SSC), already make it clear that existing event building techniques will be inadequate for the high trigger and data rates anticipated for these detectors. In the world of high-energy physics, many approaches have been taken to solving the problem of reading out data from a whole detector and presenting a complete event to the physicist, while simultaneously keeping deadtime to a minimum. This paper includes a review of multiprocessor and telecommunications interconnection networks and how these networks relate to event building in general, illustrating advantages of the various approaches. It presents a more detailed study of recent research into new event building techniques which incorporate much greater parallelism to better accommodate high data rates. The future in areas such as front-end electronics architectures, high-speed data links, event building and online processor arrays is also examined. Finally, details of a scalable parallel data acquisition system architecture being developed at Fermilab are given. 35 refs., 31 figs., 1 tab

  12. Event Handler II: a fast, programmable, CAMAC-coupled data acquisition interface

    International Nuclear Information System (INIS)

    Hensley, D.C.

    1979-01-01

    The architecture of the Event Handler II, a fast, programmable data acquisition interface linked to and through CAMAC, is described. The special features of this interface make it a powerful tool in implementing data acquisition systems for experiments in nuclear physics. 1 figure, 1 table

  13. Event Handler: a fast programmable, CAMAC-coupled data acquisition interface

    International Nuclear Information System (INIS)

    Hensley, D.C.

    1978-01-01

    The purpose of this paper is to describe the architecture and performance of the Event Handler, a fast, programmable data acquisition interface which is linked to and through CAMAC. The special features of this interface make it a powerful tool in implementing data acquisition systems for experiments in nuclear physics

  14. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Science.gov (United States)

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
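
    A minimal sketch of the general idea behind geometric data perturbation (a random rotation plus translation and noise applied to a numeric record matrix), using NumPy. This illustrates the family of techniques, not the authors' exact scheme; the matrix sizes and noise scale are hypothetical.

        import numpy as np

        rng = np.random.default_rng(7)

        # Toy numeric health-record matrix: rows = patients, columns = attributes.
        X = rng.normal(size=(100, 4))

        Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))  # random orthogonal (rotation) matrix
        t = rng.normal(size=4)                        # random translation
        noise = rng.normal(scale=0.05, size=X.shape)  # small additive noise

        X_perturbed = X @ Q + t + noise               # what would be outsourced to the cloud

        # Pairwise distances are approximately preserved by the rotation, which is
        # what keeps mining tasks on the perturbed data meaningful.
        print(np.linalg.norm(X[0] - X[1]), np.linalg.norm(X_perturbed[0] - X_perturbed[1]))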

  15. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Directory of Open Access Journals (Sweden)

    S. Balasubramaniam

    2015-01-01

    Full Text Available Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  16. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  17. Utilization of ARR (Automatic Rainfall Recorder) Data to Improve the Effectiveness of Satellite Rainfall Models (Case Study: Indragiri Watershed)

    OpenAIRE

    Hendra, Yuli; Fauzi, Manyuk; Sutikno, Sigit

    2015-01-01

    The availability of data for hydrological modeling has always been a problem, given the incompleteness and imprecision of the data. With the development of technology, many hydrological models using data acquired from satellites have emerged. However, the accuracy and model correlation achieved in previous research using satellite data were still unsatisfactory. These problems were caused by unstable weather conditions, so that the process of recording and downloading the satellite data becomes less opt...

  18. Yield estimation based on calculated comparisons to particle velocity data recorded at low stress

    International Nuclear Information System (INIS)

    Rambo, J.

    1993-01-01

    This paper deals with the problem of optimizing the yield estimation process when some of the material properties are known from geophysical measurements and others are inferred from in-situ dynamic measurements. The material models and 2-D simulations of the event are combined to determine the yield. Other methods of yield determination from peak particle velocity data have mostly been based on comparisons of nearby events in similar media at NTS. These methods are largely empirical and are subject to additional error when a new event has different properties from the population used as a basis of comparison. The effect of material variations can be examined using LLNL's KDYNA computer code. The data from an NTS event provide an instructive example for simulation.

  19. Data Matching Concepts and Techniques for Record Linkage, Entity Resolution, and Duplicate Detection

    CERN Document Server

    Christen, Peter

    2012-01-01

    Data matching (also known as record or data linkage, entity resolution, object identification, or field matching) is the task of identifying, matching and merging records that correspond to the same entities from several databases or even within one database. Based on research in various domains including applied statistics, health informatics, data mining, machine learning, artificial intelligence, database management, and digital libraries, significant advances have been achieved over the last decade in all aspects of the data matching process, especially on how to improve the accuracy of da
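
    To give a flavor of the task, here is a toy sketch of pairwise record matching using only the Python standard library's string similarity; the field names and the 0.85 decision threshold are hypothetical, and production systems add blocking, field weighting, and learned classifiers on top of such scores.

        from difflib import SequenceMatcher

        def field_sim(a: str, b: str) -> float:
            """Normalized string similarity in [0, 1]."""
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def match_score(rec_a: dict, rec_b: dict, fields=("name", "city")) -> float:
            """Average similarity over the compared fields."""
            return sum(field_sim(rec_a[f], rec_b[f]) for f in fields) / len(fields)

        db_a = [{"id": 1, "name": "Jon Smith", "city": "Leeds"}]
        db_b = [{"id": 9, "name": "John Smith", "city": "Leeds"},
                {"id": 10, "name": "Jane Smyth", "city": "London"}]

        THRESHOLD = 0.85  # hypothetical decision boundary
        for a in db_a:
            for b in db_b:
                score = match_score(a, b)
                if score >= THRESHOLD:
                    print(f"match: record {a['id']} <-> record {b['id']} (score {score:.2f})")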

  20. Geochemistry and Cyclostratigraphy of Magnetic Susceptibility data from the Frasnian-Famennian event interval in western Canada: Insights in the pattern and timing of a biotic crisis

    Science.gov (United States)

    Whalen, M. T.; De Vleeschouwer, D.; Sliwinski, M. G.; Claeys, P. F.; Day, J. E.

    2012-12-01

    Cyclostratigraphic calibration of magnetic susceptibility data, along with stable isotopic and geochemical proxy data for redox, productivity, and detrital input from western Canada, provides insight into the pace and timing of the Late Devonian, Frasnian-Famennian (F-F) biotic crisis. The F-F event is characterized by two organic-rich shales that, in much of the world, display geochemical anomalies indicating low oxygen conditions and carbon burial. These events, referred to as the Lower and Upper Kellwasser events (LKE & UKE), have been linked to the evolutionary expansion of deeply rooted terrestrial forests, the concomitant changes in soil development and chemical weathering, and changes in Late Devonian climate. Our geochemical data record relatively high levels of redox-sensitive trace metals (Mo, U, V), proxies for biological productivity (Ba, Cu, Ni, Zn), and detrital input (Al, Si, Ti, Zr) during both events. C stable isotopic data generated from organic matter record a 3-4‰ positive excursion during both events. Each event is recorded in lowstand and/or early transgressive facies. These data corroborate hypotheses about enhanced biological productivity, driven by heightened terrestrial detrital input, leading to low oxygen conditions and decreases in biotic diversity during relatively low stands of Late Devonian sea level. Age dating of such events in deep time is problematic due to insufficient biochronologic control. Each event is within one conodont biostratigraphic zone, with durations on the order of 0.5-1.0 Ma. Time series analysis of high-resolution magnetic susceptibility data identified 16 long eccentricity cycles (405 ky) during the Frasnian stage and one in the earliest Famennian stage. The geochemical anomalies associated with the LKE and UKE are recorded over 7 and 14 m of stratigraphic section, respectively. These strata represent only a portion of a 405 ky long eccentricity cycle, and astronomical tuning implies that the LKE likely occurred
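
    The cycle-detection step described here can be sketched with a Lomb-Scargle periodogram, which suits unevenly sampled stratigraphic series. The sketch below assumes SciPy and uses synthetic data in place of the actual magnetic susceptibility measurements; the 405-kyr signal is injected so the expected peak is known.

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(0)

        # Synthetic stand-in: unevenly sampled ages (kyr) with a 405-kyr
        # eccentricity beat plus noise, mimicking a susceptibility series.
        ages = np.sort(rng.uniform(0, 8000, 400))
        ms = np.sin(2 * np.pi * ages / 405.0) + 0.5 * rng.normal(size=ages.size)

        periods = np.linspace(50, 1000, 2000)      # trial periods in kyr
        ang_freqs = 2 * np.pi / periods            # lombscargle expects angular frequencies
        power = lombscargle(ages, ms - ms.mean(), ang_freqs)

        print(f"dominant period ~ {periods[np.argmax(power)]:.0f} kyr")  # ~405 expected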

  1. Modeling the Process of Event Sequence Data Generated for Working Condition Diagnosis

    Directory of Open Access Journals (Sweden)

    Jianwei Ding

    2015-01-01

    Full Text Available Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance focuses on analyzing these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all the telemetry data to understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of WCM, we are able to analyze how event sequence data behave in different working modes and, at the same time, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied WCM to illustrative applications such as automated detection of anomalous event sequences during equipment runtime. Our experimental results on real data sets demonstrate the effectiveness of the model.

  2. Using data from ambient assisted living and smart homes in electronic health records.

    Science.gov (United States)

    Knaup, P; Schöpe, L

    2014-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Using Data from Ambient Assisted Living and Smart Homes in Electronic Health Records". To increase the efficiency of future health care, data from innovative technologies such as those used for ambient assisted living (AAL) or smart homes should be available for individual health decisions. Integrating and aggregating data from different medical devices and health records enables a comprehensive view of health data. The objective of this paper is to present examples of the state of the art in research on information management that leads to sustainable use and long-term storage of health data provided by innovative assistive technologies in daily living. Current research deals with the perceived usefulness of sensor data, the participatory design of visual displays for presenting monitoring data, and communication architectures for integrating sensor data from home health care environments with health care providers, either via a regional health record bank or via a telemedical center. Integrating data from AAL systems and smart homes with data from electronic patient or health records is still at an early stage. Several projects are in an advanced conceptual phase, some of them exploring feasibility with the help of prototypes. Comprehensive general solutions are hardly available and should become a major issue of medical informatics research in the near future.

  3. Designing Alternative Transport Methods for the Distributed Data Collection of ATLAS EventIndex Project

    CERN Document Server

    Fernandez Casani, Alvaro; The ATLAS collaboration

    2016-01-01

    One of the key and challenging tasks of the ATLAS EventIndex project is to index and catalog all the produced events not only at CERN but also at hundreds of worldwide grid sites, and convey the data in real time to a central Hadoop instance at CERN. While this distributed data collection is currently operating correctly in production, there are some issues that might impose performance bottlenecks in the future, with an expected rise in the event production and reprocessing rates. In this work, we first describe the current approach based on a messaging system, which conveys the data from the sources to the central catalog, and we identify some weaknesses of this system. Then, we study a promising alternative transport method based on an object store, presenting a performance comparison with the current approach, and the architectural design changes needed to adapt the system to the next run of the ATLAS experiment at CERN.

  4. Identifying FRBR Work-Level Data in MARC Bibliographic Records for Manifestations of Moving Images

    Directory of Open Access Journals (Sweden)

    Lynne Bisko

    2008-12-01

    Full Text Available The library metadata community is dealing with the challenge of implementing the conceptual model, Functional Requirements for Bibliographic Records (FRBR). In response, the Online Audiovisual Catalogers (OLAC) created a task force to study the issues related to creating and using FRBR-based work-level records for moving images. This article presents one part of the task force's work: it looks at the feasibility of creating provisional FRBR work-level records for moving images by extracting data from existing manifestation-level bibliographic records. Using a sample of 941 MARC records, a subgroup of the task force conducted a pilot project to look at five characteristics of moving image works. Here they discuss their methodology; analysis; selected results for two elements, original date (year) and director name; and conclude with some suggested changes to MARC coding and current cataloging policy.

  5. The cause-consequence data base: a retrieval system for records pertaining to accident management

    International Nuclear Information System (INIS)

    Kumamoto, H.; Inoue, K.; Sawaragi, Y.

    1981-01-01

    This paper describes a proposal to store in a data base important paragraphs from reports of investigations into many types of accidents. The data base is to handle not only reports on TMI, but also reports on other events at nuclear reactors, chemical plant explosions, earthquakes, hurricanes, fires, and so forth. (author)

  6. Radionuclide data analysis in connection of DPRK event in May 2009

    Science.gov (United States)

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, traces linked to the DPRK event were not found. After three weeks of high alert, the PTS resumed its normal operational routine. This case illuminates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds of the key (CTBT-relevant) radionuclides achieved across the DPRK event area and for the assessment of radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for the sites, due to legitimate activities unrelated to nuclear tests. Therefore, a set of hypotheses was used to see whether a detection is consistent with the event time and location through atmospheric transport modelling. The consistency of event timing and isotopic ratios was also used in the evaluation work. As a result it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no problem detecting it. This case also showed the importance of on-site inspections to verify the nuclear traces of possible tests.

  7. Data Prediction for Public Events in Professional Domains Based on Improved RNN- LSTM

    Science.gov (United States)

    Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan

    2018-02-01

    The traditional data services for predicting emergency or non-periodic events usually cannot generate satisfying results or fulfill the correct prediction purpose. However, these events are influenced by external causes, which means that a certain amount of a priori information about them can generally be collected through the Internet. This paper studied the above problems and proposed an improved model: an LSTM (Long Short-Term Memory) dynamic prediction and a priori information sequence generation model, built by combining an RNN-LSTM with a priori information about public events. In prediction tasks, the model is capable of determining trends, and its accuracy has been validated. The model generates better performance and prediction results than the previous one. Using a priori information can increase the accuracy of prediction; LSTM can better adapt to changes in a time sequence; and LSTM can be widely applied to the same type of prediction task as well as to other prediction tasks related to time sequences.

  8. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2013-10-09

    Emerging Big Data applications in areas like e-commerce and the energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high-performance online query scheme which in particular deals with the velocity aspect of the 3-V's of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques to achieve interactive query processing over realtime data, featuring efficient online querying, dynamic stream data persistence, and on-demand access. The system architecture is presented and the current implementation status reported.
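
    A toy sketch of the hybrid idea, with hypothetical event types and window size (this is not the H2O implementation): an online matcher raises alerts as events stream in, while a bounded window of retained state can answer ad hoc on-demand queries afterwards.

        from collections import deque

        class StatefulCEP:
            """Toy stateful event processor: online pattern alerts plus a
            bounded in-memory window for later on-demand queries."""

            def __init__(self, window_size=1000):
                self.window = deque(maxlen=window_size)  # retained recent state

            def process(self, event):
                self.window.append(event)
                # Online query: alert on a simple two-event sequence pattern.
                if (len(self.window) >= 2
                        and self.window[-2]["type"] == "price_drop"
                        and event["type"] == "buy_spike"):
                    print("online alert: drop followed by spike at t =", event["ts"])

            def on_demand(self, predicate):
                """Ad hoc query evaluated over the retained window state."""
                return [e for e in self.window if predicate(e)]

        cep = StatefulCEP()
        for i, kind in enumerate(["tick", "price_drop", "buy_spike", "tick"]):
            cep.process({"ts": i, "type": kind})

        print(cep.on_demand(lambda e: e["type"] != "tick"))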

  9. Automated Feature and Event Detection with SDO AIA and HMI Data

    Science.gov (United States)

    Davey, Alisdair; Martens, P. C. H.; Attrill, G. D. R.; Engell, A.; Farid, S.; Grigis, P. C.; Kasper, J.; Korreck, K.; Saar, S. H.; Su, Y.; Testa, P.; Wills-Davey, M.; Savcheva, A.; Bernasconi, P. N.; Raouafi, N.-E.; Delouille, V. A.; Hochedez, J. F.; Cirtain, J. W.; Deforest, C. E.; Angryk, R. A.; de Moortel, I.; Wiegelmann, T.; Georgouli, M. K.; McAteer, R. T. J.; Hurlburt, N.; Timmons, R.

    The Solar Dynamics Observatory (SDO) represents a new frontier in quantity and quality of solar data. At about 1.5 TB/day, the data will not be easily digestible by solar physicists using the same methods that have been employed for images from previous missions. In order for solar scientists to use the SDO data effectively they need meta-data that will allow them to identify and retrieve data sets that address their particular science questions. We are building a comprehensive computer vision pipeline for SDO, abstracting complete metadata on many of the features and events detectable on the Sun without human intervention. Our project unites more than a dozen individual, existing codes into a systematic tool that can be used by the entire solar community. The feature finding codes will run as part of the SDO Event Detection System (EDS) at the Joint Science Operations Center (JSOC; joint between Stanford and LMSAL). The metadata produced will be stored in the Heliophysics Event Knowledgebase (HEK), which will be accessible on-line for the rest of the world directly or via the Virtual Solar Observatory (VSO). Solar scientists will be able to use the HEK to select event and feature data to download for science studies.

  10. Distributed Data Collection For Next Generation ATLAS EventIndex Project

    CERN Document Server

    Fernandez Casani, Alvaro; The ATLAS collaboration

    2018-01-01

    The ATLAS EventIndex currently runs in production in order to build a complete catalogue of events for experiments with large amounts of data. The current approach is to index all final produced data files at CERN Tier0, and at hundreds of grid sites, with a distributed data collection architecture using Object Stores to temporarily maintain the conveyed information, with references to them sent with a Messaging System. The final backend of all the indexed data is a central Hadoop infrastructure at CERN; an Oracle relational database is used for faster access to a subset of this information. In the future of ATLAS, instead of files, the event should be the atomic information unit for metadata. This motivation arises from the need to accommodate future data processing and storage technologies. Files will no longer be static quantities, possibly dynamically aggregating data, and also allowing event-level granularity processing in heavily parallel computing environments. It also simplifies the handling of loss and or e...

  11. Creation of a long-term data record of total O3 - issues and challenges in prescribing the uncertainties

    Science.gov (United States)

    Haffner, D. P.; Bhartia, P. K.; Li, J. Y.

    2012-12-01

    With the launch of the BUV instrument on NASA's Nimbus-4 satellite in April 1970, ozone became one of the first atmospheric variables to be measured from space with high accuracy. By 1980, the quality of total column ozone measured from the TOMS instrument on the Nimbus-7 satellite had improved to the point that it started to be used to identify poorly calibrated instruments in the venerable Dobson ground-based network. We now have a total ozone record spanning 42 years, created by more than a dozen instruments. We will discuss the issues and challenges that we have faced in creating a consistent long-term record and in providing uncertainty estimates. This work is not yet finished. We are currently developing a new algorithm (Version 9) that will be used to reprocess the entire record. The main motivation for developing this algorithm is not so much to improve the quality of the data, which is quite high already, but to provide better estimates of uncertainties when errors are spatially and temporally correlated, and to develop better techniques to catch "Black Swan" events (BSE). These are events that occur infrequently but cause errors larger than expected from a Gaussian probability distribution. For example, the eruption of El Chichón revealed that our ozone algorithm had unexpected sensitivity to volcanic SO2, and evidence of the ozone hole was initially interpreted as a problem with the TOMS instrument. We also provide mathematical operators that can be applied by sophisticated users to compute their own uncertainties for their particular applications. This is necessary because uncertainties change in complex ways when the data are smoothed or averaged. The modern data archival system should be designed to accommodate such operators and provide software for using them.
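
    The point about correlated errors can be made concrete in a few lines: for an averaging operator w applied to measurements with covariance matrix C, the variance of the result is w^T C w, which exceeds the naive sigma^2/n whenever errors are positively correlated. A worked NumPy example, with a uniform pairwise correlation assumed purely for illustration:

        import numpy as np

        n = 10
        sigma = 1.0   # per-measurement standard error
        rho = 0.5     # hypothetical pairwise error correlation

        # Covariance matrix with uniform correlation between measurements.
        C = sigma**2 * (rho * np.ones((n, n)) + (1 - rho) * np.eye(n))

        w = np.full(n, 1.0 / n)   # simple averaging operator
        var_mean = w @ C @ w      # variance of the averaged value

        print(f"uncorrelated estimate: {sigma / np.sqrt(n):.3f}")  # 0.316
        print(f"with correlation:      {np.sqrt(var_mean):.3f}")   # ~0.742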

  12. Sedimentary record and luminescence chronology of palaeoflood events along the Gold Gorge of the upper Hanjiang River, middle Yangtze River basin, China

    Science.gov (United States)

    Guo, Yongqiang; Huang, Chun Chang; Zhou, Yali; Pang, Jiangli; Zha, Xiaochun; Fan, Longjiang; Mao, Peini

    2018-05-01

    Palaeoflood slackwater deposits (SWDs) along river banks have important implications for the reconstruction of past hydro-climatic events. Two palaeoflood SWD beds were identified in the Holocene loess-soil sequences on the cliff river banks along the Gold Gorge of the upper Hanjiang River by field investigation and laboratory analysis. They record two palaeoflood events, which were dated by optically stimulated luminescence to 3.2-2.8 ka and 2.1-1.8 ka, respectively. The reliability of the ages obtained for the two events is further confirmed by the presence of archaeological remains and good regional pedostratigraphic correlation. The peak discharges of the two palaeoflood events at the studied sites were estimated at 16,560-17,930 m3/s. A correlation with the palaeoflood events identified in other reaches shows that great floods occurred frequently during the episodes of 3200-2800 and 2000-1700 a BP along the upper Hanjiang River valley during the last 4000 years. These phases of palaeoflood events in central China correlate well with the climatic variability identified in the δ18O record of stalagmites from the middle Yangtze River Basin and show apparent global linkages. Palaeoflood studies at a watershed scale also imply that intensified human activity during the Shang dynasty (BCE 1600-1100) and the Han dynasty (BCE 206-CE 265) may have caused accelerated soil erosion along the upper Hanjiang River valley.

  13. Direct and indirect costs for adverse drug events identified in medical records across care levels, and their distribution among payers.

    Science.gov (United States)

    Natanaelsson, Jennie; Hakkarainen, Katja M; Hägg, Staffan; Andersson Sundell, Karolina; Petzold, Max; Rehnberg, Clas; Jönsson, Anna K; Gyllensten, Hanna

    2017-11-01

    Adverse drug events (ADEs) cause considerable costs in hospitals. However, little is known about costs caused by ADEs outside hospitals, effects on productivity, and how the costs are distributed among payers. To describe the direct and indirect costs caused by ADEs, and their distribution among payers. Furthermore, to describe the distribution of patient out-of-pocket costs and lost productivity caused by ADEs according to socio-economic characteristics. In a random sample of 5025 adults in a Swedish county, prevalence-based costs for ADEs were calculated. Two different methods were used: 1) based on resource use judged to be caused by ADEs, and 2) as costs attributable to ADEs by comparing costs among individuals with ADEs to costs among matched controls. Payers of costs caused by ADEs were identified in medical records among those with ADEs (n = 596), and costs caused to individual patients were described by socio-economic characteristics. Costs for resource use caused by ADEs were €505 per patient with ADEs (95% confidence interval €345-665), of which 38% were indirect costs. Compared to matched controls, the costs attributable to ADEs were €1631, of which €410 were indirect costs. The local health authorities paid 58% of the costs caused by ADEs. Women had higher productivity loss than men (€426 vs. €109, p = 0.018). Out-of-pocket costs displaced a larger proportion of the disposable income among low-income earners than higher income earners (0.7% vs. 0.2%-0.3%). We used two methods to identify costs for ADEs, both identifying indirect costs as an important component of the overall costs for ADEs. Although the largest payers of costs caused by ADEs were the local health authorities responsible for direct costs, employers and patients costs for lost productivity contributed substantially. Our results indicate inequalities in costs caused by ADEs, by sex and income. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Passive Visual Analytics of Social Media Data for Detection of Unusual Events

    OpenAIRE

    Rustagi, Kush; Chae, Junghoon

    2016-01-01

    Now that social media sites have gained substantial traction, huge amounts of unanalyzed valuable data are being generated. Posts containing images and text have spatiotemporal data attached as well, and have immense value for increasing situational awareness of local events, providing insights for investigations, and understanding the extent of incidents, their severity and consequences, as well as their time-evolving nature. However, the large volume of unstructured social media data hinders...

  15. The January 2001, El Salvador event: a multi-data analysis

    Science.gov (United States)

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

    On January 13, 2001, a large normal-faulting event (Mw=7.6) occurred 100 kilometers from the Salvadorian coast (Central America) with a centroid depth of about 50 km. The size of this event is surprising given the classical idea that such events should be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data: because teleseismic waves are the only data which offer good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (dipping toward the Pacific Ocean) and the 60° plane (dipping toward Central America) led us to perform a parallel analysis of the two possible planes. We used a simple point-source modelling to define the main characteristics of the event and then used an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals a downdip and northwest rupture propagation, but the difference of fit remains subtle even when using the extended source. In a second part we confronted our models for the two planes with other seismological data, which are (1) regional data, (2) surface wave data through an Empirical Green Function given by a similar but much weaker earthquake which occurred in July 1996, and lastly (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigationes Geotecnicas (CIG). Regional data do not allow us to discriminate between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steeper one, dipping toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  16. A novel approach to leveraging electronic health record data to enhance pediatric surgical quality improvement bundle process compliance.

    Science.gov (United States)

    Fisher, Jason C; Godfried, David H; Lighter-Fisher, Jennifer; Pratko, Joseph; Sheldon, Mary Ellen; Diago, Thelma; Kuenzler, Keith A; Tomita, Sandra S; Ginsburg, Howard B

    2016-06-01

    Quality improvement (QI) bundles have been widely adopted to reduce surgical site infections (SSI). Improvement science suggests when organizations achieve high-reliability to QI processes, outcomes dramatically improve. However, measuring QI process compliance is poorly supported by electronic health record (EHR) systems. We developed a custom EHR tool to facilitate capture of process data for SSI prevention with the aim of increasing bundle compliance and reducing adverse events. Ten SSI prevention bundle processes were linked to EHR data elements that were then aggregated into a snapshot display superimposed on weekly case-log reports. The data aggregation and user interface facilitated efficient review of all SSI bundle elements, providing an exact bundle compliance rate without random sampling or chart review. Nine months after implementation of our custom EHR tool, we observed centerline shifts in median SSI bundle compliance (46% to 72%). Additionally, as predicted by high reliability principles, we began to see a trend toward improvement in SSI rates (1.68 to 0.87 per 100 operations), but a discrete centerline shift was not detected. Simple informatics solutions can facilitate extraction of QI process data from the EHR without relying on adjunctive systems. Analyses of these data may drive reductions in adverse events. Pediatric surgical departments should consider leveraging the EHR to enhance bundle compliance as they implement QI strategies. Copyright © 2016 Elsevier Inc. All rights reserved.
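
    A minimal sketch of the aggregation step, assuming pandas and hypothetical column names: bundle compliance is simply the logical AND of the element indicators pulled from the EHR, which yields an exact compliance rate with no random sampling or chart review.

        import pandas as pd

        # Hypothetical weekly case log: one row per operation, one Boolean
        # column per SSI-prevention bundle element extracted from the EHR.
        cases = pd.DataFrame({
            "case_id": [101, 102, 103, 104],
            "abx_timing_ok": [True, True, False, True],
            "normothermia_ok": [True, True, True, True],
            "glycemia_ok": [True, False, True, True],
        })

        elements = [c for c in cases.columns if c != "case_id"]
        cases["bundle_compliant"] = cases[elements].all(axis=1)

        print(f"exact bundle compliance: {cases['bundle_compliant'].mean():.0%}")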

  17. Joint Models of Longitudinal and Time-to-Event Data with More Than One Event Time Outcome: A Review.

    Science.gov (United States)

    Hickey, Graeme L; Philipson, Pete; Jorgensen, Andrea; Kolamunnage-Dona, Ruwanthi

    2018-01-31

    Methodological development and clinical application of joint models of longitudinal and time-to-event outcomes have grown substantially over the past two decades. However, much of this research has concentrated on a single longitudinal outcome and a single event time outcome. In clinical and public health research, patients who are followed up over time may often experience multiple, recurrent, or a succession of clinical events. Models that utilise such multivariate event time outcomes are quite valuable in clinical decision-making. We comprehensively review the literature for implementation of joint models involving more than a single event time per subject. We consider the distributional and modelling assumptions, including the association structure, estimation approaches, software implementations, and clinical applications. Research into this area is proving highly promising, but to date remains in its infancy.

  18. Digital data recording system for the 4πβ-γ coincidence apparatus

    International Nuclear Information System (INIS)

    Shaha, V.V.; Srivastava, P.K.

    1975-01-01

    The data recording system for the 4πβ-γ coincidence apparatus consists of three scalers, a timer, a day-clock, a print control unit and a Hewlett-Packard printer. The print control unit serves as an interface unit as well as generates necessary electronic commands for starting, scanning, recycling and actuating the printer. It also generates the run number and identification number. It has made the data recording and recycling completely automatic. The report describes the data recording system which has been in continuous use since March 1973. Brief description of the scalers, the timer, the day-clock and the printer is given. The print control unit is described and the working of the data handling, scanning and cycle counting sections is explained. (author)

  19. ISVASE: identification of sequence variant associated with splicing event using RNA-seq data.

    Science.gov (United States)

    Aljohi, Hasan Awad; Liu, Wanfei; Lin, Qiang; Yu, Jun; Hu, Songnian

    2017-06-28

    Precise and efficient exon recognition and splicing by the spliceosome is the key to generating mature mRNAs. About one third to one half of disease-related mutations affect RNA splicing. The software PVAAS has been developed to identify variants associated with aberrant splicing directly from RNA-seq data. However, it is based on the assumption that an annotated splicing site represents normal splicing, which is not always true. We developed ISVASE, a tool for specifically identifying sequence variants associated with splicing events (SVASE) using RNA-seq data. Compared with PVAAS, our tool has several advantages: multi-pass stringent rule-dependent and statistical filters, use of split-reads only, independent sequence variant identification in each part of a splicing junction, sequence variant detection for both known and novel splicing events, additional exon-exon junction shift event detection if known splicing events are provided, splicing signal evaluation, support for known DNA mutation and/or RNA editing data, higher precision and consistency, and short running time. Using a realistic RNA-seq dataset, we performed a case study to illustrate the functionality and effectiveness of our method. Moreover, the output SVASEs can be used for downstream analysis such as splicing regulatory element studies and sequence variant functional analysis. ISVASE is useful for researchers interested in sequence variants (DNA mutations and/or RNA editing) associated with splicing events. The package is freely available at https://sourceforge.net/projects/isvase/ .

  20. NOvA Event Building, Buffering and Data-Driven Triggering From Within the DAQ System

    Energy Technology Data Exchange (ETDEWEB)

    Fischler, M. [Fermilab; Green, C. [Fermilab; Kowalkowski, J. [Fermilab; Norman, A. [Fermilab; Paterno, M. [Fermilab; Rechenmacher, R. [Fermilab

    2012-06-22

    To make its core measurements, the NOvA experiment needs to make real-time data-driven decisions involving beam-spill time correlation and other triggering issues. NOvA-DDT is a prototype data-driven triggering system, built using the Fermilab artdaq generic DAQ/event-building toolkit. This provides the advantages of sharing online software infrastructure with other Intensity Frontier experiments, and of being able to use any offline analysis module, unchanged, as a component of the online triggering decisions. The NOvA-artdaq architecture chosen has significant advantages, including graceful degradation if the triggering decision software fails or cannot finish quickly enough for some fraction of the time-slice "events." We have tested and measured the performance and overhead of NOvA-DDT using an actual Hough-transform-based trigger decision module taken from the NOvA offline software. The results of these tests, a mean time of 98 ms per event using only 1/16 of the available processing power of a node and overheads of about 2 ms per event, provide a proof of concept: NOvA-DDT is a viable strategy for data acquisition, event building, and trigger processing at the NOvA far detector.

  1. Creating personalized memories from social events: community-based support for multi-camera recordings of school concerts

    NARCIS (Netherlands)

    R.L. Guimarães (Rodrigo); P.S. Cesar Garcia (Pablo Santiago); D.C.A. Bulterman (Dick); V. Zsombori; I. Kegel

    2011-01-01

    The wide availability of relatively high-quality cameras makes it easy for many users to capture video fragments of social events such as concerts, sports events or community gatherings. The wide availability of simple sharing tools makes it nearly as easy to upload individual fragments

  2. Identification of incident poisoning, fracture and burn events using linked primary care, secondary care and mortality data from England: implications for research and surveillance.

    Science.gov (United States)

    Baker, Ruth; Tata, Laila J; Kendrick, Denise; Orton, Elizabeth

    2016-02-01

    English national injury data collection systems are restricted to hospitalisations and deaths. With recent linkage of a large primary care database, the Clinical Practice Research Datalink (CPRD), with secondary care and mortality data, we aimed to assess the utility of linked data for injury research and surveillance by examining recording patterns and comparing incidence of common injuries across data sources. The incidence of poisonings, fractures and burns was estimated for a cohort of 2 147 853 0-24 year olds using CPRD linked to Hospital Episode Statistics (HES) and Office for National Statistics (ONS) mortality data between 1997 and 2012. Time-based algorithms were developed to identify incident events, distinguishing between repeat follow-up records for the same injury and those for a new event. We identified 42 985 poisoning, 185 517 fracture and 36 719 burn events in linked CPRD-HES-ONS data; incidence rates were 41.9 per 10 000 person-years (95% CI 41.4 to 42.4), 180.8 (179.8-181.7) and 35.8 (35.4-36.1), respectively. Of the injuries, 22 628 (53%) poisonings, 139 662 (75%) fractures and 33 462 (91%) burns were only recorded within CPRD. Only 16% of deaths from poisoning (n=106) or fracture (n=58) recorded in ONS were recorded within CPRD and/or HES records. None of the 10 deaths from burns were recorded in CPRD or HES records. It is essential to use linked primary care, hospitalisation and deaths data to estimate injury burden, as many injury events are only captured within a single data source. Linked routinely collected data offer an immediate and affordable mechanism for injury surveillance and analyses of population-based injury epidemiology in England.
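
    A time-based algorithm of the kind described, separating follow-up records for the same injury from genuinely new events, might look like the sketch below; the 90-day same-event window and the record layout are assumptions for illustration, not the authors' published rules.

        from datetime import date, timedelta

        # Hypothetical records for one patient: (injury_type, record_date),
        # pooled from primary care, hospitalisation and mortality sources.
        records = [
            ("fracture", date(2010, 1, 5)),
            ("fracture", date(2010, 1, 20)),   # follow-up for the same fracture
            ("fracture", date(2010, 9, 1)),    # far enough apart: a new event
            ("burn",     date(2010, 9, 3)),    # different injury type: new event
        ]

        WASHOUT = timedelta(days=90)  # assumed same-event window

        def incident_events(records):
            """Keep a record only if no record of the same injury type
            occurred within the washout window before it."""
            last_seen = {}
            events = []
            for injury, when in sorted(records, key=lambda r: r[1]):
                if injury not in last_seen or when - last_seen[injury] > WASHOUT:
                    events.append((injury, when))
                last_seen[injury] = when   # follow-ups extend the window
            return events

        print(incident_events(records))
        # fracture on 2010-01-05, fracture on 2010-09-01, burn on 2010-09-03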

  3. Exploring data sources for road traffic injury in Cameroon: Collection and completeness of police records, newspaper reports, and a hospital trauma registry.

    Science.gov (United States)

    Juillard, Catherine; Kouo Ngamby, Marquise; Ekeke Monono, Martin; Etoundi Mballa, Georges Alain; Dicker, Rochelle A; Stevens, Kent A; Hyder, Adnan A

    2017-12-01

    Road traffic injury surveillance systems are a cornerstone of organized efforts at injury control. Although high-income countries rely on established trauma registries and police databases, in low- and middle-income countries it is unclear which data source provides the best collection of road traffic injury events in contexts without mature surveillance systems. The objective of this study was to compare the information available on road traffic injuries in 3 data sources used for surveillance in the sub-Saharan African country of Cameroon, providing potential insight on data sources for road traffic injury surveillance in low- and middle-income countries. We assessed the number of events captured and the information available in Yaoundé, Cameroon, from 3 separate sources of data on road traffic injuries: a trauma registry, police records, and newspapers. Data were collected from a single-hospital trauma registry, police records, and the 6 most widely circulated newspapers in Yaoundé during a 6-month period in 2009. The number of road traffic injury events, mortality, and other variables commonly included in injury surveillance systems were recorded, and the sources were compared using descriptive analysis. Hospital, police, and newspaper sources recorded 1,686, 273, and 480 road traffic injuries, respectively. The trauma registry provided the most complete data for the majority of variables explored; however, the newspaper data source captured 2 mass-casualty train crash events unrecorded in the other sources, and police data provided the most complete information on first responders to the scene, missing in only 7%. Investing in the hospital-based trauma registry may yield the best surveillance for road traffic injuries in some low- and middle-income countries, such as Yaoundé, Cameroon; however, police and newspaper reports may serve as alternative data sources when specific information is needed.

  4. Migrant Student Record Transfer System (MSRTS) [machine-readable data file].

    Science.gov (United States)

    Arkansas State Dept. of Education, Little Rock. General Education Div.

    The Migrant Student Record Transfer System (MSRTS) machine-readable data file (MRDF) is a collection of education and health data on more than 750,000 migrant children in grades K-12 in the United States (except Hawaii), the District of Columbia, and the outlying territories of Puerto Rico and the Mariana and Marshall Islands. The active file…

  5. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    Science.gov (United States)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow easy access to the data both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case of our project, we are leveraging existing software to explore the application of the data-cube concept to machine learning for the purpose of detecting water cycle extreme events, a specific case of anomaly detection that requires time series data. We investigate the use of support vector machines (SVM) for anomaly classification and show an example of detection of water cycle extreme events using data from the Tropical Rainfall Measuring Mission (TRMM).
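
    A sketch of the two ideas combined, slicing a point time series ("data rod") out of a stack of time-step arrays and flagging extremes with an SVM, is given below; the data are synthetic stand-ins for TRMM, and a one-class SVM is one plausible reading of the anomaly-classification approach.

        import numpy as np
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(0)

        # A "data cube": time-step arrays stacked as (time, lat, lon).
        cube = rng.gamma(shape=2.0, scale=3.0, size=(365, 50, 50))
        cube[200:203, 10, 20] += 60.0          # inject a short extreme-rain episode

        # A "data rod": the full time series for one grid point,
        # exactly the access pattern the cube layout makes cheap.
        rod = cube[:, 10, 20]

        # A one-class SVM learns the envelope of "normal" values; points
        # outside it are flagged as candidate extreme events.
        X = rod.reshape(-1, 1)
        clf = OneClassSVM(nu=0.01, kernel="rbf", gamma="scale").fit(X)
        flags = clf.predict(X)                 # +1 normal, -1 anomalous

        print(np.where(flags == -1)[0])        # should include days 200-202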

  6. Experimental Seismic Event-screening Criteria at the Prototype International Data Center

    Science.gov (United States)

    Fisk, M. D.; Jepsen, D.; Murphy, J. R.

    Experimental seismic event-screening capabilities are described, based on the difference of body- and surface-wave magnitudes (denoted as Ms:mb) and event depth. These capabilities have been implemented and tested at the prototype International Data Center (PIDC), based on recommendations by the IDC Technical Experts on Event Screening in June 1998. Screening scores are presented that indicate numerically the degree to which an event meets, or does not meet, the Ms:mb and depth screening criteria. Seismic events are also categorized as onshore, offshore, or mixed, based on their 90% location error ellipses and an onshore/offshore grid with five-minute resolution, although this analysis is not used at this time to screen out events. Results are presented of applications to almost 42,000 events with mb>=3.5 in the PIDC Standard Event Bulletin (SEB) and to 121 underground nuclear explosions (UNEs) at the U.S. Nevada Test Site (NTS), the Semipalatinsk and Novaya Zemlya test sites in the Former Soviet Union, the Lop Nor test site in China, and the Indian, Pakistan, and French Polynesian test sites. The screening criteria appear to be quite conservative: none of the known UNEs are screened out, while about 41 percent of the presumed earthquakes in the SEB with mb>=3.5 are screened out. UNEs at the Lop Nor, Indian, and Pakistan test sites on 8 June 1996, 11 May 1998, and 28 May 1998, respectively, have among the lowest Ms:mb scores of all events in the SEB. To assess the validity of the depth screening results, comparisons are presented of SEB depth solutions to those in other bulletins that are presumed to be reliable and independent. Using over 1600 events, the comparisons indicate that the SEB depth confidence intervals are consistent with or shallower than over 99.8 percent of the corresponding depth estimates in the other bulletins. Concluding remarks are provided regarding the performance of the experimental event-screening criteria, and plans for future
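
    The magnitude-based part of such screening can be illustrated with a small function; the threshold below is a placeholder rather than the actual PIDC criterion, which also folds in measurement uncertainty and the depth test.

        def msmb_screening_score(ms, mb, threshold=0.0):
            """Positive score: Ms is large relative to mb, earthquake-like,
            and the event meets the screening criterion. Negative score:
            relatively small Ms for the given mb, explosion-like, so the
            event is not screened out. The threshold is illustrative only,
            not the value used at the PIDC."""
            return (ms - mb) - threshold

        def screen_event(ms, mb, threshold=0.0):
            return msmb_screening_score(ms, mb, threshold) > 0

        # Shallow earthquakes are typically rich in surface-wave energy
        # (large Ms for a given mb); underground explosions are not.
        print(screen_event(ms=4.8, mb=4.2))   # True  -> screened out, earthquake-like
        print(screen_event(ms=3.4, mb=4.5))   # False -> retained for further analysis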

  7. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials, as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven, population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by the age of clinical trial patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events, defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov; 186,339 clinical studies were analyzed, and data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight age groups to support cross-age-group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest-incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years group was 31.41%, approximately 1.51 times higher (P=.04) than that of the young adult group (20-29 years) at 20.76%. The second-highest group was the 50-59 years group, with an incidence rate of 30.09%, also significantly higher than the reference group. The adverse event diversity also increased with patient age: clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events than studies of younger age groups.
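
    Comparing an age group's incidence rate with the reference group from summarized counts alone can be done with scipy's t test for summary statistics, as sketched below; the counts are illustrative, and treating each participant's adverse-event status as a 0/1 outcome is an assumption of the sketch.

        import numpy as np
        from scipy.stats import ttest_ind_from_stats

        def rate_stats(n_affected, n_total):
            """Mean and standard deviation of a 0/1 adverse-event indicator."""
            p = n_affected / n_total
            return p, np.sqrt(p * (1 - p))

        # Hypothetical summarized counts for two age groups.
        p_young, sd_young = rate_stats(n_affected=20_760, n_total=100_000)  # 20.76%
        p_child, sd_child = rate_stats(n_affected=31_410, n_total=100_000)  # 31.41%

        t, pval = ttest_ind_from_stats(p_child, sd_child, 100_000,
                                       p_young, sd_young, 100_000)
        print(f"t = {t:.1f}, p = {pval:.3g}")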

  8. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality studies and weather condition monitoring. In such infrastructures, many sensor nodes are distributed over a specific area, and each individual sensor node can measure several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise; data quality is therefore a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
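
    The window-ranking idea, scoring each time window of one node against the behaviour of its neighbours, can be reduced to a toy version like the one below; a real spatial-temporal model is far richer than the neighbour-median baseline used here.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical temperature streams: 4 nearby sensor nodes, 500 steps.
        streams = rng.normal(20.0, 0.5, size=(4, 500))
        streams[2, 300:320] += 5.0            # fault or local event on node 2

        def rank_anomalous_windows(streams, node, width=20):
            """Score each window of one node by its deviation from the
            median of the other nodes, normalised by their spread."""
            others = np.delete(streams, node, axis=0)
            expected = np.median(others, axis=0)
            spread = np.std(others, axis=0) + 1e-9
            z = np.abs(streams[node] - expected) / spread
            n_win = streams.shape[1] // width
            scores = z[: n_win * width].reshape(n_win, width).mean(axis=1)
            return np.argsort(scores)[::-1]    # window indices, most anomalous first

        print(rank_anomalous_windows(streams, node=2)[:3])   # window 15 ranks first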

  9. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    Science.gov (United States)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research such as case study analysis and climatology studies involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files [granules] from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm to filter out non-relevant information and data.

  10. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery carries a high risk for the occurrence of adverse events (AE). The main objective of this study was to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational, descriptive, retrospective study of patients admitted to General Surgery in a tertiary hospital and undergoing surgery in 2012. Adverse events were identified by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools, and the Hanley and McNeil test was used to compare them. AE prevalence was 36.8%. The Trigger tool detected 89.9% of all AE, while the MBDS detected 28.48%; the Trigger tool also provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the Trigger tool and 0.66 for the MBDS, a statistically significant difference (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies.
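
    The Hanley and McNeil comparison can be sketched from their published standard-error formula, as below; the patient counts are illustrative, and this simplified version ignores the correlation induced by applying both tools to the same patients, which the original paired test accounts for.

        import numpy as np
        from scipy.stats import norm

        def hanley_mcneil_se(auc, n_pos, n_neg):
            """Standard error of an AUC (Hanley & McNeil, 1982)."""
            q1 = auc / (2 - auc)
            q2 = 2 * auc**2 / (1 + auc)
            var = (auc * (1 - auc)
                   + (n_pos - 1) * (q1 - auc**2)
                   + (n_neg - 1) * (q2 - auc**2)) / (n_pos * n_neg)
            return np.sqrt(var)

        def compare_aucs(auc1, auc2, n_pos, n_neg):
            """z test on the difference of two AUCs, treated as independent."""
            se = np.hypot(hanley_mcneil_se(auc1, n_pos, n_neg),
                          hanley_mcneil_se(auc2, n_pos, n_neg))
            z = (auc1 - auc2) / se
            return z, 2 * norm.sf(abs(z))

        # Illustrative counts in the spirit of the study (AE prevalence ~37%).
        z, p = compare_aucs(auc1=0.89, auc2=0.66, n_pos=150, n_neg=250)
        print(f"z = {z:.2f}, p = {p:.2g}")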

  11. The use of MP3 recorders to log data from equine hoof mounted accelerometers.

    Science.gov (United States)

    Parsons, K J; Wilson, A M

    2006-11-01

    MP3 recorders are readily available, small, lightweight and low cost, providing the potential for logging analogue hoof-mounted accelerometer signals for the characterisation of equine locomotion. These, however, require testing in practice. The objectives were to test whether 1) multiple MP3 recorders can maintain synchronisation, giving the ability to synchronise independent recorders for the logging of multiple limbs simultaneously; and 2) features of a foot-mounted accelerometer signal attributable to foot-on and foot-off can be accurately identified from horse foot-mounted accelerometers logged directly into an MP3 recorder. Three experiments were performed: 1) Maintenance of synchronisation was assessed by counting the number of samples recorded by each of 4 MP3 recorders while mounted on a trotting horse and over 2 consecutive 30 min periods in 8 recorders on a bench. 2) Foot-on and foot-off times obtained from manual transcription of MP3 logged data and directly logged accelerometer signal were compared. 3) MP3/accelerometer acquisition units were used to log accelerometer signals from racehorses during extended training sessions. Mean absolute synchronisation error between MP3 recorders was 10 samples per million (range 1-32 samples per million), and error accumulation showed a linear correlation with time. Features attributable to foot-on and foot-off were equally identifiable from the MP3 recorded signal over a range of equine gaits. Multiple MP3 recorders can be synchronised and used as a relatively cheap, robust, reliable and accurate logging system when combined with an accelerometer and external battery for the specific application of the measurement of stride timing variables across the range of equine gaits during field locomotion. Footfall timings can be used to identify intervals between the fore and hind contacts, the identification of diagonal advanced placement and to calculate stride timing variables (stance time, protraction

  12. The occurrence of adverse events potentially attributable to nursing care in medical units: cross sectional record review.

    Science.gov (United States)

    D'Amour, Danielle; Dubois, Carl-Ardy; Tchouaket, Eric; Clarke, Sean; Blais, Régis

    2014-06-01

    Ensuring the safety of hospitalized patients remains a major challenge for healthcare systems, and nursing services are at the center of hospital care. Yet our knowledge about safety of nursing care is quite limited. In fact, most earlier studies examined one, or at most two, indicators, thus presenting an incomplete picture of safety at an institutional or broader level. Furthermore, methodologies have differed from one study to another, making benchmarking difficult. The aim of this study was to describe the frequencies of six adverse events widely considered in the literature to be nursing-sensitive outcomes and to estimate the degree to which these events could be attributed to nursing care. Cross-sectional review of charts of 2699 patients hospitalized on 22 medical units in 11 hospitals in Quebec, Canada. The events included: pressure sores, falls, medication administration errors, pneumonias, urinary infections, and inappropriate use of restraints. Experienced nurse reviewers abstracted patients' charts based on a grid developed for the study. Patient-level risk for at least one of these six adverse events was 15.3%, ranging from 9% to 28% across units. Of the 412 patients who experienced an event, 30% experienced two or more, for a total of 568 events. The risk of experiencing an adverse event with consequences was 6.2%, with a unit-level range from 3.2% to 13.5%. Abstractors concluded that 76.8% of the events were attributable to nursing care. While the measurement approach adopted here has limitations stemming from reliance on review of documentation, it provided a practical means of assessing several nursing-sensitive adverse events simultaneously. Given that patient safety issues are so complex, tracking their prevalence and impact is important, as is finding means of evaluating progress in reducing them.

  13. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Science.gov (United States)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  14. Identification of major cardiovascular events in patients with diabetes using primary care data

    NARCIS (Netherlands)

    Pouwels, Koen Bernardus; Voorham, Jaco; Hak, Eelko; Denig, Petra

    2016-01-01

    Background: Routine primary care data are increasingly being used for evaluation and research purposes but there are concerns about the completeness and accuracy of diagnoses and events captured in such databases. We evaluated how well patients with major cardiovascular disease (CVD) can be

  15. 75 FR 16140 - Common Formats for Patient Safety Data Collection and Event Reporting

    Science.gov (United States)

    2010-03-31

    ... FR 45457-45458. Definition of Common Formats: The term "Common Formats" is used to describe clinical... DEPARTMENT OF HEALTH AND HUMAN SERVICES, Agency for Healthcare Research and Quality. Common Formats for Patient Safety Data Collection and Event Reporting. AGENCY: Agency for Healthcare Research and...

  16. -Omic and Electronic Health Record Big Data Analytics for Precision Medicine.

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2017-02-01

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, with a long-lasting societal impact.

  17. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcome. It has long-lasting societal impact. PMID:27740470

  18. Cloud-Assisted UAV Data Collection for Multiple Emerging Events in Distributed WSNs.

    Science.gov (United States)

    Cao, Huiru; Liu, Yongxin; Yue, Xuejun; Zhu, Wenjian

    2017-08-07

    In recent years, UAVs (Unmanned Aerial Vehicles) have been widely applied for data collection and image capture. Specifically, UAVs have been integrated with wireless sensor networks (WSNs) to create data collection platforms with high flexibility. However, most studies in this domain focus on system architecture and UAV flight trajectory planning, while event-related factors and other important issues are neglected. To address these challenges, we propose a cloud-assisted data gathering strategy for UAV-based WSNs in the light of emerging events, along with a cloud-assisted approach for deriving a UAV's optimal flight and data acquisition sequence over a WSN cluster. We validate our approach through simulations and experiments, which show that our methodology outperforms conventional approaches in terms of flying time, energy consumption, and integrity of data acquisition. We also conducted a real-world experiment in which a UAV collected data wirelessly from multiple clusters of sensor nodes deployed on a farm to monitor an emerging event. Compared with the traditional method, the proposed approach required less than half the flying time and achieved almost perfect data integrity.
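
    A baseline for deriving a flying and data-acquisition sequence over cluster heads is a nearest-neighbour ordering, sketched below; the coordinates are invented, and the paper's cloud-assisted optimisation additionally weighs event urgency and energy, which this toy heuristic ignores.

        import math

        # Hypothetical cluster-head coordinates (metres) in a monitored farm.
        clusters = {"A": (0, 0), "B": (120, 40), "C": (60, 200), "D": (300, 80)}

        def nearest_neighbour_route(clusters, start):
            """Visit each cluster head once, always flying to the closest
            unvisited one; returns the visiting order and total distance."""
            route, total = [start], 0.0
            pos, todo = clusters[start], set(clusters) - {start}
            while todo:
                nxt = min(todo, key=lambda c: math.dist(pos, clusters[c]))
                total += math.dist(pos, clusters[nxt])
                pos = clusters[nxt]
                route.append(nxt)
                todo.remove(nxt)
            return route, total

        route, dist = nearest_neighbour_route(clusters, start="A")
        print(route, f"{dist:.0f} m")   # ['A', 'B', 'C', 'D'] 566 m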

  19. Gridded sunshine duration climate data record for Germany based on combined satellite and in situ observations

    Science.gov (United States)

    Walawender, Jakub; Kothe, Steffen; Trentmann, Jörg; Pfeifroth, Uwe; Cremer, Roswitha

    2017-04-01

    The purpose of this study is to create a 1 km2 gridded daily sunshine duration data record for Germany covering the period from 1983 to 2015 (33 years), based on satellite estimates of direct normalised surface solar radiation and in situ sunshine duration observations, using a geostatistical approach. The CM SAF SARAH direct normalized irradiance (DNI) satellite climate data record and in situ observations of sunshine duration from 121 weather stations operated by DWD are used as input datasets. The selected period of 33 years reflects the availability of satellite data. The number of ground stations is limited to 121 because only time series with less than 10% missing observations over the selected period are included, to keep the output sunshine duration data record consistent over the long term. In the first step, the DNI data record is used to derive sunshine hours by applying the WMO threshold of 120 W/m2 (SDU = DNI ≥ 120 W/m2), with weighting of sunny slots to correct the sunshine length between two instantaneous images for cloud movement. In the second step, a linear regression between SDU and in situ sunshine duration is calculated to adjust the satellite product to the ground observations, and the resulting regression coefficients are applied to create a regression grid. In the last step, the regression residuals are interpolated with ordinary kriging and added to the regression grid. A comprehensive accuracy assessment of the gridded sunshine duration data record is performed by calculating prediction errors (cross-validation routine); "R" is used for data processing. A short analysis of the spatial distribution and temporal variability of sunshine duration over Germany based on the created dataset will be presented. The gridded sunshine duration data are useful for applications in various climate-related studies, agriculture, and solar energy potential calculations.
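
    The first two processing steps translate naturally into code, as in the sketch below with synthetic half-hourly DNI slots; the slot weighting for cloud movement and the kriging of residuals are omitted.

        import numpy as np

        WMO_THRESHOLD = 120.0   # W/m2: a slot counts as "sunny" above this

        def sunshine_hours(dni_slots, slot_hours=0.5):
            """Daily sunshine duration from instantaneous DNI samples:
            SDU = (number of slots with DNI >= 120 W/m2) * slot length."""
            dni_slots = np.asarray(dni_slots)
            return np.count_nonzero(dni_slots >= WMO_THRESHOLD) * slot_hours

        rng = np.random.default_rng(2)
        satellite_sdu = np.array([sunshine_hours(rng.uniform(0, 900, 48))
                                  for _ in range(121)])       # one value per station
        station_sdu = 0.9 * satellite_sdu + 0.4 + rng.normal(0, 0.3, 121)

        # Step two: linear regression adjusting the satellite product to the
        # ground observations; the residuals would then be kriged onto the grid.
        slope, intercept = np.polyfit(satellite_sdu, station_sdu, 1)
        adjusted = slope * satellite_sdu + intercept
        print(f"SDU_adj = {slope:.2f} * SDU_sat + {intercept:.2f}")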

  20. An ontology-based method for secondary use of electronic dental record data

    Science.gov (United States)

    Schleyer, Titus KL; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P.; Liu, Kaihong; Hernandez, Pedro

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance. PMID:24303273

  1. An ontology-based method for secondary use of electronic dental record data.

    Science.gov (United States)

    Schleyer, Titus Kl; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P; Liu, Kaihong; Hernandez, Pedro

    2013-01-01

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance.
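
    The extraction step, SPARQL queries over an ontology-backed knowledge base, can be sketched with rdflib as below; the namespace IRI and class names are placeholders, not the actual OHD terms.

        from rdflib import Graph, Literal, Namespace, RDF

        # Placeholder namespace standing in for the Oral Health and Disease
        # Ontology; the real OHD IRIs and class names differ.
        OHD = Namespace("http://example.org/ohd/")

        g = Graph()
        patient, tooth = OHD.patient1, OHD.tooth3
        g.add((patient, RDF.type, OHD.Patient))
        g.add((tooth, RDF.type, OHD.Tooth))
        g.add((tooth, OHD.partOf, patient))
        g.add((tooth, OHD.hasRestoration, Literal("amalgam")))

        # One query pulls patients, teeth and restorations in a single pass.
        query = """
            PREFIX ohd: <http://example.org/ohd/>
            SELECT ?patient ?tooth ?restoration WHERE {
                ?tooth a ohd:Tooth ;
                       ohd:partOf ?patient ;
                       ohd:hasRestoration ?restoration .
            }
        """
        for row in g.query(query):
            print(row.patient, row.tooth, row.restoration)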

  2. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only covers company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach: the AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted five major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria, and negatively correlated with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation, showing that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool.
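
    The analysis pipeline, principal components of purchase data feeding a linear BMI model evaluated by leave-one-out cross-validation, can be sketched with scikit-learn as below; the data are synthetic, and the real study used five named dietary patterns plus covariates such as age and gender.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(3)
        X = rng.poisson(5, size=(200, 30)).astype(float)   # meal-item purchase counts
        bmi = 21 + 0.3 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0, 1.5, 200)

        # Dietary patterns via PCA, then a linear model of BMI on the patterns.
        model = make_pipeline(PCA(n_components=5), LinearRegression())

        # Leave-one-out cross-validated BMI predictions, as in the study.
        pred = cross_val_predict(model, X, bmi, cv=LeaveOneOut())

        # Accuracy at classifying "would-be obese" participants (BMI >= 23).
        accuracy = np.mean((pred >= 23) == (bmi >= 23))
        print(f"LOO classification accuracy: {accuracy:.1%}")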

  3. Exploitation of a component event data bank for common cause failure analysis

    International Nuclear Information System (INIS)

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises in which data were analyzed without computer aid, different types of linked multiple failures have been identified. A classification system is proposed based on this experience; it defines a multiple failure event space wherein each category defines causal, modal, temporal and structural links between failures. It is shown that a search algorithm incorporating the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that the use of the term common cause failure should be avoided, since it embodies too many different types of linked multiple failures

  4. Implementation of a Big Data Accessing and Processing Platform for Medical Records in Cloud.

    Science.gov (United States)

    Yang, Chao-Tung; Liu, Jung-Chun; Chen, Shuo-Tsung; Lu, Hsin-Wen

    2017-08-18

    Big data analysis has become a key factor in being innovative and competitive. Along with worldwide population growth and the aging trend in developed countries, the rate of national medical care usage has been increasing. Because individual medical data are usually scattered across different institutions and their data formats vary, integrating these ever-growing data is challenging, and the data platforms must be built on an architecture with scalable load capacity. Several issues must be considered in using cloud computing to quickly integrate big medical data into a database for easy analysis, searching, and filtering to obtain valuable information. This work builds a cloud storage system with HBase on Hadoop for storing and analyzing big data from medical records and improves the performance of importing data into the database. The medical record data are stored in the HBase database platform for big data analysis. The system performs distributed computing on medical record data through Hadoop MapReduce programming and provides functions including keyword search, data filtering, and basic statistics on the HBase database. It imports medical data using the single-threaded Put method and the CompleteBulkload mechanism. From the experimental results, we find that the single-threaded Put method performs better for files smaller than 300 MB, while the CompleteBulkload mechanism improves import performance for files larger than 300 MB. The system provides a web interface that allows users to search and filter data and to analyze and convert data into suitable forms that will be helpful for medical staff and institutions.
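
    The size-dependent import strategy can be sketched as a small dispatcher; this assumes the happybase client for the Put branch, the host name and file layout are hypothetical, and the bulk-load branch shows only the shape of the CompleteBulkload invocation rather than a full HFile-generation pipeline.

        import os
        import subprocess

        import happybase  # assumed Python HBase client

        SIZE_CUTOFF = 300 * 1024 * 1024   # 300 MB, the experimentally found threshold

        def import_medical_records(path, table_name="medical_records"):
            if os.path.getsize(path) < SIZE_CUTOFF:
                # Small file: single-threaded Put operations are fast enough.
                conn = happybase.Connection("hbase-master")   # hypothetical host
                table = conn.table(table_name)
                with table.batch(batch_size=1000) as batch:
                    with open(path, encoding="utf-8") as fh:
                        for line in fh:
                            row_key, record = line.rstrip("\n").split("\t", 1)
                            batch.put(row_key, {b"cf:record": record.encode()})
                conn.close()
            else:
                # Large file: pre-generate HFiles and hand them to HBase's
                # CompleteBulkload tool (HFile generation step not shown).
                subprocess.run(
                    ["hbase",
                     "org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles",
                     "/staging/hfiles", table_name],
                    check=True)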

  5. Dal record al dato. Linked data e ricerca dell’informazione nell’OPAC.

    Directory of Open Access Journals (Sweden)

    Antonella Iacono

    2013-12-01

    In this paper the author explores whether the new record, deconstructed and connected with other data on the Web, can facilitate the creation of knowledge in the use of the catalogue. The author then analyzes the potential of applying linked data to the catalogue with regard to research capabilities, new possibilities for semantic search, and ways of accessing bibliographic data.

  6. Satellite-based climate data records of surface solar radiation from the CM SAF

    Science.gov (United States)

    Trentmann, Jörg; Cremer, Roswitha; Kothe, Steffen; Müller, Richard; Pfeifroth, Uwe

    2017-04-01

    The incoming surface solar radiation has been defined as an essential climate variable by GCOS. Long term monitoring of this part of the earth's energy budget is required to gain insights on the state and variability of the climate system. In addition, climate data sets of surface solar radiation have received increased attention over the recent years as an important source of information for solar energy assessments, for crop modeling, and for the validation of climate and weather models. The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) is deriving climate data records (CDRs) from geostationary and polar-orbiting satellite instruments. Within the CM SAF these CDRs are accompanied by operational data at a short time latency to be used for climate monitoring. All data from the CM SAF is freely available via www.cmsaf.eu. Here we present the regional and the global climate data records of surface solar radiation from the CM SAF. The regional climate data record SARAH (Surface Solar Radiation Dataset - Heliosat, doi: 10.5676/EUM_SAF_CM/SARAH/V002) is based on observations from the series of Meteosat satellites. SARAH provides 30-min, daily- and monthly-averaged data of the effective cloud albedo, the solar irradiance (incl. spectral information), the direct solar radiation (horizontal and normal), and the sunshine duration from 1983 to 2015 for the full view of the Meteosat satellite (i.e, Europe, Africa, parts of South America, and the Atlantic ocean). The data sets are generated with a high spatial resolution of 0.05° allowing for detailed regional studies. The global climate data record CLARA (CM SAF Clouds, Albedo and Radiation dataset from AVHRR data, doi: 10.5676/EUM_SAF_CM/CLARA_AVHRR/V002) is based on observations from the series of AVHRR satellite instruments. CLARA provides daily- and monthly-averaged global data of the solar irradiance (SIS) from 1982 to 2015 with a spatial resolution of 0.25°. In addition to the solar surface

  7. Creating personalized memories from social events: Community-based support for multi-camera recordings of school concerts

    OpenAIRE

    Guimaraes R.L.; Cesar P.; Bulterman D.C.A.; Zsombori V.; Kegel I.

    2011-01-01

    The wide availability of relatively high-quality cameras makes it easy for many users to capture video fragments of social events such as concerts, sports events or community gatherings. The wide availability of simple sharing tools makes it nearly as easy to upload individual fragments to on-line video sites. Current work on video mashups focuses on the creation of a video summary based on the characteristics of individual media fragments, but it fails to address the interpersona...

  8. Higgs boson produced via vector boson fusion event recorded by CMS (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

    Real proton-proton collision event at 13 TeV in the CMS detector in which two high-energy electrons (green lines), two high-energy muons (red lines), and two high-energy jets (dark yellow cones) are observed. The event shows characteristics expected from Higgs boson production via vector boson fusion with subsequent decay of the Higgs boson into four leptons, and is also consistent with background standard model physics processes.

  9. The realization of the storage of XML and middleware-based data of electronic medical records

    International Nuclear Information System (INIS)

    Liu Shuzhen; Gu Peidi; Luo Yanlin

    2007-01-01

    In this paper, the technologies of XML and middleware are used to design and implement a unified electronic medical record storage and archive management system, and a common storage management model is given. XML is used to describe the structure of electronic medical records and to transform medical data from traditional 'business-centered' medical information into unified 'patient-centered' XML documents; middleware technology is used to shield the types of databases at different departments of the hospital and to complete the integration of medical data scattered across different databases, which is conducive to information sharing between hospitals. (authors)
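
    The transformation from 'business-centered' departmental rows to a 'patient-centered' XML document can be sketched with the standard library, as below; the field names are schematic, and a real EMR schema is far richer.

        import xml.etree.ElementTree as ET

        # Business-centered rows as they might arrive from separate
        # departmental databases (field names are schematic).
        lab_rows = [("p001", "glucose", "5.4 mmol/L")]
        radiology_rows = [("p001", "chest X-ray", "no abnormality")]

        def patient_centered_document(patient_id):
            """Gather every department's data about one patient into a
            single XML document keyed by the patient, not the department."""
            root = ET.Element("patient", id=patient_id)
            for pid, test, result in lab_rows:
                if pid == patient_id:
                    ET.SubElement(root, "labResult", test=test).text = result
            for pid, exam, finding in radiology_rows:
                if pid == patient_id:
                    ET.SubElement(root, "radiologyExam", exam=exam).text = finding
            return ET.tostring(root, encoding="unicode")

        print(patient_centered_document("p001"))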

  10. Evaluation of corrective action data for reportable events at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) approved the adoption of cause codes for reportable events as a new performance indicator (PI) in March 1989. Corrective action data associated with the causes of events were to be compiled as well; these data were considered supplemental information but not formally identified as a performance indicator. In support of the NRC, the Nuclear Operations Analysis Center (NOAC) at the Oak Ridge National Laboratory (ORNL) has routinely evaluated licensee event reports (LERs) for cause code and corrective action data since 1989. The compilation of corrective action data by NOAC represents the first systematic and comprehensive compilation of this type of data. The thrust of analyzing the corrective action data was to identify areas where licensees allocated resources to solve problems and prevent the recurrence of personnel errors and equipment failures. The predominant areas of corrective action reported by licensees are to be evaluated by the NRC and compared with NRC programs designed to improve plant performance. The corrective action codes used to correlate with individual cause codes and included in the analyses were: training, procedural modification, corrective discipline, management change, design modification, equipment replacement/adjustment, other, and unknown. 1 fig

  11. The tip of the iceberg: challenges of accessing hospital electronic health record data for biological data mining

    NARCIS (Netherlands)

    Denaxas, Spiros C; Asselbergs, Folkert W; Moore, Jason H

    2016-01-01

    Modern cohort studies include self-reported measures on disease, behavior and lifestyle, sensor-based observations from mobile phones and wearables, and rich -omics data. Follow-up is often achieved through electronic health record (EHR) linkages across primary and secondary healthcare providers.

  12. Persistent storage of non-event data in the CMS databases

    International Nuclear Information System (INIS)

    De Gruttola, M; Di Guida, S; Innocente, V; Schlatter, D; Futyan, D; Glege, F; Paolucci, P; Govi, G; Picca, P; Pierro, A; Xie, Z

    2010-01-01

    In the CMS experiment, the non-event data needed to set up the detector, produced by it, or needed to calibrate the physical responses of the detector itself are stored in ORACLE databases. The large amount of data to be stored, the number of clients involved, and the performance requirements make the database system an essential service for running the experiment. This note describes the CMS condition database architecture, the data flow, and PopCon, the tool built to populate the offline databases. Finally, the first experience obtained during the 2008 and 2009 cosmic data taking is presented.

  13. Ontology-Based Data Integration of Open Source Electronic Medical Record and Data Capture Systems

    Science.gov (United States)

    Guidry, Alicia F.

    2013-01-01

    In low-resource settings, the prioritization of clinical care funding is often determined by immediate health priorities. As a result, investment directed towards the development of standards for clinical data representation and exchange are rare and accordingly, data management systems are often redundant. Open-source systems such as OpenMRS and…

  14. Event Management of RFID Data Streams: Fast Moving Consumer Goods Supply Chains

    Science.gov (United States)

    Mo, John P. T.; Li, Xue

    Radio Frequency Identification (RFID) is a wireless communication technology that uses radio-frequency waves to transfer information between tagged objects and readers without line of sight. This creates tremendous opportunities for linking real-world objects into an "Internet of things". Applying RFID to the Fast Moving Consumer Goods sector will introduce billions of RFID tags into the world; almost everything will be tagged for tracking and identification purposes. This phenomenon will impose a new challenge not only on network capacity but also on the scalability of processing RFID events and data. This chapter uses two national demonstrator projects in Australia as case studies to introduce an event management framework that processes high-volume RFID data streams in real time and automatically transforms physical RFID observations into business-level events. The model handles various temporal event patterns, both simple and complex, with temporal constraints, and can be implemented in a data management architecture that allows global RFID item tracking and enables fast, large-scale RFID deployment.
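
    One temporal pattern of the kind the framework handles, a reader-A observation followed by a reader-B observation within a time constraint yielding a business-level event, can be sketched as below; the reader names and the 60-second constraint are invented for the example.

        # Stream of raw RFID observations: (tag_id, reader_id, unix_time).
        observations = [
            ("tag42", "dock_door", 1000.0),
            ("tag42", "truck_bay", 1035.0),   # within 60 s -> "shipped" event
            ("tag77", "dock_door", 1100.0),
            ("tag77", "truck_bay", 1300.0),   # too late: constraint violated
        ]

        MAX_GAP = 60.0   # seconds allowed between the two reads

        def detect_shipped_events(observations):
            """Turn low-level read pairs into business-level 'shipped' events."""
            last_dock_read = {}
            events = []
            for tag, reader, t in observations:
                if reader == "dock_door":
                    last_dock_read[tag] = t
                elif reader == "truck_bay" and tag in last_dock_read:
                    if t - last_dock_read[tag] <= MAX_GAP:
                        events.append(("shipped", tag, t))
                    del last_dock_read[tag]
            return events

        print(detect_shipped_events(observations))   # [('shipped', 'tag42', 1035.0)]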

  15. Automatic event-based synchronization of multimodal data streams from wearable and ambient sensors

    NARCIS (Netherlands)

    Bannach, D.; Amft, O.D.; Lukowicz, P.; Barnaghi, P.; Moessner, K.; Presser, M.; Meissner, S.

    2009-01-01

    A major challenge in using multi-modal, distributed sensor systems for activity recognition is maintaining temporal synchronization between individually recorded data streams. A common approach is to use well-defined 'synchronization actions' performed by the user to generate easily identifiable
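
    Recovering the fixed offset between two independently recorded streams from a shared synchronization action can be sketched with a cross-correlation, as below; the data are synthetic accelerometer-like streams, and real pipelines typically detect the action first and correlate only around it.

        import numpy as np

        rng = np.random.default_rng(4)
        fs = 100                         # Hz, assumed common sampling rate

        # A sharp "synchronization action" (e.g. a clap) seen by both sensors,
        # with the second stream lagging the first by 35 samples.
        action = np.hanning(20) * 5.0
        a = rng.normal(0, 0.2, 1000); a[400:420] += action
        b = rng.normal(0, 0.2, 1000); b[435:455] += action

        # The lag that maximises the cross-correlation is the estimated offset.
        corr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
        lag = np.argmax(corr) - (len(b) - 1)
        print(f"estimated offset: {-lag} samples = {-lag / fs:.2f} s")  # 35 samples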

  16. NOvA Event Building, Buffering and Data-Driven Triggering From Within the DAQ System

    International Nuclear Information System (INIS)

    Fischler, M; Rechenmacher, R; Green, C; Kowalkowski, J; Norman, A; Paterno, M

    2012-01-01

    The NOvA experiment is a long-baseline neutrino experiment designed to make precision probes of the structure of neutrino mixing. The experiment features a unique deadtimeless data acquisition system that is capable of acquiring and building an event data stream from the continuous readout of the more than 360,000 far detector channels. In order to achieve its physics goals, the experiment must be able to buffer, correlate and extract the data in this stream with the beam spills that occur at Fermilab. In addition, the NOvA experiment seeks to enhance its data collection efficiency for rare classes of event topologies that are valuable for calibration through the use of data-driven triggering. NOvA-DDT is a prototype data-driven triggering system developed using the Fermilab artdaq generic DAQ/event-building toolkit. This toolkit provides the advantages of sharing online software infrastructure with other Intensity Frontier experiments, and of being able to use any offline analysis module, unchanged, as a component of the online triggering decisions. We have measured the performance and overhead of the NOvA-DDT framework using a Hough-transform-based trigger decision module developed for the NOvA detector to identify cosmic rays. The results of these tests, which were run on the NOvA prototype near detector, yielded a mean processing time of 98 ms per event while consuming only 1/16th of the available processing capacity. These results provide a proof of concept that a NOvA-DDT-based processing system is a viable strategy for data acquisition and triggering for the NOvA far detector.
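
    The flavour of a Hough-transform trigger decision, letting hits vote for straight-line track candidates and triggering when one line collects enough votes, can be conveyed by a toy accumulator like the one below, which is far simpler than the NOvA module it stands in for.

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical hit positions in one time-slice: uniform noise plus a
        # straight horizontal cosmic-ray-like track at y = 30.
        noise = rng.uniform(0, 100, size=(40, 2))
        track = np.column_stack([np.linspace(0, 100, 30), np.full(30, 30.0)])
        hits = np.vstack([noise, track])

        def hough_trigger(hits, n_theta=180, rho_width=2.0, vote_threshold=25):
            """Vote each hit into discretised (theta, rho) space; trigger when
            any single line accumulates enough votes to look like a track."""
            thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
            rho = (hits[:, 0, None] * np.cos(thetas)
                   + hits[:, 1, None] * np.sin(thetas))
            rho_bins = np.round(rho / rho_width).astype(int)
            votes = {}
            for hit_idx, theta_idx in np.ndindex(rho_bins.shape):
                key = (theta_idx, rho_bins[hit_idx, theta_idx])
                votes[key] = votes.get(key, 0) + 1
            return max(votes.values()) >= vote_threshold

        print(hough_trigger(hits))   # True: the injected track fires the trigger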

  17. Global Trends in Chlorophyll Concentration Observed with the Satellite Ocean Colour Data Record

    Science.gov (United States)

    Melin, F.; Vantrepotte, V.; Chuprin, A.; Grant, M.; Jackson, T.; Sathyendranath, S.

    2016-08-01

    To detect climate change signals in the data records derived from remote sensing of ocean colour, data from multiple missions must be combined, which requires that inter-mission differences be adequately addressed before undertaking trend studies. Trend distributions associated with merged products are compared with those obtained from single-mission data sets in order to evaluate their suitability for climate studies. Merged products originally developed for operational applications such as near-real-time distribution (GlobColour) do not appear to be proper climate data records, showing large parts of the ocean with trends significantly different from trends obtained with SeaWiFS, MODIS or MERIS. On the other hand, results obtained from the Climate Change Initiative (CCI) data are encouraging, showing good consistency with single-mission products.

  18. The ESA climate change initiative: Satellite data records for essential climate variables

    DEFF Research Database (Denmark)

    Hollmann, R.; Merchant, C.J.; Saunders, R.

    2013-01-01

    The European Space Agency (ESA) has launched the Climate Change Initiative (CCI) to provide satellite-based climate data records (CDRs) that meet the challenging requirements of the climate community. The aim is to realize the full potential of the long-term Earth observation (EO) archives that both ESA and third parties have established. This includes aspects of producing a CDR, which involve data acquisition, calibration, algorithm development, validation, maintenance, and provision of the data to the climate research community. The CCI is consistent with several international efforts targeting the generation of satellite-derived climate data records. One focus of the CCI is to provide products for climate modelers, who increasingly use satellite data to initialize, constrain, and validate models on a wide range of space and time scales.

  19. SNPP CrIS Instrumental Status and Raw Data Record Quality Since the Mission

    Science.gov (United States)

    Jin, X.; Han, Y.; Sun, N.; Weng, F.; Wang, L.; Chen, Y.; Tremblay, D. A.

    2014-12-01

    The SNPP CrIS (Cross-track Infrared Sounder) has been in service for more than two years. As the first operational interferometric hyperspectral sounder onboard the new-generation polar-orbiting meteorological satellite, CrIS's instrumental performance and data quality are of wide concern. The NOAA/NESDIS/STAR CrIS Cal/Val team has been actively involved since the beginning of the mission, and a complete record of the CrIS instrumental performance and raw data record (RDR) quality has been established. In this presentation, continuous records of critical indicators such as noise, gain, laser wavelength drift, and other parameters related to the internal thermal status are presented. The hardware performance has been extremely stable over the past two years and the degradation is very small; these features make CrIS a strong candidate for long-term climate studies. Moreover, the completeness of the RDR data is another advantage of using CrIS for climate studies. NOAA/NESDIS/STAR has recorded all CrIS RDR data since launch and is dedicated to improving the data quality.

  20. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    International Nuclear Information System (INIS)

    Perez, Rafael B.; Protopopescu, Vladimir A.; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method
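
    The decomposition-based detection idea can be sketched with a third-party EMD implementation, as below; the PyEMD package (pip install EMD-signal) is assumed, and scoring the short-window energy of the fastest mode is one simple choice of event statistic among many.

        import numpy as np
        from PyEMD import EMD   # assumed third-party package: pip install EMD-signal

        rng = np.random.default_rng(6)
        t = np.linspace(0, 10, 2000)

        # Non-stationary baseline (drifting oscillation) plus noise, with a
        # brief high-frequency burst hidden near t = 6 s.
        signal = np.sin(2 * np.pi * (0.5 + 0.05 * t) * t) + rng.normal(0, 0.3, t.size)
        signal[1200:1260] += 0.8 * np.sin(2 * np.pi * 40 * t[1200:1260])

        # Empirical mode decomposition: IMF 0 carries the fastest oscillations,
        # where an intermittent event shows up as a local energy excess.
        imfs = EMD().emd(signal)
        fast = imfs[0]

        window = 50
        energy = np.convolve(fast**2, np.ones(window) / window, mode="same")
        print("burst near index:", int(np.argmax(energy)))   # expected near 1230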