WorldWideScience

Sample records for event data recorders

  1. 76 FR 47478 - Event Data Recorders

    Science.gov (United States)

    2011-08-05

    ... medical community, without imposing unnecessary burdens or deterring future improvements to EDRs that have... braking system (ABS) status, stability control status, and seat track position. The AIAM requested that... vehicle. He stated that devices are being offered to consumers to alter odometer readings, erase EDR data...

  2. Event metadata records as a testbed for scalable data mining

    Science.gov (United States)

    van Gemmeren, P.; Malon, D.

    2010-04-01

    At a data rate of 200 hertz, event metadata records ("TAGs," in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise "data mining," but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.
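    The abstract's point about one-pass algorithms over a fixed, simple event-metadata schema can be sketched as a streaming selection with incrementally updated statistics. The TAG fields below are illustrative, not the real ATLAS TAG schema.

```python
# One-pass (streaming) selection over event metadata records, in the
# spirit of the stream-mining tools the abstract mentions. Field names
# are illustrative, not the actual ATLAS TAG schema.
from dataclasses import dataclass

@dataclass
class Tag:
    run_number: int
    event_number: int
    n_jets: int
    missing_et_gev: float

def stream_select(tags, predicate):
    """Single pass over the event stream: filter and keep a running mean."""
    count, mean = 0, 0.0
    selected = []
    for t in tags:
        if predicate(t):
            count += 1
            mean += (t.missing_et_gev - mean) / count  # incremental mean
            selected.append(t)
    return selected, mean

tags = [Tag(1, i, i % 4, 10.0 * i) for i in range(1, 6)]
selected, mean_met = stream_select(tags, lambda t: t.n_jets >= 2)
# selects events 2 and 3; running mean of missing ET is 25.0 GeV
```

    Because the statistics are updated in a single pass, the same code could sit at any point in the event processing and distribution chain without buffering the full dataset.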

  3. Event metadata records as a testbed for scalable data mining

    Energy Technology Data Exchange (ETDEWEB)

    Gemmeren, P van; Malon, D, E-mail: gemmeren@anl.go [Argonne National Laboratory, Argonne, Illinois 60439 (United States)

    2010-04-01

    At a data rate of 200 hertz, event metadata records ('TAGs,' in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise 'data mining,' but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.

  4. THE USE OF EVENT DATA RECORDER (EDR – BLACK BOX)

    Directory of Open Access Journals (Sweden)

    Gabriel Nowacki

    2014-03-01

    The paper refers to the registration of road events by a modern device called EDR (black box) for all types of motor vehicles. The device records data concerning the vehicle's technical condition, the way it was driven, and RTS. The recorder may be used in private and commercial cars, taxis, buses and trucks. It may serve as a neutral witness for the police, courts and insurance firms, facilitating the reconstruction of road accidents and providing proof against those who caused them. The device encourages efficient driving, which will significantly contribute to decreasing the number of road accidents and limiting environmental pollution. Finally, last year the German parliament backed a proposal to the European Commission to fit black boxes, which gather information from vehicles involved in accidents, in all new cars from 2015 on.

  5. Video event data recording of a taxi driver used for diagnosis of epilepsy.

    Science.gov (United States)

    Sakurai, Kotaro; Yamamoto, Junko; Kurita, Tsugiko; Takeda, Youji; Kusumi, Ichiro

    2014-01-01

    A video event data recorder (VEDR) in a motor vehicle records images before and after a traffic accident. This report describes a taxi driver whose seizures were recorded by VEDR, which was extremely useful for the diagnosis of epilepsy. The patient was a 63-year-old right-handed Japanese male taxi driver. He collided with a streetlight. Two years prior to this incident, he raced an engine for a long time while parked. The VEDR enabled confirmation that the accidents depended on an epileptic seizure and he was diagnosed with symptomatic localization-related epilepsy. The VEDR is useful not only for traffic accident evidence; it might also contribute to a driver's health care and road safety.

  6. Video event data recording of a taxi driver used for diagnosis of epilepsy☆

    Science.gov (United States)

    Sakurai, Kotaro; Yamamoto, Junko; Kurita, Tsugiko; Takeda, Youji; Kusumi, Ichiro

    2014-01-01

    A video event data recorder (VEDR) in a motor vehicle records images before and after a traffic accident. This report describes a taxi driver whose seizures were recorded by VEDR, which was extremely useful for the diagnosis of epilepsy. The patient was a 63-year-old right-handed Japanese male taxi driver. He collided with a streetlight. Two years prior to this incident, he raced an engine for a long time while parked. The VEDR enabled confirmation that the accidents depended on an epileptic seizure and he was diagnosed with symptomatic localization-related epilepsy. The VEDR is useful not only for traffic accident evidence; it might also contribute to a driver's health care and road safety. PMID:25667862

  7. Video event data recording of a taxi driver used for diagnosis of epilepsy

    Directory of Open Access Journals (Sweden)

    Kotaro Sakurai

    2014-01-01

    A video event data recorder (VEDR) in a motor vehicle records images before and after a traffic accident. This report describes a taxi driver whose seizures were recorded by VEDR, which was extremely useful for the diagnosis of epilepsy. The patient was a 63-year-old right-handed Japanese male taxi driver. He collided with a streetlight. Two years prior to this incident, he raced an engine for a long time while parked. The VEDR enabled confirmation that the accidents depended on an epileptic seizure and he was diagnosed with symptomatic localization-related epilepsy. The VEDR is useful not only for traffic accident evidence; it might also contribute to a driver's health care and road safety.

  8. Common data elements for secondary use of electronic health record data for clinical trial execution and serious adverse event reporting.

    Science.gov (United States)

    Bruland, Philipp; McGilchrist, Mark; Zapletal, Eric; Acosta, Dionisio; Proeve, Johann; Askin, Scott; Ganslandt, Thomas; Doods, Justin; Dugas, Martin

    2016-11-22

    Data capture is one of the most expensive phases during the conduct of a clinical trial, and the increasing use of electronic health records (EHR) offers significant savings to clinical research. To facilitate these secondary uses of routinely collected patient data, it is beneficial to know what data elements are captured in clinical trials. Therefore, our aim here is to determine the most commonly used data elements in clinical trials and their availability in hospital EHR systems. Case report forms for 23 clinical trials in differing disease areas were analyzed. Through an iterative and consensus-based process of medical informatics professionals from academia and trial experts from the European pharmaceutical industry, data elements were compiled for all disease areas, with special focus on the reporting of adverse events. Afterwards, data elements were identified and statistics acquired from hospital sites providing data to the EHR4CR project. The analysis identified 133 unique data elements. Fifty elements were congruent with a published data inventory for patient recruitment and 83 new elements were identified for clinical trial execution, including adverse event reporting. Demographic and laboratory elements lead the list of available elements in hospital EHR systems. For the reporting of serious adverse events, only very few elements could be identified in the patient records. Common data elements in clinical trials have been identified and their availability in hospital systems elucidated. Several elements, often those related to reimbursement, are frequently available, whereas more specialized elements are ranked at the bottom of the data inventory list. Hospitals that want to obtain the benefits of reusing data for research from their EHR are now able to prioritize their efforts based on this common data element list.

  9. Common data elements for secondary use of electronic health record data for clinical trial execution and serious adverse event reporting

    Directory of Open Access Journals (Sweden)

    Philipp Bruland

    2016-11-01

    Background: Data capture is one of the most expensive phases during the conduct of a clinical trial, and the increasing use of electronic health records (EHR) offers significant savings to clinical research. To facilitate these secondary uses of routinely collected patient data, it is beneficial to know what data elements are captured in clinical trials. Therefore, our aim here is to determine the most commonly used data elements in clinical trials and their availability in hospital EHR systems. Methods: Case report forms for 23 clinical trials in differing disease areas were analyzed. Through an iterative and consensus-based process of medical informatics professionals from academia and trial experts from the European pharmaceutical industry, data elements were compiled for all disease areas, with special focus on the reporting of adverse events. Afterwards, data elements were identified and statistics acquired from hospital sites providing data to the EHR4CR project. Results: The analysis identified 133 unique data elements. Fifty elements were congruent with a published data inventory for patient recruitment and 83 new elements were identified for clinical trial execution, including adverse event reporting. Demographic and laboratory elements lead the list of available elements in hospital EHR systems. For the reporting of serious adverse events, only very few elements could be identified in the patient records. Conclusions: Common data elements in clinical trials have been identified and their availability in hospital systems elucidated. Several elements, often those related to reimbursement, are frequently available, whereas more specialized elements are ranked at the bottom of the data inventory list. Hospitals that want to obtain the benefits of reusing data for research from their EHR are now able to prioritize their efforts based on this common data element list.

  10. Genomic selection for producer-recorded health event data in US dairy cattle.

    Science.gov (United States)

    Parker Gaddis, K L; Cole, J B; Clay, J S; Maltecca, C

    2014-05-01

    Emphasizing increased profit through increased dairy cow production has revealed a negative relationship of production with fitness and health traits. Decreased cow health can affect herd profitability through increased rates of involuntary culling and decreased or lost milk sales. The development of genomic selection methodologies, with accompanying substantial gains in reliability for low-heritability traits, may dramatically improve the feasibility of genetic improvement of dairy cow health. Producer-recorded health information may provide a wealth of information for improvement of dairy cow health, thus improving profitability. The principal objective of this study was to use health data collected from on-farm computer systems in the United States to estimate variance components and heritability for health traits commonly experienced by dairy cows. A single-step analysis was conducted to estimate genomic variance components and heritabilities for health events, including cystic ovaries, displaced abomasum, ketosis, lameness, mastitis, metritis, and retained placenta. A blended H matrix was constructed for a threshold model with fixed effects of parity and year-season and random effects of herd-year and sire. The single-step genomic analysis produced heritability estimates that ranged from 0.02 (standard deviation = 0.005) for lameness to 0.36 (standard deviation = 0.08) for retained placenta. Significant genetic correlations were found between lameness and cystic ovaries, displaced abomasum and ketosis, displaced abomasum and metritis, and retained placenta and metritis. Sire reliabilities increased, on average, approximately 30% with the incorporation of genomic data. From the results of these analyses, it was concluded that genetic selection for health traits using producer-recorded data is feasible in the United States, and that the inclusion of genomic data substantially improves reliabilities for these traits. Copyright © 2014 American Dairy Science
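    For a threshold sire model like the one the abstract describes, heritability on the underlying liability scale is computed from the variance components as four times the sire variance over the total variance. A minimal sketch, using illustrative variance components rather than the paper's estimates:

```python
# Heritability on the underlying (liability) scale for a threshold sire
# model: h2 = 4 * var_sire / (var_sire + var_herd_year + var_resid).
# The variance components below are illustrative, not the paper's values.
def sire_model_heritability(var_sire, var_herd_year, var_resid=1.0):
    # Probit-style threshold models conventionally fix the residual
    # variance on the liability scale at 1.0.
    total = var_sire + var_herd_year + var_resid
    return 4.0 * var_sire / total

h2 = sire_model_heritability(var_sire=0.02, var_herd_year=0.10)
# with these toy components: 0.08 / 1.12, roughly 0.07
```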

  11. SIGMATA: Storage Integrity Guaranteeing Mechanism against Tampering Attempts for Video Event Data Recorders

    Directory of Open Access Journals (Sweden)

    Hyuckmin Kwon

    2016-04-01

    The usage and market size of video event data recorders (VEDRs), also known as car black boxes, are rapidly increasing. Since VEDRs can provide more visual information about car accident situations than any other device that is currently used for accident investigations (e.g., closed-circuit television), the integrity of the VEDR contents is important to any meaningful investigation. Researchers have focused on file system integrity or photographic approaches to integrity verification. However, unlike other general data, the video data in VEDRs exhibit a unique I/O behavior in that the videos are stored chronologically. In addition, the owners of VEDRs can manipulate unfavorable scenes after accidents to conceal their recorded behavior. Since prior art does not consider the time relationship between frames and fails to discover frame-wise forgery, a more detailed integrity assurance is required. In this paper, we focus on the development of a frame-wise forgery detection mechanism that resolves the limitations of previous mechanisms. We introduce SIGMATA, a novel storage integrity guaranteeing mechanism against tampering attempts for VEDRs. We describe its operation, demonstrate its effectiveness for detecting possible frame-wise forgery, and compare it with existing mechanisms. The result shows that the existing mechanisms fail to detect any frame-wise forgery, while our mechanism thoroughly detects every frame-wise forgery. We also evaluate its computational overhead using real VEDR videos. The results show that SIGMATA discovers frame-wise forgery attacks effectively and efficiently, with an encoding overhead of less than 1.5 milliseconds per frame.
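    The general idea of frame-wise integrity that exploits the chronological ordering of video frames can be sketched with a keyed hash chain. This is a generic illustration, not the published SIGMATA design; the key and frame bytes are stand-ins.

```python
# A generic hash-chain sketch of frame-wise integrity (illustrative, not
# the actual SIGMATA algorithm): each frame's tag binds the frame bytes
# to the previous frame's tag, so deleting, reordering, or editing any
# frame invalidates verification from that point on.
import hashlib
import hmac

KEY = b"device-secret"  # hypothetical per-device key

def tag_frames(frames):
    prev = b"\x00" * 32  # fixed initial chaining value
    tags = []
    for frame in frames:
        prev = hmac.new(KEY, prev + frame, hashlib.sha256).digest()
        tags.append(prev)
    return tags

def verify(frames, tags):
    return tags == tag_frames(frames)

frames = [b"frame0", b"frame1", b"frame2"]
tags = tag_frames(frames)
assert verify(frames, tags)
tampered = [b"frame0", b"FORGED", b"frame2"]
assert not verify(tampered, tags)  # frame-wise forgery is detected
```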

  12. Using Probabilistic Record Linkage of Structured and Unstructured Data to Identify Duplicate Cases in Spontaneous Adverse Event Reporting Systems.

    Science.gov (United States)

    Kreimeyer, Kory; Menschik, David; Winiecki, Scott; Paul, Wendy; Barash, Faith; Woo, Emily Jane; Alimchandani, Meghna; Arya, Deepa; Zinderman, Craig; Forshee, Richard; Botsis, Taxiarchis

    2017-07-01

    Duplicate case reports in spontaneous adverse event reporting systems pose a challenge for medical reviewers to efficiently perform individual and aggregate safety analyses. Duplicate cases can bias data mining by generating spurious signals of disproportional reporting of product-adverse event pairs. We have developed a probabilistic record linkage algorithm for identifying duplicate cases in the US Vaccine Adverse Event Reporting System (VAERS) and the US Food and Drug Administration Adverse Event Reporting System (FAERS). In addition to using structured field data, the algorithm incorporates the non-structured narrative text of adverse event reports by examining clinical and temporal information extracted by the Event-based Text-mining of Health Electronic Records system, a natural language processing tool. The final component of the algorithm is a novel duplicate confidence value that is calculated by a rule-based empirical approach that looks for similarities in a number of criteria between two case reports. For VAERS, the algorithm identified 77% of known duplicate pairs with a precision (or positive predictive value) of 95%. For FAERS, it identified 13% of known duplicate pairs with a precision of 100%. The textual information did not improve the algorithm's automated classification for VAERS or FAERS. The empirical duplicate confidence value increased performance on both VAERS and FAERS, mainly by reducing the occurrence of false-positives. The algorithm was shown to be effective at identifying pre-linked duplicate VAERS reports. The narrative text was not shown to be a key component in the automated detection evaluation; however, it is essential for supporting the semi-automated approach that is likely to be deployed at the Food and Drug Administration, where medical reviewers will perform some manual review of the most highly ranked reports identified by the algorithm.
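    The rule-based "duplicate confidence value" the abstract describes can be illustrated as a weighted score over field agreements between two case reports. The field names and weights below are toy assumptions, not the actual VAERS/FAERS criteria.

```python
# A toy rule-based duplicate confidence score in the spirit of the
# abstract: weights and field names are illustrative assumptions, not
# the algorithm's actual criteria.
def duplicate_confidence(a, b):
    score = 0.0
    if a["patient_age"] == b["patient_age"]:
        score += 0.3
    if a["onset_date"] == b["onset_date"]:
        score += 0.3
    # Jaccard similarity of the reported adverse-event terms.
    shared = set(a["adverse_events"]) & set(b["adverse_events"])
    union = set(a["adverse_events"]) | set(b["adverse_events"])
    score += 0.4 * (len(shared) / len(union) if union else 0.0)
    return score

r1 = {"patient_age": 63, "onset_date": "2017-01-05",
      "adverse_events": ["fever", "rash"]}
r2 = {"patient_age": 63, "onset_date": "2017-01-05",
      "adverse_events": ["fever", "rash", "headache"]}
conf = duplicate_confidence(r1, r2)  # 0.3 + 0.3 + 0.4 * (2/3)
```

    Ranking report pairs by such a score, and manually reviewing only the top of the list, matches the semi-automated workflow the abstract anticipates.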

  13. Higgs boson candidate event from 2012 data (8 TeV) recorded by the CMS experiment: ZZ to four electrons

    CERN Multimedia

    2016-01-01

    Event recorded with the CMS detector in 2012 at a proton-proton centre of mass energy of 8 TeV. The event shows characteristics expected from the decay of the SM Higgs boson to a pair of Z bosons, both of which subsequently decay to a pair of electrons. The event could also be due to known standard model background processes.

  14. 77 FR 74144 - Federal Motor Vehicle Safety Standards; Event Data Recorders

    Science.gov (United States)

    2012-12-13

    ... FMVSS No. 226, ``Ejection mitigation,'' all have been updated since the publication in 2006 of the EDR.... Accurate reporting of seat belt use and pre-crash data was also observed. The findings from these studies... protection,'' FMVSS No. 126, ``Electronic stability control,'' and FMVSS No. 226, ``Ejection mitigation...

  15. Methodology for using advanced event data recorders to reconstruct vehicle trajectories for use in safety impact methodologies (SIM).

    Science.gov (United States)

    Kusano, Kristofer D; Sherony, Rini; Gabler, Hampton C

    2013-01-01

    Safety impact methodologies (SIMs) have the goal of estimating safety benefits for proposed active safety systems. Because the precrash movements of vehicles involved in real-world crashes are often unknown, previous SIMs have taken the approach to reconstruct collisions from incomplete information sources, such as scaled scene diagrams and photographic evidence. The objective of this study is to introduce a novel methodology for reconstructing the precrash vehicle trajectories using data from advanced event data recorders (EDRs). Some EDRs from model year 2009 and newer Ford vehicles can record steering wheel angle in addition to precrash vehicle speed, accelerator pedal, and throttle input prior to the crash. A model was constructed using these precrash records and a vehicle model developed in the simulation software PreScan. The model was validated using the yaw rate and longitudinal and lateral accelerations also recorded by this type of Ford EDR but not used to develop the models. In general, the model was able to approximate the dynamics recorded on the EDR. The model did not match the observed dynamics when either the vehicle departed the paved surface or when electronic stability control was active. Modifying the surface friction at the estimated point at which the vehicle departed the road produced better simulation results. The developed trajectories were used to simulate 2 road departure crashes, one into a fixed object and one into a vehicle traveling in the opposite direction, as if the departing vehicle were equipped with a lane departure warning (LDW) system. This example application demonstrates the utility of this method and its potential application to a SIM. This study demonstrated a novel method for crash reconstruction that can be applied to a SIM for active safety systems. Benefits of this method are that the driver inputs do not need to be inferred from other reconstructions because they are recorded directly by the EDR. 
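    The core step of turning EDR precrash records (speed plus steering wheel angle) into a trajectory can be sketched with a kinematic bicycle model. The wheelbase and steering ratio below are illustrative assumptions, not the paper's PreScan vehicle model.

```python
# A minimal kinematic sketch of trajectory reconstruction from EDR-style
# precrash records. Parameters are illustrative assumptions.
import math

WHEELBASE = 2.8     # m, assumed
STEER_RATIO = 16.0  # steering wheel angle : road wheel angle, assumed

def reconstruct(records, dt=0.1):
    """records: list of (speed_mps, steering_wheel_angle_rad) samples."""
    x = y = heading = 0.0
    path = [(x, y)]
    for speed, sw_angle in records:
        delta = sw_angle / STEER_RATIO                 # road-wheel angle
        heading += speed / WHEELBASE * math.tan(delta) * dt
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        path.append((x, y))
    return path

# Straight driving at 20 m/s for 1 s covers 20 m along x.
path = reconstruct([(20.0, 0.0)] * 10)
```

    A full reconstruction would additionally handle road departure and stability-control intervention, the two cases where the paper's model diverged from the recorded dynamics.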

  16. RECORDS REACHING RECORDING DATA TECHNOLOGIES

    OpenAIRE

    G. W. L. Gresik; S. Siebe; R. Drewello

    2013-01-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence...

  17. First ATLAS Events Recorded Underground

    CERN Multimedia

    Teuscher, R

    As reported in the CERN Bulletin, Issue No.30-31, 25 July 2005 The ATLAS barrel Tile calorimeter has recorded its first events underground using a cosmic ray trigger, as part of the detector commissioning programme. This is not a simulation! A cosmic ray muon recorded by the barrel Tile calorimeter of ATLAS on 21 June 2005 at 18:30. The calorimeter has three layers and a pointing geometry. The light trapezoids represent the energy deposited in the tiles of the calorimeter depicted as a thick disk. On the evening of June 21, the ATLAS detector, now being installed in the underground experimental hall UX15, reached an important psychological milestone: the barrel Tile calorimeter recorded the first cosmic ray events in the underground cavern. An estimated million cosmic muons enter the ATLAS cavern every 3 minutes, and the ATLAS team decided to make good use of some of them for the commissioning of the detector. Although only 8 of the 128 calorimeter slices ('superdrawers') were included in the trigg...

  18. Records Reaching Recording Data Technologies

    Science.gov (United States)

    Gresik, G. W. L.; Siebe, S.; Drewello, R.

    2013-07-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  19. RECORDS REACHING RECORDING DATA TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    G. W. L. Gresik

    2013-07-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  20. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    Science.gov (United States)

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional near-infrared spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fitting for the Automatic IDentification of functional Events (AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related, and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments, respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions, respectively (condition 1: social, interact with a person; condition 2: non-social, interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2, respectively. The identified functional events were then matched to behavioural data from the video recordings of the movements and actions of the participant. Our results suggest that "brain-first" rather than "behaviour-first" analysis is
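    The GLM least-squares idea behind event identification can be sketched as follows: slide a candidate onset along the recording, fit the signal with a predicted response plus an intercept, and keep the onset with the smallest residual. The gamma-like "HRF" kernel and synthetic, noise-free data below are purely illustrative, not the published AIDE algorithm.

```python
# Schematic GLM-based event-onset recovery (illustrative, not the
# published AIDE method). Noise-free synthetic data make the recovery
# deterministic.
import numpy as np

def hrf(length=15):
    t = np.arange(length, dtype=float)
    h = t ** 2 * np.exp(-t)  # crude gamma-like impulse response
    return h / h.sum()

def best_onset(signal, duration=10):
    n = len(signal)
    kernel = hrf()
    best, best_rss = None, np.inf
    for onset in range(n - duration - len(kernel)):
        box = np.zeros(n)
        box[onset:onset + duration] = 1.0
        reg = np.convolve(box, kernel)[:n]      # predicted response
        X = np.column_stack([reg, np.ones(n)])  # regressor + intercept
        beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
        rss = float(np.sum((signal - X @ beta) ** 2))
        if rss < best_rss:
            best, best_rss = onset, rss
    return best

n = 100
box = np.zeros(n)
box[30:40] = 1.0  # true event: onset at sample 30, duration 10
signal = np.convolve(box, hrf())[:n]
onset = best_onset(signal)  # recovers 30
```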

  1. Digital event recorder capable of simple computations and with ...

    African Journals Online (AJOL)

    An event recorder which can summate and display stored data is described. This instrument can be used to record behavioural events or sequences in the laboratory or the field and produces a punched tape record which may be read by a computer, without need for an interface. Its ability to perform simple calculations for ...

  2. Factors contributing to commercial vehicle rear-end conflicts in China: A study using on-board event data recorders.

    Science.gov (United States)

    Bianchi Piccinini, Giulio; Engström, Johan; Bärgman, Jonas; Wang, Xuesong

    2017-09-01

    In the last 30 years, China has undergone a dramatic increase in vehicle ownership and a resulting escalation in the number of road crashes. Although crash figures are decreasing today, they remain high; it is therefore important to investigate crash causation mechanisms to further improve road safety in China. To shed more light on the topic, naturalistic driving data was collected in Shanghai as part of the evaluation of a behavior-based safety service. The data collection included instrumenting 47 vehicles belonging to a commercial fleet with data acquisition systems. From the overall sample, 91 rear-end crash or near-crash (CNC) events, triggered by 24 drivers, were used in the analysis. The CNC were annotated by three researchers, through an expert assessment methodology based on videos and kinematic variables. The results show that the main factor behind the rear-end CNC was the adoption of very small safety margins. In contrast to results from previous studies in the US, the following vehicles' drivers typically had their eyes on the road and reacted quickly in response to the evolving conflict in most events. When delayed reactions occurred, they were mainly due to driving-related visual scanning mismatches (e.g., mirror checks) rather than visual distraction. Finally, the study identified four main conflict scenarios that represent the typical development of rear-end conflicts in this data. The findings of this study have several practical applications, such as informing the specifications of in-vehicle safety measures and automated driving and providing input into the design of coaching/training procedures to improve the driving habits of drivers. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
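    The "very small safety margins" the study highlights are conventionally quantified with time headway (THW) and time-to-collision (TTC). A minimal sketch with example values (the numbers are illustrative, not the study's data):

```python
# Standard kinematic safety margins for rear-end conflicts; the example
# values are illustrative, not data from the study.
def time_headway(gap_m, following_speed_mps):
    """Seconds of travel separating the follower from the leader's rear."""
    return gap_m / following_speed_mps

def time_to_collision(gap_m, following_speed_mps, lead_speed_mps):
    """Seconds until impact at current speeds; None if not closing."""
    closing = following_speed_mps - lead_speed_mps
    return gap_m / closing if closing > 0 else None

thw = time_headway(10.0, 20.0)             # 0.5 s: a very small margin
ttc = time_to_collision(10.0, 20.0, 15.0)  # 2.0 s until impact
```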

  3. Adverse events recording in electronic health record systems in primary care.

    Science.gov (United States)

    de Hoon, Sabine E M; Hek, Karin; van Dijk, Liset; Verheij, Robert A

    2017-12-06

    Adequate record keeping of medication adverse events in electronic health records systems is important for patient safety. Events that remain unrecorded cannot be communicated from one health professional to another. In the absence of a gold standard, we investigate the variation between Dutch general practices in the extent to which they record medication adverse events. Data were derived from electronic health records (EHR) of Dutch general practices participating in NIVEL Primary Care Database (NIVEL-PCD) in 2014, including 308 general practices with a total practice population of 1,256,049 listed patients. Medication adverse events were defined as recorded ICPC-code A85 (adverse effect medical agent). Between practice variation was studied using multilevel logistic regression analysis corrected for age, gender, number of different medicines prescriptions and number of chronic diseases. In 2014 there were 8330 patients with at least one medication adverse event recorded. This corresponds to 6.9 medication adverse events per 1000 patients and is higher for women, elderly, patients with polypharmacy and for patients with comorbidity. Corrected for these patient characteristics the median odds ratio (MOR = 1.92) suggests an almost twofold difference between general practices in recorded medication adverse events. Our results suggest that improvement in terms of uniformity in recording medication adverse events is possible, preventing potential damage for patients. We suggest that creating a learning health system by individual practice feedback on the number of recordings of adverse events would help practitioners to improve their recording habits.
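    The median odds ratio (MOR) reported in the abstract translates the between-practice variance of a multilevel logistic model onto the odds-ratio scale via MOR = exp(sqrt(2·σ²)·Φ⁻¹(0.75)). A minimal sketch; the variance below is an illustrative value chosen to land near the reported 1.92, not the study's estimate:

```python
# Median odds ratio for a multilevel (random-intercept) logistic model.
# The between-cluster variance below is illustrative, not the study's.
import math
from statistics import NormalDist

def median_odds_ratio(between_cluster_variance):
    z75 = NormalDist().inv_cdf(0.75)  # 75th percentile of N(0, 1)
    return math.exp(math.sqrt(2.0 * between_cluster_variance) * z75)

mor = median_odds_ratio(0.47)  # roughly 1.92
```

    An MOR of 1.92 means that for two otherwise identical patients in two randomly chosen practices, the odds of having a medication adverse event recorded differ by a median factor of about two.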

  4. Mobile on-board vehicle event recorder: MOVER

    CSIR Research Space (South Africa)

    Kingsley Bell, L

    2017-03-01

    A Mobile On-board Vehicle Event Recorder (MOVER) was designed and built for the purpose of detecting car accidents through the use of acceleration thresholds. Driving data were gathered and crash simulations were run. With these data, testing and analysis were conducted in order...
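
    A minimal sketch of the threshold-based detection idea described above (the threshold value and the consecutive-sample debounce below are illustrative assumptions, not MOVER's actual parameters):

```python
def detect_crash(accel_g, threshold_g=4.0, min_samples=3):
    """Flag a crash when |acceleration| stays above a threshold for
    several consecutive samples, rejecting single-sample noise.
    threshold_g and min_samples are illustrative values only."""
    run = 0
    for a in accel_g:
        run = run + 1 if abs(a) >= threshold_g else 0
        if run >= min_samples:
            return True
    return False

print(detect_crash([0.1, 0.3, 5.2, 6.1, 4.8, 0.2]))  # sustained spike
print(detect_crash([0.1, 4.5, 0.2, 0.1]))            # isolated spike
```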

  5. Financial impact of inaccurate Adverse Event recording post Hip Fracture surgery: Addendum to 'Adverse event recording post hip fracture surgery'.

    Science.gov (United States)

    Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian

    2018-02-15

    A 2011 study by Doody et al. (Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In that study, a single-centre University Hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month period (August-September 2011), with 55 adverse events recorded prospectively in contrast to the HIPE record of 13 (23.6%) adverse events. With the recent change in the Irish hospital funding model from block grant to 'activity-based funding' on the basis of case load and case complexity, the hospital financial allocation is dependent on accurate case complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' over 2 months was calculated when the ward-based, prospectively collected data were compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.

  6. Event related potentials recorded in Dorsal Simultanagnosia.

    Science.gov (United States)

    Onofrj, M; Fulgente, T; Thomas, A

    1995-12-01

    Visual evoked potentials (VEPs) to central and lateral half-field patterned stimuli of 1, 2 and 4 cycles per degree (cpd) were recorded in a patient with Dorsal Simultanagnosia due to bilateral lesions of the parieto-occipital junction. VEPs consisted of the normal N1-P1-N2 components with the same spatial frequency sensitivity as in controls. VEPs had similar latencies and amplitudes whether or not the patient could see the patterned stimuli. Event related potentials (ERPs) to visual and acoustic odd-ball paradigms were also recorded in the same patient. Visual ERPs consisted of an early NA effect and of N2-P3 components. P3 was recorded only from frontal, central and temporal derivations. The topographical P3 abnormality was, however, the same for visual and acoustic odd-ball paradigms. The amplitude of P3 was smaller when the patient missed visual stimuli. These findings show that severe bilateral lesions at the parieto-occipital junction, inducing Simultanagnosia, do not obliterate VEP or ERP components.

  7. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    Science.gov (United States)

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, and (3) splitting each such observation into two observations: one where the event occurs and one where it does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data, including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. Copyright © 2016 Elsevier Inc. All rights reserved.
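
    The IPCW idea described above can be sketched in a few lines of NumPy: estimate the censoring distribution with a Kaplan-Meier curve, then weight each fully observed subject by the inverse probability of remaining uncensored. This is a simplified sketch of the general technique, not the authors' code; the toy follow-up data are invented.

```python
import numpy as np

def censoring_survival(times, events):
    """Kaplan-Meier estimate of the censoring survival curve S_C,
    treating censoring (events == 0) as the event of interest.
    Returns sorted times and S_C just after each time."""
    order = np.argsort(times)
    t, e = np.asarray(times)[order], np.asarray(events)[order]
    n = len(t)
    surv = np.ones(n)
    s = 1.0
    for i in range(n):
        if e[i] == 0:                      # a censoring occurs at t[i]
            s *= (n - i - 1) / (n - i)     # at-risk set shrinks by one
        surv[i] = s
    return t, surv

def ipcw_weights(times, events, horizon):
    """Weights for the binary outcome 'event occurred by horizon'.
    Early-censored subjects get weight 0; the rest are up-weighted
    by 1 / S_C evaluated just before min(T, horizon)."""
    t_sorted, surv = censoring_survival(times, events)
    def s_before(u):                       # left-continuous lookup
        mask = t_sorted < u
        return surv[mask][-1] if mask.any() else 1.0
    w = np.zeros(len(times))
    for i, (t, e) in enumerate(zip(times, events)):
        if e == 1 and t <= horizon:        # event observed in window
            w[i] = 1.0 / s_before(t)
        elif t >= horizon:                 # followed past the horizon
            w[i] = 1.0 / s_before(horizon)
    return w

# Toy data: follow-up times and indicators (1 = event, 0 = censored).
times = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
events = np.array([1, 0, 1, 0, 1])
print(ipcw_weights(times, events, horizon=4.0))
```

    Note that the weights of the retained subjects sum (approximately) to the full sample size, which is what lets a standard classifier treat the reweighted sample as if there were no censoring.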

  8. Pattern Discovery from Event Data

    OpenAIRE

    Le Van Quoc, Anh

    2014-01-01

    Events are ubiquitous in real life. With the rapid rise of the popularity of social media channels, massive amounts of event data, such as information about festivals, concerts, or meetings, are increasingly created and shared by users on the Internet. Deriving insights or knowledge from such social media data provides a semantically rich basis for many applications, for instance, social media marketing, service recommendation, sales promotion, or enrichment of existing data sources. In spite...

  9. 49 CFR 229.135 - Event recorders.

    Science.gov (United States)

    2010-10-01

    ... automatic air brake, including emergency applications. The system shall record, or provide a means of... applications. The system shall record, or provide a means of determining, that a brake application or release..., “train” includes a locomotive or group of locomotives with or without cars. The duty to equip the lead...

  10. Text mining electronic health records to identify hospital adverse events

    DEFF Research Database (Denmark)

    Gerdes, Lars Ulrik; Hardahl, Christian

    2013-01-01

    Manual reviews of health records to identify possible adverse events are time consuming. We are developing a method based on natural language processing to quickly search electronic health records for common triggers and adverse events. Our results agree fairly well with those obtained using manual review.

  11. Fire Event Data from Licensee Event Reports

    Data.gov (United States)

    Nuclear Regulatory Commission — The purpose of this study data is to provide a metric with which to assess the effectiveness of improvements to the U.S. NRC's fire protection regulations in support...

  12. Digital event recorder capable of simple computations and with ...

    African Journals Online (AJOL)

    event. The event recorder may be preset to automatically terminate the experiment after a fixed time or allowed to run until stopped manually. [Figure: block diagram of the recorder, comprising a crystal oscillator and clock divider, set-time and compare registers, stop/reset control logic, and a keyboard.] When two or more behavioural events occur simul...

  13. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  14. Big Data tools as applied to ATLAS event data

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2017-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and to...

  15. SMOS data and extreme events

    Science.gov (United States)

    Kerr, Yann; Wigneron, Jean-Pierre; Ferrazzoli, Paolo; Mahmoodi, Ali; Al-Yaari, Amen; Parrens, Marie; Bitar, Ahmad Al; Rodriguez-Fernandez, Nemesio; Bircher, Simone; Molero-rodenas, Beatriz; Drusch, Matthias; Mecklenburg, Susanne

    2017-04-01

    The SMOS (Soil Moisture and Ocean Salinity) satellite was successfully launched in November 2009. This ESA-led Earth Observation mission is dedicated to providing soil moisture over continental surfaces (with an accuracy goal of 0.04 m3/m3), vegetation water content over land, and ocean salinity. These geophysical variables are important as they control the energy balance between the surface and the atmosphere. Their knowledge at a global scale is of interest for climate and weather research, and in particular for improving model forecasts. The Soil Moisture and Ocean Salinity mission has now been collecting data for over 7 years. The whole data set has been reprocessed (version 620 for levels 1 and 2, and version 3 for level 3 CATDS), operational near-real-time soil moisture data is now available, and assimilation of SMOS data in NWP has proved successful. After 7 years it is worthwhile to start examining anomalies in the data and how they relate to large-scale events. We have also produced a 15-year soil moisture data set by merging SMOS and AMSR using a neural network approach. The purpose of this communication is to present the mission results after more than seven years in orbit in a climatic-trend perspective, as anomalies can be detected over such a period. Thereby we benefit from consistent datasets provided through the latest reprocessing using the most recent algorithm enhancements. Using the above-mentioned products it is possible to follow large events such as the evolution of droughts in North America, or the evolution of the water fraction over the Amazon basin. On this occasion we will focus on the analysis of anomalies in SMOS and ancillary products to reveal two climatic trends: the temporal evolution of water storage over the Indian subcontinent in relation to rainfall anomalies, and the global impact of El Niño-type events on the general water storage distribution. This presentation shows in detail the use of long-term data sets.

  16. Recording force events of single quantum-dot endocytosis.

    Science.gov (United States)

    Shan, Yuping; Hao, Xian; Shang, Xin; Cai, Mingjun; Jiang, Junguang; Tang, Zhiyong; Wang, Hongda

    2011-03-28

    We applied force spectroscopy based on the atomic force microscope (AFM) to demonstrate the possibility of measuring the interaction force between single quantum dots (QDs) and living cells at the single-particle level under native conditions. In the force-distance cycle, we recorded the events of cellular uptake of single QDs and of single-QD detachment from the cell.

  17. Increased record-breaking precipitation events under global warming

    NARCIS (Netherlands)

    Lehmann, Jascha; Coumou, Dim; Frieler, Katja

    2015-01-01

    In the last decade record-breaking rainfall events have occurred in many places around the world, causing severe impacts to human society and the environment, including agricultural losses and flooding. There is now medium confidence that human-induced greenhouse gases have contributed to changes in

  18. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    Science.gov (United States)

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.
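
    The thread-clustering idea can be illustrated schematically: encode each event sequence as a time-bin-by-event-type count tensor, then cluster the flattened slices. This is a deliberately simplified stand-in (plain k-means with a fixed initialization on toy data) for the tensor analysis the paper actually uses; all data and parameters below are invented.

```python
import numpy as np

def sequences_to_tensor(seqs, n_bins, n_types, t_max):
    """Encode sequences of (timestamp, event-type) pairs as a
    (n_seq, n_bins, n_types) count tensor."""
    T = np.zeros((len(seqs), n_bins, n_types))
    for s, seq in enumerate(seqs):
        for t, k in seq:
            b = min(int(t / t_max * n_bins), n_bins - 1)
            T[s, b, k] += 1
    return T

def kmeans(X, init, iters=10):
    """Plain k-means; `init` fixes the starting centers so the toy
    example is deterministic (real uses would randomize)."""
    centers = X[list(init)].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        for j in range(len(centers)):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Toy "threads": early type-0 activity vs. late type-1 activity.
seqs = ([[(0.1, 0), (0.2, 0), (0.3, 0)]] * 3
        + [[(0.7, 1), (0.8, 1), (0.9, 1)]] * 3)
T = sequences_to_tensor(seqs, n_bins=4, n_types=2, t_max=1.0)
labels = kmeans(T.reshape(len(seqs), -1), init=(0, 5))
print(labels)
```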

  19. 77 FR 48492 - Event Data Recorders

    Science.gov (United States)

    2012-08-14

    ... America, Isuzu Motors America LLC, Kia Motors America, Inc., Maserati North America, Inc., Nissan North... of onboard motor vehicle crash EDRs voluntarily installed in light passenger vehicles. Specifically... agency made these changes to encourage a broad application of EDR technologies in motor vehicles and...

  20. 77 FR 47552 - Event Data Recorders

    Science.gov (United States)

    2012-08-09

    ... axis down (into the steering wheel) philosophy explained in SAE Recommended Practice J670, ``Vehicle Dynamics Terminology,'' as well as the philosophy being used to update the EDR parameter definitions in... manufacturers. A complete statement of the costs and benefits of the introduction of Part 563 is available in...

  1. Multi-jet event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a high-multiplicity collision event observed by the CMS detector in the search for microscopic black holes, in collision data recorded in 2015. The event contains 12 jets with transverse momenta greater than 50 GeV each, and the mass of this system is 6.4 TeV. The scalar sum of the transverse energies of all energetic objects in the event (including missing transverse energy) is 5.4 TeV.

  2. Big Data Tools as Applied to ATLAS Event Data

    Science.gov (United States)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge and the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, containing only the containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of standard model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files, the performance in simple cut-flow data analysis, and will present preliminary results on its scaling
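
    As a rough illustration of what per-event indexing might involve, here is a hypothetical Elasticsearch-style document mapping for events. Every field name here is invented for illustration and is not the actual ATLAS xAOD schema or the mapping the authors developed:

```python
# Hypothetical index mapping for per-event documents; field names
# are illustrative assumptions, not the real xAOD content.
event_mapping = {
    "mappings": {
        "properties": {
            "run_number":   {"type": "long"},
            "event_number": {"type": "long"},
            "dataset":      {"type": "keyword"},
            "missing_et":   {"type": "float"},
            "electrons": {
                "type": "nested",          # one sub-document per electron
                "properties": {
                    "pt":  {"type": "float"},
                    "eta": {"type": "float"},
                    "phi": {"type": "float"},
                },
            },
        }
    }
}
print(sorted(event_mapping["mappings"]["properties"]))
```

    A nested type keeps the kinematic variables of each physics object grouped together, which is what makes per-object cut-flow queries possible at the index level.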

  3. SCADA data and the quantification of hazardous events for QMRA.

    Science.gov (United States)

    Nilsson, P; Roser, D; Thorwaldsdotter, R; Petterson, S; Davies, C; Signor, R; Bergstedt, O; Ashbolt, N

    2007-01-01

    The objective of this study was to assess the use of on-line monitoring to support the QMRA at water treatment plants studied in the EU MicroRisk project. SCADA data were obtained from three Catchment-to-Tap Systems (CTS) along with system descriptions, diary records, grab sample data and deviation reports. Particular attention was paid to estimating hazardous event frequency, duration and magnitude. Using Shewhart and CUSUM methods we identified 'change-points' corresponding to events of between 10 min and >1 month duration in time-series data. Our analysis confirmed it is possible to quantify hazardous event durations from turbidity, chlorine residual and pH records and distinguish them from non-hazardous variability in the time-series dataset. The durations of most 'events' were short-term (0.5-2.3 h). These data were combined with QMRA to estimate pathogen infection risk arising from such events as chlorination failure. While analysis of SCADA data alone could identify events provisionally, its interpretation was severely constrained in the absence of diary records and other system information. SCADA data analysis should only complement traditional water sampling, rather than replace it. More work on on-line data management, quality control and interpretation is needed before it can be used routinely for event characterization.
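
    The CUSUM change-point idea used above can be sketched in a few lines. The target, slack and decision values below are illustrative assumptions, not parameters from the study:

```python
def cusum_events(x, target, k=0.5, h=5.0):
    """One-sided tabular CUSUM: accumulate deviations above
    target + k and raise an alarm whenever the cumulative sum
    exceeds the decision interval h; the sum is reset after each
    alarm. Returns the indices where alarms fire."""
    s, alarms = 0.0, []
    for i, v in enumerate(x):
        s = max(0.0, s + (v - target - k))
        if s > h:
            alarms.append(i)
            s = 0.0
    return alarms

# Toy turbidity trace: stable baseline of 1 NTU with a short spike.
trace = [1.0] * 20 + [4.0] * 5 + [1.0] * 20
print(cusum_events(trace, target=1.0))
```

    The number of samples between the start of the excursion and the alarm, together with the decay back to zero, is what allows event durations to be read off the record.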

  4. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  5. Scatterometer Climate Record Pathfinder Data

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains mean normalized backscatter coefficient (sigma-naught); backscatter variance; and other associated data files from Seasat SASS carrying a...

  6. Wheelchair type biomedical system with event-recorder function.

    Science.gov (United States)

    Han, Dong-Kyoon; Kim, Jong-Myoung; Cha, Eun-Jong; Lee, Tae-Soo

    2008-01-01

    The present study describes a biometric system for a wheelchair, which can measure both bio-signals (ECG: electrocardiogram; BCG: ballistocardiogram) and kinetic signals (acceleration) simultaneously and send the data to a remote medical server. The equipment was developed with the object of building a system that measures the bio-signal and kinetic signal of a subject who is moving or at rest on a wheelchair and transmits the measured signals to a remote server through a CDMA (Code Division Multiple Access) network. The equipment is composed of a body area network and a remote medical server. The body area network was designed to obtain bio-signal and kinetic signal simultaneously and, on the occurrence of an event, to transmit data to the remote medical server through a CDMA network. The remote medical server was designed to display event data transmitted from the body area network in real time. The performance of the developed system was evaluated through two experiments. First, we measured battery life on the occurrence of events, and second, we tested whether biometric data are transmitted accurately to the remote server on the occurrence of an event. In the first experiment using the developed equipment, events were triggered 16 times and the battery worked stably for around 29 hours. In the second experiment, when an event took place, the corresponding data were transmitted accurately to the remote medical server through a CDMA network. This system is expected to be usable for the healthcare of those moving on a wheelchair and applicable to a mobile healthcare system.

  7. An open ocean record of the Toarcian oceanic anoxic event

    Directory of Open Access Journals (Sweden)

    D. R. Gröcke

    2011-11-01

    Oceanic anoxic events were time intervals in the Mesozoic characterized by the widespread deposition of marine organic matter-rich sediments (black shales) and significant perturbations in the global carbon cycle. These perturbations are globally recorded in sediments as carbon isotope excursions irrespective of lithology and depositional environment. During the early Toarcian, black shales were deposited on the epi- and pericontinental shelves of Pangaea, and these sedimentary rocks are associated with a pronounced (ca. 7‰) negative (organic) carbon isotope excursion (CIE), which is thought to be the result of a major perturbation in the global carbon cycle. For this reason, the lower Toarcian is thought to represent an oceanic anoxic event (the T-OAE). If the T-OAE was indeed a global event, an isotopic expression of this event should be found beyond the epi- and pericontinental Pangaean localities. To address this issue, the carbon isotope composition of organic matter (δ13Corg) of lower Toarcian organic matter-rich cherts from Japan, deposited in the open Panthalassa Ocean, was analysed. The results show the presence of a major (>6‰) negative excursion in δ13Corg that, based on radiolarian biostratigraphy, is a correlative of the lower Toarcian negative CIE known from Pangaean epi- and pericontinental strata. A smaller negative excursion in δ13Corg (ca. 2‰) is recognized lower in the studied succession. This excursion may, within the current biostratigraphic resolution, represent the excursion recorded in European epicontinental successions close to the Pliensbachian/Toarcian boundary. These results from the open ocean realm suggest, in conjunction with other previously published datasets, that these Early Jurassic carbon cycle perturbations affected the active global reservoirs of the exchangeable carbon cycle (deep marine, shallow marine, atmospheric).

  8. The Great Oxidation Event Recorded in Paleoproterozoic Rocks from Fennoscandia

    Directory of Open Access Journals (Sweden)

    Dmitry V. Rychanchik

    2010-04-01

    With support of the International Continental Scientific Drilling Program (ICDP) and other funding organizations, the Fennoscandia Arctic Russia – Drilling Early Earth Project (FAR-DEEP) operations were successfully completed during 2007. A total of 3650 meters of core was recovered from fifteen holes drilled through sedimentary and volcanic formations in Fennoscandia (Fig. 1), recording several global environmental changes spanning the time interval 2500–2000 Ma, including the Great Oxidation Event (GOE) (Holland, 2002). The core was subsequently curated and archived in Trondheim, Norway, and has been sampled by an international team of scientists.

  9. Event displays at 13 TeV of 2016 data

    CERN Document Server

    CMS Collaboration

    2017-01-01

    This performance note presents some illustrative event displays at a center-of-mass energy of 13 TeV. The data set consists of the proton-proton collision data recorded by the CMS detector in 2016 with a magnetic field of 3.8 tesla.

  10. Event displays in 13 TeV data

    CERN Document Server

    CMS Collaboration

    2016-01-01

    This performance note presents some illustrative event displays at a center-of-mass energy of 13 TeV. The data set consists of the first proton-proton collision data recorded by the CMS detector in 2015 with a magnetic field of 3.8 tesla.

  11. Event displays in 13 TeV data

    CERN Document Server

    CMS Collaboration

    2016-01-01

    This performance note presents some illustrative event displays together with kinematic quantities for diboson production candidates at a center-of-mass energy of 13 TeV. The data set consists of the proton-proton collision data recorded by the CMS detector in 2016 with a magnetic field of 3.8 tesla.

  12. NIMS EXPERIMENT DATA RECORDS: VENUS ENCOUNTER

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  13. GALILEO NIMS EXPERIMENT DATA RECORDS: JUPITER OPERATIONS

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  14. 14 CFR 91.609 - Flight data recorders and cockpit voice recorders.

    Science.gov (United States)

    2010-01-01

    ... the cause of accidents or occurrences in connection with the investigation under part 830. The... utilize a digital method of recording and storing data and a method of readily retrieving that data from... erased or otherwise obliterated. (g) In the event of an accident or occurrence requiring immediate...

  15. Semiannual Variation in the Number of Energetic Electron Precipitation Events Recorded in the Polar Atmosphere

    Science.gov (United States)

    Stozhkov, Y. Ivanovich; Makhmutov, V. S.; Bazilevskaya, G. A.; Krainev, M. B.; Svirkhevskaya, A. K.; Svirzhevsky, N. S.; Mailin, S. Y.

    2003-07-01

    The analysis of the monthly numbers of Electron Precipitation Events (EPEs) recorded at Olenya station (Murmansk region) during 1970-1987 shows a semiannual variation with two maxima centered on April and September. We analyse the interplanetary plasma and geomagnetic index data sets associated with the recorded EPEs. The possible relationship of this variation to the Russell-McPherron, equinoctial and axial effects is discussed.

  16. Wrong tooth extraction: an examination of 'Never Event' data.

    Science.gov (United States)

    Pemberton, M N; Ashley, M P; Saksena, A; Dickson, S

    2017-02-01

    The NHS in England has identified several adverse incidents involving patients, including operations done at the wrong site, as 'never events'. We examined published data from the period April 2012 to October 2015 and found that 'wrong tooth/teeth removed' is the most common 'wrong site' event, accounting for between 20% and 25% of wrong-site surgery never events, and 6%-9% of all never events. All 'wrong tooth/teeth removed' events seem to have been reported only by hospitals or Community Trusts. It is important to find out how these events are recorded and to find ways to prevent them. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  17. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  18. LRO DLRE 5 GRIDDED DATA RECORDS

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set consists of the Diviner Lunar Radiometer Experiment Gridded Data Records also known as GDRs. The DLRE is a surface pushbroom mapper that measures...

  19. MAGELLAN SURFACE CHARACTERISTICS VECTOR DATA RECORD

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the Magellan Surface Characteristics Vector Data Record (SCVDR) which is an orbit-by-orbit reduction of Magellan scattering and emission...

  20. ATLAS Inner Detector Event Data Model

    CERN Document Server

    Akesson, F; Dobos, D; Elsing, M; Fleischmann, S; Gaponenko, A; Gnanvo, K; Keener, P T; Liebig, W; Moyse, E; Salzburger, A; Siebel, M; Wildauer, A

    2007-01-01

    The data model for event reconstruction (EDM) in the Inner Detector of the ATLAS experiment is presented. Different data classes represent evolving stages in the reconstruction data flow, and specific derived classes exist for the sub-detectors. The Inner Detector EDM also extends the data model for common tracking in ATLAS and is integrated into the modular design of the ATLAS high-level trigger and off-line software.

  1. Discovering anomalous events from urban informatics data

    Science.gov (United States)

    Jayarajah, Kasthuri; Subbaraju, Vigneshwaran; Weerakoon, Dulanga; Misra, Archan; Tam, La Thanh; Athaide, Noel

    2017-05-01

    Singapore's "smart city" agenda is driving the government to provide public access to a broader variety of urban informatics sources, such as images from traffic cameras and information about buses servicing different bus stops. Such informatics data serves as probes of evolving conditions at different spatiotemporal scales. This paper explores how such multi-modal informatics data can be used to establish the normal operating conditions at different city locations, and then apply appropriate outlier-based analysis techniques to identify anomalous events at these selected locations. We will introduce the overall architecture of sociophysical analytics, where such infrastructural data sources can be combined with social media analytics to not only detect such anomalous events, but also localize and explain them. Using the annual Formula-1 race as our candidate event, we demonstrate a key difference between the discriminative capabilities of different sensing modes: while social media streams provide discriminative signals during or prior to the occurrence of such an event, urban informatics data can often reveal patterns that have higher persistence, including before and after the event. In particular, we shall demonstrate how combining data from (i) publicly available Tweets, (ii) crowd levels aboard buses, and (iii) traffic cameras can help identify the Formula-1 driven anomalies, across different spatiotemporal boundaries.
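
    A minimal sketch of the per-location outlier analysis described above, purely for illustration: invented crowding counts and a simple z-score rule, which stand in for (but are not) the paper's actual outlier techniques.

```python
import statistics

def location_anomalies(readings, z_thresh=2.5):
    """Flag time indices whose reading deviates from the location's
    historical mean by more than z_thresh standard deviations.
    z_thresh is an assumed, illustrative cut-off."""
    mu = statistics.fmean(readings)
    sd = statistics.pstdev(readings)
    if sd == 0:
        return []
    return [i for i, v in enumerate(readings) if abs(v - mu) / sd > z_thresh]

# Hypothetical hourly bus-crowding counts at one stop: a race-day surge.
counts = [30, 32, 29, 31, 28, 30, 120, 31, 29, 30]
print(location_anomalies(counts))
```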

  2. Data Bookkeeping Service 3 - Providing event metadata in CMS

    CERN Document Server

    Giffels, Manuel; Riley, Daniel

    2014-01-01

    The Data Bookkeeping Service 3 provides a catalog of event metadata for Monte Carlo and recorded data of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN, Geneva. It comprises all the information necessary for tracking datasets, their processing history, and the associations between runs, files and datasets, on a large scale of about $200,000$ datasets and more than $40$ million files, which adds up to around $700$ GB of metadata. DBS is an essential part of the CMS Data Management and Workload Management (DMWM) systems; all kinds of data processing, such as Monte Carlo production, processing of recorded event data, and physics analysis performed by users, rely heavily on the information stored in DBS.

  3. Landsat Surface Reflectance Climate Data Records

    Science.gov (United States)

    ,

    2014-01-01

    Landsat Surface Reflectance Climate Data Records (CDRs) are high-level Landsat data products that support land surface change studies. Climate Data Records, as defined by the National Research Council, are time series of measurements with sufficient length, consistency, and continuity to identify climate variability and change. The U.S. Geological Survey (USGS) is using the valuable 40-year Landsat archive to create CDRs that can be used to document changes to Earth's terrestrial environment.

  4. Identification of new events in Apollo 16 lunar seismic data by Hidden Markov Model-based event detection and classification

    Science.gov (United States)

    Knapmeyer-Endrun, Brigitte; Hammer, Conny

    2015-10-01

    Detection and identification of interesting events in single-station seismic data with little prior knowledge and under tight time constraints is a typical scenario in planetary seismology. The Apollo lunar seismic data, containing the only confirmed events yet recorded on any extraterrestrial body, provide a valuable test case. Here we present the application of a stochastic event detector and classifier to the data of station Apollo 16. Based on a single-waveform example for each event class and some hours of background noise, the system is trained to recognize deep moonquakes, impacts, and shallow moonquakes, and it performs reliably over 3 years of data. The algorithm's demonstrated ability to detect rare events and flag previously undefined signal classes as new event types is of particular interest in the analysis of the first seismic recordings from a completely new environment. We are able to classify more than 50% of previously unclassified lunar events, and we additionally find over 200 new events not listed in the current lunar event catalog. These events include deep moonquakes as well as impacts and could be used to update studies on temporal variations in event rate or the deep moonquake stacks used in phase picking for localization. No unambiguous new shallow moonquake was detected, but application to data from the other Apollo stations has the potential for additional new discoveries 40 years after the data were recorded. Moreover, the classification system could be useful for future seismometer missions to other planets, e.g., the InSight mission to Mars.
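For intuition, single-waveform templates can drive a much simpler classifier than a hidden-Markov system: normalized cross-correlation of each class template against the incoming waveform, with a "new event type" fallback when nothing matches. This is an illustrative stand-in, not the HMM approach the authors use, and the 0.6 threshold is an arbitrary assumption.

```python
from math import sqrt

def _ncc(a, b):
    """Normalized cross-correlation of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def classify(waveform, templates, threshold=0.6):
    """Slide each class template over the waveform; return the best-matching
    class label, or 'new event type' if no window correlates above threshold.

    `templates` maps a class label (e.g. 'impact') to one example waveform.
    """
    best_class, best_score = "new event type", threshold
    for label, tpl in templates.items():
        w = len(tpl)
        for i in range(len(waveform) - w + 1):
            s = _ncc(waveform[i:i + w], tpl)
            if s > best_score:
                best_class, best_score = label, s
    return best_class
```

The "new event type" fallback mirrors the catalog-extension idea in the abstract: signals that match no known class are flagged rather than silently discarded.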

  5. Proxy records of Holocene storm events in coastal barrier systems: Storm-wave induced markers

    Science.gov (United States)

    Goslin, Jérôme; Clemmensen, Lars B.

    2017-10-01

    Extreme storm events in the coastal zone are one of the main forcing agents of short-term coastal system behavior. As such, storms represent a major threat to human activities concentrated along the coasts worldwide. In order to better understand the frequency of extreme events like storms, climate science must rely on records longer than the century-scale records of instrumental weather data. Proxy records of storm-wave or storm-wind induced activity in coastal barrier system deposits have been widely used worldwide in recent years to document past storm events during the last millennia. This review provides a detailed state-of-the-art compilation of the proxies available from coastal barrier systems to reconstruct Holocene storm chronologies (paleotempestology). The present paper aims (i) to describe the erosional and depositional processes caused by storm-wave action in barrier and back-barrier systems (i.e. beach ridges, storm scarps and washover deposits), (ii) to understand how storm records can be extracted from barrier and back-barrier sedimentary bodies using stratigraphical, sedimentological, micro-paleontological and geochemical proxies and (iii) to show how to obtain chronological control on past storm events recorded in the sedimentary successions. The challenges that paleotempestology studies still face in the reconstruction of representative and reliable storm chronologies using these various proxies are discussed, and future research prospects are outlined.

  6. Data Bookkeeping Service 3 - A new event data catalog for CMS

    CERN Document Server

    Giffels, Manuel

    2012-01-01

    The Data Bookkeeping Service (DBS) provides an event data catalog for Monte Carlo and recorded data of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN, Geneva. It contains all the information necessary for tracking datasets, such as their processing history and the associations between runs, files and datasets, on a large scale of about $10^5$ datasets and more than $10^7$ files. DBS is widely used within CMS, since all kinds of data processing, such as Monte Carlo production, processing of recorded event data, and physics analysis performed by users, rely on the information stored in DBS.

  7. An analytical approach for estimating fossil record and diversification events in sharks, skates and rays.

    Science.gov (United States)

    Guinot, Guillaume; Adnet, Sylvain; Cappetta, Henri

    2012-01-01

    Modern selachians and their supposed sister group (hybodont sharks) have a long and successful evolutionary history. Yet, although selachian remains are considered relatively common in the fossil record in comparison with other marine vertebrates, little is known about the quality of their fossil record. Similarly, only a few works based on specific time intervals have attempted to identify major events that marked the evolutionary history of this group. Phylogenetic hypotheses concerning modern selachians' interrelationships are numerous but differ significantly, and no consensus has been reached. The aim of the present study is to take advantage of the range of recent phylogenetic hypotheses in order to assess the fit of the selachian fossil record to phylogenies, according to two different branching methods. Compilation of these data allowed an estimated range of diversity through time to be inferred, and evolutionary events that marked this group over the past 300 Ma are identified. Results indicate that, with the exception of high taxonomic ranks (orders), the selachian fossil record is far from complete, particularly for generic and post-Triassic data. The timing and amplitude of the various identified events that marked selachian evolutionary history are discussed. Some identified diversity events were mentioned in previous works using alternative methods (Early Jurassic, mid-Cretaceous, K/T boundary and late Paleogene diversity drops), thus reinforcing the efficiency of the methodology presented here in inferring evolutionary events. Other events (Permian/Triassic, Early and Late Cretaceous diversifications; Triassic/Jurassic extinction) are newly identified. Relationships between these events and paleoenvironmental characteristics and other groups' evolutionary histories are proposed.

  8. An analytical approach for estimating fossil record and diversification events in sharks, skates and rays.

    Directory of Open Access Journals (Sweden)

    Guillaume Guinot

    Full Text Available BACKGROUND: Modern selachians and their supposed sister group (hybodont sharks) have a long and successful evolutionary history. Yet, although selachian remains are considered relatively common in the fossil record in comparison with other marine vertebrates, little is known about the quality of their fossil record. Similarly, only a few works based on specific time intervals have attempted to identify major events that marked the evolutionary history of this group. METHODOLOGY/PRINCIPAL FINDINGS: Phylogenetic hypotheses concerning modern selachians' interrelationships are numerous but differ significantly, and no consensus has been reached. The aim of the present study is to take advantage of the range of recent phylogenetic hypotheses in order to assess the fit of the selachian fossil record to phylogenies, according to two different branching methods. Compilation of these data allowed an estimated range of diversity through time to be inferred, and evolutionary events that marked this group over the past 300 Ma are identified. Results indicate that, with the exception of high taxonomic ranks (orders), the selachian fossil record is far from complete, particularly for generic and post-Triassic data. The timing and amplitude of the various identified events that marked selachian evolutionary history are discussed. CONCLUSION/SIGNIFICANCE: Some identified diversity events were mentioned in previous works using alternative methods (Early Jurassic, mid-Cretaceous, K/T boundary and late Paleogene diversity drops), thus reinforcing the efficiency of the methodology presented here in inferring evolutionary events. Other events (Permian/Triassic, Early and Late Cretaceous diversifications; Triassic/Jurassic extinction) are newly identified. Relationships between these events and paleoenvironmental characteristics and other groups' evolutionary histories are proposed.

  9. Changes in record-breaking temperature events in China and projections for the future

    Science.gov (United States)

    Deng, Hanqing; Liu, Chun; Lu, Yanyu; He, Dongyan; Tian, Hong

    2017-06-01

    As global warming intensifies, more record-breaking (RB) temperature events, in which temperatures are higher than ever before recorded, are reported in many places around the world. These RB temperatures have caused severe impacts on ecosystems and human society. Here, we address changes in RB temperature events occurring over China in the past (1961-2014) as well as future projections (2006-2100) using observational data and the newly available simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5). The number of RB events shows significant multi-decadal variability in China, and their intensity decreased strongly from 1961 to 2014. However, more frequent RB events occurred in mid-eastern and northeastern China over the last 30 years (1981-2010). Comparisons with observational data indicate that multi-model ensemble (MME) simulations from the CMIP5 models perform well in simulating RB events for the historical run period (1961-2005). The CMIP5 MME shows a relatively larger uncertainty for the change in intensity. From 2051 to 2100, fewer RB events are projected to occur in most parts of China under the RCP 2.6 scenario. Over the longer period from 2006 to 2100, a remarkable increase is expected for the entire country under the RCP 8.5 scenario, with the maximum number of RB events increasing by approximately 600 per year by the end of the twenty-first century.
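A record-breaking (RB) event in a temperature series is simply a value exceeding every earlier value in the record; counting them is straightforward. This is a minimal sketch of the definition, not the CMIP5 analysis itself.

```python
def record_breaking_events(annual_max_temps):
    """Return the years whose maximum temperature exceeds every earlier year.

    `annual_max_temps` is a list of (year, temperature) pairs in time order.
    The first year trivially sets the initial record and is not counted
    as an RB event.
    """
    records, current_max = [], None
    for year, t in annual_max_temps:
        if current_max is None:
            current_max = t
        elif t > current_max:
            records.append(year)
            current_max = t
    return records
```

For a stationary climate the expected number of new records in year n falls off like 1/n, which is why a sustained high rate of RB events is itself a warming signal.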

  10. Enhancing adverse drug event detection in electronic health records using molecular structure similarity: application to pancreatitis.

    Directory of Open Access Journals (Sweden)

    Santiago Vilar

    Full Text Available Adverse drug event (ADE) detection and assessment is at the center of pharmacovigilance. Data mining of systems such as the FDA's Adverse Event Reporting System (AERS) and, more recently, Electronic Health Records (EHRs) can aid in the automatic detection and analysis of ADEs. Although different data mining approaches have been shown to be valuable, it is still crucial to improve the quality of the generated signals. Our objective was to leverage structural similarity by developing molecular fingerprint-based models (MFBMs) to strengthen ADE signals generated from EHR data. A reference standard of drugs known to be causally associated with the adverse event pancreatitis was used to create a MFBM. EHRs from the New York Presbyterian Hospital were mined to generate structured data. Disproportionality Analysis (DPA) was applied to the data, and 278 possible signals related to the ADE pancreatitis were detected. Candidate drugs associated with these signals were then assessed using the MFBM to find the most promising candidates based on structural similarity. The use of the MFBM as a means to strengthen or prioritize signals generated from the EHR significantly improved the detection accuracy of ADEs related to pancreatitis. The MFBM also highlights the etiology of the ADE by identifying structurally similar drugs, which could follow a similar mechanism of action. The method proposed in this paper shows promise as an adjunct to existing automated ADE detection and analysis approaches.
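Two ingredients of such a pipeline are easy to sketch: a disproportionality statistic over a 2x2 report table, and a Tanimoto similarity between molecular fingerprints. The proportional reporting ratio is one common DPA choice, and fingerprints-as-bit-sets is one common representation; the paper does not specify which variants it used, so both are assumptions for illustration.

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from a 2x2 contingency table of reports:
    a = drug of interest with the event,  b = drug without the event,
    c = all other drugs with the event,   d = all other drugs without it.
    Values well above 1 suggest a disproportionate association."""
    return (a / (a + b)) / (c / (c + d))

def tanimoto(fp1, fp2):
    """Tanimoto similarity between two molecular fingerprints given as
    sets of 'on' bit positions: |intersection| / |union|."""
    inter = len(fp1 & fp2)
    return inter / (len(fp1) + len(fp2) - inter)
```

In the spirit of the abstract, a DPA signal for a candidate drug would be up- or down-weighted according to its maximum Tanimoto similarity to the reference-standard pancreatitis drugs.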

  11. A Solar Irradiance Climate Data Record

    Science.gov (United States)

    Coddington, O.; Lean, J. L.; Pilewskie, P.; Snow, M.; Lindholm, D.

    2016-08-01

    We present a new climate data record for total solar irradiance and solar spectral irradiance between 1610 and the present day, with associated wavelength- and time-dependent uncertainties and quarterly updates. The data record, which is part of the National Oceanic and Atmospheric Administration's (NOAA) Climate Data Record (CDR) program, provides a robust, sustainable, and scientifically defensible record of solar irradiance that is of sufficient length, consistency, and continuity for use in studies of climate variability and climate change on multiple time scales and for user groups spanning climate modeling, remote sensing, and the natural resource and renewable energy industries. The data record, jointly developed by the University of Colorado's Laboratory for Atmospheric and Space Physics (LASP) and the Naval Research Laboratory (NRL), is constructed from solar irradiance models that determine the changes with respect to quiet-Sun conditions when facular brightening and sunspot darkening features are present on the solar disk. The magnitude of these irradiance changes is determined from the linear regression of a proxy magnesium (Mg) II index and sunspot area indices against the approximately decade-long solar irradiance measurements of the Solar Radiation and Climate Experiment (SORCE). To promote long-term data usage and sharing for a broad range of users, the source code, the dataset itself, and supporting documentation are archived at NOAA's National Centers for Environmental Information (NCEI). In the future, the dataset will also be available through the LASP Interactive Solar Irradiance Data Center (LISIRD) for user-specified time periods and spectral ranges of interest.
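The regression step can be illustrated with a single proxy. This is a toy sketch under simplifying assumptions: the actual model regresses both the Mg II index and sunspot-area indices jointly against SORCE measurements, with far more careful uncertainty handling.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y ~ a + b*x for equal-length lists.

    Here x would be a facular proxy (e.g. daily Mg II index values) and
    y the co-temporal measured irradiance change from quiet-Sun levels.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def irradiance_change(proxy_value, a, b):
    """Modeled irradiance change for a given proxy value (hypothetical
    helper name; the operational model combines several such terms)."""
    return a + b * proxy_value
```

Once the coefficients are fit over the SORCE era, the proxy time series (which extend much further back) carry the reconstruction to 1610.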

  12. Contrasting sulfur isotope records during the Late Devonian punctata and Upper Kellwasser events

    Science.gov (United States)

    Sim, M.; Ono, S.; Hurtgen, M. T.

    2013-12-01

    The Late Devonian was a period of intense biological and environmental changes, including terrestrial afforestation, a series of asteroid impacts, and active orogeny due to the accretion of continental blocks. High-amplitude positive carbon isotope excursions, the punctata and Kellwasser events, reflect major perturbations in the global carbon cycle during this period, which have been attributed to increased continental weathering and subsequent ocean eutrophication. Despite the comparable carbon isotope anomalies, however, a global biological crisis has been reported only for the Kellwasser events, while very low extinction intensity characterizes the punctata Event. We will present sulfur isotope records of carbonate-associated sulfate (CAS) and pyrite from Frasnian-Famennian sections in the Great Basin, USA, and evaluate the role of sulfur during the punctata and Upper Kellwasser events. A positive sulfur isotope shift in both CAS and pyrite accompanies the onset of the punctata Event, but to a larger extent in the latter. As a result, the sulfur isotope offset between CAS and pyrite (Δ34SCAS-py) plummeted to less than 10‰. In the middle of the punctata Event, a sharp negative δ34SCAS excursion occurred just after the Alamo Impact, leading to the negative Δ34SCAS-py values. Unlike the rapid oscillations of δ34Spy and δ34SCAS during the punctata Event, the Upper Kellwasser was a period of stability, except for a brief drop of δ34SCAS before the event. Paired sulfur isotope data, aided by a simple box model, suggest that the geochemical cycle of sulfur might be responsible for the contrasting biological responses to these two events. Superheavy pyrite and high stratigraphic variability of δ34Spy and δ34SCAS demonstrate a relatively small oceanic sulfate pool during the punctata Event, and the Alamo Impact likely triggered the rapid oxidation of microbially-produced sulfide. The expansion of sulfidic bottom water thus may have been impeded, thereby

  13. Study of flight data recorder, underwater locator beacon, data logger and flarm collision avoidance system

    Science.gov (United States)

    Timi, Purnota Hannan; Shermin, Saima; Rahman, Asifur

    2017-06-01

    The flight data recorder is one of the most important sources of flight data in the event of an aviation disaster; it records a wide range of flight parameters including altitude, airspeed, and heading, and also helps in monitoring and analyzing aircraft performance. The cockpit voice recorder records radio microphone transmissions and sounds in the cockpit. These devices help investigators find and understand the root causes of aircraft crashes and help build better aircraft systems and technical solutions to prevent similar crashes in the future, leading to improvements in the safety of aircraft and passengers. Other devices also enhance aircraft safety and assist in emergency or catastrophic situations. This paper discusses the concepts of the Flight Data Recorder (FDR), Cockpit Voice Recorder (CVR), Underwater Locator Beacon (ULB), data logger and FLARM collision avoidance system for aircraft, and their applications in aviation.

  14. Computerized technique for recording board defect data

    Science.gov (United States)

    R. Bruce Anderson; R. Edward Thomas; Charles J. Gatchell; Neal D. Bennett

    1993-01-01

    A computerized technique for recording board defect data has been developed that is faster and more accurate than manual techniques. The lumber database generated by this technique is a necessary input to computer simulation models that estimate potential cutting yields from various lumber breakdown sequences. The technique allows collection of detailed information...

  15. A volcanic event forecasting model for multiple tephra records, demonstrated on Mt. Taranaki, New Zealand

    Science.gov (United States)

    Damaschke, Magret; Cronin, Shane J.; Bebbington, Mark S.

    2018-01-01

    Robust time-varying volcanic hazard assessments are difficult to develop, because they depend upon having a complete and extensive eruptive activity record. Missing events in eruption records are endemic, due to poor preservation or erosion of tephra and other volcanic deposits. Even with many stratigraphic studies, underestimation or overestimation of eruption numbers is possible due to mis-matching tephras with similar chemical compositions or problematic age models. It is also common to have gaps in event coverage due to sedimentary records not being available in all directions from the volcano, especially downwind. Here, we examine the sensitivity of probabilistic hazard estimates using a suite of four new and two existing high-resolution tephra records located around Mt. Taranaki, New Zealand. Previous estimates were made using only single, or two correlated, tephra records. In this study, tephra data from six individual sites in lake and peat bogs covering an arc of 120° downwind of the volcano provided an excellent temporal high-resolution event record. The new data confirm a previously identified semi-regular pattern of variable eruption frequency at Mt. Taranaki. Eruption intervals exhibit a bimodal distribution, with eruptions being an average of 65 years apart, and in 2% of cases, centuries separate eruptions. The long intervals are less common than seen in earlier studies, but they have not disappeared with the inclusion of our comprehensive new dataset. Hence, the latest long interval of quiescence, since AD 1800, is unusual, but not out of character with the volcano. The new data also suggest that one of the tephra records (Lake Rotokare) used in earlier work had an old carbon effect on age determinations. This shifted ages of the affected tephras so that they were not correlated to other sites, leading to an artificially high eruption frequency in the previous combined record. New modelled time-varying frequency estimates suggest a 33

  16. Leveraging Data Intensive Computing to Support Automated Event Services

    Science.gov (United States)

    Clune, Thomas L.; Freeman, Shawn M.; Kuo, Kwo-Sen

    2012-01-01

    A large portion of Earth Science investigations is phenomenon- or event-based, such as studies of Rossby waves, mesoscale convective systems, and tropical cyclones. However, except for a few high-impact phenomena, e.g. tropical cyclones, comprehensive records of the occurrences or events of these phenomena are absent. Phenomenon-based studies therefore often focus on a few prominent cases while the lesser ones are overlooked. Without an automated means to gather the events, comprehensive investigation of a phenomenon is at least time-consuming if not impossible. An Earth Science event (ES event) is defined here as an episode of an Earth Science phenomenon. A cumulus cloud, a thunderstorm shower, a rogue wave, a tornado, an earthquake, a tsunami, a hurricane, or an El Niño is each an episode of a named ES phenomenon, and, from the small and insignificant to the large and potent, all are examples of ES events. An ES event has a finite duration and an associated geolocation as a function of time; it is therefore an entity in four-dimensional (4D) spatiotemporal space. The interests of Earth scientists typically center on Earth Science phenomena with the potential to cause massive economic disruption or loss of life, but broader scientific curiosity also drives the study of phenomena that pose no immediate danger. We generally gain understanding of a given phenomenon by observing and studying individual events - usually beginning by identifying the occurrences of these events. Once representative events are identified or found, we must locate the associated observed or simulated data prior to commencing analysis and concerted studies of the phenomenon. Knowledge concerning the phenomenon can accumulate only after analysis has started. However, except for a few high-impact phenomena, such as tropical cyclones and tornadoes, finding events and locating associated data currently may take a prohibitive amount of time and effort on the part of an individual investigator.

  17. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization.

  18. Assessing vaccine data recording in Brazil

    Directory of Open Access Journals (Sweden)

    Mario Lucio de Oliveira Novaes

    2015-12-01

    Full Text Available ABSTRACT: Objectives: Vaccines represent an important advancement for improving the general health of a population. The effective recording of vaccine data is a factor in the design of the vaccine supply chain. This study investigated vaccine data recording by comparing data collected from vaccination rooms with data obtained from a government-developed Internet platform. Methods: The monthly recorded total number of doses administered of the diphtheria and tetanus toxoids and pertussis vaccine (alone or in combination with the Haemophilus influenzae type b conjugate vaccine) in a medium-sized city of the Southeast region of Brazil was collected for the period January/2006 through December/2010 from two sources: the city level (directly from vaccination rooms), the study "gold standard", and the federal level (from an Internet platform developed by the country's government). Data from these sources were compared using descriptive statistics and the percentage error. Results: The data values made available by the Internet platform differed from those obtained from the vaccination rooms, with a percentage error relative to the actual values in the range [-0.48; 0.39]. Concordant values were observed in only one of the sixty analyzed months (1.66%). Conclusions: A frequent and large difference between the numbers of diphtheria and tetanus toxoids and pertussis vaccine doses administered recorded at the two levels was detected.
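The two comparison statistics used here are elementary. This sketch follows the stated definitions; the sign convention for the percentage error (platform minus gold standard, as a fraction of the gold standard) is an assumption, since the abstract only reports its range.

```python
def percentage_error(recorded, actual):
    """Signed relative error of the Internet-platform value against the
    vaccination-room ('gold standard') value, as a fraction."""
    return (recorded - actual) / actual

def concordance(monthly_pairs):
    """Fraction of months where both sources report the same dose count.

    `monthly_pairs` is a list of (platform_count, room_count) tuples,
    one per month."""
    agree = sum(1 for rec, act in monthly_pairs if rec == act)
    return agree / len(monthly_pairs)
```

With sixty months and one agreement, concordance comes out to 1/60 ≈ 1.66%, matching the figure in the abstract.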

  19. Mining Electronic Health Records using Linked Data.

    Science.gov (United States)

    Odgers, David J; Dumontier, Michel

    2015-01-01

    Meaningful Use guidelines have pushed the United States healthcare system to adopt electronic health record systems (EHRs) at an unprecedented rate. Hospitals and medical centers are providing access to clinical data via clinical data warehouses such as i2b2 or Stanford's STRIDE database. In order to realize the potential of using these data for translational research, clinical data warehouses must be interoperable with standardized health terminologies, biomedical ontologies, and growing networks of Linked Open Data such as Bio2RDF. Applying the principles of Linked Data, we transformed a de-identified version of STRIDE into a semantic clinical data warehouse containing visits, labs, diagnoses, prescriptions, and annotated clinical notes. We demonstrate the utility of this system through basic cohort selection, phenotypic profiling, and identification of disease genes. This work is significant in that it demonstrates the feasibility of using semantic web technologies to directly exploit existing biomedical ontologies and Linked Open Data.

  20. Preserving geomorphic data records of flood disturbances

    Science.gov (United States)

    Moody, John A.; Martin, Deborah; Meade, Robert H.

    2015-01-01

    No central database or repository is currently available in the USA to preserve long-term, spatially extensive records of fluvial geomorphic data or to provide future accessibility. Yet, because of their length and continuity, these data are valuable for future research. Therefore, we built a publicly accessible website to preserve the data records of two examples of long-term monitoring (40 and 18 years) of the fluvial geomorphic response to natural disturbances. One disturbance was a ∼50-year flood on Powder River in Montana in 1978, and the second was a catastrophic flood on Spring Creek following a ∼100-year rainstorm after a wildfire in Colorado in 1996. Two critical issues arise relative to preserving fluvial geomorphic data. The first is preserving the data themselves; the second, and just as important, is preserving information about the location of the field research sites where the data were collected so the sites can be re-located and re-surveyed in the future. The latter allows long-term datasets to be extended into the future and provides critical background data for interpreting future landscape changes. Data were preserved on a website to allow worldwide accessibility and to allow new data to be uploaded as they become available. We describe the architecture of the website, lessons learned in developing it, future improvements, and recommendations on how to also preserve information about the location of field research sites.

  1. Event time analysis of longitudinal neuroimage data.

    Science.gov (United States)

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Data-Driven Information Extraction from Chinese Electronic Medical Records.

    Directory of Open Access Journals (Sweden)

    Dong Xu

    Full Text Available This study aims to propose a data-driven framework that takes unstructured free-text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to an SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). The framework is data-driven, weakly supervised, and robust against the variations and noise that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica.
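Normalized Google Distance, used above as an SVM feature, has a standard closed form over document co-occurrence counts. The implementation below follows the published formula directly; the variable names are ours, and how counts are obtained from the EMR corpus is not specified in the abstract.

```python
from math import log

def ngd(fx, fy, fxy, n):
    """Normalized Google Distance from document counts:
    fx, fy - documents containing term x (resp. term y)
    fxy    - documents containing both terms
    n      - total number of indexed documents
    Returns 0 when the terms always co-occur; larger values mean the
    terms are less related. The log base cancels, so natural log is fine.
    """
    lx, ly, lxy = log(fx), log(fy), log(fxy)
    return (max(lx, ly) - lxy) / (log(n) - min(lx, ly))
```

A low NGD between a medical event term and a candidate description term is evidence that the two belong in the same time-event-description triple.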

  3. "Big data" and the electronic health record.

    Science.gov (United States)

    Ross, M K; Wei, W; Ohno-Machado, L

    2014-08-15

    Implementation of Electronic Health Record (EHR) systems continues to expand. The massive number of patient encounters results in high amounts of stored data. Transforming clinical data into knowledge to improve patient care has been the goal of biomedical informatics professionals for many decades, and this work is now increasingly recognized outside our field. In reviewing the literature for the past three years, we focus on "big data" in the context of EHR systems and we report on some examples of how secondary use of data has been put into practice. We searched the PubMed database for articles from January 1, 2011 to November 1, 2013. We initiated the search with keywords related to "big data" and EHR. We identified relevant articles, and additional keywords from the retrieved articles were added. Based on the new keywords, more articles were retrieved and we manually narrowed down the set utilizing predefined inclusion and exclusion criteria. Our final review includes articles categorized into the themes of data mining (pharmacovigilance, phenotyping, natural language processing), data application and integration (clinical decision support, personal monitoring, social media), and privacy and security. The increasing adoption of EHR systems worldwide makes it possible to capture large amounts of clinical data. There is an increasing number of articles addressing the theme of "big data", and the concepts associated with these articles vary. The next step is to transform healthcare big data into actionable knowledge.

  4. Evaluating adverse drug event reporting in administrative data from emergency departments: a validation study

    Science.gov (United States)

    2013-01-01

    Background Adverse drug events are a frequent cause of emergency department presentations. Administrative data could be used to identify patients presenting with adverse drug events for post-market surveillance, and to conduct research in patient safety and in drug safety and effectiveness. However, such data sources have not been evaluated for their completeness with regard to adverse drug event reporting. Our objective was to determine the proportion of adverse drug events to outpatient medications diagnosed at the point-of-care in emergency departments that were documented in administrative data. Methods We linked the records of patients enrolled in a prospective observational cohort study on adverse drug events conducted in two Canadian tertiary care emergency departments to their administrative data. We compared the number of adverse drug events diagnosed and recorded at the point-of-care in the prospective study with the number of adverse drug events recorded in the administrative data. Results Among 1574 emergency department visits, 221 were identified as adverse drug event-related in the prospective database. We found 15 adverse drug events documented in administrative records with ICD-10 codes clearly indicating an adverse drug event, indicating a sensitivity of 6.8% (95% CI 4.0–11.2%) of this code set. When the ICD-10 code categories were broadened to include codes indicating a very likely, likely or possible adverse event to a medication, 62 of 221 events were identifiable in administrative data, corresponding to a sensitivity of 28.1% (95% CI 22.3-34.6%). Conclusions Adverse drug events to outpatient medications were underreported in emergency department administrative data compared to the number of adverse drug events diagnosed and recorded at the point-of-care. PMID:24219303
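
    The sensitivities above are simple proportions with binomial confidence intervals. The abstract does not state which interval method was used; a Wilson score interval, sketched below, yields bounds of a similar magnitude.

```python
import math

def sensitivity_wilson(hits, total, z=1.96):
    """Proportion of point-of-care ADE visits found in administrative
    data, with an approximate 95% Wilson score interval."""
    p = hits / total
    denom = 1 + z * z / total
    centre = (p + z * z / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total)) / denom
    return p, centre - half, centre + half

# 15 of 221 visits captured by the strict code set; 62 by the broad set
for hits, label in [(15, "strict ICD-10 code set"), (62, "broadened code set")]:
    p, lo, hi = sensitivity_wilson(hits, 221)
    print(f"{label}: sensitivity {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```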

  5. R2R Eventlogger: Community-wide Recording of Oceanographic Cruise Science Events

    Science.gov (United States)

    Maffei, A. R.; Chandler, C. L.; Stolp, L.; Lerner, S.; Avery, J.; Thiel, T.

    2012-12-01

    Methods used by researchers to track science events during a science research cruise - and to note when and where these occur - vary widely. Handwritten notebooks, printed forms, watch-keeper logbooks, data-logging software, and customized software have all been employed. The quality of scientific results is affected by the consistency and care with which such events are recorded, and integration of multi-cruise results is hampered because recording methods vary widely from cruise to cruise. The Rolling Deck to Repository (R2R) program has developed an Eventlogger system that will eventually be deployed on most vessels in the academic research fleet. It is based on the open software package called ELOG (http://midas.psi.ch/elog/) originally authored by Stefan Ritt and enhanced by our team. Lessons have been learned in its development and use on several research cruises. We have worked hard to find approaches that encourage cruise participants to use tools like the eventlogger. We examine these lessons and several eventlogger datasets from past cruises. We further describe how the R2R Science Eventlogger works in concert with the other R2R program elements to help integrate research vessels into a coordinated mobile observing fleet. Making use of data collected on different research cruises is enabled by adopting common ways of describing science events, the science instruments employed, the data collected, etc. The use of controlled vocabularies and the practice of mapping these local vocabularies to accepted oceanographic community vocabularies helps to bind shipboard research events from different cruises into a more cohesive set of fleet-wide events that can be queried and examined in a cross-cruise manner. Examples of the use of the eventlogger during multi-cruise oceanographic research programs along with examples of resultant eventlogger data will be presented.
Additionally we will highlight the importance of vocabulary use strategies to the success of the

  6. Diving into Data: Planning a Research Data Management Event

    Directory of Open Access Journals (Sweden)

    Robyn Reed

    2015-07-01

    Full Text Available The George T. Harrell Health Sciences Library at Penn State Hershey initiated its participation in institutional research data management activities by coordinating and hosting a well-attended data management symposium. To maximize relevance to clinical and basic sciences researchers, a planning committee of faculty and administrators assisted in defining important topics for the event. This article describes the symposium development and outcomes. The goal is to share this information with librarians who are seeking ways to become more involved with data management in their institutions.

  7. Pathfinder Sea Surface Temperature Climate Data Record

    Science.gov (United States)

    Baker-Yeboah, S.; Saha, K.; Zhang, D.; Casey, K. S.

    2016-02-01

    Global sea surface temperature (SST) fields are important in understanding ocean and climate variability. The NOAA National Centers for Environmental Information (NCEI) develops and maintains a high resolution, long-term, climate data record (CDR) of global satellite SST. These SST values are generated at approximately 4 km resolution using Advanced Very High Resolution Radiometer (AVHRR) instruments aboard NOAA polar-orbiting satellites going back to 1981. The Pathfinder SST algorithm is based on the Non-Linear SST algorithm using the modernized NASA SeaWiFS Data Analysis System (SeaDAS). Coefficients for this SST product were generated using regression analyses with co-located in situ and satellite measurements. Previous versions of Pathfinder included level 3 collated (L3C) products. Pathfinder Version 5.3 includes level 2 pre-processed (L2P), level 3 uncollated (L3U), and L3C products. Notably, the data were processed in the cloud using Amazon Web Services and are made available through all of the modern web visualization and subset services provided by the THREDDS Data Server, the Live Access Server, and the OPeNDAP Hyrax Server. In this version of Pathfinder SST, anomalous hot-spots at land-water boundaries are better identified and the dataset includes updated land masks and sea ice data over the Antarctic ice shelves. All quality levels of SST values are generated, giving the user greater flexibility and the option to apply their own cloud-masking procedures. Additional improvements include consistent cloud tree tests for NOAA-07 and NOAA-19 with respect to the other sensors, improved SSTs in sun glint areas, and netCDF file format improvements to ensure consistency with the latest Group for High Resolution SST (GHRSST) requirements. This quality controlled satellite SST field is a reference environmental data record utilized as a primary resource of SST for numerous regional and global marine efforts.
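
    The Non-Linear SST retrieval mentioned above combines the 11 and 12 micron brightness temperatures with a first-guess SST and the satellite zenith angle. The sketch below shows the standard NLSST functional form only; the coefficients are illustrative placeholders, whereas Pathfinder derives its coefficients by regression against in situ matchups.

```python
import math

def nlsst(t11, t12, sst_ref, sat_zenith_deg, coeffs=(1.0, 0.95, 0.08, 0.7)):
    """Non-Linear SST (NLSST) functional form; coefficients are made up.

    t11, t12:       brightness temperatures (deg C) in the split-window channels
    sst_ref:        first-guess/reference SST (deg C)
    sat_zenith_deg: satellite zenith angle (degrees)
    """
    a0, a1, a2, a3 = coeffs
    dt = t11 - t12  # split-window difference, sensitive to water vapour
    secant = 1.0 / math.cos(math.radians(sat_zenith_deg))
    return a0 + a1 * t11 + a2 * dt * sst_ref + a3 * dt * (secant - 1.0)
```

    With no split-window difference and nadir viewing, the retrieval reduces to a linear function of the 11 micron channel, which is a quick sanity check on any implementation.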

  8. The ADE scorecards: a tool for adverse drug event detection in electronic health records.

    Science.gov (United States)

    Chazard, Emmanuel; Băceanu, Adrian; Ferret, Laurie; Ficheur, Grégoire

    2011-01-01

    Although several methods exist for Adverse Drug events (ADE) detection due to past hospitalizations, a tool that could display those ADEs to the physicians does not exist yet. This article presents the ADE Scorecards, a Web tool that enables to screen past hospitalizations extracted from Electronic Health Records (EHR), using a set of ADE detection rules, presently rules discovered by data mining. The tool enables the physicians to (1) get contextualized statistics about the ADEs that happen in their medical department, (2) see the rules that are useful in their department, i.e. the rules that could have enabled to prevent those ADEs and (3) review in detail the ADE cases, through a comprehensive interface displaying the diagnoses, procedures, lab results, administered drugs and anonymized records. The article shows a demonstration of the tool through a use case.

  9. Low-cost automatic activity data recording system

    Directory of Open Access Journals (Sweden)

    M.F.D. Moraes

    1997-08-01

    Full Text Available We describe a low-cost, high quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving for other purposes, is used to record the circadian activity rhythm pattern of rats with time in an automated computerized fashion using minimal cost computer equipment (IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC which records the time (hours-minutes-seconds) when the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized in terms of actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as Fast Fourier Transform (FFT) or cosinor. In order to demonstrate method validation, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). Results show a biological validation of the method since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.
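
    The kind of FFT/cosinor analysis described can be illustrated with a single-frequency periodogram over hourly event counts. This is a generic sketch on synthetic data, not the authors' software.

```python
import math
import random

def periodogram_power(counts, period, dt=1.0):
    """Power at one candidate period (hours) via projection onto
    sine/cosine - a single-frequency discrete Fourier transform."""
    n = len(counts)
    mean = sum(counts) / n  # remove the DC component first
    w = 2 * math.pi * dt / period
    c = sum((x - mean) * math.cos(w * i) for i, x in enumerate(counts))
    s = sum((x - mean) * math.sin(w * i) for i, x in enumerate(counts))
    return (c * c + s * s) / n

def dominant_period(counts, candidates=range(8, 37)):
    """Candidate period (hours) carrying the most spectral power."""
    return max(candidates, key=lambda p: periodogram_power(counts, p))

# Synthetic hourly touch-release counts: 28 days with a 24 h rhythm
random.seed(1)
counts = [10 + 5 * math.cos(2 * math.pi * t / 24) + random.gauss(0, 1)
          for t in range(28 * 24)]
print(dominant_period(counts))  # 24
```

    On real recordings from a light-entrained rat, the dominant period would similarly be expected at or near 24 h.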

  10. Using a Cardiac Event Recorder in Children with Potentially Arrhythmia-Related Symptoms.

    Science.gov (United States)

    Saygi, Murat; Ergul, Yakup; Ozyilmaz, Isa; Sengul, Fatma Sevinc; Guvenc, Osman; Aslan, Eyup; Guzeltas, Alper; Akdeniz, Celal; Tuzcu, Volkan

    2016-09-01

    In this study, we reported our experience with the use of cardiac event recorders in pediatric patients. We evaluated 583 patients fitted with an event recorder (15-30 days) between March 2010 and November 2014 at our clinic. Excluded from the study were 117 patients with no recorded events and six with records contaminated by electrocardiogram artifacts. All of the patients received electrocardiograms, Holter monitoring, and echocardiography before the cardiac event recording. The patient sample consisted of 460 patients (64% female). The mean age was 12.8 ± 4.1 years. The median number of recorded events was 7. The indications included palpitations in 336 (73%) patients, syncope in 27 (6%) patients, and chest pain and palpitations in 97 (21%) patients. Whereas 64 patients (14%) had structural heart disease according to echocardiographic examination, the remaining patients had normal echocardiographic examination results. The most frequent cardiac comorbidities were mitral valve prolapse (6%), operated tetralogy of Fallot (1.5%), and complicated congenital heart diseases with single ventricle physiology (1%). The recorded events were sinus tachycardia in 113 (25%) patients, supraventricular tachycardia in 35 (8%) patients, ventricular extrasystole in 20 (4%) patients, supraventricular extrasystole in nine (2%) patients, and ventricular tachycardia in two (0.4%) patients. Based on the event recorder and follow-up electrocardiogram findings, 46 patients received an electrophysiology study/ablation. The symptom-rhythm correlation was 39%. In the presence of possible arrhythmia-related symptoms in children, a cardiac event recorder can be considered a useful primary diagnostic method. More research on this topic is needed. © 2016 Wiley Periodicals, Inc.

  11. Biotic immigration events, speciation, and the accumulation of biodiversity in the fossil record

    Science.gov (United States)

    Stigall, Alycia L.; Bauer, Jennifer E.; Lam, Adriane R.; Wright, David F.

    2017-01-01

    Biotic Immigration Events (BIMEs) record the large-scale dispersal of taxa from one biogeographic area to another and have significantly impacted biodiversity throughout geologic time. BIMEs associated with biodiversity increases have been linked to ecologic and evolutionary processes including niche partitioning, species packing, and higher speciation rates. Yet substantial biodiversity decline has also been documented following BIMEs due to elevated extinction and/or reduced speciation rates. In this review, we develop a conceptual model for biodiversity accumulation that links BIMEs and geographic isolation with local (α) diversity, regional (β) diversity, and global (γ) diversity metrics. Within the model, BIME intervals are characterized by colonization of existing species within new geographic regions and a lack of successful speciation events. Thus, there is no change in γ-diversity, and α-diversity increases at the cost of β-diversity. An interval of regional isolation follows in which lineage splitting results in successful speciation events and diversity increases across all three metrics. Alternation of these two regimes can result in substantial biodiversity accumulation. We tested this conceptual model using a series of case studies from the paleontological record. We primarily focus on two intervals during the Middle through Late Ordovician Period (470-458 Ma): the globally pervasive BIMEs during the Great Ordovician Biodiversification Event (GOBE) and a regional BIME, the Richmondian Invasion. We further test the conceptual model by examining the Great Devonian Interchange, Neogene mollusk migrations and diversification, and the Great American Biotic Interchange. Paleontological data accord well with model predictions. Constraining the mechanisms of biodiversity accumulation provides context for conservation biology. Because α-, β-, and γ-diversity are semi-independent, different techniques should be considered for sustaining various

  12. Some parallels in the astronomical events recorded in the Maya codices and inscriptions.

    Science.gov (United States)

    Closs, M. P.

    The Dresden Codex contains two excellent examples of astronomical tables, one dedicated to the planet Venus and the other to solar and lunar eclipses. Most of the dates recorded in the monumental inscriptions are related to events in the lives of the Maya kings who commissioned the monuments. Nevertheless, it is not unusual to find that some of these dates are glyphically marked with astronomical references. The author looks for astronomical parallels in the events recorded in the inscriptions and codices.

  13. 14 CFR 23.1459 - Flight data recorders.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Flight data recorders. 23.1459 Section 23... Equipment § 23.1459 Flight data recorders. (a) Each flight recorder required by the operating rules of this... electrical power from the bus that provides the maximum reliability for operation of the flight data recorder...

  14. SOCIAL INTERACTIONS AND FEELINGS OF INFERIORITY AMONG CORRECTIONAL OFFICERS - A DAILY EVENT-RECORDING APPROACH

    NARCIS (Netherlands)

    PEETERS, MCW; BUUNK, BP; SCHAUFELI, WB

    1995-01-01

    A daily event-recording method, referred to as the Daily Interaction Record in Organizations (DIRO) was employed for assessing the influence of three types of social interaction on negative affect at work. For this purpose, 38 correctional officers (COs) completed forms, for a 1-week period, that

  15. Implications of electronic health record downtime: an analysis of patient safety event reports.

    Science.gov (United States)

    Larsen, Ethan; Fong, Allan; Wernz, Christian; Ratwani, Raj M

    2018-02-01

    We sought to understand the types of clinical processes, such as image and medication ordering, that are disrupted during electronic health record (EHR) downtime periods by analyzing the narratives of patient safety event report data. From a database of 80,381 event reports, 76 reports were identified as explicitly describing a safety event associated with an EHR downtime period. These reports were analyzed and categorized based on a developed code book to identify the clinical processes that were impacted by downtime. We also examined whether downtime procedures were in place and followed. The reports were coded into categories related to their reported clinical process: Laboratory, Medication, Imaging, Registration, Patient Handoff, Documentation, History Viewing, Delay of Procedure, and General. A majority of reports (48.7%, n = 37) were associated with lab orders and results, followed by medication ordering and administration (14.5%, n = 11). Incidents commonly involved patient identification and communication of clinical information. A majority of reports (46%, n = 35) indicated that downtime procedures either were not followed or were not in place. Only 27.6% of incidents (n = 21) indicated that downtime procedures were successfully executed. Patient safety report data offer a lens into EHR downtime-related safety hazards. Important areas of risk during EHR downtime periods were patient identification and communication of clinical information; these should be a focus of downtime procedure planning to reduce safety hazards. EHR downtime events pose patient safety hazards, and we highlight critical areas for downtime procedure improvement.
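
    The categorization in the study was done manually against a developed code book. Purely to illustrate the data structure, a naive keyword-matching sketch is given below; the category names come from the abstract, but the trigger keywords and the substring-matching rule are invented.

```python
# Hypothetical, simplified code book: category -> trigger keywords.
CODE_BOOK = {
    "Laboratory": ["lab", "specimen", "result"],
    "Medication": ["medication", "dose", "pharmacy"],
    "Imaging": ["x-ray", "ct", "mri", "image"],
    "Registration": ["registration", "wristband"],
}

def code_report(narrative):
    """Assign every matching category to a free-text safety report;
    fall back to 'General' when nothing matches."""
    text = narrative.lower()
    cats = [c for c, kws in CODE_BOOK.items() if any(k in text for k in kws)]
    return cats or ["General"]

print(code_report("Lab results unavailable during downtime; dose delayed"))
# ['Laboratory', 'Medication']
```

    Naive substring matching like this over-triggers in practice (e.g. "ct" inside "doctor"), which is one reason manual coding against a code book remains the standard for safety-report analysis.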

  16. ATLAS EventIndex Data Collection Supervisor and Web Interface

    CERN Document Server

    Garcia Montoro, Carlos; The ATLAS collaboration

    2016-01-01

    The EventIndex project consists of the development and deployment of a complete catalogue of events for the ATLAS experiment at the LHC accelerator at CERN. In 2015 the ATLAS experiment produced 12 billion real events in 1 million files, and 5 billion simulated events in 8 million files. The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing it in a central Hadoop infrastructure. A subset of this information is copied to an Oracle relational database. These slides present two components of the ATLAS EventIndex: its data collection supervisor and its web interface partner.
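
    An EventIndex-style catalogue entry pairs event identifiers with a file GUID and an internal pointer, so that any event can be located for "event picking". A minimal in-memory sketch follows; the field names and values are illustrative, not the actual ATLAS schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EventRecord:
    """One catalogue entry (illustrative fields only)."""
    run_number: int
    event_number: int
    file_guid: str   # GUID of the file holding the event
    offset: int      # internal pointer to the event inside that file

# An in-memory stand-in for the catalogue, keyed for event picking
catalogue = {}

def add(rec):
    catalogue[(rec.run_number, rec.event_number)] = rec

def pick(run_number, event_number):
    """Event picking: locate the file and offset for one event."""
    return catalogue.get((run_number, event_number))

add(EventRecord(284500, 1234567, "A1B2-C3D4", 42))
hit = pick(284500, 1234567)
print(hit.file_guid, hit.offset)  # A1B2-C3D4 42
```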

  17. Contrasting sediment records of marine submersion events related to wave exposure, Southwest France

    Science.gov (United States)

    Baumann, J.; Chaumillon, E.; Schneider, J.-L.; Jorissen, F.; Sauriau, P.-G.; Richard, P.; Bonnin, J.; Schmidt, S.

    2017-05-01

    Sediment records of two contrasting backshore coastal marshes, extremely vulnerable to recent and historical marine flooding events, located on the SW coast of France, have been investigated using a multiproxy approach. The studied marshes are 30 km apart and have been flooded by similar storm events (7 marine floods in the last 250 years). One is located on a wave-exposed coast but isolated from the sea by a sediment barrier, whereas the other is located in a sheltered estuarine environment and isolated from the sea by a dike. One core was collected in each marsh and information on grain-size, foraminifera, shell contents and stable carbon isotopes was obtained along with an age model using 210Pb, 137Cs and 14C. Core data combined with historical maps give evidence of a typical estuarine backfilling, part of the Holocene regressive parasequence, including an intertidal mudflat at the base and a backshore environment at the top. Despite the absence of grain size anomalies, marine flood-related sedimentation in the backshore area of both marshes is identified by a mixture of marine and terrestrial features, including marine fauna, vegetation debris and variation in the δ13C signature of the organic fraction. Very low sedimentation rates between flood events and/or bioturbation prevent the identification of individual episodic marine floods in the sediment succession. Comparison of the two sedimentary successions shows that the foraminifera deposited by marine submersions are of two different types. Foraminifera are monospecific and originate from the upper tidal mudflat in the sheltered marsh; whereas in the backshore marsh located in a wave-exposed environment, they show higher diversity and originate from both shallow and deeper water marine environments. This study shows that wave exposure can control the faunal content of marine flood sediment records in coastal marshes.

  18. 14 CFR 129.20 - Digital flight data recorders.

    Science.gov (United States)

    2010-01-01

    ... digital method of recording and storing data and a method of readily retrieving that data from the storage... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Digital flight data recorders. 129.20... § 129.20 Digital flight data recorders. No person may operate an aircraft under this part that is...

  19. Distributed Data Collection for the ATLAS EventIndex

    Science.gov (United States)

    Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.

    2015-12-01

    The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
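
    The Producer/Consumer flow described (jobs pack a snippet of records per output file, a broker conveys it, a consumer unpacks and catalogues it) can be sketched with Python's standard library. An in-process queue stands in for the ActiveMQ broker and a dict for the Hadoop catalogue; the record fields are illustrative.

```python
import json
import queue
import threading

broker = queue.Queue()  # stand-in for the ActiveMQ broker at CERN

def producer(job_id, n_events):
    """A Tier-0/Grid job sends one packed snippet per output file."""
    snippet = [{"evt": i, "guid": f"FILE-{job_id}", "offset": i * 100}
               for i in range(n_events)]
    broker.put(json.dumps(snippet))

def consumer(store, n_msgs):
    """Unpack the snippets and catalogue each record (here: in a dict)."""
    for _ in range(n_msgs):
        for rec in json.loads(broker.get()):
            store[(rec["guid"], rec["evt"])] = rec["offset"]

store = {}
jobs = [threading.Thread(target=producer, args=(j, 3)) for j in range(2)]
for t in jobs:
    t.start()
for t in jobs:
    t.join()
consumer(store, n_msgs=2)
print(len(store))  # 6
```

    Decoupling producers from the consumer through a broker is what lets jobs running anywhere on the Grid report asynchronously while a single back-end sorts and stores the records.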

  20. VA Personal Health Record Sample Data

    Data.gov (United States)

    Department of Veterans Affairs — My HealtheVet (www.myhealth.va.gov) is a Personal Health Record portal designed to improve the delivery of health care services to Veterans, to promote health and...

  1. EOP TDRs (Temperature-Depth-Recordings) Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Temperature-depth-recorders (TDRs) were attached to commercial longline and research Cobb trawl gear to obtain absolute depth and temperature measurement during...

  2. Genomic inferences of domestication events are corroborated by written records in Brassica rapa.

    Science.gov (United States)

    Qi, Xinshuai; An, Hong; Ragsdale, Aaron P; Hall, Tara E; Gutenkunst, Ryan N; Chris Pires, J; Barker, Michael S

    2017-07-01

    Demographic modelling is often used with population genomic data to infer the relationships and ages among populations. However, relatively few analyses are able to validate these inferences with independent data. Here, we leverage written records that describe distinct Brassica rapa crops to corroborate demographic models of domestication. Brassica rapa crops are renowned for their outstanding morphological diversity, but the relationships and order of domestication remain unclear. We generated genomewide SNPs from 126 accessions collected globally using high-throughput transcriptome data. Analyses of more than 31,000 SNPs across the B. rapa genome revealed evidence for five distinct genetic groups and supported a European-Central Asian origin of B. rapa crops. Our results supported the traditionally recognized South Asian and East Asian B. rapa groups with evidence that pak choi, Chinese cabbage and yellow sarson are likely monophyletic groups. In contrast, the oil-type B. rapa subsp. oleifera and brown sarson were polyphyletic. We also found no evidence to support the contention that rapini is the wild type or the earliest domesticated subspecies of B. rapa. Demographic analyses suggested that B. rapa was introduced to Asia 2,400-4,100 years ago, and that Chinese cabbage originated 1,200-2,100 years ago via admixture of pak choi and European-Central Asian B. rapa. We also inferred significantly different levels of founder effect among the B. rapa subspecies. Written records from antiquity that document these crops are consistent with these inferences. The concordance between our age estimates of domestication events with historical records provides unique support for our demographic inferences. © 2017 John Wiley & Sons Ltd.

  3. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies for all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of that time, meteorological data recently digitised from several stations in Portugal and Spain and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper air fields in order to provide some insight into the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  4. Understanding extreme rainfall events in Australia through historical data

    Science.gov (United States)

    Ashcroft, Linden; Karoly, David John

    2016-04-01

    Historical climate data recovery is still an emerging field in the Australian region. The majority of Australia's instrumental climate analyses begin in 1900 for rainfall and 1910 for temperature, particularly those focussed on extreme event analysis. This sparsity of past data in turn limits our understanding of long-term climate variability, constraining efforts to predict the impact of future climate change. To address this need for improved historical data in Australia, a new network of recovered climate observations has recently been developed, centred on the highly populated southeastern Australian region (Ashcroft et al., 2014a, 2014b). The dataset includes observations from more than 39 published and unpublished sources and extends from British settlement in 1788 to the formation of the Australian Bureau of Meteorology in 1908. Many of these historical sources provide daily temperature and rainfall information, offering an opportunity to improve understanding of the multidecadal variability of Australia's extreme events. In this study we combine the historical data for three major Australian cities - Melbourne, Sydney and Adelaide - with modern observations to examine extreme rainfall variability over the past 174 years (1839-2013). We first explore two case studies, combining instrumental and documentary evidence to support the occurrence of severe storms in Sydney in 1841 and 1844. These events appear to be at least as extreme as Sydney's modern 24-hour rainfall record. Next we use a suite of rainfall indices to assess the long-term variability of rainfall in southeastern Australia. In particular, we focus on the stationarity of the teleconnection between the El Niño-Southern Oscillation (ENSO) phenomenon and extreme rainfall events.
Using ENSO reconstructions derived from both palaeoclimatic and documentary sources, we determine the historical relationship between extreme rainfall in southeastern Australia and ENSO, and examine whether or not this
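
    Rainfall indices of the kind referred to above are typically simple reductions of a daily series. The definitions below follow common climate-index conventions (highest 1-day and n-day totals, wet-day counts); it is an assumption that the study used exactly this suite, and the example series is invented.

```python
def rx1day(daily_mm):
    """Highest 1-day precipitation total in the record."""
    return max(daily_mm)

def rxnday(daily_mm, n):
    """Highest n-day running precipitation total."""
    return max(sum(daily_mm[i:i + n]) for i in range(len(daily_mm) - n + 1))

def wet_days(daily_mm, threshold=1.0):
    """Count of days at or above a wet-day threshold (mm)."""
    return sum(1 for x in daily_mm if x >= threshold)

# Illustrative week of daily totals (mm), not real station data
week = [0.0, 2.5, 31.0, 12.0, 0.2, 0.0, 8.0]
print(rx1day(week), rxnday(week, 2), wet_days(week))  # 31.0 43.0 4
```

    Applied year by year to a homogenised daily series, such indices give the long-term extreme-rainfall variability that can then be compared against ENSO reconstructions.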

  5. The ATLAS EventIndex: data flow and inclusion of other metadata

    CERN Document Server

    Prokoshin, Fedor; The ATLAS collaboration; Cardenas Zarate, Simon Ernesto; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2016-01-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information obtained from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event as well as trigger decision information. The main use cases for the EventIndex are event picking, providing information for the Event Service, and data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the GRID, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalog AMI and the Rucio data man...

  6. DataPlay's mobile recording technology

    Science.gov (United States)

    Bell, Bernard W., Jr.

    2002-01-01

    A small rotating memory device has been developed which utilizes optical prerecorded and writeable technology to provide a mobile recording solution for digital cameras, cell phones, music players, PDAs, and hybrid multipurpose devices. This solution encompasses writeable, read-only, and encrypted storage media.

  7. Crash Survivable Flight Data Recording System Study.

    Science.gov (United States)

    1981-06-30

  8. The ATLAS EventIndex: data flow and inclusion of other metadata

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00064378; Cardenas Zarate, Simon Ernesto; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Prokoshin, Fedor; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2016-01-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information collected from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event as well as trigger decision information. The main use case for the EventIndex is event picking, as well as data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data management system and information on p...

  9. Networks of recurrent events, a theory of records, and an application to finding causal signatures in seismicity

    Science.gov (United States)

    Davidsen, Jörn; Grassberger, Peter; Paczuski, Maya

    2008-06-01

    We propose a method to search for signs of causal structure in spatiotemporal data making minimal a priori assumptions about the underlying dynamics. To this end, we generalize the elementary concept of recurrence for a point process in time to recurrent events in space and time. An event is defined to be a recurrence of any previous event if it is closer to it in space than all the intervening events. As such, each sequence of recurrences for a given event is a record breaking process. This definition provides a strictly data driven technique to search for structure. Defining events to be nodes, and linking each event to its recurrences, generates a network of recurrent events. Significant deviations in statistical properties of that network compared to networks arising from (acausal) random processes allows one to infer attributes of the causal dynamics that generate observable correlations in the patterns. We derive analytically a number of properties for the network of recurrent events composed by a random process in space and time. We extend the theory of records to treat not only the variable where records happen, but also time as continuous. In this way, we construct a fully symmetric theory of records leading to a number of results. Those analytic results are compared in detail to the properties of a network synthesized from time series of epicenter locations for earthquakes in Southern California. Significant disparities from the ensemble of acausal networks that can be plausibly attributed to the causal structure of seismicity are as follows. (1) Invariance of network statistics with the time span of the events considered. (2) The appearance of a fundamental length scale for recurrences, independent of the time span of the catalog, which is consistent with observations of the “rupture length.” (3) Hierarchy in the distances and times of subsequent recurrences. As expected, almost all of the statistical properties of a network constructed from a
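
    The recurrence definition above translates directly into a short algorithm. This is a sketch under the assumption that events are given as (time, x, y) triples sorted by time; for each event, later events that come closer in space than any intervening event break its "distance record" and become its recurrences:

```python
import math

def recurrence_network(events):
    """Link each event to its recurrences: a later event j is a recurrence
    of an earlier event i if j is closer to i in space than every event
    that occurred between them, so each i starts a record-breaking process."""
    edges = []
    for i, (ti, xi, yi) in enumerate(events):
        record = math.inf                     # closest approach to i so far
        for j in range(i + 1, len(events)):
            tj, xj, yj = events[j]
            d = math.hypot(xj - xi, yj - yi)
            if d < record:                    # j breaks i's distance record
                edges.append((i, j))
                record = d
    return edges

# toy catalogue of (time, x, y) events, already sorted by time
events = [(0, 0.0, 0.0), (1, 5.0, 0.0), (2, 1.0, 0.0), (3, 0.5, 0.0)]
print(recurrence_network(events))  # [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3)]
```

    The statistics of the resulting directed network (degree distributions, recurrence distances and times) are then compared against networks built from acausal random catalogues.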

  10. 14 CFR 125.225 - Flight data recorders.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Flight data recorders. 125.225 Section 125... Requirements § 125.225 Flight data recorders. (a) Except as provided in paragraph (d) of this section, after... October 1, 1969, unless it is equipped with one or more approved flight recorders that utilize a digital...

  11. 14 CFR 121.343 - Flight data recorders.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Flight data recorders. 121.343 Section 121... Flight data recorders. (a) Except as provided in paragraphs (b), (c), (d), (e), and (f) of this section... or is turbine-engine powered unless it is equipped with one or more approved flight recorders that...

  12. Motivation and intention to integrate physical activity into daily school life: the JAM World Record event.

    Science.gov (United States)

    Vazou, Spyridoula; Vlachopoulos, Symeon P

    2014-11-01

    Research on the motivation of stakeholders to integrate physical activity into daily school life is limited. The purpose was to examine the motivation of stakeholders to participate in a world record physical activity event and whether motivation was associated with future intention to use activity breaks during the daily school life and future participation in a similar event. After the 2012 JAM (Just-a-Minute) World Record event, 686 adults (591 women; 76.1% participated for children) reported their motivational regulations and future intention to (a) use the activity breaks and (b) participate in the event. High intrinsic motivation and low extrinsic motivation and amotivation for participation in the next event were reported. Hierarchical regression analysis, controlling for age, gender, and occupation, showed that intrinsic forms of motivation positively predicted, whereas amotivation negatively predicted, future intention to participate in the event and use the activity breaks. Multivariate analyses of variance revealed that school-related participants were more intrinsically motivated and intended to use the activity breaks and repeat the event more than those who were not affiliated with a school. Nonschool participants reported higher extrinsic motivation and amotivation than school-related participants. © 2014 Society for Public Health Education.

  13. Comparison of Offshore Turbidite records and Lake Disturbance Events at the Latitude of Seattle, Washington

    Science.gov (United States)

    Galer, S.; Goldfinger, C.; Morey, A. E.; Black, B.; Romsos, C.; Beeson, J. W.; Erhardt, M.

    2014-12-01

    We are investigating the paleoseismic history of northern Washington using offshore turbidite cores and lake sediments collected from forearc lakes along a transect from offshore to Seattle, Washington. Additional offshore cores, ash determinations and heavy mineral analysis flesh out the turbidite stratigraphy off northern Washington, and support 3-5 proximal turbidites in northern Washington canyons (see Adams, 1990) in addition to the 19 regionally correlated beds. Onshore, we have cored multiple lakes including (west to east) Beaver, Leland, Tarboo, Hall, Sawyer, and Wapato, east of the Cascades, and collected multibeam bathymetry, backscatter and chirp subbottom data. These lakes are small (2-113 ha), 6-18 m deep, and are all kettle lakes except Beaver Lake (landslide-dammed) and Wapato Lake, a glacial scour. These lakes were selected for their limited outside sediment sources and low sensitivity to ground shaking. The sedimentology is mostly organic-rich gyttja. All lakes contain the Mazama ash based on its similar depth occurrence in previously published cores and new EMP analysis. Computed Tomography (CT) density, gamma density, and magnetic susceptibility (ms) data show there is more stratigraphic variability than is visually apparent. Low-energy disturbance events are apparent in the stratigraphy of all lakes (except Hall) as increases in clastics, density, and ms. The number of post Mazama disturbance events is similar to the number of expected great earthquakes found offshore and onshore, though definition of the boundaries of the lake events is much less clear. Initial radiocarbon results and preliminary correlations along this 185 km transect show strong similarities in stratigraphic records between these cores over the past ~7600 years, anchored by the Mazama tephra. Preliminary comparisons with offshore cores show a striking similarity in downcore variability in physical properties. Given the evidence for earthquake origin for the offshore cores

  14. An additive-multiplicative rates model for recurrent event data with informative terminal event.

    Science.gov (United States)

    Sun, Liuquan; Kang, Fangyuan

    2013-01-01

    In this article, we propose an additive-multiplicative rates model for recurrent event data in the presence of a terminal event such as death. The association between recurrent and terminal events is nonparametric. For inference on the model parameters, estimating equation approaches are developed, and the asymptotic properties of the resulting estimators are established. The finite sample behavior of the proposed estimators is evaluated through simulation studies, and an application to a bladder cancer study is provided.
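
    One common way to write such an additive-multiplicative rates model (a sketch of the model class; the paper's exact specification may differ) puts an additive covariate effect and a multiplicative one on the marginal rate of the recurrent-event process $N^*(t)$:

```latex
E\{\mathrm{d}N^*(t)\mid Z_1(t),Z_2(t)\}
  = \left\{\beta_0^{\top}Z_1(t) + \lambda_0(t)\,\exp\!\left(\gamma_0^{\top}Z_2(t)\right)\right\}\mathrm{d}t,
```

    where $\lambda_0(t)$ is an unspecified baseline rate function and $(\beta_0,\gamma_0)$ are the regression parameters targeted by the estimating equations.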

  15. Continuous event recorders did not affect anxiety or quality of life in patients with palpitations

    NARCIS (Netherlands)

    Hoefman, Emmy; Boer, Kimberly R.; van Weert, Henk C. P. M.; Reitsma, Johannes B.; Koster, Rudolf W.; Bindels, Patrick J. P.

    2007-01-01

    OBJECTIVES: Palpitations can generate feelings of anxiety and decrease quality of life (QoL) due to fear of a cardiac abnormality. Continuous event recorders (CERs) have proven to be successful in diagnosing causes of palpitations but may affect patient QoL and anxiety. The aim is to determine

  16. Records of climatic changes and volcanic events in an ice core from ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 111; Issue 1. Records of climatic changes and volcanic events in an ice core from Central Dronning Maud Land (East Antarctica) during the past century. V N Nijampurkar D K Rao H B Clausen M K Kaul A Chaturvedi. Volume 111 Issue 1 March 2002 pp 39-49 ...

  17. Recording the LHCb data and software dependencies

    Science.gov (United States)

    Trisovic, Ana; Couturier, Ben; Gibson, Val; Jones, Chris

    2017-10-01

    In recent years, awareness of the importance of preserving the experimental data and scientific software at CERN has been rising. To support this effort, we present a novel approach to structuring the dependencies of the LHCb data and software to make them more accessible in the long-term future. In this paper, we detail the implementation of a graph database of these dependencies. We list the implications that can be deduced from graph mining (such as a search for legacy software), with emphasis on data preservation. Furthermore, we introduce a methodology for recreating the LHCb data, thus supporting reproducible research and data stewardship. Finally, we describe how this information is made available to users on a web portal that promotes data and analysis preservation and good practice with analysis documentation.
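
    A dependency graph of this kind can be mined for everything needed to recreate a given dataset with a simple reachability walk. The sketch below is a minimal illustration; the node names (application versions, dataset labels) are invented, not actual LHCb artefacts:

```python
# Hypothetical dependency edges: artefact -> what it was produced from.
deps = {
    "analysis-ntuple": ["reco-output", "analysis-app-v1"],
    "reco-output": ["raw-data", "reco-app-v2"],
}

def closure(node, graph):
    """Walk the graph to collect everything needed to recreate `node`."""
    seen, stack = set(), [node]
    while stack:
        for dep in graph.get(stack.pop(), []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

needed = closure("analysis-ntuple", deps)   # full provenance of the ntuple
```

    The same traversal, run in reverse, answers questions such as which derived datasets depend on a piece of legacy software.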

  18. ATLAS EventIndex Data Collection Supervisor and Web Interface

    CERN Document Server

    Garcia Montoro, Carlos; The ATLAS collaboration; Sanchez, Javier

    2016-01-01

    The EventIndex project consists of the development and deployment of a complete catalogue of events for the ATLAS experiment [1][2] at the LHC accelerator at CERN. In 2015 the ATLAS experiment produced 12 billion real events in 1 million files, and 5 billion simulated events in 8 million files. The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing them in a central Hadoop infrastructure. A subset of this information is copied to an Oracle relational database. This paper presents two components of the ATLAS EventIndex [3]: its data collection supervisor and its web interface.

  19. MARINER 10 IMAGING ARCHIVE EXPERIMENT DATA RECORD

    Data.gov (United States)

    National Aeronautics and Space Administration — This series of fifteen CDs was produced by JPL's Science Digital Data Preservation Task (SDDPT) by migrating the original Mariner Ten image EDRs from old,...

  20. Comparison of global ocean colour data records

    Directory of Open Access Journals (Sweden)

    S. Djavidnia

    2010-01-01

    The extending record of ocean colour derived information, an important asset for the study of marine ecosystems and biogeochemistry, presently relies on individual satellite missions launched by several space agencies with differences in sensor design, calibration strategies and algorithms. In this study we present an extensive comparative analysis of standard products obtained from operational global ocean colour sensors (SeaWiFS, MERIS, MODIS-Aqua, MODIS-Terra), on both global and regional scales. The analysis is based on monthly mean chlorophyll a (Chl-a) sea surface concentration between 2002 and 2009.

    Based on global statistics, the Chl-a records appear relatively consistent. The root mean square (RMS) difference Δ between (log-transformed) Chl-a from SeaWiFS and MODIS Aqua amounts to 0.137, with a bias of 0.074 (SeaWiFS Chl-a higher). The difference between these two products and MERIS Chl-a is approximately 0.15. Restricting the analysis to 2007 only, Δ between MODIS Aqua and Terra is 0.142. This global convergence is significantly modulated regionally. Statistics for biogeographic provinces, representing a partition of the global ocean, show Δ values varying between 0.08 and 0.3. High-latitude regions, as well as coastal and shelf provinces, are generally the areas with the largest differences. Moreover, RMS differences and biases are modulated in time, with a coefficient of variation of Δ varying between 10% and 40%, with clear seasonal patterns in some provinces.
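
    Statistics of this form (Δ as the RMS difference and the bias of log10-transformed Chl-a) can be computed for any pair of series with a few lines. The values below are made-up monthly means, not real sensor data:

```python
import math

def log_rms_and_bias(chl_a, chl_b):
    """RMS difference (Δ) and bias between log10-transformed Chl-a series."""
    diffs = [math.log10(a) - math.log10(b) for a, b in zip(chl_a, chl_b)]
    bias = sum(diffs) / len(diffs)                        # mean log difference
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return rms, bias

# made-up monthly mean concentrations (mg m^-3), NOT real sensor values
sensor_a = [0.12, 0.30, 0.25, 0.40]
sensor_b = [0.10, 0.28, 0.27, 0.35]
rms, bias = log_rms_and_bias(sensor_a, sensor_b)
```

    Working in log space is conventional for chlorophyll, whose concentrations are approximately log-normally distributed.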

    The comparison of the province-averaged time series obtained from the various satellite products also shows a level of agreement that is geographically variable. Overall, the Chl-a SeaWiFS and MODIS Aqua series appear to have similar levels of variance and display high correlation coefficients, an agreement likely favoured by the common elements shared by the two missions. These results are degraded if the MERIS

  1. NIMS EXPERIMENT DATA RECORDS: SL-9 COMET IMPACT WITH JUPITER

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  2. NIMS EXPERIMENT DATA RECORDS: GASPRA/IDA ENCOUNTERS

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  3. Expression and cut parser for CMS event data

    Science.gov (United States)

    Lista, Luca; Jones, Christopher D.; Petrucciani, Giovanni

    2010-04-01

    We present a parser to evaluate expressions and Boolean selections that is applied on CMS event data for event filtering and analysis purposes. The parser is based on Boost Spirit grammar definition, and uses Reflex dictionaries for class introspection. The parser allows for a natural definition of expressions and cuts in users' configurations, and provides good runtime performance compared to other existing parsers.
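
    The CMS parser itself is built on Boost Spirit and Reflex dictionaries; as a language-neutral sketch of the same idea (evaluating a cut string against an event's attributes), one might write the following in Python. The attribute names are illustrative, not the CMS data model:

```python
import ast

def passes_cut(event, cut):
    """Evaluate a boolean selection string over an event's attributes.
    Only the event's own fields (plus abs) are visible to the expression."""
    tree = ast.parse(cut, mode="eval")          # syntax check via the ast module
    names = dict(event)
    names["abs"] = abs
    # evaluate in a bare namespace so only event fields can be referenced
    return bool(eval(compile(tree, "<cut>", "eval"), {"__builtins__": {}}, names))

event = {"pt": 27.5, "eta": -1.3}               # hypothetical candidate
print(passes_cut(event, "pt > 20 and abs(eta) < 2.4"))   # prints True
```

    The appeal of this style, as in the CMS configurations, is that selections stay readable strings rather than compiled code.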

  4. Out-of-order event processing in kinetic data structures

    DEFF Research Database (Denmark)

    Abam, Mohammad; de Berg, Mark; Agrawal, Pankaj

    2011-01-01

    We study the problem of designing kinetic data structures (KDS’s for short) when event times cannot be computed exactly and events may be processed in a wrong order. In traditional KDS’s this can lead to major inconsistencies from which the KDS cannot recover. We present more robust KDS’s for the maintenance of several fundamental structures such as kinetic sorting and kinetic tournament trees, which overcome the difficulty by employing a refined event scheduling and processing technique. We prove that the new event scheduling mechanism leads to a KDS that is correct except for finitely many short...

  5. Between personal health record website and portable medical health record: an online data transformation interface.

    Science.gov (United States)

    Wu, Yi-Hua; Li, Yian-Zhi; Li, Yu-Chuan

    2009-01-01

    Web-based personal health records recently drew worldwide attention when two major vendors entered this field. The convenience and ubiquity of user-end data management strongly influence users' willingness to adopt such records. This study is based on the idea of easy data transfer from a portable device to a web-based personal health record; furthermore, this can promote the use of personal health records and help more people manage their own health.

  6. A novel method for inferring RFID tag reader recordings into clinical events.

    Science.gov (United States)

    Chang, Yung-Ting; Syed-Abdul, Shabbir; Tsai, Chung-You; Li, Yu-Chuan

    2011-12-01

    Nosocomial infections (NIs) are among the important indicators used for evaluating patients' safety and hospital performance during accreditation of hospitals. The NI rate is higher in Intensive Care Units (ICUs) than in the general wards because patients require intense care involving both invasive and non-invasive clinical procedures. The emergence of superbugs is motivating health providers to enhance infection control measures. Contact behavior between health caregivers and patients is one of the main causes of cross infections. In this technology-driven era, remote monitoring of patients and caregivers in the hospital setting can be performed reliably, and thus is in demand. Proximity sensing using radio frequency identification (RFID) technology can be helpful in capturing and keeping track of all contact history between health caregivers and patients, for example. This study intended to extend the use of proximity sensing with RFID technology by proposing a model for inferring RFID tag reader recordings into clinical events. The aims of the study are twofold. The first aim is to set up a Contact History Inferential Model (CHIM) between health caregivers and patients. The second is to verify CHIM with real-time observation in the ICU ward. A pre-study was conducted followed by two study phases. During the pre-study, proximity sensing with RFID was tested and the RFID system was deployed in the Clinical Skill Center of one of the medical centers in Taiwan. We simulated clinical events and developed CHIM using variables such as duration of time, frequency, and identity (tag) numbers assigned to caregivers. All clinical proximity events are classified into close-in events, contact events and invasive events. During the first phase three observers were recruited to make real-time recordings of all clinical events in the Clinical Skill Center with the deployed automated RFID interaction recording system. The observations were used to verify

  7. An underestimated record breaking event – why summer 1540 was likely warmer than 2003

    Directory of Open Access Journals (Sweden)

    O. Wetter

    2013-01-01

    The heat of summer 2003 in Western and Central Europe was claimed to be unprecedented since the Middle Ages on the basis of grape harvest data (GHD) and late wood maximum density (MXD) data from trees in the Alps. This paper shows that the authors of these studies overlooked the fact that the heat and drought in Switzerland in 1540 likely exceeded the amplitude of the previous hottest summer of 2003, because the persistent temperature and precipitation anomaly in that year, described in an abundant and coherent body of documentary evidence, severely affected the reliability of GHD and tree-rings as proxy-indicators for temperature estimates. Spring–summer (AMJJ) temperature anomalies of 4.7 °C to 6.8 °C being significantly higher than in 2003 were assessed for 1540 from a new long Swiss GHD series (1444 to 2011). During the climax of the heat wave in early August the grapes desiccated on the vine, which caused many vine-growers to interrupt or postpone the harvest despite full grape maturity until after the next spell of rain. Likewise, the leaves of many trees withered and fell to the ground under extreme drought stress as would usually be expected in late autumn. It remains to be determined by further research whether and how far this result obtained from local analyses can be spatially extrapolated. Based on the temperature estimates for Switzerland it is assumed from a great number of coherent qualitative documentary evidence about the outstanding heat drought in 1540 that AMJJ temperatures were likely more extreme in neighbouring regions of Western and Central Europe than in 2003. Considering the significance of soil moisture deficits for record breaking heat waves, these results still need to be validated with estimated seasonal precipitation. It is concluded that biological proxy data may not properly reveal record breaking heat and drought events. Such assessments thus need to be complemented with the critical study of contemporary

  8. An underestimated record breaking event - why summer 1540 was likely warmer than 2003

    Science.gov (United States)

    Wetter, O.; Pfister, C.

    2013-01-01

    The heat of summer 2003 in Western and Central Europe was claimed to be unprecedented since the Middle Ages on the basis of grape harvest data (GHD) and late wood maximum density (MXD) data from trees in the Alps. This paper shows that the authors of these studies overlooked the fact that the heat and drought in Switzerland in 1540 likely exceeded the amplitude of the previous hottest summer of 2003, because the persistent temperature and precipitation anomaly in that year, described in an abundant and coherent body of documentary evidence, severely affected the reliability of GHD and tree-rings as proxy-indicators for temperature estimates. Spring-summer (AMJJ) temperature anomalies of 4.7 °C to 6.8 °C being significantly higher than in 2003 were assessed for 1540 from a new long Swiss GHD series (1444 to 2011). During the climax of the heat wave in early August the grapes desiccated on the vine, which caused many vine-growers to interrupt or postpone the harvest despite full grape maturity until after the next spell of rain. Likewise, the leaves of many trees withered and fell to the ground under extreme drought stress as would usually be expected in late autumn. It remains to be determined by further research whether and how far this result obtained from local analyses can be spatially extrapolated. Based on the temperature estimates for Switzerland it is assumed from a great number of coherent qualitative documentary evidence about the outstanding heat drought in 1540 that AMJJ temperatures were likely more extreme in neighbouring regions of Western and Central Europe than in 2003. Considering the significance of soil moisture deficits for record breaking heat waves, these results still need to be validated with estimated seasonal precipitation. It is concluded that biological proxy data may not properly reveal record breaking heat and drought events. Such assessments thus need to be complemented with the critical study of contemporary evidence from

  9. A joint renewal process used to model event based data

    National Research Council Canada - National Science Library

    Mergenthaler, Wolfgang; Jaroszewski, Daniel; Feller, Sebastian; Laumann, Larissa

    2016-01-01

    .... Event data, herein defined as a collection of triples containing a time stamp, a failure code and possibly a descriptive text, can best be evaluated by using the paradigm of joint renewal processes...

  10. Diving into Data: Planning a Research Data Management Event

    National Research Council Canada - National Science Library

    Reed, Robyn

    2015-01-01

    The George T. Harrell Health Sciences Library at Penn State Hershey initiated its participation in institutional research data management activities by coordinating and hosting a well-attended data management symposium...

  11. Diamond Morphology: Link to Metasomatic Events in the Mantle or Record of Evolution of Kimberlitic Fluid?

    Science.gov (United States)

    Fedortchouk, Y.

    2009-05-01

    Morphology and surface features on diamonds show tremendous variation even within a single kimberlite body reflecting a complex history of growth and dissolution. But does the diamond surface record the conditions in the several mantle sources sampled by the rising kimberlite magma, or evolution of the fluid system in the kimberlite magma itself? To address this question I revised the morphological classification of diamonds from several kimberlite pipes from EKATI Mine property, N.W.T., Canada. The novelty of the approach, compared to the existing classifications, is in utilizing a random but large dataset of diamond dissolution experiments accumulated by several researchers including myself. These experiments have shown that similar forms (e.g. trigon etch pits) can be produced in a variety of conditions and environments, whereas their shape and size would depend on the reactant. Similarly, different types of resorption features always form together and can be used for deriving the composition of oxidizing fluid. The proposed classification method is focused on relating various types of diamond surfaces to the composition and conditions of oxidizing media. The study uses parcels of micro- and macro-diamonds (total of 125 carats) from Misery, Grizzly, Leslie and Koala kimberlites, EKATI Mine property, Northwest Territories, Canada. Only octahedron and hexoctahedron diamonds were selected (total ~600 stones). Diamond surfaces were studied using an optical and Field-Emission Scanning Electron Microscope to define resorption elements - simple surface features. These elements were identified for each of the three categories: 1) present on octahedral faces (well-preserved diamonds), 2) present on hexoctahedral faces (rounded resorbed diamonds), and 3) frosting (micro-features). Consistent associations of several elements define Resorption Types of diamonds, which form during a single oxidizing event. 
We further relate these types to the composition of the C-H-O + chlorides

  12. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases

  13. Association Patterns in Open Data to Explore Ciprofloxacin Adverse Events.

    Science.gov (United States)

    Yildirim, P

    2015-01-01

    Ciprofloxacin is one of the main drugs to treat bacterial infections. Bacterial infections can lead to high morbidity, mortality, and costs of treatment in the world. In this study, an analysis was conducted using the U.S. Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) database on the adverse events of ciprofloxacin. The aim of this study was to explore unknown associations among the adverse events of ciprofloxacin, patient demographics and adverse event outcomes. A search of FDA AERS reports was performed and some statistics were highlighted. The most frequent adverse events and event outcomes of ciprofloxacin were listed, the age- and gender-specific distribution of adverse events was reported, then the Apriori algorithm was applied to the dataset to obtain some association rules, and objective measures were used to select interesting ones. Furthermore, the results were compared against classical data mining algorithms and discussed. The search resulted in 6 531 reports. The reports included within the dataset consist of 3 585 (55.8%) female and 2 884 (44.1%) male patients. The mean age of patients is 54.59 years. The preschool child, middle-aged and aged groups have the most adverse event reports of all groups. Pyrexia has the highest frequency with ciprofloxacin, followed by pain, diarrhoea, and anxiety in this order, and the most frequent adverse event outcome is hospitalization. Age- and gender-based differences in the events in patients were found. In addition, some of the interesting associations obtained from the Apriori algorithm include not only psychiatric disorders but specifically their manifestation in specific gender groups. The FDA AERS offers an important data resource to identify new or unknown adverse events of drugs in the biomedical domain. The results that were obtained in this study can provide valuable information for medical researchers and decision makers in the pharmaceutical research field.
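
    The Apriori procedure referred to above can be sketched in a few lines: count itemsets level by level, keeping only those that meet the minimum support. The item names below are illustrative, not taken from the FDA AERS data:

```python
def apriori(transactions, min_support):
    """Level-wise search for all itemsets whose absolute support
    (number of reports containing the itemset) meets min_support."""
    singletons = {frozenset([item]) for t in transactions for item in t}
    level = {s for s in singletons
             if sum(s <= t for t in transactions) >= min_support}
    frequent = {}
    k = 1
    while level:
        for s in level:
            frequent[s] = sum(s <= t for t in transactions)
        # join step: merge frequent k-itemsets into (k+1)-candidates
        candidates = {a | b for a in level for b in level if len(a | b) == k + 1}
        level = {c for c in candidates
                 if sum(c <= t for t in transactions) >= min_support}
        k += 1
    return frequent

# toy adverse-event "reports" (item names illustrative, not AERS data)
reports = [frozenset(r) for r in (
    {"pyrexia", "pain"}, {"pyrexia", "diarrhoea"},
    {"pyrexia", "pain", "anxiety"}, {"pain"})]
frequent = apriori(reports, min_support=2)
```

    Association rules are then read off the frequent itemsets and ranked by objective measures such as confidence or lift.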

  14. Tablet computers for recording tuberculosis data at a community ...

    African Journals Online (AJOL)

    Don O’Mahony

    2014-08-20

    ... those that may inform the design of an electronic health record. The aims of this study were: • Phase 1: To describe the process of identifying and developing a tablet computer programme to capture data. • Phase 2: Qualitative evaluation of the use of tablet computers to record data at a rural CHC. Method.

  15. Digital voice recording: An efficient alternative for data collection

    Science.gov (United States)

    Mark A. Rumble; Thomas M. Juntti; Thomas W. Bonnot; Joshua J. Millspaugh

    2009-01-01

    Study designs are usually constrained by logistical and budgetary considerations that can affect the depth and breadth of the research. Little attention has been paid to increasing the efficiency of data recording. Digital voice recording and translation may offer improved efficiency of field personnel. Using this technology, we increased our data collection by 55...

  16. Constructing Data Albums for Significant Severe Weather Events

    Science.gov (United States)

    Greene, Ethan; Zavodsky, Bradley; Ramachandran, Rahul; Kulkarni, Ajinkya; Li, Xiang; Bakare, Rohan; Basyal, Sabin; Conover, Helen

    2014-01-01

    Data Albums provide a one-stop shop combining datasets from NASA, the NWS, online news sources, and social media. Data Albums will help meteorologists better understand severe weather events to improve predictive models. We developed a new ontology for severe weather based on the current hurricane Data Album and selected relevant NASA datasets for inclusion.

  17. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  18. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  19. Gravity Data for Indiana-over 10,000 records

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (10,629 records) were compiled by Purdue University. This data base was received in December 1989. Principal gravity parameters include Free-air...

  20. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC System (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer, and DMA transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events, and the results of various checks are appended to the event. In case of spurious triggers or clear beam-gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  1. New ATLAS event generator tunes to 2010 data

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

    This note describes the Monte Carlo event generator tunings for the Pythia 6 and Herwig/Jimmy generators in the ATLAS MC11 simulation production. New tunes have been produced for these generators, making maximal use of available published data from ATLAS and from the Tevatron and LEP experiments. Particular emphasis has been placed on improvement of the description of e+ e− event shape and jet rate data, and on description of hadron collider event shape observables in Pythia, as well as the established procedure of tuning the multiple parton interactions of both models to describe underlying event and minimum bias data. The tuning of Pythia is provided at this time for the MRST LO∗∗ PDF, while the purely MPI tune of Herwig/Jimmy is performed for ten different PDFs.

  2. Hardware adaptation layer for MPEG video recording on a helical scan-based digital data recorder

    Science.gov (United States)

    de Ridder, Ad C.; Kindt, S.; Frimout, Emmanuel D.; Biemond, Jan; Lagendijk, Reginald L.

    1996-03-01

    The forthcoming introduction of helical scan digital data tape recorders with high access bandwidth and large capacity will facilitate the recording and retrieval of a wide variety of multimedia information from different sources, such as computer data and digital audio and video. For the compression of digital audio and video, the MPEG standard has been internationally accepted. Although helical scan tape recorders can store and play back MPEG compressed signals transparently, they are not well suited for carrying out special playback modes, in particular fast forward and fast reverse. Only random portions of an original MPEG bitstream are recovered on fast playback. Unfortunately, these shreds of information cannot be interpreted by a standard MPEG decoder, due to loss of synchronization and missing reference pictures. In the EC-sponsored RACE project DART (Digital Data Recorder Terminal), the possibilities for recording and fast playback of MPEG video on a helical scan recorder have been investigated. In the approach we present in this paper, we assume that no transcoding is carried out on the incoming bitstream at recording time, and that no additional information is recorded. To use the shreds of information for the reconstruction of interpretable pictures, a bitstream validator has been developed to achieve conformance to the MPEG-2 syntax during fast playback. The concept has been validated by realizing hardware demonstrators that connect to a prototype helical scan digital data tape recorder.

  3. Cosmic ray event in 994 C.E. recorded in radiocarbon from Danish oak

    Science.gov (United States)

    Fogtmann-Schulz, A.; Østbø, S. M.; Nielsen, S. G. B.; Olsen, J.; Karoff, C.; Knudsen, M. F.

    2017-08-01

    We present measurements of radiocarbon in annual tree rings from the time period 980-1006 Common Era (C.E.), hereby covering the cosmic ray event in 994 C.E. The new radiocarbon record from Danish oak is based on both earlywood and latewood fractions of the tree rings, which makes it possible to study seasonal variations in 14C production. The measurements show a rapid increase of ˜10‰ from 993 to 994 C.E. in latewood, followed by a modest decline and relatively high values over the ensuing ˜10 years. This rapid increase occurs from 994 to 995 C.E. in earlywood, suggesting that the cosmic ray event most likely occurred during the period between April and June 994 C.E. Our new record from Danish oak shows strong agreement with existing Δ14C records from Japan, thus supporting the hypothesis that the 994 C.E. cosmic ray event was uniform throughout the Northern Hemisphere and therefore can be used as an astrochronological tie point to anchor floating chronologies of ancient history.

  4. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advance of sensor and RFID technology provides significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carrying implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reduces the cost of RFID data integration.
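    A temporal, location-oriented RFID model of the kind described reduces raw tag reads to stay records with time intervals, over which temporal queries run. A minimal illustrative sketch (the `Stay` layout and `location_at` query are assumptions for illustration, not the paper's actual schema):

```python
from dataclasses import dataclass

@dataclass
class Stay:
    tag_id: str      # unique RFID tag identifier
    location: str    # reader/zone that observed the tag
    t_start: float   # first observation time of the stay
    t_end: float     # last observation time of the stay

def location_at(stays, tag_id, t):
    """Temporal query: where was the object with tag_id at time t?"""
    for s in stays:
        if s.tag_id == tag_id and s.t_start <= t <= s.t_end:
            return s.location
    return None

# Hypothetical track of one tagged object through two zones
stays = [
    Stay("tag42", "dock", 0.0, 10.0),
    Stay("tag42", "warehouse", 10.0, 50.0),
]
```

    Higher-level complex-event rules (e.g. "alert if an object leaves the warehouse without passing the dock") can then be expressed as conditions over such interval records.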

  5. Discovering Anomalous Aviation Safety Events Using Scalable Data Mining Algorithms

    Data.gov (United States)

    National Aeronautics and Space Administration — The worldwide civilian aviation system is one of the most complex dynamical systems created. Most modern commercial aircraft have onboard flight data recorders that...

  6. Predicting mining collapse: Superjerks and the appearance of record-breaking events in coal as collapse precursors

    Science.gov (United States)

    Jiang, Xiang; Liu, Hanlong; Main, Ian G.; Salje, Ekhard K. H.

    2017-08-01

    The quest for predictive indicators for the collapse of coal mines has led to a robust criterion from scale-model tests in the laboratory. Mechanical collapse under uniaxial stress forms avalanches with a power-law probability distribution function of radiated energy, P(E) ~ E^(-ε), with exponent ε = 1.5. Impending major collapse is preceded by a reduction of the energy exponent to the mean-field value ε = 1.32. Concurrently, the crackling noise increases in intensity and the waiting time between avalanches is reduced as the major collapse approaches. These latter criteria were so far deemed too unreliable for safety assessments in coal mines. We report a reassessment of previously collected extensive collapse data sets using "record-breaking analysis," based on the statistical appearance of "superjerks" within a smaller spectrum of collapse events. Superjerks are defined as avalanche signals with energies that surpass those of all previous events. The final major collapse is one such superjerk, but other "near collapse" events equally qualify. In this way a very large data set of events is reduced to a sparse sequence of superjerks (21 in our coal sample). The main collapse can be anticipated from the sequence of energies and waiting times of superjerks, ignoring all weaker events. Superjerks are excellent indicators for the temporal evolution, and reveal clear nonstationarity of the crackling noise at constant loading rate, as well as self-similarity in the energy distribution of superjerks as a function of the number of events so far in the sequence, E_sj ~ n^δ, with δ = 1.79. They are less robust in identifying the precise time of the final collapse, however, than the shift of the energy exponents in the whole data set, which occurs only over a short time interval just before the major event. Nevertheless, they provide additional diagnostics that may increase the reliability of such forecasts.
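    The record-breaking reduction described above, keeping only avalanches whose energy exceeds that of every earlier event, is a simple one-pass scan. A minimal sketch (the energy sequence is arbitrary illustrative data, not from the coal experiments):

```python
def superjerks(energies):
    """Return (index, energy) pairs for record-breaking events:
    avalanches whose energy exceeds that of every earlier event."""
    records, best = [], float("-inf")
    for i, e in enumerate(energies):
        if e > best:
            records.append((i, e))
            best = e
    return records

# Hypothetical avalanche energy sequence (arbitrary units)
seq = [1.0, 0.5, 3.0, 2.0, 2.9, 7.0, 4.0]
# superjerks(seq) -> [(0, 1.0), (2, 3.0), (5, 7.0)]
```

    The waiting times between successive superjerks, and the growth of their energies with event count, are then the diagnostics analyzed in the paper.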

  7. An Efficient Decoder for the Recognition of Event-Related Potentials in High-Density MEG Recordings

    Directory of Open Access Journals (Sweden)

    Christoph Reichert

    2016-04-01

    Brain–computer interfacing (BCI) is a promising technique for regaining communication and control in severely paralyzed people. Many BCI implementations are based on the recognition of task-specific event-related potentials (ERPs) such as P300 responses. However, because of the low signal-to-noise ratio in noninvasive brain recordings, reliable detection of single-trial ERPs is challenging. Furthermore, the relevant signal is often heterogeneously distributed over several channels. In this paper, we introduce a new approach for recognizing a sequence of attended events from multi-channel brain recordings. The framework utilizes spatial filtering to reduce both noise and signal space considerably. We introduce different models that can be used to construct the spatial filter and evaluate the approach using magnetoencephalography (MEG) data involving P300 responses, recorded during a BCI experiment. Compared to the accuracy achieved in the BCI experiment performed without spatial filtering, the recognition rate increased significantly, to up to 95.3% on average (SD: 5.3%). In combination with the data-driven spatial filter construction we introduce here, our framework represents a powerful method to reliably recognize a sequence of brain potentials from high-density electrophysiological data, which could greatly improve the control of BCIs.

  8. PMU Data Event Detection: A User Guide for Power Engineers

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.; Singh, M.; Muljadi, E.; Santoso, S.

    2014-10-01

    This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.
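    As a toy illustration of the kind of detection such scripts perform, the simplest PMU event detector flags samples whose measured frequency deviates from nominal by more than a threshold. This is a pure-Python stand-in, not the package's MATLAB algorithm, and the nominal frequency and threshold below are illustrative values:

```python
def detect_events(freq_hz, nominal=60.0, threshold=0.02):
    """Return sample indices where the PMU frequency measurement
    deviates from nominal by more than `threshold` Hz."""
    return [i for i, f in enumerate(freq_hz) if abs(f - nominal) > threshold]

# Hypothetical frequency samples from one PMU channel
samples = [60.00, 60.01, 59.95, 59.90, 60.00]
# indices 2 and 3 exceed the 0.02 Hz deviation threshold
```

    Real detectors in such packages typically also examine rates of change and voltage/angle channels, but the thresholding idea is the same.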

  9. Earth Science Data Fusion with Event Building Approach

    Science.gov (United States)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing, and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a work flow with minimized IO), and on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab and integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for the NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  10. Persistent ATLAS Data Structures and Reclustering of Event Data

    CERN Document Server

    Schaller, Martin

    1999-01-01

    The ATLAS experiment will start to take data in the year 2005. The amount of experimental data forms a serious challenge for data processing and data storage. About 1 PB (10^15 bytes) per year has to be processed and stored. Currently, a paradigm shift in High-Energy Physics (HEP) computing is taking place. It is planned that software is written in object-oriented languages (mainly C++). For data storage the usage of object-oriented database management systems (ODBMSs) is foreseen. This thesis investigates the usage of an ODBMS in the ATLAS experiment. Work was done in several connected areas. First, we present exhaustive benchmarks of the commercial ODBMS Objectivity/DB that is today the most promising candidate for the storage system. We describe the ATLAS 1 TB milestone that was performed to investigate the reliability and performance of an ODBMS storage solution coupled to a mass storage system. Second, we report about the design and implementation of the persistent ATLAS data structures, both in the detec...

  11. The Early Toarcian Oceanic Anoxic Event: A Southern Hemisphere record from Chile

    Science.gov (United States)

    Fantasia, Alicia; Föllmi, Karl B.; Adatte, Thierry; Spangenberg, Jorge E.; Bernárdez, Enrique; Mattioli, Emanuela

    2016-04-01

    The Early Toarcian was marked by important environmental changes, marine oxygen deficiency and extensive organic-rich sediment deposition (T-OAE; ˜182 Ma, Early Jurassic). The T-OAE coincides with a marked negative carbon isotope excursion (CIE) recorded in marine carbonate, and marine and terrestrial organic carbon. This is commonly attributed to the massive release of isotopically light carbon to the atmospheric and oceanic reservoirs derived from the destabilization of methane hydrates from marine sediments and/or the emissions of thermogenic methane from the eruption of the Karoo-Ferrar LIP (e.g., Hesselbo et al., 2000; Kemp et al., 2005; Svensen et al., 2007; Mazzini et al., 2010). Moreover, in most documented marine sections, this episode is marked by a generalized crisis in carbonate production and marine invertebrate extinctions (e.g. Jenkyns, 1988; Röhl et al., 2005; Suan et al., 2001). Several studies of the T-OAE have been conducted on sediments in central and northwest Europe, but only a few data are available from the Southern Hemisphere, leading to large uncertainty concerning the exact expression of this event in this part of the world. The aims of this study are to characterize the sediments deposited during the Andean equivalents of the tenuicostatum and falciferum European Zones and to establish in which way the T-OAE affected this region. In the Early Jurassic, the Andean basin was in a back-arc setting with marine corridors connected to Panthalassa. In this study, we have generated new high-resolution sedimentological, geochemical and mineralogical data from the sections of El Peñon and Quebrada Asiento, located in Chile in the northeastern area of the city of Copiapó, Atacama region. The biostratigraphy of these sections has been studied by von Hillebrandt and Schmidt-Effing (1981) and complemented here by a biostratigraphy based on calcareous nannofossils. The sections consist of a succession of marl, limestone and siltstone of Pliensbachian and

  12. Sedimentary record of Tropical Cyclone Pam from Vanuatu: implications for long-term event records in the tropical South Pacific

    Science.gov (United States)

    Pilarczyk, Jessica; Kosciuch, Thomas; Hong, Isabel; Fritz, Hermann; Horton, Benjamin; Wallace, Davin; Dike, Clayton; Rarai, Allan; Harrison, Morris; Jockley, Fred

    2017-04-01

    Vanuatu has a history of tropical cyclones impacting its coastlines, including Tropical Cyclone (TC) Pam, a rare Category 5 event that made landfall in March 2015. Reliable records of tropical cyclones impacting Vanuatu are limited to the last several decades, with only fragmentary evidence of events extending as far back as the 1890s. Geological investigations are a means for expanding the short historical record of tropical cyclones by hundreds to thousands of years, permitting the study of even the rare but intense events. However, geological records of past tropical cyclones are limited in their ability to quantify the intensity of past events. Modern analogues of landfalling tropical cyclones present an opportunity to characterize overwash sediments deposited by a storm of known intensity. In this study, we document the sedimentological and micropaleontological characteristics of sediments deposited by TC Pam in order to assess sediment provenance associated with a landfalling Category 5 storm. Within three months of TC Pam making landfall on Vanuatu, we surveyed high-water marks associated with the storm surge and documented the foraminiferal assemblages and grain size distributions contained within the overwash sediments from Manuro (mixed-carbonate site on Efate Island) and Port Resolution Bay (volcaniclastic site on Tanna Island). The combined use of foraminiferal taxonomy and taphonomy (surface condition of foraminifera) was most useful in distinguishing the TC Pam sediments from the underlying layer. TC Pam sediments were characterized by an influx of calcareous marine foraminifera that were dominantly unaltered relative to those that were abraded and fragmented. Similar to studies that use mollusk taphonomy to identify overwash deposits, we found that TC Pam sediments were associated with an influx of angular fragments that were broken during transport by the storm surge. A statistical comparison of foraminifera from six modern environments on Efate

  13. A study of data representation in Hadoop to optimize data storage and search performance for the ATLAS EventIndex

    Science.gov (United States)

    Baranowski, Z.; Canali, L.; Toebbicke, R.; Hrivnac, J.; Barberis, D.

    2017-10-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each of which consists of ∼100 bytes, all having the same probability to be searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We also report on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface, and the optimizations for data warehouse workloads and reports.
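    Fixed-size event records of ∼100 bytes lend themselves to compact fixed-width binary encodings, which is part of why format choice matters so much here. As a hedged illustration (the field layout below is hypothetical, not the actual EventIndex schema), such a record could be packed like this:

```python
import struct

# Hypothetical fixed-width layout for a ~100-byte event record:
# run number (u32), event number (u64), luminosity block (u32),
# then an opaque 84-byte payload; little-endian, no padding.
RECORD = struct.Struct("<IQI84s")  # 4 + 8 + 4 + 84 = 100 bytes

def pack_event(run, event, lumi, payload=b""):
    """Serialize one event record to exactly 100 bytes."""
    return RECORD.pack(run, event, lumi, payload.ljust(84, b"\x00"))

rec = pack_event(358031, 1234567890123, 42)
run, event, lumi, _ = RECORD.unpack(rec)
```

    Columnar formats such as Parquet go further by storing each field contiguously across records, which is what makes per-field scans and compression efficient at the tens-of-billions scale discussed above.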

  14. A study of data representation in Hadoop to optimise data storage and search performance for the ATLAS EventIndex

    CERN Document Server

    AUTHOR|(CDS)2078799; The ATLAS collaboration; Canali, Luca; Toebbicke, Rainer; Hrivnac, Julius; Barberis, Dario

    2017-01-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each of which consists of ∼100 bytes, all having the same probability to be searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We also report on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface, and the optimizations for data warehouse workloads and reports.

  15. A study of data representations in Hadoop to optimize data storage and search performance of the ATLAS EventIndex

    CERN Document Server

    Baranowski, Zbigniew; The ATLAS collaboration

    2016-01-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each consisting of ~100 bytes, all having the same probability to be searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. This paper reports on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface, and the optimizations for data warehouse workloads and reports.

  16. Expression and cut parser for CMS event data

    Energy Technology Data Exchange (ETDEWEB)

    Lista, Luca [INFN Sezione di Napoli, Complesso Universitario di Monte Sant' Angelo, via Cintia, I-80126, Naples (Italy); Jones, Christopher D [Fermi National Accelerator Laboratory, PO Box 500, Batavia, IL 60510-5011 (United States); Petrucciani, Giovanni, E-mail: luca.lista@na.infn.i [Scuola Normale Superiore di Pisa, Piazza dei Cavalieri 7, I-56126 Pisa (Italy)

    2010-04-01

    We present a parser to evaluate expressions and Boolean selections that is applied to CMS event data for event filtering and analysis purposes. The parser is based on a Boost Spirit grammar definition and uses Reflex dictionaries for class introspection. The parser allows for a natural definition of expressions and cuts in users' configurations, and provides good runtime performance compared to other existing parsers.
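    The CMS parser itself is built on Boost Spirit in C++. Purely to illustrate the idea of a restricted evaluator for cut strings over named event quantities, a Python sketch using the standard `ast` module might look like this (the cut string and the `muon` quantities are invented examples):

```python
import ast
import operator

# Comparison operators the cut language is allowed to use
ALLOWED_OPS = {ast.Gt: operator.gt, ast.Lt: operator.lt,
               ast.GtE: operator.ge, ast.LtE: operator.le,
               ast.Eq: operator.eq, ast.NotEq: operator.ne}

def eval_cut(expr, obj):
    """Evaluate a Boolean cut string like 'pt > 15 and abs(eta) < 2.4'
    against a dict of per-object quantities, allowing only a small
    whitelist of syntax (no arbitrary code execution)."""
    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.BoolOp):
            vals = (ev(v) for v in node.values)
            return all(vals) if isinstance(node.op, ast.And) else any(vals)
        if isinstance(node, ast.Compare):
            left = ev(node.left)
            for op, comp in zip(node.ops, node.comparators):
                right = ev(comp)
                if not ALLOWED_OPS[type(op)](left, right):
                    return False
                left = right
            return True
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name) \
                and node.func.id == "abs":
            return abs(ev(node.args[0]))
        if isinstance(node, ast.Name):
            return obj[node.id]   # look up a quantity by name
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -ev(node.operand)
        raise ValueError(f"disallowed syntax: {ast.dump(node)}")
    return ev(ast.parse(expr, mode="eval"))

muon = {"pt": 22.5, "eta": -1.1}
# eval_cut("pt > 15 and abs(eta) < 2.4", muon) -> True
```

    The real parser additionally resolves names through Reflex dictionaries, so cuts can reference members of arbitrary C++ event classes rather than a flat dict.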

  17. [Assessing the economic impact of adverse events in Spanish hospitals by using administrative data].

    Science.gov (United States)

    Allué, Natalia; Chiarello, Pietro; Bernal Delgado, Enrique; Castells, Xavier; Giraldo, Priscila; Martínez, Natalia; Sarsanedas, Eugenia; Cots, Francesc

    2014-01-01

    To evaluate the incidence and costs of adverse events registered in an administrative dataset in Spanish hospitals from 2008 to 2010. A retrospective study was carried out that estimated the incremental cost per episode, depending on the presence of adverse events. Costs were obtained from the database of the Spanish Network of Hospital Costs. This database contains data from 12 hospitals that keep cost-per-patient records based on activities and clinical records. Adverse events were identified through the Patient Safety Indicators (validated in the Spanish Health System) created by the Agency for Healthcare Research and Quality, together with indicators of the EuroDRG European project. This study included 245,320 episodes with a total cost of €1,308,791,871. Approximately 17,000 patients (6.8%) experienced an adverse event, representing 16.2% of the total cost. Adverse events, adjusted by diagnosis-related groups, added a mean incremental cost of between €5,260 and €11,905. Six of the 10 adverse events with the highest incremental cost were related to surgical interventions. The total incremental cost of adverse events was €88,268,906, amounting to an additional 6.7% of total health expenditure. Assessment of the impact of adverse events revealed that these episodes represent significant costs that could be reduced by improving the quality and safety of the Spanish Health System.

  18. Messinian Salinity Crisis - DREAM (Deep-sea Record of Mediterranean Messinian events) drilling projects

    Science.gov (United States)

    Lofi, Johanna; Camerlenghi, Angelo

    2014-05-01

    About 6 My ago the Mediterranean Sea was transformed into a giant saline basin. This event, commonly referred to as the Messinian salinity crisis (MSC), changed the chemistry of the global ocean and had a permanent impact on both the terrestrial and marine ecosystems of a huge area surrounding the Mediterranean. The first fascinating MSC scenario was proposed following DSDP Leg XIII in 1970 and envisaged an almost desiccated deep Mediterranean basin with a dramatic ~1,500 m drop of sea level, the incision of deep canyons by rivers on the continental margins, and a final catastrophic flooding event when the connections between the Mediterranean Sea and the Atlantic were re-established ~5.33 My ago. In spite of 40 years of multi-disciplinary research conducted on the MSC, its modalities, timing, causes, chronology and consequences at local and planetary scales are still not fully understood, and the MSC event remains one of the longest-living controversies in Earth Science. A key factor in the controversy is the lack of a complete record of the MSC preserved in the deepest Mediterranean basins. Anywhere else, the MSC mostly generated a sedimentary/time lag corresponding to a widespread erosion surface. Correlations with the offshore depositional units are thus complex, preventing the construction of a coherent scenario linking the outcropping MSC evaporites, the erosion on the margins, and the deposition of clastics and evaporites in the abyssal plains. Recent activity by various research groups to identify locations for multiple-site drilling (including riser drilling) in the Mediterranean Sea that would contribute to solving the open questions still existing about the MSC has culminated in two DREAM Magellan+ Workshops held in 2013 and 2014. A strategy and work plan have been established in order to submit an IODP Multi-phase Drilling Project ("Uncovering A Salt Giant") including several site-specific drilling proposals addressing different scientific

  19. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    Science.gov (United States)

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
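    The first step the abstract describes, converting a numeric time series into time-interval sequences of temporal abstractions, can be sketched simply. This is an illustrative value-based abstraction only (the creatinine series and the low/high cutoffs below are invented), not the paper's full Recent Temporal Pattern mining method:

```python
def abstract_intervals(times, values, low, high):
    """Convert a numeric series into (state, t_start, t_end) intervals
    using a simple value abstraction: low / normal / high.
    Consecutive samples in the same state merge into one interval."""
    def state(v):
        return "low" if v < low else "high" if v > high else "normal"
    intervals = []
    for t, v in zip(times, values):
        s = state(v)
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], t)  # extend current interval
        else:
            intervals.append((s, t, t))
    return intervals

# Hypothetical creatinine series (mg/dL) with an assumed 0.6-1.2 normal range
times = [0, 1, 2, 3, 4]
vals = [0.9, 1.0, 1.5, 1.7, 1.1]
ivs = abstract_intervals(times, vals, 0.6, 1.2)
```

    Temporal operators (before, overlaps, co-occurs) are then applied backward in time over such intervals to build the candidate patterns that the mining step scores for predictiveness.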

  20. Using mobile phone data records to determine criminal activity space

    CSIR Research Space (South Africa)

    Schmitz, Peter MU

    2007-09-01

    ... to anchor points than property crimes. Using call data records (CDRs) or active tracking data makes it possible to determine the activity space of an individual using cellular data. The data alone are not sufficient and need to be supported by local knowledge...

  1. Improving medical record retrieval for validation studies in Medicare data.

    Science.gov (United States)

    Wright, Nicole C; Delzell, Elizabeth S; Smith, Wilson K; Xue, Fei; Auroa, Tarun; Curtis, Jeffrey R

    2017-04-01

    The purpose of the study is to describe medical record retrieval for a study validating claims-based algorithms used to identify seven adverse events of special interest (AESI) in a Medicare population. We analyzed 2010-2011 Medicare claims of women with postmenopausal osteoporosis and men ≥65 years of age in the Medicare 5% national sample. The final cohorts included beneficiaries covered continuously for 12+ months by Medicare parts A, B, and D and not enrolled in Medicare Advantage before starting follow-up. We identified beneficiaries using each AESI algorithm and randomly selected 400 women and 100 men with each AESI for medical record retrieval. The Centers for Medicare and Medicaid Services provided beneficiary contact information, and we requested medical records directly from providers, without patient contact. We selected 3331 beneficiaries (women: 2272; men: 559) for whom we requested 3625 medical records. Overall, we received 1738 [47.9% (95%CI 46.3%, 49.6%)] of the requested medical records. We observed small differences in the characteristics of the total population with AESIs compared with those randomly selected for retrieval; however, no differences were seen between those selected and those retrieved. We retrieved 54.7% of records requested from hospitals compared with 26.3% of records requested from physician offices (p ...).

  2. Record transfer of data between CERN and California

    CERN Document Server

    2003-01-01

    A data transfer record has been broken by transmitting at a rate of 2.38 gigabits per second for more than one hour between CERN and Sunnyvale in California, a distance of more than 10,000 km. This record-breaking performance was achieved in the framework of tests to develop a high-speed global network for the future computing grid.

  3. Geophysical Event Casting: Assembling & Broadcasting Data Relevant to Events and Disasters

    Science.gov (United States)

    Manipon, G. M.; Wilson, B. D.

    2012-12-01

    Broadcast Atom feeds are already being used to publish metadata and support discovery of data collections, granules, and web services. Such data and service casting advertises the existence of new granules in a dataset and available services to access or transform data. Similarly, data and services relevant to studying topical geophysical events (earthquakes, hurricanes, etc.) or periodic/regional structures (El Nino, deep convection) can be broadcast by publishing new entries and links in a feed for that topic. By using the geoRSS conventions, the time and space location of the event (e.g. a moving hurricane track) is specified in the feed, along with science description, images, relevant data granules, and links to useful web services (e.g. OGC/WMS). The topic cast is used to assemble all of the relevant data/images as they come in, and publish the metadata (images, links, services) to a broad group of subscribers. All of the information in the feed is structured using standardized XML tags (e.g. georss for space & time, and tags to point to external data & services), and is thus machine-readable, which is an improvement over collecting ad hoc links on a wiki. We have created a software suite in python to generate such "event casts" when a geophysical event first happens, then update them with more information as it becomes available, and display them as an event album in a web browser. Figure 1 shows a snapshot of our Event Cast Browser displaying information from a set of casts about the hurricanes in the Western Pacific during the year 2011. The 19th cyclone is selected in the left panel, so the top right panels display the entries in that feed with metadata such as maximum wind speed, while the bottom right panel displays the hurricane track (positions every 12 hours) as KML in the Google Earth plug-in, where additional data/image layers from the feed can be turned on or off by the user. 
The software automatically converts (georss) space & time information to
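The entry above describes event casts as Atom feeds carrying geoRSS space and time tags plus links to data and services. A minimal sketch of one such machine-readable entry is shown below; the cyclone name, coordinates and URLs are invented for illustration, and only the GeoRSS-Simple `point` convention is shown (the actual event casts also carry track and service links):

```python
# Sketch of an "event cast" Atom entry with a GeoRSS-Simple location tag.
# The hurricane name, coordinates and URLs are made-up illustrations.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"
ET.register_namespace("", ATOM)
ET.register_namespace("georss", GEORSS)

def make_event_entry(title, when, lat, lon, data_link):
    """Build one feed entry locating a geophysical event in space and time."""
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM}}}updated").text = when
    # GeoRSS-Simple point: "lat lon", machine-readable event location
    ET.SubElement(entry, f"{{{GEORSS}}}point").text = f"{lat} {lon}"
    # link to a relevant data granule or service endpoint
    ET.SubElement(entry, f"{{{ATOM}}}link", href=data_link, rel="enclosure")
    return entry

entry = make_event_entry("Cyclone 19W, 2011-08-05 12:00 UTC",
                         "2011-08-05T12:00:00Z", 21.3, 129.8,
                         "http://example.org/data/19W/track.kml")
xml_text = ET.tostring(entry, encoding="unicode")
```

Because every field sits in a namespaced tag rather than free wiki text, a subscriber can extract the event's position and data links without any screen-scraping.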

  4. Possibility of the use of data of infrasonic monitoring for identification of the nature of seismic events

    OpenAIRE

    Lyashchuk, A.; Andrushchenko, Yu.; Gordienko, Yu.; Karyagin, E.; Kornienko, I.

    2017-01-01

The paper considers the possibility of using infrasound measurements conducted in Ukraine to verify recorded seismic events, and of using infrasound data as one of the criteria for their identification. Registration of seismic and infrasonic signals was carried out via the geophysical network of the Main Center of Special Monitoring. Small-aperture infrasound arrays were used to register infrasound, allowing directional monitoring of events. The data of 909 parameters of seismic events fr...

  5. Platform links clinical data with electronic health records

    Science.gov (United States)

    To make data gathered from patients in clinical trials available for use in standard care, NCI has created a new computer tool to support interoperability between clinical research and electronic health record systems. This new software represents an inno

  6. Usage reporting on recorded lectures using educational data mining

    NARCIS (Netherlands)

    Gorissen, Pierre; Van Bruggen, Jan; Jochems, Wim

    2012-01-01

    Gorissen, P., Van Bruggen, J., & Jochems, W. M. G. (2012). Usage reporting on recorded lectures using educational data mining. International Journal of Learning Technology, 7, 23-40. doi:10.1504/IJLT.2012.046864

  7. Physicists set new record for network data transfer

    CERN Multimedia

    2007-01-01

"An international team of physicists, computer scientists, and network engineers joined forces to set new records for sustained data transfer between storage systems during the SuperComputing 2006 (SC06) Bandwidth Challenge (BWC)." (3 pages)

  8. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  9. Mining Rare Events Data for Assessing Customer Attrition Risk

    Science.gov (United States)

    Au, Tom; Chin, Meei-Ling Ivy; Ma, Guangqin

Customer attrition refers to the phenomenon whereby a customer leaves a service provider. As competition intensifies, preventing customers from leaving is a major challenge to many businesses such as telecom service providers. Research has shown that retaining existing customers is more profitable than acquiring new customers, due primarily to savings on acquisition costs, the higher volume of service consumption, and customer referrals. For a large enterprise whose customer base consists of tens of millions of service subscribers, events such as switching to competitors or canceling services are often large in absolute number but rare in percentage terms, far less than 5%. Based on a simple random sample, popular statistical procedures, such as logistic regression, tree-based methods and neural networks, can sharply underestimate the probability of rare events and often result in a null model (no significant predictors). To improve efficiency and accuracy of event probability estimation, a case-based data collection technique is then considered. A case-based sample is formed by taking all available events and a small, but representative, fraction of nonevents from a dataset of interest. In this article we show a consistent prior correction method for event probability estimation and demonstrate the performance of the above data collection techniques in predicting customer attrition with actual telecommunications data.
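The case-based sampling described above (all events plus a fraction of nonevents) inflates the apparent event rate, so the fitted logistic intercept must be corrected back to the population prior. A minimal sketch of the standard rare-events intercept correction is below; the function and variable names are illustrative, not the article's:

```python
# Sketch of prior correction for a logistic model fit on a case-based sample.
# tau is the true population event fraction; y_bar is the (oversampled)
# event fraction in the case-based sample.
import math

def corrected_intercept(b0_sample, tau, y_bar):
    """Shift the sample intercept back to the population event prior."""
    return b0_sample - math.log(((1 - tau) / tau) * (y_bar / (1 - y_bar)))

def event_probability(b0_sample, slope_term, tau, y_bar):
    """Event probability for one customer using the corrected intercept."""
    z = corrected_intercept(b0_sample, tau, y_bar) + slope_term
    return 1.0 / (1.0 + math.exp(-z))
```

For example, with a true attrition rate of 2% (tau = 0.02) and a balanced case-based sample (y_bar = 0.5), the intercept shifts down by ln(49) ≈ 3.89, deflating the otherwise grossly overestimated event probabilities.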

  10. Sharing adverse drug event data using business intelligence technology.

    Science.gov (United States)

    Horvath, Monica M; Cozart, Heidi; Ahmad, Asif; Langman, Matthew K; Ferranti, Jeffrey

    2009-03-01

    Duke University Health System uses computerized adverse drug event surveillance as an integral part of medication safety at 2 community hospitals and an academic medical center. This information must be swiftly communicated to organizational patient safety stakeholders to find opportunities to improve patient care; however, this process is encumbered by highly manual methods of preparing the data. Following the examples of other industries, we deployed a business intelligence tool to provide dynamic safety reports on adverse drug events. Once data were migrated into the health system data warehouse, we developed census-adjusted reports with user-driven prompts. Drill down functionality enables navigation from aggregate trends to event details by clicking report graphics. Reports can be accessed by patient safety leadership either through an existing safety reporting portal or the health system performance improvement Web site. Elaborate prompt screens allow many varieties of reports to be created quickly by patient safety personnel without consultation with the research analyst. The reduction in research analyst workload because of business intelligence implementation made this individual available to additional patient safety projects thereby leveraging their talents more effectively. Dedicated liaisons are essential to ensure clear communication between clinical and technical staff throughout the development life cycle. Design and development of the business intelligence model for adverse drug event data must reflect the eccentricities of the operational system, especially as new areas of emphasis evolve. Future usability studies examining the data presentation and access model are needed.

  11. RECORD OF THE BINARY DATA ON SD CARD ARDUINO DUE

    Directory of Open Access Journals (Sweden)

    V. G. Mikhailov

    2016-01-01

A short review of the Arduino microcontroller family, its characteristics and fields of application is given. Recording the parameters of the object under study is important when debugging control systems built on Arduino microcontrollers. In the Arduino environment, the only built-in way to log parameters to an SD card is to write them in text mode using the print() and write() functions. The problems connected with writing binary data to an SD card on the Arduino Due are considered: memory that is not cleared of the previous program can lead to duplication of data on the SD card, and there are mistaken beliefs about limits on the volume of recorded data and about the need to use outdated SD cards. Ways of eliminating these shortcomings are considered, and the speed of various approaches to writing data to an SD card is estimated. Based on this analysis, an approach is proposed in which the binary data are converted byte by byte into an ASCII character array without increasing their volume and written in blocks of 240 bytes. This makes maximal use of the standard Arduino write() function and of the memory organization of SD cards, and increases write speed by a factor of more than 1100 compared with writing one byte at a time in character form. It is noted that the workarounds for data duplication suggested on user forums do not eliminate the problem completely; on the Arduino Due, clearing the memory requires either a special programmer or installation of a new bootloader.
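The core of the speed-up described in this entry is buffering: instead of issuing one write call per byte, samples are accumulated and flushed in 240-byte blocks with a single write() per block. The sketch below illustrates that buffering idea in Python rather than Arduino C (the class and block size are illustrative; 240 bytes is the block size the paper reports):

```python
# Illustrative sketch of block-buffered writing: bytes are accumulated and
# flushed to the sink in 240-byte blocks, one write() call per block.
BLOCK = 240

class BlockWriter:
    def __init__(self, sink):
        self.sink = sink          # any file-like object opened in binary mode
        self.buf = bytearray()

    def add_sample(self, sample_bytes):
        self.buf += sample_bytes
        while len(self.buf) >= BLOCK:
            self.sink.write(bytes(self.buf[:BLOCK]))  # one call per block
            del self.buf[:BLOCK]

    def close(self):
        if self.buf:
            self.sink.write(bytes(self.buf))          # flush the remainder
            self.buf.clear()

# Demo: 50 samples of 10 bytes each -> two full 240-byte blocks plus a tail.
import io
sink = io.BytesIO()
writer = BlockWriter(sink)
for i in range(50):
    writer.add_sample(bytes([i % 256]) * 10)
writer.close()
```

On a microcontroller the same pattern amortizes the fixed per-call and per-sector overhead of the SD interface, which is where the reported three-orders-of-magnitude gain over byte-at-a-time character writes comes from.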

  12. Development of a method to compensate for signal quality variations in repeated auditory event-related potential recordings

    Directory of Open Access Journals (Sweden)

    Antti K O Paukkunen

    2010-03-01

Reliable measurements are mandatory in clinically relevant auditory event-related potential (AERP)-based tools and applications. The comparability of the results gets worse as a result of variations in the remaining measurement error. A potential method is studied that allows optimization of the length of the recording session according to the concurrent quality of the recorded data. In this way, the sufficiency of the trials can be better guaranteed, which enables control of the remaining measurement error. The suggested method is based on monitoring the signal-to-noise ratio (SNR) and remaining measurement error which are compared to predefined threshold values. The SNR test is well defined, but the criterion for the measurement error test still requires further empirical testing in practice. According to the results, the reproducibility of average AERPs in repeated experiments is improved in comparison to a case where the number of recorded trials is constant. The test-retest reliability is not significantly changed on average but the between-subject variation in the value is reduced by 33-35%. The optimization of the number of trials also prevents excessive recordings which might be of practical interest especially in the clinical context. The efficiency of the method may be further increased by implementing online tools that improve data consistency.

  13. Development of a Method to Compensate for Signal Quality Variations in Repeated Auditory Event-Related Potential Recordings

    Science.gov (United States)

    Paukkunen, Antti K. O.; Leminen, Miika M.; Sepponen, Raimo

    2010-01-01

    Reliable measurements are mandatory in clinically relevant auditory event-related potential (AERP)-based tools and applications. The comparability of the results gets worse as a result of variations in the remaining measurement error. A potential method is studied that allows optimization of the length of the recording session according to the concurrent quality of the recorded data. In this way, the sufficiency of the trials can be better guaranteed, which enables control of the remaining measurement error. The suggested method is based on monitoring the signal-to-noise ratio (SNR) and remaining measurement error which are compared to predefined threshold values. The SNR test is well defined, but the criterion for the measurement error test still requires further empirical testing in practice. According to the results, the reproducibility of average AERPs in repeated experiments is improved in comparison to a case where the number of recorded trials is constant. The test-retest reliability is not significantly changed on average but the between-subject variation in the value is reduced by 33–35%. The optimization of the number of trials also prevents excessive recordings which might be of practical interest especially in the clinical context. The efficiency of the method may be further increased by implementing online tools that improve data consistency. PMID:20407635
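The adaptive stopping rule described in this entry can be sketched as follows. This is a simplified illustration, not the authors' implementation: signal power is estimated from the across-trial average, noise power of that average from the across-trial variance divided by the number of trials, and recording stops once a threshold SNR is reached (the paper's separate measurement-error criterion is omitted):

```python
# Simplified sketch of SNR-monitored averaging for ERP recording sessions.
def snr_of_average(trials):
    """SNR estimate of the across-trial average (trials: equal-length lists)."""
    n = len(trials)
    length = len(trials[0])
    mean = [sum(t[i] for t in trials) / n for i in range(length)]
    signal = sum(m * m for m in mean) / length      # power of the average
    noise = sum(                                     # variance of the average
        sum((t[i] - mean[i]) ** 2 for t in trials) / (n - 1)
        for i in range(length)
    ) / length / n
    return signal / noise

def record_until(trial_source, threshold, min_trials=2, max_trials=500):
    """Pull trials from an iterator until the average reaches `threshold` SNR."""
    trials = []
    for trial in trial_source:
        trials.append(trial)
        if len(trials) >= min_trials and snr_of_average(trials) >= threshold:
            break                                    # quality target reached
        if len(trials) >= max_trials:
            break                                    # hard cap on session length
    return trials
```

Because the noise power of the average shrinks roughly as 1/n, sessions with clean data stop early while noisy sessions automatically collect more trials, which is exactly the between-subject variance reduction the study reports.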

  14. Assignment of adverse event indexing terms in randomized clinical trials involving spinal manipulative therapy: an audit of records in MEDLINE and EMBASE databases.

    Science.gov (United States)

    Gorrell, Lindsay M; Engel, Roger M; Lystad, Reidar P; Brown, Benjamin T

    2017-03-14

Reporting of adverse events in randomized clinical trials (RCTs) is encouraged by the authors of The Consolidated Standards of Reporting Trials (CONSORT) statement. With robust methodological design and adequate reporting, RCTs have the potential to provide useful evidence on the incidence of adverse events associated with spinal manipulative therapy (SMT). During a previous investigation, it became apparent that comprehensive search strategies combining text words with indexing terms were not sufficiently sensitive for retrieving records that were known to contain reports on adverse events. The aim of this analysis was to compare the proportion of articles containing data on adverse events associated with SMT that were indexed in MEDLINE and/or EMBASE and the proportion of those that included adverse event-related words in their title or abstract. A sample of 140 RCT articles previously identified as containing data on adverse events associated with SMT was used. Articles were checked to determine if: (1) they had been indexed with relevant terms describing adverse events in the MEDLINE and EMBASE databases; and (2) they mentioned adverse events (or any related terms) in the title or abstract. Of the 140 papers, 91% were MEDLINE records, 85% were EMBASE records, 81% were found in both MEDLINE and EMBASE records, and 4% were not in either database. Only 19% mentioned adverse event-related text words in the title or abstract. There was no significant difference between MEDLINE and EMBASE records in the proportion of available papers (p = 0.078). Of the 113 papers that were found in both MEDLINE and EMBASE records, only 3% had adverse event-related indexing terms assigned to them in both databases, while 81% were not assigned an adverse event-related indexing term in either database. While there was effective indexing of RCTs involving SMT in the MEDLINE and EMBASE databases, there was a failure of allocation of adverse event indexing terms in both databases. We

  15. Assignment of adverse event indexing terms in randomized clinical trials involving spinal manipulative therapy: an audit of records in MEDLINE and EMBASE databases

    Directory of Open Access Journals (Sweden)

    Lindsay M. Gorrell

    2017-03-01

Background Reporting of adverse events in randomized clinical trials (RCTs) is encouraged by the authors of The Consolidated Standards of Reporting Trials (CONSORT) statement. With robust methodological design and adequate reporting, RCTs have the potential to provide useful evidence on the incidence of adverse events associated with spinal manipulative therapy (SMT). During a previous investigation, it became apparent that comprehensive search strategies combining text words with indexing terms were not sufficiently sensitive for retrieving records that were known to contain reports on adverse events. The aim of this analysis was to compare the proportion of articles containing data on adverse events associated with SMT that were indexed in MEDLINE and/or EMBASE and the proportion of those that included adverse event-related words in their title or abstract. Methods A sample of 140 RCT articles previously identified as containing data on adverse events associated with SMT was used. Articles were checked to determine if: (1) they had been indexed with relevant terms describing adverse events in the MEDLINE and EMBASE databases; and (2) they mentioned adverse events (or any related terms) in the title or abstract. Results Of the 140 papers, 91% were MEDLINE records, 85% were EMBASE records, 81% were found in both MEDLINE and EMBASE records, and 4% were not in either database. Only 19% mentioned adverse event-related text words in the title or abstract. There was no significant difference between MEDLINE and EMBASE records in the proportion of available papers (p = 0.078). Of the 113 papers that were found in both MEDLINE and EMBASE records, only 3% had adverse event-related indexing terms assigned to them in both databases, while 81% were not assigned an adverse event-related indexing term in either database. Conclusions While there was effective indexing of RCTs involving SMT in the MEDLINE and EMBASE databases, there was a failure of allocation of adverse event indexing terms in both databases.

  16. Wireless gigabit data telemetry for large-scale neural recording.

    Science.gov (United States)

    Kuan, Yen-Cheng; Lo, Yi-Kai; Kim, Yanghyo; Chang, Mau-Chung Frank; Liu, Wentai

    2015-05-01

Implantable wireless neural recording from a large ensemble of simultaneously acting neurons is a critical component to thoroughly investigate neural interactions and brain dynamics in freely moving animals. Recent research has shown the feasibility of simultaneously recording from hundreds of neurons and suggested that the ability to record a larger number of neurons results in better signal quality. This massive recording inevitably demands a large amount of data transfer. For example, recording 2000 neurons while keeping the signal fidelity (>12 bits, >40 kS/s per neuron) needs approximately a 1-Gb/s data link. Designing a wireless data telemetry system to support such (or higher) data rates while aiming to lower the power consumption of an implantable device imposes a grand challenge on the neuroscience community. In this paper, we present a wireless gigabit data telemetry for a future large-scale neural recording interface. This telemetry comprises a pair of low-power gigabit transmitter and receiver operating at 60 GHz, and establishes a short-distance wireless link to transfer the massive amount of neural signals outward from the implanted device. The transmission distance of the received neural signal can be further extended by an external rendezvous wireless transceiver, which is less power- and heat-constrained since it is not in the immediate proximity of the cortex and its radiated signal is not seriously attenuated by the lossy tissue. The gigabit data link has been demonstrated to achieve a high data rate of 6 Gb/s with a bit-error rate of 10^-12 at a transmission distance of 6 mm, an applicable separation between transmitter and receiver. This high data rate is able to support thousands of recording channels while ensuring a low energy cost per bit of 2.08 pJ/b.
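The bandwidth and power figures quoted in this entry can be checked with a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check of the figures quoted in the abstract.
channels = 2000            # neurons recorded simultaneously
bits_per_sample = 12       # >12-bit fidelity per sample
samples_per_s = 40_000     # >40 kS/s per neuron

data_rate = channels * bits_per_sample * samples_per_s  # bits per second
# 960 Mb/s -- hence "approximately a 1-Gb/s data link"

link_rate = 6e9            # demonstrated link: 6 Gb/s
energy_per_bit = 2.08e-12  # 2.08 pJ/b
tx_power = link_rate * energy_per_bit  # ~12.5 mW power budget at full rate
```

At 6 Gb/s the link has roughly 6× headroom over the 2000-channel requirement, and the 2.08 pJ/b energy cost keeps the radio's power draw near 12.5 mW even at full rate, which matters for an implant's thermal budget.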

  17. High-Speed Data Recorder for Space, Geodesy, and Other High-Speed Recording Applications

    Science.gov (United States)

    Taveniku, Mikael

    2013-01-01

A high-speed data recorder and replay equipment has been developed for reliable high-data-rate recording to disk media. It solves problems with slow or faulty disks, multiple disk insertions, high-altitude operation, reliable performance using COTS hardware, and long-term maintenance and upgrade path challenges. The current generation of data recorders used within the VLBI community are aging, special-purpose machines that are both slow (do not meet today's requirements) and very expensive to maintain and operate. Furthermore, they are not easily upgraded to take advantage of commercial technology development, and are not scalable to the multiple tens of Gbit/s data rates required by new applications. The innovation provides a software-defined, high-speed data recorder that is scalable with technology advances in the commercial space. It maximally utilizes current technologies without being locked to a particular hardware platform. The innovation also provides a cost-effective way of streaming large amounts of data from sensors to disk, enabling many applications to store raw sensor data and perform post- and signal processing offline. This recording system will be applicable to many applications needing real-world, high-speed data collection, including electronic warfare, software-defined radar, signal history storage of multispectral sensors, development of autonomous vehicles, and more.

  18. Events in time: Basic analysis of Poisson data

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, M.E.

    1994-09-01

The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
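The basic estimates the report covers can be sketched briefly. The report itself provides SAS programs; the following is an illustrative Python version of the point estimate and a large-sample (normal-approximation) confidence interval for a Poisson occurrence rate, valid when the event count is reasonably large:

```python
# Sketch: Poisson rate point estimate with an approximate confidence interval.
import math

def poisson_rate_ci(n_events, exposure_time, z=1.96):
    """Rate per unit time with an approximate 95% CI (normal approximation)."""
    rate = n_events / exposure_time                 # point estimate lambda-hat
    half_width = z * math.sqrt(n_events) / exposure_time
    return rate, max(0.0, rate - half_width), rate + half_width
```

For example, 10 events observed over 5 years gives a rate of 2.0 per year with an approximate 95% interval of about (0.76, 3.24); for small counts, exact intervals based on chi-square quantiles (as in the report) are preferable.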

  19. Event displays with heavy flavour jets from 2016 CMS data

    CERN Document Server

    CMS Collaboration

    2017-01-01

    A broad range of physics analyses at CMS rely on the efficient identification of heavy flavour jets. Identification of these objects is a challenging task, especially in the presence of a large number of multiple interactions per bunch crossing. The presented summary contains a set of graphical displays of reconstructed events in data collected by CMS in proton-proton collisions at 13 TeV in 2016. The displays highlight the main properties of heavy flavour jets in several event topologies, including QCD multijet, top quark pair, W+c and boosted H to bb.

  20. Data-driven approach for creating synthetic electronic medical records.

    Science.gov (United States)

    Buczak, Anna L; Babin, Steven; Moniz, Linda

    2010-10-14

New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia, but our approach may be adapted to other infectious diseases. The pilot synthetic background records were in the 4-11 yr age group.
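The three-step generation scheme in this entry can be illustrated with a toy sketch. Everything below is a made-up placeholder, not the paper's method: the care patterns, field names and age ranges are invented purely to show the identity / care-pattern / adaptation structure:

```python
# Toy sketch of the three-step synthetic-EMR scheme: (1) synthetic identity,
# (2) a care pattern mined from real records for the illness of interest,
# (3) adaptation of that pattern to the synthetic patient. All patterns and
# field names here are invented placeholders.
import random
from datetime import date, timedelta

CARE_PATTERNS = {  # illness kind -> sequence of (visit type, labs, radiology)
    "outbreak": [("ED visit", ["blood culture"], ["chest x-ray"]),
                 ("follow-up", ["serology"], [])],
    "background": [("well-child visit", [], [])],
}

def make_record(patient_id, kind, start, rng):
    visits, day = [], start
    for name, labs, rads in CARE_PATTERNS[kind]:      # step 2: care pattern
        visits.append({"date": day.isoformat(), "type": name,
                       "labs": labs, "radiology": rads})
        day += timedelta(days=rng.randint(1, 7))      # step 3: adapt timing
    return {"id": patient_id,                         # step 1: identity
            "age": rng.randint(4, 11) if kind == "background"
                   else rng.randint(18, 90),
            "visits": visits}

rng = random.Random(42)  # seeded so runs are reproducible
records = [make_record(f"P{i:04d}", "outbreak", date(2010, 6, 1), rng)
           for i in range(3)]
```

A real implementation would mine the care patterns from actual EMR data and validate the output with clinical experts, as the paper describes.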

  1. Digital data recording and interpretational standards in mummy science.

    Science.gov (United States)

    Beckett, Ronald G

    2017-12-01

Beginning during the late 19th century, paleoimaging has played an ever-expanding role in mummy science. Increasingly during the 21st century, digital radiographic data collected through imaging efforts have become significant. The rapid influx of imaging data raises questions regarding standardized approaches to both acquisition and interpretation. Reports using digital data presented without contextual considerations commonly lead to interpretational errors. Digital data recording and interpretation require rigorous methodology and standards in order to achieve reproducibility, accuracy and minimization of inter- and intra-observer error. Researchers applying paleoimaging methods in bioarchaeological research must understand the significant limitations inherent in data collection and interpretation from various digital data recording methods. Currently, vast amounts of digital data are being archived, allowing greater potential for hypothesis-based research and informed diagnosis by consensus. Digital databases hold great potential in preparing both radiologists and bioarchaeologists in the appropriate application and interpretation of digital data. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. High-density digital data recording/reproducing system

    Science.gov (United States)

    Leighou, R. O.

    1976-01-01

    Problems associated with reliably recording and reproducing digital data at densities of 10 to 30 kilobits per inch and the solutions to these problems are discussed. The three problems are skew, dc offset, and tape imperfections. The solutions are to use a 14-track, wideband II tape recorder; record NRZ-L; use a 24-bit sync word, 504-bit frame length, and odd parity in every 8-bit byte; and to employ circuit design techniques that minimize the effects of the remaining dc offset and tape imperfections.
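The "odd parity in every 8-bit byte" scheme mentioned above means a ninth bit is recorded with each byte so that the total count of 1 bits is odd; the reproduce side can then flag any single-bit dropout. A minimal sketch of that check:

```python
# Sketch of per-byte odd parity: the parity bit makes the total number of
# 1 bits (data byte + parity bit) odd, so a single flipped or dropped bit
# is detectable on playback.
def odd_parity_bit(byte):
    """Parity bit that makes the total count of 1s in (byte + bit) odd."""
    ones = bin(byte & 0xFF).count("1")
    return 0 if ones % 2 == 1 else 1

def check_odd_parity(byte, parity_bit):
    """True if the byte plus its parity bit carries an odd number of 1s."""
    return (bin(byte & 0xFF).count("1") + parity_bit) % 2 == 1
```

Odd (rather than even) parity has the side benefit that an all-zero byte still produces at least one recorded flux transition, which helps the playback clock stay locked during long runs of zeros.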

  3. Tablet computers for recording tuberculosis data at a community ...

    African Journals Online (AJOL)

    Data for tuberculosis screening were captured by nurses on Android® 9.7-inch tablets over a week. Their experience was explored by means of a focus group interview. Results: Data were recorded for 24 patients and seamlessly transferred for analysis. Nurses thought that the tablets were easy to use and saved time.

  4. Data Recording in Performance Management: Trouble With the Logics

    Science.gov (United States)

    Groth Andersson, Signe; Denvall, Verner

    2017-01-01

    In recent years, performance management (PM) has become a buzzword in public sector organizations. Well-functioning PM systems rely on valid performance data, but critics point out that conflicting rationale or logic among professional staff in recording information can undermine the quality of the data. Based on a case study of social service…

  5. B.C. lab sets data-transfer speed record

    CERN Multimedia

    2002-01-01

    Four Canadian researchers based at the TRIUMF particle physics research laboratory of British Columbia, last week moved one terabyte of research data to CERN in Geneva in about three hours, doubling the previous record for speed of data transfer (1/2 page).

  6. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-01-01

    Aim The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. Methods A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. Results From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for ‘symptom’ related and ‘mild’ adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Conclusions Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. PMID:26271492

  7. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media.

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-10-01

    The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for 'symptom' related and 'mild' adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. © 2015 The British Pharmacological Society.

  8. The Early Toarcian Oceanic Anoxic Event and its sedimentary record in Switzerland

    Science.gov (United States)

    Fantasia, Alicia; Föllmi, Karl B.; Adatte, Thierry; Spangenberg, Jorge E.; Montero-Serrano, Jean-Carlos

    2015-04-01

    The Early Toarcian Oceanic Anoxic Event (T-OAE), about 183 Ma ago in the Jurassic period, was a global perturbation of paleoclimatic and paleoenvironmental conditions. This episode was associated with a crisis in marine carbonate accumulation, climate warming, a rise in sea level, ocean acidification, enhanced continental weathering, and the deposition of organic-rich sediments, noticeable for example in the Atlantic and in the Tethys. The episode is also marked by a negative carbon isotope excursion, recorded in both marine and terrestrial environments. The cause(s) of this environmental crisis remain(s) controversial. Nevertheless, the negative δ13C excursions are commonly interpreted as the result of the injection of isotopically light carbon associated with gas hydrate dissociation, the thermal metamorphism of carbon-rich sediments, and the input of thermogenic and volcanogenic carbon related to the formation of the Karoo-Ferrar basaltic province in southern Gondwana (Hesselbo et al., 2000, 2007; Beerling et al., 2002; Cohen et al., 2004, 2007; McElwain et al., 2005, Beerling and Brentnall, 2007; Svensen et al., 2007; Hermoso et al., 2009, 2012; Mazzini et al., 2010). Several studies of the T-OAE have been conducted on sediments in central and northwest Europe, but only few data are available concerning the Swiss sedimentary records. We therefore focused on two sections in the Jura Plateau (canton Aargau): the Rietheim section (Montero-Serrano et al., submitted) and the Gipf section (current study). A multidisciplinary approach has been chosen, based on sedimentological observations (sedimentary condensation, etc.), biostratigraphy, mineralogy (bulk-rock composition), facies and microfacies analysis (presence or absence of benthos), clay-mineralogy composition (climatic conditions), major and trace-element analyses (productivity, redox conditions, etc.), phosphorus (trophic levels, anoxia), carbon isotopes and organic

  9. The role of attributions in the cognitive appraisal of work-related stressful events : An event-recording approach

    NARCIS (Netherlands)

    Peeters, MCW; Schaufeli, WB; Buunk, BP

    1995-01-01

    This paper describes a micro-analysis of the cognitive appraisal of daily stressful events in a sample of correctional officers (COs). More specifically, the authors examined whether three attribution dimensions mediated the relationship between the occurrence of stressful events and the

  10. Nonparametric modeling of the gap time in recurrent event data.

    Science.gov (United States)

    Du, Pang

    2009-06-01

    Recurrent event data arise in many biomedical and engineering studies when failure events can occur repeatedly over time for each study subject. In this article, we are interested in nonparametric estimation of the hazard function for gap time. A penalized likelihood model is proposed to estimate the hazard as a function of both gap time and covariate. A method for smoothing parameter selection is developed based on subject-wise cross-validation. Confidence intervals for the hazard function are derived using the Bayes model of the penalized likelihood. An eigenvalue analysis establishes the asymptotic convergence rates of the relevant estimates. Empirical studies are performed to evaluate various aspects of the method. The proposed technique is demonstrated through an application to the well-known bladder tumor cancer data.
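    The gap-time scale the abstract refers to can be illustrated with a minimal sketch (illustrative only, not the paper's penalized-likelihood estimator; the subject data below are hypothetical):

    ```python
    # Gap times are the waits between a subject's successive recurrent events;
    # the first gap is measured from study entry at time 0.

    def gap_times(event_times):
        """Return the gap times for one subject's event times (calendar scale)."""
        gaps = []
        previous = 0.0  # study entry
        for t in sorted(event_times):
            gaps.append(t - previous)
            previous = t
        return gaps

    # Hypothetical subjects: calendar times (months) of repeated recurrences.
    subjects = {
        "s1": [3.0, 7.5, 8.0],
        "s2": [12.0],
    }
    all_gaps = {sid: gap_times(times) for sid, times in subjects.items()}
    # s1's gaps are 3.0, 4.5 and 0.5 months.
    ```

    The hazard model described in the abstract is then estimated on this gap-time scale rather than on calendar time.
    
    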

  11. Lidar data assimilation for improved analyses of volcanic aerosol events

    Science.gov (United States)

    Lange, Anne Caroline; Elbern, Hendrik

    2014-05-01

    Observations of hazardous events with release of aerosols are hardly analyzable by today's data assimilation algorithms without producing an attenuating bias. Skillful forecasts of unexpected aerosol events are essential for human health and to prevent an exposure of infirm persons and aircraft with possibly catastrophic outcome. Typical cases include mineral dust outbreaks, mostly from large desert regions, wild fires, and sea salt uplifts, while the focus here is on volcanic eruptions. In general, numerical chemistry and aerosol transport models cannot simulate such events without manual adjustments. The concept of data assimilation is able to correct the analysis, as long as it is operationally implemented in the model system. However, the tangent-linear approximation, a substantial precondition for today's cutting-edge data assimilation algorithms, is not valid during unexpected aerosol events. As part of the European COPERNICUS (earth observation) project MACC II and the national ESKP (Earth System Knowledge Platform) initiative, we developed a module that enables the assimilation of aerosol lidar observations, even during unforeseeable incidences of extreme emissions of particulate matter. Thereby, the influence of the background information has to be reduced adequately. Advanced lidar instruments comprise on the one hand the aspect of radiative transfer within the atmosphere, and on the other hand they can deliver a detailed quantification of the detected aerosols. For the assimilation of maximally exploited lidar data, an appropriate lidar observation operator is constructed, compatible with the EURAD-IM (European Air Pollution and Dispersion - Inverse Model) system. The observation operator is able to map the modeled chemical and physical state on lidar attenuated backscatter, transmission, aerosol optical depth, as well as on the extinction and backscatter coefficients. Further, it has the ability to process the observed discrepancies with lidar
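    The core mapping such an observation operator performs can be sketched in its standard textbook form (an assumed form, not the EURAD-IM implementation; the profile values below are hypothetical): attenuated backscatter is the model backscatter damped by the two-way transmission, beta_att(z) = beta(z) * exp(-2 * integral of sigma from 0 to z).

    ```python
    # Forward model: two-way attenuated backscatter from model profiles of
    # backscatter beta(z) [1/(m sr)] and extinction sigma(z) [1/m].
    import math

    def attenuated_backscatter(z, beta, sigma):
        """z: increasing heights [m]; beta, sigma: profiles at those heights."""
        out = []
        tau = 0.0  # one-way optical depth accumulated from the ground
        for i in range(len(z)):
            if i > 0:  # trapezoidal integration of the extinction profile
                tau += 0.5 * (sigma[i] + sigma[i - 1]) * (z[i] - z[i - 1])
            out.append(beta[i] * math.exp(-2.0 * tau))
        return out

    # Hypothetical 3-level profile with an elevated aerosol layer.
    z = [0.0, 1000.0, 2000.0]
    beta = [1e-6, 5e-6, 1e-6]
    sigma = [1e-5, 5e-5, 1e-5]
    profile = attenuated_backscatter(z, beta, sigma)
    ```

    Transmission and aerosol optical depth, also mentioned in the abstract, fall out of the same accumulated optical depth `tau`.
    
    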

  12. Bridging data models and terminologies to support adverse drug event reporting using EHR data.

    Science.gov (United States)

    Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C

    2015-01-01

    This article is part of the Focus Theme of METHODs of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health records (EHR) data for post marketing drug surveillance. An important component of this toolkit is a drug-related adverse events (AE) reporting system designed to facilitate and accelerate the reporting process using automatic prepopulation mechanisms. To demonstrate SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Standard (e.g. HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g. ICD-9-CM, ICD-10, LOINC or SNOMED-CT) to the target terminology MedDRA which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. The percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e. the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually. In addition, most of these data elements shall not be filled in each report. SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could contribute to improve current spontaneous reporting practices and reduce under-reporting, which is currently one major obstacle in the process of acquisition of pharmacovigilance data.

  13. Di-photon event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with a photon pair observed by the CMS detector in proton-proton collision data collected in 2015. The mass of the di-photon system is 750 GeV. Both photon candidates, with transverse momenta of 400 GeV and 230 GeV respectively, are reconstructed in the barrel region. The candidates are consistent with the expectations that they are prompt isolated photons.

  14. Practical guidance for statistical analysis of operational event data

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
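    As an illustration of the reliability calculations such guidance covers (the conventions below, constant per-mode failure rates that add across independent modes, are standard reliability assumptions rather than the report's own formulas; the event counts are hypothetical):

    ```python
    # Constant-rate model: each failure mode's rate is estimated as
    # events / exposure time; independent modes combine additively, and
    # mission reliability over time t is exp(-lambda_total * t).
    import math

    def rate(n_events, exposure_hours):
        """Maximum-likelihood estimate of a constant failure rate [1/h]."""
        return n_events / exposure_hours

    modes = {                          # hypothetical operational event counts
        "fail_to_start": rate(4, 20000.0),
        "fail_to_run":   rate(1, 20000.0),
    }
    lambda_total = sum(modes.values())                    # independent modes add
    mission_reliability = math.exp(-lambda_total * 24.0)  # 24-hour mission
    ```

    With these assumed counts the combined rate is 2.5e-4 per hour and the 24-hour mission reliability is about 0.994.
    
    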

  15. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Directory of Open Access Journals (Sweden)

    Rodrigo Tavares Silva

    2015-11-01

    Full Text Available. Abstract. Background: The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective: We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods: A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results: During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002) and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions: In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  16. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino, E-mail: martino@cardiol.br; Peixoto, Giselle de Lima; Lima, José Jayme Galvão de; Siqueira, Sérgio Freitas de; Costa, Roberto; Gowdak, Luís Henrique Wolff [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Paula, Flávio Jota de [Unidade de Transplante Renal - Divisão de Urologia do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Kalil Filho, Roberto; Ramires, José Antônio Franchini [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil)

    2015-11-15

    The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  17. A population-based temporal logic gate for timing and recording chemical events.

    Science.gov (United States)

    Hsiao, Victoria; Hori, Yutaka; Rothemund, Paul Wk; Murray, Richard M

    2016-05-17

    Engineered bacterial sensors have potential applications in human health monitoring, environmental chemical detection, and materials biosynthesis. While such bacterial devices have long been engineered to differentiate between combinations of inputs, their potential to process signal timing and duration has been overlooked. In this work, we present a two-input temporal logic gate that can sense and record the order of the inputs, the timing between inputs, and the duration of input pulses. Our temporal logic gate design relies on unidirectional DNA recombination mediated by bacteriophage integrases to detect and encode sequences of input events. For an E. coli strain engineered to contain our temporal logic gate, we compare predictions of Markov model simulations with laboratory measurements of final population distributions for both step and pulse inputs. Although single cells were engineered to have digital outputs, stochastic noise created heterogeneous single-cell responses that translated into analog population responses. Furthermore, when single-cell genetic states were aggregated into population-level distributions, these distributions contained unique information not encoded in individual cells. Thus, final differentiated sub-populations could be used to deduce order, timing, and duration of transient chemical events. © 2016 The Authors. Published under the terms of the CC BY 4.0 license.
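    The order-recording logic can be sketched as a deterministic state machine (a toy single-cell idealization; the paper models stochastic cell populations, and the state names below are illustrative): each integrase irreversibly flips one DNA register, so the final state encodes which inputs arrived and in which order.

    ```python
    # Two-input temporal logic: unidirectional recombination means transitions
    # are one-way, so the final DNA state is a record of input order.

    def final_state(input_sequence):
        """input_sequence: events 'a' or 'b' in arrival order."""
        state = "S0"                       # unrecombined DNA
        for event in input_sequence:
            if state == "S0" and event == "a":
                state = "Sa"               # integrase A flipped first
            elif state == "S0" and event == "b":
                state = "Sb"               # integrase B flipped first
            elif state == "Sa" and event == "b":
                state = "Sab"              # A then B
            elif state == "Sb" and event == "a":
                state = "Sba"              # B then A
            # repeated inputs change nothing: recombination is unidirectional
        return state

    # Four distinguishable end states encode the four input histories.
    orders = {seq: final_state(seq)
              for seq in [("a",), ("b",), ("a", "b"), ("b", "a")]}
    ```

    In the paper, noise makes individual cells unreliable, but the distribution of these end states across the population still encodes order, timing, and duration.
    
    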

  18. The Boltysh crater record of rapid vegetation change during the Dan-C2 hyperthermal event.

    Science.gov (United States)

    Jolley, D. W.; Daly, R.; Gilmour, I.; Gilmour, M.; Kelley, S. P.

    2012-04-01

    Analysis of a cored borehole drilled through the sedimentary fill of the 24 km wide Boltysh meteorite crater, Ukraine, has yielded a unique, high-resolution record of the Dan-C2 hyperthermal event, in which gymnosperm - angiosperm - fern communities are replaced by precipitation-limited (winterwet) plant communities within the negative CIE. Winterwet plant communities dominate the negative CIE, but are replaced within the isotope recovery stage by warm temperate floras. These in turn give way to cooler temperate floras in the post-positive-CIE section of the uppermost crater fill. The distribution of temperate taxa about the negative CIE represents the broadest scale of oscillatory variation in the palynofloras. Shorter-frequency oscillations are evident from diversity and botanical group distributions, reflecting changes in moisture availability over several thousand years. Detailed analysis of variability within one of these oscillations records plant community cyclicity across the inception of the negative CIE. This short-term cyclicity provides evidence that the replacement of warm temperate by winterwet floras occurred in a stepwise manner at the negative CIE, suggesting cumulative atmospheric forcing. At <1 mm scale, laminations within the negative CIE showed no obvious lithological or colour differences, and are not seasonal couplets. However, palynofloral analysis of laminations from within the negative CIE has yielded evidence of annual variation, identifying the potential for recording changes in 'paleoweather' across a major hyperthermal event. [1] Jolley, D. W. et al. (2010) Geology 38, 835-838.

  19. Characterization of seizure-like events recorded in vivo in a mouse model of Rett syndrome.

    Science.gov (United States)

    Colic, Sinisa; Wither, Robert G; Zhang, Liang; Eubanks, James H; Bardakjian, Berj L

    2013-10-01

    Rett syndrome is a neurodevelopmental disorder caused by mutations in the X-linked gene encoding methyl-CpG-binding protein 2 (MECP2). Spontaneous recurrent discharge episodes are displayed in Rett-related seizures as in other types of epilepsies. The aim of this paper is to investigate the seizure-like event (SLE) and inter-SLE states in a female MeCP2-deficient mouse model of Rett syndrome and compare them to those found in other spontaneous recurrent epilepsy models. The study was performed on a small population of female MeCP2-deficient mice using telemetric local field potential (LFP) recordings over a 24 h period. Durations of SLEs and inter-SLEs were extracted using a rule-based automated SLE detection system for both daytime and nighttime, as well as high and low power levels of the delta frequency range (0.5-4 Hz) of the recorded LFPs. The results suggest that SLE occurrences are not influenced by circadian rhythms but are significantly associated with delta power. Investigating inter-SLE and SLE states by fitting duration histograms to the gamma distribution showed that SLE initiation and termination were associated with random and deterministic mechanisms, respectively. These findings, when compared to reported studies on epilepsy, suggest that Rett-related seizures share many similarities with absence epilepsy. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
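    The gamma-fitting step can be sketched with a method-of-moments estimate (an assumed simplification of the paper's fitting procedure; the durations below are hypothetical): shape k = mean²/variance and scale θ = variance/mean, where k near 1 indicates an exponential-like (memoryless, random) process and k well above 1 a more deterministic one.

    ```python
    # Method-of-moments gamma fit for duration histograms.
    from statistics import mean, pvariance

    def gamma_moments(durations):
        """Return (shape k, scale theta) matching sample mean and variance."""
        m, v = mean(durations), pvariance(durations)
        return m * m / v, v / m

    # Hypothetical inter-SLE durations (s): roughly exponential, so k near 1.
    inter_sle = [12.0, 3.0, 45.0, 7.0, 20.0, 2.0, 30.0, 9.0]
    k, theta = gamma_moments(inter_sle)
    ```

    Note that k * theta recovers the sample mean, a quick sanity check on the fit.
    
    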

  20. Administrative data algorithms to identify second breast cancer events following early-stage invasive breast cancer.

    Science.gov (United States)

    Chubak, Jessica; Yu, Onchee; Pocobelli, Gaia; Lamerato, Lois; Webster, Joe; Prout, Marianne N; Ulcickas Yood, Marianne; Barlow, William E; Buist, Diana S M

    2012-06-20

    Studies of breast cancer outcomes rely on the identification of second breast cancer events (recurrences and second breast primary tumors). Cancer registries often do not capture recurrences, and chart abstraction can be infeasible or expensive. An alternative is using administrative health-care data to identify second breast cancer events; however, these algorithms must be validated against a gold standard. We developed algorithms using data from 3152 women in an integrated health-care system who were diagnosed with stage I or II breast cancer in 1993-2006. Medical record review served as the gold standard for second breast cancer events. Administrative data used in algorithm development included procedures, diagnoses, prescription fills, and cancer registry records. We randomly divided the cohort into training and testing samples and used a classification and regression tree analysis to build algorithms for classifying women as having or not having a second breast cancer event. We created several algorithms for researchers to use based on the relative importance of sensitivity, specificity, and positive predictive value (PPV) in future studies. The algorithm with high specificity and PPV had 89% sensitivity (95% confidence interval [CI] = 84% to 92%), 99% specificity (95% CI = 98% to 99%), and 90% PPV (95% CI = 86% to 94%); the high-sensitivity algorithm had 96% sensitivity (95% CI = 93% to 98%), 95% specificity (95% CI = 94% to 96%), and 74% PPV (95% CI = 68% to 78%). Algorithms based on administrative data can identify second breast cancer events with high sensitivity, specificity, and PPV. The algorithms presented here promote efficient outcomes research, allowing researchers to prioritize sensitivity, specificity, or PPV in identifying second breast cancer events.
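    The quantities the authors trade off are the standard confusion-matrix metrics, sketched here with hypothetical validation counts (not the study's data): sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), PPV = TP/(TP+FP).

    ```python
    # Classification metrics for an algorithm flagging second cancer events
    # against a gold-standard chart review.

    def classifier_metrics(tp, fp, tn, fn):
        return {
            "sensitivity": tp / (tp + fn),   # of true events, fraction flagged
            "specificity": tn / (tn + fp),   # of non-events, fraction cleared
            "ppv": tp / (tp + fp),           # of flags, fraction correct
        }

    # Hypothetical counts: 100 true second events among 1100 women reviewed.
    m = classifier_metrics(tp=89, fp=10, tn=990, fn=11)
    ```

    Raising the flagging threshold trades sensitivity for PPV, which is why the authors publish several algorithm variants.
    
    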

  1. Fast stimulus sequences improve the efficiency of event-related potential P300 recordings.

    Science.gov (United States)

    Mell, Dominik; Bach, Michael; Heinrich, Sven P

    2008-09-30

    The P300 is an easily recorded component of the event-related potential (ERP). Yet, it is desirable to reduce the recording duration, for instance in patient examinations. A limiting factor is the time between stimuli that is necessary for the ERP to return to baseline. We explored whether this time could be reduced, despite an overlap of responses to successive stimuli, by presenting visual stimuli at a fast rate of 4.7 s(-1) using a standard oddball paradigm. Rare stimuli occurred at a probability of 14%. The P300 was isolated by subtracting the responses to the frequent stimuli from those to the rare stimuli, thereby eliminating the influence of response overlap. We compared the efficiency of fast stimulation to that of conventionally slow stimulation by assessing the signal-to-noise ratio of the P300 amplitude. Two presentation durations of individual stimuli, namely 53 ms and 93 ms, were tested. Not unexpectedly, P300 amplitudes were smaller for the fast sequence. However, the signal-to-noise ratio improved significantly by more than 50% due to the larger number of trials within a given time interval. When targeting a given signal-to-noise ratio, fast stimulation allows for a reduction in recording time of around 35%. Median peak times were 16-56 ms shorter for the fast stimulus sequence. Topography was comparable for fast and slow stimulation, suggesting a similar functional composition of the respective responses. Fast stimulation may thus be used to replace less efficient slow stimulation schemes in clinical diagnosis and for certain experimental questions.
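    The reported efficiency gain follows from sqrt(N) averaging: averaging N trials shrinks the noise by sqrt(N), so the averaged SNR scales as amplitude × sqrt(N). A back-of-envelope sketch with illustrative numbers (not the study's measurements):

    ```python
    # SNR of an N-trial ERP average under independent, identically
    # distributed single-trial noise.
    import math

    def snr(amplitude_uv, n_trials, single_trial_noise_uv=10.0):
        """Averaged-response SNR: amplitude * sqrt(N) / single-trial noise."""
        return amplitude_uv * math.sqrt(n_trials) / single_trial_noise_uv

    slow = snr(amplitude_uv=10.0, n_trials=40)   # conventional slow oddball
    fast = snr(amplitude_uv=7.5, n_trials=160)   # P300 25% smaller, 4x trials
    gain = fast / slow - 1.0                     # fractional SNR improvement
    ```

    With these assumed numbers, a 25% smaller P300 collected at four times the trial rate yields a 50% SNR gain, the order of improvement the study reports.
    
    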

  2. The CM SAF CLAAS-2 cloud property data record

    Science.gov (United States)

    Benas, Nikos; Finkensieper, Stephan; Stengel, Martin; van Zadelhoff, Gerd-Jan; Hanschmann, Timo; Hollmann, Rainer; Fokke Meirink, Jan

    2017-04-01

    A new cloud property data record was recently released by the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF), based on measurements from geostationary Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensors, spanning the period 2004-2015. The CLAAS-2 (Cloud property dAtAset using SEVIRI, Edition 2) data record includes cloud fractional coverage, thermodynamic phase, cloud top height, water path and corresponding optical thickness and particle effective radius separately for liquid and ice clouds. These variables are available at 15-minute, daily and monthly resolution. In this presentation the main improvements in the retrieval algorithms compared to the first edition of the data record (CLAAS-1) are highlighted, along with their impact on the quality of the data record. Subsequently, the results of extensive validation and inter-comparison efforts against ground observations, as well as active and passive satellite sensors, are summarized. Overall good agreement is found, with similar spatial and temporal characteristics, along with small biases caused mainly by differences in retrieval approaches, spatial/temporal samplings and viewing geometries.

  3. The first confirmed breeding record and new distribution data for ...

    African Journals Online (AJOL)

    The first confirmed breeding record and new distribution data for Böhm's Flycatcher Myopornis boehmi in Tanzania. On 17 November 2007 during fieldwork for the Tanzania Birds Atlas in western Tanzania, we were camped at the only known site for Chestnut- mantled Sparrow Weaver Plocepasser rufoscapulatus, a recent ...

  4. Health Data Recording, Reporting and Utilization Practices Among ...

    African Journals Online (AJOL)

    Aim: The aim of this study was to assess health data recording, reporting and utilization practices at the primary health care centers in Enugu State, Nigeria. Methods: This is a descriptive, cross-sectional study. A multistage sampling method was used. Survey instruments were observational checklist and structured ...

  5. A Method for Record Linkage with Sparse Historical Data

    OpenAIRE

    Colavizza, Giovanni; Ehrmann, Maud; Rochat, Yannick

    2016-01-01

    Massive digitization of archival material, coupled with automatic document processing techniques and data visualisation tools offers great opportunities for reconstructing and exploring the past. Unprecedented wealth of historical data (e.g. names of persons, places, transaction records) can indeed be gathered through the transcription and annotation of digitized documents and thereby foster large-scale studies of past societies. Yet, the transformation of hand-written documents into well-rep...

  6. Semantic Enrichment of Mobile Phone Data Records Exploiting Background Knowledge

    OpenAIRE

    Dashdorj, Zolzaya

    2015-01-01

    Every day, billions of mobile network log data (commonly defined as Call Detail Records, or CDRs) are generated by cell phone operators. These data provide inspiring insights about human actions and behaviors, which are essential in the development of context-aware applications and services. This potential demand has fostered major research activities in a variety of domains such as social and economic development, urban planning, and health prevention. The major challenge of this thesi...

  7. Electronic Health Record in Italy and Personal Data Protection.

    Science.gov (United States)

    Bologna, Silvio; Bellavista, Alessandro; Corso, Pietro Paolo; Zangara, Gianluca

    2016-06-01

    The present article deals with the Italian Electronic Health Record (hereinafter EHR), recently introduced by Act 221/2012, with a specific focus on personal data protection. Privacy issues--e.g., informed consent, data processing, patients' rights and minors' will--are discussed within the framework of recent e-Health legislation, national Data Protection Code, the related Data Protection Authority pronouncements and EU law. The paper is aimed at discussing the problems arising from a complex, fragmentary and sometimes uncertain legal framework on e-Health.

  8. Automatic Provenance Recording for Scientific Data using Trident

    Science.gov (United States)

    Simmhan, Y.; Barga, R.; van Ingen, C.

    2008-12-01

    Provenance is increasingly recognized as being critical to the understanding and reuse of scientific datasets. Given the rapid generation of scientific data from sensors and computational model results, it is not practical to manually record provenance for data, and automated techniques for provenance capture are essential. Scientific workflows provide a framework for representing computational models and complex transformations of scientific data, and present a means for tracking the operations performed to derive a dataset. The Trident Scientific Workbench is a workflow system that natively incorporates provenance capture of data derived as part of the workflow execution. The applications used as part of a Trident workflow can execute on a remote computational cluster, such as a supercomputing center or in the Cloud, or on the local desktop of the researcher, and provenance of data derived by the applications is seamlessly captured. Scientists also have the option to annotate the provenance metadata using domain-specific tags such as, for example, GCMD keywords. The provenance records thus captured can be exported in the Open Provenance Model* XML format that is emerging as a provenance standard in the eScience community, or visualized as a graph of data and applications. The Trident workflow system and the provenance recorded by it have been successfully applied in the Neptune oceanography project and are presently being tested in the Pan-STARRS astronomy project. *http://twiki.ipaw.info/bin/view/Challenge/OPM

  9. Evaluation of Electronic Medical Record Administrative data Linked Database (EMRALD).

    Science.gov (United States)

    Tu, Karen; Mitiku, Tezeta F; Ivers, Noah M; Guo, Helen; Lu, Hong; Jaakkimainen, Liisa; Kavanagh, Doug G; Lee, Douglas S; Tu, Jack V

    2014-01-01

    Primary care electronic medical records (EMRs) represent a potentially rich source of information for research and evaluation. To assess the completeness of primary care EMR data compared with administrative data. Retrospective comparison of provincial health-related administrative databases and patient records for more than 50,000 patients of 54 physicians in 15 geographically distinct clinics in Ontario, Canada, contained in the Electronic Medical Record Administrative data Linked Database (EMRALD). Physician billings, laboratory tests, medications, specialist consultation letters, and hospital discharges captured in EMRALD were compared with health-related administrative data in a universal access healthcare system. The mean (standard deviation [SD]) percentage of clinic primary care outpatient visits captured in EMRALD compared with administrative data was 94.4% (4.88%). Consultation letters from specialists for first consultations and for hospital discharges were captured at a mean (SD) rate of 72.7% (7.98%) and 58.5% (15.24%), respectively, within 30 days of the occurrence. The mean (SD) capture within EMRALD of the most common laboratory tests billed and the most common drugs dispensed was 67.3% (21.46%) and 68.2% (8.32%), respectively, for all clinics. We found reasonable capture of information within the EMR compared with administrative data, with the advantage in the EMR of having actual laboratory results, prescriptions for patients of all ages, and detailed clinical information. However, the combination of complete EMR records and administrative data is needed to provide a full comprehensive picture of patient health histories and processes, and outcomes of care.

  10. Using the FAIMS Mobile App for field data recording

    Science.gov (United States)

    Ballsun-Stanton, Brian; Klump, Jens; Ross, Shawn

    2016-04-01

    Multiple people creating data in the field poses a hard technical problem: our ``web 2.0'' environment presumes constant connectivity, data ``authority'' held by centralised servers, and sees mobile devices as tools for presentation rather than origination. A particular design challenge is the remoteness of the sampling locations, hundreds of kilometres away from network access. The alternative, however, is hand collection with a lengthy, error prone, and expensive digitisation process. This poster will present a field-tested open-source solution to field data recording. This solution, originally created by a community of archaeologists, needed to accommodate diverse recording methodologies. The community could not agree on standard vocabularies, workflows, attributes, or methodologies, but most agreed that an app to ``record data in the field'' was desirable. As a result, the app is generalised for field data collection; not only can it record a range of data types, but it is deeply customisable. The NeCTAR / ARC funded FAIMS Project, therefore, created an app which allows for arbitrary data collection in the field. In order to accomplish this ambitious goal, FAIMS relied heavily on OSS projects including spatialite and gdal (for GIS support), sqlite (for a lightweight key-attribute-value datastore), Javarosa and Beanshell (for UI and scripting), Ruby, and Linux. Only by standing on the shoulders of these giants was FAIMS able to make a flexible and highly generalisable field data collection system that CSIRO geoscientists could customise to suit most of their completely unanticipated needs. While single-task apps (i.e. those commissioned by structural geologists to take strikes and dips) will excel in their domains, other geoscientists (palaeoecologists, palaeontologists, anyone taking samples) likely cannot afford to commission domain- and methodology-specific recording tools for their custom recording needs. FAIMS shows the utility of OSS software

  11. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER)system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical products information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, Event-based Text-mining of Health Electronic Records achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated Event-based Text-mining of Health Electronic Record's ability to extract and encode Adverse Event terms from Structured Product Labels which may potentially support multiple pharmacoepidemiological tasks.

  12. Adverse events in patients in home healthcare: a retrospective record review using trigger tool methodology.

    Science.gov (United States)

    Schildmeijer, Kristina Görel Ingegerd; Unbeck, Maria; Ekstedt, Mirjam; Lindblad, Marléne; Nilsson, Lena

    2018-01-03

    Home healthcare is an increasingly common part of healthcare. The patients are often aged, frail and have multiple diseases, and multiple caregivers are involved in their treatment. This study explores the origin, incidence, types and preventability of adverse events (AEs) that occur in patients receiving home healthcare. A study using retrospective record review and trigger tool methodology. Ten teams with experience of home healthcare from nine regions across Sweden reviewed home healthcare records in a two-stage procedure using 38 predefined triggers in four modules. A random sample of records from 600 patients (aged 18 years or older) receiving home healthcare during 2015 was reviewed. The primary outcome measure was the cumulative incidence of AEs found in patients receiving home healthcare; secondary measures were origin, types, severity of harm and preventability of the AEs. The patients were aged 20-79 years, 280 men and 320 women. The review teams identified 356 AEs in 226 (37.7%; 95% CI 33.0 to 42.8) of the home healthcare records. Of these, 255 (71.6%; 95% CI 63.2 to 80.8) were assessed as being preventable, and most (246, 69.1%; 95% CI 60.9 to 78.2) required extra healthcare visits or led to a prolonged period of healthcare. Most of the AEs (271, 76.1%; 95% CI 67.5 to 85.6) originated in home healthcare; the rest were detected during home healthcare but were related to care outside home healthcare. The most common AEs were healthcare-associated infections, falls and pressure ulcers. AEs in patients receiving home healthcare are common, mostly preventable and often cause temporary harm requiring extra healthcare resources. The most frequent types of AEs must be addressed and reduced through improvements in interprofessional collaboration. This is an important area for future studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Dynamic Data-Driven Event Reconstruction for Atmospheric Releases

    Energy Technology Data Exchange (ETDEWEB)

    Kosovic, B; Belles, R; Chow, F K; Monache, L D; Dyer, K; Glascoe, L; Hanley, W; Johannesson, G; Larsen, S; Loosmore, G; Lundquist, J K; Mirin, A; Neuman, S; Nitao, J; Serban, R; Sugiyama, G; Aines, R

    2007-02-22

    Accidental or terrorist releases of hazardous materials into the atmosphere can impact large populations and cause significant loss of life or property damage. Plume predictions have been shown to be extremely valuable in guiding an effective and timely response. The two greatest sources of uncertainty in the prediction of the consequences of hazardous atmospheric releases result from poorly characterized source terms and lack of knowledge about the state of the atmosphere as reflected in the available meteorological data. In this report, we discuss the development of a new event reconstruction methodology that provides probabilistic source term estimates from field measurement data for both accidental and clandestine releases. Accurate plume dispersion prediction requires the following questions to be answered: What was released? When was it released? How much material was released? Where was it released? We have developed a dynamic data-driven event reconstruction capability which couples data and predictive models through Bayesian inference to obtain a solution to this inverse problem. The solution consists of a probability distribution of unknown source term parameters. For consequence assessment, we then use this probability distribution to construct a "composite" forward plume prediction which accounts for the uncertainties in the source term. Since in most cases of practical significance it is impossible to find a closed form solution, Bayesian inference is accomplished by utilizing stochastic sampling methods. This approach takes into consideration both measurement and forward model errors and thus incorporates all the sources of uncertainty in the solution to the inverse problem. Stochastic sampling methods have the additional advantage of being suitable for problems characterized by a non-Gaussian distribution of source term parameters and for cases in which the underlying dynamical system is non-linear. We initially
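
    The Bayesian inversion described above can be illustrated with a toy one-parameter version: a hypothetical source strength is recovered from noisy sensor readings by Metropolis sampling. The 1/d^2 forward kernel, the noise level, and all numbers below are invented for illustration and are not taken from the report.

```python
import math
import random

random.seed(1)

# Hypothetical forward model: predicted concentration at a sensor a
# distance d from the release, for source strength q (arbitrary units).
# A real dispersion model would replace this toy 1/d^2 kernel.
def forward(q, d):
    return q / d**2

distances = [1.0, 2.0, 3.0, 5.0]
true_q = 4.0
sigma = 0.05   # assumed measurement noise
obs = [forward(true_q, d) + random.gauss(0, sigma) for d in distances]

def log_likelihood(q):
    return -sum((o - forward(q, d)) ** 2
                for o, d in zip(obs, distances)) / (2 * sigma**2)

# Metropolis sampling of the posterior over q (flat prior on q > 0)
q, samples = 1.0, []
for step in range(20000):
    q_new = q + random.gauss(0, 0.2)
    if q_new > 0:
        delta = log_likelihood(q_new) - log_likelihood(q)
        if delta >= 0 or random.random() < math.exp(delta):
            q = q_new
    if step >= 2000:   # discard burn-in
        samples.append(q)

mean_q = sum(samples) / len(samples)
print(round(mean_q, 2))   # posterior mean, close to true_q
```

    The histogram of `samples` approximates the probability distribution of the source term, which is exactly what the "composite" forward prediction is built from.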

  14. Di-muon event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with the largest-mass muon pair so far observed by the CMS detector in proton-proton collision data collected in 2015. The mass of the di-muon system is 2.4 TeV. One muon, with a transverse momentum of 0.7 TeV, goes through the Drift Tubes in the central region, while the second, with a transverse momentum of 1.0 TeV, hits the Cathode Strip Chambers in the forward region. Both muons satisfy the high-transverse-momentum muon selection criteria.

  15. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large scale data collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for decelerations below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a review procedure is based on subjective decisions; it is expensive, time consuming, and often tedious for the analysts. Furthermore, since NDS data sets are growing exponentially over time, this review procedure may soon no longer be viable. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state-of-the-art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential
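
    The kinematic-trigger idea described above can be sketched in a few lines: scan a speed trace for decelerations beyond a threshold. The 10 Hz sampling rate and the 4 m/s^2 threshold below are illustrative assumptions, not values from the study.

```python
# Toy kinematic trigger: flag samples where deceleration exceeds a
# threshold (harsh braking). Speeds are in m/s, sampled every dt seconds.
def harsh_braking_events(speeds, dt=0.1, threshold=-4.0):
    events = []
    for i in range(1, len(speeds)):
        accel = (speeds[i] - speeds[i - 1]) / dt
        if accel <= threshold:
            events.append((i * dt, accel))  # (time of sample, acceleration)
    return events

# 25 m/s cruise for one second, then braking at 7 m/s^2 for one second
trace = [25.0] * 10 + [25.0 - 7.0 * 0.1 * k for k in range(1, 11)]
print(harsh_braking_events(trace))
```

    Every sample during the braking phase is flagged; in a real NDS pipeline each flagged window would then go to (manual or automated) video review.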

  16. A flexible semiparametric transformation model for recurrent event data.

    Science.gov (United States)

    Dong, Lin; Sun, Liuquan

    2015-01-01

    In this article, we propose a class of semiparametric transformation models for recurrent event data, in which the baseline mean function is allowed to depend on covariates through an additive model, and some covariate effects are allowed to be time-varying. For inference on the model parameters, estimating equation approaches are developed, and the asymptotic properties of the resulting estimators are established. In addition, a lack-of-fit test is presented to assess the adequacy of the model. The finite sample behavior of the proposed estimators is evaluated through simulation studies, and an application to a bladder cancer study is illustrated.

  17. ATLAS Tracking Event Data Model -- 12.0.0

    Energy Technology Data Exchange (ETDEWEB)

    ATLAS; Akesson, F.; Atkinson, T.; Costa, M.J.; Elsing, M.; Fleischmann, S.; Gaponenko, A.; Liebig, W.; Moyse, E.; Salzburger, A.; Siebel, M.

    2006-07-23

    In this report the event data model (EDM) relevant for tracking in the ATLAS experiment is presented. The core component of the tracking EDM is a common track object which is suited to describe tracks in the innermost tracking sub-detectors and in the muon detectors in offline as well as online reconstruction. The design of the EDM was driven by a demand for modularity and extensibility while taking into account the different requirements of the clients. The structure of the track object and the representation of the tracking-relevant information are described in detail.

  18. The Irish National Adverse Events Study (INAES): the frequency and nature of adverse events in Irish hospitals-a retrospective record review study.

    Science.gov (United States)

    Rafter, Natasha; Hickey, Anne; Conroy, Ronan M; Condell, Sarah; O'Connor, Paul; Vaughan, David; Walsh, Gillian; Williams, David J

    2017-02-01

    Irish healthcare has undergone extensive change recently with spending cuts and a focus on quality initiatives; however, little is known about adverse event occurrence. To assess the frequency and nature of adverse events in Irish hospitals. 1574 (53% women, mean age 54 years) randomly selected adult inpatient admissions from a sample of eight hospitals, stratified by region and size, across the Republic of Ireland in 2009 were reviewed using two-stage (nurse review of patient charts, followed by physician review of triggered charts) retrospective chart review with electronic data capture. Results were weighted to reflect the sampling strategy. The impact on adverse event rate of differing application of international adverse event criteria was also examined. 45% of charts were triggered. The prevalence of adverse events in admissions was 12.2% (95% CI 9.5% to 15.5%), with an incidence of 10.3 events per 100 admissions (95% CI 7.5 to 13.1). Over 70% of events were considered preventable. Two-thirds were rated as having a mild-to-moderate impact on the patient, 9.9% causing permanent impairment and 6.7% contributing to death. A mean of 6.1 added bed days was attributed to events, representing an expenditure of €5550 per event. The adverse event rate varied substantially (8.6%-17.0%) when applying different published adverse event eligibility criteria. This first study of adverse events in Ireland reports similar rates to other countries. In a time of austerity, adverse events in adult inpatients were estimated to cost over €194 million. These results provide important baseline data on the adverse event burden and, alongside web-based chart review, provide an incentive and methodology to monitor future patient-safety initiatives. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  19. Quality of the record of drug-related problems in a database for voluntary adverse event reporting

    Directory of Open Access Journals (Sweden)

    Maria Teresa Aznar-Saliente

    2017-07-01

    Full Text Available Objective: To determine the number and type of errors found in the record of drug-related problems in the SINEA database, an electronic system for voluntary reporting of adverse events in healthcare, in order to quantify the differences between the raw and refined databases, suggest improvements, and determine the need for refining said databases. Methods: A Pharmacist reviewed the database and refined the adverse events reported from January to August, 2014, considering the “describe_what_happened” field as the gold standard. There was a comparison of the rates of medication errors, both potential and real, adverse reactions, impact on the patient, impact on healthcare, and medications more frequently involved in the raw and refined databases. Agreement was calculated through Cohen’s Kappa Coefficient. Results: 364 adverse events were reported: 66.7% were medication errors, 2.7% adverse reactions to the medication (2 were wrongly classified as both, showing a total percentage >100%) and 31% were other events. After refinement, the percentages were 69.5%, 5.8% and 24.7%, respectively (κ=0.85; CI95% [0.80-0.90]). Before refinement, 73.6% of medication errors were considered potential vs. 82.3% after refinement (κ=0.65; CI95% [0.54-0.76]). The medication most frequently involved was trastuzumab (20.9%). The “molecule” field was blank in 133 entries. A mean of 1.8±1.9 errors per entry was detected. Conclusions: Although agreement is good, the refinement process cannot be avoided, as it provides valuable information to improve pharmacotherapy. Data quality could be improved by reducing the number of type-in text fields, using drop-down lists, and by increasing the training of the reporters.
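
    A minimal sketch of the Cohen's kappa computation used above to quantify agreement between the raw and refined databases; the label sequences are invented examples, not SINEA entries.

```python
from collections import Counter

# Cohen's kappa: chance-corrected agreement between two classifications
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # expected agreement if both raters labelled at random with their
    # own marginal frequencies
    expected = sum(ca[lab] * cb[lab] for lab in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

raw     = ["error", "error", "reaction", "other", "error", "other"]
refined = ["error", "error", "reaction", "error", "error", "other"]
print(round(cohens_kappa(raw, refined), 2))
```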

  20. Record transfer of data between CERN and California

    CERN Multimedia

    Maximilien Brice

    2003-01-01

    On 27 February 2003 the California Institute of Technology (Caltech), CERN, the Los Alamos National Laboratory (LANL) and the Stanford Linear Accelerator Center (SLAC) broke a data transfer record by transmitting 1 terabyte of data in less than an hour across the 10,000 kilometres between CERN and Sunnyvale in California. The team sustained a transmission rate of 2.38 gigabits per second for over an hour, which is equivalent to transferring 26 CDs per minute. The record-breaking performance was achieved in the framework of tests directly linked to the DataGrid project, which involves the creation of a network of distributed computers able to deliver the unprecedented computing power and data management capacity that will be needed by the data-intensive experiments at the LHC. CERN's participation in these high-speed data transfer tests is led by IT division's External Networking team in the framework of the CERN-led European DataTAG project. Pictured here are some of the members of the CERN DataTAG project te...
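
    The quoted figures are mutually consistent, as a quick back-of-the-envelope check shows; the ~680 MB CD capacity is an assumption made here.

```python
# Sanity-check the quoted figures: 2.38 Gb/s sustained, "26 CDs per
# minute", 1 TB in under an hour.
rate_bits_per_s = 2.38e9
bytes_per_s = rate_bits_per_s / 8        # ~0.30 GB/s
bytes_per_min = bytes_per_s * 60
cd_bytes = 680e6                         # assumed CD capacity
cds_per_min = bytes_per_min / cd_bytes
minutes_for_tb = 1e12 / bytes_per_s / 60
print(cds_per_min)       # roughly 26 CDs per minute
print(minutes_for_tb)    # under 60 minutes for 1 terabyte
```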

  1. Benchmarking dairy herd health status using routinely recorded herd summary data.

    Science.gov (United States)

    Parker Gaddis, K L; Cole, J B; Clay, J S; Maltecca, C

    2016-02-01

    Genetic improvement of dairy cattle health through the use of producer-recorded data has been determined to be feasible. Low estimated heritabilities indicate that genetic progress will be slow. Variation observed in lowly heritable traits can largely be attributed to nongenetic factors, such as the environment. More rapid improvement of dairy cattle health may be attainable if herd health programs incorporate environmental and managerial aspects. More than 1,100 herd characteristics are regularly recorded on farm test-days. We combined these data with producer-recorded health event data, and parametric and nonparametric models were used to benchmark herd and cow health status. Health events were grouped into 3 categories for analyses: mastitis, reproductive, and metabolic. Both herd incidence and individual incidence were used as dependent variables. Models implemented included stepwise logistic regression, support vector machines, and random forests. At both the herd and individual levels, random forest models attained the highest accuracy for predicting health status in all health event categories when evaluated with 10-fold cross-validation. Accuracy (SD) ranged from 0.61 (0.04) to 0.63 (0.04) when using random forest models at the herd level. Accuracy of prediction (SD) at the individual cow level ranged from 0.87 (0.06) to 0.93 (0.001) with random forest models. Highly significant variables and key words from logistic regression and random forest models were also investigated. All models identified several of the same key factors for each health event category, including movement out of the herd, size of the herd, and weather-related variables. We concluded that benchmarking health status using routinely collected herd data is feasible. Nonparametric models were better suited to handle this complex data with numerous variables. These data mining techniques were able to perform prediction of health status and could add evidence to personal experience in herd
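
    The 10-fold cross-validation used to evaluate the models can be sketched as follows. A trivial majority-class predictor stands in for the study's random forests, the feature records are ignored by that predictor, and the data are invented.

```python
import random

# Sketch of k-fold cross-validation: split indices into k folds, train
# on k-1 folds, score on the held-out fold, and average the accuracies.
def k_fold_accuracy(records, labels, k=10, seed=0):
    idx = list(range(len(records)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for fold in folds:
        fold_set = set(fold)
        train = [i for i in idx if i not in fold_set]
        # "model" = predict the most common training label (a stand-in
        # for a real classifier such as a random forest)
        majority = max(set(labels[i] for i in train),
                       key=lambda c: sum(labels[i] == c for i in train))
        accs.append(sum(labels[i] == majority for i in fold) / len(fold))
    return sum(accs) / k

labels = ["healthy"] * 70 + ["mastitis"] * 30
records = [[x] for x in range(100)]   # placeholder herd features
print(round(k_fold_accuracy(records, labels), 2))
```

    Swapping the majority-class stand-in for an actual learner (logistic regression, SVM, random forest) leaves the evaluation loop unchanged, which is why cross-validated accuracy is comparable across the three model families in the study.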

  2. ATLAS event at 13 TeV - Multijet Exotics Search Event Display - 2015 data

    CERN Multimedia

    ATLAS Collaboration

    2015-01-01

    Run 279984, Event 1079767163 A 10 jet event selected in the search for strong gravity in multijet final states (CERN-PH-EP-2015-312). The scalar sum of jet transverse momenta (HT) of the event is 4.4 TeV. Run 282712, Event 474587238 The event with the largest scalar sum of jet transverse momenta (HT) selected in the search for strong gravity in multijet final states (CERN-PH-EP-2015-312). The HT of the event is 6.4 TeV.

  3. Data Resource Profile: Cardiovascular disease research using linked bespoke studies and electronic health records (CALIBER)

    Science.gov (United States)

    Denaxas, Spiros C; George, Julie; Herrett, Emily; Shah, Anoop D; Kalra, Dipak; Hingorani, Aroon D; Kivimaki, Mika; Timmis, Adam D; Smeeth, Liam; Hemingway, Harry

    2012-01-01

    The goal of cardiovascular disease (CVD) research using linked bespoke studies and electronic health records (CALIBER) is to provide evidence to inform health care and public health policy for CVDs across different stages of translation, from discovery, through evaluation in trials to implementation, where linkages to electronic health records provide new scientific opportunities. The initial approach of the CALIBER programme is characterized as follows: (i) Linkages of multiple electronic health record sources: examples include linkages between the longitudinal primary care data from the Clinical Practice Research Datalink, the national registry of acute coronary syndromes (Myocardial Ischaemia National Audit Project), hospitalization and procedure data from Hospital Episode Statistics and cause-specific mortality and social deprivation data from the Office of National Statistics. Current cohort analyses involve a million people in initially healthy populations and disease registries with ∼10^5 patients. (ii) Linkages of bespoke investigator-led cohort studies (e.g. UK Biobank) to registry data (e.g. Myocardial Ischaemia National Audit Project), providing new means of ascertaining, validating and phenotyping disease. (iii) A common data model in which routine electronic health record data are made research ready, and sharable, by defining and curating with meta-data >300 variables (categorical, continuous, event) on risk factors, CVDs and non-cardiovascular comorbidities. (iv) Transparency: all CALIBER studies have an analytic protocol registered in the public domain, and data are available (safe haven model) for use subject to approvals. For more information, e-mail s.denaxas@ucl.ac.uk PMID:23220717

  4. APNEA list mode data acquisition and real-time event processing

    Energy Technology Data Exchange (ETDEWEB)

    Hogle, R.A.; Miller, P. [GE Corporate Research & Development Center, Schenectady, NY (United States); Bramblett, R.L. [Lockheed Martin Specialty Components, Largo, FL (United States)

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real-time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
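
    The Event Accumulation step, counting timestamped events into fixed-width bins following each trigger, can be sketched like this; all timestamps, bin widths, and counts below are illustrative, not APNEA data.

```python
# Post-process list-mode data: for each trigger, count detector events
# into n_bins time bins of bin_width_us microseconds after the trigger.
def bin_events(event_times_us, triggers_us, bin_width_us, n_bins):
    counts = [0] * n_bins
    for t in triggers_us:
        for e in event_times_us:
            b = (e - t) // bin_width_us     # bin index relative to trigger
            if 0 <= b < n_bins:             # ignore events outside window
                counts[b] += 1
    return counts

events = [5, 12, 18, 40, 41, 95, 1005, 1012]   # microseconds
triggers = [0, 1000]
print(bin_events(events, triggers, bin_width_us=10, n_bins=5))
# → [2, 3, 0, 0, 2]
```

    Re-running the same list-mode data with different `bin_width_us` and `n_bins` is exactly how the optimum time bins for an assay can be determined after the fact.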

  5. Validation of multisource electronic health record data : An application to blood transfusion data

    NARCIS (Netherlands)

    van Hoeven, Loan R.; Bruijne, Martine C.De; Kemper, Peter F.; Koopman, Maria M.W.; Rondeel, Jan M.M.; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P.; Roes, Kit C.B.

    2017-01-01

    Background: Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps

  6. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: A separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data. The user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code. A simple way of augmenting and reducing the information saved for different data objects. This makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need. A persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...

  7. Implementation of the ATLAS Run 2 event data model

    CERN Document Server

    Buckley, Andrew; Elsing, Markus; Gillberg, Dag Ingemar; Koeneke, Karsten; Krasznahorkay, Attila; Moyse, Edward; Nowak, Marcin; Snyder, Scott; van Gemmeren, Peter

    2015-01-01

    During the 2013--2014 shutdown of the Large Hadron Collider, ATLAS switched to a new event data model for analysis, called the xAOD. A key feature of this model is the separation of the object data from the objects themselves (the `auxiliary store'). Rather than being stored as member variables of the analysis classes, all object data are stored separately, as vectors of simple values. Thus, the data are stored in a `structure of arrays' format, while the user still can access it as an `array of structures'. This organization allows for on-demand partial reading of objects, the selective removal of object properties, and the addition of arbitrary user-defined properties in a uniform manner. It also improves performance by increasing the locality of memory references in typical analysis code. The resulting data structures can be written to ROOT files with data properties represented as simple ROOT tree branches. This talk will focus on the design and implementation of the auxiliary store and its interaction with RO...
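
    The "structure of arrays" storage behind an "array of structures" interface, together with post hoc decoration, can be illustrated with a small sketch. The class and attribute names below are invented for illustration; the actual xAOD EDM is implemented in C++.

```python
# Per-object view: forwards attribute access to the backing arrays
class ParticleView:
    def __init__(self, store, i):
        self._store, self._i = store, i

    def __getattr__(self, name):
        return self._store.arrays[name][self._i]

# Container: data live in per-property arrays (structure of arrays),
# but indexing hands back object-like views (array of structures)
class ParticleContainer:
    def __init__(self, **arrays):
        self.arrays = dict(arrays)

    def __len__(self):
        return len(next(iter(self.arrays.values())))

    def __getitem__(self, i):
        return ParticleView(self, i)

    def decorate(self, name, values):
        # add a user-defined property to every object, after the fact
        self.arrays[name] = list(values)

muons = ParticleContainer(pt=[0.7, 1.0], eta=[0.3, 2.1])
muons.decorate("isolated", [True, False])
print(muons[1].pt, muons[1].isolated)
```

    Because each property is one contiguous array, dropping a property or reading only some of them is trivial, which is the point of the selective-removal and partial-reading features described above.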

  8. Clinical Research Informatics and Electronic Health Record Data

    Science.gov (United States)

    Horvath, M. M.; Rusincovitch, S. A.

    2014-01-01

    Summary Objectives The goal of this survey is to discuss the impact of the growing availability of electronic health record (EHR) data on the evolving field of Clinical Research Informatics (CRI), which is the union of biomedical research and informatics. Results Major challenges for the use of EHR-derived data for research include the lack of standard methods for ensuring that data quality, completeness, and provenance are sufficient to assess the appropriateness of its use for research. Areas that need continued emphasis include methods for integrating data from heterogeneous sources, guidelines (including explicit phenotype definitions) for using these data in both pragmatic clinical trials and observational investigations, strong data governance to better understand and control quality of enterprise data, and promotion of national standards for representing and using clinical data. Conclusions The use of EHR data has become a priority in CRI. Awareness of underlying clinical data collection processes will be essential in order to leverage these data for clinical research and patient care, and will require multi-disciplinary teams representing clinical research, informatics, and healthcare operations. Considerations for the use of EHR data provide a starting point for practical applications and a CRI research agenda, which will be facilitated by CRI’s key role in the infrastructure of a learning healthcare system. PMID:25123746

  9. Neurophysiological Effects of Meditation Based on Evoked and Event Related Potential Recordings

    Science.gov (United States)

    Singh, Nilkamal; Telles, Shirley

    2015-01-01

    Evoked potentials (EPs) are a relatively noninvasive method to assess the integrity of sensory pathways. As the neural generators for most of the components are relatively well worked out, EPs have been used to understand the changes occurring during meditation. Event-related potentials (ERPs) yield useful information about the response to tasks, usually assessing attention. A brief review of the literature yielded eleven studies on EPs and seventeen on ERPs from 1978 to 2014. The EP studies covered short, mid, and long latency EPs, using both auditory and visual modalities. ERP studies reported the effects of meditation on tasks such as the auditory oddball paradigm, the attentional blink task, mismatch negativity, and affective picture viewing among others. Both EP and ERPs were recorded in several meditations detailed in the review. Maximum changes occurred in mid latency (auditory) EPs suggesting that maximum changes occur in the corresponding neural generators in the thalamus, thalamic radiations, and primary auditory cortical areas. ERP studies showed meditation can increase attention and enhance efficiency of brain resource allocation with greater emotional control. PMID:26137479

  10. A meteor shockwave event recorded at seismic and infrasound stations in northern Taiwan

    Science.gov (United States)

    Kumar, Utpal; Chao, Benjamin F.; Hsieh, Yikai; Chang, Emmy T. Y.

    2017-12-01

    Three mysterious explosion sounds were heard in the coastal towns of Tamsui, west of Taipei in northern Taiwan, in the early evening of December 5, 2013. The event left clear signals that are identified in the recordings of 12 regional seismometers and 3 infrasound sensors and processed by means of travel time analysis. The apparent velocity of 330 m/s of the signals confirms that the energy transmission was through the atmosphere, and the characteristics of the waveforms suggest the meteor-generated shockwaves. We use the graphical method as well as the Genetic Algorithm optimization approach to constrain the trajectory of the meteor and to locate its projected intercept with the ground—(25.33 N, 121.26 E), approximately 20 km off the coast of Tamsui. The trajectory has azimuth (measured from north in a map view in the clockwise direction) of 303° and (near-vertical) elevation angle of 70°. From the observed period of 1.3 s at the maximum amplitude of the infrasound signal, we estimate by conventional scaling law that the meteor in question had impact energy on the order of 5 × 1010 J (equivalent to an earthquake of local magnitude 4) or roughly a size of 0.5 m across.
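
    The travel-time analysis behind the 330 m/s apparent velocity can be illustrated with a least-squares fit of arrival time against station distance; the station distances and arrival times below are synthetic, generated for an acoustic speed of 330 m/s with a fixed origin-time offset.

```python
# Fit arrival time vs. distance; the inverse of the slope is the
# apparent velocity of the propagating signal.
def apparent_velocity(distances_m, arrivals_s):
    n = len(distances_m)
    mx = sum(distances_m) / n
    my = sum(arrivals_s) / n
    slope = (sum((x - mx) * (y - my)
                 for x, y in zip(distances_m, arrivals_s))
             / sum((x - mx) ** 2 for x in distances_m))
    return 1.0 / slope   # m/s

dists = [10e3, 20e3, 35e3, 50e3]            # station distances (m)
times = [d / 330.0 + 2.0 for d in dists]    # arrivals with 2 s offset
print(round(apparent_velocity(dists, times)))
```

    An apparent velocity near 330 m/s, as opposed to several km/s for seismic body waves, is what identifies the transmission path as atmospheric.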

  11. Visualizing Research Data Records for their Better Management

    DEFF Research Database (Denmark)

    Ball, Alexander; Darlington, Mansur; Howard, Thomas J.

    2014-01-01

    As academia in general, and research funders in particular, place ever greater importance on data as an output of research, so the value of good research data management practices becomes ever more apparent. In response to this, the Innovative Design and Manufacturing Research Centre (IdMRC) at the University of Bath, UK, with funding from the JISC, ran a project to draw up a data management planning regime. In carrying out this task, the ERIM (Engineering Research Information Management) Project devised a visual method of mapping out the data records produced in the course of research, along with the associations between them. This method, called Research Activity Information Development (RAID) Modelling, is based on the Unified Modelling Language (UML) for portability. It is offered to the wider research community as an intuitive way for researchers both to keep track of their own data and to communicate...

  12. An optical age chronology of late Quaternary extreme fluvial events recorded in Ugandan dambo soils

    Science.gov (United States)

    Mahan, S.A.; Brown, D.J.

    2007-01-01

    There is little geochronological data on sedimentation in dambos (seasonally saturated, channel-less valley floors) found throughout Central and Southern Africa. Radiocarbon dating is problematic for dambos due to (i) oxidation of organic materials during dry seasons; and (ii) the potential for contemporary biological contamination of near-surface sediments. However, for luminescence dating the equatorial site and semi-arid climate facilitate grain bleaching, while the gentle terrain ensures shallow water columns, low turbidity, and relatively long surface exposures for transported grains prior to deposition and burial. For this study, we focused on dating sandy strata (indicative of high-energy fluvial events) at various positions and depths within a second-order dambo in central Uganda. Blue-light quartz optically stimulated luminescence (OSL) ages were compared with infrared stimulated luminescence (IRSL) and thermoluminescence (TL) ages from finer grains in the same sample. A total of 8 samples were dated, with 6 intervals obtained at ∼35, 33, 16, 10.4, 8.4, and 5.9 ka. In general, luminescence ages were stratigraphically, geomorphically and ordinally consistent and most blue-light OSL ages could be correlated with well-dated climatic events registered either in Greenland ice cores or Lake Victoria sediments. Based upon OSL age correlations, we theorize that extreme fluvial dambo events occur primarily during relatively wet periods, often preceding humid-to-arid transitions. The optical ages reported in this study provide the first detailed chronology of dambo sedimentation, and we anticipate that further dambo work could provide a wealth of information on the paleohydrology of Central and Southern Africa. © 2006 Elsevier Ltd. All rights reserved.

  13. SLAC scientists help set data transfer speed record

    CERN Multimedia

    2003-01-01

    SLAC is part of an international relay team that was recently awarded a certified data transfer speed record by the Internet2 consortium. The team transferred uncompressed data at 923 megabits per second for 58 seconds from Sunnyvale to Amsterdam, a distance of almost 6,800 miles, or about a quarter of the way around the world. This transfer speed is more than 3,500 times as fast as a typical home Internet broadband connection (1/2 page).

  14. Factors Affecting Accuracy of Data Abstracted from Medical Records.

    Directory of Open Access Journals (Sweden)

    Meredith N Zozus

    Full Text Available Medical record abstraction (MRA) is often cited as a significant source of error in research data, yet MRA methodology has rarely been the subject of investigation. Lack of a common framework has hindered application of the extant literature in practice, and, until now, there were no evidence-based guidelines for ensuring data quality in MRA. We aimed to identify the factors affecting the accuracy of data abstracted from medical records and to generate a framework for data quality assurance and control in MRA. Candidate factors were identified from published reports of MRA. Content validity of the top candidate factors was assessed via a four-round, two-group Delphi process with expert abstractors with experience in clinical research, registries, and quality improvement. The resulting coded factors were categorized into a control-theory-based framework of MRA. Coverage of the framework was evaluated using the recent published literature. Analysis of the identified articles yielded 292 unique factors that affect the accuracy of abstracted data. Overall, the Delphi processes refuted three of the top factors identified from the literature on the basis of importance and five on the basis of reliability (six total factors refuted). Four new factors were identified by the Delphi. The generated framework demonstrated comprehensive coverage. Significant underreporting of MRA methodology in recent studies was discovered. The framework generated from this research provides a guide for planning data quality assurance and control for studies using MRA. The large number and variability of factors indicate that while prospective quality assurance likely increases the accuracy of abstracted data, monitoring the accuracy during the abstraction process is also required. Recent studies reporting research results based on MRA rarely reported data quality assurance or control measures, and even less frequently reported data quality metrics with research results. Given the demonstrated

  15. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study

    Directory of Open Access Journals (Sweden)

    Yuan Zhou

    2014-02-01

    Full Text Available Background: The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No work to date, however, has examined the effect of interoperability quantitatively using discrete-event simulation techniques. Objective: To estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Methods: Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete-event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. Results: High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, neither the presence of EHRs nor interoperability significantly affected the time usage of registered nurses or the total visit time and waiting time of patients. Conclusion: This paper suggests that the impact of HIT on the work efficiency of clinical and nonclinical staff varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
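
The discrete-event simulation approach described above can be illustrated with a minimal event-calendar model: a time-ordered heap of events and state updates per event. The single-provider clinic below, with exponential interarrival and service times, is a toy sketch under our own assumptions (function name and parameters are illustrative, not the study's 12-practice model).

```python
import heapq
import random

def simulate_clinic(n_patients=100, mean_interarrival=10.0,
                    mean_service=8.0, seed=42):
    """Minimal discrete-event simulation of a single-provider clinic.

    Returns the mean patient waiting time (in the same time units as
    the parameters). Illustrative only: the study's model covers many
    clinical and administrative task types across 12 practices.
    """
    rng = random.Random(seed)
    calendar = []                      # min-heap of (time, kind) events
    t = 0.0
    for _ in range(n_patients):       # schedule all arrivals up front
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(calendar, (t, "arrive"))

    queue = []                        # arrival times of waiting patients
    busy_until = 0.0
    waits = []
    while calendar:
        now, kind = heapq.heappop(calendar)
        if kind == "arrive":
            queue.append(now)
        # the provider picks up the next patient whenever free
        if queue and now >= busy_until:
            arrived = queue.pop(0)
            start = max(now, busy_until)
            waits.append(start - arrived)
            busy_until = start + rng.expovariate(1.0 / mean_service)
            heapq.heappush(calendar, (busy_until, "depart"))
    return sum(waits) / len(waits)
```

Interoperability scenarios could then be compared by adjusting task-time parameters, e.g. a shorter `mean_service` when lab results arrive electronically rather than on paper.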

  16. Retrospective evaluation of all recorded horse race starts in Switzerland during a four-year period focusing on discipline-specific risk factors for clinical events.

    Science.gov (United States)

    Schweizer, C; Ramseyer, A; Gerber, V; Christen, G; Burger, D; Wohlfender, F D

    2016-11-01

    Racetrack injuries are of welfare concern and the prevention of injuries is an important goal in many racing jurisdictions. Over the years this has led to more detailed recording of clinical events on racecourses. However, risk factor analyses of clinical events at race meetings have not previously been reported for Switzerland. To identify discipline-specific factors that influence the occurrence of clinical events during race meetings, with the ultimate aim of improving the monitoring and safety of racetracks in Switzerland and optimising racehorse welfare. Retrospective study of horse race data collected by the Swiss horse racing association. All race starts (n = 17,670, including 6198 flat, 1257 obstacle and 10,215 trot race starts) recorded over a period of 4 years (2009-2012) were analysed in multivariable mixed-effect logistic regression models including horse- and racecourse-related data. The models were designed to identify discipline-specific factors influencing the occurrence of clinical events on racecourses in Switzerland. Factors influencing the risk of clinical events during races were different for each discipline. The risk of a clinical event in trot racing was lower for racing on a Porphyre sand track than on grass tracks. Horses whose driver was also their trainer had an approximately 2-fold higher risk of clinical events. In obstacle races, longer distances (2401-3300 m and 3301-5400 m, respectively) had a protective effect compared with racing over shorter distances. In flat racing, 5 racecourses reported significantly fewer clinical events. In all 3 disciplines, finishing in 8th place or later was associated with clinical events. Changes in management that aim to improve the safety and welfare of racehorses, such as racetrack adaptations, need to be individualised for each discipline. © 2015 EVJ Ltd.

  17. Seismic source associated with the repetitive events recorded at the Nevado del Huila volcano - Colombia in November 2008

    Science.gov (United States)

    Trujillo, N.; Valdes-González, C. M.; White, R.; Dawson, P. B.; McCausland, W. A.; Santacoloma, C.

    2016-12-01

    The Nevado del Huila Volcano recorded an eruption on November 21st, 2008. This eruptive event was preceded by approximately 11,200 seismic events associated with fluid dynamics inside the volcanic conduits. These seismic signals were classified as Hybrid events (HB), Long Period events (LP) and Drumbeat events, and their fundamental characteristic was great regularity in time, i.e. their waveforms and bandwidths were very similar to each other. Cardona et al. (2009) made a first analysis of these signals and proposed the existence of two seismic families: the first comprising the LP and HB events registered between November 9th and November 21st, and the second comprising the Drumbeat events registered between November 20th and 21st. Our project took the work of Cardona et al. (2009) as its starting point; we established the degree of similarity between events within each of the two proposed families. First, we made a temporal analysis using the Hilbert Transform, and then applied the cross-correlation technique. Finally, a stack of the signals with correlation coefficients > 0.9 was obtained. The results were: 8000 events with correlation coefficients > 0.9 and the existence of six possible seismic families. A detailed analysis of the seismic signals obtained through stacking allowed us to conclude the existence of four families: the first recorded between the 4th and 18th of November, the second for the drumbeat events recorded on November 11th, the third for the seismicity recorded between the 14th and 21st of November, and the fourth for the drumbeat events registered on November 20th and 21st.
We suggest that each of these families was associated with a different seismic source; thus, the first and third families were possibly associated with mechanisms like brittle fracturing that can occur in weak areas where cracks or conduits intersect, and where acoustic resonance can occur, and the second
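
The cross-correlation and stacking workflow described above can be sketched as a greedy grouping of waveforms into families. This is a simplified stand-in, assuming pre-aligned, equal-length traces; the function names and the single-linkage grouping rule are ours, not the authors'.

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Peak of the normalized cross-correlation between two traces
    (1.0 for identical traces at zero lag)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode="full"))

def group_into_families(waveforms, threshold=0.9):
    """Greedy grouping: an event joins the first family whose stacked
    waveform it correlates with above `threshold`, otherwise it seeds a
    new family. Returns the family stacks and the member indices of
    each family."""
    stacks, members = [], []
    for i, w in enumerate(waveforms):
        for k, s in enumerate(stacks):
            if max_norm_xcorr(w, s) > threshold:
                members[k].append(i)
                # update the running stack (traces assumed aligned)
                stacks[k] = np.mean([waveforms[j] for j in members[k]], axis=0)
                break
        else:
            stacks.append(np.asarray(w, dtype=float))
            members.append([i])
    return stacks, members
```

Stacking the members of each family boosts the signal-to-noise ratio of the common waveform, which is what makes the source interpretation above possible.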

  18. The Deelen infrasound array for recording sonic booms and events of CTBT interest

    NARCIS (Netherlands)

    Evers, L.; de Bree, H.E.; Haak, H.W.; Koers, A.A.

    2000-01-01

    The Seismology Division of the Royal Netherlands Meteorological Institute (KNMI) has built up expertise in infrasound measurements by investigating low-frequency events in order to distinguish between seismic and sonic events. KNMI operates, amongst others, a sixteen-element microbarometer array

  20. Satellite Climate Data Records: Development, Applications, and Societal Benefits

    Directory of Open Access Journals (Sweden)

    Wenze Yang

    2016-04-01

    Full Text Available This review paper discusses how to develop, produce, sustain, and serve satellite climate data records (CDRs) in the context of transitioning research to operation (R2O). Requirements and critical procedures for producing various CDRs, including Fundamental CDRs (FCDRs), Thematic CDRs (TCDRs), Interim CDRs (ICDRs), and climate information records (CIRs), are discussed in detail, including radiance/reflectance and the essential climate variables (ECVs) of land, ocean, and atmosphere. Major international CDR initiatives, programs, and projects are summarized. Societal benefits of CDRs in various user sectors, including Agriculture, Forestry, Fisheries, Energy, Health, Water, Transportation, and Tourism, are also briefly discussed. The challenges and opportunities for CDR development, production and service are also addressed. It is essential to maintain credible CDR products by allowing free access to products and keeping the production process transparent by making source code and documentation available with the dataset.

  1. The 2015/16 El Niño Event as Recorded in Central Tropical Pacific Corals: Temperature, Hydrology, and Ocean Circulation Influences

    Science.gov (United States)

    O'Connor, G.; Cobb, K. M.; Sayani, H. R.; Grothe, P. R.; Atwood, A. R.; Stevenson, S.; Hitt, N. T.; Lynch-Stieglitz, J.

    2016-12-01

    The El Niño/Southern Oscillation (ENSO) of 2015/2016 was a record-breaking event in the central Pacific, driving profound changes in the properties of the ocean and atmosphere. Prolonged ocean warming of up to 3°C translated into a large-scale coral bleaching and mortality event on Christmas Island (2°N, 157°W) that very few individuals escaped unscathed. As part of a long-term, interdisciplinary monitoring effort underway since August 2014, we present results documenting the timing and magnitude of environmental changes on the Christmas Island reefs. In particular, we present the first coral geochemical time series spanning the last several years, using cores that were drilled from rare living coral colonies during a field expedition in April 2016, at the tail end of the event. These geochemical indicators are sensitive to ocean temperature, salinity, and water mass properties, and have been used to quantitatively reconstruct ENSO extremes of the recent [Nurhati et al., 2011] and distant [Cobb et al., 2013] past. By analyzing multiple cores from both open ocean and lagoonal settings, we are able to undertake a quantitative comparison of this event with past very strong El Niño events contained in the coral archive, including the 1940/41, 1972/73, and 1997/98 events. For the most recent event, we compare our coral geochemistry records with a rich suite of in situ environmental data, including physical and geochemical parameters collected as part of the NOAA rapid response campaign in the central tropical Pacific. This unique dataset not only provides physical context for interpreting coral geochemical records from the central tropical Pacific, but allows us to assess why the 2015/2016 El Niño event was so devastating to coral reef ecosystems in this region.

  2. FDA Adverse Event Reporting System (FAERS): Latest Quarterly Data Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — The FDA Adverse Event Reporting System (FAERS) is a database that contains information on adverse event and medication error reports submitted to FDA. The database...

  3. THREE-DIMENSIONAL DATA AND THE RECORDING OF MATERIAL STRUCTURE

    Directory of Open Access Journals (Sweden)

    R. Parenti

    2012-09-01

    Full Text Available The “description” of a material structure requires a high degree of objectivity to serve the scientific interests of certain disciplines (archeological documentation, conservation and restoration, safeguarding of cultural assets and heritage). Geometric data and photographic documentation of surfaces are thus the best instruments for efficacious, clear and objective recording of architectural objects and other anthropic manifestations. In particular, the completeness and diachrony of photographic documentation has always proven essential in recording the material structure of historical buildings. The aim of our contribution is to show the results of several projects carried out with the aid of survey methodologies that utilize digital photographic images to generate RGB (ZScan) point clouds of architectural monuments (urban standing buildings, monuments in archaeological areas, etc.) and art objects. These technologies allow us to capture data using digital photogrammetric techniques; although not based on laser scanners, they can nonetheless create dense 3D point clouds, simply by using images that have been obtained via digital camera. The results are comparable to those achieved with laser scanner technology, although the procedures are simpler, faster and cheaper. We intend to try to adapt these technologies to the requirements and needs of scientific research and the conservation of cultural heritage. Furthermore, we will present protocols and procedures for data recording, processing and transfer in the cultural heritage field, especially with regard to historical buildings. Cooperation among experts from different disciplines (archaeology, engineering and photogrammetry) will allow us to develop technologies and proposals for a widely adoptable workflow in the application of such technologies, in order to build an integrated system that can be used throughout the scientific community. Toward this end, open formats and integration will be

  4. ATLAS event at 13 TeV - Highest mass dijets resonance event in 2015 data

    CERN Multimedia

    ATLAS Collaboration

    2015-01-01

    The highest-mass, central dijet event passing the dijet resonance selection collected in 2015 (Event 1273922482, Run 280673): the two central high-pT jets have an invariant mass of 6.9 TeV, the two leading jets have a pT of 3.2 TeV. The missing transverse momentum in this event is 46 GeV.

  5. ATLAS event at 13 TeV - Highest mass dijets angular event in 2015 data

    CERN Document Server

    ATLAS Collaboration

    2015-01-01

    The highest-mass dijet event passing the angular selection collected in 2015 (Event 478442529, Run 280464): the two central high-pT jets have an invariant mass of 7.9 TeV, the three leading jets have a pT of 1.99, 1.86 and 0.74 TeV respectively. The missing transverse momentum in this event is 46 GeV.

  6. An automated cross-correlation based event detection technique and its application to surface passive data set

    Science.gov (United States)

    Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike

    2013-01-01

    In studies of heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days to months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event-identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach succeeds in improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Our algorithm also has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and to a field surface passive data set recorded at a geothermal site.
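
The conventional short-term/long-term energy-ratio detector, plus a simple multi-station extension in the spirit of the approach above, can be sketched as follows. The geometric-mean consistency score below is our stand-in for the authors' cross-correlation of energy ratios; window lengths and thresholds are illustrative assumptions.

```python
import numpy as np

def sta_lta(trace, n_sta=20, n_lta=400):
    """Short-term / long-term average energy ratio of one trace.
    Values well above 1 flag a candidate event onset; entries before
    the first full long-term window are left at 0."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta      # trailing windows
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    ratio = np.zeros(len(energy))
    ratio[n_lta - 1:] = sta[n_lta - n_sta:] / lta
    return ratio

def coherent_detection(traces, threshold=5.0, **kwargs):
    """Flag samples whose energy ratio is high on *all* traces, via a
    geometric mean across stations -- a simple stand-in for
    cross-correlating the per-trace ratios, which is the statistic the
    paper actually proposes."""
    ratios = np.array([sta_lta(tr, **kwargs) for tr in traces])
    score = ratios.prod(axis=0) ** (1.0 / len(traces))
    return np.flatnonzero(score > threshold)
```

Because the multi-trace score is small unless every station sees elevated energy at the same time, isolated noise bursts on a single trace are rejected, which is the key advantage over per-trace triggering.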

  7. Two Extreme Climate Events of the Last 1000 Years Recorded in Himalayan and Andean Ice Cores: Impacts on Humans

    Science.gov (United States)

    Thompson, L. G.; Mosley-Thompson, E. S.; Davis, M. E.; Kenny, D. V.; Lin, P.

    2013-12-01

    In the last few decades numerous studies have linked pandemic influenza, cholera, malaria, and viral pneumonia, as well as droughts, famines and global crises, to the El Niño-Southern Oscillation (ENSO). Two annually resolved ice core records, one from Dasuopu Glacier in the Himalaya and one from the Quelccaya Ice Cap in the tropical Peruvian Andes, provide an opportunity to investigate these relationships on opposite sides of the Pacific Basin for the last 1000 years. The Dasuopu record provides an annual history from 1440 to 1997 CE and a decadally resolved record from 1000 to 1440 CE, while the Quelccaya ice core provides annual resolution over the last 1000 years. Major ENSO events are often recorded in the oxygen isotope, insoluble dust, and chemical records from these cores. Here we investigate outbreaks of diseases, famines and global crises during two of the largest events recorded in the chemistry of these cores, particularly large peaks in the concentrations of chloride (Cl-) and fluoride (F-). One event is centered on 1789 to 1800 CE and the second begins abruptly in 1345 and tapers off after 1360 CE. These Cl- and F- peaks represent major droughts and reflect the abundance of continental atmospheric dust, derived in part from dried lake beds in drought-stricken regions upwind of the core sites. For Dasuopu the likely sources are in India, while for Quelccaya the sources would be the Andean Altiplano. Both regions are subject to drought conditions during the El Niño phase of the ENSO cycle. These two events persist longer (10 to 15 years) than today's typical ENSO events in the Pacific Ocean Basin. The 1789 to 1800 CE event was associated with a very strong El Niño event and was coincidental with the Boji Bara famine resulting from extended droughts that led to over 600,000 deaths in central India by 1792. Similarly extensive droughts are documented in Central and South America. Likewise, the 1345 to 1360 CE event, although poorly documented

  8. Record of palaeoenvironmental changes in the Mid-Polish Basin during the Valanginian Event

    Science.gov (United States)

    Morales, Chloé; Kujau, Ariane; Heimhofer, Ulrich; Mutterlose, Joerg; Spangenberg, Jorge; Adatte, Thierry; Ploch, Isabela; Föllmi, Karl B.

    2013-04-01

    The Valanginian stage displays the first major perturbation of the carbon cycle of the Cretaceous period. The Valanginian Weissert episode is associated with a positive carbon isotope excursion (CIE) in δ13Ccarb and δ13Corg values, and with a crisis in pelagic and neritic carbonate production (Weissert et al., 1998; Erba, 2004; Föllmi et al., 2007). As for Cretaceous oceanic anoxic events (OAEs), the carbon anomaly is explained by the intensification of continental biogeochemical weathering triggering an increase in marine primary productivity and organic-matter preservation. However, contrary to the OAEs, the organic matter trapped in the Tethyan Ocean during the Valanginian is both marine and continental, and the occurrence of a widespread anoxia could not be evidenced (Westermann et al., 2010; Kujau et al., 2012). The resulting marine Corg burial rates were probably not sufficient to explain the shift in δ13C values, and an alternative scheme has been proposed by Westermann et al. (2010): the carbonate platform crisis combined with the storage of organic matter on the continent may be the major triggers of the δ13C positive shift. We present the results of an analysis of the Wawal drilling core (Mid-Polish Trough), which is of particular interest because of its near-coastal setting and its exceptional preservation, demonstrated by the presence of up to 17 wt.% aragonite. The section consists of marine silty to sandy clays deposited on top of a lower Berriasian karstified limestone. It covers the Early and early Late Valanginian, and displays the onset of the positive excursion. The lack of anoxia is evidenced by trace-element and Rock-Eval data. Two intervals of phosphogenesis are identified that appear equivalent in time to the condensed horizons of the northern Tethyan region (Helvetic Alps). A rapid climate change toward less humid and seasonally-contrasted conditions, similar to that in the northern Tethyan areas, is observed

  9. Data-record-size requirements for adaptive antenna arrays

    Science.gov (United States)

    Psaromiligkos, Ioannis N.; Batalama, Stella N.

    2000-07-01

    We investigate the data-record-size requirements to meet a given performance objective in interference suppression and direction-of-arrival (DoA) estimation problems. For interference suppression problems we consider the MVDR (minimum-variance-distortionless-response) beamformer evaluated under desired-signal-present and desired-signal- absent conditions. For the former case we adopt as the figure of merit the ratio between the output variance of the ideal and the estimated SMI (sample-matrix-inversion) MVDR filter, while for the latter we examine the inverse of the corresponding ratio of the output SINRs (signal-to- interference-plus-noise ratio). For DoA estimation problems we consider the conventional and the MVDR DoA estimation algorithm and we adopt a spectrum-based performance measure that is given as a function of the ratio between the estimated and the ideal spectrum. For all cases, we derive closed form expressions that provide the data record size that is necessary to achieve a given performance confidence level in a neighborhood of the optimal performance point as well as expressions that identify the performance level that can be reached for a given data record size. This is done by utilizing close approximations of the involved probability density functions (pdfs) and Markoff-type inequalities. The practical significance of the derived expressions lies in the fact that they are functions of the number of antenna elements only while they are independent of the ideal covariance matrix which is not known in most realistic applications. As a byproduct of the above developments we derive close approximations to the pdf of the output SINR of the MVDR beamformer when the latter is estimated in the presence or in the absence of the desired signal.
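
A minimal sketch of the SMI MVDR beamformer whose finite-sample behavior the paper analyzes. The uniform linear array geometry and the diagonal loading are our own assumptions for a runnable example; the closed-form data-record-size expressions themselves are not reproduced here.

```python
import numpy as np

def steering(n_elem, theta):
    """Steering vector of a uniform linear array with half-wavelength
    element spacing, for a source at angle `theta` (radians)."""
    return np.exp(1j * np.pi * np.arange(n_elem) * np.sin(theta))

def smi_mvdr(snapshots, a):
    """MVDR weights estimated by sample matrix inversion (SMI).

    `snapshots` is (n_elem, K): the larger the data record K, the
    closer the sample covariance -- and hence the filter -- is to the
    ideal one, which is exactly the trade-off quantified above. A small
    diagonal load keeps the inversion stable for short records."""
    K = snapshots.shape[1]
    R = snapshots @ snapshots.conj().T / K          # sample covariance
    R += 1e-6 * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)                 # distortionless: w^H a = 1
```

By construction the weights satisfy the distortionless constraint toward the look direction while minimizing the estimated output variance, so interference from other directions is nulled to the extent the finite record allows.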

  10. Physicists set new record for network data transfer

    CERN Multimedia

    2006-01-01

    "An international team of physicists, computer scientists, and network engineers led by the California Institute of Technology, CERN and the University of Michigan and partners at the University of Florida and Vanderbilt, as well as participants from Brazil (Rio de Janeiro State University, UERJ, and the State Universities of Sao Paulo, USP and UNESP) and Korea (Kyungpook National University, KISTI) joined forces to set new records for sustained data transfer between storage systems during the SuperComputing 2006 (SC06) Bandwidth Challenge (BWC)." (2 pages)

  11. Accuracy of Laboratory Data Communication on ICU Daily Rounds Using an Electronic Health Record.

    Science.gov (United States)

    Artis, Kathryn A; Dyer, Edward; Mohan, Vishnu; Gold, Jeffrey A

    2017-02-01

    Accurately communicating patient data during daily ICU rounds is critically important since data provide the basis for clinical decision making. Despite its importance, high-fidelity data communication during interprofessional ICU rounds is assumed, yet unproven. We created a robust but simple methodology to measure the prevalence of inaccurately communicated (misrepresented) data and to characterize data communication failures by type. We also assessed how commonly the rounding team detected data misrepresentation and whether data communication was impacted by environmental, human, and workflow factors. Direct observation of verbalized laboratory data during daily ICU rounds compared with data within the electronic health record and on presenters' paper prerounding notes. Twenty-six-bed academic medical ICU with a well-established electronic health record. ICU rounds presenter (medical student or resident physician), interprofessional rounding team. None. During 301 observed patient presentations including 4,945 audited laboratory results, presenters used a paper prerounding tool for 94.3% of presentations, but the tools contained only 78% of available electronic health record laboratory data. Ninety-six percent of patient presentations included at least one laboratory misrepresentation (mean, 6.3 per patient) and 38.9% of all audited laboratory data were inaccurately communicated. Most misrepresentation events were omissions. Only 7.8% of all laboratory misrepresentations were detected. Despite a structured interprofessional rounding script and a well-established electronic health record, clinician laboratory data retrieval and communication during ICU rounds at our institution was poor, prone to omissions and inaccuracies, yet largely unrecognized by the rounding team. This highlights an important patient safety issue that is likely widely prevalent, yet underrecognized.

  12. Sustained production of multi-decadal climate records - Lessons from the NOAA Climate Data Record Program

    Science.gov (United States)

    Bates, J. J.

    2015-12-01

    NOAA's Climate Data Record (CDR) Program was designed to be responsive to the needs of climate monitoring, research, and services, with the ultimate aim of serving decision making across a spectrum of users for the long term. It requires the sustained production of high-quality, multidecadal time series data describing the global atmosphere, oceans, and land surface that can be used for informed decision making. The challenges of a long-term program of sustaining CDRs, as contrasted with the short-term efforts of traditional three-year research programs, are substantial and different. The sustained production of CDRs requires collaboration between experts in the climate community, data management, and software development and maintenance. It is also informed by scientific application and associated user feedback on the accessibility and usability of the produced CDRs. The CDR Program has developed a metric for assessing the maturity of CDRs with respect to data management, software, and user application and applied it to over 28 CDRs. The primary lesson learned over the past seven years is that a rigorous, team approach to data management, employing subject matter experts at every step, is critical to open and transparent production. This approach also makes it much easier to support the needs of users who want near-real-time production of "interim" CDRs for monitoring and users who want to use CDRs for tailored authoritative information, such as a drought index. This talk will review the history of the CDR program, its current status, and plans.

  13. A 1-Ma record of sea surface temperature and extreme cooling events in the North Atlantic: A perspective from the Iberian Margin

    Science.gov (United States)

    Rodrigues, T.; Alonso-García, M.; Hodell, D. A.; Rufino, M.; Naughton, F.; Grimalt, J. O.; Voelker, A. H. L.; Abrantes, F.

    2017-09-01

    The Iberian Margin is a sensitive area for tracking high- and low-latitude processes, and is a key location for understanding major past climatic and oceanographic changes. Here we present new biomarker data from IODP Site U1385 ('Shackleton site') (1017-336 ka) that, when combined with existing data from Cores MD01-2443/4 (last 335 ka), allow us to assess the evolution of sea surface temperature (SST) and meltwater influx over the last 1 Ma at the Iberian Margin. Interglacial periods throughout the last 1 Ma show SST close to 20 °C, even during the so-called 'luke-warm' interglacials that are marked by relatively low atmospheric CO2 concentrations. During glacial periods, extremely cold stadial events are recognized at the Iberian Margin, and are very likely related to meltwater discharges from the European and British-Irish ice sheets into the NE Atlantic, which were transported southwards by the Portugal Current. We subdivided the record into four intervals on the basis of the timing and the magnitude of these extremely cold stadials: 1) from 1017 to ∼900 ka, only minor sporadic freshwater input occurred during deglaciations; 2) from 900 to 675 ka, extreme cold events occur as terminal stadial events at the beginning of the deglaciations, which results in abrupt deglacial SST shifts; 3) from 675 to 450 ka, only a few, very short-lived events are recorded and seldom is there freshwater input at the Iberian Margin; 4) during the last 450 ka, the extreme cold events occurred under full glacial conditions, with particularly severe events during MIS 6 and 8. We propose that these mid-glacial events are associated with strong discharges of the European ice sheet (EIS). The fact that these extreme cold events do not coincide with deglaciations questions the role of European ice sheet discharges in triggering deglaciations.

  14. An annually resolved marine proxy record for the 8.2K cold event from the northern North Sea based on bivalve shells

    Science.gov (United States)

    Butler, Paul; Estrella-Martínez, Juan; Scourse, James

    2017-04-01

    The so-called 8.2K cold event is a rapid cooling of about 6 ± 2 °C recorded in the Greenland ice core record and thought to be a consequence of a freshwater pulse from the Laurentide ice sheet which reduced deepwater formation in the North Atlantic. In the Greenland ice cores the event is characterized by a maximum extent of 159 years and a central event lasting for 70 years. As discussed by Thomas et al. (QSR, 2007), the low resolution and dating uncertainty of much palaeoclimate data make it difficult to determine the rates of change and causal sequence that characterise the event at different locations. We present here a bivalve shell chronology based on four shells of Arctica islandica from the northern North Sea which (within radiocarbon uncertainty) is coeval with the 8.2K event recorded in the Greenland ice cores. The years of death of each shell based on radiocarbon analysis and crossmatching are 8094, 8134, 8147, and 8208 yrs BP (where "present" = AD 1950), with an associated radiocarbon uncertainty of +/- 80 yrs, and their longevities are 106, 122, 112 and 79 years respectively. The total length of the chronology is 192 years (8286-8094 BP +/- 80 yrs). The most noticeable feature of the chronology is a 60-year period of increasing growth which may correspond to a similar period of decreasing ice accumulation in the GRIP (central Greenland) ice core record. We tentatively suggest that this reflects increasing food supply to the benthos as summer stratification is weakened by colder seawater temperatures. Stable isotope analyses (results expected to be available when this abstract is presented) will show changes at annual and seasonal resolution, potentially giving a very detailed insight into the causal factors associated with the 8.2K event and its impact in the northern North Sea.

  15. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches while, at the same time, describing their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested both on business-like event logs recorded in a higher-education enterprise resource planning system and on a real case scenario involving a set of Dutch municipalities.

  16. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    is for improved susceptibility rather than endurance, the error of applying a classical survival model was nonnegligible. The difference was most pronounced for scenarios with substantial underlying genetic variation in endurance and when the 2 underlying traits were lowly genetically correlated. In the presence...... pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study...

  17. Smoothing spline ANOVA frailty model for recurrent event data.

    Science.gov (United States)

    Du, Pang; Jiang, Yihua; Wang, Yuedong

    2011-12-01

Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and parameters in the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. The convergence of the algorithm is guaranteed by decreasing the step size of parameter update and/or increasing the MCMC sample size along iterations. A model selection procedure is also developed to identify negligible components in a functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate their use through the analysis of bladder tumor data. © 2011, The International Biometric Society.

  18. Recording real case data of earth faults in distribution lines

    Energy Technology Data Exchange (ETDEWEB)

    Haenninen, S. [VTT Energy, Espoo (Finland)

    1996-12-31

The most common fault type in electrical distribution networks is the single phase to earth fault. According to earlier studies, for instance in the Nordic countries, about 80 % of all faults are of this type. To develop protection and fault location systems, it is important to obtain real case data of disturbances and faults which occur in the networks. For example, the earth fault initial transients can be used for earth fault location. The aim of this project was to collect and analyze real case data of earth fault disturbances in medium voltage distribution networks (20 kV). Therefore, data on fault occurrences were recorded at two substations, one with an unearthed and the other with a compensated neutral, as follows: (a) the phase currents and neutral current for each line in the case of low fault resistance; (b) the phase voltages and neutral voltage from the voltage measuring bay in the case of low fault resistance; and (c) the neutral voltage and the components of 50 Hz at the substation in the case of high fault resistance. In addition, basic data on the fault occurrences were collected (data of the line, fault location, cause and so on). The data will be used in the development work of fault location and earth fault protection systems.

  19. A new method to facilitate valid and consistent grading cardiac events in childhood cancer survivors using medical records.

    Directory of Open Access Journals (Sweden)

    Elizabeth Lieke A M Feijen

Full Text Available BACKGROUND: Cardiac events (CEs) are among the most serious late effects following childhood cancer treatment. To establish accurate risk estimates for the occurrence of CEs it is essential that they are graded in a valid and consistent manner, especially for international studies. We therefore developed a data-extraction form and a set of flowcharts to grade CEs and tested the validity and consistency of this approach in a series of patients. METHODS: The Common Terminology Criteria for Adverse Events versions 3.0 and 4.0 were used to define the CEs. Forty patients were randomly selected from a cohort of 72 subjects with known CEs that had been graded by a physician for an earlier study. To establish whether the new method was valid for appropriate grading, a non-physician graded the CEs using the new method. To evaluate consistency of the grading, the same charts were graded again by two other non-physicians, one receiving a brief introduction and one receiving extensive training on the new method. We calculated weighted Kappa statistics to quantify inter-observer agreement. RESULTS: The inter-observer agreement was 0.92 (95% CI 0.80-1.00) for validity, and 0.88 (0.79-0.98) and 0.99 (0.96-1.00) for consistency with the outcome assessors who had the brief introduction and the extensive training, respectively. CONCLUSIONS: The newly developed standardized method to grade CEs using data from medical records has shown excellent validity and consistency. The study showed that the method can be correctly applied by researchers without a medical background, provided that they receive adequate training.
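The weighted Kappa statistic used here to quantify inter-observer agreement can be sketched in a few lines. The implementation below uses linear weights over ordinal CTCAE-style grades; the grade lists are hypothetical examples, not the study's data.

```python
# Linearly weighted Cohen's kappa for ordinal grades (0..n_grades-1).
# Example grades are illustrative, not from the study.

def weighted_kappa(r1, r2, n_grades=6):
    """Linearly weighted kappa between two raters' ordinal grades."""
    assert len(r1) == len(r2)
    n = len(r1)
    w = lambda i, j: abs(i - j) / (n_grades - 1)  # linear disagreement weight
    # observed weighted disagreement
    obs = sum(w(a, b) for a, b in zip(r1, r2)) / n
    # expected weighted disagreement from the marginal grade frequencies
    p1 = [sum(1 for a in r1 if a == g) / n for g in range(n_grades)]
    p2 = [sum(1 for b in r2 if b == g) / n for g in range(n_grades)]
    exp = sum(p1[i] * p2[j] * w(i, j)
              for i in range(n_grades) for j in range(n_grades))
    return 1.0 - obs / exp

physician  = [0, 1, 2, 3, 3, 4, 2, 1, 0, 2]   # hypothetical CE grades
researcher = [0, 1, 2, 3, 4, 4, 2, 1, 0, 1]
print(round(weighted_kappa(physician, researcher), 3))
```

Linear weighting penalises a one-grade disagreement less than a three-grade one, which matches how near-misses in CE grading are usually treated.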

  20. Spike detection from noisy neural data in linear-probe recordings.

    Science.gov (United States)

    Takekawa, Takashi; Ota, Keisuke; Murayama, Masanori; Fukai, Tomoki

    2014-06-01

    Simultaneous recordings of multiple neuron activities with multi-channel extracellular electrodes are widely used for studying information processing by the brain's neural circuits. In this method, the recorded signals containing the spike events of a number of adjacent or distant neurons must be correctly sorted into spike trains of individual neurons, and a variety of methods have been proposed for this spike sorting. However, spike sorting is computationally difficult because the recorded signals are often contaminated by biological noise. Here, we propose a novel method for spike detection, which is the first stage of spike sorting and hence crucially determines overall sorting performance. Our method utilizes a model of extracellular recording data that takes into account variations in spike waveforms, such as the widths and amplitudes of spikes, by detecting the peaks of band-pass-filtered data. We show that the new method significantly improves the cost-performance of multi-channel electrode recordings by increasing the number of cleanly sorted neurons. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
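A common baseline for the spike-detection stage the abstract describes is amplitude thresholding on the filtered trace, with the noise level estimated robustly via the median absolute deviation (MAD). This is a standard heuristic, not the model-based method the paper proposes, and the synthetic trace is illustrative only.

```python
# Baseline spike detector: flag local maxima exceeding k * robust noise sigma.
import statistics

def detect_spikes(signal, k=5.0):
    """Return indices of local maxima exceeding k * MAD-based noise sigma."""
    med = statistics.median(signal)
    mad = statistics.median(abs(x - med) for x in signal)
    sigma = mad / 0.6745          # MAD -> std dev under Gaussian noise
    thr = med + k * sigma
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > thr and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

# small noisy baseline with two embedded "spikes"
trace = [0.1, -0.2, 0.0, 0.15, -0.1, 9.0, 0.05, -0.15, 0.1, 7.5, 0.0, -0.05]
print(detect_spikes(trace))   # -> [5, 9]
```

The MAD-based estimate matters because a plain standard deviation is inflated by the spikes themselves, which would raise the threshold and miss events.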

  1. Event triggered data acquisition in the Rock Mechanics Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, R.D.

    1993-03-01

Increasing complexity of experiments coupled with limitations of the previously used computers required improvements in both hardware and software in the Rock Mechanics Laboratories. DATAVG, an existing software package for data acquisition and display written by D. J. Holcomb in 1983, could no longer support the increasing numbers of input channels and the need for better graphics. After researching the market and trying several alternatives, no commercial program was found which met our needs. The previous version of DATAVG had the basic features needed but was tied to obsolete hardware. Memory limitations on the previously used PDP-11 made it impractical to upgrade the software further. With the advances in IBM-compatible computers, it is now desirable to use them as data recording platforms. With this information in mind, it was decided to write a new version of DATAVG which would take advantage of newer hardware. The new version had to support multiple graphic display windows and increased channel counts. It also had to be easier to use.

  2. Analyzing System on A Chip Single Event Upset Responses using Single Event Upset Data, Classical Reliability Models, and Space Environment Data

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth; Campola, Michael; Xapsos, Michael

    2017-01-01

We are investigating the application of classical reliability performance metrics combined with standard single event upset (SEU) analysis data. We expect to relate SEU behavior to system performance requirements. Our proposed methodology will provide better prediction of SEU responses in harsh radiation environments, with confidence metrics. Keywords: single event upset (SEU), single event effect (SEE), field programmable gate array devices (FPGAs).

  3. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    Science.gov (United States)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  4. Event-by-event image reconstruction from list-mode PET data.

    Science.gov (United States)

    Schretter, Colas

    2009-01-01

    This paper adapts the classical list-mode OSEM and the globally convergent list-mode COSEM methods to the special case of singleton subsets. The image estimate is incrementally updated for each coincidence event measured by the PET scanner. Events are used as soon as possible to improve the current image estimate, and, therefore, the convergence speed toward the maximum-likelihood solution is accelerated. An alternative online formulation of the list-mode COSEM algorithm is proposed first. This method saves memory resources by re-computing previous incremental image contributions while processing a new pass over the complete dataset. This online expectation-maximization principle is applied to the list-mode OSEM method, as well. Image reconstructions have been performed from a simulated dataset for the NCAT torso phantom and from a clinical dataset. Results of the classical and event-by-event list-mode algorithms are discussed in a systematic and quantitative way.
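For contrast with the event-by-event singleton-subset variants the paper develops, the classical list-mode MLEM baseline (one full pass over all events per image update) can be sketched on a toy problem. The two-bin, two-pixel system matrix and event list below are hypothetical.

```python
# Toy list-mode MLEM: full pass over the event list per iteration.
A = [[0.8, 0.2],            # A[i][j] = P(coincidence in bin i | emission from pixel j)
     [0.2, 0.8]]
events = [0] * 8 + [1] * 2  # list-mode data: detector-bin index per coincidence
sens = [sum(A[i][j] for i in range(2)) for j in range(2)]  # pixel sensitivities

lam = [1.0, 1.0]            # initial image estimate
for _ in range(20):         # MLEM iterations
    back = [0.0, 0.0]
    for i in events:        # backproject each event's likelihood ratio
        denom = sum(A[i][k] * lam[k] for k in range(2))
        for j in range(2):
            back[j] += A[i][j] / denom
    lam = [lam[j] * back[j] / sens[j] for j in range(2)]

print([round(x, 3) for x in lam])
```

The event-by-event methods in the paper apply essentially this multiplicative update after each coincidence (with singleton subsets) rather than after a full pass, which is what accelerates early convergence. A useful sanity check: with unit sensitivities, MLEM preserves total counts, so the image values sum to the number of events.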

  5. Triptans and serious adverse vascular events: data mining of the FDA Adverse Event Reporting System database.

    Science.gov (United States)

    Roberto, Giuseppe; Piccinni, Carlo; D'Alessandro, Roberto; Poluzzi, Elisabetta

    2014-01-01

    The aim of this article is to investigate the vascular safety profile of triptans through an analysis of the United States Food and Drug Administration Adverse Event Reporting System (FDA_AERS) database with a special focus on serious and unexpected adverse events. A CASE/NON-CASE analysis was performed on the reports entered in the FDA_AERS from 2004 to 2010: CASES were reports with at least one event included in the MedDRA system organ classes 'Cardiac disorder' or 'Vascular disorders', whereas NON-CASES were all the remaining reports. Co-reported cardiovascular drugs were used as a proxy of cardiovascular risk and the adjusted reporting odds ratio (adj.ROR) with 95% confidence intervals (95% CI) was calculated. Disproportionality signals were defined as adj.ROR value >1. Adverse events were considered unexpected if not mentioned on the relevant label. Among 2,131,688 reports, 7808 concerned triptans. CASES were 2593 among triptans and 665,940 for all other drugs. Unexpected disproportionality signals were found in the following high-level terms of the MedDRA hierarchy: 'Cerebrovascular and spinal necrosis and vascular insufficiency' (103 triptan cases), 'Aneurysms and dissections non-site specific' (15), 'Pregnancy-associated hypertension' (10), 'Reproductive system necrosis and vascular insufficiency' (3). Our analysis revealed three main groups of unexpected associations between triptans and serious vascular events: ischaemic cerebrovascular events, aneurysms and artery dissections, and pregnancy-related vascular events. A case-by-case assessment is needed to confirm or disprove their plausibility and large-scale analytical studies should be planned for risk rate estimation. In the meantime, clinicians should pay special attention to migraine diagnosis and vascular risk assessment before prescribing a triptan, also promptly reporting any unexpected event to pharmacovigilance systems.
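The disproportionality statistic behind this study, the reporting odds ratio, comes from a 2x2 table of case/non-case counts for the drug of interest versus all other drugs. The sketch below computes the crude (unadjusted) ROR with a 95% CI from the report counts given in the abstract; the paper itself reports the *adjusted* ROR, which additionally controls for co-reported cardiovascular drugs.

```python
# Crude reporting odds ratio (ROR) with Wald 95% CI from a 2x2 table.
import math

def ror_ci(a, b, c, d, z=1.96):
    """a=drug cases, b=drug non-cases, c=other cases, d=other non-cases."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(ROR)
    lo = math.exp(math.log(ror) - z * se)
    hi = math.exp(math.log(ror) + z * se)
    return ror, lo, hi

# Counts from the abstract: 2593 triptan cases among 7808 triptan reports;
# 665,940 cases among the remaining 2,131,688 - 7808 reports.
a, b = 2593, 7808 - 2593
c, d = 665940, 2131688 - 7808 - 665940
ror, lo, hi = ror_ci(a, b, c, d)
print(round(ror, 3), round(lo, 3), round(hi, 3))
```

A disproportionality signal is declared when the (adjusted) ROR, or conventionally its CI lower bound, exceeds 1.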

  6. Continuing the Solar Irradiance Data Record with TSIS

    Science.gov (United States)

    Richard, E. C.; Pilewskie, P.; Kopp, G.; Coddington, O.; Woods, T. N.; Wu, D. L.

    2016-12-01

The Total and Spectral Solar Irradiance Sensor (TSIS), first selected in 1998 for the National Polar-orbiting Operational Environmental Satellite System (NPOESS), re-manifested in 2010 on the NOAA-NASA Joint Polar Satellite System (JPSS), then the NOAA Polar Free Flyer, is now scheduled for deployment in 2017 on the International Space Station. The TSIS will acquire measurements of total and spectral solar irradiance (TSI and SSI, respectively). TSIS provides continuation of the Total Irradiance Monitor (TIM) and the Spectral Irradiance Monitor (SIM), currently flying on the NASA Solar Radiation and Climate Experiment (SORCE). Launched in 2003, SORCE is now more than eight years beyond its prime-mission lifetime. The launch failure of NASA's Glory mission in 2011, coupled with diminished battery capacity on SORCE and delays in the launch of TSIS, has put the continuous 38-year TSI record at risk. In 2012, a plan to maintain continuity of the TSI calibration scale between SORCE and TSIS was rapidly implemented through the USAF Space Test Program STPSat-3 that launched in late 2013. The shorter SSI record faces a likely gap between SORCE and TSIS. This paper summarizes the importance of highly accurate and stable observations of solar irradiance in understanding the present climate epoch and for predicting future climate; why continuity in the solar irradiance data record is required; improvements in the TSIS TIM and SIM, including verification of their calibration using ground-based NIST-traceable cryogenic standards; and how these improvements will impact Sun-climate studies in the near future.

  7. Integrating cancer genomic data into electronic health records

    Directory of Open Access Journals (Sweden)

    Jeremy L. Warner

    2016-10-01

    Full Text Available Abstract The rise of genomically targeted therapies and immunotherapy has revolutionized the practice of oncology in the last 10–15 years. At the same time, new technologies and the electronic health record (EHR in particular have permeated the oncology clinic. Initially designed as billing and clinical documentation systems, EHR systems have not anticipated the complexity and variety of genomic information that needs to be reviewed, interpreted, and acted upon on a daily basis. Improved integration of cancer genomic data with EHR systems will help guide clinician decision making, support secondary uses, and ultimately improve patient care within oncology clinics. Some of the key factors relating to the challenge of integrating cancer genomic data into EHRs include: the bioinformatics pipelines that translate raw genomic data into meaningful, actionable results; the role of human curation in the interpretation of variant calls; and the need for consistent standards with regard to genomic and clinical data. Several emerging paradigms for integration are discussed in this review, including: non-standardized efforts between individual institutions and genomic testing laboratories; “middleware” products that portray genomic information, albeit outside of the clinical workflow; and application programming interfaces that have the potential to work within clinical workflow. The critical need for clinical-genomic knowledge bases, which can be independent or integrated into the aforementioned solutions, is also discussed.

  8. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of the rain event. In fact, the discrete and intermittent nature of rain processes makes the definition of some features inadequate when defined over a fixed duration. Long integration times (hour, day) mix rainy and clear-air periods in the same sample. Short integration times (seconds, minutes) lead to noisy data with a great sensitivity to detector characteristics. Analysis of the whole rain event, instead of individual short samples of a fixed duration, makes it possible to clarify relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability partly due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of the self-organizing map (SOM) is justified by the fact that it allows mapping a high-dimensional data space onto a two-dimensional space while preserving as much as possible the initial space topology, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows removing redundant variables, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the Dry percentage in event (Dd%e) feature. This confirms

  9. Uncertainty information in climate data records from Earth observation

    Directory of Open Access Journals (Sweden)

    C. J. Merchant

    2017-07-01

    Full Text Available The question of how to derive and present uncertainty information in climate data records (CDRs has received sustained attention within the European Space Agency Climate Change Initiative (CCI, a programme to generate CDRs addressing a range of essential climate variables (ECVs from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions. The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the

  10. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the
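The review's point that errors which are negligible per datum can dominate a CDR on large scales follows from how averaging treats correlated versus independent error components: the independent part shrinks as 1/sqrt(N) while a fully correlated (systematic) part does not shrink at all. The two-component split and the numbers below are illustrative, not taken from the paper.

```python
# Uncertainty of an N-measurement mean with an independent error component
# u_indep and a fully correlated (systematic) component u_common.
import math

def uncertainty_of_mean(u_indep, u_common, n):
    """Standard uncertainty of the mean: correlated part survives averaging."""
    return math.sqrt(u_common ** 2 + u_indep ** 2 / n)

u_single = math.sqrt(1.0 ** 2 + 0.01 ** 2)        # one datum: independent noise dominates
u_mean = uncertainty_of_mean(1.0, 0.01, 10 ** 6)  # million-datum mean: systematic dominates
print(round(u_single, 4), round(u_mean, 4))
```

Here a systematic effect contributing only 1% of a single measurement's uncertainty becomes essentially the entire uncertainty of a large-area, long-period average, which is why characterizing error correlation structure is central to CDR uncertainty budgets.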

  11. Correlating eligibility criteria generalizability and adverse events using Big Data for patients and clinical trials

    Science.gov (United States)

    Sen, Anando; Ryan, Patrick; Goldstein, Andrew; Chakrabarti, Shreya; Wang, Shuang; Koski, Eileen; Weng, Chunhua

    2016-01-01

    Randomized controlled trials can benefit from proactive assessment of how well their participant selection strategies during the design of eligibility criteria can influence the study generalizability. In this paper, we present a quantitative metric called generalizability index for study traits 2.0 (GIST 2.0) to assess the a priori generalizability (based on population representativeness) of a clinical trial by accounting for the dependencies among multiple eligibility criteria. The metric was evaluated on 16 sepsis trials identified from ClinicalTrials.gov, with their adverse event reports extracted from the trial results section. The correlation between GIST scores and adverse events was analyzed. We found that the GIST 2.0 score was significantly correlated with total adverse events and serious adverse events (weighted correlation coefficients of 0.825 and 0.709, respectively, with P < 0.01). This study exemplifies the promising use of Big Data in electronic health records and ClinicalTrials.gov for optimizing eligibility criteria design for clinical studies. PMID:27598694
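The "weighted correlation coefficients" reported here are weighted Pearson correlations. A minimal implementation follows; the GIST scores, event counts, and weights below are hypothetical illustrations, not the study's data.

```python
# Weighted Pearson correlation: weighted means, covariance, and variances.

def weighted_corr(x, y, w):
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / sw
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) / sw
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y)) / sw
    return cov / (vx * vy) ** 0.5

gist = [0.2, 0.4, 0.5, 0.7, 0.9]    # hypothetical GIST 2.0 scores per trial
events = [3, 6, 7, 11, 15]          # hypothetical adverse-event counts
weights = [120, 80, 200, 150, 60]   # e.g. trial enrolment sizes as weights
print(round(weighted_corr(gist, events, weights), 3))
```

With equal weights this reduces to the ordinary Pearson correlation; unequal weights let larger trials contribute proportionally more to the estimate.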

  12. Correlating eligibility criteria generalizability and adverse events using Big Data for patients and clinical trials.

    Science.gov (United States)

    Sen, Anando; Ryan, Patrick B; Goldstein, Andrew; Chakrabarti, Shreya; Wang, Shuang; Koski, Eileen; Weng, Chunhua

    2017-01-01

Randomized controlled trials can benefit from proactive assessment of how well their participant selection strategies during the design of eligibility criteria can influence the study generalizability. In this paper, we present a quantitative metric called generalizability index for study traits 2.0 (GIST 2.0) to assess the a priori generalizability (based on population representativeness) of a clinical trial by accounting for the dependencies among multiple eligibility criteria. The metric was evaluated on 16 sepsis trials identified from ClinicalTrials.gov, with their adverse event reports extracted from the trial results sections. The correlation between GIST scores and adverse events was analyzed. We found that the GIST 2.0 score was significantly correlated with total adverse events and serious adverse events (weighted correlation coefficients of 0.825 and 0.709, respectively, with P < 0.01). This study exemplifies the promising use of Big Data in electronic health records and ClinicalTrials.gov for optimizing eligibility criteria design for clinical studies. © 2016 New York Academy of Sciences.

  13. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data.

    Science.gov (United States)

    Saramago, Pedro; Chuang, Ling-Hsiang; Soares, Marta O

    2014-09-10

    Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest.
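The core modelling idea, individual times with censoring and aggregate event counts sharing one underlying time-to-healing distribution, can be sketched as a joint Weibull log-likelihood. The parameterisation S(t) = exp(-(t/scale)^shape) and all data values below are hypothetical, and this omits the treatment-effect and network structure of the full model.

```python
# Joint log-likelihood: individual (time, censoring) records plus
# aggregate (healed, total, follow-up) counts under one Weibull model.
import math

def weibull_loglik(shape, scale, ipd, agg):
    """ipd: list of (time, event) with event=1 healed, 0 right-censored.
       agg: list of (n_healed, n_total, follow_up) summary records."""
    ll = 0.0
    for t, event in ipd:
        logS = -(t / scale) ** shape          # log survival S(t)
        if event:  # observed healing: log f(t) = log h(t) + log S(t)
            ll += math.log(shape / scale) + (shape - 1) * math.log(t / scale) + logS
        else:      # censored: contributes log S(t)
            ll += logS
    for healed, total, fu in agg:  # binomial count of healings by follow-up fu
        p = 1.0 - math.exp(-(fu / scale) ** shape)
        ll += healed * math.log(p) + (total - healed) * math.log(1.0 - p)
    return ll

ipd = [(4.0, 1), (9.0, 1), (12.0, 0), (6.0, 1)]   # hypothetical weeks to healing
agg = [(30, 50, 12.0), (18, 40, 8.0)]             # hypothetical summary arms
print(round(weibull_loglik(1.2, 10.0, ipd, agg), 2))
```

Having individual patient-level data is what licenses the biparametric Weibull here; with only aggregate counts, each record pins down a single survival probability, limiting one to a uniparametric distribution, as the abstract notes.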

  14. The fossil record of evolution: Data on diversification and extinction

    Science.gov (United States)

    Sepkoski, J. J., Jr.

    1991-01-01

Understanding of the evolution of complex life, and of the roles that changing terrestrial and extraterrestrial environments played in life's history, is dependent upon synthetic knowledge of the fossil record. Paleontologists have been describing fossils for more than two centuries. However, much of this information is dispersed in monographs and journal articles published throughout the world. Over the past several years, this literature was surveyed, and a data base on times of origination and extinction of fossil genera was compiled. The data base, which now holds approximately 32,000 genera, covers all taxonomic groups of marine animals, incorporates the most recent taxonomic assignments, and uses a detailed global time framework that can resolve originations and extinctions to intervals averaging three million years in duration. These data can be used to compile patterns of global biodiversity, measure rates of taxic evolution, and test hypotheses concerning adaptive radiations, mass extinctions, etc. Thus far, considerable effort was devoted to using the data to test the hypothesis of periodicity of mass extinction. Rates of extinction measured from the data base have also been used to calibrate models of evolutionary radiations in marine environments. It was observed that new groups, or clades of animals (i.e., orders and classes) tend to reach appreciable diversity first in nearshore environments and then to radiate in more offshore environments; during decline, these clades may disappear from the nearshore while persisting in offshore, deep water habitats. These observations have led to suggestions that there is something special about stressful or perturbed environments that promotes the evolution of novel kinds of animals that can rapidly replace their predecessors. The numerical model that is being investigated to study this phenomenon treats environments along onshore-offshore gradients as if they were discrete habitats. Other aspects of this

  15. Head movement compensation and multi-modal event detection in eye-tracking data for unconstrained head movements.

    Science.gov (United States)

    Larsson, Linnéa; Schwaller, Andrea; Nyström, Marcus; Stridh, Martin

    2016-12-01

    The complexity of analyzing eye-tracking signals increases as eye-trackers become more mobile. The signals from a mobile eye-tracker are recorded in relation to the head coordinate system, and when the head and body move, the recorded eye-tracking signal is influenced by these movements, which renders the subsequent event detection difficult. The purpose of the present paper is to develop a method that performs robust event detection in signals recorded using a mobile eye-tracker. The proposed method performs compensation of head movements recorded using an inertial measurement unit and employs a multi-modal event detection algorithm. The event detection algorithm is based on the head-compensated eye-tracking signal combined with information about detected objects extracted from the scene camera of the mobile eye-tracker. The method is evaluated with participants seated 2.6 m in front of a big screen, and is therefore only valid for distant targets. The proposed method for head compensation decreases the standard deviation during intervals of fixations from 8° to 3.3° for eye-tracking signals recorded during large head movements. The multi-modal event detection algorithm outperforms both an existing algorithm (I-VDT) and the built-in algorithm of the mobile eye-tracker, with an average balanced accuracy, calculated over all types of eye movements, of 0.90, compared to 0.85 and 0.75, respectively, for the compared algorithms. The proposed event detector, which combines head movement compensation and information regarding detected objects in the scene video, enables improved classification of events in mobile eye-tracking data. Copyright © 2016 Elsevier B.V. All rights reserved.
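    The balanced accuracy reported above is the mean of per-class recall, so rare eye-movement classes weigh as much as frequent ones. A minimal sketch of the metric (the labels and class names below are invented for illustration, not the study's data):

```python
# Balanced accuracy: mean of per-class recall, so each eye-movement class counts equally.
def balanced_accuracy(truth, predicted):
    classes = sorted(set(truth))
    recalls = []
    for c in classes:
        correct = sum(1 for t, p in zip(truth, predicted) if t == c and p == c)
        total = sum(1 for t in truth if t == c)
        recalls.append(correct / total)
    return sum(recalls) / len(recalls)

# Hypothetical ground-truth and detector output for six samples.
truth = ["fixation", "fixation", "saccade", "pursuit", "saccade", "fixation"]
pred  = ["fixation", "saccade",  "saccade", "pursuit", "saccade", "fixation"]
print(round(balanced_accuracy(truth, pred), 3))  # (2/3 + 1 + 1) / 3 ≈ 0.889
```

    Plain accuracy on the same labels would be 5/6 ≈ 0.833; balanced accuracy differs because the missed fixation hurts only the fixation class's recall.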

  16. Tropical Cyclones within the Sedimentary Record: Analyzing Overwash Deposition from Event to Millennial Timescales

    Science.gov (United States)

    2009-02-01

    Only fragmentary reference-list text survives in this indexed record, citing: doi:10.1029/97JC02791; Sola, E.M. (1995), Historia de los Huracanes en Puerto Rico, Puerto Rico: First Book Publishing, 76 p.; Stockdon, H.F., and Holman, R.A.; and Lachniet et al. (2004), "A 1500-year El Niño/Southern Oscillation and rainfall history for the Isthmus of Panama from speleothem calcite," Journal of Geophysical Research, whose δ18O records from the Isthmus of Panama are cited as further support for the Pallcacocha record as an accurate reconstruction.

  17. Patient safety events in out-of-hospital paediatric airway management: a medical record review by the CSI-EMS.

    Science.gov (United States)

    Hansen, Matthew; Meckler, Garth; Lambert, William; Dickinson, Caitlin; Dickinson, Kathryn; Van Otterloo, Joshua; Guise, Jeanne-Marie

    2016-11-11

    To describe the frequency and characterise the nature of patient safety events in paediatric out-of-hospital airway management. We conducted a retrospective cross-sectional medical record review of all 'lights and sirens' emergency medical services transports from 2008 to 2011 in patients <18 years of age in the Portland, Oregon metropolitan area. A chart review tool (see online supplementary appendix) was adapted from landmark patient safety studies and revised after pilot testing. Expert panels of physicians and paramedics performed blinded reviews of each chart, identified safety events and described their nature. The primary outcomes were the presence and severity of patient safety events related to airway management, including oxygen administration, bag-valve-mask ventilation (BVM), airway adjuncts and endotracheal intubation (ETI). Results: From the 11 328 paediatric transports during the study period, there were 497 'lights and sirens' (code 3) transports (4.4%). 7 transports were excluded due to missing data. Of the 490 transports included in the analysis, 329 had a total of 338 airway management procedures (some had more than 1 procedure): 61.6% were treated with oxygen, 15.3% with BVM, 8.6% with ETI and 2% with airway adjuncts. The frequency of errors was: 21% (71/338) related to oxygen use, 9.8% (33/338) related to BVM, 9.5% (32/338) related to intubation and 0.9% (3/338) related to airway adjunct use. 58% of intubations required 3 or more attempts or failed altogether. Cardiac arrest was associated with higher odds of a severe error. Errors in paediatric out-of-hospital airway management are common, especially in the context of intubations and during cardiac arrest. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  18. What is an Appropriate Temporal Sampling Rate to Record Floating Car Data with a GPS?

    Directory of Open Access Journals (Sweden)

    Peter Ranacher

    2016-01-01

    Full Text Available Floating car data (FCD) recorded with the Global Positioning System (GPS) are an important data source for traffic research. However, FCD are subject to error, which can relate either to the accuracy of the recordings (measurement error) or to the temporal rate at which the data are sampled (interpolation error). Both errors affect movement parameters derived from the FCD, such as speed or direction, and consequently influence conclusions drawn about the movement. In this paper we combined recent findings about the autocorrelation of GPS measurement error and well-established findings from random walk theory to analyse a set of real-world FCD. First, we showed that the measurement error in the FCD was affected by positive autocorrelation. We explained why this is a quality measure of the data. Second, we evaluated four metrics to assess the influence of interpolation error. We found that interpolation error strongly affects the correct interpretation of the car’s dynamics (speed, direction), whereas its impact on the path (travelled distance, spatial location) was moderate. Based on these results we gave recommendations for recording FCD with GPS. Our recommendations concern only time-based sampling; change-based, location-based and event-based sampling are not discussed. The recommended sampling approach minimizes the effects of error on movement parameters while avoiding the collection of redundant information. This is crucial for obtaining reliable results from FCD.
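    The interpolation error discussed above can be illustrated by sampling the same path at two rates: coarser sampling replaces each arc with a chord and so underestimates travelled distance. A toy sketch, assuming a circular path driven at constant speed (the radius, speed, and intervals are arbitrary choices, not values from the paper):

```python
import math

def path_length(points):
    """Sum of straight-line segment lengths between consecutive samples."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def sample_circle(radius, speed, interval, duration):
    """Positions of a car driving a circle of given radius at constant speed."""
    omega = speed / radius  # angular velocity in rad/s
    n = int(duration / interval)
    return [(radius * math.cos(omega * i * interval),
             radius * math.sin(omega * i * interval)) for i in range(n + 1)]

# True path length over 60 s at 10 m/s is 600 m of arc.
fine = sample_circle(50.0, 10.0, 1.0, 60.0)    # 1 s sampling
coarse = sample_circle(50.0, 10.0, 10.0, 60.0)  # 10 s sampling
print(round(path_length(fine), 1), round(path_length(coarse), 1))
```

    The 1 s samples recover nearly the full 600 m arc, while the 10 s samples lose a substantial fraction of it, which is the distance analogue of the speed and direction distortions the paper measures.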

  19. Advanced Big Data Analytics for -Omic Data and Electronic Health Records: Toward Precision Medicine.

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2016-10-10

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, and it has a long-lasting societal impact.

  20. 78 FR 39968 - Flight Data Recorder Airplane Parameter Specification Omissions and Corrections

    Science.gov (United States)

    2013-07-03

    ... TRANSPORTATION Federal Aviation Administration 14 CFR Parts 91, 121 and 125 RIN 2120-AK27 Flight Data Recorder... for flight data recorders by correcting errors in recording rates in three different appendices. These... when the applicable flight data recorder parameter requirements were adopted, but which have been...

  1. Record completeness and data concordance in an anesthesia information management system using context-sensitive mandatory data-entry fields.

    Science.gov (United States)

    Avidan, Alexander; Weissman, Charles

    2012-03-01

    Use of an anesthesia information management system (AIMS) does not ensure record completeness and data accuracy. Mandatory data-entry fields can be used to assure data completeness. However, they are not suited for data that are mandatory depending on the clinical situation (context sensitive). For example, information on equal breath sounds should be mandatory with tracheal intubation, but not with mask ventilation. It was hypothesized that employing context-sensitive mandatory data-entry fields can ensure high data completeness and accuracy while maintaining usability. A commercial off-the-shelf AIMS was enhanced using its built-in VBScript programming tool to build event-driven forms with context-sensitive mandatory data-entry fields. One year after introduction of the system, all anesthesia records were reviewed for data completeness. Data concordance, used as a proxy for accuracy, was evaluated using verifiable age-related data. Additionally, an anonymous satisfaction survey on general acceptance and usability of the AIMS was performed. During the initial 12 months of AIMS use, 12,241 (99.6%) of 12,290 anesthesia records had complete data. Concordances of entered data (weight, size of tracheal tubes, laryngoscopy blades and intravenous catheters) with patients' ages were 98.7-99.9%. The AIMS implementation was deemed successful by 98% of the anesthesiologists. Users rated the AIMS usability in general as very good and the data-entry forms in particular as comfortable. Due to the complexity and the high costs of implementation of an anesthesia information management system it was not possible to compare various system designs (for example, with or without context-sensitive mandatory data-entry fields). Therefore, it is possible that a different or simpler design would have yielded the same or even better results. This refers also to the evaluation of usability, since users did not have the opportunity to work with different design approaches or even different
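    A context-sensitive mandatory field, as described above, is one whose requiredness depends on another entry: breath-sound documentation is required only when the airway technique is intubation. A minimal sketch of the idea (the field names and rules are hypothetical, not the study's actual form definitions):

```python
# Hypothetical context-sensitive mandatory-field check.
# Fields required only when a given airway technique is documented.
CONTEXT_RULES = {
    "tracheal_intubation": ["tube_size", "equal_breath_sounds"],
    "mask_ventilation": [],  # no extra mandatory fields in this context
}

def missing_fields(record):
    """Return the mandatory fields that are absent or empty in the record."""
    required = ["patient_id", "airway_technique"]  # always mandatory
    required += CONTEXT_RULES.get(record.get("airway_technique"), [])
    return [f for f in required if not record.get(f)]

rec = {"patient_id": "A17", "airway_technique": "tracheal_intubation", "tube_size": 7.0}
print(missing_fields(rec))  # ['equal_breath_sounds']
```

    With `airway_technique` set to `mask_ventilation`, the same record would validate cleanly, which is exactly the context sensitivity a plain mandatory field cannot express.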

  2. Evaluating and Extending the Ocean Wind Climate Data Record

    Science.gov (United States)

    Ricciardulli, Lucrezia; Rodriguez, Ernesto; Stiles, Bryan W.; Bourassa, Mark A.; Long, David G.; Hoffman, Ross N.; Stoffelen, Ad; Verhoef, Anton; O'Neill, Larry W.; Farrar, J. Tomas; Vandemark, Douglas; Fore, Alexander G.; Hristova-Veleva, Svetla M.; Turk, F. Joseph; Gaston, Robert; Tyler, Douglas

    2017-01-01

    Satellite microwave sensors, both active scatterometers and passive radiometers, have been systematically measuring near-surface ocean winds for nearly 40 years, establishing an important legacy in studying and monitoring weather and climate variability. As an aid to such activities, the various wind datasets are being intercalibrated and merged into consistent climate data records (CDRs). The ocean wind CDRs (OW-CDRs) are evaluated by comparisons with ocean buoys and intercomparisons among the different satellite sensors and among the different data providers. Extending the OW-CDR into the future requires exploiting all available datasets, such as OSCAT-2 scheduled to launch in July 2016. Three planned methods of calibrating the OSCAT-2 σo measurements include 1) direct Ku-band σo intercalibration to QuikSCAT and RapidScat; 2) multisensor wind speed intercalibration; and 3) calibration to stable rainforest targets. Unfortunately, RapidScat failed in August 2016 and cannot be used to directly calibrate OSCAT-2. A particular future continuity concern is the absence of scheduled new or continuation radiometer missions capable of measuring wind speed. Specialized model assimilations provide 30-year long high temporal/spatial resolution wind vector grids that composite the satellite wind information from OW-CDRs of multiple satellites viewing the Earth at different local times. PMID:28824741

  3. Evaluating and Extending the Ocean Wind Climate Data Record.

    Science.gov (United States)

    Wentz, Frank J; Ricciardulli, Lucrezia; Rodriguez, Ernesto; Stiles, Bryan W; Bourassa, Mark A; Long, David G; Hoffman, Ross N; Stoffelen, Ad; Verhoef, Anton; O'Neill, Larry W; Farrar, J Tomas; Vandemark, Douglas; Fore, Alexander G; Hristova-Veleva, Svetla M; Turk, F Joseph; Gaston, Robert; Tyler, Douglas

    2017-05-01

    Satellite microwave sensors, both active scatterometers and passive radiometers, have been systematically measuring near-surface ocean winds for nearly 40 years, establishing an important legacy in studying and monitoring weather and climate variability. As an aid to such activities, the various wind datasets are being intercalibrated and merged into consistent climate data records (CDRs). The ocean wind CDRs (OW-CDRs) are evaluated by comparisons with ocean buoys and intercomparisons among the different satellite sensors and among the different data providers. Extending the OW-CDR into the future requires exploiting all available datasets, such as OSCAT-2 scheduled to launch in July 2016. Three planned methods of calibrating the OSCAT-2 σo measurements include 1) direct Ku-band σo intercalibration to QuikSCAT and RapidScat; 2) multisensor wind speed intercalibration; and 3) calibration to stable rainforest targets. Unfortunately, RapidScat failed in August 2016 and cannot be used to directly calibrate OSCAT-2. A particular future continuity concern is the absence of scheduled new or continuation radiometer missions capable of measuring wind speed. Specialized model assimilations provide 30-year long high temporal/spatial resolution wind vector grids that composite the satellite wind information from OW-CDRs of multiple satellites viewing the Earth at different local times.

  4. Archetype-based data warehouse environment to enable the reuse of electronic health record data.

    Science.gov (United States)

    Marco-Ruiz, Luis; Moner, David; Maldonado, José A; Kolstrup, Nils; Bellika, Johan G

    2015-09-01

    The reuse of data captured during health care delivery is essential to satisfy the demands of clinical research and clinical decision support systems. A main barrier to the reuse is the existence of legacy formats of data and the high granularity of it when stored in an electronic health record (EHR) system. Thus, we need mechanisms to standardize, aggregate, and query data concealed in the EHRs, to allow their reuse whenever they are needed. To create a data warehouse infrastructure using archetype-based technologies, standards and query languages to enable the interoperability needed for data reuse. The work presented makes use of best-of-breed archetype-based data transformation and storage technologies to create a workflow for the modeling, extraction, transformation and load of EHR proprietary data into standardized data repositories. We converted legacy data and performed patient-centered aggregations via archetype-based transformations. Later, specific-purpose aggregations were performed at a query level for particular use cases. Laboratory test results of a population of 230,000 patients belonging to Troms and Finnmark counties in Norway requested between January 2013 and November 2014 have been standardized. Test records normalization has been performed by defining transformation and aggregation functions between the laboratory records and an archetype. These mappings were used to automatically generate openEHR-compliant data. These data were loaded into an archetype-based data warehouse. Once loaded, we defined indicators linked to the data in the warehouse to monitor test activity of Salmonella and Pertussis using the archetype query language. Archetype-based standards and technologies can be used to create a data warehouse environment that enables data from EHR systems to be reused in clinical research and decision support systems.
With this approach, existing EHR data becomes available in a standardized and interoperable format, thus opening a world
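    The transformation step described above, from a proprietary laboratory row into an archetype-conformant structure, can be sketched roughly as follows (the archetype id, legacy column names, and normalization rules here are illustrative assumptions, not the project's actual mappings):

```python
# Sketch: map one legacy lab row onto an archetype-style structure.
def to_archetype(legacy_row):
    """Normalize a proprietary laboratory record into a standardized shape."""
    return {
        # Illustrative archetype identifier, not necessarily the one used in the study.
        "archetype_id": "openEHR-EHR-OBSERVATION.laboratory_test_result.v1",
        "data": {
            "test_name": legacy_row["code"].strip().upper(),  # normalize the test code
            "result": {"magnitude": float(legacy_row["val"]), "units": legacy_row["unit"]},
            "time": legacy_row["ts"],
        },
    }

# Hypothetical legacy row with untrimmed code and stringly-typed value.
row = {"code": " crp ", "val": "12.5", "unit": "mg/L", "ts": "2014-03-02T10:15:00"}
standardized = to_archetype(row)
print(standardized["data"]["test_name"], standardized["data"]["result"]["magnitude"])
```

    Once rows share this shape, warehouse-level aggregation and querying (the paper uses the archetype query language) can ignore each source system's original layout.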

  5. Depicting adverse events in cardiac theatre: the preliminary conception of the RECORD model

    Science.gov (United States)

    2013-01-01

    Human error is a byproduct of human activity and may result in random unintended events, which can have major consequences in the delivery of medicine. Furthermore, the causes of error in surgical practice are multifaceted and complex. This article aims to raise awareness of safety measures in the cardiac surgical room and briefly “touch upon” the human factors that could lead to adverse outcomes. Finally, we describe a model that would enable us to depict and study adverse events in the operating theatre. PMID:23510398

  6. Higgs and associated vector boson event recorded by CMS (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

    Real proton-proton collision event at 13 TeV in the CMS detector in which two high-energy electrons (green lines), two high-energy muons (red lines), and two high-energy jets (dark yellow cones) are observed. The event shows characteristics expected in the production of a Higgs boson in association with a vector boson with the decay of the Higgs boson in four leptons and the decay of the vector boson in two jets, and is also consistent with background standard model physics processes.

  7. Innovative information visualization of electronic health record data: a systematic review.

    Science.gov (United States)

    West, Vivian L; Borland, David; Hammond, W Ed

    2015-03-01

    This study investigates the use of visualization techniques reported between 1996 and 2013 and evaluates innovative approaches to information visualization of electronic health record (EHR) data for knowledge discovery. An electronic literature search was conducted May-July 2013 using MEDLINE and Web of Knowledge, supplemented by citation searching, gray literature searching, and reference list reviews. General search terms were used to assure a comprehensive document search. Beginning with 891 articles, the number of articles was reduced by eliminating 191 duplicates. A matrix was developed for categorizing all abstracts and to assist with determining those to be excluded for review. Eighteen articles were included in the final analysis. Several visualization techniques have been extensively researched. The most mature system is LifeLines and its applications as LifeLines2, EventFlow, and LifeFlow. Initially, research focused on records from a single patient and visualization of the complex data related to one patient. Since 2010, the techniques under investigation are for use with large numbers of patient records and events. Most are linear and allow interaction through scaling and zooming to resize. Color, density, and filter techniques are commonly used for visualization. With the burgeoning increase in the amount of electronic healthcare data, the potential for knowledge discovery is significant if data are managed in innovative and effective ways. We identify challenges discovered by previous EHR visualization research, which will help researchers who seek to design and improve visualization techniques. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  8. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held on 21–25 October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held in October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20–22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  9. Visual exploration of movement and event data with interactive time masks

    Directory of Open Access Journals (Sweden)

    Natalia Andrienko

    2017-03-01

    Full Text Available We introduce the concept of a time mask, a type of temporal filter suitable for selecting multiple disjoint time intervals in which some query conditions are fulfilled. Such a filter can be applied to time-referenced objects, such as events and trajectories, for selecting those objects or segments of trajectories that fit in one of the selected time intervals. The selected subsets of objects or segments are dynamically summarized in various ways, and the summaries are represented visually on maps and/or other displays to enable exploration. Time mask filtering can be especially helpful in the analysis of disparate data (e.g., event records, positions of moving objects, and time series of measurements), which may come from different sources. To detect relationships between such data, the analyst may set query conditions on the basis of one dataset and investigate the subsets of objects and values in the other datasets that co-occurred in time with these conditions. We describe the desired features of an interactive tool for time mask filtering and present a possible implementation of such a tool. By the example of analysing two real-world data collections related to aviation and maritime traffic, we show the way of using time masks in combination with other types of filters and demonstrate the utility of time mask filtering. Keywords: Data visualization, Interactive visualization, Interaction technique
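    The time-mask idea can be sketched in a few lines: build disjoint intervals where a condition on one dataset holds, then keep only the events from another dataset that fall inside them. The sample values and threshold below are invented, and the interval-merging rule is one plausible reading of the concept, not the paper's implementation:

```python
# Sketch of time-mask filtering: condition intervals from one dataset
# select events from another.
def build_time_mask(samples, condition):
    """Merge consecutive sample times where condition holds into disjoint intervals."""
    intervals, start = [], None
    for t, value in samples:
        if condition(value):
            start = t if start is None else start
            end = t
        elif start is not None:
            intervals.append((start, end))
            start = None
    if start is not None:
        intervals.append((start, end))
    return intervals

def apply_mask(events, mask):
    """Keep events whose timestamp falls inside any mask interval."""
    return [e for e in events if any(a <= e["t"] <= b for a, b in mask)]

wind = [(0, 3), (1, 9), (2, 11), (3, 4), (4, 10), (5, 12)]   # (time, wind speed)
mask = build_time_mask(wind, lambda v: v >= 9)               # [(1, 2), (4, 5)]
events = [{"t": 1.5, "id": "A"}, {"t": 3.2, "id": "B"}, {"t": 4.4, "id": "C"}]
print([e["id"] for e in apply_mask(events, mask)])           # ['A', 'C']
```

    Here the condition comes from one dataset (wind speed) while the filtered objects come from another (events), mirroring the cross-dataset co-occurrence analysis described above.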

  10. A technique for recovering nonsynchronized data from a digital flight data recorder

    Science.gov (United States)

    Roberts, C. A.

    1980-06-01

    This report explains how critical data in out-of-sync areas of a digital flight data recorder tape can be recovered using a serial binary technique called BITDUMP. Digital flight data recorder data recovery depends on the acquisition of a synchronization word once every second. Critical synchronization timing is interrupted if tape motion becomes unsteady or if the tape breaks on impact in an aircraft accident. Several cases where synchronization of critical data was lost because of tape breakage are cited, such as the American Airlines DC-10 accident at Chicago, Illinois, on May 25, 1979 and the Air New Zealand DC-10 accident in Antarctica on November 28, 1979. The American Airlines DC-10 accident is analyzed in detail to demonstrate BITDUMP application.
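    The underlying recovery problem is realigning frames once the once-per-second synchronization cadence is lost: scan the raw bit stream for occurrences of the sync word instead of assuming fixed frame boundaries. A rough sketch of that scan (the sync pattern and frame length below are placeholders, not the actual recorder format or the BITDUMP procedure itself):

```python
# Sketch: realign recorder frames by scanning raw bits for a sync pattern.
SYNC = "101110000100"   # hypothetical subframe sync word
FRAME_BITS = 64         # hypothetical frame length in bits

def find_sync_offsets(bits):
    """Return every bit offset where the sync pattern occurs."""
    return [i for i in range(len(bits) - len(SYNC) + 1) if bits[i:i + len(SYNC)] == SYNC]

def recover_frames(bits):
    """Cut a frame at each sync hit instead of assuming a fixed cadence."""
    return [bits[i:i + FRAME_BITS] for i in find_sync_offsets(bits) if i + FRAME_BITS <= len(bits)]

# Simulated damaged stream: 20 junk bits, then two frames, each starting with SYNC.
stream = "01" * 10 + SYNC + "1" * 52 + SYNC + "0" * 52
print(find_sync_offsets(stream))   # [20, 84]
print(len(recover_frames(stream))) # 2
```

    Even though the junk prefix has destroyed the expected one-frame-per-second alignment, both frames are recovered by searching for the pattern directly.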

  11. Patient record review of the incidence, consequences, and causes of diagnostic adverse events

    NARCIS (Netherlands)

    Zwaan, L.; de Bruijne, M.; Wagner, C.; Thijs, A.; Smits, M.; van der Wal, G.; Timmermans, D.R.M.

    2010-01-01

    Background: Diagnostic errors often result in patient harm. Previous studies have shown that there is large variability in results in different medical specialties. The present study explored diagnostic adverse events (DAEs) across all medical specialties to determine their incidence and to gain

  12. Patient record review of the incidence, consequences, and causes of diagnostic adverse events.

    NARCIS (Netherlands)

    Zwaan, L.; Bruijne, M. de; Wagner, C.; Thijs, A.; Smits, M.; Wal, G. van der; Timmermans, D.R.M.

    2010-01-01

    Background: Diagnostic errors often result in patient harm. Previous studies have shown that there is large variability in results in different medical specialties. The present study explored diagnostic adverse events (DAEs) across all medical specialties to determine their incidence and to gain

  13. Evidence for the Blake Event recorded at the Eemian archaeological site of Caours, France

    NARCIS (Netherlands)

    Sier, M.J.; Parés, J.M.; Antoine, P.; Locht, J.-L.; Dekkers, M.J.; Limondin-Lozouet, N.; Roebroeks, W.

    2015-01-01

    A palaeomagnetic study of the Last Interglacial calcareous tufa sequence at the archaeological site of Caours (northern France) identified a geomagnetic excursion that we interpret as the Blake Event. Earlier palaeontological (molluscs, mammals) and geochemical proxy studies at this site allowed

  14. Discrete Event Modeling and Simulation-Driven Engineering for the ATLAS Data Acquisition Network

    CERN Document Server

    Bonaventura, Matias Alejandro; The ATLAS collaboration; Castro, Rodrigo Daniel

    2016-01-01

    We present an iterative and incremental development methodology for simulation models in network engineering projects. Driven by the DEVS (Discrete Event Systems Specification) formal framework for modeling and simulation, we assist network design, test, analysis and optimization processes. A practical application of the methodology is presented for a case study in the ATLAS particle physics detector, the largest scientific experiment built by man, where scientists around the globe search for answers about the origins of the universe. The ATLAS data network conveys real-time information produced by physics detectors as beams of particles collide. The produced sub-atomic evidence must be filtered and recorded for further offline scrutiny. Due to the criticality of the transported data, networks and applications undergo careful engineering processes with stringent quality-of-service requirements. A tight project schedule imposes time pressure on design decisions, while rapid technology evolution widens the palett...
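    A discrete-event simulation of the kind described advances time by popping the next scheduled event from a priority queue rather than ticking a clock. A minimal event-queue loop, as a toy illustration of the idea only, not the DEVS formalism or the ATLAS tooling (event names and delays are invented):

```python
# Minimal discrete-event simulation loop: time jumps to the next scheduled event.
import heapq

def simulate(initial_events, handlers, until):
    """Pop time-ordered (time, name) events; handlers may schedule follow-up events."""
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue and queue[0][0] <= until:
        t, name = heapq.heappop(queue)
        log.append((t, name))
        for delay, follow_up in handlers.get(name, []):
            heapq.heappush(queue, (t + delay, follow_up))
    return log

# Hypothetical rule: each packet arrival triggers a departure 2 time units later.
handlers = {"packet_in": [(2, "packet_out")]}
trace = simulate([(0, "packet_in"), (1, "packet_in")], handlers, until=5)
print(trace)
```

    The trace interleaves arrivals and the departures they spawn in timestamp order, which is the core mechanic a DEVS-based network model builds on.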

  15. Adverse Event Recording and Reporting in Clinical Trials Comparing Lumbar Disk Replacement with Lumbar Fusion: A Systematic Review.

    Science.gov (United States)

    Hiratzka, Jayme; Rastegar, Farbod; Contag, Alec G; Norvell, Daniel C; Anderson, Paul A; Hart, Robert A

    2015-12-01

    Study Design: Systematic review. Objectives: (1) To compare the quality of adverse event (AE) methodology and reporting among randomized trials comparing lumbar fusion with lumbar total disk replacement (TDR) using established AE reporting systems; (2) to compare the AEs and reoperations of lumbar spinal fusion with those from lumbar TDR; (3) to make recommendations on how to report AEs in randomized controlled trials (RCTs) so that surgeons and patients have more-detailed and comprehensive information when making treatment decisions. Methods: A systematic search of PubMed, the Cochrane collaboration database, and the National Guideline Clearinghouse through May 2015 was conducted. Randomized controlled trials with at least 2 years of follow-up comparing lumbar artificial disk replacement with lumbar fusion were included. Patients were required to have axial or mechanical low back pain of ≥3 months' duration due to degenerative joint disease defined as degenerative disk disease, facet joint disease, or spondylosis. Outcomes included the quality of AE acquisition methodology and results reporting, and AEs were defined as those secondary to the procedure and reoperations. Individual and pooled relative risks and their 95% confidence intervals comparing lumbar TDR with fusion were calculated. Results: RCTs demonstrated a generally poor description of methods for assessing AEs. There was a consistent lack of clear definition or grading for these events. Furthermore, there was a high degree of variation in reporting of surgery-related AEs. Most studies lacked adequate reporting of the timing of AEs, and there were no clear distinctions between acute or chronic AEs. Meta-analysis of the pooled data demonstrated a twofold increased risk of AEs in patients having lumbar fusion compared with patients having lumbar TDR at 2-year follow-up, and this relative risk was maintained at 5 years. Furthermore, the pooled data demonstrated a 1.7 times greater relative risk of

  16. Predicting 30-Day Pneumonia Readmissions Using Electronic Health Record Data.

    Science.gov (United States)

    Makam, Anil N; Nguyen, Oanh Kieu; Clark, Christopher; Zhang, Song; Xie, Bin; Weinreich, Mark; Mortensen, Eric M; Halm, Ethan A

    2017-04-01

    Readmissions after hospitalization for pneumonia are common, but the few risk-prediction models have poor to modest predictive ability. Data routinely collected in the electronic health record (EHR) may improve prediction. To develop pneumonia-specific readmission risk-prediction models using EHR data from the first day and from the entire hospital stay ("full stay"). Observational cohort study using stepwise-backward selection and cross-validation. Consecutive pneumonia hospitalizations from 6 diverse hospitals in north Texas from 2009-2010. All-cause nonelective 30-day readmissions, ascertained from 75 regional hospitals. Of 1463 patients, 13.6% were readmitted. The first-day pneumonia-specific model included sociodemographic factors, prior hospitalizations, thrombocytosis, and a modified pneumonia severity index; the full-stay model included disposition status, vital sign instabilities on discharge, and an updated pneumonia severity index calculated using values from the day of discharge as additional predictors. The full-stay pneumonia-specific model outperformed the first-day model (C statistic 0.731 vs 0.695; P = 0.02; net reclassification index = 0.08). Compared to a validated multi-condition readmission model, the Centers for Medicare and Medicaid Services pneumonia model, and 2 commonly used pneumonia severity of illness scores, the full-stay pneumonia-specific model had better discrimination (C statistic range 0.604-0.681) for patients hospitalized with pneumonia. This approach outperforms a first-day pneumonia-specific model, the Centers for Medicare and Medicaid Services pneumonia model, and 2 commonly used pneumonia severity of illness scores. Journal of Hospital Medicine 2017;12:209-216.
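    The C statistic used above to compare models is the probability that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted one (0.5 is chance, 1.0 is perfect discrimination). A small rank-based sketch on invented risks, not the study's data:

```python
# Illustrative C statistic (concordance index) on toy predicted risks.
def c_statistic(risks, outcomes):
    """Fraction of (readmitted, non-readmitted) pairs ranked correctly; ties count 0.5."""
    pos = [r for r, y in zip(risks, outcomes) if y == 1]
    neg = [r for r, y in zip(risks, outcomes) if y == 0]
    score = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return score / (len(pos) * len(neg))

risks = [0.9, 0.4, 0.5, 0.2, 0.6]  # hypothetical predicted readmission risks
readmitted = [1, 0, 1, 0, 0]       # 1 = readmitted within 30 days
print(round(c_statistic(risks, readmitted), 3))  # 5 of 6 pairs concordant
```

    A difference like 0.731 versus 0.695 in the abstract means the full-stay model orders such patient pairs correctly a few percentage points more often.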

  17. The Catalog of Event Data of the Operational Deep-ocean Assessment and Reporting of Tsunamis (DART) Stations at the National Data Buoy Center

    Science.gov (United States)

    Bouchard, R.; Locke, L.; Hansen, W.; Collins, S.; McArthur, S.

    2007-12-01

    DART systems are a critical component of the tsunami warning system, as they provide the only real-time, in situ tsunami detection before landfall. DART systems consist of a surface buoy that serves as a position locater and communications transceiver and a Bottom Pressure Recorder (BPR) on the seafloor. The BPR records temperature and pressure at 15-second intervals to a memory card for later retrieval for analysis and use by tsunami researchers, but the BPRs are normally recovered only once every two years. The DART systems also transmit subsets of the data, converted to an estimation of the sea surface height, in near real-time for use by the tsunami warning community. These data are available on NDBC's webpages, http://www.ndbc.noaa.gov/dart.shtml. Although not of the resolution of the data recorded to the BPR memory card, the near real-time data have proven to be of value in research applications [1]. Of particular interest are the DART data associated with geophysical events. The DART BPR continuously compares the measured sea height with a predicted sea height, and when the difference exceeds a threshold value, the BPR goes into Event Mode. Event Mode provides extended, more frequent near real-time reporting of the sea surface heights for tsunami detection. The BPR can go into Event Mode because of geophysical triggers, such as tsunamis or seismic activity, which may or may not be tsunamigenic. The BPR can also go into Event Mode during recovery of the BPR as it leaves the seafloor, or when manually triggered by the Tsunami Warning Centers in advance of an expected tsunami. On occasion, the BPR will go into Event Mode without any associated tsunami or seismic activity or human intervention; these are considered "False" Events. Approximately one-third of all Events can be classified as "False". NDBC is responsible for the operations, maintenance, and data management of the DART stations. Each DART station has a webpage with a drop-down list of all
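    The Event Mode trigger described above compares each measurement against a prediction and fires when the difference exceeds a threshold. A simplified sketch of that logic, in which the mean-based predictor and the threshold value are stand-ins for the BPR's actual tidal-prediction algorithm and detection parameters:

```python
# Simplified Event Mode trigger: measured vs predicted sea height.
THRESHOLD_M = 0.03  # hypothetical detection threshold in metres

def predicted_height(history):
    """Toy predictor: mean of recent 15-second samples (the real BPR fits the tide)."""
    return sum(history) / len(history)

def in_event_mode(history, measurement):
    """Trigger when the measurement departs from the prediction by more than the threshold."""
    return abs(measurement - predicted_height(history)) > THRESHOLD_M

recent = [4000.00, 4000.01, 3999.99, 4000.00]  # invented depths in metres of water
print(in_event_mode(recent, 4000.01))  # small departure: stays in normal mode
print(in_event_mode(recent, 4000.09))  # large departure: enters Event Mode
```

    The same structure explains the "False" Events noted above: any disturbance that moves the pressure signal away from the prediction, tsunami or not, crosses the threshold.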

  18. Biases introduced by filtering electronic health records for patients with "complete data".

    Science.gov (United States)

    Weber, Griffin M; Adams, William G; Bernstam, Elmer V; Bickel, Jonathan P; Fox, Kathe P; Marsolo, Keith; Raghavan, Vijay A; Turchin, Alexander; Zhou, Xiaobo; Murphy, Shawn N; Mandl, Kenneth D

    2017-11-01

    One promise of nationwide adoption of electronic health records (EHRs) is the availability of data for large-scale clinical research studies. However, because the same patient could be treated at multiple health care institutions, data from only a single site might not contain the complete medical history for that patient, meaning that critical events could be missing. In this study, we evaluate how simple heuristic checks for data "completeness" affect the number of patients in the resulting cohort and introduce potential biases. We began with a set of 16 filters that check for the presence of demographics, laboratory tests, and other types of data, and then systematically applied all 2^16 (65,536) possible combinations of these filters to the EHR data for 12 million patients at 7 health care systems and a separate payor claims database of 7 million members. EHR data showed considerable variability in data completeness across sites and high correlation between data types. For example, the fraction of patients with diagnoses increased from 35.0% in all patients to 90.9% in those with at least 1 medication. An unrelated claims dataset independently showed that most filters select members who are older and more likely female, and can eliminate large portions of the population whose data are actually complete. As investigators design studies, they need to balance their confidence in the completeness of the data against the effects that completeness requirements have on the resulting patient cohort.
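    The exhaustive filter-combination design described above can be sketched by encoding each combination as a bitmask over the filter set. The three toy filters and patient records below are illustrative assumptions; the study's actual 16 filters operate on real EHR data types.

```python
# Illustrative sketch: enumerate every combination of completeness
# filters as a bitmask, in the spirit of the study's 2^16 combinations.
# Filters and patient records here are toy assumptions.

filters = [
    lambda p: "demographics" in p,
    lambda p: "labs" in p,
    lambda p: "medications" in p,
]

patients = [
    {"demographics", "labs", "medications"},
    {"demographics"},
    {"labs", "medications"},
]

def cohort_size(mask):
    """Count patients passing every filter selected by the bitmask."""
    active = [f for i, f in enumerate(filters) if mask & (1 << i)]
    return sum(all(f(p) for f in active) for p in patients)

# Enumerate all 2**3 = 8 combinations for the 3 toy filters.
sizes = {mask: cohort_size(mask) for mask in range(2 ** len(filters))}
print(sizes[0], sizes[7])  # → 3 1
```

    With 16 filters the same loop runs over 65,536 masks, which is exactly the exhaustive sweep the abstract describes.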

  19. International Data Centre: Reviewed Event Bulletin vs. Waveform Cross Correlation Bulletin

    CERN Document Server

    Bobrov, Dmitry; Given, Jeffrey; Khukhuudei, Urtnasan; Kitov, Ivan; Sitnikov, Kirill; Spiliopoulos, Spilio; Zerbo, Lassina

    2012-01-01

    Our objective is to assess the performance of the waveform cross-correlation technique, as applied to automatic and interactive processing of the aftershock sequence of the 2012 Sumatera earthquake, relative to the Reviewed Event Bulletin (REB) issued by the International Data Centre. The REB includes 1200 aftershocks between April 11 and May 25 with body wave magnitudes from 3.05 to 6.19. To automatically recover the sequence, we selected sixteen aftershocks with mb between 4.5 and 5.0. These events evenly but sparsely cover the area of the most intensive aftershock activity as recorded during the first two days after the main shock. In our study, waveform templates from only the seven IMS array stations with the largest SNRs estimated for the signals from the main shock were used to calculate cross-correlation coefficients over the entire period of 44 days. Approximately 1,000,000 detections obtained using cross-correlation were then used to build events according to the IDC definition. After conflict resolution betwe...
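    The core of the template-matching approach above is a normalized cross-correlation between a master-event waveform and sliding windows of continuous data. The sketch below illustrates that step only; the traces and the 0.7 detection threshold are toy assumptions, not IDC processing parameters.

```python
# Minimal sketch of normalized waveform cross-correlation for
# matched-filter event detection. Template, trace, and threshold
# are illustrative assumptions.
import math

def normalized_xcorr(template, window):
    """Zero-lag normalized cross-correlation between two equal-length traces."""
    n = len(template)
    mt = sum(template) / n
    mw = sum(window) / n
    num = sum((t - mt) * (w - mw) for t, w in zip(template, window))
    den = math.sqrt(sum((t - mt) ** 2 for t in template) *
                    sum((w - mw) ** 2 for w in window))
    return num / den if den else 0.0

def scan(trace, template, threshold=0.7):
    """Slide the template along the trace; report sample offsets above threshold."""
    n = len(template)
    return [i for i in range(len(trace) - n + 1)
            if normalized_xcorr(template, trace[i:i + n]) >= threshold]

template = [0.0, 1.0, -1.0, 0.5]
trace = [0.0] * 5 + template + [0.0] * 5
print(scan(trace, template))  # → [5]
```

    Production systems vectorize this with FFT-based correlation across many stations and stack the per-station coefficients before declaring a detection.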

  20. Decoupling of carbon isotope records between organic matter and carbonate prior to the Toarcian Oceanic Anoxic Event (Early Jurassic)

    Science.gov (United States)

    Bodin, Stephane; Kothe, Tim; Krencker, Francois-Nicolas; Suan, Guillaume; Heimhofer, Ulrich; Immenhauser, Adrian

    2014-05-01

    Across the Pliensbachian-Toarcian boundary (P-To, Early Jurassic), ca. 1 Myr before the Toarcian Oceanic Anoxic Event (T-OAE), an initial negative carbon isotope excursion has been documented in western Tethys sedimentary rocks. In carbonate, its amplitude (2-3 permil) is similar to the subsequent excursion recorded at the onset of the T-OAE. Although it is also associated with a rapid warming event, the significance of this first carbon isotope shift, in terms of paleoenvironmental interpretation and triggering mechanism, nevertheless remains elusive. Taking advantage of expanded and rather continuous sections in the High Atlas of Morocco, several high-resolution, paired organic-inorganic carbon isotope records have been obtained across the Upper Pliensbachian - Lower Toarcian interval. At the onset of the T-OAE, an abrupt 1-2 permil negative shift is recorded in both organic and inorganic phases, succeeded by a relatively longer-term 1-2 permil negative trend and a final slow return to pre-excursion conditions. In accordance with previous interpretations, this pattern indicates a perturbation of the entire exogenic carbon isotope reservoir at the onset of the T-OAE by the sudden release of isotopically light carbon into the atmosphere. By contrast, there is no negative shift in carbon isotopes for the P-To event recorded in bulk organic matter of Morocco. Given the strong dominance of terrestrial particles in the bulk organic matter fraction, this absence indicates that massive input of 12C-rich carbon into the atmosphere is not likely to have happened during the P-To event. A pronounced (2 permil) and abrupt negative shift in carbon isotopes is however recorded in the bulk carbonate phase. We suggest that this decoupling between the organic and inorganic phases is due to changes in the nature of the bulk carbonate phase. Indeed, the negative shift occurs at the lithological transition between Pliensbachian-lowermost Toarcian limestone-marl alternations and the Lower Toarcian marl

  1. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    OpenAIRE

    Paniz Tavakoli; Ken Campbell

    2015-01-01

    Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from s...

  2. Adverse events of gastric electrical stimulators recorded in the Manufacturer and User Device Experience (MAUDE) Registry.

    Science.gov (United States)

    Bielefeldt, Klaus

    2017-01-01

    The role of gastric electrical stimulation for patients with refractory symptoms of gastroparesis remains controversial. Open-label studies suggest benefit, while randomized controlled trials did not demonstrate differences between active and sham intervention. Using a voluntary reporting system of the Food and Drug Administration, we examined the type and frequency of adverse events. We conducted an electronic search of the Manufacturer and User Device Experience (MAUDE) databank using the keyword 'Enterra' for the time between January of 2001 and October of 2015. We abstracted information about the year of stimulator implantation, the year and type of adverse effect, and the resulting intervention and outcome if available. A total of 1587 entries described adverse effects related to the GES. Only 36 of the reports listed perioperative complications. The vast majority described problems that could be classified as patient concerns, local complications, or system failure. The most common problem was lack or loss of efficacy, followed by pain or complications affecting the pocket site. A subset of 801 reports provided information about the time between system implantation and registration of concerns, which gradually declined over time. More than one third (35.7%) of the reported adverse events prompted surgical correction. The number of voluntarily reported adverse events and the high likelihood of repeated surgical interventions clearly demonstrate the potential downside of gastric electrical stimulation. Physicians considering this intervention will need to carefully weigh these risks and include this information when counseling or consenting patients.

  3. SAGE III Meteor-3M L1B Solar Event Transmission Data (Native) V004

    Data.gov (United States)

    National Aeronautics and Space Administration — SAGE III Meteor-3M L1B Solar Event Transmission Data are Level 1B pixel group transmission profiles for a single solar event. The Stratospheric Aerosol and Gas...

  4. High density event-related potential data acquisition in cognitive neuroscience.

    Science.gov (United States)

    Slotnick, Scott D

    2010-04-16

    Functional magnetic resonance imaging (fMRI) is currently the standard method of evaluating brain function in the field of Cognitive Neuroscience, in part because fMRI data acquisition and analysis techniques are readily available. Because fMRI has excellent spatial resolution but poor temporal resolution, this method can only be used to identify the spatial location of brain activity associated with a given cognitive process (and reveals virtually nothing about the time course of brain activity). By contrast, event-related potential (ERP) recording, a method that is used much less frequently than fMRI, has excellent temporal resolution and thus can track rapid temporal modulations in neural activity. Unfortunately, ERPs are underutilized in Cognitive Neuroscience because data acquisition techniques are not readily available and low-density ERP recording has poor spatial resolution. In an effort to foster the increased use of ERPs in Cognitive Neuroscience, the present article details key techniques involved in high-density ERP data acquisition. Critically, high-density ERPs offer the promise of excellent temporal resolution and good spatial resolution (or excellent spatial resolution if coupled with fMRI), which is necessary to capture the spatial-temporal dynamics of human brain function.

  5. Prediction of truly random future events using analysis of prestimulus electroencephalographic data

    Science.gov (United States)

    Baumgart, Stephen L.; Franklin, Michael S.; Jimbo, Hiroumi K.; Su, Sharon J.; Schooler, Jonathan

    2017-05-01

    Our hypothesis is that pre-stimulus physiological data can be used to predict truly random events tied to perceptual stimuli (e.g., lights and sounds). Our experiment presents light and sound stimuli to a passive human subject while recording electrocortical potentials using a 32-channel electroencephalography (EEG) system. For every trial, a quantum random number generator (qRNG) chooses from three possible selections with equal probability: a light stimulus, a sound stimulus, or no stimulus. Time epochs are defined preceding and following each stimulus, for which mean average potentials were computed across all trials for the three possible stimulus types. Data from three regions of the brain are examined. In all three regions, mean potential for light stimuli was generally enhanced relative to baseline during the period starting approximately 2 seconds before the stimulus. For sound stimuli, mean potential decreased relative to baseline during the period starting approximately 2 seconds before the stimulus. These changes from baseline may indicate the presence of evoked potentials arising from the stimulus. A P200 peak was observed in data recorded from frontal electrodes. The P200 is a well-known potential arising from the brain's processing of visual stimuli, and its presence represents a replication of a known neurological phenomenon.
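    The "mean average potentials computed across all trials" step above is the standard ERP averaging operation: align same-length epochs on the stimulus and average sample-by-sample. The toy voltages and epoch length below are illustrative assumptions.

```python
# Minimal sketch of computing a mean event-related potential by
# averaging stimulus-locked epochs across trials. Toy data.

def mean_epoch(trials):
    """Average same-length voltage epochs sample-by-sample across trials."""
    n = len(trials)
    return [sum(samples) / n for samples in zip(*trials)]

# Three toy pre-stimulus epochs (microvolts) from one channel.
trials = [
    [1.0, 2.0, 3.0],
    [3.0, 2.0, 1.0],
    [2.0, 2.0, 2.0],
]
print(mean_epoch(trials))  # → [2.0, 2.0, 2.0]
```

    Averaging suppresses activity uncorrelated with the stimulus, which is why evoked components emerge only after many trials are combined.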

  6. A long record of extreme wave events in coastal Lake Hamana, Japan

    Science.gov (United States)

    Boes, Evelien; Yokoyama, Yusuke; Schmidt, Sabine; Riedesel, Svenja; Fujiwara, Osamu; Nakamura, Atsunori; Garrett, Ed; Heyvaert, Vanessa; Brückner, Helmut; De Batist, Marc

    2017-04-01

    Coastal Lake Hamana is located near the convergent tectonic boundary of the Nankai-Suruga Trough, along which the Philippine Sea slab is subducted underneath the Eurasian Plate, giving rise to repeated tsunamigenic megathrust earthquakes (Mw ≥ 8). A good understanding of the earthquake- and tsunami-triggering mechanisms is crucial in order to better estimate the complexity of seismic risks. Thanks to its accommodation space, Lake Hamana may represent a good archive for past events, such as tsunamis and tropical storms (typhoons), also referred to as "extreme wave" events. Characteristic event layers, consisting of sediment entrained by these extreme waves and their backwash, are witnesses of past marine incursions. By applying a broad range of surveying methods (reflection-seismic profiling, gravity coring, piston coring), sedimentological analyses (CT-scanning, XRF-scanning, multi-sensor core logging, grain size, microfossils etc.) and dating techniques (210Pb/137Cs, 14C, OSL, tephrochronology), we attempt to trace extreme wave event deposits in a multiproxy approach. Seismic imagery shows a vertical stacking of stronger reflectors, interpreted to be coarser-grained sheets deposited by highly energetic waves. Systematic sampling of lake bottom sediments along a transect from ocean-proximal to ocean-distal sites enables us to evaluate vertical and lateral changes in stratigraphy. Ocean-proximal, we observe a sequence of eight sandy units separated by silty background sediments, up to a depth of 8 m into the lake bottom. These sand layers quickly thin out and become finer-grained land-inward. Seismic-to-core correlations show a good fit between the occurrence of strong reflectors and sandy deposits, hence confirming presumptions based on acoustic imagery alone. Sand-rich intervals typically display a higher magnetic susceptibility, density and stronger X-ray attenuation. However, based on textural and structural differences, we can make the distinction between

  7. Nanotechnologies for stimulating and recording excitable events in neurons and cardiomyocytes.

    Science.gov (United States)

    Silva, Gabriel A; Khraiche, Massoud L

    2013-06-01

    Nanotechnologies are engineered materials and devices that have a functional organization in at least one dimension on the nanometer scale, ranging from a few to about 100 nanometers. Functionally, nanotechnologies can display physical, chemical, and engineering properties that go beyond the component building block molecules or structures that make them up. Given such properties and the physical scale involved, these technologies are capable of interacting and interfacing with target cells and tissues in unique ways. One particular emerging application of widespread interest is the development of nanotechnologies for stimulating and recording excitable cells such as neurons and cardiomyocytes. Such approaches offer the possibility of achieving high-density stimulation and recording at sub-cellular resolutions in large populations of cells. This would provide a scale of electrophysiological interactions with excitable cells beyond anything achievable by current existing methods. In this review we introduce the reader to the key concepts and methods associated with nanotechnology and nanoengineering, and discuss the work of some of the key groups developing nanoscale stimulation and recording technologies.

  8. Combining Satellite and in Situ Data with Models to Support Climate Data Records in Ocean Biology

    Science.gov (United States)

    Gregg, Watson

    2011-01-01

    The satellite ocean color data record spans multiple decades and, like most long-term satellite observations of the Earth, comes from many sensors. Unfortunately, global and regional chlorophyll estimates from the overlapping missions show substantial biases, limiting their use in combination to construct consistent data records. SeaWiFS and MODIS-Aqua differed by 13% globally in overlapping time segments, 2003-2007. For perspective, the maximum change in annual means over the entire SeaWiFS mission era was about 3%, and this included an El Niño-La Niña transition. These discrepancies lead to different estimates of trends depending upon whether one uses SeaWiFS alone for 1998-2007 (no significant change) or substitutes MODIS for the 2003-2007 period (18% decline, P less than 0.05). Understanding the effects of climate change on the global oceans is difficult if different satellite data sets cannot be brought into conformity. The differences arise from two causes: 1) different sensors see chlorophyll differently, and 2) different sensors see different chlorophyll. In the first case, differences in sensor band locations, bandwidths, sensitivity, and time of observation lead to different estimates of chlorophyll even from the same location and day. In the second, differences in orbit and sensitivities to aerosols lead to sampling differences. A new approach to ocean color using in situ data from the public archives forces different satellite data to agree to within interannual variability. The global difference between SeaWiFS and MODIS is 0.6% for 2003-2007 using this approach. It also produces a trend using the combination of SeaWiFS and MODIS that agrees with SeaWiFS alone for 1998-2007. This is a major step toward reducing errors produced by the first cause, sensor-related discrepancies. For differences that arise from sampling, data assimilation is applied. The underlying geographically complete fields derived from a free-running model are unaffected

  9. Cu isotopes in marine black shales record the Great Oxidation Event

    OpenAIRE

    Chi Fru, E; Konhauser, K; Lalonde, S; Andersson, P.; Weiss, DJ; Perez, N; Camille, P; A. El Albani

    2016-01-01

    Redox-sensitive transition metals and their isotopes provide some of the best lines of evidence for reconstructing early Earth’s oxygenation history, including permanent atmospheric oxygenation following the Great Oxidation Event (GOE), ∼2.45−2.32 Ga. We show a shift from dominantly negative to permanently positive copper isotope compositions in black shales spanning ∼2.66−2.08 Ga. We interpret the transition in marine δ65Cu values as reflecting some combination of waning banded iron formatio...

  10. Data Records derived from GEOSAT Geodetic Mission (GM) and Exact Repeat Mission (ERM) data from 30 March 1985 to 31 December 1989

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This collection contains Sensor Data Records (SDRs), Geodetic Data Records (GDRs), Waveform Data Records (WDRs), and Crossover Difference data Records (XDRs) from...

  11. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  12. Oregon: basic data for thermal springs and wells as recorded in GEOTHERM

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, J.D.

    1983-05-01

    This sample file contains 346 records for Oregon. The records contain data on location, sample description, analysis type, collection condition, flow rates, and chemical and physical properties of the fluid. Stable and radioactive isotope data are occasionally available. (ACR)

  13. VIIRS Climate Raw Data Record (C-RDR) from Suomi NPP, Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Suomi NPP Climate Raw Data Record (C-RDR) developed at the NOAA NCDC is an intermediate product processing level (NOAA Level 1b) between a Raw Data Record (RDR)...

  14. Iridium abundance measurements across bio-event horizons in the fossil record

    Energy Technology Data Exchange (ETDEWEB)

    Orth, C.J.; Attrep, M. Jr.; Quintana, L.R. (Los Alamos National Lab., NM (USA))

    1989-01-01

    Geochemical measurements have been performed on thousands of rock samples collected across bio-event horizons using Instrumental Neutron Activation Analysis (INAA) for about 40 common and trace elements and radiochemical isolation procedures for Ir. On selected samples, Os, Pt and Au were also radiochemically determined. These studies have encompassed the time interval from the Precambrian-Cambrian transition to the Late Eocene impact (microspherule) horizons. Our early work strengthened the Alvarez impact hypothesis by finding the Ir (PGE) anomaly at the K-T boundary in continental sedimentary sequences. In collaboration with paleontologists, weak to moderately strong Ir anomalies have been discovered at the Frasnian-Famennian boundary in Australia, in the Early Mississippian of Oklahoma, at the Mississippian-Pennsylvanian boundary of Oklahoma and Texas, and in the Late Cenomanian throughout the western interior of North America and on the south coast of England to date. We have found no compelling evidence for an impact-related cause for these anomalies, although PGE impact signatures in the two Late Cenomanian anomalies could be masked by the strong terrestrial mafic to ultramafic overprint. Thus far, our evidence for extinction events older than the terminal Cretaceous does not support recent hypotheses which suggest that impacts from cyclic swarms of comets in the inner Solar System were responsible for the periodic mass extinctions. 50 refs., 7 figs., 3 tabs.

  15. Validation of multisource electronic health record data: an application to blood transfusion data.

    Science.gov (United States)

    Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B

    2017-07-14

    Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. We therefore developed a practical approach for validating routinely collected data from multiple sources and applied it to a blood transfusion data warehouse to evaluate its usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks, and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) from internal consistency concepts, which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to the transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and, in some cases, adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products was initially 2.2%, but could be reduced to 0.17% by adjusting the data extraction criteria. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, it will contribute to increased transparency and greater reliability of research based on routinely collected electronic health records.
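    One internal-consistency check mentioned above (the fraction of transfused products that cannot be linked back to an issued product) can be sketched as a set-membership computation. The IDs and records below are toy assumptions, not the study's data model.

```python
# Sketch of a linkage-completeness check: what fraction of transfused
# product IDs have no matching issued-product record? Toy data.

def unlinked_fraction(transfused_ids, issued_ids):
    """Fraction of transfused product IDs absent from the issued-product set."""
    issued = set(issued_ids)
    unlinked = [t for t in transfused_ids if t not in issued]
    return len(unlinked) / len(transfused_ids)

issued = ["P001", "P002", "P003", "P004"]
transfused = ["P001", "P002", "P005", "P003"]
print(unlinked_fraction(transfused, issued))  # → 0.25
```

    Tracking this fraction before and after changing the extraction criteria is exactly how an improvement like 2.2% to 0.17% would be quantified.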

  16. Extreme event archived in the geological record of the Japan Trench: New results from R/V Sonne Cruise SO-251 towards establishing J-TRACK paleoseismology

    Science.gov (United States)

    Strasser, Michael; Kopf, Achim; Kanamatsu, Toshyia; Moernaut, Jasper; Ikehara, Ken; McHugh, Cecila

    2017-04-01

    Our perspective of subduction zones' earthquake magnitude and recurrence is limited by short historical records. Examining prehistoric extreme events preserved in the geological record is essential for understanding large earthquakes and assessing the geohazard potential associated with such rare events. The research field of "subaquatic paleoseismology" is a promising approach to investigate deposits from the deep sea, where earthquakes leave traces preserved in the stratigraphic succession. However, at present we lack comprehensive data sets that allow conclusive distinctions between the quality and completeness of the paleoseismic archives as they may relate to different sediment transport, erosion and deposition processes versus variability of intrinsic seismogenic behavior across different segments. Building on initial observations of the sedimentary deposits generated by the 2011 magnitude 9 Tohoku-oki earthquake, the Japan Trench is a promising study area to investigate earthquake-triggered sediment remobilization processes and how they become embedded in the stratigraphic record. Here we present new results from the recent R/V Sonne expedition SO251, which acquired a complete high-resolution bathymetric map of the trench axis and nearly 2000 km of subbottom Parasound profiles, covering the entire along-strike extent of the Japan Trench from 36° to 40.3° N, groundtruthed by several nearly 10-m-long piston cores retrieved from very deep waters (7 to 8 km below sea level). Several smaller submarine landslides (up to several hundreds of meters in lateral extent) can be identified along the trench axis in the new bathymetric data set. These features were either not yet present, or not resolved, in the lower-resolution bathymetric dataset acquired before 2011. Sub-bottom acoustic reflection data reveal striking, up to several meter thick, acoustically transparent bodies interbedded in the otherwise parallel reflection pattern of the trench fill basins, providing a temporal and

  17. Analysis of recently digitized continuous seismic data recorded during the March-May, 1980, eruption sequence at Mount St. Helens

    Science.gov (United States)

    Moran, S. C.; Malone, S. D.

    2013-12-01

    The May 18, 1980, eruption of Mount St. Helens (MSH) was an historic event, both for society and for the field of volcanology. However, our knowledge of the eruption and the precursory period leading up to it is limited by the fact that most of the data, particularly seismic recordings, were not kept due to severe limitations in the amount of digital data that could be handled and stored using 1980 computer technology. Because of these limitations, only about 900 digital event files have been available for seismic studies of the March-May seismic sequence, out of a total of more than 4,000 events that were counted using paper records. Fortunately, data from a subset of stations were also recorded continuously on a series of 24 analog 14-track IRIG magnetic tapes. We have recently digitized these tapes and time-corrected and cataloged the resultant digital data streams, enabling more in-depth studies of the (almost) complete pre-eruption seismic sequence using modern digital processing techniques. Of the fifteen seismic stations operating near MSH for at least a part of the two months between March 20 and May 18, six stations have relatively complete analog recordings. These recordings have gaps of minutes to days because of radio noise, poor tape quality, or missing tapes. In addition, several other stations have partial records. All stations had short-period vertical-component sensors with very limited dynamic range and unknown response details. Nevertheless, because the stations were at a range of distances and were operated at a range of gains, a variety of earthquake sizes were recorded on scale by at least one station, and therefore a much more complete understanding of the evolution of event types, sizes, and character should be achievable. In our preliminary analysis of this dataset we have found over 10,000 individual events as recorded on stations 35-40 km from MSH, spanning a recalculated coda-duration magnitude range of ~1.5 to 4.1, including many M 3

  18. Cu isotopes in marine black shales record the Great Oxidation Event.

    Science.gov (United States)

    Chi Fru, Ernest; Rodríguez, Nathalie P; Partin, Camille A; Lalonde, Stefan V; Andersson, Per; Weiss, Dominik J; El Albani, Abderrazak; Rodushkin, Ilia; Konhauser, Kurt O

    2016-05-03

    The oxygenation of the atmosphere ∼2.45-2.32 billion years ago (Ga) is one of the most significant geological events to have affected Earth's redox history. Our understanding of the timing and processes surrounding this key transition is largely dependent on the development of redox-sensitive proxies, many of which remain unexplored. Here we report a shift from negative to positive copper isotopic compositions (δ65Cu, relative to ERM-AE633) in organic carbon-rich shales spanning the period 2.66-2.08 Ga. We suggest that, before 2.3 Ga, a muted oxidative supply of weathering-derived copper enriched in 65Cu, along with the preferential removal of 65Cu by iron oxides, left seawater and marine biomass depleted in 65Cu but enriched in 63Cu. As banded iron formation deposition waned and continentally sourced Cu became more important, biomass sampled a dissolved Cu reservoir that was progressively less fractionated relative to the continental pool. This evolution toward heavy δ65Cu values coincides with a shift to negative sedimentary δ56Fe values and increased marine sulfate after the Great Oxidation Event (GOE), and is traceable through Phanerozoic shales to modern marine settings, where marine dissolved and sedimentary δ65Cu values are universally positive. Our finding of an important shift in sedimentary Cu isotope compositions across the GOE provides new insights into the Precambrian marine cycling of this critical micronutrient, and demonstrates the proxy potential for sedimentary Cu isotope compositions in the study of biogeochemical cycles and oceanic redox balance in the past.

  19. Electronic Health Records Data and Metadata: Challenges for Big Data in the United States.

    Science.gov (United States)

    Sweet, Lauren E; Moulaison, Heather Lea

    2013-12-01

    This article, written by researchers studying metadata and standards, represents a fresh perspective on the challenges of electronic health records (EHRs) and serves as a primer for big data researchers new to health-related issues. Primarily, we argue for the importance of the systematic adoption of standards in EHR data and metadata as a way of promoting big data research and benefiting patients. EHRs have the potential to include a vast amount of longitudinal health data, and metadata provides the formal structures to govern that data. In the United States, electronic medical records (EMRs) are part of the larger EHR. EHR data is submitted by a variety of clinical data providers and potentially by the patients themselves. Because data input practices are not necessarily standardized, and because of the multiplicity of current standards, basic interoperability in EHRs is hindered. Some of the issues with EHR interoperability stem from the complexities of the data they include, which can be both structured and unstructured. A number of controlled vocabularies are available to data providers. The continuity of care document standard will provide interoperability in the United States between the EMR and the larger EHR, potentially making data input by providers directly available to other providers. The data involved is nonetheless messy. In particular, the use of competing vocabularies such as the Systematized Nomenclature of Medicine-Clinical Terms, MEDCIN, and locally created vocabularies inhibits large-scale interoperability for structured portions of the records, and unstructured portions, although potentially not machine readable, remain essential. Once EMRs for patients are brought together as EHRs, the EHRs must be managed and stored. Adequate documentation should be created and maintained to assure the secure and accurate use of EHR data. There are currently a few notable international standards initiatives for EHRs. Organizations such as Health Level Seven

  20. Solid-State Recorders Enhance Scientific Data Collection

    Science.gov (United States)

    2010-01-01

    Under Small Business Innovation Research (SBIR) contracts with Goddard Space Flight Center, SEAKR Engineering Inc., of Centennial, Colorado, crafted a solid-state recorder (SSR) to replace the tape recorder onboard a Spartan satellite carrying NASA's Inflatable Antenna Experiment. Work for that mission and others has helped SEAKR become the world leader in SSR technology for spacecraft. The company has delivered more than 100 systems, more than 85 of which have launched onboard NASA, military, and commercial spacecraft including imaging satellites that provide much of the high-resolution imagery for online mapping services like Google Earth.

  1. First record of single event upset on the ground, Cray-1 computer memory at Los Alamos in 1976

    Energy Technology Data Exchange (ETDEWEB)

    Michalak, Sarah E [Los Alamos National Laboratory; Quinn, Heather M [Los Alamos National Laboratory; Grider, Gary A [Los Alamos National Laboratory; Iwanchuk, Paul N [Los Alamos National Laboratory; Morrison, John F [Los Alamos National Laboratory; Wender, Stephen A [Los Alamos National Laboratory; Normand, Eugene [EN ASSOCIATES, LLC; Wert, Jerry L [BOEING RESEARCH AND TEC; Johnson, Steve [CRAY, INC.

    2010-01-01

    Records of bit flips in the Cray-1 computer installed at Los Alamos in 1976 yield an upset rate in the Cray-1's bipolar SRAMs consistent with SEUs induced by atmospheric neutrons. In 1976 the Cray Research Company delivered its first supercomputer, the Cray-1, installing it at Los Alamos National Laboratory. Los Alamos had competed with the Lawrence Livermore National Laboratory for the Cray-1 and won, reaching an agreement with Seymour Cray to install the machine for a period of six months for free, after which they could decide whether to buy, lease or return it. As a result, Los Alamos personnel kept track of the computer's reliability and performance, and so we know that during those six months of operation, 152 memory parity errors were recorded. The computer memory consisted of approximately 70,000 1Kx1 bipolar ECL static RAMs, the Fairchild 10415. What the Los Alamos engineers didn't know is that those bit flips were the result of single event upsets (SEUs) caused by atmospheric neutrons. Thus, these 152 bit flips were the first SEUs recorded on the ground, observed 2 years before the SEUs in the Intel DRAMs found by May and Woods in 1978. The upsets in the DRAMs were shown to have been caused by alpha particles from the chip packaging material. In this paper we will demonstrate that the Cray-1 bit flips, which were detected through the use of parity bits in the Cray-1, were likely due to atmospheric neutrons. This paper will follow the same approach as that of the very first paper to demonstrate single event effects, which occurred in satellite flip-flop circuits in 1975. The main difference is that the four events that occurred over the course of 17 satellite-years of operation were shown to be due to single event effects just a few years after those satellite anomalies were recorded. In the case of the Cray-1 bit flips, there has been a delay of more than 30 years between the occurrence of the bit
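The figures quoted above support a back-of-envelope error-rate calculation. The sketch below uses only numbers from the abstract plus one assumption of ours: that the six months correspond to roughly continuous operation (about 4,380 hours).

```python
# Back-of-envelope check of the Cray-1 parity-error rate, using only the
# figures quoted above. Continuous operation over six months (~4380 h)
# is an assumption, not a fact from the paper.

N_ERRORS = 152            # parity errors logged in six months
N_CHIPS = 70_000          # 1K x 1 bipolar ECL SRAMs (Fairchild 10415)
BITS_PER_CHIP = 1024
HOURS = 6 * 730           # ~4380 hours in six months (assumed duty cycle)

total_bits = N_CHIPS * BITS_PER_CHIP                # ~71.7 Mbit of memory
mtbe_hours = HOURS / N_ERRORS                       # mean time between errors
rate_per_bit_hour = N_ERRORS / (total_bits * HOURS)

print(f"memory size: {total_bits / 1e6:.1f} Mbit")
print(f"mean time between parity errors: {mtbe_hours:.1f} h")
print(f"upset rate: {rate_per_bit_hour:.2e} upsets per bit-hour")
```

Under these assumptions the machine saw roughly one bit flip every 29 hours, i.e. on the order of one per day of operation.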

  2. Multi-century tree-ring precipitation record reveals increasing frequency of extreme dry events in the upper Blue Nile River catchment.

    Science.gov (United States)

    Mokria, Mulugeta; Gebrekirstos, Aster; Abiyu, Abrham; Van Noordwijk, Meine; Bräuning, Achim

    2017-12-01

    Climate-related environmental and humanitarian crises are important challenges in the Great Horn of Africa (GHA). In the absence of long-term past climate records in the region, tree rings are valuable climate proxies, reflecting past climate variations and complementing climate records prior to the instrumental era. We established an annually resolved multi-century tree-ring chronology from Juniperus procera trees in northern Ethiopia, the longest series yet for the GHA. The chronology correlates significantly with wet-season (r = .64, p famine and flooding, suggesting that future climate change studies should be both trend and extreme event focused. The average return periods for dry (extreme dry) and wet (extreme wet) events were 4.1 (8.8) years and 4.1 (9.5) years. Extreme-dry conditions during the 19th century were concurrent with drought episodes in equatorial eastern Africa that occurred at the end of the Little Ice Age. El Niño and La Niña events matched 38.5% and 50% of extreme-dry and extreme-wet events, respectively. Equivalent matches for positive and negative Indian Ocean Dipole events were weaker, reaching 23.1% and 25%. Spatial correlations revealed that reconstructed rainfall represents wet-season rainfall variations over northern Ethiopia and large parts of the Sahel belt. The data presented are useful for backcasting climate and hydrological models and for developing regional strategic plans to manage scarce and contested water resources. Historical perspectives on long-term regional rainfall variability improve the interpretation of recent climate trends. © 2017 John Wiley & Sons Ltd.

  3. Template-based data entry for general description in medical records and data transfer to data warehouse for analysis.

    Science.gov (United States)

    Matsumura, Yasushi; Kuwata, Shigeki; Yamamoto, Yuichiro; Izumi, Kazunori; Okada, Yasushi; Hazumi, Michihiro; Yoshimoto, Sachiko; Mineno, Takahiro; Nagahama, Munetoshi; Fujii, Ayumi; Takeda, Hiroshi

    2007-01-01

    General descriptions in medical records are so diverse that they are usually entered as free text into an electronic medical record, and the resulting data analysis is often difficult. We developed and implemented a template-based data entry module and data analyzing system for general descriptions. We developed a template with tree structure, whose content master and entered patient's data are simultaneously expressed by XML. The entered structured data is converted to narrative form for easy reading. This module was implemented in the EMR system, and is used in 35 hospitals as of October, 2006. So far, 3725 templates (3242 concepts) have been produced. The data in XML and narrative text data are stored in the EMR database. The XML data are retrieved, and then patient's data are extracted, to be stored in the data ware-house (DWH). We developed a search assisting system that enables users to find objective data from the DWH without requiring complicated SQL. By using this method, general descriptions in medical records can be structured and made available for clinical research.
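The system above stores both the template master and the entered patient data as XML, converts the structured entry to narrative text, and later extracts the XML into a data warehouse. The sketch below illustrates that flatten-and-narrate pattern; the `<template>`/`<item>` element names are hypothetical, not the schema used by the EMR described in the paper.

```python
# Sketch: flattening template-structured XML into warehouse-ready rows
# and rendering it as narrative text. The <template>/<item> schema is
# hypothetical, not the paper's actual XML format.
import xml.etree.ElementTree as ET

doc = """
<template concept="chest-pain">
  <item name="onset">2006-10-01</item>
  <item name="severity">moderate</item>
  <item name="radiation">left arm</item>
</template>
"""

def flatten(xml_text: str) -> list[dict]:
    """Extract (concept, field, value) rows for loading into a DWH."""
    root = ET.fromstring(xml_text)
    concept = root.get("concept")
    return [
        {"concept": concept, "field": item.get("name"), "value": item.text}
        for item in root.iter("item")
    ]

rows = flatten(doc)

# Convert the structured entry to narrative form for easy reading,
# as the module described above does.
narrative = "; ".join(f"{r['field']}: {r['value']}" for r in rows)
print(narrative)
```

Because every template instance flattens to the same row shape, the warehouse can be queried uniformly across the thousands of templates mentioned above.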

  4. Video Recording in Ethnographic SLA Research: Some Issues of Validity in Data Collection.

    Science.gov (United States)

    DuFon, Margaret A.

    2002-01-01

    Reviews visual anthropology, educational anthropology, and ethnographic filmmaking literature on questions concerning the collection of valid video-recorded data in the second language context. Examines how an interaction should be video recorded, who should be video recorded, and who should do the recording. Examples illustrate the kinds of research…

  5. Unsteady aerodynamic modeling of a jet transport using flight-data-recorder data

    Science.gov (United States)

    Pan, Chih-Chin

    To assess the effect of unsteady aerodynamics on a jet transport that crashed while executing an unsuccessful go-around operation, a fuzzy logic modeling technique is proposed to estimate the aerodynamic models from a set of data recorded by the digital flight data recorder. The longitudinal aerodynamic models are assumed to be functions of the angle of attack, time rate of angle of attack, pitch rate, reduced frequency, sideslip, elevator deflection, stabilizer deflection, flap deflection and Mach number. The lateral-directional aerodynamic models are set up as functions of the angle of attack, sideslip angle, roll angle, roll rate, yaw rate, lateral reduced frequency, aileron deflection, rudder deflection, and Mach number. Among these parameters, the reduced frequency, sideslip, pitch rate, roll rate, and yaw rate are derived from the recorded data. The Euler angles, accelerations along three axes, and other recorded flight data are first bias-analyzed by satisfying the kinematic relations to account for measurement errors and noise. The force and moment coefficients are also calculated by satisfying the flight dynamic equations. The bias-adjusted data are then used to establish the general dynamic aerodynamic models using a fuzzy-logic algorithm with internal functions. The identified fuzzy logic aerodynamic models are then employed to predict the stability and control derivatives along the flight path or in other flight conditions. The results show that low longitudinal control effectiveness and unstable pitch damping are possible at high flap angles and moderate angles of attack. Finally, the identified aerodynamic models are integrated into a real-time flight simulation routine running on a PC. The simulated flight scenarios can be recorded by the simulation routine for later replay and plotting. It is demonstrated that the unsteady aerodynamic models in fuzzy logic form have the potential to be used on a real flight simulator. More daily flight
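The core idea of a fuzzy-logic model, blending overlapping local rules into a smooth function of the flight variables, can be shown in miniature. The sketch below is a zero-order fuzzy model over a single input (angle of attack) with made-up membership centers and rule outputs; the abstract's models use many inputs and identified parameters, none of which are reproduced here.

```python
# Minimal zero-order fuzzy-logic model: triangular membership functions
# over one input (angle of attack, degrees) blend per-rule constant
# outputs. All centers and outputs below are illustrative assumptions,
# not values identified from any flight data.

def tri(x, left, center, right):
    """Triangular membership degree of x, peaking at `center`."""
    if x <= left or x >= right:
        return 0.0
    if x <= center:
        return (x - left) / (center - left)
    return (right - x) / (right - center)

# Three fuzzy sets over alpha, each paired with a constant output
# (think of it as a local lift-coefficient estimate).
RULES = [
    ((-5.0,  0.0, 10.0), 0.2),   # "low alpha"
    (( 0.0, 10.0, 20.0), 1.0),   # "mid alpha"
    ((10.0, 20.0, 30.0), 0.7),   # "high alpha" (post-stall drop-off)
]

def fuzzy_model(alpha: float) -> float:
    """Membership-weighted average of the rule outputs."""
    weights = [tri(alpha, *mf) for mf, _ in RULES]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * out for w, (_, out) in zip(weights, RULES)) / total

print(fuzzy_model(5.0))   # blend of the "low" and "mid" rules
```

Because neighboring rules overlap, the output varies smoothly with alpha, which is what makes such models differentiable enough to extract stability and control derivatives along a flight path.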

  6. Processing data communications events by awakening threads in parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2016-03-15

    Processing data communications events in a parallel active messaging interface (`PAMI`) of a parallel computer that includes compute nodes that execute a parallel application, with the PAMI including data communications endpoints, and the endpoints are coupled for data communications through the PAMI and through other data communications resources, including determining by an advance function that there are no actionable data communications events pending for its context, placing by the advance function its thread of execution into a wait state, waiting for a subsequent data communications event for the context; responsive to occurrence of a subsequent data communications event for the context, awakening by the thread from the wait state; and processing by the advance function the subsequent data communications event now pending for the context.
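The claim describes an advance function that puts its thread to sleep when no events are pending for its context and is awakened when one arrives. That is the classic condition-variable pattern, sketched below in Python threading as a stand-in for the actual PAMI C implementation (the `Context`/`post`/`advance` names are ours, not PAMI API names).

```python
# Condition-variable sketch of the claimed pattern: an advance function
# waits when no data communications events are pending for its context,
# and is awakened when a subsequent event arrives. Python threading is a
# stand-in for the real PAMI C API; all names here are illustrative.
import threading
from collections import deque

class Context:
    def __init__(self):
        self._events = deque()
        self._cond = threading.Condition()
        self.processed = []

    def post(self, event):
        """Called from another thread when a communications event occurs."""
        with self._cond:
            self._events.append(event)
            self._cond.notify()          # awaken the waiting advance thread

    def advance(self):
        """Process one event, sleeping until one is pending."""
        with self._cond:
            while not self._events:      # no actionable events: wait state
                self._cond.wait()
            event = self._events.popleft()
        self.processed.append(event)     # "process" outside the lock

ctx = Context()
worker = threading.Thread(target=ctx.advance)
worker.start()
ctx.post("recv-complete")                # subsequent event awakens the thread
worker.join(timeout=5)
print(ctx.processed)
```

The `while not self._events` loop (rather than `if`) guards against spurious wakeups, which matters in the real multi-endpoint setting the patent targets.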

  7. Characterization of Recent Greenland Melt Events in Atmospheric Analyses and Satellite Data

    Science.gov (United States)

    Cullather, R. I.; Nowicki, S.; Zhao, B.; Koenig, L.; Moustafa, S.

    2014-12-01

    Data from a variety of observational and modeling sources have indicated enhanced, widespread melting of the Greenland Ice Sheet (GrIS) surface in recent years. This melting was punctuated on 11 July 2012, when almost the entire ice sheet, including Summit, simultaneously experienced surface melt. While that event has been attributed to unique meteorological conditions, 2012 also set the seasonal melt record in terms of spatial extent and duration. Previous melt extent records occurred in 2002, 2007, and 2010. Melt extent may be estimated from remote sensing methods, but runoff volume may only be obtained from select in situ measurement locations or modeling methods. The aim of this study is to assess differences in available estimates of melt extent and runoff volume, and to characterize the spatial and temporal variability of surface melt. Significantly, the evaluation is conducted over the full available period from 1980 to the present, rather than focusing on one event. The GEOS-5 global atmospheric model with an improved surface representation for the GrIS is replayed against the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) to produce an historical reanalysis. The use of GEOS-5 offers the potential for applying a global model with a realistic GrIS surface representation to the assessment of melt events and their relation to the large-scale climate. These values are compared with output from the Modèle Atmosphérique Régional (MAR) regional climate model and passive microwave remote sensing data. The approach is to spatially average values by drainage basins and evaluate the resulting time series. Seasonally, numerical analyses and models typically indicate less melt coverage during the early summer and more in late summer in comparison to passive microwave data. The relation between melt area, melt duration, and runoff volume differs markedly by drainage basin

  8. Large-mass di-jet event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Document Server

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with the largest-mass jet pair fulfilling all analysis requirements observed so far by the CMS detector in collision data collected in 2015. The mass of the di-jet system is 6.14 TeV. Both jets are reconstructed in the barrel region and have transverse momenta of about 3 TeV each.

  9. Past flood events reflected in Holocene floodplain records of East-Germany

    Science.gov (United States)

    Schneider, H.; Höfer, D.; Mäusbacher, R.; Gude, M.

    2007-12-01

    The sediment archives in lakes and mires formed by salt solution within the floodplain of the Middle Werra river were used to detect effects of climate and land-use changes on the sedimentation regime of the river by means of high-resolution sedimentological and palynological methods. All archives of the Middle Werra valley show similar sedimentation sequences, which were mainly influenced by climate until the Middle Ages and mostly affected by human activity between the Middle Ages and Modern Times. Climatic influence, in the form of wetter conditions, is indicated especially by a clear increase of indicators for floodplain forest, reed communities and aquatics, combined with decreasing human land-use indicators in the investigation area. In addition, the palynological results show that the sediment input into these depressions is higher both during periods with wetter conditions and during periods of increased human impact in the catchment. Given that the river is hydrologically connected with these depressions during flood events, the minerogenic layers are interpreted as flood deposits.

  10. Tablet computers for recording tuberculosis data at a community ...

    African Journals Online (AJOL)

    Don O’Mahony

    2014-08-20

    Aug 20, 2014 ... The advantages of handheld electronic devices for data collection compared to paper, i.e. the reduced .... operating system for clinical applications on tablet devices. Based on the above studies, data for the ... Android® data collection apps are based on OpenDataKit®,. e.g. OpenClinic®, CommCare® and ...

  11. To what extent are adverse events found in patient records reported by patients and healthcare professionals via complaints, claims and incident reports?

    Science.gov (United States)

    2011-01-01

    Background Patient record review is believed to be the most useful method for estimating the rate of adverse events among hospitalised patients. However, the method has some practical and financial disadvantages. Some of these disadvantages might be overcome by using existing reporting systems in which patient safety issues are already reported, such as incidents reported by healthcare professionals and complaints and medico-legal claims filed by patients or their relatives. The aim of the study is to examine to what extent the hospital reporting systems cover the adverse events identified by patient record review. Methods We conducted a retrospective study using a database from a record review study of 5375 patient records in 14 hospitals in the Netherlands. Trained nurses and physicians using a method based on the protocol of The Harvard Medical Practice Study previously reviewed the records. Four reporting systems were linked with the database of reviewed records: 1) informal and 2) formal complaints by patients/relatives, 3) medico-legal claims by patients/relatives and 4) incident reports by healthcare professionals. For each adverse event identified in patient records the equivalent was sought in these reporting systems by comparing dates and descriptions of the events. The study focussed on the number of adverse event matches, overlap of adverse events detected by different sources, preventability and severity of consequences of reported and non-reported events and sensitivity and specificity of reports. Results In the sample of 5375 patient records, 498 adverse events were identified. Only 18 of the 498 (3.6%) adverse events identified by record review were found in one or more of the four reporting systems. There was some overlap: one adverse event had an equivalent in both a complaint and incident report and in three cases a patient/relative used two or three systems to complain about an adverse event. Healthcare professionals reported relatively more
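The 3.6% figure in the Results is the combined sensitivity of the four reporting systems when record review is treated as the reference standard; the arithmetic is worth making explicit:

```python
# Overlap figures from the Results above: 498 adverse events identified
# by record review, 18 of which had an equivalent in at least one of the
# four reporting systems.
record_review_events = 498
matched_in_reporting = 18

# Treating record review as the reference standard, the combined
# sensitivity of the reporting systems for these adverse events is:
sensitivity = matched_in_reporting / record_review_events
print(f"combined sensitivity: {sensitivity:.1%}")
```

This reproduces the 3.6% quoted above, underscoring the paper's point that reporting systems miss the vast majority of adverse events that record review detects.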

  12. To what extent are adverse events found in patient records reported by patients and healthcare professionals via complaints, claims and incident reports?

    Directory of Open Access Journals (Sweden)

    van der Wal Gerrit

    2011-02-01

    Abstract Background Patient record review is believed to be the most useful method for estimating the rate of adverse events among hospitalised patients. However, the method has some practical and financial disadvantages. Some of these disadvantages might be overcome by using existing reporting systems in which patient safety issues are already reported, such as incidents reported by healthcare professionals and complaints and medico-legal claims filed by patients or their relatives. The aim of the study is to examine to what extent the hospital reporting systems cover the adverse events identified by patient record review. Methods We conducted a retrospective study using a database from a record review study of 5375 patient records in 14 hospitals in the Netherlands. Trained nurses and physicians using a method based on the protocol of The Harvard Medical Practice Study previously reviewed the records. Four reporting systems were linked with the database of reviewed records: 1) informal and 2) formal complaints by patients/relatives, 3) medico-legal claims by patients/relatives and 4) incident reports by healthcare professionals. For each adverse event identified in patient records the equivalent was sought in these reporting systems by comparing dates and descriptions of the events. The study focussed on the number of adverse event matches, overlap of adverse events detected by different sources, preventability and severity of consequences of reported and non-reported events and sensitivity and specificity of reports. Results In the sample of 5375 patient records, 498 adverse events were identified. Only 18 of the 498 (3.6%) adverse events identified by record review were found in one or more of the four reporting systems. There was some overlap: one adverse event had an equivalent in both a complaint and incident report and in three cases a patient/relative used two or three systems to complain about an adverse event. Healthcare professionals

  13. Mining Staff Assignment Rules from Event-Based Data

    NARCIS (Netherlands)

    Ly, L.T.; Rinderle, S.B.; Dadam, P.; Reichert, M.U.

    2006-01-01

    Process mining offers methods and techniques for capturing process behaviour from log data of past process executions. Although many promising approaches on mining the control flow have been published, no attempt has been made to mine the staff assignment situation of business processes. In this

  14. A Naive Bayes machine learning approach to risk prediction using censored, time-to-event data.

    Science.gov (United States)

    Wolfson, Julian; Bandyopadhyay, Sunayan; Elidrisi, Mohamed; Vazquez-Benitez, Gabriela; Vock, David M; Musgrove, Donald; Adomavicius, Gediminas; Johnson, Paul E; O'Connor, Patrick J

    2015-09-20

    Predicting an individual's risk of experiencing a future clinical outcome is a statistical task with important consequences for both practicing clinicians and public health experts. Modern observational databases such as electronic health records provide an alternative to the longitudinal cohort studies traditionally used to construct risk models, bringing with them both opportunities and challenges. Large sample sizes and detailed covariate histories enable the use of sophisticated machine learning techniques to uncover complex associations and interactions, but observational databases are often 'messy', with high levels of missing data and incomplete patient follow-up. In this paper, we propose an adaptation of the well-known Naive Bayes machine learning approach to time-to-event outcomes subject to censoring. We compare the predictive performance of our method with the Cox proportional hazards model which is commonly used for risk prediction in healthcare populations, and illustrate its application to prediction of cardiovascular risk using an electronic health record dataset from a large Midwest integrated healthcare system. Copyright © 2015 John Wiley & Sons, Ltd.
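The general idea of casting risk prediction as classification at a fixed horizon can be shown with a toy Gaussian Naive Bayes model. Note the hedge: the sketch below simply drops subjects censored before the horizon, which is exactly the crude shortcut the paper improves upon with a principled censoring adjustment; the blood-pressure feature and all numbers are invented for illustration.

```python
# Toy risk prediction as classification at a fixed horizon. Subjects
# censored before the horizon are dropped here -- a crude simplification;
# the paper's contribution is a principled Naive Bayes treatment of such
# censoring. Feature and data below are hypothetical.
import math

def fit_gaussian_nb(xs, ys):
    """Per-class mean/variance for one feature, plus class priors."""
    params = {}
    for c in (0, 1):
        vals = [x for x, y in zip(xs, ys) if y == c]
        mu = sum(vals) / len(vals)
        var = sum((v - mu) ** 2 for v in vals) / len(vals) or 1e-9
        params[c] = (mu, var, len(vals) / len(xs))
    return params

def predict_risk(params, x):
    """P(event by horizon | x) via Bayes' rule with Gaussian likelihoods."""
    def score(c):
        mu, var, prior = params[c]
        lik = math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
        return prior * lik
    s0, s1 = score(0), score(1)
    return s1 / (s0 + s1)

# Hypothetical data: feature = systolic blood pressure; label = 1 if a
# cardiovascular event occurred within the horizon, 0 if event-free with
# full follow-up. (Subjects censored early are excluded.)
bp  = [110, 115, 120, 125, 150, 155, 160, 165]
evt = [0,   0,   0,   0,   1,   1,   1,   1]

model = fit_gaussian_nb(bp, evt)
print(predict_risk(model, 158))   # high BP: risk near 1
print(predict_risk(model, 112))   # low BP: risk near 0
```

The Naive Bayes structure makes the per-feature likelihoods cheap to estimate on very large EHR datasets, which is the scalability argument the paper builds on.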

  15. Syndromic surveillance in companion animals utilizing electronic medical records data: development and proof of concept

    Directory of Open Access Journals (Sweden)

    Philip H. Kass

    2016-05-01

    In an effort to recognize and address communicable and point-source epidemics in dog and cat populations, this project created a near real-time syndromic surveillance system devoted to companion animal health in the United States. With over 150 million owned pets in the US, the development of such a system is timely in light of previous epidemics due to various causes that were only recognized in retrospect. The goal of this study was to develop epidemiologic and statistical methods for veterinary hospital-based surveillance, and to demonstrate its efficacy by detection of simulated foodborne outbreaks using a database of over 700 hospitals. Data transfer protocols were established via a secure file transfer protocol site, and a data repository was constructed predominantly utilizing open-source software. The daily proportion of patients with a given clinical or laboratory finding was contrasted with an equivalent average proportion from a historical comparison period, allowing construction of the proportionate diagnostic outcome ratio and its confidence interval for recognizing aberrant health events. A five-tiered alert system was used to facilitate daily assessment of almost 2,000 statistical analyses. Two simulated outbreak scenarios were created by independent experts, blinded to the study investigators, and embedded in the 2010 medical records. Both outbreaks were detected almost immediately by the alert system, which accurately identified the species affected, the relevant clinical and laboratory findings, and the ages involved. Besides demonstrating proof of concept of using veterinary hospital databases to detect aberrant events in space and time, this research can be extended to conducting post-detection etiologic investigations utilizing exposure information in the medical record.
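The alert statistic described above compares a daily proportion against a historical baseline proportion. The abstract does not give the exact formula for the proportionate diagnostic outcome ratio or its confidence interval, so the sketch below uses a risk-ratio-style estimate with a log-normal 95% CI, one standard construction for a ratio of proportions; treat it as our assumption, not the paper's definition.

```python
# Sketch of a proportionate-diagnostic-outcome-ratio-style statistic:
# today's proportion of patients with a finding vs. a historical baseline
# proportion. The exact formula is not given in the abstract; this uses a
# risk-ratio estimate with a log-normal 95% CI as one plausible choice.
import math

def pdor(cases_today, patients_today, cases_hist, patients_hist, z=1.96):
    p_today = cases_today / patients_today
    p_hist = cases_hist / patients_hist
    ratio = p_today / p_hist
    # Standard error of log(ratio), as for a risk ratio
    se = math.sqrt(1 / cases_today - 1 / patients_today
                   + 1 / cases_hist - 1 / patients_hist)
    lo = math.exp(math.log(ratio) - z * se)
    hi = math.exp(math.log(ratio) + z * se)
    return ratio, lo, hi

# Hypothetical day: 30 vomiting cases among 400 patients, against a
# baseline of 250 cases among 10,000 patients.
ratio, lo, hi = pdor(30, 400, 250, 10_000)
print(f"ratio = {ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A lower confidence bound above 1.0, as in this hypothetical day, is the kind of signal a tiered alert system can escalate for human review.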

  16. Characterization of System Level Single Event Upset (SEU) Responses using SEU Data, Classical Reliability Models, and Space Environment Data

    Science.gov (United States)

    Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael

    2017-01-01

    We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.

  17. The 2007 Stromboli eruption: Event chronology and effusion rates using thermal infrared data

    Science.gov (United States)

    Calvari, S.; Lodato, L.; Steffke, A.; Cristaldi, A.; Harris, A. J. L.; Spampinato, L.; Boschi, E.

    2010-04-01

    Using thermal infrared images recorded by a permanent thermal camera network maintained on Stromboli volcano (Italy), together with satellite and helicopter-based thermal image surveys, we have compiled a chronology of the events and processes occurring before and during Stromboli's 2007 effusive eruption. These digital data also allow us to calculate the effusion rates and lava volumes erupted during the effusive episode. At the onset of the 2007 eruption, two parallel eruptive fissures developed within the northeast crater, eventually breaching the NE flank of the summit cone and extending along the eastern margin of the Sciara del Fuoco. These fed a main effusive vent at 400 m above sea level, which in turn fed lava flows that extended to the sea. The effusive eruption was punctuated, on 15 March, by a paroxysm with features similar to those of the 5 April paroxysm that occurred during the 2002-2003 effusive eruption. A total of between 3.2 × 10⁶ and 11 × 10⁶ m³ of lava was erupted during the 2007 eruption, over 34 days of effusive activity. More than half of this volume was emplaced during the first 5.5 days of the eruption. Although the 2007 effusive eruption had an erupted volume comparable to that of the previous (2002-2003) effusive eruption, it had a shorter duration and thus a mean output rate (total volume divided by eruption duration) that was 1 order of magnitude higher than that of the 2002-2003 event (~2.4 versus 0.32 ± 0.28 m³ s⁻¹). In this paper, we discuss similarities and differences between these two effusive events and interpret the processes occurring in 2007 in terms of the recent dynamics witnessed at Stromboli.
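The mean output rate quoted above can be reproduced from the erupted-volume range and the 34-day duration; taking the midpoint of the volume range is our assumption for the check:

```python
# Reproducing the ~2.4 m^3/s mean output rate quoted above from the
# erupted-volume range and 34-day duration. Using the midpoint of the
# volume range is an assumption for this check, not stated in the paper.
vol_low, vol_high = 3.2e6, 11e6      # erupted lava volume bounds, m^3
duration_s = 34 * 86_400             # 34 days of effusive activity, in s

mean_rate_mid = (vol_low + vol_high) / 2 / duration_s
print(f"mean output rate (midpoint volume): {mean_rate_mid:.1f} m^3/s")
print(f"range: {vol_low / duration_s:.1f} - {vol_high / duration_s:.1f} m^3/s")
```

The midpoint gives about 2.4 m³/s, matching the value quoted against the 2002-2003 eruption's 0.32 ± 0.28 m³/s.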

  18. Uncovering a Salt Giant. Deep-Sea Record of Mediterranean Messinian Events (DREAM) multi-phase drilling project

    Science.gov (United States)

    Camerlenghi, Angelo; Aoisi, Vanni; Lofi, Johanna; Hübscher, Christian; deLange, Gert; Flecker, Rachel; Garcia-Castellanos, Daniel; Gorini, Christian; Gvirtzman, Zohar; Krijgsman, Wout; Lugli, Stefano; Makowsky, Yizhaq; Manzi, Vinicio; McGenity, Terry; Panieri, Giuliana; Rabineau, Marina; Roveri, Marco; Sierro, Francisco Javier; Waldmann, Nicolas

    2014-05-01

    In May 2013, the DREAM MagellanPlus Workshop was held in Brisighella (Italy). The initiative builds on recent activities by various research groups to identify potential sites for deep-sea scientific drilling in the Mediterranean Sea across the deep Messinian Salinity Crisis (MSC) sedimentary record. The workshop gathered three generations of scientists: those who participated in the formulation of the deep desiccated model through DSDP Leg 13 drilling in 1973; those who are actively involved in present-day MSC research; and the next generation (PhD students and young post-docs). The purpose of the workshop was to identify locations for multiple-site drilling (including riser drilling) in the Mediterranean Sea that would contribute to resolving the many open questions that still exist about the causes, processes, timing and consequences, at local and planetary scale, of an outstanding case of natural environmental change in recent Earth history: the Messinian Salinity Crisis in the Mediterranean Sea. The product of the workshop is the identification of the structure of an experimental design of site characterization, riser-less and riser drilling, sampling, measurements, and down-hole analyses that will be the core of at least one compelling and feasible multiple-phase drilling proposal. Particular focus was given to reviewing seismic site survey data available from different research groups at pan-Mediterranean basin scale, to the assessment of additional site survey activity including 3D seismics, and to ways of establishing firm links with the oil and gas industry. The scientific community behind the DREAM initiative intends to proceed with the submission to IODP of a Multi-phase Drilling Project including several drilling proposals addressing specific drilling objectives, all linked to the driving objectives of drilling and understanding the MSC. A series of critical drilling targets were identified to address the still open questions

  19. Audio Recording Device Data for Assessing Avian Detectability, Seward Peninsula, Alaska, 2013-2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set contains information from recording devices that were set to record regularly during summer breeding seasons. A single observer listened to 2692...

  20. NOAA Climate Data Record (CDR) of Northern Hemisphere (NH) Snow Cover Extent (SCE), Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This NOAA Climate Data Record (CDR) is a record for the Northern Hemisphere (NH) Snow Cover Extent (SCE) spanning from October 4, 1966 to present, updated monthly...

  1. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    Science.gov (United States)

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
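The two-stage structure described above, breach an Internet-facing system, then escalate via local vulnerabilities, with attacker power expressed as a count of weaponized exploits, can be sketched as a simple probability model. The independence assumptions and all numbers below are ours, purely illustrative, not the authors' calibrated model.

```python
# Sketch of a two-stage attack probability in the spirit of the model
# described above. Independence assumptions and probabilities are
# illustrative inventions, not the authors' calibrated estimates.

def p_stage(vuln_exploit_probs, weaponized):
    """P(at least one of the attacker's weaponized exploits succeeds),
    treating exploit attempts as independent. `vuln_exploit_probs` maps
    vulnerability id -> per-attempt success probability; `weaponized` is
    the set of vulnerabilities the attacker has tooling for."""
    p_fail_all = 1.0
    for vid, p in vuln_exploit_probs.items():
        if vid in weaponized:
            p_fail_all *= (1.0 - p)
    return 1.0 - p_fail_all

perimeter = {"CVE-A": 0.30, "CVE-B": 0.10}   # Internet-facing system
internal  = {"CVE-C": 0.50, "CVE-D": 0.20}   # internal target system
attacker  = {"CVE-A", "CVE-C"}               # attacker "power": 2 exploits

p_breach   = p_stage(perimeter, attacker)    # stage 1: breach perimeter
p_escalate = p_stage(internal, attacker)     # stage 2: escalate, given breach
p_attack   = p_breach * p_escalate
print(f"P(two-stage attack succeeds) = {p_attack:.3f}")
```

Growing the `attacker` set (more weaponized vulnerabilities) monotonically increases both stage probabilities, which is how attacker power enters the estimate; an organization's risk appetite could then be expressed as a threshold on `p_attack`.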

  2. Top-quark mass measurement in events with jets and missing transverse energy using the full CDF data set

    Science.gov (United States)

    Aaltonen, T.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Appel, J. A.; Arisawa, T.; Artikov, A.; Asaadi, J.; Ashmanskas, W.; Auerbach, B.; Aurisano, A.; Azfar, F.; Badgett, W.; Bae, T.; Barbaro-Galtieri, A.; Barnes, V. E.; Barnett, B. A.; Barria, P.; Bartos, P.; Bauce, M.; Bedeschi, F.; Behari, S.; Bellettini, G.; Bellinger, J.; Benjamin, D.; Beretvas, A.; Bhatti, A.; Bland, K. R.; Blumenfeld, B.; Bocci, A.; Bodek, A.; Bortoletto, D.; Boudreau, J.; Boveia, A.; Brigliadori, L.; Bromberg, C.; Brucken, E.; Budagov, J.; Budd, H. S.; Burkett, K.; Busetto, G.; Bussey, P.; Butti, P.; Buzatu, A.; Calamba, A.; Camarda, S.; Campanelli, M.; Canelli, F.; Carls, B.; Carlsmith, D.; Carosi, R.; Carrillo, S.; Casal, B.; Casarsa, M.; Castro, A.; Catastini, P.; Cauz, D.; Cavaliere, V.; Cavalli-Sforza, M.; Cerri, A.; Cerrito, L.; Chen, Y. C.; Chertok, M.; Chiarelli, G.; Chlachidze, G.; Cho, K.; Chokheli, D.; Ciocci, M. A.; Clark, A.; Clarke, C.; Convery, M. E.; Conway, J.; Corbo, M.; Cordelli, M.; Cox, C. A.; Cox, D. J.; Cremonesi, M.; Cruz, D.; Cuevas, J.; Culbertson, R.; d'Ascenzo, N.; Datta, M.; De Barbaro, P.; Demortier, L.; Deninno, M.; d'Errico, M.; Devoto, F.; Di Canto, A.; Di Ruzza, B.; Dittmann, J. R.; D'Onofrio, M.; Donati, S.; Dorigo, M.; Driutti, A.; Ebina, K.; Edgar, R.; Elagin, A.; Erbacher, R.; Errede, S.; Esham, B.; Eusebi, R.; Farrington, S.; Fernández Ramos, J. P.; Field, R.; Flanagan, G.; Forrest, R.; Franklin, M.; Freeman, J. C.; Frisch, H.; Funakoshi, Y.; Garfinkel, A. F.; Garosi, P.; Gerberich, H.; Gerchtein, E.; Giagu, S.; Giakoumopoulou, V.; Gibson, K.; Ginsburg, C. M.; Giokaris, N.; Giromini, P.; Giurgiu, G.; Glagolev, V.; Glenzinski, D.; Gold, M.; Goldin, D.; Golossanov, A.; Gomez, G.; Gomez-Ceballos, G.; Goncharov, M.; González López, O.; Gorelov, I.; Goshaw, A. T.; Goulianos, K.; Gramellini, E.; Grinstein, S.; Grosso-Pilcher, C.; Group, R. C.; Guimaraes da Costa, J.; Hahn, S. R.; Han, J. 
Y.; Happacher, F.; Hara, K.; Hare, M.; Harr, R. F.; Harrington-Taber, T.; Hatakeyama, K.; Hays, C.; Heinrich, J.; Herndon, M.; Hocker, A.; Hong, Z.; Hopkins, W.; Hou, S.; Hughes, R. E.; Husemann, U.; Hussein, M.; Huston, J.; Introzzi, G.; Iori, M.; Ivanov, A.; James, E.; Jang, D.; Jayatilaka, B.; Jeon, E. J.; Jindariani, S.; Jones, M.; Joo, K. K.; Jun, S. Y.; Junk, T. R.; Kambeitz, M.; Kamon, T.; Karchin, P. E.; Kasmi, A.; Kato, Y.; Ketchum, W.; Keung, J.; Kilminster, B.; Kim, D. H.; Kim, H. S.; Kim, J. E.; Kim, M. J.; Kim, S. B.; Kim, S. H.; Kim, Y. J.; Kim, Y. K.; Kimura, N.; Kirby, M.; Knoepfel, K.; Kondo, K.; Kong, D. J.; Konigsberg, J.; Kotwal, A. V.; Kreps, M.; Kroll, J.; Kruse, M.; Kuhr, T.; Kurata, M.; Laasanen, A. T.; Lammel, S.; Lancaster, M.; Lannon, K.; Latino, G.; Lee, H. S.; Lee, J. S.; Leo, S.; Leone, S.; Lewis, J. D.; Limosani, A.; Lipeles, E.; Lister, A.; Liu, H.; Liu, Q.; Liu, T.; Lockwitz, S.; Loginov, A.; Lucà, A.; Lucchesi, D.; Lueck, J.; Lujan, P.; Lukens, P.; Lungu, G.; Lys, J.; Lysak, R.; Madrak, R.; Maestro, P.; Malik, S.; Manca, G.; Manousakis-Katsikakis, A.; Margaroli, F.; Marino, P.; Martínez, M.; Matera, K.; Mattson, M. E.; Mazzacane, A.; Mazzanti, P.; McNulty, R.; Mehta, A.; Mehtala, P.; Mesropian, C.; Miao, T.; Mietlicki, D.; Mitra, A.; Miyake, H.; Moed, S.; Moggi, N.; Moon, C. S.; Moore, R.; Morello, M. J.; Mukherjee, A.; Muller, Th.; Murat, P.; Mussini, M.; Nachtman, J.; Nagai, Y.; Naganoma, J.; Nakano, I.; Napier, A.; Nett, J.; Neu, C.; Nigmanov, T.; Nodulman, L.; Noh, S. Y.; Norniella, O.; Oakes, L.; Oh, S. H.; Oh, Y. D.; Oksuzian, I.; Okusawa, T.; Orava, R.; Ortolan, L.; Pagliarone, C.; Palencia, E.; Palni, P.; Papadimitriou, V.; Parker, W.; Pauletta, G.; Paulini, M.; Paus, C.; Phillips, T. 
J.; Piacentino, G.; Pianori, E.; Pilot, J.; Pitts, K.; Plager, C.; Pondrom, L.; Poprocki, S.; Potamianos, K.; Pranko, A.; Prokoshin, F.; Ptohos, F.; Punzi, G.; Ranjan, N.; Redondo Fernández, I.; Renton, P.; Rescigno, M.; Rimondi, F.; Ristori, L.; Robson, A.; Rodriguez, T.; Rolli, S.; Ronzani, M.; Roser, R.; Rosner, J. L.; Ruffini, F.; Ruiz, A.; Russ, J.; Rusu, V.; Sakumoto, W. K.; Sakurai, Y.; Santi, L.; Sato, K.; Saveliev, V.; Savoy-Navarro, A.; Schlabach, P.; Schmidt, E. E.; Schwarz, T.; Scodellaro, L.; Scuri, F.; Seidel, S.; Seiya, Y.; Semenov, A.; Sforza, F.; Shalhout, S. Z.; Shears, T.; Shepard, P. F.; Shimojima, M.; Shochet, M.; Shreyber-Tecker, I.; Simonenko, A.; Sinervo, P.; Sliwa, K.; Smith, J. R.; Snider, F. D.; Song, H.; Sorin, V.; Stancari, M.; Denis, R. St.; Stelzer, B.; Stelzer-Chilton, O.; Stentz, D.; Strologas, J.; Sudo, Y.; Sukhanov, A.; Suslov, I.; Takemasa, K.; Takeuchi, Y.; Tang, J.; Tecchio, M.; Teng, P. K.; Thom, J.; Thomson, E.; Thukral, V.; Toback, D.; Tokar, S.; Tollefson, K.; Tomura, T.; Tonelli, D.; Torre, S.; Torretta, D.; Totaro, P.; Trovato, M.; Ukegawa, F.; Uozumi, S.; Vázquez, F.; Velev, G.; Vellidis, C.; Vernieri, C.; Vidal, M.; Vilar, R.; Vizán, J.; Vogel, M.; Volpi, G.; Wagner, P.; Wallny, R.; Wang, S. M.; Warburton, A.; Waters, D.; Wester, W. C., III; Whiteson, D.; Wicklund, A. B.; Wilbur, S.; Williams, H. H.; Wilson, J. S.; Wilson, P.; Winer, B. L.; Wittich, P.; Wolbers, S.; Wolfe, H.; Wright, T.; Wu, X.; Wu, Z.; Yamamoto, K.; Yamato, D.; Yang, T.; Yang, U. K.; Yang, Y. C.; Yao, W.-M.; Yeh, G. P.; Yi, K.; Yoh, J.; Yorita, K.; Yoshida, T.; Yu, G. B.; Yu, I.; Zanetti, A. M.; Zeng, Y.; Zhou, C.; Zucchelli, S.

    2013-07-01

We present a measurement of the top-quark mass using the full data set of Tevatron √s = 1.96 TeV proton-antiproton collisions recorded by the CDF II detector, corresponding to an integrated luminosity of 8.7 fb⁻¹. The analysis uses events with one semileptonic t or t̄ decay, but without detection of the electron or muon. We select events with significant missing transverse energy and multiple jets. We veto events containing identified electrons or muons. We obtain distributions of the top-quark masses and the invariant mass of the two jets from W-boson decays from data and compare these to templates derived from signal and background samples to extract the top-quark mass and the energy scale of the calorimeter jets with in situ calibration. A likelihood fit of the templates from signal and background events to the data yields the top-quark mass, Mtop = 173.93 ± 1.64 (stat) ± 0.87 (syst) GeV/c². This result is the most precise measurement to date of the mass of the top quark in this event topology.
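The template-fit idea in this abstract can be sketched in a few lines: build a binned template as a function of the mass hypothesis and pick the hypothesis that minimizes a Poisson negative log-likelihood against the data. Everything below (the Gaussian template shape, binning, event counts, and the Asimov-style pseudo-data) is a hypothetical toy, not the CDF analysis code.

```python
import math

def expected_counts(mass, bin_centers, n_events=1000.0, width=10.0):
    """Toy signal template: a Gaussian in reconstructed top mass,
    centred on the hypothesized mass (illustrative shape only)."""
    weights = [math.exp(-0.5 * ((b - mass) / width) ** 2) for b in bin_centers]
    norm = sum(weights)
    return [n_events * w / norm for w in weights]

def nll(observed, expected):
    """Poisson negative log-likelihood, dropping the constant n! terms."""
    return sum(mu - n * math.log(mu) for n, mu in zip(observed, expected))

bin_centers = [130 + 5 * i for i in range(25)]      # 130-250 GeV/c^2
data = expected_counts(173.9, bin_centers)          # Asimov-like pseudo-data

# Scan the mass hypothesis and keep the template that minimizes the NLL.
scan = [round(150 + 0.1 * k, 1) for k in range(500)]
best = min(scan, key=lambda m: nll(data, expected_counts(m, bin_centers)))
print(best)
```

With pseudo-data generated at 173.9, the scan recovers that value; a real analysis would also float the jet energy scale and include background templates.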

  3. Sharing Neuron Data: Carrots, Sticks, and Digital Records.

    Science.gov (United States)

    Ascoli, Giorgio A

    2015-10-01

    Routine data sharing is greatly benefiting several scientific disciplines, such as molecular biology, particle physics, and astronomy. Neuroscience data, in contrast, are still rarely shared, greatly limiting the potential for secondary discovery and the acceleration of research progress. Although the attitude toward data sharing is non-uniform across neuroscience subdomains, widespread adoption of data sharing practice will require a cultural shift in the community. Digital reconstructions of axonal and dendritic morphology constitute a particularly "sharable" kind of data. The popularity of the public repository NeuroMorpho.Org demonstrates that data sharing can benefit both users and contributors. Increased data availability is also catalyzing the grassroots development and spontaneous integration of complementary resources, research tools, and community initiatives. Even in this rare successful subfield, however, more data are still unshared than shared. Our experience as developers and curators of NeuroMorpho.Org suggests that greater transparency regarding the expectations and consequences of sharing (or not sharing) data, combined with public disclosure of which datasets are shared and which are not, may expedite the transition to community-wide data sharing.

  4. Sharing Neuron Data: Carrots, Sticks, and Digital Records.

    Directory of Open Access Journals (Sweden)

    Giorgio A Ascoli

    2015-10-01

Full Text Available Routine data sharing is greatly benefiting several scientific disciplines, such as molecular biology, particle physics, and astronomy. Neuroscience data, in contrast, are still rarely shared, greatly limiting the potential for secondary discovery and the acceleration of research progress. Although the attitude toward data sharing is non-uniform across neuroscience subdomains, widespread adoption of data sharing practice will require a cultural shift in the community. Digital reconstructions of axonal and dendritic morphology constitute a particularly "sharable" kind of data. The popularity of the public repository NeuroMorpho.Org demonstrates that data sharing can benefit both users and contributors. Increased data availability is also catalyzing the grassroots development and spontaneous integration of complementary resources, research tools, and community initiatives. Even in this rare successful subfield, however, more data are still unshared than shared. Our experience as developers and curators of NeuroMorpho.Org suggests that greater transparency regarding the expectations and consequences of sharing (or not sharing) data, combined with public disclosure of which datasets are shared and which are not, may expedite the transition to community-wide data sharing.

  5. Coldest Place on Earth: New MODIS and Landsat 8 Thermal Data and Detailed Time Series of Cold Events

    Science.gov (United States)

    Haran, T. M.; Campbell, G. G.; Scambos, T. A.; Pope, A.

    2016-12-01

Using new thermal time-series data from MODIS Collection 6, and with detailed thermal mapping in Antarctic winter using a revised processing algorithm for Landsat 8's Band 10 data, we have regenerated our analysis of ultra-cold sites on the East Antarctic Plateau. More than 18 MODIS observations are available each day, supporting a detailed analysis of the progression of surface skin temperature toward the coldest values and the break-up of the cold pattern afterward. The close match between Aqua and Terra temperature observations provides corroboration that the record low temperatures are real and consistently mapped. Multi-day trends for a series of ultra-cold events over the MODIS record, and concurrent climate reanalysis data, provide insight into the meteorology of the cold events. These events reach temperatures lower than -93°C (-135°F or 180 K), and always occur under prolonged (tropospheric) cloud-free conditions. Winter acquisitions of Landsat 8 thermal images in 2013, 2014 and 2016 provide 100-meter resolution of the cold sites, showing in greater detail the spatial extent of the cold site areas seen in the MODIS 1 km data and their correlation with topography.

  6. Review and Assessment of ATV Observation Data for Events Characterization

    Science.gov (United States)

    Mazoue, Franck; Beck, James; Reynier, Philippe

    2011-02-01

To investigate the ATV re-entry, airborne observation campaigns were carried out at the end of the Jules Verne ATV mission to the ISS. The observations of the re-entry highlighted that the in-flight scenario was quite different from the nominal one, since two explosions occurred during the flight. In this paper, the data gathered during the observations have been analysed from the perspective of the explosion analysis. The potential explosive elements present on board have been studied and the explosion products identified. To identify the possible scenario that led to the vehicle explosion, the list of explosion products has been compared with the radiating species identified in the spectra obtained by spectroscopy during the flight. Finally, the most plausible scenario has been identified and the power available for the explosion evaluated. This shows that a non-negligible delta-V could have been produced by the explosion.

  7. Genome-wide association study for ketosis in US Jerseys using producer-recorded data.

    Science.gov (United States)

    Parker Gaddis, K L; Megonigal, J H; Clay, J S; Wolfe, C W

    2017-11-08

Ketosis is one of the most frequently reported metabolic health events in dairy herds. Several genetic analyses of ketosis in dairy cattle have been conducted; however, few have focused specifically on Jersey cattle. The objectives of this research included estimating variance components for susceptibility to ketosis and identifying genomic regions associated with ketosis in Jersey cattle. Voluntary producer-recorded health event data related to ketosis were available from Dairy Records Management Systems (Raleigh, NC). Standardization was implemented to account for the various acronyms used by producers to designate an incidence of ketosis. Events were restricted to the first reported incidence within 60 d after calving in first through fifth parities. After editing, there were a total of 42,233 records from 23,865 cows. A total of 1,750 genotyped animals were used for genomic analyses using 60,671 markers. Because of the binary nature of the trait, a threshold animal model was fitted using THRGIBBS1F90 (version 2.110) using only pedigree information, and genomic information was incorporated using a single-step genomic BLUP approach. Individual single nucleotide polymorphism (SNP) effects and the proportion of variance explained by 10-SNP windows were calculated using postGSf90 (version 1.38). Heritability of susceptibility to ketosis was 0.083 [standard deviation (SD) = 0.021] and 0.078 (SD = 0.018) in pedigree-based and genomic analyses, respectively. The marker with the largest associated effect was located on chromosome 10 at 66.3 Mbp. The 10-SNP window explaining the largest proportion of variance (0.70%) was located on chromosome 6 beginning at 56.1 Mbp. Gene Ontology (GO) and Medical Subject Heading (MeSH) enrichment analyses identified several overrepresented processes and terms related to immune function. Our results indicate that there is a genetic component related to ketosis susceptibility in Jersey cattle and, as such, genetic selection for
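As a rough illustration of the "proportion of variance explained by SNP windows" idea, here is a toy sketch: given a genotype matrix and estimated SNP effects, each window's share is the variance of that window's genetic values over the variance of the total genetic values. The disjoint-block windows, 0/1/2 genotype coding, and all numbers are assumptions for illustration; postGSf90's actual computation differs in detail (e.g., it uses sliding windows).

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def window_variance_shares(Z, u, window=10):
    """Share of total (toy) genomic variance attributed to each disjoint
    block of `window` consecutive SNPs: Var(Z_w u_w) / Var(Z u)."""
    n, p = len(Z), len(Z[0])
    g = [sum(Z[i][j] * u[j] for j in range(p)) for i in range(n)]
    total = variance(g)
    shares = []
    for start in range(0, p, window):
        cols = range(start, min(start + window, p))
        g_w = [sum(Z[i][j] * u[j] for j in cols) for i in range(n)]
        shares.append(variance(g_w) / total)
    return shares

# 4 animals x 4 SNPs coded 0/1/2; hypothetical effects only on the first two SNPs.
Z = [[0, 1, 2, 0], [1, 0, 0, 2], [2, 2, 1, 1], [0, 1, 1, 0]]
u = [1.0, 0.5, 0.0, 0.0]
print(window_variance_shares(Z, u, window=2))
```

Here the first two-SNP window carries all of the genetic variance, so its share is 1.0 and the second window's is 0.0.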

  8. Special Data Collection System (SDCS) event report, Unimak Island region, 16 May 1975

    Energy Technology Data Exchange (ETDEWEB)

    Hill, K.J.; Dawkins, M.S.; Baumstark, R.R.; Gillespie, M.D.

    1976-01-22

    Seismic data are reported from the Special Data Collection System (SDCS) and other sources for the Unimak Island Region event on 16 May 1975. Published epicenter information from seismic observations is given.

  9. Visualizing research data records for their better management

    OpenAIRE

    Ball, Alexander; Darlington, Mansur; Howard, Thomas; McMahon, Christopher; Culley, Stephen

    2012-01-01

    As academia in general, and research funders in particular, place ever greater importance on data as an output of research, so the value of good research data management practices becomes ever more apparent. In response to this, the Innovative Design and Manufacturing Research Centre (IdMRC) at the University of Bath, UK, with funding from the JISC, ran a project to draw up a data management planning regime. In carrying out this task, the ERIM (Engineering Research Information Management) Pro...

  10. Computerized system of automated recording and processing of seismic data from the Upper Silesian microseismic network

    Energy Technology Data Exchange (ETDEWEB)

    Kornowski, J.; Sokolowski, H.

    1981-05-01

This paper describes the operation of the Upper Silesian microseismic network, developed and directed by the Central Mining Institute. Seismic events are detected by the T-8100 Racal-Thermionic multi-channel system with Willmore MK II seismic detectors, and are transmitted over an FM system to the computer center, which is equipped with a PDP 11/45 minicomputer produced by the Digital Equipment Corp. The recording and processing system consists of the following stages: seismic event recording, determining the time of seismic events, detection of shocks and elimination of disturbances, assessment of the seismic energy of rocks, identifying the time of shock recording, determining the shock epicenter, and storage of information on shocks. The basic computer codes of the system are described: shock detection, determination of time, determination of shock epicenters, statistical processing, and storage. The system collects and stores information on earthquakes and shocks caused by rock bursts. (6 refs.) (In Polish)
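The "detection of shocks" stage in pipelines like this is commonly implemented with a short-term/long-term average (STA/LTA) trigger: the ratio of a short running amplitude average to a long one spikes when an impulsive arrival begins. The sketch below is a generic textbook version with assumed window lengths and threshold, not the Institute's PDP 11/45 code.

```python
def sta_lta(signal, sta_len=5, lta_len=50):
    """Short-term / long-term average ratio over a trace of samples."""
    ratios = []
    for i in range(lta_len, len(signal)):
        sta = sum(abs(x) for x in signal[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in signal[i - lta_len:i]) / lta_len
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def first_trigger(signal, threshold=4.0, sta_len=5, lta_len=50):
    """Sample index where the STA/LTA ratio first exceeds the threshold,
    or None if no trigger occurs."""
    for k, r in enumerate(sta_lta(signal, sta_len, lta_len)):
        if r >= threshold:
            return k + lta_len
    return None

# Synthetic trace: low-level noise with an impulsive arrival at sample 200.
trace = [0.1 * ((-1) ** n) for n in range(400)]
for n in range(200, 220):
    trace[n] = 5.0
print(first_trigger(trace))
```

The trigger fires one sample after the arrival enters the short window; production detectors add de-triggering logic and per-station tuning.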

  11. Interval-Censored Time-to-Event Data Methods and Applications

    CERN Document Server

    Chen, Ding-Geng

    2012-01-01

    Interval-Censored Time-to-Event Data: Methods and Applications collects the most recent techniques, models, and computational tools for interval-censored time-to-event data. Top biostatisticians from academia, biopharmaceutical industries, and government agencies discuss how these advances are impacting clinical trials and biomedical research. Divided into three parts, the book begins with an overview of interval-censored data modeling, including nonparametric estimation, survival functions, regression analysis, multivariate data analysis, competing risks analysis, and other models for interva
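A minimal worked example of the interval-censored likelihood the book is about: if event times are assumed exponential with rate λ and each time is known only to fall in an interval (L, R], the likelihood contribution is P(L < T ≤ R) = exp(−λL) − exp(−λR). The toy intervals and grid search below are illustrative assumptions, not an example from the book.

```python
import math

def exp_interval_nll(lam, intervals):
    """Negative log-likelihood for exponential event times observed only
    as censoring intervals (L, R]; R=None marks right censoring."""
    nll = 0.0
    for L, R in intervals:
        p = math.exp(-lam * L) - (math.exp(-lam * R) if R is not None else 0.0)
        nll -= math.log(p)
    return nll

# Toy data: each event known only to lie in an inspection interval.
intervals = [(0.5, 1.5), (1.0, 2.0), (0.2, 0.8), (2.0, None), (0.4, 1.2)]

grid = [0.05 * k for k in range(1, 200)]   # lambda in (0, 10)
lam_hat = min(grid, key=lambda l: exp_interval_nll(l, intervals))
print(round(lam_hat, 2))
```

Nonparametric alternatives (e.g., the Turnbull estimator covered in such texts) avoid the exponential assumption but follow the same interval-probability logic.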

  12. Using Electronic Health Records to Build an Ophthalmologic Data Warehouse and Visualize Patients' Data.

    Science.gov (United States)

    Kortüm, Karsten U; Müller, Michael; Kern, Christoph; Babenko, Alexander; Mayer, Wolfgang J; Kampik, Anselm; Kreutzer, Thomas C; Priglinger, Siegfried; Hirneiss, Christoph

    2017-06-01

To develop a near-real-time data warehouse (DW) in an academic ophthalmologic center to gain scientific use of increasing digital data from electronic medical records (EMR) and diagnostic devices. Database development. Specific macular clinic user interfaces within the institutional hospital information system were created. Orders for imaging modalities were sent by an EMR-linked picture-archiving and communications system to the respective devices. All data of 325 767 patients since 2002 were gathered in a DW running on an SQL database. A data discovery tool was developed. An exemplary search for patients with age-related macular degeneration, performed cataract surgery, and at least 10 intravitreal (excluding bevacizumab) injections was conducted. Data related to those patients (3 142 204 diagnoses [including diagnoses from other fields of medicine], 720 721 procedures [eg, surgery], and 45 416 intravitreal injections) were stored, including 81 274 optical coherence tomography measurements. A web-based browsing tool was successfully developed for data visualization and for filtering data by several linked criteria, for example, minimum number of intravitreal injections of a specific drug and visual acuity interval. The exemplary search identified 450 patients with 516 eyes meeting all criteria. A DW was successfully implemented in an ophthalmologic academic environment to support and facilitate research by using increasing EMR and measurement data. The identification of eligible patients for studies was simplified. In the future, software for decision support can be developed based on the DW and its structured data. The improved classification of diseases and semiautomatic validation of data via machine learning are warranted. Copyright © 2017 Elsevier Inc. All rights reserved.
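The exemplary cohort search described in this abstract can be sketched against a hypothetical mini-schema; the table names, column names, diagnosis codes, and the lowered injection threshold below are all invented for illustration and do not reflect the authors' warehouse schema.

```python
import sqlite3

# Hypothetical mini-schema mirroring the search: patients with an AMD
# diagnosis, a cataract surgery, and >= 3 intravitreal injections of any
# drug except bevacizumab (threshold lowered for the toy data).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE diagnoses  (patient_id INTEGER, code TEXT);
CREATE TABLE procedures (patient_id INTEGER, name TEXT);
CREATE TABLE injections (patient_id INTEGER, drug TEXT);
""")
con.executemany("INSERT INTO diagnoses VALUES (?, ?)",
                [(1, "AMD"), (2, "AMD"), (3, "glaucoma")])
con.executemany("INSERT INTO procedures VALUES (?, ?)",
                [(1, "cataract surgery"), (2, "vitrectomy")])
con.executemany("INSERT INTO injections VALUES (?, ?)",
                [(1, "ranibizumab")] * 3 + [(2, "bevacizumab")] * 5)

rows = con.execute("""
SELECT d.patient_id
FROM diagnoses d
JOIN procedures p ON p.patient_id = d.patient_id
WHERE d.code = 'AMD'
  AND p.name = 'cataract surgery'
  AND (SELECT COUNT(*) FROM injections i
       WHERE i.patient_id = d.patient_id
         AND i.drug <> 'bevacizumab') >= 3
""").fetchall()
print(rows)
```

Only patient 1 satisfies all three linked criteria; patient 2 fails on the surgery and on the bevacizumab exclusion.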

  13. Large-mass di-jet event recorded by the CMS detector (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

    This image shows a collision event with the largest-mass jet pair fulfilling all analysis requirements observed so far by the CMS detector in proton-proton collision data collected in 2016. The mass of the di-jet system is 7.7 TeV. Both jets are reconstructed in the barrel region and each have transverse momenta of over 3 TeV.

  14. Tools for predicting rainfall from lightning records: events identification and rain prediction using a Bayesian hierarchical model

    OpenAIRE

    Di Giuseppe, Edmondo; Lasinio, Giovanna Jona; Pasqui, Massimiliano; Esposito, Stanislao

    2015-01-01

    We propose a new statistical protocol for the estimation of precipitation using lightning data. We first identify rainy events using a scan statistics, then we estimate Rainfall Lighting Ratio (RLR) to convert lightning number into rain volume given the storm intensity. Then we build a hierarchical Bayesian model aiming at the prediction of 15- and 30-minutes cumulated precipitation at unobserved locations and time using information on lightning in the same area. More specifically, we build a...
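A minimal sketch of the two ingredients named in the abstract: a scan statistic that flags windows of elevated lightning counts as "rainy events", and a Rainfall-Lightning Ratio (RLR) conversion from lightning counts to rain volume. The window length, threshold, RLR value, and counts are invented for illustration; the paper's actual protocol uses a formal scan statistic and a Bayesian hierarchical model.

```python
def scan_events(counts, window=3, threshold=10):
    """Toy scan statistic: flag every window of consecutive time steps
    whose total lightning count meets the threshold."""
    hits = []
    for start in range(len(counts) - window + 1):
        total = sum(counts[start:start + window])
        if total >= threshold:
            hits.append((start, total))
    return hits

def rain_from_lightning(strikes, rlr=0.5):
    """Convert a lightning count to a rain volume with a fixed,
    hypothetical Rainfall-Lightning Ratio."""
    return rlr * strikes

counts = [0, 1, 0, 2, 6, 7, 4, 1, 0, 0]
events = scan_events(counts)
print(events, rain_from_lightning(sum(n for _, n in events)))
```

Overlapping windows over the same burst are flagged separately here; a real implementation would merge them into one event before the rain conversion.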

  15. Di-photon events recorded by the CMS detector (Run 2, 13 TeV, 0 T)

    CERN Multimedia

    Mc Cauley, Thomas

    2016-01-01

This image shows a collision event with a photon pair observed by the CMS detector in proton-proton collision data collected in 2015 with no magnetic field present. The energy deposits of the two photons are represented by the two large green towers. The mass of the di-photon system is between 700 and 800 GeV. The candidates are consistent with what is expected for prompt isolated photons.

  16. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, the data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace the development of these events in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis' visit to the US) in the New York City–Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  17. 10 CFR 602.19 - Records and data.

    Science.gov (United States)

    2010-01-01

    ... cycles or conditions and rules for adding or deleting information; and (5) Detail instrument calibration... reduction methods, and other activities relevant to data collection and assembly. (c) Recipients agree to...

  18. The Person-Event Data Environment: leveraging big data for studies of psychological strengths in soldiers.

    Science.gov (United States)

    Vie, Loryana L; Griffith, Kevin N; Scheier, Lawrence M; Lester, Paul B; Seligman, Martin E P

    2013-01-01

    The Department of Defense (DoD) strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE) was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of both economic and behavioral science perspectives elucidating empirically-based studies examining relations between psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of personnel, financial, medical, training and education, deployment, and security systems that influence Army-wide policies and procedures.

  19. Consistency of denominator data in electronic health records in Australian primary healthcare services: enhancing data quality.

    Science.gov (United States)

    Bailie, Ross; Bailie, Jodie; Chakraborty, Amal; Swift, Kevin

    2015-01-01

The quality of data derived from primary healthcare electronic systems has been subjected to little critical systematic analysis, especially in relation to the purported benefits and substantial investment in electronic information systems in primary care. Many indicators of quality of care are based on numbers of certain types of patients as denominators. Consistency of denominator data is vital for comparison of indicators over time and between services. This paper examines the consistency of denominator data extracted from electronic health records (EHRs) for monitoring of access and quality of primary health care. Data collection and analysis were conducted as part of a prospective mixed-methods formative evaluation of the Commonwealth Government's Indigenous Chronic Disease Package. Twenty-six general practices and 14 Aboriginal Health Services (AHSs) located in all Australian States and Territories and in urban, regional and remote locations were purposively selected within geographically defined locations. Percentage change in reported number of regular patients in general practices ranged between -50% and 453% (average 37%). The corresponding figure for AHSs was 1% to 217% (average 31%). In approximately half of general practices and AHSs, the change was ≥ 20%. There were similarly large changes in reported numbers of patients with a diagnosis of diabetes or coronary heart disease (CHD), and Indigenous patients. Inconsistencies in reported numbers were due primarily to the limited capability of staff in many general practices and AHSs to accurately enter, manage, and extract data from EHRs. The inconsistencies in data required for the calculation of many key indicators of access and quality of care place serious constraints on the meaningful use of data extracted from EHRs. There is a need for greater attention to the quality of denominator data in order to realise the potential benefits of EHRs for patient care, service planning, improvement, and policy. We

  20. Constructing the Web of Events from Raw Data in the Web of Things

    Directory of Open Access Journals (Sweden)

    Yunchuan Sun

    2014-01-01

Full Text Available An exciting paradise of data is emerging into our daily life along with the development of the Web of Things. Nowadays, volumes of heterogeneous raw data are continuously generated and captured by trillions of smart devices like sensors, smart controls, readers, and other monitoring devices, while various events occur in the physical world. It is hard for users, including people and smart things, to master the valuable information hidden in the massive data; such information is more useful and understandable than raw data and helps users get to the crucial points for problem-solving. Thus, how to automatically and actively extract the knowledge of events and their internal links from big data is one key challenge for the future Web of Things. This paper proposes an effective approach to extract events and their internal links from large-scale data leveraging predefined event schemas in the Web of Things, which starts with grasping the critical data for useful events by filtering data with well-defined event types in the schema. A case study in the context of a smart campus is presented to show the application of the proposed approach for the extraction of events and their internal semantic links.
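The schema-driven filtering step described in this abstract can be sketched as follows: each event type declares the raw-data fields it needs, and raw records that lack those fields (or match no schema entry) are dropped early. The event types, sensor names, field names, and records below are hypothetical, not from the paper.

```python
# Hypothetical event schema: each event type names the sensor it comes
# from and the raw-data fields it requires.
SCHEMA = {
    "door_open":  {"sensor": "door",  "fields": {"room", "time"}},
    "temp_alert": {"sensor": "therm", "fields": {"room", "time", "celsius"}},
}

def extract_events(raw_records):
    """Filter raw records against the schema and emit typed events."""
    events = []
    for rec in raw_records:
        for etype, spec in SCHEMA.items():
            if rec.get("sensor") == spec["sensor"] and spec["fields"] <= rec.keys():
                events.append({"type": etype, **{k: rec[k] for k in spec["fields"]}})
    return events

raw = [
    {"sensor": "door",  "room": "B12", "time": 1},
    {"sensor": "therm", "room": "B12", "time": 2, "celsius": 31},
    {"sensor": "therm", "room": "B12"},             # incomplete -> dropped
    {"sensor": "cam",   "room": "B12", "time": 3},  # no schema -> dropped
]
events = extract_events(raw)
print([e["type"] for e in events])
```

Linking the extracted events (the paper's "internal semantic links") would then operate on these typed records rather than on the raw stream.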

  1. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB.

    Science.gov (United States)

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED
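The kind of cross-dataset question MOBBED is built for ("what types of events co-occur?") can be sketched in memory. The event tuples, type names, and time window below are invented for illustration; real MOBBED queries go through its PostgreSQL-backed MATLAB toolkit rather than a loop like this.

```python
from collections import Counter

# Hypothetical event records: (dataset, event_type, onset_seconds).
events = [
    ("s01", "stimulus", 1.0), ("s01", "blink", 1.2), ("s01", "stimulus", 5.0),
    ("s02", "stimulus", 2.0), ("s02", "blink", 2.1), ("s02", "blink", 9.0),
]

def cooccurrences(events, a, b, window=0.5):
    """Count, per dataset, how often an event of type `b` starts within
    `window` seconds after an event of type `a`."""
    counts = Counter()
    for ds, typ, t in events:
        if typ != a:
            continue
        for ds2, typ2, t2 in events:
            if ds2 == ds and typ2 == b and 0 <= t2 - t <= window:
                counts[ds] += 1
    return counts

print(cooccurrences(events, "stimulus", "blink"))
```

Answering this without a database would require re-reading every dataset from disk; an indexed event table makes it a single query.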

  2. MOBBED: A computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB

    Directory of Open Access Journals (Sweden)

Jeremy Cockfield

    2013-10-01

Full Text Available Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed.

  3. NIMS EXPERIMENT DATA RECORDS: EARTH/MOON 1 AND 2 ENCOUNTERS

    Data.gov (United States)

    National Aeronautics and Space Administration — NIMS Experiment Data Record (EDR) files contain raw data from the Galileo Orbiter Near-Infrared Mapping Spectrometer (CARLSONETAL1992). This raw data requires...

  4. NOAA/NSIDC Climate Data Record of Passive Microwave Sea Ice Concentration

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set provides a Climate Data Record (CDR) of sea ice concentration from passive microwave data. It provides a consistent, daily and monthly time series of...

  5. JUNO MWR CRUISE/SKY EDR DATA RECORDS V1.2

    Data.gov (United States)

    National Aeronautics and Space Administration — The Juno MWR EDR data set will ultimately include all uncalibrated MWR science data records for the entire Juno mission. This volume will contain only those data...

  6. MESSENGER H XRS 5 REDUCED DATA RECORD (RDR) FOOTPRINTS V1.0

    Data.gov (United States)

National Aeronautics and Space Administration — This data set consists of the MESSENGER XRS reduced data record (RDR) footprints which are derived from the navigational meta-data for each...

  7. Comparison between dairy cow disease incidence in data registered by farmers and in data from a disease-recording system based on veterinary reporting.

    Science.gov (United States)

    Mörk, M; Lindberg, A; Alenius, S; Vågsholm, I; Egenvall, A

    2009-04-01

    Sweden has a national disease-recording system based on veterinary reporting. From this system, all cattle-disease records are transferred to the dairy industry cattle database (DDD) where they are used for several purposes including research and dairy-health statistics. Our objective was to evaluate the completeness of this data source by comparing it with disease data registered by dairy farmers. The proportion of veterinary-treated disease events was estimated, by diagnosis. Disease incidence in the DDD was compared, by diagnosis and age, with disease data registered by the farmers. Comparison was made, by diagnosis, for (i) all disease events and (ii) those reported as veterinary-treated. Disease events, defined as "observed deviations in health, from the normal" were recorded by the farmers during January, April, July and October 2004. For the diagnoses calving problems, peripartum disorders, puerperal paresis and retained placenta, incidence proportions (IP) with 95% confidence intervals (CIs) were estimated. For all other disease problems, incidence rates (IR) were used. In total, 177 farmers reported at least 1 month and 148 reported all 4 months. Fifty-four percent of all disease events in the farmers' data were reported as veterinary-treated. For several of the most common diagnoses, the IRs and IPs for all events were significantly higher in farmers' data than in the DDD. Examples are, in cows: clinical mastitis, cough, gastro-intestinal disorders and lameness in hoof and limb; and in young stock: cough and gastro-intestinal disorders. For veterinary-treated events only, significant differences with higher IR in the farmers' data were found in young stock for sporadic cough and sporadic gastro-intestinal disorders. The diagnosis "other disorders" had significantly more events in the DDD than in farmers' data, i.e. veterinarians tended to choose more unspecific diagnoses than the farmers. This result indicates that the true completeness is likely to be
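    As a rough illustration of the incidence measures compared above, an incidence rate with a normal-approximation 95% CI can be computed as follows (the counts are hypothetical, and the study's exact CI method is not stated in the abstract):

    ```python
    import math

    def incidence_rate_ci(events: int, animal_years: float, z: float = 1.96):
        """Incidence rate per 100 animal-years with a normal-approximation
        95% CI (a simplification, not necessarily the paper's method)."""
        ir = events / animal_years
        se = math.sqrt(events) / animal_years      # Poisson SE of the rate
        lo, hi = max(ir - z * se, 0.0), ir + z * se
        return ir * 100, lo * 100, hi * 100

    # Hypothetical counts: 40 mastitis events over 500 cow-years at risk.
    ir, lo, hi = incidence_rate_ci(40, 500)
    print(f"IR = {ir:.1f} per 100 cow-years (95% CI {lo:.1f}-{hi:.1f})")
    ```

    Comparing such intervals between the farmers' records and the DDD is what lets the study call a difference in incidence "significant".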

  8. Use of Electronic Health Data to Estimate Heart Failure Events in a Population-Based Cohort with CKD

    Science.gov (United States)

    Wellman, Robert; Fuller, Sharon; Bansal, Nisha; Psaty, Bruce M.; de Boer, Ian H.; Scholes, Delia

    2016-01-01

    Background and objectives Studies that use electronic health data typically identify heart failure (HF) events from hospitalizations with a principal diagnosis of HF. This approach may underestimate the total burden of HF among persons with CKD. We assessed the accuracy of algorithms for identifying validated HF events from hospitalizations and outpatient encounters, and we used this validation information to estimate the rate of HF events in a large CKD population. Design, setting, participants, & measurements We identified a cohort of 15,141 adults age 18–89 years with an eGFR<60 ml/min per 1.73 m2 from 2008 to 2011. Potential HF events during follow-up were randomly sampled for validation with medical record review. Positive predictive values from the validation study were used to estimate the rate of validated HF events in the full cohort. Results A total of 1864 participants had at least one health care encounter that qualified as a potential HF event during 2.7 years of mean follow-up. Among 313 potential events that were randomly sampled for validation, positive predictive values were 92% for hospitalizations with a principal diagnosis of HF, 32% for hospitalizations with a secondary diagnosis of HF, and 70% for qualifying outpatient HF encounters. Through use of this validation information in the full cohort, the rate of validated HF events estimated from the most comprehensive algorithm that included principal and secondary diagnosis hospitalizations and outpatient encounters was 35.2 events/1000 person-years (95% confidence interval, 33.1 to 37.4), compared with 9.5 events/1000 person-years (95% confidence interval, 8.7 to 10.5) from the algorithm that included only principal diagnosis hospitalizations. Outpatient encounters accounted for 20% of the total number of validated HF events. Conclusions In studies that rely on electronic health data, algorithms that include hospitalizations with a secondary diagnosis of HF and outpatient HF encounters more
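    The PPV-weighting idea described above can be sketched as follows; the PPVs are taken from the abstract, while the per-category event counts and the use of cohort size times mean follow-up for person-years are hypothetical simplifications:

    ```python
    # Scale the count of potential events in each encounter category by
    # that category's positive predictive value from the validation
    # sample, then convert to a rate per 1000 person-years.
    ppv = {"principal_dx": 0.92, "secondary_dx": 0.32, "outpatient": 0.70}
    potential = {"principal_dx": 400, "secondary_dx": 900, "outpatient": 350}
    person_years = 15141 * 2.7     # cohort size x mean follow-up (years)

    validated = sum(potential[k] * ppv[k] for k in ppv)
    rate = validated / person_years * 1000
    print(f"{validated:.0f} validated events, {rate:.1f} per 1000 person-years")
    ```

    With this weighting, low-PPV categories (secondary diagnoses) still contribute events, but heavily discounted.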

  9. Design, development and experimental validation of a non-invasive device for recording respiratory events during bottle feeding.

    Science.gov (United States)

    Cavaiola, C; Tamilia, E; Massaroni, C; Morbidoni, G; Schena, E; Formica, D; Taffoni, F

    2014-01-01

    In newborns, poor coordination between sucking, swallowing and breathing may undermine the effectiveness of oral feeding and signal immaturity of the Central Nervous System. The aim of this work is to develop and validate a non-invasive device for recording respiratory events of newborns during bottle feeding. The proposed device's working principle is based on the convective heat exchanged between two hot bodies and the infant's breathing. The sensing elements are inserted into a duct, and the gas exchanged by the infant is conveyed into this duct by an ad hoc system designed to be mounted on a commercial feeding bottle. Two sets of experiments were carried out to investigate the discrimination threshold of the device and to characterize the sensor response to oscillating flows. The effects of the distance and tilt between the nostrils and the device, and of the breathing frequency, were investigated by simulating nostrils and a neonatal respiratory pattern. The device has a discrimination threshold lower than 0.5 L/min at both 10° and 20° of tilt. Distance for these two settings does not affect the threshold in the investigated range (10-20 mm). Moreover, the device is able to detect breathing events, and to discriminate the onset of the expiratory phase, during a neonatal respiratory task delivered by a lung simulator. These results support the application of this device to the non-invasive assessment of the temporal breathing pattern of newborns during bottle feeding.

  10. What is an Appropriate Temporal Sampling Rate to Record Floating Car Data with a GPS?

    NARCIS (Netherlands)

    Ranacher, P.; Brunauer, R.; van der Spek, S.C.; Reich, S.

    2016-01-01

    Floating car data (FCD) recorded with the Global Positioning System (GPS) are an important data source for traffic research. However, FCD are subject to error, which can relate either to the accuracy of the recordings (measurement error) or to the temporal rate at which the data are sampled

  11. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Validation and data integrity of records... INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity of records in the system. (a) The FBI will be responsible for maintaining data integrity during all...

  12. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events, flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification patterns over diverse information streams, and integrated pattern detection over realtime and historical events. Existing work on CEP has been limited to relational query patterns, and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
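    The temporal-boundary idea for unifying archived and streamed events can be illustrated with a toy query (an assumption-laden sketch of the general technique, not SCEPter's actual API):

    ```python
    from typing import Callable, Dict, Iterable

    Event = Dict[str, float]

    def unified_query(archive: Iterable[Event], stream: Iterable[Event],
                      boundary: float, pred: Callable[[Event], bool]):
        """Apply one pattern predicate over both an event repository and a
        live stream, split at a temporal boundary so that an event stored
        in both places is matched exactly once."""
        for e in archive:
            if e["t"] < boundary and pred(e):   # past events only
                yield e
        for e in stream:
            if e["t"] >= boundary and pred(e):  # present/future events only
                yield e

    archive = [{"t": 1.0, "load": 80}, {"t": 2.0, "load": 97}]
    stream  = [{"t": 2.0, "load": 97}, {"t": 3.0, "load": 99}]  # overlaps archive
    hits = list(unified_query(archive, stream, boundary=2.0,
                              pred=lambda e: e["load"] > 95))
    print(hits)  # the t=2.0 event appears once, not twice
    ```

    The boundary plays the role of the stream/repository handoff the abstract mentions: without it, the overlapping event would be reported twice.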

  13. A Fundamental Climate Data Record of Intercalibrated Brightness Temperature Data from SSM/I and SSMIS

    Science.gov (United States)

    Sapiano, M. R. P.; Berg, W. K.; McKague, D.; Kummerow, C. D.

    2012-04-01

    The first Special Sensor Microwave/Imager (SSM/I) was launched in June 1987 on the Defense Meteorological Satellite Program's (DMSP) F08 spacecraft and started what is now a nearly continuous 24-year record of passive microwave imager data that can be used to monitor the climate system. This includes such fields as precipitation (over both land and ocean), the extent of sea ice and snow, sea ice concentration, total precipitable water, cloud liquid water, and surface wind speed over oceans. A total of nine window channel radiometers have been launched to date in the DMSP series including the SSM/I instrument on board F08, F10, F11, F13, F14, and F15 followed by the Special Sensor Microwave Imager/Sounder (SSMIS) on board F16, F17, and F18, which is expected to operate for at least the next decade. As a result, this data record provides the best available source of long-term global observations of several hydrological variables for climate applications. Although the DMSP sensors provide a long-term record, because the sensors were developed for operational use there are a number of issues that must be addressed to produce a dataset suitable for use in climate applications. There are several quality control and calibration issues including, but not limited to, quality control of the original antenna temperatures, geolocation, cross-track bias corrections, solar and lunar intrusion issues and emissive antennas. The goal of producing an FCDR of brightness temperature data involves not only addressing many of these instrument issues, but also developing a well-documented, transparent approach that allows for subsequent improvements as well as a framework for incorporating future sensors. Once the data have been quality controlled and various calibration corrections have been applied, the goal is to adjust the calibration of the various sensors so that they are physically consistent.
Such intercalibration does not correct for changes due to local observing time, which

  14. Integrating rapid risk mapping and mobile phone call record data for strategic malaria elimination planning

    National Research Council Canada - National Science Library

    Tatem, Andrew J; Huang, Zhuojie; Narib, Clothilde; Kumar, Udayan; Kandula, Deepika; Pindolia, Deepa K; Smith, David L; Cohen, Justin M; Graupe, Bonita; Uusiku, Petrina; Lourenço, Christopher

    2014-01-01

    .... Here, using the example of Namibia, a method for targeting of interventions using surveillance data, satellite imagery, and mobile phone call records to support elimination planning is described...

  15. Using data to attribute episodes of warming and cooling in instrumental records

    Science.gov (United States)

    Tung, Ka-Kit; Zhou, Jiansong

    2013-01-01

    The observed global-warming rate has been nonuniform, and the cause of each episode of slowing in the expected warming rate is the subject of intense debate. To explain this, nonrecurrent events have commonly been invoked for each episode separately. After reviewing evidence in both the latest global data (HadCRUT4) and the longest instrumental record, Central England Temperature, a revised picture is emerging that gives a consistent attribution for each multidecadal episode of warming and cooling in recent history, and suggests that the anthropogenic global warming trends might have been overestimated by a factor of two in the second half of the 20th century. A recurrent multidecadal oscillation is found to extend to the preindustrial era in the 353-y Central England Temperature and is likely an internal variability related to the Atlantic Multidecadal Oscillation (AMO), possibly caused by the thermohaline circulation variability. The perspective of a long record helps in quantifying the contribution from internal variability, especially one with a period so long that it is often confused with secular trends in shorter records. Solar contribution is found to be minimal for the second half of the 20th century and less than 10% for the first half. The underlying net anthropogenic warming rate in the industrial era is found to have been steady since 1910 at 0.07–0.08 °C/decade, with superimposed AMO-related ups and downs that included the early 20th century warming, the cooling of the 1960s and 1970s, the accelerated warming of the 1980s and 1990s, and the recent slowing of the warming rates. Quantitatively, the recurrent multidecadal internal variability, often underestimated in attribution studies, accounts for 40% of the observed recent 50-y warming trend. PMID:23345448

  17. Using gamification to drive patient’s personal data validation in a Personal Health Record

    Directory of Open Access Journals (Sweden)

    Guido Giunti

    2015-10-01

    Full Text Available Gamification is a term used to describe the use of game elements in non-game environments to enhance user experience. It has been incorporated with commercial success into several platforms (LinkedIn, Badgeville, Facebook); this has led some researchers to theorize that it could also be used in education as a tool to increase student engagement and to drive desirable learning behaviors. While in the past years some game elements have been incorporated into healthcare, there is still little evidence on how effective they are. Game elements provide engagement consistent with various theories of motivation and positive psychology (e.g., flow), and also provide instant feedback. Feedback is more effective when it provides sufficient and specific information for goal achievement and is presented relatively close in time to the event being evaluated. Feedback can reference individual progress, can make social comparisons, or can refer to task criteria. Electronic personal health record systems (PHRs) support patient-centered healthcare by making medical records and other relevant information accessible to patients, thus assisting patients in health self-management. A particularly difficult data set to capture is that concerning social and cultural background information. This data set is not only useful for healthcare system management; it is also relevant for epidemiological and preventive purposes. We used gamified mechanics involving instant feedback to test whether they would increase patients' personal data validation and completion in our PHR, as well as overall PHR use. In our presentation we will describe our results and the story behind them.

  18. Assessing the accuracy of opioid overdose and poisoning codes in diagnostic information from electronic health records, claims data, and death records.

    Science.gov (United States)

    Green, Carla A; Perrin, Nancy A; Janoff, Shannon L; Campbell, Cynthia I; Chilcoat, Howard D; Coplan, Paul M

    2017-05-01

    The purpose of this study is to assess the positive predictive value (PPV), relative to medical chart review, of International Classification of Diseases (ICD)-9/10 diagnostic codes for identifying opioid overdoses and poisonings. Data were obtained from Kaiser Permanente Northwest and Northern California. Diagnostic data from electronic health records, submitted claims, and state death records from Oregon, Washington, and California were linked. Individual opioid-related poisoning codes (e.g., 965.xx and X42), and adverse effects of opioids codes (e.g., E935.xx) combined with diagnoses possibly indicative of overdoses (e.g., respiratory depression), were evaluated by comparison with chart audits. Opioid adverse effects codes had low PPV to detect overdoses (13.4%) as assessed in 127 charts and were not pursued. Instead, opioid poisoning codes were assessed in 2100 individuals who had those codes present in electronic health records between 2008 and 2012. Of these, 10/2100 had no available information and 241/2100 were excluded as potentially anesthesia-related. Among the 1849 remaining individuals with opioid poisoning codes, 1495 events were accurately identified as opioid overdoses; 69 were miscodes or misidentified, and 285 were opioid adverse effects, not overdoses. Thus, PPV was 81%. Opioid adverse effects or overdoses were accurately identified in 1780 of 1849 events (96.3%). Opioid poisoning codes have a predictive value of 81% to identify opioid overdoses, suggesting ICD opioid poisoning codes can be used to monitor overdose rates and evaluate interventions to reduce overdose. Further research to assess sensitivity, specificity, and negative predictive value is ongoing. Copyright © 2017 John Wiley & Sons, Ltd.
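    The abstract's headline percentages follow directly from the reported chart-review counts:

    ```python
    # Of 1849 chart-reviewed events carrying opioid poisoning codes,
    # 1495 were true overdoses and another 285 were opioid adverse
    # effects (figures from the abstract).
    reviewed = 1849
    true_overdose = 1495
    adverse_effect = 285

    ppv_overdose = true_overdose / reviewed
    ppv_overdose_or_ae = (true_overdose + adverse_effect) / reviewed
    print(f"PPV (overdose) = {ppv_overdose:.0%}")                          # ~81%
    print(f"PPV (overdose or adverse effect) = {ppv_overdose_or_ae:.1%}")  # ~96.3%
    ```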

  19. Tidal analysis of data recorded by a superconducting gravimeter

    Directory of Open Access Journals (Sweden)

    F. Palmonari

    1995-06-01

    Full Text Available A superconducting gravimeter was used to monitor the tidal signal for a period of five months. The instrument was placed in a site (Brasimone station, Italy) characterized by a low noise level, and was calibrated with a precision of 0.2%. Tidal analysis on hourly data was then performed and the results are presented in this paper: amplitudes, gravimetric factors, and phase differences for the main tidal waves M2, S2, N2, O1, P1, K1, and Q1 were calculated, together with the barometric pressure admittance and the long-term instrumental drift.

  20. Top-quark mass measurement in events with jets and missing transverse energy using the full CDF data set

    CERN Document Server

    Aaltonen, T.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Appel, J.A.; Arisawa, T.; Artikov, A.; Asaadi, J.; Ashmanskas, W.; Auerbach, B.; Aurisano, A.; Azfar, F.; Badgett, W.; Bae, T.; Barbaro-Galtieri, A.; Barnes, V.E.; Barnett, B.A.; Barria, P.; Bartos, P.; Bauce, M.; Bedeschi, F.; Behari, S.; Bellettini, G.; Bellinger, J.; Benjamin, D.; Beretvas, A.; Bhatti, A.; Bland, K.R.; Blumenfeld, B.; Bocci, A.; Bodek, A.; Bortoletto, D.; Boudreau, J.; Boveia, A.; Brigliadori, L.; Bromberg, C.; Brucken, E.; Budagov, J.; Budd, H.S.; Burkett, K.; Busetto, G.; Bussey, P.; Butti, P.; Buzatu, A.; Calamba, A.; Camarda, S.; Campanelli, M.; Canelli, F.; Carls, B.; Carlsmith, D.; Carosi, R.; Carrillo, S.; Casal, B.; Casarsa, M.; Castro, A.; Catastini, P.; Cauz, D.; Cavaliere, V.; Cavalli-Sforza, M.; Cerri, A.; Cerrito, L.; Chen, Y.C.; Chertok, M.; Chiarelli, G.; Chlachidze, G.; Cho, K.; Chokheli, D.; Ciocci, M.A.; Clark, A.; Clarke, C.; Convery, M.E.; Conway, J.; Corbo, M.; Cordelli, M.; Cox, C.A.; Cox, D.J.; Cremonesi, M.; Cruz, D.; Cuevas, J.; Culbertson, R.; d'Ascenzo, N.; Datta, M.; de Barbaro, P.; Demortier, L.; Deninno, M.; d'Errico, M.; Devoto, F.; Di Canto, A.; Di Ruzza, B.; Dittmann, J.R.; D'Onofrio, M.; Donati, S.; Dorigo, M.; Driutti, A.; Ebina, K.; Edgar, R.; Elagin, A.; Erbacher, R.; Errede, S.; Esham, B.; Eusebi, R.; Farrington, S.; Fernandez Ramos, J.P.; Field, R.; Flanagan, G.; Forrest, R.; Franklin, M.; Freeman, J.C.; Frisch, H.; Funakoshi, Y.; Garfinkel, A.F.; Garosi, P.; Gerberich, H.; Gerchtein, E.; Giagu, S.; Giakoumopoulou, V.; Gibson, K.; Ginsburg, C.M.; Giokaris, N.; Giromini, P.; Giurgiu, G.; Glagolev, V.; Glenzinski, D.; Gold, M.; Goldin, D.; Golossanov, A.; Gomez, G.; Gomez-Ceballos, G.; Goncharov, M.; Gonzalez Lopez, O.; Gorelov, I.; Goshaw, A.T.; Goulianos, K.; Gramellini, E.; Grinstein, S.; Grosso-Pilcher, C.; Group, R.C.; Guimaraes da Costa, J.; Hahn, S.R.; Han, J.Y.; Happacher, F.; Hara, K.; Hare, M.; Harr, R.F.; 
Harrington-Taber, T.; Hatakeyama, K.; Hays, C.; Heinrich, J.; Herndon, M.; Hocker, A.; Hong, Z.; Hopkins, W.; Hou, S.; Hughes, R.E.; Husemann, U.; Hussein, M.; Huston, J.; Introzzi, G.; Iori, M.; Ivanov, A.; James, E.; Jang, D.; Jayatilaka, B.; Jeon, E.J.; Jindariani, S.; Jones, M.; Joo, K.K.; Jun, S.Y.; Junk, T.R.; Kambeitz, M.; Kamon, T.; Karchin, P.E.; Kasmi, A.; Kato, Y.; Ketchum, W.; Keung, J.; Kilminster, B.; Kim, D.H.; Kim, H.S.; Kim, J.E.; Kim, M.J.; Kim, S.B.; Kim, S.H.; Kim, Y.J.; Kim, Y.K.; Kimura, N.; Kirby, M.; Knoepfel, K.; Kondo, K.; Kong, D.J.; Konigsberg, J.; Kotwal, A.V.; Kreps, M.; Kroll, J.; Kruse, M.; Kuhr, T.; Kurata, M.; Laasanen, A.T.; Lammel, S.; Lancaster, M.; Lannon, K.; Latino, G.; Lee, H.S.; Lee, J.S.; Leo, S.; Leone, S.; Lewis, J.D.; Limosani, A.; Lipeles, E.; Lister, A.; Liu, H.; Liu, Q.; Liu, T.; Lockwitz, S.; Loginov, A.; Luca, A.; Lucchesi, D.; Lueck, J.; Lujan, P.; Lukens, P.; Lungu, G.; Lys, J.; Lysak, R.; Madrak, R.; Maestro, P.; Malik, S.; Manca, G.; Manousakis-Katsikakis, A.; Margaroli, F.; Marino, P.; Martinez, M.; Matera, K.; Mattson, M.E.; Mazzacane, A.; Mazzanti, P.; McNulty, R.; Mehta, A.; Mehtala, P.; Mesropian, C.; Miao, T.; Mietlicki, D.; Mitra, A.; Miyake, H.; Moed, S.; Moggi, N.; Moon, C.S.; Moore, R.; Morello, M.J.; Mukherjee, A.; Muller, Th.; Murat, P.; Mussini, M.; Nachtman, J.; Nagai, Y.; Naganoma, J.; Nakano, I.; Napier, A.; Nett, J.; Neu, C.; Nigmanov, T.; Nodulman, L.; Noh, S.Y.; Norniella, O.; Oakes, L.; Oh, S.H.; Oh, Y.D.; Oksuzian, I.; Okusawa, T.; Orava, R.; Ortolan, L.; Pagliarone, C.; Palencia, E.; Palni, P.; Papadimitriou, V.; Parker, W.; Pauletta, G.; Paulini, M.; Paus, C.; Phillips, T.J.; Piacentino, G.; Pianori, E.; Pilot, J.; Pitts, K.; Plager, C.; Pondrom, L.; Poprocki, S.; Potamianos, K.; Pranko, A.; Prokoshin, F.; Ptohos, F.; Punzi, G.; Ranjan, N.; Redondo Fernandez, I.; Renton, P.; Rescigno, M.; Rimondi, F.; Ristori, L.; Robson, A.; Rodriguez, T.; Rolli, S.; Ronzani, M.; Roser, R.; Rosner, J.L.; 
Ruffini, F.; Ruiz, A.; Russ, J.; Rusu, V.; Sakumoto, W.K.; Sakurai, Y.; Santi, L.; Sato, K.; Saveliev, V.; Savoy-Navarro, A.; Schlabach, P.; Schmidt, E.E.; Schwarz, T.; Scodellaro, L.; Scuri, F.; Seidel, S.; Seiya, Y.; Semenov, A.; Sforza, F.; Shalhout, S.Z.; Shears, T.; Shepard, P.F.; Shimojima, M.; Shochet, M.; Shreyber-Tecker, I.; Simonenko, A.; Sinervo, P.; Sliwa, K.; Smith, J.R.; Snider, F.D.; Song, H.; Sorin, V.; Stancari, M.; St. Denis, R.; Stelzer, B.; Stelzer-Chilton, O.; Stentz, D.; Strologas, J.; Sudo, Y.; Sukhanov, A.; Suslov, I.; Takemasa, K.; Takeuchi, Y.; Tang, J.; Tecchio, M.; Teng, P.K.; Thom, J.; Thomson, E.; Thukral, V.; Toback, D.; Tokar, S.; Tollefson, K.; Tomura, T.; Tonelli, D.; Torre, S.; Torretta, D.; Totaro, P.; Trovato, M.; Ukegawa, F.; Uozumi, S.; Vazquez, F.; Velev, G.; Vellidis, C.; Vernieri, C.; Vidal, M.; Vilar, R.; Vizan, J.; Vogel, M.; Volpi, G.; Wagner, P.; Wallny, R.; Wang, S.M.; Warburton, A.; Waters, D.; Wester, W.C., III; Whiteson, D.; Wicklund, A.B.; Wilbur, S.; Williams, H.H.; Wilson, J.S.; Wilson, P.; Winer, B.L.; Wittich, P.; Wolbers, S.; Wolfe, H.; Wright, T.; Wu, X.; Wu, Z.; Yamamoto, K.; Yamato, D.; Yang, T.; Yang, U.K.; Yang, Y.C.; Yao, W.M.; Yeh, G.P.; Yi, K.; Yoh, J.; Yorita, K.; Yoshida, T.; Yu, G.B.; Yu, I.; Zanetti, A.M.; Zeng, Y.; Zhou, C.; Zucchelli, S.

    2013-07-01

    We present a measurement of the top-quark mass using the full data set of Tevatron $\sqrt{s} = 1.96$ TeV proton-antiproton collisions recorded by the CDF II detector, corresponding to an integrated luminosity of 8.7 fb$^{-1}$. The analysis uses events with one semileptonic $t$ or $\bar{t}$ decay, but without detection of the electron or muon. We select events with significant missing transverse energy and multiple jets. We veto events containing identified electrons or muons. We obtain distributions of the top-quark masses and the invariant mass of the two jets from $W$-boson decays from data and compare these to templates derived from signal and background samples to extract the top-quark mass and the energy scale of the calorimeter jets with in situ calibration. A likelihood fit of the templates from signal and background events to the data yields the top-quark mass, $M_{\mathrm{top}} = 173.93 \pm 1.64\,\mathrm{(stat)} \pm 0.87\,\mathrm{(syst)}$ GeV/$c^2$. This result is the most precise measurement to date of the mass of the top quark in t...

  1. Comparison of Two Algorithms for the Reduction of Noise Events in Orbital Optical Lightning Data

    Science.gov (United States)

    Mach, D. M.; Bateman, M. G.

    2015-12-01

    The Geostationary Lightning Mapper (GLM) will be launched as part of the GOES-R satellite. The GLM on-board software is designed to allow a large amount of non-lightning data (noise) to be transmitted to the ground. This is done so that the ground software can remove the noise while at the same time preserving the weak lightning events in the data stream. The ground software has a noise filter that removes events that do not meet certain coherency requirements. The filter utilizes the location, time, and amplitude of the event, along with the background level to determine the likelihood that the event is actually due to lightning and not noise. Due to various constraints, the technique does not use all relevant information to determine the likelihood that the event is actually due to lightning. A more complete filter that uses the full extent of the information available, including clustering results, should produce better results. However, the extra coding and complexity needed to implement the full clustering based filter may not justify the slightly better results. To test the various filter options, this study uses GLM proxy data generated from numerous ground based and orbital sources that mimic the expected characteristics of the GLM lightning event data. These proxy data sets have noise data added, again based on the expected characteristics of GLM noise data. The full proxy data sets (noise and lightning) are filtered by the two methods and the results compared.
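    A minimal coherency filter of the kind described (a sketch of the general idea only, not the actual GLM ground software, and with made-up window sizes) might keep only events that have a spatio-temporal neighbor:

    ```python
    # Lightning produces clustered optical events; shot noise tends not to.
    # Keep an event only if another event occurs nearby in space and time.
    # Events are (x_pixel, y_pixel, t_seconds) tuples; dxy and dt are
    # illustrative window sizes, not GLM parameters.
    def coherency_filter(events, dxy=2, dt=0.002):
        kept = []
        for i, (x, y, t) in enumerate(events):
            for j, (x2, y2, t2) in enumerate(events):
                if i != j and abs(x - x2) <= dxy and abs(y - y2) <= dxy \
                        and abs(t - t2) <= dt:
                    kept.append((x, y, t))
                    break
        return kept

    events = [(10, 10, 0.000), (11, 10, 0.001),   # coherent pair (lightning-like)
              (50, 70, 0.500)]                    # isolated event (noise-like)
    print(coherency_filter(events))  # isolated event removed
    ```

    The quadratic scan is fine for a sketch; a production filter would index events by time and location first.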

  2. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations.

    Science.gov (United States)

    Arndt, Brian G; Beasley, John W; Watkinson, Michelle D; Temte, Jonathan L; Tuan, Wen-Jan; Sinsky, Christine A; Gilchrist, Valerie J

    2017-09-01

    Primary care physicians spend nearly 2 hours on electronic health record (EHR) tasks per hour of direct patient care. Demand for non-face-to-face care, such as communication through a patient portal and administrative tasks, is increasing and contributing to burnout. The goal of this study was to assess time allocated by primary care physicians within the EHR as indicated by EHR user-event log data, both during clinic hours (defined as 8:00 am to 6:00 pm Monday through Friday) and outside clinic hours. We conducted a retrospective cohort study of 142 family medicine physicians in a single system in southern Wisconsin. All Epic (Epic Systems Corporation) EHR interactions were captured from "event logging" records over a 3-year period for both direct patient care and non-face-to-face activities, and were validated by direct observation. EHR events were assigned to 1 of 15 EHR task categories and allocated to either during or after clinic hours. Clinicians spent 355 minutes (5.9 hours) of an 11.4-hour workday in the EHR per weekday per 1.0 clinical full-time equivalent: 269 minutes (4.5 hours) during clinic hours and 86 minutes (1.4 hours) after clinic hours. Clerical and administrative tasks including documentation, order entry, billing and coding, and system security accounted for nearly one-half of the total EHR time (157 minutes, 44.2%). Inbox management accounted for another 85 minutes (23.7%). Primary care physicians spend more than one-half of their workday, nearly 6 hours, interacting with the EHR during and after clinic hours. EHR event logs can identify areas of EHR-related work that could be delegated, thus reducing workload, improving professional satisfaction, and decreasing burnout. Direct time-motion observations validated EHR-event log data as a reliable source of information regarding clinician time allocation. © 2017 Annals of Family Medicine, Inc.
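    Allocating event-log timestamps to clinic versus after hours, using the study's definition of clinic hours (8:00 am to 6:00 pm, Monday through Friday), can be sketched as follows; the log entries, categories, and durations are hypothetical:

    ```python
    from datetime import datetime

    def during_clinic_hours(ts: datetime) -> bool:
        # Monday=0 ... Friday=4; clinic hours are 8:00 am to 6:00 pm.
        return ts.weekday() < 5 and 8 <= ts.hour < 18

    # Hypothetical (timestamp, task category, minutes) log entries.
    log = [(datetime(2017, 9, 4, 9, 30), "documentation", 12),   # Mon 9:30 am
           (datetime(2017, 9, 4, 20, 15), "inbox", 8),           # Mon 8:15 pm
           (datetime(2017, 9, 9, 10, 0), "order_entry", 5)]      # Sat 10:00 am

    in_clinic = sum(mins for ts, _, mins in log if during_clinic_hours(ts))
    after = sum(mins for ts, _, mins in log if not during_clinic_hours(ts))
    print(in_clinic, after)  # minutes during vs outside clinic hours
    ```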

  3. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how can regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations be efficiently utilized to detect water quality contamination events? This study describes an integrated event detection model which combines multiple sensor stations data with network hydraulics. To date, event detection modelling has largely been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result may be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.
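    One way the spatial corroboration described above might work (the topology and decision rule here are our assumption, not the paper's exact formulation):

    ```python
    # A local anomaly at one sensor station is escalated to an event alarm
    # only if a hydraulically connected station also shows an anomaly,
    # reducing single-sensor false positives. Hypothetical topology.
    connected = {"S1": ["S2"], "S2": ["S3"], "S3": []}

    def confirmed_alarms(anomalies: set) -> set:
        return {s for s in anomalies
                if any(n in anomalies for n in connected[s])}

    print(confirmed_alarms({"S1", "S2"}))  # S1 corroborated by S2
    print(confirmed_alarms({"S1"}))        # isolated anomaly: no alarm
    ```

    In a real system the connectivity would come from a hydraulic simulation of flow directions rather than a fixed adjacency table.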

  4. Comparison of traditional trigger tool to data warehouse based screening for identifying hospital adverse events.

    Science.gov (United States)

    O'Leary, Kevin J; Devisetty, Vikram K; Patel, Amitkumar R; Malkenson, David; Sama, Pradeep; Thompson, William K; Landler, Matthew P; Barnard, Cynthia; Williams, Mark V

    2013-02-01

    Research supports medical record review using screening triggers as the optimal method to detect hospital adverse events (AE), yet the method is labour-intensive. This study compared a traditional trigger tool with an enterprise data warehouse (EDW) based screening method to detect AEs. We created 51 automated queries based on 33 traditional triggers from prior research, and then applied them to 250 randomly selected medical patients hospitalised between 1 September 2009 and 31 August 2010. Two physicians each abstracted records from half the patients using a traditional trigger tool and then performed targeted abstractions for patients with positive EDW queries in the complementary half of the sample. A third physician confirmed presence of AEs and assessed preventability and severity. Traditional trigger tool and EDW based screening identified 54 (22%) and 53 (21%) patients with one or more AE. Overall, 140 (56%) patients had one or more positive EDW screens (total 366 positive screens). Of the 137 AEs detected by at least one method, 86 (63%) were detected by a traditional trigger tool, 97 (71%) by EDW based screening and 46 (34%) by both methods. Of the 11 total preventable AEs, 6 (55%) were detected by traditional trigger tool, 7 (64%) by EDW based screening and 2 (18%) by both methods. Of the 43 total serious AEs, 28 (65%) were detected by traditional trigger tool, 29 (67%) by EDW based screening and 14 (33%) by both. We found relatively poor agreement between traditional trigger tool and EDW based screening with only approximately a third of all AEs detected by both methods. A combination of complementary methods is the optimal approach to detecting AEs among hospitalised patients.
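The EDW approach can be pictured as automated trigger queries run over structured admission data, with positive screens routed to targeted chart review. A minimal sketch follows, with made-up trigger names and thresholds rather than the study's 51 queries.

```python
# A minimal sketch of enterprise-data-warehouse trigger screening: each trigger
# becomes an automated query (here, a predicate) over structured admission
# records, and positive screens are flagged for targeted chart review.
# Trigger names and thresholds are illustrative examples only.

ADMISSIONS = [
    {"id": 1, "meds": ["naloxone"], "creatinine": 1.0, "icu_transfer": False},
    {"id": 2, "meds": ["aspirin"], "creatinine": 3.2, "icu_transfer": False},
    {"id": 3, "meds": ["heparin"], "creatinine": 0.9, "icu_transfer": True},
]

TRIGGERS = {
    "opioid_reversal": lambda a: "naloxone" in a["meds"],
    "rising_creatinine": lambda a: a["creatinine"] > 2.0,
    "unplanned_icu_transfer": lambda a: a["icu_transfer"],
}

def screen(admissions, triggers):
    """Return {admission id: [names of positive trigger screens]}."""
    hits = {}
    for a in admissions:
        fired = [name for name, query in triggers.items() if query(a)]
        if fired:
            hits[a["id"]] = fired
    return hits

print(screen(ADMISSIONS, TRIGGERS))
```

A physician would then abstract only the flagged charts, which is how the EDW method reduces the manual review burden relative to screening every record.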

  5. Permanent data recording in transparent materials by using a nanojoule-class pulse laser

    Science.gov (United States)

    Imai, Ryo; Shiozawa, Manabu; Watanabe, Takao; Umeda, Mariko; Mine, Toshiyuki; Kuretake, Satoshi; Watanabe, Koichi

    2014-09-01

    We investigated data recording for permanent data storage using an ultrafast pulse laser with nanojoule-class pulse energy and megahertz-class repetition rate in transparent materials, and driveless reading based on a simple imaging system. A transparent ceramics called Lumicera®, manufactured by Murata Mfg. Co., Ltd., was used as the recording medium. Lumicera® has a lower modification threshold and a higher recording sensitivity than those of silica glass, namely, the medium previously studied. Structural modification in Lumicera® occurs by light exposure for 10 μs, suggesting that Lumicera® has potential for a recording speed of over 100 kbps. Data recorded in Lumicera® resists heating for 2 h at 1000 °C and is expected to have a lifetime of over 300 million years. Moreover, the data recorded in Lumicera® was successfully read with a reading system based on a smart phone.

  6. Utah: basic data for thermal springs and wells as recorded in GEOTHERM

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, J.D.

    1983-05-01

    This GEOTHERM sample file contains 643 records for Utah. Records may be present which are duplicates for the same analyses. A record may contain data on location, sample description, analysis type (water, condensate, or gas), collection condition, flow rates, and the chemical and physical properties of the fluid. Stable and radioactive isotopic data are occasionally available. Some records may contain only location and temperature. This compilation should contain all the chemical data for geothermal fluids in Utah available as of December, 1981. 7 refs. (ACR)

  7. AN APPROACH FOR JOINTLY MODELING MULTIVARIATE LONGITUDINAL MEASUREMENTS AND DISCRETE TIME-TO-EVENT DATA

    Science.gov (United States)

    Albert, Paul S.; Shih, Joanna H.

    2011-01-01

    In many medical studies, patients are followed longitudinally and interest is on assessing the relationship between longitudinal measurements and time to an event. Recently, various authors have proposed joint modeling approaches for longitudinal and time-to-event data for a single longitudinal variable. These joint modeling approaches become intractable with even a few longitudinal variables. In this paper we propose a regression calibration approach for jointly modeling multiple longitudinal measurements and discrete time-to-event data. Ideally, a two-stage modeling approach could be applied in which the multiple longitudinal measurements are modeled in the first stage and the longitudinal model is related to the time-to-event data in the second stage. Biased parameter estimation due to informative dropout makes this direct two-stage modeling approach problematic. We propose a regression calibration approach which appropriately accounts for informative dropout. We approximate the conditional distribution of the multiple longitudinal measurements given the event time by modeling all pairwise combinations of the longitudinal measurements using a bivariate linear mixed model which conditions on the event time. Complete data are then simulated based on estimates from these pairwise conditional models, and regression calibration is used to estimate the relationship between longitudinal data and time-to-event data using the complete data. We show that this approach performs well in estimating the relationship between multivariate longitudinal measurements and the time-to-event data and in estimating the parameters of the multiple longitudinal process subject to informative dropout. We illustrate this methodology with simulations and with an analysis of primary biliary cirrhosis (PBC) data. PMID:21938267

  8. AN APPROACH FOR JOINTLY MODELING MULTIVARIATE LONGITUDINAL MEASUREMENTS AND DISCRETE TIME-TO-EVENT DATA.

    Science.gov (United States)

    Albert, Paul S; Shih, Joanna H

    2010-09-01

    In many medical studies, patients are followed longitudinally and interest is on assessing the relationship between longitudinal measurements and time to an event. Recently, various authors have proposed joint modeling approaches for longitudinal and time-to-event data for a single longitudinal variable. These joint modeling approaches become intractable with even a few longitudinal variables. In this paper we propose a regression calibration approach for jointly modeling multiple longitudinal measurements and discrete time-to-event data. Ideally, a two-stage modeling approach could be applied in which the multiple longitudinal measurements are modeled in the first stage and the longitudinal model is related to the time-to-event data in the second stage. Biased parameter estimation due to informative dropout makes this direct two-stage modeling approach problematic. We propose a regression calibration approach which appropriately accounts for informative dropout. We approximate the conditional distribution of the multiple longitudinal measurements given the event time by modeling all pairwise combinations of the longitudinal measurements using a bivariate linear mixed model which conditions on the event time. Complete data are then simulated based on estimates from these pairwise conditional models, and regression calibration is used to estimate the relationship between longitudinal data and time-to-event data using the complete data. We show that this approach performs well in estimating the relationship between multivariate longitudinal measurements and the time-to-event data and in estimating the parameters of the multiple longitudinal process subject to informative dropout. We illustrate this methodology with simulations and with an analysis of primary biliary cirrhosis (PBC) data.
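To illustrate the two-stage structure the authors build on (and which, as they note, is biased under informative dropout without their regression-calibration correction), here is a minimal simulated sketch: stage 1 fits a per-subject least-squares slope to the longitudinal marker, and stage 2 relates that slope to event status. All data are simulated; only the two-stage skeleton is shown.

```python
import random

# Naive two-stage illustration of linking a longitudinal marker to an event.
# The paper shows this direct approach is biased under informative dropout
# and proposes regression calibration to correct it; only the two-stage
# structure is sketched here, on simulated data.

random.seed(0)

def simulate_subject(event):
    """Longitudinal marker: subjects who experience the event decline faster."""
    slope = -1.0 if event else -0.2
    return [(t, 10 + slope * t + random.gauss(0, 0.3)) for t in range(6)]

def ols_slope(series):
    """Stage 1: per-subject least-squares slope of the marker over time."""
    n = len(series)
    mt = sum(t for t, _ in series) / n
    my = sum(y for _, y in series) / n
    num = sum((t - mt) * (y - my) for t, y in series)
    den = sum((t - mt) ** 2 for t, _ in series)
    return num / den

subjects = [(simulate_subject(e), e) for e in [True] * 20 + [False] * 20]
slopes = [(ols_slope(s), e) for s, e in subjects]

# Stage 2: relate the stage-1 slope to the event, here via group means.
mean_event = sum(s for s, e in slopes if e) / 20
mean_none = sum(s for s, e in slopes if not e) / 20
print(mean_event, mean_none)
```

With informative dropout, the observed series would be truncated in a way that depends on the event, which is exactly where the naive stage-1 fit goes wrong and the paper's calibration step comes in.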

  9. Multimedia data capture and management for surgical events: Evaluation of a system.

    Science.gov (United States)

    Cone, Stephen W; Leung, Anna; Mora, Francisco; Rafiq, Azhar; Merrell, Ronald C

    2006-06-01

    The objective of this study was to design an electronic form of documentation of surgical procedures, which would include audio and video recording of the entire surgical procedure. Video clips have shown promise for teaching surgical procedures. To date, no systems have been described to fully record video and audio of all events during a surgical procedure. Much as such systems have aided the airline industry, surgical safety, documentation, and education could benefit from comprehensive, multimedia documentation systems. Four camcorders provided views of: (1) anesthetic monitors, (2) laparoscopic images, (3) room view, and (4) surgical field view. All video and audio were combined with real-time written documentation of events within a simple, inexpensive database for archiving, review, and evaluation. Electronic records provided answers to more than 90% of the structured review questions, leaving only 6% unanswered, versus 92% unanswerable based on the traditional paper records. This electronic documentation system provides a much more comprehensive and easily mined means of surgical documentation than traditional paper records.

  10. Highest-mass di-photon event recorded by CMS as of Dec '15 (Run 2, 13 TeV)

    CERN Multimedia

    Mc Cauley, Thomas

    2015-01-01

    This image shows a collision event with the largest-mass photon pair so far observed by the CMS detector in collision data collected in 2015. The mass of the di-photon system is 1.5 TeV. One photon candidate, with a transverse momentum of 530 GeV, is reconstructed in the endcap region, while a second, with a transverse momentum of 400 GeV, is reconstructed in the barrel region. Both photon candidates are consistent with the expectation that they are prompt isolated photons.

  11. CMS Higgs Search in 2011 and 2012 data: candidate photon-photon event (8 TeV)

    CERN Multimedia

    McCauley, Thomas

    2013-01-01

    Event recorded with the CMS detector in 2012 at a proton-proton centre of mass energy of 8 TeV. The event shows characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers). The event could also be due to known standard model background processes.

  12. Improving the capture of adverse event data in clinical trials: the role of the International Atomic Energy Agency.

    Science.gov (United States)

    Davidson, Susan E; Trotti, Andy; Ataman, Ozlem U; Seong, Jinsil; Lau, Fen Nee; da Motta, Neiro W; Jeremic, Branislav

    2007-11-15

    To report meetings of the Applied Radiation Biology and Radiotherapy section of the International Atomic Energy Agency (IAEA), organized to discuss issues surrounding, and develop initiatives to improve, the recording of adverse events (AE) in clinical trials. A first meeting was held in Atlanta, GA (October 2004). A second meeting was held in Denver, CO (October 2005) and focused on AE data capture. The National Cancer Institute Common Terminology Criteria for Adverse Events, version 3 (CTCAE) was suggested during the first meeting as the preferred common platform for the collection and reporting of AE data in its clinical trials. The second meeting identified and reviewed the current weaknesses and variations in the capture of AE data, and proposals to improve the quality and consistency of data capture were discussed. There is heterogeneity in the collection of AE data between both institutions and individual clinicians. The use of multiple scoring systems hampers comparisons of treatment outcomes between centers and trials. There is often insufficient detail on normal tissue treatment effects, which leads to an underestimate of toxicity. Implementation of improved data capture was suggested for one of the ongoing IAEA clinical trials. There is a need to compare the quality and completeness of data between institutions and the efficacy of structured/directed vs. traditional passive data collection. Data collection using the CTCAE (with or without a questionnaire) will be investigated in an IAEA multinational trial of radiochemotherapy and high-dose-rate brachytherapy in cervical cancer.

  13. Quality of record linkage in a highly automated cancer registry that relies on encrypted identity data

    Directory of Open Access Journals (Sweden)

    Schmidtmann, Irene

    2016-06-01

    Full Text Available Objectives: In the absence of unique ID numbers, cancer and other registries in Germany and elsewhere rely on identity data to link records pertaining to the same patient. These data are often encrypted to ensure privacy. Some record linkage errors unavoidably occur. These errors were quantified for the cancer registry of North Rhine Westphalia, which uses encrypted identity data. Methods: A sample of records was drawn from the registry; record linkage information was included. In parallel, plain text data for these records were retrieved to generate a gold standard. Record linkage error frequencies in the cancer registry were determined by comparing the results of the routine linkage with the gold standard. Error rates were projected to larger registries. Results: In the sample studied, the homonym error rate was 0.015%; the synonym error rate was 0.2%. The F-measure was 0.9921. Projection to larger databases indicated that for a realistic development the homonym error rate will be around 1% and the synonym error rate around 2%. Conclusion: Observed error rates are low. This shows that effective methods to standardize and improve the quality of the input data have been implemented. This is crucial to keep error rates low as the registry's database grows. The planned inclusion of unique health insurance numbers is likely to further improve record linkage quality. Cancer registration based entirely on electronic notification of records can process large amounts of data with high record linkage quality.
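The error measures reported here can be computed against a gold standard by comparing linked record pairs, as in this small sketch (the pair sets are invented; homonym errors are wrongly merged pairs, synonym errors are missed merges).

```python
# Sketch of scoring linkage quality against a gold standard: a "homonym
# error" wrongly links records of different patients, a "synonym error"
# fails to link records of the same patient; precision and recall over
# record pairs give the F-measure. Pair sets below are illustrative.

def linkage_quality(linked, gold):
    """Return (homonym_rate, synonym_rate, f_measure) over record pairs."""
    true_links = linked & gold
    homonyms = linked - gold        # wrongly merged pairs
    synonyms = gold - linked        # missed merges
    precision = len(true_links) / len(linked)
    recall = len(true_links) / len(gold)
    f = 2 * precision * recall / (precision + recall)
    return len(homonyms) / len(linked), len(synonyms) / len(gold), f

linked = {(1, 2), (3, 4), (5, 6)}   # pairs the registry merged
gold = {(1, 2), (3, 4), (7, 8)}     # pairs that truly belong together
print(linkage_quality(linked, gold))
```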

  14. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model

    OpenAIRE

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-01-01

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms.
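A threshold-based detector of the kind described can be sketched as follows; the threshold, the toy signal, and the use of heel-marker height alone are illustrative simplifications of the paper's algorithms, which also incorporate a biomechanical model.

```python
# Threshold-crossing sketch of kinematic gait-event detection: initial
# contact (IC) when the heel-marker height drops below a threshold, toe off
# (TO) when it rises back above it. The threshold and signal are toy values.

def detect_events(heights, threshold=0.05):
    """Return [(sample_index, 'IC' | 'TO')] from a heel-height signal (m)."""
    events = []
    for i in range(1, len(heights)):
        prev, cur = heights[i - 1], heights[i]
        if prev >= threshold > cur:
            events.append((i, "IC"))
        elif prev < threshold <= cur:
            events.append((i, "TO"))
    return events

# Toy signal: swing, stance (below threshold), swing again.
signal = [0.12, 0.08, 0.04, 0.02, 0.02, 0.03, 0.06, 0.10]
print(detect_events(signal))
```

Because each sample is processed as it arrives, this style of rule can run in real time, which is what makes it attractive for neurorobotic and neuroprosthetic control loops.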

  15. Joint Models for Longitudinal and Time-to-Event Data With Applications in R

    CERN Document Server

    Rizopoulos, Dimitris

    2012-01-01

    In longitudinal studies it is often of interest to investigate how a marker that is repeatedly measured in time is associated with a time to an event of interest, e.g., prostate cancer studies where longitudinal PSA level measurements are collected in conjunction with the time to recurrence. Joint Models for Longitudinal and Time-to-Event Data: With Applications in R provides a full treatment of random effects joint models for longitudinal and time-to-event outcomes that can be utilized to analyze such data. The content is primarily explanatory, focusing on applications of joint modeling.

  16. MGN V RDRS 5 GLOBAL DATA RECORD TOPOGRAPHIC V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the Magellan Global Topographic Data Record (GTDR). The range to surface is derived by fitting altimeter echoes from the fan-beam altimetry...

  17. JUNO JUPITER MWR 2 EXPERIMENT DATA RECORDS V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Juno MWR EDR data sets will ultimately include all uncalibrated MWR science data records for the entire Juno mission. The set in this volume will contain only...

  18. NOAA Climate Data Record (CDR) of Atmospheric Layer Temperatures, Version 3.3

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Atmospheric Layer Temperature Climate Data Record (CDR) dataset is a monthly analysis of the tropospheric and stratospheric data using temperature sounding...

  19. MGN V RDRS DERIVED GLOBAL VECTOR DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the Magellan Global Vector Data Record (GVDR), a sorted collection of scattering and emission measurements from the Magellan Mission. The...

  20. MGN V RDRS DERIVED MOSAIC IMAGE DATA RECORD FULL RES V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the Magellan Full-resolution Mosaic Image Data Records (F-MIDR) which consists of SAR mosaics generated from F-BIDRs (i.e., with 75 meters /...

  1. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Mask Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains a high quality Environmental Data Record (EDR) of cloud masks from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument onboard...

  2. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  3. LRO MOON LAMP 5 GRIDDED DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lunar Reconnaissance Orbiter (LRO) Lyman Alpha Mapping Project (LAMP) CODMAC Level 5 Gridded Data Record is a collection of gridded data products (maps) derived...

  4. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Sensor Data Record (SDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sensor Data Records (SDRs), or Level 1b data, from the Visible Infrared Imaging Radiometer Suite (VIIRS) are the calibrated and geolocated radiance and reflectance...

  5. ADEPt, a semantically-enriched pipeline for extracting adverse drug events from free-text electronic health records.

    Directory of Open Access Journals (Sweden)

    Ehtesham Iqbal

    Full Text Available Adverse drug events (ADEs) are unintended responses to medical treatment. They can greatly affect a patient's quality of life and present a substantial burden on healthcare. Although electronic health records (EHRs) document a wealth of information relating to ADEs, it is frequently stored in unstructured or semi-structured free-text narrative, requiring Natural Language Processing (NLP) techniques to mine the relevant information. Here we present a rule-based ADE detection and classification pipeline built and tested on a large psychiatric corpus comprising 264k patients, using the de-identified EHRs of four UK-based psychiatric hospitals. The pipeline uses characteristics specific to psychiatric EHRs to guide the annotation process, and distinguishes: (a) the temporal value associated with the ADE mention (whether it is historical or present), (b) the categorical value of the ADE (whether it is assertive, hypothetical, retrospective or a general discussion), and (c) the implicit contextual value, where the status of the ADE is deduced from surrounding indicators rather than explicitly stated. We manually created the rulebase in collaboration with clinicians and pharmacists by studying ADE mentions in various types of clinical notes. We evaluated the open-source Adverse Drug Event annotation Pipeline (ADEPt) using 19 ADEs specific to antipsychotic and antidepressant medication. The ADEs chosen vary in severity, regularity and persistence. The average F-measure and accuracy achieved by our tool across all tested ADEs were both 0.83. In addition to annotation power, the ADEPt pipeline presents an improvement to the state-of-the-art context-discerning algorithm, ConText.
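The temporal and categorical distinctions described above can be sketched as simple cue-phrase rules; the ADE terms, cue lists, and reduced two-way categories below are simplified illustrations in the spirit of ConText-style annotation, not ADEPt's actual rulebase.

```python
# Minimal rule-based sketch of context-aware ADE annotation: find an ADE
# mention, then classify its temporal and categorical status from nearby
# indicator phrases. Terms, cues and categories are illustrative only.

ADE_TERMS = ["tremor", "weight gain", "sedation"]
HISTORICAL_CUES = ["previously", "in the past", "history of"]
HYPOTHETICAL_CUES = ["risk of", "may cause", "warned about"]

def annotate(sentence):
    """Return (ade, temporality, category) or None if no ADE term is found."""
    lowered = sentence.lower()
    ade = next((t for t in ADE_TERMS if t in lowered), None)
    if ade is None:
        return None
    temporality = "historical" if any(c in lowered for c in HISTORICAL_CUES) else "present"
    category = "hypothetical" if any(c in lowered for c in HYPOTHETICAL_CUES) else "assertive"
    return (ade, temporality, category)

print(annotate("Patient reports tremor since starting lithium."))
print(annotate("Warned about the risk of weight gain on olanzapine."))
print(annotate("History of sedation with diazepam."))
```

Real clinical text needs scope handling, negation, and far richer cue inventories, which is what the manually built rulebase provides.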

  6. Machine learning algorithms for meteorological event classification in the coastal area using in-situ data

    Science.gov (United States)

    Sokolov, Anton; Gengembre, Cyril; Dmitriev, Egor; Delbarre, Hervé

    2017-04-01

    We consider the problem of classifying local atmospheric meteorological events in coastal areas, such as sea breezes, fogs and storms. In-situ meteorological data such as wind speed and direction, temperature, humidity and turbulence are used as predictors. Local atmospheric events from 2013-2014 in the coastal area of the English Channel at Dunkirk (France) were analysed manually to train the classification algorithms. Ultrasonic anemometer data and LIDAR wind profiler data were used as predictors. Several algorithms were applied to classify the meteorological events from local data: a decision tree, a nearest-neighbour classifier and a support vector machine. The classification algorithms were compared, and the most important predictors for each event type were determined. It was shown that in more than 80 percent of cases the machine learning algorithms detect the meteorological class correctly. We expect that this methodology could also be applied to classify events using climatological in-situ data or modelling data, allowing the frequency of each event type to be estimated under climate change.
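One of the compared algorithms, a nearest-neighbour classifier, can be sketched in a few lines; the feature vectors (wind speed, relative humidity, temperature) and labels below are fabricated for illustration, not the Dunkirk measurements.

```python
# Toy 1-nearest-neighbour classifier for meteorological events. Features are
# (wind speed m/s, relative humidity %, temperature degC); the training
# points and labels are invented for illustration.

TRAIN = [
    ((3.0, 98.0, 8.0), "fog"),
    ((2.5, 99.0, 7.0), "fog"),
    ((6.0, 75.0, 18.0), "sea_breeze"),
    ((5.5, 70.0, 19.0), "sea_breeze"),
    ((18.0, 85.0, 12.0), "storm"),
    ((20.0, 80.0, 11.0), "storm"),
]

def classify(sample, train=TRAIN):
    """Label the sample with the class of its nearest training point."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(train, key=lambda pair: dist(sample, pair[0]))[1]

print(classify((2.8, 97.0, 7.5)))    # fog-like conditions
print(classify((19.0, 82.0, 11.5)))  # storm-like conditions
```

In practice the predictors would be normalized before computing distances; the toy features here happen to separate cleanly without it.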

  7. A comparison of recording modalities of P300 event-related potentials (ERP) for brain-computer interface (BCI) paradigm.

    Science.gov (United States)

    Mayaud, L; Congedo, M; Van Laghenhove, A; Orlikowski, D; Figère, M; Azabou, E; Cheliout-Heraut, F

    2013-10-01

    A brain-computer interface aims at restoring communication and control in severely disabled people by identification and classification of EEG features such as event-related potentials (ERPs). The aim of this study is to compare different modalities of EEG recording for extraction of ERPs. The first comparison evaluates the performance of six disc electrodes with that of the EMOTIV headset, while the second evaluates three different electrode types (disc, needle, and large squared electrode). Ten healthy volunteers gave informed consent and were randomized to try the traditional EEG system (six disc electrodes with gel and skin preparation) or the EMOTIV headset first. Together with the six disc electrodes, a needle and a square electrode of larger surface were simultaneously recording near lead Cz. Each modality was evaluated over three sessions of auditory P300 separated by one hour. No statistically significant effect was found for the electrode type, nor for the interaction between electrode type and session number. There was no statistically significant difference in performance between the EMOTIV and the six traditional EEG disc electrodes, although there was a trend showing worse performance of the EMOTIV headset. However, the modality-session interaction was highly significant (P<0.001), showing that, while the performance of the six disc electrodes stays constant over sessions, the performance of the EMOTIV headset drops dramatically between 2 and 3 h of use. Finally, the evaluation of comfort by participants revealed increasing discomfort with the EMOTIV headset starting with the second hour of use. Our study does not recommend the use of one modality over another based on performance, but suggests the choice should be made on more practical considerations such as the expected length of use, the availability of skilled labor for system setup and, above all, patient comfort. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  8. Determining Time of Symptom Onset in Patients With Acute Coronary Syndromes: Agreement Between Medical Record and Interview Data.

    Science.gov (United States)

    Davis, Leslie L

    2015-01-01

    Prehospital delay, the time of symptom onset until the time of hospital arrival, for patients with symptoms of acute coronary syndrome (ACS) is frequently used to determine the course of care. Total ischemic time (time for symptom onset until the time of first coronary artery balloon inflation) is another criterion for quality of care for patients experiencing ST-segment elevation myocardial infarction. However, obtaining the exact time of symptom onset, the starting point of both time intervals, is challenging. Currently 2 methods are used to obtain the time of symptom onset: abstraction of data from the medical record and structured interviews done after the acute event. It is not clear whether these methods are equally accurate. Using identified search terms, PubMed and the Cumulative Index to Nursing and Allied Health Literature were searched for articles published from 1990 to 2014 to identify studies that examined agreement between the 2 data sources to determine prehospital delay in patients with ACS. Five studies examined the accuracy and/or agreement of prehospital delay by medical record review and structured patient interviews. In these studies, the percentage of missing/incomplete data in the medical record was higher compared with interviews (14%-40% vs 12%-13%). Three of the 4 studies that compared the 2 data sources reported more than 50% disagreement, with the time of symptom onset starting sooner when obtained by interview compared with the time recorded in their medical record at hospital presentation. There is a need for a consistent, reliable method to assess the time of symptom onset in patients with ACS. To ensure the accuracy of data collected for the medical record, training of emergency and critical care clinicians should (1) emphasize the importance of assessing symptoms broadly, (2) provide tips on interviewing techniques to help patients pinpoint the time of symptom onset, and (3) instill the value of complete documentation.

  9. The U.S. Army Person-Event Data Environment: A Military-Civilian Big Data Enterprise.

    Science.gov (United States)

    Vie, Loryana L; Scheier, Lawrence M; Lester, Paul B; Ho, Tiffany E; Labarthe, Darwin R; Seligman, Martin E P

    2015-06-01

    This report describes a groundbreaking military-civilian collaboration that benefits from an Army and Department of Defense (DoD) big data business intelligence platform called the Person-Event Data Environment (PDE). The PDE is a consolidated data repository that contains unclassified but sensitive manpower, training, financial, health, and medical records covering U.S. Army personnel (Active Duty, Reserve, and National Guard), civilian contractors, and military dependents. These unique data assets provide a veridical timeline capturing each soldier's military experience from entry to separation from the armed forces. The PDE was designed to afford unprecedented cost-efficiencies by bringing researchers and military scientists to a single computerized repository rather than porting vast data resources to individual laboratories. With funding from the Robert Wood Johnson Foundation, researchers from the University of Pennsylvania Positive Psychology Center joined forces with the U.S. Army Research Facilitation Laboratory, forming the scientific backbone of the military-civilian collaboration. This unparalleled opportunity was necessitated by a growing need to learn more about relations between psychological and health assets and health outcomes, including healthcare utilization and costs, issues of major importance for both military and civilian population health. The PDE represents more than 100 times the population size and many times the number of linked variables covered by the nation's leading sources of population health data (e.g., the National Health and Nutrition Examination Survey). Following extensive Army vetting procedures, civilian researchers can mine the PDE's trove of information using a suite of statistical packages made available in a Citrix Virtual Desktop. A SharePoint collaboration and governance management environment ensures user compliance with federal and DoD regulations concerning human subjects' protections and also provides a secure

  10. Development of a time-oriented data warehouse based on a medical information event model.

    Science.gov (United States)

    Yamamoto, Yuichiro; Namikawa, Hirokazu; Inamura, Kiyonari

    2002-01-01

    We designed a new medical information event model and developed a time-oriented data warehouse based on the model. Here, the medical information event is the basic data unit handled by a medical information system. The timing of decision making and treatment for a patient is sometimes critical in the processing of medical information, so the time-oriented data warehouse was developed to provide a search feature along the time axis. Our medical information event model has a uniquely simple data structure. PC-ORDERING2000, developed by NEC and built on Oracle, had about 600 pages of tables; we reduced these 600 complicated data structures to a single, simple event model. By shifting clinical data from the old order entry system into the new order entry system based on the medical information event model, we produced a simple and flexible system, and easy secondary use of patients' clinical data was realized. Evaluation of our system revealed heightened data retrieval efficiency and a roughly 600-fold shorter response time at a terminal, owing to the corresponding 600-fold reduction in the number of tables mentioned above.
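The single-event-table idea can be sketched with an in-memory database: every medical act becomes one row in a uniform event relation, so a search along the time axis is a single indexed query. The schema below is an illustration, not the NEC system's actual design.

```python
import sqlite3

# Sketch of the "single event table" idea: instead of hundreds of
# order-entry tables, every medical act is one row in a uniform event
# relation, so time-axis searches become a single indexed query.
# The schema is an illustrative example only.

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE event (
        patient_id INTEGER,
        occurred_at TEXT,       -- ISO-8601 timestamp
        event_type TEXT,        -- e.g. 'order', 'result', 'med_admin'
        payload TEXT
    )""")
conn.execute("CREATE INDEX idx_time ON event (patient_id, occurred_at)")
rows = [
    (1, "2002-01-10T09:00", "order", "chest X-ray"),
    (1, "2002-01-10T14:30", "result", "X-ray: no abnormality"),
    (1, "2002-02-01T08:00", "med_admin", "amoxicillin 500 mg"),
]
conn.executemany("INSERT INTO event VALUES (?, ?, ?, ?)", rows)

# Time-axis search: everything for patient 1 during January 2002, in order.
january = conn.execute(
    """SELECT occurred_at, event_type, payload FROM event
       WHERE patient_id = ? AND occurred_at BETWEEN ? AND ?
       ORDER BY occurred_at""",
    (1, "2002-01-01", "2002-01-31T23:59"),
).fetchall()
print(january)
```

Because ISO-8601 strings sort lexicographically, a plain range query over the single indexed column retrieves any time window, which is the retrieval-efficiency gain the evaluation reports.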

  11. Barriers to retrieving patient information from electronic health record data: failure analysis from the TREC Medical Records Track.

    Science.gov (United States)

    Edinger, Tracy; Cohen, Aaron M; Bedrick, Steven; Ambert, Kyle; Hersh, William

    2012-01-01

    Secondary use of electronic health record (EHR) data relies on the ability to retrieve accurate and complete information about desired patient populations. The Text Retrieval Conference (TREC) 2011 Medical Records Track was a challenge evaluation allowing comparison of systems and algorithms to retrieve patients eligible for clinical studies from a corpus of de-identified medical records, grouped by patient visit. Participants retrieved cohorts of patients relevant to 35 different clinical topics, and visits were judged for relevance to each topic. This study identified the most common barriers to identifying specific clinical populations in the test collection. Using the runs from track participants and judged visits, we analyzed the five non-relevant visits most often retrieved and the five relevant visits most often overlooked. Categories were developed iteratively to group the reasons for incorrect retrieval for each of the 35 topics. Reasons fell into nine categories for non-relevant visits and five categories for relevant visits. Non-relevant visits were most often retrieved because they contained a non-relevant reference to the topic terms. Relevant visits were most often overlooked because they used a synonym for a topic term. This failure analysis provides insight into areas for future improvement in EHR-based retrieval, with techniques such as more widespread and complete use of standardized terminology in retrieval and data entry systems.
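The dominant failure mode for relevant visits, synonym mismatch, points toward query expansion. A naive substring-matching sketch follows; the synonym table and documents are invented for illustration.

```python
# Sketch of synonym expansion for EHR retrieval: a query term is expanded
# with known synonyms before matching, so records that use an alternate
# term are not missed. Matching here is naive substring search; the
# synonym table and documents are illustrative only.

SYNONYMS = {
    "heart attack": ["myocardial infarction", "mi"],
    "high blood pressure": ["hypertension", "htn"],
}

DOCS = {
    101: "admitted with acute myocardial infarction",
    102: "long-standing htn, poorly controlled",
    103: "fractured left wrist after a fall",
}

def retrieve(query, docs, expand=True):
    """Return ids of documents containing the query term or a synonym."""
    terms = [query] + (SYNONYMS.get(query, []) if expand else [])
    return sorted(i for i, text in docs.items()
                  if any(t in text for t in terms))

print(retrieve("heart attack", DOCS, expand=False))  # literal match only
print(retrieve("heart attack", DOCS))                # with synonym expansion
```

Standardized terminologies serve the same purpose more robustly, which is the improvement the failure analysis recommends.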

  12. The medical record: narration and story as a path through patient data.

    Science.gov (United States)

    Kluge, E H

    1996-06-01

    Kay and Purves' proposed narratological model of the medical record is based on the familiar phenomenological insight that the perception of data is conditioned by the conceptual framework of the perceiver. Unfortunately, unless handled very carefully, this approach will make the significance of a medical record unique to the person who constructed it and impermeable to outside scrutiny. However, when integrated into the analog-model of the medical record, the narratological model can be accommodated as the clinician-relative construction of a patient profile within the data that make up the medical record. Some implications for the construction of expert systems and competence analysis are indicated.

  13. Organizational needs for managing and preserving geospatial data and related electronic records

    Directory of Open Access Journals (Sweden)

    R R Downs

    2006-01-01

    Full Text Available Government agencies and other organizations are required to manage and preserve records that they create and use to facilitate future access and reuse. The increasing use of geospatial data and related electronic records presents new challenges for these organizations, which have relied on traditional practices for managing and preserving records in printed form. This article reports on an investigation of current and future needs for managing and preserving geospatial electronic records on the part of local- and state-level organizations in the New York City metropolitan region. It introduces the study and describes organizational needs observed, including needs for organizational coordination and interorganizational cooperation throughout the entire data lifecycle.

  14. Exploring methods for identifying related patient safety events using structured and unstructured data.

    Science.gov (United States)

    Fong, Allan; Hettinger, A Zachary; Ratwani, Raj M

    2015-12-01

    Most healthcare systems have implemented patient safety event reporting systems to identify safety hazards. Searching the safety event data to find related patient safety reports and identify trends is challenging given the complexity and quantity of these reports. Structured data elements selected by the event reporter may be inaccurate, and the free-text narrative descriptions are difficult to analyze. In this paper we present and explore methods for utilizing both the unstructured free text and structured data elements in safety event reports to identify and rank similar events. We evaluate the results of three different free-text search methods, including a unique topic modeling adaptation, and structured element weights, using a patient fall use case. The various search techniques and weight combinations tended to prioritize different aspects of the event reports, leading to different search and ranking results. These search and prioritization methods have the potential to greatly improve patient safety officers' and other healthcare workers' understanding of which safety event reports are related.
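    The combination this abstract describes, free-text similarity plus weighted structured-element matches, can be pictured with a minimal sketch. The field names, weights, and the simple Jaccard text measure below are illustrative assumptions, not the paper's actual methods (which include a topic modeling adaptation):

```python
# Illustrative ranking of safety event reports: a free-text similarity
# score (token-set Jaccard) plus weighted agreement on structured fields.
# Field names and weights here are hypothetical.

def tokens(text):
    return set(text.lower().split())

def similarity(query, report, weights):
    # Free-text component: Jaccard overlap of narrative tokens.
    q, r = tokens(query["narrative"]), tokens(report["narrative"])
    text_sim = len(q & r) / len(q | r) if q | r else 0.0
    # Structured component: weighted agreement on coded fields.
    struct_sim = sum(w for f, w in weights.items()
                     if query.get(f) == report.get(f))
    return text_sim + struct_sim

def rank(query, reports, weights):
    return sorted(reports, key=lambda r: similarity(query, r, weights),
                  reverse=True)

reports = [
    {"id": 1, "narrative": "patient fell near bed rail", "event_type": "fall"},
    {"id": 2, "narrative": "wrong medication dose given", "event_type": "medication"},
    {"id": 3, "narrative": "patient slipped and fell in bathroom", "event_type": "fall"},
]
query = {"narrative": "patient fell out of bed", "event_type": "fall"}
ranked = rank(query, reports, {"event_type": 0.5})
print([r["id"] for r in ranked])   # → [1, 3, 2]
```

    In practice the text component would be replaced by a stronger method (TF-IDF or topic-model similarity, as the paper explores), but the additive structure shows how the structured-element weights re-rank free-text results.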

  15. Analyzing the reliability of volcanic and archeomagnetic data by comparison with historical records

    Science.gov (United States)

    Arneitz, Patrick; Egli, Ramon; Leonhardt, Roman

    2017-04-01

    Records of the past geomagnetic field are obtained from historical observations (direct records) on the one hand, and by the magnetization acquired by archeological artifacts, rocks and sediments (indirect records) on the other hand. Indirect records are generally less reliable than direct ones due to recording mechanisms that cannot be fully reproduced in the laboratory, age uncertainties and alteration problems. Therefore, geomagnetic field modeling approaches must deal with random and systematic errors of field values and age estimates that are hard to assess. Here, we present a new approach to investigate the reliability of volcanic and archeomagnetic data, which is based on comparisons with historical records. Temporal and spatial mismatches between data are handled by the implementation of weighting functions and error estimates derived from a stochastic model of secular variation. Furthermore, a new strategy is introduced for the statistical analysis of inhomogeneous and internally correlated data sets. Application of these new analysis tools to an extended database including direct and indirect records shows an overall good agreement between different record categories. Nevertheless, some biases exist between selected material categories, laboratory procedures, and quality checks/corrections (e.g., inclination shallowing of volcanic records). These findings can be used to obtain a better understanding of error sources affecting indirect records, thereby facilitating more reliable reconstructions of the geomagnetic past.
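    The weighting idea, down-weighting comparisons between indirect records and historical observations that do not coincide exactly in time or place, can be sketched as follows. The Gaussian form and the scale parameters are invented for illustration; the paper derives its weighting functions and error estimates from a stochastic model of secular variation:

```python
# Hedged sketch: weight each indirect-vs-direct comparison by its
# temporal and spatial mismatch, then compute a weighted mean bias.
# tau (years) and lam (km) are hypothetical characteristic scales.
import math

def weight(dt_years, dkm, tau=50.0, lam=500.0):
    """Gaussian down-weighting for temporal (dt) and spatial (dkm) mismatch."""
    return math.exp(-(dt_years / tau) ** 2 - (dkm / lam) ** 2)

def weighted_bias(pairs):
    """pairs: (indirect_value, direct_value, dt_years, dkm) tuples.
    Returns the weighted mean difference indirect - direct."""
    num = den = 0.0
    for ind, direct, dt, dkm in pairs:
        w = weight(dt, dkm)
        num += w * (ind - direct)
        den += w
    return num / den if den else float("nan")

pairs = [
    (62.0, 61.0, 10, 100),   # close match in time and space: near-full weight
    (64.0, 61.0, 200, 900),  # distant comparison: strongly down-weighted
]
print(round(weighted_bias(pairs), 3))
```

    The distant pair contributes almost nothing, so the estimated bias is dominated by the well-matched comparison.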

  16. Detection, tracking and event localization of jet stream features in 4-D atmospheric data

    Directory of Open Access Journals (Sweden)

    S. Limbach

    2012-04-01

    Full Text Available We introduce a novel algorithm for the efficient detection and tracking of features in spatiotemporal atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. The algorithm works on data given on a four-dimensional structured grid. Feature selection and clustering are based on adjustable local and global criteria; feature tracking is predominantly based on spatial overlaps of the features' full volumes. The resulting 3-D features and the identified correspondences between features of consecutive time steps are represented as the nodes and edges of a directed acyclic graph, the event graph. Merging and splitting events appear in the event graph as nodes with multiple incoming or outgoing edges, respectively. The precise localization of the splitting events is based on a search for all grid points inside the initial 3-D feature that have a similar distance to two successive 3-D features of the next time step. The merging event is localized analogously, operating backward in time. As a first application of our method we present a climatology of upper-tropospheric jet streams and their events, based on four-dimensional wind speed data from European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. We compare our results with a climatology from a previous study, investigate the statistical distribution of the merging and splitting events, and illustrate the meteorological significance of the jet splitting events with a case study. A brief outlook is given on additional potential applications of the 4-D data segmentation technique.
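    The overlap-based tracking and event-graph classification described above can be sketched in a few lines. The cell sets and feature labels below are a simplification of the paper's 4-D gridded volumes, not its implementation:

```python
# Sketch of overlap-based feature tracking as a directed event graph.
# Features are sets of grid-cell indices; an edge links features of
# consecutive time steps whose volumes overlap. A node with multiple
# outgoing edges marks a splitting event; multiple incoming, a merging.

def build_event_graph(frames):
    """frames: list of dicts {feature_id: set_of_cells}, one per time step."""
    edges = []
    for t in range(len(frames) - 1):
        for fid, cells in frames[t].items():
            for gid, next_cells in frames[t + 1].items():
                if cells & next_cells:          # spatial overlap
                    edges.append(((t, fid), (t + 1, gid)))
    return edges

def classify_events(edges):
    out_deg, in_deg = {}, {}
    for src, dst in edges:
        out_deg[src] = out_deg.get(src, 0) + 1
        in_deg[dst] = in_deg.get(dst, 0) + 1
    splits = [n for n, d in out_deg.items() if d > 1]
    merges = [n for n, d in in_deg.items() if d > 1]
    return splits, merges

# One jet feature splitting into two between t=0 and t=1:
frames = [
    {"A": {1, 2, 3, 4}},
    {"B": {1, 2}, "C": {4, 5}},
]
edges = build_event_graph(frames)
splits, merges = classify_events(edges)
print(splits, merges)   # → [(0, 'A')] []
```

    Because every correspondence points forward in time, the resulting graph is acyclic by construction, matching the event-graph representation in the abstract.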

  17. OSCAR experiment high-density network data report: Event 1 - April 8-9, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Dana, M.T.; Easter, R.C.; Thorp, J.M.

    1984-12-01

    The OSCAR (Oxidation and Scavenging Characteristics of April Rains) experiment, conducted during April 1981, was a cooperative field investigation of wet removal in cyclonic storm systems. The high-density component of OSCAR was located in northeast Indiana and included sequential precipitation chemistry measurements on a 100 by 100 km network, as well as airborne air chemistry and cloud chemistry measurements, surface air chemistry measurements, and supporting meteorological measurements. Four separate storm events were studied during the experiment. This report summarizes data taken by Pacific Northwest Laboratory (PNL) during the first storm event, April 8-9. The report contains the high-density network precipitation chemistry data, air chemistry data from the PNL aircraft, and meteorological data for the event, including standard National Weather Service products and radar data from the network. 4 references, 72 figures, 5 tables.

  18. OSCAR experiment high-density network data report: Event 3 - April 16-17, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Dana, M.T.; Easter, R.C.; Thorp, J.M.

    1984-12-01

    The OSCAR (Oxidation and Scavenging Characteristics of April Rains) experiment, conducted during April 1981, was a cooperative field investigation of wet removal in cyclonic storm systems. The high-density component of OSCAR was located in northeast Indiana and included sequential precipitation chemistry measurements on a 100 by 100 km network, as well as airborne air chemistry and cloud chemistry measurements, surface air chemistry measurements, and supporting meteorological measurements. Four separate storm events were studied during the experiment. This report summarizes data taken by Pacific Northwest Laboratory (PNL) during the third storm event, April 16-17. The report contains the high-density network precipitation chemistry data, air chemistry and cloud chemistry data from the PNL aircraft, and meteorological data for the event, including standard National Weather Service products and radar and rawinsonde data from the network. 4 references, 76 figures, 6 tables.

  19. Integrating phenotypic data from electronic patient records with molecular level systems biology

    DEFF Research Database (Denmark)

    Brunak, Søren

    2011-01-01

    Electronic patient records remain a rather unexplored, but potentially rich data source for discovering correlations between diseases. We describe a general approach for gathering phenotypic descriptions of patients from medical records in a systematic and non-cohort dependent manner. By extracti...... Classification of Disease ontology and is therefore in principle language independent. As a use case we show how records from a Danish psychiatric hospital lead to the identification of disease correlations, which subsequently are mapped to systems biology frameworks....

  20. Dose-Specific Adverse Drug Reaction Identification in Electronic Patient Records: Temporal Data Mining in an Inpatient Psychiatric Population

    DEFF Research Database (Denmark)

    Eriksson, Robert; Werge, Thomas; Jensen, Lars Juhl

    2014-01-01

    Data collected for medical, filing and administrative purposes in electronic patient records (EPRs) represent a rich source of individualised clinical data, which has great potential for improved detection of patients experiencing adverse drug reactions (ADRs), across all approved drugs and across all indication areas. The aim of this study was to take advantage of techniques for temporal data mining of EPRs in order to detect ADRs in a patient- and dose-specific manner. We used a psychiatric hospital's EPR system to investigate undesired drug effects. Within one workflow the method identified patient-specific adverse events (AEs) and links these to specific drugs and dosages in a temporal manner, based on integration of text mining results and structured data. The structured data contained precise information on drug identity, dosage and strength. When applying the method to the 3,394 patients...

  1. ATLAS Event Data Organization and I/O Framework Capabilities in Support of Heterogeneous Data Access and Processing Models

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00219732; The ATLAS collaboration; Cranshaw, Jack; van Gemmeren, Peter; Nowak, Marcin

    2016-01-01

    Choices in persistent data models and data organization have significant performance ramifications for data-intensive scientific computing. In experimental high energy physics, organizing file-based event data for efficient per-attribute retrieval may improve the I/O performance of some physics analyses but hamper the performance of processing that requires full-event access. In-file data organization tuned for serial access by a single process may be less suitable for opportunistic sub-file-based processing on distributed computing resources. Unique I/O characteristics of high-performance computing platforms pose additional challenges. The ATLAS experiment at the Large Hadron Collider employs a flexible I/O framework and a suite of tools and techniques for persistent data organization to support an increasingly heterogeneous array of data access and processing models.
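    The trade-off the abstract describes, per-attribute retrieval versus full-event access, can be illustrated with a toy in-memory model. This is not ATLAS I/O code, and the attribute names are invented:

```python
# Toy contrast of two event-data organizations: row-wise (full events
# stored together) versus column-wise (one container per attribute).

events = [{"pt": float(i), "eta": i * 0.1, "ntrk": i % 5} for i in range(4)]

# Row-wise layout: events are stored whole, in order.
row_store = list(events)

# Column-wise layout: each attribute is stored contiguously.
col_store = {k: [e[k] for e in events] for k in events[0]}

# Per-attribute scan: the columnar layout reads a single container...
pts_columnar = col_store["pt"]
# ...while the row layout must decode every full event for the same values.
pts_rowwise = [e["pt"] for e in row_store]
assert pts_columnar == pts_rowwise

# Full-event access: one lookup in the row layout, but a gather across
# every attribute container in the columnar layout.
event2_row = row_store[2]
event2_col = {k: v[2] for k, v in col_store.items()}
assert event2_row == event2_col

print(pts_columnar)   # → [0.0, 1.0, 2.0, 3.0]
```

    Both layouts hold the same data; they differ in how much must be read and decoded for each access pattern, which is why no single in-file organization suits every processing model.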

  2. ATLAS Event Data Organization and I/O Framework Capabilities in Support of Heterogeneous Data Access and Processing Models

    CERN Document Server

    Malon, David; The ATLAS collaboration; van Gemmeren, Peter

    2016-01-01

    Choices in persistent data models and data organization have significant performance ramifications for data-intensive scientific computing. In experimental high energy physics, organizing file-based event data for efficient per-attribute retrieval may improve the I/O performance of some physics analyses but hamper the performance of processing that requires full-event access. In-file data organization tuned for serial access by a single process may be less suitable for opportunistic sub-file-based processing on distributed computing resources. Unique I/O characteristics of high-performance computing platforms pose additional challenges. This paper describes work in the ATLAS experiment at the Large Hadron Collider to provide an I/O framework and tools for persistent data organization to support an increasingly heterogeneous array of data access and processing models.

  3. Design of a medical record review study on the incidence and preventability of adverse events requiring a higher level of care in Belgian hospitals

    Directory of Open Access Journals (Sweden)

    Vlayen Annemie

    2012-08-01

    Full Text Available Abstract Background Adverse events are unintended patient injuries that arise from healthcare management, resulting in disability, prolonged hospital stay or death. Adverse events that require intensive care admission imply a considerable financial burden to the healthcare system. The epidemiology of adverse events in Belgian hospitals has never been assessed systematically. Findings A multistage retrospective review study of patients requiring a transfer to a higher level of care will be conducted in six hospitals in the province of Limburg. Patient records are reviewed starting from January 2012 by a clinical team consisting of a research nurse, a physician and a clinical pharmacist. In addition to the incidence and the level of causation and preventability, the types of adverse events and their consequences (patient harm, mortality and length of stay) will be assessed. Moreover, the adequacy of the patient records and the quality/usefulness of the method of medical record review will be evaluated. Discussion This paper describes the rationale for a retrospective review study of adverse events that necessitate a higher level of care. More specifically, we are particularly interested in increasing our understanding of the preventability and root causes of these events in order to implement improvement strategies. Attention is paid to the strengths and limitations of the study design.

  4. Variability and trends of surface solar radiation in Europe based on CM SAF satellite data records

    Science.gov (United States)

    Trentmann, Jörg; Pfeifroth, Uwe; Sanchez-Lorenzo, Arturo; Urbain, Manon; Clerbaux, Nicolas

    2017-04-01

    The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) generates satellite-based high-quality climate data records, with a focus on the global energy and water cycle. Here, the latest releases of the CM SAF's data records of surface solar radiation, the Surface Solar Radiation Data Set - Heliosat (SARAH) and the CM SAF cLouds, Albedo and Radiation dataset from AVHRR data (CLARA), are analyzed and validated with reference to ground-based measurements, e.g., those provided by the Baseline Surface Radiation Network (BSRN), the World Radiation Data Center (WRDC) and the Global Energy Balance Archive (GEBA). Focus is given to the trends and the variability of the surface irradiance in Europe as derived from the surface and the satellite-based data records. Both data sources show an overall increase (i.e., brightening) after the 1980s and indicate substantial decadal variability, with periods of reduced increase (or even a decrease) and periods with a comparably high increase. The increase also shows a pronounced spatial pattern, which is found to be consistent between the two data sources. The good correspondence between the satellite-based data records and the surface measurements highlights the potential of the satellite data to represent the variability and changes in the surface irradiance, and documents the dominant role of clouds over aerosols in explaining its variations. Reasons for remaining differences between the satellite- and the surface-based data records (e.g., in Southern Europe) will be discussed. To test the consistency of the CM SAF solar radiation data records we also assess the decadal variability of the solar reflected radiation at the top of the atmosphere (TOA) from the CM SAF climate data record based on the MVIRI / SEVIRI measurements from 1983 to 2015. This data record complements the SARAH data record in its temporal and spatial coverage; fewer and different assumptions are used in the retrieval to generate the TOA reflected solar

  5. Reported Adverse Events with Painkillers: Data Mining of the US Food and Drug Administration Adverse Events Reporting System.

    Science.gov (United States)

    Min, Jae; Osborne, Vicki; Kowalski, Allison; Prosperi, Mattia

    2017-11-02

    One-third of adults in the USA experience chronic pain and use a variety of painkillers, such as nonsteroidal anti-inflammatory drugs (NSAIDs), acetaminophen, and opioids. However, some serious adverse events (AEs), such as cardiovascular incidents, overdose, and death, have been found to be related to painkillers. We used 2015 and 2016 AE reports from the US FDA's Adverse Events Reporting System (FAERS) to conduct exploratory analysis on the demographics of those who reported painkiller-related AEs, examine the AEs most commonly associated with different types of painkillers, and identify potential safety signals. Summary descriptive statistics and proportional reporting ratios (PRRs) were calculated. Out of over 2 million reports submitted to FAERS in 2015 and 2016, a total of 64,354 AE reports were associated with painkillers. Reports of opioid-associated AEs were more likely to be from males or younger patients (mean age 47.6 years). The highest numbers of AEs were reported for NSAID and opioid use, and the most commonly found AEs were related to drug ineffectiveness, administration issues, abuse, and overdose. Death was reported in 20.0% of the reports, and serious adverse reactions, including death, were reported in 67.0%; both adverse outcomes were highest among patients using opioids or combinations of painkillers and were associated with PRRs of 2.12 and 1.87, respectively. This study examined the AEs most commonly associated with varying classes of painkillers by mining the FAERS database. Our results and methods are relevant for future secondary analyses of big data and for understanding adverse outcomes related to painkillers.

  6. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    Science.gov (United States)

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…

  7. Psychometric Evaluation of Data from the Race-Related Events Scale

    Science.gov (United States)

    Crusto, Cindy A.; Dantzler, John; Roberts, Yvonne Humenay; Hooper, Lisa M.

    2015-01-01

    Using exploratory factor analysis, we examined the factor structure of data collected from the Race-Related Events Scale, which assesses perceived exposure to race-related stress. Our sample (N = 201) consisted of diverse caregivers of Head Start preschoolers. Three factors explained 81% of the variance in the data and showed sound reliability.

  8. From Environmental Data Record (EDR) to Information Data Record (IDR) - Towards The Development of S-NPP/JPSS Real-Time Informatics in Community Satellite Processing Package (CSPP)

    Science.gov (United States)

    Huang, A. A.

    2016-12-01

    In cooperation with the NOAA Suomi NPP/JPSS program, CIMSS/SSEC continues to leverage and expand the NASA-funded International MODIS/AIRS Processing Package (IMAPP) effort, and to facilitate the use of international polar orbiter satellite data through the development of a unified Community Satellite Processing Package (CSPP). CSPP supports Suomi NPP and JPSS, and will subsequently build up over time to support operational GOES-R, METOP series, FY-3 series, and geostationary meteorological and environmental satellites for the global weather and environmental user community. This paper briefly highlights 16 years (2000-2016) of success of IMAPP and, more recently, of CSPP, the latter as a pathway to the development of a freely available software package to transform VIIRS, CrIS, and ATMS Raw Data Records (RDRs) (i.e. Level 0) to Sensor Data Records (SDRs) (i.e. Level 1), and SDRs to Environmental Data Records (EDRs) (i.e. Level 2) in support of Suomi NPP and subsequently the JPSS missions under the CSPP framework. Examples of CSPP implementations of the customized UW multi-instrument hyperspectral retrieval and NOAA enterprise algorithms will be outlined: 1) The Clouds from AVHRR Extended (CLAVR-X), 2) Microwave Integrated Retrieval (MIR), 3) Advanced Clear-Sky Processor for Oceans (ACSPO), and 4) NOAA Unique CrIS-ATMS Processing System (NUCAPS). Moreover, the current innovations in the development of Information Data Records (IDRs) from single or multiple EDRs and other ancillary and auxiliary data, which are to become the foundation of CSPP Informatics (the CSPP science information processing and integration system), will be discussed. Several current CSPP Informatics examples are highlighted, such as 1) Infusion Data into Environmental Air Quality Application - International (IDEA-I), 2) AWH (Aviation Weather Hazard), and 3) Aerosol Visibility.

  9. The Cadmium Isotope Record of the Great Oxidation Event from the Turee Creek Group, Hamersley Basin, Australia

    Science.gov (United States)

    Abouchami, W.; Busigny, V.; Philippot, P.; Galer, S. J. G.; Cheng, C.; Pecoits, E.

    2016-12-01

    The evolution of the ocean, atmosphere and biosphere throughout Earth's history has impacted the biogeochemistry of some key trace metals that are of particular importance in regulating the exchange between Earth's reservoirs. Several geochemical proxies exhibit isotopic shifts that have been linked to major changes in the oxygenation levels of the ancient oceans during the Great Oxygenation Event (GOE) between 2.45 and 2.2 Ga and the Neoproterozoic Oxygenation Event at ca. 0.6 Ga. Studies of the modern marine biogeochemical cycle of the transition metal cadmium have shown that stable Cd isotope fractionation is mainly driven by biological uptake of light Cd into marine phytoplankton in surface waters, leaving the seawater enriched in the heavy Cd isotopes. Here we exploit the potential of this novel proxy to trace ancient biological productivity, which remains an enigma, particularly during the early stages of Earth's history. The Turee Creek Group in the Hamersley Basin, Australia, provides a continuous stratigraphic sedimentary section covering the GOE and at least two glacial events, offering a unique opportunity to examine the changes that took place during these periods and possibly constrain the evolution, timing and onset of oxygenic photosynthesis. Stable Cd isotope data were obtained on samples from the Boolgeeda Iron Fm. (BIFs), the siliciclastic and carbonate successions of Kungarra (including the Meteorite Bore Member) and the Kazputt Fm., using a double spike technique by TIMS (ThermoFisher Triton), and Cd concentrations were determined by isotope dilution. The Boolgeeda BIFs have generally low Cd concentrations varying between 8 and 50 ppb, with two major excursions marked by an increase in Cd content, reaching levels similar to those in the overlying Kungarra Fm. (≥150 ppb). These variations are associated with a large range in ɛ112/110Cd values (-2 to +2), with the most negative values typically found in the organic and Cd-rich shales and

  10. Geochemistry and Cyclostratigraphy of Magnetic Susceptibility data from the Frasnian-Famennian event interval in western Canada: Insights in the pattern and timing of a biotic crisis

    Science.gov (United States)

    Whalen, M. T.; De Vleeschouwer, D.; Sliwinski, M. G.; Claeys, P. F.; Day, J. E.

    2012-12-01

    Cyclostratigraphic calibration of magnetic susceptibility data, along with stable isotopic and geochemical proxy data for redox, productivity, and detrital input from western Canada, provides insight into the pace and timing of the Late Devonian Frasnian-Famennian (F-F) biotic crisis. The F-F event is characterized by two organic-rich shales that, in much of the world, display geochemical anomalies indicating low-oxygen conditions and carbon burial. These events, referred to as the Lower and Upper Kellwasser events (LKE and UKE), have been linked to the evolutionary expansion of deeply rooted terrestrial forests and the concomitant changes in soil development, chemical weathering, and Late Devonian climate. Our geochemical data record relatively high levels of redox-sensitive trace metals (Mo, U, V), proxies for biological productivity (Ba, Cu, Ni, Zn), and detrital input (Al, Si, Ti, Zr) during both events. Stable C isotope data generated from organic matter record a 3-4‰ positive excursion during both events. Each event is recorded in lowstand and/or early transgressive facies. These data corroborate hypotheses about enhanced biological productivity, driven by heightened terrestrial detrital input, leading to low-oxygen conditions and decreases in biotic diversity during relatively low stands of Late Devonian sea level. Age dating of such events in deep time is problematic due to insufficient biochronologic control. Each event is within one conodont biostratigraphic zone, with durations on the order of 0.5-1.0 Ma. Time series analysis of high-resolution magnetic susceptibility data identified 16 long eccentricity cycles (405 ky) during the Frasnian stage and one in the earliest Famennian stage. The geochemical anomalies associated with the LKE and UKE are recorded over 7 and 14 m of stratigraphic section, respectively. These strata represent only a portion of a 405 ky long eccentricity cycle, and astronomical tuning implies that the LKE likely occurred

  11. Designing Alternative Transport Methods for the Distributed Data Collection of ATLAS EventIndex Project

    CERN Document Server

    Fernandez Casani, Alvaro; The ATLAS collaboration

    2016-01-01

    One of the key and challenging tasks of the ATLAS EventIndex project is to index and catalog all the produced events not only at CERN but also at hundreds of worldwide grid sites, and convey the data in real time to a central Hadoop instance at CERN. While this distributed data collection is currently operating correctly in production, there are some issues that might impose performance bottlenecks in the future, with an expected rise in the event production and reprocessing rates. In this work, we first describe the current approach based on a messaging system, which conveys the data from the sources to the central catalog, and we identify some weaknesses of this system. Then, we study a promising alternative transport method based on an object store, presenting a performance comparison with the current approach, and the architectural design changes needed to adapt the system to the next run of the ATLAS experiment at CERN.

  12. VizieR Online Data Catalog: Spitzer IRAC events observed in crowded fields (Calchi+, 2015)

    Science.gov (United States)

    Calchi Novati, S.; Gould, A.; Yee, J. C.; Beichman, C.; Bryden, G.; Carey, S.; Fausnaugh, M.; Gaudi, B. S.; Henderson, C. B.; Pogge, R. W.; Shvartzvald, Y.; Wibking, B.; Zhu, W.; Spitzer Team; Udalski, A.; Poleski, R.; Pawlak, M.; Szymanski, M. K.; Skowron, J.; Mroz, P.; Kozlowski, S.; Wyrzykowski, L.; Pietrukowicz, P.; Pietrzynski, G.; Soszynski, I.; Ulaczyk, K.; OGLE Group

    2017-10-01

    In Table 1 we list the 170 events monitored in 2015. For each, we report the event name, the coordinates, the first and last day of observation, and the number of observed epochs. The events were chosen based on the microlensing alerts provided by the OGLE (Udalski et al. 2015AcA....65....1U) and MOA (Bond et al. 2004ApJ...606L.155B) collaborations. The current analysis is based on the preliminary reduced data made available by the Spitzer Science Center almost in real time (on average, 2-3 days after the observations). The final reduction of the data is now publicly available at the NASA/IPAC Infrared Science Database (IRSA, http://irsa.ipac.caltech.edu/frontpage/). (1 data file).

  13. Identifying Weak Ties from Publicly Available Social Media Data in an Event

    DEFF Research Database (Denmark)

    Prakash Gupta, Jayesh; Menon, Karan; Kärkkäinen, Hannu

    2016-01-01

    or potential weak ties using publicly available social media data in the context of an event. Our case study environment is community managers' online discussions in social media in connection with the yearly organized Community Manager Appreciation Day (CMAD 2016) event in Finland. We were able to identify potential weak ties using conversation-based structural holes, making use of social network analysis methods (such as clustering) and content analysis in the context of events. We add to the understanding of, and useful data sources for, the strength-of-weak-ties theory originated by Granovetter and developed further by other researchers. Our approach may be used in the future to build more sophisticated conference recommendation systems and to significantly automate the extraction of useful contact recommendations for conference participants.
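    The core notion, that edges bridging otherwise separate conversation clusters are candidate weak ties (structural holes), can be sketched as follows. The participant names, edges, and the pre-computed cluster labelling are hypothetical; in the study the clustering itself comes from social network analysis of the event conversations:

```python
# Minimal sketch: given a conversation graph and a cluster label per
# participant, weak-tie candidates are the edges whose endpoints lie
# in different clusters (bridges across structural holes).

def weak_ties(edges, cluster_of):
    """Return edges whose endpoints fall in different clusters."""
    return [(u, v) for u, v in edges if cluster_of[u] != cluster_of[v]]

# Two dense discussion communities joined by a single bridging reply.
edges = [("ann", "bob"), ("bob", "cat"), ("ann", "cat"),   # cluster 0
         ("dan", "eve"), ("eve", "fay"), ("dan", "fay"),   # cluster 1
         ("cat", "dan")]                                   # bridge
cluster_of = {"ann": 0, "bob": 0, "cat": 0, "dan": 1, "eve": 1, "fay": 1}
print(weak_ties(edges, cluster_of))   # → [('cat', 'dan')]
```

    A recommendation system could then suggest the endpoints of such bridging edges to each other's communities, which is the kind of contact recommendation the abstract envisions.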

  14. NOAA Climate Data Record (CDR) of Ocean Heat Fluxes, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Ocean Surface Bundle (OSB) Climate Data Record (CDR) consists of three parts: sea surface temperature; near-surface wind speed, air temperature, and specific...

  15. Geosat Geodetic Mission Sensor Data Records (SDR) for June, 1986 (NODC Accession 0002547)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of June 01, 1986 to June 30, 1986....

  16. Geosat Geodetic Mission Sensor Data Records (SDR) for September, 1985 (NODC Accession 0002538)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of September 01, 1985 to September 30,...

  17. Geosat Geodetic Mission Sensor Data Records (SDR) for March 1986 (NODC Accession 0002544)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of March 01, 1986 to March 31, 1986....

  18. Geosat Geodetic Mission Sensor Data Records (SDR) for May, 1985 (NODC Accession 0002351)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of May 01, 1985 to May 31, 1985. Parameters...

  19. Geosat Geodetic Mission Sensor Data Records (SDR) for August, 1985 (NODC Accession 0002537)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of August 01, 1985 to August 31, 1985....

  20. Geosat Geodetic Mission Sensor Data Records (SDR) for January, 1986 (NODC Accession 0002542)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of January 01, 1986 to January 31, 1986....

  1. Geosat Geodetic Mission Sensor Data Records (SDR) for October, 1985 (NODC Accession 0002539)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of October 01, 1985 to October 31, 1985....

  2. Geosat Geodetic Mission Sensor Data Records (SDR) for June, 1985 (NODC Accession 0002359)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of June 01, 1985 to June 30, 1985....

  3. Geosat Geodetic Mission Sensor Data Records (SDR) for February 1986 (NODC Accession 0002543)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of February 01, 1986 to February 28, 1986....

  4. Geosat Geodetic Mission Sensor Data Records (SDR) for August, 1986 (NODC Accession 0002549)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of August 01, 1986 to August 31, 1986....

  5. Geosat Geodetic Mission Sensor Data Records (SDR) for July, 1986 (NODC Accession 0002548)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of July 01, 1986 to July 31, 1986....

  6. Geosat Geodetic Mission Sensor Data Records (SDR) for May, 1986 (NODC Accession 0002546)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of May 01, 1986 to May 31, 1986. Parameters...

  7. Geosat Geodetic Mission Sensor Data Records (SDR) for April, 1985 (NODC Accession 0002350)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of April 01, 1985 to April 30, 1985....

  8. Geosat Geodetic Mission Sensor Data Records (SDR) for December, 1985 (NODC Accession 0002541)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of December 01, 1985 to December 31, 1985....

  9. Geosat Geodetic Mission Sensor Data Records (SDR) for July, 1985 (NODC Accession 0002536)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of July 01, 1985 to July 31, 1985....

  10. Geosat Geodetic Mission Sensor Data Records (SDR) for April, 1986 (NODC Accession 0002545)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of April 01, 1986 to April 30, 1986....

  11. An Open Architecture Scaleable Maintainable Software Defined Commodity Based Data Recorder And Correlator Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project addresses the need for higher data rate recording capability, increased correlation speed and flexibility needed for next generation VLBI systems. The...

  12. Idaho: basic data for thermal springs and wells as recorded in GEOTHERM, Part A

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, J.D.

    1983-07-01

    All chemical data for geothermal fluids in Idaho available as of December 1981 are maintained on GEOTHERM, a computerized information system. This report presents summaries and sources of records for Idaho. 7 refs. (ACR)

  13. NOAA Climate Data Record (CDR) of Solar Spectral Irradiance (SSI), NRLSSI Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) contains solar spectral irradiance (SSI) as a function of time and wavelength created with the Naval Research Laboratory model for...

  14. NOAA Climate Data Record (CDR) of Sea Surface Temperature - WHOI, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Ocean Surface Bundle (OSB) Climate Data Record (CDR) consist of three parts: sea surface temperature, near-surface atmospheric properties, and heat fluxes....

  15. Geosat Geodetic Mission Waveform Data Records (WDR) for April, 1986 (NODC Accession 0002561)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of waveform data records (WDRs) from the GEOSAT Geodetic Mission (GM) and(or) Exact Repeat Mission (ERM) for the time period of...

  16. Geosat Geodetic Mission Waveform Data Records (WDR) for June, 1986 (NODC Accession 0002563)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of waveform data records (WDRs) from the GEOSAT Geodetic Mission (GM) and(or) Exact Repeat Mission (ERM) for the time period of...

  17. Geosat Geodetic Mission Sensor Data Records (SDR) for September, 1986 (NODC Accession 0002550)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of September 01, 1986 to September 30,...

  18. NOAA Climate Data Record (CDR) of Total Solar Irradiance (TSI), NRLTSI Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) contains total solar irradiance (TSI) as a function of time created with the Naval Research Laboratory model for spectral and total...

  19. Unified Sea Ice Thickness Climate Data Record Collection Spanning 1947-2012

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Unified Sea Ice Thickness Climate Data Record is the result of a concerted effort to collect as many observations as possible of Arctic sea-ice draft, freeboard,...

  20. NOAA Climate Data Record (CDR) of Daily Outgoing Longwave Radiation (OLR), Version 1.2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) contains the daily mean Outgoing Longwave Radiation (OLR) time series in global 1 degree x 1 degree equal-angle gridded maps spanning...

  1. Geosat Exact Repeat Mission Waveform Data Records (WDR) (NODC Accession 0061150)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains waveform data records (WDRs) from the US Navy Geodetic Satellite (GEOSAT) Exact Repeat Mission (ERM) for the time period of November 08, 1986...

  2. Geosat Geodetic Mission Waveform Data Records (WDR) for May, 1985 (NODC Accession 0002365)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of waveform data records (WDRs) from the GEOSAT Geodetic Mission (GM) and(or) Exact Repeat Mission (ERM) for the time period of May...

  3. NOAA Climate Data Record (CDR) of Monthly Outgoing Longwave Radiation (OLR), Version 2.2-1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Climate Data Record (CDR) of monthly mean High Resolution Infrared Radiation Sounder (HIRS) Outgoing Longwave Radiation (OLR) flux at the top of the atmosphere...

  4. LRO MOON LAMP 3 REDUCED DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lunar Reconnaissance Orbiter (LRO) Lyman Alpha Mapping Project (LAMP) CODMAC Level 3 Reduced Data Record is a collection of the far ultraviolet photon detections...

  5. JUNO JUPITER UVS 3 REDUCED DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Juno Ultraviolet Spectrograph (UVS) CODMAC Level 3 Reduced Data Record is a collection of the far ultraviolet photon detections obtained by the UVS instrument,...

  6. Nevada: basic data for thermal springs and wells as recorded in GEOTHERM. Part A

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, J.D.

    1983-06-01

    All chemical data for geothermal fluids in Nevada available as of December 1981 are maintained on GEOTHERM, a computerized information system. This report presents summaries and sources of records for Nevada. 7 refs. (ACR)

  7. NOAA Climate Data Record (CDR) of Normalized Difference Vegetation Index (NDVI), Version 4

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains gridded daily Normalized Difference Vegetation Index (NDVI) derived from the NOAA Climate Data Record (CDR) of Advanced Very High Resolution...

  8. Geosat Geodetic Mission Sensor Data Records (SDR) for November, 1985 (NODC Accession 0002540)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of sensor data records (SDRs) from the GEOSAT Geodetic Mission (GM) for the time period of November 01, 1985 to November 30, 1985....

  9. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Active Fires Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) that contains pinpoint locations of active fires (AF) as identified by an algorithm...

  10. NUCAPS: NOAA Unique Combined Atmospheric Processing System Environmental Data Record (EDR) Products

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of numerous retrieved estimates of hydrological variables and trace gases as Environmental Data Record (EDR) products from the NOAA Unique...

  11. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Aerosol Detection Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of suspended matter from the Visible Infrared Imaging Radiometer Suite (VIIRS)...

  12. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Snow Cover Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of snow cover from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument...

  13. Geosat Geodetic Mission Waveform Data Records (WDR) for January, 1986 (NODC Accession 0002558)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This accession contains one month of waveform data records (WDRs) from the GEOSAT Geodetic Mission (GM) and(or) Exact Repeat Mission (ERM) for the time period of...

  14. LRO MOON LAMP 2 EXPERIMENT DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lunar Reconnaissance Orbiter (LRO) Lyman Alpha Mapping Project (LAMP) CODMAC Level 2 Experiment Data Record is a collection of the far ultraviolet photon...

  15. JUNO JUPITER UVS 2 EXPERIMENT DATA RECORD V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Juno Ultraviolet Spectrograph (UVS) CODMAC Level 2 Experiment Data Record is a collection of the far ultraviolet photon detections obtained by the UVS...

  16. NOAA Climate Data Record (CDR) of Passive Microwave Sea Ice Concentration, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Passive Microwave Sea Ice Concentration Climate Data Record (CDR) dataset is generated using daily gridded brightness temperatures from the Defense...

  17. NOAA Climate Data Record (CDR) of AVHRR Polar Pathfinder Extended (APP-X) Cryosphere

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA Climate Data Record (CDR) of the extended AVHRR Polar Pathfinder (APP-x) cryosphere contains 19 geophysical variables over the Arctic and Antarctic for the...

  18. NOAA Climate Data Record (CDR) of Ocean Near Surface Atmospheric Properties, Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Ocean Surface Bundle (OSB) Climate Data Record (CDR) consist of three parts: sea surface temperature; near-surface wind speed, air temperature, and specific...

  19. NOAA Climate Data Record (CDR) of AVHRR Polar Pathfinder (APP) Cryosphere

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This NOAA Climate Data Record (CDR) contains the AVHRR Polar Pathfinder (APP) product. APP is a fundamental CDR comprised of calibrated and navigated AVHRR channel...

  20. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2013-10-09

    Emerging Big Data applications in areas such as e-commerce and the energy industry require both online and on-demand queries over vast and fast data arriving as streams, presenting novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high-performance online query scheme that deals in particular with the velocity aspect of the 3 Vs of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques for interactive query processing over realtime data, featuring efficient online querying, dynamic stream data persistence, and on-demand access. The system architecture is presented and the current implementation status reported.
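The core online-query primitive the abstract describes (matching an ordered pattern of events within a bounded window of a stream, while keeping only bounded state) can be sketched in a few lines. This is an illustrative toy under assumed event and pattern shapes, not the H2O framework's actual API:

```python
from collections import deque

def match_sequence(stream, pattern, window):
    """Minimal stateful CEP sketch: report each stream position where the
    event types in `pattern` occur in order within the last `window` events."""
    recent = deque(maxlen=window)  # bounded state, as in windowed CEP
    matches = []
    for i, event in enumerate(stream):
        recent.append(event)
        # in-order (subsequence) scan of the window for the pattern
        it = iter(recent)
        if all(any(e["type"] == want for e in it) for want in pattern):
            matches.append(i)
    return matches

events = [{"type": t} for t in ["buy", "view", "buy", "ship", "view"]]
print(match_sequence(events, ["buy", "ship"], window=4))  # → [3, 4]
```

The shared iterator `it` is what enforces order: each pattern element must be found strictly after the previous one inside the window.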

  1. Automated Feature and Event Detection with SDO AIA and HMI Data

    Science.gov (United States)

    Davey, Alisdair; Martens, P. C. H.; Attrill, G. D. R.; Engell, A.; Farid, S.; Grigis, P. C.; Kasper, J.; Korreck, K.; Saar, S. H.; Su, Y.; Testa, P.; Wills-Davey, M.; Savcheva, A.; Bernasconi, P. N.; Raouafi, N.-E.; Delouille, V. A.; Hochedez, J. F.; Cirtain, J. W.; Deforest, C. E.; Angryk, R. A.; de Moortel, I.; Wiegelmann, T.; Georgouli, M. K.; McAteer, R. T. J.; Hurlburt, N.; Timmons, R.

    The Solar Dynamics Observatory (SDO) represents a new frontier in quantity and quality of solar data. At about 1.5 TB/day, the data will not be easily digestible by solar physicists using the same methods that have been employed for images from previous missions. In order for solar scientists to use the SDO data effectively, they need metadata that will allow them to identify and retrieve data sets that address their particular science questions. We are building a comprehensive computer vision pipeline for SDO, abstracting complete metadata on many of the features and events detectable on the Sun without human intervention. Our project unites more than a dozen individual, existing codes into a systematic tool that can be used by the entire solar community. The feature finding codes will run as part of the SDO Event Detection System (EDS) at the Joint Science Operations Center (JSOC; joint between Stanford and LMSAL). The metadata produced will be stored in the Heliophysics Event Knowledgebase (HEK), which will be accessible online for the rest of the world directly or via the Virtual Solar Observatory (VSO). Solar scientists will be able to use the HEK to select event and feature data to download for science studies.

  2. The REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement.

    Directory of Open Access Journals (Sweden)

    Eric I Benchimol

    2015-10-01

    Routinely collected health data, obtained for administrative and clinical purposes without specific a priori research goals, are increasingly used for research. The rapid evolution and availability of these data have revealed issues not addressed by existing reporting guidelines, such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE). The REporting of studies Conducted using Observational Routinely collected health Data (RECORD) statement was created to fill these gaps, as an extension to the STROBE statement addressing reporting items specific to observational studies using routinely collected health data. RECORD consists of a checklist of 13 items related to the title, abstract, introduction, methods, results, and discussion sections of articles, and other information required for inclusion in such research reports. This document contains the checklist and explanatory and elaboration information to enhance the use of the checklist. Examples of good reporting for each RECORD checklist item are also included herein. This document, as well as the accompanying website and message board (http://www.record-statement.org), will enhance the implementation and understanding of RECORD. Through implementation of RECORD, authors, journal editors, and peer reviewers can encourage transparency of research reporting.

  3. Novel data-mining methodologies for adverse drug event discovery and analysis.

    Science.gov (United States)

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Data-mining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis.

  4. An Automated System for Coding Data from Summary Time Oriented Record (STOR)

    Science.gov (United States)

    Whiting-O'Keefe, Q.; Strong, Phillip C.; Simborg, Donald W.

    1983-01-01

    A system to automatically encode a portion of the patient specific data of an ambulatory record system has been developed and implemented. The first use of the system that passes the clinical data of 224 patients to the ARAMIS research databank is described. Issues concerning the capture and use of naturally occurring patient data for clinical research are discussed.

  5. To what extent are adverse events found in patient records reported by patients and healthcare professionals via complaints, claims and incident reports?

    NARCIS (Netherlands)

    Christiaans-Dingelhoff, I.; Smits, M.; Zwaan, L.; Lubberding, S.; Wal, G. van der; Wagner, C.

    2011-01-01

    BACKGROUND: Patient record review is believed to be the most useful method for estimating the rate of adverse events among hospitalised patients. However, the method has some practical and financial disadvantages. Some of these disadvantages might be overcome by using existing reporting systems in

  6. Reconstructing high-magnitude/low-frequency landslide events based on soil redistribution modelling and a Late-Holocene sediment record from New Zealand

    NARCIS (Netherlands)

    Claessens, L.F.G.; Lowe, D.J.; Hayward, B.W.; Schaap, B.F.; Schoorl, J.M.; Veldkamp, A.

    2006-01-01

    A sediment record is used, in combination with shallow landslide soil redistribution and sediment-yield modelling, to reconstruct the incidence of high-magnitude/low-frequency landslide events in the upper part of a catchment and the history of a wetland in the lower part. Eleven sediment cores were

  7. Abstracting ICU Nursing Care Quality Data From the Electronic Health Record.

    Science.gov (United States)

    Seaman, Jennifer B; Evans, Anna C; Sciulli, Andrea M; Barnato, Amber E; Sereika, Susan M; Happ, Mary Beth

    2017-09-01

    The electronic health record is a potentially rich source of data for clinical research in the intensive care unit setting. We describe the iterative, multi-step process used to develop and test a data abstraction tool for collecting nursing care quality indicators from the electronic health record for a pragmatic trial. We computed Cohen's kappa coefficient (κ) to assess interrater agreement, or reliability, of data abstracted using the preliminary and finalized tools. In assessing the reliability of study data (n = 1,440 cases) using the finalized tool, 108 randomly selected cases (10% of the first half of the sample; 5% of the last half) were independently abstracted by a second rater. We demonstrated mean κ values ranging from 0.61 to 0.99 for all indicators. Nursing care quality data can be accurately and reliably abstracted from the electronic health records of intensive care unit patients using a well-developed data collection tool and detailed training.
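Cohen's kappa, the interrater agreement statistic reported above, corrects the raw proportion of agreement for agreement expected by chance from each rater's label frequencies. A minimal sketch of the computation (the ratings below are hypothetical, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same cases:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal label frequencies
    p_chance = sum(
        (rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels
    )
    return (p_obs - p_chance) / (1 - p_chance)

a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]
print(round(cohens_kappa(a, b), 2))  # → 0.47
```

Values near 1 indicate near-perfect agreement (as in the study's 0.61-0.99 range); 0 means agreement no better than chance.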

  8. Adverse events with medical devices in anesthesia and intensive care unit patients recorded in the French safety database in 2005-2006.

    Science.gov (United States)

    Beydon, Laurent; Ledenmat, Pierre Yves; Soltner, Christophe; Lebreton, Frédéric; Hardin, Vincent; Benhamou, Dan; Clergue, François; Laguenie, Gérard

    2010-02-01

    French regulations require that adverse events involving medical devices be reported to the national healthcare safety agency. The authors evaluated reports made in 2005-2006 for patients in anesthesiology and critical care. For each type of device, the authors recorded the severity and cause of the event and the manufacturer's response where relevant. The authors compared the results with those obtained previously from the reports (n = 1,004) sent in 1998 to the same database. The authors identified 4,188 events, of which 91% were minor, 7% severe, and 2% fatal. The cause was available for 1,935 events (46%). Faulty manufacturing was the main cause of minor events. Inappropriate use accounted for a significantly larger proportion of severe events than of minor events, underscoring the importance of device verification before use. Compared with 1998, the annual number of reported events doubled and the rate of severe events decreased slightly (from 12% to 10%, P = 0.03). The rate of events related to manufacturing problems remained stable (59-60%, P = nonsignificant), whereas the rate of events caused by human error was 32-42% (P = 0.01). There were no changes in the mortality rate (2% in both studies). The number of adverse events related to medical devices indicates a need for greater attention to these complex pieces of equipment, which can suffer from faulty design and manufacturing and from inappropriate use. Improvements in clinician knowledge of medical devices and, to a lesser extent, improvements in manufacturing practices should improve safety.

  9. Geodetic Infrastructure, Data, Education and Community Engagement in Response to Earthquakes and Other Geophysical Events: An Overview of UNAVCO Support Resources Plus Highlights from Recent Event Response

    Science.gov (United States)

    Phillips, D. A.; Meertens, C. M.; Mattioli, G. S.; Miller, M. M.; Charlevoix, D. J.; Maggert, D.; Hodgkinson, K. M.; Henderson, D. B.; Puskas, C. M.; Bartel, B. A.; Baker, S.; Blume, F.; Normandeau, J.; Feaux, K.; Galetzka, J.; Williamson, H.; Pettit, J.; Crosby, C. J.; Boler, F. M.

    2015-12-01

    UNAVCO responds to community requests for support during and following significant geophysical events such as earthquakes, volcanic activity, landslides, glacial and ice-sheet movements, unusual uplift or subsidence, extreme meteorological events, or other hazards. UNAVCO can also respond proactively to events in anticipation of community demand for relevant data, data products or other services. Recent major events to which UNAVCO responded include the 2015 M7.8 Nepal EQ, the 2014 M6.0 American Canyon (Napa) EQ, the 2014 M8.2 Chile EQ, the 2011 M9.0 Tohoku, Japan EQ and tsunami, the 2010 M8.8 Maule, Chile EQ, and the 2010 M7.0 Haiti EQ. UNAVCO provided geophysical event response support for 15 events in 2014 alone. UNAVCO event response resources include geodetic infrastructure, data, and education and community engagement. Specific support resources include: field engineering personnel; continuous and campaign GNSS/GPS station deployment; real-time and/or high rate field GNSS/GPS station upgrades or deployment; data communications and power systems deployment; tiltmeter, strainmeter, and borehole seismometer deployments; terrestrial laser scanning (TLS a.k.a. ground-based LiDAR); InSAR data support; education and community engagement assistance or products; data processing services; generation of custom GNSS/GPS or borehole data sets and products; equipment shipping and logistics coordination; and assistance with RAPID proposal preparation, budgeting, and submission. The most critical aspect of a successful event response is effective and efficient communication. To facilitate such communication, UNAVCO creates event response web pages describing the event and the support being provided, and in the case of major events also provides an online event response forum. These resources are shared broadly with the geophysical community through multiple dissemination strategies including social media of UNAVCO and partner organizations. We will provide an overview of

  10. ISVASE: identification of sequence variant associated with splicing event using RNA-seq data.

    Science.gov (United States)

    Aljohi, Hasan Awad; Liu, Wanfei; Lin, Qiang; Yu, Jun; Hu, Songnian

    2017-06-28

    Precise and efficient exon recognition and splicing by the spliceosome are key to generating mature mRNAs. About one third to one half of disease-related mutations affect RNA splicing. The software PVAAS was developed to identify variants associated with aberrant splicing directly from RNA-seq data; however, it rests on the assumption that annotated splice sites represent normal splicing, which is not always true. We developed ISVASE, a tool for specifically identifying sequence variants associated with splicing events (SVASEs) using RNA-seq data. Compared with PVAAS, our tool has several advantages: multi-pass, stringent rule-dependent and statistical filters; use of split-reads only; independent sequence variant identification in each part of a splicing junction; variant detection for both known and novel splicing events; additional exon-exon junction shift event detection when known splicing events are provided; splicing signal evaluation; support for known DNA mutation and/or RNA editing data; higher precision and consistency; and short running time. Using a realistic RNA-seq dataset, we performed a case study to illustrate the functionality and effectiveness of our method. Moreover, the output SVASEs can be used for downstream analyses such as splicing regulatory element studies and sequence variant functional analysis. ISVASE is useful for researchers interested in sequence variants (DNA mutations and/or RNA editing) associated with splicing events. The package is freely available at https://sourceforge.net/projects/isvase/.

  11. Using machine learning to detect events in eye-tracking data.

    Science.gov (United States)

    Zemblys, Raimondas; Niehorster, Diederick C; Komogortsev, Oleg; Holmqvist, Kenneth

    2017-02-23

    Event detection is a challenging stage in eye movement data analysis. A major drawback of current event detection methods is that parameters have to be adjusted based on eye movement data quality. Here we show that a fully automated classification of raw gaze samples as belonging to fixations, saccades, or other oculomotor events can be achieved using a machine-learning approach. Any already manually or algorithmically detected events can be used to train a classifier to produce similar classification of other data without the need for a user to set parameters. In this study, we explore the application of random forest machine-learning technique for the detection of fixations, saccades, and post-saccadic oscillations (PSOs). In an effort to show practical utility of the proposed method to the applications that employ eye movement classification algorithms, we provide an example where the method is employed in an eye movement-driven biometric application. We conclude that machine-learning techniques lead to superior detection compared to current state-of-the-art event detection algorithms and can reach the performance of manual coding.
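The data-quality-dependent parameter the machine-learning approach aims to eliminate is easiest to see in a classic algorithmic detector such as velocity-threshold identification (I-VT). The sketch below is a generic illustration of that baseline, not the paper's random forest classifier, and the gaze trace values are invented:

```python
def ivt_classify(x, y, sample_rate_hz, velocity_threshold=100.0):
    """Baseline I-VT sketch: label each gaze sample 'saccade' if the
    point-to-point velocity (deg/s) exceeds a hand-set threshold, else
    'fixation'. The threshold is exactly the parameter that must be
    re-tuned per data quality, which a trained classifier avoids."""
    labels = ["fixation"]  # first sample has no predecessor
    for i in range(1, len(x)):
        dist = ((x[i] - x[i - 1]) ** 2 + (y[i] - y[i - 1]) ** 2) ** 0.5
        velocity = dist * sample_rate_hz
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels

# hypothetical 500 Hz trace (degrees): small drift, then a large jump
x = [0.00, 0.01, 0.02, 5.00, 5.01]
y = [0.00, 0.00, 0.01, 5.00, 5.00]
print(ivt_classify(x, y, 500))
# → ['fixation', 'fixation', 'fixation', 'saccade', 'fixation']
```

A random forest trained on manually labeled samples replaces the fixed threshold with per-sample features (velocity, dispersion, etc.) learned from the data itself.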

  12. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    Science.gov (United States)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).
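The reorganization the abstract describes, pivoting archived time-step grids into per-grid-cell time series ("data rods"), can be sketched generically. The function and sample grids below are illustrative assumptions, not GES DISC code:

```python
def build_data_rods(time_step_grids):
    """Reorganize archived time-step arrays (one 2-D grid per time step)
    into per-grid-cell time series ('data rods'), keyed by (row, col)."""
    rods = {}
    for t, grid in sorted(time_step_grids.items()):
        for row_idx, row in enumerate(grid):
            for col_idx, value in enumerate(row):
                rods.setdefault((row_idx, col_idx), []).append((t, value))
    return rods

# two time steps of a tiny 2x2 grid (e.g. rainfall values)
grids = {
    "2020-01-01": [[1.0, 2.0], [3.0, 4.0]],
    "2020-01-02": [[1.5, 2.5], [3.5, 4.5]],
}
rods = build_data_rods(grids)
print(rods[(0, 1)])  # → [('2020-01-01', 2.0), ('2020-01-02', 2.5)]
```

Each rod is then directly consumable by time-series methods such as the SVM-based anomaly classification mentioned above, without rescanning every archived grid.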

  13. Observation of an excess at 30 GeV in the opposite sign di-muon spectra of Z → bb̄ + X events recorded by the ALEPH experiment at LEP

    CERN Document Server

    Heister, Arno

    2016-01-01

    The re-analysis of the archived data recorded at the Z⁰ resonance by the ALEPH experiment at LEP during the years 1992-1995 shows an excess in the opposite sign di-muon mass spectra at 30.40 GeV in events containing b quarks. The excess has a natural width of 1.78 GeV. A compatible but smaller excess is visible in the opposite sign di-electron mass spectrum as well.

  14. Designing an algorithm to preserve privacy for medical record linkage with error-prone data.

    Science.gov (United States)

    Pal, Doyel; Chen, Tingting; Zhong, Sheng; Khethavath, Praveen

    2014-01-20

    Linking medical records across different medical service providers is important to the enhancement of health care quality and public health surveillance. In record linkage, protecting the patients' privacy is a primary requirement. In real-world health care databases, records may well contain errors due to various reasons such as typos. Linking the error-prone data and preserving data privacy at the same time is very difficult. Existing privacy preserving solutions for this problem are restricted to textual data. To enable different medical service providers to link their error-prone data in a private way, our aim was to provide a holistic solution by designing and developing a medical record linkage system for medical service providers. To initiate a record linkage, one provider selects one of its collaborators in the Connection Management Module, chooses some attributes of the database to be matched, and establishes the connection with the collaborator after the negotiation. In the Data Matching Module, for error-free data, our solution offered two different choices of cryptographic schemes. For error-prone numerical data, we proposed a newly designed privacy preserving linking algorithm, named the Error-Tolerant Linking Algorithm, which allows error-prone data to be correctly matched if the distance between the two records is below a threshold. We designed and developed a comprehensive and user-friendly software system that provides privacy preserving record linkage functions for medical service providers and meets the regulations of the Health Insurance Portability and Accountability Act. It does not require a third party, and it is secure in that neither entity can learn the records in the other's database. Moreover, our novel Error-Tolerant Linking Algorithm implemented in this software can work well with error-prone numerical data. We theoretically proved the correctness and security of our Error-Tolerant Linking Algorithm. We have also fully
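    The matching rule at the core of the Error-Tolerant Linking Algorithm, stripped of the cryptographic layer that makes the real algorithm privacy preserving, can be sketched as a distance-threshold test. The record fields and the threshold below are hypothetical:

```python
import math

def error_tolerant_match(rec_a, rec_b, threshold):
    """Plaintext sketch of the matching rule: two numerical records
    are linked when their Euclidean distance falls below a threshold,
    tolerating small entry errors such as typos. The paper's
    cryptographic protocol around this test is omitted here."""
    return math.dist(rec_a, rec_b) < threshold

# Records: (birth year, zip prefix, systolic BP) -- illustrative fields.
a = (1975, 532, 120)
b = (1975, 532, 121)   # same patient, one small transcription error
c = (1983, 604, 135)   # different patient
print(error_tolerant_match(a, b, threshold=3.0))  # True
print(error_tolerant_match(a, c, threshold=3.0))  # False
```

    Exact matching would reject the pair (a, b) outright; the threshold is what buys tolerance to error-prone entries.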

  15. Experimental Seismic Event-screening Criteria at the Prototype International Data Center

    Science.gov (United States)

    Fisk, M. D.; Jepsen, D.; Murphy, J. R.

    - Experimental seismic event-screening capabilities are described, based on the difference of body- and surface-wave magnitudes (denoted as Ms:mb) and on event depth. These capabilities have been implemented and tested at the prototype International Data Center (PIDC), based on recommendations by the IDC Technical Experts on Event Screening in June 1998. Screening scores are presented that indicate numerically the degree to which an event meets, or does not meet, the Ms:mb and depth screening criteria. Seismic events are also categorized as onshore, offshore, or mixed, based on their 90% location error ellipses and an onshore/offshore grid with five-minute resolution, although this analysis is not used at this time to screen out events. Results are presented of applications to almost 42,000 events with mb>=3.5 in the PIDC Standard Event Bulletin (SEB) and to 121 underground nuclear explosions (UNEs) at the U.S. Nevada Test Site (NTS), the Semipalatinsk and Novaya Zemlya test sites in the Former Soviet Union, the Lop Nor test site in China, and the Indian, Pakistan, and French Polynesian test sites. The screening criteria appear to be quite conservative: none of the known UNEs are screened out, while about 41 percent of the presumed earthquakes in the SEB with mb>=3.5 are screened out. UNEs at the Lop Nor, Indian, and Pakistan test sites on 8 June 1996, 11 May 1998, and 28 May 1998, respectively, have among the lowest Ms:mb scores of all events in the SEB. To assess the validity of the depth screening results, comparisons are presented of SEB depth solutions to those in other bulletins that are presumed to be reliable and independent. Using over 1600 events, the comparisons indicate that the SEB depth confidence intervals are consistent with or shallower than over 99.8 percent of the corresponding depth estimates in the other bulletins. Concluding remarks are provided regarding the performance of the experimental event-screening criteria, and plans for future
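    A minimal sketch of how an Ms:mb score might be combined with a depth criterion to screen events follows. The magnitude offset and depth threshold are illustrative placeholders, not the operational PIDC values:

```python
def msmb_screening_score(ms, mb, offset=0.64):
    """Illustrative screening score: positive when the surface-wave
    magnitude Ms is large relative to the body-wave magnitude mb,
    which is characteristic of earthquakes rather than explosions.
    The offset is a placeholder, not the operational PIDC value."""
    return ms - (mb - offset)

def screened_out(ms, mb, depth_km, depth_threshold_km=10.0):
    """An event is screened out (presumed earthquake) if either its
    Ms:mb score is positive or its depth is confidently large. Both
    thresholds here are illustrative placeholders."""
    return msmb_screening_score(ms, mb) > 0 or depth_km > depth_threshold_km

# Earthquake-like: strong surface waves, non-trivial depth.
print(screened_out(ms=4.8, mb=4.5, depth_km=33.0))   # True
# Explosion-like: weak surface waves, near-surface source.
print(screened_out(ms=3.2, mb=4.6, depth_km=0.0))    # False
```

    A conservative choice of thresholds, as the abstract reports, screens out many presumed earthquakes while leaving every known explosion unscreened.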

  16. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials, as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trial patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events, defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest-incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than that of the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group, with an incidence rate of 30.09%, also significantly higher than the reference group. The adverse event diversity likewise increased with patient age: clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events than studies recruiting younger age groups.
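    The incidence-rate comparison reduces to simple arithmetic. The participant counts below are made up, chosen only to reproduce the rates quoted in the abstract:

```python
def incidence_rate(affected, enrolled):
    """Adverse-event incidence rate as a percentage of participants."""
    return 100.0 * affected / enrolled

# Illustrative counts reproducing the abstract's rates:
# 31.41% for ages 0-9 and 20.76% for the 20-29 reference group.
children = incidence_rate(3141, 10000)
young_adults = incidence_rate(2076, 10000)
print(round(children, 2), round(young_adults, 2))   # 31.41 20.76
print(round(children / young_adults, 2))            # 1.51
```

    The 1.51 ratio is exactly the "approximately 1.51 times higher" figure reported for children relative to the young adult reference group.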

  17. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services, typically involving the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns about how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage to third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current encryption techniques primarily rely on conventional cryptographic approaches; however, key management issues remain largely unsolved with these cryptography-based techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
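    Geometric data perturbation can be sketched as a rotation plus translation (plus optional noise) applied to each record; because rigid motions preserve pairwise distances, distance-based queries still work on the perturbed database. The 2-D records, angle, and shift below are illustrative, not the paper's scheme parameters:

```python
import math
import random

def perturb(points, angle, shift, noise=0.0, seed=0):
    """Sketch of geometric data perturbation on 2-D records: rotate
    by `angle`, translate by `shift`, and optionally add Gaussian
    noise. Rotation and translation preserve pairwise distances, so
    distance-based analyses still work on the perturbed data."""
    rng = random.Random(seed)
    out = []
    for x, y in points:
        rx = x * math.cos(angle) - y * math.sin(angle)
        ry = x * math.sin(angle) + y * math.cos(angle)
        out.append((rx + shift[0] + rng.gauss(0, noise),
                    ry + shift[1] + rng.gauss(0, noise)))
    return out

records = [(120.0, 80.0), (140.0, 95.0)]      # e.g. BP readings
masked = perturb(records, angle=1.1, shift=(50.0, -20.0))
d_orig = math.dist(*records)
d_masked = math.dist(*masked)
print(abs(d_orig - d_masked) < 1e-9)          # True
```

    The cloud provider sees only the masked coordinates, yet can still answer distance-based queries on the perturbed database.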

  18. Data Matching Concepts and Techniques for Record Linkage, Entity Resolution, and Duplicate Detection

    CERN Document Server

    Christen, Peter

    2012-01-01

    Data matching (also known as record or data linkage, entity resolution, object identification, or field matching) is the task of identifying, matching, and merging records that correspond to the same entities across several databases or even within one database. Based on research in various domains including applied statistics, health informatics, data mining, machine learning, artificial intelligence, database management, and digital libraries, significant advances have been achieved over the last decade in all aspects of the data matching process, especially in how to improve the accuracy of data matching.
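    A toy version of the matching process the book covers, with blocking to limit comparisons and a string-similarity threshold to classify candidate pairs; real systems add indexing, supervised classification, and clerical-review steps. The names and threshold below are illustrative:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_records(db_a, db_b, threshold=0.85):
    """Toy record-linkage sketch: block on the first letter of the
    name to limit comparisons, then link pairs whose similarity
    meets the threshold. Returns (index_a, index_b) pairs."""
    blocks = {}
    for j, name in enumerate(db_b):
        blocks.setdefault(name[0].lower(), []).append(j)
    links = []
    for i, name in enumerate(db_a):
        for j in blocks.get(name[0].lower(), []):
            if similarity(name, db_b[j]) >= threshold:
                links.append((i, j))
    return links

a = ["smith, john", "doe, jane"]
b = ["Smith, Jon", "Roe, Richard"]
print(match_records(a, b))   # → [(0, 0)]
```

    Blocking makes the pair comparison count manageable ("doe, jane" is never compared at all here), which is exactly why indexing techniques dominate the scalability chapters of the data matching literature.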

  19. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery carries a high risk of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registration of Discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to general surgery of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered on the same patients. Once the AE were identified, they were classified according to damage and to the extent to which these could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools, and the Hanley and McNeil test was used to compare them. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS; this difference was statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.
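    The headline comparison, the fraction of true adverse events each tool flags, can be sketched as a sensitivity calculation. The admission data below are toy values, and the sketch does not reproduce the paper's ROC area or Hanley-McNeil computations:

```python
def sensitivity(detected_flags, gold_flags):
    """Fraction of true adverse events (per the gold standard) that
    a screening tool flags. A stand-in for the abstract's comparison,
    which additionally compares ROC areas (Hanley-McNeil test)."""
    true_events = [i for i, g in enumerate(gold_flags) if g]
    hits = sum(1 for i in true_events if detected_flags[i])
    return hits / len(true_events)

# Ten admissions; 1 = adverse event present / flagged (toy data).
gold    = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
trigger = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]   # high-sensitivity tool
mbds    = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]   # low-sensitivity registry
print(sensitivity(trigger, gold), sensitivity(mbds, gold))  # 0.8 0.4
```

    The real study's gap is of the same character: 89.9% detection for the Trigger tool against 28.48% for the MBDS registry.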

  20. Direct and indirect costs for adverse drug events identified in medical records across care levels, and their distribution among payers.

    Science.gov (United States)

    Natanaelsson, Jennie; Hakkarainen, Katja M; Hägg, Staffan; Andersson Sundell, Karolina; Petzold, Max; Rehnberg, Clas; Jönsson, Anna K; Gyllensten, Hanna

    2016-11-19

    Adverse drug events (ADEs) cause considerable costs in hospitals. However, little is known about the costs caused by ADEs outside hospitals, their effects on productivity, and how the costs are distributed among payers. The aims were to describe the direct and indirect costs caused by ADEs and their distribution among payers, and furthermore to describe the distribution of patient out-of-pocket costs and lost productivity caused by ADEs according to socio-economic characteristics. In a random sample of 5025 adults in a Swedish county, prevalence-based costs for ADEs were calculated. Two different methods were used: 1) based on resource use judged to be caused by ADEs, and 2) as costs attributable to ADEs, obtained by comparing costs among individuals with ADEs to costs among matched controls. Payers of costs caused by ADEs were identified in medical records among those with ADEs (n = 596), and costs caused to individual patients were described by socio-economic characteristics. Costs for resource use caused by ADEs were €505 per patient with ADEs (95% confidence interval €345-665), of which 38% were indirect costs. Compared to matched controls, the costs attributable to ADEs were €1631, of which €410 were indirect costs. The local health authorities paid 58% of the costs caused by ADEs. Women had higher productivity loss than men (€426 vs. €109, p = 0.018). Out-of-pocket costs consumed a larger proportion of disposable income among low-income earners than among higher-income earners (0.7% vs. 0.2%-0.3%). We used two methods to identify costs for ADEs, both identifying indirect costs as an important component of the overall costs for ADEs. Although the largest payers of costs caused by ADEs were the local health authorities responsible for direct costs, employers' and patients' costs for lost productivity contributed substantially. Our results indicate inequalities in costs caused by ADEs by sex and income. Copyright © 2016 Elsevier Inc. All rights reserved.
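    The second costing method, costs attributable to ADEs relative to matched controls, is a difference of means. The euro amounts below are illustrative, not the study's data:

```python
def mean(xs):
    """Arithmetic mean of a non-empty sequence."""
    return sum(xs) / len(xs)

def attributable_cost(ade_costs, control_costs):
    """Method 2 from the abstract: the cost attributable to ADEs is
    the mean cost among patients with ADEs minus the mean cost among
    their matched controls."""
    return mean(ade_costs) - mean(control_costs)

ade_patients     = [2100.0, 3400.0, 1900.0, 2800.0]  # EUR, illustrative
matched_controls = [800.0, 1100.0, 700.0, 900.0]
print(attributable_cost(ade_patients, matched_controls))  # 1675.0
```

    Unlike method 1, this estimate also captures costs that a record reviewer would not have judged to be caused by ADEs, which is one reason the attributable figure (€1631 in the study) exceeds the resource-use figure (€505).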