WorldWideScience

Sample records for based record analysis

  1. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it covers only company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted five major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses. BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria; it was negatively correlated with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy under leave-one-out cross-validation, indicating sufficient predictability of BMI from AutoMealRecord data. We conclude that the AutoMealRecord system merits further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
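The pipeline described in this record — dietary patterns extracted by principal component analysis, then a linear BMI model evaluated by leave-one-out cross-validation — can be sketched on synthetic data. All sizes, feature names, and coefficients below are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-employee meal-purchase features (n employees x p items).
n, p = 60, 8
X = rng.normal(size=(n, p))
# Synthetic BMI driven by the first feature plus noise (purely for illustration).
bmi = 22.0 + 1.5 * X[:, 0] + rng.normal(scale=0.5, size=n)

# PCA via SVD on centered data: rows of Vt are the "dietary pattern" directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T          # five pattern scores per employee

# Leave-one-out cross-validation of a linear BMI model on the pattern scores.
preds = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    A = np.column_stack([np.ones(mask.sum()), scores[mask]])
    beta, *_ = np.linalg.lstsq(A, bmi[mask], rcond=None)
    preds[i] = np.concatenate([[1.0], scores[i]]) @ beta

# Classify "would-be obese" (BMI >= 23) from the held-out predictions.
accuracy = np.mean((preds >= 23) == (bmi >= 23))
```

Leave-one-out refits the model n times, each time predicting the single held-out employee, which is how the 68.8% figure in the abstract was obtained on the real data.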

  2. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey

    Science.gov (United States)

    Rau, Hsiao-Hsien; Chen, Kang-Hua

    2017-01-01

Background: Empowering personal health records (PHRs) provides a basic human right, awareness, and intention for health promotion. As health care delivery shifts toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical records (EMR) repository for consumers. However, it is not yet a PHR; to date, there is no platform that lets patients manage their own PHRs. Objective: This study creates a vision of a value-added platform that analyzes personal health data and manages health records based on the contents of "My health bank." It aimed to examine consumer expectations regarding PHRs using the importance-performance analysis, exploring consumer perceptions of such a platform, identifying its key success factors and important aspects, and offering suggestions for future development. Methods: This was a cross-sectional study conducted in Taiwan. A Web-based invitation to participate was distributed through Facebook. Respondents were asked to watch an introductory movie about PHRs before filling in the questionnaire, which covered two aspects: (1) system functions, and (2) system design, security, and privacy, with 12 and 7 questions, respectively. Items were rated on a 5-point Likert scale ranging from 1 ("disagree strongly") to 5 ("agree strongly"). The questionnaire data were then analyzed with IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. Results: This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65.1%).

  3. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey.

    Science.gov (United States)

    Rau, Hsiao-Hsien; Wu, Yi-Syuan; Chu, Chi-Ming; Wang, Fu-Chung; Hsu, Min-Huei; Chang, Chi-Wen; Chen, Kang-Hua; Lee, Yen-Liang; Kao, Senyeong; Chiu, Yu-Lung; Wen, Hsyien-Chia; Fuad, Anis; Hsu, Chien-Yeh; Chiu, Hung-Wen

    2017-04-27

Empowering personal health records (PHRs) provides a basic human right, awareness, and intention for health promotion. As health care delivery shifts toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical records (EMR) repository for consumers. However, it is not yet a PHR; to date, there is no platform that lets patients manage their own PHRs. This study creates a vision of a value-added platform that analyzes personal health data and manages health records based on the contents of "My health bank." It aimed to examine consumer expectations regarding PHRs using the importance-performance analysis, exploring consumer perceptions of such a platform, identifying its key success factors and important aspects, and offering suggestions for future development. This is a cross-sectional study conducted in Taiwan. A Web-based invitation to participate was distributed through Facebook. Respondents were asked to watch an introductory movie about PHRs before filling in the questionnaire, which covered two aspects: (1) system functions, and (2) system design, security, and privacy, with 12 and 7 questions, respectively. Items were rated on a 5-point Likert scale ranging from 1 ("disagree strongly") to 5 ("agree strongly"). The questionnaire data were then analyzed with IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65.1%). They were still students (195 out of 350
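Importance-performance analysis, as used in this survey, places each attribute on an importance-vs-performance grid and cuts it into four action quadrants at the grand means. A minimal sketch, with made-up PHR attributes and ratings (not the survey's results):

```python
import numpy as np

# Hypothetical mean importance/performance ratings (1-5 Likert scale) for
# illustrative PHR platform attributes; the values are invented.
attrs = ["data security", "export to PDF", "charting", "login speed"]
importance = np.array([4.8, 3.2, 4.1, 3.9])
performance = np.array([3.5, 4.0, 4.2, 2.9])

# The quadrant boundaries sit at the grand means of the two axes.
i_cut, p_cut = importance.mean(), performance.mean()

def quadrant(i, p):
    if i >= i_cut and p < p_cut:
        return "concentrate here"          # important but underperforming
    if i >= i_cut and p >= p_cut:
        return "keep up the good work"
    if i < i_cut and p >= p_cut:
        return "possible overkill"
    return "low priority"

ipa = {a: quadrant(i, p) for a, i, p in zip(attrs, importance, performance)}
```

Attributes landing in the "concentrate here" quadrant are the key improvement targets the abstract refers to.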

  4. Heart rhythm analysis using ECG recorded with a novel sternum based patch technology

    DEFF Research Database (Denmark)

    Saadi, Dorthe B.; Fauerskov, Inge; Osmanagic, Armin

    2013-01-01

The novel wireless sternum patch technology is designed for easy, reliable long-term ECG recordings, with high compliance and low patient burden, and is CE approved for ambulatory recording of two ECG channels on the sternum. This paper describes a clinical pilot study on the usefulness of these ECG signals for heart rhythm analysis. A clinical technician with experience in ECG interpretation selected 200 noise-free 7-second ECG segments from 25 different patients. These 200 ECG segments were evaluated by two medical doctors according to their usefulness for heart rhythm analysis. The first doctor considered 98.5% of the segments useful for rhythm analysis, whereas the second doctor considered 99.5% of the segments useful. This pilot study indicates that two-channel ECG recorded on the sternum is useful for rhythm analysis and could be used as input to diagnosis…

  5. Borneo: a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    OpenAIRE

    Raes, Niels

    2009-01-01

Based on the digitized herbarium records housed at the National Herbarium of the Netherlands, I developed high-spatial-resolution patterns of Borneo's botanical richness, endemicity, and floristic regions. The patterns are derived from species distribution models, which predict a species' occurrence based on the identified relationships between the species' recorded presences and the ecological circumstances at those localities. A new statistical method was developed to test the species distribut…

  6. How do repeat suicide attempters differ from first timers? An exploratory record based analysis

    Directory of Open Access Journals (Sweden)

    Vikas Menon

    2016-01-01

Full Text Available Background: Evidence indicates that repeat suicide attempters, as a group, may differ from first-time attempters. The identification of repeat attempters is a powerful but underutilized clinical variable. Aims: In this research, we aimed to compare individuals with lifetime histories of multiple attempts against first-time attempters to identify factors predictive of repeat attempts. Setting and Design: This was a retrospective record-based study carried out at a teaching-cum-tertiary-care hospital in South India. Methods: Relevant data were extracted from the clinical records of first-time attempters (n = 362) and repeat attempters (n = 61) presenting to a single tertiary care center over a 4½-year period. They were compared on various sociodemographic and clinical parameters. The clinical measures included the Presumptive Stressful Life Events Scale, Beck Hopelessness Scale, Coping Strategies Inventory - Short Form, and the Global Assessment of Functioning Scale. Statistical Analysis Used: First-time attempters and repeaters were compared using appropriate inferential statistics. Logistic regression was used to identify independent predictors of repeat attempts. Results: The two groups did not differ significantly on sociodemographic characteristics. Repeat attempters were more likely to have given prior hints about their act (χ² = 4.500, P = 0.034). In the final regression model, the Beck hopelessness score emerged as a significant predictor of repeat suicide attempts (odds ratio = 1.064, P = 0.020). Conclusion: Among suicide attempters presenting to the hospital, the presence of hopelessness is a predictor of repeat suicide attempts, independent of clinical depression. This highlights the importance of considering hopelessness in the assessment of suicidality with a view to minimizing the risk of future attempts.
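The group comparison reported above (e.g. prior hints vs. repeat-attempter status, χ² = 4.500) is a Pearson chi-square test on a 2x2 contingency table. A minimal sketch with invented counts (not the study's data, so the statistic differs from the reported one):

```python
import numpy as np

# Hypothetical 2x2 table: rows = prior hints given (yes/no),
# columns = repeat attempter (yes/no). Counts are illustrative only.
obs = np.array([[40, 15],
                [322, 46]], dtype=float)

row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
exp = row @ col / obs.sum()              # expected counts under independence
chi2 = ((obs - exp) ** 2 / exp).sum()    # Pearson chi-square statistic
```

The statistic is then compared against the chi-square distribution with (rows-1)(cols-1) = 1 degree of freedom to obtain the P value.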

  7. Borneo : a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    NARCIS (Netherlands)

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands I developed high spatial resolution patterns of Borneo's botanical richness, endemicity, and the floristic regions. The patterns are derived from species distribution models which predict a species

  8. Towards successful coordination of electronic health record based-referrals: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Paul Lindsey A

    2011-07-01

Full Text Available Abstract Background: Successful subspecialty referrals require considerable coordination and interactive communication among the primary care provider (PCP), the subspecialist, and the patient, which may be challenging in the outpatient setting. Even when referrals are facilitated by electronic health records (EHRs), i.e., e-referrals, lapses in patient follow-up might occur. Although compelling reasons exist why referral coordination should be improved, little is known about which elements of the complex referral coordination process should be targeted for improvement. Using Okhuysen and Bechky's coordination framework, this paper aims to understand the barriers, facilitators, and suggestions for improving communication and coordination of EHR-based referrals in an integrated healthcare system. Methods: We conducted a qualitative study to understand coordination breakdowns related to e-referrals in an integrated healthcare system and examined work-system factors that affect the timely receipt of subspecialty care. We conducted interviews with seven subject matter experts and six focus groups with a total of 30 PCPs and subspecialists at two tertiary care Department of Veterans Affairs (VA) medical centers. Using techniques from grounded theory and content analysis, we identified organizational themes that affected the referral process. Results: Four themes emerged: lack of an institutional referral policy, lack of standardization in certain referral procedures, ambiguity in roles and responsibilities, and inadequate resources to adapt and respond to referral requests effectively. Marked differences in PCPs' and subspecialists' communication styles and individual mental models of the referral process likely precluded the development of a shared mental model to facilitate coordination and successful referral completion. Notably, very few barriers related to the EHR were reported. Conclusions: Despite facilitating information transfer between PCPs and

  9. A theoretical analysis of spatial/temporal modulation-based systems for prevention of illegal recordings in movie theaters

    Science.gov (United States)

    Bourdon, Pascal; Thiebaud, Sylvain; Doyen, Didier

    2008-02-01

    This document proposes a convenient theoretical analysis of light modulation-based systems for prevention of illegal recordings in movie theaters. Although the works presented in this paper do not solve the problem of camcorder piracy, people in the security community may find them interesting for further work in this area.

  10. Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis

    DEFF Research Database (Denmark)

    Rasmussen, Sune Olander; Andersen, Katrine K.; Johnsen, Sigfus Johann

    2005-01-01

Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas, there is still a need for further improvement of the resolution. Here a method for resolution enhancement of CFA data is presented. It is demonstrated that it is possible to improve the resolution of CFA data by restoring some of the detail that was lost in the measuring process, thus improving the usefulness…
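The core idea — the CFA system smooths the true signal with a known impulse response, and deconvolution restores lost detail — can be sketched with a regularized (Wiener-style) FFT deconvolution. The signal, kernel width, and regularization constant below are all synthetic assumptions, not values from the paper:

```python
import numpy as np

# Synthetic "true" chemistry record: two narrow peaks along depth.
n = 256
depth = np.arange(n)
true = (np.exp(-0.5 * ((depth - 80) / 2.0) ** 2)
        + np.exp(-0.5 * ((depth - 96) / 2.0) ** 2))

# CFA smearing modeled as convolution with a known Gaussian impulse response.
h = np.exp(-0.5 * ((np.arange(n) - n // 2) / 6.0) ** 2)
h /= h.sum()
H = np.fft.rfft(np.roll(h, -n // 2))      # center the kernel to avoid a shift
measured = np.fft.irfft(np.fft.rfft(true) * H, n)

# Wiener-style deconvolution: damp frequencies where the kernel response is weak.
eps = 1e-3                                # regularization constant, chosen by eye
M = np.fft.rfft(measured)
restored = np.fft.irfft(M * np.conj(H) / (np.abs(H) ** 2 + eps), n)

# Valley-to-peak ratio between the two peaks: lower means better resolved.
sep_measured = measured[88] / measured[80]
sep_restored = restored[88] / restored[80]
```

Without the `eps` term this would be exact inverse filtering, which blows up measurement noise at high frequencies; the regularization trades some restored detail for noise robustness.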

  11. Analysis of the security and privacy requirements of cloud-based electronic health records systems.

    Science.gov (United States)

    Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel

    2013-08-21

The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. This shows that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access.
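Role-based access, the first security mechanism the abstract lists, maps each role to a set of permitted actions and checks requests against that mapping. A minimal sketch, with roles, permissions, and field names invented for illustration:

```python
# Minimal role-based access control (RBAC) sketch for an EHR service.
# The roles and permission names below are hypothetical examples.
PERMISSIONS = {
    "physician": {"read_record", "write_record"},
    "nurse":     {"read_record"},
    "billing":   {"read_billing"},
}

def authorize(role: str, action: str) -> bool:
    """Allow an action only if the role's permission set contains it."""
    return action in PERMISSIONS.get(role, set())

# Unknown roles and unlisted actions are denied by default.
assert authorize("physician", "write_record")
assert not authorize("nurse", "write_record")
assert not authorize("unknown", "read_record")
```

In a real deployment the permission table would live in the provider's identity service, and every access decision would also be logged to support the access monitoring the abstract mentions.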

  12. A Quantitative Analysis of an EEG Epileptic Record Based on Multiresolution Wavelet Coefficients

    Directory of Open Access Journals (Sweden)

    Mariel Rosenblatt

    2014-11-01

Full Text Available The characterization of the dynamics associated with the electroencephalogram (EEG) signal, combining an orthogonal discrete wavelet transform analysis with quantifiers originating from information theory, is reviewed. In addition, an extension of this methodology based on multiresolution quantities, called wavelet leaders, is presented. In particular, the temporal evolution of Shannon entropy and the statistical complexity evaluated with different sets of multiresolution wavelet coefficients is considered. Both methodologies are applied to the quantitative EEG time series analysis of a tonic-clonic epileptic seizure, and comparative results are presented. Even though both methods describe the dynamical changes of the EEG time series, the one based on wavelet leaders presents better time resolution.
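The Shannon entropy quantifier used here is computed from the relative wavelet energy: decompose the signal, take the energy fraction per resolution level, and evaluate the entropy of that distribution. A self-contained sketch using a Haar wavelet decomposition (a simpler wavelet than EEG studies typically use, chosen so the example needs no wavelet library):

```python
import numpy as np

def haar_dwt(x):
    """Full Haar wavelet decomposition; returns detail coefficients per level."""
    x = np.asarray(x, dtype=float)
    details = []
    while len(x) > 1:
        a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
        details.append(d)
        x = a
    return details

def wavelet_entropy(x):
    """Shannon entropy of the relative wavelet energy across levels."""
    details = haar_dwt(x)
    energies = np.array([np.sum(d ** 2) for d in details])
    p = energies / energies.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(2)
noise = rng.normal(size=256)                            # broadband signal
tone = np.sin(2 * np.pi * 32 * np.arange(256) / 256)    # narrowband signal

H_noise, H_tone = wavelet_entropy(noise), wavelet_entropy(tone)
```

Broadband activity spreads energy across levels and yields high entropy, while rhythmic (seizure-like) activity concentrates energy at few levels and yields low entropy; tracking this quantity over sliding windows gives the temporal evolution the abstract describes.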

  13. NLP based congestive heart failure case finding: A prospective analysis on statewide electronic medical records.

    Science.gov (United States)

    Wang, Yue; Luo, Jin; Hao, Shiying; Xu, Haihua; Shin, Andrew Young; Jin, Bo; Liu, Rui; Deng, Xiaohong; Wang, Lijuan; Zheng, Le; Zhao, Yifan; Zhu, Chunqing; Hu, Zhongkai; Fu, Changlin; Hao, Yanpeng; Zhao, Yingzhen; Jiang, Yunliang; Dai, Dorothy; Culver, Devore S; Alfreds, Shaun T; Todd, Rogow; Stearns, Frank; Sylvester, Karl G; Widen, Eric; Ling, Xuefeng B

    2015-12-01

In order to proactively manage congestive heart failure (CHF) patients, an effective CHF case-finding algorithm is required that processes both structured and unstructured electronic medical records (EMRs), allowing complementary and cost-efficient identification of CHF patients. We set out to identify CHF cases from both EMR-codified and natural language processing (NLP)-found cases. Using narrative clinical notes from all Maine Health Information Exchange (HIE) patients, the NLP case-finding algorithm was retrospectively (July 1, 2012-June 30, 2013) developed with a random subset of HIE-associated facilities and blind-tested with the remaining facilities. The NLP-based method was integrated into a live HIE population exploration system and validated prospectively (July 1, 2013-June 30, 2014). A total of 18,295 codified CHF patients were included in the Maine HIE. Among the 253,803 subjects without CHF codings, our case-finding algorithm prospectively identified 2411 uncodified CHF cases. The positive predictive value (PPV) was 0.914, and 70.1% of these 2411 cases were found to have CHF histories in the clinical notes. A CHF case-finding algorithm was developed, tested, and prospectively validated. The successful integration of the CHF case-finding algorithm into the live Maine HIE system is expected to improve CHF care in Maine. Copyright © 2015. Published by Elsevier Ireland Ltd.
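The shape of such a system — flag free-text notes that mention the condition, then score the flags against a reference standard with positive predictive value, PPV = TP / (TP + FP) — can be sketched with a toy keyword matcher. The notes, pattern, and labels below are invented and far simpler than the paper's NLP pipeline:

```python
import re

# Toy labeled notes: (text, truly a CHF case?). All examples are invented.
notes = [
    ("pt with congestive heart failure, on furosemide", True),
    ("history of CHF, EF 30%", True),
    ("no evidence of heart failure", False),
    ("chief complaint: ankle sprain", False),
    ("r/o CHF", False),                       # rule-out note: a false positive
]

# Naive keyword-based case finding over the free text.
pattern = re.compile(r"congestive heart failure|\bCHF\b", re.I)

flagged = [(pattern.search(text) is not None, label) for text, label in notes]
tp = sum(1 for pred, label in flagged if pred and label)
fp = sum(1 for pred, label in flagged if pred and not label)
ppv = tp / (tp + fp)                          # positive predictive value
```

The "r/o CHF" note shows why a real system needs negation and context handling rather than bare keyword hits: it is flagged but is not a true case, dragging PPV down to 2/3 here.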

  14. Heart rhythm analysis using ECG recorded with a novel sternum based patch technology

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt; Fauerskov, Inge; Osmanagic, Armin

    2013-01-01

According to the World Health Organization, cardiovascular diseases are the number one cause of death globally. Early diagnosis and treatment of many of these patients depend on ambulatory electrocardiography recordings. Therefore, a novel wireless patch technology has been designed for easy, reliable long-term ECG recordings. The device is designed for high compliance and low patient burden. This novel patch technology is CE approved for ambulatory ECG recording of two ECG channels on the sternum. This paper describes a clinical pilot study regarding the usefulness of these ECG signals… together with other clinical tests and medical history…

  15. Understanding the Connection Between Traumatic Brain Injury and Alzheimer’s Disease: A Population Based Medical Record Review Analysis

    Science.gov (United States)

    2016-10-01

Award number: W81XWH-15-1-0573. Title: Understanding the Connection Between Traumatic Brain Injury and Alzheimer's Disease: A Population-Based Medical Record Review Analysis. Reporting period: 15 Sep 2015 - 14 Sep 2016. Author: Allen W. Brown.

  16. Monitoring sleep depth: analysis of bispectral index (BIS) based on polysomnographic recordings and sleep deprivation.

    Science.gov (United States)

    Giménez, Sandra; Romero, Sergio; Alonso, Joan Francesc; Mañanas, Miguel Ángel; Pujol, Anna; Baxarias, Pilar; Antonijoan, Rosa Maria

    2017-02-01

The assessment and management of sleep are increasingly recommended in clinical practice. Polysomnography (PSG) is considered the gold standard test for monitoring sleep objectively, but some practical and technical constraints exist due to environmental and patient considerations. Bispectral index (BIS) monitoring is commonly used in clinical practice for guiding anesthetic administration and provides an index based on relationships between EEG components. Due to similarities in EEG synchronization between anesthesia and sleep, several studies have assessed BIS as a sleep monitor, with contradictory results. The aim of this study was to evaluate objectively both the feasibility and reliability of BIS for sleep monitoring through a robust methodology, which included full PSG recordings at baseline and after 40 h of sleep deprivation. Results confirmed that the BIS index was highly correlated with the hypnogram (0.89 ± 0.02), showing a progressive decrease as sleep deepened and an increase during REM sleep (awake: 91.77 ± 8.42; stage N1: 83.95 ± 11.05; stage N2: 71.71 ± 11.99; stage N3: 42.41 ± 9.14; REM: 80.11 ± 8.73). Mean and median BIS values were lower in the post-deprivation night than in the baseline night, with statistically significant differences for slow wave sleep (baseline: 42.41 ± 9.14 vs. post-deprivation: 39.49 ± 10.27; p = 0.02). BIS scores were able to discriminate properly between deep (N3) and light (N1, N2) sleep. BIS values during REM overlapped those of other sleep stages, although EMG activity provided by the BIS monitor could help to identify REM sleep if needed. In conclusion, BIS monitors could provide a useful measure of sleep depth in particular situations such as intensive care units, and they could be used as an alternative for sleep monitoring in order to reduce PSG-derived costs and to increase capacity in ambulatory care.

  17. A simulation-based performance analysis of a National Electronic Health Record System.

    Science.gov (United States)

    Orfanidis, Leonidas; Bamidis, Panagiotis D; Eaglestone, Barry

    2007-01-01

This paper addresses, through simulation experiments, a number of technical issues that arise during the development and operation of a National Electronic Health Record System (NEHRS). The simulation experiments represent NEHRS performance for a variety of technological infrastructures within the context of a realistic scenario. The scenario includes estimation of the delays created in queues during the exchange of Electronic Patient Records (EPRs) between different health service points. It is essential to clarify that these delays derive from LAN and Internet technologies, EPR encryption/decryption, HL7 message generation/parsing, and the databases. The results of this study identify how a number of technical aspects influence NEHRS development and operation.
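The kind of queueing delay estimated in such simulations can be sketched with a minimal single-server queue: EPR exchange requests arrive at random, wait if the server is busy, and their sojourn time (wait plus service) is averaged. The arrival and service rates below are invented, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy single-server queue for EPR exchange requests: Poisson arrivals,
# exponential service times. Rates (requests/second) are illustrative.
lam, mu, n_req = 8.0, 10.0, 5000

arrivals = np.cumsum(rng.exponential(1 / lam, n_req))   # arrival instants
service = rng.exponential(1 / mu, n_req)                # service durations

start = np.empty(n_req)
finish = np.empty(n_req)
t_free = 0.0
for i in range(n_req):
    start[i] = max(arrivals[i], t_free)   # wait if the server is still busy
    finish[i] = start[i] + service[i]
    t_free = finish[i]

mean_delay = np.mean(finish - arrivals)   # average sojourn time per request
# For an M/M/1 queue, theory predicts a mean sojourn time of 1/(mu - lam) = 0.5 s.
```

Replacing the exponential service time with measured encryption, HL7 parsing, and database latencies, and chaining several such stations, gives the multi-stage delay model a NEHRS simulation needs.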

  18. Magnetoencephalography recording and analysis

    Directory of Open Access Journals (Sweden)

    Jayabal Velmurugan

    2014-01-01

Full Text Available Magnetoencephalography (MEG) non-invasively measures the magnetic field generated by the excitatory postsynaptic electrical activity of the apical dendritic pyramidal cells. Such a tiny magnetic field is measured with the help of biomagnetometer sensors coupled with a Superconducting Quantum Interference Device (SQUID) inside a magnetically shielded room (MSR). Subjects are usually screened for the presence of ferromagnetic materials, and then the head position indicator coils, electroencephalography (EEG) electrodes (if measured simultaneously), and fiducials are digitized using a 3D digitizer, which aids in movement correction and in transferring the MEG data from head coordinates to device and voxel coordinates, thereby enabling more accurate co-registration and localization. MEG data pre-processing involves filtering the data for environmental and subject interference, and artefact identification and rejection. Magnetic resonance imaging (MRI) is processed for correction and for identifying fiducials. After choosing and computing the appropriate head model (spherical or realistic; boundary/finite element model), the interictal/ictal epileptiform discharges are selected and modeled by an appropriate source modeling technique (clinically, the most commonly used is the single equivalent current dipole, or ECD, model). The equivalent current dipole (ECD) source localization of the modeled interictal epileptiform discharge (IED) is considered physiologically valid or acceptable based on waveform morphology, isofield pattern, and dipole parameters (localization, dipole moment, confidence volume, goodness of fit). Thus, MEG source localization can aid clinicians in sublobar localization, lateralization, and grid placement by delineating the irritative/seizure onset zone. It also accurately localizes eloquent cortex, such as visual and language areas. MEG also aids in diagnosing and delineating multiple novel findings in other neuropsychiatric
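Among the dipole acceptance criteria listed above, goodness of fit is the most mechanical: it measures how much of the measured sensor field the modeled dipole field explains, commonly as 1 minus the normalized residual variance. A sketch on synthetic sensor values (the field data and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic sensor-level field values: "measured" data and a close dipole model.
measured = rng.normal(size=100)
modeled = measured + rng.normal(scale=0.1, size=100)   # small residual error

# Goodness of fit: fraction of measured field variance explained by the model.
gof = 1 - np.sum((measured - modeled) ** 2) / np.sum(measured ** 2)
```

A GOF near 1 means the dipole reproduces the isofield pattern well; clinics typically also require the other criteria (confidence volume, dipole moment, morphology) before accepting a localization.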

  19. Monitoring sleep depth: analysis of bispectral index (BIS) based on polysomnographic recordings and sleep deprivation

    OpenAIRE

    Giménez Badia, Sandra; Romero Lafuente, Sergio; Alonso López, Joan Francesc; Mañanas Villanueva, Miguel Ángel; Pujol, Anna; Baxarias, Pilar; Antonijoan Arbós, Rosa Maria

    2015-01-01

    © 2015 Springer Science+Business Media Dordrecht The assessment and management of sleep are increasingly recommended in the clinical practice. Polysomnography (PSG) is considered the gold standard test to monitor sleep objectively, but some practical and technical constraints exist due to environmental and patient considerations. Bispectral index (BIS) monitoring is commonly used in clinical practice for guiding anesthetic administration and provides an index based on relationships between...

  20. Theoretical analysis of intracortical microelectrode recordings

    Science.gov (United States)

    Lempka, Scott F.; Johnson, Matthew D.; Moffitt, Michael A.; Otto, Kevin J.; Kipke, Daryl R.; McIntyre, Cameron C.

    2011-08-01

    Advanced fabrication techniques have now made it possible to produce microelectrode arrays for recording the electrical activity of a large number of neurons in the intact brain for both clinical and basic science applications. However, the long-term recording performance desired for these applications is hindered by a number of factors that lead to device failure or a poor signal-to-noise ratio (SNR). The goal of this study was to identify factors that can affect recording quality using theoretical analysis of intracortical microelectrode recordings of single-unit activity. Extracellular microelectrode recordings were simulated with a detailed multi-compartment cable model of a pyramidal neuron coupled to a finite-element volume conductor head model containing an implanted recording microelectrode. Recording noise sources were also incorporated into the overall modeling infrastructure. The analyses of this study would be very difficult to perform experimentally; however, our model-based approach enabled a systematic investigation of the effects of a large number of variables on recording quality. Our results demonstrate that recording amplitude and noise are relatively independent of microelectrode size, but instead are primarily affected by the selected recording bandwidth, impedance of the electrode-tissue interface and the density and firing rates of neurons surrounding the recording electrode. This study provides the theoretical groundwork that allows for the design of the microelectrode and recording electronics such that the SNR is maximized. Such advances could help enable the long-term functionality required for chronic neural recording applications.
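The SNR the study optimizes is typically computed as spike amplitude over background noise, e.g. peak-to-peak spike voltage divided by the noise RMS. A sketch on a synthetic biphasic spike (the waveform shape, amplitudes, and noise level are illustrative assumptions, not the model's outputs):

```python
import numpy as np

# Synthetic extracellular spike: 3 ms window sampled at 30 kHz, in volts.
t = np.arange(0, 3e-3, 1 / 30e3)
spike = (-80e-6 * np.exp(-(((t - 1.0e-3) / 2e-4) ** 2))   # negative trough
         + 30e-6 * np.exp(-(((t - 1.5e-3) / 3e-4) ** 2))) # positive rebound

noise_rms = 10e-6            # assumed background noise RMS (10 microvolts)

# Peak-to-peak amplitude over noise RMS, and the same figure in decibels.
snr = (spike.max() - spike.min()) / noise_rms
snr_db = 20 * np.log10(snr)
```

The paper's conclusion that amplitude and noise depend mainly on recording bandwidth, interface impedance, and nearby firing rates translates here into how `spike` and `noise_rms` change with those parameters, while the electrode size leaves both largely untouched.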

  1. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis

    Science.gov (United States)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan

    2011-06-01

Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction and an unsupervised clustering method, is described here. Wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure, instead of the Euclidean distance; the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels, and electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery, are used to evaluate the performance of GSLC. Feature extraction results from the use of WT with the KS test indicate a reduced number of feature coefficients, as well as good noise rejection, despite similar spike waveforms. Accordingly, the use of GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results on the electrophysiological data indicate that the quality of spike sorting with GSLC is adequate.
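The grey relational grade that GSLC uses as its similarity measure can be sketched directly: per-sample absolute deviations from a reference waveform are mapped to relational coefficients and averaged. The waveforms below are synthetic, and the distinguishing coefficient ζ = 0.5 is the conventional default rather than a value from the paper:

```python
import numpy as np

def grey_relational_grade(ref, seqs, zeta=0.5):
    """Grey relational grade of each sequence against a reference waveform."""
    deltas = np.abs(seqs - ref)                  # |x0(k) - xi(k)| per sample
    d_min, d_max = deltas.min(), deltas.max()    # global extremes over all i, k
    coeff = (d_min + zeta * d_max) / (deltas + zeta * d_max)
    return coeff.mean(axis=1)                    # mean coefficient per sequence

t = np.linspace(0, 1, 32)
ref = np.sin(2 * np.pi * t)                      # reference "spike" shape
similar = ref + 0.05                             # near-identical waveform
different = np.cos(2 * np.pi * t)                # dissimilar waveform

grades = grey_relational_grade(ref, np.vstack([similar, different]))
```

Grades lie in (0, 1], with higher values for more similar waveforms; single linkage clustering then merges spikes whose pairwise grade exceeds a threshold, and the elbow criterion on the threshold-cumulative distribution picks the cluster count.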

  2. Time trends and risk factors for diabetes mellitus in dogs: analysis of veterinary medical data base records (1970-1999).

    Science.gov (United States)

    Guptill, L; Glickman, L; Glickman, N

    2003-05-01

The objectives of the study were to identify recent trends in the prevalence of diabetes mellitus (DM) in dogs and to identify host risk factors. Veterinary Medical Data Base (VMDB) electronic records of 6860 dogs with a diagnosis of DM (VMDB code 870178500) between 1970 and 1999 were evaluated to determine time trends. Records of 6707 dogs with DM and 6707 frequency-matched dogs with any diagnosis other than DM from the same teaching hospitals in the same year, selected as controls, were evaluated for risk factor analysis. The prevalence of DM in dogs presented to veterinary teaching hospitals increased from 19 cases per 10,000 admissions per year in 1970 to 64 cases per 10,000 in 1999, while the case-fatality rate decreased from 37% to 5%. The hospital prevalence of DM was consistently greater over time in older compared with younger dogs, with the highest prevalence occurring in dogs 10-15 years of age. Lighter dogs had an increased risk of DM compared with heavier dogs. Female dogs had an increased risk of DM compared with males (P<0.001).

  3. Microcomputer-based system for 24-hour recording of oesophageal motility and pH profile with automated analysis

    NARCIS (Netherlands)

    Breedijk, M.; Smout, A. J.; van der Zouw, C.; Verwey, H.; Akkermans, L. M.

    1989-01-01

    A system developed for long-term simultaneous recording of oesophageal motility and pH in the ambulant patient is described. The system consists of a microprocessor-based data-acquisition and preprocessing device and a personal computer for postprocessing, report generation and data storage.

  4. Understanding the Connection Between Traumatic Brain Injury and Alzheimer’s Disease: A Population-Based Medical Record Review Analysis

    Science.gov (United States)

    2017-10-01

    disorders (ADRD) is to identify incident TBI events by medical record review within a defined population and classify each by injury severity, identify...matched referents within that same population, and follow both cohorts over time to observe incidence rates of ADRD. Scope: Compared to other study...matched to their population-based controls. Subject terms: population; epidemiology; dementia; neurocognitive disorders; brain injuries; Parkinsonian

  5. Cenozoic climate changes: A review based on time series analysis of marine benthic δ18O records

    Science.gov (United States)

    Mudelsee, Manfred; Bickert, Torsten; Lear, Caroline H.; Lohmann, Gerrit

    2014-09-01

    The climate during the Cenozoic era changed in several steps from ice-free poles and warm conditions to ice-covered poles and cold conditions. Since the 1950s, a body of information on ice volume and temperature changes has been built up predominantly on the basis of measurements of the oxygen isotopic composition of shells of benthic foraminifera collected from marine sediment cores. The statistical methodology of time series analysis has also evolved, allowing more information to be extracted from these records. Here we provide a comprehensive view of Cenozoic climate evolution by means of a coherent and systematic application of time series analytical tools to each record from a compilation spanning the interval from 4 to 61 Myr ago. We quantitatively describe several prominent features of the oxygen isotope record, taking into account the various sources of uncertainty (including measurement, proxy noise, and dating errors). The estimated transition times and amplitudes allow us to assess causal climatological-tectonic influences on the following known features of the Cenozoic oxygen isotopic record: Paleocene-Eocene Thermal Maximum, Eocene-Oligocene Transition, Oligocene-Miocene Boundary, and the Middle Miocene Climate Optimum. We further describe and causally interpret the following features: Paleocene-Eocene warming trend, the two-step, long-term Eocene cooling, and the changes within the most recent interval (Miocene-Pliocene). We review the scope and methods of constructing Cenozoic stacks of benthic oxygen isotope records and present two new latitudinal stacks, which capture besides global ice volume also bottom water temperatures at low (less than 30°) and high latitudes. This review concludes with an identification of future directions for data collection, statistical method development, and climate modeling.
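    The transition times and amplitudes mentioned above are typically estimated by fitting a ramp function to the record. A brute-force least-squares sketch under simplified assumptions (evenly sampled data, no explicit proxy-noise or dating-error model; all names hypothetical):

```python
import numpy as np

def fit_ramp(t, y):
    # Brute-force least squares over all (t1, t2) change points:
    # y = x1 for t <= t1, linear ramp to x2 at t2, then y = x2.
    best = None
    for i in range(1, len(t) - 2):
        for j in range(i + 1, len(t) - 1):
            x1, x2 = y[:i + 1].mean(), y[j:].mean()
            model = np.interp(t, [t[0], t[i], t[j], t[-1]], [x1, x1, x2, x2])
            sse = ((y - model) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, t[i], t[j], x1, x2)
    return best[1:]

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 60)
truth = np.interp(t, [0.0, 4.0, 6.0, 10.0], [1.0, 1.0, 3.0, 3.0])  # transition at 4-6
y = truth + 0.05 * rng.standard_normal(t.size)
t1, t2, x1, x2 = fit_ramp(t, y)
```

    The recovered (t1, t2) are the transition start and end times, and x2 - x1 is the transition amplitude.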

  6. Analysis of intracranial pressure recordings: comparison of PCA and signal averaging based filtering methods and signal period estimation.

    Science.gov (United States)

    Calisto, A; Galeano, M; Bramanti, A; Angileri, F; Campobello, G; Serrano, S; Azzerboni, B

    2010-01-01

    Intracranial pressure (ICP) monitoring is a commonly used approach in neuro-intensive care in cases of brain damage and injury, or to investigate chronic pathologies. Several types of noise and artifacts normally contaminate ICP recordings. They can be sorted into two classes: high-frequency noise (due to measurement and amplifier devices or the electricity supply) and low-frequency noise (due to unwanted patient movement, speech or coughing during the recording, and quantization noise). Thus, deep investigation of ICP components aimed at extracting features from the ICP signal requires a denoised signal. For this reason the authors conducted a study of the most common filtering techniques. On each ICP recording we applied four filter configurations, which involve the use of a FIR filter together with signal-averaging filters or PCA-based filters. The next step is period estimation for absolute minima detection. The results obtained by the algorithm for automatic ICP marking are compared to those obtained from manual marking (peaks are manually identified and annotated by a brain surgeon). The procedure is repeated varying the filters' sliding-window size to minimize the mean square error. The results show that the FIR filter + signal averaging configuration provides a smaller mean squared error (MSE = 118.84 samples²) than the other three FIR filter + PCA-based configurations (MSE = 135.29-147.15 samples²).
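    A hypothetical miniature of the FIR + signal-averaging configuration (synthetic data and illustrative parameters only, not the authors' pipeline): a noisy pulsatile ICP-like signal is low-pass filtered with a FIR filter, the pulses are stacked and averaged, and the mean squared error against the known clean pulse is computed.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 100.0                                   # Hz
t = np.arange(0.0, 20.0, 1.0 / fs)
period = 1.0                                 # one ICP-like pulse per second
clean = np.sin(np.pi * (t % period) / period) ** 2
noisy = clean + 0.3 * rng.standard_normal(t.size)

# Low-pass FIR filter (applied zero-phase) to suppress high-frequency noise.
taps = signal.firwin(numtaps=101, cutoff=10.0, fs=fs)
filtered = signal.filtfilt(taps, [1.0], noisy)

# Signal averaging: stack the pulses and average them.
n = int(period * fs)
beats = filtered[: (filtered.size // n) * n].reshape(-1, n)
averaged = beats.mean(axis=0)

mse = np.mean((averaged - clean[:n]) ** 2)   # error against the known clean pulse
```

    Sweeping the averaging-window size and comparing MSE values, as the abstract describes, would then select the best configuration.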

  7. The Use of Continuous Wavelet Transform Based on the Fast Fourier Transform in the Analysis of Multi-channel Electrogastrography Recordings.

    Science.gov (United States)

    Komorowski, Dariusz; Pietraszek, Stanislaw

    2016-01-01

    This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the scale of observation (analysis). Wavelet analysis allows using long time windows when more precise low-frequency information is needed, and shorter windows when high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method makes it possible to obtain results for relatively long EGG records in a fairly short time, much faster than with the classical methods based on running spectrum analysis (RSA). In this study the authors indicate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown in an example analysis of four-channel EGG recordings performed for a non-caloric meal.
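    The core trick, computing the CWT as a circular convolution with one FFT of the signal and one FFT per wavelet scale, can be sketched as follows (a simplified Morlet implementation with illustrative normalization, not the authors' code):

```python
import numpy as np

def morlet(n, scale, w0=6.0):
    # Morlet wavelet sampled on n points, centred, L2-ish normalisation.
    x = np.arange(-(n // 2), n // 2) / scale
    return np.pi ** -0.25 * np.exp(1j * w0 * x - x ** 2 / 2.0) / np.sqrt(scale)

def cwt_fft(sig, scales):
    # CWT via circular convolution: one FFT of the signal, one per scale.
    n = sig.size
    sig_f = np.fft.fft(sig)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        w_f = np.fft.fft(np.fft.ifftshift(morlet(n, s)))
        out[i] = np.fft.ifft(sig_f * np.conj(w_f))
    return out

fs = 100.0
t = np.arange(0.0, 4.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 5.0 * t)              # 5 Hz test tone
scales = np.arange(2, 40)                      # in samples
power = np.abs(cwt_fft(sig, scales)) ** 2
best = scales[power.mean(axis=1).argmax()]     # expect ~ w0*fs/(2*pi*5) ~ 19 samples
```

    Each scale costs one FFT and one inverse FFT instead of a full time-domain convolution, which is where the speed-up over running spectrum analysis comes from.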

  8. Analysis of the soundscape in an intensive care unit based on the annotation of an audio recording.

    Science.gov (United States)

    Park, Munhum; Kohlrausch, Armin; de Bruijn, Werner; de Jager, Peter; Simons, Koen

    2014-04-01

    The acoustic environments in hospitals, particularly in intensive care units (ICUs), are characterized by frequent high-level sound events which may negatively affect patient outcome. Many studies have performed acoustic surveys, but the measurement protocol was not always reported in detail, and the scope of analysis was limited by the selected mode of the sound level meters. A few studies systematically investigated the noise sources in ICUs by employing an observer in the patient room, which may potentially bias the measurement. In the current study, the soundscape of an ICU was evaluated: acoustic parameters were extracted from a ∼67-h audio recording, and a selected 24-h recording was annotated off-line for a source-specific analysis. The results showed that patient-involved noise accounted for 31% of the acoustic energy and 11% of the predicted loudness peaks (PLPs). Excluding the patient-involved noise, the remaining acoustic energy was attributed to staff members (57%), alarms (30%), and the operational noise of life-supporting devices (13%). Furthermore, the contribution of each noise category to the PLPs was found to be more uneven: staff (92%), alarms (6%), and device noise (2%). The current study suggests that most of the noise sources in ICUs may be associated with modifiable human factors.
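    The source-specific energy bookkeeping can be sketched as follows (synthetic audio and hypothetical annotations, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 8000                                      # Hz
audio = 0.01 * rng.standard_normal(fs * 60)    # one minute of quiet background

# Hypothetical annotation: (start_s, end_s, category), as produced off-line.
events = [(5, 8, "staff"), (20, 21, "alarm"), (40, 44, "staff"), (50, 50.5, "device")]
for s, e, _ in events:
    seg = slice(int(s * fs), int(e * fs))
    audio[seg] = audio[seg] + 0.5 * rng.standard_normal(int((e - s) * fs))

def energy_shares(audio, events, fs):
    # Fraction of total acoustic energy attributable to each annotated category.
    total = np.sum(audio ** 2)
    shares = {}
    for s, e, label in events:
        piece = audio[int(s * fs):int(e * fs)]
        shares[label] = shares.get(label, 0.0) + np.sum(piece ** 2) / total
    return shares

shares = energy_shares(audio, events, fs)
```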

  9. Probe-based recording technology

    International Nuclear Information System (INIS)

    Naberhuis, Steve

    2002-01-01

    The invention of the scanning tunneling microscope (STM) prompted researchers to contemplate whether such technology could be used as the basis for the storage and retrieval of information. With magnetic data storage technology facing limits in storage density due to the thermal instability of magnetic bits, the super-paramagnetic limit, the heir-apparent for information storage at higher densities appeared to be variants of the STM or similar probe-based storage techniques such as atomic force microscopy (AFM). Among these other techniques that could provide replacement technology for magnetic storage, near-field optical scanning optical microscopy (NSOM or SNOM) has also been investigated. Another alternative probe-based storage technology called atomic resolution storage (ARS) is also currently under development. An overview of these various technologies is herein presented, with an analysis of the advantages and disadvantages inherent in each particularly with respect to reduced device dimensions. The role of micro electro mechanical systems (MEMS) is emphasized

  10. Electronic Health Record Implementation: A SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Leila Shahmoradi

    2017-12-01

    Electronic Health Record (EHR) is one of the most important achievements of information technology in the healthcare domain, and if deployed effectively, it can yield predominant results. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis of electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member workforce from hospitals affiliated to Tehran University of Medical Sciences (TUMS). The data were collected by using a self-structured questionnaire and analyzed by SPSS software. Based on the results, the highest priority in the strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructure was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics was the significant opportunity of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records together with physicians’ and other clinical staff’s resistance to the use of electronic health records. To facilitate successful adoption of electronic health records, some organizational, technical and resource elements contribute; moreover, the consideration of these factors is essential for EHR implementation.

  11. Electronic Health Record Implementation: A SWOT Analysis.

    Science.gov (United States)

    Shahmoradi, Leila; Darrudi, Alireza; Arji, Goli; Farzaneh Nejad, Ahmadreza

    2017-10-01

    Electronic Health Record (EHR) is one of the most important achievements of information technology in healthcare domain, and if deployed effectively, it can yield predominant results. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis in electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member work force from Hospitals affiliated to Tehran University of Medical Sciences (TUMS). The data were collected by using a self-structured questionnaire and analyzed by SPSS software. Based on the results, the highest priority in strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructures was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics was the significant opportunity of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records together with physicians' and other clinical staff's resistance in the use of electronic health records. To facilitate successful adoption of electronic health record, some organizational, technical and resource elements contribute; moreover, the consideration of these factors is essential for EHR implementation.

  12. Amplitude histogram-based method of analysis of patch clamp recordings that involve extreme changes in channel activity levels.

    Science.gov (United States)

    Yakubovich, Daniel; Rishal, Ida; Dessauer, Carmen W; Dascal, Nathan

    2009-03-01

    Many ion channels show low basal activity, which is increased hundreds-fold by the relevant gating factor. A classical example is the activation of G-protein-activated K(+) channels (GIRK) by the Gbetagamma subunit dimer. The extent of activation (relative to basal current), R(a), is an important physiological parameter, usually readily estimated from whole-cell recordings. However, calculation of R(a) often becomes non-trivial in multi-channel patches because of extreme changes in activity upon activation, from a seemingly single-channel pattern to a macroscopic one. In such cases, calculation of the net current flowing through the channels in the patch, I, before and after activation may require different methods of analysis. To address this problem, we utilized neuronal GIRK channels activated by purified Gbetagamma in excised patches of Xenopus oocytes. Channels were expressed at varying densities, from a few to several hundred per patch. We present a simple and fast method of calculating I using amplitude histogram analysis and establish its accuracy by comparison with I calculated from event lists. This method allows the analysis of extreme changes in I in multichannel patches, which would be impossible using the standard methods of idealization and event list generation.
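    A minimal sketch of the amplitude-histogram approach (synthetic patch data, not the authors' code): the net current I is estimated as the histogram-weighted mean amplitude, which works unchanged from a near-single-channel basal pattern to a macroscopic activated one, and the activation ratio R(a) follows directly.

```python
import numpy as np

rng = np.random.default_rng(4)

def net_current(trace, bins=200):
    # Net current I as the histogram-weighted mean amplitude: no event
    # idealisation needed, so it works from basal to macroscopic activity.
    counts, edges = np.histogram(trace, bins=bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    return np.sum(centres * counts) / counts.sum()

# Synthetic 50-channel patch, single-channel current -2 pA, Gaussian noise.
i_single, noise_sd, n_ch, n_pts = -2.0, 0.3, 50, 20000

def simulate(p_open):
    n_open = rng.binomial(n_ch, p_open, n_pts)  # channels open at each sample
    return i_single * n_open + noise_sd * rng.standard_normal(n_pts)

I_basal = net_current(simulate(0.02))           # near-single-channel pattern
I_active = net_current(simulate(0.50))          # macroscopic pattern
R_a = I_active / I_basal                        # extent of activation
```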

  13. ICOHR: intelligent computer based oral health record.

    Science.gov (United States)

    Peterson, L C; Cobb, D S; Reynolds, D C

    1995-01-01

    The majority of work on computer use in the dental field has focused on non-clinical practice management information needs. Very few computer-based dental information systems provide management support of the clinical care process, particularly with respect to quality management. Traditional quality assurance methods rely on the paper record and provide only retrospective analysis. Today, proactive quality management initiatives are on the rise. Computer-based dental information systems are being integrated into the care environment, actively providing decision support as patient care is being delivered. These new systems emphasize assessment and improvement of patient care at the time of treatment, thus building internal quality management into the caregiving process. The integration of real-time quality management and patient care will be expedited by the introduction of an information system architecture that emulates the gathering and storage of clinical care data currently provided by the paper record. As a proposed solution to the problems associated with existing dental record systems, the computer-based patient record has emerged as a possible alternative to the paper dental record. The Institute of Medicine (IOM) recently conducted a study on improving the efficiency and accuracy of patient record keeping. As a result of this study, the IOM advocates the development and implementation of computer-based patient records as the standard for all patient care records. This project represents the ongoing efforts of The University of Iowa College of Dentistry's collaboration with the University of Uppsala Data Center, Uppsala, Sweden, on a computer-based patient dental record model. ICOHR (Intelligent Computer Based Oral Health Record) is an information system which brings together five important parts of the patient's dental record: medical and dental history; oral status; treatment planning; progress notes; and a Patient Care Database.

  14. Joint time-frequency analysis of EEG signals based on a phase-space interpretation of the recording process

    Science.gov (United States)

    Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.

    2012-10-01

    Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.
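    A small illustration of the spectrogram's role in localizing a transient oscillatory event in time and frequency (synthetic data, not the clinical EEG):

```python
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(5)
fs = 256.0
t = np.arange(0.0, 2.0, 1.0 / fs)
eeg = 0.5 * rng.standard_normal(t.size)        # background noise
burst = (t > 0.9) & (t < 1.1)                  # transient 20 Hz event
eeg[burst] += 2.0 * np.sin(2 * np.pi * 20.0 * t[burst])

freqs, times, Sxx = spectrogram(eeg, fs=fs, nperseg=64, noverlap=48)
fi = np.argmin(np.abs(freqs - 20.0))           # frequency row nearest 20 Hz
ti = Sxx[fi].argmax()                          # time bin of peak 20 Hz power
```

    The window length `nperseg` fixes the time-frequency trade-off here; the scalogram and matching-pursuit approaches discussed above relax exactly that constraint.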

  15. Generation of digital time database from paper ECG records and Fourier transform-based analysis for disease identification.

    Science.gov (United States)

    Mitra, Sucharita; Mitra, M; Chaudhuri, B B

    2004-10-01

    ECG signals recorded on paper are transferred to a digital time database with the help of an automated data-extraction system developed here. A flatbed scanner is used to form an image database of each 12-lead ECG signal. These images are then fed into a Pentium PC, where pixel-to-pixel co-ordinate information is extracted with the help of some image processing techniques to form a raw database. These raw data are then ported to the regeneration domain of the system to check the captured pattern against the original wave shape. The sampling period of each ECG signal is computed after detection of the QRS complex. Finally, a discrete Fourier transform of the generated database is performed to observe the frequency response properties of every ECG signal. Some interesting amplitude properties of monopolar chest leads V4 and V6 are noted and described.
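    The column-wise extraction of a digital signal from a scanned trace, followed by a discrete Fourier transform, can be sketched as follows (a synthetic binary "scan"; a hypothetical simplification of the described system):

```python
import numpy as np

# Synthetic "scanned" page: a binary image whose dark pixels trace one curve.
h, w = 200, 500
img = np.zeros((h, w), dtype=bool)
cols = np.arange(w)
trace_rows = (100 + 40 * np.sin(2 * np.pi * 5 * cols / w)).astype(int)
img[trace_rows, cols] = True

def trace_to_signal(img):
    # Per column, take the mean row index of dark pixels; negate because
    # image row indices grow downward.
    rows = np.arange(img.shape[0])[:, None]
    counts = np.maximum(img.sum(axis=0), 1)
    y = (rows * img).sum(axis=0) / counts
    return -(y - y.mean())

sig = trace_to_signal(img)
spectrum = np.abs(np.fft.rfft(sig))
dominant = spectrum[1:].argmax() + 1    # cycles per record; 5 for this trace
```

    A real scan would additionally need grid-line removal and calibration of the pixel axes to mV and seconds before the DFT is meaningful.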

  16. Cenozoic climate changes: A review based on time series analysis of marine benthic d18O records

    OpenAIRE

    Mudelsee, Manfred; Bickert, T.; Lear, Caroline H.; Lohmann, Gerrit

    2014-01-01

    The climate during the Cenozoic era changed in several steps from ice-free poles and warm conditions to ice-covered poles and cold conditions. Since the 1950s, a body of information on ice volume and temperature changes has been built up predominantly on the basis of measurements of the oxygen isotopic composition of shells of benthic foraminifera collected from marine sediment cores. The statistical methodology of time series analysis has also evolved, allowing more information to be extract...

  17. Cenozoic climate changes: A review based on time series analysis of marine benthic δ18O records

    OpenAIRE

    Mudelsee, Manfred; Bickert, Torsten; Lear, Caroline Helen; lohmann, Gerrit

    2014-01-01

    The climate during the Cenozoic era changed in several steps from ice-free poles and warm conditions to ice-covered poles and cold conditions. Since the 1950s, a body of information on ice-volume and temperature changes has been built up predominantly on the basis of measurements of the oxygen isotopic composition of shells of benthic foraminifera collected from marine sediment cores. The statistical methodology of time series analysis has also evolved, allowing more information to be extract...

  18. Analysis and design of shingled magnetic recording systems

    Science.gov (United States)

    Keng Teo, Kim; Elidrissi, Moulay Rachid; Chan, Kheong Sann; Kanai, Yasushi

    2012-04-01

    Shingled magnetic recording (SMR) is an upcoming technology that will extend the life of conventional granular magnetic recording (CGMR). SMR differs from conventional recording in that the tracks are written in a raster scan format, in one direction only, leaving tracks that are overlapped like the shingles on a roof. This simple change means that adjacent track overwrite only occurs from one side, and tracks need to survive this overwrite only once. In contrast, conventional recording needs to survive thousands of overwrites from both sides. This work performs analysis of SMR from three perspectives. First, an analysis of how much gain one might expect for SMR based on the assumptions for the magnetic write width (MWW), magnetic read width (MRW), and erase bandwidth (EBW) is performed. Second, this analysis is corroborated via simulated 747 curves using the grain flipping probability (GFP) model. The third part validates the 747 curves from the model with results from the spinstand.

  19. Teenage pregnancy outcome: a record based study.

    Science.gov (United States)

    Ambadekar, N N; Khandait, D W; Zodpey, S P; Kasturwar, N B; Vasudeo, N D

    1999-01-01

    The present record-based study was undertaken in the medical record section of Government Medical College, Nagpur, to assess teenage as a risk factor for pregnancy complications, outcome, and operative or assisted delivery. Five years (January 1993 to December 1997) of data were scanned, giving a sample of 1830 teenage pregnancies; an equal number of subsequent partly matched controls (20-29 years) were taken. Results showed the proportion of low birth weight babies to be significantly greater in teenagers than in adult pregnancies. Complications such as premature rupture of membranes, placenta previa, and accidental haemorrhage, though more frequent in adult pregnancies, were statistically not significant. There were no differences in congenital anomaly and twins between cases and controls. But breech deliveries were significantly (p < 0.001) more frequent in adults.
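    A record-based comparison of outcome proportions like this one typically reduces to a contingency-table test. A sketch with hypothetical counts (invented for illustration, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = teenage / adult mothers,
# columns = low-birth-weight / normal-weight babies.
table = np.array([[420, 1410],
                  [250, 1580]])
chi2, p, dof, expected = stats.chi2_contingency(table)
```

    A small p-value here would support the claim that the low-birth-weight proportion differs between the two groups.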

  20. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    ... DEPARTMENT OF STATE [Public Notice 7709] Privacy Act; System of Records: State-78, Risk Analysis... a system of records, Risk Analysis and Management Records, State-78, pursuant to the provisions of... INFORMATION: The Department of State proposes that the new system will be ``Risk Analysis and Management...

  1. Voice Stress Analysis: Use of Telephone Recordings.

    Science.gov (United States)

    Waln, Ronald F.; Downey, Ronald G.

    The ability to detect lying is an important skill. While the polygraph is the most common mechanical method used for lie detection, other electronic-based methods have also been developed. One such method, the analysis of voice stress patterns, is based on the assumption that lying is a stressful activity which reduces involuntary frequency…

  2. Cepstrum Analysis of Terrestrial Impact Crater Records

    Directory of Open Access Journals (Sweden)

    Heon-Young Chang

    2008-06-01

    Study of terrestrial impact craters is important not only in the field of solar system formation and evolution but also in Galactic astronomy. The terrestrial impact cratering record has recently been examined, yielding short- and intermediate-term periodicities. The existence of such a periodicity has implications for Galactic dynamics, since terrestrial impact cratering is usually interpreted as a result of the environmental variation during the Sun's orbiting in the Galactic plane. The aim of this paper is to search for a long-term periodicity with a novel method, since no attempt has so far been made to search for a long-term periodicity in this research field in spite of its great importance. We apply the cepstrum analysis method to the terrestrial impact cratering record for the first time. As a result of the analysis we have found noticeable peaks in the Fourier power spectrum appearing at two long periods, which seem to be in a simple resonance with the revolution period of the Sun around the Galactic center. Finally we briefly discuss the implications and suggest that theoretical study be pursued to explain such a long-term periodicity.
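    A minimal cepstrum sketch on a synthetic crater-age record (hypothetical data; the regularized log is an implementation convenience, not part of the published method):

```python
import numpy as np

def real_cepstrum(x):
    # Inverse FFT of the (regularised) log power spectrum.
    spec = np.abs(np.fft.fft(x)) ** 2
    return np.abs(np.fft.ifft(np.log(spec + 1.0))) ** 2

# Synthetic crater-age record: impacts every 26 Myr over 598 Myr (23 full
# periods), plus a few aperiodic background events, binned into 1-Myr bins.
n_bins = 598
ages = np.concatenate([np.arange(0, n_bins, 26), [100, 250, 333, 477]])
series, _ = np.histogram(ages, bins=np.arange(n_bins + 1))
c = real_cepstrum(series - series.mean())
quefrency = np.arange(n_bins)                  # Myr, for 1-Myr bins
lo, hi = 5, 40                                 # search below the 52 Myr harmonic
peak = quefrency[lo:hi][c[lo:hi].argmax()]     # periodicity shows up here
```

    The cepstrum turns the harmonic comb that a periodic event train leaves in the power spectrum into a single peak at the underlying period (here 26 Myr).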

  3. Analysis of external occupational dose records in Brazil

    International Nuclear Information System (INIS)

    Mauricio, Claudia L.P.; Silva, Herica L.R. da; Silva, Claudio Ribeiro da

    2014-01-01

    Brazil, a continental country with currently more than 150,000 workers under individual monitoring for ionizing radiation, implemented in 1987 a centralized system for storage of external occupational doses. This database has been improved over the years and is now a web-based information system called the Brazilian External Occupational Dose Management Database System (GDOSE). This paper presents an overview of Brazilian external occupational dose over the years. The estimated annual average effective dose shows a decrease from 2.4 mSv in 1987 to about 0.6 mSv, with a marked reduction from 1987 to 1990. Analyzing by type of controlled practice, one sees that medical and dental radiology is the area with the largest number of users of individual monitors (70%), followed by education practices (8%) and industrial radiography (7%). In addition to photon whole-body monitoring, neutron monitors are used in maintenance (36%), reactors (30%) and education (27%), and extremity monitors in education (27%), nuclear medicine (22%) and radiology (19%). In terms of collective dose, the highest values are also found in conventional radiology, but the highest average dose values are those of interventional radiology. Nuclear medicine, R and D and radiotherapy also have average annual effective doses higher than 1 mSv. However, some very high dose values registered in GDOSE give false information; these should be better analyzed in the future. Annual doses above 500 mSv are certainly not realistic. (author)
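    The registry-style aggregation behind such figures (average effective dose and collective dose per year) can be sketched as follows (hypothetical records, illustrative values only):

```python
import numpy as np

# Hypothetical registry rows: (year, practice, annual effective dose in mSv).
records = [
    (1987, "radiology", 3.1), (1987, "industrial radiography", 4.0),
    (1987, "education", 1.2), (1999, "radiology", 0.5),
    (1999, "nuclear medicine", 1.4), (1999, "education", 0.4),
]

def summarize(records):
    # Per year: average effective dose and collective dose (person-mSv).
    by_year = {}
    for year, _, dose in records:
        by_year.setdefault(year, []).append(dose)
    return {y: (float(np.mean(d)), float(np.sum(d))) for y, d in by_year.items()}

summary = summarize(records)
```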

  4. Structure and performance of a real-time algorithm to detect tsunami or tsunami-like alert conditions based on sea-level records analysis

    Directory of Open Access Journals (Sweden)

    L. Bressan

    2011-05-01

    The goal of this paper is to present an original real-time algorithm devised for the detection of tsunami or tsunami-like waves, which we call TEDA (Tsunami Early Detection Algorithm), and to introduce a methodology to evaluate its performance. TEDA works on the sea-level records of a single station and implements two distinct modules running concurrently: one to assess the presence of tsunami waves ("tsunami detection") and the other to identify high-amplitude long waves ("secure detection"). Both detection methods are based on continuously updated time functions depending on a number of parameters that can be varied according to the application. In order to select the most adequate parameter setting for a given station, a methodology to evaluate TEDA performance has been devised that is based on a number of indicators and is simple to use. In this paper an example of TEDA application is given using data from a tide gauge located at Adak Island in Alaska, USA, which proved quite suitable since it recorded several tsunamis in recent years with a sampling interval of 1 min.
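    A toy detector in the spirit of TEDA's continuously updated time functions (the synthetic record and all parameters are illustrative; TEDA's actual functions differ):

```python
import numpy as np

def detector_flags(level, short_win=10, long_win=120, k=4.0):
    # Flag samples where the short-window variability of the sea-level
    # increments exceeds k times the long-window background variability.
    d = np.diff(level)
    flags = np.zeros(d.size, dtype=bool)
    for i in range(long_win, d.size):
        background = d[i - long_win:i - short_win].std()
        recent = d[i - short_win:i].std()
        flags[i] = recent > k * max(background, 1e-9)
    return flags

rng = np.random.default_rng(7)
t = np.arange(600.0)                           # 1-min samples, 10 h
level = 0.5 * np.sin(2 * np.pi * t / 745) + 0.01 * rng.standard_normal(600)
level[400:430] += 0.8 * np.sin(2 * np.pi * (t[400:430] - 400.0) / 30.0)
flags = detector_flags(level)
first = int(np.argmax(flags))                  # first flagged minute
```

    Differencing removes the slow tidal trend, so the background statistic tracks instrument noise while the tsunami-like wave trips the threshold within a few samples of onset.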

  5. A web-based electronic patient record (ePR) system for data integration in movement analysis research on wheel-chair users to minimize shoulder pain

    Science.gov (United States)

    Deshpande, Ruchi R.; Requejo, Philip; Sutisna, Erry; Wang, Ximing; Liu, Margaret; McNitt-Gray, Sarah; Ruparel, Puja; Liu, Brent J.

    2012-02-01

    Patients confined to manual wheel-chairs are at an added risk of shoulder injury. There is a need for developing optimal bio-mechanical techniques for wheel-chair propulsion through movement analysis. Data collected is diverse and in need of normalization and integration. Current databases are ad-hoc and do not provide flexibility, extensibility and ease of access. The need for an efficient means to retrieve specific trial data, display it and compare data from multiple trials is unmet through lack of data association and synchronicity. We propose the development of a robust web-based ePR system that will enhance workflow and facilitate efficient data management.

  6. Recording and automated analysis of naturalistic bioptic driving.

    Science.gov (United States)

    Luo, Gang; Peli, Eli

    2011-05-01

    People with moderate central vision loss are legally permitted to drive with a bioptic telescope in 39 US states and the Netherlands, but the safety of bioptic driving remains highly controversial. There is no scientific evidence about bioptic use and its impact on safety. We propose searching for evidence by recording naturalistic driving activities in patients' cars. In a pilot study we used an analogue video system to record two bioptic drivers' daily driving activities for 10 and 5 days, respectively. In this technical report, we also describe our novel digital system that collects vehicle manoeuvre information and enables recording over more extended periods, and discuss our approach to analyzing the vast amount of data. Our observations of telescope use by the pilot subjects were quite different from their reports in a previous survey. One subject used the telescope only seven times in nearly 6 h of driving. For the other subject, the average interval between telescope use was about 2 min, and mobile (cell) phone use in one trip extended the interval to almost 5 min. We demonstrate that computerized analysis of lengthy recordings based on video, GPS, acceleration, and black box data can be used to select informative segments for efficient off-line review of naturalistic driving behaviours. The inconsistency between self reports and objective data as well as infrequent telescope use underscores the importance of recording bioptic driving behaviours in naturalistic conditions over extended periods. We argue that the new recording system is important for understanding bioptic use behaviours and bioptic driving safety. © 2011 The College of Optometrists.

  7. Statistical analysis of molecular signal recording.

    Directory of Open Access Journals (Sweden)

    Joshua I Glaser

    Full Text Available A molecular device that records time-varying signals would enable new approaches in neuroscience. We have recently proposed such a device, termed a "molecular ticker tape", in which an engineered DNA polymerase (DNAP writes time-varying signals into DNA in the form of nucleotide misincorporation patterns. Here, we define a theoretical framework quantifying the expected capabilities of molecular ticker tapes as a function of experimental parameters. We present a decoding algorithm for estimating time-dependent input signals, and DNAP kinetic parameters, directly from misincorporation rates as determined by sequencing. We explore the requirements for accurate signal decoding, particularly the constraints on (1 the polymerase biochemical parameters, and (2 the amplitude, temporal resolution, and duration of the time-varying input signals. Our results suggest that molecular recording devices with kinetic properties similar to natural polymerases could be used to perform experiments in which neural activity is compared across several experimental conditions, and that devices engineered by combining favorable biochemical properties from multiple known polymerases could potentially measure faster phenomena such as slow synchronization of neuronal oscillations. Sophisticated engineering of DNAPs is likely required to achieve molecular recording of neuronal activity with single-spike temporal resolution over experimentally relevant timescales.

  8. Last glacial and Holocene stable isotope record of fossil dripwater from subtropical Brazil based on analysis of fluid inclusions in stalagmites

    NARCIS (Netherlands)

    Millo, Christian; Strikis, Nicolás M.; Vonhof, Hubert B.; Deininger, Michael; da Cruz, Francisco W.; Wang, Xianfeng; Cheng, Hai; Lawrence Edwards, R.

    2017-01-01

    The stable isotope composition of fossil dripwater preserved in stalagmites fluid inclusions is a promising tool to reconstruct paleopluviosity in the tropics and subtropics. Here we present δD and δ18O records of fossil dripwater from two stalagmites collected in Botuverá Cave (subtropical Brazil),

  9. Time/Frequency Analysis of Terrestrial Impact Crater Records

    Directory of Open Access Journals (Sweden)

    Heon-Young Chang

    2006-09-01

    Full Text Available The terrestrial impact cratering record has recently been examined in the time domain by Chang & Moon (2005). It was found that the ˜ 26 Myr periodicity in the impact cratering rate exists over the last ˜ 250 Myrs. Such a periodicity can be found regardless of the lower limit of the diameter up to D ˜ 35 km. It immediately drew arguments both for and against. The aim of this paper is two-fold: (1) to test if reported periodicities can be obtained with an independent method, and (2) to see, as attempted earlier, if the phase is modulated. To achieve these goals we employ time/frequency analysis and for the first time apply this method to the terrestrial impact cratering record. We have confirmed that, without exception, noticeable peaks appear around ˜ 25 Myr, corresponding to a frequency of ˜ 0.04 (Myr^{-1}). We also find longer periodicities in the database that includes small impact craters. Though the time/frequency analysis allows us to observe phase variations directly, we cannot find any indication of such changes. Instead, modes display slow variations of power in time. The time/frequency analysis shows a nonstationary behavior of the modes. The power can grow from just above the noise level and then decrease back to its initial level in a time of order of 10 Myrs.
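    As an illustration of how such a periodicity appears in a frequency-domain analysis, here is a plain discrete periodogram (not the authors' time/frequency method) applied to a synthetic cratering-rate series with a built-in 26 Myr cycle; the series and all parameters are invented:

```python
# Periodogram sketch: a synthetic 250 Myr rate series with a 26 Myr cycle
# should show a power peak near 1/26 ~ 0.04 Myr^-1.
import cmath, math

dt = 1.0                       # bin width, Myr
n = 250                        # length of the record, Myr
rate = [1.0 + math.cos(2 * math.pi * t / 26.0) for t in range(n)]
mean = sum(rate) / n
x = [v - mean for v in rate]   # remove the DC component

def power(freq):
    """Periodogram power at a given frequency (cycles per Myr)."""
    s = sum(x[t] * cmath.exp(-2j * math.pi * freq * t * dt) for t in range(n))
    return abs(s) ** 2 / n

freqs = [k / (n * dt) for k in range(1, n // 2)]
peak = max(freqs, key=power)   # frequency of maximum power
```

    A real analysis would also need a significance test against the noise background and, as in the paper, a time-resolved transform to track phase and power over time.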

  10. Cost analysis and the effective management of records throughout ...

    African Journals Online (AJOL)

    The article deals with the concept of cost analysis in the context of managing records throughout their life cycle. Ways of cost saving or avoidance in managing records are covered. Document management strategies have the potential to provide some substantial cost-saving benefits if they are used judiciously.

  11. Development of Software for dose Records Data Base Access

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was the individual dose follow-up control and the data handling for epidemiological studies. Within the Data Base management scheme, software development to allow searching of individual dose records by external authorised users was undertaken. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records from workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of codes and subroutines developed. (Author) 2 refs

  12. Analysis of event data recorder data for vehicle safety improvement

    Science.gov (United States)

    2008-04-01

    The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...

  13. Multimodal EEG Recordings, Psychometrics and Behavioural Analysis.

    Science.gov (United States)

    Boeijinga, Peter H

    2015-01-01

    High spatial and temporal resolution measurements of neuronal activity are preferably combined. In an overview on how this approach can take shape, multimodal electroencephalography (EEG) is treated in 2 main parts: by experiments without a task and in the experimentally cued working brain. It concentrates first on the alpha rhythm properties and next on data-driven search for patterns such as the default mode network. The high-resolution volumic distributions of neuronal metabolic indices result in distributed cortical regions and possibly relate to numerous nuclei, observable in a non-invasive manner in the central nervous system of humans. The second part deals with paradigms in which nowadays assessment of target-related networks can align level-dependent blood oxygenation, electrical responses and behaviour, taking the temporal resolution advantages of event-related potentials. Evidence-based electrical propagation in serial tasks during performance is now to a large extent attributed to interconnected pathways, particularly chronometry-dependent ones, throughout a chain including a dorsal stream, next ventral cortical areas taking the flow of information towards inferior temporal domains. The influence of aging is documented, and results of the first multimodal studies in neuropharmacology are consistent. Finally a scope on implementation of advanced clinical applications and personalized marker strategies in neuropsychiatry is indicated. © 2016 S. Karger AG, Basel.

  14. Reconstruction of historical changes in northern fur seal prey availability and diversity in the western North Pacific through individual-based analysis of dietary records

    Science.gov (United States)

    Kiyota, Masashi; Yonezaki, Shiroh

    2017-06-01

    We analyzed long-term dietary records of northern fur seals (Callorhinus ursinus) to reconstruct historical changes in prey availability and diversity in the western North Pacific off northeastern Japan. The nominal relationships between the occurrence frequencies of fishes or squids in fur seal stomachs and the sampling locations reflected the spatial heterogeneity of fish and squid distributions along the shelf-slope-offshore continuum off northeastern Japan, whereas changes in the temporal occurrence frequencies reflected mainly the migration and foraging patterns of the fur seals. The occurrence probabilities of fishes and squids in fur seal stomachs were standardized by using generalized linear models to compensate for sampling biases in space and time. The reconstructed historical trends revealed decadal shifts in which prey were relatively abundant, from mackerels in the 1970s to Japanese sardine in the 1980s and myctophids/sparkling enope squids in the 1990s, that were related to decadal shifts in the oceanographic regime. The sequential increase in mackerel and Japanese sardine abundances coincided with the annual catch trends of commercial fisheries. The index of overall prey availability calculated from the standardized occurrence probabilities of fishes and squids in fur seal stomachs was fairly stable over the decades.
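    The standardization step can be illustrated with a minimal logistic GLM: occurrence (0/1) of a prey item modelled against a covariate, then evaluated at a fixed covariate value to get a standardized probability. The data, covariate coding, and plain gradient-descent fit below are invented for the sketch and are not the authors' model:

```python
# Minimal logistic-regression (binomial GLM) sketch in pure Python:
# y ~ sigmoid(b0 + b1 * x), fit by batch gradient descent.
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit y ~ sigmoid(b0 + b1*x) by gradient descent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Synthetic samples: occurrence declines across standardized decades
xs = [-1, -1, -1, 0, 0, 0, 1, 1, 1]
ys = [1, 1, 0, 1, 0, 0, 1, 0, 0]
b0, b1 = fit_logistic(xs, ys)
p_early = 1.0 / (1.0 + math.exp(-(b0 - b1)))  # standardized probability at x = -1
```

    A production analysis would use a statistics package with proper standard errors and additional spatial/temporal covariates, but the structure (fit, then predict at reference covariate values) is the same.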

  15. Coverage and predictors of vaccination against 2012/13 seasonal influenza in Madrid, Spain: analysis of population-based computerized immunization registries and clinical records.

    Science.gov (United States)

    Jiménez-García, Rodrigo; Esteban-Vasallo, María D; Rodríguez-Rieiro, Cristina; Hernandez-Barrera, Valentín; Domínguez-Berjón, M A Felicitas; Carrasco Garrido, Pilar; Lopez de Andres, Ana; Cameno Heras, Moises; Iniesta Fornies, Domingo; Astray-Mochales, Jenaro

    2014-01-01

    We aim to determine 2012-13 seasonal influenza vaccination coverage. Data were analyzed by age group and by coexistence of concomitant chronic conditions. Factors associated with vaccine uptake were identified. We also analyze a possible trend in vaccine uptake in post-pandemic seasons. We used computerized immunization registries and clinical records of the entire population of the Autonomous Community of Madrid, Spain (6,284,128 persons) as the data source. A total of 871,631 individuals were vaccinated (13.87%). Coverage for people aged ≥ 65 years was 56.57%. Global coverage in people with a chronic condition was 15.7% in children and 18.69% in adults aged 15-59 years. The variables significantly associated with a higher likelihood of being vaccinated in the 2012-13 campaign for the age groups studied were higher age, being Spanish-born, a higher number of doses of seasonal vaccine received in previous campaigns, uptake of pandemic vaccination, and having a chronic condition. We conclude that vaccination coverage in persons aged under 60 years is low; the low coverage among children with chronic conditions calls for urgent interventions. Among those aged ≥60 years, uptake is higher but still far from optimal and seems to be decreasing in post-pandemic campaigns. For those aged ≥65 years the mean percentage of decrease from the 2009/10 to the current campaign has been 12%. Computerized clinical and immunization registers are useful tools for providing rapid and detailed information about influenza vaccination coverage in the population.

  16. Effect of electrode contact area on the information content of the recorded electrogastrograms: An analysis based on Rényi entropy and Teager-Kaiser Energy

    Science.gov (United States)

    Alagumariappan, Paramasivam; Krishnamurthy, Kamalanand; Kandiah, Sundravadivelu; Ponnuswamy, Mannar Jawahar

    2017-06-01

    Electrogastrograms (EGG) are electrical signals originating from the digestive system, which are closely correlated with its mechanical activity. Electrogastrography is an efficient non-invasive method for examining the physiological and pathological states of the human digestive system. Several factors, such as fat conductivity, abdominal thickness, and changes in electrode surface area, affect the quality of the recorded EGG signals. In this work, the effect of variations in the contact area of surface electrodes on the information content of the measured electrogastrograms is analyzed using Rényi entropy and Teager-Kaiser Energy (TKE). Two different circular cutaneous electrodes with approximate contact areas of 201.14 mm2 and 283.64 mm2 were adopted, and EGG signals were acquired using the standard three-electrode protocol. Further, the information content of the measured EGG signals was analyzed using the computed values of entropy and energy. Results demonstrate that the information content of the measured EGG signals increases by 6.72% for an increase in the contact area of the surface electrode by 29.09%. Further, it was observed that the average energy increases with increasing contact surface area. This work appears to be of high clinical significance, since the accurate measurement of EGG signals without loss of information content is highly useful for the design of diagnostic assistance tools for automated diagnosis and mass screening of digestive disorders.
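    The two measures named above have standard discrete forms and can be sketched directly on a toy signal; the entropy order α, histogram binning, and test signal below are arbitrary choices, not the paper's settings:

```python
# Renyi entropy (from an amplitude histogram) and the discrete
# Teager-Kaiser energy operator, applied to a toy sinusoid.
import math

def renyi_entropy(signal, alpha=2.0, bins=8):
    """Renyi entropy of order alpha from a histogram of amplitudes."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in signal:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    probs = [c / len(signal) for c in counts if c]
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

def teager_kaiser(signal):
    """Discrete Teager-Kaiser energy: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    return [signal[n] ** 2 - signal[n - 1] * signal[n + 1]
            for n in range(1, len(signal) - 1)]

x = [math.sin(2 * math.pi * 0.05 * n) for n in range(200)]
h = renyi_entropy(x)
tke = teager_kaiser(x)
```

    For a pure sinusoid the Teager-Kaiser energy is constant (proportional to amplitude squared times frequency squared for small frequencies), which is a handy sanity check for an implementation.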

  17. Drive-based recording analyses at >800 Gfc/in2 using shingled recording

    International Nuclear Information System (INIS)

    William Cross, R.; Montemorra, Michael

    2012-01-01

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in2 in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the areal density gain potential for shingled recording. In this paper, we demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in2 using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in2 and beyond. - Research highlights: → Drive-based recording at 805 Gb/in2 has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. → Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. → Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack

  18. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    …point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most

  19. Cognitive analysis of the summarization of longitudinal patient records.

    Science.gov (United States)

    Reichert, Daniel; Kaufman, David; Bloxham, Benjamin; Chase, Herbert; Elhadad, Noémie

    2010-11-13

    Electronic health records contain an abundance of valuable information that can be used to guide patient care. However, the large volume of information embodied in these records also renders access to relevant information a time-consuming and inefficient process. Our ultimate objective is to develop an automated summarizer that succinctly captures all relevant information in the patient record. In this paper, we present a cognitive study of 8 clinicians who were asked to create summaries based on data contained in the patients' electronic health record. The study characterized the primary sources of information that were prioritized by clinicians, the temporal strategies used to develop a summary and the cognitive operations used to guide the summarization process. Although we would not expect the automated summarizer to emulate human performance, we anticipate that this study will inform its development in instrumental ways.

  20. Documentary Analysis and Record Utilization: New Uses for Old Methods.

    Science.gov (United States)

    Lincoln, Yvonna S.

    Stressing the value of documents and records as information sources for the educational evaluation community, this report explores the differences between the two, their utility for inquirers, and methods and procedures for dealing with them. Three forms of documentary analysis are described: (1) simple tracking, which involves documenting both…

  1. A cloud based architecture to support Electronic Health Record.

    Science.gov (United States)

    Zangara, Gianluca; Corso, Pietro Paolo; Cangemi, Francesco; Millonzi, Filippo; Collova, Francesco; Scarlatella, Antonio

    2014-01-01

    We introduce a novel framework of electronic healthcare enabled by a Cloud platform able to host both Hospital Information Systems (HIS) and Electronic Medical Record (EMR) systems and implement an innovative model of Electronic Health Record (EHR) that is not only patient-oriented but also supports a better governance of the whole healthcare system. The proposed EHR model adopts the state of the art of the Cloud technologies, being able to join the different clinical data of the patient stored within the HISs and EMRs either placed into a local Data Center or hosted into a Cloud Platform enabling new directions of data analysis.

  2. Jitter analysis utilizing a high speed FM tape recorder.

    Science.gov (United States)

    Nazliel, B; Kuruoğlu, R

    2000-09-01

    Jitter analysis in single fiber EMG (SFEMG) is usually done on-line during recording. However, this technique frequently prolongs the study and makes re-analysis impossible. We attempted to measure jitter with a high speed FM tape recorder and compare the results with the previously published values. SFEMG data, acquired with voluntary activation on extensor digitorum communis muscle of 25 healthy relatives of children with myasthenia gravis were retrospectively analyzed. Fiber density (FD) was estimated on-line. Five to 18 single fiber action potential (SFAP) pairs were studied in each subject. The wow of the tape recorder was 6 microseconds. Mean (SD) (upper 95th percentile) FD, individual jitter, highest jitter, mean jitter and interspike interval were 1.60 (0.18) (1.90), 25.30 (11.20) (57.00) microseconds, 31.24 (6.87) (47.00) microseconds, 25.08 (5.04) (43.00) microseconds, and 0.67 (0.11) (0.91) ms respectively. Mean jitter in the pooled SFAP pairs and mean MCD were found to be lower than the published values of the Ad Hoc Committee of the AAEM Special Interest Group on Single Fiber EMG. A high speed FM tape recorder can be reliably used for the off-line analysis of jitter.
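    Off-line jitter analysis of this kind reduces to the standard statistic, the mean consecutive difference (MCD) of the interpotential intervals of a single-fiber action potential pair. A minimal sketch, with illustrative interval values:

```python
# Mean consecutive difference (MCD), the usual SFEMG jitter statistic:
# the mean of the absolute differences between consecutive
# interpotential intervals (IPIs) of one SFAP pair.

def mcd(ipis):
    """Mean consecutive difference of a series of interpotential intervals."""
    diffs = [abs(b - a) for a, b in zip(ipis, ipis[1:])]
    return sum(diffs) / len(diffs)

intervals = [670, 690, 655, 700, 680, 665]  # one SFAP pair, microseconds
jitter = mcd(intervals)  # -> 27.0 microseconds
```

    The tape recorder's wow (6 microseconds here) sets a floor on how small an MCD can be measured reliably from taped data.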

  3. An Autonomous Underwater Recorder Based on a Single Board Computer

    Science.gov (United States)

    Caldas-Morgan, Manuel; Alvarez-Rosario, Alexander; Rodrigues Padovese, Linilson

    2015-01-01

    As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Designing and building more affordable alternatives from scratch can become complex because it requires profound knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built based on a Raspberry Pi single board computer. A properly working prototype was assembled and it demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module which was thoroughly evaluated. It is estimated that this module will allow for great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and man-made acoustic events, among others. Although its power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, lower development time and, inherently, its lower cost. PMID:26076479

  5. Analysis of deep brain stimulation electrode characteristics for neural recording

    Science.gov (United States)

    Kent, Alexander R.; Grill, Warren M.

    2014-08-01

    Objective. Closed-loop deep brain stimulation (DBS) systems have the potential to optimize treatment of movement disorders by enabling automatic adjustment of stimulation parameters based on a feedback signal. Evoked compound action potentials (ECAPs) and local field potentials (LFPs) recorded from the DBS electrode may serve as suitable closed-loop control signals. The objective of this study was to understand better the factors that influence ECAP and LFP recording, including the physical presence of the electrode, the geometrical dimensions of the electrode, and changes in the composition of the peri-electrode space across recording conditions. Approach. Coupled volume conductor-neuron models were used to calculate single-unit activity as well as ECAP responses and LFP activity from a population of model thalamic neurons. Main results. Comparing ECAPs and LFPs measured with and without the presence of the highly conductive recording contacts, we found that the presence of these contacts had a negligible effect on the magnitude of single-unit recordings, ECAPs (7% RMS difference between waveforms), and LFPs (5% change in signal magnitude). Spatial averaging across the contact surface decreased the ECAP magnitude in a phase-dependent manner (74% RMS difference), resulting from a differential effect of the contact on the contribution from nearby or distant elements, and decreased the LFP magnitude (25% change). Reductions in the electrode diameter or recording contact length increased signal energy and increased spatial sensitivity of single neuron recordings. Moreover, smaller diameter electrodes (500 µm) were more selective for recording from local cells over passing axons, with the opposite true for larger diameters (1500 µm). Changes in electrode dimensions had phase-dependent effects on ECAP characteristics, and generally had small effects on the LFP magnitude. ECAP signal energy and LFP magnitude decreased with tighter contact spacing (100 µm), compared to

  6. 'Citizen science' recording of fossils by adapting existing computer-based biodiversity recording tools

    Science.gov (United States)

    McGowan, Alistair

    2014-05-01

    Biodiversity recording activities have been greatly enhanced by the emergence of online schemes and smartphone applications for recording and sharing data about a wide variety of flora and fauna. As a palaeobiologist, one of the areas of research I have been heavily involved in is the question of whether the amount of rock available to sample acts as a bias on our estimates of biodiversity through time. Although great progress has been made on this question over the past ten years by a number of researchers, I still think palaeontology has not followed the lead offered by the 'citizen science' revolution in studies of extant biodiversity. By constructing clearly structured surveys with online data collection support, it should be possible to collect field data on the occurrence of fossils at the scale of individual exposures, which are needed to test competing hypotheses about these effects at relatively small spatial scales. Such data collection would be hard to justify for universities and museums with limited personnel but a co-ordinated citizen science programme would be capable of delivering such a programme. Data collection could be based on the MacKinnon's Lists method, used in rapid conservation assessment work. It relies on observers collecting lists of a fixed length (e.g. 10 species long) but what is important is that it focuses on getting observers to ignore sightings of the same species until that list is complete. This overcomes the problem of 'common taxa being commonly recorded' and encourages observers to seek out and identify the rarer taxa. This gives a targeted but finite task. Rather than removing fossils, participants would be encouraged to take photographs to share via a recording website. The success of iSpot, which allows users to upload photos of plants and animals for other users to help with identifications, offers a model for overcoming the problems of identifying fossils, which can often look nothing like the examples illustrated in

  7. Tipping point analysis of a large ocean ambient sound record

    Science.gov (United States)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015), high-resolution (250 Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the big-data challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al, GRL 2007, [2] Livina et al, Climate of the Past 2010, [3] Livina et al, Chaos 2015.
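    The first reduction step, turning a pressure waveform into an average sound pressure level, can be sketched as follows; the reference pressure of 1 µPa is the usual underwater-acoustics convention (an assumption here), and the waveform is synthetic:

```python
# Reduce pressure samples (Pa) to an RMS sound pressure level in dB,
# the building block of the banded 10-min averages described above.
import math

def spl_db(pressures, p_ref=1e-6):
    """RMS sound pressure level in dB re p_ref, from samples in pascals."""
    rms = math.sqrt(sum(p * p for p in pressures) / len(pressures))
    return 20.0 * math.log10(rms / p_ref)

# One second of a synthetic 50 Hz tone sampled at 250 Hz, 0.02 Pa amplitude
samples = [0.02 * math.sin(2 * math.pi * 50 * t / 250.0) for t in range(250)]
level = spl_db(samples)  # about 83 dB re 1 uPa
```

    In the actual pipeline this would be preceded by band-pass filtering (e.g. into third-octave bands) and applied over 10-minute windows before any tipping-point statistics are computed.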

  8. Social science and linguistic text analysis of nurses’ records

    DEFF Research Database (Denmark)

    Buus, N.; Hamilton, B. E.

    2016-01-01

    The two aims of the paper were to systematically review and critique social science and linguistic text analyses of nursing records in order to inform future research in this emerging area of research. Systematic searches in reference databases and in citation indexes identified 12 articles … information to the uninitiated reader. Records were dominated by technocratic-medical discourse focused on patients' bodies, and they depicted only very limited aspects of nursing practice. Nurses made moral evaluations in their categorisation of patients, which reflected detailed surveillance of patients' disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More…

  9. Applying XDS for sharing CDA-based medical records

    Science.gov (United States)

    Kim, Joong Il; Jang, Bong Mun; Han, Dong Hoon; Yang, Keon Ho; Kang, Won-Suk; Jung, Haijo; Kim, Hee-Joung

    2006-03-01

    Many countries have set long-term objectives for establishing an Electronic Healthcare Records system (EHRs). Various IT strategies note that integration of EHR systems has a high priority. Because EHR systems are based on different information models and different technology platforms, one of the key integration problems in the realization of the EHRs for the continuity of patient care is the inability to share patient records between various institutions. The Integrating the Healthcare Enterprise (IHE) committee has defined detailed implementations of existing standards such as DICOM and HL7 in a publicly available document called the IHE technical framework (IHE-TF). Cross-enterprise document sharing (XDS), one of the IHE technical frameworks, describes how to apply these standards in information systems for the sharing of medical documents among hospitals. This study aims to design a Clinical Document Architecture (CDA) schema based on HL7, and to apply implementation strategies of XDS using this CDA schema.

  10. Reconstructing Fire Records from Ground-Based Routine Aerosol Monitoring

    Directory of Open Access Journals (Sweden)

    Hongmei Zhao

    2016-03-01

    Full Text Available Long-term fire records are important to understanding the trend of biomass burning and its interactions with air quality and climate at regional and global scales. Traditionally, such data have been compiled from ground surveys or satellite remote sensing. To obtain aerosol information during a fire event to use in analyzing air quality, we propose a new method of developing a long-term fire record for the contiguous United States using an unconventional data source: ground-based aerosol monitoring. Assisted by satellite fire detection, the mass concentration, size distribution, and chemical composition data of surface aerosols collected from the Interagency Monitoring of Protected Visual Environments (IMPROVE) network are examined to identify distinct aerosol characteristics during satellite-detected fire and non-fire periods. During a fire episode, elevated aerosol concentrations and heavy smoke are usually recorded by ground monitors and satellite sensors. Based on the unique physical and chemical characteristics of fire-dominated aerosols reported in the literature, we analyzed the surface aerosol observations from the IMPROVE network during satellite-detected fire events to establish a set of indicators for identifying fire events in routine aerosol monitoring data. Five fire identification criteria were chosen: (1) high concentrations of PM2.5 and PM10 (particles smaller than 2.5 and 10 µm in diameter, respectively); (2) a high PM2.5/PM10 ratio; (3) high organic carbon (OC)/PM2.5 and elemental carbon (EC)/PM2.5 ratios; (4) a high potassium (K)/PM2.5 ratio; and (5) a low soil/PM2.5 ratio. Using these criteria, we are able to identify a number of fire episodes close to 15 IMPROVE monitors from 2001 to 2011. Most of these monitors are located in the Western and Central United States. In any given year within the study period, fire events often occurred between April and September, especially in the two months of April and September. This ground-based fire
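    The five criteria amount to a boolean filter over daily monitoring records. The sketch below shows that structure; all threshold values are invented placeholders, since the abstract does not give numeric cut-offs, and criterion 3 is shown for OC only:

```python
# Screening sketch: apply fire-indicator criteria to IMPROVE-style
# daily records. Thresholds are hypothetical, not the study's values.

THRESHOLDS = {
    "pm25": 20.0,        # ug/m3, "high" PM2.5
    "pm_ratio": 0.7,     # PM2.5 / PM10 (fine-particle dominance)
    "oc_ratio": 0.3,     # OC / PM2.5 (carbonaceous smoke)
    "k_ratio": 0.01,     # K / PM2.5 (biomass-burning tracer)
    "soil_ratio": 0.05,  # soil / PM2.5 (must be LOW, rules out dust)
}

def looks_like_fire(rec, t=THRESHOLDS):
    """Return True if a daily record satisfies all five criteria."""
    return (rec["pm25"] > t["pm25"]
            and rec["pm25"] / rec["pm10"] > t["pm_ratio"]
            and rec["oc"] / rec["pm25"] > t["oc_ratio"]
            and rec["k"] / rec["pm25"] > t["k_ratio"]
            and rec["soil"] / rec["pm25"] < t["soil_ratio"])

smoke_day = {"pm25": 45.0, "pm10": 50.0, "oc": 20.0, "k": 0.9, "soil": 1.0}
dust_day = {"pm25": 30.0, "pm10": 90.0, "oc": 2.0, "k": 0.2, "soil": 6.0}
```

    The dust day fails on the PM2.5/PM10 ratio even though its total PM2.5 is high, which is exactly what the soil and size-ratio criteria are designed to catch.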

  11. A medical record linkage analysis of abortion underreporting.

    Science.gov (United States)

    Udry, J R; Gaughan, M; Schwingl, P J; van den Berg, B J

    1996-01-01

    Inaccuracy in women's reports of their abortion histories affects many areas of interest to reproductive health professionals and researchers. The identification of characteristics that affect the accuracy of reporting is essential for the improvement of data collection methods. A comparison of the medical records of 104 American women aged 27-30 in 1990-1991 with their self-reported abortion histories revealed that 19% of these women failed to report one or more abortions. Results of logistic regression analysis indicate that nonwhite women were 3.3 times as likely as whites to underreport. With each additional year that had elapsed since the first recorded abortion, women became somewhat more likely to underreport (odds ratio of 1.3), while each additional year of a woman's education slightly decreased the likelihood of underreporting (odds ratio of 0.7).

  12. A New Spectral Shape-Based Record Selection Approach Using Np and Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Edén Bojórquez

    2013-01-01

    Full Text Available With the aim of improving code-based real-record selection criteria, an approach based on a proxy parameter for spectral shape, named Np, is analyzed. The procedure pursues several objectives aimed at minimizing the record-to-record variability of the ground motions selected for seismic structural assessment. In order to select the best set of ground motion records to be used as input for nonlinear dynamic analysis, an optimization approach is applied using genetic algorithms, focused on finding the set of records most compatible with a target spectrum and target Np values. The results of the new Np-based approach suggest that the real accelerograms obtained with this procedure reduce the scatter of the response spectra compared with the traditional approach; furthermore, the mean spectrum of the set of records is very similar to the target seismic design spectrum in the period range of interest, and at the same time, similar Np values are obtained for the selected records and the target spectrum.
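    The Np proxy is commonly defined in the literature as the geometric mean of spectral acceleration over the period range [T1, 2T1], normalized by Sa(T1); whether this matches the exact definition used in the paper is an assumption of the sketch below, and the spectrum values are invented:

```python
# Spectral-shape proxy Np, assuming the common literature definition:
# Np = geometric mean of Sa over [T1, 2*T1] divided by Sa(T1).
import math

def np_proxy(periods, sa, t1):
    """Compute Np from a discretized response spectrum."""
    window = [s for t, s in zip(periods, sa) if t1 <= t <= 2 * t1]
    geo_mean = math.exp(sum(math.log(s) for s in window) / len(window))
    sa_t1 = sa[periods.index(t1)]
    return geo_mean / sa_t1

periods = [0.5, 0.75, 1.0, 1.25, 1.5, 1.75, 2.0]   # s
sa = [0.9, 0.8, 0.6, 0.5, 0.4, 0.35, 0.3]          # spectral acceleration, g
shape = np_proxy(periods, sa, 1.0)
```

    In a GA-based selection scheme, a fitness function would penalize both the spectral mismatch with the target spectrum and the distance between each record's Np and the target Np.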

  13. Analysis of Handling Processes of Record Versions in NoSQL Databases

    Directory of Open Access Journals (Sweden)

    Yu. A. Grigorev

    2015-01-01

    Full Text Available This article investigates the processes of handling record versions in NoSQL databases. The goal of this work is to develop a model which enables users both to handle record versions and to work with a record simultaneously. This model allows us to estimate both the distribution of the time users spend handling record versions and the distribution of the count of record versions. With eventual consistency (W=R=1, several users may update the same record simultaneously. In this case, several versions of records with the same key will be stored in the database. When reading, the user obtains all versions, handles them, and saves a new version, while older versions are deleted. According to the model, the user's time for handling the record versions consists of two parts: the random handling time of each version and the random deliberation time for handling a result. Record saving time and record deleting time are much less than handling time, so they are ignored in the model. The paper offers two variants of the model. In the first variant, a client's handling time is calculated as the sum of random handling times of one version over the count of record versions. This variant explicitly ignores the fact that the handling time of record versions may depend on the number of updates performed by other users between the sequential updates of the record by the current client, so there is a second variant, which takes this feature into consideration. The developed models were implemented in the GPSS environment. Model experiments with different counts of clients and different ratios between one-record handling time and result deliberation time were conducted. The analysis showed that despite the resemblance of the model variants, the difference in the nature of change between the average count of record versions and the average handling time is significant. In the second variant, dependences of the average count of record versions in database and

  14. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

    Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documentation of diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name-based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record, and F2 is the specific software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations have inputted 16591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems: when copied together, all files sort as expected by patient, and for each patient in chronological order, providing a digital cumulative patient record which contains heterogeneous file formats. Conclusion: The file-based approach for storing heterogeneous digital patient-related information is a reliable system which can handle open-source, proprietary, general, and custom file formats, and seems to be easily scalable. Further development of software for automatic integrity checks and for searching and indexing of the files is expected to produce a more user-friendly environment.
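The M1M2M3.F2 naming scheme above can be sketched as a parse-and-merge step. The field widths below (6-character patient ID, YYYYMMDD date, 3-character author ID) are hypothetical, since the abstract does not specify them:

```python
import re
from collections import defaultdict

# Hypothetical layout: 6-char patient ID (M1) + 8-digit date (M2) +
# 3-char author ID (M3), then the software-generated extension (F2).
PATTERN = re.compile(r"^(?P<patient>\w{6})(?P<date>\d{8})(?P<author>\w{3})\.(?P<ext>\w+)$")

def build_record(filenames):
    """Group files by patient and sort chronologically, mimicking the
    merge step described in the abstract."""
    record = defaultdict(list)
    for name in filenames:
        m = PATTERN.match(name)
        if m:
            record[m["patient"]].append((m["date"], name))
    return {pid: [n for _, n in sorted(files)] for pid, files in record.items()}

files = ["P00001" + "20100315" + "NS1" + ".doc",
         "P00001" + "20091102" + "NS1" + ".jpg",
         "P00002" + "20100101" + "AB2" + ".pdf"]
print(build_record(files))
```

Because the date is embedded in the name, a plain lexicographic sort of each patient's files yields the chronological cumulative record, regardless of file format.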

  15. Barriers to retrieving patient information from electronic health record data: failure analysis from the TREC Medical Records Track.

    Science.gov (United States)

    Edinger, Tracy; Cohen, Aaron M; Bedrick, Steven; Ambert, Kyle; Hersh, William

    2012-01-01

    Secondary use of electronic health record (EHR) data relies on the ability to retrieve accurate and complete information about desired patient populations. The Text Retrieval Conference (TREC) 2011 Medical Records Track was a challenge evaluation allowing comparison of systems and algorithms to retrieve patients eligible for clinical studies from a corpus of de-identified medical records, grouped by patient visit. Participants retrieved cohorts of patients relevant to 35 different clinical topics, and visits were judged for relevance to each topic. This study identified the most common barriers to identifying specific clinical populations in the test collection. Using the runs from track participants and judged visits, we analyzed the five non-relevant visits most often retrieved and the five relevant visits most often overlooked. Categories were developed iteratively to group the reasons for incorrect retrieval for each of the 35 topics. Reasons fell into nine categories for non-relevant visits and five categories for relevant visits. Non-relevant visits were most often retrieved because they contained a non-relevant reference to the topic terms. Relevant visits were most often overlooked because they used a synonym for a topic term. This failure analysis provides insight into areas for future improvement in EHR-based retrieval with techniques such as more widespread and complete use of standardized terminology in retrieval and data entry systems.

  16. Network Analysis of Time-Lapse Microscopy Recordings

    Directory of Open Access Journals (Sweden)

    Erik Smedler

    2014-09-01

    Full Text Available Multicellular organisms rely on intercellular communication to regulate important cellular processes critical to life. To further our understanding of those processes there is a need to scrutinize dynamical signaling events and their functions in both cells and organisms. Here, we report a method and provide MATLAB code that analyzes time-lapse microscopy recordings to identify and characterize network structures within large cell populations, such as interconnected neurons. The approach is demonstrated using intracellular calcium (Ca2+) recordings in neural progenitors and cardiac myocytes, but could be applied to a wide variety of biosensors employed in diverse cell types and organisms. In this method, network structures are analyzed by applying cross-correlation signal processing and graph theory to single-cell recordings. The goal of the analysis is to determine if the single-cell activity constitutes a network of interconnected cells and to decipher the properties of this network. The method can be applied in many fields of biology in which biosensors are used to monitor signaling events in living cells. Analyzing intercellular communication in cell ensembles can reveal essential network structures that provide important biological insights.
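The cross-correlation-plus-graph step can be sketched in a few lines of numpy. This is a simplified stand-in for the published MATLAB pipeline, not a translation of it; the correlation threshold of 0.8 is an arbitrary assumption:

```python
import numpy as np

def xcorr_network(traces, threshold=0.8):
    """Build an adjacency matrix from the peak normalized cross-correlation
    of every pair of single-cell traces (rows = cells, columns = time)."""
    n_cells, n_t = traces.shape
    z = traces - traces.mean(axis=1, keepdims=True)
    z /= z.std(axis=1, keepdims=True)
    adj = np.zeros((n_cells, n_cells), dtype=bool)
    for i in range(n_cells):
        for j in range(i + 1, n_cells):
            cc = np.correlate(z[i], z[j], mode="full") / n_t  # all lags
            if cc.max() >= threshold:
                adj[i, j] = adj[j, i] = True
    return adj

rng = np.random.default_rng(0)
t = np.arange(500)
s = np.sin(2 * np.pi * t / 50)                                   # shared oscillation
cells = np.vstack([s + 0.1 * rng.standard_normal(500),
                   np.roll(s, 5) + 0.1 * rng.standard_normal(500),  # lagged copy
                   rng.standard_normal(500)])                       # independent cell
adj = xcorr_network(cells)
print(adj.sum(axis=1))   # degree of each cell
```

Searching over all lags lets the method link cells whose activity is correlated but delayed, as with propagating calcium waves; the degree sequence and other graph measures then describe the network.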

  17. Deconvolution of the tree ring based delta13C record

    International Nuclear Information System (INIS)

    Peng, T.; Broecker, W.S.; Freyer, H.D.; Trumbore, S.

    1983-01-01

    We assumed the tree-ring-based 13C/12C record constructed by Freyer and Belacy (1983) to be representative of the fossil fuel and forest-soil induced 13C/12C change for atmospheric CO2. Through the use of a modification of the Oeschger et al. ocean model, we have computed the contribution of the combustion of coal, oil, and natural gas to this observed 13C/12C change. A large residual remains when the tree-ring-based record is corrected for the contribution of fossil fuel CO2. A deconvolution was performed on this residual to determine the time history and magnitude of the forest-soil reservoir changes over the past 150 years. Several important conclusions were reached. (1) The magnitude of the integrated CO2 input from these sources was about 1.6 times that from fossil fuels. (2) The forest-soil contribution reached a broad maximum centered at about 1900. (3) Over the two-decade period covered by the Mauna Loa atmospheric CO2 content record, the input from forests and soils was about 30% of that from fossil fuels. (4) The 13C/12C trend over the last 20 years was dominated by the input of fossil fuel CO2. (5) The forest-soil release did not contribute significantly to the secular increase in atmospheric CO2 observed over the last 20 years. (6) The pre-1850 atmospheric pCO2 values must have been in the range 245 to 270 x 10^-6 atmospheres.

  18. Dose Record Analysis of External Exposure of Workers in Madagascar

    International Nuclear Information System (INIS)

    Andriambololona, R.; Ratovonjanahary, J. F.; Randriantsizafy, R. D.

    2004-01-01

    percent of the Hp(10) values are less than 2.5 mSv. In the previous survey, it was 85 percent. Only 1 percent of the doses exceeded 5 mSv. In the previous survey it was around 2 percent, and it reached 13 percent in 1994. These results show that appropriate dose record keeping and analysis can help in assessing the implementation of safety and radiation protection in a country. Furthermore, this shows that training and education can have an important role in upgrading radiation protection and safety. (Author)

  19. Localizing wushu players on a platform based on a video recording

    Science.gov (United States)

    Peczek, Piotr M.; Zabołotny, Wojciech M.

    2017-08-01

    This article describes the development of a method to localize an athlete on a platform during a sports performance, based on a static video recording. The sport considered for this method is wushu, a martial art; however, any other discipline could be used. Requirements are specified, and two image-processing algorithms are described. The next part presents an experiment conducted using recordings from the Pan American Wushu Championship, on which the steps of the algorithm are shown. Results are evaluated manually. The last part of the article concludes whether the algorithm is applicable and what improvements have to be implemented to use it during sports competitions as well as for offline analysis.
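The abstract does not spell out its two algorithms, but with a static camera the simplest localization approach is frame differencing against the previous frame and taking the centroid of the changed pixels. A minimal numpy sketch of that idea (threshold and frame sizes are illustrative):

```python
import numpy as np

def locate_moving_subject(prev_frame, frame, diff_thresh=30):
    """Estimate the subject's position as the centroid of pixels that
    changed between consecutive frames (simple background subtraction,
    valid only for a static camera)."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int)) > diff_thresh
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None          # no motion detected in this frame pair
    return float(xs.mean()), float(ys.mean())

# synthetic frames: a bright 10x10 "athlete" moves from x=20 to x=40
f0 = np.zeros((100, 200), dtype=np.uint8)
f1 = np.zeros_like(f0)
f0[45:55, 20:30] = 255
f1[45:55, 40:50] = 255
print(locate_moving_subject(f0, f1))   # → (34.5, 49.5)
```

The centroid lands midway between the old and new silhouettes, which is why practical systems usually difference against a learned background model rather than the previous frame.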

  20. The Sensitivity of Flood Frequency Analysis to Record Length in the Conterminous United States

    Science.gov (United States)

    Hu, L.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    In flood frequency analysis (FFA), sufficiently long data series are important for obtaining reliable results. Compared to the return periods of interest, at-site FFA usually needs large data sets. Generally, the precision of at-site estimators and the time-sampling errors are associated with the length of the gauged record. In this work, we quantify the differences associated with various record lengths. We use the generalized extreme value (GEV) distribution and Log-Pearson type III (LP3), two traditional methods applied to annual maximum streamflows, to undertake FFA, and propose quantitative measures, the relative difference in the median and the interquartile range (IQR), to compare flood frequency performance across different record lengths at 350 selected USGS gauges, each with more than 70 years of record in the conterminous United States. We also group those gauges into regions based on the hydrologic unit map and discuss the geographic impacts. The results indicate that a long record length avoids imposing an upper limit on the degree of sophistication, and that working with longer records may lead to more accurate results than working with shorter ones. Furthermore, the influence of the hydrologic units of the watershed boundary dataset on those gauges is also presented. The California region is the most sensitive to record length, while gauges in the east perform steadily.
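The GEV half of the comparison can be sketched with scipy: fit annual maxima, then read the flood magnitude for a given return period off the fitted quantile function. The synthetic 70-year record below stands in for a USGS gauge series; parameter values are arbitrary:

```python
import numpy as np
from scipy.stats import genextreme

def flood_quantile(annual_max, return_period):
    """Fit a GEV to annual maximum flows by maximum likelihood and return
    the flow with the given return period, Q(T) = F^-1(1 - 1/T)."""
    shape, loc, scale = genextreme.fit(annual_max)
    return genextreme.ppf(1.0 - 1.0 / return_period, shape, loc=loc, scale=scale)

rng = np.random.default_rng(1)
series = genextreme.rvs(-0.1, loc=1000, scale=300, size=70, random_state=rng)

q100_70yr = flood_quantile(series, 100)        # full 70-year record
q100_30yr = flood_quantile(series[:30], 100)   # truncated 30-year record
print(round(q100_70yr), round(q100_30yr))
```

Repeating the truncation over many gauges and comparing the resulting quantile distributions (e.g. their medians and IQRs) is the kind of record-length sensitivity comparison the abstract describes.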

  1. Re-analysis of the Krakatoa Tsunami Records along the European Atlantic Coast

    Science.gov (United States)

    Karpytchev, M.; Daubord, C.; Hebert, H.; Woppelmann, G.

    2012-04-01

    The explosion of the Krakatoa volcano on August 27, 1883, generated one of the highest tsunamis ever recorded by tide gauges. The sea level measurements available at that time were collected and published by the Krakatoa Committee (Symons, 1888), but the original records seem to be lost. Pelinovsky et al. (2005) digitized the Krakatoa Committee reproductions and pointed out the difficulties of using Symons' (1888) figures for a quantitative analysis. In this study, we attempted to identify the Krakatoa tsunami signature in Symons' records along the British and French Atlantic coasts by comparing them to the sea level variations measured at the tidal station of Saint Servan. The original Saint Servan sea level record was recently discovered in the French Navy (SHOM) data archive. Wavelet-based techniques of cross-correlation and coherence analysis revealed a coherence between the Saint Servan observations and some of the Krakatoa Committee records. The wavelet-based methods helped to identify the Krakatoa tsunami signature in the English Channel and to estimate its parameters. Additional signal detection techniques were required, however, to extract the Krakatoa tsunami from the sea level oscillations recorded in the Bay of Biscay, at the Rochefort and Soccoa tidal stations.
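The coherence idea can be illustrated with scipy's Welch-based magnitude-squared coherence. This is a simplified stand-in for the wavelet coherence used in the study (wavelets additionally localize the coherence in time); the synthetic "tide gauge" series and the 30-minute oscillation are invented for the example:

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic tide-gauge series sharing a tsunami-band oscillation.
fs = 1 / 60.0                      # one sample per minute, in Hz
t = np.arange(6000) * 60.0         # time in seconds
f0 = 1 / 1800.0                    # shared 30-minute oscillation
rng = np.random.default_rng(2)
shared = np.sin(2 * np.pi * f0 * t)
gauge_a = shared + 0.3 * rng.standard_normal(t.size)
gauge_b = 0.8 * np.roll(shared, 7) + 0.3 * rng.standard_normal(t.size)  # lagged, attenuated

# Magnitude-squared coherence: near 1 where the two records share energy.
f, cxy = coherence(gauge_a, gauge_b, fs=fs, nperseg=512)
peak = cxy[np.argmin(np.abs(f - f0))]
print(round(float(peak), 2))
```

A high coherence at tsunami-band frequencies, despite lags and amplitude differences between stations, is what flags a common signal such as the Krakatoa wave across independent gauges.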

  2. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Science.gov (United States)

    2010-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  3. Anonymization of Electronic Medical Records to Support Clinical Analysis

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2013-01-01

    Anonymization of Electronic Medical Records to Support Clinical Analysis closely examines the privacy threats that may arise from medical data sharing, and surveys the state-of-the-art methods developed to safeguard data against these threats. To motivate the need for computational methods, the book first explores the main challenges facing the privacy-protection of medical data using the existing policies, practices and regulations. Then, it takes an in-depth look at the popular computational privacy-preserving methods that have been developed for demographic, clinical and genomic data sharing, and closely analyzes the privacy principles behind these methods, as well as the optimization and algorithmic strategies that they employ. Finally, through a series of in-depth case studies that highlight data from the US Census as well as the Vanderbilt University Medical Center, the book outlines a new, innovative class of privacy-preserving methods designed to ensure the integrity of transferred medical data for su...

  4. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Joe, Yang Hee; Cho, Sung Gook

    2003-01-01

    This paper briefly introduces an improved method for evaluating the seismic fragilities of components of nuclear power plants in Korea. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are also discussed in this paper. For the purpose of evaluating the effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures, several comparative case studies have been performed. The study results show that seismic fragility analysis based on Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities. (author)

  5. Flexible polyimide microelectrode array for in vivo recordings and current source density analysis.

    Science.gov (United States)

    Cheung, Karen C; Renaud, Philippe; Tanila, Heikki; Djupsund, Kaj

    2007-03-15

    This work presents implantable, flexible polymer-based probes with embedded microelectrodes for acute and chronic neural recordings in vivo, as tested on rodents. Acute recordings using this array were done in mice under urethane anesthesia and compared to those made using silicon-based probes manufactured at the Center for Neural Communication Technology, University of Michigan. The two electrode arrays yielded similar results. Recordings with chronically implanted polymer-based electrodes were performed for 60 days post-surgically in awake, behaving rats. The microelectrodes were used to monitor local field potentials and capture laminar differences in the function of cortex and hippocampus, and produced response waveforms of undiminished amplitude and signal-to-noise ratios 8 weeks after chronic implantation. The polymer-based electrodes could also be connected to a lesion current to mark specific locations in the tissue. Current source density (CSD) analysis of the recordings depicted a source-sink composition. Tissue response was assessed 8 weeks after insertion by immunochemical labeling with glial fibrillary acidic protein (GFAP) to identify astrocytes, and histological analysis showed minimal tissue reaction to the implanted structures.
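CSD analysis of laminar recordings is commonly computed as the negative second spatial derivative of the local field potential across equally spaced contacts, scaled by tissue conductivity. A minimal sketch under that standard approximation (contact spacing, conductivity value, and the toy potential profile are placeholders, not the paper's parameters):

```python
import numpy as np

def csd_second_derivative(lfp, spacing_um=100.0, sigma=0.3):
    """Estimate current source density as -sigma * d2(phi)/dz2 along a
    laminar probe; lfp has shape (channels, time samples)."""
    h = spacing_um * 1e-6                         # contact spacing in metres
    d2 = (lfp[:-2] - 2.0 * lfp[1:-1] + lfp[2:]) / h**2  # central difference
    return -sigma * d2                            # boundary channels are dropped

# toy laminar profile: a Gaussian potential bump across 16 contacts,
# constant over 5 time samples
z = np.arange(16)
lfp = np.exp(-((z - 8) ** 2) / 8.0)[:, None] * np.ones((1, 5))
csd = csd_second_derivative(lfp)
print(csd.shape)   # (14, 5): interior contacts only
```

The sign convention makes the CSD positive at the potential peak in this toy profile; alternating positive and negative bands down the probe are the "source-sink composition" the abstract refers to.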

  6. A computerised out-patient medical records programme based on the Summary Time-Oriented Record (STOR) System.

    Science.gov (United States)

    Cheong, P Y; Goh, L G; Ong, R; Wong, P K

    1992-12-01

    Advances in microcomputer hardware and software technology have made computerised outpatient medical records practical. We have developed a programme based on the Summary Time-Oriented Record (STOR) system which complements existing paper-based record keeping. The elements of the Problem Oriented Medical Record (POMR) system are displayed in two windows within one screen, namely, the SOAP (Subjective information, Objective information, Assessments and Plans) elements in the Reason For Encounter (RFE) window and the problem list with outcomes in the Problem List (PL) window. Context-sensitive child windows display details of management plans in the RFE window and clinical notes in the PL window. The benefits of such innovations to clinical decision making and practice-based research, and their medico-legal implications, are discussed.

  7. Opto-mechatronics issues in solid immersion lens based near-field recording

    Science.gov (United States)

    Park, No-Cheol; Yoon, Yong-Joong; Lee, Yong-Hyun; Kim, Joong-Gon; Kim, Wan-Chin; Choi, Hyun; Lim, Seungho; Yang, Tae-Man; Choi, Moon-Ho; Yang, Hyunseok; Rhim, Yoon-Chul; Park, Young-Pil

    2007-06-01

    We analyzed the effects of an external shock on the collision problem in solid immersion lens (SIL)-based near-field recording (NFR) through a shock response analysis, and proposed a possible solution to this problem by adopting a protector and a safety mode. With the proposed method, collisions between the SIL and the media can be avoided. We showed a possible solution to the contamination problem in SIL-based NFR through a numerical air-flow analysis. We also introduced possible solid immersion lens designs to increase the fabrication and assembly tolerances of an optical head with a replicated lens. Potentially, these research results could advance NFR technology toward a commercial product.

  8. Electronic Health Record for Intensive Care based on Usual Windows Based Software.

    Science.gov (United States)

    Reper, Arnaud; Reper, Pascal

    2015-08-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and avoid being locked into proprietary software, we developed an EHR based on usual software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers the users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used to care for more than five thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communications with other medical software were not developed from the start, and are realized through a communication engine with basic functionality. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on usual software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.

  9. Analysis of severe atmospheric disturbances from airline flight records

    Science.gov (United States)

    Wingrove, R. C.; Bach, R. E., Jr.; Schultz, T. A.

    1989-01-01

    Advanced methods were developed to determine time varying winds and turbulence from digital flight data recorders carried aboard modern airliners. Analysis of several cases involving severe clear air turbulence encounters at cruise altitudes has shown that the aircraft encountered vortex arrays generated by destabilized wind shear layers above mountains or thunderstorms. A model was developed to identify the strength, size, and spacing of vortex arrays. This model is used to study the effects of severe wind hazards on operational safety for different types of aircraft. It is demonstrated that small remotely piloted vehicles and executive aircraft exhibit more violent behavior than do large airliners during encounters with high-altitude vortices. Analysis of digital flight data from the accident at Dallas/Ft. Worth in 1985 indicates that the aircraft encountered a microburst with rapidly changing winds embedded in a strong outflow near the ground. A multiple-vortex-ring model was developed to represent the microburst wind pattern. This model can be used in flight simulators to better understand the control problems in severe microburst encounters.

  10. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

    By nature, the seismic fragility analysis results will be considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated from the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities

  11. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

    By nature, the seismic fragility analysis results will be considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated from the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra in Korea might over-estimate the seismic capacities of Korean facilities.

  12. 75 FR 79312 - Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking...

    Science.gov (United States)

    2010-12-20

    ...-2008-0619] RIN 3150-AI25 Requirements for Fingerprint-Based Criminal History Records Checks for... a fingerprint- based criminal history records check before granting any individual unescorted access...

  13. Model Adequacy Analysis of Matching Record Versions in Nosql Databases

    Directory of Open Access Journals (Sweden)

    E. V. Tsviashchenko

    2015-01-01

    Full Text Available The article investigates a model of matching record versions. The goal of this work is to analyse the model's adequacy. This model allows estimating the distribution of a user's time to process record versions and the distribution of the record versions count. The second variant of the model was used, according to which the time for a client to process record versions depends explicitly on the number of updates performed by the other users between the sequential updates performed by the current client. In order to prove the model's adequacy, a real experiment was conducted in a cloud cluster. The cluster contains 10 virtual nodes, provided by the DigitalOcean Company. Ubuntu Server 14.04 was used as the operating system (OS). The NoSQL system Riak was chosen for the experiments. Riak versions 2.0 and later provide the "dotted version vectors" (DVV) option, which is an extension of the classic vector clock. Their use guarantees that the count of versions simultaneously stored in the DB will not exceed the count of clients operating in parallel on a record. This is very important while conducting experiments. The application was developed using the Java library provided by Riak. The processes run directly on the nodes. Two records were used in the experiment: Z, the record whose versions are handled by clients, and RZ, a service record which contains record update counters. The application algorithm can be briefly described as follows: every client reads the versions of record Z, processes its updates using the RZ record counters, and saves the treated record in the database while old versions are deleted from the DB. Then a client rereads the RZ record and increments the update counters for the other clients. After that, the client rereads the Z record, saves the necessary statistics, and deliberates on the results of processing. In the case of a conflict emerging because of simultaneous updates of the RZ record, the client obtains all versions of that
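The read-all-versions, merge, write-back loop described above can be sketched with a toy in-memory store. This is a hypothetical stand-in for the Riak workflow, not its Java client API; "merge" here is simply the union of the update sets carried by the sibling versions:

```python
# Toy in-memory store that keeps sibling versions under one key, as a
# NoSQL database with W=R=1 might after concurrent writes.
class SiblingStore:
    def __init__(self):
        self.data = {}          # key -> list of sibling versions (sets of updates)

    def put(self, key, value):
        """An unconditional write just adds another sibling."""
        self.data.setdefault(key, []).append(value)

    def read_merge_write(self, key):
        """Client step from the abstract: fetch all versions, merge them,
        save the merged record, and discard the old versions."""
        siblings = self.data.get(key, [])
        merged = set().union(*siblings) if siblings else set()
        self.data[key] = [merged]
        return merged

store = SiblingStore()
store.put("Z", {"update-a"})        # two concurrent writers...
store.put("Z", {"update-b"})        # ...leave two siblings behind
print(store.read_merge_write("Z"))  # → {'update-a', 'update-b'} (order may vary)
```

With dotted version vectors, a real store bounds the sibling count by the number of concurrent clients, which is exactly what makes the "record versions count" in the model a meaningful quantity to measure.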

  14. Automatic detection of periods of slow wave sleep based on intracranial depth electrode recordings.

    Science.gov (United States)

    Reed, Chrystal M; Birch, Kurtis G; Kamiński, Jan; Sullivan, Shannon; Chung, Jeffrey M; Mamelak, Adam N; Rutishauser, Ueli

    2017-04-15

    An automated process for sleep staging based on intracranial EEG data alone is needed to facilitate research into the neural processes occurring during slow wave sleep (SWS). Current manual methods for sleep scoring require a full polysomnography (PSG) set-up, including electrooculography (EOG), electromyography (EMG), and scalp electroencephalography (EEG). This set-up can be technically difficult to place in the presence of intracranial EEG electrodes. There is thus a need for a method for sleep staging based on intracranial recordings alone. Here we show a reliable automated method for the detection of periods of SWS based solely on intracranial EEG recordings. The method utilizes the ratio of spectral power in the delta, theta, and spindle frequencies relative to the alpha and beta frequencies to classify 30-s segments as SWS or not. We evaluated this new method by comparing its performance against visually scored patients (n=9), in whom we also recorded EOG and EMG simultaneously. Our method had a mean positive predictive value of 64% across all nights. An ROC analysis of the performance of our algorithm compared to manually labeled nights revealed a mean area under the curve of 0.91 across all nights. Our method had an average kappa score of 0.72 when compared to visual sleep scoring by an independent blinded sleep scorer. This shows that this simple method is capable of differentiating between SWS and non-SWS epochs reliably based solely on intracranial EEG recordings. Copyright © 2017 Elsevier B.V. All rights reserved.
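The spectral-ratio rule above can be sketched with scipy's Welch estimator: compute band powers for a 30-s segment and form the (delta + theta + spindle) / (alpha + beta) ratio that is then thresholded. The band edges and synthetic signals below are illustrative assumptions, not the study's exact parameters:

```python
import numpy as np
from scipy.signal import welch

# Band edges in Hz; the spindle/alpha boundaries are illustrative choices.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "spindle": (11, 16),
         "alpha": (8, 12), "beta": (16, 30)}

def sws_score(segment, fs):
    """Ratio of (delta + theta + spindle) power to (alpha + beta) power
    for one 30-s segment; large values suggest slow wave sleep."""
    f, pxx = welch(segment, fs=fs, nperseg=int(4 * fs))
    def band_power(lo, hi):
        sel = (f >= lo) & (f < hi)
        return pxx[sel].sum()
    num = sum(band_power(*BANDS[b]) for b in ("delta", "theta", "spindle"))
    den = sum(band_power(*BANDS[b]) for b in ("alpha", "beta"))
    return num / den

fs = 200.0
t = np.arange(int(30 * fs)) / fs
rng = np.random.default_rng(3)
slow = np.sin(2 * np.pi * 1.0 * t) + 0.2 * rng.standard_normal(t.size)   # delta-dominated
fast = np.sin(2 * np.pi * 20.0 * t) + 0.2 * rng.standard_normal(t.size)  # beta-dominated
print(sws_score(slow, fs) > sws_score(fast, fs))   # → True
```

Classifying each 30-s epoch then reduces to comparing this score against a threshold, which is how a ratio-based stager can run on intracranial channels without EOG or EMG.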

  15. Simultaneous surface and depth neural activity recording with graphene transistor-based dual-modality probes.

    Science.gov (United States)

    Du, Mingde; Xu, Xianchen; Yang, Long; Guo, Yichuan; Guan, Shouliang; Shi, Jidong; Wang, Jinfen; Fang, Ying

    2018-05-15

    Subdural surface and penetrating depth probes are widely applied to record neural activities from the cortical surface and intracortical locations of the brain, respectively. Simultaneous surface and depth neural activity recording is essential to understand the linkage between the two modalities. Here, we develop flexible dual-modality neural probes based on graphene transistors. The neural probes exhibit stable electrical performance even under 90° bending because of the excellent mechanical properties of graphene, and thus allow multi-site recording from the subdural surface of rat cortex. In addition, finite element analysis was carried out to investigate the mechanical interactions between probe and cortex tissue during intracortical implantation. Based on the simulation results, a sharp tip angle of π/6 was chosen to facilitate tissue penetration of the neural probes. Accordingly, the graphene transistor-based dual-modality neural probes have been successfully applied for simultaneous surface and depth recording of epileptiform activity of rat brain in vivo. Our results show that graphene transistor-based dual-modality neural probes can serve as a facile and versatile tool to study spatio-temporal patterns of neural activities. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Coral-based climate records from tropical South Atlantic

    DEFF Research Database (Denmark)

    Pereira, Natan S.; Sial, Alcides N.; Kikuchi, Ruy K.P.

    2015-01-01

    Coral skeletons contain records of past environmental conditions due to their long life span and well calibrated geochemical signatures. C and O isotope records of corals are especially interesting, because they can highlight multidecadal variability of local climate conditions beyond the instrum...

  17. Virtual records : standardized safety training and web-based record keeping reduces training duplication

    Energy Technology Data Exchange (ETDEWEB)

    Stastny, R.P.

    2010-02-15

    Duplication in safety training is having a significant impact on the oil and gas industry in Canada. This article discussed a method of standardizing safety training and ensuring that accredited courses are completed by individual workers. Internet servers provided by the Alberta government are being used by the Health and Safety Association Network (HSAN) to provide access to safety training records. The program was initiated in collaboration with local governments, unions, labour providers, and other safety associations. The industry training and tracking system (ITTS) database was made fully operational in 2009. The ITTS aims to have the safety records of over a million people stored in its database within a 5-year period. The list of safety courses accepted by the HSAN administrator will also continue to expand.

  18. 76 FR 76103 - Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    ... portions of the Risk Analysis and Management (RAM) Records, State-78, system of records contain criminal... for Department of State contracts, grants, cooperative agreements, or other funding. The information.... national security interests. The records may contain criminal investigation records, investigatory material...

  19. A definition for influenza pandemics based on historical records.

    Science.gov (United States)

    Potter, Chris W; Jennings, Roy

    2011-10-01

    To analyse the records of past influenza outbreaks to determine a definition for pandemics. Analysis of publications of large outbreaks of influenza which have occurred since 1889/90, and to match the results against the current definitions of an influenza pandemic. According to the general understanding of a pandemic, nine outbreaks of influenza since 1889/90 satisfy the definition; however, for two of these, occurring in 1900 and 1933, the data are limited. The special condition for an influenza pandemic requires, in one definition, that the virus strain responsible could not have arisen from the previous circulating strain by mutation; and in the second, that the new strain be a different subtype to the previously circulating strain. Both these restrictions deny pandemic status to two, and possibly three, influenza outbreaks which were pandemics according to the more general understanding of the term. These observations suggest that a re-evaluation of the criteria which define influenza pandemics should be carried out. The contradiction outlined above brings the previous definitions of an influenza pandemic into question; however, this can be resolved by defining an influenza pandemic by the following criteria. Thus, an influenza pandemic arises at a single, specific place and spreads rapidly to involve numerous countries. The haemagglutinin (HA) of the emergent virus does not cross-react serologically with the previously dominant virus strain(s), and there is a significant lack of immunity in the population against the emergent virus. These three criteria are interlinked and can be determined early to alert authorities who could respond appropriately. Other criteria associated with pandemics are necessarily retrospective, although important and valid. The implications of this definition are discussed. Copyright © 2011 The British Infection Association. Published by Elsevier Ltd. All rights reserved.

  20. Quality Assurance in a Computer-Based Outpatient Record

    Science.gov (United States)

    Colloff, Edwin; Morgan, Mary; Beaman, Peter; Justice, Norma; Kunstaetter, Robert; Barnett, G. Octo

    1980-01-01

    COSTAR, a COmputer-STored Ambulatory Record system, was developed at the Massachusetts General Hospital Laboratory of Computer Science. It can supplement or entirely replace the paper medical record with a highly encoded record. Although a computer-stored medical record provides a unique opportunity for quality assurance activities, it requires programming skills to examine the data. We have taken the dual approach of writing pre-specified quality assurance packages and developing a high level Medical Query Language (MQL) that can be used by non-programmers. While each approach has pros and cons, we are encouraged by our results which show that a quality assurance program can be written reasonably well in MQL by individuals who have little programming experience.

  1. Symptom diagnostics based on clinical records: a tool for scientific research in child psychiatry?

    Science.gov (United States)

    de Jong, Marianne; Punt, Marja; de Groot, Erik; Hielkema, Tjitske; Struik, Marianne; Minderaa, Ruud B; Hadders-Algra, Mijna

    2009-05-01

    Child psychiatric diagnoses are generally based on a clinical examination and not on standardized questionnaires. The present study assessed whether symptom diagnostics based on clinical records facilitates the use of non-standardized clinical material for research. Six hundred and eighty-five children, referred to a third-level child psychiatric centre in the Netherlands, were, after extensive multidisciplinary examination, classified according to the multi-axial classification scheme for psychiatric disorders in childhood and adolescence (MAC-ICD-9). Two raters scored 44 behavioural symptoms based on the clinical records of these children, and interrater agreement was assessed on a subsample of 50 records. Principal components analysis was performed on the symptom scores of all children; factor scores were related to MAC-ICD-9 classifications. Interrater reliability for behavioural symptoms was excellent (kappa = 0.88). Many children with psychiatric problems suffer from a large number of behavioural symptoms. Factor scores of the symptoms revealed recognizable, well-interpretable entities and indicated overlap in symptomatology and comorbidity. A symptom-based diagnostic approach built on extensive clinical patient files may provide a special dimension to improve the reliability of psychiatric classification.
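
    An interrater reliability figure like the reported kappa = 0.88 comes from Cohen's kappa, which can be computed from scratch. A minimal Python sketch; the ratings below are invented for illustration, not the study's data:

```python
# Cohen's kappa for two raters scoring a symptom as present (1) or
# absent (0). Ratings are invented; the study reports kappa = 0.88
# over 44 symptoms scored in 50 records.
def cohens_kappa(a, b):
    n = len(a)
    categories = set(a) | set(b)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    # chance agreement from each rater's marginal category frequencies
    p_expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

rater1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
rater2 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
kappa = cohens_kappa(rater1, rater2)   # 0.8 for this toy example
```

    Kappa corrects the raw 90% agreement of this toy example for the agreement expected by chance alone.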

  2. Julius – a template based supplementary electronic health record system

    Directory of Open Access Journals (Sweden)

    Klein Gunnar O

    2007-05-01

    Full Text Available Abstract Background EHR systems are widely used in hospitals and primary care centres but it is usually difficult to share information and to collect patient data for clinical research. This is partly due to the different proprietary information models and inconsistent data quality. Our objective was to provide a more flexible solution enabling the clinicians to define which data should be recorded and shared for both routine documentation and clinical studies. It should be possible to reuse the data through a common set of variable definitions providing a consistent nomenclature and data validation. Another objective was that the templates used for data entry and presentation should be usable in combination with existing EHR systems. Methods We have designed and developed a template based system (called Julius that was integrated with existing EHR systems. The system is driven by the medical domain knowledge defined by clinicians in the form of templates and variable definitions stored in a common data repository. The system architecture consists of three layers. The presentation layer is purely web-based, which facilitates integration with existing EHR products. The domain layer consists of the template design system, a variable/clinical concept definition system, and the transformation and validation logic, all implemented in Java. The data source layer utilizes an object relational mapping tool and a relational database. Results The Julius system has been implemented, tested and deployed to three health care units in Stockholm, Sweden. The initial responses from the pilot users were positive. The template system facilitates patient data collection in many ways. The experience of using the template system suggests that enabling the clinicians to be in control of the system is a good way to add supplementary functionality to the present EHR systems. 
Conclusion The approach of the template system in combination with various local EHR

  3. Analysis of infant cortical synchrony is constrained by the number of recording electrodes and the recording montage.

    Science.gov (United States)

    Tokariev, Anton; Vanhatalo, Sampsa; Palva, J Matias

    2016-01-01

    To assess how the recording montage in the neonatal EEG influences the detection of cortical source signals and their phase interactions. Scalp EEG was simulated by forward modeling 20-200 simultaneously active sources covering the cortical surface of a realistic neonatal head model. We assessed systematically how the number of scalp electrodes (11-85), analysis montage, or the size of cortical sources affect the detection of cortical phase synchrony. Statistical metrics were developed for quantifying the resolution and reliability of the montages. The findings converge to show that an increase in the number of recording electrodes leads to a systematic improvement in the detection of true cortical phase synchrony. While there is always a ceiling effect with respect to discernible cortical details, we show that the average and Laplacian montages exhibit superior specificity and sensitivity as compared to other conventional montages. Reliability in assessing true neonatal cortical synchrony is directly related to the choice of EEG recording and analysis configurations. Because of the high conductivity of the neonatal skull, conventional neonatal EEG recordings are spatially far too sparse for pertinent studies, and this loss of information cannot be recovered by re-montaging during analysis. Future neonatal EEG studies will need prospective planning of the recording configuration to allow analysis of the spatial details required by each study question. Our findings also advise on the level of detail in brain synchrony that can be studied with existing datasets or by using conventional EEG recordings. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
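
    The point that re-montaging cannot recover lost spatial detail follows from montages being linear recombinations of the recorded channels. A minimal numpy sketch of the common average reference montage, on synthetic rather than neonatal data:

```python
import numpy as np

rng = np.random.default_rng(0)
eeg = rng.standard_normal((19, 1000))    # channels x samples, synthetic

def average_montage(x):
    """Common average reference: subtract the cross-channel mean at
    every sample. This is a purely linear re-mix of the recorded
    channels, so it cannot restore detail lost to sparse coverage."""
    return x - x.mean(axis=0, keepdims=True)

ref = average_montage(eeg)
```

    After re-referencing, the channels sum to zero at every sample; no new spatial information has been added.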

  4. Classifying Normal and Abnormal Status Based on Video Recordings of Epileptic Patients

    Directory of Open Access Journals (Sweden)

    Jing Li

    2014-01-01

    Full Text Available Based on video recordings of the movement of patients with epilepsy, this paper proposes a human action recognition scheme to detect distinct motion patterns and to distinguish the normal status from the abnormal status of epileptic patients. The scheme first extracts local features and holistic features, which are complementary to each other. Afterwards, a support vector machine is applied for classification. Based on the experimental results, this scheme obtains a satisfactory classification result and provides a fundamental analysis towards human-robot interaction with socially assistive robots in caring for patients with epilepsy (or other brain disorders) in order to protect them from injury.

  5. Validity of electronic diet recording nutrient estimates compared to dietitian analysis of diet records: A randomized controlled trial

    Science.gov (United States)

    Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...

  6. Reconstructing sacred landscapes from soils-based records

    Science.gov (United States)

    Simpson, Ian; Gilliland, Krista; Coningham, Robin; Manuel, Mark; Davis, Christopher; Strickland, Keir; Acharya, Kosh; Hyland, Katherine; Bull, Ian; Kinnaird, Timothy; Sanderson, David

    2015-04-01

    From soils- and sediments-based records we reconstruct development of the sacred landscape at Lumbini UNESCO World Heritage Site in the central Nepalese Terai, the birthplace of Buddha, founder of a world religion, and now a major place of pilgrimage to its temple site. The Terai is a plain less than 100 m above sea level with incising rivers that originate in the Churia Hills and flow to the Ganges. Alluvial sediments on the Terai plain, originating as laterite soils within the hills, are characterised by a range of textural classes rich in iron oxides and manganese, with sandier sediments near water sources and finer textures near the distal ends of alluvial reaches. Our objectives are to establish a chronological framework for occupation, identify influences of alluvial environments on site occupation and determine the process of secular and sacred site formation within the World Heritage Site. A set of key stratigraphies are the basis for our analyses and are located in a palaeo-channel adjacent to the temple site, within the temple site itself, and within the mound of the original Lumbini village. Optically stimulated luminescence (OSL) measurements of soils and sediments together with supporting single entity radiocarbon measurements provide robust chronological frameworks. Assessment of field properties, thin section micromorphology and organic biomarkers offer new insight into the intimate and complex relationships between natural, cultural and culturally mediated processes in landscape development. Integration of our findings allows a detailed narrative of cultural landscape development at Lumbini. The area was occupied from ca. 1,500 BC initially by a transient community who used the area for product storage and who were subject to persistent flooding with periodic major flood events. Subsequent occupation deliberately raised a permanent village settlement above the level of flood events, with associated managed field cultivation. 
Village life was

  7. Detecting seismic activity with a covariance matrix analysis of data recorded on seismic arrays

    Science.gov (United States)

    Seydoux, L.; Shapiro, N. M.; de Rosny, J.; Brenguier, F.; Landès, M.

    2016-03-01

    Modern seismic networks are recording the ground motion continuously at the Earth's surface, providing dense spatial samples of the seismic wavefield. The aim of our study is to analyse these records with statistical array-based approaches to identify coherent time-series as a function of time and frequency. Using ideas mainly brought from the random matrix theory, we analyse the spatial coherence of the seismic wavefield from the width of the covariance matrix eigenvalue distribution. We propose a robust detection method that could be used for the analysis of weak and emergent signals embedded in background noise, such as the volcanic or tectonic tremors and local microseismicity, without any prior knowledge about the studied wavefields. We apply our algorithm to the records of the seismic monitoring network of the Piton de la Fournaise volcano located at La Réunion Island and composed of 21 receivers with an aperture of ~15 km. This array recorded many teleseismic earthquakes as well as seismovolcanic events during the year 2010. We show that the analysis of the wavefield at frequencies smaller than ~0.1 Hz results in detection of the majority of teleseismic events from the Global Centroid Moment Tensor database. The seismic activity related to the Piton de la Fournaise volcano is well detected at frequencies above 1 Hz.
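
    The core idea, that a coherent wavefront concentrates the covariance-matrix eigenvalue distribution while incoherent noise spreads it, can be sketched in a few lines of numpy. The entropy-based width measure and the synthetic signals below are illustrative choices, not the authors' exact estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_width(records):
    """Normalised entropy of the covariance-matrix eigenvalue
    distribution for an array of sensors.

    records: (n_sensors, n_samples). Returns a value in (0, 1]:
    near 1 for incoherent noise (eigenvalues spread evenly), small
    when one coherent wavefront dominates the array.
    """
    eigvals = np.linalg.eigvalsh(np.cov(records))
    p = eigvals / eigvals.sum()
    return float(-np.sum(p * np.log(p + 1e-15)) / np.log(len(p)))

n_sensors, n_samples = 21, 2048   # 21 receivers, as at Piton de la Fournaise

# incoherent background: independent noise on every sensor
noise = rng.standard_normal((n_sensors, n_samples))

# coherent "event": one waveform seen on every sensor (delays ignored)
wave = np.sin(2 * np.pi * 5 * np.linspace(0, 1, n_samples))
event = wave + 0.2 * rng.standard_normal((n_sensors, n_samples))
```

    A detector can then threshold the width over sliding time-frequency windows: windows where the width drops sharply contain spatially coherent energy.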

  8. Analysis of recorded earthquake response data at the Hualien large-scale seismic test site

    International Nuclear Information System (INIS)

    Hyun, C.H.; Tang, H.T.; Dermitzakis, S.; Esfandiari, S.

    1997-01-01

    A soil-structure interaction (SSI) experiment is being conducted in a seismically active region in Hualien, Taiwan. To obtain earthquake data for quantifying SSI effects and providing a basis to benchmark analysis methods, a 1/4-th scale cylindrical concrete containment model similar in shape to that of a nuclear power plant containment was constructed in the field where both the containment model and its surrounding soil, surface and sub-surface, are extensively instrumented to record earthquake data. Between September 1993 and May 1995, eight earthquakes with Richter magnitudes ranging from 4.2 to 6.2 were recorded. The author focuses on studying and analyzing the recorded data to provide information on the response characteristics of the Hualien soil-structure system, the SSI effects and the ground motion characteristics. An effort was also made to directly determine the site soil physical properties based on correlation analysis of the recorded data. No modeling simulations were attempted to try to analytically predict the SSI response of the soil and the structure. These will be the scope of a subsequent study.

  9. Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?

    Science.gov (United States)

    Strathie, Ailsa; Walker, Guy H

    2016-03-01

    A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
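
    The four metrics named above can be computed directly from an undirected link graph. A self-contained Python sketch; the driver-action nodes and links here are invented for illustration, not taken from the on-train data:

```python
from collections import deque

# Invented adjacency sets: links observed between driver actions on a
# journey (nodes and edges are illustrative, not from the study).
graph = {
    "throttle": {"brake", "horn", "AWS_ack"},
    "brake":    {"throttle", "horn"},
    "horn":     {"throttle", "brake"},
    "AWS_ack":  {"throttle"},
}

n = len(graph)
n_links = sum(len(v) for v in graph.values()) // 2    # undirected edge count

density = 2 * n_links / (n * (n - 1))                 # fraction of possible links

def shortest_paths(src):
    """BFS hop counts from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# diameter: longest shortest path over all node pairs
diameter = max(max(shortest_paths(s).values()) for s in graph)

# sociometric status approximated by normalised degree
status = {u: len(graph[u]) / (n - 1) for u in graph}
```

    Comparing these metrics across journeys or drivers is what surfaces the behavioural differences the study describes.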

  10. Electronic medical records: a developing and developed country analysis

    CSIR Research Space (South Africa)

    Sikhondze, NC

    2016-05-01

    Full Text Available on the accuracy and availability of the data and since most of the data is on paper format; this limits access to the data by healthcare providers and acts as a hindrance to healthcare delivery. The implementation of Electronic Medical Records (EMR), which...

  11. A security analysis of the Dutch electronic patient record system

    NARCIS (Netherlands)

    van 't Noordende, G.

    2010-01-01

    In this article, we analyze the security architecture of the Dutch Electronic Patient Dossier (EPD) system. Intended as a national infrastructure for exchanging medical patient records among authorized parties (particularly, physicians), the EPD has to address a number of requirements, ranging from

  12. Simulation of artificial earthquake records compatible with site specific response spectra using time series analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Fadavi Amiri

    2017-11-01

    Full Text Available Time history analysis of infrastructures like dams, bridges and nuclear power plants is one of the fundamental parts of their design process, but sufficient and suitable site-specific earthquake records for such analyses are rarely available; therefore, generation of artificial accelerograms is required for research in this area. Using time series analysis, wavelet transforms, artificial neural networks and a genetic algorithm, a new method is introduced to produce artificial accelerograms compatible with response spectra for a specified site condition. In the proposed method, some recorded accelerograms are first selected based on the soil condition at the recording station. The soils at these stations are divided into two groups, soil and rock, according to their measured shear wave velocity. These accelerograms are then analyzed using the wavelet transform. Next, the ability of artificial neural networks to produce a reverse signal from response spectra is used to produce wavelet coefficients. Furthermore, a genetic algorithm is employed to optimize the network weight and bias matrices by searching a wide range of values, preventing the neural network from converging on local optima. Finally, site-specific accelerograms are produced. In this paper a number of accelerograms recorded in Iran are employed to test the neural network performance and to demonstrate the effectiveness of the method. It is shown that combining time series analysis, a genetic algorithm, a neural network and the wavelet transform increases the capabilities of the algorithm and improves its speed and accuracy in generating accelerograms compatible with site-specific response spectra for different site conditions.

  13. A Way to Understand Inpatients Based on the Electronic Medical Records in the Big Data Environment

    Directory of Open Access Journals (Sweden)

    Hongyi Mao

    2017-01-01

    Full Text Available In recent decades, information technology in healthcare, such as the Electronic Medical Record (EMR) system, has the potential to improve the service quality and cost efficiency of hospitals. The continuous use of EMR systems has generated a great amount of data. However, hospitals tend to use these data to report their operational efficiency rather than to understand their patients. Based on a dataset of inpatients’ medical records from a Chinese general public hospital, this study applies a configuration analysis from a managerial perspective and explains inpatient management in a different way. Four inpatient configurations (valued patients, managed patients, normal patients, and potential patients) are identified by the measures of length of stay and total hospital cost. The implications of the findings are discussed.
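
    A two-dimensional configuration analysis of this kind can be sketched as a quadrant split on length of stay and total cost. The records, the median-split thresholds, and the mapping of the four labels onto quadrants below are all assumptions for illustration; the study does not publish its exact rule:

```python
import numpy as np

# hypothetical inpatient records: (length of stay in days, total cost);
# values are invented, not from the hospital dataset
stays = np.array([3, 10, 4, 15, 2, 14, 12, 5])
costs = np.array([9500, 9000, 1200, 2500, 700, 900, 11000, 1500])

los_med, cost_med = np.median(stays), np.median(costs)

labels = []
for los, cost in zip(stays, costs):
    if los <= los_med and cost > cost_med:
        labels.append("valued")      # short stay, high cost (assumed mapping)
    elif los > los_med and cost > cost_med:
        labels.append("managed")     # long stay, high cost
    elif los <= los_med and cost <= cost_med:
        labels.append("normal")      # short stay, low cost
    else:
        labels.append("potential")   # long stay, low cost
```

    Each record lands in exactly one of the four configurations, which can then be profiled and managed separately.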

  14. A Way to Understand Inpatients Based on the Electronic Medical Records in the Big Data Environment

    Science.gov (United States)

    2017-01-01

    In recent decades, information technology in healthcare, such as the Electronic Medical Record (EMR) system, has the potential to improve the service quality and cost efficiency of hospitals. The continuous use of EMR systems has generated a great amount of data. However, hospitals tend to use these data to report their operational efficiency rather than to understand their patients. Based on a dataset of inpatients' medical records from a Chinese general public hospital, this study applies a configuration analysis from a managerial perspective and explains inpatient management in a different way. Four inpatient configurations (valued patients, managed patients, normal patients, and potential patients) are identified by the measures of length of stay and total hospital cost. The implications of the findings are discussed. PMID:28280506

  15. An Interrupted Time Series Analysis to Determine the Effect of an Electronic Health Record-Based Intervention on Appropriate Screening for Type 2 Diabetes in Urban Primary Care Clinics in New York City.

    Science.gov (United States)

    Albu, Jeanine B; Sohler, Nancy; Li, Rui; Li, Xuan; Young, Edwin; Gregg, Edward W; Ross-Degnan, Dennis

    2017-08-01

    To determine the impact of a health system-wide primary care diabetes management system, which included targeted guidelines for type 2 diabetes (T2DM) and prediabetes (dysglycemia) screening, on detection of previously undiagnosed dysglycemia cases. Intervention included electronic health record (EHR)-based decision support and standardized provider and staff training on using the American Diabetes Association guidelines for dysglycemia screening. Using EHR data, we identified 40,456 adults without T2DM or recent screening with a face-to-face visit (March 2011-December 2013) in five urban clinics. Interrupted time series analyses examined the impact of the intervention on trends in three outcomes: (1) the monthly proportion of eligible patients receiving dysglycemia testing, (2) two negative comparison conditions (dysglycemia testing among ineligible patients and cholesterol screening), and (3) the yield of undiagnosed dysglycemia among those tested. The baseline monthly proportion of eligible patients receiving testing was 7.4-10.4%. After the intervention, screening doubled (mean increase +11.0% [95% CI 9.0, 13.0], proportion range 18.6-25.3%). The proportion of ineligible patients tested also increased (+5.0% [95% CI 3.0, 8.0]) with no concurrent change in cholesterol testing (+0% [95% CI -0.02, 0.05]). About 59% of test results in eligible patients showed dysglycemia both before and after the intervention. Implementation of a policy for systematic dysglycemia screening including formal training and EHR templates in urban academic primary care clinics resulted in a doubling of appropriate testing and the number of patients who could be targeted for treatment to prevent or delay T2DM. © 2017 by the American Diabetes Association.
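
    The segmented-regression form of an interrupted time series can be sketched with ordinary least squares: one coefficient each for the baseline level, the baseline trend, the level change at the intervention, and the trend change after it. The data below are simulated to echo the reported jump; they are not the study's records:

```python
import numpy as np

rng = np.random.default_rng(2)
n, t0 = 34, 18                        # months of data, intervention month
t = np.arange(n)
post = (t >= t0).astype(float)

# simulated monthly screening proportions: ~9% baseline with a
# +11-percentage-point jump at the intervention (illustrative values)
y = (0.09 + 0.001 * t + 0.11 * post + 0.002 * post * (t - t0)
     + rng.normal(0, 0.005, n))

# segmented regression design: intercept, baseline trend,
# level change at t0, trend change after t0
X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = beta[2]                # estimate of the immediate jump
```

    The negative comparison conditions in the study play the role of falsification checks: the same model fitted to ineligible patients or cholesterol testing should show little or no level change.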

  16. Crossing trend analysis methodology and application for Turkish rainfall records

    Science.gov (United States)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis, and the literature has many papers on their use, pros and cons, and comparisons. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series is the one with the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of the method is demonstrated through extensive Monte Carlo simulations and comparison with existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
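
    The crossing criterion can be illustrated numerically: detrend with a candidate slope through the series centroid and keep the slope whose residual has the maximum number of mean up-crossings. A numpy sketch on simulated data (the grid search and the synthetic series are illustrative, not the paper's procedure):

```python
import numpy as np

def up_crossings(x):
    """Count up-crossings of the series mean."""
    above = x > x.mean()
    return int(np.sum(~above[:-1] & above[1:]))

def crossing_trend_slope(y, candidate_slopes):
    """Return the candidate slope (through the series centroid) whose
    removal leaves the maximum number of mean up-crossings."""
    t = np.arange(len(y)) - (len(y) - 1) / 2.0   # centred time axis
    return max(candidate_slopes, key=lambda s: up_crossings(y - s * t))

rng = np.random.default_rng(1)
n = 200
y = 0.05 * np.arange(n) + rng.standard_normal(n)   # known slope 0.05
best = crossing_trend_slope(y, np.linspace(-0.2, 0.2, 401))
```

    Detrending with a slope near the true one leaves noise-like residuals that cross their mean often; any remaining trend pins the series on one side of the mean and suppresses crossings.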

  17. Query Log Analysis of an Electronic Health Record Search Engine

    Science.gov (United States)

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A.

    2011-01-01

    We analyzed a longitudinal collection of query logs of a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over a course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers, for patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of information needs manifested through the queries, as well as temporal patterns of the users’ information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. Therefore, we envision a significant challenge, along with significant opportunities, in providing intelligent query recommendations to facilitate information retrieval in EHR. PMID:22195150

  18. 77 FR 27561 - Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking...

    Science.gov (United States)

    2012-05-11

    ... Fingerprinting and Criminal History Records Check Requirements for Unescorted Access to Research and Test... Criminal History Records Check Requirements for Unescorted Access to the General Atomics' Research and Test...-Based Criminal History Records Checks for Individuals Seeking Unescorted Access to Non-Power Reactors...

  19. Analysis and feasibility of chemical recording using thermosensitive liposomes

    Science.gov (United States)

    Tanner, Maria E.; Vasievich, Elizabeth A.; Protz, Jonathan M.

    2007-12-01

    A new generation of inertial measurement technology is being developed enabling a 10-micron particle to be "aware" of its geospatial location and respond to this information. The proposed approach combines an inertially-sensitive nanostructure or nano fluid/structure system with a nano-sized chemical reactor that functions as an analog computer. Originally, a cantilever-controlled valve used to control a first order chemical reaction was proposed. The feasibility of this concept was evaluated, resulting in a device with significant size reductions, comparable gain, and lower bandwidth than current accelerometers. New concepts with additional refinements have been investigated. Buoyancy-driven convection coupled with a chemical recording technique is explored as a possible alternative. Using a micro-track containing regions of different temperatures and thermosensitive liposomes (TSL), a range of accelerations can be recorded and the position determined. Through careful design, TSL can be developed that have unique transition temperatures and each class of TSL will contain a unique DNA sequence that serves as an identifier. Acceleration can be detected through buoyancy-driven convection. As the liposomes travel to regions of warmer temperature, they will release their contents at the recording site, thus documenting the acceleration. This paper will outline the concept and present the feasibility.

  20. 49 CFR 1542.209 - Fingerprint-based criminal history records checks (CHRC).

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Fingerprint-based criminal history records checks... Operations § 1542.209 Fingerprint-based criminal history records checks (CHRC). (a) Scope. The following... fingerprint-based CHRC that does not disclose that he or she has a disqualifying criminal offense, as...

  1. 49 CFR 1544.230 - Fingerprint-based criminal history records checks (CHRC): Flightcrew members.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Fingerprint-based criminal history records checks... Fingerprint-based criminal history records checks (CHRC): Flightcrew members. (a) Scope. This section applies... each flightcrew member has undergone a fingerprint-based CHRC that does not disclose that he or she has...

  2. Analysis of Direct Recordings from the Surface of the Human Brain

    Science.gov (United States)

    Towle, Vernon L.

    2006-03-01

    , suggestive of a transiently active language network. Our findings suggest that analysis of coherence patterns can supplement visual inspection of conventional records to help identify pathological regions of cortex. With further study, it is hoped that analysis of single channel dynamics, along with analysis of multichannel lateral coherence patterns, and the functional holographic technique may allow determination of the boundaries of epileptic foci based on brief interictal recordings, possibly obviating the current need for extended monitoring of seizures.

  3. Flexible polyimide microelectrode array for in vivo recordings and current source density analysis

    OpenAIRE

    Cheung, Karen; Renaud, Philippe; Tanila, Heikki; Djupsund, Kaj

    2007-01-01

    This work presents implantable, flexible polymer-based probes with embedded microelectrodes for acute and chronic neural recordings in vivo, as tested on rodents. Acute recordings using this array were done in mice under urethane anesthesia and compared to those made using silicon-based probes manufactured at the Center for Neural Communication Technology, University of Michigan. The two electrode arrays yielded similar results. Recordings with chronically implanted polymer-based electr...

  4. Procedure for the record, calculation and analysis of costs at the Post Company of Cuba.

    Directory of Open Access Journals (Sweden)

    María Luisa Lara Zayas

    2012-12-01

    Full Text Available The Cuban enterprise system is undergoing important changes leading to a new economic model, one that requires increasing labour productivity and improving economic efficiency through the rational use of material, financial and human resources. The present work proposes a procedure, based on the application of cost techniques, for the recording, calculation and analysis of activity costs at the Post Company of Cuba in Sancti Spiritus, with the objective of achieving greater efficiency through the rational use of resources.

  5. Global surveillance of trends in cancer survival 2000-14 (CONCORD-3): analysis of individual records for 37 513 025 patients diagnosed with one of 18 cancers from 322 population-based registries in 71 countries.

    Science.gov (United States)

    Allemani, Claudia; Matsuda, Tomohiro; Di Carlo, Veronica; Harewood, Rhea; Matz, Melissa; Nikšić, Maja; Bonaventure, Audrey; Valkov, Mikhail; Johnson, Christopher J; Estève, Jacques; Ogunbiyi, Olufemi J; Azevedo E Silva, Gulnar; Chen, Wan-Qing; Eser, Sultan; Engholm, Gerda; Stiller, Charles A; Monnereau, Alain; Woods, Ryan R; Visser, Otto; Lim, Gek Hsiang; Aitken, Joanne; Weir, Hannah K; Coleman, Michel P

    2018-03-17

    In 2015, the second cycle of the CONCORD programme established global surveillance of cancer survival as a metric of the effectiveness of health systems and to inform global policy on cancer control. CONCORD-3 updates the worldwide surveillance of cancer survival to 2014. CONCORD-3 includes individual records for 37·5 million patients diagnosed with cancer during the 15-year period 2000-14. Data were provided by 322 population-based cancer registries in 71 countries and territories, 47 of which provided data with 100% population coverage. The study includes 18 cancers or groups of cancers: oesophagus, stomach, colon, rectum, liver, pancreas, lung, breast (women), cervix, ovary, prostate, and melanoma of the skin in adults, and brain tumours, leukaemias, and lymphomas in both adults and children. Standardised quality control procedures were applied; errors were rectified by the registry concerned. We estimated 5-year net survival. Estimates were age-standardised with the International Cancer Survival Standard weights. For most cancers, 5-year net survival remains among the highest in the world in the USA and Canada, in Australia and New Zealand, and in Finland, Iceland, Norway, and Sweden. For many cancers, Denmark is closing the survival gap with the other Nordic countries. Survival trends are generally increasing, even for some of the more lethal cancers: in some countries, survival has increased by up to 5% for cancers of the liver, pancreas, and lung. For women diagnosed during 2010-14, 5-year survival for breast cancer is now 89·5% in Australia and 90·2% in the USA, but international differences remain very wide, with levels as low as 66·1% in India. For gastrointestinal cancers, the highest levels of 5-year survival are seen in southeast Asia: in South Korea for cancers of the stomach (68·9%), colon (71·8%), and rectum (71·1%); in Japan for oesophageal cancer (36·0%); and in Taiwan for liver cancer (27·9%). By contrast, in the same world region

  6. Practical analysis of tide gauges records from Antarctica

    Science.gov (United States)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

    We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining 11 stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series does not generally exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness = 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea-level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The resulting time series have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP, the trend is 1.8 ± 0.2 mm/yr and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using the program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA. Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trend of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which are interpreted
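The stacking-and-regression procedure above reduces to simple arithmetic. Below is a hedged sketch with synthetic monthly data standing in for the PSMSL series; the 1.8 mm/yr trend and the -1.1 mm/yr average GIA correction are the values quoted in the abstract, while the noise level is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly relative sea level (mm), 1957-2013, with a 1.8 mm/yr trend
t = np.arange(1957, 2014, 1 / 12)                 # decimal years
rsl = 1.8 * (t - t[0]) + rng.normal(0, 15, t.size)

# Least-squares trend (mm/yr), as in a simple regression on a stacked series
slope, intercept = np.polyfit(t, rsl, 1)

# Remove the modeled GIA contribution to isolate the remaining signal;
# -1.1 mm/yr is the average GIA correction quoted in the abstract
gia = -1.1
corrected = slope - gia   # e.g. 1.8 - (-1.1) = 2.9 mm/yr
```

The sign convention matters: because the modeled GIA trend is negative, removing it raises the corrected rate, which is why 1.8 mm/yr observed becomes 2.9 mm/yr after correction.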

  7. Obesity research based on the Copenhagen School Health Records Register

    DEFF Research Database (Denmark)

    Baker, Jennifer L; Sørensen, Thorkild I A

    2011-01-01

    INTRODUCTION: To summarise key findings from research performed using data from the Copenhagen School Health Records Register over the last 30 years, with a main focus on obesity-related research. The register contains computerised anthropometric information on 372,636 schoolchildren from the capital city of Denmark. Additional information on the cohort members has been obtained via linkages with population studies and national registers. RESEARCH TOPICS: Studies using data from the register have made important contributions in the areas of the aetiology of obesity, the development of the obesity epidemic, and the long-term health consequences of birth weight as well as body size and growth in childhood. CONCLUSION: Research using this unique register is ongoing, and its contributions to the study of obesity as well as other topics will continue for years to come.

  8. Inference for exponentiated general class of distributions based on record values

    Directory of Open Access Journals (Sweden)

    Samah N. Sindi

    2017-09-01

    Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases of distributions and exponentiated Gompertz distribution are applied to the EGC of distributions.  

  9. Electronic Health Record Portal Adoption: a cross country analysis.

    Science.gov (United States)

    Tavares, Jorge; Oliveira, Tiago

    2017-07-05

    This study's goal is to understand the factors that drive individuals to adopt Electronic Health Record (EHR) portals and to estimate if there are differences between countries with different healthcare models. We applied a new adoption model using as a starting point the extended Unified Theory of Acceptance and Use of Technology (UTAUT2) by incorporating the Concern for Information Privacy (CFIP) framework. To evaluate the research model we used the partial least squares (PLS) - structural equation modelling (SEM) approach. An online questionnaire was administrated in the United States (US) and Europe (Portugal). We collected 597 valid responses. The statistically significant factors of behavioural intention are performance expectancy ([Formula: see text] total  = 0.285; P adoption of EHR portals and significant differences between the countries. Confidentiality issues do not seem to influence acceptance. The EHR portals usage patterns are significantly higher in US compared to Portugal.

  10. Spectroscopy for amateur astronomers recording, processing, analysis and interpretation

    CERN Document Server

    Trypsteen, Marc F M

    2017-01-01

    This accessible guide presents the astrophysical concepts behind astronomical spectroscopy, covering both the theory and the practical elements of recording, processing, analysing and interpreting your spectra. It covers astronomical objects, such as stars, planets, nebulae, novae, supernovae, and events such as eclipses and comet passages. Suitable for anyone with only a little background knowledge and access to amateur-level equipment, the guide's many illustrations, sketches and figures will help you understand and practise this scientifically important and growing field of amateur astronomy, up to the level of Pro-Am collaborations. Accessible to non-academics, it benefits many groups from novices and learners in astronomy clubs, to advanced students and teachers of astrophysics. This volume is the perfect companion to the Spectral Atlas for Amateur Astronomers, which provides detailed commented spectral profiles of more than 100 astronomical objects.

  11. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems.

    Science.gov (United States)

    Seo, Hwa Jeong; Kim, Hye Hyeon; Kim, Ju Han

    2011-09-01

    Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. SWOT analysis shows that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and can be managed systematically. An owner of medical information grants data access only to the specific person who gave the authorization for backup, based on the mutual trust between an individual and an organization. By employing SWOT analysis, we concluded that a linkage among medical organizations, or between an individual and an organization, can provide an efficient backup system.

  12. Instrumentation for low noise nanopore-based ionic current recording under laser illumination

    Science.gov (United States)

    Roelen, Zachary; Bustamante, José A.; Carlsen, Autumn; Baker-Murray, Aidan; Tabard-Cossa, Vincent

    2018-01-01

    We describe a nanopore-based optofluidic instrument capable of performing low-noise ionic current recordings of individual biomolecules under laser illumination. In such systems, simultaneous optical measurements generally introduce significant parasitic noise in the electrical signal, which can severely reduce the instrument sensitivity, critically hindering the monitoring of single-molecule events in the ionic current traces. Here, we present design rules and describe simple adjustments to the experimental setup to mitigate the different noise sources encountered when integrating optical components to an electrical nanopore system. In particular, we address the contributions to the electrical noise spectra from illuminating the nanopore during ionic current recording and mitigate those effects through control of the illumination source and the use of a PDMS layer on the SiNx membrane. We demonstrate the effectiveness of our noise minimization strategies by showing the detection of DNA translocation events during membrane illumination with a signal-to-noise ratio of ~10 at 10 kHz bandwidth. The instrumental guidelines for noise minimization that we report are applicable to a wide range of nanopore-based optofluidic systems and offer the possibility of enhancing the quality of synchronous optical and electrical signals obtained during single-molecule nanopore-based analysis.
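Noise contributions of the kind discussed in this record are conventionally characterized through the power spectral density (PSD) of the ionic current trace. As a rough sketch (not the authors' instrumentation code), a Welch-averaged one-sided PSD can be estimated as follows; the sampling rate and segment length are illustrative:

```python
import numpy as np

def psd_welch(x, fs, nperseg=1024):
    """One-sided Welch PSD (50% overlap, Hann window), in units^2 per Hz."""
    step = nperseg // 2
    win = np.hanning(nperseg)
    scale = 1.0 / (fs * np.sum(win ** 2))     # density normalization
    segs = [x[s:s + nperseg] for s in range(0, len(x) - nperseg + 1, step)]
    pxx = np.mean(
        [np.abs(np.fft.rfft(win * (s - s.mean()))) ** 2 for s in segs], axis=0
    ) * scale
    pxx[1:-1] *= 2                            # fold negative frequencies
    return np.fft.rfftfreq(nperseg, 1.0 / fs), pxx
```

Integrating the PSD up to the recording bandwidth gives the rms current noise, from which a signal-to-noise ratio for translocation events (event amplitude over rms noise) follows directly.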

  13. Humor During Clinical Practice: Analysis of Recorded Clinical Encounters.

    Science.gov (United States)

    Phillips, Kari A; Singh Ospina, Naykky; Rodriguez-Gutierrez, Rene; Castaneda-Guarderas, Ana; Gionfriddo, Michael R; Branda, Megan; Montori, Victor

    2018-01-01

    Little is known about humor's use in clinical encounters, despite its many potential benefits. We aimed to describe humor during clinical encounters. We analyzed 112 recorded clinical encounters. Two reviewers working independently identified instances of humor, as well as information surrounding the logistics of its use. Of the 112 encounters, 66 (59%) contained 131 instances of humor. Humor was similarly frequent in primary care (36/61, 59%) and in specialty care (30/51, 59%), was more common in gender-concordant interactions (43/63, 68%), and was most common during counseling (81/112, 62%). Patients and clinicians introduced humor similarly (63 vs 66 instances). Typically, humor was about the patient's medical condition (40/131, 31%). Humor is used commonly during counseling to discuss the patient's medical condition and to relate to general life events, bringing warmth to the medical encounter. The timing and topic of humor and its use by all parties suggest humor plays a role in the social connection between patients and physicians and allows easier discussion of difficult topics. Further research is necessary to establish its impact on clinicians, patients, and outcomes. © Copyright 2018 by the American Board of Family Medicine.

  14. Excimer laser beam profile recording based on electrochemical etched polycarbonate

    International Nuclear Information System (INIS)

    Parvin, P.; Jaleh, B.; Zangeneh, H.R.; Zamanipour, Z.; Davoud-Abadi, Gh.R.

    2008-01-01

    No polymeric detector has previously been used to register the beam profile of UV lasers. Here, a method is proposed for measuring the intense UV beam pattern of excimer lasers, based on a photoablated polycarbonate detector after coherent UV exposure and subsequent electrochemical etching. UV-laser-induced defects, in the form of self-microstructuring on polycarbonate, are developed to replicate the spatial intensity distribution as a beam profiler.

  15. Healthcare professionals' acceptance of BelRAI, a web-based system enabling person-centred recording and data sharing across care settings with interRAI instruments: a UTAUT analysis.

    Science.gov (United States)

    Vanneste, Dirk; Vermeulen, Bram; Declercq, Anja

    2013-11-27

    Healthcare and social care environments are increasingly confronted with older persons with long-term care needs. Consequently, the need for integrated and coordinated assessment systems increases. In Belgium, feasibility studies have been conducted on the implementation and use of interRAI instruments offering opportunities to improve continuity and quality of care. However, the development and implementation of information technology to support a shared dataset is a difficult and gradual process. We explore the applicability of the UTAUT theoretical model in the BelRAI healthcare project to analyse the acceptance of the BelRAI web application by healthcare professionals in home care, nursing home care and acute hospital care for older people with disabilities. A structured questionnaire containing items based on constructs validated in the original UTAUT study was distributed to 661 Flemish caregivers. We performed a complete case analysis using data from 282 questionnaires to obtain information regarding the effects of performance expectancy (PE), effort expectancy (EE), social influence (SI), facilitating conditions (FC), anxiety (ANX), self-efficacy (SE) and attitude towards using technology (ATUT) on behavioural intention (BI) to use the BelRAI web application. The values of the internal consistency evaluation of each construct demonstrated adequate reliability of the survey instrument. Convergent and discriminant validity were established. However, the items of the ATUT construct cross-loaded on PE. FC proved to have the most significant influence on BI to use BelRAI, followed by SE. Other constructs (PE, EE, SI, ANX, ATUT) had no significant influence on BI. The 'direct effects only' model explained 30.8% of the variance in BI to use BelRAI. 
Critical factors in stimulating the behavioural intention to use new technology are good-quality software, interoperability and compatibility with other information systems, easy access to computers, training facilities

  16. Training-free compressed sensing for wireless neural recording using analysis model and group weighted ℓ1-minimization.

    Science.gov (United States)

    Sun, Biao; Zhao, Wenfeng; Zhu, Xinshan

    2017-06-01

    Data compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and compressed sensing (CS) theory has successfully demonstrated its potential in neural recording applications. In this paper, an analytical, training-free CS recovery method, termed group weighted analysis ℓ1-minimization (GWALM), is proposed for wireless neural recording. The GWALM method consists of three parts: (1) the analysis model is adopted to enforce sparsity of the neural signals, therefore overcoming the drawbacks of conventional synthesis models and enhancing the recovery performance. (2) A multi-fractional-order difference matrix is constructed as the analysis operator, thus avoiding the dictionary learning procedure and reducing the need for previously acquired data and computational complexities. (3) By exploiting the statistical properties of the analysis coefficients, a group weighting approach is developed to enhance the performance of analysis ℓ1-minimization. Experimental results on synthetic and real datasets reveal that the proposed approach outperforms state-of-the-art CS-based methods in terms of both spike recovery quality and classification accuracy. Energy and area efficiency of the GWALM make it an ideal candidate for resource-constrained, large scale wireless neural recording applications. The training-free feature of the GWALM further improves its robustness to spike shape variation, thus making it more practical for long term wireless neural recording.
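The multi-fractional-order difference operator at the heart of GWALM can be illustrated directly. The sketch below is an interpretation based on the abstract, not the authors' code: it builds Grünwald-Letnikov fractional-difference coefficients and stacks banded difference matrices of several orders into one analysis operator.

```python
import numpy as np

def frac_diff_row(alpha, n):
    """Grünwald-Letnikov coefficients (-1)^k C(alpha, k), k = 0..n-1."""
    c = np.zeros(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (k - 1 - alpha) / k   # stable recurrence
    return c

def analysis_operator(orders, n):
    """Stack banded fractional-difference matrices of several orders."""
    blocks = []
    for a in orders:
        coeffs = frac_diff_row(a, n)
        d = np.zeros((n, n))
        for i in range(n):
            d[i, : i + 1] = coeffs[: i + 1][::-1]   # causal banded row
        blocks.append(d)
    return np.vstack(blocks)
```

For integer orders the recurrence reproduces the familiar finite differences (order 1 gives [1, -1], order 2 gives [1, -2, 1]), so applying the operator to smooth signals yields small analysis coefficients; spikes produce large ones, which is the sparsity the analysis model exploits.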

  17. Training-free compressed sensing for wireless neural recording using analysis model and group weighted ℓ1-minimization

    Science.gov (United States)

    Sun, Biao; Zhao, Wenfeng; Zhu, Xinshan

    2017-06-01

    Objective. Data compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and compressed sensing (CS) theory has successfully demonstrated its potential in neural recording applications. In this paper, an analytical, training-free CS recovery method, termed group weighted analysis ℓ1-minimization (GWALM), is proposed for wireless neural recording. Approach. The GWALM method consists of three parts: (1) the analysis model is adopted to enforce sparsity of the neural signals, therefore overcoming the drawbacks of conventional synthesis models and enhancing the recovery performance. (2) A multi-fractional-order difference matrix is constructed as the analysis operator, thus avoiding the dictionary learning procedure and reducing the need for previously acquired data and computational complexities. (3) By exploiting the statistical properties of the analysis coefficients, a group weighting approach is developed to enhance the performance of analysis ℓ1-minimization. Main results. Experimental results on synthetic and real datasets reveal that the proposed approach outperforms state-of-the-art CS-based methods in terms of both spike recovery quality and classification accuracy. Significance. Energy and area efficiency of the GWALM make it an ideal candidate for resource-constrained, large scale wireless neural recording applications. The training-free feature of the GWALM further improves its robustness to spike shape variation, thus making it more practical for long term wireless neural recording.

  18. Understanding advocacy practice in mental health: a multidimensional scalogram analysis of case records.

    Science.gov (United States)

    Morrison, Paul; Stomski, Norman J; Whitely, Martin; Brennan, Pip

    2018-04-01

    Few studies have examined mental health consumers' motives for seeking advocacy assistance. This study aimed to identify factors that influenced mental health consumers' use of advocacy services. The analysis was based on 60 case records that were sourced from a community advocacy service. Each record was dichotomously coded across 11 variables to generate a series of categorical data profiles. The data set was then analysed using multidimensional scalogram analysis to reveal key relationships between subsets of variables. The results indicated that mental health consumers commonly reported a sense of fear, which motivated them to contact the advocacy service in the hope that advocates could intervene on their behalf through effective communication with health professionals. Advocates often undertook such intervention either through attending meetings between the consumer and health professionals or contacting health professionals outside of meetings, which was typically successful in terms of achieving mental health consumers' desired outcome. The resolution of most concerns suggests that they were often legitimate and not the result of a lack of insight or illness symptoms. Health professionals might consider exploring how they respond when consumers or carers raise concerns about the delivery of mental health care.

  19. Factors influencing consumer adoption of USB-based Personal Health Records in Taiwan

    Directory of Open Access Journals (Sweden)

    Jian Wen-Shan

    2012-08-01

    Full Text Available Abstract Background Usually patients receive healthcare services from multiple hospitals, and consequently their healthcare data are dispersed over many facilities’ paper and electronic-based record systems. Therefore, many countries have encouraged research on data interoperability, access, and patient authorization. This study is an important part of a national project to build an information exchange environment for cross-hospital digital medical records carried out by the Department of Health (DOH) of Taiwan in May 2008. The key objective of the core project is to set up a portable data exchange environment in order to enable people to maintain and own their essential health information. This study is aimed at exploring the factors influencing behavior and adoption of USB-based Personal Health Records (PHR) in Taiwan. Methods Quota sampling was used, and structured questionnaires were distributed to the outpatient departments at ten medical centers which participated in the DOH project to establish the information exchange environment across hospitals. A total of 3000 questionnaires were distributed and 1549 responses were collected, of which 1465 were valid, for a response rate of 48.83%. Results 1025 out of 1465 respondents had expressed their willingness to apply for the USB-PHR. Detailed analysis of the data reflected that there was a remarkable difference in the "usage intention" between the PHR adopters and non-adopters (χ2 = 182.4, p  Conclusions Higher Usage Intentions, Perceived Usefulness and Subjective Norm of patients were found to be the key factors influencing PHR adoption. Thus, we suggest that government and hospitals should promote the potential usefulness of PHR, and physicians should encourage patients to adopt the PHR.

  20. Factors influencing consumer adoption of USB-based Personal Health Records in Taiwan.

    Science.gov (United States)

    Jian, Wen-Shan; Syed-Abdul, Shabbir; Sood, Sanjay P; Lee, Peisan; Hsu, Min-Huei; Ho, Cheng-Hsun; Li, Yu-Chuan; Wen, Hsyien-Chia

    2012-08-27

    Usually patients receive healthcare services from multiple hospitals, and consequently their healthcare data are dispersed over many facilities' paper and electronic-based record systems. Therefore, many countries have encouraged research on data interoperability, access, and patient authorization. This study is an important part of a national project to build an information exchange environment for cross-hospital digital medical records carried out by the Department of Health (DOH) of Taiwan in May 2008. The key objective of the core project is to set up a portable data exchange environment in order to enable people to maintain and own their essential health information. This study is aimed at exploring the factors influencing behavior and adoption of USB-based Personal Health Records (PHR) in Taiwan. Quota sampling was used, and structured questionnaires were distributed to the outpatient departments at ten medical centers which participated in the DOH project to establish the information exchange environment across hospitals. A total of 3000 questionnaires were distributed and 1549 responses were collected, of which 1465 were valid, for a response rate of 48.83%. 1025 out of 1465 respondents had expressed their willingness to apply for the USB-PHR. Detailed analysis of the data reflected that there was a remarkable difference in the "usage intention" between the PHR adopters and non-adopters (χ2 = 182.4, p factors affecting patients' adoption pattern were Usage Intention (OR, 9.43, 95%C.I., 5.87-15.16), Perceived Usefulness (OR, 1.60; 95%C.I., 1.11-2.29) and Subjective Norm (OR, 1.47; 95%C.I., 1.21-1.78). Higher Usage Intentions, Perceived Usefulness and Subjective Norm of patients were found to be the key factors influencing PHR adoption. Thus, we suggest that government and hospitals should promote the potential usefulness of PHR, and physicians should encourage patients to adopt the PHR.
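Odds ratios with 95% confidence intervals, as reported above, follow mechanically from logistic-regression coefficients. The sketch below is a hypothetical illustration: the coefficient and standard error are back-calculated to reproduce the Usage Intention estimate, not taken from the paper.

```python
import math

def odds_ratio(beta, se):
    """OR = exp(beta), with a Wald 95% CI of exp(beta ± 1.96·SE)."""
    return (math.exp(beta),
            math.exp(beta - 1.96 * se),
            math.exp(beta + 1.96 * se))

# Hypothetical inputs: beta = 2.244, SE = 0.242 give an OR near 9.43
# with a 95% CI near (5.87, 15.16), matching the Usage Intention figures.
```

The same relation runs in reverse: given a reported CI, the standard error is (ln(upper) - ln(lower)) / (2 × 1.96), which is how the illustrative inputs above were derived.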

  1. Analysis of raw AIS spectrum recordings from a LEO satellite

    DEFF Research Database (Denmark)

    Larsen, Jesper Abildgaard; Mortensen, Hans Peter

    2014-01-01

    The AAUSAT3 satellite is a 1U cubesat, which has been developed by students at Aalborg University, Denmark in collaboration with the Danish Maritime Authority. The satellite was launched in February 2013 on a mission to monitor ships from space using their AIS broadcast signals as an indication of position. The SDR receiver developed to listen for these AIS signals also allows for sampling and storing of the raw intermediate frequency spectrum, which has been used to map channel utilization over the areas of interest for the mission, mainly the arctic regions. The SDR-based receiver used onboard the satellite uses a single-chip front-end solution, which down-converts the AIS signal located around 162 MHz to an intermediate frequency with up to 200 kHz bandwidth. This I/F signal is sampled with a 750 kSPS A/D converter and further processed by an Analog Devices DSP...
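The down-conversion chain described above can be mimicked in a few lines. In this sketch the 750 kSPS sample rate is taken from the abstract, while the 100 kHz intermediate frequency, the tone-plus-noise test signal, and the decimation factor are illustrative assumptions:

```python
import numpy as np

fs = 750e3          # A/D sample rate quoted for the satellite receiver (SPS)
f_if = 100e3        # assumed intermediate frequency (not stated in the abstract)
n = 4096
t = np.arange(n) / fs

# Synthetic IF capture: a tone at the IF plus receiver noise
x = np.cos(2 * np.pi * f_if * t) + 0.1 * np.random.default_rng(1).normal(size=n)

# Complex mix down to baseband, then crude low-pass (block average) and decimate
bb = x * np.exp(-2j * np.pi * f_if * t)
dec = 4
lp = bb[: n - n % dec].reshape(-1, dec).mean(axis=1)
```

A real receiver would use a proper decimating FIR filter rather than a block average; the point here is only the mix-filter-decimate structure that moves the 162 MHz AIS channel, already at IF, down to baseband for demodulation.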

  2. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Science.gov (United States)

    2010-07-01

    ... records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests... and cellulose-acetate base film? (a) The nitrocellulose base, a substance akin to gun cotton, is... picture film and X-ray film—nitrocellulose base). (b) Agencies must inspect cellulose-acetate film...

  3. Privacy Impact Assessment for the Lead-based Paint System of Records

    Science.gov (United States)

    The Lead-based Paint System of Records collects personally identifiable information, test scores, and submitted fees. Learn how this data is collected, how it will be used, access to the data, the purpose of data collection, and record retention policies.

  4. Multifractal analysis of long term records of karst watershed discharges

    Science.gov (United States)

    Labat, David; Mangin, Alain; Schertzer, Daniel; Tchinquirinskaia, Ioulia

    2010-05-01

    Karstic aquifers constitute a freshwater resource that is still under-exploited in the world. Despite the importance of karst aquifers as a freshwater source for most Mediterranean countries, for example, their complex behaviour makes their exploitation much less straightforward than that of classic porous or even fissured aquifers. The mechanisms that generate water production and circulation need to be specified further. In classical porous aquifers, water both flows and is stored in the pores or in the fissures. Because of carbonate dissolution, the structure of karstic aquifers makes the water flow in large drains connected to annex systems that constitute large water reserves. The existence of both rapid infiltration via boreholes and infiltration via epikarstic soil, combined with diphasic flow in the unsaturated zone and complex hydraulic connections in the saturated zone, leads to a nonlinear response reflecting the large diversity of pathways connecting the surface with the spring. Therefore, karstic aquifers appear naturally as unconventional aquifers with micro- and macro-hydraulic elements. This extreme variability over a wide range of scales naturally suggests applying multifractal concepts based on scale invariance. In this contribution, based on a 10-year high temporal resolution runoff database over two French karstic watersheds (Aliou and Baget) with around 80,000 consecutive data points, we identify and characterize the multifractal properties of these two karstic watersheds and compare them to multifractal parameters already determined in surface hydrology. Despite the apparent heterogeneity of karstic systems, the aquifer response exhibits scale-invariant behaviour over one or two large ranges of scales, from flood scales (up to 1 day) to annual behaviour. The existence of a scale break in the Aliou runoff time series can be explained by the high degree of karstification of this system, which leads to a drain-concentrated behaviour for processes shorter than 1 day. In order to quantify the degree of
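One common route to the multifractal characterization mentioned here is through structure functions, whose scaling exponents zeta(q) are linear in q for a monofractal signal and concave in q for a multifractal one. The sketch below is a generic illustration of the idea, not the specific multifractal framework used in the study:

```python
import numpy as np

def scaling_exponents(x, qs, lags):
    """Estimate structure-function exponents zeta(q) from S_q(l) ~ l^zeta(q)."""
    logl = np.log(lags)
    zetas = []
    for q in qs:
        # q-th order structure function at each lag
        s = [np.mean(np.abs(x[l:] - x[:-l]) ** q) for l in lags]
        zetas.append(np.polyfit(logl, np.log(s), 1)[0])   # log-log slope
    return np.array(zetas)
```

Applied to a runoff series, a break in the log-log scaling (as reported for Aliou at about 1 day) shows up as two distinct linear regimes, so the fit should be done separately on each range of lags.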

  5. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, Gabriele; Smith, James A.; Serinaldi, Francesco; Bales, Jerad; Bates, Paul D.; Krajewski, Witold F.

    2009-08-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01-annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adapted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades.
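The GAMLSS idea of letting distribution parameters vary with time can be caricatured with a far simpler stand-in: a Gumbel fit whose location and scale drift linearly in time. This sketch is illustrative only; the study itself models the parameters as smooth functions of time via cubic splines, and the linear drifts and normal-residual scaling here are assumptions.

```python
import numpy as np

GAMMA = 0.5772156649  # Euler-Mascheroni constant

def nonstationary_q100(years, peaks):
    """0.01-exceedance flood quantile per year, from a Gumbel whose mean and
    spread drift linearly in time (a crude stand-in for GAMLSS splines)."""
    # Time-varying mean: straight-line fit to the annual maxima
    mu_t = np.polyval(np.polyfit(years, peaks, 1), years)
    resid = peaks - mu_t
    # Time-varying spread: line fit to |residuals|, scaled to a std assuming
    # roughly normal residuals (E|X| = sigma * sqrt(2/pi))
    sd_t = np.polyval(np.polyfit(years, np.abs(resid), 1), years) * np.sqrt(np.pi / 2)
    # Method-of-moments Gumbel parameters, then the p = 0.99 quantile
    beta = sd_t * np.sqrt(6) / np.pi
    loc = mu_t - GAMMA * beta
    return loc - beta * np.log(-np.log(0.99))
```

With an upward trend in the peaks, the 0.01-probability flow rises year by year, which is exactly why a fixed "100-year flood" loses its meaning under nonstationarity.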

  6. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, G.; Smith, J.A.; Serinaldi, F.; Bales, J.; Bates, P.D.; Krajewski, W.F.

    2009-01-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km²) show that the maximum flow with a 0.01-annual probability (corresponding to 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m³ s⁻¹ km⁻² to a maximum of 5.1 m³ s⁻¹ km⁻². An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m³ s⁻¹ km⁻²). Under nonstationary conditions, alternative definitions of return period should be adapted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m³ s⁻¹ km⁻² ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades. © 2009 Elsevier Ltd.
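
The time-varying quantile idea above can be illustrated with a much simpler stand-in for GAMLSS: a Gumbel distribution whose location parameter drifts linearly with time, fitted by maximum likelihood to synthetic data. This is a sketch only — the actual GAMLSS framework models the distribution parameters with cubic splines, and the numbers here are illustrative, not the Little Sugar Creek record.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

# Synthetic 83-year annual-maximum record with an upward trend, standing in
# for a nonstationary unit-discharge series (values are illustrative only).
rng = np.random.default_rng(0)
years = np.arange(83.0)
peaks = gumbel_r.rvs(loc=2.0 + 0.02 * years, scale=0.5, size=83, random_state=rng)

def neg_log_lik(params, t, x):
    # Location varies linearly with time; GAMLSS would instead use cubic
    # splines and could also let the scale parameter vary.
    a, b, log_scale = params
    return -gumbel_r.logpdf(x, loc=a + b * t, scale=np.exp(log_scale)).sum()

res = minimize(neg_log_lik, x0=[2.0, 0.0, 0.0], args=(years, peaks),
               method="Nelder-Mead")
a, b, log_scale = res.x

# The 0.01-annual-probability ("100-year") quantile now changes year by year:
q_first = gumbel_r.ppf(0.99, loc=a + b * years[0], scale=np.exp(log_scale))
q_last = gumbel_r.ppf(0.99, loc=a + b * years[-1], scale=np.exp(log_scale))
```

Under a positive fitted trend, the design quantile for the last year exceeds that for the first year, which is exactly the kind of nonstationary behaviour the abstract describes.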

  7. Utilizing IHE-based Electronic Health Record systems for secondary use.

    Science.gov (United States)

    Holzer, K; Gall, W

    2011-01-01

    Due to the increasing adoption of Electronic Health Records (EHRs) for primary use, the number of electronic documents stored in such systems will soar in the near future. In order to benefit from this development in secondary fields such as medical research, it is important to define requirements for the secondary use of EHR data. Furthermore, analyses of the extent to which an IHE (Integrating the Healthcare Enterprise)-based architecture would fulfill these requirements could provide further information on upcoming obstacles for the secondary use of EHRs. A catalog of eight core requirements for secondary use of EHR data was deduced from the published literature, the risk analysis of the IHE profile MPQ (Multi-Patient Queries) and the analysis of relevant questions. The IHE-based architecture for cross-domain, patient-centered document sharing was extended to a cross-patient architecture. We propose an IHE-based architecture for cross-patient and cross-domain secondary use of EHR data. Evaluation of this architecture against the eight core requirements revealed full fulfillment of six requirements and partial fulfillment of two. Although not regarded as a primary goal in modern electronic healthcare, the re-use of existing electronic medical documents in EHRs for research and other fields of secondary application holds enormous potential for the future. Further research in this respect is necessary.

  8. Computer vision-based diameter maps to study fluoroscopic recordings of small intestinal motility from conscious experimental animals.

    Science.gov (United States)

    Ramírez, I; Pantrigo, J J; Montemayor, A S; López-Pérez, A E; Martín-Fontelles, M I; Brookes, S J H; Abalo, R

    2017-08-01

    When available, fluoroscopic recordings are a relatively cheap, non-invasive and technically straightforward way to study gastrointestinal motility. Spatiotemporal maps have been used to characterize motility of intestinal preparations in vitro, or in anesthetized animals in vivo. Here, a new automated computer-based method was used to construct spatiotemporal motility maps from fluoroscopic recordings obtained in conscious rats. Conscious, non-fasted, adult, male Wistar rats (n=8) received intragastric administration of barium contrast, and 1-2 hours later, when several loops of the small intestine were well-defined, a 2-minute fluoroscopic recording was obtained. Spatiotemporal diameter maps (Dmaps) were automatically calculated from the recordings. Three recordings were also manually analyzed for comparison. Frequency analysis was performed in order to calculate relevant motility parameters. In each conscious rat, a stable recording (17-20 seconds) was analyzed. The Dmaps manually and automatically obtained from the same recording were comparable, but the automated process was faster and provided higher resolution. Two frequencies of motor activity dominated; lower-frequency contractions (15.2±0.9 cpm) had an amplitude approximately five times greater than higher-frequency events (32.8±0.7 cpm). The automated method developed here needed little investigator input, provided high-resolution results with short computing times, and automatically compensated for breathing and other small movements, allowing recordings to be made without anesthesia. Although slow and/or infrequent events could not be detected in the short recording periods analyzed to date (17-20 seconds), this novel system enhances the analysis of in vivo motility in conscious animals. © 2017 John Wiley & Sons Ltd.
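
A minimal sketch of what such a spatiotemporal diameter map (Dmap) and its frequency analysis might look like, using a synthetic array in place of real fluoroscopy frames. The two injected frequencies roughly match the components reported above; this is not the authors' computer-vision pipeline, which must first segment the barium-filled intestinal loops.

```python
import numpy as np

# Synthetic stand-in for a fluoroscopic clip: apparent intestinal diameter
# at 100 positions over 20 s, oscillating at two frequencies close to the
# ~15 and ~33 cpm components reported in the abstract.
fps = 30
t = np.arange(20 * fps) / fps                       # time in seconds
positions = 100
dmap = (20
        + 5.0 * np.sin(2 * np.pi * (15.2 / 60) * t)[:, None]
        + 1.0 * np.sin(2 * np.pi * (32.8 / 60) * t)[:, None]
        + 0.1 * np.random.default_rng(1).standard_normal((len(t), positions)))

# Frequency analysis: FFT of the diameter trace at each position,
# averaged across positions.
spectrum = np.abs(np.fft.rfft(dmap - dmap.mean(axis=0), axis=0)).mean(axis=1)
freqs_cpm = np.fft.rfftfreq(len(t), d=1 / fps) * 60  # cycles per minute

dominant_cpm = freqs_cpm[1:][np.argmax(spectrum[1:])]  # skip the DC bin
```

With a 20 s window the frequency resolution is 3 cpm, which is why the abstract notes that slow or infrequent events cannot be resolved in such short clips.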

  9. Electronic Health Record Systems and Intent to Apply for Meaningful Use Incentives among Office-based Physician ...

    Science.gov (United States)

    ... Order from the National Technical Information Service NCHS Electronic Health Record Systems and Intent to Apply for ... In 2011, 57% of office-based physicians used electronic medical record/electronic health record (EMR/EHR) systems, ...

  10. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  11. Use and Characteristics of Electronic Health Record Systems among Office-Based Physician Practices: United States, ...

    Science.gov (United States)

    ... the National Technical Information Service NCHS Use and Characteristics of Electronic Health Record Systems Among Office-based ... physicians that collects information on physician and practice characteristics, including the adoption and use of EHR systems. ...

  12. An Enterprise Architecture Perspective to Electronic Health Record Based Care Governance.

    Science.gov (United States)

    Motoc, Bogdan

    2017-01-01

    This paper proposes an Enterprise Architecture viewpoint of Electronic Health Record (EHR) based care governance. The improvements expected are derived from the collaboration framework and the clinical health model proposed as foundation for the concept of EHR.

  13. Natural Language Processing Based Instrument for Classification of Free Text Medical Records

    OpenAIRE

    Khachidze, Manana; Tsintsadze, Magda; Archuadze, Maia

    2016-01-01

    According to the Ministry of Labor, Health and Social Affairs of Georgia, a new health management system has to be introduced in the near future. In this context, the problem arises of structuring and classifying documents containing the full history of medical services provided. The present work introduces an instrument for the classification of medical records based on the Georgian language. It is the first attempt at such classification of Georgian-language medical records. On the w...

  14. Evaluation of Reprocessed Suomi NPP VIIRS Sensor Data Records based on Sensitivity of Environmental Data Records Trending to Changes in Sensor Data Records

    Science.gov (United States)

    Huang, J.; Weng, F.; Sun, N.

    2017-12-01

    As the inputs to satellite Environmental Data Records (EDR) that provide continuous monitoring of Earth System changes from space, Sensor Data Records (SDR) need to meet very high standards of accuracy. SDR reprocessing, which aims to accurately account for sensor degradation and calibration issues, is therefore very important in satellite remote sensing. Previous studies on the heritage Terra MODIS in the NASA Earth Observation System (EOS) indicated that SDR degradation over time, if not correctly calibrated and reprocessed, can result in false trends in several key satellite EDR observations, such as aerosol optical depth (AOD) and vegetation index (VI). Yet the sensitivity of these EDRs to changes in the reprocessed SDRs is still not comprehensively understood or quantified. As part of the Suomi NPP SDR long-term monitoring efforts, the ongoing SDR reprocessing at NOAA NESDIS STAR provides a unique test bed for quantifying the changes of EDRs in response to the reprocessed SDRs, and thus improves our understanding of the potential impacts of SDR reprocessing on our capability for critical Earth observations. For the sensitivity investigation, we selected the VIIRS aerosol algorithm, whose EDR algorithm uses most of the visible to near-infrared (VIS-NIR) SDR bands. Several aerosol hotspot regions over the globe were selected for conducting AOD trending analysis under several prescribed SDR reprocessing scenarios, and the changes in the spatial and temporal characterizations of AOD are linked to the changes in SDR to explore any potential systematic relations. Preliminary results indicated that although changes vary by region and season, some EDRs can be sensitive to even slight SDR changes in certain VIS-NIR bands. The study sheds important light on how the SDR-EDR relation can be used as an additional approach to facilitate the evaluation of SDR reprocessing. Details of the findings will be reported in the presentation.

  15. Single-intensity-recording optical encryption technique based on phase retrieval algorithm and QR code

    Science.gov (United States)

    Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi

    2014-12-01

    Based on a phase retrieval algorithm and the QR code, a new optical encryption technique that needs to record only one intensity distribution is proposed. In the encryption process, a QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and undergoes double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technique is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.
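
The forward (encryption) half of such a scheme can be sketched as a simulated 4-f double random phase encoding. A random binary image stands in for a real QR code, and the iterative phase-retrieval decryption with the QR support constraint is omitted; this is an illustrative sketch, not the authors' exact system.

```python
import numpy as np

rng = np.random.default_rng(42)

# Binary stand-in for the QR code placed in the input plane of the 4-f system.
qr = rng.integers(0, 2, size=(64, 64)).astype(float)

# Double random phase encoding: one random phase mask in the input plane,
# a second in the Fourier plane.
mask_in = np.exp(2j * np.pi * rng.random(qr.shape))
mask_fourier = np.exp(2j * np.pi * rng.random(qr.shape))

output_field = np.fft.ifft2(np.fft.fft2(qr * mask_in) * mask_fourier)

# Only the output-plane intensity is recorded as the ciphertext; the lost
# phase is what the iterative phase retrieval step must recover, using the
# known structure of the QR code as a support constraint.
ciphertext = np.abs(output_field) ** 2
```

Recording intensity alone is what simplifies the optical setup: no holographic reference beam is needed, at the cost of a harder (phase retrieval) decryption problem.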

  16. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems

    Science.gov (United States)

    Seo, Hwa Jeong; Kim, Hye Hyeon

    2011-01-01

    Objectives Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Methods Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. Results Through the use of SWOT analysis it can be shown that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and can be managed systematically. An owner of medical information grants data access only to the specific person who was given authorization for backup, based on the mutual trust between an individual and an organization. Conclusions By employing SWOT analysis, we concluded that a linkage among medical organizations, or between an individual and an organization, can provide an efficient backup system. PMID:22084811

  17. Selecting a summation base in diffraction transformation of seismic recordings (in an example of Northern Sakhalin)

    Energy Technology Data Exchange (ETDEWEB)

    Bulatov, M.G.; Telegin, A.N.

    1984-01-01

    The effect of the dimensions of the processing base on the results of diffraction transformation of seismic recordings is examined. A formula is given for estimating the optimal summation base from a proposed range of slant angles of the reflecting boundaries. The recommendations for selecting a processing base are confirmed by field data.

  18. Ex post power economic analysis of record of decision operational restrictions at Glen Canyon Dam.

    Energy Technology Data Exchange (ETDEWEB)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.; Loftin, S.; Osiek, B; Decision and Information Sciences; Western Area Power Administration

    2010-07-31

    On October 9, 1996, Bruce Babbitt, then-Secretary of the U.S. Department of the Interior signed the Record of Decision (ROD) on operating criteria for the Glen Canyon Dam (GCD). Criteria selected were based on the Modified Low Fluctuating Flow (MLFF) Alternative as described in the Operation of Glen Canyon Dam, Colorado River Storage Project, Arizona, Final Environmental Impact Statement (EIS) (Reclamation 1995). These restrictions reduced the operating flexibility of the hydroelectric power plant and therefore its economic value. The EIS provided impact information to support the ROD, including an analysis of operating criteria alternatives on power system economics. This ex post study reevaluates ROD power economic impacts and compares these results to the economic analysis performed prior (ex ante) to the ROD for the MLFF Alternative. On the basis of the methodology used in the ex ante analysis, anticipated annual economic impacts of the ROD were estimated to range from approximately $15.1 million to $44.2 million in terms of 1991 dollars ($1991). This ex post analysis incorporates historical events that took place between 1997 and 2005, including the evolution of power markets in the Western Electricity Coordinating Council as reflected in market prices for capacity and energy. Prompted by ROD operational restrictions, this analysis also incorporates a decision made by the Western Area Power Administration to modify commitments that it made to its customers. Simulated operations of GCD were based on the premise that hourly production patterns would maximize the economic value of the hydropower resource. On the basis of this assumption, it was estimated that economic impacts were on average $26.3 million in $1991, or $39 million in $2009.

  19. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    Science.gov (United States)

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

    The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level, along with the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record (ENR) system - one of the components of an EMR system at a tertiary teaching hospital in Korea - using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). The Rational Unified Process (RUP) and the Unified Modeling Language (UML) were used as the development process and modeling notation, respectively. From the scenario and RUP approach, user requirements were formulated into use case sets, and the sequence of activities in each scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) from the viewpoints of both domain and technical experts.

  20. Performance analysis of seismocardiography for heart sound signal recording in noisy scenarios.

    Science.gov (United States)

    Jain, Puneet Kumar; Tiwari, Anil Kumar; Chourasia, Vijay S

    2016-01-01

    This paper presents a system based on Seismocardiography (SCG) for long-term monitoring of the heart sound signal. It uses an accelerometer, which is small and lightweight and thus convenient to wear. Such a system should also be robust to the various noises that occur in real-life scenarios. Therefore, a detailed analysis of the proposed system is provided and its performance is compared to that of a Phonocardiography (PCG) system. For this purpose, both signals were recorded simultaneously from five subjects in clinical and in different real-life noisy scenarios. For the quantitative analysis, the detection rate of the fundamental heart sound components, S1 and S2, is obtained. Furthermore, a quality index based on the energy of the fundamental components is also proposed and obtained. Results show that both techniques are able to acquire S1 and S2 in a clinical set-up. However, in real-life scenarios, the proposed system showed many favourable features compared to PCG for long-term monitoring.
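
As an illustration of the kind of processing such a system implies (not the authors' algorithm), the sketch below detects S1/S2-like bursts in a synthetic accelerometer trace using a short-time energy envelope, and derives a crude detection-rate quality index. All signal parameters are invented for the example.

```python
import numpy as np

fs = 1000                                   # sampling rate, Hz
t = np.arange(0, 5, 1 / fs)                 # 5 s trace
rng = np.random.default_rng(5)

# Synthetic heart-sound-like trace: an S1 and a weaker S2 burst in each
# ~0.8 s cardiac cycle, plus additive sensor noise.
def burst(t0, width, freq, amp):
    return amp * np.exp(-((t - t0) ** 2) / (2 * width ** 2)) * np.sin(2 * np.pi * freq * (t - t0))

cycles = np.arange(0, 5, 0.8)
sig = sum(burst(c, 0.010, 50, 1.0) + burst(c + 0.3, 0.008, 70, 0.6) for c in cycles)
sig = sig + 0.05 * rng.standard_normal(len(t))

# Short-time energy envelope (50 ms moving average of the squared signal),
# then event counting by thresholded rising edges, merging edges closer
# than a 0.15 s refractory period.
envelope = np.convolve(sig ** 2, np.ones(50) / 50, mode="same")
above = envelope > 0.1 * envelope.max()
edges = np.flatnonzero(above[1:] & ~above[:-1])
events = edges[np.insert(np.diff(edges) > int(0.15 * fs), 0, True)]

# A crude quality index: detected events per expected S1 + S2 event.
detection_rate = len(events) / (2 * len(cycles))
```

An energy-based index like `detection_rate` is one simple way to quantify how many of the expected fundamental components a recording actually captures.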

  1. Community-based, interdisciplinary geriatric care team satisfaction with an electronic health record: a multimethod study.

    Science.gov (United States)

    Sockolow, Paulina S; Bowles, Kathryn H; Lehmann, Harold P; Abbott, Patricia A; Weiner, Jonathan P

    2012-06-01

    This multimethod study measured the impact of an electronic health record (EHR) on clinician satisfaction with the clinical process. Subjects were 39 clinicians at a Program of All-inclusive Care for Elders (PACE) site in Philadelphia utilizing an EHR. Methods included an evidence-based evaluation framework, the Health Information Technology Research-Based Evaluation Framework, which guided assessment of clinician satisfaction with surveys, observations, follow-up interviews, and actual EHR use at two points in time. Mixed-methods analysis of the findings provided context for interpretation and improved validity. The study found that clinicians were satisfied with the EHR; however, satisfaction declined between the two time periods. EHR use was universal and widespread, and was differentiated by clinical role. Between time periods, EHR use increased in volume, with increased timeliness and decreased efficiency. As the first EHR evaluation at a PACE site from the perspective of the clinicians who use the system, this study provides insights into EHR use in the care of older people in community-based healthcare settings.

  2. Web-based online system for recording and examining events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    Occurrence of events in power plants can result in serious drawbacks in the generation of power. This suggests a high degree of importance for the online recording and examination of events. In this paper an online web-based system is introduced, which records and examines events in power plants. Throughout the paper, procedures for the design and implementation of this system, its features, and the results gained are explained. This system provides predefined levels of online access to all event data for all its users in power plants, dispatching, regional utilities, and top-level management. By implementation of an electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online system for recording and examining events offers the following advantages: - Online recording of events in power plants. - Examination of events in regional utilities. - Access to event data. - Preparation of managerial reports

  3. Validity of recalled v. recorded birth weight: a systematic review and meta-analysis

    OpenAIRE

    Shenkin, S.D.; Zhang, M.G.; Der, G.; Mathur, S.; Mina, T.H.; Reynolds, R.M.

    2017-01-01

    Low birth weight is associated with adverse health outcomes. If birth weight records are not available, studies may use recalled birth weight. It is unclear whether this is reliable. We performed a systematic review and meta-analysis of studies comparing recalled with recorded birth weights. We followed the Meta-Analyses of Observational Studies in Epidemiology (MOOSE) statement and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We searched MEDLINE, EM...

  4. AN ANALYSIS OF OFF RECORD STRATEGIES REFLECTING POLITENESS IMPLICATURE IN “OPRAH WINFREY SHOW”

    Directory of Open Access Journals (Sweden)

    Rahma Yanti

    2017-08-01

    Full Text Available This thesis discusses off record strategies that convey politeness implicatures in conversation. The off record strategy is one of the five politeness strategies, and is discussed here with respect to indirect uses of language. The objects of the research are the off record strategies that convey politeness implicatures in a famous American talk show, the "Oprah Winfrey Show". The data were collected by non-participant observation, with the author not involved in the dialogues, since the data were taken from a TV show recorded with the aid of a tape recorder. The author then used the CAAT technique, transcribing the talk show in orthographic transcription. The analysis uses a pragmatic method, looking at the role of external factors of language, especially the interlocutors, in the selection of off record strategies. The results show that the context of the situation and violations of conversational maxims influence the choice of off record strategies. However, there are cases where this choice does not follow the rules, because other factors, such as intonation, come into play in a conversation. Implicatures generally appear in the form of polite affirmations. In some sentences, speakers were found to use two or more off record strategies.

  5. Using the Java language to develop computer based patient records for use on the Internet.

    OpenAIRE

    Zuckerman, A. E.

    1996-01-01

    The development of the Java Programming Language by Sun Microsystems has provided a new tool for the development of Internet based applications. Our preliminary work has shown how Java can be used to program an Internet based CBPR. Java is well suited to the needs of patient records and can interface with clinical data repositories written in MUMPS or SQL.

  6. A Real Application of a Concept-based Electronic Medical Record

    Science.gov (United States)

    Purin, Barbara; Eccher, Claudio; Forti, Stefano

    2003-01-01

    We present a real implementation of a concept-based Electronic Medical Record for the management of heart failure disease. Our approach is based on GEHR archetypes represented in XML format for modelling clinical information. By using this technique it could be possible to build an interoperable, future-proof clinical information system. PMID:14728481

  7. Using the Java language to develop computer based patient records for use on the Internet.

    Science.gov (United States)

    Zuckerman, A. E.

    1996-01-01

    The development of the Java Programming Language by Sun Microsystems has provided a new tool for the development of Internet based applications. Our preliminary work has shown how Java can be used to program an Internet based CBPR. Java is well suited to the needs of patient records and can interface with clinical data repositories written in MUMPS or SQL. PMID:8947770

  8. Base compaction specification feasibility analysis.

    Science.gov (United States)

    2012-12-01

    The objective of this research is to establish the technical engineering and cost analysis concepts that will enable WisDOT management to objectively evaluate the feasibility of switching construction specification philosophies for aggregate base...

  9. The computer based patient record: a strategic issue in process innovation.

    Science.gov (United States)

    Sicotte, C; Denis, J L; Lehoux, P

    1998-12-01

    Reengineering of the workplace through Information Technology is an important strategic issue for today's hospitals. The computer-based patient record (CPR) is one technology that has the potential to profoundly modify the work routines of the care unit. This study investigates a CPR project aimed at allowing physicians and nurses to work in a completely electronic environment. The focus of our analysis was the patient nursing care process. The rationale behind the introduction of this technology was based on its alleged capability to both enhance quality of care and control costs. This is done by better managing the flow of information within the organization and by introducing mechanisms such as the timeless and spaceless organization of the workplace, de-localization, and automation of work processes. The present case study analyzed the implementation of a large CPR project ($45 million U.S.) conducted in four hospitals in a joint venture with two computer firms. The computerized system had to be withdrawn because of boycotts from both the medical and nursing personnel. User-resistance was not the problem. Despite its failure, this project was a good opportunity to understand better the intricate complexity of introducing technology in professional work where the usefulness of information is short lived and where it is difficult to predetermine the relevancy of information. Profound misconceptions in achieving a tighter fit (synchronization) between care processes and information processes were the main problems.

  10. THE EVOLUTION OF THE FILM ANALYSIS OF INTERACTION RECORD (FAIR) FROM THE AMIDON-FLANDERS INTERACTION ANALYSIS. APPENDIX G.

    Science.gov (United States)

    BALDWIN, PATRICIA

    A detailed listing is given of the revisions that were made to the Amidon-Flanders Interaction Analysis scale while the Film Analysis of Interaction Record (FAIR) scale was being developed. Comments are given for guidance in the use of some of the ratings, along with some ground rules and guidelines for making a film rating. Related reports are AA…

  11. On the directionality of cortical interactions studied by structural analysis of electrophysiological recordings.

    Science.gov (United States)

    Bernasconi, C; König, P

    1999-09-01

    To investigate the directionality of neural interactions as assessed by electrophysiology, we adapted methods of structural analysis from the field of econometrics. In particular, within the framework of autoregressive modelling of the data, we considered quantitative measures of linear relationship between multiple time series adopting the Wiener-Granger concept of causality. The techniques were evaluated with local field potential measurements from the cat visual system. Here, several issues had to be addressed. First, out of several statistical tests of the stationarity of local field potentials considered, those based on the Kolmogorov-Smirnov and on the reverse arrangement statistics proved to be most powerful. The application of those tests to the experimental data showed that a large part of the local field potentials can be considered stationary on a time scale of 1 s. Second, out of the several investigated methods for the determination of an optimal order of the autoregressive model, the Akaike Information Criterion had the most suitable properties. The identified order of the model, across different repetitions of the trials, was consistently 5-8. Third, although the individual segments of field potentials used for the analysis were relatively short, the methods of structural analysis applied produced reliable results, confirming findings of simulations of data with similar properties. Furthermore, the features of the estimated models were consistent among trials, so that the analysis of average measures of interaction appears to be a viable approach to investigate the relationship between the recording sites. In summary, the statistical methods considered have proved to be suitable for the study of the directionality of neuronal interactions.
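
The ingredients described above - autoregressive modelling, AIC order selection, and Wiener-Granger causality - can be sketched on synthetic data. This is a deliberately simplified, numpy-only stand-in for the authors' multivariate analysis, with an invented coupling between two channels.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000

# Two coupled autoregressive signals standing in for field potentials at
# two recording sites: x drives y with a one-sample delay, but not vice versa.
x = np.zeros(n)
y = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def residual_var(target, sources, p):
    # Least-squares AR fit of target on p lags of each source series;
    # returns the residual variance of the one-step prediction.
    m = len(target)
    X = np.column_stack([s[p - k:m - k] for s in sources for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, target[p:], rcond=None)
    return np.var(target[p:] - X @ beta)

# Model-order selection by the Akaike Information Criterion.
aic = {p: (n - p) * np.log(residual_var(x, [x], p)) + 2 * p for p in range(1, 11)}
best_p = min(aic, key=aic.get)

# Wiener-Granger causality: how much do lags of the other channel reduce
# the prediction error?  Positive values indicate a directed influence.
p = 2
gc_x_to_y = np.log(residual_var(y, [y], p) / residual_var(y, [y, x], p))
gc_y_to_x = np.log(residual_var(x, [x], p) / residual_var(x, [x, y], p))
```

The asymmetry between the two log-ratios is what identifies the direction of the interaction, which is the core of the Wiener-Granger approach used in the abstract.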

  12. Social science and linguistic text analysis of nurses' records: a systematic review and critique.

    Science.gov (United States)

    Buus, Niels; Hamilton, Bridget Elizabeth

    2016-03-01

    The two aims of the paper were to systematically review and critique social science and linguistic text analyses of nursing records in order to inform future research in this emerging area. Systematic searches in reference databases and in citation indexes identified 12 articles that included analyses of the social and linguistic features of records and recording. Two reviewers extracted data using established criteria for the evaluation of qualitative research papers. A common characteristic of nursing records was the economical use of language with local meanings that conveyed little information to the uninitiated reader. Records were dominated by technocratic-medical discourse focused on patients' bodies, and they depicted only very limited aspects of nursing practice. Nurses made moral evaluations in their categorisation of patients, which reflected detailed surveillance of patients' disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More probing empirical research of nurses' records and a wider range of theoretical perspectives has the potential to expose the situated meanings of nursing work in healthcare organisations. © 2015 John Wiley & Sons Ltd.

  13. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

Cloud computing is a new delivery model for information technology services; it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns about how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptography-based techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
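A minimal sketch of the rotation form of geometric data perturbation: multiplying the record matrix by a random orthogonal matrix hides the original attribute values while preserving pairwise Euclidean geometry, so distance-based mining still works on the perturbed data. The feature matrix here is a toy stand-in, not the paper's actual scheme (which may add translation and noise components).

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical personal-health-record feature matrix: rows = patients.
X = rng.normal(size=(100, 5))

# Random orthogonal matrix via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(5, 5)))
X_pert = X @ Q  # the perturbed database that would be outsourced

# Pairwise distances survive the perturbation exactly.
d_orig = np.linalg.norm(X[0] - X[1])
d_pert = np.linalg.norm(X_pert[0] - X_pert[1])
print(d_orig, d_pert)  # equal up to floating point
```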

  14. Continuous Record Laplace-based Inference about the Break Date in Structural Change Models

    OpenAIRE

    Casini, Alessandro; Perron, Pierre

    2018-01-01

Building upon the continuous record asymptotic framework recently introduced by Casini and Perron (2017a) for inference in structural change models, we propose a Laplace-based (Quasi-Bayes) procedure for the construction of the estimate and confidence set for the date of a structural change. The procedure relies on a Laplace-type estimator defined by an integration-based rather than an optimization-based method. A transformation of the least-squares criterion function is evaluated in order to ...

  15. A Brief Tool to Assess Image-Based Dietary Records and Guide Nutrition Counselling Among Pregnant Women: An Evaluation

    Science.gov (United States)

    Ashman, Amy M; Collins, Clare E; Brown, Leanne J; Rae, Kym M

    2016-01-01

Background Dietitians ideally should provide personally tailored nutrition advice to pregnant women. Provision is hampered by a lack of appropriate tools for nutrition assessment and counselling in practice settings. Smartphone technology, through the use of image-based dietary records, can address limitations of traditional methods of recording dietary intake. Feedback on these records can then be provided by the dietitian via smartphone. Efficacy and validity of these methods require examination. Objective The aims of the Australian Diet Bytes and Baby Bumps study, which used image-based dietary records and a purpose-built brief Selected Nutrient and Diet Quality (SNaQ) tool to provide tailored nutrition advice to pregnant women, were to assess relative validity of the SNaQ tool for analyzing dietary intake compared with nutrient analysis software, to describe the nutritional intake adequacy of pregnant participants, and to assess acceptability of dietary feedback via smartphone. Methods Eligible women used a smartphone app to record everything they consumed over 3 nonconsecutive days. Records consisted of an image of the food or drink item placed next to a fiducial marker, with a voice or text description, or both, providing additional detail. We used the SNaQ tool to analyze participants’ intake of daily food group servings and selected key micronutrients for pregnancy relative to Australian guideline recommendations. A visual reference guide consisting of images of foods and drinks in standard serving sizes assisted the dietitian with quantification. Feedback on participants’ diets was provided via 2 methods: (1) a short video summary sent to participants’ smartphones, and (2) a follow-up telephone consultation with a dietitian. Agreement between dietary intake assessment using the SNaQ tool and nutrient analysis software was evaluated using Spearman rank correlation and Cohen kappa. Results We enrolled 27 women (median age 28.8 years, 8 Indigenous
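The two agreement statistics named in the Methods, Spearman rank correlation and Cohen kappa, can be computed directly from paired assessments. The servings values and adequacy categories below are invented for illustration, not the study's data.

```python
import numpy as np

def rankdata(a):
    """Average ranks, 1-based, with ties averaged."""
    a = np.asarray(a, float)
    order = np.argsort(a, kind="stable")
    ranks = np.empty(len(a))
    ranks[order] = np.arange(1, len(a) + 1)
    for v in np.unique(a):          # average ranks within tied groups
        m = a == v
        ranks[m] = ranks[m].mean()
    return ranks

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    return np.corrcoef(rankdata(x), rankdata(y))[0, 1]

def cohen_kappa(a, b):
    """Chance-corrected agreement between two categorical raters."""
    cats = sorted(set(a) | set(b))
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical daily food-group servings: SNaQ tool vs nutrient software.
tool = [2.0, 3.5, 1.0, 4.0, 2.5, 3.0]
software = [2.2, 3.6, 1.1, 3.8, 2.4, 3.1]
print(spearman(tool, software))

# Hypothetical adequate/inadequate classifications from the two methods.
cat_tool = ["ok", "low", "ok", "ok", "low", "ok"]
cat_soft = ["ok", "low", "ok", "low", "low", "ok"]
print(cohen_kappa(cat_tool, cat_soft))
```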

  16. Validity of recalled v. recorded birth weight: a systematic review and meta-analysis.

    Science.gov (United States)

    Shenkin, S D; Zhang, M G; Der, G; Mathur, S; Mina, T H; Reynolds, R M

    2017-04-01

Low birth weight is associated with adverse health outcomes. If birth weight records are not available, studies may use recalled birth weight. It is unclear whether this is reliable. We performed a systematic review and meta-analysis of studies comparing recalled with recorded birth weights. We followed the Meta-Analyses of Observational Studies in Epidemiology (MOOSE) statement and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We searched MEDLINE, EMBASE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) to May 2015. We included studies that reported recalled birth weight and recorded birth weight. We excluded studies investigating a clinical population. Two reviewers independently reviewed citations, extracted data, and assessed risk of bias. Data were pooled in a random effects meta-analysis for correlation and mean difference. In total, 40 studies were eligible for qualitative synthesis (n=78,997 births from 78,196 parents). Agreement between recalled and recorded birth weight was high: the pooled estimate of correlation in 23 samples from 19 studies (n=7406) was 0.90 [95% confidence interval (CI) 0.87-0.93]. The difference between recalled and recorded birth weight in 29 samples from 26 studies (n=29,293) was small [range -86 to 129 g; random effects estimate 1.4 g (95% CI -4.0 to 6.9 g)]. Studies were heterogeneous, with no evidence for an effect of time since birth, person reporting, recall bias, or birth order. In post-hoc subgroup analysis, recalled birth weight was higher than recorded birth weight by 80 g (95% CI 57-103 g) in low and middle income countries. In conclusion, there is high agreement between recalled and recorded birth weight. If birth weight is recalled, it is suitable for use in epidemiological studies, at least in high income countries.
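A random-effects pooling of correlations like the one reported above is usually done on the Fisher-z scale with a DerSimonian-Laird heterogeneity estimate; a compact sketch follows. The per-study correlations and sample sizes are hypothetical, not the review's data.

```python
import math

def pool_correlations(rs, ns):
    """DerSimonian-Laird random-effects pooling of correlations on the
    Fisher-z scale; returns the pooled r and its 95% CI."""
    zs = [math.atanh(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]                 # var of Fisher z
    w = [1.0 / v for v in vs]
    zbar = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - zbar) ** 2 for wi, zi in zip(w, zs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)         # between-study variance
    ws = [1.0 / (v + tau2) for v in vs]              # random-effects weights
    z_re = sum(wi * zi for wi, zi in zip(ws, zs)) / sum(ws)
    se = math.sqrt(1.0 / sum(ws))
    lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
    return math.tanh(z_re), (math.tanh(lo), math.tanh(hi))

# Hypothetical per-study recalled-vs-recorded correlations and sizes.
rs = [0.88, 0.92, 0.90, 0.85, 0.93]
ns = [150, 320, 95, 210, 400]
r_pooled, ci = pool_correlations(rs, ns)
print(round(r_pooled, 3), ci)
```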

  17. Results from a survey of national immunization programmes on home-based vaccination record practices in 2013.

    Science.gov (United States)

    Young, Stacy L; Gacic-Dobo, Marta; Brown, David W

    2015-07-01

Data on home-based record (HBR) practices within national immunization programmes are non-existent, making it difficult to determine whether current efforts of immunization programmes related to basic recording of immunization services are appropriately focused. During January 2014, WHO and the United Nations Children's Fund sent a one-page questionnaire to 195 countries to obtain information on HBRs including type of record used, number of records printed, whether records were provided free-of-charge or required by schools, whether there was a stock-out and the duration of any stock-outs that occurred, as well as the total expenditure for printing HBRs during 2013. A total of 140 countries returned a completed HBR questionnaire. Two countries were excluded from analysis because they did not use a HBR during 2013. HBR types varied across countries (vaccination only cards, 32/138 [23.1%]; vaccination plus growth monitoring records, 31/138 [22.4%]; child health books, 48/138 [34.7%]; combination of these, 27/138 [19.5%] countries). HBRs were provided free-of-charge in 124/138 (89.8%) respondent countries. HBRs were required for school entry in 62/138 (44.9%) countries. Nearly a quarter of countries reported HBR stock-outs during 2013. Computed printing cost per record was [...]; the findings underscore the need for immunization programmes to develop, implement and monitor corrective activities to improve the availability and utilization of HBRs. Much work remains to improve forecasting where appropriate, to prevent HBR stock-outs, to identify and improve sustainable financing options and to explore viable market shaping opportunities. © The Author 2015. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  18. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    Science.gov (United States)

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

Electronic medical records are increasingly common in medical practice, and the secondary use of medical records has become increasingly important. It relies on the ability to retrieve the complete information about desired patient populations. How to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a major challenge. Therefore, we propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Record (TCMR) retrieval. We propose a parallel index building method and build a distributed search cluster; the former is used to improve the performance of index building, and the latter is used to provide highly concurrent online TCMR retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real-time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster can improve the performance of index building and provide highly concurrent online TCMR retrieval. The multi-indexing model can ensure the latest relevant TCMRs are indexed and retrieved in real-time. The semantics expansion method and the multi-factor ranking model can enhance retrieval quality. The template-based visualization method can enhance availability and universality, with the medical reports displayed via a friendly web interface. In conclusion, compared with current medical record retrieval systems, our system provides some advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.

  19. Validity and practicability of smartphone-based photographic food records for estimating energy and nutrient intake.

    Science.gov (United States)

    Kong, Kaimeng; Zhang, Lulu; Huang, Lisu; Tao, Yexuan

    2017-05-01

    Image-assisted dietary assessment methods are frequently used to record individual eating habits. This study tested the validity of a smartphone-based photographic food recording approach by comparing the results obtained with those of a weighed food record. We also assessed the practicality of the method by using it to measure the energy and nutrient intake of college students. The experiment was implemented in two phases, each lasting 2 weeks. In the first phase, a labelled menu and a photograph database were constructed. The energy and nutrient content of 31 randomly selected dishes in three different portion sizes were then estimated by the photograph-based method and compared with a weighed food record. In the second phase, we combined the smartphone-based photographic method with the WeChat smartphone application and applied this to 120 randomly selected participants to record their energy and nutrient intake. The Pearson correlation coefficients for energy, protein, fat, and carbohydrate content between the weighed and the photographic food record were 0.997, 0.936, 0.996, and 0.999, respectively. Bland-Altman plots showed good agreement between the two methods. The estimated protein, fat, and carbohydrate intake by participants was in accordance with values in the Chinese Residents' Nutrition and Chronic Disease report (2015). Participants expressed satisfaction with the new approach and the compliance rate was 97.5%. The smartphone-based photographic dietary assessment method combined with the WeChat instant messaging application was effective and practical for use by young people.
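The two validity checks used above, Pearson correlation and Bland-Altman limits of agreement, can be reproduced on paired measurements in a few lines. The dish energies below are simulated, not the study's weighed data.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical per-dish energy (kcal): weighed record vs photo estimate,
# with a small random estimation error on the photographic method.
weighed = rng.uniform(150, 900, size=31)
photo = weighed + rng.normal(0, 20, size=31)

r = np.corrcoef(weighed, photo)[0, 1]          # Pearson correlation

diff = photo - weighed
bias = diff.mean()                             # mean difference
loa = (bias - 1.96 * diff.std(ddof=1),         # Bland-Altman 95%
       bias + 1.96 * diff.std(ddof=1))         # limits of agreement
print(round(r, 3), round(bias, 1), loa)
```

A Bland-Altman plot would draw `diff` against the pairwise means with horizontal lines at `bias` and the two limits; "good agreement" means most differences fall inside the limits with no trend across the measurement range.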

  20. Analysis of condensed matter physics records in databases. Science and technology indicators in condensed matter physics

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1999-05-01

    An analysis of the literature on Condensed Matter Physics, with particular emphasis on High Temperature Superconductors, was performed on the contents of the bibliographic database International Nuclear Information System (INIS). Quantitative data were obtained on various characteristics of the relevant INIS records such as subject categories, language and country of publication, publication types, etc. The analysis opens up the possibility for further studies, e.g. on international research co-operation and on publication patterns. (author)

  1. Assessing the spatial representability of charcoal and PAH-based paleofire records with integrated GIS, modelling, and empirical approaches

    Science.gov (United States)

    Vachula, R. S.; Huang, Y.; Russell, J. M.

    2017-12-01

Lake sediment-based fire reconstructions offer paleoenvironmental context in which to assess modern fires and predict future burning. However, despite their ubiquity, many uncertainties remain regarding the taphonomy of paleofire proxies and the spatial scales for which they record variations in fire history. Here we present down-core proxy analyses of polycyclic aromatic hydrocarbons (PAHs) and three size-fractions of charcoal (63-150, >150 and >250 μm) from Swamp Lake, California, an annually laminated lacustrine archive. Using a statewide historical GIS dataset of area burned, we assess the spatial scales for which these proxies are reliable recorders of fire history. We find that the coherence of observed and proxy-recorded fire history inherently depends upon spatial scale. Contrary to conventional thinking that charcoal mainly records local fires, our results indicate that macroscopic charcoal (>150 μm) may record spatially broader fire history than is usually assumed, and that the coarsest fraction (>250 μm) may be a more conservative proxy for local burning. We find that sub-macroscopic charcoal particles (63-150 μm) reliably record regional (up to 150 km) changes in fire history. These results indicate that charcoal-based fire reconstructions may represent spatially broader fire history than previously thought, which has major implications for our understanding of spatiotemporal paleofire variations. Our analyses of PAHs show that dispersal mobility is heterogeneous between compounds, but that PAH fluxes are reliable proxies of fire history within 25-50 km, which suggests PAHs may be a better spatially constrained paleofire proxy than sedimentary charcoal. Further, using a linear discriminant analysis model informed by modern emissions analyses, we show that PAH assemblages preserved in lake sediments can differentiate vegetation type burned, and are thus promising paleoecological biomarkers warranting further research and implementation. In sum, our analyses offer new insight into the spatial dimensions of paleofire

  2. Miniature, Single Channel, Memory-Based, High-G Acceleration Recorder (Millipen)

    International Nuclear Information System (INIS)

    Rohwer, Tedd A.

    1999-01-01

The Instrumentation and Telemetry Departments at Sandia National Laboratories have been instrumenting earth penetrators for over thirty years. Recorded acceleration data is used to quantify penetrator performance. Penetrator testing has become more difficult as desired impact velocities have increased. This results in the need for small-scale test vehicles and miniature instrumentation. A miniature recorder will allow penetrator diameters to significantly decrease, opening the window of testable parameters. Full-scale test vehicles will also benefit from miniature recorders by using a less intrusive system to instrument internal arming, fusing, and firing components. This single channel concept is the latest design in an ongoing effort to miniaturize the size and reduce the power requirement of acceleration instrumentation. A micro-controller/memory based system provides the data acquisition, signal conditioning, power regulation, and data storage. This architecture allows the recorder, including both sensor and electronics, to occupy a volume of less than 1.5 cubic inches, draw less than 200 mW of power, and record 15 kHz data up to 40,000 g. This paper will describe the development and operation of this miniature acceleration recorder

  3. A high capacity data recording device based on a digital audio processor and a video cassette recorder.

    Science.gov (United States)

    Bezanilla, F

    1985-01-01

    A modified digital audio processor, a video cassette recorder, and some simple added circuitry are assembled into a recording device of high capacity. The unit converts two analog channels into digital form at 44-kHz sampling rate and stores the information in digital form in a common video cassette. Bandwidth of each channel is from direct current to approximately 20 kHz and the dynamic range is close to 90 dB. The total storage capacity in a 3-h video cassette is 2 Gbytes. The information can be retrieved in analog or digital form. PMID:3978213
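The quoted 2-Gbyte capacity is easy to verify from the stated sampling parameters, assuming the standard 16-bit digital-audio word size (the abstract itself does not state the bit depth):

```python
# Back-of-the-envelope check of the recorder's numbers: two channels
# sampled at 44 kHz for 3 hours.
channels = 2
rate_hz = 44_000          # the abstract's 44-kHz sampling rate
bytes_per_sample = 2      # 16-bit PCM, the usual digital-audio word size
seconds = 3 * 3600        # a 3-h video cassette

total_bytes = channels * rate_hz * bytes_per_sample * seconds
print(total_bytes / 1e9)  # ≈ 1.9 GB, consistent with the quoted 2 Gbytes
```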

  4. Influence of weather factors on population dynamics of two lagomorph species based on hunting bag records

    NARCIS (Netherlands)

    Rödel, H.; Dekker, J.J.A.

    2012-01-01

    Weather conditions can have a significant influence on short-term fluctuations of animal populations. In our study, which is based on time series of hunting bag records of up to 28 years from 26 counties of The Netherlands and Germany, we investigated the impact of different weather variables on

  5. Automatic lameness detection based on consecutive 3D-video recordings

    NARCIS (Netherlands)

    Hertem, van T.; Viazzi, S.; Steensels, M.; Maltz, E.; Antler, A.; Alchanatis, V.; Schlageter-Tello, A.; Lokhorst, C.; Romanini, C.E.B.; Bahr, C.; Berckmans, D.; Halachmi, I.

    2014-01-01

    Manual locomotion scoring for lameness detection is a time-consuming and subjective procedure. Therefore, the objective of this study is to optimise the classification output of a computer vision based algorithm for automated lameness scoring. Cow gait recordings were made during four consecutive

  6. Analysis of the steel braced frames equipped with ADAS devices under the far field records

    Directory of Open Access Journals (Sweden)

    Mahmoud Bayat

The usefulness of supplementary energy dissipation devices is now quite well-known in earthquake structural engineering for reducing the earthquake-induced response of structural systems. This study concerns the seismic behavior of structures with supplemental ADAS devices. In this paper, the ratio of hysteretic energy to input energy is compared in different structural systems. The main purpose of this paper is to evaluate, based on energy concepts, the behavior of structures equipped with yielding dampers (ADAS) located in far fields. Codes and solutions for optimizing their seismic behavior are also presented. Three cases, including five-, ten- and fifteen-story three-bay Concentric Braced Frames (CBF) with and without ADAS, were selected. The PERFORM 3D.V4 software along with three earthquake records (Northridge, Imperial Valley and Tabas) is used for nonlinear time history analysis, and the conclusions are drawn upon an energy criterion. The effects of PGA variation and frame height are also considered in the study. Finally, ADAS dampers increase the energy dissipation capacity of the structure in an earthquake event, so that a great amount of the induced energy is damped and destruction of the structure is prevented as much as possible.

  7. Rescaled range analysis of streamflow records in the São Francisco River Basin, Brazil

    Science.gov (United States)

    Araujo, Marcelo Vitor Oliveira; Celeste, Alcigeimes B.

    2018-01-01

Hydrological time series are sometimes found to have a distinctive behavior known as long-term persistence, in which subsequent values depend on each other even at very large time scales. This implies multiyear consecutive droughts or floods. Typical models used to generate synthetic hydrological scenarios, widely used in the planning and management of water resources, fail to preserve this kind of persistence in the generated data and therefore may have a major impact on projects whose design lives span long periods of time. This study deals with the evaluation of long-term persistence in streamflow records by means of the rescaled range analysis proposed by British engineer Harold E. Hurst, who first observed the phenomenon in the mid-twentieth century. In this paper, Hurst's procedure is enhanced by a strategy based on statistical hypothesis testing. The case study comprises the six main hydroelectric power plants located in the São Francisco River Basin, part of the Brazilian National Grid. Historical time series of inflows to the major reservoirs of the system are investigated, and five of the six sites show significant persistence, with values for the so-called Hurst exponent near or greater than 0.7, i.e., around 40% above the value 0.5 that represents a white noise process, suggesting that decision makers should take long-term persistence into consideration when conducting water resources planning and management studies in the region.
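Hurst's rescaled-range procedure reduces to computing R/S over windows of increasing size and regressing log(R/S) on log(window size). A compact numpy sketch follows; the window sizes are arbitrary choices, and the input here is white noise rather than the São Francisco inflow records (classical R/S has a known small-sample bias that pushes estimates slightly above 0.5).

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Hurst exponent via classical rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    x = np.asarray(x, float)
    log_rs, log_n = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())      # cumulative deviations
            r = dev.max() - dev.min()          # their range R
            s = w.std(ddof=0)                  # standard deviation S
            if s > 0:
                rs_vals.append(r / s)
        log_rs.append(np.log(np.mean(rs_vals)))
        log_n.append(np.log(n))
    return np.polyfit(log_n, log_rs, 1)[0]     # slope = Hurst exponent

rng = np.random.default_rng(3)
white = rng.standard_normal(4096)
h = hurst_rs(white)
print(round(h, 2))  # near 0.5 for an uncorrelated series
```

A persistent series such as the abstract's inflow records would yield a slope near or above 0.7 instead.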

  8. Analysis of the hologram recording on the novel chloride photo-thermo-refractive glass

    Science.gov (United States)

    Ivanov, S. A.; Nikonorov, N. V.; Dubrovin, V. D.; Krykova, V. A.

    2017-05-01

    In this research, we present new holographic material based on fluoride photo-thermo-refractive glass(PTR) - chloride PTR glass. One of the benefit of this type of PTR glass is positive refractive index change. During this work, for the first-time volume Bragg gratings were recorded in this kind of material. The first experiments revealed that such gratings are mixed i.e. possess both absorption and phase components. Complex analysis shows that both refractive index and absorption coefficient are modulated inside the grating structure. We found out that at first there is no strict dependence of the refractive index change from dosage, but as we continue the process of thermal treatment - dependence is appear. Exposure influence on the refractive index change for this glass differs from fluoride one and shows some sort of saturation after the exposure of 4-6 J/cm2 . We distinguished refractive index change and absorption coefficient change and observed both behavior with increasing thermal treatment time. We found out that the increase of thermal treatment time results in the significant refractive index change. At the same time the absorption does `not practically change. It was found that maximum modulation of refractive index is comparable with fluoride PTR glass and achieves value of 1600 ppm. The modulation of absorption is equal to induced absorption caused by silver nanoparticles and depends from reading wavelength. Our study shows that almost all absorption is modulated inside the grating.

  9. Non-linear transient behavior during soil liquefaction based on re-evaluation of seismic records

    OpenAIRE

    Kamagata, S.; Takewaki, Izuru

    2015-01-01

Focusing on soil liquefaction, the seismic records from the Niigata-ken earthquake in 1964, the southern Hyogo prefecture earthquake in 1995 and the 2011 off the Pacific coast of Tohoku earthquake are analyzed using non-stationary Fourier spectra. The shift of dominant frequency in the seismic record of Kawagishi-cho during the Niigata-ken earthquake is evaluated based on the time-variant property of dominant frequencies. The reduction ratio of the soil stiffness is evaluated from the shif...

  10. The (Anomalous) Hall Magnetometer as an Analysis Tool for High Density Recording Media

    NARCIS (Netherlands)

    de Haan, S.; de Haan, S.; Lodder, J.C.; Popma, T.J.A.

    1991-01-01

    In this work an evaluation tool for the characterization of high-density recording thin film media is discussed. The measurement principles are based on the anomalous and the planar Hall effect. We used these Hall effects to characterize ferromagnetic Co-Cr films and Co/Pd multilayers having

  11. How powerful is the dwell-time analysis of multichannel records?

    Science.gov (United States)

    Blunck, R; Kirst, U; Riessner, T; Hansen, U

    1998-09-01

Exact algorithms for the kinetic analysis of multichannel patch-clamp records require hours to days for a single record. Thus, it may be reasonable to use a fast but less accurate method for the analysis of all data sets and to use the results for a reanalysis of some selected records with more sophisticated approaches. For the first run, the tools of single-channel analysis were used for the evaluation of the single-channel rate constants from multichannel dwell-time histograms. This could be achieved by representing an ensemble of single channels by a "macrochannel" comprising all possible states of the ensemble of channels. Equations for the calculation of the elements of the macrochannel transition matrix and for the steady-state concentrations of individual states are given. Simulations of multichannel records with 1 to 8 channels with two closed states and one open state, and with 2 channels with two open and two closed states, were done in order to investigate under which conditions the one-dimensional dwell-time analysis itself already provides reliable results. Distributions of the evaluated single-channel rate constants show that a bias of 10 to 20% in the estimates of the single-channel rate constants has to be accepted. The comparison of simulations with signal-to-noise ratios of SNR = 1 or SNR = 25 demonstrates that the major problem is not the convergence of the fitting routine, but failures of the level detector algorithm which creates the dwell-time distributions from noisy time series. The macrochannel presentation allows the incorporation of constraints like channel interaction. The evaluation of simulated 4-channel records in which the rate constant of opening increased by 20% per already open channel could reveal the interaction factor.
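For independent, identical channels the "macrochannel" transition matrix follows from simple combinatorics: with k channels open, the rate of one more opening is (N-k) times the single-channel opening rate, and the rate of one closing is k times the closing rate. The sketch below uses the simplest one-open/one-closed channel (the paper's channels have two closed states and one open state) with illustrative rates.

```python
import numpy as np

def macro_q(n_channels, k_open, k_close):
    """Transition-rate matrix of the macrochannel formed by n identical,
    independent two-state channels; state k = number of open channels."""
    n = n_channels
    Q = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        if k < n:
            Q[k, k + 1] = (n - k) * k_open   # one of n-k closed channels opens
        if k > 0:
            Q[k, k - 1] = k * k_close        # one of k open channels closes
        Q[k, k] = -Q[k].sum()                # rows of a rate matrix sum to 0
    return Q

def stationary(Q):
    """Steady-state occupancies: solve p Q = 0 with sum(p) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Illustrative rates (s^-1): open probability k_open/(k_open+k_close) = 0.25.
Q = macro_q(4, k_open=10.0, k_close=30.0)
p = stationary(Q)
print(np.round(p, 4))  # binomial(4, 0.25) occupancies
```

The stationary occupancies come out binomial, which is the consistency check such a macrochannel construction should pass for non-interacting channels.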

  12. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
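A bare-bones Zernike moment on a sampled unit disk, illustrating the rotation invariance of its magnitude that such descriptors exploit. The grid size, order n and repetition m are arbitrary choices; a real system adds normalization and the term re-use optimizations mentioned above.

```python
import numpy as np
from math import factorial

def zernike_moment(img, n, m):
    """Zernike moment Z_nm of a square image sampled on the unit disk;
    |Z_nm| is invariant to rotation of the image about its centre."""
    N = img.shape[0]
    c = (N - 1) / 2.0
    y, x = np.mgrid[0:N, 0:N]
    xr, yr = (x - c) / c, (y - c) / c
    rho = np.hypot(xr, yr)
    theta = np.arctan2(yr, xr)
    mask = rho <= 1.0                         # keep only the unit disk
    # Radial polynomial R_nm(rho); requires n - |m| even and non-negative.
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        coef = ((-1) ** s * factorial(n - s) /
                (factorial(s) * factorial((n + abs(m)) // 2 - s)
                 * factorial((n - abs(m)) // 2 - s)))
        R += coef * rho ** (n - 2 * s)
    kernel = R * np.exp(-1j * m * theta) * mask
    return (n + 1) / np.pi * (img * kernel).sum()

rng = np.random.default_rng(5)
img = rng.random((65, 65))
z1 = abs(zernike_moment(img, 4, 2))
z2 = abs(zernike_moment(np.rot90(img), 4, 2))  # 90-degree rotation
print(round(z1, 4), round(z2, 4))
```

Rotation multiplies Z_nm by a unit-magnitude phase factor, so the magnitudes of the two calls agree; that is why descriptor sets are built from |Z_nm|.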

  13. Source properties of the 1998 July 17 Papua New Guinea tsunami based on tide gauge records

    Science.gov (United States)

    Heidarzadeh, Mohammad; Satake, Kenji

    2015-07-01

    We analysed four newly retrieved tide gauge records of the 1998 July 17 Papua New Guinea (PNG) tsunami to study statistical and spectral properties of this tsunami. The four tide gauge records were from Lombrum (PNG), Rabaul (PNG), Malakal Island (Palau) and Yap Island (State of Yap) stations located 600-1450 km from the source. The tsunami registered a maximum trough-to-crest wave height of 3-9 cm at these gauges. Spectral analysis showed two dominant peaks at period bands of 2-4 and 6-20 min with a clear separation at the period of ˜5 min. We interpreted these peak periods as belonging to the landslide and earthquake sources of the PNG tsunami, respectively. Analysis of the tsunami waveforms revealed 12-17 min delay in landslide generation compared to the origin time of the main shock. Numerical simulations including this delay fairly reproduced the observed tide gauge records. This is the first direct evidence of the delayed landslide source of the 1998 PNG tsunami which was previously indirectly estimated from acoustic T-phase records.
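The kind of spectral peak separation described above can be sketched with a plain periodogram. The signal below is synthetic, with components planted in the 2-4 min "landslide" band and the 6-20 min "earthquake" band, not real tide gauge data.

```python
import numpy as np

dt = 15.0                        # hypothetical 15-s tide-gauge sampling
t = np.arange(0, 2 * 3600, dt)   # a 2-hour record
# Synthetic tsunami-like signal (metres): a 3-min and a 10-min component
# plus background noise.
sig = (0.04 * np.sin(2 * np.pi * t / 180) +
       0.03 * np.sin(2 * np.pi * t / 600) +
       0.005 * np.random.default_rng(2).standard_normal(t.size))

spec = np.abs(np.fft.rfft(sig)) ** 2            # periodogram
freqs = np.fft.rfftfreq(t.size, d=dt)

# Report the two strongest spectral peaks as periods in minutes.
order = np.argsort(spec[1:])[::-1] + 1          # skip the DC bin
top_periods = sorted(1 / freqs[order[:2]] / 60)
print(top_periods)
```

The two recovered periods fall on either side of the ~5-min separation the abstract uses to distinguish the landslide and earthquake contributions.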

  14. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    Science.gov (United States)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time Awake during first third of night, and decreased total sleep time) during the mission.

  15. Analysis of the nature and cause of turbulence upset using airline flight records

    Science.gov (United States)

    Parks, E. K.; Bach, R. E., Jr.; Wingrove, R. C.

    1982-01-01

    The development and application of methods for determining aircraft motions and related winds, using data normally recorded during airline flight operations, are described. The methods are being developed, in cooperation with the National Transportation Safety Board, to aid in the analysis and understanding of circumstances associated with aircraft accidents or incidents. Data from a recent DC-10 encounter with severe, high-altitude turbulence are used to illustrate the methods. The analysis of this encounter shows the turbulence to be a series of equally spaced horizontal swirls known as 'cat's eyes' vortices. The use of flight-data analysis methods to identify this type of turbulence phenomenon is presented for the first time.

  16. Cost-benefit analysis of electronic medical record system at a tertiary care hospital.

    Science.gov (United States)

    Choi, Jong Soo; Lee, Woo Baik; Rhee, Poong-Lyul

    2013-09-01

Although Electronic Medical Record (EMR) systems provide various benefits, there are both advantages and disadvantages regarding their cost-effectiveness. This study analyzed the economic effects of EMR systems using a cost-benefit analysis based on the differential costs of managerial accounting. Samsung Medical Center (SMC) is a general hospital in Korea that developed an EMR system for outpatients from 2006 to 2008. This study measured the total costs and benefits during an 8-year period after EMR adoption. The costs include the system costs of building the EMR and the costs incurred in smoothing its adoption. The benefits included cost reductions after its adoption and additional revenues from both remodeling of paper-chart storage areas and medical transcriptionists' contribution. The measured amounts were discounted by SMC's expected interest rate to calculate the net present value (NPV), benefit-cost ratio (BCR), and discounted payback period (DPP). During the analysis period, the cumulative NPV and the BCR were US$3,617 thousand and 1.23, respectively. The DPP was about 6.18 years. Although the adoption of an EMR resulted in overall growth in administrative costs, it is cost-effective since the cumulative NPV was positive. The positive NPV was attributed to both cost reductions and additional revenues. From management's perspective, EMR adoption is not especially attractive, in that the DPP, at 6.18 years, is longer than 5 years and the BCR, at 1.23, is near 1. However, an EMR is a worthwhile investment, seeing that this study did not include any qualitative benefits and that the paper-chart system was cost-centric.
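The three summary figures (NPV, BCR, DPP) all follow from discounted yearly benefit and cost streams. The streams and discount rate below are invented for illustration, not SMC's accounts; the qualitative pattern (positive NPV, BCR modestly above 1, payback beyond 5 years) mirrors the abstract.

```python
def npv_bcr_dpp(benefits, costs, rate):
    """Net present value, benefit-cost ratio, and discounted payback
    period (years) for year-end benefit/cost streams."""
    pvb = [b / (1 + rate) ** i for i, b in enumerate(benefits, 1)]
    pvc = [c / (1 + rate) ** i for i, c in enumerate(costs, 1)]
    npv = sum(pvb) - sum(pvc)
    bcr = sum(pvb) / sum(pvc)
    cum, dpp = 0.0, None
    for i, (b, c) in enumerate(zip(pvb, pvc), 1):
        prev = cum
        cum += b - c
        if dpp is None and cum >= 0:
            # linear interpolation within the year payback occurs
            dpp = i - 1 + (-prev) / (cum - prev)
    return npv, bcr, dpp

# Hypothetical streams (US$ thousand): heavy build costs up front,
# benefits accruing after adoption.
costs = [8000, 4000, 1000, 800, 800, 800, 800, 800]
benefits = [0, 2000, 3500, 3500, 3500, 3500, 3500, 3500]
npv, bcr, dpp = npv_bcr_dpp(benefits, costs, rate=0.05)
print(round(npv), round(bcr, 2), round(dpp, 2))
```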

  17. Thermal-Signature-Based Sleep Analysis Sensor

    Directory of Open Access Journals (Sweden)

    Ali Seba

    2017-10-01

    Full Text Available This paper addresses the development of a new technique in the sleep analysis domain. Sleep is defined as a periodic physiological state during which vigilance is suspended and reactivity to external stimulation is diminished. We sleep on average between six and nine hours per night, and our sleep is composed of four to six cycles of about 90 min each. Each of these cycles is composed of a succession of several stages of sleep that vary in depth. Analysis of sleep is usually done via polysomnography. This examination consists of recording, among other things, electrical cerebral activity by electroencephalography (EEG), ocular movements by electrooculography (EOG), and chin muscle tone by electromyography (EMG). Recordings are made mostly in a hospital, more specifically in a unit for monitoring pathologies related to sleep. The readings are then interpreted manually by an expert to generate a hypnogram, a curve showing the succession of sleep stages during the night in 30-s epochs. The proposed method is based on following a thermal signature, which makes it possible to classify activity into three classes: “awakening,” “calm sleep,” and “restless sleep.” The contribution of this non-invasive method is part of the screening of sleep disorders, to be validated by a more complete analysis of the sleep. The measure provided by this new system, based on temperature monitoring (patient and ambient), is intended to be integrated into the tele-medicine platform developed within the framework of the Smart-EEG project by the SYEL–SYstèmes ELectroniques team. Analysis of the data collected during the first surveys carried out with this method showed a correlation between thermal signature and activity during sleep. The advantage of this method lies in its simplicity and in the possibility of measuring activity during sleep without direct contact with the patient, at home or in hospitals.

  18. An algorithm to manage variable-length records for highly portable clinical data base systems.

    Science.gov (United States)

    Okada, M; Okada, M

    1986-06-01

    An algorithm for archiving variable-length patient records in disk storage is presented. A record, assumed to be a character string such as ASCII-coded text, is compressed and divided into fixed-length blocks. One block consists of a data field and a pointer field, and the blocks comprising a record are forward-chained with pointers. The head pointer of each record is saved sequentially in a separate file. Data compression is performed as follows: if the same character code appears more than twice in succession, the number of repetitions is counted and saved together with the initial two characters. Algorithms for fetching, re-saving, and purging a record are also presented. These were implemented in FORTRAN 77 and tested for performance using a practical patient data file. As the algorithm allows highly flexible record manipulation and can easily be implemented in conventional programming languages, it makes a useful tool for constructing a portable database management system.
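    The two ideas, run-length compression of character repeats and fixed-length block chaining, can be sketched in Python rather than the original FORTRAN 77; the marker byte and field layout below are illustrative assumptions, not the paper's on-disk format:

```python
MARK = "\x00"  # assumed marker byte, presumed absent from record text

def rle_encode(text):
    """Runs of 3+ identical characters become the first two characters
    followed by the remaining repeat count between marker bytes."""
    out, i = [], 0
    while i < len(text):
        j = i
        while j < len(text) and text[j] == text[i]:
            j += 1
        run = j - i
        out.append(text[i] * 2 + MARK + str(run - 2) + MARK if run > 2
                   else text[i] * run)
        i = j
    return "".join(out)

def rle_decode(enc):
    """Inverse of rle_encode."""
    out, i = [], 0
    while i < len(enc):
        if i + 2 < len(enc) and enc[i] == enc[i + 1] and enc[i + 2] == MARK:
            end = enc.index(MARK, i + 3)
            out.append(enc[i] * (2 + int(enc[i + 3:end])))
            i = end + 1
        else:
            out.append(enc[i])
            i += 1
    return "".join(out)

def to_blocks(data, block_size=4):
    """Split an encoded record into fixed-length blocks, each paired with a
    forward pointer (index of the next block; -1 marks the last block)."""
    chunks = [data[k:k + block_size] for k in range(0, len(data), block_size)]
    return [(c, i + 1 if i + 1 < len(chunks) else -1)
            for i, c in enumerate(chunks)]
```

    Chaining via forward pointers is what lets a record grow or shrink on re-save: freed blocks can be relinked without moving the rest of the file.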

  19. Analysis of Accelerographic Records Obtained in Jassy During the 1986 and 1990 Vrancea Earthquakes

    Directory of Open Access Journals (Sweden)

    Ioan-Sorin Borcia

    2007-01-01

    Full Text Available The main goal of this paper is the interpretation, using methodological elements developed at INCERC, of instrumental data obtained in Jassy during the 1986 and 1990 Vrancea earthquakes. The main numerical results obtained for the seismic records at hand are the response spectra (for 12 azimuthally equidistant horizontal directions) of strong-motion records, the global parameters that characterize an individual (horizontal) component of a record (effective peak values and corner (control) periods), and the instrumental intensity (global and averaged over a frequency interval), based on the destructiveness spectrum and on the response spectrum. Finally, an evaluation of the results, as well as conclusions useful for the aseismic design of buildings in Jassy, is presented.

  20. The Private Communications of Magnetic Recording under Socialism (Retrospective Disco Analysis)

    Directory of Open Access Journals (Sweden)

    Oleg Vladimir Sineokij

    2013-07-01

    Full Text Available The article analyzes the formation and development of a general model of rare sound records within the institutions of social communication. The author considers the psychocommunicative features of philophone (record-collecting) communication as a special type of interaction in the field of entertainment, and examines the causes and conditions of the tape subculture in the USSR. The dynamics of disco-communication are traced from the restricted information conditions of socialism to modern high-tech conditions. Finally, drawing on achievements in advanced technology and the innovative revival of the music-recording industry, the author sets out a basic concept of popular-music recording as a special informational and legal institution, looking in retrospect at the theory and practice needed by the future information society.

  1. Online Recorded Data-Based Composite Neural Control of Strict-Feedback Systems With Application to Hypersonic Flight Dynamics.

    Science.gov (United States)

    Xu, Bin; Yang, Daipeng; Shi, Zhongke; Pan, Yongping; Chen, Badong; Sun, Fuchun

    2017-09-25

    This paper investigates the online recorded data-based composite neural control of uncertain strict-feedback systems using the backstepping framework. In each step of the virtual control design, a neural network (NN) is employed for uncertainty approximation. Most previous designs aim directly at system stability, ignoring how the NN actually works as an approximator. In this paper, to enhance the learning ability, a novel prediction error signal is constructed to provide additional correction information for the NN weight update using online recorded data. In this way, the neural approximation precision is greatly improved and convergence is faster. Furthermore, a sliding-mode differentiator is employed to approximate the derivative of the virtual control signal, so the complex analysis of the backstepping design can be avoided. Closed-loop stability is rigorously established, and the boundedness of the tracking error is guaranteed. In simulations of hypersonic flight dynamics, the proposed approach exhibits better tracking performance.

  2. Web application for recording learners’ mouse trajectories and retrieving their study logs for data analysis

    Directory of Open Access Journals (Sweden)

    Yoshinori Miyazaki

    2012-03-01

    Full Text Available With the accelerated implementation of e-learning systems in educational institutions, it has become possible in recent years to record learners’ study logs. It must be admitted, however, that little research has been conducted on the analysis of the study logs obtained. In addition, there is no software that traces the mouse movements of learners during their learning processes, which the authors believe would enable teachers to better understand their students’ behaviors. The objective of this study is to develop a Web application that records students’ study logs, including their mouse trajectories, and to devise an information retrieval (IR) tool that can summarize such diversified data. The results of an experiment are also scrutinized to provide an analysis of the relationship between learners’ activities and their study logs.

  3. Ambient vibration recording on the Maddalena Bridge in Borgo a Mozzano (Italy): data analysis.

    Directory of Open Access Journals (Sweden)

    Riccardo Mario Azzara

    2017-07-01

    A monitoring system has been fitted on the external surface of the bridge in order to evaluate its dynamic response to vibrations originating in the adjacent railway and two nearby roads. The natural frequencies and mode shapes of the structure and the corresponding damping ratios have been obtained by analyzing the recorded data using different techniques of Operational Modal Analysis. Lastly, a finite-element model of the bridge has been calibrated to fit the experimental data.

  4. Spectral analysis in overmodulated holographic reflection gratings recorded with BB640 ultrafine grain emulsion

    Science.gov (United States)

    Mas-Abellán, P.; Madrigal, R.; Fimia, A.

    2015-05-01

    Silver halide emulsions have been considered among the most sensitive recording materials for holographic applications. Nonlinear recording effects in holographic reflection gratings recorded on silver halide emulsions have been studied by different authors with excellent experimental results. In this communication we focus specifically on the effects of refractive-index modulation, trying to reach high levels of overmodulation. We studied the influence of grating thickness on the overmodulation and its effects on the transmission spectra over a wide exposure range, using BB640 ultrafine-grain emulsions of two thicknesses, thin films (6 μm) and thick films (9 μm), exposed to single collimated beams from a red He-Ne laser (wavelength 632.8 nm) in the Denisyuk configuration, giving a spatial frequency of 4990 l/mm recorded in the emulsion. The experimental results show that high overmodulation levels of the refractive index can offer benefits such as high diffraction efficiency (reaching 90%) and increased grating bandwidth (close to 80 nm), making lighter holograms, but can also deform the diffraction spectrum, transforming it from sinusoidal to an approximately square shape. Based on these results, we demonstrate that the spectra of holographic reflection gratings recorded with very high overmodulation of the refractive index are formed by the combination of several nonlinear components. This study is the first step toward developing a new, easy multiplexing technique based on the use of high-index-modulation reflection gratings.

  5. AGGLOMERATIVE CLUSTERING OF SOUND RECORD SPEECH SEGMENTS BASED ON BAYESIAN INFORMATION CRITERION

    Directory of Open Access Journals (Sweden)

    O. Yu. Kydashev

    2013-01-01

    Full Text Available This paper presents a detailed description of the implementation of an agglomerative clustering system for speech segments based on the Bayesian information criterion. Numerical experiment results with different acoustic features, as well as with full and diagonal covariance matrices, are given. A diarization error rate (DER) of 6.4% was achieved by the designed system for audio records of radio «Svoboda».
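    The merge test in BIC-based agglomerative clustering compares a pooled Gaussian model of two segments against separate per-segment models; merging is accepted when the ΔBIC statistic is negative. A sketch with full covariance matrices (feature dimension, penalty weight λ, and data are illustrative, not the authors' setup):

```python
import numpy as np

def delta_bic(x1, x2, lam=1.0):
    """ΔBIC between modelling the two segments (rows = feature vectors)
    with one full-covariance Gaussian versus one Gaussian each.
    ΔBIC < 0 favours merging the segments (same speaker)."""
    n1, n2 = len(x1), len(x2)
    x = np.vstack([x1, x2])
    n, d = x.shape
    logdet = lambda m: np.linalg.slogdet(np.cov(m, rowvar=False))[1]
    # model-complexity penalty: d mean terms + d(d+1)/2 covariance terms
    penalty = 0.5 * lam * (d + 0.5 * d * (d + 1)) * np.log(n)
    return (0.5 * (n * logdet(x) - n1 * logdet(x1) - n2 * logdet(x2))
            - penalty)
```

    Agglomerative diarization then repeatedly merges the segment pair with the lowest ΔBIC until no pair is negative; a diagonal-covariance variant changes only the penalty term and the determinant computation.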

  6. Smart Card Based Integrated Electronic Health Record System For Clinical Practice

    OpenAIRE

    N. Anju Latha; B. Rama Murthy; U. Sunitha

    2012-01-01

    Smart cards are used in information technologies as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems has become popular due to their increased capacity and performance. Smart cards are used as an Electronic Health Record (EHR). Their efficient use, with easy and fast data access facilities, has led to implementation that is particularly widespread in hospitals. In this paper, a smart card based Integrated Electronic health Reco...

  7. Why georeferencing matters: Introducing a practical protocol to prepare species occurrence records for spatial analysis.

    Science.gov (United States)

    Bloom, Trevor D S; Flower, Aquila; DeChaine, Eric G

    2018-01-01

    Species Distribution Models (SDMs) are widely used to understand environmental controls on species' ranges and to forecast species range shifts in response to climatic changes. The quality of input data is a crucial determinant of the model's accuracy. While museum records can be useful sources of presence data for many species, they do not always include accurate geographic coordinates. Therefore, actual locations must be verified through the process of georeferencing. We present a practical, standardized manual georeferencing method (the Spatial Analysis Georeferencing Accuracy (SAGA) protocol) to classify the spatial resolution of museum records specifically for building improved SDMs. We used the high-elevation plant Saxifraga austromontana Wiegand (Saxifragaceae) as a case study to test the effect of using this protocol when developing an SDM. In MAXENT, we generated and compared SDMs using a comprehensive occurrence dataset that had undergone three different levels of georeferencing: (1) trained using all publicly available herbarium records of the species, minus outliers; (2) trained using herbarium records claimed to have been previously georeferenced; and (3) trained using herbarium records that we manually georeferenced to a ≤ 1-km resolution using the SAGA protocol. Model predictions of suitable habitat for S. austromontana differed greatly depending on the georeferencing level. The SDMs fitted with presence locations georeferenced using SAGA outperformed all others. Differences among models were exacerbated for future distribution predictions. Under rapid climate change, accurately forecasting the response of species becomes increasingly important. Failure to georeference location data and cull inaccurate samples leads to erroneous model output, limiting the utility of spatial analyses. We present a simple, standardized georeferencing method to be adopted by curators, ecologists, and modelers to improve the geographic accuracy of museum records and SDMs.

  8. Understanding Factors Contributing to Inappropriate Critical Care: A Mixed-Methods Analysis of Medical Record Documentation.

    Science.gov (United States)

    Neville, Thanh H; Tarn, Derjung M; Yamamoto, Myrtle; Garber, Bryan J; Wenger, Neil S

    2017-11-01

    Factors leading to inappropriate critical care, that is, treatment that should not be provided because it does not offer the patient meaningful benefit, have not been rigorously characterized. We explored medical record documentation about patients who received inappropriate critical care and patients who received appropriate critical care to examine factors associated with the provision of inappropriate treatment. Medical records were abstracted from 123 patients assessed as receiving inappropriate treatment and 66 patients assessed as receiving appropriate treatment who nonetheless died within six months of intensive care unit (ICU) admission. We used mixed methods, combining qualitative analysis of medical record documentation with multivariable analysis to examine the relationship between patient and communication factors and the receipt of inappropriate treatment, and we present these within a conceptual model. The setting was one academic health system. Medical records revealed 21 themes pertaining to prognosis and factors influencing treatment aggressiveness. Four themes were independently associated with patients receiving inappropriate treatment according to physicians. When decision making was not guided by physicians (odds ratio [OR] 3.76, 95% confidence interval [CI] 1.21-11.70) or was delayed by the patient/family (OR 4.52, 95% CI 1.69-12.04), patients were more likely to receive inappropriate treatment. Documented communication about goals of care (OR 0.29, 95% CI 0.10-0.84) and patient preferences driving decision making (OR 0.02, 95% CI 0.00-0.27) were associated with lower odds of receiving inappropriate treatment. Medical record documentation suggests that inappropriate treatment occurs in the setting of communication and decision-making patterns that may be amenable to intervention.

  9. Identifying seizure onset zone from electrocorticographic recordings: A machine learning approach based on phase locking value.

    Science.gov (United States)

    Elahian, Bahareh; Yeasin, Mohammed; Mudigoudar, Basanagoud; Wheless, James W; Babajani-Feremi, Abbas

    2017-10-01

    Using a novel technique based on phase locking value (PLV), we investigated the potential for features extracted from electrocorticographic (ECoG) recordings to serve as biomarkers to identify the seizure onset zone (SOZ). We computed the PLV between the phase of the amplitude of high-gamma activity (80-150 Hz) and the phase of lower-frequency rhythms (4-30 Hz) from ECoG recordings obtained from 10 patients with epilepsy (21 seizures). We extracted five features from the PLV and used a machine learning approach based on logistic regression to build a model that classifies electrodes as SOZ or non-SOZ. More than 96% of electrodes identified as the SOZ by our algorithm were within the resected area in six seizure-free patients. In four non-seizure-free patients, more than 31% of the identified SOZ electrodes were outside the resected area. In addition, we observed that the seizure outcome in non-seizure-free patients correlated with the number of non-resected SOZ electrodes identified by our algorithm. This machine learning approach, based on features extracted from the PLV, effectively identified electrodes within the SOZ. The approach has the potential to assist clinicians in surgical decision-making when pre-surgical intracranial recordings are utilized. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
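    The underlying cross-frequency PLV, between a low-frequency phase and the phase of the high-gamma amplitude envelope, can be sketched with standard signal-processing tools (filter type, order, and band edges below are illustrative choices, not the authors' exact pipeline):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv_amp_phase(sig, fs, lo=(4.0, 30.0), hi=(80.0, 150.0)):
    """PLV between the low-band phase and the phase of the high-gamma
    amplitude envelope (both phases via the Hilbert transform)."""
    def band(x, edges):
        b, a = butter(4, [edges[0] / (fs / 2), edges[1] / (fs / 2)], "band")
        return filtfilt(b, a, x)
    low_phase = np.angle(hilbert(band(sig, lo)))
    env = np.abs(hilbert(band(sig, hi)))          # high-gamma envelope
    env_phase = np.angle(hilbert(band(env, lo)))  # its low-band phase
    return float(np.abs(np.mean(np.exp(1j * (low_phase - env_phase)))))
```

    A PLV near 1 indicates consistent phase-amplitude coupling at that electrode; near 0, none. Per-electrode features of this kind feed the logistic-regression classifier.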

  10. Natural Language Processing Based Instrument for Classification of Free Text Medical Records

    Directory of Open Access Journals (Sweden)

    Manana Khachidze

    2016-01-01

    Full Text Available According to the Ministry of Labor, Health and Social Affairs of Georgia, a new health management system is to be introduced in the near future. In this context arises the problem of structuring and classifying documents containing the entire history of the medical services provided. The present work introduces an instrument for the classification of medical records in the Georgian language. It is the first attempt at such classification of Georgian-language medical records. In total, 24,855 examination records were studied. The documents were classified into three main groups (ultrasonography, endoscopy, and X-ray) and 13 subgroups using two well-known methods: Support Vector Machine (SVM) and K-Nearest Neighbor (KNN). The results obtained demonstrated that both machine learning methods performed successfully, with a slight advantage for SVM. In the process of classification a “shrink” method, based on feature selection, was introduced and applied. At the first stage of classification the results of the “shrink” case were better; however, at the second stage of classification into subclasses, 23% of all documents could not be linked to a single definite subclass (liver or biliary system) due to common features characterizing these subclasses. The overall results of the study were successful.
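    The KNN stage can be illustrated with a tiny bag-of-words cosine-similarity classifier (a toy stand-in: the study worked on Georgian-language records with richer feature selection, and SVM performed slightly better):

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity of two sparse term-count vectors."""
    dot = sum(v * b.get(t, 0) for t, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(text, labeled_docs, k=3):
    """Label a document by majority vote among its k nearest neighbours."""
    vec = Counter(text.lower().split())
    sims = sorted(((cosine(vec, Counter(doc.lower().split())), label)
                   for doc, label in labeled_docs), reverse=True)
    return Counter(label for _, label in sims[:k]).most_common(1)[0][0]
```

    The "shrink" idea corresponds to restricting the term vectors to a selected feature subset before computing similarities.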

  11. Analysis of Driver Evasive Maneuvering Prior to Intersection Crashes Using Event Data Recorders.

    Science.gov (United States)

    Scanlon, John M; Kusano, Kristofer D; Gabler, Hampton C

    2015-01-01

    Intersection crashes account for over 4,500 fatalities in the United States each year. Intersection Advanced Driver Assistance Systems (I-ADAS) are emerging vehicle-based active safety systems that have the potential to help drivers safely navigate intersections and prevent intersection crashes and injuries. The performance of an I-ADAS is expected to be highly dependent on driver evasive maneuvering prior to an intersection crash. Little has been published, however, on the detailed evasive kinematics of drivers prior to real-world intersection crashes. The objective of this study was to characterize the frequency, timing, and kinematics of driver evasive maneuvers prior to intersection crashes. Event data recorders (EDRs) downloaded from vehicles involved in intersection crashes were investigated as part of NASS-CDS years 2001 to 2013. A total of 135 EDRs with precrash vehicle speed and braking application were downloaded to investigate evasive braking. A smaller subset of 59 EDRs that collected vehicle yaw rate was additionally analyzed to investigate evasive steering. Each vehicle was assigned to one of 3 precrash movement classifiers (traveling through the intersection, completely stopped, or rolling stop) based on the vehicle's calculated acceleration and observed velocity profile. To ensure that any significant steering input observed was an attempted evasive maneuver, the analysis excluded vehicles at intersections that were turning, driving on a curved road, or performing a lane change. Braking application at the last EDR-recorded time point was assumed to indicate evasive braking. A vehicle yaw rate greater than 4° per second was assumed to indicate an evasive steering maneuver. Drivers executed crash avoidance maneuvers in four-fifths of intersection crashes. A more detailed analysis of evasive braking frequency by precrash maneuver revealed that drivers performing complete or rolling stops (61.3%) braked less often than drivers traveling through the intersection.
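    The reported decision rules, braking at the last recorded sample as evasive braking, peak yaw rate above 4°/s as evasive steering, and speed-profile movement classes, can be sketched as simple threshold tests (function names and the rolling-stop speed threshold are illustrative assumptions, not the EDR/NASS-CDS coding):

```python
def precrash_movement(speeds_mps, rolling_thresh=4.5):
    """Assign a precrash movement class from the recorded speed profile.
    The rolling-stop speed cutoff is an assumed, illustrative value."""
    v_min = min(speeds_mps)
    if v_min == 0:
        return "completely stopped"
    if v_min < rolling_thresh:
        return "rolling stop"
    return "traveling through"

def evasive_actions(brake_on, yaw_rate_dps=None, yaw_threshold=4.0):
    """Detect evasive braking/steering from a precrash EDR trace."""
    actions = []
    if brake_on and brake_on[-1]:  # braking at the last recorded sample
        actions.append("braking")
    if yaw_rate_dps is not None and max(abs(r) for r in yaw_rate_dps) > yaw_threshold:
        actions.append("steering")
    return actions or ["none"]
```

    The exclusion filters (turning, curved road, lane change) would be applied upstream, before any yaw-rate excursion is read as evasive steering.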

  12. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording

    Directory of Open Access Journals (Sweden)

    Marco Crescentini

    2016-05-01

    Full Text Available High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. A HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the ranges ±200 pA and ±20 nA, with 100 kHz bandwidth, while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from the Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at the ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to validate both the platform and the current-to-digital converter.

  13. Archetype-based data warehouse environment to enable the reuse of electronic health record data.

    Science.gov (United States)

    Marco-Ruiz, Luis; Moner, David; Maldonado, José A; Kolstrup, Nils; Bellika, Johan G

    2015-09-01

    The reuse of data captured during health care delivery is essential to satisfy the demands of clinical research and clinical decision support systems. A main barrier to such reuse is the existence of legacy data formats and the high granularity of the data when stored in an electronic health record (EHR) system. Thus, we need mechanisms to standardize, aggregate, and query data concealed in EHRs, to allow their reuse whenever they are needed. The aim was to create a data warehouse infrastructure using archetype-based technologies, standards and query languages to enable the interoperability needed for data reuse. The work presented makes use of best-of-breed archetype-based data transformation and storage technologies to create a workflow for the modeling, extraction, transformation and loading of EHR proprietary data into standardized data repositories. We converted legacy data and performed patient-centered aggregations via archetype-based transformations. Later, specific-purpose aggregations were performed at the query level for particular use cases. Laboratory test results of a population of 230,000 patients belonging to Troms and Finnmark counties in Norway, requested between January 2013 and November 2014, have been standardized. Normalization of the test records was performed by defining transformation and aggregation functions between the laboratory records and an archetype. These mappings were used to automatically generate openEHR-compliant data, which were loaded into an archetype-based data warehouse. Once the data were loaded, we defined indicators linked to the data in the warehouse to monitor test activity for Salmonella and Pertussis using the archetype query language. Archetype-based standards and technologies can be used to create a data warehouse environment that enables data from EHR systems to be reused in clinical research and decision support systems. With this approach, existing EHR data become available in a standardized and interoperable format, thus opening a world of possibilities for secondary use.

  14. A data repository and analysis framework for spontaneous neural activity recordings in developing retina.

    Science.gov (United States)

    Eglen, Stephen John; Weeks, Michael; Jessop, Mark; Simonotto, Jennifer; Jackson, Tom; Sernagor, Evelyne

    2014-03-26

    During early development, neural circuits fire spontaneously, generating activity episodes with complex spatiotemporal patterns. Recordings of spontaneous activity have been made in many parts of the nervous system over the last 25 years, reporting developmental changes in activity patterns and the effects of various genetic perturbations. We present a curated repository of multielectrode array recordings of spontaneous activity in developing mouse and ferret retina. The data have been annotated with minimal metadata and converted into HDF5. This paper describes the structure of the data, along with examples of reproducible research using these data files. We also demonstrate how these data can be analysed in the CARMEN workflow system. This article is written as a literate programming document; all programs and data described here are freely available. (1) We hope this repository will lead to novel analysis of spontaneous activity recorded in different laboratories. (2) We encourage published data to be added to the repository. (3) This repository serves as an example of how multielectrode array recordings can be stored for long-term reuse.

  15. Calibration of Clinical Audio Recording and Analysis Systems for Sound Intensity Measurement.

    Science.gov (United States)

    Maryn, Youri; Zarowski, Andrzej

    2015-11-01

    Sound intensity is an important acoustic feature of voice/speech signals. Yet recordings are performed with different microphone, amplifier, and computer configurations, and it is therefore crucial to calibrate the sound intensity measures of clinical audio recording and analysis systems against the output of a sound-level meter. This study was designed to evaluate the feasibility, validity, and accuracy of calibration methods, including audiometric speech noise signals and human voice signals under typical speech conditions. Calibration consisted of 3 comparisons between data from 29 measurement microphone-and-computer systems and data from the sound-level meter: signal-specific comparison with audiometric speech noise at 5 levels, signal-specific comparison with natural voice at 3 levels, and cross-signal comparison with natural voice at 3 levels. Intensity measures from the recording systems were then linearly converted into calibrated data on the basis of these comparisons, and the validity and accuracy of the calibrated sound intensity were investigated. Very strong correlations and quasi-similarity were found between calibrated data and sound-level meter data across calibration methods and recording systems. Calibration of clinical sound intensity measures according to this method is feasible, valid, accurate, and representative for a heterogeneous set of microphones and data acquisition systems in real-life circumstances with distinct noise contexts.
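    The linear conversion step amounts to fitting, per recording system, a least-squares line from the system's uncalibrated intensity readings to sound-level-meter values; a stdlib-only sketch (toy numbers, not study data):

```python
def fit_calibration(recorded_db, slm_db):
    """Least-squares slope/intercept mapping a system's recorded intensity
    values onto sound-level-meter references."""
    n = len(recorded_db)
    mx = sum(recorded_db) / n
    my = sum(slm_db) / n
    sxx = sum((x - mx) ** 2 for x in recorded_db)
    sxy = sum((x - mx) * (y - my) for x, y in zip(recorded_db, slm_db))
    slope = sxy / sxx
    return slope, my - slope * mx

def calibrate(x, slope, intercept):
    """Convert one uncalibrated reading to a calibrated dB value."""
    return slope * x + intercept
```

    Each of the 29 systems would get its own slope/intercept pair from the comparison recordings; validity is then checked by how closely calibrated readings track the meter on held-out signals.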

  16. A 6,700-year sea-level record based on French Polynesian coral reefs

    Science.gov (United States)

    Hallmann, Nadine; Camoin, Gilbert; Eisenhauer, Anton; Vella, Claude; Samankassou, Elias; Botella, Albéric; Milne, Glenn; Fietzke, Jan; Dussouillez, Philippe

    2015-04-01

    Sea-level change during the Mid- to Late Holocene has an amplitude similar to the sea-level rise that is likely to occur before the end of the 21st century, providing a unique opportunity to study the coastal response to sea-level change and to reveal an important baseline of natural climate variability prior to the industrial revolution. Mid- to Late Holocene relative sea-level change in French Polynesia was reconstructed using coral reef records from ten islands, which represent ideal settings for accurate sea-level studies because: 1) they can be regarded as tectonically stable during the relevant period (slow subsidence); 2) they are located far from former ice sheets (far-field); 3) they are characterized by a low tidal amplitude; and 4) they cover a wide range of latitudes, which produces significantly improved constraints on GIA (Glacial Isostatic Adjustment) model parameters. Absolute U/Th dating of in situ coral colonies and their accurate positioning via GPS RTK (Real Time Kinematic) measurements are crucial for an accurate reconstruction of sea-level change. We focus mainly on the analysis of coral microatolls, which are sensitive low-tide recorders, as their vertical accretion is limited by the mean low water springs level. Growth pattern analysis allows the reconstruction of low-amplitude, high-frequency sea-level changes on centennial to sub-decadal time scales. A sea-level rise of less than 1 m is recorded between 6 and 3-3.5 ka, followed by a gradual fall in sea level that started around 2.5 ka and persisted until the past few centuries. The reconstructed sea-level curve therefore extends the Tahiti sea-level curve [Deschamps et al., 2012, Nature, 483, 559-564] and is in good agreement with a geophysical model tuned to fit far-field deglacial records [Bassett et al., 2005, Science, 309, 925-928].

  17. High-efficient and high-content cytotoxic recording via dynamic and continuous cell-based impedance biosensor technology.

    Science.gov (United States)

    Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping

    2016-10-01

    Cell-based bioassays are an effective way to assess compound toxicity via cell viability, but traditional label-based methods miss much information about cell growth because of their endpoint detection, while higher throughput is demanded to obtain dynamic information. Cell-based biosensor methods can monitor cell viability dynamically and continuously; however, this dynamic information has often been ignored or seldom utilized in toxin and drug assessment. Here, we report a high-efficiency, high-content cytotoxic recording method via dynamic and continuous cell-based impedance biosensor technology. Dynamic cell viability, inhibition ratio, and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensor responds in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of the dynamic cell viability and cell growth status. Moreover, the throughput of dynamic cytotoxicity assessment was compared between cell-based biosensor methods and label-based endpoint methods. This cell-based impedance biosensor can provide a flexible, cost- and label-efficient platform for cell viability assessment in shellfish toxin screening.

  18. Paper-Based Medical Records: the Challenges and Lessons Learned from Studying Obstetrics and Gynaecological Post-Operation Records in a Nigerian Hospital

    Directory of Open Access Journals (Sweden)

    Adekunle Yisau Abdulkadir

    2010-10-01

    Full Text Available AIM: With the background knowledge that auditing of Medical Records (MR) for adequacy and completeness is necessary if they are to be useful and reliable in continuing patient care, in protecting the legal interests of the patient, physicians, and the hospital, and in meeting requirements for research, we scrutinized the theatre records of our hospital to identify routine omissions or deficiencies and correctable errors in our MR system. METHOD: Obstetrics and gynaecological post-operation theatre records between January 2006 and December 2008 were quantitatively and qualitatively analyzed for details that included: hospital number; patient's age; diagnosis; surgery performed; types and modes of anesthesia; date of surgery; patient's ward; anesthetists' names; surgeons' and attending nurses' names; and abbreviations used, with SPSS 15.0 for Windows. RESULTS: Hardly any of the 1270 surgeries during the study period were documented without an omission or an abbreviation. Hospital numbers and patients' ages were not documented in 21.8% (n=277) and 59.1% (n=750) of records, respectively. Diagnoses and surgeries were recorded with varying abbreviations in about 96% of instances. Surgical team names were mostly abbreviated or given as initials only. CONCLUSION: To improve the quality of paper-based Medical Records, regular auditing, training and good orientation of medical personnel in good record practices, and discouraging large-volume record books to reduce paper damage and sheet loss from handling are necessary; otherwise what we record today may be neither useful nor available tomorrow. [TAF Prev Med Bull 2010; 9(5): 427-432]

  19. Impact of the recorded variable on recurrence quantification analysis of flows

    International Nuclear Information System (INIS)

    Portes, Leonardo L.; Benda, Rodolfo N.; Ugrinowitsch, Herbert; Aguirre, Luis A.

    2014-01-01

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA
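
    The core object behind the RQA coefficients discussed above can be sketched in a few lines: threshold the pairwise distances of a recorded series s(t) and count recurrent pairs (a toy illustration with an invented series; the paper's analysis uses Rössler trajectories and further RQA measures):

```python
# Minimal recurrence-rate sketch (illustrative, not the authors' pipeline):
# the recurrence matrix has R[i][j] = 1 if |s_i - s_j| < eps, and the
# recurrence rate is the fraction of recurrent pairs.

def recurrence_rate(s, eps):
    n = len(s)
    recurrent = sum(
        1 for i in range(n) for j in range(n) if abs(s[i] - s[j]) < eps
    )
    return recurrent / (n * n)

s = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]   # perfectly periodic toy "recording"
print(recurrence_rate(s, eps=0.5))   # 0.5: each point recurs with half the series
```

    The paper's observation is that, for flows, coefficients derived from this matrix can differ noticeably depending on whether s(t) is taken to be x, y or z of the same system.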

  20. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: ll.portes@gmail.com [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)

    2014-06-27

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA.

  1. Study on key techniques for camera-based hydrological record image digitization

    Science.gov (United States)

    Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping

    2015-10-01

    With the development of information technology, the digitization of scientific and engineering drawings has received more and more attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record observations from sensors. However, these paper drawings may be destroyed or contaminated through improper preservation or overuse. Furthermore, manually transcribing these data into the computer is a heavy workload and prone to error. Hence, digitizing these drawings and establishing a corresponding database will ensure the integrity of the data and provide invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques, i.e., image segmentation, intersection point localization and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water level sheet image is proposed, based on the adaptive fusion of gradient and color information. Second, a fast search strategy for intersection point localization is introduced that exploits the grid distribution information to avoid point-by-point processing. Finally, we put forward a local rectification method that analyzes the central portions of the image and utilizes domain knowledge of hydrology. The processing speed is accelerated while the accuracy remains satisfactory. Experiments on several real water level records show that the proposed techniques are effective and capable of recovering the hydrological observations accurately.

  2. Feasibility and performance evaluation of generating and recording visual evoked potentials using ambulatory Bluetooth based system.

    Science.gov (United States)

    Ellingson, Roger M; Oken, Barry

    2010-01-01

    This report contains the design overview and key performance measurements demonstrating the feasibility of generating and recording ambulatory visual evoked potentials using the previously reported custom Complementary and Alternative Medicine physiologic data collection and monitoring system, CAMAS. We present the methods used to generate visual stimuli on a PDA device and the design of an optical coupling device that converts the display output to an electrical waveform recorded by the CAMAS base unit. The optical sensor signal, synchronized to the visual stimulus, emulates the brain's synchronized EEG signal normally reviewed for the evoked potential response. Most importantly, the PDA also sends a marker message over the wireless Bluetooth connection to the CAMAS base unit, synchronized to the visual stimulus, which is the critical averaging reference needed to obtain VEP results. Results show that the variance in the latency of the wireless marker messaging link is consistent enough to support the generation and recording of visual evoked potentials. The averaged sensor waveforms at multiple CPU speeds are presented and demonstrate the suitability of the Bluetooth interface for portable ambulatory visual evoked potential implementation on our CAMAS platform.
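
    The marker-synchronised averaging at the heart of evoked-potential recording can be sketched as follows (illustrative function and toy numbers, not the CAMAS implementation):

```python
# Sketch of marker-synchronised averaging: each marker gives the sample
# index of a stimulus, and the evoked potential estimate is the mean of
# the fixed-length signal epochs following each marker.

def average_epochs(signal, markers, epoch_len):
    """Average fixed-length epochs starting at each stimulus marker."""
    epochs = [signal[m:m + epoch_len] for m in markers
              if m + epoch_len <= len(signal)]
    n = len(epochs)
    return [sum(col) / n for col in zip(*epochs)]

# Toy signal: a repeated [0, 4, 2] response at each marker position.
signal = [0, 4, 2, 0, 4, 2, 0, 4, 2]
markers = [0, 3, 6]
print(average_epochs(signal, markers, epoch_len=3))  # [0.0, 4.0, 2.0]
```

    Jitter in marker latency smears the average, which is why the report's measurement of marker-timing variance is the key feasibility result.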

  3. Single-chip microcomputer based protection, diagnostic and recording system for longwall shearers

    Energy Technology Data Exchange (ETDEWEB)

    Heyduk, A.; Krasucki, F. (Politechnika Slaska, Gliwice (Poland). Katedra Elektryfikacji i Automatyzacji Gornictwa)

    1993-05-01

    Presents a concept of microcomputer-aided operation, protection, diagnostics and recording for shearer loaders. A two-stage mathematical model is suggested and explained. The model represents the thermal processes that determine the overcurrent protection of drive motors. Circuits for monitoring fuses, supply voltages, contacts, relays, contactors and electro-hydraulic distributors with the use of optocouplers are shown. Recording of characteristic operating parameters of a shearer loader during the 5 minutes before a failure is proposed. Protection, diagnosis and control functions are suggested as additions to the microcomputer-aided shearer loader control system being developed at the Silesian Technical University. The system is based on the NEC µPD78310 microprocessor. 10 refs.

  4. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach that uses the Design Structure Matrix (DSM) modeling technique to improve the design of Electronic Medical Record (EMR) user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task, saving an average of 21 hours of hospital physicians' time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.

  5. A Semantic-Based K-Anonymity Scheme for Health Record Linkage.

    Science.gov (United States)

    Lu, Yang; Sinnott, Richard O; Verspoor, Karin

    2017-01-01

    Record linkage is a technique for integrating data from sources or providers where direct access to the data is not possible due to security and privacy considerations. This is a very common scenario for medical data, as patient privacy is a significant concern. To avoid privacy leakage, researchers have adopted k-anonymity to protect raw data from re-identification; however, they cannot avoid the associated information loss, e.g. due to generalisation. Given that individual-level data is often not disclosed in linkage cases, yet remains potentially re-discoverable, we propose semantic-based linkage k-anonymity to de-identify record linkage with fewer generalisations and to eliminate inference disclosure through semantic reasoning.
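
    For readers unfamiliar with k-anonymity, the basic property the scheme builds on can be sketched as follows (a toy check with invented field names; the paper's semantic-based scheme goes well beyond this):

```python
# Toy k-anonymity check: a table is k-anonymous with respect to its
# quasi-identifiers if every combination of quasi-identifier values
# occurs in at least k records.

from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical generalised records: ages bucketed, postcodes truncated.
records = [
    {"age": "30-40", "postcode": "3***", "diagnosis": "F32"},
    {"age": "30-40", "postcode": "3***", "diagnosis": "F10"},
    {"age": "30-40", "postcode": "3***", "diagnosis": "F20"},
]
print(is_k_anonymous(records, ["age", "postcode"], k=3))  # True
```

    The generalisation that makes the check pass (e.g. "3***" for a postcode) is exactly the information loss the proposed semantic approach aims to reduce.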

  6. [Design and Implementation of a Mobile Operating Room Information Management System Based on Electronic Medical Record].

    Science.gov (United States)

    Liu, Baozhen; Liu, Zhiguo; Wang, Xianwen

    2015-06-01

    A mobile operating room information management system with electronic medical record (EMR) support is designed to improve work efficiency and to enhance patient information sharing. In the operating room, this system acquires information from various medical devices through the Client/Server (C/S) pattern and automatically generates XML-based EMRs. Outside the operating room, this system provides information access services using the Browser/Server (B/S) pattern. Software testing shows that this system can correctly collect medical information from equipment and clearly display real-time waveforms. By producing higher-quality surgery records and sharing information among mobile medical units, this system can effectively reduce doctors' workload and promote the informatization of the field hospital.

  7. [Web-based preanesthesia evaluation record: a structured, evidence-based patient interview to assess the anesthesiological risk profile].

    Science.gov (United States)

    Kramer, Sylvia; Lau, Alexandra; Krämer, Michael; Wendler, Olafur Gunnarsson; Müller-Lobeck, Lutz; Scheding, Christoph; Klarhöfer, Manja; Schaffartzik, Walter; Neumann, Tim; Krampe, Henning; Spies, Claudia

    2011-10-01

    At present, providers at an Anesthesia Preoperative Evaluation Clinic (APEC) may have difficulty gaining access to relevant clinical information, including external medical records, surgical dictations, etc. This common occurrence makes obtaining the patient's informed consent after a complete pre-anesthetic assessment difficult. This form of patient information is subject to wide interindividual variation and thus represents a challenge for quality assurance. Insufficient or incomplete pre-anesthetic assessments can lead to the untimely cancellation of an elective procedure. A web-based pre-anesthetic evaluation record moves the time point of the first contact to well before the day of admission. The current pre-anesthesia evaluation record is replaced by a structured interview in the form of a complex of questions in a specific hierarchy, taking guidelines, standard operating procedures (SOPs) and evidence-based medicine (EBM) into consideration. The answers to the complex of questions are then classified according to agreed criteria and possible scoring systems of relevant classifications. The endpoints result in procedural recommendations not only for the informing anesthesiologist but also for the patient. The standardized risk criteria can be used as core process indicators to check the process quality of the anesthesiological risk evaluation. Short-notice cancellations of elective operations due to incomplete premedication procedures can then be avoided with the help of such structured and evidence-based patient interviews with detailed assessment of the anesthesiological risk profile. The web-based anesthesia evaluation record (WACH) corresponds with the recommendations of the DGAI to carry out staged information in analogy to the staged information of Weissauer. The basic practice is not changed by WACH. By means of WACH, the time point of the first contact with anesthesia is moved forward and occurs within a different framework. WACH has

  8. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera, which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios, where communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or has been destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  9. Exploring Type-and-Identity-Based Proxy Re-Encryption Scheme to Securely Manage Personal Health Records

    NARCIS (Netherlands)

    Ibraimi, L.; Gangopadhyay, Aryya; Tang, Qiang; Hartel, Pieter H.; Jonker, Willem

    2010-01-01

    Commercial Web-based Personal-Health Record (PHR) systems can help patients to share their personal health records (PHRs) anytime from anywhere. PHRs are very sensitive data and an inappropriate disclosure may cause serious problems to an individual. Therefore commercial Web-based PHR systems have

  10. Revised estimates of Greenland ice sheet thinning histories based on ice-core records

    DEFF Research Database (Denmark)

    Lecavalier, B.S.; Milne, G.A.; Fisher, D.A.

    2013-01-01

    -based reconstructions and, to some extent, the estimated elevation histories. A key component of the ice core analysis involved removing the influence of vertical surface motion on the δ18O signal measured from the Agassiz and Renland ice caps. We re-visit the original analysis with the intent to determine if the use

  11. Periodontal progression based on radiographic records: An observational study in chronic and aggressive periodontitis.

    Science.gov (United States)

    Onabolu, Olanrewaju; Donos, Nikos; Tu, Yu-Kang; Darbar, Ulpee; Nibali, Luigi

    2015-06-01

    The current classification assumes that aggressive periodontitis (AgP) has a faster rate of progression than chronic periodontitis (CP). However, this has not been clearly proven, and difficulties exist in establishing progression. This study aimed to assess the feasibility of retrospectively utilising previous records for the clinical diagnosis of periodontal diseases and to assess whether two different patterns of disease progression exist between AgP and CP. Previous radiographic records of a cohort of 235 patients clinically diagnosed with AgP or CP were requested from the referring general dental practitioners (GDPs). Comparable radiographic records were analysed in order to assess progression patterns and associate these with clinical diagnosis, by multilevel analysis. 43 patients out of the initial 235 had comparable radiographs retrieved from the GDPs. 858 sites were followed for an average of 6.6 years. Radiographically, AgP showed a faster linear pattern of progression than CP (0.31 mm/year vs. 0.20 mm/year). These findings support the existence of two different patterns of periodontal disease progression and may have an impact on the clinical management of aggressive periodontitis, since they show that there is continuous destruction in patients with aggressive periodontitis if left untreated. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Analysis of diagnoses extracted from electronic health records in a large mental health case register.

    Directory of Open Access Journals (Sweden)

    Yevgeniya Kovalchuk

    Full Text Available The UK government has recently recognised the need to improve mental health services in the country. Electronic health records provide a rich source of patient data which could help policymakers to better understand the needs of service users. The main objective of this study is to unveil statistics of diagnoses recorded in the Case Register of the South London and Maudsley NHS Foundation Trust, one of the largest mental health providers in the UK and Europe, serving a source population of over 1.2 million people residing in south London. Based on over 500,000 diagnoses recorded in ICD10 codes for a cohort of approximately 200,000 mental health patients, we established the frequency rate of each diagnosis (the ratio of the number of patients for whom a diagnosis has ever been recorded to the number of patients in the entire population who have made contact with mental health services). We also investigated differences in diagnosis prevalence between subgroups of patients stratified by gender and ethnicity. The most common diagnoses in the considered population were (recurrent) depression (ICD10 codes F32-F33; 16.4% of patients), reaction to severe stress and adjustment disorders (F43; 7.1%), mental/behavioural disorders due to use of alcohol (F10; 6.9%), and schizophrenia (F20; 5.6%). We also found many diagnoses which were more likely to be recorded in patients of a certain gender or ethnicity. For example, mood (affective) disorders (F31-F39); neurotic, stress-related and somatoform disorders (F40-F48, except F42); and eating disorders (F50) were more likely to be found in the records of female patients, while males were more likely to be diagnosed with mental/behavioural disorders due to psychoactive substance use (F10-F19). Furthermore, mental/behavioural disorders due to use of alcohol and opioids were more likely to be recorded in patients of white ethnicity, and disorders due to use of cannabinoids in those of black ethnicity.

  13. Relation chain based clustering analysis

    Science.gov (United States)

    Zhang, Cheng-ning; Zhao, Ming-yang; Luo, Hai-bo

    2011-08-01

    Clustering analysis is currently one of the well-developed branches of data mining technology, intended to find hidden structures in a multidimensional space called the feature or pattern space. A datum in this space usually takes vector form, with elements representing specifically selected features that are often chosen for their relevance to the problem at hand. Generally, clustering analysis falls into two divisions: one based on agglomerative clustering methods and the other on divisive clustering methods. The former refers to a bottom-up process that initially regards each datum as a singleton cluster, while the latter refers to a top-down process that initially regards the entire data set as one cluster. From the collected literature, it is noted that divisive clustering currently dominates both application and research. Although some famous divisive clustering methods are well designed and developed, clustering problems are still far from solved. The k-means algorithm is the original divisive clustering method; it requires the user to assign important initial values, such as the number of clusters and the initial cluster prototype positions, which may not be reasonable in some situations. Beyond this initialization problem, the k-means algorithm may also fall into local optima, clusters in a rigid way, and is not suitable for non-Gaussian distributions. One can see that seeking a good or natural clustering result in fact originates from one's understanding of the concept of clustering; confusion or misunderstanding of the definition of clustering therefore yields unsatisfactory clustering results, so the definition should be considered deeply and seriously. This paper demonstrates the nature of clustering, gives a way of understanding clustering, discusses the methodology of designing a clustering algorithm, and proposes a new clustering method based on relation chains among 2D patterns.
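
    The dependence of k-means on user-supplied initial values, as criticised above, is easy to see in a minimal one-dimensional sketch (illustrative code, not the proposed relation-chain method):

```python
# Minimal 1-D k-means: alternate nearest-centre assignment and mean update.
# The result depends on the initial prototype positions passed in by the
# user, which is one of the initialization issues discussed above.

def kmeans_1d(data, centers, iters=10):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centre.
        clusters = [[] for _ in centers]
        for x in data:
            idx = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
            clusters[idx].append(x)
        # Update step: move each centre to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
print(kmeans_1d(data, centers=[0.0, 10.0]))  # converges near [1.0, 8.0]
```

    With well-separated initial prototypes the two natural groups are found; a poor initialization (e.g. both centres inside one group) can leave the algorithm in a local optimum.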

  14. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
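
    The natural visibility criterion underlying such mappings can be sketched briefly: two samples are linked if the straight line between them passes above every intermediate sample (a toy implementation, not the authors' method):

```python
# Natural visibility graph sketch: samples (i, s_i) and (j, s_j) are
# connected if all intermediate samples lie strictly below the straight
# line joining them.

def visibility_edges(series):
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < series[i] + (series[j] - series[i]) *
                   (k - i) / (j - i) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

series = [3.0, 1.0, 2.0]   # the dip at index 1 does not block (0, 2)
print(sorted(visibility_edges(series)))  # [(0, 1), (0, 2), (1, 2)]
```

    Applying this mapping per segment, as described above, yields one graph per state; linking successive states then produces the temporal network of networks.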

  15. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data.

    Science.gov (United States)

    Jang, Min Jee; Nam, Yoonkey

    2015-07-01

    Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of [Formula: see text] neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements.
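
    A much-simplified version of the spike detection step can be sketched as a threshold crossing on a fluorescence trace (toy values; NeuroCa's actual decomposition and detection algorithms are more sophisticated):

```python
# Threshold-based calcium spike detection sketch (illustrative only):
# a spike onset is a sample where the trace crosses the threshold
# from below.

def detect_spike_onsets(trace, threshold):
    return [i for i in range(1, len(trace))
            if trace[i] >= threshold and trace[i - 1] < threshold]

# Hypothetical dF/F trace for one cell, with two calcium transients.
trace = [0.0, 0.1, 0.9, 0.8, 0.2, 0.1, 1.1, 0.3]
print(detect_spike_onsets(trace, threshold=0.5))  # [2, 6]
```

    Running such a detector on every cell's extracted signal yields the per-neuron spike trains from which population-level measures such as synchrony can be quantified.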

  16. Fetal movement detection based on QRS amplitude variations in abdominal ECG recordings.

    Science.gov (United States)

    Rooijakkers, M J; de Lau, H; Rabotti, C; Oei, S G; Bergmans, J W M; Mischi, M

    2014-01-01

    Evaluation of fetal motility can give insight into fetal health, as a strong decrease can be seen as a precursor to fetal death. Typically, the assessment of fetal health by fetal movement detection relies on the maternal perception of fetal activity. The percentage of detected movements is strongly subject-dependent and, even with the undivided attention of the mother, varies between 37% and 88%. Various methods to assist in fetal movement detection exist, based on a wide spectrum of measurement techniques. However, these are typically unsuitable for ambulatory or long-term observation. In this paper, a novel method for fetal motion detection is presented based on amplitude and shape changes in the abdominally recorded fetal ECG. The proposed method has a sensitivity and specificity of 0.67 and 0.90, respectively, outperforming alternative fetal ECG-based methods from the literature.

  17. Towards Standardized Patient Data Exchange: Integrating a FHIR Based API for the Open Medical Record System.

    Science.gov (United States)

    Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul

    2015-01-01

    Interoperability is essential to address the limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next-generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an open-source medical record system widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR-based API for the OpenMRS platform. Lessons learned from this effort were used to define long-term plans to transition from the legacy OpenMRS API to a FHIR-based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.

  18. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii

    Science.gov (United States)

    Prouty, Nancy G.; Jupiter, Stacy D.; Field, Michael E.; McCulloch, Malcolm T.

    2009-01-01

    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (−40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge.

  19. Task and error analysis balancing benefits over business of electronic medical records.

    Science.gov (United States)

    Carstens, Deborah Sater; Rodriguez, Walter; Wood, Michael B

    2014-01-01

    Task and error analysis research was performed to identify: a) the process by which healthcare organisations manage healthcare for patients with mental illness or substance abuse; b) how the process can be enhanced; and c) whether electronic medical records (EMRs) have a role in this process from a business and safety perspective. The research question is whether EMRs have a role in enhancing the healthcare of patients with mental illness or substance abuse. A discussion of the business of EMRs is included to understand the balancing act between the safety and business aspects of an EMR.

  20. Extracting physician group intelligence from electronic health records to support evidence based medicine.

    Directory of Open Access Journals (Sweden)

    Griffin M Weber

    Full Text Available Evidence-based medicine employs expert opinion and clinical data to inform clinical decision making. The objective of this study is to determine whether it is possible to complement these sources of evidence with information about physician "group intelligence" that exists in electronic health records. Specifically, we measured laboratory test "repeat intervals", defined as the amount of time it takes for a physician to repeat a test that was previously ordered for the same patient. Our assumption is that while the result of a test is a direct measure of one marker of a patient's health, the physician's decision to order the test is based on multiple factors, including past experience, available treatment options, and information about the patient that might not be coded in the electronic health record. By examining repeat intervals in aggregate over large numbers of patients, we show that it is possible to 1) determine what laboratory test results physicians consider "normal", 2) identify subpopulations of patients that deviate from the norm, and 3) identify situations where laboratory tests are over-ordered. We used laboratory tests as just one example of how physician group intelligence can be used to support evidence-based medicine in a way that is automated and continually updated.
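
    The repeat-interval measurement described above reduces to grouping orders by patient and test and differencing consecutive order dates (a sketch with invented records and field names, not the study's code):

```python
# Repeat-interval sketch: for each (patient, test) pair, sort the order
# dates and take differences between consecutive orders, in days.

from datetime import date

def repeat_intervals(orders):
    """orders: iterable of (patient_id, test_name, order_date) tuples."""
    by_key = {}
    for patient, test, day in orders:
        by_key.setdefault((patient, test), []).append(day)
    intervals = {}
    for key, days in by_key.items():
        days.sort()
        intervals[key] = [(b - a).days for a, b in zip(days, days[1:])]
    return intervals

orders = [
    ("p1", "HbA1c", date(2009, 1, 1)),
    ("p1", "HbA1c", date(2009, 4, 1)),
    ("p1", "HbA1c", date(2009, 7, 1)),
]
print(repeat_intervals(orders))  # {('p1', 'HbA1c'): [90, 91]}
```

    Aggregating such intervals over many patients, conditioned on the preceding test result, is what lets the study infer which result values physicians collectively treat as "normal".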

  2. A Critical Ear: Analysis of Value Judgments in Reviews of Beethoven's Piano Sonata Recordings.

    Science.gov (United States)

    Alessandri, Elena; Williamson, Victoria J; Eiholzer, Hubert; Williamon, Aaron

    2016-01-01

    What sets a great music performance apart? In this study, we addressed this question through an examination of value judgments in written criticism of recorded performance. One hundred reviews of recordings of Beethoven's piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics' statements and the evaluation criteria that underpinned their judgments. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording.

  3. Simultaneous Recording and Analysis of Uterine and Abdominal Muscle Electromyographic Activity in Nulliparous Women During Labor.

    Science.gov (United States)

    Qian, Xueya; Li, Pin; Shi, Shao-Qing; Garfield, Robert E; Liu, Huishu

    2017-03-01

To record and characterize electromyography (EMG) from the uterine and abdominal muscles from nonlabor through the first and second stages of labor, and to define relationships to contractions. Nulliparous patients without any treatments were studied (n = 12 in nonlabor, 48 during the first stage and 33 during the second stage). EMG of both uterine and abdominal muscles was recorded simultaneously from electrodes placed on the patients' abdominal surface, using filters to separate uterine and abdominal EMG. Contractions of the muscles were also recorded using tocodynamometry. EMG was characterized by analysis of various parameters. During the first stage of labor, when abdominal EMG is absent, uterine EMG bursts temporally correspond to contractions. In the second stage, uterine EMG bursts usually occur at the same frequency as groups of abdominal bursts and precede them, whereas abdominal EMG bursts correspond to contractions and are accompanied by the feeling of an "urge to push." Uterine EMG increases progressively from nonlabor to the second stage of labor. (1) Uterine EMG activity can be separated from abdominal EMG events by filtering. (2) Uterine EMG gradually evolves from the antepartum stage through the first and second stages of labor. (3) Uterine and abdominal EMG reflect the electrical activity of the muscles during labor and are valuable for assessing the uterine and abdominal muscle events that control labor. (4) During the first stage of labor, uterine EMG is responsible for contractions; during the second stage, both uterine and abdominal muscles participate in labor.
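The abstract does not specify the filters used, but the idea of separating a slow "uterine-like" component from a faster "abdominal-like" one by frequency can be illustrated with a toy moving-average low-pass filter on a synthetic two-tone signal:

```python
import math

def moving_average(signal, window):
    """Crude low-pass filter: centered moving average."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

fs = 100.0  # Hz
t = [k / fs for k in range(1000)]
slow = [math.sin(2 * math.pi * 0.5 * x) for x in t]         # "uterine-like", 0.5 Hz
fast = [0.5 * math.sin(2 * math.pi * 20.0 * x) for x in t]  # "abdominal-like", 20 Hz
mixed = [a + b for a, b in zip(slow, fast)]

low = moving_average(mixed, window=25)      # recovers the slow component
high = [m - l for m, l in zip(mixed, low)]  # residual: the fast component
```

In practice, dedicated band-pass filters applied to the real multichannel EMG would replace this toy moving average; the point is only that the two burst types live in separable frequency bands.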

  4. Personal dose analysis of TLD glow curve data from individual monitoring records

    International Nuclear Information System (INIS)

    Adjei, D.; Darko, E. O.; Schandorf, C.; Owusu-Manteaw, P.; Akrobortu, E.

    2012-01-01

Radiation exposure of workers in Ghana has been estimated on the basis of personal dose records of occupationally exposed workers in medical, industrial and research/teaching practices for the period 2008-09. The estimated effective doses for 2008 are 0.400, 0.495 and 0.426 mSv for medical, industrial and research/teaching practices, respectively. The corresponding collective effective doses are 0.128, 0.044 and 0.017 person-Sv, respectively. Similarly, the effective doses recorded in 2009 are 0.448, 0.545 and 0.388 mSv, respectively, with corresponding collective effective doses of 0.108, 0.032 and 0.012 person-Sv. The study shows that occupational exposure in Ghana is skewed toward the lower doses (between 0.001 and 0.500 mSv). A statistical analysis of the effective doses showed no significant difference at p < 0.05 among the means of the effective doses recorded in the various practices. (authors)
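Assuming the collective effective dose is the simple product of the mean effective dose and the number of monitored workers, the monitored population behind each figure can be back-calculated from the reported values:

```python
def workers_monitored(mean_dose_msv, collective_person_sv):
    """Back-calculate the number of monitored workers from a mean
    effective dose (mSv) and a collective effective dose (person-Sv)."""
    return collective_person_sv / (mean_dose_msv / 1000.0)

# Medical practice, 2008: 0.400 mSv mean, 0.128 person-Sv collective.
n_medical_2008 = round(workers_monitored(0.400, 0.128))  # -> 320 workers
```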

  5. Quantifying motion in video recordings of neonatal seizures by feature trackers based on predictive block matching.

    Science.gov (United States)

    Karayiannis, N B; Sami, A; Frost, J D; Wise, M S; Mizrahi, E M

    2004-01-01

    This work introduces predictive block matching, a method developed to track motion in video by exploiting the advantages of block motion estimation and adaptive block matching. The proposed method relies on a pure translation motion model to estimate the displacement of a block between two successive video frames before initiating the search for the best match of the block tracked throughout the frame sequence. The search for the best match relies on adaptive block matching, which employs an update strategy based on Kalman filtering to account for the changing appearance of the block. Predictive block matching was used to extract motor activity signals from video recordings of neonatal seizures.
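The predict-then-search structure of the method can be sketched as follows. The Kalman-filter template update is omitted, and the sum of absolute differences (SAD) is assumed as the matching cost; both simplifications are mine, not details from the paper.

```python
def sad(frame, template, top, left):
    """Sum of absolute differences between a template and a frame patch."""
    h, w = len(template), len(template[0])
    return sum(abs(frame[top + r][left + c] - template[r][c])
               for r in range(h) for c in range(w))

def predictive_match(frame, template, predicted, radius=2):
    """Search for the template around a position predicted by the motion model.

    `predicted` is the (row, col) estimated from the previous displacement;
    the best SAD match within `radius` of it is returned.
    """
    best, best_pos = float("inf"), predicted
    pr, pc = predicted
    for r in range(pr - radius, pr + radius + 1):
        for c in range(pc - radius, pc + radius + 1):
            if (0 <= r <= len(frame) - len(template)
                    and 0 <= c <= len(frame[0]) - len(template[0])):
                score = sad(frame, template, r, c)
                if score < best:
                    best, best_pos = score, (r, c)
    return best_pos

# Toy frame: a bright 2x2 block at (4, 5) in an 8x8 image.
frame = [[0] * 8 for _ in range(8)]
for r, c in [(4, 5), (4, 6), (5, 5), (5, 6)]:
    frame[r][c] = 9
template = [[9, 9], [9, 9]]
found = predictive_match(frame, template, predicted=(3, 4))  # -> (4, 5)
```

Restricting the search to a small window around the model's prediction is what keeps the per-frame cost low compared with an exhaustive block search.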

  6. Polymer SU-8 Based Microprobes for Neural Recording and Drug Delivery

    Science.gov (United States)

    Altuna, Ane; Fernandez, Luis; Berganzo, Javier

    2015-06-01

This manuscript reflects on SU-8-based microprobes for neural activity recording and drug delivery. By taking advantage of improvements in microfabrication technologies and using the polymer SU-8 as the only structural material, we developed several microprobe prototypes aimed to: a) minimize injury to neural tissue, b) obtain high-quality electrical signals and c) deliver drugs at a micrometer precision scale. Dedicated packaging tools have been developed in parallel to fulfill requirements concerning electric and fluidic connections, size and handling. Now that these advances have been experimentally proven in the brain using in vivo preparations, the technological concepts developed during consecutive prototypes are discussed in depth.

  7. Designing ETL Tools to Feed a Data Warehouse Based on Electronic Healthcare Record Infrastructure.

    Science.gov (United States)

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L

    2015-01-01

Aim of this paper is to propose a methodology to design Extract, Transform and Load (ETL) tools in a clinical data warehouse architecture based on the Electronic Healthcare Record (EHR). This approach takes advantage of this infrastructure as one of the main sources of information to feed the data warehouse, taking into account that clinical documents produced by heterogeneous legacy systems are structured using the HL7 CDA standard. This paper describes the main activities to be performed to map the information collected in the different types of documents to the dimensional model primitives.
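As an illustration only (hypothetical field names, not the HL7 CDA mapping itself), an ETL load step that turns a parsed clinical document into star-schema primitives might look like:

```python
# Dimension table: natural patient id -> surrogate key.
patients = {}

def load_document(doc, facts):
    """Map one parsed clinical document onto a dimension lookup plus a fact row."""
    key = patients.setdefault(doc["patient_id"], len(patients) + 1)
    facts.append({"patient_key": key,
                  "code": doc["loinc"],
                  "value": doc["value"],
                  "date": doc["date"]})

facts = []
load_document({"patient_id": "P-7", "loinc": "2345-7",
               "value": 5.4, "date": "2015-02-10"}, facts)
```

The essential ETL moves are visible even in this sketch: resolve natural identifiers to surrogate keys in the dimensions, then emit one fact row per clinical observation.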

  9. A Coral-based Climate Record from the Western Pacific Warm Pool

    Science.gov (United States)

    Quinn, T. M.; Taylor, F. W.; Crowley, T. J.; Stephans, C.

    2002-12-01

The Western Pacific Warm Pool (WPWP) serves as a heat engine for Earth's climate and as a major moisture source for its hydrological cycle. Thermal and hydrologic variations in the WPWP are intimately involved with ENSO variations on the interannual timescale, but the role of these variations on decadal to century timescales remains poorly understood because of the paucity of subannually resolved climate and paleoclimate time series from the WPWP. Coral-based proxy records of thermal and hydrologic variations in the WPWP offer a great opportunity to extend the instrumental record and address the modes and mechanisms of tropical climate variability on decadal to century timescales. Coral-based climate records have been exploited in other regions of the tropical oceans, yet such records are rare from the WPWP. Herein we report the initial results of a stable isotopic and elemental ratio study of a ~1.8 m Porites coral head recovered in ~8 m of water offshore of Rabaul, East New Britain, Papua New Guinea (4°S, 152°E) in September 1998. Rabaul is a site of active volcanism and has had major eruptive episodes in 1998, 1994, 1943-1937, 1878, 1791 and 1767. Rabaul is located within the 29°C contour of the mean annual SST field of the WPWP, and the seawaters surrounding it experience <1°C seasonal range in SST. In contrast, there is a 1 psu seasonal range in SSS. Average annual rainfall exceeds 2 m per year. X-radiography reveals readily discernable growth bands and we estimate an average extension rate of 10 mm/yr. The coral slab was sampled every 0.625 mm, yielding an average sample resolution of 16 samples per year. Coral powder was divided into two samples: one for oxygen and carbon isotopic determinations and one for Sr/Ca ratio determinations. Our initial stable isotope results indicate the existence of a robust annual cycle in addition to large isotopic excursions in 1994, likely the result of the large volcanic event of that year. Stable isotope data acquisition
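The quoted sampling resolution follows directly from the extension rate and the sampling interval reported in the abstract:

```python
# 10 mm/yr of skeletal extension, sampled every 0.625 mm along the growth axis.
extension_rate_mm_per_yr = 10.0
sampling_interval_mm = 0.625
samples_per_year = extension_rate_mm_per_yr / sampling_interval_mm  # -> 16.0
```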

  10. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
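The two quantities reported above, volume-weighted coverage and the skew of the code distribution, can be computed as follows; the toy prescription list is illustrative, not study data.

```python
from collections import Counter

def coverage(prescriptions, dkb_codes):
    """Share of prescription volume whose product code a DKB covers."""
    covered = sum(1 for code in prescriptions if code in dkb_codes)
    return covered / len(prescriptions)

def codes_for_half_volume(prescriptions):
    """Fraction of distinct codes accounting for 50% of message volume."""
    counts = Counter(prescriptions)
    ranked = sorted(counts.values(), reverse=True)
    running, needed = 0, 0
    for n in ranked:
        running += n
        needed += 1
        if running >= len(prescriptions) / 2:
            break
    return needed / len(counts)

rx = ["A"] * 50 + ["B"] * 30 + ["C"] * 15 + ["D"] * 5
cov = coverage(rx, {"A", "B", "C"})     # -> 0.95
skew = codes_for_half_volume(rx)        # -> 0.25
```

Counting by prescription volume rather than by distinct code is what makes the skew matter: mapping the handful of high-volume codes already covers most messages.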

  11. A Satellite-Based Surface Radiation Climatology Derived by Combining Climate Data Records and Near-Real-Time Data

    Directory of Open Access Journals (Sweden)

    Bodo Ahrens

    2013-09-01

Full Text Available This study presents a method for adjusting long-term climate data records (CDRs) for integrated use with near-real-time data, using the example of surface incoming solar irradiance (SIS). Recently, a 23-year (1983–2005) continuous SIS CDR was generated based on the visible channel (0.45–1 μm) of the MVIRI radiometers onboard the geostationary Meteosat First Generation platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF). Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions, consisting of MVIRI-derived SIS (1983–2005) and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards), were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present) was applied that combines the Standard Normal Homogeneity Test (SNHT) and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the ground-station mean, taking statistical significance into account. Several stations of the Baseline Surface Radiation Network (BSRN) and about 50 stations of the Global Energy Balance Archive (GEBA) over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994, probably due to artefacts in the raw data and instrument failures. After 2005, the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers, and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations. Homogenisation is performed by applying a mean-shift correction depending on the shift size of
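A minimal version of the SNHT shift statistic used in the break detection can be sketched as follows (single-break form; the penalized maximal T-test and the visual inspection steps are not shown):

```python
from statistics import mean, stdev

def snht_statistic(series):
    """SNHT shift-detection statistic T_max and the most likely break index.

    The series is standardized; for every candidate break point k,
    T(k) = k * mean(z[:k])**2 + (n - k) * mean(z[k:])**2,
    and the maximum over k flags the most likely mean shift.
    """
    n = len(series)
    mu, sigma = mean(series), stdev(series)
    z = [(x - mu) / sigma for x in series]
    best_t, best_k = 0.0, 0
    for k in range(1, n):
        z1, z2 = mean(z[:k]), mean(z[k:])
        t = k * z1 * z1 + (n - k) * z2 * z2
        if t > best_t:
            best_t, best_k = t, k
    return best_t, best_k

# A series with an artificial mean shift after index 10.
data = [10.0] * 10 + [12.0] * 10
t_max, k = snht_statistic(data)  # break correctly located at k = 10
```

In the study's setting, `series` would be the difference between the satellite SIS record and the ground-station mean, so that a detected break marks an inhomogeneity in the satellite record rather than a real climate signal.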

  12. Acoustic analysis of snoring sounds recorded with a smartphone according to obstruction site in OSAS patients.

    Science.gov (United States)

    Koo, Soo Kweon; Kwon, Soon Bok; Kim, Yang Jae; Moon, J I Seung; Kim, Young Jun; Jung, Sung Hoon

    2017-03-01

Snoring is a sign of increased upper airway resistance and is the most common symptom suggestive of obstructive sleep apnea. Acoustic analysis of snoring sounds is a non-invasive diagnostic technique and may provide a screening test that can determine the location of obstruction sites. We recorded snoring sounds according to obstruction level, as determined by DISE, using a smartphone, and focused on the analysis of formant frequencies. The study group comprised 32 male patients (mean age 42.9 years). The spectrogram pattern, intensity (dB), fundamental frequency (F0), and formant frequencies (F1, F2, and F3) of the snoring sounds were analyzed for each subject. On spectrographic analysis, retropalatal-level obstruction tended to produce sharp and regular peaks, while retrolingual-level obstruction tended to show peaks with a gradual onset and decay. On formant frequency analysis, F1 (retropalatal vs. retrolingual level: 488.1 ± 125.8 vs. 634.7 ± 196.6 Hz) and F2 (retropalatal vs. retrolingual level: 1267.3 ± 306.6 vs. 1723.7 ± 550.0 Hz) were significantly higher for retrolingual-level obstruction than for retropalatal-level obstruction. Acoustic analysis of snoring is a non-invasive diagnostic technique that can be applied easily and at relatively low cost. The analysis of formant frequencies may be a useful screening test for the prediction of obstruction sites. Moreover, a smartphone can be effective for recording snoring sounds.
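As a sketch of how the reported formant differences could drive a screening test, here is a nearest-centroid rule built from the mean (F1, F2) values in the abstract; this classifier is my illustration, not the authors' method:

```python
# Mean (F1, F2) values in Hz, as reported in the abstract.
CENTROIDS = {
    "retropalatal": (488.1, 1267.3),
    "retrolingual": (634.7, 1723.7),
}

def classify_obstruction(f1, f2):
    """Assign a snore to the obstruction site with the nearest (F1, F2) mean."""
    def dist(c):
        return ((f1 - c[0]) ** 2 + (f2 - c[1]) ** 2) ** 0.5
    return min(CENTROIDS, key=lambda k: dist(CENTROIDS[k]))

site = classify_obstruction(500, 1300)  # -> 'retropalatal'
```

A real screening tool would need the within-class spreads (the reported standard deviations are large) and a validation set, but the centroid rule shows why higher F1/F2 points toward retrolingual obstruction.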

  13. Comparison of experimental approaches to study selective properties of thick phase-amplitude holograms recorded in materials with diffusion-based formation mechanisms

    Science.gov (United States)

    Borisov, Vladimir; Klepinina, Mariia; Veniaminov, Andrey; Angervaks, Aleksandr; Shcheulin, Aleksandr; Ryskin, Aleksandr

    2016-04-01

Volume holographic gratings, both transmission- and reflection-type, may be employed as one-dimensional photonic crystals. More complex two- and three-dimensional holographic photonic-crystalline structures can be recorded using several properly organized beams. Compared to colloidal photonic crystals, their holographic counterparts minimize distortions caused by multiple inner boundaries of the media. Unfortunately, it is still difficult to analyze the spectral response of holographic structures. This work presents the results of an analysis of thick holographic gratings based on the approximation of spectral-angular selectivity contours. The gratings were recorded in an additively colored fluorite crystal and in a glassy polymer doped with phenanthrenequinone (PQ-PMMA). The two materials, known as promising candidates for 3D diffraction optics including photonic crystals, employ diffusion-based mechanisms of grating formation. The surfaces of spectral-angular selectivity were obtained in a single scan using a white-light LED, a rotatable table and a matrix spectrometer. The data, expressed as 3D plots, allow easy visual estimation of the grating's phase/amplitude nature, nonlinearity of recording, etc., and provide sufficient information for numerical analysis. The grating recorded in the crystal was found to be a mixed phase-amplitude grating, with different contributions of refractive-index and absorbance modulation at different wavelengths, and demonstrated three diffraction orders corresponding to its three spatial harmonics, which originate from the intrinsically nonlinear diffusion-drift recording mechanism. In contrast, the grating in the polymeric medium appeared to be purely phase and linearly recorded.

  14. Analysis of physiotherapy documentation of patients' records and discharge plans in a tertiary hospital

    Directory of Open Access Journals (Sweden)

    Olajide A Olawale

    2015-01-01

Full Text Available Background and Objective: Accurate documentation promotes continuity of care and facilitates dissemination of information concerning the patient to all members of the health care team. This study was designed to analyze the pattern of physiotherapy documentation of the patients' records and discharge plans in a tertiary hospital in Lagos, Nigeria. Materials and Methods: A total of 503 case files from the four units of the Physiotherapy Department of the hospital were examined for accuracy of records. The D-Catch instrument was used to quantify the accuracy of record structure, admission data, physiotherapy examination, physiotherapy diagnosis, patients' prognoses based on the plan of care, physiotherapy intervention, progress and outcome evaluation, legibility, and discharge/discontinuation plan. Results: The "accuracy of legibility" domain had the highest accuracy score: 401 (79.72%) case files had an accuracy score of 4. The domain "accuracy of the discharge/discontinuation summary" had the lowest accuracy score: 502 (99.80%) case files had an accuracy score of 1. Conclusion: Documentation of the plan of care made in the hospital for the period of this study did not fully conform to the guidelines of the World Confederation for Physical Therapy (WCPT). The accuracy of physiotherapy documentation needs to be improved in order to promote optimal continuity of care, improve efficiency and quality of care, and recognize patients' needs. Implementation and use of electronically produced documentation might help physiotherapists to organize their notes more accurately.

  15. An Internet-Based Real-Time Audiovisual Link for Dual MEG Recordings.

    Directory of Open Access Journals (Sweden)

    Andrey Zhdanov

Full Text Available Most neuroimaging studies of human social cognition have focused on brain activity of single subjects. More recently, "two-person neuroimaging" has been introduced, with simultaneous recordings of brain signals from two subjects involved in social interaction. These simultaneous "hyperscanning" recordings have already been carried out with a spectrum of neuroimaging modalities, such as functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and functional near-infrared spectroscopy (fNIRS). We have recently developed a setup for simultaneous magnetoencephalographic (MEG) recordings of two subjects who communicate in real time over an audio link between two geographically separated MEG laboratories. Here we present an extended version of the setup, in which we have added a video connection and replaced the telephone-landline-based link with an Internet connection. Our setup enabled transmission of video and audio streams between the sites with a one-way communication latency of about 130 ms. Our software that allows reproducing the setup is publicly available. We demonstrate that the audiovisual Internet-based link can mediate real-time interaction between two subjects who try to mirror each other's hand movements seen via the video link. All nine pairs were able to synchronize their behavior. In addition to the video, we captured the subjects' movements with accelerometers attached to their index fingers; from these signals we determined that the average synchronization accuracy was 215 ms. In one subject pair we demonstrate inter-subject coherence patterns of the MEG signals that peak over the sensorimotor areas contralateral to the hand used in the task.
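A synchronization delay like the reported 215 ms can be estimated by cross-correlating the two accelerometer traces; the code below is a generic sketch of that idea (synthetic pulse signals, 1 kHz sampling assumed), not the authors' pipeline.

```python
def xcorr_lag(a, b, max_lag):
    """Lag of b relative to a (in samples) that maximizes cross-correlation."""
    best_lag, best = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        s = sum(a[i] * b[i + lag] for i in range(len(a)) if 0 <= i + lag < len(b))
        if s > best:
            best, best_lag = s, lag
    return best_lag

# Leader: one movement pulse; follower: the same pulse 215 samples later.
sig = [0.0] * 1000
for k in range(21):
    sig[300 + k] = 1.0 - abs(k - 10) / 10.0  # triangular acceleration pulse
follower = [0.0] * 215 + sig[:-215]
lag_ms = xcorr_lag(sig, follower, max_lag=400)  # 1 sample = 1 ms at 1 kHz
```

The cross-correlation peak locates the imitator's delay; averaging such lags across movements and pairs yields a synchronization-accuracy figure of the kind quoted above.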

  16. Constraint-Based Time-Scale Modification of Music Recordings for Noise Beautification

    Directory of Open Access Journals (Sweden)

    Meinard Müller

    2018-03-01

Full Text Available In magnetic resonance imaging (MRI), a patient is exposed to beat-like knocking sounds, often interrupted by periods of silence, which are caused by pulsing currents of the MRI scanner. In order to increase the patient’s comfort, one strategy is to play back ambient music to induce positive emotions and to reduce stress during the MRI scanning process. To create an overall acceptable acoustic environment, one idea is to adapt the music to the locally periodic acoustic MRI noise. Motivated by this scenario, we consider in this paper the general problem of adapting a given music recording to fulfill certain temporal constraints. More concretely, the constraints are given by a reference time axis with specified time points (e.g., the time positions of the MRI scanner’s knocking sounds). Then, the goal is to temporally modify a suitable music recording such that its beat positions align with the specified time points. As one technical contribution, we model this alignment task as an optimization problem with the objective to fulfill the constraints while avoiding strong local distortions in the music. Furthermore, we introduce an efficient algorithm based on dynamic programming for solving this task. Based on the computed alignment, we use existing time-scale modification procedures for locally adapting the music recording. To illustrate the outcome of our procedure, we discuss representative synthetic and real-world examples, which can be accessed via an interactive website. In particular, these examples indicate the potential of automated methods for noise beautification within the MRI application scenario.
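The optimization at the heart of the method, aligning beat positions to prescribed time points while keeping distortion low, can be sketched with a small dynamic program. The cost used here (total absolute shift per beat) is a simplification of the paper's objective, chosen for clarity:

```python
def align_beats(beats, targets):
    """Monotone alignment of beat positions to constraint time points.

    Chooses a strictly increasing subset of beats, one per target,
    minimizing the total absolute shift each chosen beat must undergo.
    Returns the list of chosen beat indices.
    """
    n, m = len(beats), len(targets)
    INF = float("inf")
    # cost[i][j]: best cost aligning targets[:j+1], with beat i matched to target j.
    cost = [[INF] * m for _ in range(n)]
    back = [[-1] * m for _ in range(n)]
    for i in range(n):
        cost[i][0] = abs(beats[i] - targets[0])
    for j in range(1, m):
        for i in range(j, n):
            for p in range(j - 1, i):
                c = cost[p][j - 1] + abs(beats[i] - targets[j])
                if c < cost[i][j]:
                    cost[i][j], back[i][j] = c, p
    end = min(range(m - 1, n), key=lambda i: cost[i][m - 1])
    path = [end]
    for j in range(m - 1, 0, -1):
        path.append(back[path[-1]][j])
    return path[::-1]

beats = [0.0, 0.5, 1.1, 1.4, 2.05, 2.6, 3.0]
targets = [0.0, 1.0, 2.0, 3.0]  # e.g., knocking-sound positions
chosen = align_beats(beats, targets)  # -> [0, 2, 4, 6]
```

Once the beat-to-constraint assignment is fixed, a time-scale modification procedure stretches or compresses each inter-beat segment so the chosen beats land exactly on the constraint times.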

  17. [Desmoid fibromatosis in absorption infrared spectroscopy, emission spectral analysis and roentgen diffraction recording].

    Science.gov (United States)

    Zejkan, A; Bejcek, Z; Horejs, J; Vrbová, H; Bakosová, M; Macholda, F; Rykl, D

    1989-10-01

The authors present results of serial qualitative and quantitative microanalyses of bone specimens and dental tissue specimens from a patient with desmoid fibromatosis. Methods of absorption spectroscopy, emission spectral analysis and X-ray diffraction analysis, with follow-up X-ray examination, are tested. The above-mentioned methods operate in an on-line system by means of a specially adjusted monitor unit controlled centrally by the computer processor system. The whole process of measurement is fully automated, and the data obtained are recorded and processed in the unit data structure, classified into index-sequence blocks of data. Serial microanalyses offer exact data for the study of structural changes of dental and bone tissues, which manifest themselves on the order of crystal-lattice shifts. They prove that microanalyses give new possibilities in the detection and interpretation of chemical and structural changes of the apatite cell.

  18. An analysis of concert saxophone vibrato through the examination of recordings by eight prominent soloists

    Science.gov (United States)

    Zinninger, Thomas

This study examines concert saxophone vibrato through the analysis of several recordings of standard repertoire by prominent soloists. The vibrato of Vincent Abato, Arno Bornkamp, Claude Delangle, Jean-Marie Londeix, Marcel Mule, Otis Murphy, Sigurd Rascher, and Eugene Rousseau is analyzed with regard to rate, extent, shape, and discretionary use. Examination of these parameters was conducted through both general observation and precise measurements with the aid of a spectrogram. Statistical analyses of the results reveal tendencies in overall vibrato use, as well as the effects of certain musical attributes (note length, tempo, dynamic, range) on vibrato. The results of this analysis are also compared among the soloists and against pre-existing theories and findings in vibrato research.
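Rate and extent, two of the parameters measured, can be estimated from a pitch track in a straightforward way; the zero-crossing estimator and the synthetic track below are illustrative assumptions, not the study's measurement procedure:

```python
import math

def vibrato_rate_extent(pitch_hz, fs):
    """Estimate vibrato rate (Hz) and extent (cents) from a pitch track.

    Rate: zero crossings of the mean-removed contour, halved, per second.
    Extent: peak deviation from the mean pitch, converted to cents.
    """
    mean_f = sum(pitch_hz) / len(pitch_hz)
    dev = [f - mean_f for f in pitch_hz]
    crossings = sum(1 for a, b in zip(dev, dev[1:]) if a * b < 0)
    rate = crossings / 2.0 / (len(pitch_hz) / fs)
    peak = max(abs(d) for d in dev)
    extent_cents = 1200 * math.log2((mean_f + peak) / mean_f)
    return rate, extent_cents

# Synthetic track: 440 Hz carrier, 5.5 Hz vibrato, +/- 20 cents, 100 Hz frames.
fs = 100.0
track = [440.0 * 2 ** ((20 / 1200) * math.sin(2 * math.pi * 5.5 * k / fs))
         for k in range(200)]
rate, extent = vibrato_rate_extent(track, fs)
```

On real recordings the pitch track would come from a spectrogram-based pitch estimator, and shape would require looking at the waveform of the deviation itself rather than these two summary numbers.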

  19. Combined Analysis of GRIDICE and BOSS Information Recorded During CMS-LCG0 Production

    CERN Document Server

    Coviello, Tommaso; Donvito, Giacinto; Maggi, Giorgio; Maggi, Marcello; Pierro, A

    2004-01-01

Input parameters needed for a simulation of the CMS data analysis computing model were determined by a combined analysis of the data stored by the GridICE and BOSS monitoring systems from July to October 2003, when two million events were simulated for the CMS data challenge 2004 in a Grid distributed environment on a dedicated CMS testbed (CMS-LCG0). During the production, the two monitoring systems recorded complementary information for each submitted job. In particular, by integrating data from both monitoring system databases, measurements of the job execution performance on the different processors used in the CMS-LCG0 testbed were obtained. First results on the simulation of the CMS-LCG0 testbed using the Ptolemy program are also reported.

  20. An iPad and Android-based Application for Digitally Recording Geologic Field Data

    Science.gov (United States)

    Malinconico, L. L.; Sunderlin, D.; Liew, C.; Ho, A. S.; Bekele, K. A.

    2011-12-01

Field experience is a significant component in most geology courses, especially sed/strat and structural geology. Increasingly, the spatial presentation, analysis and interpretation of geologic data are done using digital methodologies (GIS, Google Earth, stereonet and spreadsheet programs). However, students and professionals continue to collect field data manually on paper maps and in the traditional "orange field notebooks". Upon returning from the field, data are then manually transferred into digital formats for processing, mapping and interpretation. The transfer process is both cumbersome and prone to transcription error. In conjunction with the computer science department, we are in the process of developing an application (App) for iOS (the iPad) and Android platforms that can be used to digitally record data measured in the field. This is not a mapping program, but rather a way of bypassing the field-book step to acquire digital data directly that can then be used in various analysis and display programs. Currently, the application allows the user to select from five different structural data situations: contact, bedding, fault, joints and "other". The user can define a folder for the collection and separation of data for each project. Observations are stored as individual records of field measurements in each folder. The exact information gathered depends on the nature of the observation, but common to all pages is the ability to log date, time, and lat/long directly from the tablet. Information like strike and dip is entered using scroll wheels, and formation names are also entered using scroll wheels that access easy-to-modify lists of the area's stratigraphic units. This ensures uniformity in the creation of the digital records from day to day and across field teams. Pictures linked to each record can also be taken using the tablet's camera. Once field collection is complete, the data (including images) can be easily exported to a .csv file.
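The export step might look like the following sketch, using Python's csv module; the field names and sample records are hypothetical, not the app's actual schema:

```python
import csv
import io

# Hypothetical structural-geology records as the app might store them.
records = [
    {"type": "bedding", "strike": 120, "dip": 35, "lat": 40.69, "lon": -75.21,
     "formation": "Martinsburg", "time": "2011-10-02T14:31"},
    {"type": "joint", "strike": 80, "dip": 88, "lat": 40.70, "lon": -75.20,
     "formation": "Allentown", "time": "2011-10-02T15:05"},
]

# Write a header row plus one row per observation.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(records[0]))
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()
```

Because every record carries the same named fields, the resulting .csv loads directly into spreadsheet, GIS, or stereonet software without the manual transcription step the paragraph describes.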

  1. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

In this study, bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, as well as their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, ENQRS, are used as variables. A bivariable index that combines these parameters is defined. The proposed technique allows evaluation of the risk of ventricular tachycardia in post-myocardial-infarction patients. The results show that the bivariable index discriminates between the population of patients with ventricular tachycardia and the subjects of the control group. It was also found that the bivariable technique performs well as a diagnostic test. It is concluded that, as a diagnostic test, the bivariable technique is superior to the time-domain method and the time-frequency technique evaluated individually.

  2. Time-Dependent Probabilistic Seismic Hazard Analysis Using the Simulated Records, the Case of Tehran

    Directory of Open Access Journals (Sweden)

    Babak Hajimohammadi

    2015-03-01

Full Text Available Common attenuation equations are developed from seismic records of earthquakes that have happened so far. Although many data have been recorded during the last 50 years, it is not possible to consider all possible wave-propagation paths, site types and fault-rupture mechanisms in classical attenuation relations. This fact becomes more serious in near-field cases, and a common shortcoming of most attenuation equations is their low accuracy in estimating near-field parameters. Many important cities of the world, such as Tehran, are located near active faults. For example, the North Tehran Fault is a seismic source close to the Tehran metropolitan area and can be considered a near-field source. Therefore, it is necessary to evaluate near-field effects in most hazard analyses, risk-management programs, structural designs, etc. In the past, it was routine to use attenuation equations in hazard analyses. In this project, to avoid the insufficient performance of attenuation equations in the near field, the simulated records proposed by Zafarani et al. (2012) were used directly in the hazard analysis, without converting them into attenuation equations. Besides, time-dependent hazard analysis (a non-Poissonian model) was used to take into account the probable seismic activity of the North Tehran Fault.
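For contrast with the time-dependent (non-Poissonian) model used here, the classical time-independent hazard formulation computes exceedance probability from a constant annual rate:

```python
import math

def poisson_exceedance(annual_rate, exposure_years):
    """P(at least one exceedance) under the time-independent Poisson model."""
    return 1.0 - math.exp(-annual_rate * exposure_years)

# A 475-year return period over a 50-year exposure gives the familiar ~10%.
p = poisson_exceedance(1 / 475, 50)
```

A non-Poissonian (renewal) model replaces the constant rate with one that depends on the time elapsed since the fault's last major event, which is exactly what matters for a source like the North Tehran Fault.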

  3. Frequency Dependent Polarization Analysis of Ambient Seismic Noise Recorded at Broadband Seismometers

    Science.gov (United States)

    Koper, K.; Hawley, V.

    2010-12-01

    Analysis of ambient seismic noise is becoming increasingly relevant to modern seismology. Advances in computational speed and storage have made it feasible to analyze years and even decades of continuous seismic data in short amounts of time. Therefore, it is now possible to perform longitudinal studies of station performance in order to identify degradation or mis-installation of seismic equipment. Long-term noise analysis also provides insight into the evolution of the ocean wave climate, specifically whether the frequency and intensity of storms have changed as global temperatures have changed. Here we present a new approach to polarization analysis of seismic noise recorded by three-component seismometers. Essentially, eigen-decomposition of the 3-by-3 Hermitian spectral matrix associated with a sliding window of data is applied to yield various polarization attributes as a function of time and frequency. This in turn yields fundamental information about the composition of seismic noise, such as the extent to which it is polarized, its mode of propagation, and the direction from which it arrives at the seismometer. The polarization attributes can be viewed as a function of time or binned over 2D frequency-time space to deduce regularities in the ambient noise that are unbiased by transient signals from earthquakes and explosions. We applied the algorithm to continuous data recorded in 2009 by the seismic station SLM, located in central North America. A rich variety of noise sources was observed. At high frequencies (> 3 Hz), Rayleigh-type energy was again dominant, in the form of Rg waves created by nearby cultural activities. Analysis of the time dependence of noise power shows that a frequency range of at least 0.02-1.0 Hz (much larger than the microseism band) is sensitive to annual, meteorologically induced sources of noise. We are currently applying our technique to selected seismometers from USArray and the University of Utah Seismic Network.
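The sliding-window eigen-decomposition described above can be sketched with NumPy. The sub-window averaging, taper, and the degree-of-polarization measure used here (largest eigenvalue over the trace of the spectral matrix) are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def polarization_dop(z, n, e, fs=40.0, nwin=4, nfft=256):
    """Average the 3x3 Hermitian spectral matrix of three-component data
    (Z, N, E) over `nwin` sub-windows, then eigen-decompose it at each
    frequency.  The ratio of the largest eigenvalue to the trace is a
    simple degree-of-polarization measure (~1 for fully polarized noise,
    ~1/3 for unpolarized noise)."""
    data = np.array([z, n, e])
    L = data.shape[1] // nwin
    freqs = np.fft.rfftfreq(nfft, 1.0 / fs)
    S = np.zeros((len(freqs), 3, 3), dtype=complex)
    taper = np.hanning(L)
    for w in range(nwin):
        seg = data[:, w * L:(w + 1) * L] * taper       # tapered sub-window
        spec = np.fft.rfft(seg, nfft)                  # (3, nfreq) spectra
        S += np.einsum('ik,jk->kij', spec, spec.conj())  # v v^H per frequency
    lam = np.linalg.eigvalsh(S)           # real eigenvalues, ascending
    dop = lam[:, -1] / lam.sum(axis=1)
    return freqs, dop
```

Averaging over several sub-windows is essential: the spectral matrix of a single window is rank one, so its dominant-eigenvalue ratio is trivially 1.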

  4. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  5. A technique to stabilize record bases for Gothic arch tracings in patients with implant-retained complete dentures.

    Science.gov (United States)

    Raigrodski, A J; Sadan, A; Carruth, P L

    1998-12-01

    Clinicians have long expressed concern about the accuracy of the Gothic arch tracing for recording centric relation in edentulous patients. With the use of dental implants to assist in retaining complete dentures, the problem of inaccurate recordings, made for patients without natural teeth, can be significantly reduced. This article presents a technique that uses healing abutments to stabilize the record bases so that an accurate Gothic arch tracing can be made.

  6. Eielson Air Force Base operable unit 2 and other areas record of decision

    International Nuclear Information System (INIS)

    Lewis, R.E.; Smith, R.M.

    1994-10-01

    This decision document presents the selected remedial actions and no action decisions for Operable Unit 2 (OU2) at Eielson Air Force Base (AFB), Alaska, chosen in accordance with state and federal regulations. This document also presents the decision that no further action is required for 21 other source areas at Eielson AFB. This decision is based on the administrative record file for this site. OU2 addresses sites contaminated by leaks and spills of fuels. Soils contaminated with petroleum products occur at or near the source of contamination. Contaminated subsurface soil and groundwater occur in plumes on the top of a shallow groundwater table that fluctuates seasonally. These sites pose a risk to human health and the environment because of ingestion, inhalation, and dermal contact with contaminated groundwater. The purpose of this response is to prevent current or future exposure to the contaminated groundwater, to reduce further contaminant migration into the groundwater, and to remediate groundwater

  7. An evaluation of a teaching package constructed using a Web-based lecture recorder

    Directory of Open Access Journals (Sweden)

    Judith Segal

    1997-12-01

    Full Text Available This paper reports on an evaluation of a teaching package constructed using Audiograph, a Web-based lecture recorder developed at the University of Surrey. Audiograph is described in detail in Jesshope and Shafarenko (1997). Its developer aims to provide a medium by which multimedia teaching packages, based on traditional university lectures, may be developed rapidly by the lecturer(s) concerned (as opposed to professional CAL developers) at low cost. Audiograph is designed so that development time should only be in the order of two hours for every hour of presentation. Packages developed using Audiograph make much use of audio, which is somewhat unusual (apart from in video clips) in a package not dedicated to Computer-Assisted Language Learning or to addressing learning difficulties associated with vision. They also use text and (some) animation.

  8. Removal of BCG artefact from concurrent fMRI-EEG recordings based on EMD and PCA.

    Science.gov (United States)

    Javed, Ehtasham; Faye, Ibrahima; Malik, Aamir Saeed; Abdullah, Jafri Malin

    2017-11-01

    Simultaneous electroencephalography (EEG) and functional magnetic resonance image (fMRI) acquisitions provide better insight into brain dynamics. Some artefacts due to simultaneous acquisition pose a threat to the quality of the data. One such problematic artefact is the ballistocardiogram (BCG) artefact. We developed a hybrid algorithm that combines features of empirical mode decomposition (EMD) with principal component analysis (PCA) to reduce the BCG artefact. The algorithm does not require extra electrocardiogram (ECG) or electrooculogram (EOG) recordings to extract the BCG artefact. The method was tested with both simulated and real EEG data of 11 participants. From the simulated data, the similarity index between the extracted BCG and the simulated BCG showed the effectiveness of the proposed method in BCG removal. On the other hand, real data were recorded with two conditions, i.e. resting state (eyes closed dataset) and task influenced (event-related potentials (ERPs) dataset). Using qualitative (visual inspection) and quantitative (similarity index, improved normalized power spectrum (INPS) ratio, power spectrum, sample entropy (SE)) evaluation parameters, the assessment results showed that the proposed method can efficiently reduce the BCG artefact while preserving the neuronal signals. Compared with conventional methods, namely, average artefact subtraction (AAS), optimal basis set (OBS) and combined independent component analysis and principal component analysis (ICA-PCA), the statistical analyses of the results showed that the proposed method has better performance, and the differences were significant for all quantitative parameters except for the power and sample entropy. The proposed method does not require any reference signal, prior information or assumption to extract the BCG artefact. It will be very useful in circumstances where the reference signal is not available. Copyright © 2017 Elsevier B.V. All rights reserved.
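A minimal sketch of the PCA stage of such a hybrid is shown below. The EMD stage requires a dedicated package (e.g. PyEMD) and is omitted here, so this reduces to an optimal-basis-set-style subtraction over artefact-locked epochs; the function name, epoch layout, and number of removed components are hypothetical:

```python
import numpy as np

def remove_top_components(epochs, n_remove=2):
    """PCA-style artefact reduction: stack artefact-locked EEG epochs
    (n_epochs x n_samples), remove the ensemble-mean artefact, find the
    principal waveforms of the residual ensemble via SVD, and subtract
    each epoch's projection onto the first `n_remove` of them (the
    stereotyped artefact shapes).  Returns the mean-removed residual."""
    X = epochs - epochs.mean(axis=0)           # remove mean artefact template
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:n_remove]                      # (n_remove, n_samples) waveforms
    return X - (X @ basis.T) @ basis           # residual after subtraction
```

Because the epochs are locked to the artefact rather than to any stimulus, removing the ensemble mean and the dominant components mostly targets the stereotyped artefact while leaving uncorrelated neural activity.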

  9. Investigations of anomalous gravity signals prior to 71 large earthquakes based on a 4-year-long superconducting gravimeter record

    Directory of Open Access Journals (Sweden)

    Dijin Wang

    2017-09-01

    Full Text Available Using continuous 1-Hz sampling time series recorded by a superconducting gravimeter (SG) at Hsinchu station, Taiwan of China, we investigate anomalous gravity signals prior to 71 large earthquakes with moment magnitude larger than 7.0 (Mw 7.0) that occurred between 1 Jan 2008 and 31 Dec 2011. We first evaluate the noise level of the SG records at Hsinchu (HS) station in the microseismic band from 0.05 Hz to 0.1 Hz by computing the power spectral density (PSD) of seismically quiet days, selected on the basis of the RMS of the records. Based on the analysis of the noise level and the spectral features of the seismically quiet SG records at HS station, we detect anomalous gravity signals (AGSs) prior to large earthquakes. We apply the Hilbert-Huang transformation (HHT) to establish the time-frequency-energy paradigms (TFEP) and marginal spectra (MS) of the SG data before the large earthquakes, and the characteristics of the TFEP and MS of the SG data during a typhoon event are also analyzed. By comparison with the spectral characteristics of the SG data during seismically quiet periods, three types of AGSs are found, and the occurrence rate of AGSs before the 71 earthquakes is given for cases with different epicentral distances and focal depths. The statistical results show that 56.3% of all the examined large earthquakes were preceded by AGSs; if we constrain the epicentral distance to be smaller than 3500 km and the focal depth to be less than 300 km, 75.3% of the examined large earthquakes can be associated with AGSs. In particular, we note that for all the large earthquakes that occurred in the Eurasian plate during these four years, precursory AGSs can always be found in the SG data recorded at HS station. Our investigations suggest that the AGSs prior to large earthquakes may be related to focal depth, epicentral distance and location.
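The marginal spectrum used above is the time-frequency energy distribution integrated over time. As a rough illustration of that step only, the sketch below substitutes a sliding windowed FFT for the EMD/Hilbert machinery of the HHT, which needs specialized libraries; window length and step are arbitrary:

```python
import numpy as np

def marginal_spectrum(x, fs, win=256, step=128):
    """Build a simple time-frequency energy distribution with a sliding
    tapered FFT, then integrate the energy over time to obtain the
    marginal spectrum (energy as a function of frequency).  A true HHT
    marginal spectrum would integrate the Hilbert spectrum of the EMD
    intrinsic mode functions instead."""
    taper = np.hanning(win)
    starts = range(0, len(x) - win + 1, step)
    tfe = np.array([np.abs(np.fft.rfft(x[s:s + win] * taper)) ** 2
                    for s in starts])            # (n_windows, n_freqs)
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    return freqs, tfe.sum(axis=0)                # integrate over time
```

Comparing such marginal spectra of pre-earthquake windows against those of seismically quiet days is the kind of contrast the abstract describes.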

  10. Gap analysis between provisional diagnosis and final diagnosis in government and private teaching hospitals: A record-linked comparative study

    Directory of Open Access Journals (Sweden)

    Sudeshna Chatterjee

    2016-01-01

    Full Text Available Aims: 1. To identify the extent of clinical gaps in the context of knowledge, practice and systems. 2. To formulate necessary intervention measures for bridging the gaps. Settings and Design: Comparative, cross-sectional and non-interventional study. Methods and Material: This is a retrospective, record-based study conducted on inpatients (n = 200) of the major disciplines of two teaching hospitals. The major outcome variable was the agreement or disagreement between the final and provisional diagnoses, assessed using ICD-10 criteria. Statistical Analysis Used: Comparative analyses of specific and selective gaps were estimated in terms of percentage (%). Results: The pilot observation showed the existence of gaps between provisional and final diagnoses in both the private and the government institution. Both knowledge and skill gaps were evident among caregivers, and gaps in documentation existed in the medical records. Conclusions: The pilot data may be an eye-opener to public and private governance systems for understanding and revising the process of service planning and service delivery. Necessary intervention measures may be contemplated to enhance the diagnostic skill of doctors for quality hospital care.

  11. Flexible graphene electrode-based organic photovoltaics with record-high efficiency.

    Science.gov (United States)

    Park, Hyesung; Chang, Sehoon; Zhou, Xiang; Kong, Jing; Palacios, Tomás; Gradečak, Silvija

    2014-09-10

    Advancements in the field of flexible high-efficiency solar cells and other optoelectronic devices will strongly depend on the development of electrode materials with good conductivity and flexibility. To address chemical and mechanical instability of currently used indium tin oxide (ITO), graphene has been suggested as a promising flexible transparent electrode but challenges remain in achieving high efficiency of graphene-based polymer solar cells (PSCs) compared to their ITO-based counterparts. Here we demonstrate graphene anode- and cathode-based flexible PSCs with record-high power conversion efficiencies of 6.1 and 7.1%, respectively. The high efficiencies were achieved via thermal treatment of MoO3 electron blocking layer and direct deposition of ZnO electron transporting layer on graphene. We also demonstrate graphene-based flexible PSCs on polyethylene naphthalate substrates and show the device stability under different bending conditions. Our work paves a way to fully graphene electrode-based flexible solar cells using a simple and reproducible process.

  12. Development of Software for dose Records Data Base Access; Programacion para la consulta del Banco de Datos Dosimetricos

    Energy Technology Data Exchange (ETDEWEB)

    Amaro, M.

    1990-07-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was the individual dose follow-up control and the data handling for epidemiological studies. Within the Data Base management scheme, software development to allow searching of individual dose records by external authorised users was undertaken. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records from workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of codes and subroutines developed. (Author) 2 refs.

  13. Early to mid-Miocene palaeoclimate of Antarctica based on terrestrial records

    Science.gov (United States)

    Ashworth, Allan; Lewis, Adam

    2017-04-01

    Paleontological and stratigraphic studies of sites in the Transantarctic Mountains (TAM) are advancing knowledge of the landscape, vegetation and climate that existed immediately before the growth of the modern East Antarctic Ice Sheet. The sites are located in the Friis Hills and the western Olympus Range in the McMurdo Dry Valleys. In both localities, parts of ancient landscapes are preserved on upland surfaces high above modern valley floors. The early to mid-Miocene interval is bracketed by 40Ar/39Ar ages on volcanic ashes of 19.76 ± 0.11 Ma to 13.85 ± 0.03 Ma. Like all glacial records, it is discontinuous, but even so several trends can be detected. The record is one of an evolving glacial system during which ice caps coalesced to form an ice sheet. Initially, small alpine glaciers flowed southwestward toward the continental interior, eroding shallow troughs into granitic bedrock. By the close of the interval, large glaciers flowed eastward from the continental interior to the Ross Sea. The interval was marked by numerous glacial advances and retreats. Tills are matrix-rich, and outwash sands and gravels are ripple-laminated and cross-bedded, typical of deposits associated with wet-based glaciation. The vegetation during the interval was in a dynamic flux, retreating downslope during glacial advances and recolonizing valleys after retreats. Fossils accumulated in peat beds and organic silts representing lacustrine, fluvial and paludal environments. Fossils include diatoms, fungal ascomycetes, pollen and spores, lycopod megaspores, mosses, wood and leaves of Nothofagus (southern beech), fruits of vascular plants, and insect skeletal parts of Diptera (flies) and Coleoptera (beetles). The vegetation was a tundra, initially shrub- and later moss-dominated. During the interval there was a marked decline in biodiversity. Initially, there were 4 species of Nothofagus represented by different leaf types and at least 9 species of vascular plants represented by their seeds. At the close of

  14. Analysis of records of external occupational dose records in Brazil; Analise dos registros de dose ocupacional externa no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Mauricio, Claudia L.P.; Silva, Herica L.R. da, E-mail: claudia@ird.gov.br, E-mail: herica@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ),Rio de Janeiro, RJ (Brazil); Silva, Claudio Ribeiro da, E-mail: claudio@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    Brazil, a continental country with currently more than 150,000 workers under individual monitoring for ionizing radiation, implemented in 1987 a centralized system for the storage of external occupational doses. This database has been improved over the years and is now a web-based information system called the Brazilian External Occupational Dose Management Database System (GDOSE). This paper presents an overview of Brazilian external occupational doses over the years. The estimated annual average effective dose shows a decrease from 2.4 mSv in 1987 to about 0.6 mSv, with a marked reduction from 1987 to 1990. Analyzing by type of controlled practice, medical and dental radiology is the area with the largest number of users of individual monitors (70%), followed by education practices (8%) and industrial radiography (7%). In addition to photon whole-body monitoring, neutron monitors are used in maintenance (36%), reactors (30%) and education (27%), and extremity monitors in education (27%), nuclear medicine (22%) and radiology (19%). In terms of collective dose, the highest values are also found in conventional radiology, but the highest average dose values are those of interventional radiology. Nuclear medicine, R and D and radiotherapy also have average annual effective doses higher than 1 mSv. However, there are some very high dose values registered in GDOSE that give false information; these should be better analyzed in the future. Annual doses above 500 are certainly not realistic. (author)

  15. 76 FR 59681 - Record Hill Wind, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Science.gov (United States)

    2011-09-27

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-4527-000] Record Hill Wind, LLC; Supplemental Notice That Initial Market- Based Rate Filing Includes Request for Blanket Section 204 Authorization This is a supplemental notice in the above-referenced proceeding of Record Hill...

  16. Experimentally derived detection distances from audio recordings and human observers enable integrated analysis of point count data

    Directory of Open Access Journals (Sweden)

    Daniel A. Yip

    2017-06-01

    Full Text Available Point counts are one of the most commonly used methods for assessing bird abundance. Autonomous recording units (ARUs are increasingly being used as a replacement for human-based point counts. Previous studies have compared the relative benefits of human versus ARU-based point count methods, primarily with the goal of understanding differences in species richness and the abundance of individuals over an unlimited distance. What has not been done is an evaluation of how to standardize these two types of data so that they can be compared in the same analysis, especially when there are differences in the area sampled. We compared detection distances between human observers in the field and four commercially available recording devices (Wildlife Acoustics SM2, SM3, RiverForks, and Zoom H1 by simulating vocalizations of various avian species at different distances and amplitudes. We also investigated the relationship between sound amplitude and detection to simplify ARU calibration. We used these data to calculate correction factors that can be used to standardize detection distances of ARUs relative to each other and human observers. In general, humans in the field could detect sounds at greater distances than an ARU although detectability varied depending on species song characteristics. We provide correction factors for four commonly used ARUs and propose methods for calibrating ARUs relative to each other and human observers.
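Because point-count totals scale with the area effectively sampled, a correction factor between two effective detection radii can be sketched as a simple area ratio. This is a first-order assumption (uniform detectability within the radius); the paper's actual factors are derived empirically per device and per species song type:

```python
def area_correction(r_reference_m, r_device_m):
    """Correction factor that puts counts from a device with effective
    detection radius r_device_m on the scale of a reference observer:
    sampled area is pi * r**2, so counts scale with the squared radius."""
    return (r_reference_m / r_device_m) ** 2

def standardize_count(count, r_reference_m, r_device_m):
    """Rescale a device's point-count total to the reference observer's
    sampling area, so the two can enter the same analysis."""
    return count * area_correction(r_reference_m, r_device_m)
```

For example, if a human observer detects a species out to roughly 100 m but an ARU only to 50 m, the ARU count would be scaled up by a factor of four before being pooled with human counts.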

  17. Record Russian river discharge in 2007 and the limits of analysis

    International Nuclear Information System (INIS)

    Shiklomanov, A I; Lammers, R B

    2009-01-01

    The Arctic water cycle has experienced an unprecedented degree of change which may have planetary-scale impacts. The year 2007 in particular not only was unique in terms of minimum sea ice extent in the Arctic Ocean but also was a record breaking year for Eurasian river inflow to the Arctic Ocean. Over the observational period from 1936 to 2006, the mean annual river discharge for the six largest Russian rivers was 1796 km³ yr⁻¹, with the previous record high being 2080 km³ yr⁻¹, in 2002. The year 2007 showed a massive flux of fresh water from these six drainage basins of 2254 km³ yr⁻¹. We investigated the hydroclimatological conditions for such extreme river discharge and found that while that year's flow was unusually high, the overall spatial patterns were consistent with the hydroclimatic trends since 1980, indicating that 2007 was not an aberration but a part of the general trend. We wanted to extend our hydroclimatological analysis of river discharge anomalies to seasonal and monthly time steps; however, there were limits to such analyses due to the direct human impact on the river systems. Using reconstructions of the naturalized hydrographs over the Yenisey basin we defined the limits to analysis due to the effect of reservoirs on river discharge. For annual time steps the trends are less impacted by dam construction, whereas for seasonal and monthly time steps these data are confounded by the two sources of change, and the climate change signals were overwhelmed by the human-induced river impoundments. We offer two solutions to this problem; first, we recommend wider use of algorithms to 'naturalize' the river discharge data and, second, we suggest the identification of a network of existing and stable river monitoring sites to be used for climate change analysis.
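The annual-time-step trend estimation and the 'naturalization' adjustment can be sketched as follows. The least-squares trend is standard; the storage-change sign convention in `naturalize` is an assumption for illustration (real naturalization algorithms also account for evaporation and withdrawals):

```python
import numpy as np

def annual_trend(years, discharge_km3):
    """Least-squares linear trend in annual discharge.
    Returns (slope in km^3/yr per year, intercept)."""
    slope, intercept = np.polyfit(years, discharge_km3, 1)
    return slope, intercept

def naturalize(observed_km3, storage_change_km3):
    """'Naturalize' a regulated flow by adding the change in reservoir
    storage back to the observed discharge (sign convention assumed:
    positive storage change = water retained upstream of the gauge)."""
    return observed_km3 + storage_change_km3
```

On annual series the reservoir term largely cancels (storage is roughly balanced over a year), which is why annual trends are less affected by dams than seasonal or monthly ones.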

  18. Long-term neural recordings using MEMS based moveable microelectrodes in the brain

    Directory of Open Access Journals (Sweden)

    Nathan Jackson

    2010-06-01

    Full Text Available One of the critical requirements of the emerging class of neural prosthetic devices is to maintain good quality neural recordings over long time periods. We report here a novel MEMS (Micro-Electro-Mechanical Systems) based technology that can move microelectrodes in the event of deterioration in neural signal to sample a new set of neurons. Microscale electro-thermal actuators are used to controllably move microelectrodes post-implantation in steps of approximately 9 µm. In this study, a total of 12 moveable microelectrode chips were individually implanted in adult rats. Two of the 12 moveable microelectrode chips were not moved over a period of 3 weeks and were treated as control experiments. During the first three weeks of implantation, moving the microelectrodes led to an improvement in the average SNR from 14.61 ± 5.21 dB before movement to 18.13 ± 4.99 dB after movement across all microelectrodes and all days. However, the average RMS values of noise amplitudes were similar at 2.98 ± 1.22 µV and 3.01 ± 1.16 µV before and after microelectrode movement. Beyond three weeks, the primary observed failure mode was biological rejection of the PMMA (dental cement) based skull mount, resulting in the device loosening and eventually falling from the skull. Additionally, the average SNR for functioning devices beyond three weeks was 11.88 ± 2.02 dB before microelectrode movement and was significantly different (p < 0.01) from the average SNR of 13.34 ± 0.919 dB after movement. The results of this study demonstrate that MEMS based technologies can move microelectrodes in rodent brains in long-term experiments resulting in improvements in signal quality. Further improvements in packaging and surgical techniques will potentially enable movable microelectrodes to record cortical neuronal activity in chronic experiments.
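The quoted SNR figures follow the usual decibel convention for amplitude ratios. A one-line sketch, assuming SNR is computed from RMS (or peak) amplitudes, which the abstract does not state explicitly:

```python
import numpy as np

def snr_db(signal_amp_uv, noise_amp_uv):
    """Signal-to-noise ratio in dB from two amplitudes in the same
    units (e.g. spike amplitude vs. noise RMS, both in microvolts):
    20 * log10 of the amplitude ratio."""
    return 20.0 * np.log10(signal_amp_uv / noise_amp_uv)
```

Under this convention, the reported ~18 dB post-movement SNR with ~3 µV RMS noise corresponds to signal amplitudes on the order of 24 µV.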

  19. Titanium-based multi-channel, micro-electrode array for recording neural signals.

    Science.gov (United States)

    McCarthy, Patrick T; Madangopal, Rajtarun; Otto, Kevin J; Rao, Masaru P

    2009-01-01

    Micro-scale brain-machine interface (BMI) devices have provided an opportunity for direct probing of neural function and have also shown significant promise for restoring neurological functions lost to stroke, injury, or disease. However, the eventual clinical translation of such devices may be hampered by limitations associated with the materials commonly used for their fabrication, e.g. brittleness of silicon, insufficient rigidity of polymeric devices, and unproven chronic biocompatibility of both. Herein, we report, for the first time, the development of titanium-based "Michigan" type multi-channel, microelectrode arrays that seek to address these limitations. Titanium provides unique properties of immediate relevance to microelectrode arrays, such as high toughness, moderate modulus, and excellent biocompatibility, which may enhance structural reliability, safety, and chronic recording reliability. Realization of these devices is enabled by recently developed techniques which provide the opportunity for fabrication of high aspect ratio micromechanical structures in bulk titanium substrates. Details regarding the design, fabrication, and characterization of these devices for eventual use in rat auditory cortex and thalamus recordings are presented, as are preliminary results.

  20. Conflict Detection Performance Analysis for Function Allocation Using Time-Shifted Recorded Traffic Data

    Science.gov (United States)

    Guerreiro, Nelson M.; Butler, Ricky W.; Maddalon, Jeffrey M.; Hagen, George E.; Lewis, Timothy A.

    2015-01-01

    The performance of the conflict detection function in a separation assurance system is dependent on the content and quality of the data available to perform that function. Specifically, data quality and data content available to the conflict detection function have a direct impact on the accuracy of the prediction of an aircraft's future state or trajectory, which, in turn, impacts the ability to successfully anticipate potential losses of separation (detect future conflicts). Consequently, other separation assurance functions that rely on the conflict detection function - namely, conflict resolution - are prone to negative performance impacts. The many possible allocations and implementations of the conflict detection function between centralized and distributed systems drive the need to understand the key relationships that impact conflict detection performance, with respect to differences in data available. This paper presents the preliminary results of an analysis technique developed to investigate the impacts of data quality and data content on conflict detection performance. Flight track data recorded from a day of the National Airspace System is time-shifted to create conflicts not present in the un-shifted data. A methodology is used to smooth and filter the recorded data to eliminate sensor fusion noise, data drop-outs and other anomalies in the data. The metrics used to characterize conflict detection performance are presented and a set of preliminary results is discussed.
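The time-shifting idea can be sketched as a loss-of-separation scan over two recorded tracks. The 5 NM / 1000 ft thresholds are standard en-route separation minima; the track layout, sampling and wrap-around behavior of the shift are hypothetical simplifications of the paper's methodology:

```python
import numpy as np

def losses_of_separation(track_a, track_b, shift=0, h_nm=5.0, v_ft=1000.0):
    """Scan two synchronized position histories (rows of x_nm, y_nm,
    alt_ft per time step) for losses of separation, after time-shifting
    track_b by `shift` steps -- the mechanism used to create artificial
    conflicts from recorded traffic.  Note np.roll wraps around at the
    ends, which a real implementation would trim.  Returns the indices
    of time steps in conflict."""
    b = np.roll(track_b, shift, axis=0)
    horiz = np.hypot(track_a[:, 0] - b[:, 0], track_a[:, 1] - b[:, 1])
    vert = np.abs(track_a[:, 2] - b[:, 2])
    return np.flatnonzero((horiz < h_nm) & (vert < v_ft))
```

Detection performance can then be characterized by running such a scan on trajectories predicted from data of varying quality and comparing the flagged conflicts against the ground-truth set.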

  1. The SAGE II/OSIRIS/OMPS-LP USask 2D Deseasonalized Anomaly Ozone Data Record for Use in Trend Analysis

    Science.gov (United States)

    Degenstein, D. A.; Bourassa, A. E.; Zawada, D.; Roth, C.; McLinden, C. A.

    2017-12-01

    The SAGE II/OSIRIS/OMPS-LP USask 2D ozone deseasonalized anomaly data record spans over three decades, from 1984 to the present, and has been used extensively for the determination of stratospheric ozone trends in the post Montreal Protocol era. Radiance measurements made by the three instruments have all been used to produce ozone data profiles whose native units are number density as a function of altitude. Therefore, during the merging process required to produce the extended data record it is not necessary to use meteorological data for unit conversion and all trends contained within the data record come directly from the data products themselves. Although the SAGE II occultation data record ended in 2005, both the OMPS-LP and OSIRIS limb scattered sunlight data records continue. OSIRIS has been in operation since 2001 and is well beyond its lifetime but OMPS-LP is scheduled for launch on future spacecraft so the data record should continue for many years, or even decades. It is also anticipated that SAGE III ISS data will be added to the existing record to further enhance its utility for trend analysis. This paper will outline the process used to produce the deseasonalized ozone anomaly data record detailing the issues associated with merging data records that have different biases and sampling characteristics. Issues associated with SAGE II and OSIRIS measurements that are made at different local times will also be discussed. Finally, this paper will present trend results produced using variations of the official LOTUS analysis code. These results cover an altitude range from the tropopause to 55 km and from 60 South to 60 North in ten-degree bins. It will be shown that the new OMPS-LP USask 2D data record is of excellent quality and can be used to extend the ozone data records for the purpose of trend analysis.
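Two core steps of building such a record, deseasonalizing each series and removing inter-instrument offsets over a common period, can be sketched as follows. The simple mean-offset matching here is an illustrative stand-in for the actual merging procedure, which must also handle differing sampling and bias structures:

```python
import numpy as np

def deseasonalize(monthly, period=12):
    """Deseasonalized anomalies: subtract the mean seasonal cycle
    (the long-term average for each calendar month) from a monthly
    series whose first element is assumed to be a January."""
    x = np.asarray(monthly, dtype=float)
    clim = np.array([x[m::period].mean() for m in range(period)])
    return x - np.tile(clim, len(x) // period + 1)[:len(x)]

def merge_anomalies(rec_a, rec_b, overlap):
    """Merge two anomaly records by removing their mean offset over a
    common overlap (a slice selecting the shared period in each record,
    assumed aligned), so instrument biases do not leak into trends."""
    offset = rec_b[overlap].mean() - rec_a[overlap].mean()
    return np.concatenate([rec_a, rec_b - offset])
```

Working in anomaly space before merging is what lets records with different absolute biases contribute a single, trend-preserving time series.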

  2. Analysis of the definition and utility of personal health records using Q methodology.

    Science.gov (United States)

    Kim, Jeongeun; Bates, David W

    2011-11-29

    Personal health records (PHRs) remain a relatively new technology and concept in practice even though they have been discussed in the literature for more than 50 years. There is no consensus on the definition of a PHR or PHR system even within the professional societies of health information technology. Our objective was to analyze and classify the opinions of health information professionals regarding the definitions of the PHR. Q methodology was used to explore the concept of the PHR. A total of 50 Q-statements were selected and rated by 45 P-samples consisting of health information professionals. We analyzed the resulting data by using Q methodology-specific software and SPSS. We selected five types of health information professionals' opinions: type I, public interest centered; type II, health information standardization centered; type III, health consumer centered; type IV, health information security centered; and type V, health consumer convenience centered. The Q-statements with the highest levels of agreement were as follows: (1) the PHR is the lifetime record of personal health information, (2) the PHR is the representation of health 2.0, and (3) security is the most important requirement of the PHR. The most disagreed-with Q-statements were (1) the PHR is a paper-based system, and (2) it is most effective to carry the PHR information in USB storage. Health information professionals agree that PHRs should be lifetime records, that they will be useful as more information is stored electronically, and that data security is paramount. To maximize the benefits of PHRs, activation strategies should be developed and extended across disciplines and professionals so that patients begin to receive the benefits associated with using PHRs.

  3. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Height (Top and Base) Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of cloud height (top and base) from the Visible Infrared Imaging Radiometer Suite...

  4. A compact self-recording pressure based sea level gauge suitable for deployments at harbour and offshore environments

    Digital Repository Service at National Institute of Oceanography (India)

    Desa, E.; Peshwe, V.B.; Joseph, A.; Mehra, P.; Naik, G.P.; Kumar, V.; Desa, E.S.; Desai, R.G.P.; Nagvekar, S.; Desai, S.P.

    A compact and lightweight self-recording pressure based sea level gauge has been designed to suit deployments from harbour and offshore environments. A novel hydraulic coupling device designed in-house was used to transfer the seawater pressure...

  5. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Base Height (CBH) Environmental Data Record (EDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of Cloud Base Heights (CBH) from the Visible Infrared Imaging Radiometer Suite...

  6. Integrated interpretation of helicopter and ground-based geophysical data recorded within the Okavango Delta, Botswana

    DEFF Research Database (Denmark)

    Podgorski, Joel E.; Green, Alan G.; Kalscheuer, Thomas

    2015-01-01

    Integration of information from the following sources has been used to produce a much better constrained and more complete four-unit geological/hydrological model of the Okavango Delta than previously available: (i) a 3D resistivity model determined from helicopter time-domain electromagnetic (HTEM) data recorded across most of the delta, (ii) 2D models and images derived from ground-based electrical resistance tomographic, transient electromagnetic, and high-resolution seismic reflection/refraction tomographic data acquired at four selected sites in western and north-central regions of the delta... on the arcuate nature of its front and the semi-conical shape of its upper surface in the HTEM resistivity model. Moderate to high resistivity subhorizontal layers are consistent with this interpretation. The deepest unit is the basement with very high resistivity, high P-wave velocity, and low or complex...

  7. Reconstruction of walleye exploitation based on angler diary records and a model of predicted catches.

    Science.gov (United States)

    Willms, Allan R; Green, David M

    2007-11-01

    The walleye population in Canadarago Lake, New York, was 81-95% exploited in the 1988 fishing season, the year in which a previous restriction on the length and number of legally harvestable fish was liberalized. Using diary records from a subset of fishermen, growth estimates, and an estimate of the walleye population in the following year, a method is developed to reconstruct the fish population back to the spring of 1988 and thus determine the exploitation rate. The method is based on a model of diary catches that partitions time and fish length into a set of cells and relates predicted catches and population sizes in these cells. The method's sensitivity to the partitioning scheme, the growth estimates, and the diary data is analyzed. The method could be employed in other fish exploitation analyses and demonstrates the use of inexpensive angler-collected data in fisheries management.

  8. A cloud-based approach for interoperable electronic health records (EHRs).

    Science.gov (United States)

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.

  9. Delivering a lifelong integrated electronic health record based on a service oriented architecture.

    Science.gov (United States)

    Katehakis, Dimitrios G; Sfakianakis, Stelios G; Kavlentakis, Georgios; Anthoulakis, Dimitrios N; Tsiknakis, Manolis

    2007-11-01

    Efficient access to a citizen's Integrated Electronic Health Record (I-EHR) is considered to be the cornerstone for the support of continuity of care, the reduction of avoidable mistakes, and the provision of tools and methods to support evidence-based medicine. For the past several years, a number of applications and services (including a lifelong I-EHR) have been installed, and enterprise and regional infrastructure has been developed, in HYGEIAnet, the Regional Health Information Network (RHIN) of the island of Crete, Greece. This paper presents the technological effort toward the delivery of a lifelong I-EHR by means of World Wide Web Consortium (W3C) technologies, on top of a service-oriented architecture that reuses already existing middleware components, and discusses critical issues. Certain design and development decisions are exposed and explained, thereby laying the ground for coordinated, dynamic navigation to personalized healthcare delivery.

  10. A Bayesian network model for predicting type 2 diabetes risk based on electronic health records

    Science.gov (United States)

    Xie, Jiang; Liu, Yan; Zeng, Xu; Zhang, Wu; Mei, Zhen

    2017-07-01

    An extensive, in-depth study of diabetes risk factors (DBRF) is of crucial importance to prevent (or reduce) the chance of suffering from type 2 diabetes (T2D). The accumulation of electronic health records (EHRs) makes it possible to model nonlinear relationships between risk factors and diabetes. However, current DBRF research mainly focuses on qualitative analyses, and inconsistency among physical examination items means risk factors are easily lost, which motivates us to study novel machine learning approaches for risk model development. In this paper, we use Bayesian networks (BNs) to analyze the relationship between physical examination information and T2D, and to quantify the link between risk factors and T2D. Furthermore, building on the quantitative analyses of DBRF, we adopt EHRs and propose a machine learning approach based on BNs to predict the risk of T2D. The experiments demonstrate that our approach can lead to better predictive performance than the classical risk model.
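    The abstract gives neither the network structure nor the probabilities, so as a purely illustrative sketch, a two-node Bayesian network (one binary risk factor influencing T2D) with hypothetical conditional probability tables can be queried by enumeration:

```python
# Minimal Bayesian-network sketch: one binary risk factor R -> disease D.
# All probabilities below are hypothetical placeholders, not study values.

P_R = {True: 0.3, False: 0.7}              # prior on the risk factor
P_D_given_R = {True: 0.25, False: 0.08}    # P(D=True | R)

def p_disease(r=None):
    """P(D=True), optionally conditioned on the risk factor R."""
    if r is not None:
        return P_D_given_R[r]
    # marginalize over R: P(D) = sum_r P(r) * P(D | r)
    return sum(P_R[r] * P_D_given_R[r] for r in (True, False))

print(p_disease())        # marginal risk: 0.3*0.25 + 0.7*0.08 = 0.131
print(p_disease(r=True))  # risk for an exposed individual: 0.25
```

    A real EHR-driven model would learn the structure and tables from data; this sketch only shows the inference step the abstract alludes to.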

  11. Analysis of record-low ozone values during the 1997 winter over Lauder, New Zealand

    Science.gov (United States)

    Brinksma, E. J.; Meijer, Y. J.; Connor, B. J.; Manney, G. L.; Bergwerff, J. B.; Bodeker, G. E.; Boyd, I. S.; Liley, J. B.; Hogervorst, W.; Hovenier, J. W.; Livesey, N. J.; Swart, D. P. J.

    Record-low ozone (O3) column densities (with a minimum of 222 DU) were observed over the Lauder NDSC (Network for the Detection of Stratospheric Change) station (45°S, 170°E) in August 1997. Possible causes are examined using height-resolved O3 measurements over Lauder, and high-resolution reverse trajectory maps of O3 (initialised with Microwave Limb Sounder measurements) and of potential vorticity. The analysis shows that O3-poor air originated from two regions: below the 550 K isentrope (˜22 km) subtropical air was observed, while between 600 and 1000 K (˜25-33.5 km) the polar vortex tilted over Lauder for several days. A rapid recovery of the O3 column density was observed later, due to an O3-rich polar vortex filament moving over Lauder between 18 and 24 km, while simultaneously the O3-poor higher vortex moved away.

  12. A Socio-Technical Analysis of Patient Accessible Electronic Health Records.

    Science.gov (United States)

    Hägglund, Maria; Scandurra, Isabella

    2017-01-01

    In Sweden, and internationally, there is a movement towards increased transparency in healthcare, including giving patients online access to their electronic health records (EHR). The purpose of this paper is to analyze the Swedish patient accessible EHR (PAEHR) service using a socio-technical framework, to increase the understanding of factors that influence the design, implementation, adoption and use of the service. Using the Sittig and Singh socio-technical framework to analyze the Swedish PAEHR system and its context indicated that many stakeholders are engaged in these types of services, with different driving forces and incentives that may influence the adoption and usefulness of PAEHR services. The analysis was useful in highlighting important areas that need to be further explored in evaluations of PAEHR services, and can act as a guide when planning evaluations of any PAEHR service.

  13. Analysis of workers' dose records from the Greek Dose Registry Information System

    International Nuclear Information System (INIS)

    Kamenopoulou, V.; Dimitriou, P.; Proukakis, Ch.

    1995-01-01

    The object of this work is the study of the individual film badge annual dose information of classified workers in Greece, monitored and assessed by the central dosimetry service of the Greek Atomic Energy Commission. Dose summaries were recorded and processed by the Dose Registry Information System. The statistical analysis refers to the years 1989-93 and deals with the distribution of individuals among the occupational groups, the mean annual dose, the collective dose, the distribution of the dose over the different specialties, and the number of workers that exceeded any of the established dose limits. Results concerning the annual dose summaries demonstrate a year-by-year reduction in the mean individual dose to workers in the health sector. Conversely, exposures in the industrial sector did not show any decreasing tendency during the period under consideration. (Author)

  14. Mercury Determination in Fish Samples by Chronopotentiometric Stripping Analysis Using Gold Electrodes Prepared from Recordable CDs

    Directory of Open Access Journals (Sweden)

    Andrei Florin Danet

    2008-11-01

    Full Text Available A simple method for manufacturing gold working electrodes for chronopotentiometric stripping measurements from recordable CD-Rs is described. These gold electrodes are much cheaper than commercially available ones. The electrochemical behavior of such an electrode and the working parameters for mercury determination by chronopotentiometric stripping analysis were studied. The detection limit was 0.30 μg Hg/L and the determination limit was 1.0 μg Hg/L for a deposition time of 600 s. Using the developed working electrodes it was possible to determine the total mercury in fish samples. A method for fish sample digestion was developed using a mixture of fuming nitric acid and both concentrated sulfuric and hydrochloric acids. The recovery degree for a known amount of mercury introduced in the sample before digestion was 95.3% (n=4).

  15. Automating occupational protection records systems

    International Nuclear Information System (INIS)

    Lyon, M.; Martin, J.B.

    1991-10-01

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs

  16. Prefrontal cortex and somatosensory cortex in tactile crossmodal association: an independent component analysis of ERP recordings.

    Directory of Open Access Journals (Sweden)

    Yixuan Ku

    2007-08-01

    Full Text Available Our previous studies on scalp-recorded event-related potentials (ERPs) showed that somatosensory N140 evoked by a tactile vibration in working memory tasks was enhanced when human subjects expected a coming visual stimulus that had been paired with the tactile stimulus. The results suggested that such enhancement represented the cortical activities involved in tactile-visual crossmodal association. In the present study, we further hypothesized that the enhancement represented the neural activities in somatosensory and frontal cortices in the crossmodal association. By applying independent component analysis (ICA) to the ERP data, we found independent components (ICs) located in the medial prefrontal cortex (around the anterior cingulate cortex, ACC) and the primary somatosensory cortex (SI). The activity represented by the IC in SI cortex showed enhancement in expectation of the visual stimulus. Such differential activity thus suggested the participation of SI cortex in the task-related crossmodal association. Further, the coherence analysis and the Granger causality spectral analysis of the ICs showed that SI cortex appeared to cooperate with ACC in attention and perception of the tactile stimulus in crossmodal association. The results of our study support with new evidence an important idea in cortical neurophysiology: higher cognitive operations develop from the modality-specific sensory cortices (in the present study, SI cortex) that are involved in sensation and perception of various stimuli.

  17. Prefrontal cortex and somatosensory cortex in tactile crossmodal association: an independent component analysis of ERP recordings.

    Science.gov (United States)

    Ku, Yixuan; Ohara, Shinji; Wang, Liping; Lenz, Fred A; Hsiao, Steven S; Bodner, Mark; Hong, Bo; Zhou, Yong-Di

    2007-08-22

    Our previous studies on scalp-recorded event-related potentials (ERPs) showed that somatosensory N140 evoked by a tactile vibration in working memory tasks was enhanced when human subjects expected a coming visual stimulus that had been paired with the tactile stimulus. The results suggested that such enhancement represented the cortical activities involved in tactile-visual crossmodal association. In the present study, we further hypothesized that the enhancement represented the neural activities in somatosensory and frontal cortices in the crossmodal association. By applying independent component analysis (ICA) to the ERP data, we found independent components (ICs) located in the medial prefrontal cortex (around the anterior cingulate cortex, ACC) and the primary somatosensory cortex (SI). The activity represented by the IC in SI cortex showed enhancement in expectation of the visual stimulus. Such differential activity thus suggested the participation of SI cortex in the task-related crossmodal association. Further, the coherence analysis and the Granger causality spectral analysis of the ICs showed that SI cortex appeared to cooperate with ACC in attention and perception of the tactile stimulus in crossmodal association. The results of our study support with new evidence an important idea in cortical neurophysiology: higher cognitive operations develop from the modality-specific sensory cortices (in the present study, SI cortex) that are involved in sensation and perception of various stimuli.

  18. Fabrication of chitosan/Au-TiO2 nanotube-based dry electrodes for electroencephalography recording.

    Science.gov (United States)

    Song, Yanjuan; Li, Penghai; Li, Mingji; Li, Hongji; Li, Cuiping; Sun, Dazhi; Yang, Baohe

    2017-10-01

    In this paper, we describe a method for fabricating dry electrodes for use in recording electroencephalograms (EEGs), which are based on the use of chitosan (Ch), gold (Au) particles, and titanium dioxide (TiO2) nanotube arrays deposited on titanium (Ti) thin sheets. The samples were characterized by scanning electron microscopy, X-ray diffraction, electrochemical impedance spectroscopy, and EEG signal collection. The TiO2 nanotube arrays were grown on the Ti thin sheet by an electrochemical anodic oxidation method. The Au particles were deposited on the bottom and surface layers of the TiO2 nanotube array using an electrochemistry-based multi-potential step technology. The fabricated dry Ch/Au-TiO2 electrodes have an efficient conversion interface for ion current/electron current, a highly biocompatible contact surface, and a fast electron transfer channel. To confirm that the Ch/Au-TiO2 layer can be used in dry EEG electrodes, the impedance spectra of the electrodes in solution and on skin were analyzed. The mean impedance values for skin were found to be approximately 169±33.0 kΩ at 2.15 Hz and 67.4±8.9 kΩ at 100 Hz. In addition, EEG signals from the forehead and sites with hair were collected using both the dry Ch/Au-TiO2 electrode and a wet Ag/AgCl electrode for comparison purposes. It was found that high quality EEG signal recordings could be obtained using the dry electrodes. The fact that electrolytes are not required means that the electrodes are suitable for use in long-term bio-potential testing. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Communication Base Station Log Analysis Based on Hierarchical Clustering

    Directory of Open Access Journals (Sweden)

    Zhang Shao-Hua

    2017-01-01

    Full Text Available Communication base stations generate massive data every day, and these base station logs are valuable for mining business circles. This paper uses data mining technology and a hierarchical clustering algorithm to group base stations by the scope of their business circle, based on the data recorded at the stations. By analyzing the data of different business circles through feature extraction and comparing the characteristics of the business circle categories, operators can choose suitable areas for commercial marketing.
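    As an illustration of the grouping step, a minimal single-linkage agglomerative clustering over invented base-station feature vectors can be written in a few lines; the station names, feature values, and distance threshold below are assumptions for the sketch, not data from the paper:

```python
# Illustrative agglomerative (single-linkage) clustering of base-station
# feature vectors, e.g. (normalized call volume, normalized data traffic).
from itertools import combinations
from math import dist

stations = {"A": (1.0, 1.2), "B": (1.1, 1.0), "C": (8.0, 8.5), "D": (8.2, 8.1)}

def single_linkage(points, threshold):
    """Merge the closest clusters until all inter-cluster gaps exceed threshold."""
    clusters = [{k} for k in points]
    while len(clusters) > 1:
        # closest pair of clusters by minimum point-to-point distance
        (i, j), d = min(
            (((a, b), min(dist(points[p], points[q])
                          for p in clusters[a] for q in clusters[b]))
             for a, b in combinations(range(len(clusters)), 2)),
            key=lambda t: t[1])
        if d > threshold:          # stop merging once clusters are far apart
            break
        clusters[i] |= clusters[j]
        del clusters[j]
    return clusters

print(single_linkage(stations, threshold=3.0))  # two groups: {A,B} and {C,D}
```

    A production version would use a library implementation (e.g. SciPy's linkage routines) on real log-derived features; the cut-off threshold plays the role of choosing how many business-circle groups to keep.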

  20. Analysis of the Impact of Wildfire on Surface Ozone Record in the Colorado Front Range

    Science.gov (United States)

    McClure-Begley, A.; Petropavlovskikh, I. V.; Oltmans, S. J.; Pierce, R. B.; Sullivan, J. T.; Reddy, P. J.

    2015-12-01

    Ozone plays an important role in the oxidation capacity of the atmosphere, and at ground level has negative impacts on human health and ecosystem processes. In order to understand the dynamics and variability of surface ozone, it is imperative to analyze individual sources, interactions between sources, transport, and the chemical processes of ozone production and accumulation. Biomass burning and wildfires are known to emit a suite of particulate matter and gaseous compounds into the atmosphere. These compounds, such as volatile organic compounds, carbon monoxide, and nitrogen oxides, are precursor species which aid in the photochemical production and destruction of ozone. The Colorado Front Range (CFR) is a region of complex interactions between pollutant sources and meteorological conditions which result in the accumulation of ozone. High ozone events in the CFR associated with fires are analyzed for 2003-2014 to develop an understanding of the large-scale influence and variability of ozone-wildfire relationships. This study provides analysis of the frequency of enhanced ozone episodes that can be confirmed to have been transported within, and affected by, the fires and smoke plumes. Long-term records of surface ozone data from the CFR provide information on the impact of wildfire pollutants on seasonal and diurnal ozone behavior. Years with increased local fire activity, as well as years with increased long-range transport of smoke plumes, are evaluated for their effect on the long-term record and high-ozone frequency at each location. Meteorological data, MODIS fire detection images, NOAA HYSPLIT back-trajectory analysis, the NOAA smoke verification model, fire tracer data (K+), the RAQMS model, carbon monoxide data, and aerosol optical depth retrievals are used with NOAA Global Monitoring Division surface ozone data from three sites in Colorado. This allows for investigation of the interactions between pollutants and meteorology which result in high surface ozone levels.

  1. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the precipitation values registered between 28 November and 7 December were so remarkable that the 1876 episode still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  2. Preservation and analysis of footprint evidence within the archaeological record: examples from Valsequillo and Cuatrocienegas, Mexico.

    Science.gov (United States)

    Bennett, M.; Huddart, D.; Gonzalez, S.

    2008-05-01

    Human footprints provide a direct record of human occupation and can be used to make a range of biometric inferences about the individuals who left them. In this paper we describe the application of three-dimensional optical laser scanning in the preservation and analysis of both human and animal footprints. Optical laser scanning provides a digital elevation model of a print or surface with a vertical accuracy typically better than ±0.01 mm. Not only does this provide a procedure for recording fragile footprint evidence, but it also allows digital measurements to be made. It is also possible to use the techniques developed for rapid prototyping to recreate the print as a solid model for visualisation. The role of optical laser scanning in the preservation of footprint evidence is explored with specific reference to the controversial footprints of the Valsequillo Basin in Central Mexico, which may provide some of the earliest evidence of human colonization of the Americas. More importantly, digital footprint scans provide a basis for the numerical analysis of footprints, allowing the tools of geometric morphometrics to be applied. These tools have been widely developed in the fields of biology and physical anthropology and used to explore the anatomical significance of shape. One key question that can be addressed in this way is how to recognise a human footprint objectively and statistically, thereby helping to verify footprint interpretations and their archaeological significance. Using footprint data from sites across the world, a statistical model for the recognition of human footprints is presented and used to evaluate the controversial footprint site of Valsequillo (Puebla State), preserved in volcanic ash, and those in the Cuatrocienegas Basin (Coahuila State), preserved in travertine.

  3. Analysis of Handling Processes of Record Versions in NoSQL Databases

    OpenAIRE

    Yu. A. Grigorev

    2015-01-01

    This article investigates the handling of record versions in NoSQL databases. The goal of this work is to develop a model that enables users both to handle record versions and to work with a record simultaneously. The model allows us to estimate both the distribution of the time users spend handling record versions and the distribution of the number of record versions. With eventual consistency (W=R=1) there is a possibility for several users to update any record simultaneously. In this case, ...
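    The concurrent-update situation the abstract describes (W=R=1, so several users may update the same record at once) can be illustrated with a toy store that keeps sibling versions instead of silently overwriting one of the concurrent writes; the class and method names here are invented for this sketch, not an API from the article:

```python
# Toy illustration of sibling versions under eventual consistency (W=R=1):
# two writers update the same record from the same base version, so the
# store retains both resulting versions for later reconciliation.
class VersionedStore:
    def __init__(self):
        self._siblings = {}   # key -> list of (version, value)
        self._counter = 0

    def read(self, key):
        return self._siblings.get(key, [])

    def write(self, key, value, based_on=None):
        self._counter += 1
        new = (self._counter, value)
        # drop only the sibling the writer actually read; keep the rest
        kept = [s for s in self._siblings.get(key, []) if s[0] != based_on]
        self._siblings[key] = kept + [new]
        return new[0]

store = VersionedStore()
v1 = store.write("user:1", {"name": "Ann"})
# two users both read version v1, then write concurrently:
store.write("user:1", {"name": "Ann", "city": "Oslo"}, based_on=v1)
store.write("user:1", {"name": "Ann", "phone": "555"}, based_on=v1)
print(len(store.read("user:1")))  # 2 siblings now await reconciliation
```

    The number of siblings a reader must reconcile is exactly the "count of record versions" whose distribution the article's model estimates.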

  4. Semantic validation of standard-based electronic health record documents with W3C XML schema.

    Science.gov (United States)

    Rinner, C; Janzek-Hawlat, S; Sibinovic, S; Duftschmid, G

    2010-01-01

    The goal of this article is to examine whether W3C XML Schema provides a practicable solution for the semantic validation of standard-based electronic health record (EHR) documents. By semantic validation we mean that the EHR documents are checked for conformance with the underlying archetypes and reference model. We describe an approach that allows XML Schemas to be derived from archetypes based on a specific naming convention. The archetype constraints are augmented with additional components of the reference model within the XML Schema representation. A copy of the EHR document that is transformed according to the aforementioned naming convention is used for the actual validation against the XML Schema. We tested our approach by semantically validating EHR documents conformant to three different ISO/EN 13606 archetypes corresponding to three sections of the CDA implementation guide "Continuity of Care Document (CCD)" and an implementation guide for diabetes therapy data. We further developed a tool to automate the different steps of our semantic validation approach. For two particular kinds of archetype prescriptions, individual transformations are required for the corresponding EHR documents. Otherwise, a fully generic validation is possible. In general, we consider W3C XML Schema a practicable solution for the semantic validation of standard-based EHR documents.
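    The core idea (archetype constraints checked against an EHR document) can be sketched with the standard library alone. The element names, paths, and constraint format below are invented for illustration and are not the ISO/EN 13606 or CCD structures; the article's actual approach validates against a generated XSD with a dedicated XML Schema validator, which this sketch only approximates:

```python
# Minimal sketch: check an XML EHR fragment against archetype-like
# constraints (required paths and allowed value sets).
import xml.etree.ElementTree as ET

archetype = {
    "required": ["patient/id", "observation/code", "observation/value"],
    "allowed":  {"observation/code": {"LOINC:2339-0", "LOINC:718-7"}},
}

def validate(xml_text, archetype):
    """Return a list of constraint violations; empty means conformant."""
    root = ET.fromstring(xml_text)
    errors = []
    for path in archetype["required"]:
        if root.find(path) is None:
            errors.append(f"missing required element: {path}")
    for path, allowed in archetype["allowed"].items():
        node = root.find(path)
        if node is not None and node.text not in allowed:
            errors.append(f"value {node.text!r} not allowed at {path}")
    return errors

doc = """<ehr>
  <patient><id>123</id></patient>
  <observation><code>LOINC:2339-0</code><value>5.4</value></observation>
</ehr>"""
print(validate(doc, archetype))  # [] -> the document conforms
```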

  5. Descriptive and temporal analysis of post-mortem lesions recorded in slaughtered pigs in New Zealand from 2000 to 2010.

    Science.gov (United States)

    Neumann, Ej; Hall, Wf; Stevenson, Ma; Morris, Rs; Ling Min Than, J

    2014-05-01

    To complete a retrospective analysis of data from a national abattoir-based lesion recording system (PigCheck) in the New Zealand pig industry, in order to establish the prevalence of 20 post-mortem disease lesions, describe long-term trends in the prevalence of these lesions, and identify the proportion of the monthly variation in lesion prevalence that could be attributed to individual farms or abattoirs. Slaughter lesion data were collected and reported at the lot level (a cohort of pigs delivered from one farm at one time). Data on the prevalence of lesions between January 2000 and December 2010 were aggregated by month, and time-series analysis of the data for each lesion was conducted. The time-series pattern for each lesion was described with an auto-regressive integrated moving average (ARIMA) model; seasonality of lesion occurrence was assessed separately. To determine the proportion of variance in lesion prevalence that could be attributed to farms relative to that attributed to abattoirs, a hierarchical binomial generalised linear mixed model was created incorporating two random-effect levels, at the farm (within abattoir) and abattoir levels. A dataset comprising 124,407 lots (6,220,664 pigs, 279 farms, five abattoirs) was compiled for analysis. The most prevalent conditions across the 11-year time series were antero-ventral pneumonia (7.6%), pleuropneumonia (11.4%), and milk spots (9.2%). Of the 15 lesions shown to have a significant annual change in prevalence, 10 decreased over time and five increased. The variance in prevalence observed for pyogenic lesions (92%), mange (73%), and ileitis (62%) was attributed primarily to variation between abattoirs. By contrast, the farm of origin explained the greatest percentage of variance in prevalence for rectal prolapse (98%), pneumonia (97%), and antero-ventral pneumonia (96%). The overall prevalence of most lesions recorded in PigCheck for the period was low relative to published data from other

  6. Untangling inconsistent magnetic polarity records through an integrated rock magnetic analysis: A case study on Neogene sections in East Timor

    Science.gov (United States)

    Aben, F. M.; Dekkers, M. J.; Bakker, R. R.; van Hinsbergen, D. J. J.; Zachariasse, W. J.; Tate, G. W.; McQuarrie, N.; Harris, R.; Duffy, B.

    2014-06-01

    Inconsistent polarity patterns in sediments are a common problem in magnetostratigraphic and paleomagnetic research. Multiple magnetic mineral generations result in such remanence "haystacks." Here we test whether end-member modeling of isothermal remanent magnetization acquisition curves, as a basis for an integrated rock magnetic and microscopic analysis, is capable of isolating original magnetic polarity patterns. Uppermost Miocene-Pliocene deep-marine siliciclastics and limestones in East Timor, originally sampled to constrain the uplift history of the young Timor orogeny, serve as a case study. An apparently straightforward polarity record was obtained that, however, proved impossible to reconcile with the associated biostratigraphy. Our analysis distinguished two magnetic end-members for each section, which result from various greigite suites and a detrital magnetite suite. The latter yields largely viscous remanence signals and is deemed unsuited. The greigite suites are late diagenetic in the Cailaco River section and early diagenetic, thus reliable, in the Viqueque Type section. By selecting reliable sample levels based on a quality index, a revised polarity pattern of the latter section is obtained: consistent with the biostratigraphy and unequivocally correlatable to the Geomagnetic Polarity Time Scale. Although the Cailaco River section lacks a reliable magnetostratigraphy, it does record a significant postremagnetization tectonic rotation. Our results show that the application of well-designed rock magnetic research, based on the end-member model and integrated with microscopy and paleomagnetic data, can unravel complex and seemingly inconsistent polarity patterns. We recommend this approach to assess the veracity of the polarity of strata with complex magnetic mineralogy.

  7. Development of the electronic patient record system based on problem oriented system.

    Science.gov (United States)

    Uto, Yumiko; Iwaanakuchi, Takashi; Muranaga, Fuminori; Kumamoto, Ichiro

    2013-01-01

    In Japan, the problem oriented system (POS) is recommended in clinical guidelines, so records are mainly written in the SOAP format. We developed a system whose main function enables staff members of all professions, including doctors, to enter a patient's clinical information into a single shared record, whether the patient is an outpatient or an inpatient, and to view its contents chronologically. This electronic patient record system is called the "e-kanja recording system". With this system, all members of the medical team share the same information. Moreover, because entries can be reviewed by colleagues, the quality of the records has improved.

  8. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (Java code and executable file) was developed and tested to support the analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including the type of data, the type of chronic disease reporting, the measurement of multimorbidity, the sample size and the research setting. The computational program will identify all existing, mutually exclusive combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Their careful use, and the comparison of results, will be a valuable addition to the nuanced understanding of multimorbidity.
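    The distinction between combinations and permutations that the Tool enumerates can be sketched in a few lines (hypothetical record data; the actual Java implementation is not reproduced here): a combination ignores the order in which diseases were recorded, while a permutation preserves it.

    ```python
    from collections import Counter

    # Each record lists a patient's chronic diseases in order of diagnosis
    # (illustrative toy data; the real Tool reads thousands of records).
    records = [
        ["diabetes", "hypertension"],
        ["hypertension", "diabetes"],
        ["diabetes", "hypertension"],
        ["asthma"],
    ]

    # A combination ignores diagnosis order; a permutation preserves it.
    combinations = Counter(frozenset(r) for r in records)
    permutations = Counter(tuple(r) for r in records)

    print(len(combinations))  # 2 unique combinations
    print(len(permutations))  # 3 unique permutations
    ```

    The counts attached to each key give the cluster sizes, which is how mutually exclusive groups of patients can be reported.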

  9. Evaluation of an algorithm based on single-condition decision rules for binary classification of 12-lead ambulatory ECG recording quality

    International Nuclear Information System (INIS)

    Di Marco, Luigi Yuri; Duan, Wenfeng; Bojarnejad, Marjan; Zheng, Dingchang; Murray, Alan; Langley, Philip; King, Susan

    2012-01-01

    A new algorithm for classifying ECG recording quality based on the detection of commonly observed ECG contaminants which often render the ECG unusable for diagnostic purposes was evaluated. Contaminants (baseline drift, flat line, QRS-artefact, spurious spikes, amplitude stepwise changes, noise) were detected on individual leads from joint time-frequency analysis and QRS amplitude. Classification was based on cascaded single-condition decision rules (SCDR) that tested levels of contaminants against classification thresholds. A supervised learning classifier (SLC) was implemented for comparison. The SCDR and SLC algorithms were trained on an annotated database (Set A, PhysioNet Challenge 2011) of ‘acceptable’ versus ‘unacceptable’ quality recordings using the ‘leave M out’ approach with repeated random partitioning and cross-validation. Two training approaches were considered: (i) balanced, in which training records had equal numbers of ‘acceptable’ and ‘unacceptable’ recordings, (ii) unbalanced, in which the ratio of ‘acceptable’ to ‘unacceptable’ recordings from Set A was preserved. For each training approach, thresholds were calculated, and classification accuracy of the algorithm compared to other rule based algorithms and the SLC using a database for which classifications were unknown (Set B PhysioNet Challenge 2011). The SCDR algorithm achieved the highest accuracy (91.40%) compared to the SLC (90.40%) in spite of its simple logic. It also offers the advantage that it facilitates reporting of meaningful causes of poor signal quality to users. (paper)

  10. Evaluation of an algorithm based on single-condition decision rules for binary classification of 12-lead ambulatory ECG recording quality.

    Science.gov (United States)

    Di Marco, Luigi Yuri; Duan, Wenfeng; Bojarnejad, Marjan; Zheng, Dingchang; King, Susan; Murray, Alan; Langley, Philip

    2012-09-01

    A new algorithm for classifying ECG recording quality based on the detection of commonly observed ECG contaminants which often render the ECG unusable for diagnostic purposes was evaluated. Contaminants (baseline drift, flat line, QRS-artefact, spurious spikes, amplitude stepwise changes, noise) were detected on individual leads from joint time-frequency analysis and QRS amplitude. Classification was based on cascaded single-condition decision rules (SCDR) that tested levels of contaminants against classification thresholds. A supervised learning classifier (SLC) was implemented for comparison. The SCDR and SLC algorithms were trained on an annotated database (Set A, PhysioNet Challenge 2011) of 'acceptable' versus 'unacceptable' quality recordings using the 'leave M out' approach with repeated random partitioning and cross-validation. Two training approaches were considered: (i) balanced, in which training records had equal numbers of 'acceptable' and 'unacceptable' recordings, (ii) unbalanced, in which the ratio of 'acceptable' to 'unacceptable' recordings from Set A was preserved. For each training approach, thresholds were calculated, and classification accuracy of the algorithm compared to other rule based algorithms and the SLC using a database for which classifications were unknown (Set B PhysioNet Challenge 2011). The SCDR algorithm achieved the highest accuracy (91.40%) compared to the SLC (90.40%) in spite of its simple logic. It also offers the advantage that it facilitates reporting of meaningful causes of poor signal quality to users.
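    A minimal sketch of the cascaded single-condition decision rule idea (contaminant scores and thresholds here are invented for illustration; the paper derives its features from joint time-frequency analysis and QRS amplitude): each rule tests one contaminant level against its threshold, and the first rule that fires classifies the recording as unacceptable.

    ```python
    # Hypothetical per-recording contaminant scores (0 = clean, 1 = severe)
    # and hypothetical classification thresholds, one per contaminant.
    THRESHOLDS = {
        "baseline_drift": 0.6,
        "flat_line": 0.5,
        "qrs_artefact": 0.7,
        "spurious_spikes": 0.7,
        "amplitude_steps": 0.6,
        "noise": 0.8,
    }

    def classify(scores: dict) -> str:
        """Cascade of single-condition rules: the first exceeded
        threshold marks the recording 'unacceptable'."""
        for contaminant, threshold in THRESHOLDS.items():
            if scores.get(contaminant, 0.0) > threshold:
                return "unacceptable"
        return "acceptable"

    print(classify({"noise": 0.3, "flat_line": 0.2}))  # acceptable
    print(classify({"baseline_drift": 0.9}))           # unacceptable
    ```

    Because each decision traces back to one named contaminant, this structure directly supports the paper's point about reporting meaningful causes of poor signal quality to users.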

  11. Gap analysis between provisional diagnosis and final diagnosis in government and private teaching hospitals: A record-linked comparative study.

    Science.gov (United States)

    Chatterjee, Sudeshna; Ray, Krishnangshu; Das, Anup Kumar

    2016-01-01

    1. To identify the extent of clinical gaps in the context of knowledge, practice and systems. 2. To formulate intervention measures for bridging these gaps. Comparative, cross-sectional and non-interventional study. It is a retrospective, record-based study conducted on inpatients ( n = 200) of the major disciplines of two teaching hospitals. The major outcome variable was the match or mismatch between provisional and final diagnoses, assessed using ICD-10 criteria. Specific and selective gaps were estimated comparatively as percentages (%). The pilot observation showed gaps between provisional and final diagnosis in both the private and the government institution. Both knowledge and skill gaps were evident among caregivers, and documentation gaps were present in the medical records. The pilot data may be an eye-opener for public and private governance systems, prompting them to understand and revise the processes of service planning and service delivery. Intervention measures may be contemplated to enhance the diagnostic skills of doctors and improve the quality of hospital care.

  12. Statistical Metadata Analysis of the Variability of Latency, Device Transfer Time, and Coordinate Position from Smartphone-Recorded Infrasound Data

    Science.gov (United States)

    Garces, E. L.; Garces, M. A.; Christe, A.

    2017-12-01

    The RedVox infrasound recorder app uses the microphones and barometers in smartphones to record infrasound, low-frequency sound below the threshold of human hearing. We study a device's metadata, which include position, latency time, the differences between the device's internal times and the server times, and the machine time, searching for patterns and possible errors or discontinuities in these scaled parameters. We highlight metadata variability through scaled multivariate displays (histograms, distribution curves, scatter plots), all created and organized with software developed in Python. This project is helpful in ascertaining variability and honing the accuracy of smartphones, aiding the emergence of portable devices as viable geophysical data collection instruments. It can also improve the app and cloud service by increasing efficiency and accuracy, making it possible to better document and foresee drastic events like tsunamis, earthquakes, volcanic eruptions, storms, rocket launches, and meteor impacts; recorded data can later be used for studies and analysis by a variety of professionals. We expect our final results to produce insight into how to counteract problematic issues in data mining and improve accuracy in smartphone data collection. By eliminating lurking variables and minimizing the effect of confounding variables, we hope to discover efficient processes to reduce superfluous precision, unnecessary errors, and data artifacts. These methods should be transferable to other areas of software development, data analytics, and statistics-based experiments, setting a precedent for smartphone metadata studies based on geophysical rather than societal data. The results should facilitate the rise of civilian-accessible, hand-held, data-gathering mobile sensor networks and yield more straightforward data mining techniques.

  13. Fragmented implementation of maternal and child health home-based records in Vietnam: need for integration

    Directory of Open Access Journals (Sweden)

    Hirotsugu Aiga

    2016-02-01

    Full Text Available Background: Home-based records (HBRs) are implemented globally as effective tools that encourage pregnant women and mothers to utilise maternal and child health (MCH) services in a timely and adequate manner. While the availability and utilisation of nationally representative HBRs have been assessed in several earlier studies, the reality of a number of HBRs implemented subnationally in a less coordinated manner has been neither reported nor analysed. Objectives: This study aims to estimate the prevalence of HBRs for MCH and the level of fragmentation of, and overlap between, different HBRs for MCH in Vietnam. The study further attempts to identify health workers' and mothers' perceptions of HBR operation and utilisation. Design: A self-administered questionnaire was sent to the provincial health departments of 28 selected provinces. A copy of each available HBR was collected from them. A total of 20 semi-structured interviews with health workers and mothers were conducted in rural communities in four of the 28 selected provinces. Results: Whereas HBRs developed exclusively for maternal health and exclusively for child health were available in four provinces (14%) and in 28 provinces (100%), respectively, those covering both maternal and child health were available in nine provinces (32%). The mean number of HBRs in the 28 provinces (5.75) indicates over-availability of HBRs. All 119 minimum required recording items found in three different HBRs under nationwide scale-up were also included in the Maternal and Child Health Handbook being piloted for nationwide scale-up. Implementation of multiple HBRs is likely to confuse not only health workers, by requiring them to record the same data in several HBRs, but also mothers, who may be unsure which HBR they should refer to and rely on at home. Conclusions: To enable both health workers and pregnant women to focus on only one type of HBR, province-specific HBRs for maternal and/or child health need to be

  14. Validity of type 2 diabetes diagnosis in a population-based electronic health record database.

    Science.gov (United States)

    Moreno-Iribas, Conchi; Sayon-Orea, Carmen; Delfrade, Josu; Ardanaz, Eva; Gorricho, Javier; Burgui, Rosana; Nuin, Marian; Guevara, Marcela

    2017-04-08

    The increasing burden of type 2 diabetes mellitus makes the continuous surveillance of its prevalence and incidence advisable. Electronic health records (EHRs) have great potential for research and surveillance purposes; however the quality of their data must first be evaluated for fitness for use. The aim of this study was to assess the validity of type 2 diabetes diagnosis in a primary care EHR database covering more than half a million inhabitants, 97% of the population in Navarra, Spain. In the Navarra EPIC-InterAct study, the validity of the T90 code from the International Classification of Primary Care, Second Edition was studied in a primary care EHR database to identify incident cases of type 2 diabetes, using a multi-source approach as the gold standard. The sensitivity, specificity, positive predictive value, negative predictive value and the kappa index were calculated. Additionally, type 2 diabetes prevalence from the EHR database was compared with estimations from a health survey. The sensitivity, specificity, positive predictive value and negative predictive value of incident type 2 diabetes recorded in the EHRs were 98.2, 99.3, 92.2 and 99.8%, respectively, and the kappa index was 0.946. Overall prevalence of type 2 diabetes diagnosed in the EHRs among adults (35-84 years of age) was 7.2% (95% confidence interval [CI] 7.2-7.3) in men and 5.9% (95% CI 5.8-5.9) in women, which was similar to the prevalence estimated from the health survey: 8.5% (95% CI 7.1-9.8) and 5.5% (95% CI 4.4-6.6) in men and women, respectively. The high sensitivity and specificity of type 2 diabetes diagnosis found in the primary care EHRs make this database a good source for population-based surveillance of incident and prevalent type 2 diabetes, as well as for monitoring quality of care and health outcomes in diabetic patients.
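    The reported validation metrics all follow from a 2×2 confusion matrix against the gold standard; a short sketch with hypothetical counts (not the study's data):

    ```python
    def validity_metrics(tp, fp, fn, tn):
        """Standard 2x2 validation metrics for a diagnosis code
        compared against a gold standard."""
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)   # sensitivity (recall)
        spec = tn / (tn + fp)   # specificity
        ppv = tp / (tp + fp)    # positive predictive value
        npv = tn / (tn + fn)    # negative predictive value
        # Cohen's kappa: observed agreement vs chance agreement.
        po = (tp + tn) / n
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        kappa = (po - pe) / (1 - pe)
        return sens, spec, ppv, npv, kappa

    # Hypothetical counts chosen only to illustrate the calculation.
    sens, spec, ppv, npv, kappa = validity_metrics(tp=540, fp=46, fn=10, tn=9404)
    print(f"sens={sens:.3f} spec={spec:.3f} ppv={ppv:.3f} "
          f"npv={npv:.3f} kappa={kappa:.3f}")
    ```

    High sensitivity with a lower PPV, as in this toy example, is the typical pattern when the condition is rare relative to the database population.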

  15. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Müller Meinard

    2007-01-01

    Full Text Available One major goal of the structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music in which the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music, such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis that makes it possible to identify musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical feature to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with global variations. Our experimental results on classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.
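    Structure analysis methods of this kind typically start from a self-similarity matrix over a feature sequence, in which repeated segments appear as low-distance diagonal stripes. A toy sketch with random stand-in features (not the paper's invariance-enhanced features):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in feature sequence: one 12-dim vector per frame (chroma-like).
    # Frames 40-59 repeat frames 0-19, imitating a repeated section A...A.
    A = rng.random((20, 12))
    B = rng.random((20, 12))
    features = np.vstack([A, B, A])

    # Self-similarity matrix of pairwise Euclidean distances.
    diff = features[:, None, :] - features[None, :, :]
    ssm = np.linalg.norm(diff, axis=-1)

    # The repetition shows up as a low-distance diagonal at offset 40.
    repeat_diag = np.mean([ssm[i, i + 40] for i in range(20)])
    print(repeat_diag)  # 0.0: frames 0-19 are identical to frames 40-59
    ```

    In real audio the repeated section is never identical, which is exactly why the paper's robust features and enhanced local distance measure are needed to keep that diagonal visible.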

  16. A 600-ka Arctic sea-ice record from Mendeleev Ridge based on ostracodes

    Science.gov (United States)

    Cronin, Thomas M.; Polyak, L.V.; Reed, D.; Kandiano, E. S.; Marzen, R. E.; Council, E. A.

    2013-01-01

    Arctic paleoceanography and sea-ice history were reconstructed from epipelagic and benthic ostracodes from a sediment core (HLY0503-06JPC, 800 m water depth) located on the Mendeleev Ridge, Western Arctic Ocean. The calcareous microfaunal record (ostracodes and foraminifers) covers several glacial/interglacial cycles back to estimated Marine Isotope Stage 13 (MIS 13, ∼500 ka) with an average sedimentation rate of ∼0.5 cm/ka for most of the stratigraphy (MIS 5–13). Results based on ostracode assemblages and an unusual planktic foraminiferal assemblage in MIS 11 dominated by a temperate-water species Turborotalita egelida show that extreme interglacial warmth, high surface ocean productivity, and possibly open ocean convection characterized MIS 11 and MIS 13 (∼400 and 500 ka, respectively). A major shift in western Arctic Ocean environments toward perennial sea ice occurred after MIS 11 based on the distribution of an ice-dwelling ostracode Acetabulastoma arcticum. Spectral analyses of the ostracode assemblages indicate sea ice and mid-depth ocean circulation in western Arctic Ocean varied primarily at precessional (∼22 ka) and obliquity (∼40 ka) frequencies.

  17. Spatiotemporal dynamics of feature-based attention spread: evidence from combined electroencephalographic and magnetoencephalographic recordings.

    Science.gov (United States)

    Stoppel, Christian Michael; Boehler, Carsten Nicolas; Strumpf, Hendrik; Krebs, Ruth Marie; Heinze, Hans-Jochen; Hopf, Jens-Max; Schoenfeld, Mircea Ariel

    2012-07-11

    Attentional selection on the basis of nonspatial stimulus features induces a sensory gain enhancement by increasing the firing rate of individual neurons tuned to the attended feature, while the responses of neurons tuned to opposite feature values are suppressed. Here we recorded event-related potentials (ERPs) and magnetic fields (ERMFs) in human observers to investigate the underlying neural correlates of feature-based attention at the population level. During the task, subjects attended to a moving transparent surface presented in the left visual field, while task-irrelevant probe stimuli executing brief movements in varying directions were presented in the opposite visual field. ERP and ERMF amplitudes elicited by the unattended task-irrelevant probes were modulated as a function of the similarity between their movement direction and the task-relevant movement direction in the attended visual field. These activity modulations, which reflect globally enhanced processing of the attended feature, were observed to begin no earlier than 200 ms poststimulus and were localized to the motion-sensitive area hMT. The current results indicate that feature-based attention operates in a global manner but needs time to spread, and they provide strong support for the feature-similarity gain model.

  18. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t), and Fz(t) curves, automatically determines the extremes of these functions, and sets visual markers defining the individual points of interest. The positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated, and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t), and Fz(t) functions, the durations of the braking and propulsive phases, the duration of the double support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, the low time requirements allowing rapid analysis of a large number of trials, and the comparability of the variables obtained during different research measurements.
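    A rough sketch of this kind of processing (synthetic Fz(t) curves and one common symmetry index; the MATLAB script itself is not reproduced): locate the force extremes for each leg and compare their magnitudes.

    ```python
    import numpy as np

    # Synthetic vertical GRF over the stance phase of each leg, in units of
    # body weight (BW). A double-peaked Fz(t) curve is typical of walking.
    t = np.linspace(0, 1, 200)

    def fz(peak1, peak2):
        """Two Gaussian bumps standing in for loading and push-off peaks."""
        return (peak1 * np.exp(-((t - 0.25) / 0.1) ** 2)
                + peak2 * np.exp(-((t - 0.75) / 0.1) ** 2))

    fz_left = fz(1.10, 1.05)
    fz_right = fz(1.20, 1.00)

    # Extreme magnitudes, as the script's automatic markers would provide.
    peak_left = fz_left.max()
    peak_right = fz_right.max()

    # One common symmetry index: percentage difference of left vs right peaks
    # relative to their mean (0% = perfect symmetry).
    si = 200.0 * abs(peak_left - peak_right) / (peak_left + peak_right)
    print(f"peaks L={peak_left:.2f} R={peak_right:.2f} BW, "
          f"symmetry index={si:.1f}%")
    ```

    The standardized 78-variable set described above extends this idea to the timing and magnitude of every extreme of all three force components, plus phase durations and force impulses.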

  19. Maturity Matrices for Quality of Model- and Observation-Based Climate Data Records

    Science.gov (United States)

    Höck, Heinke; Kaiser-Weiss, Andrea; Kaspar, Frank; Stockhause, Martina; Toussaint, Frank; Lautenschlager, Michael

    2015-04-01

    In the field of Software Engineering, the Capability Maturity Model is used to evaluate and improve software development processes, and the application of a Maturity Matrix is a method to assess the degree of software maturity. This method was adapted to the maturity of Earth System data in scientific archives. The application of such an approach to Climate Data Records was first proposed in the context of satellite-based climate products and applied by NOAA and NASA. The European FP7 project CORE-CLIMAX suggested and tested extensions of the approach to make it applicable to additional climate datasets, e.g. those based on in-situ observations as well as model-based reanalyses. Within that project the concept was applied to products of satellite- and in-situ-based datasets. Examples are national ground-based data from Germany, as typical products of a national meteorological service, the EUMETSAT Satellite Application Facility Network, the ESA Climate Change Initiative, European reanalysis activities (ERA-CLIM) and international in-situ-based climatologies such as GPCC, ECA&D, BSRN and HadSST. Climate models and their related output have some additional characteristics that need specific consideration in such an approach. Here we use examples from the World Data Centre for Climate (WDCC) to discuss the applicability. The WDCC focuses on climate data products, specifically those resulting from climate simulations. Based on these existing Maturity Matrix models, WDCC developed a generic Quality Assessment System for Earth System data. A self-assessment is performed using a maturity matrix evaluating the data quality at five maturity levels with respect to the criteria of data and metadata consistency, completeness, accessibility and accuracy. The classical goals of a quality assessment system in a data processing workflow are: (1) to encourage data creators to improve quality to reach the next quality level, and (2) to enable data consumers to decide

  20. Analysis of Kuwait Temperature Records: Test of Heat Island Existence in Kuwait City Arid Environment.

    Science.gov (United States)

    Nasrallah, Hasan Ali

    Very few arid-land cities have been studied to determine the local climate effects of rapid urban growth in the twentieth century. Kuwait City in the State of Kuwait is examined to determine the significance of urban growth for heating in the region. The study examines recent changes in temperature for the State of Kuwait for the period 1958-1980. During this period, Kuwait experienced explosive urban growth, from a population of 0.2 million to 1.7 million. Simple parametric inferential statistics are applied to monthly temperature records from seven locations in and adjacent to Kuwait City. These tests are conducted to determine the connection between urbanization and the development of urban heating effects. The statistical tests employ a national "benchmark" desert site; a rural, agricultural benchmark site in the State of Kuwait; and stations in Bahrain, Eilat, Riyadh, Abadan, and Baghdad. The analysis illustrates that there is only a modest level of urban heating detectable in temperature records from the region of Kuwait. This finding runs counter to the prevailing literature on urban climatology, which generally states that urban heating depends strongly on urban extent and population growth. Upon inspection of the geographic location and surficial characteristics of Kuwait City, two hypotheses are suggested for the low-order urban heating detected: (1) cooling effects of Arabian Gulf air advected across the city, and (2) the lack of substantive spatial differences in surface albedo, thermal inertia, surface moisture, and aerosol heating. However, Kuwait's morphological (i.e., building geometry) characteristics, according to urban canyon-heat island theory, should have promoted a 7 °C heat island in Kuwait City. A test of this theory revealed no heat island of that magnitude. One major reason relates to the inadequacy of the station network for portraying the extent of Kuwait City's heat island development through time. More research, including modeling and

  1. Content analysis of physical examination templates in electronic health records using SNOMED CT.

    Science.gov (United States)

    Gøeg, Kirstine Rosenbeck; Chen, Rong; Højen, Anne Randorff; Elberg, Pia

    2014-10-01

    Most electronic health record (EHR) systems are built on proprietary information models and terminology, which makes achieving semantic interoperability a challenge. Solving interoperability problems requires well-defined standards. In contrast, the need to support clinical work practice requires local customization of EHR systems. Consequently, contrasting goals may be evident in EHR template design, because customization means that local EHR organizations can define their own templates, whereas standardization implies consensus at some level. To explore the complexity of balancing these two goals, this study analyzes the differences and similarities between templates in use today. A similarity analysis was developed on the basis of SNOMED CT and performed on four physical examination templates from Denmark and Sweden. The semantic relationships in SNOMED CT were used to quantify similarities and differences. Moreover, the analysis used the identified similarities to investigate the common content of a physical examination template. The analysis showed that there were both similarities and differences among the physical examination templates, whose size varied from 18 to 49 fields. In the SNOMED CT analysis, exact matches and terminology similarities were represented in all template pairs. The number of exact matches ranged from 7 to 24, and the proportion of unrelated fields varied widely, from 1/18 to 22/35. Cross-country comparisons tended to have more unrelated content than within-country comparisons. On the basis of the identified similarities, it was possible to define the common content of a physical examination. Nevertheless, a complete view of the physical examination required the inclusion of both exact matches and terminology similarities. This study revealed that a core set of items representing the physical examination templates can be generated when the analysis takes into account not only exact matches but also terminology

  2. A new methodology for estimating rainfall aggressiveness risk based on daily rainfall records for multi-decennial periods.

    Science.gov (United States)

    García-Barrón, Leoncio; Morales, Julia; Sousa, Arturo

    2018-02-15

    The temporal irregularity of rainfall, characteristic of a Mediterranean climate, corresponds to the irregularity of the environmental effects on soil. We used aggressiveness as an indicator to quantify the potential environmental impact of rainfall. However, quantifying rainfall aggressiveness is conditioned by the lack of sub-hourly frequency records on which intensity models are based. On the other hand, volume models are characterized by a lack of precision in the treatment of heavy rainfall events because they are based on monthly series. Therefore, in this study, we propose a new methodology for estimating rainfall aggressiveness risk. A new synthesis parameter based on reformulation using daily data of the Modified Fournier and Oliver's Precipitation Concentration indices is defined. The weighting of both indices for calculating the aggressiveness risk is established by multiple regression with respect to the local erosion R factor estimated in the last decades. We concluded that the proposed methodology overcomes the previously mentioned limitations of the traditional intensity and volume models and provides accurate information; therefore, it is appropriate for determining potential rainfall impact over long time periods. Specifically, we applied this methodology to the daily rainfall time series from the San Fernando Observatory (1870-2010) in southwest Europe. An interannual aggressiveness risk series was generated, which allowed analysis of its evolution and determination of the temporal variability. The results imply that environmental management can use data from long-term historical series as a reference for decision making. Copyright © 2017 Elsevier B.V. All rights reserved.
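    The two indices that the new parameter reformulates are classically computed from monthly totals; a sketch of the standard monthly formulations (the paper's daily-data reformulation and regression weighting are not reproduced):

    ```python
    def modified_fournier_index(monthly):
        """MFI = sum(p_i^2) / P over twelve monthly totals p_i,
        with P the annual total precipitation."""
        annual = sum(monthly)
        return sum(p * p for p in monthly) / annual

    def precipitation_concentration_index(monthly):
        """Oliver's PCI = 100 * sum(p_i^2) / P^2; higher values mean
        rainfall concentrated in fewer months."""
        annual = sum(monthly)
        return 100.0 * sum(p * p for p in monthly) / annual ** 2

    # Uniform rainfall gives the PCI minimum of 100/12.
    uniform = [50.0] * 12
    print(round(precipitation_concentration_index(uniform), 2))  # 8.33

    # Rainfall concentrated in a few months pushes both indices up.
    concentrated = [0, 0, 0, 0, 0, 0, 0, 0, 0, 200.0, 300.0, 100.0]
    print(round(precipitation_concentration_index(concentrated), 2))  # 38.89
    print(round(modified_fournier_index(concentrated), 1))
    ```

    The paper's contribution is to rebuild these volume-based indices from daily records, so that individual heavy-rainfall events are no longer averaged away within monthly totals.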

  3. Automatic BSS-based filtering of metallic interference in MEG recordings: definition and validation using simulated signals

    Science.gov (United States)

    Migliorelli, Carolina; Alonso, Joan F.; Romero, Sergio; Mañanas, Miguel A.; Nowak, Rafał; Russi, Antonio

    2015-08-01

    Objective. One of the principal drawbacks of magnetoencephalography (MEG) is its high sensitivity to metallic artifacts, which come from implanted intracranial electrodes and dental ferromagnetic prostheses and produce a strong distortion that masks cerebral activity. The aim of this study was to develop an automatic algorithm based on blind source separation (BSS) techniques to remove metallic artifacts from MEG signals. Approach. Three methods were evaluated: AMUSE, a second-order technique, and INFOMAX and FastICA, both based on higher-order statistics. Simulated signals consisting of real artifact-free data mixed with real metallic artifacts were generated to objectively evaluate the effectiveness of BSS and the subsequent interference reduction. A completely automatic detection of metallic-related components was proposed, exploiting the known characteristics of the metallic interference: regularity and low-frequency content. Main results. The automatic procedure was applied to the simulated datasets and the three methods exhibited different performances. Results indicated that AMUSE preserved, and consequently recovered, more brain activity than INFOMAX and FastICA. The normalized mean squared error for the AMUSE decomposition remained below 2%, allowing an effective removal of artifactual components. Significance. To date, the performance of automatic artifact reduction had not been evaluated in MEG recordings. The proposed methodology is based on an automatic algorithm that provides effective interference removal. This approach can be applied as a processing step to any MEG dataset affected by metallic artifacts, allowing further analysis of otherwise unusable or poor-quality data.
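    AMUSE, the best-performing method here, is compact enough to sketch (a toy illustration on synthetic two-channel signals, not the authors' pipeline): whiten the data, diagonalize a symmetrized time-lagged covariance, then flag the artifact component by its regular, low-frequency content, echoing the detection criterion described above.

    ```python
    import numpy as np

    def amuse(X, tau=1):
        """AMUSE: second-order blind source separation.
        X is (channels, samples); returns (sources, estimated mixing)."""
        X = X - X.mean(axis=1, keepdims=True)
        # Whiten using the zero-lag covariance.
        d, E = np.linalg.eigh(X @ X.T / X.shape[1])
        Q = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
        Z = Q @ X
        # Rotation from the symmetrized time-lagged covariance.
        C = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
        _, V = np.linalg.eigh((C + C.T) / 2)
        S = V.T @ Z
        return S, np.linalg.pinv(V.T @ Q)

    # Synthetic two-channel "MEG": a slow, regular artifact-like wave mixed
    # with a faster brain-like signal (all names and data illustrative).
    rng = np.random.default_rng(1)
    t = np.arange(5000) / 1000.0
    artifact = np.sin(2 * np.pi * 0.7 * t)
    brain = np.sin(2 * np.pi * 23.0 * t) + 0.3 * rng.standard_normal(t.size)
    X = np.array([[1.0, 0.8], [0.6, -1.0]]) @ np.vstack([artifact, brain])

    S, A = amuse(X)
    # Flag the artifactual component by its dominant low-frequency content.
    power = np.abs(np.fft.rfft(S, axis=1)) ** 2
    freqs = np.fft.rfftfreq(S.shape[1], d=1e-3)
    low_frac = power[:, freqs < 2.0].sum(axis=1) / power.sum(axis=1)
    bad = int(np.argmax(low_frac))

    # Zero the artifact component and project back to channel space.
    S_clean = S.copy()
    S_clean[bad] = 0.0
    X_clean = A @ S_clean
    ```

    Second-order methods like AMUSE separate sources by their temporal structure rather than non-Gaussianity, which fits the paper's finding that it handled the slow, regular metallic interference best.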

  4. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    Science.gov (United States)

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio on the .NET Framework 4.5 development environment. Accepted input formats are HDF5, level 5 MAT, and text files containing recorded or generated spike-train time series. SPICODYN processes such electrophysiological signals with a focus on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented that deals with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing automated processing in those specific cases. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows accurate and rapid analysis of multiple spike trains from thousands of electrodes.
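
    The delayed transfer entropy mentioned here can be sketched for binary spike trains with a simple plug-in estimator; the toy lagged-copy example is an illustrative assumption, not SPICODYN's optimized implementation.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, delay=1):
    """Plug-in delayed transfer entropy TE(Y -> X) in bits for binary trains:
    TE = sum p(x_{t+1}, x_t, y_{t-d}) * log2( p(x_{t+1}|x_t, y_{t-d}) / p(x_{t+1}|x_t) )."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    n = len(x)
    x1 = x[delay + 1:]           # x at t+1
    x0 = x[delay:n - 1]          # x at t
    yd = y[:n - 1 - delay]       # y at t - delay
    triples = Counter(zip(x1, x0, yd))
    pair_xy = Counter(zip(x0, yd))
    pair_xx = Counter(zip(x1, x0))
    marg = Counter(x0)
    total = len(x1)
    te = 0.0
    for (a, b, c), n_abc in triples.items():
        p_abc = n_abc / total
        p_a_bc = n_abc / pair_xy[(b, c)]         # p(x_{t+1} | x_t, y_{t-d})
        p_a_b = pair_xx[(a, b)] / marg[b]        # p(x_{t+1} | x_t)
        te += p_abc * np.log2(p_a_bc / p_a_b)
    return te

# Toy check: x is a lagged copy of y, i.e. x[t+1] = y[t-2], so the
# information flow y -> x is detected only at the matching delay of 2.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 4000)
x = np.roll(y, 3)
te_true = transfer_entropy(x, y, delay=2)   # matched delay: ~1 bit
te_off = transfer_entropy(x, y, delay=1)    # mismatched delay: ~0 bits
```

    Scanning `delay` over a range of lags, as the temporal extension does, turns this into a profile from which the peak delay can be read off.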

  5. Movement based artifacts may contaminate extracellular electrical recordings from GI muscles.

    Science.gov (United States)

    Bayguinov, O; Hennig, G W; Sanders, K M

    2011-11-01

    Electrical slow waves drive peristaltic contractions in the stomach and facilitate gastric emptying. In gastroparesis and other disorders associated with altered gastric emptying, motility defects have been related to altered slow-wave frequency and disordered propagation. Experimental and clinical measurements of slow waves are made with extracellular or abdominal surface recordings. We tested the consequences of muscle contractions and movement on biopotentials recorded from murine gastric muscles with array electrodes and pairs of silver electrodes. Propagating biopotentials were readily recorded from gastric sheets composed of the entire murine stomach. The biopotentials were completely blocked by nifedipine (2 μmol L⁻¹), which blocked contractile movements and peristaltic contractions. Wortmannin, an inhibitor of myosin light chain kinase, also blocked contractions and biopotentials. Stimulation of muscles with carbachol increased the frequency of biopotentials in control conditions but failed to elicit biopotentials with nifedipine or wortmannin present. Intracellular recording with microelectrodes showed that authentic gastric slow waves typically occur at a faster frequency than biopotentials recorded with extracellular electrodes, and electrical slow waves recorded with intracellular electrodes were unaffected by suppression of movement. Electrical transients, equal in amplitude to biopotentials recorded with extracellular electrodes, were induced by movements produced by small transient stretches (artifacts in extracellular recordings of biopotentials from murine gastric muscles and suggest that movement suppression should be an obligatory control when monitoring electrical activity and characterizing propagation and coordination of electrical events with extracellular recording techniques. © 2011 Blackwell Publishing Ltd.

  6. Provider and Patient Determinants of Generic Levothyroxine Prescribing: An Electronic Health Records-Based Study.

    Science.gov (United States)

    Romanelli, Robert J; Nimbal, Vani; Dutcher, Sarah K; Pu, Xia; Segal, Jodi B

    2017-08-01

    Despite the availability of generic levothyroxine products for more than a decade, uptake of these products is poor. We sought to evaluate determinants of generic prescribing of levothyroxine. In a cross-sectional analysis of electronic health records data between 2010 and 2013, we identified adult patients with a levothyroxine prescription from a primary-care physician (PCP) or endocrinologist. We used mixed-effect logistic regression models with random intercepts for prescribing provider to examine predictors of generic levothyroxine prescribing. Models include patient, prescription, and provider fixed-effect covariates. Odds ratios (ORs) and 95% CIs were generated. Between-provider random variation was quantified by the intraclass correlation coefficient (ICC). Study patients (n = 63 838) were clustered among 941 prescribing providers within 25 ambulatory care clinics. The overall prevalence of generic prescribing of levothyroxine was 73%. In the multivariable mixed-effect model, patients were significantly less likely to receive generic levothyroxine from an endocrinologist than a PCP (OR = 0.43; 95% CI = 0.33-0.55; P levothyroxine than men from endocrinologists (OR = 0.68; 95% CI = 0.59-0.78; P levothyroxine prescribing differed by PCPs and endocrinologists. Residual variation in generic prescribing, after accounting for measurable factors, indicates the need for provider interventions or patient education aimed at improving levothyroxine generic uptake.
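
    The between-provider variation quantified here by the intraclass correlation coefficient can be illustrated for a random-intercept logistic model with the standard latent-scale formula; the variance value below is hypothetical, not a figure reported in the study.

```python
import math

def logistic_icc(random_intercept_variance):
    """Latent-scale ICC for a random-intercept logistic model:
    ICC = sigma_u^2 / (sigma_u^2 + pi^2 / 3), where pi^2/3 is the
    variance of the standard logistic level-1 residual distribution."""
    residual_variance = math.pi ** 2 / 3
    return random_intercept_variance / (random_intercept_variance + residual_variance)

# Hypothetical between-provider (random-intercept) variance, for illustration
icc = logistic_icc(1.2)
```

    A larger random-intercept variance pushes the ICC toward 1, meaning prescribing behavior clusters strongly by provider rather than by patient characteristics.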

  7. The effect of electronic health record software design on resident documentation and compliance with evidence-based medicine.

    Directory of Open Access Journals (Sweden)

    Yasaira Rodriguez Torres

    Full Text Available This study aimed to determine the role of electronic health record software in resident education by evaluating documentation of 30 elements extracted from the American Academy of Ophthalmology Dry Eye Syndrome Preferred Practice Pattern. The Kresge Eye Institute transitioned to using electronic health record software in June 2013. We evaluated the charts of 331 patients examined in the resident ophthalmology clinic between September 1, 2011, and March 31, 2014, for an initial evaluation for dry eye syndrome. We compared documentation rates for the 30 evidence-based elements across electronic health record chart note templates among the ophthalmology residents. Overall, significant changes in documentation occurred when transitioning to a new version of the electronic health record software, with average compliance ranging from 67.4% to 73.6% (p 90% in 13 elements, while Electronic Health Record B had high compliance (>90%) in 11 elements. The presence of dialog boxes was responsible for significant changes in documentation of adnexa, puncta, proptosis, skin examination, contact lens wear, and smoking exposure. Significant differences in documentation were correlated with electronic health record template design rather than with the individual resident or the residents' year in training. Our results show that electronic health record template design influences documentation across all resident years. Decreased documentation likely results from "mouse click fatigue," as residents had to access multiple dialog boxes to complete documentation. These findings highlight the importance of EHR template design for improving resident documentation and the integration of evidence-based medicine into clinical notes.

  8. Factors that influence the efficiency of beef and dairy cattle recording system in Kenya: A SWOT-AHP analysis.

    Science.gov (United States)

    Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J

    2011-01-01

    Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by strengths, weaknesses, opportunities, and threats-analytic hierarchy process (SWOT-AHP) analysis, and priority scores quantifying each factor's relative importance to the system were calculated using the eigenvalue method. For identification and registration, and for evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strength factors could not sustain the required efficiency of the system, and its weaknesses predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies, such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and developing institutions to support recording, were feasible.
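
    The eigenvalue method used to turn SWOT-AHP pairwise comparisons into priority scores can be sketched as follows; the 3×3 comparison matrix is hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three factors
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority scores = normalized principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
priorities = w / w.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1); RI = 0.58 for n = 3,
# and CR = CI/RI below 0.1 indicates acceptable judgment consistency.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58
```

    The resulting `priorities` vector ranks the factors; the same computation is repeated per SWOT group and combined across the hierarchy in a full AHP analysis.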

  9. A knowledge-based taxonomy of critical factors for adopting electronic health record systems by physicians: a systematic literature review.

    Science.gov (United States)

    Castillo, Víctor H; Martínez-García, Ana I; Pulido, J R G

    2010-10-15

    The health care sector is an area of social and economic interest in several countries; therefore, there have been many efforts in the use of electronic health records. Nevertheless, there is evidence suggesting that these systems have not been adopted as expected, and although there are some proposals to support their adoption, the proposed support is not provided by means of information and communication technology, which could supply automatic tools of support. The aim of this study is to identify the critical adoption factors for electronic health records by physicians and to use them as a guide to support their adoption process automatically. This paper presents, based on the PRISMA statement, a systematic literature review, in electronic databases, of adoption studies of electronic health records published in English. Software applications that manage and process the data in the electronic health record have been considered, i.e., computerized physician prescription, electronic medical records, and electronic capture of clinical data. Our review was conducted with the purpose of obtaining a taxonomy of the physicians' main barriers to adopting electronic health records that can be addressed by means of information and communication technology, in particular with the information technology roles of the knowledge management processes. This takes us to the question that we want to address in this work: "What are the critical adoption factors of electronic health records that can be supported by information and communication technology?". Reports from eight databases covering electronic health record adoption studies in the medical domain, in particular those focused on physicians, were analyzed. The review identifies two main issues: 1) a knowledge-based classification of critical factors for adopting electronic health records by physicians; and 2) the definition of a base for the design of a conceptual framework for supporting the design of knowledge-based

  10. A knowledge-based taxonomy of critical factors for adopting electronic health record systems by physicians: a systematic literature review

    Directory of Open Access Journals (Sweden)

    Martínez-García Ana I

    2010-10-01

    Full Text Available Abstract Background The health care sector is an area of social and economic interest in several countries; therefore, there have been many efforts in the use of electronic health records. Nevertheless, there is evidence suggesting that these systems have not been adopted as expected, and although there are some proposals to support their adoption, the proposed support is not provided by means of information and communication technology, which could supply automatic tools of support. The aim of this study is to identify the critical adoption factors for electronic health records by physicians and to use them as a guide to support their adoption process automatically. Methods This paper presents, based on the PRISMA statement, a systematic literature review, in electronic databases, of adoption studies of electronic health records published in English. Software applications that manage and process the data in the electronic health record have been considered, i.e., computerized physician prescription, electronic medical records, and electronic capture of clinical data. Our review was conducted with the purpose of obtaining a taxonomy of the physicians' main barriers to adopting electronic health records that can be addressed by means of information and communication technology, in particular with the information technology roles of the knowledge management processes. This takes us to the question that we want to address in this work: "What are the critical adoption factors of electronic health records that can be supported by information and communication technology?". Reports from eight databases covering electronic health record adoption studies in the medical domain, in particular those focused on physicians, were analyzed. Results The review identifies two main issues: 1) a knowledge-based classification of critical factors for adopting electronic health records by physicians; and 2) the definition of a base for the design of a conceptual

  11. Formal definition and dating of the GSSP (Global Stratotype Section and Point) for the base of the Holocene using the Greenland NGRIP ice core, and selected auxiliary records

    DEFF Research Database (Denmark)

    Walker, Mike; Johnsen, Sigfus Johann; Rasmussen, Sune Olander

    2009-01-01

    The Greenland ice core from NorthGRIP (NGRIP) contains a proxy climate record across the Pleistocene-Holocene boundary of unprecedented clarity and resolution. Analysis of an array of physical and chemical parameters within the ice enables the base of the Holocene, as reflected in the first signs...... species, and annual layer thickness. A timescale based on multi-parameter annual layer counting provides an age of 11,700 calendar yr b2k (before AD 2000) for the base of the Holocene, with a maximum counting error of 99 yr. A proposal that an archived core from this unique sequence should constitute

  12. An Analysis of Off Record Strategies Reflecting Politeness Implicature in “Oprah Winfrey Show”

    OpenAIRE

    Yanti, Rahma

    2017-01-01

    This thesis discusses off-record strategies that reflect politeness implicatures in conversation. The off-record strategy is one of the five politeness strategies, and it concerns language used in indirect forms. The object of the research is the off-record strategies that reflect politeness implicatures in a famous American talk show, the "Oprah Winfrey Show". The data were collected using observational methods, with a non-participant conversation-observation technique, where the author ...

  13. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    Directory of Open Access Journals (Sweden)

    Ivo Martiník

    2015-01-01

    Full Text Available Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.

  14. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory.

    Science.gov (United States)

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.

  15. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.

  16. Global rainfall erosivity assessment based on high-temporal resolution rainfall records.

    Science.gov (United States)

    Panagos, Panos; Borrelli, Pasquale; Meusburger, Katrin; Yu, Bofu; Klik, Andreas; Jae Lim, Kyoung; Yang, Jae E; Ni, Jinren; Miao, Chiyuan; Chattopadhyay, Nabansu; Sadeghi, Seyed Hamidreza; Hazbavi, Zeinab; Zabihi, Mohsen; Larionov, Gennady A; Krasnov, Sergey F; Gorobets, Andrey V; Levi, Yoav; Erpul, Gunay; Birkel, Christian; Hoyos, Natalia; Naipal, Victoria; Oliveira, Paulo Tarso S; Bonilla, Carlos A; Meddi, Mohamed; Nel, Werner; Al Dashti, Hassan; Boni, Martino; Diodato, Nazzareno; Van Oost, Kristof; Nearing, Mark; Ballabio, Cristiano

    2017-06-23

    The exposure of the Earth's surface to the energetic input of rainfall is one of the key factors controlling water erosion. While water erosion is identified as the most serious cause of soil degradation globally, global patterns of rainfall erosivity remain poorly quantified and estimates have large uncertainties. This hampers the implementation of effective soil degradation mitigation and restoration strategies. Quantifying rainfall erosivity is challenging as it requires high-temporal-resolution rainfall recordings. We present the results of an extensive global data collection effort whereby we estimated rainfall erosivity for 3,625 stations covering 63 countries. This first-ever Global Rainfall Erosivity Database was used to develop a global erosivity map at 30 arc-seconds (~1 km) resolution based on Gaussian Process Regression (GPR). Globally, the mean rainfall erosivity was estimated to be 2,190 MJ mm ha⁻¹ h⁻¹ yr⁻¹, with the highest values in South America and the Caribbean countries, Central-East Africa, and South-East Asia. The lowest values are mainly found in Canada, the Russian Federation, Northern Europe, Northern Africa, and the Middle East. The tropical climate zone has the highest mean rainfall erosivity, followed by the temperate zone, whereas the lowest mean was estimated in the cold climate zone.
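
    A Gaussian Process Regression interpolation of station erosivity values, in the spirit of the mapping step described above, might look like the sketch below; the station coordinates, erosivity values, and kernel choices are invented for illustration and are not the study's data or configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Hypothetical stations: (x, y) coordinates and R-factor values
# (MJ mm ha^-1 h^-1 yr^-1); values are synthetic placeholders.
coords = rng.uniform(0, 10, size=(40, 2))
erosivity = 2000 + 300 * np.sin(coords[:, 0]) + rng.normal(0, 50, 40)

# RBF kernel for smooth spatial structure plus a white-noise term
gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=2.0) + WhiteKernel(noise_level=1.0),
    normalize_y=True,
).fit(coords, erosivity)

# Predict on a coarse grid, a stand-in for the 30 arc-second global grid
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
pred, std = gpr.predict(grid, return_std=True)
```

    Besides the interpolated surface, `std` gives the per-cell predictive uncertainty, which is what makes GPR attractive for mapping sparse station networks.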

  17. Late Eocene siliceous sponge fauna of southern Australia: reconstruction based on loose spicules record.

    Science.gov (United States)

    Łukowiak, Magdalena

    2015-02-09

    An abundant and diversified assemblage of loose siliceous sponge spicules has been identified in the Late Eocene deposits cropping out along the southern coasts of Australia. Based on comparison of the obtained spicules with those of living sponges, representatives of at least 43 species within 33 genera, 26 families, and 9 orders of "soft" Demospongiae and Homoscleromorpha have been identified in the assemblage. Within the studied sediments, spicules representing the demosponge orders Poecilosclerida, Hadromerida, and Astrophorida were the most diverse. The remaining five demosponge orders (Halichondrida, Agelasida, Haplosclerida, Spirophorida, and Chondrosida) are represented by single families. A single family, Plakinidae, within the class Homoscleromorpha, comprising two genera, was also present. The diversity of spicules is similar in all studied samples and areas, even those geographically distant, and there are only minor differences between the sections, indicating the homogeneous character of this rich siliceous sponge assemblage. Most of the studied sponge spicules have Recent equivalents among present-day siliceous spicules; however, the fossil ones are bigger, most likely due to different environmental conditions. Among the recognized sponge species, at least eleven (Agelas cf. axifera, Agelas cf. wiedenmayeri, Penares sclerobesa, Histodermella australis, Trikentrion flabelliforme, Cliona cf. mucronata, Tethya cf. omanensis, Terpios sp., Placinolopha cf. sarai, Dotona pulchella, and Sigmosceptrella quadrilobata) are noted for the first time in the fossil record.

  18. Real-time 3D shape recording by DLP-based all-digital surface encoding

    Science.gov (United States)

    Höfling, Roland; Aswendt, Petra

    2009-02-01

    The use of computer-generated sinusoidal fringe patterns has found wide acceptance in optical metrology. Corresponding software solutions reconstruct the phase field encoded in the fringe pattern in order to obtain 3D shape data via triangulation and deflection measuring setups, respectively. Short recording time is an issue of high importance for all tasks on the factory shop floor, as well as in medical applications and in security. Recent high-speed implementations take advantage of MEMS-based spatial light modulators, and the digital micromirror chipset DLP Discovery™ is the fastest proven component currently available for this purpose. Being a bi-stable on-off system, sinusoidal gray-level pictures are generated by controlling the mirrors' ON-time period, during which an analogue detector is exposed. This digital generation of light intensity distributions provides outstanding precision and long-term stability. It is used in leading-edge technology solutions that produce video-type streams of 3D surface data with a sustained repetition rate of 40 Hz. A new proposal is discussed in this paper that goes beyond this state of the art by considering the optical encoding of the surface as an all-digital communication link. After a brief classification of state-of-the-art systems, the authors describe how future all-digital encoding leads to extremely high speed and precision in 3D shape acquisition.
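
    Fringe-projection systems of the kind described recover surface phase from phase-shifted sinusoidal patterns. The sketch below simulates the standard four-step algorithm, phi = atan2(I4 - I2, I1 - I3), on a synthetic phase map; it illustrates the general phase-retrieval principle, not the authors' DLP-specific all-digital encoding.

```python
import numpy as np

# Synthetic surface phase map (values are illustrative)
h, w = 64, 64
x = np.linspace(0, 4 * np.pi, w)
phi_true = np.tile(x, (h, 1)) + 0.5 * np.sin(np.linspace(0, np.pi, h))[:, None]

# Four projected fringe images with pi/2 phase shifts:
# I_k = a + b * cos(phi + delta_k)
a, b = 128.0, 100.0                       # background and modulation intensity
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
i1, i2, i3, i4 = [a + b * np.cos(phi_true + s) for s in shifts]

# Four-step phase retrieval; the result is wrapped into (-pi, pi]
phi_wrapped = np.arctan2(i4 - i2, i1 - i3)

# Wrapped residual between recovered and true phase (should be ~0)
err = np.angle(np.exp(1j * (phi_wrapped - phi_true)))
```

    In a real setup `phi_wrapped` would then be unwrapped and converted to height via the triangulation geometry; the four-frame formula cancels both the background `a` and the modulation `b`.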

  19. Enhancing electronic health record usability in pediatric patient care: a scenario-based approach.

    Science.gov (United States)

    Patterson, Emily S; Zhang, Jiajie; Abbott, Patricia; Gibbons, Michael C; Lowry, Svetlana Z; Quinn, Matthew T; Ramaiah, Mala; Brick, David

    2013-03-01

    Usability of electronic health records (EHRs) is an important factor affecting patient safety and the EHR adoption rate for both adult and pediatric care providers. A panel of interdisciplinary experts (the authors) was convened by the National Institute of Standards and Technology to generate consensus recommendations to improve EHR usefulness, usability, and patient safety when supporting pediatric care, with a focus on critical user interactions. The panel members represented expertise in the disciplines of human factors engineering (HFE), usability, informatics, and pediatrics in ambulatory care and pediatric intensive care. An iterative, scenario-based approach was used to identify unique considerations in pediatric care and relevant human factors concepts. A draft of the recommendations was reviewed by invited experts in pediatric informatics, emergency medicine, neonatology, pediatrics, HFE, nursing, usability engineering, and software development and implementation. From the original 54 recommendations, recommendations for EHR developers, small-group pediatric medical practices, and children's hospitals were identified in nine critical user-interaction categories: patient identification, medications, alerts, growth chart, vaccinations, labs, newborn care, privacy, and radiology. Pediatric patient care has unique dimensions, with great complexity and high stakes for adverse events. The recommendations are anticipated to increase the rate of EHR adoption by pediatric care providers and improve patient safety for pediatric patients. The described methodology might be useful for accelerating adoption and increasing safety in a variety of clinical areas where the adoption of EHRs is lagging or where usability issues are believed to reduce potential patient safety, efficiency, and quality benefits.

  20. A tutorial on activity-based costing of electronic health records.

    Science.gov (United States)

    Federowicz, Marie H; Grossman, Mila N; Hayes, Bryant J; Riggs, Joseph

    2010-01-01

    As the American Recovery and Reinvestment Act of 2009 allocates $19 billion to health information technology, it will be useful for health care managers to project the true cost of implementing an electronic health record (EHR). This study presents a step-by-step guide for using activity-based costing (ABC) to estimate the cost of an EHR. ABC is a cost accounting method with a "top-down" approach for estimating the cost of a project or service within an organization. The total cost to implement an EHR includes obvious costs, such as licensing fees, and hidden costs, such as the impact on productivity. Unlike other methods, ABC includes all of the organization's expenditures and is less likely to miss hidden costs. Although ABC is used considerably in manufacturing and other industries, it is a relatively new phenomenon in health care. ABC is a comprehensive approach that the health care field can use to analyze the cost-effectiveness of implementing EHRs. In this article, ABC is applied to a health clinic that recently implemented an EHR, and the clinic is found to be more productive after EHR implementation. This methodology can help health care administrators assess the impact of a stimulus investment on organizational performance.
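
    The core ABC computation, multiplying each activity's cost-driver volume by its driver rate and summing, can be sketched as follows; all activities, rates, and volumes are hypothetical, not figures from the clinic in the article.

```python
# Hypothetical EHR-implementation activities, each with a driver rate and a
# cost-driver volume; hidden costs (e.g. lost productivity) are included
# alongside obvious ones, as ABC requires.
activities = {
    "licensing":         {"rate": 400.0, "driver": 25},       # per user
    "training":          {"rate": 55.0,  "driver": 320},      # per staff-hour
    "data_migration":    {"rate": 0.02,  "driver": 900_000},  # per record
    "lost_productivity": {"rate": 38.0,  "driver": 1_100},    # per clinician-hour
}

def abc_costs(acts):
    """Activity-based cost per activity: driver volume x driver rate."""
    return {name: a["rate"] * a["driver"] for name, a in acts.items()}

costs = abc_costs(activities)
total = sum(costs.values())
```

    With these illustrative inputs, the per-activity breakdown makes the hidden lost-productivity line item visible next to the licensing fee, which is the main advantage of ABC over lump-sum budgeting.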

  1. Evidence-Based Guidelines for Interface Design for Data Entry in Electronic Health Records.

    Science.gov (United States)

    Wilbanks, Bryan A; Moss, Jacqueline

    2018-01-01

    Electronic health records use a variety of data entry methods that are often customized to clinician needs. Data entry interfaces must be appropriately designed to maximize benefits and minimize unintended consequences. There was relatively little evidence in the literature to guide the selection of specific data entry methods according to the type of data documented. This literature review summarizes existing data entry design recommendations to guide data entry interface design. Structured data entry uses predefined charting elements to limit acceptable data entry to standard coded data and improve completeness and data reuse at the expense of correctness. Unstructured data entry methods use natural language and improve correctness, at the expense of completeness and data reusability. Semistructured data entry uses a combination of these data entry methods to complement the strengths and minimize the weaknesses of each method. Documentation quality is influenced by the method of data entry. It is important to choose data entry methods based on the type of data to be documented. This literature review summarizes data entry design guidelines to inform clinical practice and future research.

  2. A Recording-Based Method for Auralization of Rotorcraft Flyover Noise

    Science.gov (United States)

    Pera, Nicholas M.; Rizzi, Stephen A.; Krishnamurthy, Siddhartha; Fuller, Christopher R.; Christian, Andrew

    2018-01-01

    Rotorcraft noise is an active field of study as the sound produced by these vehicles is often found to be annoying. A means to auralize rotorcraft flyover noise is sought to help understand the factors leading to annoyance. Previous work by the authors focused on auralization of rotorcraft fly-in noise, in which a simplification was made that enabled the source noise synthesis to be based on a single emission angle. Here, the goal is to auralize a complete flyover event, so the source noise synthesis must be capable of traversing a range of emission angles. The synthesis uses a source noise definition process that yields periodic and aperiodic (modulation) components at a set of discrete emission angles. In this work, only the periodic components are used for the source noise synthesis for the flyover; the inclusion of modulation components is the subject of ongoing research. Propagation of the synthesized source noise to a ground observer is performed using the NASA Auralization Framework. The method is demonstrated using ground recordings from a flight test of the AS350 helicopter for the source noise definition.
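
    The periodic-component synthesis described above can be sketched as additive synthesis of harmonics of the blade-passage frequency; the frequency, amplitudes, and phases below are illustrative placeholders, not AS350 flight-test values, and a full implementation would interpolate them across emission angles.

```python
import numpy as np

fs = 44_100
t = np.arange(fs) / fs                      # one second of audio

# Hypothetical blade-passage frequency and per-harmonic amplitudes/phases
# for a single emission angle (illustrative values only)
bpf = 23.4                                  # Hz
amps = [1.0, 0.6, 0.35, 0.2, 0.1]           # relative harmonic amplitudes
phases = [0.0, 0.4, 1.1, 2.0, 2.7]          # radians

# Additive synthesis: sum of sinusoids at integer multiples of the BPF
signal = sum(a * np.sin(2 * np.pi * bpf * (k + 1) * t + p)
             for k, (a, p) in enumerate(zip(amps, phases)))
signal /= np.max(np.abs(signal))            # normalize to [-1, 1]
```

    Traversing a flyover would cross-fade between harmonic sets defined at successive emission angles before the propagation step adds Doppler shift, spreading loss, and atmospheric absorption.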

  3. Complex Wavelet Based Modulation Analysis

    DEFF Research Database (Denmark)

    Luneau, Jean-Marc; Lebrun, Jérôme; Jensen, Søren Holdt

    2008-01-01

    Low-frequency modulation of sound carries important information for speech and music. The modulation spectrum is commonly obtained by spectral analysis of the sole temporal envelopes of the sub-bands out of a time-frequency analysis. Processing in this domain usually creates undesirable distortions...... polynomial trends. Moreover, an analytic Hilbert-like transform is possible with complex wavelets implemented as an orthogonal filter bank. By working in an alternative transform domain coined as "Modulation Subbands", this transform shows very promising denoising capabilities and suggests new approaches for joint...
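
    The modulation-spectrum construction referred to above, spectral analysis of a sub-band temporal envelope, can be sketched with a bandpass filter and a Hilbert envelope (a complex wavelet filter bank would replace that pair in the paper's approach); the carrier and modulation rates below are arbitrary test values.

```python
import numpy as np
from scipy.signal import hilbert, butter, sosfilt

fs = 8_000
t = np.arange(2 * fs) / fs                 # two seconds

# Test signal: 1 kHz carrier amplitude-modulated at 4 Hz (syllable-like rate)
x = (1 + 0.8 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 1000 * t)

# One analysis sub-band around the carrier, then its temporal envelope
sos = butter(4, [800, 1200], btype="bandpass", fs=fs, output="sos")
subband = sosfilt(sos, x)
envelope = np.abs(hilbert(subband))

# Modulation spectrum: spectral analysis of the mean-removed envelope
env = envelope - envelope.mean()
spec = np.abs(np.fft.rfft(env))
freqs = np.fft.rfftfreq(env.size, 1 / fs)
peak_mod_freq = freqs[np.argmax(spec)]
```

    The peak of `spec` sits at the 4 Hz modulation rate; repeating this per sub-band yields the two-dimensional (acoustic frequency x modulation frequency) representation the abstract builds on.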

  4. 77 FR 5781 - Record of Decision for the Air Space Training Initiative Shaw Air Force Base, South Carolina...

    Science.gov (United States)

    2012-02-06

    ... DEPARTMENT OF DEFENSE Department of the Air Force Record of Decision for the Air Space Training Initiative Shaw Air Force Base, South Carolina Final Environmental Impact Statement ACTION: Notice of... signed the ROD for the Airspace Training Initiative Shaw Air Force Base, South Carolina Final...

  5. Global Trend Analysis of Multi-decade Soil Temperature Records Show Soils Resistant to Warming

    Science.gov (United States)

    Frey, S. D.; Jennings, K.

    2017-12-01

    Soil temperature is an important determinant of many subterranean ecological processes including plant growth, nutrient cycling, and carbon sequestration. Soils are expected to warm in response to increasing global surface temperatures; however, despite the importance of soil temperature to ecosystem processes, less attention has been given to examining changes in soil temperature over time. We collected long-term (> 20 years) soil temperature records from approximately 50 sites globally, many with multiple depths (5-100 cm), and examined temperature trends over the last few decades. For each site and depth we calculated annual summer means and conducted non-parametric Mann-Kendall trend and Sen slope analysis to assess changes in summer soil temperature over the length of each time series. The mean summer soil temperature trend across all sites and depths was not significantly different from zero (mean = 0.004 °C year-1 ± 0.033 SD), suggesting that soils have not warmed over the observation period. For the subset of sites that do exhibit significant temperature increases over time, neither site location, depth of measurement, time series length, start date, nor end date appears to be related to trend strength. These results provide evidence that the thermal regime of soils may have a stronger buffering capacity than expected, having important implications for the global carbon cycle and feedbacks to climate change.
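
    The Mann-Kendall test statistic and Sen slope used above are straightforward to implement. The sketch below is an illustrative pure-NumPy version, not the study's analysis code; the series lengths and values are made up.

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall S statistic and Sen's slope for a 1-D series.

    S > 0 suggests an increasing trend; the Sen slope is the median of
    all pairwise slopes, in units of x per time step.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    slopes = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += np.sign(x[j] - x[i])
            slopes.append((x[j] - x[i]) / (j - i))
    return s, float(np.median(slopes))

# Illustrative series: a flat summer-mean record with noise (mirroring
# the near-zero mean trend reported above) and a clearly warming one.
rng = np.random.default_rng(0)
flat = 12.0 + rng.normal(0.0, 0.3, 30)
rising = 12.0 + 0.05 * np.arange(30) + rng.normal(0.0, 0.1, 30)
s_flat, slope_flat = mann_kendall(flat)
s_rise, slope_rise = mann_kendall(rising)
```

    For the warming series the Sen slope recovers roughly the imposed 0.05 °C per step, while the flat series yields a slope near zero.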

  6. Iteratively Reweighted Least Squares Algorithm for Sparse Principal Component Analysis with Application to Voting Records

    Directory of Open Access Journals (Sweden)

    Tomáš Masák

    2017-09-01

    Full Text Available Principal component analysis (PCA) is a popular dimensionality reduction and data visualization method. Sparse PCA (SPCA) is its extensively studied and NP-hard-to-solve modification. In the past decade, many different algorithms were proposed to perform SPCA. We build upon the work of Zou et al. (2006), who recast the SPCA problem into the regression framework and proposed to induce sparsity with the l1 penalty. Instead, we propose to drop the l1 penalty and promote sparsity by re-weighting the l2-norm. Our algorithm thus consists mainly of solving weighted ridge regression problems. We show that the algorithm basically attempts to find a solution to a penalized least squares problem with a non-convex penalty that resembles the l0-norm more closely. We also apply the algorithm to analyze the voting records of the Chamber of Deputies of the Parliament of the Czech Republic. We show not only why the SPCA is more appropriate to analyze this type of data, but we also discuss whether the variable selection property can be utilized as an additional piece of information, for example to create voting calculators automatically.
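
    The core step described above, repeatedly solving a weighted ridge problem whose per-coefficient weights are recomputed from the current estimate, can be sketched as follows. This is an illustrative reweighted-l2 regression on a toy problem, not the authors' SPCA implementation; `lam` and `eps` are assumed tuning constants.

```python
import numpy as np

def reweighted_ridge(X, y, lam=0.5, eps=1e-6, n_iter=50):
    """Sparse regression via iteratively reweighted l2 (ridge) penalties.

    Each pass solves a weighted ridge problem in which coefficient j is
    penalized by lam / (b_j**2 + eps): small coefficients attract ever
    larger penalties and collapse toward zero, approximating an l0-like
    penalty as described in the abstract.
    """
    b = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS starting point
    for _ in range(n_iter):
        w = lam / (b ** 2 + eps)                  # per-coefficient weights
        b = np.linalg.solve(X.T @ X + np.diag(w), X.T @ y)
    return b

# Toy problem: only features 0 and 3 of 10 are truly active.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
beta_true = np.zeros(10)
beta_true[0], beta_true[3] = 3.0, -2.0
y = X @ beta_true + rng.normal(0.0, 0.1, 100)
b = reweighted_ridge(X, y)
support = np.flatnonzero(np.abs(b) > 0.1)
```

    The inactive coefficients are driven to essentially zero while the two active ones stay near their true values, which is the l0-like behavior the reweighting is meant to produce.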

  7. An ecometric analysis of the fossil mammal record of the Turkana Basin

    Science.gov (United States)

    Žliobaitė, Indrė; Kaya, Ferhat; Bibi, Faysal; Bobe, René; Leakey, Louise; Leakey, Meave; Patterson, David; Rannikko, Janina; Werdelin, Lars

    2016-01-01

    Although ecometric methods have been used to analyse fossil mammal faunas and environments of Eurasia and North America, such methods have not yet been applied to the rich fossil mammal record of eastern Africa. Here we report results from analysis of a combined dataset spanning east and west Turkana from Kenya between 7 and 1 million years ago (Ma). We provide temporally and spatially resolved estimates of temperature and precipitation and discuss their relationship to patterns of faunal change, and propose a new hypothesis to explain the lack of a temperature trend. We suggest that the regionally arid Turkana Basin may have acted between 4 and 2 Ma as a ‘species factory’, generating ecological adaptations in advance of the global trend. We show a persistent difference between the eastern and western sides of the Turkana Basin and suggest that the wetlands of the shallow eastern side could have provided additional humidity to the terrestrial ecosystems. Pending further research, a transient episode of faunal change centred at the time of the KBS Member (1.87–1.53 Ma) may be attributed equally plausibly to climate change or to a top-down ecological cascade initiated by the entry of technologically sophisticated humans. This article is part of the themed issue ‘Major transitions in human evolution’. PMID:27298463

  8. Effect of a weight-based prescribing method within an electronic health record on prescribing errors.

    Science.gov (United States)

    Ginzburg, Regina; Barr, Wendy B; Harris, Marissa; Munshi, Shibani

    2009-11-15

    The effect of a weight-based prescribing method within the electronic health record (EHR) on the rate of prescribing errors was studied. A report was generated listing all patients who received a prescription by a clinic provider for either infants' or children's acetaminophen or ibuprofen from January 1 to July 28, 2005 (preintervention group) and from July 29 to December 30, 2005 (postintervention group). Patients were included if they were 12 years old or younger, had a prescription ordered for infants' or children's acetaminophen or ibuprofen within the EHR, and had a weight documented in the chart on the visit day. The dosing range for acetaminophen was 10-15 mg/kg every four to six hours as needed, and the regimen for ibuprofen was 5-10 mg/kg every six to eight hours as needed. Dosing errors were defined as overdosage of strength, overdosage of regimen, underdosage of strength, underdosage of regimen, and incomprehensible dosing directions. Totals of 316 and 224 patient visits were analyzed from the preintervention and postintervention groups, respectively. Significantly more medication errors were found in the preintervention group than in the postintervention group (103 versus 46, p = 0.002). Significantly fewer strength overdosing errors occurred in the postintervention group (8.9% versus 4.0%, p = 0.028). An automated weight-based dosing calculator integrated into an EHR system in the outpatient setting significantly reduced medication prescribing errors for antipyretics prescribed to pediatric patients. This effect appeared to be strongest for reducing overdose errors.
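
    The kind of weight-based range check described above can be sketched as follows. The per-kg dosing ranges and error labels come from the abstract; the data structure and function are hypothetical illustrations, not the study's actual EHR logic.

```python
# Hypothetical weight-based dosing check (illustrative only).
DOSING_MG_PER_KG = {
    "acetaminophen": (10, 15),   # every 4-6 h as needed
    "ibuprofen": (5, 10),        # every 6-8 h as needed
}

def check_dose(drug, weight_kg, dose_mg):
    """Classify one dose against the weight-based per-dose range."""
    lo, hi = DOSING_MG_PER_KG[drug]
    if dose_mg > hi * weight_kg:
        return "overdosage of strength"
    if dose_mg < lo * weight_kg:
        return "underdosage of strength"
    return "within range"

# For a 12 kg child, acetaminophen should fall within 120-180 mg per dose.
flag_ok = check_dose("acetaminophen", 12, 160)
flag_over = check_dose("acetaminophen", 12, 250)
flag_under = check_dose("ibuprofen", 12, 40)
```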

  9. Iterative Covariance-Based Removal of Time-Synchronous Artifacts: Application to Gastrointestinal Electrical Recordings.

    Science.gov (United States)

    Erickson, Jonathan C; Putney, Joy; Hilbert, Douglas; Paskaranandavadivel, Niranchan; Cheng, Leo K; O'Grady, Greg; Angeli, Timothy R

    2016-11-01

    The aim of this study was to develop, validate, and apply a fully automated method for reducing large temporally synchronous artifacts present in electrical recordings made from the gastrointestinal (GI) serosa, which are problematic for properly assessing slow wave dynamics. Such artifacts routinely arise in experimental and clinical settings from motion, switching behavior of medical instruments, or electrode array manipulation. A novel iterative Covariance-Based Reduction of Artifacts (COBRA) algorithm sequentially reduced artifact waveforms using an updating across-channel median as a noise template, scaled and subtracted from each channel based on their covariance. Application of COBRA substantially increased the signal-to-artifact ratio (12.8 ± 2.5 dB), while minimally attenuating the energy of the underlying source signal by 7.9% on average (-11.1 ± 3.9 dB). COBRA was shown to be highly effective for aiding recovery and accurate marking of slow wave events (sensitivity = 0.90 ± 0.04; positive-predictive value = 0.74 ± 0.08) from large segments of in vivo porcine GI electrical mapping data that would otherwise be lost due to a broad range of contaminating artifact waveforms. Strongly reducing artifacts with COBRA ultimately allowed for rapid production of accurate isochronal activation maps detailing the dynamics of slow wave propagation in the porcine intestine. Such mapping studies can help characterize differences between normal and dysrhythmic events, which have been associated with GI abnormalities, such as intestinal ischemia and gastroparesis. The COBRA method may be generally applicable for removing temporally synchronous artifacts in other biosignal processing domains.
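
    A single COBRA-style step, following the description above (across-channel median as the noise template, scaled per channel by covariance and subtracted), can be sketched as follows; the full algorithm iterates this step. The synthetic signals and all parameter choices are illustrative, not from the paper.

```python
import numpy as np

def cobra_step(signals):
    """One COBRA-style artifact reduction step (sketch).

    signals: (n_channels, n_samples). The across-channel median serves
    as a template of the temporally synchronous artifact; each channel
    subtracts the template scaled by a covariance-based gain (a
    least-squares fit of the template to that channel).
    """
    template = np.median(signals, axis=0)
    t0 = template - template.mean()
    cleaned = np.empty_like(signals)
    for ch in range(signals.shape[0]):
        x0 = signals[ch] - signals[ch].mean()
        gain = np.dot(x0, t0) / np.dot(t0, t0)
        cleaned[ch] = signals[ch] - gain * template
    return cleaned

# Synthetic demo: per-channel slow waves plus one large shared artifact.
t = np.linspace(0.0, 1.0, 500)
artifact = 10.0 * np.exp(-((t - 0.5) ** 2) / 0.001)  # synchronous spike
signals = np.array([np.sin(2 * np.pi * (3 + ch) * t) + artifact
                    for ch in range(8)])
cleaned = cobra_step(signals)   # the full algorithm iterates this step
```

    Because the artifact dominates the across-channel median while the slow waves differ per channel, the subtraction removes most of the shared spike and largely spares the per-channel waves.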

  10. Unsupervised ensemble ranking of terms in electronic health record notes based on their importance to patients.

    Science.gov (United States)

    Chen, Jinying; Yu, Hong

    2017-04-01

    Allowing patients to access their own electronic health record (EHR) notes through online patient portals has the potential to improve patient-centered care. However, EHR notes contain abundant medical jargon that can be difficult for patients to comprehend. One way to help patients is to reduce information overload and help them focus on medical terms that matter most to them. Targeted education can then be developed to improve patient EHR comprehension and the quality of care. The aim of this work was to develop FIT (Finding Important Terms for patients), an unsupervised natural language processing (NLP) system that ranks medical terms in EHR notes based on their importance to patients. We built FIT on a new unsupervised ensemble ranking model derived from the biased random walk algorithm to combine heterogeneous information resources for ranking candidate terms from each EHR note. Specifically, FIT integrates four single views (rankers) for term importance: patient use of medical concepts, document-level term salience, word co-occurrence based term relatedness, and topic coherence. It also incorporates partial information of term importance as conveyed by terms' unfamiliarity levels and semantic types. We evaluated FIT on 90 expert-annotated EHR notes and used the four single-view rankers as baselines. In addition, we implemented three benchmark unsupervised ensemble ranking methods as strong baselines. FIT achieved 0.885 AUC-ROC for ranking candidate terms from EHR notes to identify important terms. When including term identification, the performance of FIT for identifying important terms from EHR notes was 0.813 AUC-ROC. Both performance scores significantly exceeded the corresponding scores from the four single rankers (P < .001). FIT can thus automatically rank terms in EHR notes based on their importance to patients. It may help develop future interventions to improve quality of care. By using unsupervised learning as well as a robust and flexible framework for information fusion, FIT can be readily applied to other domains and applications.
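
    A biased random walk over a term-relatedness graph, with the teleport distribution built from averaged single-view scores, can be sketched as below. This is a generic personalized-PageRank-style illustration of the idea, not FIT's actual model; the similarity matrix and view scores are toy values.

```python
import numpy as np

def ensemble_rank(sim, view_scores, alpha=0.85, n_iter=100):
    """Biased-random-walk ensemble ranking (illustrative sketch).

    sim: (n, n) nonnegative term-relatedness matrix; view_scores: one
    score vector per single-view ranker. The walk follows relatedness
    edges with probability alpha and otherwise teleports to the averaged
    normalized view scores, so terms rated important by several views
    and related to other important terms rise together.
    """
    P = sim / sim.sum(axis=1, keepdims=True)      # row-stochastic
    bias = np.mean([v / v.sum() for v in view_scores], axis=0)
    r = np.full(len(bias), 1.0 / len(bias))
    for _ in range(n_iter):
        r = alpha * (P.T @ r) + (1 - alpha) * bias
    return r

# Four candidate terms, two views; terms 0 and 1 are strongly related
# and both views consider them important.
sim = np.array([[0.0, 1.0, 0.2, 0.1],
                [1.0, 0.0, 0.2, 0.1],
                [0.2, 0.2, 0.0, 1.0],
                [0.1, 0.1, 1.0, 0.0]])
views = [np.array([0.8, 0.6, 0.1, 0.1]),
         np.array([0.7, 0.5, 0.2, 0.1])]
scores = ensemble_rank(sim, views)
ranking = np.argsort(-scores)
```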

  11. An analysis of electronic health record-related patient safety concerns

    Science.gov (United States)

    Meeks, Derek W; Smith, Michael W; Taylor, Lesley; Sittig, Dean F; Scott, Jean M; Singh, Hardeep

    2014-01-01

    Objective A recent Institute of Medicine report called for attention to safety issues related to electronic health records (EHRs). We analyzed EHR-related safety concerns reported within a large, integrated healthcare system. Methods The Informatics Patient Safety Office of the Veterans Health Administration (VA) maintains a non-punitive, voluntary reporting system to collect and investigate EHR-related safety concerns (ie, adverse events, potential events, and near misses). We analyzed completed investigations using an eight-dimension sociotechnical conceptual model that accounted for both technical and non-technical dimensions of safety. Using the framework analysis approach to qualitative data, we identified emergent and recurring safety concerns common to multiple reports. Results We extracted 100 consecutive, unique, closed investigations between August 2009 and May 2013 from 344 reported incidents. Seventy-four involved unsafe technology and 25 involved unsafe use of technology. A majority (70%) involved two or more model dimensions. Most often, non-technical dimensions such as workflow, policies, and personnel interacted in a complex fashion with technical dimensions such as software/hardware, content, and user interface to produce safety concerns. Most (94%) safety concerns related to either unmet data-display needs in the EHR (ie, displayed information available to the end user failed to reduce uncertainty or led to increased potential for patient harm), software upgrades or modifications, data transmission between components of the EHR, or ‘hidden dependencies’ within the EHR. Discussion EHR-related safety concerns involving both unsafe technology and unsafe use of technology persist long after ‘go-live’ and despite the sophisticated EHR infrastructure represented in our data source. Currently, few healthcare institutions have reporting and analysis capabilities similar to the VA. Conclusions Because EHR-related safety concerns have complex

  12. Citation Analysis of Dissertations Completed at Istanbul University Information and Records Management Department

    Directory of Open Access Journals (Sweden)

    Fatih Canata

    2017-03-01

    Full Text Available The aim of this study is to perform a citation analysis of dissertations completed in the Department of Information and Records Management of Istanbul University (DIRMIU) between 1967-2015 and to compare the results with those already done on this topic. For this purpose, completed dissertations in DIRMIU have been determined and data have been gathered from the general characteristics (number of pages, year, advisor and jury members) and references of dissertations. Citation analysis was carried out on the collected data. Additionally, the findings were compared with previous studies on dissertations of the Department of Information Management of Hacettepe University (DIMHU) using a comparative method. According to the key findings, a total of 120 dissertations were completed in DIRMIU, 91 of which are master's theses and 29 of which are doctoral theses. In total, 23 advisors and 42 jury members were assigned to these dissertations. Hasan Sacit Keseroğlu, Meral Alpay and Aysel Yontar are among the faculty members who supervised dissertations most frequently. The most frequently used source types in the dissertations are books (50%) and electronic resources (31%). 69% of these sources are in Turkish and 31% are in other languages. A statistically significant relationship was identified between the type of thesis and the types and languages of sources used in dissertations. Turkish Librarianship journal and Official Gazette of the Republic of Turkey are the most frequently cited periodicals. The most frequently cited people are Meral Alpay, Jale Baysal, Yasar Tonta, Hasan S. Keseroglu and Aysel Yontar. The half-life of the sources used is 8 years. As a result of the comparison made between DIRMIU and DIMHU, it is seen that the distribution of source types used is similar, the most frequently used periodicals are Turkish Librarianship journal and Official Gazette of the Republic of Turkey, and the half-life is between 8 and 9 years.

  13. Experimental investigation on spontaneously active hippocampal cultures recorded by means of high-density MEAs: analysis of the spatial resolution effects

    Directory of Open Access Journals (Sweden)

    Alessandro Maccione

    2010-05-01

    Full Text Available Based on experiments performed with high-resolution Active Pixel Sensor microelectrode arrays (APS-MEAs) coupled with spontaneously active hippocampal cultures, this work investigates the spatial resolution effects of the neuroelectronic interface on the analysis of the recorded electrophysiological signals. The adopted methodology consists, first, in recording the spontaneous activity at the highest spatial resolution (inter-electrode separation of 21 µm) from the whole array of 4096 microelectrodes. Then, the full resolution dataset is spatially downsampled in order to evaluate the effects on raster plot representation, array-wide spike rate (AWSR), mean firing rate (MFR) and mean bursting rate (MBR). Furthermore, the effects of the array-to-network relative position are evaluated by shifting a subset of equally spaced electrodes on the entire recorded area. Results highlight that MFR and MBR are particularly influenced by the spatial resolution provided by the neuroelectronic interface. On high-resolution large MEAs, such analyses better represent the time-based parameterization of the network dynamics. Finally, this work suggests interesting capabilities of high-resolution MEAs for spatially based analysis in dense and low-density neuronal preparations, for investigating signalling in both local and global neuronal circuitries.
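
    The spatial down-sampling analysis described above can be illustrated as follows: per-electrode spike counts from the full 64 x 64 (4096-electrode) array are subsampled at a coarser pitch and the mean firing rate (MFR) recomputed. The array geometry follows the abstract; the firing-rate map and all rates are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
duration_s = 60.0

# Synthetic firing-rate map: a 10x10-electrode active patch (5 Hz) on an
# otherwise quiet 64x64 array (0.2 Hz), observed as Poisson spike counts.
rates_hz = np.full((64, 64), 0.2)
rates_hz[10:20, 10:20] = 5.0
counts = rng.poisson(rates_hz * duration_s)

def mfr(counts, step):
    """Mean firing rate (spikes/s per electrode) when keeping every
    `step`-th electrode along each axis."""
    return counts[::step, ::step].mean() / duration_s

mfr_full = mfr(counts, 1)     # all 4096 electrodes (21 um pitch)
mfr_coarse = mfr(counts, 8)   # 8x8 subset (~168 um effective pitch)
```

    The two estimates differ because the coarse grid samples the active patch with only whichever electrodes happen to land inside it, which is the resolution effect the study quantifies.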

  14. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Directory of Open Access Journals (Sweden)

    Catalina Pimiento

    Full Text Available Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the modern composition and function of gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.

  15. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Science.gov (United States)

    Pimiento, Catalina; Clements, Christopher F

    2014-01-01

    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the modern composition and function of gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.
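
    The Optimal Linear Estimation approach named in both versions of this record follows Roberts and Solow's extreme-value method: the k most recent occurrence ages receive optimal weights derived from a Weibull model of the record's tail, and their weighted sum extrapolates past the youngest fossil. The sketch below is an independent reimplementation from the published formulas with made-up ages; it is not the authors' code, and the indexing conventions should be checked against the original papers before serious use.

```python
import numpy as np
from math import exp, lgamma, log

def ole_extinction(ages, k=8):
    """Optimal Linear Estimation of an extinction age (sketch).

    ages: distinct occurrence ages (e.g. in Ma; larger = older). The k
    youngest records are modelled as the tail of a Weibull extreme-value
    distribution (after Roberts & Solow 2003): optimal weights come from
    a covariance-like matrix of gamma-function terms, and the weighted
    sum of the ages extrapolates below the youngest record.
    """
    t = np.sort(np.asarray(ages, dtype=float))[:k]
    k = len(t)
    # Shape parameter estimated from the spread of the youngest records.
    v = sum(log((t[-1] - t[0]) / (t[i + 1] - t[0]))
            for i in range(k - 2)) / (k - 1)
    L = np.empty((k, k))
    for i in range(k):
        for j in range(i + 1):
            L[i, j] = L[j, i] = exp(lgamma(2 * v + i + 1)
                                    + lgamma(v + j + 1)
                                    - lgamma(v + i + 1)
                                    - lgamma(j + 1))
    e = np.ones(k)
    w = np.linalg.solve(L, e)
    a = w / (e @ w)              # weights sum to 1
    return float(a @ t)

# Made-up late-occurrence ages (Ma), loosely in the range discussed above.
ages = [2.7, 2.8, 3.0, 3.1, 3.4, 3.6, 3.9, 4.1]
est = ole_extinction(ages)
```

    Because the weights sum to 1, the estimate is translation-invariant in age, and it lands beyond (younger than) the youngest record, which is the extrapolation property the method is used for.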

  16. Analysis of Readex's Serial Set MARC Records: Improving the Data for the Library Catalog

    Science.gov (United States)

    Draper, Daniel; Lederer, Naomi

    2013-01-01

    Colorado State University Libraries (CSUL) purchased the digitized "United States Congressional Serial Set," 1817-1994 and "American State Papers" (1789-1838) from the Readex Division of NewsBank, Inc. and, once funds and records were available, the accompanying MARC records. The breadth of information found in the "Serial…

  17. Quantitative analysis of single muscle fibre action potentials recorded at known distances

    NARCIS (Netherlands)

    Albers, B.A.; Put, J.H.M.; Wallinga, W.; Wirtz, P.

    1989-01-01

    In vivo records of single fibre action potentials (SFAPs) have always been obtained at unknown distance from the active muscle fibre. A new experimental method has been developed enabling the derivation of the recording distance in animal experiments. A single fibre is stimulated with an

  18. Reconstructing Late Pleistocene air temperature variability based on branched GDGTs in the sedimentary record of Llangorse Lake (Wales)

    Science.gov (United States)

    Maas, David; Hoek, Wim; Peterse, Francien; Akkerman, Keechy; Macleod, Alison; Palmer, Adrian; Lowe, John

    2015-04-01

    This study aims to provide a temperature reconstruction of the Lateglacial sediments of Llangorse Lake. A new temperature proxy is used, based on the occurrence of different membrane lipids of soil bacteria (de Jonge et al., 2014). Application of this proxy on lacustrine environments is difficult because of in situ (water column) production and co-elution of isomers. Pollen analysis provides a palynological record that can be used for biostratigraphical correlation to other records. Llangorse Lake lies in a glacial basin just northeast of the Brecon Beacons in Powys, South Wales. The lake is located upstream in the Afon Llynfi valley, at the edge of the watershed of the River Wye. The lake consists of two semi-separated basins with a maximum water depth of 7.5 m, arranged in an L-shape with a surface area of roughly 1.5 km2. Previous studies have focused on the Holocene development of the lake and its surrounding environment (Jones et al., 1985). This study focuses on the deglacial record that appeared to be present in the basal part of the sequence. The lake was cored in September 2014 with a manually operated 3 m piston corer from a small coring platform. Overlapping cores were taken to form a continuous 12 m core, spanning the Holocene and the Lateglacial sediments. Six adjacent Lateglacial core segments from the southern basin of Llangorse lake were scanned for their major element composition using XRF scanning at 5 mm resolution to discern changes in sediment origin. Furthermore, loss on ignition (LOI) analysis was used to determine the changes in organic content of the sediments. Subsamples of the Lateglacial sedimentary record were analyzed for the occurrence of different bacterial membrane lipids (brGDGTs: branched glycerol dialkyl glycerol tetraethers) by means of HPLC-MS (high performance liquid chromatography and mass spectrometry) using two silica columns to achieve proper separation of isomers (de Jonge et al., 2013). Air temperatures are

  19. Network analysis of mesoscale optical recordings to assess regional, functional connectivity.

    Science.gov (United States)

    Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H

    2015-10-01

    With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
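
    A minimal version of the regional functional-network construction outlined above: correlate regional time series, threshold into an adjacency matrix, and read off simple graph metrics. The time series are simulated, and the threshold and sizes are arbitrary illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n_regions, n_t = 6, 1000

# Simulated regional time series: regions 0-2 share a common driver
# (functionally connected); regions 3-5 are independent noise.
common = rng.normal(size=n_t)
ts = np.empty((n_regions, n_t))
for r in range(n_regions):
    mix = 1.0 if r < 3 else 0.0
    ts[r] = mix * common + rng.normal(size=n_t)

corr = np.corrcoef(ts)                        # functional connectivity
adj = (np.abs(corr) > 0.3) & ~np.eye(n_regions, dtype=bool)
degree = adj.sum(axis=1)                      # edges per region
density = adj.sum() / (n_regions * (n_regions - 1))
```

    On this toy network the three driven regions form a fully connected module (degree 2 each) while the noise regions stay isolated; degree, density, and similar graph properties are the quantities such studies compare across preparations.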

  20. Characterizing and Managing Missing Structured Data in Electronic Health Records: Data Analysis.

    Science.gov (United States)

    Beaulieu-Jones, Brett K; Lavage, Daniel R; Snyder, John W; Moore, Jason H; Pendergrass, Sarah A; Bauer, Christopher R

    2018-02-23

    Missing data is a challenge for all studies; however, this is especially true for electronic health record (EHR)-based analyses. Failure to appropriately consider missing data can lead to biased results. While there has been extensive theoretical work on imputation, and many sophisticated methods are now available, it remains quite challenging for researchers to implement these methods appropriately. Here, we provide detailed procedures for when and how to conduct imputation of EHR laboratory results. The objective of this study was to demonstrate how the mechanism of missingness can be assessed, evaluate the performance of a variety of imputation methods, and describe some of the most frequent problems that can be encountered. We analyzed clinical laboratory measures from 602,366 patients in the EHR of Geisinger Health System in Pennsylvania, USA. Using these data, we constructed a representative set of complete cases and assessed the performance of 12 different imputation methods for missing data that was simulated based on 4 mechanisms of missingness (missing completely at random, missing not at random, missing at random, and real data modelling). Our results showed that several methods, including variations of Multivariate Imputation by Chained Equations (MICE) and softImpute, consistently imputed missing values with low error; however, only a subset of the MICE methods was suitable for multiple imputation. The analyses we describe provide an outline of considerations for dealing with missing EHR data, steps that researchers can perform to characterize missingness within their own data, and an evaluation of methods that can be applied to impute clinical data. While the performance of methods may vary between datasets, the process we describe can be generalized to the majority of structured data types that exist in EHRs, and all of our methods and code are publicly available.
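
    A stripped-down MICE-style chained-equations imputer conveys the idea evaluated above. Real MICE draws from predictive distributions to support multiple imputation; this sketch keeps only the deterministic chained-regression core, applied to simulated correlated "lab values", and is not the study's code.

```python
import numpy as np

def chained_imputation(X, n_iter=10):
    """Minimal MICE-style chained-equations imputer (sketch).

    NaNs in each column are repeatedly re-imputed from a linear
    regression on the other columns, starting from a mean fill.
    """
    X = X.copy()
    miss = np.isnan(X)
    X[miss] = np.broadcast_to(np.nanmean(X, axis=0), X.shape)[miss]
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            if not miss[:, j].any():
                continue
            A = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
            obs = ~miss[:, j]
            beta = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)[0]
            X[miss[:, j], j] = A[miss[:, j]] @ beta
    return X

# Simulated correlated "lab values" with 20% missing completely at random.
rng = np.random.default_rng(5)
z = rng.normal(size=(200, 1))
true = np.hstack([z + 0.1 * rng.normal(size=(200, 1)) for _ in range(3)])
data = true.copy()
mask = rng.random(data.shape) < 0.2
data[mask] = np.nan

filled = chained_imputation(data)
rmse = np.sqrt(np.mean((filled[mask] - true[mask]) ** 2))
mean_fill = np.where(mask, np.nanmean(data, axis=0), data)
rmse_mean = np.sqrt(np.mean((mean_fill[mask] - true[mask]) ** 2))
```

    Because the columns are strongly correlated, regression-based imputation recovers the missing values far better than the mean-fill baseline, the same qualitative comparison the study runs across its 12 methods.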

  1. A machine learning-based framework to identify type 2 diabetes through electronic health records.

    Science.gov (United States)

    Zheng, Tao; Xie, Wei; Xu, Liling; He, Xiaoying; Zhang, Ya; You, Mingrong; Yang, Gong; Chen, You

    2017-01-01

    To discover diverse genotype-phenotype associations affiliated with Type 2 Diabetes Mellitus (T2DM) via genome-wide association study (GWAS) and phenome-wide association study (PheWAS), more cases (T2DM subjects) and controls (subjects without T2DM) are required to be identified (e.g., via Electronic Health Records (EHR)). However, existing expert-based identification algorithms often suffer from a low recall rate and could miss a large number of valuable samples under conservative filtering standards. The goal of this work is to develop a semi-automated framework based on machine learning as a pilot study to liberalize filtering criteria to improve recall rate while keeping a low false-positive rate. We propose a data-informed framework for identifying subjects with and without T2DM from EHR via feature engineering and machine learning. We evaluate and contrast the identification performance of widely-used machine learning models within our framework, including k-Nearest-Neighbors, Naïve Bayes, Decision Tree, Random Forest, Support Vector Machine and Logistic Regression. Our framework was conducted on 300 patient samples (161 cases, 60 controls and 79 unconfirmed subjects), randomly selected from a diabetes-related cohort of 23,281 patients retrieved from a regionally distributed EHR repository spanning 2012 to 2014. We apply top-performing machine learning algorithms on the engineered features. We benchmark and contrast the accuracy, precision, AUC, sensitivity and specificity of classification models against the state-of-the-art expert algorithm for identification of T2DM subjects. Our results indicate that the framework achieved high identification performance (∼0.98 average AUC), much higher than that of the state-of-the-art algorithm (0.71 in AUC). Expert algorithm-based identification of T2DM subjects from EHR is often hampered by the high missing rates due to their conservative selection criteria. Our framework leverages machine learning and feature
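
    The benchmarking loop described above reduces to: engineer features, fit a classifier, score with AUC. A self-contained illustration with plain gradient-descent logistic regression and a rank-based AUC on synthetic case/control data follows; all names, sizes, and effect strengths are illustrative, not the study's.

```python
import numpy as np

def auc_score(y, s):
    """AUC via the rank-sum identity: the probability that a random
    case scores above a random control (assumes no tied scores)."""
    ranks = np.empty(len(s))
    ranks[np.argsort(s)] = np.arange(1, len(s) + 1)
    n1 = int(y.sum())
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Plain gradient-descent logistic regression: an illustrative
    stand-in for the classifiers benchmarked in the paper."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

# Synthetic "engineered EHR features": cases shifted on two features.
rng = np.random.default_rng(6)
n = 400
y = (rng.random(n) < 0.5).astype(float)
X = rng.normal(size=(n, 3))
X[:, 0] += 1.5 * y
X[:, 1] += 0.75 * y

w = fit_logistic(X, y)
scores = 1.0 / (1.0 + np.exp(-np.column_stack([np.ones(n), X]) @ w))
auc = auc_score(y, scores)
```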

  2. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    Since the acoustic characteristics of ejective sounds differ from the corresponding voiced and voiceless pulmonic sound conjugates, mainly in the source of excitation, epoch-based analysis is useful, in addition to the spectral or spectrographic analysis. The most frequently used features for analysis of ejective sounds are: ...

  3. Redundancy in electronic health record corpora: analysis, impact on text mining performance and mitigation strategies.

    Science.gov (United States)

    Cohen, Raphael; Elhadad, Michael; Elhadad, Noémie

    2013-01-16

    The increasing availability of Electronic Health Record (EHR) data and specifically free-text patient notes presents opportunities for phenotype extraction. Text-mining methods in particular can help disease modeling by mapping named-entities mentions to terminologies and clustering semantically related terms. EHR corpora, however, exhibit specific statistical and linguistic characteristics when compared with corpora in the biomedical literature domain. We focus on copy-and-paste redundancy: clinicians typically copy and paste information from previous notes when documenting a current patient encounter. Thus, within a longitudinal patient record, one expects to observe heavy redundancy. In this paper, we ask three research questions: (i) How can redundancy be quantified in large-scale text corpora? (ii) Conventional wisdom is that larger corpora yield better results in text mining. But how does the observed EHR redundancy affect text mining? Does such redundancy introduce a bias that distorts learned models? Or does the redundancy introduce benefits by highlighting stable and important subsets of the corpus? (iii) How can one mitigate the impact of redundancy on text mining? We analyze a large-scale EHR corpus and quantify redundancy both in terms of word and semantic concept repetition. We observe redundancy levels of about 30% and non-standard distribution of both words and concepts. We measure the impact of redundancy on two standard text-mining applications: collocation identification and topic modeling. We compare the results of these methods on synthetic data with controlled levels of redundancy and observe significant performance variation. Finally, we compare two mitigation strategies to avoid redundancy-induced bias: (i) a baseline strategy, keeping only the last note for each patient in the corpus; (ii) removing redundant notes with an efficient fingerprinting-based algorithm. (a) For text mining, preprocessing the EHR corpus with fingerprinting yields
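
    Mitigation strategy (ii), fingerprinting-based removal of redundant notes, can be sketched with word shingles: each note's fingerprint is its set of k-word shingles, and a note is dropped when most of its shingles already occur in a kept note. This containment-style check is an illustration of the idea, not the paper's exact algorithm; the notes below are invented.

```python
def shingles(text, k=5):
    """A note's fingerprint: its set of k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def drop_redundant(notes, threshold=0.6, k=5):
    """Keep notes in order; drop any note whose shingles are mostly
    contained in an already-kept note. Containment (rather than
    Jaccard) is used so a note pasted into a longer one is still
    caught."""
    kept, prints = [], []
    for note in notes:
        sh = shingles(note, k)
        if not any(len(sh & p) / max(1, len(sh)) > threshold
                   for p in prints):
            kept.append(note)
            prints.append(sh)
    return kept

base = ("patient reports stable blood pressure on current medication "
        "no new complaints today follow up in three months")
copied = base + " added brief note about mild seasonal allergies"
fresh = "acute onset chest pain radiating to left arm ecg ordered stat"
kept = drop_redundant([base, copied, fresh])
```

    Here the copy-pasted note is dropped because nearly all of its shingles already appear in the earlier note, while the genuinely new note survives.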

  4. Reconstruction of Oceanographic Changes Based on the Diatom Records of the Central Okhotsk Sea over the last 500,000 Years

    Directory of Open Access Journals (Sweden)

    Wei-Lung Wang and Liang-Chi Wang

    2008-01-01

    Full Text Available This study provides insight into changes in sea ice conditions and the oceanographic environment over the past 500 kyr through analysis of the diatom record. Based on the relative abundance of 13 diatom species in piston core MD012414, four types of environmental conditions in the central Okhotsk Sea over the last 330 ka BP have been distinguished: (1) open-ocean alternating with seasonal sea-ice cover in Stages 9, 5, and 1; (2) almost open-ocean free of sea-ice cover in Stages 7 and 3; (3) perennial sea-ice cover in Stages 6, 4, and 2; and (4) a warm ice-age dominated by open ocean assemblages in Stage 8. The littoral diatom species, Paralia sulcata, showed a sudden increase from the glacial period to the interglacial period over the last 330 ka BP, except during Stage 8. Such a result implies that melting sea-ice transported terrigenous materials from the north Okhotsk Sea continental shelves to the central ocean during deglaciation. From Stage 13 to Stage 10, however, cold and warm marine conditions unexpectedly occurred in the late interglacial periods and the glacial periods, respectively. One possible reason for this is a lack of age control points from Stage 13 to Stage 10, and the different sediment accumulation rates between glacial and interglacial periods. This study suggests not only the process by which oceanographic variation of sea ice occurred, but also a new significance for Paralia sulcata as an indicator in the diatom record of the Okhotsk Sea.

  5. SPECTRAL ANALYSIS OF TOARCIAN SEDIMENTS FROM THE VALDORBIA SECTION (UMBRIA-MARCHE APENNINES): THE ASTRONOMICAL INPUT IN THE FORAMINIFERAL RECORD

    Directory of Open Access Journals (Sweden)

    FRANCISCO J. RODRÍGUEZ-TOVAR

    2016-05-01

    Full Text Available Toarcian sections studied mainly in Europe have revealed the incidence of Milankovitch forcing, with a well-developed, highly stable 405 kyr component of eccentricity, a short-term eccentricity of ~100 kyr, an obliquity cycle of ~36 kyr, and a precession signal at ~21 kyr. Cyclostratigraphic analysis of the Toarcian succession at the Valdorbia section (Umbria-Marche Apennines) was conducted based on time-series of foraminiferal assemblages. Well-developed cyclic patterns were obtained, with several significant cycles corresponding to thicknesses of 3.8-4.1 m, 5.8-6.3 m, 8.2 m, and 10.4 m. Comparison with previous studies at the Valdorbia section led us to interpret the ~4 m cycle as directly related to the short-term eccentricity (95-105 kyr). The remaining cycles could be assigned to periodicities of ~140-160 kyr, ~200 kyr and ~250 kyr, and interpreted as indirect signals of the long-term eccentricity, obliquity and precession, whose direct record would be impeded by the incompleteness of the studied succession and the sampling interval. The studied components of the foraminiferal assemblage show variable cyclostratigraphic patterns, allowing a differentiation of groups based on similar registered cycles. These groups reveal different responses of the foraminiferal assemblage, associated with particular requirements, to palaeoenvironmental changes of Milankovitch origin.
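Cycle detection of this kind amounts to locating spectral peaks in a depth series. A minimal sketch with NumPy's FFT on a synthetic proxy signal carrying a built-in 4 m cycle (illustrative values only, not the Valdorbia data):

```python
import numpy as np

# Synthetic depth series: a proxy signal with a 4 m cycle sampled every
# 0.1 m, standing in for a foraminiferal abundance time-series.
dx = 0.1                                    # sample spacing in metres
depth = np.arange(0, 100, dx)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * depth / 4.0) + 0.3 * rng.normal(size=depth.size)

# Power spectrum of the demeaned series; frequencies in cycles per metre.
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freq = np.fft.rfftfreq(signal.size, d=dx)
dominant = 1 / freq[np.argmax(power[1:]) + 1]   # skip the zero-frequency bin
print(round(dominant, 1))                       # → 4.0 (the 4 m cycle)
```

In practice, cyclostratigraphic studies typically use methods better suited to short, noisy, possibly unevenly sampled series (e.g. multitaper or Lomb-Scargle spectra with significance testing), but the peak-finding principle is the same.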

  6. Point-process analysis of neural spiking activity of muscle spindles recorded from thin-film longitudinal intrafascicular electrodes.

    Science.gov (United States)

    Citi, Luca; Djilas, Milan; Azevedo-Coste, Christine; Yoshida, Ken; Brown, Emery N; Barbieri, Riccardo

    2011-01-01

    Recordings from thin-film Longitudinal Intra-Fascicular Electrodes (tfLIFE) together with a wavelet-based de-noising and a correlation-based spike sorting algorithm, give access to firing patterns of muscle spindle afferents. In this study we use a point process probability structure to assess mechanical stimulus-response characteristics of muscle spindle spike trains. We assume that the stimulus intensity is primarily a linear combination of the spontaneous firing rate, the muscle extension, and the stretch velocity. By using the ability of the point process framework to provide an objective goodness of fit analysis, we were able to distinguish two classes of spike clusters with different statistical structure. We found that spike clusters with higher SNR have a temporal structure that can be fitted by an inverse Gaussian distribution while lower SNR clusters follow a Poisson-like distribution. The point process algorithm is further able to provide the instantaneous intensity function associated with the stimulus-response model with the best goodness of fit. This important result is a first step towards a point process decoding algorithm to estimate the muscle length and possibly provide closed loop Functional Electrical Stimulation (FES) systems with natural sensory feedback information.
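The distinction drawn above between inverse Gaussian and Poisson-like spike clusters can be illustrated with a simple distribution fit on inter-spike intervals (ISIs). A sketch with SciPy on simulated ISIs, not the authors' full point-process framework; all parameters are made-up:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated inter-spike intervals (seconds): one train drawn from an
# inverse Gaussian, one Poisson-like train with exponential intervals.
isi_invgauss = stats.invgauss.rvs(mu=0.5, scale=0.02, size=500, random_state=rng)
isi_poisson = rng.exponential(scale=0.01, size=500)

def fits_inverse_gaussian(isi, alpha=0.05):
    """Kolmogorov-Smirnov check of an inverse Gaussian fit to the ISIs."""
    mu, loc, scale = stats.invgauss.fit(isi, floc=0)
    ks = stats.kstest(isi, "invgauss", args=(mu, loc, scale))
    return bool(ks.pvalue > alpha)

print(fits_inverse_gaussian(isi_invgauss))
```

Note that a KS test against a distribution with estimated parameters is conservative; the paper instead compares full point-process models via goodness-of-fit analysis (e.g. time-rescaling), which this sketch does not reproduce.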

  7. LabTrove: a lightweight, web based, laboratory "blog" as a route towards a marked up record of work in a bioscience research laboratory.

    Directory of Open Access Journals (Sweden)

    Andrew J Milsted

    Full Text Available The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for reproducibility, reuse, repurposing, and redeployment.

  8. Target Tracking Based Scene Analysis

    Science.gov (United States)

    1984-08-01

    NATO Advanced Study Institute, Braunlage/Harz, FRG, June 21 - July 2, 1982, Springer, Berlin, 1983, pp. 493-501. [4] B. Bhanu, "Recognition of...Braunlage/Harz, FRG, June 21 - July 2, 1982, Springer, Berlin, 1983, pp. 104-124. [8] R.B. Cate, T.B. Dennis, J.T. Mallin, K.S. Nedelman, NEIL Trenchard, and..."Image Sequence Processing and Dynamic Scene Analysis", Proceedings of NATO Advanced Study Institute, Braunlage/Harz, FRG, June 21 - July 2, 1982

  9. Based on records of Three Gorge Telemetric Seismic Network to analyze Vibration process of micro fracture of rock landslide

    Science.gov (United States)

    WANG, Q.

    2017-12-01

    We used the finite element analysis software GeoStudio to establish a vibration analysis model of the Qianjiangping landslide, located in the Three Gorges Reservoir area. In the QUAKE/W module, we chose appropriate dynamic elasticity moduli and Poisson's ratios for the soil layer and rock stratum. For loading, we selected waveform records from the Three Gorge Telemetric Seismic Network as input ground motion, comprising five rupture events recorded at Lujiashan seismic station. In the dynamic simulation, we focused mainly on the sliding process while the earthquake record was applied. The simulation result shows that the Qianjiangping landslide was affected not only by its own static forces but also experienced a dynamic process of micro fracture-creep-slip rupture-creep-slip. This provides a new approach for assessing the feasibility of early warning of rock landslides in future research.

  10. Independent component analysis and decision trees for ECG holter recording de-noising.

    Directory of Open Access Journals (Sweden)

    Jakub Kuzilek

    Full Text Available We have developed a method for ECG signal de-noising using independent component analysis (ICA). The approach combines JADE source separation with a binary decision tree for identification and subsequent removal of ECG noise. To test the efficiency of this method, a wavelet-based de-noising method was used as a standard filtering comparison. Freely available data from the PhysioNet medical data archive were evaluated. The evaluation criterion was the root mean square error (RMSE) between the original ECG and the filtered data contaminated with artificial noise. The proposed algorithm achieved comparable results for standard noises (power-line interference, baseline wander, EMG), but significantly better results when an uncommon noise (electrode cable movement artefact) was compared.
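The RMSE criterion used for evaluation is straightforward to reproduce. A self-contained sketch with toy signals (a sine stands in for the ECG; this is not the PhysioNet data or the paper's filters):

```python
import numpy as np

def rmse(clean, filtered):
    """Root mean square error between a clean reference and a filtered signal."""
    clean, filtered = np.asarray(clean, float), np.asarray(filtered, float)
    return float(np.sqrt(np.mean((clean - filtered) ** 2)))

# Toy example: a clean signal corrupted with 50 Hz power-line interference,
# then crudely de-noised with a 21-sample moving-average filter (fs = 1 kHz).
t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 1.2 * t)                 # stand-in for the ECG
noisy = clean + 0.3 * np.sin(2 * np.pi * 50 * t)    # power-line interference
kernel = np.ones(21) / 21
filtered = np.convolve(noisy, kernel, mode="same")

print(rmse(clean, noisy) > rmse(clean, filtered))   # de-noising lowers RMSE
```

The same metric applies unchanged when the de-noising step is ICA-based: contaminate a clean recording with a known artificial noise, filter, and compare RMSE against the clean original.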

  11. An Open Architecture Scaleable Maintainable Software Defined Commodity Based Data Recorder And Correlator, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project addresses the need for higher data rate recording capability, increased correlation speed and flexibility needed for next generation VLBI systems. The...

  12. Residential segregation, dividing walls and mental health: a population-based record linkage study

    Science.gov (United States)

    Maguire, Aideen; French, Declan; O'Reilly, Dermot

    2016-01-01

    Background Neighbourhood segregation has been described as a fundamental determinant of physical health, but literature on its effect on mental health is less clear. While most previous research has relied on conceptualised measures of segregation, Northern Ireland is unique as it contains physical manifestations of segregation in the form of segregation barriers (or ‘peacelines’) which can be used to accurately identify residential segregation. Methods We used population-wide health record data on over 1.3 million individuals, to analyse the effect of residential segregation, measured by both the formal Dissimilarity Index and by proximity to a segregation barrier, on the likelihood of poor mental health. Results Using multilevel logistic regression models, we found residential segregation measured by the Dissimilarity Index poses no additional risk to the likelihood of poor mental health after adjustment for area-level deprivation. However, residence in an area segregated by a ‘peaceline’ increases the likelihood of antidepressant medication by 19% (OR=1.19, 95% CI 1.14 to 1.23) and anxiolytic medication by 39% (OR=1.39, 95% CI 1.32 to 1.48), even after adjustment for gender, age, conurbation, deprivation and crime. Conclusions Living in an area segregated by a ‘peaceline’ is detrimental to mental health suggesting segregated areas characterised by a heightened sense of ‘other’ pose a greater risk to mental health. The difference in results based on segregation measure highlights the importance of choice of measure when studying segregation. PMID:26858342
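The Dissimilarity Index used in the study has a simple closed form: half the sum, over neighbourhoods, of the absolute difference between each group's share of its own total population. A small sketch with hypothetical counts (the groups and numbers are invented for illustration):

```python
def dissimilarity_index(group_a, group_b):
    """Index of Dissimilarity: 0 = fully mixed, 1 = fully segregated.

    group_a[i] and group_b[i] are the counts of the two groups living in
    neighbourhood i.
    """
    total_a, total_b = sum(group_a), sum(group_b)
    return 0.5 * sum(
        abs(a / total_a - b / total_b) for a, b in zip(group_a, group_b)
    )

# Hypothetical counts across four neighbourhoods
group_one = [900, 100, 800, 200]
group_two = [100, 900, 200, 800]
print(round(dissimilarity_index(group_one, group_two), 2))  # → 0.7
```

The index can be read as the fraction of either group that would have to move to produce an even distribution, which is why it serves as a formal counterpart to the physical 'peaceline' measure in the paper.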

  13. Installation Restoration Program, Phase 1. Records Search, Wheeler Air Force Base, Oahu, Hawaii

    Science.gov (United States)

    1983-07-01

    the vegetation was already exotic, consisting of trees such as guava , koa haole, eucalyptus and silver oak, and shrubs and grasses including lantana...material in use during day-to-day activities. Degreasers and solvents were simply "engine wash " or "solvent" and records do not exist that allow the...available records do not indicate whether certain shops were tied into the system during earlier periods. Exterior wash racks currently discharge to

  14. Impact of OSHA final rule--recording hearing loss: an analysis of an industrial audiometric dataset.

    Science.gov (United States)

    Rabinowitz, Peter M; Slade, Martin; Dixon-Ernst, Christine; Sircar, Kanta; Cullen, Mark

    2003-12-01

    The 2003 Occupational Safety and Health Administration (OSHA) Occupational Injury and Illness Recording and Reporting Final Rule changed the definition of recordable work-related hearing loss. We performed a study of the Alcoa Inc. audiometric database to evaluate the impact of this new rule. The 2003 rule increased the rate of potentially recordable hearing loss events from 0.2% to 1.6% per year. A total of 68.6% of potentially recordable cases had American Academy of Audiology/American Medical Association (AAO/AMA) hearing impairment at the time of recordability. On average, recordable loss occurred after onset of impairment, whereas the non-age-corrected 10-dB standard threshold shift (STS) usually preceded impairment. The OSHA Final Rule will significantly increase recordable cases of occupational hearing loss. The new case definition is usually accompanied by AAO/AMA hearing impairment. Other, more sensitive metrics should therefore be used for early detection and prevention of hearing loss.

  16. Predictive value of casual ECG-based resting heart rate compared with resting heart rate obtained from Holter recording

    DEFF Research Database (Denmark)

    Carlson, Nicholas; Dixen, Ulrik; Marott, Jacob L

    2014-01-01

    BACKGROUND: Elevated resting heart rate (RHR) is associated with cardiovascular mortality and morbidity. Assessment of heart rate (HR) from Holter recording may afford a more precise estimate of the effect of RHR on cardiovascular risk, as compared to casual RHR. Comparative analysis was carried...... and mean HR by Multivariate Cox regression was performed. RESULTS: A total of 57 composite endpoints occurred during 17.1 years of follow-up. Regression analysis suggests correlation between Casual RHR and Holter RHR. Multivariate Cox regression analysis adjusted for gender and age demonstrated hazard...... rates of 1.02 (p = 0.079) for casual RHR, 1.04 (p = 0.036*) for Holter RHR, and 1.03 (p = 0.093) for mean HR for each 10 beat increment in HR. CONCLUSIONS: In a comparative analysis on the correlation and significance of differing RHR measurement modalities RHR measured by 24-hour Holter recording...

  17. Assessment of nursing records on cardiopulmonary resuscitation based on the utstein model

    Directory of Open Access Journals (Sweden)

    Daiane Lopes Grisante

    2014-01-01

    Full Text Available Cross-sectional study that assessed the quality of nursing records on cardiopulmonary resuscitation. Forty-two patients’ charts were reviewed in an intensive care unit, using the Utstein protocol. There was a predominance of men (54.8%), aged from 21-70 years old (38.1%), correction of acquired heart diseases (42.7%), with more than one pre-existing device (147). As immediate cause of cardiac arrest, hypotension predominated (48.3%) and as the initial rhythm, bradycardia (37.5%). Only the time of death and time of arrest were recorded in 100% of the sample. Professional training in Advanced Life Support was not recorded. The causes of arrest and initial rhythm were recorded in 69% and 76.2% of the sample. Chest compressions, patent airway obtainment and defibrillation were recorded in less than 16%. Records were considered of low quality and may cause legal sanctions to professionals and do not allow the comparison of the effectiveness of the maneuvers with other centers.

  18. Establishment of data base of regional seismic recordings from earthquakes, chemical explosions and nuclear explosions in the Former Soviet Union

    Energy Technology Data Exchange (ETDEWEB)

    Ermolenko, N.A.; Kopnichev, Yu.F.; Kunakov, V.G.; Kunakova, O.K.; Rakhmatullin, M.Kh.; Sokolova, I.N.; Vybornyy, Zh.I. [AN SSSR, Moscow (Russian Federation). Inst. Fiziki Zemli

    1995-06-01

    In this report results of work on establishment of a data base of regional seismic recordings from earthquakes, chemical explosions and nuclear explosions in the former Soviet Union are described. This work was carried out in the Complex Seismological Expedition (CSE) of the Joint Institute of Physics of the Earth of the Russian Academy of Sciences and Lawrence Livermore National Laboratory. The recording system, methods of investigations and primary data processing are described in detail. The largest number of digital records was received by the permanent seismic station Talgar, situated in the northern Tien Shan, 20 km to the east of Almaty city. More than half of the records are seismograms of underground nuclear explosions and chemical explosions. The nuclear explosions were recorded mainly from the Semipalatinsk test site. In addition, records of the explosions from the Chinese test site Lop Nor and industrial nuclear explosions from the West Siberia region were obtained. Four records of strong chemical explosions were picked out (two of them have been produced at the Semipalatinsk test site and two -- in Uzbekistan). We also obtained 16 records of crustal earthquakes, mainly from the Altai region, close to the Semipalatinsk test site, and also from the West China region, close to the Lop Nor test site. In addition, a small number of records of earthquakes and underground nuclear explosions, received by arrays of temporary stations, that have been working in the southern Kazakhstan region are included in this report. Parameters of the digital seismograms and file structure are described. Possible directions of future work on the digitizing of unique data archive are discussed.

  19. Interpretation of Coronary Angiograms Recorded Using Google Glass: A Comparative Analysis.

    Science.gov (United States)

    Duong, Thao; Wosik, Jedrek; Christakopoulos, Georgios E; Martínez Parachini, José Roberto; Karatasakis, Aris; Tarar, Muhammad Nauman Javed; Resendes, Erica; Rangan, Bavana V; Roesle, Michele; Grodin, Jerrold; Abdullah, Shuaib M; Banerjee, Subhash; Brilakis, Emmanouil S

    2015-10-01

    Google Glass (Google, Inc) is a voice-activated, hands-free, optical head-mounted display device capable of taking pictures, recording videos, and transmitting data via wi-fi. In the present study, we examined the accuracy of coronary angiogram interpretation, recorded using Google Glass. Google Glass was used to record 15 angiograms with 17 major findings and the participants were asked to interpret those recordings on: (1) an iPad (Apple, Inc); or (2) a desktop computer. Interpretation was compared with the original angiograms viewed on a desktop. Ten physicians (2 interventional cardiologists and 8 cardiology fellows) participated. One point was assigned for each correct finding, for a maximum of 17 points. The mean angiogram interpretation score for Google Glass angiogram recordings viewed on an iPad or a desktop vs the original angiograms viewed on a desktop was 14.9 ± 1.1, 15.2 ± 1.8, and 15.9 ± 1.1, respectively (P=.06 between the iPad and the original angiograms, P=.51 between the iPad and recordings viewed on a desktop, and P=.43 between the recordings viewed on a desktop and the original angiograms). In a post-study survey, one of the 10 physicians (10%) was "neutral" with the quality of the recordings using Google Glass, 6 physicians (60%) were "somewhat satisfied," and 3 physicians (30%) were "very satisfied." This small pilot study suggests that the quality of coronary angiogram video recordings obtained using Google Glass may be adequate for recognition of major findings, supporting its expanding use in telemedicine.

  20. Prevalence of spontaneous Brugada ECG pattern recorded at standard intercostal leads: A meta-analysis.

    Science.gov (United States)

    Shi, Shaobo; Barajas-Martinez, Hector; Liu, Tao; Sun, Yaxun; Yang, Bo; Huang, Congxin; Hu, Dan

    2018-03-01

    Typical Brugada ECG pattern is the keystone in the diagnosis of Brugada syndrome. However, the exact prevalence remains unclear, especially in Asia. The present study was designed to systematically evaluate the prevalence of spontaneous Brugada ECG pattern recorded at standard leads. We searched Medline, Embase and the Chinese National Knowledge Infrastructure (CNKI) for studies of the prevalence of Brugada ECG pattern, published between Jan 1, 2003, and September 1, 2016. Pooled prevalences of type 1 and type 2-3 Brugada ECG pattern were estimated in a random-effects model, and prevalence data were grouped by study characteristics. Meta-regression analyses were performed to explore the potential sources of heterogeneity, and sensitivity analyses were conducted to assess the effect of each study on the overall prevalence. Thirty-nine eligible studies involving 558,689 subjects were identified. Pooled prevalence of type 1 and type 2-3 Brugada ECG pattern was 0.03% (95%CI, 0.01%-0.06%) and 0.42% (95%CI, 0.28%-0.59%), respectively. Region, sample size, and year of publication were the main sources of heterogeneity. The prevalence of type 1 Brugada ECG pattern was higher in male, Asian, adult, patient, and febrile subjects, but the relation between fever and type 2-3 Brugada ECG pattern was not significant. Sensitivity analysis showed that no single study unduly affected the pooled prevalence of type 1 or type 2-3 Brugada ECG pattern. The Brugada ECG pattern is not rare, and is especially preponderant in adult Asian males and febrile subjects. Clinical screening and further examination for Brugada syndrome in potential populations need to be highlighted. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
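Random-effects pooling of proportions is commonly done with the DerSimonian-Laird estimator on logit-transformed prevalences. A compact sketch with hypothetical study counts; the abstract does not name its estimator, so DerSimonian-Laird and the continuity correction here are assumptions:

```python
import math

def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pool of proportions (logit scale)."""
    # Logit-transform each study's prevalence with a 0.5 continuity correction.
    y, v = [], []
    for e, n in zip(events, totals):
        e_adj, n_adj = e + 0.5, n + 1.0
        p = e_adj / n_adj
        y.append(math.log(p / (1 - p)))
        v.append(1 / e_adj + 1 / (n_adj - e_adj))   # approx. logit variance
    w = [1 / vi for vi in v]
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if df > 0 else 0.0   # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1 / (1 + math.exp(-y_re))       # back-transform to a proportion

# Hypothetical per-study counts of the type 1 Brugada pattern
events = [3, 1, 8, 2]
totals = [10000, 8000, 25000, 12000]
print(f"{pooled_prevalence(events, totals):.4%}")
```

Production meta-analyses would add confidence intervals, heterogeneity statistics (I², Q test), and often a Freeman-Tukey double-arcsine transform for rare events; libraries such as R's `meta`/`metafor` implement these directly.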

  1. Integrated interpretation of helicopter and ground-based geophysical data recorded within the Okavango Delta, Botswana

    Science.gov (United States)

    Podgorski, Joel E.; Green, Alan G.; Kalscheuer, Thomas; Kinzelbach, Wolfgang K. H.; Horstmeyer, Heinrich; Maurer, Hansruedi; Rabenstein, Lasse; Doetsch, Joseph; Auken, Esben; Ngwisanyi, Tiyapo; Tshoso, Gomotsang; Jaba, Bashali Charles; Ntibinyane, Onkgopotse; Laletsang, Kebabonye

    2015-03-01

    Integration of information from the following sources has been used to produce a much better constrained and more complete four-unit geological/hydrological model of the Okavango Delta than previously available: (i) a 3D resistivity model determined from helicopter time-domain electromagnetic (HTEM) data recorded across most of the delta, (ii) 2D models and images derived from ground-based electrical resistance tomographic, transient electromagnetic, and high resolution seismic reflection/refraction tomographic data acquired at four selected sites in western and north-central regions of the delta, and (iii) geological details extracted from boreholes in northeastern and southeastern parts of the delta. The upper heterogeneous unit is the modern delta, which comprises extensive dry and freshwater-saturated sand and lesser amounts of clay and salt. It is characterized by moderate to high electrical resistivities and very low to low P-wave velocities. Except for images of several buried abandoned river channels, it is non-reflective. The laterally extensive underlying unit of low resistivities, low P-wave velocity, and subhorizontal reflectors very likely contains saline-water-saturated sands and clays deposited in the huge Paleo Lake Makgadikgadi (PLM), which once covered a 90,000 km2 area that encompassed the delta, Lake Ngami, the Mababe Depression, and the Makgadikgadi Basin. Examples of PLM sediments are intersected in many boreholes. Low permeability clay within the PLM unit seems to be a barrier to the downward flow of the saline water. Below the PLM unit, freshwater-saturated sand of the Paleo Okavango Megafan (POM) unit is distinguished by moderate to high resistivities, low P-wave velocity, and numerous subhorizontal reflectors. The POM unit is interpreted to be the remnants of a megafan based on the arcuate nature of its front and the semi-conical shape of its upper surface in the HTEM resistivity model. 
Moderate to high resistivity subhorizontal layers are

  2. An analysis of electronic health record-related patient safety concerns.

    Science.gov (United States)

    Meeks, Derek W; Smith, Michael W; Taylor, Lesley; Sittig, Dean F; Scott, Jean M; Singh, Hardeep

    2014-01-01

    A recent Institute of Medicine report called for attention to safety issues related to electronic health records (EHRs). We analyzed EHR-related safety concerns reported within a large, integrated healthcare system. The Informatics Patient Safety Office of the Veterans Health Administration (VA) maintains a non-punitive, voluntary reporting system to collect and investigate EHR-related safety concerns (ie, adverse events, potential events, and near misses). We analyzed completed investigations using an eight-dimension sociotechnical conceptual model that accounted for both technical and non-technical dimensions of safety. Using the framework analysis approach to qualitative data, we identified emergent and recurring safety concerns common to multiple reports. We extracted 100 consecutive, unique, closed investigations between August 2009 and May 2013 from 344 reported incidents. Seventy-four involved unsafe technology and 25 involved unsafe use of technology. A majority (70%) involved two or more model dimensions. Most often, non-technical dimensions such as workflow, policies, and personnel interacted in a complex fashion with technical dimensions such as software/hardware, content, and user interface to produce safety concerns. Most (94%) safety concerns related to either unmet data-display needs in the EHR (ie, displayed information available to the end user failed to reduce uncertainty or led to increased potential for patient harm), software upgrades or modifications, data transmission between components of the EHR, or 'hidden dependencies' within the EHR. EHR-related safety concerns involving both unsafe technology and unsafe use of technology persist long after 'go-live' and despite the sophisticated EHR infrastructure represented in our data source. Currently, few healthcare institutions have reporting and analysis capabilities similar to the VA. 
Because EHR-related safety concerns have complex sociotechnical origins, institutions with long-standing as well

  3. Multi-level analysis of electronic health record adoption by health care professionals: A study protocol

    Directory of Open Access Journals (Sweden)

    Labrecque Michel

    2010-04-01

    Full Text Available Abstract Background The electronic health record (EHR) is an important application of information and communication technologies to the healthcare sector. EHR implementation is expected to produce benefits for patients, professionals, organisations, and the population as a whole. These benefits cannot be achieved without the adoption of EHR by healthcare professionals. Nevertheless, the influence of individual and organisational factors in determining EHR adoption is still unclear. This study aims to assess the unique contribution of individual and organisational factors on EHR adoption in healthcare settings, as well as possible interrelations between these factors. Methods A prospective study will be conducted. A stratified random sampling method will be used to select 50 healthcare organisations in the Quebec City Health Region (Canada). At the individual level, a sample of 15 to 30 health professionals will be chosen within each organisation depending on its size. A semi-structured questionnaire will be administered to two key informants in each organisation to collect organisational data. A composite adoption score of EHR adoption will be developed based on a Delphi process and will be used as the outcome variable. Twelve to eighteen months after the first contact, depending on the pace of EHR implementation, key informants and clinicians will be contacted once again to monitor the evolution of EHR adoption. A multilevel regression model will be applied to identify the organisational and individual determinants of EHR adoption in clinical settings. Alternative analytical models would be applied if necessary. Results The study will assess the contribution of organisational and individual factors, as well as their interactions, to the implementation of EHR in clinical settings. Conclusions These results will be very relevant for decision makers and managers who are facing the challenge of implementing EHR in the healthcare system. 
In addition

  4. Implications of electronic health record downtime: an analysis of patient safety event reports.

    Science.gov (United States)

    Larsen, Ethan; Fong, Allan; Wernz, Christian; Ratwani, Raj M

    2018-02-01

    We sought to understand the types of clinical processes, such as image and medication ordering, that are disrupted during electronic health record (EHR) downtime periods by analyzing the narratives of patient safety event report data. From a database of 80 381 event reports, 76 reports were identified as explicitly describing a safety event associated with an EHR downtime period. These reports were analyzed and categorized based on a developed code book to identify the clinical processes that were impacted by downtime. We also examined whether downtime procedures were in place and followed. The reports were coded into categories related to their reported clinical process: Laboratory, Medication, Imaging, Registration, Patient Handoff, Documentation, History Viewing, Delay of Procedure, and General. A majority of reports (48.7%, n = 37) were associated with lab orders and results, followed by medication ordering and administration (14.5%, n = 11). Incidents commonly involved patient identification and communication of clinical information. A majority of reports (46%, n = 35) indicated that downtime procedures either were not followed or were not in place. Only 27.6% of incidents (n = 21) indicated that downtime procedures were successfully executed. Patient safety report data offer a lens into EHR downtime-related safety hazards. Important areas of risk during EHR downtime periods were patient identification and communication of clinical information; these should be a focus of downtime procedure planning to reduce safety hazards. EHR downtime events pose patient safety hazards, and we highlight critical areas for downtime procedure improvement. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  5. Transparency in Transcribing: Making Visible Theoretical Bases Impacting Knowledge Construction from Open-Ended Interview Records

    Directory of Open Access Journals (Sweden)

    Audra Skukauskaite

    2012-01-01

    Full Text Available This article presents a reflexive analysis of two transcripts of an open-ended interview and argues for transparency in transcribing processes and outcomes. By analyzing ways in which a researcher's theories become consequential in producing and using transcripts of an open-ended interview, this paper makes visible the importance of examining and presenting the theoretical bases of transcribing decisions. While scholars across disciplines have argued that transcribing is a theoretically laden process (GREEN, FRANQUIZ & DIXON, 1997; KVALE & BRINKMAN, 2009), few have engaged in reflexive analyses of the data history to demonstrate the consequences particular theoretical and methodological approaches pose in producing knowledge claims and inciting dialogues across traditions. The article demonstrates how theory-method-claim relationships in transcribing influence research transparency and warrantability. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1201146

  6. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information.

  7. Continuous Recording and Interobserver Agreement Algorithms Reported in the "Journal of Applied Behavior Analysis" (1995-2005)

    Science.gov (United States)

    Mudford, Oliver C.; Taylor, Sarah Ann; Martin, Neil T.

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the "Journal of Applied Behavior Analysis (JABA)": Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement,…

  8. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes: (1) it can prepare the raw signal by several methods; (2) it can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
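
The rate-of-change measurement described above can be approximated with a sliding-window linear regression over the fluorescence trace. This is an illustrative sketch only, not the program's actual Excel implementation; the window length and the synthetic transient are assumptions.

```python
import numpy as np

def max_rate_of_change(trace, dt, window=5):
    """Estimate the maximum rate of change of a Ca transient by fitting a
    straight line (least-squares regression) within a sliding window."""
    best_slope = 0.0
    t = np.arange(window) * dt
    for i in range(len(trace) - window + 1):
        seg = trace[i:i + window]
        slope = np.polyfit(t, seg, 1)[0]  # first-order fit; slope in units/s
        if abs(slope) > abs(best_slope):
            best_slope = slope
    return best_slope

# Synthetic transient: linear rise over 50 ms, then exponential decay
dt = 0.001
t = np.arange(0, 1, dt)
trace = np.where(t < 0.05, t / 0.05, np.exp(-(t - 0.05) / 0.2))
print(round(max_rate_of_change(trace, dt), 1))  # → 20.0 (upstroke slope, 1/0.05 per s)
```

Fitting a regression line over a short window, rather than differencing adjacent samples, suppresses the effect of high-frequency noise on the slope estimate.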

  9. Temporal phenome analysis of a large electronic health record cohort enables identification of hospital-acquired complications

    Science.gov (United States)

    Warner, Jeremy L; Zollanvari, Amin; Ding, Quan; Zhang, Peijin; Snyder, Graham M; Alterovitz, Gil

    2013-01-01

    Objective To develop methods for visual analysis of temporal phenotype data available through electronic health records (EHR). Materials and methods 24 580 adults from the multiparameter intelligent monitoring in intensive care V.6 (MIMIC II) EHR database of critically ill patients were analyzed, with significant temporal associations visualized as a map of associations between hospital length of stay (LOS) and ICD-9-CM codes. An expanded phenotype, using ICD-9-CM, microbiology, and computerized physician order entry data, was defined for hospital-acquired Clostridium difficile (HA-CDI). LOS, estimated costs, 30-day post-discharge mortality, and antecedent medication provider order entry were evaluated for HA-CDI cases compared to randomly selected controls. Results Temporal phenome analysis revealed 191 significant codes (p value, adjusted for false discovery rate, ≤0.05). HA-CDI was identified in 414 cases, and was associated with longer median LOS, 20 versus 9 days, and adjusted HR 0.33 (95% CI 0.28 to 0.39). This prolongation carries an estimated annual incremental cost increase of US$1.2–2.0 billion in the USA alone. Discussion Comprehensive EHR data have made large-scale phenome-based analysis feasible. Time-dependent pathological disease states have dynamic phenomic evolution, which may be captured through visual analytical approaches. Although MIMIC II is a single institutional retrospective database, our approach should be portable to other EHR data sources, including prospective ‘learning healthcare systems’. For example, interventions to prevent HA-CDI could be dynamically evaluated using the same techniques. Conclusions The new visual analytical method described in this paper led directly to the identification of numerous hospital-acquired conditions, which could be further explored through an expanded phenotype definition. PMID:23907284
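
The abstract above reports codes significant after false-discovery-rate adjustment of p values (adjusted p ≤ 0.05). A minimal sketch of the standard Benjamini-Hochberg step-up adjustment, which is the usual FDR procedure (the paper's exact implementation is not specified in the abstract, and the p values below are made up for illustration):

```python
import numpy as np

def benjamini_hochberg(pvals):
    """Return Benjamini-Hochberg FDR-adjusted p-values."""
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    order = np.argsort(p)
    adjusted = np.empty(n)
    # Step-up rule: adjusted p_(i) = min over j >= i of p_(j) * n / j
    running_min = 1.0
    for rank in range(n, 0, -1):
        idx = order[rank - 1]
        running_min = min(running_min, p[idx] * n / rank)
        adjusted[idx] = running_min
    return adjusted

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
adj = benjamini_hochberg(pvals)
print([round(a, 3) for a in adj])
```

Codes whose adjusted p value falls at or below 0.05 would then be retained as significant associations.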

  10. A novel assessment of odor sources using instrumental analysis combined with resident monitoring records for an industrial area in Korea

    Science.gov (United States)

    Lee, Hyung-Don; Jeon, Soo-Bin; Choi, Won-Joon; Lee, Sang-Sup; Lee, Min-Ho; Oh, Kwang-Joong

    2013-08-01

    Residents living near the Sa-sang industrial area (SSIA) have suffered from odor pollution continuously since the 1990s. We determined the concentrations of reduced sulfur compounds (RSCs) [hydrogen sulfide (H2S), methyl mercaptan (CH3SH), dimethyl sulfide (DMS), and dimethyl disulfide (DMDS)], nitrogenous compounds (NCs) [ammonia (NH3) and trimethylamine (TMA)], and carbonyl compounds (CCs) [acetaldehyde and butyraldehyde] by instrumental analysis in the SSIA in Busan, Korea, from June to November 2011. We determined odor intensity (OI) based on the concentrations of the odorants and on resident monitoring records (RMR). The mean concentration of H2S was 10 times higher than that of the NCs, CCs, and the other RSCs. The contribution of RSCs to the OI was over 50% at all sites except the A-5 (chemical production) site. In particular, the A-4 (food production) site showed a sum of odor activity values (SOAV) more than 8 times higher than the other sites, suggesting that the A-4 site was the most malodorous area in the SSIA. From the RMR analysis, the annoyance degree (OI ≥ 2) was 51.9% in the industrial area. The 'Rotten' smell arising from the RSCs showed the highest frequency (25.3%), while 'Burned' and 'Other' were more frequent than 'Rotten' in the residential area. We also analyzed the correlation between the odor index calculated by instrumental analysis and the OI from the RMR; the Pearson correlation coefficient (r) was highest for the SOAV, at 0.720, suggesting that food production causes significant annoyance in the SSIA. We also confirm that RMR data can be used effectively to evaluate the characteristics of odorants emitted from the SSIA.

  11. Evidence based practice readiness: A concept analysis.

    Science.gov (United States)

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in the health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as its antecedents and consequences. A Boolean search of PubMed and the Cumulative Index for Nursing and Allied Health Literature was conducted and limited to works published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, the ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness (nursing, training, equipping, and leadership support) are necessary to achieve evidence based practice readiness. Nurse managers are in a position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction, and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  12. The Analysis and Suppression of the spike noise in vibrator record

    Science.gov (United States)

    Jia, H.; Jiang, T.; Xu, X.; Ge, L.; Lin, J.; Yang, Z.

    2013-12-01

    During seismic exploration with a vibrator, seismic recording systems are often affected by random background spike noise, which leads to strong data distortions after the cross-correlation processing of the vibrator method. Partial or total loss of the desired seismic information is possible if no automatic spike reduction is available in the field prior to correlation of the field record. Generally speaking, the original vibrator record is uncorrelated data, in which the signal is not in wavelet form. To obtain a seismic record similar to that of an explosive source, the signal in the uncorrelated data must be compressed into wavelet form by a correlation algorithm. During correlation, spike interference in the data is not only suppressed but also spread out, so spike noise suppression is indispensable for vibrator data. According to numerical simulation results, the effect of spikes on the vibrator record is mainly determined by their amplitude and their proportion of points in the uncorrelated record. When the spike noise ratio in the uncorrelated record reaches 1.5% and the average amplitude exceeds 200, the SNR (signal-to-noise ratio) of the correlated record falls below 0 dB, making it difficult to separate the signal. The amplitude and ratio are in turn determined by the intensity of the background noise. Therefore, when the noise level is strong, spike noise in the uncorrelated vibrator record must be suppressed in order to improve the SNR of the seismic data. To reduce its influence, spike noise must be detected and suppressed in the uncorrelated record. Because a vibrator injects a sweep signal into the ground over a long duration, the peak and valley values of each trace should, ideally, change little. On the basis of the peak and valley values, we can obtain a reference amplitude value. Then the spike can be detected and
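
The detection idea sketched at the end of the abstract can be illustrated on a synthetic sweep. The reference-amplitude rule (median absolute level), the threshold factor, and the interpolation repair below are assumptions for illustration, not the paper's exact criterion.

```python
import numpy as np

def detect_spikes(trace, factor=3.0):
    """Flag samples whose magnitude exceeds a reference amplitude derived
    from the trace's typical level (hypothetical threshold rule)."""
    ref = np.median(np.abs(trace))       # reference amplitude of the sweep
    return np.where(np.abs(trace) > factor * ref)[0]

def suppress_spikes(trace, idx):
    """Replace detected spike samples by linear interpolation of neighbours."""
    clean = trace.astype(float).copy()
    good = np.setdiff1d(np.arange(len(trace)), idx)
    clean[idx] = np.interp(idx, good, clean[good])
    return clean

# Synthetic uncorrelated record: a linear up-sweep with two injected spikes
t = np.linspace(0, 1, 1000)
sweep = np.sin(2 * np.pi * (5 + 20 * t) * t)
noisy = sweep.copy()
noisy[[200, 640]] += 50.0                # spike amplitudes far above the signal
idx = detect_spikes(noisy)
print(idx)
```

Because the sweep amplitude is nearly constant over the whole record, a single reference amplitude per trace suffices; impulsive sources would need a time-varying threshold instead.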

  13. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings pre-existing knowledge into play, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  14. A real-time spike classification method based on dynamic time warping for extracellular enteric neural recording with large waveform variability.

    Science.gov (United States)

    Cao, Yingqiu; Rakhilin, Nikolai; Gordon, Philip H; Shen, Xiling; Kan, Edwin C

    2016-03-01

    Computationally efficient spike recognition methods are required for real-time analysis of extracellular neural recordings. The enteric nervous system (ENS) is important to human health but less well understood, with few appropriate spike recognition algorithms due to large waveform variability. Here we present a method based on dynamic time warping (DTW) with high tolerance to variability in time and magnitude. Adaptive temporal gridding for "fastDTW" in the similarity calculation significantly reduces the computational cost, and automated threshold selection allows real-time classification of extracellular recordings. Our method is first evaluated on synthesized data at different noise levels, improving both classification accuracy and computational cost over the conventional cross-correlation-based template-matching method (CCTM) and PCA + k-means clustering without time warping. Our method is then applied to analyze mouse enteric neural recordings under mechanical and chemical stimuli. Successful classification of biphasic and monophasic spikes is achieved even when the spike variability is larger than a millisecond in width and a millivolt in magnitude. In comparison with conventional template matching and clustering methods, the fastDTW method is computationally efficient with high tolerance to waveform variability. We have developed an adaptive fastDTW algorithm for real-time spike classification of ENS recordings with large waveform variability arising from colony motility, ambient changes and cellular heterogeneity. Copyright © 2015 Elsevier B.V. All rights reserved.
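
The core of DTW-based template matching is a warping-tolerant distance between waveforms. The sketch below shows the classic O(nm) DTW recurrence and nearest-template classification; the paper's adaptive "fastDTW" variant reduces this cost, and the templates and waveforms here are synthetic illustrations.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D waveforms."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of insertion, deletion, and match moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(spike, templates):
    """Assign a spike to the nearest template by DTW distance."""
    return min(templates, key=lambda name: dtw_distance(spike, templates[name]))

t = np.linspace(0, 1, 40)
templates = {
    "monophasic": np.exp(-((t - 0.5) ** 2) / 0.01),
    "biphasic": np.sin(2 * np.pi * t) * np.exp(-((t - 0.5) ** 2) / 0.05),
}
# A time-shifted, rescaled monophasic spike should still match "monophasic"
warped = 1.3 * np.exp(-((t - 0.6) ** 2) / 0.02)
print(classify(warped, templates))
```

Unlike cross-correlation template matching, the warping path lets the same template absorb spikes that are stretched or shifted in time, which is the source of the method's tolerance to waveform variability.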

  15. Analysis of simultaneous MEG and intracranial LFP recordings during Deep Brain Stimulation: a protocol and experimental validation.

    Science.gov (United States)

    Oswal, Ashwini; Jha, Ashwani; Neal, Spencer; Reid, Alphonso; Bradbury, David; Aston, Peter; Limousin, Patricia; Foltynie, Tom; Zrinzo, Ludvic; Brown, Peter; Litvak, Vladimir

    2016-03-01

    Deep Brain Stimulation (DBS) is an effective treatment for several neurological and psychiatric disorders. In order to gain insights into the therapeutic mechanisms of DBS and to advance future therapies a better understanding of the effects of DBS on large-scale brain networks is required. In this paper, we describe an experimental protocol and analysis pipeline for simultaneously performing DBS and intracranial local field potential (LFP) recordings at a target brain region during concurrent magnetoencephalography (MEG) measurement. Firstly we describe a phantom setup that allowed us to precisely characterise the MEG artefacts that occurred during DBS at clinical settings. Using the phantom recordings we demonstrate that with MEG beamforming it is possible to recover oscillatory activity synchronised to a reference channel, despite the presence of high amplitude artefacts evoked by DBS. Finally, we highlight the applicability of these methods by illustrating in a single patient with Parkinson's disease (PD), that changes in cortical-subthalamic nucleus coupling can be induced by DBS. To our knowledge this paper provides the first technical description of a recording and analysis pipeline for combining simultaneous cortical recordings using MEG, with intracranial LFP recordings of a target brain nucleus during DBS. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  16. A model of heat transfer in STM-based magnetic recording on CoNi/Pt multilayers

    NARCIS (Netherlands)

    Zhang, Li; Bain, James A.; Zhu, Jian-Gang; Abelmann, Leon; Onoue, T.

    2006-01-01

    A method of heat-assisted magnetic recording (HAMR) potentially suitable for probe-based storage systems is characterized. In this work, field emission current from a scanning tunneling microscope (STM) tip is used as the heating source. Pulse voltages of 3-7 V with a duration of 500 ns were applied

  17. The realization of the storage of XML and middleware-based data of electronic medical records

    International Nuclear Information System (INIS)

    Liu Shuzhen; Gu Peidi; Luo Yanlin

    2007-01-01

    In this paper, XML and middleware technology are used to design and implement a unified electronic medical record storage and archive management system, and a common storage management model is given. XML is used to describe the structure of electronic medical records and to transform medical data from traditional 'business-centered' medical information into unified 'patient-centered' XML documents, while middleware technology shields the differing database types used by different departments of the hospital and integrates the medical data scattered across those databases, which is conducive to information sharing between hospitals. (authors)

  18. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. On these 100 labels, ETHER achieved a precision of 81 percent and a recall of 92 percent. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
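
Precision and recall as reported above are computed from the extracted terms versus a reference set. A minimal sketch with illustrative Adverse Event terms (not from the actual evaluation data):

```python
def precision_recall(extracted, reference):
    """Precision and recall of extracted terms against a reference set."""
    extracted, reference = set(extracted), set(reference)
    tp = len(extracted & reference)          # true positives
    return tp / len(extracted), tp / len(reference)

extracted = {"nausea", "headache", "rash", "dizziness"}
reference = {"nausea", "headache", "rash", "vomiting"}
p, r = precision_recall(extracted, reference)
print(p, r)  # 0.75 0.75
```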

  19. Development and validation of a simple tape-based measurement tool for recording cervical rotation in patients with ankylosing spondylitis: comparison with a goniometer-based approach.

    Science.gov (United States)

    Maksymowych, Walter P; Mallon, Catherine; Richardson, Rhonda; Conner-Spady, Barbara; Jauregui, Edwin; Chung, Cecilia; Zappala, Lisa; Pile, Kevin; Russell, Anthony S

    2006-11-01

    To compare a tape-based tool for measuring cervical mobility in patients with ankylosing spondylitis (AS) with the widely practiced goniometer-based approach. We developed a novel tape-based approach to measurement of lateral cervical rotation of the neck that is minimally affected by flexion/extension movements of the neck. It requires measuring the distance between a mark at the suprasternal notch and the tragus of the ear. The rotation score is measured in centimeters and constitutes the difference in length between the 2 extremes of cervical rotation (http://www.arthritisdoctors.org/researcher.html). We assessed the tape-based and goniometer-based methods in a total of 263 patients from 3 countries, Canada (n = 205), Australia (n = 29), and Colombia (n = 29), including patients from community and tertiary-based practice. Intra- and interobserver reliability was assessed in a subset of 44 patients by ANOVA and a 2-way mixed effects model. The Bath AS Disease Activity (BASDAI) and Function (BASFI) Indices, and the modified Stoke AS Spinal Score (mSASSS), were also recorded to assess construct validity by correlation coefficient and regression analysis. Responsiveness was assessed in a subset of 33 patients who were either randomized to anti-tumor necrosis factor-α therapy or placebo (n = 22) or received open-label infliximab (n = 4) or pamidronate (n = 7) over a period of 24 weeks. Scores obtained with the tape-based method were normally distributed, while those obtained using the goniometer were skewed towards normal values. Reliability for the goniometer-based approach was excellent [intraclass correlation coefficient (ICC) > 0.90] and very good for the tape-based approach (ICC > 0.80). Significant correlations were noted between age, disease duration, function and structural damage scores, and scores obtained with both methods. 
Responsiveness was high using raw scores obtained with the goniometer (standardized response mean > 0.80) but was not evident

  20. Standard-based comprehensive detection of adverse drug reaction signals from nursing statements and laboratory results in electronic health records.

    Science.gov (United States)

    Lee, Suehyun; Choi, Jiyeob; Kim, Hun-Sung; Kim, Grace Juyun; Lee, Kye Hwa; Park, Chan Hee; Han, Jongsoo; Yoon, Dukyong; Park, Man Young; Park, Rae Woong; Kang, Hye-Ryun; Kim, Ju Han

    2017-07-01

    We propose 2 Medical Dictionary for Regulatory Activities-enabled pharmacovigilance algorithms, MetaLAB and MetaNurse, powered by a per-year meta-analysis technique and an improved subject sampling strategy. This study developed 2 novel algorithms, MetaLAB for laboratory abnormalities and MetaNurse for standard nursing statements, as significantly improved versions of our previous electronic health record (EHR)-based pharmacovigilance method, called CLEAR. Adverse drug reaction (ADR) signals from 117 laboratory abnormalities and 1357 standard nursing statements for all precautionary drugs (n = 101) were comprehensively detected and validated against SIDER (Side Effect Resource), using 11 817 and 76 457 drug-ADR pairs for MetaLAB and MetaNurse, respectively. We demonstrate that MetaLAB (area under the curve, AUC = 0.61 ± 0.18) outperformed CLEAR (AUC = 0.55 ± 0.06) when we applied the same 470 drug-event pairs as the gold standard, as in our previous research. Receiver operating characteristic curves for 101 precautionary terms in the Medical Dictionary for Regulatory Activities Preferred Terms were obtained for MetaLAB and MetaNurse (0.69 ± 0.11; 0.62 ± 0.07), which complemented each other in terms of ADR signal coverage. Novel ADR signals discovered by MetaLAB and MetaNurse were successfully validated against spontaneous reports in the US Food and Drug Administration Adverse Event Reporting System database. The present study demonstrates the symbiosis of laboratory test results and nursing statements for ADR signal detection in terms of their system organ class coverage and performance profiles. Systematic discovery and evaluation of the wide spectrum of ADR signals using standard-based observational electronic health record data across many institutions will affect drug development and use, as well as postmarketing surveillance and regulation. © The Author 2017. Published by Oxford University Press on behalf of the American

  1. VBORNET Gap Analysis: Sand Fly Vector Distribution Models Utilised to Identify Areas of Potential Species Distribution in Areas Lacking Records

    Directory of Open Access Journals (Sweden)

    Bulent Alten

    2016-12-01

    Full Text Available This is the first of a number of planned data papers presenting modelled vector distributions; the models in this paper were produced during the ECDC-funded VBORNET project. This work continues under the VectorNet project, now jointly funded by ECDC and EFSA. This data paper contains the sand fly model outputs produced as part of the VBORNET project. Further data papers will be published after sampling seasons, when more field data will become available, allowing further species to be modelled and existing models to be validated and updated. The data package described here includes those sand fly species first modelled in 2013 and 2014 as part of the VBORNET gap analysis work, which aimed to identify areas of potential species distribution in areas lacking records. It comprises four species models together with suitability masks based on land class and environmental limits. The species included within this paper are 'Phlebotomus ariasi', 'Phlebotomus papatasi', 'Phlebotomus perniciosus' and 'Phlebotomus tobbi'. The known distributions of these species within the project area (Europe, the Mediterranean Basin, North Africa, and Eurasia) are currently incomplete to a greater or lesser degree. The models are designed to fill the gaps with predicted distributions, to provide (a) assistance in targeting surveys to collect distribution data for those areas with no field-validated information, and (b) a first indication of project-wide distributions.

  2. VBORNET gap analysis: Mosquito vector distribution models utilised to identify areas of potential species distribution in areas lacking records.

    Directory of Open Access Journals (Sweden)

    Francis Schaffner

    2016-12-01

    Full Text Available This is the second of a number of planned data papers presenting modelled vector distributions, produced originally during the ECDC-funded VBORNET project. This work continues under the VectorNet project, now jointly funded by ECDC and EFSA. Further data papers will be published after sampling seasons, when more field data will become available, allowing further species to be modelled and existing models to be validated and updated. The data package described here includes those mosquito species first modelled in 2013 and 2014 as part of the VBORNET gap analysis work, which aimed to identify areas of potential species distribution in areas lacking records. It comprises three species models together with suitability masks based on land class and environmental limits. The species included as part of this phase are the mosquitoes 'Aedes vexans', 'Anopheles plumbeus' and 'Culex modestus'. The known distributions of these species within the area covered by the project (Europe, the Mediterranean Basin, North Africa, and Eurasia) are currently incomplete to a greater or lesser degree. The models are designed to fill the gaps with predicted distributions, to provide (a) assistance in targeting surveys to collect distribution data for those areas with no field-validated information, and (b) a first indication of the species distributions within the project areas.

  3. Phylogenetic analysis shows that Neolithic slate plaques from the southwestern Iberian Peninsula are not genealogical recording systems.

    Directory of Open Access Journals (Sweden)

    Daniel García Rivero

    Full Text Available Prehistoric material culture proposed to be symbolic in nature has been the object of considerable archaeological work from diverse theoretical perspectives, yet rarely are methodological tools used to test the interpretations. One such case, from the southwestern Iberian Peninsula, involves engraved stone plaques from megalithic funerary monuments dating to ca. 3,500-2,750 B.C. (calibrated age). One widely accepted proposal is that the plaques are ancient mnemonic devices that record genealogies. The analysis reported here demonstrates that this is not the case, even when the most supportive data and techniques are used. Rather, we suspect there was a common ideological background to the use of plaques across the southwestern Iberian Peninsula, with little or no geographic patterning. This would entail a cultural system in which plaque design was based on a fundamental core idea, with a number of mutable and variable elements surrounding it.

  4. iSpectra: An Open Source Toolbox For The Analysis of Spectral Images Recorded on Scanning Electron Microscopes.

    Science.gov (United States)

    Liebske, Christian

    2015-08-01

    iSpectra is an open source and system-independent toolbox for the analysis of spectral images (SIs) recorded on energy-dispersive spectroscopy (EDS) systems attached to scanning electron microscopes (SEMs). The aim of iSpectra is to assign pixels with similar spectral content to phases, accompanied by cumulative phase spectra with superior counting statistics for quantification. Pixel-to-phase assignment starts with a threshold-based pre-sorting of spectra to create groups of pixels with identical elemental budgets, similar to a method described by van Hoek (2014). Subsequent merging of groups and re-assignment of pixels using elemental or principal component histogram plots enables the user to generate chemically and texturally plausible phase maps. A variety of standard image processing algorithms can be applied to groups of pixels to optimize pixel-to-phase assignments, such as morphology operations to account for overlapping excitation volumes at pixels located on phase boundaries. iSpectra supports batch processing and allows pixel-to-phase assignments to be applied to an unlimited number of SIs, thus enabling phase mapping of large-area samples like petrographic thin sections.
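
The threshold-based pre-sorting step can be sketched as grouping pixels by a binary elemental signature (element present or absent after thresholding), so that pixels with identical budgets fall into one group. The element names, thresholds, and tiny image below are illustrative assumptions, not iSpectra's actual data model.

```python
import numpy as np

def presort_pixels(elem_maps, thresholds):
    """Group pixels by their thresholded elemental budget: each pixel gets a
    tuple of present/absent bits, and identical tuples form one group."""
    names = sorted(elem_maps)
    # Stack one presence/absence bit per element into a signature per pixel
    bits = np.stack([elem_maps[n] > thresholds[n] for n in names], axis=-1)
    h, w, k = bits.shape
    groups = {}
    for pix, sig in enumerate(map(tuple, bits.reshape(h * w, k))):
        groups.setdefault(sig, []).append(pix)
    return names, groups

# Tiny 2x2 "spectral image": integrated counts for two elements per pixel
elem_maps = {
    "Si": np.array([[120, 118], [5, 4]]),
    "Fe": np.array([[3, 90], [95, 2]]),
}
thresholds = {"Si": 50, "Fe": 50}
names, groups = presort_pixels(elem_maps, thresholds)
print(len(groups))  # number of distinct elemental budgets found
```

Summing the raw spectra of all pixels within one group would then yield the cumulative phase spectrum with improved counting statistics mentioned in the abstract.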

  5. Predictive value of casual ECG-based resting heart rate compared with resting heart rate obtained from Holter recording

    DEFF Research Database (Denmark)

    Carlson, Nicholas; Dixen, Ulrik; Marott, Jacob L

    2014-01-01

    BACKGROUND: Elevated resting heart rate (RHR) is associated with cardiovascular mortality and morbidity. Assessment of heart rate (HR) from Holter recording may afford a more precise estimate of the effect of RHR on cardiovascular risk, as compared to casual RHR. Comparative analysis was carried...... HRs recorded and mean HR calculated from all daytime HRs. Follow-up was recorded from public registers. Outcome measure was hazard rate for the combined endpoint of cardiovascular mortality, non-fatal heart failure and non-fatal acute myocardial infarction. Comparison of casual RHR, Holter RHR...... was found to be marginally superior as a predictor of cardiovascular morbidity and mortality. The results presented here do not however warrant the abandonment of a tested epidemiological variable....

  6. Implementation of a Cloud-Based Electronic Medical Record to Reduce Gaps in the HIV Treatment Continuum in Rural Kenya.

    Directory of Open Access Journals (Sweden)

    John Haskew

    Full Text Available Electronic medical record (EMR) systems are increasingly being adopted to support the delivery of health care in developing countries, and their implementation can help to strengthen pathways of care and close gaps in the HIV treatment cascade by improving access to and use of data to inform clinical and public health decision-making. This study implemented a novel cloud-based electronic medical record system in an HIV outpatient setting in Western Kenya and evaluated its impact on reducing gaps in the HIV treatment continuum, including missing data and patient eligibility for ART. The impact of the system was assessed using a two-sample test of proportions pre- and post-implementation of EMR-based data verification and clinical decision support. Significant improvements in data quality and provision of clinical care were recorded through implementation of the EMR system, helping to ensure that patients who are eligible for HIV treatment receive it early. A total of 2,169 and 764 patient records had missing data pre-implementation and post-implementation of EMR-based data verification and clinical decision support, respectively. A total of 1,346 patients were eligible for ART, but not yet started on ART, pre-implementation, compared to 270 patients post-implementation. EMR-based data verification and clinical decision support can reduce gaps in HIV care, including missing data and eligibility for ART. A cloud-based model of EMR implementation removes the need for local clinic infrastructure and has the potential to enhance data sharing at different levels of health care to inform clinical and public health decision-making. A number of issues, including data management and patient confidentiality, must be considered, but significant improvements in data quality and provision of clinical care were recorded through implementation of this EMR model.

  7. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  8. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  9. Recording the dynamic endocytosis of single gold nanoparticles by AFM-based force tracing.

    Science.gov (United States)

    Ding, Bohua; Tian, Yongmei; Pan, Yangang; Shan, Yuping; Cai, Mingjun; Xu, Haijiao; Sun, Yingchun; Wang, Hongda

    2015-05-07

    We utilized force tracing to directly record the endocytosis of single gold nanoparticles (Au NPs) with different sizes, revealing the size-dependent endocytosis dynamics and the crucial role of membrane cholesterol. The force, duration and velocity of Au NP invagination were, for the first time, accurately determined at the single-particle level with microsecond time resolution.

  10. Information security risk measures for cloud-based personal health records

    CSIR Research Space (South Africa)

    Mxoli, A

    2014-11-01

    Full Text Available Personal Health Records (PHRs) provide a convenient way for individuals to better manage their health. With the advancement in technology, they can be stored via Cloud Computing. These are pay-per-use applications offered as a service over...

  11. Data Mining of NASA Boeing 737 Flight Data: Frequency Analysis of In-Flight Recorded Data

    Science.gov (United States)

    Butterfield, Ansel J.

    2001-01-01

    Data recorded during flights of the NASA Trailblazer Boeing 737 have been analyzed to ascertain the presence of aircraft structural responses from various excitations such as the engine, aerodynamic effects, wind gusts, and control system operations. The NASA Trailblazer Boeing 737 was chosen as a focus of the study because of a large quantity of its flight data records. The goal of this study was to determine if any aircraft structural characteristics could be identified from flight data collected for measuring non-structural phenomena. A number of such data were examined for spatial and frequency correlation as a means of discovering hidden knowledge of the dynamic behavior of the aircraft. Data recorded from on-board dynamic sensors over a range of flight conditions showed consistently appearing frequencies. Those frequencies were attributed to aircraft structural vibrations.
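
The kind of frequency screening described, searching sensor time series for consistently appearing spectral peaks, can be sketched with a discrete Fourier transform. The 12 Hz mode and the sampling rate below are invented for illustration, not taken from the flight data:

```python
import numpy as np

fs = 256.0                          # sample rate in Hz (hypothetical)
t = np.arange(2048) / fs
rng = np.random.default_rng(0)
# synthetic accelerometer trace: one structural mode at 12 Hz plus noise
signal = np.sin(2 * np.pi * 12.0 * t) + 0.2 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))          # one-sided magnitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
```

Peaks that recur at the same frequency across many flight conditions are the candidates for structural modes, as opposed to excitation that shifts with engine or airspeed settings.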

  12. A record/replay system for the harmonic analysis of nuclear reactor flux noise

    International Nuclear Information System (INIS)

    Lawrence, L.A.J.; Corran, E.R.

    1960-02-01

    The design of a record/replay system for the determination of neutron flux spectra is discussed and circuit details and performance figures are given. Frequency modulation is used with a carrier frequency of 1,000 cycles per second, the complete system having an overall bandwidth of 100 cycles per second. The noise is recorded on magnetic tape and when replayed is analysed into a spectrum by means of a selective amplifier used in conjunction with an infinitely variable speed control on the tape. It is shown that if the lowest spectral frequency is 0.01 cycles per second a recording time of many hours is necessary. The DIDO noise spectrum is analysed and shown to be contained in a bandwidth of a few cycles per second. (author)

  13. Analysis of the foetal heart rate in cardiotocographic recordings through a progressive characterization of decelerations

    Directory of Open Access Journals (Sweden)

    Fuentealba Patricio

    2017-09-01

    Full Text Available The main purpose of this work is to propose a new method for the characterization and visualization of foetal heart rate (FHR) deceleration episodes in terms of their depth, length and location. This is performed through the estimation of a progressive baseline, computed using a median filter, which allows the identification and tracking of the evolution of decelerations in cardiotocographic (CTG) recordings. The proposed method has been analysed using three representative cases of normal and pathological CTG recordings extracted from the CTU-UHB database, freely available on the PhysioNet website. Results show that both the progressive baseline and the parameterized deceleration episodes can describe different time-variant behaviour, whose characteristics and progression can help the observer to discriminate between normal and pathological FHR signal patterns. This opens perspectives for the classification of non-reassuring CTG recordings as a sign of foetal acidemia.
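
A simplified sketch of this kind of approach, a running-median baseline with threshold-based deceleration detection, applied to a synthetic 1 Hz FHR trace. The window length, drop threshold, and minimum duration are illustrative values, not the paper's parameters:

```python
from statistics import median

def progressive_baseline(fhr, window=301):
    """Running-median baseline; the window is clamped at the trace edges."""
    half = window // 2
    return [median(fhr[max(0, i - half):i + half + 1]) for i in range(len(fhr))]

def find_decelerations(fhr, baseline, drop=15.0, min_len=15):
    """Episodes where FHR stays at least `drop` bpm below the baseline."""
    episodes, start = [], None
    for i, (f, b) in enumerate(zip(fhr, baseline)):
        if f < b - drop:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                episodes.append((start, i))
            start = None
    if start is not None and len(fhr) - start >= min_len:
        episodes.append((start, len(fhr)))
    return episodes

# synthetic 1 Hz trace: stable 140 bpm with one 60 s dip to 110 bpm
fhr = [140.0] * 600
for i in range(300, 360):
    fhr[i] = 110.0

base = progressive_baseline(fhr)
decels = find_decelerations(fhr, base)
```

Because the median window is long relative to the dip, the baseline stays near 140 bpm through the deceleration, which is what lets the episode's depth and length be measured against it.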

  14. An analysis of electronic health record-related patient safety incidents.

    Science.gov (United States)

    Palojoki, Sari; Mäkelä, Matti; Lehtonen, Lasse; Saranto, Kaija

    2017-06-01

    The aim of this study was to analyse electronic health record-related patient safety incidents in the patient safety incident reporting database in fully digital hospitals in Finland. We compare Finnish data to similar international data and discuss their content with regard to the literature. We analysed the types of electronic health record-related patient safety incidents that occurred at 23 hospitals during a 2-year period. A procedure of taxonomy mapping served to allow comparisons. This study represents a rare examination of patient safety risks in a fully digital environment. The proportion of electronic health record-related incidents was markedly higher in our study than in previous studies with similar data. Human-computer interaction problems were the most frequently reported. The results show the possibility of error arising from the complex interaction between clinicians and computers.

  15. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will present a formal framework for describing phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  16. Comparison of radon-daughter exposures calculated for US- underground uranium miners based on MSHA and company records

    International Nuclear Information System (INIS)

    Cooper, W.E.

    1981-01-01

    How accurate are the past and present radon-daughter exposure records of underground uranium miners employed in the United States? This often-debated question is essential for the future substantiation of safe exposure limits. An apparent discrepancy between company-reported exposures and Mining Enforcement and Safety Administration (MESA) projected exposures was detected in 1977. For these reasons, an updated comparison of these exposure data was indicated. This paper gives some of the conclusions of the earlier study and compares more recent exposure records compiled by the Atomic Industrial Forum, Inc., with projected exposures based on sampling by Federal mine inspectors.

  17. Eye of the storm: analysis of shelter treatment records of evacuees to Acadiana from Hurricanes Katrina and Rita.

    Science.gov (United States)

    Caillouet, L Philip; Paul, P Joseph; Sabatier, Steven M; Caillouet, Kevin A

    2012-01-01

    specific evaluation and care, no population-based experimental hypothesis was framed, nor was the effectiveness of any specific intervention researched at the time. This study reports experiential data collected without a particular preconceived hypothesis, because no specific outcome measures had been designed in advance. Data analysis revealed much about the origins and demographics of the evacuees, their hurricane-related risks and injuries, and the loss of continuity in their prior and ongoing healthcare. The authors believe that much can be learned from studying data collected in evacuee triage clinics, and that such insights may influence personal and official preparedness for future events. In the Katrina-Rita evacuations, only paper-based data collection mechanisms were used (and those with great inconsistency), and there was no predeployed mechanism for close-to-real-time collation of evacuee data. Deployment of simple electronic health record systems might well have allowed a better real-time understanding of the unfolding events upon the arrival of evacuees in shelters. Information and communication technologies have advanced since 2005, but predisaster staging and training on such technologies is still lacking.

  18. Joint analysis of longitudinal feed intake and single recorded production traits in pigs using a novel horizontal model

    DEFF Research Database (Denmark)

    Shirali, M.; Strathe, A. B.; Mark, T.

    2017-01-01

    A novel Horizontal model is presented for multitrait analysis of longitudinal traits through random regression analysis combined with single recorded traits. Weekly ADFI on test for Danish Duroc, Landrace, and Yorkshire boars were available from the national test station and were collected from 30 to 100 kg BW. Single recorded production traits of ADG from birth to 30 kg BW (ADG30), ADG from 30 to 100 kg BW (ADG100), and lean meat percentage (LMP) were available from breeding herds or the national test station. The Horizontal model combined random regression analysis of feed intake (FI ... and first-order Legendre polynomials of age on test, respectively. The fixed effect and random residual variance were estimated for each weekly FI trait. Residual feed intake (RFI) was derived from the conditional distribution of FI given the breeding values of ADG100 and LMP. The heritability of FI varied ...
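
Random regression on Legendre polynomials, as used in models of this kind, needs covariates built from a standardised age-on-test variable. A minimal sketch of constructing the zero- and first-order design columns; the weekly age grid and the polynomial order are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial.legendre import legvander

ages = np.arange(0, 71, 7, dtype=float)   # hypothetical weekly ages on test, in days
# standardise age to [-1, 1], the natural domain of the Legendre polynomials
t = 2.0 * (ages - ages.min()) / (ages.max() - ages.min()) - 1.0
X = legvander(t, 1)                       # columns: P0(t) = 1, P1(t) = t
```

The columns of `X` then enter the mixed model as regressors whose coefficients (intercept and slope over the test period) carry the random genetic effects.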

  19. Preference-Based Recommendations for OLAP Analysis

    Science.gov (United States)

    Jerbi, Houssem; Ravat, Franck; Teste, Olivier; Zurfluh, Gilles

    This paper presents a framework for integrating OLAP and recommendations. We focus on the anticipatory recommendation process that assists the user during OLAP analysis by proposing the forthcoming analysis step. We present a context-aware preference model that matches decision-makers' intuition, and we discuss a preference-based approach for generating personalized recommendations.

  20. Source reconstruction based on subdural EEG recordings adds to the presurgical evaluation in refractory frontal lobe epilepsy.

    Science.gov (United States)

    Ramantani, Georgia; Cosandier-Rimélé, Delphine; Schulze-Bonhage, Andreas; Maillard, Louis; Zentner, Josef; Dümpelmann, Matthias

    2013-03-01

    In presurgical investigations of refractory frontal lobe epilepsy, subdural EEG recordings offer extensive cortical coverage, but may overlook deep sources. Electrical Source Localization (ESL) from subdural recordings could overcome this sampling limitation. This study aims to assess the clinical relevance of this new method in refractory frontal lobe epilepsy associated with focal cortical dysplasia. In 14 consecutive patients, we retrospectively compared: (i) the ESL of interictal spikes to the conventional irritative and seizure onset zones; (ii) the surgical outcome of cases with congruent ESL and resection volume to cases with incongruent ESL and resection volume. Each spike type was averaged to serve as a template for ESL by the MUSIC and sLORETA algorithms. Results were superimposed on the corresponding pre- and post-surgical MRI. Both ESL methods were congruent and consistent with conventional electroclinical analysis in all patients. In 7 cases, ESL identified a common deep source for spikes of different 2D localizations. The inclusion of ESL in the resection volume correlated with seizure freedom. ESL from subdural recordings provided clinically relevant results in patients with refractory frontal lobe epilepsy. ESL complements the conventional analysis of subdural recordings. Its potential in improving tailored resections and surgical outcomes should be prospectively assessed. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  1. Statistical analysis of automatically detected ion density variations recorded by DEMETER and their relation to seismic activity

    Directory of Open Access Journals (Sweden)

    Michel Parrot

    2012-04-01

    Full Text Available Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, there are also ionospheric variations without seismic activity. The present study is devoted to a statistical analysis of the night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted over the 15 days before the earthquake, whenever the track of the satellite orbit was at less than 1,500 km from the earthquake epicenter. The result of this first step provided the variations relative to the background in the vicinity of the epicenter for each of the 15 days before each earthquake. In the second step, comparisons were carried out between the largest variations over the 15 days and the earthquake magnitudes. The statistical analysis is based on calculation of the median values as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases, where on the one hand, the locations of the epicenters were randomly modified, and on the other hand, the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to the earthquakes than prior to random events, and that the perturbations increase with the earthquake magnitudes.
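
The core of such a statistical test, comparing median perturbation intensities near real epicenters against a control database with randomised locations, can be sketched as follows. All numbers here are synthetic stand-ins, not DEMETER measurements:

```python
import random
from statistics import median

random.seed(42)
# synthetic relative ion-density variations (arbitrary units):
# samples taken near real epicenters vs. at randomised control locations
near_epicenters = [abs(random.gauss(1.4, 0.5)) for _ in range(500)]
randomised_ctrl = [abs(random.gauss(1.0, 0.5)) for _ in range(500)]

excess = median(near_epicenters) - median(randomised_ctrl)
```

A positive `excess` is the pattern the study reports: perturbations before real earthquakes exceed those before random events. In practice the medians would also be binned by magnitude, depth, and location class.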


  2. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of a common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps to better understand the characteristics of team-based care in clinical practice, as well as to promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  3. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...

  4. Structuring and coding in health care records: a qualitative analysis using diabetes as a case study

    Directory of Open Access Journals (Sweden)

    Ann R R Robertson

    2015-03-01

    Full Text Available Background: Globally, diabetes mellitus presents a substantial burden to individuals and healthcare systems. Structuring and/or coding of medical records underpin attempts to improve information sharing and searching, potentially bringing clinical and secondary-use benefits. Aims and objectives: We investigated if, how and why records for adults with diabetes were structured and/or coded, and explored stakeholders’ perceptions of current practice. Methods: We carried out a qualitative, theoretically-informed case study of documenting healthcare information for diabetes patients in family practice and hospital settings, using semi-structured interviews, observations, systems demonstrations and documentary data. Results: We conducted 22 interviews and four on-site observations, and reviewed 25 documents. For secondary uses (research, audit, public health and service planning), the benefits of highly structured and coded diabetes data were clearly articulated. Reported clinical benefits in terms of managing and monitoring diabetes, and perhaps encouraging patient self-management, were modest. We observed marked differences in levels of record structuring and/or coding between settings, and found little evidence that these data were being exploited to improve information sharing between them. Conclusions: Using high levels of data structuring and coding in medical records for diabetes patients has potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK.

  5. Analysis of Service Records Management Systems for Rescue and Retention of Cultural Resource Documents

    Science.gov (United States)

    2009-06-01

    Control and Communications). 5.1.3.4. Establish and maintain a register of automated records management products that have been certified as...inspection Symbolic items Tract ownership data Insignias, guidons, medals, flags, seals, plaques, badges, ribbons, coats of arms, pennants, streamers...symbolic item, insignia, guidon, medal, flag, seal, ribbons, coat arm, pennant, streamer, aircraft marking, illustrations, designs, paintings, photo

  6. Recording and analysis of locomotion in dairy cows with 3D accelerometers

    NARCIS (Netherlands)

    Mol, de R.M.; Lammers, R.J.H.; Pompe, J.C.A.M.; Ipema, A.H.; Hogewerf, P.H.

    2009-01-01

    An automated method for lameness detection can be an alternative for detection by regular observations. Accelerometers attached to a leg of the dairy cow can be used to record the locomotion of a dairy cow. In an experiment the 3D acceleration of the right hind leg during walking of three dairy cows

  7. Recorded fatal and permanently disabling injuries in South African manufacturing industry - Overview, analysis and reflection

    DEFF Research Database (Denmark)

    Hedlund, Frank Huess

    2013-01-01

    Studies on occupational accident statistics in South Africa are few and far between; the most recent paper on the manufacturing sector was published in 1990. Accidents in South Africa are recorded in two systems: Exhaustive information is available from the insurance system under the Workmen’s Co...

  8. A Correlational Analysis: Electronic Health Records (EHR) and Quality of Care in Critical Access Hospitals

    Science.gov (United States)

    Khan, Arshia A.

    2012-01-01

    Driven by the compulsion to improve the evident paucity in the quality of care, especially in critical access hospitals in the United States, policy makers, healthcare providers, and administrators have taken the advice of researchers suggesting the integration of technology in healthcare. The Electronic Health Record (EHR) System composed of multiple…

  9. The Accuracy Analysis of Five-planet Movements Recorded in China in the Han Dynasty

    Science.gov (United States)

    Zhang, J.

    2010-04-01

    The observation and study of the five planets are an important part of ancient calendars and also one of the methods used to evaluate their accuracy, so astronomers paid much attention to this field. In "Hanshu·Tian wen zhi" and "Xuhanshu·Tian wen zhi", there are 160 records with detailed dates and positions, which are calculated and studied in this paper using modern astronomical methods. The calculated results show that these positions are mostly correct, accounting for 77.5% of the total records, while the remaining 36 records are incorrect, accounting for 22.5%. In addition, there are three typical or special forms of five-planet movements. The numbers of “shou”, “he” and “fan” movements are 14, 22 and 46, accounting for 9%, 14% and 29%, respectively. In this paper, a detailed study of these three typical forms of five-planet movements is carried out. We think that the 36 incorrect records are caused by various reasons, but mainly by the data processing carried out by later generations.
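
The reported percentages are simple proportions of the 160 dated records, which can be checked directly:

```python
total = 160
incorrect = 36
correct = total - incorrect            # 124 records with correct positions

assert correct / total == 0.775        # 77.5% of records are correct
assert incorrect / total == 0.225      # 22.5% are incorrect

# "shou", "he" and "fan" events as rounded shares of all 160 records
counts = {"shou": 14, "he": 22, "fan": 46}
shares = {k: round(100 * v / total) for k, v in counts.items()}
```

The rounded shares come out to 9%, 14% and 29%, matching the figures quoted in the abstract.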

  10. Running Records and First Grade English Learners: An Analysis of Language Related Errors

    Science.gov (United States)

    Briceño, Allison; Klein, Adria F.

    2018-01-01

    The purpose of this study was to determine if first-grade English Learners made patterns of language related errors when reading, and if so, to identify those patterns and how teachers coded language related errors when analyzing English Learners' running records. Using research from the fields of both literacy and Second Language Acquisition, we…

  11. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient’s deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by four force sensors. The transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time, where data analysis is conducted. The article explains the principle of operation as well as the algorithm for evaluating the measurement uncertainty of the COP (Centre of Pressure) coordinates (x, y).
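
The COP coordinates follow from a force-weighted average of the four sensor positions. A minimal sketch for a rectangular plate with one vertical-force sensor at each corner; the geometry and sensor ordering here are assumptions for illustration, not the authors' layout:

```python
def centre_of_pressure(forces, half_x, half_y):
    """COP (x, y) for a rectangular plate with a vertical-force sensor at
    each corner. Forces in newtons, half-plate dimensions in metres.
    Sensor order: front-left, front-right, rear-right, rear-left."""
    fl, fr, rr, rl = forces
    total = fl + fr + rr + rl
    x = half_x * ((fr + rr) - (fl + rl)) / total   # right minus left loading
    y = half_y * ((fl + fr) - (rl + rr)) / total   # front minus rear loading
    return x, y

# equal loading puts the COP at the plate centre
centre = centre_of_pressure((200.0, 200.0, 200.0, 200.0), 0.25, 0.20)
# all weight over the front-right sensor moves the COP to that corner
corner = centre_of_pressure((0.0, 800.0, 0.0, 0.0), 0.25, 0.20)
```

Tracking (x, y) over time then yields the sway trajectory from which stabilographic measures, and their uncertainties, are computed.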

  12. Traceability of Biologics in The Netherlands: An Analysis of Information-Recording Systems in Clinical Practice and Spontaneous ADR Reports.

    Science.gov (United States)

    Klein, Kevin; Scholl, Joep H G; Vermeer, Niels S; Broekmans, André W; Van Puijenbroek, Eugène P; De Bruin, Marie L; Stolk, Pieter

    2016-02-01

    Pharmacovigilance requirements for biologics mandate that EU Member States shall ensure that any biologic that is the subject of a suspected adverse drug reaction (ADR) is identifiable by brand name and batch number. Recent studies showed that brand name identification is well established, whereas batch numbers are (still) poorly reported. We evaluated information-recording systems and practices in the Dutch hospital setting to identify determinants for brand name and batch number recording as well as success factors and bottlenecks for traceability. We surveyed Dutch hospital pharmacists with an online questionnaire on systems and practices in hospitals for recording brand names and batch numbers. Additionally, we performed an analysis of the traceability of recombinant biologics in spontaneous ADR reports (received between 2009 and 2014) from the Netherlands Pharmacovigilance Centre Lareb. The survey showed that brand names are not routinely recorded in the clinical practice of Dutch hospitals, whereas batch numbers are poorly recorded. Seventy-six percent of the 1523 ADR reports for recombinant biologics had a traceable brand name whereas 5% of these reports contained a batch number. The results suggest a possible relationship between the availability of brand and batch number information in clinical practice and the inclusion of this information in ADR reports for biologics. The limited traceability of brand names and batch numbers in ADR reports may be primarily caused by the shortcomings in the recording of information in clinical practice. We recommend efforts to improve information-recording systems as a first step to improve the traceability of biologics in ADR reporting.

  13. Diffractive Optical Elements with a Large Angle of Operation Recorded in Acrylamide Based Photopolymer on Flexible Substrates

    Directory of Open Access Journals (Sweden)

    Hoda Akbari

    2014-01-01

    Full Text Available A holographic device characterised by a large angular range of operation is under development. The aim of this study is to increase the angular working range of the diffractive lens by stacking three layers of high efficiency optical elements on top of each other so that light is collected (and focussed) from a broader range of angles. The angular range of each individual lens element is important, and work has already been done in an acrylamide-based photosensitive polymer to broaden the angular range of individual elements using holographic recording at a low spatial frequency. This paper reports new results on the angular selectivity of stacked diffractive lenses. A working range of 12° is achieved. The diffractive focussing elements were recorded holographically with a central spatial frequency of 300 l/mm using exposure energy of 60 mJ/cm2 at a range of recording angles. At this spatial frequency, with layers of thickness 50 ± 5 µm, a diffraction efficiency of 80% and 50% was achieved in the single lens element and the combined device, respectively. The optical recording process and the properties of the multilayer structure are described and discussed. Holographic recording of a single lens element is also successfully demonstrated on a flexible glass substrate (Corning® Willow® Glass) for the first time.

  14. High agreement between the new Mongolian electronic immunization register and written immunization records: a health centre based audit

    Directory of Open Access Journals (Sweden)

    Jocelyn Chan

    2017-09-01

    Full Text Available Introduction: Monitoring of vaccination coverage is vital for the prevention and control of vaccine-preventable diseases. Electronic immunization registers have been increasingly adopted to assist with the monitoring of vaccine coverage; however, there is limited literature about the use of electronic registers in low- and middle-income countries such as Mongolia. We aimed to determine the accuracy and completeness of the newly introduced electronic immunization register for calculating vaccination coverage and determining vaccine effectiveness within two districts in Mongolia in comparison to written health provider records. Methods: We conducted a cross-sectional record review among children 2–23 months of age vaccinated at immunization clinics within the two districts. We linked data from written records with the electronic immunization register using the national identification number to determine the completeness and accuracy of the electronic register. Results: Both completeness (90.9%; 95% CI: 88.4–93.4) and accuracy (93.3%; 95% CI: 84.1–97.4) of the electronic immunization register were high when compared to written records. The increase in completeness over time indicated a delay in data entry. Conclusion: Through this audit, we have demonstrated concordance between a newly introduced electronic register and health provider records in a middle-income country setting. Based on this experience, we recommend that electronic registers be accompanied by routine quality assurance procedures for the monitoring of vaccination programmes in such settings.
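
A normal-approximation 95% confidence interval for a completeness proportion like the one reported can be sketched as follows; the denominator of 508 linked records is a hypothetical value chosen for illustration, since the abstract does not state the sample size:

```python
import math

def prop_ci_95(p, n):
    """Wald 95% confidence interval for a proportion (normal approximation)."""
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

lo, hi = prop_ci_95(0.909, 508)   # 90.9% completeness, assumed n = 508
```

For small samples or proportions near 0 or 1, a Wilson or exact (Clopper-Pearson) interval would be the more robust choice.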

  15. Structuring and coding in health care records: a qualitative analysis using diabetes as a case study.

    Science.gov (United States)

    Robertson, Ann R R; Fernando, Bernard; Morrison, Zoe; Kalra, Dipak; Sheikh, Aziz

    2015-03-27

    Globally, diabetes mellitus presents a substantial and increasing burden to individuals, health care systems and society. Structuring and coding of information in the electronic health record underpin attempts to improve sharing and searching for information. Digital records for those with long-term conditions are expected to bring direct and secondary-use benefits, and potentially to support patient self-management. We sought to investigate if, how and why records for adults with diabetes were structured and coded, and to explore a range of UK stakeholders' perceptions of current practice in the National Health Service. We carried out a qualitative, theoretically informed case study of documenting health care information for diabetes in family practice and hospital settings in England, using semi-structured interviews, observations, systems demonstrations and documentary data. We conducted 22 interviews and four on-site observations. With respect to secondary uses (research, audit, public health and service planning), interviewees clearly articulated the benefits of highly structured and coded diabetes data, and it was believed that benefits would expand through linkage to other datasets. Direct, more marginal, clinical benefits in terms of managing and monitoring diabetes and perhaps encouraging patient self-management were also reported. We observed marked differences in levels of record structuring and/or coding between family practices, where it was high, and the hospital. We found little evidence that structured and coded data were being exploited to improve information sharing between care settings. Using high levels of data structuring and coding in records for diabetes patients has the potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK. A first step would be for hospitals to attain levels of health information technology infrastructure and systems use commensurate with family practices.

  16. An empirical approach to predicting long term behavior of metal particle based recording media

    Science.gov (United States)

    Hadad, Allan S.

    1992-01-01

    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3-H2O resulting in acicular, polycrystalline, body centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed in order for iron particles to be considered as a viable recording medium for long term archival (i.e., 25+ years) information storage. The primary means of establishing stability is through passivation, or controlled oxidation, of the iron particle's surface. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity for silicon-containing iron particles between 50-120 C and 3-89 percent relative humidity. The methodology by which experimental data were collected and analyzed, leading to a predictive capability, is discussed.
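
Empirical predictions of this kind are commonly built on an Arrhenius-style fit of degradation rate against inverse temperature, extrapolated from accelerated-aging conditions down to storage temperature. The sketch below uses invented numbers, not the paper's data, and ignores the humidity term for brevity:

```python
import math

# hypothetical accelerated-aging data: (temperature in K, fractional magnetic loss per day)
data = [(323.15, 2.1e-4), (353.15, 1.9e-3), (393.15, 2.4e-2)]

# Arrhenius-style model: ln(rate) = ln(A) - Ea/(R*T), i.e. linear in x = 1/T
xs = [1.0 / t for t, _ in data]
ys = [math.log(r) for _, r in data]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))         # least-squares slope = -Ea/R
intercept = ybar - slope * xbar                       # ln(A)

# extrapolate the fitted line down to room temperature (298.15 K)
rate_25C = math.exp(intercept + slope / 298.15)
```

The extrapolated room-temperature rate is what supports a 25+ year archival claim; a full treatment would add a humidity factor and confidence bounds on the fit.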

  17. Design of an Electronic Healthcare Record Server Based on Part 1 of ISO EN 13606

    Directory of Open Access Journals (Sweden)

    Tony Austin

    2011-01-01

    Full Text Available ISO EN 13606 is a newly approved standard at European and ISO levels for the meaningful exchange of clinical information between systems. Although conceived as an inter-operability standard to which existing electronic health record (EHR) systems will transform legacy data, the requirements met and architectural approach reflected in this standard also make it a good candidate for the internal architecture of an EHR server. The authors have built such a server for the storage of healthcare records and demonstrated that it is possible to use ISO EN 13606 part 1 as the basis of an internal system architecture. The development of the system and some of the applications of the server are described in this paper. It is the first known operational implementation of the standard as an EHR system.

  18. Theory-based Support for Mobile Language Learning: Noticing and Recording

    Directory of Open Access Journals (Sweden)

    Agnes Kukulska-Hulme

    2009-04-01

    Full Text Available This paper considers the issue of 'noticing' in second language acquisition, and argues for the potential of handheld devices to: (i) support language learners in noticing and recording noticed features 'on the spot', to help them develop their second language system; (ii) help language teachers better understand the specific difficulties of individuals or those from a particular language background; and (iii) facilitate data collection by applied linguistics researchers, which can be fed back into educational applications for language learning. We consider: theoretical perspectives drawn from the second language acquisition literature, relating these to the practice of writing language learning diaries; and the potential for learner modelling to facilitate recording and prompting noticing in mobile assisted language learning contexts. We then offer guidelines for developers of mobile language learning solutions to support the development of language awareness in learners.

  19. Implementation of computer-based patient records in primary care: the societal health economic effects.

    OpenAIRE

    Arias-Vimárlund, V.; Ljunggren, M.; Timpka, T.

    1996-01-01

    OBJECTIVE: Exploration of the societal health economic effects occurring during the first year after implementation of Computerised Patient Records (CPRs) at Primary Health Care (PHC) centres. DESIGN: Comparative case studies of practice processes and their consequences one year after CPR implementation, using the constant comparison method. Application of transaction-cost analyses at a societal level on the results. SETTING: Two urban PHC centres under a managed care contract in Ostergötland...

  20. Certificate-Based Encryption with Keyword Search: Enabling Secure Authorization in Electronic Health Record

    OpenAIRE

    Clémentine Gritti; Willy Susilo; Thomas Plantard

    2016-01-01

    In an e-Health scenario, we study how the practitioners are authorized when they are requesting access to medical documents containing sensitive information. Consider the following scenario. A clinician wants to access and retrieve a patient’s Electronic Health Record (EHR), and this means that the clinician must acquire sufficient access right to access this document. As the EHR is within a collection of many other patients, the clinician would need to specify some requirements (such as a ke...

  1. Two years of recorded data for a multisource heat pump system: A performance analysis

    International Nuclear Information System (INIS)

    Busato, F.; Lazzarin, R.M.; Noro, M.

    2013-01-01

    The concept of a low energy building in a temperate climate (according to the Koppen climate classification) is based upon the following principles: reduction of heat losses through enhanced insulation; the inclusion of heat recovery on mechanical ventilation; and the use of high efficiency heating/cooling systems integrated with renewable technologies. It is almost impossible to achieve optimum results in terms of global energy efficiency if one of these elements is omitted from the design. In 2009, a new school building, integrating these three key elements, was opened in Agordo town, located in northern Italy. The main design features of the building incorporate a well insulated envelope and a space heating and ventilation system driven by an innovative multisource heat pump system. Outdoor air is a common heat source, although it does have widely documented limitations. Heat pump systems can utilise more efficient sources than air, including those of ground heat, solar heat, and heat recovery. The installed system within the school building incorporates these three sources. A multisource system aims to enhance the performance of the heat pump, both in terms of heating capacity and overall efficiency. The present work includes evaluation and analysis of data obtained through real time monitoring of the working system in operation, for a period of approximately two heating seasons. During this time, the behaviour of the system was assessed and the incorrect settings of the plant were identified and subsequently adjusted as required. The energy balance indicates that the integration of different sources not only increases the thermal performance of the system as a whole, but also optimizes the use of each source. Further savings can be obtained through correct adjustment of the set point of the indoor temperature. During the final stage of the study, the total energy consumption of the new building is calculated and compared to that of the former building that
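A monitoring study like this typically reduces the recorded data to per-source coefficients of performance (COP) and an overall seasonal performance factor. The sketch below illustrates that arithmetic with invented energy totals; the source names and all numbers are assumptions, not the paper's data.

```python
# Illustrative energy-balance sketch for a multisource heat pump
# (all figures are invented, not the monitored school-building data).
heat_delivered_kwh = {      # thermal energy delivered, by active source
    "air": 12_000.0,
    "ground": 18_000.0,
    "solar": 6_500.0,
    "heat_recovery": 4_000.0,
}
electricity_kwh = {         # compressor + auxiliary electricity, by source
    "air": 4_300.0,
    "ground": 4_800.0,
    "solar": 1_500.0,
    "heat_recovery": 900.0,
}

# Per-source COP and the overall seasonal performance factor (SPF):
cop_by_source = {s: heat_delivered_kwh[s] / electricity_kwh[s]
                 for s in heat_delivered_kwh}
spf = sum(heat_delivered_kwh.values()) / sum(electricity_kwh.values())
print(cop_by_source)
print(f"seasonal performance factor: {spf:.2f}")
```

Comparing the per-source COPs against the aggregate SPF is one way to see the abstract's point that integrating sources both raises overall thermal performance and shows how well each source is being used.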

  2. Individualized music played for agitated patients with dementia: analysis of video-recorded sessions.

    Science.gov (United States)

    Ragneskog, H; Asplund, K; Kihlgren, M; Norberg, A

    2001-06-01

    Many nursing home patients with dementia suffer from symptoms of agitation (e.g. anxiety, shouting, irritability). This study investigated whether individualized music could be used as a nursing intervention to reduce such symptoms in four patients with severe dementia. The patients were video-recorded during four sessions in four periods, including a control period without music, two periods where individualized music was played, and one period where classical music was played. The recordings were analysed by systematic observations and the Facial Action Coding System. Two patients became calmer during some of the individualized music sessions; one patient remained sitting in her armchair longer, and the other patient stopped shouting. For the two patients who were most affected by dementia, the noticeable effect of music was minimal. If the nursing staff succeed in discovering the music preferences of an individual, individualized music may be an effective nursing intervention to mitigate anxiety and agitation for some patients.

  3. Why georeferencing matters: Introducing a practical protocol to prepare species occurrence records for spatial analysis

    OpenAIRE

    Bloom, Trevor D. S.; Flower, Aquila; DeChaine, Eric G.

    2017-01-01

    Abstract Species Distribution Models (SDMs) are widely used to understand environmental controls on species’ ranges and to forecast species range shifts in response to climatic changes. The quality of input data is a crucial determinant of the model's accuracy. While museum records can be useful sources of presence data for many species, they do not always include accurate geographic coordinates. Therefore, actual locations must be verified through the process of georeferencing. We present a pr...

  4. Robust Mediation Analysis Based on Median Regression

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
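The core idea here, product-of-coefficients mediation with the two paths estimated by median (least-absolute-deviations) regression instead of ordinary least squares, can be sketched as follows. This is a generic illustration on simulated heavy-tailed data, not the authors' implementation; the iteratively reweighted least squares (IRLS) routine is just one simple way to approximate median regression.

```python
import numpy as np

def lad_fit(X, y, iters=50, eps=1e-6):
    """Least-absolute-deviations (median) regression via IRLS.
    X must include an intercept column. A simple approximation, not
    a production-quality quantile-regression solver."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        w = 1.0 / np.abs(y - X @ beta).clip(min=eps)   # reweight by 1/|residual|
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                        # predictor (e.g. treatment)
m = 0.5 * x + rng.standard_t(df=3, size=n)    # mediator, heavy-tailed errors
y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)

ones = np.ones(n)
a = lad_fit(np.column_stack([ones, x]), m)[1]        # X -> M path
b = lad_fit(np.column_stack([ones, x, m]), y)[2]     # M -> Y path, adjusting for X
indirect = a * b                                     # mediated (indirect) effect
print(f"a={a:.3f}  b={b:.3f}  indirect={indirect:.3f}")
```

Because both paths are fitted at the median, the product a*b stays stable under the heavy-tailed t(3) errors that would inflate the variance of an OLS-based estimate.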

  5. An analysis of the recording of tobacco use among inpatients in Irish hospitals.

    LENUS (Irish Health Repository)

    Sheridan, A

    2014-10-01

    Smoking is the largest avoidable cause of premature mortality in the world. Hospital admission is an opportunity to identify and help smokers quit. This study aimed to determine the level of recording of tobacco use (current and past) in Irish hospitals. Information on inpatient discharges with a tobacco use diagnosis was extracted from HIPE. In 2011, a quarter (n=84,679) of discharges had a recording of tobacco use, which was more common among males (29% (n=50,161) male v. 20% (n=30,162) female) and among medical patients (29% (n=54,375) medical v. 20% (n=30,162) other), and was highest among those aged 55-59 years (30.6%; n=7,885). SLAN 2007 reported that 48% of adults had smoked at some point in their lives. This study would suggest an under-reporting of tobacco use among hospital inpatients. Efforts should be made to record smoking status at hospital admission, and to improve the quality of the HIPE coding of tobacco use.

  6. Analysis of a Near-field Earthquake Record at the Deep Underground Research Tunnel

    International Nuclear Information System (INIS)

    Yun, Kwan Hee; Park, Dong Hee; Shim, Taek Mo

    2009-01-01

    On October 29, 2008, a moderate earthquake (M=3.4, 36.35 N 127.25 E) occurred near the city of Daejon, where an underground testing facility called 'KURT (KAERI Underground Research Tunnel)' is located inside KAERI. Even though this earthquake did not trigger the seismic monitoring system of the mock-up Nuclear Power Plant of Hanaro, it was large enough not only to provide nation-wide earthquake data of good quality but also to be widely, and uncomfortably, felt by people around Daejon. In addition, this earthquake provided a good chance to obtain a near-field broadband seismogram with frequencies of up to 200 Hz recorded by the three-component geophones in the deep underground tunnel of the KURT (-90 m). We therefore compared the seismic records from the KURT with other records from the nearby national seismic network to evaluate the earthquake ground-motion characteristics at the underground facilities for future engineering application. Three nearby seismic stations of the national seismic network jointly operated by Korea Meteorological Administration (KMA), Korea Institute of Geoscience And Mineral Resources (KIGAM), KEPRI, and KINS

  7. RECORDS REACHING RECORDING DATA TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    G. W. L. Gresik

    2013-07-01

    Full Text Available The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. This is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is guaranteed by a gyroscopically controlled device developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  8. The MSG-SEVIRI-based cloud property data record CLAAS-2

    Directory of Open Access Journals (Sweden)

    N. Benas

    2017-07-01

    Full Text Available Clouds play a central role in the Earth's atmosphere, and satellite observations are crucial for monitoring clouds and understanding their impact on the energy budget and water cycle. Within the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF), a new cloud property data record was derived from geostationary Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) measurements for the time frame 2004–2015. The resulting CLAAS-2 (CLoud property dAtAset using SEVIRI, Edition 2) data record is publicly available via the CM SAF website (https://doi.org/10.5676/EUM_SAF_CM/CLAAS/V002). In this paper we present an extensive evaluation of the CLAAS-2 cloud products, which include cloud fractional coverage, thermodynamic phase, cloud top properties, liquid/ice cloud water path and corresponding optical thickness and particle effective radius. Data validation and comparisons were performed on both level 2 (native SEVIRI grid and repeat cycle) and level 3 (daily and monthly averages and histograms) with reference datasets derived from lidar, microwave and passive imager measurements. The evaluation results show very good overall agreement, with matching spatial distributions and temporal variability and small biases attributed mainly to differences in sensor characteristics, retrieval approaches, spatial and temporal samplings and viewing geometries. No major discrepancies were found. Underpinned by the good evaluation results, CLAAS-2 demonstrates that it is fit for the envisaged applications, such as process studies of the diurnal cycle of clouds and the evaluation of regional climate models. The data record is planned to be extended and updated in the future.

  9. Validation of fragility fractures in primary care electronic medical records: A population-based study.

    Science.gov (United States)

    Martinez-Laguna, Daniel; Soria-Castro, Alberto; Carbonell-Abella, Cristina; Orozco-López, Pilar; Estrada-Laza, Pilar; Nogues, Xavier; Díez-Perez, Adolfo; Prieto-Alhambra, Daniel

    2017-11-28

    Electronic medical records databases use pre-specified lists of diagnostic codes to identify fractures. These codes, however, are not specific enough to disentangle traumatic from fragility-related fractures. We report on the proportion of fragility fractures identified in a random sample of coded fractures in SIDIAP. Patients≥50 years old with any fracture recorded in 2012 (as per pre-specified ICD-10 codes) and alive at the time of recruitment were eligible for this retrospective observational study in 6 primary care centres contributing to the SIDIAP database (www.sidiap.org). Those with previous fracture/s, non-responders, and those with dementia or a serious psychiatric disease were excluded. Data on fracture type (traumatic vs fragility), skeletal site, and basic patient characteristics were collected. Of 491/616 (79.7%) patients with a registered fracture in 2012 who were contacted, 331 (349 fractures) were included. The most common fractures were forearm (82), ribs (38), and humerus (32), and 225/349 (64.5%) were fragility fractures, with higher proportions for classic osteoporotic sites: hip, 91.7%; spine, 87.7%; and major fractures, 80.5%. This proportion was higher in women, the elderly, and patients with a previously coded diagnosis of osteoporosis. More than 4 in 5 major fractures recorded in SIDIAP are due to fragility (non-traumatic), with higher proportions for hip (92%) and vertebral (88%) fracture, and a lower proportion for fractures other than major ones. Our data support the validity of SIDIAP for the study of the epidemiology of osteoporotic fractures. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
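A proportion such as the 64.5% fragility figure reported above can be given a binomial confidence interval from the observed counts. The sketch below applies a Wilson score interval to the abstract's 225/349 figure; the interval itself is our illustration, not part of the study.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Counts from the abstract: 225 of 349 recorded fractures were fragility fractures.
k, n = 225, 349
low, high = wilson_ci(k, n)
print(f"{k/n:.1%} (95% CI {low:.1%}-{high:.1%})")
```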

  10. Textile Concentric Ring Electrodes for ECG Recording Based on Screen-Printing Technology.

    Science.gov (United States)

    Lidón-Roger, José Vicente; Prats-Boluda, Gema; Ye-Lin, Yiyao; Garcia-Casado, Javier; Garcia-Breijo, Eduardo

    2018-01-21

    Among many of the electrode designs used in electrocardiography (ECG), concentric ring electrodes (CREs) are one of the most promising due to their enhanced spatial resolution. Their development has undergone a great push due to their use in recent years; however, they are not yet widely used in clinical practice. CRE implementation in textiles will lead to a low cost, flexible, comfortable, and robust electrode capable of detecting high spatial resolution ECG signals. A textile CRE set has been designed and developed using screen-printing technology. This is a mature technology in the textile industry and, therefore, does not require heavy investments. Inks employed as conductive elements have been silver and a conducting polymer (poly (3,4-ethylenedioxythiophene) polystyrene sulfonate; PEDOT:PSS). Conducting polymers have biocompatibility advantages, they can be used with flexible substrates, and they are available for several printing technologies. CREs implemented with both inks have been compared by analyzing their electric features and their performance in detecting ECG signals. The results reveal that silver CREs present a higher average thickness and slightly lower skin-electrode impedance than PEDOT:PSS CREs. As for ECG recordings with subjects at rest, both CREs allowed the uptake of bipolar concentric ECG signals (BC-ECG) with signal-to-noise ratios similar to that of conventional ECG recordings. Regarding the saturation and alterations of ECGs captured with textile CREs caused by intentional subject movements, silver CREs presented a more stable response (fewer saturations and alterations) than those of PEDOT:PSS. Moreover, BC-ECG signals provided higher spatial resolution compared to conventional ECG. This improved spatial resolution was manifested in the identification of P1 and P2 waves of atrial activity in most of the BC-ECG signals. 
It can be concluded that textile silver CREs are more suitable than those of PEDOT:PSS for obtaining BC-ECG records.
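The signal-to-noise ratios used to compare electrodes above are conventionally reported as a power ratio in decibels. A minimal sketch on synthetic data, where the sampling rate, toy waveform, and noise level are all assumptions rather than the study's recordings:

```python
import numpy as np

# Illustrative SNR computation for an ECG-like recording. In practice the
# noise would be estimated, e.g. from an isoelectric segment; here the clean
# signal and noise are known because the data are synthetic.
fs = 500                                    # sampling rate, Hz (assumed)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
ecg = 1.0 * np.sin(2 * np.pi * 1.2 * t)     # toy surrogate for an ECG trace
noise = 0.05 * rng.normal(size=t.size)      # baseline/instrumentation noise
recorded = ecg + noise

# SNR in dB: ratio of signal power to residual-noise power.
snr_db = 10 * np.log10(np.mean(ecg**2) / np.mean((recorded - ecg)**2))
print(f"SNR = {snr_db:.1f} dB")
```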

  11. Evaluation of the metabolic rate based on the recording of the heart rate

    OpenAIRE

    MALCHAIRE, Jacques; ALFANO, Francesca Romana d'AMBROSIO; PALELLA, Boris Igor

    2017-01-01

    The assessment of harsh working conditions requires a correct evaluation of the metabolic rate. This paper revises the basis described in the ISO 8996 standard for the evaluation of the metabolic rate at a work station from the recording of the heart rate of a worker during a representative period of time. From a review of the literature, formulas different from those given in the standard are proposed to estimate the maximum working capacity, the maximum heart rate, the heart rate and the me...

  12. HL7 document patient record architecture: an XML document architecture based on a shared information model.

    Science.gov (United States)

    Dolin, R H; Alschuler, L; Behlen, F; Biron, P V; Boyer, S; Essin, D; Harding, L; Lincoln, T; Mattison, J E; Rishel, W; Sokolowski, R; Spinosa, J; Williams, J P

    1999-01-01

    The HL7 SGML/XML Special Interest Group is developing the HL7 Document Patient Record Architecture. This draft proposal strives to create a common data architecture for the interoperability of healthcare documents. Key components are that it is under the umbrella of HL7 standards, it is specified in Extensible Markup Language, the semantics are drawn from the HL7 Reference Information Model, and the document specifications form an architecture that, in aggregate, define the semantics and structural constraints necessary for the exchange of clinical documents. The proposal is a work in progress and has not yet been submitted to HL7's formal balloting process.

  13. Comparison of clinical knowledge bases for summarization of electronic health records.

    Science.gov (United States)

    McCoy, Allison B; Sittig, Dean F; Wright, Adam

    2013-01-01

    Automated summarization tools that create condition-specific displays may improve clinician efficiency. These tools require new kinds of knowledge that are difficult to obtain. We compared five problem-medication pair knowledge bases generated using four previously described knowledge base development approaches. The number of pairs in the resulting mapped knowledge bases varied widely due to differing mapping techniques from the source terminologies, ranging from 2,873 to 63,977,738 pairs. The number of overlapping pairs across knowledge bases was low, with one knowledge base having half of its pairs overlapping with another knowledge base, and most having less than a third overlapping. Further research is necessary to better evaluate the knowledge bases independently in additional settings, and to identify methods to integrate the knowledge bases.
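Overlap between knowledge bases of this kind can be quantified by treating each as a set of (problem, medication) pairs. A minimal sketch with invented pairs, not drawn from the study's source terminologies:

```python
# Two toy problem-medication knowledge bases as sets of pairs (invented examples).
kb_a = {("hypertension", "lisinopril"),
        ("diabetes", "metformin"),
        ("asthma", "albuterol")}
kb_b = {("hypertension", "lisinopril"),
        ("diabetes", "insulin"),
        ("asthma", "albuterol")}

shared = kb_a & kb_b
overlap_of_a = len(shared) / len(kb_a)     # fraction of A's pairs also in B
jaccard = len(shared) / len(kb_a | kb_b)   # symmetric overlap measure
print(f"shared={len(shared)}  overlap_of_a={overlap_of_a:.2f}  jaccard={jaccard:.2f}")
```

The asymmetric fraction matches the abstract's phrasing ("half of its pairs overlapping with another knowledge base"), while the Jaccard index gives a single symmetric summary per pair of knowledge bases.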

  14. Using a cloud-based electronic health record during disaster response: a case study in Fukushima, March 2011.

    Science.gov (United States)

    Nagata, Takashi; Halamka, John; Himeno, Shinkichi; Himeno, Akihiro; Kennochi, Hajime; Hashizume, Makoto

    2013-08-01

    Following the Great East Japan Earthquake on March 11, 2011, the Japan Medical Association deployed medical disaster teams to Shinchi-town (population: approximately 8,000), which is located 50 km north of the Fukushima Daiichi nuclear power plant. The mission of the medical disaster teams sent from Fukuoka, 1,400 km south of Fukushima, was to provide medical services and staff a temporary clinic for six weeks. Fear of radiation exposure restricted the use of large medical teams and local infrastructure. Therefore, small volunteer groups and a cloud-hosted, web-based electronic health record were implemented. The mission was successfully completed by the end of May 2011. Cloud-based electronic health records deployed using a "software as a service" model worked well during the response to the large-scale disaster.

  15. Action Potential Recording and Pro-arrhythmia Risk Analysis in Human Ventricular Trabeculae.

    Science.gov (United States)

    Qu, Yusheng; Page, Guy; Abi-Gerges, Najah; Miller, Paul E; Ghetti, Andre; Vargas, Hugo M

    2017-01-01

    negative predictive values were 0.88, 0.8, 0.88 and 0.8, respectively. Thus, the hVT AP-based model combined with the integrated analysis of pro-arrhythmic score can differentiate between torsadogenic and non-torsadogenic drugs, and has a greater predictive performance when compared to human SC-CM models.

  16. Allen's big-eared bat (Idionycteris phyllotis) documented in Colorado based on recordings of its distinctive echolocation call

    Science.gov (United States)

    Hayes, M.A.; Navo, K.W.; Bonewell, L.; Mosch, C.J.; Adams, Rick A.

    2009-01-01

    Allen's big-eared bat (Idionycteris phyllotis) inhabits much of the southwestern USA, but had not previously been documented in Colorado. We recorded echolocation calls consistent with I. phyllotis near La Sal Creek, Montrose County, Colorado. Based on characteristics of the echolocation calls and flight behavior, we conclude that the calls described here were emitted by I. phyllotis and that they represent the first documentation of this species in Colorado.

  17. Preliminary analysis of strong-motion recordings from the 28 September 2004 Parkfield, California earthquake

    Science.gov (United States)

    Shakal, A.; Graizer, V.; Huang, M.; Borcherdt, R.; Haddadi, H.; Lin, K.-W.; Stephens, C.; Roffers, P.

    2005-01-01

    The Parkfield 2004 earthquake yielded the most extensive set of strong-motion data in the near-source region of a magnitude 6 earthquake yet obtained. The recordings of acceleration and volumetric strain provide an unprecedented document of the near-source seismic radiation for a moderate earthquake. The spatial density of the measurements along the fault zone and in the linear arrays perpendicular to the fault is expected to provide an exceptional opportunity to develop improved models of the rupture process. The closely spaced measurements should help infer the temporal and spatial distribution of the rupture process at much higher resolution than previously possible. Preliminary analyses of the peak acceleration data presented herein show that the motions vary significantly along the rupture zone, from 0.13 g to more than 2.5 g, with a map of the values showing that the larger values are concentrated in three areas. Particle motions at the near-fault stations are consistent with bilateral rupture. Fault-normal pulses similar to those observed in recent strike-slip earthquakes are apparent at several of the stations. The attenuation of peak ground acceleration with distance is more rapid than that indicated by some standard relationships but adequately fits others. Evidence for directivity in the peak acceleration data is not strong. Several stations very near, or over, the rupturing fault recorded relatively low accelerations. These recordings may provide a quantitative basis to understand observations of low near-fault shaking damage that has been reported in other large strike-slip earthquakes.
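Attenuation relationships of the kind compared above are often fitted in log-log form, ln(PGA) = a + b ln(R). A minimal sketch with synthetic distance/acceleration values, not the Parkfield recordings:

```python
import numpy as np

# Fit a simple attenuation relationship ln(PGA) = a + b * ln(R) by least
# squares. The distances and peak accelerations below are invented.
r_km = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])      # fault distance, km
pga_g = np.array([1.8, 1.1, 0.45, 0.22, 0.09, 0.04])    # peak acceleration, g

b, a = np.polyfit(np.log(r_km), np.log(pga_g), 1)       # slope, intercept
print(f"ln(PGA) = {a:.2f} + {b:.2f} ln(R)")             # b < 0: decay with distance
```

The fitted slope b is the decay rate; a steeper (more negative) b than a published relationship's would correspond to the "more rapid attenuation" the abstract describes.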

  18. Appropriate threshold levels of cardiac beat-to-beat variation in semi-automatic analysis of equine ECG recordings

    DEFF Research Database (Denmark)

    Madsen, Mette Flethøj; Kanters, Jørgen K.; Pedersen, Philip Juul

    2016-01-01

    Background: Although premature beats are a matter of concern in horses, the interpretation of equine ECG recordings is complicated by a lack of standardized analysis criteria and a limited knowledge of the normal beat-to-beat variation of equine cardiac rhythm. The purpose of this study was to determine the appropriate threshold levels of maximum acceptable deviation of RR intervals in equine ECG analysis, and to evaluate a novel two-step timing algorithm by quantifying the frequency of arrhythmias in a cohort of healthy adult endurance horses. Results: Beat-to-beat variation differed … (range 1–24). Conclusions: Beat-to-beat variation of equine cardiac rhythm varies according to HR, and threshold levels in equine ECG analysis should be adjusted accordingly. Standardization of the analysis criteria will enable comparisons of studies and follow-up examinations of patients. A small number …
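A simple version of the threshold idea, flagging beats whose RR interval deviates from the preceding interval by more than a maximum acceptable fraction, can be sketched as below. The 20% threshold and the RR series are assumptions for illustration, not the study's validated, heart-rate-adjusted values.

```python
import numpy as np

# Flag a beat as arrhythmic when its RR interval deviates from the preceding
# interval by more than a maximum acceptable fraction. Note that both the
# shortened interval of a premature beat and the longer interval that follows
# it typically exceed the threshold.
def flag_premature(rr_ms, max_deviation=0.20):
    rr = np.asarray(rr_ms, dtype=float)
    deviation = np.abs(np.diff(rr)) / rr[:-1]
    return np.where(deviation > max_deviation)[0] + 1   # indices of flagged beats

rr = [1500, 1520, 1490, 900, 1510, 1505]   # ms; beat 3 (900 ms) is premature
print(flag_premature(rr))
```

The study's point that thresholds must vary with heart rate could be accommodated by making max_deviation a function of the local mean RR interval rather than a constant.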

  19. Population and patient factors affecting emergency department attendance in London: retrospective cohort analysis of linked primary and secondary care records.

    Science.gov (United States)

    Hull, Sally A; Homer, Kate; Boomla, Kambiz; Robson, John; Ashworth, Mark

    2018-03-01

    Population factors, including social deprivation and morbidity, predict the use of emergency departments (EDs). To link patient-level primary and secondary care data to determine whether the association between deprivation and ED attendance is explained by multimorbidity and other clinical factors in the GP record. Retrospective cohort study based in East London. Primary care demographic, consultation, diagnostic, and clinical data were linked with ED attendance data. GP Patient Survey (GPPS) access questions were linked to practices. Adjusted multilevel analysis for adults showed a progressive rise in ED attendance with increasing numbers of long-term conditions (LTCs). Comparing two LTCs with no conditions, the odds ratio (OR) is 1.28 (95% confidence interval [CI] = 1.25 to 1.31); comparing four or more conditions with no conditions, the OR is 2.55 (95% CI = 2.44 to 2.66). Increasing annual GP consultations predicted ED attendance: comparing zero with more than two consultations, the OR is 2.44 (95% CI = 2.40 to 2.48). Smoking (OR 1.30, 95% CI = 1.28 to 1.32), being housebound (OR 2.01, 95% CI = 1.86 to 2.18), and age also predicted attendance. Patient-reported access scores from the GPPS were not a significant predictor. For children, younger age, male sex, white ethnicity, and higher GP consultation rates predicted attendance. Using patient-level data rather than practice-level data, the authors demonstrate that the burden of multimorbidity is the strongest clinical predictor of ED attendance, which is independently associated with social deprivation. Low use of the GP surgery is associated with low attendance at ED. Unlike other studies, the authors found that adult patient experience of GP access, reported at practice level, did not predict use. © British Journal of General Practice 2018.
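Odds ratios like those reported above come from (multilevel) logistic regression. As a simplified, single-level illustration, the sketch below fits a logistic model by Newton-Raphson to synthetic data whose true effects were chosen to echo the abstract's ORs (about 1.28 per extra long-term condition and 1.30 for smoking); everything in it is invented for illustration.

```python
import numpy as np

def logit_fit(X, y, iters=25):
    """Plain logistic regression via Newton-Raphson (no multilevel terms)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        w = p * (1 - p)                                   # IRLS weights
        beta += np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (y - p))
    return beta

rng = np.random.default_rng(2)
n = 5000
n_ltc = rng.integers(0, 5, size=n)             # number of long-term conditions
smoker = rng.integers(0, 2, size=n)
lin = -2.0 + 0.25 * n_ltc + 0.26 * smoker      # true log-odds of ED attendance
attended = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)

X = np.column_stack([np.ones(n), n_ltc, smoker])
beta = logit_fit(X, attended)
odds_ratios = np.exp(beta[1:])                 # OR = exp(coefficient)
print(f"OR per extra LTC: {odds_ratios[0]:.2f}, OR smoking: {odds_ratios[1]:.2f}")
```

A multilevel version, as used in the study, would additionally model patients nested within practices; the coefficient-to-OR transformation is the same.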

  20. Evaluation of organizational maturity based on people capability maturity model in medical record wards of Iranian hospitals.

    Science.gov (United States)

    Yarmohammadian, Mohammad H; Tavakoli, Nahid; Shams, Assadollah; Hatampour, Farzaneh

    2014-01-01

    People capability maturity model (PCMM) is one of the models which focus on improving organizational human capabilities. The aim of applying this model is to increase people's ability to attract, develop, motivate, organize and retain the talents needed for continuous organizational improvement. In this study, we used the PCMM to investigate the organizational maturity level in the medical record departments of governmental hospitals and to determine the strengths and weaknesses of their staff capabilities. This is an applied, cross-sectional study in which data were collected by questionnaires administered to medical record staff of governmental hospitals in Isfahan, Iran. We used a questionnaire extracted from the PCMM model and confirmed its reliability with a Cronbach's alpha of 0.96. Data collected by the questionnaire were analyzed according to the research objectives using SPSS software, and descriptive statistics were used in accordance with the research questions. Our findings showed that the mean score of medical record practitioners' skill and capability in governmental hospitals was 35 (62.5%) out of a maximum of 56 (100%). There was no significant relationship between organizational maturity and medical record practitioners' attributes. Applying the PCMM model increased staff and manager attention to identifying weaknesses in current activities and practices, which should result in process improvement and development.

  1. Possible Cretaceous Arctic terrestrial ecosystem dynamics based on a rich dinosaur record from Alaska

    Science.gov (United States)

    Fiorillo, A. R.; McCarthy, P. J.; Flaig, P. P.

    2010-12-01

    The widespread occurrence of large-bodied herbivores, specifically hadrosaurian and ceratopsian dinosaurs, in the Cretaceous of Alaska presents a proxy for understanding polar terrestrial ecosystem biological productivity in a warm Arctic world. These dinosaurs lived in Alaska at a time when this region was at or near current latitudes. Thus these dinosaurs present a paradox. The warmer Cretaceous high-latitude climate, likely related to higher levels of CO2, may have increased plant productivity, but the polar light regime fluctuations must have limited the available food during the winter months. The most detailed sedimentological data available regarding the paleoenvironments supporting these dinosaurs are from the Prince Creek Formation of northern Alaska and, to a lesser extent, the Cantwell Formation of the Alaska Range. The sediments of the Late Cretaceous Prince Creek Formation represent a continental succession deposited on a high-latitude, low-gradient, alluvial/coastal plain. The Prince Creek Formation records numerous paleosols that are consistent with seasonality and successional vegetative cover. Drab colors in fine-grained sediments, abundant carbonaceous plant material, and common siderite nodules and jarosite suggest widespread reducing conditions on poorly-drained floodplains influenced in more distal areas by marine waters. In addition, these rocks contain high levels of organic carbon and charcoal. Carbonaceous root-traces found ubiquitously within all distributary channels and most floodplain facies, along with common Fe-oxide mottles, indicate that the alluvial system likely experienced flashy, seasonal, or ephemeral flow and a fluctuating water table. The flashy nature of the alluvial system may have been driven by recurring episodes of vigorous seasonal snowmelt in the Brooks Range orogenic belt as a consequence of the high paleolatitude of northern Alaska in the Late Cretaceous. 
The presence of dinosaurian megaherbivores suggests that water was

  2. Textile Concentric Ring Electrodes for ECG Recording Based on Screen-Printing Technology

    Directory of Open Access Journals (Sweden)

    José Vicente Lidón-Roger

    2018-01-01

Among the many electrode designs used in electrocardiography (ECG), concentric ring electrodes (CREs) are one of the most promising due to their enhanced spatial resolution. Interest in them has grown considerably in recent years; however, they are not yet widely used in clinical practice. CRE implementation in textiles will lead to a low-cost, flexible, comfortable, and robust electrode capable of detecting high spatial resolution ECG signals. A textile CRE set has been designed and developed using screen-printing technology. This is a mature technology in the textile industry and, therefore, does not require heavy investments. The inks employed as conductive elements were silver and a conducting polymer (poly(3,4-ethylenedioxythiophene) polystyrene sulfonate; PEDOT:PSS). Conducting polymers have biocompatibility advantages, they can be used with flexible substrates, and they are available for several printing technologies. CREs implemented with both inks have been compared by analyzing their electric features and their performance in detecting ECG signals. The results reveal that silver CREs present a higher average thickness and slightly lower skin-electrode impedance than PEDOT:PSS CREs. As for ECG recordings with subjects at rest, both CREs allowed the uptake of bipolar concentric ECG signals (BC-ECG) with signal-to-noise ratios similar to that of conventional ECG recordings. Regarding the saturation and alterations of ECGs captured with textile CREs caused by intentional subject movements, silver CREs presented a more stable response (fewer saturations and alterations) than PEDOT:PSS. Moreover, BC-ECG signals provided higher spatial resolution compared to conventional ECG. This improved spatial resolution was manifested in the identification of P1 and P2 waves of atrial activity in most of the BC-ECG signals.
It can be concluded that textile silver CREs are more suitable than those of PEDOT:PSS for obtaining

  3. Cluster randomized trials utilizing primary care electronic health records: methodological issues in design, conduct, and analysis (eCRT Study).

    Science.gov (United States)

    Gulliford, Martin C; van Staa, Tjeerd P; McDermott, Lisa; McCann, Gerard; Charlton, Judith; Dregan, Alex

    2014-06-11

There is growing interest in conducting clinical and cluster randomized trials through electronic health records. This paper reports on the methodological issues identified during the implementation of two cluster randomized trials using the electronic health records of the Clinical Practice Research Datalink (CPRD). Two trials were completed in primary care: one aimed to reduce inappropriate antibiotic prescribing for acute respiratory infection; the other aimed to increase physician adherence with secondary prevention interventions after first stroke. The paper draws on documentary records and trial datasets to report on the methodological experience with respect to research ethics and research governance approval, general practice recruitment and allocation, sample size calculation and power, intervention implementation, and trial analysis. We obtained research governance approvals from more than 150 primary care organizations in England, Wales, and Scotland. There were 104 CPRD general practices recruited to the antibiotic trial and 106 to the stroke trial, with the target number of practices being recruited within six months. Interventions were installed into practice information systems remotely over the internet. The mean number of participants per practice was 5,588 in the antibiotic trial and 110 in the stroke trial, with the coefficient of variation of practice sizes being 0.53 and 0.56 respectively. Outcome measures showed substantial correlations between the 12 months before and the 12 months after the intervention, with coefficients ranging from 0.42 for diastolic blood pressure to 0.91 for the proportion of consultations with antibiotics prescribed. Defining practice and participant eligibility for analysis requires careful consideration. Cluster randomized trials may be performed efficiently in large samples from UK general practices using the electronic health records of a primary care database. The geographical dispersal of trial sites presents a difficulty for
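The coefficient of variation of practice sizes reported above matters because unequal cluster sizes inflate the sample-size requirement of a cluster randomized trial. A minimal sketch of the standard design-effect calculation, using the stroke trial's mean practice size (110) and cv (0.56) but an assumed, purely illustrative intracluster correlation of 0.01:

```python
# Design effect for a cluster randomized trial with variable cluster sizes.
# Assumes the common formula DE = 1 + ((cv^2 + 1) * m_bar - 1) * icc, which
# reduces to the familiar 1 + (m_bar - 1) * icc when all clusters are equal
# (cv = coefficient of variation of cluster size, icc = intracluster corr.).

def design_effect(mean_cluster_size, icc, cv=0.0):
    return 1 + ((cv**2 + 1) * mean_cluster_size - 1) * icc

# Equal clusters of 110 participants, assumed ICC = 0.01:
de_equal = design_effect(110, 0.01)
# Same mean size, but cv = 0.56 as reported for the stroke trial:
de_varied = design_effect(110, 0.01, cv=0.56)

print(round(de_equal, 2), round(de_varied, 2))
```

Under these assumptions the design effect rises from about 2.09 to about 2.43, i.e. size variation between practices costs additional participants over and above clustering itself.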

  4. Case Mix Management Systems: An Opportunity to Integrate Medical Records and Financial Management System Data Bases

    Science.gov (United States)

    Rusnak, James E.

    1987-01-01

Due to previous systems selections, many hospitals (health care facilities) are faced with the problem of fragmented data bases containing clinical, demographic and financial information. Projects to select and implement a Case Mix Management System (CMMS) provide an opportunity to reduce the number of separate physical files and to migrate towards systems with an integrated data base. The number of CMMS candidate systems is often restricted due to data base and system interface issues. The hospital must ensure the CMMS project provides a means to implement an integrated on-line hospital information data base for use by departments in operating under a DRG-based Prospective Payment System. This paper presents guidelines for use in selecting a Case Mix Management System to meet the hospital's financial and operations planning, budgeting, marketing, and other management needs, while considering the data base implications of the implementation.

  5. Multi-modal causality analysis of eyes-open and eyes-closed data from simultaneously recorded EEG and MEG.

    Science.gov (United States)

    Anwar, Abdul Rauf; Mideska, Kidist Gebremariam; Hellriegel, Helge; Hoogenboom, Nienke; Krause, Holger; Schnitzler, Alfons; Deuschl, Günther; Raethjen, Jan; Heute, Ulrich; Muthuraman, Muthuraman

    2014-01-01

Owing to the recent advances in multi-modal data analysis, the aim of the present study was to analyze the functional network of the brain which remained the same during the eyes-open (EO) and eyes-closed (EC) resting task. The simultaneously recorded electroencephalogram (EEG) and magnetoencephalogram (MEG) were used for this study, recorded from five distinct cortical regions of the brain. We focused on the 'alpha' functional network, corresponding to the individual peak frequency in the alpha band. The total data set of 120 seconds was divided into three segments of 18 seconds each, taken from the start, middle, and end of the recording. This segmentation allowed us to analyze the evolution of the underlying functional network. The method of time-resolved partial directed coherence (tPDC) was used to assess the causality. This method allowed us to focus on the individual peak frequency in the 'alpha' band (7-13 Hz). Because of the significantly higher power in the recorded EEG in comparison to MEG, at the individual peak frequency of the alpha band, results rely only on EEG. The MEG was used only for comparison. Our results show that different regions of the brain start to 'disconnect' from one another over the course of time. The driving signals, along with the feedback signals between different cortical regions, start to recede over time. This shows that, over the course of rest, brain regions reduce communication with one another.

  6. A comparison between flexible electrogoniometers, inclinometers and three-dimensional video analysis system for recording neck movement.

    Science.gov (United States)

    Carnaz, Letícia; Moriguchi, Cristiane S; de Oliveira, Ana Beatriz; Santiago, Paulo R P; Caurin, Glauco A P; Hansson, Gert-Åke; Coury, Helenice J C Gil

    2013-11-01

This study compared neck range of movement recordings made with three different methods: flexible electrogoniometers (EGM), inclinometers (INC) and a three-dimensional video analysis system (IMG), in simultaneous and synchronized data collection. Twelve females performed neck flexion-extension, lateral flexion, rotation and circumduction. The differences between EGM, INC, and IMG were calculated sample by sample. For flexion-extension movement, IMG underestimated the amplitude by 13%; moreover, EGM showed a crosstalk of about 20% for the lateral flexion and rotation axes. In lateral flexion movement, all systems showed similar amplitude and the inter-system differences were moderate (4-7%). For rotation movement, EGM showed a high crosstalk (13%) for the flexion-extension axis. During the circumduction movement, IMG underestimated the amplitude of flexion-extension movements by about 11%, and the inter-system differences were high (about 17%) except for INC-IMG regarding lateral flexion (7%) and EGM-INC regarding flexion-extension (10%). For application in the workplace, INC presents good results compared to IMG and EGM, though INC cannot record rotation. EGM should be improved in order to reduce its crosstalk errors and allow recording of the full neck range of movement. Due to non-optimal positioning of the cameras for recording flexion-extension, IMG underestimated the amplitude of these movements. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.

  7. Installation Restoration Program Records Search for Richards-Gebaur Air Force Base, Missouri.

    Science.gov (United States)

    1983-03-01

City Aviation Department. These operations have been primarily involved in the routine maintenance of assigned aircraft and associated ground support...transferred to base supply (Talley Services, Inc.) for disposal off-base through contract. C. Kansas City Aviation Department (KCAD) i. Vehicle...Fixed-Base Operation (FBO) Talley Services operates the FBO for light aircraft out of Building 821. Small quantities (less than 60 gallons a year

  8. Case Mix Management Systems: An Opportunity to Integrate Medical Records and Financial Management System Data Bases

    OpenAIRE

    Rusnak, James E.

    1987-01-01

    Due to previous systems selections, many hospitals (health care facilities) are faced with the problem of fragmented data bases containing clinical, demographic and financial information. Projects to select and implement a Case Mix Management System (CMMS) provide an opportunity to reduce the number of separate physical files and to migrate towards systems with an integrated data base. The number of CMMS candidate systems is often restricted due to data base and system interface issues. The h...

  9. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Science.gov (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick

    2016-01-01

This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single-machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query, and the indexing time increases with the size of the datasets. Query response times for the clusters with 2, 4, 8 and 12 nodes were no better than for the single-node cluster, but indexing time decreased in proportion to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time.
Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use

  10. Quality improvement in documentation of postoperative care nursing using computer-based medical records

    DEFF Research Database (Denmark)

    Olsen, Susanne Winther

    2013-01-01

Postanesthesia nursing should be documented with high quality. The purpose of this retrospective case-based study on 49 patients was to analyze the quality of postoperative documentation in the two existing templates and, based on this audit, to suggest a new template for documentation. The audit...... be converted to explicit documentation. Furthermore, the quality of documentation was improved.

  11. Evaluation of the metabolic rate based on the recording of the heart rate.

    Science.gov (United States)

Malchaire, Jacques; d'Ambrosio Alfano, Francesca Romana; Palella, Boris Igor

    2017-06-08

    The assessment of harsh working conditions requires a correct evaluation of the metabolic rate. This paper revises the basis described in the ISO 8996 standard for the evaluation of the metabolic rate at a work station from the recording of the heart rate of a worker during a representative period of time. From a review of the literature, formulas different from those given in the standard are proposed to estimate the maximum working capacity, the maximum heart rate, the heart rate and the metabolic rate at rest and the relation (HR vs. M) at the basis of the estimation of the equivalent metabolic rate, as a function of the age, height and weight of the person. A Monte Carlo simulation is used to determine, from the approximations of these parameters and formulas, the imprecision of the estimated equivalent metabolic rate. The results show that the standard deviation of this estimate varies from 10 to 15%.
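A Monte Carlo propagation of input uncertainty like the one described can be sketched as follows. The linear interpolation between resting and maximal values mirrors the general HR-M approach, but every numeric parameter and noise level below is an illustrative assumption, not a value from ISO 8996 or from the paper:

```python
import random

# Monte Carlo sketch: how uncertainty in the resting/maximal heart rate and
# metabolic rate propagates into the HR-based metabolic rate estimate.
# All means and standard deviations below are invented for illustration.

def metabolic_rate(hr, hr_rest, hr_max, m_rest, m_max):
    # Linear HR-M relation between rest and maximum working capacity
    return m_rest + (hr - hr_rest) * (m_max - m_rest) / (hr_max - hr_rest)

def simulate(n=20000, seed=1):
    rng = random.Random(seed)
    estimates = []
    for _ in range(n):
        hr_rest = rng.gauss(70, 5)    # bpm, assumed uncertainty
        hr_max = rng.gauss(180, 8)    # bpm, e.g. from an age-based formula
        m_rest = rng.gauss(55, 5)     # W/m^2 at rest
        m_max = rng.gauss(400, 40)    # W/m^2, maximum working capacity
        estimates.append(metabolic_rate(130, hr_rest, hr_max, m_rest, m_max))
    mean = sum(estimates) / n
    sd = (sum((x - mean) ** 2 for x in estimates) / n) ** 0.5
    return mean, sd / mean            # mean and relative SD of the estimate

mean, rel_sd = simulate()
print(round(mean), round(100 * rel_sd, 1))  # relative SD in percent
```

With these assumed input uncertainties the relative standard deviation of the estimate lands in the 10-15% range reported by the paper, which is the kind of result such a simulation is meant to quantify.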

  12. Architecture and implementation for a system enabling smartphones to access smart card based healthcare records.

    Science.gov (United States)

    Karampelas, Vasilios; Pallikarakis, Nicholas; Mantas, John

    2013-01-01

The interest of healthcare researchers, academics, and practitioners in the development of Healthcare Information Systems has been on a steady rise for the last decades. Fueling this steady rise has been healthcare professionals' need for quality information in every healthcare provision incident, whenever and wherever it may take place. In order to address this need, a truly mobile health care system is required, one that will be able to provide a healthcare provider with accurate patient-related information regardless of the time and place that healthcare is provided. In order to fulfill this role, the present study proposes the architecture for a Healthcare Smartcard system, which provides authenticated healthcare professionals with remote mobile access to a Patient's Healthcare Record through their Smartphone. Furthermore, the research proceeds to develop a working prototype system.

  13. Three-Dimensional Visual Patient Based on Electronic Medical Diagnostic Records.

    Science.gov (United States)

    Shi, Liehang; Sun, Jianyong; Yang, Yuanyuan; Ling, Tonghui; Wang, Mingqing; Gu, Yiping; Yang, Zhiming; Hua, Yanqing; Zhang, Jianguo

    2018-01-01

An innovative concept and method is introduced to use a 3-D anatomical graphic pattern called the visual patient (VP) to visually index, represent, and render the medical diagnostic records (MDRs) of a patient, so that a doctor can quickly learn the current and historical medical status of the patient by manipulating the VP. The MDRs can be imaging diagnostic reports and DICOM images, laboratory reports and clinical summaries which can have clinical information relating to the medical status of human organs or body parts. The concept and method included three steps. First, a VP data model called the visual index object (VIO) and a VP graphic model called the visual anatomic object (VAO) were introduced. Second, a series of processing methods of parsing and extracting key information from MDRs were used to fill the attributes of the VIO model of a patient. Third, a VP system (VPS) was designed to map VIO to VAO, to create a VP instance for each patient. A prototype VPS has been implemented in a simulated hospital PACS/RIS integrated environment. Two evaluation results showed that more than 70% of participating radiologists would like to use the VPS in their radiological imaging tasks, and the efficiency of using VPS to review the tested patients' MDRs was 2.24 times higher than that of using PACS/RIS, while the average accuracy by using PACS/RIS was better than that by using VPS; however, this difference was only about 4%. The developed VPS can show the medical status of patient organs/sub-organs with a 3-D anatomical graphic pattern and will be welcomed by radiologists, with better efficiency in reviewing the patients' MDRs and with acceptable accuracy. The VP introduces a new way for medical professionals to access and interact with a huge amount of patient records with better efficiency in the big data era.

  14. Constructing a population-based research database from routine maternal screening records: a resource for studying alloimmunization in pregnant women.

    Directory of Open Access Journals (Sweden)

    Brian K Lee

BACKGROUND: Although screening for maternal red blood cell antibodies during pregnancy is a standard procedure, the prevalence and clinical consequences of non-anti-D immunization are poorly understood. The objective was to create a national database of maternal antibody screening results that can be linked with population health registers to create a research resource for investigating these issues. STUDY DESIGN AND METHODS: Each birth in the Swedish Medical Birth Register was uniquely identified and linked to the text stored in routine maternal antibody screening records in the time window from 9 months prior to 2 weeks after the delivery date. These text records were subjected to a computerized search for specific antibodies using regular expressions. To illustrate the research potential of the resulting database, selected antibody prevalence rates are presented as tables and figures, and the complete data (from more than 60 specific antibodies) presented as online moving graphical displays. RESULTS: More than one million (1,191,761) births with valid screening information from 1982-2002 constitute the study population. Computerized coverage of screening increased steadily over time and varied by region as electronic records were adopted. To ensure data quality, we restricted analysis to birth records in areas and years with a sustained coverage of at least 80%, representing 920,903 births from 572,626 mothers in 17 of the 24 counties in Sweden. During the study period, non-anti-D and anti-D antibodies occurred in 76.8/10,000 and 14.1/10,000 pregnancies respectively, with marked differences between specific antibodies over time. CONCLUSION: This work demonstrates the feasibility of creating a nationally representative research database from the routine maternal antibody screening records from an extended calendar period.
By linkage with population registers of maternal and child health, such data are a valuable resource for addressing important
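A computerized regular-expression search over free-text screening records, as described above, can be sketched as follows. The patterns and the example record text are invented for illustration; the Swedish register's actual antibody nomenclature and record formats are not reproduced here:

```python
import re

# Sketch of extracting antibody specificities from free-text screening
# records with regular expressions. Patterns are hypothetical examples;
# note that case can be clinically meaningful (anti-c vs anti-C), so the
# IGNORECASE flag is applied selectively.

ANTIBODY_PATTERNS = {
    "anti-D": re.compile(r"\banti[- ]?D\b", re.IGNORECASE),
    "anti-K": re.compile(r"\banti[- ]?K(ell)?\b", re.IGNORECASE),
    "anti-c": re.compile(r"\banti[- ]c\b"),  # deliberately case-sensitive
}

def find_antibodies(record_text):
    # Return the sorted list of specificities mentioned in one record
    return sorted(name for name, pat in ANTIBODY_PATTERNS.items()
                  if pat.search(record_text))

print(find_antibodies("Screening positive: anti-D and anti-Kell identified"))
# → ['anti-D', 'anti-K']
```

Running such a dictionary of compiled patterns over each linked record, and storing the matches per birth, is the essence of turning narrative screening text into an analyzable database table.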

  15. Analysis of a health team's records and nurses' perceptions concerning signs and symptoms of delirium.

    Science.gov (United States)

    Silva, Rosa Carla Gomes da; Silva, Abel Avelino de Paiva E; Marques, Paulo Alexandre Oliveira

    2011-01-01

This study investigates the extent of under-diagnosis of acute confusion/delirium by analyzing the records of a health team and the perception of nurses concerning this phenomenon. This quantitative study was developed in a central university hospital in Portugal and used the documentary and interview techniques. The sample, obtained through application of the NEECHAM Confusion Scale, was composed of 111 patients with the diagnosis of acute confusion/delirium hospitalized in the medical and surgical acute care units. A rate of 12.6% of under-diagnosis was identified in the records and a rate of 30.6% was found taking into account the perception of nurses. No indicators of acute confusion/delirium were found in 8.1% of the 111 cases and only 4.5% of the patients were diagnosed with acute confusion/delirium. The results indicate there is difficulty in identifying acute confusion/delirium, with implications for the quality of care, suggesting the need to implement training measures directed to health teams.

  16. Vertical Microbial Community Variability of Carbonate-based Cones may Provide Insight into Formation in the Rock Record

    Science.gov (United States)

    Trivedi, C.; Bojanowski, C.; Daille, L. K.; Bradley, J.; Johnson, H.; Stamps, B. W.; Stevenson, B. S.; Berelson, W.; Corsetti, F. A.; Spear, J. R.

    2015-12-01

Stromatolite morphogenesis is poorly understood, and the process by which microbial mats become mineralized is a primary question in microbialite formation. Ancient conical stromatolites are primarily carbonate-based, whereas the few modern analogues in hot springs are either non-mineralized or mineralized by silica. A team from the 2015 International GeoBiology Course investigated carbonate-rich microbial cones from near Little Hot Creek (LHC), Long Valley Caldera, California, to determine how conical stromatolites might form in a hot spring carbonate system. The cones are up to 3 cm tall and are found in a calm, ~45 °C pool near LHC that is four times supersaturated with respect to CaCO3. The cones rise from a flat, layered microbial mat at the edge of the pool. Scanning electron microscopy revealed filamentous bacteria associated with calcite crystals within the cone tips. Preliminary 16S rRNA gene analysis indicated variability of community composition between different vertical levels of the cone. The cone tip had comparatively greater abundance of filamentous cyanobacteria (Leptolyngbya and Phormidium) and fewer heterotrophs (e.g. Chloroflexi) compared to the cone bottom. This supports the hypothesis that cone formation may depend on the differential abundance of the microbial community and their potential functional roles. Metagenomic analyses of the cones revealed potential genes related to chemotaxis and motility. Specifically, a genomic bin identified as a member of the genus Isosphaera contained an hmp chemotaxis operon implicated in gliding motility in the cyanobacterium Nostoc punctiforme [1]. Isosphaera is a Planctomycete shown to have phototactic capabilities [2], and may play a role in conjunction with cyanobacteria in the vertical formation of the cones.
This analysis of actively growing cones indicates a complex interplay of geochemistry and microbiology that form structures which can serve as models for processes that occurred in the past and are

  17. Electronic health record-based patient identification and individualized mailed outreach for primary cardiovascular disease prevention: a cluster randomized trial.

    Science.gov (United States)

    Persell, Stephen D; Lloyd-Jones, Donald M; Friesema, Elisha M; Cooper, Andrew J; Baker, David W

    2013-04-01

Many individuals at higher risk for cardiovascular disease (CVD) do not receive recommended treatments. Prior interventions using personalized risk information to promote prevention did not test clinic-wide effectiveness. To perform a 9-month cluster-randomized trial, comparing a strategy of electronic health record-based identification of patients with increased CVD risk and individualized mailed outreach to usual care. Patients of participating physicians with a Framingham Risk Score of at least 5%, low-density lipoprotein (LDL)-cholesterol level above the guideline threshold for drug treatment, and not prescribed a lipid-lowering medication were included in the intention-to-treat analysis. Patients of physicians randomized to the intervention group were mailed individualized CVD risk messages that described benefits of using a statin (and controlling hypertension or quitting smoking when relevant). The primary outcome was occurrence of a LDL-cholesterol level, repeated in routine practice, that was at least 30 mg/dl lower than prior. A secondary outcome was lipid-lowering drug prescribing. Clinicaltrials.gov identifier: NCT01286311. Fourteen physicians with 218 patients were randomized to intervention, and 15 physicians with 217 patients to control. The mean patient age was 60.7 years and 77% were male. There was no difference in the primary outcome (11.0% vs. 11.1%, OR 0.99, 95% CI 0.56-1.74, P = 0.96), but intervention group patients were twice as likely to receive a prescription for lipid-lowering medication (11.9% vs. 6.0%, OR 2.13, 95% CI 1.05-4.32, P = 0.038). In post hoc analysis with extended follow-up to 18 months, the primary outcome occurred more often in the intervention group (22.5% vs. 16.1%, OR 1.59, 95% CI 1.05-2.41, P = 0.029). In this effectiveness trial, individualized mailed CVD risk messages increased the frequency of new lipid-lowering drug prescriptions, but we observed no difference in proportions lowering LDL
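The reported prescribing odds ratio can be approximately reproduced from the published percentages and group sizes. The counts below are back-calculated from rounded percentages, and the Wald confidence interval shown is a simple unadjusted sketch; the trial's own interval additionally accounts for clustering by physician:

```python
import math

# Reconstructing the lipid-lowering prescribing odds ratio from the
# reported rates: 11.9% of 218 intervention vs 6.0% of 217 control
# patients. Counts are back-calculated from rounded percentages, so the
# result is approximate.

def odds_ratio(a, b, c, d):
    # a/b: events/non-events in intervention; c/d: events/non-events in control
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

a = round(0.119 * 218)   # ~26 intervention patients prescribed a statin
c = round(0.060 * 217)   # ~13 control patients prescribed a statin
or_, lo, hi = odds_ratio(a, 218 - a, c, 217 - c)
print(round(or_, 2))     # close to the reported OR of 2.13
```

The small residual gap versus the published 2.13 comes from rounding in the percentages and from the trial's cluster-adjusted model.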

  18. Records Management

    Science.gov (United States)

    Ray, Charles M.

    1977-01-01

    This discussion of evaluating a records management course includes comments on management orientation, creation of records, maintenance of records, selection and use of equipment, storage and destruction of records, micrographics, and a course outline. (TA)

  19. Ground-Based Assessment of the Bias and Long-Term Stability of Fourteen Limb and Occultation Ozone Profile Data Records

    Science.gov (United States)

    Hubert, D.; Lambert, J.-C.; Verhoelst, T.; Granville, J.; Keppens, A.; Baray, J.-L.; Cortesi, U.; Degenstein, D. A.; Froidevaux, L.; Godin-Beekmann, S.; hide

    2016-01-01

The ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of fourteen limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias, and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20 and 40 km the satellite ozone measurement biases are smaller than ±5%, the short-term variabilities are less than 5-12%, and the drifts are at most ±5% per decade (or even ±3% per decade for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause, where natural variability and low ozone abundances impede a more precise analysis.
In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10% and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY), and significant long-term drifts (SCIAMACHY, OSIRIS
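A drift figure of a few percent per decade, as quoted above, boils down to the slope of a straight line fitted through relative satellite-minus-ground differences over time. A minimal ordinary-least-squares sketch with a synthetic, noise-free series (the study's actual methodology, with its screening and uncertainty treatment, is considerably more involved):

```python
# Drift estimation sketch: fit an OLS slope through yearly mean relative
# differences (satellite vs ground, in %) and scale it to % per decade.
# The synthetic series below drifts by 0.3 %/yr by construction.

def ols_slope(t, y):
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    num = sum((a - mt) * (b - my) for a, b in zip(t, y))
    den = sum((a - mt) ** 2 for a in t)
    return num / den

years = list(range(10))                       # 10 years of comparisons
rel_diff = [1.0 + 0.3 * yr for yr in years]   # %, constant 1% bias + drift
drift_per_decade = ols_slope(years, rel_diff) * 10
print(round(drift_per_decade, 1))             # 3.0 % per decade
```

The same slope fit, applied per instrument and per altitude layer, is what separates a stable record from one flagged for significant long-term drift.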

  20. Analysis of 24-h oesophageal pH and pressure recordings

    NARCIS (Netherlands)

    Weusten, B. L.; Smout, A. J.

    1995-01-01

    In patients with non-cardiac chest pain, ambulatory oesophageal pressure and pH monitoring provide valuable information. In this paper, the global analysis describing severity and pattern of reflux or the mean amplitude or duration of oesophageal contractions, and the symptom analysis of 24-h

  1. The cause-consequence data base: a retrieval system for records pertaining to accident management

    International Nuclear Information System (INIS)

    Kumamoto, H.; Inoue, K.; Sawaragi, Y.

    1981-01-01

    This paper describes a proposal to store in a data base important paragraphs from reports of investigations into many types of accidents. The data base is to handle not only reports on TMI, but also reports on other events at nuclear reactors, chemical plant explosions, earthquakes, hurricanes, fires, and so forth. (author)

  2. A Blended Learning Study on Implementing Video Recorded Speaking Tasks in Task-Based Classroom Instruction

    Science.gov (United States)

    Kirkgoz, Yasemin

    2011-01-01

    This study investigates designing and implementing a speaking course in which face-to-face instruction informed by the principles of Task-Based Learning is blended with the use of technology, the video, for the first-year student teachers of English in Turkish higher education. The study consisted of three hours of task-based classroom…

  3. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.

    2008-01-01

    ' measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems where measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis...

  4. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...

  5. Comparison of a Web-Based Dietary Assessment Tool with Software for the Evaluation of Dietary Records.

    Science.gov (United States)

    Benedik, Evgen; Koroušić Seljak, Barbara; Hribar, Maša; Rogelj, Irena; Bratanič, Borut; Orel, Rok; Fidler Mis, Nataša

    2015-06-01

Dietary assessment in clinical practice is performed by means of computer support, either in the form of a web-based tool or software. The aim of the paper is to present the results of the comparison of a Slovenian web-based tool with German software for the evaluation of four-day weighted paper-and-pencil-based dietary records (paper-DRs) in pregnant women. A volunteer group of pregnant women (n=63) completed paper-DRs. These records were entered by an experienced research dietitian into a web-based application (Open Platform for Clinical Nutrition, OPEN, http://opkp.si/en, Ljubljana, Slovenia) and a software application (Prodi 5.7 Expert plus, Nutri-Science, Stuttgart, Germany, 2011). The results for calculated energy intake, as well as 45 macro- and micronutrient intakes, were statistically compared using the non-parametric Spearman's rank correlation coefficient. The cut-off for Spearman's rho was set at >0.600. Twelve nutritional parameters (energy, carbohydrates, fat, protein, water, potassium, calcium, phosphorus, dietary fiber, vitamin C, folic acid, and stearic acid) were in high correlation (>0.800), 18 in moderate correlation (0.600-0.799), and 11 in weak correlation (0.400-0.599), while 5 (arachidonic acid, niacin, alpha-linolenic acid, fluoride, total sugars) did not show any statistical correlation. Comparison of the results of the evaluation of dietary records using a web-based dietary assessment tool with those using software shows that there is a high correlation for energy and macronutrient content.
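The Spearman rank correlation used for this tool comparison can be computed with the standard library alone: rank both series (ties get average ranks) and take the Pearson correlation of the ranks. The example intake values below are invented, not data from the study:

```python
# Stdlib sketch of Spearman's rank correlation for comparing two tools'
# estimates of the same nutrient. Ties receive average ranks.

def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # average rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Pearson correlation applied to the rank vectors
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

web_tool = [2010, 1850, 2230, 1990, 2105]   # e.g. energy intake, kcal/day
software = [1980, 1900, 2190, 2005, 2150]
print(round(spearman(web_tool, software), 3))  # → 0.9
```

By the study's cut-offs, a coefficient of 0.9 would fall in the "high correlation" band (>0.800), while values below 0.600 would not pass the agreement threshold.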

  6. A retrospective analysis of medical record use in e-consultations.

    Science.gov (United States)

    Pecina, Jennifer L; North, Frederick

    2017-06-01

    Introduction Under certain circumstances, e-consultations can substitute for a face-to-face consultation. A basic requirement for a successful e-consultation is that the e-consultant has access to the important medical history and exam findings along with laboratory and imaging results. Knowing just what information the specialist needs to complete an e-consultation is a major challenge. This paper examines differences between specialties in their need for past laboratory, imaging and clinical-note information. Methods This is a retrospective study of patients who had an internal e-consultation performed at an academic medical centre. We reviewed a random sample of e-consultations that occurred in the first half of 2013 for the indication for the e-consultation and whether the e-consultant reviewed data in the medical record that was older than one year. Results Out of 3008 total e-consultations, we reviewed 360 (12%) randomly selected e-consultations from 12 specialties. Questions on management (35.8%), image results (27.2%) and laboratory results (25%) were the three most common indications for e-consultation. E-consultants reviewed medical records that were more than one year old in 146 (40.6%) of the e-consultations, with e-consultants in endocrinology, haematology and rheumatology reviewing records older than one year more than half the time. Labs (20.3%), office notes (20%) and imaging (17.8%) were the types of medical data older than one year that were reviewed most frequently overall. Discussion Management questions appear to be the most common reason for e-consultation. E-consultants frequently reviewed historical medical data older than one year at the time of the e-consultation, especially in the endocrinology, haematology and rheumatology specialties. Practices engaging in e-consultations that require transfer of data may want to include longer time frames of historical information.

  7. Linear Feature Projection-Based Sensory Event Detection from the Multiunit Activity of Dorsal Root Ganglion Recordings.

    Science.gov (United States)

    Han, Sungmin; Youn, Inchan

    2018-03-28

    Afferent signals recorded from the dorsal root ganglion can be used to extract sensory information to provide feedback signals in a functional electrical stimulation (FES) system. The goal of this study was to propose an efficient feature projection method for detecting sensory events from multiunit activity-based feature vectors of tactile afferent activity. Tactile afferent signals were recorded from the L4 dorsal root ganglion using a multichannel microelectrode for three types of sensory events generated by mechanical stimulation on the rat hind paw. The multiunit spikes (MUSs) were extracted as multiunit activity-based feature vectors and projected using a linear feature projection method which consisted of projection pursuit and negentropy maximization (PP/NEM). Finally, a multilayer perceptron classifier was used to detect sensory events. The proposed method showed a detection accuracy superior to those of other linear and nonlinear feature projection methods and all processes were completed within real-time constraints. Results suggest that the proposed method could be useful to detect sensory events in real time. We have demonstrated the methodology for an efficient feature projection method to detect real-time sensory events from the multiunit activity of dorsal root ganglion recordings. The proposed method could be applied to provide real-time sensory feedback signals in closed-loop FES systems.
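
    The abstract does not spell out the PP/NEM projection itself. As an illustration of the underlying idea only — searching for a projection direction whose one-dimensional output is maximally non-Gaussian — here is a stdlib-only random-search sketch using the standard log-cosh negentropy approximation from the FastICA literature; the authors' actual optimizer and the downstream multilayer perceptron classifier are not reproduced here.

```python
import math
import random

def standardize(y):
    """Zero-mean, unit-variance copy of y."""
    n = len(y)
    m = sum(y) / n
    s = (sum((v - m) ** 2 for v in y) / n) ** 0.5 or 1.0
    return [(v - m) / s for v in y]

def negentropy_proxy(y):
    """FastICA-style approximation J(y) ~ (E[G(y)] - E[G(nu)])^2
    with G(u) = log cosh(u) and nu a standard normal variable."""
    E_G_GAUSS = 0.3746  # E[log cosh(nu)] for nu ~ N(0, 1)
    g_y = sum(math.log(math.cosh(v)) for v in y) / len(y)
    return (g_y - E_G_GAUSS) ** 2

def projection_pursuit(X, n_trials=500, seed=0):
    """Random-search projection pursuit: among random unit directions,
    keep the one whose 1-D projection of X looks least Gaussian."""
    rng = random.Random(seed)
    d = len(X[0])
    best_w, best_j = None, -1.0
    for _ in range(n_trials):
        w = [rng.gauss(0, 1) for _ in range(d)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
        y = [sum(wi * xi for wi, xi in zip(w, row)) for row in X]
        j = negentropy_proxy(standardize(y))
        if j > best_j:
            best_w, best_j = w, j
    return best_w, best_j
```

    A gradient-based optimizer (as in FastICA proper) converges far faster than random search; the random search is used here only to keep the sketch short.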

  8. Tele-expertise system based on the use of the electronic patient record to support real-time antimicrobial use.

    Science.gov (United States)

    Morquin, David; Ologeanu-Taddei, Roxana; Koumar, Yatrika; Reynes, Jacques

    2018-03-01

    The aims of this study are (i) to present the design of a tele-expertise system, based on the telephone and the electronic patient record (EPR), which supports the counseling of the infectious diseases specialist (IDS) for appropriate antimicrobial use in a French university hospital; and (ii) to assess the diffusion of the system, the users' adherence, and its perceived utility. A prospective observational study was conducted to measure (i) the number and patterns of telephone calls for tele-expertise counsel, the number of initial and secondary assessments by the IDS, and the number of multidisciplinary meetings; (ii) the clinicians' rate of adherence to the IDS's therapeutic proposals and the number of clinical situations in which the IDS decided to move to the bedside; and (iii) the utility of the system as perceived by the medical managers of the most demanding departments. A review of patients' records over a 1-year period indicates that 87 percent of the therapeutic recommendations were fully followed. Adherence was high despite the IDS moving to the bedside in only 6 percent of cases. Medical managers of the most demanding departments considered the system to be useful. Moreover, 6,994 tele-expertise notifications were recorded in the EPR over 48 months. The tele-expertise system is an original way to design an information-technology-supported antimicrobial stewardship intervention, based on remote access to relevant information by the IDS and on the traceability of the medical counseling for the clinicians.

  9. Analysis of free text in electronic health records for identification of cancer patient trajectories

    DEFF Research Database (Denmark)

    Jensen, Kasper; Soguero-Ruiz, Cristina; Mikalsen, Karl Oyvind

    2017-01-01

    With an aging patient population and increasing complexity in patient disease trajectories, physicians are often met with complex patient histories from which clinical decisions must be made. Due to the increasing rate of adverse events and hospitals facing financial penalties for readmission, there has never been a greater need to enforce evidence-led medical decision-making using available health care data. In the present work, we studied a cohort of 7,741 patients, of whom 4,080 were diagnosed with cancer, surgically treated at a University Hospital in the years 2004-2012. We have developed a methodology that allows disease trajectories of the cancer patients to be estimated from free text in electronic health records (EHRs). By using these disease trajectories, we predict 80% of patient events ahead in time. By control of confounders from 8326 quantified events, we identified 557 events…

  10. Pediatrician-Parent Conversations About Human Papillomavirus Vaccination: An Analysis of Audio Recordings.

    Science.gov (United States)

    Sturm, Lynne; Donahue, Kelly; Kasting, Monica; Kulkarni, Amit; Brewer, Noel T; Zimet, Gregory D

    2017-08-01

    We sought to establish which human papillomavirus (HPV) vaccine communication approaches by pediatricians were associated with same-day HPV vaccination of 11- to 12-year-olds by evaluating audio recordings of visits. Verilogue, a market research company maintaining a panel of primary care pediatricians, provided audio recordings and transcriptions of well-child visits for 11- to 12-year-old patients from January through June 2013. Seventy-five transcripts from 19 pediatricians were coded for use of presumptive language (i.e., words conveying assumption of vaccine delivery), offer of delay, recommendation strength, and information provision. Using logistic regression, we evaluated the association between pediatrician communication approaches and agreement to same-day HPV vaccination. Generalized estimating equations accounted for clustering of patients within pediatricians. Same-day agreement to HPV vaccination occurred in 29% of encounters. Pediatricians in the sample often provided parents with inconsistent, mixed messages and sometimes offered information about HPV or HPV vaccination that was inaccurate. Pediatricians used presumptive language in only 11 of 75 encounters; when used, presumptive language was associated with higher odds of accepting HPV vaccine (73% vs. 22%; odds ratio = 8.96; 95% confidence interval = 2.32-34.70). Pediatricians offered or recommended delay in most encounters (65%). HPV vaccine acceptance occurred far more often when pediatricians did not mention delaying vaccination (82% vs. 6%; odds ratio = 80.84; 95% confidence interval = 15.72-415.67). Same-day vaccination was not associated with strength of recommendation or pediatrician reference to vaccinating their own children. Our findings highlight the need to develop and evaluate physician-focused trainings on using presumptive language for same-day HPV vaccination. Copyright © 2017 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
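
    As a sanity check on effect sizes like those above, an unadjusted odds ratio can be read off a 2x2 table. The counts below are hypothetical, back-calculated only approximately from the quoted percentages; the paper's own estimates come from GEE-adjusted logistic regression, so they will not match a raw table exactly.

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Unadjusted odds ratio for a 2x2 table:
    (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)."""
    return (exposed_yes * unexposed_no) / (exposed_no * unexposed_yes)

# Hypothetical counts: presumptive language used in 11 encounters,
# ~73% accepting (8 of 11); not used in 64 encounters, ~22% (14 of 64).
or_presumptive = odds_ratio(8, 3, 14, 50)  # about 9.5, same order as 8.96
```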

  11. The determinants of performance in master swimmers: an analysis of master world records.

    Science.gov (United States)

    Zamparo, P; Gatta, G; di Prampero, P E

    2012-10-01

    Human performance in sports declines with age in all competitions/disciplines. Since the effects of age are often compounded by disuse, the study of master athletes provides the opportunity to investigate the effects of age per se on the metabolic/biomechanical determinants of performance. For all master age groups, swimming styles and distances, we calculated the metabolic power required to cover the distance (d) in the best performance time (BTP) as: E'maxR = C * d/BTP = C * vmax, where C is the energy cost of swimming in young elite swimmers, vmax = d/BTP is the record speed over the distance d, and BTP was obtained from "cross-sectional data" (http://www.fina.org). To establish a record performance, E'maxR must be equal to the maximal available metabolic power (E'maxA). This was calculated assuming a decrease of 1% per year at 40 - 70 years, 2% at 70 - 80 years and 3% at 80 - 90 years (as indicated in the literature) and compared to the E'maxR values. Up to about 55 years of age, E'maxR = E'maxA; for older subjects, E'maxA > E'maxR, the difference increasing linearly by about 0.30% (backstroke), 1.93% (butterfly), 0.92% (front crawl) and 0.37% (breaststroke) per year (average over the 50, 100 and 200 m distances). These data suggest that the energy cost of swimming increases with age. Hence, the decrease in performance in master swimmers is due both to a decrease in the available metabolic power (E'maxA) and to an increase in C.
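
    The decline schedule quoted above (1% per year at 40-70, 2% at 70-80, 3% at 80-90) and the definition E'maxR = C * vmax can be written out directly. A minimal sketch, assuming the per-year percentages are additive percentage points of the age-40 value (the usual reading of a linear decline); C and the times are illustrative inputs, not the paper's values.

```python
def available_power_fraction(age):
    """Fraction of the age-40 maximal metabolic power E'maxA still
    available at a given age, using the per-year declines quoted in
    the abstract (1%/yr at 40-70, 2%/yr at 70-80, 3%/yr at 80-90)."""
    frac = 1.0
    frac -= 0.01 * max(0, min(age, 70) - 40)
    frac -= 0.02 * max(0, min(age, 80) - 70)
    frac -= 0.03 * max(0, min(age, 90) - 80)
    return frac

def required_power(C, distance_m, best_time_s):
    """E'maxR = C * vmax, with vmax = d / BTP (record speed)."""
    return C * distance_m / best_time_s
```

    For example, available_power_fraction(80) evaluates to 0.50: thirty years at 1 percentage point per year plus ten years at 2.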

  12. Injectable loop recorder implantation in an ambulatory setting by advanced practice providers: Analysis of outcomes.

    Science.gov (United States)

    Kipp, Ryan; Young, Natasha; Barnett, Anne; Kopp, Douglas; Leal, Miguel A; Eckhardt, Lee L; Teelin, Thomas; Hoffmayer, Kurt S; Wright, Jennifer; Field, Michael

    2017-09-01

    Implantable loop recorder (ILR) insertion has historically been performed in a surgical environment such as the electrophysiology (EP) lab. The newest generation loop recorder (Medtronic Reveal LINQ™, Minneapolis, MN, USA) is injectable with potential for implantation in a non-EP lab setting by advanced practice providers (APPs) facilitating improved workflow and resource utilization. We report the safety and efficacy of injectable ILR placement in the ambulatory care setting by APPs. A retrospective review was performed including all patients referred for injectable ILR placement from March 2014 to November 2015. All device placement procedures were performed in an ambulatory care setting using the standard manufacturer deployment kit with sterile technique and local anesthetic following a single dose of intravenous antibiotics. Acute procedural success and complication rates following injectable ILR placement in the ambulatory setting were reviewed. During the study period, 125 injectable ILRs were implanted. Acute procedural success with adequate sensing (R-waves ≥ 0.2 mV) occurred in 100% of patients. There were no acute procedural complications. Subacute complications occurred in two patients (1.6% of implantations), including one possible infection treated with oral antibiotics and one device removal due to pain at the implant site. In this retrospective single-center study, implantation of injectable ILR in an ambulatory care setting by APPs following a single dose of intravenous antibiotics and standard manufacturer technique yielded a low complication rate with high acute procedural success. Use of this implantation strategy may improve EP lab workflow while providing a safe and effective technique for device placement. © 2017 Wiley Periodicals, Inc.

  13. A new source discriminant based on frequency dispersion for hydroacoustic phases recorded by T-phase stations

    Science.gov (United States)

    Talandier, Jacques; Okal, Emile A.

    2016-09-01

    In the context of the verification of the Comprehensive Nuclear-Test Ban Treaty in the marine environment, we present a new discriminant based on the empirical observation that hydroacoustic phases recorded at T-phase stations from explosive sources in the water column feature a systematic inverse dispersion, with lower frequencies traveling slower, which is absent from signals emanating from earthquake sources. This difference is present even in the case of the so-called 'hotspot earthquakes' occurring inside volcanic edifices featuring steep slopes leading to efficient seismic-acoustic conversions, which can lead to misidentification of such events as explosions when using more classical duration-amplitude discriminants. We propose an algorithm for the compensation of the effect of dispersion over the hydroacoustic path based on a correction to the spectral phase of the ground velocity recorded by the T-phase station, computed individually from the dispersion observed on each record. We show that the application of a standard amplitude-duration algorithm to the resulting compensated time-series satisfactorily identifies records from hotspot earthquakes as generated by dislocation sources, and present a full algorithm, lending itself to automation, for the discrimination of explosive and earthquake sources of hydroacoustic signals at T-phase stations. The only sources not readily identifiable consist of a handful of complex explosions which occurred in the 1970s, believed to involve the testing of advanced weaponry, and which should be independently identifiable through routine vetting by analysts. While we presently cannot provide a theoretical justification to the observation that only explosive sources generate dispersed T phases, we hint that this probably reflects a simpler, and more coherent distribution of acoustic energy among the various modes constituting the wave train, than in the case of dislocation sources embedded in the solid Earth.
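
    The compensation step — advancing each spectral component of the record by its travel-time delay before re-running the amplitude-duration discriminant — can be sketched with a plain DFT. The delay function below is an assumed input; the paper derives it per record from the observed dispersion, which is not reproduced here.

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N)
                for k in range(N)).real / N for n in range(N)]

def compensate_dispersion(trace, delay_fn, dt):
    """Advance each spectral component by its travel-time delay
    tau(|f|): X'(f) = X(f) * exp(+2*pi*i*f*tau(|f|)). Using tau of
    |f| keeps the spectrum conjugate-symmetric, so the output of
    the inverse transform stays real."""
    N = len(trace)
    X = dft(trace)
    out = []
    for k, Xk in enumerate(X):
        f = k / (N * dt) if k <= N // 2 else (k - N) / (N * dt)
        out.append(Xk * cmath.exp(2j * math.pi * f * delay_fn(abs(f))))
    return idft(out)
```

    With a constant delay the whole trace simply shifts earlier; a frequency-dependent tau(f) instead realigns the dispersed components, which is the effect exploited by the discriminant.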

  14. Selection of optimal recording sites for limited lead body surface potential mapping: A sequential selection based approach

    Directory of Open Access Journals (Sweden)

    McCullagh Paul J

    2006-02-01

    Abstract Background In this study we propose the development of a new algorithm for selecting optimal recording sites for limited lead body surface potential mapping. The proposed algorithm differs from previously reported methods in that it is based upon a simple and intuitive data-driven technique that does not make any presumptions about deterministic characteristics of the data. It uses a forward-selection-based search technique to find the best combination of electrocardiographic leads. Methods The study was conducted using a dataset consisting of body surface potential maps (BSPMs) recorded from 116 subjects, which included 59 normal subjects and 57 subjects exhibiting evidence of old myocardial infarction (MI). The performance of the algorithm was evaluated using spatial RMS voltage error and the correlation coefficient to compare original and reconstructed map frames. Results In all, three configurations of the algorithm were evaluated, and it was concluded that there was little difference in their performance. In addition to observing the performance of the selection algorithm, several lead subsets of 32 electrodes as chosen by the various configurations of the algorithm were evaluated. The rationale for choosing this number of recording sites was to allow comparison with a previous study that used a different algorithm, in which 32 leads were deemed to provide an acceptable level of reconstruction performance. Conclusion Although the lead configurations suggested in this study were not identical to those suggested in the previous work, the systems bore similar characteristics in that recording sites were chosen with greatest density in the precordial region.
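
    The forward-selection search described above can be sketched as a generic greedy loop: repeatedly add the lead whose inclusion most improves a score. The paper scores candidate subsets by spatial RMS voltage error against the full BSPM; the toy coverage score below is a hypothetical stand-in so the skeleton stays self-contained.

```python
def forward_select(candidates, score, k):
    """Greedy forward selection: grow the selected set one element at
    a time, always adding the candidate that maximizes the score of
    the enlarged set."""
    selected = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda c: score(selected + [c]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy stand-in score: each hypothetical lead "covers" some map regions,
# and a subset is scored by how many distinct regions it covers.
coverage = {
    "V1": {1, 2, 3},
    "V2": {3, 4},
    "V5": {5, 6, 7, 8},
    "X": {2, 3},
}

def score(chosen):
    return len(set().union(*(coverage[c] for c in chosen)))

picks = forward_select(coverage, score, 2)  # "V5" first, then "V1"
```

    Greedy forward selection is not guaranteed to find the globally best subset, which is why the paper compares several configurations of the search.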

  15. A new high resolution glacial flood history from Japan based on the Lake Suigetsu sediment record

    Science.gov (United States)

    Schlolaut, Gordon; Brauer, Achim; Lamb, Henry F.; Marshall, Michael H.; Staff, Richard A.; Bronk Ramsey, Christopher; Nakagawa, Takeshi

    2017-04-01

    High-precipitation events leading to natural disasters such as floods and landslides are a rather common occurrence in Japan, since the country receives heavy rains during the summer monsoon rainy season and from typhoons that frequently make landfall. In order to study the natural variability of such precipitation events, Lake Suigetsu provides an ideal and currently unique archive. The lake is situated in central western Japan in Fukui prefecture, and its sediment record spans ≈150 ka, of which the last ≈50 ka contain seasonal laminations. Runoff events due to heavy rains are readily distinguishable as distinct detrital layers. Here we present data from a 14 ka time slice between 52 and 38 ka BP. The varve quality in this interval is particularly good, allowing seasonal discrimination of flood events and the construction of a high-resolution flood history using thin-section microscopy. Our initial results show pronounced centennial-scale variations in flood frequency, with variable periodicities which we hypothesise to be driven by solar variations.

  16. ATLAS Recordings

    CERN Multimedia

    Steven Goldfarb; Mitch McLachlan; Homer A. Neal

    Web archives of ATLAS plenary sessions, workshops, meetings, and tutorials from 2005 until this past month are available via the University of Michigan portal. Most recent additions include the Trigger-Aware Analysis Tutorial by Monika Wielers on March 23 and the ROOT Workshop held at CERN on March 26-27. Viewing requires a standard web browser with the RealPlayer plug-in (included in most browsers automatically) and works on any major platform. Lectures can be viewed directly over the web or downloaded locally. In addition, you will find access to a variety of general tutorials and events via the portal. Feedback welcome: our group is making arrangements now to record plenary sessions, tutorials, and other important ATLAS events for 2007. Your suggestions for potential recordings, as well as your feedback on the existing archives, are always welcome. Please contact us at wlap@umich.edu. Thank you. Enjoy the lectures!

  17. Analysis of recently digitized continuous seismic data recorded during the March-May, 1980, eruption sequence at Mount St. Helens

    Science.gov (United States)

    Moran, S. C.; Malone, S. D.

    2013-12-01

    The May 18, 1980, eruption of Mount St. Helens (MSH) was an historic event, both for society and for the field of volcanology. However, our knowledge of the eruption and the precursory period leading up to it is limited by the fact that most of the data, particularly seismic recordings, were not kept due to severe limitations in the amount of digital data that could be handled and stored using 1980 computer technology. Because of these limitations, only about 900 digital event files have been available for seismic studies of the March-May seismic sequence, out of a total of more than 4,000 events that were counted using paper records. Fortunately, data from a subset of stations were also recorded continuously on a series of 24 analog 14-track IRIG magnetic tapes. We have recently digitized these tapes and time-corrected and cataloged the resultant digital data streams, enabling more in-depth studies of the (almost) complete pre-eruption seismic sequence using modern digital processing techniques. Of the fifteen seismic stations operating near MSH for at least part of the two months between March 20 and May 18, six stations have relatively complete analog recordings. These recordings have gaps of minutes to days because of radio noise, poor tape quality, or missing tapes. In addition, several other stations have partial records. All stations had short-period vertical-component sensors with very limited dynamic range and unknown response details. Nevertheless, because the stations were at a range of distances and were operated at a range of gains, a variety of earthquake sizes were recorded on scale by at least one station, and therefore a much more complete understanding of the evolution of event types, sizes and character should be achievable. In our preliminary analysis of this dataset we have found over 10,000 individual events as