WorldWideScience

Sample records for based record analysis

  1. Pareto analysis based on records

    CERN Document Server

    Doostparast, M

    2012-01-01

    Estimation of the parameters of an exponential distribution based on record data has been treated by Samaniego and Whitaker (1986) and Doostparast (2009). Recently, Doostparast and Balakrishnan (2011) obtained optimal confidence intervals as well as uniformly most powerful tests for one- and two-sided hypotheses concerning location and scale parameters based on record data from a two-parameter exponential model. In this paper, we derive optimal statistical procedures, including point and interval estimation as well as most powerful tests, based on record data from a two-parameter Pareto model. For illustrative purposes, a data set on the annual wages of a sample of production-line workers in a large industrial firm is analyzed using the proposed procedures.
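    The paper's own optimal procedures are not reproduced in this listing. As an illustrative sketch only (all names invented), one standard derivation for upper record values r_1 < … < r_n from a two-parameter Pareto model with shape α and scale β yields the closed-form maximum-likelihood estimates β̂ = r_1 and α̂ = n / ln(r_n / r_1):

```python
import math

def extract_records(xs):
    """Upper record values: each observation exceeding all previous ones."""
    records = []
    for x in xs:
        if not records or x > records[-1]:
            records.append(x)
    return records

def pareto_record_mle(records):
    """MLE of a two-parameter Pareto (shape alpha, scale beta) from upper
    records r_1 < ... < r_n: beta_hat = r_1, alpha_hat = n / ln(r_n / r_1).
    (Requires n >= 2 distinct records; a sketch, not the paper's procedure.)"""
    n = len(records)
    beta_hat = records[0]
    alpha_hat = n / math.log(records[-1] / records[0])
    return alpha_hat, beta_hat

recs = extract_records([2.0, 1.5, 4.0, 3.0, 8.0])   # records: 2.0, 4.0, 8.0
alpha_hat, beta_hat = pareto_record_mle(recs)
```

The interval estimates and tests in the paper build on the same likelihood factorisation.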

  2. Repetition-based Structure Analysis of Music Recordings

    OpenAIRE

    Jiang, Nanzhu

    2015-01-01

    Music Information Retrieval (MIR) is a current area of research which aims at providing techniques and tools for searching, organizing, processing and interacting with music data. In order to extract musically meaningful information from audio recordings, one requires methods from various fields such as digital signal processing, music theory, human perception, and information retrieval. One central research topic within MIR is referred to as music structure analysis, where an important goal ...

  3. Heart rhythm analysis using ECG recorded with a novel sternum based patch technology

    DEFF Research Database (Denmark)

    Saadi, Dorthe B.; Fauerskov, Inge; Osmanagic, Armin;

    2013-01-01

    … reliable long-term ECG recordings. The device is designed for high compliance and low patient burden. This novel patch technology is CE approved for ambulatory ECG recording of two ECG channels on the sternum. This paper describes a clinical pilot study regarding the usefulness of these ECG signals for heart rhythm analysis. A clinical technician with experience in ECG interpretation selected 200 noise-free 7-second ECG segments from 25 different patients. These 200 ECG segments were evaluated by two medical doctors according to their usefulness for heart rhythm analysis. The first doctor considered 98.5% of the segments useful for rhythm analysis, whereas the second doctor considered 99.5% of the segments useful. The conclusion of this pilot study indicates that two-channel ECG recorded on the sternum is useful for rhythm analysis and could be used as input to diagnosis…

  5. Borneo: a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    OpenAIRE

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands, I developed high-spatial-resolution patterns of Borneo's botanical richness, endemicity, and floristic regions. The patterns are derived from species distribution models, which predict a species' occurrence based on the identified relationships between the species' recorded presences and the ecological circumstances at those localities. A new statistical method was developed to test the species distribut...

  6. How do repeat suicide attempters differ from first timers? An exploratory record based analysis

    Directory of Open Access Journals (Sweden)

    Vikas Menon

    2016-01-01

    Background: Evidence indicates that repeat suicide attempters, as a group, may differ from first-time attempters. The identification of repeat attempters is a powerful but underutilized clinical variable. Aims: In this research, we aimed to compare individuals with lifetime histories of multiple attempts with first-time attempters to identify factors predictive of repeat attempts. Setting and Design: This was a retrospective record-based study carried out at a teaching-cum-tertiary care hospital in South India. Methods: Relevant data were extracted from the clinical records of first-time attempters (n = 362) and repeat attempters (n = 61) presenting to a single tertiary care center over a 4½-year period. They were compared on various sociodemographic and clinical parameters. The clinical measures included the Presumptive Stressful Life Events Scale, Beck Hopelessness Scale, Coping Strategies Inventory – Short Form, and the Global Assessment of Functioning Scale. Statistical Analysis Used: First-time attempters and repeaters were compared using appropriate inferential statistics. Logistic regression was used to identify independent predictors of repeat attempts. Results: The two groups did not significantly differ on sociodemographic characteristics. Repeat attempters were more likely to have given prior hints about their act (χ2 = 4.500, P = 0.034). In the final regression model, the Beck Hopelessness Scale score emerged as a significant predictor of repeat suicide attempts (odds ratio = 1.064, P = 0.020). Conclusion: Among suicide attempters presenting to the hospital, the presence of hopelessness is a predictor of repeat suicide attempts, independent of clinical depression. This highlights the importance of considering hopelessness in the assessment of suicidality with a view to minimizing the risk of future attempts.
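    The reported odds ratio comes from a logistic regression. As a hedged, self-contained sketch on synthetic data (not the study's records; variable names invented), a minimal logistic model can be fitted by gradient ascent and the odds ratio per one-point increase of a hopelessness-style score read off as exp(coefficient):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Plain gradient-ascent logistic regression; returns [intercept, slope(s)]."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))     # predicted probabilities
        w += lr * Xb.T @ (y - p) / len(y)      # log-likelihood gradient step
    return w

# Invented data: hopelessness score vs. repeat-attempt status (1 = repeater).
score = np.array([4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0, 18.0])
repeat = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0])
w = fit_logistic(score - score.mean(), repeat)
odds_ratio = float(np.exp(w[1]))   # odds multiplier per one-point score increase
```

An odds ratio above 1 corresponds to higher scores predicting repeat attempts, the direction reported in the abstract.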

  7. Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis

    DEFF Research Database (Denmark)

    Rasmussen, Sune Olander; Andersen, Katrine K.; Johnsen, Sigfus Johann;

    2005-01-01

    Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas, there ...

  8. Towards successful coordination of electronic health record based-referrals: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Paul Lindsey A

    2011-07-01

    Background: Successful subspecialty referrals require considerable coordination and interactive communication among the primary care provider (PCP), the subspecialist, and the patient, which may be challenging in the outpatient setting. Even when referrals are facilitated by electronic health records (EHRs), i.e., e-referrals, lapses in patient follow-up might occur. Although compelling reasons exist why referral coordination should be improved, little is known about which elements of the complex referral coordination process should be targeted for improvement. Using Okhuysen & Bechky's coordination framework, this paper aims to understand the barriers, facilitators, and suggestions for improving communication and coordination of EHR-based referrals in an integrated healthcare system. Methods: We conducted a qualitative study to understand coordination breakdowns related to e-referrals in an integrated healthcare system and examined work-system factors that affect the timely receipt of subspecialty care. We conducted interviews with seven subject matter experts and six focus groups with a total of 30 PCPs and subspecialists at two tertiary care Department of Veterans Affairs (VA) medical centers. Using techniques from grounded theory and content analysis, we identified organizational themes that affected the referral process. Results: Four themes emerged: lack of an institutional referral policy, lack of standardization in certain referral procedures, ambiguity in roles and responsibilities, and inadequate resources to adapt and respond to referral requests effectively. Marked differences in PCPs' and subspecialists' communication styles and individual mental models of the referral processes likely precluded the development of a shared mental model to facilitate coordination and successful referral completion. Notably, very few barriers related to the EHR were reported. Conclusions: Despite facilitating information transfer between PCPs and…

  9. The construction and periodicity analysis of natural disaster database of Alxa area based on Chinese local records

    Science.gov (United States)

    Yan, Zheng; Mingzhong, Tian; Hengli, Wang

    2010-05-01

    Chinese hand-written local records originated in the first century. Generally, these local records cover the geography, history, customs, education, products, people, historical sites, and writings of an area. Thanks to such endeavors, the record of China's natural history has had almost no "dark ages" over its 5000-year civilization. A compilation of all meaningful historical data on natural disasters that took place in Alxa, Inner Mongolia, the second largest desert area in China, is used here to construct a 500-year high-resolution database. The database is divided into subsets according to the type of natural disaster, such as sand-dust storms, drought events, and cold waves. By applying trend, correlation, wavelet, and spectral analysis to these data, we can estimate the statistical periodicity of different natural disasters, detect and quantify similarities and patterns among the periodicities of these records, and finally take these results in aggregate to find a strong and coherent cyclicity through the last 500 years that serves as the driving mechanism of these geological hazards. Based on the periodicity obtained from the above analysis, the paper discusses the probability of forecasting natural disasters and suitable measures to reduce disaster losses using historical records. Keywords: Chinese local records; Alxa; natural disasters; database; periodicity analysis
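    The spectral-analysis step described above can be sketched with a simple periodogram: the dominant period of an annual event-count series is the reciprocal of the frequency bin with maximum power. The data and cycle length below are synthetic stand-ins, not the Alxa database:

```python
import numpy as np

def dominant_period(series):
    """Return the dominant period (in samples, e.g. years) from the periodogram."""
    x = np.asarray(series, float) - np.mean(series)
    power = np.abs(np.fft.rfft(x)) ** 2
    k = int(np.argmax(power[1:])) + 1   # skip the zero-frequency bin
    return len(x) / k

# Synthetic 500-year disaster-count series with an 11-year cycle plus noise.
years = np.arange(500)
rng = np.random.default_rng(0)
counts = 5.0 + 2.0 * np.sin(2 * np.pi * years / 11.0) + 0.5 * rng.normal(size=500)
period = dominant_period(counts)
```

Wavelet analysis, also used in the paper, additionally localizes such cycles in time rather than assuming they persist over the whole record.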

  10. A Quantitative Analysis of an EEG Epileptic Record Based on Multiresolution Wavelet Coefficients

    Directory of Open Access Journals (Sweden)

    Mariel Rosenblatt

    2014-11-01

    The characterization of the dynamics associated with electroencephalogram (EEG) signals, combining an orthogonal discrete wavelet transform analysis with quantifiers originating from information theory, is reviewed. In addition, an extension of this methodology based on multiresolution quantities, called wavelet leaders, is presented. In particular, the temporal evolution of the Shannon entropy and the statistical complexity evaluated with different sets of multiresolution wavelet coefficients is considered. Both methodologies are applied to the quantitative EEG time-series analysis of a tonic-clonic epileptic seizure, and comparative results are presented. In particular, even when both methods describe the dynamical changes of the EEG time series, the one based on wavelet leaders presents a better time resolution.
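    The wavelet-plus-information-theory idea can be sketched as follows: decompose the signal with a discrete wavelet transform, form the relative energy of each band, and take the Shannon entropy of that distribution. This minimal version uses a hand-rolled Haar transform rather than the orthogonal wavelets or wavelet leaders of the paper:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    x = np.asarray(x, float)
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def wavelet_entropy(x, levels):
    """Shannon entropy of the relative wavelet energies across bands."""
    a = np.asarray(x, float)
    energies = []
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(float(np.sum(d ** 2)))   # detail-band energy
    energies.append(float(np.sum(a ** 2)))        # final approximation band
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

flat = np.ones(64)                                # all energy in one band
noisy = np.random.default_rng(0).normal(size=64)  # energy spread over bands
```

A signal concentrated in one band has entropy near zero; broadband activity scores higher, which is the contrast the quantifiers exploit across a seizure.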

  11. Validation of PC-based sound card with Biopac for digitalization of ECG recording in short-term HRV analysis

    Directory of Open Access Journals (Sweden)

    K Maheshkumar

    2016-01-01

    Background: Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation of heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to compare (validate) the HRV using a temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) against the Biopac MP36 module. Materials and Methods: Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, participated in the present study. Following the standard protocol, a 5-min ECG was recorded after 10 min of supine rest by the portable simple analog amplifier with PC-based sound card as well as by the Biopac module, simultaneously, with surface electrodes in the lead II position. All the ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in the Kubios software. Short-term HRV indices in both the time and frequency domains were used. Results: The unpaired Student's t-test and Pearson correlation coefficient test were used for the analysis, using the R statistical software. No statistically significant differences were observed when comparing the values analyzed by means of the two devices for HRV. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the values in the time and frequency domains obtained by the two devices. Conclusion: On the basis of the results of the present study, we suggest that the calculation of HRV values in the time and frequency domains from RR series obtained with the PC-based sound card is probably as reliable as those obtained with the gold-standard Biopac MP36.
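    Two standard time-domain HRV indices and the device-agreement check can be sketched directly from RR-interval series. The RR data below are synthetic stand-ins for the two simultaneous recordings, not the study's measurements:

```python
import numpy as np

def sdnn(rr):
    """SDNN: standard deviation of RR intervals (ms)."""
    return float(np.std(rr, ddof=1))

def rmssd(rr):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

# Synthetic RR series (ms) standing in for the two simultaneous devices.
rng = np.random.default_rng(1)
rr_biopac = 800 + 50 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 10, 300)
rr_soundcard = rr_biopac + rng.normal(0, 2, 300)   # same beats, small device noise
r = float(np.corrcoef(rr_biopac, rr_soundcard)[0, 1])
```

When the two devices digitize the same beats, the index values computed from either series correlate strongly, which is the validation logic of the study.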

  12. Template-based data entry for general description in medical records and data transfer to data warehouse for analysis.

    Science.gov (United States)

    Matsumura, Yasushi; Kuwata, Shigeki; Yamamoto, Yuichiro; Izumi, Kazunori; Okada, Yasushi; Hazumi, Michihiro; Yoshimoto, Sachiko; Mineno, Takahiro; Nagahama, Munetoshi; Fujii, Ayumi; Takeda, Hiroshi

    2007-01-01

    General descriptions in medical records are so diverse that they are usually entered as free text into an electronic medical record (EMR), and the resulting data analysis is often difficult. We developed and implemented a template-based data entry module and data analyzing system for general descriptions. We developed a template with a tree structure, whose content master and entered patient data are simultaneously expressed in XML. The entered structured data are converted to narrative form for easy reading. This module was implemented in the EMR system and is used in 35 hospitals as of October 2006. So far, 3725 templates (3242 concepts) have been produced. The data in XML and the narrative text data are stored in the EMR database. The XML data are retrieved, and then the patient data are extracted, to be stored in the data warehouse (DWH). We developed a search assisting system that enables users to find objective data in the DWH without requiring complicated SQL. By using this method, general descriptions in medical records can be structured and made available for clinical research.
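    The XML-template-to-narrative conversion can be sketched with the standard library. The element and attribute names below are an invented toy schema, not the paper's actual template format:

```python
import xml.etree.ElementTree as ET

# Invented toy schema: a template whose items carry label/value/unit attributes.
TEMPLATE_XML = """\
<template name="physical_exam">
  <item label="Blood pressure" value="120/80" unit="mmHg"/>
  <item label="Heart rate" value="72" unit="bpm"/>
</template>"""

def to_narrative(xml_text):
    """Render structured template data as a readable narrative line."""
    root = ET.fromstring(xml_text)
    parts = [f"{i.get('label')}: {i.get('value')} {i.get('unit')}" for i in root]
    return "; ".join(parts) + "."

narrative = to_narrative(TEMPLATE_XML)
```

Because the same XML also feeds the data warehouse, each attribute remains individually queryable even though clinicians read the narrative form.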

  13. Magnetoencephalography recording and analysis

    Directory of Open Access Journals (Sweden)

    Jayabal Velmurugan

    2014-01-01

    Magnetoencephalography (MEG) non-invasively measures the magnetic field generated by the excitatory postsynaptic electrical activity of the apical dendritic pyramidal cells. Such a tiny magnetic field is measured with the help of biomagnetometer sensors coupled with a Superconducting Quantum Interference Device (SQUID) inside a magnetically shielded room (MSR). The subjects are usually screened for the presence of ferromagnetic materials, and then the head position indicator coils, electroencephalography (EEG) electrodes (if measured simultaneously), and fiducials are digitized using a 3D digitizer, which aids in movement correction and also in transferring the MEG data from head coordinates to device and voxel coordinates, thereby enabling more accurate co-registration and localization. MEG data pre-processing involves filtering the data for environmental and subject interference, artefact identification, and rejection. Magnetic resonance imaging (MRI) is processed for correction and for identifying fiducials. After choosing and computing the appropriate head model (spherical or realistic; boundary/finite element model), the interictal/ictal epileptiform discharges are selected and modeled by an appropriate source modeling technique (clinically, the most commonly used is the single equivalent current dipole, ECD, model). The equivalent current dipole (ECD) source localization of the modeled interictal epileptiform discharge (IED) is considered physiologically valid or acceptable based on waveform morphology, isofield pattern, and dipole parameters (localization, dipole moment, confidence volume, goodness of fit). Thus, MEG source localization can aid clinicians in sublobar localization, lateralization, and grid placement, by evoking the irritative/seizure onset zone. It also accurately localizes eloquent cortex, like the visual and language areas. MEG also aids in diagnosing and delineating multiple novel findings in other neuropsychiatric…

  14. Recording-based identification of site liquefaction

    Institute of Scientific and Technical Information of China (English)

    Hu Yuxian; Zhang Yushan; Liang Jianwen; Ray Ruichong Zhang

    2005-01-01

    Reconnaissance reports and pertinent research on seismic hazards show that liquefaction is one of the key sources of damage to geotechnical and structural engineering systems. Therefore, identifying site liquefaction conditions plays an important role in seismic hazard mitigation. One of the widely used approaches for detecting liquefaction is based on the time-frequency analysis of ground motion recordings, in which the short-time Fourier transform is typically used. It is known that recordings at a site with liquefaction are the result of nonlinear responses of seismic waves propagating in the liquefied layers underneath the site. Moreover, the Fourier transform is not effective in characterizing dynamic features such as the time-dependent frequency of the recordings rooted in nonlinear responses. Therefore, the aforementioned approach may not be intrinsically effective in detecting liquefaction. An alternative to the Fourier-based approach is presented in this study, which proposes time-frequency analysis of earthquake ground motion recordings with the aid of the Hilbert-Huang transform (HHT), and offers justification for the HHT in addressing the liquefaction features shown in the recordings. The paper then defines the predominant instantaneous frequency (PIF) and introduces PIF-related motion features to identify liquefaction conditions at a given site. Analysis of 29 recorded data sets at different site conditions shows that the proposed approach is effective in detecting site liquefaction in comparison with other methods.
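    The instantaneous-frequency idea underlying the PIF can be sketched from the Hilbert half of the HHT alone (the empirical mode decomposition step is omitted here): form the analytic signal via the FFT and differentiate its unwrapped phase. The test tone is synthetic:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (discrete Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0       # double positive frequencies
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the unwrapped analytic phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.diff(phase) * fs / (2 * np.pi)

fs = 200.0
t = np.arange(0, 2, 1 / fs)
tone = np.sin(2 * np.pi * 5.0 * t)        # a 5 Hz test tone
f_inst = instantaneous_frequency(tone, fs)
```

A drop of the predominant instantaneous frequency over time is the kind of nonlinear signature the paper associates with liquefied layers.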

  15. The Use of Continuous Wavelet Transform Based on the Fast Fourier Transform in the Analysis of Multi-channel Electrogastrography Recordings.

    Science.gov (United States)

    Komorowski, Dariusz; Pietraszek, Stanislaw

    2016-01-01

    This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters, such as the dominant frequency (DF), dominant power (DP), and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of observation (analysis). Wavelet analysis allows long time windows to be used when more precise low-frequency information is needed, and shorter windows when high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method makes it possible to obtain results for relatively long EGG records in a fairly short time, much faster than using classical methods based on running spectrum analysis (RSA). In this study the authors demonstrate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown on the example of an analysis of four-channel EGG recordings performed for a non-caloric meal.
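    The FFT-based CWT can be sketched as one multiplication in the frequency domain per scale, which is exactly a circular convolution of the signal with the scaled wavelet. This minimal Morlet version (parameters and the 0.05 Hz test oscillation are illustrative, not the paper's implementation) locates the dominant frequency via the peak-power scale:

```python
import numpy as np

def cwt_fft(signal, scales, fs, w0=6.0):
    """Morlet CWT computed via FFT, i.e. circular convolution per scale."""
    n = len(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1 / fs)   # angular frequencies
    S = np.fft.fft(signal)
    coeffs = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Frequency response of the (analytic) Morlet wavelet at scale s.
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        coeffs[i] = np.fft.ifft(S * np.conj(psi_hat)) * np.sqrt(s)
    return coeffs

fs, w0 = 1.0, 6.0
t = np.arange(1024)
egg = np.sin(2 * np.pi * 0.05 * t)        # 0.05 Hz (3 cpm) test oscillation
f_grid = np.linspace(0.02, 0.1, 41)
scales = w0 / (2 * np.pi * f_grid)        # scale <-> pseudo-frequency mapping
power = np.mean(np.abs(cwt_fft(egg, scales, fs, w0)) ** 2, axis=1)
f_peak = float(f_grid[np.argmax(power)])
```

The per-scale FFT products replace the sliding-window spectra of running spectrum analysis, which is where the speedup comes from.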

  16. Research on Generation Technology of an Equipment Maintenance Handbook Based on the Supportability Analysis Record (SAR)

    Institute of Scientific and Technical Information of China (English)

    黄文江; 胡起伟; 朱宁; 王业

    2014-01-01

    The supportability analysis record (SAR) is the basic data source for writing maintenance handbooks. To address the limitations of traditional paper maintenance handbooks and the low utilization of the SAR during handbook authoring, this paper proposes generating the maintenance handbook directly from an underlying data environment built on the supportability analysis record, constructs a technical framework for SAR-based equipment maintenance handbook generation, and develops an XML-based representation of the SAR. The framework makes full use of SAR data to generate the handbook, improving the reusability of the analysis data, avoiding repeated analysis, reducing the workload, and increasing the efficiency of handbook generation.

  17. A Visualized Fault Recording Analysis System Based on IEC61850

    Institute of Scientific and Technical Information of China (English)

    李楠; 尹军; 董贝; 韩春江; 葛雅川

    2012-01-01

    To meet the need of analyzing Comtrade-format fault recordings in smart substations, a visualized fault recording analysis system based on the Comtrade format and the IEC61850 file transfer model was designed and developed. Using the IEC61850 file transfer model, the software downloads Comtrade-format recording files from intelligent electronic devices, parses them, displays the sampled-data waveforms graphically, and presents fault briefs, event reports, and binary inputs in list form. Through a layered visual logic display, it provides an analysis method that proceeds from a system-level overview to in-depth, layer-by-layer exploration, making fault-record analysis more intuitive, precise, and scientific. The paper describes the overall architecture and program flow of the visualized analysis, focuses on the recording download function based on the IEC61850 file transfer model and the sampled-data analysis algorithm, and details the implementation of the SVG-based visual logic display.

  18. EEG Recording and Analysis for Sleep Research

    OpenAIRE

    Campbell, Ian G.

    2009-01-01

    The electroencephalogram (EEG) is the most common tool used in sleep research. This unit describes the methods for recording and analyzing the EEG. Detailed protocols describe recorder calibration, electrode application, EEG recording, and computer EEG analysis with power spectral analysis. Computer digitization of an analog EEG signal is discussed, along with EEG filtering and the parameters of fast Fourier transform (FFT) power spectral analysis. Sample data are provided for a typical night...
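    The FFT power spectral analysis mentioned in the unit can be sketched as epoch-averaged windowed periodograms. The epoch length, sampling rate, and 10 Hz test signal below are illustrative choices, not the protocol's parameters:

```python
import numpy as np

def averaged_spectrum(x, fs, epoch_len):
    """Average windowed FFT power over consecutive epochs (basic PSD estimate)."""
    n_epochs = len(x) // epoch_len
    win = np.hanning(epoch_len)                 # taper to reduce spectral leakage
    psd = np.zeros(epoch_len // 2 + 1)
    for k in range(n_epochs):
        seg = x[k * epoch_len:(k + 1) * epoch_len] * win
        psd += np.abs(np.fft.rfft(seg)) ** 2
    freqs = np.fft.rfftfreq(epoch_len, d=1 / fs)
    return freqs, psd / n_epochs

fs = 128.0
t = np.arange(1024) / fs
eeg = np.sin(2 * np.pi * 10.0 * t)              # 10 Hz alpha-band test signal
freqs, psd = averaged_spectrum(eeg, fs, epoch_len=256)
f_dom = float(freqs[np.argmax(psd[1:]) + 1])    # skip the DC bin
```

Averaging across epochs trades frequency resolution for variance reduction, the usual compromise in sleep EEG spectral analysis.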

  19. SEPServer Solar Energetic Particle event Catalogues at 1 AU based on STEREO recordings: selected solar cycle 24 SEP event analysis

    Science.gov (United States)

    Papaioannou, Athanasios; Malandraki, Olga E.; Dresing, Nina; Klein, Karl-Ludwig; Heber, Bernd; Vainio, Rami; Nindos, Alexander; Rodríguez-Gasén, Rosa; Klassen, Andreas; Gómez Herrero, Raúl; Vilmer, Nicole; Mewaldt, Richard A.

    2014-05-01

    STEREO (Solar TErrestrial RElations Observatory) recordings provide an unprecedented opportunity to identify the evolution of Solar Energetic Particles (SEPs) at different observing points in the heliosphere. In this work, two instruments onboard STEREO have been used in order to identify all SEP events observed within the declining phase of solar cycle 23 and the rising phase of solar cycle 24, from 2007 to 2012, namely: the Low Energy Telescope (LET) and the Solar Electron Proton Telescope (SEPT). A scan over STEREO/LET protons within the energy range 6-10 MeV has been performed for each of the two STEREO spacecraft. Furthermore, parallel scanning of the STEREO/SEPT electrons in the energy range of 55-85 keV has been performed in order to pinpoint the presence (or not) of an electron event for all of the aforementioned proton events included in our lists. We provide the onset and peak times as well as the peak value of all events for both protons and electrons. Time-shifting analysis for near-relativistic electrons leads to the inferred solar release time and to the relevant solar associations, from radio spectrographs (Nançay Decametric Array; STEREO/WAVES) to GOES soft X-rays and hard X-rays from RHESSI. This information constitutes the STEREO SEPServer catalogues recently released to the scientific community. To demonstrate the use of the STEREO catalogues, we then focus on the series of SEP events recorded onboard STEREO A & B as well as at L1 (ACE, SOHO) from March 4-14, 2012. We track the activity of active region (AR) 1429 during its passage from the East to the West, which produced a number of intense solar flares and coronal mass ejections, and we compare the magnetic connectivity of each spacecraft in association with the corresponding SEP signatures. During this period the longitudinal separation of the STEREO spacecraft was > 220 degrees, yet both of them recorded SEP events. These complex multi…

  20. Portable EGG recording system based on a digital voice recorder.

    Science.gov (United States)

    Jang, J-K; Shieh, M-J; Kuo, T-S; Jaw, F-S

    2009-01-01

    Cutaneous electrogastrogram (EGG) recording offers the benefit of non-invasive gastrointestinal diagnosis. With long-term ambulatory recording of signals, researchers and clinicians could have more opportunities to investigate and analyse paroxysmal or acute symptoms. A portable EGG system based on a digital voice recorder (DVR) is designed for long-term recording of cutaneous EGG signals. The system consists of electrodes, an EGG amplifier, a modulator, and a DVR. Online monitoring and off-line acquisition of EGG are handled by software. A special design employing an integrated timer circuit is used to modulate the EGG frequency to meet the input requirements of the DVR. This approach involves low supply voltage and low power consumption. Software demodulation is used to simplify the complexity of the system, and is helpful in reducing the size of the portable device. By using surface-mount devices (SMD) and a low-power design, the system is robust, compact, and suitable for long-term portable recording. As a result, researchers can record an ambulatory EGG signal by means of the proposed circuits in conjunction with an up-to-date voice-recording device.

  1. Reconstructing Holocene climate variability of northwestern Norway based on geochemical analysis of two lacustrine records from the Lofoten Islands

    Science.gov (United States)

    Balascio, N. L.; Bradley, R. S.

    2009-12-01

    We investigate changing environmental conditions during the Holocene in two lakes from the Lofoten Islands. The Lofoten archipelago is located off the northwestern coast of Norway (67-69°N). The maritime climate is mild despite this high latitude location with a mean annual air temperature of ~4°C and mean annual precipitation of ~1200 mm. These conditions are strongly regulated by the Norwegian Atlantic Current, an extension of the North Atlantic Drift, that flows immediately west of the islands. We have generated records from a 3 m sediment core from Vikjordvatnet, on Vestvågøya, and a 5.75 m core from Fiskebolvatnet, on Austvågøya. The record from Vikjordvatnet extends back to the Younger Dryas, which is the last time ice occupied the cirque at the head of this valley. Fiskebolvatnet was isolated from the ocean in the early Holocene and has a lacustrine sequence that covers the last ~9500 cal yr BP. The catchment around this lake is very steep and the sediments show large variations in clastic input throughout the record. Bulk physical and biogeochemical measurements along with scanning XRF data are used to show changes in sedimentation related to decadal- to centennial-scale environmental conditions. Northwestern Scandinavia - encompassing northwestern Norway, northern Sweden and Finland - contains a relatively dense, multi-proxy network of terrestrial and marine paleoclimate records. We evaluate past temperature, precipitation, and sea-surface conditions and fit our results into this regional context to improve our understanding of Holocene climate variability.

  2. Joint time-frequency analysis of EEG signals based on a phase-space interpretation of the recording process

    Science.gov (United States)

    Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.

    2012-10-01

    Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.

  3. Symptom diagnostics based on clinical records

    NARCIS (Netherlands)

    de Jong, Marianne; Punt, Marja; de Groot, Erik; Hielkema, Tjitske; Struik, Marianne; Minderaa, Ruud B.; Hadders-Algra, Mijna

    2009-01-01

    Child psychiatric diagnoses are generally based on a clinical examination and not on standardized questionnaires. The present study assessed whether symptom diagnostics based on clinical records facilitates the use of non-standardized clinical material for research. Six hundred and eighty-five child

  4. MORTALITY PATTERN OF HOSPITALIZED CHILDREN IN A REFERRAL HOSPITAL FROM UPPER ASSAM, NORTH EAST INDIA: A RECORD BASED RETROSPECTIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Rita Panyang

    2016-04-01

    OBJECTIVE: The mortality rate of children is one of the important indicators of the health status of a country. India is one of the major contributors to under-5 mortality in the world. The present study was aimed at finding the causes of mortality among children admitted to the paediatric ward of Assam Medical College Hospital during the last two years and provides epidemiological information related to mortality. MATERIALS AND METHODS: A retrospective cross-sectional descriptive study over a 2-year period, from 1st July 2013 to 31st June 2015, using data retrieved from the hospital's admission and death record register. The data collected were entered into a spreadsheet using the SPSS software package, version 16.0. RESULTS: A total of 10970 children, comprising 4873 (44.4%) males and 6097 (55.5%) females, were admitted to the Paediatric Department during this 24-month period. The total number of children who died during this period was 814, comprising 474 (58.2%) males and 334 (41%) females. Overall mortality among males and females was 9.72% and 5.47%, respectively. An overall mortality of 7.42% was noted in cases admitted to the paediatric ward. Out of the 814 paediatric deaths, 460 (57.9%) were infants, and more than one third (35.8%) of the infants died in the newborn period; 186 (23.4%) died in the age group 1-5 years and 148 (18.1%) died in the age group >5 years. The majority of neonatal mortality was due to birth asphyxia (33.1%), followed by septicaemia (31.4%) and prematurity/low birth weight (18.5%). The common causes of post-neonatal death were septicaemia (24%) and respiratory tract infection (24%), followed by meningitis (22.4%). Septicaemia (17.8%) was the leading cause of death among infants. Acute encephalitis syndrome was the most common cause of death in the 1-5 yrs. (24.1%) and >5 yrs. (25%) age groups. CONCLUSION: The pattern of childhood mortality has been documented in the present study. The results from this study confirm that high neonatal deaths were mainly due to birth asphyxia, while after neonatal…

  5. An Analysis of the Accuracy of Electromechanical Eigenvalue Calculations Based on Instantaneous Power Waveforms Recorded in a Power Plant

    Directory of Open Access Journals (Sweden)

    Piotr Pruski

    2013-12-01

    Full Text Available The paper presents the results of calculating the eigenvalues (associated with electromechanical phenomena) of the state matrix of the Polish Power System model, on the basis of analysis of simulated and measured instantaneous power disturbance waveforms of generating units in the Łaziska Power Plant. The method used for the electromechanical eigenvalue calculations consists of approximating the instantaneous power swing waveforms of particular generating units with waveforms that are a superposition of the modal components associated with the searched eigenvalues and their participation factors. A hybrid optimisation algorithm combining genetic and gradient algorithms was used for the computations.
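
    The modal-decomposition idea behind this approach can be illustrated with a toy example. Each electromechanical eigenvalue λ = α + j2πf contributes a damped sinusoid to the power swing, and for a uniformly sampled, noise-free signal the eigenvalues can be recovered with a linear-prediction (Prony-type) fit; this is only a sketch of the signal model, not the paper's hybrid genetic/gradient algorithm, and the two modes below are invented:

```python
import numpy as np

# Two hypothetical electromechanical modes lambda = alpha + j*2*pi*f.
dt = 0.01
t = np.arange(0, 10, dt)
modes = [(-0.25, 1.1), (-0.40, 1.7)]                 # (damping 1/s, freq Hz)
y = sum(np.exp(a * t) * np.cos(2 * np.pi * f * t) for a, f in modes)

# Prony-type linear prediction: y[n] = c1*y[n-1] + ... + cp*y[n-p].
p = 4                                                # two conjugate pairs
M = np.column_stack([y[p - k:len(y) - k] for k in range(1, p + 1)])
c = np.linalg.lstsq(M, y[p:], rcond=None)[0]

# Roots z_i of the characteristic polynomial give lambda_i = ln(z_i)/dt.
z = np.roots(np.concatenate(([1.0], -c)))
lam = np.log(z) / dt
est = sorted({(round(l.real, 2), round(abs(l.imag) / (2 * np.pi), 2))
              for l in lam})
print(est)
```

    The estimated (damping, frequency) pairs match the modes used to build the signal; on measured, noisy waveforms a robust optimisation such as the paper's hybrid algorithm is needed instead.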

  6. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most…

  7. ECG biometric analysis in different physiological recording conditions

    OpenAIRE

    Porée, Fabienne; Kervio, Gaëlle; Carrault, Guy

    2016-01-01

    Biometric systems aim to identify individuals or to verify their identity. The human electrocardiogram (ECG) has recently been proposed as an additional tool for biometric applications. A series of ECG-based biometric studies has since appeared in the literature, but they are difficult to compare because they use various values of: the number of ECG leads, the length of the analysis window (only the QRS complex or more), the delays between recordings…

  8. Structure and performance of a real-time algorithm to detect tsunami or tsunami-like alert conditions based on sea-level records analysis

    Directory of Open Access Journals (Sweden)

    L. Bressan

    2011-05-01

    Full Text Available The goal of this paper is to present an original real-time algorithm devised for the detection of tsunami or tsunami-like waves, which we call TEDA (Tsunami Early Detection Algorithm), and to introduce a methodology to evaluate its performance. TEDA works on the sea-level records of a single station and implements two distinct modules running concurrently: one to assess the presence of tsunami waves ("tsunami detection") and the other to identify high-amplitude long waves ("secure detection"). Both detection methods are based on continuously updated time functions depending on a number of parameters that can be varied according to the application. In order to select the most adequate parameter setting for a given station, a methodology to evaluate TEDA performance has been devised that is based on a number of indicators and is simple to use. In this paper an example of TEDA application is given using data from a tide gauge located at Adak Island, Alaska, USA, which proved quite suitable since it has recorded several tsunamis in recent years at a sampling rate of 1 min.
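
    The single-station principle can be sketched with a toy detector; this is not TEDA itself, and the window lengths, threshold factor and synthetic 1-min record below are invented for illustration. A wave is flagged when the recent sea-level deviation exceeds a multiple of a continuously updated background-noise estimate:

```python
import numpy as np

# Synthetic de-tided 1-min sea-level record: 2 cm background noise,
# then a 50 cm, 30-min-period wave arriving at t = 6 h.
rng = np.random.default_rng(1)
dt = 60.0
t = np.arange(0, 12 * 3600, dt)
level = rng.normal(0, 0.02, t.size)
onset = t >= 6 * 3600
level[onset] += 0.5 * np.sin(2 * np.pi * (t[onset] - 6 * 3600) / 1800)

short_n, long_n, k = 10, 120, 6.0   # windows (samples) and threshold factor
alerts = []
for i in range(long_n, t.size):
    background = level[i - long_n:i - short_n].std()   # noise-level estimate
    recent = np.abs(level[i - short_n:i]).max()        # recent deviation
    if recent > k * background:
        alerts.append(t[i] / 3600.0)
print(f"first alert at {alerts[0]:.2f} h")
```

    With these settings the toy detector raises its first alert within a few minutes of the wave onset at t = 6 h; tuning the window lengths and threshold per station is exactly the parameter-selection problem the paper's performance indicators address.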

  9. Segment clustering methodology for unsupervised Holter recordings analysis

    Science.gov (United States)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

    2015-01-01

    Cardiac arrhythmia analysis on Holter recordings is an important issue in clinical settings; however, it implicitly involves further problems related to the large amount of unlabelled data, which entails a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points and to characterize and cluster the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework reduces the high computational cost of Holter analysis, making its implementation for future real-time applications possible. The performance of the method is measured on records from the MIT/BIH arrhythmia database and achieves high values of sensitivity and specificity, taking advantage of database labels, for a broad range of heartbeat types recommended by the AAMI.
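
    The segment-then-merge idea can be sketched as follows; the 2-D beat features, the per-segment two-class k-means and the distance threshold are invented stand-ins for the paper's feature extraction and homogeneity criterion:

```python
import numpy as np

rng = np.random.default_rng(2)

def kmeans2(X, iters=30):
    """Two-cluster k-means with farthest-point initialisation."""
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(axis=1))]
    C = np.array([c0, c1])
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(axis=0) for j in range(2)])
    return C

# Synthetic beat features: a majority class and a minority (ectopic-like) class.
X = np.vstack([rng.normal([0.8, 1.0], 0.05, (300, 2)),
               rng.normal([0.4, 1.8], 0.05, (60, 2))])
rng.shuffle(X)

# Cluster each balanced segment separately, then merge similar clusters.
centroids = np.vstack([kmeans2(seg) for seg in np.array_split(X, 4)])
merged = []
for c in centroids:
    if not any(np.linalg.norm(c - m) < 0.2 for m in merged):
        merged.append(c)
print(len(merged))  # global number of beat classes found
```

    Clustering each segment independently keeps the per-segment cost low, and the merge step recovers the global classes, which is the point of the segment framework.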

  10. Intelligent technique for knowledge reuse of dental medical records based on case-based reasoning.

    Science.gov (United States)

    Gu, Dong-Xiao; Liang, Chang-Yong; Li, Xing-Guo; Yang, Shan-Lin; Zhang, Pei

    2010-04-01

    With the rapid development of both information technology and the management of modern medical regulation, the generation of medical records tends to be increasingly intelligent. In this paper, case-based reasoning is applied to the process of generating records of dental cases. Based on an analysis of the features of dental records, a case base is constructed. A mixed case retrieval method (FAIES) is proposed for the knowledge reuse of dental records; it adopts fuzzy mathematics, improves a similarity algorithm based on Euclidean-Lagrangian distance, and uses a PULL & PUSH weight adjustment strategy. Finally, an intelligent system for the generation of dental cases (CBR-DENT) is constructed. The effectiveness of the system, the efficiency of the retrieval method, the extent of adaptation and the adaptation efficiency are tested using the constructed case base. It is demonstrated that FAIES is very effective in reducing the time needed to write medical records and in improving their efficiency and quality. FAIES is also proven to be an effective aid for diagnosis and provides a new idea for the management of medical records and its applications.
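
    The retrieval step can be illustrated generically; FAIES itself (fuzzy membership, Euclidean-Lagrangian distance, PULL & PUSH weights) is not reproduced here, so the feature encoding and weights below are invented and only show the weighted nearest-case idea:

```python
import numpy as np

# Hypothetical encoded dental cases (rows) and feature-importance weights.
cases = np.array([[1.0, 0.2, 0.7],
                  [0.4, 0.9, 0.1],
                  [0.8, 0.3, 0.9]])
weights = np.array([0.5, 0.3, 0.2])
query = np.array([0.95, 0.25, 0.75])   # new case to be documented

# Weighted Euclidean distance; the smallest distance is the retrieved case.
d = np.sqrt(((cases - query) ** 2 * weights).sum(axis=1))
best = int(np.argmin(d))
print(best)  # → 0, the most similar stored case
```

    In a CBR system the retrieved case's record is then adapted to the new patient, which is where the paper's adaptation-efficiency measurements come in.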

  11. Analysis of human behavior dynamics based on group chat records

    Institute of Scientific and Technical Information of China (English)

    王洪川; 郭进利; 樊超

    2012-01-01

    The paper analyzes actual data from the chat records of six QQ groups. From the group perspective, it performs statistical analysis of the inter-event time between two successive messages and the character length of each message. The results show that both the inter-event times and the message lengths clearly follow power-law (heavy-tailed) distributions, and the power-law exponents of different groups are very close. This indicates that group communication behavior over instant messaging obeys the general heavy-tailed rule of human dynamics.

  12. Analysis of Phase Multilevel Recording on Microholograms

    Science.gov (United States)

    Ide, Tatsuro; Mikami, Hideharu; Osawa, Kentaro; Watanabe, Koichi

    2011-09-01

    An optical phase multilevel recording technique using a microholographic system and phase-diversity homodyne detection for enhancing optical disc capacity is investigated. In this technique, multilevel phase signals are stored as fringe shifts along the optical axis and recovered from the arctangent of two homodyne-detected signals. For comparison, phase signals from Blu-ray Disc read-only memory (BD-ROM) and Blu-ray Disc recordable (BD-R) media obtained by phase-diversity homodyne detection are experimentally evaluated. From the experimental results, we demonstrate that phase-diversity homodyne detection is useful for detecting the phase modulation of the signal beam from an optical disc. Furthermore, simulation results on microholograms indicate that phase signals from microholograms are much more stable against variations in their size than those from BD-ROM. These results demonstrate the potential of this multilevel recording method.
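
    The recovery step can be illustrated numerically: phase-diversity homodyne detection yields two quadrature signals I ∝ cos φ and Q ∝ sin φ, and the stored level follows from their arctangent. The four-level alphabet, signal amplitude and noise below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
levels = np.array([0.0, 0.5, 1.0, 1.5]) * np.pi       # 4-level phase recording
phi = rng.choice(levels, 1000)                        # stored symbols
amp = 0.8                                             # signal-beam amplitude
I = amp * np.cos(phi) + rng.normal(0, 0.05, phi.size) # in-phase channel
Q = amp * np.sin(phi) + rng.normal(0, 0.05, phi.size) # quadrature channel

est = np.arctan2(Q, I)                                # recovered phase
# Decide the nearest level using circular distance.
dist = np.abs(np.angle(np.exp(1j * (est[:, None] - levels[None, :]))))
decided = levels[np.argmin(dist, axis=1)]
errors = int(np.count_nonzero(decided != phi))
print(errors)  # → 0 at this SNR
```

    The arctangent makes the decision independent of the signal amplitude, which is why homodyne phase readout tolerates reflectivity variations between discs.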

  13. Analysis of the Factors Influencing Breeding Record Establishment of Sheep-raising Households or Farms Based on Logit-ISM: Based on 849 Questionnaires from 17 Cities in Shandong Province

    Institute of Scientific and Technical Information of China (English)

    Shiping; ZHU; Shimin; SUN; Limin; HAN

    2015-01-01

    Breeding records are an important way for sheep-raising households and farms to implement standardized sheep raising, trace major epidemic information and ensure the quality and safety of products. Based on 849 questionnaires from 17 cities of Shandong Province, the paper first used a binary logit model to analyze the factors influencing the record-establishing behavior of sheep-raising households and farms, and then used the ISM model to explain the relationships and hierarchy among the influencing factors. The results showed that seven factors have a significant impact on record-establishing behavior: the education level of the decision makers, farming scale, number of years in farming, degree of specialization, government support, membership in an industrialization organization, and recognition of the value of breeding records. Among them, government support and recognition of breeding records are direct surface factors; degree of specialization and membership in an industrialization organization are intermediate indirect factors; and the education level of the decision makers, farming scale and number of years in farming are deep source factors.
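
    The first stage can be sketched as a binary logit fit. The data below are synthetic (the 849-questionnaire data set is not available here), only two of the seven factors are mimicked, and the coefficients are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 849                                         # sample size as in the survey
X = np.column_stack([np.ones(n),                # intercept
                     rng.integers(1, 5, n),     # education level (1-4)
                     rng.normal(0.0, 1.0, n)])  # standardised farming scale
beta_true = np.array([-1.5, 0.6, 0.8])          # invented coefficients
y = rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))

# Fit by maximising the logistic log-likelihood with gradient ascent.
beta = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n
print(np.round(beta, 2))                        # close to the true coefficients
```

    The fitted signs and magnitudes are what the significance analysis interprets; the ISM stage then orders the significant factors into surface, intermediate and deep layers.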

  14. Drive-based recording analyses at >800 Gfc/in² using shingled recording

    Science.gov (United States)

    William Cross, R.; Montemorra, Michael

    2012-02-01

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ~130 to well over 500 Gb/in² in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in² using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in² and beyond.

  15. Recording of ECG signals on a portable MiniDisc recorder for time and frequency domain heart rate variability analysis.

    Science.gov (United States)

    Norman, S E; Eager, R A; Waran, N K; Jeffery, L; Schroter, R C; Marlin, D J

    2005-01-17

    Analysis of heart rate variability (HRV) is a non-invasive technique useful for investigating autonomic function in both humans and animals. It has been used for research into both behaviour and physiology. Commercial systems for human HRV analysis are expensive and may not have sufficient flexibility for appropriate analysis in animals. Some heart rate monitors have the facility to provide inter-beat intervals (IBIs), but verification following collection is not possible as only the IBIs are recorded, and not the raw electrocardiogram (ECG) signal. Computer-based data acquisition and analysis systems such as Po-Ne-Mah and Biopac offer greater flexibility and control but have limited portability. Many laboratories and veterinary surgeons have access to ECG machines but not to equipment for recording ECG signals for further analysis. The aim of the present study was to determine whether suitable HRV data could be obtained from ECG signals recorded onto a MiniDisc (MD) and subsequently digitised and analysed using a commercial data acquisition and analysis package. ECG signals were obtained from six Thoroughbred horses by telemetry. A split BNC connector was used to allow simultaneous digitisation of the analogue output from the ECG receiver unit by a computerised data acquisition system (Po-Ne-Mah) and a MiniDisc player (MZ-N710, Sony). Following recording, data were played back from the MiniDisc into the same input channel of the data acquisition system as previously used to record the direct ECG. All data were digitised at a sampling rate of 500 Hz. IBI data were analysed in both the time and frequency domains, and comparisons between directly recorded and MiniDisc data were made using Bland-Altman analysis. Despite some changes in ECG morphology due to loss of low-frequency content (primarily below 5 Hz) following MiniDisc recording, there was minimal difference in IBIs or in time or frequency domain analysis between the two recording methods. The MiniDisc offers a cost
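
    The Bland-Altman comparison used in the study can be reproduced in a few lines; the inter-beat intervals below are synthetic stand-ins for the horse data, with an invented offset and scatter for the MiniDisc chain:

```python
import numpy as np

rng = np.random.default_rng(5)
# Illustrative inter-beat intervals (ms) for a resting horse; the
# MiniDisc chain is given a small invented offset and extra scatter.
ibi_direct = rng.normal(2000.0, 150.0, 300)
ibi_md = ibi_direct + rng.normal(1.0, 4.0, 300)

diff = ibi_md - ibi_direct                  # per-beat disagreement
bias = diff.mean()                          # systematic difference
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias {bias:.2f} ms, LoA {loa[0]:.2f} to {loa[1]:.2f} ms")
```

    A small bias and narrow limits of agreement relative to the IBI magnitude are what justify the study's conclusion that the two recording chains are interchangeable for HRV purposes.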

  16. Analysis of Scan Records with a Recording Densitometer - The ''Re-Scanner''

    International Nuclear Information System (INIS)

    The impact of improvements in scanning equipment has not been fully felt at the clinical level, largely because of deficiencies in scan recording. In an attempt to improve visualization and contrast in scan records, various instrumental methods of analysis have been devised. We have devised a simple and comparatively inexpensive recording densitometer for ''re-scanning'' scan records. A light-sensor scans the record just as a scanner scans a patient. The output of the device is a pulse rate proportional to the opacity (or transmission) of the record, and may be used to make a new, or ''re-scan'', record. The area of the record over which information is integrated is set by sensor aperture. The wide range of output pulse-rates (zero to 15 000 parts/s) causes large and adjustable contrast amplification. A threshold control provides any ''cut-off level'' of choice. Operation is rapid, and a record can be re-scanned in a small fraction of the time required to obtain the original record. Studies on clinical scans of almost every organ or area of interest show that the re-scanner reveals information not at first evident in original scan records. It has been particularly useful in determining the statistical significance of small variations in counting rate in a scan record. In scan records of large dynamic range where no single cut-off level satisfactorily shows all regions of interest, re-scans at several cut-off levels were once necessary. A two-region sensor, that views a region of the record around the field of view of the main sensor, has been used in an attempt to overcome this difficulty. At least three modes of operation are possible with the two-region sensor: (1) ''normal'' operation; (2) ignoring general record density and responding only to small variations, thus setting its own cut-off level; and (3) reporting only abrupt changes in record density. Other modes seem to be possible. This relatively simple and inexpensive device is proving to be of valuable

  17. Development of Software for dose Records Data Base Access

    International Nuclear Information System (INIS)

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was individual dose follow-up control and data handling for epidemiological studies. Within the Data Base management scheme, software was developed to allow searching of individual dose records by external authorised users. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records of workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of the codes and subroutines developed. (Author) 2 refs

  18. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    Science.gov (United States)

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
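
    The reported long-term rate of 0.04% yr⁻¹ looks negligible but compounds substantially over Holocene timescales, since a population growing at a constant rate r multiplies by exp(r·t):

```python
import math

r = 0.0004                     # 0.04% annual growth rate
for years in (1000, 5000, 10000):
    factor = math.exp(r * years)
    print(f"{years:>6} y: x{factor:.1f}")
```

    Over ten millennia this rate alone multiplies a population roughly fifty-fold, which is why such a small sustained rate is demographically meaningful.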

  19. Fetal magnetocardiogram recordings and Fourier spectral analysis.

    Science.gov (United States)

    Anastasiadis, P; Anninos, P A; Lüdinghausen, M V; Kotini, A; Galazios, G; Limberis, B

    1999-07-01

    Power spectral analysis of fetal magnetocardiogram (FMCG) data was evaluated in 64 pregnancies, using a non-invasive one-channel superconducting quantum interference device (DC-SQUID), in order to investigate the power spectral amplitude distribution in the frequency range between 2 and 3 Hz. In all normal and uncomplicated pregnancies, the data from the fetal heart, and specifically the QRS complexes, were identifiable and unaffected by any maternal cardiac activity; furthermore, the power spectral amplitudes, which varied between 120 and 350 fT/Hz, were directly related to gestational age. PMID:15512338
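
    The spectral step can be sketched as follows; the trace is synthetic (a pure tone at an invented fetal heart rate plus noise stands in for a real FMCG), and the 2-3 Hz band corresponds to fetal heart rates of 120-180 beats/min:

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 250.0                                   # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)                 # one minute of data
fhr = 2.4                                    # invented fetal heart rate (144 bpm)
mcg = np.sin(2 * np.pi * fhr * t) + rng.normal(0, 0.5, t.size)

spec = np.abs(np.fft.rfft(mcg)) / t.size     # amplitude spectrum
freq = np.fft.rfftfreq(t.size, 1 / fs)
band = (freq >= 2.0) & (freq <= 3.0)
peak = freq[band][np.argmax(spec[band])]
print(round(peak, 1))  # → 2.4
```

    Restricting the search to the 2-3 Hz band is what isolates the fetal heartbeat from the slower maternal cardiac components.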

  20. Drive-based recording analyses at >800 Gfc/in{sup 2} using shingled recording

    Energy Technology Data Exchange (ETDEWEB)

    William Cross, R., E-mail: William.r.cross@seagate.com [Seagate Technology, 389 Disc Drive, Longmont, CO 80503 (United States); Montemorra, Michael, E-mail: Mike.r.montemorra@seagate.com [Seagate Technology, 389 Disc Drive, Longmont, CO 80503 (United States)

    2012-02-15

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from {approx}130 to well over 500 Gb/in{sup 2} in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in{sup 2} using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in{sup 2} and beyond. - Research Highlights: > Drive-based recording at 805 Gb/in{sup 2} has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. > Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. > Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack

  1. Coral-based climate records from tropical South Atlantic

    DEFF Research Database (Denmark)

    Pereira, Natan S.; Sial, Alcides N.; Kikuchi, Ruy K.P.;

    2015-01-01

    Coral skeletons contain records of past environmental conditions due to their long life span and well-calibrated geochemical signatures. C and O isotope records of corals are especially interesting because they can highlight multidecadal variability of local climate conditions beyond the instrumental record, with high fidelity and sub-annual resolution. However, in order to get an optimal geochemical signal from a coral skeleton, sampling strategies must be followed. Here we report one of the first coral-based isotopic records from the Equatorial South Atlantic, from two colonies of Porites… the two colonies are observed, yet both record the 2009/2010 El Niño event - a period of widespread coral bleaching - as anomalously negative δ18O values (up to −1 permil). δ13C is found to be measurably affected by the El Niño event in one colony, by more positive values (+0.39 ‰), and together…

  2. Digital image analysis of palaeoenvironmental records and applications

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Environmental change signals in geological or biological records are commonly reflected in their reflected or transmitted images. These environmental signals can be extracted through digital image analysis. The analysis principle involves selecting section lines, reading color values and calculating an environmental proxy index along the section lines, identifying layers, establishing an automatic chronology, and investigating the structural evolution of growth bands. With detailed illustrations of the image technique, this note provides image-analysis procedures for coral, tree-ring and stalagmite records. The environmental implications of the proxy index from image analysis are then demonstrated through applications of the technique.
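
    The section-line step can be sketched on a synthetic banded image (a stand-in for a scanned coral or stalagmite slab): grey values are read along one section line and the number of growth bands is taken from the dominant spatial frequency of the detrended profile.

```python
import numpy as np

# Synthetic banded image: 10 bright/dark band pairs plus scanner noise.
h, w = 64, 200
x = np.arange(w)
img = np.tile(np.sin(2 * np.pi * x * 10 / w), (h, 1))
img += np.random.default_rng(7).normal(0, 0.05, (h, w))

# Read grey values along one section line and detrend.
profile = img[h // 2] - img[h // 2].mean()

# Dominant spatial frequency = number of bands along the line.
spec = np.abs(np.fft.rfft(profile))
cycles = int(np.argmax(spec[1:]) + 1)
print(cycles)  # → 10
```

    In practice the profile would be averaged over several parallel section lines and the band count anchored to known dates, which is the auto-chronology step the note describes.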

  3. Quantitative transmission electron microscopy analysis of multi-variant grains in present L10-FePt based heat assisted magnetic recording media

    International Nuclear Information System (INIS)

    We present a study of atomic ordering within individual grains in granular L10-FePt thin films using transmission electron microscopy techniques. The film, used as a medium for heat-assisted magnetic recording, consists of a single layer of FePt grains separated by non-magnetic grain boundaries and is grown on an MgO underlayer. Using convergent-beam techniques, diffraction patterns of individual grains are obtained for a large number of crystallites. The study found that although the majority of grains are ordered in the perpendicular direction, more than 15% of them are multi-variant, of in-plane c-axis orientation, or disordered fcc. It was also found that these multi-variant and in-plane grains have always grown across MgO grain boundaries separating two or more MgO grains of the underlayer. The in-plane ordered portion within a multi-variant L10-FePt grain always lacks atomic coherence with the MgO directly underneath it, whereas the perpendicularly ordered portion is always coherent with the underlying MgO grain. Since the existence of multi-variant and in-plane ordered grains is severely detrimental to high-density data storage capability, the understanding of their formation mechanism obtained here should make a significant impact on the future development of hard disk drive technology

  4. A Late Quaternary Climate Record Based on Multi-Proxies Analysis from the Jiaochang Loess Section in the Eastern Tibetan Plateau, China

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    We compared the stable carbon isotopic records from a loess transect at Jiaochang in the eastern Tibetan Plateau, spanning the last ~21,000 years, with multiproxy data for pedogenesis, including magnetic susceptibility, clay fraction, Fed/Fet ratio, carbonate and total organic carbon content, in order to probe the mechanisms behind the δ13C values of organic matter and Late Quaternary climate variations in the eastern Tibetan Plateau. Our results indicate that there is no simple relationship between the δ13C of organic matter and summer monsoon variations. The change in δ13C values of organic matter (in accordance with the ratios of C3 to C4 plants) results from the interaction among temperature, aridity and atmospheric pCO2 level. A drier climate and lower atmospheric pCO2 level contribute to a positive carbon isotopic excursion, while a negative carbon isotopic excursion results from lower temperature and an increased atmospheric pCO2 level. Additionally, our results imply that the Tibetan monsoon may play an important role in the climate system of the eastern Tibetan Plateau, which is specifically reflected in the frequently changing climate of that area. The results provide new insights into the forcing mechanisms of both the δ13C values of organic matter and the local climate system.

  5. A Factor Analysis-based Comprehensive Evaluation of Academic Records of College Students in Shaanxi

    Institute of Scientific and Technical Information of China (English)

    张善红

    2015-01-01

    Given the necessity of evaluating the academic records of college students, factor analysis is applied to students' grades to derive a factor model and identify the major elements affecting students' knowledge and abilities. The model is then tested through an empirical study. The results show that the factor model of student grades is scientific, reasonable and impartial: it reveals the connections between different courses and the cultivation of different learning abilities, and provides a basis for further in-depth study.

  6. Analysis and modelling of tsunami-induced tilt for the 2007, M = 7.6, Tocopilla and the 2010, M = 8.8 Maule earthquakes, Chile, from long-base tiltmeter and broadband seismometer records

    Science.gov (United States)

    Boudin, F.; Allgeyer, S.; Bernard, P.; Hébert, H.; Olcay, M.; Madariaga, R.; El-Madani, M.; Vilotte, J.-P.; Peyrat, S.; Nercessian, A.; Schurr, B.; Esnoult, M.-F.; Asch, G.; Nunez, I.; Kammenthaler, M.

    2013-07-01

    We present a detailed study of tsunami-induced tilt at in-land sites, to test the interest and feasibility of such analysis for tsunami detection and modelling. We studied tiltmeter and broadband seismometer records of northern Chile, detecting a clear signature of the tsunamis generated by the 2007 Tocopilla (M = 7.6) and the 2010 Maule (M = 8.8) earthquakes. We find that these records are dominated by the tilt due to the elastic loading of the oceanic floor, with a small effect of the horizontal gravitational attraction. We modelled the Maule tsunami using the seismic source model proposed by Delouis et al. and a bathymetric map, correctly fitting three tide gauge records of the area (Antofagasta, Iquique and Arica). At all the closest stations (7 STS2, 2 long-base tiltmeters), we correctly modelled the first few hours of the tilt signal for the Maule tsunami. The only phase mismatch is for the site that is closer to the ocean. We find a tilt response of 0.005-0.01 μm at 7 km away from the coastline in response to a sea level amplitude change of 10 cm. For the Maule earthquake, we observe a clear tilt signal starting 20 min before the arrival time of the tsunami at the nearest point on the coastline. This capability of tilt or seismic sensors to detect distant tsunamis before they arrive has been successfully tested with a scenario megathrust in the southern Peru-northern Chile seismic gap. However, for large events near the stations, this analysis may no longer be feasible, due to the large amplitude of the long-period seismic signals expected to obscure the loading signal. Inland tilt measurements of tsunamis smooth out short, often unmodelled wavelengths of the sea level perturbation, thus providing robust, large-scale images of the tsunami. Furthermore, tilt measurements are not expected to saturate even for the largest run-ups, nor to suffer from near-coast tsunami damages. Tiltmeters and broadband seismometers are thus valuable instruments for monitoring

  7. Quantum-dot based nanothermometry in optical plasmonic recording media

    Energy Technology Data Exchange (ETDEWEB)

    Maestro, Laura Martinez [Fluorescence Imaging Group, Departamento de Física de Materiales, Facultad de Ciencias Físicas, Universidad Autónoma de Madrid, Madrid 28049 (Spain); Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia); Zhang, Qiming; Li, Xiangping; Gu, Min [Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia); Jaque, Daniel [Fluorescence Imaging Group, Departamento de Física de Materiales, Facultad de Ciencias Físicas, Universidad Autónoma de Madrid, Madrid 28049 (Spain)

    2014-11-03

    We report on the direct experimental determination of the temperature increment caused by laser irradiation in an optical recording medium constituted by a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots in the recording medium allowed for single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon excited fluorescence of the quantum dots. Experimental results have been compared with numerical simulations, revealing an excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media.

  8. Quantum-dot based nanothermometry in optical plasmonic recording media

    International Nuclear Information System (INIS)

    We report on the direct experimental determination of the temperature increment caused by laser irradiation in an optical recording medium constituted by a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots in the recording medium allowed for single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon excited fluorescence of the quantum dots. Experimental results have been compared with numerical simulations, revealing an excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media

  9. Electronic Health Record A Systems Analysis of the Medications Domain

    CERN Document Server

    Scarlat, Alexander

    2012-01-01

An accessible primer, Electronic Health Record: A Systems Analysis of the Medications Domain introduces the tools and methodology of Structured Systems Analysis as well as the nuances of the Medications domain. The first part of the book provides a top-down decomposition along two main paths: data in motion (workflows, processes, activities, and tasks) in parallel with the analysis of data at rest (database structures, conceptual and logical models, and entity-relationship diagrams). Structured systems analysis methodology and tools are applied to: electronic prescription, computerized physician order entry, ...

  10. A RNA-based nanodevice recording temperature over time

    Science.gov (United States)

    Höfinger, Siegfried; Zerbetto, Francesco

    2010-04-01

Nucleic acids provide a wealth of interesting properties that can find important applications in nanotechnology. In this article we describe a concept for using RNA for temperature measurements. In particular, the principal components are outlined of a nanodevice that works on the basis of RNA secondary-structure rearrangement. The major mode of operation is a hairpin-coil transition occurring at different temperatures for different types of short RNA oligonucleotides. Coupling these events to a detection system based on specific RNA hybridization provides the framework for a nanodevice capable of recording temperature as a function of time. The analysis is carried out with the help of a statistical mechanics package specifically designed to study RNA secondary structure. The procedure yields an optimized list of eight RNA sequences operational in the range from -10 to 60 °C. The data can form the basis of a new technology of potential interest to many fields of process and quality control.
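The recording principle, a bank of hairpins with staggered melting temperatures, amounts to a decoder that maps the set of melted reporters to the temperature range the device has reached. The eight melting temperatures below are hypothetical placeholders evenly spanning the -10 to 60 °C range mentioned in the abstract, not the optimized sequences from the paper:

```python
# Hypothetical melting temperatures (deg C) for eight reporter hairpins.
MELTING_TEMPS = [-10, 0, 10, 20, 30, 40, 50, 60]

def max_temperature_range(triggered):
    """Infer the range containing the maximum temperature the device saw,
    given which hairpins underwent the hairpin-coil transition (True)."""
    melted = [tm for tm, hit in zip(MELTING_TEMPS, triggered) if hit]
    intact = [tm for tm, hit in zip(MELTING_TEMPS, triggered) if not hit]
    lo = max(melted) if melted else None  # hottest reporter that melted
    hi = min(intact) if intact else None  # coolest reporter still folded
    return lo, hi  # maximum temperature lies in [lo, hi)

# Hairpins up to 30 deg C melted, the rest stayed folded:
print(max_temperature_range([True] * 5 + [False] * 3))  # (30, 40)
```

Reading out which reporters hybridized at successive time points would then yield a coarse temperature-over-time record, which is the device concept the abstract describes.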

  11. An Autonomous Underwater Recorder Based on a Single Board Computer.

    Science.gov (United States)

    Caldas-Morgan, Manuel; Alvarez-Rosario, Alexander; Rodrigues Padovese, Linilson

    2015-01-01

As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Designing and building more affordable alternatives from scratch can become complex, because it requires in-depth knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built around a Raspberry Pi single-board computer. A properly working prototype was assembled and demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module, which was thoroughly evaluated and is estimated to allow great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and man-made acoustic events, among others. Although its power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, shorter development time and, inherently, its lower cost. PMID:26076479

  13. An Autonomous Underwater Recorder Based on a Single Board Computer.

    Directory of Open Access Journals (Sweden)

    Manuel Caldas-Morgan

Full Text Available As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Designing and building more affordable alternatives from scratch can become complex, because it requires in-depth knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built around a Raspberry Pi single-board computer. A properly working prototype was assembled and demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module, which was thoroughly evaluated and is estimated to allow great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and man-made acoustic events, among others. Although its power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, shorter development time and, inherently, its lower cost.

  14. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
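The dwell-time statistics described above (durations of active and resting states and their empirical survival functions) can be sketched in a few lines. The binarized toy series below is an assumption; real actigraphy counts would first be thresholded into active/resting states:

```python
from itertools import groupby

def dwell_times(activity):
    """Durations of consecutive runs of resting (0) and active (1) states
    in a binarized actigraphy series."""
    runs = {0: [], 1: []}
    for state, run in groupby(activity):
        runs[state].append(sum(1 for _ in run))
    return runs

def survival(durations):
    """Empirical survival function S(t) = P(duration > t)."""
    n = len(durations)
    return [(t, sum(d > t for d in durations) / n) for t in sorted(set(durations))]

# Toy binarized series: 1 = active, 0 = resting.
series = [1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 1]
runs = dwell_times(series)
print(runs[0])            # [2, 4]  resting-state durations
print(survival(runs[0]))  # [(2, 0.5), (4, 0.0)]
```

The slopes the authors use as biomarkers would then be obtained by fitting the tail of such survival functions on log-log axes, a step omitted here.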

  15. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    Science.gov (United States)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of directly observing foraging behavior, data from time-depth recorders and other types of behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Often the definitions of actual behavioral units and analytical approaches may vary substantially which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches that are used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries from a diversity of backgrounds including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases when temporal autocorrelation structure is useful for identifying behaviors of interest. 
Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.

  16. Recommending Related Papers Based on Digital Library Access Records

    CERN Document Server

    Pohl, Stefan; Joachims, Thorsten

    2007-01-01

    An important goal for digital libraries is to enable researchers to more easily explore related work. While citation data is often used as an indicator of relatedness, in this paper we demonstrate that digital access records (e.g. http-server logs) can be used as indicators as well. In particular, we show that measures based on co-access provide better coverage than co-citation, that they are available much sooner, and that they are more accurate for recent papers.
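A co-access measure of relatedness, as described above, can be sketched by grouping an access log into sessions and counting how often two papers are requested together. The log format and the cosine normalization below are illustrative assumptions, not the paper's exact measure:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical access log: (session, paper) pairs, e.g. parsed from http-server logs.
log = [
    ("s1", "A"), ("s1", "B"), ("s1", "C"),
    ("s2", "A"), ("s2", "B"),
    ("s3", "B"), ("s3", "C"),
]

papers_by_session = defaultdict(set)
for session, paper in log:
    papers_by_session[session].add(paper)

access = defaultdict(int)    # sessions that touched each paper
coaccess = defaultdict(int)  # sessions that touched each pair
for papers in papers_by_session.values():
    for p in papers:
        access[p] += 1
    for p, q in combinations(sorted(papers), 2):
        coaccess[(p, q)] += 1

def cosine(p, q):
    """Cosine-normalized co-access: an illustrative relatedness score."""
    key = tuple(sorted((p, q)))
    return coaccess.get(key, 0) / (access[p] * access[q]) ** 0.5

print(round(cosine("A", "B"), 3))  # 0.816
```

Unlike co-citation, such scores are available as soon as a paper is accessed, which is the timeliness advantage the abstract emphasizes.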

  17. Diffusion of Electronic Medical Record Based Public Hospital Information Systems

    OpenAIRE

    Cho, Kyoung Won; Kim, Seong Min; An, Chang-Ho; Chae, Young Moon

    2015-01-01

    Objectives This study was conducted to evaluate the adoption behavior of a newly developed Electronic Medical Record (EMR)-based information system (IS) at three public hospitals in Korea with a focus on doctors and nurses. Methods User satisfaction scores from four performance layers were analyzed before and two times after the newly develop system was introduced to evaluate the adoption process of the IS with Rogers' diffusion theory. Results The 'intention to use' scores, the most importan...

  18. 'Citizen science' recording of fossils by adapting existing computer-based biodiversity recording tools

    Science.gov (United States)

    McGowan, Alistair

    2014-05-01

    Biodiversity recording activities have been greatly enhanced by the emergence of online schemes and smartphone applications for recording and sharing data about a wide variety of flora and fauna. As a palaeobiologist, one of the areas of research I have been heavily involved in is the question of whether the amount of rock available to sample acts as a bias on our estimates of biodiversity through time. Although great progress has been made on this question over the past ten years by a number of researchers, I still think palaeontology has not followed the lead offered by the 'citizen science' revolution in studies of extant biodiversity. By constructing clearly structured surveys with online data collection support, it should be possible to collect field data on the occurrence of fossils at the scale of individual exposures, which are needed to test competing hypotheses about these effects at relatively small spatial scales. Such data collection would be hard to justify for universities and museums with limited personnel but a co-ordinated citizen science programme would be capable of delivering such a programme. Data collection could be based on the MacKinnon's Lists method, used in rapid conservation assessment work. It relies on observers collecting lists of a fixed length (e.g. 10 species long) but what is important is that it focuses on getting observers to ignore sightings of the same species until that list is complete. This overcomes the problem of 'common taxa being commonly recorded' and encourages observers to seek out and identify the rarer taxa. This gives a targeted but finite task. Rather than removing fossils, participants would be encouraged to take photographs to share via a recording website. The success of iSpot, which allows users to upload photos of plants and animals for other users to help with identifications, offers a model for overcoming the problems of identifying fossils, which can often look nothing like the examples illustrated in
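The MacKinnon's Lists protocol described above (ignore repeat sightings of a taxon already on the current list until that fixed-length list is complete) can be sketched directly. The taxon names and list length of 3 are illustrative; the method as described uses lists of, e.g., 10 species:

```python
def mackinnon_lists(sightings, list_length=10):
    """Split a stream of taxon sightings into fixed-length lists of
    distinct taxa: repeats are ignored until the current list is full."""
    lists, current = [], []
    for taxon in sightings:
        if taxon in current:
            continue  # taxon already on the current list: skip it
        current.append(taxon)
        if len(current) == list_length:
            lists.append(current)
            current = []
    return lists  # any incomplete final list is discarded

# Toy fossil survey: the common taxon stops dominating the record.
sightings = ["ammonite", "ammonite", "belemnite", "ammonite",
             "crinoid", "belemnite", "trilobite", "ammonite", "brachiopod"]
print(mackinnon_lists(sightings, list_length=3))
```

The completed lists give repeat-insensitive occurrence data of the kind that could be collected at individual exposures by citizen recorders.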

  19. Analysis of astronomical records of King Wu's Conquest

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

All related astronomical records of King Wu's Conquest have been searched and analysed comprehensively. Constrained by the newest conclusions of archaeology, philology and history in the Xia-Shang-Zhou Chronology Project, and based mainly on the dates in Wucheng, Jupiter's position in Guoyu and information on the season, our first choice for the date of King Wu's Conquest is 20 June 1046 BC. This conclusion properly explains most of the relevant literature.

  20. A full lipid biomarker based record from Lake Challa, Tanzania

    Science.gov (United States)

Blaga, C. I.; de Leeuw, J. W.; Verschuren, D.; Sinninghe Damsté, J. S.

    2012-04-01

The climate of the regions surrounding the Indian Ocean (East Africa and the Arabian and Indian peninsulas) is strongly dominated by the dynamics of the seasonal monsoon. To understand the long- and short-term driving forces behind the natural climatic variability in this region, it is highly important to reconstruct past climatic changes and thereby predict future changes, also taking anthropogenic activities into account. Most low-latitude locations lack continuous, highly resolved continental records with good age control, and from the few existing records, acquired from tropical glacier ice, cave stalagmites and fossil diatoms, a thorough understanding of the climatic variations reflected (rainfall and drought, or temperature and its effect on precipitation) is still scanty. Chemically stratified crater lakes accumulate high-quality climate-proxy records, as shown in very recent studies of the continuous and finely laminated sediment record of Lake Challa, situated on the lower east slope of Mt. Kilimanjaro (Verschuren et al. 2009; Wolff et al. 2011). The unique location of this lake in equatorial East Africa implies that its climate variability is influenced by the Indian Ocean rather than the Atlantic, owing to the Congo Air Boundary (Tierney et al. 2011). The objective of this study is to fully explore the biomarker content of the Lake Challa sedimentary record, already characterized by an excellent time resolution and chronology. Various straight-chain lipids (n-alkanes, n-fatty acids, n-alcohols), sterols, long-chain diols, triterpenoids and glycolipids in sedimentary organic matter were determined in their solvent-extractable (free) and saponification-released (bound) forms. The changing composition of the organic matter in sediment traps and underlying sediments of Lake Challa is used as a framework to trace palaeo-humidity, terrestrial input, algal input and temperature, to further our palaeo-environmental knowledge based on GDGTs and

  1. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Science.gov (United States)

    2010-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record...

  2. Break and trend analysis of EUMETSAT Climate Data Records

    Science.gov (United States)

    Doutriaux-Boucher, Marie; Zeder, Joel; Lattanzio, Alessio; Khlystova, Iryna; Graw, Kathrin

    2016-04-01

EUMETSAT reprocessed imagery acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board Meteosat 8-9. The data cover the period from 2004 to 2012. Climate Data Records (CDRs) of atmospheric parameters such as Atmospheric Motion Vectors (AMV) as well as Clear and All Sky Radiances (CSR and ASR) have been generated. Such CDRs are mainly ingested by ECMWF to produce reanalysis data. In addition, EUMETSAT produced a long CDR (1982-2004) of land surface albedo exploiting imagery acquired by the Meteosat Visible and Infrared Imager (MVIRI) on board Meteosat 2-7. Such a CDR is key information for climate analysis and climate models. Extensive validation has been performed for the surface albedo record, and a first validation of the winds and clear-sky radiances has been done. All validation results demonstrated that the time series of all parameters appear homogeneous at first sight. Statistical science offers a variety of analysis methods that have been applied to further analyse the homogeneity of the CDRs. Many breakpoint analysis techniques depend on the comparison of two time series, which raises the issue that both may have breakpoints. This paper presents a quantitative, statistical analysis of possible breakpoints found in the MVIRI and SEVIRI CDRs, including attribution of breakpoints to changes of instruments and other events in the data series compared. The value of the different methods applied will be discussed, with suggestions on how to further develop this type of analysis for quality evaluation of CDRs.
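As a minimal illustration of the breakpoint analysis mentioned above, a single mean-shift change point can be located with a basic CUSUM-style scan. The size weighting and the synthetic step series below are illustrative assumptions, not the methods actually applied to the EUMETSAT CDRs:

```python
def find_breakpoint(series):
    """Index of the most likely single mean-shift breakpoint: maximise the
    size-weighted squared difference between segment means."""
    n = len(series)
    total = sum(series)
    best_k, best_score = None, 0.0
    left = 0.0
    for k in range(1, n):  # split into series[:k] and series[k:]
        left += series[k - 1]
        mean_l = left / k
        mean_r = (total - left) / (n - k)
        # weight by segment sizes so splits near the ends are not favoured
        score = (k * (n - k) / n) * (mean_l - mean_r) ** 2
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Synthetic CDR-like series with a step change starting at index 6:
series = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 2.0, 2.1, 1.9, 2.05, 2.0]
print(find_breakpoint(series))  # 6
```

In practice a detected breakpoint would then be attributed, as the abstract describes, to instrument changes or other documented events near that date.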

  3. Semantic models in medical record data-bases.

    Science.gov (United States)

    Cerutti, S

    1980-01-01

    A great effort has been recently made in the area of data-base design in a number of application fields (banking, insurance, travel, etc.). Yet, it is the current experience of computer scientists in the medical field that medical record information-processing requires less rigid and more complete definition of data-base specifications for a much more heterogeneous set of data, for different users who have different aims. Hence, it is important to state that the data-base in the medical field ought to be a model of the environment for which it was created, rather than just a collection of data. New more powerful and more flexible data-base models are being now designed, particularly in the USA, where the current trend in medicine is to implement, in the same structure, the connection among more different and specific users and the data-base (for administrative aims, medical care control, treatments, statistical and epidemiological results, etc.). In such a way the single users are able to talk with the data-base without interfering with one another. The present paper outlines that this multi-purpose flexibility can be achieved by improving mainly the capabilities of the data-base model. This concept allows the creation of procedures of semantic integrity control which will certainly have in the future a dramatic impact on important management features, starting from data-quality checking and non-physiological state detections, as far as more medical-oriented procedures like drug interactions, record surveillance and medical care review. That is especially true when a large amount of data are to be processed and the classical hierarchical and network data models are no longer sufficient for developing satisfactory and reliable automatic procedures. In this regard, particular emphasis will be dedicated to the relational model and, at the highest level, to the same semantic data model.

  4. [Retrospective analysis of nursing records at a specialized unit].

    Science.gov (United States)

    Ochoa-Vigo, Kattia; Pace, Ana Emilia; dos Santos, Claudia Benedita

    2003-01-01

This is a retrospective longitudinal study conducted at a specialized unit in the interior of Brazil. It aimed to verify differences in nursing records with regard to how the care provided to patients was documented in three periods related to the implementation of the Nursing Process. Data were collected from 20% of the records of patients aged over 15 years, randomly selected according to proportional distribution. These data were fully transcribed into an instrument constructed for this purpose and classified according to the proposals of the referential. The Binomial Test was used for analysis, with a significance level of 0.05. The results showed a trend toward improvement of the records, regarding intelligibility of the writing and complete description of signs/symptoms, in the second period. It was concluded that there was a difference between the initial and pre-implementation periods, which expresses the need to review how the Nursing Process should be developed as a responsibility of the whole nursing team. PMID:12852295

  5. Network Analysis of Time-Lapse Microscopy Recordings

    Directory of Open Access Journals (Sweden)

    Erik eSmedler

    2014-09-01

Full Text Available Multicellular organisms rely on intercellular communication to regulate important cellular processes critical to life. To further our understanding of those processes there is a need to scrutinize dynamical signaling events and their functions in both cells and organisms. Here, we report a method and provide MATLAB code that analyzes time-lapse microscopy recordings to identify and characterize network structures within large cell populations, such as interconnected neurons. The approach is demonstrated using intracellular calcium (Ca2+) recordings in neural progenitors and cardiac myocytes, but could be applied to a wide variety of biosensors employed in diverse cell types and organisms. In this method, network structures are analyzed by applying cross-correlation signal processing and graph theory to single-cell recordings. The goal of the analysis is to determine whether the single-cell activity constitutes a network of interconnected cells and to decipher the properties of this network. The method can be applied in many fields of biology in which biosensors are used to monitor signaling events in living cells. Analyzing intercellular communication in cell ensembles can reveal essential network structures that provide important biological insights.
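The core of the method, cross-correlation between single-cell traces followed by graph construction, can be sketched in dependency-free form. The authors provide MATLAB code; the Python sketch below uses zero-lag Pearson correlation, a hypothetical threshold of 0.9, and toy traces, purely as an illustration:

```python
from math import sqrt

def pearson(x, y):
    """Zero-lag Pearson correlation between two equal-length traces."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def correlation_network(traces, threshold=0.9):
    """Edges between cells whose traces correlate above the threshold."""
    cells = list(traces)
    return [(a, b) for i, a in enumerate(cells) for b in cells[i + 1:]
            if pearson(traces[a], traces[b]) > threshold]

# Toy Ca2+ traces: cells 1 and 2 spike together, cell 3 is independent.
traces = {
    "cell1": [0, 1, 0, 1, 0, 1],
    "cell2": [0, 1, 0, 1, 0, 1],
    "cell3": [1, 0, 0, 1, 1, 0],
}
print(correlation_network(traces))  # [('cell1', 'cell2')]
```

The resulting edge list is the network whose properties (degree distribution, connectedness) would then be characterized with graph-theoretic measures.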

  6. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documenting diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record, and F2 is the software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations entered 16591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems: when copied together, all files sort as expected by patient and, for each patient, in chronological order, providing a digital cumulative patient record that contains heterogeneous file formats. Conclusion: The file-based approach to storing heterogeneous digital patient-related information is a reliable system which can handle open-source, proprietary, general and custom file formats and seems to be easily scalable. Further development of software for automatic integrity checks and for searching and indexing the files is expected to produce a more user-friendly environment.
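The M1M2M3.F2 naming convention lends itself to a short sketch. Assuming an ISO-style YYYYMMDD date for M2 (so that a plain lexical sort is chronological; the identifier formats below are hypothetical), merging workstations' files into a cumulative per-patient record reduces to sorting file names:

```python
def record_filename(patient_id, date_yyyymmdd, author_id, ext):
    """Compose an M1M2M3.F2 file name: patient id (M1), event date (M2),
    author id (M3), and the generating software's extension (F2)."""
    return f"{patient_id}{date_yyyymmdd}{author_id}.{ext}"

def cumulative_record(filenames, patient_id):
    """All entries for one patient, in chronological order: with ISO dates,
    lexical sorting of the names is enough."""
    return sorted(f for f in filenames if f.startswith(patient_id))

# Files produced by different authors on different workstations:
files = [
    record_filename("P0007", "20100215", "DR2", "doc"),
    record_filename("P0007", "20091231", "DR1", "jpg"),
    record_filename("P0012", "20100101", "DR1", "pdf"),
]
print(cumulative_record(files, "P0007"))
```

When the directories from all workstations are copied together, this sort yields the heterogeneous-format cumulative patient record the abstract describes.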

  7. Multifractal detrended moving average analysis of global temperature records

    CERN Document Server

    Mali, Provash

    2015-01-01

Long-range correlation and the multifractal nature of the global monthly mean temperature anomaly time series over the period 1850-2012 are studied in terms of the multifractal detrended moving average (MFDMA) method. We try to address the source(s) of multifractality in the time series by comparing the results derived from the actual series with those from a set of shuffled and surrogate series. The newly developed MFDMA method predicts a multifractal structure of the temperature anomaly time series that is more or less similar to that observed by other multifractal methods. In our analysis, the major contribution to multifractality in the temperature records is found to stem from long-range temporal correlation among the measurements; however, the contribution of the fat-tailed distribution function of the records is not negligible. The results of the MFDMA analysis, which are found to depend upon the location of the detrending window, tend towards the observations of the multifractal detrended fluctuation analysis...
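The detrended moving average idea can be illustrated for the monofractal (q = 2) case: detrend the cumulative profile with a moving average of window s and fit the scaling of the residual fluctuation F(s) against s. This backward-window sketch is a simplification of the full MFDMA procedure (which varies q and the window position) and uses synthetic noise, not the temperature record:

```python
import math
import random

def dma_hurst(series, scales=(4, 8, 16, 32, 64)):
    """Scaling exponent via backward detrended moving average (q = 2):
    slope of log F(s) versus log s for the moving-average-detrended profile."""
    profile, total = [], 0.0
    for x in series:  # cumulative profile of the series
        total += x
        profile.append(total)
    log_s, log_f = [], []
    for s in scales:
        sq = [(profile[i] - sum(profile[i - s + 1:i + 1]) / s) ** 2
              for i in range(s - 1, len(profile))]
        log_s.append(math.log(s))
        log_f.append(math.log(math.sqrt(sum(sq) / len(sq))))
    # least-squares slope of log F against log s
    n = len(log_s)
    mx, my = sum(log_s) / n, sum(log_f) / n
    return (sum((a - mx) * (b - my) for a, b in zip(log_s, log_f))
            / sum((a - mx) ** 2 for a in log_s))

random.seed(42)
noise = [random.gauss(0, 1) for _ in range(4000)]
print(round(dma_hurst(noise), 2))  # near 0.5 for uncorrelated noise
```

Long-range correlated records would yield an exponent above 0.5; extending F(s) to general moments q is what turns this into the multifractal analysis of the abstract.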

  8. 76 FR 76103 - Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    ... Part 171 Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records..., as amended (5 U.S.C. 552a). Certain portions of the Risk Analysis and Management (RAM) Records, State... system, Risk Analysis and Management (RAM) Records, State-78, will support the vetting of...

  9. Holographic storage system based on digital holography for recording a phase data page in a compact optical setup

    Science.gov (United States)

    Nobukawa, Teruyoshi; Nomura, Takanori

    2016-03-01

A holographic storage system based on digital holography is proposed for recording and retrieving a phase data page in a compact and simple optical setup. In the proposed recording system, the complex amplitude distribution can be modulated using a single phase-only spatial light modulator. The complex amplitude distribution of a retrieved phase data page is detected with Fourier fringe analysis. The use of digital holographic techniques makes it possible to realize a compact and simple holographic recording system that is free of the misalignment problems of conventional holographic storage systems. The capability of the proposed recording system is evaluated numerically and experimentally.

  10. Quality Assurance in a Computer-Based Outpatient Record

    OpenAIRE

    Colloff, Edwin; Morgan, Mary; Beaman, Peter; Justice, Norma; Kunstaetter, Robert; Barnett, G. Octo

    1980-01-01

COSTAR, a COmputer-STored Ambulatory Record system, was developed at the Massachusetts General Hospital Laboratory of Computer Science. It can supplement or entirely replace the paper medical record with a highly encoded record. Although a computer-stored medical record provides a unique opportunity for quality assurance activities, it requires programming skills to examine the data. We have taken the dual approach of writing pre-specified quality assurance packages and developing a high-level...

  11. Anonymization of Electronic Medical Records to Support Clinical Analysis

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2013-01-01

    Anonymization of Electronic Medical Records to Support Clinical Analysis closely examines the privacy threats that may arise from medical data sharing, and surveys the state-of-the-art methods developed to safeguard data against these threats. To motivate the need for computational methods, the book first explores the main challenges facing the privacy-protection of medical data using the existing policies, practices and regulations. Then, it takes an in-depth look at the popular computational privacy-preserving methods that have been developed for demographic, clinical and genomic data sharing, and closely analyzes the privacy principles behind these methods, as well as the optimization and algorithmic strategies that they employ. Finally, through a series of in-depth case studies that highlight data from the US Census as well as the Vanderbilt University Medical Center, the book outlines a new, innovative class of privacy-preserving methods designed to ensure the integrity of transferred medical data for su...

  12. Computational intelligence methods on biomedical signal analysis and data mining in medical records

    OpenAIRE

    Vladutu, Liviu-Mihai

    2004-01-01

    This thesis is centered around the development and application of computationally effective solutions based on artificial neural networks (ANN) for biomedical signal analysis and data mining in medical records. The ultimate goal of this work in the field of Biomedical Engineering is to provide the clinician with the best possible information needed to make an accurate diagnosis (in our case of myocardial ischemia) and to propose advanced mathematical models for recovering the complex de...

  13. Hilbert-Huang transform analysis of dynamic and earthquake motion recordings

    Science.gov (United States)

    Zhang, R.R.; Ma, S.; Safak, E.; Hartzell, S.

    2003-01-01

This study examines the rationale of the Hilbert-Huang transform (HHT) for analyzing dynamic and earthquake motion recordings in seismology and engineering. In particular, this paper first provides the fundamentals of the HHT method, which consists of empirical mode decomposition (EMD) and Hilbert spectral analysis. It then uses the HHT to analyze recordings of hypothetical and real wave motion, and compares the results with those obtained by the Fourier data-processing technique. The analysis of the two recordings indicates that the HHT method is able to extract motion characteristics useful in seismology and engineering which might not be exposed effectively and efficiently by the Fourier technique. Specifically, the study indicates that the decomposed components in the EMD of HHT, namely the intrinsic mode function (IMF) components, contain observable physical information inherent to the original data. It also shows that the grouped IMF components, namely the EMD-based low- and high-frequency components, can faithfully capture low-frequency pulse-like as well as high-frequency wave signals. Finally, the study illustrates that the HHT-based Hilbert spectra are able to reveal the temporal-frequency energy distribution of motion recordings precisely and clearly.
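The Hilbert spectral step can be illustrated without the EMD stage: form the analytic signal by zeroing the negative frequencies of the spectrum, then read instantaneous frequency from its phase increments. The O(n^2) DFT keeps the sketch dependency-free; it is an illustration of the principle, not the authors' implementation:

```python
import cmath
import math

def analytic_signal(x):
    """Discrete analytic signal: zero the negative-frequency DFT bins and
    double the positive ones, then invert the transform."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2          # positive frequencies doubled
        elif k > n / 2:
            X[k] = 0           # negative frequencies removed
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def instantaneous_frequency(x):
    """Cycles per sample, from phase increments of the analytic signal."""
    z = analytic_signal(x)
    return [cmath.phase(z[t + 1] * z[t].conjugate()) / (2 * math.pi)
            for t in range(len(z) - 1)]

# A pure tone of 8 cycles over 128 samples should read 8/128 cycles/sample.
x = [math.cos(2 * math.pi * 8 * t / 128) for t in range(128)]
f = instantaneous_frequency(x)
print(round(f[len(f) // 2], 4))  # 0.0625
```

In the HHT proper, this read-out is applied to each IMF produced by EMD, and the amplitudes and instantaneous frequencies together form the Hilbert spectrum.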

  14. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    ... investigation records, investigatory material for law enforcement purposes, and confidential source information... Unclassified computer network. Vetting requests, analyses, and results will be stored separately on a classified computer network. Both computer networks and the RAM database require a user identification...

  15. Changing negative core beliefs with trial-based thought record

    Directory of Open Access Journals (Sweden)

    Thaís R. Delavechia

    2016-04-01

    Full Text Available Abstract Background Trial-based thought record (TBTR is a technique used in trial-based cognitive therapy (TBCT, and simulates a court trial. It was designed to restructure unhelpful core beliefs (CBs during psychotherapy. Objective To confirm previous findings on the efficacy of TBTR in decreasing patients’ adherence to self-critical and unhelpful CBs and corresponding emotions, as well as assessing the differential efficacy of the empty-chair approach relative to the static format of TBTR. Methods Thirty-nine outpatients were submitted to a 50-minute, one-session, application of the TBTR technique in the empty-chair (n = 18 or conventional (n = 21 formats. Patients’ adherence to unhelpful CBs and the intensity of corresponding emotions were assessed after each step of TBTR, and the results obtained in each format were compared. Results Significant reductions in percent values both in the credit given to CBs and in the intensity of corresponding emotions were observed at the end of the session (p < .001, relative to baseline values. ANCOVA also showed a significant difference in favor of the empty-chair format for both belief credit and emotion intensity (p = .04. Discussion TBTR may help patients reduce adherence to unhelpful CBs and corresponding emotions and the empty-chair format seems to be more efficacious than the conventional format.

  16. Classifying Normal and Abnormal Status Based on Video Recordings of Epileptic Patients

    Directory of Open Access Journals (Sweden)

    Jing Li

    2014-01-01

Full Text Available Based on video recordings of the movement of patients with epilepsy, this paper proposes a human action recognition scheme to detect distinct motion patterns and to distinguish the normal status from the abnormal status of epileptic patients. The scheme first extracts local features and holistic features, which are complementary to each other. Afterwards, a support vector machine is applied for classification. Based on the experimental results, this scheme obtains a satisfactory classification result and provides a fundamental analysis towards human-robot interaction with socially assistive robots in caring for patients with epilepsy (or other patients with brain disorders), in order to protect them from injury.

  17. Analysis of recorded earthquake response data at the Hualien large-scale seismic test site

    International Nuclear Information System (INIS)

A soil-structure interaction (SSI) experiment is being conducted in a seismically active region in Hualien, Taiwan. To obtain earthquake data for quantifying SSI effects and to provide a basis for benchmarking analysis methods, a quarter-scale cylindrical concrete containment model, similar in shape to a nuclear power plant containment, was constructed in the field; both the containment model and its surrounding soil, at the surface and sub-surface, are extensively instrumented to record earthquake data. Between September 1993 and May 1995, eight earthquakes with Richter magnitudes ranging from 4.2 to 6.2 were recorded. The author focuses on studying and analyzing the recorded data to provide information on the response characteristics of the Hualien soil-structure system, the SSI effects, and the ground motion characteristics. An effort was also made to determine the site soil physical properties directly, based on correlation analysis of the recorded data. No modeling simulations were attempted to analytically predict the SSI response of the soil and the structure; these will be the scope of a subsequent study

  18. ANALYSIS OF VIBROACOUSTIC SIGNALS RECORDED IN THE PASSENGER LIFT CABIN

    Directory of Open Access Journals (Sweden)

    Kamil Szydło

    2016-06-01

Full Text Available The article presents an analysis of the authors' own tests of accelerations, sound pressure level, and sound power emitted by a passenger lift cabin under different technical conditions of the lift. For a group of lifting devices, accelerations were measured along three axes with an accelerometer placed in the central part of the cabin, while the acoustic parameters were simultaneously measured with a sound analyzer equipped with a double-microphone probe. An attempt was made to determine the impact of the frame-cabin system construction, as well as of the lift's technical condition, on the recorded parameters. This can help establish the limit values of lift structure parameters beyond which riding comfort drops sharply, and indicate the construction elements whose modification would most improve the quietness of operation.

19. Developing a personal-computer-based records retention system using Paradox™

    Energy Technology Data Exchange (ETDEWEB)

    Sprouse, B.; Wray, S.

    1993-10-01

Many records managers are confronted with large caches of records stored in corners, attics, or warehouses that seem to be "out of sight, out of mind." Much of this information becomes "lost" because it is not properly identified and cataloged. Perhaps the records have always been stored in these places because of the lack of an alternative. In these situations, the records manager must organize and catalog the records and provide solutions to the records management and storage problems. A simple personal-computer-based records management system can be developed that will provide organization, accountability, and retrievability of the records. By developing a basic database structure and implementing some basic records management principles, a records manager can gain control of even the most extreme displays of records mismanagement. This paper will discuss practical ways of establishing a records system that provides for database tracking using off-the-shelf database software packages. Database examples using Paradox software will be used to explain the basic concepts for developing records systems. The paper will also discuss developing and performing a records assessment, researching applicable requirements, writing a records management plan, implementing the records system, and testing and modifying the system.

  20. VID-R and SCAN: Tools and Methods for the Automated Analysis of Visual Records.

    Science.gov (United States)

    Ekman, Paul; And Others

    The VID-R (Visual Information Display and Retrieval) system that enables computer-aided analysis of visual records is composed of a film-to-television chain, two videotape recorders with complete remote control of functions, a video-disc recorder, three high-resolution television monitors, a teletype, a PDP-8, a video and audio interface, three…

  1. Tussiphonographic analysis of cough sound recordings performed by Schmidt-Voigt and Hirschberg and Szende.

    Science.gov (United States)

    Korpás, J; Kelemen, S

    1987-01-01

The cough sound records published by Schmidt-Voigt and by Hirschberg and Szende were submitted to tussiphonographic analysis. All the recordings of the various types of cough sounds registered in airway disease showed a pathological character on tussiphonography. This repeatedly confirms that tussiphonography is a suitable means of screening for respiratory diseases. PMID:3434295

  2. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
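The delay step of such an approach, locating a characteristic cardiac periodicity from the autocorrelation function, can be illustrated with a toy sketch (Python/NumPy). The sampling rate, component frequencies, and function names below are invented for illustration; the authors' actual algorithm feeds a delay like this into an ICA-style extraction rather than stopping here.

```python
import numpy as np

def dominant_period(x, fs, min_lag):
    """Lag (in samples) of the strongest autocorrelation peak beyond min_lag."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    return min_lag + int(np.argmax(ac[min_lag:]))

fs = 250.0
t = np.arange(0, 10, 1.0 / fs)
maternal = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm maternal component
fetal = 0.4 * np.sin(2 * np.pi * 2.4 * t)     # ~144 bpm fetal component
mix = maternal + fetal                        # idealized abdominal mixture
lag = dominant_period(mix, fs, min_lag=int(0.3 * fs))
# lag lands near the maternal period, fs / 1.2, i.e. about 208 samples
```

With the dominant period in hand, a delayed copy of the recording can serve as the reference that biases the source separation toward (or away from) that component.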

  3. Julius – a template based supplementary electronic health record system

    Directory of Open Access Journals (Sweden)

    Klein Gunnar O

    2007-05-01

Full Text Available Abstract Background EHR systems are widely used in hospitals and primary care centres, but it is usually difficult to share information and to collect patient data for clinical research. This is partly due to different proprietary information models and inconsistent data quality. Our objective was to provide a more flexible solution that enables clinicians to define which data should be recorded and shared, for both routine documentation and clinical studies. The data should be reusable through a common set of variable definitions providing a consistent nomenclature and validation of data. Another objective was that the templates used for data entry and presentation should be usable in combination with existing EHR systems. Methods We designed and developed a template-based system (called Julius) that was integrated with existing EHR systems. The system is driven by medical domain knowledge defined by clinicians in the form of templates and variable definitions stored in a common data repository. The system architecture consists of three layers. The presentation layer is purely web-based, which facilitates integration with existing EHR products. The domain layer consists of the template design system, a variable/clinical-concept definition system, and the transformation and validation logic, all implemented in Java. The data source layer utilizes an object-relational mapping tool and a relational database. Results The Julius system has been implemented, tested and deployed to three health care units in Stockholm, Sweden. The initial responses from the pilot users were positive. The template system facilitates patient data collection in many ways. The experience of using the template system suggests that enabling clinicians to be in control of the system is a good way to add supplementary functionality to present EHR systems.
Conclusion The approach of the template system in combination with various local EHR

  4. Modal identification of boiler plant structures on AR spectral analysis of seismic records

    International Nuclear Information System (INIS)

    This paper deals with a modal identification method for large-scale structures such as boiler plants in thermal power station. Practical and accurate modal identification has been carried out by the proposed method, which is composed of two stages; processing frequency transfer functions by autoregressive (AR) spectral analysis, and a curve-fitting technique to extract modal parameters. Seismic records of base acceleration records at various points of the structure are used as multi-output data. This method is examined using time-series data of seismic response simulation. Introduction of the two techniques, namely, decimation of time data and FPE criterion to optimize the order of AR models have realized effective and accurate identification. This method has actually been applied to seismic observation data of boiler plants in operation. As a result of this study, the authors' modal identification has proven to be effective for seismic modeling of large-scale structures
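The AR-modelling stage of such a procedure can be sketched with a minimal Yule-Walker estimate (Python/NumPy). The AR(2) coefficients, record length, and variable names below are made up for illustration; the paper's method additionally optimizes the model order with the FPE criterion, decimates the time data, and curve-fits modal parameters from the transfer functions.

```python
import numpy as np

def yule_walker_ar(x, order):
    """AR coefficients from the biased autocorrelation (Yule-Walker equations)."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# Synthetic single-mode "structural response": a resonant AR(2) process
rng = np.random.default_rng(0)
n = 20000
x = np.zeros(n)
e = rng.standard_normal(n)
for i in range(2, n):
    x[i] = 1.5 * x[i - 1] - 0.9 * x[i - 2] + e[i]

a_hat = yule_walker_ar(x, 2)   # close to the true coefficients [1.5, -0.9]
# a modal frequency follows from the AR pole angle: f = angle(pole) * fs / (2*pi)
pole = np.roots([1.0, -a_hat[0], -a_hat[1]])[0]
```

The pole modulus likewise gives the modal damping, which is why AR spectra are a convenient intermediate step between raw seismic records and modal parameters.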

5. Design of a Ground Analysis Program System Based on PDA for LY05 Locomotive Voice Recording

    Institute of Scientific and Technical Information of China (English)

    邓俊彦

    2012-01-01

The design of a ground analysis program system based on the PDA platform for the LY05 locomotive voice recording device is described, covering the design and implementation of FTP download, analysis, playback, and querying of recorded voice files. The similarities and differences between processing locomotive voice recordings on a PDA and on a PC are analyzed and compared in detail. Use of the software has improved the timeliness of locomotive fault analysis.

  6. Revised estimates of Greenland ice sheet thinning histories based on ice-core records

    DEFF Research Database (Denmark)

    Lecavalier, B.S.; Milne, G.A.; Fisher, D.A.;

    2013-01-01

Ice core records were recently used to infer elevation changes of the Greenland ice sheet throughout the Holocene. The inferred elevation changes show a significantly greater elevation reduction than those output from numerical models, bringing into question the accuracy of the model-based reconstructions and, to some extent, the estimated elevation histories. A key component of the ice core analysis involved removing the influence of vertical surface motion on the δ18O signal measured from the Agassiz and Renland ice caps. We re-visit the original analysis with the intent to determine if the use … at DYE-3 and Camp Century. In addition, compared to the original analysis, the 1-σ uncertainty is considerably larger at GRIP and NGRIP. These changes reduce the data-model discrepancy reported by Vinther et al. (2009) at GRIP, NGRIP, DYE-3 and Camp Century. A more accurate treatment of isostasy…

  7. Procedure for the record, calculation and analysis of costs at the Post Company of Cuba.

    Directory of Open Access Journals (Sweden)

    María Luisa Lara Zayas

    2012-12-01

Full Text Available The Cuban enterprise sector is immersed in important changes leading to a new economic model, which requires increasing labour productivity and improving economic efficiency through the rational use of material, financial, and human resources. The present work proposes a procedure based on cost-accounting techniques for the recording, calculation, and analysis of activity costs at the Post Company of Cuba in Sancti Spiritus, with the objective of achieving greater efficiency through the rational use of resources.

  8. Android-based access to holistic emergency care record.

    Science.gov (United States)

    Koufi, Vassiliki; Malamateniou, Flora; Prentza, Andriana; Vassilacopoulos, George

    2013-01-01

    This paper is concerned with the development of an Emergency Medical Services (EMS) system which interfaces with a Holistic Emergency Care Record (HECR) that aims at managing emergency care holistically by supporting EMS processes and is accessible by Android-enabled mobile devices. PMID:23823406

  9. Performance analysis of a medical record exchanges model.

    Science.gov (United States)

    Huang, Ean-Wen; Liou, Der-Ming

    2007-03-01

Electronic medical record exchange among hospitals can provide more information for physician diagnosis and reduce the costs of duplicate examinations. In this paper, we proposed and implemented a medical record exchange model. In our design, exchange interface servers (EISs) enable hospitals to manage information communication through the intra- and interhospital networks linked with a medical records database, while an index service center is responsible for managing the EISs and publishing their addresses and public keys. The prototype system has been implemented to generate, parse, and transfer Health Level Seven query messages. Moreover, the system can encrypt and decrypt a message using a public-key encryption algorithm. Queueing theory is applied to evaluate the performance of the proposed model: we estimated the service time for each queue of the CPU, database, and network, and measured the response time and possible bottlenecks of the model. The capacity of the model is estimated at about 4000 patients/h in the 1-MB network backbone environment, which corresponds to about 4% of the total outpatient volume in Taiwan.
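The queueing evaluation sketched in the abstract can be illustrated with a minimal model. Assuming, purely for illustration (these are not the paper's measured rates), that the CPU, database, and network stages each behave as an independent M/M/1 queue, the mean response time per stage is W = 1/(mu - lambda), summed over the stages:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean time in an M/M/1 system: W = 1 / (mu - lambda)."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Hypothetical throughput of 4000 records/h and made-up per-stage service rates
lam = 4000.0 / 3600.0                                      # arrivals per second
stages = {"cpu": 50.0, "database": 20.0, "network": 10.0}  # services per second
total_response = sum(mm1_response_time(lam, mu) for mu in stages.values())
# total_response is the mean end-to-end time per record, in seconds
```

The stage whose utilization lam/mu is highest (here the network) dominates the sum, which is how such a model exposes the bottleneck.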

  10. Area Disease Estimation Based on Sentinel Hospital Records

    OpenAIRE

    Yang, Yang; Wang, Jin-feng; Reis, Ben Y.; Hu, Mao-Gui; Christakos, George; Yang, Wei-Zhong; Sun, Qiao; Li, Zhong-Jie; Li, Xiao-Zhou; Lai, Sheng-Jie; Chen, Hong-Yan; Wang, Dao-Chen

    2011-01-01

    Background Population health attributes (such as disease incidence and prevalence) are often estimated using sentinel hospital records, which are subject to multiple sources of uncertainty. When applied to these health attributes, commonly used biased estimation techniques can lead to false conclusions and ineffective disease intervention and control. Although some estimators can account for measurement error (in the form of white noise, usually after de-trending), most mainstream health stat...

  11. Analysis of the Lunar Eclipse Records from the Goryeosa

    Science.gov (United States)

    Lee, Ki-Won; Mihn, Byeong-Hee; Ahn, Young Sook; Ahn, Sang-Hyeon

    2016-08-01

    In this paper, we study the lunar eclipse records in the Goryeosa (History of the Goryeo Dynasty), an official history book of the Goryeo dynasty (A.D. 918 -- 1392). In the history book, a total of 228 lunar eclipse accounts are recorded, covering the period from 1009 to 1392. However, we find that two accounts are duplications and four accounts correspond to no known lunar eclipses around the dates. For the remaining lunar eclipses, we calculate the magnitude and the time of the eclipse at different phases using the DE406 ephemeris. Of the 222 lunar eclipse accounts, we find that the minimum penumbral magnitude was 0.5583. For eclipses which occurred after midnight, we find that some accounts were recorded on the day before the eclipse, like the astronomical records of the Joseonwangjosillok (Annals of the Joseon Dynasty), while others were on the day of the lunar eclipse. We also find that four accounts show a difference in the Julian dates between this study and that of Ahn et al., even though it is assumed that the Goryeo court did not change the dates in the accounts for lunar eclipses that occurred after midnight. With regard to the contents of the lunar eclipse accounts, we confirm that the accounts recorded as total eclipses are accurate, except for two accounts. However, both eclipses were very close to the total eclipse. We also confirm that all predicted lunar eclipses did occur, although one eclipse happened two days after the predicted date. In conclusion, we believe that this study is very helpful for investigating the lunar eclipse accounts of other periods in Korea, and furthermore, useful for verifying the calendar dates of the Goryeo dynasty.

  12. 75 FR 79312 - Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking...

    Science.gov (United States)

    2010-12-20

    ... test reactor licensees to obtain a fingerprint- based criminal history records check before granting...; ] NUCLEAR REGULATORY COMMISSION 10 CFR Part 73 RIN 3150-AI25 Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking Unescorted Access to Research or Test Reactors...

  13. Current LBA development on disk based recorders and eVLBI

    Science.gov (United States)

    Phillips, Chris; et al.

For the last 18 months, the ATNF, the University of Swinburne and the University of Tasmania have been collaborating to make available a disk-based recording system for the LBA based on the Metsahovi VSIB digital input card. Development of these systems has been very successful, and they were used for the Huygens VLBI tracking project. Disk-based recording at data rates of up to 512 Mbps (eventually 1 Gbps) has now been added as an option for all LBA proposals. Disk-based recording has also allowed us to record data in parallel with the existing S2 recorders. We now routinely make real-time fringe tests for all experiments by "sniffing" a small section of data and correlating it in software using the Swinburne cluster. We are in the process of installing the second generation of disk-based recorder PCs, which will be used for transmitting data in real time over gigabit links.

  14. Query log analysis of an electronic health record search engine.

    Science.gov (United States)

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A

    2011-01-01

We analyzed a longitudinal collection of query logs of a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over a course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers, for patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of the information needs manifested through the queries, and temporal patterns of the users' information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. We therefore see a significant challenge, along with significant opportunities, in providing intelligent query recommendations to facilitate information retrieval in EHR.
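The descriptive-statistics side of such a log analysis reduces to counting over (session, query) pairs. A toy sketch in pure Python follows; the log entries and field names are invented, not drawn from the study's data:

```python
from collections import Counter

# Toy query log as (session_id, query_text) pairs; all values are invented
log = [
    ("s1", "diabetes mellitus"),
    ("s1", "metformin dosage"),
    ("s2", "chest pain"),
    ("s3", "diabetes mellitus"),
    ("s3", "hba1c"),
]

queries_per_session = Counter(sid for sid, _ in log)     # session lengths
top_queries = Counter(q for _, q in log).most_common(1)  # most frequent query
mean_session_len = len(log) / len(queries_per_session)   # queries per session
```

At the scale reported in the abstract (202,905 queries, 35,928 sessions) the same counters, plus timestamps for the temporal patterns, already yield most of the descriptive statistics.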

  15. Foetal heart rate recording: analysis and comparison of different methodologies

    OpenAIRE

    Ruffo, Mariano

    2011-01-01

Monitoring foetal health is a very important task in clinical practice to appropriately plan pregnancy management and delivery. In the third trimester of pregnancy, ultrasound cardiotocography is the most widely employed diagnostic technique: foetal heart rate and uterine contraction signals are simultaneously recorded and analysed in order to ascertain foetal health. Because ultrasound cardiotocography interpretation still lacks complete reliability, new parameters and methods of interpreta...

  16. Obesity research based on the Copenhagen School Health Records Register

    DEFF Research Database (Denmark)

    Baker, Jennifer L; Sørensen, Thorkild I A

    2011-01-01

INTRODUCTION: To summarise key findings from research performed using data from the Copenhagen School Health Records Register over the last 30 years, with a main focus on obesity-related research. The register contains computerised anthropometric information on 372,636 schoolchildren from the capital city of Denmark. Additional information on the cohort members has been obtained via linkages with population studies and national registers. RESEARCH TOPICS: Studies using data from the register have made important contributions in the areas of the aetiology of obesity, the development of the obesity epidemic, and the long-term health consequences of birth weight as well as body size and growth in childhood. CONCLUSION: Research using this unique register is ongoing, and its contributions to the study of obesity as well as other topics will continue for years to come.

  17. Comparing the security risks of paper-based and computerized patient record systems

    Science.gov (United States)

    Collmann, Jeff R.; Meissner, Marion C.; Tohme, Walid G.; Winchester, James F.; Mun, Seong K.

    1997-05-01

How should hospital administrators compare the security risks of paper-based and computerized patient record systems? There is a general tendency to assume that because computer networks potentially provide broad access to hospital archives, computerized patient records are less secure than paper records and increase the risk of breaches of patient confidentiality. This assumption is ill-founded, on two grounds. First, the computerized patient record can provide better access to patient information while enhancing overall information system security. Second, a range of options with different trade-offs between access and security exists in both paper-based and computerized records management systems. The relative accessibility and security of any particular patient record management system depends, therefore, on administrative choice, not simply on the intrinsic features of paper or computerized information management systems.

  18. Recording and Analysis of Tsetse Flight Responses in Three Dimensions

    International Nuclear Information System (INIS)

Recording and analysing three dimensional (3D) motions of tsetse flies in flight is technically challenging due to their speed of flight. However, video recording of tsetse fly flight responses has already been carried out in both wind tunnels and the field. The aim of our research was to study the way tsetse flies exploit host odours and visual targets during host searching. Such knowledge can help in the development of better trapping devices. We built a wind tunnel where it is possible to control environmental parameters, e.g. temperature, relative humidity and light. The flight of the flies was filmed from above with two high speed Linux-embedded cameras equipped with fish-eye objectives viewing at 60° from one another. The synchronized stereo images were used to reconstruct the trajectory of flies in 3D and in real time. Software permitted adjustment for parameters such as luminosity and the size of the tsetse species being tracked. Interpolation permitted us to calculate flight coordinates and to measure modifications of flight parameters such as acceleration, velocity, rectitude, angular velocity and curvature according to the experimental conditions. Using this system we filmed the responses of Glossina brevipalpis Newstead, obtained from a colony at the IAEA Entomology Unit, Seibersdorf, Austria, to human breath presented with and without a visual target. Flights lasting up to 150 s and covering up to 153 m were recorded. G. brevipalpis flights to human breath were characterized by wide undulations along the course. When a visual target was placed in the plume of breath, flights of G. brevipalpis were more tightly controlled, i.e. slower and more directed. This showed that after multiple generations in a laboratory colony, G. brevipalpis was still capable of complex behaviours during bloodmeal searching. (author)

  19. Practical analysis of tide gauges records from Antarctica

    Science.gov (United States)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness: 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The resulting time series have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP, the trend is 1.8 ± 0.2 mm/yr, and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using the program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA. Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trend of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which are interpreted
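The trend-and-correction arithmetic in the abstract can be reproduced with an ordinary least-squares slope. The sketch below (pure Python) uses a synthetic yearly series, not the PSMSL data; only the 2.1 mm/yr trend and the -1.1 mm/yr mean GIA rate are taken from the abstract:

```python
def linear_trend(years, values):
    """Ordinary least-squares slope of a relative sea-level series (mm/yr)."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Synthetic yearly series rising at 2.1 mm/yr over 1957-2013 (invented data)
years = list(range(1957, 2014))
rsl = [2.1 * (y - 1957) for y in years]
observed_trend = linear_trend(years, rsl)    # ~2.1 mm/yr
mean_gia = -1.1                              # mean modeled GIA rate, per the abstract
corrected_trend = observed_trend - mean_gia  # ~3.2 mm/yr, as quoted in the text
```

Because GIA lowers relative sea level at these sites, subtracting a negative rate raises the inferred oceanographic trend, which is exactly the step taken in the abstract.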

  20. Episode image analysis for wearable daily life recording system

    Science.gov (United States)

    Toda, Masashi; Nagasaki, Takeshi; Kawashima, Toshio

    2003-06-01

We are developing a wearable recording system that can automatically capture images from the user's viewpoint in everyday life. The user later refers to the acquired images, which support the user's activity, such as memory and thought. The number of acquired images becomes so huge that it is difficult for the user to browse them, so a mechanism for viewing them effectively, such as summarization, is very important. In this research, we describe the concept of a summarization mechanism for everyday-life images.

  1. Analysis of In Mine Acoustic Recordings for Single Fired Explosions

    Science.gov (United States)

    McKenna, S.; Hayward, C.; Stump, B.

    2003-12-01

In August of 2003, a series of single-fired test shots was executed at a copper mine in Arizona. The ten shots, fired on August 18 and 19, 2003, ranged in size from 1700 lbs to 13600 lbs in simultaneously detonated patterns ranging from a single hole to eight holes. All were located within the same pit and within 100 m of each other. Both free-face and bench shots were included. Southern Methodist University had previously deployed a set of acoustic gauges ringing the active production areas of the mine. The five Validyne DP250 sensors recorded not only the ten test shots, but also seven delay-fired production shots over the four-day period from August 18 to 21, 2003. Each recorded blast arrival was analyzed for peak amplitude and spectrum. Signals were then compared for the variability between shots and sensors, as well as between fully contained and poorly contained shots. Blast yield, scaled depth, and centroid depth were compared to the above measured quantities for each of the single-fired and production shots.
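The per-arrival measurements named above, peak amplitude and spectrum, can be sketched in a few lines (Python/NumPy). The sampling rate and the toy 40 Hz "blast" tone are invented stand-ins for the actual acoustic records:

```python
import numpy as np

def peak_and_spectrum(x, fs):
    """Peak absolute amplitude and one-sided amplitude spectrum of a recording."""
    peak = np.max(np.abs(x))
    spec = np.abs(np.fft.rfft(x)) / len(x) * 2.0   # one-sided amplitude scaling
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return peak, freqs, spec

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = 0.8 * np.sin(2 * np.pi * 40.0 * t)       # toy "blast" tone at 40 Hz
peak, freqs, spec = peak_and_spectrum(x, fs)
dominant = freqs[np.argmax(spec[1:]) + 1]    # dominant frequency, skipping DC
```

Comparing `peak` and the spectral shape across sensors and shots is the kind of inter-shot variability analysis the abstract describes.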

  2. Frequency analysis of electroencephalogram recorded from a bottlenose dolphin (Tursiops truncatus) with a novel method during transportation by truck

    OpenAIRE

    Hashio, Fuyuko; Tamura, Shinichi; Okada, Yasunori; Morimoto, Shigeru; Ohta, Mitsuaki; Uchida, Naoyuki

    2010-01-01

    In order to obtain information regarding the correlation between an electroencephalogram (EEG) and the state of a dolphin, we developed a noninvasive method for recording the EEG of a bottlenose dolphin (Tursiops truncatus) and a method for extracting the true EEG from the recorded EEG (R-EEG), based on a human EEG recording method, and then carried out frequency analysis during transportation by truck. The frequency detected in the EEG of the dolphin during apparent awakening was divided conveniently in...

  3. Speech watermarking: an approach for the forensic analysis of digital telephonic recordings.

    Science.gov (United States)

    Faundez-Zanuy, Marcos; Lucena-Molina, Jose J; Hagmüller, Martin

    2010-07-01

    In this article, the authors discuss the problem of forensic authentication of digital audio recordings. Although forensic audio has been addressed in several articles, the existing approaches focus on analog magnetic recordings, which are less prevalent given the large number of digital recorders available on the market (optical, solid state, hard disks, etc.). An approach based on digital signal processing, consisting of spread-spectrum techniques for speech watermarking, is presented. This approach has the advantage that authentication is based on the signal itself rather than on the recording format; thus, it is valid for the recording devices usually found in police-controlled telephone intercepts. In addition, our proposal allows the introduction of relevant information such as the recording date and time and all the relevant data (which is not always possible with classical systems). Our experimental results reveal that the speech watermarking procedure does not interfere in a significant way with subsequent forensic speaker identification. PMID:20412360
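A toy illustration of additive spread-spectrum watermarking (not the authors' actual embedding scheme; the key, the strength alpha, and the surrogate "speech" signal are all assumptions): one bit is spread over the whole signal by a key-seeded pseudo-noise carrier and recovered by correlation.

```python
import numpy as np

def embed(signal, bit, key, alpha=0.005):
    """Additively embed one watermark bit (+1/-1) using a key-seeded
    pseudo-noise (PN) carrier."""
    rng = np.random.default_rng(key)
    pn = rng.choice([-1.0, 1.0], size=signal.shape)   # PN carrier
    return signal + alpha * bit * pn, pn

def detect(watermarked, key):
    """Recover the bit as the sign of the correlation with the
    regenerated PN carrier."""
    rng = np.random.default_rng(key)
    pn = rng.choice([-1.0, 1.0], size=watermarked.shape)
    return 1 if np.dot(watermarked, pn) > 0 else -1

rng = np.random.default_rng(0)
speech = rng.normal(scale=0.1, size=16000)   # 1 s of surrogate "speech"
marked, _ = embed(speech, bit=-1, key=42)
```

The correlation of the host signal with the PN sequence averages to zero, while the embedded term grows linearly with the signal length, which is what makes the low-power watermark detectable.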

  4. A near real-time satellite-based global drought climate data record

    International Nuclear Information System (INIS)

    Reliable drought monitoring requires long-term and continuous precipitation data. High-resolution satellite measurements provide valuable precipitation information on a quasi-global scale. However, their short records limit their application to drought monitoring. In addition to this limitation, long-term low-resolution satellite-based gauge-adjusted data sets, such as that of the Global Precipitation Climatology Project (GPCP), are not available in near real time for timely drought monitoring. This study bridges the gap between low-resolution long-term satellite gauge-adjusted data and the emerging high-resolution satellite precipitation data sets to create a long-term climate data record of droughts. To accomplish this, a Bayesian correction algorithm is used to combine GPCP data with real-time satellite precipitation data sets for drought monitoring and analysis. The results showed that the combined data sets after the Bayesian correction were a significant improvement over the uncorrected data. Furthermore, several recent major droughts, such as the 2011 Texas, 2010 Amazon and 2010 Horn of Africa droughts, were detected in the combined real-time and long-term satellite observations. This highlights the potential application of satellite precipitation data for regional to global drought monitoring. The final product is a real-time data-driven satellite-based standardized precipitation index that can be used for drought monitoring, especially over remote and/or ungauged regions. (letter)
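A standardized precipitation index of the kind the final product provides can be sketched in its textbook form: fit a gamma distribution to the accumulation series and map its CDF through the inverse standard normal. The synthetic monthly series below is illustrative, not GPCP data.

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index: fit a gamma distribution to the
    precipitation accumulations and transform the fitted CDF to
    standard-normal scores (negative = drier than usual)."""
    precip = np.asarray(precip, dtype=float)
    a, loc, scale = stats.gamma.fit(precip, floc=0)   # location fixed at 0
    cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(1)
monthly = rng.gamma(shape=2.0, scale=50.0, size=360)  # 30 yr of mm/month
z = spi(monthly)
```

Because the transform is monotone, the driest month in the record always carries the most negative index value.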

  5. A CORBA-based integration of distributed electronic healthcare records using the synapses approach.

    Science.gov (United States)

    Grimson, J; Grimson, W; Berry, D; Stephens, G; Felton, E; Kalra, D; Toussaint, P; Weier, O W

    1998-09-01

    The ability to exchange relevant healthcare data about patients in a meaningful, secure, and simple fashion is seen as vital in the context of efficient and cost-effective shared or team-based care. The electronic healthcare record (EHCR) lies at the heart of this information exchange, and it follows that there is an urgent need to address the ability to share EHCRs, or parts of records, between carers and across distributed health information systems. This paper presents the Synapses approach to sharing, based on a standardized shared record, the Federated Healthcare Record, which is implemented in an open and flexible manner using the Common Object Request Broker Architecture (CORBA). The architecture of the Federated Healthcare Record is based on the architecture proposed by Technical Committee 251 of the European Committee for Standardization.

  6. Low frequency signals analysis from broadband seismometers records

    Science.gov (United States)

    Hsu, Po-Chin

    2016-04-01

    Broadband seismometers record signals over a wide frequency band, in which the high-frequency background noise is usually associated with human activities, such as cars, trains and factory-related activities, while the low-frequency signals are generally linked to microseisms, atmospheric phenomena and oceanic wave movement. In this study, we selected broadband seismometer data recorded during the passage of typhoons with different moving paths: Doksuri in 2012, Trami and Kong-Rey in 2013, and Hagibis and Matmo in 2014. By comparing the broadband seismic data, the meteorological information, and the marine conditions, we attempt to understand the effect of meteorological conditions on the low-frequency noise. The results show that the broadband stations located along the southwestern coast of Taiwan usually have relatively high background noise, while the inland stations are characterized by lower noise energy. This rapid decay of the noise energy with distance from the coastline suggests that the low-frequency noise is correlated with oceanic waves. In addition, the noise energy level increases as the distance between the typhoon and the station decreases. The enhanced frequency range is 0.1-0.3 Hz, which is consistent with the effect caused by the interference of oceanic waves suggested by previous studies. This observation indicates that the passage of a typhoon may reinforce the interaction of oceanic waves and thereby influence the seismic records. The positive correlation between the significant wave height and the noise energy also supports this observation. However, we found that the noise energy is not necessarily strongest when the distance between the typhoon and the station is shortest. This phenomenon seems to be related to the typhoon path.
When the typhoon track is perpendicular to the coastline, the change in noise energy is generally more significant; whereas less energy

  7. Factors influencing consumer adoption of USB-based Personal Health Records in Taiwan

    Directory of Open Access Journals (Sweden)

    Jian Wen-Shan

    2012-08-01

    Full Text Available Abstract Background Usually patients receive healthcare services from multiple hospitals, and consequently their healthcare data are dispersed over many facilities' paper- and electronic-based record systems. Therefore, many countries have encouraged research on data interoperability, access, and patient authorization. This study is an important part of a national project to build an information exchange environment for cross-hospital digital medical records, carried out by the Department of Health (DOH) of Taiwan in May 2008. The key objective of the core project is to set up a portable data exchange environment that enables people to maintain and own their essential health information. This study is aimed at exploring the factors influencing the behavior and adoption of USB-based Personal Health Records (PHR) in Taiwan. Methods Quota sampling was used, and structured questionnaires were distributed to the outpatient departments at ten medical centers which participated in the DOH project to establish the information exchange environment across hospitals. A total of 3000 questionnaires were distributed and 1549 responses were collected, of which 1465 were valid, for a response rate of 48.83%. Results 1025 out of 1465 respondents expressed their willingness to apply for the USB-PHR. Detailed analysis of the data reflected a remarkable difference in usage intention between the PHR adopters and non-adopters (χ2 =182.4, p  Conclusions Higher Usage Intention, Perceived Usefulness and Subjective Norm of patients were found to be the key factors influencing PHR adoption. Thus, we suggest that government and hospitals should promote the potential usefulness of PHR, and physicians should encourage patients to adopt the PHR.

  8. Simplified Technique for Incorporating a Metal Mesh into Record Bases for Mandibular Implant Overdentures.

    Science.gov (United States)

    Godoy, Antonio; Siegel, Sharon C

    2015-12-01

    Mandibular implant-retained overdentures have become the standard of care for patients with mandibular complete edentulism. As part of the treatment, the mandibular implant-retained overdenture may require a metal mesh framework to be incorporated to strengthen the denture and avoid fracture of the prosthesis. Integrating the metal mesh framework as part of the acrylic record base and wax occlusion rim before the jaw relation procedure will avoid the distortion of the record base and will minimize the chances of processing errors. A simplified method to incorporate the mesh into the record base and occlusion rim is presented in this technique article. PMID:25659988

  9. Foreign technology alert-bibliography: Photography and recording devices. Citations from the NTIS data base

    Science.gov (United States)

    Wilkinson, G.

    1982-11-01

    A systematically organized collection of abstracts from a bibliographic data base is provided on reports relating to photographic, imaging and recording systems originating from countries outside the USA. A tailored search of the data base was performed and the output carefully categorized, edited and indexed. Subjects covered include: photographic devices and imaging systems (cameras, image carriers, holography and applications); audiovisual recording (digital, magnetic and video); data encoding, recording and storage; and satellite equipment. Each of the sections in the book is cross-referenced, and there is also an author index and a useful subject index based on major descriptors.

  10. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format

    Science.gov (United States)

    Moore, Vivianne E.

    2013-01-01

    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…

  11. Comorbidities in rheumatoid arthritis: analysis of hospital discharge records

    Directory of Open Access Journals (Sweden)

    G.S. Mela

    2011-09-01

    Full Text Available Objective: Arthritis is often associated with comorbidities. For many of them, such as hypertension, cardiovascular disease, chronic pulmonary disease, and upper gastrointestinal disease, arthritis and its treatment may also represent a risk factor. This study is concerned with an evaluation of the frequency of comorbidities in a cohort of patients with rheumatoid arthritis (RA). Methods: The discharge diagnoses of patients with RA during the period 1 January 1997 to 31 December 2000 were retrieved from the database of the Department of Internal Medicine of the University of Genova, Italy. The diagnosis of RA was made if the patient's discharge record contained the code 714 of the International Classification of Diseases, IX revision, as first 3 numbers. The other diagnoses were also recorded along with demographic data, type and duration of hospital stay, and performed procedures. Results: During the study period, 427 patients with RA were admitted to the hospital for a total number of 761 admissions, which represented 2.2% of total admissions. Ninety-one (21.3%) patients did not have comorbidities, whereas 336 (78.6%) had one or more comorbidities. The most frequently observed comorbidities were cardiovascular diseases (34.6%), including hypertension (14.5%) and angina (3.5%), followed by gastrointestinal (24.5%), genito-urinary (18.7%) and respiratory (17%) diseases. There was a male predominance (p=0.004) among patients with comorbidities, who were significantly older (64.2±3.2 years vs. 57.2±4.2 years; p<0.001) and required longer periods of hospital stay (22.7 days vs. 12.5 days; p<0.001). Conclusions: Comorbidities are present in nearly 80% of RA inpatients. Comorbidity is a good predictor of health outcome, health services utilization, and medical costs. Because RA comorbidity can act as a confounder, it should be considered in epidemiologic studies and clinical trials.

  12. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, G.; Smith, J.A.; Serinaldi, F.; Bales, J.; Bates, P.D.; Krajewski, W.F.

    2009-01-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km²) show that the maximum flow with a 0.01 annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m³ s⁻¹ km⁻² to a maximum of 5.1 m³ s⁻¹ km⁻². An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m³ s⁻¹ km⁻²). Under nonstationary conditions, alternative definitions of return period should be adopted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m³ s⁻¹ km⁻² ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades. © 2009 Elsevier Ltd.
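The central idea, that a fixed discharge level has a time-varying return period under nonstationarity, can be illustrated with a much simpler stand-in for GAMLSS: a Gumbel distribution whose location parameter drifts linearly with time. All numbers below are hypothetical, not the Little Sugar Creek estimates.

```python
import numpy as np

def gumbel_exceedance(q, mu, beta):
    """Annual exceedance probability of level q for a Gumbel(mu, beta) year."""
    return 1.0 - np.exp(-np.exp(-(q - mu) / beta))

# Hypothetical urbanizing basin: the location parameter drifts upward,
# so the "100-year" level of 1950 is exceeded ever more often.
years = np.arange(1950, 2011)
mu_t = 1.0 + 0.01 * (years - 1950)     # unit discharge, m^3/s/km^2
beta = 0.3
# 1% annual-exceedance level under the 1950 distribution (mu = 1.0)
q100_1950 = 1.0 - beta * np.log(-np.log(1 - 0.01))
return_period = 1.0 / gumbel_exceedance(q100_1950, mu_t, beta)
```

Under stationarity the return period of `q100_1950` would stay at 100 years; with the drifting location it shrinks steadily over the record.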

  13. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features

    International Nuclear Information System (INIS)

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and unlike in other published papers we use three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, interval index, as well as Lempel–Ziv complexity and Higuchi's fractal dimension are among the top five features
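Of the top-ranked features, Higuchi's fractal dimension is compact enough to sketch. Below is a minimal implementation of Higuchi's algorithm; the choice kmax = 8 and the test signals are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's fractal dimension of a 1-D time series: average curve
    length L(k) at scale k, with FD the slope of log L(k) vs log(1/k)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    L = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):                     # k down-sampled sub-series
            n = (N - m - 1) // k               # number of increments
            if n < 1:
                continue
            idx = m + np.arange(n + 1) * k
            length = np.abs(np.diff(x[idx])).sum() * (N - 1) / (n * k)
            Lk.append(length / k)
        L.append(np.mean(Lk))
    k = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k), np.log(L), 1)
    return slope

# Sanity checks: white noise is close to FD = 2, a straight line to FD = 1.
rng = np.random.default_rng(0)
fd_noise = higuchi_fd(rng.normal(size=2000))
fd_line = higuchi_fd(np.linspace(0.0, 1.0, 2000))
```

An FHR trace with more irregular, noise-like variability therefore scores closer to 2, and a smooth monotone segment closer to 1.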

  14. Analysis of Continuous Microseismic Recordings: Resonance Frequencies and Unconventional Events

    Science.gov (United States)

    Tary, J.; van der Baan, M.

    2012-12-01

    Hydrofracture experiments, where fluids and proppant are injected into reservoirs to create fractures and enhance oil recovery, are often monitored using microseismic recordings. The total stimulated volume is then estimated by the size of the cloud of induced micro-earthquakes. This implies that only brittle failure should occur inside reservoirs during the fracturing. Yet this assumption may not be correct, as the total energy injected into the system is orders of magnitude larger than the total energy associated with brittle failure. Instead of using only triggered events, it has been shown recently that the frequency content of continuous recordings may also provide information on the deformations occurring inside reservoirs. Here, we use different kinds of time-frequency transforms to track the presence of resonance frequencies. We analyze different data sets using regular, long-period and broadband geophones. The resonance frequencies observed mainly fall within the 5-60 Hz band. We first systematically examine the possible causes of resonance frequencies, dividing them into source, path and receiver effects. We then conclude that some of the observed frequency bands likely result from source effects. The resonance frequencies could be produced either by interconnected fluid-filled fractures on the order of tens of meters, or by small repetitive events occurring at a characteristic periodicity. Still, other mechanisms may occur or be predominant during reservoir fracturing, depending on the lithology as well as the pressure and temperature conditions at depth. During one experiment, regular micro-earthquakes, long-period long-duration events (LPLDs) and resonance frequencies were all observed. The lower part of the frequency band of these resonance frequencies (5-30 Hz) overlaps with the anticipated frequencies of LPLDs observed in other experiments. The exact origin of both resonance frequencies and LPLDs is still under debate.

  15. Use and Characteristics of Electronic Health Record Systems among Office-Based Physician Practices: United States, ...

    Science.gov (United States)

    ... National Ambulatory Medical Care Survey Adoption of basic EHR systems by office-based physicians increased 21% between ... Survey, Electronic Health Records Survey. Adoption of basic EHR systems and any EHR system varied widely across ...

  16. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  17. Integration of Evidence into a Detailed Clinical Model-based Electronic Nursing Record System

    OpenAIRE

    Park, Hyeoun-Ae; Min, Yul Ha; Jeon, Eunjoo; Chung, Eunja

    2012-01-01

    Objectives The purpose of this study was to test the feasibility of an electronic nursing record system for perinatal care that is based on detailed clinical models and clinical practice guidelines in perinatal care. Methods This study was carried out in five phases: 1) generating nursing statements using detailed clinical models; 2) identifying the relevant evidence; 3) linking nursing statements with the evidence; 4) developing a prototype electronic nursing record system based on detailed ...

  18. A Wavelet-Based Algorithm for Delineation and Classification of Wave Patterns in Continuous Holter ECG Recordings

    OpenAIRE

    Johannesen, L; Grove, USL; Sørensen, JS; Schmidt, ML; Couderc, J-P; Graff, C

    2010-01-01

    Quantitative analysis of the electrocardiogram (ECG) requires delineation and classification of the individual ECG wave patterns. We propose a wavelet-based waveform classifier that uses the fiducial points identified by a delineation algorithm. For validation of the algorithm, manually annotated ECG records from the QT database (Physionet) were used. ECG waveform classification accuracies were: 85.6% (P-wave), 89.7% (QRS complex), 92.8% (T-wave) and 76.9% (U-wave). The proposed classificatio...

  19. Ex post power economic analysis of record of decision operational restrictions at Glen Canyon Dam.

    Energy Technology Data Exchange (ETDEWEB)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.; Loftin, S.; Osiek, B; Decision and Information Sciences; Western Area Power Administration

    2010-07-31

    On October 9, 1996, Bruce Babbitt, then-Secretary of the U.S. Department of the Interior signed the Record of Decision (ROD) on operating criteria for the Glen Canyon Dam (GCD). Criteria selected were based on the Modified Low Fluctuating Flow (MLFF) Alternative as described in the Operation of Glen Canyon Dam, Colorado River Storage Project, Arizona, Final Environmental Impact Statement (EIS) (Reclamation 1995). These restrictions reduced the operating flexibility of the hydroelectric power plant and therefore its economic value. The EIS provided impact information to support the ROD, including an analysis of operating criteria alternatives on power system economics. This ex post study reevaluates ROD power economic impacts and compares these results to the economic analysis performed prior (ex ante) to the ROD for the MLFF Alternative. On the basis of the methodology used in the ex ante analysis, anticipated annual economic impacts of the ROD were estimated to range from approximately $15.1 million to $44.2 million in terms of 1991 dollars ($1991). This ex post analysis incorporates historical events that took place between 1997 and 2005, including the evolution of power markets in the Western Electricity Coordinating Council as reflected in market prices for capacity and energy. Prompted by ROD operational restrictions, this analysis also incorporates a decision made by the Western Area Power Administration to modify commitments that it made to its customers. Simulated operations of GCD were based on the premise that hourly production patterns would maximize the economic value of the hydropower resource. On the basis of this assumption, it was estimated that economic impacts were on average $26.3 million in $1991, or $39 million in $2009.

  20. Tidal analysis of data recorded by a superconducting gravimeter

    Directory of Open Access Journals (Sweden)

    F. Palmonari

    1995-06-01

    Full Text Available A superconducting gravimeter was used to monitor the tidal signal for a period of five months. The instrument was placed in a site (Brasimone station, Italy) characterized by a low noise level, and was calibrated with a precision of 0.2%. Tidal analysis was then performed on hourly data and the results are presented in this paper; amplitudes, gravimetric factors and phase differences for the main tidal waves M2, S2, N2, O1, P1, K1 and Q1 were calculated, together with the barometric pressure admittance and the long-term instrumental drift.

  1. Microcomputer-based recording system for clinical electrophysiology.

    Science.gov (United States)

    Török, B

    1990-09-01

    We developed a personal computer-based system for clinical electrophysiologic measurements. The computer interfaced with a commercially available A/D converter, a low-noise isolation preamplifier, filter circuits, pattern and Ganzfeld stimulators, and a hardcopy unit. Separate programs were developed for electroretinography (ERG), pattern ERG and simultaneous visual evoked potential (VEP), flash and pattern-shift VEP, and electro-oculographic measurements. The complete control of the applied hardware (eg, stimulus control, automatic gain, and filter selection) is a common feature of the computer programs. These programs provide oscilloscopic functions, overload protection, artifact elimination, averaging, automatic peak latency and amplitude determination, baseline correction, smoothing, and digital filtering. The results can be presented on matrix, laser printers, or digital plotters. The hardware components and the features of the driver software are demonstrated on normal and pathologic signals. PMID:2276319

  2. An efficient record linkage scheme using graphical analysis for identifier error detection

    Directory of Open Access Journals (Sweden)

    Peto Tim EA

    2011-02-01

    Full Text Available Abstract Background Integration of information on individuals (record linkage is a key problem in healthcare delivery, epidemiology, and "business intelligence" applications. It is now common to be required to link very large numbers of records, often containing various combinations of theoretically unique identifiers, such as NHS numbers, which are both incomplete and error-prone. Methods We describe a two-step record linkage algorithm in which identifiers with high cardinality are identified or generated, and used to perform an initial exact match based linkage. Subsequently, the resulting clusters are studied and, if appropriate, partitioned using a graph based algorithm detecting erroneous identifiers. Results The system was used to cluster over 250 million health records from five data sources within a large UK hospital group. Linkage, which was completed in about 30 minutes, yielded 3.6 million clusters of which about 99.8% contain, with high likelihood, records from one patient. Although computationally efficient, the algorithm's requirement for exact matching of at least one identifier of each record to another for cluster formation may be a limitation in some databases containing records of low identifier quality. Conclusions The technique described offers a simple, fast and highly efficient two-step method for large scale initial linkage for records commonly found in the UK's National Health Service.
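The first step of the two-step scheme, exact-match clustering on shared high-cardinality identifiers, can be sketched with a union-find structure. The field names and records below are hypothetical, not from the UK hospital data.

```python
from collections import defaultdict

def link_records(records):
    """Cluster records that share any exact identifier value
    (e.g. an NHS-number-like field), using union-find."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    seen = {}                               # (field, value) -> first record
    for i, rec in enumerate(records):
        for key, value in rec.items():
            if value is None:
                continue                    # identifiers are often incomplete
            if (key, value) in seen:
                union(i, seen[(key, value)])
            else:
                seen[(key, value)] = i

    clusters = defaultdict(set)
    for i in range(len(records)):
        clusters[find(i)].add(i)
    return list(clusters.values())

records = [
    {"nhs": "A1", "hosp_id": None},
    {"nhs": "A1", "hosp_id": "H7"},   # links to record 0 via nhs
    {"nhs": None, "hosp_id": "H7"},   # links to record 1 via hosp_id
    {"nhs": "B2", "hosp_id": "H9"},   # separate patient
]
clusters = link_records(records)
```

The paper's second step, splitting clusters that a graph analysis flags as joined by an erroneous identifier, would then operate on these initial clusters.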

  3. Performance analysis of seismocardiography for heart sound signal recording in noisy scenarios.

    Science.gov (United States)

    Jain, Puneet Kumar; Tiwari, Anil Kumar; Chourasia, Vijay S

    2016-01-01

    This paper presents a system based on Seismocardiography (SCG) to monitor the heart sound signal over the long term. It uses an accelerometer, which is small and lightweight and thus convenient to wear. Such a system should also be robust to the various noises which occur in real-life scenarios. Therefore, a detailed analysis of the proposed system is provided, and its performance is compared to that of a Phonocardiography (PCG) system. For this purpose, both signals were simultaneously recorded from five subjects in clinical and various real-life noisy scenarios. For the quantitative analysis, the detection rate of the fundamental heart sound components, S1 and S2, is obtained. Furthermore, a quality index based on the energy of the fundamental components is also proposed and obtained. Results show that both techniques are able to acquire S1 and S2 in the clinical set-up. However, in real-life scenarios, we observed many features of the proposed system that favour its use for long-term monitoring compared to PCG. PMID:26860039
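A quality index "based on the energy of fundamental components" could take the form of an in-band energy ratio. The sketch below assumes a 20-150 Hz band for S1/S2 energy; that band, and the test signals, are illustrative choices, not the authors' definition.

```python
import numpy as np

def quality_index(x, fs, band=(20.0, 150.0)):
    """Fraction of total signal energy inside the (assumed) S1/S2 band."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum()

fs = 1000
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 50 * t)          # tone inside the assumed band
rng = np.random.default_rng(0)
noisy = clean + 2.0 * rng.normal(size=fs)   # broadband motion noise added
```

A recording dominated by broadband motion noise spreads its energy outside the band and therefore scores lower than a clean recording.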

  4. Multi-scale dynamical analysis (MSDA) of sea level records versus PDO, AMO, and NAO indexes

    CERN Document Server

    Scafetta, Nicola

    2013-01-01

    Herein I propose a multi-scale dynamical analysis to facilitate the physical interpretation of tide gauge records. The technique uses graphical diagrams. It is applied to six secular-long tide gauge records representative of the world oceans: Sydney, Pacific coast of Australia; Fremantle, Indian Ocean coast of Australia; New York City, Atlantic coast of USA; Honolulu, U.S. state of Hawaii; San Diego, U.S. state of California; and Venice, Mediterranean Sea, Italy. For comparison, an equivalent analysis is applied to the Pacific Decadal Oscillation (PDO) index and to the Atlantic Multidecadal Oscillation (AMO) index. Finally, a global reconstruction of sea level and a reconstruction of the North Atlantic Oscillation (NAO) index are analyzed and compared: both sequences cover about three centuries from 1700 to 2000. The proposed methodology quickly highlights oscillations and teleconnections among the records at the decadal and multidecadal scales. At the secular time scales tide gauge records present relatively...

  5. Community-based, interdisciplinary geriatric care team satisfaction with an electronic health record: a multimethod study.

    Science.gov (United States)

    Sockolow, Paulina S; Bowles, Kathryn H; Lehmann, Harold P; Abbott, Patricia A; Weiner, Jonathan P

    2012-06-01

    This multimethod study measured the impact of an electronic health record (EHR) on clinician satisfaction with clinical process. Subjects were 39 clinicians at a Program of All-inclusive Care for Elders (PACE) site in Philadelphia utilizing an EHR. Methods included the evidence-based evaluation framework, Health Information Technology Research-Based Evaluation Framework, which guided assessment of clinician satisfaction with surveys, observations, follow-up interviews, and actual EHR use at two points in time. Mixed-methods analysis of findings provided context for interpretation and improved validity. The study found that clinicians were satisfied with the EHR; however, satisfaction declined between time periods. Use of EHR was universal and wide and was differentiated by clinical role. Between time periods, EHR use increased in volume, with increased timeliness and decreased efficiency. As the first EHR evaluation at a PACE site from the perspective of clinicians who use the system, this study provides insights into EHR use in the care of older people in community-based healthcare settings. PMID:22411417

  6. Design of a system based on DSP and FPGA for video recording and replaying

    Science.gov (United States)

    Kang, Yan; Wang, Heng

    2013-08-01

    This paper presents a video recording and replaying system based on a Digital Signal Processor (DSP) and a Field Programmable Gate Array (FPGA). The system performs encoding, recording, decoding and replaying of Video Graphics Array (VGA) signals displayed on a monitor during the navigation of airplanes and ships. In this architecture, the DSP is the main processor, used for the large amount of complex computation required in digital signal processing. The FPGA is a coprocessor that preprocesses the video signals and implements logic control in the system. In the hardware design, the Peripheral Device Transfer (PDT) function of the External Memory Interface (EMIF) is utilized to implement a seamless interface among the DSP, the synchronous dynamic RAM (SDRAM) and the First-In-First-Out (FIFO) buffers in the system. This transfer mode avoids a data-transfer bottleneck and simplifies the circuitry between the DSP and its peripheral chips. The DSP's EMIF and two level-matching chips are used to implement the Advanced Technology Attachment (ATA) protocol on the physical layer of the interface to an Integrated Drive Electronics (IDE) hard disk, which provides high-speed data access without relying on a computer. The main functions of the FPGA logic are described, and screenshots of the behavioral simulation are provided in this paper. In the DSP software design, Enhanced Direct Memory Access (EDMA) channels are used to transfer data between the FIFO and the SDRAM without CPU intervention, preserving the CPU's computing performance and saving processing time. JPEG2000 is implemented to obtain high fidelity in video recording and replaying. Ways and means of achieving high-performance code are briefly presented. The data-processing capability of the system is satisfactory, and the smoothness of the replayed video is acceptable. Owing to its design flexibility and reliable operation, the system based on DSP and FPGA

  7. Analysis of global and hemispheric temperature records and prognosis

    Science.gov (United States)

    Werner, Rolf; Valev, Dimitar; Danov, Dimitar; Guineva, Veneta; Kirillov, Andrey

    2015-06-01

    Climate changes are connected to long-term variations of global and hemispheric temperatures, which are important for working out socio-political strategies for the near future. In this paper the annual temperature time series are modeled by linear multiple regression to identify important climate forcings, including external factors such as atmospheric CO2 content, volcanic emissions and total solar irradiance, as well as internal factors such as the El Niño-Southern Oscillation, the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation. Adjusted temperatures were determined by removal of all significant influences except CO2. The adjusted temperatures follow a linear dependence on the logarithm of the CO2 content, with a coefficient of determination of about 0.91. The evolution of the adjusted temperatures suggests that the warming due to CO2 has never stopped since the beginning of the studied time interval in 1900 and continues to the present. The global warming rate deduced from the adjusted temperatures since 1980 is about 0.14 ± 0.02 °C/decade. The warming rate reported in the IPCC Fourth Assessment Report, based on the observed global surface temperature set, is about 20% higher, because warming by the Atlantic Multidecadal Oscillation adds to the anthropogenic warming. The predicted temperature evolution, based on long-term changes of CO2 and the Atlantic Multidecadal Oscillation index, shows that Northern Hemisphere temperatures are modulated by the Atlantic Multidecadal Oscillation and will not change significantly until about 2040; after that they will increase rapidly, as during the last decades of the past century. The temperatures of the Southern Hemisphere will increase almost linearly and do not show significant periodic changes due to the Atlantic Multidecadal Oscillation. The actual warming rates, of course, depend strongly on the future atmospheric CO2 content.
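    The attribution scheme described in this abstract - regress temperature on all forcings, then subtract every fitted influence except CO2 - can be illustrated with ordinary least squares. The sketch below uses synthetic data (the CO2 curve, AMO-like index, and coefficients are invented for illustration, not the authors' data):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2015)
co2 = 295.0 * np.exp(0.0012 * (years - 1900) ** 1.3)  # synthetic CO2 curve (ppm)
amo = np.sin(2 * np.pi * (years - 1900) / 65.0)       # synthetic AMO-like index

# Synthetic "observed" temperature anomaly: CO2 term + AMO term + noise
temp = 2.5 * np.log(co2 / 295.0) + 0.1 * amo + rng.normal(0, 0.05, years.size)

# Ordinary least squares: temp ~ b0 + b1*log(CO2) + b2*AMO
X = np.column_stack([np.ones(years.size), np.log(co2), amo])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)

# "Adjusted" temperature: remove every fitted influence except CO2
adjusted = temp - beta[0] - beta[2] * amo

# Coefficient of determination of the adjusted series against log(CO2)
pred = beta[1] * np.log(co2)
r2 = 1 - np.sum((adjusted - pred) ** 2) / np.sum((adjusted - adjusted.mean()) ** 2)
```

    On such synthetic data the adjusted series depends almost linearly on log CO2, mirroring the reported coefficient of determination of about 0.91.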

  8. DrCell – A Software Tool for the Analysis of Cell Signals Recorded with Extracellular Microelectrodes

    Directory of Open Access Journals (Sweden)

    Christoph Nick

    2013-09-01

    Full Text Available Microelectrode arrays (MEAs) have been applied for in vivo and in vitro recording and stimulation of electrogenic cells, namely neurons and cardiac myocytes, for almost four decades. Extracellular recordings using the MEA technique inflict minimal adverse effects on cells and enable long-term applications such as implants in brain or heart tissue. Hence, MEAs are a powerful tool for studying the processes of learning and memory, investigating the pharmacological impact of drugs, and characterizing the basic electrical interface between novel electrode materials and biological tissue. Yet in order to study these areas, powerful signal processing and data analysis tools are necessary. In this paper a novel toolbox for the offline analysis of cell signals is presented that allows a variety of parameters to be detected and analyzed. We developed an intuitive graphical user interface (GUI) that enables users to perform high-quality data analysis. The presented MATLAB®-based toolbox can examine a multitude of parameters, such as spike and neural burst timestamps, network bursts, heart beat frequency and signal propagation for cardiomyocytes, signal-to-noise ratio, and many more. Additionally, a spike-sorting tool is included, which is valuable when multiple cells are recorded on a single microelectrode. For stimulation experiments, artifacts caused by the stimulation signal can be removed from the recording, allowing the detection of field potentials as early as 5 ms after the stimulation.

  9. Validity of a hospital-based obstetric register using medical records as reference

    DEFF Research Database (Denmark)

    Brixval, Carina Sjöberg; Thygesen, Lau Caspar; Johansen, Nanna Roed;

    2015-01-01

    and validity of a hospital-based clinical register - the Obstetric Database - using a national register and medical records as references. METHODS: We assessed completeness of a hospital-based clinical register - the Obstetric Database - by linking data from all women registered in the Obstetric Database...... as having given birth in 2013 to the National Patient Register with coverage of all births in 2013. Validity of eleven selected indicators from the Obstetric Database was assessed using medical records as the gold standard. Using a random sample of 250 medical records, we calculated proportion of agreement......, sensitivity, specificity, and positive and negative predictive values for each indicator. Two assessors independently reviewed medical records and inter-rater reliability was calculated as proportion of agreement and Cohen's κ coefficient. RESULTS: We found 100% completeness of the Obstetric Database when......

  10. Analysis of patent value evaluation and recorded value based on real option theory

    Institute of Scientific and Technical Information of China (English)

    徐文静

    2013-01-01

    In this paper, we introduce real option theory and its application to patent valuation, briefly analyze the advantages and disadvantages of the approach, and present a suitable formula for assessing patent value.

  11. Deciphering the record of short-term base-level changes in Gilbert-type deltas

    Science.gov (United States)

    Gobo, Katarina; Ghinassi, Massimiliano; Nemec, Wojciech

    2016-04-01

    -front accommodation driven by short-term base-level changes, with some accompanying inevitable 'noise' in the facies record due to the system's autogenic variability and regional climatic fluctuations. Comparison of coeval foreset and toeset/bottomset deposits in a delta further shows a reverse pattern of reciprocal changes in facies assemblages, with the TFA assemblage of foreset deposits passing downdip into a DFA assemblage of delta-foot deposits, and the DFA assemblage of foreset deposits passing downdip into a TFA assemblage. This reverse reciprocal alternation of TFA and DFA facies assemblages is attributed to the delta slope's own morphodynamics. When the delta slope is dominated by deposition of debris flows, only the most diluted turbulent flows and chute-bypassing turbidity currents reach the delta-foot zone. When the delta slope is dominated by turbiditic sedimentation, larger chutes and gullies form, triggering and conveying debris flows to the foot zone. These case studies as a whole shed new light on the varying pattern of subaqueous sediment dispersal in an evolving Gilbert-type deltaic system and point to the attractive possibility of recognizing a 'hidden' record of base-level changes through detailed facies analysis.

  12. Secure Management of Personal Health Records by Applying Attribute-Based Encryption

    NARCIS (Netherlands)

    Ibraimi, Luan; Asim, Muhammad; Petkovic, Milan

    2009-01-01

    The confidentiality of personal health records is a major problem when patients use commercial Web-based systems to store their health data. Traditional access control mechanisms, such as Role-Based Access Control, have several limitations with respect to enforcing access control policies and ensuri

  13. Using the Java language to develop computer based patient records for use on the Internet.

    OpenAIRE

    Zuckerman, A. E.

    1996-01-01

    The development of the Java Programming Language by Sun Microsystems has provided a new tool for the development of Internet based applications. Our preliminary work has shown how Java can be used to program an Internet based CBPR. Java is well suited to the needs of patient records and can interface with clinical data repositories written in MUMPS or SQL.

  14. Development of an algorithm for heartbeats detection and classification in Holter records based on temporal and morphological features

    Science.gov (United States)

    García, A.; Romano, H.; Laciar, E.; Correa, R.

    2011-12-01

    In this work a detection and classification algorithm for heartbeat analysis in Holter records was developed. First, a QRS complex detector was implemented and the temporal and morphological characteristics of the complexes were extracted. A feature vector was built from these characteristics; this vector is the input of the classification module, which is based on discriminant analysis. The beats were classified into three groups: Premature Ventricular Contraction beat (PVC), Atrial Premature Contraction beat (APC) and Normal Beat (NB). These beat categories represent the most important groups in commercial Holter systems. The developed algorithms were evaluated on 76 ECG records of two validated open-access databases, the "MIT-BIH Arrhythmia Database" and the "MIT-BIH Supraventricular Arrhythmia Database". A total of 166343 beats were detected and analyzed; the QRS detection algorithm provides a sensitivity of 99.69% and a positive predictive value of 99.84%. The classification stage gives sensitivities of 97.17% for NB, 97.67% for PVC and 92.78% for APC.
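    As a minimal illustration of the temporal-feature side of such an algorithm (not the authors' implementation, and with a synthetic signal in place of MIT-BIH data), R-peaks can be detected with a threshold-and-refractory-period rule, and premature beats flagged from the RR intervals:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 360                      # Hz, the sampling rate of the MIT-BIH records
t = np.arange(0, 10, 1 / fs)

# Synthetic "ECG": narrow Gaussian R-waves every 0.8 s, one beat shifted early
beat_times = list(np.arange(0.5, 9.5, 0.8))
beat_times[5] -= 0.3          # simulate a premature beat
ecg = sum(np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2)) for bt in beat_times)
ecg = ecg + 0.01 * np.random.default_rng(1).normal(size=t.size)

# Detection: amplitude threshold plus a 200 ms refractory period
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.2 * fs))
r_times = t[peaks]

# Temporal feature: an RR interval much shorter than the median flags a premature beat
rr = np.diff(r_times)
premature = np.where(rr < 0.8 * np.median(rr))[0]   # rr[i] ends at beat i+1
```

    A real classifier would add morphological descriptors of each QRS complex to this feature vector before the discriminant-analysis stage.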

  15. Development of an algorithm for heartbeats detection and classification in Holter records based on temporal and morphological features

    International Nuclear Information System (INIS)

    In this work a detection and classification algorithm for heartbeat analysis in Holter records was developed. First, a QRS complex detector was implemented and the temporal and morphological characteristics of the complexes were extracted. A feature vector was built from these characteristics; this vector is the input of the classification module, which is based on discriminant analysis. The beats were classified into three groups: Premature Ventricular Contraction beat (PVC), Atrial Premature Contraction beat (APC) and Normal Beat (NB). These beat categories represent the most important groups in commercial Holter systems. The developed algorithms were evaluated on 76 ECG records of two validated open-access databases, the 'MIT-BIH Arrhythmia Database' and the 'MIT-BIH Supraventricular Arrhythmia Database'. A total of 166343 beats were detected and analyzed; the QRS detection algorithm provides a sensitivity of 99.69% and a positive predictive value of 99.84%. The classification stage gives sensitivities of 97.17% for NB, 97.67% for PVC and 92.78% for APC.

  16. A Lower Rhine flood chronology based on the sedimentary record of an abandoned channel fill

    Science.gov (United States)

    Toonen, W. H. J.; Winkels, T. G.; Prins, M. A.; de Groot, L. V.; Bunnik, F. P. M.; Cohen, K. M.

    2012-04-01

    The Bienener Altrhein is an abandoned channel of the Lower Rhine (Germany). Following a late 16th century abandonment event, the channel was disconnected from the main stream and the oxbow lake gradually filled with 8 meters of flood deposits, a process that continues today. During annual floods, a limited proportion of overbank discharge is routed across the oxbow lake. Large floods produce individual flood layers, which are visually recognizable in the sedimentary sequence. Based on the sedimentary characteristics of these event layers, we created a ~450-year flood chronology for the Lower Rhine. Laser-diffraction grain-size measurements were used to assess relative flood magnitudes for individual flood event layers. Continuous sampling at a ~2 cm interval provided a high-resolution record, resolving it at an annual scale. Standard descriptive techniques (e.g., mean grain size, 95th percentile, % sand) and the more advanced 'end member modelling' were applied to zoom in on the coarse particle bins in the grain-size distributions, which are indicative of higher flow velocities. The most recent part of the record was matched to modern discharge measurements. This makes it possible to establish relations between the grain-size characteristics of deposits in the abandoned channel and flood magnitudes in the main river. The relation can also be applied to flood event layers from previous centuries, for which only water-level measurements and historical descriptions exist. The method is therefore relevant for extending the data series used in flood frequency analysis from 100 years to more than 400 years. To date event layers in the rapidly accumulated sequence, we created an age-depth model that uses organic-content variations to tune sedimentation rates between the known basal and top ages. No suitable identifiable organic material for radiocarbon dating was found in the cores. Instead, palynological results (introduction of agricultural species) and palaeomagnetic secular
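    The standard descriptive grain-size metrics named in the abstract (mean, 95th percentile, % sand) are simple to compute. The sketch below ranks invented flood layers by their 95th percentile as a relative-magnitude proxy (the layer labels and grain-size samples are synthetic, not the Bienener Altrhein data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic grain-size samples (µm) for three hypothetical flood layers
layers = {
    "layer A": rng.lognormal(mean=4.2, sigma=0.6, size=500),
    "layer B": rng.lognormal(mean=3.6, sigma=0.6, size=500),
    "layer C": rng.lognormal(mean=4.8, sigma=0.6, size=500),
}

def descriptors(grains_um):
    """Standard descriptive grain-size metrics used as flood-magnitude proxies."""
    return {
        "mean_um": float(np.mean(grains_um)),
        "d95_um": float(np.percentile(grains_um, 95)),     # 95th percentile
        "pct_sand": float(np.mean(grains_um > 63) * 100),  # > 63 µm counts as sand
    }

# A coarser 95th percentile indicates a larger relative flood magnitude
ranked = sorted(layers, key=lambda k: descriptors(layers[k])["d95_um"], reverse=True)
```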

  17. Remote heartbeat signal detection from visible spectrum recordings based on blind deconvolution

    Science.gov (United States)

    Kaur, Balvinder; Moses, Sophia; Luthra, Megha; Ikonomidou, Vasiliki N.

    2016-05-01

    While recent advances have shown that it is possible to acquire a signal equivalent to the heartbeat from visible-spectrum video recordings of human skin, extracting the heartbeat's exact timing information from it, for the purpose of heart rate variability analysis, remains a challenge. In this paper, we explore two novel methods to estimate the peak positions of the remote cardiac signal, aiming at a close representation of the R-peaks of the ECG signal. The first method is based on curve fitting (CF) using a modified filtered least mean square (LMS) optimization, and the second is based on system estimation using blind deconvolution (BDC). To prove the efficacy of the developed algorithms, we compared the results with the ground-truth (ECG) signal. Both methods achieved a low relative error between the peaks of the two signals. This work, performed under an IRB-approved protocol, provides initial proof that blind deconvolution techniques can be used to estimate timing information of the cardiac signal closely correlated with that obtained by traditional ECG. The results show promise for further development of remote sensing of cardiac signals for remote vital sign and stress detection in medical, security, military and civilian applications.

  18. The computer based patient record: a strategic issue in process innovation.

    Science.gov (United States)

    Sicotte, C; Denis, J L; Lehoux, P

    1998-12-01

    Reengineering of the workplace through information technology is an important strategic issue for today's hospitals. The computer-based patient record (CPR) is one technology that has the potential to profoundly modify the work routines of the care unit. This study investigates a CPR project aimed at allowing physicians and nurses to work in a completely electronic environment. The focus of our analysis was the patient nursing care process. The rationale behind the introduction of this technology was its alleged capability both to enhance quality of care and to control costs, by better managing the flow of information within the organization and by introducing mechanisms such as the timeless and spaceless organization of the workplace, de-localization, and automation of work processes. The present case study analyzed the implementation of a large CPR project ($45 million U.S.) conducted in four hospitals in a joint venture with two computer firms. The computerized system had to be withdrawn because of boycotts from both the medical and nursing personnel, yet user resistance as such was not the problem. Despite its failure, this project was a good opportunity to better understand the intricate complexity of introducing technology into professional work, where the usefulness of information is short-lived and it is difficult to predetermine the relevancy of information. Profound misconceptions in achieving a tighter fit (synchronization) between care processes and information processes were the main problems. PMID:9871877

  19. PC analysis of bit records enhances drilling operations in southern Alabama; Case history

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.M. (Helmerich and Payne IDC (US))

    Kelly No. 1 was drilled on a footage basis in Escambia County, AL. Computer bit-record analysis provided summarized information in a format easily read and understood by field personnel. With a PC used to compare surrounding bit records on a cost-per-foot basis, the best bit type, weight on bit (WOB), and bit speed for a given section of hole were identified. From the analysis, field personnel were able to improve bit selection, resulting in three bit runs having the lowest cost per foot in a given section of hole and an overall lower-cost-per-foot drilling operation.
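    The cost-per-foot comparison behind such an analysis uses the standard drilling economics formula: cost per foot = (bit cost + rig rate x (drilling time + trip time)) / footage. A sketch with hypothetical bit records (all numbers invented for illustration):

```python
def cost_per_foot(bit_cost, rig_rate, drilling_hours, trip_hours, footage):
    """Standard drilling economics: (bit cost + rig time cost) / footage drilled."""
    return (bit_cost + rig_rate * (drilling_hours + trip_hours)) / footage

# Hypothetical offset-well bit records for one hole section
rig_rate = 900.0  # $/h
records = [
    {"bit": "A", "bit_cost": 8000.0, "drilling_hours": 40.0, "trip_hours": 8.0, "footage": 1200.0},
    {"bit": "B", "bit_cost": 12000.0, "drilling_hours": 30.0, "trip_hours": 8.0, "footage": 1500.0},
]

# The cheapest bit per foot drilled wins the next bit selection
best = min(records, key=lambda rec: cost_per_foot(rec["bit_cost"], rig_rate,
                                                  rec["drilling_hours"], rec["trip_hours"],
                                                  rec["footage"]))
```

    Note that the more expensive bit can still win on cost per foot if it drills faster or lasts longer in the section.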

  20. Computer analysis of sound recordings from two Anasazi sites in northwestern New Mexico

    Science.gov (United States)

    Loose, Richard

    2002-11-01

    Sound recordings were made at a natural outdoor amphitheater in Chaco Canyon and in a reconstructed great kiva at Aztec Ruins. Recordings included computer-generated tones and swept sine waves, classical concert flute, Native American flute, conch shell trumpet, and prerecorded music. Recording equipment included an analog tape deck, a digital MiniDisc recorder, and direct digital recording to a laptop computer disk. Microphones and geophones were used as transducers. The natural amphitheater lies between the ruins of Pueblo Bonito and Chetro Ketl. It is a semicircular arc in a sandstone cliff measuring 500 ft. wide and 75 ft. high. The radius of the arc was verified with aerial photography, and an acoustic ray trace was generated using CAD software. The arc is in an overhanging cliff face and brings distant sounds to a line focus; along this line, there are unusual acoustic effects at conjugate foci. Time-history analysis of recordings from both sites showed that a 60-dB reverb decay lasted from 1.8 to 2.0 s, nearly ideal for public performances of music. Echoes from the amphitheater were perceived to be upshifted in pitch, but this was not seen in FFT analysis. Geophones placed on the floor of the great kiva showed a resonance at 95 Hz.
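    A 60-dB reverberation time like the 1.8-2.0 s reported here is conventionally estimated by Schroeder backward integration of an impulse response. A sketch on a synthetic decay (the sampling rate and decay constant are illustrative, not the Chaco recordings):

```python
import numpy as np

fs = 44100
t = np.arange(0, 3.0, 1 / fs)

# Synthetic impulse response: noise with an exponential envelope set for RT60 = 1.9 s
rt60_true = 1.9
env = np.exp(-3.0 * np.log(10) * t / rt60_true)   # -60 dB of energy at t = RT60
ir = env * np.random.default_rng(3).normal(size=t.size)

# Schroeder backward integration -> energy decay curve (EDC) in dB
edc = np.cumsum(ir[::-1] ** 2)[::-1]
edc_db = 10 * np.log10(edc / edc[0])

# Fit the -5 dB to -25 dB portion and extrapolate the slope to -60 dB (T20 method)
mask = (edc_db <= -5) & (edc_db >= -25)
slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
rt60_est = -60.0 / slope
```

    The backward integration smooths the noisy instantaneous decay, which is why the slope fit is done on the EDC rather than on the raw squared response.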

  1. Patients covertly recording clinical encounters: threat or opportunity? A qualitative analysis of online texts.

    Directory of Open Access Journals (Sweden)

    Maka Tsulukidze

    Full Text Available The phenomenon of patients covertly recording clinical encounters has generated controversial media reports. This study aims to examine the phenomenon and analyze the underlying issues. We conducted a qualitative analysis of online posts, articles, blogs, and forums (texts) discussing patients covertly recording clinical encounters. Using the Google and Google Blog search engines, we identified and analyzed 62 eligible texts published in multiple countries between 2006 and 2013. Thematic analysis revealed four key themes: (1) a new behavior that elicits strong reactions, both positive and negative, (2) an erosion of trust, (3) shifting patient-clinician roles and relationships, and (4) the existence of confused and conflicting responses. When patients covertly record clinical encounters - a behavior made possible by various digital recording technologies - strong reactions are evoked among a range of stakeholders. The behavior represents one consequence of an erosion of trust between patients and clinicians, and, when discovered, leads to further deterioration of trust. Confused and conflicting responses to the phenomenon by patients and clinicians highlight the need for policy guidance. This study describes strong reactions, both positive and negative, to the phenomenon of patients covertly recording clinical encounters. The availability of smartphones capable of digital recording, and shifting attitudes to patient-clinician relationships, seem to have led to this behavior, mostly viewed as a threat by clinicians but as a welcome and helpful innovation by some patients, possibly indicating a perception of subordination and a lack of empowerment. Further examination of this tension and its implications is needed.

  2. Early Warning and Risk Estimation methods based on Unstructured Text in Electronic Medical Records to Improve Patient Adherence and Care

    OpenAIRE

    Sairamesh, Jakka; Rajagopal, Ram; Nemana, Ravi; Argenbright, Keith

    2009-01-01

    In this paper we present risk-estimation models and methods for early detection of patient non-adherence based on unstructured text in patient records. The primary objectives are to perform early interventions on patients at risk of non-adherence and improve outcomes. We analyzed over 1.1 million visit notes corresponding to 30,095 Cancer patients, spread across 12 years of Oncology practice. Our risk analysis, based on a rich risk-factor dictionary, revealed that a staggering 30% of the pati...

  3. Flood Risk Analysis Using Non-Stationary Models: Application to 1500 Records and Assessment of Predictive Ability

    Science.gov (United States)

    Luke, A.; Sanders, B. F.; Aghakouchak, A.; Vrugt, J. A.; Matthew, R.

    2015-12-01

    Urbanization and shifts in precipitation patterns have altered the risk of inland flooding. Methods to assess flood risk, such as flood frequency analysis, are based on the key assumption of stationarity (ST). Under the ST assumption, the behavior of the hydroclimatic system (precipitation, temperature) and watershed is assumed to be time invariant. This assumption is quite restrictive and perhaps not accurate for flood risk assessment in watersheds that have undergone significant urbanization. Consequently, there is an urgent need for statistical methods that can account explicitly for system non-stationarity (NS) in the analysis and quantification of flood risks. One approach is to use time-variant parameters in an extreme value distribution. This approach, called NEVA, has been shown to improve the statistical representation of observed data (within-sample), yet NEVA has not been comprehensively evaluated for predictive analysis (out-of-sample). We apply NEVA to 1,548 records of observed annual maximum discharges with the goal to (1) assess which of the two approaches (ST/NS) to the Log-Pearson Type III (LPIII) distribution best describes the statistical representation of future flood risks, and (2) determine which diagnostic is most suitable for model selection (NS/ST). To explore these questions, we use the first half of each flood record for inference of the LPIII model parameters via MCMC simulation with the DREAM(ZS) algorithm, and the second half of the record for evaluation purposes (predictive analysis). Our results show that in about 70% of the records with a trend, the LPIII ST model performed better during evaluation than the LPIII NS model - unless the record with a trend is more than 55 years long, in which case the NS model is always preferred. If trend classification of the 1,548 records was done using summary metrics of watershed processes (runoff coefficient) rather than the peak discharges, the performance of the NS model improved.
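    A much-simplified stand-in for this split-sample design can be sketched with a method-of-moments LPIII fit (scipy's pearson3 takes skewness, mean and standard deviation directly), in place of the MCMC/DREAM(ZS) inference used in the study; the discharge record below is synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic, stationary 110-year annual-maximum discharge record (m^3/s)
log_q = stats.pearson3.rvs(skew=0.5, loc=6.0, scale=0.4, size=110, random_state=rng)
q = np.exp(log_q)

# Split-sample design: first half for inference, second half held out for evaluation
train, held_out = np.log(q[:55]), np.log(q[55:])

# Method-of-moments LPIII fit (Pearson III on log-discharge): skewness, mean, std
params = (stats.skew(train), train.mean(), train.std(ddof=1))

# Out-of-sample predictive score: mean log-density of the held-out half
score = stats.pearson3.logpdf(held_out, *params).mean()

# 100-year flood estimate from the calibrated model
q100 = np.exp(stats.pearson3.ppf(1 - 1 / 100, *params))
```

    Comparing such held-out scores between a stationary fit and a time-variant-parameter fit is the essence of the ST-versus-NS evaluation described above.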

  4. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
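    The core idea of geometric data perturbation - rotate, translate, and lightly noise the numeric records so that pairwise distances, and hence distance-based mining results, are approximately preserved - can be sketched as follows (toy data and parameters, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy numeric health-record table: rows = patients, cols = numeric attributes
X = rng.normal(size=(100, 4)) * [15.0, 20.0, 1.5, 30.0] + [120.0, 80.0, 5.5, 200.0]

# Geometric data perturbation: random rotation + translation + small additive noise
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))    # random orthogonal matrix
t = rng.normal(scale=50.0, size=4)              # random translation
noise = rng.normal(scale=0.01, size=X.shape)

Y = X @ Q.T + t + noise    # perturbed table handed to the cloud provider

# Rotation and translation preserve pairwise Euclidean distances (up to the noise),
# so distance-based mining on Y approximates mining on the hidden X
d_orig = np.linalg.norm(X[0] - X[1])
d_pert = np.linalg.norm(Y[0] - Y[1])
```

    The data owner keeps (Q, t) secret; the cloud sees only Y, yet clustering or nearest-neighbor queries on Y give nearly the same answers as on X.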

  5. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  6. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Directory of Open Access Journals (Sweden)

    S. Balasubramaniam

    2015-01-01

    Full Text Available Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  7. Spironolactone use and renal toxicity: population based longitudinal analysis.

    OpenAIRE

    Wei, L; Struthers, A D; Fahey, T; Watson, A D; MacDonald, T. M.

    2010-01-01

    Objective To determine the safety of spironolactone prescribing in the setting of the UK National Health Service. Design Population based longitudinal analysis using a record linkage database. Setting Tayside, Scotland. Population All patients who received one or more dispensed prescriptions for spironolactone between 1994 and 2007. Main outcome measures Rates of prescribing for spironolactone, hospital admissions for hyperkalaemia, and hyperkalaemia and renal function without...

  8. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained and segmented into a palm region and separate finger regions. Acquisition of the image is performed without particular orientation or placement restrictions, and segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis reuses commonly seen terms in the Zernike calculations to achieve additional efficiency over traditional Zernike moment calculation.
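    The Zernike radial polynomial at the heart of these descriptors is a short formula whose factorial terms recur across orders - the commonly seen terms the abstract alludes to reusing. A sketch (an illustrative implementation, not the patent's):

```python
from math import factorial

def zernike_radial(n, m, rho):
    """Radial polynomial R_n^|m|(rho) of the Zernike basis; zero unless n - |m| is even."""
    m = abs(m)
    if (n - m) % 2:
        return 0.0
    return sum(
        (-1) ** k * factorial(n - k)
        / (factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k))
        * rho ** (n - 2 * k)
        for k in range((n - m) // 2 + 1)
    )
```

    Caching the factorial terms (e.g. with functools.lru_cache) is one way to realize the term reuse mentioned above; a useful sanity check is that R_n^m(1) = 1 for every valid (n, m).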

  9. Patients covertly recording clinical encounters: threat or opportunity? A qualitative analysis of online texts

    NARCIS (Netherlands)

    Tsulukidze, M.; Grande, S.W.; Thompson, R.; Rudd, K.; Elwyn, G.

    2015-01-01

    BACKGROUND: The phenomenon of patients covertly recording clinical encounters has generated controversial media reports. This study aims to examine the phenomenon and analyze the underlying issues. METHODS AND FINDINGS: We conducted a qualitative analysis of online posts, articles, blogs, and forums

  10. Analysis of condensed matter physics records in databases. Science and technology indicators in condensed matter physics

    International Nuclear Information System (INIS)

    An analysis of the literature on Condensed Matter Physics, with particular emphasis on High Temperature Superconductors, was performed on the contents of the bibliographic database International Nuclear Information System (INIS). Quantitative data were obtained on various characteristics of the relevant INIS records such as subject categories, language and country of publication, publication types, etc. The analysis opens up the possibility for further studies, e.g. on international research co-operation and on publication patterns. (author)

  11. Receiver Function Analysis using Ocean-bottom Seismometer Records around the Kii Peninsula, Southwestern Japan

    Science.gov (United States)

    Akuhara, T.; Mochizuki, K.

    2014-12-01

    Recent progress in receiver function (RF) analysis has provided new insight into subsurface structure, and the method is now increasingly applied to records of ocean-bottom seismometers (OBSs). In the present study, we conducted RF analysis using OBS records at 32 observation sites around the Kii Peninsula, southwestern Japan, from 2003 to 2007 (Mochizuki et al., 2010, GRL), addressing problems caused by water reverberations. We first checked the effects of water reverberations on the OBS vertical-component records by calculating vertical P-wave RFs (Langston and Hammer, 2001, BSSA), where the OBS vertical-component records were deconvolved by stacked traces of on-land records as source functions. The resultant RFs showed strong peaks corresponding to the water reverberations. Referring to these RFs, we constructed inverse filters to remove the effects of water reverberations from the vertical-component records; the filters are specified by two parameters, the two-way travel time within the water layer and the reflection coefficient at the seafloor. We then calculated radial RFs using the filtered, reverberation-free, vertical-component records of the OBS data as source functions. The resultant RFs showed that some phases at later times became clearer than those obtained by the ordinary method. From comparison with a previous tomography model (Akuhara et al., 2013, GRL), we identified phases originating from the oceanic Moho, which delineates the relationship between earthquake depth and the oceanic Moho: seaward intraslab seismicity is high within the oceanic crust, while landward seismicity is high within the oceanic mantle. This pattern may be related to the dehydration process.
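    A two-parameter inverse filter of the kind described - specified by the two-way travel time in the water column and the seafloor reflection coefficient - can be illustrated on a synthetic trace. Assuming the reverberation train takes the classic form of (-r)^k spikes at multiples of the water-layer delay, its exact inverse is a two-term operator (all numbers below are illustrative):

```python
import numpy as np

fs = 100.0   # samples per second
r = 0.4      # assumed seafloor reflection coefficient
tw = 0.6     # assumed two-way travel time in the water column (s)
n = int(round(tw * fs))

# A clean vertical-component "trace": two impulsive arrivals
src = np.zeros(1000)
src[100], src[300] = 1.0, -0.5

# Water-layer reverberation operator: sum over k of (-r)^k at delay k*tw
reverb = np.zeros(1000)
for k in range(1000 // n):
    reverb[k * n] = (-r) ** k
recorded = np.convolve(src, reverb)[:1000]

# Two-parameter inverse filter (1 + r z^-n): collapses the reverberation train
inv = np.zeros(n + 1)
inv[0], inv[n] = 1.0, r
restored = np.convolve(recorded, inv)[:1000]
```

    In the z-domain the reverberation series is 1/(1 + r z^-n), so convolving with (1 + r z^-n) restores the original arrivals, exactly as the two-parameter filter in the abstract is meant to do.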

  12. Fetal QRS extraction from abdominal recordings via model-based signal processing and intelligent signal merging.

    Science.gov (United States)

    Haghpanahi, Masoumeh; Borkholder, David A

    2014-08-01

    Noninvasive fetal ECG (fECG) monitoring has potential applications in diagnosing congenital heart diseases in a timely manner and in helping clinicians make more appropriate decisions during labor. However, despite advances in signal processing and machine learning techniques, the analysis of fECG signals remains at a preliminary stage. In this work, we describe an algorithm to automatically locate QRS complexes in noninvasive fECG signals obtained from a set of four electrodes placed on the mother's abdomen. The algorithm is based on an iterative decomposition of the maternal and fetal subspaces and filtering of the maternal ECG (mECG) components from the fECG recordings. Once the maternal components are removed, a novel merging technique is applied to merge the signals and detect the fetal QRS (fQRS) complexes. The algorithm was trained and tested on the fECG datasets provided by the PhysioNet/CinC challenge 2013. The final results indicate that the algorithm is able to detect fetal peaks for a variety of signals with different morphologies and strength levels encountered in clinical practice.
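The iterative subspace decomposition used by the authors is beyond a short sketch, but the core step of removing maternal components before fetal peak detection can be illustrated with a simple R-peak-aligned template subtraction. All signals, peak positions, and amplitudes below are synthetic and purely illustrative:

```python
import numpy as np

def template_subtract(signal, peaks, half_width):
    """Remove a repeating waveform: average the windows around the
    given peak locations into a template, then subtract that template
    at every peak. Returns the residual signal."""
    windows = np.array([signal[p - half_width:p + half_width]
                        for p in peaks
                        if p - half_width >= 0 and p + half_width <= len(signal)])
    template = windows.mean(axis=0)
    residual = signal.astype(float).copy()
    for p in peaks:
        if p - half_width >= 0 and p + half_width <= len(signal):
            residual[p - half_width:p + half_width] -= template
    return residual

# Synthetic abdominal trace: maternal spikes (amplitude 1.0) every
# 100 samples, fetal spikes (amplitude 0.3) every 60 samples.
n = 1000
sig = np.zeros(n)
m_peaks = np.arange(50, n, 100)   # assumed-known maternal R peaks
f_peaks = np.arange(20, n, 60)
sig[m_peaks] += 1.0
sig[f_peaks] += 0.3
res = template_subtract(sig, m_peaks, half_width=5)
```

After subtraction the maternal spikes are cancelled while the smaller fetal spikes survive, which is the precondition for the fQRS detection stage.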

  13. Noninvasive method for electrocardiogram recording in conscious rats: feasibility for heart rate variability analysis

    Directory of Open Access Journals (Sweden)

    Pedro P. Pereira-Junior

    2010-06-01

    Full Text Available Heart rate variability (HRV) analysis is a well-established tool for the assessment of cardiac autonomic control, both in humans and in animal models. Conventional methods for HRV analysis in rats rely on conscious-state electrocardiogram (ECG) recording that requires prior invasive surgical procedures for electrode/transmitter implants. The aim of the present study was to test a noninvasive and inexpensive method for ECG recording in conscious rats, assessing its feasibility for HRV analysis. A custom-made elastic cotton jacket was developed to fit the rat's mean thoracic circumference, with two pieces of platinum electrodes attached to its inner surface, allowing ECG to be recorded noninvasively in conscious, restrained rats (n=6). Time- and frequency-domain HRV analyses were conducted under basal and autonomic blockade conditions. High-quality ECG signals were obtained and proved feasible for HRV analysis. As expected, the mean RR interval was significantly decreased in the presence of atropine.
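As a hedged illustration of the time-domain HRV measures mentioned (the study's exact analysis pipeline is not reproduced here), standard indices such as SDNN and RMSSD can be computed directly from a sequence of RR intervals:

```python
import math

def hrv_time_domain(rr_ms):
    """Common time-domain HRV indices from RR intervals in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all RR intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    # RMSSD: root mean square of successive differences
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd}

# Toy RR series (ms); real recordings would supply hundreds of intervals
indices = hrv_time_domain([800, 810, 790, 805])
```

Under vagal blockade with atropine one would expect the mean RR interval and the vagally mediated RMSSD to fall, which is the kind of change these indices quantify.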

  14. Influence of weather factors on population dynamics of two lagomorph species based on hunting bag records

    NARCIS (Netherlands)

    Rödel, H.; Dekker, J.J.A.

    2012-01-01

    Weather conditions can have a significant influence on short-term fluctuations of animal populations. In our study, which is based on time series of hunting bag records of up to 28 years from 26 counties of The Netherlands and Germany, we investigated the impact of different weather variables on ann

  15. 13 CFR 106.303 - Who has authority to approve and sign a Fee Based Record?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who has authority to approve and... Activities § 106.303 Who has authority to approve and sign a Fee Based Record? The Administrator, or upon his... consultation with the General Counsel (or designee), has the authority to approve and sign each Fee...

  16. 49 CFR 1544.230 - Fingerprint-based criminal history records checks (CHRC): Flightcrew members.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Fingerprint-based criminal history records checks (CHRC): Flightcrew members. 1544.230 Section 1544.230 Transportation Other Regulations Relating to Transportation (Continued) TRANSPORTATION SECURITY ADMINISTRATION, DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY AIRCRAFT OPERATOR...

  17. IMASIS computer-based medical record project: dealing with the human factor.

    Science.gov (United States)

    Martín-Baranera, M; Planas, I; Palau, J; Sanz, F

    1995-01-01

    level, problems to be solved in utilization of the system, errors detected in the system's database, and the personal interest in participating in the IMASIS project. The questionnaire was also intended to be a tool to monitor IMASIS evolution. Our study showed that medical staff had a lack of information about the current HIS, leading to poor utilization of some system options. Another major finding, related to the above, was the feeling that the project would negatively affect the organization of work at the hospitals. A computer-based medical record was feared to degrade the physician-patient relationship, introduce a supplementary administrative burden into clinicians' day-to-day work, unnecessarily slow history taking, and imply too-rigid patterns of work. The most frequent problems in using the current system could be classified into two groups: problems related to lack of agility and consistency in user interface design, and those derived from lack of a common patient identification number. Duplication of medical records was the most frequent error detected by physicians. Analysis of physicians' attitudes towards IMASIS revealed a global lack of confidence. This was probably the consequence of two current features: a lack of complete information about IMASIS possibilities and problems faced when using the system. To deal with such factors, three types of measures have been planned. First, an effort is to be made to ensure that every physician is able to adequately use the current system and understands the long-term benefits of the project. This task will be better accomplished by personal interaction between clinicians and a physician from the Informatics Department than through formal teaching of IMASIS. Secondly, a protocol for evaluating the HIS is being developed and will be systematically applied to detect both database errors and system design pitfalls. Finally, the IMASIS project has to find a convenient point for starting, to offer short-term re

  19. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes: (1) it can prepare the raw signal by several methods; (2) it can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. The rates of change of Ca can also be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely a simplified, self-contained analysis workflow.
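The spreadsheet's regression analysis is not reproducible here, but the underlying idea of extracting a rate of change from a Ca transient can be sketched with a linear regression on the logarithm of a (synthetic, assumed mono-exponential) fluorescence decay; all constants below are illustrative:

```python
import numpy as np

# Synthetic Ca transient: mono-exponential decay F(t) = F0 * exp(-k t)
t = np.linspace(0.0, 1.0, 101)    # time (s)
k_true = 3.0                      # decay rate constant (1/s)
f = 2.5 * np.exp(-k_true * t)     # fluorescence (arbitrary units)

# Linear regression of ln(F) against t: the slope is -k
slope, intercept = np.polyfit(t, np.log(f), 1)
k_est = -slope
```

Fitting several such segments at once (rise, peak, decay) is what "multiple, simultaneous regression" refers to in the abstract.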

  20. Seismic simulation analysis of a nuclear reactor building using observed earthquake records

    International Nuclear Information System (INIS)

    In this paper, to verify the effectiveness of dynamic response analysis technique, simulation analyses using observed records of five different earthquakes are performed for the reactor building of Unit 6 of the Fukushima Daiichi Nuclear Power Plant. A sway-rocking model (SR model) with embedment effect is adopted for the analyses. The model properties of the structure and soil springs are estimated by using the results of the forced vibration test. The soil properties are estimated by referring to the observed records of free field and the soil test data. The flow of the process for establishing the model properties is shown

  1. Characteristics of solar diurnal variations: a case study based on records from the ground magnetic observatory at Vassouras, Brazil

    CERN Document Server

    Klausner, Virginia; Mendes, Odim; Domingues, Margarete O; Frick, Peter

    2011-01-01

    The horizontal component amplitudes observed by ground-based observatories of the INTERMAGNET network have been used to analyze the global pattern variance of the solar diurnal variations. Data from magnetic stations present gaps in their records, and consequently we explored them via a time-frequency gapped wavelet algorithm. After computing the gapped wavelet transform, we performed wavelet cross-correlation analysis, which was useful to isolate the periods of the spectral components of the geomagnetic field at each of the selected magnetic stations and to correlate them, as a function of scale (period), with the low-latitude Vassouras Observatory, Rio de Janeiro, Brazil, which is under the South Atlantic Magnetic Anomaly (SAMA) influence and should serve as a reference for an under-construction Brazilian network of magnetic observatories. The results show that the records at the magnetic stations have a latitudinal dependence affected by the season of the year and by the level of solar activity. We have found a disparity on ...

  2. A new photopolymerizable holographic recording material based on acrylamide and N-hydroxymethyl acrylamide

    Institute of Scientific and Technical Information of China (English)

    Gong Qiao-Xia; Wang Su-Lian; Huang Ming-Ju; Gan Fu-Xi

    2005-01-01

    A new polyvinylalcohol-based photopolymeric holographic recording material has been developed. The recording is obtained by the copolymerization of acrylamide and N-hydroxymethyl acrylamide. Diffraction efficiencies near 50% are obtained with an exposure energy of 80 mJ/cm2. N-hydroxymethyl acrylamide can improve the optical quality of the film. With increasing concentration of N-hydroxymethyl acrylamide, the flatness of the film increases, scattering is reduced, and the image is clearer, with smaller distortion. The postexposure effect on the grating is also studied. The diffraction efficiency of the grating increases further during postexposure, as a monomer concentration gradient persists after exposure.

  3. In Pursuit of Reciprocity: Researchers, Teachers, and School Reformers Engaged in Collaborative Analysis of Video Records

    Science.gov (United States)

    Curry, Marnie W.

    2012-01-01

    In the ideal, reciprocity in qualitative inquiry occurs when there is give-and-take between researchers and the researched; however, the demands of the academy and resource constraints often make the pursuit of reciprocity difficult. Drawing on two video-based, qualitative studies in which researchers utilized video records as resources to enhance…

  4. The (Anomalous) Hall Magnetometer as an Analysis Tool for High Density Recording Media

    NARCIS (Netherlands)

    Haan, de S.; Lodder, J.C.

    1991-01-01

    In this work an evaluation tool for the characterization of high-density recording thin film media is discussed. The measurement principles are based on the anomalous and the planar Hall effect. We used these Hall effects to characterize ferromagnetic Co-Cr films and Co/Pd multilayers having perpend

  5. An integrable, web-based solution for easy assessment of video-recorded performances.

    Science.gov (United States)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars

    2014-01-01

    Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems; however, managing video recordings and assessment sheets is troublesome and may lead to missing or incorrect data. Currently, no existing software provides a local solution for the management of videos and assessments, but this is necessary because assessment scores are confidential information and access to them should be restricted to select personnel. A local software solution may also ease customization to local needs and integration into existing user databases or project-management software. We developed an integrable, web-based solution for easy assessment of video-recorded performances (ISEA). PMID:24833946

  6. Smart Card Based Integrated Electronic Health Record System For Clinical Practice

    Directory of Open Access Journals (Sweden)

    N. Anju Latha

    2012-10-01

    Full Text Available Smart cards are used in information technologies as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems has become popular due to their increased capacity and performance. Their efficient use, with easy and fast data access, has led to widespread implementation, particularly in hospitals, where smart cards serve as an Electronic Health Record (EHR). In this paper, a smart-card-based integrated Electronic Health Record system is developed. The system uses the smart card for personal identification and transfer of health data and provides data communication. In addition to personal information, general health information about the patient is also loaded onto the patient's smart card. Health care providers use smart cards to access data on patient cards. Electronic health records have a number of advantages over the paper record: they improve the accuracy and quality of patient care while reducing cost and increasing efficiency and productivity. In the present work we measure clinical parameters of the patient such as blood pressure, blood glucose (diabetes mellitus), and pulse oxygen saturation, and store the health details in the Electronic Health Record. The system has been successfully tested and implemented.

  7. The Private Communications of Magnetic Recording under Socialism (Retrospective Disco Analysis)

    Directory of Open Access Journals (Sweden)

    Oleg Vladimir Sineokij

    2013-07-01

    Full Text Available The article analyzes the formation and development of a general model of rare sound records within the structure of institutions of social communication. The author considers the psychocommunicative features of philophone communication as a special type of interaction in the field of entertainment, and studies the causes and conditions of the tape subculture in the USSR. The dynamics of disco-communication are traced from the limited information conditions of socialism to modern high-tech conditions. At the end of the article, building on achievements in the field of advanced technology systems and the innovative revival of the music-recording industry, and using innovative approaches, the author sets out a basic concept of popular-music recording as a special informational and legal institution, in retrospect and with regard to the theory and practice of the future needs of the information society.

  8. Temporomandibular joint sounds: a critique of techniques for recording and analysis.

    Science.gov (United States)

    Widmer, C G

    1989-01-01

    Sonography, or the graphic recording of sounds, has been proposed as an objective measure of various pathological conditions in the temporomandibular joint. Various electronic devices have been developed to enhance our ability to auscultate the joint, monitor the timing of the sounds with jaw movement, and analyze the characteristics of the sound; the intent of these devices is to diagnose the intracapsular condition "objectively." This review paper critically evaluates the advantages and limitations of this technique. Based on the existing literature, these instruments can record sounds; however, the origin of these sounds is uncertain, since room noise, skin and hair sounds, respiration, arterial blood flow, and cross-over noises from the opposite TMJ have not been excluded as possible artifacts of the recording. More important, the diagnostic specificity, as an indicator of each type of TMJ disease, has not been clearly and consistently demonstrated with the sonographic technique.

  9. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility-graph-based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of networks of networks.
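A minimal sketch of the natural visibility graph mapping that underlies this method (the O(n^2) pairwise check below is the textbook construction, not necessarily the authors' exact implementation):

```python
def visibility_edges(series):
    """Natural visibility graph: nodes are time points; (a, b) is an
    edge if every intermediate point lies strictly below the straight
    line connecting (a, y_a) and (b, y_b)."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b]
                + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges
```

For the toy series [1, 5, 2, 4], the peak at index 1 blocks the line of sight between indices 0 and 2 (and 0 and 3), so only the adjacent pairs and (1, 3) are connected.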

  10. Factors affecting the quality of sound recording for speech and voice analysis.

    Science.gov (United States)

    Vogel, Adam P; Morgan, Angela T

    2009-01-01

    The importance and utility of objective evidence-based measurement of the voice is well documented. Therefore, greater consideration needs to be given to the factors that influence the quality of voice and speech recordings. This manuscript aims to bring together the many features that affect acoustically acquired voice and speech. Specifically, the paper considers the practical requirements of individual speech acquisition configurations through examining issues relating to hardware, software and microphone selection, the impact of environmental noise, analogue to digital conversion and file format as well as the acoustic measures resulting from varying levels of signal integrity. The type of recording environment required by a user is often dictated by a variety of clinical and experimental needs, including: the acoustic measures being investigated; portability of equipment; an individual's budget; and the expertise of the user. As the quality of recorded signals is influenced by many factors, awareness of these issues is essential. This paper aims to highlight the importance of these methodological considerations to those previously uninitiated with voice and speech acoustics. With current technology, the highest quality recording would be made using a stand-alone hard disc recorder, an independent mixer to attenuate the incoming signal, and insulated wiring combined with a high quality microphone in an anechoic chamber or sound treated room.

  11. Web application for recording learners’ mouse trajectories and retrieving their study logs for data analysis

    Directory of Open Access Journals (Sweden)

    Yoshinori Miyazaki

    2012-03-01

    Full Text Available With the accelerated implementation of e-learning systems in educational institutions, it has become possible in recent years to record learners' study logs. Little research, however, has been conducted on the analysis of the study logs obtained. In addition, there is no software that traces the mouse movements of learners during their learning processes, which the authors believe would enable teachers to better understand their students' behaviors. The objective of this study is to develop a Web application that records students' study logs, including their mouse trajectories, and to devise an IR tool that can summarize such diversified data. The results of an experiment are also scrutinized to provide an analysis of the relationship between learners' activities and their study logs.

  12. Role-based access control through on-demand classification of electronic health record.

    Science.gov (United States)

    Tiwari, Basant; Kumar, Abhay

    2015-01-01

    Electronic health records (EHRs) provide a convenient method to exchange medical information of patients between different healthcare providers. An access control mechanism in healthcare services determines which users are authorised to access EHR records. Role-Based Access Control helps to restrict EHRs to users in a certain role. Significant work has been carried out on access control over the last decade, but little emphasis has been given to on-demand role-based access control. The presented work achieves access control through physical data isolation, which is more robust and secure. We propose an algorithm in which a selective combination of policies for each user of the EHR database is defined. We extend the well-known data mining technique of classification to group EHRs with respect to a given role. The algorithm takes the various roles as classes and defines their features as a vector; these features are used as a feature vector for classification to describe user authority.

  14. Signals embedded in the OBS records, in light of Gabor Spectral Analysis

    Science.gov (United States)

    Chang, T.; Wang, Y.; Chang, C.; Lee, C.

    2005-12-01

    In recent decades, seismological surveys have expanded into marine areas, with the goal of making up for the deficiency of seismogenic studies beyond land. Although teleseismic data can resolve plate boundary locations and certain seismic parameters for great earthquakes, the local seismogenic framework can only emerge from a seismic network in situ. The Ocean Bottom Seismometer (OBS) has therefore been developed for this purpose and is becoming an important facility for seismological study. This work introduces a synthesized spectral method to analyze the seismograms recorded by 15 OBSs deployed at the Okinawa Trough over 14 days (Nov. 19 ~ Dec. 2, 2003). The geological background of the Okinawa Trough is well known to correspond with back-arc spreading in the regime of the Philippine Sea plate subducting northward beneath the Eurasia plate. Owing to complex influences at the sea bottom, for instance strong currents, slope slumping, turbidite flows, and even sea animal attack, OBS seismograms are rather noisy in comparison with records on land. However, hundreds of tectonic earthquakes can be extracted from such noisy records (done by Drs. Lin and Sibuet). Our job is to sort out the signals with distinguishable sources by means of a systematic spectral analysis. The continuous wavelet transform and the short-term Fourier transform, both taking a Gaussian function as kernel, are synthesized as the Gabor transform in data processing. The use of a limited Gaussian window along the time axis with negligible low-frequency error can largely enhance the stability of the discrete Fourier spectrum. With a proper window factor selection, the Gabor transform can improve the resolution of the spectrogram in the time domain. We have converted the OBS records into spectrograms to detect the variation of signal causes. To date, some tremor signals and strong current oscillations have been told apart from these continuous records by their varying frequency content. We anticipate the further
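A hedged sketch of the Gaussian-windowed Fourier (Gabor) analysis described above: multiplying a segment by a Gaussian window before the DFT localizes the spectrum in time, one such spectrum per window position forming a column of the spectrogram. The sampling rate, window width, and test tone below are illustrative, not the survey's parameters.

```python
import numpy as np

def gabor_spectrum(x, fs, center, sigma):
    """Magnitude spectrum of x around sample `center`, using a
    Gaussian window of standard deviation `sigma` samples
    (one column of a Gabor spectrogram)."""
    n = np.arange(len(x))
    window = np.exp(-0.5 * ((n - center) / sigma) ** 2)
    spectrum = np.abs(np.fft.rfft(x * window))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, spectrum

# A 10 Hz test tone sampled at 100 Hz
fs = 100.0
t = np.arange(512) / fs
x = np.sin(2 * np.pi * 10.0 * t)
freqs, spec = gabor_spectrum(x, fs, center=256, sigma=40.0)
peak_freq = freqs[np.argmax(spec)]
```

Sliding `center` across the record and stacking the columns yields the spectrogram used to tell tremor signals apart from current oscillations by their frequency content over time.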

  15. Implications of the Java language on computer-based patient records.

    OpenAIRE

    Pollard, D.; Kucharz, E.; Hammond, W.E.

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal so...

  16. Comparison of experimental approaches to study selective properties of thick phase-amplitude holograms recorded in materials with diffusion-based formation mechanisms

    Science.gov (United States)

    Borisov, Vladimir; Klepinina, Mariia; Veniaminov, Andrey; Angervaks, Aleksandr; Shcheulin, Aleksandr; Ryskin, Aleksandr

    2016-04-01

    Volume holographic gratings, both transmission and reflection type, may be employed as one-dimensional photonic crystals. More complex two- and three-dimensional holographic photonic-crystalline structures can be recorded using several properly organized beams. As compared to colloidal photonic crystals, their holographic counterparts minimize distortions caused by multiple inner boundaries of the media. Unfortunately, it is still hard to analyze the spectral response of holographic structures. This work presents the results of thick holographic grating analysis based on approximation of spectral-angular selectivity contours. The gratings were recorded in an additively colored fluorite crystal and a glassy polymer doped with phenanthrenequinone (PQ-PMMA). The two materials, known as promising candidates for 3D diffraction optics including photonic crystals, employ diffusion-based mechanisms of grating formation. The surfaces of spectral-angular selectivity were obtained in a single scan using a white-light LED, a rotatable table, and a matrix spectrometer. The data, expressed as 3D plots, allow straightforward visual estimation of the grating's phase/amplitude nature, nonlinearity of recording, etc., and provide sufficient information for numerical analysis. The grating recorded in the crystal was found to be of mixed phase-amplitude type, with different contributions of refractive index and absorbance modulation at different wavelengths, and demonstrated three diffraction orders corresponding to its three spatial harmonics originating from the intrinsically nonlinear diffusion-drift recording mechanism. In contrast, the grating in the polymeric medium appeared purely phase and linearly recorded.

  17. Natural Language Processing Based Instrument for Classification of Free Text Medical Records

    Directory of Open Access Journals (Sweden)

    Manana Khachidze

    2016-01-01

    Full Text Available According to the Ministry of Labor, Health and Social Affairs of Georgia, a new health management system is to be introduced in the near future. In this context arises the problem of structuring and classifying documents containing the entire history of medical services provided. The present work introduces an instrument for classification of medical records in the Georgian language; it is the first attempt at such classification of Georgian-language medical records. In total, 24,855 examination records have been studied. The documents were classified into three main groups (ultrasonography, endoscopy, and X-ray) and 13 subgroups using two well-known methods: Support Vector Machine (SVM) and K-Nearest Neighbor (KNN). The results obtained demonstrated that both machine learning methods performed successfully, with SVM performing slightly better. In the process of classification a "shrink" method, based on feature selection, was introduced and applied. At the first stage of classification the results of the "shrink" case were better; however, at the second stage of classification into subclasses, 23% of all documents could not be linked to a single definite subclass (liver or biliary system) due to common features characterizing these subclasses. The overall results of the study were successful.
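As an illustration of the K-Nearest-Neighbor route described in this record (the toy documents, tokens, and labels below are invented for the sketch; the study's actual Georgian-language feature extraction is not reproduced):

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two token-count vectors."""
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def knn_predict(train, text, k=3):
    """Classify `text` by majority vote of its k most similar
    training documents (bag-of-words cosine similarity)."""
    vec = Counter(text.split())
    scored = sorted(train,
                    key=lambda d: cosine(Counter(d[0].split()), vec),
                    reverse=True)
    votes = Counter(label for _, label in scored[:k])
    return votes.most_common(1)[0][0]

# Toy corpus standing in for examination records
train = [
    ("liver scan echo texture", "ultrasonography"),
    ("abdominal echo probe liver", "ultrasonography"),
    ("gastric mucosa endoscope biopsy", "endoscopy"),
    ("endoscope duodenum mucosa", "endoscopy"),
    ("chest film radiograph lung", "x-ray"),
]
```

The "shrink" step described in the abstract would correspond to pruning the token vocabulary to the most discriminative features before computing these similarities.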

  18. Multi-image Photogrammetry for Underwater Archaeological Site Recording: An Accessible, Diver-Based Approach

    Science.gov (United States)

    McCarthy, John; Benjamin, Jonathan

    2014-06-01

    This article presents a discussion of recent advances in underwater photogrammetric survey, illustrated by case studies in Scotland and Denmark between 2011 and 2013. Results from field trials are discussed with the aim of illustrating practical low-cost solutions for recording underwater archaeological sites in 3D using photogrammetry and using this data to offer enhanced recording, interpretation and analysis. We argue that the availability of integrated multi-image photogrammetry software, highly light-sensitive digital sensors and wide-aperture compact cameras now allows for simple work flows with minimal equipment and excellent natural colour images even at depths of up to 30 m. This has changed the possibilities for underwater photogrammetric recording, which can now be done on a small scale, through the use of a single camera and automated work flow. The intention of this paper is to demonstrate the quality and versatility of the `one camera/ambient light/integrated software' technique through the case studies presented and the results derived from this process. We also demonstrate how the 3D data generated can be subjected to surface analysis techniques to enhance detail and to generate data-driven fly-throughs and reconstructions, opening the door to new avenues of engagement with both specialists and the wider public.

  19. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording

    Science.gov (United States)

    Crescentini, Marco; Bennati, Marco; Saha, Shimul Chandra; Ivica, Josip; de Planque, Maurits; Morgan, Hywel; Tartagni, Marco

    2016-01-01

    High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. An HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the range ±200 pA and ±20 nA, with 100 kHz bandwidth while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at the ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to validate both the platform and the current-to-digital converter. PMID:27213382
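
    As a quick plausibility check of the figures quoted above (assuming, as a simplification we introduce here, a flat noise spectrum), integrating the 4 fA/√Hz floor over the 100 kHz bandwidth gives the total input-referred RMS noise current:

```python
import math

# Flat-spectrum integration of the quoted noise floor over the bandwidth.
noise_density = 4e-15   # input-referred noise, A/sqrt(Hz), at the ±200 pA range
bandwidth = 100e3       # amplifier bandwidth, Hz
i_rms = noise_density * math.sqrt(bandwidth)
print(f"{i_rms * 1e12:.2f} pA")  # 1.26 pA
```

    Roughly 1.3 pA RMS over the full bandwidth is consistent with resolving the pA-scale single-channel currents the platform targets.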

  20. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording

    Directory of Open Access Journals (Sweden)

    Marco Crescentini

    2016-05-01

    Full Text Available High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. An HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the range ±200 pA and ±20 nA, with 100 kHz bandwidth while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at the ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to validate both the platform and the current-to-digital converter.

  2. Multi-periodic climate dynamics: spectral analysis of long-term instrumental and proxy temperature records

    Directory of Open Access Journals (Sweden)

    H.-J. Lüdecke

    2012-09-01

    Full Text Available The longest six instrumental temperature records of monthly means reach back maximally to 1757 AD and were recorded in Europe. All six show a V-shape, with a temperature drop in the 19th century and a rise in the 20th century. Proxy temperature time series from Antarctic ice cores show the same characteristic shape, indicating that this pattern is a global phenomenon. We used the mean of the six instrumental records for analysis by discrete Fourier transformation (DFT), wavelets, and detrended fluctuation analysis (DFA). For comparison, a stalagmite record was also analyzed by DFT. The harmonic decomposition of the mean shows only six significant frequencies with periods longer than 30 yr. The Pearson correlation between the mean, smoothed by a 15 yr running average (boxcar), and the reconstruction using the six significant frequencies yields r = 0.961. This good agreement has a > 99.9% confidence level, confirmed by Monte Carlo simulations. Assumption of additional forcing by anthropogenic greenhouse gases would therefore not improve the agreement between measurement and the temperature reconstruction from the six documented periodicities. We find indications that the observed periodicities result from intrinsic system dynamics.
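
    The core of the analysis described above can be sketched as follows: keep only the strongest components of the discrete Fourier transform, reconstruct the series from them, and correlate the reconstruction with the original. The synthetic two-period series below is illustrative, not the station data.

```python
import numpy as np

# Synthetic "temperature" series with two periodicities plus noise.
rng = np.random.default_rng(0)
t = np.arange(250)  # yearly means, arbitrary units
signal = (np.sin(2 * np.pi * t / 62) + 0.5 * np.sin(2 * np.pi * t / 35)
          + 0.2 * rng.standard_normal(t.size))

spec = np.fft.rfft(signal)
power = np.abs(spec) ** 2
keep = np.argsort(power[1:])[-2:] + 1   # two strongest non-DC components

# Reconstruct from the retained components only.
filtered = np.zeros_like(spec)
filtered[0] = spec[0]
filtered[keep] = spec[keep]
recon = np.fft.irfft(filtered, n=t.size)

r = np.corrcoef(signal, recon)[0, 1]
print(sorted(keep.tolist()), round(r, 3))
```

    With six retained components on a smoothed series, as in the paper, the correlation rises accordingly; the principle is identical.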

  3. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  4. Optimal discrimination and classification of neuronal action potential waveforms from multiunit, multichannel recordings using software-based linear filters.

    Science.gov (United States)

    Gozani, S N; Miller, J P

    1994-04-01

    We describe advanced protocols for the discrimination and classification of neuronal spike waveforms within multichannel electrophysiological recordings. The programs are capable of detecting and classifying the spikes from multiple, simultaneously active neurons, even in situations where there is a high degree of spike waveform superposition on the recording channels. The protocols are based on the derivation of an optimal linear filter for each individual neuron. Each filter is tuned to respond selectively to the spike waveform generated by the corresponding neuron, and to attenuate noise and the spike waveforms of all other neurons. The protocol is essentially an extension of earlier work [1], [13], [18]; however, it extends the power and utility of the original implementations in two significant respects. First, a general single-pass automatic template estimation algorithm was derived and implemented. Second, the filters were implemented within a software environment providing a greatly enhanced functional organization and user interface. The utility of the analysis approach was demonstrated on samples of multiunit electrophysiological recordings from the cricket abdominal nerve cord.
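
    The central idea, a linear filter tuned to one unit's spike template, can be sketched as follows. This is a plain matched filter on synthetic waveforms; the paper's optimal filters additionally account for noise statistics and the other units' templates.

```python
import numpy as np

# Two made-up spike templates for two hypothetical units.
t = np.arange(-15, 16)
template_a = np.exp(-t**2 / 8.0) - 0.4 * np.exp(-(t - 6)**2 / 18.0)  # unit A
template_b = -np.exp(-t**2 / 20.0)                                   # unit B

# Synthetic trace: one spike from each unit plus background noise.
trace = np.zeros(500)
trace[100:131] += template_a
trace[300:331] += template_b
trace += 0.02 * np.random.default_rng(1).standard_normal(trace.size)

# Matched filter for unit A = time-reversed template (the full method
# would whiten against noise and suppress unit B's waveform explicitly).
out = np.convolve(trace, template_a[::-1], mode="same")
peak = int(np.argmax(out))
print(peak)  # the filter output peaks at the centre of unit A's spike
```

    Thresholding such filter outputs, one per unit, yields the detection-plus-classification behaviour described above, including in superposition cases where a single threshold on the raw trace would fail.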

  5. Paper-Based Medical Records: the Challenges and Lessons Learned from Studying Obstetrics and Gynaecological Post-Operation Records in a Nigerian Hospital

    Directory of Open Access Journals (Sweden)

    Adekunle Yisau Abdulkadir

    2010-10-01

    Full Text Available AIM: With the background knowledge that auditing of Medical Records (MR) for adequacy and completeness is necessary if they are to be useful and reliable in continuing patient care, in protecting the legal interests of the patient, physicians, and the hospital, and in meeting requirements for research, we scrutinized the theatre records of our hospital to identify routine omissions or deficiencies, and correctable errors, in our MR system. METHOD: Obstetrics and gynaecological post-operation theatre records between January 2006 and December 2008 were quantitatively and qualitatively analyzed for details that included: hospital number; patient's age; diagnosis; surgery performed; types and modes of anesthesia; date of surgery; patient's ward; anesthetists' names; surgeons' and attending nurses' names; and abbreviations used, with SPSS 15.0 for Windows. RESULTS: Hardly any of the 1270 surgeries during the study period were documented without an omission or an abbreviation. Hospital numbers and patients' ages were not documented in 21.8% (n=277) and 59.1% (n=750) of cases, respectively. Diagnoses and surgeries were recorded with varying abbreviations in about 96% of instances. Surgical team names were mostly abbreviated or given as initials only. CONCLUSION: To improve the quality of paper-based medical records, regular auditing, training and good orientation of medical personnel in good record practices, and discouraging large-volume record books to reduce paper damage and sheet loss from handling, are necessary; otherwise, what we record today may be neither useful nor available tomorrow. [TAF Prev Med Bull 2010; 9(5): 427-432]

  6. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Science.gov (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
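
    The agreement index used in Experiment I is simple to state in code: PABAK depends only on the observed proportion of agreement between the two analyses. The epoch counts below are made up for illustration.

```python
def pabak(n_agree, n_total):
    """Prevalence-adjusted bias-adjusted kappa: 2 * observed agreement - 1."""
    return 2 * n_agree / n_total - 1

# e.g. manual and automated analyses agreeing on 329 of 360
# ten-second epochs (made-up counts)
print(round(pabak(329, 360), 2))  # 0.83
```

    Unlike Cohen's kappa, PABAK is not depressed when one category (here, "no movement") dominates the epochs, which is why it suits this kind of sparse-event comparison.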

  7. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: ll.portes@gmail.com [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)

    2014-06-27

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA.
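
    A bare-bones version of the recurrence computation underlying RQA (without the time-delay embedding the full method would apply to s(t)) can be sketched as:

```python
import numpy as np

def recurrence_rate(s, eps):
    """Fraction of point pairs closer than eps (scalar series, no embedding)."""
    d = np.abs(s[:, None] - s[None, :])   # pairwise distance matrix
    return (d < eps).mean()

# A periodic signal revisits its states far more often than noise does.
t = np.linspace(0, 8 * np.pi, 200)
rr_periodic = recurrence_rate(np.sin(t), 0.1)
rr_noise = recurrence_rate(np.random.default_rng(2).standard_normal(200), 0.1)
print(rr_periodic > rr_noise)
```

    Coefficients such as determinism extend this by measuring diagonal-line structure in the thresholded matrix; the paper's point is that, for flows, all of these can depend strongly on which variable s(t) was recorded.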

  8. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.

  9. An empirical approach to predicting long term behavior of metal particle based recording media

    Science.gov (United States)

    Hadad, Allan S.

    1991-01-01

    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3-H2O, resulting in acicular, polycrystalline, body-centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed for iron particles to be considered a viable recording medium for long-term archival (i.e., 25+ years) information storage. The primary means of establishing stability is passivation, i.e., controlled oxidation of the iron particle's surface. Since iron particles used for magnetic recording are small, additional oxidation has a direct impact on performance, especially where archival storage of recorded information for long periods of time is important. Further stabilization chemistry and processes had to be developed to guarantee that iron particles could be considered a viable long-term recording medium. In an effort to retard the diffusion of iron ions through the oxide layer, other elements such as silicon, aluminum, and chromium have been added to the base iron to promote denser scale formation, to alleviate some of the non-stoichiometric behavior of the oxide, or both. The presence of water vapor has been shown to disrupt the passive layer, subsequently increasing the oxidation rate of the iron. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity on silicon-containing iron particles between 50-120 deg C and 3-89 percent relative humidity. The methodology by which the experimental data were collected and analyzed, leading to predictive capability, is discussed.
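
    The kind of empirical model such a temperature-humidity study leads to can be sketched as an Arrhenius term with a humidity acceleration factor. The functional form and all constants below are our assumptions for illustration, not the thesis's fitted values.

```python
import math

def loss_rate(temp_c, rh_percent, A=1.0, Ea=0.5, b=0.02):
    """Relative magnetization loss per unit time (illustrative form only).

    Arrhenius temperature dependence (activation energy Ea in eV) times
    an exponential humidity factor; A, Ea and b are made-up constants.
    """
    k_B = 8.617e-5                  # Boltzmann constant, eV/K
    T = temp_c + 273.15
    return A * math.exp(-Ea / (k_B * T)) * math.exp(b * rh_percent)

# The harshest chamber condition degrades faster than the mildest one.
print(loss_rate(120, 89) > loss_rate(50, 3))  # True
```

    Fitting such a model to accelerated-aging data at chamber conditions is what allows extrapolation to the 25+ year archival timescales mentioned above.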

  10. European temperature records of the past five centuries based on documentary information compared to climate simulations

    Science.gov (United States)

    Zorita, E.

    2009-09-01

    Two European temperature records for the past half-millennium, January-to-April air temperature for Stockholm (Sweden) and seasonal temperature for a Central European region, both derived from the analysis of documentary sources combined with long instrumental records, are compared with the output of forced (solar, volcanic, greenhouse gases) climate simulations with the model ECHO-G. The analysis is complemented with the long (early)-instrumental record of Central England Temperature (CET). Both approaches to study past climates (simulations and reconstructions) are burdened with uncertainties. The main objective of this comparative analysis is to identify robust features and weaknesses that may help to improve models and reconstruction methods. The results indicate a general agreement between simulations and the reconstructed Stockholm and CET records regarding the long-term temperature trend over the recent centuries, suggesting a reasonable choice of the amplitude of the solar forcing in the simulations and sensitivity of the model to the external forcing. However, the Stockholm reconstruction and the CET record also show a long and clear multi-decadal warm episode peaking around 1730, which is absent in the simulations. The uncertainties associated with the reconstruction method or with the simulated internal climate variability cannot easily explain this difference. Regarding the interannual variability, the Stockholm series displays in some periods higher amplitudes than the simulations but these differences are within the statistical uncertainty and further decrease if output from a regional model driven by the global model is used. The long-term trends in the simulations and reconstructions of the Central European temperature agree less well. The reconstructed temperature displays, for all seasons, a smaller difference between the present climate and past centuries than the simulations. Possible reasons for these differences may be related to a limitation

  11. Study on key techniques for camera-based hydrological record image digitization

    Science.gov (United States)

    Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping

    2015-10-01

    With the development of information technology, the digitization of scientific and engineering drawings has received more and more attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record the observations from sensors. However, these paper drawings may be damaged or contaminated due to improper preservation or overuse. Furthermore, manually transcribing these data into the computer is a heavy workload and prone to error. Hence, digitizing these drawings and establishing the corresponding database will ensure the integrity of the data and provide invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques, i.e., image segmentation, intersection point localization and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water level sheet image is proposed, based on the adaptive fusion of gradient and color information. Second, a fast search strategy for intersection point location is devised, so that point-by-point processing is avoided, with the help of grid distribution information. Finally, we put forward a local rectification method that analyzes the central portions of the image and utilizes domain knowledge of hydrology. The processing speed is accelerated, while the accuracy remains satisfactory. Experiments on several real water level records show that the proposed techniques are effective and capable of recovering the hydrological observations accurately.
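
    The first key technique, fusing gradient and color information for binarization, can be illustrated on a tiny synthetic sheet. The weighting scheme below is an assumed illustrative form, not the authors' algorithm.

```python
import numpy as np

def binarize(img, curve_rgb, w_color=0.7, w_grad=0.3, thresh=0.25):
    """Flag pixels by a weighted sum of colour similarity and gradient."""
    # Colour similarity to the expected curve colour, in [0, 1].
    color_sim = 1.0 - np.linalg.norm(img - curve_rgb, axis=-1) / np.sqrt(3)
    # Normalized gradient magnitude of the grayscale image.
    gy, gx = np.gradient(img.mean(axis=-1))
    grad = np.hypot(gx, gy)
    grad = grad / grad.max() if grad.max() > 0 else grad
    return w_color * color_sim + w_grad * grad > thresh

# Synthetic sheet: white paper with one dark curve row.
img = np.ones((8, 8, 3))
img[4, :, :] = 0.1   # dark curve pixels
mask = binarize(img, curve_rgb=np.array([0.1, 0.1, 0.1]))
print(mask[4].all(), mask[0].any())  # True False
```

    The colour term anchors the curve pixels themselves while the gradient term reinforces their edges, so faint curves on stained paper still separate from the uniform background.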

  12. An integrable, web-based solution for easy assessment of video-recorded performances

    Directory of Open Access Journals (Sweden)

    Subhi Y

    2014-05-01

    Full Text Available Yousif Subhi,1,2,3 Tobias Todsen,1,4 Lars Konge1,2 1Centre for Clinical Education, Centre for HR, The Capital Region of Denmark, Copenhagen, Denmark; 2University of Copenhagen, Copenhagen, Denmark; 3Clinical Eye Research Unit, Department of Ophthalmology, Copenhagen University Hospital Roskilde, Roskilde, Denmark; 4Department of Internal Medicine, Queen Ingrid's Hospital, Nuuk, Greenland. Abstract: Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems; however, managing video recordings and assessment sheets is troublesome and may lead to missing or incorrect data. Currently, no existing software solution can provide a local solution for the management of videos and assessments, but this is necessary as assessment scores are confidential information, and access to this information should be restricted to select personnel. A local software solution may also ease the need for customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video-recorded performances (ISEA). Keywords: education assessment, assessment software, video-based assessment

  13. Beat-to-beat analysis method for magnetocardiographic recordings during interventions

    International Nuclear Information System (INIS)

    Multichannel magnetocardiography (MCG) during exercise testing has been shown to detect myocardial ischaemia in patients with coronary artery disease. Previous studies on exercise MCG have focused on one or few time intervals during the recovery period and only a fragment of the data available has been utilized. We present a method for beat-to-beat analysis and parametrization of the MCG signal. The method can be used for studying and quantifying the changes induced in the MCG by interventions. We test the method with data recorded in bicycle exercise testing in healthy volunteers and patients with coronary artery disease. Information in all cardiac cycles recorded during the recovery period of exercise MCG testing is, for the first time, utilized in the signal analysis. Exercise-induced myocardial ischaemia was detected by heart rate adjustment of change in magnetic field map orientation. In addition to the ST segment, the T wave in the MCG was also found to provide information related to myocardial ischaemia. The method of analysis efficiently utilizes the spatial and temporal properties of multichannel MCG mapping, providing a new tool for detecting and quantifying fast phenomena during interventional MCG studies. The method can also be applied to an on-line analysis of MCG data. (author)

  14. A Satellite-Based Surface Radiation Climatology Derived by Combining Climate Data Records and Near-Real-Time Data

    Directory of Open Access Journals (Sweden)

    Bodo Ahrens

    2013-09-01

    Full Text Available This study presents a method for adjusting long-term climate data records (CDRs) for integrated use with near-real-time data, using the example of surface incoming solar irradiance (SIS). Recently, a 23-year (1983–2005) continuous SIS CDR has been generated based on the visible channel (0.45–1 μm) of the MVIRI radiometers onboard the geostationary Meteosat First Generation platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF). Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions, consisting of MVIRI-derived SIS (1983–2005) and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards), were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present) was applied that combines the Standard Normal Homogeneity Test (SNHT) and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the ground-station mean, in accordance with statistical significance. Several stations of the Baseline Surface Radiation Network (BSRN) and about 50 stations of the Global Energy Balance Archive (GEBA) over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994, probably due to artefacts in the raw data and instrument failures. After 2005 the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations. Homogenisation is performed by applying a mean-shift correction depending on the shift size of
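
    The first ingredient of the break-detection procedure, the SNHT statistic, is compact enough to sketch. This is the classic single-shift form of the test on a synthetic series, not the CM SAF pipeline.

```python
import numpy as np

def snht(x):
    """Single-shift SNHT: max over split points k of k*z1^2 + (n-k)*z2^2."""
    z = (x - x.mean()) / x.std()
    n = len(z)
    T = [k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
         for k in range(1, n)]
    k_hat = int(np.argmax(T)) + 1       # most likely break position
    return max(T), k_hat

# Synthetic difference series (satellite minus station mean) with a
# mean shift injected after sample 60.
rng = np.random.default_rng(3)
series = np.concatenate([rng.normal(0.0, 0.3, 60), rng.normal(1.0, 0.3, 40)])
T0, k = snht(series)
print(k)  # the detected break lands near sample 60
```

    In practice the statistic is compared against a significance threshold that depends on the series length, and, as described above, candidate breaks are confirmed by visual inspection before any mean-shift correction is applied.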

  15. [Design and Implementation of a Mobile Operating Room Information Management System Based on Electronic Medical Record].

    Science.gov (United States)

    Liu, Baozhen; Liu, Zhiguo; Wang, Xianwen

    2015-06-01

    A mobile operating room information management system with electronic medical record (EMR) is designed to improve work efficiency and to enhance the patient information sharing. In the operating room, this system acquires the information from various medical devices through the Client/Server (C/S) pattern, and automatically generates XML-based EMR. Outside the operating room, this system provides information access service by using the Browser/Server (B/S) pattern. Software test shows that this system can correctly collect medical information from equipment and clearly display the real-time waveform. By achieving surgery records with higher quality and sharing the information among mobile medical units, this system can effectively reduce doctors' workload and promote the information construction of the field hospital. PMID:26485982
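
    Generating an XML-based record from acquired device values, as the system does after C/S acquisition, can be sketched with the standard library. The element names below are invented for illustration, not the system's actual schema.

```python
import xml.etree.ElementTree as ET

# Build a minimal operation-record fragment from acquired device values.
record = ET.Element("OperationRecord")
ET.SubElement(record, "Patient", id="P-001")
vitals = ET.SubElement(record, "Vitals", source="monitor")
ET.SubElement(vitals, "HeartRate", unit="bpm").text = "72"
ET.SubElement(vitals, "SpO2", unit="percent").text = "98"

xml_bytes = ET.tostring(record, encoding="utf-8")
print(xml_bytes.decode())
```

    Serializing to XML keeps the record machine-readable for the B/S access service outside the operating room while remaining human-inspectable.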

  17. Development of Co-Cr-based longitudinal magnetic recording media: Thermodynamic consideration

    Institute of Scientific and Technical Information of China (English)

    QIN Gao-wu; K. Oikawa

    2004-01-01

    This paper reviews our recent work on the development of Co-Cr-based longitudinal magnetic recording media from the point of view of thermodynamics. It focuses on our experimental finding of the miscibility gap in the fcc α-Co phase region of the Co-Cr binary system, and on predictions of improvements in the magnetic properties of many Co-Cr-Z ternary systems by thermodynamic computation on the basis of the newly assessed Co-Cr binary thermodynamic parameters. Good agreement in the phase separation behavior of many Co-Cr-Z (Z = Pt, Ta, Ge) alloy systems between the calculations and the experiments has been achieved, as discussed in detail in the full paper. By the same token, many other elements, such as Ir, P, B, Mo, Zr and Nb, have been predicted to improve the magnetic grain isolation of potential Co-Cr-Z multicomponent magnetic recording media in the future.

  18. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach of using the Design Structure Matrix (DSM) modeling technique to improve the design of Electronic Medical Record (EMR) user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians’ time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.
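
    The grouping step can be illustrated on a toy DSM: a binary matrix recording which interface elements exchange information, clustered here into blocks via connected components. The element names are hypothetical, not the study's EMR screens.

```python
import numpy as np

# Hypothetical UI elements of a dosage form.
labels = ["dose", "unit", "route", "frequency", "start", "stop"]

# dsm[i][j] = 1 if elements i and j exchange information.
dsm = np.array([
    [0, 1, 0, 0, 0, 0],   # dose  <-> unit
    [1, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 0, 0],   # route <-> frequency
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 1],   # start <-> stop
    [0, 0, 0, 0, 1, 0],
])

def clusters(m):
    """Connected components of the interaction graph, as index groups."""
    n = len(m)
    seen, groups = set(), []
    for i in range(n):
        if i in seen:
            continue
        stack, comp = [i], []
        while stack:
            j = stack.pop()
            if j in seen:
                continue
            seen.add(j)
            comp.append(j)
            stack.extend(k for k in range(n) if m[j][k])
        groups.append(sorted(comp))
    return groups

for g in clusters(dsm):
    print([labels[i] for i in g])
```

    Placing each resulting block together on screen is the DSM-driven layout idea: elements that exchange information end up adjacent, which is what reduces the simulated time-on-task. The full method also weights spatial adjacency and similarity, not just information exchange.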

  19. Space and Astrophysical Plasmas : Matched filtering-parameter estimation method and analysis of whistlers recorded at Varanasi

    Indian Academy of Sciences (India)

    R P Singh; R P Patel; Ashok K Singh; D Hamar; J Lichtenberger

    2000-11-01

    The matched filtering technique is based on the digital construction of theoretical whistlers and their comparison with observed whistlers. The parameters estimated from the theoretical and experimental whistler curves are matched using digital filters to achieve higher accuracy. This yields a resolution ten times better in the time domain. We have tested the applicability of this technique for the analysis of whistlers recorded at Varanasi. It is found that the whistlers have propagated along field lines with L > 2 and, after exiting from the ionosphere, have wave normal angles such that they propagate towards the equator in the earth-ionosphere waveguide. High-resolution analysis shows fine structures in the dynamic spectrum. An effort is made to interpret the results.
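
A toy version of matched-filter dispersion estimation, assuming the Eckersley law t(f) = D/√f and modeling the whistler as a sum of delayed tones (real whistlers are continuous dispersed chirps, and the Varanasi analysis is far more refined): correlate the observed record against a bank of theoretical templates and pick the best match.

```python
import numpy as np

fs = 4000.0                      # sampling rate, Hz
t = np.arange(0, 3.0, 1.0 / fs)  # 3 s record

def whistler(D, freqs=np.arange(600.0, 2400.0, 100.0)):
    """Sum of tones, each delayed by the Eckersley dispersion t = D / sqrt(f)."""
    s = np.zeros_like(t)
    for f in freqs:
        delay = D / np.sqrt(f)
        s += np.where(t >= delay, np.sin(2 * np.pi * f * (t - delay)), 0.0)
    return s

rng = np.random.default_rng(0)
observed = whistler(D=40.0) + rng.normal(0.0, 2.0, t.size)  # D in s*sqrt(Hz)

# Matched filtering: correlate against a bank of theoretical whistlers
candidates = np.arange(20.0, 60.0, 1.0)
scores = [np.dot(observed, whistler(D)) for D in candidates]
D_hat = candidates[int(np.argmax(scores))]
print("estimated dispersion D =", D_hat)
```

Refining the candidate grid around the peak is what buys the improved time-domain resolution the abstract refers to.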

  20. Laboratory-based recording of holographic fine structure in X-ray absorption anisotropy using polycapillary optics

    Energy Technology Data Exchange (ETDEWEB)

    Dabrowski, K.M. [Institute of Physics, Jagiellonian University, Reymonta 4, 30-059 Krakow (Poland); Korecki, P., E-mail: pawel.korecki@uj.edu.pl [Institute of Physics, Jagiellonian University, Reymonta 4, 30-059 Krakow (Poland)

    2012-08-15

    Highlights: • Holographic fine structures in X-ray absorption recorded using a tabletop setup. • Setup based on polycapillary collimating optics and an HOPG crystal. • Demonstration of element sensitivity by detection of X-ray fluorescence. • Potential of laboratory-based experiments for heavily doped crystals and thin films. - Abstract: A tabletop setup composed of collimating polycapillary optics and a highly oriented pyrolytic graphite (HOPG) monochromator was characterized and used for recording two-dimensional maps of X-ray absorption anisotropy (XAA). XAA originates from interference of X-rays directly inside the sample. Depending on experimental conditions, fine structures in XAA can be interpreted in terms of X-ray holograms or X-ray standing waves and can be used for element-selective, atomic-resolved structural analysis. The implementation of polycapillary optics resulted in a two-order-of-magnitude gain in radiant intensity (photons/s/solid angle) as compared to a system without optics and enabled efficient recording of XAA with a resolution of 0.15° for Mo Kα radiation. Element sensitivity was demonstrated by acquisition of distinct XAA signals for Ga and As atoms in a GaAs (1 1 1) wafer by using X-ray fluorescence as a secondary signal. These results indicate the possibility of performing laboratory-based XAA experiments for heavily doped single crystals or thin films. So far, because of the weak holographic modulation of XAA, such experiments could only be performed using synchrotron radiation.

  1. A Multi-Scaler Recording System and its Application to Radiometric "Off-Line" Analysis

    International Nuclear Information System (INIS)

    In large complex reprocessing plants a great deal has been done over the past few years to provide in-line instrumentation for the contemporary analysis of process stream content and characteristics. However, these instruments have a qualitative rather than a quantitative part to play in the overall control of the plant. Quantitative information, which must be obtained for control and accounting purposes, demands and relies upon the efficient use of laboratory techniques and instrumentation for the precise analysis of representative samples taken from the process streams. These techniques, in particular those involving pulse counting systems, can be made automatic with modern instrumentation, such as will be described, in which the data is obtained in digital form in electronic stores (scalers). To support a large plant there will be many separate counting systems of this kind, independently controlled and therefore having no time correlation between them. The automatic recording system described in the paper provides a common data read-out facility for more than 50 independently operating counting systems, recording scaler data, together with associated sample and system identification and the absolute time occurrence of each read-out. The data can be recorded, in forms suitable for subsequent processing by a computer, on a variety of tape and card punches, serial and parallel printers or magnetic tape. In addition, the whole recording system, including the scalers in any one system, can be checked for correct operation on an automatic routine basis which does not interfere with the operation of other counting systems. It is concluded that the effective quantitative control of a plant rests on a rapid efficient sample analysis under laboratory conditions. It is probable that future developments of "off-line" facilities rather than on-line instrumentation will be possible and more worthwhile.
The desirable characteristics of instrumentation for such a laboratory

  2. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.
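
The wavelet idea can be sketched as follows: a low-frequency subband serves as a compact feature vector, and verification is a distance threshold. This is a generic one-level Haar decomposition with synthetic images and an arbitrary threshold, not the authors' actual scheme:

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar transform; returns the LL (approximation) subband."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # average row pairs
    return (a[:, 0::2] + a[:, 1::2]) / 2.0    # average column pairs

def verify(probe, enrolled, threshold):
    """Accept if the Euclidean distance between LL features is below threshold."""
    d = np.linalg.norm(haar2d(probe) - haar2d(enrolled))
    return d < threshold

rng = np.random.default_rng(1)
face = rng.uniform(0, 255, (64, 64))            # stand-in for an enrolled face image
same = face + rng.normal(0, 5, face.shape)      # same face, mild noise/lighting change
other = rng.uniform(0, 255, (64, 64))           # a different face

print(verify(same, face, threshold=200.0), verify(other, face, threshold=200.0))
```

The LL subband discards high-frequency detail, which is what makes such features cheap enough to compute on a PDA-class device.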

  3. Comparison of the Hazard Mapping System (HMS) fire product to ground-based fire records in Georgia, USA

    Science.gov (United States)

    Hu, Xuefei; Yu, Chao; Tian, Di; Ruminski, Mark; Robertson, Kevin; Waller, Lance A.; Liu, Yang

    2016-03-01

    Biomass burning has a significant and adverse impact on air quality, climate change, and various ecosystems. The Hazard Mapping System (HMS) detects fires using data from multiple satellite sensors in order to maximize its fire detection rate. However, to date, the detection rate of the HMS fire product for small fires has not been well studied, especially using ground-based fire records. This paper utilizes the 2011 fire information compiled from ground observations and burn authorizations in Georgia to assess the comprehensiveness of the HMS active fire product. The results show that detection rates of the hybrid HMS increase substantially by integrating multiple satellite instruments. The detection rate increases dramatically from 3% to 80% with an increase in fire size from less than 0.02 km2 to larger than 2 km2, resulting in detection of approximately 12% of all recorded fires which represent approximately 57% of the total area burned. The spatial pattern of detection rates reveals that grid cells with high detection rates are generally located in areas where large fires occur frequently. The seasonal analysis shows that overall detection rates in winter and spring (12% and 13%, respectively) are higher than those in summer and fall (3% and 6%, respectively), mainly because of higher percentages of large fires (>0.19 km2) that occurred in winter and spring. The land cover analysis shows that detection rates are 2-7 percentage points higher in land cover types that are prone to large fires such as forestland and shrub land.
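
The size-stratified detection rates reported above amount to a binned aggregation of per-fire records. With a small hypothetical table (the real study uses Georgia's 2011 ground records) it might look like:

```python
import pandas as pd

# Hypothetical ground-based fire records: burned area (km^2) and whether
# the HMS product detected the fire.
fires = pd.DataFrame({
    "area_km2": [0.01, 0.015, 0.05, 0.3, 0.5, 1.5, 2.5, 3.0, 0.02, 4.0],
    "detected": [0,    0,     0,    1,   0,   1,   1,   1,   0,    1],
})

# Size classes loosely following the abstract's thresholds (0.02, 0.19, 2 km^2)
bins = [0, 0.02, 0.19, 2.0, float("inf")]
labels = ["<=0.02", "0.02-0.19", "0.19-2", ">2"]
fires["size_class"] = pd.cut(fires["area_km2"], bins=bins, labels=labels)

rate = fires.groupby("size_class", observed=False)["detected"].mean()
print(rate)
```

The same groupby, keyed on grid cell, season, or land-cover class instead of size, reproduces the other stratified rates discussed in the abstract.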

  4. ANALYSIS-BASED SPARSE RECONSTRUCTION WITH SYNTHESIS-BASED SOLVERS

    OpenAIRE

    Cleju, Nicolae; Jafari, Maria; Plumbley, Mark D.

    2012-01-01

    Analysis based reconstruction has recently been introduced as an alternative to the well-known synthesis sparsity model used in a variety of signal processing areas. In this paper we convert the analysis exact-sparse reconstruction problem to an equivalent synthesis recovery problem with a set of additional constraints. We are therefore able to use existing synthesis-based algorithms for analysis-based exact-sparse recovery. We call this the Analysis-By-Synthesis (ABS) approach. We evaluate o...

  5. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. 
The advantage of this method is the standardization of the GRF analysis, the low time requirements allowing rapid analysis of a large number of trials in a short time, and the comparability of the variables obtained during different research measurements.
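
A minimal Python analogue of the extreme-detection step (the study used a MATLAB script; the GRF curves below are synthetic, real data would come from the force plates): locate the two Fz peaks and split the stance phase at the Fy sign change between braking and propulsion.

```python
import numpy as np

fs = 1000.0                        # force-plate sampling rate, Hz
t = np.arange(0.0, 0.6, 1.0 / fs)  # one stance phase (~0.6 s)
x = t / 0.6

# Synthetic GRF components (N): Fz has the typical two-peak shape,
# Fy is braking (negative) then propulsive (positive).
Fz = 700.0 * (np.sin(np.pi * x) + 0.3 * np.sin(3 * np.pi * x))
Fy = -50.0 * np.sin(2 * np.pi * x)

half = t.size // 2
i1 = int(np.argmax(Fz[:half]))                    # first vertical-force peak (loading)
i2 = half + int(np.argmax(Fz[half:]))             # second peak (push-off)
cross = int(np.argmax(Fy[1:] * Fy[:-1] < 0)) + 1  # Fy sign change

print(f"Fz peaks: {Fz[i1]:.0f} N at {t[i1]:.3f} s, {Fz[i2]:.0f} N at {t[i2]:.3f} s")
print(f"braking: 0.000-{t[cross]:.3f} s, propulsive: {t[cross]:.3f}-0.600 s")
```

Repeating this for both legs and differencing the per-leg values yields symmetry indices of the kind counted among the 78 standardized variables.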

  6. Similarities and differences of doctor-patient co-operated evidence-based medical record of treating digestive system diseases with integrative medicine compared with traditional medical records

    Institute of Scientific and Technical Information of China (English)

    Bo Li; Wen-Hong Shao; Yan-Da Li; Ying-Pan Zhao; Qing-Na Li; Zhao Yang; Hong-Cai Shang

    2016-01-01

    Following the concept of narrative evidence-based medicine, we consulted experts in integrative (Chinese and Western medicine) gastroenterology and in evidence-based medicine, distilled the theory of the doctor-patient co-operated medical record, established a model of such a record, compared it with traditional medical records, and analyzed its advantages and disadvantages. Reflection and outlook: the doctor-patient co-operated record may become an element of the methodological system for evaluating the efficacy of integrative treatment of spleen and stomach diseases. Objective: To establish the model of the doctor-patient co-operated record, based on the concepts of narrative evidence-based medicine and related theories on doctor-patient co-operated evidence-based medical records. Methods: We conducted a literature search in PubMed, followed the principles of narrative evidence-based medicine, and consulted experts in digestive diseases and EBM in both traditional Chinese medicine and Western medicine. Result: This research is a useful attempt to discuss the establishment of the doctor-patient co-operated evidence-based medical record guided by narrative evidence-based medicine. Conclusion: The doctor-patient co-operated medical record can become a key factor of the curative effect evaluation methodology system for integrated therapy of traditional Chinese medicine and Western medicine in spleen and stomach diseases.

  7. Development and programming of Geophonino: A low cost Arduino-based seismic recorder for vertical geophones

    Science.gov (United States)

    Soler-Llorens, J. L.; Galiana-Merino, J. J.; Giner-Caturla, J.; Jauregui-Eslava, P.; Rosa-Cintas, S.; Rosa-Herranz, J.

    2016-09-01

    The commercial data acquisition systems used for seismic exploration are usually expensive equipment. In this work, a low cost data acquisition system (Geophonino) has been developed for recording seismic signals from a vertical geophone. The signal goes first through an instrumentation amplifier, INA155, which is suitable for low amplitude signals like the seismic noise, and an anti-aliasing filter based on the MAX7404 switched-capacitor filter. After that, the amplified and filtered signal is digitized and processed by Arduino Due and registered in an SD memory card. Geophonino is configured for continuous registering, where the sampling frequency, the amplitude gain and the registering time are user-defined. The complete prototype is an open source and open hardware system. It has been tested by comparing the registered signals with the ones obtained through different commercial data recording systems and different kind of geophones. The obtained results show good correlation between the tested measurements, presenting Geophonino as a low-cost alternative system for seismic data recording.

  8. A Critical Ear: Analysis of Value Judgments in Reviews of Beethoven's Piano Sonata Recordings.

    Science.gov (United States)

    Alessandri, Elena; Williamson, Victoria J; Eiholzer, Hubert; Williamon, Aaron

    2016-01-01

    What sets a great music performance apart? In this study, we addressed this question through an examination of value judgments in written criticism of recorded performance. One hundred reviews of recordings of Beethoven's piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics' statements and the evaluation criteria that underpinned their judgments. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording. PMID:27065900

  9. Personal dose analysis of TLD glow curve data from individual monitoring records

    International Nuclear Information System (INIS)

    Radiation exposure of workers in Ghana have been estimated on the basis of personal dose records of the occupationally exposed in medical, industrial and research/teaching practices for the period 2008-09. The estimated effective doses for 2008 are 0.400, 0.495 and 0.426 mSv for medical, industrial and research/teaching practices, respectively. The corresponding collective effective doses are 0.128, 0.044 and 0.017 person-Sv, respectively. Similarly, the effective doses recorded in 2009 are 0.448, 0.545 and 0.388 mSv, respectively with corresponding collective effective doses of 0.108, 0.032 and 0.012 person-Sv, respectively. The study shows that occupational exposure in Ghana is skewed to the lower doses (between 0.001 and 0.500 mSv). A statistical analysis of the effective doses showed no significant difference at p < 0.05 among the means of the effective doses recorded in various practices. (authors)
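
Since collective effective dose (person-Sv) is the mean effective dose times the number of monitored workers, the 2008 figures quoted above imply approximate cohort sizes; a quick back-of-the-envelope check (rounding in the published figures makes these estimates approximate):

```python
# practice: (mean effective dose in mSv, collective dose in person-Sv), 2008
pairs_2008 = {
    "medical":    (0.400, 0.128),
    "industrial": (0.495, 0.044),
    "research":   (0.426, 0.017),
}

for practice, (mean_mSv, collective_Sv) in pairs_2008.items():
    # persons = collective dose / mean individual dose (converted mSv -> Sv)
    workers = collective_Sv / (mean_mSv / 1000.0)
    print(f"{practice}: ~{workers:.0f} monitored workers")
```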

  10. A Critical Ear: Analysis of Value Judgments in Reviews of Beethoven's Piano Sonata Recordings.

    Science.gov (United States)

    Alessandri, Elena; Williamson, Victoria J; Eiholzer, Hubert; Williamon, Aaron

    2016-01-01

    What sets a great music performance apart? In this study, we addressed this question through an examination of value judgments in written criticism of recorded performance. One hundred reviews of recordings of Beethoven's piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics' statements and the evaluation criteria that underpinned their judgments. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording.

  11. A quantitative analysis of signal reproduction from cylinder recordings measured via noncontact full surface mapping.

    Science.gov (United States)

    Nascè, Antony; Hill, Martyn; McBride, John W; Boltryk, Peter J

    2008-10-01

    Sound reproduction via a noncontact surface mapping technique has great potential for sound archives, aiming to digitize content from early sound recordings such as wax cylinders, which may otherwise be "unplayable" with a stylus. If the noncontact techniques are to be considered a viable solution for sound archivists, a method for quantifying the quality of the reproduced signal needs to be developed. In this study, a specially produced test cylinder recording, encoded with sinusoids, provides the basis for the first quantitative analysis of signal reproduction from the noncontact full surface mapping method. The sampling and resolution of the measurement system are considered with respect to the requirements for digital archiving of cylinder recordings. Two different methods of audio signal estimation from a discrete groove cross section are described and rated in terms of signal-to-noise ratio and total harmonic distortion. Noncontact and stylus methods of sound reproduction are then compared using the same test cylinder. It is shown that noncontact methods appear to have distinct advantages over stylus reproduction, in terms of reduced harmonic distortion and lower frequency modulation. PMID:19062844
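
The two figures of merit used above can be computed from an FFT of a reconstructed test tone. The signal below is synthetic (a 1 kHz tone with 1% second harmonic plus noise), standing in for audio estimated from the mapped groove profile, not actual data from the test cylinder:

```python
import numpy as np

fs = 44100.0
t = np.arange(0, 1.0, 1.0 / fs)
f0 = 1000.0  # test-tone frequency, Hz (an exact FFT bin for a 1 s record)

# "Reconstructed" signal: fundamental + 1% second harmonic + broadband noise
sig = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 2 * f0 * t)
sig += np.random.default_rng(2).normal(0, 1e-3, t.size)

spec = np.abs(np.fft.rfft(sig)) / t.size
fund = spec[1000]                        # bin spacing is 1 Hz for a 1 s record
harmonics = spec[[2000, 3000, 4000, 5000]]

thd = np.sqrt(np.sum(harmonics**2)) / fund
noise_power = np.sum(spec**2) - fund**2 - np.sum(harmonics**2)
snr_db = 10 * np.log10(fund**2 / noise_power)
print(f"THD = {100 * thd:.2f}%  SNR = {snr_db:.1f} dB")
```

Comparing these numbers between noncontact and stylus playback of the same encoded sinusoids is exactly the kind of quantitative comparison the study proposes.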

  12. A critical ear: Analysis of value judgements in reviews of Beethoven’s piano sonata recordings

    Directory of Open Access Journals (Sweden)

    Elena Alessandri

    2016-03-01

    Full Text Available What sets a great music performance apart? In this study we addressed this question through an examination of value judgements in written criticism of recorded performance. One hundred reviews of recordings of Beethoven’s piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics’ statements and the evaluation criteria that underpinned their judgements. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording.

  13. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii

    Science.gov (United States)

    Prouty, N.G.; Jupiter, S.D.; Field, M.E.; McCulloch, M.T.

    2009-01-01

    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (-40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge. Copyright 2009 by the American Geophysical Union.

  14. Design and implementation of web-based mobile electronic medication administration record.

    Science.gov (United States)

    Hsieh, Sung-Huai; Hou, I-Ching; Cheng, Po-Hsun; Tan, Ching-Ting; Shen, Po-Chao; Hsu, Kai-Ping; Hsieh, Sheau-Ling; Lai, Feipei

    2010-10-01

    Patients' safety is the most essential, critical issue; however, errors can hardly be prevented, especially those caused by human faults. In order to reduce human errors, we construct Electronic Health Records (EHR) in the Health Information System (HIS) to facilitate patients' safety and to improve the quality of medical care. During medical care processing, all tasks are based upon physicians' orders. In National Taiwan University Hospital (NTUH), the Electronic Health Record committee proposed a standard for order flows. The objectives of the standard are: first, to enhance medical procedures and enforce hospital policies; secondly, to improve the quality of medical care; third, to collect sufficient, adequate data for the EHR in the near future. Among the proposed procedures, NTUH decided to establish a web-based mobile electronic medication administration record (ME-MAR) system. The system, built on a service-oriented architecture (SOA) and embedding the HL7/XML standard, is installed on mobile nursing carts. It is also implemented with advanced techniques such as Asynchronous JavaScript and XML (Ajax) and Web services to enhance the system usability. According to research, medication errors account for a high proportion of total medical faults. Therefore, we expect the ME-MAR system to reduce medication errors. In addition, we expect that ME-MAR can assist nurses and healthcare practitioners to administer and manage medication properly. This successful experience of developing the NTUH ME-MAR system can be easily applied to other related systems. Meanwhile, the SOA architecture of the system can also be seamlessly integrated into NTUH or other HIS systems. PMID:20703613

  15. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii

    Science.gov (United States)

    Prouty, Nancy G.; Jupiter, Stacy D.; Field, Michael E.; McCulloch, Malcolm T.

    2009-01-01

    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (−40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge.

  16. Extracting physician group intelligence from electronic health records to support evidence based medicine.

    Directory of Open Access Journals (Sweden)

    Griffin M Weber

    Full Text Available Evidence-based medicine employs expert opinion and clinical data to inform clinical decision making. The objective of this study is to determine whether it is possible to complement these sources of evidence with information about physician "group intelligence" that exists in electronic health records. Specifically, we measured laboratory test "repeat intervals", defined as the amount of time it takes for a physician to repeat a test that was previously ordered for the same patient. Our assumption is that while the result of a test is a direct measure of one marker of a patient's health, the physician's decision to order the test is based on multiple factors including past experience, available treatment options, and information about the patient that might not be coded in the electronic health record. By examining repeat intervals in aggregate over large numbers of patients, we show that it is possible to 1) determine what laboratory test results physicians consider "normal", 2) identify subpopulations of patients that deviate from the norm, and 3) identify situations where laboratory tests are over-ordered. We used laboratory tests as just one example of how physician group intelligence can be used to support evidence-based medicine in a way that is automated and continually updated.
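
Computing repeat intervals reduces to a per-patient, per-test difference of consecutive order timestamps. A sketch with a hypothetical order log (the study aggregates such intervals over large populations):

```python
import pandas as pd

# Hypothetical lab-test log: one row per ordered test
orders = pd.DataFrame({
    "patient":  ["p1", "p1", "p1", "p2", "p2", "p3"],
    "test":     ["HbA1c"] * 6,
    "ordered":  pd.to_datetime([
        "2009-01-05", "2009-04-10", "2009-07-20",
        "2009-02-01", "2009-02-08", "2009-03-15",
    ]),
})

orders = orders.sort_values(["patient", "test", "ordered"])
# Repeat interval: days since the previous order of the same test for the same patient
orders["repeat_days"] = (
    orders.groupby(["patient", "test"])["ordered"].diff().dt.days
)
print(orders["repeat_days"].dropna().tolist())
print("median repeat interval:", orders["repeat_days"].median(), "days")
```

The distribution of these intervals, stratified by the preceding test result, is what reveals which values physicians treat as "normal" and where tests are re-ordered sooner than the aggregate norm.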

  17. Designing ETL Tools to Feed a Data Warehouse Based on Electronic Healthcare Record Infrastructure.

    Science.gov (United States)

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L

    2015-01-01

    Aim of this paper is to propose a methodology to design Extract, Transform and Load (ETL) tools in a clinical data warehouse architecture based on the Electronic Healthcare Record (EHR). This approach takes advantages on the use of this infrastructure as one of the main source of information to feed the data warehouse, taking also into account that clinical documents produced by heterogeneous legacy systems are structured using the HL7 CDA standard. This paper describes the main activities to be performed to map the information collected in the different types of document with the dimensional model primitives.
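
One step of such an ETL tool, mapping CDA header elements onto a dimension-table row, might be sketched as follows. The fragment is a minimal hypothetical CDA document and the column names are invented; real mappings cover many more primitives:

```python
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # HL7 CDA namespace

# Minimal, hypothetical CDA fragment (real documents are far richer)
cda = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <recordTarget><patientRole>
    <id extension="12345"/>
    <patient><administrativeGenderCode code="F"/></patient>
  </patientRole></recordTarget>
  <effectiveTime value="20150312"/>
</ClinicalDocument>"""

def extract_patient_dimension(doc_xml):
    """Map CDA header elements onto columns of a patient dimension row."""
    root = ET.fromstring(doc_xml)
    return {
        "patient_id": root.find(".//hl7:patientRole/hl7:id", NS).get("extension"),
        "gender": root.find(".//hl7:administrativeGenderCode", NS).get("code"),
        "doc_date": root.find(".//hl7:effectiveTime", NS).get("value"),
    }

print(extract_patient_dimension(cda))
```

The Transform stage then conforms codes and dates across the heterogeneous legacy systems before the Load stage writes the rows into the warehouse's dimension tables.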

  18. Interval Estimation of Stress-Strength Reliability Based on Lower Record Values from Inverse Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Bahman Tarvirdizade

    2014-01-01

    Full Text Available We consider the estimation of stress-strength reliability based on lower record values when X and Y are independent but not identically distributed inverse Rayleigh random variables. The maximum likelihood, Bayes, and empirical Bayes estimators of R are obtained and their properties are studied. Confidence intervals, exact and approximate, as well as Bayesian credible sets for R are obtained. A real example is presented in order to illustrate the inferences discussed in the previous sections. A simulation study is conducted to investigate and compare the performance of the intervals presented in this paper and some bootstrap intervals.
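
For the inverse Rayleigh model, with CDF F(x) = exp(-λ/x²), the stress-strength reliability R = P(Y < X) has the closed form λx/(λx + λy). A quick Monte Carlo check of that identity (this illustrates R itself, not the record-value-based estimators studied in the paper):

```python
import numpy as np

def rvs_inv_rayleigh(lam, size, rng):
    """Inverse Rayleigh sampling via inverse transform: F(x) = exp(-lam / x^2),
    so x = sqrt(lam / (-log(u))) for u ~ Uniform(0, 1)."""
    u = rng.uniform(size=size)
    return np.sqrt(lam / -np.log(u))

# For X ~ IR(lam_x) and Y ~ IR(lam_y): R = P(Y < X) = lam_x / (lam_x + lam_y)
lam_x, lam_y = 2.0, 1.0
rng = np.random.default_rng(3)
x = rvs_inv_rayleigh(lam_x, 200_000, rng)
y = rvs_inv_rayleigh(lam_y, 200_000, rng)
print("Monte Carlo R =", (y < x).mean(), " closed form =", lam_x / (lam_x + lam_y))
```

The closed form follows by substituting u = 1/x² in ∫ F_Y(x) f_X(x) dx, which reduces to ∫₀^∞ λx·exp(-(λx + λy)u) du = λx/(λx + λy).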

  19. Reconstruction of Subdecadal Changes in Sunspot Numbers Based on the NGRIP 10Be Record

    DEFF Research Database (Denmark)

    Inceoglu, Fadil; Knudsen, Mads Faurschou; Karoff, Christoffer;

    2014-01-01

    in solar activity levels before 1610 relies on proxy records of solar activity stored in natural archives, such as 10Be in ice cores and 14C in tree rings. These cosmogenic radionuclides are produced by the interaction between Galactic cosmic rays (GCRs) and atoms in the Earth's atmosphere......, to reconstruct both long-term and subdecadal changes in sunspot numbers (SSNs). We compare three different approaches for reconstructing subdecadal-scale changes in SSNs, including a linear approach and two approaches based on the hysteresis effect, i.e. models with ellipse-linear and ellipse relationships...

  20. Development of Software for dose Records Data Base Access; Programacion para la consulta del Banco de Datos Dosimetricos

    Energy Technology Data Exchange (ETDEWEB)

    Amaro, M.

    1990-07-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was individual dose follow-up control and data handling for epidemiological studies. Within the Data Base management scheme, software was developed to allow searching of individual dose records by external authorised users. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records of workers included in the Data Base. The report includes the User Guide for the list of authorised users and listings of the codes and subroutines developed. (Author) 2 refs.

  1. Model based paleoclimate interpretations of Holocene oxygen isotope records from the Pacific Northwest

    Science.gov (United States)

    Steinman, B. A.; Pompeani, D. P.; Abbott, M. B.; Ortiz, J. D.; Stansell, N.; Mihindukulasooriya, L. N.; Hillman, A. L.; Finkenbinder, M. S.

    2015-12-01

    Oxygen isotope measurements of authigenic carbonate from Cleland Lake (British Columbia), Paradise Lake (British Columbia), and Lime Lake (Washington) provide an ~9,000 year Holocene record of precipitation-evaporation balance variations in the Pacific Northwest. Both Cleland Lake and Paradise Lake are small, surficially closed-basin systems with no active inflows or outflows. Lime Lake is surficially open with a seasonally active overflow. We sampled the lake sediment cores at 1-60 mm intervals (~3-33 years per sample on average) and measured the isotopic composition of fine-grained, authigenic CaCO3 in each sample. Negative δ18O values, which indicate wetter conditions in closed-basin lakes, occur in Cleland Lake and Paradise Lake sediment during the mid-Holocene and are followed by more positive δ18O values, which suggest drier conditions, in the late Holocene. The δ18O record from Lime Lake, which principally reflects changes in the isotopic composition of precipitation, exhibits less variability than the closed-basin lake records and follows an increasing trend from the mid-Holocene to present. Power spectrum analysis of the Cleland Lake δ18O data from 1,000 yr BP to present demonstrates significant periodicities of ~6 and ~67 years that likely reflect the enhancement of El Niño Southern Oscillation (ENSO) variability in the late Holocene with an associated multidecadal (i.e., 50 to 70 yr) component of the Pacific Decadal Oscillation. Results from mid-Holocene (6,000 yr BP) climate model simulations conducted as part of the Paleoclimate Modeling Intercomparison Project Phase 3 (PMIP3) indicate that in much of western North America, the cold season was wetter, and the warm season (April-September) was considerably drier (relative to the late Holocene), leading to an overall drier climate in western North America but with enhanced hydroclimatic seasonality. This is consistent with inferences from the Cleland and Paradise Lake δ18O records.
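The kind of power spectrum analysis described can be sketched with a plain FFT periodogram. The example below runs on a synthetic annual series with ~6 yr and ~67 yr cycles (hypothetical data standing in for the Cleland Lake δ18O record; the periodicities are the only values taken from the abstract):

```python
import numpy as np

# Synthetic annual series with ~6 yr and ~67 yr cycles plus noise
rng = np.random.default_rng(0)
years = np.arange(1000)                     # 1000 yr at 1-yr spacing
x = (np.sin(2 * np.pi * years / 6.25)
     + 0.8 * np.sin(2 * np.pi * years / 67.0)
     + 0.3 * rng.standard_normal(years.size))

# Periodogram via the FFT; pick the two strongest spectral peaks
freqs = np.fft.rfftfreq(years.size, d=1.0)  # cycles per year
power = np.abs(np.fft.rfft(x - x.mean())) ** 2
top = freqs[np.argsort(power)[-2:]]
periods = sorted(1.0 / top)                 # peak periods in years
print([round(p, 1) for p in periods])
```

Both injected periods should be recovered; real paleoclimate series are unevenly sampled, so published analyses typically use more robust spectral estimators.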

  2. Analysis of the 23 June 2001 Southern Peru Earthquake Using Locally Recorded Seismic Data

    Science.gov (United States)

    Tavera, H.; Comte, D.; Boroschek, R.; Dorbath, L.; Portugal, D.; Haessler, H.; Montes, H.; Bernal, I.; Antayhua, Y.; Salas, H.; Inza, A.; Rodriguez, S.; Glass, B.; Correa, E.; Balmaceda, I.; Meneses, C.

    2001-12-01

    The 23 June 2001, Mw=8.4 southern Peru earthquake ruptured the northern and central part of the rupture zone of the previous large earthquake of 13 August 1868, Mw ~9. A detailed analysis of the aftershock sequence was possible thanks to the deployment of a temporary seismic network along the coast in the Arequipa and Moquegua districts, complementing the Peruvian permanent stations. The temporary network included 10 short-period three-component stations from the U. of Chile-IRD-France and 7 broad-band seismic stations from the Instituto Geofísico del Perú. This network operated during the first weeks after the mainshock and recorded the major aftershocks, including the largest one on 7 July 2001, Mw=7.5, which defines the southern limit of the rupture area of the 2001 Peruvian earthquake. The majority of the aftershocks show thrust-faulting focal mechanisms consistent with the average convergence direction of the subducting Nazca plate; however, normal-faulting events such as the 5 July 2001, Mw=6.6 event are also present in the aftershock sequence. The depth distribution of the events permitted a detailed definition of the Wadati-Benioff zone in the region. The segment between Ilo and Tacna did not participate in the rupture process of the 2001 southern Peru earthquake. Seismicity located near the Peruvian-Chilean political boundary was reliably determined using the data recorded by the northern Chile permanent network. Analyses of the mainshock and aftershock accelerograms recorded in Arica, northern Chile, are also included. The occurrence of the 1995 Antofagasta (Mw=8.0) and the 2001 southern Peru earthquakes suggests that the probability of a major earthquake in the northern Chile region has increased, considering that the previous large earthquake in this region happened in 1877 (Mw ~9) and that since then no earthquake with magnitude Mw>8 has occurred inside the estimated 1877 rupture area (between Arica and Antofagasta).

  3. An analysis of concert saxophone vibrato through the examination of recordings by eight prominent soloists

    Science.gov (United States)

    Zinninger, Thomas

    This study examines concert saxophone vibrato through the analysis of several recordings of standard repertoire by prominent soloists. The vibrato of Vincent Abato, Arno Bornkamp, Claude Delangle, Jean-Marie Londeix, Marcel Mule, Otis Murphy, Sigurd Rascher, and Eugene Rousseau is analyzed with regard to rate, extent, shape, and discretionary use. Examination of these parameters was conducted through both general observation and precise measurements with the aid of a spectrogram. Statistical analyses of the results provide tendencies for overall vibrato use, as well as the effects of certain musical attributes (note length, tempo, dynamic, range) on vibrato. The results of this analysis are also compared among the soloists and against pre-existing theories and findings in vibrato research.
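Vibrato rate and extent of the kind measured here can be estimated from a pitch track. The sketch below uses a synthetic pitch contour with assumed parameters (5.5 Hz rate, ±40 cent extent), not the soloists' recordings:

```python
import numpy as np

# Synthetic pitch track: a 440 Hz tone with 5.5 Hz vibrato, ±40 cents extent
fs = 100.0                                   # pitch samples per second
t = np.arange(0, 2.0, 1.0 / fs)
f0 = 440.0 * 2 ** (40 * np.sin(2 * np.pi * 5.5 * t) / 1200.0)

# Deviation from the mean pitch, in cents
cents = 1200 * np.log2(f0 / f0.mean())

# Extent: half the peak-to-peak deviation; rate: sign changes (two per cycle)
extent = (cents.max() - cents.min()) / 2
crossings = np.count_nonzero(np.diff(np.sign(cents)))
rate = crossings / (2 * t[-1])
print(round(rate, 1), round(extent, 1))
```

A spectrogram-based workflow such as the one used in the study would first extract the pitch track from audio; the measurement step afterwards is essentially the one above.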

  4. The (Anomalous) Hall Magnetometer as an Analysis Tool for High Density Recording Media

    OpenAIRE

    Haan; Lodder, J. C.

    1991-01-01

    In this work an evaluation tool for the characterization of high-density recording thin film media is discussed. The measurement principles are based on the anomalous and the planar Hall effect. We used these Hall effects to characterize ferromagnetic Co-Cr films and Co/Pd multilayers having perpendicular anisotropy. The measurements set-up that was built has a sensitivity capable of measuring the hysteresis loops of 0.2x0.2 mm2 Hall structures in Co-Cr and jumps were observed in the Hall vol...

  5. An iPad and Android-based Application for Digitally Recording Geologic Field Data

    Science.gov (United States)

    Malinconico, L. L.; Sunderlin, D.; Liew, C.; Ho, A. S.; Bekele, K. A.

    2011-12-01

    Field experience is a significant component in most geology courses, especially sed/strat and structural geology. Increasingly, the spatial presentation, analysis and interpretation of geologic data are done using digital methodologies (GIS, Google Earth, stereonet and spreadsheet programs). However, students and professionals continue to collect field data manually on paper maps and in the traditional "orange field notebooks". Upon returning from the field, data are then manually transferred into digital formats for processing, mapping and interpretation. The transfer process is both cumbersome and prone to transcription error. In conjunction with the computer science department, we are in the process of developing an application (App) for iOS (the iPad) and Android platforms that can be used to digitally record data measured in the field. This is not a mapping program, but rather a way of bypassing the field book step to acquire digital data directly that can then be used in various analysis and display programs. Currently, the application allows the user to select from five different structural data situations: contact, bedding, fault, joints and "other". The user can define a folder for the collection and separation of data for each project. Observations are stored as individual records of field measurements in each folder. The exact information gathered depends on the nature of the observation, but common to all pages is the ability to log date, time, and lat/long directly from the tablet. Information like strike and dip is entered using scroll wheels, and formation names are also entered using scroll wheels that access easy-to-modify lists of the area's stratigraphic units. This ensures uniformity in the creation of the digital records from day-to-day and across field teams. Pictures, linked to each record, can also be taken using the tablet's camera.
    Once the field collection is complete, the data (including images) can be easily exported to a .csv file.
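The described export step could look something like the following sketch, where the record fields (strike, dip, formation, etc.) are hypothetical stand-ins for the app's actual schema:

```python
import csv, io

# Hypothetical field records as the app might store them (field names assumed)
records = [
    {"type": "bedding", "strike": 245, "dip": 12, "formation": "Martinsburg",
     "lat": 40.6981, "lon": -75.2097, "timestamp": "2011-10-02T09:14:00"},
    {"type": "joint", "strike": 130, "dip": 78, "formation": "Martinsburg",
     "lat": 40.6990, "lon": -75.2101, "timestamp": "2011-10-02T09:31:00"},
]

# Export every record to CSV, one row per observation
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(records[0]))
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
```

A file of this shape imports directly into spreadsheet, GIS, and stereonet software, which is the point of skipping the field-notebook transcription step.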

  6. The effect of recording and analysis bandwidth on acoustic identification of delphinid species

    Science.gov (United States)

    Oswald, Julie N.; Rankin, Shannon; Barlow, Jay

    2004-11-01

    Because many cetacean species produce characteristic calls that propagate well under water, acoustic techniques can be used to detect and identify them. The ability to identify cetaceans to species using acoustic methods varies and may be affected by recording and analysis bandwidth. To examine the effect of bandwidth on species identification, whistles were recorded from four delphinid species (Delphinus delphis, Stenella attenuata, S. coeruleoalba, and S. longirostris) in the eastern tropical Pacific Ocean. Four spectrograms, each with a different upper frequency limit (20, 24, 30, and 40 kHz), were created for each whistle (n=484). Eight variables (beginning, ending, minimum, and maximum frequency; duration; number of inflection points; number of steps; and presence/absence of harmonics) were measured from the fundamental frequency of each whistle. The whistle repertoires of all four species contained fundamental frequencies extending above 20 kHz. Overall correct classification using discriminant function analysis ranged from 30% for the 20-kHz upper frequency limit data to 37% for the 40-kHz upper frequency limit data. For the four species included in this study, an upper bandwidth limit of at least 24 kHz is required for an accurate representation of fundamental whistle contours.
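Discriminant function analysis of whistle measurements can be illustrated with a minimal Fisher linear discriminant on synthetic features for two hypothetical species (the study's actual data and eight-variable feature set are not reproduced; all numbers below are assumptions):

```python
import numpy as np

# Synthetic whistle features for two hypothetical species (assumed values):
# columns = [maximum frequency (kHz), duration (s), inflection points]
rng = np.random.default_rng(42)
a = rng.normal([18.0, 0.6, 2.0], [2.0, 0.1, 1.0], size=(100, 3))
b = rng.normal([24.0, 0.9, 4.0], [2.0, 0.1, 1.0], size=(100, 3))

# Fisher's linear discriminant: w = Sw^-1 (mu_a - mu_b)
mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
sw = np.cov(a.T) + np.cov(b.T)              # pooled within-class scatter
w = np.linalg.solve(sw, mu_a - mu_b)
threshold = w @ (mu_a + mu_b) / 2

# Classify by projecting onto w and comparing against the midpoint
correct = np.count_nonzero(a @ w > threshold) + np.count_nonzero(b @ w <= threshold)
accuracy = correct / 200
print(round(accuracy, 2))
```

With well-separated synthetic classes the accuracy is high; the ~30-37% reported in the study for four overlapping species illustrates how much harder the real problem is.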

  7. Estimating the frequency of extremely energetic solar events, based on solar, stellar, lunar, and terrestrial records

    CERN Document Server

    Schrijver, C J; Baltensperger, U; Cliver, E W; Guedel, M; Hudson, H S; McCracken, K G; Osten, R A; Peter, Th; Soderblom, D R; Usoskin, I G; Wolff, E W

    2012-01-01

    The most powerful explosions on the Sun [...] drive the most severe space-weather storms. Proxy records of flare energies based on SEPs in principle may offer the longest time base to study infrequent large events. We conclude that one suggested proxy, nitrate concentrations in polar ice cores, does not map reliably to SEP events. Concentrations of select radionuclides measured in natural archives may prove useful in extending the time interval of direct observations up to ten millennia, but as their calibration to solar flare fluences depends on multiple poorly known properties and processes, these proxies cannot presently be used to help determine the flare energy frequency distribution. Being thus limited to the use of direct flare observations, we evaluate the probabilities of large-energy solar explosions by combining solar flare observations with an ensemble of stellar flare observations. We conclude that solar flare energies form a relatively smooth distribution from small events to large flares, while...

  8. Eielson Air Force Base operable unit 2 and other areas record of decision

    International Nuclear Information System (INIS)

    This decision document presents the selected remedial actions and no action decisions for Operable Unit 2 (OU2) at Eielson Air Force Base (AFB), Alaska, chosen in accordance with state and federal regulations. This document also presents the decision that no further action is required for 21 other source areas at Eielson AFB. This decision is based on the administrative record file for this site. OU2 addresses sites contaminated by leaks and spills of fuels. Soils contaminated with petroleum products occur at or near the source of contamination. Contaminated subsurface soil and groundwater occur in plumes on the top of a shallow groundwater table that fluctuates seasonally. These sites pose a risk to human health and the environment because of ingestion, inhalation, and dermal contact with contaminated groundwater. The purpose of this response is to prevent current or future exposure to the contaminated groundwater, to reduce further contaminant migration into the groundwater, and to remediate groundwater

  9. Provincial prenatal record revision: a multiple case study of evidence-based decision-making at the population-policy level

    Directory of Open Access Journals (Sweden)

    Olson Joanne

    2008-12-01

    Full Text Available Abstract Background There is a significant gap in the knowledge translation literature related to how research evidence actually contributes to health care decision-making. Decisions around what care to provide at the population (rather than individual) level are particularly complex, involving considerations such as feasibility, cost, and population needs in addition to scientific evidence. One example of decision-making at this "population-policy" level involves what screening questions and intervention guides to include on standardized provincial prenatal records. As mandatory medical reporting forms, prenatal records are potentially powerful vehicles for promoting population-wide evidence-based care. However, the extent to which Canadian prenatal records reflect best-practice recommendations for the assessment of well-known risk factors such as maternal smoking and alcohol consumption varies markedly across Canadian provinces and territories. The goal of this study is to better understand the interaction of contextual factors and research evidence on decision-making at the population-policy level, by examining the processes by which provincial prenatal records are reviewed and revised. Methods Guided by Dobrow et al.'s (2004) conceptual model for context-based evidence-based decision-making, this study will use a multiple case study design with embedded units of analysis to examine contextual factors influencing the prenatal record revision process in different Canadian provinces and territories. Data will be collected using multiple methods to construct detailed case descriptions for each province/territory. Using qualitative data analysis techniques, decision-making processes involving prenatal record content specifically related to maternal smoking and alcohol use will be compared both within and across each case, to identify key contextual factors influencing the uptake and application of research evidence by prenatal record review

  10. Robert Recorde

    CERN Document Server

    Williams, Jack

    2011-01-01

    The 16th-Century intellectual Robert Recorde is chiefly remembered for introducing the equals sign into algebra, yet the greater significance and broader scope of his work is often overlooked. This book presents an authoritative and in-depth analysis of the man, his achievements and his historical importance. This scholarly yet accessible work examines the latest evidence on all aspects of Recorde's life, throwing new light on a character deserving of greater recognition. Topics and features: presents a concise chronology of Recorde's life; examines his published works; describes Recorde's pro

  11. Stochasticity of Road Traffic Dynamics: Comprehensive Linear and Nonlinear Time Series Analysis on High Resolution Freeway Traffic Records

    CERN Document Server

    Siegel, H; Siegel, Helge; Belomestnyi, Dennis

    2006-01-01

    The dynamical properties of road traffic time series from North-Rhine Westphalian motorways are investigated. The article shows that road traffic dynamics is well described as a persistent stochastic process with two fixed points representing the free-flow (non-congested) and the congested regime. These traffic states have different statistical properties with respect to waiting-time distribution, velocity distribution and autocorrelation. Log-differences of velocity records reveal a non-normal, clearly leptokurtic distribution. Further, linear and nonlinear phase-plane based analysis methods yield no evidence that determinism or deterministic chaos is involved in traffic dynamics on shorter than diurnal time scales. Several Hurst-exponent estimators indicate long-range dependence for the free-flow state. Finally, our results are not in accordance with the typical heuristic fingerprints of self-organized criticality. We suggest the more simplistic assumption of a non-critical phase transition between...
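One of the simpler Hurst-exponent estimators, the aggregated-variance method, can be sketched as follows; on uncorrelated noise it should return H ≈ 0.5, with H > 0.5 indicating the kind of long-range dependence reported for the free-flow state (illustrative only, not the traffic records):

```python
import numpy as np

def hurst_aggvar(x, block_sizes):
    # Aggregated-variance method: Var(block means) ~ m**(2H - 2)
    vs = []
    for m in block_sizes:
        nblocks = len(x) // m
        means = x[: nblocks * m].reshape(nblocks, m).mean(axis=1)
        vs.append(means.var())
    slope = np.polyfit(np.log(block_sizes), np.log(vs), 1)[0]
    return 1 + slope / 2

rng = np.random.default_rng(7)
white = rng.standard_normal(200_000)        # uncorrelated noise: expect H ≈ 0.5
h = hurst_aggvar(white, [10, 20, 50, 100, 200, 500])
print(round(h, 2))
```

Papers such as this one typically cross-check several estimators (R/S, DFA, aggregated variance), since each is biased in different ways.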

  12. Astronomical calibration and global correlation of the Santonian (Cretaceous) based on the marine carbon isotope record

    Science.gov (United States)

    Thibault, N.; Jarvis, I.; Voigt, S.; Gale, A. S.; Attree, K.; Jenkyns, H. C.

    2016-06-01

    High-resolution records of bulk carbonate carbon isotopes have been generated for the Upper Coniacian to Lower Campanian interval of the sections at Seaford Head (southern England) and Bottaccione (central Italy). An unambiguous stratigraphic correlation is presented for the base and top of the Santonian between the Boreal and Tethyan realms. Orbital forcing of carbon and oxygen isotopes at Seaford Head points to the Boreal Santonian spanning five 405 kyr cycles (Sa1 to Sa5). Correlation of the Seaford Head time scale to that of the Niobrara Formation (Western Interior Basin) permits anchoring these records to the La2011 astronomical solution at the Santonian-Campanian (Sa/Ca) boundary, which has recently been dated to 84.19 ± 0.38 Ma. Among the five tuning options examined, option 2 places the Sa/Ca at the 84.2 Ma 405 kyr insolation minimum and appears to be the most likely. This solution indicates that minima of the 405 kyr filtered output of the resistivity in the Niobrara Formation correlate to 405 kyr insolation minima in the astronomical solution and to maxima in the filtered δ13C of Seaford Head. We suggest that variance in δ13C is driven by climate forcing of the proportions of CaCO3 versus organic carbon burial on land and in oceanic basins. The astronomical calibration generates a 200 kyr mismatch of the Coniacian-Santonian boundary age between the Boreal Realm in Europe and the Western Interior, due either to diachronism of the lowest occurrence of the inoceramid Cladoceramus undulatoplicatus between the two regions or to remaining uncertainties of radiometric dating and cyclostratigraphic records.

  13. Feature-based sentiment analysis with ontologies

    OpenAIRE

    Taner, Berk

    2011-01-01

    Sentiment analysis is a topic that many researchers work on. In recent years, new research directions under sentiment analysis appeared. Feature-based sentiment analysis is one such topic that deals not only with finding sentiment in a sentence but providing a more detailed analysis on a given domain. In the beginning researchers focused on commercial products and manually generated list of features for a product. Then they tried to generate a feature-based approach to attach sentiments to th...

  14. Untangling inconsistent magnetic polarity records through an integrated rock magnetic analysis: A case study on Neogene sections in East Timor

    Science.gov (United States)

    Aben, F. M.; Dekkers, M. J.; Bakker, R. R.; van Hinsbergen, D. J. J.; Zachariasse, W. J.; Tate, G. W.; McQuarrie, N.; Harris, R.; Duffy, B.

    2014-06-01

    polarity patterns in sediments are a common problem in magnetostratigraphic and paleomagnetic research. Multiple magnetic mineral generations result in such remanence "haystacks." Here we test whether end-member modeling of isothermal remanent magnetization acquisition curves, as a basis for an integrated rock magnetic and microscopic analysis, is capable of isolating original magnetic polarity patterns. Uppermost Miocene-Pliocene deep-marine siliciclastics and limestones in East Timor, originally sampled to constrain the uplift history of the young Timor orogeny, serve as a case study. An apparently straightforward polarity record was obtained that, however, proved impossible to reconcile with the associated biostratigraphy. Our analysis distinguished two magnetic end-members for each section, which result from various greigite suites and a detrital magnetite suite. The latter yields largely viscous remanence signals and is deemed unsuitable. The greigite suites are late diagenetic in the Cailaco River section and early diagenetic, and thus reliable, in the Viqueque Type section. By selecting reliable sample levels based on a quality index, a revised polarity pattern of the latter section is obtained: consistent with the biostratigraphy and unequivocally correlatable to the Geomagnetic Polarity Time Scale. Although the Cailaco River section lacks a reliable magnetostratigraphy, it does record a significant postremagnetization tectonic rotation. Our results show that well-designed rock magnetic research, based on the end-member model and integrated with microscopy and paleomagnetic data, can unravel complex and seemingly inconsistent polarity patterns. We recommend this approach to assess the veracity of the polarity of strata with complex magnetic mineralogy.

  15. Analysis of microseismic signals and temperature recordings for rock slope stability investigations in high mountain areas

    Science.gov (United States)

    Occhiena, C.; Coviello, V.; Arattano, M.; Chiarle, M.; Morra di Cella, U.; Pirulli, M.; Pogliotti, P.; Scavia, C.

    2012-07-01

    Permafrost degradation is a probable cause of the increase in rock instabilities and rock falls observed in recent years in high mountain areas, particularly in the Alpine region. The phenomenon causes the thaw of the ice filling rock discontinuities; the water deriving from it subsequently freezes again, inducing stresses in the rock mass that may lead, in the long term, to rock falls. To investigate these processes, a monitoring system composed of geophones and thermometers was installed in 2007 at the Carrel hut (3829 m a.s.l., Matterhorn, NW Alps). In 2010, in the framework of the Interreg 2007-2013 Alcotra project no. 56 MASSA, the monitoring system was upgraded and renovated in order to meet project needs. In this paper, the data recorded by this renewed system between 6 October 2010 and 5 October 2011 are presented and 329 selected microseismic events are analysed. The data processing concerned the classification of the recorded signals, the analysis of their distribution in time and the identification of the most important trace characteristics in the time and frequency domains. The interpretation of the results revealed a possible correlation between the temperature trend and the event occurrence. The research is still in progress and data recording and interpretation are planned for a longer period to better investigate the spatial-temporal distribution of microseismic activity in the rock mass, with specific attention to the relation of microseismic activity with temperature. The overall goal is to verify the possibility of setting up an effective monitoring system for investigating the stability of a rock mass under permafrost conditions, in order to supply researchers with useful data to better understand the relationship between temperature and rock mass stability and, possibly, technicians with a valid tool for decision-making.

  16. Analysis of microseismic signals and temperature recordings for rock slope stability investigations in high mountain areas

    Directory of Open Access Journals (Sweden)

    C. Occhiena

    2012-07-01

    Full Text Available Permafrost degradation is a probable cause of the increase in rock instabilities and rock falls observed in recent years in high mountain areas, particularly in the Alpine region. The phenomenon causes the thaw of the ice filling rock discontinuities; the water deriving from it subsequently freezes again, inducing stresses in the rock mass that may lead, in the long term, to rock falls. To investigate these processes, a monitoring system composed of geophones and thermometers was installed in 2007 at the Carrel hut (3829 m a.s.l., Matterhorn, NW Alps). In 2010, in the framework of the Interreg 2007–2013 Alcotra project no. 56 MASSA, the monitoring system was upgraded and renovated in order to meet project needs.

    In this paper, the data recorded by this renewed system between 6 October 2010 and 5 October 2011 are presented and 329 selected microseismic events are analysed. The data processing concerned the classification of the recorded signals, the analysis of their distribution in time and the identification of the most important trace characteristics in the time and frequency domains. The interpretation of the results revealed a possible correlation between the temperature trend and the event occurrence.

    The research is still in progress and the data recording and interpretation are planned for a longer period to better investigate the spatial-temporal distribution of microseismic activity in the rock mass, with specific attention to the relation of microseismic activity with temperatures. The overall goal is to verify the possibility to set up an effective monitoring system for investigating the stability of a rock mass under permafrost conditions, in order to supply the researchers with useful data to better understand the relationship between temperature and rock mass stability and, possibly, the technicians with a valid tool for decision-making.

  17. Vocal registers of the countertenor voice: Based on signals recorded and analyzed in VoceVista

    Science.gov (United States)

    Chenez, Raymond

    Today's countertenors possess vocal ranges similar to the mezzo-soprano, and are trained to sing with a vibrant, focused tone. Little research has been conducted on the registers of the countertenor voice. Advancement in vocal techniques in the countertenor voice from the late 20th century to the present has been rapid. This treatise attempts to define the registers of the countertenor voice, and is intended as a resource for singers and teachers. The voices of eleven North American countertenors were recorded and analyzed using VoceVista Pro software, which was developed and designed by Donald Miller. Through spectrographic and electroglottographic analysis, the registers of the countertenor voice were identified and outlined.

  18. Long-term neural recordings using MEMS based moveable microelectrodes in the brain

    Directory of Open Access Journals (Sweden)

    Nathan Jackson

    2010-06-01

    Full Text Available One of the critical requirements of the emerging class of neural prosthetic devices is to maintain good quality neural recordings over long time periods. We report here a novel Micro-ElectroMechanical Systems (MEMS) based technology that can move microelectrodes, in the event of deterioration in neural signal, to sample a new set of neurons. Microscale electro-thermal actuators are used to controllably move microelectrodes post-implantation in steps of approximately 9 µm. In this study, a total of 12 moveable microelectrode chips were individually implanted in adult rats. Two of the 12 moveable microelectrode chips were not moved over a period of 3 weeks and were treated as control experiments. During the first three weeks of implantation, moving the microelectrodes led to an improvement in the average SNR from 14.61 ± 5.21 dB before movement to 18.13 ± 4.99 dB after movement across all microelectrodes and all days. However, the average RMS values of noise amplitudes were similar, at 2.98 ± 1.22 µV and 3.01 ± 1.16 µV before and after microelectrode movement. Beyond three weeks, the primary observed failure mode was biological rejection of the PMMA (dental cement) based skull mount, resulting in the device loosening and eventually falling from the skull. Additionally, the average SNR for functioning devices beyond three weeks was 11.88 ± 2.02 dB before microelectrode movement and was significantly different (p<0.01) from the average SNR of 13.34 ± 0.919 dB after movement. The results of this study demonstrate that MEMS based technologies can move microelectrodes in rodent brains in long-term experiments, resulting in improvements in signal quality. Further improvements in packaging and surgical techniques will potentially enable movable microelectrodes to record cortical neuronal activity in chronic experiments.
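The abstract does not state which SNR definition it uses, but a common choice for spike recordings is the ratio of spike amplitude to noise RMS, in decibels. A sketch under that assumed definition, using a noise floor of the ~3 µV reported in the study:

```python
import math

def snr_db(spike_amp_uv, noise_rms_uv):
    # Assumed definition: SNR = 20 * log10(spike amplitude / noise RMS)
    return 20 * math.log10(spike_amp_uv / noise_rms_uv)

# With a ~3 µV noise floor, a 24 µV spike gives roughly the ~18 dB
# post-movement SNR reported in the study (hypothetical amplitude)
print(round(snr_db(24.0, 3.0), 1))
```

This also shows why the SNR improved while the noise RMS stayed constant: moving the electrode closer to active neurons increases the spike amplitude, the numerator, not the noise.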

  19. [Internal audit based on the recording critical incidents: the first results].

    Science.gov (United States)

    Terekhova, N N; Kazakova, E A; Sitnikov, A V

    2005-01-01

    The critical incident concept, on which an internal medical audit is based, has been proposed for the comparative assessment of different protocols of anesthesiological support. The purpose of this study was to develop such a procedure and to implement it at an anesthesiological unit. The study included and analyzed 361 anesthesias. The list of critical incidents (CIs) contained 53 items and was divided into 8 main groups. CIs were recorded in 42.1% of anesthesias: a total of 304 CIs were noted, and the frequency of CIs (the number of recorded CIs per anesthesia) was 0.84. The bulk of CIs were associated with the cardiovascular system and varying allergic reactions. The study also yielded data on the distribution of CIs in relation to the type of anesthesiological support, the type of surgical intervention and the physical status of the patient (according to the ASA classification). This study is only a first step toward internal audit, but it showed the importance of its routine use for assessing different procedures of anesthesiological support.
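The reported frequency is easy to verify from the counts given:

```python
# 304 critical incidents recorded over 361 anesthesias
n_anesthesias = 361
n_incidents = 304
frequency = n_incidents / n_anesthesias
print(round(frequency, 2))   # → 0.84, matching the reported value
```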

  20. Experimental analysis of decay biases in the fossil record of lobopodians

    Science.gov (United States)

    Murdock, Duncan; Gabbott, Sarah; Purnell, Mark

    2016-04-01

    If fossils are to realize their full potential in reconstructing the tree of life we must understand how our view of ancient organisms is obscured by taphonomic filters of decay and preservation. In most cases, processes of decay will leave behind either nothing or only the most decay-resistant body parts, and even in those rare instances where soft tissues are fossilized we cannot assume that the resulting fossil, however exquisite, represents a faithful anatomical representation of the animal as it was in life. Recent experiments have shown that the biases introduced by decay can be far from random; in chordates, for example, the most phylogenetically informative characters are also the most decay-prone, resulting in 'stemward slippage'. But how widespread is this phenomenon, and are there other non-random biases linked to decay? Intuitively, we make assumptions about the likelihood of different kinds of characters to survive and be preserved, with knock-on effects for anatomical and phylogenetic interpretations. To what extent are these assumptions valid? We combine our understanding of the fossil record of lobopodians with insights from decay experiments on modern onychophorans (velvet worms) to test these assumptions. Our analysis demonstrates that taphonomically informed tests of character interpretations have the potential to improve phylogenetic resolution. This approach is widely applicable to the fossil record, allowing us to ground-truth some of the assumptions involved in describing exceptionally preserved fossil material.

  1. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)

    Science.gov (United States)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.

    2013-12-01

    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. 
At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  2. A new method for estimating morbidity rates based on routine electronic medical records in primary care

    NARCIS (Netherlands)

    Nielen, M.; Spronk, I.; Davids, R.; Korevaar, J.; Poos, R.; Hoeymans, N.; Opstelten, W.; Sande, M. van der; Biermans, M.; Schellevis, F.; Verheij, R.

    2016-01-01

    Background & Aim: Routinely recorded electronic health records (EHRs) from general practitioners (GPs) are increasingly available and provide valuable data for estimating incidence and prevalence rates of diseases in the general population. Valid morbidity rates are essential for patient management

  3. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  4. Mercury Determination in Fish Samples by Chronopotentiometric Stripping Analysis Using Gold Electrodes Prepared from Recordable CDs

    Directory of Open Access Journals (Sweden)

    Andrei Florin Danet

    2008-11-01

A simple method for manufacturing gold working electrodes for chronopotentiometric stripping measurements from recordable CD-Rs is described. These gold electrodes are much cheaper than commercially available ones. The electrochemical behavior of such an electrode and the working parameters for mercury determination by chronopotentiometric stripping analysis were studied. The detection limit was 0.30 μg Hg/L and the determination limit was 1.0 μg Hg/L for a deposition time of 600 s. Using the developed working electrodes it was possible to determine the total mercury in fish samples. A method for fish sample digestion was developed using a mixture of fuming nitric acid and both concentrated sulfuric and hydrochloric acids. The recovery of a known amount of mercury introduced into the sample before digestion was 95.3% (n=4).

  5. Detrended fluctuation analysis of daily temperature records: Geographic dependence over Australia

    CERN Document Server

    Király, Andrea; Jánosi, Imre M.

    2004-01-01

    Daily temperature anomaly records are analyzed (61 for Australia, 18 for Hungary) by means of detrended fluctuation analysis. Positive long-range asymptotic correlations extending up to 5-10 years are detected in each case. Contrary to earlier claims, the correlation exponent is not universal for continental stations. Interestingly, the dominant factor over Australia is geographic latitude: the general tendency is a decrease of the correlation exponent with increasing distance from the equator. This tendency is in complete agreement with the results found by Tsonis et al. (1999) for 500-hPa height anomalies in the northern hemisphere. The variance of the fluctuations exhibits the opposite trend: the larger the distance from the equator, the larger the amplitude of intrinsic fluctuations. The presence of the Tropospheric Biennial Oscillation is clearly identified at three stations on the north-eastern edge of the Australian continent.
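As a rough illustration of the method named in this abstract (not the authors' code), the DFA scaling exponent can be estimated by integrating the anomaly series, linearly detrending it in windows of increasing size, and fitting the log-log slope of the RMS fluctuation versus window size; a minimal numpy sketch, with illustrative window sizes:

```python
import numpy as np

def dfa_exponent(x, window_sizes):
    """Detrended fluctuation analysis: return the scaling exponent,
    i.e. the log-log slope of the RMS fluctuation F(n) vs window size n."""
    y = np.cumsum(x - np.mean(x))              # integrated anomaly profile
    fluct = []
    for n in window_sizes:
        n_win = len(y) // n
        f2 = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(fluct), 1)
    return slope

# Uncorrelated white noise should give an exponent near 0.5; long-range
# correlated records like the temperature anomalies give values above 0.5.
rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(10_000), [16, 32, 64, 128, 256])
```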

  6. Analysis of the Impact of Wildfire on Surface Ozone Record in the Colorado Front Range

    Science.gov (United States)

    McClure-Begley, A.; Petropavlovskikh, I. V.; Oltmans, S. J.; Pierce, R. B.; Sullivan, J. T.; Reddy, P. J.

    2015-12-01

    Ozone plays an important role in the oxidation capacity of the atmosphere, and at ground level it has negative impacts on human health and ecosystem processes. In order to understand the dynamics and variability of surface ozone, it is imperative to analyze individual sources, interactions between sources, transport, and the chemical processes of ozone production and accumulation. Biomass burning and wildfires are known to emit a suite of particulate matter and gaseous compounds into the atmosphere. These compounds, such as volatile organic compounds, carbon monoxide, and nitrogen oxides, are precursor species that aid in the photochemical production and destruction of ozone. The Colorado Front Range (CFR) is a region of complex interactions between pollutant sources and meteorological conditions which result in the accumulation of ozone. High ozone events in the CFR associated with fires are analyzed for 2003-2014 to develop an understanding of the large-scale influence and variability of ozone-wildfire relationships. This study provides an analysis of the frequency of enhanced ozone episodes that can be confirmed as influenced by fires and transported smoke plumes. Long-term records of surface ozone data from the CFR provide information on the impact of wildfire pollutants on seasonal and diurnal ozone behavior. Years with increased local fire activity, as well as years with increased long-range transport of smoke plumes, are evaluated for their effect on the long-term record and the frequency of high ozone at each location. Meteorological data, MODIS fire detection images, NOAA HYSPLIT back-trajectory analysis, the NOAA smoke verification model, fire tracer data (K+), the RAQMS model, carbon monoxide data, and aerosol optical depth retrievals are used with NOAA Global Monitoring Division surface ozone data from three sites in Colorado. This allows investigation of the interactions between pollutants and meteorology which result in high surface ozone levels.

  7. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to provide some insight into the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  8. Automatic BSS-based filtering of metallic interference in MEG recordings: definition and validation using simulated signals

    Science.gov (United States)

    Migliorelli, Carolina; Alonso, Joan F.; Romero, Sergio; Mañanas, Miguel A.; Nowak, Rafał; Russi, Antonio

    2015-08-01

    Objective. One of the principal drawbacks of magnetoencephalography (MEG) is its high sensitivity to metallic artifacts, which come from implanted intracranial electrodes and dental ferromagnetic prosthesis and produce a high distortion that masks cerebral activity. The aim of this study was to develop an automatic algorithm based on blind source separation (BSS) techniques to remove metallic artifacts from MEG signals. Approach. Three methods were evaluated: AMUSE, a second-order technique; and INFOMAX and FastICA, both based on high-order statistics. Simulated signals consisting of real artifact-free data mixed with real metallic artifacts were generated to objectively evaluate the effectiveness of BSS and the subsequent interference reduction. A completely automatic detection of metallic-related components was proposed, exploiting the known characteristics of the metallic interference: regularity and low frequency content. Main results. The automatic procedure was applied to the simulated datasets and the three methods exhibited different performances. Results indicated that AMUSE preserved and consequently recovered more brain activity than INFOMAX and FastICA. Normalized mean squared error for AMUSE decomposition remained below 2%, allowing an effective removal of artifactual components. Significance. To date, the performance of automatic artifact reduction has not been evaluated in MEG recordings. The proposed methodology is based on an automatic algorithm that provides an effective interference removal. This approach can be applied to any MEG dataset affected by metallic artifacts as a processing step, allowing further analysis of unusable or poor quality data.
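The automatic rejection criterion described in this abstract (flagging BSS components with the low-frequency signature of metallic interference) can be sketched generically with FastICA from scikit-learn; this is an illustrative reconstruction, not the paper's pipeline, and the 4 Hz cutoff and 80% power-fraction threshold are assumptions of the sketch:

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_lowfreq_components(X, fs, cutoff_hz=4.0, power_frac=0.8, seed=0):
    """Unmix channels-x-samples data X with FastICA and zero out sources
    whose spectral power is concentrated below cutoff_hz, mimicking the
    low-frequency content of metallic artifact components."""
    ica = FastICA(n_components=X.shape[0], random_state=seed, max_iter=1000)
    S = ica.fit_transform(X.T).T                      # sources x samples
    freqs = np.fft.rfftfreq(S.shape[1], d=1.0 / fs)
    keep = np.ones(S.shape[0], dtype=bool)
    for k, s in enumerate(S):
        power = np.abs(np.fft.rfft(s)) ** 2
        if power[freqs < cutoff_hz].sum() / power.sum() > power_frac:
            keep[k] = False                           # artifact-like source
    S[~keep] = 0.0                                    # remove flagged sources
    return ica.inverse_transform(S.T).T, keep

# Demo: a strong 0.5 Hz drift (artifact-like) mixed with 7 Hz activity.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
sources = np.vstack([5.0 * np.sin(2 * np.pi * 0.5 * t),
                     np.sign(np.sin(2 * np.pi * 7 * t))])
A = np.array([[1.0, 0.4], [0.6, 1.0], [0.3, 0.8]])    # mixing to 3 channels
X = A @ sources + 0.01 * rng.standard_normal((3, t.size))
X_clean, keep = remove_lowfreq_components(X, fs)
```

After cleaning, the sub-4 Hz power of each reconstructed channel should be a small fraction of the original, while the 7 Hz activity is preserved.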

  9. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  10. Factors that influence the efficiency of beef and dairy cattle recording system in Kenya: A SWOT-AHP analysis.

    Science.gov (United States)

    Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J

    2011-01-01

    Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by strengths, weaknesses, opportunities, and threats-analytic hierarchy process (SWOT-AHP) analysis, and priority scores determining their relative importance to the system were calculated using the eigenvalue method. For identification and registration, and evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strengths factors could not sustain the required efficiency of the system. Weaknesses of the system predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and development of institutions to support recording were feasible. PMID:20676763

  12. Patients’ Acceptance towards a Web-Based Personal Health Record System: An Empirical Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Fong-Lin Jang

    2013-10-01

The health care sector has become increasingly interested in developing personal health record (PHR) systems as an Internet-based telehealthcare implementation to improve the quality and decrease the cost of care. However, the factors that influence patients’ intention to use PHR systems remain unclear. Based on physicians’ therapeutic expertise, we implemented a web-based infertile PHR system and proposed an extended Technology Acceptance Model (TAM) that integrates the physician-patient relationship (PPR) construct into TAM’s original perceived ease of use (PEOU) and perceived usefulness (PU) constructs to explore which factors will influence the behavioral intentions (BI) of infertile patients to use the PHR. From ninety participants from a medical center, 50 valid responses to a self-rating questionnaire were collected, yielding a response rate of 55.56%. The partial least squares (PLS) technique was used to assess the causal relationships that were hypothesized in the extended model. The results indicate that infertile patients expressed a moderately high intention to use the PHR system. The PPR and PU of patients had significant effects on their BI to use the PHR, whereas the PEOU indirectly affected the patients’ BI through the PU. This investigation confirms that PPR can have a critical role in shaping patients’ perceptions of the use of healthcare information technologies. Hence, we suggest that hospitals should promote the potential usefulness of PHR and improve the quality of the physician-patient relationship to increase patients’ intention of using PHR.

  13. Pulse artifact detection in simultaneous EEG-fMRI recording based on EEG map topography.

    Science.gov (United States)

    Iannotti, Giannina R; Pittau, Francesca; Michel, Christoph M; Vulliemoz, Serge; Grouiller, Frédéric

    2015-01-01

    One of the major artifacts corrupting electroencephalogram (EEG) recordings acquired during functional magnetic resonance imaging (fMRI) is the pulse artifact (PA). It is mainly due to the motion of the head and attached electrodes and wires in the magnetic field occurring after each heartbeat. In this study we propose a novel method to improve PA detection by exploiting the strong gradient and inverted polarity between left and right EEG electrodes. We acquired high-density EEG-fMRI (256 electrodes) with simultaneous electrocardiogram (ECG) at 3 T. The PA was estimated as the voltage difference between right and left signals from the electrodes showing the strongest artifact (facial and temporal). Peaks were detected on this estimated signal and compared to the peaks in the ECG recording. We analyzed data from eleven healthy subjects, two epileptic patients and four healthy subjects with an insulating layer between the electrodes and the scalp. The accuracy of the two methods was assessed with three criteria: (i) standard deviation, (ii) kurtosis and (iii) confinement of the inter-peak intervals to the physiological range. We also checked whether the new method has an influence on the identification of epileptic spikes. Results show that the estimated PA improved artifact detection in 15/17 cases compared to the ECG method. Moreover, epileptic spike identification was not altered by the correction. The proposed method improves the detection of pulse-related artifacts, which is particularly crucial when the ECG is of poor quality or cannot be recorded. It will contribute to enhancing the quality of the EEG, increasing the reliability of EEG-informed fMRI analysis. PMID:25307731
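The right-minus-left PA estimation and the physiological-range check on inter-peak intervals described in this abstract can be sketched as follows; this is an illustrative reconstruction, not the authors' implementation, and the peak threshold (mean + 2 SD) and RR bounds are assumptions of the sketch:

```python
import numpy as np

def detect_pa_peaks(right, left, fs, min_rr=0.4, max_rr=1.5):
    """Estimate the pulse artifact (PA) as the right-minus-left voltage
    difference, pick local maxima above mean + 2 SD, and check that the
    inter-peak intervals stay in a physiological heart-rate range."""
    pa = right - left
    thr = pa.mean() + 2.0 * pa.std()
    above = (pa[1:-1] > thr) & (pa[1:-1] >= pa[:-2]) & (pa[1:-1] > pa[2:])
    candidates = np.where(above)[0] + 1
    peaks = []
    for i in candidates:                      # refractory period of min_rr
        if not peaks or (i - peaks[-1]) / fs >= min_rr:
            peaks.append(i)
    rr = np.diff(peaks) / fs                  # inter-peak intervals, seconds
    physiological = bool(len(rr)) and bool(np.all((rr > min_rr) & (rr < max_rr)))
    return np.array(peaks), physiological

# Synthetic demo: a pulse-shaped bump every 0.8 s with opposite polarity
# on the two sides of the head, plus low-amplitude noise.
fs, n = 250, 2500
rng = np.random.default_rng(1)
bump = np.exp(-0.5 * (np.arange(-25, 26) / 5.0) ** 2)
pulses = np.zeros(n)
for c in range(100, n, 200):
    pulses[c - 25:c + 26] += bump
right = 0.02 * rng.standard_normal(n) + pulses
left = 0.02 * rng.standard_normal(n) - pulses
peaks, physiological = detect_pa_peaks(right, left, fs)
```

With twelve simulated beats 0.8 s apart, the sketch recovers one peak per beat and flags the rhythm as physiological.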

  14. Transect based analysis versus area based analysis to quantify shoreline displacement: spatial resolution issues.

    Science.gov (United States)

    Anfuso, Giorgio; Bowman, Dan; Danese, Chiara; Pranzini, Enzo

    2016-10-01

    Field surveys, aerial photographs, and satellite images are the most commonly employed sources of data for analyzing shoreline position, which are then compared by area based analysis (ABA) or transect based analysis (TBA) methods. The former is performed by computing the mean shoreline displacement for the identified coastal segments, i.e., dividing the beach-area variation by the segment length; the latter is based on measuring the distance between successive shorelines at set points along transects. The present study compares, by means of GIS tools, the ABA and TBA methods by computing shoreline displacements recorded on two stretches of the Tuscany coast (Italy): the beach at Punta Ala, a linear coast without shore protection structures, and the one at Follonica, which is irregular due to the presence of groins and detached breakwaters. Surveys were carried out using a differential global positioning system (DGPS) in RTK mode. For each site, a 4800-m-long coastal segment was analyzed and divided into ninety-six 50-m-long sectors for which changes were computed using both the ABA and TBA methods. Sectors were progressively joined to lengths of 100, 200, 400, and 800 m to examine how this influenced the results. ABA and TBA results are highly correlated for transect distances and sector lengths up to 100 m at both investigated locations. If longer transects are considered, the two methods still produce well-correlated data on the smooth shoreline (i.e., at Punta Ala), but the correlation becomes significantly lower on the irregular shoreline (i.e., at Follonica). PMID:27640163
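The two estimators compared in the study can be written down directly; the following is a minimal illustrative sketch (the function names and the toy uniform-retreat example are ours, not the paper's):

```python
import numpy as np

def aba_displacement(area_change, segment_length):
    """Area based analysis (ABA): mean displacement as the beach-area
    variation divided by the length of the coastal segment."""
    return area_change / segment_length

def tba_displacement(transect_distances):
    """Transect based analysis (TBA): mean of the distances between the
    two shorelines measured at set points along shore-normal transects."""
    return float(np.mean(transect_distances))

# Toy example: uniform 2 m retreat along a 50 m sector measured at six
# equally spaced transects; for a smooth change the estimators coincide.
seg_len = 50.0
retreat = np.full(6, -2.0)                 # metres (negative = landward)
area_change = retreat.mean() * seg_len     # signed area between shorelines
aba = aba_displacement(area_change, seg_len)
tba = tba_displacement(retreat)
```

On an irregular shoreline the two diverge, because ABA averages the enclosed area while TBA samples the displacement only at the transect positions.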

  15. Analysis of records of external occupational dose records in Brazil; Analise dos registros de dose ocupacional externa no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Mauricio, Claudia L.P.; Silva, Herica L.R. da, E-mail: claudia@ird.gov.br, E-mail: herica@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ),Rio de Janeiro, RJ (Brazil); Silva, Claudio Ribeiro da, E-mail: claudio@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    Brazil, a continental country with currently more than 150,000 workers under individual monitoring for ionizing radiation, implemented in 1987 a centralized system for the storage of external occupational dose records. This database has been improved over the years and is now a web-based information system called the Brazilian External Occupational Dose Management Database System - GDOSE. This paper presents an overview of Brazilian external occupational dose over the years. The estimated annual average effective dose shows a decrease from 2.4 mSv in 1987 to about 0.6 mSv, with a marked reduction from 1987 to 1990. Analyzing by type of controlled practice, one sees that medical and dental radiology is the area with the largest number of users of individual monitors (70%), followed by education practices (8%) and industrial radiography (7%). In addition to photon whole-body monitoring, neutron monitors are used in maintenance (36%), reactors (30%) and education (27%); and extremity monitors in education (27%), nuclear medicine (22%) and radiology (19%). In terms of collective dose, the highest values are also found in conventional radiology, but the highest average dose values are those of interventional radiology. Nuclear medicine, R and D and radiotherapy also have average annual effective doses higher than 1 mSv. However, there are some very high dose values registered in GDOSE that give false information; these should be better analyzed in the future. Annual doses above 500 mSv are certainly not realistic. (author)

  16. A knowledge-based taxonomy of critical factors for adopting electronic health record systems by physicians: a systematic literature review

    Directory of Open Access Journals (Sweden)

    Martínez-García Ana I

    2010-10-01

Abstract Background The health care sector is an area of social and economic interest in several countries; therefore, there have been many efforts to promote the use of electronic health records. Nevertheless, there is evidence suggesting that these systems have not been adopted as expected, and although there are some proposals to support their adoption, the proposed support is not provided by means of information and communication technology, which could supply automatic tools of support. The aim of this study is to identify the critical adoption factors for electronic health records by physicians and to use them as a guide to support their adoption process automatically. Methods This paper presents, based on the PRISMA statement, a systematic literature review in electronic databases of adoption studies of electronic health records published in English. Software applications that manage and process the data in the electronic health record have been considered, i.e.: computerized physician prescription, electronic medical records, and electronic capture of clinical data. Our review was conducted with the purpose of obtaining a taxonomy of the physicians' main barriers to adopting electronic health records that can be addressed by means of information and communication technology; in particular with the information technology roles of the knowledge management processes. This leads to the question we address in this work: "What are the critical adoption factors of electronic health records that can be supported by information and communication technology?". Reports from eight databases covering electronic health record adoption studies in the medical domain, in particular those focused on physicians, were analyzed. Results The review identifies two main issues: 1) a knowledge-based classification of critical factors for adopting electronic health records by physicians; and 2) the definition of a base for the design of a conceptual

  17. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Science.gov (United States)

    2010-07-01

    ... film as specified in Department of Transportation regulations (49 CFR 172.101, Hazardous materials..., and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT AUDIOVISUAL...-1738, immediately after inspection about deteriorating permanent or unscheduled audiovisual...

  18. Reconstructing Past Depositional and Diagenetic Processes through Quantitative Stratigraphic Analysis of the Martian Sedimentary Rock Record

    Science.gov (United States)

    Stack, Kathryn M.

    High-resolution orbital and in situ observations acquired of the Martian surface during the past two decades provide the opportunity to study the rock record of Mars at an unprecedented level of detail. This dissertation consists of four studies whose common goal is to establish new standards for the quantitative analysis of visible and near-infrared data from the surface of Mars. Through the compilation of global image inventories, application of stratigraphic and sedimentologic statistical methods, and use of laboratory analogs, this dissertation provides insight into the history of past depositional and diagenetic processes on Mars. The first study presents a global inventory of stratified deposits observed in images from the High Resolution Image Science Experiment (HiRISE) camera on-board the Mars Reconnaissance Orbiter. This work uses the widespread coverage of high-resolution orbital images to make global-scale observations about the processes controlling sediment transport and deposition on Mars. The next chapter presents a study of bed thickness distributions in Martian sedimentary deposits, showing how statistical methods can be used to establish quantitative criteria for evaluating the depositional history of stratified deposits observed in orbital images. The third study tests the ability of spectral mixing models to obtain quantitative mineral abundances from near-infrared reflectance spectra of clay and sulfate mixtures in the laboratory for application to the analysis of orbital spectra of sedimentary deposits on Mars. The final study employs a statistical analysis of the size, shape, and distribution of nodules observed by the Mars Science Laboratory Curiosity rover team in the Sheepbed mudstone at Yellowknife Bay in Gale crater. This analysis is used to evaluate hypotheses for nodule formation and to gain insight into the diagenetic history of an ancient habitable environment on Mars.

  19. Implementation of a cloud-based electronic medical record to reduce gaps in the HIV treatment continuum in rural Kenya

    OpenAIRE

    John Haskew; Gunnar Rø; Kenrick Turner; Davies Kimanga; Martin Sirengo; Shahnaaz Sharif

    2015-01-01

    Background Electronic medical record (EMR) systems are increasingly being adopted to support the delivery of health care in developing countries and their implementation can help to strengthen pathways of care and close gaps in the HIV treatment cascade by improving access to and use of data to inform clinical and public health decision-making. Methods This study implemented a novel cloud-based electronic medical record system in an HIV outpatient setting in Western Kenya and eval...

  20. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects the calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies.
Conclusion The new toolbox presented here implements fast
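The limited-sampling bias discussed in this abstract can be illustrated with a plug-in mutual information estimator and a first-order (Miller-Madow-type) correction; this is a generic sketch of the idea, not code from the toolbox:

```python
import numpy as np

def mi_plugin(x, y):
    """Plug-in (naive) mutual information in bits between paired discrete
    samples x and y; biased upward for small sample sizes."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

def mi_miller_madow(x, y):
    """Subtract the first-order limited-sampling bias, approximately
    (Rx - 1)(Ry - 1) / (2 N ln 2) bits for Rx and Ry response levels."""
    n, rx, ry = len(x), len(set(x)), len(set(y))
    return mi_plugin(x, y) - (rx - 1) * (ry - 1) / (2 * n * np.log(2))

# Fully dependent binary variables: plug-in MI approaches H(x) <= 1 bit,
# and the corrected estimate is always slightly below the plug-in value.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 4000)
i_plug = mi_plugin(x, x)
i_corr = mi_miller_madow(x, x)
```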

  1. Application of Dense Array Analysis to Strong Motion Data Recorded at The SMART-1 Array

    Science.gov (United States)

    Francois, C.

    2003-12-01

    This paper is part of a project to design an optimal strong motion dense array in New Zealand. The overall project looks at developing a dense network of strong motion seismometers in order to measure directly the rupture process of major seismogenic sources such as the Alpine Fault and strands of the Marlborough Fault System defining the South Island sector of the Australia-Pacific plate boundary zone. This work shows the application of dense array analysis to a set of seismic data recorded at the SMART-1 array in Taiwan (data kindly provided by the Institute of Earth Sciences, Academia Sinica Data Management Center for Strong Motion Seismology - Taiwan). The data have been processed and analysed by applying a modified MUSIC algorithm with greater computing capability, giving higher-resolution results. The SMART-1 array is an ideal dense array of 37 strong motion instruments set up in the following configuration: three concentric circles of radii 200 m, 1 km and 2 km, and one central station. The studied event, called Event 5, was recorded on 29 January 1981 and had a magnitude of 6. Event 5 is an ideal case study as its epicentral distance (about 30 km) is comparable to epicentral distances for expected events on the Alpine Fault or on the Hope Fault in New Zealand. Event 5 has previously been widely analysed using strong motion array studies and aftershock studies, but with disagreeing results; this new study hopes to bring new insights into the debate. Using simple fault and velocity models, this latest analysis of Event 5 has given the following rupture properties. It has confirmed the hypothesis that the fault ruptured from southeast to northwest. The higher resolution of the computation has improved the location of the hypocentre depth and the location of the propagating rupture front. This allowed resolving changes of velocity in the rupture process and locating asperities in the fault plane. 
Contrary to the previous array studies, the inferred size of the fault

  2. Fragmented implementation of maternal and child health home-based records in Vietnam: need for integration

    Directory of Open Access Journals (Sweden)

    Hirotsugu Aiga

    2016-02-01

Background: Home-based records (HBRs) are implemented worldwide as effective tools that encourage pregnant women and mothers to utilise maternal and child health (MCH) services in a timely and adequate manner. While the availability and utilisation of nationally representative HBRs have been assessed in several earlier studies, the reality of a number of HBRs implemented subnationally in a less coordinated manner has been neither reported nor analysed. Objectives: This study aims to estimate the prevalence of HBRs for MCH and the level of fragmentation of, and overlap between, different HBRs for MCH in Vietnam. The study further attempts to identify health workers' and mothers' perceptions of HBR operation and utilisation. Design: A self-administered questionnaire was sent to the provincial health departments of 28 selected provinces. A copy of each HBR available was collected from them. A total of 20 semi-structured interviews with health workers and mothers were conducted in rural communities in four of the 28 selected provinces. Results: Whereas HBRs developed exclusively for maternal health and exclusively for child health were available in four provinces (14%) and in 28 provinces (100%), respectively, those covering both maternal and child health were available in nine provinces (32%). The mean number of HBRs in the 28 provinces (5.75) indicates over-availability of HBRs. All 119 minimum required recording items found in three different HBRs under nationwide scale-up were also included in the Maternal and Child Health Handbook being piloted for nationwide scale-up. Implementation of multiple HBRs is likely to confuse not only health workers, by requiring them to record the same data in several HBRs, but also mothers, who may not know which HBR to refer to and rely on at home. Conclusions: To enable both health workers and pregnant women to focus on only one type of HBR, province-specific HBRs for maternal and/or child health need to be

  3. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Müller Meinard

    2007-01-01

One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music in which the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis which can identify musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical feature to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.
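Repetition-based structure extraction typically begins with a self-similarity matrix over the feature sequence, in which repeated segments appear as stripes parallel to the main diagonal. The paper's enhanced features and distance measure are not reproduced here; the following is a generic pure-Python sketch using cosine similarity on toy chroma-like vectors:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def self_similarity_matrix(features):
    """S[i][j] compares frame i with frame j; repetitions of a segment
    show up as stripes parallel to the main diagonal."""
    return [[cosine(fi, fj) for fj in features] for fi in features]

# Toy chroma-like sequence: frames 0-1 repeat as frames 3-4.
frames = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0], [0, 1, 0]]
S = self_similarity_matrix(frames)
```

In a real system the stripes in S are then smoothed and thresholded to extract the repeated segments.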

  4. Image Processing Based Girth Monitoring and Recording System for Rubber Plantations

    Directory of Open Access Journals (Sweden)

    Chathura Thilakarathne

    2015-02-01

Measuring girth, and continuously monitoring its increase, is one of the most important processes in rubber plantations, since identifying girth deficiencies enables planters to take corrective action to ensure a good yield. This paper presents an image-processing-based girth measurement and recording system that can replace the existing manual process in an efficient and economical manner. The system uses a digital image of the tree, in which the number drawn on the tree identifies the tree and a black strip drawn on the trunk gives its width. The image is first thresholded and then filtered using several criteria to identify candidate regions for the number. The identified blobs are fed to the Tesseract OCR engine for number recognition. The thresholded image is filtered again with different criteria to segment the black strip, which is then used to calculate the width of the tree using calibration parameters. Once the tree number is identified and the width is calculated, the measured girth is stored in the database under that tree number. The results obtained from the system indicate significant improvements in efficiency and economy for main plantations. As future development, we propose a standard commercial system for girth measurement using standardized 2D barcodes as tree identifiers.
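The final calibration step of the pipeline can be sketched as follows; the function name and the circular cross-section assumption are ours, and `mm_per_px` stands for the camera calibration factor mentioned above:

```python
import math

def girth_mm(strip_width_px, mm_per_px):
    """Estimated trunk girth (circumference) in millimetres.

    The pixel width of the segmented black strip is converted to a physical
    diameter via the calibration factor, and the girth follows by assuming
    a roughly circular trunk cross-section.
    """
    diameter_mm = strip_width_px * mm_per_px
    return math.pi * diameter_mm
```

A strip measured as 200 px with a calibration factor of 0.5 mm/px thus corresponds to a 100 mm diameter and a girth of about 314 mm.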

  5. EHR query language (EQL)--a query language for archetype-based health records.

    Science.gov (United States)

    Ma, Chunlan; Frankel, Heath; Beale, Thomas; Heard, Sam

    2007-01-01

OpenEHR specifications have been developed to standardise the representation of an international electronic health record (EHR). The language used for querying EHR data is not as yet part of the specification. To fill this gap, Ocean Informatics has developed a query language currently known as the EHR Query Language (EQL), a declarative language supporting queries on EHR data. EQL is neutral with respect to EHR systems, programming languages and system environments, and depends only on the openEHR archetype model and semantics. Thus, in principle, EQL can be used in any archetype-based computational context. In the EHR context described here, particular queries mention concepts from the openEHR EHR Reference Model (RM). EQL can be used as a common query language for disparate archetype-based applications. With the use of a common RM, archetypes, and a companion query language such as EQL, semantic interoperability of EHR information is much closer. This paper introduces the EQL syntax and provides example clinical queries to illustrate the syntax. Finally, current implementations and future directions are outlined. PMID:17911747
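The abstract does not reproduce its example queries; the following is an illustrative sketch in the SELECT/FROM/CONTAINS/WHERE style used by EQL (and its successor, the openEHR Archetype Query Language). The archetype identifier and node paths are illustrative, not taken from the paper:

```sql
-- Find systolic blood pressure readings above 140 mmHg in one patient's EHR
-- (archetype id and at-code paths are illustrative)
SELECT o/data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/magnitude
FROM EHR e [ehr_id/value = $ehrUid]
CONTAINS COMPOSITION c
CONTAINS OBSERVATION o [openEHR-EHR-OBSERVATION.blood_pressure.v1]
WHERE o/data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/magnitude > 140
```

The key design point is that the query addresses data through archetype paths rather than a vendor's table schema, which is what makes it portable across archetype-based systems.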

  6. TECHNICAL NOTE: The development of a PZT-based microdrive for neural signal recording

    Science.gov (United States)

    Park, Sangkyu; Yoon, Euisung; Lee, Sukchan; Shin, Hee-sup; Park, Hyunjun; Kim, Byungkyu; Kim, Daesoo; Park, Jongoh; Park, Sukho

    2008-04-01

    A hand-controlled microdrive has been used to obtain neural signals from rodents such as rats and mice. However, it places severe physical stress on the rodents during its manipulation, and this stress leads to alertness in the mice and low efficiency in obtaining neural signals from the mice. To overcome this issue, we developed a novel microdrive, which allows one to adjust the electrodes by a piezoelectric device (PZT) with high precision. Its mass is light enough to install on the mouse's head. The proposed microdrive has three H-type PZT actuators and their guiding structure. The operation principle of the microdrive is based on the well known inchworm mechanism. When the three PZT actuators are synchronized, linear motion of the electrode is produced along the guiding structure. The electrodes used for the recording of the neural signals from neuron cells were fixed at one of the PZT actuators. Our proposed microdrive has an accuracy of about 400 nm and a long stroke of about 5 mm. In response to formalin-induced pain, single unit activities are robustly measured at the thalamus with electrodes whose vertical depth is adjusted by the microdrive under urethane anesthesia. In addition, the microdrive was efficient in detecting neural signals from mice that were moving freely. Thus, the present study suggests that the PZT-based microdrive could be an alternative for the efficient detection of neural signals from mice during behavioral states without any stress to the mice.

  7. Maturity Matrices for Quality of Model- and Observation-Based Climate Data Records

    Science.gov (United States)

    Höck, Heinke; Kaiser-Weiss, Andrea; Kaspar, Frank; Stockhause, Martina; Toussaint, Frank; Lautenschlager, Michael

    2015-04-01

In the field of software engineering, the Capability Maturity Model is used to evaluate and improve software development processes. The application of a Maturity Matrix is a method to assess the degree of software maturity. This method was adapted to the maturity of Earth System data in scientific archives. The application of such an approach to Climate Data Records was first proposed in the context of satellite-based climate products and applied by NOAA and NASA. The European FP7 project CORE-CLIMAX suggested and tested extensions of the approach in order to allow its applicability to additional climate datasets, e.g. those based on in-situ observations as well as model-based reanalysis. Within that project the concept was applied to products of satellite- and in-situ-based datasets. Examples are national ground-based data from Germany as an example of typical products of a national meteorological service, the EUMETSAT Satellite Application Facility Network, the ESA Climate Change Initiative, European reanalysis activities (ERA-CLIM) and international in-situ-based climatologies such as GPCC, ECA&D, BSRN and HadSST. Climate models and their related output have some additional characteristics that need specific consideration in such an approach. Here we use examples from the World Data Centre for Climate (WDCC) to discuss the applicability. The WDCC focuses on climate data products, specifically those resulting from climate simulations. Based on these existing Maturity Matrix models, WDCC developed a generic Quality Assessment System for Earth System data. A self-assessment is performed using a maturity matrix evaluating the data quality at five maturity levels with respect to the criteria of data and metadata consistency, completeness, accessibility and accuracy. The classical goals of a quality assessment system in a data processing workflow are: (1) to encourage data creators to improve quality to reach the next quality level, (2) to enable data consumers to decide

  8. [The influence of Donguibogam during the middle Joseon era based on clinical records on low back pain in Seungjeongwon ilgi].

    Science.gov (United States)

    Jung, Jae Young; Lee, Jun Hwan; Chung, Seok Hee

    2011-06-30

The recently increasing interest in historical records has led to more research on historical records in various fields of study. This trend has also affected medical research, with the medical climate and popular treatment modalities of the past now being revealed based on historical records. However, most research on medical history during the Joseon era has been based on the most well-known record, Joseon wangjo sillok, or the Annals of the Joseon Dynasty. Joseon wangjo sillok is a comprehensive and organized record of society during the Joseon era and contains key knowledge about medical history during the period, but it lacks details on the treatment of common disorders at the time. Seungjeongwon ilgi, or the Diary of the Royal Secretariat, has detailed records of daily events and is a valuable resource for the daily activities of the era. In the middle Joseon era, a variety of medical books - especially Donguibogam - were published. Therefore, the authors focused on the under-researched Seungjeongwon ilgi and Donguibogam and attempted to assess and evaluate low back pain treatment performed on Joseon royalty. The most notable characteristic of the low back treatment records within the Seungjeongwon ilgi is that diagnosis and treatment were made based on an independent Korean medicine, rather than conventional Chinese medicine. This paradigm shift is represented in Donguibogam, and can be seen in the close relationship between Donguibogam and the national medical exams of the day. In keeping with the pragmatism of the middle Joseon era, medical treatment also put more focus on pragmatic treatment methods, and records show an emphasis on acupuncture, moxibustion and other such methods in accord with this. The authors also observed the meaning and limitations of low back pain treatment during that era through comparison with current diagnosis and treatment. PMID:21894068

  9. How to limit the burden of data collection for Quality Indicators based on medical records? The COMPAQH experience

    Directory of Open Access Journals (Sweden)

    Grenier Catherine

    2008-10-01

Background: Our objective was to limit the burden of data collection for Quality Indicators (QIs) based on medical records. Methods: The study was supervised by the COMPAQH project. Four QIs based on medical records were tested: medical record conformity; traceability of pain assessment; screening for nutritional disorders; and time elapsed before sending a copy of the discharge letter to the general practitioner. Data were collected by 6 Clinical Research Assistants (CRAs) in a panel of 36 volunteer hospitals and analyzed by COMPAQH. To limit the burden of data collection, we used the same sample of medical records for all 4 QIs, limited the sample size to 80 medical records, and built a composite score of only 10 items to assess medical record completeness. We assessed QI feasibility by completing a grid of 19 potential problems and evaluating the time spent. We assessed reliability (κ coefficient) as well as internal consistency (Cronbach α coefficient) in an inter-observer study, and discriminatory power by analysing QI variability among hospitals. Results: Overall, 23,115 data items were collected for the 4 QIs and analyzed. The average time spent on data collection was 8.5 days per hospital. The most common feasibility problem was misunderstanding of an item by hospital staff. QI reliability was good (κ = 0.59–0.97, depending on the QI). The hospitals differed widely in their ability to meet the quality criteria (mean value: 19–85%). Conclusion: These 4 QIs based on medical records can be used to compare the quality of record keeping among hospitals while limiting the burden of data collection, and can therefore be used for benchmarking purposes. The French National Health Directorate has included them in the new 2009 version of the accreditation procedure for healthcare organizations.
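The κ coefficient reported above is Cohen's kappa, which corrects the raw inter-observer agreement for the agreement expected by chance, κ = (p_o − p_e) / (1 − p_e). A minimal pure-Python sketch of the computation (not the COMPAQH implementation):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same records."""
    n = len(rater_a)
    # Observed agreement: fraction of records where the raters concur.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: sum of products of the raters' marginal frequencies.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(count_a[k] * count_b[k] for k in set(count_a) | set(count_b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two raters scoring the same 10 medical records (1 = criterion met):
kappa = cohens_kappa([1, 1, 0, 1, 0, 1, 1, 1, 0, 0],
                     [1, 1, 0, 1, 0, 1, 0, 1, 0, 1])
```

A κ of 1 indicates perfect agreement; the 0.59–0.97 range reported above corresponds to moderate to almost-perfect reliability.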

  10. Digital Audio Legal Recorder

    Data.gov (United States)

    Department of Transportation — The Digital Audio Legal Recorder (DALR) provides the legal recording capability between air traffic controllers, pilots and ground-based air traffic control TRACONs...

  11. Lakes as recorders of extreme flows: utilising particle size analysis to generate a millennial-scale palaeoflood record from the English Lake District

    Science.gov (United States)

    Schillereff, Daniel; Chiverrell, Richard; Macdonald, Neil; Hooke, Janet

    2013-04-01

Developing new quantitative measures of catchment processes, such as flood events, is a key goal for geomorphologists. The geomorphic effects of extreme hydrological events are effectively recorded in upland lake basins, as efficient sediment trapping renders flow-related proxy indicators (e.g., particle size) reflective of changes in river discharge. We demonstrate that integrating particle size analysis of lake sediment cores with data from an on-going sediment trapping protocol within the lake can provide a valuable natural archive for investigating hydrogeomorphic extremes over extended time periods. A series of sediment cores (3 - 5 m length) extracted from Brotherswater, English Lake District, contain numerous coarse-grained laminations, discerned by applying high-resolution (0.5 cm) laser granulometry and interpreted to reflect a palaeoflood record extending to ~2000 yr BP. Well-constrained core chronologies are derived by integrating radionuclide (210Pb, 137Cs, 241Am, 14C) dating with geochemical markers which reflect phases of local lead (Pb) mining. Geochemical and magnetic profiles have facilitated precise core correlation and allowed the repeatability of the distinctive coarse facies to be verified. That these laminae exhibit inverse grading underlying normal grading, most likely reflecting the waxing and waning of flood-induced hyperpycnal flows, supports our palaeoflood interpretation. Application of a recently published end-member model for unmixing particle size distributions (Dietze et al., 2012) demonstrates a prominent coarse end-member (medium sand) which we attribute to fluvial transport of coarse grains during high-magnitude flows. Two end members feature in the silt-size fraction, most likely reflecting the sedimentary component delivered under normal flow conditions.
The relative importance of these two modes appears to respond to catchment conditioning due to land-use change, which has important implications for how flood events may be recorded

  12. ROAn, a ROOT based Analysis Framework

    CERN Document Server

    Lauf, Thomas

    2013-01-01

The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform analysis of data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensor developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible thanks to ROOT features such as run-time type information and reflection. ROAn provides an analysis program which allows the user to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps, which produce analysis results. The dependency between results, and thus the order of calculation, is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and...

  13. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The device is composed of a toughened glass slab supported by 4 force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time, where the analysis is conducted. The article explains the principle of operation as well as the measurement uncertainty algorithm for the COP (Centre of Pressure) coordinates (x, y).
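For a rectangular plate resting on four corner load cells, the COP is the force-weighted average of the sensor positions. The sketch below assumes a symmetric corner layout with the origin at the plate centre; the sensor naming and geometry are illustrative assumptions, not taken from the article:

```python
def centre_of_pressure(f_rl, f_rr, f_fl, f_fr, width, depth):
    """COP (x, y) in metres from four corner forces in newtons.

    rl/rr = rear-left/right, fl/fr = front-left/right; x grows to the
    right, y grows toward the front; origin at the plate centre.
    """
    total = f_rl + f_rr + f_fl + f_fr
    # Force-weighted average of the sensor x and y positions (+/- width/2, depth/2).
    x = (width / 2) * ((f_rr + f_fr) - (f_rl + f_fl)) / total
    y = (depth / 2) * ((f_fl + f_fr) - (f_rl + f_rr)) / total
    return x, y
```

Equal loading on all four cells yields (0, 0), i.e. the patient is centred; sampling (x, y) over time gives the sway trace that the platform records.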

  14. TEXTURE ANALYSIS BASED IRIS RECOGNITION

    OpenAIRE

    GÜRKAN, Güray; AKAN, Aydın

    2012-01-01

In this paper, we present a new method for personal identification based on iris patterns. The method is composed of iris image acquisition, image preprocessing, feature extraction and, finally, decision stages. Normalized iris images are vertically log-sampled and filtered by circularly symmetric Gabor filters. The outputs of the filters are windowed, and the mean absolute deviation of the pixels in each window is calculated as the feature vector. The proposed method has the desired properties of an iris reco...
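The per-window feature computation described above reduces to a mean absolute deviation. A generic sketch (the Gabor filtering stage is omitted; the input values stand for filtered pixel magnitudes in one window):

```python
def mad_feature(window):
    """Mean absolute deviation of the (filtered) pixel values in one window,
    one component of the iris feature vector."""
    mean = sum(window) / len(window)
    return sum(abs(v - mean) for v in window) / len(window)

feature = mad_feature([1, 2, 3, 4])  # deviations from mean 2.5: 1.5, 0.5, 0.5, 1.5
```

Concatenating one such value per window over the log-sampled, Gabor-filtered image yields the feature vector used in the decision stage.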

  15. Reconstructing Late Pleistocene air temperature variability based on branched GDGTs in the sedimentary record of Llangorse Lake (Wales)

    Science.gov (United States)

    Maas, David; Hoek, Wim; Peterse, Francien; Akkerman, Keechy; Macleod, Alison; Palmer, Adrian; Lowe, John

    2015-04-01

This study aims to provide a temperature reconstruction of the Lateglacial sediments of Llangorse Lake. A new temperature proxy is used, based on the occurrence of different membrane lipids of soil bacteria (de Jonge et al., 2014). Application of this proxy to lacustrine environments is difficult because of in situ (water column) production and co-elution of isomers. Pollen analysis provides a palynological record that can be used for biostratigraphical correlation to other records. Llangorse Lake lies in a glacial basin just northeast of the Brecon Beacons in Powys, South Wales. The lake is located upstream in the Afon Llynfi valley, at the edge of the watershed of the River Wye. The lake consists of two semi-separated basins with a maximum water depth of 7.5 m, arranged in an L-shape with a surface area of roughly 1.5 km2. Previous studies have focused on the Holocene development of the lake and its surrounding environment (Jones et al., 1985). This study focuses on the deglacial record present in the basal part of the sequence. The lake was cored in September 2014 with a manually operated 3 m piston corer from a small coring platform. Overlapping cores were taken to form a continuous 12 m core spanning the Holocene and the Lateglacial sediments. Six adjacent Lateglacial core segments from the southern basin of Llangorse Lake were scanned for their major element composition using XRF scanning at 5 mm resolution to discern changes in sediment origin. Furthermore, loss-on-ignition (LOI) analysis was used to determine changes in the organic content of the sediments. Subsamples of the Lateglacial sedimentary record were analyzed for the occurrence of different bacterial membrane lipids (brGDGTs: branched glycerol dialkyl glycerol tetraethers) by means of HPLC-MS (high-performance liquid chromatography and mass spectrometry) using two silica columns to achieve proper separation of isomers (de Jonge et al., 2013). Air temperatures are

  16. Reconstruction of Oceanographic Changes Based on the Diatom Records of the Central Okhotsk Sea over the last 500000 Years

    Directory of Open Access Journals (Sweden)

    Wei-Lung Wang and Liang-Chi Wang

    2008-01-01

This study provides insight into changes in sea-ice conditions and the oceanographic environment over the past 500 kyr through analysis of the diatom record. Based on the relative abundance of 13 diatom species in piston core MD012414, four types of environmental conditions in the central Okhotsk Sea over the last 330 ka BP have been distinguished: (1) open ocean alternating with seasonal sea-ice cover in Stages 9, 5, and 1; (2) almost open ocean free of sea-ice cover in Stages 7 and 3; (3) perennial sea-ice cover in Stages 6, 4, and 2; and (4) a warm ice-age dominated by open-ocean assemblages in Stage 8. The littoral diatom species Paralia sulcata showed a sudden increase from the glacial period to the interglacial period over the last 330 ka BP, except during Stage 8. This result implies that melting sea ice transported terrigenous materials from the northern Okhotsk Sea continental shelves to the central ocean during deglaciation. From Stage 13 to Stage 10, however, cold and warm marine conditions unexpectedly occurred in the late interglacial periods and the glacial periods, respectively. One possible reason for this is a lack of age control points from Stage 13 to Stage 10, and the different sediment accumulation rates between glacial and interglacial periods. This study not only suggests the process by which oceanographic variations in sea ice occurred, but also establishes new significance for Paralia sulcata as an indicator in the diatom record of the Okhotsk Sea.

  17. A Study on Enhancing Data Storage Capacity and Mechanical Reliability of Solid Immersion Lens-Based Near-Field Recording System

    Science.gov (United States)

    Park, No-Cheol; Yang, Hyun-Seok; Rhim, Yoon-Cheol; Park, Young-Pil

    2008-08-01

In this study, several technical issues concerning solid immersion lens (SIL)-based near-field recording (NFR) are explored, namely, how to enhance storage capacity and how to guarantee the mechanical reliability of the device. For the purpose of enhancing the storage capacity of the NFR system, two optical configurations using radial polarization and dual recording layers are proposed. A feasibility analysis of the proposed optical configuration with radial polarization determined that radially polarized illumination is not a suitable solution for achieving higher areal density. To apply the strong focusing characteristics of radially polarized light to cover-layer-protected data storage, an annular pupil filtering method was introduced. A complete field analysis of the proposed dual-layered NFR optics verified its feasibility, and the assembly of the SIL of the proposed model was successfully achieved. In addition, to improve the mechanical reliability of the SIL-based NFR system, improved near-field (NF) air-gap servo methods and an analysis of the air flow around the lower part of the SIL were evaluated. With improved NF gap servo methods using an error-based disturbance observer (EDOB) on a base air-gap controller, residual gap errors were markedly reduced, by 26.26%, while controlling the NF air-gap to 30 nm. Air flow near the head-media interface was visualized, and an undesirable backward flow climbing from the bottom surface of the SIL was observed.

  18. Independent Component Analysis and Decision Trees for ECG Holter Recording De-Noising

    OpenAIRE

    Jakub Kuzilek; Vaclav Kremen; Filip Soucek; Lenka Lhotska

    2014-01-01

We have developed a method for ECG signal de-noising using Independent Component Analysis (ICA). The approach combines JADE source separation with a binary decision tree for the identification and subsequent removal of ECG noise. To test the efficiency of this method, a wavelet-based de-noising method was used as a standard filtering baseline for comparison. Freely available data from the PhysioNet medical data archive were evaluated. The evaluation criterion was the root mean square error (RMSE) between origin...
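The RMSE criterion mentioned above compares the de-noised signal sample by sample with a reference signal. A minimal sketch:

```python
import math

def rmse(reference, estimate):
    """Root mean square error between two equal-length signals."""
    n = len(reference)
    return math.sqrt(sum((r - e) ** 2 for r, e in zip(reference, estimate)) / n)
```

A perfect reconstruction gives an RMSE of 0; in a de-noising evaluation, the method with the lower RMSE against the clean reference recording is the better filter.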

  19. Excel-Based Business Analysis

    CERN Document Server

    Anari, Ali

    2012-01-01

"The trend is your friend" is a practical principle often used by business managers, who seek to forecast future sales, expenditures, and profitability in order to make production and other operational decisions. The problem is how best to identify and discover business trends and utilize trend information for attaining the objectives of firms. This book contains an Excel-based solution to this problem, applying the principles of the authors' "profit system model" of the firm, which enables forecasts of trends in sales, expenditures, profits and other business variables. The program,

  20. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoehndorf, Robert

    2016-01-27

Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework for describing phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  1. Using a Web-Based Database to Record and Monitor Athletic Training Students' Clinical Experiences

    Science.gov (United States)

    Brown, Kirk W.; Williams, Lisa; Janicki, Thomas

    2008-01-01

Objective: The purpose of this article is to introduce a documentation recording system employing a Microsoft Structured Query Language (MS-SQL) database, used by the Athletic Training Education Program (ATEP) for the recording and monitoring of athletic training student (ATS) clinical experiences and hours. Background: Monitoring ATSs' clinical…

  2. An integrable, web-based solution for easy assessment of video-recorded performances

    DEFF Research Database (Denmark)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars

    2014-01-01

Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems; howe…-recorded performances (ISEA).

  3. Predictive value of casual ECG-based resting heart rate compared with resting heart rate obtained from Holter recording

    DEFF Research Database (Denmark)

    Carlson, Nicholas; Dixen, Ulrik; Marott, Jacob L;

    2014-01-01

    BACKGROUND: Elevated resting heart rate (RHR) is associated with cardiovascular mortality and morbidity. Assessment of heart rate (HR) from Holter recording may afford a more precise estimate of the effect of RHR on cardiovascular risk, as compared to casual RHR. Comparative analysis was carried ...

  4. A Novel Error Correcting System Based on Product Codes for Future Magnetic Recording Channels

    CERN Document Server

    Van, Vo Tam

    2012-01-01

We propose a novel construction of product codes for high-density magnetic recording based on binary low-density parity-check (LDPC) codes and the binary image of Reed-Solomon (RS) codes. Moreover, two novel algorithms are proposed to decode the codes in the presence of both AWGN errors and scattered hard errors (SHEs). Simulation results show that at a bit error rate (BER) of approximately 10^-8, our method improves the error performance by approximately 1.9 dB compared with that of a hard-decision decoder for RS codes of the same length and code rate. For the mixed error channel including random noise and SHEs, the signal-to-noise ratio (SNR) is set at 5 dB and 150 to 400 SHEs are randomly generated. The bit error performance of the proposed product code shows a significant improvement over that of equivalent random LDPC codes or a serial concatenation of LDPC and RS codes.
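The component codes in the paper are LDPC and RS codes; as a much simpler stand-in, the toy product code below uses single-parity-check codes on the rows and columns of a data block. It illustrates the key property of the product construction: a single flipped bit violates exactly one row check and one column check, and their intersection locates the error.

```python
def encode(block):
    """block: list of rows of 0/1 bits. Appends an even-parity bit to each
    row, then a row of column parities (a single-parity-check product code)."""
    rows = [r + [sum(r) % 2] for r in block]
    col_par = [sum(r[j] for r in rows) % 2 for j in range(len(rows[0]))]
    return rows + [col_par]

def correct_single_error(code):
    """Locate and flip a single bit error via the failing row/column checks."""
    bad_rows = [i for i, r in enumerate(code) if sum(r) % 2]
    bad_cols = [j for j in range(len(code[0]))
                if sum(r[j] for r in code) % 2]
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        code[bad_rows[0]][bad_cols[0]] ^= 1  # flip the intersecting bit
    return code
```

Replacing the row and column codes with stronger components (LDPC, RS) gives the same two-dimensional error-localisation structure with far greater correcting power, which is what the proposed construction exploits.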

  5. A tutorial on activity-based costing of electronic health records.

    Science.gov (United States)

    Federowicz, Marie H; Grossman, Mila N; Hayes, Bryant J; Riggs, Joseph

    2010-01-01

As the American Recovery and Reinvestment Act of 2009 allocates $19 billion to health information technology, it will be useful for health care managers to project the true cost of implementing an electronic health record (EHR). This study presents a step-by-step guide for using activity-based costing (ABC) to estimate the cost of an EHR. ABC is a cost accounting method with a "top-down" approach for estimating the cost of a project or service within an organization. The total cost to implement an EHR includes obvious costs, such as licensing fees, and hidden costs, such as impact on productivity. Unlike other methods, ABC includes all of the organization's expenditures and is less likely to miss hidden costs. Although ABC is used considerably in manufacturing and other industries, it is a relatively new phenomenon in health care. ABC is a comprehensive approach that the health care field can use to analyze the cost-effectiveness of implementing EHRs. In this article, ABC is applied to a health clinic that recently implemented an EHR, and the clinic is found to be more productive after EHR implementation. This methodology can help health care administrators assess the impact of a stimulus investment on organizational performance. PMID:20042937
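The essence of ABC is to price each activity via a cost driver and roll the activities up, so that hidden costs such as lost productivity appear as explicit line items. A sketch with purely hypothetical figures (none are from the article):

```python
# Activity-based costing roll-up for an EHR implementation.
# All rates and driver quantities below are hypothetical placeholders.
activities = {
    "software licensing":      {"rate": 450.0, "driver_qty": 20},   # per user
    "staff training":          {"rate": 35.0,  "driver_qty": 300},  # per hour
    "lost productivity":       {"rate": 55.0,  "driver_qty": 400},  # hidden cost, per hour
    "IT support":              {"rate": 40.0,  "driver_qty": 250},  # per hour
}

# Each activity's cost = resource rate x quantity of its cost driver.
total = sum(a["rate"] * a["driver_qty"] for a in activities.values())
```

Even in this toy roll-up, the hidden productivity line is the largest single item, which is exactly the kind of expenditure a licensing-fee-only estimate would miss.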

  7. Multimedia consultation session recording and playback using Java-based browser in global PACS

    Science.gov (United States)

    Martinez, Ralph; Shah, Pinkesh J.; Yu, Yuan-Pin

    1998-07-01

    The current version of the Global PACS software system uses a Java-based implementation of the Remote Consultation and Diagnosis (RCD) system. The Java RCD supports a multimedia consultation session between physicians comprising text, static images, image annotations, and audio data. The Java RCD allows 2-4 physicians to collaborate on a patient case. It allows physicians to join the session via WWW Java-enabled browsers or a stand-alone RCD application. The RCD system includes a distributed database archive system for archiving and retrieving patient and session data. The RCD system can be used for store-and-forward scenarios, case reviews, and interactive RCD multimedia sessions. The RCD system operates over the Internet, telephone lines, or in a private intranet. A multimedia consultation session can be recorded and then played back at a later time for review, comments, and education. A session can be played back using Java-enabled WWW browsers on any operating system platform. The Java RCD system shows that a case diagnosis can be captured digitally and played back with the original real-time temporal relationships between data streams. In this paper, we describe the design and implementation of RCD session playback.

  8. Implications of the Java language on computer-based patient records.

    Science.gov (United States)

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growing use of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm for delivering clinical information. Until recently, the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide the robust applications required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications for the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs. PMID:8947762

  9. An ecometric analysis of the fossil mammal record of the Turkana Basin

    Science.gov (United States)

    Žliobaitė, Indrė; Kaya, Ferhat; Bibi, Faysal; Bobe, René; Leakey, Louise; Leakey, Meave; Patterson, David; Rannikko, Janina; Werdelin, Lars

    2016-01-01

    Although ecometric methods have been used to analyse fossil mammal faunas and environments of Eurasia and North America, such methods have not yet been applied to the rich fossil mammal record of eastern Africa. Here we report results from analysis of a combined dataset spanning east and west Turkana from Kenya between 7 and 1 million years ago (Ma). We provide temporally and spatially resolved estimates of temperature and precipitation, discuss their relationship to patterns of faunal change, and propose a new hypothesis to explain the lack of a temperature trend. We suggest that the regionally arid Turkana Basin may have acted as a ‘species factory’ between 4 and 2 Ma, generating ecological adaptations in advance of the global trend. We show a persistent difference between the eastern and western sides of the Turkana Basin and suggest that the wetlands of the shallow eastern side could have provided additional humidity to the terrestrial ecosystems. Pending further research, a transient episode of faunal change centred at the time of the KBS Member (1.87–1.53 Ma) may be attributed equally plausibly to climate change or to a top-down ecological cascade initiated by the entry of technologically sophisticated humans. This article is part of the themed issue ‘Major transitions in human evolution’. PMID:27298463

  10. Effects of different analysis techniques and recording duty cycles on passive acoustic monitoring of killer whales.

    Science.gov (United States)

    Riera, Amalis; Ford, John K; Ross Chapman, N

    2013-09-01

    Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplement to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time analysis and long-term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long-term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower-resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency. PMID:23968036

  11. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  13. A Patient-Based Analysis of Drug Disorder Diagnoses in the Medicare Population

    OpenAIRE

    Cartwright, William S.; Ingster, Lillian M.

    1993-01-01

    This article utilizes the Part A Medicare provider analysis and review (MEDPAR) file for fiscal year (FY) 1987. The discharge records were organized into a patient-based record that included alcohol, drug, and mental (ADM) disorder diagnoses as well as measures of resource use. The authors find that there are substantially higher costs of health care incurred by the drug disorder diagnosed population. Those of the Medicare population diagnosed with drug disorders had longer lengths of stay (L...

  14. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)
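The software fault tree analysis mentioned above propagates basic-event probabilities up through AND/OR gates; a minimal evaluation sketch, assuming independent basic events and using illustrative probabilities:

```python
def and_gate(*p):
    """Top event occurs only if all inputs fail (independence assumed)."""
    prob = 1.0
    for x in p:
        prob *= x
    return prob

def or_gate(*p):
    """Top event occurs if at least one input fails (independence assumed)."""
    none_fail = 1.0
    for x in p:
        none_fail *= (1.0 - x)
    return 1.0 - none_fail

# Illustrative tree: a fault slips past two diverse software checks (AND),
# or the watchdog itself fails (OR). Probabilities are made up for the sketch.
top = or_gate(and_gate(1e-3, 1e-3), 5e-6)
```

Quantitative software reliability models would supply the basic-event probabilities here; the report's observation is precisely that current models do not yield such PSA-ready numbers directly.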

  15. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March, 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information: http://homer.ou.nl/lsa-workshop0

  16. Continuous Recording and Interobserver Agreement Algorithms Reported in the "Journal of Applied Behavior Analysis" (1995-2005)

    Science.gov (United States)

    Mudford, Oliver C.; Taylor, Sarah Ann; Martin, Neil T.

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the "Journal of Applied Behavior Analysis (JABA)": Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement,…

  17. Forming Teams for Teaching Programming based on Static Code Analysis

    Directory of Open Access Journals (Sweden)

    Davis Arosemena-Trejos

    2012-03-01

    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, and the tool evaluates the best formation of teams in a given course. Team formation is based on programming styles, skills, pair programming or working with a leader.

  18. Forming Teams for Teaching Programming based on Static Code Analysis

    CERN Document Server

    Arosemena-Trejos, Davis; Clunie, Clifton

    2012-01-01

    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, and the tool evaluates the best formation of teams in a given course. Team formation is based on programming styles, skills, pair programming or working with a leader.
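One concrete (and intentionally simple) policy for the team-formation step can be sketched as follows: rank students by a skill score aggregated from static-analysis metrics and pair the strongest remaining student with the weakest. The balancing rule is an assumption for illustration, not SOFORG's actual algorithm:

```python
def form_teams(students, team_size=2):
    """students: list of (name, skill_score) pairs, e.g. scores aggregated
    from static-analysis metrics. Pairs the strongest remaining student
    with the weakest so average skill is balanced across teams."""
    ranked = sorted(students, key=lambda s: s[1], reverse=True)
    teams = []
    while len(ranked) >= team_size:
        team = [ranked.pop(0)]            # strongest remaining student
        while len(team) < team_size:
            team.append(ranked.pop())     # weakest remaining student
        teams.append(team)
    if ranked:                            # leftover students form a last team
        teams.append(ranked)
    return teams
```

For example, `form_teams([("ana", 90), ("bob", 50), ("eva", 70), ("dan", 30)])` pairs ana with dan and eva with bob, so both teams average skill 60.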

  19. Analysis of Readex's Serial Set MARC Records: Improving the Data for the Library Catalog

    Science.gov (United States)

    Draper, Daniel; Lederer, Naomi

    2013-01-01

    Colorado State University Libraries (CSUL) purchased the digitized "United States Congressional Serial Set," 1817-1994 and "American State Papers" (1789-1838) from the Readex Division of NewsBank, Inc. and, once funds and records were available, the accompanying MARC records. The breadth of information found in the "Serial Set" is described, along…

  20. Quantitative analysis of single muscle fibre action potentials recorded at known distances

    NARCIS (Netherlands)

    Albers, B.A.; Put, J.H.M.; Wallinga, W.; Wirtz, P.

    1989-01-01

    In vivo records of single fibre action potentials (SFAPs) have always been obtained at unknown distance from the active muscle fibre. A new experimental method has been developed enabling the derivation of the recording distance in animal experiments. A single fibre is stimulated with an intracellu

  1. CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients

    Science.gov (United States)

    Fritsche, L; Lindemann, G; Schroeter, K; Schlaefer, A; Neumayer, H-H

    1999-01-01

    Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only a few working EPR systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields in which patients are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential benefits of an EPR. The aim of our project was to investigate the feasibility of an EPR based solely on "off-the-shelf" software and Internet technology in the field of organ transplantation. Methods The EPR system consists of three main elements: data-storage facilities, a Web server and a user interface. Data are stored either in a relational database (Sybase Adaptive 11.5, Sybase Inc., CA) or, in the case of pictures (JPEG) and files in application formats (e.g. Word documents), on a Windows NT 4.0 Server (Microsoft Corp., WA). The entire communication of all data is handled by a Web server (IIS 4.0, Microsoft) with an Active Server Pages extension. The database is accessed by ActiveX Data Objects via the ODBC interface. The only software required on the user's computer is Internet Explorer 4.01 (Microsoft); during the first use of the EPR, the ActiveX HTML Layout Control is automatically added. The user can access the EPR via Local or Wide Area Network or by dial-up connection. If the EPR is accessed from outside the firewall, all communication is encrypted (SSL 3.0, Netscape Comm. Corp., CA). The speed of the EPR system was tested with 50 repeated measurements of the duration of two key functions: 1) display of all lab results for a given day and patient and 2) automatic composition of a letter containing diagnoses, medication, notes and lab results.
For the test a 233 MHz Pentium II Processor with 10 Mbit/s Ethernet connection (ping-time below 10 ms) over 2 hubs to the server

  2. [Pressure ulcer care quality indicator: analysis of medical records and incident report].

    Science.gov (United States)

    dos Santos, Cássia Teixeira; Oliveira, Magáli Costa; Pereira, Ana Gabriela da Silva; Suzuki, Lyliam Midori; Lucena, Amália de Fátima

    2013-03-01

    Cross-sectional study that aimed to compare the data reported in a system of pressure ulcer (PU) care quality indicators with the nursing evolution data available in the patients' medical records, and to describe the clinical profile and nursing diagnoses of those who developed PU grade 2 or higher. The sample consisted of 188 patients at risk for PU in clinical and surgical units. Data were collected retrospectively from medical records and a computerized system of care indicators and statistically analyzed. Of the 188 patients, 6 (3%) were reported in the indicator system for pressure ulcers grade 2 or higher; however, 19 (10%) had such ulcers recorded in the nursing evolution notes, thus revealing the underreporting of data. Most patients were women, older adults, and patients with cerebrovascular diseases. The most frequent nursing diagnosis was risk of infection. The use of two or more research methodologies, such as incident reporting data and retrospective review of patients' records, makes the results trustworthy.

  3. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Directory of Open Access Journals (Sweden)

    Catalina Pimiento

    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the modern composition and function of modern gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.

  4. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Science.gov (United States)

    Pimiento, Catalina; Clements, Christopher F

    2014-01-01

    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the modern composition and function of modern gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.
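The Optimal Linear Estimation model infers the endpoint of a sighting record from the spacings among the k most recent sightings. A sketch of the Roberts-Solow point estimate, using illustrative sighting dates in years rather than the paper's fossil occurrence data (no confidence interval computed):

```python
import numpy as np
from math import gamma, log

def ole_extinction_estimate(sightings, k=10):
    """Optimal Linear Estimation (Roberts & Solow 2003) point estimate of
    an extinction time from the k most recent sightings: the top-k order
    statistics are combined with weights derived from a Weibull model of
    the record's tail (sketch only)."""
    t = np.sort(np.asarray(sightings, dtype=float))[::-1][:k]  # most recent first
    k = len(t)
    # Weibull shape estimate from the spacings of the record
    v = sum(log((t[0] - t[k - 1]) / (t[0] - t[i])) for i in range(1, k - 1)) / (k - 1)
    # Lambda matrix for the joint distribution of the top-k order statistics
    lam = np.empty((k, k))
    for i in range(1, k + 1):
        for j in range(1, i + 1):
            lam[i - 1, j - 1] = lam[j - 1, i - 1] = (
                gamma(2 * v + i) * gamma(v + j) / (gamma(v + i) * gamma(j))
            )
    e = np.ones(k)
    w = np.linalg.solve(lam, e)
    a = w / (e @ w)            # optimal weights, summing to 1
    return float(a @ t)        # extrapolates beyond the most recent sighting

# Illustrative sighting record (calendar years, not real fossil dates):
record = [1598, 1601, 1602, 1607, 1611, 1615, 1621, 1628, 1631, 1638,
          1645, 1651, 1658, 1662]
estimate = ole_extinction_estimate(record)
```

Applied to fossil occurrences, the "sightings" are dated occurrences and the estimate is the inferred extinction date; the paper applies exactly this extrapolation logic to C. megalodon records.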

  6. A novel assessment of odor sources using instrumental analysis combined with resident monitoring records for an industrial area in Korea

    Science.gov (United States)

    Lee, Hyung-Don; Jeon, Soo-Bin; Choi, Won-Joon; Lee, Sang-Sup; Lee, Min-Ho; Oh, Kwang-Joong

    2013-08-01

    Residents living near the Sa-sang industrial area (SSIA) have continuously suffered from odorous pollution since the 1990s. We determined the concentrations of reduced sulfur compounds (RSCs) [hydrogen sulfide (H2S), methyl mercaptan (CH3SH), dimethyl sulfide (DMS), and dimethyl disulfide (DMDS)], nitrogenous compounds (NCs) [ammonia (NH3) and trimethylamine (TMA)], and carbonyl compounds (CCs) [acetaldehyde and butyraldehyde] by instrumental analysis in the SSIA in Busan, Korea, from June to November 2011. We determined odor intensity (OI) based on the concentrations of the odorants and resident monitoring records (RMR). The mean concentration of H2S was 10 times higher than those of the NCs, CCs and the other RSCs. The contribution of RSCs to the OI was over 50% at all sites except the A-5 (chemical production) site. In particular, the A-4 (food production) site showed a sum of odor activity values (SOAV) more than 8 times higher than the other sites, suggesting that A-4 was the most malodorous area in the SSIA. From the RMR analysis, the annoyance degree (OI ≥ 2) was 51.9% in the industrial area. The 'Rotten' smell arising from the RSCs showed the highest frequency (25.3%), while 'Burned' and 'Other' were more frequent than 'Rotten' in the residential area. The correlation between the odor index calculated by instrumental analysis and the OI from the RMR was analyzed. The Pearson correlation coefficient (r) for the SOAV was the highest at 0.720 (P < 0.05), and the coefficients overall showed a moderately high correlation (ranging from 0.465 to 0.720). The overall results of this research therefore confirm that H2S emitted from the A-4 food-production site causes significant annoyance in the SSIA. We also confirm that RMR data can be used effectively to evaluate the characteristics of odorants emitted from the SSIA.
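The sum of odor activity values (SOAV) used to rank the sites divides each compound's measured concentration by its odor detection threshold and sums the ratios. A minimal sketch; the thresholds and concentrations below are illustrative assumptions, not the paper's measurements:

```python
# Odor detection thresholds in ppb (illustrative values, not the paper's):
thresholds_ppb = {"H2S": 0.41, "CH3SH": 0.07, "NH3": 1500.0}

def soav(concentrations_ppb):
    """Sum of odor activity values: OAV = concentration / threshold,
    summed over all measured compounds at a site."""
    return sum(c / thresholds_ppb[compound]
               for compound, c in concentrations_ppb.items())

# Hypothetical site measurement (ppb):
site_a4 = soav({"H2S": 12.0, "CH3SH": 0.5, "NH3": 900.0})
```

Note how a compound with a very low threshold (H2S here) dominates the SOAV even at modest concentrations, which is consistent with the paper's finding that H2S drives the annoyance.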

  7. DETECTION OF QRS COMPLEXES OF ECG RECORDING BASED ON WAVELET TRANSFORM USING MATLAB

    Directory of Open Access Journals (Sweden)

    Ruchita Gautam,

    2010-07-01

    The electrocardiogram (ECG) is an important tool for learning more about the heart. The main tasks in ECG signal analysis are the detection of the QRS complex (i.e., the R wave) and the estimation of the instantaneous heart rate by measuring the time interval between two consecutive R waves. After recognizing the R wave, other components like P, Q, S and T can be detected by using a window method. In this paper, we describe a QRS complex detector based on the dyadic wavelet transform (DyWT), which is robust to time-varying QRS complex morphology and to noise. We illustrate the performance of the DyWT-based QRS detector by considering problematic ECG signals from the Common Standards for Electrocardiography (CSE) database. We also compare and analyze its performance against some of the QRS detectors developed in the past.
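A toy detector in the spirit of the DyWT approach can be sketched as follows: convolve the ECG with a derivative-of-Gaussian kernel at one dyadic smoothing scale, then place each R peak at the zero crossing of the transform between its positive and negative modulus maxima. This is a hedged sketch on a synthetic signal, not the paper's validated detector:

```python
import numpy as np

def detect_r_peaks(ecg, fs, scale=8):
    """Locate R peaks as zero crossings of a smoothed-derivative transform
    (one dyadic scale, mimicking a single DyWT level). Returns sample indices."""
    n = np.arange(-2 * scale, 2 * scale + 1)
    smooth = np.exp(-(n / scale) ** 2)
    kernel = np.gradient(smooth)                  # derivative-of-Gaussian
    w = np.convolve(ecg, kernel, mode="same")     # ~ one dyadic wavelet scale
    thresh = 0.4 * np.max(np.abs(w))
    refractory = int(0.2 * fs)                    # skip 200 ms after each beat
    peaks, i = [], 0
    while i < len(w):
        if abs(w[i]) > thresh:
            # Zero crossing between the modulus maxima marks the R wave
            seg = w[i:min(i + refractory, len(w))]
            zc = np.where(np.diff(np.sign(seg)) != 0)[0]
            if zc.size:
                peaks.append(i + int(zc[0]) + 1)
            i += refractory
        else:
            i += 1
    return peaks
```

Searching for the zero crossing (rather than a raw amplitude peak) is what makes this style of detector tolerant of varying QRS morphology, since only the local slope pattern matters.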

  8. How does Canada stack up? A bibliometric analysis of the primary healthcare electronic medical record literature

    Directory of Open Access Journals (Sweden)

    Amanda L Terry

    2013-09-01

    Background Major initiatives are underway in Canada which are designed to increase electronic medical record (EMR) implementation and maximise its use in primary health care. These developments need to be supported by sufficient evidence from the literature, particularly relevant research conducted in the Canadian context. Objectives This study sought to quantify this lack of research by: (1) identifying and describing the primary healthcare EMR literature; and (2) comparing the Canadian and international primary healthcare EMR literature on the basis of content and publication levels. Methods Seven bibliographic databases were searched using primary health care and EMR keywords. Publication abstracts were reviewed and categorised. First-author affiliation was used to identify country of origin. Proportions of Canadian- and non-Canadian-authored publications were compared using Fisher's exact test. For countries having 10 or more primary healthcare EMR publications, publications per 10 000 researchers were calculated. Results After exclusions, 750 publications were identified. More than one-third used primary healthcare EMRs as a study data source. Twenty-two (3%) were Canadian-authored. There were significantly different publication levels in three categories between Canadian- and non-Canadian-authored publications. Based on publications per researcher, the Netherlands ranked first, while Canada ranked eighth of nine countries with 10 or more publications. Conclusions A relatively small body of literature focused on EMRs in primary health care exists; publications by Canadian authors were low. This study highlights the need to develop a strong evidence base to support the effective implementation

  9. Analysis of Enhanced Associativity Based Routing Protocol

    Directory of Open Access Journals (Sweden)

    Said A. Shaar

    2006-01-01

    This study introduces an analysis of the performance of the Enhanced Associativity Based Routing protocol (EABR) based on two factors: operation complexity (OC) and communication complexity (CC). OC can be defined as the number of steps required in performing a protocol operation, while CC can be defined as the number of messages exchanged in performing a protocol operation [1]. The values represent a worst-case analysis. The EABR has been analyzed based on CC and OC, and the results have been compared with another routing technique called ABR. The results show that EABR can perform better than ABR in many circumstances during route reconstruction.

  10. An Open Architecture Scaleable Maintainable Software Defined Commodity Based Data Recorder And Correlator Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project addresses the need for higher data rate recording capability, increased correlation speed and flexibility needed for next generation VLBI systems. The...

  11. Version based spatial record management techniques for spatial database management system

    Institute of Scientific and Technical Information of China (English)

    KIM Ho-seok; KIM Hee-taek; KIM Myung-keun; BAE Hae-young

    2004-01-01

    The search operation on spatial data has been the principal operation in existing spatial database management systems, but update operations on spatial data, such as tracking, now occur frequently as well. The need to improve concurrency among transactions is therefore increasing. In general database management systems, many techniques have been studied to solve the concurrency problems of transactions. Among them, multi-version algorithms minimize interference among transactions. However, applying an existing multi-version algorithm to a spatial database management system to improve transaction concurrency wastes storage, because an entire version of a spatial record must be stored even if only the aspatial data of the record has changed. This paper proposes record management techniques that maintain separate versions for aspatial data and spatial data, decreasing the storage consumed by record versions and improving concurrency among transactions.
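The separation of version chains can be sketched with a hypothetical in-memory model (not the paper's storage engine): each record keeps independent version lists for its aspatial attributes and its geometry, so an attribute-only update never copies the geometry:

```python
from dataclasses import dataclass, field

@dataclass
class VersionedSpatialRecord:
    """Sketch of split versioning: aspatial and spatial data are versioned
    independently, so updating one side never duplicates the other."""
    aspatial_versions: list = field(default_factory=list)  # [(version, attrs)]
    spatial_versions: list = field(default_factory=list)   # [(version, geometry)]

    def update_attrs(self, version, attrs):
        self.aspatial_versions.append((version, attrs))

    def update_geometry(self, version, geometry):
        self.spatial_versions.append((version, geometry))

    def snapshot(self, version):
        """Latest attribute and geometry versions visible at `version`."""
        attrs = max((v for v in self.aspatial_versions if v[0] <= version),
                    default=None)
        geom = max((v for v in self.spatial_versions if v[0] <= version),
                   default=None)
        return attrs, geom
```

An attribute rename at version 2 leaves the (typically much larger) geometry version chain untouched, which is exactly the storage saving the abstract claims over whole-record multi-versioning.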

  12. Analysis of a Chaotic Memristor Based Oscillator

    Directory of Open Access Journals (Sweden)

    F. Setoudeh

    2014-01-01

    A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and a new chaos analysis methodology based on the energy distribution is presented using the discrete wavelet transform (DWT). Then, using Advanced Design System (ADS) software, an implementation of the chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of the paper.
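The DWT energy-distribution idea can be sketched with an orthonormal Haar transform (a stand-in for whatever wavelet family the paper uses): compute the fraction of signal energy in each dyadic band; a broadband chaotic signal spreads energy across bands, while a periodic signal concentrates it in one:

```python
import numpy as np

def haar_dwt_energy(x, levels=4):
    """Normalized energy per dyadic band via an orthonormal Haar DWT.
    Returns fractions for [detail level 1..levels, final approximation];
    input length should be divisible by 2**levels."""
    approx = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)  # approximation
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)  # detail band
        energies.append(float(np.sum(d ** 2)))
        approx = a
    energies.append(float(np.sum(approx ** 2)))
    total = sum(energies)
    return [e / total for e in energies]
```

A slow sine tone, for instance, leaves almost all of its energy in the final approximation band, whereas a chaotic trace from the oscillator model would register non-negligible energy at several detail levels.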

  13. Analysis of a Chaotic Memristor Based Oscillator

    OpenAIRE

    F. Setoudeh; Khaki Sedigh, A.; Dousti, M

    2014-01-01

    A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and also a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advance Design System (ADS) software, implementation of chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of t...

  14. Longitudinal Prescribing Patterns for Psychoactive Medications in Community-Based Individuals with Developmental Disabilities: Utilization of Pharmacy Records

    Science.gov (United States)

    Lott, I. T.; McGregor, M.; Engelman, L.; Touchette, P.; Tournay, A.; Sandman, C.; Fernandez, G.; Plon, L.; Walsh, D.

    2004-01-01

    Little is known about longitudinal prescribing practices for psychoactive medications for individuals with intellectual disabilities and developmental disabilities (IDDD) who are living in community settings. Computerized pharmacy records were accessed for 2344 community-based individuals with IDDD for whom a total of 3421 prescriptions were…

  15. 13 CFR 106.403 - Who has authority to approve and sign a Non-Fee Based Record?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who has authority to approve and...-Sponsored Activities § 106.403 Who has authority to approve and sign a Non-Fee Based Record? The appropriate Responsible Program Official, after consultation with the designated legal counsel, has authority to...

  16. Holographic theory and recording techniques. Citations from the NTIS data base

    Science.gov (United States)

    Carrigan, B.

    1980-05-01

    The topics cited include holographic recording techniques, theory, equipment, and materials. Among the techniques cited are color holography, X-ray holography, high speed holography, and motion picture holography. Photographic materials, films, emulsions, and equipment for recording and information storage are covered. Techniques for image motion compensation, image deblurring, wave-front reconstruction, and resolution are also cited. This updated bibliography contains 251 abstracts, 17 of which are new entries to the previous edition.

  17. Polarization properties of four-wave interaction in dynamic recording material based on bacteriorhodopsin

    Science.gov (United States)

    Korchemskaya, Ellen Y.; Soskin, Marat S.

    1994-10-01

    The polarization properties of four-wave interaction in polymer films with bacteriorhodopsin, which possess an anisotropically saturating nonlinearity, are studied both theoretically and experimentally. The amplitude and polarization of the diffracted wave for a recording material with anisotropically saturating nonlinearity are calculated. The low saturation intensity makes it possible to manipulate the polarization of low-intensity signals. It is shown that the polarization of the diffracted wave can be controlled only by varying the recording light intensity.

  18. 3D Reconstruction of Human Laryngeal Dynamics Based on Endoscopic High-Speed Recordings.

    Science.gov (United States)

    Semmler, Marion; Kniesburges, Stefan; Birk, Veronika; Ziethe, Anke; Patel, Rita; Dollinger, Michael

    2016-07-01

    Standard laryngoscopic imaging techniques provide only limited two-dimensional insights into the vocal fold vibrations not taking the vertical component into account. However, previous experiments have shown a significant vertical component in the vibration of the vocal folds. We present a 3D reconstruction of the entire superior vocal fold surface from 2D high-speed videoendoscopy via stereo triangulation. In a typical camera-laser set-up the structured laser light pattern is projected on the vocal folds and captured at 4000 fps. The measuring device is suitable for in vivo application since the external dimensions of the miniaturized set-up barely exceed the size of a standard rigid laryngoscope. We provide a conservative estimate on the resulting resolution based on the hardware components and point out the possibilities and limitations of the miniaturized camera-laser set-up. In addition to the 3D vocal fold surface, we extended previous approaches with a G2-continuous model of the vocal fold edge. The clinical applicability was successfully established by the reconstruction of visual data acquired from 2D in vivo high-speed recordings of a female and a male subject. We present extracted dynamic parameters like maximum amplitude and velocity in the vertical direction. The additional vertical component reveals deeper insights into the vibratory dynamics of the vocal folds by means of a non-invasive method. The successful miniaturization allows for in vivo application giving access to the most realistic model available and hence enables a comprehensive understanding of the human phonation process. PMID:26829782
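
    Depth recovery in such a camera-laser set-up ultimately rests on stereo triangulation. As a simplified illustration (not the authors' calibration pipeline), the classic disparity relation z = f·b/d links focal length, baseline, and disparity to depth; all numbers below are hypothetical:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic stereo-triangulation relation: z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Hypothetical numbers for a miniaturized camera-laser set-up:
# 800 px focal length, 5 mm baseline, 40 px disparity -> 100 mm depth
z = depth_from_disparity(focal_px=800, baseline_mm=5.0, disparity_px=40.0)
```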

  19. Independent component analysis and decision trees for ECG holter recording de-noising.

    Directory of Open Access Journals (Sweden)

    Jakub Kuzilek

    Full Text Available We have developed a method for ECG signal de-noising using Independent Component Analysis (ICA). This approach combines JADE source separation with a binary decision tree for identification and subsequent removal of ECG noise. To test the efficiency of this method, a comparison with standard filtering, a wavelet-based de-noising method, was performed. Data freely available at the Physionet medical data storage were evaluated. The evaluation criterion was the root mean square error (RMSE) between the original ECG and filtered data contaminated with artificial noise. The proposed algorithm achieved comparable results for standard noises (power line interference, baseline wander, EMG), but significantly better results when an uncommon noise (electrode cable movement artefact) was compared.
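
    The RMSE criterion used for the evaluation can be sketched as follows (a minimal stand-alone illustration, not the authors' code):

```python
import math

def rmse(reference, estimate):
    """Root-mean-square error between a clean ECG and a de-noised estimate."""
    if len(reference) != len(estimate):
        raise ValueError("signals must have equal length")
    return math.sqrt(sum((r - e) ** 2 for r, e in zip(reference, estimate))
                     / len(reference))

# toy samples standing in for an ECG segment and its de-noised version
clean    = [0.0, 1.0, 0.5, -0.5, 0.0]
denoised = [0.1, 0.9, 0.5, -0.4, 0.0]
score = rmse(clean, denoised)  # sqrt(0.03 / 5)
```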

  20. Development of an Electronic Claim System Based on an Integrated Electronic Health Record Platform to Guarantee Interoperability

    OpenAIRE

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-01-01

    Objectives We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. Methods The developed system serves as a simple clinical data repository, it automatically fills out the Centers for Medicare and Medic...

  1. Beyond the Eyes of the Monster: An Analysis of Recent Trends in Assessment and Recording.

    Science.gov (United States)

    Ainscow, Mel

    1988-01-01

    The article analyzes existing practice in assessment and recording in the special needs field and makes recommendations, such as conducting assessment more continuously and from a wider perspective, to help make the curriculum responsive to individual needs. (DB)

  2. Managing Everyday Life: A Qualitative Study of Patients’ Experiences of a Web-Based Ulcer Record for Home-Based Treatment

    Science.gov (United States)

    Trondsen, Marianne V.

    2014-01-01

    Chronic skin ulcers are a significant challenge for patients and health service resources, and ulcer treatment often requires the competence of a specialist. Although e-health interventions are increasingly valued for ulcer care by giving access to specialists at a distance, there is limited research on patients’ use of e-health services for home-based ulcer treatment. This article reports an exploratory qualitative study of the first Norwegian web-based counselling service for home-based ulcer treatment, established in 2011 by the University Hospital of North Norway (UNN). Community nurses, general practitioners (GPs) and patients are offered access to a web-based record system to optimize ulcer care. The web-based ulcer record enables the exchange and storage of digital photos and clinical information, by the use of which, an ulcer team at UNN, consisting of specialized nurses and dermatologists, is accessible within 24 h. This article explores patients’ experiences of using the web-based record for their home-based ulcer treatment without assistance from community nurses. Semi-structured interviews were conducted with a total of four patients who had used the record. The main outcomes identified were: autonomy and flexibility; safety and trust; involvement and control; and motivation and hope. These aspects improved the patients’ everyday life during long-term ulcer care and can be understood as stimulating patient empowerment.

  3. Analysis of laser turbulence utilizing a video tape recorder and digital storage oscilloscope.

    OpenAIRE

    Connor, John Henry

    1982-01-01

    Approved for public release; distribution unlimited The ability to measure and predict atmospheric turbulence affecting laser beam propagation is a major concern when considering military applications. Such a method using a telescope, high resolution television camera, video tape recorder, digital storage oscilloscope, and calculator system has been devised, tested and utilized. A laser beam signal is recorded on video tape for further processing. This signal is displayed...

  4. Analysis of historical meteor and meteor shower records: Korea, China, and Japan

    CERN Document Server

    Yang, Hong-Jin; Park, Changbom; Park, Myeong-Gu

    2005-01-01

    We have compiled and analyzed historical Korean meteor and meteor shower records in three Korean official history books: Samguksagi, which covers the Three Kingdoms period (57 B.C. -- A.D. 935), Goryeosa of the Goryeo dynasty (A.D. 918 -- 1392), and Joseonwangjosillok of the Joseon dynasty (A.D. 1392 -- 1910). We have found 3861 meteor and 31 meteor shower records. We have confirmed the peaks of the Perseids and an excess due to a mixture of the Orionids, north-Taurids, or Leonids through a Monte-Carlo test. The peaks persist from the period of the Goryeo dynasty to that of the Joseon dynasty, for almost one thousand years. Korean records show a decrease in Perseid activity and an increase in Orionid/north-Taurid/Leonid activity. We have also analyzed the seasonal variation of sporadic meteors from Korean records. We confirm the seasonal variation of sporadic meteors from the records of the Joseon dynasty, with the maximum number of events being roughly 1.7 times the minimum. The Korean records are compared with Chinese and Japanese re...
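
    A Monte-Carlo test for an excess of records around a shower's peak date, of the kind used in the study, can be sketched as follows (toy data, window, and seeds are hypothetical):

```python
import random

def excess_p_value(event_days, peak_day, window=3, n_trials=2000,
                   year_len=365, seed=1):
    """Monte-Carlo test: is the count of records within `window` days of
    `peak_day` larger than expected for uniformly distributed dates?"""
    def in_window(day):
        # circular distance in day-of-year
        return min(abs(day - peak_day), year_len - abs(day - peak_day)) <= window

    observed = sum(in_window(d) for d in event_days)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        sim = (rng.uniform(0, year_len) for _ in event_days)
        if sum(in_window(d) for d in sim) >= observed:
            hits += 1
    return observed, hits / n_trials

# toy data: 30 background records plus a cluster near day 222 (around the Perseids)
rng = random.Random(0)
days = [rng.uniform(0, 365) for _ in range(30)] + \
       [222 + rng.uniform(-2, 2) for _ in range(10)]
obs, p = excess_p_value(days, peak_day=222)
```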

  5. Analysis of debris-flow recordings in an instrumented basin: confirmations and new findings

    Directory of Open Access Journals (Sweden)

    M. Arattano

    2012-03-01

    Full Text Available On 24 August 2006, a debris flow took place in the Moscardo Torrent, a basin of the Eastern Italian Alps instrumented for debris-flow monitoring. The debris flow was recorded by two seismic networks located in the lower part of the basin and on the alluvial fan, respectively. The event was also recorded by a pair of ultrasonic sensors installed on the fan, close to the lower seismic network. The comparison between the different recordings outlines particular features of the August 2006 debris flow, different from those of events recorded in previous years. A typical debris-flow wave was observed at the upper seismic network, with a main front abruptly appearing in the torrent, followed by a gradual decrease of flow height. On the contrary, on the alluvial fan the wave displayed an irregular pattern, with low flow depth and the main peak occurring in the central part of the surge, both in the seismic recording and in the hydrographs. Recorded data and field evidence indicate that the surge observed on the alluvial fan was not a debris flow, and probably consisted of a water surge laden with fine to medium-sized sediment. The change in shape and characteristics of the wave can be ascribed to the attenuation of the surge caused by the torrent control works implemented in the lower basin in recent years.
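
    A standard way to exploit paired sensors like these is to estimate the mean front velocity of a surge from the arrival-time lag between two monitoring sites. This is a generic sketch with hypothetical numbers, not the Moscardo instrumentation's actual processing:

```python
def front_velocity(distance_m, t_upstream_s, t_downstream_s):
    """Mean front velocity between two gauging sites from arrival times."""
    dt = t_downstream_s - t_upstream_s
    if dt <= 0:
        raise ValueError("downstream arrival must be later than upstream")
    return distance_m / dt

# hypothetical: sites 300 m apart, front arrives 120 s later downstream
v = front_velocity(distance_m=300.0, t_upstream_s=0.0, t_downstream_s=120.0)
```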

  6. Multi-level analysis of electronic health record adoption by health care professionals: A study protocol

    Directory of Open Access Journals (Sweden)

    Labrecque Michel

    2010-04-01

    Full Text Available Abstract Background The electronic health record (EHR) is an important application of information and communication technologies to the healthcare sector. EHR implementation is expected to produce benefits for patients, professionals, organisations, and the population as a whole. These benefits cannot be achieved without the adoption of EHR by healthcare professionals. Nevertheless, the influence of individual and organisational factors in determining EHR adoption is still unclear. This study aims to assess the unique contribution of individual and organisational factors to EHR adoption in healthcare settings, as well as possible interrelations between these factors. Methods A prospective study will be conducted. A stratified random sampling method will be used to select 50 healthcare organisations in the Quebec City Health Region (Canada). At the individual level, a sample of 15 to 30 health professionals will be chosen within each organisation depending on its size. A semi-structured questionnaire will be administered to two key informants in each organisation to collect organisational data. A composite score of EHR adoption will be developed based on a Delphi process and will be used as the outcome variable. Twelve to eighteen months after the first contact, depending on the pace of EHR implementation, key informants and clinicians will be contacted once again to monitor the evolution of EHR adoption. A multilevel regression model will be applied to identify the organisational and individual determinants of EHR adoption in clinical settings. Alternative analytical models will be applied if necessary. Results The study will assess the contribution of organisational and individual factors, as well as their interactions, to the implementation of EHR in clinical settings. Conclusions These results will be very relevant for decision makers and managers who are facing the challenge of implementing EHR in the healthcare system.
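
    The composite adoption score mentioned in the protocol could, for instance, be a Delphi-weighted average of per-item adoption ratings scaled to 0-100. The items and weights below are purely hypothetical illustrations, not the study's actual instrument:

```python
def composite_adoption_score(item_scores, weights):
    """Weighted composite EHR-adoption score, scaled to 0-100.

    `item_scores`: per-item ratings on a 0-1 scale (e.g. share of tasks
    done in the EHR); `weights`: Delphi-derived item weights (hypothetical).
    """
    if set(item_scores) != set(weights):
        raise ValueError("items and weights must match")
    total_w = sum(weights.values())
    return 100.0 * sum(item_scores[k] * weights[k] for k in item_scores) / total_w

weights = {"e-prescribing": 3, "clinical notes": 2, "lab results": 1}   # assumed
scores  = {"e-prescribing": 1.0, "clinical notes": 0.5, "lab results": 0.0}
s = composite_adoption_score(scores, weights)  # (3*1 + 2*0.5 + 0) / 6 * 100
```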

  7. Literature based drug interaction prediction with clinical assessment using electronic medical records: novel myopathy associated drug interactions.

    Directory of Open Access Journals (Sweden)

    Jon D Duke

    Full Text Available Drug-drug interactions (DDIs) are a common cause of adverse drug events. In this paper, we combined a literature discovery approach with analysis of a large electronic medical record database to predict and evaluate novel DDIs. We predicted an initial set of 13197 potential DDIs based on substrates and inhibitors of cytochrome P450 (CYP) metabolism enzymes identified from published in vitro pharmacology experiments. Using a clinical repository of over 800,000 patients, we narrowed this theoretical set of DDIs to 3670 drug pairs actually taken by patients. Finally, we sought to identify novel combinations that synergistically increased the risk of myopathy. Five pairs were identified with p-values less than 1E-06: loratadine and simvastatin (relative risk or RR = 1.69); loratadine and alprazolam (RR = 1.86); loratadine and duloxetine (RR = 1.94); loratadine and ropinirole (RR = 3.21); and promethazine and tegaserod (RR = 3.00). When taken together, each drug pair showed a significantly increased risk of myopathy compared to the expected additive myopathy risk from taking either of the drugs alone. Based on additional literature data on in vitro drug metabolism and inhibition potency, loratadine and simvastatin and tegaserod and promethazine were predicted to have a strong DDI through the CYP3A4 and CYP2D6 enzymes, respectively. This new translational biomedical informatics approach supports not only detection of new clinically significant DDI signals, but also evaluation of their potential molecular mechanisms.
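
    The relative risk figures quoted above follow the standard definition: the outcome rate among exposed patients divided by the rate among unexposed patients. A minimal sketch with hypothetical counts (not the study's data):

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk of an outcome (e.g. myopathy) in exposed vs unexposed."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# hypothetical counts: patients taking both drugs vs patients taking neither
rr = relative_risk(exposed_cases=34, exposed_total=1000,
                   unexposed_cases=20, unexposed_total=1000)  # 0.034 / 0.020
```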

  8. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    B Yegnanarayana; Suryakanth V Gangashetty

    2011-10-01

    Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.
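
    The idea of anchoring analysis frames at epochs rather than at fixed window shifts can be sketched as follows (a toy illustration with an idealized impulse train, not an actual epoch-extraction method):

```python
def epoch_anchored_frames(signal, epochs, width):
    """Extract analysis frames centred on glottal-closure instants (epochs)
    instead of using fixed-shift windows."""
    half = width // 2
    frames = []
    for e in epochs:
        lo, hi = max(0, e - half), min(len(signal), e + half)
        frames.append(signal[lo:hi])
    return frames

# toy signal: impulse-like excitation every 80 samples (one 'glottal cycle')
signal = [0.0] * 400
epochs = [40, 120, 200, 280, 360]
for e in epochs:
    signal[e] = 1.0

# each frame is guaranteed to contain exactly one excitation instant
frames = epoch_anchored_frames(signal, epochs, width=40)
```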

  9. Dissertation on the computer-based exploitation of a coincidence multi parametric recording. Application to the study of the disintegration scheme of Americium 241

    International Nuclear Information System (INIS)

    After having presented the meaning of a disintegration scheme (alpha and gamma emissions, internal conversion, mean lifetime), the author highlights the benefits of using a multi-parametric chain to record correlated parameters, and of using a computer for the analysis of bi-parametric information based on contour lines. Using the example of Americium 241, the author shows how this information is obtained (alpha and gamma spectrometry, time measurement), and how it is chosen, coded, analysed, stored, and then processed by contour lines.

  10. Formal definition and dating of the GSSP (Global Stratotype Section and Point) for the base of the Holocene using the Greenland NGRIP ice core, and selected auxiliary records

    DEFF Research Database (Denmark)

    Walker, Mike; Johnsen, Sigfus Johann; Rasmussen, Sune Olander;

    2009-01-01

    The Greenland ice core from NorthGRIP (NGRIP) contains a proxy climate record across the Pleistocene-Holocene boundary of unprecedented clarity and resolution. Analysis of an array of physical and chemical parameters within the ice enables the base of the Holocene, as reflected in the first signs of climatic warming at the end of the Younger Dryas/Greenland Stadial 1 cold phase, to be located with a high degree of precision. This climatic event is most clearly reflected in an abrupt shift in deuterium excess values, accompanied by more gradual changes in d18O, dust concentration, and a range of chemical...

  11. A new global geomagnetic model based on archeomagnetic, volcanic and historical records

    Science.gov (United States)

    Arneitz, Patrick; Leonhardt, Roman; Fabian, Karl

    2016-04-01

    The major challenge of geomagnetic field reconstruction lies in the inhomogeneous spatio-temporal distribution of the available data and their highly variable quality. Paleo- and archeomagnetic records provide information about the ancient geomagnetic field beyond the historical period. These data types typically have larger errors than their historical counterparts, and the investigated materials and applied experimental methods can bias field readings. Input data for the modelling approach were extracted from available collections of archeomagnetic, volcanic and historical records, which were integrated into a single database along with their associated meta-data. The iterative Bayesian inversion scheme used here targets a reliable treatment of errors, which makes it possible to combine the different data types. The proposed model is scrutinized by carrying out tests with artificial records. Records are synthesized using a known field evolution generated by a geodynamo model showing realistic energy characteristics. Using the artificial field, a synthetic data set is generated that exactly mirrors the existing measured records in all meta-data, but provides the data that would have been observed if the artificial field had been real. After inversion of the synthetic data, comparison of the known artificial Gauss coefficients with the modelled ones allows verification of the applied modelling strategy, as well as examination of the potential and limits of the current data compilation.

  12. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Nielsen, Mads; Lo, Pechin Chien Pau;

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on...... subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density. The...

  13. Viewpoint from Defect Analysis of Medical Records

    Institute of Scientific and Technical Information of China (English)

    林春生

    2012-01-01

    Objective To explore new countermeasures for improving medical record quality. Methods 3000 inpatient medical records from July 2010 to October 2010 were randomly selected and subjected to quality analysis. Results There were 2872 class A medical records (95.73%) and 128 class B medical records (4.27%), with no class C medical records. A total of 5430 defects were found, of which defects in clinical basics and standardization accounted for 55.49%, defects in medical safety records for 28.71%, and defects in treatment technology and medication for 15.80%. Conclusions It is advised that doctors study the relevant legal provisions concerning medicine and receive thorough medical training so as to enhance the quality of medical record writing. A real-time medical record monitoring system could be a new mode of medical record management. Rules, such as those governing drug application, must be established to improve the quality of medical record writing. In accordance with the Tort Liability Law, it is also important to further standardize clinicians' diagnosis and treatment behaviour.
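
    The reported category shares follow directly from the defect counts. The sketch below recomputes them from a hypothetical split of the 5430 defects that is consistent with the stated percentages:

```python
def defect_shares(counts):
    """Percentage share of each defect category, rounded to 2 decimals."""
    total = sum(counts.values())
    return {k: round(100.0 * v / total, 2) for k, v in counts.items()}

# hypothetical counts consistent with the reported totals (5430 defects)
counts = {"clinical basics/standardization": 3013,
          "medical safety records": 1559,
          "treatment technology/medication": 858}
shares = defect_shares(counts)
```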

  14. Statistical analysis of automatically detected ion density variations recorded by DEMETER and their relation to seismic activity

    Directory of Open Access Journals (Sweden)

    Michel Parrot

    2012-04-01

    Full Text Available Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, there are also ionospheric variations without seismic activity. The present study is devoted to a statistical analysis of night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted over the 15 days before the earthquake, whenever the track of the satellite orbit was at less than 1,500 km from the earthquake epicenter. The result of this first step provided the variations relative to the background in the vicinity of the epicenter for each of the 15 days before each earthquake. In the second step, comparisons were carried out between the largest variations over the 15 days and the earthquake magnitudes. The statistical analysis is based on calculation of median values as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases in which, on the one hand, the locations of the epicenters were randomly modified and, on the other hand, the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to earthquakes than prior to random events, and that the perturbations increase with earthquake magnitude.
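
    The median-based comparison between real and randomized epicenters can be sketched as follows (all intensity values are invented for illustration):

```python
import statistics

def median_intensity(perturbations):
    """Median of the largest ion-density variation recorded in the
    15 days before each event (one value per event)."""
    return statistics.median(perturbations)

# hypothetical intensities (arbitrary units): variations detected near
# real epicenters vs near randomly relocated ones
near_quakes = [1.8, 2.4, 1.1, 3.0, 2.2, 1.9, 2.7]
near_random = [1.0, 1.3, 0.9, 1.6, 1.1, 1.4, 1.2]
m_real = median_intensity(near_quakes)
m_rand = median_intensity(near_random)
```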


  15. Construction of a SORCE-based Solar Spectral Irradiance (SSI) Record for Input into Chemistry Climate Models

    Science.gov (United States)

    Harder, J. W.; Fontenla, J. M.

    2015-12-01

    We present a research program to produce a solar spectral irradiance (SSI) record suitable for whole atmosphere chemistry-climate model (CCM) transient studies over the 2001-2015 time period for Solar Cycle 23 and 24 (SC23-24). Climate simulations during this time period are particularly valuable because SC23-24 represents the best-observed solar cycle in history - both from the perspective of solar physics and in terms of Earth observation systems. This record will be based predominantly on the observed irradiance of the SORCE mission as measured by the SIM and SOLSTICE instruments from April of 2003 to the present time. The SSI data record for this proposed study requires very broad wavelength coverage (115-100000 nm), daily spectral coverage, compliance of the integrated SSI record with the TSI, and well-defined and documented uncertainty estimates. While the majority of the record will be derived from SORCE observations, extensions back to the SC23 maximum time period (early 2001) and closure of critical gaps in the SORCE record will be generated employing the Fontenla et al. (2015) Solar Radiation Physical Model (SRPMv2). Since SRPM is a physics-based model, estimates of the SSI for wavelengths outside the SORCE measurement range can be meaningfully included. This model now includes non-LTE contributions from metals in the atomic number range 22-28 (i.e. titanium through nickel), and important molecular photo-dissociation contributions from molecules such as NH, molecular hydrogen, CH, and OH have led to greatly improved agreement between the model and the observed 0.1 nm SOLSTICE spectrum. Thus comparative studies of the SORCE observations with SRPMv2 provide meaningful insight into the nature of solar variability critical for subsequent Earth atmospheric modeling efforts.

  16. Diffractive Optical Elements with a Large Angle of Operation Recorded in Acrylamide Based Photopolymer on Flexible Substrates

    Directory of Open Access Journals (Sweden)

    Hoda Akbari

    2014-01-01

    Full Text Available A holographic device characterised by a large angular range of operation is under development. The aim of this study is to increase the angular working range of the diffractive lens by stacking three layers of high efficiency optical elements on top of each other so that light is collected (and focussed) from a broader range of angles. The angular range of each individual lens element is important, and work has already been done in an acrylamide-based photosensitive polymer to broaden the angular range of individual elements using holographic recording at a low spatial frequency. This paper reports new results on the angular selectivity of stacked diffractive lenses. A working range of 12° is achieved. The diffractive focussing elements were recorded holographically with a central spatial frequency of 300 l/mm using exposure energy of 60 mJ/cm² at a range of recording angles. At this spatial frequency with layers of thickness 50 ± 5 µm, a diffraction efficiency of 80% and 50% was achieved in the single lens element and combined device, respectively. The optical recording process and the properties of the multilayer structure are described and discussed. Holographic recording of a single lens element is also successfully demonstrated on a flexible glass substrate (Corning® Willow® Glass) for the first time.

  17. Mobile Application of Water Meter Recorder Based on Short Message Service Transmissions Using Windows Mobile Platform

    Directory of Open Access Journals (Sweden)

    I Dewa Nyoman Anom Manuaba

    2013-01-01

    Full Text Available The rapid development of technology has had a major impact on cellular technology, leading to a wide range of new smartphones. Modern life requires people to work more quickly, so that they can use their time more effectively and increase their performance. A process done manually takes much more time than one done automatically, and also carries a higher risk of error. One process that is still done manually is recording the amount of water consumed by customers of PDAM (Regional Water Company). This problem can be solved by creating a mobile application that records the water meter reading and then automatically sends the customer data and the amount of water used directly to the server, where the bill is calculated automatically. This application solves the problem of manual water meter recording.

  18. A stratigraphic framework for naming and robust correlation of abrupt climatic changes during the last glacial period based on three synchronized Greenland ice core records

    Science.gov (United States)

    Rasmussen, Sune O.

    2014-05-01

    Due to their outstanding resolution and well-constrained chronologies, Greenland ice core records have long been used as a master record of past climatic changes during the last interglacial-glacial cycle in the North Atlantic region. As part of the INTIMATE (INtegration of Ice-core, MArine and TErrestrial records) project, protocols have been proposed to ensure consistent and robust correlation between different records of past climate. A key element of these protocols has been the formal definition of numbered Greenland Stadials (GS) and Greenland Interstadials (GI) within the past glacial period as the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. Using a recent synchronization of the NGRIP, GRIP, and GISP2 ice cores that allows the parallel analysis of all three records on a common time scale, we here present an extension of the GS/GI stratigraphic template to the entire glacial period. This is based on a combination of isotope ratios (δ18O, reflecting mainly local temperature) and calcium concentrations (reflecting mainly atmospheric dust loading). In addition to the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice core records more than two decades ago, a number of short-lived climatic oscillations have been identified in the three synchronized records. Some of these events have been observed in other studies, but we here propose a consistent scheme for discriminating and naming all the significant climatic events of the last glacial period that are represented in the Greenland ice cores. This is a key step aimed at promoting unambiguous comparison and correlation between different proxy records, as well as a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations. The work presented is under review for publication in Quaternary Science Reviews. Author team: S

  19. Time capsule: an autonomous sensor and recorder based on diffusion-reaction.

    Science.gov (United States)

    Gerber, Lukas C; Rosenfeld, Liat; Chen, Yunhan; Tang, Sindy K Y

    2014-11-21

    We describe the use of chemical diffusion and reaction to record temporally varying chemical information as spatial patterns without the need for external power. Diffusion of chemicals acts as a clock, while reactions forming immobile products possessing defined optical properties perform sensing and recording functions simultaneously. The spatial location of the products reflects the history of exposure to the detected substances of interest. We refer to our device as a time capsule and show an initial proof of principle in the autonomous detection of lead ions in water.
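
    Reading elapsed time from the position of a reaction front relies on the diffusion-length relation x ≈ √(2Dt); inverting it gives the exposure time t ≈ x²/(2D). A sketch with a hypothetical diffusion coefficient (not the paper's chemistry):

```python
def exposure_time_from_front(front_mm, diff_coeff_mm2_per_h):
    """Invert the 1-D diffusion-length relation x ~ sqrt(2*D*t) to read
    the elapsed exposure time from the reaction-front position."""
    return front_mm ** 2 / (2.0 * diff_coeff_mm2_per_h)

# hypothetical: front observed at 4 mm, D = 1 mm^2/h -> 8 hours elapsed
t = exposure_time_from_front(front_mm=4.0, diff_coeff_mm2_per_h=1.0)
```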

  20. Facies characterization based on physical properties from downhole logging for the sediment record of Lake Van, Turkey

    Science.gov (United States)

    Baumgarten, H.; Wonik, T.; Kwiecien, O.

    2014-11-01

    significant depth shifts of up to 2.5 m between the composite profile based on the VCD and the downhole measurements in hole 2D of the Ahlat Ridge, (b) a correlation was difficult to ascertain from the vertical resolution of the downhole logging data and the extremely detailed core description in mm-scale, (c) mixed signals were obtained because of prevailing thin layers and intercalations of different lithotypes and (d) cluster analysis was difficult to perform because the contrast within the input data is too low (possibly background sedimentation) to distinguish between glacial and interglacial deposits. Tephra units are characterized by contrasting properties and differ mainly in their magnetic susceptibility, spectral gamma ray components (uranium, thorium and potassium) and XRF-intensities of calcium and zirconium. Tephra units have been linked to the dominant volcanic composition of the deposited tephra layers and partly to the volcanic sources. Depth trends are derived with prevailing basaltic deposits in the bottom part (128 m-210 m below lake floor) and are gradually outweighed by the highly differentiated (dacitic and rhyolitic/trachytic) products towards the top of the record.

  1. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences organized by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems they addressed. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas covered by existing authors and the gaps they left, untouched areas of cloud based development can be identified for future research.

  2. Polyphase Order Analysis Based on Convolutional Approach

    Directory of Open Access Journals (Sweden)

    M. Drutarovsky

    1999-06-01

    Full Text Available The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal, which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational speed. The paper presents an optimised algorithm for polyphase order analysis based on a non-power-of-two DFT, efficiently implemented by the chirp FFT algorithm. The proposed algorithm decreases the complexity of the digital resampling stage, which is the most complex part of the complete order analysis algorithm.
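
    The angular-resampling step at the heart of order analysis can be sketched as follows: interpolate the vibration signal at uniform increments of shaft angle, then take the DFT so that each bin corresponds to an order rather than a fixed frequency. A minimal sketch (signal, speed profile and rates are invented for illustration; this is not the paper's polyphase implementation):

```python
import numpy as np

fs = 1000.0                      # time-domain sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
# Shaft speed ramps from 10 to 30 rev/s; phase is the integral of speed.
speed = 10.0 + 10.0 * t          # rev/s
angle = np.cumsum(speed) / fs    # shaft revolutions at each sample
# Vibration locked to the 2nd order of rotation (2 cycles per revolution).
x = np.sin(2.0 * np.pi * 2.0 * angle)

# Resample the signal at uniform angular increments (here 64 samples/rev).
spr = 64
uniform_angle = np.arange(0.0, angle[-1], 1.0 / spr)
x_ang = np.interp(uniform_angle, angle, x)

# DFT of the angle-domain signal: the bin axis is now "order", not Hz.
spectrum = np.abs(np.fft.rfft(x_ang))
orders = np.fft.rfftfreq(len(x_ang), d=1.0 / spr)
peak_order = orders[np.argmax(spectrum)]
print(round(peak_order, 1))   # the 2nd order dominates despite the speed sweep
```

    A fixed-rate FFT of `x` would smear this component across roughly 20-60 Hz; in the angle domain it collapses to a single order bin.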

  3. Study of TCM clinical records based on LSA and LDA SHTDT model

    Science.gov (United States)

    LIN, FAN; ZHANG, ZHIHONG; LIN, SHU-FU; ZENG, JIA-SONG; GAN, YAN-FANG

    2016-01-01

    Descriptions of syndromes and symptoms in traditional Chinese medicine are extremely complicated, and finding a method to diagnose a patient's syndrome more efficiently is a primary aim of clinical health care workers. In the present study, two models addressing this issue are presented. The first is the latent semantic analysis (LSA)-based semantic classification model, which is employed when the classifications and the words used to depict them have been confirmed. The second is the symptom-herb-therapies-diagnosis topic (SHTDT) model, which is employed when the classifications have not been confirmed or described. The experimental results showed that this method was successful and that symptoms can be diagnosed to a certain extent; the topic features reflected patient characteristics, and the topic structure obtained was clinically significant. Given a patient's symptoms, the model can be used to predict the theme, diagnose the disease, and suggest appropriate drugs and treatments. The SHTDT model predictions were not completely accurate, because the prediction is equivalent to multi-label prediction, whereby the drugs, treatment and diagnosis are considered as labels. In conclusion, diagnosis and the drugs and treatment administered also depend on human factors. PMID:27347051
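
    The LSA step can be sketched as a truncated SVD of a term-document matrix, after which records that share latent structure become close in the reduced space. The symptom terms and counts below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical term-document matrix: rows = symptom terms, columns = case records.
terms = ["fever", "cough", "fatigue", "thirst"]
X = np.array([[2., 0., 1., 0.],
              [1., 0., 2., 0.],
              [0., 1., 0., 2.],
              [0., 2., 0., 1.]])

# Truncated SVD projects records into a k-dimensional latent semantic space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim vector per record

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Records 0 and 2 share latent structure (fever/cough terms); records 0 and 1 do not.
print(cosine(doc_vecs[0], doc_vecs[2]) > cosine(doc_vecs[0], doc_vecs[1]))
```

    Classification then reduces to nearest-neighbour or threshold rules on these latent vectors, which is the sense in which LSA supports the confirmed-vocabulary case described above.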

  4. A content analysis of stroke physical therapy intervention using stroke physiotherapy intervention recording tool.

    Science.gov (United States)

    Cho, Hyuk-Shin; Cha, Hyun-Gyu

    2016-05-01

    [Purpose] Physical therapy for recovery of function in people with stroke is known to be effective, but which type of physical therapy intervention is most effective is uncertain because concrete and detailed records of interventions are rarely kept. This study aimed to record, analyze, and describe the content of physical therapy interventions for recovery of function after stroke using the stroke physiotherapy intervention recording tool (SPIRIT). [Subjects and Methods] A convenience sample of 23 physical therapists from a rehabilitation hospital in Chung-nam recorded the interventions for 73 patients with stroke who were treated for 30 minutes in 670 treatment sessions. Treatment session contents were recorded using SPIRIT. Descriptive statistics were used to describe the interventions accurately and to investigate differences according to time since stroke. [Results] Facilitation techniques were the most frequently used interventions (n=1,342, 35.1%), followed by practice (n=1,056, 27.6%) and exercise (n=748, 19.6%) in the physical therapists' clinical practice. [Conclusion] This pattern shows that physical therapists focused on functional activity. Organizing or teaching patient activities for independent practice (n=286, 7.5%) was used to encourage patient activity and independence outside the treatment sessions. Interventions did not differ significantly according to time since stroke. PMID:27313368

  5. Security Analysis of Discrete Logarithm Based Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    WANG Yuzhu; LIAO Xiaofeng

    2006-01-01

    Discrete logarithm based cryptosystems have subtle problems that make the schemes vulnerable. This paper gives a comprehensive listing of security issues in such systems and analyzes three classes of attacks, based respectively on the mathematical structure of the group used in the schemes, on disclosed information about the subgroup, and on implementation details. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.

  6. 77 FR 47826 - Record of Decision for F35A Training Basing Final Environmental Impact Statement

    Science.gov (United States)

    2012-08-10

    ... ACTION: Notice of Availability (NOA) of a Record of Decision (ROD). SUMMARY: On August 1, 2012, the... relevant factors. The FEIS was made available to the public on June 15, 2012 through a NOA in the Federal... FEIS. Authority: This NOA is published pursuant to the regulations (40 CFR Part 1506.6)...

  7. Data Mining of NASA Boeing 737 Flight Data: Frequency Analysis of In-Flight Recorded Data

    Science.gov (United States)

    Butterfield, Ansel J.

    2001-01-01

    Data recorded during flights of the NASA Trailblazer Boeing 737 have been analyzed to ascertain the presence of aircraft structural responses from various excitations such as the engine, aerodynamic effects, wind gusts, and control system operations. The NASA Trailblazer Boeing 737 was chosen as a focus of the study because of a large quantity of its flight data records. The goal of this study was to determine if any aircraft structural characteristics could be identified from flight data collected for measuring non-structural phenomena. A number of such data were examined for spatial and frequency correlation as a means of discovering hidden knowledge of the dynamic behavior of the aircraft. Data recorded from on-board dynamic sensors over a range of flight conditions showed consistently appearing frequencies. Those frequencies were attributed to aircraft structural vibrations.

  8. Analysis of observational records of Dae-gyupyo in Joseon Dynasty

    Science.gov (United States)

    Mihn, Byeong-Hee; Lee, Ki-Won; Kim, Sang-Hyuk; Ahn, Young Sook; Lee, Yong Sam

    2012-09-01

    It is known that Dae-gyupyo (the Large Noon Gnomon) and So-gyupyo (the Small Noon Gnomon) were constructed in the reign of King Sejong (1418--1450) of the Joseon Dynasty. Gyupyo is an astronomical instrument for measuring the length of the shadow cast by a celestial body at the meridian passage time; it consists of two basic parts: a measuring scale and a vertical column. According to the Veritable Records of King Sejong and of King Myeongjong (1545--1567), the column of Dae-gyupyo was 40 Cheok (˜ 8 m) in height from the measuring scale and had a cross-bar, like the Guibiao of Shoujing Guo of the Yuan Dynasty in China. In the latter Veritable Records, three observations of the Sun on the date of the winter solstice and two of the full Moon on the first month in a luni-solar calendar are also recorded. In particular, the observational record of Dae-gyupyo for the Sun on Dec. 12, 1563 is ˜ 1 m shorter than the previous two records. To explain this, we investigated two possibilities: the vertical column was inclined, and the cross-bar was lowered. The cross-bar was attached to the column by a supporting arm; that should be installed at an angle of ˜ 36.9° to the north on the basis of a geometric structure inferred from the records of Yuanshi (History of the Yuan Dynasty). We found that it was possible that the vertical column was inclined ˜ 7.7° to the south or the supporting arm was tilted ˜ 58.3° downward. We suggest that the arm was tilted by ˜ 95° (= 36.9° + 58.3°).
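
    The shadow geometry behind these estimates follows from s = h / tan(alt), where alt is the Sun's noon altitude at the winter solstice. A worked sketch, assuming a site latitude of ~37.5° N and obliquity ~23.44° (both are my assumptions, not values from the abstract):

```python
import math

# Assumed values (not from the abstract): latitude of the Joseon capital
# ~37.5 deg N, obliquity of the ecliptic ~23.44 deg.
LAT = 37.5
OBLIQUITY = 23.44

def noon_shadow(height_m: float) -> float:
    """Shadow length of a gnomon of given height at local noon on the
    winter solstice: s = h / tan(altitude)."""
    alt = math.radians(90.0 - LAT - OBLIQUITY)
    return height_m / math.tan(alt)

s_full = noon_shadow(8.0)             # ~14.4 m for the 40-Cheok cross-bar
# Height drop of the cross-bar needed to shorten the shadow by ~1 m:
alt = math.radians(90.0 - LAT - OBLIQUITY)
dh = 1.0 * math.tan(alt)              # ~0.56 m
print(round(s_full, 1), round(dh, 2))
```

    Under these assumptions, a ~1 m shorter winter-solstice shadow implies the cross-bar sat roughly half a metre lower, which is the kind of geometric constraint from which the column-inclination and arm-tilt angles above can be inferred.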

  9. Ground-based assessment of the bias and long-term stability of 14 limb and occultation ozone profile data records

    Science.gov (United States)

    Hubert, Daan; Lambert, Jean-Christopher; Verhoelst, Tijl; Granville, José; Keppens, Arno; Baray, Jean-Luc; Bourassa, Adam E.; Cortesi, Ugo; Degenstein, Doug A.; Froidevaux, Lucien; Godin-Beekmann, Sophie; Hoppel, Karl W.; Johnson, Bryan J.; Kyrölä, Erkki; Leblanc, Thierry; Lichtenberg, Günter; Marchand, Marion; McElroy, C. Thomas; Murtagh, Donal; Nakane, Hideaki; Portafaix, Thierry; Querel, Richard; Russell, James M., III; Salvador, Jacobo; Smit, Herman G. J.; Stebel, Kerstin; Steinbrecht, Wolfgang; Strawbridge, Kevin B.; Stübi, René; Swart, Daan P. J.; Taha, Ghassan; Tarasick, David W.; Thompson, Anne M.; Urban, Joachim; van Gijsel, Joanna A. E.; Van Malderen, Roeland; von der Gathen, Peter; Walker, Kaley A.; Wolfram, Elian; Zawodny, Joseph M.

    2016-06-01

    Ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of 14 limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20 and 40 km the satellite ozone measurement biases are smaller than ±5 %, the short-term variabilities are less than 5-12 % and the drifts are at most ±5 % per decade (or even ±3 % per decade for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause, where natural variability and low ozone abundances impede a more precise analysis. In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10 % and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY) and significant long-term drifts (SCIAMACHY, OSIRIS, HALOE and possibly GOMOS and SMR as well). Furthermore, we reflected on the repercussions
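
    Drift figures of the kind quoted above are obtained by fitting a linear trend to a time series of satellite-minus-ground relative differences. A minimal sketch of that step with synthetic data (the numbers are invented for illustration, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly satellite-vs-ozonesonde relative differences (%) over
# 15 years, with a known drift of 3 % per decade plus measurement noise.
years = np.arange(0.0, 15.0, 1.0 / 12.0)
true_drift = 3.0                               # % per decade
diffs = 1.0 + true_drift * years / 10.0 + rng.normal(0.0, 2.0, years.size)

# Ordinary least squares: slope in %/yr, reported as %/decade.
slope, intercept = np.polyfit(years, diffs, 1)
drift_per_decade = 10.0 * slope
print(round(drift_per_decade, 1))
```

    The intercept plays the role of the overall bias and the residual scatter the role of the short-term variability, so one fit separates the three quality aspects listed in the abstract.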

  10. Scoring tail damage in pigs: an evaluation based on recordings at Swedish slaughterhouses

    Directory of Open Access Journals (Sweden)

    Keeling Linda J

    2012-05-01

    Full Text Available Abstract Background There is increasing interest in recording tail damage in pigs at slaughter to identify problem farms for advisory purposes, but also for benchmarking within and between countries as part of systematic monitoring of animal welfare. However, it is difficult to draw conclusions when comparing prevalences between studies and countries, partly due to differences in management (e.g. differences in tail docking and enrichment routines) and partly due to differences in the definition of tail damage. Methods Tail damage and tail length were recorded for 15,068 pigs slaughtered during three and four consecutive days at two slaughterhouses in Sweden. Tail damage was visually scored according to a 6-point scale, and tail length was both visually scored according to a 5-point scale and recorded in centimetres for pigs with injured or shortened tails. Results The total prevalence of injury or shortening of the tail was 7.0% and 7.2% in slaughterhouse A and B, respectively. When only considering pigs with half or less of the tail left, these percentages were 1.5% and 1.9%, which is in line with the prevalence estimated from the routine recordings at slaughter in Sweden. A higher percentage of males had injured and/or shortened tails, and males had more severely bitten tails than females. Conclusions While the current method to record tail damage in Sweden was found to be reliable as a method to identify problem farms, it clearly underestimates the actual prevalence of tail damage. For monitoring and benchmarking purposes, both in Sweden and internationally, we propose that a three-graded scale including both old and new tail damage would be more appropriate. The scale consists of one class for no tail damage, one for mild tail damage (injured or shortened tail with more than half of the tail remaining) and one for severe tail damage (half or less of the tail remaining).

  11. Quantitative analysis by renormalized entropy of invasive electroencephalograph recordings in focal epilepsy

    CERN Document Server

    Kopitzki, K; Timmer, J

    1998-01-01

    Invasive electroencephalograph (EEG) recordings of ten patients suffering from focal epilepsy were analyzed using the method of renormalized entropy. Introduced as a complexity measure for the different regimes of a dynamical system, the feature was tested here for its spatio-temporal behavior in epileptic seizures. In all patients a decrease of renormalized entropy within the ictal phase of seizure was found. Furthermore, the strength of this decrease is monotonically related to the distance of the recording location to the focus. The results suggest that the method of renormalized entropy is a useful procedure for clinical applications like seizure detection and localization of epileptic foci.

  12. Research on key techniques of fault recorder background analysis software

    Institute of Scientific and Technical Information of China (English)

    郭振华; 江亚群; 杨帅雄; 梁勇超; 黄纯

    2011-01-01

    This paper studies the difficulties and key techniques in designing the fault record analysis software of a fault recorder device. First, an algorithm for computing the vertical coordinates of displayed waveforms that takes unified waveform scaling into account is proposed, according to the characteristics and requirements of power fault record waveform analysis and display. A double-buffered drawing method is then adopted to avoid the graphics flicker of traditional Windows plotting. In an IEC COMTRADE formatted fault recording file, the recording device samples the signal at different rates in different periods; to estimate signal parameters across periods, a dual-sampling-rate parameter estimation algorithm based on the discrete Fourier transform is given. The power fault record analysis system, developed with Visual C++ 6.0, has a friendly man-machine interface and good performance, and has been used in engineering practice.
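
    The dual-sampling-rate idea can be sketched as follows: because a single-frequency DFT correlation takes the segment's own sampling rate as a parameter, the amplitude and phase of the fundamental can be estimated consistently across segments recorded at different rates. A hedged sketch (signal values and rates are invented; this is not the paper's implementation):

```python
import math

def fundamental_params(samples, fs, f0):
    """Estimate amplitude and phase of the f0 component of a signal
    segment sampled at rate fs, via single-frequency DFT correlation."""
    n = len(samples)
    re = sum(x * math.cos(2 * math.pi * f0 * k / fs) for k, x in enumerate(samples)) * 2 / n
    im = sum(x * math.sin(2 * math.pi * f0 * k / fs) for k, x in enumerate(samples)) * 2 / n
    return math.hypot(re, im), math.atan2(-im, re)

f0 = 50.0  # power-system fundamental, Hz
# The same 50 Hz waveform recorded in two periods at different rates,
# as in a COMTRADE file with multiple sampling-rate segments.
seg_fast = [3.0 * math.cos(2 * math.pi * f0 * k / 4000.0) for k in range(400)]
seg_slow = [3.0 * math.cos(2 * math.pi * f0 * k / 1000.0) for k in range(100)]

a1, _ = fundamental_params(seg_fast, 4000.0, f0)
a2, _ = fundamental_params(seg_slow, 1000.0, f0)
print(round(a1, 3), round(a2, 3))   # both ~3.0 despite the rate change
```

    Both segments span an integer number of fundamental cycles here; with non-integer spans, windowing or cycle-synchronous truncation would be needed before the correlation.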

  13. Climatology Analysis of Aerosol Effect on Marine Water Cloud from Long-Term Satellite Climate Data Records

    OpenAIRE

    Xuepeng Zhao; Andrew K. Heidinger; Andi Walther

    2016-01-01

    Satellite aerosol and cloud climate data records (CDRs) have been used successfully to study the aerosol indirect effect (AIE). Data from the Advanced Very High Resolution Radiometer (AVHRR) now span more than 30 years and allow these studies to be conducted from a climatology perspective. In this paper, AVHRR data are used to study the AIE on water clouds over the global oceans. Correlation analysis between aerosol optical thickness (AOT) and cloud parameters, including cloud droplet effecti...

  14. Physician assessment of disease activity in JIA subtypes. Analysis of data extracted from electronic medical records

    Directory of Open Access Journals (Sweden)

    Wang Deli

    2011-04-01

    Full Text Available Abstract Objective Although electronic medical records (EMRs) have facilitated care for children with juvenile idiopathic arthritis (JIA), analyses of treatment outcomes have required paper-based or manually re-entered data. We have started EMR discrete data entry for JIA patient visits, including joint examination and global assessment by physician and patient. In this preliminary study, we extracted data from the EMR to Xenobase™ (TransMed Systems, Inc., Cupertino, CA), an application permitting cohort analyses of the relationship of global assessment to joint examination and subtype. Methods During clinic visits, data were entered into discrete fields in ambulatory visit forms in the EMR (EpicCare™, Epic Systems, Verona, WI). Data were extracted using Clarity Reports, then de-identified and uploaded for analyses to Xenobase™. Parameters included joint examination, ILAR diagnostic classification, physician global assessment, patient global assessment, and patient pain score. Data for a single visit for each of 160 patients over a 2-month period, beginning March 2010, were analyzed. Results In systemic JIA patients, physician global assessment correlated strongly with pain score, joint count and patient assessment. In contrast, physician assessment for patients with persistent oligoarticular and rheumatoid factor negative disease showed strong correlation with joint counts, but only moderate correlation with pain scores and patient global assessment. Conversely, for enthesitis patients, physician assessment correlated strongly with pain scores, and moderately with joint count and patient global assessment. Rheumatoid factor positive patients, the smallest group studied, showed moderate correlation for all three measures. Patient global assessment for systemic patients showed strong correlations with pain scores and joint count, similar to data for physician assessment. For polyarticular and enthesitis patients

  15. Abstraction based Analysis and Arbiter Synthesis

    DEFF Research Database (Denmark)

    Ernits, Juhan-Peep; Yi, Wang

    2004-01-01

    The work focuses on the analysis of an example of synchronous systems containing FIFO buffers, registers and memory interconnected by several private and shared busses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project.

  16. Development of an apnea detection algorithm based on temporal analysis of thoracic respiratory effort signal

    Science.gov (United States)

    Dell’Aquila, C. R.; Cañadas, G. E.; Correa, L. S.; Laciar, E.

    2016-04-01

    This work describes the design of an algorithm for detecting apnea episodes based on analysis of the thoracic respiratory effort signal. Inspiration and expiration times and the amplitude range of each respiratory cycle were evaluated. For the range analysis, the standard deviation was computed over temporal windows of the respiratory signal. Performance was validated on 8 records of the Apnea-ECG database, which has annotations of apnea episodes. The results are: sensitivity (Se) 73%, specificity (Sp) 83%. These values could be improved by eliminating artifacts from the signal records.
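
    A minimal sketch of a windowed standard-deviation criterion of the kind described above; the threshold, window length and synthetic signal are invented for illustration, not the authors' parameters:

```python
import math

def detect_low_effort(signal, fs, win_s=10.0, ratio=0.2):
    """Flag windows of a respiratory-effort signal whose standard
    deviation drops below `ratio` times the whole-record std,
    a simple surrogate for apneic (flat-effort) episodes."""
    n = len(signal)
    mean = sum(signal) / n
    base_sd = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    win = int(win_s * fs)
    flags = []
    for start in range(0, n - win + 1, win):
        w = signal[start:start + win]
        m = sum(w) / win
        sd = math.sqrt(sum((x - m) ** 2 for x in w) / win)
        flags.append(sd < ratio * base_sd)
    return flags

# Synthetic effort signal at 10 Hz: 30 s breathing, 20 s flat (apnea), 30 s breathing.
fs = 10.0
breathing = [math.sin(2 * math.pi * 0.25 * k / fs) for k in range(300)]
apnea = [0.02] * 200
sig = breathing + apnea + breathing
flags = detect_low_effort(sig, fs)
print(flags)
```

    Only the two windows covering the flat segment are flagged; a full detector would add the per-cycle inspiration/expiration timing features the abstract mentions.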

  17. Design and evaluation of area-efficient and wide-range impedance analysis circuit for multichannel high-quality brain signal recording system

    Science.gov (United States)

    Iwagami, Takuma; Tani, Takaharu; Ito, Keita; Nishino, Satoru; Harashima, Takuya; Kino, Hisashi; Kiyoyama, Koji; Tanaka, Tetsu

    2016-04-01

    To enable chronic and stable neural recording, we have been developing an implantable multichannel neural recording system with impedance analysis functions. Maintaining good interfaces between recording electrodes and tissue is essential for high-quality neural signal recording. We have proposed an impedance analysis circuit with a very small circuit area, implemented in a multichannel neural recording and stimulating system. In this paper, we focus on the design of the impedance analysis circuit configuration and the evaluation of a minimal voltage measurement unit. The proposed circuit has a very small circuit area of 0.23 mm2, designed in 0.18 µm CMOS technology, and can measure interface impedances between recording electrodes and tissue over the ultrawide range from 100 Ω to 10 MΩ. In addition, we successfully acquired interface impedances using the proposed circuit in agarose gel experiments.

  18. Time based measurement of the impedance of the skin-electrode interface for dry electrode ECG recording.

    Science.gov (United States)

    Dozio, Roberta; Baba, Adeshina; Assambo, Cedric; Burke, Martin J

    2007-01-01

    This paper reports the measurement of the properties of dry or pasteless conductive electrodes to be used for long-term recording of the human electrocardiogram (ECG). Knowledge of these properties is essential for the correct design of the input stage of the associated recording amplifiers. Measurements were made on three commercially available conductive carbon based electrodes at pressures of 5 mmHg and 20 mmHg, located on the lower abdomen of the body, on three subjects having different skin types. Parameter values were fitted to a two-time-constant based model of the electrode using data measured over a period of 10 s. Values of resistance ranging from 40 kΩ to 1590 kΩ and of capacitance ranging from 0.05 µF to 38 µF were obtained for the components, while the values of the time-constants varied from 0.07 s to 3.9 s.
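
    A two-time-constant electrode model of this kind is commonly written as a series resistance feeding two parallel R-C branches. A sketch of its impedance, with illustrative component values chosen from within the reported ranges (these are not the paper's fitted values):

```python
import cmath
import math

def electrode_impedance(f_hz, rs, r1, c1, r2, c2):
    """Impedance of a two-time-constant skin-electrode model:
    Rs in series with two parallel R-C branches."""
    w = 2.0 * math.pi * f_hz
    z1 = r1 / (1.0 + 1j * w * r1 * c1)
    z2 = r2 / (1.0 + 1j * w * r2 * c2)
    return rs + z1 + z2

# Illustrative component values within the ranges reported above
# (time constants R1*C1 = 0.5 s and R2*C2 = 1.0 s).
rs, r1, c1, r2, c2 = 10e3, 500e3, 1e-6, 100e3, 10e-6
lo = abs(electrode_impedance(0.05, rs, r1, c1, r2, c2))
hi = abs(electrode_impedance(100.0, rs, r1, c1, r2, c2))
# The capacitors shunt the branches at high frequency, so |Z| falls
# towards Rs as frequency rises.
print(lo > hi)
```

    The large low-frequency impedance is what forces the very high input impedance required of the amplifier's input stage.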

  19. The Accuracy Analysis of Five-planet Movements Recorded in China in the Han Dynasty

    Science.gov (United States)

    Zhang, J.

    2010-04-01

    The observations and study of the five planets are an important part of ancient calendars and also one of the methods used to evaluate their accuracy, so astronomers paid much attention to this field. In "Hanshu·Tian wen zhi" and "Xuhanshu·Tian wen zhi" there are 160 records with detailed dates and positions, which are recalculated and examined in this paper using modern astronomical methods. The calculated results show that these positions are mostly correct, accounting for 77.5% of the total records, while the remaining 36 records are incorrect, accounting for 22.5%. In addition, there are three typical or special forms of five-planet movements: the numbers of "shou", "he" and "fan" movements are 14, 22 and 46, accounting for 9%, 14% and 29%, respectively. In this paper, a detailed study of these three typical forms of five-planet movements is carried out. We think that the 36 incorrect records are caused by various factors, but mainly by the data processing carried out by later generations.

  20. Estimating the Preferences of Central Bankers : An Analysis of Four Voting Records

    NARCIS (Netherlands)

    Eijffinger, S.C.W.; Mahieu, R.J.; Raes, L.B.D.

    2013-01-01

    Abstract: This paper analyzes the voting records of four central banks (Sweden, Hungary, Poland and the Czech Republic) with spatial models of voting. We infer the policy preferences of the monetary policy committee members and use these to analyze the evolution in preferences over time and the diff

  1. Analysis of Self-Recording in Self-Management Interventions for Stereotypy

    Science.gov (United States)

    Fritz, Jennifer N.; Iwata, Brian A.; Rolider, Natalie U.; Camp, Erin M.; Neidert, Pamela L.

    2012-01-01

    Most treatments for stereotypy involve arrangements of antecedent or consequent events that are imposed entirely by a therapist. By contrast, results of some studies suggest that self-recording, a common component of self-management interventions, might be an effective and efficient way to reduce stereotypy. Because the procedure typically has…

  2. Multi-electrode nerve cuff recording - model analysis of the effects of finite cuff length

    NARCIS (Netherlands)

    Veltink, P.H.; Tonis, T.; Buschman, H.P.J.; Marani, E.; Wesselink, W.A.

    2005-01-01

    The effect of finite cuff length on the signals recorded by electrodes at different positions along the nerve was analysed in a model study. Relations were derived using a one-dimensional model. These were evaluated in a more realistic axially symmetric 3D model. This evaluation indicated that the c

  3. A Correlational Analysis: Electronic Health Records (EHR) and Quality of Care in Critical Access Hospitals

    Science.gov (United States)

    Khan, Arshia A.

    2012-01-01

    Driven by the compulsion to improve the evident paucity in quality of care, especially in critical access hospitals in the United States, policy makers, healthcare providers, and administrators have taken the advice of researchers suggesting the integration of technology in healthcare. The Electronic Health Record (EHR) System composed of multiple…

  4. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis

    OpenAIRE

    Moreno-Conde, Alberto; Moner Cano, David; Da Cruz, Wellington Dimas; Santos, Marcelo R.; Maldonado Segura, José Alberto; Robles Viejo, Montserrat; KALRA, Dipak

    2015-01-01

    This is a pre-copyedited, author-produced PDF of an article accepted for publication in Journal of the American Medical Informatics Association following peer review. The version of record is available online at: http://dx.doi.org/10.1093/jamia/ocv008

  5. Structuring and coding in health care records: a qualitative analysis using diabetes as a case study

    Directory of Open Access Journals (Sweden)

    Ann R R Robertson

    2015-03-01

    Full Text Available Background: Globally, diabetes mellitus presents a substantial burden to individuals and healthcare systems. Structuring and/or coding of medical records underpin attempts to improve information sharing and searching, potentially bringing benefits for clinical care and secondary uses. Aims and objectives: We investigated if, how and why records for adults with diabetes were structured and/or coded, and explored stakeholders' perceptions of current practice. Methods: We carried out a qualitative, theoretically-informed case study of documenting healthcare information for diabetes patients in family practice and hospital settings, using semi-structured interviews, observations, systems demonstrations and documentary data. Results: We conducted 22 interviews and four on-site observations, and reviewed 25 documents. For secondary uses (research, audit, public health and service planning) the benefits of highly structured and coded diabetes data were clearly articulated. Reported clinical benefits in terms of managing and monitoring diabetes, and perhaps encouraging patient self-management, were modest. We observed marked differences in levels of record structuring and/or coding between settings, and found little evidence that these data were being exploited to improve information sharing between them. Conclusions: Using high levels of data structuring and coding in medical records for diabetes patients has potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK.

  6. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...

  7. Allen's big-eared bat (Idionycteris phyllotis) documented in Colorado based on recordings of its distinctive echolocation call

    Science.gov (United States)

    Hayes, M.A.; Navo, K.W.; Bonewell, L.; Mosch, C.J.; Adams, R.A.

    2009-01-01

    Allen's big-eared bat (Idionycteris phyllotis) inhabits much of the southwestern USA, but had not been documented in Colorado. We recorded echolocation calls consistent with I. phyllotis near La Sal Creek, Montrose County, Colorado. Based on characteristics of the echolocation calls and flight behavior, we conclude that they were emitted by I. phyllotis and represent the first documentation of this species in Colorado.

  8. Analysis and correction of ballistocardiogram contamination of EEG recordings in MR

    International Nuclear Information System (INIS)

    Purpose: to examine the influence of cardiac activity-related head movements and varying blood pulse frequencies on the shape of electroencephalography (EEG) recordings in a high magnetic field, and to implement a post-processing technique to eliminate cardiac activity-related artifacts. Material and methods: respiratory thoracic movements, changes of blood pulse frequency and passive head movements were examined in 20 healthy subjects at rest, outside and inside an MR magnet, with a simultaneously recorded 21-channel surface EEG. An electrocardiogram (ECG) was recorded simultaneously. On the basis of the correlation of the left ventricular ejection time (LVET) with the heart rate, a post-processing, heart-rate-dependent subtraction of the cardiac activity-related artifacts from the EEG was developed. The quality of the post-processed EEG was tested by detecting alpha-activity in the pre- and post-processed EEGs. Results: inside the magnet, passive head motion, but not respiratory thoracic movements, resulted in EEG artifacts that correlated strongly with cardiac activity-related artifacts. The blood pulse frequency influenced the appearance of the cardiac activity-related artifacts. Removal of these artifacts by the implemented post-processing algorithm resulted in an EEG of diagnostic quality with detectable alpha-activity. Conclusion: when recording an EEG in an MR environment, heart-rate-dependent subtraction of EEG artifacts caused by ballistocardiogram contamination is essential to obtain EEG recordings of diagnostic quality and reliability. (orig.)

  9. Improved Security of Attribute Based Encryption for Securing Sharing of Personal Health Records

    Directory of Open Access Journals (Sweden)

    Able E Alias

    2014-11-01

    Full Text Available Cloud computing servers provide a platform for users to remotely store data and share data items with everyone. The personal health record (PHR) has emerged as a patient-centric model of health information exchange. Confidentiality of the shared data is the major problem when patients use commercial cloud servers, because the data can be viewed by everyone. To assure patients' control over access to their own medical records, a promising method is to encrypt the files before outsourcing and to attach access control to that data. Privacy exposure, scalability in key management, flexible access and efficient user revocation have remained the most important challenges toward achieving fine-grained, cryptographically enforced data access control. In this paper a high degree of patient privacy is guaranteed by exploiting multi-authority attribute-based encryption (ABE). Dividing the users in the PHR system into multiple security domains greatly reduces the key management complexity for owners and users.

  10. Design of an Electronic Healthcare Record Server Based on Part 1 of ISO EN 13606

    Directory of Open Access Journals (Sweden)

    Tony Austin

    2011-01-01

    Full Text Available ISO EN 13606 is a newly approved standard at European and ISO levels for the meaningful exchange of clinical information between systems. Although conceived as an inter-operability standard to which existing electronic health record (EHR systems will transform legacy data, the requirements met and architectural approach reflected in this standard also make it a good candidate for the internal architecture of an EHR server. The authors have built such a server for the storage of healthcare records and demonstrated that it is possible to use ISO EN 13606 part 1 as the basis of an internal system architecture. The development of the system and some of the applications of the server are described in this paper. It is the first known operational implementation of the standard as an EHR system.

  11. Recording the dynamic endocytosis of single gold nanoparticles by AFM-based force tracing

    Science.gov (United States)

    Ding, Bohua; Tian, Yongmei; Pan, Yangang; Shan, Yuping; Cai, Mingjun; Xu, Haijiao; Sun, Yingchun; Wang, Hongda

    2015-04-01

    We utilized force tracing to directly record the endocytosis of single gold nanoparticles (Au NPs) of different sizes, revealing the size-dependent endocytosis dynamics and the crucial role of membrane cholesterol. The force, duration and velocity of Au NP invagination are accurately determined at the single-particle level and on the microsecond timescale, which is unprecedented. Electronic supplementary information (ESI) available: details of the experimental procedures and the results of the control experiments. See DOI: 10.1039/c5nr01020a

  12. Theory-based Support for Mobile Language Learning: Noticing and Recording

    Directory of Open Access Journals (Sweden)

    Agnes Kukulska-Hulme

    2009-04-01

    Full Text Available This paper considers the issue of 'noticing' in second language acquisition, and argues for the potential of handheld devices to: (i) support language learners in noticing and recording noticed features 'on the spot', to help them develop their second language system; (ii) help language teachers better understand the specific difficulties of individuals or those from a particular language background; and (iii) facilitate data collection by applied linguistics researchers, which can be fed back into educational applications for language learning. We consider theoretical perspectives drawn from the second language acquisition literature, relating these to the practice of writing language learning diaries, and the potential for learner modelling to facilitate recording and prompting noticing in mobile-assisted language learning contexts. We then offer guidelines for developers of mobile language learning solutions to support the development of language awareness in learners.

  13. End-user developed workflow-based hemodialysis nursing record system.

    Science.gov (United States)

    Tai, Hsin-Ling; Lin, Hsiu-Wen; Ke, Suh-Huei; Lin, Shu-Ai; Chang, Chiung-Chu; Chang, Polun

    2009-01-01

    We report how we decided to build our own hemodialysis nursing record system using an end-user computing strategy with Excel VBA. The project took one year to complete, since we used our off-duty time and started everything from the ground up. We are proud of the final system, which tightly fits our workflow and clinical needs. Its interface was carefully designed to be easy to use, with style. PMID:19593037

  14. Theory-based support for mobile language learning: noticing and recording

    OpenAIRE

    Agnes Kukulska-Hulme; Susan Bull

    2009-01-01

    This paper considers the issue of 'noticing' in second language acquisition, and argues for the potential of handheld devices to: (i) support language learners in noticing and recording noticed features 'on the spot', to help them develop their second language system; (ii) help language teachers better understand the specific difficulties of individuals or those from a particular language background; and (iii) facilitate data collection by applied linguistics researchers, which can be fed bac...

  15. Residential segregation, dividing walls and mental health: A population-based record linkage study

    OpenAIRE

    Maguire, Aideen; French, Declan; O'Reilly, Dermot

    2016-01-01

    Background: Neighbourhood segregation has been described as a fundamental determinant of physical health, but literature on its effect on mental health is less clear. Whilst most previous research has relied on conceptualized measures of segregation, Northern Ireland is unique as it contains physical manifestations of segregation in the form of segregation barriers (or “peacelines”) which can be used to accurately identify residential segregation. Methods: We used population-wide health record da...

  16. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    OpenAIRE

    Balasubramaniam, S; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third pa...

  17. HL7 document patient record architecture: an XML document architecture based on a shared information model.

    OpenAIRE

    Dolin, R H; Alschuler, L.; Behlen, F.; Biron, P. V.; BOYER S.; Essin, D.; Harding, L.; Lincoln, T.; Mattison, J E; Rishel, W.; Sokolowski, R.; Spinosa, J.; Williams, J. P.

    1999-01-01

    The HL7 SGML/XML Special Interest Group is developing the HL7 Document Patient Record Architecture. This draft proposal strives to create a common data architecture for the interoperability of healthcare documents. Key components are that it is under the umbrella of HL7 standards, it is specified in Extensible Markup Language, the semantics are drawn from the HL7 Reference Information Model, and the document specifications form an architecture that, in aggregate, define the semantics and stru...

  18. The Quality of Stakeholder-Based Decisions: Lessons from the Case Study Record

    OpenAIRE

    Beierle, Thomas

    2000-01-01

    The increased use of stakeholder processes in environmental decisionmaking has raised concerns that the inherently “political” nature of such processes may sacrifice substantive quality for political expediency. In particular, there is concern that good science will not be used adequately in stakeholder processes nor be reflected in their decision outcomes. This paper looks to the case study record to examine the quality of the outcomes of stakeholder efforts and the scientific and technical ...

  19. Impact of a computerized system for evidence-based diabetes care on completeness of records: a before–after study

    Directory of Open Access Journals (Sweden)

    Roshanov Pavel S

    2012-07-01

    Full Text Available Abstract Background: Physicians practicing in ambulatory care are adopting electronic health record (EHR) systems. Governments promote this adoption with financial incentives, some hinged on improvements in care. These systems can improve care, but most demonstrations of successful systems come from a few highly computerized academic environments. Those findings may not be generalizable to typical ambulatory settings, where evidence of success is largely anecdotal, with little or no use of rigorous methods. The purpose of our pilot study was to evaluate the impact of a diabetes-specific chronic disease management system (CDMS) on recording of information pertinent to guideline-concordant diabetes care and to plan for larger, more conclusive studies. Methods: Using a before–after study design we analyzed the medical records of approximately 10 patients from each of 3 diabetes specialists (total = 31) who were seen both before and after the implementation of a CDMS. We used a checklist of key clinical data to compare the completeness of information recorded in the CDMS record to both the clinical note sent to the primary care physician based on that same encounter and the clinical note sent to the primary care physician based on the visit that occurred prior to the implementation of the CDMS, accounting for provider effects with Generalized Estimating Equations. Results: The CDMS record outperformed by a substantial margin dictated notes created for the same encounter. Only 10.1% (95% CI, 7.7% to 12.3%) of the clinically important data were missing from the CDMS chart, compared to 25.8% (95% CI, 20.5% to 31.1%) from the clinical note prepared at the time. Conclusions: The CDMS chart captured information important for the management of diabetes more often than dictated notes created with or without its use, but we were unable to detect a difference in completeness between notes dictated in CDMS-associated and usual-care encounters. Our sample of

  20. Ontology-Based Analysis of Microarray Data.

    Science.gov (United States)

    Giuseppe, Agapito; Milano, Marianna

    2016-01-01

    The importance of semantic-based methods and algorithms for the analysis and management of biological data is growing for two main reasons: from the biological side, the knowledge contained in ontologies is increasingly accurate and complete; from the computational side, recent algorithms use such knowledge in valuable ways. Here we focus on semantic-based management and analysis of protein interaction networks, referring to all approaches that analyze protein-protein interaction data using knowledge encoded in biological ontologies. Semantic approaches for studying high-throughput data have been widely used in the past to mine genomic and expression data. Recently, the emergence of network approaches for investigating molecular machineries has in parallel stimulated the introduction of semantic-based techniques for the analysis and management of network data. Applying these computational approaches to the study of microarray data can broaden their application scenario and simultaneously help the understanding of disease development and progression.

  1. Astronomical calibration of the Boreal Santonian (Cretaceous) based on the marine carbon isotope record and correlation to the tropical realm

    Science.gov (United States)

    Thibault, Nicolas; Jarvis, Ian; Voigt, Silke; Gale, Andy; Attree, Kevin; Jenkyns, Hugh

    2016-04-01

    New high-resolution records of bulk carbonate carbon isotopes have been generated for the Upper Coniacian to Lower Campanian interval of the reference sections at Seaford Head (southern England) and Bottaccione (Gubbio, central Italy). These records allow for a new and unambiguous stratigraphic correlation of the base and top of the Santonian between the Boreal and Tethyan realms. Orbital forcing of stable carbon and oxygen isotopes can be highlighted in the Seaford Head dataset, and a floating astronomical time scale is presented for the Santonian of the section, which spans five 405 kyr cycles (Sa1 to Sa5). Macro-, micro- and nannofossil biostratigraphy of the Seaford section is integrated along with magnetostratigraphy, carbon-isotope chemostratigraphy and cyclostratigraphy. Correlation of the Seaford Head astronomical time scale to that of the Niobrara Formation (U.S. Western Interior Basin) allows for anchoring these records to the La2011 astronomical solution at the Santonian-Campanian (Sa/Ca) boundary, which has been recently dated to 84.19±0.38 Ma. Five different astronomical tuning options are examined. The astronomical calibration generates a c. 200 kyr mismatch of the Coniacian-Santonian boundary age between the Boreal Realm in Europe and the Western Interior, likely due either to slight diachronism of the first occurrence of the inoceramid Cladoceramus undulatoplicatus between the two regions, or to remaining uncertainties of radiometric dating and the cyclostratigraphic records.

  2. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and the study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels are known to correlate poorly. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomic profiles. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that they allow expansion of the detected proteome to uncover and/or discover novel proteins.

  3. Constructing a population-based research database from routine maternal screening records: a resource for studying alloimmunization in pregnant women.

    Directory of Open Access Journals (Sweden)

    Brian K Lee

    Full Text Available BACKGROUND: Although screening for maternal red blood cell antibodies during pregnancy is a standard procedure, the prevalence and clinical consequences of non-anti-D immunization are poorly understood. The objective was to create a national database of maternal antibody screening results that can be linked with population health registers to create a research resource for investigating these issues. STUDY DESIGN AND METHODS: Each birth in the Swedish Medical Birth Register was uniquely identified and linked to the text stored in routine maternal antibody screening records in the time window from 9 months prior to 2 weeks after the delivery date. These text records were subjected to a computerized search for specific antibodies using regular expressions. To illustrate the research potential of the resulting database, selected antibody prevalence rates are presented as tables and figures, and the complete data (from more than 60 specific antibodies) are presented as online moving graphical displays. RESULTS: More than one million (1,191,761) births with valid screening information from 1982-2002 constitute the study population. Computerized coverage of screening increased steadily over time and varied by region as electronic records were adopted. To ensure data quality, we restricted analysis to birth records in areas and years with a sustained coverage of at least 80%, representing 920,903 births from 572,626 mothers in 17 of the 24 counties in Sweden. During the study period, non-anti-D and anti-D antibodies occurred in 76.8/10,000 and 14.1/10,000 pregnancies respectively, with marked differences between specific antibodies over time. CONCLUSION: This work demonstrates the feasibility of creating a nationally representative research database from routine maternal antibody screening records over an extended calendar period.
By linkage with population registers of maternal and child health, such data are a valuable resource for addressing important
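
    The computerized search described above, matching free-text screening records against regular expressions for specific antibodies, can be sketched as follows. The pattern set and dictionary name are hypothetical (the study covered more than 60 specificities); case handling matters because, e.g., anti-c and anti-C denote different antibodies.

```python
import re

# Hypothetical patterns for a few antibody specificities; the actual
# study searched for more than 60 antibodies in Swedish screening text.
ANTIBODY_PATTERNS = {
    "anti-D": re.compile(r"\banti[- ]?D\b", re.IGNORECASE),
    "anti-K": re.compile(r"\banti[- ]?K(ell)?\b", re.IGNORECASE),
    "anti-c": re.compile(r"\banti[- ]c\b"),  # case-sensitive: anti-c != anti-C
}

def find_antibodies(record_text):
    """Return the set of antibody specificities mentioned in one record."""
    return {name for name, pat in ANTIBODY_PATTERNS.items()
            if pat.search(record_text)}
```

    Applied record by record, such matches can then be joined to the birth register to produce the per-antibody prevalence rates reported in the study.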

  4. Record Schedules

    Data.gov (United States)

    Department of Homeland Security — A records schedule describes FAA records, identifies the records as either temporary or permanent, and provides specific, mandatory instructions for the disposition...

  5. Paleoclimate record and paleohydrogeological analysis of travertine from the Niangziguan Karst Springs, northern China

    Institute of Scientific and Technical Information of China (English)

    李义连; 王焰新; 邓安力

    2001-01-01

    Travertine deposited around the Niangziguan karst springs was used as a new type of paleoclimate record in this study. Five stages of climate change in northern China from 200± ka to 36± ka before present (B.P.) were reconstructed using the 18O and 13C isotope records of the travertine. The tendency of the change was towards a more arid climate. Coupling the temporal-spatial evolution of the springs with climate change, the hydrogeological evolution could be divided into four major periods since the middle Pleistocene: (1) no spring period; (2) the initial period of spring outcropping as the predominant way of discharge; (3) the culmination period of spring development; and (4) the spring discharge attenuation period. The attenuation is partly related to the decrease of recharge as a result of the dry climate after 90± ka B.P.

  6. A 350 ka record of climate change from Lake El'gygytgyn, Far East Russian Arctic: refining the pattern of climate modes by means of cluster analysis

    Directory of Open Access Journals (Sweden)

    U. Frank

    2013-07-01

    Full Text Available Rock magnetic, biochemical and inorganic records of the sediment cores PG1351 and Lz1024 from Lake El'gygytgyn, Chukotka peninsula, Far East Russian Arctic, were subjected to a hierarchical agglomerative cluster analysis in order to refine and extend the pattern of climate modes as defined by Melles et al. (2007). Cluster analysis of the data obtained from both cores yielded similar results, differentiating clearly between the four climate modes warm, peak warm, cold and dry, and cold and moist. In addition, two transitional phases were identified, representing the early stages of a cold phase and slightly colder conditions during a warm phase. The statistical approach can thus be used to resolve gradual changes in the sedimentary units as an indicator of available oxygen in the hypolimnion in greater detail. Based upon cluster analyses of core Lz1024, the published succession of climate modes in core PG1351, covering the last 250 ka, was modified and extended back to 350 ka. Comparison of the extended Lake El'gygytgyn parameter records of magnetic susceptibility (κLF), total organic carbon content (TOC) and the chemical index of alteration (CIA; Minyuk et al., 2007) to the marine oxygen isotope (δ18O) stack LR04 (Lisiecki and Raymo, 2005) and the summer insolation at 67.5° N revealed that all stages back to marine isotope stage (MIS) 10 and most of the substages are clearly reflected in the pattern derived from the cluster analysis.
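
    The hierarchical agglomerative clustering used above groups sediment samples by similarity across several proxies at once. A minimal, self-contained sketch of the idea (average linkage on standardized proxy values; function name and parameters are illustrative, not the study's implementation):

```python
import numpy as np

def agglomerative_cluster(points, n_clusters):
    """Minimal average-linkage agglomerative clustering (illustrative only).

    points     : (n, d) array, one row per sample, one column per proxy
                 (e.g. magnetic susceptibility, TOC, CIA, standardized)
    n_clusters : number of groups to keep, e.g. six for the four climate
                 modes plus two transitional phases of the study
    """
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        # Merge the pair of clusters with the smallest mean pairwise distance
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.mean([np.linalg.norm(points[i] - points[j])
                             for i in clusters[a] for j in clusters[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
    labels = np.empty(len(points), dtype=int)
    for k, members in enumerate(clusters):
        labels[members] = k
    return labels
```

    In practice library implementations (e.g. Ward linkage in standard statistics packages) would be used; the O(n^3) loop above is only meant to make the merge criterion explicit.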

  7. Semantic analysis based forms information retrieval and classification

    Science.gov (United States)

    Saba, Tanzila; Alqahtani, Fatimah Ayidh

    2013-09-01

    Data entry forms are employed in all types of enterprises to collect hundreds of items of customer information on a daily basis. The information is filled in manually by customers. Hence, it is laborious and time consuming for a human operator to transfer this customer information into computers manually. Additionally, it is expensive, and human errors might cause serious flaws. The automatic interpretation of scanned forms has facilitated many real applications from a speed and accuracy point of view, such as keyword spotting, sorting of postal addresses, script matching and writer identification. This research deals with different strategies to extract customer information from these scanned forms, its interpretation and classification. Accordingly, extracted information is segmented into characters for classification and finally stored as records in databases for further processing. This paper presents a detailed discussion of these semantic analysis strategies for forms processing. Finally, new directions are also recommended for future research.

  8. Analysis of Clinical Record Data for Anticoagulation Management within an EHR System

    OpenAIRE

    Austin, T.; Kalra, D.; Lea, N C; Patterson, D. L.; Ingram, D.

    2009-01-01

    OBJECTIVES: This paper reports an evaluation of the properties of a generic electronic health record information model that were actually required and used when importing an existing clinical application into a generic EHR repository. METHOD: A generic EHR repository and system were developed as part of the EU Projects Synapses and SynEx. A Web application to support the management of anticoagulation therapy was developed to interface to the EHR system, and deployed within a north London hosp...

  9. A strategic analysis of synapse and Canada health infoway’s electronic health record solution blueprint

    OpenAIRE

    Labrosse, Chadwick Andre

    2007-01-01

    Synapse is a currently deployed software application that collects and presents clinical and administrative information about Mental Health & Addictions patients, in the form of an Electronic Health Record (EHR). Synapse was jointly developed by regional health authorities, federal and provincial governments and research institutions. While Synapse has enjoyed limited regional success in British Columbia, the Synapse Project Steering Committee seeks to expand its adoption with clinicians ...

  10. Redundancy in electronic health record corpora: analysis, impact on text mining performance and mitigation strategies

    OpenAIRE

    Cohen, Raphael; Elhadad, Michael; Elhadad, Noémie

    2013-01-01

    Background The increasing availability of Electronic Health Record (EHR) data and specifically free-text patient notes presents opportunities for phenotype extraction. Text-mining methods in particular can help disease modeling by mapping named-entities mentions to terminologies and clustering semantically related terms. EHR corpora, however, exhibit specific statistical and linguistic characteristics when compared with corpora in the biomedical literature domain. We focus on copy-and-paste r...

  11. Hair Analysis Provides a Historical Record of Cortisol Levels in Cushing’s Syndrome

    Science.gov (United States)

    Thomson, S.; Koren, G.; Fraser, L.-A.; Rieder, M.; Friedman, T. C.; Van Uum, S. H. M.

    2010-01-01

    The severity of Cushing’s Syndrome (CS) depends on the duration and extent of the exposure to excess glucocorticoids. Current measurements of cortisol in serum, saliva and urine reflect systemic cortisol levels at the time of sample collection, but cannot assess past cortisol levels. Hair cortisol levels may be increased in patients with CS and, as hair grows about 1 cm/month, measurement of hair cortisol may provide historical information on the development of hypercortisolism. We measured cortisol in 1 cm hair sections, in relation to clinical course, in six female patients with CS and in 32 healthy volunteers. Hair cortisol content was measured using a commercially available salivary cortisol immunoassay with a protocol modified for use with hair. Hair cortisol levels were higher in patients with CS than in controls; the medians (ranges) were 679 (279–2500) and 116 (26–204) ng/g respectively (P <0.001). Segmental hair analysis provided information for up to 18 months before the time of sampling, and hair cortisol concentrations appeared to vary in accordance with the clinical course. Based on these data, we suggest that hair cortisol measurement is a novel method for assessing dynamic systemic cortisol exposure that provides unique historical information on variation in cortisol, and that more research is required to fully understand the utility and limits of this technique. PMID:19609841
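
    The timeline arithmetic behind segmental analysis is simple: at ~1 cm/month growth, the n-th 1 cm segment from the scalp covers roughly the n-th month before sampling. A small sketch of that mapping (function name is hypothetical; months are approximated as 30 days, and real growth rates vary between individuals):

```python
from datetime import date, timedelta

GROWTH_CM_PER_MONTH = 1.0  # approximate rate cited in the abstract

def segment_period(sampling_date, segment_index, growth=GROWTH_CM_PER_MONTH):
    """Approximate calendar window covered by one 1 cm hair segment.

    segment_index 0 is the segment closest to the scalp (most recent).
    A month is approximated as 30 days for this illustration.
    """
    days_per_cm = 30 / growth
    end = sampling_date - timedelta(days=segment_index * days_per_cm)
    start = end - timedelta(days=days_per_cm)
    return start, end
```

    Under this approximation an 18 cm hair shaft (segments 0-17) reaches about 18 months back, matching the historical window reported in the study.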

  12. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Science.gov (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick

    2016-01-01

    This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. 
Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use
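
    The document-generation step described above, turning flat relational rows into nested documents for a document store, can be sketched as follows. The field names and nesting are hypothetical illustrations, not the openEHR reference model's actual structure:

```python
import json

# Hypothetical flat relational row, standing in for one MySQL record of
# the healthcare dataset; column names are illustrative.
row = {
    "patient_id": 42, "obs_type": "blood_pressure",
    "systolic": 120, "diastolic": 80, "recorded_at": "2014-03-01",
}

# Nest the flat row into a document, the shape a document store such as
# Couchbase would store and index per record rather than per table.
document = {
    "subject": {"id": row["patient_id"]},
    "observation": {
        "type": row["obs_type"],
        "data": {"systolic": row["systolic"], "diastolic": row["diastolic"]},
        "time": row["recorded_at"],
    },
}
print(json.dumps(document, indent=2))
```

    Population-based queries then run against these nested documents (with per-query indexes, as the abstract notes) instead of joining normalized tables.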

  13. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Science.gov (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick

    2016-01-01

    This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. 
Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use

  14. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data

    Science.gov (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick

    2016-01-01

    This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. 
Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use
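    The benchmarking procedure the abstract describes (timing repeated population-based queries against each backend and comparing response times) can be sketched generically. `run_query` here is a hypothetical caller-supplied function wrapping whichever database client is under test; it is not part of the study's code.

```python
import time
import statistics

def benchmark(run_query, query, repeats=5):
    """Time a population-based query against one backend.

    run_query is a caller-supplied function (hypothetical here) that
    submits `query` to the database under test; we report the median
    to damp warm-up and caching effects across repeats.
    """
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        run_query(query)
        times.append(time.perf_counter() - t0)
    return {"median_s": statistics.median(times),
            "min_s": min(times),
            "max_s": max(times)}
```

The same harness can then be pointed at each backend (MySQL, BaseX, Couchbase, ...) with the identically formulated query to produce comparable response-time figures.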

  15. Vertical Microbial Community Variability of Carbonate-based Cones may Provide Insight into Formation in the Rock Record

    Science.gov (United States)

    Trivedi, C.; Bojanowski, C.; Daille, L. K.; Bradley, J.; Johnson, H.; Stamps, B. W.; Stevenson, B. S.; Berelson, W.; Corsetti, F. A.; Spear, J. R.

    2015-12-01

    Stromatolite morphogenesis is poorly understood, and the process by which microbial mats become mineralized is a primary question in microbialite formation. Ancient conical stromatolites are primarily carbonate-based, whereas the few modern analogues in hot springs are either non-mineralized or mineralized by silica. A team from the 2015 International GeoBiology Course studied carbonate-rich microbial cones from near Little Hot Creek (LHC), Long Valley Caldera, California, to investigate how conical stromatolites might form in a hot spring carbonate system. The cones are up to 3 cm tall and are found in a calm, ~45 °C pool near LHC that is 4 times super-saturated with respect to CaCO3. The cones rise from a flat, layered microbial mat at the edge of the pool. Scanning electron microscopy revealed filamentous bacteria associated with calcite crystals within the cone tips. Preliminary 16S rRNA gene analysis indicated variability of community composition between different vertical levels of the cone. The cone tip had a comparatively greater abundance of filamentous cyanobacteria (Leptolyngbya and Phormidium) and fewer heterotrophs (e.g. Chloroflexi) than the cone bottom. This supports the hypothesis that cone formation may depend on the differential abundance of the microbial community and their potential functional roles. Metagenomic analyses of the cones revealed potential genes related to chemotaxis and motility. Specifically, a genomic bin identified as a member of the genus Isosphaera contained an hmp chemotaxis operon implicated in gliding motility in the cyanobacterium Nostoc punctiforme [1]. Isosphaera is a Planctomycete shown to have phototactic capabilities [2], and may play a role in conjunction with cyanobacteria in the vertical formation of the cones. 
This analysis of actively growing cones indicates a complex interplay of geochemistry and microbiology that form structures which can serve as models for processes that occurred in the past and are

  16. An analysis of the recording of tobacco use among inpatients in Irish hospitals.

    LENUS (Irish Health Repository)

    Sheridan, A

    2014-10-01

    Smoking is the largest avoidable cause of premature mortality in the world. Hospital admission is an opportunity to identify and help smokers quit. This study aimed to determine the level of recording of tobacco use (current and past) in Irish hospitals. Information on inpatient discharges with a tobacco use diagnosis was extracted from HIPE. In 2011, a quarter (n=84,679) of discharges had a recording of tobacco use, which was more common among males (29% (n=50,161) male v. 20% (n=30,162) female) and among medical patients (29% (n=54,375) medical v. 20% (n=30,162) other), and was highest among those aged 55-59 years (30.6%; n=7,885). SLAN 2007 reported that 48% of adults had smoked at some point in their lives. This study would suggest an under-reporting of tobacco use among hospital inpatients. Efforts should be made to record smoking status at hospital admission, and to improve the quality of the HIPE coding of tobacco use.

  17. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud-computing services that allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: a ROOT file filter, a JavaScript multivariable cross-filter, a JavaScript ROOT browser and JavaScript scatter-matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  18. TEST COVERAGE ANALYSIS BASED ON PROGRAM SLICING

    Institute of Scientific and Technical Information of China (English)

    Chen Zhenqiang; Xu Baowen; Guanjie

    2003-01-01

    Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage about variables, based on program slicing. By adding powers according to their importance, the users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is bigger than that obtained by a traditional measure, because the coverage about a variable takes only the related code into account.
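    The weighted variable coverage idea can be illustrated with a minimal sketch. The data model here (a slice as a set of statement identifiers, importance expressed as numeric weights) is assumed for illustration and is not taken from the letter:

```python
def variable_coverage(slices, executed, weights=None):
    """Weighted test coverage over variables (hypothetical data model).

    `slices` maps each variable to the set of statements in its program
    slice; `executed` is the set of statements the test suite ran.
    Per-variable coverage = executed slice statements / slice size;
    overall coverage is the weight-averaged value over all variables.
    """
    weights = weights or {v: 1.0 for v in slices}
    total_w = sum(weights[v] for v in slices)
    cov = 0.0
    for v, stmts in slices.items():
        if stmts:
            cov += weights[v] * len(stmts & executed) / len(stmts)
    return cov / total_w
```

Raising the weight of an important variable pulls the overall figure toward that variable's slice coverage, which is the focusing effect the letter describes.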

  19. Ostracod-based isotope record from Lake Ohrid (Balkan Peninsula) over the last 140 ka

    Science.gov (United States)

    Belmecheri, Soumaya; von Grafenstein, Ulrich; Andersen, Nils; Eymard-Bordon, Amandine; Régnier, Damien; Grenier, Christophe; Lézine, Anne-Marie

    2010-12-01

    The stable isotope composition of benthic ostracods from a deep-lake sediment core (JO2004-1) recovered from Lake Ohrid (Albania-Macedonia) was studied to investigate regional responses to climate change at the interface between the north-central European and Mediterranean climate systems. Ostracod valves are present only during interglacial intervals, during Marine Isotope Stage (MIS) 5 and 1. The ostracod oxygen isotope values (δ18O) quantitatively reflect changes in the oxygen isotope signal of the lake water (δ18OL). The interpretation of this record, however, is far from straightforward. δ18OL variations throughout the MIS 5/6 transition (TII), MIS 5 and MIS 1 appear to be controlled by site-specific hydrological processes, as shown by modern isotope hydrology. The δ18OL trends at TII, MIS 5 and MIS 1 match the timing and the main structural features of the major regional climate records (Corchia cave δ18O, Iberian margin sea surface temperature), suggesting that the Ohrid δ18OL responded to global-scale climate changes, although it seems certain that the lake experienced a significant degree of evaporation and varying moisture availability. The carbon isotope signal (δ13C) seems to respond more accurately to climate changes, in agreement with other JO2004-1 proxies. δ13C of the ostracod calcite is directly linked to the δ13C of the dissolved inorganic carbon (DIC) in the lake, which in this case is controlled by the isotopic composition of the DIC in the incoming water and by the internal processes of the lake. High δ13C during cold periods and low values during warm periods reflect changing vegetation cover and soil activity. These results suggest that Lake Ohrid has the potential to capture a long record of regional environment-related temperature trends during interglacial periods, particularly given the exceptional thickness of the lake sediment, covering probably the entire Quaternary.

  20. Electronic Medical Record System Based on XML

    Institute of Scientific and Technical Information of China (English)

    陈可

    2012-01-01

    To address the slow transmission of medical information and the time-consuming retrieval of past records inherent in handwritten medical records, a design for an electronic medical record (EMR) system based on XML was proposed and implemented. Built on a Web Services development platform, the system integrates patient-centred clinical information, including physician orders, medical technical examinations, nursing care and infectious disease reports, and implements information query and integration. By applying the electronic medical record system, multi-point, multi-directional supervision of medical records replaces the terminal-only quality control that handwritten records allow.

  1. The implementation of a Personal Digital Assistant (PDA) based patient record and charting system: lessons learned.

    OpenAIRE

    Carroll, Aaron E.; Saluja, Sunil; Tarczy-Hornoch, Peter

    2002-01-01

    Personal Digital Assistants (PDAs) offer many potential advantages to clinicians. A number of systems have begun to appear for all types of PDAs that allow for the recording and tracking of patient information. PDAs allow information to be both entered and accessed at the point of care. They also allow information entered away from a central repository to be added or "synced" with data through the use of a wireless or wired connection. Few systems, however, have been designed to work in the c...

  2. HL7 document patient record architecture: an XML document architecture based on a shared information model.

    Science.gov (United States)

    Dolin, R H; Alschuler, L; Behlen, F; Biron, P V; Boyer, S; Essin, D; Harding, L; Lincoln, T; Mattison, J E; Rishel, W; Sokolowski, R; Spinosa, J; Williams, J P

    1999-01-01

    The HL7 SGML/XML Special Interest Group is developing the HL7 Document Patient Record Architecture. This draft proposal strives to create a common data architecture for the interoperability of healthcare documents. Key components are that it is under the umbrella of HL7 standards, it is specified in Extensible Markup Language, the semantics are drawn from the HL7 Reference Information Model, and the document specifications form an architecture that, in aggregate, define the semantics and structural constraints necessary for the exchange of clinical documents. The proposal is a work in progress and has not yet been submitted to HL7's formal balloting process. PMID:10566319

  3. CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients

    OpenAIRE

    Fritsche, L; Lindemann, G.; Schroeter, K.; Schlaefer, A; Neumayer, H-H

    1999-01-01

    Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only few working EPR-systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields with patients who are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential b...

  4. Combining terrestrial stereophotogrammetry, DGPS and GIS-based 3D voxel modelling in the volumetric recording of archaeological features

    Science.gov (United States)

    Orengo, Hector A.

    2013-02-01

    Archaeological recording of structures and excavations in high mountain areas is greatly hindered by the scarce availability of both space, to transport material, and time. The Madriu-Perafita-Claror, InterAmbAr and PCR Mont Lozère high mountain projects have documented hundreds of archaeological structures and carried out many archaeological excavations. These projects required the development of a technique which could record both structures and the process of an archaeological excavation in a fast and reliable manner. The combination of DGPS, close-range terrestrial stereophotogrammetry and voxel based GIS modelling offered a perfect solution since it helped in developing a strategy which would obtain all the required data on-site fast and with a high degree of precision. These data are treated off-site to obtain georeferenced orthoimages covering both the structures and the excavation process from which site and excavation plans can be created. The proposed workflow outputs also include digital surface models and volumetric models of the excavated areas from which topography and archaeological profiles were obtained by voxel-based GIS procedures. In this way, all the graphic recording required by standard archaeological practices was met.

  5. Management of radiological related equipments. Creating the equipment management database and analysis of the repair and maintenance records

    International Nuclear Information System (INIS)

    In 1997, we established a committee for equipment maintenance and management in our department. We designed a database, using Microsoft Access, to classify and register all the radiology-related equipment. Managing the condition and cost of each piece of equipment has become easier by keeping the database as the equipment management ledger and by filing the history of repairs and maintenance for each modality. We then tallied the numbers, costs of repairs and downtimes from four years of repair and maintenance records, and we re-examined the causal analysis of failures and the contents of the regular maintenance for the CT and MRI equipment, which had shown the highest numbers of repairs. Consequently, we have found ways to improve the registration method for the data and to use the repair budget more economically. (author)
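    A minimal sketch of the kind of aggregation described (repair counts, total cost and downtime per modality), using an in-memory SQLite table with an invented schema in place of the department's Access database:

```python
import sqlite3

# Hypothetical repair-record schema: one row per repair event.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE repairs (
    equipment TEXT, cost INTEGER, downtime_h REAL)""")
con.executemany("INSERT INTO repairs VALUES (?, ?, ?)", [
    ("CT", 5000, 12.5), ("CT", 1200, 3.0), ("MRI", 8000, 24.0)])

# Count, total cost and total downtime per modality, most-repaired first --
# the figures the committee reviewed when re-examining CT and MRI maintenance.
rows = con.execute("""SELECT equipment, COUNT(*), SUM(cost), SUM(downtime_h)
                      FROM repairs GROUP BY equipment
                      ORDER BY COUNT(*) DESC""").fetchall()
```

Sorting by repair count surfaces the modalities (here CT, then MRI) that warrant a closer look at failure causes and maintenance content.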

  6. Web-based analysis of nasal sound spectra.

    Science.gov (United States)

    Seren, Erdal

    2005-10-01

    The spectral analysis of nasal sound is an indicator of the nasal airflow pattern. We investigated a new technique for nasal sound analysis via the Internet. This study included 27 patients and 22 healthy people. Patients were treated by septoplasty for septal deviation. On the 10th postoperative day, this technique was applied to follow the nasal airflow course. The patients recorded the nasal sound with a microphone onto a computer as a .wav file and sent it to us via the Internet; all records were evaluated by us and the results were sent back to the patients. The 11 patients who had nasal obstruction symptoms (group A) were called to the hospital for examination. In the nasal sound analyses e-mailed by those patients, the sound intensity at high frequencies (2-4 kHz, 4-6 kHz) was above 30 dB, but at low (500-1000 Hz) and medium (1-2 kHz) frequencies it was below 10 dB. In the patients without nasal obstruction symptoms (group B), the sound intensity at high frequencies was below 10 dB, but at low and medium frequencies it was above 20 dB. There was a statistically significant difference in sound intensity between group A and group B. Endoscopic examination of the obstructed patients revealed crust formation in the nasal cavity, which narrows the nasal airway. Web-based nasal sound analysis is an important method for following the postoperative course and evaluating nasal airflow. The new method saves time and money by avoiding unnecessary return visits to the hospital.
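    The band-wise intensity comparison described above can be sketched with a plain FFT. The band edges follow the abstract, but the exact pipeline (windowing, calibration to absolute dB) is not specified there, so this is only an illustrative approximation:

```python
import numpy as np

def band_levels(signal, fs, bands=((500, 1000), (1000, 2000),
                                   (2000, 4000), (4000, 6000))):
    """Mean spectral magnitude (relative dB) per frequency band.

    `signal` is a 1-D array of samples (e.g. from a .wav file) and
    `fs` the sampling rate in Hz; band edges are in Hz.
    """
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    out = {}
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        # Small constant avoids log10(0) for empty bands.
        out[(lo, hi)] = 20 * np.log10(spec[mask].mean() + 1e-12)
    return out
```

Comparing the high-frequency bands (2-4 kHz, 4-6 kHz) against the low and medium bands then reproduces the group A / group B contrast the study reports.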

  7. Quantum entanglement analysis based on abstract interpretation

    OpenAIRE

    Perdrix, Simon

    2008-01-01

    Entanglement is a non local property of quantum states which has no classical counterpart and plays a decisive role in quantum information theory. Several protocols, like the teleportation, are based on quantum entangled states. Moreover, any quantum algorithm which does not create entanglement can be efficiently simulated on a classical computer. The exact role of the entanglement is nevertheless not well understood. Since an exact analysis of entanglement evolution induces an exponential sl...

  8. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    The letter presents an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of a data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run time and running dynamically. An analysis coded in XML instead of C++, easy to understand and use, effectively reduces the workload and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented with standard modules or inherited from modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)
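    The "next step runs if and only if this step succeeds" logic driven by an XML description can be sketched as follows. The XML schema, step names and handler signatures here are invented for illustration and are not the actual BOSS interface syntax:

```python
import xml.etree.ElementTree as ET

# Hypothetical analysis description: an ordered chain of cut steps.
CONFIG = """
<analysis>
  <step name="event-selection" min_tracks="2"/>
  <step name="kinematic-fit"   max_chi2="10"/>
</analysis>
"""

def run_chain(xml_text, handlers, event):
    """Run the steps in document order; stop at the first failure.

    `handlers` maps step name -> function(event, attrib) -> bool,
    so new cuts can be added by editing the XML, not the code.
    """
    for step in ET.fromstring(xml_text).findall("step"):
        ok = handlers[step.get("name")](event, step.attrib)
        if not ok:            # the next step goes on iff this one succeeds
            return False
    return True
```

An event dictionary is accepted only if every configured step passes, which mirrors the sequential cut logic the letter describes.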

  9. Late Holocene stable-isotope based winter temperature records from ice wedges in the Northeast Siberian Arctic

    Science.gov (United States)

    Opel, Thomas; Meyer, Hanno; Laepple, Thomas; Dereviagin, Alexander Yu.

    2016-04-01

    The Arctic is currently undergoing an unprecedented warming. This highly dynamic response to changes in climate forcing and the global impact of the Arctic water, carbon and energy balances make the Arctic a key region to study past, recent and future climate changes. Recent proxy-based temperature reconstructions indicate a long-term cooling over roughly the past 8 millennia that is mainly related to a decrease in solar summer insolation and has been reversed only by the ongoing warming. Climate model results, on the other hand, show no significant change or even a slight warming over this period. This model-proxy data mismatch might be caused by a summer bias in the climate proxies used. Ice wedges may provide essential information on past winter temperatures for a comprehensive seasonal picture of Holocene Arctic climate variability. Polygonal ice wedges are a widespread permafrost feature in the Arctic tundra lowlands. Ice wedges form by the repeated filling of thermal contraction cracks with snow melt water, which quickly refreezes at subzero ground temperatures and forms ice veins. As the seasonality of frost cracking and infill is generally related to winter and spring, respectively, the isotopic composition of wedge ice is indicative of past climate conditions during the annual cold season (DJFMAM, hereafter referred to as winter). δ18O of ice is interpreted as a proxy for regional surface air temperature. AMS radiocarbon dating of organic remains in ice-wedge samples provides age information to generate chronologies for single ice wedges as well as regionally stacked records with an up to centennial resolution. In this contribution we seek to summarize Holocene ice-wedge δ18O-based temperature information from the Northeast Siberian Arctic. We strongly focus on our own work in the Laptev Sea region but also consider literature data from other regional study sites. We consider the stable-isotope composition of wedge ice, ice-wedge dating and chronological

  10. Health scorecard of spacecraft platforms: Track record of on-orbit anomalies and failures and preliminary comparative analysis

    Science.gov (United States)

    Wise, Marcie A.; Saleh, Joseph H.; Haga, Rachel A.

    2011-01-01

    Choosing the "right" satellite platform for a given market and mission requirements is a major investment decision for a satellite operator. With a variety of platforms available on the market from different manufacturers, and multiple offerings from the same manufacturer, the down-selection process can be quite involved. In addition, because data for on-orbit failures and anomalies per platform is unavailable, incomplete, or fragmented, it is difficult to compare options and make an informed choice with respect to the critical attribute of field reliability of different platforms. In this work, we first survey a large number of geosynchronous satellite platforms by the major satellite manufacturers, and we provide a brief overview of their technical characteristics, timeline of introduction, and number of units launched. We then analyze an extensive database of satellite failures and anomalies, and develop for each platform a "health scorecard" that includes all the minor and major anomalies, and complete failures—that is, failure events of different severities—observed on-orbit for each platform. We identify the subsystems that drive these failure events and how much each subsystem contributes to these events for each platform. In addition, we provide the percentage of units in each platform which have experienced failure events, and, after calculating the total number of years logged on-orbit by each platform, we compute its corresponding average failure and anomaly rate. We conclude this work with a preliminary comparative analysis of the health scorecards of different platforms. The concept of a "health scorecard" introduced here provides a useful snapshot of the failure and anomaly track record of a spacecraft platform on orbit. As such, it constitutes a useful and transparent benchmark that can be used by satellite operators to inform their acquisition choices ("inform" not "base" as other considerations are factored in when comparing different spacecraft

  11. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  12. An Agent Based System Framework for Mining Data Record Extraction from Search Engine Result Pages

    Directory of Open Access Journals (Sweden)

    Dr.K.L Shunmuganathan

    2012-04-01

    Full Text Available Nowadays, the huge amount of information distributed through the Web motivates the study of techniques for extracting relevant data in an efficient and reliable way. Information extraction (IE) from semistructured Web documents plays an important role for a variety of information agents. In this paper, a framework for a WebIE system built on the JADE platform is proposed, using a non-visual automatic wrapper to extract data records from search engine result pages, which contain important information for meta search engines and computer users. It describes the different agents used in WebIE, how communication occurs between them and how the different agents are managed. A Multi Agent System (MAS) provides an efficient, decentralized way for agents to communicate. A prototype model was developed for study purposes and applied to the complex problems that arise in WebIE. Our wrapper consists of a series of agent filters that detect and remove irrelevant data regions from the web page. In this paper, we propose a highly effective and efficient algorithm for automatically mining result records from search engine response pages.

  13. A new source discriminant based on frequency dispersion for hydroacoustic phases recorded by T-phase stations

    Science.gov (United States)

    Talandier, Jacques; Okal, Emile A.

    2016-07-01

    In the context of the verification of the Comprehensive Nuclear-Test Ban Treaty in the marine environment, we present a new discriminant based on the empirical observation that hydroacoustic phases recorded at T-phase stations from explosive sources in the water column feature a systematic inverse dispersion, with lower frequencies traveling slower, which is absent from signals emanating from earthquake sources. This difference is present even in the case of the so-called "hotspot earthquakes" occurring inside volcanic edifices featuring steep slopes leading to efficient seismic-acoustic conversions, which can lead to misidentification of such events as explosions when using more classical duration-amplitude discriminants. We propose an algorithm for the compensation of the effect of dispersion over the hydroacoustic path based on a correction to the spectral phase of the ground velocity recorded by the T-phase station, computed individually from the dispersion observed on each record. We show that the application of a standard amplitude-duration algorithm to the resulting compensated time series satisfactorily identifies records from hotspot earthquakes as generated by dislocation sources, and present a full algorithm, lending itself to automation, for the discrimination of explosive and earthquake sources of hydroacoustic signals at T-phase stations. The only sources not readily identifiable consist of a handful of complex explosions which occurred in the 1970s, believed to involve the testing of advanced weaponry, and which should be independently identifiable through routine vetting by analysts. While we presently cannot provide a theoretical justification to the observation that only explosive sources generate dispersed T phases, we hint that this probably reflects a simpler, and more coherent distribution of acoustic energy among the various modes constituting the wavetrain, than in the case of dislocation sources embedded in the solid Earth.
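    The spectral phase compensation idea (advancing each frequency component by its measured excess delay) can be sketched as below. `delay_of_freq` stands in for the per-record dispersion measurement the authors describe; this is an illustration of the general technique, not their exact algorithm:

```python
import numpy as np

def compensate_dispersion(x, fs, delay_of_freq):
    """Undo frequency-dependent travel-time delays in a recorded trace.

    `delay_of_freq(f)` returns, for an array of frequencies in Hz, the
    extra delay (s) each component accumulated relative to a reference;
    each component is advanced by that amount via a spectral phase shift.
    """
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X *= np.exp(2j * np.pi * f * delay_of_freq(f))  # phase advance
    return np.fft.irfft(X, n=len(x))
```

After compensation, the wavetrain is re-aligned in time, and a standard amplitude-duration discriminant can be applied to the corrected trace as described above.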

  15. Recorded fatal and permanently disabling injuries in South African manufacturing industry - Overview, analysis and reflection

    DEFF Research Database (Denmark)

    Hedlund, Frank Huess

    2013-01-01

    Studies on occupational accident statistics in South Africa are few and far between; the most recent paper on the manufacturing sector was published in 1990. Accidents in South Africa are recorded in two systems: exhaustive information is available from the insurance system under the Workmen...... and it is argued that WCC registrations may comprise industries outside the Standard Industrial Classification (SIC) scheme for manufacturing. The quality of accident reporting in official publications began to deteriorate by the mid-1990s. The largest problem, however, is that reporting has come to a standstill...

  16. Detection of the short-term preseizure changes in EEG recordings using complexity and synchrony analysis

    Institute of Scientific and Technical Information of China (English)

    JIA Wenyan; KONG Na; MA Jun; LIU Hesheng; GAO Xiaorong; GAO Shangkai; YANG Fusheng

    2006-01-01

    An important consideration in epileptic seizure prediction is proving the existence of a pre-seizure state that can be detected using various signal processing algorithms. In the analyses of intracranial electroencephalographic (EEG) recordings of four epilepsy patients, short-term changes in the measures of complexity and synchrony were detected before the majority of seizure events across the sample patient population. A decrease in complexity and an increase in phase synchrony appeared several minutes before seizure onset, and the changes were more pronounced in the focal region than in the remote region. This result was also validated statistically using a surrogate data method.
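    Phase synchrony between two EEG channels is commonly quantified as a phase-locking value computed from the analytic signal. The sketch below uses the standard FFT-based Hilbert construction; it illustrates the general technique, not the specific measures used in the study:

```python
import numpy as np

def analytic(x):
    """Analytic signal via the FFT (the standard Hilbert-transform trick)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1
    if len(x) % 2 == 0:
        h[len(x) // 2] = 1
        h[1:len(x) // 2] = 2
    else:
        h[1:(len(x) + 1) // 2] = 2
    return np.fft.ifft(X * h)

def plv(x, y):
    """Phase-locking value between two channels: 1 means a perfectly
    constant phase relation, values near 0 mean no consistent relation."""
    dphi = np.angle(analytic(x)) - np.angle(analytic(y))
    return np.abs(np.exp(1j * dphi).mean())
```

Tracking such a measure in sliding windows before seizure onset is one way the increase in phase synchrony described above can be detected.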

  17. Recording Approach of Heritage Sites Based on Merging Point Clouds from High Resolution Photogrammetry and Terrestrial Laser Scanning

    Science.gov (United States)

    Grussenmeyer, P.; Alby, E.; Landes, T.; Koehl, M.; Guillemin, S.; Hullo, J. F.; Assali, P.; Smigiel, E.

    2012-07-01

    Different approaches and tools are required in Cultural Heritage Documentation to deal with the complexity of monuments and sites. The documentation process has changed greatly in the last few years, always driven by technology. Accurate documentation is closely tied to advances in technology (imaging sensors, high-speed scanning, automation in recording and processing data) for the purposes of conservation works, management, appraisal, assessment of the structural condition, archiving, publication and research (Patias et al., 2008). In this paper we focus on the recording aspects of cultural heritage documentation, especially the generation of geometric and photorealistic 3D models for accurate reconstruction and visualization purposes. The selected approaches are based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and recent advances have changed the recording approach. The choice of the best workflow relies on the site configuration, the performance of the sensors, and criteria such as geometry, accuracy, resolution, georeferencing, texture and, of course, processing time. TLS techniques (time-of-flight or phase-shift systems) are widely used for recording large and complex objects and sites. Point cloud generation from images by dense stereo or multi-view matching can be used as an alternative or as a complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a high-performance digital camera and a few accessories only. Indeed, the stereo or multi-view matching process offers a cheap, flexible and accurate solution for obtaining 3D point clouds. Moreover, the captured images might also be used for texturing the models. Several software packages are available, whether web-based, open source or commercial. 
The main advantage of this photogrammetric or computer vision based technology is to get

  18. Implementation of a Next-Generation Electronic Nursing Records System Based on Detailed Clinical Models and Integration of Clinical Practice Guidelines

    OpenAIRE

    Min, Yul Ha; Park, Hyeoun-Ae; Chung, Eunja; Lee, Hyunsook

    2013-01-01

    Objectives The purpose of this paper is to describe the components of a next-generation electronic nursing records system ensuring full semantic interoperability and integrating evidence into the nursing records system. Methods A next-generation electronic nursing records system based on detailed clinical models and clinical practice guidelines was developed at Seoul National University Bundang Hospital in 2013. This system has two components, a terminology server and a nursing documentation ...

  19. Soldering-based easy packaging of thin polyimide multichannel electrodes for neuro-signal recording

    Science.gov (United States)

    Baek, Dong-Hyun; Han, Chang-Hee; Jung, Ha-Chul; Kim, Seon Min; Im, Chang-Hwan; Oh, Hyun-Jik; Jungho Pak, James; Lee, Sang-Hoon

    2012-11-01

    We propose a novel packaging method for preparing thin polyimide (PI) multichannel microelectrodes. The electrodes were connected simply by making a via-hole at the interconnection pad of a thin PI electrode, and a nickel (Ni) ring was constructed by electroplating through the via-hole to permit stable soldering with strong adhesion to the electrode and the printed circuit board. The electroplating conditions were optimized for the construction of a well-organized Ni ring. The electrical properties of the packaged electrode were evaluated by fabricating and packaging a 40-channel thin PI electrode. Animal experiments were performed using the packaged electrode for high-resolution recording of somatosensory evoked potential from the skull of a rat. The in vivo and in vitro tests demonstrated that the packaged PI electrode may be used broadly for the continuous measurement of bio-signals or for neural prosthetics.

  20. Global solar radiation: comparison of satellite-based climatology with station records

    Science.gov (United States)

    Skalak, Petr; Zahradnicek, Pavel; Stepanek, Petr; Farda, Ales

    2016-04-01

    We analyze surface incoming shortwave radiation (SIS) from the SARAH dataset prepared by the EUMETSAT Climate Monitoring Satellite Applications Facility from satellite observations of the visible channels of the MVIRI and SEVIRI instruments onboard the geostationary Meteosat satellites. The satellite SIS data are evaluated within the period 1984-2014 on various time scales: from individual months and years to long-term climate means. The validation is performed using ground measurements of global solar radiation (GLBR) carried out at 11 meteorological stations of the Czech Hydrometeorological Institute in the Czech Republic, each with a data series at least 30 years long. Our aim is to explore whether the SIS data could potentially serve as an alternative source of information on GLBR outside of the relatively sparse network of meteorological stations recording GLBR. Acknowledgement: Supported by the Ministry of Education, Youth and Sports of the Czech Republic within the National Sustainability Program I (NPU I), grant number LO1415.

  1. Non-Contact Analysis of the Adsorptive Ink Capacity of Nano Silica Pigments on a Printing Coating Base

    Science.gov (United States)

    Jiang, Bo; Huang, Yu Dong

    2014-01-01

    Near infrared spectra combined with partial least squares were proposed as a means of non-contact analysis of the adsorptive ink capacity of recording coating materials in ink jet printing. First, the recording coating materials were prepared based on nano silica pigments. 80 samples of the recording coating materials were selected to develop the calibration of adsorptive ink capacity against ink adsorption (g/m2). The model developed predicted samples in the validation set with r2 = 0.80 and SEP = 1.108; these analytical results showed that near infrared spectra have significant potential for predicting the adsorptive ink capacity of the recording coating. The influence of factors such as recording coating thickness, the mass ratio of silica to binder (polyvinyl alcohol), and the solution concentration on the adsorptive ink capacity was studied. With the help of the near infrared spectra, the adsorptive ink capacity of a recording coating material can be rapidly controlled. PMID:25329464
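The calibration step described in this abstract, regressing an ink-adsorption value on spectra over 80 samples, can be sketched with a minimal one-component PLS fit (NIPALS-style). All data below are synthetic stand-ins; the dimensions, noise levels, and the single-component simplification are assumptions for illustration only, and a real calibration would use cross-validated multi-component PLS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's data: 80 coating samples,
# 100-point NIR spectra, ink adsorption in g/m^2 (all values hypothetical).
n_samples, n_wavelengths = 80, 100
latent = rng.normal(size=n_samples)                   # hidden chemical factor
X = np.outer(latent, rng.normal(size=n_wavelengths))  # spectra driven by it
X += 0.05 * rng.normal(size=X.shape)                  # measurement noise
y = 3.0 * latent + 0.1 * rng.normal(size=n_samples)   # ink adsorption

# One-component PLS regression (NIPALS), on mean-centred data.
Xc, yc = X - X.mean(axis=0), y - y.mean()
w = Xc.T @ yc                       # weight vector: covariance direction
w /= np.linalg.norm(w)
t = Xc @ w                          # sample scores on that direction
b = (t @ yc) / (t @ t)              # regression coefficient on the scores

y_pred = t * b + y.mean()
ss_res = np.sum((y - y_pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"calibration r^2 = {r2:.3f}")  # high on this clean synthetic data
```

The single latent direction here plays the role that several PLS components would play on real spectra, where overlapping absorption bands require more factors.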

  2. Detector : knowledge-based systems for dairy farm management support and policy-analysis; methods and applications.

    NARCIS (Netherlands)

    Hennen, W.H.G.J.

    1995-01-01

    This thesis describes new methods and knowledge-based systems for the analysis of technical and economic accounting data from the year-end records of individual dairy farms to support the management and, after adaptation, for policy analysis. A new method for farm comparison, the farm-adjusted standa

  3. Spatial-temporal analysis on climate variation in early Qing dynasty (17th -18th century) using China's chronological records

    Science.gov (United States)

    Lin, Kuan-Hui Elaine; Wang, Pao-Kuan; Fan, I.-Chun; Liao, Yi-Chun; Liao, Hsiung-Ming; Pai, Pi-Ling

    2016-04-01

    Global climate change, in the form of extremes, variations, and short- or mid-term fluctuations, is now widely understood to challenge the survival of human beings and societies. Meanwhile, improving present and future climate modeling requires a comprehensive understanding of past climate patterns. Although historical climate modeling has made substantive progress in recent years based on new findings from dynamical meteorology, phenology, and paleobiology, less is known about the mid- to short-term variations or lower-frequency variabilities at different temporal scales and their regional expressions. Accurate historical climate modeling relies heavily on the robustness of a dataset that carries specific time, location, and meteorological information in continuous temporal and spatial chains. This study thus presents an important methodological innovation for reconstructing historical climate at multiple temporal and spatial scales through building a historical climate dataset, based on the Chinese chronicles compiled in the Compendium of Chinese Meteorological Records of the Last 3,000 Years (Zhang, 2004), which covers the period since the Zhou Dynasty (1100 BC). The dataset preserves the most delicate meteorological data, with accurate time, location, meteorological event, duration, and other phenological, social and economic impact information, and is carefully digitized, coded, and geo-referenced on Geographical Information System based maps according to Tan's (1982) historical atlas of China. The research project, begun in January 2015, is a collaborative work among scholars across the meteorology, geography, and historical linguistics disciplines. The present research findings, derived from the early 100+ years of the Qing dynasty, include the following. First, the analysis is based on a sample of n=1398 cities/counties across Mainland China in the observation period. Second, the frequencies of precipitation, cold

  5. Tree-ring analysis by pixe for a historical record of soil chemistry response to acidic air pollution

    Science.gov (United States)

    Legge, Allan H.; Kaufmann, Henry C.; Winchester, John W.

    1984-04-01

    Tree cores have been analyzed intact in 1 mm steps, corresponding to time intervals in the rings as short as half a growing season, providing a chronological record of 16 elemental concentrations extending over thirty years back to 1950. Samples were collected in a forested region of western Canada in sandy soil which was impacted by acid-forming gases released by a sulfur recovery sour natural gas plant. Tree core samples of the hybrid lodgepole-Jack pine ( Pinus contorta Loud. × Pinus banksiana Lamb.) were taken in five ecologically similar locations between 1.2 and 9.6 km from the gas plant stacks. Concentrations of some elements showed patterns suggesting that the annual rings preserved a record of changing soil chemistry in response both to natural environmental conditions and to deposition from sulfur gas emissions, commencing after plant start-up in 1959 and modified by subsequent changes in plant operating procedures. These patterns were most pronounced nearest the gas plant. Certain other elements did not exhibit these patterns, probably reflecting the greater importance of biological than of soil chemical properties. The high time resolution of tree-ring analysis, which can be achieved by PIXE, demonstrates that the rings preserve a historical record of elemental composition which may reflect changes in soil chemistry during plant growth as it may be affected by both natural ecological processes and acidic deposition from the atmosphere.

  6. Analysis of geomagnetic storm variations and count-rate of cosmic ray muons recorded at the Brazilian southern space observatory

    Energy Technology Data Exchange (ETDEWEB)

    Frigo, Everton [University of Sao Paulo, USP, Institute of Astronomy, Geophysics and Atmospheric Sciences, IAG/USP, Department of Geophysics, Sao Paulo, SP (Brazil); Savian, Jairo Francisco [Space Science Laboratory of Santa Maria, LACESM/CT, Southern Regional Space Research Center, CRS/INPE, MCT, Santa Maria, RS (Brazil); Silva, Marlos Rockenbach da; Lago, Alisson dal; Trivedi, Nalin Babulal [National Institute for Space Research, INPE/MCT, Division of Space Geophysics, DGE, Sao Jose dos Campos, SP (Brazil); Schuch, Nelson Jorge, E-mail: efrigo@iag.usp.br, E-mail: savian@lacesm.ufsm.br, E-mail: njschuch@lacesm.ufsm.br, E-mail: marlos@dge.inpe.br, E-mail: dallago@dge.inpe.br, E-mail: trivedi@dge.inpe.br [Southern Regional Space Research Center, CRS/INPE, MCT, Santa Maria, RS (Brazil)

    2007-07-01

    An analysis of geomagnetic storm variations and the count rate of cosmic ray muons recorded at the Brazilian Southern Space Observatory - OES/CRS/INPE-MCT, in Sao Martinho da Serra, RS, during the month of November 2004 is presented in this paper. The geomagnetic measurements are made by a three-component low-noise fluxgate magnetometer, and the count rates of cosmic ray muons are recorded by a muon scintillator telescope - MST, both instruments installed at the Observatory. The fluxgate magnetometer measures variations in the three orthogonal components of the Earth's magnetic field, H (North-South), D (East-West) and Z (Vertical), with a data sampling rate of 0.5 Hz. The muon scintillator telescope records hourly count rates. The arrival of a solar disturbance can be identified by observing the decrease in the muon count rate. The goal of this work is to describe the physical morphology and phenomenology observed during the geomagnetic storm of November 2004, using the H component of the geomagnetic field and the vertical channel V of the multi-directional muon detector in southern Brazil. (author)

  7. ATLAS Recordings

    CERN Multimedia

    Steven Goldfarb; Mitch McLachlan; Homer A. Neal

    Web Archives of ATLAS Plenary Sessions, Workshops, Meetings, and Tutorials from 2005 until this past month are available via the University of Michigan portal here. Most recent additions include the Trigger-Aware Analysis Tutorial by Monika Wielers on March 23 and the ROOT Workshop held at CERN on March 26-27. Viewing requires a standard web browser with the RealPlayer plug-in (included in most browsers automatically) and works on any major platform. Lectures can be viewed directly over the web or downloaded locally. In addition, you will find access to a variety of general tutorials and events via the portal. Feedback Welcome: Our group is making arrangements now to record plenary sessions, tutorials, and other important ATLAS events for 2007. Your suggestions for potential recordings, as well as your feedback on existing archives, are always welcome. Please contact us at wlap@umich.edu. Thank you. Enjoy the Lectures!

  8. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., the φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures for φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  10. Gait correlation analysis based human identification.

    Science.gov (United States)

    Chen, Jinyan

    2014-01-01

    Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance. PMID:24592144
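The pipeline this abstract describes (silhouette shifts along x, y, t → self-correlation → DFT features → normalization → PCA) can be sketched as below. The silhouette volume is random synthetic data and every dimension, shift range, and component count is an assumption for illustration, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical silhouette sequence: 16 frames of 24x32 binary masks
# (a real system would obtain these via background subtraction).
seq = rng.random((16, 24, 32)) < 0.2

def shift_correlation(vol, axis, max_shift=8):
    """Correlate the silhouette volume with itself shifted along one axis."""
    corr = []
    for s in range(1, max_shift + 1):
        shifted = np.roll(vol, s, axis=axis)
        corr.append((vol & shifted).sum() / vol.sum())
    return np.array(corr)

# Raw gait feature: correlation curves along t, y, x, concatenated.
raw = np.concatenate([shift_correlation(seq, a) for a in (0, 1, 2)])

# DFT magnitudes as features, normalised to unit length to damp noise.
feat = np.abs(np.fft.rfft(raw))
feat /= np.linalg.norm(feat)

# PCA (via SVD) over a gallery of such feature vectors to reduce dimension;
# the 10 noisy copies stand in for features from different walking sequences.
gallery = np.stack([feat + 0.01 * rng.normal(size=feat.shape) for _ in range(10)])
centered = gallery - gallery.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:3].T       # keep 3 principal components
print(reduced.shape)                # (10, 3)
```

Identification would then compare the reduced vectors, e.g. by nearest neighbour against a labelled gallery.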

  12. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    Science.gov (United States)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of the monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.

  13. Annotation methods to develop and evaluate an expert system based on natural language processing in electronic medical records.

    Science.gov (United States)

    Gicquel, Quentin; Tvardik, Nastassia; Bouvry, Côme; Kergourlay, Ivan; Bittar, André; Segond, Frédérique; Darmoni, Stefan; Metzger, Marie-Hélène

    2015-01-01

    The objective of the SYNODOS collaborative project was to develop a generic IT solution, combining a medical terminology server, a semantic analyser and a knowledge base. The goal of the project was to generate meaningful epidemiological data for various medical domains from the textual content of French medical records. In the context of this project, we built a care pathway oriented conceptual model and corresponding annotation method to develop and evaluate an expert system's knowledge base. The annotation method is based on a semi-automatic process, using a software application (MedIndex). This application exchanges with a cross-lingual multi-termino-ontology portal. The annotator selects the most appropriate medical code proposed for the medical concept in question by the multi-termino-ontology portal and temporally labels the medical concept according to the course of the medical event. This choice of conceptual model and annotation method aims to create a generic database of facts for the secondary use of electronic health records data. PMID:26262366

  14. Nut crop yield records show that budbreak-based chilling requirements may not reflect yield decline chill thresholds

    Science.gov (United States)

    Pope, Katherine S.; Dose, Volker; Da Silva, David; Brown, Patrick H.; DeJong, Theodore M.

    2015-06-01

    Warming winters due to climate change may critically affect temperate tree species. Insufficiently cold winters are thought to result in fewer viable flower buds and the subsequent development of fewer fruits or nuts, decreasing the yield of an orchard or fecundity of a species. The best existing approximation for a threshold of sufficient cold accumulation, the "chilling requirement" of a species or variety, has been quantified by manipulating or modeling the conditions that result in dormant bud breaking. However, the physiological processes that affect budbreak are not the same as those that determine yield. This study sought to test whether budbreak-based chilling thresholds can reasonably approximate the thresholds that affect yield, particularly regarding the potential impacts of climate change on temperate tree crop yields. County-wide yield records for almond ( Prunus dulcis), pistachio ( Pistacia vera), and walnut ( Juglans regia) in the Central Valley of California were compared with 50 years of weather records. Bayesian nonparametric function estimation was used to model yield potentials at varying amounts of chill accumulation. In almonds, average yields occurred when chill accumulation was close to the budbreak-based chilling requirement. However, in the other two crops, pistachios and walnuts, the best previous estimate of the budbreak-based chilling requirements was 19-32 % higher than the chilling accumulations associated with average or above average yields. This research indicates that physiological processes beyond requirements for budbreak should be considered when estimating chill accumulation thresholds of yield decline and potential impacts of climate change.

  15. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well-known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW-accessible data sets, so you may run Rweb on your own data. Rweb can provide a four-window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point-and-click modules under development for use in introductory statistics courses.

  16. Electric Equipment Diagnosis based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Stavitsky Sergey A.

    2016-01-01

    Full Text Available As electric equipment develops and grows more complex, precise and intensive diagnosis becomes necessary. Nowadays there are two basic approaches to diagnosis: analog signal processing and digital signal processing; the latter is preferable. Alongside the basic digital signal processing methods (the Fourier transform and the Fast Fourier transform), one of the modern methods is based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. This article shows the ways of using wavelet analysis and the process of converting a test signal. The analysis was carried out in the computer software Mathcad, and a 2D wavelet spectrum for a complex function was created.
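The appeal of wavelets for equipment diagnosis is that a short transient, invisible in a global Fourier spectrum, shows up as a localized spike in the wavelet detail coefficients. A minimal Haar-wavelet sketch (one decomposition level, synthetic test signal; the original work used Mathcad, not this code) illustrates the idea:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform."""
    even, odd = signal[0::2], signal[1::2]
    approx = (even + odd) / np.sqrt(2)     # low-pass: smooth trend
    detail = (even - odd) / np.sqrt(2)     # high-pass: local changes
    return approx, detail

# Test signal: a 50 Hz sine with a sharp fault-like transient at sample 300.
t = np.arange(1024) / 1024.0
x = np.sin(2 * np.pi * 50 * t)
x[300] += 5.0

approx, detail = haar_dwt(x)
# The transient appears as an isolated spike in the detail coefficients
# at half the original sample index (1 level halves the length).
print(int(np.argmax(np.abs(detail))))      # -> 150 (sample 300 / 2)
```

Repeating the transform on `approx` gives the multi-level decomposition whose coefficients form the 2D time-scale spectrum mentioned in the abstract.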

  17. Analysis of female radiation workers dose records in the DAE Units of Andhra Pradesh

    International Nuclear Information System (INIS)

    The basis for control of occupational exposures of women is the same as that for men, except for pregnant women. The percentage of women working in radiation areas of DAE has marginally increased over the last three decades. This paper analysed the data on the externally received personal dose equivalent for female radiation workers exposed to ionizing radiation in different occupations in the DAE units in Andhra Pradesh. From this study we can confidently say that it is equally safe for women to work in radiation areas as long as they follow radiation protection principles. Hence, women in India should be made aware that it is safe to work in radiation areas and that DAE takes care of them through periodical medical checkups, maintenance of dose records, etc.

  18. Arabic Interface Analysis Based on Cultural Markers

    Directory of Open Access Journals (Sweden)

    Mohammadi Akheela Khanum

    2012-01-01

    Full Text Available This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture and that the Hofstede score for Arab countries is partially supported by the website design components examined in this study. Moderate support was also found for long-term orientation, for which Hofstede has no score.

  19. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  1. Constructing storyboards based on hierarchical clustering analysis

    Science.gov (United States)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There are growing needs for quick preview of video contents for the purpose of improving the accessibility of video archives as well as reducing network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent reuse of the extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
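The keyframe-selection step can be sketched with standard agglomerative clustering: cluster the per-frame feature vectors into the user-specified number of groups and keep one representative frame per group. The feature vectors below are synthetic stand-ins (the paper derives them from wavelet coefficients), and the linkage method and cluster count are illustrative assumptions:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(3)

# Hypothetical per-frame feature vectors: 120 frames drawn around
# 4 "shot" centres, 8-dimensional features per frame.
frames = np.concatenate(
    [c + 0.1 * rng.normal(size=(30, 8)) for c in rng.normal(size=(4, 8))]
)

k = 6                                        # user-specified storyboard size
Z = linkage(frames, method='ward')           # hierarchical clustering
labels = fcluster(Z, t=k, criterion='maxclust')

# One keyframe per cluster: the frame closest to its cluster centroid.
keyframes = []
for c in np.unique(labels):
    idx = np.where(labels == c)[0]
    centroid = frames[idx].mean(axis=0)
    keyframes.append(int(idx[np.argmin(np.linalg.norm(frames[idx] - centroid, axis=1))]))
keyframes = sorted(keyframes)                # storyboard in temporal order
print(len(keyframes))
```

Because the linkage tree is built once, re-cutting it for a different storyboard size is cheap, which matches the paper's point about reusing extracted features.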

  2. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well-known block matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase correlation (PC) function) for crowd motion estimation are also presented.
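One of the baselines the paper compares against, full-search block matching, is simple to sketch: for each block of the current frame, exhaustively search a window in the previous frame for the position with minimum sum of absolute differences (SAD). The frame sizes, block size, and search radius below are illustrative assumptions on synthetic data:

```python
import numpy as np

def full_search_mv(prev, cur, block=8, radius=4):
    """Motion vector for each block of `cur` by exhaustive SAD search in `prev`."""
    h, w = cur.shape
    vecs = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = cur[by:by + block, bx:bx + block]
            best, best_v = None, (0, 0)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        sad = np.abs(prev[y:y + block, x:x + block] - ref).sum()
                        if best is None or sad < best:
                            best, best_v = sad, (dy, dx)
            vecs[(by, bx)] = best_v          # where the block came from in prev
    return vecs

rng = np.random.default_rng(4)
prev = rng.random((32, 32))
cur = np.roll(prev, shift=(2, 3), axis=(0, 1))   # frame shifted down 2, right 3
mv = full_search_mv(prev, cur)
print(mv[(8, 8)])                                # -> (-2, -3)
```

A histogram of such vectors over all blocks gives the dominant-motion polar plot mentioned in the abstract; the IRT-based method reaches a similar result from the transform's matrix of states instead of an exhaustive search.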

  3. A Web-based vital sign telemonitor and recorder for telemedicine applications.

    Science.gov (United States)

    Mendoza, Patricia; Gonzalez, Perla; Villanueva, Brenda; Haltiwanger, Emily; Nazeran, Homer

    2004-01-01

    We describe a vital sign telemonitor (VST) that acquires, records, displays, and provides readings such as electrocardiograms (ECGs), temperature (T), and oxygen saturation (SaO2) over the Internet to any site. The design of this system consisted of three parts: sensors, analog signal processing circuits, and a user-friendly graphical user interface (GUI). The first part involved selection of appropriate sensors: for ECG, disposable Ag/AgCl electrodes; for temperature, the LM35 precision temperature sensor; and for SaO2, the Nonin Oximetry Development Kit equipped with a finger clip. The second part consisted of processing the analog signals obtained from these sensors, achieved by implementing suitable amplifiers and filters for the vital signs. The final part focused on development of a GUI to display the vital signs in the LabVIEW environment. From these measurements, important values such as heart rate (HR), beat-to-beat (RR) intervals, SaO2 percentages, and T in both degrees Celsius and Fahrenheit were calculated. The GUI could be accessed through the Internet via a Web page, facilitating real-time patient telemonitoring. The final system was completed and tested on volunteers with satisfactory results.

  4. New Record of a Sea Urchin Echinometra mathaei (Echinoidea: Camarodonta: Echinometridae) from Jeju Island, Korea and Its Molecular Analysis

    Directory of Open Access Journals (Sweden)

    Taekjun Lee

    2012-07-01

    Full Text Available Echinoids were collected at depths of 5-10 m at Munseom, Jeju Island, by SCUBA diving on November 23, 2008 and September 15, 2009. Two specimens were identified as Echinometra mathaei (Blainville, 1825) based on morphological characteristics and molecular analyses of mitochondrial cytochrome c oxidase subunit I partial sequences. Echinometra mathaei collected from Korea was redescribed with photographs and compared with other species from GenBank based on molecular data. Phylogenetic analyses showed no significant differences between the base sequences of E. mathaei from Korea and those from GenBank. To date, 13 echinoids including this species have been reported from Jeju Island, and 32 echinoids have been recorded in Korea.

  5. Analysis of primary school children's abilities and strategies for reading and recording time from analogue and digital clocks

    Science.gov (United States)

    Boulton-Lewis, Gillian; Wilss, Lynn; Mutch, Sue

    1997-09-01

    Sixty-seven children in Grades 1-3 and 66 children in Grades 4-6 were tested for their ability to read and record analogue and digital times. The children in Grades 4-6 were asked to describe their strategies. A sequence of time acquisition was proposed, based on a recent theory of cognitive development and the literature. This was: hour, half hour, quarter hour, five minute, and minute times. Times after the hour would be more difficult and digital times would be learned sooner. The sequence was confirmed for Grades 1-3; irregularities occurred in Grades 4-6. Some implications are drawn for the teaching of time.

  6. Camera-Vision Based Oil Content Prediction for Oil Palm (Elaeis guineensis Jacq.) Fresh Fruits Bunch at Various Recording Distances

    Directory of Open Access Journals (Sweden)

    Dinah Cherie

    2015-01-01

    Full Text Available In this study, the correlation between oil palm fresh fruit bunch (FFB) appearance and its oil content (OC) was explored. FFB samples were recorded from various distances (2, 7, and 10 m) with different lighting spectrums and configurations (Ultraviolet: 280-380 nm, Visible: 400-700 nm, and Infrared: 720-1100 nm) and intensities (600 W and 1000 W lamps) to explore the correlations. The recorded FFB images were segmented and their color features were subsequently extracted to be used as input variables for modeling the OC of the FFB. In this study, four developed models were selected to perform oil content prediction (OCP) for intact FFBs. These models were selected based on their validity and accuracy in performing the OCP. Models were developed using Multi-Layer-Perceptron Artificial Neural Network (MLP-ANN) methods, employing 10 hidden layers and 15 image features as input variables. Statistical engineering software was used to create the models. Although the number of FFB samples in this study was limited, four models were successfully developed to predict intact FFB OC based on image color features. Three OCP models were developed for image recording from 10 m under UV, Vis2, and IR2 lighting configurations; another model was successfully developed for short-range imaging (2 m) under IR2 light. The coefficients of correlation for the models when validated were 0.816, 0.902, 0.919, and 0.886, respectively. For bias and error, these selected models obtained root-mean-square errors (RMSE) of 1.803, 0.753, 0.607, and 1.104, respectively.
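
    The two validation figures quoted, the coefficient of correlation and the RMSE, can be computed as follows (a generic sketch of the metrics, not the authors' statistical software):

```python
from math import sqrt

def rmse(predicted, observed):
    """Root-mean-square error between model predictions and observations."""
    n = len(predicted)
    return sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

def pearson_r(x, y):
    """Pearson coefficient of correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

    A model whose predictions track the measured oil content closely will have r near 1 and a small RMSE, as reported for the four selected models.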

  7. Methodological insights on information technologies and distributional analysis for the archaeological record in stratigraphic excavation contexts: the case study of Neolithic settlement of Sammardenchia

    Directory of Open Access Journals (Sweden)

    Cecilia Milantoni

    2008-06-01

    Full Text Available This paper deals with the methodology applied to the management of records from excavations, usually represented by a huge amount of finds and documents. A further aim is to experiment with distributional analysis of the archaeological record using a GIS software package. The Neolithic site of Sammardenchia serves as a case study for approaching the digitization of records collected over several seasons of fieldwork and for experimenting with distribution analysis. The results allow us to discuss the latent structure and function of specific areas without evident structural features.

  8. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Directory of Open Access Journals (Sweden)

    Sergio Miranda Freire

    Full Text Available This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single-machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when

  9. Threshold-based system for noise detection in multilead ECG recordings

    International Nuclear Information System (INIS)

    This paper presents a system for detection of the most common noise types seen on the electrocardiogram (ECG) in order to evaluate whether an episode from 12-lead ECG is reliable for diagnosis. It implements criteria for estimation of the noise corruption level in specific frequency bands, aiming to identify the main sources of ECG quality disruption, such as missing signal or limited dynamics of the QRS components above 4 Hz; presence of high amplitude and steep artifacts seen above 1 Hz; baseline drift estimated at frequencies below 1 Hz; power–line interference in a band ±2 Hz around its central frequency; high-frequency and electromyographic noises above 20 Hz. All noise tests are designed to process the ECG series in the time domain, including 13 adjustable thresholds for amplitude and slope criteria which are evaluated in adjustable time intervals, as well as number of leads. The system allows flexible extension toward application-specific requirements for the noise levels in acceptable quality ECGs. Training of different thresholds’ settings to determine different positive noise detection rates is performed with the annotated set of 1000 ECGs from the PhysioNet database created for the Computing in Cardiology Challenge 2011. Two implementations are highlighted on the receiver operating characteristic (area 0.968) to fit to different applications. The implementation with high sensitivity (Se = 98.7%, Sp = 80.9%) appears as a reliable alarm when there are any incidental problems with the ECG acquisition, while the implementation with high specificity (Sp = 97.8%, Se = 81.8%) is less susceptible to transient problems but rather validates noisy ECGs with acceptable quality during a small portion of the recording. (paper)
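
    The amplitude and slope criteria described above can be illustrated with a time-domain check of this kind; the threshold values below are placeholders for illustration, not the paper's 13 trained thresholds:

```python
def segment_is_noisy(samples_mv, fs_hz, amp_limit_mv=5.0, slope_limit_mv_s=320.0):
    """Time-domain quality check in the spirit of the paper's amplitude/slope
    criteria: flag the segment if any sample exceeds an amplitude limit or any
    sample-to-sample slope exceeds a slope limit. Both limits here are
    illustrative placeholders, not the paper's trained values."""
    # Amplitude criterion: high-amplitude artifact or saturated signal.
    if any(abs(s) > amp_limit_mv for s in samples_mv):
        return True
    # Slope criterion: first difference scaled by the sampling rate (mV/s).
    return any(abs(b - a) * fs_hz > slope_limit_mv_s
               for a, b in zip(samples_mv, samples_mv[1:]))
```

    In the full system such per-band, per-lead checks are combined, and the threshold settings are trained against annotated data to trade sensitivity against specificity.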

  10. Data base of array characteristics instrument response and data, recorded at NNC

    Energy Technology Data Exchange (ETDEWEB)

    Bushueva, E.A.; Ermolenko, E.A.; Efremova, N.A. [and others

    1996-12-01

    The northern and east-northern parts of the Republic of Kazakstan are highly favorable for the placement of seismic stations: there is a very low level of natural and industrial seismic noise, and the rocks of the Kazakh epi-Hercynian platform have very good transmissive properties. Geophysical observatories (GOs), now belonging to the Institute of Geophysical Researches of the National Nuclear Center of the Republic of Kazakstan (IGR NNC RK), were established in specially selected low-noise places of Northern Kazakstan, in accordance with the Soviet program for nuclear weapons test monitoring. In 1994, these GOs were transferred by the Russian Federation into the possession of Kazakstan. The location of the GOs is shown in Fig. 1. According to studies of seismic noise jointly implemented by scientists from IGR and IRIS, the sites of the `Borovoye` and `Kurchatov` seismic stations are among the best places for seismic observations in the world. Seismic arrays exist at the `Borovoye` and `Kurchatov` observatories - two of the four observatories (`Aktiubinsk`, `Borovoye`, `Kurchatov` and `Makanchi`). These two observatories are described in this report. The history of the geophysical observatories and the conditions of equipment operation (climatic, geological and so on) are presented, the equipment of the GOs and seismic arrays is described, and samples of digital seismograms recorded on equipment of various types are given. GO `Borovoye` is described in the 2nd chapter and GO `Kurchatov` in the 3rd chapter of the report. The main results of the work are presented in the conclusion. A list of referenced papers, tables and figures is given at the end of the report. 14 refs., 95 figs., 12 tabs.

  11. Cystic Echinococcosis Epidemiology in Spain Based on Hospitalization Records, 1997-2012

    Science.gov (United States)

    Siles-Lucas, Mar; Aparicio, Pilar; Lopez-Velez, Rogelio; Gherasim, Alin; Garate, Teresa; Benito, Agustín

    2016-01-01

    Background Cystic echinococcosis (CE) is a parasitic disease caused by the tapeworm Echinococcus granulosus. Although present throughout Europe, deficiencies in the official reporting of CE result in under-reporting and misreporting of this disease, which in turn feeds the mistaken view that CE is not an important health problem. By using an alternative data source, this study aimed at describing the clinical and temporal-spatial characteristics of CE hospitalizations in Spain between 1997 and 2012. Methodology/Principal Findings We performed a retrospective descriptive study using the Hospitalization Minimum Data Set (CMBD in Spanish). All CMBD hospital discharges with an echinococcosis diagnosis in the first diagnostic position were reviewed. Hospitalization rates were computed and clinical characteristics were described. The spatial and temporal distribution of hospital discharges was also assessed. Between 1997 and 2012, 14,010 hospitalizations with a diagnosis of CE were recorded; 55% were men and 67% were aged over 45 years. Pediatric hospitalizations occurred throughout the study period. 95.2% of patients were discharged home, and only 1.7% died in hospital. The average cost was 8,439.11 €. The hospitalization rate per 100,000 per year showed a decreasing trend during the study period. All the autonomous communities registered discharges, even those considered non-endemic. Maximum rates were reached in Extremadura, Castilla-Leon and Aragon. Comparison of the CMBD data with the official Compulsory Notifiable Diseases (CND) reports from 2005 to 2012 showed that official figures were lower than the registered hospitalization discharges. Conclusions The distribution of hospitalizations was uneven by year and autonomous region. Although CE hospitalization rates have decreased considerably due to the success of control programs, CE remains a public health problem due to its severity and economic impact. Therefore, it would be desirable to improve its oversight and
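
    The hospitalization rate per 100,000 per year used in the study is a straightforward ratio; a minimal sketch with illustrative numbers, not the paper's data:

```python
def hospitalization_rate_per_100k(discharges, population, years=1):
    """Annual hospitalization rate per 100,000 population:
    discharges divided by person-years at risk, scaled to 100,000."""
    return 100000.0 * discharges / (population * years)
```

    For example, 420 discharges in a population of 2,000,000 over one year corresponds to a rate of 21 per 100,000 per year.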

  12. Security of the distributed electronic patient record: a case-based approach to identifying policy issues.

    Science.gov (United States)

    Anderson, J G

    2000-11-01

    The growth of managed care and integrated delivery systems has created a new commodity, health information and the technology that it requires. Surveys by Deloitte and Touche indicate that over half of the hospitals in the US are in the process of implementing electronic patient record (EPR) systems. The National Research Council has established that industry spends as much as $15 billion on information technology (IT), an amount that is expanding by 20% per year. The importance of collecting, electronically storing, and using the information is undisputed. This information is needed by consumers to make informed choices; by physicians to provide appropriate quality clinical care; and by health plans to assess outcomes, control costs and monitor quality. The collection, storage and communication of a large variety of personal patient data, however, present a major dilemma. How can we provide the data required by the new forms of health care delivery and at the same time protect the personal privacy of patients? Recent debates concerning medical privacy legislation, software regulation, and telemedicine suggest that this dilemma will not be easily resolved. The problem is systemic and arises out of the routine use and flow of information throughout the health industry. Health care information is primarily transferred among authorized users. Not only is the information used for patient care and financial reimbursement, secondary users of the information include medical, nursing, and allied health education, research, social services, public health, regulation, litigation, and commercial purposes such as the development of new medical technology and marketing. The main threats to privacy and confidentiality arise from within the institutions that provide patient care as well as institutions that have access to patient data for secondary purposes.

  13. Curvelet based offline analysis of SEM images.

    Directory of Open Access Journals (Sweden)

    Syed Hamad Shirazi

    Full Text Available Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image processing based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method employs a state-of-the-art Curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by the application of a box-counting algorithm for fractal dimension (FD) calculation, with the ultimate goal of measuring parameters such as surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known Watershed segmentation algorithm.
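
    The box-counting step can be sketched as follows: count the boxes that contain foreground pixels at several box sizes and take the log-log slope as the fractal dimension (a generic implementation on a binary grid, not the authors' code):

```python
from math import log

def box_count(grid, box):
    """Number of box-by-box cells containing at least one foreground pixel."""
    n_rows, n_cols = len(grid), len(grid[0])
    count = 0
    for top in range(0, n_rows, box):
        for left in range(0, n_cols, box):
            if any(grid[r][c]
                   for r in range(top, min(top + box, n_rows))
                   for c in range(left, min(left + box, n_cols))):
                count += 1
    return count

def fractal_dimension(grid, box_sizes=(1, 2, 4, 8)):
    """Least-squares slope of log N(s) versus log(1/s) over the box sizes."""
    xs = [log(1.0 / s) for s in box_sizes]
    ys = [log(box_count(grid, s)) for s in box_sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

    A completely filled region yields a dimension of 2 and a one-pixel-wide line a dimension of 1, which is a quick sanity check for any implementation.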

  14. Glycocapture-based proteomics for secretome analysis.

    Science.gov (United States)

    Lai, Zon W; Nice, Edouard C; Schilling, Oliver

    2013-02-01

    Protein glycosylation represents the most abundant extracellular posttranslational modification in multicellular organisms. These glycoproteins unequivocally comprise the major biomolecules involved in extracellular processes, such as growth factors, signaling proteins for cellular communication, enzymes, and proteases for on- and off-site processing. It is now known that altered protein glycosylation is a hallmark event in many different pathologies. Glycoproteins are found mostly in the so-called secretome, which comprises classically and nonclassically secreted proteins and protein fragments that are released from the cell surface through ectodomain shedding. Due to biological complexity and technical difficulty, comparably few studies have taken an in-depth investigation of cellular secretomes using system-wide approaches. The cellular secretomes are considered to be a valuable source of therapeutic targets and novel biomarkers. It is not surprising that many existing biomarkers, including biomarkers for breast, ovarian, prostate, and colorectal cancers are glycoproteins. Focused analysis of secreted glycoproteins could thus provide valuable information for early disease diagnosis, and surveillance. Furthermore, since most secreted proteins are glycosylated and glycosylation predominantly targets secreted proteins, the glycan/sugar moiety itself can be used as a chemical "handle" for the targeted analysis of cellular secretomes, thereby reducing sample complexity and allowing detection of low abundance proteins in proteomic workflows. This review will focus on various glycoprotein enrichment strategies that facilitate proteomics-based technologies for the quantitative analysis of cell secretomes and cell surface proteomes.

  15. A stable, unbiased, long-term satellite based data record of sea surface temperature from ESA's Climate Change Initiative

    Science.gov (United States)

    Rayner, Nick; Good, Simon; Merchant, Chris

    2013-04-01

    The study of climate change demands long-term, stable observational records of climate variables such as sea surface temperature (SST). ESA's Climate Change Initiative was set up to unlock the potential of satellite data records for this purpose. As part of this initiative, 13 projects were established to develop the data records for different essential climate variables - aerosol, cloud, fire, greenhouse gases, glaciers, ice sheets, land cover, ocean colour, ozone, sea ice, sea level, soil moisture and SST. In this presentation we describe the development work that has taken place in the SST project and present new prototype data products that are available now for users to trial. The SST project began in 2010 and has now produced two prototype products. The first is a long-term product (covering mid-1991 - 2010 currently, but with a view to update this in the future), which prioritises length of data record and stability over other considerations. It is based on data from the Along-Track Scanning Radiometer (ATSR) and Advanced Very-High Resolution Radiometer (AVHRR) series of satellite instruments. The product aims to combine the favourable stability and bias characteristics of ATSR data with the geographical coverage achieved with the AVHRR series. Following an algorithm selection process, an optimal estimation approach to retrieving SST from the satellite measurements from both sensors was adopted. The retrievals do not depend on in situ data and so this data record represents an independent assessment of SST change. In situ data are, however, being used to validate the resulting data. The second data product demonstrates the coverage that can be achieved using the modern satellite observing system including, for example, geostationary satellite data. Six months' worth of data have been processed for this demonstration product. The prototype SST products will be released in April to users to trial in their work. The long term product will be available as

  16. The CONTENT project: a problem-oriented, episode-based electronic patient record in primary care

    Directory of Open Access Journals (Sweden)

    Gunter Laux

    2005-12-01

    The aims are strictly scientific and the underlying hypothesis is that the knowledge-gaining process can be accelerated by combining the experience of many, especially with respect to complex interactions of factors and the analysis of rare events. Aside from maintaining a morbidity registry, within the CONTENT framework various prospective and retrospective studies on particular epidemiological and health economic research topics will be conducted.

  17. Statistical analysis of low-frequency noise recorded in ASIAEX by a PANDA system

    Science.gov (United States)

    Potter, John R.; Beng, Koay T.; Pallayil, Venugopalan

    2002-11-01

    As part of the ASIAEX experiment, the Acoustic Research Laboratory in the Tropical Marine Science Institute at the National University of Singapore deployed a Pop-up Ambient Noise Data Acquisition (PANDA) system. The PANDA was recovered 18 days later with over 9 days of continuous data recorded from a single hydrophone at 2 kSa/s. The data show the various sources that were deployed as part of the experiment, but also provide interesting statistical information on low-frequency ambient noise in the region and the passage of numerous ships. This deployment was in a heavy shipping traffic area, hostile both in terms of potential snagging by fishing activity and in terms of the high levels of noise encountered, both of which are of interest for the deployment and successful use of autonomous acoustic systems in busy littoral waters. We present some statistical results from the 3+ GByte of data. [Work supported by the Defence Science and Technology Agency, Singapore and the US ONR.]

  18. Performance evaluation of a web-based system to exchange Electronic Health Records using Queueing model (M/M/1).

    Science.gov (United States)

    de la Torre, Isabel; Díaz, Francisco Javier; Antón, Míriam; Martínez, Mario; Díez, José Fernando; Boto, Daniel; López, Miguel; Hornero, Roberto; López, María Isabel

    2012-04-01

    Response time measurement of a web-based system is essential to evaluate its performance. This paper shows a comparison of the response times of a Web-based system for Ophthalmologic Electronic Health Records (EHRs), TeleOftalWeb. It makes use of different database models like Oracle 10g, dbXML 2.0, Xindice 1.2, and eXist 1.1.1. The system's modelling, which uses Tandem Queue networks, will allow us to estimate the service times of the different components of the system (CPU, network and databases). In order to calculate those times associated with the different databases, benchmarking techniques are used. The final objective of the comparison is to choose the database system resulting in the lowest response time for TeleOftalWeb and to compare the obtained results using a new benchmarking. PMID:20703642
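
    For reference, the steady-state quantities of the M/M/1 model named in the title follow directly from the arrival rate λ and service rate μ; a minimal sketch of the standard formulas (not the paper's Tandem Queue network model):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 queue metrics.

    Returns (utilization rho, mean number in system L, mean response time W),
    where rho = lambda/mu, L = rho/(1 - rho), and W = 1/(mu - lambda)
    (consistent with Little's law, L = lambda * W)."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable (rho >= 1)")
    rho = arrival_rate / service_rate
    mean_in_system = rho / (1.0 - rho)
    mean_response = 1.0 / (service_rate - arrival_rate)
    return rho, mean_in_system, mean_response
```

    For example, with 2 requests/s arriving at a server that completes 4 requests/s, utilization is 0.5 and the mean response time is 0.5 s.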

  20. Congenital anomalies in children with cerebral palsy: a population-based record linkage study

    DEFF Research Database (Denmark)

    Rankin, Judith; Cans, Christine; Garne, Ester;

    2010-01-01

    Our aim was to determine the proportion of children with cerebral palsy (CP) who have a congenital anomaly (CA) in three regions (Isère Region, French Alps; Funen County, Denmark; Northern Region, England) where population-based CP and CA registries exist, and to classify the children according to...

  1. Number of Black Children in Extreme Poverty Hits Record High. Analysis Background.

    Science.gov (United States)

    Children's Defense Fund, Washington, DC.

    To examine the experiences of black children and poverty, researchers conducted a computer analysis of data from the U.S. Census Bureau's Current Population Survey, the source of official government poverty statistics. The data are through 2001. Results indicated that nearly 1 million black children were living in extreme poverty, with after-tax…

  2. Manual vs. automated analysis of polysomnographic recordings in patients with chronic obstructive pulmonary disease

    NARCIS (Netherlands)

    Stege, G.; Vos, P.J.E.; Dekhuijzen, P.N.R.; Hilkens, P.H.; Ven, M.J.T. van de; Heijdra, Y.F.; Elshout, F.J.J. van den

    2013-01-01

    PURPOSE: The sleep quality, as assessed by polysomnography (PSG), of patients with chronic obstructive pulmonary disease (COPD) can be severely disturbed. The manual analysis of PSGs is time-consuming, and computer systems have been developed to automatically analyze PSGs. Studies on the reliability

  3. A comparative analysis of gamma and hadron families at the superhigh energies recorded in experiment Pamir

    Science.gov (United States)

    Azimov, S. A.; Mulladjanov, E. J.; Nosov, A. N.; Nuritdinov, H.; Talipov, D. A.; Halilov, D. A.; Yuldashbaev, T. S.

    1985-01-01

    A comparative analysis of hadron and gamma families which have undergone the decascading procedure is made. Results are compared with different models of interactions. In hadron families with energies ΣE_H^γ ≥ 20 TeV, as well as in gamma families with energies ΣE_γ ≥ 70 TeV, increasing azimuthal anisotropy is observed.

  4. Pattern recognition on X-ray fluorescence records from Copenhagen lake sediments using principal component analysis

    DEFF Research Database (Denmark)

    Schreiber, Norman; Garcia, Emanuel; Kroon, Aart;

    2014-01-01

    Principal Component Analysis (PCA) was performed on chemical data of two sediment cores from an urban fresh-water lake in Copenhagen, Denmark. X-ray fluorescence (XRF) core scanning provided the underlying datasets on 13 variables (Si, K, Ca, Ti, Cr, Mn, Fe, Ni, Cu, Zn, Rb, Cd, Pb). Principal Com

  5. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    Science.gov (United States)

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  6. Records Management

    Data.gov (United States)

    U.S. Environmental Protection Agency — All Federal Agencies are required to prescribe an appropriate records maintenance program so that complete records are filed or otherwise preserved, records can be...

  7. The role of home-based records in the establishment of a continuum of care for mothers, newborns, and children in Indonesia

    OpenAIRE

    Osaki, Keiko; Hattori, Tomoko; Kosen, Soewarta

    2013-01-01

    Background: The provision of appropriate care along the continuum of maternal, newborn, and child health (MNCH) service delivery is a challenge in developing countries. To improve this, in the 1990s, Indonesia introduced the maternal and child health (MCH) handbook, as an integrated form of parallel home-based records.Objective: This study aimed to identify the roles of home-based records both before and after childbirth, especially in provinces where the MCH handbook (MCHHB) was extensively ...

  8. Assessment of providers' referral decisions in Rural Burkina Faso: a retrospective analysis of medical records

    Directory of Open Access Journals (Sweden)

    Ilboudo Tegawende

    2012-03-01

    Full Text Available Abstract Background A well-functioning referral system is fundamental to primary health care delivery, so understanding the providers' referral decision-making process becomes critical. This study's aim was to assess the correctness of diagnoses and the appropriateness of the providers' referral decisions from health centers (HCs) to district hospitals (DHs) among patients with severe malaria and pneumonia. Methods A record review of twelve months of consultations was conducted covering eight randomly selected HCs to identify severe malaria (SM) cases among children under five and pneumonia cases among adults. The correctness of the diagnosis and the appropriateness of providers' referral decisions were determined using the National Clinical Guidebook as a 'gold standard'. Results Among the 457 SM cases affecting children under five, only 66 cases (14.4%) were correctly diagnosed, and of those 66 correctly diagnosed cases, 40 (60.6%) received an appropriate referral decision from their providers. Among the adult pneumonia cases, only 5.9% (79/1331) were correctly diagnosed; however, the appropriateness rate of the providers' referral decisions was 98.7% (78/79), with only one case referred that should not have been. Conclusions Adherence to the National Guidelines among the health center providers when making a diagnosis was low for both severe malaria and pneumonia cases. The appropriateness of the referral decisions was particularly poor for children with severe malaria. Health center providers need to be better trained in the diagnostic process and in disease management in order to improve the performance of the referral system in rural Burkina Faso.

  9. North Cascade Glacier Annual Mass Balance Record Analysis 1984-2013

    Science.gov (United States)

    Pelto, M. S.

    2014-12-01

    The North Cascade Glacier Climate Project (NCGCP) was founded in 1983 to monitor 10 glaciers throughout the range and identify their response to climate change. The annual observations include mass balance, terminus behavior, glacier surface area and accumulation area ratio (AAR). Annual mass balance (Ba) measurements have been continued on the 8 original glaciers that still exist; two glaciers, the Lewis Glacier and Spider Glacier, have disappeared. In 1990, Easton Glacier and Sholes Glacier were added to the annual balance program to offset the loss. One other glacier, Foss Glacier, has declined to the extent that continued measurement will likely not be possible. Here we examine the 30-year-long Ba time series from this project. All of the data have been reported to the World Glacier Monitoring Service (WGMS). This comparatively long record from glaciers in one region, conducted by the same research program using the same methods, offers useful comparative data. Degree day factors for melt of 4.3 mm w.e. °C-1 d-1 for snow and 6.6 mm w.e. °C-1 d-1 for ice have been determined from 412 days of ablation observations. The variation in the AAR for equilibrium Ba is small, ranging from 60 to 67. The mean annual balance of the glaciers from 1984-2013 is -0.45 m a-1, ranging from -0.31 to -0.57 m a-1 for individual glaciers. The correlation coefficient of Ba is above 0.80 between all glaciers, including the USGS benchmark glacier, South Cascade Glacier. This indicates that the response is to regional climate change, not local factors. The mean annual balance of -0.45 m a-1 is close to the WGMS global average for this period, -0.50 m a-1. The cumulative loss of 13.5 m w.e. and 15 m of ice thickness represents more than 20% of the volume of the glaciers.
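
    The degree-day melt model implied by the quoted factors multiplies the sum of positive daily mean temperatures by the appropriate degree-day factor; a sketch using the paper's factors (the simple snow/ice surface switch is an illustrative simplification):

```python
def degree_day_melt(daily_mean_temps_c, ddf_snow=4.3, ddf_ice=6.6, surface="snow"):
    """Melt in mm w.e. from positive degree-days.

    Uses the degree-day factors reported in the abstract
    (4.3 mm w.e. per °C per day for snow, 6.6 for ice); the single-surface
    switch is an illustrative simplification of a real melt model."""
    pdd = sum(t for t in daily_mean_temps_c if t > 0)  # positive degree-days
    ddf = ddf_snow if surface == "snow" else ddf_ice
    return ddf * pdd
```

    For example, daily means of 5, -2 and 3 °C give 8 positive degree-days, i.e. 34.4 mm w.e. of snow melt or 52.8 mm w.e. of ice melt.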

  10. Long-term invariant parameters obtained from 24-h Holter recordings: A comparison between different analysis techniques

    Science.gov (United States)

    Cerutti, Sergio; Esposti, Federico; Ferrario, Manuela; Sassi, Roberto; Signorini, Maria Gabriella

    2007-03-01

    Over the last two decades, a large number of different methods have been used to study the fractal-like behavior of heart rate variability (HRV). In this paper some of the most widely used techniques are reviewed. In particular, the focus is set on those methods which characterize the long memory behavior of time series (in particular, the periodogram, detrended fluctuation analysis (DFA), rescaled range analysis, scaled window variance, Higuchi dimension, wavelet-transform modulus maxima, and generalized structure functions). The performances of the different techniques were tested on simulated self-similar noises (fBm and fGn) for values of α, the slope of the spectral density at very small frequency, ranging from -1 to 3 with a 0.05 step. The check was performed using the scaling relationships between the various indices. DFA and the periodogram showed the smallest mean square error from the expected values in the range of interest for HRV. Building on the results obtained from these tests, the effective ability of the different methods to discriminate different populations of patients from RR series derived from Holter recordings was assessed. To this extent, the Noltisalis database was used. It consists of a set of 30 24-h Holter recordings collected from healthy subjects, patients suffering from congestive heart failure, and heart transplant patients. All the methods, with the possible exception of rescaled range analysis, were almost equivalent in distinguishing among the three groups of patients. Finally, the scaling relationships, valid for fBm and fGn, also approximately held when empirically applied to HRV series.
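    Of the methods listed, detrended fluctuation analysis is the easiest to sketch. The following is a minimal first-order DFA written for this listing (it is not the authors' implementation and runs on synthetic white noise, not the Noltisalis data); for white noise the scaling exponent should come out near 0.5.

```python
import numpy as np

def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
    """Minimal first-order DFA: return the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))               # integrated profile
    flucts = []
    for n in scales:
        n_windows = len(y) // n
        f2 = []
        for i in range(n_windows):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)      # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # Slope of log F(n) versus log n is the DFA exponent alpha.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(0)
alpha_wn = dfa_alpha(rng.standard_normal(4096))  # white noise: alpha near 0.5
```

    Long-memory series such as 1/f noise would instead yield an exponent near 1, which is the regime of interest for HRV.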

  11. Detection and removal of ventricular ectopic beats in atrial fibrillation recordings via principal component analysis.

    Science.gov (United States)

    Martínez, Arturo; Alcaraz, Raúl; Rieta, José J

    2011-01-01

    Ectopic beats are early heart beats with remarkably large amplitude that provoke serious disturbances in the analysis of electrocardiograms (ECG). These beats are very common in atrial fibrillation (AF) and are the source of important residua when the QRST is to be removed. Given that QRST cancellation is a binding step in the appropriate analysis of atrial activity (AA) in AF, a method for ventricular ectopic beat cancellation is proposed as a preliminary step to the application of any QRST removal technique. First, the method discriminates between normal and ectopic beats with an accuracy higher than 99% through QRS morphological characterization. Next, the ectopic beats most similar to the one under cancellation are clustered, and their eigenvector matrix is obtained by principal component analysis. Finally, the highest-variance eigenvector is used as the cancellation template. The reduction ectopic rate (RER) has been defined to evaluate the method's performance by using templates generated with 5, 10, 20, 40 or 80 ectopics. Optimal results were reached with the 5 most similar complexes, yielding a RER higher than 5.5. In addition, a decreasing RER trend was noticed as the number of ectopics considered for cancellation increased. In conclusion, given that ectopics presented remarkable variability in their morphology, the proposed cancellation approach is a robust ectopic remover and can notably facilitate the later application of any QRST cancellation technique to extract the AA under the best conditions. PMID:22255385
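    The template construction can be sketched in a few lines of NumPy. This is an illustrative reconstruction under our own assumptions (synthetic beats and a simple least-squares subtraction), not the authors' code: aligned ectopic beats are decomposed by PCA, and the highest-variance eigenvector serves as the cancellation template.

```python
import numpy as np

def pca_template(beats):
    """Highest-variance eigenvector of an (n_beats x n_samples) beat matrix."""
    cov = np.cov(beats, rowvar=False)        # sample covariance across beats
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, -1]                    # principal component

def cancel(beat, template):
    """Subtract the least-squares scaled template from one beat."""
    scale = np.dot(beat, template) / np.dot(template, template)
    return beat - scale * template

# Synthetic "ectopic" beats: a shared morphology with varying amplitude.
rng = np.random.default_rng(1)
shape = np.sin(np.linspace(0.0, np.pi, 50))
beats = np.array([a * shape + 0.01 * rng.standard_normal(50)
                  for a in (0.6, 0.8, 1.0, 1.2, 1.4)])
template = pca_template(beats)
residual = cancel(1.2 * shape, template)     # nearly zero after cancellation
```

    Because the template captures the common morphology, the residual after subtraction is small, which is the property the RER metric quantifies.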

  12. SEPServer catalogues of solar energetic particle events at 1 AU based on STEREO recordings: 2007-2012

    Science.gov (United States)

    Papaioannou, A.; Malandraki, O. E.; Dresing, N.; Heber, B.; Klein, K.-L.; Vainio, R.; Rodríguez-Gasén, R.; Klassen, A.; Nindos, A.; Heynderickx, D.; Mewaldt, R. A.; Gómez-Herrero, R.; Vilmer, N.; Kouloumvakos, A.; Tziotziou, K.; Tsiropoula, G.

    2014-09-01

    The Solar Terrestrial Relations Observatory (STEREO) recordings provide an unprecedented opportunity to study the evolution of solar energetic particle (SEP) events from different observation points in the heliosphere, allowing one to identify the effects of the properties of the interplanetary magnetic field (IMF) and solar wind structures on the interplanetary transport and acceleration of SEPs. Two catalogues based on STEREO recordings have been compiled as a part of the SEPServer project, a three-year collaborative effort of eleven European partners funded under the Seventh Framework Programme of the European Union (FP7/SPACE). In particular, two instruments on board STEREO have been used to identify all SEP events observed within the descending phase of solar cycle 23 and the rising phase of solar cycle 24 from 2007 to 2012, namely: the Low Energy Telescope (LET) and the Solar Electron Proton Telescope (SEPT). A scan of STEREO/LET protons within the energy range 6-10 MeV has been performed for each of the two STEREO spacecraft. We have tracked all enhancements observed above the background level of this particular channel and cross-checked them with available lists of interplanetary coronal mass ejections (ICMEs), stream interaction regions (SIRs), and shocks, as well as with the events reported in the literature. Furthermore, parallel scanning of the STEREO near-relativistic electrons has been performed in order to pinpoint the presence (or absence) of an electron event in the energy range of 55-85 keV for all of the aforementioned proton events included in our lists. We provide the onset and peak times as well as the peak values of all events for both protons and electrons, together with the relevant solar associations in terms of electromagnetic emissions, i.e., soft and hard X-rays (SXRs and HXRs). Finally, a subset of events with clear recordings at both STEREO spacecraft is presented together with the parent solar events of these multispacecraft SEP events.

  13. Watermark Resistance Analysis Based On Linear Transformation

    Directory of Open Access Journals (Sweden)

    N.Karthika Devi

    2012-06-01

    Full Text Available Generally, a digital watermark can be embedded in any copyrighted image that is at least as large as the watermark. Watermarking schemes can be classified into two categories: spatial domain approaches and transform domain approaches. Previous works have shown that the transform domain scheme is typically more robust to noise, common image processing, and compression than the spatial domain scheme. Improvements in the performance of watermarking schemes can be obtained by exploiting the characteristics of the human visual system (HVS) in the watermarking process. We propose a linear transformation based watermarking algorithm. The watermark bits are embedded into the cover image to produce the watermarked image. The efficiency of the watermark is checked using pre-defined attacks, and attack resistance analysis is done using BER (Bit Error Rate) calculation. Finally, the quality of the watermarked image is measured.

  14. Visual Similarity Based Document Layout Analysis

    Institute of Scientific and Technical Information of China (English)

    Di Wen; Xiao-Qing Ding

    2006-01-01

    In this paper, a visual similarity based document layout analysis (DLA) scheme is proposed, which, by using a clustering strategy, can adaptively deal with documents in different languages, with different layout structures and skew angles. Aiming at a robust and adaptive DLA approach, the authors first find a set of representative filters and statistics to characterize typical texture patterns in document images through a visual similarity testing process. Texture features are then extracted from these filters and passed into a dynamic clustering procedure, called visual similarity clustering. Finally, text contents are located from the clustered results. Benefiting from this scheme, the algorithm demonstrates strong robustness and adaptability across a wide variety of documents that previous traditional DLA approaches do not possess.

  15. Damage detection of a large structure based on strong motion record. Theory of adoptive forward-backward Kalman filter

    International Nuclear Information System (INIS)

    The report presents a new system identification procedure for a time-varying system to estimate the natural frequency transition of a damaged building from a strong seismic motion record, named the 'Adaptive Forward-Backward Kalman Filter (AFB-KF)'. The AFB-KF differs from the conventional Kalman filter in the following three respects: (1) a forgetting factor for covariance functions to track time-varying structural parameters rapidly; (2) a time-backward estimation scheme and a global iteration scheme of the forward processes to estimate unknown initial values of structural parameters; (3) a time-series renewal algorithm of statistical properties that reflects previous analysis information to improve the identification accuracy. It enables accurate identification of the natural frequency transition of a building during an earthquake for structural health monitoring, which evaluates structural integrity. (author)
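    Point (1), the forgetting factor, is the easiest piece to illustrate. The sketch below is our own scalar example, not the AFB-KF itself: dividing the error covariance by a factor less than one at each step discounts old data, so the filter can track a natural frequency that drops when damage occurs. The signal and noise values are invented.

```python
import numpy as np

def track(measurements, meas_var=0.01, forget=0.95):
    """Scalar Kalman filter with a forgetting factor for a drifting parameter."""
    x, p = measurements[0], 1.0          # state estimate and its covariance
    estimates = []
    for z in measurements:
        p = p / forget                   # forgetting factor discounts old data
        k = p / (p + meas_var)           # Kalman gain
        x = x + k * (z - x)              # measurement update
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Synthetic "natural frequency" observations: a step drop mimics damage.
rng = np.random.default_rng(2)
true_freq = np.concatenate([np.full(100, 2.0), np.full(100, 1.6)])
z = true_freq + 0.05 * rng.standard_normal(200)
est = track(z)
```

    With forget=1.0 the filter would average all history and lag badly after the step; with forget below one the estimate settles on the new frequency within a few dozen samples.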

  16. SINGLE-SHELL TANK INTEGRITY PROJECT ANALYSIS OF RECORD-PRELIMINARY MODELING PLAN FOR THERMAL AND OPERATING LOADS

    Energy Technology Data Exchange (ETDEWEB)

    RAST RS; RINKER MW; BAPANAALLI SK; DEIBLER JE; GUZMAN-LEONG CE; JOHNSON KI; KARRI NK; PILLI SP; SANBORN SE

    2010-10-22

    This document is a Phase I deliverable for the Single-Shell Tank Analysis of Record effort; it is not the Analysis of Record itself. The intent of this document is to guide the Phase II detailed modeling effort. Preliminary finite element models for each of the tank types were developed, and different case studies were performed on one or more of these tank types. The case studies evaluated include thermal loading, waste level variation, the sensitivity of boundary effects (soil radial extent), excavation slope or run-to-rise ratio, soil stratigraphic variation (property and layer thickness) at different farm locations, and concrete material property variation and degradation under thermal loads. The preliminary document reviews and preliminary modeling results are reported herein. In addition, this report provides recommendations for the next phase of the SST AOR project, SST detailed modeling. Efforts and results discussed in this report do not include seismic modeling, as seismic modeling is covered by a separate report; the combined results of both static and seismic models are required to complete this effort. The SST AOR project supports the US Department of Energy's (DOE) Office of River Protection (ORP) mission of obtaining a better understanding of the structural integrity of Hanford's SSTs. The 149 SSTs, with six different geometries, have experienced a range of operating histories, which would require a large number of unique analyses to fully characterize their individual structural integrity. Preliminary modeling evaluations were conducted to determine the number of analyses required for adequate bounding of each of the SST tank types in the Detailed Modeling Phase of the SST AOR Project. The preliminary modeling was conducted in conjunction with the Evaluation Criteria report, Johnson et al. (2010). Reviews of existing documents were conducted at the initial stage of preliminary modeling. These reviews guided the topics

  17. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have greatly benefited these systems through increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images that do not take advantage of its primary virtue: 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that they offer several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To

  18. Rehabilitation System based on the Use of Biomechanical Analysis and Videogames through the Kinect Sensor

    Directory of Open Access Journals (Sweden)

    John E. Muñoz-Cardona

    2013-11-01

    Full Text Available This paper presents the development of a novel system for the physical rehabilitation of patients with multiple pathologies, through dynamics with exercise videogames (exergames) and analysis of the patients' movements using purpose-built software. The system is based on the use of the Kinect sensor for both purposes: amusing the patient in therapy through specialist exergames, and providing a tool to record and analyze MoCap data captured by the Kinect sensor and processed through biomechanical analysis using Euler angles. The complete interactive system is installed in a rehabilitation center and works with different pathologies (stroke, IMOC, cranioencephalic trauma, etc.); patients interact with the platform while the specialist records data for later analysis, which is performed by software designed for this purpose. The motion graphs are shown in the sagittal, frontal and rotational planes from 20 points distributed over the body. The final system is portable, non-invasive, inexpensive, offers natural interaction with the patient, and is easily implemented for medical purposes.

  19. A Retrospective Analysis of the Burn Injury Patients Records in the Emergency Department, an Epidemiologic Study

    Directory of Open Access Journals (Sweden)

    Nilgün Aksoy

    2014-08-01

    Full Text Available Introduction: Burns can be very destructive and severely endanger the health and lives of humans. They may cause disability and even psychological trauma in individuals, and can also place an economic burden on victims' families and society. The aim of our study was to evaluate the epidemiology and outcome of burn patients referred to the emergency department. Methods: This cross-sectional study was conducted by evaluating the patient files and forensic reports of burned patients referred to the emergency department (ED) of Akdeniz hospital, Turkey, in 2008. Demographic data, the season, place, reason, anatomical sites, total body surface area, degree, treatment, and admission time were recorded. Multinomial logistic regression was used to compare frequency differences among single categorical variables. Stepwise logistic regression was applied to develop a predictive model for hospitalization. P<0.05 was defined as the significance level. Results: Two hundred thirty patients were enrolled (53.9% female). The mean patient age was 25.3 ± 22.3 years. Burns were most prevalent in the 0-6 age group, and most were due to hot liquid scalding (71.3%). The most affected parts of the body were the left and right upper extremities. Increasing severity of triage level (OR=2.2; 95% CI: 1.02-4.66; p=0.046), intentional burn (OR=4.7; 95% CI: 1.03-21.8; p=0.047), referral from other hospitals or clinics (OR=3.4; 95% CI: 1.7-6.6; p=0.001), and percentage of burn (OR=18.1; 95% CI: 5.42-62.6; p<0.001) were independent predictive factors for hospitalization. In addition, the odds of hospitalization were lower in patients older than 15 years (OR=0.7; 95% CI: 0.5-0.91; p=0.035). Conclusion: This study revealed that the most frequent burns are encountered in the age group of 0-6 years, with a burn percentage of <10%, second degree, on the upper extremities, indoors, and from hot liquid scalding. Increasing ESI severity, intentional burn, referring from

  20. Interpreting land records

    CERN Document Server

    Wilson, Donald A

    2014-01-01

    Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh

  1. Seismic Design Value Evaluation Based on Checking Records and Site Geological Conditions Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Tienfuan Kerh

    2013-01-01

    Full Text Available This study proposes an improved computational neural network model that uses three seismic parameters (i.e., local magnitude, epicentral distance, and epicenter depth) and two geological conditions (i.e., shear wave velocity and standard penetration test value) as the inputs for predicting peak ground acceleration, the key element for evaluating earthquake response. Initial comparison results show that a neural network model with three neurons in the hidden layer achieves relatively better performance based on the evaluation indices of correlation coefficient and mean square error. This study further develops a new weight-based neural network model for estimating peak ground acceleration at unchecked sites. Four locations in the 24 subdivision zones of Taiwan identified as having estimated peak ground accelerations higher than the seismic design value are investigated. Finally, this study develops a new equation for the relationship between horizontal peak ground acceleration and focal distance by the curve fitting method. This equation represents seismic characteristics in the Taiwan region more reliably and reasonably. The results of this study provide insight into this type of nonlinear problem, and the proposed method may be applicable to other areas of interest around the world.
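    The architecture described (five inputs, three hidden neurons, one output) can be sketched with scikit-learn. The data below is synthetic and stands in for the Taiwan checking-station records; the activation and other hyperparameters are our illustrative choices, not necessarily the authors'.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Five standardized inputs: local magnitude, epicentral distance, epicenter
# depth, shear wave velocity, SPT value (synthetic stand-ins).
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 5))
# Synthetic "peak ground acceleration" target with observation noise.
pga = (0.4 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * X[:, 3]
       + 0.05 * rng.standard_normal(300))

# Three neurons in the single hidden layer, as in the study.
model = MLPRegressor(hidden_layer_sizes=(3,), activation="tanh",
                     max_iter=5000, random_state=0)
model.fit(X, pga)
r2 = model.score(X, pga)   # evaluation index analogous to correlation/MSE
```

    A held-out test split and the correlation coefficient would complete the evaluation loop the abstract describes.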

  2. Discourses of aggression in forensic mental health: a critical discourse analysis of mental health nursing staff records.

    Science.gov (United States)

    Berring, Lene L; Pedersen, Liselotte; Buus, Niels

    2015-12-01

    Managing aggression in mental health hospitals is an important and challenging task for clinical nursing staff. A majority of studies focus on the perspective of clinicians, and research mainly depicts aggression by referring to patient-related factors. This qualitative study investigates how aggression is communicated in forensic mental health nursing records. The aim of the study was to gain insight into the discursive practices used by forensic mental health nursing staff when they record observed aggressive incidents. Textual accounts were extracted from the Staff Observation Aggression Scale-Revised (SOAS-R), and Fairclough's critical discourse analysis was used to identify short narrative entries depicting patients and staff in typical ways. The narratives contained descriptions of complex interactions between patients and staff that were linked to specific circumstances surrounding the patient. These antecedents, combined with the aggressive incident itself, created stereotyping representations of forensic psychiatric patients as deviant, unpredictable and dangerous. Patient and staff identities were continually (re)produced by an automatic response from the staff that was solely focused on the patient's behavior. Such a response might impede the implementation of new strategies for managing aggression.

  3. Identification and analysis of shear waves recorded by three-component OBSs in northeastern South China Sea

    Institute of Scientific and Technical Information of China (English)

    Minghui Zhao; Xuelin Qiu; Shaohong Xia; Ping Wang; Kanyuan Xia; Huilong Xu

    2008-01-01

    Structure models combining P- and S-wave velocities contain a considerable amount of information on lithology and geophysical properties, which can be used to better understand the complexity of the deep crustal structure. However, records of converted shear waves are scarce owing to the particular conditions of seismic surveying at sea and the demanding conditions under which such waves are generated. The study of shear waves has long been a weak point in research on the deep crustal structure of the South China Sea (SCS). In this paper, eleven three-component OBSs were deployed along Profile OBS-2001 in the northeastern SCS. After data processing with polarization and band-pass filtering, converted S-wave phases were identified in the radial-component records of nine OBSs. Taking OBS7 as an example, the identification and analysis of converted shear waves are presented and discussed in detail. Several phase groups, such as PwSc, PgSs, PnSc, PmS, and PwSn, were found to come from the deep crust or the Moho interface by simple theoretical model calculation and ray-tracing simulation. The results not only provide an underlying basis for studies of S-wave velocity structure and Poisson's ratio structure, but also reveal the relationship between crustal petrology and seismology, which will be important for making full use of S-wave information in the future.

  4. Rehabilitation System based on the Use of Biomechanical Analysis and Videogames through the Kinect Sensor

    OpenAIRE

    John E. Muñoz-Cardona; Oscar A. Henao-Gallo; José F. López-Herrera

    2013-01-01

    This paper presents the development of a novel system for the physical rehabilitation of patients with multiple pathologies, through dynamics with exercise videogames (exergames) and analysis of the patients' movements using developed software. This system is based on the use of the Kinect sensor for both purposes: amusing the patient in therapy through specialist exergames and providing a tool to record and analyze MoCap data taken through the Kinect sensor and processed using biomechanical analys...

  5. Voice recorder based on a microcontroller

    Institute of Scientific and Technical Information of China (English)

    王彦茹; 胡体玲

    2011-01-01

    This paper introduces a voice recorder based on the AT89S52 microcontroller and the ISD25120 voice chip. It performs real-time acquisition of the voice signal, segmented storage, and playback of selected segments. A keypad and a 1602 LCD provide the human-machine interface, allowing the user to select and control the function (record or playback), the channel (the storage or playback position), and the volume. The voice recorder can be applied in bus-stop announcement systems, bank queue announcement systems, and similar devices.

  6. When Did Carcharocles megalodon Become Extinct? A New Analysis of the Fossil Record

    OpenAIRE

    Catalina Pimiento; Clements, Christopher F.

    2014-01-01

    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding th...

  7. When did carcharocles megalodon become extinct? A new analysis of the fossil record

    OpenAIRE

    Pimiento, Catalina; Clements, Christopher F.

    2014-01-01

    Carcharocles megalodon (“Megalodon”) is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9–2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding th...

  8. Some results of analysis of inverted echo-sounder records from the Atlantic Equatorial region

    Directory of Open Access Journals (Sweden)

    Alberto dos Santos Franco

    1985-01-01

    Full Text Available The tidal analysis of data from the Equatorial region, given by inverted echo-sounders, shows considerable residuals in the frequency band of approximately 2 cycles per day. In the even harmonics of 4 and 6 cycles per day, tidal components that are statistically non-negligible are also identified. Spectral analysis of temperature series from the same area shows, on the other hand, variability in the same frequency bands, which suggests the occurrence of internal waves with energy distributed in these frequency bands in the Atlantic Equatorial area.

  9. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data is gathered to understand the problem, and strategies for solving it are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can be the bridge that visualizes the data and provides the most understandable model for all stakeholders. For the analysis of geodata-based intelligence data, a software system was developed as a working environment that combines geodata with optimized ergonomics. Interaction with the common operational picture (COP) is thereby essentially facilitated. The composition of the COP is based on geodata services, which are normalized by the international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows the users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management are organized.

  10. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
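    The kind of record checking described (species names and coordinate accuracy) can be sketched as follows. The function and field names are our own illustration, not SDMdata's actual API, and the sample records are invented.

```python
# Illustrative occurrence-record validation: name present, coordinates
# parseable and within valid latitude/longitude ranges.
def check_record(record):
    """Return a list of error labels for one occurrence record."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("missing name")
    try:
        lat = float(record["lat"])
        lon = float(record["lon"])
    except (KeyError, TypeError, ValueError):
        return errors + ["unparseable coordinates"]
    if not -90.0 <= lat <= 90.0:
        errors.append("latitude out of range")
    if not -180.0 <= lon <= 180.0:
        errors.append("longitude out of range")
    return errors

records = [
    {"name": "Panthera uncia", "lat": "34.5", "lon": "76.1"},
    {"name": "Panthera uncia", "lat": "134.5", "lon": "76.1"},
    {"name": "", "lat": "x", "lon": "0"},
]
report = [check_record(r) for r in records]
```

    Records with a non-empty error list would be flagged for correction before being passed to an SDM.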

  11. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes.

    Science.gov (United States)

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar

    2015-01-01

    This paper is a status report from a large-scale openEHR-based EPR project of the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of the critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as backup in case one is prevented from participating. Archetypes and their importance for structured data and the sharing of information have to become more visible to clinicians through sharpened information practices. PMID:25991124

  12. Electronic Medical Record-Based Predictive Model for Acute Kidney Injury in an Acute Care Hospital.

    Science.gov (United States)

    Laszczyńska, Olga; Severo, Milton; Azevedo, Ana

    2016-01-01

    Patients with acute kidney injury (AKI) are at risk of increased morbidity and mortality. The lack of a specific treatment has meant that efforts have focused on early diagnosis and timely treatment. Advanced algorithms for clinical assistance, including AKI prediction models, have the potential to provide accurate risk estimates. In this project, we aim to provide a clinical decision support system (CDSS) based on a self-learning predictive model for AKI in patients of an acute care hospital. Data on all in-patient episodes of admitted adults will be analysed using data mining techniques to build a prediction model. The subsequent machine-learning process, including two algorithms for data streams and concept drift, will refine the predictive ability of the model. Simulation studies on the model will be used to quantify the expected impact of several scenarios of change in the factors that influence AKI incidence. The proposed dynamic CDSS will apply to future in-hospital AKI surveillance in clinical practice. PMID:27577501

  13. Rational design of on-chip refractive index sensors based on lattice plasmon resonances (Presentation Recording)

    Science.gov (United States)

    Lin, Linhan; Zheng, Yuebing

    2015-08-01

    Lattice plasmon resonances (LPRs), which originate from the plasmonic-photonic coupling in gold or silver nanoparticle arrays, possess ultra-narrow linewidth by suppressing the radiative damping and provide the possibility to develop the plasmonic sensors with high figure of merit (FOM). However, the plasmonic-photonic coupling is greatly suppressed when the nanoparticles are immobilized on substrates because the diffraction orders are cut off at the nanoparticle-substrate interfaces. Here, we develop the rational design of LPR structures for the high-performance, on-chip plasmonic sensors based on both orthogonal and parallel coupling. Our finite-difference time-domain simulations in the core/shell SiO2/Au nanocylinder arrays (NCAs) reveal that new modes of localized surface plasmon resonances (LSPRs) show up when the aspect ratio of the NCAs is increased. The height-induced LSPRs couple with the superstrate diffraction orders to generate the robust LPRs in asymmetric environment. The high wavelength sensitivity and narrow linewidth in these LPRs lead to the plasmonic sensors with high FOM and high signal-to-noise ratio (SNR). Wide working wavelengths from visible to near-infrared are also achieved by tuning the parameters of the NCAs. Moreover, the wide detection range of refractive index is obtained in the parallel LPR structure. The electromagnetic field distributions in the NCAs demonstrate the height-enabled tunability of the plasmonic "hot spots" at the sub-nanoparticles resolution and the coupling between these "hot spots" with the superstrate diffraction waves, which are responsible for the high performance LPRs-based on-chip refractive index sensors.

  14. Linear and nonlinear analysis of airflow recordings to help in sleep apnoea–hypopnoea syndrome diagnosis

    International Nuclear Information System (INIS)

    This paper focuses on the analysis of the single-channel airflow (AF) signal to help in sleep apnoea–hypopnoea syndrome (SAHS) diagnosis. The respiratory rate variability (RRV) series is derived from AF by measuring the time between consecutive breaths. A set of statistical, spectral and nonlinear features is extracted from both signals. Then, the forward stepwise logistic regression (FSLR) procedure is used to perform feature selection and classification. Three logistic regression (LR) models are obtained by applying FSLR to features from AF, RRV and both signals simultaneously. The diagnostic performance of single features and LR models is assessed and compared in terms of sensitivity, specificity, accuracy and area under the receiver-operating characteristic curve (AROC). The highest accuracy (82.43%) and AROC (0.903) are reached by the LR model derived from the combination of AF and RRV features. This result suggests that AF and RRV provide complementary information to detect SAHS. (paper)
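    The RRV series described here is simply the sequence of inter-breath intervals. A minimal sketch of its derivation and of a few of the statistical features such a classifier might use, with hypothetical breath-onset times and only the Python standard library (the feature set below is illustrative, not the paper's actual feature set):

```python
import statistics

def rrv_series(breath_onsets_s):
    """Respiratory rate variability: time between consecutive breaths (s)."""
    return [b - a for a, b in zip(breath_onsets_s, breath_onsets_s[1:])]

def rrv_features(rrv):
    """A few simple statistical features over the RRV series."""
    mean = statistics.fmean(rrv)
    sd = statistics.stdev(rrv)
    return {"mean": mean, "stdev": sd, "cv": sd / mean}

# hypothetical onsets: regular ~4 s breathing, then a long apnoeic pause
onsets = [0.0, 4.0, 8.1, 12.0, 16.1, 36.0, 40.0]
intervals = rrv_series(onsets)   # ≈ [4.0, 4.1, 3.9, 4.1, 19.9, 4.0]
features = rrv_features(intervals)
```

    The ~20 s interval caused by the pause inflates the variability features (here the coefficient of variation), which is the kind of signal the RRV-based features exploit.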

  15. Analysis of counts with two latent classes, with application to risk assessment using physician visit records

    OpenAIRE

    Wang, Huijing

    2012-01-01

    Motivated by the CAYACS program at the BC Cancer Research Center, this thesis project introduces a latent class model to formulate event counts. In particular, we consider a population with two latent classes, such as an at-risk group and a not-at-risk group of cancer survivors in the CAYACS program. Likelihood-based inference procedures are proposed for estimating the model parameters with or without one class fully specified. The EM algorithm is adapted to compute the MLE; a pseudo MLE of t...

  16. AC quantum efficiency harmonic analysis of exciton annihilation in organic light emitting diodes (Presentation Recording)

    Science.gov (United States)

    Giebink, Noel C.

    2015-10-01

    Exciton annihilation processes impact both the lifetime and the efficiency roll-off of organic light-emitting diodes (OLEDs); however, it is notoriously difficult to identify the dominant mode of annihilation in operating devices (exciton-exciton vs. exciton-charge carrier) and subsequently to disentangle its magnitude from competing roll-off processes such as charge imbalance. Here, we introduce a simple analytical method to identify and extract OLED annihilation rates directly from standard light-current-voltage (LIV) measurement data. The foundation of this approach lies in a frequency-domain EQE analysis and is most easily understood by analogy to impedance spectroscopy: both the current (J) and the electroluminescence intensity (L) are measured with a lock-in amplifier at different harmonics of a sinusoidal dither superimposed on the DC device bias. In the presence of annihilation, the relationship between recombination current and light output (proportional to exciton density) becomes nonlinear, thereby mixing the different EQE harmonics in a manner that depends uniquely on the type and magnitude of annihilation. We derive simple expressions to extract different annihilation rate coefficients and apply this technique to a variety of OLEDs. For example, in devices dominated by triplet-triplet annihilation, the annihilation rate coefficient K_TT is obtained directly from the linear slope that results from plotting EQE_DC - EQE_1ω versus L_DC(2EQE_1ω - EQE_DC). We go on to show that, in certain cases, it is sufficient to calculate EQE_1ω directly from the slope of the DC light-versus-current curve [i.e. via dL_DC/dJ_DC], thus enabling this analysis to be conducted solely from common LIV measurement data.

  17. Comparative Analysis Between Tree-Ring Based Drought Records of Populus euphratica and Picea schrenkiana from the Mountains and Plains of Eastern Xinjiang

    Institute of Scientific and Technical Information of China (English)

    陈峰; 尚华明; 袁玉江

    2016-01-01

    Regional tree-ring width chronologies of Populus euphratica and Picea schrenkiana were developed from cores collected in eastern Xinjiang, northwestern China. Correlation analysis shows that the Picea schrenkiana regional chronology correlates well with the standardized precipitation-evapotranspiration index (SPEI) from the previous August to the current July (r=0.67, P<0.01, n=54), and the Populus euphratica regional chronology also carries a strong moisture signal (r=0.48, P<0.01, n=54). Based on the Picea schrenkiana chronology, a linear regression model was used to reconstruct the previous-August-to-July SPEI of eastern Xinjiang for AD 1725-2013, with the model explaining 45.3% of the variance. The reconstruction reveals wet periods in 1725-1728, 1737-1758, 1765-1804, 1829-1834, 1845-1852, 1888-1904, 1915-1923, 1932-1961, 1969-1973 and 1986-2001, and dry periods in 1729-1736, 1759-1764, 1805-1828, 1835-1844, 1853-1887, 1905-1914, 1924-1931, 1962-1968, 1974-1985 and 2002-2013. The dry-wet variations indicated by the two chronologies agree on decadal scales, although the response of Populus euphratica tends to lag that of Picea schrenkiana. Dry-wet variations in eastern Xinjiang agree strongly with those of the central and western Tianshan Mountains on interannual scales but differ markedly on decadal scales, and are closely linked to dry-wet variations in the Hexi Corridor of Gansu.
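    The reconstruction described here is an ordinary linear-regression calibration of SPEI against the ring-width index over the instrumental overlap, with the quoted 45.3% being the explained variance. A minimal sketch with hypothetical calibration values (the data below are invented for illustration, not the study's series):

```python
def calibrate(ring_width, spei):
    """Least-squares calibration spei ≈ a + b * ring_width.

    Returns (a, b, r2), where r2 is the fraction of SPEI variance
    explained by the tree-ring index over the calibration period."""
    n = len(ring_width)
    mx = sum(ring_width) / n
    my = sum(spei) / n
    sxx = sum((x - mx) ** 2 for x in ring_width)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ring_width, spei))
    b = sxy / sxx
    a = my - b * mx
    pred = [a + b * x for x in ring_width]
    ss_res = sum((y - p) ** 2 for y, p in zip(spei, pred))
    ss_tot = sum((y - my) ** 2 for y in spei)
    return a, b, 1.0 - ss_res / ss_tot

# hypothetical overlap of instrumental SPEI with the ring-width index
rwi      = [0.82, 1.10, 0.95, 1.30, 0.70, 1.05, 0.88, 1.22]
spei_obs = [-0.9, 0.4, -0.2, 1.1, -1.3, 0.3, -0.5, 0.8]
a, b, r2 = calibrate(rwi, spei_obs)
# the reconstruction then applies a + b * rwi to the pre-instrumental
# part of the chronology (here: back to AD 1725)
```

    Wide rings correspond to wet (high-SPEI) years, so the calibration slope is positive and r2 plays the role of the study's 45.3% explained variance.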

  18. Efficient inverted organic light-emitting devices by amine-based solvent treatment (Presentation Recording)

    Science.gov (United States)

    Song, Myoung Hoon; Choi, Kyoung-Jin; Jung, Eui Dae

    2015-10-01

    The efficiency of inverted polymer light-emitting diodes (iPLEDs) was remarkably enhanced by introducing a spontaneously formed ripple-shaped ZnO nanostructure (ZnO-R) and an amine-based polar solvent treatment using 2-methoxyethanol and ethanolamine (2-ME+EA) co-solvents on the ZnO-R. The ripple-shaped ZnO layer, fabricated by a solution process with an optimized annealing-temperature ramp rate, improves the extraction of waveguide modes inside the device structure, while the 2-ME+EA interlayer enhances electron injection and hole blocking and reduces exciton quenching between the polar-solvent-treated ZnO-R and the emissive layer. As a result, our optimized iPLEDs show a luminous efficiency (LE) of 61.6 cd A-1, a power efficiency (PE) of 19.4 lm W-1 and an external quantum efficiency (EQE) of 17.8%. This approach opens new possibilities not only for organic light-emitting diodes (OLEDs) but also for other organic optoelectronic devices such as organic photovoltaics, organic thin-film transistors, and electrically driven organic diode lasers.

  19. III-V GaAs based plasmonic lasers (Presentation Recording)

    Science.gov (United States)

    Lafone, Lucas; Nguyen, Ngoc; Clarke, Ed; Fry, Paul; Oulton, Rupert F.

    2015-09-01

    Plasmonics is a potential route to new and improved optical devices. Many predict that sub-wavelength optical systems will be essential in the development of future integrated circuits, offering the only viable way of simultaneously increasing speed and reducing power consumption. Realising this potential will be contingent on the ability to exploit plasmonic effects within the framework of the established semiconductor industry, and to this end we present a III-V (GaAs) based surface plasmon laser platform capable of effective laser light generation in highly focussed regions of space. Our design uses a suspended slab of GaAs with a metallic slot printed on top. Here, hybridisation between the plasmonic mode of the slot and the photonic mode of the slab leads to the formation of a mode whose confinement and loss can be adjusted through variation of the slot width alone. As in previous designs, the use of a hybrid mode provides strong confinement with relatively low losses; however, the ability to print the metal slot removes the randomness associated with device fabrication and the requirement for etching, which can deteriorate the semiconductor's properties. The deterministic fabrication process and the use of bulk GaAs for gain make the device well suited for practical implementation.

  20. Fabric-Based Wearable Dry Electrodes for Body Surface Biopotential Recording.

    Science.gov (United States)

    Yokus, Murat A; Jur, Jesse S

    2016-02-01

    A flexible and conformable dry-electrode design on nonwoven fabrics is examined as a sensing platform for biopotential measurements. Owing to the limitations of commercial wet electrodes (e.g., shelf life, skin irritation), dry electrodes are investigated as potential candidates for long-term monitoring of ECG signals. Multilayered dry electrodes are fabricated by screen printing Ag/AgCl conductive inks on flexible nonwoven fabrics. This study focuses on the skin-electrode interface, form-factor design, and on-body placement of printed dry electrodes for a wearable sensing platform. ECG signals obtained with dry and wet electrodes are comparatively studied as a function of body posture and movement. Experimental results show that the skin-electrode impedance is influenced by the printed electrode area, the skin-electrode interface material, and the applied pressure. The printed electrodes yield ECG signals comparable to those of wet electrodes, and the QRS peak amplitude of the ECG signal depends on the printed electrode area and the on-body electrode spacing. Overall, fabric-based printed dry electrodes offer an inexpensive health-monitoring platform for mobile wearable electronics applications while fulfilling user comfort and wearability.
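    The area dependence of skin-electrode impedance reported here is consistent with the common equivalent-circuit model of the interface: a series resistance in front of a parallel resistor-capacitor pair. A sketch of that model with hypothetical component values (the parameters and the simple area-scaling rule below are illustrative assumptions, not measurements from this study):

```python
import math

def electrode_impedance(freq_hz, r_series, r_parallel, c_parallel):
    """|Z| of the usual skin-electrode model: Rs in series with (Rp || C)."""
    w = 2 * math.pi * freq_hz
    z_par = r_parallel / (1 + 1j * w * r_parallel * c_parallel)
    return abs(r_series + z_par)

# hypothetical dry-electrode parameters at an ECG-relevant frequency;
# doubling the printed area roughly halves both resistances and
# doubles the interface capacitance
area_scale = 2.0
z_small = electrode_impedance(10.0, 1e3, 1e6, 20e-9)
z_large = electrode_impedance(10.0, 1e3 / area_scale,
                              1e6 / area_scale, 20e-9 * area_scale)
```

    Under this idealized scaling every element of the model scales inversely with area, so the larger printed electrode shows proportionally lower impedance, matching the trend the abstract reports.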