Sample records for based record analysis

  1. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach. (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro


    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy under leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool.
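The pipeline described above (PCA-derived dietary patterns feeding a regression model evaluated by leave-one-out cross-validation) can be sketched as follows. This is a hedged illustration on synthetic data; the real AutoMealRecord data, menu items, and coefficients are not public, so all numbers here are invented.

```python
# Sketch of the abstract's analysis pipeline on synthetic data: PCA-derived
# "dietary pattern" scores used as predictors of BMI, with leave-one-out CV.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
meals = rng.poisson(3.0, size=(120, 20)).astype(float)  # 120 employees x 20 menu items
bmi = 22 + 0.3 * meals[:, 0] - 0.2 * meals[:, 1] + rng.normal(0, 1, 120)

# Extract five "dietary pattern" scores, analogous to the paper's PCA step.
patterns = PCA(n_components=5).fit_transform(meals)

# Leave-one-out prediction of BMI from the pattern scores.
pred = cross_val_predict(LinearRegression(), patterns, bmi, cv=LeaveOneOut())

# Classify "would-be obese" (BMI >= 23) and report accuracy, as in the paper.
acc = np.mean((pred >= 23) == (bmi >= 23))
print(f"LOO classification accuracy: {acc:.3f}")
```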

  2. Choosing the Optimal Trigger Point for Analysis of Movements after Stroke Based on Magnetoencephalographic Recordings

    Directory of Open Access Journals (Sweden)

    Guido Waldmann


    The aim of this study was to select the optimal procedure for analysing motor fields (MF) and motor evoked fields (MEF) measured from brain-injured patients. Behavioural pretests with patients have shown that most of them cannot stand measurements longer than 30 minutes and that they prefer to move the hand with rather short breaks between movements. Therefore, we were unable to measure the motor field (MF) optimally. Furthermore, we planned to use the MEF to monitor cortical plasticity in a motor rehabilitation procedure. Classically, MF analysis refers to rather long epochs around the movement onset (M-onset). We shortened the analysis epoch to a range from 1000 milliseconds before until 500 milliseconds after M-onset to meet the needs of the patients. Additionally, we recorded the muscular activity (EMG) with surface electrodes on the extensor carpi ulnaris and flexor carpi ulnaris muscles. Magnetoencephalographic (MEG) data were recorded from 9 healthy subjects, who executed brisk horizontal extension and flexion of the right wrist. Significantly higher MF dipole strength was found in data averaged on EMG-onset than in data averaged on M-onset. There was no difference in MEF I dipole strength between the two trigger latencies. In conclusion, we recommend averaging with respect to the EMG-onset for the analysis of both the MF and MEF components.
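The trigger-based averaging being compared here (epochs cut around either movement onset or EMG onset, then averaged) can be sketched as below. Signal, sampling rate, and trigger times are synthetic stand-ins, not the study's data.

```python
# Minimal sketch of trigger-based epoch averaging: epochs are cut from a
# continuous channel around trigger samples and averaged across trials.
import numpy as np

fs = 1000                                  # sampling rate, Hz
signal = np.random.default_rng(1).normal(size=60_000)
m_onsets = np.arange(2000, 58_000, 2000)   # hypothetical movement onsets
emg_onsets = m_onsets - 50                 # EMG typically precedes M-onset

def triggered_average(x, triggers, fs, pre=1.0, post=0.5):
    """Average epochs from `pre` s before to `post` s after each trigger."""
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = [x[t - n_pre:t + n_post] for t in triggers
              if t - n_pre >= 0 and t + n_post <= len(x)]
    return np.mean(epochs, axis=0)

# The study's shortened analysis window: 1000 ms before to 500 ms after onset.
avg_m = triggered_average(signal, m_onsets, fs)
avg_emg = triggered_average(signal, emg_onsets, fs)
```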

  3. Object-oriented analysis and design: a methodology for modeling the computer-based patient record. (United States)

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L


    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  4. Heart rhythm analysis using ECG recorded with a novel sternum based patch technology

    DEFF Research Database (Denmark)

    Saadi, Dorthe B.; Fauerskov, Inge; Osmanagic, Armin


    … reliable long-term ECG recordings. The device is designed for high compliance and low patient burden. This novel patch technology is CE approved for ambulatory ECG recording of two ECG channels on the sternum. This paper describes a clinical pilot study regarding the usefulness of these ECG signals for heart rhythm analysis. A clinical technician with experience in ECG interpretation selected 200 noise-free 7-second ECG segments from 25 different patients. These 200 ECG segments were evaluated by two medical doctors according to their usefulness for heart rhythm analysis. The first doctor considered 98.5% of the segments useful for rhythm analysis, whereas the second doctor considered 99.5% of the segments useful for rhythm analysis. The conclusion of this pilot study indicates that two-channel ECG recorded on the sternum is useful for rhythm analysis and could be used as input to diagnosis …

  5. Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis

    DEFF Research Database (Denmark)

    Rasmussen, Sune Olander; Andersen, Katrine K.; Johnsen, Sigfus Johann;


    Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas, there ...
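One common way to reverse a known smoothing of a depth record, in the spirit of the deconvolution this abstract describes, is Wiener deconvolution. The sketch below is an assumption-laden illustration: the CFA system response is modeled as a Gaussian kernel, and both the kernel width and the noise level are invented, not values from the paper.

```python
# Hedged sketch of deconvolution-based resolution enhancement: a measured
# record is modeled as the true record convolved with a smoothing kernel,
# and a Wiener filter in the frequency domain partially reverses it.
import numpy as np

def wiener_deconvolve(y, kernel, noise_power=1e-2):
    """Wiener deconvolution of signal y by a known smoothing kernel."""
    n = len(y)
    H = np.fft.rfft(kernel, n)
    Y = np.fft.rfft(y)
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)   # Wiener filter
    return np.fft.irfft(G * Y, n)

# True record: two narrow chemical peaks; measurement: a smoothed version.
x = np.zeros(256); x[100] = 1.0; x[115] = 1.0
t = np.arange(-16, 17)
kernel = np.exp(-t**2 / 18); kernel /= kernel.sum()   # illustrative response
measured = np.convolve(x, kernel, mode="same")
restored = wiener_deconvolve(measured, kernel)        # sharper peaks
```

The `noise_power` term keeps the filter from amplifying frequencies where the system response is weak, which is the essential trade-off when deepening ice thins the annual layers toward the system's resolution limit.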

  6. Towards successful coordination of electronic health record based referrals: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Paul Lindsey A


    Background: Successful subspecialty referrals require considerable coordination and interactive communication among the primary care provider (PCP), the subspecialist, and the patient, which may be challenging in the outpatient setting. Even when referrals are facilitated by electronic health records (EHRs) (i.e., e-referrals), lapses in patient follow-up might occur. Although compelling reasons exist why referral coordination should be improved, little is known about which elements of the complex referral coordination process should be targeted for improvement. Using Okhuysen & Bechky's coordination framework, this paper aims to understand the barriers, facilitators, and suggestions for improving communication and coordination of EHR-based referrals in an integrated healthcare system. Methods: We conducted a qualitative study to understand coordination breakdowns related to e-referrals in an integrated healthcare system and examined work-system factors that affect the timely receipt of subspecialty care. We conducted interviews with seven subject matter experts and six focus groups with a total of 30 PCPs and subspecialists at two tertiary care Department of Veterans Affairs (VA) medical centers. Using techniques from grounded theory and content analysis, we identified organizational themes that affected the referral process. Results: Four themes emerged: lack of an institutional referral policy, lack of standardization in certain referral procedures, ambiguity in roles and responsibilities, and inadequate resources to adapt and respond to referral requests effectively. Marked differences in PCPs' and subspecialists' communication styles and individual mental models of the referral processes likely precluded the development of a shared mental model to facilitate coordination and successful referral completion. Notably, very few barriers related to the EHR were reported. Conclusions: Despite facilitating information transfer between PCPs and

  7. A Quantitative Analysis of an EEG Epileptic Record Based on Multiresolution Wavelet Coefficients

    Directory of Open Access Journals (Sweden)

    Mariel Rosenblatt


    The characterization of the dynamics associated with electroencephalogram (EEG) signals, combining an orthogonal discrete wavelet transform analysis with quantifiers originating from information theory, is reviewed. In addition, an extension of this methodology based on multiresolution quantities, called wavelet leaders, is presented. In particular, the temporal evolution of the Shannon entropy and the statistical complexity evaluated with different sets of multiresolution wavelet coefficients is considered. Both methodologies are applied to the quantitative EEG time series analysis of a tonic-clonic epileptic seizure, and comparative results are presented. In particular, even though both methods describe the dynamical changes of the EEG time series, the one based on wavelet leaders presents a better time resolution.
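The core quantity behind this kind of analysis, the Shannon entropy of relative wavelet energies computed per time window, can be sketched compactly. The sketch below uses a plain Haar DWT to stay dependency-free; the paper's method applies to orthogonal wavelets generally, and the signals here are synthetic.

```python
# Sketch of wavelet entropy: relative energies per decomposition level,
# then normalized Shannon entropy. Low entropy = ordered (tone-like)
# dynamics; high entropy = disordered (noise-like) dynamics.
import numpy as np

def haar_dwt_energies(x, levels):
    """Energy of each Haar detail level plus the final approximation."""
    energies = []
    a = np.asarray(x, float)
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation
        energies.append(np.sum(d ** 2))
    energies.append(np.sum(a ** 2))
    return np.array(energies)

def wavelet_entropy(x, levels=4):
    """Normalized Shannon entropy of relative wavelet energies (0..1)."""
    e = haar_dwt_energies(x, levels)
    p = e / e.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(levels + 1))

t = np.arange(1024) / 256.0
tone = np.sin(2 * np.pi * 10 * t)                   # ordered signal
noise = np.random.default_rng(2).normal(size=1024)  # disordered signal
```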

  8. A Computer Based Decision Support System for Tailoring Logistics Support Analysis Record (LSAR) Requirements (United States)


    L-7190, Preliminary Maintenance Allocation Chart: The Preliminary Maintenance Allocation Chart (PMAC) is a list of all items, down to the lowest level...operations, and remarks required to explain the maintenance operations. The PMAC includes additional data (over and above the required MAC data) and may be...used to develop the MAC for the organizational technical manual. Use LSAR input data records C, D1, H, and H1 to arrange the data in PMAC format and

  9. Magnetoencephalography recording and analysis

    Directory of Open Access Journals (Sweden)

    Jayabal Velmurugan


    Magnetoencephalography (MEG) non-invasively measures the magnetic field generated by the excitatory postsynaptic electrical activity of the apical dendritic pyramidal cells. Such a tiny magnetic field is measured with biomagnetometer sensors coupled to a Superconducting Quantum Interference Device (SQUID) inside a magnetically shielded room (MSR). The subjects are usually screened for the presence of ferromagnetic materials, and then the head position indicator coils, electroencephalography (EEG) electrodes (if measured simultaneously), and fiducials are digitized using a 3D digitizer, which aids in movement correction and also in transferring the MEG data from head coordinates to device and voxel coordinates, thereby enabling more accurate co-registration and localization. MEG data pre-processing involves filtering the data for environmental and subject interferences, artefact identification, and rejection. Magnetic resonance imaging (MRI) is processed for correction and for identifying fiducials. After choosing and computing the appropriate head model (spherical or realistic; boundary/finite element model), the interictal/ictal epileptiform discharges are selected and modeled by an appropriate source modeling technique (clinically, the most commonly used is the single equivalent current dipole, ECD, model). The equivalent current dipole (ECD) source localization of the modeled interictal epileptiform discharge (IED) is considered physiologically valid or acceptable based on waveform morphology, isofield pattern, and dipole parameters (localization, dipole moment, confidence volume, goodness of fit). Thus, MEG source localization can aid clinicians in sublobar localization, lateralization, and grid placement by identifying the irritative/seizure onset zone. It also accurately localizes eloquent cortex, such as the visual and language areas. MEG also aids in diagnosing and delineating multiple novel findings in other neuropsychiatric

  10. Validation of PC-based sound card with Biopac for digitalization of ECG recording in short-term HRV analysis

    Directory of Open Access Journals (Sweden)

    K Maheshkumar


    Background: Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation of heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to compare (validate) the HRV from a temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) against the Biopac MP36 module. Materials and Methods: Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, participated in the present study. Following the standard protocol, a 5-min ECG was recorded after 10 min of supine rest by the portable simple analog amplifier with PC-based sound card as well as by the Biopac module, simultaneously, with surface electrodes in the lead II position. All the ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in the Kubios software. Short-term HRV indexes in both the time and frequency domains were used. Results: The unpaired Student's t-test and Pearson correlation coefficient test were used for the analysis, using the R statistical software. No statistically significant differences were observed when comparing the values analyzed by means of the two devices for HRV. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the values in the time and frequency domains obtained by the two devices. Conclusion: On the basis of the results of the present study, we suggest that the calculation of HRV values in the time and frequency domains from RR series obtained with the PC-based sound card is probably as reliable as those obtained by the gold-standard Biopac MP36.
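The device-agreement check described here reduces to computing standard short-term HRV indexes per subject from each device's RR series and correlating them. A minimal sketch, with synthetic RR intervals standing in for the sound-card and Biopac recordings:

```python
# Sketch of the validation idea: compute SDNN (and RMSSD) from two nearly
# identical RR-interval series per subject, then correlate across subjects.
import numpy as np

def sdnn(rr):  return float(np.std(rr, ddof=1))              # ms
def rmssd(rr): return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

rng = np.random.default_rng(3)
device_a, device_b = [], []
for _ in range(20):                          # 20 synthetic "subjects"
    rr = 800 + rng.normal(0, 40, 300)        # reference device RR series, ms
    rr_alt = rr + rng.normal(0, 1, 300)      # second device: tiny deviations
    device_a.append(sdnn(rr))
    device_b.append(sdnn(rr_alt))

r = np.corrcoef(device_a, device_b)[0, 1]    # Pearson r between devices
print(f"Pearson r between devices: {r:.4f}")
```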

  11. Template-based data entry for general description in medical records and data transfer to data warehouse for analysis. (United States)

    Matsumura, Yasushi; Kuwata, Shigeki; Yamamoto, Yuichiro; Izumi, Kazunori; Okada, Yasushi; Hazumi, Michihiro; Yoshimoto, Sachiko; Mineno, Takahiro; Nagahama, Munetoshi; Fujii, Ayumi; Takeda, Hiroshi


    General descriptions in medical records are so diverse that they are usually entered as free text into an electronic medical record, and the resulting data analysis is often difficult. We developed and implemented a template-based data entry module and a data analyzing system for general descriptions. We developed a template with a tree structure, in which the content master and the entered patient data are simultaneously expressed in XML. The entered structured data are converted to narrative form for easy reading. This module was implemented in the EMR system and is used in 35 hospitals as of October 2006. So far, 3725 templates (3242 concepts) have been produced. The data in XML and the narrative text data are stored in the EMR database. The XML data are retrieved, and then the patient data are extracted to be stored in the data warehouse (DWH). We developed a search assisting system that enables users to find objective data in the DWH without requiring complicated SQL. By using this method, general descriptions in medical records can be structured and made available for clinical research.
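The dual use described above (one XML document rendered as narrative for reading, and queried as structure for research) can be illustrated in a few lines. The template name and fields below are invented for illustration, not the paper's schema.

```python
# Toy illustration of template-based entry: the same XML record is rendered
# to narrative text for the chart and queried structurally for the DWH.
import xml.etree.ElementTree as ET

entry = ET.fromstring("""
<template name="chest-pain">
  <item label="Onset" value="2 hours ago"/>
  <item label="Location" value="retrosternal"/>
  <item label="Severity" value="7/10"/>
</template>
""")

# Narrative rendering for easy reading in the chart.
narrative = "; ".join(f'{i.get("label")}: {i.get("value")}'
                      for i in entry.findall("item"))
# Structured retrieval, as a data warehouse query would do.
severity = entry.find('item[@label="Severity"]').get("value")
print(narrative)
```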

  12. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis (United States)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan


    Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction and an unsupervised clustering method, is described here. Wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure, instead of the Euclidean distance; the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels, and electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery, are used to evaluate the performance of GSLC. Feature extraction results from the use of WT with the KS test indicate a reduced number of feature coefficients, as well as good noise rejection, despite similar spike waveforms. Accordingly, the use of GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results in the electrophysiological data indicate that the quality of spike sorting is adequate with the use of GSLC.
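The KS-based coefficient selection mentioned above can be sketched as follows: features whose distribution deviates most from a Gaussian are the ones most likely to separate spike classes (a multimodal coefficient implies multiple units). This is a simplified stand-in using `scipy.stats.kstest` on synthetic features, not the paper's exact procedure.

```python
# Hedged sketch of KS-test feature selection: rank feature columns by how
# strongly their standardized distribution deviates from a fitted normal.
import numpy as np
from scipy import stats

def ks_select(features, k):
    """Return indices of the k most non-Gaussian feature columns."""
    scores = []
    for col in features.T:
        z = (col - col.mean()) / col.std()      # standardize first
        scores.append(stats.kstest(z, "norm").statistic)
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(4)
gaussian_feats = rng.normal(size=(500, 9))       # uninformative coefficients
bimodal = np.concatenate([rng.normal(-3, 1, 250), rng.normal(3, 1, 250)])
features = np.column_stack([gaussian_feats, bimodal])  # column 9 is bimodal

selected = ks_select(features, 3)   # the bimodal column should rank first
```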

  13. Hypnotic assessment based on the recurrence quantification analysis of EEG recorded in the ordinary state of consciousness. (United States)

    Madeo, Dario; Castellani, Eleonora; Santarcangelo, Enrica L; Mocenni, Chiara


    The cerebral cortical correlates of the susceptibility to hypnosis in the ordinary state of consciousness have not been clarified. The aim of the study was to characterize the EEG dynamics of subjects with high (highs) and low (lows) hypnotisability through the non-linear method of Recurrence Quantification Analysis (RQA). The EEG of 16 males (8 highs and 8 lows) was monitored for 1 min without instructions other than keeping the eyes closed, being silent, and avoiding movements (short resting), and during 15 min of simple relaxation, that is, with the instruction to relax at their best. Highs and lows were compared on the RQA measures of Determinism (DET) and Entropy (ENT), which are related to the signal's determinism and complexity. In the short resting condition, discriminant analysis could classify highs and lows on the basis of DET and ENT values at temporo-parietal sites. Many differences in DET and all differences in ENT disappeared during simple relaxation, although DET still separated the two groups in the first 6 min of relaxation at temporo-parietal sites. Our RQA-based approach allows the development of computer-based methods of hypnotic assessment using short-lasting, single-channel EEG recordings analyzed through standard mathematical methods.
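The two RQA measures used in this study can be computed from a recurrence matrix built with time-delay embedding: DET is the fraction of recurrence points lying on diagonal lines, and ENT is the Shannon entropy of the diagonal line-length distribution. The sketch below uses illustrative embedding and threshold parameters, not the paper's settings.

```python
# Minimal RQA sketch: build a recurrence matrix from an embedded scalar
# series, then read Determinism (DET) and diagonal-line Entropy (ENT)
# off the distribution of diagonal line lengths.
import numpy as np

def rqa_det_ent(x, dim=3, delay=1, radius=0.2, lmin=2):
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None] - emb[None, :], axis=2)
    rec = dist < radius * dist.max()
    lengths = []                      # diagonal line lengths (upper triangle)
    for k in range(1, n):
        run = 0
        for v in list(np.diagonal(rec, k)) + [False]:
            if v: run += 1
            elif run: lengths.append(run); run = 0
    lengths = np.array(lengths)
    points_in_lines = lengths[lengths >= lmin].sum()
    det = points_in_lines / max(rec.sum() - n, 1) * 2   # both triangles
    hist = np.bincount(lengths[lengths >= lmin])
    p = hist[hist > 0] / hist.sum()
    ent = float(-np.sum(p * np.log(p)))
    return float(det), ent

t = np.arange(400) * 0.2
det_sine, _ = rqa_det_ent(np.sin(t))                              # ordered
det_noise, _ = rqa_det_ent(np.random.default_rng(5).normal(size=400))
```

A deterministic signal yields long diagonal structures (DET near 1); noise yields mostly isolated recurrence points (lower DET), which is what makes DET usable for classifying EEG dynamics.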

  14. Recording-based identification of site liquefaction

    Institute of Scientific and Technical Information of China (English)

    Hu Yuxian; Zhang Yushan; Liang Jianwen; Ray Ruichong Zhang


    Reconnaissance reports and pertinent research on seismic hazards show that liquefaction is one of the key sources of damage to geotechnical and structural engineering systems. Therefore, identifying site liquefaction conditions plays an important role in seismic hazard mitigation. One of the widely used approaches for detecting liquefaction is based on the time-frequency analysis of ground motion recordings, in which short-time Fourier transform is typically used. It is known that recordings at a site with liquefaction are the result of nonlinear responses of seismic waves propagating in the liquefied layers underneath the site. Moreover, Fourier transform is not effective in characterizing such dynamic features as time-dependent frequency of the recordings rooted in nonlinear responses. Therefore, the aforementioned approach may not be intrinsically effective in detecting liquefaction. An alternative to the Fourier-based approach is presented in this study, which proposes time-frequency analysis of earthquake ground motion recordings with the aid of the Hilbert-Huang transform (HHT), and offers justification for the HHT in addressing the liquefaction features shown in the recordings. The paper then defines the predominant instantaneous frequency (PIF) and introduces the PIF-related motion features to identify liquefaction conditions at a given site. Analysis of 29 recorded data sets at different site conditions shows that the proposed approach is effective in detecting site liquefaction in comparison with other methods.
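The instantaneous-frequency idea behind the PIF can be illustrated with the Hilbert transform: the phase derivative of the analytic signal is a time-dependent frequency, which Fourier analysis cannot provide. Note this sketch skips the empirical mode decomposition step of the full HHT and applies the Hilbert transform directly to a synthetic mono-component chirp whose frequency drops over time, mimicking soil softening.

```python
# Sketch of instantaneous frequency via the analytic signal: a frequency
# sliding from 5 Hz down to 1 Hz stands in for a softening/liquefied site.
import numpy as np
from scipy.signal import hilbert

fs = 100.0
t = np.arange(0, 20, 1 / fs)
f_true = 5 - 0.2 * t                             # Hz, decreasing with time
x = np.sin(2 * np.pi * np.cumsum(f_true) / fs)   # chirp with that frequency

phase = np.unwrap(np.angle(hilbert(x)))
f_inst = np.diff(phase) * fs / (2 * np.pi)       # instantaneous frequency

early = f_inst[:300].mean()    # first 3 s
late = f_inst[-300:].mean()    # last 3 s
print(f"early {early:.2f} Hz -> late {late:.2f} Hz")
```

A sustained drop of the predominant instantaneous frequency over the strong-motion window is the kind of feature the PIF-based criteria look for.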

  16. Electrical source imaging and connectivity analysis to localize the seizure-onset zone based on high-density ictal scalp EEG recordings


    Staljanssens, Willeke; Strobbe, Gregor; Van Holen, Roel; Birot, Gwenael; Michel, Christophe; Seeck, Margitta; Vulliémoz, Serge; van Mierlo, Pieter


    Functional connectivity analysis of ictal intracranial EEG (icEEG) recordings can help with seizure-onset zone (SOZ) localization in patients with focal epilepsy [1]. However, it would be of high clinical value to be able to localize the SOZ based on non-invasive ictal EEG recordings, to better target or avoid icEEG and improve surgical outcome. In this work, we propose an approach to localize the SOZ based on non-invasive ictal high-density EEG (hd-EEG) recordings. We considered retrospectiv...

  17. The Use of Continuous Wavelet Transform Based on the Fast Fourier Transform in the Analysis of Multi-channel Electrogastrography Recordings. (United States)

    Komorowski, Dariusz; Pietraszek, Stanislaw


    This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as the dominant frequency (DF), dominant power (DP), and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, whose width corresponds to the time scale of observation. Wavelet analysis allows long time windows where more precise low-frequency information is needed, and shorter windows where high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using the properties of circular convolution to improve the speed of calculation. This method allows results to be obtained for relatively long EGG records in a fairly short time, much faster than using classical methods based on running spectrum analysis (RSA). In this study, the authors indicate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown in an example analysis of four-channel EGG recordings performed for a non-caloric meal.
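The CWTFT trick described above replaces one time-domain convolution per scale with a single FFT of the signal and one spectral multiplication per scale. A hedged sketch with a Morlet wavelet defined directly in the frequency domain (the wavelet choice and EGG-like test signal are illustrative assumptions):

```python
# CWT via FFT: circular convolution of the signal with a scaled wavelet is
# computed as one multiplication in the Fourier domain per scale.
import numpy as np

def cwt_fft(x, scales, w0=6.0):
    n = len(x)
    X = np.fft.fft(x)
    omega = 2 * np.pi * np.fft.fftfreq(n)       # rad/sample
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # analytic Morlet wavelet, defined directly in the frequency domain
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2)
        psi_hat *= np.sqrt(s) * (omega > 0)
        out[i] = np.fft.ifft(X * np.conj(psi_hat))   # circular convolution
    return out

fs = 4.0                                  # Hz, an EGG-like sampling rate
t = np.arange(1024) / fs
x = np.sin(2 * np.pi * 0.05 * t)          # 3 cycles/min, i.e. normogastric
scales = np.arange(10, 160, 2, dtype=float)
coefs = cwt_fft(x, scales)
best = scales[np.argmax(np.abs(coefs).mean(axis=1))]   # scale of peak power
```

For the Morlet wavelet the strongest response lands near scale `s = w0 * fs / (2 * pi * f)`, so the 0.05 Hz component above peaks around scale 76; parameters like DF follow from locating such peaks over time.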

  18. Research on Generation Technology of an Equipment Maintenance Handbook Based on the Supportability Analysis Record (SAR)

    Institute of Scientific and Technical Information of China (English)

    黄文江; 胡起伟; 朱宁; 王业


    The Supportability Analysis Record (SAR) is the basic data source for writing maintenance handbooks. To address the limitations of traditional paper maintenance handbooks and the low utilization of the SAR during handbook authoring, this paper proposes generating the maintenance handbook directly from an underlying data environment built on the SAR. A technical framework for SAR-based equipment maintenance handbook generation is constructed, and an XML-based representation of the SAR is developed. Because the framework makes full use of SAR data to generate the handbook, the reusability of analysis data is improved, repeated analysis is avoided, the workload is reduced, and the efficiency of handbook generation is increased.

  19. Portable EGG recording system based on a digital voice recorder. (United States)

    Jang, J-K; Shieh, M-J; Kuo, T-S; Jaw, F-S


    Cutaneous electrogastrogram (EGG) recording offers the benefit of non-invasive gastrointestinal diagnosis. With long-term ambulatory recording of signals, researchers and clinicians could have more opportunities to investigate and analyse paroxysmal or acute symptoms. A portable EGG system based on a digital voice recorder (DVR) is designed for long-term recording of cutaneous EGG signals. The system consists of electrodes, an EGG amplifier, a modulator, and a DVR. Online monitoring and off-line acquisition of EGG are handled by software. A special design employing an integrated timer circuit is used to modulate the EGG frequency to meet the input requirements of the DVR. This approach involves low supply voltage and low power consumption. Software demodulation is used to simplify the complexity of the system, and is helpful in reducing the size of the portable device. By using surface-mount devices (SMD) and a low-power design, the system is robust, compact, and suitable for long-term portable recording. As a result, researchers can record an ambulatory EGG signal by means of the proposed circuits in conjunction with an up-to-date voice-recording device.

  20. ICOHR: intelligent computer based oral health record. (United States)

    Peterson, L C; Cobb, D S; Reynolds, D C


    The majority of work on computer use in the dental field has focused on non-clinical practice management information needs. Very few computer-based dental information systems provide management support of the clinical care process, particularly with respect to quality management. Traditional quality assurance methods rely on the paper record and provide only retrospective analysis. Today, proactive quality management initiatives are on the rise. Computer-based dental information systems are being integrated into the care environment, actively providing decision support as patient care is being delivered. These new systems emphasize assessment and improvement of patient care at the time of treatment, thus building internal quality management into the caregiving process. The integration of real time quality management and patient care will be expedited by the introduction of an information system architecture that emulates the gathering and storage of clinical care data currently provided by the paper record. As a proposed solution to the problems associated with existing dental record systems, the computer-based patient record has emerged as a possible alternative to the paper dental record. The Institute of Medicine (IOM) recently conducted a study on improving the efficiency and accuracy of patient record keeping. As a result of this study, the IOM advocates the development and implementation of computer-based patient records as the standard for all patient care records. This project represents the ongoing efforts of The University of Iowa College of Dentistry's collaboration with the University of Uppsala Data Center, Uppsala, Sweden, on a computer-based patient dental record model. ICOHR (Intelligent Computer Based Oral Health Record) is an information system which brings together five important parts of the patient's dental record: medical and dental history; oral status; treatment planning; progress notes; and a Patient Care Database, generated from their

  1. A Visualized Fault Recording Analysis System Based on IEC 61850

    Institute of Scientific and Technical Information of China (English)

    李楠; 尹军; 董贝; 韩春江; 葛雅川


    To meet the need for analyzing COMTRADE-format fault recordings in smart substations, a visualized fault recording analysis system based on the COMTRADE format and the IEC 61850 file transfer model was designed and developed. Following the IEC 61850 file transfer model, the software downloads COMTRADE recording files from intelligent electronic devices, parses them, and then displays the sampled-data waveforms and presents fault summaries, event reports, and binary inputs as lists. Through a layered visual logic display, it supports an analysis workflow that proceeds from a system-level overview to in-depth, layer-by-layer investigation, making recording analysis more intuitive, precise, and scientific. The paper describes the overall architecture and program flow of the system, focuses on the recording-download function based on the IEC 61850 file transfer model and the sampled-data analysis algorithms, and details the implementation of the SVG-based visual logic display.

  2. Sociometry Based Multiparty Audio Recordings Summarization


    Vinciarelli, Alessandro


    This paper shows how Social Network Analysis, the study of relational data in specific social environments, can be used to summarize multiparty radio news recordings. A social network is extracted from each recording and it is analyzed in order to detect the role of each speaker (e.g. anchorman, guest, etc.). The role is then used as a criterion to select the segments that are more representative of the recording content. The results show that the length of the recordings can be reduced by mo...

  3. Design of the SGML-based electronic patient record system with the use of object-oriented analysis methods. (United States)

    Kuikka, E; Eerola, A; Porrasmaa, J; Miettinen, A; Komulainen, J


    Since a patient record is typically a document updated by many users, required to be represented in many different layouts, and transferred from place to place, it is a good candidate for structured representation and coding using the SGML document standard. The use of SGML requires that the structure of the document be defined in advance by a Document Type Definition (DTD) and that the document follow it. This paper presents a method that derives an SGML DTD by starting from a description of how the patient record is used in medical care and nursing.

  4. Social Differences in Infant Mortality in 19th Century Rostock: A Demographic Analysis Based on Church Records

    Directory of Open Access Journals (Sweden)

    Michael Mühlichen


    Full Text Available The article examines the historical development of infant mortality in the Hanseatic city of Rostock, with a special focus on the question of how socio-economic factors influenced infant mortality in the early 19th century. Compared with the rest of Germany, the city exhibited an exceedingly low infant mortality level, in particular in the first third of the century. Our analyses show that the occupation of the father had a significant influence on the survival probability of a child in the first year of life in the early 19th century. Newborn children of fathers in lower ranked occupations exhibited a greater mortality risk in the first year of life than the offspring of fathers with occupations of higher status. The analyses are based on the registries of burials and baptisms of St. James’s Church (Jakobikirche in Rostock, which are largely preserved and much of which has been digitalised. Based on these individual data, this is the first event history analysis model conducted in the context of infant mortality in a German city in the 19th century. This article is also the first to reveal Rostock infant mortality rates for the entire 19th century according to sex, thus closing two research gaps.

  5. Sociometry Based Multiparty Audio Recordings Segmentation


    Vinciarelli, Alessandro


    This paper shows how Social Network Analysis, the sociological domain studying the interaction between people in specific social environments, can be used to assign roles to different speakers in multiparty recordings. The experiments presented in this work focus on radio news recordings involving around 11 speakers on average. Each of them is assigned automatically a role (e.g. anchorman or guest) without using any information related to their identity or the amount of time they talk. The re...

  6. An Analysis of the Accuracy of Electromechanical Eigenvalue Calculations Based on Instantaneous Power Waveforms Recorded in a Power Plant

    Directory of Open Access Journals (Sweden)

    Piotr Pruski


    Full Text Available The paper presents the results of calculating the eigenvalues (associated with electromechanical phenomena) of the state matrix of the Polish Power System model on the basis of an analysis of simulated and measured instantaneous power disturbance waveforms of generating units in the Łaziska Power Plant. The method used for the electromechanical eigenvalue calculations consists in approximating the instantaneous power swing waveforms of particular generating units with waveforms that are a superposition of the modal components associated with the sought eigenvalues and their participation factors. A hybrid optimisation algorithm combining genetic and gradient algorithms was used for the computations.

  7. The use of historical records in flood frequency analysis (United States)

    Sutcliffe, J. V.


    The inclusion of historical flood information in analysis can greatly increase the span of years sampled and should improve the reliability of estimation. Early flood level data may be obtained from physical marks, which are often found near historic cities. Early written records tend to be qualitative, but newspaper records can provide detailed level information. Verbal information that a recent flood was the highest in living memory, or that an earlier flood reached a certain level, can be used to extend a recent gauged record, particularly in countries with short written histories. Chinese historical flood records are especially long and could contribute valuable scientific information. The conversion of flood levels to discharges must involve extrapolation and should take full account of topography, not only in deriving realistic rating curves but in terms of the geological stability of the site. Two examples from the Nile show that some ratings can be assumed stable while others change significantly. Apparent trends must therefore be treated with caution and considered in relation to physical changes in the catchment, which could include deforestation, urban development or channel improvements as well as possible rainfall changes. Trends would be difficult to deduce from historic information, as the completeness of the records must change. The inclusion of historic records in single station analysis can be based on graphical or statistical methods. They can be given an appropriate plotting position in the graphical approach; examples are given for the Trent and the Yangtze rivers. Alternatively, maximum likelihood techniques can be adapted to censored samples, and examples are given of the inclusion of historic records in analysis. Flood records from a homogeneous region are often used in a combined analysis, and historical records can be included. 
Examples are given of envelope curves relating maximum flood values to basin area, where historical floods can be
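The graphical approach mentioned above, giving historical floods an appropriate plotting position alongside the gauged record, can be sketched as follows. This is a simplified Weibull-type formula that assumes the gauged years fall within the historical period and that every flood above the perception threshold is known; it is an illustration of the idea, not a reproduction of the analyses cited for the Trent or Yangtze.

```python
def plotting_positions(gauged, historical, hist_period):
    """Assign Weibull-type plotting positions to a gauged record extended
    with historical floods known over `hist_period` years.

    Simplifying assumptions (hypothetical, for illustration): the gauged
    years lie within the historical period, and all floods above the
    historical perception threshold are recorded.
    Returns (flood, exceedance probability) pairs sorted by magnitude.
    """
    floods = sorted(gauged + historical, reverse=True)
    n = hist_period  # effective record length spanned by the information
    return [(q, rank / (n + 1.0)) for rank, q in enumerate(floods, start=1)]
```

The key effect is that the largest historical flood is ranked against the full historical span rather than the short gauged period, which lowers its estimated exceedance probability accordingly.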

  8. A microcontroller-based portable electrocardiograph recorder. (United States)

    Segura-Juárez, José J; Cuesta-Frau, David; Samblas-Pena, Luis; Aboy, Mateo


    We describe a low cost portable Holter design that can be implemented with off-the-shelf components. The recorder is battery powered and includes a graphical display and keyboard. The recorder is capable of acquiring up to 48 hours of continuous electrocardiogram data at a sample rate of up to 250 Hz.

  9. Segment clustering methodology for unsupervised Holter recordings analysis (United States)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German


    Cardiac arrhythmia analysis of Holter recordings is an important issue in clinical settings; however, it implicitly involves other problems related to the large amount of unlabelled data, which implies a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points and to characterize and cluster the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework reduces the high computational cost of Holter analysis, making its implementation feasible for future real-time applications. The performance of the method is measured on records from the MIT/BIH arrhythmia database and achieves high values of sensitivity and specificity, taking advantage of the database labels, for the broad range of heartbeat types recommended by the AAMI.
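The segment-then-cluster idea in this abstract can be sketched as follows. This is a minimal illustration, not the authors' method: a real Holter pipeline would extract morphological and RR-interval features per heartbeat, whereas here beats are assumed to already be feature vectors, and a plain k-means stands in for the per-segment clustering step (all names are illustrative).

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on lists of floats (an illustrative stand-in for the
    paper's clustering step, not its actual algorithm)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each beat to its nearest centroid (squared Euclidean).
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[i] = [sum(vals) / len(members)
                                for vals in zip(*members)]
    return centroids, clusters

def cluster_by_segment(beats, n_segments=4, k=2):
    """Divide the beat stream into roughly equal segments and cluster each
    separately, mimicking how the segment framework bounds the cost of
    clustering very long recordings."""
    seg_len = max(1, len(beats) // n_segments)
    results = []
    for s in range(0, len(beats), seg_len):
        segment = beats[s:s + seg_len]
        if len(segment) >= k:
            results.append(kmeans(segment, k))
    return results
```

Clustering each segment independently keeps the per-call cost small; the subsequent merge/split of clusters by a homogeneity criterion is the part this sketch omits.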

  10. Intelligent technique for knowledge reuse of dental medical records based on case-based reasoning. (United States)

    Gu, Dong-Xiao; Liang, Chang-Yong; Li, Xing-Guo; Yang, Shan-Lin; Zhang, Pei


    With the rapid development of both information technology and the management of modern medical regulation, the generation of medical records tends to be increasingly intelligent. In this paper, case-based reasoning is applied to the process of generating records of dental cases. Based on an analysis of the features of dental records, a case base is constructed. A mixed case retrieval method (FAIES) is proposed for the knowledge reuse of dental records; it adopts fuzzy mathematics, improves the similarity algorithm based on the Euclidean-Lagrangian distance, and applies a PULL & PUSH weight adjustment strategy. Finally, an intelligent system for generating dental case records (CBR-DENT) is constructed. The effectiveness of the system, the efficiency of the retrieval method, the extent of adaptation and the adaptation efficiency are tested using the constructed case base. It is demonstrated that FAIES is very effective in terms of reducing the time needed to write medical records and improving efficiency and quality. FAIES is also proven to be an effective aid for diagnoses and provides a new idea for the management of medical records and its applications.

  11. Statistical analysis of molecular signal recording.

    Directory of Open Access Journals (Sweden)

    Joshua I Glaser

    Full Text Available A molecular device that records time-varying signals would enable new approaches in neuroscience. We have recently proposed such a device, termed a "molecular ticker tape", in which an engineered DNA polymerase (DNAP) writes time-varying signals into DNA in the form of nucleotide misincorporation patterns. Here, we define a theoretical framework quantifying the expected capabilities of molecular ticker tapes as a function of experimental parameters. We present a decoding algorithm for estimating time-dependent input signals, and DNAP kinetic parameters, directly from misincorporation rates as determined by sequencing. We explore the requirements for accurate signal decoding, particularly the constraints on (1) the polymerase biochemical parameters, and (2) the amplitude, temporal resolution, and duration of the time-varying input signals. Our results suggest that molecular recording devices with kinetic properties similar to natural polymerases could be used to perform experiments in which neural activity is compared across several experimental conditions, and that devices engineered by combining favorable biochemical properties from multiple known polymerases could potentially measure faster phenomena such as slow synchronization of neuronal oscillations. Sophisticated engineering of DNAPs is likely required to achieve molecular recording of neuronal activity with single-spike temporal resolution over experimentally relevant timescales.

  12. A web-based electronic patient record (ePR) system for data integration in movement analysis research on wheel-chair users to minimize shoulder pain (United States)

    Deshpande, Ruchi R.; Requejo, Philip; Sutisna, Erry; Wang, Ximing; Liu, Margaret; McNitt-Gray, Sarah; Ruparel, Puja; Liu, Brent J.


    Patients confined to manual wheel-chairs are at an added risk of shoulder injury. There is a need for developing optimal bio-mechanical techniques for wheel-chair propulsion through movement analysis. Data collected is diverse and in need of normalization and integration. Current databases are ad-hoc and do not provide flexibility, extensibility and ease of access. The need for an efficient means to retrieve specific trial data, display it and compare data from multiple trials is unmet through lack of data association and synchronicity. We propose the development of a robust web-based ePR system that will enhance workflow and facilitate efficient data management.

  13. Implementing security in computer based patient records clinical experiences. (United States)

    Iversen, K R; Heimly, V; Lundgren, T I


    In Norway, organizational changes in hospitals and a stronger focus on patient safety have changed the way of organizing and managing paper based patient records. Hospital-wide patient records tend to replace department based records. Since not only clinicians, but also other non-medical staff have access to the paper records, they also have easy access to all the information which is available on a specific patient; such a system has obvious 'side effects' on privacy and security. Computer based patient records (CPRs) can provide the solution to this apparent paradox if the complex aspects of security, privacy, effectiveness, and user friendliness are focused on jointly from the outset in designing such systems. Clinical experiences in Norway show that it is possible to design patient record systems that provide a very useful tool for clinicians and other health care personnel (HCP) while fully complying with comprehensive security and privacy requirements.

  14. Incidence and costs of hip fractures vs strokes and acute myocardial infarction in Italy: comparative analysis based on national hospitalization records

    Directory of Open Access Journals (Sweden)

    Piscitelli P


    Full Text Available Prisco Piscitelli,1,2 Giovanni Iolascon,3 Alberto Argentiero,2 Giovanna Chitano,2 Cosimo Neglia,2 Gemma Marcucci,1 Manuela Pulimeno,2 Marco Benvenuto,2 Santa Mundi,2 Valentina Marzo,2 Daniela Donato,4 Angelo Baggiani,4 Alberto Migliore,5 Mauro Granata,6 Francesca Gimigliano,3 Raffaele Di Blasio,7 Alessandra Gimigliano,3 Lorenzo Renzulli,7 Maria Luisa Brandi,1 Alessandro Distante,2,4 Raffaele Gimigliano3,7 1University of Florence, Florence, Italy; 2ISBEM Research Centre, Brindisi, Italy; 3Second University of Naples, Naples, Italy; 4University of Pisa, Pisa, Italy; 5Fatebenefratelli St Peter’s Hospital, Rome, Italy; 6St Filippo Neri Hospital, Rome, Italy; 7Casa di Cura Santa Maria del Pozzo, Somma Vesuviana, Italy. Objectives: As osteoporotic fractures are becoming a major health care problem in countries characterized by an increasing number of older adults, in this study we aimed to compare the incidence and costs of hip fragility fractures in Italian elderly people versus those of major cardiovascular diseases (strokes and acute myocardial infarctions [AMI]) occurring in the whole adult population. Methods: We analyzed hospitalization records maintained at the national level by the Italian Ministry of Health for the diagnoses of hip fractures (ICD-9-CM codes 820–821), AMI (code 410), hemorrhagic (codes 430, 431, 432) and ischemic strokes (codes 433–434), and TIA (code 435) between 2001 and 2005. Cost analyses were based on diagnosis-related groups. Results: The incidence of hip fractures in elderly people increased (+12.9%) between 2001 and 2005, as did that of AMI (+20.2%) and strokes (hemorrhagic: +9.6%; ischemic: +14.7%) occurring in the whole adult population; conversely, hospitalization due to TIA decreased by 13.6% between 2001 and 2005. In 2005, the hospital costs across the national health care system that were associated with hip fragility fractures in the elderly were comparable to those of strokes (both hemorrhagic and

  15. Agriculture, population growth, and statistical analysis of the radiocarbon record. (United States)

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L


    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.

  16. Analysis of the Factors Influencing Breeding Record Establishment of Sheep-raising Households or Farms Based on Logit-ISM:Based on 849 Questionnaires from 17 Cities in Shandong Province

    Institute of Scientific and Technical Information of China (English)

    Shiping; ZHU; Shimin; SUN; Limin; HAN


    Breeding records are an important means for sheep-raising households and farms to implement standardized sheep raising, trace major epidemic information, and ensure the quality and safety of products. Based on 849 questionnaires from 17 cities of Shandong Province, this paper first uses a binary discrete Logit model to analyze the factors influencing whether sheep-raising households or farms establish breeding records, and then uses the ISM model to explain the relationships and hierarchy among the influencing factors. The results show that seven factors have a significant impact on the establishment of breeding records: the education level of the decision-makers, farming scale, number of years in farming, degree of specialization, government support, membership in an industrialization organization, and recognition of the value of breeding records. Among them, government support and recognition of breeding records are direct surface-level factors; degree of specialization and membership in an industrialization organization are intermediate indirect factors; and the education level of the decision-makers, farming scale, and number of years in farming are deep root factors.
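The first-stage binary Logit model can be sketched in a few lines. This is a hedged illustration on toy data, not the paper's estimation (which uses the 849 survey responses and presumably standard econometric software); the two predictors, the synthetic data, and the stochastic-gradient fitting routine are all assumptions for demonstration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.1, epochs=2000):
    """Binary logit fitted by stochastic gradient ascent on the
    log-likelihood (an illustrative stand-in for maximum-likelihood
    estimation; variable names are hypothetical)."""
    w = [0.0] * (len(X[0]) + 1)  # intercept followed by coefficients
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = yi - sigmoid(z)  # gradient of the per-observation log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

# Toy data: [education level, farming scale] -> keeps breeding records (1/0).
X = [[1, 1], [2, 1], [2, 2], [3, 2], [3, 3], [4, 3], [4, 4], [5, 4]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
w = fit_logit(X, y)
```

Positive fitted coefficients would indicate that higher education and larger scale raise the probability of keeping records, in line with the factors the paper finds significant.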

  17. Towards a web-based system for family health record. (United States)

    Marceglia, Sara; Bonacina, Stefano; Braidotti, Andrea; Nardelli, Marco; Pinciroli, Francesco


    Electronic health records are a fundamental support needed not only by healthcare providers but also by individual patients. We considered health management in the family environment and developed a web-based system for the family health record. The system permits easy data entry and provides effective visualization of the clinical data concerning family members, including printer-friendly output.

  18. Evidence-based development of a mobile telephone food record. (United States)

    Six, Bethany L; Schap, Tusarebecca E; Zhu, Fengqing M; Mariappan, Anand; Bosch, Marc; Delp, Edward J; Ebert, David S; Kerr, Deborah A; Boushey, Carol J


    Mobile telephones with an integrated camera can provide a unique mechanism for collecting dietary information that reduces burden on record-keepers. Objectives for this study were to test whether participant's proficiency with the mobile telephone food record improved after training and repeated use and to measure changes in perceptions regarding use of the mobile telephone food record after training and repeated use. Seventy-eight adolescents (26 males, 52 females) ages 11 to 18 years were recruited to use the mobile telephone food record for one or two meals. Proficiency with the mobile telephone food record was defined as capturing a useful image for image analysis and self-reported ease of use. Positive changes in perceptions regarding use of the mobile telephone food record were assumed to equate to potentially improved proficiency with the mobile telephone food record. Participants received instruction for using the mobile telephone food record prior to their first meal, and captured an image of their meals before and after eating. Following the first meal, participants took part in an interactive session where they received additional training on capturing images in various snacking situations and responded to questions about user preferences. Changes in the participants' abilities to capture useful images and perceptions about the usability of the mobile telephone food record were examined using McNemar, Wilcoxon rank-sum test, and paired t test. After using the mobile telephone food record, the majority of participants (79%) agreed that the software was easy to use. Eleven percent of participants agreed taking images before snacking would be easy. After additional training, the percent increased significantly to 32% (Ptechnologies; however, the mobile telephone food record design needs to accommodate the lifestyles of its users to ensure useful images and continuous use. 
Further, these results suggest that additional training in using a new technology may

  19. Digital image analysis of palaeoenvironmental records and applications

    Institute of Scientific and Technical Information of China (English)


    Environmental change signals in geological or biological records are commonly reflected in their reflected- or transmitted-light images, and these signals can be extracted through digital image analysis. The analysis involves selecting section lines, reading colour values and calculating environmental proxy indices along the section lines, identifying layers, automatic chronology, and investigating the structural evolution of growth bands. With detailed illustrations of the imaging technique, this note provides image-analysis procedures for coral, tree-ring and stalagmite records. The environmental implications of the proxy indices obtained from image analysis are given through application demonstrations of the technique.

  20. System of Web-Based Electronic Medical Record

    Directory of Open Access Journals (Sweden)

    Maurício A. Machado


    Full Text Available Nowadays, information systems are considered a decision-support tool in several areas. One application of such systems is the development of a web-based electronic medical record. Attention to standards, naming, accurate measurement, and system security in the sense of information privacy are fundamental elements in the development of a web-based electronic medical record. Therefore, based on the solidity and maturity of web applications, this work presents a solution that supports the construction of electronic medical records over the internet. Recently, there have been few successful initiatives in the Brazilian market. Taking this into account, this work proposes the use of proven software development methodologies. Angiology and vascular surgery were used as a case study. Currently, the medical consultation processes of the angiology and vascular surgery specialties are operated manually. The final product automates these procedures.

  1. A Factor-based Analysis of the Comprehensive Evaluation of Academic Records of College Students in Shaanxi

    Institute of Scientific and Technical Information of China (English)



    Given the necessity of assessing college students' academic records, a factor analysis of the academic records of college students is performed, a factor model of student grades is derived, and the major elements affecting students' learning abilities are identified. Empirical research is carried out to test this model. The results show that the model is scientific, reasonable, and impartial; it reveals the relationships between different courses and the cultivation of different learning abilities, and it provides a basis for further studies.
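The factor-analysis step can be approximated as follows. As a simplification, the sketch computes the course-by-course correlation matrix and extracts its dominant eigenvector by power iteration, a rough proxy for the first factor's loadings rather than a full factor model with rotation; the grade data and all names are hypothetical.

```python
import math

def correlation_matrix(scores):
    """Pearson correlations between courses; rows are students, columns
    are courses (population standard deviations, for simplicity)."""
    n, m = len(scores), len(scores[0])
    means = [sum(col) / n for col in zip(*scores)]
    sds = [math.sqrt(sum((scores[i][j] - means[j]) ** 2 for i in range(n)) / n)
           for j in range(m)]
    return [[sum((scores[i][a] - means[a]) * (scores[i][b] - means[b])
                 for i in range(n)) / (n * sds[a] * sds[b])
             for b in range(m)] for a in range(m)]

def leading_factor(corr, iters=100):
    """Power iteration for the dominant eigenvector of the correlation
    matrix: a rough proxy for the first common factor's loadings
    (a sketch, not a complete factor model)."""
    v = [1.0] * len(corr)
    for _ in range(iters):
        v = [sum(corr[i][j] * v[j] for j in range(len(v)))
             for i in range(len(v))]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return v
```

Courses that load with the same sign on the leading vector move together across students, which is the kind of course-to-ability relationship the abstract describes.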

  2. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin


    point-and-click analysis, due to a novel and efficient indexing structure. The web site demonstrates several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13,000 vehicles, and touching most...

  3. Electronic Health Record A Systems Analysis of the Medications Domain

    CERN Document Server

    Scarlat, Alexander


    An accessible primer, Electronic Health Record: A Systems Analysis of the Medications Domain introduces the tools and methodology of Structured Systems Analysis as well as the nuances of the Medications domain. The first part of the book provides a top-down decomposition along two main paths: data in motion--workflows, processes, activities, and tasks--in parallel with the analysis of data at rest--database structures, conceptual and logical models, and entity-relationship diagrams. Structured systems analysis methodology and tools are applied to: electronic prescription, computerized physician or

  4. Quantum-dot based nanothermometry in optical plasmonic recording media

    Energy Technology Data Exchange (ETDEWEB)

    Maestro, Laura Martinez [Fluorescence Imaging Group, Departamento de Física de Materiales, Facultad de Ciencias Físicas, Universidad Autónoma de Madrid, Madrid 28049 (Spain); Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia); Zhang, Qiming; Li, Xiangping; Gu, Min [Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia); Jaque, Daniel [Fluorescence Imaging Group, Departamento de Física de Materiales, Facultad de Ciencias Físicas, Universidad Autónoma de Madrid, Madrid 28049 (Spain)


    We report the direct experimental determination of the temperature increase caused by laser irradiation in an optical recording medium consisting of a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots into the recording medium allowed single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon-excited fluorescence of the quantum dots. The experimental results have been compared with numerical simulations, revealing excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media.

  5. CRFs based de-identification of medical records (United States)

    He, Bin; Guan, Yi; Cheng, Jianyi; Cen, Keting; Hua, Wenlan


    De-identification is a shared task of the 2014 i2b2/UTHealth challenge. The purpose of this task is to remove protected health information (PHI) from medical records. In this paper, we propose a novel de-identifier, WI-deId, based on conditional random fields (CRFs). A preprocessing module, which tokenizes the medical records using regular expressions and an off-the-shelf tokenizer, is introduced, and three groups of features are extracted to train the de-identifier model. The experiment shows that our system is effective in the de-identification of medical records, achieving a micro-F1 of 0.9232 at the i2b2 strict entity evaluation level. PMID:26315662
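The preprocessing step described here (regular-expression tokenization ahead of CRF feature extraction) can be sketched as below. The patterns and the shape feature are illustrative guesses at typical de-identification features, not the actual WI-deId rules or its off-the-shelf tokenizer.

```python
import re

# Hypothetical token pattern in the spirit of WI-deId's preprocessing:
# keep date-like strings intact, then words, then single punctuation marks.
TOKEN_RE = re.compile(r"\d+/\d+/\d+|\w+|[^\w\s]")

def tokenize(record):
    """Split a medical-record string into tokens for feature extraction."""
    return TOKEN_RE.findall(record)

def shape_feature(token):
    """A typical CRF orthographic feature: map uppercase letters to X,
    lowercase to x, and digits to d (letters are replaced before digits
    so the substituted 'd' characters are not re-rewritten)."""
    return re.sub(r"\d", "d",
                  re.sub(r"[A-Z]", "X",
                         re.sub(r"[a-z]", "x", token)))
```

Each token would then be paired with features like its shape, neighbours, and dictionary membership before being fed to the CRF trainer.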

  6. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary (United States)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.


    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of directly observing foraging behavior, data from time-depth recorders and other types of behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Often the definitions of actual behavioral units and analytical approaches may vary substantially which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches that are used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries from a diversity of backgrounds including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases when temporal autocorrelation structure is useful for identifying behaviors of interest. 
Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.

  7. An Autonomous Underwater Recorder Based on a Single Board Computer. (United States)

    Caldas-Morgan, Manuel; Alvarez-Rosario, Alexander; Rodrigues Padovese, Linilson


    As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Design and construction of more affordable alternatives from scratch can become complex because it requires profound knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built, based on a Raspberry Pi single-board computer. A properly working prototype was assembled and it demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module which was thoroughly evaluated. It is estimated that it will allow for great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and man-made acoustic events, among others. Although power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, lower development time and, inherently, its lower cost.

  8. An Autonomous Underwater Recorder Based on a Single Board Computer.

    Directory of Open Access Journals (Sweden)

    Manuel Caldas-Morgan

    Full Text Available As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Design and construction of more affordable alternatives from scratch can become complex because it requires profound knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built, based on a Raspberry Pi single-board computer. A properly working prototype was assembled and it demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module which was thoroughly evaluated. It is estimated that it will allow for great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and man-made acoustic events, among others. Although power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, lower development time and, inherently, its lower cost.

  9. Seeking a fingerprint: analysis of point processes in actigraphy recording (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek


    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
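The survival-function slopes singled out above as biomarkers can be estimated directly from a list of state durations. A minimal sketch, with illustrative synthetic Pareto data standing in for real actigraphy rest durations (function names are my own, not from the study):

```python
import numpy as np

def survival_function(durations):
    """Empirical survival function S(t) = P(duration >= t)."""
    t = np.sort(np.asarray(durations, dtype=float))
    n = len(t)
    s = 1.0 - np.arange(n) / n  # fraction of samples >= t[i]
    return t, s

def powerlaw_slope(durations):
    """Slope of log S(t) vs log t; for S(t) ~ t^-alpha this estimates -alpha."""
    t, s = survival_function(durations)
    mask = (t > 0) & (s > 0)
    slope, _ = np.polyfit(np.log(t[mask]), np.log(s[mask]), 1)
    return slope

rng = np.random.default_rng(0)
rests = rng.pareto(1.5, 5000) + 1.0  # heavy-tailed synthetic rest durations
print(powerlaw_slope(rests))
```

For a true power law with exponent 1.5 the fitted slope should come out near -1.5; comparing such slopes between groups is the kind of discrimination the abstract describes.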

  10. Rule-based deduplication of article records from bibliographic databases. (United States)

    Jiang, Yu; Lin, Can; Meng, Weiyi; Yu, Clement; Cohen, Aaron M; Smalheiser, Neil R


    We recently designed and deployed a metasearch engine, Metta, that sends queries and retrieves search results from five leading biomedical databases: PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Central Register of Controlled Trials. Because many articles are indexed in more than one of these databases, it is desirable to deduplicate the retrieved article records. This is not a trivial problem because data fields contain a lot of missing and erroneous entries, and because certain types of information are recorded differently (and inconsistently) in the different databases. The present report describes our rule-based method for deduplicating article records across databases and includes an open-source script module that can be deployed freely. Metta was designed to satisfy the particular needs of people who are writing systematic reviews in evidence-based medicine. These users want the highest possible recall in retrieval, so it is important to err on the side of not deduplicating any records that refer to distinct articles, and it is important to perform deduplication online in real time. Our deduplication module is designed with these constraints in mind. Articles that share the same publication year are compared sequentially on parameters including PubMed ID number, digital object identifier, journal name, article title and author list, using text approximation techniques. In a review of Metta searches carried out by public users, we found that the deduplication module was more effective at identifying duplicates than EndNote without making any erroneous assignments.
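A minimal sketch of this style of rule-based matching (the field names and the 0.9 title threshold are assumptions for illustration; the actual open-source Metta module is more elaborate):

```python
from difflib import SequenceMatcher

def same_article(rec_a, rec_b, title_threshold=0.9):
    """Rule-based match: same year required, trusted identifiers first,
    then approximate title comparison. Errs on the side of NOT merging,
    as the authors recommend for systematic-review recall."""
    if rec_a.get("year") != rec_b.get("year"):
        return False
    for key in ("pmid", "doi"):  # exact identifiers decide outright
        a, b = rec_a.get(key), rec_b.get(key)
        if a and b:
            return a == b
    ta, tb = rec_a.get("title", "").lower(), rec_b.get("title", "").lower()
    return SequenceMatcher(None, ta, tb).ratio() >= title_threshold

a = {"year": 2011, "doi": "10.1/x", "title": "Rule-based deduplication"}
b = {"year": 2011, "doi": "10.1/x", "title": "Rule based de-duplication"}
c = {"year": 2011, "title": "A different paper entirely"}
print(same_article(a, b), same_article(a, c))  # True False
```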

  11. 'Citizen science' recording of fossils by adapting existing computer-based biodiversity recording tools (United States)

    McGowan, Alistair


    Biodiversity recording activities have been greatly enhanced by the emergence of online schemes and smartphone applications for recording and sharing data about a wide variety of flora and fauna. As a palaeobiologist, one of the areas of research I have been heavily involved in is the question of whether the amount of rock available to sample acts as a bias on our estimates of biodiversity through time. Although great progress has been made on this question over the past ten years by a number of researchers, I still think palaeontology has not followed the lead offered by the 'citizen science' revolution in studies of extant biodiversity. By constructing clearly structured surveys with online data collection support, it should be possible to collect field data on the occurrence of fossils at the scale of individual exposures, which are needed to test competing hypotheses about these effects at relatively small spatial scales. Such data collection would be hard to justify for universities and museums with limited personnel but a co-ordinated citizen science programme would be capable of delivering such a programme. Data collection could be based on the MacKinnon's Lists method, used in rapid conservation assessment work. It relies on observers collecting lists of a fixed length (e.g. 10 species long) but what is important is that it focuses on getting observers to ignore sightings of the same species until that list is complete. This overcomes the problem of 'common taxa being commonly recorded' and encourages observers to seek out and identify the rarer taxa. This gives a targeted but finite task. Rather than removing fossils, participants would be encouraged to take photographs to share via a recording website. The success of iSpot, which allows users to upload photos of plants and animals for other users to help with identifications, offers a model for overcoming the problems of identifying fossils, which can often look nothing like the examples illustrated in

  12. A New Spectral Shape-Based Record Selection Approach Using Np and Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Edén Bojórquez


    Full Text Available With the aim of improving code-based real-record selection criteria, an approach inspired by a proxy parameter of spectral shape, named Np, is analyzed. The procedure pursues several objectives aimed at minimizing the record-to-record variability of the ground motions selected for seismic structural assessment. To select the best set of ground motion records to be used as input for nonlinear dynamic analysis, an optimization approach using genetic algorithms is applied, focused on finding the set of records most compatible with a target spectrum and target Np values. The results of the new Np-based approach suggest that the real accelerograms obtained with this procedure reduce the scatter of the response spectra compared with the traditional approach; furthermore, the mean spectrum of the selected set is very similar to the target seismic design spectrum in the period range of interest, and at the same time similar Np values are obtained for the selected records and the target spectrum.
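Np is commonly defined (following Bojórquez and Iervolino) as the geometric mean of spectral ordinates over a period range, normalized by Sa(T1); the sketch below assumes that definition and uses a toy descending spectrum, so treat it as an illustration rather than the paper's exact formulation:

```python
import numpy as np

def np_proxy(periods, sa, t1, tn):
    """Spectral-shape proxy Np: geometric mean of Sa over [t1, tn]
    divided by Sa(t1). The 20-point interpolation grid is an assumption."""
    sa_t1 = np.interp(t1, periods, sa)
    grid = np.linspace(t1, tn, 20)
    sa_avg = np.exp(np.mean(np.log(np.interp(grid, periods, sa))))
    return sa_avg / sa_t1

periods = np.linspace(0.05, 4.0, 80)
sa = 1.0 / (1.0 + periods)  # toy, monotonically decreasing spectrum
print(np_proxy(periods, sa, 0.5, 1.5))
```

For a decreasing spectrum Np falls below 1; record selection then favours accelerograms whose Np is close to the target spectrum's value.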

  13. Recommending Related Papers Based on Digital Library Access Records

    CERN Document Server

    Pohl, Stefan; Joachims, Thorsten


    An important goal for digital libraries is to enable researchers to more easily explore related work. While citation data is often used as an indicator of relatedness, in this paper we demonstrate that digital access records (e.g. http-server logs) can be used as indicators as well. In particular, we show that measures based on co-access provide better coverage than co-citation, that they are available much sooner, and that they are more accurate for recent papers.
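Co-access measures of the kind described can be computed with a simple pair count over sessions reconstructed from http-server logs; a minimal, illustrative sketch:

```python
from collections import defaultdict
from itertools import combinations

def coaccess_counts(sessions):
    """Count how often each pair of papers is accessed in the same
    session -- a simple relatedness proxy built from access logs."""
    counts = defaultdict(int)
    for papers in sessions:
        for a, b in combinations(sorted(set(papers)), 2):
            counts[(a, b)] += 1
    return counts

logs = [["p1", "p2", "p3"], ["p1", "p2"], ["p2", "p3"], ["p1"]]
c = coaccess_counts(logs)
print(c[("p1", "p2")], c[("p2", "p3")])  # 2 2
```

In practice the raw counts would be normalized (e.g. into a cosine or Jaccard similarity) before ranking related papers.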

  14. Analysis and Research on the Mechanisms of Attacks on Log Systems in a Network Environment

    Institute of Scientific and Technical Information of China (English)



    The security of the log system is vital to overall computer security. This paper analyses the locations at which log systems are attacked and the different types of such attacks, and examines in detail the mechanisms of attacks on log systems from three aspects: the confidentiality, integrity and availability of the log system. Finally, it summarises how a log system can be protected against the various attacks. The work provides practical guidance for protecting the log systems of host computers in a network environment.

  15. Analysis of astronomical records of King Wu's Conquest

    Institute of Scientific and Technical Information of China (English)


    All astronomical records related to King Wu's Conquest have been collected and analysed comprehensively. Constrained by the newest conclusions of archaeology, philology and history in the Xia-Shang-Zhou Chronology Project, and based mainly on the dates in Wucheng, Jupiter's position in Guoyu and information on the season, our first choice for the date of King Wu's Conquest is 20 June 1046 BC. This conclusion properly explains most of the relevant literature.

  16. A full lipid biomarker based record from Lake Challa, Tanzania (United States)

    Blaga, C. I.; de Leeuw, J. W.; Verschuren, D.; Sinninghe Damsté, J. S.


    The climate of the regions surrounding the Indian Ocean - East Africa and the Arabian and Indian peninsulas - is strongly dominated by the dynamics of the seasonal monsoon. To understand the long- and short-term driving forces behind natural climatic variability in this region, it is highly important to reconstruct past climatic changes and thereby predict future changes, also taking anthropogenic activities into account. Most low-latitude locations lack continuous, highly resolved continental records with good age control, and the few existing records, acquired from tropical glacier ice, cave stalagmites and fossil diatoms, give only a limited understanding of the climatic variations they reflect (rainfall and drought, or temperature and its effect on precipitation). Chemically stratified crater lakes accumulate high-quality climate-proxy records, as shown in very recent studies of the continuous and finely laminated sediment record of Lake Challa, situated on the lower eastern slope of Mt. Kilimanjaro (Verschuren et al. 2009; Wolff et al. 2011). The unique location of this lake in equatorial East Africa implies that its climate variability is influenced by the Indian Ocean and not by the Atlantic, owing to the Congo Air Boundary (Tierney et al. 2011). The objective of this study is to fully explore the biomarker content of the Lake Challa sedimentary record, which is already characterized by an excellent time resolution and chronology. Various normal-chain lipids (n-alkanes, n-fatty acids, n-alcohols), sterols, long-chain diols, triterpenoids and glycolipids in sedimentary organic matter were determined in their solvent-extractable (free) and saponification-released (bound) forms. The changing composition of the organic matter of the lake is used as a framework to trace palaeo-humidity, terrestrial input, algal input and temperature in sediment traps and underlying sediments of Lake Challa, to further our palaeo-environmental knowledge based on GDGTs and

  17. Break and trend analysis of EUMETSAT Climate Data Records (United States)

    Doutriaux-Boucher, Marie; Zeder, Joel; Lattanzio, Alessio; Khlystova, Iryna; Graw, Kathrin


    EUMETSAT has reprocessed imagery acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board Meteosat-8 and -9. The data cover the period from 2004 to 2012. Climate Data Records (CDRs) of atmospheric parameters such as Atmospheric Motion Vectors (AMV) as well as Clear and All Sky Radiances (CSR and ASR) have been generated. Such CDRs are mainly ingested by ECMWF to produce reanalysis data. In addition, EUMETSAT produced a long CDR (1982-2004) of land surface albedo exploiting imagery acquired by the Meteosat Visible and Infrared Imager (MVIRI) on board Meteosat-2 to -7. Such a CDR is key information for climate analysis and climate models. Extensive validation has been performed for the surface albedo record, and a first validation of the winds and clear-sky radiances has been done. All validation results showed that the time series of all parameters appear homogeneous at first sight. Statistical science offers a variety of analysis methods that have been applied to examine the homogeneity of the CDRs further. Many breakpoint analysis techniques depend on the comparison of two time series, which raises the issue that both series may contain breakpoints. This paper presents a quantitative statistical analysis of possible breakpoints found in the MVIRI and SEVIRI CDRs, including the attribution of breakpoints to instrument changes and other events in the data series compared. The value of the different methods applied is discussed, with suggestions on how to develop this type of analysis further for the quality evaluation of CDRs.

  18. Eddy current analysis of thin film recording heads (United States)

    Shenton, D.; Cendes, Z. J.


    Due to inherently thin pole tips which enhance the sharpness of read/write pulses, thin-film magnetic recording heads provide a unique potential for increasing disk file capacity. However, the very feature of these heads which makes them attractive in the recording process, namely, their small size, also makes thin-film heads difficult to study experimentally. For this reason, a finite element simulation of the thin-film head has been developed to provide the magnetic field distribution and the resistance/inductance characteristics of these heads under a variety of conditions. A study based on a one-step multipath eddy current procedure is reported. This procedure may be used in thin film heads to compute the variation of magnetic field with respect to frequency. Computations with the IBM 3370 head show that a large phase shift occurs due to eddy currents in the frequency range 1-10 MHz.

  19. Semantic models in medical record data-bases. (United States)

    Cerutti, S


    A great effort has recently been made in the area of data-base design in a number of application fields (banking, insurance, travel, etc.). Yet it is the current experience of computer scientists in the medical field that medical record information-processing requires a less rigid and more complete definition of data-base specifications, for a much more heterogeneous set of data and for different users with different aims. Hence it is important to state that a data-base in the medical field ought to be a model of the environment for which it was created, rather than just a collection of data. New, more powerful and more flexible data-base models are now being designed, particularly in the USA, where the current trend in medicine is to implement, in the same structure, the connection between the data-base and many different, specific users (for administrative aims, medical care control, treatments, statistical and epidemiological results, etc.). In this way the individual users are able to talk to the data-base without interfering with one another. The present paper outlines how this multi-purpose flexibility can be achieved mainly by improving the capabilities of the data-base model. This approach allows the creation of procedures for semantic integrity control, which will certainly have a dramatic impact on important management features in the future, ranging from data-quality checking and the detection of non-physiological states to more medically oriented procedures such as drug-interaction checking, record surveillance and medical care review. That is especially true when a large amount of data is to be processed and the classical hierarchical and network data models are no longer sufficient for developing satisfactory and reliable automatic procedures. In this regard, particular emphasis is given to the relational model and, at the highest level, to the semantic data model.

  20. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems. (United States)


    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record...

  1. Network Analysis of Time-Lapse Microscopy Recordings

    Directory of Open Access Journals (Sweden)

    Erik Smedler


    Full Text Available Multicellular organisms rely on intercellular communication to regulate important cellular processes critical to life. To further our understanding of those processes there is a need to scrutinize dynamical signaling events and their functions in both cells and organisms. Here, we report a method and provide MATLAB code that analyzes time-lapse microscopy recordings to identify and characterize network structures within large cell populations, such as interconnected neurons. The approach is demonstrated using intracellular calcium (Ca2+) recordings in neural progenitors and cardiac myocytes, but could be applied to a wide variety of biosensors employed in diverse cell types and organisms. In this method, network structures are analyzed by applying cross-correlation signal processing and graph theory to single-cell recordings. The goal of the analysis is to determine whether the single-cell activity constitutes a network of interconnected cells and to decipher the properties of this network. The method can be applied in many fields of biology in which biosensors are used to monitor signaling events in living cells. Analyzing intercellular communication in cell ensembles can reveal essential network structures that provide important biological insights.
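A zero-lag sketch of the core idea: the published method applies full cross-correlation in MATLAB, but plain Pearson correlation with an assumed threshold illustrates how pairwise trace similarity becomes a network adjacency matrix:

```python
import numpy as np

def correlation_network(traces, threshold=0.7):
    """Build an adjacency matrix by thresholding pairwise Pearson
    correlation of single-cell recordings (rows = cells, cols = time)."""
    corr = np.corrcoef(traces)
    adj = (np.abs(corr) >= threshold) & ~np.eye(len(traces), dtype=bool)
    return adj

t = np.linspace(0, 10, 500)
traces = np.vstack([
    np.sin(t),                                                     # cell 1
    np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size),  # cell 2, correlated
    np.cos(3 * t),                                                 # cell 3, unrelated
])
adj = correlation_network(traces)
print(adj[0, 1], adj[0, 2])
```

Graph-theoretic measures (degree distribution, clustering, hubs) can then be computed on `adj` to characterize the network, as the abstract describes.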

  2. Microcontroller-based wireless recorder for biomedical signals. (United States)

    Chien, C-N; Hsu, H-W; Jang, J-K; Rau, C-L; Jaw, F-S


    A portable multichannel system is described for the wireless recording of biomedical signals. Instead of using the conventional time-division analog-modulation method, digital multiplexing was applied to increase the number of signal channels to 4. Detailed design considerations and the functional allocation of the system are discussed. The front-end unit was modularly designed to condition the input signal in an optimal manner. The microcontroller then handled the tasks of data conversion and wireless transmission, as well as providing simple preprocessing such as waveform averaging or rectification. The low-power nature of this microcontroller affords the benefit of battery operation and, hence, patient isolation. Finally, a single-chip receiver, compatible with the RF transmitter of the microcontroller, was used to implement a compact interface with the host computer. An application of this portable recorder to low-back pain studies is shown: the device can simultaneously record one ECG and two surface EMG channels wirelessly, which helps relieve patients' anxiety during clinical measurement. Such microcontroller-based wireless measurement could be an important trend in biomedical instrumentation, and we hope that this paper will be useful to other colleagues.


    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev


    Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for the documentation of diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name-based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record, and F2 is the software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations entered 16,591 entries (files) for 2,370 patients. The merge process was tested on 2 operating systems: when copied together, all files sort as expected by patient, and for each patient in chronological order, providing a digital cumulative patient record that contains heterogeneous file formats. Conclusion: The file-based approach for storing heterogeneous digital patient-related information is a reliable system, which can handle open-source, proprietary, general and custom file formats and seems to be easily scalable. Further development of software for automatic integrity checks and for searching and indexing of the files is expected to produce a more user-friendly environment.
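A sketch of how such M1M2M3.F2 file names could be parsed and merged into a chronological per-patient record. The field widths below (6-character patient ID, 8-digit YYYYMMDD date, 2-character author code) are assumptions for illustration; the abstract only specifies the M1M2M3.F2 ordering:

```python
from collections import defaultdict

def parse_record(filename):
    """Split an M1M2M3.F2 name into its fields (widths are assumed)."""
    stem, ext = filename.rsplit(".", 1)
    return {"patient": stem[:6], "date": stem[6:14],
            "author": stem[14:16], "ext": ext}

def cumulative_record(filenames):
    """Group files by patient and sort each group chronologically,
    mimicking the 'copy together and sort' merge described above."""
    per_patient = defaultdict(list)
    for f in filenames:
        r = parse_record(f)
        per_patient[r["patient"]].append((r["date"], f))
    return {p: [f for _, f in sorted(v)] for p, v in per_patient.items()}

files = ["P00001" + "20150302" + "NS" + ".pdf",
         "P00001" + "20140101" + "NS" + ".doc"]
rec = cumulative_record(files)
print(rec["P00001"][0])
```

Because the date field uses a big-endian format, plain lexicographic sorting already yields chronological order, which is what makes the scheme platform-independent.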

  4. Multifractal detrended moving average analysis of global temperature records

    CERN Document Server

    Mali, Provash


    Long-range correlation and the multifractal nature of the global monthly mean temperature anomaly time series over the period 1850-2012 are studied in terms of the multifractal detrended moving average (MFDMA) method. We try to address the source(s) of multifractality in the time series by comparing the results derived from the actual series with those from a set of shuffled and surrogate series. The newly developed MFDMA method predicts a multifractal structure of the temperature anomaly time series broadly similar to that observed with other multifractal methods. In our analysis, the major contribution to multifractality in the temperature records is found to stem from the long-range temporal correlation among the measurements, although the contribution of the fat-tailed distribution function of the records is not negligible. The results of the MFDMA analysis, which are found to depend upon the location of the detrending window, tend towards the observations of the multifractal detrended fl...
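For orientation, the monofractal special case of such detrended analyses can be sketched in a few lines. This is plain DFA with per-window linear detrending, not the moving-average variant used in the paper; for uncorrelated noise the fitted exponent should be close to 0.5, while long-range correlated records give larger values:

```python
import numpy as np

def dfa(x, scales):
    """Minimal monofractal detrended fluctuation analysis:
    integrate the series, detrend in non-overlapping windows,
    and fit the scaling of the RMS fluctuation with window size."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    fluct = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    h, _ = np.polyfit(np.log(scales), np.log(fluct), 1)  # scaling exponent
    return h

rng = np.random.default_rng(42)
white = rng.normal(size=4000)
print(dfa(white, [16, 32, 64, 128, 256]))  # near 0.5 for white noise
```

The multifractal generalizations (MFDFA, MFDMA) repeat this for a family of moments q and for shuffled/surrogate series, which is how the paper separates correlation-driven from distribution-driven multifractality.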

  5. Opto-mechatronics issues in solid immersion lens based near-field recording (United States)

    Park, No-Cheol; Yoon, Yong-Joong; Lee, Yong-Hyun; Kim, Joong-Gon; Kim, Wan-Chin; Choi, Hyun; Lim, Seungho; Yang, Tae-Man; Choi, Moon-Ho; Yang, Hyunseok; Rhim, Yoon-Chul; Park, Young-Pil


    We analyzed the effects of an external shock on the collision problem in solid immersion lens (SIL)-based near-field recording (NFR) through a shock response analysis, and proposed a possible solution to this problem by adopting a protector and a safety mode. With this proposed method, collision between the SIL and the media can be avoided. We also showed a possible solution to the contamination problem in SIL-based NFR through a numerical air-flow analysis, and introduced solid immersion lens designs that increase the fabrication and assembly tolerances of an optical head with a replicated lens. Potentially, these research results could advance NFR technology towards a commercial product.

  6. Anonymization of Electronic Medical Records to Support Clinical Analysis

    CERN Document Server

    Gkoulalas-Divanis, Aris


    Anonymization of Electronic Medical Records to Support Clinical Analysis closely examines the privacy threats that may arise from medical data sharing, and surveys the state-of-the-art methods developed to safeguard data against these threats. To motivate the need for computational methods, the book first explores the main challenges facing the privacy-protection of medical data using the existing policies, practices and regulations. Then, it takes an in-depth look at the popular computational privacy-preserving methods that have been developed for demographic, clinical and genomic data sharing, and closely analyzes the privacy principles behind these methods, as well as the optimization and algorithmic strategies that they employ. Finally, through a series of in-depth case studies that highlight data from the US Census as well as the Vanderbilt University Medical Center, the book outlines a new, innovative class of privacy-preserving methods designed to ensure the integrity of transferred medical data for su...

  7. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records (United States)


    ... investigation records, investigatory material for law enforcement purposes, and confidential source information... Unclassified computer network. Vetting requests, analyses, and results will be stored separately on a classified computer network. Both computer networks and the RAM database require a user identification...

  8. Characteristics Analysis on the Medical Complications Cases Based on Front Pages of Hospitalized Medical Records

    Institute of Scientific and Technical Information of China (English)

    徐锡武; 宋景晨; 吴锁薇


    Objective: To investigate the characteristics and influencing factors of medical complication cases by analyzing the occurrence of medical complications. Methods: Complication data were collected from the front pages of medical records from January to December 2015, and the characteristics and influencing factors of medical complication cases were analyzed with binary logistic regression. Results: The incidence rates of medical complications were higher in the plastic surgery, cardiovascular surgery and neurosurgery departments, while general surgery, neurosurgery and thoracic surgery discharged the most cases with medical complications. The main types of medical complications were pulmonary infection, postoperative incision infection, postoperative fat liquefaction of the incision, postoperative anastomotic fistula and postoperative bleeding. The risk factors for medical complications included gender, age, length of stay and whether the department was a surgical one. The regression equation was LogitP = -7.994 - 0.558X1 + 0.024X2 + 0.034X3 + 1.58X4. Conclusions: Monitoring the occurrence of medical complications based on hospitalization summary reports is accurate, timely and convenient. Focusing on elderly patients, implementing infection control measures and strengthening perioperative management are effective ways to prevent and reduce medical complications during hospitalization.
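Reading the garbled "+-0.558" term in the reported equation as -0.558, the logistic model can be evaluated directly. The variable coding below (X1 gender, X2 age, X3 length of stay, X4 surgical department) follows the abstract's list of risk factors, but the exact coding is an assumption:

```python
import math

def complication_probability(x1, x2, x3, x4):
    """Evaluate the abstract's logistic model:
    logit(P) = -7.994 - 0.558*X1 + 0.024*X2 + 0.034*X3 + 1.58*X4,
    then map the logit to a probability with the sigmoid."""
    logit = -7.994 - 0.558 * x1 + 0.024 * x2 + 0.034 * x3 + 1.58 * x4
    return 1.0 / (1.0 + math.exp(-logit))

# e.g. a hypothetical 70-year-old with a 20-day stay in a surgical department
p = complication_probability(1, 70, 20, 1)
print(p)
```

The signs match the narrative: the positive coefficients on age, length of stay and surgical department raise the predicted complication risk.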

  9. Hilbert-Huang transform analysis of dynamic and earthquake motion recordings (United States)

    Zhang, R.R.; Ma, S.; Safak, E.; Hartzell, S.


    This study examines the rationale of the Hilbert-Huang transform (HHT) for analyzing dynamic and earthquake motion recordings in seismology and engineering. The paper first presents the fundamentals of the HHT method, which consists of empirical mode decomposition (EMD) and Hilbert spectral analysis. It then uses the HHT to analyze recordings of hypothetical and real wave motion, and compares the results with those obtained by the Fourier data processing technique. The analysis of the two recordings indicates that the HHT method is able to extract motion characteristics useful in seismology and engineering that might not be exposed effectively and efficiently by the Fourier technique. Specifically, the study indicates that the decomposed components in the EMD of the HHT, namely the intrinsic mode function (IMF) components, contain observable physical information inherent to the original data. It also shows that the grouped IMF components, namely the EMD-based low- and high-frequency components, can faithfully capture low-frequency pulse-like as well as high-frequency wave signals. Finally, the study illustrates that the HHT-based Hilbert spectra reveal the temporal-frequency energy distribution of motion recordings precisely and clearly.
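The Hilbert spectral step can be sketched with a standard analytic-signal computation; in full HHT this is applied to each IMF produced by EMD, which is omitted here for brevity:

```python
import numpy as np
from scipy.signal import hilbert

# Instantaneous amplitude and frequency of a narrow-band signal via the
# analytic signal -- the Hilbert-spectral half of the HHT pipeline.
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)  # a 50 Hz tone standing in for one IMF

analytic = hilbert(x)
amplitude = np.abs(analytic)                      # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)     # instantaneous frequency, Hz

print(np.median(inst_freq))  # ~50 Hz for a 50 Hz tone
```

Plotting amplitude against time and instantaneous frequency yields the temporal-frequency energy distribution (Hilbert spectrum) the abstract refers to.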

  10. Changing negative core beliefs with trial-based thought record

    Directory of Open Access Journals (Sweden)

    Thaís R. Delavechia


    Full Text Available Abstract Background: Trial-based thought record (TBTR) is a technique used in trial-based cognitive therapy (TBCT) that simulates a court trial. It was designed to restructure unhelpful core beliefs (CBs) during psychotherapy. Objective: To confirm previous findings on the efficacy of TBTR in decreasing patients' adherence to self-critical and unhelpful CBs and the corresponding emotions, and to assess the differential efficacy of the empty-chair approach relative to the static format of TBTR. Methods: Thirty-nine outpatients underwent a single 50-minute application of the TBTR technique in the empty-chair (n = 18) or conventional (n = 21) format. Patients' adherence to unhelpful CBs and the intensity of the corresponding emotions were assessed after each step of TBTR, and the results obtained in each format were compared. Results: Significant percentage reductions both in the credit given to CBs and in the intensity of the corresponding emotions were observed at the end of the session relative to baseline values (p < .001). ANCOVA also showed a significant difference in favor of the empty-chair format for both belief credit and emotion intensity (p = .04). Discussion: TBTR may help patients reduce adherence to unhelpful CBs and the corresponding emotions, and the empty-chair format seems to be more efficacious than the conventional format.

  11. Model Adequacy Analysis of Matching Record Versions in Nosql Databases

    Directory of Open Access Journals (Sweden)

    E. V. Tsviashchenko


    Full Text Available The article investigates a model of matching record versions, and the goal of this work is to analyse the adequacy of that model. The model allows estimating the distribution of a user's processing time for record versions and the distribution of the record-version count. The second variant of the model was used, according to which the time a client takes to process record versions depends explicitly on the number of updates performed by other users between the sequential updates performed by the current client. To assess the model's adequacy, a real experiment was conducted in a cloud cluster of 10 virtual nodes provided by DigitalOcean, with Ubuntu Server 14.04 as the operating system. The NoSQL system Riak was chosen for the experiments. Riak versions 2.0 and later provide the "dotted version vectors" (DVV) option, an extension of the classic vector clock. Their use guarantees that the number of versions simultaneously stored in the database will not exceed the number of clients operating in parallel on a record, which is very important when conducting experiments. The application was developed with the Java library provided by Riak, and the processes run directly on the nodes. Two records were used in the experiment: Z, the record whose versions are handled by clients, and RZ, a service record containing record-update counters. The application algorithm can be briefly described as follows: every client reads the versions of record Z, processes its updates using the RZ counters, and saves the processed record in the database while old versions are deleted from the DB. Then the client rereads the RZ record and increments the update counters for the other clients. After that, the client rereads the Z record, saves the necessary statistics, and reports the results of processing. In the case of a conflict arising from simultaneous updates of the RZ record, the client obtains all versions of that
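The version-vector machinery underlying DVVs can be sketched as follows; this is a simplified classic vector clock for illustration, not Riak's actual dotted-version-vector implementation:

```python
def descends(a, b):
    """True if clock a dominates clock b, i.e. a has seen every update in b."""
    return all(a.get(k, 0) >= v for k, v in b.items())

def merge(a, b):
    """Least upper bound of two version vectors (used after resolving a conflict)."""
    return {k: max(a.get(k, 0), b.get(k, 0)) for k in set(a) | set(b)}

# Two clients updated the record concurrently: neither clock dominates,
# so both versions must be kept (or reconciled) -- the situation DVVs bound.
v1 = {"client1": 2, "client2": 1}
v2 = {"client1": 1, "client2": 3}
conflict = not descends(v1, v2) and not descends(v2, v1)
print(conflict, merge(v1, v2))
```

The DVV refinement additionally tags each stored version with the exact event ("dot") that created it, which is what caps the number of sibling versions at the number of concurrent clients.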

  12. Coral-based climate records from tropical South Atlantic

    DEFF Research Database (Denmark)

    Pereira, Natan S.; Sial, Alcides N.; Kikuchi, Ruy K.P.;


    Coral skeletons contain records of past environmental conditions due to their long life span and well-calibrated geochemical signatures. The C and O isotope records of corals are especially interesting, because they can highlight multidecadal variability of local climate conditions beyond the instrum...


    Directory of Open Access Journals (Sweden)

    Kamil Szydło


    Full Text Available This article presents an analysis of the authors' own tests. The tests concern the accelerations, the sound pressure level and the sound power emitted by a passenger lift cabin under different technical conditions of the lift. For a group of lifting devices, accelerations were measured along three axes with an accelerometer placed in the central part of the cabin, with simultaneous measurement of the acoustic parameters using a sound analyzer equipped with a double-microphone sound probe. An attempt was made to determine the impact of the frame-cabin system construction, as well as of the lift's technical condition, on the recorded parameters. This could make it possible to establish limit values of the lift structure parameters below which riding comfort drops rapidly, and to indicate the construction elements whose modification would improve the quietness of operation.

  14. Classifying Normal and Abnormal Status Based on Video Recordings of Epileptic Patients

    Directory of Open Access Journals (Sweden)

    Jing Li


    Full Text Available Based on video recordings of the movement of patients with epilepsy, this paper proposes a human action recognition scheme to detect distinct motion patterns and to distinguish the normal status from the abnormal status of epileptic patients. The scheme first extracts local features and holistic features, which are complementary to each other; a support vector machine is then applied for classification. The experimental results show that the scheme obtains a satisfactory classification result and provides a fundamental analysis toward human-robot interaction with socially assistive robots in caring for patients with epilepsy (or other patients with brain disorders) in order to protect them from injury.
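The classification stage can be sketched as follows. The paper's local and holistic video features are not reproduced here; two well-separated synthetic clusters stand in for them, and a tiny Pegasos-style linear SVM trainer stands in for the SVM library. All names and parameters are illustrative.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=1):
    """Tiny Pegasos-style sub-gradient trainer for a linear SVM; y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            if y[i] * (X[i] @ w + b) < 1:  # margin violation: corrective step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                          # otherwise only shrink the weights
                w = (1 - eta * lam) * w
    return w, b

# Synthetic "normal" (label -1) and "abnormal" (label +1) feature vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 16)), rng.normal(3, 1, (100, 16))])
y = np.array([-1] * 100 + [1] * 100)
w, b = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {acc:.2f}")
```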

  15. Julius – a template-based supplementary electronic health record system

    Directory of Open Access Journals (Sweden)

    Klein Gunnar O


    Full Text Available Abstract Background EHR systems are widely used in hospitals and primary care centres, but it is usually difficult to share information and to collect patient data for clinical research. This is partly due to different proprietary information models and inconsistent data quality. Our objective was to provide a more flexible solution enabling clinicians to define which data are to be recorded and shared, for both routine documentation and clinical studies. It should be possible to reuse the data through a common set of variable definitions providing a consistent nomenclature and validation of data. Another objective was that the templates used for data entry and presentation should be usable in combination with existing EHR systems. Methods We designed and developed a template-based system (called Julius) that was integrated with existing EHR systems. The system is driven by medical domain knowledge defined by clinicians in the form of templates and variable definitions stored in a common data repository. The system architecture consists of three layers. The presentation layer is purely web-based, which facilitates integration with existing EHR products. The domain layer consists of the template design system, a variable/clinical concept definition system, and the transformation and validation logic, all implemented in Java. The data source layer utilizes an object-relational mapping tool and a relational database. Results The Julius system has been implemented, tested, and deployed to three health care units in Stockholm, Sweden. The initial responses from the pilot users were positive. The template system facilitates patient data collection in many ways. The experience of using the template system suggests that enabling clinicians to be in control of the system is a good way to add supplementary functionality to present EHR systems. 
Conclusion The approach of the template system in combination with various local EHR

  16. VID-R and SCAN: Tools and Methods for the Automated Analysis of Visual Records. (United States)

    Ekman, Paul; And Others

    The VID-R (Visual Information Display and Retrieval) system that enables computer-aided analysis of visual records is composed of a film-to-television chain, two videotape recorders with complete remote control of functions, a video-disc recorder, three high-resolution television monitors, a teletype, a PDP-8, a video and audio interface, three…

  17. A Way to Understand Inpatients Based on the Electronic Medical Records in the Big Data Environment

    Directory of Open Access Journals (Sweden)

    Hongyi Mao


    Full Text Available In recent decades, information technology in healthcare, such as the Electronic Medical Record (EMR) system, has shown potential to improve the service quality and cost efficiency of hospitals. The continuous use of EMR systems has generated a great amount of data. However, hospitals tend to use these data to report their operational efficiency rather than to understand their patients. Based on a dataset of inpatients' medical records from a Chinese general public hospital, this study applies a configuration analysis from a managerial perspective and explains inpatient management in a different way. Four inpatient configurations (valued patients, managed patients, normal patients, and potential patients) are identified by measures of length of stay and total hospital cost. The implications of the findings are discussed.
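One plausible reading of a two-measure configuration scheme (not the paper's exact method) is a 2x2 grid over length of stay and total cost, split at each measure's median. The label-to-quadrant mapping below is an assumption for illustration.

```python
import statistics

def classify(records):
    """Assign each record to a quadrant of the LOS x cost grid (median splits)."""
    los_med = statistics.median(r["los"] for r in records)
    cost_med = statistics.median(r["cost"] for r in records)
    labels = {
        (True, True): "valued",      # long stay, high cost (assumed mapping)
        (False, True): "managed",    # short stay, high cost
        (False, False): "normal",    # short stay, low cost
        (True, False): "potential",  # long stay, low cost
    }
    return [labels[(r["los"] > los_med, r["cost"] > cost_med)] for r in records]

patients = [
    {"los": 20, "cost": 9000}, {"los": 3, "cost": 8000},
    {"los": 4, "cost": 1200}, {"los": 15, "cost": 1500},
]
print(classify(patients))  # ['valued', 'managed', 'normal', 'potential']
```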

  19. Analysis of Direct Recordings from the Surface of the Human Brain (United States)

    Towle, Vernon L.


    , suggestive of a transiently active language network. Our findings suggest that analysis of coherence patterns can supplement visual inspection of conventional records to help identify pathological regions of cortex. With further study, it is hoped that analysis of single channel dynamics, along with analysis of multichannel lateral coherence patterns, and the functional holographic technique may allow determination of the boundaries of epileptic foci based on brief interictal recordings, possibly obviating the current need for extended monitoring of seizures.

  20. Performance analysis of a medical record exchanges model. (United States)

    Huang, Ean-Wen; Liou, Der-Ming


    Electronic medical record exchange among hospitals can provide more information for physician diagnosis and reduce the costs of duplicate examinations. In this paper, we propose and implement a medical record exchange model. In our design, exchange interface servers (EISs) allow hospitals to manage information communication through intra- and interhospital networks linked with a medical records database, while an index service center is given responsibility for managing the EISs and publishing their addresses and public keys. The prototype system can generate, parse, and transfer Health Level Seven (HL7) query messages, and can encrypt and decrypt a message using a public-key encryption algorithm. Queuing theory is applied to evaluate the performance of the proposed model: we estimated the service time for each queue of the CPU, database, and network, and measured the response time and possible bottlenecks of the model. The capacity of the model is estimated at about 4000 patients/h of medical records in a 1-MB network backbone environment, which corresponds to about 4% of the total outpatient volume in Taiwan.
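The kind of queueing estimate described above can be sketched by modelling each stage (CPU, database, network) as an independent M/M/1 queue and summing the mean response times. The arrival and service rates below are invented for illustration, not the paper's measured values.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time W = 1 / (mu - lambda) of a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: utilisation >= 1")
    return 1.0 / (service_rate - arrival_rate)

lam = 4000 / 3600.0                                # ~4000 records/h -> records/s
stages = {"cpu": 5.0, "db": 2.5, "network": 3.0}   # invented service rates, records/s
total = sum(mm1_response_time(lam, mu) for mu in stages.values())
print(f"mean end-to-end response time: {total:.2f} s")
```

The slowest stage (here the database, with the lowest service rate) dominates the total and is the model's bottleneck candidate.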

  1. Procedure for the record, calculation and analysis of costs at the Post Company of Cuba.

    Directory of Open Access Journals (Sweden)

    María Luisa Lara Zayas


    Full Text Available Cuban enterprise is undergoing important changes leading to a new economic model, which requires increasing labour productivity and improving economic efficiency through the rational use of material, financial, and human resources. The present work proposes a procedure based on cost-accounting techniques for the recording, calculation, and analysis of the costs of activities at the Post Company of Cuba in Sancti Spiritus, with the objective of achieving greater efficiency through the rational use of resources.

  2. Analysis of the Lunar Eclipse Records from the Goryeosa (United States)

    Lee, Ki-Won; Mihn, Byeong-Hee; Ahn, Young Sook; Ahn, Sang-Hyeon


    In this paper, we study the lunar eclipse records in the Goryeosa (History of the Goryeo Dynasty), an official history book of the Goryeo dynasty (A.D. 918 -- 1392). In the history book, a total of 228 lunar eclipse accounts are recorded, covering the period from 1009 to 1392. However, we find that two accounts are duplications and four accounts correspond to no known lunar eclipses around the dates. For the remaining lunar eclipses, we calculate the magnitude and the time of the eclipse at different phases using the DE406 ephemeris. Of the 222 lunar eclipse accounts, we find that the minimum penumbral magnitude was 0.5583. For eclipses which occurred after midnight, we find that some accounts were recorded on the day before the eclipse, like the astronomical records of the Joseonwangjosillok (Annals of the Joseon Dynasty), while others were on the day of the lunar eclipse. We also find that four accounts show a difference in the Julian dates between this study and that of Ahn et al., even though it is assumed that the Goryeo court did not change the dates in the accounts for lunar eclipses that occurred after midnight. With regard to the contents of the lunar eclipse accounts, we confirm that the accounts recorded as total eclipses are accurate, except for two accounts. However, both eclipses were very close to the total eclipse. We also confirm that all predicted lunar eclipses did occur, although one eclipse happened two days after the predicted date. In conclusion, we believe that this study is very helpful for investigating the lunar eclipse accounts of other periods in Korea, and furthermore, useful for verifying the calendar dates of the Goryeo dynasty.

  3. Query log analysis of an electronic health record search engine. (United States)

    Yang, Lei; Mei, Qiaozhu; Zheng, Kai; Hanauer, David A


    We analyzed a longitudinal collection of query logs from a full-text search engine designed to facilitate information retrieval in electronic health records (EHR). The collection, 202,905 queries and 35,928 user sessions recorded over a course of 4 years, represents the information-seeking behavior of 533 medical professionals, including frontline practitioners, coding personnel, patient safety officers, and biomedical researchers, for patient data stored in EHR systems. In this paper, we present descriptive statistics of the queries, a categorization of the information needs manifested through the queries, and temporal patterns of the users' information-seeking behavior. The results suggest that information needs in the medical domain are substantially more sophisticated than those that general-purpose web search engines need to accommodate. We therefore envision a significant challenge, along with significant opportunities, in providing intelligent query recommendations to facilitate information retrieval in EHR.
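The descriptive statistics reported in such a study can be sketched over a toy query log; field names and example queries below are made up for illustration.

```python
from collections import Counter

# Toy EHR search log: (session id, query text) pairs.
log = [
    {"session": "s1", "query": "discharge summary diabetes"},
    {"session": "s1", "query": "hba1c"},
    {"session": "s2", "query": "chest pain troponin"},
    {"session": "s2", "query": "troponin trend"},
    {"session": "s2", "query": "ecg report"},
]

queries_per_session = Counter(e["session"] for e in log)   # session lengths
term_freq = Counter(t for e in log for t in e["query"].split())

print(dict(queries_per_session))   # {'s1': 2, 's2': 3}
print(term_freq.most_common(1))    # [('troponin', 2)]
```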

  4. Recording, analysis, and interpretation of spreading depolarizations in neurointensive care

    DEFF Research Database (Denmark)

    Dreier, Jens P; Fabricius, Martin; Ayata, Cenk


    recorded during multimodal neuromonitoring in neurocritical care as a causal biomarker providing a diagnostic summary measure of metabolic failure and excitotoxic injury. Focal ischemia causes spreading depolarization within minutes. Further spreading depolarizations arise for hours to days due to energy...... electrocorticographic monitoring affords even remote detection of injury because spreading depolarizations propagate widely from ischemic or metabolically stressed zones; characteristic patterns, including temporal clusters of spreading depolarizations and persistent depression of spontaneous cortical activity, can...

  5. Design of a Ground Analysis Program System Based on PDA for LY05 Locomotive Voice Recording

    Institute of Scientific and Technical Information of China (English)



    The design of a ground analysis program system based on a PDA for locomotive voice recording is described, covering the design and implementation of FTP download, analysis, playback, and querying of recorded files. The similarities and differences between processing locomotive voice-recording files on a PDA and on a PC are analyzed and compared in detail. The use of the software has improved the timeliness of locomotive fault analysis.

  6. 75 FR 79312 - Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking... (United States)


    ... test reactor licensees to obtain a fingerprint- based criminal history records check before granting...; ] NUCLEAR REGULATORY COMMISSION 10 CFR Part 73 RIN 3150-AI25 Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking Unescorted Access to Research or Test Reactors...

  7. Obesity research based on the Copenhagen School Health Records Register

    DEFF Research Database (Denmark)

    Baker, Jennifer L; Sørensen, Thorkild I A


    INTRODUCTION: To summarise key findings from research performed using data from the Copenhagen School Health Records Register over the last 30 years, with a main focus on obesity-related research. The register contains computerised anthropometric information on 372,636 schoolchildren from the capital city of Denmark. Additional information on the cohort members has been obtained via linkages with population studies and national registers. RESEARCH TOPICS: Studies using data from the register have made important contributions in the areas of the aetiology of obesity, the development of the obesity epidemic, and the long-term health consequences of birth weight as well as body size and growth in childhood. CONCLUSION: Research using this unique register is ongoing, and its contributions to the study of obesity as well as other topics will continue for years to come.

  8. Projectile Base Flow Analysis (United States)


    (Only report-documentation-page fragments survive for this record: performing organization DCW Industries, Inc., La Cañada, CA; report number DCW-38-R-05; sponsoring agency U.S. Army Research Office. References include Wilcox, D. C., Turbulence Modeling for CFD, Second Edition, DCW Industries, Inc., and Wilcox, D. C. (2001), "Projectile Base Flow Analysis.")

  9. Low frequency signals analysis from broadband seismometers records (United States)

    Hsu, Po-Chin


    Broadband seismometers record signals over a wide frequency band, in which the high-frequency background noise is usually associated with human activities, such as cars, trains and factory-related activities, while the low-frequency signals are generally linked to microseisms, atmospheric phenomena and oceanic wave movement. In this study, we selected broadband seismometer data recorded during the passage of typhoons with different tracks, such as Doksuri in 2012, Trami and Kong-Rey in 2013, Hagibis and Matmo in 2014. By comparing the broadband seismic data, the meteorological information, and the marine conditions, we attempt to understand the effect of meteorological conditions on the low-frequency noise. The results show that broadband stations located along the southwestern coast of Taiwan usually have relatively high background noise levels, while inland stations are characterized by lower noise energy. This rapid decay of the noise energy with distance from the coastline suggests that the low-frequency noise could be correlated with oceanic waves. In addition, the noise energy level increases as the distance between the typhoon and the station decreases. The enhanced frequency range is 0.1~0.3 Hz, which is consistent with the effect caused by the interference of oceanic waves as suggested by previous studies. This observation indicates that the passage of a typhoon may reinforce the interaction of oceanic waves and thus influence the seismic records. The positive correlation between the significant wave height and the noise energy also supports this observation. However, we found that the noise energy is not necessarily strongest when the distance between the typhoon and the station is shortest. This phenomenon seems to be related to the typhoon path. When the typhoon track is perpendicular to the coastline, the change in noise energy is generally more significant; whereas less energy
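The spectral check behind such an analysis can be sketched by estimating the power in the 0.1-0.3 Hz microseism band with Welch's method. The trace below is synthetic (a 0.2 Hz oscillation in noise); a real analysis would read the seismometer record instead, and the sampling rate is an assumption.

```python
import numpy as np
from scipy.signal import welch

fs = 20.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 3600, 1 / fs)             # one hour of samples
rng = np.random.default_rng(0)
# Synthetic "microseism": a 0.2 Hz oscillation buried in background noise.
trace = np.sin(2 * np.pi * 0.2 * t) + 0.1 * rng.standard_normal(t.size)

f, pxx = welch(trace, fs=fs, nperseg=4096)       # Welch PSD estimate
band = (f >= 0.1) & (f <= 0.3)
band_power = pxx[band].sum() * (f[1] - f[0])     # integrate PSD over the band
print(f"0.1-0.3 Hz band power: {band_power:.3f}")
```

Tracking this band power against typhoon distance and significant wave height is the correlation the abstract describes.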

  10. Comorbidities in rheumatoid arthritis: analysis of hospital discharge records

    Directory of Open Access Journals (Sweden)

    G.S. Mela


    Full Text Available Objective: Arthritis is often associated with comorbidities. For many of them, such as hypertension, cardiovascular disease, chronic pulmonary disease, and upper gastrointestinal disease, arthritis and its treatment may also represent a risk factor. This study evaluates the frequency of comorbidities in a cohort of patients with rheumatoid arthritis (RA). Methods: The discharge diagnoses of patients with RA during the period 1 January 1997 to 31 December 2000 were retrieved from the database of the Department of Internal Medicine of the University of Genova, Italy. The diagnosis of RA was made if the first 3 digits of the patient's discharge record contained code 714 of the International Classification of Diseases, IX revision. The other diagnoses were also recorded, along with demographic data, type and duration of hospital stay, and performed procedures. Results: During the study period, 427 patients with RA were admitted to the hospital for a total of 761 admissions, which represented 2.2% of total admissions. Ninety-one (21.3%) patients did not have comorbidities, whereas 336 (78.6%) had one or more comorbidities. The most frequently observed comorbidities were cardiovascular diseases (34.6%), including hypertension (14.5%) and angina (3.5%), followed by gastrointestinal (24.5%), genito-urinary (18.7%), and respiratory (17%) diseases. There was a male predominance (p=0.004) among patients with comorbidities, who were significantly older (64.2±3.2 years vs. 57.2±4.2 years; p<0.001) and required longer hospital stays (22.7 days vs. 12.5 days; p<0.001). Conclusions: Comorbidities are present in nearly 80% of RA inpatients. Comorbidity is a good predictor of health outcome, health services utilization, and medical costs. Because RA comorbidity can act as a confounder, it should be considered in epidemiologic studies and clinical trials.

  11. Speech watermarking: an approach for the forensic analysis of digital telephonic recordings. (United States)

    Faundez-Zanuy, Marcos; Lucena-Molina, Jose J; Hagmüller, Martin


    In this article, the authors discuss the problem of forensic authentication of digital audio recordings. Although forensic audio has been addressed in several articles, the existing approaches focus on analog magnetic recordings, which are now less prevalent given the large number of digital recorders available on the market (optical, solid state, hard disks, etc.). An approach based on digital signal processing, consisting of spread spectrum techniques for speech watermarking, is presented. This approach has the advantage that authentication is based on the signal itself rather than the recording format; thus, it is valid for the usual recording devices in police-controlled telephone intercepts. In addition, the proposal allows for the introduction of relevant information such as the recording date and time and all the relevant data (this is not always possible with classical systems). The experimental results reveal that the speech watermarking procedure does not interfere in a significant way with subsequent forensic speaker identification.
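The spread-spectrum idea can be sketched in a few lines: a pseudo-noise (PN) sequence carrying one watermark bit is added to the speech signal at low amplitude, and recovered later by correlating against the same PN sequence. Real speech watermarking adds perceptual shaping and synchronisation on top; the signal here is a random stand-in for a speech frame.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 8000
speech = rng.standard_normal(n)           # stand-in for one speech frame
pn = rng.choice([-1.0, 1.0], size=n)      # shared secret PN sequence

def embed(signal, pn, bit, alpha=0.05):
    """Add the PN sequence, signed by the watermark bit, at low amplitude."""
    return signal + alpha * (1 if bit else -1) * pn

def detect(signal, pn):
    """Recover the bit from the sign of the correlation with the PN sequence."""
    return int(np.dot(signal, pn) > 0)

marked = embed(speech, pn, bit=1)
print(detect(marked, pn))  # 1
```

The correlation sums n coherent contributions of size alpha against incoherent speech noise of size sqrt(n), which is why a small alpha (low audible distortion) still detects reliably.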

  12. Electronic Health Record Systems and Intent to Apply for Meaningful Use Incentives among Office-based Physician ... (United States)

    ... Order from the National Technical Information Service NCHS Electronic Health Record Systems and Intent to Apply for ... In 2011, 57% of office-based physicians used electronic medical record/electronic health record (EMR/EHR) systems, ...

  13. A CORBA-based integration of distributed electronic healthcare records using the synapses approach. (United States)

    Grimson, J; Grimson, W; Berry, D; Stephens, G; Felton, E; Kalra, D; Toussaint, P; Weier, O W


    The ability to exchange in a meaningful, secure, and simple fashion relevant healthcare data about patients is seen as vital in the context of efficient and cost-effective shared or team-based care. The electronic healthcare record (EHCR) lies at the heart of this information exchange, and it follows that there is an urgent need to address the ability to share EHCRs or parts of records between carers and across distributed health information systems. This paper presents the Synapses approach to sharing based on a standardized shared record, the Federated Healthcare Record, which is implemented in an open and flexible manner using the Common Object Request Broker Architecture (CORBA). The architecture of the Federated Healthcare Record is based on the architecture proposed by the Technical Committee 251 of the European Committee for Standardization.

  14. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings (United States)

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill


    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  15. Foreign technology alert-bibliography: Photography and recording devices. Citations from the NTIS data base (United States)

    Wilkinson, G.


    A systematically organized collection of abstracts from a bibliographic data base is provided on reports relating to photographic, imaging and recording systems originating from countries outside the USA. A tailored search of the data base was performed and the output carefully categorized, edited and indexed. Subjects covered include: photographic devices and imaging systems (cameras, image carriers, holography and applications); audiovisual recording (digital, magnetic and video); data encoding, recording and storage; and satellite equipment. Each of the sections is cross-referenced, and there is also an author index and a useful subject index based on major descriptors.

  16. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin (United States)

    Villarini, G.; Smith, J.A.; Serinaldi, F.; Bales, J.; Bates, P.D.; Krajewski, W.F.


    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km²) show that the maximum flow with a 0.01 annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m³ s⁻¹ km⁻² to a maximum of 5.1 m³ s⁻¹ km⁻². An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m³ s⁻¹ km⁻²). Under nonstationary conditions, alternative definitions of return period should be adopted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m³ s⁻¹ km⁻² ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades. © 2009 Elsevier Ltd.
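The nonstationary idea (though not GAMLSS itself, which fits splines for several distribution parameters) can be illustrated with a Gumbel distribution whose location parameter drifts linearly with time: the annual exceedance probability of a fixed peak discharge then changes year by year. All parameter values below are invented for illustration.

```python
import math

def gumbel_exceedance(x, mu, beta):
    """P(annual max > x) for a Gumbel(mu, beta) distribution."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

x = 3.2                    # unit discharge of interest, m^3 s^-1 km^-2
beta = 0.45                # invented scale parameter
probs = {}
for year in (1957, 1980, 2007):
    mu = 1.0 + 0.02 * (year - 1957)    # invented upward drift in location
    probs[year] = gumbel_exceedance(x, mu, beta)
    print(year, f"p={probs[year]:.4f}", f"return period={1 / probs[year]:.0f} yr")
```

As the location parameter rises (e.g. with urbanisation), the same discharge becomes more probable each year and its return period shrinks, which is the qualitative pattern the abstract reports for Little Sugar Creek.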

  17. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format (United States)

    Moore, Vivianne E.


    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…

  18. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film? (United States)


    ... records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests... and cellulose-acetate base film? (a) The nitrocellulose base, a substance akin to gun cotton, is... picture film and X-ray film—nitrocellulose base). (b) Agencies must inspect cellulose-acetate...

  19. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1 (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  20. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. (United States)

    Ratwani, Raj M; Fairbanks, Rollin J; Hettinger, A Zachary; Benda, Natalie C


    The usability of electronic health records (EHRs) continues to be a point of dissatisfaction for providers, despite certification requirements from the Office of the National Coordinator that require EHR vendors to employ a user-centered design (UCD) process. To better understand factors that contribute to poor usability, a research team visited 11 different EHR vendors in order to analyze their UCD processes and discover the specific challenges that vendors faced as they sought to integrate UCD with their EHR development. Our analysis demonstrates a diverse range of vendors' UCD practices that fall into 3 categories: well-developed UCD, basic UCD, and misconceptions of UCD. Specific challenges to practicing UCD include conducting contextually rich studies of clinical workflow, recruiting participants for usability studies, and having support from leadership within the vendor organization. The results of the study provide novel insights for how to improve usability practices of EHR vendors.

  1. Single-intensity-recording optical encryption technique based on phase retrieval algorithm and QR code (United States)

    Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi


    Based on a phase retrieval algorithm and QR codes, a new optical encryption technique that needs to record only one intensity distribution is proposed. In the encryption process, a QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm, with a priori information about the QR code used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, while the decryption process is digital. In addition, the security of the proposed optical encryption technique is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks, and suitable for harsh transmission conditions.
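The double random phase encoding (DRPE) chain at the core of the scheme can be sketched numerically: the input pattern is multiplied by one random phase mask, Fourier transformed, multiplied by a second mask, and transformed again. With both masks known, decryption inverts the chain exactly; the paper's phase retrieval step instead recovers the input from the output intensity alone, which is not reproduced here. Mask values and pattern size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
qr = rng.integers(0, 2, size=(32, 32)).astype(float)    # stand-in QR pattern
m1 = np.exp(1j * rng.uniform(0, 2 * np.pi, qr.shape))   # input-plane phase mask
m2 = np.exp(1j * rng.uniform(0, 2 * np.pi, qr.shape))   # Fourier-plane phase mask

cipher = np.fft.fft2(np.fft.fft2(qr * m1) * m2)         # DRPE encryption chain
plain = np.fft.ifft2(np.fft.ifft2(cipher) / m2) / m1    # decryption with both keys

assert np.allclose(np.abs(plain), qr)                   # exact recovery
```

Without the masks, the ciphertext looks like stationary white noise, which is what makes recording only its intensity (and recovering the phase iteratively) a meaningful simplification.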

  2. Ex post power economic analysis of record of decision operational restrictions at Glen Canyon Dam.

    Energy Technology Data Exchange (ETDEWEB)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.; Loftin, S.; Osiek, B; Decision and Information Sciences; Western Area Power Administration


    On October 9, 1996, Bruce Babbitt, then-Secretary of the U.S. Department of the Interior signed the Record of Decision (ROD) on operating criteria for the Glen Canyon Dam (GCD). Criteria selected were based on the Modified Low Fluctuating Flow (MLFF) Alternative as described in the Operation of Glen Canyon Dam, Colorado River Storage Project, Arizona, Final Environmental Impact Statement (EIS) (Reclamation 1995). These restrictions reduced the operating flexibility of the hydroelectric power plant and therefore its economic value. The EIS provided impact information to support the ROD, including an analysis of operating criteria alternatives on power system economics. This ex post study reevaluates ROD power economic impacts and compares these results to the economic analysis performed prior (ex ante) to the ROD for the MLFF Alternative. On the basis of the methodology used in the ex ante analysis, anticipated annual economic impacts of the ROD were estimated to range from approximately $15.1 million to $44.2 million in terms of 1991 dollars ($1991). This ex post analysis incorporates historical events that took place between 1997 and 2005, including the evolution of power markets in the Western Electricity Coordinating Council as reflected in market prices for capacity and energy. Prompted by ROD operational restrictions, this analysis also incorporates a decision made by the Western Area Power Administration to modify commitments that it made to its customers. Simulated operations of GCD were based on the premise that hourly production patterns would maximize the economic value of the hydropower resource. On the basis of this assumption, it was estimated that economic impacts were on average $26.3 million in $1991, or $39 million in $2009.

  3. Multi-scale dynamical analysis (MSDA) of sea level records versus PDO, AMO, and NAO indexes

    CERN Document Server

    Scafetta, Nicola


    Herein I propose a multi-scale dynamical analysis to facilitate the physical interpretation of tide gauge records. The technique uses graphical diagrams. It is applied to six secular-long tide gauge records representative of the world oceans: Sydney, Pacific coast of Australia; Fremantle, Indian Ocean coast of Australia; New York City, Atlantic coast of USA; Honolulu, U.S. state of Hawaii; San Diego, U.S. state of California; and Venice, Mediterranean Sea, Italy. For comparison, an equivalent analysis is applied to the Pacific Decadal Oscillation (PDO) index and to the Atlantic Multidecadal Oscillation (AMO) index. Finally, a global reconstruction of sea level and a reconstruction of the North Atlantic Oscillation (NAO) index are analyzed and compared: both sequences cover about three centuries from 1700 to 2000. The proposed methodology quickly highlights oscillations and teleconnections among the records at the decadal and multidecadal scales. At the secular time scales tide gauge records present relatively...

  4. Validity of recalled v. recorded birth weight: a systematic review and meta-analysis


    Shenkin, S. D.; Zhang, M.G.; De, G.; Mathur, S.; Mina, T.H.; Reynolds, R. M.


    Low birth weight is associated with adverse health outcomes. If birth weight records are not available, studies may use recalled birth weight. It is unclear whether this is reliable. We performed a systematic review and meta-analysis of studies comparing recalled with recorded birth weights. We followed the Meta-Analyses of Observational Studies in Epidemiology (MOOSE) statement and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We searched MEDLINE, EM...

  5. Design of a system based on DSP and FPGA for video recording and replaying (United States)

    Kang, Yan; Wang, Heng


    This paper brings forward a video recording and replaying system based on the architecture of a Digital Signal Processor (DSP) and a Field Programmable Gate Array (FPGA). The system achieves encoding, recording, decoding and replaying of Video Graphics Array (VGA) signals that are displayed on a monitor during the navigation of airplanes and ships. In this architecture, the DSP is the main processor, used for the large amount of complicated calculation required during digital signal processing. The FPGA is a coprocessor for preprocessing video signals and implementing logic control in the system. In the hardware design of the system, the Peripheral Device Transfer (PDT) function of the External Memory Interface (EMIF) is utilized to implement a seamless interface among the DSP, the synchronous dynamic RAM (SDRAM) and the First-In-First-Out (FIFO) buffer in the system. This transfer mode avoids a bottleneck in data transfer and simplifies the circuitry between the DSP and its peripheral chips. The DSP's EMIF and two level-matching chips are used to implement the Advanced Technology Attachment (ATA) protocol on the physical layer of the interface to an Integrated Drive Electronics (IDE) Hard Disk (HD), which provides high-speed data access and does not rely on a computer. The main functions of the logic on the FPGA are described, and screenshots of the behavioral simulation are provided in this paper. In the design of the program on the DSP, Enhanced Direct Memory Access (EDMA) channels are used to transfer data between the FIFO and the SDRAM without intervention by the CPU, preserving the CPU's high computing performance and saving time. JPEG2000 is implemented to obtain high fidelity in video recording and replaying. Ways and means of achieving high code performance are briefly presented. The data processing ability of the system is desirable, and the smoothness of the replayed video is acceptable. By right of its design flexibility and reliable operation, the system based on DSP and FPGA

  6. Validity of a hospital-based obstetric register using medical records as reference

    DEFF Research Database (Denmark)

    Brixval, Carina Sjöberg; Thygesen, Lau Caspar; Johansen, Nanna Roed;


    and validity of a hospital-based clinical register - the Obstetric Database - using a national register and medical records as references. METHODS: We assessed completeness of a hospital-based clinical register - the Obstetric Database - by linking data from all women registered in the Obstetric Database...... as having given birth in 2013 to the National Patient Register with coverage of all births in 2013. Validity of eleven selected indicators from the Obstetric Database was assessed using medical records as the gold standard. Using a random sample of 250 medical records, we calculated the proportion of agreement......, sensitivity, specificity, and positive and negative predictive values for each indicator. Two assessors independently reviewed medical records, and inter-rater reliability was calculated as the proportion of agreement and Cohen's κ coefficient. RESULTS: We found 100% completeness of the Obstetric Database when...

  7. Book Recommendation Using Machine Learning Methods Based on Library Loan Records and Bibliographic Information


    Tsuji, Keita; Yoshikane, Fuyuki; Sato, Sho; Itsumura, Hiroshi


    In this paper, we propose a method to recommend Japanese books to university students through machine learning modules based on several features, including library loan records. We determine the most effective method among the ones that used (a) a support vector machine (SVM), (b) a random forest, and (c) Adaboost. Furthermore, we assess the most effective combination of relevant features among (1) the association rules derived from library loan records, (2) book titles, (3) Nippon Decimal Classif...

  8. DrCell – A Software Tool for the Analysis of Cell Signals Recorded with Extracellular Microelectrodes

    Directory of Open Access Journals (Sweden)

    Christoph Nick


    Microelectrode arrays (MEAs) have been applied for in vivo and in vitro recording and stimulation of electrogenic cells, namely neurons and cardiac myocytes, for almost four decades. Extracellular recordings using the MEA technique inflict minimal adverse effects on cells and enable long-term applications such as implants in brain or heart tissue. Hence, MEAs constitute a powerful tool for studying the processes of learning and memory, investigating the pharmacological impact of drugs, and probing the fundamentals of the electrical interface between novel electrode materials and biological tissue. Yet in order to study the areas mentioned above, powerful signal processing and data analysis tools are necessary. In this paper a novel toolbox for the offline analysis of cell signals is presented that allows a variety of parameters to be detected and analyzed. We developed an intuitive graphical user interface (GUI) that enables users to perform high-quality data analysis. The presented MATLAB®-based toolbox gives the opportunity to examine a multitude of parameters, such as spike and neural burst timestamps, network bursts, heart beat frequency and signal propagation for cardiomyocytes, signal-to-noise ratio, and many more. Additionally, a spike-sorting tool is included, offering a powerful solution for cases of multiple cell recordings on a single microelectrode. For stimulation purposes, artifacts caused by the stimulation signal can be removed from the recording, allowing the detection of field potentials as early as 5 ms after the stimulation.
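Spike detection of the kind implemented by such toolboxes is commonly done by thresholding against a robust noise estimate. The sketch below is a generic Python illustration of that idea, not code from DrCell; the threshold factor and refractory window are typical defaults, not the toolbox's actual settings.

```python
import numpy as np

def detect_spikes(trace, fs, k=5.0, refractory_ms=2.0):
    """Threshold-crossing spike detection as commonly used for MEA recordings.
    The threshold is k times the robust noise estimate sigma = median(|x|)/0.6745;
    a refractory window suppresses double counts of the same spike."""
    sigma = np.median(np.abs(trace)) / 0.6745
    thr = k * sigma
    above = np.flatnonzero(np.abs(trace) > thr)
    spikes, last = [], -np.inf
    gap = int(refractory_ms * 1e-3 * fs)
    for i in above:
        if i - last > gap:
            spikes.append(i)
            last = i
    return np.array(spikes, dtype=int)

# Synthetic demo: unit-variance noise with three injected large spikes.
rng = np.random.default_rng(0)
trace = rng.standard_normal(100_000)
trace[[10_000, 50_000, 90_000]] += 12.0
spikes = detect_spikes(trace, fs=25_000, k=6.0)
```

Because the median-based estimator ignores the (rare) spike samples, the threshold adapts to the noise floor rather than to spike amplitude.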

  9. Modelling and Analysis of Electrical Potentials Recorded in Microelectrode Arrays (MEAs). (United States)

    Ness, Torbjørn V; Chintaluri, Chaitanya; Potworowski, Jan; Łęski, Szymon; Głąbska, Helena; Wójcik, Daniel K; Einevoll, Gaute T


    Microelectrode arrays (MEAs), substrate-integrated planar arrays of up to thousands of closely spaced metal electrode contacts, have long been used to record neuronal activity in in vitro brain slices with high spatial and temporal resolution. However, the analysis of the MEA potentials has generally been mainly qualitative. Here we use a biophysical forward-modelling formalism based on the finite element method (FEM) to establish quantitatively accurate links between neural activity in the slice and potentials recorded in the MEA set-up. Then we develop a simpler approach based on the method of images (MoI) from electrostatics, which allows for computation of MEA potentials by simple formulas similar to what is used for homogeneous volume conductors. As we find MoI to give accurate results in most situations of practical interest, including anisotropic slices covered with highly conductive saline and MEA-electrode contacts of sizable physical extensions, a Python software package (ViMEAPy) has been developed to facilitate forward-modelling of MEA potentials generated by biophysically detailed multicompartmental neurons. We apply our scheme to investigate the influence of the MEA set-up on single-neuron spikes as well as on potentials generated by a cortical network comprising more than 3000 model neurons. The generated MEA potentials are substantially affected by both the saline bath covering the brain slice and a (putative) inadvertent saline layer at the interface between the MEA chip and the brain slice. We further explore methods for estimation of current-source density (CSD) from MEA potentials, and find the results to be much less sensitive to the experimental set-up.
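The method-of-images idea behind MoI can be illustrated with a minimal sketch (a generic physics illustration, not ViMEAPy; the conductivity and geometry values are assumptions). For a point current source in a semi-infinite homogeneous slice over an insulating MEA chip at z = 0, the insulating boundary is handled by adding a mirror source of equal sign below the plane, which exactly doubles the potential recorded at the chip surface relative to an infinite homogeneous medium.

```python
import numpy as np

def moi_potential(src, elec, I=1e-9, sigma=0.3):
    """Potential (V) at electrode position `elec` on an insulating plane (z = 0)
    from a point current source `src` (A) in a semi-infinite medium of
    conductivity sigma (S/m), via the method of images."""
    src, elec = np.asarray(src, float), np.asarray(elec, float)
    r = np.linalg.norm(src - elec)
    mirror = src * np.array([1.0, 1.0, -1.0])   # image source below the chip
    r_img = np.linalg.norm(mirror - elec)
    return I / (4 * np.pi * sigma) * (1.0 / r + 1.0 / r_img)

# Source 50 um above an electrode at the origin: the image term doubles
# the potential, giving I / (2 * pi * sigma * h).
phi = moi_potential([0, 0, 50e-6], [0, 0, 0])
```

In the paper's full setting, the highly conductive saline above the slice contributes further image terms, which is what makes the chip-slice-saline sandwich more involved than this two-term sketch.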

  10. [Infrequent arrhythmia episodes diagnosed by a smartphone-based event recorder]. (United States)

    Pontoppidan, Jacob; Sandgaard, Niels Christian; Brandes, Axel; Johansen, Jens Brock


    Smartphone-based ECG monitor devices are a new promising tool for rhythm detection in patients with palpitations. We present a case where a young patient with infrequent arrhythmia episodes was diagnosed with atrial fibrillation using this novel smartphone-based event recorder.

  11. Secure Management of Personal Health Records by Applying Attribute-Based Encryption

    NARCIS (Netherlands)

    Ibraimi, Luan; Asim, Muhammad; Petkovic, Milan


    The confidentiality of personal health records is a major problem when patients use commercial Web-based systems to store their health data. Traditional access control mechanisms, such as Role-Based Access Control, have several limitations with respect to enforcing access control policies and ensuri

  12. Deciphering the record of short-term base-level changes in Gilbert-type deltas (United States)

    Gobo, Katarina; Ghinassi, Massimiliano; Nemec, Wojciech


    -front accommodation driven by short-term base-level changes, with some accompanying inevitable 'noise' in the facies record due to the system's autogenic variability and regional climatic fluctuations. Comparison of coeval foreset and toeset/bottomset deposits in a delta further shows a reverse pattern of reciprocal changes in facies assemblages, with the TFA assemblage of foreset deposits passing downdip into a DFA assemblage of delta-foot deposits, and the DFA assemblage of foreset deposits passing downdip into a TFA assemblage. This reverse reciprocal alternation of TFA and DFA facies assemblages is attributed to the delta slope's own morphodynamics. When the delta slope is dominated by deposition of debris flows, only the most diluted turbulent flows and chute-bypassing turbidity currents reach the delta-foot zone. When the delta slope is dominated by turbiditic sedimentation, larger chutes and gullies form, triggering and conveying debris flows to the foot zone. These case studies as a whole shed new light on the varying pattern of subaqueous sediment dispersal processes in an evolving Gilbert-type deltaic system and point to the attractive possibility of recognizing a 'hidden' record of base-level changes on the basis of detailed facies analysis.

  13. Analysis of patent value evaluation and recorded value based on real option theory

    Institute of Scientific and Technical Information of China (English)



      This paper introduces real option theory and its application to patent valuation, briefly analyzes the advantages and disadvantages of the approach, and presents a suitable formula for evaluating the value of a patent.

  14. Patients covertly recording clinical encounters: threat or opportunity? A qualitative analysis of online texts.

    Directory of Open Access Journals (Sweden)

    Maka Tsulukidze

    The phenomenon of patients covertly recording clinical encounters has generated controversial media reports. This study aims to examine the phenomenon and analyze the underlying issues. We conducted a qualitative analysis of online posts, articles, blogs, and forums ("texts") discussing patients covertly recording clinical encounters. Using the Google and Google Blog search engines, we identified and analyzed 62 eligible texts published in multiple countries between 2006 and 2013. Thematic analysis revealed four key themes: (1) a new behavior that elicits strong reactions, both positive and negative; (2) an erosion of trust; (3) shifting patient-clinician roles and relationships; and (4) the existence of confused and conflicting responses. When patients covertly record clinical encounters - a behavior made possible by various digital recording technologies - strong reactions are evoked among a range of stakeholders. The behavior represents one consequence of an erosion of trust between patients and clinicians and, when discovered, leads to further deterioration of trust. Confused and conflicting responses to the phenomenon by patients and clinicians highlight the need for policy guidance. This study describes strong reactions, both positive and negative, to the phenomenon of patients covertly recording clinical encounters. The availability of smartphones capable of digital recording, and shifting attitudes to patient-clinician relationships, seem to have led to this behavior, which is mostly viewed as a threat by clinicians but as a welcome and helpful innovation by some patients, possibly indicating a perception of subordination and a lack of empowerment. Further examination of this tension and its implications is needed.

  15. Social science and linguistic text analysis of nurses' records: a systematic review and critique. (United States)

    Buus, Niels; Hamilton, Bridget Elizabeth


    The two aims of the paper were to systematically review and critique social science and linguistic text analyses of nursing records in order to inform future research in this emerging area of research. Systematic searches in reference databases and in citation indexes identified 12 articles that included analyses of the social and linguistic features of records and recording. Two reviewers extracted data using established criteria for the evaluation of qualitative research papers. A common characteristic of nursing records was the economical use of language with local meanings that conveyed little information to the uninitiated reader. Records were dominated by technocratic-medical discourse focused on patients' bodies, and they depicted only very limited aspects of nursing practice. Nurses made moral evaluations in their categorisation of patients, which reflected detailed surveillance of patients' disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More probing empirical research of nurses' records and a wider range of theoretical perspectives has the potential to expose the situated meanings of nursing work in healthcare organisations.

  16. A Lower Rhine flood chronology based on the sedimentary record of an abandoned channel fill (United States)

    Toonen, W. H. J.; Winkels, T. G.; Prins, M. A.; de Groot, L. V.; Bunnik, F. P. M.; Cohen, K. M.


    The Bienener Altrhein is an abandoned channel of the Lower Rhine (Germany). Following a late 16th century abandonment event, the channel was disconnected from the main stream and the oxbow lake gradually filled with 8 meters of flood deposits. This process still continues today. During annual floods, a limited proportion of overbank discharge is routed across the oxbow lake. Large floods produce individual flood layers, which are visually recognized in the sedimentary sequence. Based on the sedimentary characteristics of these event layers, we created a ~450-year flood chronology for the Lower Rhine. Laser-diffraction grain size measurements were used to assess relative flood magnitudes for individual flood event layers. Continuous sampling at a ~2 cm interval provided a high-resolution record, resolving the record at an annual scale. Standard descriptive techniques (e.g., mean grain size, 95th percentile, % sand) and the more advanced 'end member modelling' were applied to zoom in on the coarse particle bins in the grain size distributions, which are indicative of higher flow velocities. The most recent part of the record was equated to modern discharge measurements. This allows us to establish relations between the grain size characteristics of the deposits in the abandoned channel and flood magnitudes in the main river. This relation can also be applied to flood event layers from previous centuries, for which only water level measurements and historical descriptions exist. This makes the method relevant for extending the data series used in flood frequency analysis from 100 years to more than 400 years. To date event layers in the rapidly accumulated sequence, we created an age-depth model that uses organic content variations to tune sedimentation rates between the known basal and top ages. No suitable identifiable organic material for radiocarbon dating was found in the cores. Instead, palynological results (introduction of agricultural species) and palaeomagnetic secular

  17. Sea-level probability for the last deglaciation: A statistical analysis of far-field records (United States)

    Stanford, J. D.; Hemingway, R.; Rohling, E. J.; Challenor, P. G.; Medina-Elizalde, M.; Lester, A. J.


    Pulses of ice-sheet meltwater into the world ocean during the last deglaciation are of great current interest, because these large-scale events offer important test-beds for numerical models of the responses of ocean circulation and climate to meltwater addition. The largest such event has become known as meltwater pulse (mwp) 1a, with estimates of about 20 m of sea-level rise in about 500 years. A second meltwater pulse (mwp-1b) has been inferred from some sea-level records, but its existence has become debated following the presentation of additional records. Even the use of the more ubiquitous mwp-1a in modelling studies has been compromised by debate about its exact age, based upon perceived discrepancies between far-field sea-level records. It is clear that an objective investigation is needed to determine to what level inferred similarities and/or discrepancies between the various deglacial sea-level records are statistically rigorous (or not). For that purpose, we present a Monte Carlo style statistical analysis to determine the highest-probability sea-level history from six key far-field deglacial sea-level records, which fully accounts for realistic methodological and chronological uncertainties in all these records, and which is robust with respect to removal of individual component datasets. We find that sea-level rise started to accelerate into the deglaciation from around 17 ka BP. Within the deglacial rise, there were two distinct increases; one at around the timing of the Bølling warming (14.6 ka BP), and another, much broader, event that just post-dates the end of the Younger Dryas (11.3 ka BP). We interpret these as mwp-1a and mwp-1b, respectively. We find that mwp-1a occurred between 14.3 ka BP and 12.8 ka BP. Highest rates of sea-level rise occurred at ~ 13.8 ka, probably (67% confidence) within the range of 100-130 cm/century, although values may have been as high as 260 cm/century (99% confidence limit). 
Mwp-1b is robustly expressed as a broad
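The Monte Carlo style of such an analysis can be sketched generically: perturb each record's ages and elevations within their stated uncertainties many times, interpolate each realization onto a common age grid, and take percentiles of the ensemble. The sketch below is entirely illustrative; the data points and uncertainty model are invented and are not the paper's records.

```python
import numpy as np

def mc_sea_level(ages, depths, age_err, depth_err, grid, n=2000, seed=0):
    """Monte Carlo envelope of a sea-level curve: jitter each data point
    within its chronological and vertical uncertainty, then return the
    2.5th, 50th and 97.5th percentile curves on a common age grid."""
    rng = np.random.default_rng(seed)
    sims = np.empty((n, len(grid)))
    for i in range(n):
        a = ages + rng.normal(0, age_err, size=ages.shape)
        d = depths + rng.normal(0, depth_err, size=depths.shape)
        order = np.argsort(a)                 # np.interp needs ascending ages
        sims[i] = np.interp(grid, a[order], d[order])
    return np.percentile(sims, [2.5, 50, 97.5], axis=0)

# Invented deglacial-style data: age (ka BP) vs relative sea level (m).
ages = np.array([17.0, 15.0, 14.0, 13.0, 11.0, 9.0])
rsl = np.array([-110.0, -95.0, -80.0, -65.0, -50.0, -30.0])
lo, med, hi = mc_sea_level(ages, rsl, age_err=0.3, depth_err=2.0,
                           grid=np.linspace(9, 17, 50))
```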

  18. Remote heartbeat signal detection from visible spectrum recordings based on blind deconvolution (United States)

    Kaur, Balvinder; Moses, Sophia; Luthra, Megha; Ikonomidou, Vasiliki N.


    While recent advances have shown that it is possible to acquire a signal equivalent to the heartbeat from visual spectrum video recordings of the human skin, extracting the heartbeat's exact timing information from it, for the purpose of heart rate variability analysis, remains a challenge. In this paper, we explore two novel methods to estimate the remote cardiac signal peak positions, aiming at a close representation of the R-peaks of the ECG signal. The first method is based on curve fitting (CF) using a modified filtered least mean square (LMS) optimization and the second method is based on system estimation using blind deconvolution (BDC). To prove the efficacy of the developed algorithms, we compared results obtained with the ground truth (ECG) signal. Both methods achieved a low relative error between the peaks of the two signals. This work, performed under an IRB approved protocol, provides initial proof that blind deconvolution techniques can be used to estimate timing information of the cardiac signal closely correlated to the one obtained by traditional ECG. The results show promise for further development of a remote sensing of cardiac signals for the purpose of remote vital sign and stress detection for medical, security, military and civilian applications.

  19. Analysis of Switchable Spin Torque Oscillator for Microwave Assisted Magnetic Recording

    Directory of Open Access Journals (Sweden)

    Mingsheng Zhang


    A switchable spin torque oscillator (STO) with a negative magnetic anisotropy oscillation layer for microwave assisted magnetic recording is analyzed theoretically and numerically. The equations for finding the STO frequency and oscillation angle are derived from the Landau-Lifshitz-Gilbert (LLG) equation with the spin torque term in spherical coordinates. The theoretical analysis shows that the STO oscillating frequency remains the same and the oscillation direction reverses after the switching of the magnetization of the spin polarization layer under an applied alternating magnetic field. Numerical analysis based on the derived equations shows that the oscillation angle increases with the increase of the negative anisotropy energy density (absolute value) but decreases with the increase of spin current, the polarization of conduction electrons, the saturation magnetization, and the total applied magnetic field in the z direction. The STO frequency increases with the increase of spin current, the polarization of conduction electrons, and the negative anisotropy energy density (absolute value) but decreases with the increase of the saturation magnetization and the total applied magnetic field in the z direction.
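For reference, the macrospin LLG equation with a Slonczewski spin-torque term, from which such frequency and oscillation-angle equations are typically derived, has the standard form below; the notation here is the conventional one and may differ in detail from the paper's.

```latex
\frac{d\mathbf{m}}{dt}
  = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
  + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
  + \frac{\gamma\,\hbar\,P\,J}{2 e\,\mu_0 M_s\, d}\,
    \mathbf{m}\times(\mathbf{m}\times\mathbf{p})
```

Here m is the unit magnetization of the oscillation layer, H_eff the effective field (including the negative anisotropy field), alpha the Gilbert damping, p the spin-polarization direction, J the current density, P the polarization of the conduction electrons, M_s the saturation magnetization, and d the layer thickness. Transforming this equation to spherical coordinates and seeking steady precession yields the frequency and oscillation-angle relations the abstract describes.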

  20. Development of an algorithm for heartbeats detection and classification in Holter records based on temporal and morphological features (United States)

    García, A.; Romano, H.; Laciar, E.; Correa, R.


    In this work a detection and classification algorithm for heartbeat analysis in Holter records was developed. First, a QRS complex detector was implemented and the temporal and morphological characteristics of each beat were extracted. A vector was built with these features; this vector is the input of the classification module, which is based on discriminant analysis. The beats were classified into three groups: Premature Ventricular Contraction beat (PVC), Atrial Premature Contraction beat (APC) and Normal Beat (NB). These beat categories represent the most important groups for commercial Holter systems. The developed algorithms were evaluated on 76 ECG records from two validated open-access databases, the "MIT-BIH Arrhythmia Database" and the "MIT-BIH Supraventricular Arrhythmia Database". A total of 166343 beats were detected and analyzed, and the QRS detection algorithm provides a sensitivity of 99.69 % and a positive predictive value of 99.84 %. The classification stage gives sensitivities of 97.17 % for NB, 97.67 % for PVC and 92.78 % for APC.
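The discriminant-analysis stage can be sketched with a minimal linear discriminant classifier over two hypothetical beat features (RR interval and QRS width). The feature values, cluster structure and class layout below are illustrative inventions, not data or code from the paper.

```python
import numpy as np

def lda_fit(X, y):
    """Fit a linear discriminant model: per-class means, a pooled
    within-class covariance shared by all classes, and class priors."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    Sw = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1) for c in classes)
    Sw /= len(y) - len(classes)                      # pooled covariance
    priors = {c: np.mean(y == c) for c in classes}
    return classes, means, np.linalg.inv(Sw), priors

def lda_predict(model, X):
    """Assign each feature vector to the class with the highest
    linear discriminant score."""
    classes, means, Sw_inv, priors = model
    scores = np.stack([X @ Sw_inv @ means[c]
                       - 0.5 * means[c] @ Sw_inv @ means[c]
                       + np.log(priors[c]) for c in classes], axis=1)
    return classes[np.argmax(scores, axis=1)]

# Hypothetical clusters: NB (normal), PVC (wide QRS), APC (short RR).
rng = np.random.default_rng(1)
centers = [(1.00, 0.08), (0.95, 0.14), (0.60, 0.08)]   # (RR s, QRS s)
X = np.vstack([rng.normal(c, 0.01, size=(40, 2)) for c in centers])
y = np.repeat(np.arange(3), 40)
model = lda_fit(X, y)
pred = lda_predict(model, X)
```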

  1. A Kinect-based system for automatic recording of some pigeon behaviors. (United States)

    Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M


    Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results allow us to demonstrate that an advantage of the Kinect-based approach is that it can also be reliably used to measure activity other than key pecking.

  2. Time and spectral analysis methods with machine learning for the authentication of digital audio recordings. (United States)

    Korycki, Rafal


    This paper addresses the problem of tampering detection and discusses new methods that can be used for the authenticity analysis of digital audio recordings. Nowadays, the only method for digital audio files commonly approved by forensic experts is the electric network frequency (ENF) criterion. It consists of analyzing fluctuations of the mains frequency induced in the electronic circuits of recording devices. Its effectiveness is therefore strictly dependent on the presence of the mains signal in the recording, which is a rare occurrence. This article presents the existing methods of time and spectral analysis along with modifications proposed by the author, involving spectral analysis of the residual signal enhanced by machine learning algorithms. The effectiveness of the tampering detection methods described in this paper is tested on a predefined music database. The results are compared graphically using ROC-like curves. Furthermore, time-frequency plots are presented and enhanced by the reassignment method for the purpose of visual inspection of modified recordings. This solution enables the analysis of minimal changes in background sounds, which may indicate tampering.
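The core measurement behind the ENF criterion, tracking the mains-hum frequency over time so that discontinuities in the track can reveal edits, can be sketched as follows. This is a generic illustration, not the author's implementation; the frame length, search band and 50 Hz nominal frequency are assumptions.

```python
import numpy as np

def enf_track(x, fs, mains=50.0, frame_s=4.0, search=1.0):
    """Estimate the electric-network-frequency trajectory of a recording by
    locating the spectral peak near the nominal mains frequency in each
    frame; parabolic interpolation refines the peak below bin resolution."""
    n = int(frame_s * fs)
    win = np.hanning(n)
    freqs = np.fft.rfftfreq(n, 1 / fs)
    band = (freqs > mains - search) & (freqs < mains + search)
    track = []
    for start in range(0, len(x) - n + 1, n):
        mag = np.abs(np.fft.rfft(x[start:start + n] * win))
        k = np.flatnonzero(band)[np.argmax(mag[band])]
        a, b, c = mag[k - 1], mag[k], mag[k + 1]
        delta = 0.5 * (a - c) / (a - 2 * b + c)      # parabolic peak offset
        track.append(freqs[k] + delta * (freqs[1] - freqs[0]))
    return np.array(track)

# Synthetic recording: 20 s of weak mains hum at 50.05 Hz.
fs = 1000
t = np.arange(0, 20, 1 / fs)
hum = 0.01 * np.sin(2 * np.pi * 50.05 * t)
est = enf_track(hum, fs)
```

In forensic use the recovered track is compared against a reference ENF database; abrupt jumps or mismatches in the trajectory suggest insertion or deletion of material.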

  3. Geometric data perturbation-based personal health record transactions in cloud computing. (United States)

    Balasubramaniam, S; Kavitha, V


    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
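Geometric data perturbation in this sense can be sketched as a random orthogonal transform plus translation plus light noise: distances between records are (approximately) preserved, so distance-based mining still works on the outsourced data while the raw values are hidden. This is an illustrative toy, not the paper's exact scheme; the noise scale is an assumption.

```python
import numpy as np

def geometric_perturb(X, rng):
    """Perturb records by a random orthogonal transform (rotation or
    reflection), a random translation, and small additive noise."""
    d = X.shape[1]
    Q, R = np.linalg.qr(rng.normal(size=(d, d)))
    Q *= np.sign(np.diag(R))          # fix column signs for a uniform draw
    t = rng.normal(size=d)
    noise = rng.normal(scale=1e-3, size=X.shape)
    return X @ Q + t + noise

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))          # 10 records, 4 numeric attributes
Xp = geometric_perturb(X, rng)
```

The orthogonal-plus-translation part preserves pairwise distances exactly; only the small noise term perturbs them, which is the accuracy/privacy trade-off such schemes tune.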

  4. Validity of recalled v. recorded birth weight: a systematic review and meta-analysis. (United States)

    Shenkin, S D; Zhang, M G; Der, G; Mathur, S; Mina, T H; Reynolds, R M


    Low birth weight is associated with adverse health outcomes. If birth weight records are not available, studies may use recalled birth weight. It is unclear whether this is reliable. We performed a systematic review and meta-analysis of studies comparing recalled with recorded birth weights. We followed the Meta-Analyses of Observational Studies in Epidemiology (MOOSE) statement and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We searched MEDLINE, EMBASE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) to May 2015. We included studies that reported recalled birth weight and recorded birth weight. We excluded studies investigating a clinical population. Two reviewers independently reviewed citations, extracted data, assessed risk of bias. Data were pooled in a random effects meta-analysis for correlation and mean difference. In total, 40 studies were eligible for qualitative synthesis (n=78,997 births from 78,196 parents). Agreement between recalled and recorded birth weight was high: pooled estimate of correlation in 23 samples from 19 studies (n=7406) was 0.90 [95% confidence interval (CI) 0.87-0.93]. The difference between recalled and recorded birth weight in 29 samples from 26 studies (n=29,293) was small [range -86-129 g; random effects estimate 1.4 g (95% CI -4.0-6.9 g)]. Studies were heterogeneous, with no evidence for an effect of time since birth, person reporting, recall bias, or birth order. In post-hoc subgroup analysis, recall was higher than recorded birth weight by 80 g (95% CI 57-103 g) in low and middle income countries. In conclusion, there is high agreement between recalled and recorded birth weight. If birth weight is recalled, it is suitable for use in epidemiological studies, at least in high income countries.
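The random-effects pooling used in such meta-analyses can be sketched with the DerSimonian-Laird estimator. The study effects and variances below are made-up numbers for illustration; they are not the review's data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird): estimate the
    between-study variance tau^2 from Cochran's Q, re-weight studies by
    1/(v_i + tau^2), and return the estimate with a 95% CI."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1 / v                                   # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - fixed) ** 2)      # heterogeneity statistic
    k = len(effects)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)                       # random-effects weights
    est = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return est, est - 1.96 * se, est + 1.96 * se

# Hypothetical per-study mean differences (g) and squared standard errors.
effects = np.array([12.0, -40.0, 5.0, 80.0, 1.4])
variances = np.array([25.0, 100.0, 9.0, 400.0, 4.0])
est, lo, hi = dersimonian_laird(effects, variances)
```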

  5. Patients covertly recording clinical encounters: threat or opportunity? A qualitative analysis of online texts

    NARCIS (Netherlands)

    Tsulukidze, M.; Grande, S.W.; Thompson, R.; Rudd, K.; Elwyn, G.


    BACKGROUND: The phenomenon of patients covertly recording clinical encounters has generated controversial media reports. This study aims to examine the phenomenon and analyze the underlying issues. METHODS AND FINDINGS: We conducted a qualitative analysis of online posts, articles, blogs, and forums

  6. Noninvasive method for electrocardiogram recording in conscious rats: feasibility for heart rate variability analysis

    Directory of Open Access Journals (Sweden)

    Pedro P. Pereira-Junior


    Heart rate variability (HRV) analysis is a well-established tool for the assessment of cardiac autonomic control, both in humans and in animal models. Conventional methods for HRV analysis in rats rely on conscious-state electrocardiogram (ECG) recording based on prior invasive surgical procedures for electrode/transmitter implants. The aim of the present study was to test a noninvasive and inexpensive method for ECG recording in conscious rats, assessing its feasibility for HRV analysis. A custom-made elastic cotton jacket was developed to fit the rat's mean thoracic circumference, with two platinum electrodes attached on its inner surface, allowing ECG to be recorded noninvasively in conscious, restrained rats (n=6). Time- and frequency-domain HRV analyses were conducted under basal and autonomic blockade conditions. High-quality ECG signals were obtained and were feasible for HRV analysis. As expected, mean RR interval was significantly decreased in the presence of atropine (p

  7. Fetal QRS extraction from abdominal recordings via model-based signal processing and intelligent signal merging. (United States)

    Haghpanahi, Masoumeh; Borkholder, David A


Noninvasive fetal ECG (fECG) monitoring has potential applications in diagnosing congenital heart diseases in a timely manner and in helping clinicians make more appropriate decisions during labor. However, despite advances in signal processing and machine learning techniques, the analysis of fECG signals remains at a preliminary stage. In this work, we describe an algorithm to automatically locate QRS complexes in noninvasive fECG signals obtained from a set of four electrodes placed on the mother's abdomen. The algorithm is based on an iterative decomposition of the maternal and fetal subspaces and filtering of the maternal ECG (mECG) components from the fECG recordings. Once the maternal components are removed, a novel merging technique is applied to merge the signals and detect the fetal QRS (fQRS) complexes. The algorithm was trained and tested on the fECG datasets provided by the PhysioNet/CinC Challenge 2013. The final results indicate that the algorithm is able to detect fetal peaks for a variety of signals with different morphologies and strength levels encountered in clinical practice.
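
The subspace decomposition itself is beyond the scope of an abstract, but the final stage of any QRS detector, picking peaks from a cleaned signal, can be sketched as a threshold rule with a refractory period (the parameter values and the synthetic signal below are illustrative assumptions, not the authors' algorithm):

```python
def detect_peaks(signal, fs, threshold, refractory_s=0.25):
    """Indices of local maxima above `threshold`, at least
    `refractory_s` seconds apart (a physiological minimum spacing)."""
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks

# Synthetic "residual" with fetal-rate spikes every 0.4 s at fs = 250 Hz
fs = 250
signal = [0.0] * 1000
for p in range(50, 1000, 100):
    signal[p] = 1.0
fqrs = detect_peaks(signal, fs, threshold=0.5)
```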

  8. 49 CFR 1544.230 - Fingerprint-based criminal history records checks (CHRC): Flightcrew members. (United States)


    ... 49 Transportation 9 2010-10-01 2010-10-01 false Fingerprint-based criminal history records checks (CHRC): Flightcrew members. 1544.230 Section 1544.230 Transportation Other Regulations Relating to Transportation (Continued) TRANSPORTATION SECURITY ADMINISTRATION, DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY AIRCRAFT OPERATOR...

  9. IMASIS computer-based medical record project: dealing with the human factor. (United States)

    Martín-Baranera, M; Planas, I; Palau, J; Sanz, F


level, problems to be solved in utilization of the system, errors detected in the system's database, and personal interest in participating in the IMASIS project. The questionnaire was also intended to be a tool to monitor IMASIS evolution. Our study showed that the medical staff lacked information about the current HIS, leading to poor utilization of some system options. Another major finding, related to the above, was the feeling that the project would negatively affect the organization of work at the hospitals. A computer-based medical record was feared to degrade the physician-patient relationship, introduce a supplementary administrative burden into clinicians' day-to-day work, unnecessarily slow history taking, and imply too-rigid patterns of work. The most frequent problems in using the current system fell into two groups: problems related to lack of agility and consistency in the user interface design, and those derived from the lack of a common patient identification number. Duplication of medical records was the error most frequently detected by physicians. Analysis of physicians' attitudes towards IMASIS revealed a general lack of confidence, probably the consequence of two current features: a lack of complete information about IMASIS possibilities and the problems faced when using the system. To deal with these factors, three types of measures have been planned. First, an effort is to be made to ensure that every physician is able to use the current system adequately and understands the long-term benefits of the project. This task will be better accomplished through personal interaction between clinicians and a physician from the Informatics Department than through formal teaching of IMASIS. Secondly, a protocol for evaluating the HIS is being developed and will be applied systematically to detect both database errors and the system's design pitfalls. Finally, the IMASIS project has to find a convenient point for starting, to offer short-term re

  10. The (Anomalous) Hall Magnetometer as an Analysis Tool for High Density Recording Media

    NARCIS (Netherlands)

    Haan, de S.; Lodder, J.C.


    In this work an evaluation tool for the characterization of high-density recording thin film media is discussed. The measurement principles are based on the anomalous and the planar Hall effect. We used these Hall effects to characterize ferromagnetic Co-Cr films and Co/Pd multilayers having perpend

  11. In Pursuit of Reciprocity: Researchers, Teachers, and School Reformers Engaged in Collaborative Analysis of Video Records (United States)

    Curry, Marnie W.


    In the ideal, reciprocity in qualitative inquiry occurs when there is give-and-take between researchers and the researched; however, the demands of the academy and resource constraints often make the pursuit of reciprocity difficult. Drawing on two video-based, qualitative studies in which researchers utilized video records as resources to enhance…

  12. An evaluation of a teaching package constructed using a Web-based lecture recorder


    Segal, Judith


    This paper reports on an evaluation of a teaching package constructed using Audiograph, a Web-based lecture recorder developed at the University of Surrey. Audiograph is described in detail in Jesshope and Shafarenko (1997). Its developer aims to provide a medium by which multimedia teaching packages, based on traditional university lectures, may be developed rapidly by the lecturer(s) concerned (as opposed to professional CAL developers) at low cost. Audiograph is designed so that developmen...

  13. An integrable, web-based solution for easy assessment of video-recorded performances

    DEFF Research Database (Denmark)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars


Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems…, and access to this information should be restricted to select personnel. A local software solution may also ease the need for customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video-recorded performances (ISEA).

  14. A new photopolymerizable holographic recording material based on acrylamide and N-hydroxymethyl acrylamide

    Institute of Scientific and Technical Information of China (English)

    Gong Qiao-Xia; Wang Su-Lian; Huang Ming-Ju; Gan Fu-Xi


A new polyvinylalcohol-based photopolymeric holographic recording material has been developed. The recording is obtained by the copolymerization of acrylamide and N-hydroxymethyl acrylamide. Diffraction efficiencies near 50% are obtained with an exposure energy of 80 mJ/cm². N-hydroxymethyl acrylamide can improve the optical quality of the film: with increasing concentration of N-hydroxymethyl acrylamide, the flatness of the film increases, scattering is reduced, and the image is clearer, with smaller distortion. The postexposure effect on the grating is also studied. The diffraction efficiency of the grating increases further during postexposure, as a gradient of monomer remains after exposure.

  15. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns (United States)

    Frost, J. D., Jr.; Salamy, J. G.


The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  16. Development of a peer review system using patient records for outcome evaluation of medical education: reliability analysis. (United States)

    Kameoka, Junichi; Okubo, Tomoya; Koguma, Emi; Takahashi, Fumie; Ishii, Seiichi; Kanatsuka, Hiroshi


    In addition to input evaluation (education delivered at school) and output evaluation (students' capability at graduation), the methods for outcome evaluation (performance after graduation) of medical education need to be established. One approach is a review of medical records, which, however, has been met with difficulties because of poor inter-rater reliability. Here, we attempted to develop a peer review system of medical records with high inter-rater reliability. We randomly selected 112 patients (and finally selected 110 after removing two ineligible patients) who visited (and were hospitalized in) one of the four general hospitals in the Tohoku region of Japan between 2008 and 2012. Four reviewers, who were well-trained general internists from outside the Tohoku region, visited the hospitals independently and evaluated outpatient medical records based on an evaluation sheet that consisted of 14 items (3-point scale) for record keeping and 15 items (5-point scale) for quality of care. The mean total score was 84.1 ± 7.7. Cronbach's alpha for these items was 0.798. Single measure and average measure intraclass correlations for the reviewers were 0.733 (95% confidence interval: 0.720-0.745) and 0.917 (95% confidence interval: 0.912-0.921), respectively. An exploratory factor analysis revealed six factors: history taking, physical examination, clinical reasoning, management and outcome, rhetoric, and patient relationship. In conclusion, we have developed a peer review system of medical records with high inter-rater reliability, which may enable us, with further validity analysis, to measure quality of patient care as an outcome evaluation of medical education in the future.
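
For reference, Cronbach's alpha as used in this reliability analysis can be computed in a few lines; the ratings matrix below is a toy example, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items[i][j] is the score given on item i
    for subject j (sample variance with ddof = 1 throughout)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(col) for col in zip(*items)]  # per-subject total scores
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Toy matrix: 3 items rated for 4 subjects
alpha = cronbach_alpha([[3, 4, 5, 2], [2, 4, 5, 3], [3, 5, 4, 2]])
```

Values near 0.8, as in the study, indicate good internal consistency of the evaluation sheet.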

  17. Characteristics of solar diurnal variations: a case study based on records from the ground magnetic observatory at Vassouras, Brazil

    CERN Document Server

    Klausner, Virginia; Mendes, Odim; Domingues, Margarete O; Frick, Peter


The horizontal component amplitudes observed by ground-based observatories of the INTERMAGNET network have been used to analyze the global pattern variance of the solar diurnal variations. Records from the magnetic stations contain gaps, so we analyzed them via a time-frequency gapped wavelet algorithm. After computing the gapped wavelet transform, we performed a wavelet cross-correlation analysis, which was useful to isolate the period of the spectral components of the geomagnetic field at each of the selected magnetic stations and to correlate them, as a function of scale (period), with the low-latitude Vassouras Observatory, Rio de Janeiro, Brazil, which is under the influence of the South Atlantic Magnetic Anomaly (SAMA) and should be used as a reference for an under-construction Brazilian network of magnetic observatories. The results show that the records at the magnetic stations have a latitudinal dependence affected by the season of the year and by the level of solar activity. We have found a disparity on ...

  18. Functional recordings from awake, behaving rodents through a microchannel based regenerative neural interface (United States)

    Gore, Russell K.; Choi, Yoonsu; Bellamkonda, Ravi; English, Arthur


    group of awake and behaving animals. These unique findings provide preliminary evidence that efferent, volitional motor potentials can be recorded from the microchannel-based peripheral neural interface; a critical requirement for any neural interface intended to facilitate direct neural control of external technologies.

  19. Polarization holographic recording in thin films of pure azopolymer and azopolymer based hybrid materials (United States)

    Berberova, N.; Daskalova, D.; Strijkova, V.; Kostadinova, D.; Nazarova, D.; Nedelchev, L.; Stoykova, E.; Marinova, V.; Chi, C. H.; Lin, S. H.


    Recently, a birefringence enhancement effect was observed in azopolymers doped with various nanoparticles. The paper presents comparison between the parameters of polarization holographic gratings recorded in a pure azopolymer PAZO (Poly[1-[4-(3-carboxy-4-hydroxyphenylazo) benzenesulfonamido]-1,2-ethanediyl, sodium salt]) and in a hybrid PAZO-based organic/inorganic material with incorporated ZnO nanoparticles of size less than 50 nm. Laser emitting at 491 nm is used for the holographic recording. Along with the anisotropic grating in the volume of the media, surface relief is also formed. Gratings with different spatial frequencies are obtained by varying the recording angle. The time dependence of the diffraction efficiency is probed at 635 nm and the height of the relief gratings is determined by AFM. Our results indicate that both the diffraction efficiency and the height of the surface relief for the hybrid samples are enhanced with respect to the pure azopolymer films.

  20. Smart Card Based Integrated Electronic Health Record System For Clinical Practice

    Directory of Open Access Journals (Sweden)

    N. Anju Latha


Smart cards are used in information technologies as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems became popular due to their increased capacity and performance. Smart cards are used as an Electronic Health Record (EHR); their efficient use, with easy and fast data access, has led to particularly widespread implementation in hospitals. In this paper, a smart-card-based integrated Electronic Health Record system is developed. The system uses the smart card for personal identification and transfer of health data, and provides data communication. In addition to personal information, general health information about the patient is also loaded onto the patient's smart card. Health care providers use smart cards to access the data on patient cards. Electronic health records have a number of advantages over paper records: they improve accuracy, the quality of patient care, efficiency, and productivity, and reduce cost. In the present work we measure clinical parameters of the patient such as blood pressure, blood glucose (diabetes mellitus), and pulse oximetry, and store the health details in the Electronic Health Record. The system has been successfully tested and implemented.

  1. Template-based automatic recognition of birdsong syllables from continuous recordings. (United States)

    Anderson, S E; Dave, A S; Margoliash, D


The application of dynamic time warping (DTW) to the automated analysis of continuous recordings of animal vocalizations is evaluated. The DTW algorithm compares an input signal with a set of predefined templates representative of categories chosen by the investigator. It directly compares signal spectrograms, and identifies constituents and constituent boundaries, thus permitting the identification of a broad range of signals and signal components. When applied to vocalizations of an indigo bunting (Passerina cyanea) and a zebra finch (Taeniopygia guttata) collected from a low-clutter, low-noise environment, the recognizer identifies syllables in stereotyped songs and calls with greater than 97% accuracy. Syllables of the more variable and lower-amplitude indigo bunting plastic song are identified with approximately 84% accuracy. Under restricted recording conditions, this technique apparently has general applicability to the analysis of a variety of animal vocalizations and can dramatically decrease the amount of time spent on manual identification of vocalizations.
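
The recognizer compares spectrograms, but the core DTW recurrence is the same for any sequences; a minimal scalar-sequence sketch of the dynamic-programming step (illustrative, not the paper's spectrogram-based implementation):

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Dynamic time warping distance between sequences a and b."""
    inf = float("inf")
    n, m = len(a), len(b)
    # cost[i][j]: best alignment cost of a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(a[i - 1], b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match step
    return cost[n][m]

# A time-warped copy of a sequence aligns with zero cost
warped = dtw_distance([0, 1, 2, 1, 0], [0, 1, 1, 2, 2, 1, 0])
```

This tolerance to local stretching and compression is what lets a single template match syllables sung at slightly different tempos.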

  2. Temporomandibular joint sounds: a critique of techniques for recording and analysis. (United States)

    Widmer, C G


    Sonography, or the graphic recording of sounds, has been proposed as an objective measure of various pathological conditions in the temporomandibular joint. Various electronic devices have been developed to enhance our ability to auscultate the joint, monitor the timing of the sounds with jaw movement, and analyze the characteristics of the sound; the intent of these devices is to diagnose the intracapsular condition "objectively." This review paper critically evaluates the advantages and limitations of this technique. Based on the existing literature, these instruments can record sounds; however, the origin of these sounds is uncertain, since room noise, skin and hair sounds, respiration, arterial blood flow, and cross-over noises from the opposite TMJ have not been excluded as possible artifacts of the recording. More important, the diagnostic specificity, as an indicator of each type of TMJ disease, has not been clearly and consistently demonstrated with the sonographic technique.

  3. The Private Communications of Magnetic Recording under Socialism (Retrospective Disco Analysis

    Directory of Open Access Journals (Sweden)

    Oleg Vladimir Sineokij


The article analyzes the formation and development of a general model of rare sound recordings in the structure of the institutions of social communication. The author considers the psycho-communicative features of filophone (record-collecting) communication as a special type of interaction in the field of entertainment, and studies the causes and conditions of the tape-recording subculture in the USSR. The dynamics of disco-communication are traced from the limited information conditions of socialism to modern high-tech conditions. At the end of the article the author, drawing on achievements in the field of advanced technology systems and the innovative revival of the music-recording industry, sets out a basic concept of popular-music recording as a special informational and legal institution, looking in retrospect from theory and practice to the future needs of the information society.

  4. Factors affecting the quality of sound recording for speech and voice analysis. (United States)

    Vogel, Adam P; Morgan, Angela T


    The importance and utility of objective evidence-based measurement of the voice is well documented. Therefore, greater consideration needs to be given to the factors that influence the quality of voice and speech recordings. This manuscript aims to bring together the many features that affect acoustically acquired voice and speech. Specifically, the paper considers the practical requirements of individual speech acquisition configurations through examining issues relating to hardware, software and microphone selection, the impact of environmental noise, analogue to digital conversion and file format as well as the acoustic measures resulting from varying levels of signal integrity. The type of recording environment required by a user is often dictated by a variety of clinical and experimental needs, including: the acoustic measures being investigated; portability of equipment; an individual's budget; and the expertise of the user. As the quality of recorded signals is influenced by many factors, awareness of these issues is essential. This paper aims to highlight the importance of these methodological considerations to those previously uninitiated with voice and speech acoustics. With current technology, the highest quality recording would be made using a stand-alone hard disc recorder, an independent mixer to attenuate the incoming signal, and insulated wiring combined with a high quality microphone in an anechoic chamber or sound treated room.

  5. [Hardware and software for EMG recording and analysis of respiratory muscles of human]. (United States)

    Solnushkin, S D; Chakhman, V N; Segizbaeva, M O; Pogodin, M A; Aleksandrov, V G


This paper presents a new hardware and software system that not only records the EMG of different groups of respiratory muscles, but also performs amplitude-frequency analysis of it, which makes it possible to determine changes in each muscle's contribution to the work of breathing and to detect early signs of respiratory muscle fatigue. The presented system can be used for functional diagnostics of breathing in patients, healthy people, and athletes.

  6. Signals embedded in the OBS records, in light of Gabor Spectral Analysis (United States)

    Chang, T.; Wang, Y.; Chang, C.; Lee, C.


Over the last decades, seismological surveys have expanded into marine areas, with the goal of making up for the deficiency of seismogenic studies beyond land. Although teleseismic data can resolve plate boundary locations and certain seismic parameters of great earthquakes, the local seismogenic frame can only be revealed by a seismic network in situ. The Ocean Bottom Seismometer (OBS), therefore, has been developed for this purpose and is becoming an important facility for seismological study. This work introduces a synthesized spectral method to analyze the seismograms recorded by 15 OBSs deployed at the Okinawa Trough over 14 days (Nov. 19 - Dec. 2, 2003). The geological background of the Okinawa Trough is well known to correspond with back-arc spreading in the regime of the Philippine Sea plate subducting northward beneath the Eurasia plate. Owing to complex influences at the sea bottom, for instance strong currents, slope slumping, turbidite flows, and even sea-animal attack, the OBS seismograms show rather noisy sequences in comparison with records on land. However, hundreds of tectonic earthquakes can be extracted from such noisy records (done by Drs. Lin and Sibuet). Our job is to sort out the signals with distinguishable sources by means of a systematic spectral analysis. The continuous wavelet transform and the short-term Fourier transform, both taking a Gaussian function as kernel, are synthesized as the Gabor transform in data processing. The use of a limited Gaussian window along the time axis with negligible low-frequency error can largely enhance the stability of the discrete Fourier spectrum. With a proper window-factor selection, the Gabor transform can improve the resolution of the spectrogram in the time domain. We have converted the OBS records into spectrograms to detect the variation of signal causes. To date, some tremor signals and strong current oscillations have been told apart from these continuous records by their varied frequency composition. We anticipate the further
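
The Gabor transform named here is a Fourier transform under a sliding Gaussian window; a direct (slow but explicit) sketch of computing a single coefficient, with invented test parameters:

```python
import cmath
import math

def gabor_coefficient(signal, fs, t0, freq, sigma):
    """One Gabor-transform coefficient: the DFT of `signal` weighted by
    a Gaussian window of width `sigma` (s) centred at time `t0` (s)."""
    acc = 0.0 + 0.0j
    for n, x in enumerate(signal):
        t = n / fs
        w = math.exp(-((t - t0) ** 2) / (2 * sigma ** 2))
        acc += x * w * cmath.exp(-2j * math.pi * freq * t)
    return acc / fs  # approximates the continuous-time integral

# A 5 Hz tone should dominate the 20 Hz bin at the window centre
fs = 100
sig = [math.sin(2 * math.pi * 5 * n / fs) for n in range(2 * fs)]
m5 = abs(gabor_coefficient(sig, fs, t0=1.0, freq=5, sigma=0.2))
m20 = abs(gabor_coefficient(sig, fs, t0=1.0, freq=20, sigma=0.2))
```

Sweeping `t0` and `freq` over a grid yields the spectrogram used to separate tremor signals from current oscillations.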

  7. Paleosecular variation during the PCRS based on a new database of sedimentary and volcanic records (United States)

    Haldan, M. M.; Langereis, C. G.; Evans, M. E.


We present a paleosecular variation study using a generalised global paleomagnetic sedimentary and volcanic database. We made use of all available (and suitable) published and some new sedimentary and volcanic paleomagnetic records corresponding to the Permo-Carboniferous Reversed Superchron (PCRS) interval to reanalyse all data. We focused on records with a sufficient number of samples, and acquired, whenever possible, the original data or, as a second choice, parametrised published site means. Analysis of these paleomagnetic data in terms of the latitude variation of the scatter of the virtual geomagnetic poles (VGPs) suggests that careful data selection is required and that some of the older studies may need to be redone using more modern methods, both in terms of sampling and laboratory treatment. In addition, high-latitude records (southern and especially northern hemisphere) are notably lacking in the literature. Transitional data are removed using a VGP cut-off angle that varies with latitude. We also use our extended sedimentary records from Permian red beds from the Lodève and Dôme de Barrot basins (S. France), a new detailed paleomagnetic study of the Permian volcanics in the Oslo graben (Norway), as well as new data from Carboniferous-Permian sediments from the Donbas basin (Ukraine). We compare our results with those from published paleosecular variation models and with recent (re)analyses of VGP scatter during different periods of the geological archive.
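
The VGP scatter analysed in such studies is the root-mean-square angular deviation of the VGPs from a reference pole; a pure-Python sketch with invented poles (the iterative cut-off procedure is omitted):

```python
import math

def angular_distance(lat1, lon1, lat2, lon2):
    """Great-circle angle in degrees between two poles."""
    p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
    c = (math.sin(p1) * math.sin(p2)
         + math.cos(p1) * math.cos(p2) * math.cos(l2 - l1))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def vgp_scatter(vgps, ref_pole):
    """Scatter S = sqrt(sum of squared angular deviations / (N - 1))."""
    ss = sum(angular_distance(lat, lon, *ref_pole) ** 2
             for lat, lon in vgps)
    return math.sqrt(ss / (len(vgps) - 1))

# Two VGPs, each 5 degrees from the geographic pole
s = vgp_scatter([(85.0, 0.0), (85.0, 180.0)], (90.0, 0.0))
```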

  8. Role-based access control through on-demand classification of electronic health record. (United States)

    Tiwari, Basant; Kumar, Abhay


Electronic health records (EHRs) provide a convenient method to exchange patients' medical information between different healthcare providers. The access control mechanism in healthcare services governs which users are authorised to access EHRs. Role-based access control helps restrict EHRs to users in a certain role. Significant work has been carried out on access control over the last decade, but little emphasis has been given to on-demand role-based access control. The presented work achieves access control through physical data isolation, which is more robust and secure. We propose an algorithm in which a selective combination of policies for each user of the EHR database is defined. We extend the well-known data mining technique of classification to group EHRs with respect to a given role. The algorithm works by taking the various roles as classes and defining their features as a vector; these features are used as a feature vector for classification to describe user authority.
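
A deny-by-default role policy check, the baseline that such classification-driven schemes build on, can be sketched in a few lines (the roles and record categories below are invented examples, not the paper's policy set):

```python
# Hypothetical role policies: which record categories each role may read
ROLE_POLICIES = {
    "physician": {"diagnosis", "lab_results", "prescriptions"},
    "nurse": {"vitals", "prescriptions"},
    "billing": {"invoices"},
}

def can_access(role, record_category):
    """Grant access only if the category appears in the role's policy;
    unknown roles get an empty policy (deny by default)."""
    return record_category in ROLE_POLICIES.get(role, set())
```

In the paper's approach, the category of each EHR is assigned by a classifier rather than stored explicitly, but the final access decision takes this form.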

  9. Cost-efficient measurement strategies for posture observations based on video recordings. (United States)

    Mathiassen, Svend Erik; Liv, Per; Wahlström, Jens


Assessment of working postures by observation is common practice in ergonomics. The present study investigated whether monetary resources invested in a video-based posture observation study should preferably be spent on collecting many video recordings of the work and having them observed once by one observer, or on having multiple observers rate postures repeatedly from fewer videos. The study addressed this question from a practitioner's perspective by focusing on two plausible scenarios: documenting the mean exposure of one individual, and of a specific occupational group. Using a data set of observed working postures among hairdressers, empirical values of posture variability, observer variability, and the costs of recording and observing one video were entered into equations expressing the total cost of data collection and the information (defined as 1/SD) provided by the resulting estimates of two variables: percentage of time with the arm elevated 90°. Sixteen measurement strategies involving 1-4 observers repeating their posture ratings 1-4 times were examined for budgets up to €2000. For both posture variables, and in both the individual and the group scenario, the most cost-efficient strategy at any specific budget was to engage 3-4 observers and/or have the observer(s) rate postures multiple times each. Between 17% and 34% less information was produced when using the commonly practiced approach of having one observer rate a number of video recordings once each. We therefore recommend that observational posture assessment be based on video recordings of work, since this allows for multiple observations, and that monetary resources be allocated to repeated observations rather than to many video recordings.
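
The trade-off the study quantifies can be illustrated with a simple two-component variance model: the SD of a mean-exposure estimate shrinks with more videos (n) and more ratings per video (r), while cost grows with both. All variance components and unit costs below are invented for illustration, not the study's empirical values:

```python
import math

# Invented variance components and unit costs (euros)
S2_VIDEO, S2_RATING = 10.0, 60.0   # between-video and rating-error variance
C_VIDEO, C_RATING = 50.0, 10.0     # cost per recording / per single rating

def information(n_videos, n_ratings):
    """Information = 1/SD of the estimated mean exposure."""
    var = S2_VIDEO / n_videos + S2_RATING / (n_videos * n_ratings)
    return 1.0 / math.sqrt(var)

def cost(n_videos, n_ratings):
    return n_videos * (C_VIDEO + n_ratings * C_RATING)

# Two strategies at the same budget of 1980 euros:
many_once = information(33, 1)    # many videos, each rated once
few_repeat = information(22, 4)   # fewer videos, each rated four times
```

With rating error dominating, the repeated-ratings strategy yields more information at equal cost, mirroring the study's conclusion.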

  10. Cloud-based Electronic Health Records for Real-time, Region-specific Influenza Surveillance. (United States)

    Santillana, M; Nguyen, A T; Louie, T; Zink, A; Gray, J; Sung, I; Brownstein, J S


    Accurate real-time monitoring systems of influenza outbreaks help public health officials make informed decisions that may help save lives. We show that information extracted from cloud-based electronic health records databases, in combination with machine learning techniques and historical epidemiological information, have the potential to accurately and reliably provide near real-time regional estimates of flu outbreaks in the United States.


    Directory of Open Access Journals (Sweden)

    O. Yu. Kydashev


This paper presents a detailed description of the implementation of an agglomerative clustering system for speech segments based on the Bayesian information criterion. Results of numerical experiments with different acoustic features, as well as with full and diagonal covariance matrices, are given. A diarization error rate (DER) of 6.4% on audio recordings of radio «Svoboda» was achieved with the designed system.

  12. Displacement spectra and displacement modification factors, based on records from Greece


    Athanassiadou, C. J.; Karakostas, C. Z.; Margaris, B. N.; Kappos, A. J.


    Elastic and inelastic displacement spectra (for periods up to 4.0 s) are derived, using a representative sample of acceleration records from Greece, carefully selected based on magnitude, distance and peak ground acceleration criteria, and grouped into three ground type categories according to the Eurocode 8 (EC8) provisions. The modification factor for the elastic design spectrum adopted in EC8 for accounting for damping is verified herein and is found to be satisfactory in the short to medi...

  13. Natural Language Processing Based Instrument for Classification of Free Text Medical Records

    Directory of Open Access Journals (Sweden)

    Manana Khachidze


According to the Ministry of Labor, Health and Social Affairs of Georgia, a new health management system is to be introduced in the near future. In this context arises the problem of structuring and classifying documents containing the entire history of medical services provided. The present work introduces an instrument for the classification of medical records in the Georgian language; it is the first attempt at such classification of Georgian-language medical records. In total, 24,855 examination records have been studied. The documents were classified into three main groups (ultrasonography, endoscopy, and X-ray) and 13 subgroups using two well-known methods: Support Vector Machine (SVM) and K-Nearest Neighbor (KNN). The results obtained demonstrated that both machine learning methods performed successfully, with SVM slightly ahead. In the process of classification a "shrink" method, based on feature selection, was introduced and applied. At the first stage of classification the results of the "shrink" case were better; however, at the second stage of classification into subclasses, 23% of all documents could not be linked to a single definite subclass (liver or biliary system) due to common features characterizing these subclasses. The overall results of the study were successful.
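
A KNN text classifier of the kind evaluated can be sketched with word-count vectors and cosine similarity (the toy records and labels below are invented; a real system would use proper Georgian tokenization and weighted features):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two token-count vectors."""
    dot = sum(c * b[t] for t, c in a.items() if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(train, text, k=3):
    """train: list of (document, label) pairs. Majority vote among
    the k training documents most similar to `text`."""
    q = Counter(text.split())
    ranked = sorted(train, reverse=True,
                    key=lambda d: cosine(Counter(d[0].split()), q))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

train = [
    ("liver ultrasound scan normal echo", "ultrasonography"),
    ("gallbladder ultrasound echo stones", "ultrasonography"),
    ("chest x-ray opacity lung", "x-ray"),
    ("bone x-ray fracture radiograph", "x-ray"),
]
label = knn_classify(train, "liver ultrasound echo")
```

The "shrink" step described in the abstract would correspond to pruning the token vocabulary to the most discriminative features before computing similarities.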

  14. Paper-Based Medical Records: the Challenges and Lessons Learned from Studying Obstetrics and Gynaecological Post-Operation Records in a Nigerian Hospital

    Directory of Open Access Journals (Sweden)

    Adekunle Yisau Abdulkadir


AIM: With the background knowledge that auditing of medical records (MR) for adequacy and completeness is necessary if they are to be useful and reliable in continuing patient care, in protecting the legal interests of the patient, physicians, and the hospital, and in meeting the requirements of research, we scrutinized the theatre records of our hospital to identify routine omissions or deficiencies and correctable errors in our MR system. METHOD: Obstetrics and gynaecological post-operation theatre records between January 2006 and December 2008 were quantitatively and qualitatively analyzed for details that included: hospital number; patient's age; diagnosis; surgery performed; type and mode of anesthesia; date of surgery; patient's ward; anesthetists' names; surgeons' and attending nurses' names; and abbreviations used, with SPSS 15.0 for Windows. RESULTS: Hardly any of the 1270 surgeries during the study period were documented without an omission or an abbreviation. Hospital numbers and patients' ages were not documented in 21.8% (n=277) and 59.1% (n=750) respectively. Diagnoses and surgeries were recorded with varying abbreviations in about 96% of instances. Surgical team names were mostly abbreviated or given as initials only. CONCLUSION: To improve the quality of paper-based medical records, regular auditing, training and good orientation of medical personnel in good record practices, and discouraging large-volume record books (to reduce paper damage and sheet loss from handling) are necessary, else what we record today may be neither useful nor available tomorrow. [TAF Prev Med Bull 2010; 9(5): 427-432]

  15. A Brief Tool to Assess Image-Based Dietary Records and Guide Nutrition Counselling Among Pregnant Women: An Evaluation (United States)

    Ashman, Amy M; Collins, Clare E; Brown, Leanne J; Rae, Kym M


    Background Dietitians ideally should provide personally tailored nutrition advice to pregnant women. Provision is hampered by a lack of appropriate tools for nutrition assessment and counselling in practice settings. Smartphone technology, through the use of image-based dietary records, can address limitations of traditional methods of recording dietary intake. Feedback on these records can then be provided by the dietitian via smartphone. Efficacy and validity of these methods requires examination. Objective The aims of the Australian Diet Bytes and Baby Bumps study, which used image-based dietary records and a purpose-built brief Selected Nutrient and Diet Quality (SNaQ) tool to provide tailored nutrition advice to pregnant women, were to assess relative validity of the SNaQ tool for analyzing dietary intake compared with nutrient analysis software, to describe the nutritional intake adequacy of pregnant participants, and to assess acceptability of dietary feedback via smartphone. Methods Eligible women used a smartphone app to record everything they consumed over 3 nonconsecutive days. Records consisted of an image of the food or drink item placed next to a fiducial marker, with a voice or text description, or both, providing additional detail. We used the SNaQ tool to analyze participants’ intake of daily food group servings and selected key micronutrients for pregnancy relative to Australian guideline recommendations. A visual reference guide consisting of images of foods and drinks in standard serving sizes assisted the dietitian with quantification. Feedback on participants’ diets was provided via 2 methods: (1) a short video summary sent to participants’ smartphones, and (2) a follow-up telephone consultation with a dietitian. Agreement between dietary intake assessment using the SNaQ tool and nutrient analysis software was evaluated using Spearman rank correlation and Cohen kappa. Results We enrolled 27 women (median age 28.8 years, 8 Indigenous
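
    Agreement statistics of the kind reported above (Spearman rank correlation between the SNaQ tool and nutrient analysis software, Cohen kappa for categorical agreement) can be sketched in pure Python. The servings estimates and adequacy categories below are invented for illustration, not the study's data.

```python
from collections import Counter

def ranks(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two categorical ratings."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[c] * c2[c] for c in c1) / (n * n)
    return (po - pe) / (1 - pe)

# Daily servings estimated by the tool vs. the software (hypothetical):
tool = [2, 4, 3, 5, 1, 4]
software = [2, 5, 3, 4, 1, 4]
print(round(spearman(tool, software), 2))
# Adequacy categories from the two methods (hypothetical):
print(round(cohen_kappa(["ok", "low", "ok", "ok"], ["ok", "low", "low", "ok"]), 2))
```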

  16. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording

    Directory of Open Access Journals (Sweden)

    Marco Crescentini


    High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. An HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in 0.35 μm CMOS technology. The CMOS amplifier acquires currents in the ranges ±200 pA and ±20 nA, with 100 kHz bandwidth, while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from the Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at the ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to validate both the platform and the current-to-digital converter.
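
    The quoted noise density translates directly into an input-referred RMS noise and a full-scale signal-to-noise ratio. A minimal sketch, assuming an ideally flat (white) noise density across the 100 kHz bandwidth (the real amplifier's spectrum will not be perfectly flat):

```python
from math import sqrt, log10

def rms_noise(density_a_per_rthz, bandwidth_hz):
    """RMS input-referred current noise for a flat (white) density."""
    return density_a_per_rthz * sqrt(bandwidth_hz)

noise = rms_noise(4e-15, 100e3)   # 4 fA/sqrt(Hz) over 100 kHz
full_scale = 200e-12              # +/-200 pA range
snr_db = 20 * log10(full_scale / noise)
print(f"RMS noise: {noise * 1e12:.2f} pA, SNR at full scale: {snr_db:.1f} dB")
```

    This works out to roughly 1.3 pA RMS, i.e. about 44 dB of dynamic range at the ±200 pA setting.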

  17. Evaluation of listener-based anuran surveys with automated audio recording devices (United States)

    Shearin, A. F.; Calhoun, A.J.K.; Loftin, C.S.


    Volunteer-based audio surveys are used to document long-term trends in anuran community composition and abundance. Current sampling protocols, however, are not region- or species-specific and may not detect relatively rare or audibly cryptic species. We used automated audio recording devices to record calling anurans during 2006–2009 at wetlands in Maine, USA. We identified species calling, chorus intensity, time of day, and environmental variables when each species was calling and developed logistic and generalized mixed models to determine the time interval and environmental variables that optimize detection of each species during peak calling periods. We detected eight of nine anurans documented in Maine. Individual recordings selected from the sampling period (0.5 h past sunset to 0100 h) described in the North American Amphibian Monitoring Program (NAAMP) detected fewer species than were detected in recordings from 30 min past sunset until sunrise. Time of maximum detection of presence and full chorusing for three species (green frogs, mink frogs, pickerel frogs) occurred after the NAAMP sampling end time (0100 h). The NAAMP protocol’s sampling period may result in omissions and misclassifications of chorus sizes for certain species. These potential errors should be considered when interpreting trends generated from standardized anuran audio surveys.
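
    Detection models of the kind described above can be sketched as a single-predictor logistic regression fitted by gradient ascent. The detection data below are invented for illustration and are not the study's, which used logistic and generalized mixed models with several environmental covariates.

```python
from math import exp

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit P(detect) = 1/(1+exp(-(b0 + b1*x))) by batch gradient ascent."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Hours past sunset vs. whether a late-calling species was detected
# (illustrative values only):
hours = [0.5, 1, 2, 3, 4, 5, 6, 7]
detected = [0, 0, 0, 1, 1, 1, 1, 1]
b0, b1 = fit_logistic(hours, detected)
p_late = 1.0 / (1.0 + exp(-(b0 + b1 * 4.0)))
print(f"P(detect at 4 h past sunset) = {p_late:.2f}")
```

    With data like these, the fitted curve places the detection threshold well after the NAAMP cutoff, which is the kind of pattern the study reports for green, mink, and pickerel frogs.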

  18. Formal definition and dating of the GSSP (Global Stratotype Section and Point) for the base of the Holocene using the Greenland NGRIP ice core, and selected auxiliary records

    DEFF Research Database (Denmark)

    Walker, Mike; Johnsen, Sigfus Johann; Rasmussen, Sune Olander;


    The Greenland ice core from NorthGRIP (NGRIP) contains a proxy climate record across the Pleistocene-Holocene boundary of unprecedented clarity and resolution. Analysis of an array of physical and chemical parameters within the ice enables the base of the Holocene, as reflected in the first signs...

  19. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home. (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko


    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
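
    PABAK, used above to assess epoch-by-epoch agreement, is simply twice the observed proportion of agreement minus one, which makes it easy to verify by hand. The epoch scores below are invented for illustration (90% raw agreement gives PABAK 0.80, close to the study's mean of 0.83):

```python
def pabak(rater1, rater2):
    """Prevalence-adjusted bias-adjusted kappa for two raters: 2*Po - 1."""
    po = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
    return 2.0 * po - 1.0

# 10-s epochs scored as movement (1) / no movement (0); illustrative only:
manual = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0] * 10
automated = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0] * 10
print(round(pabak(manual, automated), 2))
```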

  1. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)


    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected to different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real-world experiments, observability analysis should be performed prior to RQA.
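
    The recurrence matrix underlying RQA can be sketched in a few lines. This minimal version thresholds distances between scalar samples directly; a faithful RQA of a flow such as the Rössler system would first reconstruct the state space by delay embedding, which is exactly where the choice of recorded variable s(t) matters.

```python
from math import sin

def recurrence_matrix(s, eps):
    """Binary recurrence matrix: R[i][j] = 1 if |s_i - s_j| < eps."""
    n = len(s)
    return [[1 if abs(s[i] - s[j]) < eps else 0 for j in range(n)] for i in range(n)]

def recurrence_rate(R):
    """RR: fraction of recurrent point pairs, excluding the main diagonal."""
    n = len(R)
    hits = sum(R[i][j] for i in range(n) for j in range(n) if i != j)
    return hits / (n * (n - 1))

s = [sin(0.3 * t) for t in range(200)]   # a periodic "recorded variable"
R = recurrence_matrix(s, eps=0.1)
print(f"RR = {recurrence_rate(R):.3f}")
```

    Widening the threshold eps monotonically increases RR, which is why RQA studies fix eps (often as a fraction of the signal's standard deviation) before comparing regimes.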

  2. A 6,700-year sea-level record based on French Polynesian coral reefs (United States)

    Hallmann, Nadine; Camoin, Gilbert; Eisenhauer, Anton; Vella, Claude; Samankassou, Elias; Botella, Albéric; Milne, Glenn; Fietzke, Jan; Dussouillez, Philippe


    Sea-level change during the Mid- to Late Holocene has an amplitude similar to the sea-level rise that is likely to occur before the end of the 21st century, providing a unique opportunity to study the coastal response to sea-level change and to reveal an important baseline of natural climate variability prior to the industrial revolution. Mid- to Late Holocene relative sea-level change in French Polynesia was reconstructed using coral reef records from ten islands, which represent ideal settings for accurate sea-level studies because: 1) they can be regarded as tectonically stable during the relevant period (slow subsidence), 2) they are located far from former ice sheets (far-field), 3) they are characterized by a low tidal amplitude, and 4) they cover a wide range of latitudes which produces significantly improved constraints on GIA (Glacial Isostatic Adjustment) model parameters. Absolute U/Th dating of in situ coral colonies and their accurate positioning via GPS RTK (Real Time Kinematic) measurements is crucial for an accurate reconstruction of sea-level change. We focus mainly on the analysis of coral microatolls, which are sensitive low-tide recorders, as their vertical accretion is limited by the mean low water springs level. Growth pattern analysis allows the reconstruction of low-amplitude, high-frequency sea-level changes on centennial to sub-decadal time scales. A sea-level rise of less than 1 m is recorded between 6 and 3-3.5 ka, and is followed by a gradual fall in sea level that started around 2.5 ka and persisted until the past few centuries. The reconstructed sea-level curve therefore extends the Tahiti sea-level curve [Deschamps et al., 2012, Nature, 483, 559-564], and is in good agreement with a geophysical model tuned to fit far-field deglacial records [Bassett et al., 2005, Science, 309, 925-928].

  3. Improving reproducibility of VEP recording in rats: electrodes, stimulus source and peak analysis. (United States)

    You, Yuyi; Klistorner, Alexander; Thie, Johnson; Graham, Stuart L


    The aims of this study were to evaluate and improve the reproducibility of visual evoked potential (VEP) measurement in rats and to develop a mini-Ganzfeld stimulator for rat VEP recording. VEPs of Sprague-Dawley rats were recorded from one randomly selected eye on three separate days within a week, and the recordings were repeated three times on the first day to evaluate the intrasession repeatability and intersession reproducibility. The VEPs were recorded with subdermal needle and implanted skull screw electrodes, respectively, to evaluate the effect of electrode configuration on VEP reproducibility. We also designed a mini-Ganzfeld stimulator for rats, which provided better eye isolation than conventional visual stimuli such as flash strobes and large Ganzfeld systems. The VEP responses from the mini-Ganzfeld were compared with a PS33-PLUS photic strobe and a single light-emitting diode (LED). The latencies of P1, N1, P2, N2, and P3 and the amplitude of each component were measured and analysed. Intrasession and intersession within-subject standard deviations (Sw), coefficient of variation, repeatability (R95) and intraclass correlation coefficient (ICC) were calculated. The VEPs recorded using the implanted skull electrodes showed significantly larger amplitude and higher reproducibility compared to the needle electrodes (P < 0.05). The mean intrasession and intersession ICCs were 0.96 and 0.86 for the early peaks. Using a combination of skull screw electrodes, the mini-Ganzfeld stimulator and early peak analysis, we achieved high reproducibility in rat VEP measurement. The latencies of the early peaks of rat VEPs were more consistent, which may be due to their generation in the primary visual cortex via the retino-geniculate fibres.
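
    The repeatability statistics used in this study (within-subject standard deviation Sw, coefficient of variation, and repeatability R95 = 2.77 Sw) can be computed from repeated-session data as follows; the latencies below are invented for illustration.

```python
from math import sqrt

def within_subject_sd(sessions):
    """Sw: square root of the mean within-subject variance across subjects.
    `sessions` maps subject -> list of repeated measurements."""
    variances = []
    for vals in sessions.values():
        m = sum(vals) / len(vals)
        variances.append(sum((v - m) ** 2 for v in vals) / (len(vals) - 1))
    return sqrt(sum(variances) / len(variances))

# P1 latencies (ms) over three sessions per rat; illustrative values only:
latency = {
    "rat1": [38.0, 39.0, 38.5],
    "rat2": [41.0, 40.0, 40.6],
    "rat3": [37.5, 38.5, 38.0],
}
sw = within_subject_sd(latency)
r95 = 2.77 * sw   # 95% of repeated pairs expected to differ by less than R95
grand_mean = sum(sum(v) for v in latency.values()) / 9
cov = 100 * sw / grand_mean
print(f"Sw = {sw:.2f} ms, R95 = {r95:.2f} ms, CoV = {cov:.1f}%")
```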

  4. The process analysis and improvement of the slender and flat shaft of track recorder

    Institute of Scientific and Technical Information of China (English)

    DAI Tong-yan; WANG Wei; LIU Ying


    A slender and flat shaft is a key part of the track recorder in marine vessels. However, the axial straightness of the shaft often exceeds the standard tolerance after it is machined. It has also been found that its precision does not last long. After thorough analysis of these problems, the main factors affecting machining quality were identified, and a process modification plan that meets the design requirements of the shaft was put forward. Production practice indicates that the precision of the shaft is stable over a long period and that product quality improved substantially after the new measures were employed, securing the accuracy of the track recording of the marine vessel.

  5. An empirical approach to predicting long term behavior of metal particle based recording media (United States)

    Hadad, Allan S.


    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3·H2O, resulting in acicular, polycrystalline, body-centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed in order for iron particles to be considered a viable recording medium for long-term archival (i.e., 25+ years) information storage. The primary means of establishing stability is through passivation, or controlled oxidation, of the iron particle's surface. Since iron particles used for magnetic recording are small, additional oxidation has a direct impact on performance, especially where archival storage of recorded information for long periods of time is important. Further stabilization chemistry and processes had to be developed to guarantee that iron particles could be considered a viable long-term recording medium. In an effort to retard the diffusion of iron ions through the oxide layer, other elements such as silicon, aluminum, and chromium have been added to the base iron to promote more dense scale formation, to alleviate some of the non-stoichiometric behavior of the oxide, or both. The presence of water vapor has been shown to disrupt the passive layer, subsequently increasing the oxidation rate of the iron. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity on silicon-containing iron particles between 50-120 deg C and 3-89 percent relative humidity. The methodology by which experimental data were collected and analyzed, leading to predictive capability, is discussed.
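
    Empirical lifetime models for this kind of temperature-humidity degradation are often written as an Arrhenius temperature term times a power-law humidity term. The sketch below uses that common form with invented parameters: the activation energy, humidity exponent, and reference condition are placeholders, not the study's fitted values.

```python
from math import exp

K_B = 8.617e-5  # Boltzmann constant, eV/K

def relative_rate(temp_c, rh_percent, ea_ev=0.9, rh_exp=1.5,
                  ref_temp_c=50.0, ref_rh=50.0):
    """Degradation rate relative to a reference condition, assuming an
    Arrhenius temperature term times a power-law humidity term."""
    t = temp_c + 273.15
    t0 = ref_temp_c + 273.15
    arrhenius = exp(-ea_ev / (K_B * t)) / exp(-ea_ev / (K_B * t0))
    humidity = (rh_percent / ref_rh) ** rh_exp
    return arrhenius * humidity

# Acceleration factor of the harshest test cell vs. the mildest:
accel = relative_rate(120, 89) / relative_rate(50, 3)
print(f"acceleration factor ~ {accel:.0f}x")
```

    Fitting the two exponents to measured magnetization loss across the test matrix is what turns a model like this into the predictive capability the abstract mentions.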

  6. European temperature records of the past five centuries based on documentary information compared to climate simulations (United States)

    Zorita, E.


    Two European temperature records for the past half-millennium, January-to-April air temperature for Stockholm (Sweden) and seasonal temperature for a Central European region, both derived from the analysis of documentary sources combined with long instrumental records, are compared with the output of forced (solar, volcanic, greenhouse gases) climate simulations with the model ECHO-G. The analysis is complemented with the long (early)-instrumental record of Central England Temperature (CET). Both approaches to study past climates (simulations and reconstructions) are burdened with uncertainties. The main objective of this comparative analysis is to identify robust features and weaknesses that may help to improve models and reconstruction methods. The results indicate a general agreement between simulations and the reconstructed Stockholm and CET records regarding the long-term temperature trend over the recent centuries, suggesting a reasonable choice of the amplitude of the solar forcing in the simulations and sensitivity of the model to the external forcing. However, the Stockholm reconstruction and the CET record also show a long and clear multi-decadal warm episode peaking around 1730, which is absent in the simulations. The uncertainties associated with the reconstruction method or with the simulated internal climate variability cannot easily explain this difference. Regarding the interannual variability, the Stockholm series displays in some periods higher amplitudes than the simulations but these differences are within the statistical uncertainty and further decrease if output from a regional model driven by the global model is used. The long-term trends in the simulations and reconstructions of the Central European temperature agree less well. The reconstructed temperature displays, for all seasons, a smaller difference between the present climate and past centuries than the simulations. Possible reasons for these differences may be related to a limitation

  7. Comparison of experimental approaches to study selective properties of thick phase-amplitude holograms recorded in materials with diffusion-based formation mechanisms (United States)

    Borisov, Vladimir; Klepinina, Mariia; Veniaminov, Andrey; Angervaks, Aleksandr; Shcheulin, Aleksandr; Ryskin, Aleksandr


    Volume holographic gratings, both transmission- and reflection-type, may be employed as one-dimensional photonic crystals. More complex two- and three-dimensional holographic photonic-crystal structures can be recorded using several properly organized beams. Compared to colloidal photonic crystals, their holographic counterparts minimize the distortions caused by multiple inner boundaries of the media. Unfortunately, the spectral response of holographic structures is still hard to analyze. This work presents the results of an analysis of thick holographic gratings based on the approximation of spectral-angular selectivity contours. The gratings were recorded in an additively colored fluorite crystal and in a glassy polymer doped with phenanthrenequinone (PQ-PMMA). The two materials, known as promising candidates for 3D diffraction optics including photonic crystals, employ diffusion-based mechanisms of grating formation. The surfaces of spectral-angular selectivity were obtained in a single scan using a white-light LED, a rotatable table and a matrix spectrometer. The data, expressed as 3D plots, make visual estimation of the grating's phase/amplitude nature, nonlinearity of recording, etc. straightforward, and provide sufficient information for numerical analysis. The grating recorded in the crystal was found to be a mixed phase-amplitude one, with different contributions of refractive-index and absorbance modulation at different wavelengths, and demonstrated three diffraction orders corresponding to its three spatial harmonics, which originate from the intrinsically nonlinear diffusion-drift recording mechanism. In contrast, the grating in the polymeric medium appeared purely phase and linearly recorded.

  8. Study on key techniques for camera-based hydrological record image digitization (United States)

    Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping


    With the development of information technology, the digitization of scientific and engineering drawings has received more and more attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record observations from sensors. However, these paper drawings may be destroyed or contaminated by improper preservation or overuse. Further, manually transcribing these data into a computer is a heavy workload and prone to error. Hence, digitizing these drawings and establishing a corresponding database will ensure the integrity of the data and provide invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques: image segmentation, intersection point localization and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water-level sheet image is proposed, based on the adaptive fusion of gradient and color information. Second, a fast search strategy for cross-point location is introduced that avoids point-by-point processing, with the help of grid distribution information. Finally, we put forward a local rectification method that analyzes the central portions of the image and utilizes domain knowledge of hydrology. The processing speed is accelerated while the accuracy remains satisfactory. Experiments on several real water-level records show that the proposed techniques are effective and capable of recovering the hydrological observations accurately.

  9. Feasibility and performance evaluation of generating and recording visual evoked potentials using ambulatory Bluetooth based system. (United States)

    Ellingson, Roger M; Oken, Barry


    This report contains the design overview and key performance measurements demonstrating the feasibility of generating and recording ambulatory visual-stimulus evoked potentials using the previously reported custom Complementary and Alternative Medicine physiologic data collection and monitoring system, CAMAS. The methods used to generate visual stimuli on a PDA device are presented, along with the design of an optical coupling device that converts the display output to an electrical waveform recorded by the CAMAS base unit. The optical sensor signal, synchronized to the visual stimulus, emulates the brain's synchronized EEG signal input to CAMAS, which is normally reviewed for the evoked-potential response. Most importantly, the PDA also sends a marker message over the wireless Bluetooth connection to the CAMAS base unit, synchronized to the visual stimulus, which is the critical averaging reference needed to obtain VEP results. Results show that the variance in the latency of the wireless marker messaging link is small enough to support the generation and recording of visual evoked potentials. The averaged sensor waveforms at multiple CPU speeds are presented and demonstrate the suitability of the Bluetooth interface for portable ambulatory visual-evoked-potential implementation on our CAMAS platform.

  10. Optimal discrimination and classification of neuronal action potential waveforms from multiunit, multichannel recordings using software-based linear filters. (United States)

    Gozani, S N; Miller, J P


    We describe advanced protocols for the discrimination and classification of neuronal spike waveforms within multichannel electrophysiological recordings. The programs are capable of detecting and classifying the spikes from multiple, simultaneously active neurons, even in situations where there is a high degree of spike waveform superposition on the recording channels. The protocols are based on the derivation of an optimal linear filter for each individual neuron. Each filter is tuned to selectively respond to the spike waveform generated by the corresponding neuron, and to attenuate noise and the spike waveforms from all other neurons. The protocol is essentially an extension of earlier work [1], [13], [18]. However, the protocols extend the power and utility of the original implementations in two significant respects. First, a general single-pass automatic template estimation algorithm was derived and implemented. Second, the filters were implemented within a software environment providing a greatly enhanced functional organization and user interface. The utility of the analysis approach was demonstrated on samples of multiunit electrophysiological recordings from the cricket abdominal nerve cord.
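
    A single-channel, single-neuron version of such a filter reduces to correlating the recording with a zero-mean spike template. The real protocols derive multichannel optimal filters that also attenuate the other units' waveforms, so this is only a minimal sketch with an invented template and signal.

```python
def matched_filter(signal, template):
    """Correlate the signal with a zero-mean template; high output marks
    positions where the template's spike waveform occurs."""
    m = sum(template) / len(template)
    t = [v - m for v in template]
    out = []
    for i in range(len(signal) - len(t) + 1):
        out.append(sum(signal[i + j] * t[j] for j in range(len(t))))
    return out

template = [0.0, 1.0, 3.0, 1.0, -1.0, 0.0]   # idealized spike shape
signal = [0.1] * 40                          # constant baseline
for j, v in enumerate(template):             # embed a spike at sample 20
    signal[20 + j] += v
out = matched_filter(signal, template)
peak = max(range(len(out)), key=out.__getitem__)
print(f"spike detected at sample {peak}")    # expect 20
```

    Zero-meaning the template makes a constant baseline contribute nothing to the output, which is one reason matched filters tolerate DC offsets on the recording channels.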

  11. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi


    This study examines a new approach that uses the Design Structure Matrix (DSM) modeling technique to improve the design of Electronic Medical Record (EMR) user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task, saving an average of 21 hours of hospital physicians' time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.
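
    The grouping step described above can be sketched as clustering the rows of a binary interaction matrix by similarity. The UI elements and interactions below are hypothetical, and the real method also weighs spatial adjacency and information exchange, so this greedy pass is only illustrative.

```python
def jaccard(a, b):
    """Overlap of two interaction sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def group_elements(dsm, threshold=0.5):
    """Greedy grouping: each element joins the first group whose seed
    element has a similar interaction profile (self counted as interacting)."""
    rows = {e: {e} | {t for t, v in links.items() if v} for e, links in dsm.items()}
    groups = []
    for e in dsm:
        for g in groups:
            if jaccard(rows[e], rows[g[0]]) >= threshold:
                g.append(e)
                break
        else:
            groups.append([e])
    return groups

# Hypothetical interactions between dosage-calculator UI elements:
dsm = {
    "weight_field": {"dose_field": 1, "unit_menu": 1},
    "dose_field": {"weight_field": 1, "unit_menu": 1},
    "unit_menu": {"weight_field": 1, "dose_field": 1},
    "save_button": {"cancel_button": 1},
    "cancel_button": {"save_button": 1},
}
print(group_elements(dsm))
```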

  12. Improving the effectiveness of electronic health record-based referral processes. (United States)

    Esquivel, Adol; Sittig, Dean F; Murphy, Daniel R; Singh, Hardeep


    Electronic health records are increasingly being used to facilitate referral communication in the outpatient setting. However, despite support by technology, referral communication between primary care providers and specialists is often unsatisfactory and is unable to eliminate care delays. This may be in part due to lack of attention to how information and communication technology fits within the social environment of health care. Making electronic referral communication effective requires a multifaceted "socio-technical" approach. Using an 8-dimensional socio-technical model for health information technology as a framework, we describe ten recommendations that represent good clinical practices to design, develop, implement, improve, and monitor electronic referral communication in the outpatient setting. These recommendations were developed on the basis of our previous work, current literature, sound clinical practice, and a systems-based approach to understanding and implementing health information technology solutions. Recommendations are relevant to system designers, practicing clinicians, and other stakeholders considering use of electronic health records to support referral communication.

  13. [Design and Implementation of a Mobile Operating Room Information Management System Based on Electronic Medical Record]. (United States)

    Liu, Baozhen; Liu, Zhiguo; Wang, Xianwen


    A mobile operating room information management system with electronic medical records (EMR) is designed to improve work efficiency and to enhance patient information sharing. In the operating room, the system acquires information from various medical devices through the Client/Server (C/S) pattern and automatically generates XML-based EMRs. Outside the operating room, the system provides information access services using the Browser/Server (B/S) pattern. Software testing shows that the system can correctly collect medical information from equipment and clearly display real-time waveforms. By producing higher-quality surgery records and sharing information among mobile medical units, the system can effectively reduce doctors' workload and promote the informatization of the field hospital.
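
    Generating an XML-based record from acquired device data can be sketched with the standard library. The element and field names below are hypothetical, not the system's actual schema:

```python
import xml.etree.ElementTree as ET

def build_record(patient_id, device_readings):
    """Assemble a minimal XML operation record from device data."""
    root = ET.Element("OperationRecord")
    ET.SubElement(root, "PatientID").text = patient_id
    vitals = ET.SubElement(root, "Vitals")
    for name, value, unit in device_readings:
        item = ET.SubElement(vitals, "Reading", name=name, unit=unit)
        item.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = build_record("P-0042", [("HeartRate", 72, "bpm"), ("SpO2", 98, "%")])
print(xml_doc)
```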

  14. Analysis of diagnoses extracted from electronic health records in a large mental health case register (United States)

    Kovalchuk, Yevgeniya; Stewart, Robert; Broadbent, Matthew; Hubbard, Tim J. P.; Dobson, Richard J. B.


    The UK government has recently recognised the need to improve mental health services in the country. Electronic health records provide a rich source of patient data which could help policymakers to better understand the needs of service users. The main objective of this study is to unveil statistics of diagnoses recorded in the Case Register of the South London and Maudsley NHS Foundation Trust, one of the largest mental health providers in the UK and Europe, serving a source population of over 1.2 million people residing in south London. Based on over 500,000 diagnoses recorded in ICD10 codes for a cohort of approximately 200,000 mental health patients, we established the frequency rate of each diagnosis (the ratio of the number of patients for whom a diagnosis has ever been recorded to the number of patients in the entire population who have been in contact with mental health services). We also investigated differences in diagnosis prevalence between subgroups of patients stratified by gender and ethnicity. The most common diagnoses in the considered population were (recurrent) depression (ICD10 codes F32-33; 16.4% of patients), reaction to severe stress and adjustment disorders (F43; 7.1%), mental/behavioural disorders due to use of alcohol (F10; 6.9%), and schizophrenia (F20; 5.6%). We also found many diagnoses which were more likely to be recorded in patients of a certain gender or ethnicity. For example, mood (affective) disorders (F31-F39); neurotic, stress-related and somatoform disorders (F40-F48, except F42); and eating disorders (F50) were more likely to be found in records of female patients, while males were more likely to be diagnosed with mental/behavioural disorders due to psychoactive substance use (F10-F19). Furthermore, mental/behavioural disorders due to use of alcohol and opioids were more likely to be recorded in patients of white ethnicity, and disorders due to use of cannabinoids in those of black ethnicity. PMID:28207753
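
    The frequency rate defined above (distinct patients ever given a code, divided by the population in contact with services) can be computed as follows; the records below are invented for illustration:

```python
from collections import defaultdict

def frequency_rates(records, population_size):
    """Rate per diagnosis: distinct patients with the code / all patients.
    Repeated diagnoses count once per patient."""
    patients_by_code = defaultdict(set)
    for patient_id, icd10 in records:
        patients_by_code[icd10].add(patient_id)
    return {code: len(p) / population_size for code, p in patients_by_code.items()}

# (patient, ICD-10 block) pairs; one patient may carry several codes:
records = [(1, "F32"), (1, "F32"), (2, "F32"), (2, "F43"), (3, "F10"), (4, "F20")]
print(frequency_rates(records, population_size=4))
```

    Running the same computation over subsets of patients (by gender or ethnicity) yields the stratified comparisons the study reports.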

  15. A new technique for fractal analysis applied to human, intracerebrally recorded, ictal electroencephalographic signals. (United States)

    Bullmore, E; Brammer, M; Alarcon, G; Binnie, C


    Application of a new method of fractal analysis to human, intracerebrally recorded, ictal electroencephalographic (EEG) signals is reported. 'Frameshift-Richardson' (FR) analysis involves estimation of fractal dimension (1 < FD < 2) directly from the time-domain EEG data; it is suggested that this technique offers significant operational advantages over algorithms for FD estimation that require preliminary reconstruction of EEG data in phase space. FR analysis was found to substantially reduce the volume of EEG data without loss of diagnostically important information concerning onset, propagation and evolution of ictal EEG discharges. Arrhythmic EEG events were correlated with relatively increased FD; rhythmic EEG events with relatively decreased FD. It is proposed that development of this method may lead to: (i) enhanced definition and localisation of initial ictal changes in the EEG presumed due to multi-unit activity; and (ii) synoptic visualisation of long periods of EEG data.

  16. Space and Astrophysical Plasmas : Matched filtering-parameter estimation method and analysis of whistlers recorded at Varanasi

    Indian Academy of Sciences (India)

    R P Singh; R P Patel; Ashok K Singh; D Hamar; J Lichtenberger


    The matched filtering technique is based on the digital construction of theoretical whistlers and their comparison with observed whistlers. The parameters estimated from the theoretical and experimental whistler curves are matched using digital filters to achieve higher accuracy. This yields a resolution ten times better in the time domain. We have tested the applicability of this technique for the analysis of whistlers recorded at Varanasi. It is found that the whistlers propagated along field lines with L > 2 and, after exiting from the ionosphere, have wave normal angles such that they propagate towards the equator in the earth-ionosphere waveguide. High-resolution analysis shows fine structures in the dynamic spectrum. An effort is made to interpret the results.

  17. A Satellite-Based Surface Radiation Climatology Derived by Combining Climate Data Records and Near-Real-Time Data

    Directory of Open Access Journals (Sweden)

    Bodo Ahrens


    Full Text Available This study presents a method for adjusting long-term climate data records (CDRs) for integrated use with near-real-time data, using the example of surface incoming solar irradiance (SIS). Recently, a 23-year (1983–2005) continuous SIS CDR was generated based on the visible channel (0.45–1 μm) of the MVIRI radiometers onboard the geostationary Meteosat First Generation Platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF). Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions, consisting of MVIRI-derived SIS (1983–2005) and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards), were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present) was applied that combines the Standard Normal Homogeneity Test (SNHT) and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the ground-station mean and assessing statistical significance. Several stations of the Baseline Surface Radiation Network (BSRN) and about 50 stations of the Global Energy Balance Archive (GEBA) over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994, probably due to artefacts in the raw data and instrument failures. After 2005 the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations. Homogenisation is performed by applying a mean-shift correction depending on the shift size of
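    The SNHT step mentioned above can be illustrated in a few lines. A minimal sketch of the single-shift form of the test on synthetic data (not the authors' implementation): the series is standardised, and for each candidate break position k the statistic T(k) = k·z̄₁² + (n−k)·z̄₂² is computed from the segment means before and after k; a large maximum suggests a break.

```python
import numpy as np

def snht_statistic(series):
    """Single-shift SNHT statistic T = max_k T(k); a large T suggests a
    break, located near the argmax."""
    q = np.asarray(series, dtype=float)
    n = len(q)
    z = (q - q.mean()) / q.std(ddof=0)
    t = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()   # mean of standardised series before the break
        z2 = z[k:].mean()   # mean after the break
        t[k - 1] = k * z1**2 + (n - k) * z2**2
    return t.max(), int(np.argmax(t)) + 1  # (statistic, break position)

# A series with an artificial +2 shift halfway through:
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])
t_max, k = snht_statistic(x)
print(t_max, k)  # large T, break detected near index 50
```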

  18. Performance evaluation of wavelet-based face verification on a PDA recorded database (United States)

    Sellahewa, Harin; Jassim, Sabah A.


    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian life. In areas of conflict or disaster the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  19. Similarities and differences of doctor-patient co-operated evidence-based medical record of treating digestive system diseases with integrative medicine compared with traditional medical records

    Institute of Scientific and Technical Information of China (English)

    Bo Li; Wen-Hong Shao; Yan-Da Li; Ying-Pan Zhao; Qing-Na Li; Zhao Yang; Hong-Cai Shang


    Objective: To establish a model of the doctor-patient co-operated record, based on the concepts of narrative evidence-based medicine and related theories on the Doctor-Patient Co-operated Evidence-Based Medical Record. Methods: We conducted a literature search in PubMed, followed the principles of narrative evidence-based medicine, and referred to the advice of experts in digestive diseases and EBM in both traditional Chinese medicine and Western medicine. Result: This research is a useful attempt to discuss the establishment of a doctor-patient co-operated evidence-based medical record guided by narrative evidence-based medicine. Conclusion: The doctor-patient co-operated medical record can become a key factor of the curative-effect evaluation methodology system for integrated therapy of traditional Chinese medicine and Western medicine on spleen and stomach diseases. [Chinese abstract, translated: Following the concept of narrative evidence-based medicine, and consulting experts in gastroenterology and evidence-based medicine in both traditional Chinese and Western medicine, we refined the theory of the doctor-patient co-built medical record, established a template for it, compared it with traditional medical records, and analysed its advantages and disadvantages. Outlook: the doctor-patient co-built medical record may become one element of the methodological system for evaluating the efficacy of combined Chinese and Western medicine treatment of spleen and stomach diseases.]

  20. Task and error analysis balancing benefits over business of electronic medical records. (United States)

    Carstens, Deborah Sater; Rodriguez, Walter; Wood, Michael B


    Task and error analysis research was performed to identify: a) the process for healthcare organisations in managing healthcare for patients with mental illness or substance abuse; b) how the process can be enhanced and; c) if electronic medical records (EMRs) have a role in this process from a business and safety perspective. The research question is if EMRs have a role in enhancing the healthcare for patients with mental illness or substance abuse. A discussion on the business of EMRs is addressed to understand the balancing act between the safety and business aspects of an EMR.

  1. Impact of the recorded variable on recurrence quantification analysis of flows (United States)

    Portes, Leonardo L.; Benda, Rodolfo N.; Ugrinowitsch, Herbert; Aguirre, Luis A.


    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps.
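    As an illustration of the RQA measures involved, the following is a minimal sketch (not the authors' code) that computes two standard RQA quantities, recurrence rate (RR) and determinism (DET), from a delay-embedded scalar series. The embedding parameters and threshold rule are illustrative choices:

```python
import numpy as np

def rqa(series, dim=3, delay=1, eps=None, lmin=2):
    """Minimal RQA sketch: recurrence rate (RR) and determinism (DET)
    from a delay-embedded scalar series s(t)."""
    s = np.asarray(series, dtype=float)
    n = len(s) - (dim - 1) * delay
    emb = np.column_stack([s[i * delay:i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    if eps is None:
        eps = 0.2 * d.std()            # rule-of-thumb recurrence threshold
    r = (d <= eps).astype(int)         # recurrence matrix (LOI included)
    rr = r.mean()
    # DET: fraction of recurrent points lying on diagonal lines >= lmin
    diag_pts = 0
    for off in range(-(n - 1), n):
        run = 0
        for v in np.append(np.diagonal(r, offset=off), 0):
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_pts += run
                run = 0
    det = diag_pts / r.sum() if r.sum() else 0.0
    return rr, det

# Deterministic (sine) vs noise: DET should be much higher for the sine.
t = np.linspace(0, 20 * np.pi, 400)
rr_sin, det_sin = rqa(np.sin(t))
rr_noise, det_noise = rqa(np.random.default_rng(1).normal(size=400))
print(det_sin > det_noise)  # True
```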

  2. A critical ear: Analysis of value judgements in reviews of Beethoven’s piano sonata recordings

    Directory of Open Access Journals (Sweden)

    Elena Alessandri


    Full Text Available What sets a great music performance apart? In this study we addressed this question through an examination of value judgements in written criticism of recorded performance. One hundred reviews of recordings of Beethoven’s piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics’ statements and the evaluation criteria that underpinned their judgements. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording.

  3. A Critical Ear: Analysis of Value Judgments in Reviews of Beethoven's Piano Sonata Recordings. (United States)

    Alessandri, Elena; Williamson, Victoria J; Eiholzer, Hubert; Williamon, Aaron


    What sets a great music performance apart? In this study, we addressed this question through an examination of value judgments in written criticism of recorded performance. One hundred reviews of recordings of Beethoven's piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics' statements and the evaluation criteria that underpinned their judgments. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording.

  4. Robert Recorde

    CERN Document Server

    Williams, Jack


    The 16th-century intellectual Robert Recorde is chiefly remembered for introducing the equals sign into algebra, yet the greater significance and broader scope of his work are often overlooked. This book presents an authoritative and in-depth analysis of the man, his achievements and his historical importance. This scholarly yet accessible work examines the latest evidence on all aspects of Recorde's life, throwing new light on a character deserving of greater recognition. Topics and features: presents a concise chronology of Recorde's life; examines his published works; describes Recorde's pro

  5. Hand-Based Biometric Analysis (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)


    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
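    The rotation invariance of Zernike moment magnitudes mentioned above can be demonstrated directly. A minimal sketch (not the patented system's implementation; the normalisation is simplified and no term re-use optimisation is applied):

```python
import numpy as np
from math import factorial

def zernike_magnitudes(img, order=8):
    """|Z_nm| for n <= order over the unit disk inscribed in a square
    image; the magnitudes are rotation-invariant descriptors."""
    n_px = img.shape[0]
    c = (n_px - 1) / 2.0
    y, x = np.mgrid[0:n_px, 0:n_px]
    xn, yn = (x - c) / c, (y - c) / c
    rho = np.hypot(xn, yn)
    theta = np.arctan2(yn, xn)
    mask = rho <= 1.0
    feats = []
    for n in range(order + 1):
        for m in range(0, n + 1):
            if (n - m) % 2:
                continue               # R_nm is zero unless n-m is even
            # Radial polynomial R_nm(rho)
            R = np.zeros_like(rho)
            for s in range((n - m) // 2 + 1):
                coef = ((-1) ** s * factorial(n - s)
                        / (factorial(s) * factorial((n + m) // 2 - s)
                           * factorial((n - m) // 2 - s)))
                R += coef * rho ** (n - 2 * s)
            V = R * np.exp(-1j * m * theta)
            # Simplified normalisation: mean over the disk, scaled by (n+1)/pi
            Z = (n + 1) / np.pi * np.sum(img[mask] * V[mask]) / mask.sum()
            feats.append(abs(Z))
    return np.array(feats)

# An asymmetric test image: rotating it leaves the magnitudes unchanged.
img = np.zeros((33, 33))
img[5:15, 8:20] = 1.0
img[20:28, 4:10] = 0.5
feats = zernike_magnitudes(img)
feats_rot = zernike_magnitudes(np.rot90(img))
print(np.allclose(feats, feats_rot))  # True
```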

  6. Development and programming of Geophonino: A low cost Arduino-based seismic recorder for vertical geophones (United States)

    Soler-Llorens, J. L.; Galiana-Merino, J. J.; Giner-Caturla, J.; Jauregui-Eslava, P.; Rosa-Cintas, S.; Rosa-Herranz, J.


    The commercial data acquisition systems used for seismic exploration are usually expensive equipment. In this work, a low-cost data acquisition system (Geophonino) has been developed for recording seismic signals from a vertical geophone. The signal goes first through an instrumentation amplifier, INA155, which is suitable for low-amplitude signals like seismic noise, and an anti-aliasing filter based on the MAX7404 switched-capacitor filter. After that, the amplified and filtered signal is digitized and processed by an Arduino Due and stored on an SD memory card. Geophonino is configured for continuous recording, with the sampling frequency, the amplitude gain and the recording time being user-defined. The complete prototype is an open-source and open-hardware system. It has been tested by comparing the registered signals with those obtained through different commercial data recording systems and different kinds of geophones. The results show good correlation between the tested measurements, presenting Geophonino as a low-cost alternative system for seismic data recording.

  7. Spatiotemporal climatic, hydrological, and environmental variations based on records of annually laminated lake sediments from northern Poland (United States)

    Tylmann, W.; Blanke, L.; Kinder, M.; Loewe, T.; Mayr, C.; Ohlendorf, C.; Zolitschka, B.


    In northern Poland there is the unique opportunity to compare varved lake sediment records with distinct climatic trends along a 700 km long W-E transect. Annually laminated Holocene sediment sequences from Lake Lubinskie, Lake Suminko, Lake Lazduny, and Lake Szurpily were cored for high-resolution multiproxy climate and environmental reconstruction in the framework of the Polish-German project “Northern Polish Lake Research” (NORPOLAR). First results from a 139 cm long gravity core of Lake Lazduny (53°51.4’N, 21°57.3’E) document deposition of an organic (mean organic matter: 13.9%; mean biogenic opal: 9.8%) and highly carbonaceous gyttja (mean calcite content: 61.6%). The finely laminated sediment consists of biochemical varves. Pale spring/summer layers composed of autochthonous carbonates alternate with dark fall/winter layers made of organic and minerogenic detritus. The established chronology for the last 1500 calendar-years is based on thin section analysis supported by independent radiometric dating (C-14, Pb-210). Sedimentological, geochemical and stable isotope analyses were carried out with a decadal temporal resolution. Additionally, non-destructive and high-resolution XRF scanning data reveal a rhythmic variation in the Ca content that reflects seasonal calcite deposition. Redox-sensitive elements like Fe, Mn and S are interpreted to be the response to mean winter temperatures: colder winter temperatures → extended lake ice cover → intensification of meromixis → increased Fe/Mn ratio. In turn, these parameters can be linked to NAO (North Atlantic Oscillation) variability, because a negative NAO is related to colder and drier conditions in northeastern Europe. Climate variability is also mirrored by the δ13C record of the endogenic calcite fraction. In mid-latitude lakes calcite precipitation is dominated by productivity-controlled consumption of the dissolved inorganic carbon (DIC) pool. Thus the δ13C record potentially provides a

  8. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii (United States)

    Prouty, Nancy G.; Jupiter, Stacy D.; Field, Michael E.; McCulloch, Malcolm T.


    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (−40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge.

  9. A technique to stabilize record bases for Gothic arch tracings in patients with implant-retained complete dentures. (United States)

    Raigrodski, A J; Sadan, A; Carruth, P L


    Clinicians have long expressed concern about the accuracy of the Gothic arch tracing for recording centric relation in edentulous patients. With the use of dental implants to assist in retaining complete dentures, the problem of inaccurate recordings, made for patients without natural teeth, can be significantly reduced. This article presents a technique that uses healing abutments to stabilize the record bases so that an accurate Gothic arch tracing can be made.

  10. A new record of a flathead fish (Teleostei: Platycephalidae) from China based on morphological characters and DNA barcoding (United States)

    Qin, Yan; Song, Na; Zou, Jianwei; Zhang, Zhaohui; Cheng, Guangping; Gao, Tianxiang; Zhang, Xiumei


    A new record of Platycephalus sp.1 (sensu Nakabo, 2002) was documented based on morphological characters and DNA barcoding. We collected 174 specimens of the genus Platycephalus from Chinese coastal waters of Dongying, Qingdao, Zhoushan, and Beihai. Samples were morphologically identified as Platycephalus sp.1. The coloration, meristic traits, and morphometric measurements are consistent with previously published records. In brief, it is an orange-brown flathead fish with dark brown spots scattered on the head and body, 83 to 99 lateral-line scales with one or two spine-bearing anteriormost pored scales, and no yellow blotch on the caudal fin. Cytochrome c oxidase subunit I (COI) gene fragments were sequenced for phylogenetic analysis. The mean evolutionary distance within the species Platycephalus sp.1 was 0.1%. Net evolutionary distances between Platycephalus sp.1 and other species of Platycephalus ranged from 10.8% to 19.7%, much greater than the threshold for species delimitation. The COI sequence analysis strongly supports the validity of Platycephalus sp.1 at the genetic level.

  11. A new record of a flathead fish (Teleostei: Platycephalidae)from China based on morphological characters and DNA barcoding

    Institute of Scientific and Technical Information of China (English)

    QIN Yan; SONG Na; ZOU Jianwei; ZHANG Zhaohui; CHENG Guangping; GAO Tianxiang; ZHANG Xiumei


    A new record of Platycephalus sp.1 (sensu Nakabo, 2002) was documented based on morphological characters and DNA barcoding. We collected 174 specimens of the genus Platycephalus from Chinese coastal waters of Dongying, Qingdao, Zhoushan, and Beihai. Samples were morphologically identified as Platycephalus sp.1. The coloration, meristic traits, and morphometric measurements are consistent with previously published records. In brief, it is an orange-brown flathead fish with dark brown spots scattered on the head and body, 83 to 99 lateral-line scales with one or two spine-bearing anteriormost pored scales, and no yellow blotch on the caudal fin. Cytochrome c oxidase subunit I (COI) gene fragments were sequenced for phylogenetic analysis. The mean evolutionary distance within the species Platycephalus sp.1 was 0.1%. Net evolutionary distances between Platycephalus sp.1 and other species of Platycephalus ranged from 10.8% to 19.7%, much greater than the threshold for species delimitation. The COI sequence analysis strongly supports the validity of Platycephalus sp.1 at the genetic level.

  12. Comparison of the Hazard Mapping System (HMS) fire product to ground-based fire records in Georgia, USA (United States)

    Hu, Xuefei; Yu, Chao; Tian, Di; Ruminski, Mark; Robertson, Kevin; Waller, Lance A.; Liu, Yang


    Biomass burning has a significant and adverse impact on air quality, climate change, and various ecosystems. The Hazard Mapping System (HMS) detects fires using data from multiple satellite sensors in order to maximize its fire detection rate. However, to date, the detection rate of the HMS fire product for small fires has not been well studied, especially using ground-based fire records. This paper utilizes the 2011 fire information compiled from ground observations and burn authorizations in Georgia to assess the comprehensiveness of the HMS active fire product. The results show that detection rates of the hybrid HMS increase substantially by integrating multiple satellite instruments. The detection rate increases dramatically from 3% to 80% with an increase in fire size from less than 0.02 km2 to larger than 2 km2, resulting in detection of approximately 12% of all recorded fires which represent approximately 57% of the total area burned. The spatial pattern of detection rates reveals that grid cells with high detection rates are generally located in areas where large fires occur frequently. The seasonal analysis shows that overall detection rates in winter and spring (12% and 13%, respectively) are higher than those in summer and fall (3% and 6%, respectively), mainly because of higher percentages of large fires (>0.19 km2) that occurred in winter and spring. The land cover analysis shows that detection rates are 2-7 percentage points higher in land cover types that are prone to large fires such as forestland and shrub land.

  13. Towards Standardized Patient Data Exchange: Integrating a FHIR Based API for the Open Medical Record System. (United States)

    Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul


    Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next-generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an open-source medical record system widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR-based API for the OpenMRS platform. Lessons learned from this effort were used to define long-term plans to transition from the legacy OpenMRS API to a FHIR-based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
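    As a flavour of what a FHIR-based API exchanges, here is a minimal sketch mapping a hypothetical internal patient record to a FHIR R4 Patient resource (the internal field names are invented for illustration; the FHIR fields shown are standard Patient elements, and this is not the OpenMRS module itself):

```python
import json

# Hypothetical internal representation of a patient.
internal = {"uuid": "abc-123", "given": "Jane", "family": "Doe",
            "sex": "F", "dob": "1980-04-02"}

# Map to a FHIR R4 Patient resource.
fhir_patient = {
    "resourceType": "Patient",
    "id": internal["uuid"],
    "name": [{"use": "official",
              "family": internal["family"],
              "given": [internal["given"]]}],
    "gender": {"F": "female", "M": "male"}.get(internal["sex"], "unknown"),
    "birthDate": internal["dob"],
}
print(json.dumps(fhir_patient, indent=2))
```

    Serialising such resources to standard JSON is what lets any FHIR-aware client consume the data without knowing the internal OpenMRS model.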

  14. Extracting physician group intelligence from electronic health records to support evidence based medicine.

    Directory of Open Access Journals (Sweden)

    Griffin M Weber

    Full Text Available Evidence-based medicine employs expert opinion and clinical data to inform clinical decision making. The objective of this study is to determine whether it is possible to complement these sources of evidence with information about physician "group intelligence" that exists in electronic health records. Specifically, we measured laboratory test "repeat intervals", defined as the amount of time it takes for a physician to repeat a test that was previously ordered for the same patient. Our assumption is that while the result of a test is a direct measure of one marker of a patient's health, the physician's decision to order the test is based on multiple factors, including past experience, available treatment options, and information about the patient that might not be coded in the electronic health record. By examining repeat intervals in aggregate over large numbers of patients, we show that it is possible to (1) determine what laboratory test results physicians consider "normal", (2) identify subpopulations of patients that deviate from the norm, and (3) identify situations where laboratory tests are over-ordered. We used laboratory tests as just one example of how physician group intelligence can be used to support evidence-based medicine in a way that is automated and continually updated.
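    The repeat-interval measure is simple to compute in aggregate. A minimal sketch with hypothetical patient IDs, test codes, and order dates (not the study's data):

```python
from collections import defaultdict
from datetime import date

# Hypothetical (patient_id, test_code, order_date) lab orders.
orders = [
    ("p1", "HbA1c", date(2010, 1, 5)),
    ("p1", "HbA1c", date(2010, 4, 10)),
    ("p1", "HbA1c", date(2010, 7, 1)),
    ("p2", "HbA1c", date(2010, 2, 1)),
    ("p2", "HbA1c", date(2011, 2, 1)),
]

# Group orders by (patient, test), then take the gaps between
# consecutive orders: these are the repeat intervals, in days.
by_patient = defaultdict(list)
for pid, test, d in orders:
    by_patient[(pid, test)].append(d)

intervals = []
for dates in by_patient.values():
    dates.sort()
    intervals += [(b - a).days for a, b in zip(dates, dates[1:])]

print(sorted(intervals))  # [82, 95, 365]
```

    Aggregating such intervals over many patients is what reveals the "group intelligence": the typical interval at which physicians choose to re-test.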

  15. Designing ETL Tools to Feed a Data Warehouse Based on Electronic Healthcare Record Infrastructure. (United States)

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L


    The aim of this paper is to propose a methodology for designing Extract, Transform and Load (ETL) tools in a clinical data warehouse architecture based on the Electronic Healthcare Record (EHR). This approach takes advantage of this infrastructure as one of the main sources of information to feed the data warehouse, taking into account that clinical documents produced by heterogeneous legacy systems are structured using the HL7 CDA standard. This paper describes the main activities to be performed to map the information collected in the different types of document onto the dimensional model primitives.

  16. Integrated multimedia electronic patient record and graph-based image information for cerebral tumors. (United States)

    Puentes, John; Batrancourt, Bénédicte; Atif, Jamal; Angelini, Elsa; Lecornu, Laurent; Zemirline, Abdelhamid; Bloch, Isabelle; Coatrieux, Gouenou; Roux, Christian


    Current electronic patient record (EPR) implementations do not incorporate medical images, nor structural information extracted from them, despite the increasing role of images in diagnosis. This paper presents a framework for integrating into EPRs the anatomical and pathological knowledge extracted from segmented magnetic resonance imaging (MRI), applying a graph representation of anatomical and functional information for individual patients. Focusing on cerebral tumor examination and patient follow-up, multimedia EPRs were created and evaluated through a 3D navigation application developed with open-source libraries and standards. Results suggest that the enhanced clinical information scheme could lead to original changes in the way medical experts utilize image-based information.

  17. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis. (United States)

    Greensmith, David J


    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow.
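    As an illustration of regression-based rate measurement on a Ca transient, here is a minimal sketch on synthetic data (not the Excel program itself): the decay rate constant is recovered by linear regression on the log-transformed decay phase, one simple stand-in for the program's regression analysis.

```python
import numpy as np

# Synthetic Ca transient: fast rise, then mono-exponential decay
# (tau = 0.3 s), mimicking a signal already converted to [Ca] units.
t = np.linspace(0, 2.0, 400)
tau_true = 0.3
ca = np.where(t < 0.1, t / 0.1, np.exp(-(t - 0.1) / tau_true))

# Rate of decay: linear regression of log(signal) over the decay phase.
decay = (t >= 0.2) & (ca > 1e-3)
slope, intercept = np.polyfit(t[decay], np.log(ca[decay]), 1)
tau_est = -1.0 / slope
print(round(tau_est, 3))  # 0.3
```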

  18. Analysis of the 23 June 2001 Southern Peru Earthquake Using Locally Recorded Seismic Data (United States)

    Tavera, H.; Comte, D.; Boroschek, R.; Dorbath, L.; Portugal, D.; Haessler, H.; Montes, H.; Bernal, I.; Antayhua, Y.; Salas, H.; Inza, A.; Rodriguez, S.; Glass, B.; Correa, E.; Balmaceda, I.; Meneses, C.


    The 23 June 2001, Mw=8.4 southern Peru earthquake ruptured the northern and central part of the rupture zone of the previous large earthquake, which occurred on 13 August 1868 (Mw ~9). A detailed analysis of the aftershock sequence was possible due to the deployment of a temporary seismic network along the coast in the Arequipa and Moquegua districts, complementing the Peruvian permanent stations. The deployed temporary network included 10 short-period three-component stations from the U. of Chile-IRD-France and 7 broad-band seismic stations from the Instituto Geofísico del Perú. This network operated during the first weeks after the mainshock and recorded the major aftershocks, such as the largest one on 7 July 2001, Mw=7.5; this event defines the southern limit of the rupture area of the 2001 Peruvian earthquake. The majority of the aftershocks show thrust-faulting focal mechanisms, in accordance with the average convergence direction of the subducting Nazca plate; however, normal-faulting events, such as the 5 July 2001, Mw=6.6 one, are also present in the aftershock sequence. The depth distribution of the events permitted a detailed definition of the Wadati-Benioff zone in the region. The segment between Ilo and Tacna did not participate in the rupture process of the 2001 southern Peru earthquake. Seismicity located near the Peruvian-Chilean political boundary was reliably determined using the data recorded by the northern Chile permanent network. Analysis of the mainshock and aftershock accelerograms recorded in Arica, northern Chile, is also included. The occurrence of the 1995 Antofagasta (Mw=8.0) and the 2001 southern Peru earthquakes suggests that the probability of a major earthquake in the northern Chile region has increased, considering that the previous large earthquake in this region happened in 1877 (Mw ~9) and since that time no earthquake with magnitude Mw>8 has occurred inside the 1877 estimated rupture area (between Arica and Antofagasta).

  19. A Wide-View Parallax-Free Eye-Mark Recorder with a Hyperboloidal Half-Silvered Mirror and Appearance-Based Gaze Estimation. (United States)

    Mori, Hiroki; Sumiya, Erika; Mashita, Tomohiro; Kiyokawa, Kiyoshi; Takemura, Haruo


    In this paper, we propose a wide-view parallax-free eye-mark recorder with a hyperboloidal half-silvered mirror and a gaze estimation method suitable for the device. Our eye-mark recorder provides a wide field-of-view video recording of the user's exact view by positioning the focal point of the mirror at the user's viewpoint. The vertical angle of view of the prototype is 122 degrees (elevation and depression angles are 38 and 84 degrees, respectively) and its horizontal angle of view is 116 degrees (nasal and temporal view angles are 38 and 78 degrees, respectively). We implemented and evaluated a gaze estimation method for our eye-mark recorder. We use an appearance-based approach to support a wide field-of-view. We apply principal component analysis (PCA) and multiple regression analysis (MRA) to determine the relationship between the captured images and their corresponding gaze points. Experimental results verify that our eye-mark recorder successfully captures a wide field-of-view of a user and estimates gaze direction with an angular accuracy of around 2 to 4 degrees.
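
    The PCA-plus-MRA pipeline described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the array shapes, component count, and helper names (`fit_gaze_model`, `predict_gaze`) are all hypothetical.

```python
import numpy as np

def fit_gaze_model(eye_images, gaze_points, n_components=20):
    """Appearance-based gaze model: PCA on flattened eye images,
    then multiple linear regression from PCA scores to gaze coordinates."""
    X = eye_images.reshape(len(eye_images), -1).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # PCA via SVD: rows of Vt are the principal components
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:n_components]
    scores = Xc @ basis.T
    # Multiple regression with intercept: gaze ~ [1, scores] @ W
    A = np.hstack([np.ones((len(scores), 1)), scores])
    W, *_ = np.linalg.lstsq(A, gaze_points, rcond=None)
    return mean, basis, W

def predict_gaze(image, mean, basis, W):
    """Project a new eye image onto the PCA basis and apply the regression."""
    s = (image.ravel().astype(float) - mean) @ basis.T
    return np.hstack([1.0, s]) @ W
```

    In a real calibration, `eye_images` would be frames of the user's eye and `gaze_points` the known calibration targets; the regression then maps unseen eye appearances to gaze directions.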

  20. Development of Software for dose Records Data Base Access; Programacion para la consulta del Banco de Datos Dosimetricos

    Energy Technology Data Exchange (ETDEWEB)

    Amaro, M.


    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose is individual dose follow-up control and data handling for epidemiological studies. Within the Data Base management scheme, software was developed to allow searching of individual dose records by external authorised users. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records of workers included in the Data Base. The report includes the User Guide for the authorised users and listings of the codes and subroutines developed. (Author) 2 refs.

  1. An iPad and Android-based Application for Digitally Recording Geologic Field Data (United States)

    Malinconico, L. L.; Sunderlin, D.; Liew, C.; Ho, A. S.; Bekele, K. A.


    Field experience is a significant component in most geology courses, especially sed/strat and structural geology. Increasingly, the spatial presentation, analysis and interpretation of geologic data is done using digital methodologies (GIS, Google Earth, stereonet and spreadsheet programs). However, students and professionals continue to collect field data manually on paper maps and in the traditional "orange field notebooks". Upon returning from the field, data are then manually transferred into digital formats for processing, mapping and interpretation. The transfer process is both cumbersome and prone to transcription error. In conjunction with the computer science department, we are in the process of developing an application (App) for iOS (the iPad) and Android platforms that can be used to digitally record data measured in the field. This is not a mapping program, but rather a way of bypassing the field-book step by acquiring digital data directly that can then be used in various analysis and display programs. Currently, the application allows the user to select from five different structural data situations: contact, bedding, fault, joints and "other". The user can define a folder for the collection and separation of data for each project. Observations are stored as individual records of field measurements in each folder. The exact information gathered depends on the nature of the observation, but common to all pages is the ability to log date, time, and lat/long directly from the tablet. Information such as strike and dip is entered using scroll wheels, and formation names are also entered using scroll wheels that access easy-to-modify lists of the area's stratigraphic units. This ensures uniformity in the creation of the digital records from day-to-day and across field teams. Pictures linked to each record can also be taken using the tablet's camera. Once field collection is complete, the data (including images) can be easily exported to a .csv file.

  2. An Internet-Based Real-Time Audiovisual Link for Dual MEG Recordings.

    Directory of Open Access Journals (Sweden)

    Andrey Zhdanov

    Most neuroimaging studies of human social cognition have focused on brain activity of single subjects. More recently, "two-person neuroimaging" has been introduced, with simultaneous recordings of brain signals from two subjects involved in social interaction. These simultaneous "hyperscanning" recordings have already been carried out with a spectrum of neuroimaging modalities, such as functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and functional near-infrared spectroscopy (fNIRS). We have recently developed a setup for simultaneous magnetoencephalographic (MEG) recordings of two subjects that communicate in real time over an audio link between two geographically separated MEG laboratories. Here we present an extended version of the setup, where we have added a video connection and replaced the telephone-landline-based link with an Internet connection. Our setup enabled transmission of video and audio streams between the sites with a one-way communication latency of about 130 ms. Our software that allows reproducing the setup is publicly available. We demonstrate that the audiovisual Internet-based link can mediate real-time interaction between two subjects who try to mirror each other's hand movements that they can see via the video link. All nine pairs were able to synchronize their behavior. In addition to the video, we captured the subjects' movements with accelerometers attached to their index fingers; we determined from these signals that the average synchronization accuracy was 215 ms. In one subject pair we demonstrate inter-subject coherence patterns of the MEG signals that peak over the sensorimotor areas contralateral to the hand used in the task.

  3. Feasibility of ensuring confidentiality and security of computer-based patient records. Council on Scientific Affairs, American Medical Association. (United States)


    Legal and ethical precepts that apply to paper-based medical records, including requirements that patient records be kept confidential, accurate and legible, secure, and free from unauthorized access, should also apply to computer-based patient records. Sources of these precepts include federal regulations, state medical practice acts, licensing statutes and the regulations that implement them, accreditation standards, and professional codes of ethics. While the legal and ethical principles may not change, the risks to confidentiality and security of patient records appear to differ between paper- and computer-based records. Breaches of system security, the potential for faulty performance that may result in inaccessibility or loss of records, the increased technical ability to collect, store, and retrieve large quantities of data, and the ability to access records from multiple and (sometimes) remote locations are among the risk factors unique to computer-based record systems. Managing these risks will require a combination of reliable technological measures, appropriate institutional policies and governmental regulations, and adequate penalties to serve as a dependable deterrent against the infringement of these precepts.

  4. Estimating the frequency of extremely energetic solar events, based on solar, stellar, lunar, and terrestrial records

    CERN Document Server

    Schrijver, C J; Baltensperger, U; Cliver, E W; Guedel, M; Hudson, H S; McCracken, K G; Osten, R A; Peter, Th; Soderblom, D R; Usoskin, I G; Wolff, E W


    The most powerful explosions on the Sun [...] drive the most severe space-weather storms. Proxy records of flare energies based on SEPs in principle may offer the longest time base to study infrequent large events. We conclude that one suggested proxy, nitrate concentrations in polar ice cores, does not map reliably to SEP events. Concentrations of select radionuclides measured in natural archives may prove useful in extending the time interval of direct observations up to ten millennia, but as their calibration to solar flare fluences depends on multiple poorly known properties and processes, these proxies cannot presently be used to help determine the flare energy frequency distribution. Being thus limited to the use of direct flare observations, we evaluate the probabilities of large-energy solar explosions by combining solar flare observations with an ensemble of stellar flare observations. We conclude that solar flare energies form a relatively smooth distribution from small events to large flares, while...

  5. An evaluation of a teaching package constructed using a Web-based lecture recorder

    Directory of Open Access Journals (Sweden)

    Judith Segal


    This paper reports on an evaluation of a teaching package constructed using Audiograph, a Web-based lecture recorder developed at the University of Surrey. Audiograph is described in detail in Jesshope and Shafarenko (1997). Its developer aims to provide a medium by which multimedia teaching packages, based on traditional university lectures, may be developed rapidly and at low cost by the lecturer(s) concerned (as opposed to professional CAL developers). Audiograph is designed so that development time should be only in the order of two hours for every hour of presentation. Packages developed using Audiograph make much use of audio, which is somewhat unusual (apart from in video clips) in a package not dedicated to Computer-Assisted Language Learning or to addressing learning difficulties associated with vision. They also use text and (some) animation.

  6. Eielson Air Force Base operable unit 2 and other areas record of decision

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, R.E.; Smith, R.M.


    This decision document presents the selected remedial actions and no action decisions for Operable Unit 2 (OU2) at Eielson Air Force Base (AFB), Alaska, chosen in accordance with state and federal regulations. This document also presents the decision that no further action is required for 21 other source areas at Eielson AFB. This decision is based on the administrative record file for this site. OU2 addresses sites contaminated by leaks and spills of fuels. Soils contaminated with petroleum products occur at or near the source of contamination. Contaminated subsurface soil and groundwater occur in plumes on the top of a shallow groundwater table that fluctuates seasonally. These sites pose a risk to human health and the environment because of ingestion, inhalation, and dermal contact with contaminated groundwater. The purpose of this response is to prevent current or future exposure to the contaminated groundwater, to reduce further contaminant migration into the groundwater, and to remediate groundwater.

  7. Design of Secured Ground Vehicle Event Data Recorder for Data Analysis

    Directory of Open Access Journals (Sweden)

    Mr. Love Sharma


    The Event Data Recorder (EDR) is now one of the important components installed in vehicles by automakers, since it helps produce an independent measurement of crash severity that is far better than the traditional systems used. Limited research has been done in this domain. In this paper we propose an EDR that is based on an ARM controller and senses alcohol, brake application, speed, location, humidity, and temperature. The data collected from the sensors are aggregated using a threshold-based technique, then encrypted using RC6, and finally mined for knowledge using top-k rules.
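
    The threshold-based aggregation step mentioned in the abstract might look like the following sketch. The sensor names and threshold values are hypothetical placeholders, and the downstream RC6 encryption and top-k rule mining stages are not shown.

```python
# Hypothetical sensor thresholds; a real EDR's calibration would differ.
THRESHOLDS = {"speed_kmh": 120.0, "alcohol_ppm": 150.0, "temp_c": 60.0}

def aggregate(readings):
    """Threshold-based aggregation: keep only samples in which at least
    one sensor exceeds its threshold, so the recorder stores candidate
    crash/violation events rather than the raw sensor stream."""
    return [r for r in readings
            if any(r.get(k, 0.0) > v for k, v in THRESHOLDS.items())]
```

    Each retained record would then be encrypted and logged; sub-threshold samples are discarded to save storage on the controller.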

  8. Analysis of microseismic signals and temperature recordings for rock slope stability investigations in high mountain areas

    Directory of Open Access Journals (Sweden)

    C. Occhiena


    Permafrost degradation is a probable cause of the increase in rock instabilities and rock falls observed in recent years in high mountain areas, particularly in the Alpine region. The phenomenon causes the thawing of the ice filling rock discontinuities; the resulting water subsequently freezes again, inducing stresses in the rock mass that may lead, in the long term, to rock falls. To investigate these processes, a monitoring system composed of geophones and thermometers was installed in 2007 at the Carrel hut (3829 m a.s.l., Matterhorn, NW Alps). In 2010, in the framework of the Interreg 2007–2013 Alcotra project no. 56 MASSA, the monitoring system was upgraded and renovated in order to meet project needs.

    In this paper, the data recorded by this renewed system between 6 October 2010 and 5 October 2011 are presented and 329 selected microseismic events are analysed. The data processing has concerned the classification of the recorded signals, the analysis of their distribution in time and the identification of the most important trace characteristics in time and frequency domain. The interpretation of the results has evidenced a possible correlation between the temperature trend and the event occurrence.

    The research is still in progress and the data recording and interpretation are planned for a longer period to better investigate the spatial-temporal distribution of microseismic activity in the rock mass, with specific attention to the relation of microseismic activity with temperatures. The overall goal is to verify the possibility to set up an effective monitoring system for investigating the stability of a rock mass under permafrost conditions, in order to supply the researchers with useful data to better understand the relationship between temperature and rock mass stability and, possibly, the technicians with a valid tool for decision-making.

  9. A compact self-recording pressure based sea level gauge suitable for deployments at harbour and offshore environments

    Digital Repository Service at National Institute of Oceanography (India)

    Desa, E.; Peshwe, V.B.; Joseph, A.; Mehra, P.; Naik, G.P.; Kumar, V.; Desa, E.S.; Desai, R.G.P.; Nagvekar, S.; Desai, S.P.

    A compact and lightweight self-recording pressure based sea level gauge has been designed to suit deployments from harbour and offshore environments. A novel hydraulic coupling device designed in-house was used to transfer the seawater pressure...

  10. Astronomical calibration and global correlation of the Santonian (Cretaceous) based on the marine carbon isotope record (United States)

    Thibault, N.; Jarvis, I.; Voigt, S.; Gale, A. S.; Attree, K.; Jenkyns, H. C.


    High-resolution records of bulk carbonate carbon isotopes have been generated for the Upper Coniacian to Lower Campanian interval of the sections at Seaford Head (southern England) and Bottaccione (central Italy). An unambiguous stratigraphic correlation is presented for the base and top of the Santonian between the Boreal and Tethyan realms. Orbital forcing of carbon and oxygen isotopes at Seaford Head points to the Boreal Santonian spanning five 405 kyr cycles (Sa1 to Sa5). Correlation of the Seaford Head time scale to that of the Niobrara Formation (Western Interior Basin) permits anchoring these records to the La2011 astronomical solution at the Santonian-Campanian (Sa/Ca) boundary, which has been recently dated to 84.19 ± 0.38 Ma. Among the five tuning options examined, option 2 places the Sa/Ca at the 84.2 Ma 405 kyr insolation minimum and appears as the most likely. This solution indicates that minima of the 405 kyr filtered output of the resistivity in the Niobrara Formation correlate to 405 kyr insolation minima in the astronomical solution and to maxima in the filtered δ13C of Seaford Head. We suggest that variance in δ13C is driven by climate forcing of the proportions of CaCO3 versus organic carbon burial on land and in oceanic basins. The astronomical calibration generates a 200 kyr mismatch of the Coniacian-Santonian boundary age between the Boreal Realm in Europe and the Western Interior, due either to diachronism of the lowest occurrence of the inoceramid Cladoceramus undulatoplicatus between the two regions or to remaining uncertainties of radiometric dating and cyclostratigraphic records.

  11. Validation of connectivity-based thalamic segmentation with direct electrophysiologic recordings from human sensory thalamus. (United States)

    Elias, W Jeffrey; Zheng, Zhong A; Domer, Paul; Quigg, Mark; Pouratian, Nader


    Connectivity-based segmentation has been used to identify functional gray matter subregions that are not discernable on conventional magnetic resonance imaging. However, the accuracy and reliability of this technique has only been validated using indirect means. In order to provide direct electrophysiologic validation of connectivity-based thalamic segmentations within human subjects, we assess the correlation of atlas-based thalamic anatomy, connectivity-based thalamic maps, and somatosensory evoked thalamic potentials in two adults with medication-refractory epilepsy who were undergoing intracranial EEG monitoring with intrathalamic depth and subdural cortical strip electrodes. MRI with atlas-derived localization was used to delineate the anatomic boundaries of the ventral posterolateral (VPL) nucleus of the thalamus. Somatosensory evoked potentials with intrathalamic electrodes physiologically identified a discrete region of phase reversal in the ventrolateral thalamus. Finally, DTI was obtained so that probabilistic tractography and connectivity-based segmentation could be performed to correlate the region of thalamus linked to sensory areas of the cortex, namely the postcentral gyrus. We independently utilized these three different methods in a blinded fashion to localize the "sensory" thalamus, demonstrating a high-degree of reproducible correlation between electrophysiologic and connectivity-based maps of the thalamus. This study provides direct electrophysiologic validation of probabilistic tractography-based thalamic segmentation. Importantly, this study provides an electrophysiological basis for using connectivity-based segmentation to further study subcortical anatomy and physiology while also providing the clinical basis for targeting deep brain nuclei with therapeutic stimulation. Finally, these direct recordings from human thalamus confirm early inferences of a sensory thalamic component of the N18 waveform in somatosensory evoked potentials.

  12. Visibility Graph Based Time Series Analysis (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie


    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
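
    The natural-visibility mapping that such methods build on can be sketched in a few lines. This brute-force O(n³) version is for illustration only, assuming the standard natural visibility criterion (two points are linked if the straight line of sight between them clears every intermediate point); it is not the authors' temporal network-of-networks construction.

```python
def visibility_graph(series):
    """Natural visibility graph: node i links to node j (i < j) iff every
    intermediate sample k lies strictly below the line of sight joining
    (i, series[i]) and (j, series[j])."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[i]
                + (series[j] - series[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges
```

    Adjacent samples are always mutually visible (the criterion is vacuous for them), so the resulting graph is always connected; the interesting structure comes from the longer-range links.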

  13. Vocal registers of the countertenor voice: Based on signals recorded and analyzed in VoceVista (United States)

    Chenez, Raymond

    Today's countertenors possess vocal ranges similar to the mezzo-soprano, and are trained to sing with a vibrant, focused tone. Little research has been conducted on the registers of the countertenor voice. Advancement in vocal techniques in the countertenor voice from the late 20th century to the present has been rapid. This treatise attempts to define the registers of the countertenor voice, and is intended as a resource for singers and teachers. The voices of eleven North American countertenors were recorded and analyzed using VoceVista Pro software, which was developed and designed by Donald Miller. Through spectrographic and electroglottographic analysis, the registers of the countertenor voice were identified and outlined.

  14. Stochasticity of Road Traffic Dynamics: Comprehensive Linear and Nonlinear Time Series Analysis on High Resolution Freeway Traffic Records

    CERN Document Server

    Siegel, Helge; Belomestnyi, Dennis


    The dynamical properties of road traffic time series from North-Rhine Westphalian motorways are investigated. The article shows that road traffic dynamics is well described as a persistent stochastic process with two fixed points representing the free-flow (non-congested) and the congested state regimes. These traffic states have different statistical properties with respect to waiting-time distribution, velocity distribution, and autocorrelation. Log-differences of velocity records reveal a non-normal, clearly leptokurtic distribution. Further, linear and nonlinear phase-plane based analysis methods yield no evidence for any determinism or deterministic chaos being involved in traffic dynamics on shorter than diurnal time scales. Several Hurst-exponent estimators indicate long-range dependence for the free-flow state. Finally, our results are not in accordance with the typical heuristic fingerprints of self-organized criticality. We suggest the more simplistic assumption of a non-critical phase transition between...

  15. Long-term neural recordings using MEMS based moveable microelectrodes in the brain

    Directory of Open Access Journals (Sweden)

    Nathan Jackson


    One of the critical requirements of the emerging class of neural prosthetic devices is to maintain good-quality neural recordings over long time periods. We report here a novel MEMS (Micro-ElectroMechanical Systems) based technology that can move microelectrodes in the event of deterioration in the neural signal in order to sample a new set of neurons. Microscale electro-thermal actuators are used to controllably move microelectrodes post-implantation in steps of approximately 9 µm. In this study, a total of 12 moveable microelectrode chips were individually implanted in adult rats. Two of the 12 moveable microelectrode chips were not moved over a period of 3 weeks and were treated as control experiments. During the first three weeks of implantation, moving the microelectrodes led to an improvement in the average SNR from 14.61 ± 5.21 dB before movement to 18.13 ± 4.99 dB after movement across all microelectrodes and all days. However, the average RMS values of noise amplitudes were similar at 2.98 ± 1.22 µV and 3.01 ± 1.16 µV before and after microelectrode movement. Beyond three weeks, the primary observed failure mode was biological rejection of the PMMA (dental cement) based skull mount, resulting in the device loosening and eventually falling from the skull. Additionally, the average SNR for functioning devices beyond three weeks was 11.88 ± 2.02 dB before microelectrode movement and was significantly different (p<0.01) from the average SNR of 13.34 ± 0.919 dB after movement. The results of this study demonstrate that MEMS-based technologies can move microelectrodes in rodent brains in long-term experiments, resulting in improvements in signal quality. Further improvements in packaging and surgical techniques will potentially enable movable microelectrodes to record cortical neuronal activity in chronic experiments.

  16. Experimental analysis of decay biases in the fossil record of lobopodians (United States)

    Murdock, Duncan; Gabbott, Sarah; Purnell, Mark


    If fossils are to realize their full potential in reconstructing the tree of life, we must understand how our view of ancient organisms is obscured by the taphonomic filters of decay and preservation. In most cases, processes of decay will leave behind either nothing or only the most decay-resistant body parts, and even in those rare instances where soft tissues are fossilized we cannot assume that the resulting fossil, however exquisite, represents a faithful anatomical representation of the animal as it was in life. Recent experiments have shown that the biases introduced by decay can be far from random; in chordates, for example, the most phylogenetically informative characters are also the most decay-prone, resulting in 'stemward slippage'. But how widespread is this phenomenon, and are there other non-random biases linked to decay? Intuitively, we make assumptions about the likelihood of different kinds of characters surviving and being preserved, with knock-on effects for anatomical and phylogenetic interpretations. To what extent are these assumptions valid? We combine our understanding of the fossil record of lobopodians with insights from decay experiments on modern onychophorans (velvet worms) to test these assumptions. Our analysis demonstrates that taphonomically informed tests of character interpretations have the potential to improve phylogenetic resolution. This approach is widely applicable to the fossil record, allowing us to ground-truth some of the assumptions involved in describing exceptionally preserved fossil material.

  17. Conflict Detection Performance Analysis for Function Allocation Using Time-Shifted Recorded Traffic Data (United States)

    Guerreiro, Nelson M.; Butler, Ricky W.; Maddalon, Jeffrey M.; Hagen, George E.; Lewis, Timothy A.


    The performance of the conflict detection function in a separation assurance system is dependent on the content and quality of the data available to perform that function. Specifically, data quality and data content available to the conflict detection function have a direct impact on the accuracy of the prediction of an aircraft's future state or trajectory, which, in turn, impacts the ability to successfully anticipate potential losses of separation (detect future conflicts). Consequently, other separation assurance functions that rely on the conflict detection function - namely, conflict resolution - are prone to negative performance impacts. The many possible allocations and implementations of the conflict detection function between centralized and distributed systems drive the need to understand the key relationships that impact conflict detection performance, with respect to differences in data available. This paper presents the preliminary results of an analysis technique developed to investigate the impacts of data quality and data content on conflict detection performance. Flight track data recorded from a day of the National Airspace System is time-shifted to create conflicts not present in the un-shifted data. A methodology is used to smooth and filter the recorded data to eliminate sensor fusion noise, data drop-outs and other anomalies in the data. The metrics used to characterize conflict detection performance are presented and a set of preliminary results is discussed.

  18. [Internal audit based on the recording of critical incidents: the first results]. (United States)

    Terekhova, N N; Kazakova, E A; Sitnikov, A V


    The critical incident concept on which an internal medical audit is based has been proposed to comparatively assess different protocols of anesthesiological support. The purpose of this study was to develop such a procedure and to implement it at an anesthesiological unit. The study included and analyzed 361 anesthesiological supports. The list of critical incidents (CIs) contained 53 items and was divided into 8 main groups. CIs were recorded in 42.1% of anesthesias: a total of 304 CIs were noted, and the frequency of CIs (the number of recorded CIs per anesthesia) was 0.84. The bulk of the CIs were associated with the cardiovascular system and various allergic reactions. The study also yielded data on the distribution of CIs in relation to the type of anesthesiological support, the type of surgical intervention, and the physical status of the patient (according to the ASA classification). This study has only taken the first steps toward internal audit and has shown the importance of its routine use in assessing different procedures for anesthesiological support.

  19. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited) (United States)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.


    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. 
At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  20. Provincial prenatal record revision: a multiple case study of evidence-based decision-making at the population-policy level

    Directory of Open Access Journals (Sweden)

    Olson Joanne


    Abstract Background There is a significant gap in the knowledge translation literature related to how research evidence actually contributes to health care decision-making. Decisions around what care to provide at the population (rather than individual) level are particularly complex, involving considerations such as feasibility, cost, and population needs in addition to scientific evidence. One example of decision-making at this "population-policy" level involves what screening questions and intervention guides to include on standardized provincial prenatal records. As mandatory medical reporting forms, prenatal records are potentially powerful vehicles for promoting population-wide evidence-based care. However, the extent to which Canadian prenatal records reflect best-practice recommendations for the assessment of well-known risk factors such as maternal smoking and alcohol consumption varies markedly across Canadian provinces and territories. The goal of this study is to better understand the interaction of contextual factors and research evidence on decision-making at the population-policy level, by examining the processes by which provincial prenatal records are reviewed and revised. Methods Guided by Dobrow et al.'s (2004) conceptual model for context-based evidence-based decision-making, this study will use a multiple case study design with embedded units of analysis to examine contextual factors influencing the prenatal record revision process in different Canadian provinces and territories. Data will be collected using multiple methods to construct detailed case descriptions for each province/territory. Using qualitative data analysis techniques, decision-making processes involving prenatal record content specifically related to maternal smoking and alcohol use will be compared both within and across each case, to identify key contextual factors influencing the uptake and application of research evidence by prenatal record review

  1. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli


    Full Text Available Event-related potentials (ERPs) are an informative means of measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial highlights several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered are also discussed.
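The core of the extraction pipeline the tutorial describes is epoching and trial averaging. A minimal sketch (the sampling rate, trial count, noise level, and evoked waveform below are illustrative assumptions, not values from the tutorial):

```python
import numpy as np

fs = 250                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
n_trials = 100
pre, post = int(0.2 * fs), int(0.5 * fs)  # 200 ms baseline, 500 ms post-stimulus

# Simulated continuous EEG: background noise plus a small evoked deflection
# (an assumed half-sine template) added at each event marker.
events = np.arange(1, n_trials + 1) * fs          # one event per second
eeg = rng.normal(0.0, 5.0, events[-1] + fs)       # noise, ~5 uV std
evoked = 2.0 * np.sin(np.linspace(0, np.pi, post))
for onset in events:
    eeg[onset:onset + post] += evoked

# Epoch around each event, subtract the pre-stimulus baseline per trial,
# and average across trials so the phase-locked ERP emerges from the noise.
epochs = np.stack([eeg[e - pre:e + post] for e in events])
epochs = epochs - epochs[:, :pre].mean(axis=1, keepdims=True)
erp = epochs.mean(axis=0)

print(round(float(erp[pre:].max()), 2))   # peak near the 2 uV template amplitude
```

Averaging n trials suppresses non-phase-locked noise by roughly a factor of sqrt(n), which is why the 2 uV deflection is recoverable from 5 uV noise here.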

  2. Detrended fluctuation analysis of daily temperature records: Geographic dependence over Australia

    CERN Document Server

    Kir'aly, A; Kir\\'aly, Andrea; J\\'anosi, Imre M.


    Daily temperature anomaly records are analyzed (61 for Australia, 18 for Hungary) by means of detrended fluctuation analysis. Positive long-range asymptotic correlations extending up to 5-10 years are detected in each case. Contrary to earlier claims, the correlation exponent is not universal for continental stations. Interestingly, the dominant factor over Australia is geographic latitude: the general tendency is a decrease of the correlation exponent with increasing distance from the equator. This tendency is in complete agreement with the results found by Tsonis et al. (1999) for 500-hPa height anomalies in the northern hemisphere. The variance of fluctuations exhibits the opposite trend: the larger the distance from the equator, the larger the amplitude of intrinsic fluctuations. The presence of the Tropospheric Biennial Oscillation is clearly identified for three stations at the north-eastern edge of the Australian continent.
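Detrended fluctuation analysis, as applied above to temperature anomalies, can be sketched as follows (order-1 DFA; the scale choices and the white-noise test series are our own illustration, not the paper's data):

```python
import numpy as np

def dfa_exponent(x, scales):
    """Order-1 DFA: return the scaling exponent alpha of series x."""
    profile = np.cumsum(x - np.mean(x))        # integrate the anomaly series
    fluct = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # Fit and remove a linear trend in every segment, then take the
        # RMS of the residuals as the fluctuation function F(s).
        coeffs = np.polyfit(t, segs.T, 1)      # batch fit, one column per segment
        trends = np.outer(coeffs[0], t) + coeffs[1][:, None]
        fluct.append(np.sqrt(np.mean((segs - trends) ** 2)))
    # alpha is the slope of log F(s) vs. log s.
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

rng = np.random.default_rng(1)
white = rng.normal(size=20000)                 # uncorrelated test series
alpha = dfa_exponent(white, [16, 32, 64, 128, 256])
print(round(float(alpha), 2))                  # near 0.5 for uncorrelated data
```

Uncorrelated data give alpha near 0.5; the positive long-range correlations reported in the abstract correspond to alpha > 0.5.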

  3. Thermal Analysis of Heat-Assisted Magnetic Recording Optical Head with Laser Diode on Slider (United States)

    Xu, Baoxi; Chia, Cheow Wee; Zhang, Qide; Teck Toh, Yeow; An, Chengwu; Vienne, Guillaume


    For the optical head used in heat-assisted magnetic recording (HAMR), mounting a laser diode chip on the slider offers a more integrated, compact, and stable design. However, the heat generated by the laser diode will cause the head temperature to increase, which may decrease the laser output power and change the slider flying status. In this paper, the thermal analysis of the HAMR head including the laser diode and a transducer is conducted. The effects of the laser diode power, the power absorbed by the transducer, boundary thermal resistance between the laser diode chip and the slider substrate, and slider fly speed and fly height on the laser temperature increase, the transducer temperature increase, and the air-bearing surface temperature distribution are studied. The deformation of the air-bearing surface caused by its temperature change is also analyzed.

  4. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo


    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies for all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of that time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over
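The multi-day extreme metric mentioned above (maximum average daily precipitation at temporal scales of 2-10 days) amounts to taking, for each window length k, the largest k-day running mean of the daily series. A sketch on synthetic data (the gamma-distributed daily totals and the embedded episode are invented for illustration):

```python
import numpy as np

def max_running_mean(daily, k):
    """Largest k-day running-mean precipitation in a daily series."""
    kernel = np.ones(k) / k
    return float(np.convolve(daily, kernel, mode="valid").max())

rng = np.random.default_rng(5)
daily = rng.gamma(0.3, 4.0, size=365)      # a year of synthetic daily totals (mm)
daily[330:340] += 40.0                     # embed a 10-day extreme episode

# Maximum k-day average for every temporal scale from 2 to 10 days.
maxima = {k: round(max_running_mean(daily, k), 1) for k in range(2, 11)}
print(maxima)                              # the embedded episode dominates all scales
```

An event like December 1876 shows up exactly this way: one episode sets the record simultaneously across the whole 2-10-day range of window lengths.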

  5. Analysis of the Impact of Wildfire on Surface Ozone Record in the Colorado Front Range (United States)

    McClure-Begley, A.; Petropavlovskikh, I. V.; Oltmans, S. J.; Pierce, R. B.; Sullivan, J. T.; Reddy, P. J.


    Ozone plays an important role in the oxidation capacity of the atmosphere, and at ground level has negative impacts on human health and ecosystem processes. In order to understand the dynamics and variability of surface ozone, it is imperative to analyze individual sources, interactions between sources, transport, and the chemical processes of ozone production and accumulation. Biomass burning and wildfires are known to emit a suite of particulate matter and gaseous compounds into the atmosphere. These compounds, such as volatile organic compounds, carbon monoxide, and nitrogen oxides, are precursor species which aid in the photochemical production and destruction of ozone. The Colorado Front Range (CFR) is a region of complex interactions between pollutant sources and meteorological conditions which result in the accumulation of ozone. High ozone events in the CFR associated with fires are analyzed for 2003-2014 to develop an understanding of the large-scale influence and variability of ozone and wildfire relationships. This study provides analysis of the frequency of enhanced ozone episodes that can be confirmed to have been affected by transported fires and smoke plumes. Long-term records of surface ozone data from the CFR provide information on the impact of wildfire pollutants on seasonal and diurnal ozone behavior. Years with increased local fire activity, as well as years with increased long-range transport of smoke plumes, are evaluated for their effect on the long-term record and the frequency of high ozone at each location. Meteorological data, MODIS fire detection images, NOAA HYSPLIT back-trajectory analysis, the NOAA smoke verification model, fire tracer data (K+), the RAQMS model, carbon monoxide data, and aerosol optical depth retrievals are used with NOAA Global Monitoring Division surface ozone data from three sites in Colorado. This allows for investigation of the interactions between pollutants and meteorology which result in high surface ozone levels.

  6. Preservation and analysis of footprint evidence within the archaeological record: examples from Valsequillo and Cuatrocienegas, Mexico. (United States)

    Bennett, M.; Huddart, D.; Gonzalez, S.


    Human footprints provide a direct record of human occupation and can be used to make a range of biometric inferences about the individuals who left them. In this paper we describe the application of three-dimensional optical laser scanning in the preservation and analysis of both human and animal footprints. Optical laser scanning provides a digital elevation model of a print or surface with a vertical accuracy typically better than ±0.01 mm. Not only does this provide a procedure for recording fragile footprint evidence, but it also allows digital measurements to be made. It is also possible to use the techniques developed for rapid prototyping to recreate the print as a solid model for visualisation. The role of optical laser scanning in the preservation of footprint evidence is explored with specific reference to the controversial footprints of the Valsequillo Basin in Central Mexico, which may provide some of the earliest evidence of human colonization of the Americas. More importantly, digital footprint scans provide a basis for the numerical analysis of footprints, allowing the tools of geometric morphometrics to be applied. These tools have been widely developed in the fields of biology and physical anthropology and used to explore the anatomical significance of shape. One key question that can be addressed in this way is the objective, statistical recognition of a human footprint, thereby helping to verify their interpretation and archaeological significance. Using footprint data from sites across the world, a statistical model for the recognition of human footprints is presented and used to evaluate the controversial footprint site of Valsequillo (Puebla State), preserved in volcanic ash, and those in the Cuatrocienegas Basin (Coahuila State), preserved in travertine.

  7. A new method for estimating morbidity rates based on routine electronic medical records in primary care

    NARCIS (Netherlands)

    Nielen, M.; Spronk, I.; Davids, R.; Korevaar, J.; Poos, R.; Hoeymans, N.; Opstelten, W.; Sande, M. van der; Biermans, M.; Schellevis, F.; Verheij, R.


    Background & Aim: Routinely recorded electronic health records (EHRs) from general practitioners (GPs) are increasingly available and provide valuable data for estimating incidence and prevalence rates of diseases in the general population. Valid morbidity rates are essential for patient management

  8. The implantable loop recorder and its mammographic appearance: A case based approach. (United States)

    Steinberger, Sharon; Margolies, Laurie R


    The normal radiographic appearance of implantable loop recorders has been illustrated in the radiology literature; however, their mammographic appearance has not been described. Breast imagers should become familiar with the appearance of loop recorders in order to create an accurate report. In this paper we report 3 cases of patients with implantable loop recorders who underwent mammography. We describe the types and components of implantable loop recorders, indications for their placement, and their classic appearance on mammography.

  9. Comparison of individual follow-up and computerized record linkage using the Canadian Mortality Data Base. (United States)

    Shannon, H S; Jamieson, E; Walsh, C; Julian, J A; Fair, M E; Buffet, A


    We compared two methods of ascertaining mortality in a historical prospective mortality study. Computerized Record Linkage (CRL) with the centralized historical Canadian Mortality Data Base (CMDB) was carried out on 2469 men and an attempt was also made to trace the subjects by individual follow-up (IFU). All but 88 were traced and 60 were reported to be dead. CRL was able to locate the deaths of three men who had been untraced by IFU. Contradictory information on vital status was obtained on 5 subjects--in 4 of them, the discrepancy was resolved in favour of CRL. Overall, CRL using the CMDB performed very well. We also consider factors that affect the relative costs of the two methods, which should be balanced against the accuracy of information obtained.

  10. A cloud-based approach for interoperable electronic health records (EHRs). (United States)

    Bahga, Arshdeep; Madisetti, Vijay K


    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.

  11. Video-Based Motion Analysis (United States)

    French, Paul; Peterson, Joel; Arrighi, Julie


    Video-based motion analysis has recently become very popular in introductory physics classes. This paper outlines general recommendations regarding equipment and software; videography issues such as scaling, shutter speed, lighting, background, and camera distance; as well as other methodological aspects. Also described are the measurement and modeling of the gravitational, drag, and Magnus forces on 1) a spherical projectile undergoing one-dimensional motion and 2) a spinning spherical projectile undergoing motion within a plane. Measurement and correction methods are devised for four common, major sources of error: parallax, lens distortion, discretization, and improper scaling.
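The one-dimensional case described above (a falling sphere with gravity and drag) can be sketched numerically; video-derived position data would be fit against such a model. The mass, drag coefficient, and cross-section below are illustrative assumptions, not the paper's measured values:

```python
import numpy as np

# A sphere falling through air with quadratic drag, F_drag = k v^2.
m, g = 0.0027, 9.81                      # assumed mass (kg) and gravity (m/s^2)
rho, Cd, A = 1.2, 0.5, np.pi * 0.02**2   # air density, drag coeff., cross-section
k = 0.5 * rho * Cd * A

def simulate(t_end, dt=1e-4):
    """Forward-Euler integration of the 1-D equation of motion (downward +)."""
    v, y = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = g - (k / m) * v * v          # gravity minus drag deceleration
        v += a * dt
        y += v * dt
    return v, y

v_term = np.sqrt(m * g / k)              # analytic terminal velocity
v_end, _ = simulate(3.0)
print(round(float(v_term), 2), round(float(v_end), 2))
```

After a few time constants (v_term / g, under a second for these parameters) the simulated speed converges to the analytic terminal velocity, a useful sanity check before fitting the model to video data.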

  12. Tremor recording and analysis as a tool for target localisation in thalamotomy and DBS for tremor

    NARCIS (Netherlands)

    Journee, HL; Hamoen, DJ; Staal, MJ; Sclabassi, R; Haaxma, R; Elands, A; Hummel, JJJ; Boom, H; Robinson, C; Rutten, W; Neuman, M; Wijkstra, H


    The objective of this work was to design and use a tremor recording and analysis system for stereotactic thalamotomy and thalamus stimulation (DBS). A notebook-PC-based system was developed. The tremor was measured by acceleration transducers or EMG. The method was used to confirm the definitive localization of t

  13. Patients’ Acceptance towards a Web-Based Personal Health Record System: An Empirical Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Fong-Lin Jang


    Full Text Available The health care sector has become increasingly interested in developing personal health record (PHR) systems as an Internet-based telehealthcare implementation to improve the quality and decrease the cost of care. However, the factors that influence patients’ intention to use PHR systems remain unclear. Based on physicians’ therapeutic expertise, we implemented a web-based PHR system for infertility care and proposed an extended Technology Acceptance Model (TAM) that integrates the physician-patient relationship (PPR) construct into TAM’s original perceived ease of use (PEOU) and perceived usefulness (PU) constructs to explore which factors will influence the behavioral intentions (BI) of infertile patients to use the PHR. Of ninety participants from a medical center, 50 valid responses to a self-rating questionnaire were collected, yielding a response rate of 55.56%. The partial least squares (PLS) technique was used to assess the causal relationships that were hypothesized in the extended model. The results indicate that infertile patients expressed a moderately high intention to use the PHR system. The PPR and PU of patients had significant effects on their BI to use the PHR, whereas the PEOU indirectly affected the patients’ BI through the PU. This investigation confirms that the PPR can have a critical role in shaping patients’ perceptions of the use of healthcare information technologies. Hence, we suggest that hospitals should promote the potential usefulness of PHRs and improve the quality of the physician-patient relationship to increase patients’ intention to use them.

  14. Massively parallel recording of unit and local field potentials with silicon-based electrodes. (United States)

    Csicsvari, Jozsef; Henze, Darrell A; Jamieson, Brian; Harris, Kenneth D; Sirota, Anton; Barthó, Péter; Wise, Kensall D; Buzsáki, György


    Parallel recording of neuronal activity in the behaving animal is a prerequisite for our understanding of neuronal representation and storage of information. Here we describe the development of micro-machined silicon microelectrode arrays for unit and local field recordings. The two-dimensional probes with 96 or 64 recording sites provided high-density recording of unit and field activity with minimal tissue displacement or damage. The on-chip active circuit eliminated movement and other artifacts and greatly reduced the weight of the headgear. The precise geometry of the recording tips allowed for the estimation of the spatial location of the recorded neurons and for high-resolution estimation of extracellular current source density. Action potentials could be simultaneously recorded from the soma and dendrites of the same neurons. Silicon technology is a promising approach for high-density, high-resolution sampling of neuronal activity in both basic research and prosthetic devices.

  15. Reconstruction of Subdecadal Changes in Sunspot Numbers Based on the NGRIP 10Be Record (United States)

    Inceoglu, F.; Knudsen, M. F.; Karoff, C.; Olsen, J.


    Sunspot observations since 1610 A.D. show that the solar magnetic activity displays long-term changes, from Maunder Minimum-like low-activity states to Modern Maximum-like high-activity episodes, as well as short-term variations, such as the pronounced 11-year periodicity. Information on changes in solar activity levels before 1610 relies on proxy records of solar activity stored in natural archives, such as 10Be in ice cores and 14C in tree rings. These cosmogenic radionuclides are produced by the interaction between Galactic cosmic rays (GCRs) and atoms in the Earth's atmosphere; their production rates are anti-correlated with the solar magnetic activity. The GCR intensity displays a distinct 11-year periodicity due to solar modulation of the GCRs in the heliosphere, which is inversely proportional to, but out of phase with, the 11-year solar cycle. This implies a time lag between the actual solar cycles and the GCR intensity, which is known as the hysteresis effect. In this study, we use the North Greenland Ice Core Project (NGRIP) records of the 10Be flux to reconstruct the solar modulation strength (Φ), which describes the modulation of GCRs throughout the heliosphere, and from it both long-term and subdecadal changes in sunspot numbers (SSNs). We compare three different approaches for reconstructing subdecadal-scale changes in SSNs, including a linear approach and two approaches based on the hysteresis effect, i.e. models with ellipse-linear and ellipse relationships between Φ and SSNs. We find that the ellipse approach provides an amplitude-sensitive reconstruction and the highest cross-correlation coefficients in comparison with the ellipse-linear and linear approaches. The long-term trend in the reconstructed SSNs is computed using a physics-based model and agrees well with the other group SSN reconstructions. The new empirical approach, combining a physics-based model with ellipse-modeling of the 11-year cycle, therefore provides a method for

  16. Factors that influence the efficiency of beef and dairy cattle recording system in Kenya: A SWOT-AHP analysis. (United States)

    Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J


    Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of the factors was done by strengths, weaknesses, opportunities and threats-analytical hierarchy process (SWOT-AHP) analysis, and priority scores reflecting their relative importance to the system were calculated using the eigenvalue method. For identification and registration, and for evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strength factors could not sustain the required efficiency of the system, and its weaknesses predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency. Defensive strategies appear feasible, such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and developing institutions to support recording.
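The eigenvalue method used in AHP derives priority scores as the normalised principal eigenvector of a reciprocal pairwise-comparison matrix. A minimal sketch (the 3x3 judgement values are invented for illustration, not the study's data):

```python
import numpy as np

# Reciprocal pairwise-comparison matrix: A[i, j] says how much more important
# factor i is than factor j (Saaty-style 1-9 judgements, invented here).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))           # principal (largest) eigenvalue
w = np.abs(eigvecs[:, k].real)
priorities = w / w.sum()                   # relative-importance scores, sum to 1

# Saaty's consistency index: CI = (lambda_max - n) / (n - 1); small CI means
# the pairwise judgements are close to mutually consistent.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print(np.round(priorities, 3), round(float(ci), 4))
```

Each SWOT group's factors would get one such matrix; the resulting priority vectors are the scores the abstract compares across groups.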

  17. Certification-Based Process Analysis (United States)

    Knight, Russell L.


    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  18. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Frank Kurth


    Full Text Available One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music where the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis, which makes it possible to identify musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical features to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.

  19. [Application Status of Evaluation Methodology of Electronic Medical Record: Evaluation of Bibliometric Analysis]. (United States)

    Lin, Dan; Liu, Jialin; Zhang, Rui; Li, Yong; Huang, Tingting


    In order to provide a reference and theoretical guidance for the evaluation of electronic medical records (EMRs) and the establishment of an evaluation system in China, we applied a bibliometric analysis to assess the methodologies used at home and abroad, and to summarize their advantages and disadvantages. We systematically searched the international medical databases Ovid-MEDLINE, EBSCOhost, EI, EMBASE, PubMed and IEEE, and China's medical databases CBM and CNKI, between Jan. 1997 and Dec. 2012. We also reviewed the reference lists of articles for relevant articles. We selected qualified papers according to pre-established inclusion and exclusion criteria, and extracted and analysed information from them. Eventually, 1,736 papers were obtained from the online databases and another 16 articles from manual retrieval. Thirty-five articles met the inclusion and exclusion criteria and were retrieved and assessed. In the evaluation of EMRs, the US led with 54.28%, while Canada and Japan tied for second with 8.58% each. As for the evaluation methodologies applied, the Information System Success Model, the Technology Acceptance Model (TAM), the Innovation Diffusion Model and the Cost-Benefit Access Model were widely used, at 25%, 20%, 12.5% and 10%, respectively. In this paper, we summarize our study of the application of EMR evaluation methodologies, which can provide a reference for EMR evaluation in China.

  20. Fragmented implementation of maternal and child health home-based records in Vietnam: need for integration

    Directory of Open Access Journals (Sweden)

    Hirotsugu Aiga


    Full Text Available Background: Home-based records (HBRs) are globally implemented as effective tools that encourage pregnant women and mothers to timely and adequately utilise maternal and child health (MCH) services. While the availability and utilisation of nationally representative HBRs have been assessed in several earlier studies, the reality of a number of HBRs subnationally implemented in a less coordinated manner has been neither reported nor analysed. Objectives: This study is aimed at estimating the prevalence of HBRs for MCH and the level of fragmentation of and overlapping between different HBRs for MCH in Vietnam. The study further attempts to identify health workers’ and mothers’ perceptions towards HBR operations and utilisations. Design: A self-administered questionnaire was sent to the provincial health departments of 28 selected provinces. A copy of each HBR available was collected from them. A total of 20 semi-structured interviews with health workers and mothers were conducted at rural communities in four of the 28 selected provinces. Results: Whereas HBRs developed exclusively for maternal health and exclusively for child health were available in four provinces (14%) and in 28 provinces (100%), respectively, those for both maternal health and child health were available in nine provinces (32%). The mean number of HBRs in the 28 provinces (5.75) indicates over-availability of HBRs. All 119 minimum required recording items found in three different HBRs under nationwide scale-up were also included in the Maternal and Child Health Handbook being piloted for nationwide scaling-up. Implementation of multiple HBRs is likely to confuse not only health workers, by requiring them to record the same data in several HBRs, but also mothers, about which HBR they should refer to and rely on at home. Conclusions: To enable both health workers and pregnant women to focus on only one type of HBR, province-specific HBRs for maternal and/or child health need to be

  1. Digital Audio Legal Recorder (United States)

    Department of Transportation — The Digital Audio Legal Recorder (DALR) provides the legal recording capability between air traffic controllers, pilots and ground-based air traffic control TRACONs...

  2. Least-squares self-coherency analysis of superconducting gravimeter records in search for the Slichter triplet (United States)

    Pagiatakis, Spiros D.; Yin, Hui; El-Gelil, Mahmoud Abd


    We develop a new approach for the spectral analysis of superconducting gravimeter data to search for the spheroidal oscillation 1S1 of the Earth's solid inner core. The new method, which we call least-squares (LS) self-coherency analysis, is based on the product of the least-squares spectra of segments of the time series under consideration. The statistical foundation of this method is presented in the new least-squares product spectrum theorem, which rigorously establishes confidence levels for detecting significant peaks. We apply this approach, along with a number of other innovative ideas, to a 6-year-long gravity series collected at the Canadian Superconducting Gravimeter Installation (CSGI) in Cantley, Canada, by splitting it into 72 statistically independent monthly records. Each monthly record is analysed spectrally and all monthly LS spectra are multiplied to construct the self-coherency spectrum of the 6-year gravity series. The self-coherency spectrum is then used to detect significant peaks in the band 3-7 h at various significance levels, with the aim of identifying a triplet of periods associated with the rotational/ellipsoidal splitting of 1S1 (the Slichter triplet). Of all the Slichter periods predicted by various researchers so far, Smylie's triplet appears to be the most supported one, albeit very weakly, both before and after the atmospheric pressure effect is removed from the series. Using the viscous splitting law [Smylie, D.E., 1992. The inner core translational triplet and the density near Earth's center. Science 255, 1678-1682] as a guide, we can also see one interesting and statistically significant triplet with periods A = {4.261 h, 4.516 h, 4.872 h}, which changes slightly to A' = {4.269 h, 4.516 h, 4.889 h} after the atmospheric pressure correction is applied to the gravity series.
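The product-of-segment-spectra idea can be sketched as follows: a weak oscillation that is coherent across independent segments survives the multiplication, while incoherent noise peaks are suppressed. This is a simplified stand-in for the paper's LS self-coherency analysis; the 5 h test period, segment layout, and noise level are illustrative assumptions:

```python
import numpy as np

def ls_spectrum(t, y, freqs):
    """Least-squares spectrum: variance fraction explained at each trial frequency."""
    y = y - y.mean()
    spec = []
    for f in freqs:
        # Fit a cosine/sine pair at frequency f by linear least squares.
        A = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        spec.append(np.sum((A @ coef) ** 2) / np.sum(y ** 2))
    return np.array(spec)

rng = np.random.default_rng(3)
dt, true_period = 1 / 60.0, 5.0            # time in hours: 1-min sampling, 5 h signal
t = np.arange(0, 720, dt)                  # a 30-day record
periods = np.linspace(3, 7, 81)            # search band, 3-7 h
freqs = 1.0 / periods

# Split into independent segments, compute each LS spectrum, multiply them.
product = np.ones_like(freqs)
for seg in np.array_split(np.arange(t.size), 6):
    y = 0.1 * np.sin(2 * np.pi * t[seg] / true_period) \
        + rng.normal(0.0, 1.0, seg.size)   # weak signal buried in noise
    product *= ls_spectrum(t[seg], y, freqs)

best_period = float(periods[np.argmax(product)])
print(round(best_period, 2))               # should peak near the injected 5 h
```

The paper's theorem supplies the confidence levels for peaks in such a product spectrum; this sketch only demonstrates the detection mechanism.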

  3. The Earth's palaeorotation, postglacial rebound and lower mantle viscosity from analysis of ancient Chinese eclipse records (United States)

    Pang, Kevin D.; Yau, Kevin; Chou, Hung-Hsiang


    Of the forces changing the Earth's rotation, tidal braking and postglacial rebound predominate at timescales ≥ 10² yr (Hide and Dickey, 1991; Dickey, 1992). Analysis of ancient eclipse records has given values for the clock error ΔT and the time derivative of the Earth's dynamic oblateness, dJ₂/dt, for the past 3,300 yr. Since ΔT = AT - UT = ct², where AT is Atomic (cesium clock) Time, UT is Universal (Earth rotation) Time, and t is the number of centuries before 1800, the oldest data have the most weight. Sunrise and sunset eclipses are especially valuable, as they can be retrospectively timed. The Bamboo Annals, entombed in 299 B.C. and unearthed in A.D. 281, states that “in the first year of King Yi of the Western Zhou dynasty the day dawned twice at Zheng (34.5°N, 109.8°E)”. Kaiyuan zhanjing (Siddhartha, A.D. 724) cites this passage and adds that “in the 2nd (actually 12th) year of the Sheng Ping reign period of King Shang (actually King Xi) the day began twice at Zheng”. Matching these records with the April 21, 899 B.C. and April 4, A.D. 368 sunrise eclipses (Oppolzer eclipse Nos. 732 and 3747) gave ΔT values of 5.8 ± 0.15 and 1.7 ± 0.1 hr, respectively. The recurrence of a central solar eclipse at the same site under almost identical circumstances accurately links an ancient ΔT value with a more precise medieval one, and makes the statistics of such early data more robust. The brightness changes for the magnitude 0.95-0.97 and 0.991-0.998 eclipses were greater than those for the January 4, 1992 “double sunset” over Southern California, U.S.A. (magnitude 0.91-0.92). David H. Levy noted that “... as annularity ended. Sunset had come and gone, but the sky began to brighten not darken. For almost 15 minutes it continued to brighten until the onrushing shadow of Earth took over and darkness fell again ...” (Sky Telesc. 83, 694). We have analyzed even earlier records from the Shang dynasty. Six solar eclipse records have been identified among
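The parabolic clock-error model ΔT = ct² quoted in the abstract can be checked numerically: calibrating c on the 899 B.C. sunrise eclipse (ΔT = 5.8 h) and predicting the A.D. 368 value is our own arithmetic, not the authors' fit:

```python
# Delta_T = c * t^2, with t in centuries before 1800.
def centuries_before_1800(year):
    # Astronomical year numbering: 899 B.C. is year -898.
    return (1800 - year) / 100.0

t_old = centuries_before_1800(-898)       # about 27 centuries
c = 5.8 / t_old**2                        # hours per century^2, from Delta_T = 5.8 h

t_med = centuries_before_1800(368)        # about 14.3 centuries
predicted = c * t_med**2                  # predicted Delta_T for A.D. 368

# c in the more familiar s/cy^2, and the prediction vs. the reported 1.7 h.
print(round(c * 3600, 1), round(predicted, 2))
```

The single-eclipse calibration lands within the 1.7 ± 0.1 h value reported for A.D. 368 to within about 0.1 h, illustrating why the oldest eclipses, with the largest t, carry the most weight in fitting c.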

  4. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K


    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies.
Conclusion The new toolbox presented here implements fast
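To make the limited-sampling bias concrete: the simplest illustration (ours, not the toolbox's code, which implements far more sophisticated estimators) is a plug-in mutual information estimate for discrete responses with a first-order Miller-Madow-style correction:

```python
# Plug-in mutual information with a simple first-order bias correction.
# Illustrative sketch only; the toolbox described above implements several
# more advanced bias-correction techniques.
from collections import Counter
from math import log, log2

def mutual_information(xs, ys, correct_bias=True):
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
             for (x, y), c in pxy.items())
    if correct_bias:
        # Plug-in MI is biased upward by ~ (|X|-1)(|Y|-1) / (2 N ln 2) bits
        mi -= (len(px) - 1) * (len(py) - 1) / (2 * n * log(2))
    return mi

# Perfectly coupled binary signal: carries ~1 bit of information
x = [i % 2 for i in range(1000)]
print(round(mutual_information(x, x), 3))
```

With 1000 samples the correction here is tiny; with the short recordings typical of neurophysiology experiments, the upward bias of the uncorrected estimate becomes the dominant error.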

  5. A 600-ka Arctic sea-ice record from Mendeleev Ridge based on ostracodes (United States)

    Cronin, Thomas M.; Polyak, L.V.; Reed, D.; Kandiano, E. S.; Marzen, R. E.; Council, E. A.


    Arctic paleoceanography and sea-ice history were reconstructed from epipelagic and benthic ostracodes from a sediment core (HLY0503-06JPC, 800 m water depth) located on the Mendeleev Ridge, Western Arctic Ocean. The calcareous microfaunal record (ostracodes and foraminifers) covers several glacial/interglacial cycles back to estimated Marine Isotope Stage 13 (MIS 13, ∼500 ka) with an average sedimentation rate of ∼0.5 cm/ka for most of the stratigraphy (MIS 5–13). Results based on ostracode assemblages and an unusual planktic foraminiferal assemblage in MIS 11 dominated by the temperate-water species Turborotalita egelida show that extreme interglacial warmth, high surface ocean productivity, and possibly open ocean convection characterized MIS 11 and MIS 13 (∼400 and 500 ka, respectively). A major shift in western Arctic Ocean environments toward perennial sea ice occurred after MIS 11 based on the distribution of the ice-dwelling ostracode Acetabulastoma arcticum. Spectral analyses of the ostracode assemblages indicate sea ice and mid-depth ocean circulation in the western Arctic Ocean varied primarily at precessional (∼22 ka) and obliquity (∼40 ka) frequencies.
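The spectral step can be sketched with a plain DFT periodogram recovering a ∼22 ka precessional cycle from a synthetic, evenly sampled proxy series (the study itself used the ostracode assemblage data and would add proper significance testing):

```python
# Pure-Python periodogram detecting a precessional (~22 ka) cycle in a
# synthetic proxy series. Illustrative only; not the study's data or method.
from math import cos, pi
import cmath

dt_ka = 1.0                  # sample spacing: 1 kyr
n = 110                      # 110 samples -> a 22 ka cycle sits exactly on a DFT bin
signal = [cos(2 * pi * t / 22.0) for t in range(n)]

def periodogram(x):
    m = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * pi * k * t / m) for t in range(m))) ** 2 / m
            for k in range(1, m // 2)]

power = periodogram(signal)
k_peak = power.index(max(power)) + 1     # +1 because the spectrum starts at k=1
period_ka = n * dt_ka / k_peak
print(period_ka)
```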

  6. EHR query language (EQL)--a query language for archetype-based health records. (United States)

    Ma, Chunlan; Frankel, Heath; Beale, Thomas; Heard, Sam


    OpenEHR specifications have been developed to standardise the representation of an international electronic health record (EHR). The language used for querying EHR data is not as yet part of the specification. To fill in this gap, Ocean Informatics has developed a query language currently known as EHR Query Language (EQL), a declarative language supporting queries on EHR data. EQL is neutral to EHR systems, programming languages and system environments and depends only on the openEHR archetype model and semantics. Thus, in principle, EQL can be used in any archetype-based computational context. In the EHR context described here, particular queries mention concepts from the openEHR EHR Reference Model (RM). EQL can be used as a common query language for disparate archetype-based applications. With the use of a common RM, archetypes, and a companion query language such as EQL, semantic interoperability of EHR information is much closer. This paper introduces the EQL syntax and provides example clinical queries to illustrate the syntax. Finally, current implementations and future directions are outlined.
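The paper's own example queries are not reproduced in this abstract; the following hypothetical query (in the style of EQL and its successor AQL; the archetype ID is a real openEHR archetype, but the paths and threshold are illustrative, not taken from the paper) shows the general shape of such a statement:

```
SELECT o/data[at0001]/events[at0006]/data[at0003]/items[at0004]/value AS systolic
FROM EHR e
CONTAINS COMPOSITION c
CONTAINS OBSERVATION o[openEHR-EHR-OBSERVATION.blood_pressure.v1]
WHERE o/data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/magnitude >= 140
```

The FROM/CONTAINS clauses navigate the RM containment hierarchy, while the archetype ID and `at`-coded paths bind the query to clinical content models rather than to any one system's database schema.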

  7. Image Processing Based Girth Monitoring and Recording System for Rubber Plantations

    Directory of Open Access Journals (Sweden)

    Chathura Thilakarathne


    Full Text Available Measuring the girth and continuous monitoring of the increase in girth is one of the most important processes in rubber plantations, since identification of girth deficiencies would enable planters to take corrective actions to ensure a good yield from the plantation. This research paper presents an image processing based girth measurement & recording system that can replace the existing manual process in an efficient and economical manner. The system uses a digital image of the tree, in which the number drawn on the tree identifies the tree and a black strip drawn on it indicates its width. The image is thresholded first & then filtered using several filtering criteria to identify possible candidates for numbers. Identified blobs are then fed to the Tesseract OCR for number recognition. The thresholded image is then filtered again with different criteria to segment out the black strip drawn on the tree, which is then used to calculate the width of the tree using calibration parameters. Once the tree number is identified & the width is calculated, the measured girth of the tree is stored in the database under the identified tree number. The results obtained from the system indicated significant improvement in efficiency & economy for main plantations. As future developments we are proposing a standard commercial system for girth measurement using standardized 2D Bar Codes as tree identifiers.
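The final calculation and recording step can be sketched as follows. The parameter names, the calibration value, and the circular-trunk assumption are ours for illustration; the paper does not publish its calibration details:

```python
# Convert the segmented strip's pixel width into a girth estimate and file it
# under the OCR-recognized tree number. Hypothetical numbers throughout.
from math import pi

def girth_cm(strip_width_px, mm_per_px):
    """Diameter from the strip's pixel width; girth assumes a circular trunk."""
    diameter_mm = strip_width_px * mm_per_px
    return pi * diameter_mm / 10.0      # circumference, in cm

records = {}                            # tree number -> measured girth (cm)
tree_no = 117                           # as read by the OCR stage (hypothetical)
records[tree_no] = round(girth_cm(strip_width_px=240, mm_per_px=0.65), 1)
print(records)
```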

  8. TECHNICAL NOTE: The development of a PZT-based microdrive for neural signal recording (United States)

    Park, Sangkyu; Yoon, Euisung; Lee, Sukchan; Shin, Hee-sup; Park, Hyunjun; Kim, Byungkyu; Kim, Daesoo; Park, Jongoh; Park, Sukho


    A hand-controlled microdrive has been used to obtain neural signals from rodents such as rats and mice. However, it places severe physical stress on the rodents during its manipulation, and this stress leads to alertness in the mice and low efficiency in obtaining neural signals from the mice. To overcome this issue, we developed a novel microdrive, which allows one to adjust the electrodes by a piezoelectric device (PZT) with high precision. Its mass is light enough to install on the mouse's head. The proposed microdrive has three H-type PZT actuators and their guiding structure. The operation principle of the microdrive is based on the well known inchworm mechanism. When the three PZT actuators are synchronized, linear motion of the electrode is produced along the guiding structure. The electrodes used for the recording of the neural signals from neuron cells were fixed at one of the PZT actuators. Our proposed microdrive has an accuracy of about 400 nm and a long stroke of about 5 mm. In response to formalin-induced pain, single unit activities are robustly measured at the thalamus with electrodes whose vertical depth is adjusted by the microdrive under urethane anesthesia. In addition, the microdrive was efficient in detecting neural signals from mice that were moving freely. Thus, the present study suggests that the PZT-based microdrive could be an alternative for the efficient detection of neural signals from mice during behavioral states without any stress to the mice.
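The inchworm principle named above can be reduced to a short sketch (our own abstraction, not the authors' control firmware): each synchronized actuation cycle advances the electrode by one small, repeatable step, so a sub-micron step size and a millimetre-scale stroke are not in conflict:

```python
# Inchworm-drive abstraction: each cycle (clamp A -> extend -> clamp B ->
# release A -> contract -> re-clamp A) nets one fixed step along the guide.
# Step size of ~400 nm and 5 mm stroke are the figures quoted in the abstract.
def inchworm(cycles, step_nm=400):
    position_nm = 0
    for _ in range(cycles):
        position_nm += step_nm          # net displacement of one full cycle
    return position_nm

stroke_nm = 5_000_000                   # 5 mm stroke
cycles_for_full_stroke = stroke_nm // 400
print(cycles_for_full_stroke, inchworm(10))
```

The point of the mechanism is that accuracy is set by the per-cycle step (hundreds of nanometres), while range is set only by the guide length, since the clamping actuators re-grip at every cycle.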

  9. Flood frequency analysis and discussion of non-stationarity of the Lower Rhine flooding regime (AD 1350-2011): Using discharge data, water level measurements, and historical records (United States)

    Toonen, W. H. J.


    Accurate estimates of the recurrence time of extreme floods are essential to assess flood safety in flood-prone regions, such as the Lower Rhine in The Netherlands. Measured discharge records have a limited length and, in general, represent extremes poorly, which results in considerable uncertainties when used for flood frequency analysis. In this paper, it is shown how alternative discharge monitoring stations along the Rhine, measurements of water levels, and historical records can be used to increase data availability. Although pre-processing and the conversion of data types into discharge estimates introduce extra uncertainty, the added value of this data in flood frequency analysis is considerable, because extending record length by including slightly less-precise data results in much better constrained estimates of the discharges and recurrence intervals of extreme events. Based on results obtained with the Generalised Extreme Value (GEV) distribution, it was concluded that large floods of the last century are presumably rarer than previously considered using shorter data series. Moreover, the combined effect of climatic and anthropogenic-induced non-stationarities of the flooding regime is more easily recognised in extended records. It is shown that non-stationarities have a significant effect on the outcomes of flood frequency analysis using both short and long input data series. Effects on outcomes of dominant multi-decadal variability are, however, largely subdued in the longer 240-year series.
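The flood-frequency step can be illustrated with the shape-zero special case of the GEV (the Gumbel distribution), fitted by the method of moments to a synthetic annual-maximum series of the same length as the study's 240-year record. All numbers are invented, not Rhine data:

```python
# Gumbel (GEV with shape 0) return-level sketch on synthetic annual maxima.
# Method-of-moments fit; illustrative only.
from math import sqrt, pi, log
from statistics import mean, stdev
import random

random.seed(1)
# synthetic annual maxima (m^3/s) drawn from Gumbel(mu=8000, beta=1500)
annual_max = [8000 - 1500 * log(-log(random.random())) for _ in range(240)]

beta = stdev(annual_max) * sqrt(6) / pi        # scale, method of moments
mu = mean(annual_max) - 0.5772156649 * beta    # location (Euler-Mascheroni const.)

def return_level(T_years):
    """Discharge exceeded on average once every T years: F(x) = 1 - 1/T."""
    return mu - beta * log(-log(1 - 1 / T_years))

print(round(return_level(100)), round(return_level(1250)))
```

The design-flood question in the abstract is exactly the inverse read-off: how far the 100- or 1250-year return level shifts when shorter or longer input series are used for the fit.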

  10. Analysis of multiple recording methods for full resolution multi-view autostereoscopic 3D display system incorporating VHOE (United States)

    Hwang, Yong Seok; Cho, Kyu Ha; Kim, Eun Soo


    In this paper, we propose a multiple recording process of photopolymer for a full-color multi-view autostereoscopic 3D display system based on VHOE (Volume Holographic Optical Element). To overcome the problems of conventional glasses-free 3D displays, such as low resolution and limited viewing zone, we designed a multiple recording condition of VHOE for multi-view display. It is verified that the VHOE may be optically made by angle-multiplexed recording of pre-designed multiple viewing zones, which are uniformly recorded through an optimized exposure-time scheduling scheme. Here, a VHOE-based backlight system for 4-view stereoscopic display is implemented, in which the output beams from the LGP (light guide plate), which serve as reference beams, may be sequentially synchronized with the respective stereo images displayed on the LCD panel.

  11. Maturity Matrices for Quality of Model- and Observation-Based Climate Data Records (United States)

    Höck, Heinke; Kaiser-Weiss, Andrea; Kaspar, Frank; Stockhause, Martina; Toussaint, Frank; Lautenschlager, Michael


    In the field of Software Engineering the Capability Maturity Model is used to evaluate and improve software development processes. The application of a Maturity Matrix is a method to assess the degree of software maturity. This method was adapted to the maturity of Earth System data in scientific archives. The application of such an approach to Climate Data Records was first proposed in the context of satellite-based climate products and applied by NOAA and NASA. The European FP7 project CORE-CLIMAX suggested and tested extensions of the approach in order to allow the applicability to additional climate datasets, e.g. based on in-situ observations as well as model-based reanalysis. Within that project the concept was applied to products of satellite- and in-situ based datasets. Examples are national ground-based data from Germany as an example for typical products of a national meteorological service, the EUMETSAT Satellite Application Facility Network, the ESA Climate Change Initiative, European Reanalysis activities (ERA-CLIM) and international in situ-based climatologies such as GPCC, ECA&D, BSRN, HadSST. Climate models and their related output have some additional characteristics that need specific consideration in such an approach. Here we use examples from the World Data Centre for Climate (WDCC) to discuss the applicability. The WDCC focuses on climate data products, specifically those resulting from climate simulations. Based on these already existing Maturity Matrix models, WDCC developed a generic Quality Assessment System for Earth System data. A self-assessment is performed using a maturity matrix evaluating the data quality for five maturity levels with respect to the criteria data and metadata consistency, completeness, accessibility and accuracy. The classical goals of a quality assessment system in a data processing workflow are: (1) to encourage data creators to improve quality to reach the next quality level, (2) enable data consumers to decide
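A toy version of such a self-assessment (criteria names follow the abstract; the scoring rule and numbers are invented for illustration) shows the mechanics of a maturity matrix:

```python
# Maturity-matrix self-assessment sketch: each criterion is scored on levels
# 1-5; the dataset's overall maturity is capped by its weakest criterion.
# Criteria follow the abstract; scores are hypothetical.
dataset_scores = {
    "consistency":   4,
    "completeness":  3,
    "accessibility": 5,
    "accuracy":      4,
}

def overall_maturity(scores):
    """A data record is only as mature as its weakest assessed criterion."""
    return min(scores.values())

print(overall_maturity(dataset_scores))
```

Reporting the per-criterion scores alongside the capped overall level is what gives data creators the stated incentive: the matrix shows exactly which criterion must improve to reach the next quality level.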

  12. Transparency in Transcribing: Making Visible Theoretical Bases Impacting Knowledge Construction from Open-Ended Interview Records


    Skukauskaite, Audra


    This article presents a reflexive analysis of two transcripts of an open-ended interview and argues for transparency in transcribing processes and outcomes. By analyzing ways in which a researcher's theories become consequential in producing and using transcripts of an open-ended interview, this paper makes visible the importance of examining and presenting theoretical bases of transcribing decisions. While scholars across disciplines have argued that transcribing is a theoretically laden pro...

  13. Record of Decision for the First Active Duty F-35A Operational Base (United States)


    7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) U.S. Army Corps of Engineers Geotechnical and Environmental Engineering Branch (CESPK-ED-GI...Management Actions and Mitigations Avoiding or reducing potential environmental impacts was a consideration guiding the analysis of the F-35A basing...with engines that meet U.S. Environmental Protection Agency Tier 3 and 4 non-road standards. Using alternatively-fueled construction equipment, such as

  14. Record of Decision for the First Air National Guard F-35A Operational Base (United States)


    7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) U.S. Army Corps of Engineers Geotechnical and Environmental Engineering Branch (CESPK-ED-GI...Mitigations Avoiding or reducing potential environmental impacts was a consideration guiding the analysis of the F-35A basing alternatives. Some equipment with engines that meet U.S. Environmental Protection Agency Tier 3 and 4 non-road standards. - Using alternatively-fueled

  15. LabTrove: a lightweight, web based, laboratory "blog" as a route towards a marked up record of work in a bioscience research laboratory.

    Directory of Open Access Journals (Sweden)

    Andrew J Milsted

    Full Text Available BACKGROUND: The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. METHODOLOGY/PRINCIPAL FINDINGS: We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. CONCLUSIONS/SIGNIFICANCE: LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for

  16. 77 FR 5781 - Record of Decision for the Air Space Training Initiative Shaw Air Force Base, South Carolina... (United States)


    ... Department of the Air Force Record of Decision for the Air Space Training Initiative Shaw Air Force Base... Decision (ROD). SUMMARY: On December 9, 2011, the United States Air Force signed the ROD for the Airspace Training Initiative Shaw Air Force Base, South Carolina Final Environmental Impact Statement (EIS). The...

  17. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka


    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phase, duration of the double support phase, the magnitudes of reaction forces in extremes of measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. 
The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
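One of the 78 variables, a left/right symmetry index, can be sketched in a few lines. The formula choice here is a common one in the gait literature, assumed for illustration; the paper's exact definition may differ:

```python
# Symmetry index on a force extreme (e.g. peak vertical GRF, in body weights).
# Formula is a standard choice, not necessarily the paper's exact definition.
def symmetry_index(left, right):
    """Percent asymmetry; 0 means perfect left/right symmetry."""
    return 100.0 * abs(left - right) / (0.5 * (left + right))

fz_max_left, fz_max_right = 1.12, 1.05   # hypothetical peak vertical GRF values
print(round(symmetry_index(fz_max_left, fz_max_right), 1))
```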

  18. How to limit the burden of data collection for Quality Indicators based on medical records? The COMPAQH experience

    Directory of Open Access Journals (Sweden)

    Grenier Catherine


    Full Text Available Abstract Background Our objective was to limit the burden of data collection for Quality Indicators (QIs) based on medical records. Methods The study was supervised by the COMPAQH project. Four QIs based on medical records were tested: medical record conformity; traceability of pain assessment; screening for nutritional disorders; time elapsed before sending a copy of the discharge letter to the general practitioner. Data were collected by 6 Clinical Research Assistants (CRAs) in a panel of 36 volunteer hospitals and analyzed by COMPAQH. To limit the burden of data collection, we used the same sample of medical records for all 4 QIs, limited sample size to 80 medical records, and built a composite score of only 10 items to assess medical record completeness. We assessed QI feasibility by completing a grid of 19 potential problems and evaluating time spent. We assessed reliability (κ coefficient) as well as internal consistency (Cronbach α coefficient) in an inter-observer study, and discriminatory power by analysing QI variability among hospitals. Results Overall, 23 115 data items were collected for the 4 QIs and analyzed. The average time spent on data collection was 8.5 days per hospital. The most common feasibility problem was misunderstanding of the item by hospital staff. QI reliability was good (κ: 0.59–0.97 according to QI). The hospitals differed widely in their ability to meet the quality criteria (mean value: 19–85%). Conclusion These 4 QIs based on medical records can be used to compare the quality of record keeping among hospitals while limiting the burden of data collection, and can therefore be used for benchmarking purposes. The French National Health Directorate has included them in the new 2009 version of the accreditation procedure for healthcare organizations.
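The two statistics reported in the inter-observer study can be computed in a few lines of pure Python (ratings below are invented, not COMPAQH data):

```python
# Cohen's kappa (two raters, nominal data) and Cronbach's alpha (internal
# consistency). Illustrative data only.
from statistics import pvariance

def cohens_kappa(r1, r2):
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n                 # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

def cronbach_alpha(items):
    """items: list of per-item score lists (same respondents in each)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(i) for i in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

rater1 = [1, 1, 0, 1, 0, 1, 1, 0]   # e.g. "pain assessment traceable: yes/no"
rater2 = [1, 1, 0, 1, 0, 1, 0, 0]
print(round(cohens_kappa(rater1, rater2), 2))
```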

  19. [The influence of Donguibogam during the middle Joseon era based on clinical records on low back pain in Seungjeongwon ilgi]. (United States)

    Jung, Jae Young; Lee, Jun Hwan; Chung, Seok Hee


    The recently increasing interest in historical records has led to more research on historical records in various fields of study. This trend has also affected medical research, with the medical climate and popular treatment modalities of the past now being revealed based on historical records. However, most research on medical history during the Joseon era has been based on the most well-known record, Joseon wangjo sillok or Annals of the Joseon Dynasty. Joseon wangjo sillok is a comprehensive and organized record of society during the Joseon era and contains key knowledge about medical history during the period, but it lacks details on the treatment of common disorders at the time. Seungjeongwon ilgi or Diary of the Royal Secretariat has detailed records of daily events and is a valuable resource for the daily activities of the era. And in the middle Joseon era, a variety of medical books - especially Donguibogam - were published. Therefore, the authors focused on the under-researched Seungjeongwon ilgi and Donguibogam and attempted to assess and evaluate low back pain treatment performed on Joseon royalty. The most notable characteristic of low back treatment records within the Seungjeongwon ilgi is that diagnosis and treatment were made based on an independent Korean medicine, rather than conventional Chinese medicine. This paradigm shift is represented in Donguibogam, and can be seen in the close relationship between Donguibogam and national medical exams of the day. Along with the pragmatism of the middle Joseon era, medical treatment also put more focus on pragmatic treatment methods, and records show emphasis on acupuncture and moxibustion and other points in accord with this. The authors also observed the meaning and limitations of low back pain treatment during that era through comparison with current diagnosis and treatment.

  20. An ecometric analysis of the fossil mammal record of the Turkana Basin (United States)

    Žliobaitė, Indrė; Kaya, Ferhat; Bibi, Faysal; Bobe, René; Leakey, Louise; Leakey, Meave; Patterson, David; Rannikko, Janina; Werdelin, Lars


    Although ecometric methods have been used to analyse fossil mammal faunas and environments of Eurasia and North America, such methods have not yet been applied to the rich fossil mammal record of eastern Africa. Here we report results from analysis of a combined dataset spanning east and west Turkana from Kenya between 7 and 1 million years ago (Ma). We provide temporally and spatially resolved estimates of temperature and precipitation and discuss their relationship to patterns of faunal change, and propose a new hypothesis to explain the lack of a temperature trend. We suggest that the regionally arid Turkana Basin may have acted between 4 and 2 Ma as a ‘species factory’, generating ecological adaptations in advance of the global trend. We show a persistent difference between the eastern and western sides of the Turkana Basin and suggest that the wetlands of the shallow eastern side could have provided additional humidity to the terrestrial ecosystems. Pending further research, a transient episode of faunal change centred at the time of the KBS Member (1.87–1.53 Ma), may be equally plausibly attributed to climate change or to a top-down ecological cascade initiated by the entry of technologically sophisticated humans. This article is part of the themed issue ‘Major transitions in human evolution’. PMID:27298463

  1. Measuring the amplitude characteristic of an image recorder based on a CCD matrix

    NARCIS (Netherlands)

    Zhurovich, KA; Kirillov, VP; Mikhailov, YA; Sklizkov, GV; Starodub, AN; Sudakov, OA


    A method for studying the amplitude characteristic of an image recorder designed on the basis of a charge-coupled device (CCD) matrix is described. The recorder input signal is the intensity distribution of monochromatic light formed upon Fraunhofer diffraction of the light by two identical slits.

  2. A tutorial on activity-based costing of electronic health records. (United States)

    Federowicz, Marie H; Grossman, Mila N; Hayes, Bryant J; Riggs, Joseph


    As the American Recovery and Reinvestment Act of 2009 allocates $19 billion to health information technology, it will be useful for health care managers to project the true cost of implementing an electronic health record (EHR). This study presents a step-by-step guide for using activity-based costing (ABC) to estimate the cost of an EHR. ABC is a cost accounting method with a "top-down" approach for estimating the cost of a project or service within an organization. The total cost to implement an EHR includes obvious costs, such as licensing fees, and hidden costs, such as impact on productivity. Unlike other methods, ABC includes all of the organization's expenditures and is less likely to miss hidden costs. Although ABC is used considerably in manufacturing and other industries, it is a relatively new phenomenon in health care. ABC is a comprehensive approach that the health care field can use to analyze the cost-effectiveness of implementing EHRs. In this article, ABC is applied to a health clinic that recently implemented an EHR, and the clinic is found to be more productive after EHR implementation. This methodology can help health care administrators assess the impact of a stimulus investment on organizational performance.
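The core ABC mechanic described above can be sketched as tracing cost pools to the EHR project via activity drivers, which is how hidden costs such as lost productivity get counted. All figures and activity names are invented, not from the article:

```python
# Activity-based costing sketch: each activity's cost pool is allocated to
# the EHR project in proportion to the driver units it consumed.
# Hypothetical figures throughout.
activities = {
    # activity: (cost pool in $, driver units used by the EHR project,
    #            total driver units organization-wide)
    "software licensing": (120_000, 1, 1),          # wholly attributable
    "IT support hours":   (400_000, 1_500, 8_000),
    "clinician training": (250_000, 900, 2_000),
    "lost productivity":  (600_000, 700, 10_000),   # the easy-to-miss cost
}

def abc_cost(acts):
    return sum(pool * used / total for pool, used, total in acts.values())

print(round(abc_cost(activities)))
```

A licensing-fees-only estimate would report $120,000; the ABC allocation nearly triples that by surfacing the indirect activities.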

  3. Archetype-based knowledge management for semantic interoperability of electronic health records. (United States)

    Garde, Sebastian; Chen, Rong; Leslie, Heather; Beale, Thomas; McNicoll, Ian; Heard, Sam


    Formal modeling of clinical content that can be made available internationally is one of the most promising pathways to semantic interoperability of health information. Drawing on the extensive experience from openEHR archetype research and implementation work, we present the latest research and development in this area to improve semantic interoperability of Electronic Health Records (EHRs) using openEHR (ISO 13606) archetypes. Archetypes as the formal definition of clinical content need to be of high technical and clinical quality. We will start with a brief introduction of the openEHR architecture followed by presentations on specific topics related to the management of a wide range of clinical knowledge artefacts. We will describe a web-based review process for archetypes that enables international involvement and ensures that released archetypes are technically and clinically correct. Tools for validation of archetypes will be presented, along with templates and compliance templates. All this in combination enables the openEHR computing platform to be the foundation for safely sharing the information clinicians need, using this information within computerized clinical guidelines, for decision support as well as migrating legacy data.

  4. Laser Based Color Film Recorder System With GaAs Microlaser (United States)

    Difrancesco, David J.


    In 1984 Pixar's research and development group built and applied to the motion-picture arts at Lucasfilm's ILM facility a three color laser based film scanner/recorder system. The digital film printer is capable of reading and writing 35mm film formats on a variety of film stocks. The system has been used in award-winning special-effects work, and has been operated in a normal production environment since that time. The primary objective was to develop a full color high resolution system, free from scan artifacts, enabling traditionally photographed motion-picture film to be inter-cut with digital raster image photography. Its use is applied to the art of blue-screen traveling-matte cinematography for motion pic-ture special effects. The system was designed using the Pixar Image Computer and conventional gas laser technology as the illumination source. This paper will discuss recent experimental work in the application of GaAs microlaser technology to a digital film printing system of the future.

  5. A Novel Error Correcting System Based on Product Codes for Future Magnetic Recording Channels

    CERN Document Server

    Van, Vo Tam


    We propose a novel construction of product codes for high-density magnetic recording based on binary low-density parity check (LDPC) codes and binary image of Reed-Solomon (RS) codes. Moreover, two novel algorithms are proposed to decode the codes in the presence of both AWGN errors and scattered hard errors (SHEs). Simulation results show that at a bit error rate (bER) of approximately 10^-8, our method allows improving the error performance by approximately 1.9 dB compared with that of a hard decision decoder of RS codes of the same length and code rate. For the mixed error channel including random noises and SHEs, the signal-to-noise ratio (SNR) is set at 5 dB and 150 to 400 SHEs are randomly generated. The bit error performance of the proposed product code shows a significant improvement over that of equivalent random LDPC codes or serial concatenation of LDPC and RS codes.
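The LDPC × RS construction itself is involved, but the product-code principle it builds on can be shown with the classic row/column parity code (our minimal example, far weaker than the proposed code): a single bit error is located by intersecting the failing row check with the failing column check:

```python
# Minimal product code: single parity bit per row and per column of a data
# block. A single bit error flips exactly one row check and one column check,
# and their intersection locates (and corrects) it.
def encode(bits, w):
    rows = [bits[i:i + w] for i in range(0, len(bits), w)]
    row_par = [sum(r) % 2 for r in rows]
    col_par = [sum(c) % 2 for c in zip(*rows)]
    return rows, row_par, col_par

def correct(rows, row_par, col_par):
    bad_r = [i for i, r in enumerate(rows) if sum(r) % 2 != row_par[i]]
    bad_c = [j for j, c in enumerate(zip(*rows)) if sum(c) % 2 != col_par[j]]
    if bad_r and bad_c:                  # single error: flip the intersection
        rows[bad_r[0]][bad_c[0]] ^= 1
    return rows

data = [1, 0, 1, 1, 0, 0, 1, 0, 1]
rows, rp, cp = encode(data, 3)
rows[1][2] ^= 1                          # inject one scattered hard error
fixed = correct(rows, rp, cp)
print(sum(fixed, []) == data)
```

The proposed code replaces the trivial parity checks with LDPC row codes and RS column codes, giving the same row/column cross-checking structure much stronger correction power against mixed AWGN and SHE errors.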

  6. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    Directory of Open Access Journals (Sweden)

    Ivo Martiník


    Full Text Available Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.

  7. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory (United States)

    Martiník, Ivo


    Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet and in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room intended for producing presentation recordings based on rich-media technologies and publishing them online or on demand, with access to all of its elements in automated mode, including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA need not verify the composition of Petri net processes, because all of their algebraic operators preserve the specified set of properties. In this paper, these original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent net theory was applied in the design, verification, and implementation of the programming system that ensures the functionality of the pilot audiovisual lecture room. PMID:26258164

  8. Point-process analysis of neural spiking activity of muscle spindles recorded from thin-film longitudinal intrafascicular electrodes. (United States)

    Citi, Luca; Djilas, Milan; Azevedo-Coste, Christine; Yoshida, Ken; Brown, Emery N; Barbieri, Riccardo


    Recordings from thin-film Longitudinal Intra-Fascicular Electrodes (tfLIFE), together with wavelet-based de-noising and a correlation-based spike-sorting algorithm, give access to firing patterns of muscle spindle afferents. In this study we use a point process probability structure to assess mechanical stimulus-response characteristics of muscle spindle spike trains. We assume that the stimulus intensity is primarily a linear combination of the spontaneous firing rate, the muscle extension, and the stretch velocity. By using the ability of the point process framework to provide an objective goodness-of-fit analysis, we were able to distinguish two classes of spike clusters with different statistical structure. We found that spike clusters with higher SNR have a temporal structure that can be fitted by an inverse Gaussian distribution, while lower-SNR clusters follow a Poisson-like distribution. The point process algorithm is further able to provide the instantaneous intensity function associated with the stimulus-response model with the best goodness of fit. This important result is a first step towards a point process decoding algorithm to estimate the muscle length and possibly provide closed-loop Functional Electrical Stimulation (FES) systems with natural sensory feedback information.
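
The paper's contrast between inverse-Gaussian and Poisson-like (exponential) inter-spike intervals (ISIs) can be illustrated with a small goodness-of-fit sketch. Everything below is synthetic and hypothetical (distribution parameters, sample sizes), and a Kolmogorov-Smirnov statistic stands in for the paper's own goodness-of-fit analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic inter-spike intervals: a "high-SNR" cluster drawn from an
# inverse Gaussian and a "low-SNR" cluster from an exponential (Poisson-like).
isi_hi = stats.invgauss.rvs(mu=0.5, scale=0.02, size=500, random_state=rng)
isi_lo = stats.expon.rvs(scale=0.01, size=500, random_state=rng)

def best_model(isi):
    """Fit both candidate ISI distributions by maximum likelihood and pick
    the one with the smaller Kolmogorov-Smirnov statistic."""
    ks = {}
    for name, dist in [("invgauss", stats.invgauss), ("expon", stats.expon)]:
        params = dist.fit(isi, floc=0)           # location fixed at zero
        ks[name] = stats.kstest(isi, name, args=params).statistic
    return min(ks, key=ks.get), ks

model_hi, ks_hi = best_model(isi_hi)
model_lo, ks_lo = best_model(isi_lo)
```

With real spike trains the same comparison would run per cluster, selecting the renewal model that best describes each one.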


    Directory of Open Access Journals (Sweden)



    Full Text Available Toarcian sections studied mainly in Europe have revealed the incidence of Milankovitch forcing with a well-developed, highly stable 405 kyr component of eccentricity, a short-term eccentricity of ~100 kyr, an obliquity cycle of ~36 kyr, and a precession signal at ~21 kyr. Cyclostratigraphic analysis of the Toarcian succession at the Valdorbia section (Umbria-Marche Apennines) was conducted based on time series of foraminiferal assemblages. Well-developed cyclic patterns were obtained, with several significant cycles corresponding to thicknesses of 3.8-4.1 m, 5.8-6.3 m, 8.2 m, and 10.4 m. Comparison with previous studies at the Valdorbia section led us to interpret the ~4 m cycle as directly related to the short-term eccentricity (95-105 kyr). The remaining cycles could be assigned periodicities of ~140-160 kyr, ~200 kyr, and ~250 kyr, and interpreted as indirect signals of the long-term eccentricity, obliquity, and precession, whose direct record would be impeded by the incompleteness of the studied succession and the sampling interval. The studied components of the foraminiferal assemblage show variable cyclostratigraphic patterns, allowing a differentiation of groups based on similar registered cycles. These groups reveal different responses of the foraminiferal assemblage, associated with particular requirements, to the palaeoenvironmental changes of Milankovitch origin.

  10. Reconstruction of Oceanographic Changes Based on the Diatom Records of the Central Okhotsk Sea over the last 500000 Years

    Directory of Open Access Journals (Sweden)

    Wei-Lung Wang and Liang-Chi Wang


    Full Text Available This study provides insight into changes in sea-ice conditions and the oceanographic environment over the past 500 kyr through analysis of the diatom record. Based on the relative abundance of 13 diatom species in piston core MD012414, four types of environmental conditions in the central Okhotsk Sea over the last 330 ka BP have been distinguished: (1) open ocean alternating with seasonal sea-ice cover in Stages 9, 5, and 1; (2) almost open ocean free of sea-ice cover in Stages 7 and 3; (3) perennial sea-ice cover in Stages 6, 4, and 2; and (4) a warm ice age dominated by open-ocean assemblages in Stage 8. The littoral diatom species Paralia sulcata showed a sudden increase from the glacial period to the interglacial period over the last 330 ka BP, except during Stage 8. This result implies that melting sea ice transported terrigenous materials from the northern Okhotsk Sea continental shelves to the central ocean during deglaciation. From Stage 13 to Stage 10, however, cold and warm marine conditions unexpectedly occurred in the late interglacial periods and the glacial periods, respectively. One possible reason for this is a lack of age-control points from Stage 13 to Stage 10, together with the different sediment accumulation rates between glacial and interglacial periods. This study suggests not only the process by which oceanographic variation of sea ice occurred, but also a new significance for Paralia sulcata as an indicator in the diatom record of the Okhotsk Sea.

  11. Analysis of Readex's Serial Set MARC Records: Improving the Data for the Library Catalog (United States)

    Draper, Daniel; Lederer, Naomi


    Colorado State University Libraries (CSUL) purchased the digitized "United States Congressional Serial Set," 1817-1994 and "American State Papers" (1789-1838) from the Readex Division of NewsBank, Inc. and, once funds and records were available, the accompanying MARC records. The breadth of information found in the "Serial…

  12. [Pressure ulcer care quality indicator: analysis of medical records and incident report]. (United States)

    dos Santos, Cássia Teixeira; Oliveira, Magáli Costa; Pereira, Ana Gabriela da Silva; Suzuki, Lyliam Midori; Lucena, Amália de Fátima


    Cross-sectional study that aimed to compare the data reported in a system of pressure ulcer (PU) care quality indicators with the nursing evolution data available in the patients' medical records, and to describe the clinical profile and nursing diagnoses of those who developed PU grade 2 or higher. The sample consisted of 188 patients at risk for PU in clinical and surgical units. Data were collected retrospectively from medical records and a computerized system of care indicators and statistically analyzed. Of the 188 patients, 19 (10%) had pressure ulcers grade 2 or higher recorded in the nursing evolution notes; however, only 6 (3%) were reported in the indicator system, revealing underreporting of the data. Most patients were women, older adults, and patients with cerebrovascular diseases. The most frequent nursing diagnosis was risk of infection. The use of two or more research methodologies, such as incident reporting data and retrospective review of patients' records, makes the results trustworthy.

  13. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Directory of Open Access Journals (Sweden)

    Catalina Pimiento

    Full Text Available Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the modern composition and function of gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.

  14. When did Carcharocles megalodon become extinct? A new analysis of the fossil record. (United States)

    Pimiento, Catalina; Clements, Christopher F


    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the modern composition and function of gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.
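
The Optimal Linear Estimation approach named above can be sketched as follows. This is a generic implementation of one common formulation of the Roberts-Solow OLE estimator, not the authors' code, and the sighting years are purely illustrative (a dodo-like historical record, not C. megalodon fossil dates):

```python
import numpy as np
from math import lgamma, exp

def ole_extinction(sightings):
    """Optimal Linear Estimation of an extinction date from the most recent
    sighting dates, in the spirit of Roberts & Solow (2003). A sketch of one
    common formulation, not the paper's exact procedure."""
    t = np.sort(np.asarray(sightings, dtype=float))[::-1]  # most recent first
    k = len(t)
    # Shape parameter of the joint Weibull extreme-value model
    v = np.sum(np.log((t[0] - t[k - 1]) / (t[0] - t[1:k - 1]))) / (k - 1)
    # Symmetric matrix of gamma-function ratios (built in log space)
    lam = np.empty((k, k))
    for i in range(1, k + 1):
        for j in range(i, k + 1):
            val = exp(lgamma(2 * v + i) + lgamma(v + j)
                      - lgamma(v + i) - lgamma(j))
            lam[i - 1, j - 1] = lam[j - 1, i - 1] = val
    e = np.ones(k)
    w = np.linalg.solve(lam, e)
    a = w / (e @ w)          # optimal weights, constrained to sum to 1
    return float(a @ t)      # weighted sightings extrapolate beyond t[0]

# Illustrative sighting years (hypothetical, dodo-like record):
years = [1598, 1601, 1602, 1607, 1611, 1628, 1628, 1631, 1638, 1662]
estimate = ole_extinction(years)
```

For fossil ages expressed in Ma before present, the same machinery applies after negating the ages so that "most recent" is the largest value.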

  15. Automatic Indexing for Content Analysis of Whale Recordings and XML Representation

    Directory of Open Access Journals (Sweden)

    Hervé Glotin


    Full Text Available This paper focuses on the robust indexing of sperm whale hydrophone recordings based on a set of features extracted from a real-time passive underwater acoustic tracking algorithm for multiple whales using four hydrophones. Acoustic localization permits the study of whale behavior in deep water without interfering with the environment. Given the position coordinates, we are able to generate different features such as the speed, the energy of the clicks, the Inter-Click-Interval (ICI), and so on. These features allow us to construct markers with which to index and structure the audio files, so that behavior studies are facilitated by choosing and accessing the corresponding index in the audio file. The complete indexing algorithm is run on real data from the NUWC (Naval Undersea Warfare Center) of the US Navy and AUTEC (Atlantic Undersea Test & Evaluation Center, Bahamas). Our model is validated by similar results from the US Navy (NUWC) and SOEST (School of Ocean and Earth Science and Technology, University of Hawaii) labs in a single-whale case. Finally, as an illustration, we index a single whale sound file using the extracted whale features provided by the tracking, and we present an example of an XML script structuring it.

  16. Automatic Indexing for Content Analysis of Whale Recordings and XML Representation

    Directory of Open Access Journals (Sweden)

    Bénard Frédéric


    Full Text Available This paper focuses on the robust indexing of sperm whale hydrophone recordings based on a set of features extracted from a real-time passive underwater acoustic tracking algorithm for multiple whales using four hydrophones. Acoustic localization permits the study of whale behavior in deep water without interfering with the environment. Given the position coordinates, we are able to generate different features such as the speed, the energy of the clicks, the Inter-Click-Interval (ICI), and so on. These features allow us to construct markers with which to index and structure the audio files, so that behavior studies are facilitated by choosing and accessing the corresponding index in the audio file. The complete indexing algorithm is run on real data from the NUWC (Naval Undersea Warfare Center) of the US Navy and AUTEC (Atlantic Undersea Test & Evaluation Center, Bahamas). Our model is validated by similar results from the US Navy (NUWC) and SOEST (School of Ocean and Earth Science and Technology, University of Hawaii) labs in a single-whale case. Finally, as an illustration, we index a single whale sound file using the extracted whale features provided by the tracking, and we present an example of an XML script structuring it.
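
Given click times and localized positions of the kind the tracking algorithm produces, features such as the Inter-Click-Interval (ICI) and speed reduce to simple differences. A minimal sketch with made-up coordinates (the arrays below are hypothetical, not NUWC/AUTEC data):

```python
import numpy as np

# Hypothetical localisation output: time (s) and 3-D position (m) per click.
t = np.array([0.0, 1.1, 2.3, 3.2, 4.4])
pos = np.array([[0.0, 0.0, -400.0],
                [3.0, 1.0, -402.0],
                [7.0, 2.0, -405.0],
                [10.0, 2.0, -406.0],
                [14.0, 3.0, -409.0]])

ici = np.diff(t)                                    # Inter-Click-Interval (s)
step = np.linalg.norm(np.diff(pos, axis=0), axis=1) # distance per click (m)
speed = step / ici                                  # instantaneous speed (m/s)
```

Thresholds on such features (e.g. an ICI jump or a speed change) could then mark index points in the audio file, as the paper describes.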


    Directory of Open Access Journals (Sweden)

    Ruchita Gautam


    Full Text Available The electrocardiogram (ECG) is an important tool for obtaining information about the heart. The main tasks in ECG signal analysis are the detection of the QRS complex (i.e., the R wave) and the estimation of the instantaneous heart rate by measuring the time interval between two consecutive R waves. After recognizing the R wave, other components such as P, Q, S, and T can be detected using a window method. In this paper, we describe a QRS complex detector based on the dyadic wavelet transform (DyWT), which is robust to time-varying QRS complex morphology and to noise. We illustrate the performance of the DyWT-based QRS detector on problematic ECG signals from the Common Standard for Electrocardiography (CSE) database, and compare and analyze its performance against some of the QRS detectors developed in the past.
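
A rough sketch of the idea: compute detail coefficients at one dyadic scale, threshold them, and merge nearby threshold crossings into R-peak locations, from which RR intervals and heart rate follow. The à trous Haar-like detail filter below is a simplified stand-in for the paper's dyadic wavelet transform, and the synthetic ECG, sampling rate, and thresholds are illustrative assumptions:

```python
import numpy as np

fs = 250                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic ECG-like trace: a narrow R wave every 0.8 s, plus baseline
# wander and additive noise (all parameters illustrative).
ecg = np.zeros_like(t)
for beat in np.arange(0.5, 10, 0.8):
    ecg += np.exp(-((t - beat) ** 2) / (2 * 0.004 ** 2))
ecg += 0.2 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * rng.normal(size=t.size)

def dyadic_detail(x, level):
    """Detail coefficients at one dyadic scale: an a-trous, Haar-like
    difference with a lag of 2**(level-1) samples (a simplified stand-in
    for a dyadic wavelet transform stage)."""
    h = 2 ** (level - 1)
    return (x - np.roll(x, h)) / 2.0

d2 = dyadic_detail(ecg, 2)
thresh = 4 * np.std(d2)                    # crude adaptive threshold
cand = np.flatnonzero(np.abs(d2) > thresh)

# Merge candidates within a 200 ms refractory period into single R peaks.
peaks = []
for idx in cand:
    if not peaks or idx - peaks[-1] > 0.2 * fs:
        peaks.append(idx)

rr_s = np.diff(peaks) / fs                 # RR intervals (s)
heart_rate_bpm = 60.0 / rr_s.mean()
```

The fast transient of the R wave dominates the small-scale detail coefficients, while the slow baseline wander is largely cancelled by the differencing, which is what makes wavelet-domain thresholding attractive here.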

  18. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)



    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March 29-30, 2007, Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information.

  19. Continuous Recording and Interobserver Agreement Algorithms Reported in the "Journal of Applied Behavior Analysis" (1995-2005) (United States)

    Mudford, Oliver C.; Taylor, Sarah Ann; Martin, Neil T.


    We reviewed all research articles in 10 recent volumes of the "Journal of Applied Behavior Analysis (JABA)": Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement,…

  20. In vitro and in vivo noise analysis for optical neural recording. (United States)

    Foust, Amanda J; Schei, Jennifer L; Rojas, Manuel J; Rector, David M


    Laser diodes (LDs) are commonly used for optical neural recordings in chronically recorded animals and humans, primarily due to their brightness and small size. However, noise introduced by LDs may counteract the benefits of brightness when compared to low-noise light-emitting diodes (LEDs). To understand noise sources in optical recordings, we systematically compared instrument and physiological noise profiles in two recording paradigms. A better understanding of noise sources can help improve optical recordings and make them more practical with fewer averages. We stimulated lobster nerves and rat cortex, then compared the root mean square (RMS) noise and signal-to-noise ratios (SNRs) of data obtained with LED, superluminescent diode (SLD), and LD illumination for different numbers of averages. The LED data exhibited significantly higher SNRs in fewer averages than LD data in all recordings. In the absence of tissue, LED noise increased linearly with intensity, while LD noise increased sharply in the transition to lasing and settled at noise levels significantly higher than the LED's, suggesting that speckle noise contributed to the LD's higher noise and lower SNRs. Our data thus favor low-coherence, portable light sources for in vivo chronic neural recording applications.
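
The trade-off the authors measure, residual noise after trial averaging, can be sketched numerically. The signal shape, noise amplitudes, and trial counts below are invented for illustration; only the averaging arithmetic (residual noise RMS shrinking roughly as 1/sqrt(N)) reflects the comparison in the abstract:

```python
import numpy as np

rng = np.random.default_rng(42)
fs, n_samples = 1000, 500
t = np.arange(n_samples) / fs
# Assumed evoked optical response: a small Gaussian transient (arbitrary units)
signal = 1e-3 * np.exp(-((t - 0.25) ** 2) / (2 * 0.02 ** 2))

def snr_after_averaging(noise_rms, n_trials):
    """Average n_trials noisy sweeps of the same response and return the
    SNR as signal RMS over residual-noise RMS."""
    trials = signal + rng.normal(0.0, noise_rms, size=(n_trials, n_samples))
    residual = trials.mean(axis=0) - signal
    return float(np.sqrt(np.mean(signal ** 2)) / np.sqrt(np.mean(residual ** 2)))

snr_led = snr_after_averaging(noise_rms=2e-3, n_trials=16)  # LED-like source
snr_ld = snr_after_averaging(noise_rms=8e-3, n_trials=16)   # LD-like (speckle)
```

A source with 4x the noise needs roughly 16x the averages to reach the same SNR, which is the practical argument for low-noise LEDs made above.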

  1. An Open Architecture Scaleable Maintainable Software Defined Commodity Based Data Recorder And Correlator Project (United States)

    National Aeronautics and Space Administration — This project addresses the need for higher data-rate recording capability and the increased correlation speed and flexibility needed for next-generation VLBI systems. The...

  2. Version based spatial record management techniques for spatial database management system

    Institute of Scientific and Technical Information of China (English)

    KIM Ho-seok; KIM Hee-taek; KIM Myung-keun; BAE Hae-young


    Search has traditionally been the principal operation in spatial database management systems, but update operations on spatial data, such as tracking, now occur frequently, so the need to improve concurrency among transactions is growing. In general-purpose database management systems, many techniques have been studied to solve the concurrency problems of transactions; among them, multi-version algorithms minimize interference among transactions. However, applying an existing multi-version algorithm to a spatial database management system wastes storage, because an entire version of a spatial record must be stored even if only the aspatial data of the record has changed. This paper proposes record management techniques that version aspatial data and spatial data separately, reducing the storage wasted on record versions and improving concurrency among transactions.
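
The core idea, versioning the spatial and aspatial parts of a record independently so that an attribute update does not copy the geometry, can be sketched with a pair of version chains. This is a hypothetical in-memory structure for illustration, not the paper's actual storage layout:

```python
from dataclasses import dataclass, field

@dataclass
class VersionChain:
    """Append-only list of (timestamp, value) versions."""
    versions: list = field(default_factory=list)

    def add(self, value, ts):
        self.versions.append((ts, value))

    def read(self, ts):
        """Latest version visible at timestamp ts (None if none yet)."""
        visible = [v for t, v in self.versions if t <= ts]
        return visible[-1] if visible else None

@dataclass
class SpatialRecord:
    """Spatial and aspatial parts are versioned independently."""
    spatial: VersionChain = field(default_factory=VersionChain)
    aspatial: VersionChain = field(default_factory=VersionChain)

rec = SpatialRecord()
rec.spatial.add(((10.0, 20.0),), ts=1)   # geometry written once
rec.aspatial.add({"speed": 30}, ts=1)
rec.aspatial.add({"speed": 45}, ts=2)    # tracking update: no geometry copy
```

A reader at timestamp 2 sees the original geometry with the updated attributes, while a concurrent reader at timestamp 1 still sees the old attribute version, which is the multi-version concurrency benefit with none of the geometry duplication.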

  3. A novel assessment of odor sources using instrumental analysis combined with resident monitoring records for an industrial area in Korea (United States)

    Lee, Hyung-Don; Jeon, Soo-Bin; Choi, Won-Joon; Lee, Sang-Sup; Lee, Min-Ho; Oh, Kwang-Joong


    Residents living near the Sa-sang industrial area (SSIA) have continuously suffered from odor pollution since the 1990s. We determined the concentrations of reduced sulfur compounds (RSCs) [hydrogen sulfide (H2S), methyl mercaptan (CH3SH), dimethyl sulfide (DMS), and dimethyl disulfide (DMDS)], nitrogenous compounds (NCs) [ammonia (NH3) and trimethylamine (TMA)], and carbonyl compounds (CCs) [acetaldehyde and butyraldehyde] by instrumental analysis in the SSIA in Busan, Korea, from June to November 2011. We determined odor intensity (OI) based on the concentrations of the odorants and resident monitoring records (RMR). The mean concentration of H2S was 10 times higher than those of the NCs, CCs, and the other RSCs. The contribution of RSCs to the OI was over 50% at all sites except the A-5 (chemical production) site. In particular, the A-4 (food production) site showed a sum of odor activity values (SOAV) more than 8 times higher than the other sites, suggesting that A-4 was the most malodorous area in the SSIA. From the RMR analysis, the degree of annoyance (OI ≥ 2) was 51.9% in the industrial area. The 'rotten' smell arising from the RSCs showed the highest frequency (25.3%), while 'burned' and 'other' were more frequent than 'rotten' in the residential area. The correlation between the odor indices calculated by instrumental analysis and the OI from the RMR was analyzed. The Pearson correlation coefficient (r) of the SOAV was the highest at 0.720 (P < 0.05), and the coefficients overall fell in a moderately high range (0.465 to 0.720). The results therefore confirm that H2S emitted from the A-4 food-production site causes significant annoyance in the SSIA, and that RMR data can be used effectively to evaluate the characteristics of odorants emitted from the SSIA.
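
The reported agreement between the instrumental odor index and the resident records rests on the Pearson correlation coefficient, which can be computed directly. The paired SOAV/OI values below are made up for illustration; the paper's r = 0.720 comes from its own data:

```python
import numpy as np

# Hypothetical paired observations: instrumental sum of odor activity values
# (SOAV) and odor intensity (OI) from resident monitoring records.
soav = np.array([1.2, 3.4, 2.1, 5.6, 4.3, 0.8, 6.1, 2.9])
oi = np.array([1.0, 2.5, 1.8, 3.4, 2.9, 0.9, 3.8, 2.2])

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

r = pearson_r(soav, oi)
```

The same computation is available as `np.corrcoef(soav, oi)[0, 1]`; writing it out makes the centering-and-normalizing structure of the coefficient explicit.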

  4. Training State and Community Instructors in Use of NHTSA Curriculum Packages: Driver Improvement Analysis, Driver License Examiner-Supervisor and Traffic Record Analysis. (United States)

    Burgener, V. E.; Tiryakioglu, Dona

    A series of five national instructor training institutes was planned for each of the three emerging highway safety technician areas for which curriculum packages have been prepared (Driver Improvement Analysis, Driver License Examiner-Supervisor, and Traffic Record Analysis). Technical Education Research Centers and Dunlap & Associates…

  5. Holographic theory and recording techniques. Citations from the NTIS data base (United States)

    Carrigan, B.


    The topics cited include holographic recording techniques, theory, equipment, and materials. Among the techniques cited are color holography, X-ray holography, high-speed holography, and motion picture holography. Photographic materials, films, emulsions, and equipment for recording and information storage are covered. Techniques for image motion compensation, image deblurring, wave-front reconstruction, and resolution are also cited. This updated bibliography contains 251 abstracts, 17 of which are new entries since the previous edition.

  6. Independent component analysis and decision trees for ECG holter recording de-noising.

    Directory of Open Access Journals (Sweden)

    Jakub Kuzilek

    Full Text Available We have developed a method for ECG signal de-noising using independent component analysis (ICA). The approach combines JADE source separation with a binary decision tree for the identification and subsequent removal of ECG noise. To test the efficiency of this method, it was compared with standard filtering using a wavelet-based de-noising method. Freely available data from the PhysioNet medical data archive were evaluated. The evaluation criterion was the root mean square error (RMSE) between the original ECG and the filtered data contaminated with artificial noise. The proposed algorithm achieved comparable results for standard noise types (power-line interference, baseline wander, EMG), but significantly better results for an uncommon noise type (electrode cable movement artefact).
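
The evaluation criterion is straightforward to reproduce in outline: contaminate a clean signal with artificial noise, de-noise it, and compare RMSE before and after. The toy signal and the moving-average stand-in for the ICA/wavelet de-noising stage below are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 360                                   # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# Toy quasi-periodic "ECG": narrow bursts every ~0.833 s (about 72 bpm)
clean = np.sin(2 * np.pi * 8 * t) * np.exp(-(((t % 0.833) - 0.1) / 0.03) ** 2)
# Artificial contamination: power-line interference plus broadband noise
noisy = clean + 0.3 * np.sin(2 * np.pi * 50 * t) + 0.05 * rng.normal(size=t.size)

def rmse(a, b):
    """Root mean square error between two signals."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# A 9-tap moving average stands in for the de-noising stage under test.
kernel = np.ones(9) / 9
filtered = np.convolve(noisy, kernel, mode="same")

rmse_before = rmse(noisy, clean)
rmse_after = rmse(filtered, clean)
```

Any de-noising candidate (ICA-based, wavelet-based, or otherwise) can be dropped into the `filtered` step and ranked by the same RMSE-against-clean criterion.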

  7. Beyond the Eyes of the Monster: An Analysis of Recent Trends in Assessment and Recording. (United States)

    Ainscow, Mel


    The article analyzes existing practice in assessment and recording in the special needs field and makes such recommendations as assessment with a wider perspective more continuously to aid in making the curriculum responsive to individual needs. (DB)

  8. Analysis of Service Records Management Systems for Rescue and Retention of Cultural Resource Documents (United States)


    ...three service branches as of January 2008, our data capture point. Both the records management systems and the individual record types may have...facility; the U.S. Army Military Academy at West Point, NY, and the National Aeronautics and Space Administration (NASA) maintain their own archive...Criminal Investigative Service HQ, (Code 27D) 716 Sicard Street SE, Suite 2000, Washington Navy Yard, Washington DC 20388-5380, 202-433-9505

  9. Analysis of debris-flow recordings in an instrumented basin: confirmations and new findings

    Directory of Open Access Journals (Sweden)

    M. Arattano


    Full Text Available On 24 August 2006, a debris flow took place in the Moscardo Torrent, a basin of the Eastern Italian Alps instrumented for debris-flow monitoring. The debris flow was recorded by two seismic networks located in the lower part of the basin and on the alluvial fan, respectively. The event was also recorded by a pair of ultrasonic sensors installed on the fan, close to the lower seismic network. The comparison between the different recordings outlines particular features of the August 2006 debris flow, different from those of events recorded in previous years. A typical debris-flow wave was observed at the upper seismic network, with a main front abruptly appearing in the torrent, followed by a gradual decrease of flow height. On the contrary, on the alluvial fan the wave displayed an irregular pattern, with low flow depth and the main peak occurring in the central part of the surge, both in the seismic recording and in the hydrographs. Recorded data and field evidence indicate that the surge observed on the alluvial fan was not a debris flow, and probably consisted of a water surge laden with fine to medium-sized sediment. The change in shape and characteristics of the wave can be ascribed to the attenuation of the surge caused by the torrent control works implemented in the lower basin during recent years.

  10. Analysis of historical meteor and meteor shower records: Korea, China, and Japan

    CERN Document Server

    Yang, Hong-Jin; Park, Changbom; Park, Myeong-Gu


    We have compiled and analyzed historical Korean meteor and meteor shower records in three Korean official history books: Samguksagi, which covers the Three Kingdoms period (57 B.C. -- A.D. 935); Goryeosa of the Goryeo dynasty (A.D. 918 -- 1392); and Joseonwangjosillok of the Joseon dynasty (A.D. 1392 -- 1910). We have found 3861 meteor and 31 meteor shower records. We have confirmed the peaks of the Perseids and an excess due to the mixture of the Orionids, north-Taurids, or Leonids through a Monte-Carlo test. The peaks persist from the period of the Goryeo dynasty to that of the Joseon dynasty, for almost one thousand years. Korean records show a decrease in Perseid activity and an increase in Orionid/north-Taurid/Leonid activity. We have also analyzed the seasonal variation of sporadic meteors from Korean records. We confirm the seasonal variation of sporadic meteors from the records of the Joseon dynasty, with the maximum number of events being roughly 1.7 times the minimum. The Korean records are compared with Chinese and Japanese re...
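
A Monte-Carlo test of the kind described, checking whether a daily peak in meteor counts exceeds what uniform sporadic activity would produce, can be sketched as follows. The record counts, shower date, and cluster size are invented; only the resampling logic mirrors the analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
n_records = 500

# Hypothetical meteor dates (day of year): sporadic (uniform) background plus
# a shower excess centred near day 224, roughly the Perseid maximum.
days = np.concatenate([
    rng.integers(1, 366, size=n_records - 80),
    rng.normal(224, 3, size=80).astype(int),
])
bins = np.arange(1, 367)
obs_counts, _ = np.histogram(days, bins=bins)
obs_peak = obs_counts.max()

# Null model: the same number of records spread uniformly over the year.
n_sim = 2000
sim_peaks = np.empty(n_sim)
for s in range(n_sim):
    sim = rng.integers(1, 366, size=n_records)
    sim_peaks[s] = np.histogram(sim, bins=bins)[0].max()

# Fraction of simulated uniform years whose busiest day matches the observed peak
p_value = float(np.mean(sim_peaks >= obs_peak))
```

A small p-value indicates the observed daily peak is unlikely under purely sporadic activity, which is how a shower excess is confirmed against the historical record.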

  11. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias. DOI: 10.4204/EPTCS.28.8


    Model-based safety analysis approaches aim at finding critical failure combinations by analyzing models of the whole system (i.e., software, hardware, failure modes, and environment). The advantage of these methods over traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to the analysis of specific failure propagation models or limited types of failure modes, or lacking system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model checking now allow this problem to be overcome. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation, and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  12. Multi-level analysis of electronic health record adoption by health care professionals: A study protocol

    Directory of Open Access Journals (Sweden)

    Labrecque Michel


    Full Text Available Background The electronic health record (EHR) is an important application of information and communication technologies in the healthcare sector. EHR implementation is expected to produce benefits for patients, professionals, organisations, and the population as a whole. These benefits cannot be achieved without the adoption of EHR by healthcare professionals. Nevertheless, the influence of individual and organisational factors in determining EHR adoption is still unclear. This study aims to assess the unique contribution of individual and organisational factors to EHR adoption in healthcare settings, as well as possible interrelations between these factors. Methods A prospective study will be conducted. A stratified random sampling method will be used to select 50 healthcare organisations in the Quebec City Health Region (Canada). At the individual level, a sample of 15 to 30 health professionals will be chosen within each organisation, depending on its size. A semi-structured questionnaire will be administered to two key informants in each organisation to collect organisational data. A composite score of EHR adoption will be developed based on a Delphi process and used as the outcome variable. Twelve to eighteen months after the first contact, depending on the pace of EHR implementation, key informants and clinicians will be contacted once again to monitor the evolution of EHR adoption. A multilevel regression model will be applied to identify the organisational and individual determinants of EHR adoption in clinical settings. Alternative analytical models will be applied if necessary. Results The study will assess the contribution of organisational and individual factors, as well as their interactions, to the implementation of EHR in clinical settings. Conclusions These results will be very relevant for decision makers and managers who are facing the challenge of implementing EHR in the healthcare system.

  13. Predictive value of casual ECG-based resting heart rate compared with resting heart rate obtained from Holter recording

    DEFF Research Database (Denmark)

    Carlson, Nicholas; Dixen, Ulrik; Marott, Jacob L


    HRs recorded and mean HR calculated from all daytime HRs. Follow-up was recorded from public registers. Outcome measure was hazard rate for the combined endpoint of cardiovascular mortality, non-fatal heart failure and non-fatal acute myocardial infarction. Comparison of casual RHR, Holter RHR...... rates of 1.02 (p = 0.079) for casual RHR, 1.04 (p = 0.036*) for Holter RHR, and 1.03 (p = 0.093) for mean HR for each 10 beat increment in HR. CONCLUSIONS: In a comparative analysis on the correlation and significance of differing RHR measurement modalities RHR measured by 24-hour Holter recording...... was found to be marginally superior as a predictor of cardiovascular morbidity and mortality. The results presented here do not however warrant the abandonment of a tested epidemiological variable....

  14. Phylogenetic analysis shows that Neolithic slate plaques from the southwestern Iberian Peninsula are not genealogical recording systems.

    Directory of Open Access Journals (Sweden)

    Daniel García Rivero

    Full Text Available Prehistoric material culture proposed to be symbolic in nature has been the object of considerable archaeological work from diverse theoretical perspectives, yet rarely are methodological tools used to test the interpretations. The lack of testing is often justified by invoking the opinion that the slippery nature of past human symbolism cannot easily be tackled by the scientific method. One such case, from the southwestern Iberian Peninsula, involves engraved stone plaques from megalithic funerary monuments dating ca. 3,500-2,750 B.C. (calibrated age). One widely accepted proposal is that the plaques are ancient mnemonic devices that record genealogies. The analysis reported here demonstrates that this is not the case, even when the most supportive data and techniques are used. Rather, we suspect there was a common ideological background to the use of plaques that overlay the southwestern Iberian Peninsula, with little or no geographic patterning. This would entail a cultural system in which plaque design was based on a fundamental core idea, with a number of mutable and variable elements surrounding it.

  15. Phylogenetic analysis shows that Neolithic slate plaques from the southwestern Iberian Peninsula are not genealogical recording systems. (United States)

    García Rivero, Daniel; O'Brien, Michael J


    Prehistoric material culture proposed to be symbolic in nature has been the object of considerable archaeological work from diverse theoretical perspectives, yet rarely are methodological tools used to test the interpretations. The lack of testing is often justified by invoking the opinion that the slippery nature of past human symbolism cannot easily be tackled by the scientific method. One such case, from the southwestern Iberian Peninsula, involves engraved stone plaques from megalithic funerary monuments dating ca. 3,500-2,750 B.C. (calibrated age). One widely accepted proposal is that the plaques are ancient mnemonic devices that record genealogies. The analysis reported here demonstrates that this is not the case, even when the most supportive data and techniques are used. Rather, we suspect there was a common ideological background to the use of plaques that overlay the southwestern Iberian Peninsula, with little or no geographic patterning. This would entail a cultural system in which plaque design was based on a fundamental core idea, with a number of mutable and variable elements surrounding it.

  16. iSpectra: An Open Source Toolbox For The Analysis of Spectral Images Recorded on Scanning Electron Microscopes. (United States)

    Liebske, Christian


    iSpectra is an open source and system-independent toolbox for the analysis of spectral images (SIs) recorded on energy-dispersive spectroscopy (EDS) systems attached to scanning electron microscopes (SEMs). The aim of iSpectra is to assign pixels with similar spectral content to phases, accompanied by cumulative phase spectra with superior counting statistics for quantification. Pixel-to-phase assignment starts with a threshold-based pre-sorting of spectra to create groups of pixels with identical elemental budgets, similar to a method described by van Hoek (2014). Subsequent merging of groups and re-assignments of pixels using elemental or principle component histogram plots enables the user to generate chemically and texturally plausible phase maps. A variety of standard image processing algorithms can be applied to groups of pixels to optimize pixel-to-phase assignments, such as morphology operations to account for overlapping excitation volumes over pixels located at phase boundaries. iSpectra supports batch processing and allows pixel-to-phase assignments to be applied to an unlimited amount of SIs, thus enabling phase mapping of large area samples like petrographic thin sections.

  17. VBORNET gap analysis: Mosquito vector distribution models utilised to identify areas of potential species distribution in areas lacking records.

    Directory of Open Access Journals (Sweden)

    Francis Schaffner


    Full Text Available This is the second of a number of planned data papers presenting modelled vector distributions produced originally during the ECDC-funded VBORNET project. This work continues under the VectorNet project, now jointly funded by ECDC and EFSA. Further data papers will be published after sampling seasons, when more field data become available allowing further species to be modelled or validation and updates to existing models. The data package described here includes those mosquito species first modelled in 2013 & 2014 as part of the VBORNET gap analysis, which aimed to identify areas of potential species distribution in areas lacking records. It comprises three species models together with suitability masks based on land class and environmental limits. The species included as part of this phase are the mosquitoes 'Aedes vexans', 'Anopheles plumbeus' and 'Culex modestus'. The known distributions of these species within the area covered by the project (Europe, the Mediterranean Basin, North Africa, and Eurasia) are currently incomplete to a greater or lesser degree. The models are designed to fill the gaps with predicted distributions, to provide (a) assistance in targeting surveys to collect distribution data for those areas with no field-validated information, and (b) a first indication of the species distributions within the project areas.

  18. Extensions in model-based system analysis


    Graham, Matthew R.


    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  19. A new global geomagnetic model based on archeomagnetic, volcanic and historical records (United States)

    Arneitz, Patrick; Leonhardt, Roman; Fabian, Karl


    The major challenge of geomagnetic field reconstruction lies in the inhomogeneous spatio-temporal distribution of the available data and their highly variable quality. Paleo- and archeomagnetic records provide information about the ancient geomagnetic field beyond the historical period. Typically these data types have larger errors than their historical counterparts, and the investigated materials and applied experimental methods can bias field readings. Input data for the modelling approach were extracted from available collections of archeomagnetic, volcanic and historical records, which were integrated into a single database along with associated meta-data. The iterative Bayesian inversion scheme employed here targets a reliable treatment of errors, which makes it possible to combine the different data types. The proposed model is scrutinized by carrying out tests with artificial records. Records are synthesized using a known field evolution generated by a geodynamo model showing realistic energy characteristics. Using the artificial field, a synthetic data set is generated that exactly mirrors the existing measured records in all meta-data, but provides the data that would have been observed had the artificial field been real. After inversion of the synthetic data, comparison of the known artificial Gauss coefficients with the modelled ones allows verification of the applied modelling strategy as well as examination of the potential and limits of the current data compilation.

  20. Analysis of atrial fibrillatory rate during spontaneous episodes of atrial fibrillation in humans using implantable loop recorder electrocardiogram. (United States)

    Platonov, Pyotr G; Stridh, Martin; de Melis, Mirko; Urban, Lubos; Carlson, Jonas; Corbucci, Giorgio; Holmqvist, Fredrik


    Atrial fibrillatory rate (AFR) can predict outcome of interventions for atrial fibrillation (AF); however, AFR behavior at AF onset in humans is poorly described. We studied AFR during spontaneous AF episodes in patients with lone paroxysmal AF who received implantable loop recorders and had AF episodes of 1 hour or more recorded (n = 4). Mean AFR per minute was assessed from continuous implantable loop recorder electrocardiograms using spatiotemporal QRST cancellation and time-frequency analysis. Atrial fibrillatory rate increased from 290 ± 20 to 326 ± 39 fibrillations per minute during the first 3 hours (P<.05) and then reached a plateau. Atrial fibrillatory rate beyond the initial 3 hours can, therefore, be considered stable and may be evaluated for prediction of intervention effect.
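
As a rough illustration of the dominant-rate estimation described above, the sketch below picks the strongest spectral peak of a (synthetic) QRST-cancelled atrial signal inside an assumed 4-12 Hz atrial band and converts it to fibrillations per minute. The band limits, sampling rate, and test signal are illustrative assumptions, not the paper's exact spatiotemporal method.

```python
import numpy as np

def afr_fibrillations_per_minute(atrial_signal, fs, band=(4.0, 12.0)):
    """Dominant frequency in the assumed AF band, in fibrillations/min."""
    n = len(atrial_signal)
    # Windowed power spectrum of the atrial (QRST-cancelled) signal
    spectrum = np.abs(np.fft.rfft(atrial_signal * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak_hz = freqs[mask][np.argmax(spectrum[mask])]
    return peak_hz * 60.0

# Synthetic 5 Hz "atrial" oscillation -> roughly 300 fibrillations/min
fs = 200.0
t = np.arange(0, 10, 1 / fs)
signal = (np.sin(2 * np.pi * 5.0 * t)
          + 0.1 * np.random.default_rng(0).standard_normal(t.size))
afr = afr_fibrillations_per_minute(signal, fs)
```

In a real pipeline this estimate would be computed per minute of loop-recorder ECG after QRST cancellation, then averaged.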

  1. Transparency in Transcribing: Making Visible Theoretical Bases Impacting Knowledge Construction from Open-Ended Interview Records

    Directory of Open Access Journals (Sweden)

    Audra Skukauskaite


    Full Text Available This article presents a reflexive analysis of two transcripts of an open-ended interview and argues for transparency in transcribing processes and outcomes. By analyzing ways in which a researcher's theories become consequential in producing and using transcripts of an open-ended interview, this paper makes visible the importance of examining and presenting the theoretical bases of transcribing decisions. While scholars across disciplines have argued that transcribing is a theoretically laden process (GREEN, FRANQUIZ & DIXON, 1997; KVALE & BRINKMAN, 2009), few have engaged in reflexive analyses of the data history to demonstrate the consequences particular theoretical and methodological approaches pose in producing knowledge claims and inciting dialogues across traditions. The article demonstrates how theory-method-claim relationships in transcribing influence research transparency and warrantability.

  2. Mobile Application of Water Meter Recorder Based on Short Message Service Transmissions Using Windows Mobile Platform

    Directory of Open Access Journals (Sweden)

    I Dewa Nyoman Anom Manuaba


    Full Text Available The rapid development of technology has had a major impact on cellular technology, leading to a wide range of new smartphones. Modern life requires people to work more quickly, so that they can use their time more effectively and increase their performance. A process done manually takes far more time than one done automatically, and it also carries a higher risk of error. One process that is still done manually is recording the amount of customer water consumption in PDAM (Regional Water Company). This problem can be solved by creating a mobile application that records the water meter and then automatically sends the customer data and the amount of water used directly to the server, where the bill is calculated automatically. This application can solve the problem of recording the water meter.

  3. The computer-based patient record challenges towards timeless and spaceless medical practice. (United States)

    Sicotte, C; Denis, J L; Lehoux, P; Champagne, F


    Although computerization is increasingly advocated as a means for hospitals to enhance quality of care and control costs, few studies have evaluated its impact on the day-to-day organization of medical work. This study investigated a large Computerized Patient Record (CPR) project ($50 million U.S.) aimed at allowing physicians to work in a completely electronic record environment. The present multiple-case study analyzed the implementation of this project in four hospitals. Our results show the intricate complexity of introducing the CPR into medical work. The main problems were profound obstacles to achieving tighter synchronization between the care and information processes. The presence of multiple information systems in one (communication, decision support, and archival record keeping) was overlooked. It introduced several misconceptions in the meaning and codification of clinical information, which was then torn between the richness needed to sustain clinical decisions and the concision needed to sustain care coordination.

  4. A record of Quaternary humidity fluctuations on the NE Tibetan Plateau based on magnetic susceptibility variations in lacustrine sediments of the Qaidam Basin (United States)

    Herb, Christian; Koutsodendris, Andreas; Zhang, Weilin; Appel, Erwin; Pross, Jörg; Fang, Xiaomin


    Magnetic susceptibility (χ) and other magnetic proxies play an important role in paleoclimatic studies as they hold the potential for high-resolution records of past environmental change. Nevertheless, it is necessary to understand the cause of the variation in magnetic proxies by comparing them to more direct climate proxies such as pollen or stable isotopes. In this study we have compiled a high-resolution magnetic proxy dataset of the ca. 940-m-long core SG-1, which was drilled in the lacustrine sediments of the western Qaidam Basin on the northeastern Tibetan Plateau. Our record spans the entire Quaternary (~2.8 to 0.1 Ma). The magnetic susceptibility record is compared to the Artemisia/Chenopodiaceae (A/C) ratio, which is used to discriminate between dry and more humid phases in the Qaidam Basin, based on (i) 41 samples spanning the Middle Pleistocene Transition (MPT; ~1 Ma BP) and (ii) an additional 40 samples selected from intervals of minimum and maximum χ values throughout the core. For the drill core SG-1, we observe a high correlation of the A/C ratio with χ: minima of χ correspond to maxima of the A/C ratio (representing more humid phases) and vice versa. Additionally, spectral analysis of the χ record shows the emergence of the 100-ka Milankovitch cycle after the MPT. This testifies to the fact that the cyclic variation of χ represents a response to global climate change.
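
The spectral-analysis step mentioned above can be sketched with a plain periodogram on an evenly resampled record. The synthetic series and the 2-ka sampling interval below are assumptions for illustration, not the authors' actual dataset or method.

```python
import numpy as np

def dominant_period_ka(values, spacing_ka):
    """Period (in ka) of the strongest non-zero-frequency spectral peak."""
    values = np.asarray(values, dtype=float)
    detrended = values - values.mean()
    power = np.abs(np.fft.rfft(detrended)) ** 2
    freqs = np.fft.rfftfreq(len(values), d=spacing_ka)  # cycles per ka
    k = np.argmax(power[1:]) + 1  # skip the zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic post-MPT record: dominant 100-ka cycle plus a weaker 41-ka
# (obliquity-like) component, sampled every 2 ka over 1000 ka.
ages = np.arange(0, 1000, 2.0)  # ka
chi = np.cos(2 * np.pi * ages / 100.0) + 0.2 * np.cos(2 * np.pi * ages / 41.0)
period = dominant_period_ka(chi, spacing_ka=2.0)
```

For a real proxy record with irregular age control, the series would first be interpolated onto an even age grid (or analyzed with a Lomb-Scargle periodogram instead).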

  5. Revised estimates of Greenland ice sheet thinning histories based on ice-core records

    DEFF Research Database (Denmark)

    Lecavalier, B.S.; Milne, G.A.; Fisher, D.A.;


    -based reconstructions and, to some extent, the estimated elevation histories. A key component of the ice core analysis involved removing the influence of vertical surface motion on the δ18O signal measured from the Agassiz and Renland ice caps. We re-visit the original analysis with the intent to determine if the use...... height changes on the δ18O signal from the two ice cores. This procedure is complicated by the fact that δ18O contained in Agassiz ice is influenced by land height changes distant from the ice cap and so selecting a single location at which to compute the land height signal is not possible. Uncertainty...... in this selection is further complicated by the possible influence of Innuitian ice during the early Holocene (12-8 ka BP). Our results indicate that a more accurate treatment of the uplift correction leads to elevation histories that are, in general, shifted down relative to the original curves at GRIP, NGRIP, DYE...

  6. Time capsule: an autonomous sensor and recorder based on diffusion-reaction. (United States)

    Gerber, Lukas C; Rosenfeld, Liat; Chen, Yunhan; Tang, Sindy K Y


    We describe the use of chemical diffusion and reaction to record temporally varying chemical information as spatial patterns without the need for external power. Diffusion of chemicals acts as a clock, while reactions forming immobile products possessing defined optical properties perform sensing and recording functions simultaneously. The spatial location of the products reflects the history of exposure to the detected substances of interest. We refer to our device as a time capsule and show an initial proof of principle in the autonomous detection of lead ions in water.
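
The "diffusion as clock" principle can be made concrete with the standard one-dimensional diffusion-length relation L ≈ sqrt(2·D·t): the position of the immobile reaction product encodes how long the device was exposed. The diffusivity value below is an assumed illustrative number, not a parameter from the paper.

```python
import math

def diffusion_length_mm(diffusivity_mm2_per_s, seconds):
    """Characteristic 1-D diffusion length: L = sqrt(2 * D * t)."""
    return math.sqrt(2.0 * diffusivity_mm2_per_s * seconds)

def exposure_time_s(diffusivity_mm2_per_s, length_mm):
    """Invert the clock: recover exposure time from the recorded position."""
    return length_mm ** 2 / (2.0 * diffusivity_mm2_per_s)

D = 1e-3   # mm^2/s, assumed illustrative diffusivity
t = 3600.0  # one hour of exposure
L = diffusion_length_mm(D, t)  # how far the colored product front reaches
```

Reading the front position off the device and inverting with `exposure_time_s` recovers the exposure history, which is the essence of the recording scheme.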

  7. Literature based drug interaction prediction with clinical assessment using electronic medical records: novel myopathy associated drug interactions.

    Directory of Open Access Journals (Sweden)

    Jon D Duke

    Full Text Available Drug-drug interactions (DDIs) are a common cause of adverse drug events. In this paper, we combined a literature discovery approach with analysis of a large electronic medical record database to predict and evaluate novel DDIs. We predicted an initial set of 13,197 potential DDIs based on substrates and inhibitors of cytochrome P450 (CYP) metabolism enzymes identified from published in vitro pharmacology experiments. Using a clinical repository of over 800,000 patients, we narrowed this theoretical set of DDIs to 3,670 drug pairs actually taken by patients. Finally, we sought to identify novel combinations that synergistically increased the risk of myopathy. Five pairs were identified with p-values less than 1E-06: loratadine and simvastatin (relative risk, RR = 1.69); loratadine and alprazolam (RR = 1.86); loratadine and duloxetine (RR = 1.94); loratadine and ropinirole (RR = 3.21); and promethazine and tegaserod (RR = 3.00). When taken together, each drug pair showed a significantly increased risk of myopathy compared to the expected additive myopathy risk from taking either of the drugs alone. Based on additional literature data on in vitro drug metabolism and inhibition potency, loratadine plus simvastatin and tegaserod plus promethazine were predicted to have a strong DDI through the CYP3A4 and CYP2D6 enzymes, respectively. This new translational biomedical informatics approach supports not only the detection of new clinically significant DDI signals, but also the evaluation of their potential molecular mechanisms.
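
A minimal sketch of the risk comparison described above: compute a relative risk from cohort counts and flag a pair whose observed risk exceeds the additive expectation from the single drugs. All counts and rates below are invented for demonstration; this is not the study's actual statistical pipeline.

```python
def relative_risk(exposed_cases, exposed_total, baseline_cases, baseline_total):
    """Risk in the exposed group divided by risk in the baseline group."""
    return (exposed_cases / exposed_total) / (baseline_cases / baseline_total)

def exceeds_additive_risk(risk_pair, risk_a, risk_b, risk_baseline):
    """True if the pair's risk exceeds the additive single-drug expectation."""
    expected = risk_a + risk_b - risk_baseline  # simple additive model
    return risk_pair > expected

# Hypothetical cohort: 1% baseline myopathy risk, 1.2% on drug A alone,
# 1.3% on drug B alone, 3% on the combination.
rr_pair = relative_risk(30, 1000, 10, 1000)
synergy = exceeds_additive_risk(0.03, 0.012, 0.013, 0.01)
```

A screen like this would be run over every co-prescribed pair, with a significance test (e.g. against the additive expectation) replacing the bare comparison.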

  8. Daily Life Event Segmentation for Lifestyle Evaluation Based on Multi-Sensor Data Recorded by a Wearable Device* (United States)

    Li, Zhen; Wei, Zhiqiang; Jia, Wenyan; Sun, Mingui


    In order to evaluate people’s lifestyle for health maintenance, this paper presents a segmentation method based on multi-sensor data recorded by a wearable computer called eButton. This device is capable of recording more than ten hours of data continuously each day in multimedia forms. Automatic processing of the recorded data is a significant task. We have developed a two-step summarization method to segment large datasets automatically. At the first step, motion sensor signals are utilized to obtain candidate boundaries between different daily activities in the data. Then, visual features are extracted from images to determine final activity boundaries. It was found that some simple signal measures such as the combination of a standard deviation measure of the gyroscope sensor data at the first step and an image HSV histogram feature at the second step produces satisfactory results in automatic daily life event segmentation. This finding was verified by our experimental results. PMID:24110323
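
The first summarization step described above (candidate activity boundaries from the standard deviation of gyroscope data) can be sketched as follows. The window length, threshold, and synthetic signal are assumptions for illustration, not the eButton pipeline's actual parameters.

```python
import numpy as np

def candidate_boundaries(gyro, window=50, threshold=0.5):
    """Window indices where motion variability crosses the threshold."""
    n_windows = len(gyro) // window
    stds = np.array([gyro[i * window:(i + 1) * window].std()
                     for i in range(n_windows)])
    active = stds > threshold
    # A boundary is a transition between "still" and "moving" windows
    return np.flatnonzero(np.diff(active.astype(int)) != 0) + 1

# Synthetic gyroscope trace: still -> moving -> still
rng = np.random.default_rng(1)
still = 0.05 * rng.standard_normal(500)
moving = 2.0 * rng.standard_normal(500)
gyro = np.concatenate([still, moving, still])
boundaries = candidate_boundaries(gyro)
```

In the paper's two-step scheme, boundaries found this way would then be refined with image features (e.g. HSV histograms) from the wearable camera.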

  9. Data Mining of NASA Boeing 737 Flight Data: Frequency Analysis of In-Flight Recorded Data (United States)

    Butterfield, Ansel J.


    Data recorded during flights of the NASA Trailblazer Boeing 737 have been analyzed to ascertain the presence of aircraft structural responses from various excitations such as the engine, aerodynamic effects, wind gusts, and control system operations. The NASA Trailblazer Boeing 737 was chosen as a focus of the study because of a large quantity of its flight data records. The goal of this study was to determine if any aircraft structural characteristics could be identified from flight data collected for measuring non-structural phenomena. A number of such data were examined for spatial and frequency correlation as a means of discovering hidden knowledge of the dynamic behavior of the aircraft. Data recorded from on-board dynamic sensors over a range of flight conditions showed consistently appearing frequencies. Those frequencies were attributed to aircraft structural vibrations.

  10. Strong-motion earthquake accelerograms digitization and analysis records from Lima, Peru, 1951 to 1974 (United States)

    Brady, A. Gerald; Perez, Virgilio


    This is the second of a series of reports planned to include the results of digitization and routine analyses of strong-motion earthquake accelerograms published by the U.S. Geological Survey. Serving as a model for this effort is the collection of data reports published by the Earthquake Engineering Research Laboratory of the California Institute of Technology during the years 1969 - 1975 and covering the significant records of the period from 1933 up to the San Fernando earthquake of February 9, 1971. The first of the present series of reports, Open File Report No. 76-609, covered the significant records of 1971 subsequent to the San Fernando earthquake. The present report includes the results of some ongoing work on Peru records.

  11. Local Structure Analysis and Interface Layer Effect of Phase-Change Recording Material Using Actual Media (United States)

    Nakai, Tsukasa; Yoshiki, Masahiko; Satoh, Yasuhiro; Ashida, Sumio


    The influences of the interface layer on the crystal structure, local atomic arrangement, and electronic and chemical structure of a GeBiTe (GBT) phase-change recording material have been investigated using X-ray diffraction (XRD), X-ray absorption fine structure (XAFS), and hard X-ray photoelectron spectroscopy (HX-PES) on actual rewritable high-speed HD DVD media without special sample processing. XRD results showed that the crystal structure of the laser-crystallized GBT alloy in the actual HD DVD media is the same as that of GeSbTe (GST) alloy, which has a NaCl-type structure. No differences between samples with and without interface layers were found. The lattice constant of GBT is larger than that of GST; Bi increases the lattice constant of GST in proportion to the Bi substitution ratio of Sb. According to HX-PES, the density of states (DOS) of the recording film in the amorphous state with an interface layer is closer to that of the crystalline state than that of the recording film without an interface layer. From XAFS results, clear differences between the amorphous (Amo.) and crystalline (Cry.) states were observed. The interatomic distance of the amorphous recording material is independent of the existence of an interface layer; on the other hand, the coordination number varied slightly due to the presence of the interface layer. Therefore, the electronic state of the recording layer changes because of the interface layer, although the local structure changes only slightly except for the coordination number. Combining these results, we conclude that the interface layer changes the electronic state of the recording layer and promotes crystallization, but affects the local structure of the atomic arrangement only slightly.

  12. Managing Everyday Life: A Qualitative Study of Patients’ Experiences of a Web-Based Ulcer Record for Home-Based Treatment

    Directory of Open Access Journals (Sweden)

    Marianne V. Trondsen


    Full Text Available Chronic skin ulcers are a significant challenge for patients and health service resources, and ulcer treatment often requires the competence of a specialist. Although e-health interventions are increasingly valued for ulcer care by giving access to specialists at a distance, there is limited research on patients’ use of e-health services for home-based ulcer treatment. This article reports an exploratory qualitative study of the first Norwegian web-based counselling service for home-based ulcer treatment, established in 2011 by the University Hospital of North Norway (UNN). Community nurses, general practitioners (GPs) and patients are offered access to a web-based record system to optimize ulcer care. The web-based ulcer record enables the exchange and storage of digital photos and clinical information, through which an ulcer team at UNN, consisting of specialized nurses and dermatologists, is accessible within 24 h. This article explores patients’ experiences of using the web-based record for their home-based ulcer treatment without assistance from community nurses. Semi-structured interviews were conducted with a total of four patients who had used the record. The main outcomes identified were: autonomy and flexibility; safety and trust; involvement and control; and motivation and hope. These aspects improved the patients’ everyday life during long-term ulcer care and can be understood as stimulating patient empowerment.

  13. Design techniques and analysis of high-resolution neural recording systems targeting epilepsy focus localization. (United States)

    Shoaran, Mahsa; Pollo, Claudio; Leblebici, Yusuf; Schmid, Alexandre


    The design of a high-density neural recording system targeting epilepsy monitoring is presented. Circuit challenges and techniques are discussed to optimize the amplifier topology and the included OTA. A new platform supporting active recording devices targeting wireless and high-resolution focus localization in epilepsy diagnosis is also proposed. The post-layout simulation results of an amplifier dedicated to this application are presented. The amplifier is designed in a UMC 0.18µm CMOS technology, has an NEF of 2.19 and occupies a silicon area of 0.038 mm(2), while consuming 5.8 µW from a 1.8-V supply.

  14. Quantitative analysis by renormalized entropy of invasive electroencephalograph recordings in focal epilepsy (United States)

    Kopitzki, K.; Warnke, P. C.; Timmer, J.


    Invasive electroencephalograph (EEG) recordings of ten patients suffering from focal epilepsy were analyzed using the method of renormalized entropy. Introduced as a complexity measure for the different regimes of a dynamical system, the feature was tested here for its spatiotemporal behavior in epileptic seizures. In all patients a decrease of renormalized entropy within the ictal phase of seizure was found. Furthermore, the strength of this decrease is monotonically related to the distance of the recording location to the focus. The results suggest that the method of renormalized entropy is a useful procedure for clinical applications like seizure detection and localization of epileptic foci.
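
As a simplified stand-in for the renormalized-entropy measure used above, the sketch below compares the Shannon entropy of normalized power spectra between a broadband "interictal" segment and a rhythmic "ictal" one; a spectrum concentrated at a seizure-like rhythm yields lower entropy. This is a pedagogical approximation, not the exact renormalization procedure of the paper.

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy of the signal's normalized power spectrum."""
    power = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(2)
interictal = rng.standard_normal(t.size)  # broadband background activity
ictal = np.sin(2 * np.pi * 7 * t) + 0.1 * rng.standard_normal(t.size)  # rhythmic
entropy_drop = spectral_entropy(interictal) - spectral_entropy(ictal)
```

A positive `entropy_drop` mirrors the decrease reported within the ictal phase; the paper's measure additionally renormalizes the reference spectrum before the comparison.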

  15. Quantitative analysis by renormalized entropy of invasive electroencephalograph recordings in focal epilepsy

    CERN Document Server

    Kopitzki, K; Timmer, J


    Invasive electroencephalograph (EEG) recordings of ten patients suffering from focal epilepsy were analyzed using the method of renormalized entropy. Introduced as a complexity measure for the different regimes of a dynamical system, the feature was tested here for its spatio-temporal behavior in epileptic seizures. In all patients a decrease of renormalized entropy within the ictal phase of seizure was found. Furthermore, the strength of this decrease is monotonically related to the distance of the recording location to the focus. The results suggest that the method of renormalized entropy is a useful procedure for clinical applications like seizure detection and localization of epileptic foci.

  16. Evolution of a high-performance storage system based on magnetic tape instrumentation recorders (United States)

    Peters, Bruce


    In order to provide transparent access to data in network computing environments, high performance storage systems are getting smarter as well as faster. Magnetic tape instrumentation recorders contain an increasing amount of intelligence in the form of software and firmware that manages the processes of capturing input signals and data, putting them on media and then reproducing or playing them back. Such intelligence makes them better recorders, ideally suited for applications requiring the high-speed capture and playback of large streams of signals or data. In order to make recorders better storage systems, intelligence is also being added to provide appropriate computer and network interfaces along with services that enable them to interoperate with host computers or network client and server entities. Thus, recorders are evolving into high-performance storage systems that become an integral part of a shared information system. Datatape has embarked on a program with the Caltech sponsored Concurrent Supercomputer Consortium to develop a smart mass storage system. Working within the framework of the emerging IEEE Mass Storage System Reference Model, a high-performance storage system that works with the STX File Server to provide storage services for the Intel Touchstone Delta Supercomputer is being built. Our objective is to provide the required high storage capacity and transfer rate to support grand challenge applications, such as global climate modeling.

  18. Incorporating Semantics into Data Driven Workflows for Content Based Analysis (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpus (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  19. Physician assessment of disease activity in JIA subtypes. Analysis of data extracted from electronic medical records

    Directory of Open Access Journals (Sweden)

    Wang Deli


    Full Text Available Abstract Objective Although electronic medical records (EMRs) have facilitated care for children with juvenile idiopathic arthritis (JIA), analyses of treatment outcomes have required paper-based or manually re-entered data. We have started EMR discrete data entry for JIA patient visits, including joint examination and global assessment, by physician and patient. In this preliminary study, we extracted data from the EMR to Xenobase™ (TransMed Systems, Inc., Cupertino, CA), an application permitting cohort analyses of the relationship of global assessment to joint examination and subtype. Methods During clinic visits, data were entered into discrete fields in ambulatory visit forms in the EMR (EpicCare™, Epic Systems, Verona, WI). Data were extracted using Clarity Reports, then de-identified and uploaded for analyses to Xenobase™. Parameters included joint examination, ILAR diagnostic classification, physician global assessment, patient global assessment, and patient pain score. Data for a single visit for each of 160 patients over a 2-month period, beginning March 2010, were analyzed. Results In systemic JIA patients, strong correlations for physician global assessment were found with pain score, joint count and patient assessment. In contrast, physician assessment for persistent oligoarticular and rheumatoid factor-negative patients showed strong correlation with joint counts, but only moderate correlation with pain scores and patient global assessment. Conversely, for enthesitis patients, physician assessment correlated strongly with pain scores, and moderately with joint count and patient global assessment. Rheumatoid factor-positive patients, the smallest group studied, showed moderate correlation for all three measures. Patient global assessment for systemic patients showed strong correlations with pain scores and joint count, similar to data for physician assessment. For polyarticular and enthesitis patients
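
The kind of correlation analysis summarized above can be illustrated with a toy computation of Pearson correlations between a physician global assessment and joint count or pain score. All numbers below are invented for demonstration and are not study data.

```python
import numpy as np

# Hypothetical per-visit scores for eight patients
physician_global = np.array([0, 1, 2, 3, 5, 6, 8, 9], dtype=float)
joint_count = np.array([0, 1, 1, 2, 4, 5, 7, 8], dtype=float)
pain_score = np.array([1, 0, 3, 2, 4, 7, 5, 9], dtype=float)

# Pearson correlation of physician assessment with each measure
r_joints = np.corrcoef(physician_global, joint_count)[0, 1]
r_pain = np.corrcoef(physician_global, pain_score)[0, 1]
```

Repeating such a computation per ILAR subtype is what underlies statements like "strong correlation with joint counts but only moderate correlation with pain scores".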

  20. Scoring tail damage in pigs: an evaluation based on recordings at Swedish slaughterhouses

    Directory of Open Access Journals (Sweden)

    Keeling Linda J


    Full Text Available Abstract Background There is increasing interest in recording tail damage in pigs at slaughter to identify problem farms for advisory purposes, but also for benchmarking within and between countries as part of systematic monitoring of animal welfare. However, it is difficult to draw conclusions when comparing prevalences between studies and countries, partly due to differences in management (e.g. differences in tail docking and enrichment routines) and partly due to differences in the definition of tail damage. Methods Tail damage and tail length were recorded for 15,068 pigs slaughtered during three and four consecutive days at two slaughterhouses in Sweden. Tail damage was visually scored according to a 6-point scale; tail length was both visually scored according to a 5-point scale and recorded in centimetres for pigs with injured or shortened tails. Results The total prevalence of injury or shortening of the tail was 7.0% and 7.2% in slaughterhouses A and B, respectively. When only considering pigs with half or less of the tail left, these percentages were 1.5% and 1.9%, which is in line with the prevalence estimated from the routine recordings at slaughter in Sweden. A higher percentage of males had injured and/or shortened tails, and males had more severely bitten tails than females. Conclusions While the current method of recording tail damage in Sweden was found to be reliable for identifying problem farms, it clearly underestimates the actual prevalence of tail damage. For monitoring and benchmarking purposes, both in Sweden and internationally, we propose that a three-grade scale including both old and new tail damage would be more appropriate. The scale consists of one class for no tail damage, one for mild tail damage (injured or shortened tail with more than half of the tail remaining) and one for severe tail damage (half or less of the tail remaining).

  1. A combined cICA-EEMD analysis of EEG recordings from depressed or schizophrenic patients during olfactory stimulation (United States)

    Götz, Th; Stadler, L.; Fraunhofer, G.; Tomé, A. M.; Hausner, H.; Lang, E. W.


    Objective. We propose a combination of a constrained independent component analysis (cICA) with an ensemble empirical mode decomposition (EEMD) to analyze electroencephalographic recordings from depressed or schizophrenic subjects during olfactory stimulation. Approach. EEMD serves to extract intrinsic mode functions (IMFs) underlying the recorded EEG time series. The latter then serve as reference signals to extract the most similar underlying independent component within a constrained ICA. The extracted modes are further analyzed considering their power spectra. Main results. The analysis of the extracted modes reveals clear differences in the related power spectra between the disease characteristics of depressed and schizophrenic patients. Such differences appear in the high-frequency γ-band of the intrinsic modes, but also, in much more detail, in the low-frequency range in the α-, θ- and δ-bands. Significance. The proposed method provides various means to discriminate between the two clinical pictures in a clinical environment.
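The spectral comparison described in the abstract rests on band power estimates of the extracted modes. A minimal sketch of computing band powers from a synthetic EEG trace — the band limits, signal, and function name are illustrative assumptions, not the authors' code:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Integrate the one-sided power spectrum between f_lo and f_hi (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

fs = 250.0                      # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)     # 4 s of data
# synthetic "EEG": a 10 Hz alpha rhythm plus a weak 40 Hz gamma component
x = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 40 * t)

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "gamma": (30, 80)}
powers = {name: band_power(x, fs, lo, hi) for name, (lo, hi) in bands.items()}
```

Applied per intrinsic mode, band powers of this kind are what the α-, θ-, δ- and γ-band comparisons between patient groups would be made from.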

  2. Recording and analysis of locomotion in dairy cows with 3D accelerometers

    NARCIS (Netherlands)

    Mol, de R.M.; Lammers, R.J.H.; Pompe, J.C.A.M.; Ipema, A.H.; Hogewerf, P.H.


    An automated method for lameness detection can be an alternative to detection by regular observation. Accelerometers attached to a leg of the dairy cow can be used to record the locomotion of a dairy cow. In an experiment the 3D acceleration of the right hind leg during walking of three dairy cows

  3. On the Analysis of Wind-Induced Noise in Seismological Recordings (United States)

    Lott, Friederike F.; Ritter, Joachim R. R.; Al-Qaryouti, Mahmoud; Corsmeier, Ulrich


    Atmospheric processes, ranging from microscale turbulence to severe storms on the synoptic scale, impact the continuous ground motion of the Earth and have the potential to induce strong broad-band noise in seismological recordings. We designed a target-oriented experiment to quantify the influence of wind on ground motion velocity in the Dead Sea valley. For the period from March 2014 to February 2015, a seismological array, consisting of 15 three-component short-period and broad-band stations, was operated near Madaba, Jordan, complemented by one meteorological tower providing synchronized, continuous three-component measurements of wind speed. Results reveal a pronounced, predominantly linear increase of the logarithmic power of ground motion velocity with rising mean horizontal wind speed at all recording stations. Measurements in rough, mountainous terrain further identify a strong dependency of wind-induced noise on surface characteristics such as topography, and therefore demonstrate the necessity of considering wind direction as well. To assess the noise level of seismological recordings with respect to a dynamically changing wind field, we develop a methodology to account for the dependency of the power spectral density of ground motion velocity on wind speed and wind direction over long, statistically significant periods. We further introduce the quantitative measure of ground motion susceptibility to estimate the vulnerability of seismological recordings to the presence of wind.
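The reported predominantly linear increase of logarithmic ground-motion power with mean wind speed amounts to a simple regression. A hedged sketch on synthetic data — the numbers, and reading the fitted slope as a "susceptibility", are illustrative assumptions, not the authors' definitions:

```python
import numpy as np

rng = np.random.default_rng(0)
wind = rng.uniform(0, 15, 500)                    # mean horizontal wind speed, m/s
# synthetic seismic noise: log-power rises linearly with wind speed plus scatter
log_power = -120 + 1.5 * wind + rng.normal(0, 1.0, 500)   # dB rel. (m/s)^2/Hz

# least-squares fit of the linear wind dependence
slope, intercept = np.polyfit(wind, log_power, 1)

# illustrative susceptibility measure: dB of noise gained per m/s of wind
susceptibility = slope
```

Binning such fits by wind direction sector would capture the topography dependence the study describes.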

  4. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas


    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately ex...

  5. Analysis of Self-Recording in Self-Management Interventions for Stereotypy (United States)

    Fritz, Jennifer N.; Iwata, Brian A.; Rolider, Natalie U.; Camp, Erin M.; Neidert, Pamela L.


    Most treatments for stereotypy involve arrangements of antecedent or consequent events that are imposed entirely by a therapist. By contrast, results of some studies suggest that self-recording, a common component of self-management interventions, might be an effective and efficient way to reduce stereotypy. Because the procedure typically has…

  6. Structuring and coding in health care records: a qualitative analysis using diabetes as a case study

    Directory of Open Access Journals (Sweden)

    Ann R R Robertson


    Full Text Available Background   Globally, diabetes mellitus presents a substantial burden to individuals and healthcare systems. Structuring and/or coding of medical records underpin attempts to improve information sharing and searching, potentially bringing benefits for both clinical and secondary uses. Aims and objectives   We investigated if, how and why records for adults with diabetes were structured and/or coded, and explored stakeholders’ perceptions of current practice. Methods   We carried out a qualitative, theoretically-informed case study of documenting healthcare information for diabetes patients in family practice and hospital settings, using semi-structured interviews, observations, systems demonstrations and documentary data. Results   We conducted 22 interviews and four on-site observations, and reviewed 25 documents. For secondary uses – research, audit, public health and service planning – the benefits of highly structured and coded diabetes data were clearly articulated. Reported clinical benefits in terms of managing and monitoring diabetes, and perhaps encouraging patient self-management, were modest. We observed marked differences in levels of record structuring and/or coding between settings, and found little evidence that these data were being exploited to improve information sharing between them. Conclusions   Using high levels of data structuring and coding in medical records for diabetes patients has potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK.

  7. Estimating the Preferences of Central Bankers : An Analysis of Four Voting Records

    NARCIS (Netherlands)

    Eijffinger, S.C.W.; Mahieu, R.J.; Raes, L.B.D.


    Abstract: This paper analyzes the voting records of four central banks (Sweden, Hungary, Poland and the Czech Republic) with spatial models of voting. We infer the policy preferences of the monetary policy committee members and use these to analyze the evolution in preferences over time and the diff

  8. Model-based estimation of the global carbon budget and its uncertainty from carbon dioxide and carbon isotope records

    Energy Technology Data Exchange (ETDEWEB)

    Kheshgi, Haroon S. [Corporate Research Laboratories, Exxon Research and Engineering Company, Annandale, New Jersey (United States); Jain, Atul K. [Department of Atmospheric Sciences, University of Illinois, Urbana (United States); Wuebbles, Donald J. [Department of Atmospheric Sciences, University of Illinois, Urbana (United States)


    A global carbon cycle model is used to reconstruct the carbon budget, balancing emissions from fossil fuel and land use with carbon uptake by the oceans and the terrestrial biosphere. We apply Bayesian statistics to estimate the uncertainty of carbon uptake by the oceans and the terrestrial biosphere based on carbon dioxide and carbon isotope records, and on prior information on model parameter probability distributions. This results in a quantitative reconstruction of the past carbon budget and its uncertainty derived from an explicit choice of model, data-based constraints, and prior distribution of parameters. Our estimated ocean sink for the 1980s is 17±7 Gt C (90% confidence interval) and is comparable to the estimate of 20±8 Gt C given in the recent Intergovernmental Panel on Climate Change assessment [Schimel et al., 1996]. Constraint choice is tested to determine which records have the most influence over estimates of the past carbon budget; individual records (e.g., the bomb-radiocarbon inventory) have little effect on their own, since other records form similar constraints. (c) 1999 American Geophysical Union.
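The Bayesian constraint procedure can be illustrated with a one-parameter grid approximation — entirely synthetic numbers and a single Gaussian "record" constraint, standing in for the paper's full carbon cycle model:

```python
import numpy as np

# single model parameter: cumulative ocean carbon uptake (Gt C), on a grid
grid = np.linspace(0.0, 40.0, 2001)
dx = grid[1] - grid[0]

# prior from model-parameter uncertainty: N(15, 10^2), illustrative
prior = np.exp(-0.5 * ((grid - 15.0) / 10.0) ** 2)

# likelihood from one data record (e.g. an isotope inventory): N(18, 5^2), illustrative
likelihood = np.exp(-0.5 * ((grid - 18.0) / 5.0) ** 2)

# Bayes' rule on the grid, then normalize to a probability density
posterior = prior * likelihood
posterior /= posterior.sum() * dx

post_mean = np.sum(grid * posterior) * dx
post_sd = np.sqrt(np.sum((grid - post_mean) ** 2 * posterior) * dx)
```

For two Gaussians the posterior is known in closed form (precision-weighted mean 17.4, sd √20 ≈ 4.47), which makes the grid version easy to check; data-based constraints enter multiplicatively in exactly this way, record by record.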

  9. Transect based analysis versus area based analysis to quantify shoreline displacement: spatial resolution issues. (United States)

    Anfuso, Giorgio; Bowman, Dan; Danese, Chiara; Pranzini, Enzo


    Field surveys, aerial photographs, and satellite images are the most commonly employed sources of data for analyzing shoreline position, which is then compared using area based analysis (ABA) or transect based analysis (TBA) methods. The former is performed by computing the mean shoreline displacement for the identified coastal segments, i.e., dividing the beach area variation by the segment length; the latter is based on the measurement of the distance between successive shorelines at set points along transects. The present study compares, by means of GIS tools, the ABA and TBA methods by computing shoreline displacements recorded on two stretches of the Tuscany coast (Italy): the beach at Punta Ala, a linear coast without shore protection structures, and the one at Follonica, which is irregular due to the presence of groins and detached breakwaters. Surveys were carried out using a differential global positioning system (DGPS) in RTK mode. For each site, a 4800-m-long coastal segment was analyzed and divided into ninety-six 50-m-long sectors for which changes were computed using both the ABA and TBA methods. Sectors were progressively joined to lengths of 100, 200, 400, and 800 m to examine how this influenced results. ABA and TBA results are highly correlated for transect distances and sector lengths up to 100 m at both investigated locations. If longer transects are considered, the two methods still produce well-correlated data on the smooth shoreline (i.e., at Punta Ala), but correlation becomes significantly lower on the irregular shoreline (i.e., at Follonica).
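The ABA/TBA distinction reduces to two estimates of mean displacement over a coastal segment. A small sketch with made-up transect data, assuming evenly spaced transects (the numbers are illustrative):

```python
import numpy as np

# Shoreline displacement between two surveys, sampled at set points along
# shore-normal transects.  Positive = seaward advance (m).
transect_spacing = 50.0                                      # m between transects
displacement = np.array([2.0, 1.5, -0.5, 3.0, 0.0, -1.0])    # TBA measurements

# Transect based analysis (TBA): mean of the per-transect distances
tba_mean = displacement.mean()

# Area based analysis (ABA): beach area change divided by segment length.
# With evenly spaced transects the area is a trapezoidal sum, so ABA
# approaches the TBA mean on a smooth shoreline.
segment_length = transect_spacing * (len(displacement) - 1)
area_change = ((displacement[:-1] + displacement[1:]) / 2 * transect_spacing).sum()
aba_mean = area_change / segment_length
```

On an irregular, structure-controlled shoreline the two estimates diverge as sector length grows, which is the effect the study quantifies.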

  10. A stratigraphic framework for naming and robust correlation of abrupt climatic changes during the last glacial period based on three synchronized Greenland ice core records (United States)

    Rasmussen, Sune O.


    Due to their outstanding resolution and well-constrained chronologies, Greenland ice core records have long been used as a master record of past climatic changes during the last interglacial-glacial cycle in the North Atlantic region. As part of the INTIMATE (INtegration of Ice-core, MArine and TErrestrial records) project, protocols have been proposed to ensure consistent and robust correlation between different records of past climate. A key element of these protocols has been the formal definition of numbered Greenland Stadials (GS) and Greenland Interstadials (GI) within the past glacial period as the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. Using a recent synchronization of the NGRIP, GRIP, and GISP2 ice cores that allows the parallel analysis of all three records on a common time scale, we here present an extension of the GS/GI stratigraphic template to the entire glacial period. This is based on a combination of isotope ratios (δ18O, reflecting mainly local temperature) and calcium concentrations (reflecting mainly atmospheric dust loading). In addition to the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice core records more than two decades ago, a number of short-lived climatic oscillations have been identified in the three synchronized records. Some of these events have been observed in other studies, but we here propose a consistent scheme for discriminating and naming all the significant climatic events of the last glacial period that are represented in the Greenland ice cores. This is a key step aimed at promoting unambiguous comparison and correlation between different proxy records, as well as a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations. The work presented is under review for publication in Quaternary Science Reviews. Author team: S

  11. Recording the dynamic endocytosis of single gold nanoparticles by AFM-based force tracing (United States)

    Ding, Bohua; Tian, Yongmei; Pan, Yangang; Shan, Yuping; Cai, Mingjun; Xu, Haijiao; Sun, Yingchun; Wang, Hongda


    We utilized force tracing to directly record the endocytosis of single gold nanoparticles (Au NPs) with different sizes, revealing the size-dependent endocytosis dynamics and the crucial role of membrane cholesterol. The force, duration and velocity of Au NP invagination are accurately determined at the single-particle and microsecond level, which was previously unattainable. Electronic supplementary information (ESI) available: Details of the experimental procedures and the results of the control experiments. See DOI: 10.1039/c5nr01020a

  12. Improved Security of Attribute Based Encryption for Securing Sharing of Personal Health Records

    Directory of Open Access Journals (Sweden)

    Able E Alias


    Full Text Available Cloud computing servers provide a platform for users to remotely store data and share data items with everyone. The personal health record (PHR) has emerged as a patient-centric model of health information exchange. Confidentiality of the shared data is the major problem when patients use commercial cloud servers, because the data can be viewed by everyone. To assure patients' control over access to their own medical records, it is a promising method to encrypt the files before outsourcing and to attach access control to that data. Privacy exposure, scalability in key management, flexible access and efficient user revocation have remained the most important challenges toward achieving fine-grained, cryptographically enforced data access control. In this paper a high degree of patient privacy is guaranteed by exploiting multi-authority attribute based encryption (ABE). Dividing the users of the PHR system into multiple security domains greatly reduces the key management complexity for owners and users.

  13. Design of an Electronic Healthcare Record Server Based on Part 1 of ISO EN 13606

    Directory of Open Access Journals (Sweden)

    Tony Austin


    Full Text Available ISO EN 13606 is a newly approved standard at European and ISO levels for the meaningful exchange of clinical information between systems. Although conceived as an inter-operability standard to which existing electronic health record (EHR systems will transform legacy data, the requirements met and architectural approach reflected in this standard also make it a good candidate for the internal architecture of an EHR server. The authors have built such a server for the storage of healthcare records and demonstrated that it is possible to use ISO EN 13606 part 1 as the basis of an internal system architecture. The development of the system and some of the applications of the server are described in this paper. It is the first known operational implementation of the standard as an EHR system.

  14. Fundamental electric circuit elements based on the linear and nonlinear magnetoelectric effects (Presentation Recording) (United States)

    Sun, Young; Shang, Dashan; Chai, Yisheng; Cao, Zexian; Lu, Jun


    From the viewpoint of electric circuit theory, the three fundamental two-terminal passive circuit elements, the resistor R, capacitor C, and inductor L, are defined in terms of a relationship between two of the four basic circuit variables: charge q, current i, voltage v, and magnetic flux φ. From symmetry considerations, there should be a fourth fundamental element defined from the relationship between charge q and magnetic flux φ. Here we present both theoretical analysis and experimental evidence to demonstrate that a two-terminal passive device employing the magnetoelectric (ME) effects can exhibit a direct relationship between charge q and magnetic flux φ, and thus is able to act as the fourth fundamental circuit element. The ME effects refer to the induction of electric polarization by a magnetic field, or of magnetization by an electric field, and have attracted enormous interest due to their promise in many applications. However, no one has linked the ME effects with fundamental circuit theory. Both the linear and the nonlinear-memory devices, termed transtor and memtranstor, respectively, have been experimentally realized using multiferroic materials showing strong ME effects. Based on our work, a full map of fundamental two-terminal circuit elements is constructed, which consists of four linear and four nonlinear-memory elements. This full map provides an invaluable guide to developing novel circuit functionalities in the future.

  15. Research on key techniques of fault recorder background analysis software

    Institute of Scientific and Technical Information of China (English)

    郭振华; 江亚群; 杨帅雄; 梁勇超; 黄纯


    This paper studies the difficulties and key techniques in designing the fault record analysis software of a fault recording device. Firstly, an algorithm for computing the vertical (ordinate) coordinates of displayed waveforms that takes unified waveform scaling into account is proposed, according to the characteristics and requirements of power fault record waveform analysis and display. Double-buffered drawing is then adopted to avoid the graphics flicker that occurs with traditional Windows plotting. In IEC COMTRADE-formatted fault record files, the recording device samples the signal at different sampling rates in different periods; in order to estimate signal parameters across periods, a dual-sampling-rate parameter estimation algorithm based on the discrete Fourier transform is given. The power fault record analysis system, developed in Visual C++ 6.0, has a friendly man-machine interface and excellent performance, and has been used in engineering practice.
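The DFT-based, dual-sampling-rate parameter estimation can be sketched with a single-bin DFT that recovers the amplitude and phase of the power-frequency component; applying the same routine separately to each constant-rate segment of a COMTRADE record yields parameters that are comparable across segments. The values and function name are illustrative, not the paper's algorithm:

```python
import numpy as np

def dft_phasor(samples, fs, f0):
    """Estimate amplitude and phase of the f0 component over a whole number
    of cycles using a single-bin discrete Fourier transform."""
    n = np.arange(len(samples))
    # correlate the samples with the complex exponential at f0
    c = 2.0 / len(samples) * np.sum(samples * np.exp(-2j * np.pi * f0 * n / fs))
    return np.abs(c), np.angle(c)

fs = 1000.0                               # sampling rate of one record segment, Hz
t = np.arange(0, 0.2, 1 / fs)             # 10 cycles of a 50 Hz waveform
x = 3.0 * np.cos(2 * np.pi * 50 * t + 0.5)

amp, phase = dft_phasor(x, fs, 50.0)
```

Because the estimator only needs the segment's own `fs`, a record that switches sampling rate mid-fault can be processed segment by segment.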

  16. Time based measurement of the impedance of the skin-electrode interface for dry electrode ECG recording. (United States)

    Dozio, Roberta; Baba, Adeshina; Assambo, Cedric; Burke, Martin J


    This paper reports the measurement of the properties of dry or pasteless conductive electrodes to be used for long-term recording of the human electrocardiogram (ECG). Knowledge of these properties is essential for the correct design of the input stage of the associated recording amplifiers. Measurements were made on three commercially available conductive carbon-based electrodes at pressures of 5 mmHg and 20 mmHg, located on the lower abdomen, on three subjects having different skin types. Parameter values were fitted to a two-time-constant model of the electrode using data measured over a period of 10 s. Values of resistance ranging from 40 kΩ to 1590 kΩ and of capacitance ranging from 0.05 μF to 38 μF were obtained for the components, while the values of the time constants varied from 0.07 s to 3.9 s.
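A hedged sketch of a two-time-constant electrode model of the kind the parameters were fitted to — a series resistance plus two parallel R-C branches is a common form for the skin-electrode interface; the exact topology and the component values here are assumptions chosen within the reported ranges:

```python
import numpy as np

def electrode_impedance(f, rs, r1, c1, r2, c2):
    """Complex impedance of a two-time-constant skin-electrode model:
    a series resistance rs plus two parallel R-C branches."""
    w = 2j * np.pi * f
    z1 = r1 / (1 + w * r1 * c1)
    z2 = r2 / (1 + w * r2 * c2)
    return rs + z1 + z2

f = np.logspace(-2, 3, 200)   # 0.01 Hz to 1 kHz
# illustrative values within the ranges quoted in the abstract
z = electrode_impedance(f, rs=10e3, r1=500e3, c1=1e-6, r2=100e3, c2=10e-6)

# the two branch time constants: tau = R * C
tau1, tau2 = 500e3 * 1e-6, 100e3 * 10e-6   # 0.5 s and 1.0 s
```

At low frequency the impedance approaches rs + r1 + r2, at high frequency it collapses toward rs; the input impedance of the recording amplifier has to dominate the former.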

  17. Ground-based assessment of the bias and long-term stability of 14 limb and occultation ozone profile data records (United States)

    Hubert, Daan; Lambert, Jean-Christopher; Verhoelst, Tijl; Granville, José; Keppens, Arno; Baray, Jean-Luc; Bourassa, Adam E.; Cortesi, Ugo; Degenstein, Doug A.; Froidevaux, Lucien; Godin-Beekmann, Sophie; Hoppel, Karl W.; Johnson, Bryan J.; Kyrölä, Erkki; Leblanc, Thierry; Lichtenberg, Günter; Marchand, Marion; McElroy, C. Thomas; Murtagh, Donal; Nakane, Hideaki; Portafaix, Thierry; Querel, Richard; Russell, James M., III; Salvador, Jacobo; Smit, Herman G. J.; Stebel, Kerstin; Steinbrecht, Wolfgang; Strawbridge, Kevin B.; Stübi, René; Swart, Daan P. J.; Taha, Ghassan; Tarasick, David W.; Thompson, Anne M.; Urban, Joachim; van Gijsel, Joanna A. E.; Van Malderen, Roeland; von der Gathen, Peter; Walker, Kaley A.; Wolfram, Elian; Zawodny, Joseph M.


    Ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of 14 limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20 and 40 km the satellite ozone measurement biases are smaller than ±5 %, the short-term variabilities are less than 5–12 % and the drifts are at most ±5 % per decade (or even ±3 % per decade for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause, where natural variability and low ozone abundances impede a more precise analysis. In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10 % and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY) and significant long-term drifts (SCIAMACHY, OSIRIS, HALOE and possibly GOMOS and SMR as well). Furthermore, we reflected on the repercussions
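Drift figures of the kind quoted (at most ±5 % per decade) typically come from regressing satellite-minus-ground relative differences on time. A minimal sketch with synthetic monthly comparisons; the drift, bias and noise levels are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.linspace(1990, 2010, 240)              # monthly comparisons over 20 yr
# synthetic satellite-minus-ozonesonde relative differences (%), drifting slowly
rel_diff = 2.0 + 0.3 * (years - years[0]) / 10 + rng.normal(0, 1.0, years.size)

# linear fit: intercept = overall bias, slope = drift
slope_per_year, bias = np.polyfit(years - years[0], rel_diff, 1)
drift_per_decade = slope_per_year * 10            # % per decade
```

The residual scatter of the fit corresponds to the "short-term variability" the assessment reports alongside bias and drift.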

  18. Paleoclimate record and paleohydrogeological analysis of travertine from the Niangziguan Karst Springs, northern China

    Institute of Scientific and Technical Information of China (English)

    李义连; 王焰新; 邓安力


    Travertine deposited around the Niangziguan karst springs was used as a new type of paleoclimate record in this study. Five stages of climate change in northern China from about 200 ka to 36 ka before present (B.P.) were reconstructed using the ¹⁸O and ¹³C isotope records of the travertine. The overall tendency was towards a more arid climate. Coupling the temporal-spatial evolution of the springs with climate change, the hydrogeological evolution could be divided into four major periods since the middle Pleistocene: (1) a period with no springs; (2) the initial period of spring outcropping as the predominant mode of discharge; (3) the culmination period of spring development; and (4) the period of spring discharge attenuation. The attenuation is partly related to the decrease of recharge as a result of the dry climate after about 90 ka B.P.

  19. Appropriate threshold levels of cardiac beat-to-beat variation in semi-automatic analysis of equine ECG recordings

    DEFF Research Database (Denmark)

    Madsen, Mette Flethøj; Kanters, Jørgen K.; Pedersen, Philip Juul


    Background: Although premature beats are a matter of concern in horses, the interpretation of equine ECG recordings is complicated by a lack of standardized analysis criteria and a limited knowledge of the normal beat-to-beat variation of equine cardiac rhythm. The purpose of this study was to determine the appropriate threshold levels of maximum acceptable deviation of RR intervals in equine ECG analysis, and to evaluate a novel two-step timing algorithm by quantifying the frequency of arrhythmias in a cohort of healthy adult endurance horses. Results: Beat-to-beat variation differed … (range 1–24). Conclusions: Beat-to-beat variation of equine cardiac rhythm varies according to HR, and threshold levels in equine ECG analysis should be adjusted accordingly. Standardization of the analysis criteria will enable comparisons of studies and follow-up examinations of patients. A small number…
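A threshold on the relative deviation between successive RR intervals is the core of such timing-based analysis. A simplified sketch with a fixed fractional threshold and synthetic data — the study's point is precisely that the threshold should be adjusted with heart rate:

```python
import numpy as np

def flag_deviant_beats(rr, threshold):
    """Flag beats whose RR interval deviates from the preceding interval by
    more than the given fractional threshold (simplified timing criterion)."""
    deviation = np.abs(np.diff(rr)) / rr[:-1]
    return np.where(deviation > threshold)[0] + 1   # beat indices into rr

# synthetic equine RR series (seconds): regular rhythm with one premature beat
rr = np.array([1.50, 1.52, 1.49, 1.51, 0.90, 1.50, 1.51])
suspect = flag_deviant_beats(rr, threshold=0.20)
```

Note that both the premature interval and the following return to baseline are flagged, which is one reason a two-step timing algorithm is needed to classify the underlying beats.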

  20. Hair Analysis Provides a Historical Record of Cortisol Levels in Cushing’s Syndrome (United States)

    Thomson, S.; Koren, G.; Fraser, L.-A.; Rieder, M.; Friedman, T. C.; Van Uum, S. H. M.


    The severity of Cushing’s Syndrome (CS) depends on the duration and extent of the exposure to excess glucocorticoids. Current measurements of cortisol in serum, saliva and urine reflect systemic cortisol levels at the time of sample collection, but cannot assess past cortisol levels. Hair cortisol levels may be increased in patients with CS, and, as hair grows about 1 cm/month, measurement of hair cortisol may provide historical information on the development of hypercortisolism. We attempted to measure cortisol in hair in relation to clinical course in six female patients with CS and in 32 healthy volunteers in 1 cm hair sections. Hair cortisol content was measured using a commercially available salivary cortisol immunoassay with a protocol modified for use with hair. Hair cortisol levels were higher in patients with CS than in controls; the medians (ranges) were 679 (279–2500) and 116 (26–204) ng/g respectively (P < 0.001). Segmental hair analysis provided information for up to 18 months before the time of sampling. Hair cortisol concentrations appeared to vary in accordance with the clinical course. Based on these data, we suggest that hair cortisol measurement is a novel method for assessing dynamic systemic cortisol exposure and provides unique historical information on variation in cortisol, and that more research is required to fully understand the utility and limits of this technique. PMID:19609841

  1. Conventional heart rate variability analysis of ambulatory electrocardiographic recordings fails to predict imminent ventricular fibrillation (United States)

    Vybiral, T.; Glaeser, D. H.; Goldberger, A. L.; Rigney, D. R.; Hess, K. R.; Mietus, J.; Skinner, J. E.; Francis, M.; Pratt, C. M.


    OBJECTIVES. The purpose of this report was to study heart rate variability in Holter recordings of patients who experienced ventricular fibrillation during the recording. BACKGROUND. Decreased heart rate variability is recognized as a long-term predictor of overall and arrhythmic death after myocardial infarction. It was therefore postulated that heart rate variability would be lowest when measured immediately before ventricular fibrillation. METHODS. Conventional indexes of heart rate variability were calculated from Holter recordings of 24 patients with structural heart disease who had ventricular fibrillation during monitoring. The control group consisted of 19 patients with coronary artery disease, of comparable age and left ventricular ejection fraction, who had nonsustained ventricular tachycardia but no ventricular fibrillation. RESULTS. Heart rate variability did not differ between the two groups, and no consistent trends in heart rate variability were observed before ventricular fibrillation occurred. CONCLUSIONS. Although conventional heart rate variability is an independent long-term predictor of adverse outcome after myocardial infarction, its clinical utility as a short-term predictor of life-threatening arrhythmias remains to be elucidated.
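The "conventional indexes of heart rate variability" referred to are time-domain statistics of the RR-interval series. A self-contained sketch with synthetic RR intervals; the index definitions (SDNN, RMSSD, pNN50) are the standard ones, not values from this study:

```python
import numpy as np

def hrv_indexes(rr_ms):
    """Conventional time-domain HRV indexes from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "mean_rr": rr.mean(),
        "sdnn": rr.std(ddof=1),                      # overall variability
        "rmssd": np.sqrt(np.mean(diffs ** 2)),       # beat-to-beat variability
        "pnn50": np.mean(np.abs(diffs) > 50) * 100,  # % successive diffs > 50 ms
    }

# synthetic RR series (ms) from a short Holter segment
rr_ms = [810, 830, 790, 805, 900, 795, 820]
h = hrv_indexes(rr_ms)
```

Comparing such indexes between the ventricular fibrillation and control groups, and tracking them over the minutes before the event, is the analysis the study found non-predictive.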

  2. Automatic Identification of Motion Artifacts in EHG Recording for Robust Analysis of Uterine Contractions

    Directory of Open Access Journals (Sweden)

    Yiyao Ye-Lin


    Full Text Available Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system for segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates a TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified into two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows, previously classified by experts, recorded from 12 women in the first stage of labor. The combination of features that led to the highest accuracy in detecting artifacts was then determined. The results showed that it is possible to detect motion artifacts automatically in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.

  3. An analysis of the recording of tobacco use among inpatients in Irish hospitals.

    LENUS (Irish Health Repository)

    Sheridan, A


    Smoking is the largest avoidable cause of premature mortality in the world. Hospital admission is an opportunity to identify and help smokers quit. This study aimed to determine the level of recording of tobacco use (current and past) in Irish hospitals. Information on inpatient discharges with a tobacco use diagnosis was extracted from HIPE. In 2011, a quarter (n=84,679) of discharges had a recording of tobacco use; recordings were more common among males (29% (n=50,161) male v. 20% (n=30,162) female) and among medical patients (29% (n=54,375) medical v. 20% (n=30,162) other), and were most frequent among those aged 55-59 years (30.6%; n=7,885). SLAN 2007 reported that 48% of adults had smoked at some point in their lives. This study would suggest an under-reporting of tobacco use among hospital inpatients. Efforts should be made to record smoking status at hospital admission, and to improve the quality of the HIPE coding of tobacco use.

  4. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech


    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The device consists of a toughened glass slab supported by four force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are transferred to a computer in real time, where the analysis is conducted. The article explains the principle of operation as well as the algorithm for evaluating the measurement uncertainty of the centre of pressure (COP) coordinates (x, y).
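
    For a rectangular plate with a force sensor at each corner, the COP coordinates follow from a moment balance over the four vertical force readings. A minimal sketch (the sensor layout and distances below are hypothetical, not the platform's actual geometry):

```python
import numpy as np

def center_of_pressure(f, half_x, half_y):
    """COP (x, y) for a rectangular plate with force sensors at the four
    corners: f = [front-left, front-right, rear-left, rear-right] vertical
    forces (N); half_x, half_y are the sensor half-distances (m)."""
    fl, fr, rl, rr = f
    total = fl + fr + rl + rr
    x = half_x * ((fr + rr) - (fl + rl)) / total  # right minus left moment
    y = half_y * ((fl + fr) - (rl + rr)) / total  # front minus rear moment
    return x, y

# Sanity check: all load on the front-right sensor puts the COP at that corner.
x, y = center_of_pressure([0.0, 700.0, 0.0, 0.0], half_x=0.2, half_y=0.15)
```

    Tracking (x, y) over time yields the sway trajectory from which stabilographic statistics are computed.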

  5. ATLAS Recordings

    CERN Multimedia

    Steven Goldfarb; Mitch McLachlan; Homer A. Neal

    Web Archives of ATLAS Plenary Sessions, Workshops, Meetings, and Tutorials from 2005 until this past month are available via the University of Michigan portal here. Most recent additions include the Trigger-Aware Analysis Tutorial by Monika Wielers on March 23 and the ROOT Workshop held at CERN on March 26-27. Viewing requires a standard web browser with the RealPlayer plug-in (included in most browsers automatically) and works on any major platform. Lectures can be viewed directly over the web or downloaded locally. In addition, you will find access to a variety of general tutorials and events via the portal. Feedback welcome: our group is making arrangements now to record plenary sessions, tutorials, and other important ATLAS events for 2007. Your suggestions for potential recordings, as well as your feedback on existing archives, are always welcome. Please contact us. Thank you. Enjoy the Lectures!

  6. Astronomical calibration of the Boreal Santonian (Cretaceous) based on the marine carbon isotope record and correlation to the tropical realm (United States)

    Thibault, Nicolas; Jarvis, Ian; Voigt, Silke; Gale, Andy; Attree, Kevin; Jenkyns, Hugh


    New high-resolution records of bulk carbonate carbon isotopes have been generated for the Upper Coniacian to Lower Campanian interval of the reference sections at Seaford Head (southern England) and Bottaccione (Gubbio, central Italy). These records allow for a new and unambiguous stratigraphic correlation of the base and top of the Santonian between the Boreal and Tethyan realms. Orbital forcing of stable carbon and oxygen isotopes can be highlighted in the Seaford Head dataset, and a floating astronomical time scale is presented for the Santonian of the section, which spans five 405 kyr cycles (Sa1 to Sa5). Macro-, micro- and nannofossil biostratigraphy of the Seaford section is integrated along with magnetostratigraphy, carbon-isotope chemostratigraphy and cyclostratigraphy. Correlation of the Seaford Head astronomical time scale to that of the Niobrara Formation (U.S. Western Interior Basin) allows for anchoring these records to the La2011 astronomical solution at the Santonian-Campanian (Sa/Ca) boundary, which has been recently dated to 84.19±0.38 Ma. Five different astronomical tuning options are examined. The astronomical calibration generates a c. 200 kyr mismatch of the Coniacian-Santonian boundary age between the Boreal Realm in Europe and the Western Interior, likely due either to slight diachronism of the first occurrence of the inoceramid Cladoceramus undulatoplicatus between the two regions, or to remaining uncertainties of radiometric dating and the cyclostratigraphic records.

  7. Genetic Counselors’ Current Use of Personal Health Records-Based Family Histories in Genetic Clinics and Considerations for Their Future Adoption


    Widmer, Chaney; DeShazo, Jonathan P.; Bodurtha, Joann; Quillin, John; Creswick, Heather


    Given the widespread adoption of electronic medical records and recent emergence of electronic family history tools, we examined genetic counselors’ perspectives on the emerging technology of the personal health record (PHR)-based family history tool that links to an electronic medical record (EMR). Two-hundred thirty-three genetic counselors responded to an on-line survey eliciting current use of electronic family history (EFH) tools and familiarity with PHR-based family history tools. Addit...

  8. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick


    This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. 
Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use

  10. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoehndorf, Robert


    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework for describing phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  11. Vertical Microbial Community Variability of Carbonate-based Cones may Provide Insight into Formation in the Rock Record (United States)

    Trivedi, C.; Bojanowski, C.; Daille, L. K.; Bradley, J.; Johnson, H.; Stamps, B. W.; Stevenson, B. S.; Berelson, W.; Corsetti, F. A.; Spear, J. R.


    Stromatolite morphogenesis is poorly understood, and the process by which microbial mats become mineralized is a primary question in microbialite formation. Ancient conical stromatolites are primarily carbonate-based whereas the few modern analogues in hot springs are either non-mineralized or mineralized by silica. A team from the 2015 International GeoBiology Course investigated carbonate-rich microbial cones from near Little Hot Creek (LHC), Long Valley Caldera, California, to investigate how conical stromatolites might form in a hot spring carbonate system. The cones are up to 3 cm tall and are found in a calm, ~45° C pool near LHC that is 4 times super-saturated with respect to CaCO3. The cones rise from a flat, layered microbial mat at the edge of the pool. Scanning electron microscopy revealed filamentous bacteria associated with calcite crystals within the cone tips. Preliminary 16S rRNA gene analysis indicated variability of community composition between different vertical levels of the cone. The cone tip had comparatively greater abundance of filamentous cyanobacteria (Leptolyngbya and Phormidium) and fewer heterotrophs (e.g. Chloroflexi) compared to the cone bottom. This supports the hypothesis that cone formation may depend on the differential abundance of the microbial community and their potential functional roles. Metagenomic analyses of the cones revealed potential genes related to chemotaxis and motility. Specifically, a genomic bin identified as a member of the genus Isosphaera contained an hmp chemotaxis operon implicated in gliding motility in the cyanobacterium Nostoc punctiforme [1]. Isosphaera is a Planctomycete shown to have phototactic capabilities [2], and may play a role in conjunction with cyanobacteria in the vertical formation of the cones. 
This analysis of actively growing cones indicates a complex interplay of geochemistry and microbiology that form structures which can serve as models for processes that occurred in the past and are

  12. An Agent Based System Framework for Mining Data Record Extraction from Search Engine Result Pages

    Directory of Open Access Journals (Sweden)

    K. L. Shunmuganathan


    Full Text Available Nowadays, the huge amount of information distributed through the Web motivates the study of techniques for extracting relevant data in an efficient and reliable way. Information extraction (IE) from semistructured Web documents plays an important role for a variety of information agents. In this paper, a framework for a WebIE system built on the JADE platform is proposed: a non-visual automatic wrapper that extracts data records from search engine results pages, which contain important information for meta search engines and computer users. The paper describes the different agents used in WebIE, how they communicate, and how they are managed. A Multi Agent System (MAS) provides an efficient, decentralized way for agents to communicate. A prototype model was developed for this study to show how it can solve the complex problems that arise in WebIE. Our wrapper consists of a series of agent filters that detect and remove irrelevant data regions from the web page. In this paper, we propose a highly effective and efficient algorithm for automatically mining result records from search engine response pages.

  13. A biophysically-based finite state machine model for analysing gastric experimental entrainment and pacing recordings (United States)

    Sathar, Shameer; Trew, Mark L.; Du, Peng; O'Grady, Greg; Cheng, Leo K.


    Gastrointestinal motility is coordinated by slow waves (SWs) generated by the interstitial cells of Cajal (ICC). Experimental studies have shown that SWs spontaneously activate at different intrinsic frequencies in isolated tissue, whereas in intact tissues they are entrained to a single frequency. Gastric pacing has been used in an attempt to improve motility in disorders such as gastroparesis by modulating entrainment, but the optimal methods of pacing are currently unknown. Computational models can aid in the interpretation of complex in-vivo recordings and help to determine optimal pacing strategies. However, previous computational models of SW entrainment are limited to the intrinsic pacing frequency as the primary determinant of the conduction velocity, and are not able to accurately represent the effects of external stimuli and electrical anisotropies. In this paper, we present a novel computationally efficient method for modelling SW propagation through the ICC network while accounting for conductivity parameters and fiber orientations. The method successfully reproduced experimental recordings of entrainment following gastric transection and the effects of gastric pacing on SW activity. It provides a reliable new tool for investigating gastric electrophysiology in normal and diseased states, and to guide and focus future experimental studies. PMID:24276722
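
    The authors' biophysically based finite state machine is not reproduced here, but the entrainment principle the abstract describes (a faster pacemaker entrains slower neighbours through conduction) can be illustrated with a toy event-driven sketch; the periods, conduction delay, and refractory time below are invented for illustration:

```python
import heapq

def simulate(periods, delay=0.5, refractory=8.0, t_end=200.0):
    """1D chain of pacemaker cells. Each cell fires at its intrinsic period
    unless triggered earlier by a neighbour; a firing propagates to the
    neighbours after `delay` if they are past their refractory period.
    Returns the list of firing times for each cell."""
    n = len(periods)
    last_fire = [-refractory] * n
    fires = [[] for _ in range(n)]
    events = [(periods[i], i) for i in range(n)]  # (time, cell) event queue
    heapq.heapify(events)
    while events:
        t, i = heapq.heappop(events)
        if t > t_end:
            break
        if t - last_fire[i] < refractory:
            continue  # still refractory: discard this activation
        last_fire[i] = t
        fires[i].append(t)
        heapq.heappush(events, (t + periods[i], i))  # next intrinsic firing
        for j in (i - 1, i + 1):                     # conduct to neighbours
            if 0 <= j < n:
                heapq.heappush(events, (t + delay, j))
    return fires

# Cell 0 is the fastest pacemaker; the whole chain locks to its period.
fires = simulate([10.0, 12.0, 14.0, 16.0])
```

    In this sketch the most distal cell, despite an intrinsic period of 16, ends up firing at the period-10 rhythm of the fastest cell, mimicking entrainment to a single frequency in intact tissue.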

  14. High-frequency paleoclimatic variability: a spectral analysis of the Vostok ice-core isotopic record

    Energy Technology Data Exchange (ETDEWEB)

    Yiou, P.; Genthon, C.; Jouzel, J.; Le Treut, H.; Lorius, C.; Ghil, M.; Korotkevich, Y.S.


    This paper uses a recently analysed isotopic record from an ice core drilled at the Soviet Antarctic station Vostok, representing a total time span of about 160,000 years. The results show the existence of significant climatic variability at time scales of 10,000 years and below. The many spectral peaks appear to be approximately linear combinations of a small number among them, a clear indication of the nonlinear nature of climate fluctuations at these "short" time scales.
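
    The kind of spectral peak detection described can be illustrated with a plain FFT periodogram of an evenly sampled synthetic series (the study's actual methods and sampling are more involved; the periods and noise level below are made up):

```python
import numpy as np

# Synthetic "proxy" series: oscillations with 10 kyr and 4 kyr periods,
# sampled every 0.2 kyr over a 160 kyr record, plus noise.
dt = 0.2                   # sampling interval in kyr
t = np.arange(0, 160, dt)
rng = np.random.default_rng(1)
x = (np.sin(2 * np.pi * t / 10)
     + 0.5 * np.sin(2 * np.pi * t / 4)
     + 0.3 * rng.standard_normal(t.size))

freqs = np.fft.rfftfreq(t.size, d=dt)            # cycles per kyr
power = np.abs(np.fft.rfft(x - x.mean())) ** 2   # periodogram

# Dominant spectral peak (ignoring the zero-frequency bin).
peak_period = 1.0 / freqs[1:][np.argmax(power[1:])]
```

    For real paleoclimate series, which are unevenly spaced in time, methods such as the Lomb-Scargle periodogram or maximum-entropy spectra are typically used instead of a raw FFT.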

  15. Detection of the short-term preseizure changes in EEG recordings using complexity and synchrony analysis

    Institute of Scientific and Technical Information of China (English)

    JIA Wenyan; KONG Na; MA Jun; LIU Hesheng; GAO Xiaorong; GAO Shangkai; YANG Fusheng


    An important consideration in epileptic seizure prediction is proving the existence of a pre-seizure state that can be detected using various signal processing algorithms. In analyses of intracranial electroencephalographic (EEG) recordings from four epilepsy patients, short-term changes in measures of complexity and synchrony were detected before the majority of seizure events across the sample patient population. A decrease in complexity and an increase in phase synchrony appeared several minutes before seizure onset, and the changes were more pronounced in the focal region than in the remote region. This result was also validated statistically using a surrogate data method.
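
    A common way to quantify phase synchrony between two EEG channels is the phase-locking value: the modulus of the time-averaged phase difference between their analytic signals. The abstract does not specify the exact measure used, so the following is an illustrative, self-contained sketch:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (equivalent to a Hilbert transform)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """Mean phase coherence: 1 = perfect phase locking, ~0 = none."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

t = np.arange(0.0, 4.0, 1.0 / 500.0)          # 4 s at 500 Hz
locked_a = np.sin(2 * np.pi * 10 * t)
locked_b = np.sin(2 * np.pi * 10 * t + 0.8)   # same rhythm, constant offset
rng = np.random.default_rng(2)
noise_a, noise_b = rng.standard_normal((2, t.size))

plv_locked = phase_locking_value(locked_a, locked_b)
plv_noise = phase_locking_value(noise_a, noise_b)
```

    Two channels sharing a rhythm with a fixed phase offset give a value near 1, while independent noise gives a value near 0; a rise in this measure minutes before seizure onset is the kind of pre-seizure change the study reports.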

  16. Electronic Medical Record System Based on XML

    Institute of Scientific and Technical Information of China (English)



    To address the slow transmission of medical information and the time-consuming retrieval of patient history inherent in handwritten medical records, a design for an electronic medical record (EMR) system based on XML was proposed and implemented. Using a Web Services development platform, the system integrates patient-centered clinical information, including physician orders, medical technology examinations, nursing care and infectious disease reports, and implements information query and integration. With the electronic medical record system, multi-point, multi-faceted supervision of medical records replaces the focus on terminal-only quality control that characterizes handwritten records.

  17. A 350 ka record of climate change from Lake El'gygytgyn, Far East Russian Arctic: refining the pattern of climate modes by means of cluster analysis

    Directory of Open Access Journals (Sweden)

    U. Frank


    Full Text Available Rock magnetic, biochemical and inorganic records of the sediment cores PG1351 and Lz1024 from Lake El'gygytgyn, Chukotka peninsula, Far East Russian Arctic, were subjected to a hierarchical agglomerative cluster analysis in order to refine and extend the pattern of climate modes as defined by Melles et al. (2007). Cluster analysis of the data obtained from both cores yielded similar results, differentiating clearly between the four climate modes warm, peak warm, cold and dry, and cold and moist. In addition, two transitional phases were identified, representing the early stages of a cold phase and slightly colder conditions during a warm phase. The statistical approach can thus be used to resolve gradual changes in the sedimentary units as an indicator of available oxygen in the hypolimnion in greater detail. Based upon cluster analyses of core Lz1024, the published succession of climate modes in core PG1351, covering the last 250 ka, was modified and extended back to 350 ka. Comparison of the extended Lake El'gygytgyn parameter records of magnetic susceptibility (κLF), total organic carbon content (TOC) and the chemical index of alteration (CIA; Minyuk et al., 2007) with the marine oxygen isotope (δ18O) stack LR04 (Lisiecki and Raymo, 2005) and the summer insolation at 67.5° N revealed that all stages back to marine isotope stage (MIS) 10 and most of the substages are clearly reflected in the pattern derived from the cluster analysis.
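
    The core of the statistical approach, hierarchical agglomerative clustering of standardized multi-proxy data, might look like this in outline (the proxy values, linkage method, and cluster count below are invented for illustration and may differ from the study's settings):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic multi-proxy dataset: each row is a sample depth, each column a
# standardized proxy (e.g. magnetic susceptibility, TOC, a chemical index).
rng = np.random.default_rng(3)
mode_a = rng.normal(loc=[1.0, 1.0, -1.0], scale=0.2, size=(30, 3))
mode_b = rng.normal(loc=[-1.0, -1.0, 1.0], scale=0.2, size=(30, 3))
X = np.vstack([mode_a, mode_b])

Z = linkage(X, method="ward")                     # agglomerative merge tree
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into 2 modes
```

    Each resulting cluster groups samples with similar proxy signatures; in the study, such clusters are interpreted as climate modes and mapped back onto the core depth/age scale.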

  18. Combining terrestrial stereophotogrammetry, DGPS and GIS-based 3D voxel modelling in the volumetric recording of archaeological features (United States)

    Orengo, Hector A.


    Archaeological recording of structures and excavations in high mountain areas is greatly hindered by the scarce availability of both space, to transport material, and time. The Madriu-Perafita-Claror, InterAmbAr and PCR Mont Lozère high mountain projects have documented hundreds of archaeological structures and carried out many archaeological excavations. These projects required the development of a technique which could record both structures and the process of an archaeological excavation in a fast and reliable manner. The combination of DGPS, close-range terrestrial stereophotogrammetry and voxel based GIS modelling offered a perfect solution since it helped in developing a strategy which would obtain all the required data on-site fast and with a high degree of precision. These data are treated off-site to obtain georeferenced orthoimages covering both the structures and the excavation process from which site and excavation plans can be created. The proposed workflow outputs also include digital surface models and volumetric models of the excavated areas from which topography and archaeological profiles were obtained by voxel-based GIS procedures. In this way, all the graphic recording required by standard archaeological practices was met.

  19. Health scorecard of spacecraft platforms: Track record of on-orbit anomalies and failures and preliminary comparative analysis (United States)

    Wise, Marcie A.; Saleh, Joseph H.; Haga, Rachel A.


    Choosing the "right" satellite platform for a given market and mission requirements is a major investment decision for a satellite operator. With a variety of platforms available on the market from different manufacturers, and multiple offerings from the same manufacturer, the down-selection process can be quite involved. In addition, because data for on-orbit failures and anomalies per platform is unavailable, incomplete, or fragmented, it is difficult to compare options and make an informed choice with respect to the critical attribute of field reliability of different platforms. In this work, we first survey a large number of geosynchronous satellite platforms by the major satellite manufacturers, and we provide a brief overview of their technical characteristics, timeline of introduction, and number of units launched. We then analyze an extensive database of satellite failures and anomalies, and develop for each platform a "health scorecard" that includes all the minor and major anomalies, and complete failures—that is, failure events of different severities—observed on-orbit for each platform. We identify the subsystems that drive these failure events and how much each subsystem contributes to these events for each platform. In addition, we provide the percentage of units in each platform which have experienced failure events, and, after calculating the total number of years logged on-orbit by each platform, we compute its corresponding average failure and anomaly rate. We conclude this work with a preliminary comparative analysis of the health scorecards of different platforms. The concept of a "health scorecard" here introduced provides a useful snapshot of the failure and anomaly track record of a spacecraft platform on orbit. As such, it constitutes a useful and transparent benchmark that can be used by satellite operators to inform their acquisition choices ("inform" not "base" as other considerations are factored in when comparing different spacecraft
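
    The average failure/anomaly rate described above reduces to events divided by the total unit-years logged on orbit. A minimal sketch with invented numbers (not from the study's database):

```python
# Hypothetical platform scorecards: failure/anomaly events observed on orbit
# and the total years logged by all units of each platform.
scorecard = {
    "Platform A": {"events": 18, "unit_years": 450.0},
    "Platform B": {"events": 7, "unit_years": 120.0},
}

def event_rate(entry):
    """Average failure/anomaly rate in events per unit-year on orbit."""
    return entry["events"] / entry["unit_years"]

rates = {name: event_rate(e) for name, e in scorecard.items()}
```

    Note that a raw rate comparison ignores event severity and subsystem breakdown, which is why the scorecard also tracks those separately.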

  20. The validation of a computer-based food record for older adults: the Novel Assessment of Nutrition and Ageing (NANA) method. (United States)

    Timon, Claire M; Astell, Arlene J; Hwang, Faustina; Adlam, Tim D; Smith, Tom; Maclean, Lin; Spurr, Daynor; Forster, Sarah E; Williams, Elizabeth A


    Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65-89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all the three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland-Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual's dietary intake.
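
    The Bland-Altman agreement analysis used in the study computes the mean difference between two methods (the bias) and the 95% limits of agreement. A small sketch with hypothetical intake values, not the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics between two measurement methods:
    mean bias and 95% limits of agreement (bias ± 1.96 SD of differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical energy intakes (kcal/day) from two methods for five subjects.
nana = [1850, 2100, 1720, 1980, 2250]
diary = [1900, 2150, 1750, 2050, 2300]
bias, lo, hi = bland_altman(nana, diary)
```

    A negative bias, as in this toy example, corresponds to the reported tendency of the first method to record marginally lower energy intakes; if most differences fall within the limits of agreement, the methods are considered comparable.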

  1. A new on-line electrocardiographic records database and computer routines for data analysis. (United States)

    Ledezma, Carlos A; Severeyn, Erika; Perpiñán, Gilberto; Altuve, Miguel; Wong, Sara


    Gathering experimental data to test computer methods developed during research is hard work. Nowadays, some databases are available online and can be freely downloaded; however, there is not yet a wide range of databases, and not all pathologies are covered. Researchers with few resources need more data they can consult for free. To cope with this, we present an on-line portal containing a compilation of ECG databases recorded over the last two decades for research purposes. The first version of this portal contains four databases of ECG records: ischemic cardiopathy (72 patients, 3-lead ECG each), ischemic preconditioning (20 patients, 3-lead ECG each), diabetes (51 patients, 8-lead ECG each) and metabolic syndrome (25 subjects, 12-lead ECG each). In addition, one computer program and three routines are provided in order to correctly read the signals, and two digital filters along with two ECG wave detectors are provided for further processing. This portal will be constantly growing; other ECG databases and signal processing software will be uploaded. With this project, we give the scientific community a resource that avoids hours of data collection and supports the development of free software.

  2. Matlab software for the analysis of seismic waves recorded by three-element arrays (United States)

    Pignatelli, A.; Giuntini, A.; Console, R.


    We develop and implement an algorithm for inverting three-element array data on a Matlab platform. The algorithm allows reliable estimation of back azimuth and apparent velocity from seismic records under low signal-to-noise conditions. We start with a cubic spline interpolation of the waveforms and determine the differences between arrival times at pairs of array elements. The time differences are directly computed from cross-correlation functions. The advantages of this technique are (a) manual picking of the onset of each arrival is not necessary at each array element; (b) interpolation makes it possible to estimate time differences at a higher resolution than the sampling rate of the digital waveforms; (c) consistency among three independent determinations provides a reliability check; and (d) the value of apparent velocity indicates the nature of the recorded wavelet and physically checks the results. The algorithm was tested on data collected by a tri-partite array (with an aperture of ~250 m) deployed in 1998 by the National Data Center of Israel, during a field experiment in southern Israel, 20 km southwest of the Dead Sea. The data include shallow explosions and natural earthquakes under both high and low signal-to-noise conditions. The procedure developed in this study is considered suitable for the search for small aftershocks following an underground explosion, in the context of on-site inspections under the Comprehensive Nuclear-Test-Ban Treaty (CTBT).
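
    The core step, estimating inter-sensor time differences from cross-correlation peaks at sub-sample resolution, can be sketched as follows; note this uses circular FFT correlation with parabolic peak interpolation rather than the authors' cubic-spline approach, and is written in Python rather than Matlab:

```python
import numpy as np

def estimate_lag(x, y):
    """Lag of y relative to x (in samples) from the circular FFT
    cross-correlation, refined to sub-sample resolution by fitting a
    parabola through the correlation peak and its two neighbours."""
    c = np.fft.irfft(np.fft.rfft(y) * np.conj(np.fft.rfft(x)), n=x.size)
    k = int(np.argmax(c))
    cm, c0, cp = c[k - 1], c[k], c[(k + 1) % c.size]
    delta = 0.5 * (cm - cp) / (cm - 2 * c0 + cp)  # parabolic refinement
    lag = k + delta
    return lag if lag <= x.size / 2 else lag - x.size  # map to signed lag

rng = np.random.default_rng(4)
x = rng.standard_normal(1024)
y = np.roll(x, 5)          # y is x delayed by exactly 5 samples
lag = estimate_lag(x, y)
```

    With three sensors, the three pairwise lags are over-determined (they must sum to ~0 around the triangle), which provides the consistency check mentioned in point (c).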

  3. Forming Teams for Teaching Programming based on Static Code Analysis

    Directory of Open Access Journals (Sweden)

    Davis Arosemena-Trejos


    Full Text Available The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may cause an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, from which the best formation of teams in a given course is evaluated. The teams' formations are based on programming styles, skills, pair programming, or the presence of a leader.

  4. Forming Teams for Teaching Programming based on Static Code Analysis

    CERN Document Server

    Arosemena-Trejos, Davis; Clunie, Clifton


    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may cause an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, from which the best formation of teams in a given course is evaluated. The teams' formations are based on programming styles, skills, pair programming, or the presence of a leader.

  5. Late Holocene stable-isotope based winter temperature records from ice wedges in the Northeast Siberian Arctic (United States)

    Opel, Thomas; Meyer, Hanno; Laepple, Thomas; Dereviagin, Alexander Yu.


    The Arctic is currently undergoing an unprecedented warming. This highly dynamic response to changes in climate forcing, and the global impact of the Arctic water, carbon and energy balances, make the Arctic a key region for studying past, recent and future climate changes. Recent proxy-based temperature reconstructions indicate a long-term cooling over roughly the past 8 millennia that is mainly related to a decrease in solar summer insolation and has been reversed only by the ongoing warming. Climate model results, on the other hand, show no significant change or even a slight warming over this period. This model-proxy data mismatch might be caused by a summer bias in the climate proxies used. Ice wedges may provide essential information on past winter temperatures for a comprehensive seasonal picture of Holocene Arctic climate variability. Polygonal ice wedges are a widespread permafrost feature in the Arctic tundra lowlands. Ice wedges form by the repeated filling of thermal contraction cracks with snow melt water, which quickly refreezes at subzero ground temperatures and forms ice veins. As the seasonality of frost cracking and infill is generally related to winter and spring, respectively, the isotopic composition of wedge ice is indicative of past climate conditions during the annual cold season (DJFMAM, hereafter referred to as winter). δ18O of ice is interpreted as a proxy for regional surface air temperature. AMS radiocarbon dating of organic remains in ice-wedge samples provides age information to generate chronologies for single ice wedges as well as regionally stacked records with up to centennial resolution. In this contribution we seek to summarize Holocene ice-wedge δ18O-based temperature information from the Northeast Siberian Arctic. We focus mainly on our own work in the Laptev Sea region but also consider literature data from other regional study sites. We consider the stable-isotope composition of wedge ice, ice-wedge dating and chronological

  6. Towards a Judgement-Based Statistical Analysis (United States)

    Gorard, Stephen


    There is a misconception among social scientists that statistical analysis is somehow a technical, essentially objective, process of decision-making, whereas other forms of data analysis are judgement-based, subjective and far from technical. This paper focuses on the former part of the misconception, showing, rather, that statistical analysis…

  7. SigMate: a Matlab-based automated tool for extracellular neuronal signal processing and analysis. (United States)

    Mahmud, Mufti; Bertoldo, Alessandra; Girardi, Stefano; Maschietto, Marta; Vassanelli, Stefano


Rapid advances in neuronal probe technology for multisite recording of brain activity have posed a significant challenge to neuroscientists in processing and analyzing the recorded signals. To infer meaningful conclusions quickly and accurately from large datasets, automated and sophisticated signal processing and analysis tools are required. This paper presents a novel Matlab-based tool, "SigMate", incorporating standard methods to analyze spikes and EEG signals, and in-house solutions for local field potential (LFP) analysis. Available modules at present are: 1. in-house developed algorithms for data display (2D and 3D), file operations (file splitting, file concatenation, and file column rearranging), baseline correction, slow stimulus artifact removal, noise characterization and signal quality assessment, current source density (CSD) analysis, latency estimation from LFPs and CSDs, determination of cortical layer activation order using LFPs and CSDs, and single LFP clustering; 2. existing modules for spike detection, sorting and spike train analysis, and EEG signal analysis. SigMate has the flexibility to analyze multichannel signals as well as signals from multiple recording sources. The in-house developed tools for LFP analysis have been extensively tested with signals recorded using standard extracellular recording electrodes, and planar and implantable multi-transistor array (MTA) based neural probes. SigMate will be disseminated shortly to the neuroscience community under the open-source GNU General Public License.

  8. Global solar radiation: comparison of satellite-based climatology with station records (United States)

    Skalak, Petr; Zahradnicek, Pavel; Stepanek, Petr; Farda, Ales


We analyze surface incoming shortwave radiation (SIS) from the SARAH dataset prepared by the EUMETSAT Climate Monitoring Satellite Applications Facility from satellite observations of the visible channels of the MVIRI and SEVIRI instruments onboard the geostationary Meteosat satellites. The satellite SIS data are evaluated within the period 1984-2014 on various time scales, from individual months and years to long-term climate means. The validation is performed using ground measurements of global solar radiation (GLBR) carried out at 11 meteorological stations of the Czech Hydrometeorological Institute in the Czech Republic with data series at least 30 years long. Our aim is to explore whether the SIS data could serve as an alternative source of information on GLBR outside the relatively sparse network of meteorological stations recording GLBR. Acknowledgement: Supported by the Ministry of Education, Youth and Sports of the Czech Republic within the National Sustainability Program I (NPU I), grant number LO1415.

  9. A new source discriminant based on frequency dispersion for hydroacoustic phases recorded by T-phase stations (United States)

    Talandier, Jacques; Okal, Emile A.


    In the context of the verification of the Comprehensive Nuclear-Test Ban Treaty in the marine environment, we present a new discriminant based on the empirical observation that hydroacoustic phases recorded at T-phase stations from explosive sources in the water column feature a systematic inverse dispersion, with lower frequencies traveling slower, which is absent from signals emanating from earthquake sources. This difference is present even in the case of the so-called `hotspot earthquakes' occurring inside volcanic edifices featuring steep slopes leading to efficient seismic-acoustic conversions, which can lead to misidentification of such events as explosions when using more classical duration-amplitude discriminants. We propose an algorithm for the compensation of the effect of dispersion over the hydroacoustic path based on a correction to the spectral phase of the ground velocity recorded by the T-phase station, computed individually from the dispersion observed on each record. We show that the application of a standard amplitude-duration algorithm to the resulting compensated time-series satisfactorily identifies records from hotspot earthquakes as generated by dislocation sources, and present a full algorithm, lending itself to automation, for the discrimination of explosive and earthquake sources of hydroacoustic signals at T-phase stations. The only sources not readily identifiable consist of a handful of complex explosions which occurred in the 1970s, believed to involve the testing of advanced weaponry, and which should be independently identifiable through routine vetting by analysts. While we presently cannot provide a theoretical justification to the observation that only explosive sources generate dispersed T phases, we hint that this probably reflects a simpler, and more coherent distribution of acoustic energy among the various modes constituting the wave train, than in the case of dislocation sources embedded in the solid Earth.
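The dispersion-compensation step can be illustrated with a minimal numpy sketch: multiply the record's spectrum by a quadratic phase term that cancels a linear group-delay trend, which re-compresses the dispersed wave train toward a pulse. The linear delay model, the slope and travel-time values, and the synthetic source spectrum below are illustrative assumptions, not the authors' per-record estimation procedure.

```python
import numpy as np

def compensate_dispersion(x, fs, slope):
    """Remove the quadratic spectral phase -2*pi*slope*f**2 produced by a
    linear group-delay trend tau(f) = t0 + slope*f (hypothetical model);
    the frequency-independent travel time t0 is left untouched."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return np.fft.irfft(X * np.exp(2j * np.pi * slope * f**2), n=len(x))

# Synthetic arrival with inverse dispersion (lower frequencies slower):
# energy at frequency f arrives at tau(f) = t0 + slope*f with slope < 0.
fs, n = 1000.0, 4096
f = np.fft.rfftfreq(n, d=1.0 / fs)
t0, slope = 1.0, -0.02                      # illustrative values (s, s/Hz)
amp = np.exp(-((f - 10.0) / 8.0) ** 2)      # band-limited source spectrum
x = np.fft.irfft(amp * np.exp(-2j * np.pi * f * (t0 + slope * f)), n=n)
y = compensate_dispersion(x, fs, slope)     # pulse re-compressed near t0
```

After compensation, a duration-amplitude measure applied to `y` would see a short, impulsive arrival, which is the basis of the discriminant described above.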

  10. Recording Approach of Heritage Sites Based on Merging Point Clouds from High Resolution Photogrammetry and Terrestrial Laser Scanning (United States)

    Grussenmeyer, P.; Alby, E.; Landes, T.; Koehl, M.; Guillemin, S.; Hullo, J. F.; Assali, P.; Smigiel, E.


Different approaches and tools are required in Cultural Heritage Documentation to deal with the complexity of monuments and sites. The documentation process has changed strongly in the last few years, always driven by technology. Accurate documentation is closely tied to advances in technology (imaging sensors, high-speed scanning, automation in recording and processing data) for the purposes of conservation work, management, appraisal, assessment of the structural condition, archiving, publication and research (Patias et al., 2008). We focus in this paper on the recording aspects of cultural heritage documentation, especially the generation of geometric and photorealistic 3D models for accurate reconstruction and visualization purposes. The selected approaches are based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and recent advances have changed the recording approach. The choice of the best workflow depends on the site configuration, the performance of the sensors, and criteria such as geometry, accuracy, resolution, georeferencing, texture and, of course, processing time. TLS techniques (time-of-flight or phase-shift systems) are widely used for recording large and complex objects and sites. Point cloud generation from images by dense stereo or multi-view matching can be used as an alternative or complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a high-performance digital camera and a few accessories. Indeed, the stereo or multi-view matching process offers a cheap, flexible and accurate solution to get 3D point clouds. Moreover, the captured images may also be used for model texturing. Several software packages are available, whether web-based, open source or commercial. The main advantage of this photogrammetric or computer-vision-based technology is to get…

  11. Current source density analysis: methods and application to simultaneously recorded field potentials of the rabbit's visual cortex. (United States)

    Rappelsberger, P; Pockberger, H; Petsche, H


    This paper deals with the application of current source density (CSD) analysis to simultaneously recorded intracortical field potentials of the rabbit's visual cortex. Recordings were made with multielectrodes with either 8 contacts at distances of 300 microns, or 16 contacts at distances of 150 microns on one carrier needle. For synchronized activities, a spatial resolution of 150 microns turned out to be sufficient to record all depth-varying details of the field potentials; for seizure potentials even a spacing of 300 microns was adequate in most cases. For practical application, an appropriate spacing of the measuring points has to be chosen for a satisfactory estimation of the first and second derivatives of the field potentials. For this reason an interpolation procedure is applied to reduce the spacing from 300 microns or 150 microns electrode contact distances, respectively, and to obtain intermediate values at 75 microns distances. With this spacing satisfactory estimations of the second derivative are obtained. Theoretically, CSD analysis has to be made three-dimensionally, but under certain conditions which are discussed, a one-dimensional analysis can be applied. An unknown quantity is sigma z, the vertical conductivity. It turned out that average values obtained from different experiments are not representative and that the vertical conductivity has to be measured in every experiment. This is caused by the great individual differences of the cortices even if the same stereotactic coordinates are chosen. Therefore, in every experiment relative conductivity measurements are performed. The influence of different conductivity values within the various layers and the influence of a conductivity gradient is discussed and demonstrated by examples.
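The one-dimensional CSD estimate described here reduces to a second spatial derivative of the depth profile of field potentials. A minimal numpy sketch, assuming the profile has already been interpolated to 75-micron spacing (e.g. with `np.interp` from the 150- or 300-micron contacts); the conductivity value is a placeholder, since the paper stresses that sigma z must be measured in each experiment:

```python
import numpy as np

def csd_1d(phi, spacing_m, sigma_z=0.3):
    """One-dimensional current source density from a depth profile.

    phi       : field potential samples along depth, e.g. after
                interpolation to 75-micron spacing as in the paper
    spacing_m : sample spacing in metres (75e-6 for 75 microns)
    sigma_z   : vertical conductivity (S/m); 0.3 is only a placeholder,
                the paper requires measuring it in every experiment

    Returns CSD = -sigma_z * d2(phi)/dz2 at the interior points, using
    the standard second central difference.
    """
    phi = np.asarray(phi, dtype=float)
    d2 = (phi[:-2] - 2.0 * phi[1:-1] + phi[2:]) / spacing_m**2
    return -sigma_z * d2
```

With a sinusoidal test profile of known curvature, the estimate agrees with the analytic second derivative to within the truncation error of the central difference at this spacing.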

  12. Psychopaths at Nuremberg? A Rorschach analysis of the records of the Nazi war criminals. (United States)

    Greiner, N; Nunno, V J


    Sixteen Nuremberg war criminals' (NWC) Rorschach records were compared to those of Antisocial Personality Disordered (APD) incarcerated males procured by Gacono and Meloy (1988). The Meloy (1988) set of hypotheses for psychopathy was applied to the NWCs' Rorschachs. The NWCs did not match Meloy's hypotheses, and neither did the antisocial personality disordered inmates. However, individually and as a group, the NWC Rorschach variables indicated less psychopathy, according to the hypotheses, than those of the APD inmates. Unlike most previous studies, variance in type and degree of psychopathology precluded the application of a mental disorder, character structure, or trait to all, or to the majority, of NWCs. Nevertheless, common features, such as avoidance of responsibility, low self-esteem, and capacity for affection, were revealed.

  13. Recorded fatal and permanently disabling injuries in South African manufacturing industry - Overview, analysis and reflection

    DEFF Research Database (Denmark)

    Hedlund, Frank Huess


Studies on occupational accident statistics in South Africa are few and far between; the most recent paper on the manufacturing sector was published in 1990. Accidents in South Africa are recorded in two systems: exhaustive information is available from the insurance system under the Workmen...... less so for permanently disabling accidents/incidents. The paper examines whether effects of the popular practice of replacing permanent workers with contract workers are visible in the WCC statistics; firm conclusions cannot be drawn, however, due to data shortcomings. Data inaccuracies are reviewed...... and it is argued that WCC registrations may comprise industries outside the Standard Industrial Classification (SIC) scheme for manufacturing. The quality of accident reporting in official publications began to deteriorate by the mid-1990s. The largest problem, however, is that reporting has come to a standstill...

  14. Reachability Analysis of Sampling Based Planners

    NARCIS (Netherlands)

    Geraerts, R.J.; Overmars, M.H.


Over the last decade, sampling-based planners like the Probabilistic Roadmap Method have proved successful in solving complex motion planning problems. We give a reachability-based analysis for these planners which leads to a better understanding of the success of the approach and enhancements of t…

  15. Installation Restoration Program. Phase I. Records Search, Peterson Air Force Base, Colorado. (United States)


…headquartered at Ent Air Force Base near downtown Colorado Springs, thus reopening the base. When the 15th Air Force moved to California in December 1949…

  16. Spatial-temporal analysis on climate variation in early Qing dynasty (17th -18th century) using China's chronological records (United States)

    Lin, Kuan-Hui Elaine; Wang, Pao-Kuan; Fan, I.-Chun; Liao, Yi-Chun; Liao, Hsiung-Ming; Pai, Pi-Ling


Global climate change, in the form of extremes, variations, and short- or mid-term fluctuations, is now widely conceived to challenge the survival of human beings and societies. Meanwhile, improving present and future climate modeling requires a comprehensive understanding of past climate patterns. Although historical climate modeling has made substantive progress in recent years based on new findings from dynamical meteorology, phenology, and paleobiology, less known are the mid- to short-term variations or lower-frequency variabilities at different temporal scales and their regional expressions. Accurate historical climate modeling relies heavily on the robustness of a dataset that carries specific time, location, and meteorological information in continuous temporal and spatial chains. This study thus presents an important methodological innovation for reconstructing historical climate at multiple temporal and spatial scales through building a historical climate dataset based on the Chinese chronicles compiled in the Compendium of Chinese Meteorological Records of the Last 3,000 Years, edited by Zhang (2004), covering the period since the Zhou Dynasty (1100 BC). The dataset preserves the most delicate meteorological data with accurate time, location, meteorological event, duration, and other phenological, social and economic impact information, and is carefully digitized, coded, and geo-referenced on Geographical Information System based maps according to Tan's (1982) historical atlas of China. The research project, begun in January 2015, is a collaborative work among scholars across the meteorology, geography, and historical linguistics disciplines. The present research findings, derived from the first 100+ years of the Qing dynasty, include the following. First, the analysis is based on a sample of cities/counties, n=1398, across Mainland China in the observation period. Second, the frequencies of precipitation, cold…

  17. Analysis of geomagnetic storm variations and count-rate of cosmic ray muons recorded at the Brazilian southern space observatory

    Energy Technology Data Exchange (ETDEWEB)

Frigo, Everton [University of Sao Paulo, USP, Institute of Astronomy, Geophysics and Atmospheric Sciences, IAG/USP, Department of Geophysics, Sao Paulo, SP (Brazil); Savian, Jairo Francisco [Space Science Laboratory of Santa Maria, LACESM/CT, Southern Regional Space Research Center, CRS/INPE, MCT, Santa Maria, RS (Brazil); Silva, Marlos Rockenbach da; Lago, Alisson dal; Trivedi, Nalin Babulal [National Institute for Space Research, INPE/MCT, Division of Space Geophysics, DGE, Sao Jose dos Campos, SP (Brazil); Schuch, Nelson Jorge [Southern Regional Space Research Center, CRS/INPE, MCT, Santa Maria, RS (Brazil)]


An analysis of geomagnetic storm variations and the count rate of cosmic ray muons recorded at the Brazilian Southern Space Observatory (OES/CRS/INPE-MCT) in Sao Martinho da Serra, RS, during November 2004 is presented in this paper. The geomagnetic measurements are made by a three-component low-noise fluxgate magnetometer, and the count rates of cosmic ray muons are recorded by a muon scintillator telescope (MST), both instruments installed at the Observatory. The fluxgate magnetometer measures variations in the three orthogonal components of the Earth's magnetic field, H (North-South), D (East-West) and Z (Vertical), at a data sampling rate of 0.5 Hz. The muon scintillator telescope records hourly count rates. The arrival of a solar disturbance can be identified by observing the decrease in the muon count rate. The goal of this work is to describe the physical morphology and phenomenology observed during the geomagnetic storm of November 2004, using the H component of the geomagnetic field and the vertical channel V of the multi-directional muon detector in southern Brazil. (author)

  18. Analysis of Enhanced Associativity Based Routing Protocol

    Directory of Open Access Journals (Sweden)

    Said A. Shaar


This study analyzes the performance of the Enhanced Associativity Based Routing protocol (EABR) based on two factors: operation complexity (OC) and communication complexity (CC). OC can be defined as the number of steps required to perform a protocol operation, while CC can be defined as the number of messages exchanged in performing a protocol operation [1]. The values represent a worst-case analysis. EABR has been analyzed with respect to CC and OC, and the results have been compared with those of another routing technique called ABR. The results show that EABR can perform better than ABR in many circumstances during route reconstruction.

  19. Archetype-based electronic health records: a literature review and evaluation of their applicability to health data interoperability and access. (United States)

    Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny


Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHR) is substantial and ongoing. Archetype-based EHRs are a core health information system component that solves many of the problems arising during this period of change. Archetypes are models of clinical content, and they have many beneficial properties. They are interoperable, both between settings and through time. They are more amenable to change than conventional paradigms, and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature was sourced from the English-language sections of ScienceDirect, IEEE Xplore, Pubmed, Google Scholar, the ACM Digital Library and other databases on the usage of archetypes for electronic health record storage, looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations of experts in the area. Criteria for inclusion were (a) whether studies covered archetype research and (b) whether they were studies of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show wide and increasing worldwide archetype usage in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.

  20. An Efficient Searchable Encryption Against Keyword Guessing Attacks for Sharable Electronic Medical Records in Cloud-based System. (United States)

    Wu, Yilun; Lu, Xicheng; Su, Jinshu; Chen, Peixin


Preserving the privacy of electronic medical records (EMRs) is extremely important, especially when medical systems adopt cloud services to store patients' electronic medical records. Considering both the privacy and the utilization of EMRs, some medical systems apply searchable encryption to encrypt EMRs and enable authorized users to search over these encrypted records. Since individuals would like to share their EMRs with multiple persons, designing an efficient searchable encryption scheme for sharable EMRs remains very challenging. In this paper, we propose a cost-efficient secure-channel-free searchable encryption (SCF-PEKS) scheme for sharable EMRs. Compared with existing SCF-PEKS solutions, our scheme reduces the storage overhead and achieves better computation performance. Moreover, our scheme guards against keyword guessing attacks, which are neglected by most existing schemes. Finally, we implement both our scheme and a recent medical-based scheme to evaluate their performance. The evaluation results show that our scheme performs much better than the latter for sharable EMRs.

  1. A Web-based vital sign telemonitor and recorder for telemedicine applications. (United States)

    Mendoza, Patricia; Gonzalez, Perla; Villanueva, Brenda; Haltiwanger, Emily; Nazeran, Homer


We describe a vital sign telemonitor (VST) that acquires, records, displays, and provides readings such as electrocardiograms (ECGs), temperature (T), and oxygen saturation (SaO2) over the Internet to any site. The design of this system consisted of three parts: sensors, analog signal processing circuits, and a user-friendly graphical user interface (GUI). The first part involved selection of appropriate sensors: for ECG, disposable Ag/AgCl electrodes; for temperature, the LM35 precision temperature sensor; and for SaO2, the Nonin Oximetry Development Kit equipped with a finger clip. The second part consisted of processing the analog signals obtained from these sensors, achieved by implementing suitable amplifiers and filters for the vital signs. The final part focused on development of a GUI to display the vital signs in the LabVIEW environment. From these measurements, important values such as heart rate (HR), beat-to-beat (RR) intervals, SaO2 percentages, and T in both degrees Celsius and Fahrenheit were calculated. The GUI could be accessed through the Internet via a Web page, facilitating real-time patient telemonitoring. The final system was completed and tested on volunteers with satisfactory results.
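As a minimal sketch of the derived quantities mentioned above, RR intervals and mean heart rate can be computed from detected R-peak times, and temperature converted between scales. The function names are illustrative, and the R-peak detection itself (done in the LabVIEW GUI in the paper) is assumed:

```python
import numpy as np

def rr_and_heart_rate(peak_times_s):
    """Beat-to-beat (RR) intervals and mean heart rate from R-peak times.

    peak_times_s : times of detected ECG R-peaks in seconds; the peak
    detection step is assumed to have been done upstream.
    """
    rr = np.diff(np.asarray(peak_times_s, dtype=float))  # RR intervals (s)
    hr_bpm = 60.0 / rr.mean()                            # mean HR (bpm)
    return rr, hr_bpm

def celsius_to_fahrenheit(t_c):
    """Temperature in degrees Fahrenheit from degrees Celsius."""
    return t_c * 9.0 / 5.0 + 32.0
```

For example, R-peaks spaced 0.8 s apart correspond to a mean heart rate of 75 bpm.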

  2. Estimation of pathological tremor from recorded signals based on adaptive sliding fast Fourier transform

    Directory of Open Access Journals (Sweden)

    Shengxin Wang


Pathological tremor is an approximately rhythmic movement that considerably affects patients' daily living activities. Biomechanical loading and functional electrical stimulation have been proposed as potential alternatives for canceling pathological tremor. However, the performance of suppression methods depends on separating the tremor from the recorded signals. In this work, an algorithm incorporating a fast Fourier transform augmented with a sliding convolution window, an interpolation procedure, and a frequency damping module is presented to isolate tremulous components from the measured signals and estimate the instantaneous tremor frequency. A mechanical platform is designed to provide simulated tremor signals with different degrees of voluntary movement. The performance of the proposed algorithm and existing procedures is compared on simulated signals and experimental signals collected from patients. The results demonstrate that the proposed solution can detect the unknown dominant frequency and distinguish the tremor components with higher accuracy. Therefore, this algorithm is useful for actively compensating tremor by functional electrical stimulation without affecting voluntary movement.
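A minimal sketch of instantaneous-frequency tracking with a sliding windowed FFT, in the spirit of the method described above; the window length, step size, tremor band and Hann taper are assumptions, and the paper's interpolation and frequency-damping modules are omitted:

```python
import numpy as np

def dominant_frequency(x, fs, win_s=2.0, step_s=0.25, band=(3.0, 12.0)):
    """Track the dominant frequency of x within an assumed tremor band
    using a sliding Hann-windowed FFT. All parameters are illustrative."""
    n = int(win_s * fs)
    step = int(step_s * fs)
    w = np.hanning(n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    est = []
    for start in range(0, len(x) - n + 1, step):
        spec = np.abs(np.fft.rfft(x[start:start + n] * w))
        est.append(freqs[sel][np.argmax(spec[sel])])
    return np.array(est)
```

Restricting the peak search to a tremor band (here 3-12 Hz) keeps slow voluntary movement from dominating the estimate, which mirrors the separation problem the abstract describes.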

  3. Installation Restoration Program Records Search for March Air Force Base, California (United States)


…which is bounded by the Jacinto Fault on the east and the Elsinore Fault on the west. Ground surface elevations within the March AFB boundaries… approximately one mile south of the base. Lake Mathews, located approximately 10 miles west of the base, is the terminal reservoir of this aqueduct… State Project water is brought into the Perris Valley via the California Aqueduct, which runs north and east of March Air Force Base. Lake Perris

  4. Annotation methods to develop and evaluate an expert system based on natural language processing in electronic medical records. (United States)

    Gicquel, Quentin; Tvardik, Nastassia; Bouvry, Côme; Kergourlay, Ivan; Bittar, André; Segond, Frédérique; Darmoni, Stefan; Metzger, Marie-Hélène


The objective of the SYNODOS collaborative project was to develop a generic IT solution combining a medical terminology server, a semantic analyser and a knowledge base. The goal of the project was to generate meaningful epidemiological data for various medical domains from the textual content of French medical records. In the context of this project, we built a care-pathway-oriented conceptual model and a corresponding annotation method to develop and evaluate an expert system's knowledge base. The annotation method is based on a semi-automatic process using a software application (MedIndex), which exchanges with a cross-lingual multi-termino-ontology portal. The annotator selects the most appropriate medical code proposed for the medical concept in question by the multi-termino-ontology portal and temporally labels the medical concept according to the course of the medical event. This choice of conceptual model and annotation method aims to create a generic database of facts for the secondary use of electronic health record data.

5. Recording Equipment Classification Study Based on SVM

    Institute of Scientific and Technical Information of China (English)

    丛韫; 杜状状; 高冲红; 童茜雯; 郑义; 仲倩


To determine which type of recording device produced a privately recorded audio file in audio forensics, this paper proposes an SVM-based classification method for recording equipment, starting from the compression algorithm: different devices use different compression algorithms, so the recorded signals carry characteristics that distinguish one recording device from another. Audio files in different recording formats are collected first; improved MFCC cepstral parameters are then extracted from each using MATLAB; test and training sets are selected, and cross-validation is used to obtain the optimal parameters for the cepstral data. The SVM is then trained on the training set, and the resulting model is used to predict the classification labels of the test set. Simulation and experimental results show that the method distinguishes audio characteristics under different compression algorithms well, with an average recognition rate of 97%.
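A hedged sketch of the described pipeline using scikit-learn in place of the MATLAB implementation: cross-validated hyperparameter selection for an SVM classifier. The synthetic Gaussian feature vectors stand in for the improved MFCC coefficients extracted from real recordings; all names and values below are illustrative.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for improved-MFCC feature vectors from two recorder
# types (real features would come from the audio, e.g. 13 cepstral coeffs).
n, d = 200, 13
X = np.vstack([rng.normal(0.0, 1.0, (n, d)),    # device class 0
               rng.normal(1.5, 1.0, (n, d))])   # device class 1
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Cross-validation to pick SVM hyperparameters, as in the described method.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [1, 10], "gamma": ["scale", 0.1]}, cv=5)
grid.fit(X_tr, y_tr)
acc = grid.score(X_te, y_te)   # accuracy on the held-out test set
```

On well-separated features like these, the cross-validated SVM reaches near-perfect test accuracy; real MFCC features from different compression algorithms would be noisier.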

  6. Implementation of virtual medical record object model for a standards-based clinical decision support rule engine. (United States)

    Huang, Christine; Noirot, Laura A; Heard, Kevin M; Reichley, Richard M; Dunagan, Wm Claiborne; Bailey, Thomas C


The Virtual Medical Record (vMR) is a structured data model for representing individual patient information. Our implementation of the vMR is based on the HL7 Reference Information Model (RIM) v2.13, from which a minimum set of objects and attributes is selected to meet the requirements of a clinical decision support (CDS) rule engine. Our success in mapping local patient data to the vMR model and in building a vMR adaptor middle layer demonstrates the feasibility and advantages of implementing a vMR in a portable CDS solution.

  7. Prepolymer-based waveguiding thin films for the holographic recording of dry-developing refractive-index gratings (United States)

    Driemeier, W.


A new concept is presented for the easy preparation of polymer systems characterized by a persistent photoinduced refractive-index change. These organic materials are based on highly viscous prepolymers, reactive multifunctional thinners and UV photoinitiators used in very high concentrations of up to 25%. Waveguiding thin films are used for the optical recording of refractive-index gratings. The index modulation is enhanced up to 1.0×10^-2 by dry development at 20-50°C. A holographically produced grating coupler reaches efficiencies of 33% for an incident HeNe laser beam.

8. Analysis of Real Estate Records Management

    Institute of Scientific and Technical Information of China (English)



With the development of China's socialist modernization, urban construction is changing rapidly, which provides greater opportunities for real estate development and has also greatly promoted the development of real estate records management. As is well known, real estate records management plays an irreplaceable role in supporting the operation of the housing market. With the development of society, however, the growing volume of archival material and many other issues place a heavy workload on management and handling staff and objectively reduce the efficiency of market operations. This calls for the prompt adoption of suitable technical measures to improve operating efficiency. How to take such measures is therefore an important question before us, and strengthening records management as far as possible is the key to solving this problem.


    Harvey, E N; Snell, P A


1. The rapid decay of luminescence in extracts of the ostracod crustacean Cypridina hilgendorfii has been studied by means of a photoelectric-amplifier-string-galvanometer recording system. 2. For rapid flashes of luminescence, the decay is logarithmic if the ratio of luciferin to luciferase is small; logarithmic plus an initial flash if the ratio of luciferin to luciferase is greater than five. The logarithmic plot of luminescence intensity against time is concave to the time axis if the ratio of luciferin to luciferase is very large. 3. The velocity constant of rapid flashes of luminescence is approximately proportional to enzyme concentration, is independent of luciferin concentration, and varies approximately inversely as the square root of the total luciferin (luciferin + oxyluciferin) concentration. For large total luciferin concentrations, the velocity constant is almost independent of the total luciferin. 4. The variation of the velocity constant with total luciferin concentration (luciferin + oxyluciferin) and its independence of luciferin concentration is explained by assuming that light intensity is a measure of the luciferin molecules which become activated to oxidize (accompanied by luminescence) by adsorption on luciferase. The adsorption equilibrium is the same for luciferin and oxyluciferin and determines the velocity constant.
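The "logarithmic" decay in point 2 means that the logarithm of intensity falls linearly with time for a first-order process I(t) = I0·exp(-k·t), so the velocity constant k is the negative slope of a straight-line fit to log I. A minimal sketch under that assumption:

```python
import numpy as np

def velocity_constant(t, intensity):
    """Velocity constant k of a first-order ('logarithmic') luminescence
    decay I(t) = I0 * exp(-k * t): log(I) falls linearly in t with slope
    -k, so a straight-line least-squares fit to log(I) recovers k."""
    slope, _intercept = np.polyfit(np.asarray(t, dtype=float),
                                   np.log(np.asarray(intensity, dtype=float)),
                                   1)
    return -slope
```

For recordings with an initial flash or a concave log-intensity plot (large luciferin/luciferase ratios, per the abstract), a single straight-line fit would no longer be appropriate.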

  10. Security of the distributed electronic patient record: a case-based approach to identifying policy issues. (United States)

    Anderson, J G


The growth of managed care and integrated delivery systems has created a new commodity: health information and the technology it requires. Surveys by Deloitte and Touche indicate that over half of the hospitals in the US are in the process of implementing electronic patient record (EPR) systems. The National Research Council has established that the industry spends as much as $15 billion on information technology (IT), an amount that is expanding by 20% per year. The importance of collecting, electronically storing, and using the information is undisputed. This information is needed by consumers to make informed choices; by physicians to provide appropriate, quality clinical care; and by health plans to assess outcomes, control costs and monitor quality. The collection, storage and communication of a large variety of personal patient data, however, present a major dilemma. How can we provide the data required by the new forms of health care delivery and at the same time protect the personal privacy of patients? Recent debates concerning medical privacy legislation, software regulation, and telemedicine suggest that this dilemma will not be easily resolved. The problem is systemic and arises out of the routine use and flow of information throughout the health industry. Health care information is primarily transferred among authorized users. The information is used not only for patient care and financial reimbursement; secondary uses include medical, nursing, and allied health education, research, social services, public health, regulation, litigation, and commercial purposes such as the development of new medical technology and marketing. The main threats to privacy and confidentiality arise from within the institutions that provide patient care, as well as institutions that have access to patient data for secondary purposes.

  11. Cystic Echinococcosis Epidemiology in Spain Based on Hospitalization Records, 1997-2012 (United States)

    Siles-Lucas, Mar; Aparicio, Pilar; Lopez-Velez, Rogelio; Gherasim, Alin; Garate, Teresa; Benito, Agustín


    Background Cystic echinococcosis (CE) is a parasitic disease caused by the tapeworm Echinococcus granulosus. Although present throughout Europe, deficiencies in the official reporting of CE result in under-reporting and misreporting of this disease, which in turn fosters the mistaken view that CE is not an important health problem. Using an alternative data source, this study aimed at describing the clinical and temporal-spatial characteristics of CE hospitalizations in Spain between 1997 and 2012. Methodology/Principal Findings We performed a retrospective descriptive study using the Hospitalization Minimum Data Set (CMBD in Spanish). All CMBD hospital discharges with an echinococcosis diagnosis in first diagnostic position were reviewed. Hospitalization rates were computed and clinical characteristics were described. The spatial and temporal distribution of hospital discharges was also assessed. Between 1997 and 2012, 14,010 hospitalizations with a diagnosis of CE were recorded; 55% were men and 67% were aged over 45 years. Pediatric hospitalizations occurred throughout the study period. Overall, 95.2% of patients were discharged home, and only 1.7% died in hospital. The average cost was 8,439.11 €. The hospitalization rate per 100,000 per year showed a decreasing trend during the study period. All the autonomous communities registered discharges, even those considered non-endemic. Maximum rates were reached in Extremadura, Castilla-Leon and Aragon. Comparison of the CMBD data with the official Compulsory Notifiable Diseases (CND) reports from 2005 to 2012 showed that the official figures were lower than the registered hospitalization discharges. Conclusions The distribution of hospitalizations was uneven by year and autonomous region. Although CE hospitalization rates have decreased considerably due to the success of control programs, CE remains a public health problem due to its severity and economic impact. Therefore, it would be desirable to improve its oversight and

  12. Data base of array characteristics instrument response and data, recorded at NNC

    Energy Technology Data Exchange (ETDEWEB)

    Bushueva, E.A.; Ermolenko, E.A.; Efremova, N.A. [and others]


    The northern and east-northern parts of the Kazakstan Republic are highly favorable for siting seismic stations: the level of natural and industrial seismic noise is very low, and the rocks of the Kazakh epi-Hercynian platform have very good transmissive properties. Geophysical observatories (GOs), now belonging to the Institute of Geophysical Researches of the National Nuclear Center of the Kazakstan Republic (IGR NNC RK), were established in specially selected low-noise locations in Northern Kazakstan under the Soviet program for nuclear weapons test monitoring. In 1994, these GOs were transferred by the Russian Federation to Kazakstan. The locations of the GOs are shown in Fig. 1. According to studies of seismic noise carried out jointly by scientists from IGR and IRIS, the sites of the `Borovoye` and `Kurchatov` seismic stations are among the best places for seismic observations in the world. Seismic arrays exist at the `Borovoye` and `Kurchatov` observatories, two of the four observatories (`Aktiubinsk`, `Borovoye`, `Kurchatov` and `Makanchi`); these two are described in this report. The report presents the history of the geophysical observatories, the conditions of equipment operation (climatic, geological and so on), descriptions of the equipment of the GOs and seismic arrays, and samples of digital seismograms recorded on equipment of various types. GO `Borovoye` is described in the second chapter and GO `Kurchatov` in the third. The main results of the work are summarized in the conclusion, and lists of cited papers, tables and figures are given at the end of the report. 14 refs., 95 figs., 12 tabs.

  13. Analysis of primary school children's abilities and strategies for reading and recording time from analogue and digital clocks (United States)

    Boulton-Lewis, Gillian; Wilss, Lynn; Mutch, Sue


    Sixty-seven children in Grades 1-3 and 66 children in Grades 4-6 were tested for their ability to read and record analogue and digital times. The children in Grades 4-6 were asked to describe their strategies. A sequence of time acquisition was proposed, based on a recent theory of cognitive development and the literature. This was: hour, half hour, quarter hour, five minute, and minute times. Times after the hour would be more difficult and digital times would be learned sooner. The sequence was confirmed for Grades 1-3; irregularities occurred in Grades 4-6. Some implications are drawn for the teaching of time.

  14. Number of Black Children in Extreme Poverty Hits Record High. Analysis Background. (United States)

    Children's Defense Fund, Washington, DC.

    To examine the experiences of black children and poverty, researchers conducted a computer analysis of data from the U.S. Census Bureau's Current Population Survey, the source of official government poverty statistics. The data are through 2001. Results indicated that nearly 1 million black children were living in extreme poverty, with after-tax…

  15. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    B Yegnanarayana; Suryakanth V Gangashetty


    Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.
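
    A common epoch-extraction technique in this line of work is zero-frequency filtering: pass the differenced signal through a cascade of zero-frequency resonators (realized as integrators), then repeatedly subtract a local mean of roughly one pitch period; epochs then appear as zero crossings of the result. The sketch below is a simplified NumPy rendition with illustrative choices (a fixed 10 ms trend-removal window and a synthetic impulse train), not the authors' implementation.

```python
import numpy as np

def zff_epochs(x, fs, pitch_ms=10.0):
    """Simplified zero-frequency filtering: integrate the differenced
    signal four times (two cascaded zero-frequency resonators), then
    repeatedly subtract a local mean of about one pitch period; epochs
    show up as negative-to-positive zero crossings of the result."""
    y = np.diff(x, prepend=0.0)                # remove slowly varying offset
    for _ in range(4):                         # two cascaded 0-Hz resonators
        y = np.cumsum(y)
    w = int(round(pitch_ms * 1e-3 * fs)) | 1   # odd-length mean window
    kernel = np.ones(w) / w
    for _ in range(3):                         # iterated trend removal
        y = y - np.convolve(y, kernel, mode="same")
    return np.flatnonzero((y[:-1] < 0) & (y[1:] >= 0)) + 1

# Toy check: a 100 Hz impulse train sampled at 8 kHz has epochs every
# 80 samples, so the detected spacing should sit near that value.
fs = 8000
x = np.zeros(fs)
x[::80] = 1.0
ep = zff_epochs(x, fs)
ep = ep[(ep > 400) & (ep < fs - 400)]          # drop filter edge effects
spacing = float(np.median(np.diff(ep)))
```

    On real speech the trend-removal window should track the speaker's average pitch period, and the signal polarity determines whether the detected crossings align with glottal closures or openings.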

  16. A Comparative Analysis of Coastal and Open-Ocean Records of the Great Chilean Tsunamis of 2010, 2014 and 2015 off the Coast of Mexico (United States)

    Zaytsev, Oleg; Rabinovich, Alexander B.; Thomson, Richard E.


    The three great earthquakes off the coast of Chile on 27 February 2010 (Maule, M w 8.8), 1 April 2014 (Iquique, M w 8.2) and 16 September 2015 (Illapel, M w 8.3) generated major transoceanic tsunamis that spread throughout the Pacific Ocean and were measured by numerous coastal tide gauges and open-ocean DART stations. Statistical and spectral analyses of the tsunami waves from the events recorded on the Pacific coast of Mexico enabled us to estimate parameters of the waves along the coast and to compare statistical features of the events. We also identified three coastal "hot spots" (sites having maximum tsunami risk): Puerto Angel, Puerto Madero and Manzanillo. Based on the joint spectral analyses of the tsunamis and background noise, we have developed a method for using coastal observations to determine the underlying spectrum of tsunami waves in the deep ocean. The "reconstructed" open-ocean tsunami spectra are in close agreement with the actual tsunami spectra evaluated from direct analysis of the DART records offshore of Mexico. We have further used the spectral estimates to parameterize the energy of the three Chilean tsunamis based on the total open-ocean tsunami energy and frequency content of the individual events.

  17. Installation Restoration Program Records Search for Bergstrom Air Force Base, Texas. (United States)


    Johnson grass, buffalograss, Bermudagrass, fescue, and Texas wintergrass. Spring wildflowers, including Texas bluebonnet and indian paintbrush, are... kingfisher. The California jack rabbit is very common on the grassy areas of the base. Other mammals known to be present in the wooded ravines include

  18. Camera-Vision Based Oil Content Prediction for Oil Palm (Elaeis Guineensis Jacq) Fresh Fruits Bunch at Various Recording Distances

    Directory of Open Access Journals (Sweden)

    Dinah Cherie


    In this study, the correlation between oil palm fresh fruit bunch (FFB) appearance and its oil content (OC) was explored. FFB samples were recorded from various distances (2, 7, and 10 m) with different lighting spectra (ultraviolet: 280-380 nm; visible: 400-700 nm; infrared: 720-1100 nm) and intensities (600 W and 1000 W lamps) to explore the correlations. The recorded FFB images were segmented, and their color features were subsequently extracted to be used as input variables for modeling the OC of the FFB. Four developed models were selected to perform oil content prediction (OCP) for intact FFBs, based on their validity and accuracy in performing the OCP. Models were developed using Multi-Layer-Perceptron Artificial Neural Network (MLP-ANN) methods, employing 10 hidden layers and 15 image features as input variables. Statistical engineering software was used to create the models. Although the number of FFB samples in this study was limited, four models were successfully developed to predict intact FFBs' OC based on their images' color features. Three OCP models were developed for images recorded at 10 m under the UV, Vis2, and IR2 lighting configurations; another was developed for short-range imaging (2 m) under IR2 light. The coefficients of correlation for the four models when validated were 0.816, 0.902, 0.919, and 0.886, respectively. For bias and error, these selected models obtained root-mean-square errors (RMSE) of 1.803, 0.753, 0.607, and 1.104, respectively.

  19. Assessment of providers' referral decisions in Rural Burkina Faso: a retrospective analysis of medical records

    Directory of Open Access Journals (Sweden)

    Ilboudo Tegawende


    Abstract Background A well-functioning referral system is fundamental to primary health care delivery, so understanding providers' referral decision-making process becomes critical. This study's aim was to assess the correctness of diagnoses and the appropriateness of providers' referral decisions from health centers (HCs) to district hospitals (DHs) among patients with severe malaria and pneumonia. Methods A record review of twelve months of consultations was conducted covering eight randomly selected HCs to identify severe malaria (SM) cases among children under five and pneumonia cases among adults. The correctness of the diagnosis and the appropriateness of providers' referral decisions were determined using the National Clinical Guidebook as a 'gold standard'. Results Among the 457 SM cases affecting children under five, only 66 cases (14.4%) were correctly diagnosed, and of those 66, only 40 (60.6%) received an appropriate referral decision from their providers. Among the adult pneumonia cases, 5.9% (79/1331) of the diagnoses were correct; however, the appropriateness rate of the providers' referral decisions was 98.7% (78/79), with only one case referred that should not have been. Conclusions Adherence to the National Guidelines among the health center providers when making a diagnosis was low for both severe malaria and pneumonia cases. The appropriateness of the referral decisions was particularly poor for children with severe malaria. Health center providers need better training in the diagnostic process and in disease management in order to improve the performance of the referral system in rural Burkina Faso.

  20. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Directory of Open Access Journals (Sweden)

    Sergio Miranda Freire

    This study provides an experimental performance evaluation of population-based queries on NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets of different sizes were created from these documents and imported into three single-machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database sizes and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query, and the indexing time increases with the size of the datasets. The performance of the clusters with 2, 4, 8 and 12 nodes was not better than that of the single-node cluster in terms of query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when
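
    Comparisons of this kind ultimately reduce to timing the same population-level query against different back ends. Below is a generic sketch of such a response-time harness; SQLite stands in for the benchmarked back ends, and the schema, data and query are invented for illustration, not taken from the study.

```python
import sqlite3
import time

def time_query(conn, sql, repeats=5):
    """Crude response-time benchmark: run the query several times and
    keep the best wall-clock time, since caching dominates the rest."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        conn.execute(sql).fetchall()
        best = min(best, time.perf_counter() - t0)
    return best

# Invented population-style data: 10,000 rows of a blood-pressure field.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ehr (patient_id INTEGER, systolic REAL)")
conn.executemany("INSERT INTO ehr VALUES (?, ?)",
                 [(i, 100 + i % 60) for i in range(10000)])
elapsed = time_query(conn, "SELECT COUNT(*) FROM ehr WHERE systolic > 140")
```

    The same harness, pointed at each back end and each dataset size in turn, yields the response-time tables such studies report.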

  1. Identifying Proper Names Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)


    The issue of proper name recognition in Chinese text is discussed. An automatic approach based on association analysis to extract rules from a corpus is presented. The method tries to discover rules relevant to external evidence by association analysis, without additional manual effort. These rules can be used to recognize proper nouns in Chinese texts. The experimental results show that our method is practical in some applications. Moreover, the method is language independent.
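
    The rule extraction rests on standard association-analysis machinery: count co-occurrences, keep item pairs above a support threshold, and turn them into rules above a confidence threshold. A toy sketch follows; the transaction encoding, item names and thresholds are illustrative, while the paper's actual rules operate on contextual evidence in Chinese text.

```python
from collections import Counter
from itertools import combinations

def association_rules(transactions, min_support=0.3, min_conf=0.7):
    """Find rules A -> B between single items whose support and
    confidence clear the given thresholds."""
    n = len(transactions)
    item_count = Counter()
    pair_count = Counter()
    for t in transactions:
        items = set(t)
        item_count.update(items)
        pair_count.update(combinations(sorted(items), 2))
    rules = []
    for (a, b), c in pair_count.items():
        if c / n >= min_support:
            for x, y in ((a, b), (b, a)):
                conf = c / item_count[x]
                if conf >= min_conf:
                    rules.append((x, y, c / n, conf))
    return rules

# Invented contexts: the word "Mr" co-occurring with a surname slot.
contexts = [["Mr", "SURNAME"]] * 4 + [["Mr", "VERB"]] + [["city", "SURNAME"]]
rules = association_rules(contexts, min_support=0.3, min_conf=0.7)
```

    Here the rule "Mr" -> "SURNAME" emerges with confidence 0.8, mirroring how external evidence words can flag an adjacent proper name.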

  2. Pattern recognition on X-ray fluorescence records from Copenhagen lake sediments using principal component analysis

    DEFF Research Database (Denmark)

    Schreiber, Norman; Garcia, Emanuel; Kroon, Aart


    Principal Component Analysis helped to trace geochemical patterns and temporal trends in lake sedimentation. The PCA models explained more than 80% of the original variation in the datasets using only 2 or 3 principal components. The first principal component (PC1) was mostly associated with geogenic elements (Si, K, Fe, Rb) and characterized the content of minerogenic material in the sediment. In the case of both cores, PC2 was a good descriptor, emphasized as the contamination component. It showed strong linkages with heavy metals (Cu, Zn, Pb), disclosing changing heavy-metal contamination trends across different depths. The sediments featured a temporal association with contaminant dominance. Lead contamination was superseded by zinc within the compound pattern, which was linked to changing contamination sources over time. Principal Component Analysis was useful to visualize and interpret geochemical XRF data...
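
    The decomposition itself is ordinary PCA on standardized element concentrations. A self-contained sketch on synthetic data with the same two-group structure the abstract describes; the element groupings mimic the abstract, but all values are simulated, not the Copenhagen measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
elements = ["Si", "K", "Fe", "Rb", "Cu", "Zn", "Pb"]
geo = rng.normal(size=(200, 1))   # latent geogenic factor over depth
pol = rng.normal(size=(200, 1))   # latent contamination factor
load_geo = np.array([1.0, 0.9, 0.8, 0.7, 0.0, 0.0, 0.0])
load_pol = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 0.9, 0.8])
X = geo * load_geo + pol * load_pol + 0.2 * rng.normal(size=(200, 7))

Z = (X - X.mean(0)) / X.std(0)    # standardize each element column
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance fraction per component
pc1_top = elements[int(np.argmax(np.abs(Vt[0])))]
```

    With two latent factors driving seven elements, the first two components capture most of the variance, and the PC1 loadings single out the geogenic group, just as in the abstract.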

  3. A stable, unbiased, long-term satellite based data record of sea surface temperature from ESA's Climate Change Initiative (United States)

    Rayner, Nick; Good, Simon; Merchant, Chris


    The study of climate change demands long-term, stable observational records of climate variables such as sea surface temperature (SST). ESA's Climate Change Initiative was set up to unlock the potential of satellite data records for this purpose. As part of this initiative, 13 projects were established to develop the data records for different essential climate variables - aerosol, cloud, fire, greenhouse gases, glaciers, ice sheets, land cover, ocean colour, ozone, sea ice, sea level, soil moisture and SST. In this presentation we describe the development work that has taken place in the SST project and present new prototype data products that are available now for users to trial. The SST project began in 2010 and has now produced two prototype products. The first is a long-term product (covering mid-1991 - 2010 currently, but with a view to updating this in the future), which prioritises length of data record and stability over other considerations. It is based on data from the Along-Track Scanning Radiometer (ATSR) and Advanced Very High Resolution Radiometer (AVHRR) series of satellite instruments. The product aims to combine the favourable stability and bias characteristics of ATSR data with the geographical coverage achieved with the AVHRR series. Following an algorithm selection process, an optimal estimation approach to retrieving SST from the satellite measurements from both sensors was adopted. The retrievals do not depend on in situ data, and so this data record represents an independent assessment of SST change. In situ data are, however, being used to validate the resulting data. The second data product demonstrates the coverage that can be achieved using the modern satellite observing system including, for example, geostationary satellite data. Six months' worth of data have been processed for this demonstration product. The prototype SST products will be released in April to users to trial in their work. The long term product will be available as

  4. Interpreting land records

    CERN Document Server

    Wilson, Donald A


    Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh

  5. Installation Restoration Program Records Search for Holloman Air Force Base, New Mexico (United States)


    track, the salt-tolerant arroyo and springs communities, and the extensive horticultural plantings on the base proper. The primary aquatic habitats on... since 1980. Primate research activities conducted in the past include cancer risk, drug therapy, disease studies and the evaluation of automobile... should provide ample protection for personnel working at the site. To prevent ingestion, do not eat, drink, or smoke during visits to site. To prevent

  6. Conceptual model of health information ethics as a basis for computer-based instructions for electronic patient record systems. (United States)

    Okada, Mihoko; Yamamoto, Kazuko; Watanabe, Kayo


    A computer-based learning system called Electronic Patient Record (EPR) Laboratory has been developed for students to acquire knowledge and practical skills of EPR systems. The Laboratory is basically for self-learning. Among the subjects dealt with in the system is health information ethics. We consider this to be of the utmost importance for personnel involved in patient information handling. The variety of material on the subject has led to a problem in dealing with it in a methodical manner. In this paper, we present a conceptual model of health information ethics developed using UML to represent the semantics and the knowledge of the domain. Based on the model, we could represent the scope of health information ethics, give structure to the learning materials, and build a control mechanism for a test, fail and review cycle. We consider that the approach is applicable to other domains.

  7. Performance evaluation of a web-based system to exchange Electronic Health Records using Queueing model (M/M/1). (United States)

    de la Torre, Isabel; Díaz, Francisco Javier; Antón, Míriam; Martínez, Mario; Díez, José Fernando; Boto, Daniel; López, Miguel; Hornero, Roberto; López, María Isabel


    Response time measurement of a web-based system is essential to evaluate its performance. This paper compares the response times of a web-based system for ophthalmologic Electronic Health Records (EHRs), TeleOftalWeb, using different database models: Oracle 10g, dbXML 2.0, Xindice 1.2, and eXist 1.1.1. Modelling the system with tandem queueing networks allows us to estimate the service times of the different components of the system (CPU, network and databases). Benchmarking techniques are used to calculate the times associated with the different databases. The final objective of the comparison is to choose the database system yielding the lowest response time for TeleOftalWeb and to compare the obtained results using a new benchmark.
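
    For reference, the M/M/1 model named in the title gives closed-form performance metrics once the arrival rate λ and service rate μ are known. A minimal helper follows; the rates in the example are invented, not TeleOftalWeb measurements.

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 queue: utilisation rho, mean number in
    system L, and mean response time W (queueing plus service)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be < service rate")
    rho = lam / mu
    L = rho / (1 - rho)
    W = 1 / (mu - lam)   # consistent with Little's law: L = lam * W
    return rho, L, W

# e.g. 8 requests/s arriving at a server completing 10 requests/s:
rho, L, W = mm1_metrics(8.0, 10.0)   # rho = 0.8, L = 4.0, W = 0.5 s
```

    In a tandem network of such queues, the mean response time of the whole system is obtained by summing the per-stage W values for CPU, network and database.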

  8. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh


    Cloud-based development is a challenging task for various software engineering projects, especially those which demand extraordinary quality, reusability and security along with a general architecture. In this paper we present a report on a methodical analysis of cloud-based development problems published in major computer science and software engineering journals and conferences by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems they address. The majority of the research papers focused on quality (24 papers) associated with cloud-based development, and 16 papers focused on analysis and design. By considering the areas existing authors have focused on, and the gaps they leave, untouched areas of cloud-based development can be identified for future research.

  9. A Retrospective Analysis of the Burn Injury Patients Records in the Emergency Department, an Epidemiologic Study

    Directory of Open Access Journals (Sweden)

    Nilgün Aksoy


    Introduction: Burns can be very destructive and severely endanger health and lives. They may cause disability and even psychological trauma, and can also impose an economic burden on victims' families and society. The aim of our study is to evaluate the epidemiology and outcome of burn patients referred to the emergency department. Methods: This cross-sectional study was conducted by evaluating the patient files and forensic reports of burn patients referred to the emergency department (ED) of Akdeniz hospital, Turkey, in 2008. Demographic data, the season, place, reason, anatomical sites, total body surface area, degree, treatment, and admission time were recorded. Multinomial logistic regression was used to compare differences in frequencies among single categorized variables. Stepwise logistic regression was applied to develop a predictive model for hospitalization. P<0.05 was defined as the significance level. Results: Two hundred thirty patients were enrolled (53.9% female). The mean patient age was 25.3 ± 22.3 years. Burns were most prevalent in the 0-6 age group, and most were from hot liquid scalding (71.3%). The most affected parts of the body were the left and right upper extremities. Increasing triage severity level (OR=2.2; 95% CI: 1.02-4.66; p=0.046), intentional burn (OR=4.7; 95% CI: 1.03-21.8; p=0.047), referral from other hospitals or clinics (OR=3.4; 95% CI: 1.7-6.6; p=0.001), and percentage of burn (OR=18.1; 95% CI: 5.42-62.6; p<0.001) were independent predictive factors for hospitalization. In addition, the odds of hospitalization were lower in patients older than 15 years (OR=0.7; 95% CI: 0.5-0.91; p=0.035). Conclusion: This study revealed that burns are most frequently encountered in the 0-6 years age group, involve <10% of body surface area, are second degree, affect the upper extremities, occur indoors, and result from hot liquid scalds. Increasing ESI severity, intentional burn, referring from
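
    The odds ratios quoted in such results come from logistic regression, but the underlying quantity is easy to reproduce for a single binary factor. A sketch computing an odds ratio with a Woolf (log-scale) 95% confidence interval from a 2×2 table; the counts below are invented, not the study's data.

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a, b = outcome yes/no among exposed; c, d = outcome yes/no among
    unexposed. The CI is symmetric on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)    # SE of log(OR)
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical table: 20/80 hospitalized among intentional burns,
# 10/90 among unintentional.
or_, lo, hi = odds_ratio(20, 80, 10, 90)
```

    A CI whose lower bound sits near or below 1 (as here) signals a factor that may not be an independent predictor, which is exactly what the reported p-values adjudicate.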

  10. Biomarkers identification in Alzheimer’s disease using effective connectivity analysis from electroencephalography recordings

    Directory of Open Access Journals (Sweden)

    Jazmín X. Suárez-Revelo


    Alzheimer's disease (AD) is the most common cause of dementia, generally affecting people over 65 years old. Some genetic mutations induce early onset of AD and help to track the evolution of the symptoms and the physiological changes at different stages of the disease. In Colombia there is a large family group with the PSEN1 E280A mutation, with a median age of 46.8 years for onset of symptoms. AD has been described as a disconnection syndrome; consequently, network approaches could help to capture different features of the disease. The aim of the current work is to identify a biomarker in AD that helps in tracking the neurodegenerative process. Electroencephalography (EEG) was recorded during the encoding of visual information for four groups of individuals: asymptomatic and mild cognitive impairment carriers of the PSEN1 E280A mutation, and two non-carrier control groups. For each individual, effective connectivity was estimated using the direct Directed Transfer Function, and three measurements from graph theory were extracted: input strength, output strength and total strength. The relation between the cognitive status and age of the participants and the connectivity features was calculated. For those connectivity measures related to age or the clinical scale, performance as a diagnostic feature was evaluated. We found that output strength connectivity in the right occipito-parietal region is related to the age of the carrier groups (r = −0.54, p = 0.0036) and has high sensitivity and specificity to distinguish between carriers and non-carriers (67% sensitivity and 80% specificity in asymptomatic cases, and 83% sensitivity and 67% specificity in symptomatic cases). This relationship indicates that output strength connectivity could be related to the neurodegenerative process of the disease and could help to track the conversion from the asymptomatic stage to dementia.
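
    The three graph measures used here (input, output and total strength) reduce to row and column sums of the directed connectivity matrix. A minimal sketch, assuming the convention C[i, j] = influence of node j on node i; the example matrix is arbitrary, not estimated from EEG.

```python
import numpy as np

def node_strengths(C):
    """Input/output/total strength of each node in a weighted digraph
    with C[i, j] = influence of node j on node i."""
    A = np.array(C, dtype=float)
    np.fill_diagonal(A, 0.0)       # ignore self-connections
    in_strength = A.sum(axis=1)    # total incoming influence per node
    out_strength = A.sum(axis=0)   # total outgoing influence per node
    return in_strength, out_strength, in_strength + out_strength

C = [[0.0, 0.5, 0.2],
     [0.1, 0.0, 0.4],
     [0.0, 0.3, 0.0]]
in_s, out_s, tot = node_strengths(C)   # e.g. in_s = [0.7, 0.5, 0.3]
```

    Averaging the out-strength over the channels of a region (e.g. right occipito-parietal) yields the per-subject feature that is then correlated with age or a clinical scale.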

  11. Security Analysis of Discrete Logarithm Based Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    WANG Yuzhu; LIAO Xiaofeng


    Discrete logarithm based cryptosystems have subtle problems that can make the schemes vulnerable. This paper gives a comprehensive listing of security issues in these systems and analyzes three classes of attacks, based respectively on the mathematical structure of the group used in the schemes, the information disclosed about the subgroup, and implementation details. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.
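
    One structural attack of the kind alluded to exploits small group (or subgroup) order: when the order is n, baby-step giant-step recovers a discrete logarithm in O(√n) time and space, which is why parameters must confine the secret to a large prime-order subgroup. A generic sketch with a toy modulus (not a recommended parameter set):

```python
from math import isqrt

def bsgs_dlog(g, h, p):
    """Baby-step giant-step: find x with g^x = h (mod p), searching
    exponents up to ~p in O(sqrt(p)) group operations."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # g^j for j < m
    factor = pow(g, -m, p)       # modular inverse of g^m (Python 3.8+)
    gamma = h % p
    for i in range(m):           # giant steps: h * g^(-im)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = gamma * factor % p
    return None

# Toy field with 101 elements: the "private exponent" 57 is recovered
# from the "public" value h in about sqrt(101) steps.
h = pow(2, 57, 101)
x = bsgs_dlog(2, h, 101)
```

    The same √n cost applies inside any subgroup an attacker can force a value into, which is the link to the subgroup-disclosure attacks the paper analyzes.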

  12. Social Network Analysis Based on Network Motifs



    Based on community structure characteristics and the theory and methods of frequent subgraph mining, network motif detection is introduced into social network analysis; a tendentiousness evaluation function and an importance evaluation function are proposed for effectiveness assessment. Compared with the traditional approach based on node centrality degree, the new approach can be used to analyze the properties of a social network more fully and to judge the roles of nodes effectively. I...

  13. Integrated interpretation of helicopter and ground-based geophysical data recorded within the Okavango Delta, Botswana

    DEFF Research Database (Denmark)

    Podgorski, Joel E.; Green, Alan G.; Kalscheuer, Thomas


    ...deposited in the huge Paleo Lake Makgadikgadi (PLM), which once covered a 90,000 km2 area that encompassed the delta, Lake Ngami, the Mababe Depression, and the Makgadikgadi Basin. Examples of PLM sediments are intersected in many boreholes. Low-permeability clay within the PLM unit seems to be a barrier to the downward flow of the saline water. Below the PLM unit, freshwater-saturated sand of the Paleo Okavango Megafan (POM) unit is distinguished by moderate to high resistivities, low P-wave velocity, and numerous subhorizontal reflectors. The POM unit is interpreted to be the remnants of a megafan...

  14. Long-term invariant parameters obtained from 24-h Holter recordings: A comparison between different analysis techniques (United States)

    Cerutti, Sergio; Esposti, Federico; Ferrario, Manuela; Sassi, Roberto; Signorini, Maria Gabriella


    Over the last two decades, a large number of different methods have been used to study the fractal-like behavior of heart rate variability (HRV). In this paper some of the most used techniques are reviewed. In particular, the focus is on methods which characterize the long-memory behavior of time series (periodogram, detrended fluctuation analysis, rescaled range analysis, scaled window variance, Higuchi dimension, wavelet-transform modulus maxima, and generalized structure functions). The performances of the different techniques were tested on simulated self-similar noises (fBm and fGn) for values of α, the slope of the spectral density at very small frequency, ranging from -1 to 3 with a 0.05 step. The check was performed using the scaling relationships between the various indices. DFA and the periodogram showed the smallest mean square error from the expected values in the range of interest for HRV. Building on the results of these tests, the effective ability of the different methods to discriminate different populations of patients from RR series derived from Holter recordings was assessed. To this end, the Noltisalis database was used: a set of 30 24-h Holter recordings collected from healthy subjects, patients suffering from congestive heart failure, and heart-transplant patients. All the methods, with the possible exception of rescaled range analysis, were almost equivalent in distinguishing between the three groups of patients. Finally, the scaling relationships, valid for fBm and fGn, also approximately held when empirically applied to HRV series.
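
    Of the listed estimators, detrended fluctuation analysis is compact enough to sketch in full: integrate the series, remove a linear fit per window at each scale, and regress the log fluctuation on the log scale. This is a generic first-order DFA on white noise (expected exponent near 0.5), not the paper's implementation or data.

```python
import numpy as np

def dfa_alpha(x, scales):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha from a log-log fit of F(n) against window size n."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for n in scales:
        k = len(y) // n
        segments = y[:k * n].reshape(k, n)
        t = np.arange(n)
        msq = []
        for seg in segments:
            a, b = np.polyfit(t, seg, 1)   # per-window linear detrend
            msq.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

# White noise scales as alpha ~ 0.5; healthy HRV is typically closer
# to 1 (1/f-like), which is what makes alpha a useful discriminator.
rng = np.random.default_rng(1)
alpha = dfa_alpha(rng.normal(size=4096), [16, 32, 64, 128, 256])
```

    The scaling relationships the paper tests (e.g. between α from the periodogram and the DFA exponent) can be checked by running several such estimators on the same simulated fBm/fGn series.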

  15. Analysis of the applicability of Ni, Cu, Au, Pt, and Pd nanoclusters for data recording (United States)

    Redel', L. V.; Gafner, S. L.; Gafner, Yu. Ya.; Zamulin, I. S.; Goloven'ko, Zh. V.


    The applicability of individual Ni, Cu, Au, Pt, and Pd nanoclusters as data bits in next generation memory devices constructed on the phase-change carrier principle is studied. To this end, based on the modified tight-binding potential (TB-SMA), structure formation from the melt of nanoparticles of these metals to 10 nm in diameter was simulated by the molecular dynamics method. The effect of various crystallization conditions on the formation of the internal structures of Ni, Cu, Au, Pt, and Pd nanoclusters is studied. The stability boundaries of various crystalline isomers are analyzed. The obtained systematic features are compared for nanoparticles of copper, nickel, gold, platinum, and palladium of identical sizes. It is concluded that platinum nanoclusters of diameter D > 8 nm are the best materials among studied metals for producing memory elements based on phase transitions.

  16. Some results of analysis of inverted echo-sounder records from the Atlantic Equatorial region

    Directory of Open Access Journals (Sweden)

    Alberto dos Santos Franco


    Tidal analysis of data from the Equatorial region obtained by inverted echo-sounders shows considerable residuals in the frequency band of approximately 2 cycles per day. Tidal components that are statistically non-negligible are also identified in the even harmonics of 4 and 6 cycles per day. Spectral analysis of temperature series from the same area shows, on the other hand, variability in the same frequency bands, which suggests the occurrence of internal waves with energy distributed in these frequency bands in the Atlantic Equatorial area.

  17. Seismic Design Value Evaluation Based on Checking Records and Site Geological Conditions Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Tienfuan Kerh


    This study proposes an improved computational neural network model that uses three seismic parameters (i.e., local magnitude, epicentral distance, and epicenter depth) and two geological conditions (i.e., shear wave velocity and standard penetration test value) as the inputs for predicting peak ground acceleration, the key element for evaluating earthquake response. Initial comparison results show that a neural network model with three neurons in the hidden layer achieves relatively better performance based on the evaluation indices of correlation coefficient and mean square error. This study further develops a new weight-based neural network model for estimating peak ground acceleration at unchecked sites. Four locations in the 24 subdivision zones of Taiwan identified as having estimated peak ground accelerations higher than the seismic design value are investigated. Finally, this study develops a new equation for the relationship between horizontal peak ground acceleration and focal distance by curve fitting; this equation represents the seismic characteristics of the Taiwan region more reliably and reasonably. The results of this study provide insight into this type of nonlinear problem, and the proposed method may be applicable to other areas of interest around the world.
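    The architecture described above (five inputs, a three-neuron hidden layer, correlation coefficient as the evaluation index) can be sketched as a tiny feed-forward regressor; the synthetic inputs and target below are hypothetical stand-ins, not the Taiwanese checking records:

```python
import numpy as np

rng = np.random.default_rng(1)
# five inputs per the study: magnitude, epicentral distance, depth, Vs, SPT value
X = rng.random((200, 5))
# hypothetical "PGA" target, for illustration only
y = X @ np.array([0.8, -0.5, -0.2, -0.3, 0.1]) + 0.1 * rng.standard_normal(200)

# one hidden layer with 3 neurons, tanh activation, batch gradient descent
W1 = 0.5 * rng.standard_normal((5, 3)); b1 = np.zeros(3)
W2 = 0.5 * rng.standard_normal(3);      b2 = 0.0
lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)               # hidden activations
    pred = H @ W2 + b2
    err = pred - y
    gW2 = H.T @ err / len(y); gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H ** 2)  # backprop through tanh
    gW1 = X.T @ dH / len(y); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

r = np.corrcoef(pred, y)[0, 1]             # the paper's evaluation index
print(round(r, 2))
```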

  18. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor (United States)

    Gafurov, Davrondzhon; Bours, Patrick

    In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprints or the iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method suitable for implicit and periodic re-verification of identity. In our approach a wearable accelerometer sensor is attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5% and a rank-1 identification rate of 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
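    The equal error rate reported above is the operating point at which the false accept and false reject rates coincide; a minimal computation over hypothetical genuine/impostor match-score distributions (not the paper's actual gait-cycle scores) looks like this:

```python
import numpy as np

rng = np.random.default_rng(2)
# hypothetical similarity scores (higher = more similar); in the paper these
# would come from matching two sets of detected gait cycles
genuine = rng.normal(0.8, 0.1, 500)    # same-subject comparisons
impostor = rng.normal(0.5, 0.1, 500)   # different-subject comparisons

thresholds = np.linspace(0, 1, 1001)
far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
i = int(np.argmin(np.abs(far - frr)))  # threshold where the two rates cross
eer = (far[i] + frr[i]) / 2
print(round(eer, 3))
```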

  19. Development of an apnea detection algorithm based on temporal analysis of thoracic respiratory effort signal (United States)

    Dell'Aquila, C. R.; Cañadas, G. E.; Correa, L. S.; Laciar, E.


    This work describes the design of an algorithm for detecting apnea episodes based on analysis of the thoracic respiratory effort signal. Inspiration and expiration times and the amplitude range of each respiratory cycle were evaluated. For the amplitude-range analysis, the standard deviation was computed over temporal windows of the respiratory signal. Performance was validated on 8 records of the Apnea-ECG database, which has annotations of apnea episodes. The results are: sensitivity (Se) 73%, specificity (Sp) 83%. These values could be improved by eliminating artifacts from the signal records.
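    The windowed standard-deviation idea can be sketched on a simulated effort signal; the sampling rate, window length, and threshold below are illustrative assumptions, not the paper's tuned values:

```python
import numpy as np

fs = 10                                    # Hz, assumed effort-signal sampling rate
t = np.arange(0, 120, 1 / fs)
breath = np.sin(2 * np.pi * 0.25 * t)      # ~15 breaths/min
breath[600:800] *= 0.05                    # simulated 20 s apnea: amplitude collapse
signal = breath + 0.02 * np.random.default_rng(3).standard_normal(len(t))

win = 5 * fs                               # 5 s analysis window
std = np.array([signal[i:i + win].std()
                for i in range(0, len(signal) - win, win)])
# windows whose variability collapses relative to the record flag apnea
apnea = std < 0.3 * np.median(std)
print(apnea.astype(int))
```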

  20. Evidence and analysis of 2012 Greenland records from spaceborne observations, a regional climate model and reanalysis data

    Directory of Open Access Journals (Sweden)

    M. Tedesco


    A combined analysis of remote sensing observations, regional climate model (RCM) outputs and reanalysis data over the Greenland ice sheet provides evidence that multiple records were set during summer 2012. Melt extent was the largest in the satellite era (extending up to ~97% of the ice sheet) and melting lasted up to ~two months longer than the 1979–2011 mean. Model results indicate that near-surface temperature was ~3 standard deviations (σ) above the 1958–2011 mean, while surface mass balance (SMB) was ~3σ below the mean and runoff was 3.9σ above the mean over the same period. Albedo, exposure of bare ice and surface mass balance also set new records, as did the total mass balance, with summer and annual mass changes of, respectively, −627 Gt and −574 Gt, 2σ below the 2003–2012 mean.

    We identify persistent anticyclonic conditions over Greenland associated with anomalies in the North Atlantic Oscillation (NAO), changes in surface conditions (e.g., albedo) and pre-conditioning of surface properties by recent extreme melting as major driving mechanisms for the 2012 records. Because of self-amplifying positive feedbacks, a less positive, if not increasingly negative, SMB will likely occur should the large-scale atmospheric circulation and induced surface characteristics observed over the past decade persist. Since the general circulation models of the Coupled Model Intercomparison Project Phase 5 (CMIP5) do not simulate the abnormal anticyclonic circulation resulting from the extremely negative NAO conditions observed over recent years, the contribution to sea level rise projected under different warming scenarios will be underestimated should the trend in summer NAO values continue.


    Energy Technology Data Exchange (ETDEWEB)



    This document is a Phase I deliverable for the Single-Shell Tank (SST) Analysis of Record (AOR) effort; it is not the Analysis of Record itself. Its intent is to guide the Phase II detailed modeling effort. Preliminary finite element models for each of the tank types were developed, and different case studies were performed on one or more of these tank types. The case studies evaluated include thermal loading, waste level variation, the sensitivity of boundary effects (soil radial extent), excavation slope (run-to-rise ratio), soil stratigraphic variation (property and layer thickness) at different farm locations, and concrete material property variation and degradation under thermal loads. The preliminary document reviews and preliminary modeling results are reported herein. In addition, this report provides recommendations for the next phase of the SST AOR project, SST detailed modeling. The efforts and results discussed here do not include seismic modeling, which is covered by a separate report; the combined results of both static and seismic models are required to complete this effort. The SST AOR project supports the US Department of Energy's (DOE) Office of River Protection (ORP) mission of obtaining a better understanding of the structural integrity of Hanford's SSTs. The 149 SSTs, with six different geometries, have experienced a range of operating histories that would require a large number of unique analyses to fully characterize their individual structural integrity. Preliminary modeling evaluations were conducted to determine the number of analyses required to adequately bound each of the SST tank types in the Detailed Modeling Phase of the SST AOR Project. The preliminary modeling was conducted in conjunction with the Evaluation Criteria report, Johnson et al. (2010). Reviews of existing documents were conducted at the initial stage of preliminary modeling. These reviews guided the topics

  2. Climate variability in SE Europe since 1450 AD based on a varved sediment record from Etoliko Lagoon (Western Greece) (United States)

    Koutsodendris, Andreas; Brauer, Achim; Reed, Jane M.; Plessen, Birgit; Friedrich, Oliver; Hennrich, Barbara; Zacharias, Ierotheos; Pross, Jörg


    To achieve a deeper understanding of climate variability during the last millennium in SE Europe, we report new sedimentological and paleoecological data from Etoliko Lagoon, Western Greece. The record represents the southernmost annually laminated (i.e., varved) archive from the Balkan Peninsula spanning the Little Ice Age, allowing insights into critical intervals of climate instability such as the Maunder and Dalton solar minima. After developing a continuous, ca. 500-year-long varve chronology, high-resolution μ-XRF counts, stable-isotope data measured on ostracod shells, palynological data (including pollen and dinoflagellate cysts), and diatom data are used to decipher the season-specific climate and ecosystem evolution at Etoliko Lagoon since 1450 AD. Our results show that the Etoliko varve record became more sensitive to climate change from 1740 AD onwards. We attribute this shift to the enhancement of primary productivity within the lagoon, documented by an up to threefold increase in varve thickness. This marked change in the lagoon's ecosystem was caused by: (i) increased terrestrial input of nutrients, (ii) a closer connection to the sea and human-induced eutrophication, particularly from 1850 AD onwards, and (iii) increasing summer temperatures. Integration of our data with previously published paleolake sediment records, tree-ring-based precipitation reconstructions, simulations of atmospheric circulation and instrumental precipitation data suggests that wet winter conditions prevailed during 1740-1790 AD, whereas dry winters marked the periods 1790-1830 AD (Dalton Minimum) and 1830-1930 AD, the latter sporadically interrupted by wet winters. This variability in precipitation can be explained by shifts in the large-scale atmospheric circulation patterns over the European continent that affected the Balkan Peninsula (e.g., the North Atlantic Oscillation). The transition between dry and wet phases at Etoliko points to longitudinal

  3. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes. (United States)

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar


    This paper is a status report from a large-scale openEHR-based EPR project of the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of the critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as a backup if one is unable to participate. Archetypes and their importance for structured data and information sharing have to become more visible to clinicians through sharper information practices.

  4. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    It is important to easily and efficiently obtain high-quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (licensed under the GNU Affero General Public License, AGPL), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge.
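    The coordinate-accuracy check is conceptually a range and completeness test on latitude/longitude pairs; a minimal sketch over hypothetical records (not SDMdata's actual API) might be:

```python
# Hypothetical occurrence records of the kind pulled from GBIF; the species
# name and values are made up for illustration.
records = [
    {"species": "Panthera leo", "lat": -2.3, "lon": 34.8},
    {"species": "Panthera leo", "lat": 95.0, "lon": 34.8},   # impossible latitude
    {"species": "Panthera leo", "lat": None, "lon": 12.0},   # missing coordinate
]

def coords_valid(rec):
    """Reject records with missing or out-of-range coordinates."""
    lat, lon = rec["lat"], rec["lon"]
    if lat is None or lon is None:
        return False
    return -90 <= lat <= 90 and -180 <= lon <= 180

clean = [r for r in records if coords_valid(r)]
print(len(clean))   # → 1
```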

  5. A record flexible piezoelectric KNN ultrafine-grained nanopowder-based nanogenerator

    Directory of Open Access Journals (Sweden)

    Qing-tang Xue


    We explore a piezoelectric material, 0.9525(K0.5Na0.5)NbO3-0.0475LiTaO3 (KNN-LTS), which can be used to fabricate a nanogenerator with high output voltage and current owing to its high piezoelectric constant (d33). Because of its unique structure, mixed with multi-wall carbon nanotubes and polydimethylsiloxane, the output voltage reaches 53 V and the output current 15 μA (a current density of 12.5 μA/cm2). These output voltage and current values are the highest reported in the piezoelectric field to date. The KNN-LTS nanopowder-based nanogenerator can also be used as a sensitive motion-detection sensor.

  6. Pyroclastic flow hazard assessment at Somma-Vesuvius based on the geological record (United States)

    Gurioli, L.; Sulpizio, R.; Cioni, R.; Sbrana, A.; Santacroce, R.; Luperini, W.; Andronico, D.


    During the past 22 ka of activity at Somma-Vesuvius, catastrophic pyroclastic density currents (PDCs) have been generated repeatedly. Examples are those that destroyed the towns of Pompeii and Ercolano in AD 79, as well as Torre del Greco and several circum-Vesuvian villages in AD 1631. Using new field data and data available from the literature, we delineate the area impacted by PDCs at Somma-Vesuvius to improve the related hazard assessment. We mainly focus on the dispersal, thickness, and extent of the PDC deposits generated during seven plinian and sub-plinian eruptions, namely, the Pomici di Base, Greenish Pumice, Pomici di Mercato, Pomici di Avellino, Pompeii Pumice, AD 472 Pollena, and AD 1631 eruptions. We present maps of the total thickness of the PDC deposits for each eruption. Five out of seven eruptions dispersed PDCs radially, sometimes showing a preferred direction controlled by the position of the vent and the paleotopography. Only the PDCs from AD 1631 eruption were influenced by the presence of the Mt Somma caldera wall which stopped their advance in a northerly direction. Most PDC deposits are located downslope of the pronounced break-in slope that marks the base of the Somma-Vesuvius cone. PDCs from the Pomici di Avellino and Pompeii Pumice eruptions have the most dispersed deposits (extending more than 20 km from the inferred vent). These deposits are relatively thin, normally graded, and stratified. In contrast, thick, massive, lithic-rich deposits are only dispersed within 7 to 8 km of the vent. Isopach maps and the deposit features reveal that PDC dispersal was strongly controlled by the intensity of the eruption (in terms of magma discharge rate), the position of the vent area with respect to the Mt Somma caldera wall, and the pre-existing topography. Facies characteristics of the PDC deposits appear to correlate with dispersal; the stratified facies are consistently dispersed more widely than the massive facies.

  7. Rational design of on-chip refractive index sensors based on lattice plasmon resonances (Presentation Recording) (United States)

    Lin, Linhan; Zheng, Yuebing


    Lattice plasmon resonances (LPRs), which originate from the plasmonic-photonic coupling in gold or silver nanoparticle arrays, possess an ultra-narrow linewidth by suppressing radiative damping and offer the possibility of plasmonic sensors with a high figure of merit (FOM). However, the plasmonic-photonic coupling is greatly suppressed when the nanoparticles are immobilized on substrates, because the diffraction orders are cut off at the nanoparticle-substrate interfaces. Here, we develop the rational design of LPR structures for high-performance, on-chip plasmonic sensors based on both orthogonal and parallel coupling. Our finite-difference time-domain simulations of core/shell SiO2/Au nanocylinder arrays (NCAs) reveal that new localized surface plasmon resonance (LSPR) modes appear when the aspect ratio of the NCAs is increased. The height-induced LSPRs couple with the superstrate diffraction orders to generate robust LPRs in an asymmetric environment. The high wavelength sensitivity and narrow linewidth of these LPRs lead to plasmonic sensors with high FOM and a high signal-to-noise ratio (SNR). Wide working wavelengths, from visible to near-infrared, are also achieved by tuning the parameters of the NCAs. Moreover, a wide detection range of refractive index is obtained in the parallel LPR structure. The electromagnetic field distributions in the NCAs demonstrate the height-enabled tunability of the plasmonic "hot spots" at sub-nanoparticle resolution and the coupling between these "hot spots" and the superstrate diffraction waves, which are responsible for the high-performance LPR-based on-chip refractive index sensors.

  8. Abstraction based Analysis and Arbiter Synthesis

    DEFF Research Database (Denmark)

    Ernits, Juhan-Peep; Yi, Wang


    This work focuses on the analysis of an example synchronous system containing FIFO buffers, registers and memory interconnected by several private and shared buses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project.

  9. Discourses of aggression in forensic mental health: a critical discourse analysis of mental health nursing staff records. (United States)

    Berring, Lene L; Pedersen, Liselotte; Buus, Niels


    Managing aggression in mental health hospitals is an important and challenging task for clinical nursing staff. A majority of studies focus on the perspective of clinicians, and research mainly depicts aggression by referring to patient-related factors. This qualitative study investigates how aggression is communicated in forensic mental health nursing records. The aim of the study was to gain insight into the discursive practices used by forensic mental health nursing staff when they record observed aggressive incidents. Textual accounts were extracted from the Staff Observation Aggression Scale-Revised (SOAS-R), and Fairclough's critical discourse analysis was used to identify short narrative entries depicting patients and staff in typical ways. The narratives contained descriptions of complex interactions between patient and staff that were linked to specific circumstances surrounding the patient. These antecedents, combined with the aggression incident itself, created stereotyped representations of forensic psychiatric patients as deviant, unpredictable and dangerous. Patient and staff identities were continually (re)produced by an automatic response from the staff that was solely focused on the patient's behavior. Such a response might impede the implementation of new strategies for managing aggression.

  10. Identification and analysis of shear waves recorded by three-component OBSs in northeastern South China Sea

    Institute of Scientific and Technical Information of China (English)

    Minghui Zhao; Xuelin Qiu; Shaohong Xia; Ping Wang; Kanyuan Xia; Huilong Xu


    Structure models associated with P- and S-wave velocities contain a considerable amount of information on lithology and geophysical properties, which can be used to better understand the complexity of the deep crustal structure. However, records of converted shear waves are scarcer, owing to the special nature of seismic surveys at sea and the demanding conditions for their generation; the study of shear waves has long been a weak point in the study of the deep crustal structures of the South China Sea (SCS). In this paper, eleven three-component OBSs were deployed along Profile OBS-2001 in the northeastern SCS. After polarization and band-pass filtering of the data, converted S-wave phases were identified in the radial-component records of nine OBSs. Taking OBS7 as an example, the identification and analysis of converted shear waves are presented and discussed in detail. A few phase groups, such as PwSc, PgSs, PnSc, PmS, and PwSn, were found to come from the deep crust or the Moho interface by simple theoretical model calculation and ray-tracing simulation. The results not only provide the underlying basis for studies of S-wave velocity structure and Poisson's ratio structure, but also reveal the relationship between crustal petrology and seismology, which will be of importance for making full use of S-wave information in the future.

  11. Efficient inverted organic light-emitting devices by amine-based solvent treatment (Presentation Recording) (United States)

    Song, Myoung Hoon; Choi, Kyoung-Jin; Jung, Eui Dae


    The efficiency of inverted polymer light-emitting diodes (iPLEDs) was remarkably enhanced by introducing a spontaneously formed ripple-shaped ZnO nanostructure (ZnO-R) and an amine-based polar solvent treatment using 2-methoxyethanol and ethanolamine (2-ME+EA) co-solvents on the ZnO-R. The ripple-shaped nanostructure of the ZnO layer, fabricated by a solution process with an optimized annealing-temperature ramp rate, improves the extraction of waveguide modes inside the device structure, while the 2-ME+EA interlayer enhances electron injection and hole blocking and reduces exciton quenching between the polar-solvent-treated ZnO-R and the emissive layer. As a result, our optimized iPLEDs show a luminous efficiency (LE) of 61.6 cd A-1, a power efficiency (PE) of 19.4 lm W-1 and an external quantum efficiency (EQE) of 17.8%. This provides a promising approach and opens new possibilities not only for organic light-emitting diodes (OLEDs) but also for other organic optoelectronic devices such as organic photovoltaics, organic thin film transistors, and electrically driven organic diode lasers.

  12. Fabric-Based Wearable Dry Electrodes for Body Surface Biopotential Recording. (United States)

    Yokus, Murat A; Jur, Jesse S


    A flexible and conformable dry electrode design on nonwoven fabrics is examined as a sensing platform for biopotential measurements. Due to the limitations of commercial wet electrodes (e.g., shelf life, skin irritation), dry electrodes are investigated as potential candidates for long-term monitoring of ECG signals. Multilayered dry electrodes are fabricated by screen printing Ag/AgCl conductive inks on flexible nonwoven fabrics. This study focuses on the skin-electrode interface, form-factor design, and on-body placement of printed dry electrodes for a wearable sensing platform. ECG signals obtained with dry and wet electrodes are comparatively studied as a function of body posture and movement. Experimental results show that skin-electrode impedance is influenced by the printed electrode area, the skin-electrode interface material, and the applied pressure. The printed electrodes yield ECG signals comparable to wet electrodes, and the QRS peak amplitude of the ECG signal depends on the printed electrode area and the electrode spacing on the body. Overall, fabric-based printed dry electrodes present an inexpensive health-monitoring platform for mobile wearable electronics applications, fulfilling user comfort and wearability.

  13. Ultrathin, rollable, paper-based triboelectric nanogenerator for acoustic energy harvesting and self-powered sound recording. (United States)

    Fan, Xing; Chen, Jun; Yang, Jin; Bai, Peng; Li, Zhaoling; Wang, Zhong Lin


    A 125-μm-thick, rollable, paper-based triboelectric nanogenerator (TENG) has been developed for harvesting sound wave energy; it is capable of delivering a maximum power density of 121 mW/m(2) and 968 W/m(3) under a sound pressure of 117 dBSPL. The TENG is designed in the contact-separation mode using membranes that have rationally designed holes on one side. The TENG can be implemented on a commercial cell phone to harvest acoustic energy from human speech; the electricity generated can charge a capacitor at a rate of 0.144 V/s. Additionally, owing to the advantages of a broad working bandwidth, thin structure, and flexibility, a self-powered microphone with a rolled structure is demonstrated for all-sound recording without angular dependence. The concept and design presented in this work can be applied in a variety of other circumstances for either energy harvesting or sensing, for example, wearable and flexible electronics, military surveillance, jet engine noise reduction, low-cost implantable human ears, and wireless technology applications.

  14. Voice recorder based on a microcontroller

    Institute of Scientific and Technical Information of China (English)

    王彦茹; 胡体玲


    This paper introduces a voice recorder based on the AT89S52 microcontroller and the ISD25120 voice chip. It performs real-time acquisition of the voice signal, segmented storage, and playback of selected segments. Human-machine interaction is implemented through push buttons and a 1602 LCD, allowing selection and control of the function (recording or playback), the channel (storage or playback position), and the volume. The voice recorder can be applied in bus-stop announcement systems, bank queue-number systems, and similar devices.

  15. Selection of medical diagnostic codes for analysis of electronic patient records. Application to stroke in a primary care database.

    Directory of Open Access Journals (Sweden)

    Martin C Gulliford

    BACKGROUND: Electronic patient records from primary care databases are increasingly used in public health and health services research, but the methods used to identify cases with disease are not well described. This study aimed to evaluate the relevance of different codes for the identification of acute stroke in a primary care database, and to evaluate trends in the use of different codes over time. METHODS: Data were obtained from the General Practice Research Database from 1997 to 2006. All subjects had a minimum of 24 months of up-to-standard record before the first recorded stroke diagnosis. Initially, we identified stroke cases using a supplemented version of the set of codes for prevalent stroke used by the Office for National Statistics in Key Health Statistics from General Practice 1998 (ONS codes). The ONS codes were then independently reviewed by four raters and a restricted set of 121 codes for 'acute stroke' was identified, but the kappa statistic was low at 0.23. RESULTS: Initial extraction of data using the ONS codes gave 48,239 cases of stroke from 1997 to 2006. Application of the restricted set of codes reduced this to 39,424 cases. There were 2,288 cases whose index medical codes were for 'stroke annual review' and 3,112 for 'stroke monitoring'. The frequency of stroke review and monitoring codes as index codes increased from 9 per year in 1997 to 1,612 in 2004, 1,530 in 2005 and 1,424 in 2006. The one-year mortality of cases identified with the restricted set of codes was 29.1%, but for 'stroke annual review' it was 4.6% and for 'stroke monitoring' codes, 5.7%. CONCLUSION: In the analysis of electronic patient records, different medical codes for a single condition may have varying clinical and prognostic significance; utilisation of different medical codes may change over time; and researchers with differing clinical or epidemiological experience may have differing interpretations of the relevance of particular codes. There is a need for greater
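    The inter-rater agreement quoted above is Cohen's kappa; a minimal computation, with hypothetical include/exclude ratings chosen so that agreement is modest (as it was for the code review), is:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical judgments."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                     # observed agreement
    labels = set(a) | set(b)
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance agreement
    return (po - pe) / (1 - pe)

# hypothetical include(1)/exclude(0) decisions over ten candidate codes
rater1 = [1, 1, 1, 0, 0, 1, 0, 1, 0, 0]
rater2 = [1, 1, 1, 1, 0, 0, 1, 1, 0, 1]
print(round(cohens_kappa(rater1, rater2), 2))   # → 0.2
```

    A kappa near 0.2, as in the study, means agreement barely exceeds what chance alone would produce.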

  16. Analysis of data recorded by the LCTPC equipped with a two layer GEM-system

    CERN Document Server

    Ljunggren, M


    wire-based readout. The prototype TPC is placed in a 1 Tesla magnet at DESY and tested using an electron beam. Analyses of data taken during two measurement series, in 2009 and 2010, are presented here. The TPC was instrumented with a two-layer GEM system and read out using modified electronics from the ALICE experiment, including the programmable charge-sensitive preamp-shaper PCA16. The PCA16 chip has a number of programmable parameters, which allows studies to determine the optimal settings for the final TPC. Here, the impact of the shaping time on the spatial resolution in the drift direction was studied. It was found that a shaping time of 60 ns is the b...

  17. Multiscale multifractal analysis of heart rate variability recordings with a large number of occurrences of arrhythmia (United States)

    Gierałtowski, J.; Żebrowski, J. J.; Baranowski, R.


    Human heart rate variability, in the form of time series of intervals between heart beats, shows complex, fractal properties. Recently, it has been demonstrated repeatedly that these fractal properties vary from point to point along the series, leading to multifractality. In this paper, we concentrate not only on the fact that the human heart rate has multifractal properties but also on the fact that these properties depend on the time scale at which the multifractality is measured. This time scale is related to the frequency band of the signal. We find that human heart rate variability appears to be far more complex than hitherto reported in studies using a fixed time scale. We introduce a method called multiscale multifractal analysis (MMA), which allows us to extend the description of heart rate variability to include its dependence on the magnitude of the variability and the time scale (or frequency band). MMA is relatively immune to additive noise and nonstationarity, including nonstationarity due to the inclusion in the time series of events of a different dynamics (e.g., arrhythmic events in sinus rhythm). The MMA method may provide new ways of measuring the nonlinearity of a signal, and it may help to develop new methods of medical diagnostics.

  18. Radiometric Short-Term Fourier Transform analysis of photonic Doppler velocimetry recordings and detectivity limit (United States)

    Prudhomme, G.; Berthe, L.; Bénier, J.; Bozier, O.; Mercier, P.


    Photonic Doppler Velocimetry (PDV) is a plug-and-play, versatile diagnostic used in dynamic physics experiments to measure velocities. When signals are analyzed using a Short-Time Fourier Transform, multiple velocities can be distinguished: for example, the velocities of a moving particle cloud appear on spectrograms. In order to estimate the back-scattered flux from the target, we propose an original approach, "PDV radiometric analysis", which results in time-velocity spectrograms expressed in power units. Experiments involving micron-sized particles raise the issue of the detection limit; the particle-size limit is very difficult to evaluate. From the quantification of noise sources, we derive an estimate of the spectrogram noise, leading to a detectivity limit that may be compared to the fraction of the incoming power back-scattered by the particle and then collected by the probe; this fraction increases with particle size. Finally, results from laser-shock-accelerated particles obtained with two different PDV systems are compared, showing how detectivity improves with the effective number of bits (ENOB) of the digitizer.
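    A time-velocity spectrogram coded in power units starts from a short-time Fourier transform of the beat signal; the sketch below uses an illustrative 1 MS/s record with a single synthetic Doppler tone (all parameters are assumptions, not values from the experiments):

```python
import numpy as np

fs = 1_000_000                             # 1 MS/s digitizer (illustrative)
t = np.arange(0, 0.01, 1 / fs)
f_doppler = 200_000                        # beat frequency, proportional to velocity
sig = (0.5 * np.sin(2 * np.pi * f_doppler * t)
       + 0.01 * np.random.default_rng(4).standard_normal(len(t)))

win, hop = 256, 128
w = np.hanning(win)
# short-time frames, windowed to limit spectral leakage
frames = [sig[i:i + win] * w for i in range(0, len(sig) - win, hop)]
spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # spectrogram in power units
freqs = np.fft.rfftfreq(win, 1 / fs)
peak = freqs[np.argmax(spec.mean(axis=0))]        # dominant beat frequency
print(int(peak))
```

    In a radiometric analysis the power bins would additionally be calibrated against the digitizer noise floor to obtain the detectivity limit.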

  19. A quality improvement study using fishbone analysis and an electronic medical records intervention to improve care for children with asthma. (United States)

    Gold, Jonathan; Reyes-Gastelum, David; Turner, Jane; Davies, H Dele


    Despite expert guidelines, gaps persist in the quality of care for children with asthma. This study sought to identify barriers and potential interventions to improve compliance with national asthma prevention guidelines at a single academic pediatric primary care clinic. Using the plan-do-check-act (PDCA) quality improvement framework and fishbone analysis, several barriers to consistent asthma processes and possible interventions were identified by a group of key stakeholders. Two interventions were implemented using the electronic medical record (EMR). Physician documentation of asthma quality measures was analyzed before the intervention and at 2 subsequent time points over 16 months. Documentation of asthma action plans (core group P …) … asthma care in a pediatric primary care setting.

  20. Independent component analysis of gait-related movement artifact recorded using EEG electrodes during treadmill walking.

    Directory of Open Access Journals (Sweden)

    Kristine Lynne Snyder


    Full Text Available There has been a recent surge in the use of electroencephalography (EEG) as a tool for mobile brain imaging due to its portability and fine time resolution. When EEG is combined with independent component analysis (ICA) and source localization techniques, it can model electrocortical activity as arising from temporally independent signals located in spatially distinct cortical areas. However, for mobile tasks, it is not clear how movement artifacts influence ICA and source localization. We devised a novel method to collect pure movement artifact data (devoid of any electrophysiological signals) with a 256-channel EEG system. We first blocked true electrocortical activity using a silicone swim cap. Over the silicone layer, we placed a simulated scalp with electrical properties similar to real human scalp. We collected EEG movement artifact signals from ten healthy, young subjects wearing this setup as they walked on a treadmill at speeds from 0.4 to 1.6 m/s. We performed ICA and dipole fitting on the EEG movement artifact data to quantify how accurately these methods would identify the artifact signals as non-neural. ICA and dipole fitting either localized 99% of the independent components to non-neural locations or found them to lack dipolar characteristics. The remaining 1% of sources had locations within the brain volume and low residual variances, but had topographical maps, power spectra, time courses, and event-related spectral perturbations typical of non-neural sources. Caution should be exercised when interpreting ICA for data that include semi-periodic artifacts, such as those arising from human walking. Alternative methods are needed for the identification and separation of movement artifact in mobile EEG signals, especially methods that can be performed in real time. Separating true brain signals from motion artifact could clear the way for EEG brain-computer interfaces for assistance during mobile activities, such as walking.
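    The unmixing step that ICA performs can be illustrated on synthetic two-channel data. This sketch uses scikit-learn's FastICA rather than the EEG-specific tooling typically used in such studies, and the sources and mixing matrix below are invented:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * 1.0 * t)                 # oscillatory "cortical" source
s2 = np.sign(np.sin(2 * np.pi * 0.5 * t))        # square-wave "artifact" source
S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

A = np.array([[1.0, 0.5], [0.4, 1.0]])           # mixing at the "electrodes"
X = S @ A.T                                      # observed channel data

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                     # recovered sources (up to scale/order)

# match each recovered component to the closest true source by |correlation|
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
best = corr.max(axis=1)
```

    ICA recovers sources only up to permutation and scaling, which is why the match is done by absolute correlation; the dipole-fitting step of the study has no counterpart in this toy example.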

  1. The role of process analysis and expert consultation in implementing an electronic medical record solution for multidrug-resistant tuberculosis

    Directory of Open Access Journals (Sweden)

    Rubeshan Perumal


    Full Text Available Background: Process analysis and expert consultation help streamline and optimise processes, but they are underutilised. The World Health Organisation (WHO) recommends migration to electronic data collection by 2015, partly in response to multidrug-resistant tuberculosis (MDR-TB). We explore the influence of process analysis and iterative expert consultation on shaping health information solutions for MDR-TB programmes. Methods: The study employs a two-phase design. Phase one involves a process analysis of the South African National Tuberculosis Programme and an electronic medical record (EMR) solution, and the generation of a detailed process model, grounded in the fit between individual, task and technology (FITT) theoretical framework, using business process modelling notation. Phase two involves a two-round Delphi study among experts in the clinical management of tuberculosis and implementers of EMR solutions. Expert opinion is analysed according to emergent thematic content. Analyses and graphical model representation are performed using Microsoft Excel® and Visio® software. Results: A detailed process model is constructed which reveals 54 break points, 12 gaps, 3 risks, and 5 wastes. Five participants are included in the Delphi study, whose opinions support the findings of the process analysis. Thematic analysis identifies five themes: the individual, the process, technology, capacity, and collaboration. The opportunity to include synergistic relations across programmes emerges as a strong theme. Conclusions: Overall, the findings highlight inefficiencies, risks and gaps in the current process and the need for an operational excellence intervention. The study demonstrates the value of process engineering with iterative expert consultation toward developing a meaningful EMR solution in a resource-constrained, developing-world context.

  2. Reconstruction of Subdecadal Changes in Sunspot Numbers Based on the NGRIP 10Be Record

    DEFF Research Database (Denmark)

    Inceoglu, Fadil; Knudsen, Mads Faurschou; Karoff, Christoffer


    Sunspot observations since 1610 A.D. show that the solar magnetic activity displays long-term changes, from Maunder Minimum-like low-activity states to Modern Maximum-like high-activity episodes, as well as short-term variations, such as the pronounced 11-year periodicity. Information on changes … the actual solar cycles and the GCR intensity, which is known as the hysteresis effect. In this study, we use the North Greenland Ice Core Project (NGRIP) records of the 10Be flux to reconstruct the solar modulation strength (Φ), which describes the modulation of GCRs throughout the heliosphere …, to reconstruct both long-term and subdecadal changes in sunspot numbers (SSNs). We compare three different approaches for reconstructing subdecadal-scale changes in SSNs, including a linear approach and two approaches based on the hysteresis effect, i.e. models with ellipse-linear and ellipse relationships …

  3. Combined Approach to the Analysis of Rainfall Super-Extremes in Locations with Limited Observational Records. (United States)

    Lakshmi, V.; Libertino, A.; Sharma, A.; Claps, P.


    … obtained with the classic techniques of frequency analysis and spatial interpolation, demonstrate the increased knowledge coming from satellite, climate and local factors, ensuring a more reliable and accurate spatial assessment of extreme thunderstorm probability.

  4. [Assessing food acceptance in scholar children; qualitative visual record versus food waste analysis]. (United States)

    Rodriguez Tadeo, Alejandra; Patiño Villena, Begoña; Periago Castón, María Jesús; Ros Berruezo, Gaspar; González Martínez-Lacuesta, Eduardo


    Introduction: School cafeterias operate under regulations for the management and supervision of menus; however, whether the menus are consumed in full has not been assessed. Objective: To assess food acceptance by weighing plate waste and to validate a visual methodology for estimating it in school cafeterias in Murcia. Methods: Schoolchildren in the second and third cycles of primary education, aged 8-12 years, participated. Plate waste was estimated by weighing the food from 765 trays. The visual assessment (300 trays) was performed by two dietitians using a categorical scale (1 = 0-25%; 2 = 26-50%; 3 = 51-75%; 4 = 76-100%), and its reliability was evaluated against the weighed food. The agreement between the two methods was evaluated in two samples stratified by the presence/absence of a kitchen at the school. Results: The first courses with the most waste were pasta, rice and vegetable purées, with waste being greater in schools without an on-site kitchen (p … kitchen (p < 0.05). Inter-rater agreement was high for meat-based dishes and salads, and substantial for legumes, pre-cooked dishes, egg omelette, pasta, fish and rice. Conclusions: Plate waste is high, and there were differences in the acceptance of certain foods according to the type of menu offered. The visual scale is a reliable tool for measuring acceptance indirectly, but it requires training of the personnel involved.

  5. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon


    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling system-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and the study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels are known to correlate poorly. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomic profiles, and I will present evidence that these techniques identify analysis results that are more consistent, more reproducible, and more biologically coherent, and that they allow expansion of the detected proteome to uncover and/or discover novel proteins.

  6. Workflow-based approaches to neuroimaging analysis. (United States)

    Fissell, Kate


    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  7. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Nielsen, Mads; Lo, Pechin Chien Pau


    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on measured lung function instead of on manually annotated regions of interest (ROIs). A quantitative measure of COPD is obtained by fusing COPD probabilities computed in ROIs within the lung fields, where the individual ROI probabilities are computed using a k nearest neighbor (kNN) classifier. The distance … and subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than the two most common quantitative measures of COPD in the literature, which are based on density …
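    The fusion idea (per-ROI kNN posterior probabilities combined into one subject-level measure) can be sketched as follows. The features are synthetic and the fusion rule is a simple mean, which is a stand-in for, not a reproduction of, the paper's combination scheme:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
# synthetic texture features for training ROIs: label 1 = COPD-like texture
X_train = np.r_[rng.normal(0.0, 1.0, (200, 5)), rng.normal(1.5, 1.0, (200, 5))]
y_train = np.r_[np.zeros(200), np.ones(200)]

knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)

def subject_score(rois):
    """Fuse per-ROI COPD probabilities into one quantitative subject measure."""
    p = knn.predict_proba(rois)[:, 1]            # P(COPD) for each ROI
    return float(p.mean())                       # simple average fusion

healthy_rois = rng.normal(0.0, 1.0, (50, 5))     # ROIs sampled from one "subject"
copd_rois = rng.normal(1.5, 1.0, (50, 5))
s_healthy = subject_score(healthy_rois)
s_copd = subject_score(copd_rois)
```

    The resulting score lies in [0, 1] and can be thresholded or correlated against lung-function measurements, which is the role the fused probability plays in the study.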

  8. Interactive Analysis of recorded time-series for fault detection and man-machine-interfaces; Interaktive Auswertung von aufgezeichneten Zeitreihen fuer Fehlerdiagnosen und Mensch-Maschine-Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Mikut, R.; Burmeister, O.; Grube, M.; Reischl, M. [Forschungszentrum Karlsruhe GmbH (Germany); Bretthauer, G. [Forschungszentrum Karlsruhe GmbH (Germany); Universitaet Karlsruhe (Germany)


    This work proposes steps towards a unified analysis of recorded time series in an automation environment. The main strategy elements are standardized problem formulations and analysis steps, with a focus on classification problems, in combination with interactive visualization techniques. The need for permanent support by software tools is illustrated with the example of the open-source MATLAB toolbox Gait-CAD. (orig.)

  9. A novel bioelectronic nose based on brain-machine interface using implanted electrode recording in vivo in olfactory bulb. (United States)

    Dong, Qi; Du, Liping; Zhuang, Liujing; Li, Rong; Liu, Qingjun; Wang, Ping


    The mammalian olfactory system has the merits of higher sensitivity, higher selectivity and faster response than current electronic nose systems based on chemical sensor arrays, which makes detecting and discriminating odors with the mammalian olfactory system an attractive and feasible approach. The purpose of this study is to develop a novel bioelectronic nose based on brain-machine interface (BMI) technology for odor detection by in vivo electrophysiological measurement of the olfactory bulb. In this work, extracellular potentials of mitral/tufted (M/T) cells in the olfactory bulb (OB) were recorded by implanted 16-channel microwire electrode arrays, and the odor-evoked response signals were analyzed. We found that the activities of different neurons showed visibly different firing patterns, in both temporal and rate features, when stimulated by different small-molecule odorants. The lower detection limit is below 1 ppm for some specific odors. Odors were classified by an algorithm based on population-vector similarity and a support vector machine (SVM). The results suggest that the novel bioelectronic nose is sensitive to odorant stimuli; the best classification accuracy was up to 95%. With the development of BMI and olfactory decoding methods, we believe that this system represents an emerging and promising platform for wide applications in medical diagnosis and security fields.
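    The decoding step (classifying trial-by-trial population firing-rate vectors with an SVM) can be sketched with synthetic Poisson spike counts. The unit count, trial count, and rate templates below are invented for the example and do not reproduce the paper's similarity-based preprocessing:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_units, n_trials = 16, 40                       # 16-channel electrode array (assumed)
templates = rng.uniform(1, 20, (3, n_units))     # mean firing rate per odor and unit

# population vectors: Poisson spike counts around each odor's rate template
X = np.vstack([rng.poisson(tmpl, (n_trials, n_units)) for tmpl in templates])
y = np.repeat(np.arange(3), n_trials)            # odor labels

clf = SVC(kernel="linear")
acc = cross_val_score(clf, X, y, cv=5).mean()    # cross-validated odor accuracy
```

    With well-separated rate templates the linear SVM decodes odor identity almost perfectly; real M/T-cell data are noisier, which is where the temporal features mentioned in the abstract become important.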

  10. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana


    The project consists of the initial development of web-based and cloud-computing services that allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: a ROOT file filter, a JavaScript Multivariable Cross-Filter, the JavaScript ROOT browser and JavaScript Scatter-Matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  11. Measuring Class Cohesion Based on Dependence Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhen-Qiang Chen; Bao-Wen Xu; Yu-Ming Zhou


    Classes are the basic modules in object-oriented (OO) software and consist of attributes and methods. Thus, in an OO environment, cohesion is mainly about the tightness of the attributes and methods of a class. This paper discusses the relationships between attributes and attributes, between attributes and methods, and between methods and methods of a class, based on dependence analysis, and presents methods to compute these dependencies. On this basis, the paper proposes a method to measure class cohesion which satisfies the properties that a good measurement should have. The approach overcomes the limitations of previous class cohesion measures, which consider only one or two of the three relationships in a class.


    Institute of Scientific and Technical Information of China (English)

    Chen Zhenqiang; Xu Baowen; Guanjie


    Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage about variables, based on program slicing. By adding powers according to their importance, the users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is bigger than that obtained by a traditional measure, because the coverage about a variable takes only the related code into account.
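    The variable-oriented, importance-weighted coverage described above can be sketched as follows, assuming each variable's slice is available as a set of statement numbers. The slices, executed-statement set, and weights below are hypothetical inputs, not derived from any real program:

```python
def variable_coverage(slices, executed):
    """Per-variable coverage: fraction of a variable's slice the tests executed."""
    return {v: len(stmts & executed) / len(stmts) for v, stmts in slices.items()}

def weighted_coverage(slices, executed, weights):
    """Importance-weighted overall coverage over all variables."""
    cov = variable_coverage(slices, executed)
    total = sum(weights.values())
    return sum(weights[v] * cov[v] for v in cov) / total

# hypothetical slices: statement-number sets produced by program slicing
slices = {"x": {1, 2, 3, 5}, "y": {2, 4, 6}, "z": {1, 6, 7, 8}}
executed = {1, 2, 3, 4, 6}                       # statements hit by the test suite
weights = {"x": 3.0, "y": 2.0, "z": 1.0}         # user-assigned importance

cov = variable_coverage(slices, executed)
overall = weighted_coverage(slices, executed, weights)
```

    Raising the weight of a variable makes uncovered statements in its slice cost more, which is how the user steers testing effort toward the important variables.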

  13. Recording and analysis of electrically evoked compound action potentials (ECAPs) with MED-EL cochlear implants and different artifact reduction strategies in Matlab. (United States)

    Bahmer, Andreas; Peter, Otto; Baumann, Uwe


    Electrically evoked compound action potentials (ECAPs) are used in auditory research to evaluate the response of the auditory nerve to electrical stimulation; animal preparations are typically used for the recording. With the introduction of a new generation of cochlear implants, however, it is possible to record the response of the auditory nerve to electrical stimulation in humans as well, which is used in the clinic to test whether the implant works properly and whether the auditory nerve is responsive. Currently, ECAPs are used to estimate thresholds for speech processor programs. In addition, ECAP recordings allow new research questions to be addressed, e.g., the evaluation of enhanced electrical stimulation patterns. Research platforms are required to test user-defined stimuli and algorithms for ECAP analysis; clinical fitting software that records ECAPs is not flexible enough for this purpose. To enable a larger group of scientists to pursue research in this field, we introduce a flexible setup that allows stimulation and recording parameters to be changed. ECAP recording and analysis software was developed in Matlab (The MathWorks, Inc.) for a standard PC, using a National Instruments card (PCI-6533, National Instruments, Austin, TX) and a Research Interface Box 2 (RIB2, Department of Ion Physics and Applied Physics, University of Innsbruck, Innsbruck, Austria) for MED-EL cochlear implants. ECAP recordings of a human subject with three different artifact reduction methods (alternating, Miller modified masker-probe, and triphasic pulses) are presented and compared.
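    Of the three artifact reduction methods, the alternating method is the simplest to sketch: responses to opposite-polarity stimuli are averaged, so the polarity-following stimulus artifact cancels while the polarity-independent neural response survives. The waveforms below are synthetic stand-ins with assumed amplitudes and time constants, not real ECAP data:

```python
import numpy as np

fs = 100e3                                        # 100 kHz recording rate (assumed)
t = np.arange(200) / fs                           # 2 ms recording window

def artifact(polarity):
    """Exponentially decaying stimulus artifact that follows stimulus polarity."""
    return polarity * 50.0 * np.exp(-t / 100e-6)

def neural():
    """Polarity-independent N1 trough of the compound action potential."""
    return -5.0 * np.exp(-((t - 0.4e-3) ** 2) / (2 * (0.1e-3) ** 2))

rng = np.random.default_rng(3)
rec_anodic = artifact(+1) + neural() + 0.2 * rng.standard_normal(t.size)
rec_cathodic = artifact(-1) + neural() + 0.2 * rng.standard_normal(t.size)

ecap = 0.5 * (rec_anodic + rec_cathodic)          # artifact cancels, response adds
n1_amplitude = ecap.min()                         # N1 trough of the averaged trace
```

    In practice the cancellation is imperfect because the neural response itself depends somewhat on polarity, which is why masker-probe and triphasic methods exist as alternatives.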

  14. Comparative Analysis Between Tree-Ring Based Drought Records of Populus euphratica and Picea schrenkiana from the Mountains and Plains of Eastern Xinjiang%新疆东部雪岭云杉和胡杨树轮记录的干湿变化对比分析

    Institute of Scientific and Technical Information of China (English)

    陈峰; 尚华明; 袁玉江


    Regional tree-ring width chronologies of Picea schrenkiana and Populus euphratica were developed from cores sampled in eastern Xinjiang, northwestern China. Correlation analysis shows that the regional Picea schrenkiana chronology correlates well with the standardized precipitation-evapotranspiration index (SPEI) from August of the previous year to July of the current year (r = 0.67, P < 0.01, n = 54), while the regional Populus euphratica chronology also carries a strong wet-dry signal (r = 0.48, P < 0.01, n = 54). Using a linear regression model based on the Picea schrenkiana chronology, the previous-August-to-July SPEI of eastern Xinjiang was reconstructed for AD 1725-2013, with the model explaining 45.3% of the variance. The reconstruction indicates relatively wet conditions in 1725-1728, 1737-1758, 1765-1804, 1829-1834, 1845-1852, 1888-1904, 1915-1923, 1932-1961, 1969-1973 and 1986-2001, and relatively dry conditions in 1729-1736, 1759-1764, 1805-1828, 1835-1844, 1853-1887, 1905-1914, 1924-1931, 1962-1968, 1974-1985 and 2002-2013. The wet-dry variations indicated by the Populus euphratica chronology agree with those of the Picea schrenkiana chronology on decadal scales, although the response of Populus euphratica tends to lag that of Picea schrenkiana. On interannual scales, the wet-dry variations of eastern Xinjiang agree strongly with those of the central and western Tianshan Mountains, but they differ markedly on decadal scales, and they are closely linked to wet-dry variations in the Hexi Corridor of Gansu.
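    The calibration step behind such a reconstruction (a least-squares linear model linking the ring-width chronology to the instrumental index over their overlap period, then applied to the full chronology) can be sketched with synthetic numbers. The data below are invented and only mimic the reported correlation; they are not the study's chronology:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 54                                           # years of overlap with instruments
ring_width = rng.normal(1.0, 0.2, n)             # standardized chronology (synthetic)
# synthetic SPEI constructed to correlate with ring width at roughly r ~ 0.67
spei = 0.67 * (ring_width - 1.0) / 0.2 + 0.75 * rng.standard_normal(n)

# calibrate: least-squares linear model SPEI = a * ring_width + b
a, b = np.polyfit(ring_width, spei, 1)
pred = a * ring_width + b
r2 = 1 - np.sum((spei - pred) ** 2) / np.sum((spei - np.mean(spei)) ** 2)

# apply the calibrated model to the full (pre-instrumental) chronology
full_chron = rng.normal(1.0, 0.2, 289)           # AD 1725-2013 (synthetic)
reconstruction = a * full_chron + b
```

    The r2 of the calibration is the "explained variance" quoted in such studies; verification statistics on withheld years would normally accompany it.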

  15. Musical Structural Analysis Database Based on GTTM


    Hamanaka, Masatoshi; Hirata, Keiji; Tojo, Satoshi


    In this paper, we present the publication of our analysis data and analysis tool based on the generative theory of tonal music (GTTM). Musical databases such as score databases, instrument sound databases, and musical pieces with standard MIDI files and annotated data are key to advancements in the field of music information technology. We started implementing the GTTM on a computer in 2004 and have since collected and published test data analyzed by musicologists in a step-by-step manner. In our …

  16. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.


    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  17. A singular spectrum analysis on Holocene climatic oscillation from lake sedimentary record in Minqin Basin,China

    Institute of Scientific and Technical Information of China (English)

    JIN Liya; CHEN Fahu; DING Xiaojun; ZHU Yah


    The total organic carbon (TOC) content series from lake sediments of the Minqin Basin (100°57′-104°57′E, 37°48′-39°17′N) in northwestern China, which provides a 10 000-year-long paleoclimatic proxy record, was used to analyze Holocene climate changes in the region. The proxy record was established in the Sanjiaocheng (SJC, "Triangle Town" in Chinese) section (103°20′25″E, 39°00′38″N), which is located at the northwestern boundary of the present Asian summer monsoon in China and is sensitive to global environmental and climate changes. Applying singular spectrum analysis (SSA) to the TOC series, the principal climatic oscillations and periodic changes were studied. The results reveal 3 major patterns of climate change captured by the reconstructed components (RCs). The first pattern is the natural long-term trend of climatic change in the local area (Minqin Basin), indicating a relatively wetter stage in the early Holocene (starting at 9.5 kaBP) and a relatively drier stage, with strong lake desiccation and declined vegetation cover, in the mid-Holocene (during 7-6 kaBP). From 4.0 kaBP to the present, there has been a gradually decreasing trend in the third reconstructed component (RC3), showing that the local climate again entered a drier stage. The second pattern shows millennial-centennial scale oscillations containing cycles of 1 600 and 800 years that have been present throughout almost the entire Holocene period of the last 10 000 years. The third pattern is a millennial-centennial scale variation with relatively smaller amplitude and unclear cycles, reflecting nonlinear interactions within the Earth's climate system.
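    The SSA decomposition itself is compact: embed the series into a trajectory matrix, take its SVD, and rebuild a component series by anti-diagonal averaging. Below is a minimal sketch on synthetic data; the window length and component count are choices made for the example, not the study's settings:

```python
import numpy as np

def ssa_reconstruct(x, window, k):
    """Reconstruct the series from its k leading SSA components."""
    n = len(x)
    m = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(m)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xk = (U[:, :k] * s[:k]) @ Vt[:k]                          # rank-k approximation
    # anti-diagonal averaging maps the matrix back to a series
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(m):
        rec[j:j + window] += Xk[:, j]
        cnt[j:j + window] += 1
    return rec / cnt

t = np.arange(500)
trend = 0.01 * t                                  # slow "climatic trend"
cycle = np.sin(2 * np.pi * t / 50)                # periodic "oscillation"
noise = 0.3 * np.random.default_rng(9).standard_normal(t.size)
x = trend + cycle + noise

rc = ssa_reconstruct(x, window=100, k=4)          # trend pair + oscillatory pair
resid = x - rc
```

    Grouping the singular components into a trend, oscillatory pairs, and residual noise is exactly how the three patterns of the abstract (trend RCs, cyclic RCs, small-amplitude RCs) would be separated.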

  18. 病案首页数据质量分析%Analysis on Data Quality of Medical Record Front Sheet

    Institute of Scientific and Technical Information of China (English)

    李红樱; 周蝶


    Objective: Through statistical analysis, to identify data quality defects in the inpatient medical record front sheet (MRFS), determine their causes, and suggest improvements. Methods: 88 113 inpatient medical records collected from a hospital in 2013 were examined; 60 data items on the MRFS were statistically analyzed, and the incompletion rate and error rate were calculated. Results: The highest incompletion and error rates both occurred on the item "Other Information", and the errors took a variety of forms. Conclusions: The current data quality of the MRFS is unsatisfactory, and effort and cooperation from all parties are required; only through continual identification and remediation of defects can the quality of the MRFS be improved.

  19. Strategies in the processing and analysis of continuous gravity record in active volcanic areas: the case of Mt. Vesuvius

    Directory of Open Access Journals (Sweden)

    J. Hinderer


    Full Text Available This research is intended to describe new strategies in the processing and analysis of continuous gravity records collected in active volcanic areas and to assess how permanent gravity stations can improve the geophysical monitoring of a volcano. The experience of 15 years of continuous gravity monitoring on Mt. Vesuvius is discussed. Several geodynamic phenomena can produce temporal gravity changes. An eruption, for instance, is associated with the ascent of magma producing changes in the density distribution at depth, leading to ground deformation and gravity changes. The amplitude of such gravity variations is often quite small, on the order of 10-10² nm·s⁻², so their detection requires high-quality data and a rigorous procedure to isolate from the records those weak gravity signals coming from different sources. Ideally we need gravity signals free of all effects which are not of volcanic origin. Therefore the solid Earth tide, ocean and atmospheric loading, instrumental drift and any kind of disturbance other than that due to the volcano dynamics have to be removed. The state of the art in modelling the solid Earth tide is reviewed. Atmospheric dynamics is one of the main sources precluding the detection of small gravity signals; the most advanced methods to reduce atmospheric effects on gravity are presented. As variations of the calibration factors can prevent the repeatability of high-precision measurements, new approaches to model the instrumental response of mechanical gravimeters are proposed too. Moreover, a strategy for accurate modelling of the instrumental drift, and for distinguishing it from long-term gravity changes, is suggested.
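    The reduction step (removing tides and instrumental drift before looking for volcanic signals) can be sketched as a least-squares fit of known tidal harmonics plus a linear drift. The constituent periods, amplitudes, and noise level below are illustrative; a real reduction would use a full tidal model and the loading corrections discussed in the paper:

```python
import numpy as np

rng = np.random.default_rng(11)
t = np.arange(0, 30 * 24, 1.0)                     # 30 days of hourly samples (hours)
tidal_periods = [12.42, 23.93, 12.00]              # M2, K1, S2 periods in hours

drift = 0.8 * t / t[-1]                            # slow instrumental drift
tide = sum(a * np.cos(2 * np.pi * t / p + ph)
           for a, p, ph in zip([3.0, 1.5, 1.0], tidal_periods, [0.3, 1.1, 2.0]))
signal = drift + tide + 0.05 * rng.standard_normal(t.size)

# design matrix: offset + linear drift + cos/sin of each tidal constituent
cols = [np.ones_like(t), t]
for p in tidal_periods:
    cols += [np.cos(2 * np.pi * t / p), np.sin(2 * np.pi * t / p)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
residual = signal - A @ coef                       # tide- and drift-free record
```

    Whatever survives in the residual, beyond the noise floor, is the candidate signal of volcanic origin that the monitoring strategy is designed to detect.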

  20. Development of electronic medical record charting for hospital-based transfusion and apheresis medicine services: Early adoption perspectives

    Directory of Open Access Journals (Sweden)

    Rebecca Levy


    Full Text Available Background: Electronic medical records (EMRs) provide universal access to health care information across multidisciplinary lines. In pathology departments, transfusion and apheresis medicine services (TAMS) involved in direct patient care activities produce data and documentation that typically do not enter the EMR. Taking advantage of our institution's initiative for implementation of a paperless medical record, our TAMS division set out to develop an electronic charting (e-charting) strategy within the EMR. Methods: A focus group of our hospital's transfusion committee, consisting of transfusion medicine specialists, pathologists, residents, nurses, hemapheresis specialists, and information technologists, was constituted and charged with the project. The group met periodically to implement e-charting TAMS workflow and produced electronic documents within the EMR (Cerner Millennium) for various service line functions. Results: The interdisciplinary working group developed and implemented electronic versions of the various paper-based clinical documentation used by these services. All electronic notes collectively gather and reside within a unique Transfusion Medicine folder tab in the EMR, available to staff with access to patient charts. E-charting eliminated illegible handwritten notes, resulted in more consistent clinical documentation among staff, and provided greater real-time review/access of hemotherapy practices. No major impediments to workflow or inefficiencies have been encountered. However, minor updates and corrections to documents as well as select work re-designs were required for optimal use of e-charting by these services. Conclusion: Documentation of pathology subspecialty activities such as TAMS can be successfully incorporated into the EMR. E-charting by staff enhances communication and helps promote standardized documentation of patient care within and across service lines. Well-constructed electronic documents in the EMR may also …

  1. Detector : knowledge-based systems for dairy farm management support and policy-analysis; methods and applications.

    NARCIS (Netherlands)

    Hennen, W.H.G.J.


    This thesis describes new methods and knowledge-based systems for the analysis of technical and economic accounting data from the year-end records of individual dairy farms, to support management and, after adaptation, policy analysis. A new method for farm comparison, the farm-adjusted standa…

  2. Preliminary study of acceleration based sensor to record nile tilapia (Oreochromis niloticus) feeding behavior at water surface (United States)

    Subakti, Aji; Khotimah, Zarah F.; Darozat, Fajar M.


    In this preliminary study, an acceleration-based sensor was developed to monitor the feeding behavior of Nile tilapia (Oreochromis niloticus) at the water surface. The study was conducted for three weeks in a 40 m² fish pond stocked with 850 fingerlings of Nile tilapia strain Nirwana-2 (average biomass of 13 g, fed four times a day at 8 am, 12 pm, 4 pm, and 8 pm). The acceleration sensor system was installed floating in the pond and was designed so that the xz plane of the sensor is parallel with the water surface, while the y-axis points downward. By sensing the acceleration caused by the surface wave, the activity of fish near the water surface could be monitored. The results showed three distinctive patterns that were related to the feeding activity of the fish. In general, it can be concluded that this acceleration-based sensor system can be integrated with an automatic feeder machine; in particular, by analyzing the recorded pattern it is possible to detect when the fish stop eating, so that the right amount of feed can be given.
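    A simple way to turn such recordings into a feeding on/off signal is a windowed RMS with a robust threshold. The sampling rate, window length, and threshold factor below are assumptions for illustration, not the authors' algorithm, and the acceleration trace is simulated:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 50                                            # Hz, assumed sampling rate
n = 30000
t = np.arange(n) / fs                              # 10 minutes of y-axis acceleration
accel = 0.02 * rng.standard_normal(n)              # calm-water baseline
feeding = (t > 120) & (t < 300)                    # splashing during a feeding bout
accel[feeding] += 0.3 * rng.standard_normal(feeding.sum())

win = fs * 5                                       # 5 s analysis windows
n_win = n // win
rms = np.array([np.sqrt(np.mean(accel[i * win:(i + 1) * win] ** 2))
                for i in range(n_win)])
threshold = 3 * np.median(rms)                     # robust baseline threshold
active = rms > threshold                           # windows flagged as feeding

feeding_seconds = active.sum() * 5                 # estimated feeding duration
```

    Detecting the transition from active to quiet windows is the trigger that an automatic feeder would use to stop dispensing feed.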

  3. Analysis of coastal sea-level station records and implications for tsunami monitoring in the Adriatic Apulia region, southern Italy (United States)

    Bressan, Lidia; Tinti, Stefano; Tallarico, Andrea


    The region of Apulia, southern Italy, was the theater of one of the largest tsunami disasters in Italian history (the 30 July 1627 event) and is considered to be exposed to tsunami hazard from local Italian sources as well as from sources on the eastern side of the Adriatic and from the Ionian Sea, including Hellenic Arc earthquakes. Scientific interest in tsunami studies and monitoring in the region is only recent, and this theme was specifically addressed by the international project OTRIONS, coordinated by the University of Bari. In the frame of this project, the University of Bologna contributed to the analysis of the tsunami hazard and to the evaluation of the regional tide-gauge network, with the scope of assessing its adequacy for tsunami monitoring. The latter is the main topic of the present work. In eastern Apulia, facing the Adriatic Sea, the sea-level data network is sufficiently dense, being formed of stations of the Italian tide-gauge network (Rete Mareografica Nazionale, RMN), four additional stations operated by the Apulia Port Authority (in Brindisi, Ischitella, Manfredonia and Porto Cesareo), and two more stations installed in the harbours of Barletta and Monopoli in the frame of the OTRIONS project, with real-time data transmission and a 1-s sampling period. Pre-processing of the sea-level data of these stations included quality checks and spectral analysis. Where the sampling rate was adequate, the records were also examined by means of the specific tools provided by the TEDA package. TEDA is a Tsunami Early Detection Algorithm, developed by the Tsunami Research Team of the University of Bologna, that allows one to characterize the sea-level background signal in the typical tsunami frequency window (from 1 to several minutes) and consequently to optimize TEDA parameters for efficient tsunami detection. The results of the analysis show stability of the spectral content and seasonal variations.

  4. Long-Term Impact of an Electronic Health Record-Enabled, Team-Based, and Scalable Population Health Strategy Based on the Chronic Care Model (United States)

    Kawamoto, Kensaku; Anstrom, Kevin J; Anderson, John B; Bosworth, Hayden B; Lobach, David F; McAdam-Marx, Carrie; Ferranti, Jeffrey M; Shang, Howard; Yarnall, Kimberly S H


    The Chronic Care Model (CCM) is a promising framework for improving population health, but little is known regarding the long-term impact of scalable, informatics-enabled interventions based on this model. To address this challenge, this study evaluated the long-term impact of implementing a scalable, electronic health record (EHR)-enabled, and CCM-based population health program to replace a labor-intensive legacy program in 18 primary care practices. Interventions included point-of-care decision support, quality reporting, team-based care, patient engagement, and provider education. Among 6,768 patients with diabetes receiving care over 4 years, hemoglobin A1c levels remained stable during the 2-year pre-intervention and post-intervention periods (0.03% and 0% increases, respectively), compared to a 0.42% increase expected based on A1c progression observed in the United Kingdom Prospective Diabetes Study long-term outcomes cohort. The results indicate that an EHR-enabled, team-based, and scalable population health strategy based on the CCM may be effective and efficient for managing population health.


    Energy Technology Data Exchange (ETDEWEB)



    This document is a Phase I deliverable for the Single-Shell Tank Analysis of Record effort. This document is not the Analysis of Record. The intent of this document is to guide the Phase II detailed modeling effort. Preliminary finite element models for each of the tank types were developed, and different case studies were performed on one or more of these tank types. Case studies evaluated include thermal loading, waste level variation, the sensitivity of boundary effects (soil radial extent), excavation slope (run-to-rise ratio), soil stratigraphic variation (property and layer thickness) at different farm locations, and concrete material property variation and degradation under thermal loads. The preliminary document reviews and preliminary modeling results are reported herein. In addition, this report provides recommendations for the next phase of the SST AOR project, SST detailed modeling. Efforts and results discussed in this report do not include seismic modeling, which is covered by a separate report. The combined results of both static and seismic models are required to complete this effort. The SST AOR project supports the US Department of Energy's (DOE) Office of River Protection (ORP) mission for obtaining a better understanding of the structural integrity of Hanford's SSTs. The 149 SSTs, with six different geometries, have experienced a range of operating histories which would require a large number of unique analyses to fully characterize their individual structural integrity. Preliminary modeling evaluations were conducted to determine the number of analyses required for adequate bounding of each of the SST tank types in the Detailed Modeling Phase of the SST AOR Project. The preliminary modeling was conducted in conjunction with the Evaluation Criteria report, Johnson et al. (2010). Reviews of existing documents were conducted at the initial stage of preliminary modeling. These reviews guided the topics

  6. JCI Standards-based Management Practices of Medical Equipment Records%基于JCI标准的医疗设备档案管理实践

    Institute of Scientific and Technical Information of China (English)

    季慧芳; 张小芬


    In order to solve problems in the medical equipment records system, such as a deficient records management system, incomplete data, and insufficiently trained staff, our hospital sought to improve system construction and set up a new series of management and preventive-maintenance rules based on the accreditation standards of JCI (Joint Commission International). The effective and orderly medical equipment records management system implements three-grade management by the hospital, administrators, and operators. The system reinforces link management, defines the range of record files to archive, focuses on collecting risk assessments and the original records of preventive maintenance, values the security management of medical equipment and the quality training of staff, and improves the functions of the equipment management software. The hospital aims at the standardization and scientific management of medical equipment records. The medical equipment archives are expected to offer preliminary verification for equipment purchases and to provide authentic basic data for the updating and upgrading of medical equipment. Consequently, the archives will cut down repetitive and aimless purchasing and support the compilation of benefit analysis reports. This will work toward the efficient distribution of medical equipment resources and meet the needs of the sustainable development of the hospital.%针对设备档案管理制度不完善、资料不全、人员素质不高的现状,某院按照国际医院管理(Joint Commission International,JCI)评审标准,加强制度建设,建立和完善了医院设备档案管理制度、设备预防性维护制度等一系列制度,实行医院、科室、操作人员的三级管理,加强环节管理,明确设备档案的收集归档范围,注重收集设备风险评估和预防性维护的原始记录,重视医疗设备安全管理和档案人员素质培训,完善设备管理软件的功能,使得医疗

  7. Engineering the electronic health record for safety: a multi-level video-based approach to diagnosing and preventing technology-induced error arising from usability problems. (United States)

    Borycki, Elizabeth M; Kushniruk, Andre W; Kuwata, Shigeki; Kannry, Joseph


    Electronic health records (EHRs) promise to improve and streamline healthcare through electronic entry and retrieval of patient data. Furthermore, based on a number of studies showing their positive benefits, they promise to reduce medical error and make healthcare safer. However, a growing body of literature has clearly documented that if EHRs are not designed properly, with usability as an important goal, their deployment has the potential to actually increase medical error rather than reduce it. In this paper we describe our approach to engineering (and reengineering) EHRs in order to increase their beneficial potential while at the same time improving their safety. The approach involves integrating usability analysis methods with video analysis of end users interacting with EHR systems, and extends the evaluation of EHR usability to include assessing the impact of these systems on work practices. Using clinical simulations, we analyze human-computer interaction in real healthcare settings (in a portable, low-cost and high-fidelity manner) and include both artificial and naturalistic data collection to identify potential usability problems and sources of technology-induced error prior to widespread system release. Two case studies are described in which the methods we have developed and refined were applied at different levels of user-computer interaction.

  8. Development of an electronic medical record based alert for risk of HIV treatment failure in a low-resource setting.

    Directory of Open Access Journals (Sweden)

    Nancy Puttkammer

    Full Text Available BACKGROUND: The adoption of electronic medical record systems in resource-limited settings can help clinicians monitor patients' adherence to HIV antiretroviral therapy (ART) and identify patients at risk of future ART failure, allowing resources to be targeted to those most at risk. METHODS: Among adult patients enrolled on ART from 2005-2013 at two large, public-sector hospitals in Haiti, ART failure was assessed after 6-12 months on treatment, based on the World Health Organization's immunologic and clinical criteria. We identified models for predicting ART failure based on ART adherence measures and other patient characteristics. We assessed performance of candidate models using area under the receiver operating curve, and validated results using a randomly-split data sample. The selected prediction model was used to generate a risk score, and its ability to differentiate ART failure risk over a 42-month follow-up period was tested using stratified Kaplan-Meier survival curves. RESULTS: Among 923 patients with CD4 results available during the period 6-12 months after ART initiation, 196 (21.2%) met ART failure criteria. The pharmacy-based proportion of days covered (PDC) measure performed best among five possible ART adherence measures at predicting ART failure. Average PDC during the first 6 months on ART was 79.0% among cases of ART failure and 88.6% among cases of non-failure (p<0.01). When additional information including sex, baseline CD4, and duration of enrollment in HIV care prior to ART initiation were added to PDC, the risk score differentiated between those who did and did not meet failure criteria over 42 months following ART initiation. CONCLUSIONS: Pharmacy data are most useful for new ART adherence alerts within iSanté. Such alerts offer potential to help clinicians identify patients at high risk of ART failure so that they can be targeted with adherence support interventions, before ART failure occurs.
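The pharmacy-based proportion of days covered (PDC) measure highlighted above can be sketched as follows. The function name and the fill-record format (pairs of fill date and days of supply) are assumptions for illustration, not the iSanté implementation:

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """PDC: fraction of days in [start, end] on which the patient had
    medication on hand, given (fill_date, days_supply) records.
    Overlapping fills are counted once via the set of covered days."""
    covered = set()
    for fill_date, days_supply in fills:
        for d in range(days_supply):
            day = fill_date + timedelta(days=d)
            if start <= day <= end:
                covered.add(day)
    total_days = (end - start).days + 1
    return len(covered) / total_days
```

A real implementation would also decide how to handle early refills (shifting the new supply to start when the previous one runs out); the set-based version above simply ignores the overlap.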

  9. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen


    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features’ dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.
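The feature-extraction step described above, a discrete Fourier transform of the correlation signal followed by normalization, might be sketched like this. The number of retained coefficients and the unit-sum normalization are illustrative choices, not the paper's exact settings:

```python
import cmath

def dft_features(signal, n_features=4):
    """Magnitudes of the first n_features DFT coefficients of a 1-D
    gait correlation signal, normalized to unit sum so that overall
    scaling (e.g. from noise or silhouette size) is reduced."""
    n = len(signal)
    mags = []
    for k in range(n_features):
        coeff = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        mags.append(abs(coeff))
    total = sum(mags) or 1.0
    return [m / total for m in mags]
```

In the paper's pipeline, these per-signal features would then be stacked and passed through PCA for dimensionality reduction.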

  10. Reclink: aplicativo para o relacionamento de bases de dados, implementando o método probabilistic record linkage Reclink: an application for database linkage implementing the probabilistic record linkage method

    Directory of Open Access Journals (Sweden)

    Kenneth R. de Camargo Jr.


    Full Text Available Apresenta-se um sistema de relacionamento de bases de dados fundamentado na técnica de relacionamento probabilístico de registros, desenvolvido na linguagem C++ com o ambiente de programação Borland C++ Builder versão 3.0. O sistema foi testado a partir de fontes de dados de diferentes tamanhos, tendo sido avaliado em tempo de processamento e sensibilidade para a identificação de pares verdadeiros. O tempo gasto com o processamento dos registros foi menor quando se empregou o programa do que ao ser realizado manualmente, em especial, quando envolveram bases de maior tamanho. As sensibilidades do processo manual e do processo automático foram equivalentes quando utilizaram bases com menor número de registros; entretanto, à medida que as bases aumentaram, percebeu-se tendência de diminuição na sensibilidade apenas no processo manual. Ainda que em fase inicial de desenvolvimento, o sistema apresentou boa performance tanto em velocidade quanto em sensibilidade. Embora a performance dos algoritmos utilizados tenha sido satisfatória, o objetivo é avaliar outras rotinas, buscando aprimorar o desempenho do sistema.This paper presents a system for database linkage based on the probabilistic record linkage technique, developed in the C++ language with the Borland C++ Builder version 3.0 programming environment. The system was tested in the linkage of data sources of different sizes, evaluated both in terms of processing time and sensitivity for identifying true record pairs. Significantly less time was spent in record processing when the program was used, as compared to manual processing, especially in situations where larger databases were used. Manual and automatic processes had equivalent sensitivities in situations where we used databases with fewer records. However, as the number of records grew we noticed a clear reduction in the sensitivity of the manual process, but not in the automatic one. Although in its initial stage of
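The probabilistic record linkage technique the application implements is commonly formalized as Fellegi-Sunter match weights. This generic sketch, in which the field-wise m- and u-probabilities are assumed inputs, shows the scoring idea only and is not Reclink's exact routine:

```python
import math

def match_weight(agreements, m_probs, u_probs):
    """Fellegi-Sunter style score for a candidate record pair:
    sum of log2(m/u) over agreeing fields and log2((1-m)/(1-u)) over
    disagreeing fields, where m = P(agree | true match) and
    u = P(agree | non-match). Higher scores favor a true match."""
    w = 0.0
    for agrees, m, u in zip(agreements, m_probs, u_probs):
        if agrees:
            w += math.log2(m / u)
        else:
            w += math.log2((1 - m) / (1 - u))
    return w
```

Pairs scoring above an upper threshold are accepted as links, below a lower threshold rejected, and those in between sent to clerical review.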

  11. Electric Equipment Diagnosis based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Stavitsky Sergey A.


    Full Text Available As electric equipment develops and grows more complex, precise and intensive diagnosis becomes necessary. Nowadays there are two basic approaches to diagnosis: analog signal processing and digital signal processing, the latter being preferable. Beyond the basic digital signal processing methods (the Fourier transform and the fast Fourier transform), one of the modern methods is based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. This article shows ways of using wavelet analysis and the process of converting a test signal. To carry out this analysis, the computer software Mathcad was used and a 2D wavelet spectrum for a complex function was created.
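As a concrete example of the decomposition idea behind wavelet analysis, here is one level of the Haar transform, the simplest wavelet. The paper itself used Mathcad and does not name a wavelet family, so this is purely illustrative:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (the approximation, i.e. low-frequency content) and pairwise
    differences (the detail, i.e. high-frequency content).
    Assumes an even-length input."""
    approx = [(signal[2*i] + signal[2*i+1]) / 2
              for i in range(len(signal) // 2)]
    detail = [(signal[2*i] - signal[2*i+1]) / 2
              for i in range(len(signal) // 2)]
    return approx, detail
```

Applying the step recursively to the approximation yields the multi-resolution decomposition that makes wavelets useful for localizing transient faults in time as well as frequency.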

  12. Automatic recognition of T and teleseismic P waves by statistical analysis of their spectra: An application to continuous records of moored hydrophones (United States)

    Sukhovich, Alexey; Irisson, Jean-Olivier; Perrot, Julie; Nolet, Guust


    A network of moored hydrophones is an effective way of monitoring seismicity of oceanic ridges since it allows detection and localization of underwater events by recording generated T waves. The high cost of ship time necessitates long periods (normally a year) of autonomous functioning of the hydrophones, which results in very large data sets. The preliminary but indispensable part of the data analysis consists of identifying all T wave signals. This process is extremely time-consuming if it is done by a human operator who visually examines the entire database. We propose a new method for automatic signal discrimination based on the Gradient Boosted Decision Trees technique that uses the distribution of signal spectral power among different frequency bands as the discriminating characteristic. We have applied this method to automatically identify the types of acoustic signals in data collected by two moored hydrophones in the North Atlantic. We show that the method is capable of efficiently resolving the signals of seismic origin with a small percentage of wrong identifications and missed events: 1.2% and 0.5% for T waves and 14.5% and 2.8% for teleseismic P waves, respectively. In addition, good identification rates for signals of other types (iceberg and ship generated) are obtained. Our results indicate that the method can be successfully applied to automate the analysis of other (not necessarily acoustic) databases provided that enough information is available to describe statistical properties of the signals to be identified.
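The discriminating characteristic, the distribution of signal spectral power among frequency bands, can be sketched as a feature extractor. The DFT-bin band edges below are placeholders, since the paper's actual bands are not given in the abstract:

```python
import cmath

def band_powers(signal, bands):
    """Fraction of total spectral power falling in each band, where
    bands are (lo, hi) half-open ranges of DFT bin indices. These
    fractions are the kind of feature fed to the boosted-tree
    classifier; the real band edges are in physical frequency."""
    n = len(signal)
    power = []
    for k in range(n // 2):  # one-sided spectrum
        coeff = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        power.append(abs(coeff) ** 2)
    total = sum(power) or 1.0
    return [sum(power[lo:hi]) / total for lo, hi in bands]
```

Because the features are power *fractions*, they are insensitive to overall signal amplitude, which varies strongly with event size and distance.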

  13. 社区戒毒(康复)群体的社会生态系统分析--基于6个个案的跟进记录%The Analysis of Community Drug Rehabilitation Groups’ Social-ecological Systems:A Study Based on Six Cases'Service Records

    Institute of Scientific and Technical Information of China (English)



    In China, the abuse of and addiction to illicit drugs have increased rapidly over the last decades. Community-based rehabilitation in China has developed in recent years, but on a small scale. The community-based approach has great significance for the future because it helps drug users return to mainstream society by providing them with a healthy social environment. This study aims to describe and analyze the social ecosystems of the participants and to offer some implications for social work intervention. Social ecosystems theory gives social workers a theoretical perspective for understanding each system within the larger framework, helping workers to analyze and understand the situation of, and mutual relations within, the community drug rehabilitation group at the micro, meso and macro system levels. It offers valuable inspiration for the intervention of social workers and the application of social system resources.%受案主生理、心理、社会功能、朋辈、家庭、职业群体、社区、机构、制度和文化等不同层面系统因素的影响,传统单一的社会工作介入模式在社区戒毒(康复)服务中越来越难以奏效;借助社会生态系统理论视角,社会工作者能更好地从微观、中观和宏观层面了解社区戒毒(康复)群体的社会生存现状、困境和不同层面系统之间的交互影响关系,这可为策划整合性社区戒毒(康复)服务方案,发展多元化服务策略,提供理论支持。

  14. Standard Test Method for Application and Analysis of Solid State Track Recorder (SSTR) Monitors for Reactor Surveillance, E706(IIIB)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 This test method describes the use of solid-state track recorders (SSTRs) for neutron dosimetry in light-water reactor (LWR) applications. These applications extend from low neutron fluence to high neutron fluence, including high power pressure vessel surveillance and test reactor irradiations as well as low power benchmark field measurement. (1) This test method replaces Method E 418. This test method is more detailed and special attention is given to the use of state-of-the-art manual and automated track counting methods to attain high absolute accuracies. In-situ dosimetry in actual high fluence-high temperature LWR applications is emphasized. 1.2 This test method includes SSTR analysis by both manual and automated methods. To attain a desired accuracy, the track scanning method selected places limits on the allowable track density. Typically good results are obtained in the range of 5 to 800 000 tracks/cm2 and accurate results at higher track densities have been demonstrated for some cases. (2) Trac...

  15. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello


    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  16. Constructing storyboards based on hierarchical clustering analysis (United States)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu


    There are growing needs for quick preview of video contents for the purpose of improving accessibility of video archives as well as reducing network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent use of extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
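The keyframe-selection strategy, hierarchical clustering of per-frame feature vectors down to a user-specified count, can be sketched as follows. Centroid linkage and squared-Euclidean distance are assumptions, and real feature vectors would come from wavelet coefficients rather than the toy one-dimensional features used here:

```python
def storyboard(frames, k):
    """Pick k keyframe indices by bottom-up (agglomerative)
    clustering of per-frame feature vectors: repeatedly merge the two
    clusters with the closest centroids until k remain, then keep the
    frame nearest each centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def centroid(cluster):
        dims = len(frames[cluster[0]])
        return [sum(frames[i][d] for i in cluster) / len(cluster)
                for d in range(dims)]

    clusters = [[i] for i in range(len(frames))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = dist(centroid(clusters[a]), centroid(clusters[b]))
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
    keyframes = []
    for cluster in clusters:
        c = centroid(cluster)
        keyframes.append(min(cluster, key=lambda i: dist(frames[i], c)))
    return sorted(keyframes)
```

The O(n^3)-ish merging loop is fine for a shot's worth of frames; a production system would cache pairwise distances.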

  17. Arabic Interface Analysis Based on Cultural Markers

    CERN Document Server

    Khanum, Mohammadi Akheela; Chaurasia, Mousmi A


    This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers largely influenced by the culture, and that Hofstede's scores for Arab countries are partially supported by the website design components examined in this study. Moderate support was also found for long-term orientation, for which Hofstede has no score.

  18. Reconstruction of three centuries of annual accumulation rates based on the record of stable isotopes of water from Lomonosovfonna, Svalbard

    NARCIS (Netherlands)

    Pohjola, V.; Martma, T.; Meijer, H.A.J.; Moore, J.; Isaksson, E.; Vaikmae, R.; van de Wal, R.S.W.


    We use the upper 81 m of the record of stable isotopes of water from a 122 m long ice core from Lomonosovfonna, central Spitsbergen, Svalbard, to construct an ice-core chronology and the annual accumulation rates over the icefield. The isotope cycles are counted in the ice-core record using a mode

  19. Childhood obesity trends from primary care electronic health records in England between 1994 and 2013: population-based cohort study

    NARCIS (Netherlands)

    Jaarsveld, C.H.M. van; Gulliford, M.C.


    OBJECTIVE: This study aimed to use primary care electronic health records to evaluate the prevalence of overweight and obesity in 2-15-year-old children in England and compare trends over the last two decades. DESIGN: Cohort study of primary care electronic health records. SETTING: 375 general pract

  20. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan


    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. Experimental results are also presented comparing the new motion estimation methods with other well-known block matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase-correlation (PC) function) for crowd motion estimation.
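Of the block matching baselines mentioned, full search is the simplest to state: exhaustively test every integer displacement within a search radius and keep the one minimizing the sum of absolute differences (SAD). The frame/block interface below is chosen for illustration:

```python
def full_search(ref, cur, bx, by, bsize, radius):
    """Full-search block matching: for the bsize x bsize block of the
    current frame at (bx, by), return the displacement (dx, dy) into
    the reference frame with the lowest SAD. Frames are row-major
    lists of rows; displacements outside the frame are skipped."""
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if not (0 <= by + dy and by + dy + bsize <= h
                    and 0 <= bx + dx and bx + dx + bsize <= w):
                continue
            sad = sum(abs(cur[by + y][bx + x]
                          - ref[by + dy + y][bx + dx + x])
                      for y in range(bsize) for x in range(bsize))
            if best is None or sad < best[0]:
                best = (sad, dx, dy)
    return best[1], best[2]
```

Faster methods such as 2D-log search test only a logarithmic subset of these displacements, trading optimality for speed.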

  1. The CM SAF SSM/I-based total column water vapour climate data record: methods and evaluation against re-analyses and satellite

    Directory of Open Access Journals (Sweden)

    M. Schröder


    Full Text Available The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF) aims at the provision and sound validation of well documented Climate Data Records (CDRs) in sustained and operational environments. In this study, a total column water vapour path (WVPA) climatology from CM SAF is presented and inter-compared to water vapour data records from various data sources. Based on homogenised brightness temperatures from the Special Sensor Microwave Imager (SSM/I), a climatology of WVPA has been generated within the Hamburg Ocean-Atmosphere Fluxes and Parameters from Satellite (HOAPS) framework. Within a research and operation transition activity the HOAPS data and operations capabilities have been successfully transferred to the CM SAF, where the complete HOAPS data and processing schemes are hosted in an operational environment. An objective analysis for interpolation, kriging, has been developed and applied to the swath-based WVPA retrievals from the HOAPS data set. The resulting climatology consists of daily and monthly mean fields of WVPA over the global ice-free ocean. The temporal coverage ranges from July 1987 to August 2006. After a comparison to the precursor product, the CM SAF SSM/I-based climatology has been comprehensively compared to different types of meteorological analyses from the European Centre for Medium-Range Weather Forecasts (ECMWF; ERA40, ERA INTERIM and operational analyses) and from the Japan Meteorological Agency (JMA; JRA). This inter-comparison shows an overall good agreement between the climatology and the analyses, with daily absolute biases generally smaller than 2 kg m−2. The absolute bias to JRA and ERA INTERIM is typically smaller than 0.5 kg m−2. For the period 1991–2006, the root mean square error (RMSE) to both reanalyses is approximately 2 kg m−2. As SSM/I WVPA and radiances are assimilated in JMA and all
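The objective analysis mentioned, kriging, can be illustrated with a minimal ordinary-kriging estimator for a single target location. The covariance model and every interface detail are assumptions for illustration; the CM SAF processing chain is not described at code level in the abstract:

```python
def ordinary_krige(points, values, target, cov):
    """Ordinary kriging of one target location from scattered
    observations: solve the kriging system (observation covariances
    plus a Lagrange multiplier enforcing unit-sum weights), then
    return the weighted estimate. Minimal dense-solver sketch."""
    n = len(points)
    # Build the (n+1)x(n+1) system: [C 1; 1^T 0] [w; mu] = [c0; 1]
    a = [[cov(points[i], points[j]) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])
    b = [cov(p, target) for p in points] + [1.0]
    # Gaussian elimination with partial pivoting
    m = n + 1
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = a[r][col] / a[col][col]
            for c in range(col, m):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = sum(a[r][c] * x[c] for c in range(r + 1, m))
        x[r] = (b[r] - s) / a[r][r]
    weights = x[:n]
    return sum(w * v for w, v in zip(weights, values))
```

Unlike simple inverse-distance weighting, the weights come from a fitted covariance (or variogram) model, so kriging is an exact interpolator at observation points and accounts for clustering among the observations.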


  3. The mystery of Bunge Land (New Siberian Archipelago): implications for its formation based on palaeoenvironmental records, geomorphology, and remote sensing (United States)

    Schirrmeister, Lutz; Grosse, Guido; Kunitsky, Viktor V.; Fuchs, Margret C.; Krbetschek, Matthias; Andreev, Andrei A.; Herzschuh, Ulrike; Babyi, Olga; Siegert, Christine; Meyer, Hanno; Derevyagin, Alexander Y.; Wetterich, Sebastian


    Multiproxy datasets (geocryology, geochronology, sedimentology, palaeo-ecology) from permafrost exposures were used together with land surface information based on satellite imagery and thematic maps in order to reconstruct the Lateglacial to Holocene landscape and environmental dynamics of Bunge Land (Zemlya Bunge). This area of little relief, situated in the New Siberian Archipelago, connects the geomorphologically well-structured islands of Kotel'ny and Fadeevsky. A buried thermokarst landscape was found in the northwest region of the Bunge Land low terrace sand plain, whereas the Bunge Land high terrace seems to be an exposed residue of a similar late Quaternary thermokarst landscape. This is confirmed especially by radiocarbon accelerator mass spectrometry and optically stimulated luminescence age determinations, and by pollen analyses. Palaeogeographically, the late Pleistocene periglacial landscape and sedimentation of Bunge Land were closely connected to Kotel'ny and Fadeevsky; only later did seismotectonic block movements reshape parts of Bunge Land. The Bunge Land low terrace area first subsided and the original landscape there was destroyed by marine inundation, followed by marine sedimentation. Subsequent block heave of the low terrace region exposed a vast sheet of marine sands which is continuously surficially reworked by aeolian processes, while the original alluvial plain landscape in the high terrace area was preserved and started degrading only by early Holocene thermokarst development. The studied exposures contain one of the northernmost (74.88°N) environmental records for the late Pleistocene-Holocene transition in the Eurasian Arctic.

  4. 基于Web的教育类网站/网校备案登记系统研究%Research on the Recording and Registration System of the Educational Website/Website School Based on Web

    Institute of Scientific and Technical Information of China (English)

    于海鹏; 张旭阳


    研究开发基于Web的教育类网站/网校备案登记系统。该系统以Internet为依托,采用了基于MVC的框架结构,有效地解决了大量备案信息的审批、维护、检索、分析、处理的问题。%This paper studies and develops a Web-based recording and registration system for educational websites/web schools. Relying on the Internet and adopting an MVC-based framework, the system effectively handles the approval, maintenance, retrieval, analysis and processing of large amounts of filing information.

  5. Comparison of infrared spectroscopy techniques: developing an efficient method for high resolution analysis of sediment properties from long records (United States)

    Hahn, Annette; Rosén, Peter; Kliem, Pierre; Ohlendorf, Christian; Persson, Per; Zolitschka, Bernd; Pasado Science Team


    The analysis of sediment samples in visible to mid-infrared spectra is ideal for high-resolution records. It requires only small amounts (0.01-0.1 g dry weight) of sample material and facilitates rapid and cost-efficient analysis of a wide variety of biogeochemical properties of minerogenic and organic substances (Kellner et al. 1998). One of these techniques, Diffuse Reflectance Fourier Transform Infrared Spectrometry (DRIFTS), has already been successfully applied to lake sediments from very different settings and has been shown to be a promising technique for high-resolution analyses of long sedimentary records on glacial-interglacial timescales (Rosén et al. 2009). However, the DRIFTS technique includes a time-consuming step in which sediment samples are mixed with KBr. To assess whether alternative and more rapid infrared (IR) techniques can be used, four different IR spectroscopy techniques are compared for core catcher sediment samples from Laguna Potrok Aike - an ICDP site located in southernmost South America. Partial least squares (PLS) calibration models were developed using the DRIFTS technique. The correlation coefficients (R) between DRIFTS-inferred and conventionally measured biogeochemical properties show values of 0.80 for biogenic silica (BSi), 0.95 for total organic carbon (TOC), 0.91 for total nitrogen (TN), and 0.92 for total inorganic carbon (TIC). Good statistical performance was also obtained using the Attenuated Total Reflectance Fourier Transform Infrared Spectroscopy (ATR-FTIRS) technique, which requires less sample preparation. Two devices were used, the full-sized Bruker Equinox 252 and the smaller and less expensive Bruker Alpha. R values for ATR-FTIRS-inferred and conventionally measured biogeochemical properties were 0.87 (BSi), 0.93 (TOC), 0.90 (TN), and 0.91 (TIC) for the Alpha, and 0.78 (TOC), 0.85 (TN), 0.79 (TIC) for the Equinox 252 device. As the penetration depth of the IR beam is frequency dependent, a firm surface contact of
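The R values quoted above are correlation coefficients between spectroscopy-inferred and conventionally measured properties. For reference, a minimal Pearson correlation sketch (the PLS calibration itself is beyond a short example):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length
    sequences, e.g. IR-inferred vs. conventionally measured values.
    Assumes both sequences have nonzero variance."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5
```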

  6. Analysis of the process of representing clinical statements for decision-support applications: a comparison of openEHR archetypes and HL7 virtual medical record. (United States)

    González-Ferrer, A; Peleg, M; Marcos, M; Maldonado, J A


    Delivering patient-specific decision-support based on computer-interpretable guidelines (CIGs) requires mapping CIG clinical statements (data items, clinical recommendations) into patients' data. This is most effectively done via intermediate data schemas, which enable querying the data according to the semantics of a shared standard intermediate schema. This study aims to evaluate the use of the HL7 virtual medical record (vMR) and openEHR archetypes as intermediate schemas for capturing clinical statements from CIGs that are mappable to electronic health records (EHRs) containing patient data and patient-specific recommendations. Using qualitative research methods, we analyzed the encoding of ten representative clinical statements, taken from two CIGs used in real decision-support systems, into two health information models (openEHR archetypes and HL7 vMR instances) by four experienced informaticians. Discussion among the modelers about each case study example greatly increased our understanding of the capabilities of these standards, which we share in this educational paper. The two encodings differed in content and structure: the openEHR archetypes contained a greater level of representational detail, while the vMR representations took fewer steps to complete. The use of openEHR in the encoding of CIG clinical statements could potentially facilitate applications other than decision-support, including intelligent data analysis and integration of additional properties of data items from existing EHRs. On the other hand, due to their smaller size and fewer details, the use of vMR potentially supports quicker mapping of EHR data into clinical statements.
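
The trade-off described above (deeper structure vs. fewer steps) can be conveyed with a toy encoding of one clinical statement. Both dicts below are hypothetical: neither reproduces actual openEHR archetype or HL7 vMR syntax, and the archetype id and field names are assumptions chosen only to illustrate the contrast in nesting depth.

```python
# Illustrative only: neither dict is real openEHR or HL7 vMR syntax.
statement = "Systolic blood pressure 150 mmHg"

# openEHR-archetype-like encoding: deeper structure, more representational detail
openehr_like = {
    "archetype_id": "openEHR-EHR-OBSERVATION.blood_pressure.v2",  # assumed id
    "data": {"events": [{"data": {"items": [
        {"name": "Systolic", "value": {"magnitude": 150, "units": "mm[Hg]"}},
    ]}}]},
}

# vMR-like encoding: flatter, fewer steps to complete
vmr_like = {
    "observationFocus": "Systolic blood pressure",
    "observationValue": {"value": 150, "unit": "mm[Hg]"},
}

def extract_value(encoding):
    """Hypothetical query helper: pull the numeric value from either schema."""
    if "archetype_id" in encoding:
        return encoding["data"]["events"][0]["data"]["items"][0]["value"]["magnitude"]
    return encoding["observationValue"]["value"]

print(extract_value(openehr_like), extract_value(vmr_like))
```

Both encodings answer the same query, but the archetype-style path is longer, which is the practical content of the comparison above.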

  7. High-resolution glacial and deglacial record of atmospheric methane by continuous-flow and laser spectrometer analysis along the NEEM ice core

    Directory of Open Access Journals (Sweden)

    J. Chappellaz


    The Greenland NEEM (North Greenland Eemian Ice Drilling) operation in 2010 provided the first opportunity to combine trace-gas measurements by laser spectroscopic instruments and continuous-flow analysis along a freshly drilled ice core in a field-based setting. We present the resulting atmospheric methane (CH4) record covering the time period from 107.7 to 9.5 ka b2k (thousand years before 2000 AD). Companion discrete CH4 measurements are required to transfer the laser spectroscopic data from a relative to an absolute scale. However, even on a relative scale, the high-resolution CH4 dataset significantly improves our knowledge of past atmospheric methane concentration changes. New significant sub-millennial-scale features appear during interstadials and stadials, generally associated with similar changes in water isotopic ratios of the ice, a proxy for local temperature. In addition to the mid-point of Dansgaard/Oeschger (D/O) CH4 transitions usually used for cross-dating, sharp definition of the start and end of these events provides precise depth markers (with ±20 cm uncertainty) for further cross-dating with other ice core or paleo records, e.g. speleothems. The method also provides an estimate of CH4 rates of change. The onsets of D/O events in the methane signal show a more rapid rate of change than their endings. The rate of CH4 increase associated with the onsets of D/O events progressively declines from 1.7 to 0.6 ppbv yr−1 in the course of Marine Isotope Stage 3. The largest observed rate of increase takes place at the onset of D/O event #21 and reaches 2.5 ppbv yr−1.
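
A rate-of-change estimate like the ppbv yr−1 figures above can be computed from an age-resolved CH4 series by finite differences. The numbers below are invented for illustration and are not NEEM data; note the sign flip, because age (years before present) runs opposite to time.

```python
import numpy as np

# Illustrative (not actual NEEM) CH4 series: age in years b2k, CH4 in ppbv
age = np.array([38300, 38250, 38200, 38150, 38100, 38050, 38000])
ch4 = np.array([450.0, 455.0, 470.0, 530.0, 580.0, 600.0, 605.0])

# Rate of change in ppbv per year; age decreases toward the present,
# so negate the gradient with respect to age to get d(CH4)/d(time)
rate = -np.gradient(ch4, age)
print("max rate of increase: %.2f ppbv/yr" % rate.max())
```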

  8. Internal medicine and emergency admissions: from a national hospital discharge records (SDO) study to a regional analysis

    Directory of Open Access Journals (Sweden)

    Filomena Pietrantonio


    In Italy, the number of internists has grown by 10% since 1990, reaching 11,435; they manage 39,000 beds in 1060 Internal Medicine (IM) wards. Internists are expected to ensure cost-effective management of poly-pathological and complex patients. A collaborative study between the Federation of Associations of Hospital Doctors on Internal Medicine (FADOI) and the Consortium for Applied Health Economics Research (C.R.E.A. Sanità) based on data from hospital discharge records has been conducted starting from November 2014. In this article, preliminary results are presented with a focus on the characteristics of emergency admissions, to help define the role of hospital IM. The evaluation compares emergency and planned admissions, the impact of IM on hospital admissions, the availability of community-based healthcare services, diagnosis-related group (DRG) weights in IM, and regional differences in managing hospital admissions, with a focus on IM departments. In 2013 IM wards discharged 1,073,526 patients (16.18% of the total discharged by hospitals) with a total economic value of 3,426,279.88 € (average DRG 3882.80 €, range 3682.19-4083.42 €). The average length of stay (LOS) in IM was 9.3 days. IM covers 27% of admissions from the Emergency Room. The determinants significantly affecting emergency admissions are old age and comorbidities of the patients, which also play a role in increasing LOS. 55% of Italian hospital admissions are emergency admissions. Hospitalization rates for emergency admissions are systematically higher than those for elective admissions, and the greatest differences are in regions with an inefficiently organized regional network. The role of hospital IM appears central in the offer of beds to the emergency room, accepting 27% of urgent admissions. The increasing impact of IM on hospital management will position internists as authoritative stakeholders in health policy.

  9. Phenological Records (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Phenology is the scientific study of periodic biological phenomena, such as flowering, breeding, and migration, in relation to climatic conditions. The few records...

  10. Design of microcontroller-based EMG and the analysis of EMG signals. (United States)

    Güler, Nihal Fatma; Hardalaç, Firat


    In this work, a microcontroller-based EMG system was designed and tested on 40 patients. With the patients at rest, fast Fourier transform (FFT) analysis was applied to EMG signals recorded from the right leg peroneal region. Histograms were constructed from the results of the FFT analysis. The analysis results show that, for 30 patients, the amplitude of the fibrillation potentials of the muscle fibers measured from the peroneal region is low and their duration is short, indicating degeneration of the motor nerves; the remaining 10 patients were found to be healthy.
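
The FFT-plus-histogram analysis step can be sketched on a synthetic trace. The sampling rate, signal model, and amplitudes below are assumptions for illustration, not the study's recordings: a low-amplitude 80 Hz oscillation stands in for fibrillation-like activity.

```python
import numpy as np

# Hypothetical EMG trace sampled at 1 kHz (synthetic, for illustration)
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
emg = 0.05 * np.sin(2 * np.pi * 80 * t) + 0.01 * rng.normal(size=t.size)

# Amplitude spectrum via FFT, as in the analysis step described above
spectrum = np.abs(np.fft.rfft(emg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

# Histogram of spectral amplitudes (the abstract builds histograms from FFT results)
counts, edges = np.histogram(spectrum, bins=20)
print(f"dominant frequency: {dominant:.0f} Hz")
```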

  11. Occupational self-coding and automatic recording (OSCAR): a novel web-based tool to collect and code lifetime job histories in large population-based studies. (United States)

    De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul


    Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
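
The inter-rater agreement computation can be illustrated with a handful of hypothetical SOC codes. The codes below are invented; the point is how Cohen's kappa improves when 4-digit codes are truncated to the broader 1-digit level, as reported above.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical 4-digit SOC codes assigned by OSCAR vs. an expert coder
oscar_codes  = ["2315", "5434", "2315", "9141", "5434", "2315", "7111", "9141"]
expert_codes = ["2315", "5434", "2329", "9141", "5434", "2315", "7112", "9141"]

kappa_4digit = cohen_kappa_score(oscar_codes, expert_codes)

# Agreement at the broader 1-digit SOC level (first digit only)
kappa_1digit = cohen_kappa_score([c[0] for c in oscar_codes],
                                 [c[0] for c in expert_codes])
print(f"kappa 4-digit: {kappa_4digit:.2f}, 1-digit: {kappa_1digit:.2f}")
```

In this toy set the two disagreements are within-category (2315 vs. 2329, 7111 vs. 7112), so agreement becomes perfect at the 1-digit level; in the real data the improvement was from κ=0.45 to κ=0.64.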

  12. Study of amplitude frequency spectra of the compound action potentials recorded from normal and M. leprae infected mice using Fourier series analysis. (United States)

    Vidyasagar, P B; Lokhandwalla, M N; Damle, P S


    Compound action potentials recorded from normal and M. leprae infected mice sciatic nerves were analysed in frequency domain using Fourier Series Analysis. Changes in myelinated fibre potentials were detected as early as 2nd post-inoculation month. This technique could be further developed to aid in early diagnosis of leprosy.

  13. Documenting biogeographical patterns of African timber species using herbarium records: a conservation perspective based on native trees from Angola. (United States)

    Romeiras, Maria M; Figueira, Rui; Duarte, Maria Cristina; Beja, Pedro; Darbyshire, Iain


    In many tropical regions the development of informed conservation strategies is hindered by a dearth of biodiversity information. Biological collections can help to overcome this problem, by providing baseline information to guide research and conservation efforts. This study focuses on the timber trees of Angola, combining herbarium (2670 records) and bibliographic data to identify the main timber species, document biogeographic patterns and identify conservation priorities. The study recognized 18 key species, most of which are threatened or near-threatened globally, or lack formal conservation assessments. Biogeographical analysis reveals three groups of species associated with the enclave of Cabinda and northwest Angola, which occur primarily in Guineo-Congolian rainforests, and evergreen forests and woodlands. The fourth group is widespread across the country, and is mostly associated with dry forests. There is little correspondence between the spatial pattern of species groups and the ecoregions adopted by WWF, suggesting that these may not provide an adequate basis for conservation planning for Angolan timber trees. Eight of the species evaluated should be given high conservation priority since they are of global conservation concern, they have very restricted distributions in Angola, their historical collection localities are largely outside protected areas and they may be under increasing logging pressure. High conservation priority was also attributed to another three species that have a large proportion of their global range concentrated in Angola and that occur in dry forests where deforestation rates are high. Our results suggest that timber tree species in Angola may be under increasing risk, thus calling for efforts to promote their conservation and sustainable exploitation. The study also highlights the importance of studying historic herbarium collections in poorly explored regions of the tropics, though new field surveys remain a priority to

  14. Mapping the North Sea base-Quaternary: using 3D seismic to fill a gap in the geological record (United States)

    Lamb, Rachel; Huuse, Mads; Stewart, Margaret; Brocklehurst, Simon H.


    The identification and mapping of the base-Quaternary boundary in the central parts of the North Sea is problematic due to the change from an unconformable transition between Pliocene and Pleistocene deltaic deposits in the southern North Sea to a conformable one further north (Sejrup et al 1991; Gatliff et al 1994). The best estimates of the transition use seismic reflection data to identify a 'crenulated reflector' (Buckley 2012), or rely on correlating sparse biostratigraphy (Cameron et al 1987). Recent integration of biostratigraphy, pollen analysis, paleomagnetism and amino acid analysis in the Dutch and Danish sectors (Rasmussen et al 2005; Kuhlmann et al 2006) allows greater confidence in the correlation to a regional 3D seismic dataset and shows that the base-Quaternary can be mapped across the entire basin. The base-Quaternary has been mapped using the PGS MegaSurvey dataset from wells in the Danish Sector along the initially unconformable horizon and down the delta front into the more conformable basin, giving a high degree of confidence in the horizon pick. The mapped horizon is presented here alongside the difference between this new interpretation and the previously interpreted base-Quaternary (Buckley 2012). The revised base-Quaternary surface reaches a depth of 1248 ms TWT or approximately 1120 m (assuming an average velocity of 1800 m/s), showing an elongate basin shape that follows the underlying structure of the Central Graben. The difference between the revised base-Quaternary and the traditional base-Quaternary reaches a maximum of over 600 ms TWT or approximately 540 m in the south-west, with over 300 ms TWT or approximately 270 m at the Josephine well (56° 36.11'N, 2° 27.09'E) in the centre of the basin. Mapping this new base-Quaternary allows for the interpretation of the paleo-environment during the earliest Quaternary.
Seismic attribute analysis indicates a deep water basin with sediment deposition from multiple deltas and redistribution by deep
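
The time-to-depth conversions quoted above (1248 ms TWT ≈ 1120 m, 600 ms ≈ 540 m, 300 ms ≈ 270 m) follow from depth = velocity × one-way time, with one-way time being half the two-way travel time. A quick consistency check:

```python
def twt_to_depth(twt_ms, velocity_ms=1800.0):
    """Convert two-way travel time (ms) to depth (m) for an assumed
    average seismic velocity (1800 m/s, as stated in the abstract)."""
    return velocity_ms * (twt_ms / 1000.0) / 2.0

print(twt_to_depth(1248))  # ~1123 m, reported as ~1120 m
print(twt_to_depth(600))   # ~540 m difference in the south-west
print(twt_to_depth(300))   # ~270 m at the Josephine well
```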

  15. High spatial resolution recording of near-infrared hologram based on photo-induced phase transition of vanadium dioxide film. (United States)

    Sui, Xiubao; Zeng, Junjie; Chen, Qian; Gu, Guohua


    We present a method to record near-infrared (NIR) holograms at high spatial resolution. This method up-converts NIR holograms to visible holograms by taking advantage of the photo-induced phase transition of vanadium dioxide (VO2); the visible holograms are then recorded by a high-resolution visible CMOS sensor. Because the pixel pitch of visible sensors is much smaller than that of NIR sensors, our method can greatly increase the recording resolution of NIR holograms. Experiments demonstrate the effectiveness of the method. It can improve the viewing angle of NIR holography for observing large-scale objects and shorten the observation distance, expanding the application area of NIR holography. It has the potential to become a more effective NIR hologram recording method.
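
The link between pixel pitch and viewing angle can be made concrete with the Nyquist limit on the hologram's fringe spatial frequency: the largest recordable off-axis angle satisfies sin θ = λ/(2p) for pixel pitch p. The wavelength and pitch values below are assumed for illustration and are not the sensors used in the paper.

```python
import math

def max_viewing_angle_deg(wavelength_um, pixel_pitch_um):
    """Largest off-axis angle a sensor can record: sin(theta) = lambda / (2 * pitch),
    i.e. the Nyquist limit on the hologram's fringe spatial frequency."""
    return math.degrees(math.asin(wavelength_um / (2.0 * pixel_pitch_um)))

wavelength = 0.85  # µm, an assumed NIR wavelength for illustration
# Assumed pitches: ~20 µm for a typical NIR sensor vs ~2 µm for a visible CMOS
angle_nir = max_viewing_angle_deg(wavelength, 20.0)
angle_vis = max_viewing_angle_deg(wavelength, 2.0)
print(f"NIR-pitch limit: {angle_nir:.2f} deg, visible-pitch limit: {angle_vis:.2f} deg")
```

A tenfold reduction in pitch yields roughly a tenfold larger recordable angle (at small angles), which is the geometric basis of the improvement claimed above.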

  16. Holocene changes in monsoon precipitation in the Andes of NE Peru based on δ18O speleothem records (United States)

    Bustamante, M. G.; Cruz, F. W.; Vuille, M.; Apaéstegui, J.; Strikis, N.; Panizo, G.; Novello, F. V.; Deininger, M.; Sifeddine, A.; Cheng, H.; Moquet, J. S.; Guyot, J. L.; Santos, R. V.; Segura, H.; Edwards, R. L.


    Two well-dated δ18O-speleothem records from Shatuca cave, situated on the northeastern flank of the Peruvian Andes (1960 m asl), were used to reconstruct high-resolution changes in precipitation during the Holocene in the South American Summer Monsoon (SASM) region. The records show that precipitation increased gradually throughout the Holocene in parallel with the austral summer insolation trend modulated by the precession cycle. Additionally, the Shatuca speleothem record shows several hydroclimatic changes on both longer- and shorter-term time scales, some of which have not been described in previous paleoclimatic reconstructions from the Andean region. Such climate episodes, marked by negative excursions in the Shatuca δ18O record, were logged at 9.7-9.5, 9.2, 8.4, 8.1, 5.0, 4.1, 3.5, 3.0, 2.5, 2.1 and 1.5 ka b2k, and related to abrupt multi-decadal events in the SASM. Some of these events were likely associated with changes in sea surface temperatures (SST) during Bond events in the North Atlantic region. On longer time scales, the low δ18O values reported between 5.1-5.0, 3.5-3.0 and 1.5 ka b2k were contemporaneous with periods of increased sediment influx at Lake Pallcacocha in the Andes of Ecuador, suggesting that the late Holocene intensification of the monsoon recorded at the Shatuca site may also have affected high altitudes of the equatorial Andes further north. Numerous episodes of low SASM intensity (dry events) were recorded by the Shatuca record during the Holocene, in particular at 10.2, 9.8, 9.3, 6.5, 5.1, 4.9, 2.5 and 2.3 ka b2k, some of which were synchronous with dry periods in previous Andean records.

  17. Dual-layer write-once media for 1x-4x-speed recording based on Blu-ray Disc format (United States)

    Uno, Mayumi; Akiyama, Tetsuya; Kitaura, Hideki; Kojima, Rie; Nishiuchi, Kenichi; Yamada, Noboru


    We have developed dual-layer write-once media with Te-O-Pd-based recording films in the Blu-ray Disc (BD) format. The recording capacity was 50 GB with dual layers on a disk 120 mm in diameter. The rear and front layers showed jitters of 5.8% and 7.7% at 1x speed, and 6.0% and 8.0% at 2x speed, respectively, which were good enough to satisfy the BD format. Evaluations were carried out with a blue-violet laser of 405 nm wavelength and an objective lens NA of 0.85. The recording linear velocities were 4.92 m/s at BD 1x (36 Mbps) and 9.84 m/s at BD 2x (72 Mbps). Characteristics at 4x-speed recording were also examined, and it was revealed that the carrier-to-noise ratio at the high recording linear velocity of 19.7 m/s, which corresponds to BD 4x (144 Mbps), was almost the same as at 1x and 2x. The recording mechanism was discussed, and a model was proposed in which Te-O-Pd films do not crystallize directly through a solid-state process but crystallize through melting.
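
The quoted velocities and data rates are consistent with simple linear scaling by the BD speed factor; a quick sanity check of the figures above:

```python
# Linear velocity and data rate both scale with the BD speed factor
BD_1X_VELOCITY = 4.92   # m/s
BD_1X_BITRATE = 36.0    # Mbps

for speed in (1, 2, 4):
    v = BD_1X_VELOCITY * speed
    rate = BD_1X_BITRATE * speed
    print(f"BD {speed}x: {v:.2f} m/s, {rate:.0f} Mbps")
```

At 4x this gives 19.68 m/s and 144 Mbps, matching the rounded 19.7 m/s quoted in the abstract.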

  18. Service for the Pseudonymization of Electronic Healthcare Records Based on ISO/EN 13606 for the Secondary Use of Information. (United States)

    Somolinos, Roberto; Muñoz, Adolfo; Hernando, M Elena; Pascual, Mario; Cáceres, Jesús; Sánchez-de-Madariaga, Ricardo; Fragua, Juan A; Serrano, Pablo; Salvador, Carlos H


    The availability of electronic health data favors scientific advance through the creation of repositories for secondary use. Data anonymization is a mandatory step to comply with current legislation. A service for the pseudonymization of electronic healthcare record (EHR) extracts aimed at facilitating the exchange of clinical information for secondary use in compliance with legislation on data protection is presented. According to ISO/TS 25237, pseudonymization is a particular type of anonymization. This tool performs the anonymizations by maintaining three quasi-identifiers (gender, date of birth, and place of residence) with a degree of specification selected by the user. The developed system is based on the ISO/EN 13606 norm using its characteristics specifically favorable for anonymization. The service is made up of two independent modules: the demographic server and the pseudonymizing module. The demographic server supports the permanent storage of the demographic entities and the management of the identifiers. The pseudonymizing module anonymizes the ISO/EN 13606 extracts. The pseudonymizing process consists of four phases: the storage of the demographic information included in the extract, the substitution of the identifiers, the elimination of the demographic information of the extract, and the elimination of key data in free-text fields. The described pseudonymizing system was used in three telemedicine research projects with satisfactory results. A problem was detected with the type of data in a demographic data field and a proposal for modification was prepared for the group in charge of the drawing up and revision of the ISO/EN 13606 norm.
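
The four-phase pseudonymizing process can be sketched as follows. This is a toy illustration: the field names, the in-memory "demographic server", and the UUID pseudonyms are assumptions for the example and do not reproduce actual ISO/EN 13606 structures; phase 4 (scrubbing free-text fields) is only indicated.

```python
import uuid

DEMOGRAPHIC_SERVER = {}  # stands in for the demographic server's permanent storage

def pseudonymize(extract):
    """Sketch of the four-phase process on a dict-shaped EHR extract."""
    record = dict(extract)
    # Phase 1: store the demographic information, keyed by a new pseudonym
    pseudonym = uuid.uuid4().hex
    DEMOGRAPHIC_SERVER[pseudonym] = {
        "name": record["name"], "patient_id": record["patient_id"],
    }
    # Phase 2: substitute the identifiers with the pseudonym
    record["patient_id"] = pseudonym
    # Phase 3: remove direct demographic information, keeping the three
    # quasi-identifiers (gender, date of birth, residence) at a chosen
    # degree of specification (birth_date here is already generalized to a month)
    del record["name"]
    # Phase 4 (not shown): scrub identifying strings from free-text fields
    return record

extract = {"patient_id": "H-12345", "name": "Jane Doe",
           "gender": "F", "birth_date": "1970-03", "residence": "Madrid",
           "diagnosis": "E11"}
anon = pseudonymize(extract)
print("name" in anon, anon["gender"])
```

Because the demographic server retains the pseudonym-to-identity mapping, the process is pseudonymization (reversible by an authorized party) rather than full anonymization, consistent with ISO/TS 25237's terminology.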

  19. An imaging informatics-based ePR (electronic patient record) system for providing decision support in evaluating dose optimization in stroke rehabilitation (United States)

    Liu, Brent J.; Winstein, Carolee; Wang, Ximing; Konersman, Matt; Martinez, Clarisa; Schweighofer, Nicolas


    Stroke is one of the major causes of death and disability in America. After stroke, about 65% of survivors still suffer from severe paresis, so the rehabilitation treatment strategy after stroke plays an essential role in recovery. Currently, there is a clinical trial (NIH award #HD065438) to determine the optimal dose of rehabilitation for persistent recovery of arm and hand paresis. For DOSE (Dose Optimization Stroke Evaluation), laboratory-based measurements such as the Wolf Motor Function test, behavioral questionnaires (e.g., the Motor Activity Log-MAL), and MR, DTI, and Transcranial Magnetic Stimulation (TMS) imaging studies are planned. Current data collection processes are tedious and reside in various standalone systems, including hardcopy forms. In order to improve the efficiency of this clinical trial and facilitate decision support, a web-based imaging informatics system has been implemented that uses mobile devices (e.g., iPads, tablet PCs, laptops) for collecting input data and integrates all multi-media data into a single system. The system aims to provide clinical imaging informatics management and a platform to develop tools to predict the treatment effect, based on the imaging studies and the treatment dosage, with mathematical models. Since there is a large amount of information to be recorded within the DOSE project, the system provides clinical data entry through mobile device applications, allowing users to collect data at the point of patient interaction without typing into a desktop computer, which is inconvenient. Imaging analysis tools will also be developed for structural MRI, DTI, and TMS imaging studies; these will be integrated within the system and correlated with the clinical and behavioral data. This system provides a research platform for future development of mathematical models to evaluate the differences between prediction and reality and thus improve and refine the models rapidly and efficiently.

  20. Predicting depression based on dynamic regional connectivity: a windowed Granger causality analysis of MEG recordings. (United States)

    Lu, Qing; Bi, Kun; Liu, Chu; Luo, Guoping; Tang, Hao; Yao, Zhijian


    Abnormal inter-regional causalities can be mapped for the objective diagnosis of various diseases. These inter-regional connectivities are usually calculated over an entire scan and used to characterize the stationary strength of the connections. However, the connectivity within networks may undergo substantial changes during a scan. In this study, we developed an objective depression recognition approach using the dynamic regional interactions that occur in response to sad facial stimuli. The whole time-period magnetoencephalography (MEG) signals from the visual cortex, amygdala, anterior cingulate cortex (ACC) and inferior frontal gyrus (IFG) were separated into sequential time intervals. The Granger causality mapping method was used to identify the pairwise interaction pattern within each time interval. Feature selection was then undertaken within a minimum redundancy-maximum relevance (mRMR) framework. Typical classifiers were utilized to predict those patients who had depression. The overall performances of these classifiers were similar, and the highest classification accuracy rate was 87.5%. The best discriminative performance was obtained when the number of features was within a robust range. The discriminative network pattern obtained through support vector machine (SVM) analyses displayed abnormal causal connectivities that involved the amygdala during the early and late stages. These early and late connections in the amygdala appear to reveal a negative bias to coarse expression information processing and abnormal negative modulation in patients with depression, which may critically affect depression discrimination.