WorldWideScience

Sample records for based record analysis

  1. Pareto analysis based on records

    CERN Document Server

    Doostparast, M

    2012-01-01

    Estimation of the parameters of an exponential distribution based on record data has been treated by Samaniego and Whitaker (1986) and Doostparast (2009). Recently, Doostparast and Balakrishnan (2011) obtained optimal confidence intervals as well as uniformly most powerful tests for one- and two-sided hypotheses concerning location and scale parameters based on record data from a two-parameter exponential model. In this paper, we derive optimal statistical procedures including point and interval estimation as well as most powerful tests based on record data from a two-parameter Pareto model. For illustrative purposes, a data set on the annual wages of a sample of production-line workers in a large industrial firm is analyzed using the proposed procedures.
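    The raw input to record-based procedures like these is the subsequence of record values extracted from the observed data. A minimal sketch of that extraction step (the Pareto sample below is simulated with illustrative parameters, not the paper's wage data set):

```python
import random

def upper_records(seq):
    """Return the upper record values of a sequence: each observation
    strictly larger than every observation before it."""
    records = []
    for x in seq:
        if not records or x > records[-1]:
            records.append(x)
    return records

# Hypothetical wage-like data simulated from a Pareto(scale=1, shape=2)
# model via inverse-transform sampling.
random.seed(1)
sample = [1.0 / (1.0 - random.random()) ** 0.5 for _ in range(1000)]
print(upper_records(sample))  # a short, strictly increasing subsequence
```

    Point and interval estimators for the Pareto parameters are then functions of these few record values rather than of the full sample.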

  2. Repetition-based Structure Analysis of Music Recordings

    OpenAIRE

    Jiang, Nanzhu

    2015-01-01

    Music Information Retrieval (MIR) is a current area of research which aims at providing techniques and tools for searching, organizing, processing and interacting with music data. In order to extract musically meaningful information from audio recordings, one requires methods from various fields such as digital signal processing, music theory, human perception, and information retrieval. One central research topic within MIR is referred to as music structure analysis, where an important goal ...

  3. Case record analysis

    OpenAIRE

    Whitaker, Simon

    2009-01-01

    It is argued that the determinants of low-frequency (less than once an hour) challenging behavior are likely to be more complex than those of high-frequency behavior, involving setting events that may not be present when the behavior occurs. The analysis of case records is then examined as a method of identifying possible setting events for low-frequency behavior. It is suggested that time series analysis, correlational analysis and time lag sequential analysis may all be useful methods in th...

  4. Phasor based analysis of FRET images recorded using spectrally resolved lifetime imaging

    International Nuclear Information System (INIS)

    The combined analysis of spectral and lifetime images has the potential to provide more accurate and more detailed information about Förster resonance energy transfer (FRET). We have developed a novel FRET analysis method to analyze images recorded by multispectral lifetime imaging. The new method is based on a phasor approach and facilitates the simultaneous analysis of decay kinetics of donor and acceptor molecules. The method is applicable to both molecules that exhibit a mono-exponential decay and a bi-exponential decay. As an example we show the possibility of extracting the energy transfer efficiency and the fraction of interacting molecules even in the presence of non-interacting molecules. The reliability of the method is investigated by comparing it with conventional FRET-FLIM analyses. We show that, with the same number of detected photons, the spectrally resolved phasor approach provides higher accuracy than other analysis methods; the confidence interval is improved and the FRET efficiency is closer to the real value. (paper)
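    The phasor approach mentioned here maps each fluorescence decay to a point (g, s) in a 2-D plane via cosine and sine transforms; for a mono-exponential decay that point falls on the "universal semicircle" (g - 1/2)² + s² = 1/4. A hedged sketch with illustrative values of tau and omega, not taken from the paper:

```python
import math

def phasor(decay, dt, omega):
    """First-harmonic phasor coordinates (g, s) of a sampled decay I(t)."""
    total = sum(decay)
    g = sum(I * math.cos(omega * i * dt) for i, I in enumerate(decay)) / total
    s = sum(I * math.sin(omega * i * dt) for i, I in enumerate(decay)) / total
    return g, s

# Mono-exponential decay with illustrative lifetime and modulation frequency.
tau, dt, omega = 2.0, 0.01, 2 * math.pi * 0.05          # ns, ns, rad/ns
decay = [math.exp(-i * dt / tau) for i in range(4000)]  # samples out to 20*tau
g, s = phasor(decay, dt, omega)
print(g, s)  # should lie (numerically) on the universal semicircle
```

    Mixtures of interacting and non-interacting molecules appear as points on the chord between the two pure-species phasors, which is what makes the fraction of interacting molecules recoverable.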

  5. Usability of Web-based Personal Health Records: An Analysis of Consumers' Perspectives.

    Science.gov (United States)

    Wang, Tiankai; Dolezel, Diane

    2016-01-01

    Personal health records (PHRs) have many benefits, including the ability to increase involvement of patients in their care, which provides better healthcare outcomes. Although issues related to usability of PHRs are a significant barrier to adoption, there is a paucity of research in this area. Thus, the researchers explored consumers' perspective on the usability of two commercially available web-based PHRs. Data from the Usefulness, Satisfaction, and Ease of Use questionnaire were collected from a sample of health information management students (N = 90). A one-way analysis of variance (ANOVA) showed that Microsoft HealthVault had higher scores in most usability categories when compared to Health Companion. Study results indicated that PHR developers should evaluate Microsoft HealthVault as a model for improving PHR usability. PMID:27134611
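    A one-way ANOVA, as used in this study, compares between-group variance to within-group variance. The sketch below implements the F statistic from scratch; the usability scores are made up for illustration and are not the study's data:

```python
def anova_f(groups):
    """One-way ANOVA F statistic: between-group over within-group
    mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical usability scores for two PHRs (illustrative values only).
healthvault = [5.8, 6.1, 5.9, 6.3, 6.0]
companion = [5.1, 5.4, 5.0, 5.3, 5.2]
print(anova_f([healthvault, companion]))
```

    A large F relative to the F(k-1, n-k) distribution indicates that at least one group mean differs, which is the form of evidence the study reports for HealthVault's higher usability scores.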

  6. Heart rhythm analysis using ECG recorded with a novel sternum based patch technology

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt; Fauerskov, Inge; Osmanagic, Armin;

    2013-01-01

    The device is designed for high compliance, low patient burden, and reliable long-term ECG recordings. This novel patch technology is CE approved for ambulatory ECG recording of two ECG channels on the sternum. This paper describes a clinical pilot study regarding the usefulness of these ECG signals for heart rhythm analysis. A clinical technician with experience in ECG interpretation selected 200 noise-free 7-second ECG segments from 25 different patients. These 200 ECG segments were evaluated by two medical doctors according to their usefulness for heart rhythm analysis. The first doctor considered 98.5% of the segments useful for rhythm analysis, whereas the second doctor considered 99.5% of the segments useful. This pilot study indicates that two-channel ECG recorded on the sternum is useful for rhythm analysis and could be used as input to diagnosis...

  7. Borneo: a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    OpenAIRE

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands, I developed high-spatial-resolution patterns of Borneo's botanical richness, endemicity, and floristic regions. The patterns are derived from species distribution models, which predict a species' occurrence based on the identified relationships between the species' recorded presences and the ecological circumstances at those localities. A new statistical method was developed to test the species distribut...

  8. How do repeat suicide attempters differ from first timers? An exploratory record based analysis

    Directory of Open Access Journals (Sweden)

    Vikas Menon

    2016-01-01

    Background: Evidence indicates that repeat suicide attempters, as a group, may differ from first-time attempters. The identification of repeat attempters is a powerful but underutilized clinical variable. Aims: In this research, we aimed to compare individuals with lifetime histories of multiple attempts against first-time attempters to identify factors predictive of repeat attempts. Setting and Design: This was a retrospective record-based study carried out at a teaching-cum-tertiary-care hospital in South India. Methods: Relevant data were extracted from the clinical records of first-time attempters (n = 362) and repeat attempters (n = 61) presenting to a single tertiary care center over a 4½-year period. They were compared on various sociodemographic and clinical parameters. The clinical measures included the Presumptive Stressful Life Events Scale, Beck Hopelessness Scale, Coping Strategies Inventory – Short Form, and the Global Assessment of Functioning Scale. Statistical Analysis Used: First-time attempters and repeaters were compared using appropriate inferential statistics. Logistic regression was used to identify independent predictors of repeat attempts. Results: The two groups did not differ significantly on sociodemographic characteristics. Repeat attempters were more likely to have given prior hints about their act (χ² = 4.500, P = 0.034). In the final regression model, the Beck hopelessness score emerged as a significant predictor of repeat suicide attempts (odds ratio = 1.064, P = 0.020). Conclusion: Among suicide attempters presenting to the hospital, the presence of hopelessness is a predictor of repeat suicide attempts, independent of clinical depression. This highlights the importance of considering hopelessness in the assessment of suicidality with a view to minimizing the risk of future attempts.

  9. Towards successful coordination of electronic health record-based referrals: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Paul Lindsey A

    2011-07-01

    Abstract Background: Successful subspecialty referrals require considerable coordination and interactive communication among the primary care provider (PCP), the subspecialist, and the patient, which may be challenging in the outpatient setting. Even when referrals are facilitated by electronic health records (EHRs), i.e., e-referrals, lapses in patient follow-up might occur. Although compelling reasons exist why referral coordination should be improved, little is known about which elements of the complex referral coordination process should be targeted for improvement. Using Okhuysen & Bechky's coordination framework, this paper aims to understand the barriers, facilitators, and suggestions for improving communication and coordination of EHR-based referrals in an integrated healthcare system. Methods: We conducted a qualitative study to understand coordination breakdowns related to e-referrals in an integrated healthcare system and examined work-system factors that affect the timely receipt of subspecialty care. We conducted interviews with seven subject matter experts and six focus groups with a total of 30 PCPs and subspecialists at two tertiary care Department of Veterans Affairs (VA) medical centers. Using techniques from grounded theory and content analysis, we identified organizational themes that affected the referral process. Results: Four themes emerged: lack of an institutional referral policy, lack of standardization in certain referral procedures, ambiguity in roles and responsibilities, and inadequate resources to adapt and respond to referral requests effectively. Marked differences in PCPs' and subspecialists' communication styles and individual mental models of the referral processes likely precluded the development of a shared mental model to facilitate coordination and successful referral completion. Notably, very few barriers related to the EHR were reported. Conclusions: Despite facilitating information transfer between PCPs and

  10. Usability of Web-based Personal Health Records: An Analysis of Consumers' Perspectives

    OpenAIRE

    Wang, Tiankai; Dolezel, Diane

    2016-01-01

    Personal health records (PHRs) have many benefits, including the ability to increase involvement of patients in their care, which provides better healthcare outcomes. Although issues related to usability of PHRs are a significant barrier to adoption, there is a paucity of research in this area. Thus, the researchers explored consumers' perspective on the usability of two commercially available web-based PHRs. Data from the Usefulness, Satisfaction, and Ease of Use questionnaire were collected...

  11. The construction and periodicity analysis of natural disaster database of Alxa area based on Chinese local records

    Science.gov (United States)

    Yan, Zheng; Mingzhong, Tian; Hengli, Wang

    2010-05-01

    Chinese hand-written local records date back to the first century. These local records generally cover the geography, history, customs, education, products, people, historical sites, and writings of an area. Thanks to such endeavors, the natural history of China has had almost no "dark ages" over its 5000-year civilization. A compilation of all meaningful historical data on natural disasters in Alxa, Inner Mongolia, which contains the second largest desert in China, is used here to construct a 500-year high-resolution database. The database is divided into subsets according to the type of natural disaster, such as sand-dust storms, drought events, and cold waves. By applying trend, correlation, wavelet, and spectral analysis to these data, we can estimate the statistical periodicity of different natural disasters, detect and quantify similarities and patterns among the periodicities of these records, and finally take these results in aggregate to find a strong and coherent cyclicity through the last 500 years, which serves as the driving mechanism of these geological hazards. Based on the periodicity obtained from the above analysis, the paper discusses the probability of forecasting natural disasters and suitable measures to reduce disaster losses using historical records. Keywords: Chinese local records; Alxa; natural disasters; database; periodicity analysis
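    Spectral analysis of such a disaster time series typically means locating the dominant peak of a periodogram. A minimal sketch using a plain DFT on a synthetic series with an injected 11-year cycle (the cycle length and noise term are illustrative, not results from the paper):

```python
import cmath
import math

def dominant_period(series):
    """Return the period (in samples) of the strongest non-DC peak
    of a plain discrete Fourier periodogram."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]
    best_k, best_p = 1, 0.0
    for k in range(1, n // 2 + 1):
        c = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        p = abs(c) ** 2
        if p > best_p:
            best_k, best_p = k, p
    return n / best_k

# Hypothetical 500-"year" record with an 11-year cycle plus a weaker term.
series = [math.sin(2 * math.pi * t / 11) + 0.2 * math.sin(t) for t in range(500)]
print(dominant_period(series))  # close to 11, up to the frequency-bin spacing
```

    Wavelet analysis, also mentioned in the abstract, would additionally localize such cycles in time rather than assuming they persist over the whole 500 years.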

  12. A Quantitative Analysis of an EEG Epileptic Record Based on Multiresolution Wavelet Coefficients

    Directory of Open Access Journals (Sweden)

    Mariel Rosenblatt

    2014-11-01

    The characterization of the dynamics associated with electroencephalogram (EEG) signals, combining an orthogonal discrete wavelet transform analysis with quantifiers originating from information theory, is reviewed. In addition, an extension of this methodology based on multiresolution quantities, called wavelet leaders, is presented. In particular, the temporal evolution of the Shannon entropy and the statistical complexity evaluated with different sets of multiresolution wavelet coefficients is considered. Both methodologies are applied to the quantitative EEG time series analysis of a tonic-clonic epileptic seizure, and comparative results are presented. Even though both methods describe the dynamical changes of the EEG time series, the one based on wavelet leaders presents a better time resolution.
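    The quantifiers described above combine a discrete wavelet transform with Shannon entropy computed over the relative energies of the resolution levels. A simplified sketch using the orthonormal Haar wavelet (the paper's choice of wavelet and the exact quantifier details may differ):

```python
import math

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    s = 1 / math.sqrt(2)
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def wavelet_entropy(x, levels):
    """Shannon entropy of the relative wavelet energies across levels."""
    energies = []
    for _ in range(levels):
        x, d = haar_dwt(x)
        energies.append(sum(v * v for v in d))
    energies.append(sum(v * v for v in x))  # energy of the final approximation
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

# A flat signal concentrates energy in one band (entropy 0); mixing scales
# spreads energy across bands and raises the entropy.
print(wavelet_entropy([1.0] * 16, 3), wavelet_entropy([2.0, 0.0] * 8, 3))
```

    Tracking this entropy in sliding windows over an EEG record is what yields the "temporal evolution" of the quantifier that the abstract refers to.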

  13. Bayesian analysis for the Burr type XII distribution based on record values

    Directory of Open Access Journals (Sweden)

    Mustafa Nadar

    2013-05-01

    In this paper we review and extend some results that have been derived on record values from the two-parameter Burr Type XII distribution. The two parameters were assumed to be random variables, and Bayes estimates were derived on the basis of a linear exponential (LINEX) loss function. Estimates for future record values were derived using non-Bayesian and Bayesian approaches. In the Bayesian approach we review the estimators obtained by Ahmadi and Doostparast (2006) using the well-known squared error loss (SEL) function, and we derive an estimate for the future record value under the LINEX loss function. A numerical example with tables and figures illustrates the findings.

  14. Validation of PC-based sound card with Biopac for digitalization of ECG recording in short-term HRV analysis

    Directory of Open Access Journals (Sweden)

    K Maheshkumar

    2016-01-01

    Background: Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation of heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to validate HRV computed from temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) against the Biopac MP36 module. Materials and Methods: Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, participated in the present study. Following a standard protocol, 5-min ECG was recorded after 10 min of supine rest, simultaneously by the portable simple analog amplifier with PC-based sound card and by the Biopac module, with surface electrodes in the lead II position. All ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in the Kubios software. Short-term HRV indices in both the time and frequency domains were used. Results: The unpaired Student's t-test and Pearson correlation coefficient were used for the analysis, using the R statistical software. No statistically significant differences were observed when comparing the values analyzed by means of the two devices. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the time- and frequency-domain values obtained by the two devices. Conclusion: On the basis of these results, we suggest that HRV values in the time and frequency domains computed from RR series obtained with the PC-based sound card are probably as reliable as those obtained with the gold-standard Biopac MP36.
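    Standard short-term time-domain HRV indices such as SDNN and RMSSD are computed directly from the RR-interval series (packages like Kubios report these among others). A minimal sketch with illustrative RR values:

```python
import math

def sdnn(rr):
    """Standard deviation of all RR intervals (ms), sample variance."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical short RR series in milliseconds (illustrative only).
rr = [800, 810, 790, 805]
print(sdnn(rr), rmssd(rr))
```

    Frequency-domain indices (LF and HF power) would additionally require resampling the RR series and estimating a power spectrum, which is omitted here.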

  15. Magnetoencephalography recording and analysis

    Directory of Open Access Journals (Sweden)

    Jayabal Velmurugan

    2014-01-01

    Magnetoencephalography (MEG) non-invasively measures the magnetic field generated by the excitatory postsynaptic electrical activity of the apical dendritic pyramidal cells. This tiny magnetic field is measured with biomagnetometer sensors coupled to Superconducting Quantum Interference Devices (SQUIDs) inside a magnetically shielded room (MSR). Subjects are usually screened for the presence of ferromagnetic materials, and then the head position indicator coils, electroencephalography (EEG) electrodes (if measured simultaneously), and fiducials are digitized using a 3D digitizer, which aids in movement correction and in transferring the MEG data from head coordinates to device and voxel coordinates, thereby enabling more accurate co-registration and localization. MEG data pre-processing involves filtering the data for environmental and subject interference, and artefact identification and rejection. Magnetic resonance imaging (MRI) is processed for correction and identification of fiducials. After choosing and computing the appropriate head model (spherical or realistic; boundary/finite element model), the interictal/ictal epileptiform discharges are selected and modeled by an appropriate source modeling technique (clinically, the most commonly used is the single equivalent current dipole, or ECD, model). The equivalent current dipole (ECD) source localization of the modeled interictal epileptiform discharge (IED) is considered physiologically valid or acceptable based on waveform morphology, isofield pattern, and dipole parameters (localization, dipole moment, confidence volume, goodness of fit). Thus, MEG source localization can aid clinicians in sublobar localization, lateralization, and grid placement, by evoking the irritative/seizure onset zone. It also accurately localizes eloquent cortex, such as visual and language areas. MEG also aids in diagnosing and delineating multiple novel findings in other neuropsychiatric

  16. Recording-based identification of site liquefaction

    Institute of Scientific and Technical Information of China (English)

    Hu Yuxian; Zhang Yushan; Liang Jianwen; Ray Ruichong Zhang

    2005-01-01

    Reconnaissance reports and pertinent research on seismic hazards show that liquefaction is one of the key sources of damage to geotechnical and structural engineering systems. Therefore, identifying site liquefaction conditions plays an important role in seismic hazard mitigation. One of the widely used approaches for detecting liquefaction is based on the time-frequency analysis of ground motion recordings, in which the short-time Fourier transform is typically used. It is known that recordings at a site with liquefaction are the result of nonlinear responses of seismic waves propagating in the liquefied layers underneath the site. Moreover, the Fourier transform is not effective in characterizing such dynamic features as the time-dependent frequency of the recordings rooted in these nonlinear responses. Therefore, the aforementioned approach may not be intrinsically effective in detecting liquefaction. An alternative to the Fourier-based approach is presented in this study, which proposes time-frequency analysis of earthquake ground motion recordings with the aid of the Hilbert-Huang transform (HHT), and offers justification for the HHT in addressing the liquefaction features shown in the recordings. The paper then defines the predominant instantaneous frequency (PIF) and introduces the PIF-related motion features to identify liquefaction conditions at a given site. Analysis of 29 recorded data sets at different site conditions shows that the proposed approach is effective in detecting site liquefaction in comparison with other methods.
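    The instantaneous frequency underlying a measure like the PIF can be obtained from the analytic signal (Hilbert transform). The full HHT additionally decomposes the signal into intrinsic mode functions first; that step is omitted in this simplified sketch, which recovers the frequency of a pure test tone:

```python
import numpy as np

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the analytic signal of x,
    built by zeroing the negative-frequency half of the spectrum."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    analytic = np.fft.ifft(X * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2 * np.pi)

fs = 200.0                          # illustrative sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 5.0 * t)     # a 5 Hz test tone
inst_f = instantaneous_frequency(x, fs)
print(inst_f[50:-50].mean())        # approximately 5 Hz away from the edges
```

    For a liquefying site, the interest is precisely in how this instantaneous frequency drifts downward over time as the soil softens, which a single Fourier spectrum cannot show.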

  17. Analysis of stratospheric chlorine monoxide measurements recorded by a ground-based radiometer located at the Plateau de Bure, France

    Science.gov (United States)

    Ricaud, P.; de La Noë, J.; Lauqué, R.; Parrish, A.

    1997-01-01

    Chlorine monoxide (ClO) measurements have been recorded since December 1993 using a ground-based radiometer built in the United States and installed at a European site belonging to the Network for the Detection of Stratospheric Change, namely the Plateau de Bure (45°N, 10°E, 2550 m altitude). This paper describes the installed instrument, details the calibration and data processing routines that have been developed for use with the instrument, and characterizes the optimum estimation method used to retrieve vertical profiles and the associated error budget. Great care is taken to lessen biases induced by instrumental baseline ripples. We show the best data obtained so far, recorded from January 3 to 5, 1995, during the Second European Stratospheric Arctic and Midlatitude Experiment (SESAME) Campaign. The diurnal variation of ClO measured from 25 to 50 km is presented and compared with results from a zero-dimensional and a three-dimensional model. The remainder of the data is still being processed, although a great deal of this data has been degraded by systematic instrumental artifacts in the spectral baseline.

  18. EEG Recording and Analysis for Sleep Research

    OpenAIRE

    Campbell, Ian G.

    2009-01-01

    The electroencephalogram (EEG) is the most common tool used in sleep research. This unit describes the methods for recording and analyzing the EEG. Detailed protocols describe recorder calibration, electrode application, EEG recording, and computer EEG analysis with power spectral analysis. Computer digitization of an analog EEG signal is discussed, along with EEG filtering and the parameters of fast Fourier transform (FFT) power spectral analysis. Sample data are provided for a typical night...

  19. An Analysis of the Accuracy of Electromechanical Eigenvalue Calculations Based on Instantaneous Power Waveforms Recorded in a Power Plant

    OpenAIRE

    Piotr Pruski; Stefan Paszek

    2013-01-01

    The paper presents the results of calculating the eigenvalues (associated with electromechanical phenomena) of the state matrix of the Polish Power System model on the basis of analysis of simulated and measured instantaneous power disturbance waveforms of generating units in Łaziska Power Plant. The method for electromechanical eigenvalue calculations used in investigations consists in approximation of the instantaneous power swing waveforms in particular generating units with the use of the...

  20. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most of

  1. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    SUMMARY: Notice is hereby given that the Department of State proposes to create a system of records, State-78, Risk Analysis and Management Records. SUPPLEMENTARY INFORMATION: The Department of State proposes that the new system will be ``Risk Analysis and Management Records.''

  2. Design spectrums based on earthquakes recorded at tarbela

    International Nuclear Information System (INIS)

    The first seismological network in Pakistan was set up in early 1969 at Tarbela, the location of the largest water reservoir in the country. The network consisted of analog accelerographs and seismographs. Since the installation, many seismic events of different magnitudes have occurred and been recorded by the installed instruments. The analog recorded time histories have been digitized, and data from twelve earthquakes, irrespective of soil type, have been used to derive elastic design spectra for Tarbela, Pakistan. The PGA scaling factors for each component, based on the risk analysis studies carried out for the region, are also given. The suggested design spectra will be very useful for new construction in the region and its surroundings, and the digitized time histories will be useful for seismic response analysis of structures and seismic risk analysis of the region. (author)

  3. Structure and performance of a real-time algorithm to detect tsunami or tsunami-like alert conditions based on sea-level records analysis

    Directory of Open Access Journals (Sweden)

    L. Bressan

    2011-05-01

    The goal of this paper is to present an original real-time algorithm devised for the detection of tsunami or tsunami-like waves, which we call TEDA (Tsunami Early Detection Algorithm), and to introduce a methodology to evaluate its performance. TEDA works on the sea-level records of a single station and implements two distinct modules running concurrently: one to assess the presence of tsunami waves ("tsunami detection") and the other to identify high-amplitude long waves ("secure detection"). Both detection methods are based on continuously updated time functions depending on a number of parameters that can be varied according to the application. In order to select the most adequate parameter setting for a given station, a simple methodology based on a number of indicators has been devised to evaluate TEDA performance. In this paper an example of TEDA application is given using data from a tide gauge located at Adak Island in Alaska, USA, which proved quite suitable since it has recorded several tsunamis in recent years at a sampling rate of 1 min.
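    TEDA's two modules are not specified in detail in this abstract; the sketch below shows only the generic idea behind such detectors, comparing a short detection window against a continuously updated background estimate, with hypothetical window lengths and threshold rather than TEDA's actual time functions:

```python
import math
import statistics

def detect_anomalies(level, bg_win=60, sig_win=5, k=4.0):
    """Flag samples whose recent deviation from a trailing background mean
    exceeds k times the background variability (hypothetical parameters)."""
    flags = []
    for i in range(len(level)):
        if i < bg_win + sig_win:
            flags.append(False)  # not enough history yet
            continue
        bg = level[i - bg_win - sig_win:i - sig_win]
        mu = statistics.fmean(bg)
        sd = statistics.pstdev(bg) or 1e-9
        dev = max(abs(v - mu) for v in level[i - sig_win:i])
        flags.append(dev > k * sd)
    return flags

# Synthetic 1-min sea-level record: a small ripple, then a sudden
# half-metre rise starting at sample 200.
level = [0.01 * math.sin(2 * math.pi * i / 10) for i in range(300)]
for i in range(200, 300):
    level[i] += 0.5
flags = detect_anomalies(level)
print(flags.index(True))  # first flagged sample, shortly after 200
```

    The performance-evaluation methodology in the paper then amounts to scoring such flags (detection delay, missed events, false alarms) over historical records for each candidate parameter setting.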

  4. ECG biometric analysis in different physiological recording conditions

    OpenAIRE

    Porée, Fabienne; Kervio, Gaëlle; Carrault, Guy

    2016-01-01

    Biometric systems aim to perform identification or verification of the identity of individuals. The human electrocardiogram (ECG) has recently been proposed as an additional tool for biometric applications. A set of ECG-based biometric studies has since appeared in the literature, but they are difficult to compare because they use various values for: the number of ECG leads, the length of the analysis window (only the QRS or more), and the delays between recordings...

  5. Analysis of external occupational dose records in Brazil

    International Nuclear Information System (INIS)

    Brazil, a continental country with currently more than 150,000 workers under individual monitoring for ionizing radiation, implemented in 1987 a centralized system for the storage of external occupational doses. This database has been improved over the years and is now a web-based information system called the Brazilian External Occupational Dose Management Database System - GDOSE. This paper presents an overview of Brazilian external occupational doses over the years. The estimated annual average effective dose shows a decrease from 2.4 mSv in 1987 to about 0.6 mSv, with a marked reduction from 1987 to 1990. Analyzing by type of controlled practice, one sees that medical and dental radiology is the area with the largest number of users of individual monitors (70%), followed by education practices (8%) and industrial radiography (7%). In addition to photon whole-body monitoring, neutron monitors are used in maintenance (36%), reactors (30%) and education (27%); and extremity monitors in education (27%), nuclear medicine (22%) and radiology (19%). In terms of collective dose, the highest values are also found in conventional radiology, but the highest average dose values are those of interventional radiology. Nuclear medicine, R and D, and radiotherapy also have average annual effective doses higher than 1 mSv. However, there are some very high dose values registered in GDOSE that give false information; these should be better analyzed in the future. Annual doses above 500 mSv are certainly not realistic. (author)

  6. Group Chat Records Based Human Behavior Dynamics Analysis

    Institute of Scientific and Technical Information of China (English)

    王洪川; 郭进利; 樊超

    2012-01-01

    The paper analyzes actual data from the chat records of six QQ groups. From the group perspective, it performs statistical analysis of the time intervals between two successive messages and the character length of each message. The results show that both the inter-message time and the message length clearly follow power-law (heavy-tailed) distributions, and that the exponents of different groups are very close. The findings indicate that group communication behavior over instant messaging obeys the general heavy-tailed rule of human dynamics.

  7. The helpful patient record system: problem oriented and knowledge based.

    OpenAIRE

    Bayegan, Elisabeth; Tu, Samson

    2002-01-01

    In contrast to existing computerized patient record systems, which merely offer static functionality for storage and presentation, a helpful patient record system is a problem-oriented, knowledge-based system which provides the clinician with situation-specific information from the patient record, relevant to the activity within the patient care process. We suggest extending the data model of current patient record systems with (1) knowledge for recognizing and interpreting care situations, (...

  8. A web-based electronic patient record (ePR) system for data integration in movement analysis research on wheel-chair users to minimize shoulder pain

    Science.gov (United States)

    Deshpande, Ruchi R.; Requejo, Philip; Sutisna, Erry; Wang, Ximing; Liu, Margaret; McNitt-Gray, Sarah; Ruparel, Puja; Liu, Brent J.

    2012-02-01

    Patients confined to manual wheel-chairs are at an added risk of shoulder injury. There is a need for developing optimal bio-mechanical techniques for wheel-chair propulsion through movement analysis. Data collected is diverse and in need of normalization and integration. Current databases are ad-hoc and do not provide flexibility, extensibility and ease of access. The need for an efficient means to retrieve specific trial data, display it and compare data from multiple trials is unmet through lack of data association and synchronicity. We propose the development of a robust web-based ePR system that will enhance workflow and facilitate efficient data management.

  9. Drive-based recording analyses at >800 Gfc/in² using shingled recording

    Science.gov (United States)

    William Cross, R.; Montemorra, Michael

    2012-02-01

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in2 in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in2 using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in2 and beyond.

  10. Recording of ECG signals on a portable MiniDisc recorder for time and frequency domain heart rate variability analysis.

    Science.gov (United States)

    Norman, S E; Eager, R A; Waran, N K; Jeffery, L; Schroter, R C; Marlin, D J

    2005-01-17

    Analysis of heart rate variability (HRV) is a non-invasive technique useful for investigating autonomic function in both humans and animals. It has been used for research into both behaviour and physiology. Commercial systems for human HRV analysis are expensive and may not have sufficient flexibility for appropriate analysis in animals. Some heart rate monitors have the facility to provide inter-beat interval (IBI), but verification following collection is not possible as only IBIs are recorded, and not the raw electrocardiogram (ECG) signal. Computer-based data acquisition and analysis systems such as Po-Ne-Mah and Biopac offer greater flexibility and control but have limited portability. Many laboratories and veterinary surgeons have access to ECG machines but do not have equipment to record ECG signals for further analysis. The aim of the present study was to determine whether suitable HRV data could be obtained from ECG signals recorded onto a MiniDisc (MD) and subsequently digitised and analysed using a commercial data acquisition and analysis package. ECG signals were obtained from six Thoroughbred horses by telemetry. A split BNC connector was used to allow simultaneous digitisation of analogue output from the ECG receiver unit by a computerised data acquisition system (Po-Ne-Mah) and MiniDisc player (MZ-N710, Sony). Following recording, data were played back from the MiniDisc into the same input channel of the data acquisition system as previously used to record the direct ECG. All data were digitised at a sampling rate of 500 Hz. IBI data were analysed in both time and frequency domains and comparisons between directly recorded and MiniDisc data were made using Bland-Altman analysis. Despite some changes in ECG morphology due to loss of low frequency content (primarily below 5 Hz) following MiniDisc recording, there was minimal difference in IBI or time or frequency domain analysis between the two recording methods. The MiniDisc offers a cost
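The Bland-Altman comparison used in this study can be sketched in a few lines; the IBI values below are hypothetical placeholders standing in for the directly recorded and MiniDisc-replayed series:

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman statistics for paired measurements:
    returns the bias (mean difference) and the 95% limits
    of agreement (bias +/- 1.96 * SD of the differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical inter-beat intervals (ms) from the two recording paths.
direct   = [812, 805, 790, 820, 815, 800]
minidisc = [810, 806, 789, 822, 814, 801]
bias, (lo, hi) = bland_altman(direct, minidisc)
```

A bias near zero with narrow limits of agreement would correspond to the paper's finding of minimal difference between the two methods.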

  11. Analysis of Scan Records with a Recording Densitometer - The ''Re-Scanner''

    International Nuclear Information System (INIS)

    The impact of improvements in scanning equipment has not been fully felt at the clinical level, largely because of deficiencies in scan recording. In an attempt to improve visualization and contrast in scan records, various instrumental methods of analysis have been devised. We have devised a simple and comparatively inexpensive recording densitometer for ''re-scanning'' scan records. A light-sensor scans the record just as a scanner scans a patient. The output of the device is a pulse rate proportional to the opacity (or transmission) of the record, and may be used to make a new, or ''re-scan'', record. The area of the record over which information is integrated is set by sensor aperture. The wide range of output pulse-rates (zero to 15 000 parts/s) causes large and adjustable contrast amplification. A threshold control provides any ''cut-off level'' of choice. Operation is rapid, and a record can be re-scanned in a small fraction of the time required to obtain the original record. Studies on clinical scans of almost every organ or area of interest show that the re-scanner reveals information not at first evident in original scan records. It has been particularly useful in determining the statistical significance of small variations in counting rate in a scan record. In scan records of large dynamic range where no single cut-off level satisfactorily shows all regions of interest, re-scans at several cut-off levels were once necessary. A two-region sensor, that views a region of the record around the field of view of the main sensor, has been used in an attempt to overcome this difficulty. At least three modes of operation are possible with the two-region sensor: (1) ''normal'' operation; (2) ignoring general record density and responding only to small variations, thus setting its own cut-off level; and (3) reporting only abrupt changes in record density. Other modes seem to be possible. This relatively simple and inexpensive device is proving to be of valuable

  12. Development of Software for dose Records Data Base Access

    International Nuclear Information System (INIS)

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was the individual dose follow-up control and the data handling for epidemiological studies. Within the Data Base management scheme, software development to allow searching of individual dose records by external authorised users was undertaken. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records from workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of codes and subroutines developed. (Author) 2 refs

  13. Multilevel Analysis of Continuous Acoustic Emission Records

    Czech Academy of Sciences Publication Activity Database

    Chlada, Milan; Převorovský, Zdeněk

    Praha : ČVUT Praha Fakulta jaderná a fyzikálně inženýrská, 2013 - (Hobza, T.), s. 62-71 ISBN 978-80-01-05383-6. [SPMS 2013. Nebřich (CZ), 24.06.2013-29.06.2013] R&D Projects: GA MPO FR-TI3/755 Institutional support: RVO:61388998 Keywords : continuous acoustic emission * wavelet analysis * countogram * helicopter gearbox diagnostics Subject RIV: JR - Other Machinery

  14. A Late Quaternary Climate Record Based on Multi-Proxies Analysis from the Jiaochang Loess Section in the Eastern Tibetan Plateau, China

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    We compared the stable carbon isotopic records from a loess transect of the Jiaochang in the eastern Tibetan Plateau, spanning the last ~21,000 years, with multiproxy data for pedogenesis, including magnetic susceptibility, clay fraction, Fed/Fet ratio, carbonate and total organic carbon content, in order to probe the mechanisms of δ13C values of organic matter and Late Quaternary climate variations in the eastern Tibetan Plateau. Our results indicate that there is no simple relationship between δ13C of organic matter and summer monsoon variations. The change in δ13C values of organic matter (in accordance with the ratios of C3 to C4 plants) results from the interaction among temperature, aridity and atmospheric pCO2 level. Drier climate and lower atmospheric pCO2 level contribute to positive carbon isotopic excursion, while negative carbon isotopic excursion is the result of lower temperature and increased atmospheric pCO2 level. Additionally, our results imply that the Tibetan monsoon may play an important role in the climate system of the eastern Tibetan Plateau, which specifically reflects frequently changing climate in that area. The results provide new insights into the forcing mechanisms behind both the δ13C values of organic matter and the local climate system.

  15. Quantitative transmission electron microscopy analysis of multi-variant grains in present L10-FePt based heat assisted magnetic recording media

    International Nuclear Information System (INIS)

    We present a study on atomic ordering within individual grains in granular L10-FePt thin films using transmission electron microscopy techniques. The film, used as a medium for heat assisted magnetic recording, consists of a single layer of FePt grains separated by non-magnetic grain boundaries and is grown on an MgO underlayer. Using convergent-beam techniques, diffraction patterns of individual grains are obtained for a large number of crystallites. The study found that although the majority of grains are ordered in the perpendicular direction, more than 15% of them are multi-variant, or of in-plane c-axis orientation, or disordered fcc. It was also found that these multi-variant and in-plane grains have always grown across MgO grain boundaries separating two or more MgO grains of the underlayer. The in-plane ordered portion within a multi-variant L10-FePt grain always lacks atomic coherence with the MgO directly underneath it, whereas the perpendicularly ordered portion is always coherent with the underlying MgO grain. Since multi-variant and in-plane ordered grains are severely detrimental to high density data storage capability, the understanding of their formation mechanism obtained here should make a significant impact on the future development of hard disk drive technology.

  16. Drive-based recording analyses at >800 Gfc/in2 using shingled recording

    International Nuclear Information System (INIS)

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in2 in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in2 using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in2 and beyond. - Research highlights: → Drive-based recording at 805 Gf/in2 has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. → Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. → Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack

  17. Drive-based recording analyses at >800 Gfc/in2 using shingled recording

    Energy Technology Data Exchange (ETDEWEB)

    William Cross, R., E-mail: William.r.cross@seagate.com [Seagate Technology, 389 Disc Drive, Longmont, CO 80503 (United States); Montemorra, Michael, E-mail: Mike.r.montemorra@seagate.com [Seagate Technology, 389 Disc Drive, Longmont, CO 80503 (United States)

    2012-02-15

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in2 in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in2 using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in2 and beyond. - Research Highlights: > Drive-based recording at 805 Gf/in2 has been demonstrated using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. > Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. > Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack

  18. Critiquing based on computer-stored medical records

    OpenAIRE

    Van Der Lei, Johan

    1991-01-01

    The purpose of this study was the creation of a model for critiquing based on data obtained from computer-stored medical records. The underlying assumption is that data obtained from automated medical records can be used to generate a medically relevant critique. To validate our ideas, we developed a system, HyperCritic, that critiques the decision making of general practitioners (GPs) caring for patients with hypertension.

  19. Analysis of operation records. Evaluation of event sequences in extruder

    International Nuclear Information System (INIS)

    All chemical analysis results and operator observations suggest that a non-chemical mechanism raised the filling temperature of the bituminized product during the incident. We, the Tokai reprocessing plant safety evaluation and analysis team, performed experiments with a laboratory-scale extruder and viscosity measurements to explain the high mixture temperature. The laboratory-scale extruder experiments showed that salt enrichment and salt accumulation occurred and raised the mixture temperature at the reduced feed rate. These phenomena depend on the feed rate and contribute substantially to heat transport and to the rise in operating torque caused by friction between the screw and the mixture. Based on the experimental results and all available information, we examined the operating procedure, operational records, and machine arrangement to explain the behavior of the mixture in the extruder. Judging from the torque and temperature behavior, we were able to explain the sequence of events in the incident. The mixture temperature is estimated to have been raised by physical heat generation in the extruder; this report describes each operation, the investigation results, and the estimated event sequences. (author)

  20. Analysis and modelling of tsunami-induced tilt for the 2007, M = 7.6, Tocopilla and the 2010, M = 8.8 Maule earthquakes, Chile, from long-base tiltmeter and broadband seismometer records

    Science.gov (United States)

    Boudin, F.; Allgeyer, S.; Bernard, P.; Hébert, H.; Olcay, M.; Madariaga, R.; El-Madani, M.; Vilotte, J.-P.; Peyrat, S.; Nercessian, A.; Schurr, B.; Esnoult, M.-F.; Asch, G.; Nunez, I.; Kammenthaler, M.

    2013-07-01

    We present a detailed study of tsunami-induced tilt at in-land sites, to test the interest and feasibility of such analysis for tsunami detection and modelling. We studied tiltmeter and broadband seismometer records of northern Chile, detecting a clear signature of the tsunamis generated by the 2007 Tocopilla (M = 7.6) and the 2010 Maule (M = 8.8) earthquakes. We find that these records are dominated by the tilt due to the elastic loading of the oceanic floor, with a small effect of the horizontal gravitational attraction. We modelled the Maule tsunami using the seismic source model proposed by Delouis et al. and a bathymetric map, correctly fitting three tide gauge records of the area (Antofagasta, Iquique and Arica). At all the closest stations (7 STS2, 2 long-base tiltmeters), we correctly modelled the first few hours of the tilt signal for the Maule tsunami. The only phase mismatch is for the site that is closer to the ocean. We find a tilt response of 0.005-0.01 μm at 7 km away from the coastline in response to a sea level amplitude change of 10 cm. For the Maule earthquake, we observe a clear tilt signal starting 20 min before the arrival time of the tsunami at the nearest point on the coastline. This capability of tilt or seismic sensors to detect distant tsunamis before they arrive has been successfully tested with a scenario megathrust in the southern Peru-northern Chile seismic gap. However, for large events near the stations, this analysis may no longer be feasible, due to the large amplitude of the long-period seismic signals expected to obscure the loading signal. Inland tilt measurements of tsunamis smooth out short, often unmodelled wavelengths of the sea level perturbation, thus providing robust, large-scale images of the tsunami. Furthermore, tilt measurements are not expected to saturate even for the largest run-ups, nor to suffer from near-coast tsunami damages. Tiltmeters and broadband seismometers are thus valuable instruments for monitoring

  1. Cognitive analysis of the summarization of longitudinal patient records.

    Science.gov (United States)

    Reichert, Daniel; Kaufman, David; Bloxham, Benjamin; Chase, Herbert; Elhadad, Noémie

    2010-01-01

    Electronic health records contain an abundance of valuable information that can be used to guide patient care. However, the large volume of information embodied in these records also renders access to relevant information a time-consuming and inefficient process. Our ultimate objective is to develop an automated summarizer that succinctly captures all relevant information in the patient record. In this paper, we present a cognitive study of 8 clinicians who were asked to create summaries based on data contained in the patients' electronic health record. The study characterized the primary sources of information that were prioritized by clinicians, the temporal strategies used to develop a summary and the cognitive operations used to guide the summarization process. Although we would not expect the automated summarizer to emulate human performance, we anticipate that this study will inform its development in instrumental ways. PMID:21347062

  2. Quantum-dot based nanothermometry in optical plasmonic recording media

    Energy Technology Data Exchange (ETDEWEB)

    Maestro, Laura Martinez [Fluorescence Imaging Group, Departamento de Física de Materiales, Facultad de Ciencias Físicas, Universidad Autónoma de Madrid, Madrid 28049 (Spain); Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia); Zhang, Qiming; Li, Xiangping; Gu, Min [Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia); Jaque, Daniel [Fluorescence Imaging Group, Departamento de Física de Materiales, Facultad de Ciencias Físicas, Universidad Autónoma de Madrid, Madrid 28049 (Spain)

    2014-11-03

    We report on the direct experimental determination of the temperature increment caused by laser irradiation in an optical recording medium consisting of a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots in the recording medium allowed single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon excited fluorescence of the quantum dots. The experimental results have been compared with numerical simulations, revealing excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media.

  3. Quantum-dot based nanothermometry in optical plasmonic recording media

    International Nuclear Information System (INIS)

    We report on the direct experimental determination of the temperature increment caused by laser irradiation in an optical recording medium consisting of a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots in the recording medium allowed single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon excited fluorescence of the quantum dots. The experimental results have been compared with numerical simulations, revealing excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media

  4. HJD-I record and analysis meter for nuclear information

    International Nuclear Information System (INIS)

    HJD-I, a low-cost, compact, multi-function, new-model intelligent nuclear electronic instrument for recording and analyzing nuclear information, is described. Its hardware and software are detailed, and a 137Cs spectrum acquired with the meter is presented.

  5. A RNA-based nanodevice recording temperature over time

    Science.gov (United States)

    Höfinger, Siegfried; Zerbetto, Francesco

    2010-04-01

    Nucleic acids provide a wealth of interesting properties that can find important applications in nanotechnology. In this article we describe a concept of how to use RNA for temperature measurements. In particular the principal components of a nanodevice are outlined that works on the basis of RNA secondary structure rearrangement. The major mode of operation is a hairpin-coil transition occurring at different temperatures for different types of short RNA oligonucleotides. Coupling these events to a detection system based on specific RNA hybridization provides the framework for a nanodevice capable of temperature records as a function of time. The analysis is carried out with the help of a statistical mechanics package that has been specifically designed to study RNA secondary structure. The procedure yields an optimized list of eight RNA sequences operational in the range from -10 to 60 °C. The data can form the basis of a new technology of potential interest to many fields of process and quality control.
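The hairpin-coil transition that drives this device can be illustrated with a simple two-state thermodynamic model; the enthalpy and entropy values below are hypothetical placeholders, not the paper's optimized sequences:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def fraction_folded(dH, dS, temp_c):
    """Two-state model: fraction of RNA molecules in the folded
    (hairpin) state at a given temperature in Celsius.
    dH and dS are the folding enthalpy (J/mol) and entropy
    (J/(mol*K)); the melting temperature is dH/dS."""
    T = temp_c + 273.15
    dG = dH - T * dS                       # folding free energy
    return 1.0 / (1.0 + math.exp(dG / (R * T)))

# Hypothetical hairpin with melting temperature dH/dS ~ 333 K (~60 C):
# mostly folded well below Tm, mostly open well above it.
low  = fraction_folded(-200e3, -600.0, 25)   # folded fraction at 25 C
high = fraction_folded(-200e3, -600.0, 90)   # folded fraction at 90 C
```

An eight-sequence "thermometer" as described in the record would use eight such hairpins with staggered melting temperatures spanning -10 to 60 °C.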

  6. Electronic Health Record A Systems Analysis of the Medications Domain

    CERN Document Server

    Scarlat, Alexander

    2012-01-01

    An accessible primer, Electronic Health Record: A Systems Analysis of the Medications Domain introduces the tools and methodology of Structured Systems Analysis as well as the nuances of the Medications domain. The first part of the book provides a top-down decomposition along two main paths: data in motion--workflows, processes, activities, and tasks in parallel to the analysis of data at rest--database structures, conceptual, logical models, and entities relationship diagrams. Structured systems analysis methodology and tools are applied to: electronic prescription, computerized physician or

  7. An Autonomous Underwater Recorder Based on a Single Board Computer.

    Directory of Open Access Journals (Sweden)

    Manuel Caldas-Morgan

    Full Text Available As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Designing and building more affordable alternatives from scratch can become complex, because it requires profound knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built based on a Raspberry Pi single board computer. A properly working prototype was assembled and demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module, which was thoroughly evaluated; it is estimated that it will allow for great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and manmade acoustic events, among others. Although its power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, its lower development time and, inherently, its cost.

  8. An Autonomous Underwater Recorder Based on a Single Board Computer.

    Science.gov (United States)

    Caldas-Morgan, Manuel; Alvarez-Rosario, Alexander; Rodrigues Padovese, Linilson

    2015-01-01

    As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Designing and building more affordable alternatives from scratch can become complex, because it requires profound knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a successful model of a highly flexible, low-cost alternative to commercial recorders was built based on a Raspberry Pi single board computer. A properly working prototype was assembled and demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power management module, which was thoroughly evaluated; it is estimated that it will allow for great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and manmade acoustic events, among others. Although its power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, its lower development time and, inherently, its cost. PMID:26076479

  9. Subtropical trace gas profiles determined by ground-based FTIR spectroscopy at Izaña (28° N, 16° W): Five-year record, error analysis, and comparison with 3-D CTMs

    Directory of Open Access Journals (Sweden)

    E. Cuevas

    2004-09-01

    Full Text Available Within the framework of the NDSC (Network for the Detection of Stratospheric Change), ground-based FTIR solar absorption spectra have been routinely recorded at Izaña Observatory (28° N, 16° W) on Tenerife Island since March 1999. By analyzing the shape of the absorption lines, and their different temperature sensitivities, the vertical distribution of the absorbers can be retrieved. Unique time series of subtropical profiles of O3, HCl, HF, N2O, and CH4 are presented. The effects of both dynamical and chemical annually varying cycles can be seen in the retrieved profiles. These include enhanced upwelling and photochemistry in summer and a more disturbed atmosphere in winter, which are typical of the subtropical stratosphere. A detailed error analysis has been performed for each profile. The output from two different three-dimensional (3-D) chemical transport models (CTMs), which are forced by ECMWF analyses, is compared to the measured profiles. Both models agree well with the measurements in tracking abrupt variations in the atmospheric structure, e.g. due to tropical streamers, in particular for the lower stratosphere. Simulated and measured profiles also reflect similar dynamical and chemical annual cycles. However, the differences between their mixing ratios clearly exceed the error bars estimated for the measured profiles. Possible reasons for this are discussed.

  10. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
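The detrended fluctuation analysis mentioned in this record can be sketched as follows; the white-noise input is a hypothetical stand-in for an actigraphy time series, where the slope of log F(n) versus log n would give the scaling exponent:

```python
import math
import random

def dfa(signal, scales):
    """Detrended fluctuation analysis: integrate the mean-removed
    signal into a profile, linearly detrend it within non-overlapping
    windows of each size, and return the RMS fluctuation per scale."""
    mean = sum(signal) / len(signal)
    profile, s = [], 0.0
    for x in signal:
        s += x - mean
        profile.append(s)
    flucts = []
    for n in scales:
        sq = []
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            t = range(n)
            tm = (n - 1) / 2.0
            ym = sum(seg) / n
            # least-squares linear fit inside the window
            denom = sum((ti - tm) ** 2 for ti in t)
            slope = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, seg)) / denom
            sq.extend((yi - ym - slope * (ti - tm)) ** 2
                      for ti, yi in zip(t, seg))
        flucts.append(math.sqrt(sum(sq) / len(sq)))
    return flucts

# Hypothetical activity series: white noise, for which F(n) grows
# roughly as n^0.5, so fluctuations increase with window size.
random.seed(1)
activity = [random.random() for _ in range(256)]
flucts = dfa(activity, scales=[4, 8, 16, 32])
```

A departure of the fitted exponent from that of healthy controls is the kind of biomarker the study evaluates alongside the survival-function slopes.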

  11. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    Science.gov (United States)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of directly observing foraging behavior, data from time-depth recorders and other types of behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Often the definitions of actual behavioral units and analytical approaches may vary substantially which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches that are used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries from a diversity of backgrounds including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases when temporal autocorrelation structure is useful for identifying behaviors of interest. 
Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.

  12. Diffusion of Electronic Medical Record Based Public Hospital Information Systems

    OpenAIRE

    Cho, Kyoung Won; Kim, Seong Min; An, Chang-Ho; Chae, Young Moon

    2015-01-01

    Objectives This study was conducted to evaluate the adoption behavior of a newly developed Electronic Medical Record (EMR)-based information system (IS) at three public hospitals in Korea, with a focus on doctors and nurses. Methods User satisfaction scores from four performance layers were analyzed before and twice after the newly developed system was introduced, to evaluate the adoption process of the IS with Rogers' diffusion theory. Results The 'intention to use' scores, the most importan...

  13. Recorded seismic response of a base-isolated steel bridge carrying a steel water pipe

    Science.gov (United States)

    Safak, E.; Brady, A.G.

    1989-01-01

    A set of strong motion records was obtained from the base-isolated Santa Ana River Pipeline Bridge during the magnitude 5.9 Whittier Narrows, California, earthquake of October 1, 1987. The analysis of the records shows that the level of excitation was not strong enough to fully activate the base isolators. The dominant modes of the response are the translations of the abutment-bridge-pipe system in the longitudinal and transverse directions, and the bending of the steel truss between supports in the vertical direction.

  14. 'Citizen science' recording of fossils by adapting existing computer-based biodiversity recording tools

    Science.gov (United States)

    McGowan, Alistair

    2014-05-01

    Biodiversity recording activities have been greatly enhanced by the emergence of online schemes and smartphone applications for recording and sharing data about a wide variety of flora and fauna. As a palaeobiologist, one of the areas of research I have been heavily involved in is the question of whether the amount of rock available to sample acts as a bias on our estimates of biodiversity through time. Although great progress has been made on this question over the past ten years by a number of researchers, I still think palaeontology has not followed the lead offered by the 'citizen science' revolution in studies of extant biodiversity. By constructing clearly structured surveys with online data collection support, it should be possible to collect field data on the occurrence of fossils at the scale of individual exposures, which are needed to test competing hypotheses about these effects at relatively small spatial scales. Such data collection would be hard to justify for universities and museums with limited personnel but a co-ordinated citizen science programme would be capable of delivering such a programme. Data collection could be based on the MacKinnon's Lists method, used in rapid conservation assessment work. It relies on observers collecting lists of a fixed length (e.g. 10 species long) but what is important is that it focuses on getting observers to ignore sightings of the same species until that list is complete. This overcomes the problem of 'common taxa being commonly recorded' and encourages observers to seek out and identify the rarer taxa. This gives a targeted but finite task. Rather than removing fossils, participants would be encouraged to take photographs to share via a recording website. The success of iSpot, which allows users to upload photos of plants and animals for other users to help with identifications, offers a model for overcoming the problems of identifying fossils, which can often look nothing like the examples illustrated in

  15. Analysis of astronomical records of King Wu's Conquest

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    All related astronomical records of King Wu's Conquest have been searched and analysed comprehensively. Constrained by the newest conclusions of archeology, philology and history in the Xia-Shang-Zhou Chronology Project, and based mainly on dates in Wucheng, Jupiter's position in Guoyu and information on the season, our first choice for the date of King Wu's Conquest is June 20, 1046 BC. This conclusion properly explains most of the relevant literature.

  16. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Science.gov (United States)

    2010-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record...

  17. Sales Records Based Recommender System for TPO-Goods

    Science.gov (United States)

    Saga, Ryosuke; Tsuji, Hiroshi

    This paper presents a recommender system for TPO (Time, Place, and Occasion)-dependent goods. TPO-dependent goods have three features: many attributes, multiformity, and high-frequency updates. In order to recommend alternatives among such goods, our system (a) abstracts and metrizes the user's preference implied in sales records, and (b) filters massive numbers of alternatives by three kinds of methods: High-Angle Search, Low-Angle Search and Neighbor Search. Additionally, this paper describes a method for improving recommendation accuracy by applying memory-based reasoning with the user's preference to the latter two kinds of search. A numerical simulation with data from 10,000 users and 400,000 sales records has demonstrated their accuracy.

  18. Break and trend analysis of EUMETSAT Climate Data Records

    Science.gov (United States)

    Doutriaux-Boucher, Marie; Zeder, Joel; Lattanzio, Alessio; Khlystova, Iryna; Graw, Kathrin

    2016-04-01

    EUMETSAT reprocessed imagery acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board Meteosat 8-9. The data cover the period from 2004 to 2012. Climate Data Records (CDRs) of atmospheric parameters such as Atmospheric Motion Vectors (AMV) as well as Clear and All Sky Radiances (CSR and ASR) have been generated. Such CDRs are mainly ingested by ECMWF to produce reanalysis data. In addition, EUMETSAT produced a long CDR (1982-2004) of land surface albedo exploiting imagery acquired by the Meteosat Visible and Infrared Imager (MVIRI) on board Meteosat 2-7. Such a CDR provides key information for climate analysis and climate models. Extensive validation has been performed for the surface albedo record, and a first validation of the winds and clear sky radiances has been done. All validation results demonstrated that the time series of all parameters appear homogeneous at first sight. Statistical science offers a variety of analysis methods that have been applied to further analyse the homogeneity of the CDRs. Many breakpoint analysis techniques depend on the comparison of two time series, which raises the issue that both may have breakpoints. This paper will present a quantitative and statistical analysis of possible breakpoints found in the MVIRI and SEVIRI CDRs, including attribution of breakpoints to changes of instruments and other events in the data series compared. The value of the different methods applied will be discussed, with suggestions on how to further develop this type of analysis for quality evaluation of CDRs.

  19. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

    Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documentation of diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record and F2 is the specific software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations entered 16,591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems: when copied together, all files sort as expected by patient, and for each patient in chronological order, providing a digital cumulative patient record that contains heterogeneous file formats. Conclusion: The file-based approach for storing heterogeneous digital patient-related information is a reliable system, which can handle open-source, proprietary, general and custom file formats and seems to be easily scalable. Further development of software for automatic checks of file integrity and for searching and indexing of the files is expected to produce a more user-friendly environment.
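
The M1M2M3.F2 scheme works because an ISO-style date field sorts lexicographically in chronological order, so a plain file sort yields the cumulative per-patient record. A minimal sketch of the idea (the identifiers and the separators here are hypothetical; the paper concatenates the fields without separators):

```python
from datetime import date

def record_name(patient_id, event_date, author, ext):
    # M1 = patient, M2 = date (ISO form: lexicographic == chronological), M3 = author
    return f"{patient_id}-{event_date.isoformat()}-{author}.{ext}"

files = [
    record_name("P0042", date(2010, 11, 3), "MD07", "txt"),
    record_name("P0042", date(2009, 5, 21), "MD02", "jpg"),
    record_name("P0017", date(2010, 1, 9), "MD07", "pdf"),
]
# A plain lexicographic sort groups files by patient, then chronologically
merged = sorted(files)
```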

  20. Network Analysis of Time-Lapse Microscopy Recordings

    Directory of Open Access Journals (Sweden)

    Seth Malmersjö

    2014-09-01

    Multicellular organisms rely on intercellular communication to regulate important cellular processes critical to life. To further our understanding of those processes there is a need to scrutinize dynamical signaling events and their functions in both cells and organisms. Here, we report a method and provide MATLAB code that analyzes time-lapse microscopy recordings to identify and characterize network structures within large cell populations, such as interconnected neurons. The approach is demonstrated using intracellular calcium (Ca2+) recordings in neural progenitors and cardiac myocytes, but could be applied to a wide variety of biosensors employed in diverse cell types and organisms. In this method, network structures are analyzed by applying cross-correlation signal processing and graph theory to single-cell recordings. The goal of the analysis is to determine if the single-cell activity constitutes a network of interconnected cells and to decipher the properties of this network. The method can be applied in many fields of biology in which biosensors are used to monitor signaling events in living cells. Analyzing intercellular communication in cell ensembles can reveal essential network structures that provide important biological insights.
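
The core step, building an adjacency matrix from pairwise correlations of single-cell traces and reading off graph properties, can be sketched as below. This is not the published MATLAB code: it uses zero-lag Pearson correlation rather than full cross-correlation over lags, and the threshold and synthetic traces are illustrative.

```python
import numpy as np

def correlation_network(traces, threshold=0.9):
    """traces: cells x timepoints -> boolean adjacency matrix of the network."""
    z = traces - traces.mean(axis=1, keepdims=True)
    z /= z.std(axis=1, keepdims=True)
    corr = (z @ z.T) / traces.shape[1]   # pairwise Pearson correlations
    adj = np.abs(corr) >= threshold      # edge when cells co-fluctuate strongly
    np.fill_diagonal(adj, False)
    return adj

# Two cells driven by the same oscillation, plus one independent noise trace
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 500)
traces = np.vstack([
    np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(500),
    np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(500),
    rng.standard_normal(500),
])
adj = correlation_network(traces)
degree = adj.sum(axis=1)                 # graph-theoretic degree per cell
```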

  1. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    This paper briefly introduces an improved method for evaluating the seismic fragilities of components of nuclear power plants in Korea. Engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are also discussed. For the purpose of evaluating the effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures, several comparative studies have been performed. The results show that seismic fragility analysis based on Newmark's spectra might overestimate the seismic capacities of Korean facilities. (author)

  2. Deconvolution of the tree-ring based δ13C record

    International Nuclear Information System (INIS)

    We assumed the tree-ring based 13C/12C record constructed by Freyer and Belacy (1983) to be representative of the fossil-fuel and forest-soil induced 13C/12C change for atmospheric CO2. Through the use of a modification of the Oeschger et al. ocean model, we have computed the contribution of the combustion of coal, oil, and natural gas to this observed 13C/12C change. A large residual remains when the tree-ring-based record is corrected for the contribution of fossil fuel CO2. A deconvolution was performed on this residual to determine the time history and magnitude of the forest-soil reservoir changes over the past 150 years. Several important conclusions were reached. (1) The magnitude of the integrated CO2 input from these sources was about 1.6 times that from fossil fuels. (2) The forest-soil contribution reached a broad maximum centered at about 1900. (3) Over the 2-decade period covered by the Mauna Loa atmospheric CO2 content record, the input from forests and soils was about 30% of that from fossil fuels. (4) The 13C/12C trend over the last 20 years was dominated by the input of fossil fuel CO2. (5) The forest-soil release did not contribute significantly to the secular increase in atmospheric CO2 observed over the last 20 years. (6) The pre-1850 atmospheric pCO2 values must have been in the range 245 to 270 × 10⁻⁶ atmospheres.

  3. The analysis and forecasting of male cycling time trial records established within England and Wales.

    Science.gov (United States)

    Dyer, Bryce; Hassani, Hossein; Shadi, Mehran

    2016-07-01

    The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length, and using time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition record progressions was undertaken. Future forecasts are then projected through use of the Singular Spectrum Analysis technique; this method has not been applied to sport-based time-series data before. All six records have seen progressive improvement and are non-linear in nature. Five records saw their highest level of record change during the 1950-1969 period. Whilst new records have generally become less frequent since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecasted projections in the short to medium term with a high level of fit to the time-series data. PMID:26708927
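
Basic SSA forecasting proceeds in a few steps: embed the series in a trajectory (Hankel) matrix, truncate its SVD to rank r, reconstruct by diagonal averaging, and extrapolate with the recurrent linear formula derived from the leading left singular vectors. The sketch below is a generic SSA implementation, not the method as configured in the paper; the window length L and rank r are illustrative, and the demo merely checks that a pure linear trend is continued exactly.

```python
import numpy as np

def ssa_forecast(series, L, r, steps):
    """Singular Spectrum Analysis: rank-r reconstruction plus recurrent forecast."""
    series = np.asarray(series, dtype=float)
    N = len(series)
    K = N - L + 1
    X = np.column_stack([series[i:i + L] for i in range(K)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]          # rank-r approximation
    # Diagonal averaging back to a series
    rec = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(K):
        rec[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    rec /= cnt
    # Recurrent forecasting coefficients from the leading left singular vectors
    P, pi = U[:L - 1, :r], U[-1, :r]
    R = P @ pi / (1.0 - pi @ pi)
    out = list(rec)
    for _ in range(steps):
        out.append(R @ np.asarray(out[-(L - 1):]))
    return np.array(out)

# Sanity check on a pure linear trend: the forecast should continue the line
forecast = ssa_forecast(np.arange(30.0), L=10, r=2, steps=5)
```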

  4. 76 FR 76103 - Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    ... Part 171 Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records..., as amended (5 U.S.C. 552a). Certain portions of the Risk Analysis and Management (RAM) Records, State... system, Risk Analysis and Management (RAM) Records, State-78, will support the vetting of...

  5. Quality Assurance in a Computer-Based Outpatient Record

    OpenAIRE

    Colloff, Edwin; Morgan, Mary; Beaman, Peter; Justice, Norma; Kunstaetter, Robert; Barnett, G. Octo

    1980-01-01

    COSTAR, a COmputer-STored Ambulatory Record system, was developed at the Massachusetts General Hospital Laboratory of Computer Science. It can supplement or entirely replace the paper medical record with a highly encoded record. Although a computer-stored medical record provides a unique opportunity for quality assurance activities, it requires programming skills to examine the data. We have taken the dual approach of writing pre-specified quality assurance packages and developing a high leve...

  6. Anonymization of Electronic Medical Records to Support Clinical Analysis

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2013-01-01

    Anonymization of Electronic Medical Records to Support Clinical Analysis closely examines the privacy threats that may arise from medical data sharing, and surveys the state-of-the-art methods developed to safeguard data against these threats. To motivate the need for computational methods, the book first explores the main challenges facing the privacy-protection of medical data using the existing policies, practices and regulations. Then, it takes an in-depth look at the popular computational privacy-preserving methods that have been developed for demographic, clinical and genomic data sharing, and closely analyzes the privacy principles behind these methods, as well as the optimization and algorithmic strategies that they employ. Finally, through a series of in-depth case studies that highlight data from the US Census as well as the Vanderbilt University Medical Center, the book outlines a new, innovative class of privacy-preserving methods designed to ensure the integrity of transferred medical data for su...

  7. Computational intelligence methods on biomedical signal analysis and data mining in medical records

    OpenAIRE

    Vladutu, Liviu-Mihai

    2004-01-01

    This thesis is centered around the development and application of computationally effective solutions based on artificial neural networks (ANN) for biomedical signal analysis and data mining in medical records. The ultimate goal of this work in the field of Biomedical Engineering is to provide the clinician with the best possible information needed to make an accurate diagnosis (in our case of myocardial ischemia) and to propose advanced mathematical models for recovering the complex de...

  8. Hardware issues in the movement to computer-based patient records.

    Science.gov (United States)

    Bunschoten, B; Deming, B

    1995-02-01

    The health care field is making significant progress in shifting to computer-based patient records. Providers are faced with some difficult decisions about which hardware options are most appropriate. Key issues include the choice of clinical workstations vs. portable computers, the use of new client-server architecture or traditional mainframe-based systems, and the role of personal computers. This special report offers an in-depth assessment of important hardware trends in the records automation movement. The first story offers an analysis of the hardware implications of client-server architecture and an assessment of the long-term role of mainframe computers. The second story sizes up the potential role for mobile computing, including hand-held devices and wireless technology. PMID:10143840

  9. Classifying Normal and Abnormal Status Based on Video Recordings of Epileptic Patients

    Directory of Open Access Journals (Sweden)

    Jing Li

    2014-01-01

    Based on video recordings of the movement of patients with epilepsy, this paper proposes a human action recognition scheme to detect distinct motion patterns and to distinguish the normal status from the abnormal status of epileptic patients. The scheme first extracts local features and holistic features, which are complementary to each other. Afterwards, a support vector machine is applied for classification. Based on the experimental results, this scheme obtains a satisfactory classification result and provides a fundamental analysis towards human-robot interaction with socially assistive robots in caring for patients with epilepsy (or other patients with brain disorders), in order to protect them from injury.

  10. Coral-based climate records from tropical South Atlantic

    DEFF Research Database (Denmark)

    Pereira, Natan S.; Sial, Alcides N.; Kikuchi, Ruy K.P.;

    2015-01-01

    Coral skeletons contain records of past environmental conditions due to their long life span and well calibrated geochemical signatures. C and O isotope records of corals are especially interesting, because they can highlight multidecadal variability of local climate conditions beyond the instrum...

  11. Detecting seismic activity with a covariance matrix analysis of data recorded on seismic arrays

    Science.gov (United States)

    Seydoux, L.; Shapiro, N. M.; de Rosny, J.; Brenguier, F.; Landès, M.

    2016-03-01

    Modern seismic networks are recording the ground motion continuously at the Earth's surface, providing dense spatial samples of the seismic wavefield. The aim of our study is to analyse these records with statistical array-based approaches to identify coherent time-series as a function of time and frequency. Using ideas mainly brought from the random matrix theory, we analyse the spatial coherence of the seismic wavefield from the width of the covariance matrix eigenvalue distribution. We propose a robust detection method that could be used for the analysis of weak and emergent signals embedded in background noise, such as the volcanic or tectonic tremors and local microseismicity, without any prior knowledge about the studied wavefields. We apply our algorithm to the records of the seismic monitoring network of the Piton de la Fournaise volcano located at La Réunion Island and composed of 21 receivers with an aperture of ˜15 km. This array recorded many teleseismic earthquakes as well as seismovolcanic events during the year 2010. We show that the analysis of the wavefield at frequencies smaller than ˜0.1 Hz results in detection of the majority of teleseismic events from the Global Centroid Moment Tensor database. The seismic activity related to the Piton de la Fournaise volcano is well detected at frequencies above 1 Hz.
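
The detection statistic can be illustrated with a simplified "spectral width" of the covariance-matrix eigenvalue distribution: near 0 when a single eigenvalue dominates (a spatially coherent wavefield across the array) and near 1 when the spectrum is flat (incoherent noise). The sketch below follows the spirit of the method rather than its exact definition; the array geometry and signals are synthetic.

```python
import numpy as np

def spectral_width(window):
    """Width of the covariance eigenvalue distribution for one time window
    (channels x samples), as a normalised spectral centroid in [0, 1]."""
    lam = np.sort(np.linalg.eigvalsh(np.cov(window)))[::-1]  # descending
    lam = lam / lam.sum()
    idx = np.arange(len(lam))
    return 2.0 * (lam @ idx) / (len(lam) - 1)

rng = np.random.default_rng(2)
n_ch, n_s = 21, 2000                      # 21 receivers, as in the paper's array
noise = rng.standard_normal((n_ch, n_s))  # incoherent background
coherent = np.outer(np.ones(n_ch), np.sin(np.linspace(0.0, 60.0, n_s))) \
           + 0.1 * rng.standard_normal((n_ch, n_s))  # coherent arrival
w_noise = spectral_width(noise)           # close to 1: flat eigenvalue spectrum
w_sig = spectral_width(coherent)          # close to 0: one dominant eigenvalue
```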

  12. Analysis of recorded earthquake response data at the Hualien large-scale seismic test site

    International Nuclear Information System (INIS)

    A soil-structure interaction (SSI) experiment is being conducted in a seismically active region in Hualien, Taiwan. To obtain earthquake data for quantifying SSI effects and providing a basis to benchmark analysis methods, a 1/4-scale cylindrical concrete containment model, similar in shape to that of a nuclear power plant containment, was constructed in the field, where both the containment model and its surrounding soil, surface and sub-surface, are extensively instrumented to record earthquake data. Between September 1993 and May 1995, eight earthquakes with Richter magnitudes ranging from 4.2 to 6.2 were recorded. The author focuses on studying and analyzing the recorded data to provide information on the response characteristics of the Hualien soil-structure system, the SSI effects and the ground motion characteristics. An effort was also made to directly determine the site soil physical properties based on correlation analysis of the recorded data. No modeling simulations were attempted to analytically predict the SSI response of the soil and the structure; these will be the scope of a subsequent study.

  13. Investigation of the frequency content of ground motions recorded during strong Vrancea earthquakes, based on deterministic and stochastic indices

    OpenAIRE

    Iolanda-Gabriela CRAIFALEANU

    2013-01-01

    The paper presents results from a recent study in progress, involving an extensive analysis, based on several deterministic and stochastic indices, of the frequency content of ground motions recorded during strong Vrancea seismic events. The study, continuing those initiated by Lungu et al. in the early nineties, aims to better reveal the characteristics of the analyzed ground motions. Over 300 accelerograms, recorded during the strong Vrancea seismic events mentioned above and recently re-di...

  14. ANALYSIS OF VIBROACOUSTIC SIGNALS RECORDED IN THE PASSENGER LIFT CABIN

    Directory of Open Access Journals (Sweden)

    Kamil Szydło

    2016-06-01

    The article presents an analysis of the authors' own tests of the accelerations, sound pressure level and sound power emitted by a passenger lift cabin under different technical conditions of the lift. For a group of lifting devices, accelerations were measured along three axes with an accelerometer placed in the central part of the cabin, while the acoustic parameters were measured simultaneously with a sound analyzer equipped with a double-microphone probe. An attempt was made to determine the impact of the frame-cabin system construction, as well as of the lift's technical condition, on the recorded parameters. This can help establish limit values of the lift structure parameters below which riding comfort drops rapidly, and identify the construction elements whose modification would most improve the quietness of operation.

  15. Developing a personal-computer-based records retention system using Paradox™

    Energy Technology Data Exchange (ETDEWEB)

    Sprouse, B.; Wray, S.

    1993-10-01

    Many records managers are confronted with large caches of records stored in corners, attics, or warehouses that seem to be "out of sight, out of mind." Much of this information becomes "lost" because it is not properly identified and cataloged. Perhaps the records have always been stored in these places because of the lack of an alternative. In these situations, the records manager must organize and catalog the records and provide solutions to the records management and storage problems. A simple personal-computer-based records management system can be developed that will provide organization, accountability, and retrievability of the records. By developing a basic database structure and implementing some basic records management principles, a records manager can gain control of even the most extreme displays of records mismanagement. This paper will discuss practical ways of establishing a records system that provides for database tracking using off-the-shelf database software packages. Database examples using Paradox software will be used to explain the basic concepts for developing records systems. The paper will also discuss developing and performing a records assessment, researching applicable requirements, writing a records management plan, implementing the records system, and testing and modifying the system.

  16. A Neuromorphic Event-Based Neural Recording System for Smart Brain-Machine-Interfaces.

    Science.gov (United States)

    Corradi, Federico; Indiveri, Giacomo

    2015-10-01

    Neural recording systems are a central component of Brain-Machine Interfaces (BMIs). In most of these systems the emphasis is on faithful reproduction and transmission of the recorded signal to remote systems for further processing or data analysis. Here we follow an alternative approach: we propose a neural recording system that can be directly interfaced locally to neuromorphic spiking neural processing circuits for compressing the large amounts of data recorded, carrying out signal processing and neural computation to extract relevant information, and transmitting only the low-bandwidth outcome of the processing to remote computing or actuating modules. The fabricated system includes a low-noise amplifier, a delta-modulator analog-to-digital converter, and a low-power band-pass filter. The bio-amplifier has a programmable gain of 45-54 dB, with a Root Mean Squared (RMS) input-referred noise level of 2.1 μV, and consumes 90 μW. The band-pass filter and delta-modulator circuits include asynchronous handshaking interface logic compatible with event-based communication protocols. We describe the properties of the neural recording circuits, validating them with experimental measurements, and present system-level application examples, by interfacing these circuits to a reconfigurable neuromorphic processor comprising an array of spiking neurons with plastic and dynamic synapses. The pool of neurons within the neuromorphic processor was configured to implement a recurrent neural network, and to process the events generated by the neural recording system in order to carry out pattern recognition. PMID:26513801
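
The delta-modulator ADC mentioned in this abstract converts an analogue input into a sparse stream of UP/DOWN events rather than periodic samples. A behavioural sketch of that encoding is given below (illustrative Python with a hypothetical step size; the actual circuit operates asynchronously in analogue hardware):

```python
def delta_modulate(signal, delta):
    """Encode a signal as UP/DOWN events: emit an event whenever the input
    moves more than `delta` away from the last reconstructed level."""
    events = []                      # (sample_index, +1 or -1)
    level = signal[0]
    for i, s in enumerate(signal):
        while s - level >= delta:    # input rose past the next step: UP event
            level += delta
            events.append((i, +1))
        while level - s >= delta:    # input fell past the next step: DOWN event
            level -= delta
            events.append((i, -1))
    return events

def reconstruct(events, n, start, delta):
    """Rebuild a staircase approximation of the input from the event stream."""
    out = []
    level = start
    idx = 0
    for i in range(n):
        while idx < len(events) and events[idx][0] == i:
            level += events[idx][1] * delta
            idx += 1
        out.append(level)
    return out

events = delta_modulate([0.0, 0.5, 1.2, 0.9, 0.1], delta=0.5)
decoded = reconstruct(events, 5, start=0.0, delta=0.5)
```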

  17. Tussiphonographic analysis of cough sound recordings performed by Schmidt-Voigt and Hirschberg and Szende.

    Science.gov (United States)

    Korpás, J; Kelemen, S

    1987-01-01

    The cough sound records published by Schmidt-Voigt and Hirschberg and Szende were submitted to tussiphonographic analysis. It has been established that all the recordings of various types of cough sounds registered in airway disease were of pathological character in the tussiphonographic recordings. It has repeatedly been confirmed that tussiphonography is a suitable means for screening of respiratory diseases. PMID:3434295

  18. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
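
The idea of exploiting a known time delay can be illustrated with a classical second-order method in the same family (AMUSE-style): whiten the mixtures, then eigendecompose the symmetrised lag-tau covariance, so the component with the strongest autocorrelation at the chosen lag emerges first. This is a generic stand-in, not the authors' algorithm, and the "fetal"/"maternal" signals and mixing matrix below are entirely synthetic.

```python
import numpy as np

def extract_delayed_source(X, tau):
    """AMUSE-style extraction: whiten the mixtures, then eigendecompose the
    symmetrised lag-tau covariance; components come out ordered by their
    autocorrelation at lag tau."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    W = E @ np.diag(d ** -0.5) @ E.T          # whitening matrix
    Z = W @ X
    Ct = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
    Ct = (Ct + Ct.T) / 2.0                    # symmetrise the lagged covariance
    vals, vecs = np.linalg.eigh(Ct)
    order = np.argsort(vals)[::-1]            # strongest lag-tau correlation first
    return vecs[:, order].T @ Z

# Synthetic demo: a weak periodic "fetal" trace under a dominant "maternal" one
rng = np.random.default_rng(3)
n = 4000
fetal = np.sin(2 * np.pi * np.arange(n) / 25.0)
maternal = np.sin(2 * np.pi * np.arange(n) / 7.3)
A = np.array([[1.0, 5.0], [0.6, 4.0]])        # mixing; maternal dominates
X = A @ np.vstack([fetal, maternal]) + 0.01 * rng.standard_normal((2, n))
Y = extract_delayed_source(X, tau=25)         # tau matched to the fetal period
```

Choosing tau equal to the period of the signal of interest makes its lag-tau autocorrelation close to 1 while the interfering signal's is smaller, which is what lets the eigen-ordering pull it out first.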

  19. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    International Nuclear Information System (INIS)

Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.

  20. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Science.gov (United States)

    de Araujo, Draulio B.; Kardec Barros, Allan; Estombelo-Montesco, Carlos; Zhao, Hui; Roque da Silva Filho, A. C.; Baffa, Oswaldo; Wakai, Ronald; Ohnishi, Noboru

    2005-10-01

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.

  1. Julius – a template based supplementary electronic health record system

    Directory of Open Access Journals (Sweden)

    Klein Gunnar O

    2007-05-01

    Full Text Available Abstract Background EHR systems are widely used in hospitals and primary care centres, but it is usually difficult to share information and to collect patient data for clinical research. This is partly due to different proprietary information models and inconsistent data quality. Our objective was to provide a more flexible solution that enables clinicians to define which data should be recorded and shared, both for routine documentation and for clinical studies. The data should be reusable through a common set of variable definitions providing a consistent nomenclature and validation of data. Another objective was that the templates used for data entry and presentation should be usable in combination with existing EHR systems. Methods We designed and developed a template-based system (called Julius) that was integrated with existing EHR systems. The system is driven by medical domain knowledge defined by clinicians in the form of templates and variable definitions stored in a common data repository. The system architecture consists of three layers. The presentation layer is purely web-based, which facilitates integration with existing EHR products. The domain layer consists of the template design system, a variable/clinical-concept definition system, and the transformation and validation logic, all implemented in Java. The data-source layer utilizes an object-relational mapping tool and a relational database. Results The Julius system has been implemented, tested and deployed to three health care units in Stockholm, Sweden. The initial responses from the pilot users were positive. The template system facilitates patient data collection in many ways. The experience of using the template system suggests that enabling clinicians to be in control of the system is a good way to add supplementary functionality to present EHR systems.
Conclusion The approach of the template system in combination with various local EHR

  2. Modal identification of boiler plant structures on AR spectral analysis of seismic records

    International Nuclear Information System (INIS)

    This paper deals with a modal identification method for large-scale structures such as boiler plants in thermal power stations. Practical and accurate modal identification has been carried out by the proposed method, which is composed of two stages: processing frequency transfer functions by autoregressive (AR) spectral analysis, and a curve-fitting technique to extract modal parameters. Seismic base-acceleration records and acceleration records at various points of the structure are used as multi-output data. The method is examined using time-series data from a seismic response simulation. Two techniques, decimation of the time data and the FPE criterion for optimizing the order of the AR models, make the identification effective and accurate. The method has actually been applied to seismic observation data of boiler plants in operation. As a result of this study, the authors' modal identification has proven to be effective for seismic modeling of large-scale structures.

  3. Revised estimates of Greenland ice sheet thinning histories based on ice-core records

    DEFF Research Database (Denmark)

    Lecavalier, B.S.; Milne, G.A.; Fisher, D.A.;

    2013-01-01

    Ice core records were recently used to infer elevation changes of the Greenland ice sheet throughout the Holocene. The inferred elevation changes show a significantly greater elevation reduction than those output from numerical models, bringing into question the accuracy of the model-based... reconstructions and, to some extent, the estimated elevation histories. A key component of the ice core analysis involved removing the influence of vertical surface motion on the δ18O signal measured from the Agassiz and Renland ice caps. We re-visit the original analysis with the intent to determine if the use... and surface loading also acts to improve the data-model fits such that the residuals at all four sites for the period 8 ka BP to present are significantly reduced compared to the original analysis. Prior to 8 ka BP, the possible influence of Innuitian ice on the inferred elevation histories prevents...

  4. Reconstructing sacred landscapes from soils-based records

    Science.gov (United States)

    Simpson, Ian; Gilliland, Krista; Coningham, Robin; Manuel, Mark; Davis, Christopher; Strickland, Keir; Acharya, Kosh; Hyland, Katherine; Bull, Ian; Kinnaird, Timothy; Sanderson, David

    2015-04-01

    From soils- and sediments-based records we reconstruct the development of the sacred landscape at Lumbini UNESCO World Heritage Site in the central Nepalese Terai, the birthplace of the Buddha, founder of a world religion, and now a major place of pilgrimage to its temple site. The Terai is a plain less than 100 m above sea level with incising rivers that originate in the Churia Hills and flow to the Ganges. Alluvial sediments on the Terai plain, originating as laterite soils within the hills, are characterised by a range of textural classes rich in iron oxides and manganese, with sandier sediments near water sources and finer textures near the distal ends of alluvial reaches. Our objectives are to establish a chronological framework for occupation, identify influences of alluvial environments on site occupation, and determine the process of secular and sacred site formation within the World Heritage Site. A set of key stratigraphies are the basis for our analyses and are located in a palaeo-channel adjacent to the temple site, within the temple site itself, and within the mound of the original Lumbini village. Optically stimulated luminescence (OSL) measurements of soils and sediments, together with supporting single-entity radiocarbon measurements, provide robust chronological frameworks. Assessment of field properties, thin-section micromorphology and organic biomarkers offers new insight into the intimate and complex relationships between natural, cultural and culturally mediated processes in landscape development. Integration of our findings allows a detailed narrative of cultural landscape development at Lumbini. The area was occupied from ca. 1,500 BC, first of all by a transient community who used the area for product storage and who were subject to persistent flooding with periodic major flood events. Subsequent occupation deliberately raised a permanent village settlement above the level of flooding, with associated managed field cultivation. 
Village life was

  5. Procedure for the record, calculation and analysis of costs at the Post Company of Cuba.

    Directory of Open Access Journals (Sweden)

    María Luisa Lara Zayas

    2012-12-01

    Full Text Available Cuban companies are immersed in important changes leading to a new economic model that requires increased labour productivity and greater economic efficiency through the rational use of material, financial and human resources. The present work proposes a procedure, based on the application of cost techniques, for the recording, calculation and analysis of the costs of activities at the Post Company of Cuba in Sancti Spiritus, with the objective of achieving greater efficiency through the rational use of resources.

  6. Android-based access to holistic emergency care record.

    Science.gov (United States)

    Koufi, Vassiliki; Malamateniou, Flora; Prentza, Andriana; Vassilacopoulos, George

    2013-01-01

    This paper is concerned with the development of an Emergency Medical Services (EMS) system which interfaces with a Holistic Emergency Care Record (HECR) that aims at managing emergency care holistically by supporting EMS processes and is accessible by Android-enabled mobile devices. PMID:23823406

  7. Area Disease Estimation Based on Sentinel Hospital Records

    OpenAIRE

    Yang, Yang; Wang, Jin-feng; Reis, Ben Y.; Hu, Mao-Gui; Christakos, George; Yang, Wei-Zhong; Sun, Qiao; Li, Zhong-Jie; Li, Xiao-Zhou; Lai, Sheng-Jie; Chen, Hong-Yan; Wang, Dao-Chen

    2011-01-01

    Background Population health attributes (such as disease incidence and prevalence) are often estimated using sentinel hospital records, which are subject to multiple sources of uncertainty. When applied to these health attributes, commonly used biased estimation techniques can lead to false conclusions and ineffective disease intervention and control. Although some estimators can account for measurement error (in the form of white noise, usually after de-trending), most mainstream health stat...

  8. Some methods for dynamic analysis of the scalp recorded EEG.

    Science.gov (United States)

    Pribram, K H; King, J S; Pierce, T W; Warren, A

    1996-01-01

    This paper describes methods for quantifying the spatiotemporal dynamics of EEG. Development of these methods was motivated by watching computer-generated animations of EEG voltage records. These animations contain a wealth of information about the pattern of change across time in the voltages observed across the surface of the scalp. In an effort to quantify this pattern of changing voltages, we elected to extract a single quantifiable feature from each measurement epoch: the highest squared voltage among the various electrode sites. Nineteen channels of EEG were collected from subjects using an electrode cap with standard 10-20 system placements. Two-minute records were obtained, each sampled at a rate of 200 samples per second. Thirty seconds of artifact-free data were extracted from each two-minute record. An algorithm then determined the location of the channel with the greatest amplitude for each 5 msec sampling epoch. We quantified these spatio-temporal dynamics as scalars, vectors and cluster-analytic plots of EEG activity for finger tapping, cognitive effort (counting backwards) and relaxation to illustrate the utility of the techniques. PMID:8813416
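
As a rough illustration of the feature described above (the channel with the highest squared voltage in each sampling epoch), the following sketch applies the idea to synthetic data; at 200 samples per second a 5 ms epoch holds a single sample per channel, so the feature reduces to a per-sample argmax. The array shapes and the artificially dominant channel are assumptions for demonstration only.

```python
import numpy as np

# Synthetic stand-in for 30 s of artifact-free EEG: 19 channels at 200 Hz
rng = np.random.default_rng(0)
n_channels, fs, seconds = 19, 200, 30
eeg = rng.standard_normal((n_channels, fs * seconds))
eeg[7] *= 3.0   # hypothetical dominant electrode site, for illustration

# One 5 ms epoch = one sample at 200 Hz, so the feature is the argmax
# of the squared voltage across channels at every sample
peak_channel = np.argmax(eeg ** 2, axis=0)
occupancy = np.bincount(peak_channel, minlength=n_channels) / peak_channel.size
```

The `occupancy` vector (fraction of epochs each site "wins") is one scalar summary of the spatio-temporal pattern that could then feed clustering or vector plots.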

  9. Foetal heart rate recording: analysis and comparison of different methodologies

    OpenAIRE

    Ruffo, Mariano

    2011-01-01

    Monitoring foetal health is a very important task in clinical practice to appropriately plan pregnancy management and delivery. In the third trimester of pregnancy, ultrasound cardiotocography is the most employed diagnostic technique: foetal heart rate and uterine contractions signals are simultaneously recorded and analysed in order to ascertain foetal health. Because ultrasound cardiotocography interpretation still lacks of complete reliability, new parameters and methods of interpreta...

  10. SCRIPT: A framework for scalable real-time IP flow record analysis

    OpenAIRE

    Morariu, C.; Racz, P.; Stiller, B.

    2010-01-01

    Analysis of IP traffic is highly important, since it determines the starting point of many network management operations, such as intrusion detection, network planning, network monitoring, or accounting and billing. One of the most utilized metering data formats in analysis applications are IP (Internet Protocol) flow records. With the increase of IP traffic, such traffic analysis applications need to cope with a constantly increasing number of flow records. Typically, centralized approaches ...

  11. Electronic system for recording proportional counter rare pulses with the pulse shape analysis

    International Nuclear Information System (INIS)

    The automated system for recording rare proportional-counter pulses is described. The proportional counters are aimed at identification of 37Ar and 71Ge decays in chemical solar-neutrino detectors. In addition to two-parameter selection of events (measurement of the pulse amplitude in a slow channel and of the amplitude of the pulse differentiated with a time constant of about 10 ns in a parallel fast channel), the system records pulse shapes by means of a storage oscilloscope and a TV display. Pulse discrimination by front rise rate decreases the background in the 55Fe range (5.9 keV) by a factor of 6; visual analysis of the recorded pulse shapes decreases the background by a further 25-30%. A background counting rate in the 55Fe range of 1 pulse per 1.5 days is obtained when using the installation described above, together with a passive Pb shield 5 cm thick and an active shield based on an anticoincidence NaI(Tl) detector, the counter cathode being 5.6 mm in diameter and made of Fe fabricated by zone melting. The installation allows a background level of 0.6 pulse/day to be reached (a total background attenuation factor of 400). Further background reduction is expected from locating the installation in the low-noise underground laboratory of the Baksan Neutrino Observatory.
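
The discrimination by front rise rate mentioned above can be sketched as a simple ratio of the peak of the differentiated pulse (fast channel) to the pulse amplitude (slow channel); this is a simplified stand-in, not the instrument's electronics. The double-exponential pulse shapes, time base and acceptance threshold are all illustrative assumptions.

```python
import numpy as np

def rise_ratio(pulse, dt):
    """Peak slope of the pulse front (the 'fast' differentiated channel)
    divided by the pulse amplitude (the 'slow' channel)."""
    return np.max(np.diff(pulse) / dt) / np.max(pulse)

# Illustrative pulse shapes: identical decay, different front rise times
t = np.linspace(0.0, 50.0, 500)
dt = t[1] - t[0]
fast_pulse = (1 - np.exp(-t / 0.5)) * np.exp(-t / 20)  # fast-rising, point-like ionization
slow_pulse = (1 - np.exp(-t / 5.0)) * np.exp(-t / 20)  # slow-rising, extended ionization

r_fast = rise_ratio(fast_pulse, dt)
r_slow = rise_ratio(slow_pulse, dt)
is_point_like = r_fast > 2.0 * r_slow   # illustrative discrimination threshold
```

Thresholding on this ratio separates fast-rising point-like events from slow-rising extended ionization, which is the essence of rise-rate background rejection.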

  12. Comparing the security risks of paper-based and computerized patient record systems

    Science.gov (United States)

    Collmann, Jeff R.; Meissner, Marion C.; Tohme, Walid G.; Winchester, James F.; Mun, Seong K.

    1997-05-01

    How should hospital administrators compare the security risks of paper-based and computerized patient record systems? There is a general tendency to assume that, because computer networks potentially provide broad access to hospital archives, computerized patient records are less secure than paper records and increase the risk of breaches of patient confidentiality. This assumption is ill-founded on two grounds. There are reasons to believe that the computerized patient record provides better access to patient information while enhancing overall information system security. And a range of options with different trade-offs between access and security exists in both paper-based and computerized records-management systems. The relative accessibility and security of any particular patient record management system depends, therefore, on administrative choice, not simply on the intrinsic features of paper or computerized information management systems.

  13. Recording and Analysis of Tsetse Flight Responses in Three Dimensions

    International Nuclear Information System (INIS)

    Recording and analysing three-dimensional (3D) motions of tsetse flies in flight are technically challenging due to their speed of flight. However, video recording of tsetse fly flight responses has already been made in both wind tunnels and the field. The aim of our research was to study the way tsetse flies exploit host odours and visual targets during host searching. Such knowledge can help in the development of better trapping devices. We built a wind tunnel where it is possible to control environmental parameters, e.g. temperature, relative humidity and light. The flight of the flies was filmed from above with two high-speed Linux-embedded cameras equipped with fish-eye objectives viewing at 60° from one another. The synchronized stereo images were used to reconstruct the trajectory of flies in 3D and in real time. Software permitted adjustment for parameters such as luminosity and size of the tsetse species being tracked. Interpolation permitted us to calculate flight coordinates and to measure modifications of flight parameters such as acceleration, velocity, rectitude, angular velocity and curvature according to the experimental conditions. Using this system we filmed the responses of Glossina brevipalpis Newstead, obtained from a colony at the IAEA Entomology Unit, Seibersdorf, Austria, to human breath presented with and without a visual target. Flights of up to 150 s duration and covering up to 153 m were recorded. G. brevipalpis flights to human breath were characterized by wide undulations along the course. When a visual target was placed in the plume of breath, flights of G. brevipalpis were more tightly controlled, i.e. slower and more directed. This showed that after multiple generations in a laboratory colony G. brevipalpis was still capable of complex behaviours during bloodmeal searching. (author)
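
The flight parameters mentioned above (velocity, acceleration, curvature) can be computed from reconstructed 3D coordinates by numerical differentiation. The sketch below assumes uniformly sampled positions and uses a helical test trajectory whose analytic speed and curvature are known; it is a generic illustration, not the authors' tracking software.

```python
import numpy as np

def flight_parameters(xyz, dt):
    """Velocity, acceleration, speed and curvature from sampled 3D positions."""
    v = np.gradient(xyz, dt, axis=0)                             # velocity vectors
    a = np.gradient(v, dt, axis=0)                               # acceleration vectors
    speed = np.linalg.norm(v, axis=1)
    kappa = np.linalg.norm(np.cross(v, a), axis=1) / speed ** 3  # curvature |v x a| / |v|^3
    return v, a, speed, kappa

# Helical test trajectory: radius 2 m, pitch parameter 0.5, unit angular rate,
# so analytically speed = sqrt(2^2 + 0.5^2) and curvature = 2 / (2^2 + 0.5^2)
dt = 0.01
t = np.arange(0.0, 5.0, dt)
xyz = np.column_stack([2 * np.cos(t), 2 * np.sin(t), 0.5 * t])

v, a, speed, kappa = flight_parameters(xyz, dt)
mid = t.size // 2
```

Central differences (`np.gradient`) keep the estimates second-order accurate away from the track endpoints, which is usually adequate for smoothly interpolated trajectories.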

  14. Obesity research based on the Copenhagen School Health Records Register

    DEFF Research Database (Denmark)

    Baker, Jennifer L; Sørensen, Thorkild I A

    2011-01-01

    INTRODUCTION: To summarise key findings from research performed using data from the Copenhagen School Health Records Register over the last 30 years, with a main focus on obesity-related research. The register contains computerised anthropometric information on 372,636 schoolchildren from the capital city of Denmark. Additional information on the cohort members has been obtained via linkages with population studies and national registers. RESEARCH TOPICS: Studies using data from the register have made important contributions in the areas of the aetiology of obesity, the development of the obesity epidemic, and the long-term health consequences of birth weight as well as body size and growth in childhood. CONCLUSION: Research using this unique register is ongoing, and its contributions to the study of obesity as well as other topics will continue for years to come.

  15. Practical analysis of tide gauges records from Antarctica

    Science.gov (United States)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

    We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea-level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The resulting time series have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP, the trend is 1.8 ± 0.2 mm/yr and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA. Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trend of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which are interpreted
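
The stack-and-regress procedure described above can be sketched generically: average the station records, fit a trend (plus a quadratic term for acceleration) by least squares, then remove the modeled GIA rate. The two synthetic station records are assumptions for illustration; the -1.1 mm/yr GIA value is taken from the abstract.

```python
import numpy as np

def fit_trend_acceleration(years, sea_level_mm):
    """Least-squares fit sl = a + b*t + 0.5*c*t^2 about the epoch midpoint;
    returns trend b (mm/yr) and acceleration c (mm/yr^2)."""
    t = years - years.mean()
    c2, c1, _ = np.polyfit(t, sea_level_mm, 2)   # coefficients, highest degree first
    return c1, 2.0 * c2

# Two illustrative station records, stacked (averaged) year by year
years = np.arange(1957, 2014, dtype=float)
station_a = 1.8 * (years - 1957) + np.sin(years)   # ~1.8 mm/yr plus wiggles
station_b = 2.4 * (years - 1957) + np.cos(years)   # ~2.4 mm/yr plus wiggles
stack = np.mean([station_a, station_b], axis=0)

trend, accel = fit_trend_acceleration(years, stack)
gia_corrected = trend - (-1.1)   # subtract the modeled GIA rate (mm/yr)
```

Centering the time axis before fitting decorrelates the trend and acceleration estimates, so the reported rate refers to the midpoint of the record.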

  16. A near real-time satellite-based global drought climate data record

    International Nuclear Information System (INIS)

    Reliable drought monitoring requires long-term and continuous precipitation data. High resolution satellite measurements provide valuable precipitation information on a quasi-global scale. However, their short lengths of record limit their applications in drought monitoring. In addition, long-term low resolution satellite-based gauge-adjusted data sets, such as that of the Global Precipitation Climatology Project (GPCP), are not available in near-real-time form for timely drought monitoring. This study bridges the gap between low resolution long-term satellite gauge-adjusted data and the emerging high resolution satellite precipitation data sets to create a long-term climate data record of droughts. To accomplish this, a Bayesian correction algorithm is used to combine GPCP data with real-time satellite precipitation data sets for drought monitoring and analysis. The results showed that the combined data sets after the Bayesian correction were a significant improvement over the uncorrected data. Furthermore, several recent major droughts, such as the 2011 Texas, 2010 Amazon and 2010 Horn of Africa droughts, were detected in the combined real-time and long-term satellite observations. This highlights the potential application of satellite precipitation data for regional to global drought monitoring. The final product is a real-time, data-driven, satellite-based standardized precipitation index that can be used for drought monitoring, especially over remote and/or ungauged regions. (letter)
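
The article's Bayesian correction algorithm is not spelled out here; a minimal stand-in for the idea is a conjugate Gaussian (precision-weighted) merge of a long-term gauge-adjusted estimate with a noisier real-time satellite retrieval. All the numbers below are illustrative assumptions, not values from the study.

```python
def bayes_combine(prior_mean, prior_var, obs, obs_var):
    """Conjugate Gaussian update: precision-weighted posterior mean and variance."""
    w = prior_var / (prior_var + obs_var)          # weight given to the observation
    post_mean = prior_mean + w * (obs - prior_mean)
    post_var = prior_var * obs_var / (prior_var + obs_var)
    return post_mean, post_var

# Monthly precipitation (mm): a long-term GPCP-like climatological estimate
# (prior) merged with a noisier real-time satellite retrieval (observation)
post_mean, post_var = bayes_combine(prior_mean=80.0, prior_var=25.0,
                                    obs=110.0, obs_var=100.0)
```

The posterior variance is always smaller than either input variance, which is why the corrected series improves on both the climatology and the raw real-time retrieval.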

  17. Analysis of In Mine Acoustic Recordings for Single Fired Explosions

    Science.gov (United States)

    McKenna, S.; Hayward, C.; Stump, B.

    2003-12-01

    In August 2003, a series of single-fired test shots was executed at a copper mine in Arizona. The ten shots, fired on August 18 and 19, 2003, ranged in size from 1700 lbs to 13600 lbs, in simultaneously detonated patterns ranging from a single hole to eight holes. All were located within the same pit and within 100 m of each other. Both free-face and bench shots were included. Southern Methodist University had previously deployed a set of acoustic gauges ringing the active production areas of the mine. The five Validyne DP250 sensors recorded not only the ten test shots but also seven delay-fired production shots over the four-day period from August 18 to 21, 2003. Each recorded blast arrival was analyzed for peak amplitude and spectrum. Signals were then compared for variability between shots and sensors, as well as between fully contained and poorly contained shots. Blast yield, scaled depth, and centroid depth were compared to the above measured quantities for each of the single-fired and production shots.

  18. Frequency analysis of electroencephalogram recorded from a bottlenose dolphin (Tursiops truncatus) with a novel method during transportation by truck

    OpenAIRE

    Hashio, Fuyuko; Tamura, Shinichi; Okada, Yasunori; Morimoto, Shigeru; Ohta, Mitsuaki; Uchida, Naoyuki

    2010-01-01

    In order to obtain information regarding the correlation between an electroencephalogram (EEG) and the state of a dolphin, we developed a noninvasive recording method of EEG of a bottlenose dolphin (Tursiops truncatus) and an extraction method of true-EEG (EEG) from recorded-EEG (R-EEG) based on a human EEG recording method, and then carried out frequency analysis during transportation by truck. The frequency detected in the EEG of dolphin during apparent awakening was divided conveniently in...

  19. Speech watermarking: an approach for the forensic analysis of digital telephonic recordings.

    Science.gov (United States)

    Faundez-Zanuy, Marcos; Lucena-Molina, Jose J; Hagmüller, Martin

    2010-07-01

    In this article, the authors discuss the problem of forensic authentication of digital audio recordings. Although forensic audio has been addressed in several articles, the existing approaches are focused on analog magnetic recordings, which are less prevalent because of the large number of digital recorders available on the market (optical, solid state, hard disks, etc.). An approach based on digital signal processing that consists of spread-spectrum techniques for speech watermarking is presented. This approach has the advantage that the authentication is based on the signal itself rather than the recording format. Thus, it is valid for the usual recording devices in police-controlled telephone intercepts. In addition, our proposal allows for the introduction of relevant information such as the recording date and time and all the relevant data (this is not always possible with classical systems). Our experimental results reveal that the speech watermarking procedure does not interfere in a significant way with subsequent forensic speaker identification. PMID:20412360
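
A minimal sketch of additive spread-spectrum watermarking with correlation detection, in the spirit of (but not identical to) the approach described above: a key-seeded pseudo-random chip sequence is added at low amplitude and later detected by correlating the signal against the same sequence. The signal statistics, key and embedding strength are illustrative assumptions.

```python
import numpy as np

def embed(signal, key, strength):
    """Add a key-seeded pseudo-random +/-1 chip sequence at low amplitude."""
    chip = np.random.default_rng(key).choice([-1.0, 1.0], size=signal.size)
    return signal + strength * chip, chip

def detect(signal, chip):
    """Correlation detector: mean product is near the embedding strength
    when the mark is present and near zero otherwise."""
    return float(np.mean(signal * chip))

rng = np.random.default_rng(42)
speech = 0.3 * rng.standard_normal(200000)     # stand-in for a speech signal
marked, chip = embed(speech, key=1234, strength=0.01)

score_marked = detect(marked, chip)
score_clean = detect(speech, chip)
```

Because the chip sequence is reproducible from the secret key, only a party holding the key can verify (or forge) the mark, which is what ties authentication to the signal rather than the recording format.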

  20. Factors influencing consumer adoption of USB-based Personal Health Records in Taiwan

    Directory of Open Access Journals (Sweden)

    Jian Wen-Shan

    2012-08-01

    Full Text Available Abstract Background Usually patients receive healthcare services from multiple hospitals, and consequently their healthcare data are dispersed over many facilities' paper- and electronic-based record systems. Therefore, many countries have encouraged research on data interoperability, access, and patient authorization. This study is an important part of a national project to build an information exchange environment for cross-hospital digital medical records, carried out by the Department of Health (DOH) of Taiwan in May 2008. The key objective of the core project is to set up a portable data exchange environment in order to enable people to maintain and own their essential health information. This study is aimed at exploring the factors influencing behavior and adoption of USB-based Personal Health Records (PHR) in Taiwan. Methods Quota sampling was used, and structured questionnaires were distributed to the outpatient departments at ten medical centers which participated in the DOH project to establish the information exchange environment across hospitals. A total of 3000 questionnaires were distributed and 1549 responses were collected; of those, 1465 were valid, for a response rate of 48.83%. Results 1025 out of 1465 respondents expressed their willingness to apply for the USB-PHR. Detailed analysis of the data showed a remarkable difference in usage intention between the PHR adopters and non-adopters (χ2 = 182.4, p < …). Conclusions Higher Usage Intention, Perceived Usefulness and Subjective Norm of patients were found to be the key factors influencing PHR adoption. Thus, we suggest that government and hospitals should promote the potential usefulness of PHR, and physicians should encourage patients to adopt the PHR.

  1. Simplified Technique for Incorporating a Metal Mesh into Record Bases for Mandibular Implant Overdentures.

    Science.gov (United States)

    Godoy, Antonio; Siegel, Sharon C

    2015-12-01

    Mandibular implant-retained overdentures have become the standard of care for patients with mandibular complete edentulism. As part of the treatment, the mandibular implant-retained overdenture may require a metal mesh framework to be incorporated to strengthen the denture and avoid fracture of the prosthesis. Integrating the metal mesh framework as part of the acrylic record base and wax occlusion rim before the jaw relation procedure will avoid the distortion of the record base and will minimize the chances of processing errors. A simplified method to incorporate the mesh into the record base and occlusion rim is presented in this technique article. PMID:25659988

  2. Students' Satisfaction and Valuation of Web-Based Lecture Recording Technologies

    Science.gov (United States)

    Taplin, Ross H.; Low, Lee Hun; Brown, Alistair M.

    2011-01-01

    This paper explores students' satisfaction and valuation of web-based lecture recording technologies (WBLT) that enable students to download recordings of lectures they could not attend or wish to review for revision purposes. The study was undertaken among undergraduates and postgraduates in accounting at an Australian university. In addition to…

  3. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format

    Science.gov (United States)

    Moore, Vivianne E.

    2013-01-01

    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…

  4. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features

    International Nuclear Information System (INIS)

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time- and frequency-domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and, unlike other published papers, we use a three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual ranks of importance according to a meta-analysis of three different ranking methods. The number of accelerations and decelerations, the interval index, as well as Lempel–Ziv complexity and Higuchi's fractal dimension are among the top five features.
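
Higuchi's fractal dimension, named above among the top-ranked features, can be computed as follows. This is a standard textbook implementation of the algorithm, not the paper's code; the series lengths and `k_max` are chosen for illustration.

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi's fractal dimension: slope of log L(k) versus log(1/k)."""
    N = x.size
    L = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):                       # average over the k offsets
            idx = np.arange(m, N, k)
            n_seg = idx.size - 1
            if n_seg < 1:
                continue
            # curve length of the subsampled series, Higuchi-normalized
            lk = np.abs(np.diff(x[idx])).sum() * (N - 1) / (n_seg * k)
            lengths.append(lk / k)
        L.append(np.mean(lengths))
    ks = np.arange(1, k_max + 1)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(L), 1)
    return slope

rng = np.random.default_rng(0)
fd_noise = higuchi_fd(rng.standard_normal(5000))    # white noise: FD near 2
fd_ramp = higuchi_fd(np.linspace(0.0, 1.0, 5000))   # smooth ramp: FD near 1
```

A rougher, more irregular heart-rate series yields a dimension closer to 2, a smooth one closer to 1, which is what makes the measure a useful complexity feature.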

  5. Analysis of Continuous Microseismic Recordings: Resonance Frequencies and Unconventional Events

    Science.gov (United States)

    Tary, J.; van der Baan, M.

    2012-12-01

    Hydrofracture experiments, where fluids and proppant are injected into reservoirs to create fractures and enhance oil recovery, are often monitored using microseismic recordings. The total stimulated volume is then estimated by the size of the cloud of induced micro-earthquakes. This implies that only brittle failure should occur inside reservoirs during the fracturing. Yet this assumption may not be correct, as the total energy injected into the system is orders of magnitude larger than the total energy associated with brittle failure. Instead of using only triggered events, it has recently been shown that the frequency content of continuous recordings may also provide information on the deformations occurring inside reservoirs. Here, we use different kinds of time-frequency transforms to track the presence of resonance frequencies. We analyze different data sets using regular, long-period, and broadband geophones. The resonance frequencies observed fall mainly in the 5-60 Hz band. We first systematically examine the possible causes of resonance frequencies, dividing them into source, path, and receiver effects. We then conclude that some of the observed frequency bands likely result from source effects. The resonance frequencies could be produced either by interconnected fluid-filled fractures on the order of tens of meters, or by small repetitive events occurring at a characteristic periodicity. Still, other mechanisms may occur or be predominant during reservoir fracturing, depending on the lithology as well as the pressure and temperature conditions at depth. During one experiment, regular micro-earthquakes, long-period long-duration (LPLD) events, and resonance frequencies were all observed. The lower part of the frequency band of these resonance frequencies (5-30 Hz) overlaps with the anticipated frequencies of LPLDs observed in other experiments (<50 Hz). The exact origin of both resonance frequencies and LPLDs is still under debate.
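    Tracking resonance frequencies as described above boils down to finding spectral peaks that persist across time windows. A minimal numpy-only sketch follows; the study used several time-frequency transforms, and the window parameters and function name here are illustrative only:

```python
import numpy as np

def dominant_band_frequency(x, fs, fmin=5.0, fmax=60.0, win=256, hop=128):
    """Average short-time FFT magnitudes over all frames and return the peak
    frequency inside [fmin, fmax]; a persistent peak suggests a resonance."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    mag = np.mean([np.abs(np.fft.rfft(f)) for f in frames], axis=0)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(mag[band])]
```

    Averaging the magnitudes across frames suppresses transient events and leaves the persistent narrowband energy, which is what distinguishes a resonance from an ordinary triggered event in this picture.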

  6. Simultaneous recording of rat auditory cortex and thalamus via a titanium-based, microfabricated, microelectrode device

    Science.gov (United States)

    McCarthy, P. T.; Rao, M. P.; Otto, K. J.

    2011-08-01

    Direct recording from sequential processing stations within the brain has provided opportunity for enhancing understanding of important neural circuits, such as the corticothalamic loops underlying auditory, visual, and somatosensory processing. However, the common reliance upon microwire-based electrodes to perform such recordings often necessitates complex surgeries and increases trauma to neural tissues. This paper reports the development of titanium-based, microfabricated, microelectrode devices designed to address these limitations by allowing acute recording from the thalamic nuclei and associated cortical sites simultaneously in a minimally invasive manner. In particular, devices were designed to simultaneously probe rat auditory cortex and auditory thalamus, with the intent of recording auditory response latencies and isolated action potentials within the separate anatomical sites. Details regarding the design, fabrication, and characterization of these devices are presented, as are preliminary results from acute in vivo recording.

  7. Use and Characteristics of Electronic Health Record Systems among Office-Based Physician Practices: United States, ...

    Science.gov (United States)

    ... National Ambulatory Medical Care Survey Adoption of basic EHR systems by office-based physicians increased 21% between ... Survey, Electronic Health Records Survey. Adoption of basic EHR systems and any EHR system varied widely across ...

  8. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  9. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Science.gov (United States)

    2010-07-01

    ... film as specified in Department of Transportation regulations (49 CFR 172.101, Hazardous materials... records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests... and cellulose-acetate base film? (a) The nitrocellulose base, a substance akin to gun cotton,...

  10. Integration of Evidence into a Detailed Clinical Model-based Electronic Nursing Record System

    OpenAIRE

    Park, Hyeoun-Ae; Min, Yul Ha; Jeon, Eunjoo; Chung, Eunja

    2012-01-01

    Objectives The purpose of this study was to test the feasibility of an electronic nursing record system for perinatal care that is based on detailed clinical models and clinical practice guidelines in perinatal care. Methods This study was carried out in five phases: 1) generating nursing statements using detailed clinical models; 2) identifying the relevant evidence; 3) linking nursing statements with the evidence; 4) developing a prototype electronic nursing record system based on detailed ...

  11. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings

    Science.gov (United States)

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill

    2014-01-01

    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  12. Metamaterial-based single pixel imaging system (Presentation Recording)

    Science.gov (United States)

    Padilla, Willie; Watts, Claire M.; Nadell, Christian; Montoya, John A.; Krishna, Sanjay

    2015-09-01

    Single pixel cameras are useful imaging devices where it is difficult or infeasible to fashion focal plane arrays. For example, in the far infrared (FIR) it is difficult to perform imaging with conventional detector arrays, owing to their cost and size. The typical single pixel camera uses a spatial light modulator (SLM) - placed in the conjugate image plane - to sample various portions of the image. The spatially modulated light emerging from the SLM is then sent to a single detector, where it is condensed with suitable optics for detection. Conventional SLMs are based on either liquid crystals or digital mirror devices, and as such are limited to modulation speeds on the order of 30 kHz. Further, there is little control over the type of light that is modulated. We present metamaterial-based spatial light modulators which provide the ability to digitally encode images - with various measurement-matrix coefficients - thus permitting high-speed, high-fidelity imaging. In particular, we use the Hadamard matrix and the related S-matrix to encode images for single pixel imaging. Metamaterials thus permit imaging in regimes of the electromagnetic spectrum where conventional SLMs are not available. Additionally, metamaterials offer several salient features that are not available with commercial SLMs; for example, they may be used to enable hyperspectral, polarimetric, and phase-sensitive imaging. We present the theory and experimental results of single pixel imaging with digital metamaterials in the far infrared and highlight the future of this exciting field.
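    The Hadamard-coded measurement scheme mentioned in the abstract can be simulated in a few lines. This sketch is illustrative and not the authors' hardware pipeline: it builds Sylvester-type Hadamard masks, takes one inner product per single-pixel measurement, and inverts using the orthogonality relation H^T H = nI:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix; n a power of two."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def single_pixel_capture(scene, H):
    """Each measurement is the scene summed under one Hadamard mask row."""
    return H @ scene.ravel()

def reconstruct(measurements, H, shape):
    """Hadamard matrices satisfy H^T H = n I, so inversion is a transpose."""
    n = H.shape[0]
    return (H.T @ measurements / n).reshape(shape)
```

    Real SLM hardware modulates intensity with 0/1 masks rather than the ±1 entries used here, which is why the related S-matrix appears in the abstract; the ±1 Hadamard case simply gives the cleaner algebra.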

  13. A Wavelet-Based Algorithm for Delineation and Classification of Wave Patterns in Continuous Holter ECG Recordings

    OpenAIRE

    Johannesen, L; Grove, USL; Sørensen, JS; Schmidt, ML; Couderc, J-P; Graff, C

    2010-01-01

    Quantitative analysis of the electrocardiogram (ECG) requires delineation and classification of the individual ECG wave patterns. We propose a wavelet-based waveform classifier that uses the fiducial points identified by a delineation algorithm. For validation of the algorithm, manually annotated ECG records from the QT database (Physionet) were used. ECG waveform classification accuracies were: 85.6% (P-wave), 89.7% (QRS complex), 92.8% (T-wave) and 76.9% (U-wave). The proposed classificatio...

  14. Ex post power economic analysis of record of decision operational restrictions at Glen Canyon Dam.

    Energy Technology Data Exchange (ETDEWEB)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.; Loftin, S.; Osiek, B; Decision and Information Sciences; Western Area Power Administration

    2010-07-31

    On October 9, 1996, Bruce Babbitt, then-Secretary of the U.S. Department of the Interior signed the Record of Decision (ROD) on operating criteria for the Glen Canyon Dam (GCD). Criteria selected were based on the Modified Low Fluctuating Flow (MLFF) Alternative as described in the Operation of Glen Canyon Dam, Colorado River Storage Project, Arizona, Final Environmental Impact Statement (EIS) (Reclamation 1995). These restrictions reduced the operating flexibility of the hydroelectric power plant and therefore its economic value. The EIS provided impact information to support the ROD, including an analysis of operating criteria alternatives on power system economics. This ex post study reevaluates ROD power economic impacts and compares these results to the economic analysis performed prior (ex ante) to the ROD for the MLFF Alternative. On the basis of the methodology used in the ex ante analysis, anticipated annual economic impacts of the ROD were estimated to range from approximately $15.1 million to $44.2 million in terms of 1991 dollars ($1991). This ex post analysis incorporates historical events that took place between 1997 and 2005, including the evolution of power markets in the Western Electricity Coordinating Council as reflected in market prices for capacity and energy. Prompted by ROD operational restrictions, this analysis also incorporates a decision made by the Western Area Power Administration to modify commitments that it made to its customers. Simulated operations of GCD were based on the premise that hourly production patterns would maximize the economic value of the hydropower resource. On the basis of this assumption, it was estimated that economic impacts were on average $26.3 million in $1991, or $39 million in $2009.

  15. Microcomputer-based recording system for clinical electrophysiology.

    Science.gov (United States)

    Török, B

    1990-09-01

    We developed a personal computer-based system for clinical electrophysiologic measurements. The computer interfaced with a commercially available A/D converter, a low-noise isolation preamplifier, filter circuits, pattern and Ganzfeld stimulators, and a hardcopy unit. Separate programs were developed for electroretinography (ERG), pattern ERG and simultaneous visual evoked potential (VEP), flash and pattern-shift VEP, and electro-oculographic measurements. The complete control of the applied hardware (eg, stimulus control, automatic gain, and filter selection) is a common feature of the computer programs. These programs provide oscilloscopic functions, overload protection, artifact elimination, averaging, automatic peak latency and amplitude determination, baseline correction, smoothing, and digital filtering. The results can be presented on matrix, laser printers, or digital plotters. The hardware components and the features of the driver software are demonstrated on normal and pathologic signals. PMID:2276319

  16. Tidal analysis of data recorded by a superconducting gravimeter

    Directory of Open Access Journals (Sweden)

    F. Palmonari

    1995-06-01

    A superconducting gravimeter was used to monitor the tidal signal for a period of five months. The instrument was placed at a site (Brasimone station, Italy) characterized by a low noise level, and was calibrated with a precision of 0.2%. Tidal analysis on hourly data was then performed and the results are presented in this paper; amplitudes, gravimetric factors, and phase differences for the main tidal waves (M2, S2, N2, O1, P1, K1, Q1) were calculated, together with the barometric pressure admittance and the long-term instrumental drift.
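    The tidal analysis reported (amplitudes and phases for M2, S2, N2, O1, P1, K1, Q1) is commonly carried out as a least-squares fit of cosine/sine pairs at the known constituent frequencies. A sketch on synthetic hourly data follows; the constituent periods are standard textbook values, but the fit method and names are assumptions, not taken from the paper:

```python
import numpy as np

# Standard periods (hours) of four major tidal constituents.
PERIODS_H = {"M2": 12.4206, "S2": 12.0, "K1": 23.9345, "O1": 25.8193}

def fit_tides(t_hours, g, constituents=("M2", "S2", "K1", "O1")):
    """Least-squares fit of cosine/sine pairs; returns {name: (amplitude, phase)}."""
    cols, names = [np.ones_like(t_hours)], []
    for c in constituents:
        w = 2 * np.pi / PERIODS_H[c]
        cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
        names.append(c)
    A = np.column_stack(cols)
    x, *_ = np.linalg.lstsq(A, g, rcond=None)
    out = {}
    for i, c in enumerate(names):
        a, b = x[1 + 2 * i], x[2 + 2 * i]
        out[c] = (np.hypot(a, b), np.arctan2(b, a))
    return out
```

    Five months of hourly data, as in the study, is more than enough to separate M2 (12.42 h) from S2 (12.00 h), whose beat period is about two weeks.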

  17. A retrospective population-based study of childhood hospital admissions with record linkage to a birth defects registry

    OpenAIRE

    Bower Carol; Colvin Lyn

    2009-01-01

    Abstract Background Using population-based linked records of births, deaths, birth defects and hospital admissions for children born 1980–1999 enables profiles of hospital morbidity to be created for each child. Methods This is an analysis of a state-based registry of birth defects linked to population-based hospital admission data. Transfers and readmissions within one day could be taken into account and treated as one episode of care for the purposes of analyses (N = 485,446 children; 742,8...

  18. A Unit Record Analysis of Older Male Labour Force Participation

    OpenAIRE

    O'Brien, Martin

    2003-01-01

    This paper presents empirical analyses of econometric models of older males' labour force participation based upon the orthodox theory of labour supply. The aim is to assess the effectiveness of using micro data to account for older male labour force participation rate patterns over recent decades. As such, the influence of financial variables from the budget constraint and observable characteristics from the utility function are incorporated into reduced-form models estimating the older m…

  19. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors.

    Science.gov (United States)

    Ratwani, Raj M; Fairbanks, Rollin J; Hettinger, A Zachary; Benda, Natalie C

    2015-11-01

    The usability of electronic health records (EHRs) continues to be a point of dissatisfaction for providers, despite certification requirements from the Office of the National Coordinator that require EHR vendors to employ a user-centered design (UCD) process. To better understand factors that contribute to poor usability, a research team visited 11 different EHR vendors in order to analyze their UCD processes and discover the specific challenges that vendors faced as they sought to integrate UCD with their EHR development. Our analysis demonstrates a diverse range of vendors' UCD practices that fall into 3 categories: well-developed UCD, basic UCD, and misconceptions of UCD. Specific challenges to practicing UCD include conducting contextually rich studies of clinical workflow, recruiting participants for usability studies, and having support from leadership within the vendor organization. The results of the study provide novel insights for how to improve usability practices of EHR vendors. PMID:26049532

  20. Community-based, interdisciplinary geriatric care team satisfaction with an electronic health record: a multimethod study.

    Science.gov (United States)

    Sockolow, Paulina S; Bowles, Kathryn H; Lehmann, Harold P; Abbott, Patricia A; Weiner, Jonathan P

    2012-06-01

    This multimethod study measured the impact of an electronic health record (EHR) on clinician satisfaction with clinical process. Subjects were 39 clinicians at a Program of All-inclusive Care for Elders (PACE) site in Philadelphia utilizing an EHR. Methods included the evidence-based evaluation framework, Health Information Technology Research-Based Evaluation Framework, which guided assessment of clinician satisfaction with surveys, observations, follow-up interviews, and actual EHR use at two points in time. Mixed-methods analysis of findings provided context for interpretation and improved validity. The study found that clinicians were satisfied with the EHR; however, satisfaction declined between time periods. Use of EHR was universal and wide and was differentiated by clinical role. Between time periods, EHR use increased in volume, with increased timeliness and decreased efficiency. As the first EHR evaluation at a PACE site from the perspective of clinicians who use the system, this study provides insights into EHR use in the care of older people in community-based healthcare settings. PMID:22411417

  1. Multi-scale dynamical analysis (MSDA) of sea level records versus PDO, AMO, and NAO indexes

    CERN Document Server

    Scafetta, Nicola

    2013-01-01

    Herein I propose a multi-scale dynamical analysis to facilitate the physical interpretation of tide gauge records. The technique uses graphical diagrams. It is applied to six secular-long tide gauge records representative of the world oceans: Sydney, Pacific coast of Australia; Fremantle, Indian Ocean coast of Australia; New York City, Atlantic coast of USA; Honolulu, U.S. state of Hawaii; San Diego, U.S. state of California; and Venice, Mediterranean Sea, Italy. For comparison, an equivalent analysis is applied to the Pacific Decadal Oscillation (PDO) index and to the Atlantic Multidecadal Oscillation (AMO) index. Finally, a global reconstruction of sea level and a reconstruction of the North Atlantic Oscillation (NAO) index are analyzed and compared: both sequences cover about three centuries from 1700 to 2000. The proposed methodology quickly highlights oscillations and teleconnections among the records at the decadal and multidecadal scales. At the secular time scales tide gauge records present relatively...

  2. Analysis of patent value evaluation and recorded value based on real option theory

    Institute of Scientific and Technical Information of China (English)

    徐文静

    2013-01-01

    In this paper, through an introduction to real option theory and its application to patent valuation, the pros and cons are briefly analyzed, and an appropriate patent valuation formula is given.
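    The abstract does not reproduce the valuation formula. In the real-options literature a patent is often valued as a European call option via the Black-Scholes formula, with the present value of expected cash flows as the underlying and the commercialization cost as the strike; the sketch below is written under that assumption and is not the paper's own formula:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def real_option_value(V, I, r, sigma, T):
    """Black-Scholes call: V = PV of expected cash flows, I = commercialization
    cost, r = risk-free rate, sigma = volatility, T = time to expiry (years)."""
    d1 = (log(V / I) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * norm_cdf(d1) - I * exp(-r * T) * norm_cdf(d2)
```

    The option value always exceeds the naive discounted value max(V - I·e^(-rT), 0), which is exactly the "pro" of the real-options approach the abstract alludes to: the flexibility to abandon commercialization has value.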

  3. Analysis of raw AIS spectrum recordings from a LEO satellite

    DEFF Research Database (Denmark)

    Larsen, Jesper Abildgaard; Mortensen, Hans Peter

    2014-01-01

    The AAUSAT3 satellite is a 1U cubesat developed by students at Aalborg University, Denmark, in collaboration with the Danish Maritime Authority. The satellite was launched in February 2013 on a mission to monitor ships from space using their AIS broadcast signals as an indication of position. The SDR receiver developed to listen for these AIS signals also allows sampling and storing of the raw intermediate-frequency spectrum, which has been used to map channel utilization over the areas of interest for the mission, mainly the arctic regions. The SDR-based receiver used onboard the satellite uses a single-chip front-end solution, which downconverts the AIS signal located around 162 MHz to an intermediate frequency with up to 200 kHz of bandwidth. This IF signal is sampled with a 750 kSPS A/D converter and further processed by an Analog Devices DSP...
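    The final digital stage described (an IF signal sampled at 750 kSPS, then processed in a DSP) can be illustrated with a complex mixer in numpy. Only the 750 kSPS rate comes from the abstract; the IF and tone frequencies below are invented for the demonstration:

```python
import numpy as np

def mix_to_baseband(x, fs, f_if):
    """Multiply by a complex exponential to shift the IF down to 0 Hz."""
    n = np.arange(len(x))
    return x * np.exp(-2j * np.pi * f_if * n / fs)

def residual_frequency(z, fs):
    """Coarse estimate of the remaining frequency offset from the FFT peak."""
    spec = np.abs(np.fft.fft(z))
    f = np.fft.fftfreq(len(z), 1.0 / fs)
    return f[np.argmax(spec)]
```

    In a real AIS chain this would be followed by channel filtering and GMSK demodulation; the mixer alone shows why an I/Q representation is used: a signal offset from the assumed IF shows up directly as a residual baseband frequency.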

  4. Analysis of global and hemispheric temperature records and prognosis

    Science.gov (United States)

    Werner, Rolf; Valev, Dimitar; Danov, Dimitar; Guineva, Veneta; Kirillov, Andrey

    2015-06-01

    Climate changes are connected to long-term variations of global and hemispheric temperatures, which are important for working out socio-political strategy for the near future. In the paper, the annual temperature time series are modeled by multiple linear regression to identify important climate forcings, including external factors such as atmospheric CO2 content, volcanic emissions, and total solar irradiation, as well as internal factors such as the El Niño-Southern Oscillation, the Pacific decadal oscillation, and the Atlantic multidecadal oscillation. Adjusted temperatures were determined by removing all significant influences except CO2. The adjusted temperatures follow a linear dependence on the logarithm of the CO2 content, with a coefficient of determination of about 0.91. The evolution of the adjusted temperatures suggests that the warming due to CO2 since the beginning of the studied interval in 1900 has never stopped and continues up to now. The global warming rate deduced from the adjusted temperatures since 1980 is about 0.14 ± 0.02 °C/decade. The warming rate reported in the IPCC Fourth Assessment Report, based on an observed global surface temperature set, is about 20% higher, owing to warming by the Atlantic multidecadal oscillation in addition to the anthropogenic warming. The predicted temperature evolution, based on long-term changes of CO2 and the Atlantic multidecadal oscillation index, shows that Northern Hemisphere temperatures are modulated by the Atlantic multidecadal oscillation and will not change significantly until about 2040; after that they will increase rapidly, as during the last decades of the past century. Temperatures of the Southern Hemisphere will increase almost linearly and do not show significant periodic changes due to the Atlantic multidecadal oscillation. The concrete warming rates, of course, depend strongly on the future atmospheric CO2 content.
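    The regression described (temperature against the logarithm of CO2 plus oscillation indices, then subtracting everything but the CO2 term to get an "adjusted temperature") is ordinary multiple linear regression. The sketch below runs on synthetic data; the CO2 path, the AMO proxy, and the coefficient values are invented for illustration and are not the paper's estimates:

```python
import numpy as np

def fit_forcings(T, *predictors):
    """OLS fit of T ~ intercept + sum(beta_i * x_i); returns the coefficients."""
    A = np.column_stack([np.ones(len(T)), *predictors])
    beta, *_ = np.linalg.lstsq(A, T, rcond=None)
    return beta

# Synthetic illustration: temperature = a + b*log(CO2) + c*AMO + noise
years = np.arange(1900, 2015)
co2 = np.linspace(295.0, 400.0, years.size)              # invented CO2 path (ppm)
amo = 0.2 * np.sin(2 * np.pi * (years - 1900) / 65.0)    # ~65-year oscillation proxy
rng = np.random.default_rng(2)
T = -24.0 + 4.3 * np.log(co2) + amo + 0.05 * rng.standard_normal(years.size)
beta = fit_forcings(T, np.log(co2), amo)
adjusted = T - beta[2] * amo   # "adjusted temperature": AMO influence removed
```

    Removing the fitted AMO contribution before examining the CO2 dependence is what makes the residual trend nearly linear in log CO2, mirroring the adjustment step the abstract describes.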

  5. An ontology-based method for secondary use of electronic dental record data

    OpenAIRE

    Schleyer, Titus KL; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P; Liu, Kaihong; Hernandez, Pedro

    2013-01-01

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and creat...

  6. Study of influence of ACPA in holographic reflection gratings recorded in PVA/AA based photopolymer

    OpenAIRE

    Fuentes Rosillo, Rosa; Fernández Varó, Elena; García Llopis, Celia; Beléndez Vázquez, Augusto; Pascual Villalobos, Inmaculada

    2010-01-01

    The performance of a holographic data storage system depends to a great extent on the quality and the physical properties of the recording medium. The storage capabilities of photopolymer materials are under constant study, and for some applications a material with a high spatial frequency response is necessary. In this work, we focus on the study of the influence of 4,4'-azobis(4-cyanopentanoic acid) (ACPA) on holographic reflection gratings recorded in a polyvinyl alcohol/acrylamide-based photopolymer with th…

  7. The Evolution of the Film Analysis of Interaction Record (FAIR) from the Amidon-Flanders Interaction Analysis. Appendix G.

    Science.gov (United States)

    Baldwin, Patricia

    A detailed listing is given of the revisions that were made to the Amidon-Flanders Interaction Analysis scale while the Film Analysis of Interaction Record (FAIR) scale was being developed. Comments are given for guidance in the use of some of the ratings, along with some ground rules and guidelines for making a film rating. Related reports are AA…

  8. Deciphering the record of short-term base-level changes in Gilbert-type deltas

    Science.gov (United States)

    Gobo, Katarina; Ghinassi, Massimiliano; Nemec, Wojciech

    2016-04-01

    -front accommodation driven by short-term base-level changes, with some accompanying inevitable 'noise' in the facies record due to the system's autogenic variability and regional climatic fluctuations. Comparison of coeval foreset and toeset/bottomset deposits in a delta further shows a reverse pattern of reciprocal changes in facies assemblages, with the TFA assemblage of foreset deposits passing downdip into a DFA assemblage of delta-foot deposits, and the DFA assemblage of foreset deposits passing downdip into a TFA assemblage. This reverse reciprocal alternation of TFA and DFA facies assemblages is attributed to the delta slope's own morphodynamics. When the delta slope is dominated by the deposition of debris flows, only the most diluted turbulent flows and chute-bypassing turbidity currents reach the delta-foot zone. When the delta slope is dominated by turbiditic sedimentation, larger chutes and gullies form, triggering and conveying debris flows to the foot zone. These case studies as a whole shed new light on the varying pattern of subaqueous sediment dispersal processes in an evolving Gilbert-type deltaic system and point to the attractive possibility of recognizing a 'hidden' record of base-level changes on the basis of detailed facies analysis.

  9. Using the Java language to develop computer based patient records for use on the Internet.

    OpenAIRE

    Zuckerman, A. E.

    1996-01-01

    The development of the Java Programming Language by Sun Microsystems has provided a new tool for the development of Internet based applications. Our preliminary work has shown how Java can be used to program an Internet based CBPR. Java is well suited to the needs of patient records and can interface with clinical data repositories written in MUMPS or SQL.

  10. A Real Application of a Concept-based Electronic Medical Record

    OpenAIRE

    Purin, Barbara; Eccher, Claudio; Forti, Stefano

    2003-01-01

    We present a real implementation of a concept-based Electronic Medical Record for the management of heart failure. Our approach is based on GEHR archetypes represented in XML format for modelling clinical information. Using this technique, it should be possible to build an interoperable, future-proof clinical information system.

  11. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained and segmented into a palm region and separate finger regions. Acquisition of the image is performed without particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, and translation of an input image. Additionally, the analysis reuses commonly seen terms in the Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
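    The rotation invariance claimed for Zernike moment magnitudes is easy to check numerically. The compact sketch below hard-codes a few radial polynomials and does not reproduce the patent's term-reuse efficiency tricks; the function name and discretization are assumptions:

```python
import numpy as np

def zernike_moment(img, n, m):
    """Zernike moment Z_nm over the unit disk inscribed in a square image.
    Radial polynomials are hard-coded for the (n, m) pairs used here."""
    N = img.shape[0]
    y, x = np.mgrid[0:N, 0:N]
    # Map pixel centres to [-1, 1] x [-1, 1].
    x = (2 * x - N + 1) / (N - 1)
    y = (2 * y - N + 1) / (N - 1)
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0
    R = {(2, 0): 2 * rho**2 - 1,
         (2, 2): rho**2,
         (3, 1): 3 * rho**3 - 2 * rho}[(n, m)]
    kernel = R * np.exp(-1j * m * theta) * mask
    return (n + 1) / np.pi * np.sum(img * kernel) * (2.0 / (N - 1)) ** 2
```

    Rotating the image multiplies Z_nm by a pure phase e^(i m a), so |Z_nm| is unchanged; this is why the magnitudes can be fused into a rotation-invariant feature vector.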

  12. Development of an algorithm for heartbeats detection and classification in Holter records based on temporal and morphological features

    Science.gov (United States)

    García, A.; Romano, H.; Laciar, E.; Correa, R.

    2011-12-01

    In this work, a detection and classification algorithm for heartbeat analysis in Holter records was developed. First, a QRS complex detector was implemented and the temporal and morphological characteristics of the complexes were extracted. A vector was built with these features; this vector is the input to the classification module, based on discriminant analysis. The beats were classified into three groups: Premature Ventricular Contraction (PVC), Atrial Premature Contraction (APC), and Normal Beat (NB). These beat categories represent the most important groups in commercial Holter systems. The developed algorithms were evaluated on 76 ECG records from two validated open-access databases, the "MIT-BIH Arrhythmia Database" and the "MIT-BIH Supraventricular Arrhythmia Database". A total of 166,343 beats were detected and analyzed; the QRS detection algorithm provides a sensitivity of 99.69% and a positive predictive value of 99.84%. The classification stage gives sensitivities of 97.17% for NB, 97.67% for PVC, and 92.78% for APC.
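    The classification module operates on the extracted temporal/morphological feature vector via discriminant analysis. As a simplified, runnable stand-in, the sketch below uses a nearest-centroid rule (equivalent to linear discriminant analysis with an identity covariance); the two features and their values are invented for illustration:

```python
import numpy as np

class NearestCentroidBeats:
    """Classify a beat's feature vector (here: RR-interval ratio, QRS width in s)
    by the nearest class mean -- a simplified discriminant classifier."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.means_ = {c: X[y == c].mean(axis=0) for c in sorted(set(y))}
        return self

    def predict(self, X):
        return [min(self.means_, key=lambda c: np.linalg.norm(x - self.means_[c]))
                for x in X]
```

    The intuition matches the clinical feature choice: PVCs are late with a wide QRS, APCs are premature with a narrow QRS, so the classes separate well in this two-dimensional feature space.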

  13. Development of an algorithm for heartbeats detection and classification in Holter records based on temporal and morphological features

    International Nuclear Information System (INIS)

    In this work, a detection and classification algorithm for heartbeat analysis in Holter records was developed. First, a QRS complex detector was implemented and the temporal and morphological characteristics of the complexes were extracted. A vector was built with these features; this vector is the input to the classification module, based on discriminant analysis. The beats were classified into three groups: Premature Ventricular Contraction (PVC), Atrial Premature Contraction (APC), and Normal Beat (NB). These beat categories represent the most important groups in commercial Holter systems. The developed algorithms were evaluated on 76 ECG records from two validated open-access databases, the "MIT-BIH Arrhythmia Database" and the "MIT-BIH Supraventricular Arrhythmia Database". A total of 166,343 beats were detected and analyzed; the QRS detection algorithm provides a sensitivity of 99.69% and a positive predictive value of 99.84%. The classification stage gives sensitivities of 97.17% for NB, 97.67% for PVC, and 92.78% for APC.

  14. Spironolactone use and renal toxicity: population based longitudinal analysis.

    OpenAIRE

    Wei, L; Struthers, A D; Fahey, T; Watson, A D; MacDonald, T. M.

    2010-01-01

    Objective To determine the safety of spironolactone prescribing in the setting of the UK National Health Service. Design Population based longitudinal analysis using a record linkage database. Setting Tayside, Scotland. Population All patients who received one or more dispensed prescriptions for spironolactone between 1994 and 2007. Main outcome measures Rates of prescribing for spironolactone, hospital admissions for hyperkalaemia, and hyperkalaemia and renal function without...

  15. The computer based patient record: a strategic issue in process innovation.

    Science.gov (United States)

    Sicotte, C; Denis, J L; Lehoux, P

    1998-12-01

    Reengineering of the workplace through information technology is an important strategic issue for today's hospitals. The computer-based patient record (CPR) is one technology that has the potential to profoundly modify the work routines of the care unit. This study investigates a CPR project aimed at allowing physicians and nurses to work in a completely electronic environment. The focus of our analysis was the patient nursing care process. The rationale behind the introduction of this technology was based on its alleged capability to both enhance quality of care and control costs. This is done by better managing the flow of information within the organization and by introducing mechanisms such as the timeless and spaceless organization of the workplace, de-localization, and automation of work processes. The present case study analyzed the implementation of a large CPR project ($45 million U.S.) conducted in four hospitals in a joint venture with two computer firms. The computerized system had to be withdrawn because of boycotts by both medical and nursing personnel. User resistance was not the problem. Despite its failure, this project was a good opportunity to better understand the intricate complexity of introducing technology into professional work, where the usefulness of information is short-lived and where it is difficult to predetermine the relevancy of information. Profound misconceptions in achieving a tighter fit (synchronization) between care processes and information processes were the main problems. PMID:9871877

  16. Investigation of the frequency content of ground motions recorded during strong Vrancea earthquakes, based on deterministic and stochastic indices

    CERN Document Server

    Craifaleanu, Iolanda-Gabriela

    2013-01-01

    The paper presents results from a recent study in progress, involving an extensive analysis, based on several deterministic and stochastic indices, of the frequency content of ground motions recorded during strong Vrancea seismic events. The study, continuing those initiated by Lungu et al. in the early nineties, aims to better reveal the characteristics of the analyzed ground motions. Over 300 accelerograms, recorded during the strong Vrancea seismic events mentioned above and recently re-digitized, are used in the study. Various analytical estimators of the frequency content, such as those based on Fourier spectra, power spectral density, response spectra and peak ground motion values, are evaluated and compared. The results are correlated and validated by using the information provided by various spectral bandwidth measures, such as the Vanmarcke and the Cartwright and Longuet-Higgins indices. The capacity of the analyzed estimators to describe the frequency content of the analyzed ground motions is assessed com...

  17. A portable, GUI-based, object-oriented client-server architecture for computer-based patient record (CPR) systems.

    Science.gov (United States)

    Schleyer, T K

    1995-01-01

    Software applications for computer-based patient records require substantial development investments. Portable, open software architectures are one way to delay or avoid software application obsolescence. The Clinical Management System at Temple University School of Dentistry uses a portable, GUI-based, object-oriented client-server architecture. Two main criteria determined this approach: preservation of investment in software development and a smooth migration path to a computer-based patient record. The application is separated into three layers: graphical user interface, database interface, and application functionality. Implementation with generic cross-platform development tools ensures maximum portability. PMID:7662879

  18. Early Warning and Risk Estimation methods based on Unstructured Text in Electronic Medical Records to Improve Patient Adherence and Care

    OpenAIRE

    Sairamesh, Jakka; Rajagopal, Ram; Nemana, Ravi; Argenbright, Keith

    2009-01-01

    In this paper we present risk-estimation models and methods for early detection of patient non-adherence based on unstructured text in patient records. The primary objectives are to perform early interventions on patients at risk of non-adherence and improve outcomes. We analyzed over 1.1 million visit notes corresponding to 30,095 Cancer patients, spread across 12 years of Oncology practice. Our risk analysis, based on a rich risk-factor dictionary, revealed that a staggering 30% of the pati...

  19. PC analysis of bit records enhances drilling operations in southern Alabama; Case history

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.M. (Helmerich and Payne IDC (US))

    Kelly No. 1 was drilled on a footage basis in Escambia County, AL. Computer bit-record analysis provided summarized information in a format easily read and understood by field personnel. With a PC used to compare surrounding bit records on a cost-per-foot basis, the best bit type, weight on bit (WOB), and bit speed for a given section of hole were identified. This paper reports that, from the analysis, field personnel were able to improve bit selection, resulting in three bit runs having the lowest cost per foot in a given section of hole and an overall lower cost-per-foot drilling operation.
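    The cost-per-foot criterion used in such comparisons is the standard drilling-economics formula: cost/ft = (bit cost + rig rate x (rotating time + trip time)) / footage drilled. A minimal sketch of the comparison; all figures below are hypothetical and not taken from the Kelly No. 1 bit records:

```python
def cost_per_foot(bit_cost, rig_rate, rotating_hours, trip_hours, footage):
    """Standard drilling cost-per-foot: (bit cost + rig rate x hours) / footage."""
    return (bit_cost + rig_rate * (rotating_hours + trip_hours)) / footage

# Hypothetical comparison of two bit runs over the same hole section
run_a = cost_per_foot(bit_cost=8000, rig_rate=900, rotating_hours=40, trip_hours=8, footage=1200)
run_b = cost_per_foot(bit_cost=12000, rig_rate=900, rotating_hours=30, trip_hours=8, footage=1500)
best = min((run_a, "A"), (run_b, "B"))   # lowest cost per foot wins the comparison
```

A pricier bit can still win if it drills faster or lasts longer, which is exactly the trade-off the PC analysis surfaces.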

  20. Computer analysis of sound recordings from two Anasazi sites in northwestern New Mexico

    Science.gov (United States)

    Loose, Richard

    2002-11-01

    Sound recordings were made at a natural outdoor amphitheater in Chaco Canyon and in a reconstructed great kiva at Aztec Ruins. Recordings included computer-generated tones and swept sine waves, classical concert flute, Native American flute, conch shell trumpet, and prerecorded music. Recording equipment included an analog tape deck, a digital minidisk recorder, and direct digital recording to a laptop computer disk. Microphones and geophones were used as transducers. The natural amphitheater lies between the ruins of Pueblo Bonito and Chetro Ketl. It is a semicircular arc in a sandstone cliff measuring 500 ft. wide and 75 ft. high. The radius of the arc was verified with aerial photography, and an acoustic ray trace was generated using CAD software. The arc is in an overhanging cliff face and brings distant sounds to a line focus. Along this line, there are unusual acoustic effects at conjugate foci. Time history analysis of recordings from both sites showed that a 60-dB reverb decay lasted from 1.8 to 2.0 s, nearly ideal for public performances of music. Echoes from the amphitheater were perceived to be upshifted in pitch, but this was not seen in FFT analysis. Geophones placed on the floor of the great kiva showed a resonance at 95 Hz.
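    A 60-dB reverb decay time of the kind quoted above can be estimated from a recorded impulse response by Schroeder backward integration; the sketch below shows that standard T30 procedure as an assumption, since the authors' exact time-history analysis is not specified:

```python
import numpy as np

def decay_time_60db(ir, fs):
    """Estimate the 60 dB reverberation time from an impulse response using
    Schroeder backward integration and a linear fit to the -5..-35 dB portion
    of the energy decay curve (the T30 method)."""
    edc = np.cumsum((np.asarray(ir, float) ** 2)[::-1])[::-1]  # energy decay curve
    edc_db = 10 * np.log10(edc / edc[0])
    t = np.arange(len(ir)) / fs
    mask = (edc_db <= -5) & (edc_db >= -35)    # fit the clean part of the decay
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
    return -60.0 / slope                        # extrapolate to a full 60 dB drop

# Synthetic exponential decay constructed to fall 60 dB in 2.0 s
fs = 1000
t = np.arange(0, 3, 1 / fs)
rt = decay_time_60db(10 ** (-3 * t / 2.0), fs)
```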

  1. Social science and linguistic text analysis of nurses' records: a systematic review and critique.

    Science.gov (United States)

    Buus, Niels; Hamilton, Bridget Elizabeth

    2016-03-01

    The two aims of the paper were to systematically review and critique social science and linguistic text analyses of nursing records in order to inform future research in this emerging area. Systematic searches in reference databases and in citation indexes identified 12 articles that included analyses of the social and linguistic features of records and recording. Two reviewers extracted data using established criteria for the evaluation of qualitative research papers. A common characteristic of nursing records was the economical use of language with local meanings that conveyed little information to the uninitiated reader. Records were dominated by technocratic-medical discourse focused on patients' bodies, and they depicted only very limited aspects of nursing practice. Nurses made moral evaluations in their categorisation of patients, which reflected detailed surveillance of patients' disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More probing empirical research of nurses' records and a wider range of theoretical perspectives have the potential to expose the situated meanings of nursing work in healthcare organisations. PMID:26109278

  2. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
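    The perturbation step can be illustrated, in simplified form, as a random rotation plus translation plus additive noise applied to the numeric attributes of each record before outsourcing; the transform and parameters below are illustrative of the general geometric-perturbation technique, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(7)

def perturb(records, noise_sd=0.05):
    """Geometric perturbation of numeric record vectors: Y = X R^T + t + noise,
    with R a random rotation and t a random translation (a common formulation).
    (R, t) acts as the owner's key; distances between records are roughly
    preserved, so some mining tasks still work on the perturbed data."""
    d = records.shape[1]
    r, _ = np.linalg.qr(rng.normal(size=(d, d)))   # random orthogonal matrix
    t = rng.normal(size=d)
    noise = rng.normal(scale=noise_sd, size=records.shape)
    return records @ r.T + t + noise, (r, t)

# Hypothetical 4-attribute numeric health records
x = rng.normal(size=(100, 4))
y, key = perturb(x)   # y is what would be outsourced; the key stays with the owner
```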

  3. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Directory of Open Access Journals (Sweden)

    S. Balasubramaniam

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  4. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis

    OpenAIRE

    Greensmith, David J.

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of...

  5. Flood Risk Analysis Using Non-Stationary Models: Application to 1500 Records and Assessment of Predictive Ability

    Science.gov (United States)

    Luke, A.; Sanders, B. F.; Aghakouchak, A.; Vrugt, J. A.; Matthew, R.

    2015-12-01

    Urbanization and shifts in precipitation patterns have altered the risk of inland flooding. Methods to assess flood risk, such as flood frequency analysis, are based on the key assumption of stationarity (ST). Under the ST assumption, the behavior of the hydroclimatic system (precipitation, temperature) and watershed is assumed to be time invariant. This ST assumption is quite restrictive and perhaps not accurate for flood risk assessment in watersheds that have undergone significant urbanization. Consequently, there is an urgent need for statistical methods that can account explicitly for system non-stationarity (NS) in the analysis and quantification of flood risks. One approach is to use time-variant parameters in an extreme value distribution. This approach, called NEVA, has been shown to improve the statistical representation of observed data (within-sample), yet NEVA has not been comprehensively evaluated for predictive analysis (out-of-sample). We apply NEVA to 1,548 records of observed annual maximum discharges with the goals of (1) assessing which of the two approaches (ST/NS), with their parametric models in the Log-Pearson Type III (LPIII) distribution, best describes the statistical representation of future flood risks, and (2) determining which diagnostic is most suitable for model selection (NS/ST). To explore these questions, we use the first half of each flood record for inference of the LPIII model parameters using MCMC simulation with the DREAM(ZS) algorithm, and the second part of the record is used for evaluation purposes (predictive analysis). Our results show that in about 70% of the records with a trend, the LPIII ST model performed better during evaluation than the LPIII NS model, unless the "trend" record is more than 55 years long; then the NS model is always preferred. If trend classification of the 1,548 records was done using summary metrics of watershed processes (runoff coefficient) rather than the peak discharges, the performance of the NS model improved.
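    The stationary LPIII fit that underlies this kind of comparison can be sketched with the textbook method of moments and the Wilson-Hilferty frequency factor; this is an illustrative ST estimator, not the authors' Bayesian DREAM(ZS) inference, and the data below are synthetic:

```python
import numpy as np
from statistics import NormalDist

def lp3_quantile(annual_peaks, return_period):
    """Stationary log-Pearson Type III quantile via the method of moments:
    mean/std/skew of log10 peaks plus the Wilson-Hilferty frequency factor.
    An NS variant would instead let these parameters vary with time."""
    y = np.log10(annual_peaks)
    m, s = y.mean(), y.std(ddof=1)
    n = len(y)
    g = n * np.sum((y - m) ** 3) / ((n - 1) * (n - 2) * s ** 3)  # sample skew
    z = NormalDist().inv_cdf(1 - 1 / return_period)              # normal quantile
    if abs(g) < 1e-6:
        k = z                                                    # zero-skew limit
    else:
        k = (2 / g) * ((1 + g * z / 6 - g ** 2 / 36) ** 3 - 1)   # Wilson-Hilferty
    return 10 ** (m + k * s)

# Synthetic annual-maximum series (roughly lognormal, so the log-skew is near zero)
rng = np.random.default_rng(1)
peaks = 10 ** rng.normal(2.0, 0.3, size=5000)
q100 = lp3_quantile(peaks, 100)   # estimated 100-year flood discharge
```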

  6. Time and spectral analysis methods with machine learning for the authentication of digital audio recordings.

    Science.gov (United States)

    Korycki, Rafal

    2013-07-10

    This paper addresses the problem of tampering detection and discusses new methods that can be used for authenticity analysis of digital audio recordings. Nowadays, the only method applied to digital audio files that is commonly approved by forensic experts is the ENF criterion. It consists of fluctuation analysis of the mains frequency induced in the electronic circuits of recording devices. Therefore, its effectiveness is strictly dependent on the presence of the mains signal in the recording, which is a rare occurrence. This article presents the existing methods of time and spectral analysis along with their modifications as proposed by the author, involving spectral analysis of the residual signal enhanced by machine learning algorithms. The effectiveness of the tampering detection methods described in this paper is tested on a predefined music database. The results are compared graphically using ROC-like curves. Furthermore, time-frequency plots are presented and enhanced by the reassignment method for the purpose of visual inspection of modified recordings. This solution enables analysis of minimal changes in background sounds, which may indicate tampering. PMID:23481673
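    The ENF criterion itself reduces to tracking the mains-frequency component over time and looking for discontinuities. A much-simplified sketch follows; the frame length, search band, and synthetic spliced signal are all illustrative, and forensic practice additionally matches the extracted track against a grid-frequency database:

```python
import numpy as np

def enf_track(signal, fs, mains=50.0, frame_s=4.0, band=1.0):
    """Track the electric network frequency over time: for each frame, locate
    the spectral peak within +/- band Hz of the nominal mains frequency.
    Abrupt jumps in the returned track can indicate an edit point."""
    n = int(frame_s * fs)
    freqs = np.fft.rfftfreq(n, 1 / fs)
    sel = (freqs > mains - band) & (freqs < mains + band)
    track = []
    for start in range(0, len(signal) - n + 1, n):
        spec = np.abs(np.fft.rfft(signal[start:start + n] * np.hanning(n)))
        track.append(freqs[sel][np.argmax(spec[sel])])
    return np.array(track)

# Synthetic "spliced" recording: the hum frequency jumps at the splice point
fs = 1000
t = np.arange(0, 8, 1 / fs)
hum = np.concatenate([np.sin(2 * np.pi * 50.2 * t),   # first segment's hum
                      np.sin(2 * np.pi * 49.8 * t)])  # second segment's hum
track = enf_track(hum, fs)
```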

  7. A system for automatic recording and analysis of motor activity in rats.

    Science.gov (United States)

    Heredia-López, Francisco J; May-Tuyub, Rossana M; Bata-García, José L; Góngora-Alfaro, José L; Alvarez-Cervera, Fernando J

    2013-03-01

    We describe the design and evaluation of an electronic system for the automatic recording of motor activity in rats. The device continually locates the position of a rat inside a transparent acrylic cube (50 cm/side) with infrared sensors arranged on its walls so as to correspond to the x-, y-, and z-axes. The system is governed by two microcontrollers. The raw data are saved in a text file within a secure digital memory card, and offline analyses are performed with a library of programs that automatically compute several parameters based on the sequence of coordinates and the time of occurrence of each movement. Four analyses can be made at specified time intervals: traveled distance (cm), movement speed (cm/s), time spent in vertical exploration (s), and thigmotaxis (%). In addition, three analyses are made for the total duration of the experiment: time spent at each x-y coordinate pair (min), time spent on vertical exploration at each x-y coordinate pair (s), and frequency distribution of vertical exploration episodes of distinct durations. User profiles of frequently analyzed parameters may be created and saved for future experimental analyses, thus obtaining a full set of analyses for a group of rats in a short time. The performance of the developed system was assessed by recording the spontaneous motor activity of six rats, while their behaviors were simultaneously videotaped for manual analysis by two trained observers. A high and significant correlation was found between the values measured by the electronic system and by the observers. PMID:22707401
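    The traveled-distance and speed analyses reduce to summing Euclidean steps over the recorded coordinate sequence; a minimal sketch, with illustrative units and example values:

```python
import numpy as np

def activity_metrics(xy_cm, t_s):
    """Traveled distance (cm) and mean speed (cm/s) from a sequence of
    beam-break coordinates and their timestamps."""
    steps = np.linalg.norm(np.diff(xy_cm, axis=0), axis=1)  # per-move distances
    distance = steps.sum()
    return distance, distance / (t_s[-1] - t_s[0])

# Hypothetical 3-4-5 path inside the 50 cm arena
xy = np.array([[0.0, 0.0], [3.0, 0.0], [3.0, 4.0], [0.0, 0.0]])
ts = np.array([0.0, 1.0, 2.0, 3.0])
dist, speed = activity_metrics(xy, ts)
```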

  8. Image-based electronic patient records for secured collaborative medical applications.

    Science.gov (United States)

    Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun

    2005-01-01

    We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: the EPR DICOM gateway (EPR-GW), the image-based EPR repository server (EPR-Server), the Web server, and the EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, security modules for digital signature and authentication are integrated to perform security processing on the EPR data with integrity and authenticity. The privacy of EPR in data communication and exchange is provided by SSL/TLS-based secure communication. This presentation gave a new approach to creating and managing image-based EPR from actual patient records, and also presented a way to use Web technology and the DICOM standard to build an open architecture for collaborative medical applications. PMID:17282930

  9. Microscopic analysis on showers recorded as single core on X-ray films

    International Nuclear Information System (INIS)

    Cosmic-ray particles recorded as single dark spots on X-ray films with use of the emulsion chamber data of Brazil-Japan Collaboration are studied. Some results of microscopic analysis of such single-core-like showers on nuclear emulsion plates are reported. (Author)

  10. Analysis of condensed matter physics records in databases. Science and technology indicators in condensed matter physics

    International Nuclear Information System (INIS)

    An analysis of the literature on Condensed Matter Physics, with particular emphasis on High Temperature Superconductors, was performed on the contents of the bibliographic database International Nuclear Information System (INIS). Quantitative data were obtained on various characteristics of the relevant INIS records such as subject categories, language and country of publication, publication types, etc. The analysis opens up the possibility for further studies, e.g. on international research co-operation and on publication patterns. (author)

  11. Fetal QRS extraction from abdominal recordings via model-based signal processing and intelligent signal merging

    International Nuclear Information System (INIS)

    Noninvasive fetal ECG (fECG) monitoring has potential applications in diagnosing congenital heart diseases in a timely manner and assisting clinicians to make more appropriate decisions during labor. However, despite advances in signal processing and machine learning techniques, the analysis of fECG signals has still remained in its preliminary stages. In this work, we describe an algorithm to automatically locate QRS complexes in noninvasive fECG signals obtained from a set of four electrodes placed on the mother’s abdomen. The algorithm is based on an iterative decomposition of the maternal and fetal subspaces and filtering of the maternal ECG (mECG) components from the fECG recordings. Once the maternal components are removed, a novel merging technique is applied to merge the signals and detect the fetal QRS (fQRS) complexes. The algorithm was trained and tested on the fECG datasets provided by the PhysioNet/CinC challenge 2013. The final results indicate that the algorithm is able to detect fetal peaks for a variety of signals with different morphologies and strength levels encountered in clinical practice. (paper)

  12. Receiver Function Analysis using Ocean-bottom Seismometer Records around the Kii Peninsula, Southwestern Japan

    Science.gov (United States)

    Akuhara, T.; Mochizuki, K.

    2014-12-01

    Recent progress in receiver function (RF) analysis has provided us with new insight into subsurface structure. The method is now increasingly being applied to records of ocean-bottom seismometers (OBSs). In the present study, we conducted RF analysis using OBS records at 32 observation sites around the Kii Peninsula, southwestern Japan, from 2003 to 2007 (Mochizuki et al., 2010, GRL). We addressed problems concerning water reverberations. We first checked the effects of water reverberations on the OBS vertical component records by calculating vertical P-wave RFs (Langston and Hammer, 2001, BSSA), where the OBS vertical component records were deconvolved by stacked traces of on-land records as source functions. The resultant RFs showed strong peaks corresponding to the water reverberations. Referring to these RFs, we constructed inverse filters to remove the effects of water reverberations from the vertical component records, which were assumed to be represented by two parameters: a two-way travel time within the water layer and a reflection coefficient at the seafloor. We then calculated radial RFs using the filtered, reverberation-free, vertical component records of OBS data as source functions. The resultant RFs showed that some phases at later times became clearer than those obtained by an ordinary method. From comparison with a previous tomography model (Akuhara et al., 2013, GRL), we identified phases originating from the oceanic Moho, which delineates the relationship between the depth of earthquakes and the oceanic Moho: seaward intraslab seismicity is high within the oceanic crust, while landward seismicity is high within the oceanic mantle. This characteristic may be relevant to the dehydration process.
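    An inverse filter of the kind described, parameterized only by the two-way water travel time d (in samples) and the seafloor reflection coefficient r, can be sketched as the operator 1 + r z^{-d}, which collapses an alternating-polarity water-layer reverberation series. The sign convention here is an assumption, and the authors' exact filter construction is not reproduced:

```python
import numpy as np

def remove_water_reverb(x, d, r):
    """Inverse filter y[n] = x[n] + r * x[n - d], undoing a water-layer
    reverberation operator 1 / (1 + r z^-d) (alternating-polarity multiples)."""
    y = x.copy()
    y[d:] += r * x[:-d]
    return y

# Synthetic check: an impulse followed by its alternating multiples collapses
# back to the impulse.  d = two-way water time in samples, r = seafloor coeff.
n, d, r = 200, 20, 0.6
s = np.zeros(n)
s[10] = 1.0
x = np.zeros(n)
for k in range(10):
    x[10 + k * d] = (-r) ** k      # truncated reverberation series of the impulse
cleaned = remove_water_reverb(x, d, r)
```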

  13. 13 CFR 106.303 - Who has authority to approve and sign a Fee Based Record?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who has authority to approve and... Activities § 106.303 Who has authority to approve and sign a Fee Based Record? The Administrator, or upon his... consultation with the General Counsel (or designee), has the authority to approve and sign each Fee...

  14. 49 CFR 1544.230 - Fingerprint-based criminal history records checks (CHRC): Flightcrew members.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Fingerprint-based criminal history records checks (CHRC): Flightcrew members. 1544.230 Section 1544.230 Transportation Other Regulations Relating to Transportation (Continued) TRANSPORTATION SECURITY ADMINISTRATION, DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY AIRCRAFT OPERATOR...

  15. Noninvasive method for electrocardiogram recording in conscious rats: feasibility for heart rate variability analysis

    Directory of Open Access Journals (Sweden)

    Pedro P. Pereira-Junior

    2010-06-01

    Heart rate variability (HRV) analysis is a well-established tool for the assessment of cardiac autonomic control, both in humans and in animal models. Conventional methods for HRV analysis in rats rely on conscious-state electrocardiogram (ECG) recording that requires prior invasive surgical procedures for electrode/transmitter implants. The aim of the present study was to test a noninvasive and inexpensive method for ECG recording in conscious rats and to assess its feasibility for HRV analysis. A custom-made elastic cotton jacket was developed to fit the rat's mean thoracic circumference, with two platinum electrodes attached to its inner surface, allowing ECG to be recorded noninvasively in conscious, restrained rats (n=6). Time- and frequency-domain HRV analyses were conducted under basal and autonomic blockade conditions. High-quality ECG signals were obtained and proved feasible for HRV analysis. As expected, the mean RR interval was significantly decreased in the presence of atropine (p...
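    Time-domain HRV indices of the kind computed in such studies are simple statistics over the RR-interval series; a generic sketch, not the authors' code, with hypothetical example intervals:

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """Mean RR, SDNN (overall variability), and RMSSD (beat-to-beat
    variability) from an RR-interval series in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {"mean_rr": rr.mean(),
            "sdnn": rr.std(ddof=1),
            "rmssd": np.sqrt(np.mean(diffs ** 2))}

# Hypothetical RR intervals (ms) from a short conscious-rat ECG segment
indices = time_domain_hrv([160, 165, 158, 162])
```

Atropine shortens the mean RR interval and suppresses vagally mediated RMSSD, which is why these indices are read alongside autonomic blockade conditions.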

  16. IMASIS computer-based medical record project: dealing with the human factor.

    Science.gov (United States)

    Martín-Baranera, M; Planas, I; Palau, J; Sanz, F

    1995-01-01

    level, problems to be solved in utilization of the system, errors detected in the system's database, and the personal interest in participating in the IMASIS project. The questionnaire was also intended to be a tool to monitor IMASIS evolution. Our study showed that medical staff had a lack of information about the current HIS, leading to poor utilization of some system options. Another major characteristic, related to the above, was the feeling that the project would negatively affect the organization of work at the hospitals. A computer-based medical record was feared to degrade the physician-patient relationship, introduce a supplementary administrative burden into clinicians' day-to-day work, unnecessarily slow history taking, and imply too-rigid patterns of work. The most frequent problems in using the current system could be classified into two groups: problems related to lack of agility and consistency in user interface design, and those derived from lack of a common patient identification number. Duplication of medical records was the most frequent error detected by physicians. Analysis of physicians' attitudes towards IMASIS revealed a lack of confidence globally. This was probably the consequence of two current features: a lack of complete information about IMASIS possibilities and problems faced when using the system. To deal with such factors, three types of measures have been planned. First, an effort is to be made to ensure that every physician is able to adequately use the current system and understands the long-term benefits of the project. This task will be better accomplished by personal interaction between clinicians and a physician from the Informatics Department than through formal teaching of IMASIS. Secondly, a protocol for evaluating the HIS is being developed and will be systematically applied to detect both database errors and system design pitfalls. Finally, the IMASIS project has to find a convenient point for starting, to offer short-term re... PMID

  17. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
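    The natural visibility criterion that maps a series (or series segment) to a graph can be written down directly: samples a and b are connected when every intermediate sample lies below the straight line joining them (Lacasa et al.'s condition). The sketch below covers only this mapping step; the paper's linking of successive segment graphs into a temporal network of networks is omitted:

```python
def visibility_graph(series):
    """Natural visibility graph: time points a and b are linked if every
    intermediate sample c lies strictly below the line joining
    (a, series[a]) and (b, series[b])."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            # Height of the a-b sight line at intermediate index c
            line = lambda c: series[a] + (series[b] - series[a]) * (c - a) / (b - a)
            if all(series[c] < line(c) for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

g = visibility_graph([1.0, 3.0, 2.0, 4.0])
```

Adjacent samples are always mutually visible, so the chain of consecutive edges is guaranteed; longer-range edges encode the series' peak structure.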

  18. Characteristics of solar diurnal variations: a case study based on records from the ground magnetic observatory at Vassouras, Brazil

    CERN Document Server

    Klausner, Virginia; Mendes, Odim; Domingues, Margarete O; Frick, Peter

    2011-01-01

    The horizontal component amplitudes observed by ground-based observatories of the INTERMAGNET network have been used to analyze the global pattern variance of the solar diurnal variations. Data from magnetic stations present gaps in their records, and consequently we explored them via a time-frequency gapped wavelet algorithm. After computing the gapped wavelet transform, we performed wavelet cross-correlation analysis, which was useful to isolate the period of the spectral components of the geomagnetic field in each of the selected magnetic stations and to correlate them as a function of scale (period) with the low-latitude Vassouras Observatory, Rio de Janeiro, Brazil, which is under the South Atlantic Magnetic Anomaly (SAMA) influence and should be used as a reference for an under-construction Brazilian network of magnetic observatories. The results show that the records in magnetic stations have a latitudinal dependence affected by the season of the year and by the level of solar activity. We have found a disparity on ...

  19. Seismic simulation analysis of a nuclear reactor building using observed earthquake records

    International Nuclear Information System (INIS)

    In this paper, to verify the effectiveness of dynamic response analysis technique, simulation analyses using observed records of five different earthquakes are performed for the reactor building of Unit 6 of the Fukushima Daiichi Nuclear Power Plant. A sway-rocking model (SR model) with embedment effect is adopted for the analyses. The model properties of the structure and soil springs are estimated by using the results of the forced vibration test. The soil properties are estimated by referring to the observed records of free field and the soil test data. The flow of the process for establishing the model properties is shown

  20. Hill Air Force Base, Utah, Final Record of Decision and Responsiveness Summary for Operable Unit 2

    OpenAIRE

    Ogden ALC

    1997-01-01

    This decision document presents the selected remedy for Operable Unit 2 (OU2) at Hill Air Force Base (HAFB), Utah. It was selected in accordance with the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA), as amended by the Superfund Amendments and Reauthorization Act of 1986 (SARA), and to the extent practicable, the National Oil and Hazardous Substances Pollution Contingency Plan (NCP). This decision is based on the Administrative Record for this site. Th...

  1. The (Anomalous) Hall Magnetometer as an Analysis Tool for High Density Recording Media

    NARCIS (Netherlands)

    Haan, de S.; Lodder, J.C.

    1991-01-01

    In this work an evaluation tool for the characterization of high-density recording thin film media is discussed. The measurement principles are based on the anomalous and the planar Hall effect. We used these Hall effects to characterize ferromagnetic Co-Cr films and Co/Pd multilayers having perpend

  2. Functional recordings from awake, behaving rodents through a microchannel based regenerative neural interface

    Science.gov (United States)

    Gore, Russell K.; Choi, Yoonsu; Bellamkonda, Ravi; English, Arthur

    2015-02-01

    group of awake and behaving animals. These unique findings provide preliminary evidence that efferent, volitional motor potentials can be recorded from the microchannel-based peripheral neural interface; a critical requirement for any neural interface intended to facilitate direct neural control of external technologies.

  3. A closed-loop compressive-sensing-based neural recording system

    Science.gov (United States)

    Zhang, Jie; Mitra, Srinjoy; Suo, Yuanming; Cheng, Andrew; Xiong, Tao; Michon, Frederic; Welkenhuysen, Marleen; Kloosterman, Fabian; Chin, Peter S.; Hsiao, Steven; Tran, Trac D.; Yazicioglu, Firat; Etienne-Cummings, Ralph

    2015-06-01

    Objective. This paper describes a low-power closed-loop compressive sensing (CS) based neural recording system. This system provides an efficient method to reduce data transmission bandwidth for implantable neural recording devices. By doing so, this technique reduces the majority of system power consumption, which is dissipated at the data readout interface. The design of the system is scalable and is a viable option for large-scale integration of electrodes or recording sites onto a single device. Approach. The entire system consists of an application-specific integrated circuit (ASIC) with 4 recording readout channels with CS circuits, a real-time off-chip CS recovery block, and a recovery quality evaluation block that provides closed feedback to adaptively adjust the compression rate. Since CS performance is strongly signal dependent, the ASIC has been tested in vivo and with standard public neural databases. Main results. Implemented using efficient digital circuits, this system is able to achieve >10 times data compression on the entire neural spike band (500 Hz-6 kHz) while consuming only 0.83 uW (0.53 V voltage supply) of additional digital power per electrode. When only the spikes are desired, the system is able to further compress the detected spikes by around 16 times. Unlike other similar systems, the characteristic spikes and inter-spike data can both be recovered, which guarantees a >95% spike classification success rate. The compression circuit occupied 0.11 mm2/electrode in a 180 nm CMOS process. The complete signal processing circuit consumes <16 uW/electrode. Significance. The power and area efficiency demonstrated by the system make it an ideal candidate for integration into large recording arrays containing thousands of electrodes. Closed-loop recording and reconstruction performance evaluation further improves the robustness of the compression method, thus making the system more practical for long-term recording.
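    A CS recovery block of the kind described solves for a sparse signal from a small number of random projections; the sketch below uses orthogonal matching pursuit as one representative recovery algorithm, and the measurement matrix, solver, and dimensions are illustrative rather than the paper's actual design:

```python
import numpy as np

def omp(phi, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x satisfying
    y = phi @ x from m << n compressed measurements."""
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(phi.T @ residual))))
        cols = phi[:, support]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)  # refit on the support
        residual = y - cols @ coef
    x = np.zeros(phi.shape[1])
    x[support] = coef
    return x

# 4x compression of a synthetic 3-sparse "spike" vector
rng = np.random.default_rng(0)
n, m, k = 256, 64, 3
phi = rng.normal(size=(m, n)) / np.sqrt(m)     # random measurement matrix
x_true = np.zeros(n)
x_true[[5, 40, 90]] = [1.0, -0.8, 0.5]
x_hat = omp(phi, phi @ x_true, k)
```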

  4. An integrable, web-based solution for easy assessment of video-recorded performances.

    Science.gov (United States)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars

    2014-01-01

    Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems; however, managing video recordings and assessment sheets is troublesome and may lead to missing or incorrect data. Currently, no existing software provides a local solution for the management of videos and assessments, but this is necessary because assessment scores are confidential information, and access to this information should be restricted to select personnel. A local software solution may also ease the need for customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video-recorded performances (ISEA). PMID:24833946

  5. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    Science.gov (United States)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  6. Smart Card Based Integrated Electronic Health Record System For Clinical Practice

    Directory of Open Access Journals (Sweden)

    N. Anju Latha

    2012-10-01

    Full Text Available Smart cards are used in information technologies as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems has become popular due to their increased capacity and performance. Smart cards can serve as an Electronic Health Record (EHR); their efficient use, with easy and fast data access facilities, has led to implementation that is particularly widespread in hospitals. In this paper, a smart card based Integrated Electronic Health Record system is developed. The system uses the smart card for personal identification and transfer of health data, and provides data communication. In addition to personal information, general health information about the patient is also loaded onto the patient's smart card. Health care providers use smart cards to access the data on patient cards. Electronic health records have a number of advantages over paper records: they improve accuracy and the quality of patient care, reduce cost, and increase efficiency and productivity. In the present work we measure clinical parameters of the patient such as blood pressure, blood glucose (diabetes mellitus) and pulse oximetry, and store the health details in the Electronic Health Record. The system has been successfully tested and implemented.

  7. A scheme for assuring lifelong readability in computer based medical records.

    Science.gov (United States)

    Matsumura, Yasushi; Kurabayashi, Noriyuki; Iwasaki, Tetsuya; Sugaya, Shuichi; Ueda, Kanayo; Mineno, Takahiro; Takeda, Hiroshi

    2010-01-01

    Medical records must be kept over an extended period of time, whereas computer-based medical record systems are renewed every 5-6 years. The readability of medical records must be assured even when the systems are renewed by different vendors. To achieve this, we proposed a method called DACS, in which a medical record is considered an aggregation of documents. A document generated by a system is transformed into a format readable by free software, such as PDF, and is transferred, together with the document meta-information and important data written in XML, to the Document Deliverer. It stores these data in the Document Archiver, the Document Sharing Server and the Data Warehouse (DWH). We developed the Matrix View, which shows documents in chronological order, and the Tree View, which shows documents in a class tree structure. By this method all the documents can be integrated and viewed in a single viewer. This helps users grasp the patient history and find a document being sought. In addition, document data can be shared among systems and analyzed by the DWH. Most importantly, DACS can assure the lifelong readability of medical records. PMID:20841656
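    A minimal sketch of the kind of XML meta-information that might accompany each converted document; the element names and values below are hypothetical illustrations, not the DACS schema:

```python
# Hypothetical sketch of per-document meta-information, built with the standard
# library's ElementTree. Element names (patient_id, doc_class, ...) are assumed
# for illustration and are not taken from the paper.
import xml.etree.ElementTree as ET

meta = ET.Element("document")
ET.SubElement(meta, "patient_id").text = "P-0001"
ET.SubElement(meta, "doc_class").text = "discharge_summary"
ET.SubElement(meta, "created").text = "2010-01-15"
ET.SubElement(meta, "file").text = "discharge_0001.pdf"  # the PDF it describes

xml_bytes = ET.tostring(meta, encoding="utf-8")  # serialized for transfer
print(xml_bytes.decode("utf-8"))
```

    Keeping the metadata in a vendor-neutral format like this is what lets a chronological Matrix View or a class-based Tree View be rebuilt after a system migration.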

  8. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  9. L10 FePt-based thin films for future perpendicular magnetic recording media

    International Nuclear Information System (INIS)

    Current magnetic recording media using perpendicular CoCrPt-oxide granular films are reaching their physical limit (approximately 750 Gbit/in² density) due to thermal fluctuations that hinder a further reduction of the grain size (<6–7 nm) needed to scale down the bit size. The L10-FePt alloy is currently considered the most promising candidate for future recording media with areal densities above 1 Tbit/in² thanks to its high magneto-crystalline anisotropy (K=6–10 MJ/m³), which enables it to be thermally stable even at grain sizes down to 3 nm. However, its huge anisotropy implies an increase of the switching field, which cannot be afforded by currently available write heads. To simultaneously address the writability and thermal stability requirements, exchange coupled composite media, combining hard and soft materials in two or more phases, where the hard phase provides thermal stability and the soft phase reduces the switching field, have recently been proposed. This paper briefly reviews the fundamental aspects as well as both the experimental approaches and the magnetic properties of L10 FePt-based single phase films and exchange coupled systems for future perpendicular magnetic recording media. - Highlights: • Up-to-date review on the progress in the study of FePt films for magnetic recording. • Single phase L10 FePt films: fundamental properties and experimental approaches. • Basics of exchange coupled composite media. • FePt-based exchange coupled systems: magnetic properties and preparation approaches

  10. The Private Communications of Magnetic Recording under Socialism (Retrospective Disco Analysis)

    Directory of Open Access Journals (Sweden)

    Oleg Vladimir Sineokij

    2013-07-01

    Full Text Available The article analyzes the formation and development of a general model of rare sound records in the structure of institutions of social communication. The author considers the psycho-communicative features of filophone communication as a special type of interaction in the field of entertainment, and studies the causes and conditions of the tape subculture in the USSR. The dynamics of disco-communication are traced from the limited information conditions of socialism to modern high-tech conditions. At the end of the article the author argues, based on achievements in the field of advanced technology systems, for an innovative revival of the music-recording industry. Using innovative approaches, the author sets out the basic concept of recording popular music as a special informational and legal institution, viewing in retrospect the theory and practice of the future needs of the information society.

  11. Analysis of Background Seismic Noise Recorded at the Amundsen-Scott South Pole Station, Antarctica

    Science.gov (United States)

    Anderson, K. R.; Aster, R.; Beaudoin, B. C.; Butler, R.

    2006-12-01

    A small array of high-frequency seismometers was recently placed around the Amundsen-Scott South Pole Station in order to characterize seismic noise generated by the station during operations. This week-long experiment, titled "South Pole Analysis of Machines" (SPAM), was conducted in January 2006 using equipment provided by IRIS PASSCAL to sample the high-frequency noise sources generated at the NSF's research base. These data will be correlated with those observed at the ultra-quiet GSN seismic station (QSPA) located 5 miles from the base. The purpose of the experiment is to show that although the QSPA sensors are 5 miles away and nearly 1000 feet deep in the ice, there is still a risk of contamination of the signals by cultural noise from the South Pole research base. A Quiet Sector was established around the QSPA station in order to minimize vibrational noise sources, but there is interest in moving some experiments out into the Quiet Sector. Characterizing the noise sources will help us determine the potential reduction in data quality expected at the QSPA station as experiments move closer to the site. Sensors were placed next to the power generators, aircraft taxiway, and large antenna towers, as well as at the base of the new station itself. Sensors were also placed between the research base and the QSPA station to get an idea of the propagation of the noise toward the QSPA station. Several high-frequency noise sources are clearly seen on all array elements, with a number of very clear spectral lines above 1 Hz. These are primarily associated with snow-moving tractors and power generators. Smaller signals are seen that may be related to wind loading on the new South Pole elevated station, along with harmonics that appear to be correlated with large air-handling equipment in the station. Air operations are also evident, with landings, takeoffs, taxiing and idling C-130s. Although greatly attenuated, almost all of these signals are observed at the QSPA

  12. Web application for recording learners’ mouse trajectories and retrieving their study logs for data analysis

    Directory of Open Access Journals (Sweden)

    Yoshinori Miyazaki

    2012-03-01

    Full Text Available With the accelerated implementation of e-learning systems in educational institutions, it has become possible in recent years to record learners' study logs. It must be admitted that little research has been conducted on the analysis of the study logs that are obtained. In addition, there is no software that traces the mouse movements of learners during their learning processes, which the authors believe would enable teachers to better understand their students' behaviors. The objective of this study is to develop a Web application that records students' study logs, including their mouse trajectories, and to devise an IR (information retrieval) tool that can summarize such diversified data. The results of an experiment are also scrutinized to provide an analysis of the relationship between learners' activities and their study logs.
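    As a rough sketch of the kind of summarization such a tool might perform (the log format and the chosen metrics are assumptions for illustration, not the authors' implementation), a recorded trajectory of (time, x, y) samples can be reduced to simple aggregates:

```python
# Hypothetical mouse-trajectory summary: total path length (pixels) and duration
# (seconds) from (t, x, y) samples. Data values are made up for illustration.
from math import hypot

trajectory = [(0.00, 10, 10), (0.05, 13, 14), (0.10, 19, 22), (0.20, 19, 22)]

def summarize(samples):
    """Sum the Euclidean distances between consecutive samples; span the timestamps."""
    length = sum(hypot(x2 - x1, y2 - y1)
                 for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:]))
    duration = samples[-1][0] - samples[0][0]
    return length, duration

length, duration = summarize(trajectory)
print(length, duration)  # → 15.0 0.2
```

    Aggregates like these (plus dwell times, revisits, etc.) are the sort of features a teacher-facing summary could report per learner.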

  13. Role-based access control through on-demand classification of electronic health record.

    Science.gov (United States)

    Tiwari, Basant; Kumar, Abhay

    2015-01-01

    Electronic health records (EHR) provide a convenient method to exchange the medical information of patients between different healthcare providers. Access control mechanisms in healthcare services determine which users are authorised to access EHR records. Role-based access control restricts EHRs to users in a certain role. Significant work has been carried out on access control over the last decade, but little emphasis has been given to on-demand role-based access control. The presented work achieves access control through physical data isolation, which is more robust and secure. We propose an algorithm in which a selective combination of policies is defined for each user of the EHR database. We extend the well-known data mining technique of classification to group EHRs with respect to a given role. The algorithm treats the various roles as classes and defines their features as vectors; these feature vectors are used in classification to describe user authority. PMID:26559071
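    The "roles as classes, features as vectors" idea can be sketched with a tiny nearest-centroid classifier; the feature encoding and all values below are hypothetical, not the paper's algorithm:

```python
# Hypothetical sketch: grouping EHR access by role via nearest-centroid
# classification. Each feature vector (all values synthetic) encodes a user's
# access pattern, e.g. [diagnoses_read, prescriptions_written, billing_views].
from math import dist  # Euclidean distance (Python 3.8+)

# Training data: (feature_vector, role) pairs -- purely illustrative.
TRAINING = [
    ([9.0, 8.0, 0.0], "physician"),
    ([8.0, 7.0, 1.0], "physician"),
    ([6.0, 2.0, 0.0], "nurse"),
    ([5.0, 1.0, 1.0], "nurse"),
    ([0.0, 0.0, 9.0], "billing"),
    ([1.0, 0.0, 8.0], "billing"),
]

def centroids(training):
    """Mean feature vector per role (class)."""
    sums, counts = {}, {}
    for vec, role in training:
        acc = sums.setdefault(role, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[role] = counts.get(role, 0) + 1
    return {r: [v / counts[r] for v in s] for r, s in sums.items()}

def classify(vec, cents):
    """Assign the role whose centroid is nearest to the feature vector."""
    return min(cents, key=lambda r: dist(vec, cents[r]))

cents = centroids(TRAINING)
print(classify([7.0, 6.0, 0.0], cents))  # a physician-like pattern → "physician"
```

    A production system would of course use a richer feature set and a validated classifier; the point is only that role assignment reduces to classifying a feature vector.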

  14. The INGV's new OBS/H: Analysis of the signals recorded at the Marsili submarine volcano

    Science.gov (United States)

    D'Alessandro, Antonino; D'Anna, Giuseppe; Luzio, Dario; Mangano, Giorgio

    2009-05-01

    The ocean bottom seismometer with hydrophone deployed on the flat top of the Marsili submarine volcano (790 m deep) by the Gibilmanna OBS Lab (CNT-INGV) from 12th to 21st July, 2006, recorded more than 1000 transient seismic signals. Nineteen of these signals were associated with tectonic earthquakes: 1 teleseismic, 8 regional (located by INGV) and 10 small local seismic events (earthquakes that were not located). The regional events were used to determine the sensor orientation. By comparing the recorded signals with typical volcanic seismic activity, we were able to group all the other signals into three categories: 817 volcano-tectonic type B (VT-B) events, 159 occurrences of high frequency tremor (HFT) and 32 short duration events (SDE). Small-magnitude VT-B swarms, having a frequency band of 2-6 Hz and a mean length of about 30 s, were almost all recorded during the first 7 days. During the last 2 days, the OBS/H mainly recorded HFT events with frequencies of over 40 Hz and lengths of a few minutes. Signals with similar features in the frequency and time domains are generally associated with hydrothermal activity. During the last two days a signal was also recorded that had a frequency content similar to that of the VT-B events; it will be referred to as continuous volcanic tremor (CVT). The SDE signals, characterized by a quasi-monochromatic waveform and an exponentially decaying envelope, may have been generated by oscillations of resonant bodies excited by magmatic or hydrothermal activity. By applying polarization and parametric spectral analyses, we inferred that the VT-B were probably multi P-phase events having shallow sources situated in narrow azimuthal windows relative to the position of the OBS/H. The parametric spectral analysis of the SDE signals allowed us to determine their dominant complex frequencies with high accuracy; these frequencies are distributed in two distinct clusters on the complex plane.

  15. Climate elasticity of streamflow revisited – an elasticity index based on long-term hydrometeorological records

    OpenAIRE

    V. Andréassian; L. Coron; Lerat, J.; Le Moine, N.

    2015-01-01

    We present a new method to derive the empirical (i.e., data-based) elasticity of streamflow to precipitation and potential evaporation. This method, which uses long-term hydrometeorological records, is tested on a set of 519 French catchments. We compare a total of five different ways to compute elasticity: the reference method first proposed by Sankarasubramanian et al. (2001) and four alternatives differing in the type of regression model chosen (OLS or GLS, univari...

  16. Cloud-based Electronic Health Records for Real-time, Region-specific Influenza Surveillance.

    Science.gov (United States)

    Santillana, M; Nguyen, A T; Louie, T; Zink, A; Gray, J; Sung, I; Brownstein, J S

    2016-01-01

    Accurate real-time monitoring systems of influenza outbreaks help public health officials make informed decisions that may help save lives. We show that information extracted from cloud-based electronic health records databases, in combination with machine learning techniques and historical epidemiological information, have the potential to accurately and reliably provide near real-time regional estimates of flu outbreaks in the United States. PMID:27165494
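    The combination the abstract describes (EHR-derived signals plus historical epidemiological information) can be caricatured with ordinary least squares; every number below is synthetic and the feature choice is an assumption, not the authors' model:

```python
# Toy nowcasting sketch (synthetic data, not the authors' model): fit a
# least-squares map from weekly EHR flu-visit counts and last week's
# surveillance value to the current influenza-like-illness (ILI) rate.
import numpy as np

# Columns: [EHR flu-visit count, previous week's ILI rate, bias term]
X = np.array([
    [120.0, 1.0, 1.0],
    [180.0, 1.4, 1.0],
    [260.0, 2.1, 1.0],
    [300.0, 2.9, 1.0],
    [240.0, 3.3, 1.0],
])
y = np.array([1.4, 2.1, 2.9, 3.3, 2.8])    # current ILI rate (toy ground truth)

w, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
nowcast = np.array([270.0, 2.8, 1.0]) @ w  # estimate for a new, unseen week
print(round(float(nowcast), 2))
```

    The real systems replace this with regularized or nonlinear learners and many more predictors, but the structure (current proxy signal plus autoregressive history) is the same.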

  17. Implications of the Java language on computer-based patient records.

    OpenAIRE

    Pollard, D.; Kucharz, E.; Hammond, W.E.

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal so...

  19. An Improved Itinerary Recording Protocol for Securing Distributed Architectures Based on Mobile Agents

    OpenAIRE

    Guillaume Allée; Samuel Pierre; Roch H. Glitho; Abdelmorhit El Rhazi

    2005-01-01

    This paper proposes an improved itinerary recording protocol for securing distributed architectures based on mobile agents. The behavior of each of the cooperating agents is described, as well as the decision process establishing the identities of offenders when an attack is detected. Our protocol is tested on a set of potential attacks and the results confirm our assumption regarding offender designations and moments of detection. More precisely, the performance evaluation shows that our pro...

  20. Optimization of an acrylamides-based photopolymer for reflection holographic recording

    OpenAIRE

    Jallapuram, Raghavendra

    2005-01-01

    Photopolymers have been the subject of special attention in the last two decades. The advantage of being self-developing makes them a practical alternative to silver halide photographic emulsions in holographic interferometry applications. This thesis is aimed at understanding diffusion properties and optimization of an acrylamide-based green sensitized photopolymer material for reflection holographic recording. The composition of the photopolymer includes a green sensitive dye (erythrosine B...

  1. Comparison of experimental approaches to study selective properties of thick phase-amplitude holograms recorded in materials with diffusion-based formation mechanisms

    Science.gov (United States)

    Borisov, Vladimir; Klepinina, Mariia; Veniaminov, Andrey; Angervaks, Aleksandr; Shcheulin, Aleksandr; Ryskin, Aleksandr

    2016-04-01

    Volume holographic gratings, both transmission and reflection type, may be employed as one-dimensional photonic crystals. More complex two- and three-dimensional holographic photonic-crystalline structures can be recorded using several properly organized beams. Compared to colloidal photonic crystals, their holographic counterparts minimize the distortions caused by multiple inner boundaries of the media. Unfortunately, it is still hard to analyze the spectral response of holographic structures. This work presents the results of an analysis of thick holographic gratings based on the approximation of spectral-angular selectivity contours. The gratings were recorded in an additively colored fluorite crystal and a glassy polymer doped with phenanthrenequinone (PQ-PMMA). The two materials, known as promising candidates for 3D diffraction optics including photonic crystals, employ diffusion-based mechanisms of grating formation. The surfaces of spectral-angular selectivity were obtained in a single scan using a white-light LED, a rotatable table and a matrix spectrometer. The data, expressed as 3D plots, allow straightforward visual estimation of the grating's phase/amplitude nature, the nonlinearity of recording, etc., and provide sufficient information for numerical analysis. The grating recorded in the crystal was found to be a mixed phase-amplitude one, with different contributions of refractive-index and absorbance modulation at different wavelengths, and demonstrated three diffraction orders corresponding to its three spatial harmonics originating from the intrinsically nonlinear diffusion-drift recording mechanism. In contrast, the grating in the polymeric medium appeared to be purely phase and linearly recorded.
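    For background, the standard Bragg condition for a reflection volume grating (textbook theory, not taken from the paper; the index and period below are assumed values) links each readout angle to the wavelength of peak diffraction, which is what a spectral-angular selectivity surface traces out:

```python
# Standard Bragg condition for a reflection volume grating:
#   lambda = 2 * n * Lambda * cos(theta)
# n and Lambda below are illustrative assumptions (PMMA-like index, sub-micron period).
from math import cos, radians

n = 1.49          # average refractive index (assumed)
Lambda = 0.18e-6  # grating period in metres (assumed)

def bragg_wavelength(theta_deg):
    """Vacuum wavelength satisfying the Bragg condition at internal angle theta."""
    return 2 * n * Lambda * cos(radians(theta_deg))

for theta in (0, 15, 30):
    print(round(bragg_wavelength(theta) * 1e9), "nm")  # peak shifts blue with angle
```

    Scanning wavelength and angle together, as the white-light LED setup in the paper does, sweeps out this curve and its selectivity envelope in one pass.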

  2. Multi-image Photogrammetry for Underwater Archaeological Site Recording: An Accessible, Diver-Based Approach

    Science.gov (United States)

    McCarthy, John; Benjamin, Jonathan

    2014-06-01

    This article presents a discussion of recent advances in underwater photogrammetric survey, illustrated by case studies in Scotland and Denmark between 2011 and 2013. Results from field trials are discussed with the aim of illustrating practical low-cost solutions for recording underwater archaeological sites in 3D using photogrammetry and using these data to offer enhanced recording, interpretation and analysis. We argue that the availability of integrated multi-image photogrammetry software, highly light-sensitive digital sensors and wide-aperture compact cameras now allows for simple workflows with minimal equipment and excellent natural-colour images even at depths of up to 30 m. This has changed the possibilities for underwater photogrammetric recording, which can now be done on a small scale through the use of a single camera and an automated workflow. The intention of this paper is to demonstrate the quality and versatility of the 'one camera/ambient light/integrated software' technique through the case studies presented and the results derived from this process. We also demonstrate how the 3D data generated can be subjected to surface analysis techniques to enhance detail and to generate data-driven fly-throughs and reconstructions, opening the door to new avenues of engagement with both specialists and the wider public.

  3. CNT/PDMS-based canal-typed ear electrodes for inconspicuous EEG recording

    Science.gov (United States)

    Lee, Joong Hoon; Lee, Seung Min; Byeon, Hang Jin; Hong, Joung Sook; Park, Kwang Suk; Lee, Sang-Hoon

    2014-08-01

    Objective. Current electroencephalogram (EEG) monitoring systems typically require cumbersome electrodes that must be pasted onto the scalp, making private recording of an EEG in a public place difficult. We have developed a small, user-friendly, biocompatible electrode with a good appearance for inconspicuous EEG monitoring. Approach. We fabricated carbon nanotube/polydimethylsiloxane (CNT/PDMS)-based canal-type ear electrodes (CEE) for EEG recording. These electrodes serve an additional function: delivering sound stimulation like earphones while simultaneously recording EEG for auditory brain-computer interfaces (BCI). The electrode performance was evaluated with a standard EEG measurement paradigm, including the detection of alpha rhythms and measurements of the N100 auditory evoked potential (AEP), steady-state visual evoked potential (SSVEP) and auditory steady-state response (ASSR). Furthermore, the bio- and skin-compatibility of CNT/PDMS were tested. Main results. All feasibility studies were successfully recorded with the fabricated electrodes, and the biocompatibility of CNT/PDMS was also proved. Significance. These electrodes could be used to monitor EEG clinically, in ubiquitous health care and in brain-computer interfaces.

  4. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording.

    Science.gov (United States)

    Crescentini, Marco; Bennati, Marco; Saha, Shimul Chandra; Ivica, Josip; de Planque, Maurits; Morgan, Hywel; Tartagni, Marco

    2016-01-01

    High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. A HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the range ±200 pA and ±20 nA, with 100 kHz bandwidth while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to prove both the platform and the current-to-digital converter. PMID:27213382
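    The quoted noise figures can be sanity-checked by integrating the white input-referred noise density over the stated bandwidth (assuming a flat, brick-wall noise spectrum, which is a simplification):

```python
# Back-of-envelope check of the figures above: a white noise density of
# 4 fA/sqrt(Hz) integrated over the 100 kHz bandwidth gives the RMS current
# noise floor, i_rms = density * sqrt(bandwidth). Brick-wall spectrum assumed.
from math import sqrt

density = 4e-15      # input-referred noise density, A/sqrt(Hz)
bandwidth = 100e3    # amplifier bandwidth, Hz

i_rms = density * sqrt(bandwidth)
print(f"{i_rms * 1e12:.2f} pA")  # ~1.26 pA RMS over the full bandwidth
```

    That floor of roughly a picoampere RMS is what makes single-channel currents in the tens-of-pA range resolvable at full bandwidth.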

  5. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording

    Directory of Open Access Journals (Sweden)

    Marco Crescentini

    2016-05-01

    Full Text Available High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. A HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the range ±200 pA and ±20 nA, with 100 kHz bandwidth while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at the ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to prove both the platform and the current-to-digital converter.

  7. ANALYSIS-BASED SPARSE RECONSTRUCTION WITH SYNTHESIS-BASED SOLVERS

    OpenAIRE

    Cleju, Nicolae; Jafari, Maria,; Plumbley, Mark D.

    2012-01-01

    Analysis-based reconstruction has recently been introduced as an alternative to the well-known synthesis sparsity model used in a variety of signal processing areas. In this paper we convert the analysis exact-sparse reconstruction problem to an equivalent synthesis recovery problem with a set of additional constraints. We are therefore able to use existing synthesis-based algorithms for analysis-based exact-sparse recovery. We call this the Analysis-By-Synthesis (ABS) approach. We evaluate o...

  8. Multi-periodic climate dynamics: spectral analysis of long-term instrumental and proxy temperature records

    Directory of Open Access Journals (Sweden)

    H.-J. Lüdecke

    2012-09-01

    Full Text Available The longest six instrumental temperature records of monthly means reach back maximally to 1757 AD and were recorded in Europe. All six show a V-shape, with a temperature drop in the 19th century and a rise in the 20th. Proxy temperature time series of Antarctic ice cores show this same characteristic shape, indicating this pattern is a global phenomenon. We used the mean of the 6 instrumental records for analysis by discrete Fourier transformation (DFT), wavelets, and the detrended fluctuation method (DFA). For comparison, a stalagmite record was also analyzed by DFT. The harmonic decomposition of the mean shows only 6 significant frequencies with periods over 30 yr. The Pearson correlation between the mean, smoothed by a 15 yr running average (boxcar), and the reconstruction using the 6 significant frequencies yields r = 0.961. This good agreement has a > 99.9% confidence level, confirmed by Monte Carlo simulations. The assumption of additional forcing by anthropogenic greenhouse gases would therefore not improve the agreement between the measurements and the temperature reconstruction from the 6 documented periodicities. We find indications that the observed periodicities result from intrinsic system dynamics.
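    The decomposition-and-reconstruction procedure can be illustrated on a synthetic series; the length and periods below are toy choices (picked to fall exactly on DFT bins), not the paper's data:

```python
# Toy illustration of the harmonic-decomposition approach (synthetic series,
# not the paper's records): build a "temperature" signal from 3 periodicities,
# keep only the 3 strongest DFT lines, reconstruct, and compute the Pearson
# correlation between original and reconstruction.
import numpy as np

n = 1024                                  # number of monthly samples (toy length)
t = np.arange(n)
periods = [256, 64, 32]                   # dominant periods, in samples
signal = sum(np.cos(2 * np.pi * t / p) for p in periods)

spectrum = np.fft.rfft(signal)
top = np.argsort(np.abs(spectrum))[-3:]   # indices of the 3 strongest lines
kept = np.zeros_like(spectrum)
kept[top] = spectrum[top]                 # zero out everything else
recon = np.fft.irfft(kept, n)             # reconstruction from the kept lines

r = np.corrcoef(signal, recon)[0, 1]      # Pearson correlation, as in the paper
print(round(r, 3))                        # ~1.0 for this noise-free toy series
```

    With real, noisy records the correlation drops below 1 (the paper reports r = 0.961 after smoothing), and the significance of each retained line must be tested, e.g. by Monte Carlo simulation.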

  9. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t, Fy(t and Fz(t curves, automatically determines the extremes of functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t, Fy(t, Fz(t functions, durations of the braking and propulsive phase, duration of the double support phase, the magnitudes of reaction forces in extremes of measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variable offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. 
The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
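
Two of the quantities named above, impulses of force and an index of symmetry, can be sketched as follows. The sign convention for braking vs. propulsion, the Robinson form of the symmetry index, and all numeric values are assumptions for illustration, not the paper's definitions or data:

```python
import numpy as np

def impulses(fy, dt):
    """Braking and propulsive impulses from the anteroposterior GRF Fy(t).

    Assumed convention: braking is the negative lobe, propulsion the
    positive lobe (simple rectangle-rule integration).
    """
    braking = np.sum(np.minimum(fy, 0.0)) * dt
    propulsive = np.sum(np.maximum(fy, 0.0)) * dt
    return braking, propulsive

def symmetry_index(left, right):
    """Robinson-style symmetry index (%); 0 means perfect left/right symmetry."""
    return 100.0 * (left - right) / (0.5 * (left + right))

# Toy stance-phase Fy: a braking lobe followed by a propulsive lobe.
t = np.linspace(0.0, 0.6, 601)
fy = np.where(t < 0.3,
              -150.0 * np.sin(np.pi * t / 0.3),
              120.0 * np.sin(np.pi * (t - 0.3) / 0.3))
braking, propulsive = impulses(fy, dt=t[1] - t[0])
si = symmetry_index(800.0, 760.0)  # hypothetical peak Fz of left/right steps
```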

  10. Paper-Based Medical Records: the Challenges and Lessons Learned from Studying Obstetrics and Gynaecological Post-Operation Records in a Nigerian Hospital

    Directory of Open Access Journals (Sweden)

    Adekunle Yisau Abdulkadir

    2010-10-01

    Full Text Available AIM: With the background knowledge that auditing of Medical Records (MR) for adequacy and completeness is necessary if they are to be useful and reliable in continuing patient care; protection of the legal interests of the patient, physicians, and the hospital; and meeting requirements for research, we scrutinized the theatre records of our hospital to identify routine omissions or deficiencies, and correctable errors in our MR system. METHOD: Obstetrics and gynaecological post-operation theatre records between January 2006 and December 2008 were quantitatively and qualitatively analyzed for details that included: hospital number; patients' age; diagnosis; surgery performed; types and modes of anesthesia; date of surgery; patients' ward; anesthetists' names; surgeons' and attending nurses' names; and abbreviations used, with SPSS 15.0 for Windows. RESULTS: Hardly any of the 1270 surgeries during the study period were documented without an omission or an abbreviation. Hospital numbers and patients' age were not documented in 21.8% (n=277) and 59.1% (n=750) respectively. Diagnoses and surgeries were recorded with varying abbreviations in about 96% of instances. Surgical team names were mostly abbreviated or only initials were given. CONCLUSION: To improve the quality of paper-based medical records, regular auditing, training and good orientation of medical personnel in good record practices, and discouraging large-volume record books to reduce paper damage and sheet loss from handling are necessary, else what we record today may be neither useful nor available tomorrow. [TAF Prev Med Bull 2010; 9(5): 427-432]

  11. Quantitative complexity analysis in multi-channel intracranial EEG recordings from epilepsy brains

    Science.gov (United States)

    Liu, Chang-Chia; Pardalos, Panos M.; Chaovalitwongse, W. Art; Shiau, Deng-Shan; Ghacibeh, Georges; Suharitdamrong, Wichai; Sackellares, J. Chris

    2008-01-01

    Epilepsy is a brain disorder characterized clinically by temporary but recurrent disturbances of brain function that may or may not be associated with loss of consciousness and abnormal behavior. The human brain is composed of more than 10^10 neurons, each of which receives electrical impulses known as action potentials from other neurons via synapses and sends electrical impulses via a single output line (the axon) to a similar number of neurons. When neuronal networks are active, they produce a change in voltage potential, which can be captured by an electroencephalogram (EEG). The EEG recordings represent time series that correspond to neurological activity as a function of time. By analyzing the EEG recordings, we sought to evaluate the degree of underlying dynamical complexity prior to seizure onset. Through the use of dynamical measurements, it is possible to classify the state of the brain according to the underlying dynamical properties of the EEG recordings. In the results from two patients with temporal lobe epilepsy (TLE), the degree of complexity was observed to start converging to a lower value prior to the epileptic seizures, in epileptic regions as well as non-epileptic regions. The dynamical measurements appear to reflect the changes of the EEG's dynamical structure. We suggest that nonlinear dynamical analysis can provide useful information for detecting relative changes in brain dynamics, which cannot be detected by conventional linear analysis. PMID:19079790
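
The abstract does not name the specific complexity measure used, so the following is only a generic sketch of one common choice, sample entropy, where lower values indicate reduced dynamical complexity of the kind described before seizures. The parameters m and r are conventional defaults, and the test signals are synthetic:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a scalar series; lower values suggest a more
    regular (less complex) underlying dynamics."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def match_count(mm):
        # Embed the series in mm-dimensional template vectors.
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between all template pairs.
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(emb)) / 2  # exclude self-matches

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

n = np.arange(400)
regular = np.sin(2 * np.pi * n / 25)                      # highly regular signal
noise = np.random.default_rng(1).standard_normal(400)     # irregular signal
se_regular = sample_entropy(regular)
se_noise = sample_entropy(noise)
```

In a seizure-prediction setting this would be computed over sliding windows of each EEG channel and its trend tracked over time.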

  12. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: ll.portes@gmail.com [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)

    2014-06-27

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA.
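
Two basic RQA coefficients of the kind studied here, recurrence rate (RR) and determinism (DET), can be sketched as follows. This is a minimal version on a raw scalar series with no delay embedding, and the threshold and test signals are illustrative, not those of the Rössler study:

```python
import numpy as np

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a scalar series s(t).

    Minimal sketch: thresholded distance matrix on the raw series; a full
    analysis would first delay-embed the recorded variable.
    """
    x = np.asarray(x, dtype=float)
    R = np.abs(x[:, None] - x[None, :]) <= eps   # recurrence matrix
    n = len(x)
    total = R.sum()
    rr = total / n**2
    # DET: fraction of recurrent points on diagonal lines of length >= lmin.
    diag_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [False]:
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    det = diag_points / total if total else 0.0
    return rr, det

t = np.arange(200)
rr_p, det_p = rqa_measures(np.sin(2 * np.pi * t / 20), eps=0.2)   # periodic
rr_n, det_n = rqa_measures(
    np.random.default_rng(2).standard_normal(200), eps=0.2)       # stochastic
```

A periodic signal yields long diagonal lines and hence high DET; an irregular one does not, which is the kind of contrast the RQA coefficients quantify.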

  13. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I. We will also demonstrate an appropriate way to use the system (Experiment II. In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
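
For two raters over N epochs, the prevalence-adjusted bias-adjusted kappa reduces to a simple function of observed agreement, PABAK = 2·Po − 1. A sketch (the epoch counts are illustrative, not the study's):

```python
def pabak(n_agree, n_total):
    """PABAK = 2 * Po - 1, where Po is the observed proportion of epochs
    (here, 10-sec epochs) on which the two analyses agree."""
    po = n_agree / n_total
    return 2.0 * po - 1.0

# e.g., 915 agreeing epochs out of 1000 gives PABAK = 0.83,
# matching the mean value reported in the abstract.
value = pabak(915, 1000)
```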

  14. Study on key techniques for camera-based hydrological record image digitization

    Science.gov (United States)

    Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping

    2015-10-01

    With the development of information technology, the digitization of scientific or engineering drawings has received more and more attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record the observations from sensors. However, these paper drawings may be destroyed or contaminated due to improper preservation or overuse. Further, it is a heavy workload and prone to error if these data are manually transcribed into the computer. Hence, digitizing these drawings and establishing the corresponding database will ensure the integrity of the data and provide invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques, i.e., image segmentation, intersection point localization and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water level sheet image has been proposed, which adaptively fuses gradient and color information. Second, a fast search strategy for cross point location is devised, which avoids point-by-point processing with the help of grid distribution information. And finally, we put forward a local rectification method through analyzing the central portions of the image and utilizing domain knowledge of hydrology. The processing speed is accelerated, while the accuracy remains satisfactory. Experiments on several real water level records show that our proposed techniques are effective and capable of recovering the hydrological observations accurately.

  15. An integrable, web-based solution for easy assessment of video-recorded performances

    Directory of Open Access Journals (Sweden)

    Subhi Y

    2014-05-01

    Full Text Available Yousif Subhi,1,2,3 Tobias Todsen,1,4 Lars Konge1,2; 1Centre for Clinical Education, Centre for HR, The Capital Region of Denmark, Copenhagen, Denmark; 2University of Copenhagen, Copenhagen, Denmark; 3Clinical Eye Research Unit, Department of Ophthalmology, Copenhagen University Hospital Roskilde, Roskilde, Denmark; 4Department of Internal Medicine, Queen Ingrid's Hospital, Nuuk, Greenland. Abstract: Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems; however, managing video recordings and assessment sheets is troublesome and may lead to missing or incorrect data. Currently, no existing software solution can provide a local solution for the management of videos and assessments, but this is necessary as assessment scores are confidential information, and access to this information should be restricted to select personnel. A local software solution may also ease the need for customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video-recorded performances (ISEA). Keywords: education assessment, assessment software, video-based assessment

  16. A Satellite-Based Surface Radiation Climatology Derived by Combining Climate Data Records and Near-Real-Time Data

    Directory of Open Access Journals (Sweden)

    Bodo Ahrens

    2013-09-01

    Full Text Available This study presents a method for adjusting long-term climate data records (CDRs for the integrated use with near-real-time data using the example of surface incoming solar irradiance (SIS. Recently, a 23-year long (1983–2005 continuous SIS CDR has been generated based on the visible channel (0.45–1 μm of the MVIRI radiometers onboard the geostationary Meteosat First Generation Platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF. Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions consisting of MVIRI-derived SIS (1983–2005 and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present was applied that combines the Standard Normal Homogeneity Test (SNHT and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the ground stations mean, in accordance with statistical significance. Several stations of the Baseline Surface Radiation Network (BSRN and about 50 stations of the Global Energy Balance Archive (GEBA over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994 probably due to artefacts in the raw data and instrument failures. After 2005 the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations. Homogenisation is performed by applying a mean-shift correction depending on the shift size of
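
The SNHT break detection on a candidate-minus-reference series can be sketched as follows. This is a minimal single-break version; the study combines SNHT with a penalized maximal T-test and visual inspection, and the synthetic series here is an assumption for illustration:

```python
import numpy as np

def snht(diff_series):
    """Standard Normal Homogeneity Test: returns (T_max, k), where a large
    T_max suggests a mean shift in the series right after index k.

    diff_series: candidate minus reference, e.g. satellite SIS minus the
    ground-station mean.
    """
    z = np.asarray(diff_series, dtype=float)
    z = (z - z.mean()) / z.std()
    n = len(z)
    # SNHT statistic for every candidate break position k.
    t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    k_best = int(np.argmax(t)) + 1
    return float(t.max()), k_best

# Synthetic difference series with a 1-unit shift after index 50.
series = np.concatenate([np.zeros(50), np.ones(50)])
t_max, k_best = snht(series)
```

In the homogenisation described above, the detected shift size would then be used for a mean-shift correction across the satellite generations.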

  17. Beat-to-beat analysis method for magnetocardiographic recordings during interventions

    International Nuclear Information System (INIS)

    Multichannel magnetocardiography (MCG) during exercise testing has been shown to detect myocardial ischaemia in patients with coronary artery disease. Previous studies on exercise MCG have focused on one or few time intervals during the recovery period and only a fragment of the data available has been utilized. We present a method for beat-to-beat analysis and parametrization of the MCG signal. The method can be used for studying and quantifying the changes induced in the MCG by interventions. We test the method with data recorded in bicycle exercise testing in healthy volunteers and patients with coronary artery disease. Information in all cardiac cycles recorded during the recovery period of exercise MCG testing is, for the first time, utilized in the signal analysis. Exercise-induced myocardial ischaemia was detected by heart rate adjustment of change in magnetic field map orientation. In addition to the ST segment, the T wave in the MCG was also found to provide information related to myocardial ischaemia. The method of analysis efficiently utilizes the spatial and temporal properties of multichannel MCG mapping, providing a new tool for detecting and quantifying fast phenomena during interventional MCG studies. The method can also be applied to an on-line analysis of MCG data. (author)

  18. Beat-to-beat analysis method for magnetocardiographic recordings during interventions

    Energy Technology Data Exchange (ETDEWEB)

    Takala, P. [Laboratory of Biomedical Engineering, Helsinki University of Technology, FIN-02015 HUT (Finland)]|[BioMag Laboratory, Medical Engineering Centre, Helsinki University Central Hospital, FIN-00029 HUCH (Finland). E-mail: panu.takala@hut.fi; Montonen, J.; Nenonen, J.; Katila, T. [Laboratory of Biomedical Engineering, Helsinki University of Technology, FIN-02015 HUT (Finland); BioMag Laboratory, Medical Engineering Centre, Helsinki University Central Hospital, FIN-00029 HUCH (Finland); Haenninen, H.; Maekijaervi, M.; Toivonen, L. [BioMag Laboratory, Medical Engineering Centre, Helsinki University Central Hospital, FIN-00029 HUCH (Finland); Division of Cardiology, Helsinki University Central Hospital, FIN-00029 HUCH (Finland)

    2001-04-01

    Multichannel magnetocardiography (MCG) during exercise testing has been shown to detect myocardial ischaemia in patients with coronary artery disease. Previous studies on exercise MCG have focused on one or few time intervals during the recovery period and only a fragment of the data available has been utilized. We present a method for beat-to-beat analysis and parametrization of the MCG signal. The method can be used for studying and quantifying the changes induced in the MCG by interventions. We test the method with data recorded in bicycle exercise testing in healthy volunteers and patients with coronary artery disease. Information in all cardiac cycles recorded during the recovery period of exercise MCG testing is, for the first time, utilized in the signal analysis. Exercise-induced myocardial ischaemia was detected by heart rate adjustment of change in magnetic field map orientation. In addition to the ST segment, the T wave in the MCG was also found to provide information related to myocardial ischaemia. The method of analysis efficiently utilizes the spatial and temporal properties of multichannel MCG mapping, providing a new tool for detecting and quantifying fast phenomena during interventional MCG studies. The method can also be applied to an on-line analysis of MCG data. (author)

  19. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach of using the Design Structure Matrix (DSM modeling technique to improve the design of Electronic Medical Record (EMR user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians’ time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.

  20. Development of Co-Cr-based longitudinal magnetic recording media:Thermodynamic consideration

    Institute of Scientific and Technical Information of China (English)

    QIN Gao-wu; K. Oikawa

    2004-01-01

    This paper reviews our recent work on the development of Co-Cr-based longitudinal magnetic recording media from the viewpoint of thermodynamics. It focuses on our experimental finding of the miscibility gap in the fcc α-Co phase region of the Co-Cr binary system, and on predictions of improvements in the magnetic properties of many Co-Cr-Z ternary systems by thermodynamic computation on the basis of the newly assessed Co-Cr binary thermodynamic parameters. Good agreement in the phase separation behavior of many Co-Cr-Z (Z=Pt, Ta, Ge) alloy systems between the calculations and the experiments has been achieved, as discussed in detail in the full paper. By the same token, many other elements, such as Ir, P, B, Mo, Zr and Nb, have been predicted to improve the magnetic grain isolation of potential Co-Cr-Z multicomponent magnetic recording media in the future.

  1. [Design and Implementation of a Mobile Operating Room Information Management System Based on Electronic Medical Record].

    Science.gov (United States)

    Liu, Baozhen; Liu, Zhiguo; Wang, Xianwen

    2015-06-01

    A mobile operating room information management system with electronic medical record (EMR) is designed to improve work efficiency and to enhance the patient information sharing. In the operating room, this system acquires the information from various medical devices through the Client/Server (C/S) pattern, and automatically generates XML-based EMR. Outside the operating room, this system provides information access service by using the Browser/Server (B/S) pattern. Software test shows that this system can correctly collect medical information from equipment and clearly display the real-time waveform. By achieving surgery records with higher quality and sharing the information among mobile medical units, this system can effectively reduce doctors' workload and promote the information construction of the field hospital. PMID:26485982

  2. Laboratory-based recording of holographic fine structure in X-ray absorption anisotropy using polycapillary optics

    Energy Technology Data Exchange (ETDEWEB)

    Dabrowski, K.M. [Institute of Physics, Jagiellonian University, Reymonta 4, 30-059 Krakow (Poland); Korecki, P., E-mail: pawel.korecki@uj.edu.pl [Institute of Physics, Jagiellonian University, Reymonta 4, 30-059 Krakow (Poland)

    2012-08-15

    Highlights: • Holographic fine structures in X-ray absorption recorded using a tabletop setup. • Setup based on polycapillary collimating optics and an HOPG crystal. • Demonstration of element sensitivity by detection of X-ray fluorescence. • Potential of laboratory-based experiments for heavily doped crystals and thin films. - Abstract: A tabletop setup composed of a collimating polycapillary optics and a highly oriented pyrolytic graphite (HOPG) monochromator was characterized and used for recording two-dimensional maps of X-ray absorption anisotropy (XAA). XAA originates from interference of X-rays directly inside the sample. Depending on experimental conditions, fine structures in XAA can be interpreted in terms of X-ray holograms or X-ray standing waves and can be used for element-selective atomic-resolved structural analysis. The implementation of polycapillary optics resulted in a two-order-of-magnitude gain in radiant intensity (photons/s/solid angle) as compared to a system without optics and enabled efficient recording of XAA with a resolution of 0.15° for Mo Kα radiation. Element sensitivity was demonstrated by acquisition of distinct XAA signals for Ga and As atoms in a GaAs (1 1 1) wafer by using X-ray fluorescence as a secondary signal. These results indicate the possibility of performing laboratory-based XAA experiments for heavily doped single crystals or thin films. So far, because of the weak holographic modulation of XAA, such experiments could only be performed using synchrotron radiation.

  3. Space and Astrophysical Plasmas : Matched filtering-parameter estimation method and analysis of whistlers recorded at Varanasi

    Indian Academy of Sciences (India)

    R P Singh; R P Patel; Ashok K Singh; D Hamar; J Lichtenberger

    2000-11-01

    The matched filtering technique is based on the digital construction of theoretical whistlers and their comparison with observed whistlers. The parameters estimated from the theoretical and experimental whistler curves are matched to higher accuracy using digital filters. This yields a resolution ten times better in the time domain. We have tested the applicability of this technique for the analysis of whistlers recorded at Varanasi. It is found that the whistlers have propagated along L > 2 and have wave normal angles after exiting from the ionosphere such that they propagate towards the equator in the earth-ionosphere waveguide. High-resolution analysis shows the presence of fine structures in the dynamic spectrum. An effort is made to interpret the results.
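
Matched filtering compares observed whistler traces with theoretical dispersion curves. Under the simple Eckersley law t(f) = t0 + D/√f, estimating the dispersion D from measured (frequency, arrival-time) points is a linear least-squares problem, sketched below with synthetic data (the values and the use of the Eckersley approximation are illustrative assumptions, not the paper's method in detail):

```python
import numpy as np

def fit_dispersion(freqs_hz, times_s):
    """Least-squares fit of t(f) = t0 + D / sqrt(f) to measured
    (frequency, arrival-time) points; returns (t0, D)."""
    A = np.column_stack([np.ones_like(freqs_hz), 1.0 / np.sqrt(freqs_hz)])
    (t0, d), *_ = np.linalg.lstsq(A, times_s, rcond=None)
    return t0, d

# Synthetic whistler trace with known dispersion D = 80 s*Hz^0.5.
f = np.linspace(2e3, 10e3, 50)
t = 0.1 + 80.0 / np.sqrt(f)
t0_est, d_est = fit_dispersion(f, t)
```

A full matched-filter analysis would instead correlate the digitally constructed theoretical whistler against the recorded waveform, refining these parameters in the time domain.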

  4. Automating occupational protection records systems

    International Nuclear Information System (INIS)

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs

  5. A Multi-Scaler Recording System and its Application to Radiometric ''Off-Line'' Analysis

    International Nuclear Information System (INIS)

    In large complex reprocessing plants a great deal has been done over the past few years to provide in-line instrumentation for the contemporary analysis of process stream content and characteristics. However, these instruments have a qualitative rather than a quantitative part to play in the overall control of the plant. Quantitative information, which must be obtained for control and accounting purposes, demands and relies upon the efficient use of laboratory techniques and instrumentation for the precise analysis of representative samples taken from the process streams. These techniques, in particular those involving pulse counting systems, can be made automatic with modern instrumentation, such as will be described, in which the data is obtained in digital form in electronic stores (scalers). To support a large plant there will be many separate counting systems of this kind, independently controlled and therefore having no time correlation between them. The automatic recording system described in the paper provides a common data read-out facility for more than 50 independently operating counting systems, recording scaler data, together with associated sample and system identification and the absolute time occurrence of each read-out. The data can be recorded, in forms suitable for subsequent processing by a computer, on a variety of tape and card punches, serial and parallel printers or magnetic tape. In addition, the whole recording system, including the scalers in any one system, can be checked for correct operation on an automatic routine basis which does not interfere with the operation of other counting systems. It is concluded that the effective quantitative control of a plant rests on a rapid efficient sample analysis under laboratory conditions. It is probable that future developments of ''off-line'' facilities rather than on-line instrumentation will be possible and more worthwhile. 
The desirable characteristics of instrumentation for such a laboratory

  6. Comparison of the Hazard Mapping System (HMS) fire product to ground-based fire records in Georgia, USA

    Science.gov (United States)

    Hu, Xuefei; Yu, Chao; Tian, Di; Ruminski, Mark; Robertson, Kevin; Waller, Lance A.; Liu, Yang

    2016-03-01

    Biomass burning has a significant and adverse impact on air quality, climate change, and various ecosystems. The Hazard Mapping System (HMS) detects fires using data from multiple satellite sensors in order to maximize its fire detection rate. However, to date, the detection rate of the HMS fire product for small fires has not been well studied, especially using ground-based fire records. This paper utilizes the 2011 fire information compiled from ground observations and burn authorizations in Georgia to assess the comprehensiveness of the HMS active fire product. The results show that detection rates of the hybrid HMS increase substantially by integrating multiple satellite instruments. The detection rate increases dramatically from 3% to 80% with an increase in fire size from less than 0.02 km2 to larger than 2 km2, resulting in detection of approximately 12% of all recorded fires which represent approximately 57% of the total area burned. The spatial pattern of detection rates reveals that grid cells with high detection rates are generally located in areas where large fires occur frequently. The seasonal analysis shows that overall detection rates in winter and spring (12% and 13%, respectively) are higher than those in summer and fall (3% and 6%, respectively), mainly because of higher percentages of large fires (>0.19 km2) that occurred in winter and spring. The land cover analysis shows that detection rates are 2-7 percentage points higher in land cover types that are prone to large fires such as forestland and shrub land.
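
The size-binned detection rates and the burned-area fraction described above amount to a simple tabulation; a sketch with hypothetical records (the sizes, detection flags and bin edges are illustrative, not the Georgia data):

```python
import numpy as np

# Hypothetical ground-record fire sizes (km^2) and HMS detection flags.
sizes = np.array([0.01, 0.05, 0.3, 1.5, 2.5, 3.0, 0.02, 0.8])
detected = np.array([0, 0, 1, 1, 1, 1, 0, 1], dtype=bool)

# Size bins loosely following the thresholds quoted in the abstract.
bins = [0.0, 0.02, 0.19, 2.0, np.inf]
labels = ["<0.02", "0.02-0.19", "0.19-2", ">2"]
idx = np.digitize(sizes, bins) - 1
rates = {lab: float(detected[idx == i].mean()) if np.any(idx == i) else None
         for i, lab in enumerate(labels)}

# Detected fires as a fraction of all fires vs. of total burned area:
# large fires dominate the area, so the area fraction is higher.
count_frac = float(detected.mean())
area_frac = float(sizes[detected].sum() / sizes.sum())
```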

  7. Development and programming of Geophonino: A low cost Arduino-based seismic recorder for vertical geophones

    Science.gov (United States)

    Soler-Llorens, J. L.; Galiana-Merino, J. J.; Giner-Caturla, J.; Jauregui-Eslava, P.; Rosa-Cintas, S.; Rosa-Herranz, J.

    2016-09-01

    The commercial data acquisition systems used for seismic exploration are usually expensive equipment. In this work, a low cost data acquisition system (Geophonino) has been developed for recording seismic signals from a vertical geophone. The signal goes first through an instrumentation amplifier, INA155, which is suitable for low amplitude signals like the seismic noise, and an anti-aliasing filter based on the MAX7404 switched-capacitor filter. After that, the amplified and filtered signal is digitized and processed by Arduino Due and registered in an SD memory card. Geophonino is configured for continuous registering, where the sampling frequency, the amplitude gain and the registering time are user-defined. The complete prototype is an open source and open hardware system. It has been tested by comparing the registered signals with the ones obtained through different commercial data recording systems and different kind of geophones. The obtained results show good correlation between the tested measurements, presenting Geophonino as a low-cost alternative system for seismic data recording.

  8. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii

    Science.gov (United States)

    Prouty, N.G.; Jupiter, S.D.; Field, M.E.; McCulloch, M.T.

    2009-01-01

    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (-40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge. Copyright 2009 by the American Geophysical Union.

  9. Design and implementation of web-based mobile electronic medication administration record.

    Science.gov (United States)

    Hsieh, Sung-Huai; Hou, I-Ching; Cheng, Po-Hsun; Tan, Ching-Ting; Shen, Po-Chao; Hsu, Kai-Ping; Hsieh, Sheau-Ling; Lai, Feipei

    2010-10-01

    Patient safety is the most essential, critical issue; however, errors can hardly be prevented, especially those caused by humans. In order to reduce errors caused by humans, we construct Electronic Health Records (EHR) in the Health Information System (HIS) to facilitate patient safety and to improve the quality of medical care. During medical care processing, all tasks are based upon physicians' orders. In National Taiwan University Hospital (NTUH), the Electronic Health Record committee proposed a standard of order flows. The objectives of the standard are: first, to enhance medical procedures and enforce hospital policies; second, to improve the quality of medical care; third, to collect sufficient, adequate data for the EHR in the near future. Among the proposed procedures, NTUH decided to establish a web-based mobile electronic medication administration record (ME-MAR) system. The system, built on the service-oriented architecture (SOA) and embedding the HL7/XML standard, is installed in the mobile nursing carts. It is also implemented with advanced techniques such as Asynchronous JavaScript and XML (Ajax) and Web services to enhance system usability. Research indicates that medication errors account for a high proportion of total medical errors. Therefore, we expect the ME-MAR system to reduce medication errors. In addition, we evaluated whether ME-MAR can assist nurses or healthcare practitioners in administering and managing medication properly. This successful experience of developing the NTUH ME-MAR system can be easily applied to other related systems. Meanwhile, the SOA architecture of the system can also be seamlessly integrated into NTUH or other HIS systems. PMID:20703613

  10. Task and error analysis balancing benefits over business of electronic medical records.

    Science.gov (United States)

    Carstens, Deborah Sater; Rodriguez, Walter; Wood, Michael B

    2014-01-01

    Task and error analysis research was performed to identify: a) the process for healthcare organisations in managing healthcare for patients with mental illness or substance abuse; b) how the process can be enhanced and; c) if electronic medical records (EMRs) have a role in this process from a business and safety perspective. The research question is if EMRs have a role in enhancing the healthcare for patients with mental illness or substance abuse. A discussion on the business of EMRs is addressed to understand the balancing act between the safety and business aspects of an EMR. PMID:25161108

  11. Similarities and differences of doctor-patient co-operated evidence-based medical record of treating digestive system diseases with integrative medicine compared with traditional medical records

    Institute of Scientific and Technical Information of China (English)

    Bo Li; Wen-Hong Shao; Yan-Da Li; Ying-Pan Zhao; Qing-Na Li; Zhao Yang; Hong-Cai Shang

    2016-01-01

    [Translated from Chinese:] Following the concept of narrative evidence-based medicine and consulting experts in integrative (Chinese and Western) digestive medicine and in evidence-based medicine, we refined the theory of the doctor-patient co-operated medical record, established a template for it, compared it with traditional medical records, and analysed its advantages and disadvantages. Reflection and outlook: the doctor-patient co-operated record may become one element of the methodological system for evaluating the efficacy of integrated Chinese and Western medicine in treating spleen and stomach diseases. Objective: To establish the model of the doctor-patient co-operated record, based on the concepts of narrative evidence-based medicine and related theories on the Doctor-Patient Co-operated Evidence-Based Medical Record. Methods: We conducted a literature search in PubMed, followed the principles of narrative evidence-based medicine, and referred to the advice of experts in digestive diseases and EBM in both traditional Chinese medicine and Western medicine. Result: This research is a useful attempt to discuss the establishment of a doctor-patient co-operated evidence-based medical record guided by narrative evidence-based medicine. Conclusion: The doctor-patient co-operated medical record can become a key factor in the methodology for evaluating the curative effect of integrated traditional Chinese and Western medicine on spleen and stomach diseases.

  12. A critical ear: Analysis of value judgements in reviews of Beethoven’s piano sonata recordings

    Directory of Open Access Journals (Sweden)

    Elena eAlessandri

    2016-03-01

    What sets a great music performance apart? In this study we addressed this question through an examination of value judgements in written criticism of recorded performance. One hundred reviews of recordings of Beethoven's piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics' statements and the evaluation criteria that underpinned their judgements. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording.

  13. A Critical Ear: Analysis of Value Judgments in Reviews of Beethoven's Piano Sonata Recordings.

    Science.gov (United States)

    Alessandri, Elena; Williamson, Victoria J; Eiholzer, Hubert; Williamon, Aaron

    2016-01-01

    What sets a great music performance apart? In this study, we addressed this question through an examination of value judgments in written criticism of recorded performance. One hundred reviews of recordings of Beethoven's piano sonatas, published in the Gramophone between 1934 and 2010, were analyzed through a three-step qualitative analysis that identified the valence (positive/negative) expressed by critics' statements and the evaluation criteria that underpinned their judgments. The outcome is a model of the main evaluation criteria used by professional critics: aesthetic properties, including intensity, coherence, and complexity, and achievement-related properties, including sureness, comprehension, and endeavor. The model also emphasizes how critics consider the suitability and balance of these properties across the musical and cultural context of the performance. The findings relate directly to current discourses on the role of evaluation in music criticism and the generalizability of aesthetic principles. In particular, the perceived achievement of the performer stands out as a factor that drives appreciation of a recording. PMID:27065900

  14. Personal dose analysis of TLD glow curve data from individual monitoring records

    International Nuclear Information System (INIS)

    Radiation exposure of workers in Ghana has been estimated on the basis of personal dose records of the occupationally exposed in medical, industrial, and research/teaching practices for the period 2008-09. The estimated effective doses for 2008 are 0.400, 0.495 and 0.426 mSv for medical, industrial and research/teaching practices, respectively. The corresponding collective effective doses are 0.128, 0.044 and 0.017 person-Sv, respectively. Similarly, the effective doses recorded in 2009 are 0.448, 0.545 and 0.388 mSv, respectively, with corresponding collective effective doses of 0.108, 0.032 and 0.012 person-Sv. The study shows that occupational exposure in Ghana is skewed towards lower doses (between 0.001 and 0.500 mSv). A statistical analysis of the effective doses showed no significant difference at p < 0.05 among the means of the effective doses recorded in the various practices. (authors)
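
    The two summary quantities in this abstract are related in a simple way: the mean effective dose is the total dose divided by the number of monitored workers, and the collective effective dose is the same total expressed in person-Sv. A minimal sketch (the function and record format are illustrative, not from the paper); the final line is only a consistency check on the reported 2008 medical figures:

```python
def dose_summary(doses_mSv):
    """Mean effective dose (mSv) and collective effective dose (person-Sv)
    for one practice; doses_mSv holds one annual dose per monitored worker."""
    mean_mSv = sum(doses_mSv) / len(doses_mSv)
    collective_person_Sv = sum(doses_mSv) / 1000.0  # mSv -> Sv
    return mean_mSv, collective_person_Sv

# example: three workers with 1, 2 and 3 mSv
print(dose_summary([1.0, 2.0, 3.0]))  # (2.0, 0.006)

# consistency check: 0.400 mSv mean and 0.128 person-Sv collective dose
# imply roughly 320 monitored medical workers in 2008
print(round(0.128 * 1000 / 0.400))  # 320
```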

  15. Extracting physician group intelligence from electronic health records to support evidence based medicine.

    Directory of Open Access Journals (Sweden)

    Griffin M Weber

    Evidence-based medicine employs expert opinion and clinical data to inform clinical decision making. The objective of this study is to determine whether it is possible to complement these sources of evidence with information about physician "group intelligence" that exists in electronic health records. Specifically, we measured laboratory test "repeat intervals", defined as the amount of time it takes for a physician to repeat a test that was previously ordered for the same patient. Our assumption is that while the result of a test is a direct measure of one marker of a patient's health, the physician's decision to order the test is based on multiple factors including past experience, available treatment options, and information about the patient that might not be coded in the electronic health record. By examining repeat intervals in aggregate over large numbers of patients, we show that it is possible to (1) determine what laboratory test results physicians consider "normal", (2) identify subpopulations of patients that deviate from the norm, and (3) identify situations where laboratory tests are over-ordered. We used laboratory tests as just one example of how physician group intelligence can be used to support evidence-based medicine in a way that is automated and continually updated.
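
    The repeat-interval computation described above is easy to sketch: group test orders by patient and test, sort by date, and take the gaps between successive orders. The record format and test code below are hypothetical, purely for illustration:

```python
from datetime import date
from collections import defaultdict

def repeat_intervals(orders):
    """Days between successive orders of the same laboratory test for the
    same patient, aggregated per test code across all patients."""
    by_key = defaultdict(list)
    for patient_id, test_code, order_date in orders:
        by_key[(patient_id, test_code)].append(order_date)
    intervals = defaultdict(list)
    for (patient_id, test_code), dates in by_key.items():
        dates.sort()
        for earlier, later in zip(dates, dates[1:]):
            intervals[test_code].append((later - earlier).days)
    return dict(intervals)

orders = [
    ("p1", "HbA1c", date(2009, 1, 10)),
    ("p1", "HbA1c", date(2009, 4, 12)),
    ("p2", "HbA1c", date(2009, 2, 1)),
    ("p2", "HbA1c", date(2009, 5, 3)),
]
print(repeat_intervals(orders))  # {'HbA1c': [92, 91]}
```

    In aggregate, the distribution of such intervals per test (here two roughly three-month HbA1c repeats) is what the study mines for group-level ordering behaviour.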

  16. New models for frequency content prediction of earthquake records based on Iranian ground-motion data

    Science.gov (United States)

    Yaghmaei-Sabegh, Saman

    2015-10-01

    This paper presents the development of new, simple empirical models for predicting the frequency content of ground-motion records, to resolve the assumed limitations on the usable magnitude range of previous studies. Three period values are used in the analysis to describe the frequency content of earthquake ground motions: the average spectral period (Tavg), the mean period (Tm), and the smoothed spectral predominant period (T0). The proposed models predict these scalar indicators as functions of magnitude, closest site-to-source distance, and local site condition. Three site classes (rock, stiff soil, and soft soil) have been considered in the analysis. The results of the proposed relationships have been compared with those of other published models. It has been found that the resulting regression equations can be used to predict scalar frequency-content estimators over a wide range of magnitudes, including magnitudes below 5.5.
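
    One of these indicators, the mean period Tm, is commonly defined (in the literature on frequency-content estimators, e.g. following Rathje and co-workers) as the amplitude-weighted mean of 1/f over Fourier amplitudes Ci with frequencies fi in roughly 0.25-20 Hz: Tm = Σ(Ci²/fi) / ΣCi². A sketch under that assumed definition (the paper may use slightly different band limits):

```python
import numpy as np

def mean_period(accel, dt, f_lo=0.25, f_hi=20.0):
    """Mean period T_m of an accelerogram:
    T_m = sum(C_i^2 / f_i) / sum(C_i^2) over the band [f_lo, f_hi] Hz."""
    amps = np.abs(np.fft.rfft(accel))          # Fourier amplitudes C_i
    freqs = np.fft.rfftfreq(len(accel), d=dt)  # corresponding f_i
    band = (freqs >= f_lo) & (freqs <= f_hi)
    c2 = amps[band] ** 2
    return (c2 / freqs[band]).sum() / c2.sum()

# sanity check: a pure 2 Hz sinusoid should give T_m close to 0.5 s
t = np.arange(0, 10, 0.01)
sig = np.sin(2 * np.pi * 2.0 * t)
print(round(mean_period(sig, 0.01), 2))  # 0.5
```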

  17. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  18. Interval Estimation of Stress-Strength Reliability Based on Lower Record Values from Inverse Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Bahman Tarvirdizade

    2014-01-01

    We consider the estimation of stress-strength reliability based on lower record values when X and Y are independently but not identically inverse Rayleigh distributed random variables. The maximum likelihood, Bayes, and empirical Bayes estimators of R are obtained and their properties are studied. Confidence intervals, exact and approximate, as well as the Bayesian credible sets for R are obtained. A real example is presented in order to illustrate the inferences discussed in the previous sections. A simulation study is conducted to investigate and compare the performance of the intervals presented in this paper and some bootstrap intervals.
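
    For intuition: with the inverse Rayleigh CDF written as F(x; θ) = exp(-θ/x²) (one common parameterisation; the paper's may differ), the stress-strength reliability has the closed form R = P(Y < X) = θx / (θx + θy), which a Monte Carlo simulation can confirm. A sketch under that assumed parameterisation:

```python
import math, random

def rinv_rayleigh(theta, rng):
    """Inverse-transform sample from the inverse Rayleigh distribution
    with CDF exp(-theta / x^2), x > 0 (assumed parameterisation)."""
    u = 1.0 - rng.random()            # u in (0, 1]
    return math.sqrt(theta / -math.log(u))

rng = random.Random(42)
theta_x, theta_y = 2.0, 1.0
n = 100_000
mc = sum(rinv_rayleigh(theta_y, rng) < rinv_rayleigh(theta_x, rng)
         for _ in range(n)) / n
closed = theta_x / (theta_x + theta_y)  # R = P(Y < X) = 2/3 here
print(abs(mc - closed) < 0.01)  # True
```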

  19. Effect of acidity on the polarization sensitivity of azo-indicator based recording media*

    Science.gov (United States)

    Shaverdova, V. G.; Petrova, S. S.; Purtseladze, A. L.; Tarasashvili, V. I.; Obolashvili, N. Z.

    2013-01-01

    This is an experimental study of the photoanisotropic gyrotropic properties of recording media based on azo-indicators (five dyes homologous to methyl orange) introduced into a polymer matrix. Samples were prepared by a technology we have developed, employing solvents with different acidities (pH 1.68-12.48). The samples were exposed to actinic radiation (λ = 488 nm) from an argon laser, and the photoinduced anisotropy was measured in real time. The circular dichroism and circular birefringence in the layers under study are calculated for a neutral medium and at different pH levels.

  20. Development of Software for dose Records Data Base Access; Programacion para la consulta del Banco de Datos Dosimetricos

    Energy Technology Data Exchange (ETDEWEB)

    Amaro, M.

    1990-07-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was the individual dose follow-up control and the data handling for epidemiological studies. Within the Data Base management scheme, software development to allow searching of individual dose records by external authorised users was undertaken. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records from workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of codes and subroutines developed. (Author) 2 refs.

  1. Feature-based sentiment analysis with ontologies

    OpenAIRE

    Taner, Berk

    2011-01-01

    Sentiment analysis is a topic that many researchers work on. In recent years, new research directions under sentiment analysis have appeared. Feature-based sentiment analysis is one such topic that deals not only with finding sentiment in a sentence but with providing a more detailed analysis on a given domain. In the beginning, researchers focused on commercial products and manually generated lists of features for a product. Then they tried to generate a feature-based approach to attach sentiments to th...

  2. Evaluation of an algorithm based on single-condition decision rules for binary classification of 12-lead ambulatory ECG recording quality

    International Nuclear Information System (INIS)

    A new algorithm for classifying ECG recording quality, based on the detection of commonly observed ECG contaminants which often render the ECG unusable for diagnostic purposes, was evaluated. Contaminants (baseline drift, flat line, QRS artefact, spurious spikes, stepwise amplitude changes, noise) were detected on individual leads from joint time-frequency analysis and QRS amplitude. Classification was based on cascaded single-condition decision rules (SCDR) that tested levels of contaminants against classification thresholds. A supervised learning classifier (SLC) was implemented for comparison. The SCDR and SLC algorithms were trained on an annotated database (Set A, PhysioNet Challenge 2011) of 'acceptable' versus 'unacceptable' quality recordings using the 'leave M out' approach with repeated random partitioning and cross-validation. Two training approaches were considered: (i) balanced, in which training records had equal numbers of 'acceptable' and 'unacceptable' recordings; (ii) unbalanced, in which the ratio of 'acceptable' to 'unacceptable' recordings from Set A was preserved. For each training approach, thresholds were calculated, and the classification accuracy of the algorithm was compared to that of other rule-based algorithms and the SLC using a database for which classifications were unknown (Set B, PhysioNet Challenge 2011). The SCDR algorithm achieved the highest accuracy (91.40%) compared to the SLC (90.40%) in spite of its simple logic. It also offers the advantage that it facilitates reporting of meaningful causes of poor signal quality to users. (paper)
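
    The cascaded single-condition decision rule idea is simple enough to sketch: each contaminant level is tested against its own threshold in turn, and any single violation is sufficient to reject the recording. The metric names and threshold values below are illustrative only, not the trained values from the paper:

```python
def classify_quality(metrics, thresholds):
    """Cascaded single-condition decision rules: the recording is
    'unacceptable' as soon as any one contaminant level exceeds its
    threshold; otherwise it is 'acceptable'."""
    for name, limit in thresholds.items():
        if metrics.get(name, 0.0) > limit:
            return "unacceptable"
    return "acceptable"

# hypothetical thresholds and per-recording contaminant measurements
thresholds = {"baseline_drift": 0.5, "flat_line_fraction": 0.1,
              "spike_count": 3, "noise_power": 0.2}
good = {"baseline_drift": 0.1, "flat_line_fraction": 0.0,
        "spike_count": 0, "noise_power": 0.05}
bad = dict(good, noise_power=0.9)

print(classify_quality(good, thresholds))  # acceptable
print(classify_quality(bad, thresholds))   # unacceptable
```

    A side benefit noted in the abstract follows directly from this structure: the first rule that fires names the cause of rejection, which is easy to report to users.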

  3. An analysis of concert saxophone vibrato through the examination of recordings by eight prominent soloists

    Science.gov (United States)

    Zinninger, Thomas

    This study examines concert saxophone vibrato through the analysis of several recordings of standard repertoire by prominent soloists. The vibrato of Vincent Abato, Arno Bornkamp, Claude Delangle, Jean-Marie Londeix, Marcel Mule, Otis Murphy, Sigurd Rascher, and Eugene Rousseau is analyzed with regard to rate, extent, shape, and discretionary use. Examination of these parameters was conducted through both general observation and precise measurements with the aid of a spectrogram. Statistical analyses of the results provide tendencies for overall vibrato use, as well as the effects of certain musical attributes (note length, tempo, dynamic, range) on vibrato. The results of this analysis are also compared among the soloists and against pre-existing theories and findings in vibrato research.
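
    Of the parameters above, vibrato rate is the most mechanical to measure: once a pitch contour has been extracted from the spectrogram, the rate is the dominant modulation frequency of that contour. A simplified FFT-peak sketch (a stand-in for the study's spectrogram measurements, with a synthetic contour):

```python
import numpy as np

def vibrato_rate(pitch_hz, fs):
    """Dominant modulation frequency (Hz) of a pitch contour sampled at
    fs Hz, estimated as the FFT peak of the mean-removed contour."""
    x = pitch_hz - np.mean(pitch_hz)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(spec)]

# synthetic contour: 440 Hz tone with 5.5 Hz vibrato, ~20-cent extent
fs = 100.0
t = np.arange(0, 2, 1 / fs)
contour = 440.0 * 2 ** ((20 / 1200) * np.sin(2 * np.pi * 5.5 * t))
print(round(vibrato_rate(contour, fs), 1))  # 5.5
```

    Vibrato extent would come from the same contour as the peak-to-peak pitch deviation, typically converted to cents.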

  4. The (Anomalous) Hall Magnetometer as an Analysis Tool for High Density Recording Media

    OpenAIRE

    Haan; Lodder, J. C.

    1991-01-01

    In this work an evaluation tool for the characterization of high-density recording thin-film media is discussed. The measurement principles are based on the anomalous and the planar Hall effect. We used these Hall effects to characterize ferromagnetic Co-Cr films and Co/Pd multilayers having perpendicular anisotropy. The measurement set-up that was built is sensitive enough to measure the hysteresis loops of 0.2x0.2 mm2 Hall structures in Co-Cr, and jumps were observed in the Hall vol...

  5. The effect of recording and analysis bandwidth on acoustic identification of delphinid species

    Science.gov (United States)

    Oswald, Julie N.; Rankin, Shannon; Barlow, Jay

    2004-11-01

    Because many cetacean species produce characteristic calls that propagate well under water, acoustic techniques can be used to detect and identify them. The ability to identify cetaceans to species using acoustic methods varies and may be affected by recording and analysis bandwidth. To examine the effect of bandwidth on species identification, whistles were recorded from four delphinid species (Delphinus delphis, Stenella attenuata, S. coeruleoalba, and S. longirostris) in the eastern tropical Pacific ocean. Four spectrograms, each with a different upper frequency limit (20, 24, 30, and 40 kHz), were created for each whistle (n=484). Eight variables (beginning, ending, minimum, and maximum frequency; duration; number of inflection points; number of steps; and presence/absence of harmonics) were measured from the fundamental frequency of each whistle. The whistle repertoires of all four species contained fundamental frequencies extending above 20 kHz. Overall correct classification using discriminant function analysis ranged from 30% for the 20-kHz upper frequency limit data to 37% for the 40-kHz upper frequency limit data. For the four species included in this study, an upper bandwidth limit of at least 24 kHz is required for an accurate representation of fundamental whistle contours.
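
    The classification step above assigns a whistle's feature vector (the eight measured variables) to the closest species class. As a simplified stand-in for the discriminant function analysis used in the study, a nearest-centroid rule illustrates the idea; the feature values and species data below are invented for illustration:

```python
import math

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(sample, centroids):
    """Assign a whistle's feature vector to the species whose training
    centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda sp: math.dist(sample, centroids[sp]))

# toy training data: [min freq kHz, max freq kHz, duration s] per whistle
train = {
    "D. delphis": [[8.2, 15.0, 0.6], [8.5, 14.2, 0.7]],
    "S. longirostris": [[11.0, 19.5, 0.4], [11.4, 20.1, 0.5]],
}
centroids = {sp: centroid(rows) for sp, rows in train.items()}
print(classify([8.0, 14.8, 0.65], centroids))  # D. delphis
```

    Truncating the analysis bandwidth at 20 kHz clips the measured maximum frequencies of whistles that extend higher, which distorts exactly these feature vectors and degrades classification, as the study reports.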

  6. Eielson Air Force Base operable unit 2 and other areas record of decision

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, R.E.; Smith, R.M.

    1994-10-01

    This decision document presents the selected remedial actions and no action decisions for Operable Unit 2 (OU2) at Eielson Air Force Base (AFB), Alaska, chosen in accordance with state and federal regulations. This document also presents the decision that no further action is required for 21 other source areas at Eielson AFB. This decision is based on the administrative record file for this site. OU2 addresses sites contaminated by leaks and spills of fuels. Soils contaminated with petroleum products occur at or near the source of contamination. Contaminated subsurface soil and groundwater occur in plumes on the top of a shallow groundwater table that fluctuates seasonally. These sites pose a risk to human health and the environment because of ingestion, inhalation, and dermal contact with contaminated groundwater. The purpose of this response is to prevent current or future exposure to the contaminated groundwater, to reduce further contaminant migration into the groundwater, and to remediate groundwater.

  7. Estimating the frequency of extremely energetic solar events, based on solar, stellar, lunar, and terrestrial records

    CERN Document Server

    Schrijver, C J; Baltensperger, U; Cliver, E W; Guedel, M; Hudson, H S; McCracken, K G; Osten, R A; Peter, Th; Soderblom, D R; Usoskin, I G; Wolff, E W

    2012-01-01

    The most powerful explosions on the Sun [...] drive the most severe space-weather storms. Proxy records of flare energies based on SEPs in principle may offer the longest time base to study infrequent large events. We conclude that one suggested proxy, nitrate concentrations in polar ice cores, does not map reliably to SEP events. Concentrations of select radionuclides measured in natural archives may prove useful in extending the time interval of direct observations up to ten millennia, but as their calibration to solar flare fluences depends on multiple poorly known properties and processes, these proxies cannot presently be used to help determine the flare energy frequency distribution. Being thus limited to the use of direct flare observations, we evaluate the probabilities of large-energy solar explosions by combining solar flare observations with an ensemble of stellar flare observations. We conclude that solar flare energies form a relatively smooth distribution from small events to large flares, while...

  8. Eielson Air Force Base operable unit 2 and other areas record of decision

    International Nuclear Information System (INIS)

    This decision document presents the selected remedial actions and no action decisions for Operable Unit 2 (OU2) at Eielson Air Force Base (AFB), Alaska, chosen in accordance with state and federal regulations. This document also presents the decision that no further action is required for 21 other source areas at Eielson AFB. This decision is based on the administrative record file for this site. OU2 addresses sites contaminated by leaks and spills of fuels. Soils contaminated with petroleum products occur at or near the source of contamination. Contaminated subsurface soil and groundwater occur in plumes on the top of a shallow groundwater table that fluctuates seasonally. These sites pose a risk to human health and the environment because of ingestion, inhalation, and dermal contact with contaminated groundwater. The purpose of this response is to prevent current or future exposure to the contaminated groundwater, to reduce further contaminant migration into the groundwater, and to remediate groundwater

  9. A compact self-recording pressure based sea level gauge suitable for deployments at harbour and offshore environments

    Digital Repository Service at National Institute of Oceanography (India)

    Desa, E.; Peshwe, V.B.; Joseph, A.; Mehra, P.; Naik, G.P.; Kumar, V.; Desa, E.S.; Desai, R.G.P.; Nagvekar, S.; Desai, S.P.

    A compact and lightweight self-recording pressure based sea level gauge has been designed to suit deployments from harbour and offshore environments. A novel hydraulic coupling device designed in-house was used to transfer the seawater pressure...

  10. Provincial prenatal record revision: a multiple case study of evidence-based decision-making at the population-policy level

    Directory of Open Access Journals (Sweden)

    Olson Joanne

    2008-12-01

    Abstract. Background: There is a significant gap in the knowledge translation literature related to how research evidence actually contributes to health care decision-making. Decisions around what care to provide at the population (rather than individual) level are particularly complex, involving considerations such as feasibility, cost, and population needs in addition to scientific evidence. One example of decision-making at this "population-policy" level involves what screening questions and intervention guides to include on standardized provincial prenatal records. As mandatory medical reporting forms, prenatal records are potentially powerful vehicles for promoting population-wide evidence-based care. However, the extent to which Canadian prenatal records reflect best-practice recommendations for the assessment of well-known risk factors such as maternal smoking and alcohol consumption varies markedly across Canadian provinces and territories. The goal of this study is to better understand the interaction of contextual factors and research evidence on decision-making at the population-policy level, by examining the processes by which provincial prenatal records are reviewed and revised. Methods: Guided by Dobrow et al.'s (2004) conceptual model for context-based evidence-based decision-making, this study will use a multiple case study design with embedded units of analysis to examine contextual factors influencing the prenatal record revision process in different Canadian provinces and territories. Data will be collected using multiple methods to construct detailed case descriptions for each province/territory. Using qualitative data analysis techniques, decision-making processes involving prenatal record content specifically related to maternal smoking and alcohol use will be compared both within and across each case, to identify key contextual factors influencing the uptake and application of research evidence by prenatal record review

  11. Stochasticity of Road Traffic Dynamics: Comprehensive Linear and Nonlinear Time Series Analysis on High Resolution Freeway Traffic Records

    CERN Document Server

    Siegel, H; Siegel, Helge; Belomestnyi, Dennis

    2006-01-01

    The dynamical properties of road traffic time series from North Rhine-Westphalian motorways are investigated. The article shows that road traffic dynamics is well described as a persistent stochastic process with two fixed points representing the free-flow (non-congested) and the congested state regimes. These traffic states have different statistical properties with respect to waiting-time distribution, velocity distribution and autocorrelation. Log-differences of velocity records reveal a non-normal, clearly leptokurtic distribution. Further, linear and nonlinear phase-plane based analysis methods yield no evidence for any determinism or deterministic chaos being involved in traffic dynamics on shorter-than-diurnal time scales. Several Hurst-exponent estimators indicate long-range dependence for the free-flow state. Finally, our results are not in accordance with the typical heuristic fingerprints of self-organized criticality. We suggest the more parsimonious assumption of a non-critical phase transition between...
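
    "Leptokurtic" here means the log-differences x_t = ln(v_t) - ln(v_{t-1}) have positive excess kurtosis, i.e. heavier tails than a Gaussian. A minimal check of that property (synthetic data; the Laplace distribution, with excess kurtosis 3, stands in for heavy-tailed velocity log-differences):

```python
import math, random

def excess_kurtosis(xs):
    """Sample excess kurtosis: m4 / m2^2 - 3 (zero for a Gaussian)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / (m2 ** 2) - 3.0

random.seed(1)
# Laplace samples via sign * Exp(1): heavy-tailed, excess kurtosis ~ 3
laplace = [random.choice([-1, 1]) * random.expovariate(1.0)
           for _ in range(20000)]
normal = [random.gauss(0, 1) for _ in range(20000)]

print(excess_kurtosis(laplace) > excess_kurtosis(normal))  # True
```

    Applied to real velocity records, a clearly positive excess kurtosis of the log-differences is what the abstract's "leptokurtic" finding asserts.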

  12. Astronomical calibration and global correlation of the Santonian (Cretaceous) based on the marine carbon isotope record

    Science.gov (United States)

    Thibault, N.; Jarvis, I.; Voigt, S.; Gale, A. S.; Attree, K.; Jenkyns, H. C.

    2016-06-01

    High-resolution records of bulk carbonate carbon isotopes have been generated for the Upper Coniacian to Lower Campanian interval of the sections at Seaford Head (southern England) and Bottaccione (central Italy). An unambiguous stratigraphic correlation is presented for the base and top of the Santonian between the Boreal and Tethyan realms. Orbital forcing of carbon and oxygen isotopes at Seaford Head points to the Boreal Santonian spanning five 405 kyr cycles (Sa1 to Sa5). Correlation of the Seaford Head time scale to that of the Niobrara Formation (Western Interior Basin) permits anchoring these records to the La2011 astronomical solution at the Santonian-Campanian (Sa/Ca) boundary, which has been recently dated to 84.19 ± 0.38 Ma. Among the five tuning options examined, option 2 places the Sa/Ca at the 84.2 Ma 405 kyr insolation minimum and appears as the most likely. This solution indicates that minima of the 405 kyr filtered output of the resistivity in the Niobrara Formation correlate to 405 kyr insolation minima in the astronomical solution and to maxima in the filtered δ13C of Seaford Head. We suggest that variance in δ13C is driven by climate forcing of the proportions of CaCO3 versus organic carbon burial on land and in oceanic basins. The astronomical calibration generates a 200 kyr mismatch of the Coniacian-Santonian boundary age between the Boreal Realm in Europe and the Western Interior, due either to diachronism of the lowest occurrence of the inoceramid Cladoceramus undulatoplicatus between the two regions or to remaining uncertainties of radiometric dating and cyclostratigraphic records.

  13. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpus (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  14. Untangling inconsistent magnetic polarity records through an integrated rock magnetic analysis: A case study on Neogene sections in East Timor

    Science.gov (United States)

    Aben, F. M.; Dekkers, M. J.; Bakker, R. R.; van Hinsbergen, D. J. J.; Zachariasse, W. J.; Tate, G. W.; McQuarrie, N.; Harris, R.; Duffy, B.

    2014-06-01

    Inconsistent magnetic polarity patterns in sediments are a common problem in magnetostratigraphic and paleomagnetic research. Multiple magnetic mineral generations result in such remanence "haystacks." Here we test whether end-member modeling of isothermal remanent magnetization acquisition curves, as a basis for an integrated rock magnetic and microscopic analysis, is capable of isolating original magnetic polarity patterns. Uppermost Miocene-Pliocene deep-marine siliciclastics and limestones in East Timor, originally sampled to constrain the uplift history of the young Timor orogeny, serve as the case study. An apparently straightforward polarity record was obtained that, however, proved impossible to reconcile with the associated biostratigraphy. Our analysis distinguished two magnetic end-members for each section, which result from various greigite suites and a detrital magnetite suite. The latter yields largely viscous remanence signals and is deemed unsuited. The greigite suites are late diagenetic in the Cailaco River section and early diagenetic, thus reliable, in the Viqueque Type section. By selecting reliable sample levels based on a quality index, a revised polarity pattern of the latter section is obtained: consistent with the biostratigraphy and unequivocally correlatable to the Geomagnetic Polarity Time Scale. Although the Cailaco River section lacks a reliable magnetostratigraphy, it does record a significant post-remagnetization tectonic rotation. Our results show that the application of well-designed rock magnetic research, based on the end-member model and integrated with microscopy and paleomagnetic data, can unravel complex and seemingly inconsistent polarity patterns. We recommend this approach to assess the veracity of the polarity of strata with complex magnetic mineralogy.

  15. Long-term neural recordings using MEMS based moveable microelectrodes in the brain

    Directory of Open Access Journals (Sweden)

    Nathan Jackson

    2010-06-01

    One of the critical requirements of the emerging class of neural prosthetic devices is to maintain good-quality neural recordings over long time periods. We report here a novel MEMS (Micro-ElectroMechanical Systems) based technology that can move microelectrodes, in the event of deterioration in the neural signal, to sample a new set of neurons. Microscale electro-thermal actuators are used to controllably move microelectrodes post-implantation in steps of approximately 9 µm. In this study, a total of 12 moveable microelectrode chips were individually implanted in adult rats. Two of the 12 chips were not moved over a period of 3 weeks and were treated as controls. During the first three weeks of implantation, moving the microelectrodes improved the average SNR from 14.61 ± 5.21 dB before movement to 18.13 ± 4.99 dB after movement, across all microelectrodes and all days. However, the average RMS noise amplitudes were similar before and after microelectrode movement, at 2.98 ± 1.22 µV and 3.01 ± 1.16 µV respectively. Beyond three weeks, the primary observed failure mode was biological rejection of the PMMA (dental cement) based skull mount, resulting in the device loosening and eventually falling from the skull. Additionally, for devices still functioning beyond three weeks, the average SNR was 11.88 ± 2.02 dB before microelectrode movement and significantly different (p < 0.01) at 13.34 ± 0.919 dB after movement. The results of this study demonstrate that MEMS-based technologies can move microelectrodes in rodent brains in long-term experiments, resulting in improvements in signal quality. Further improvements in packaging and surgical techniques will potentially enable movable microelectrodes to record cortical neuronal activity in chronic experiments.
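
    Since the RMS noise stayed near 3 µV while SNR rose, the improvement must come from larger spike amplitudes. Under one common convention, SNR(dB) = 20·log10(spike amplitude / RMS noise); the paper's exact definition may differ, but this lets us relate the reported figures:

```python
import math

def snr_db(signal_amplitude_uV, noise_rms_uV):
    """SNR in decibels from spike amplitude and RMS noise amplitude
    (amplitude-ratio convention: 20 * log10)."""
    return 20.0 * math.log10(signal_amplitude_uV / noise_rms_uV)

# with noise at ~3 uV RMS, the reported ~18 dB post-movement SNR
# corresponds to spike amplitudes around 24 uV (8x the noise floor)
print(round(snr_db(24.0, 3.0), 1))  # 18.1
```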

  16. Robert Recorde

    CERN Document Server

    Williams, Jack

    2011-01-01

    The 16th-century intellectual Robert Recorde is chiefly remembered for introducing the equals sign into algebra, yet the greater significance and broader scope of his work are often overlooked. This book presents an authoritative and in-depth analysis of the man, his achievements and his historical importance. This scholarly yet accessible work examines the latest evidence on all aspects of Recorde's life, throwing new light on a character deserving of greater recognition. Topics and features: presents a concise chronology of Recorde's life; examines his published works; describes Recorde's pro

  17. Analysis of microseismic signals and temperature recordings for rock slope stability investigations in high mountain areas

    Directory of Open Access Journals (Sweden)

    C. Occhiena

    2012-07-01

    Full Text Available Permafrost degradation is a probable cause of the increase in rock instabilities and rock falls observed in recent years in high mountain areas, particularly in the Alpine region. The phenomenon causes the thawing of the ice filling rock discontinuities; the meltwater subsequently refreezes, inducing stresses in the rock mass that may lead, in the long term, to rock falls. To investigate these processes, a monitoring system composed of geophones and thermometers was installed in 2007 at the Carrel hut (3829 m a.s.l., Matterhorn, NW Alps). In 2010, in the framework of the Interreg 2007–2013 Alcotra project no. 56 MASSA, the monitoring system was upgraded and renovated in order to meet project needs.

    In this paper, the data recorded by this renewed system between 6 October 2010 and 5 October 2011 are presented and 329 selected microseismic events are analysed. The data processing comprised the classification of the recorded signals, the analysis of their distribution in time, and the identification of the most important trace characteristics in the time and frequency domains. The interpretation of the results suggests a possible correlation between the temperature trend and the occurrence of events.

    The research is still in progress, and data recording and interpretation are planned for a longer period to better investigate the spatio-temporal distribution of microseismic activity in the rock mass, with specific attention to its relation with temperature. The overall goal is to verify the possibility of setting up an effective monitoring system for investigating the stability of a rock mass under permafrost conditions, in order to supply researchers with useful data for better understanding the relationship between temperature and rock mass stability and, possibly, to supply technicians with a valid decision-making tool.

  18. Analysis of microseismic signals and temperature recordings for rock slope stability investigations in high mountain areas

    Science.gov (United States)

    Occhiena, C.; Coviello, V.; Arattano, M.; Chiarle, M.; Morra di Cella, U.; Pirulli, M.; Pogliotti, P.; Scavia, C.

    2012-07-01

    Permafrost degradation is a probable cause of the increase in rock instabilities and rock falls observed in recent years in high mountain areas, particularly in the Alpine region. The phenomenon causes the thawing of the ice filling rock discontinuities; the meltwater subsequently refreezes, inducing stresses in the rock mass that may lead, in the long term, to rock falls. To investigate these processes, a monitoring system composed of geophones and thermometers was installed in 2007 at the Carrel hut (3829 m a.s.l., Matterhorn, NW Alps). In 2010, in the framework of the Interreg 2007-2013 Alcotra project no. 56 MASSA, the monitoring system was upgraded and renovated in order to meet project needs. In this paper, the data recorded by this renewed system between 6 October 2010 and 5 October 2011 are presented and 329 selected microseismic events are analysed. The data processing comprised the classification of the recorded signals, the analysis of their distribution in time, and the identification of the most important trace characteristics in the time and frequency domains. The interpretation of the results suggests a possible correlation between the temperature trend and the occurrence of events. The research is still in progress, and data recording and interpretation are planned for a longer period to better investigate the spatio-temporal distribution of microseismic activity in the rock mass, with specific attention to its relation with temperature. The overall goal is to verify the possibility of setting up an effective monitoring system for investigating the stability of a rock mass under permafrost conditions, in order to supply researchers with useful data for better understanding the relationship between temperature and rock mass stability and, possibly, to supply technicians with a valid decision-making tool.

  19. Pulling smarties out of a bag: a Headed Records analysis of children's recall of their own past beliefs.

    Science.gov (United States)

    Barreau, S; Morton, J

    1999-11-01

    The work reported provides an information-processing account of young children's performance on the Smarties task (Perner, J., Leekam, S.R., & Wimmer, H., 1987, Three-year-olds' difficulty with false belief: the case for a conceptual deficit. British Journal of Developmental Psychology, 5, 125-137). In this task, a 3-year-old is shown a Smarties tube and asked about the supposed contents. The true contents, pencils, are then revealed, and the majority of 3-year-olds cannot recall their initial belief that the tube contained Smarties. The theoretical analysis, based on the Headed Records framework (Morton, J., Hammersley, R.J., & Bekerian, D.A., 1985, Headed records: a model for memory and its failures, Cognition, 20, 1-23), focuses on the computational conditions that are required to resolve the Smarties task; on the possible limitations in the developing memory system that may lead to a computational breakdown; and on ways of bypassing such limitations to ensure correct resolution. The design, motivated by this analysis, is a variation on Perner's Smarties task. Instead of revealing the tube's contents immediately after establishing the child's beliefs about it, these contents were transferred to a bag and a (false) belief about the bag's contents established. Only then were the true contents of the bag revealed. The same procedure (with different contents) was carried out a week later. As predicted, children's performance was better (a) in the 'tube' condition and (b) on the second test. Consistent with the proposed analysis, the data show that when the computational demands imposed by the original task are reduced, young children can and do remember what they had thought about the contents of the tube even after its true contents are revealed. PMID:10536224

  20. Regional flood impact assessment based on local land use patterns and sample damage records

    International Nuclear Information System (INIS)

    Increasing land consumption and land demand, particularly in mountainous regions, entail further expansion of settlements into known hazard-prone areas. Potential impacts as well as regionally defined levels of 'acceptable risk' are often not transparently communicated, and residual risks are not perceived by the public. Analysing past events and assessing regional damage potentials can help planners on all levels to improve comprehensive and sustainable risk management. In this letter, a geospatial and statistical approach to regional damage cost assessment is presented, integrating information on actual conditions in terms of land use disparities and recorded damage data from a documented severe flooding event. In a first step, building objects are categorized according to their function and use. Tabular company information is linked to the building model via geocoded postal address data, enabling classification of building types in terms of predominant uses. For the disaster impact assessment, the flood plain is delineated based on post-disaster aerial imagery and a digital terrain model, distinguishing areas of long- and short-term flooding. Finally, four regional damage cost assessment scenarios at different levels of detail are calculated. The damage cost projection relies on available sample building-level damage records, allowing rough damage averaging for distinct building uses. Results confirm that consideration of local land use patterns is essential for optimizing regional damage cost projections.

  1. Experimental analysis of decay biases in the fossil record of lobopodians

    Science.gov (United States)

    Murdock, Duncan; Gabbott, Sarah; Purnell, Mark

    2016-04-01

    If fossils are to realize their full potential in reconstructing the tree of life we must understand how our view of ancient organisms is obscured by taphonomic filters of decay and preservation. In most cases, processes of decay will leave behind either nothing or only the most decay-resistant body parts, and even in those rare instances where soft tissues are fossilized we cannot assume that the resulting fossil, however exquisite, is a faithful anatomical representation of the animal as it was in life. Recent experiments have shown that the biases introduced by decay can be far from random; in chordates, for example, the most phylogenetically informative characters are also the most decay-prone, resulting in 'stemward slippage'. But how widespread is this phenomenon, and are there other non-random biases linked to decay? Intuitively, we make assumptions about the likelihood of different kinds of characters surviving and being preserved, with knock-on effects for anatomical and phylogenetic interpretations. To what extent are these assumptions valid? We combine our understanding of the fossil record of lobopodians with insights from decay experiments on modern onychophorans (velvet worms) to test these assumptions. Our analysis demonstrates that taphonomically informed tests of character interpretations have the potential to improve phylogenetic resolution. This approach is widely applicable to the fossil record - allowing us to ground-truth some of the assumptions involved in describing exceptionally preserved fossil material.

  2. Estimation of the Contribution of Intrinsic Currents to Motoneuron Firing Based on Paired Motoneuron Discharge Records in the Decerebrate Cat

    OpenAIRE

    Powers, Randall K.; Nardelli, Paul; Cope, T. C.

    2008-01-01

    Motoneuron activation is strongly influenced by persistent inward currents (PICs) flowing through voltage-sensitive channels. PIC characteristics and their contribution to the control of motoneuron firing rate have been extensively described in reduced animal preparations, but their contribution to rate modulation in human motoneurons is controversial. It has recently been proposed that the analysis of discharge records of a simultaneously recorded pair of motor units can be used to make quan...

  3. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  4. Analysis of workers' dose records from the Greek Dose Registry Information System

    International Nuclear Information System (INIS)

    The object of this work is the study of the individual annual film badge dose records of classified workers in Greece, monitored and assessed by the central dosimetry service of the Greek Atomic Energy Commission. Dose summaries were recorded and processed by the Dose Registry Information System. The statistical analysis refers to the years 1989-93 and deals with the distribution of individuals among the occupational groups, the mean annual dose, the collective dose, the distribution of the dose over the different specialties, and the number of workers that have exceeded any of the established dose limits. Results concerning the annual dose summaries demonstrate a year-by-year reduction in the mean individual dose to workers in the health sector. Conversely, exposures in the industrial sector did not show any decreasing tendency during the period under consideration. (Author)

  5. Mercury Determination in Fish Samples by Chronopotentiometric Stripping Analysis Using Gold Electrodes Prepared from Recordable CDs

    Directory of Open Access Journals (Sweden)

    Andrei Florin Danet

    2008-11-01

    Full Text Available A simple method for manufacturing gold working electrodes for chronopotentiometric stripping measurements from recordable CDs (CD-Rs) is described. These gold electrodes are much cheaper than commercially available ones. The electrochemical behavior of such an electrode and the working parameters for mercury determination by chronopotentiometric stripping analysis were studied. The detection limit was 0.30 μg Hg/L and the determination limit was 1.0 μg Hg/L for a deposition time of 600 s. Using the developed working electrodes it was possible to determine the total mercury in fish samples. A method for fish sample digestion was developed, using a mixture of fuming nitric acid and both concentrated sulfuric and hydrochloric acids. The recovery degree for a known amount of mercury introduced into the sample before digestion was 95.3% (n = 4).

  6. Integrated interpretation of helicopter and ground-based geophysical data recorded within the Okavango Delta, Botswana

    DEFF Research Database (Denmark)

    Podgorski, Joel E.; Green, Alan G.; Kalscheuer, Thomas;

    2015-01-01

    Integration of information from the following sources has been used to produce a much better constrained and more complete four-unit geological/hydrological model of the Okavango Delta than previously available: (i) a 3D resistivity model determined from helicopter time-domain electromagnetic (HTEM......) data recorded across most of the delta, (ii) 2D models and images derived from ground-based electrical resistance tomographic, transient electromagnetic, and high resolution seismic reflection/refraction tomographic data acquired at four selected sites in western and north-central regions of the delta...... electrical resistivities and very low to low P-wave velocities. Except for images of several buried abandoned river channels, it is non-reflective. The laterally extensive underlying unit of low resistivities, low P-wave velocity, and subhorizontal reflectors very likely contains saline-water-saturated sands...

  7. TEXTURE ANALYSIS BASED IRIS RECOGNITION

    OpenAIRE

    GÜRKAN, Güray; AKAN, Aydın

    2012-01-01

    In this paper, we present a new method for personal identification based on iris patterns. The method is composed of iris image acquisition, image preprocessing, feature extraction, and decision stages. Normalized iris images are vertically log-sampled and filtered by circular symmetric Gabor filters. The filter outputs are windowed, and the mean absolute deviation of the pixels in each window is calculated to form the feature vector. The proposed method has the desired properties of an iris reco...
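    The windowed mean-absolute-deviation feature described above can be sketched as follows; the window size, array shapes, and function name are illustrative assumptions, and the Gabor filtering stage is omitted:

    ```python
    import numpy as np

    def window_mad_features(filtered, win=8):
        """Mean absolute deviation of pixels in non-overlapping win x win windows,
        computed on an (already Gabor-filtered) iris image."""
        h, w = filtered.shape
        feats = []
        for i in range(0, h - win + 1, win):
            for j in range(0, w - win + 1, win):
                block = filtered[i:i + win, j:j + win]
                feats.append(np.mean(np.abs(block - block.mean())))
        return np.array(feats)

    rng = np.random.default_rng(0)
    img = rng.standard_normal((32, 64))   # stand-in for a normalized iris strip
    print(window_mad_features(img).shape)  # one feature per 8x8 window
    ```

    The MAD statistic captures local texture energy while being less sensitive to outlier pixels than variance, which is one plausible motivation for the choice.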

  8. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberian stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the precipitation values registered between 28 November and 7 December were so remarkable that the 1876 episode still holds the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to shed light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  9. ROAn, a ROOT based Analysis Framework

    CERN Document Server

    Lauf, Thomas

    2013-01-01

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform analysis of data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensor developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT features such as run-time type information and reflection. ROAn provides an analysis program which performs configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps, which produce analysis results. The dependency between results, and thus the order of calculation, is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and...

  10. Automatic BSS-based filtering of metallic interference in MEG recordings: definition and validation using simulated signals

    Science.gov (United States)

    Migliorelli, Carolina; Alonso, Joan F.; Romero, Sergio; Mañanas, Miguel A.; Nowak, Rafał; Russi, Antonio

    2015-08-01

    Objective. One of the principal drawbacks of magnetoencephalography (MEG) is its high sensitivity to metallic artifacts, which come from implanted intracranial electrodes and dental ferromagnetic prosthesis and produce a high distortion that masks cerebral activity. The aim of this study was to develop an automatic algorithm based on blind source separation (BSS) techniques to remove metallic artifacts from MEG signals. Approach. Three methods were evaluated: AMUSE, a second-order technique; and INFOMAX and FastICA, both based on high-order statistics. Simulated signals consisting of real artifact-free data mixed with real metallic artifacts were generated to objectively evaluate the effectiveness of BSS and the subsequent interference reduction. A completely automatic detection of metallic-related components was proposed, exploiting the known characteristics of the metallic interference: regularity and low frequency content. Main results. The automatic procedure was applied to the simulated datasets and the three methods exhibited different performances. Results indicated that AMUSE preserved and consequently recovered more brain activity than INFOMAX and FastICA. Normalized mean squared error for AMUSE decomposition remained below 2%, allowing an effective removal of artifactual components. Significance. To date, the performance of automatic artifact reduction has not been evaluated in MEG recordings. The proposed methodology is based on an automatic algorithm that provides an effective interference removal. This approach can be applied to any MEG dataset affected by metallic artifacts as a processing step, allowing further analysis of unusable or poor quality data.
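    The low-frequency criterion used to flag metallic-related components can be sketched with a simple spectral-power ratio; the 4 Hz cutoff, the function name, and the synthetic signals below are assumptions for illustration, not values taken from the paper:

    ```python
    import numpy as np

    def low_freq_ratio(component, fs, cutoff=4.0):
        """Fraction of spectral power below `cutoff` Hz; metallic-artifact
        components, being slow and regular, are expected to score high."""
        spec = np.abs(np.fft.rfft(component - component.mean())) ** 2
        freqs = np.fft.rfftfreq(component.size, d=1.0 / fs)
        return spec[freqs < cutoff].sum() / spec.sum()

    fs = 250.0
    t = np.arange(0, 4, 1 / fs)
    slow = np.sin(2 * np.pi * 1.0 * t)    # slow, artifact-like source
    fast = np.sin(2 * np.pi * 20.0 * t)   # brain-like oscillation
    print(low_freq_ratio(slow, fs), low_freq_ratio(fast, fs))
    ```

    Thresholding such a ratio, possibly combined with a regularity measure, would allow components to be rejected automatically before back-projecting the remaining sources.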

  11. Patients’ Acceptance towards a Web-Based Personal Health Record System: An Empirical Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Fong-Lin Jang

    2013-10-01

    Full Text Available The health care sector has become increasingly interested in developing personal health record (PHR) systems as an Internet-based telehealthcare implementation to improve the quality and decrease the cost of care. However, the factors that influence patients’ intention to use PHR systems remain unclear. Based on physicians’ therapeutic expertise, we implemented a web-based infertile PHR system and proposed an extended Technology Acceptance Model (TAM) that integrates the physician-patient relationship (PPR) construct into TAM’s original perceived ease of use (PEOU) and perceived usefulness (PU) constructs to explore which factors will influence the behavioral intentions (BI) of infertile patients to use the PHR. From ninety participants from a medical center, 50 valid responses to a self-rating questionnaire were collected, yielding a response rate of 55.56%. The partial least squares (PLS) technique was used to assess the causal relationships that were hypothesized in the extended model. The results indicate that infertile patients expressed a moderately high intention to use the PHR system. The PPR and PU of patients had significant effects on their BI to use the PHR, whereas the PEOU indirectly affected the patients’ BI through the PU. This investigation confirms that the PPR can have a critical role in shaping patients’ perceptions of the use of healthcare information technologies. Hence, we suggest that hospitals should promote the potential usefulness of PHRs and improve the quality of the physician-patient relationship to increase patients’ intention of using them.

  12. Excel-Based Business Analysis

    CERN Document Server

    Anari, Ali

    2012-01-01

    ai"The trend is your friend"is a practical principle often used by business managers, who seek to forecast future sales, expenditures, and profitability in order to make production and other operational decisions. The problem is how best to identify and discover business trends and utilize trend information for attaining objectives of firms.This book contains an Excel-based solution to this problem, applying principles of the authors' "profit system model" of the firm that enables forecasts of trends in sales, expenditures, profits and other business variables. The program,

  13. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The device is composed of a toughened glass slab supported by four force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time, where data analysis is conducted. The article explains the principle of operation as well as the algorithm for the measurement uncertainty of the COP (Centre of Pressure) coordinates (x, y).
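    As an illustration of how COP coordinates are derived from four corner force sensors, the following sketch assumes a rectangular platform with a hypothetical sensor layout; the function name, corner ordering, and dimensions are not from the article:

    ```python
    def cop_xy(f1, f2, f3, f4, half_w, half_h):
        """Centre of Pressure (x, y) from four corner force sensors of a
        rectangular platform (sensor layout and names are illustrative).
        f1..f4: forces at the (+x,+y), (-x,+y), (-x,-y), (+x,-y) corners."""
        total = f1 + f2 + f3 + f4
        x = half_w * ((f1 + f4) - (f2 + f3)) / total   # left-right moment balance
        y = half_h * ((f1 + f2) - (f3 + f4)) / total   # front-back moment balance
        return x, y

    # Equal loading puts the COP at the platform centre:
    print(cop_xy(10, 10, 10, 10, 0.25, 0.25))  # (0.0, 0.0)
    ```

    Propagating each sensor's calibration uncertainty through these two ratios is the kind of computation the article's uncertainty algorithm addresses.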

  14. Pulse artifact detection in simultaneous EEG-fMRI recording based on EEG map topography.

    Science.gov (United States)

    Iannotti, Giannina R; Pittau, Francesca; Michel, Christoph M; Vulliemoz, Serge; Grouiller, Frédéric

    2015-01-01

    One of the major artifacts corrupting electroencephalogram (EEG) signals acquired during functional magnetic resonance imaging (fMRI) is the pulse artifact (PA). It is mainly due to the motion of the head and of the attached electrodes and wires in the magnetic field after each heartbeat. In this study we propose a novel method to improve PA detection by exploiting the strong gradient and inverted polarity of the artifact between left and right EEG electrodes. We acquired high-density EEG-fMRI (256 electrodes) with simultaneous electrocardiogram (ECG) at 3 T. The PA was estimated as the voltage difference between right and left signals from the electrodes showing the strongest artifact (facial and temporal). Peaks were detected on this estimated signal and compared to the peaks in the ECG recording. We analyzed data from eleven healthy subjects, two epileptic patients, and four healthy subjects with an insulating layer between the electrodes and the scalp. The accuracy of the two methods was assessed with three criteria: (i) standard deviation, (ii) kurtosis, and (iii) confinement of the inter-peak intervals within the physiological range. We also checked whether the new method has an influence on the identification of epileptic spikes. Results show that the estimated PA improved artifact detection in 15/17 cases compared to the ECG method. Moreover, epileptic spike identification was not altered by the correction. The proposed method improves the detection of pulse-related artifacts, which is particularly crucial when the ECG is of poor quality or cannot be recorded. It will contribute to enhancing the quality of the EEG, increasing the reliability of EEG-informed fMRI analysis. PMID:25307731
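    The detection scheme described above (estimate the artifact as the right-minus-left difference, then detect its peaks) can be sketched as follows; the threshold, refractory period, and synthetic pulse train are illustrative assumptions, not parameters from the paper:

    ```python
    import numpy as np

    def detect_pa_peaks(left, right, fs, min_rr=0.4):
        """Detect pulse-artifact peaks on the right-minus-left difference
        (electrode choice, threshold and refractory period are illustrative)."""
        est = right - left                    # inverted polarity adds up the artifact
        thr = est.mean() + 2.0 * est.std()    # simple amplitude threshold
        refractory = int(min_rr * fs)         # minimum plausible inter-beat distance
        peaks, last = [], -refractory
        for i in range(1, est.size - 1):
            if (est[i] > thr and est[i] >= est[i - 1] and est[i] > est[i + 1]
                    and i - last >= refractory):
                peaks.append(i)
                last = i
        return np.array(peaks)

    fs = 250.0
    t = np.arange(0, 10, 1 / fs)
    pulse = np.exp(-((t % 1.0) - 0.3) ** 2 / 0.001)   # one sharp pulse per second
    peaks = detect_pa_peaks(-pulse, pulse, fs)
    print(len(peaks))  # 10 beats in 10 s
    ```

    The detected inter-peak intervals can then be compared against ECG R-peaks, or checked against the physiological heart-rate range when no ECG is available.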

  15. A knowledge-based taxonomy of critical factors for adopting electronic health record systems by physicians: a systematic literature review

    Directory of Open Access Journals (Sweden)

    Martínez-García Ana I

    2010-10-01

    Full Text Available Abstract Background The health care sector is an area of social and economic interest in several countries; therefore, considerable effort has gone into the use of electronic health records. Nevertheless, there is evidence suggesting that these systems have not been adopted as expected, and although there are some proposals to support their adoption, the proposed support is not provided by means of information and communication technology, which could supply automatic tools of support. The aim of this study is to identify the critical adoption factors for electronic health records by physicians and to use them as a guide to support their adoption process automatically. Methods This paper presents, based on the PRISMA statement, a systematic literature review of adoption studies of electronic health records published in English in electronic databases. Software applications that manage and process the data in the electronic health record have been considered, i.e. computerized physician prescription, electronic medical records, and electronic capture of clinical data. Our review was conducted with the purpose of obtaining a taxonomy of the physicians' main barriers to adopting electronic health records that can be addressed by means of information and communication technology, in particular with the information technology roles of the knowledge management processes. This leads us to the question we want to address in this work: "What are the critical adoption factors of electronic health records that can be supported by information and communication technology?". Reports from eight databases covering electronic health records adoption studies in the medical domain, in particular those focused on physicians, were analyzed. Results The review identifies two main issues: (1) a knowledge-based classification of critical factors for adopting electronic health records by physicians; and (2) the definition of a base for the design of a conceptual

  16. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will present a formal framework for describing phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  17. Implementation of a cloud-based electronic medical record to reduce gaps in the HIV treatment continuum in rural Kenya

    OpenAIRE

    John Haskew; Gunnar Rø; Kenrick Turner; Davies Kimanga; Martin Sirengo; Shahnaaz Sharif

    2015-01-01

    Background Electronic medical record (EMR) systems are increasingly being adopted to support the delivery of health care in developing countries and their implementation can help to strengthen pathways of care and close gaps in the HIV treatment cascade by improving access to and use of data to inform clinical and public health decision-making. Methods This study implemented a novel cloud-based electronic medical record system in an HIV outpatient setting in Western Kenya and eval...

  18. Reconstructing Past Depositional and Diagenetic Processes through Quantitative Stratigraphic Analysis of the Martian Sedimentary Rock Record

    Science.gov (United States)

    Stack, Kathryn M.

    High-resolution orbital and in situ observations acquired of the Martian surface during the past two decades provide the opportunity to study the rock record of Mars at an unprecedented level of detail. This dissertation consists of four studies whose common goal is to establish new standards for the quantitative analysis of visible and near-infrared data from the surface of Mars. Through the compilation of global image inventories, application of stratigraphic and sedimentologic statistical methods, and use of laboratory analogs, this dissertation provides insight into the history of past depositional and diagenetic processes on Mars. The first study presents a global inventory of stratified deposits observed in images from the High Resolution Image Science Experiment (HiRISE) camera on-board the Mars Reconnaissance Orbiter. This work uses the widespread coverage of high-resolution orbital images to make global-scale observations about the processes controlling sediment transport and deposition on Mars. The next chapter presents a study of bed thickness distributions in Martian sedimentary deposits, showing how statistical methods can be used to establish quantitative criteria for evaluating the depositional history of stratified deposits observed in orbital images. The third study tests the ability of spectral mixing models to obtain quantitative mineral abundances from near-infrared reflectance spectra of clay and sulfate mixtures in the laboratory for application to the analysis of orbital spectra of sedimentary deposits on Mars. The final study employs a statistical analysis of the size, shape, and distribution of nodules observed by the Mars Science Laboratory Curiosity rover team in the Sheepbed mudstone at Yellowknife Bay in Gale crater. This analysis is used to evaluate hypotheses for nodule formation and to gain insight into the diagenetic history of an ancient habitable environment on Mars.

  19. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e., LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies, and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies.
Conclusion The new toolbox presented here implements fast
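The limited-sampling problem described in the abstract can be illustrated with a minimal plug-in estimator plus the simple Miller-Madow correction; this is only a sketch of the general idea, as the toolbox itself implements more sophisticated estimators and bias-correction techniques:

```python
import numpy as np

def entropy_mm(counts):
    """Plug-in entropy (bits) with the Miller-Madow bias correction."""
    n = counts.sum()
    p = counts[counts > 0] / n
    k = np.count_nonzero(counts)                 # occupied bins
    return -np.sum(p * np.log2(p)) + (k - 1) / (2 * n * np.log(2))

def mutual_information(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from discretised analog responses."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    return (entropy_mm(joint.sum(axis=1)) + entropy_mm(joint.sum(axis=0))
            - entropy_mm(joint.ravel()))

rng = np.random.default_rng(0)
stim = rng.normal(size=5000)
resp = stim + 0.5 * rng.normal(size=5000)        # carries stimulus information
noise = rng.normal(size=5000)                    # carries none

print(round(mutual_information(stim, resp), 2))  # clearly positive
print(round(mutual_information(stim, noise), 2)) # close to zero
```

Without the correction term, the second estimate would be biased upward, which is exactly the limited-sampling effect motivating the toolbox.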

  20. Fragmented implementation of maternal and child health home-based records in Vietnam: need for integration

    Directory of Open Access Journals (Sweden)

    Hirotsugu Aiga

    2016-02-01

    Full Text Available Background: Home-based records (HBRs) are globally implemented as effective tools that encourage pregnant women and mothers to utilise maternal and child health (MCH) services in a timely and adequate manner. While the availability and utilisation of nationally representative HBRs have been assessed in several earlier studies, the reality of a number of HBRs implemented subnationally in a less coordinated manner has been neither reported nor analysed. Objectives: This study is aimed at estimating the prevalence of HBRs for MCH and the level of fragmentation of and overlap between different HBRs for MCH in Vietnam. The study further attempts to identify health workers’ and mothers’ perceptions of HBR operation and utilisation. Design: A self-administered questionnaire was sent to the provincial health departments of 28 selected provinces. A copy of each HBR available was collected from them. A total of 20 semi-structured interviews with health workers and mothers were conducted at rural communities in four of the 28 selected provinces. Results: Whereas HBRs developed exclusively for maternal health and exclusively for child health were available in four provinces (14%) and in 28 provinces (100%), respectively, those for both maternal health and child health were available in nine provinces (32%). The mean number of HBRs in the 28 provinces (=5.75) indicates over-availability of HBRs. All 119 minimum required recording items found in three different HBRs under nationwide scale-up were also included in the Maternal and Child Health Handbook being piloted for nationwide scaling-up. Implementation of multiple HBRs is likely to confuse not only health workers, by requiring them to record the same data on several HBRs, but also mothers about which HBR they should refer to and rely on at home. Conclusions: To enable both health workers and pregnant women to focus on only one type of HBR, province-specific HBRs for maternal and/or child health need to be

  1. Application of Dense Array Analysis to Strong Motion Data Recorded at The SMART-1 Array

    Science.gov (United States)

    Francois, C.

    2003-12-01

    This paper is part of a project to design an optimal strong motion dense array in New Zealand. The overall project looks at developing a dense network of strong motion seismometers in order to measure directly the rupture process of major seismogenic sources such as the Alpine Fault and strands of the Marlborough Fault System defining the South Island sector of the Australia-Pacific plate boundary zone. This work shows the application of dense array analysis to a set of seismic data recorded at the SMART-1 array in Taiwan (data kindly provided by the Institute of Earth Sciences, Academia Sinica Data Management Center for Strong Motion Seismology - Taiwan). The data have been processed and analysed by applying a modified MUSIC algorithm with greater computing capability, giving higher-resolution results. The SMART-1 array is an ideal dense array of 37 strong motion instruments set up in the following configuration: three concentric circles of radii 200 m, 1 km, and 2 km, and one central station. The studied event, called Event 5, was recorded on January 29, 1981, and had a magnitude of 6. Event 5 is an ideal case study as its epicentral distance (about 30 km) is comparable to epicentral distances for expected events on the Alpine Fault or on the Hope Fault in New Zealand. Event 5 has previously been widely analysed using strong motion array studies and aftershock studies, but with disagreeing results; this new study hopes to bring new insights into the debate. Using simple fault and velocity models, this latest analysis of Event 5 has given the following rupture properties. It has confirmed one of the hypotheses, that the fault ruptured from southeast to northwest. The higher resolution of the computation has improved the location of the hypocentre depth and the location of the propagating rupture front. This allowed resolving changes of velocity in the rupture process and locating asperities in the fault plane. 
Contrary to the previous array studies, the inferred size of the fault
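The array-processing idea behind the study can be sketched with a minimal narrowband MUSIC implementation for a uniform linear array. This is an illustration only: the SMART-1 analysis used a modified MUSIC algorithm on a two-dimensional array with real strong-motion records, whereas the array geometry, angles, and noise level below are synthetic assumptions.

```python
import numpy as np

def music_spectrum(X, n_sources, angles_deg, d=0.5):
    """MUSIC pseudo-spectrum for a uniform linear array.
    X: (sensors, snapshots) complex baseband data; d: spacing in wavelengths."""
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]              # sample covariance
    _, eigvecs = np.linalg.eigh(R)               # eigenvalues ascending
    En = eigvecs[:, : m - n_sources]             # noise subspace
    spec = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(th))  # steering vector
        spec.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
    return np.array(spec)

# Synthetic test: two plane waves arriving from -20 and +30 degrees.
rng = np.random.default_rng(1)
m, snapshots, true_angles = 8, 200, (-20.0, 30.0)
A = np.stack([np.exp(-2j * np.pi * 0.5 * np.arange(m) * np.sin(np.deg2rad(t)))
              for t in true_angles], axis=1)
S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
N = 0.1 * (rng.normal(size=(m, snapshots)) + 1j * rng.normal(size=(m, snapshots)))
X = A @ S + N

grid = np.arange(-90.0, 90.5, 0.5)
spec = music_spectrum(X, 2, grid)
i1 = int(np.argmax(spec))
masked = spec.copy()
masked[np.abs(grid - grid[i1]) < 5] = 0          # suppress the first peak
i2 = int(np.argmax(masked))
est = sorted([grid[i1], grid[i2]])
print(est)  # peaks close to the true arrival directions
```

The sharp pseudo-spectrum peaks are what give MUSIC its high resolution relative to conventional beamforming, which is the property exploited in the SMART-1 reanalysis.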

  2. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  4. Understanding the Impact of Electronic Medical Record Use on Practice-Based Population Health Management: A Mixed-Method Study

    Science.gov (United States)

    Hughes, John B; Law, Susan; Lortie, Michel; Leaver, Chad; Lapointe, Liette

    2016-01-01

    Background Practice-based population health (PBPH) management is the proactive management of patients by their primary care clinical team. The ability of clinics to engage in PBPH and the means by which they incorporate it in a clinical setting remain unknown. Objective We conducted the Canadian Population Health Management Challenge to determine the capacity and preparedness of primary care settings to engage in PBPH using their existing medical record systems and to understand the complexities that may exist in PBPH implementation. Methods We recruited a sample of electronic medical record (EMR)-enabled and paper-based clinics from across Canada to participate in the challenge. The challenge required clinic staff and physicians to complete time-controlled, evidence-based practice reviews of their patients who may benefit from evidence-informed care, treatment, or interventions across five different areas (immunization, postmyocardial infarction care, cancer screening, diabetes management, and medication recall). We formulated a preparedness index to measure the capacity of clinics to engage in PBPH management. Finally, we conducted follow-up qualitative interviews to provide a richer understanding of PBPH implementation and related issues (ie, challenges and facilitators). Results A total of 11 primary care clinics participated, representing 21 clinician practices. EMR-enabled clinics completed a full review of charts in an average of 1.37 hours. In contrast, paper-based clinics reviewed nearly 10% of their charts in an average of 3.9 hours, suggesting that they would have required an estimated 40 hours to complete a review of all the charts in their practice. Furthermore, the index revealed a major gap in preparedness between the EMR and paper-based clinics (0.86–3.78 vs 0.05–0.12), as well as a broad range among the EMR clinics. Finally, building on the results of the qualitative analysis, we identified factors facilitating the integration of PBPH. Conclusions Our

  5. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Müller Meinard

    2007-01-01

    Full Text Available One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music where the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis, which allows the identification of musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical features to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.
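The idea of absorbing micro-variations before comparing segments can be sketched with a toy self-similarity computation. This is illustrative only: the paper's statistical features, distance measure, and extraction strategy are considerably more elaborate, and the "chroma" data below are synthetic.

```python
import numpy as np

def smooth_features(F, w=4):
    """Temporal moving average of a feature sequence (frames x dims),
    absorbing frame-level micro-variations."""
    kernel = np.ones(w) / w
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode='same'), 0, F)

def ssm(F):
    """Cosine self-similarity matrix of a feature sequence."""
    G = F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-9)
    return G @ G.T

# Toy chroma-like sequence with musical form A-B-A: the final A repeats the
# opening A with small per-frame perturbations (a crude stand-in for
# variations in dynamics or articulation).
rng = np.random.default_rng(0)
A = rng.random((20, 12))
B = rng.random((20, 12))
F = np.vstack([A, B, A + 0.05 * rng.normal(size=A.shape)])
S = ssm(smooth_features(F))

# The repeated section appears as a high-similarity off-diagonal stripe.
block = S[:20, 40:].diagonal().mean()    # A (opening) vs A (reprise)
other = S[:20, 20:40].diagonal().mean()  # A vs B
print(block > other)  # True
```

In structure-analysis terms, the high-similarity diagonal stripe in the off-diagonal block is the path that a repetition-extraction step would trace out.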

  6. EHR query language (EQL)--a query language for archetype-based health records.

    Science.gov (United States)

    Ma, Chunlan; Frankel, Heath; Beale, Thomas; Heard, Sam

    2007-01-01

    OpenEHR specifications have been developed to standardise the representation of an international electronic health record (EHR). The language used for querying EHR data is not as yet part of the specification. To fill this gap, Ocean Informatics has developed a query language currently known as EHR Query Language (EQL), a declarative language supporting queries on EHR data. EQL is neutral to EHR systems, programming languages and system environments and depends only on the openEHR archetype model and semantics. Thus, in principle, EQL can be used in any archetype-based computational context. In the EHR context described here, particular queries mention concepts from the openEHR EHR Reference Model (RM). EQL can be used as a common query language for disparate archetype-based applications. With the use of a common RM, archetypes, and a companion query language such as EQL, semantic interoperability of EHR information is much closer. This paper introduces the EQL syntax and provides example clinical queries to illustrate the syntax. Finally, current implementations and future directions are outlined. PMID:17911747
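The paper's own example queries are not reproduced in this abstract; the sketch below shows what an EQL-style archetype-based query might look like when assembled in client code. The archetype node IDs (at-codes), paths, and threshold are hypothetical illustrations, not taken from the paper.

```python
# Illustrative only: an EQL-style query over openEHR data. The at-codes,
# paths, and the threshold of 140 are hypothetical; a client would submit
# the text to an EQL-capable server rather than parse it locally.
query = """
SELECT obs/data[at0001]/events[at0006]/data[at0003]/items[at0004]/value
FROM EHR e
CONTAINS COMPOSITION c
CONTAINS OBSERVATION obs[openEHR-EHR-OBSERVATION.blood_pressure.v1]
WHERE obs/data[at0001]/events[at0006]/data[at0003]/items[at0004]/value/magnitude >= 140
"""

# Sanity-check the clause structure of the query text.
clauses = [line.split()[0] for line in query.strip().splitlines()]
print(clauses)  # ['SELECT', 'FROM', 'CONTAINS', 'CONTAINS', 'WHERE']
```

Because the FROM/CONTAINS clauses reference archetypes rather than table names, the same query text can in principle run against any archetype-based repository, which is the portability argument the paper makes.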

  7. Image Processing Based Girth Monitoring and Recording System for Rubber Plantations

    Directory of Open Access Journals (Sweden)

    Chathura Thilakarathne

    2015-02-01

    Full Text Available Measuring the girth and continuously monitoring its increase is one of the most important processes in rubber plantations, since identification of girth deficiencies enables planters to take corrective actions to ensure a good yield from the plantation. This research paper presents an image processing based girth measurement and recording system that can replace the existing manual process in an efficient and economical manner. The system takes a digital image of the tree and uses the number drawn on the tree to identify the tree and measure its width. The image is thresholded first and then filtered using several filtering criteria to identify possible candidates for numbers. Identified blobs are then fed to the Tesseract OCR engine for number recognition. The thresholded image is then filtered again with different criteria to segment out the black strip drawn on the tree, which is then used to calculate the width of the tree using calibration parameters. Once the tree number is identified and the width is calculated, the measured girth of the tree is stored in the database under the identified tree number. The results obtained from the system indicated significant improvements in efficiency and economy for main plantations. As future developments we propose a standard commercial system for girth measurement using standardized 2D barcodes as tree identifiers
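The width-from-strip step can be sketched as follows. This is a synthetic illustration: the threshold value, calibration factor, and test image are assumed, and the OCR stage is omitted.

```python
import numpy as np

def strip_width_cm(img, thresh=60, cm_per_px=0.05):
    """Estimate the width of a dark horizontal strip in a grayscale image.
    Binarise by threshold, locate the strip rows, and convert the dark run
    in the strip's middle row to centimetres via a calibration factor
    (both thresh and cm_per_px are assumed example values)."""
    dark = img < thresh                          # binarise: strip pixels
    rows = np.where(dark.any(axis=1))[0]
    mid = dark[(rows.min() + rows.max()) // 2]   # middle row of the strip
    width_px = int(mid.sum())                    # dark pixels across the strip
    return width_px * cm_per_px

# Synthetic test image: light background with a 120-px-wide dark strip.
img = np.full((100, 200), 200, dtype=np.uint8)
img[40:60, 30:150] = 20
girth = strip_width_cm(img)                      # 120 px * 0.05 cm/px
print(girth)  # 6.0
```

In the real system the calibration factor would come from camera distance and lens parameters, and the strip segmentation would use the filtering criteria described above rather than a single global threshold.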

  8. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    The methods applicable in the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make more precise the requirements for software-based systems and their analyses in the regulatory guides should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)
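As a concrete example of the kind of quantitative software reliability model surveyed above (the report does not necessarily use this particular one), the Goel-Okumoto NHPP model relates test time to expected failures and failure-free operation:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto NHPP: expected cumulative failures by time t,
    m(t) = a * (1 - exp(-b t)), where a is the total expected fault count
    and b the per-fault detection rate."""
    return a * (1 - math.exp(-b * t))

def go_reliability(x, t, a, b):
    """Probability of no failure in (t, t+x]: R = exp(-(m(t+x) - m(t)))."""
    return math.exp(-(go_mean_failures(t + x, a, b) - go_mean_failures(t, a, b)))

# Illustrative parameters (assumed, not from the report): 100 latent faults,
# detection rate 0.05 per unit of test time.
a, b = 100.0, 0.05
print(round(go_mean_failures(40, a, b), 1))    # faults expected by t = 40
print(round(go_reliability(10, 40, a, b), 3))  # failure-free chance for next 10 units
```

The mismatch noted in the abstract is visible here: such models yield failure intensities over test time, whereas PSA typically needs a probability of failure on demand, so an extra conversion step is required.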

  9. Curvelet Based Offline Analysis of SEM Images

    OpenAIRE

    Shirazi, Syed Hamad; Haq, Nuhman ul; Hayat, Khizar; Naz, Saeeda; Haque, Ihsan ul

    2014-01-01

    Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image processing based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step, aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method ...

  10. Maturity Matrices for Quality of Model- and Observation-Based Climate Data Records

    Science.gov (United States)

    Höck, Heinke; Kaiser-Weiss, Andrea; Kaspar, Frank; Stockhause, Martina; Toussaint, Frank; Lautenschlager, Michael

    2015-04-01

    In the field of Software Engineering the Capability Maturity Model is used to evaluate and improve software development processes. The application of a Maturity Matrix is a method to assess the degree of software maturity. This method was adapted to the maturity of Earth System data in scientific archives. The application of such an approach to Climate Data Records was first proposed in the context of satellite-based climate products and applied by NOAA and NASA. The European FP7 project CORE-CLIMAX suggested and tested extensions of the approach in order to allow the applicability to additional climate datasets, e.g. based on in-situ observations as well as model-based reanalysis. Within that project the concept was applied to products of satellite- and in-situ based datasets. Examples are national ground-based data from Germany as an example for typical products of a national meteorological service, the EUMETSAT Satellite Application Facility Network, the ESA Climate Change Initiative, European Reanalysis activities (ERA-CLIM) and international in situ-based climatologies such as GPCC, ECA&D, BSRN, HadSST. Climate models and their related output have some additional characteristics that need specific consideration in such an approach. Here we use examples from the World Data Centre for Climate (WDCC) to discuss the applicability. The WDCC focuses on climate data products, specifically those resulting from climate simulations. Based on these already existing Maturity Matrix models, WDCC developed a generic Quality Assessment System for Earth System data. A self-assessment is performed using a maturity matrix evaluating the data quality for five maturity levels with respect to the criteria data and metadata consistency, completeness, accessibility and accuracy. The classical goals of a quality assessment system in a data processing workflow are: (1) to encourage data creators to improve quality to reach the next quality level, (2) enable data consumers to decide
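A maturity-matrix self-assessment of this kind reduces to a small scoring exercise. The sketch below assumes a simplified four-criterion matrix and a weakest-link overall score; this is an illustration of the mechanism, not WDCC's actual scheme or level definitions.

```python
# Assumed example: each criterion is rated on maturity levels 1-5 by the
# data creator; the criteria names mirror those listed in the abstract.
assessment = {
    "consistency":   4,   # data and metadata consistency
    "completeness":  3,
    "accessibility": 5,
    "accuracy":      3,
}

# Overall maturity as the weakest criterion: a dataset is only as mature
# as its weakest-assessed aspect.
overall = min(assessment.values())
to_improve = [c for c, level in assessment.items() if level == overall]
print(overall)     # 3
print(to_improve)  # ['completeness', 'accuracy']
```

This makes the two stated goals concrete: creators see exactly which criteria block the next level, and consumers get a single comparable maturity figure.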

  11. How to limit the burden of data collection for Quality Indicators based on medical records? The COMPAQH experience

    Directory of Open Access Journals (Sweden)

    Grenier Catherine

    2008-10-01

    Full Text Available Abstract Background Our objective was to limit the burden of data collection for Quality Indicators (QIs) based on medical records. Methods The study was supervised by the COMPAQH project. Four QIs based on medical records were tested: medical record conformity; traceability of pain assessment; screening for nutritional disorders; and time elapsed before sending a copy of the discharge letter to the general practitioner. Data were collected by 6 Clinical Research Assistants (CRAs) in a panel of 36 volunteer hospitals and analyzed by COMPAQH. To limit the burden of data collection, we used the same sample of medical records for all 4 QIs, limited the sample size to 80 medical records, and built a composite score of only 10 items to assess medical record completeness. We assessed QI feasibility by completing a grid of 19 potential problems and evaluating the time spent. We assessed reliability (κ coefficient) as well as internal consistency (Cronbach α coefficient) in an inter-observer study, and discriminatory power by analysing QI variability among hospitals. Results Overall, 23 115 data items were collected for the 4 QIs and analyzed. The average time spent on data collection was 8.5 days per hospital. The most common feasibility problem was misunderstanding of an item by hospital staff. QI reliability was good (κ: 0.59–0.97 according to QI). The hospitals differed widely in their ability to meet the quality criteria (mean value: 19–85%). Conclusion These 4 QIs based on medical records can be used to compare the quality of record keeping among hospitals while limiting the burden of data collection, and can therefore be used for benchmarking purposes. The French National Health Directorate has included them in the new 2009 version of the accreditation procedure for healthcare organizations.
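The inter-observer reliability figure (κ) reported above can be computed as follows; the two raters' codes below are invented for illustration and are not the study's data:

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)                       # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # by chance
    return (po - pe) / (1 - pe)

# Two CRAs coding the same 10 records as conforming (1) or not (0).
cra1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
cra2 = [1, 1, 0, 1, 1, 1, 1, 0, 0, 0]
print(round(cohens_kappa(cra1, cra2), 2))  # 0.58
```

A value in this range sits at the lower end of the study's reported 0.59-0.97 band; raw percent agreement (here 80%) overstates reliability because it ignores chance agreement.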

  12. [The influence of Donguibogam during the middle Joseon era based on clinical records on low back pain in Seungjeongwon ilgi].

    Science.gov (United States)

    Jung, Jae Young; Lee, Jun Hwan; Chung, Seok Hee

    2011-06-30

    The recently increasing interest in historical records has led to more research on historical records in various fields of study. This trend has also affected medical research, with the medical climate and popular treatment modalities of the past now being revealed based on historical records. However, most research on medical history during the Joseon era has been based on the most well-known record, Joseon wangjo sillok or the Annals of the Joseon Dynasty. Joseon wangjo sillok is a comprehensive and organized record of society during the Joseon era and contains key knowledge about medical history during the period, but it lacks details on the treatment of common disorders at the time. Seungjeongwon ilgi or the Diary of the Royal Secretariat has detailed records of daily events and is a valuable resource for the daily activities of the era. In the middle Joseon era, a variety of medical books - especially Donguibogam - were published. Therefore, the authors focused on the under-researched Seungjeongwon ilgi and Donguibogam and attempted to assess and evaluate low back pain treatment performed on Joseon royalty. The most notable characteristic of the low back treatment records within the Seungjeongwon ilgi is that diagnosis and treatment were made based on an independent Korean medicine, rather than conventional Chinese medicine. This paradigm shift is represented in Donguibogam, and can be seen in the close relationship between Donguibogam and the national medical exams of the day. In keeping with the pragmatism of the middle Joseon era, medical treatment also put more focus on pragmatic treatment methods, and records show emphasis on acupuncture and moxibustion and other points in accord with this. The authors also observed the meaning and limitations of low back pain treatment during that era through comparison with current diagnosis and treatment. PMID:21894068

  13. Forming Teams for Teaching Programming based on Static Code Analysis

    CERN Document Server

    Arosemena-Trejos, Davis; Clunie, Clifton

    2012-01-01

    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these teams are formed without taking certain aspects into account, they may cause an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, and the tool evaluates the best formation of teams in a given course. Team formation is based on programming styles, skills, pair programming, or working with a leader.

  14. Forming Teams for Teaching Programming based on Static Code Analysis

    Directory of Open Access Journals (Sweden)

    Davis Arosemena-Trejos

    2012-03-01

    Full Text Available The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these teams are formed without taking certain aspects into account, they may cause an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, and the tool evaluates the best formation of teams in a given course. Team formation is based on programming styles, skills, pair programming, or working with a leader.
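The metric-extraction step that both versions of the paper describe can be sketched with Python's `ast` module. The specific metrics below (function count, docstring coverage, snake_case naming) are assumed proxies for style and good-practice metrics, not SOFORG's actual feature set:

```python
import ast

def style_metrics(source):
    """Extract simple style/practice metrics from a student's source code
    via static analysis (no execution of the student code)."""
    tree = ast.parse(source)
    funcs = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
    return {
        "functions": len(funcs),
        "documented": sum(ast.get_docstring(f) is not None for f in funcs),
        "snake_case": sum("_" in f.name or f.name.islower() for f in funcs),
    }

student_code = '''
def mean_value(xs):
    """Average of a list."""
    return sum(xs) / len(xs)

def Tally(xs):
    return len(xs)
'''

m = style_metrics(student_code)
print(m)  # {'functions': 2, 'documented': 1, 'snake_case': 1}
```

Per-student vectors of such metrics could then be compared to pair complementary styles, which is the team-formation idea the paper builds on.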

  15. Reconstructing Late Pleistocene air temperature variability based on branched GDGTs in the sedimentary record of Llangorse Lake (Wales)

    Science.gov (United States)

    Maas, David; Hoek, Wim; Peterse, Francien; Akkerman, Keechy; Macleod, Alison; Palmer, Adrian; Lowe, John

    2015-04-01

    This study aims to provide a temperature reconstruction of the Lateglacial sediments of Llangorse Lake. A new temperature proxy is used, based on the occurrence of different membrane lipids of soil bacteria (de Jonge et al., 2014). Application of this proxy in lacustrine environments is difficult because of in situ (water column) production and co-elution of isomers. Pollen analysis provides a palynological record that can be used for biostratigraphical correlation to other records. Llangorse Lake lies in a glacial basin just northeast of the Brecon Beacons in Powys, South Wales. The lake is located upstream in the Afon Llynfi valley, at the edge of the watershed of the River Wye. The lake consists of two semi-separated basins with a maximum water depth of 7.5 m, arranged in an L-shape with a surface area of roughly 1.5 km2. Previous studies have focused on the Holocene development of the lake and its surrounding environment (Jones et al., 1985). This study focuses on the deglacial record that appeared to be present in the basal part of the sequence. The lake was cored in September 2014 with a manually operated 3 m piston corer from a small coring platform. Overlapping cores were taken to form a continuous 12 m core, spanning the Holocene and the Lateglacial sediments. Six adjacent Lateglacial core segments from the southern basin of Llangorse Lake were scanned for their major element composition using XRF scanning at 5 mm resolution to discern changes in sediment origin. Furthermore, loss on ignition (LOI) analysis was used to determine changes in the organic content of the sediments. Subsamples of the Lateglacial sedimentary record were analyzed for the occurrence of different bacterial membrane lipids (brGDGTs: branched glycerol dialkyl glycerol tetraethers) by means of HPLC-MS (high performance liquid chromatography and mass spectrometry) using two silica columns to achieve proper separation of isomers (de Jonge et al., 2013). Air temperatures are
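The proxy arithmetic reduces to an index of fractional brGDGT abundances mapped linearly to air temperature. The sketch below uses placeholder coefficients in the published form MAT = m x MBT'5ME + c; the abundances are invented, and the actual slope and intercept must be taken from the de Jonge et al. (2014) calibration rather than from this example:

```python
# Sketch of the brGDGT proxy arithmetic. Abundances and calibration
# coefficients below are illustrative placeholders, not measured values.

def mbt_5me(tetra, penta, hexa):
    """MBT'5ME-style index: tetramethylated brGDGTs over the sum of tetra-,
    5-methyl penta-, and hexamethylated brGDGTs (consistent units)."""
    return tetra / (tetra + penta + hexa)

def mat_celsius(mbt, slope=31.45, intercept=-8.57):
    """Linear soil calibration MAT = slope * MBT'5ME + intercept;
    the coefficients here are placeholders for the published ones."""
    return slope * mbt + intercept

index = mbt_5me(tetra=4.2, penta=3.1, hexa=1.3)
print(round(index, 3))           # 0.488
print(round(mat_celsius(index), 1))  # reconstructed mean annual air temperature
```

The co-elution problem mentioned above matters precisely here: if 5- and 6-methyl isomers are not separated chromatographically, the penta- and hexamethylated fractions, and hence the index, are biased.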

  16. A Patient-Based Analysis of Drug Disorder Diagnoses in the Medicare Population

    OpenAIRE

    Cartwright, William S.; Ingster, Lillian M.

    1993-01-01

    This article utilizes the Part A Medicare provider analysis and review (MEDPAR) file for fiscal year (FY) 1987. The discharge records were organized into a patient-based record that included alcohol, drug, and mental (ADM) disorder diagnoses as well as measures of resource use. The authors find that there are substantially higher costs of health care incurred by the drug disorder diagnosed population. Those of the Medicare population diagnosed with drug disorders had longer lengths of stay (L...

  17. Lakes as recorders of extreme flows: utilising particle size analysis to generate a millennial-scale palaeoflood record from the English Lake District

    Science.gov (United States)

    Schillereff, Daniel; Chiverrell, Richard; Macdonald, Neil; Hooke, Janet

    2013-04-01

    Developing new quantitative measures of catchment processes, such as flood events, is a key goal of geomorphologists. The geomorphic effects of extreme hydrological events are effectively recorded in upland lake basins, as efficient sediment trapping renders flow-related proxy indicators (e.g., particle size) reflective of changes in river discharge. We demonstrate that integrating particle size analysis of lake sediment cores with data from an on-going sediment trapping protocol within the lake can provide a valuable natural archive for investigating hydrogeomorphic extremes over extended time periods. A series of sediment cores (3 - 5 m length) extracted from Brotherswater, English Lake District, contain numerous coarse-grained laminations, discerned by applying high-resolution (0.5 cm) laser granulometry and interpreted to reflect a palaeoflood record extending to ~2000 yr BP. Well-constrained core chronologies are derived through integrating radionuclide (210Pb, 137Cs, 241Am, 14C) dating with geochemical markers which reflect phases of local lead (Pb) mining. Geochemical and magnetic profiles have facilitated precise core correlation and allowed the repeatability of the distinctive coarse facies to be verified. That these laminae exhibit inverse grading underlying normal grading, most likely reflecting the waxing and waning of flood-induced hyperpycnal flows, supports our palaeoflood interpretation. Application of a recently-published end-member model for unmixing particle size distributions (Dietze et al., 2012) demonstrates a prominent coarse end-member (medium sand) which we attribute to fluvial transport of coarse grains during high-magnitude flows. Two end members feature in the silt-size fraction, most likely reflecting the sedimentary component delivered under normal flow conditions. 
The relative importance of these two modes appears to respond to catchment conditioning due to land-use change, which has important implications for how flood events may be recorded
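The unmixing idea can be illustrated with a simple non-negative matrix factorization of synthetic grain-size distributions. This is a loose stand-in only: the study applies the Dietze et al. end-member model, which imposes additional constraints, and the distributions below are invented.

```python
import numpy as np

def nmf_unmix(X, k, iters=500, seed=0):
    """Factor X (samples x size-bins, rows = grain-size distributions) into
    non-negative loadings W (samples x k) and end-members H (k x bins)
    using multiplicative updates. A generic sketch of unmixing, not the
    Dietze et al. algorithm."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W, H = rng.random((n, k)), rng.random((k, m))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Synthetic data: three samples mixing a fine (silt-like) and a coarse
# (sand-like) end-member in different proportions.
bins = np.linspace(0, 1, 50)
fine = np.exp(-((bins - 0.2) ** 2) / 0.005)
coarse = np.exp(-((bins - 0.7) ** 2) / 0.005)
mix = np.array([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]])
X = mix @ np.vstack([fine, coarse])

W, H = nmf_unmix(X, k=2)
# Recovered end-members should peak near the true modes (0.2 and 0.7),
# and the loadings W recover each sample's mixing proportions up to scale.
peaks = sorted(bins[np.argmax(H, axis=1)])
print(peaks)
```

In the palaeoflood application, the coarse end-member's loading down-core is the quantity read as a flood signal, while the silt end-members track normal-flow delivery.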

  18. A Study on Enhancing Data Storage Capacity and Mechanical Reliability of Solid Immersion Lens-Based Near-Field Recording System

    Science.gov (United States)

    Park, No-Cheol; Yang, Hyun-Seok; Rhim, Yoon-Cheol; Park, Young-Pil

    2008-08-01

    In this study, several technical issues in solid immersion lens (SIL)-based near-field recording (NFR) are explored, namely, enhancing storage capacity and guaranteeing the mechanical reliability of the device. For the purpose of enhancing the storage capacity of the NFR system, two optical configurations using radial polarization and dual recording layers are proposed. Through a feasibility analysis of the proposed optical configuration with radial polarization, it was determined that radially polarized illumination is not a suitable solution for achieving higher areal density. To apply the highly focusing characteristics of radially polarized light to cover-layer-protected data storage, an annular pupil filtering method was introduced. Complete field analysis of the proposed dual-layered NFR optics verified its feasibility, and the assembly of the SIL of the proposed model was successfully achieved. In addition, to improve the mechanical reliability of the SIL-based NFR system, improved near-field (NF) air-gap servo methods and air flow analysis around the lower part of the SIL have been evaluated. With improved NF gap servo methods using an error-based disturbance observer (EDOB) on a base air-gap controller, residual gap errors were markedly reduced by 26.26% while controlling the NF air-gap to 30 nm. Air flow near the head-media interface was visualized, and an undesirable effect of backward flow climbing from the bottom surface of the SIL was observed.

  19. Analysis of a Chaotic Memristor Based Oscillator

    Directory of Open Access Journals (Sweden)

    F. Setoudeh

    2014-01-01

    Full Text Available A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advanced Design System (ADS) software, implementation of the chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of the paper.
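The energy-distribution analysis can be sketched with a plain Haar DWT, computing the fraction of signal energy per decomposition level. This is an illustration of the methodology only; the memristor oscillator model itself is not reproduced here, and a slow sinusoid stands in for the analyzed waveform.

```python
import numpy as np

def haar_energy_levels(x, levels=4):
    """Relative signal energy per Haar-DWT level: one entry per detail
    level (finest first) plus the final approximation. The orthonormal
    Haar step preserves energy, so the fractions sum to one."""
    approx = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)   # approximation
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)   # detail
        energies.append(np.sum(d ** 2))
        approx = a
    energies.append(np.sum(approx ** 2))                 # final approximation
    return np.array(energies) / np.sum(energies)

# A slow oscillation concentrates its energy in the coarse levels; a
# chaotic waveform would spread energy across many levels.
t = np.arange(1024)
slow = np.sin(2 * np.pi * t / 256)
dist = haar_energy_levels(slow)
print(np.round(dist, 3))
```

Comparing such distributions for periodic versus chaotic operating regimes is the kind of signature the paper's DWT-based methodology exploits.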

  20. Analysis of a Chaotic Memristor Based Oscillator

    OpenAIRE

    F. Setoudeh; Khaki Sedigh, A.; Dousti, M

    2014-01-01

    A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and also a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advanced Design System (ADS) software, implementation of the chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of t...

  1. Independent Component Analysis and Decision Trees for ECG Holter Recording De-Noising

    OpenAIRE

    Jakub Kuzilek; Vaclav Kremen; Filip Soucek; Lenka Lhotska

    2014-01-01

    We have developed a method focusing on ECG signal de-noising using Independent Component Analysis (ICA). This approach combines JADE source separation and a binary decision tree for identification and subsequent ECG noise removal. In order to test the efficiency of this method, a comparison to standard filtering with a wavelet-based de-noising method was used. Freely available data at the Physionet medical data storage were evaluated. Evaluation criteria was root mean square error (RMSE) between origin...

  2. Using a Web-Based Database to Record and Monitor Athletic Training Students' Clinical Experiences

    Science.gov (United States)

    Brown, Kirk W.; Williams, Lisa; Janicki, Thomas

    2008-01-01

    Objective: The purpose of this article is to introduce a documentation recording system employing the Microsoft Structured Query Language (MS-SQL) database used by the Athletic Training Education Program (ATEP) for recording and monitoring of athletic training student (ATS) clinical experiences and hours. Background: Monitoring ATSs clinical…

  3. An integrable, web-based solution for easy assessment of video-recorded performances

    DEFF Research Database (Denmark)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars

    2014-01-01

    Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location and the exact time. Recording the performance can overcome these problems; howe......-recorded performances (ISEA)....

  4. Predictive value of casual ECG-based resting heart rate compared with resting heart rate obtained from Holter recording

    DEFF Research Database (Denmark)

    Carlson, Nicholas; Dixen, Ulrik; Marott, Jacob L;

    2014-01-01

    BACKGROUND: Elevated resting heart rate (RHR) is associated with cardiovascular mortality and morbidity. Assessment of heart rate (HR) from Holter recording may afford a more precise estimate of the effect of RHR on cardiovascular risk, as compared to casual RHR. Comparative analysis was carried ...

  5. Multimedia consultation session recording and playback using Java-based browser in global PACS

    Science.gov (United States)

    Martinez, Ralph; Shah, Pinkesh J.; Yu, Yuan-Pin

    1998-07-01

    The current version of the Global PACS software system uses a Java-based implementation of the Remote Consultation and Diagnosis (RCD) system. The Java RCD includes a multimedia consultation session between physicians that includes text, static image, image annotation, and audio data. The Java RCD allows 2-4 physicians to collaborate on a patient case. It allows physicians to join the session via WWW Java-enabled browsers or a stand-alone RCD application. The RCD system includes a distributed database archive system for archiving and retrieving patient and session data. The RCD system can be used for store-and-forward scenarios, case reviews, and interactive RCD multimedia sessions. The RCD system operates over the Internet, telephone lines, or in a private Intranet. A multimedia consultation session can be recorded, and then played back at a later time for review, comments, and education. A session can be played back using Java-enabled WWW browsers on any operating system platform. The Java RCD system shows that a case diagnosis can be captured digitally and played back with the original real-time temporal relationships between data streams. In this paper, we describe the design and implementation of the RCD session playback.

  6. Implications of the Java language on computer-based patient records.

    Science.gov (United States)

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for a robust application which is required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications to the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs. PMID:8947762

  7. A tutorial on activity-based costing of electronic health records.

    Science.gov (United States)

    Federowicz, Marie H; Grossman, Mila N; Hayes, Bryant J; Riggs, Joseph

    2010-01-01

    As the American Recovery and Reinvestment Act of 2009 allocates $19 billion to health information technology, it will be useful for health care managers to project the true cost of implementing an electronic health record (EHR). This study presents a step-by-step guide for using activity-based costing (ABC) to estimate the cost of an EHR. ABC is a cost accounting method with a "top-down" approach for estimating the cost of a project or service within an organization. The total cost to implement an EHR includes obvious costs, such as licensing fees, and hidden costs, such as impact on productivity. Unlike other methods, ABC includes all of the organization's expenditures and is less likely to miss hidden costs. Although ABC is used considerably in manufacturing and other industries, it is a relatively new phenomenon in health care. ABC is a comprehensive approach that the health care field can use to analyze the cost-effectiveness of implementing EHRs. In this article, ABC is applied to a health clinic that recently implemented an EHR, and the clinic is found to be more productive after EHR implementation. This methodology can help health care administrators assess the impact of a stimulus investment on organizational performance. PMID:20042937
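
As a toy illustration of the ABC mechanics described above, the roll-up is just "cost-driver rate x driver volume, summed over activities". All activity names, rates, and volumes below are invented for the example; they are not data from the study:

```python
# Hypothetical activity-based costing (ABC) roll-up for an EHR project.
activities = {
    # activity: (cost per driver unit in $, driver units consumed)
    "software licensing": (250.0, 40),      # per seat, seats
    "staff training":     (35.0, 300),      # per hour, hours
    "data migration":     (0.02, 500_000),  # per record, records
    "lost productivity":  (60.0, 120),      # per hour, hours  <- a "hidden" cost
}

def total_cost(acts):
    """ABC roll-up: sum of rate * driver volume over all activities."""
    return sum(rate * units for rate, units in acts.values())

project_cost = total_cost(activities)
```

The "hidden cost" rows are the point of ABC: because every organizational expenditure gets an activity and a driver, items like lost productivity are counted rather than missed.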

  8. Statistical analysis in dBASE-compatible databases.

    Science.gov (United States)

    Hauer-Jensen, M

    1991-01-01

    Database management in clinical and experimental research often requires statistical analysis of the data in addition to the usual functions for storing, organizing, manipulating and reporting. With most database systems, transfer of data to a dedicated statistics package is a relatively simple task. However, many statistics programs lack the powerful features found in database management software. dBASE IV and compatible programs are currently among the most widely used database management programs. d4STAT is a utility program for dBASE, containing a collection of statistical functions and tests for data stored in the dBASE file format. By using d4STAT, statistical calculations may be performed directly on the data stored in the database without having to exit dBASE IV or export data. Record selection and variable transformations are performed in memory, thus obviating the need for creating new variables or data files. The current version of the program contains routines for descriptive statistics, paired and unpaired t-tests, correlation, linear regression, frequency tables, Mann-Whitney U-test, Wilcoxon signed rank test, a time-saving procedure for counting observations according to user specified selection criteria, survival analysis (product limit estimate analysis, log-rank test, and graphics), and normal t and chi-squared distribution functions. PMID:2004275

  9. Effects of different analysis techniques and recording duty cycles on passive acoustic monitoring of killer whales.

    Science.gov (United States)

    Riera, Amalis; Ford, John K; Ross Chapman, N

    2013-09-01

    Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time analysis and long-term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long-term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower-resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency. PMID:23968036
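
The duty-cycle effect reported above has a simple first-order explanation: a brief call is captured only if it falls inside an "on" window. A toy Monte Carlo sketch (window lengths invented, not the study's recorder settings) shows the expected detection fraction tracking the duty cycle:

```python
import random

def detection_fraction(on_minutes, cycle_minutes, n_calls=100_000, seed=42):
    """Fraction of instantaneous calls landing inside the recorder's on-window."""
    rng = random.Random(seed)
    # A call at a uniformly random phase of the cycle is caught iff that
    # phase falls within the first on_minutes of the cycle.
    hits = sum(rng.uniform(0.0, cycle_minutes) < on_minutes for _ in range(n_calls))
    return hits / n_calls

frac = detection_fraction(on_minutes=10, cycle_minutes=30)  # a 1/3 duty cycle
```

For calls much shorter than the on-window, the fraction approaches the duty cycle itself (about one third here); longer calls that straddle window edges fare somewhat better, which is one reason the paper weighs duty cycle against processing effort rather than treating the loss as fixed.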

  10. An ecometric analysis of the fossil mammal record of the Turkana Basin

    Science.gov (United States)

    Žliobaitė, Indrė; Kaya, Ferhat; Bibi, Faysal; Bobe, René; Leakey, Louise; Leakey, Meave; Patterson, David; Rannikko, Janina; Werdelin, Lars

    2016-01-01

    Although ecometric methods have been used to analyse fossil mammal faunas and environments of Eurasia and North America, such methods have not yet been applied to the rich fossil mammal record of eastern Africa. Here we report results from analysis of a combined dataset spanning east and west Turkana from Kenya between 7 and 1 million years ago (Ma). We provide temporally and spatially resolved estimates of temperature and precipitation, discuss their relationship to patterns of faunal change, and propose a new hypothesis to explain the lack of a temperature trend. We suggest that the regionally arid Turkana Basin may, between 4 and 2 Ma, have acted as a ‘species factory’, generating ecological adaptations in advance of the global trend. We show a persistent difference between the eastern and western sides of the Turkana Basin and suggest that the wetlands of the shallow eastern side could have provided additional humidity to the terrestrial ecosystems. Pending further research, a transient episode of faunal change centred at the time of the KBS Member (1.87–1.53 Ma) may be attributed equally plausibly to climate change or to a top-down ecological cascade initiated by the entry of technologically sophisticated humans. This article is part of the themed issue ‘Major transitions in human evolution’. PMID:27298463

  11. CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients

    Science.gov (United States)

    Fritsche, L; Lindemann, G; Schroeter, K; Schlaefer, A; Neumayer, H-H

    1999-01-01

    Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only a few working EPR systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields with patients who are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential benefits of an EPR. The aim of our project was to investigate the feasibility of an EPR based solely on "off-the-shelf" software and Internet technology in the field of organ transplantation. Methods The EPR-system consists of three main elements: data-storage facilities, a Web-server and a user-interface. Data are stored either in a relational database (Sybase Adaptive 11.5, Sybase Inc., CA) or, in the case of pictures (JPEG) and files in application formats (e.g. Word documents), on a Windows NT 4.0 Server (Microsoft Corp., WA). The entire communication of all data is handled by a Web-server (IIS 4.0, Microsoft) with an Active Server Pages extension. The database is accessed by ActiveX Data Objects via the ODBC-interface. The only software required on the user's computer is Internet Explorer 4.01 (Microsoft); during the first use of the EPR, the ActiveX HTML Layout Control is automatically added. The user can access the EPR via Local or Wide Area Network or by dial-up connection. If the EPR is accessed from outside the firewall, all communication is encrypted (SSL 3.0, Netscape Comm. Corp., CA). The speed of the EPR-system was tested with 50 repeated measurements of the duration of two key-functions: 1) Display of all lab results for a given day and patient and 2) automatic composition of a letter containing diagnoses, medication, notes and lab results.
For the test a 233 MHz Pentium II Processor with 10 Mbit/s Ethernet connection (ping-time below 10 ms) over 2 hubs to the server

  12. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    B Yegnanarayana; Suryakanth V Gangashetty

    2011-10-01

    Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.

  13. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Nielsen, Mads; Lo, Pechin Chien Pau;

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on...... subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density. The...

  14. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information: http://homer.ou.nl/lsa-workshop0

  15. Continuous Recording and Interobserver Agreement Algorithms Reported in the "Journal of Applied Behavior Analysis" (1995-2005)

    Science.gov (United States)

    Mudford, Oliver C.; Taylor, Sarah Ann; Martin, Neil T.

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the "Journal of Applied Behavior Analysis (JABA)": Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement,…

  16. Records analysis of Altai earthquake dated September 27, 2003 obtained at seismic and infrasound stations of NNC RK

    International Nuclear Information System (INIS)

    The report contains testing results of the progressive multi-channel correlation (PMCC) method, developed by specialists from the Commissariat for Atomic Energy, France, for infrasound and seismic data analysis; parameterization of the earthquake source on the basis of the obtained results; and the use of records of this unique earthquake as an alternative method for calibrating the infrasound station chain. (author)

  17. Quantitative analysis of single muscle fibre action potentials recorded at known distances

    NARCIS (Netherlands)

    Albers, B.A.; Put, J.H.M.; Wallinga, W.; Wirtz, P.

    1989-01-01

    In vivo records of single fibre action potentials (SFAPs) have always been obtained at unknown distance from the active muscle fibre. A new experimental method has been developed enabling the derivation of the recording distance in animal experiments. A single fibre is stimulated with an intracellu

  18. DETECTION OF QRS COMPLEXES OF ECG RECORDING BASED ON WAVELET TRANSFORM USING MATLAB

    Directory of Open Access Journals (Sweden)

    Ruchita Gautam

    2010-07-01

    Full Text Available The electrocardiogram (ECG) is an important tool for finding out more information about the heart. The main tasks in ECG signal analysis are the detection of the QRS complex (i.e. the R wave), and the estimation of instantaneous heart rate by measuring the time interval between two consecutive R-waves. After recognizing the R wave, other components like P, Q, S and T can be detected by using a window method. In this paper, we describe a QRS complex detector based on the dyadic wavelet transform (DyWT), which is robust to time-varying QRS complex morphology and to noise. We illustrate the performance of the DyWT-based QRS detector by considering problematic ECG signals from the Common Standard for Electrocardiography (CSE) database. We also compare and analyze its performance against some of the QRS detectors developed in the past.
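
A full DyWT detector is beyond the scope of an abstract, but the processing chain it implies (emphasize the steep QRS slopes, threshold, enforce a refractory period) can be sketched as follows. This is a simplified derivative-based stand-in for illustration, not the authors' detector:

```python
def detect_r_peaks(ecg, fs, threshold_ratio=0.5, refractory=0.25):
    """Return sample indices of R peaks.

    Squared first difference emphasizes steep QRS slopes while suppressing
    the slower P and T waves; a refractory period (s) rejects double counts.
    """
    slope = [0.0] + [(ecg[i] - ecg[i - 1]) ** 2 for i in range(1, len(ecg))]
    thr = threshold_ratio * max(slope)
    peaks, last = [], None
    for i, s in enumerate(slope):
        if s >= thr and (last is None or (i - last) / fs > refractory):
            peaks.append(i)
            last = i
    return peaks
```

On clean signals a slope-squared threshold already isolates R waves; the dyadic wavelet transform in the paper plays the same role more robustly, across scales and in the presence of noise and morphology changes.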

  19. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Directory of Open Access Journals (Sweden)

    Catalina Pimiento

    Full Text Available Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the composition and function of modern gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems.

  20. When did Carcharocles megalodon become extinct? A new analysis of the fossil record.

    Science.gov (United States)

    Pimiento, Catalina; Clements, Christopher F

    2014-01-01

    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding the effects of such an event in ancient communities. However, the time of extinction of this important species has never been quantitatively assessed. Here, we synthesize the most recent records of C. megalodon from the literature and scientific collections and infer the date of its extinction by making a novel use of the Optimal Linear Estimation (OLE) model. Our results suggest that C. megalodon went extinct around 2.6 Ma. Furthermore, when contrasting our results with known ecological and macroevolutionary trends in marine mammals, it became evident that the composition and function of modern gigantic filter-feeding whales was established after the extinction of C. megalodon. Consequently, the study of the time of extinction of C. megalodon provides the basis to improve our understanding of the responses of marine species to the removal of apex predators, presenting a deep-time perspective for the conservation of modern ecosystems. PMID:25338197

  1. An Open Architecture Scaleable Maintainable Software Defined Commodity Based Data Recorder And Correlator Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project addresses the need for higher data rate recording capability, increased correlation speed and flexibility needed for next generation VLBI systems. The...

  2. Version based spatial record management techniques for spatial database management system

    Institute of Scientific and Technical Information of China (English)

    KIM Ho-seok; KIM Hee-taek; KIM Myung-keun; BAE Hae-young

    2004-01-01

    The search operation on spatial data was the principal operation in existing spatial database management systems, but update operations on spatial data, such as tracking, have recently been occurring frequently in spatial database management systems, so the necessity of improving concurrency among transactions is increasing. In general database management systems, many techniques have been studied to solve the concurrency problem of transactions. Among them, multi-version algorithms minimize interference among transactions. However, applying an existing multi-version algorithm to a spatial database management system to improve transaction concurrency wastes storage, because the entire version of a spatial record must be stored even if only the aspatial data of the record has changed. This paper proposes record management techniques that manage aspatial data versions and spatial data versions separately, to decrease the storage wasted on record versions and to improve concurrency among transactions.

  3. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud-based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with a general architecture. In this paper we present a report on a methodical analysis of cloud-based development problems published by various researchers in major computer science and software engineering journals and conferences. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems they address. The majority of the research papers focused on quality (24 papers) associated with cloud-based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and their gaps, untouched areas of cloud-based development can be discovered for future research works.

  4. Polyphase Order Analysis Based on Convolutional Approach

    Directory of Open Access Journals (Sweden)

    M. Drutarovsky

    1999-06-01

    Full Text Available The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational speed. The paper presents an optimised algorithm for polyphase order analysis based on a non-power-of-two DFT, efficiently implemented by the chirp FFT algorithm. The proposed algorithm decreases the complexity of the digital resampling algorithm, which is the most complex part of the complete spectral order algorithm.
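
The heart of order analysis is resampling the vibration signal at uniform shaft-angle increments, so that each order lands in a fixed spectral bin regardless of speed. A minimal sketch of that step, using straightforward interpolation rather than the paper's optimized polyphase/chirp-FFT implementation:

```python
import numpy as np

def angular_resample(t, x, angle, samples_per_rev=64):
    """Resample x(t) at uniform shaft-angle steps; angle is in revolutions."""
    revs = angle[-1] - angle[0]
    n = int(revs * samples_per_rev)
    uniform_angle = angle[0] + np.arange(n) / samples_per_rev
    t_uniform = np.interp(uniform_angle, angle, t)  # invert angle(t)
    return np.interp(t_uniform, t, x)               # signal at those times

# Run-up example: speed ramps from 5 to 15 rev/s, so a fixed-rate spectrum
# would smear, but a pure 2nd-order component stays in one order bin.
fs = 1000.0
t = np.arange(0.0, 4.0, 1.0 / fs)
angle = 5.0 * t + 1.25 * t ** 2            # integral of speed 5 + 2.5 t (rev)
x = np.sin(2.0 * np.pi * 2.0 * angle)      # pure 2nd-order vibration
xa = angular_resample(t, x, angle)
spectrum = np.abs(np.fft.rfft(xa)) / len(xa)
dominant_order = np.argmax(spectrum) / (len(xa) / 64)
```

Despite the speed tripling during the record, the dominant component of the angle-domain spectrum sits at order 2, which is exactly the smearing suppression the abstract describes.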

  5. 13 CFR 106.403 - Who has authority to approve and sign a Non-Fee Based Record?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who has authority to approve and...-Sponsored Activities § 106.403 Who has authority to approve and sign a Non-Fee Based Record? The appropriate Responsible Program Official, after consultation with the designated legal counsel, has authority to...

  6. Polarization properties of four-wave interaction in dynamic recording material based on bacteriorhodopsin

    Science.gov (United States)

    Korchemskaya, Ellen Y.; Soskin, Marat S.

    1994-10-01

    The polarization properties of four-wave interaction in polymer films with bacteriorhodopsin, which possess anisotropically saturating nonlinearity, are studied both theoretically and experimentally. The amplitude and the polarization of the diffracted wave for a recording material with anisotropically saturating nonlinearity are calculated. The low saturation intensity allows polarization operations on low-intensity signals to be realized. It is shown that control of the diffracted wave polarization is possible only by varying the recording light intensity.

  7. Security Analysis of Discrete Logarithm Based Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    WANG Yuzhu; LIAO Xiaofeng

    2006-01-01

    Discrete logarithm based cryptosystems have subtle problems that make the schemes vulnerable. This paper gives a comprehensive listing of security issues in these systems and analyzes three classes of attacks, based respectively on the mathematical structure of the group used in the schemes, on disclosed subgroup information, and on implementation details. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.
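
As a concrete instance of the first attack class (exploiting the mathematical structure of the group), the baby-step giant-step algorithm recovers a discrete logarithm in roughly sqrt(n) group operations, which is why the (sub)group order must be far too large to enumerate. A toy sketch with deliberately small parameters:

```python
import math

def bsgs(g, h, p, n):
    """Solve g**x = h (mod p) for x in [0, n); n is a bound on the order of g."""
    m = math.isqrt(n) + 1
    baby = {pow(g, j, p): j for j in range(m)}  # baby steps: g^j
    giant = pow(g, -m, p)                       # g^(-m) via modular inverse
    gamma = h
    for i in range(m):                          # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * giant) % p
    return None

# Toy run: recover an exponent equivalent to 777 in the group mod 1019.
p, g = 1019, 2
h = pow(g, 777, p)
x = bsgs(g, h, p, p - 1)  # satisfies pow(g, x, p) == h
```

The time/memory cost is O(sqrt(n)) for both tables and steps; a 160-bit subgroup order already pushes this to about 2^80 operations, which is the design margin the analyzed schemes rely on.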

  8. Social Network Analysis Based on Network Motifs

    OpenAIRE

    Xu Hong-lin; Yan Han-bing; Gao Cui-fang; Zhu Ping

    2014-01-01

    Based on community structure characteristics and the theory and methods of frequent subgraph mining, network motif finding is first introduced into social network analysis; a tendentiousness evaluation function and an importance evaluation function are proposed for effectiveness assessment. Compared with the traditional approach based on node centrality degree, the new approach can be used to analyze the properties of a social network more fully and to judge the roles of the nodes effectively. I...

  9. Swarm Intelligence Based Algorithms: A Critical Analysis

    OpenAIRE

    Yang, Xin-She

    2014-01-01

    Many optimization algorithms have been developed by drawing inspiration from swarm intelligence (SI). These SI-based algorithms can have some advantages over traditional algorithms. In this paper, we carry out a critical analysis of these SI-based algorithms by analyzing their ways to mimic evolutionary operators. We also analyze the ways of achieving exploration and exploitation in algorithms by using mutation, crossover and selection. In addition, we also look at algorithms using dynamic sy...

  10. 3D Reconstruction of Human Laryngeal Dynamics Based on Endoscopic High-Speed Recordings.

    Science.gov (United States)

    Semmler, Marion; Kniesburges, Stefan; Birk, Veronika; Ziethe, Anke; Patel, Rita; Dollinger, Michael

    2016-07-01

    Standard laryngoscopic imaging techniques provide only limited two-dimensional insights into the vocal fold vibrations, as they do not take the vertical component into account. However, previous experiments have shown a significant vertical component in the vibration of the vocal folds. We present a 3D reconstruction of the entire superior vocal fold surface from 2D high-speed videoendoscopy via stereo triangulation. In a typical camera-laser set-up the structured laser light pattern is projected on the vocal folds and captured at 4000 fps. The measuring device is suitable for in vivo application since the external dimensions of the miniaturized set-up barely exceed the size of a standard rigid laryngoscope. We provide a conservative estimate of the resulting resolution based on the hardware components and point out the possibilities and limitations of the miniaturized camera-laser set-up. In addition to the 3D vocal fold surface, we extended previous approaches with a G2-continuous model of the vocal fold edge. The clinical applicability was successfully established by the reconstruction of visual data acquired from 2D in vivo high-speed recordings of a female and a male subject. We present extracted dynamic parameters like maximum amplitude and velocity in the vertical direction. The additional vertical component reveals deeper insights into the vibratory dynamics of the vocal folds by means of a non-invasive method. The successful miniaturization allows for in vivo application, giving access to the most realistic model available, and hence enables a comprehensive understanding of the human phonation process. PMID:26829782

  11. Relating low-flow characteristics to the base flow recession time constant at partial record stream gauges

    Science.gov (United States)

    Eng, K.; Milly, P.C.D.

    2007-01-01

    Base flow recession information is helpful for regional estimation of low-flow characteristics. However, analyses that exploit such information generally require a continuous record of streamflow at the estimation site to characterize base flow recession. Here we propose a simple method for characterizing base flow recession at low-flow partial record stream gauges (i.e., sites with very few streamflow measurements under low-streamflow conditions), and we use that characterization as the basis for a practical new approach to low-flow regression. In a case study the introduction of a base flow recession time constant, estimated from a single pair of strategically timed streamflow measurements, approximately halves the root-mean-square estimation error relative to that of a conventional drainage area regression. Additional streamflow measurements can be used to reduce the error further.
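
The "single pair of strategically timed streamflow measurements" maps onto the standard linear-reservoir recession model Q(t) = Q0 * exp(-t / tau), from which the time constant follows directly. The numbers below are illustrative, not the paper's case-study data:

```python
import math

def recession_time_constant(q1, q2, dt_days):
    """tau (days) from two discharges on one recession, measured dt_days apart.

    From Q2 = Q1 * exp(-dt / tau), solving for tau gives dt / ln(Q1 / Q2).
    """
    if not q1 > q2 > 0:
        raise ValueError("expects two positive, declining base flow values")
    return dt_days / math.log(q1 / q2)

tau = recession_time_constant(q1=12.0, q2=4.4, dt_days=10.0)
```

With discharges of 12.0 and 4.4 (any consistent units) measured ten days apart on the same recession, tau comes out near ten days; that single number is then available as an extra explanatory variable in the low-flow regression, which is the role it plays in the study.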

  12. Independent component analysis and decision trees for ECG holter recording de-noising.

    Directory of Open Access Journals (Sweden)

    Jakub Kuzilek

    Full Text Available We have developed a method for ECG signal de-noising using independent component analysis (ICA). This approach combines JADE source separation with a binary decision tree for identification and subsequent removal of ECG noise. To test the efficiency of this method, it was compared with a standard wavelet-based de-noising filter. Data freely available from the PhysioNet medical data storage were evaluated. The evaluation criterion was the root mean square error (RMSE) between the original ECG and the filtered data contaminated with artificial noise. The proposed algorithm achieved comparable results for standard noises (power line interference, baseline wander, EMG), but significantly better results for an uncommon noise (electrode cable movement artefact).

  13. Managing Everyday Life: A Qualitative Study of Patients’ Experiences of a Web-Based Ulcer Record for Home-Based Treatment

    Science.gov (United States)

    Trondsen, Marianne V.

    2014-01-01

    Chronic skin ulcers are a significant challenge for patients and health service resources, and ulcer treatment often requires the competence of a specialist. Although e-health interventions are increasingly valued for ulcer care by giving access to specialists at a distance, there is limited research on patients’ use of e-health services for home-based ulcer treatment. This article reports an exploratory qualitative study of the first Norwegian web-based counselling service for home-based ulcer treatment, established in 2011 by the University Hospital of North Norway (UNN). Community nurses, general practitioners (GPs) and patients are offered access to a web-based record system to optimize ulcer care. The web-based ulcer record enables the exchange and storage of digital photos and clinical information, through which an ulcer team at UNN, consisting of specialized nurses and dermatologists, is accessible within 24 h. This article explores patients’ experiences of using the web-based record for their home-based ulcer treatment without assistance from community nurses. Semi-structured interviews were conducted with a total of four patients who had used the record. The main outcomes identified were: autonomy and flexibility; safety and trust; involvement and control; and motivation and hope. These aspects improved the patients’ everyday life during long-term ulcer care and can be understood as stimulating patient empowerment.

  14. Development of an Electronic Claim System Based on an Integrated Electronic Health Record Platform to Guarantee Interoperability

    OpenAIRE

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-01-01

    Objectives We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. Methods The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medic...

  15. A robust physiology-based source separation method for QRS detection in low amplitude fetal ECG recordings

    International Nuclear Information System (INIS)

    The use of the non-invasively obtained fetal electrocardiogram (ECG) in fetal monitoring is complicated by the low signal-to-noise ratio (SNR) of ECG signals. Even after removal of the predominant interference (i.e. the maternal ECG), the SNR is generally too low for medical diagnostics, and hence additional signal processing is still required. To this end, several methods for exploiting the spatial correlation of multi-channel fetal ECG recordings from the maternal abdomen have been proposed in the literature, of which principal component analysis (PCA) and independent component analysis (ICA) are the most prominent. Both PCA and ICA, however, suffer from the drawback that they are blind source separation (BSS) techniques and as such suboptimal, in that they do not exploit a priori knowledge of the abdominal electrode configuration and fetal heart activity. In this paper we propose a source separation technique that is based on the physiology of the fetal heart and on knowledge of the electrode configuration. This technique operates by calculating the spatial fetal vectorcardiogram (VCG) and approximating the VCG for several overlaid heartbeats by an ellipse. By subsequently projecting the VCG onto the long axis of this ellipse, a source signal of the fetal ECG can be obtained. To evaluate the developed technique, its performance is compared to that of both PCA and ICA and of augmented versions of these techniques (aPCA and aICA; PCA and ICA applied to preprocessed signals) in generating a fetal ECG source signal with enhanced SNR that can be used to detect fetal QRS complexes. The evaluation shows that the developed source separation technique performs slightly better than aPCA and aICA and outperforms PCA and ICA; its main advantage over aPCA/PCA and aICA/ICA is that it performs more robustly. This advantage renders it favorable for employment in automated, real-time fetal monitoring applications
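The ellipse-projection step can be sketched with a simple stand-in: taking the ellipse's long axis as the direction of maximum variance of the 2D VCG samples (a PCA-style approximation, not the authors' exact fitting procedure; data and names are illustrative):

```python
import math

def long_axis_projection(points):
    """Project 2D VCG samples onto the long (principal) axis of the
    point cloud. The direction of maximum variance approximates the
    long axis of an ellipse fitted to the overlaid heartbeats."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # orientation of major axis
    ux, uy = math.cos(theta), math.sin(theta)
    return [(p[0] - mx) * ux + (p[1] - my) * uy for p in points]

# Samples along the line y = 2x project in order of their position on it
sig = long_axis_projection([(0, 0), (1, 2), (2, 4), (3, 6)])
```

The resulting 1D source signal concentrates the fetal heart's dominant electrical axis, which is what makes QRS detection easier.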

  16. Abstraction based Analysis and Arbiter Synthesis

    DEFF Research Database (Denmark)

    Ernits, Juhan-Peep; Yi, Wang

    2004-01-01

    The work focuses on the analysis of an example of synchronous systems containing FIFO buffers, registers and memory interconnected by several private and shared busses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project....

  17. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon;

    2014-01-01

    with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...

  18. Modeling Mass Spectrometry Based Protein Analysis

    OpenAIRE

    Eriksson, Jan; Fenyö, David

    2011-01-01

    The success of mass spectrometry based proteomics depends on efficient methods for data analysis. These methods require a detailed understanding of the information value of the data. Here, we describe how the information value can be elucidated by performing simulations using synthetic data.

  19. An Ensemble Learning Based Framework for Traditional Chinese Medicine Data Analysis with ICD-10 Labels

    OpenAIRE

    Gang Zhang; Yonghui Huang; Ling Zhong; Shanxing Ou; Yi Zhang; Ziping Li

    2015-01-01

    Objective. This study aims to establish a model to analyze the clinical experience of veteran TCM doctors. We propose an ensemble learning based framework to analyze clinical records with ICD-10 label information for effective diagnosis and acupoint recommendation. Methods. We propose an ensemble learning framework for the analysis task. A set of base learners composed of decision tree (DT) and support vector machine (SVM) classifiers are trained by bootstrapping the training dataset. The base learners are...
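The bootstrapped ensemble can be illustrated with a deliberately small pure-Python sketch; decision stumps stand in for the paper's DT and SVM base learners, and all data and names are illustrative:

```python
import random

def train_stump(data):
    """Fit a one-feature threshold classifier (decision stump) by
    exhaustive search over features, split points, and split direction."""
    best = None
    n_feat = len(data[0][0])
    for f in range(n_feat):
        for x, _ in data:
            t = x[f]
            for sign in (1, -1):
                correct = sum(1 for xi, yi in data
                              if (1 if sign * (xi[f] - t) > 0 else -1) == yi)
                if best is None or correct > best[0]:
                    best = (correct, f, t, sign)
    _, f, t, sign = best
    return lambda x: 1 if sign * (x[f] - t) > 0 else -1

def bagging(data, n_learners=11, seed=0):
    """Bootstrap the training set and train one stump per replicate;
    prediction is a majority vote over the ensemble."""
    rng = random.Random(seed)
    learners = [train_stump([rng.choice(data) for _ in data])
                for _ in range(n_learners)]
    return lambda x: 1 if sum(h(x) for h in learners) > 0 else -1

# Toy 1-D data, separable at x = 0
data = [([-2.0], -1), ([-1.0], -1), ([1.0], 1), ([2.0], 1)]
predict = bagging(data)
```

Majority voting over learners trained on different bootstrap replicates is what gives bagging its variance reduction; the paper's framework applies the same idea with stronger base learners.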

  20. Volume holographic recording in photopolymerizable nanocomposite materials based on radical-mediated thiol-yne step-growth polymerizations

    Science.gov (United States)

    Mitsube, Ken; Nishimura, Yuki; Takayama, Shingo; Nagaya, Kohta; Tomita, Yasuo

    2013-05-01

    We propose the use of radical-mediated thiol-yne step-growth photopolymerizations for volume holographic recording in NPC films, to overcome the drawback of low crosslinking densities while retaining the advantage of low shrinkage of the thiol-ene photopolymerizations. The thiol-yne photopolymerization mechanism differs from the thiol-ene photopolymerizations in that each alkyne functional group can react consecutively with two thiol functional groups. We show that thiol-yne based NPC films dispersed with silica nanoparticles give a saturated refractive index change as large as 0.008 and a material recording sensitivity as high as 2005 cm/J at a wavelength of 532 nm, larger than the minimum acceptable values of 0.005 and 500 cm/J, respectively, for holographic data storage. We also show that the shrinkage of a recorded hologram can be as low as that of thiol-ene based NPC films and that the thermal stability is further improved. In addition, we demonstrate digital data page recording in thiol-yne based NPC films, showing a low symbol error rate of 2.8 × 10⁻⁴ and a high signal-to-noise ratio of 8.

  1. Analysis of laser turbulence utilizing a video tape recorder and digital storage oscilloscope.

    OpenAIRE

    Connor, John Henry

    1982-01-01

    Approved for public release; distribution unlimited The ability to measure and predict atmospheric turbulence affecting laser beam propagation is a major concern when considering military applications. Such a method using a telescope, high resolution television camera, video tape recorder, digital storage oscilloscope, and calculator system has been devised, tested and utilized. A laser beam signal is recorded on video tape for further processing. This signal is displayed...

  2. Literature based drug interaction prediction with clinical assessment using electronic medical records: novel myopathy associated drug interactions.

    Directory of Open Access Journals (Sweden)

    Jon D Duke

    Full Text Available Drug-drug interactions (DDIs) are a common cause of adverse drug events. In this paper, we combined a literature discovery approach with analysis of a large electronic medical record database to predict and evaluate novel DDIs. We predicted an initial set of 13197 potential DDIs based on substrates and inhibitors of cytochrome P450 (CYP) metabolism enzymes identified from published in vitro pharmacology experiments. Using a clinical repository of over 800,000 patients, we narrowed this theoretical set of DDIs to 3670 drug pairs actually taken by patients. Finally, we sought to identify novel combinations that synergistically increased the risk of myopathy. Five pairs were identified with p-values less than 1 × 10⁻⁶: loratadine and simvastatin (relative risk, RR = 1.69), loratadine and alprazolam (RR = 1.86), loratadine and duloxetine (RR = 1.94), loratadine and ropinirole (RR = 3.21), and promethazine and tegaserod (RR = 3.00). When taken together, each drug pair showed a significantly increased risk of myopathy compared to the expected additive myopathy risk from taking either of the drugs alone. Based on additional literature data on in vitro drug metabolism and inhibition potency, loratadine and simvastatin, and tegaserod and promethazine, were predicted to have strong DDIs through the CYP3A4 and CYP2D6 enzymes, respectively. This new translational biomedical informatics approach supports not only detection of new clinically significant DDI signals, but also evaluation of their potential molecular mechanisms.
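The screening arithmetic rests on relative risk and on comparison against the additive expectation; a minimal sketch with invented counts (not the study's data):

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk of an outcome (e.g. myopathy) in an exposed group
    versus an unexposed group."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

def exceeds_additive(rr_pair, rr_a, rr_b):
    """Flag possible synergy when the pair's relative risk exceeds the
    additive expectation RR_a + RR_b - 1 from taking either drug alone."""
    return rr_pair > rr_a + rr_b - 1.0

# Hypothetical counts: 30 myopathy cases among 1,000 patients on a drug
# pair versus 100 cases among 10,000 patients on neither drug
rr = relative_risk(30, 1000, 100, 10000)  # -> 3.0
```

In the study, pairs passing this kind of screen were then checked against in vitro CYP inhibition data for a mechanistic explanation.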

  3. Analysis of debris-flow recordings in an instrumented basin: confirmations and new findings

    Directory of Open Access Journals (Sweden)

    M. Arattano

    2012-03-01

    Full Text Available On 24 August 2006, a debris flow took place in the Moscardo Torrent, a basin of the Eastern Italian Alps instrumented for debris-flow monitoring. The debris flow was recorded by two seismic networks located in the lower part of the basin and on the alluvial fan, respectively. The event was also recorded by a pair of ultrasonic sensors installed on the fan, close to the lower seismic network. The comparison between the different recordings outlines particular features of the August 2006 debris flow, different from those of events recorded in previous years. A typical debris-flow wave was observed at the upper seismic network, with a main front abruptly appearing in the torrent, followed by a gradual decrease of flow height. On the contrary, on the alluvial fan the wave displayed an irregular pattern, with low flow depth and the main peak occurring in the central part of the surge, both in the seismic recording and in the hydrographs. Recorded data and field evidence indicate that the surge observed on the alluvial fan was not a debris flow, and probably consisted of a water surge laden with fine to medium-sized sediment. The change in shape and characteristics of the wave can be ascribed to the attenuation of the surge caused by the torrent control works implemented in the lower basin during recent years.

  4. Analysis of historical meteor and meteor shower records: Korea, China, and Japan

    CERN Document Server

    Yang, H J; Park, M G; Yang, Hong-Jin; Park, Changbom; Park, Myeong-Gu

    2005-01-01

    We have compiled and analyzed historical Korean meteor and meteor shower records in three Korean official history books: Samguksagi, which covers the Three Kingdoms period (57 B.C. -- A.D. 935), Goryeosa of the Goryeo dynasty (A.D. 918 -- 1392), and Joseonwangjosillok of the Joseon dynasty (A.D. 1392 -- 1910). We have found 3861 meteor and 31 meteor shower records. We have confirmed the peaks of the Perseids and an excess due to a mixture of the Orionids, north-Taurids, or Leonids through a Monte-Carlo test. The peaks persist from the period of the Goryeo dynasty to that of the Joseon dynasty, for almost one thousand years. Korean records show a decrease of Perseid activity and an increase of Orionid/north-Taurid/Leonid activity. We have also analyzed the seasonal variation of sporadic meteors from Korean records. We confirm the seasonal variation of sporadic meteors from the records of the Joseon dynasty, with the maximum number of events being roughly 1.7 times the minimum. The Korean records are compared with Chinese and Japanese re...

  5. Formal definition and dating of the GSSP (Global Stratotype Section and Point) for the base of the Holocene using the Greenland NGRIP ice core, and selected auxiliary records

    DEFF Research Database (Denmark)

    Walker, Mike; Johnsen, Sigfus Johann; Rasmussen, Sune Olander;

    2009-01-01

    The Greenland ice core from NorthGRIP (NGRIP) contains a proxy climate record across the Pleistocene-Holocene boundary of unprecedented clarity and resolution. Analysis of an array of physical and chemical parameters within the ice enables the base of the Holocene, as reflected in the first signs of climatic warming at the end of the Younger Dryas/Greenland Stadial 1 cold phase, to be located with a high degree of precision. This climatic event is most clearly reflected in an abrupt shift in deuterium excess values, accompanied by more gradual changes in d18O, dust concentration, a range of chemical...

  6. Dissertation on the computer-based exploitation of a coincidence multi parametric recording. Application to the study of the disintegration scheme of Americium 241

    International Nuclear Information System (INIS)

    After having presented the meaning of a disintegration scheme (alpha and gamma emissions, internal conversion, mean lifetime), the author highlights the benefits of using a multi-parametric chain for the recording of correlated parameters, and of using a computer for the analysis of bi-parametric information based on contour lines. Using the example of Americium 241, the author shows how this information is obtained (alpha and gamma spectrometry, time measurement), and how it is selected, coded, analysed, stored, and then processed by contour lines

  7. Multi-level analysis of electronic health record adoption by health care professionals: A study protocol

    Directory of Open Access Journals (Sweden)

    Labrecque Michel

    2010-04-01

    Full Text Available Abstract Background The electronic health record (EHR) is an important application of information and communication technologies to the healthcare sector. EHR implementation is expected to produce benefits for patients, professionals, organisations, and the population as a whole. These benefits cannot be achieved without the adoption of EHR by healthcare professionals. Nevertheless, the influence of individual and organisational factors in determining EHR adoption is still unclear. This study aims to assess the unique contribution of individual and organisational factors to EHR adoption in healthcare settings, as well as possible interrelations between these factors. Methods A prospective study will be conducted. A stratified random sampling method will be used to select 50 healthcare organisations in the Quebec City Health Region (Canada). At the individual level, a sample of 15 to 30 health professionals will be chosen within each organisation depending on its size. A semi-structured questionnaire will be administered to two key informants in each organisation to collect organisational data. A composite EHR adoption score will be developed based on a Delphi process and will be used as the outcome variable. Twelve to eighteen months after the first contact, depending on the pace of EHR implementation, key informants and clinicians will be contacted once again to monitor the evolution of EHR adoption. A multilevel regression model will be applied to identify the organisational and individual determinants of EHR adoption in clinical settings. Alternative analytical models will be applied if necessary. Results The study will assess the contribution of organisational and individual factors, as well as their interactions, to the implementation of EHR in clinical settings. Conclusions These results will be very relevant for decision makers and managers who are facing the challenge of implementing EHR in the healthcare system.

  8. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multi-variate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear...... combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...... analysis of variables with different genesis and therefore different statistical distributions and different modalities. As a proof of concept we give a toy example. We also give an example with one (weather radar based) variable in the one set and eight spectral bands of optical satellite data in the...
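A plug-in MI estimate between two candidate linear combinations can be sketched as follows (equal-width binning; a rough stand-in, not the authors' estimator):

```python
import math
from collections import Counter

def mutual_information(x, y, bins=8):
    """Plug-in estimate (in nats) of the mutual information between two
    real-valued samples, using equal-width binning of each variable."""
    def digitize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0  # guard against a constant sample
        return [min(int((t - lo) / w), bins - 1) for t in v]
    bx, by = digitize(x), digitize(y)
    n = len(x)
    pxy = Counter(zip(bx, by))
    px, py = Counter(bx), Counter(by)
    return sum(c / n * math.log((c / n) / (px[i] / n * py[j] / n))
               for (i, j), c in pxy.items())

# A deterministic relationship carries more information than a constant:
x = [i / 99 for i in range(100)]
hi = mutual_information(x, x)
lo = mutual_information(x, [0.5] * 100)
```

Unlike linear correlation, this measure responds to any statistical dependence between the projections, which is what lets CIA handle variables of different genesis.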

  9. Viewpoint from Defects Analysis of Medical Records

    Institute of Scientific and Technical Information of China (English)

    林春生

    2012-01-01

    Objective To discuss new measures for improving the quality of medical records. Methods 3000 inpatient medical records from July 2010 to October 2010 were randomly selected and their quality analyzed. Results There were 2872 class A medical records (95.73%) and 128 class B medical records (4.27%), with no class C records; 5430 defects were found, of which defects in clinical basics and standardization accounted for 55.49%, defects in medical safety records for 28.71%, and defects in treatment technology and medication for 15.80%. Conclusions Physicians should be trained in the relevant medical laws and in professional knowledge so as to improve the quality of medical record writing. A real-time medical record monitoring system may be a new mode of medical record management. Rules such as those governing drug application should be further refined to ensure the quality of record content. In accordance with the Law of Tort Liability, it is also important to further standardize clinicians' diagnosis and treatment behavior.

  10. Construction of a SORCE-based Solar Spectral Irradiance (SSI) Record for Input into Chemistry Climate Models

    Science.gov (United States)

    Harder, J. W.; Fontenla, J. M.

    2015-12-01

    We present a research program to produce a solar spectral irradiance (SSI) record suitable for whole-atmosphere chemistry-climate model (CCM) transient studies over the 2001-2015 time period for Solar Cycles 23 and 24 (SC23-24). Climate simulations during this time period are particularly valuable because SC23-24 represents the best-observed solar cycle in history, both from the perspective of solar physics and in terms of Earth observation systems. This record will be based predominantly on the observed irradiance of the SORCE mission as measured by the SIM and SOLSTICE instruments from April of 2003 to the present time. The SSI data record for this proposed study requires very broad wavelength coverage (115-100000 nm), daily spectral coverage, compliance of the integrated SSI record with the TSI, and well-defined and documented uncertainty estimates. While the majority of the record will be derived from SORCE observations, extensions back to the SC23 maximum time period (early 2001) and closure of critical gaps in the SORCE record will be generated employing the Fontenla et al. (2015) Solar Radiation Physical Model (SRPMv2). Since SRPM is a physics-based model, estimates of the SSI for wavelengths outside the SORCE measurement range can be meaningfully included. This model now includes non-LTE contributions from metals in the atomic number range 22-28 (i.e. titanium through nickel) as well as important molecular photodissociation contributions from molecules such as NH, molecular hydrogen, CH, and OH, which have led to greatly improved agreement between the model and the observed 0.1 nm SOLSTICE spectrum. Thus comparative studies of the SORCE observations with SRPMv2 provide meaningful insight into the nature of solar variability, critical for subsequent Earth atmospheric modeling efforts.

  11. Diffractive Optical Elements with a Large Angle of Operation Recorded in Acrylamide Based Photopolymer on Flexible Substrates

    Directory of Open Access Journals (Sweden)

    Hoda Akbari

    2014-01-01

    Full Text Available A holographic device characterised by a large angular range of operation is under development. The aim of this study is to increase the angular working range of the diffractive lens by stacking three layers of high-efficiency optical elements on top of each other, so that light is collected (and focussed) from a broader range of angles. The angular range of each individual lens element is important, and work has already been done in an acrylamide-based photosensitive polymer to broaden the angular range of individual elements using holographic recording at a low spatial frequency. This paper reports new results on the angular selectivity of stacked diffractive lenses. A working range of 12° is achieved. The diffractive focussing elements were recorded holographically with a central spatial frequency of 300 l/mm using an exposure energy of 60 mJ/cm2 at a range of recording angles. At this spatial frequency, with layers of thickness 50 ± 5 µm, diffraction efficiencies of 80% and 50% were achieved in the single lens element and the combined device, respectively. The optical recording process and the properties of the multilayer structure are described and discussed. Holographic recording of a single lens element is also successfully demonstrated on a flexible glass substrate (Corning® Willow® Glass) for the first time.

  12. Statistical analysis of automatically detected ion density variations recorded by DEMETER and their relation to seismic activity

    Directory of Open Access Journals (Sweden)

    Michel Parrot

    2012-04-01

    Full Text Available

    Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, there are also ionospheric variations without seismic activity. The present study is devoted to a statistical analysis of the night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted over the 15 days before the event, whenever the track of the satellite orbit was less than 1,500 km from the earthquake epicenter. This first step provided, for each earthquake, the variations relative to the background in the vicinity of the epicenter over the 15 preceding days. In the second step, comparisons were carried out between the largest variations over the 15 days and the earthquake magnitudes. The statistical analysis is based on calculation of the median values as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases, where on the one hand the locations of the epicenters were randomly modified, and on the other hand the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to earthquakes than prior to random events, and that the perturbations increase with the earthquake magnitudes.


  13. Development of an apnea detection algorithm based on temporal analysis of thoracic respiratory effort signal

    Science.gov (United States)

    Dell’Aquila, C. R.; Cañadas, G. E.; Correa, L. S.; Laciar, E.

    2016-04-01

    This work describes the design of an algorithm for detecting apnea episodes based on analysis of the thoracic respiratory effort signal. Inspiration and expiration times and the amplitude range of the respiratory cycle were evaluated. For the range analysis, the standard deviation was computed over temporal windows of the respiratory signal. Performance was validated on 8 records of the Apnea-ECG database, which includes annotations of apnea episodes. The results are: sensitivity (Se) 73%, specificity (Sp) 83%. These values could be improved by eliminating artifacts from the signal records.
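The windowed standard-deviation criterion can be sketched as follows; sampling rate, window length, and threshold are illustrative values, not the paper's tuned parameters:

```python
import math

def flag_apnea(signal, fs=10, win_s=10, threshold=0.2):
    """Mark windows of a thoracic respiratory effort signal whose
    standard deviation falls below a threshold: near-flat effort over a
    whole window is the temporal signature of an apnea episode."""
    w = fs * win_s
    flags = []
    for start in range(0, len(signal) - w + 1, w):
        seg = signal[start:start + w]
        m = sum(seg) / w
        sd = math.sqrt(sum((s - m) ** 2 for s in seg) / w)
        flags.append(sd < threshold)
    return flags

# Normal breathing (unit-amplitude sine) followed by a flat apnea segment
fs = 10
breathing = [math.sin(2 * math.pi * 0.25 * t / fs) for t in range(fs * 10)]
apnea = [0.0] * (fs * 10)
flags = flag_apnea(breathing + apnea, fs=fs)
```

Movement artifacts inflate the windowed standard deviation, which is why the authors note that artifact removal would improve sensitivity and specificity.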

  14. Facies characterization based on physical properties from downhole logging for the sediment record of Lake Van, Turkey

    Science.gov (United States)

    Baumgarten, H.; Wonik, T.; Kwiecien, O.

    2014-11-01

    significant depth shifts of up to 2.5 m between the composite profile based on the VCD and the downhole measurements in hole 2D of the Ahlat Ridge, (b) a correlation was difficult to ascertain given the vertical resolution of the downhole logging data and the extremely detailed core description at mm-scale, (c) mixed signals were obtained because of prevailing thin layers and intercalations of different lithotypes, and (d) cluster analysis was difficult to perform because the contrast within the input data is too low (possibly background sedimentation) to distinguish between glacial and interglacial deposits. Tephra units are characterized by contrasting properties and differ mainly in their magnetic susceptibility, spectral gamma ray components (uranium, thorium and potassium) and XRF intensities of calcium and zirconium. Tephra units have been linked to the dominant volcanic composition of the deposited tephra layers and partly to the volcanic sources. Depth trends show prevailing basaltic deposits in the bottom part (128-210 m below lake floor), gradually outweighed by the highly differentiated (dacitic and rhyolitic/trachytic) products towards the top of the record.

  15. Study of TCM clinical records based on LSA and LDA SHTDT model

    Science.gov (United States)

    LIN, FAN; ZHANG, ZHIHONG; LIN, SHU-FU; ZENG, JIA-SONG; GAN, YAN-FANG

    2016-01-01

    Descriptions of syndromes and symptoms in traditional Chinese medicine are extremely complicated, and diagnosing a patient's syndrome more efficiently is a primary aim of clinical health care workers. In the present study, two models were presented concerning this issue. The first is the latent semantic analysis (LSA)-based semantic classification model, which is employed when the classifications and the words used to depict them have been confirmed. The second is the symptom-herb-therapies-diagnosis topic (SHTDT) model, which is employed when the classification has not been confirmed or described. The experimental results showed that this method was successful and that symptoms can be diagnosed to a certain extent. They also indicated that the topic features reflected patient characteristics and that the topic structure obtained was clinically significant. Given a patient's symptoms, the model can be used to predict the theme, diagnose the disease, and suggest appropriate drugs and treatments. Additionally, the SHTDT model predictions were not completely accurate, because the task is equivalent to multi-label prediction, whereby the drugs, treatment and diagnosis are considered as labels. In conclusion, diagnosis and the drugs and treatments administered also depend on human factors. PMID:27347051

  16. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels have been known to correlate poorly. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomic profiles. And I will present evidence that these techniques are useful in identifying proteomic-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.
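One simple form of network-based consistency checking, scoring protein complexes by how completely their members appear in a detection list, can be sketched as follows (complex memberships here are invented for illustration, not a real annotation set):

```python
def complex_coverage(complexes, detected):
    """Rank protein complexes by the fraction of their members present in
    a (noisy, incomplete) MS-detected protein list. Complexes with high
    but incomplete coverage suggest the missing members may be false
    negatives worth re-examining, a basic network-based expansion idea."""
    d = set(detected)
    return sorted(((name, len(d & members) / len(members))
                   for name, members in complexes.items()),
                  key=lambda kv: -kv[1])

# Hypothetical complexes and a detection list
complexes = {"proteasome": {"PSMA1", "PSMA2", "PSMB1", "PSMB2"},
             "ribosome": {"RPL3", "RPL4", "RPS2", "RPS3"}}
ranked = complex_coverage(complexes, ["PSMA1", "PSMA2", "PSMB1", "RPL3"])
```

Because complex members are expressed and detected together more often than random protein sets, coverage scores like this are more reproducible across runs than individual protein identifications.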

  17. A content analysis of stroke physical therapy intervention using stroke physiotherapy intervention recording tool

    Science.gov (United States)

    Cho, Hyuk-shin; Cha, Hyun-gyu

    2016-01-01

    [Purpose] Physical therapy for recovery of function in people with stroke is known to be effective, but which type of physical therapy intervention is most effective is uncertain, because concrete and detailed records of interventions are seldom kept. This study aimed to record, analyze, and describe the content of physical therapy interventions for recovery of function after stroke using the stroke physiotherapy intervention recording tool (SPIRIT). [Subjects and Methods] A convenience sample of 23 physical therapists from a rehabilitation hospital in Chung-nam recorded the interventions for 73 patients with stroke who were treated for 30 minutes in 670 treatment sessions. Treatment session contents were recorded using SPIRIT. Descriptive statistics were used to describe the interventions accurately and to investigate differences according to time since stroke. [Results] Facilitation techniques were the most frequently used interventions (n=1,342, 35.1%), followed by practice (n=1,056, 27.6%) and exercise (n=748, 19.6%) in the physical therapists’ clinical practice. [Conclusion] This pattern shows that the physical therapists focused on functional activity. Interventions organizing or teaching patient activities for independent practice (n=286, 7.5%) were used to encourage patient activity and independence outside the treatment sessions. Interventions did not differ significantly according to time since stroke. PMID:27313368

  18. Model-based estimation of the global carbon budget and its uncertainty from carbon dioxide and carbon isotope records

    International Nuclear Information System (INIS)

    A global carbon cycle model is used to reconstruct the carbon budget, balancing emissions from fossil fuel and land use with carbon uptake by the oceans and the terrestrial biosphere. We apply Bayesian statistics to estimate the uncertainty of carbon uptake by the oceans and the terrestrial biosphere based on carbon dioxide and carbon isotope records, and on prior information about model parameter probability distributions. This results in a quantitative reconstruction of the past carbon budget and its uncertainty, derived from an explicit choice of model, data-based constraints, and prior distribution of parameters. Our estimated ocean sink for the 1980s is 17±7 Gt C (90% confidence interval) and is comparable to the estimate of 20±8 Gt C given in the recent Intergovernmental Panel on Climate Change assessment [Schimel et al., 1996]. The choice of constraints is tested to determine which records have the most influence over estimates of the past carbon budget; removing any single record (e.g., the bomb-radiocarbon inventory) has little effect, since other records impose similar constraints. (c) 1999 American Geophysical Union

  19. Electronic Health Record for Shared Care Based on International Standards and Nomenclatures in Czech National Environment

    Czech Academy of Sciences Publication Activity Database

    Nagy, Miroslav; Hanzlíček, Petr; Dioszegi, Matěj; Zvárová, Jana; Přečková, Petra; Seidl, L.; Zvára, K.; Bureš, V.; Šubrt, D.

    London : The Royal Society of Medicine, 2008. RE7. [TeleMed & eHealth 2008. 24.11.2008-25.11.2008, London] R&D Projects: GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : electronic health record * communication standards * semantic interoperability Subject RIV: IN - Informatics, Computer Science

  20. 77 FR 47826 - Record of Decision for F35A Training Basing Final Environmental Impact Statement

    Science.gov (United States)

    2012-08-10

    ... ACTION: Notice of Availability (NOA) of a Record of Decision (ROD). SUMMARY: On August 1, 2012, the... relevant factors. The FEIS was made available to the public on June 15, 2012 through a NOA in the Federal... FEIS. Authority: This NOA is published pursuant to the regulations (40 CFR Part 1506.6)...

  1. Analysis of recently digitized continuous seismic data recorded during the March-May, 1980, eruption sequence at Mount St. Helens

    Science.gov (United States)

    Moran, S. C.; Malone, S. D.

    2013-12-01

    The May 18, 1980, eruption of Mount St. Helens (MSH) was a historic event, both for society and for the field of volcanology. However, our knowledge of the eruption and the precursory period leading up to it is limited by the fact that most of the data, particularly seismic recordings, were not kept due to severe limitations in the amount of digital data that could be handled and stored using 1980 computer technology. Because of these limitations, only about 900 digital event files have been available for seismic studies of the March-May seismic sequence, out of a total of more than 4,000 events that were counted using paper records. Fortunately, data from a subset of stations were also recorded continuously on a series of 24 analog 14-track IRIG magnetic tapes. We have recently digitized these tapes and time-corrected and cataloged the resultant digital data streams, enabling more in-depth studies of the (almost) complete pre-eruption seismic sequence using modern digital processing techniques. Of the fifteen seismic stations operating near MSH for at least a part of the two months between March 20 and May 18, six stations have relatively complete analog recordings. These recordings have gaps of minutes to days because of radio noise, poor tape quality, or missing tapes. In addition, several other stations have partial records. All stations had short-period vertical-component sensors with very limited dynamic range and unknown response details. Nevertheless, because the stations were at a range of distances and were operated at a range of gains, a variety of earthquake sizes were recorded on scale by at least one station, and therefore a much more complete understanding of the evolution of event types, sizes, and character should be achievable. In our preliminary analysis of this dataset we have found over 10,000 individual events as recorded on stations 35-40 km from MSH, spanning a recalculated coda-duration magnitude range of ~1.5 to 4.1.
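
    Coda-duration magnitudes of the kind recalculated for this dataset are conventionally derived from signal duration and epicentral distance. The sketch below is a minimal illustration that assumes the classic Lee et al. (1972) central California coefficients; the study's actual calibration is not given in the record.

```python
import math

def coda_duration_magnitude(duration_s, dist_km):
    """Coda-duration magnitude: Md = -0.87 + 2.00*log10(duration) + 0.0035*distance.

    Coefficients are the widely cited Lee et al. (1972) values for
    central California; they are an assumption here, not the paper's own fit.
    """
    return -0.87 + 2.00 * math.log10(duration_s) + 0.0035 * dist_km

# A hypothetical 60 s coda recorded ~37 km from the volcano:
print(round(coda_duration_magnitude(60.0, 37.0), 2))
```

    Longer codas and larger distances both raise the estimated magnitude, which is why stations at 35-40 km with a range of gains can bracket event sizes from ~1.5 to 4.1.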

  2. Ground-based assessment of the bias and long-term stability of 14 limb and occultation ozone profile data records

    Science.gov (United States)

    Hubert, Daan; Lambert, Jean-Christopher; Verhoelst, Tijl; Granville, José; Keppens, Arno; Baray, Jean-Luc; Bourassa, Adam E.; Cortesi, Ugo; Degenstein, Doug A.; Froidevaux, Lucien; Godin-Beekmann, Sophie; Hoppel, Karl W.; Johnson, Bryan J.; Kyrölä, Erkki; Leblanc, Thierry; Lichtenberg, Günter; Marchand, Marion; McElroy, C. Thomas; Murtagh, Donal; Nakane, Hideaki; Portafaix, Thierry; Querel, Richard; Russell, James M., III; Salvador, Jacobo; Smit, Herman G. J.; Stebel, Kerstin; Steinbrecht, Wolfgang; Strawbridge, Kevin B.; Stübi, René; Swart, Daan P. J.; Taha, Ghassan; Tarasick, David W.; Thompson, Anne M.; Urban, Joachim; van Gijsel, Joanna A. E.; Van Malderen, Roeland; von der Gathen, Peter; Walker, Kaley A.; Wolfram, Elian; Zawodny, Joseph M.

    2016-06-01

    Ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of 14 limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20 and 40 km the satellite ozone measurement biases are smaller than ±5 %, the short-term variabilities are less than 5-12 % and the drifts are at most ±5 % per decade (or even ±3 % per decade for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause, where natural variability and low ozone abundances impede a more precise analysis. In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10 % and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY) and significant long-term drifts (SCIAMACHY, OSIRIS, HALOE and possibly GOMOS and SMR as well). Furthermore, we reflected on the repercussions

  3. Data Mining of NASA Boeing 737 Flight Data: Frequency Analysis of In-Flight Recorded Data

    Science.gov (United States)

    Butterfield, Ansel J.

    2001-01-01

    Data recorded during flights of the NASA Trailblazer Boeing 737 have been analyzed to ascertain the presence of aircraft structural responses from various excitations such as the engine, aerodynamic effects, wind gusts, and control system operations. The NASA Trailblazer Boeing 737 was chosen as a focus of the study because of a large quantity of its flight data records. The goal of this study was to determine if any aircraft structural characteristics could be identified from flight data collected for measuring non-structural phenomena. A number of such data were examined for spatial and frequency correlation as a means of discovering hidden knowledge of the dynamic behavior of the aircraft. Data recorded from on-board dynamic sensors over a range of flight conditions showed consistently appearing frequencies. Those frequencies were attributed to aircraft structural vibrations.

  4. Scoring tail damage in pigs: an evaluation based on recordings at Swedish slaughterhouses

    Directory of Open Access Journals (Sweden)

    Keeling Linda J

    2012-05-01

    Full Text Available Abstract Background There is increasing interest in recording tail damage in pigs at slaughter to identify problem farms for advisory purposes, but also for benchmarking within and between countries as part of systematic monitoring of animal welfare. However, it is difficult to draw conclusions when comparing prevalences between studies and countries, partly due to differences in management (e.g. differences in tail docking and enrichment routines) and partly due to differences in the definition of tail damage. Methods Tail damage and tail length were recorded for 15,068 pigs slaughtered over three and four consecutive days at two slaughterhouses in Sweden. Tail damage was visually scored according to a 6-point scale, and tail length was both visually scored according to a 5-point scale and recorded in centimetres for pigs with injured or shortened tails. Results The total prevalence of injury or shortening of the tail was 7.0% and 7.2% in slaughterhouses A and B, respectively. When only considering pigs with half or less of the tail left, these percentages were 1.5% and 1.9%, which is in line with the prevalence estimated from the routine recordings at slaughter in Sweden. A higher percentage of males had injured and/or shortened tails, and males had more severely bitten tails than females. Conclusions While the current method of recording tail damage in Sweden was found to be reliable for identifying problem farms, it clearly underestimates the actual prevalence of tail damage. For monitoring and benchmarking purposes, both in Sweden and internationally, we propose that a three-graded scale including both old and new tail damage would be more appropriate. The scale consists of one class for no tail damage, one for mild tail damage (injured or shortened tail with more than half of the tail remaining) and one for severe tail damage (half or less of the tail remaining).

  5. TEST COVERAGE ANALYSIS BASED ON PROGRAM SLICING

    Institute of Scientific and Technical Information of China (English)

    Chen Zhenqiang; Xu Baowen; Guanjie

    2003-01-01

    Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage about variables, based on program slicing. By adding powers according to their importance, the users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is bigger than that obtained by a traditional measure, because the coverage about a variable takes only the related code into account.
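
    The idea of weighting variables by importance can be sketched as follows; the weighting scheme and variable names are illustrative assumptions, not the letter's actual formulation.

```python
def weighted_variable_coverage(weights, covered):
    """Coverage about variables: the share of total importance weight
    whose slice-related code was exercised by the test suite.

    weights: variable -> importance weight (hypothetical values).
    covered: set of variables whose slices were executed.
    """
    total = sum(weights.values())
    if total == 0:
        return 0.0
    return sum(w for v, w in weights.items() if v in covered) / total

# Concentrating weight on the important variable "x" raises the reported
# coverage even though only 2 of 3 variables are covered.
print(weighted_variable_coverage({"x": 3.0, "y": 1.0, "z": 1.0}, {"x", "y"}))  # 0.8
```

    With equal weights the same test suite would score 2/3; the weighted measure lets testers prioritize the variables that matter most.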

  6. Analysis of observational records of Dae-gyupyo in Joseon Dynasty

    Science.gov (United States)

    Mihn, Byeong-Hee; Lee, Ki-Won; Kim, Sang-Hyuk; Ahn, Young Sook; Lee, Yong Sam

    2012-09-01

    It is known that Dae-gyupyo (the Large Noon Gnomon) and So-gyupyo (the Small Noon Gnomon) were constructed in the reign of King Sejong (1418--1450) of the Joseon Dynasty. Gyupyo is an astronomical instrument for measuring the length of the shadow cast by a celestial body at the meridian passage time; it consists of two basic parts: a measuring scale and a vertical column. According to the Veritable Records of King Sejong and of King Myeongjong (1545--1567), the column of Dae-gyupyo was 40 Cheok (˜ 8 m) in height from the measuring scale and had a cross-bar, like the Guibiao of Shoujing Guo of the Yuan Dynasty in China. In the latter Veritable Records, three observations of the Sun on the date of the winter solstice and two of the full Moon on the first month in a luni-solar calendar are also recorded. In particular, the observational record of Dae-gyupyo for the Sun on Dec. 12, 1563 is ˜ 1 m shorter than the previous two records. To explain this, we investigated two possibilities: the vertical column was inclined, and the cross-bar was lowered. The cross-bar was attached to the column by a supporting arm; that should be installed at an angle of ˜ 36.9° to the north on the basis of a geometric structure inferred from the records of Yuanshi (History of the Yuan Dynasty). We found that it was possible that the vertical column was inclined ˜ 7.7° to the south or the supporting arm was tilted ˜ 58.3° downward. We suggest that the arm was tilted by ˜ 95° (= 36.9° + 58.3°).

  7. Quantitative analysis by renormalized entropy of invasive electroencephalograph recordings in focal epilepsy

    CERN Document Server

    Kopitzki, K; Timmer, J

    1998-01-01

    Invasive electroencephalograph (EEG) recordings of ten patients suffering from focal epilepsy were analyzed using the method of renormalized entropy. Introduced as a complexity measure for the different regimes of a dynamical system, the feature was tested here for its spatio-temporal behavior in epileptic seizures. In all patients a decrease of renormalized entropy within the ictal phase of seizure was found. Furthermore, the strength of this decrease is monotonically related to the distance of the recording location to the focus. The results suggest that the method of renormalized entropy is a useful procedure for clinical applications like seizure detection and localization of epileptic foci.
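
    As a rough illustration of the underlying idea (comparing an observed EEG power spectrum against a reference regime), a relative spectral entropy can be computed. Note this simplified Kullback-Leibler form omits the renormalization step of the actual method and uses invented spectra.

```python
import numpy as np

def relative_spectral_entropy(p, q, eps=1e-12):
    """Simplified stand-in for renormalized entropy: Kullback-Leibler
    divergence between an observed normalized power spectrum p and a
    reference (e.g. interictal) spectrum q. The paper's renormalization
    step is omitted; this is an illustrative assumption, not the exact method."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

ref = [4.0, 3.0, 2.0, 1.0]      # hypothetical reference spectrum
ictal = [1.0, 1.0, 4.0, 4.0]    # hypothetical seizure-phase spectrum
print(round(relative_spectral_entropy(ictal, ref), 3))
```

    The divergence is zero for identical spectra and grows as the observed spectrum departs from the reference, which is the qualitative behavior the seizure-detection application relies on.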

  8. Climatology Analysis of Aerosol Effect on Marine Water Cloud from Long-Term Satellite Climate Data Records

    OpenAIRE

    Xuepeng Zhao; Andrew K. Heidinger; Andi Walther

    2016-01-01

    Satellite aerosol and cloud climate data records (CDRs) have been used successfully to study the aerosol indirect effect (AIE). Data from the Advanced Very High Resolution Radiometer (AVHRR) now span more than 30 years and allow these studies to be conducted from a climatology perspective. In this paper, AVHRR data are used to study the AIE on water clouds over the global oceans. Correlation analysis between aerosol optical thickness (AOT) and cloud parameters, including cloud droplet effecti...
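
    The kind of correlation analysis described, between aerosol optical thickness (AOT) and a cloud parameter such as droplet effective radius, can be sketched with synthetic values; all numbers below are illustrative, not AVHRR data.

```python
import numpy as np

# Hypothetical gridded monthly means: aerosol optical thickness (AOT)
# and cloud droplet effective radius (CDR) in micrometres.
aot = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
cdr = np.array([14.2, 13.5, 13.1, 12.6, 12.0, 11.1])

# The first aerosol indirect effect predicts more aerosol -> smaller
# droplets, so a negative AOT-CDR correlation is the expected signature.
r = np.corrcoef(aot, cdr)[0, 1]
print(round(r, 3))
```

    In climatological studies this correlation is computed per region and season over the multi-decade record, rather than on a single short series as here.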

  9. Structure-based analysis of Web sites

    OpenAIRE

    Yen, B

    2004-01-01

    The performance of information retrieval on the Web is heavily influenced by the organization of Web pages, user navigation patterns, and guidance-related functions. Having observed the lack of measures to reflect this factor, this paper focuses on an approach based on both structure properties and navigation data to analyze and improve the performance of a Web site. Two types of indices are defined as the two major factors for analysis and improvement: "accessibility" reflects the structure property...

  10. Quantum entanglement analysis based on abstract interpretation

    OpenAIRE

    Perdrix, Simon

    2008-01-01

    Entanglement is a non local property of quantum states which has no classical counterpart and plays a decisive role in quantum information theory. Several protocols, like the teleportation, are based on quantum entangled states. Moreover, any quantum algorithm which does not create entanglement can be efficiently simulated on a classical computer. The exact role of the entanglement is nevertheless not well understood. Since an exact analysis of entanglement evolution induces an exponential sl...

  11. Particle Pollution Estimation Based on Image Analysis

    OpenAIRE

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic...

  12. Physician assessment of disease activity in JIA subtypes. Analysis of data extracted from electronic medical records

    Directory of Open Access Journals (Sweden)

    Wang Deli

    2011-04-01

    Full Text Available Abstract Objective Although electronic medical records (EMRs) have facilitated care for children with juvenile idiopathic arthritis (JIA), analyses of treatment outcomes have required paper-based or manually re-entered data. We have started EMR discrete data entry for JIA patient visits, including joint examination and global assessment, by physician and patient. In this preliminary study, we extracted data from the EMR to Xenobase™ (TransMed Systems, Inc., Cupertino, CA), an application permitting cohort analyses of the relationship between global assessment and joint examination and subtype. Methods During clinic visits, data were entered into discrete fields in ambulatory visit forms in the EMR (EpicCare™, Epic Systems, Verona, WI). Data were extracted using Clarity Reports, then de-identified and uploaded for analyses to Xenobase™. Parameters included joint examination, ILAR diagnostic classification, physician global assessment, patient global assessment, and patient pain score. Data for a single visit for each of 160 patients over a 2-month period, beginning March, 2010, were analyzed. Results In systemic JIA patients, strong correlations for physician global assessment were found with pain score, joint count and patient assessment. In contrast, physician assessment for patients with persistent oligoarticular and rheumatoid factor negative patients showed strong correlation with joint counts, but only moderate correlation with pain scores and patient global assessment. Conversely, for enthesitis patients, physician assessment correlated strongly with pain scores, and moderately with joint count and patient global assessment. Rheumatoid factor positive patients, the smallest group studied, showed moderate correlation for all three measures. Patient global assessment for systemic patients showed strong correlations with pain scores and joint count, similar to data for physician assessment. For polyarticular and enthesitis patients

  13. Determination of threshold exposure and intensity for recording holograms in thick green-sensitive acrylamide-based photopolymer.

    Science.gov (United States)

    Mahmud, Mohammad Sultan; Naydenova, Izabela; Babeva, Tzwetanka; Jallapuram, Raghavendra; Martin, Suzanne; Toal, Vincent

    2010-10-01

    For optical data storage applications, it is essential to determine the lowest intensity (also known as threshold intensity) below or at which no data page or grating can be recorded in the photosensitive material, as this in turn determines the data capacity of the material. Here, experiments were carried out to determine the threshold intensity below which the formation of a simple hologram--a holographic diffraction grating in a green-sensitized acrylamide-based photopolymer--is not possible. Two main parameters of the recording layers--dye concentration and thickness--were varied to study the influence of the density of the generated free radicals on the holographic properties of these layers. It was observed that a minimum concentration per unit volume of free radicals is required for efficient cross-linking of the created polymer chains and for recording a hologram. The threshold intensity below which no hologram can be recorded in the Erythrosin B sensitized layers with absorbance less than 0.16 was 50 μW/cm². The real-time diffraction efficiency was analyzed in the early stage of recording. It was determined that the minimum intensity required to obtain diffraction efficiency of 1% was 90 μW/cm², and the minimum required exposure was 8 mJ/cm². It was also determined that there is an optimum dye concentration of 1.5 × 10⁻⁷ mol/L for effective recording, above which no increase in the sensitivity of the layers is observed. PMID:20885463
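
    The reported minima are mutually consistent: exposure is intensity multiplied by time, so at the 90 μW/cm² threshold intensity the 8 mJ/cm² minimum exposure corresponds to roughly 89 s of recording.

```python
# Exposure (J/cm^2) = intensity (W/cm^2) x time (s), using the paper's minima.
intensity = 90e-6        # 90 uW/cm^2, minimum intensity for 1% efficiency
min_exposure = 8e-3      # 8 mJ/cm^2, minimum required exposure
t_min = min_exposure / intensity
print(round(t_min, 1))   # shortest exposure time in seconds
```

    At higher recording intensities the required exposure is reached proportionally faster, which is why the threshold intensity sets a hard floor on recording speed.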

  14. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    The letter presents an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run time, and running dynamically. Coding an analysis in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either as standard modules or are inherited from modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)

  15. Design and evaluation of area-efficient and wide-range impedance analysis circuit for multichannel high-quality brain signal recording system

    Science.gov (United States)

    Iwagami, Takuma; Tani, Takaharu; Ito, Keita; Nishino, Satoru; Harashima, Takuya; Kino, Hisashi; Kiyoyama, Koji; Tanaka, Tetsu

    2016-04-01

    To enable chronic and stable neural recording, we have been developing an implantable multichannel neural recording system with impedance analysis functions. One of the important requirements for high-quality neural signal recording is maintaining good interfaces between recording electrodes and tissues. We have proposed an impedance analysis circuit with a very small circuit area, which is implemented in a multichannel neural recording and stimulating system. In this paper, we focused on the design of the impedance analysis circuit configuration and the evaluation of a minimal voltage measurement unit. The proposed circuit has a very small circuit area of 0.23 mm2 designed with 0.18 µm CMOS technology and can measure interface impedances between recording electrodes and tissues over an ultrawide range from 100 Ω to 10 MΩ. In addition, we also successfully acquired interface impedances using the proposed circuit in agarose gel experiments.

  16. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  17. Analysis of Self-Recording in Self-Management Interventions for Stereotypy

    Science.gov (United States)

    Fritz, Jennifer N.; Iwata, Brian A.; Rolider, Natalie U.; Camp, Erin M.; Neidert, Pamela L.

    2012-01-01

    Most treatments for stereotypy involve arrangements of antecedent or consequent events that are imposed entirely by a therapist. By contrast, results of some studies suggest that self-recording, a common component of self-management interventions, might be an effective and efficient way to reduce stereotypy. Because the procedure typically has…

  18. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...
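
    A central ingredient of such an analysis is the coherence between station pairs as a function of frequency. Below is a minimal numpy-only sketch of magnitude-squared coherence on synthetic records (a shared 2 Hz wavefield plus independent station noise; not the UPSAR data, and without the lag alignment of lagged coherency).

```python
import numpy as np

def msc(x, y, nseg=8):
    """Magnitude-squared coherence via segment-averaged, Hann-windowed
    periodograms (a simple Welch-style estimate without overlap)."""
    n = len(x) // nseg
    w = np.hanning(n)
    pxx = pyy = 0.0
    pxy = 0.0 + 0.0j
    for k in range(nseg):
        fx = np.fft.rfft(x[k * n:(k + 1) * n] * w)
        fy = np.fft.rfft(y[k * n:(k + 1) * n] * w)
        pxx = pxx + np.abs(fx) ** 2
        pyy = pyy + np.abs(fy) ** 2
        pxy = pxy + fx * np.conj(fy)
    return np.abs(pxy) ** 2 / (pxx * pyy)

rng = np.random.default_rng(0)
fs = 100.0                                # assumed sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
common = np.sin(2 * np.pi * 2.0 * t)      # shared 2 Hz wavefield component
sta1 = common + 0.3 * rng.standard_normal(t.size)
sta2 = common + 0.3 * rng.standard_normal(t.size)

cxy = msc(sta1, sta2)
freqs = np.fft.rfftfreq(len(t) // 8, 1 / fs)
# Coherence is high at the shared 2 Hz component and low elsewhere.
print(cxy[np.argmin(np.abs(freqs - 2.0))] > 0.9)
```

    In array studies like this one, the decay of such coherency with station separation and frequency is what characterizes the spatial variability of the ground motion.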

  19. The Accuracy Analysis of Five-planet Movements Recorded in China in the Han Dynasty

    Science.gov (United States)

    Zhang, J.

    2010-04-01

    Observations and studies of the five planets are an important part of ancient calendars and also one of the methods used to evaluate their accuracy, so astronomers paid much attention to this field. In "Hanshu·Tian wen zhi" and "Xuhanshu·Tian wen zhi", there are 160 records with detailed dates and positions, which are calculated and examined with modern astronomical methods in this paper. The calculated results show that most of these positions are correct, accounting for 77.5% of the total records, while the remaining 36 records are incorrect, accounting for 22.5%. In addition, there are three typical or special forms of five-planet movements. The numbers of “shou”, “he”, and “fan” movements are 14, 22, and 46, accounting for 9%, 14%, and 29%, respectively. In this paper, a detailed study of these three typical forms of five-planet movements is carried out. We think that the 36 incorrect records are caused by various reasons, but mainly by errors introduced in the data processing carried out by later generations.
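
    The reported shares follow directly from the stated counts out of 160 records; a quick arithmetic check:

```python
# Shares of the 160 records reported in the abstract.
total = 160
incorrect = 36
print(round(100 * (total - incorrect) / total, 1))          # correct share, percent
shou, he, fan = 14, 22, 46                                  # typical movement counts
print([round(100 * n / total) for n in (shou, he, fan)])    # percent shares
```

    The exact shares are 8.75%, 13.75%, and 28.75%, which the abstract rounds to 9%, 14%, and 29%.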

  20. Structuring and coding in health care records: a qualitative analysis using diabetes as a case study

    Directory of Open Access Journals (Sweden)

    Ann R R Robertson

    2015-03-01

    Full Text Available Background   Globally, diabetes mellitus presents a substantial burden to individuals and healthcare systems. Structuring and/or coding of medical records underpin attempts to improve information sharing and searching, potentially bringing benefits for clinical and secondary uses. Aims and objectives   We investigated if, how and why records for adults with diabetes were structured and/or coded, and explored stakeholders’ perceptions of current practice. Methods   We carried out a qualitative, theoretically-informed case study of documenting healthcare information for diabetes patients in family practice and hospital settings, using semi-structured interviews, observations, systems demonstrations and documentary data. Results   We conducted 22 interviews and four on-site observations, and reviewed 25 documents. For secondary uses – research, audit, public health and service planning – the benefits of highly structured and coded diabetes data were clearly articulated. Reported clinical benefits in terms of managing and monitoring diabetes, and perhaps encouraging patient self-management, were modest. We observed marked differences in levels of record structuring and/or coding between settings, and found little evidence that these data were being exploited to improve information sharing between them. Conclusions   Using high levels of data structuring and coding in medical records for diabetes patients has the potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK.

  1. Multi-electrode nerve cuff recording - model analysis of the effects of finite cuff length

    NARCIS (Netherlands)

    Veltink, P.H.; Tonis, T.; Buschman, H.P.J.; Marani, E.; Wesselink, W.A.

    2005-01-01

    The effect of finite cuff length on the signals recorded by electrodes at different positions along the nerve was analysed in a model study. Relations were derived using a one-dimensional model. These were evaluated in a more realistic axially symmetric 3D model. This evaluation indicated that the c

  2. A Correlational Analysis: Electronic Health Records (EHR) and Quality of Care in Critical Access Hospitals

    Science.gov (United States)

    Khan, Arshia A.

    2012-01-01

    Driven by the compulsion to improve the evident paucity in quality of care, especially in critical access hospitals in the United States, policy makers, healthcare providers, and administrators have taken the advice of researchers suggesting the integration of technology in healthcare. The Electronic Health Record (EHR) System composed of multiple…

  3. Estimating the Preferences of Central Bankers : An Analysis of Four Voting Records

    NARCIS (Netherlands)

    Eijffinger, S.C.W.; Mahieu, R.J.; Raes, L.B.D.

    2013-01-01

    Abstract: This paper analyzes the voting records of four central banks (Sweden, Hungary, Poland and the Czech Republic) with spatial models of voting. We infer the policy preferences of the monetary policy committee members and use these to analyze the evolution in preferences over time and the diff

  4. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis

    OpenAIRE

    Moreno-Conde, Alberto; Moner Cano, David; Da Cruz, Wellington Dimas; Santos, Marcelo R.; Maldonado Segura, José Alberto; Robles Viejo, Montserrat; KALRA, Dipak

    2015-01-01

    This is a pre-copyedited, author-produced PDF of an article accepted for publication in Journal of the American Medical Informatics Association following peer review. The version of record is available online at: http://dx.doi.org/10.1093/jamia/ocv008

  5. Allen's big-eared bat (Idionycteris phyllotis) documented in Colorado based on recordings of its distinctive echolocation call

    Science.gov (United States)

    Hayes, M.A.; Navo, K.W.; Bonewell, L.; Mosch, C.J.; Adams, R.A.

    2009-01-01

    Allen's big-eared bat (Idionycteris phyllotis) inhabits much of the southwestern USA, but has not been documented in Colorado. We recorded echolocation calls consistent with I. phyllotis near La Sal Creek, Montrose County, Colorado. Based on characteristics of echolocation calls and flight behavior, we conclude that the echolocation calls described here were emitted by I. phyllotis and that they represent the first documentation of this species in Colorado.

  6. Determination of Threshold Exposure and Intensity for Recording Holograms in Thick Green-Sensitive Acrylamide-Based Photopolymer

    OpenAIRE

    Mahmud, Mohammad Sultan; Naydenova, Izabela; Bebeva, Tzwetanka; Jallapuram, Raghavendra; Martin, Suzanne; Toal, Vincent

    2010-01-01

    For optical data storage applications, it is essential to determine the lowest intensity (also known as threshold intensity) below or at which no data page or grating can be recorded in the photosensitive material, as this in turn determines the data capacity of the material. Here, experiments were carried out to determine the threshold intensity below which the formation of a simple hologram—a holographic diffraction grating in a green-sensitized acrylamide-based photopolymer—is not possible...

  7. Theory-based Support for Mobile Language Learning: Noticing and Recording

    Directory of Open Access Journals (Sweden)

    Agnes Kukulska-Hulme

    2009-04-01

    Full Text Available This paper considers the issue of 'noticing' in second language acquisition, and argues for the potential of handheld devices to: (i) support language learners in noticing and recording noticed features 'on the spot', to help them develop their second language system; (ii) help language teachers better understand the specific difficulties of individuals or those from a particular language background; and (iii) facilitate data collection by applied linguistics researchers, which can be fed back into educational applications for language learning. We consider theoretical perspectives drawn from the second language acquisition literature, relating these to the practice of writing language learning diaries, and the potential for learner modelling to facilitate recording and prompting noticing in mobile assisted language learning contexts. We then offer guidelines for developers of mobile language learning solutions to support the development of language awareness in learners.

  8. Improved Security of Attribute Based Encryption for Securing Sharing of Personal Health Records

    Directory of Open Access Journals (Sweden)

    Able E Alias

    2014-11-01

    Full Text Available Cloud computing servers provide a platform for users to remotely store data and share data items with everyone. The personal health record (PHR) has emerged as a patient-centric model of health information exchange. Confidentiality of the shared data is the major problem when patients use commercial cloud servers, because the data can be viewed by everyone. To assure patients' control over access to their own medical records, a promising method is to encrypt the files before outsourcing and to attach access control to that data. Privacy exposure, scalability in key management, flexible access, and efficient user revocation have remained the most important challenges toward achieving fine-grained, cryptographically enforced data access control. In this paper a high degree of patient privacy is guaranteed by exploiting multi-authority ABE, and users in the PHR system are divided into multiple security domains, which greatly reduces the key management complexity for owners and users.
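The claimed reduction in key-management complexity from dividing PHR users into security domains can be illustrated with a back-of-envelope sketch. This is a hypothetical counting model, not the paper's cryptographic construction: under a single authority the record owner grants a key per user, while under multi-authority ABE the owner only delegates to each domain authority.

```python
def owner_key_load(num_users, num_domains):
    """Toy count of grants a record owner manages directly.
    Single authority: one key grant per user.
    Multi-domain ABE: one delegation per domain authority, which
    then handles key issuance for its own users."""
    single_authority = num_users
    multi_domain = num_domains
    return single_authority, multi_domain

# Hypothetical PHR system: 10,000 users split across 4 security domains.
direct, delegated = owner_key_load(num_users=10_000, num_domains=4)
```

Under these assumptions the owner's direct key-management burden drops from one grant per user to one delegation per domain.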

  9. Design of an Electronic Healthcare Record Server Based on Part 1 of ISO EN 13606

    Directory of Open Access Journals (Sweden)

    Tony Austin

    2011-01-01

    Full Text Available ISO EN 13606 is a newly approved standard at European and ISO levels for the meaningful exchange of clinical information between systems. Although conceived as an interoperability standard to which existing electronic health record (EHR) systems will transform legacy data, the requirements met and architectural approach reflected in this standard also make it a good candidate for the internal architecture of an EHR server. The authors have built such a server for the storage of healthcare records and demonstrated that it is possible to use ISO EN 13606 Part 1 as the basis of an internal system architecture. The development of the system and some of the applications of the server are described in this paper. It is the first known operational implementation of the standard as an EHR system.

  10. IMPROVED SECURITY OF ATTRIBUTE BASED ENCRYPTION FOR SECURING SHARING OF PERSONAL HEALTH RECORDS

    Directory of Open Access Journals (Sweden)

    Able E Alias

    2015-10-01

    Full Text Available Cloud computing servers provide a platform for users to remotely store data and share data items with everyone. The personal health record (PHR) has emerged as a patient-centric model of health information exchange. Confidentiality of the shared data is the major problem when patients use commercial cloud servers, because the data can be viewed by everyone. To assure patients' control over access to their own medical records, a promising method is to encrypt the files before outsourcing and to attach access control to that data. Privacy exposure, scalability in key management, flexible access, and efficient user revocation have remained the most important challenges toward achieving fine-grained, cryptographically enforced data access control. In this paper a high degree of patient privacy is guaranteed by exploiting multi-authority ABE, and users in the PHR system are divided into multiple security domains, which greatly reduces the key management complexity for owners and users.

  11. Gating treatment delivery QA based on a surrogate motion analysis

    International Nuclear Information System (INIS)

    Full text: To develop a methodology to estimate intrafractional target position error during a phase-based gated treatment. Westmead Cancer Care Centre uses respiratory-correlated, phase-based gated beam delivery in the treatment of lung cancer. The gating technique is managed by the Varian Real-time Position Management (RPM) system, version 1.7.5. A 6-dot block is placed on the abdomen of the patient and acts as a surrogate for the target motion. During a treatment session, the motion of the surrogate can be recorded by the RPM application. Analysis of the surrogate motion file by in-house developed software allows the intrafractional error of the treatment session to be computed. To validate the computed error, a simple test involving the introduction of deliberate errors was performed. Errors of up to 1.1 cm were introduced to a metal marker placed on a surrogate using the Varian Breathing Phantom. The moving marker was scanned in prospective mode using a GE Lightspeed 16 CT scanner. Using the CT images, the marker positions with and without introduced errors were compared to the errors calculated from the surrogate motion. The average and standard deviation of the difference between calculated target position errors and the measured, deliberately introduced errors of the marker position were 0.02 cm and 0.07 cm respectively. Conclusion: The calculated target position error based on surrogate motion analysis provides a quantitative measure of intrafractional target position errors during treatment. Routine QA for gated treatment using surrogate motion analysis is relatively quick and simple.
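The validation step described above (comparing errors calculated from the surrogate trace against deliberately introduced marker errors) amounts to computing the mean and standard deviation of paired differences. A minimal sketch, with hypothetical paired values in centimetres:

```python
import statistics

def error_stats(calculated_cm, measured_cm):
    """Mean and standard deviation of the differences between errors
    calculated from the surrogate motion trace and the independently
    measured (CT-derived) marker position errors."""
    diffs = [c - m for c, m in zip(calculated_cm, measured_cm)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# Hypothetical paired values in cm (introduced errors were up to 1.1 cm).
calculated = [0.10, 0.52, 0.98, 1.12, 0.31]
measured = [0.08, 0.50, 0.95, 1.05, 0.30]
mean_diff, sd_diff = error_stats(calculated, measured)
```

With these illustrative numbers the statistics land in the same range as those reported in the abstract (mean 0.02 cm, SD 0.07 cm).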

  12. An Agent Based System Framework for Mining Data Record Extraction from Search Engine Result Pages

    OpenAIRE

    Dr. K.L.Shunmuganathan; P.Kalaivani

    2012-01-01

    Nowadays, the huge amount of information distributed through the Web motivates studying techniques to be adopted in order to extract relevant data in an efficient and reliable way. Information extraction (IE) from semistructured Web documents plays an important role for a variety of information agents. In this paper, a framework of WebIE system with the help of the JADE platform is proposed to solve problems by non-visual automatic wrapper to extract data records from search engine results pa...

  13. End-user developed workflow-based hemodialysis nursing record system.

    Science.gov (United States)

    Tai, Hsin-Ling; Lin, Hsiu-Wen; Ke, Suh-Huei; Lin, Shu-Ai; Chang, Chiung-Chu; Chang, Polun

    2009-01-01

    We report how we decided to build our own hemodialysis nursing record system using an end-user computing strategy with Excel VBA. The project took one year to complete, since we used our off-duty time and started everything from the ground up. We are proud of the final system, which tightly matches our workflow and clinical needs. Its interface was carefully designed to be easy to use, with style. PMID:19593037

  14. Theory-based support for mobile language learning: noticing and recording

    OpenAIRE

    Agnes Kukulska-Hulme; Susan Bull

    2009-01-01

    This paper considers the issue of 'noticing' in second language acquisition, and argues for the potential of handheld devices to: (i) support language learners in noticing and recording noticed features 'on the spot', to help them develop their second language system; (ii) help language teachers better understand the specific difficulties of individuals or those from a particular language background; and (iii) facilitate data collection by applied linguistics researchers, which can be fed bac...

  15. HL7 document patient record architecture: an XML document architecture based on a shared information model.

    OpenAIRE

    Dolin, R H; Alschuler, L.; Behlen, F.; Biron, P. V.; BOYER S.; Essin, D.; Harding, L.; Lincoln, T.; Mattison, J E; Rishel, W.; Sokolowski, R.; Spinosa, J.; Williams, J. P.

    1999-01-01

    The HL7 SGML/XML Special Interest Group is developing the HL7 Document Patient Record Architecture. This draft proposal strives to create a common data architecture for the interoperability of healthcare documents. Key components are that it is under the umbrella of HL7 standards, it is specified in Extensible Markup Language, the semantics are drawn from the HL7 Reference Information Model, and the document specifications form an architecture that, in aggregate, define the semantics and stru...

  16. An Ontology-Based Methodology for the Migration of Biomedical Terminologies to Electronic Health Records

    OpenAIRE

    Smith, Barry; Ceusters, Werner

    2005-01-01

    Biomedical terminologies are focused on what is general, Electronic Health Records (EHRs) on what is particular, and it is commonly assumed that the step from the one to the other is unproblematic. We argue that this is not so, and that, if the EHR of the future is to fulfill its promise, then the foundations of both EHR architectures and biomedical terminologies need to be reconceived. We accordingly describe a new framework for the treatment of both generals and particular...

  17. Contract-Based Motivation for Keeping Records of a Manager's Reporting and Budgeting History

    OpenAIRE

    Anil Arya; John C. Fellingham; Young, Richard A.

    1994-01-01

    This paper analyzes the role of the agent's bankruptcy constraints in multiperiod principal-agent models with asymmetric information. Conditions are provided under which commitment to a long-term contract involving N rounds of investment improves upon repetition of N identical single-period contracts. Further, when the agent's reservation wage is sufficiently low the optimal contract is always long term. Keeping records of a manager's history of reporting activity facilitates contracting, sin...

  18. Residential segregation, dividing walls and mental health: A population-based record linkage study

    OpenAIRE

    Maguire, Aideen; French, Declan; O'Reilly, Dermot

    2016-01-01

    Background: Neighbourhood segregation has been described as a fundamental determinant of physical health, but literature on its effect on mental health is less clear. Whilst most previous research has relied on conceptualized measures of segregation, Northern Ireland is unique as it contains physical manifestations of segregation in the form of segregation barriers (or “peacelines”) which can be used to accurately identify residential segregation. Methods: We used population-wide health record da...

  19. Scalable Event-based Clustering of Social Media via Record Linkage Techniques

    OpenAIRE

    Reuter, Timo; Cimiano, Philipp; Drumond, Lucas; Buza, Krisztian; Schmidt-Thieme, Lars

    2011-01-01

    We tackle the problem of grouping content available in social media applications such as Flickr, Youtube, Panoramino etc. into clusters of documents describing the same event. This task has been referred to as event identification before. We present a new formalization of the event identification task as a record linkage problem and show that this formulation leads to a principled and highly efficient solution to the problem. We present results on two datasets derived from Flickr – last.fm and...
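The record-linkage formulation can be sketched as pairwise similarity plus transitive linking: documents whose similarity exceeds a threshold are linked, and connected components become event clusters. The snippet below is an illustrative toy, not the paper's method: it uses Jaccard similarity over tags only (the authors combine several metadata features and a scalable candidate-selection scheme), with an assumed threshold of 0.5.

```python
def jaccard(a, b):
    """Jaccard similarity between two tag collections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_events(docs, threshold=0.5):
    """Link documents whose tag similarity meets the threshold, then
    return connected components (via union-find) as event clusters."""
    parent = list(range(len(docs)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(len(docs)):
        for j in range(i + 1, len(docs)):
            if jaccard(docs[i]["tags"], docs[j]["tags"]) >= threshold:
                parent[find(i)] = find(j)
    clusters = {}
    for i in range(len(docs)):
        clusters.setdefault(find(i), []).append(i)
    return sorted(clusters.values())

# Hypothetical documents: the first two describe the same event.
docs = [
    {"tags": ["concert", "berlin", "2011"]},
    {"tags": ["concert", "berlin", "stage"]},
    {"tags": ["hiking", "alps"]},
]
events = cluster_events(docs)
```

The quadratic pairwise loop is the part the paper's blocking techniques exist to avoid on large collections.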

  20. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    OpenAIRE

    Balasubramaniam, S; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third pa...

  1. The Quality of Stakeholder-Based Decisions: Lessons from the Case Study Record

    OpenAIRE

    Beierle, Thomas

    2000-01-01

    The increased use of stakeholder processes in environmental decisionmaking has raised concerns that the inherently “political” nature of such processes may sacrifice substantive quality for political expediency. In particular, there is concern that good science will not be used adequately in stakeholder processes nor be reflected in their decision outcomes. This paper looks to the case study record to examine the quality of the outcomes of stakeholder efforts and the scientific and technical ...

  2. Analysis and correction of ballistocardiogram contamination of EEG recordings in MR

    International Nuclear Information System (INIS)

    Purpose: to examine the influence of cardiac activity-related head movements and varying blood pulse frequencies on the shape of electroencephalography (EEG) recordings in a high magnetic field, and to implement a post-processing technique to eliminate cardiac activity-related artifacts. Material and methods: respiratory thoracic movements, changes of blood pulse frequency and passive head movements were examined in 20 healthy subjects outside and inside an MR magnet at rest, with a simultaneously recorded 21-channel surface EEG. An electrocardiogram (ECG) was recorded simultaneously. On the basis of the correlation of the left ventricular ejection time (LVET) with the heart rate, a post-processing, heart-rate-dependent subtraction of the cardiac activity-related artifacts of the EEG was developed. The quality of the post-processed EEG was tested by detecting alpha-activity in the pre- and post-processed EEGs. Results: inside the magnet, passive head motion but not respiratory thoracic movements resulted in EEG artifacts that correlated strongly with cardiac activity-related artifacts of the EEG. The blood pulse frequency influenced the appearance of the cardiac activity-related artifacts of the EEG. The removal of the cardiac activity-related artifacts of the EEG by the implemented post-processing algorithm resulted in an EEG of diagnostic quality with detected alpha-activity. Conclusion: when recording an EEG in an MR environment, heart rate-dependent subtraction of EEG artifacts caused by ballistocardiogram contamination is essential to obtain EEG recordings of diagnostic quality and reliability. (orig.)
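The core idea of subtracting cardiac-activity-related artifacts can be sketched as template subtraction locked to ECG R-peaks. This is a simplified, fixed-template version; the algorithm described in the abstract additionally adapts the template to heart rate via the LVET correlation.

```python
import numpy as np

def subtract_bcg(eeg, r_peaks, half_width):
    """Build a mean artifact template from epochs centred on ECG R-peaks
    and subtract it at every beat (fixed width, fixed shape)."""
    in_range = [p for p in r_peaks
                if p - half_width >= 0 and p + half_width <= len(eeg)]
    epochs = [eeg[p - half_width:p + half_width] for p in in_range]
    template = np.mean(epochs, axis=0)
    cleaned = eeg.copy()
    for p in in_range:
        cleaned[p - half_width:p + half_width] -= template
    return cleaned

# Synthetic check: an identical artifact repeated at each R-peak is
# removed exactly, since the mean template equals the artifact itself.
raw = np.zeros(1000)
artifact = np.hanning(100)
peaks = [100, 300, 500, 700]
for p in peaks:
    raw[p - 50:p + 50] += artifact
cleaned = subtract_bcg(raw, peaks, 50)
```

Real BCG artifacts vary beat to beat, which is exactly why the heart-rate-dependent refinement in the paper is needed.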

  3. A river based stable isotope record of orographic precipitation: Taurus Mountains, south central Turkey

    Science.gov (United States)

    Schemmel, Fabian; Mulch, Andreas; Mikes, Tamás.; Schildgen, Taylor

    2010-05-01

    lapse rate of δD is about -20 per mil/km. Select samples have higher δ18O and δD values than expected for their respective elevations due to strong evaporative effects at the sampling sites. In areas of very steep topography (waterfalls, valley gorges) the δ18O and δD values of water samples are biased towards values lower than expected for their respective elevations. However, such irregularities can be accounted for by plotting the measured isotopic compositions against the hypsometric mean elevations instead of the actual sampling elevations. Comparison with data gathered by the GNIP network at four neighboring stations (Adana, Antalya, Güzeloluk, and Kocebeyli) shows that the longer-term (1 to 18 years) isotopic composition of precipitation agrees very well with the data collected in this study. Collectively, the data presented here may serve as a modern template against which late Neogene proxy-based records of paleoprecipitation along the southern margin of the central Anatolian plateau may be calibrated.
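Given a linear isotopic lapse rate such as the δD value of about -20 per mil/km reported above, converting between sample composition and hypsometric mean elevation is simple arithmetic. The base (low-elevation) value below is hypothetical, chosen only to make the sketch concrete:

```python
DD_LAPSE = -20.0  # δD lapse rate, per mil per km (value reported above)

def expected_dD(dD_base, elevation_km):
    """Expected δD at a given hypsometric mean elevation, assuming a
    linear lapse rate from a (hypothetical) low-elevation base value."""
    return dD_base + DD_LAPSE * elevation_km

def elevation_from_dD(dD_sample, dD_base):
    """Invert the lapse rate to estimate catchment mean elevation (km)."""
    return (dD_sample - dD_base) / DD_LAPSE

# A sample at -80 per mil against a base of -40 per mil implies ~2 km.
elev_km = elevation_from_dD(dD_sample=-80.0, dD_base=-40.0)
```

This inversion is the sense in which stream-water isotopes serve as a paleoaltimetry template, provided evaporative biases are accounted for as described.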

  4. Key Point Based Data Analysis Technique

    Science.gov (United States)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.

  5. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.;

    2008-01-01

    This review with 239 references has as its aim to give the reader an introduction to the kinds of methods used for developing microchip based electrode systems as well as to cover the existing literature on electroanalytical systems where microchips play a crucial role for 'nondestructive' measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems; measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis studies and scanning electrochemical microscopy (SECM) studies of living cells, have been omitted. Included is also a discussion about some future and emerging nano tools and considerations that might have an impact on the future of "nondestructive" chip based electroanalysis of living cells.

  6. Fault-based analysis of flexible ciphers

    Directory of Open Access Journals (Sweden)

    V.I.Korjik

    2002-07-01

    Full Text Available We consider the security of some flexible ciphers against differential fault analysis (DFA). We present a description of the fault-based attack on two kinds of flexible ciphers. The first kind is represented by the fast software-oriented cipher based on data-dependent subkey selection (DDSS), in which flexibility corresponds to the use of key-dependent operations. The second kind is represented by a DES-like cryptosystem, GOST, with secret S-boxes. In general, the use of secret operations and procedures contributes to the security of the cryptosystem; however, the degree of this contribution depends significantly on the structure of the encryption mechanism. It is shown how to attack the DDSS-based flexible cipher using DFA, though this cipher is secure against standard variants of differential and linear cryptanalysis. We also give an outline of the ciphers RC5 and GOST, showing that they are also insecure against a DFA-based attack. We suggest a modification of the DDSS mechanism and a variant of the advanced DDSS-based flexible cipher that is secure against attacks based on random hardware faults.

  7. Impact of a computerized system for evidence-based diabetes care on completeness of records: a before–after study

    Directory of Open Access Journals (Sweden)

    Roshanov Pavel S

    2012-07-01

    Full Text Available Abstract Background: Physicians practicing in ambulatory care are adopting electronic health record (EHR) systems. Governments promote this adoption with financial incentives, some hinged on improvements in care. These systems can improve care, but most demonstrations of successful systems come from a few highly computerized academic environments. Those findings may not be generalizable to typical ambulatory settings, where evidence of success is largely anecdotal, with little or no use of rigorous methods. The purpose of our pilot study was to evaluate the impact of a diabetes-specific chronic disease management system (CDMS) on recording of information pertinent to guideline-concordant diabetes care and to plan for larger, more conclusive studies. Methods: Using a before–after study design we analyzed the medical records of approximately 10 patients from each of 3 diabetes specialists (total = 31) who were seen both before and after the implementation of a CDMS. We used a checklist of key clinical data to compare the completeness of information recorded in the CDMS record to both the clinical note sent to the primary care physician based on that same encounter and the clinical note sent to the primary care physician based on the visit that occurred prior to the implementation of the CDMS, accounting for provider effects with Generalized Estimating Equations. Results: The CDMS record outperformed by a substantial margin dictated notes created for the same encounter. Only 10.1% (95% CI, 7.7% to 12.3%) of the clinically important data were missing from the CDMS chart compared to 25.8% (95% CI, 20.5% to 31.1%) from the clinical note prepared at the time (p Conclusions: The CDMS chart captured information important for the management of diabetes more often than dictated notes created with or without its use, but we were unable to detect a difference in completeness between notes dictated in CDMS-associated and usual-care encounters. Our sample of
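The completeness measure used in the study (share of checklist items missing from a record) can be sketched as follows. The checklist items and example records here are hypothetical, not the study's actual checklist:

```python
# Hypothetical checklist of key clinical data for guideline-concordant care.
CHECKLIST = ["hba1c", "blood_pressure", "ldl", "foot_exam",
             "retinopathy_screen"]

def percent_missing(record):
    """Share (%) of checklist items absent from one clinical record."""
    missing = [item for item in CHECKLIST if record.get(item) is None]
    return 100.0 * len(missing) / len(CHECKLIST)

# Illustrative records: a structured CDMS chart vs. a dictated note.
cdms_chart = {"hba1c": 7.1, "blood_pressure": "130/80", "ldl": 2.1,
              "foot_exam": "normal", "retinopathy_screen": None}
dictated_note = {"hba1c": 7.1, "blood_pressure": None, "ldl": None,
                 "foot_exam": None, "retinopathy_screen": None}
```

The study then aggregates these per-record percentages across patients while adjusting for provider effects with Generalized Estimating Equations, which this sketch omits.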

  8. Astronomical calibration of the Boreal Santonian (Cretaceous) based on the marine carbon isotope record and correlation to the tropical realm

    Science.gov (United States)

    Thibault, Nicolas; Jarvis, Ian; Voigt, Silke; Gale, Andy; Attree, Kevin; Jenkyns, Hugh

    2016-04-01

    New high-resolution records of bulk carbonate carbon isotopes have been generated for the Upper Coniacian to Lower Campanian interval of the reference sections at Seaford Head (southern England) and Bottaccione (Gubbio, central Italy). These records allow for a new and unambiguous stratigraphic correlation of the base and top of the Santonian between the Boreal and Tethyan realms. Orbital forcing of stable carbon and oxygen isotopes can be highlighted in the Seaford Head dataset, and a floating astronomical time scale is presented for the Santonian of the section, which spans five 405 kyr cycles (Sa1 to Sa5). Macro-, micro- and nannofossil biostratigraphy of the Seaford section is integrated along with magnetostratigraphy, carbon-isotope chemostratigraphy and cyclostratigraphy. Correlation of the Seaford Head astronomical time scale to that of the Niobrara Formation (U.S. Western Interior Basin) allows for anchoring these records to the La2011 astronomical solution at the Santonian-Campanian (Sa/Ca) boundary, which has been recently dated to 84.19±0.38 Ma. Five different astronomical tuning options are examined. The astronomical calibration generates a c. 200 kyr mismatch of the Coniacian-Santonian boundary age between the Boreal Realm in Europe and the Western Interior, likely due either to slight diachronism of the first occurrence of the inoceramid Cladoceramus undulatoplicatus between the two regions, or to remaining uncertainties of radiometric dating and the cyclostratigraphic records.

  9. System based practice: a concept analysis

    Science.gov (United States)

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction: Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high quality of care, and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP's components must be precisely defined in order to provide valid and reliable assessment tools. Methods: Walker & Avant's approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results: Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients' needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion: The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept to make it possible to apply it in training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using empirical referents of the analysis. PMID:27104198

  10. Constructing a population-based research database from routine maternal screening records: a resource for studying alloimmunization in pregnant women.

    Directory of Open Access Journals (Sweden)

    Brian K Lee

    Full Text Available BACKGROUND: Although screening for maternal red blood cell antibodies during pregnancy is a standard procedure, the prevalence and clinical consequences of non-anti-D immunization are poorly understood. The objective was to create a national database of maternal antibody screening results that can be linked with population health registers to create a research resource for investigating these issues. STUDY DESIGN AND METHODS: Each birth in the Swedish Medical Birth Register was uniquely identified and linked to the text stored in routine maternal antibody screening records in the time window from 9 months prior to 2 weeks after the delivery date. These text records were subjected to a computerized search for specific antibodies using regular expressions. To illustrate the research potential of the resulting database, selected antibody prevalence rates are presented as tables and figures, and the complete data (from more than 60 specific antibodies) presented as online moving graphical displays. RESULTS: More than one million (1,191,761) births with valid screening information from 1982-2002 constitute the study population. Computerized coverage of screening increased steadily over time and varied by region as electronic records were adopted. To ensure data quality, we restricted analysis to birth records in areas and years with a sustained coverage of at least 80%, representing 920,903 births from 572,626 mothers in 17 of the 24 counties in Sweden. During the study period, non-anti-D and anti-D antibodies occurred in 76.8/10,000 and 14.1/10,000 pregnancies respectively, with marked differences between specific antibodies over time. CONCLUSION: This work demonstrates the feasibility of creating a nationally representative research database from the routine maternal antibody screening records from an extended calendar period.
By linkage with population registers of maternal and child health, such data are a valuable resource for addressing important
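The computerized search of free-text screening records for specific antibodies using regular expressions might look like the following sketch. The record texts and patterns are illustrative only; the study covered more than 60 specificities in routine Swedish records:

```python
import re

# Hypothetical free-text screening results as stored in routine records.
records = [
    "Antibody screen positive: anti-D titre 1:16",
    "Screen positive, anti-Kell detected",
    "Antibody screen negative",
]

# One regular expression per antibody specificity of interest.
PATTERNS = {
    "anti-D": re.compile(r"\banti-?D\b", re.IGNORECASE),
    "anti-Kell": re.compile(r"\banti-?kell\b", re.IGNORECASE),
}

def detect_antibodies(text):
    """Return the set of antibody specificities mentioned in one record."""
    return {name for name, pat in PATTERNS.items() if pat.search(text)}

found = [detect_antibodies(r) for r in records]
```

Counting the non-empty result sets per pregnancy would yield prevalence rates of the kind reported in the abstract.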

  11. Vertical Microbial Community Variability of Carbonate-based Cones may Provide Insight into Formation in the Rock Record

    Science.gov (United States)

    Trivedi, C.; Bojanowski, C.; Daille, L. K.; Bradley, J.; Johnson, H.; Stamps, B. W.; Stevenson, B. S.; Berelson, W.; Corsetti, F. A.; Spear, J. R.

    2015-12-01

    Stromatolite morphogenesis is poorly understood, and the process by which microbial mats become mineralized is a primary question in microbialite formation. Ancient conical stromatolites are primarily carbonate-based whereas the few modern analogues in hot springs are either non-mineralized or mineralized by silica. A team from the 2015 International GeoBiology Course investigated carbonate-rich microbial cones from near Little Hot Creek (LHC), Long Valley Caldera, California, to investigate how conical stromatolites might form in a hot spring carbonate system. The cones are up to 3 cm tall and are found in a calm, ~45° C pool near LHC that is 4 times super-saturated with respect to CaCO3. The cones rise from a flat, layered microbial mat at the edge of the pool. Scanning electron microscopy revealed filamentous bacteria associated with calcite crystals within the cone tips. Preliminary 16S rRNA gene analysis indicated variability of community composition between different vertical levels of the cone. The cone tip had comparatively greater abundance of filamentous cyanobacteria (Leptolyngbya and Phormidium) and fewer heterotrophs (e.g. Chloroflexi) compared to the cone bottom. This supports the hypothesis that cone formation may depend on the differential abundance of the microbial community and their potential functional roles. Metagenomic analyses of the cones revealed potential genes related to chemotaxis and motility. Specifically, a genomic bin identified as a member of the genus Isosphaera contained an hmp chemotaxis operon implicated in gliding motility in the cyanobacterium Nostoc punctiforme [1]. Isosphaera is a Planctomycete shown to have phototactic capabilities [2], and may play a role in conjunction with cyanobacteria in the vertical formation of the cones. 
This analysis of actively growing cones indicates a complex interplay of geochemistry and microbiology that form structures which can serve as models for processes that occurred in the past and are

  12. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data

    Science.gov (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick

    2016-01-01

    This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets of different sizes were created from these documents and imported into three single-machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database sizes and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query, and the indexing time increases with the size of the datasets. The clusters with 2, 4, 8 and 12 nodes performed no better than the single-node cluster in query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and for small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use
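    As a hedged illustration of the population-based querying and per-query secondary indexing this record discusses, the sketch below expresses a comparable query over Python's built-in sqlite3; the table layout, archetype identifier and clinical threshold are hypothetical, not drawn from the study's dataset.

```python
# Illustrative sketch (not from the paper): a population-based query over
# archetype-tagged entries, with an explicit secondary index as the SQL
# analogue of Couchbase's per-query index requirement.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ehr_entry (patient_id INTEGER, archetype TEXT, systolic REAL)")
rows = [(i, "openEHR-EHR-OBSERVATION.blood_pressure.v1", 100 + (i * 7) % 80)
        for i in range(10000)]
cur.executemany("INSERT INTO ehr_entry VALUES (?, ?, ?)", rows)

# Each differently formulated query needs a matching index.
cur.execute("CREATE INDEX idx_sys ON ehr_entry (archetype, systolic)")

# Population-based query: how many patients have systolic pressure >= 140?
cur.execute("""SELECT COUNT(DISTINCT patient_id) FROM ehr_entry
               WHERE archetype = 'openEHR-EHR-OBSERVATION.blood_pressure.v1'
                 AND systolic >= 140""")
n_hypertensive = cur.fetchone()[0]
print(n_hypertensive)
```

    The same query against a document store would be reformulated per document model (XPath/XQuery for the XML databases, a view or N1QL-style query for Couchbase), which is what the study benchmarks.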

  13. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in formulating defenses against dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piece-part failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, this study identifies dependent component failures with a revised dependent failure definition that uses a component failure mechanism categorization scheme. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  14. Gait correlation analysis based human identification.

    Science.gov (United States)

    Chen, Jinyan

    2014-01-01

    Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. Using a background subtraction algorithm, the moving silhouette figure can be extracted from the sequence of walking images. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance. PMID:24592144
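    The pipeline in this abstract (background subtraction, three-axis shift correlation, DFT features, normalization, PCA) can be sketched roughly as follows in NumPy; the shift range, correlation definition, toy silhouettes and feature sizes are illustrative assumptions, not the paper's settings.

```python
# Rough sketch of the silhouette-correlation gait feature, with invented data.
import numpy as np

rng = np.random.default_rng(0)
# A toy silhouette sequence: t binary frames of size y x x.
seq = (rng.random((16, 32, 24)) > 0.7).astype(float)  # axes: (t, y, x)

def shift_correlation(seq, max_shift=4):
    """Correlate the sequence with itself shifted along t, y and x."""
    feats = []
    for axis in range(3):
        for s in range(1, max_shift + 1):
            shifted = np.roll(seq, s, axis=axis)
            feats.append((seq * shifted).sum() / seq.sum())
    return np.array(feats)

raw = shift_correlation(seq)
# DFT of the correlation profile; magnitudes as features, then normalize.
spec = np.abs(np.fft.rfft(raw))
spec /= np.linalg.norm(spec)

# PCA via SVD over a (hypothetical) gallery of such feature vectors.
gallery = np.stack([spec + 0.01 * rng.standard_normal(spec.shape) for _ in range(10)])
centered = gallery - gallery.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:3].T   # keep 3 principal components
print(reduced.shape)
```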

  15. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. Using a background subtraction algorithm, the moving silhouette figure can be extracted from the sequence of walking images. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.

  16. Records Reaching Recording Data Technologies

    Science.gov (United States)

    Gresik, G. W. L.; Siebe, S.; Drewello, R.

    2013-07-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capture of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is ensured by a gyroscopically stabilized device developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light-stripe topography scanner have also been used. The combination of the recorded data should enable a complementary analysis of monuments and buildings.

  17. Redundancy in electronic health record corpora: analysis, impact on text mining performance and mitigation strategies

    OpenAIRE

    Cohen, Raphael; Elhadad, Michael; Elhadad, Noémie

    2013-01-01

    Background The increasing availability of Electronic Health Record (EHR) data, and specifically free-text patient notes, presents opportunities for phenotype extraction. Text-mining methods in particular can help disease modeling by mapping named-entity mentions to terminologies and clustering semantically related terms. EHR corpora, however, exhibit specific statistical and linguistic characteristics when compared with corpora in the biomedical literature domain. We focus on copy-and-paste r...

  18. Historical flood records of the Tagus river: Stationarity and flood hazard analysis

    OpenAIRE

    Benito, Gerardo; Botero, Blanca; Machado, María José

    2012-01-01

    The Tagus river drains the central Spanish Plateau (Meseta) and flows east-west into the Atlantic Ocean at Lisbon. It is the longest river of the Iberian Peninsula (1,200 km), and its flood regime is mainly related to persistent rainfall associated with the successive passage of cold fronts during the winter months. Historical flood records at four major locations (Aranjuez, Toledo, Talavera and Alcantara) were derived both from historical documents (Proceedings of the City Council, diaries, chronicles...

  19. Analysis of Clinical Record Data for Anticoagulation Management within an EHR System

    OpenAIRE

    Austin, T.; Kalra, D.; Lea, N C; Patterson, D. L.; Ingram, D.

    2009-01-01

    OBJECTIVES: This paper reports an evaluation of the properties of a generic electronic health record information model that were actually required and used when importing an existing clinical application into a generic EHR repository. METHOD: A generic EHR repository and system were developed as part of the EU Projects Synapses and SynEx. A Web application to support the management of anticoagulation therapy was developed to interface to the EHR system, and deployed within a north London hosp...

  20. A strategic analysis of synapse and Canada health infoway’s electronic health record solution blueprint

    OpenAIRE

    Labrosse, Chadwick Andre

    2007-01-01

    Synapse is a currently deployed software application that collects and presents clinical and administrative information about Mental Health & Addictions patients, in the form of an Electronic Health Record (EHR). Synapse was jointly developed by regional health authorities, federal and provincial governments and research institutions. While Synapse has enjoyed limited regional success in British Columbia, the Synapse Project Steering Committee seeks to expand its adoption with clinicians ...

  1. A qualitative analysis of an electronic health record (EHR) implementation in an academic ambulatory setting

    OpenAIRE

    Kahyun Yoon-Flannery; Stephanie Zandieh; Gilad Kuperman; Daniel Langsam; Daniel Hyman; Rainu Kaushal

    2008-01-01

    Objectives To determine pre-implementation perspectives of institutional, practice and vendor leadership regarding best practice for implementation of two ambulatory electronic health records (EHRs) at an academic institution. Design Semi-structured interviews with ambulatory care network and information systems leadership, medical directors, practice managers and vendors before EHR implementation. Results were analysed using grounded theory with ATLAS.ti version 5.0. Measurements Quali...

  2. Tax evasion and measurement error: An econometric analysis of survey data linked with tax records

    OpenAIRE

    Paulus, Alari

    2015-01-01

    We use income survey data linked with tax records at the individual level for Estonia to estimate the determinants and extent of income tax compliance in a novel way. Unlike earlier studies attributing income discrepancies between such data sources either to tax evasion or survey measurement error, we model these processes jointly. Focussing on employment income, the key identifying assumption made is that people working in public sector cannot evade taxes. The results indicate a number of so...

  3. The INGV's new OBS/H: Analysis of the signals recorded at the Marsili submarine volcano

    OpenAIRE

    D'Alessandro, A.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione OV, Napoli, Italia; D'Anna, G.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione CNT, Roma, Italia; Luzio, D.; CFTA, University of Palermo; Mangano, G.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione CNT, Roma, Italia

    2009-01-01

    The ocean bottom seismometer with hydrophone deployed on the flat top of the Marsili submarine volcano (790 m deep) by the Gibilmanna OBS Lab (CNT–INGV) from 12th to 21st July, 2006, recorded more than 1000 transient seismic signals. Nineteen of these signals were associated with tectonic earthquakes: 1 teleseismic, 8 regional (located by INGV) and 10 small local seismic events (non-located earthquakes). The regional events were used to determine the sensor orientation. By comparing t...

  4. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    Science.gov (United States)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize detrended fluctuation analysis (DFA), which is a well-established method for determining monofractal scaling properties and detecting long-range correlations. We relate the new multifractal DFA method to the standard partition-function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
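    A compact sketch of the generalized (multifractal) DFA described here, under assumed settings (order-1 detrending and a small set of scales); for uncorrelated noise the generalized fluctuation exponent h(q) should come out near 0.5 for all q.

```python
# Simplified multifractal DFA sketch; scale selection and detrending order
# are illustrative assumptions, not the authors' exact settings.
import numpy as np

def fluctuation(x, s, q):
    """q-th order DFA fluctuation at scale s (order-1 detrending)."""
    y = np.cumsum(x - np.mean(x))          # profile of the series
    n = len(y) // s
    f2 = np.empty(n)
    t = np.arange(s)
    for v in range(n):
        seg = y[v * s:(v + 1) * s]
        coef = np.polyfit(t, seg, 1)       # local linear trend
        f2[v] = np.mean((seg - np.polyval(coef, t)) ** 2)
    if q == 0:                              # logarithmic-average limit
        return np.exp(0.5 * np.mean(np.log(f2)))
    return (np.mean(f2 ** (q / 2.0))) ** (1.0 / q)

rng = np.random.default_rng(1)
x = rng.standard_normal(8192)              # uncorrelated noise: h(q) ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
for q in (-2, 2):
    f = [fluctuation(x, s, q) for s in scales]
    h_q = np.polyfit(np.log(scales), np.log(f), 1)[0]   # slope = h(q)
    print(q, round(h_q, 2))
```

    For a multifractal series, h(q) would vary systematically with q instead of staying flat.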

  5. Analysis of a Near-field Earthquake Record at the Deep Underground Research Tunnel

    International Nuclear Information System (INIS)

    On October 29, 2008, a moderate earthquake (M=3.4, 36.35 N 127.25 E) occurred near the city of Daejon, where an underground research facility called KURT (KAERI Underground Research Tunnel) is located inside KAERI. Even though this earthquake did not trigger the seismic monitoring system of the mock-up Nuclear Power Plant of Hanaro, it was large enough not only to provide good-quality nationwide earthquake data but also to be widely and uncomfortably felt by people around Daejon. In addition, this earthquake provided a good opportunity to obtain a near-field broadband seismogram with frequencies up to 200 Hz recorded by the three-component geophones in the deep underground tunnel of the KURT (-90 m). We therefore compared the seismic records from the KURT with records from the nearby national seismic network to evaluate the earthquake ground-motion characteristics at underground facilities for future engineering application. Three nearby seismic stations of the national seismic network are jointly operated by the Korea Meteorological Administration (KMA), Korea Institute of Geoscience And Mineral Resources (KIGAM), KEPRI, and KINS

  6. An analysis of the recording of tobacco use among inpatients in Irish hospitals.

    LENUS (Irish Health Repository)

    Sheridan, A

    2014-10-01

    Smoking is the largest avoidable cause of premature mortality in the world. Hospital admission is an opportunity to identify and help smokers quit. This study aimed to determine the level of recording of tobacco use (current and past) in Irish hospitals. Information on inpatient discharges with a tobacco use diagnosis was extracted from HIPE. In 2011, a quarter (n=84,679) of discharges had a recording of tobacco use. Recordings were more common among males (29% (n=50,161) male v. 20% (n=30,162) female) and among medical patients (29% (n=54,375) medical v. 20% (n=30,162) other), and were highest among those aged 55-59 years (30.6%; n=7,885). SLAN 2007 reported that 48% of adults had smoked at some point in their lives. This study would suggest an under-reporting of tobacco use among hospital inpatients. Efforts should be made to record smoking status at hospital admission, and to improve the quality of the HIPE coding of tobacco use.

  7. Ostracod-based isotope record from Lake Ohrid (Balkan Peninsula) over the last 140 ka

    Science.gov (United States)

    Belmecheri, Soumaya; von Grafenstein, Ulrich; Andersen, Nils; Eymard-Bordon, Amandine; Régnier, Damien; Grenier, Christophe; Lézine, Anne-Marie

    2010-12-01

    The stable isotope composition of benthic ostracods from a deep-lake sediment core (JO2004-1) recovered from Lake Ohrid (Albania-Macedonia) was studied to investigate regional responses to climate change at the interface between the north-central European and Mediterranean climate systems. Ostracod valves are present only during interglacial intervals, during Marine Isotope Stage (MIS) 5 and 1. The ostracod oxygen isotope values (δ18O) quantitatively reflect changes in the oxygen isotope signal of the lake water (δ18OL). The interpretation of this record, however, is far from straightforward. δ18OL variations throughout the MIS 5/6 transition (TII), MIS 5 and MIS 1 appear to be controlled by site-specific hydrological processes, as shown by modern isotope hydrology. The δ18OL trends at TII, MIS 5 and MIS 1 match the timing and the main structural features of the major regional climate records (Corchia cave δ18O, Iberian margin sea surface temperature), suggesting that the Ohrid δ18OL responded to global-scale climate changes, although it seems certain that the lake experienced a significant degree of evaporation and varying moisture availability. The carbon isotope signal (δ13C) seems to respond more directly to climate changes, in agreement with other JO2004-1 proxies. The δ13C of the ostracod calcite is directly linked to the δ13C of the dissolved inorganic carbon (DIC) in the lake, which in this case is controlled by the isotopic composition of the DIC in the incoming water and by the internal processes of the lake. High δ13C during cold periods and low values during warm periods reflect changing vegetation cover and soil activity. These results suggest that Lake Ohrid has the potential to capture a long record of regional environment-related temperature trends during interglacial periods, particularly given the exceptional thickness of the lake sediment, which probably covers the entire Quaternary.

  8. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four-window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point-and-click modules under development for use in introductory statistics courses.

  9. Electric Equipment Diagnosis based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Stavitsky Sergey A.

    2016-01-01

    Full Text Available As electric equipment develops and grows more complicated, precise and intensive diagnosis becomes necessary. Nowadays there are two basic approaches to diagnosis: analog signal processing and digital signal processing. The latter is preferable. Alongside the basic methods of digital signal processing (the Fourier transform and the fast Fourier transform), one of the modern methods is based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. This article shows ways of using wavelet analysis and the process of converting a test signal. To carry out this analysis, the computer software Mathcad was used and a 2D wavelet spectrum for a complex function was created.
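    As a minimal illustration of the wavelet approach discussed (the article itself used Mathcad), a one-level Haar decomposition in NumPy shows how a transient fault signature is localized by the detail coefficients; the test signal is invented.

```python
# One-level Haar wavelet decomposition of a synthetic diagnostic signal.
import numpy as np

def haar_dwt(x):
    """Return (approximation, detail) coefficients; len(x) must be even."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# Test signal: a smooth tone plus a sharp transient; in equipment diagnosis
# the transient shows up strongly in the detail coefficients.
t = np.linspace(0, 1, 256, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t)
signal[128] += 4.0                          # simulated fault spike
approx, detail = haar_dwt(signal)
print(int(np.argmax(np.abs(detail))))       # spike at sample 128 -> pair 64
```

    The transform is orthonormal, so signal energy is split exactly between the approximation and detail bands; repeating the split on the approximation gives the usual multi-level decomposition.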

  10. Electronic Medical Record System Based on XML

    Institute of Scientific and Technical Information of China (English)

    陈可

    2012-01-01

    To address the slow transmission of medical information and the time-consuming retrieval of past medical histories inherent in handwritten records, a design for an XML-based electronic medical record (EMR) system was proposed and implemented. Using Web Services as the development platform, the system acts as the carrier of clinical information organized around the patient, integrating physician orders, medical-technology examinations, nursing care and infectious-disease reporting, and implements information query and integration. By applying the electronic medical record system, the terminal-only quality control typical of handwritten records is replaced by multi-point, multi-directional supervision of medical records.
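    A hypothetical sketch of the kind of XML record such a system might carry, using Python's standard xml.etree; the element names are illustrative and do not follow any published schema.

```python
# Build and query a minimal, invented XML medical-record fragment.
import xml.etree.ElementTree as ET

record = ET.Element("medicalRecord", patientId="P0001")
order = ET.SubElement(record, "physicianOrder", code="RX-12")
order.text = "Amoxicillin 500 mg, three times daily"
exam = ET.SubElement(record, "examination", modality="CT")
exam.text = "Chest CT: no abnormality detected"
ET.SubElement(record, "infectiousDiseaseReport", status="none")

xml_bytes = ET.tostring(record, encoding="utf-8")

# Querying the integrated record, e.g. all physician orders for the patient:
parsed = ET.fromstring(xml_bytes)
orders = [e.text for e in parsed.findall("physicianOrder")]
print(orders)
```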

  11. HL7 document patient record architecture: an XML document architecture based on a shared information model.

    Science.gov (United States)

    Dolin, R H; Alschuler, L; Behlen, F; Biron, P V; Boyer, S; Essin, D; Harding, L; Lincoln, T; Mattison, J E; Rishel, W; Sokolowski, R; Spinosa, J; Williams, J P

    1999-01-01

    The HL7 SGML/XML Special Interest Group is developing the HL7 Document Patient Record Architecture. This draft proposal strives to create a common data architecture for the interoperability of healthcare documents. Key components are that it is under the umbrella of HL7 standards, it is specified in Extensible Markup Language, the semantics are drawn from the HL7 Reference Information Model, and the document specifications form an architecture that, in aggregate, define the semantics and structural constraints necessary for the exchange of clinical documents. The proposal is a work in progress and has not yet been submitted to HL7's formal balloting process. PMID:10566319

  12. The implementation of a Personal Digital Assistant (PDA) based patient record and charting system: lessons learned.

    OpenAIRE

    Carroll, Aaron E.; Saluja, Sunil; Tarczy-Hornoch, Peter

    2002-01-01

    Personal Digital Assistants (PDAs) offer many potential advantages to clinicians. A number of systems have begun to appear for all types of PDAs that allow for the recording and tracking of patient information. PDAs allow information to be both entered and accessed at the point of care. They also allow information entered away from a central repository to be added or "synced" with data through the use of a wireless or wired connection. Few systems, however, have been designed to work in the c...

  13. CIS3/398: Implementation of a Web-Based Electronic Patient Record for Transplant Recipients

    OpenAIRE

    Fritsche, L; Lindemann, G.; Schroeter, K.; Schlaefer, A; Neumayer, H-H

    1999-01-01

    Introduction While the "Electronic patient record" (EPR) is a frequently quoted term in many areas of healthcare, only few working EPR-systems are available so far. To justify their use, EPRs must be able to store and display all kinds of medical information in a reliable, secure, time-saving, user-friendly way at an affordable price. Fields with patients who are attended to by a large number of medical specialists over a prolonged period of time are best suited to demonstrate the potential b...

  14. Management of radiological related equipments. Creating the equipment management database and analysis of the repair and maintenance records

    International Nuclear Information System (INIS)

    In 1997, we established a committee for equipment maintenance and management in our department. We designed a database, using Microsoft Access, to classify and register all radiology-related equipment. Managing the condition and cost of each piece of equipment became easier by keeping the database as an equipment-management ledger and by filing the history of repairs and maintenance for each modality. We then tallied the number of repairs, repair costs and downtimes from four years of repair and maintenance records, and re-examined the causal analysis of failures and the content of regular maintenance for the CT and MRI equipment, which had shown the highest numbers of repairs. Consequently, we found ways to improve the data-registration method and to use the repair budget more economically. (author)
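    The kind of database and repair-record aggregation described might look like the following sqlite3 sketch; the schema and figures are invented, not taken from the report.

```python
# Invented equipment-management schema and a repair/downtime tally per modality.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE equipment (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("""CREATE TABLE repair (equipment_id INTEGER, cost REAL,
              downtime_h REAL)""")
db.executemany("INSERT INTO equipment VALUES (?, ?)",
               [(1, "CT"), (2, "MRI"), (3, "US")])
db.executemany("INSERT INTO repair VALUES (?, ?, ?)",
               [(1, 1200.0, 8.0), (1, 300.0, 2.0),
                (2, 5000.0, 24.0), (3, 150.0, 1.0)])

# Number, cost and downtime of repairs per modality, highest repair count first.
rows = db.execute("""SELECT e.name, COUNT(*), SUM(r.cost), SUM(r.downtime_h)
                     FROM repair r JOIN equipment e ON e.id = r.equipment_id
                     GROUP BY e.name ORDER BY COUNT(*) DESC, e.name""").fetchall()
print(rows)
```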

  15. Non-Contact Analysis of the Adsorptive Ink Capacity of Nano Silica Pigments on a Printing Coating Base

    Science.gov (United States)

    Jiang, Bo; Huang, Yu Dong

    2014-01-01

    Near-infrared spectra combined with partial least squares were proposed as a means of non-contact analysis of the adsorptive ink capacity of recording coating materials in ink-jet printing. First, the recording coating materials were prepared based on nano silica pigments. 80 samples of the recording coating materials were selected to develop the calibration of adsorptive ink capacity against ink adsorption (g/m2). The model developed predicted samples in the validation set with r2 = 0.80 and SEP = 1.108; the analytical results showed that near-infrared spectra have significant potential for predicting the ink-adsorption capacity of the recording coating. The influence of factors such as recording coating thickness, the silica:binder (polyvinyl alcohol) mass ratio and the solution concentration on the adsorptive ink capacity was studied. With the help of the near-infrared spectra, the adsorptive ink capacity of a recording coating material can be rapidly controlled. PMID:25329464
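    The calibration step (partial least squares regression of spectra against measured ink adsorption) can be sketched with a minimal PLS1/NIPALS implementation; the synthetic "spectra" below stand in for the NIR data, which are not available here.

```python
# Minimal PLS1 (NIPALS) sketch on synthetic spectra; not the authors' model.
import numpy as np

def pls1_fit(X, y, n_comp):
    """Return regression coefficients mapping centered X to centered y."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc                      # covariance-maximizing direction
        w /= np.linalg.norm(w)
        t = Xc @ w                         # scores
        tt = t @ t
        p = Xc.T @ t / tt                  # loadings
        q = (yc @ t) / tt
        Xc = Xc - np.outer(t, p)           # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.inv(P.T @ W) @ Q

rng = np.random.default_rng(2)
X = rng.standard_normal((80, 20))          # 80 samples x 20 "wavelengths"
true_b = np.zeros(20); true_b[[5, 12]] = [1.5, -0.8]
y = X @ true_b + 0.01 * rng.standard_normal(80)

b = pls1_fit(X, y, n_comp=4)
y_hat = (X - X.mean(0)) @ b + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3))
```

    In practice the number of latent components would be chosen by cross-validation, and model quality reported as r2 and SEP on a held-out validation set, as in the abstract.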

  17. Arabic Interface Analysis Based on Cultural Markers

    Directory of Open Access Journals (Sweden)

    Mohammadi Akheela Khanum

    2012-01-01

    Full Text Available This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture, and that Hofstede's scores for Arab countries are partially supported by the website design components examined in this study. Moderate support was also found for long-term orientation, for which Hofstede gives no score.

  18. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  19. Arabic Interface Analysis Based on Cultural Markers

    CERN Document Server

    Khanum, Mohammadi Akheela; Chaurasia, Mousmi A

    2012-01-01

    This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture, and that Hofstede's scores for Arab countries are partially supported by the website design components examined in this study. Moderate support was also found for long-term orientation, for which Hofstede gives no score.

  20. Video semantic content analysis based on ontology

    OpenAIRE

    Bai, Liang; Lao, Songyang; Jones, Gareth J.F.; Smeaton, Alan F.

    2007-01-01

    The rapid increase in the available amount of video data is creating a growing demand for efficient methods for understanding and managing it at the semantic level. New multimedia standards, such as MPEG-4 and MPEG-7, provide the basic functionalities in order to manipulate and transmit objects and metadata. But importantly, most of the content of video data at a semantic level is out of the scope of the standards. In this paper, a video semantic content analysis framework based on ontology i...

  1. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, in which the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well-known block-matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase-correlation (PC) function) for crowd motion estimation are also presented.
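    One of the baselines mentioned, the phase-correlation (PC) function, can be sketched compactly with NumPy FFTs; the frames here are synthetic, and the shift-recovery convention is an assumption of this sketch.

```python
# Phase correlation: recover the translation between two frames via FFTs.
import numpy as np

def phase_correlation(f, g):
    """Estimate the translation of g relative to f via the PC function."""
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    cross = F * np.conj(G)
    cross /= np.abs(cross) + 1e-12           # keep phase only
    corr = np.fft.ifft2(cross).real          # impulse at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts (circular wrap-around).
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)

rng = np.random.default_rng(3)
frame = rng.random((64, 64))
moved = np.roll(np.roll(frame, 5, axis=0), -3, axis=1)  # content moved (5, -3)
print(phase_correlation(moved, frame))
```

    Because only the spectral phase is kept, the correlation surface has a sharp impulse at the dominant displacement, which is what makes PC robust to illumination changes compared with plain cross-correlation.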

  2. An Agent Based System Framework for Mining Data Record Extraction from Search Engine Result Pages

    Directory of Open Access Journals (Sweden)

    Dr.K.L Shunmuganathan

    2012-04-01

    Full Text Available Nowadays, the huge amount of information distributed through the Web motivates the study of techniques for extracting relevant data in an efficient and reliable way. Information extraction (IE) from semistructured Web documents plays an important role for a variety of information agents. In this paper, a framework for a WebIE system built on the JADE platform is proposed: a non-visual automatic wrapper that extracts data records from search engine result pages, which contain important information for meta-search engines and computer users. The paper describes the different agents used in WebIE, how they communicate, and how they are managed. A Multi-Agent System (MAS) provides an efficient, decentralized way for agents to communicate. A prototype model was developed for this study and used to solve the complex problems that arise in WebIE. Our wrapper consists of a series of agent filters that detect and remove irrelevant data regions from the web page. In this paper, we propose a highly effective and efficient algorithm for automatically mining result records from search engine response pages.
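    A minimal non-visual wrapper in the spirit described, using only Python's standard html.parser, might filter a results page like this; the `div class="result"` layout is hypothetical, not the algorithm from the paper.

```python
# Extract repeated result records from a (synthetic) search-results page,
# skipping non-record regions such as ads.
from html.parser import HTMLParser

class RecordExtractor(HTMLParser):
    """Collect the text of every <div class="result"> region."""
    def __init__(self):
        super().__init__()
        self.depth = 0          # nesting depth inside a result region
        self.records = []

    def handle_starttag(self, tag, attrs):
        if tag == "div" and ("class", "result") in attrs:
            self.depth = 1
            self.records.append("")
        elif tag == "div" and self.depth:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag == "div" and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.records[-1] += data.strip()

page = ('<html><body><div class="ad">sponsored</div>'
        '<div class="result">First hit</div>'
        '<div class="result">Second hit</div></body></html>')
parser = RecordExtractor()
parser.feed(page)
print(parser.records)
```

    In the agent-based framework, each such filter would run as one agent in a chain, with irrelevant-region removal and record mining handled by separate cooperating agents.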

  3. Health scorecard of spacecraft platforms: Track record of on-orbit anomalies and failures and preliminary comparative analysis

    Science.gov (United States)

    Wise, Marcie A.; Saleh, Joseph H.; Haga, Rachel A.

    2011-01-01

    Choosing the "right" satellite platform for a given market and mission requirements is a major investment decision for a satellite operator. With a variety of platforms available on the market from different manufacturers, and multiple offerings from the same manufacturer, the down-selection process can be quite involved. In addition, because data on on-orbit failures and anomalies per platform are unavailable, incomplete, or fragmented, it is difficult to compare options and make an informed choice with respect to the critical attribute of field reliability of different platforms. In this work, we first survey a large number of geosynchronous satellite platforms by the major satellite manufacturers, and we provide a brief overview of their technical characteristics, timeline of introduction, and number of units launched. We then analyze an extensive database of satellite failures and anomalies, and develop for each platform a "health scorecard" that includes all the minor and major anomalies and complete failures (that is, failure events of different severities) observed on-orbit for each platform. We identify the subsystems that drive these failure events and how much each subsystem contributes to these events for each platform. In addition, we provide the percentage of units in each platform which have experienced failure events, and, after calculating the total number of years logged on-orbit by each platform, we compute its corresponding average failure and anomaly rate. We conclude this work with a preliminary comparative analysis of the health scorecards of different platforms. The concept of a "health scorecard" introduced here provides a useful snapshot of the failure and anomaly track record of a spacecraft platform on orbit. As such, it constitutes a useful and transparent benchmark that can be used by satellite operators to inform their acquisition choices ("inform" not "base", as other considerations are factored in when comparing different spacecraft

  4. A new source discriminant based on frequency dispersion for hydroacoustic phases recorded by T-phase stations

    Science.gov (United States)

    Talandier, Jacques; Okal, Emile A.

    2016-07-01

    In the context of the verification of the Comprehensive Nuclear-Test Ban Treaty in the marine environment, we present a new discriminant based on the empirical observation that hydroacoustic phases recorded at T-phase stations from explosive sources in the water column feature a systematic inverse dispersion, with lower frequencies traveling slower, which is absent from signals emanating from earthquake sources. This difference is present even in the case of the so-called "hotspot earthquakes" occurring inside volcanic edifices featuring steep slopes leading to efficient seismic-acoustic conversions, which can lead to misidentification of such events as explosions when using more classical duration-amplitude discriminants. We propose an algorithm for the compensation of the effect of dispersion over the hydroacoustic path based on a correction to the spectral phase of the ground velocity recorded by the T-phase station, computed individually from the dispersion observed on each record. We show that the application of a standard amplitude-duration algorithm to the resulting compensated time series satisfactorily identifies records from hotspot earthquakes as generated by dislocation sources, and present a full algorithm, lending itself to automation, for the discrimination of explosive and earthquake sources of hydroacoustic signals at T-phase stations. The only sources not readily identifiable consist of a handful of complex explosions which occurred in the 1970s, believed to involve the testing of advanced weaponry, and which should be independently identifiable through routine vetting by analysts. While we presently cannot provide a theoretical justification for the observation that only explosive sources generate dispersed T phases, we suggest that this probably reflects a simpler and more coherent distribution of acoustic energy among the various modes constituting the wavetrain than in the case of dislocation sources embedded in the solid Earth.
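
    The compensation step, as described, amounts to subtracting the dispersion phase measured on each record from the signal's spectrum. Below is a minimal numerical sketch of that idea; the quadratic dispersion phase is an illustrative assumption, not the measured hydroacoustic dispersion of the paper:

```python
# Spectral-phase dispersion compensation, sketched on a synthetic pulse.
# The dispersion law phi(f) here is assumed for illustration only.
import numpy as np

fs = 100.0                      # sampling rate, Hz
n = 1024
t = np.arange(n) / fs
pulse = np.exp(-0.5 * ((t - 2.0) / 0.05) ** 2)   # compact source-like pulse

f = np.fft.rfftfreq(n, 1 / fs)
phi = 0.02 * f ** 2                               # assumed dispersion phase (rad)
dispersed = np.fft.irfft(np.fft.rfft(pulse) * np.exp(-1j * phi), n)

# Compensation: apply the opposite spectral phase, as estimated from the record
compensated = np.fft.irfft(np.fft.rfft(dispersed) * np.exp(1j * phi), n)

print(np.max(np.abs(compensated - pulse)))  # tiny: the compact pulse is recovered
```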

  6. Mass accumulation rate and monsoon records from Xifeng, Chinese Loess Plateau, based on a luminescence age model

    DEFF Research Database (Denmark)

    Stevens, Thomas; Buylaert, Jan-Pieter; Lu, Huayu;

    2016-01-01

    Luminescence dating of loess accumulation in China has raised questions over disturbance and gaps in the record, the magnitude of mass accumulation rates (MARs), and monsoon forcing mechanisms. Here we present a detailed quartz optically stimulated luminescence (OSL) chronology from the Xifeng...... Chinese Loess Plateau site. We reconstruct MARs and construct an age model for monsoon proxies at the site. The luminescence ages show significant pedogenic and anthropogenic disturbance in material deposited after ca. 20–22 ka. Analysis of other published data suggests that this disturbance may be more...

  7. A probabilistic analysis of human influence on recent record global mean temperature changes

    Directory of Open Access Journals (Sweden)

    Philip Kokic

    2014-01-01

    December 2013 was the 346th consecutive month in which the global land and ocean average surface temperature exceeded the 20th-century monthly average, with February 1985 the last time the mean temperature fell below this value. Even given these and other extraordinary statistics, public acceptance of human-induced climate change and confidence in the supporting science has declined since 2007. The degree of uncertainty as to whether observed climate changes are due to human activity or are part of natural system fluctuations remains a major stumbling block to effective adaptation action and risk management. Previous approaches to attributing change include qualitative expert-assessment approaches, such as those used in IPCC reports, and 'fingerprinting' methods based on global climate models. Here we develop an alternative approach which provides a rigorous probabilistic statistical assessment of the link between observed climate changes and human activities in a way that can inform formal climate risk assessment. We construct and validate a time series model of anomalous global temperatures to June 2010, using rates of greenhouse gas (GHG) emissions, as well as other causal factors including solar radiation, volcanic forcing and the El Niño Southern Oscillation. When the effect of GHGs is removed, bootstrap simulation of the model reveals that there is less than a one in one hundred thousand chance of observing an unbroken sequence of 304 months (our analysis extends to June 2010) with mean surface temperature exceeding the 20th-century average. We also show that one would expect a far greater number of short periods of falling global temperatures (as observed since 1998) if climate change were not occurring. This approach to assessing the probability of human influence on global temperature could be transferred to other climate variables and extremes, allowing enhanced formal risk assessment of climate change.
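
    The flavor of the bootstrap argument can be conveyed with a much cruder null model: if monthly anomalies were independent coin flips around the century mean, an unbroken run of 304 warm months would essentially never occur. The iid null below is a deliberate simplification of the paper's validated time series model:

```python
# Crude illustration of the run-length argument under an iid no-warming null.
import random

random.seed(42)

def longest_warm_run(n_months):
    """Longest run of consecutive months above the century mean."""
    longest = run = 0
    for _ in range(n_months):
        if random.random() < 0.5:          # month above the century mean
            run += 1
            longest = max(longest, run)
        else:
            run = 0
    return longest

trials = 2000
# 110 years of months per trial; count trials containing a 304-month warm run
hits = sum(longest_warm_run(1320) >= 304 for _ in range(trials))
print(hits / trials)  # 0.0: such a run is astronomically unlikely under this null
```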

  8. Detection of the short-term preseizure changes in EEG recordings using complexity and synchrony analysis

    Institute of Scientific and Technical Information of China (English)

    JIA Wenyan; KONG Na; MA Jun; LIU Hesheng; GAO Xiaorong; GAO Shangkai; YANG Fusheng

    2006-01-01

    An important consideration in epileptic seizure prediction is proving the existence of a pre-seizure state that can be detected using various signal processing algorithms. In the analyses of intracranial electroencephalographic (EEG) recordings of four epilepsy patients, the short-term changes in the measures of complexity and synchrony were detected before the majority of seizure events across the sample patient population. A decrease in complexity and increase in phase synchrony appeared several minutes before seizure onset and the changes were more pronounced in the focal region than in the remote region. This result was also validated statistically using a surrogate data method.
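
    One common way to quantify the phase synchrony referred to above is the phase-locking value (PLV) between two channels, computed from analytic-signal phases. The sketch below assumes this measure (the abstract does not specify the exact synchrony index) and uses synthetic signals rather than patient EEG:

```python
# Phase-locking value between two channels, via an FFT-based analytic signal.
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (equivalent to a Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2          # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1              # keep Nyquist bin
    return np.fft.ifft(X * h)

def plv(x, y):
    """Phase-locking value: 1 for constant phase difference, ~0 for none."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return abs(np.mean(np.exp(1j * dphi)))

t = np.arange(0, 2, 1 / 256.0)
rng = np.random.default_rng(0)
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t + 0.7)   # constant phase lag -> PLV near 1
c = rng.standard_normal(t.size)        # unrelated noise -> low PLV
print(round(plv(a, b), 3), round(plv(a, c), 3))
```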

  9. Recording Approach of Heritage Sites Based on Merging Point Clouds from High Resolution Photogrammetry and Terrestrial Laser Scanning

    Science.gov (United States)

    Grussenmeyer, P.; Alby, E.; Landes, T.; Koehl, M.; Guillemin, S.; Hullo, J. F.; Assali, P.; Smigiel, E.

    2012-07-01

    Different approaches and tools are required in Cultural Heritage Documentation to deal with the complexity of monuments and sites. The documentation process has changed strongly in the last few years, always driven by technology. Accurate documentation is closely tied to advances in technology (imaging sensors, high-speed scanning, automation in recording and processing data) for the purposes of conservation works, management, appraisal, assessment of the structural condition, archiving, publication and research (Patias et al., 2008). In this paper we focus on the recording aspects of cultural heritage documentation, especially the generation of geometric and photorealistic 3D models for accurate reconstruction and visualization purposes. The selected approaches are based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and recent advances have changed the recording approach. The choice of the best workflow depends on the site configuration, the performance of the sensors, and criteria such as geometry, accuracy, resolution, georeferencing, texture, and of course processing time. TLS techniques (time-of-flight or phase-shift systems) are widely used for recording large and complex objects and sites. Point cloud generation from images by dense stereo or multi-view matching can be used as an alternative or complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a high-performance digital camera and a few accessories only. Indeed, the stereo or multi-view matching process offers a cheap, flexible and accurate solution to get 3D point clouds. Moreover, the captured images might also be used for model texturing. Several software packages are available, whether web-based, open source or commercial. The main advantage of this photogrammetric or computer vision based technology is to get

  10. Curvelet based offline analysis of SEM images.

    Directory of Open Access Journals (Sweden)

    Syed Hamad Shirazi

    Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image-processing-based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method employs a state-of-the-art curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by the application of a box-counting algorithm for fractal dimension (FD) calculations, with the ultimate goal of measuring parameters such as surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known watershed segmentation algorithm.
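
    The box-counting step behind the fractal-dimension estimate can be sketched as follows: count occupied boxes at a series of scales and fit a log-log slope. The binary test image (a square outline) and the scale set are illustrative choices, not the paper's SEM data; on this coarse grid the estimate for a smooth curve comes out somewhat above its ideal value of 1.

```python
# Box-counting fractal dimension of a binary image, sketched on a toy shape.
import numpy as np

def box_count(img, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i + size, j:j + size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(2, 4, 8, 16)):
    counts = [box_count(img, s) for s in sizes]
    # FD is the negative slope of log(count) vs log(box size)
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

img = np.zeros((64, 64), dtype=bool)
img[16, 16:48] = img[47, 16:48] = True      # a square outline: a 1-D curve
img[16:48, 16] = img[16:48, 47] = True
print(round(fractal_dimension(img), 2))
```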

  11. Soldering-based easy packaging of thin polyimide multichannel electrodes for neuro-signal recording

    International Nuclear Information System (INIS)

    We propose a novel packaging method for preparing thin polyimide (PI) multichannel microelectrodes. The electrodes were connected simply by making a via-hole at the interconnection pad of a thin PI electrode, and a nickel (Ni) ring was constructed by electroplating through the via-hole to permit stable soldering with strong adhesion to the electrode and the printed circuit board. The electroplating conditions were optimized for the construction of a well-organized Ni ring. The electrical properties of the packaged electrode were evaluated by fabricating and packaging a 40-channel thin PI electrode. Animal experiments were performed using the packaged electrode for high-resolution recording of somatosensory evoked potential from the skull of a rat. The in vivo and in vitro tests demonstrated that the packaged PI electrode may be used broadly for the continuous measurement of bio-signals or for neural prosthetics. (paper)

  12. Global solar radiation: comparison of satellite-based climatology with station records

    Science.gov (United States)

    Skalak, Petr; Zahradnicek, Pavel; Stepanek, Petr; Farda, Ales

    2016-04-01

    We analyze surface incoming shortwave radiation (SIS) from the SARAH dataset prepared by the EUMETSAT Climate Monitoring Satellite Applications Facility from satellite observations of the visible channels of the MVIRI and SEVIRI instruments onboard the geostationary Meteosat satellites. The satellite SIS data are evaluated within the period 1984-2014 on various time scales: from individual months and years to long-term climate means. The validation is performed using the ground measurements of global solar radiation (GLBR) carried out at 11 meteorological stations of the Czech Hydrometeorological Institute in the Czech Republic with data series at least 30 years long. Our aim is to explore whether the SIS data could potentially serve as an alternative source of information on GLBR outside of a relatively sparse network of meteorological stations recording GLBR. Acknowledgement: Supported by the Ministry of Education, Youth and Sports of the Czech Republic within the National Sustainability Program I (NPU I), grant number LO1415.
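
    A validation of this kind typically reduces to simple skill metrics between collocated satellite and station series. The bias and RMSE helpers below use made-up monthly means, not SARAH or station data:

```python
# Bias and RMSE between satellite (SIS) and station (GLBR) series; values invented.
import math

def bias(sat, obs):
    """Mean difference satellite minus observation."""
    return sum(s - o for s, o in zip(sat, obs)) / len(obs)

def rmse(sat, obs):
    """Root-mean-square error of satellite against observation."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sat, obs)) / len(obs))

sis = [115.0, 160.0, 210.0, 195.0]    # hypothetical monthly means, W/m^2
glbr = [110.0, 155.0, 205.0, 200.0]
print(bias(sis, glbr), rmse(sis, glbr))  # 2.5 5.0
```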

  13. Soldering-based easy packaging of thin polyimide multichannel electrodes for neuro-signal recording

    Science.gov (United States)

    Baek, Dong-Hyun; Han, Chang-Hee; Jung, Ha-Chul; Kim, Seon Min; Im, Chang-Hwan; Oh, Hyun-Jik; Jungho Pak, James; Lee, Sang-Hoon

    2012-11-01

    We propose a novel packaging method for preparing thin polyimide (PI) multichannel microelectrodes. The electrodes were connected simply by making a via-hole at the interconnection pad of a thin PI electrode, and a nickel (Ni) ring was constructed by electroplating through the via-hole to permit stable soldering with strong adhesion to the electrode and the printed circuit board. The electroplating conditions were optimized for the construction of a well-organized Ni ring. The electrical properties of the packaged electrode were evaluated by fabricating and packaging a 40-channel thin PI electrode. Animal experiments were performed using the packaged electrode for high-resolution recording of somatosensory evoked potential from the skull of a rat. The in vivo and in vitro tests demonstrated that the packaged PI electrode may be used broadly for the continuous measurement of bio-signals or for neural prosthetics.

  14. Implementation of a Next-Generation Electronic Nursing Records System Based on Detailed Clinical Models and Integration of Clinical Practice Guidelines

    OpenAIRE

    Min, Yul Ha; Park, Hyeoun-Ae; Chung, Eunja; Lee, Hyunsook

    2013-01-01

    Objectives The purpose of this paper is to describe the components of a next-generation electronic nursing records system ensuring full semantic interoperability and integrating evidence into the nursing records system. Methods A next-generation electronic nursing records system based on detailed clinical models and clinical practice guidelines was developed at Seoul National University Bundang Hospital in 2013. This system has two components, a terminology server and a nursing documentation ...

  15. Case Mix Management Systems: An Opportunity to Integrate Medical Records and Financial Management System Data Bases

    OpenAIRE

    Rusnak, James E.

    1987-01-01

    Due to previous systems selections, many hospitals (health care facilities) are faced with the problem of fragmented data bases containing clinical, demographic and financial information. Projects to select and implement a Case Mix Management System (CMMS) provide an opportunity to reduce the number of separate physical files and to migrate towards systems with an integrated data base. The number of CMMS candidate systems is often restricted due to data base and system interface issues. The h...

  16. Quantitative reconstruction of the last interglacial vegetation and climate based on the pollen record from Lake Baikal, Russia

    Energy Technology Data Exchange (ETDEWEB)

    Tarasov, P. [Free University, Institute of Geological Sciences, Palaeontology Department, Berlin (Germany); Granoszewski, W. [Polish Geological Institute, Carpathian Branch, Krakow (Poland); Bezrukova, E.; Abzaeva, A. [Siberian Branch Russian Academy of Sciences, Institute of Geochemistry, Irkutsk (Russian Federation); Brewer, S. [CEREGE CNRS/University P. Cezanne, UMR 6635, BP80, Aix-en-Provence (France); Nita, M. [University of Silesia, Faculty of Earth Sciences, Sosnowiec (Poland); Oberhaensli, H. [GeoForschungsZentrum, Potsdam (Germany)

    2005-11-01

    Changes in the mean temperature of the coldest (T_c) and warmest month (T_w), annual precipitation (P_ann) and moisture index (α) were reconstructed from a continuous pollen record from Lake Baikal, Russia. The pollen sequence CON01-603-2 (53°57'N, 108°54'E) was recovered from 386 m water depth on the Continent Ridge and dated to ca. 130-114.8 ky BP. This time interval covers the complete last interglacial (LI), corresponding to MIS 5e. Results of pollen analysis and pollen-based quantitative biome reconstruction show pronounced changes in the regional vegetation throughout the record. Shrubby tundra covered the area at the beginning of MIS 5e (ca. 130-128 ky), consistent with the end of the Middle Pleistocene glaciation. The late glacial climate was characterised by low winter and summer temperatures (T_c ~ -38 to -35 °C and T_w ~ 11-13 °C) and low annual precipitation (P_ann ~ 300 mm). However, the wide spread of tundra vegetation suggests rather moist environments associated with low temperatures and evaporation (reconstructed α ~ 1). Tundra was replaced by boreal conifer forest (taiga) by ca. 128 ky BP, suggesting a transition to the interglacial. The taiga-dominant phase lasted until ca. 117.4 ky BP, i.e. about 10 ky. The most favourable climate conditions occurred during the first half of the LI. P_ann reached 500 mm soon after 128 ky BP. However, temperature changed more gradually. Maximum values of T_c ~ -20 °C and T_w ~ 16-17 °C are reconstructed from about 126 ky BP. Conditions became gradually colder after ca. 121 ky BP. T_c dropped to ~ -27 °C and T_w to ~ 15 °C by 119.5 ky BP. The reconstructed increase in continentality was accompanied by a decrease in P_ann to ~ 400-420 mm. However, the climate was still humid enough (α ~ 0.9) to

  17. Strain localization and fluid infiltration during subduction initiation: the record from sheared mafic amphibolites at the base of the New Caledonian ophiolite

    Science.gov (United States)

    Soret, Mathieu; Agard, Philippe; Dubacq, Benoît; Vitale-Brovarone, Alberto; Monié, Patrick; Chauvet, Alain; Whitechurch, Hubert

    2015-04-01

    Most of our knowledge of subduction inception and obduction processes comes from metamorphic soles structurally associated with peridotite tectonites at the base of many ophiolites, and from early-obduction magmatic dikes emplaced at different levels of the mantle sequence. These dikes record a partial refertilization of obducted ophiolites through subduction-derived fluids. However, these dikes are rarely deformed and/or metamorphosed. Here, we study the base of the New Caledonian ophiolite, using a combination of structural field studies and petrological-geochemical-geochronological analysis, with the aim of linking deformation and metasomatism through fluid infiltration and recrystallization. We report the existence of strongly sheared mafic amphibolites within the base of the New Caledonian obducted ophiolite, ~50-100 m above the basal thrust contact, highly boudinaged and amphibolitized at high temperatures (750-800 °C), providing evidence that strain localized at the base of the ophiolite. Mafic protoliths of these amphibolites consisted of plagioclase and orthopyroxene (± olivine and calcic amphibole in places). We show that deformation is intimately associated with at least three major stages of fluid infiltration in mafic intrusions. The first stage of deformation and metasomatism coincides with amphibolitization and controlled the later channelization of fluids. The formation of calcic amphiboles records the percolation of Ca- and Al-rich aqueous fluids. Amphibole-plagioclase geothermobarometry indicates high-temperature and low-pressure conditions (i.e. 750-800 °C; 3-5 kbar). Thermochronological data from hornblende (40Ar/39Ar) suggest that this deformation episode occurred at ~55 Ma, coinciding with E-dipping subduction initiation and incipient obduction. The main metasomatic stage is evidenced by a phlogopite-rich matrix wrapping peridotite and amphibolite boudins. The formation of phlogopite records the percolation

  18. Spatial-temporal analysis on climate variation in early Qing dynasty (17th -18th century) using China's chronological records

    Science.gov (United States)

    Lin, Kuan-Hui Elaine; Wang, Pao-Kuan; Fan, I.-Chun; Liao, Yi-Chun; Liao, Hsiung-Ming; Pai, Pi-Ling

    2016-04-01

    Global climate change in the form of extremes, variations, and short- or mid-term fluctuations is now widely conceived to challenge the survival of human beings and societies. Meanwhile, improving present and future climate modeling requires a comprehensive understanding of past climate patterns. Although historical climate modeling has made substantive progress in recent years based on new findings from dynamical meteorology, phenology, and paleobiology, less known are the mid- to short-term variations or lower-frequency variabilities at different temporal scales and their regional expressions. Enabling accurate historical climate modeling relies heavily on the robustness of a dataset that carries specific time, location, and meteorological information in continuous temporal and spatial chains. This study thus presents an important methodological innovation for reconstructing historical climate at multiple temporal and spatial scales through building a historical climate dataset, based on the Chinese chronicles compiled in Zhang's (2004) Compendium of Chinese Meteorological Records of the Last 3,000 Years, reaching back to the Zhou Dynasty (1100 BC). The dataset preserves delicate meteorological data with accurate time, location, meteorological event, duration, and other phenological, social and economic impact information, and is carefully digitalized, coded, and geo-referenced on Geographical Information System based maps according to Tan's (1982) historical atlas of China. The research project, beginning in January 2015, is a collaborative work among scholars across the meteorology, geography, and historical linguistics disciplines. The present research findings, derived from the first 100+ years of the Qing dynasty, include the following. First, the analysis is based on a sample of n = 1398 cities/counties across Mainland China in the observation period. Second, the frequencies of precipitation, cold

  19. Tree-ring analysis by pixe for a historical record of soil chemistry response to acidic air pollution

    Science.gov (United States)

    Legge, Allan H.; Kaufmann, Henry C.; Winchester, John W.

    1984-04-01

    Tree cores have been analyzed intact in 1 mm steps, corresponding to time intervals in the rings as short as half a growing season, providing a chronological record of 16 elemental concentrations extending over thirty years back to 1950. Samples were collected in a forested region of western Canada in sandy soil which was impacted by acid-forming gases released by a sulfur recovery sour natural gas plant. Tree core samples of the hybrid lodgepole-Jack pine (Pinus contorta Loud. × Pinus banksiana Lamb.) were taken in five ecologically similar locations between 1.2 and 9.6 km from the gas plant stacks. Concentrations of some elements showed patterns suggesting that the annual rings preserved a record of changing soil chemistry in response both to natural environmental conditions and to deposition from sulfur gas emissions, commencing after plant start-up in 1959 and modified by subsequent changes in plant operating procedures. These patterns were most pronounced nearest the gas plant. Certain other elements did not exhibit these patterns, probably reflecting the greater importance of biological than of soil chemical properties. The high time resolution of tree-ring analysis, which can be achieved by PIXE, demonstrates that the rings preserve a historical record of elemental composition which may reflect changes in soil chemistry during plant growth as it may be affected by both natural ecological processes and acidic deposition from the atmosphere.

  20. Analysis of geomagnetic storm variations and count-rate of cosmic ray muons recorded at the Brazilian southern space observatory

    Energy Technology Data Exchange (ETDEWEB)

    Frigo, Everton [University of Sao Paulo, USP, Institute of Astronomy, Geophysics and Atmospheric Sciences, IAG/USP, Department of Geophysics, Sao Paulo, SP (Brazil); Savian, Jairo Francisco [Space Science Laboratory of Santa Maria, LACESM/CT, Southern Regional Space Research Center, CRS/INPE, MCT, Santa Maria, RS (Brazil); Silva, Marlos Rockenbach da; Lago, Alisson dal; Trivedi, Nalin Babulal [National Institute for Space Research, INPE/MCT, Division of Space Geophysics, DGE, Sao Jose dos Campos, SP (Brazil); Schuch, Nelson Jorge, E-mail: efrigo@iag.usp.br, E-mail: savian@lacesm.ufsm.br, E-mail: njschuch@lacesm.ufsm.br, E-mail: marlos@dge.inpe.br, E-mail: dallago@dge.inpe.br, E-mail: trivedi@dge.inpe.br [Southern Regional Space Research Center, CRS/INPE, MCT, Santa Maria, RS (Brazil)

    2007-07-01

    An analysis of geomagnetic storm variations and the count rate of cosmic ray muons recorded at the Brazilian Southern Space Observatory (OES/CRS/INPE-MCT) in Sao Martinho da Serra, RS, during November 2004 is presented in this paper. The geomagnetic measurements are made by a three-component low-noise fluxgate magnetometer, and the count rates of cosmic ray muons are recorded by a muon scintillator telescope (MST), both instruments installed at the Observatory. The fluxgate magnetometer measures variations in the three orthogonal components of the Earth's magnetic field, H (North-South), D (East-West) and Z (Vertical), with a data sampling rate of 0.5 Hz. The muon scintillator telescope records hourly count rates. The arrival of a solar disturbance can be identified by observing a decrease in the muon count rate. The goal of this work is to describe the physical morphology and phenomenology observed during the geomagnetic storm of November 2004, using the H component of the geomagnetic field and the vertical channel V of the multi-directional muon detector in southern Brazil. (author)

  1. On the use of radar-based quantitative precipitation estimates for precipitation frequency analysis

    Science.gov (United States)

    Eldardiry, Hisham; Habib, Emad; Zhang, Yu

    2015-12-01

    The high spatio-temporal resolutions of radar-based multi-sensor Quantitative Precipitation Estimates (QPEs) make them a potential complement to the gauge records for engineering design purposes, such as precipitation frequency analysis. The current study investigates three fundamental issues that arise when radar-based QPE products are used in frequency analysis: (a) Effect of sample size due to the typically short records of radar products; (b) Effect of uncertainties present in radar-rainfall estimation algorithms; and (c) Effect of the frequency estimation approach adopted. The study uses a 13-year dataset of hourly, 4 × 4 km2 radar-based QPEs over a domain that covers Louisiana, USA. Data-based investigations, as well as synthetic simulations, are performed to quantify the uncertainties associated with the radar-derived frequencies, and to gain insight into the relative contributions of short record lengths and those from conditional biases in the radar product. Three regional estimation procedures were tested and the results indicate the sensitivity of the radar frequency estimates to the selection of the estimation approach and the impact on the uncertainties of the derived extreme quantiles. The simulation experiments revealed that the relatively short radar records explained the majority of the uncertainty associated with the radar-based quantiles; however, they did not account for any tangible contribution to the systematic underestimation observed between radar- and gauge-based frequency estimates. This underestimation was mostly attributable to the conditional bias inherent in the radar product. Addressing such key outstanding problems in radar-rainfall products is necessary before they can be fully and reliably used for frequency analysis applications.
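
    One standard frequency-analysis step, fitting an extreme-value distribution to an annual-maximum series and reading off a return-period quantile, can be sketched as below. The Gumbel fit by method of moments and the 13-year synthetic record are illustrative assumptions echoing the short-record issue, not the study's regional estimation procedures:

```python
# Gumbel (EV1) fit to a short synthetic annual-maximum series, method of moments.
import math, random

random.seed(1)
# Hypothetical 13-year annual-maximum hourly rainfall series (mm):
# each year's maximum drawn as the largest of 365 daily values
ams = [max(random.gauss(50, 12) for _ in range(365)) for _ in range(13)]

n = len(ams)
mean = sum(ams) / n
std = math.sqrt(sum((x - mean) ** 2 for x in ams) / (n - 1))

beta = std * math.sqrt(6) / math.pi      # Gumbel scale
mu = mean - 0.5772 * beta                # Gumbel location (Euler-Mascheroni const.)

def return_level(T):
    """Quantile with return period T years."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

print(return_level(100) > return_level(10))  # longer return period -> larger quantile
```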

  2. Nut crop yield records show that budbreak-based chilling requirements may not reflect yield decline chill thresholds

    Science.gov (United States)

    Pope, Katherine S.; Dose, Volker; Da Silva, David; Brown, Patrick H.; DeJong, Theodore M.

    2015-06-01

    Warming winters due to climate change may critically affect temperate tree species. Insufficiently cold winters are thought to result in fewer viable flower buds and the subsequent development of fewer fruits or nuts, decreasing the yield of an orchard or fecundity of a species. The best existing approximation for a threshold of sufficient cold accumulation, the "chilling requirement" of a species or variety, has been quantified by manipulating or modeling the conditions that result in dormant bud breaking. However, the physiological processes that affect budbreak are not the same as those that determine yield. This study sought to test whether budbreak-based chilling thresholds can reasonably approximate the thresholds that affect yield, particularly regarding the potential impacts of climate change on temperate tree crop yields. County-wide yield records for almond (Prunus dulcis), pistachio (Pistacia vera), and walnut (Juglans regia) in the Central Valley of California were compared with 50 years of weather records. Bayesian nonparametric function estimation was used to model yield potentials at varying amounts of chill accumulation. In almonds, average yields occurred when chill accumulation was close to the budbreak-based chilling requirement. However, in the other two crops, pistachios and walnuts, the best previous estimate of the budbreak-based chilling requirements was 19-32 % higher than the chilling accumulations associated with average or above average yields. This research indicates that physiological processes beyond requirements for budbreak should be considered when estimating chill accumulation thresholds of yield decline and potential impacts of climate change.
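
    Chill accumulation itself is a simple tally over winter temperatures. The sketch below uses the classic 0-7.2 °C "chill hours" band as an assumed metric, with hypothetical temperatures; the study's own analysis pairs chill accumulation with county yield records via Bayesian nonparametric function estimation:

```python
# Chill-hours tally: hours with temperature inside the chilling band.
# The band (0-7.2 degC) and the temperatures below are illustrative assumptions.
def chill_hours(hourly_temps_c, low=0.0, high=7.2):
    """Count hours whose temperature falls inside [low, high] degC."""
    return sum(low <= t <= high for t in hourly_temps_c)

winter = [3.0, 5.5, 8.0, -1.0, 6.9, 7.2, 2.2, 10.0]   # hypothetical hourly degC
accumulated = chill_hours(winter)
print(accumulated)  # 5 of the 8 hours fall in the chilling band
```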

  3. Annotation methods to develop and evaluate an expert system based on natural language processing in electronic medical records.

    Science.gov (United States)

    Gicquel, Quentin; Tvardik, Nastassia; Bouvry, Côme; Kergourlay, Ivan; Bittar, André; Segond, Frédérique; Darmoni, Stefan; Metzger, Marie-Hélène

    2015-01-01

    The objective of the SYNODOS collaborative project was to develop a generic IT solution, combining a medical terminology server, a semantic analyser and a knowledge base. The goal of the project was to generate meaningful epidemiological data for various medical domains from the textual content of French medical records. In the context of this project, we built a care pathway oriented conceptual model and a corresponding annotation method to develop and evaluate an expert system's knowledge base. The annotation method is based on a semi-automatic process, using a software application (MedIndex). This application communicates with a cross-lingual multi-termino-ontology portal. The annotator selects the most appropriate medical code proposed for the medical concept in question by the multi-termino-ontology portal and temporally labels the medical concept according to the course of the medical event. This choice of conceptual model and annotation method aims to create a generic database of facts for the secondary use of electronic health records data. PMID:26262366

  4. Seismic analysis of base-isolated liquid storage tanks

    Science.gov (United States)

    Shrimali, M. K.; Jangid, R. S.

    2004-08-01

    Three analytical studies of the seismic response of base-isolated, ground-supported cylindrical liquid storage tanks under recorded earthquake ground motion are presented. The continuous liquid mass of the tank is modelled as lumped masses referred to as the sloshing mass, impulsive mass and rigid mass. Firstly, the seismic response of isolated tanks is obtained using the modal superposition technique and compared with the exact response to study the effects of non-classical damping. The comparison of results for different tank aspect ratios and bearing stiffness and damping values indicates that the effects of non-classical damping are insignificant, implying that the response of isolated liquid storage tanks can be accurately obtained by modal analysis with the classical damping approximation. The second investigation involves the analysis of base-isolated liquid storage tanks using the response spectrum method, in which the peak response of the tank in different modes is obtained for the specified response spectrum of earthquake motion and combined with different combination rules. The results indicate that the peak response obtained by the response spectrum method matches well with the corresponding exact response. However, a suitable combination rule should be used for better estimation of the various response quantities of the isolated tanks. Finally, closed-form expressions for the modal parameters of base-isolated liquid storage tanks are derived and compared with the exact values. A simplified approximate method is also proposed to evaluate the seismic response of isolated tanks. The response obtained from this approximate method was found to be in good agreement with the exact response.
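The "combination rules" used in response-spectrum analysis merge the peak responses of individual modes into one design estimate. Two standard textbook rules are sketched below (the abstract does not say which rules the authors compared; these are generic examples with made-up modal values):

```python
import math

def srss(modal_peaks):
    """Square-Root-of-Sum-of-Squares combination of peak modal responses."""
    return math.sqrt(sum(r * r for r in modal_peaks))

def abs_sum(modal_peaks):
    """Absolute-sum rule: a conservative upper bound on the combined peak."""
    return sum(abs(r) for r in modal_peaks)
```

Because modal peaks rarely occur simultaneously, SRSS typically gives a lower (and usually more realistic) combined estimate than the absolute-sum bound.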

  5. Analysis of female radiation workers dose records in the DAE Units of Andhra Pradesh

    International Nuclear Information System (INIS)

    The basis for control of occupational exposures of women is the same as that for men, except for pregnant women. The percentage of women working in radiation areas of DAE has marginally increased over the last three decades. This paper analyses data on the externally received personal dose equivalent for female radiation workers who have been exposed to ionizing radiation in different occupations of DAE units in Andhra Pradesh. From this study we can say confidently that it is equally safe for women to work in radiation areas as long as they follow radiation protection principles. Hence, women in India should be made aware that it is safe to work in radiation areas and that DAE takes care of them through periodic medical checkups, maintenance of dose records, etc.

  6. Visual Similarity Based Document Layout Analysis

    Institute of Scientific and Technical Information of China (English)

    Di Wen; Xiao-Qing Ding

    2006-01-01

    In this paper, a visual similarity based document layout analysis (DLA) scheme is proposed, which by using a clustering strategy can adaptively deal with documents in different languages, with different layout structures and skew angles. Aiming at a robust and adaptive DLA approach, the authors first find a set of representative filters and statistics to characterize typical texture patterns in document images, through a visual similarity testing process. Texture features are then extracted from these filters and passed into a dynamic clustering procedure, called visual similarity clustering. Finally, text contents are located from the clustered results. Benefiting from this scheme, the algorithm demonstrates strong robustness and adaptability across a wide variety of documents, which previous traditional DLA approaches do not possess.

  7. Watermark Resistance Analysis Based On Linear Transformation

    Directory of Open Access Journals (Sweden)

    N.Karthika Devi

    2012-06-01

    Full Text Available Generally, a digital watermark can be embedded in any copyright image whose size is not larger than it. Watermarking schemes can be classified into two categories: spatial domain approaches and transform domain approaches. Previous works have shown that the transform domain scheme is typically more robust to noise, common image processing, and compression when compared with the spatial domain scheme. Improvements in the performance of watermarking schemes can be obtained by exploiting the characteristics of the human visual system (HVS) in the watermarking process. We propose a linear transformation based watermarking algorithm. The watermarking bits are embedded into a cover image to produce the watermarked image. The robustness of the watermark is checked using pre-defined attacks, and attack resistance analysis is done using BER (Bit Error Rate) calculation. Finally, the quality of the watermarked image can be assessed.
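The BER measure used here for attack resistance is simply the fraction of watermark bits flipped between embedding and extraction. A minimal sketch (the bit sequences are illustrative):

```python
def bit_error_rate(original_bits, extracted_bits):
    """Fraction of watermark bits flipped after an attack: the BER
    measure used to quantify attack resistance."""
    errors = sum(a != b for a, b in zip(original_bits, extracted_bits))
    return errors / len(original_bits)
```

A BER of 0 means the watermark survived the attack intact; values approaching 0.5 mean the extracted bits are no better than random.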

  8. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images that do not take advantage of its primary virtue: 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that they offer several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
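The core voxelization step can be sketched as binning a point cloud into a sparse occupancy grid. This is only the simplest occupancy idea, not the dissertation's radiometric transmission model; the points and voxel size are made up.

```python
from collections import defaultdict
import math

def voxelize(points, voxel_size):
    """Bin 3D points into a sparse voxel grid.
    Returns {(i, j, k) voxel index: point count}."""
    grid = defaultdict(int)
    for x, y, z in points:
        key = (math.floor(x / voxel_size),
               math.floor(y / voxel_size),
               math.floor(z / voxel_size))
        grid[key] += 1
    return dict(grid)

# Three hypothetical LIDAR returns, 1 m voxels
occupancy = voxelize([(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (1.5, 0.0, 0.0)], 1.0)
```

A sparse dictionary keyed by voxel index keeps memory proportional to the number of occupied voxels rather than the full bounding volume, which is what makes voxel analysis of large areas tractable.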

  9. Cognitive fusion analysis based on context

    Science.gov (United States)

    Blasch, Erik P.; Plano, Susan

    2004-04-01

    The standard fusion model includes active and passive user interaction in level 5 - "User Refinement". User refinement is more than just details of passive automation partitioning - it is the active management of information. While a fusion system can explore many operational conditions over myopic changes, the user has the ability to reason about the hyperopic "big picture." Blasch and Plano developed cognitive-fusion models that address user constraints including: intent, attention, trust, workload, and throughput to facilitate hyperopic analysis. To enhance user-fusion performance modeling (i.e. confidence, timeliness, and accuracy); we seek to explore the nature of context. Context, the interrelated conditions of which something exists, can be modeled in many ways including geographic, sensor, object, and environmental conditioning. This paper highlights user refinement actions based on context to constrain the fusion analysis for accurately representing the trade space in the real world. As an example, we explore a target identification task in which contextual information from the user's cognitive model is imparted to a fusion belief filter.

  10. ATLAS Recordings

    CERN Multimedia

    Steven Goldfarb; Mitch McLachlan; Homer A. Neal

    Web Archives of ATLAS Plenary Sessions, Workshops, Meetings, and Tutorials from 2005 until this past month are available via the University of Michigan portal here. Most recent additions include the Trigger-Aware Analysis Tutorial by Monika Wielers on March 23 and the ROOT Workshop held at CERN on March 26-27. Viewing requires a standard web browser with RealPlayer plug-in (included in most browsers automatically) and works on any major platform. Lectures can be viewed directly over the web or downloaded locally. In addition, you will find access to a variety of general tutorials and events via the portal. Feedback welcome: our group is making arrangements now to record plenary sessions, tutorials, and other important ATLAS events for 2007. Your suggestions for potential recordings, as well as your feedback on existing archives, are always welcome. Please contact us at wlap@umich.edu. Thank you. Enjoy the Lectures!

  11. Camera-Vision Based Oil Content Prediction for Oil Palm (Elaeis Guineensis Jacq Fresh Fruits Bunch at Various Recording Distances

    Directory of Open Access Journals (Sweden)

    Dinah Cherie

    2015-01-01

    Full Text Available In this study, the correlation between oil palm fresh fruit bunch (FFB) appearance and its oil content (OC) was explored. FFB samples were recorded from various distances (2, 7, and 10 m) with different lighting spectrums and configurations (Ultraviolet: 280-380 nm, Visible: 400-700 nm, and Infrared: 720-1100 nm) and intensities (600 watt and 1000 watt lamps) to explore the correlations. The recorded FFB images were segmented and their color features were subsequently extracted to be used as input variables for modeling the OC of the FFB. Four developed models were selected to perform oil content prediction (OCP) for intact FFBs. These models were selected based on their validity and accuracy in performing the OCP. Models were developed using Multi-Linear-Perceptron-Artificial-Neural-Network (MLP-ANN) methods, employing 10 hidden layers and 15 image features as input variables. Statistical engineering software was used to create the models. Although the number of FFB samples in this study was limited, four models were successfully developed to predict intact FFB OC based on image color features. Three OCP models were developed for images recorded at 10 m under UV, Vis2, and IR2 lighting configurations. Another model was successfully developed for short-range imaging (2 m) under IR2 light. The coefficients of correlation for the models when validated were 0.816, 0.902, 0.919, and 0.886, respectively. For bias and error, these selected models obtained root-mean-square errors (RMSE) of 1.803, 0.753, 0.607, and 1.104, respectively.

  12. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data are gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can be the bridge to visualize the data and to obtain the most understandable model for all stakeholders. For the analysis of geodata based intelligence data, a software application was developed as a working environment that combines geodata with optimized ergonomics, essentially facilitating interaction with the common operational picture (COP). The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result a COP is generated and an interaction suitable for the specific workspace is added. This allows the users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management operations are organized.

  13. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Directory of Open Access Journals (Sweden)

    Sergio Miranda Freire

    Full Text Available This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when

  14. Maintenance of a Computerized Medical Record Form

    OpenAIRE

    Steichen, Olivier; Rossignol, Patrick; Daniel-Lebozec, Christel; Charlet, Jean; Jaulent, Marie-Christine; Degoulet, Patrice

    2007-01-01

    Structured entry forms for clinical records should be updated to take into account the physicians’ needs during consultation and advances in medical knowledge and practice. We updated the computerized medical record form of a hypertension clinic, based on its previous use and clinical guidelines. A statistical analysis of previously completed forms identified several unnecessary items rarely used by clinicians. A terminological analysis of guidelines and of free-text answers on completed form...

  15. Security of the distributed electronic patient record: a case-based approach to identifying policy issues.

    Science.gov (United States)

    Anderson, J G

    2000-11-01

    The growth of managed care and integrated delivery systems has created a new commodity, health information and the technology that it requires. Surveys by Deloitte and Touche indicate that over half of the hospitals in the US are in the process of implementing electronic patient record (EPR) systems. The National Research Council has established that the industry spends as much as $15 billion on information technology (IT), an amount that is expanding by 20% per year. The importance of collecting, electronically storing, and using the information is undisputed. This information is needed by consumers to make informed choices; by physicians to provide appropriate quality clinical care; and by health plans to assess outcomes, control costs and monitor quality. The collection, storage and communication of a large variety of personal patient data, however, present a major dilemma. How can we provide the data required by the new forms of health care delivery and at the same time protect the personal privacy of patients? Recent debates concerning medical privacy legislation, software regulation, and telemedicine suggest that this dilemma will not be easily resolved. The problem is systemic and arises out of the routine use and flow of information throughout the health industry. Health care information is primarily transferred among authorized users. Not only is the information used for patient care and financial reimbursement, secondary users of the information include medical, nursing, and allied health education, research, social services, public health, regulation, litigation, and commercial purposes such as the development of new medical technology and marketing. The main threats to privacy and confidentiality arise from within the institutions that provide patient care as well as institutions that have access to patient data for secondary purposes. PMID:11154961

  16. Cystic Echinococcosis Epidemiology in Spain Based on Hospitalization Records, 1997-2012

    Science.gov (United States)

    Siles-Lucas, Mar; Aparicio, Pilar; Lopez-Velez, Rogelio; Gherasim, Alin; Garate, Teresa; Benito, Agustín

    2016-01-01

    Background Cystic echinococcosis (CE) is a parasitic disease caused by the tapeworm Echinococcus granulosus. Although present throughout Europe, deficiencies in the official reporting of CE result in under-reporting and misreporting of this disease, which in turn is reflected in the wrong opinion that CE is not an important health problem. By using an alternative data source, this study aimed at describing the clinical and temporal-spatial characteristics of CE hospitalizations in Spain between 1997 and 2012. Methodology/Principal Findings We performed a retrospective descriptive study using the Hospitalization Minimum Data Set (CMBD in Spanish). All CMBD hospital discharges with an echinococcosis diagnosis in first diagnostic position were reviewed. Hospitalization rates were computed and clinical characteristics were described. The spatial and temporal distribution of hospital discharges was also assessed. Between 1997 and 2012, 14,010 hospitalizations with a diagnosis of CE were recorded; 55% were men and 67% were aged over 45 years. Pediatric hospitalizations occurred during the whole study period. Overall, 95.2% were discharged home, and only 1.7% died (exitus). The average cost was 8,439.11 €. The hospitalization rate per 100,000 per year showed a decreasing trend during the study period. All the autonomous communities registered discharges, even those considered non-endemic. Maximum rates were reached by Extremadura, Castilla-Leon and Aragon. Comparison of the CMBD data and the official Compulsory Notifiable Diseases (CND) reports from 2005 to 2012 showed that official data were lower than registered hospitalization discharges. Conclusions The distribution of hospitalizations was uneven by year and autonomous region. Although CE hospitalization rates have decreased considerably due to the success of control programs, CE remains a public health problem due to its severity and economic impact. Therefore, it would be desirable to improve its oversight and

  17. Data base of array characteristics instrument response and data, recorded at NNC

    Energy Technology Data Exchange (ETDEWEB)

    Bushueva, E.A.; Ermolenko, E.A.; Efremova, N.A. [and others

    1996-12-01

    The northern and east-northern parts of the Republic of Kazakstan are highly favorable for the placing of seismic stations. There is a very low level of natural and industrial seismic noise, and the rocks of the Kazakh epi-Hercynian platform have very good transmissive properties. Geophysical observatories (GOs), now belonging to the Institute of Geophysical Researches of the National Nuclear Center of the Republic of Kazakstan (IGR NNC RK), were established in specially selected low-noise locations in Northern Kazakstan, in accordance with the Soviet program for nuclear weapons test monitoring. In 1994, these GOs were transferred by the Russian Federation into the possession of Kazakstan. The locations of the GOs are shown in Fig. 1. According to studies of seismic noise jointly carried out by scientists from IGR and IRIS, the sites of the `Borovoye` and `Kurchatov` seismic stations are among the best places for seismic observations in the world. Seismic arrays exist at two of the four observatories (`Aktiubinsk`, `Borovoye`, `Kurchatov` and `Makanchi`): `Borovoye` and `Kurchatov`. These two observatories are described in this report. The history of the geophysical observatories and the conditions of equipment operation (climatic, geological and so on) are presented, the equipment of the GOs and seismic arrays is described, and samples of digital seismograms recorded on various types of equipment are presented. GO `Borovoye` is described in the 2nd chapter and GO `Kurchatov` in the 3rd chapter of the report. The main results of the work are presented in the conclusion. A list of cited papers and lists of tables and figures are given at the end of the report. 14 refs., 95 figs., 12 tabs.

  18. Threshold-based system for noise detection in multilead ECG recordings

    International Nuclear Information System (INIS)

    This paper presents a system for detection of the most common noise types seen on the electrocardiogram (ECG) in order to evaluate whether an episode from 12-lead ECG is reliable for diagnosis. It implements criteria for estimation of the noise corruption level in specific frequency bands, aiming to identify the main sources of ECG quality disruption, such as missing signal or limited dynamics of the QRS components above 4 Hz; presence of high amplitude and steep artifacts seen above 1 Hz; baseline drift estimated at frequencies below 1 Hz; power–line interference in a band ±2 Hz around its central frequency; high-frequency and electromyographic noises above 20 Hz. All noise tests are designed to process the ECG series in the time domain, including 13 adjustable thresholds for amplitude and slope criteria which are evaluated in adjustable time intervals, as well as number of leads. The system allows flexible extension toward application-specific requirements for the noise levels in acceptable quality ECGs. Training of different thresholds’ settings to determine different positive noise detection rates is performed with the annotated set of 1000 ECGs from the PhysioNet database created for the Computing in Cardiology Challenge 2011. Two implementations are highlighted on the receiver operating characteristic (area 0.968) to fit to different applications. The implementation with high sensitivity (Se = 98.7%, Sp = 80.9%) appears as a reliable alarm when there are any incidental problems with the ECG acquisition, while the implementation with high specificity (Sp = 97.8%, Se = 81.8%) is less susceptible to transient problems but rather validates noisy ECGs with acceptable quality during a small portion of the recording. (paper)
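The time-domain amplitude and slope criteria described above can be sketched as a single adjustable-threshold check on an ECG segment. This is a toy version of the system (the paper uses 13 thresholds across several frequency bands); the limit values and function name here are made-up placeholders.

```python
def flag_noisy_segment(samples_mv, amp_limit_mv=5.0, slope_limit_mv=2.0):
    """Flag an ECG segment as noisy when its peak amplitude or its
    sample-to-sample slope exceeds adjustable thresholds.
    Thresholds are illustrative, not the paper's trained values."""
    peak = max(abs(s) for s in samples_mv)
    max_slope = max(abs(b - a) for a, b in zip(samples_mv, samples_mv[1:]))
    return peak > amp_limit_mv or max_slope > slope_limit_mv
```

Raising the thresholds trades sensitivity for specificity, which is exactly the trade-off the two highlighted operating points (Se = 98.7% vs. Sp = 97.8%) represent on the receiver operating characteristic.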

  19. Methodological insights on information technologies and distributional analysis for the archaeological record in stratigraphic excavation contexts: the case study of Neolithic settlement of Sammardenchia

    Directory of Open Access Journals (Sweden)

    Cecilia Milantoni

    2008-06-01

    Full Text Available This paper deals with the methodology applied to the management of records from excavations, usually represented by a huge amount of finds and documents. A further aim is to experiment with distributional analysis of the archaeological record using a GIS software package. The Neolithic site of Sammardenchia represents a case study for approaching techniques for digitizing records collected over several seasons of fieldwork and for experimenting with distribution analysis. The results allow us to discuss the latent structure and function of the studied areas, which lack evident structural features.

  20. Rehabilitation System based on the Use of Biomechanical Analysis and Videogames through the Kinect Sensor

    Directory of Open Access Journals (Sweden)

    John E. Muñoz-Cardona

    2013-11-01

    Full Text Available This paper presents the development of a novel system for the physical rehabilitation of patients with multiple pathologies, through dynamics with exercise videogames (exergames) and analysis of the patients' movements using purpose-built software. The system is based on the use of the Kinect sensor for both purposes: amusing the patient in therapy through specialist exergames, and providing a tool to record and analyze MoCap data taken through the Kinect sensor and processed by biomechanical analysis using Euler angles. The whole interactive system is installed in a rehabilitation center and works with different pathologies (stroke, IMOC, cranioencephalic trauma, etc.); patients interact with the platform while the specialist records data for later analysis, which is performed by software designed for this purpose. The motion graphs are shown in the sagittal, frontal and rotational planes from 20 points distributed over the body. The final system is portable, non-invasive, inexpensive, offers natural interaction with the patient, and is easily implemented for medical purposes.
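The biomechanical analysis of skeleton points can be reduced, in its simplest form, to computing the angle at a joint from three tracked 3D positions (e.g. shoulder-elbow-wrist for elbow flexion). A minimal sketch assuming hypothetical Kinect coordinates; the full system works with Euler angles over 20 points.

```python
import math

def joint_angle_deg(a, b, c):
    """Angle (degrees) at joint b formed by segments b->a and b->c,
    from three 3D points, e.g. Kinect skeleton coordinates."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))
```

Logging this angle frame by frame during an exergame session yields the motion curves the specialist reviews afterwards.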

  1. The CONTENT project: a problem-oriented, episode-based electronic patient record in primary care

    Directory of Open Access Journals (Sweden)

    Gunter Laux

    2005-12-01

    The aims are strictly scientific and the underlying hypothesis is that the knowledge-gaining process can be accelerated by combining the experience of many, especially with respect to complex interactions of factors and the analysis of rare events. Aside from maintaining a morbidity registry, within the CONTENT framework various prospective and retrospective studies on particular epidemiological and health economic research topics will be conducted.

  2. Determination of evapotranspiration in a plant cover, based on neutron gauge recordings of water profiles

    International Nuclear Information System (INIS)

    After describing the principle of the method used and the measurement installation set up on the Ivory Coast by ORSTOM (Overseas Scientific and Technical Research Organization), the author describes the results obtained. From a statistical analysis of these results one can determine the accuracy which can be anticipated from the method and the extent to which different factors are responsible for the scatter of the measurement values
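The quantity derived from successive neutron-gauge water profiles is essentially a water balance: evapotranspiration equals rainfall plus the loss in stored soil water, minus drainage. A minimal sketch; the variable names, units (mm of water) and the neglect of runoff are simplifying assumptions for illustration.

```python
def evapotranspiration_mm(storage_before_mm, storage_after_mm,
                          rain_mm=0.0, drainage_mm=0.0):
    """Water-balance estimate of evapotranspiration between two
    soil-moisture profile readings: ET = rain + storage loss - drainage."""
    return rain_mm + (storage_before_mm - storage_after_mm) - drainage_mm
```

The statistical analysis mentioned above then concerns how measurement scatter in the profile readings propagates into this difference.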

  3. Pattern recognition on X-ray fluorescence records from Copenhagen lake sediments using principal component analysis

    DEFF Research Database (Denmark)

    Schreiber, Norman; Garcia, Emanuel; Kroon, Aart;

    2014-01-01

    Principal Component Analysis (PCA) was performed on chemical data from two sediment cores from an urban fresh-water lake in Copenhagen, Denmark. X-ray fluorescence (XRF) core scanning provided the underlying datasets on 13 variables (Si, K, Ca, Ti, Cr, Mn, Fe, Ni, Cu, Zn, Rb, Cd, Pb). Principal...... depths. The sediments featured a temporal association with contaminant dominance. Lead contamination was superseded by zinc within the compound pattern, which was linked to changing contamination sources over time. Principal Component Analysis was useful to visualize and interpret geochemical XRF data...
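The core PCA idea can be shown on just two of the elements (say Pb and Zn counts): standardize, correlate, and read the leading eigenvalue of the correlation matrix, which for the 2×2 case has the closed form 1 + |r|. This is a toy two-variable sketch with made-up data, not the 13-variable analysis of the study.

```python
import math

def pca_2d(xs, ys):
    """First principal component of two standardized variables via the
    closed-form eigenvalues (1 +/- |r|) of the 2x2 correlation matrix.
    Returns (correlation r, fraction of variance explained by PC1)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    r = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n * sx * sy)
    explained = (1.0 + abs(r)) / 2.0   # leading eigenvalue over trace (= 2)
    return r, explained
```

Strongly correlated element profiles collapse onto one component, which is how PCA compresses 13 XRF variables into a few interpretable contamination patterns.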

  4. Performance evaluation of a web-based system to exchange Electronic Health Records using Queueing model (M/M/1).

    Science.gov (United States)

    de la Torre, Isabel; Díaz, Francisco Javier; Antón, Míriam; Martínez, Mario; Díez, José Fernando; Boto, Daniel; López, Miguel; Hornero, Roberto; López, María Isabel

    2012-04-01

    Response time measurement of a web-based system is essential to evaluate its performance. This paper shows a comparison of the response times of a web-based system for Ophthalmologic Electronic Health Records (EHRs), TeleOftalWeb, using different database models: Oracle 10g, dbXML 2.0, Xindice 1.2, and eXist 1.1.1. Modelling the system with tandem queueing networks allows us to estimate the service times of the different components of the system (CPU, network and databases). In order to calculate the times associated with the different databases, benchmarking techniques are used. The final objective of the comparison is to choose the database system resulting in the lowest response time for TeleOftalWeb and to compare the obtained results using a new benchmark. PMID:20703642
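The M/M/1 model in the title gives the basic relation behind such service-time estimates: with Poisson arrivals at rate λ and exponential service at rate μ, the mean response time is W = 1/(μ − λ). A minimal sketch with illustrative rates:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics: utilization rho = lambda/mu and
    mean response time W = 1/(mu - lambda)."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate
    w = 1.0 / (service_rate - arrival_rate)
    return rho, w
```

Given measured end-to-end response times, the relation can be inverted to back out the effective service rate of each component (CPU, network, database) in the tandem network.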

  5. Medical aspects of the development of an environmental data base taking into account experience in keeping population related cancer records

    International Nuclear Information System (INIS)

    The paper relates to the medical aspects of the development of an environmental data base, taking into account experience in keeping population-related cancer records. Information on the data, and on the problems associated with maintaining such data bases, of the National Cancer Registry (NCR) of the former GDR is presented. NCR data on cancer occurrence in individuals or groups cover the entire territory of the former GDR; they can be combined with data on occupational or environmental exposure available from other institutions. Information on radiation-related health risks in the southern districts (e.g. Saxony) is also available. Data were used for health planning, epidemiological research, etc. It is concluded that in collecting data, the different items should be classified as core items and optional items with appropriate priority. (author). 9 refs

  6. Rehabilitation System based on the Use of Biomechanical Analysis and Videogames through the Kinect Sensor

    OpenAIRE

    John E. Muñoz-Cardona; Oscar A. Henao-Gallo; José F. López-Herrera

    2013-01-01

    This paper presents the development of a novel system for the physical rehabilitation of patients with multiple pathologies, through dynamics with exercise videogames (exergames) and analysis of patients' movements using purpose-developed software. The system relies on the Kinect sensor for both purposes: amusing the patient in therapy through specialist exergames, and providing a tool to record and analyze MoCap data taken through the Kinect sensor and processed using biomechanical analys...

  7. Analysis and classification of oximetry recordings to predict obstructive sleep apnea severity in children.

    Science.gov (United States)

    Gutierrez-Tobal, Gonzalo C; Kheirandish-Gozal, Leila; Alvarez, Daniel; Crespo, Andrea; Philby, Mona F; Mohammadi, Meelad; Del Campo, Felix; Gozal, David; Hornero, Roberto

    2015-08-01

    The current study focuses on the potential use of oximetry to determine obstructive sleep apnea-hypopnea syndrome (OSAHS) severity in children. Single-channel SpO2 recordings from 176 children were divided into three severity groups according to the apnea-hypopnea index (AHI), and features extracted from the recordings were fed to a multi-layer perceptron (MLP) neural network in order to classify children into one of the three OSAHS severity groups. Following our MLP multiclass approach, a diagnostic protocol with the capability to reduce the need for polysomnography tests by 46% could be derived. Moreover, our proposal can also be evaluated in a binary classification task for two common AHI diagnostic cutoffs (AHI = 1 e/h and AHI = 5 e/h). High diagnostic ability was reached in both cases (84.7% and 85.8% accuracy, respectively), outperforming the clinical variable ODI3 as well as other measures reported in recent studies. These results suggest that the information contained in SpO2 could be helpful in pediatric OSAHS severity detection. PMID:26737304

  8. Statistical analysis of low-frequency noise recorded in ASIAEX by a PANDA system

    Science.gov (United States)

    Potter, John R.; Beng, Koay T.; Pallayil, Venugopalan

    2002-11-01

    As part of the ASIAEX experiment, the Acoustic Research Laboratory in the Tropical Marine Science Institute at the National University of Singapore deployed a Pop-up Ambient Noise Data Acquisition (PANDA) system. The PANDA was recovered 18 days later with over 9 days of continuous data recorded from a single hydrophone at 2 kSa/s. The data show the various sources that were deployed as part of the experiment, but also provide interesting statistical information on low-frequency ambient noise in the region and the passage of numerous ships. This deployment was in a heavy shipping traffic area, hostile both in terms of potential snagging by fishing activity and in terms of the high levels of noise encountered, both of which are of interest for the deployment and successful use of autonomous acoustic systems in busy littoral waters. We present some statistical results from the 3+ GByte of data. [Work supported by the Defence Science and Technology Agency, Singapore and the US ONR.]

  9. Congenital anomalies in children with cerebral palsy: a population-based record linkage study

    DEFF Research Database (Denmark)

    Rankin, Judith; Cans, Christine; Garne, Ester;

    2010-01-01

    Our aim was to determine the proportion of children with cerebral palsy (CP) who have a congenital anomaly (CA) in three regions (Isère Region, French Alps; Funen County, Denmark; Northern Region, England) where population-based CP and CA registries exist, and to classify the children according to...

  10. Operating cost analysis of anaesthesia: Activity based costing (ABC analysis)

    Directory of Open Access Journals (Sweden)

    Majstorović Branislava M.

    2011-01-01

    Full Text Available Introduction. Costs of anaesthesiology represent defined measures to determine a precise profile of expenditure estimation for surgical treatment, which is important for planning healthcare activities, prices and budget. Objective. In order to determine the actual value of anaesthesiological services, we performed an activity based costing (ABC) analysis. Methods. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and others: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with direct expenditure, per "cost object" (service or unit), of the Republican Health-care Insurance, using summary data from the Departments of Anaesthesia documented in the database of the Clinical Centre of Serbia. Numerical data were estimated and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Health-care Insurance. Results. Of the direct costs, 40% were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Health-care Insurance. Conclusion. Anaesthesia increases the cost of surgical treatment of patients by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. Fixed elements of the direct costs provide the possibility of rationalizing resource use in anaesthesia.

  11. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    Science.gov (United States)

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  12. Compression Record Based Efficient k-Medoid Algorithm to Increase Scalability and Efficiency

    Directory of Open Access Journals (Sweden)

    Archana Kumari, Hritu Bhagat

    2013-08-01

    Full Text Available Clustering analysis is a descriptive task that seeks to identify homogeneous groups of objects based on the values of their attributes. K-medoid clustering algorithms are widely used for many practical applications. The original K-medoid algorithm selects initial centroids and medoids randomly, which affects the quality of the resulting clusters and sometimes generates unstable and empty clusters which are meaningless. The original k-means algorithm is computationally expensive and requires time proportional to the product of the number of data items, number of clusters and the number of iterations. The improved k-Medoid clustering algorithm has higher accuracy than the original.
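    As a concrete reference point for the algorithm family discussed above, here is a minimal PAM-style k-medoids sketch with random initial medoids and alternating assignment/update steps; it illustrates the baseline behavior the abstract criticizes, not the improved algorithm. The 1-D toy data are hypothetical.

```python
import random

def k_medoids(points, k, dist=lambda a, b: abs(a - b), iters=100, seed=0):
    """Plain PAM-style k-medoids on a list of points (1-D here for brevity).
    Random initial medoids, then alternate assign/update until stable."""
    rng = random.Random(seed)
    medoids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest medoid's cluster.
        clusters = {m: [] for m in medoids}
        for p in points:
            nearest = min(medoids, key=lambda m: dist(p, m))
            clusters[nearest].append(p)
        # Update step: the new medoid minimizes total in-cluster distance.
        new_medoids = [
            min(members, key=lambda c: sum(dist(c, p) for p in members))
            for members in clusters.values() if members
        ]
        if set(new_medoids) == set(medoids):
            break  # converged
        medoids = new_medoids
    return sorted(medoids)

# Two obvious groups around 1.0 and 10.0
print(k_medoids([1.0, 1.1, 0.9, 10.0, 10.2, 9.8], k=2))
```

Note the random initialization the abstract objects to: a poor initial draw (both medoids in one group) costs extra iterations, and on degenerate data can leave empty clusters, which is what the improved variants try to avoid.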

  13. The role of home-based records in the establishment of a continuum of care for mothers, newborns, and children in Indonesia

    OpenAIRE

    Osaki, Keiko; Hattori, Tomoko; Kosen, Soewarta

    2013-01-01

    Background: The provision of appropriate care along the continuum of maternal, newborn, and child health (MNCH) service delivery is a challenge in developing countries. To improve this, in the 1990s, Indonesia introduced the maternal and child health (MCH) handbook, as an integrated form of parallel home-based records. Objective: This study aimed to identify the roles of home-based records both before and after childbirth, especially in provinces where the MCH handbook (MCHHB) was extensively ...

  14. An analysis of flight Quick Access Recorder (QAR) data and its applications in preventing landing incidents

    International Nuclear Information System (INIS)

    A long landing is a type of flight incident that multiplies the risk of a runway excursion. It occurs frequently but has received little research attention owing to the difficulty of obtaining real flight data. The aim of this paper is to identify key flight parameter features of long landing incidents by analyzing Quick Access Recorder (QAR) data, and at the same time to put forward prevention measures from the perspective of pilot operation. First, 73 flight performance parameter variables and 4 operation parameter variables were defined, covering the major landing stages from 1500 ft to touchdown. Then 128 cases of selected QAR data were divided into two groups according to the threshold separating normal and long landings. Second, each flight parameter variable of these 128 flights was compared between groups, and logistic and linear regression models were then developed to further examine the links between touchdown distance and these flight parameter variables. Third, flight operations potentially causing the performance differences of long landing incidents were also analyzed. Finally, the results indicate that the period from 200 ft to touchdown is the key stage of landing and that the flare is the most critical operation affecting touchdown distance. It is suggested that the pilot inspect the ratio of descent rate to groundspeed carefully at a height of 50 ft, and that a faster and steadier pull-up of the control column probably helps achieve an excellent flare and landing. The findings are expected to be applied in flight operation practice to further prevent long landing incidents and even runway excursion accidents

  15. North Cascade Glacier Annual Mass Balance Record Analysis 1984-2013

    Science.gov (United States)

    Pelto, M. S.

    2014-12-01

    The North Cascade Glacier Climate Project (NCGCP) was founded in 1983 to monitor 10 glaciers throughout the range and identify their response to climate change. The annual observations include mass balance, terminus behavior, glacier surface area and accumulation area ratio (AAR). Annual mass balance (Ba) measurements have been continued on the 8 original glaciers that still exist. Two glaciers have disappeared: the Lewis Glacier and Spider Glacier. In 1990, Easton Glacier and Sholes Glacier were added to the annual balance program to offset the loss. One other glacier, Foss Glacier, has declined to the extent that continued measurement will likely not be possible. Here we examine the 30 year long Ba time series from this project. All of the data have been reported to the World Glacier Monitoring Service (WGMS). This comparatively long record from glaciers in one region, conducted by the same research program using the same methods, offers some useful comparative data. Degree day factors for melt of 4.3 mm w.e.°C-1d-1 for snow and 6.6 mm w.e.°C-1d-1 for ice have been determined from 412 days of ablation observation. The variation in the AAR for equilibrium Ba is small, ranging from 60 to 67. The mean annual balance of the glaciers from 1984-2013 is -0.45 ma-1, ranging from -0.31 to -0.57 ma-1 for individual glaciers. The correlation coefficient of Ba is above 0.80 between all glaciers including the USGS benchmark glacier, South Cascade Glacier. This indicates that the response is to regional climate change, not local factors. The mean annual balance of -0.45 ma-1 is close to the WGMS global average for this period of -0.50 ma-1. The cumulative loss of 13.5 m w.e. and 15 m of ice thickness represents more than 20% of the volume of the glaciers.
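    The reported degree-day factors plug into the standard degree-day melt model, melt = DDF x PDD, where PDD is the sum of positive daily mean temperatures. A minimal sketch using the abstract's factors and a hypothetical week of temperatures:

```python
# Degree-day melt model behind the reported factors:
# melt (mm w.e.) = DDF * positive degree-days (PDD).
DDF_SNOW = 4.3  # mm w.e. per deg C per day, from the abstract
DDF_ICE = 6.6

def degree_day_melt(daily_mean_temps_c, ddf):
    """Melt in mm water equivalent from a series of daily mean temperatures."""
    pdd = sum(t for t in daily_mean_temps_c if t > 0)  # positive degree-days
    return ddf * pdd

# Hypothetical week of summer temperatures (deg C)
temps = [5.0, 7.5, 3.0, -1.0, 8.0, 6.5, 4.0]
print(degree_day_melt(temps, DDF_SNOW))  # snow surface
print(degree_day_melt(temps, DDF_ICE))   # exposed ice melts faster
```

The higher ice factor reflects the lower albedo of bare ice, which is why late-season melt accelerates once the snow cover is gone.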

  16. Assessment of providers' referral decisions in Rural Burkina Faso: a retrospective analysis of medical records

    Directory of Open Access Journals (Sweden)

    Ilboudo Tegawende

    2012-03-01

    Full Text Available Abstract Background A well-functioning referral system is fundamental to primary health care delivery, so understanding the providers' referral decision-making process becomes critical. This study's aim was to assess the correctness of diagnoses and the appropriateness of providers' referral decisions from health centers (HCs) to district hospitals (DHs) among patients with severe malaria and pneumonia. Methods A record review covering twelve months of consultations was conducted in eight randomly selected HCs to identify severe malaria (SM) cases among children under five and pneumonia cases among adults. The correctness of the diagnosis and the appropriateness of providers' referral decisions were determined using the National Clinical Guidebook as a 'gold standard'. Results Among the 457 SM cases affecting children under five, only 66 cases (14.4%) were correctly diagnosed, and of those 66 correctly diagnosed cases, 40 (60.6%) received an appropriate referral decision from their providers. Among the adult pneumonia cases, only 5.9% (79/1331) of the diagnoses were correct; however, the appropriateness rate of the providers' referral decisions was 98.7% (78/79). There was only one case that should not have been referred but was referred. Conclusions Adherence to the National Guidelines among the health center providers when making a diagnosis was low for both severe malaria cases and pneumonia cases. The appropriateness of the referral decisions was particularly poor for children with severe malaria. Health center providers need to be better trained in the diagnostic process and in disease management in order to improve the performance of the referral system in rural Burkina Faso.

  17. Independent component analysis of gait-related movement artifact recorded using EEG electrodes during treadmill walking.

    OpenAIRE

    Kristine Lynne Snyder

    2015-01-01

    There has been a recent surge in the use of electroencephalography (EEG) as a tool for mobile brain imaging due to its portability and fine time resolution. When EEG is combined with independent component analysis (ICA) and source localization techniques, it can model electrocortical activity as arising from temporally independent signals located in spatially distinct cortical areas. However, for mobile tasks, it is not clear how movement artifacts influence ICA and source localization. We de...

  18. Independent Component Analysis of Gait-Related Movement Artifact Recorded using EEG Electrodes during Treadmill Walking

    OpenAIRE

    Snyder, Kristine L.; Kline, Julia E.; Huang, Helen J.; Ferris, Daniel P

    2015-01-01

    There has been a recent surge in the use of electroencephalography (EEG) as a tool for mobile brain imaging due to its portability and fine time resolution. When EEG is combined with independent component analysis (ICA) and source localization techniques, it can model electrocortical activity as arising from temporally independent signals located in spatially distinct cortical areas. However, for mobile tasks, it is not clear how movement artifacts influence ICA and source localization. We de...

  19. III-V strain layer superlattice based band engineered avalanche photodiodes (Presentation Recording)

    Science.gov (United States)

    Ghosh, Sid

    2015-08-01

    Laser detection and ranging (LADAR)-based systems operating in the Near Infrared (NIR) and Short Wave Infrared (SWIR) have become popular optical sensors for remote sensing, medical, and environmental applications. Sophisticated laser-based radar and weapon systems used for long-range military and astronomical applications need to detect, recognize, and track a variety of targets under a wide spectrum of atmospheric conditions. Infrared APDs play an important role in LADAR systems by integrating the detection and gain stages in a single device. Robust silicon APDs are limited to the visible and very near infrared region, leaving longer-wavelength (> 3 um) infrared photon detection applications to other material systems. Recently, various research groups (including Ghosh et al.) have reported SWIR and MWIR HgCdTe APDs on CdZnTe and Si substrates. However, HgCdTe APDs suffer from low breakdown fields due to material defects, and excess noise increases significantly at high electric fields. During the past decade, the InAs/GaSb Strain Layer Superlattice (SLS) material system has emerged as a potential material for the entire infrared spectrum because of relatively easier growth, comparable absorption coefficients, lower tunneling currents and longer Auger lifetimes, resulting in enhanced detectivities (D*). Band engineering in type II SLS allows us to engineer the avalanche properties of electrons and holes. This is a great advantage over bulk InGaAs and HgCdTe APDs, where engineering avalanche properties is not possible. The talk will discuss the evolution of superlattice based avalanche photodiodes and some of the recent results on the work being done at Raytheon on SWIR avalanche photodiodes.

  20. Damage detection of a large structure based on strong motion record. Theory of adaptive forward-backward Kalman filter

    International Nuclear Information System (INIS)

    The report presents a new system identification procedure for a time-varying system, named the 'Adaptive Forward-Backward Kalman Filter (AFB-KF)', to estimate the natural frequency transition of a damaged building from a strong seismic motion record. The AFB-KF differs from the conventional Kalman filter in three points: (1) a forgetting factor applied to the covariance functions, to track time-varying structural parameters rapidly; (2) a time-backward estimation scheme and a global iteration scheme over the forward processes, to estimate unknown initial values of the structural parameters; (3) a time-series renewal algorithm for the statistical properties that feeds back information from previous analyses, to improve identification accuracy. The method accurately identifies the natural frequency transition of a building during an earthquake, which is useful for structural health monitoring that evaluates structural integrity. (author)
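    A full AFB-KF is beyond the scope of an abstract, but the role of the forgetting factor in point (1) can be sketched with a scalar Kalman filter: inflating the prior covariance by lam >= 1 each step keeps the gain from vanishing, so the estimate keeps tracking a parameter that shifts, such as a natural frequency dropping after damage. All values below are hypothetical, not the paper's data.

```python
# Minimal sketch (not the paper's AFB-KF): a scalar Kalman filter tracking a
# slowly varying parameter, with a forgetting factor lam >= 1 that inflates
# the prior covariance so the filter keeps adapting to recent data.
def kalman_track(measurements, meas_var=1.0, lam=1.05):
    x, p = measurements[0], 1.0  # state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p = p * lam                  # forgetting factor: down-weight old data
        k = p / (p + meas_var)       # Kalman gain
        x = x + k * (z - x)          # measurement update
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Hypothetical "natural frequency" readings that drop after damage at step 10
readings = [2.0] * 10 + [1.5] * 10
est = kalman_track(readings)
print(est[-1])  # moves toward the post-damage value of 1.5
```

With lam = 1 the gain decays toward zero and the filter effectively stops updating; lam > 1 bounds the gain from below, which is the tracking behavior the AFB-KF exploits.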

  1. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and causes billions of dollars in damages each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive in order to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy

  2. Statistical analysis of dewpoint data record at OWL-1 loop cubicle and its application to early detection of abnormal water leakage from the loop

    International Nuclear Information System (INIS)

    A method of statistical analysis was applied to the dewpoint data record of both supply and exhaust air for the OWL-1 (Oarai Water Loop No.1) loop cubicle in the JMTR (Japan Materials Testing Reactor). The primary purpose of the present study is to determine experimentally the dynamic interrelationship between these two quantities and then to evaluate the effectiveness of incorporating the information thus derived on the dewpoint dynamics into a water leakage monitoring system for the OWL-1 loop. The data analysis showed that the dynamics of dewpoint between the supply and exhaust air basically contain two modes: a fast mode with a time constant of about ten minutes and a slow one with a time constant of several hours. Under normal conditions without any water leakage from the loop, variations of the exhaust dewpoint at frequencies below about 2 cycles/hour are mostly due to those of the supply dewpoint. Based upon these results, a simple filter including the dewpoint dynamics was designed in an attempt to develop an efficient leak monitor for the OWL-1 loop system. This filter was applied to the dewpoint data record from the period when the OWL-1 loop underwent an abnormal water leakage in the 43rd cycle of JMTR operation. The results of the analysis indicate the potential usefulness of the present method for detecting abnormal water leakage at an early stage. The basic idea for the leak monitor proposed here is considered applicable also to water leakage detection at power reactor plants in general. (author)
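    The leak-monitor idea, predicting the exhaust dewpoint from the supply dewpoint through the identified dynamics and alarming on the residual, can be sketched with a single first-order lag. The time constant, threshold and data below are hypothetical, not the OWL-1 values.

```python
# Illustrative sketch of the leak-monitor idea: predict the exhaust dewpoint
# from the supply dewpoint through a first-order lag (hypothetical time
# constant), and flag a leak when the residual exceeds a threshold.
def leak_residuals(supply, exhaust, alpha=0.1):
    """Residuals between measured exhaust dewpoint and a first-order
    prediction from the supply dewpoint; alpha ~ dt/tau is hypothetical."""
    pred = supply[0]
    residuals = []
    for s, e in zip(supply, exhaust):
        pred += alpha * (s - pred)   # discrete first-order lag
        residuals.append(e - pred)   # a leak adds moisture not seen in supply
    return residuals

supply = [10.0] * 40
exhaust = [10.0] * 20 + [12.0] * 20   # dewpoint step: simulated leak at t = 20
res = leak_residuals(supply, exhaust)
alarms = [i for i, r in enumerate(res) if abs(r) > 1.0]
print(alarms[0])  # → 20, the first sample after the simulated leak
```

Because normal low-frequency exhaust variation is explained by the supply input, the residual stays near zero until extra moisture from a leak appears, which is what makes early detection possible.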

  3. Seismic Design Value Evaluation Based on Checking Records and Site Geological Conditions Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Tienfuan Kerh

    2013-01-01

    Full Text Available This study proposes an improved computational neural network model that uses three seismic parameters (i.e., local magnitude, epicentral distance, and epicenter depth) and two geological conditions (i.e., shear wave velocity and standard penetration test value) as the inputs for predicting peak ground acceleration, the key element for evaluating earthquake response. Initial comparison results show that a neural network model with three neurons in the hidden layer can achieve relatively better performance based on the evaluation index of correlation coefficient or mean square error. This study further develops a new weight-based neural network model for estimating peak ground acceleration at unchecked sites. Four locations in the 24 subdivision zones of Taiwan, identified as having higher estimated peak ground accelerations than the seismic design value, are investigated. Finally, this study develops a new equation for the relationship between horizontal peak ground acceleration and focal distance by the curve fitting method. This equation represents the seismic characteristics of the Taiwan region more reliably and reasonably. The results of this study provide insight into this type of nonlinear problem, and the proposed method may be applicable to other areas of interest around the world.
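    The final curve-fitting step can be illustrated with an ordinary least-squares fit of a simple attenuation form, ln(PGA) = a + b * ln(R), where R is focal distance. The data pairs below are hypothetical, not the study's records or its published equation.

```python
import math

# Sketch of the curve-fitting step: least-squares fit of ln(PGA) = a + b*ln(R)
# against hypothetical distance/acceleration pairs.
def fit_loglog(distances_km, pga_gal):
    xs = [math.log(r) for r in distances_km]
    ys = [math.log(p) for p in pga_gal]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

R = [10.0, 20.0, 40.0, 80.0]       # focal distance, km (hypothetical)
pga = [300.0, 160.0, 80.0, 40.0]   # PGA, gal: roughly halves as R doubles
a, b = fit_loglog(R, pga)
print(round(b, 2))  # slope near -1, i.e. PGA ~ 1/R for this toy data
```

Real attenuation relations add magnitude and site terms, but the fitting machinery is the same linear least squares in log space.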

  4. Records Management

    Data.gov (United States)

    U.S. Environmental Protection Agency — All Federal Agencies are required to prescribe an appropriate records maintenance program so that complete records are filed or otherwise preserved, records can be...

  5. A Retrospective Analysis of the Burn Injury Patients Records in the Emergency Department, an Epidemiologic Study

    Directory of Open Access Journals (Sweden)

    Nilgün Aksoy

    2014-08-01

    Full Text Available Introduction: Burns can be very destructive, and severely endanger the health and lives of humans. They may cause disability and even psychological trauma in individuals. Such an event can also place an economic burden on victims' families and society. The aim of our study is to evaluate the epidemiology and outcome of burn patients referred to the emergency department. Methods: This cross-sectional study was conducted by evaluating the patients' files and forensic reports of burned patients referred to the emergency department (ED) of Akdeniz hospital, Turkey, in 2008. Demographic data, the season, place, reason, anatomical sites, total body surface area, degree, treatment, and admission time were recorded. Multinomial logistic regression was used to compare frequency differences among single categorized variables. Stepwise logistic regression was applied to develop a predictive model for hospitalization. P<0.05 was defined as the significance level. Results: Two hundred thirty patients were enrolled (53.9% female). The mean of patients' ages was 25.3 ± 22.3 years. Burns were most prevalent in the 0-6 age group, and most were due to hot liquid scalding (71.3%). The most affected parts of the body were the left and right upper extremities. Increasing triage severity level (OR=2.2; 95% CI: 1.02-4.66; p=0.046), intentional burn (OR=4.7; 95% CI: 1.03-21.8; p=0.047), referral from other hospitals or clinics (OR=3.4; 95% CI: 1.7-6.6; p=0.001), and percentage of burn (OR=18.1; 95% CI: 5.42-62.6; p<0.001) were independent predictive factors for hospitalization. In addition, the odds of hospitalization were lower in patients older than 15 years (OR=0.7; 95% CI: 0.5-0.91; p=0.035). Conclusion: This study revealed that the most frequent burns encountered are in the age group of 0-6 years, of <10% body surface area, second degree, on the upper extremities, indoor, and from hot liquid scalding. Increasing ESI severity, intentional burn, referring from

  6. Does Lithology Influence Relative Paleointensity Records? A Statistical Analysis on South Atlantic Pelagic Sediments

    Science.gov (United States)

    von Dobeneck, T.; Franke, C.

    2004-12-01

    The relative paleointensity (RPI) method assumes that the intensity of Post Depositional Remanent Magnetization (PDRM) depends exclusively on the magnetic field strength and the concentration of the magnetic carriers. Sedimentary remanence is regarded as an equilibrium state between aligning geomagnetic and randomizing interparticle forces. Just how strong these mechanical and electrostatic forces are depends on many petrophysical factors related to mineralogy, particle size and shape of the matrix constituents. We therefore test the hypothesis that variations in sediment lithology modulate RPI records. For ninety selected Late Quaternary sediment samples from the subtropical and subantarctic South Atlantic Ocean, a combined paleomagnetic and sedimentological dataset was established. Misleading alterations of the magnetic mineral fraction were detected by a routine Fe/kappa test (Funk et al., 2004). Samples with any indication of suboxic magnetite dissolution were excluded from the dataset. The parameters under study include carbonate, opal and terrigenous content, grain size distribution and clay mineral composition. Their bi- and multivariate correlations with the RPI signal were statistically investigated using standard techniques and criteria. While several of the parameters did not yield significant results, clay grain size and chlorite correlate weakly, and opal, illite and kaolinite correlate moderately, with the NRM/ARM signal used here as an RPI measure. The most influential single sedimentological factor is the kaolinite/illite ratio, with a Pearson's coefficient of 0.51 at 99.9% significance. We find that kaolinite has a positive and illite a negative effect on magnetic alignment, while smectite is more indifferent. This is certainly related to the contrasting unit-layer charges of the three clay minerals, and possibly also to their crystalline versus flaky structure and low versus medium to high plasticity. Our regionally restricted results also indicate an
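    The bivariate screening described above reduces, for each parameter pair, to computing Pearson's correlation coefficient. A minimal sketch with made-up values (not the study's data) for a kaolinite/illite ratio series versus the NRM/ARM proxy:

```python
import math

# Pearson's r between a lithological variable and the NRM/ARM paleointensity
# proxy; the value lists below are hypothetical illustrations only.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

kaolinite_illite = [0.2, 0.5, 0.3, 0.8, 0.6, 0.4]   # hypothetical ratios
nrm_arm          = [0.11, 0.18, 0.14, 0.25, 0.20, 0.15]  # hypothetical proxy
print(round(pearson_r(kaolinite_illite, nrm_arm), 3))
```

A significance level such as the study's 99.9% would then come from testing r against the null of zero correlation for the given sample size.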

  7. Polyphase Order Analysis Based on Convolutional Approach

    OpenAIRE

    M. Drutarovsky

    1999-01-01

    The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational ...
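    Order analysis as described, resampling the vibration signal at uniform shaft-angle increments rather than uniform time steps, can be sketched as follows; the run-up profile and sampling parameters are hypothetical:

```python
import math

# Sketch of order analysis by angular resampling: a vibration signal sampled
# uniformly in time is re-sampled at uniform shaft-angle increments, so a
# component locked to the rotation (an "order") stays at a fixed number of
# cycles per revolution even while the rotational speed changes.
def resample_by_angle(angles, signal, samples_per_rev, n_revs):
    """Linear interpolation of `signal` at uniform shaft-angle steps.
    `angles` must be monotonically increasing (rad)."""
    targets = [2 * math.pi * k / samples_per_rev
               for k in range(samples_per_rev * n_revs)]
    out, j = [], 0
    for a in targets:
        while angles[j + 1] < a:     # find the bracketing time samples
            j += 1
        w = (a - angles[j]) / (angles[j + 1] - angles[j])
        out.append((1 - w) * signal[j] + w * signal[j + 1])
    return out

# Hypothetical run-up: shaft speed ramps linearly; vibration locked to order 2
dt = 1e-3
times = [i * dt for i in range(4000)]
angles = [2 * math.pi * (10 * t + 5 * t * t) for t in times]  # accelerating
signal = [math.sin(2 * a) for a in angles]                    # pure 2nd order
resampled = resample_by_angle(angles, signal, samples_per_rev=64, n_revs=50)
# In the angle domain, `resampled` is a clean sinusoid at 2 cycles/rev,
# so its spectrum shows a single sharp line instead of a smeared ridge.
```

A subsequent FFT of `resampled` would place the component at exactly order 2, independent of the speed ramp, which is the point of the technique.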

  8. Estimation of plutonium in Hanford Site waste tanks based on historical records

    International Nuclear Information System (INIS)

    An estimation of plutonium in the Hanford Site waste storage tanks is important to nuclear criticality concerns. A reasonable approach for estimating the plutonium in the tanks can be established by considering the recovery efficiency of the chemical separation plants on the plutonium produced in the Hanford reactors. The waste loss from the separation processes represents the bulk of the plutonium in the waste tanks. The lesser contributor of plutonium to the waste tanks was the Plutonium Finishing Plant (PFP). When the PFP waste is added to the plutonium waste from separations, the result is the total estimated amount of plutonium discharged to the waste tanks at the Hanford Site. This estimate is for criticality concerns, and therefore is based on conservative assumptions (giving higher plutonium values). The estimate has been calculated to be ∼981 kg of plutonium in the single- and double-shell high-level waste tanks

  9. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes.

    Science.gov (United States)

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar

    2015-01-01

    This paper is a status report from a large-scale openEHR-based EPR project of the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of the critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as a backup if one is unable to participate. Archetypes, and their importance for structured data and the sharing of information, have to become more visible to clinicians through a sharper information practice. PMID:25991124

  10. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open source software (licensed under the GNU Affero General Public License/AGPL), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .

  11. When Did Carcharocles megalodon Become Extinct? A New Analysis of the Fossil Record

    OpenAIRE

    Catalina Pimiento; Clements, Christopher F.

    2014-01-01

    Carcharocles megalodon ("Megalodon") is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9-2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding th...

  12. When did carcharocles megalodon become extinct? A new analysis of the fossil record

    OpenAIRE

    Pimiento, Catalina; Clements, Christopher F.

    2014-01-01

    Carcharocles megalodon (“Megalodon”) is the largest shark that ever lived. Based on its distribution, dental morphology, and associated fauna, it has been suggested that this species was a cosmopolitan apex predator that fed on marine mammals from the middle Miocene to the Pliocene (15.9–2.6 Ma). Prevailing theory suggests that the extinction of apex predators affects ecosystem dynamics. Accordingly, knowing the time of extinction of C. megalodon is a fundamental step towards understanding th...

  13. Nanoscale analysis of natural and artificial magnetic objects: particles, thin films and recording heads

    OpenAIRE

    Wei, Jiandong

    2009-01-01

    Biogenic and artificial magnetite nanoparticles and epitaxial magnetite thin films have been analyzed by magnetic force microscopy (MFM) and variations of MFM-based high frequency techniques. The influence of dipolar interactions on the magnetic structures and properties of magnetite particles has been investigated. Clusters of magnetite particles from salmon behave like permeable spheres. Magnetite particles in tissue slices have been successfully detected by MFM. Magnetic resonance force m...

  14. Some results of analysis of inverted echo-sounder records from the Atlantic Equatorial region

    Directory of Open Access Journals (Sweden)

    Alberto dos Santos Franco

    1985-01-01

    Tidal analysis of data from the Equatorial region, obtained with inverted echo-sounders, shows considerable residuals in the frequency band of approximately 2 cycles per day. Tidal components that are statistically non-negligible are also identified in the even harmonics at 4 and 6 cycles per day. Spectral analysis of temperature series from the same area shows, on the other hand, variability in the same frequency bands, which suggests the occurrence of internal waves with energy distributed in these frequency bands in the Atlantic Equatorial area.
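
    The band-limited variability described in this record can be located with a simple periodogram. The sketch below uses synthetic hourly data (an assumption; the actual sampling of the echo-sounder and temperature series is not given here) to show spectral energy concentrating near 2 cycles per day.

    ```python
    import numpy as np

    # Synthetic hourly series with a semidiurnal (2 cycles/day) component,
    # standing in for the echo-sounder/temperature records (assumed sampling).
    rng = np.random.default_rng(0)
    hours = np.arange(30 * 24)                               # 30 days, hourly
    series = (0.5 * np.sin(2 * np.pi * 2.0 * hours / 24.0)   # 2 cpd signal
              + 0.2 * rng.standard_normal(hours.size))       # noise

    # Periodogram with frequencies expressed in cycles per day
    # (sample spacing d = 1/24 day).
    spectrum = np.abs(np.fft.rfft(series - series.mean())) ** 2
    freqs_cpd = np.fft.rfftfreq(series.size, d=1.0 / 24.0)

    peak_cpd = freqs_cpd[np.argmax(spectrum)]
    print(round(peak_cpd, 2))   # energy concentrates near 2 cycles per day
    ```

    With real records, the same peak-picking step would be applied to each frequency band of interest (2, 4 and 6 cycles per day) rather than only to the global maximum.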

  15. Interpreting land records

    CERN Document Server

    Wilson, Donald A

    2014-01-01

    Base retracement on solid research and historically accurate interpretation. Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," advice on overcoming common research problems, and insight into alternative resources wh...

  16. III-V GaAs based plasmonic lasers (Presentation Recording)

    Science.gov (United States)

    Lafone, Lucas; Nguyen, Ngoc; Clarke, Ed; Fry, Paul; Oulton, Rupert F.

    2015-09-01

    Plasmonics is a potential route to new and improved optical devices. Many predict that sub-wavelength optical systems will be essential in the development of future integrated circuits, offering the only viable way of simultaneously increasing speed and reducing power consumption. Realising this potential will be contingent on the ability to exploit plasmonic effects within the framework of the established semiconductor industry, and to this end we present a III-V (GaAs) based surface plasmon laser platform capable of effective laser light generation in highly focussed regions of space. Our design utilises a suspended slab of GaAs with a metallic slot printed on top. Here, hybridisation between the plasmonic mode of the slot and the photonic mode of the slab leads to the formation of a mode whose confinement and loss can be adjusted through variation of the slot width alone. As in previous designs, the use of a hybrid mode provides strong confinement with relatively low losses; however, the ability to print the metal slot removes the randomness associated with device fabrication and the requirement for etching that can deteriorate the semiconductor's properties. The deterministic fabrication process and the use of bulk GaAs for gain make the device well suited for practical implementation.

  17. Efficient inverted organic light-emitting devices by amine-based solvent treatment (Presentation Recording)

    Science.gov (United States)

    Song, Myoung Hoon; Choi, Kyoung-Jin; Jung, Eui Dae

    2015-10-01

    The efficiency of inverted polymer light-emitting diodes (iPLEDs) was remarkably enhanced by introducing a spontaneously formed ripple-shaped ZnO nanostructure (ZnO-R) and an amine-based polar solvent treatment using 2-methoxyethanol and ethanolamine (2-ME+EA) co-solvents on ZnO-R. The ripple-shaped ZnO layer, fabricated by a solution process with an optimised annealing-temperature ramp, improves the extraction of waveguide modes inside the device structure, and the 2-ME+EA interlayer enhances electron injection and hole blocking and reduces exciton quenching between the polar-solvent-treated ZnO-R and the emissive layer. As a result, our optimised iPLEDs show a luminous efficiency (LE) of 61.6 cd A-1, a power efficiency (PE) of 19.4 lm W-1 and an external quantum efficiency (EQE) of 17.8%. This approach opens new possibilities not only for organic light-emitting diodes (OLEDs) but also for other organic optoelectronic devices such as organic photovoltaics, organic thin-film transistors, and electrically driven organic diode lasers.

  18. Fabric-Based Wearable Dry Electrodes for Body Surface Biopotential Recording.

    Science.gov (United States)

    Yokus, Murat A; Jur, Jesse S

    2016-02-01

    A flexible and conformable dry electrode design on nonwoven fabrics is examined as a sensing platform for biopotential measurements. Because of the limitations of commercial wet electrodes (e.g., shelf life, skin irritation), dry electrodes are investigated as potential candidates for long-term monitoring of ECG signals. Multilayered dry electrodes are fabricated by screen printing of Ag/AgCl conductive inks on flexible nonwoven fabrics. This study focuses on the skin-electrode interface, form-factor design, and on-body placement of the printed dry electrodes for a wearable sensing platform. ECG signals obtained with dry and wet electrodes are comparatively studied as a function of body posture and movement. Experimental results show that skin-electrode impedance is influenced by printed electrode area, skin-electrode interface material, and applied pressure. The printed electrodes yield ECG signals comparable to wet electrodes, and the QRS peak amplitude of the ECG signal depends on printed electrode area and on-body electrode spacing. Overall, fabric-based printed dry electrodes present an inexpensive health-monitoring platform solution for mobile wearable electronics applications by fulfilling user comfort and wearability. PMID:26241969
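
    The skin-electrode impedance measured in this study is commonly described with a textbook equivalent circuit: a series resistance in front of a parallel resistor-capacitor branch. The sketch below evaluates that generic model with illustrative component values (all assumed, not taken from the paper).

    ```python
    import numpy as np

    def electrode_impedance(freq_hz, r_s=1e3, r_p=100e3, c_p=50e-9):
        """Complex impedance of R_s in series with (R_p parallel C_p).

        Component values are illustrative stand-ins, not measured data.
        """
        omega = 2 * np.pi * freq_hz
        z_parallel = r_p / (1 + 1j * omega * r_p * c_p)
        return r_s + z_parallel

    freqs = np.array([1.0, 10.0, 100.0, 1000.0])   # Hz
    mags = np.abs(electrode_impedance(freqs))
    print(mags.astype(int))   # impedance magnitude falls with frequency
    ```

    In this simple model a larger electrode area roughly scales R_p down and C_p up, lowering the impedance magnitude, which is consistent with the area dependence reported above.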

  19. Linear and nonlinear analysis of airflow recordings to help in sleep apnoea–hypopnoea syndrome diagnosis

    International Nuclear Information System (INIS)

    This paper focuses on the analysis of the single-channel airflow (AF) signal to help in sleep apnoea–hypopnoea syndrome (SAHS) diagnosis. The respiratory rate variability (RRV) series is derived from AF by measuring the time between consecutive breaths. A set of statistical, spectral and nonlinear features is extracted from both signals. The forward stepwise logistic regression (FSLR) procedure is then used to perform feature selection and classification. Three logistic regression (LR) models are obtained by applying FSLR to features from AF, from RRV, and from both signals simultaneously. The diagnostic performance of single features and of the LR models is assessed and compared in terms of sensitivity, specificity, accuracy and area under the receiver-operating characteristic curve (AROC). The highest accuracy (82.43%) and AROC (0.903) are reached by the LR model derived from the combination of AF and RRV features. This result suggests that AF and RRV provide useful information for detecting SAHS. (paper)
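
    The diagnostic metrics reported in this record (sensitivity, specificity, accuracy, AROC) can be computed directly from classifier scores. The sketch below uses synthetic scores as stand-ins for the LR model outputs, and estimates AROC via the Mann-Whitney rank statistic.

    ```python
    import numpy as np

    # Synthetic scores for 100 controls and 100 SAHS patients (assumed
    # distributions; real scores would come from the fitted LR models).
    rng = np.random.default_rng(1)
    labels = np.repeat([0, 1], 100)                       # 0 = no SAHS, 1 = SAHS
    scores = np.concatenate([rng.normal(0.3, 0.15, 100),  # controls
                             rng.normal(0.7, 0.15, 100)]) # patients

    pred = (scores >= 0.5).astype(int)                    # fixed 0.5 threshold
    tp = np.sum((pred == 1) & (labels == 1))
    tn = np.sum((pred == 0) & (labels == 0))
    fp = np.sum((pred == 1) & (labels == 0))
    fn = np.sum((pred == 0) & (labels == 1))

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / labels.size

    # AROC via the Mann-Whitney statistic: the probability that a random
    # positive case outranks a random negative case (ties count half).
    pos, neg = scores[labels == 1], scores[labels == 0]
    aroc = (np.mean(pos[:, None] > neg[None, :])
            + 0.5 * np.mean(pos[:, None] == neg[None, :]))

    print(sensitivity, specificity, accuracy, round(aroc, 3))
    ```

    Unlike the threshold-dependent metrics, the AROC summarises performance over all thresholds, which is why the paper reports it alongside accuracy.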

  20. Analysis of counts with two latent classes, with application to risk assessment using physician visit records

    OpenAIRE

    Wang, Huijing

    2012-01-01

    Motivated by the CAYACS program at BC Cancer Research Center, this thesis project introduces a latent class model to formulate event counts. In particular, we consider a population with two latent classes, such as an at-risk group and a not-at-risk group of cancer survivors in the CAYACS program. Likelihood-based inference procedures are proposed for estimating the model parameters with or without one class fully specified. The EM algorithm is adapted to compute the MLE; a pseudo-MLE of t...
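
    As a minimal illustration of EM estimation for a two-latent-class count model, the sketch below fits a two-component Poisson mixture to simulated visit counts. This is a simplification: the thesis's model is richer (covariates, a possibly fully specified class), but the E-step/M-step structure is the same.

    ```python
    import numpy as np

    # Simulate counts from two latent classes (assumed rates, for illustration).
    rng = np.random.default_rng(2)
    z = rng.random(2000) < 0.4                       # latent class indicators
    counts = np.where(z, rng.poisson(5.0, 2000),     # at-risk: many visits
                         rng.poisson(1.0, 2000))     # not-at-risk: few visits

    # Initialise the mixing weight and the two Poisson rates.
    pi, lam0, lam1 = 0.5, 0.5, 3.0
    for _ in range(200):
        # E-step: responsibility of the at-risk class for each count
        # (the count factorial cancels in the ratio, so it is omitted).
        log_p1 = np.log(pi) + counts * np.log(lam1) - lam1
        log_p0 = np.log(1 - pi) + counts * np.log(lam0) - lam0
        r = 1.0 / (1.0 + np.exp(log_p0 - log_p1))
        # M-step: update parameters from responsibility-weighted counts.
        pi = r.mean()
        lam1 = np.sum(r * counts) / np.sum(r)
        lam0 = np.sum((1 - r) * counts) / np.sum(1 - r)

    print(round(lam0, 1), round(lam1, 1), round(pi, 2))
    ```

    With well-separated rates the EM iterations recover the simulated parameters closely; when one class is fully specified, its rate would simply be held fixed during the M-step.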