WorldWideScience

Sample records for based record analysis

  1. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey

    Science.gov (United States)

    Rau, Hsiao-Hsien; Chen, Kang-Hua

    2017-01-01

    Background Empowering personal health records (PHRs) provides a basic human right, awareness, and intention for health promotion. As health care delivery changes toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced “My health bank,” a Web-based electronic medical records (EMRs) repository for consumers. However, it is not yet a PHR. To date, we do not have a platform that lets patients manage their own PHRs. Objective This study creates a vision of a value-added platform that lets consumers analyze their personal health data and manage their health records based on the contents of “My health bank.” This study aimed to examine consumer expectations regarding PHRs using the importance-performance analysis. The purpose was to explore consumer perceptions of this type of platform, identify the key success factors and important aspects through the importance-performance analysis, and give suggestions for future development based on it. Methods This is a cross-sectional study conducted in Taiwan. A Web-based invitation to participate was distributed through Facebook. Respondents were asked to watch an introductory movie about PHRs before filling in the questionnaire. The questionnaire focused on 2 aspects: (1) system functions, and (2) system design and security and privacy, covered by 12 and 7 questions, respectively. Items were rated on a 5-point Likert scale ranging from 1 (“disagree strongly”) to 5 (“agree strongly”). Afterwards, the questionnaire data were analyzed using IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. Results This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65

  2. Quality of nursing documentation: Paper-based health records versus electronic-based health records.

    Science.gov (United States)

    Akhu-Zaheya, Laila; Al-Maaitah, Rowaida; Bany Hani, Salam

    2018-02-01

    To assess and compare the quality of paper-based and electronic-based health records. The comparison examined three criteria: content, documentation process and structure. Nursing documentation is a significant indicator of the quality of patient care delivery. It can be either paper-based or organised within the system known as the electronic health record. Nursing documentation must be completed to the highest standards to ensure the safety and quality of healthcare services. However, the evidence is not clear on which of the two forms of documentation (paper-based versus electronic health records) is of higher quality. A retrospective, descriptive, comparative design was used to address the study's purposes. A convenience sample of patients' records from two public hospitals was audited using the Cat-ch-Ing audit instrument. The sample consisted of 434 records for both paper-based health records and electronic health records from medical and surgical wards. Electronic health records were better than paper-based health records in terms of process and structure. In terms of quantity and quality of content, paper-based records were better than electronic health records. The study affirmed the poor quality of nursing documentation and the lack of nurses' knowledge and skills in the nursing process and its application in both paper-based and electronic-based systems. Both forms of documentation revealed drawbacks in terms of content, process and structure. This study provides important information, which can guide policymakers and administrators in identifying effective strategies aimed at enhancing the quality of nursing documentation. Policies and actions to ensure quality nursing documentation at the national level should focus on improving nursing knowledge, competencies and practice in the nursing process, enhancing the work environment and nursing workload, as well as strengthening the capacity building of nursing practice to improve the quality of nursing care and

  3. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey.

    Science.gov (United States)

    Rau, Hsiao-Hsien; Wu, Yi-Syuan; Chu, Chi-Ming; Wang, Fu-Chung; Hsu, Min-Huei; Chang, Chi-Wen; Chen, Kang-Hua; Lee, Yen-Liang; Kao, Senyeong; Chiu, Yu-Lung; Wen, Hsyien-Chia; Fuad, Anis; Hsu, Chien-Yeh; Chiu, Hung-Wen

    2017-04-27

    Empowering personal health records (PHRs) provides a basic human right, awareness, and intention for health promotion. As health care delivery changes toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical records (EMRs) repository for consumers. However, it is not yet a PHR. To date, we do not have a platform that lets patients manage their own PHRs. This study creates a vision of a value-added platform that lets consumers analyze their personal health data and manage their health records based on the contents of "My health bank." This study aimed to examine consumer expectations regarding PHRs using the importance-performance analysis. The purpose was to explore consumer perceptions of this type of platform, identify the key success factors and important aspects through the importance-performance analysis, and give suggestions for future development based on it. This is a cross-sectional study conducted in Taiwan. A Web-based invitation to participate was distributed through Facebook. Respondents were asked to watch an introductory movie about PHRs before filling in the questionnaire. The questionnaire focused on 2 aspects: (1) system functions, and (2) system design and security and privacy, covered by 12 and 7 questions, respectively. Items were rated on a 5-point Likert scale ranging from 1 ("disagree strongly") to 5 ("agree strongly"). Afterwards, the questionnaire data were analyzed using IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65.1%). They were still students (195 out of 350
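
    A minimal sketch of how an importance-performance analysis places questionnaire items into the usual four quadrants, with invented item names and Likert ratings rather than the study's data:

```python
import numpy as np

# Illustrative 1-5 Likert ratings for three hypothetical PHR platform items;
# rows = respondents, columns = items. Not data from the study.
items = ["medication list", "lab-result trends", "appointment reminders"]
importance = np.array([[5, 4, 3], [4, 5, 3], [5, 4, 4], [4, 4, 2]])
performance = np.array([[3, 4, 4], [2, 4, 3], [3, 5, 4], [3, 4, 3]])

imp_mean = importance.mean(axis=0)
perf_mean = performance.mean(axis=0)
# Quadrant split at the grand means, as in a conventional IPA grid.
imp_cut, perf_cut = imp_mean.mean(), perf_mean.mean()

for name, i, p in zip(items, imp_mean, perf_mean):
    if i >= imp_cut and p < perf_cut:
        quadrant = "Concentrate here"       # high importance, low performance
    elif i >= imp_cut:
        quadrant = "Keep up the good work"  # high importance, high performance
    elif p < perf_cut:
        quadrant = "Low priority"           # low importance, low performance
    else:
        quadrant = "Possible overkill"      # low importance, high performance
    print(f"{name}: importance={i:.2f}, performance={p:.2f} -> {quadrant}")
```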

  4. [Electronic versus paper-based patient records: a cost-benefit analysis].

    Science.gov (United States)

    Neubauer, A S; Priglinger, S; Ehrt, O

    2001-11-01

    The aim of this study is to compare the costs and benefits of electronic, paperless patient records with the conventional paper-based charts. Costs and benefits of planned electronic patient records are calculated for a University eye hospital with 140 beds. Benefit is determined by direct costs saved by electronic records. In the example shown, the additional benefits of electronic patient records, as far as they can be quantified total 192,000 DM per year. The costs of the necessary investments are 234,000 DM per year when using a linear depreciation over 4 years. In total, there are additional annual costs for electronic patient records of 42,000 DM. Different scenarios were analyzed. By increasing the time of depreciation to 6 years, the cost deficit reduces to only approximately 9,000 DM. Increased wages reduce the deficit further while the deficit increases with a loss of functions of the electronic patient record. However, several benefits of electronic records regarding research, teaching, quality control and better data access cannot be easily quantified and would greatly increase the benefit to cost ratio. Only part of the advantages of electronic patient records can easily be quantified in terms of directly saved costs. The small cost deficit calculated in this example is overcompensated by several benefits, which can only be enumerated qualitatively due to problems in quantification.

  5. Heart rhythm analysis using ECG recorded with a novel sternum based patch technology

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt; Fauerskov, Inge; Osmanagic, Armin

    2013-01-01

    … reliable long-term ECG recordings. The device is designed for high compliance and low patient burden. This novel patch technology is CE approved for ambulatory ECG recording of two ECG channels on the sternum. This paper describes a clinical pilot study regarding the usefulness of these ECG signals for heart rhythm analysis. A clinical technician with experience in ECG interpretation selected 200 noise-free 7-second ECG segments from 25 different patients. These 200 ECG segments were evaluated by two medical doctors according to their usefulness for heart rhythm analysis. The first doctor considered 98.5% of the segments useful for rhythm analysis, whereas the second doctor considered 99.5% of the segments useful for rhythm analysis. The conclusion of this pilot study indicates that two-channel ECG recorded on the sternum is useful for rhythm analysis and could be used as input to diagnosis…

  6. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
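
    A rough sketch of the analysis pipeline described (principal component analysis of purchase features, then linear regression of BMI on the pattern scores with leave-one-out cross-validation), using synthetic data rather than the AutoMealRecord data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
# Synthetic stand-in for per-employee meal-purchase features (not the AutoMealRecord data).
X_meals = rng.poisson(3, size=(200, 12)).astype(float)
bmi = 22 + 0.3 * X_meals[:, 0] - 0.2 * X_meals[:, 5] + rng.normal(0, 1.5, 200)

# Step 1: extract dietary patterns with PCA (the study reports five).
patterns = PCA(n_components=5).fit_transform(X_meals)

# Step 2: regress BMI on pattern scores and check leave-one-out predictions
# against a "would-be obese" threshold (BMI >= 23 in the paper).
loo, correct = LeaveOneOut(), 0
for train, test in loo.split(patterns):
    model = LinearRegression().fit(patterns[train], bmi[train])
    pred = model.predict(patterns[test])[0]
    correct += (pred >= 23) == (bmi[test][0] >= 23)
print(f"LOO classification accuracy at BMI >= 23: {correct / len(bmi):.1%}")
```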

  7. Electronic Health Record Implementation: A SWOT Analysis.

    Science.gov (United States)

    Shahmoradi, Leila; Darrudi, Alireza; Arji, Goli; Farzaneh Nejad, Ahmadreza

    2017-10-01

    Electronic Health Record (EHR) is one of the most important achievements of information technology in the healthcare domain, and if deployed effectively, it can yield predominant results. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis of electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member workforce from hospitals affiliated to Tehran University of Medical Sciences (TUMS). The data were collected by using a self-structured questionnaire and analyzed by SPSS software. Based on the results, the highest priority in the strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructure was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics was the most significant opportunity of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records together with physicians' and other clinical staff's resistance to the use of electronic health records. Several organizational, technical, and resource elements contribute to the successful adoption of electronic health records; consideration of these factors is essential for EHR implementation.

  8. Electronic Health Record Implementation: A SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Leila Shahmoradi

    2017-12-01

    Electronic Health Record (EHR) is one of the most important achievements of information technology in the healthcare domain, and if deployed effectively, it can yield predominant results. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis of electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member workforce from hospitals affiliated to Tehran University of Medical Sciences (TUMS). The data were collected by using a self-structured questionnaire and analyzed by SPSS software. Based on the results, the highest priority in the strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructure was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics was the most significant opportunity of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records together with physicians’ and other clinical staff’s resistance to the use of electronic health records. Several organizational, technical, and resource elements contribute to the successful adoption of electronic health records; consideration of these factors is essential for EHR implementation.

  9. The Use of Continuous Wavelet Transform Based on the Fast Fourier Transform in the Analysis of Multi-channel Electrogastrography Recordings.

    Science.gov (United States)

    Komorowski, Dariusz; Pietraszek, Stanislaw

    2016-01-01

    This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of observation (analysis). Wavelet analysis allows long time windows to be used when more precise low-frequency information is needed, and shorter windows when high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method allows results to be obtained for relatively long EGG records in a fairly short time, much faster than with the classical methods based on running spectrum analysis (RSA). In this study the authors indicate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown for an example analysis of four-channel EGG recordings performed for a non-caloric meal.
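
    A minimal sketch of a CWT computed by FFT-based circular convolution with a Morlet wavelet, which is the general idea the abstract describes; the sampling rate, scales, and test tone are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def cwt_fft(x, scales, fs, w0=6.0):
    """CWT of x via FFT-based circular convolution with a Morlet wavelet."""
    n = len(x)
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(n, d=1.0 / fs) * 2 * np.pi   # angular frequencies
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Fourier transform of the analytic Morlet wavelet at scale s.
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * freqs - w0) ** 2) * (freqs > 0)
        out[i] = np.fft.ifft(X * np.conj(psi_hat)) * np.sqrt(s)
    return out

# Example: a 0.05 Hz ("3 cpm") test tone, roughly the normogastric EGG range.
fs, t = 1.0, np.arange(0, 600)                            # 1 Hz sampling, 10 min
signal = np.sin(2 * np.pi * 0.05 * t)
scales = 6.0 / (2 * np.pi * np.linspace(0.01, 0.15, 40))  # map target freqs to scales
power = np.abs(cwt_fft(signal, scales, fs)) ** 2
print("scale index with max power:", power.sum(axis=1).argmax())
```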

  10. Design spectrums based on earthquakes recorded at tarbela

    International Nuclear Information System (INIS)

    Rizwan, M.; Ilyas, M.; Masood, A.

    2008-01-01

    The first seismological network in Pakistan was set up in early 1969 at Tarbela, the location of the country's largest water reservoir. The network consisted of analog accelerographs and seismographs. Since installation, many seismic events of different magnitudes have occurred and been recorded by the installed instruments. The recorded analog time histories have been digitized, and data from twelve earthquakes, irrespective of soil type, have been used to derive elastic design spectra for Tarbela, Pakistan. PGA scaling factors for each component, based on the risk analysis studies carried out for the region, are also given. The suggested design spectra will be very useful for new construction in the region and its surroundings. The digitized time-history data will be useful for seismic response analysis of structures and seismic risk analysis of the region. (author)
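
    A compact sketch of how an elastic response spectrum is derived from a digitized accelerogram by sweeping single-degree-of-freedom oscillator periods (Newmark average-acceleration integration); the input motion here is synthetic, not a Tarbela record:

```python
import numpy as np

def pseudo_accel_spectrum(acc, dt, periods, zeta=0.05):
    """Peak pseudo-acceleration of damped SDOF oscillators (Newmark, avg. accel., unit mass)."""
    sa = []
    for T in periods:
        wn = 2 * np.pi / T
        k, c = wn ** 2, 2 * zeta * wn
        u = v = 0.0
        a = -acc[0]                        # initial acceleration for u = v = 0
        keff = k + 2 * c / dt + 4 / dt ** 2
        umax = 0.0
        for ag in acc[1:]:
            peff = -ag + (4 / dt ** 2 * u + 4 / dt * v + a) + c * (2 / dt * u + v)
            u_new = peff / keff
            v_new = 2 / dt * (u_new - u) - v
            a = 4 / dt ** 2 * (u_new - u) - 4 / dt * v - a
            u, v = u_new, v_new
            umax = max(umax, abs(u))
        sa.append(wn ** 2 * umax)          # pseudo-acceleration Sa = wn^2 * Sd
    return np.array(sa)

# Synthetic accelerogram (not a Tarbela record): tapered white noise.
rng = np.random.default_rng(1)
dt, n = 0.01, 2000
acc = rng.normal(0, 0.5, n) * np.hanning(n)
periods = np.linspace(0.05, 3.0, 60)
sa = pseudo_accel_spectrum(acc, dt, periods)
print("peak spectral ordinate at T = %.2f s" % periods[sa.argmax()])
```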

  11. Localizing wushu players on a platform based on a video recording

    Science.gov (United States)

    Peczek, Piotr M.; Zabołotny, Wojciech M.

    2017-08-01

    This article describes the development of a method to localize an athlete on a platform during a sports performance, based on a static video recording. The sport considered for this method is wushu, a martial art; however, the method can be applied to any other discipline. Requirements are specified, and two image-processing algorithms are described. The next part presents an experiment based on recordings from the Pan American Wushu Championship; the steps of the algorithm are demonstrated on those recordings. Results are evaluated manually. The last part of the article concludes whether the algorithm is applicable and what improvements would have to be implemented to use it during sports competitions as well as for offline analysis.

  12. Fast optical recording media based on semiconductor nanostructures for image recording and processing

    International Nuclear Information System (INIS)

    Kasherininov, P. G.; Tomasov, A. A.

    2008-01-01

    Fast optical recording media based on semiconductor nanostructures (CdTe, GaAs) for image recording and processing are developed, with a speed of up to 10⁶ cycles/s (which exceeds the speed of known recording media based on metal-insulator-semiconductor-(liquid crystal) (MIS-LC) structures by two to three orders of magnitude), a photosensitivity of 10⁻² V/cm², and a spatial resolution of 5-10 (line pairs)/mm. Operating principles of the nanostructures as fast optical recording media and methods for reading images recorded in such media are described. Fast optical processors for recording images in incoherent light based on CdTe crystal nanostructures are implemented. The possibility of their application to the fabrication of image correlators is shown.

  13. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    ... a system of records, Risk Analysis and Management Records, State-78, pursuant to the provisions of... INFORMATION: The Department of State proposes that the new system will be ``Risk Analysis and Management.... These standard routine uses apply to State-78, Risk Analysis and Management Records. POLICIES AND...

  14. Validation of PC-based Sound Card with Biopac for Digitalization of ECG Recording in Short-term HRV Analysis.

    Science.gov (United States)

    Maheshkumar, K; Dilara, K; Maruthy, K N; Sundareswaren, L

    2016-07-01

    Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation of heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to compare (validate) HRV computed from temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) against the Biopac MP36 module. Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, participated in the present study. Following the standard protocol, a 5-min ECG was recorded after 10 min of supine rest, simultaneously by the portable analog amplifier with PC-based sound card and by the Biopac module, with surface electrodes in the Lead II position. All ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in the Kubios software. Short-term HRV indices in both the time and frequency domains were used. The unpaired Student's t-test and Pearson correlation coefficient test were used for the analysis, using the R statistical software. No statistically significant differences were observed when comparing the HRV values obtained from the two devices. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the time- and frequency-domain values obtained by the two devices. On the basis of these results, we suggest that HRV values in the time and frequency domains calculated from RR series obtained from the PC-based sound card are probably as reliable as those obtained by the gold-standard Biopac MP36.
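
    A simplified sketch of common short-term HRV indices (SDNN, RMSSD, and an LF/HF ratio from a Welch periodogram of the resampled RR series), computed on synthetic RR intervals; Kubios' exact processing is not reproduced:

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

rng = np.random.default_rng(2)
# Synthetic 5-minute RR-interval series (seconds), roughly 70 bpm with mild variability.
rr = 0.857 + 0.05 * rng.standard_normal(350)

# Time-domain indices.
sdnn = rr.std(ddof=1) * 1000                      # ms
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000 # ms

# Frequency-domain indices: resample RR onto a uniform 4 Hz grid, then Welch PSD.
t = np.cumsum(rr)
fs = 4.0
t_uniform = np.arange(t[0], t[-1], 1 / fs)
rr_uniform = interp1d(t, rr, kind="cubic")(t_uniform)
f, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)
lf = np.trapz(psd[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])
hf = np.trapz(psd[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])

print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  LF/HF={lf / hf:.2f}")
```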

  15. Transition analysis of magnetic recording heads using FDTD

    International Nuclear Information System (INIS)

    Tanabe, Shinji

    2001-01-01

    Transition waveforms of a magnetic recording head have been analyzed using the finite-difference time-domain (FDTD) method. The distributed inductance and capacitance of the head affect the rise time of the magnetic fields in the recording process. FDTD electromagnetic analysis is easy to combine with SPICE circuit analysis. Using this combined program, a transition analysis of the recording process including a write amplifier has become possible

  16. Transition analysis of magnetic recording heads using FDTD

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Shinji E-mail: tanabe@ele.crl.melco.co.jp

    2001-10-01

    Transition waveforms of a magnetic recording head have been analyzed using the finite-difference time-domain (FDTD) method. The distributed inductance and capacitance of the head affect the rise time of the magnetic fields in the recording process. FDTD electromagnetic analysis is easy to combine with SPICE circuit analysis. Using this combined program, a transition analysis of the recording process including a write amplifier has become possible.

  17. Borneo: a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    OpenAIRE

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands I developed high spatial resolution patterns of Borneo's botanical richness, endemicity, and the floristic regions. The patterns are derived from species distribution models which predict a species occurrence based on the identified relationships between species recorded presences and the ecological circumstances at those localities. A new statistical method was developed to test the species distribut...

  18. NEED ANALYSIS FOR IDENTIFYING ESP MATERIALS FOR MEDICAL RECORD STUDENTS IN APIKES CITRA MEDIKA SURAKARTA

    Directory of Open Access Journals (Sweden)

    Beta Setiawati

    2016-06-01

    … and quantitative methods. The outcomes of this study showed the real needs of students learning English to prepare for their future in the field of medical records and health information. Findings of the needs analysis demonstrate that all four language skills were necessary for their academic studies and their target career. Certain topics are related to English for medical records, such as medical record staff's duties, ethical and legal issues in medical records, hospital statistics, medical record filing systems, health information systems, and so on. Accordingly, this study proposes new ESP materials based on the stakeholders' needs. It is suggested that a textbook or handout of English for Medical Record be developed based on the needs analysis by ESP designers, and that ESP lecturers be actively involved in recognizing the progressive needs of medical record students.

  19. A New Spectral Shape-Based Record Selection Approach Using Np and Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Edén Bojórquez

    2013-01-01

    With the aim of improving code-based real-record selection criteria, an approach inspired by a proxy parameter of spectral shape, named Np, is analyzed. The procedure is based on several objectives aimed at minimizing the record-to-record variability of the ground motions selected for seismic structural assessment. In order to select the best set of ground motion records to be used as input for nonlinear dynamic analysis, an optimization approach is applied using genetic algorithms focused on finding the set of records most compatible with a target spectrum and target Np values. The results of the new Np-based approach suggest that the real accelerograms obtained with this procedure reduce the scatter of the response spectra compared with the traditional approach; furthermore, the mean spectrum of the set of records is very similar to the target seismic design spectrum in the period range of interest, and at the same time, similar Np values are obtained for the selected records and the target spectrum.
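
    An illustrative sketch of the Np proxy (here assumed to be the geometric mean of Sa over [T1, 2T1] divided by Sa(T1)) and a record-set objective combining spectral-shape and Np mismatch; for brevity a random search stands in for the paper's genetic algorithm, and all spectra are synthetic:

```python
import numpy as np

def np_proxy(periods, sa, t1):
    """Assumed spectral-shape proxy: geometric mean of Sa over [T1, 2*T1] divided by Sa(T1)."""
    band = (periods >= t1) & (periods <= 2 * t1)
    sa_t1 = np.interp(t1, periods, sa)
    return np.exp(np.mean(np.log(sa[band]))) / sa_t1

def set_misfit(subset_idx, spectra, target_sa, target_np, periods, t1, w=1.0):
    """Objective to minimize: mean-spectrum mismatch plus a penalty on Np mismatch."""
    mean_sa = spectra[subset_idx].mean(axis=0)
    shape_err = np.mean((np.log(mean_sa) - np.log(target_sa)) ** 2)
    np_err = np.mean([(np_proxy(periods, spectra[i], t1) - target_np) ** 2
                      for i in subset_idx])
    return shape_err + w * np_err

rng = np.random.default_rng(3)
periods = np.linspace(0.05, 3.0, 60)
t1 = 1.0
target_sa = 0.6 * np.exp(-periods)                  # illustrative target spectrum
target_np = np_proxy(periods, target_sa, t1)
spectra = target_sa * rng.lognormal(0, 0.4, size=(200, periods.size))  # candidate records

# Toy search standing in for the genetic algorithm: keep the best of many random 7-record sets.
best = min((tuple(rng.choice(200, 7, replace=False)) for _ in range(2000)),
           key=lambda s: set_misfit(np.array(s), spectra, target_sa,
                                    target_np, periods, t1))
print("selected record indices:", best)
```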

  20. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most…

  1. Drive-based recording analyses at >800 Gfc/in2 using shingled recording

    International Nuclear Information System (INIS)

    William Cross, R.; Montemorra, Michael

    2012-01-01

    Since the introduction of perpendicular recording, conventional perpendicular scaling has enabled the hard disk drive industry to deliver products ranging from ∼130 to well over 500 Gb/in² in a little over 4 years. The incredible areal density growth spurt enabled by perpendicular recording is now endangered by an inability to effectively balance writeability with erasure effects at the system level. Shingled magnetic recording (SMR) offers an effective means to continue perpendicular areal density growth using conventional heads and tuned media designs. The use of specially designed edge-write head structures (also known as 'corner writers') should further increase the AD gain potential for shingled recording. In this paper, we will demonstrate the drive-based recording performance characteristics of a shingled recording system at areal densities in excess of 800 Gb/in² using a conventional head. Using a production drive base, developmental heads/media and a number of sophisticated analytical routines, we have studied the recording performance of a shingled magnetic recording subsystem. Our observations confirm excellent writeability in excess of 400 ktpi and a perpendicular system with acceptable noise balance, especially at extreme ID and OD skews where the benefits of SMR are quite pronounced. We believe that this demonstration illustrates that SMR is not only capable of productization, but is likely the path of least resistance toward production drive areal density closer to 1 Tb/in² and beyond. - Research highlights: → Drive-based recording demonstrations at 805 Gfc/in² have been performed using both 95 and 65 mm drive platforms at roughly 430 ktpi and 1.87 Mfci. → Limiting factors for shingled recording include side reading, which is dominated by the reader crosstrack skirt profile, MT10 being a representative metric. → Media jitter and associated DC media SNR further limit areal density, dominated by crosstrack transition curvature, downtrack
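
    A quick arithmetic check of how the quoted track and linear densities combine into the stated areal density (treating flux changes per inch as bits per inch):

```python
# Areal density = track density x linear density (values quoted in the abstract).
ktpi, mfci = 430e3, 1.87e6                  # tracks per inch, flux changes per inch
print(f"{ktpi * mfci / 1e9:.0f} Gb/in^2")   # -> ~804 Gb/in^2, consistent with the >800 figure
```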

  2. Recording and automated analysis of naturalistic bioptic driving.

    Science.gov (United States)

    Luo, Gang; Peli, Eli

    2011-05-01

    People with moderate central vision loss are legally permitted to drive with a bioptic telescope in 39 US states and the Netherlands, but the safety of bioptic driving remains highly controversial. There is no scientific evidence about bioptic use and its impact on safety. We propose searching for evidence by recording naturalistic driving activities in patients' cars. In a pilot study we used an analogue video system to record two bioptic drivers' daily driving activities for 10 and 5 days, respectively. In this technical report, we also describe our novel digital system that collects vehicle manoeuvre information and enables recording over more extended periods, and discuss our approach to analyzing the vast amount of data. Our observations of telescope use by the pilot subjects were quite different from their reports in a previous survey. One subject used the telescope only seven times in nearly 6 h of driving. For the other subject, the average interval between telescope use was about 2 min, and Mobile (cell) phone use in one trip extended the interval to almost 5 min. We demonstrate that computerized analysis of lengthy recordings based on video, GPS, acceleration, and black box data can be used to select informative segments for efficient off-line review of naturalistic driving behaviours. The inconsistency between self reports and objective data as well as infrequent telescope use underscores the importance of recording bioptic driving behaviours in naturalistic conditions over extended periods. We argue that the new recording system is important for understanding bioptic use behaviours and bioptic driving safety. © 2011 The College of Optometrists.

  3. Opto-mechatronics issues in solid immersion lens based near-field recording

    Science.gov (United States)

    Park, No-Cheol; Yoon, Yong-Joong; Lee, Yong-Hyun; Kim, Joong-Gon; Kim, Wan-Chin; Choi, Hyun; Lim, Seungho; Yang, Tae-Man; Choi, Moon-Ho; Yang, Hyunseok; Rhim, Yoon-Chul; Park, Young-Pil

    2007-06-01

    We analyzed the effects of an external shock on the collision problem in solid immersion lens (SIL) based near-field recording (NFR) through a shock response analysis and proposed a possible solution to this problem by adopting a protector and a safety mode. With this proposed method, collisions between the SIL and the media can be avoided. We showed a possible solution to the contamination problem in SIL-based NFR through a numerical air-flow analysis. We also introduced possible solid immersion lens designs to increase the fabrication and assembly tolerances of an optical head with a replicated lens. Potentially, these research results could advance NFR technology toward a commercial product.

  4. A computerised out-patient medical records programme based on the Summary Time-Oriented Record (STOR) System.

    Science.gov (United States)

    Cheong, P Y; Goh, L G; Ong, R; Wong, P K

    1992-12-01

    Advances in microcomputer hardware and software technology have made computerised outpatient medical records practical. We have developed a programme based on the Summary Time-Oriented Record (STOR) system which complements existing paper-based record keeping. The elements of the Problem Oriented Medical Record (POMR) System are displayed in two windows within one screen, namely, the SOAP (Subjective information, Objective information, Assessments and Plans) elements in the Reason For Encounter (RFE) window and the problem list with outcomes in the Problem List (PL) window. Context sensitive child windows display details of plans of management in the RFE window and clinical notes in the PL window. The benefits of such innovations to clinical decision making and practice based research and its medico-legal implications are discussed.

  5. How do repeat suicide attempters differ from first timers? An exploratory record based analysis

    Directory of Open Access Journals (Sweden)

    Vikas Menon

    2016-01-01

    Background: Evidence indicates that repeat suicide attempters, as a group, may differ from first-time attempters. The identification of repeat attempters is a powerful but underutilized clinical variable. Aims: In this research, we aimed to compare individuals with lifetime histories of multiple attempts with first-time attempters to identify factors predictive of repeat attempts. Setting and Design: This was a retrospective record-based study carried out at a teaching cum tertiary care hospital in South India. Methods: Relevant data were extracted from the clinical records of first-time attempters (n = 362) and repeat attempters (n = 61) presenting to a single tertiary care center over a 4½-year period. They were compared on various sociodemographic and clinical parameters. The clinical measures included the Presumptive Stressful Life Events Scale, Beck Hopelessness Scale, Coping Strategies Inventory – Short Form, and the Global Assessment of Functioning Scale. Statistical Analysis Used: First-time attempters and repeaters were compared using appropriate inferential statistics. Logistic regression was used to identify independent predictors of repeat attempts. Results: The two groups did not significantly differ on sociodemographic characteristics. Repeat attempters were more likely to have given prior hints about their act (χ2 = 4.500, P = 0.034). In the final regression model, the Beck hopelessness score emerged as a significant predictor of repeat suicide attempts (odds ratio = 1.064, P = 0.020). Conclusion: Among suicide attempters presenting to the hospital, the presence of hopelessness is a predictor of repeat suicide attempts, independent of clinical depression. This highlights the importance of considering hopelessness in the assessment of suicidality with a view to minimizing the risk of future attempts.
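
    A small illustrative note on how the reported odds ratio relates to the underlying logistic-regression coefficient (OR = exp(beta)); the numbers below simply reuse the published OR:

```python
import numpy as np

# The reported odds ratio for the Beck hopelessness score is exp(beta) from the logistic model.
odds_ratio = 1.064
beta = np.log(odds_ratio)
# Odds multiplier for, e.g., a 10-point higher hopelessness score:
print(f"beta = {beta:.4f}; odds ratio over +10 points = {np.exp(10 * beta):.2f}")
```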

  6. Borneo : a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    NARCIS (Netherlands)

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands I developed high spatial resolution patterns of Borneo's botanical richness, endemicity, and the floristic regions. The patterns are derived from species distribution models which predict a species

  7. Simultaneous surface and depth neural activity recording with graphene transistor-based dual-modality probes.

    Science.gov (United States)

    Du, Mingde; Xu, Xianchen; Yang, Long; Guo, Yichuan; Guan, Shouliang; Shi, Jidong; Wang, Jinfen; Fang, Ying

    2018-05-15

    Subdural surface and penetrating depth probes are widely applied to record neural activities from the cortical surface and intracortical locations of the brain, respectively. Simultaneous surface and depth neural activity recording is essential to understand the linkage between the two modalities. Here, we develop flexible dual-modality neural probes based on graphene transistors. The neural probes exhibit stable electrical performance even under 90° bending because of the excellent mechanical properties of graphene, and thus allow multi-site recording from the subdural surface of rat cortex. In addition, finite element analysis was carried out to investigate the mechanical interactions between probe and cortex tissue during intracortical implantation. Based on the simulation results, a sharp tip angle of π/6 was chosen to facilitate tissue penetration of the neural probes. Accordingly, the graphene transistor-based dual-modality neural probes have been successfully applied for simultaneous surface and depth recording of epileptiform activity of rat brain in vivo. Our results show that graphene transistor-based dual-modality neural probes can serve as a facile and versatile tool to study tempo-spatial patterns of neural activities. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Social science and linguistic text analysis of nurses’ records

    DEFF Research Database (Denmark)

    Buus, N.; Hamilton, B. E.

    2016-01-01

    … that included analyses of the social and linguistic features of records and recording. Two reviewers extracted data using established criteria for the evaluation of qualitative research papers. A common characteristic of nursing records was the economical use of language with local meanings that conveyed little … disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More probing empirical research of nurses' records and a wider range of theoretical perspectives has the potential to expose the situated meanings of nursing work in healthcare organisations. © 2015 John Wiley & Sons Ltd.

  9. Development of Software for dose Records Data Base Access

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose was the individual dose follow-up control and the data handling for epidemiological studies. Within the Data Base management scheme, software development to allow searching of individual dose records by external authorised users was undertaken. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records from workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of codes and subroutines developed. (Author) 2 refs

  10. Design and reliability analysis of high-speed and continuous data recording system based on disk array

    Science.gov (United States)

    Jiang, Changlong; Ma, Cheng; He, Ning; Zhang, Xugang; Wang, Chongyang; Jia, Huibo

    2002-12-01

    In many real-time fields a sustained high-speed data recording system is required. This paper proposes a high-speed, sustained data recording system based on a complex RAID 3+0 configuration. The system consists of the Array Controller Module (ACM), String Controller Modules (SCMs) and the Main Controller Module (MCM). The ACM, implemented in an FPGA chip, is used to split the high-speed incoming data stream into several lower-speed streams and to generate one parity code stream synchronously. It can also recover the original data stream during reading. The SCMs record the lower-speed streams from the ACM onto the SCSI disk drives. In the SCM, dual-page buffer technology is adopted to implement the speed-matching function and satisfy the need for sustained recording. The MCM monitors the whole system and controls the ACM and SCMs to realize the data striping, reconstruction, and recovery functions. A method for determining the system scale is presented. Finally, two new schemes, the Floating Parity Group (FPG) and the full 2D-Parity Group (full 2D-PG), are proposed to improve system reliability and are compared with the Traditional Parity Group (TPG). With its high reliability, this recording system can be used conveniently in many areas of data recording, storage, playback and remote backup.
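
    A toy sketch of the ACM's split/recover roles: byte-level striping into several streams plus one XOR parity stream, RAID 3-style; this is a simplification of the paper's complex RAID 3+0 design:

```python
from functools import reduce

def split_with_parity(data: bytes, n_streams: int):
    """Stripe a data block into n_streams byte streams plus one XOR parity stream (RAID 3-style)."""
    pad = (-len(data)) % n_streams
    data += bytes(pad)
    streams = [data[i::n_streams] for i in range(n_streams)]
    parity = bytes(reduce(lambda a, b: a ^ b, chunk) for chunk in zip(*streams))
    return streams, parity, pad

def recover_stream(streams, parity, missing: int):
    """Rebuild one lost stream from the surviving streams and the parity stream."""
    survivors = [s for i, s in enumerate(streams) if i != missing] + [parity]
    return bytes(reduce(lambda a, b: a ^ b, chunk) for chunk in zip(*survivors))

data = b"high-speed sustained recording"
streams, parity, pad = split_with_parity(data, 4)
rebuilt = recover_stream(streams, parity, missing=2)
assert rebuilt == streams[2]
print("stream 2 recovered:", rebuilt)
```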

  11. Analysis of infant cortical synchrony is constrained by the number of recording electrodes and the recording montage.

    Science.gov (United States)

    Tokariev, Anton; Vanhatalo, Sampsa; Palva, J Matias

    2016-01-01

    To assess how the recording montage in the neonatal EEG influences the detection of cortical source signals and their phase interactions. Scalp EEG was simulated by forward modeling 20-200 simultaneously active sources covering the cortical surface of a realistic neonatal head model. We assessed systematically how the number of scalp electrodes (11-85), analysis montage, or the size of cortical sources affect the detection of cortical phase synchrony. Statistical metrics were developed for quantifying the resolution and reliability of the montages. The findings converge to show that an increase in the number of recording electrodes leads to a systematic improvement in the detection of true cortical phase synchrony. While there is always a ceiling effect with respect to discernible cortical details, we show that the average and Laplacian montages exhibit superior specificity and sensitivity as compared to other conventional montages. Reliability in assessing true neonatal cortical synchrony is directly related to the choice of EEG recording and analysis configurations. Because of the high conductivity of the neonatal skull, the conventional neonatal EEG recordings are spatially far too sparse for pertinent studies, and this loss of information cannot be recovered by re-montaging during analysis. Future neonatal EEG studies will need prospective planning of recording configuration to allow analysis of spatial details required by each study question. Our findings also advice about the level of details in brain synchrony that can be studied with existing datasets or by using conventional EEG recordings. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?

    Science.gov (United States)

    Strathie, Ailsa; Walker, Guy H

    2016-03-01

    A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
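
    A small sketch of the four graph metrics named above, computed with networkx on a toy action-transition graph; the event sequence is invented and is not the on-train recorder data, and the sociometric-status formula is one common convention:

```python
import networkx as nx

# Toy event sequence standing in for recorded driver actions (illustrative only).
events = ["power", "brake", "horn", "brake", "power", "brake", "power", "horn", "power"]
G = nx.DiGraph()
for a, b in zip(events, events[1:]):
    G.add_edge(a, b)

n = G.number_of_nodes()
links = G.number_of_edges()
density = nx.density(G)
diameter = nx.diameter(G.to_undirected())        # needs a connected graph
# One common definition of sociometric status: links in and out of a node over (n - 1).
status = {v: (G.in_degree(v) + G.out_degree(v)) / (n - 1) for v in G}

print(f"links={links}, density={density:.2f}, diameter={diameter}")
print("sociometric status:", status)
```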

  13. Analysis of recorded earthquake response data at the Hualien large-scale seismic test site

    International Nuclear Information System (INIS)

    Hyun, C.H.; Tang, H.T.; Dermitzakis, S.; Esfandiari, S.

    1997-01-01

    A soil-structure interaction (SSI) experiment is being conducted in a seismically active region in Hualien, Taiwan. To obtain earthquake data for quantifying SSI effects and providing a basis to benchmark analysis methods, a 1/4-scale cylindrical concrete containment model similar in shape to that of a nuclear power plant containment was constructed in the field, where both the containment model and its surrounding soil, surface and sub-surface, are extensively instrumented to record earthquake data. Between September 1993 and May 1995, eight earthquakes with Richter magnitudes ranging from 4.2 to 6.2 were recorded. The authors focus on studying and analyzing the recorded data to provide information on the response characteristics of the Hualien soil-structure system, the SSI effects and the ground motion characteristics. An effort was also made to directly determine the physical properties of the site soil based on correlation analysis of the recorded data. No modeling simulations were attempted to analytically predict the SSI response of the soil and the structure; these will be the scope of a subsequent study

  14. Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis

    DEFF Research Database (Denmark)

    Rasmussen, Sune Olander; Andersen, Katrine K.; Johnsen, Sigfus Johann

    2005-01-01

    Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas...... of the data for high-resolution studies such as annual layer counting. The presented method uses deconvolution techniques and is robust to the presence of noise in the measurements. If integrated into the data processing, it requires no additional data collection. The method is applied to selected ice core...
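
    An illustrative sketch of regularized (Wiener-style) frequency-domain deconvolution applied to a synthetic smoothed signal; the exponential mixing kernel is an assumption for illustration, not the authors' measured CFA transfer function:

```python
import numpy as np

def wiener_deconvolve(y, kernel, noise_to_signal=1e-2):
    """Regularized frequency-domain deconvolution of y with a known smoothing kernel."""
    n = len(y)
    H = np.fft.rfft(kernel, n)
    Y = np.fft.rfft(y)
    G = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)   # Wiener-style inverse filter
    return np.fft.irfft(Y * G, n)

# Synthetic "annual layer" signal smoothed by an exponential mixing kernel (illustrative).
x = np.zeros(400)
x[::20] = 1.0                                    # sharp annual peaks every 20 samples
tau = 6.0
kernel = np.exp(-np.arange(50) / tau)
kernel /= kernel.sum()
y = np.convolve(x, kernel)[: len(x)] + 0.01 * np.random.default_rng(4).standard_normal(len(x))

x_hat = wiener_deconvolve(y, kernel)
print("peak positions roughly restored:", np.flatnonzero(x_hat > 0.5)[:5])
```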

  15. The Sensitivity of Flood Frequency Analysis on Record Length in the Contiguous United States

    Science.gov (United States)

    Hu, L.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    In flood frequency analysis (FFA), sufficiently long data series are important for obtaining reliable results. Relative to the return periods of interest, at-site FFA usually needs large data sets. Generally, the precision of at-site estimators and the time-sampling errors are associated with the length of the gauged record. In this work, we quantify the differences associated with various record lengths. We use the generalized extreme value (GEV) and Log Pearson type III (LP3) distributions, two traditional methods applied to annual maximum streamflows, to undertake FFA, and propose quantitative measures, the relative difference in the median and the interquartile range (IQR), to compare flood frequency performance across different record lengths at 350 selected USGS gauges that have more than 70 years of record in the contiguous United States. We also group those gauges into different regions based on the hydrologic unit map and discuss the geographic effects. The results indicate that a long record length avoids imposing an upper limit on the degree of sophistication. Working with relatively longer records may lead to more accurate results than working with shorter records. Furthermore, the influence of the hydrologic units of the watershed boundary dataset on those gauges is also presented. The California region is the most sensitive to record length, while gauges in the east perform more steadily.
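
    A short sketch of at-site FFA with a GEV fit to annual maxima, comparing a 100-year return level estimated from a long record against the same series truncated to a shorter record; the data are synthetic, not USGS records, and LP3 is omitted for brevity:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
# Synthetic 70-year annual-maximum series (m^3/s); not USGS data.
annual_max = genextreme.rvs(c=-0.1, loc=300, scale=80, size=70, random_state=rng)

def return_level(series, return_period):
    """Flood quantile for a given return period from a GEV fit to annual maxima."""
    c, loc, scale = genextreme.fit(series)
    return genextreme.ppf(1 - 1 / return_period, c, loc=loc, scale=scale)

q100_full = return_level(annual_max, 100)
q100_short = return_level(annual_max[:30], 100)     # same site, shorter record
print(f"100-yr flood, 70-yr record: {q100_full:.0f};  30-yr record: {q100_short:.0f}")
print(f"relative difference: {abs(q100_full - q100_short) / q100_full:.1%}")
```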

  16. Decision Support System for Medical Care Quality Assessment Based on Health Records Analysis in Russia.

    Science.gov (United States)

    Taranik, Maksim; Kopanitsa, Georgy

    2017-01-01

    The paper presents a decision support system developed for healthcare providers. The system allows healthcare providers to detect and decrease nonconformities in health records and to forecast the sum of insurance payments, taking nonconformities into account. Its components are based on the ISO 13606 standard, fuzzy logic and the case-based reasoning concept. Implementation of the system allowed a 10% increase in insurance payments for the healthcare provider.

  17. Towards successful coordination of electronic health record based-referrals: a qualitative analysis.

    Science.gov (United States)

    Hysong, Sylvia J; Esquivel, Adol; Sittig, Dean F; Paul, Lindsey A; Espadas, Donna; Singh, Simran; Singh, Hardeep

    2011-07-27

    Successful subspecialty referrals require considerable coordination and interactive communication among the primary care provider (PCP), the subspecialist, and the patient, which may be challenging in the outpatient setting. Even when referrals are facilitated by electronic health records (EHRs) (i.e., e-referrals), lapses in patient follow-up might occur. Although compelling reasons exist why referral coordination should be improved, little is known about which elements of the complex referral coordination process should be targeted for improvement. Using Okhuysen & Bechky's coordination framework, this paper aims to understand the barriers, facilitators, and suggestions for improving communication and coordination of EHR-based referrals in an integrated healthcare system. We conducted a qualitative study to understand coordination breakdowns related to e-referrals in an integrated healthcare system and examined work-system factors that affect the timely receipt of subspecialty care. We conducted interviews with seven subject matter experts and six focus groups with a total of 30 PCPs and subspecialists at two tertiary care Department of Veterans Affairs (VA) medical centers. Using techniques from grounded theory and content analysis, we identified organizational themes that affected the referral process. Four themes emerged: lack of an institutional referral policy, lack of standardization in certain referral procedures, ambiguity in roles and responsibilities, and inadequate resources to adapt and respond to referral requests effectively. Marked differences in PCPs' and subspecialists' communication styles and individual mental models of the referral processes likely precluded the development of a shared mental model to facilitate coordination and successful referral completion. Notably, very few barriers related to the EHR were reported. Despite facilitating information transfer between PCPs and subspecialists, e-referrals remain prone to coordination

  18. Analysis of the security and privacy requirements of cloud-based electronic health records systems.

    Science.gov (United States)

    Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel

    2013-08-21

    The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. To show that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access

  19. Analysis of Handling Processes of Record Versions in NoSQL Databases

    Directory of Open Access Journals (Sweden)

    Yu. A. Grigorev

    2015-01-01

    This article investigates the processes of handling record versions in NoSQL databases. The goal of this work is to develop a model which enables users both to handle record versions and to work with a record simultaneously. This model allows us to estimate both the distribution of the time users need to handle record versions and the distribution of the count of record versions. With eventual consistency (W=R=1) there is a possibility for several users to update any record simultaneously. In this case, several versions of records with the same key will be stored in the database. When reading, the user obtains all versions, handles them, and saves a new version, while older versions are deleted. According to the model, the user's time for handling the record versions consists of two parts: the random handling time of each version and the random deliberation time for handling the result. Record saving and deleting times are much shorter than handling time, so they are ignored in the model. The paper offers two model variants. In the first variant, the client's handling time is calculated as the sum of the random handling times of the individual versions, based on the count of record versions. This variant explicitly ignores the fact that the handling time of record versions may depend on the number of updates performed by other users between the current client's sequential updates of the record. The second variant takes this feature into consideration. The developed models were implemented in the GPSS environment. Model experiments with different counts of clients and different ratios between single-version handling time and result deliberation time were conducted. The analysis showed that despite the resemblance of the model variants, the difference in the nature of change of the average count of record versions and of the handling time is significant. In the second variant, dependences of the average count of record versions in the database and
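
    A toy in-memory model of the situation described: with W=R=1, concurrent writers leave sibling versions under one key, and a reader later collects them all, merges them, saves the merged record and discards the older versions; this is a sketch under those assumptions, not the authors' GPSS model:

```python
import itertools

class EventualStore:
    """Toy key-value store where W=R=1 lets concurrent writes pile up as sibling versions."""
    _vid = itertools.count(1)

    def __init__(self):
        self.versions = {}                       # key -> {version_id: value}

    def write(self, key, value):
        self.versions.setdefault(key, {})[next(self._vid)] = value

    def read_all(self, key):
        return dict(self.versions.get(key, {}))  # the reader gets every sibling

    def resolve(self, key, merge):
        siblings = self.read_all(key)
        merged = merge(siblings.values())        # user-supplied reconciliation
        self.versions[key] = {}                  # older versions are deleted...
        self.write(key, merged)                  # ...and the merged result is saved

store = EventualStore()
store.write("cart:42", {"apples": 1})
store.write("cart:42", {"bananas": 2})           # concurrent writer -> second sibling
print("siblings before resolve:", len(store.read_all("cart:42")))
store.resolve("cart:42", lambda vals: {k: v for d in vals for k, v in d.items()})
print("after resolve:", store.read_all("cart:42"))
```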

  20. A flexible time recording and time correlation analysis system

    International Nuclear Information System (INIS)

    Shenhav, N.J.; Leiferman, G.; Segal, Y.; Notea, A.

    1983-01-01

    A system was developed to digitize and record the time intervals between detection event pulses fed to its input channels from a detection device. The accumulated data are transferred continuously in real time to a disc through a PDP 11/34 minicomputer. Even though the system was designed for a specific purpose, i.e., the comparative study of passive neutron nondestructive assay methods, its features characterize it as a general-purpose time series recorder. The time correlation analysis is performed by software after completion of the data accumulation. The digitizing clock period is selectable; any value larger than a minimum of 100 ns may be chosen. Bursts of up to 128 events with a frequency of up to 10 MHz may be recorded. With the present recorder-minicomputer combination, the maximal average recording frequency is 40 kHz. (orig.)

  1. Near-field optical recording based on solid immersion lens system

    Science.gov (United States)

    Hong, Tao; Wang, Jia; Wu, Yan; Li, Dacheng

    2002-09-01

    Near-field optical recording based on the solid immersion lens (SIL) has attracted great attention in the field of high-density data storage in recent years. The diffraction-limited spot size in optical recording and lithography can be decreased by utilizing the SIL. SIL near-field optical storage has the advantages of high density, mass storage capacity and compatibility with many well-developed technologies. We have set up a SIL near-field static recording system. The recording medium is placed on a 3-D scanning stage with a scanning range of 70×70×70 μm and sub-nanometer positioning accuracy, which ensures the rigorous separation control required in a SIL system and the precise motion of the recording medium. The SIL is mounted on an inverted microscope. The focusing between the long-working-distance objective and the SIL can be monitored with a CCD camera or by eye. The readout signal is collected by a detector. Some experiments have been performed with the SIL near-field recording system. Near-field recording on a photochromic medium has been attempted and the resolution improvement provided by the SIL is presented. Factors influencing the SIL near-field recording system are also discussed.

  2. 76 FR 76103 - Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records

    Science.gov (United States)

    2011-12-06

    SUMMARY: Notice is hereby given that portions of the Risk Analysis and Management (RAM) Records, State-78, system of records contain criminal ...

  3. Microcomputer-based system for 24-hour recording of oesophageal motility and pH profile with automated analysis

    NARCIS (Netherlands)

    Breedijk, M.; Smout, A. J.; van der Zouw, C.; Verwey, H.; Akkermans, L. M.

    1989-01-01

    A system developed for long-term simultaneous recording of oesophageal motility and pH in the ambulant patient is described. The system consists of a microprocessor based data-acquisition and preprocessing device, a personal computer for postprocessing, report generation and data storage, a

  4. [Analysis on regularity of prescriptions in "a guide to clinical practice with medical record" for diarrhoea based on traditional Chinese medicine inheritance support system].

    Science.gov (United States)

    He, Lan-Juan; Zhu, Xiang-Dong

    2016-06-01

    To analyze the regularities of prescriptions in "a guide to clinical practice with medical record" (Ye Tianshi) for diarrhoea based on the traditional Chinese medicine inheritance support system (V2.5), and to provide a reference for further research and development of new traditional Chinese medicines for treating diarrhoea. The traditional Chinese medicine inheritance support system was used to build a prescription database of Chinese medicines for diarrhoea. The software's integrated data mining methods were used to analyze the prescriptions in the database according to "four natures", "five flavors" and "meridians", and to perform frequency statistics, syndrome distribution, prescription regularity and new prescription analysis. An analysis of 94 prescriptions for diarrhoea determined the frequencies of individual medicines, the commonly used medicine pairs and combinations, and yielded 13 new prescriptions. This study indicated that the prescriptions for diarrhoea in "a guide to clinical practice with medical record" mostly aim at eliminating dampness and tonifying deficiency, with neutral drug properties and sweet, bitter or hot flavors, reflecting the treatment principle of "activating spleen-energy and resolving dampness". Copyright© by the Chinese Pharmaceutical Association.

  5. Extension and statistical analysis of the GACP aerosol optical thickness record

    Science.gov (United States)

    Geogdzhayev, Igor V.; Mishchenko, Michael I.; Li, Jing; Rossow, William B.; Liu, Li; Cairns, Brian

    2015-10-01

    The primary product of the Global Aerosol Climatology Project (GACP) is a continuous record of the aerosol optical thickness (AOT) over the oceans. It is based on channel-1 and -2 radiance data from the Advanced Very High Resolution Radiometer (AVHRR) instruments flown on successive National Oceanic and Atmospheric Administration (NOAA) platforms. We extend the previous GACP dataset by four years through the end of 2009 using NOAA-17 and -18 AVHRR radiances recalibrated against MODerate resolution Imaging Spectroradiometer (MODIS) radiance data, thereby making the GACP record almost three decades long. The temporal overlap of over three years of the new NOAA-17 and the previous NOAA-16 record reveals an excellent agreement of the corresponding global monthly mean AOT values, thereby confirming the robustness of the vicarious radiance calibration used in the original GACP product. The temporal overlap of the NOAA-17 and -18 instruments is used to introduce a small additive adjustment to the channel-2 calibration of the latter resulting in a consistent record with increased data density. The Principal Component Analysis (PCA) of the newly extended GACP record shows that most of the volcanic AOT variability can be isolated into one mode responsible for ~ 12% of the total variance. This conclusion is confirmed by a combined PCA analysis of the GACP, MODIS, and Multi-angle Imaging SpectroRadiometer (MISR) AOTs during the volcano-free period from February 2000 to December 2009. We show that the modes responsible for the tropospheric AOT variability in the three datasets agree well in terms of correlation and spatial patterns. A previously identified negative AOT trend which started in the late 1980s and continued into the early 2000s is confirmed. Its magnitude and duration indicate that it was caused by changes in tropospheric aerosols. The latest multi-satellite segment of the GACP record shows that this trend tapered off, with no noticeable AOT change after 2002. This
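
    As a rough illustration of the kind of decomposition applied to the extended record, the sketch below runs a PCA (via the SVD) on a hypothetical (months x grid cells) matrix of AOT anomalies; the array sizes and random values are placeholders, and the actual GACP processing involves calibration and screening steps not shown here.

```python
import numpy as np

# Hypothetical monthly AOT record: 348 months x 1000 ocean grid cells.
rng = np.random.default_rng(0)
aot = rng.random((348, 1000))

anomalies = aot - aot.mean(axis=0)           # remove the climatological mean
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

explained = s**2 / np.sum(s**2)              # fraction of variance per mode
pc1_series = u[:, 0] * s[0]                  # leading mode: its time series ...
pc1_pattern = vt[0]                          # ... and its spatial pattern
print(f"Mode 1 explains {explained[0]:.1%} of the variance",
      pc1_series.shape, pc1_pattern.shape)
```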

  6. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Cho, Sung Gook; Joe, Yang Hee

    2005-01-01

    By nature, seismic fragility analysis results are considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra might overestimate the seismic capacities of Korean facilities.

  7. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)]. E-mail: sgcho@incheon.ac.kr; Joe, Yang Hee [Department of Civil and Environmental System Engineering, University of Incheon, 177 Dohwa-dong, Nam-gu, Incheon 402-749 (Korea, Republic of)

    2005-08-01

    By nature, seismic fragility analysis results are considerably affected by the statistical data of the design information and the site-dependent ground motions. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are analyzed in this paper. An improved method of seismic fragility analysis is evaluated by comparative analyses to verify its efficiency for practical application to nuclear power plant structures. The effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures are also evaluated in the comparative studies. The obtained results show that the proposed method is more efficient for multi-mode structures. The case study results show that seismic fragility analysis based on Newmark's spectra might overestimate the seismic capacities of Korean facilities.

  8. Provincial prenatal record revision: a multiple case study of evidence-based decision-making at the population-policy level

    Directory of Open Access Journals (Sweden)

    Olson Joanne

    2008-12-01

    Full Text Available Abstract Background There is a significant gap in the knowledge translation literature related to how research evidence actually contributes to health care decision-making. Decisions around what care to provide at the population (rather than individual) level are particularly complex, involving considerations such as feasibility, cost, and population needs in addition to scientific evidence. One example of decision-making at this "population-policy" level involves what screening questions and intervention guides to include on standardized provincial prenatal records. As mandatory medical reporting forms, prenatal records are potentially powerful vehicles for promoting population-wide evidence-based care. However, the extent to which Canadian prenatal records reflect best-practice recommendations for the assessment of well-known risk factors such as maternal smoking and alcohol consumption varies markedly across Canadian provinces and territories. The goal of this study is to better understand the interaction of contextual factors and research evidence on decision-making at the population-policy level, by examining the processes by which provincial prenatal records are reviewed and revised. Methods Guided by Dobrow et al.'s (2004) conceptual model for context-based evidence-based decision-making, this study will use a multiple case study design with embedded units of analysis to examine contextual factors influencing the prenatal record revision process in different Canadian provinces and territories. Data will be collected using multiple methods to construct detailed case descriptions for each province/territory. Using qualitative data analysis techniques, decision-making processes involving prenatal record content specifically related to maternal smoking and alcohol use will be compared both within and across each case, to identify key contextual factors influencing the uptake and application of research evidence by prenatal record review

  9. Inference for exponentiated general class of distributions based on record values

    Directory of Open Access Journals (Sweden)

    Samah N. Sindi

    2017-09-01

    Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases of distributions and exponentiated Gompertz distribution are applied to the EGC of distributions.  
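
    For orientation, the standard exponentiated construction and the usual likelihood based on lower record values are sketched below in generic form; the paper's exact parameterization of the EGC and its prior choices may differ.

```latex
% Exponentiated class built from a baseline CDF G(x), with power parameter \theta > 0
F(x;\theta) = \bigl[G(x)\bigr]^{\theta}, \qquad
f(x;\theta) = \theta\, g(x) \bigl[G(x)\bigr]^{\theta - 1}

% Likelihood based on the observed lower record values r_1 > r_2 > \dots > r_n
L(\theta \mid r_1,\dots,r_n) = f(r_n;\theta) \prod_{i=1}^{n-1} \frac{f(r_i;\theta)}{F(r_i;\theta)}
```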

  10. Classifying Normal and Abnormal Status Based on Video Recordings of Epileptic Patients

    Directory of Open Access Journals (Sweden)

    Jing Li

    2014-01-01

    Full Text Available Based on video recordings of the movement of patients with epilepsy, this paper proposes a human action recognition scheme to detect distinct motion patterns and to distinguish the normal status from the abnormal status of epileptic patients. The scheme first extracts local features and holistic features, which are complementary to each other. Afterwards, a support vector machine is applied for classification. Based on the experimental results, this scheme obtains a satisfactory classification result and provides a fundamental analysis towards human-robot interaction with socially assistive robots caring for patients with epilepsy (or other patients with brain disorders) in order to protect them from injury.

  11. [Web-based electronic patient record as an instrument for quality assurance within an integrated care concept].

    Science.gov (United States)

    Händel, A; Jünemann, A G M; Prokosch, H-U; Beyer, A; Ganslandt, T; Grolik, R; Klein, A; Mrosek, A; Michelson, G; Kruse, F E

    2009-03-01

    A prerequisite for integrated care programmes is the implementation of a communication network meeting quality assurance standards. Against this background, the main objective of the integrated care project between the University Eye Hospital Erlangen and the health insurance company AOK Bayern was to evaluate the potential and the acceptance of a web-based electronic patient record in the context of cataract and retinal surgery. Standardised modules for capturing pre-, intra- and post-operative data on the basis of clinical pathway guidelines for cataract and retinal surgery have been developed. There are 6 data sets recorded per patient (1 pre-operative, 1 operative, 4-6 post-operative). For data collection, a web-based communication system (Soarian Integrated Care) has been chosen which meets the high requirements in data security, as well as being easy to handle. This teleconsultation system and the embedded electronic patient record are independent of the software used by the respective offices and hospitals. Data transmission and storage were carried out in real time. At present, 101 private ophthalmologists are taking part in the IGV contract with the University Eye Hospital Erlangen. This corresponds to 52% of all private ophthalmologists in the region. During the period from January 1st 2006 to December 31st 2006, 1844 patients were entered. Complete documentation was achieved in 1390 (75%) of all surgical procedures. For the evaluation of these data, a multidimensional report and analysis tool (Cognos) was used. The deviation from target refraction, as one quality indicator, was on average 0.09 diopter. The web-based patient record used in this project was highly accepted by the private ophthalmologists. However, there are still general concerns about the exchange of medical data via the internet. Nevertheless, the web-based patient record is an essential tool for functional integration between ambulatory and inpatient health-care units. In addition to the

  12. Towards successful coordination of electronic health record based-referrals: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Paul Lindsey A

    2011-07-01

    Full Text Available Abstract Background Successful subspecialty referrals require considerable coordination and interactive communication among the primary care provider (PCP, the subspecialist, and the patient, which may be challenging in the outpatient setting. Even when referrals are facilitated by electronic health records (EHRs (i.e., e-referrals, lapses in patient follow-up might occur. Although compelling reasons exist why referral coordination should be improved, little is known about which elements of the complex referral coordination process should be targeted for improvement. Using Okhuysen & Bechky's coordination framework, this paper aims to understand the barriers, facilitators, and suggestions for improving communication and coordination of EHR-based referrals in an integrated healthcare system. Methods We conducted a qualitative study to understand coordination breakdowns related to e-referrals in an integrated healthcare system and examined work-system factors that affect the timely receipt of subspecialty care. We conducted interviews with seven subject matter experts and six focus groups with a total of 30 PCPs and subspecialists at two tertiary care Department of Veterans Affairs (VA medical centers. Using techniques from grounded theory and content analysis, we identified organizational themes that affected the referral process. Results Four themes emerged: lack of an institutional referral policy, lack of standardization in certain referral procedures, ambiguity in roles and responsibilities, and inadequate resources to adapt and respond to referral requests effectively. Marked differences in PCPs' and subspecialists' communication styles and individual mental models of the referral processes likely precluded the development of a shared mental model to facilitate coordination and successful referral completion. Notably, very few barriers related to the EHR were reported. Conclusions Despite facilitating information transfer between PCPs and

  13. Copula Regression Analysis of Simultaneously Recorded Frontal Eye Field and Inferotemporal Spiking Activity during Object-Based Working Memory

    Science.gov (United States)

    Hu, Meng; Clark, Kelsey L.; Gong, Xiajing; Noudoost, Behrad; Li, Mingyao; Moore, Tirin

    2015-01-01

    Inferotemporal (IT) neurons are known to exhibit persistent, stimulus-selective activity during the delay period of object-based working memory tasks. Frontal eye field (FEF) neurons show robust, spatially selective delay period activity during memory-guided saccade tasks. We present a copula regression paradigm to examine neural interaction of these two types of signals between areas IT and FEF of the monkey during a working memory task. This paradigm is based on copula models that can account for both marginal distribution over spiking activity of individual neurons within each area and joint distribution over ensemble activity of neurons between areas. Considering the popular GLMs as marginal models, we developed a general and flexible likelihood framework that uses the copula to integrate separate GLMs into a joint regression analysis. Such joint analysis essentially leads to a multivariate analog of the marginal GLM theory and hence efficient model estimation. In addition, we show that Granger causality between spike trains can be readily assessed via the likelihood ratio statistic. The performance of this method is validated by extensive simulations, and compared favorably to the widely used GLMs. When applied to spiking activity of simultaneously recorded FEF and IT neurons during working memory task, we observed significant Granger causality influence from FEF to IT, but not in the opposite direction, suggesting the role of the FEF in the selection and retention of visual information during working memory. The copula model has the potential to provide unique neurophysiological insights about network properties of the brain. PMID:26063909
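
    The generic copula-GLM template behind such a joint regression, written here for two responses with continuous margins, is sketched below; the paper applies it to spike-train data with particular marginal GLMs and copula families, so this is only the structural idea.

```latex
% Marginal (GLM) distributions of the two neurons' responses given covariates x
y_1 \mid x \sim F_1(\,\cdot \mid x;\beta_1), \qquad y_2 \mid x \sim F_2(\,\cdot \mid x;\beta_2)

% A copula C with parameter \theta couples the marginals into one joint model
F(y_1, y_2 \mid x) = C\bigl(F_1(y_1 \mid x),\, F_2(y_2 \mid x);\, \theta\bigr)

% For continuous margins the joint density factorizes into copula density times marginals
f(y_1, y_2 \mid x) = c\bigl(F_1(y_1 \mid x),\, F_2(y_2 \mid x);\, \theta\bigr)\, f_1(y_1 \mid x)\, f_2(y_2 \mid x)
```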

  14. Barriers to retrieving patient information from electronic health record data: failure analysis from the TREC Medical Records Track.

    Science.gov (United States)

    Edinger, Tracy; Cohen, Aaron M; Bedrick, Steven; Ambert, Kyle; Hersh, William

    2012-01-01

    Secondary use of electronic health record (EHR) data relies on the ability to retrieve accurate and complete information about desired patient populations. The Text Retrieval Conference (TREC) 2011 Medical Records Track was a challenge evaluation allowing comparison of systems and algorithms to retrieve patients eligible for clinical studies from a corpus of de-identified medical records, grouped by patient visit. Participants retrieved cohorts of patients relevant to 35 different clinical topics, and visits were judged for relevance to each topic. This study identified the most common barriers to identifying specific clinical populations in the test collection. Using the runs from track participants and judged visits, we analyzed the five non-relevant visits most often retrieved and the five relevant visits most often overlooked. Categories were developed iteratively to group the reasons for incorrect retrieval for each of the 35 topics. Reasons fell into nine categories for non-relevant visits and five categories for relevant visits. Non-relevant visits were most often retrieved because they contained a non-relevant reference to the topic terms. Relevant visits were most often overlooked because they used a synonym for a topic term. This failure analysis provides insight into areas for future improvement in EHR-based retrieval with techniques such as more widespread and complete use of standardized terminology in retrieval and data entry systems.

  15. Experimental investigation on spontaneously active hippocampal cultures recorded by means of high-density MEAs: analysis of the spatial resolution effects

    Directory of Open Access Journals (Sweden)

    Alessandro Maccione

    2010-05-01

    Full Text Available Based on experiments performed with high-resolution Active Pixel Sensor microelectrode arrays (APS-MEAs) coupled with spontaneously active hippocampal cultures, this work investigates the effects of the spatial resolution of the neuroelectronic interface on the analysis of the recorded electrophysiological signals. The adopted methodology consists, first, in recording the spontaneous activity at the highest spatial resolution (inter-electrode separation of 21 µm) from the whole array of 4096 microelectrodes. Then, the full-resolution dataset is spatially downsampled in order to evaluate the effects on the raster plot representation, the array-wide spike rate (AWSR), the mean firing rate (MFR) and the mean bursting rate (MBR). Furthermore, the effects of the array-to-network relative position are evaluated by shifting a subset of equally spaced electrodes over the entire recorded area. Results highlight that the MFR and MBR are particularly influenced by the spatial resolution provided by the neuroelectronic interface. On high-resolution large MEAs, such analyses better represent the time-based parameterization of the network dynamics. Finally, this work suggests interesting capabilities of high-resolution MEAs for spatially resolved analysis in dense and sparse neuronal preparations for investigating signalling in both local and global neuronal circuits.
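
    A toy version of the downsampling comparison described above is sketched below: spike counts on a 64x64 electrode grid (matching the 4096-electrode APS-MEA) are reduced to coarser grids and the mean firing rate is recomputed. The Poisson counts and the 60 s duration are made-up stand-ins for real recordings.

```python
import numpy as np

rng = np.random.default_rng(1)
duration_s = 60.0
# Hypothetical spike counts per electrode on the 64 x 64 APS-MEA grid.
spike_counts = rng.poisson(lam=30, size=(64, 64))

def mean_firing_rate(counts, duration):
    """Mean firing rate (spikes/s) averaged over the electrodes kept."""
    return counts.mean() / duration

for step in (1, 2, 4, 8):          # keep every `step`-th electrode per axis
    subset = spike_counts[::step, ::step]
    print(f"{subset.size:5d} electrodes -> MFR = "
          f"{mean_firing_rate(subset, duration_s):.2f} Hz")
```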

  16. A Quantitative Analysis of an EEG Epileptic Record Based on Multiresolution Wavelet Coefficients

    Directory of Open Access Journals (Sweden)

    Mariel Rosenblatt

    2014-11-01

    Full Text Available The characterization of the dynamics associated with electroencephalogram (EEG) signals, combining an orthogonal discrete wavelet transform analysis with quantifiers originating from information theory, is reviewed. In addition, an extension of this methodology based on multiresolution quantities, called wavelet leaders, is presented. In particular, the temporal evolution of the Shannon entropy and the statistical complexity evaluated with different sets of multiresolution wavelet coefficients is considered. Both methodologies are applied to the quantitative EEG time series analysis of a tonic-clonic epileptic seizure, and comparative results are presented. In particular, while both methods describe the dynamical changes of the EEG time series, the one based on wavelet leaders offers better time resolution.
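
    A minimal sketch of the wavelet-energy entropy used in the DWT half of the methodology is given below, using the PyWavelets package; the wavelet family (db4), the decomposition depth, and the synthetic test signal are assumptions, and the wavelet-leader extension is not shown.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_shannon_entropy(signal, wavelet="db4", level=5):
    """Normalized Shannon entropy of the relative wavelet energies."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c**2) for c in coeffs])
    p = energies / energies.sum()               # relative wavelet energy
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(len(coeffs))

t = np.linspace(0, 4, 4096)
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(wavelet_shannon_entropy(eeg_like))
```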

  17. Instrumentation for low noise nanopore-based ionic current recording under laser illumination

    Science.gov (United States)

    Roelen, Zachary; Bustamante, José A.; Carlsen, Autumn; Baker-Murray, Aidan; Tabard-Cossa, Vincent

    2018-01-01

    We describe a nanopore-based optofluidic instrument capable of performing low-noise ionic current recordings of individual biomolecules under laser illumination. In such systems, simultaneous optical measurements generally introduce significant parasitic noise in the electrical signal, which can severely reduce the instrument sensitivity, critically hindering the monitoring of single-molecule events in the ionic current traces. Here, we present design rules and describe simple adjustments to the experimental setup to mitigate the different noise sources encountered when integrating optical components to an electrical nanopore system. In particular, we address the contributions to the electrical noise spectra from illuminating the nanopore during ionic current recording and mitigate those effects through control of the illumination source and the use of a PDMS layer on the SiNx membrane. We demonstrate the effectiveness of our noise minimization strategies by showing the detection of DNA translocation events during membrane illumination with a signal-to-noise ratio of ˜10 at 10 kHz bandwidth. The instrumental guidelines for noise minimization that we report are applicable to a wide range of nanopore-based optofluidic systems and offer the possibility of enhancing the quality of synchronous optical and electrical signals obtained during single-molecule nanopore-based analysis.

  18. Network Analysis of Time-Lapse Microscopy Recordings

    Directory of Open Access Journals (Sweden)

    Erik eSmedler

    2014-09-01

    Full Text Available Multicellular organisms rely on intercellular communication to regulate important cellular processes critical to life. To further our understanding of those processes, there is a need to scrutinize dynamical signaling events and their functions in both cells and organisms. Here, we report a method and provide MATLAB code that analyzes time-lapse microscopy recordings to identify and characterize network structures within large cell populations, such as interconnected neurons. The approach is demonstrated using intracellular calcium (Ca2+) recordings in neural progenitors and cardiac myocytes, but could be applied to a wide variety of biosensors employed in diverse cell types and organisms. In this method, network structures are analyzed by applying cross-correlation signal processing and graph theory to single-cell recordings. The goal of the analysis is to determine if the single-cell activity constitutes a network of interconnected cells and to decipher the properties of this network. The method can be applied in many fields of biology in which biosensors are used to monitor signaling events in living cells. Analyzing intercellular communication in cell ensembles can reveal essential network structures that provide important biological insights.
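
    The core of the analysis, pairwise correlation of single-cell traces followed by graph construction, can be sketched as follows with numpy and networkx; the zero-lag correlation, the 0.5 threshold, and the synthetic traces are illustrative choices, and the published MATLAB code should be consulted for the actual lagged cross-correlation procedure.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
n_cells, n_frames = 20, 500
traces = rng.standard_normal((n_cells, n_frames))             # stand-in Ca2+ traces
traces[5] = traces[3] + 0.1 * rng.standard_normal(n_frames)   # one correlated pair

threshold = 0.5
corr = np.corrcoef(traces)                 # zero-lag correlation matrix
graph = nx.Graph()
graph.add_nodes_from(range(n_cells))
for i in range(n_cells):
    for j in range(i + 1, n_cells):
        if abs(corr[i, j]) >= threshold:   # connect strongly coupled cells
            graph.add_edge(i, j, weight=corr[i, j])

print(graph.number_of_edges(), "edges; mean degree",
      2 * graph.number_of_edges() / n_cells)
```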

  19. A Java-based electronic healthcare record software for beta-thalassaemia.

    Science.gov (United States)

    Deftereos, S; Lambrinoudakis, C; Andriopoulos, P; Farmakis, D; Aessopos, A

    2001-01-01

    Beta-thalassaemia is a hereditary disease, the prevalence of which is high in persons of Mediterranean, African, and Southeast Asian ancestry. In Greece it constitutes an important public health problem. Beta-thalassaemia necessitates continuous and complicated health care procedures such as daily chelation; biweekly transfusions; and periodic cardiology, endocrinology, and hepatology evaluations. Typically, different care items are offered in different, often-distant, health care units, which leads to increased patient mobility. This is especially true in rural areas. Medical records of patients suffering from beta-thalassaemia are inevitably complex and grow in size very fast. They are currently paper-based, scattered over all units involved in the care process. This hinders communication of information between health care professionals and makes processing of the medical records difficult, thus impeding medical research. Our objective is to provide an electronic means for recording, communicating, and processing all data produced in the context of the care process of patients suffering from beta-thalassaemia. We have developed - and we present in this paper - Java-based Electronic Healthcare Record (EHCR) software, called JAnaemia. JAnaemia is a general-purpose EHCR application, which can be customized for use in all medical specialties. Customization for beta-thalassaemia has been performed in collaboration with 4 Greek hospitals. To be capable of coping with patient record diversity, JAnaemia has been based on the EHCR architecture proposed in the ENV 13606:1999 standard, published by the CEN/TC251 committee. Compliance with the CEN architecture also ensures that several additional requirements are fulfilled in relation to clinical comprehensiveness; to record sharing and communication; and to ethical, medico-legal, and computational issues. Special care has been taken to provide a user-friendly, form-based interface for data entry and processing. The

  20. Image-based electronic patient records for secured collaborative medical applications.

    Science.gov (United States)

    Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun

    2005-01-01

    We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: EPR DICOM gateway (EPR-GW), Image-based EPR repository server (EPR-Server), Web Server and EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, the security modules of Digital Signature and Authentication are integrated to perform the security processing on the EPR data with integrity and authenticity. The privacy of EPR in data communication and exchanging is provided by SSL/TLS-based secure communication. This presentation gave a new approach to create and manage image-based EPR from actual patient records, and also presented a way to use Web technology and DICOM standard to build an open architecture for collaborative medical applications.

  1. Quantum-dot based nanothermometry in optical plasmonic recording media

    International Nuclear Information System (INIS)

    Maestro, Laura Martinez; Zhang, Qiming; Li, Xiangping; Gu, Min; Jaque, Daniel

    2014-01-01

    We report on the direct experimental determination of the temperature increment caused by laser irradiation in an optical recording medium consisting of a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots in the recording medium allowed for single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon excited fluorescence of the quantum dots. Experimental results have been compared with numerical simulations, revealing an excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media

  2. The home-based maternal record: a tool for family involvement in health care.

    Science.gov (United States)

    Shah, P M; Shah, K P; Belsey, M A

    1988-04-01

    The home-based maternal record offers an opportunity for family involvement in health care. Home-based records of maternal health have been used in several developing countries, and have led to increased detection and monitoring of women at high risk for complications during pregnancy. Home-based cards that include menstrual information remind health workers to educate and motivate women for family planning, and serve as a source of health statistics. Records that use pictures and symbols have been used by illiterate traditional birth attendants, and had an accurate completion rate of over 90%. The WHO has prepared a prototype record and guidelines for local adaptation. The objectives were to provide continuity of care throughout pregnancy, ensure recognition of at-risk women, encourage family participation in health care, and provide data on maternal health, breastfeeding, and family planning. The guidelines have been evaluated and results show that the records have improved the coverage, acceptability, and quality of MCH/FP care. The records have also led to an increase in diagnosis and referral of at-risk women and newborns, and the use of family planning and tetanus toxoid immunization has increased in the 13 centers where the reports are being used. Focus group discussions have shown that mothers, community members, primary health workers, and doctors and nurses liked the records. It is important to adapt criteria for high-risk conditions to the local areas where the records will be used to ensure the relevance of risk diagnosis. The evidence shows that home-based maternal and child records can be an important tool in the promotion of self-reliance and family participation in health care. In addition, home-based records can be used for the implementation of primary health care at the local level, and serve as a resource for data collection.

  3. Clinical Assistant Diagnosis for Electronic Medical Record Based on Convolutional Neural Network.

    Science.gov (United States)

    Yang, Zhongliang; Huang, Yongfeng; Jiang, Yiran; Sun, Yuxi; Zhang, Yu-Jin; Luo, Pengcheng

    2018-04-20

    Automatically extracting useful information from electronic medical records along with conducting disease diagnoses is a promising task for both clinical decision support (CDS) and natural language processing (NLP). Most of the existing systems are based on artificially constructed knowledge bases, and auxiliary diagnosis is then done by rule matching. In this study, we present a clinical intelligent decision approach based on Convolutional Neural Networks (CNN), which can automatically extract high-level semantic information from electronic medical records and then perform automatic diagnosis without artificial construction of rules or knowledge bases. We used 18,590 collected real-world clinical electronic medical records to train and test the proposed model. Experimental results show that the proposed model can achieve 98.67% accuracy and 96.02% recall, which strongly supports that using a convolutional neural network to automatically learn high-level semantic features of electronic medical records and then assist diagnosis is feasible and effective.
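
    A toy text-CNN of the general kind described, an embedding layer followed by 1-D convolution, global max pooling, and a classifier, is sketched below in PyTorch; the vocabulary size, layer widths, and number of diagnosis classes are invented, since the abstract does not specify the architecture.

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """Toy 1-D convolutional classifier over token embeddings."""
    def __init__(self, vocab_size=5000, embed_dim=128, n_filters=64, n_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size=3, padding=1)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, token_ids):                   # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)   # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))                # (batch, n_filters, seq_len)
        x = x.max(dim=2).values                     # global max pooling over time
        return self.fc(x)                           # (batch, n_classes) logits

model = TextCNN()
dummy_records = torch.randint(0, 5000, (4, 200))    # 4 records, 200 token ids each
print(model(dummy_records).shape)                    # torch.Size([4, 10])
```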

  4. A shared computer-based problem-oriented patient record for the primary care team.

    Science.gov (United States)

    Linnarsson, R; Nordgren, K

    1995-01-01

    1. INTRODUCTION. A computer-based patient record (CPR) system, Swedestar, has been developed for use in primary health care. The principal aim of the system is to support continuous quality improvement through improved information handling, improved decision-making, and improved procedures for quality assurance. The Swedestar system has evolved during a ten-year period beginning in 1984. 2. SYSTEM DESIGN. The design philosophy is based on the following key factors: a shared, problem-oriented patient record; structured data entry based on an extensive controlled vocabulary; advanced search and query functions, where the query language has the most important role; integrated decision support for drug prescribing and care protocols and guidelines; integrated procedures for quality assurance. 3. A SHARED PROBLEM-ORIENTED PATIENT RECORD. The core of the CPR system is the problem-oriented patient record. All problems of one patient, recorded by different members of the care team, are displayed on the problem list. Starting from this list, a problem follow-up can be made, one problem at a time or for several problems simultaneously. Thus, it is possible to get an integrated view, across provider categories, of those problems of one patient that belong together. This shared problem-oriented patient record provides an important basis for the primary care team work. 4. INTEGRATED DECISION SUPPORT. The decision support of the system includes a drug prescribing module and a care protocol module. The drug prescribing module is integrated with the patient records and includes an on-line check of the patient's medication list for potential interactions and data-driven reminders concerning major drug problems. Care protocols have been developed for the most common chronic diseases, such as asthma, diabetes, and hypertension. The patient records can be automatically checked according to the care protocols. 5. PRACTICAL EXPERIENCE. The Swedestar system has been implemented in a

  5. Waveform shape analysis: extraction of physiologically relevant information from Doppler recordings.

    Science.gov (United States)

    Ramsay, M M; Broughton Pipkin, F; Rubin, P C; Skidmore, R

    1994-05-01

    1. Doppler recordings were made from the brachial artery of healthy female subjects during a series of manoeuvres which altered the pressure-flow characteristics of the vessel. 2. Changes were induced in the peripheral circulation of the forearm by the application of heat or ice-packs. A sphygmomanometer cuff was used to create graded occlusion of the vessel above and below the point of measurement. Recordings were also made whilst the subjects performed a standardized Valsalva manoeuvre. 3. The Doppler recordings were analysed both with the standard waveform indices (systolic/diastolic ratio, pulsatility index and resistance index) and by the method of Laplace transform analysis. 4. The waveform parameters obtained by Laplace transform analysis distinguished the different changes in flow conditions; they thus had direct physiological relevance, unlike the standard waveform indices.
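
    For reference, the standard waveform indices named here are conventionally defined from the peak systolic velocity S, the end-diastolic velocity D, and the time-averaged velocity over the cardiac cycle; the Laplace-transform parameters are model-specific and not reproduced here.

```latex
\mathrm{S/D\ ratio} = \frac{S}{D}, \qquad
\mathrm{PI} = \frac{S - D}{V_{\mathrm{mean}}}, \qquad
\mathrm{RI} = \frac{S - D}{S}
```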

  6. 'Citizen science' recording of fossils by adapting existing computer-based biodiversity recording tools

    Science.gov (United States)

    McGowan, Alistair

    2014-05-01

    Biodiversity recording activities have been greatly enhanced by the emergence of online schemes and smartphone applications for recording and sharing data about a wide variety of flora and fauna. As a palaeobiologist, one of the areas of research I have been heavily involved in is the question of whether the amount of rock available to sample acts as a bias on our estimates of biodiversity through time. Although great progress has been made on this question over the past ten years by a number of researchers, I still think palaeontology has not followed the lead offered by the 'citizen science' revolution in studies of extant biodiversity. By constructing clearly structured surveys with online data collection support, it should be possible to collect field data on the occurrence of fossils at the scale of individual exposures, which are needed to test competing hypotheses about these effects at relatively small spatial scales. Such data collection would be hard to justify for universities and museums with limited personnel but a co-ordinated citizen science programme would be capable of delivering such a programme. Data collection could be based on the MacKinnon's Lists method, used in rapid conservation assessment work. It relies on observers collecting lists of a fixed length (e.g. 10 species long) but what is important is that it focuses on getting observers to ignore sightings of the same species until that list is complete. This overcomes the problem of 'common taxa being commonly recorded' and encourages observers to seek out and identify the rarer taxa. This gives a targeted but finite task. Rather than removing fossils, participants would be encouraged to take photographs to share via a recording website. The success of iSpot, which allows users to upload photos of plants and animals for other users to help with identifications, offers a model for overcoming the problems of identifying fossils, which can often look nothing like the examples illustrated in

  7. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Joe, Yang Hee; Cho, Sung Gook

    2003-01-01

    This paper briefly introduces an improved method for evaluating the seismic fragilities of components of nuclear power plants in Korea. The engineering characteristics of small-magnitude earthquake spectra recorded in the Korean peninsula during the last several years are also discussed. To evaluate the effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures, several comparative case studies have been performed. The study results show that seismic fragility analysis based on Newmark's spectra might overestimate the seismic capacities of Korean facilities. (author)

  8. A Kinect-based system for automatic recording of some pigeon behaviors.

    Science.gov (United States)

    Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M

    2015-12-01

    Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results allow us to demonstrate that an advantage of the Kinect-based approach is that it can also be reliably used to measure activity other than key pecking.
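
    The reported sensitivity and precision follow the usual definitions in terms of true positives (TP), false negatives (FN), and false positives (FP):

```latex
\mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad
\mathrm{precision} = \frac{TP}{TP + FP}
```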

  9. Climatic Changes on Tibetan Plateau Based on Ice Core Records

    Science.gov (United States)

    Yao, T.

    2008-12-01

    Climatic changes have been reconstructed for the Tibetan Plateau based on ice core records. The Guliya ice core on the Tibetan Plateau presents climatic changes over the past 100,000 years and is thus comparable with those from the Vostok ice core in Antarctica and the GISP2 record in the Arctic. These three records share an important common feature, i.e., our climate is not stable. It is also evident that the major patterns of climatic change are similar across the earth. Why does climatic change over the earth follow the same pattern? It might be attributed to solar radiation. We found that the cold periods correspond to low-insolation periods, and warm periods to high-insolation periods. We found abrupt climatic change in the ice core climatic records, which presented dramatic temperature variations of as much as 10 °C in 50 or 60 years. Our major challenge in the study of both climate and environment is that greenhouse gases such as CO2 and CH4 are possibly amplifying global warming, though to what degree remains unclear. One of the ways to understand the role of greenhouse gases is to reconstruct the past greenhouse gases recorded in ice. In 1997, we drilled an ice core at 7100 m a.s.l. in the Himalayas to reconstruct a methane record. Based on the record, we found seasonal cycles in methane variation. In particular, the methane concentration is high in summer, suggesting active methane emission from wetlands in summer. Based on the seasonal cycle, we can reconstruct the methane fluctuation history over the past 500 years. The most prominent feature of the methane record in the Himalayan ice core is the abrupt increase since 1850 A.D. This is closely related to the industrial revolution worldwide. We can also observe sudden decreases in methane concentration during World War I and World War II. It implies that the industrial revolution has dominated atmospheric greenhouse gas emission for about 100 years. Besides, the average methane concentration in the Himalayan ice core is

  10. Tipping point analysis of a large ocean ambient sound record

    Science.gov (United States)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015) high-resolution (250 Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the big-data challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al, GRL 2007, [2] Livina et al, Climate of the Past 2010, [3] Livina et al, Chaos 2015.

  11. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
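
    The rate-of-change measurement amounts to fitting straight lines to selected segments of the Ca signal; a bare-bones numpy version is sketched below, with the synthetic transient and segment boundaries as assumptions (the actual program runs entirely in Excel).

```python
import numpy as np

t = np.linspace(0, 1, 1000)                       # time in seconds
ca = 100 + 400 * np.exp(-t / 0.2) * (t > 0.05)    # toy Ca transient (nM)

def rate_of_change(t, y, start, stop):
    """Slope (nM/s) of a least-squares line fitted between start and stop."""
    mask = (t >= start) & (t <= stop)
    slope, _intercept = np.polyfit(t[mask], y[mask], 1)
    return slope

print(rate_of_change(t, ca, 0.06, 0.10))   # fast, early decay segment
print(rate_of_change(t, ca, 0.30, 0.60))   # slower, later segment
```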

  12. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.

  13. HJD-I record and analysis meter for nuclear information

    International Nuclear Information System (INIS)

    Di Shaoliang; Huang Yong; Xiao Yanbin

    1992-01-01

    The HJD-I Record and Analysis Meter for nuclear information, a low-cost, compact, multi-function, new-model intelligent nuclear electronic instrument, is described. Its hardware and software are detailed, and a 137Cs spectrum acquired with this meter is presented

  14. Privacy Impact Assessment for the Lead-based Paint System of Records

    Science.gov (United States)

    The Lead-based Paint System of Records collects personally identifiable information, test scores, and submitted fees. Learn how this data is collected, how it will be used, access to the data, the purpose of data collection, and record retention policies.

  15. A near real-time satellite-based global drought climate data record

    International Nuclear Information System (INIS)

    AghaKouchak, Amir; Nakhjiri, Navid

    2012-01-01

    Reliable drought monitoring requires long-term and continuous precipitation data. High-resolution satellite measurements provide valuable precipitation information on a quasi-global scale. However, their short record lengths limit their applications in drought monitoring. In addition to this limitation, long-term low-resolution satellite-based gauge-adjusted data sets such as that of the Global Precipitation Climatology Project (GPCP) are not available in near real time for timely drought monitoring. This study bridges the gap between low-resolution long-term satellite gauge-adjusted data and the emerging high-resolution satellite precipitation data sets to create a long-term climate data record of droughts. To accomplish this, a Bayesian correction algorithm is used to combine GPCP data with real-time satellite precipitation data sets for drought monitoring and analysis. The results showed that the combined data sets after the Bayesian correction were a significant improvement compared to the uncorrected data. Furthermore, several recent major droughts such as the 2011 Texas, 2010 Amazon and 2010 Horn of Africa droughts were detected in the combined real-time and long-term satellite observations. This highlights the potential application of satellite precipitation data for regional to global drought monitoring. The final product is a real-time data-driven satellite-based standardized precipitation index that can be used for drought monitoring, especially over remote and/or ungauged regions. (letter)
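
    A bare-bones standardized precipitation index (SPI) calculation for a single grid cell, fitting a gamma distribution to the precipitation record and mapping each value to a standard-normal quantile, is sketched below; the synthetic monthly totals are placeholders, and the handling of zero-precipitation months and multi-month accumulation windows is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
monthly_precip = rng.gamma(shape=2.0, scale=40.0, size=360)  # 30 years, mm/month

# Fit a gamma distribution to the record (location fixed at zero).
shape, loc, scale = stats.gamma.fit(monthly_precip, floc=0)

# SPI: gamma CDF of each observation mapped to standard-normal quantiles.
cdf = stats.gamma.cdf(monthly_precip, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)
print(spi[:6].round(2))
```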

  16. Microcontroller-based wireless recorder for biomedical signals.

    Science.gov (United States)

    Chien, C-N; Hsu, H-W; Jang, J-K; Rau, C-L; Jaw, F-S

    2005-01-01

    A portable multichannel system is described for recording biomedical signals wirelessly. Instead of using the conventional time-division analog-modulation method, the technique of digital multiplexing was applied to increase the number of signal channels to four. Detailed design considerations and the functional allocation of the system are discussed. The front-end unit was modularly designed to condition the input signal in an optimal manner. The microcontroller then handled the tasks of data conversion and wireless transmission, as well as providing simple preprocessing such as waveform averaging or rectification. The low-power nature of this microcontroller affords the benefit of battery operation and, hence, patient isolation of the system. Finally, a single-chip receiver, compatible with the RF transmitter of the microcontroller, was used to implement a compact interface with the host computer. An application of this portable recorder to low-back pain studies is shown. This device can simultaneously record one ECG and two surface EMG signals wirelessly and is thus helpful in relieving patients' anxiety during clinical measurement. Such an approach, microcontroller-based wireless measurement, could be an important trend in biomedical instrumentation, and we hope that this paper will be useful to other colleagues.

  17. Speech watermarking: an approach for the forensic analysis of digital telephonic recordings.

    Science.gov (United States)

    Faundez-Zanuy, Marcos; Lucena-Molina, Jose J; Hagmüller, Martin

    2010-07-01

    In this article, the authors discuss the problem of forensic authentication of digital audio recordings. Although forensic audio has been addressed in several articles, the existing approaches focus on analog magnetic recordings, which are less prevalent because of the large number of digital recorders available on the market (optical, solid state, hard disks, etc.). An approach based on digital signal processing that consists of spread spectrum techniques for speech watermarking is presented. This approach has the advantage that the authentication is based on the signal itself rather than on the recording format. Thus, it is valid for the usual recording devices in police-controlled telephone intercepts. In addition, our proposal allows for the introduction of relevant information such as the recording date and time and other relevant data (this is not always possible with classical systems). Our experimental results reveal that the speech watermarking procedure does not interfere in a significant way with subsequent forensic speaker identification.
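
    As background, a common textbook form of additive spread-spectrum embedding modulates a pseudo-random spreading sequence p[n] in {-1, +1} by the payload bit b_k and adds it to the host speech s[n] at a small embedding strength alpha; the article's scheme additionally shapes the watermark perceptually, which this generic form does not capture.

```latex
y[n] = s[n] + \alpha\, b_k\, p[n], \qquad p[n] \in \{-1, +1\}
```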

  18. Understanding the Connection Between Traumatic Brain Injury and Alzheimer’s Disease: A Population-Based Medical Record Review Analysis

    Science.gov (United States)

    2017-10-01

    Principal Investigator: Allen W. Brown, MD. Contracting Organization: Mayo Clinic, Rochester, MN 55905.

  19. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical products information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, Event-based Text-mining of Health Electronic Records achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated Event-based Text-mining of Health Electronic Record's ability to extract and encode Adverse Event terms from Structured Product Labels which may potentially support multiple pharmacoepidemiological tasks.

  20. Metrics for Electronic-Nursing-Record-Based Narratives: Cross-sectional Analysis

    Science.gov (United States)

    Kim, Kidong; Jeong, Suyeon; Lee, Kyogu; Park, Hyeoun-Ae; Min, Yul Ha; Lee, Joo Yun; Kim, Yekyung; Yoo, Sooyoung; Doh, Gippeum

    2016-01-01

    Summary Objectives We aimed to determine the characteristics of quantitative metrics for nursing narratives documented in electronic nursing records and their association with hospital admission traits and diagnoses in a large data set not limited to specific patient events or hypotheses. Methods We collected 135,406,873 electronic, structured coded nursing narratives from 231,494 hospital admissions of patients discharged between 2008 and 2012 at a tertiary teaching institution that routinely uses an electronic health records system. The standardized number of nursing narratives (i.e., the total number of nursing narratives divided by the length of the hospital stay) was suggested to integrate the frequency and quantity of nursing documentation. Results The standardized number of nursing narratives was higher for patients aged 70 years (median = 30.2 narratives/day, interquartile range [IQR] = 24.0–39.4 narratives/day), long (8 days) hospital stays (median = 34.6 narratives/day, IQR = 27.2–43.5 narratives/day), and hospital deaths (median = 59.1 narratives/day, IQR = 47.0–74.8 narratives/day). The standardized number of narratives was higher in “pregnancy, childbirth, and puerperium” (median = 46.5, IQR = 39.0–54.7) and “diseases of the circulatory system” admissions (median = 35.7, IQR = 29.0–43.4). Conclusions Diverse hospital admissions can be consistently described with nursing-document-derived metrics for similar hospital admissions and diagnoses. Some areas of hospital admissions may have consistently increasing volumes of nursing documentation across years. Usability of electronic nursing document metrics for evaluating healthcare requires multiple aspects of hospital admissions to be considered. PMID:27901174

  1. [Problem list in computer-based patient records].

    Science.gov (United States)

    Ludwig, C A

    1997-01-14

    Computer-based clinical information systems are capable of effectively processing even large amounts of patient-related data. However, physicians depend on rapid access to summarized, clearly laid out data on the computer screen to inform themselves about a patient's current clinical situation. In introducing a clinical workplace system, we therefore transformed the problem list-which for decades has been successfully used in clinical information management-into an electronic equivalent and integrated it into the medical record. The table contains a concise overview of diagnoses and problems as well as related findings. Graphical information can also be integrated into the table, and an additional space is provided for a summary of planned examinations or interventions. The digital form of the problem list makes it possible to use the entire list or selected text elements for generating medical documents. Diagnostic terms for medical reports are transferred automatically to corresponding documents. Computer technology has an immense potential for the further development of problem list concepts. With multimedia applications sound and images will be included in the problem list. For hyperlink purpose the problem list could become a central information board and table of contents of the medical record, thus serving as the starting point for database searches and supporting the user in navigating through the medical record.

  2. Factors influencing consumer adoption of USB-based Personal Health Records in Taiwan

    Directory of Open Access Journals (Sweden)

    Jian Wen-Shan

    2012-08-01

    Full Text Available Abstract Background Usually patients receive healthcare services from multiple hospitals, and consequently their healthcare data are dispersed over many facilities' paper and electronic-based record systems. Therefore, many countries have encouraged research on data interoperability, access, and patient authorization. This study is an important part of a national project to build an information exchange environment for cross-hospital digital medical records carried out by the Department of Health (DOH) of Taiwan in May 2008. The key objective of the core project is to set up a portable data exchange environment in order to enable people to maintain and own their essential health information. This study is aimed at exploring the factors influencing behavior and adoption of USB-based Personal Health Records (PHR) in Taiwan. Methods Quota sampling was used, and structured questionnaires were distributed to the outpatient departments at ten medical centers which participated in the DOH project to establish the information exchange environment across hospitals. A total of 3000 questionnaires were distributed and 1549 responses were collected, of which 1465 were valid, for a response rate of 48.83%. Results 1025 out of 1465 respondents expressed their willingness to apply for the USB-PHR. Detailed analysis of the data showed a remarkable difference in "usage intention" between the PHR adopters and non-adopters (χ2 = 182.4, p ...). Conclusions Higher Usage Intention, Perceived Usefulness and Subjective Norm of patients were found to be the key factors influencing PHR adoption. Thus, we suggest that government and hospitals should promote the potential usefulness of PHR, and physicians should encourage patients to adopt the PHR.

  3. Impact of the recorded variable on recurrence quantification analysis of flows

    International Nuclear Information System (INIS)

    Portes, Leonardo L.; Benda, Rodolfo N.; Ugrinowitsch, Herbert; Aguirre, Luis A.

    2014-01-01

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA
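
    A minimal sketch of one RQA quantity, the recurrence rate, computed from a single recorded variable s(t); this is not the authors' code, and the embedding dimension, delay, and threshold below are illustrative choices only.

```python
# Minimal sketch: recurrence rate of a time-delay embedded series.

import numpy as np

def recurrence_rate(s, dim=3, delay=5, eps=0.2):
    """Fraction of recurrent point pairs in the time-delay embedded series."""
    s = np.asarray(s, dtype=float)
    n = len(s) - (dim - 1) * delay
    # Time-delay embedding: each row is one reconstructed state vector
    emb = np.column_stack([s[i * delay : i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return np.mean(dists < eps)

t = np.linspace(0, 50, 800)
print(recurrence_rate(np.sin(t)))      # a periodic signal gives a comparatively high rate
```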

  4. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: ll.portes@gmail.com [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federeal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)

    2014-06-27

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA.

  5. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features.

    Science.gov (United States)

    Chudáček, V; Spilka, J; Janků, P; Koucký, M; Lhotská, L; Huptych, M

    2011-08-01

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and unlike in other published papers we use three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, interval index, as well as Lempel-Ziv complexity and Higuchi's fractal dimension are among the top five features.
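
    One of the listed features, Lempel-Ziv complexity, can be illustrated with a simplified phrase-counting sketch (an LZ78-style parse of the binarized signal, not necessarily the exact variant used in the paper); the FHR samples below are synthetic.

```python
# Minimal sketch: Lempel-Ziv-style complexity of a binarized fetal heart rate trace,
# estimated as the number of new phrases in an LZ78-type incremental parse.

import numpy as np

def lz_phrase_count(bits):
    phrases, phrase = set(), ""
    for b in bits:
        phrase += b
        if phrase not in phrases:      # a new phrase ends here
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

fhr = 140 + 8 * np.random.randn(1200)                                # hypothetical FHR samples (bpm)
bits = "".join("1" if x > np.median(fhr) else "0" for x in fhr)      # binarize around the median
print(lz_phrase_count(bits))
```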

  6. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features

    International Nuclear Information System (INIS)

    Chudáček, V; Spilka, J; Lhotská, L; Huptych, M; Janků, P; Koucký, M

    2011-01-01

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and unlike in other published papers we use three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, interval index, as well as Lempel–Ziv complexity and Higuchi's fractal dimension are among the top five features

  7. Automating occupational protection records systems

    International Nuclear Information System (INIS)

    Lyon, M.; Martin, J.B.

    1991-10-01

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs

  8. Miniature, Single Channel, Memory-Based, High-G Acceleration Recorder (Millipen)

    International Nuclear Information System (INIS)

    Rohwer, Tedd A.

    1999-01-01

    The Instrumentation and Telemetry Departments at Sandia National Laboratories have been instrumenting earth penetrators for over thirty years. Recorded acceleration data is used to quantify penetrator performance. Penetrator testing has become more difficult as desired impact velocities have increased. This results in the need for small-scale test vehicles and miniature instrumentation. A miniature recorder will allow penetrator diameters to significantly decrease, opening the window of testable parameters. Full-scale test vehicles will also benefit from miniature recorders by using a less intrusive system to instrument internal arming, fusing, and firing components. This single channel concept is the latest design in an ongoing effort to miniaturize the size and reduce the power requirement of acceleration instrumentation. A micro-controller/memory based system provides the data acquisition, signal conditioning, power regulation, and data storage. This architecture allows the recorder, including both sensor and electronics, to occupy a volume of less than 1.5 cubic inches, draw less than 200 mW of power, and record 15 kHz data up to 40,000 g. This paper will describe the development and operation of this miniature acceleration recorder

  9. Validity of a hospital-based obstetric register using medical records as reference

    DEFF Research Database (Denmark)

    Brixval, Carina Sjöberg; Thygesen, Lau Caspar; Johansen, Nanna Roed

    2015-01-01

    BACKGROUND: Data from hospital-based registers and medical records offer valuable sources of information for clinical and epidemiological research purposes. However, conducting high-quality epidemiological research requires valid and complete data sources. OBJECTIVE: To assess completeness...... and validity of a hospital-based clinical register - the Obstetric Database - using a national register and medical records as references. METHODS: We assessed completeness of a hospital-based clinical register - the Obstetric Database - by linking data from all women registered in the Obstetric Database...... Database therefore offers a valuable source for examining clinical, administrative, and research questions....

  10. Electronic Health Record for Intensive Care based on Usual Windows Based Software.

    Science.gov (United States)

    Reper, Arnaud; Reper, Pascal

    2015-08-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not be locked into proprietary software, we developed an EHR based on usual software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used to care for more than five thousand patients with the expected software reliability and has facilitated data management and review processes. Communication with other medical software was not developed from the start and is realized through the basic functionality of a communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on usual software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.

  11. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    Science.gov (United States)

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

    Electronic medical records are increasingly common in medical practice. The secondary use of medical records has become increasingly important. It relies on the ability to retrieve the complete information about desired patient populations. How to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a big challenge. Therefore, we propose an efficient and robust framework based on cloud for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. We propose a parallel index building method and build a distributed search cluster, the former is used to improve the performance of index building, and the latter is used to provide high concurrent online TCMRs retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real-time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster can improve the performance of index building and provide high concurrent online TCMRs retrieval. The multi-indexing model can ensure the latest relevant TCMRs are indexed and retrieved in real-time. The semantics expansion method and the multi-factor ranking model can enhance retrieval quality. The template-based visualization method can enhance the availability and universality, where the medical reports are displayed via friendly web interface. In conclusion, compared with the current medical record retrieval systems, our system provides some advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.

  12. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    Science.gov (United States)

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

    The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system - which was one of the components in an EMR system at a tertiary teaching hospital in Korea - using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). A rational unified process (RUP) and the Unified Modeling Language were used as a development process and for modeling notation, respectively. From the scenario and RUP approach, user requirements were formulated into use case sets and the sequence of activities in the scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.

  13. De-identification of unstructured paper-based health records for privacy-preserving secondary use.

    Science.gov (United States)

    Fenz, Stefan; Heurix, Johannes; Neubauer, Thomas; Rella, Antonio

    2014-07-01

    Abstract Whenever personal data is processed, privacy is a serious issue. Especially in the document-centric e-health area, the patients' privacy must be preserved in order to prevent any negative repercussions for the patient. Clinical research, for example, demands structured health records to carry out efficient clinical trials, whereas legislation (e.g. HIPAA) regulates that only de-identified health records may be used for research. However, unstructured and often paper-based data dominates information technology, especially in the healthcare sector. Existing approaches are geared towards data in English-language documents only and have not been designed to handle the recognition of erroneous personal data which is the result of the OCR-based digitization of paper-based health records.

  14. Revised estimates of Greenland ice sheet thinning histories based on ice-core records

    DEFF Research Database (Denmark)

    Lecavalier, B.S.; Milne, G.A.; Fisher, D.A.

    2013-01-01

    -based reconstructions and, to some extent, the estimated elevation histories. A key component of the ice core analysis involved removing the influence of vertical surface motion on the dO signal measured from the Agassiz and Renland ice caps. We re-visit the original analysis with the intent to determine if the use...... of more accurate land uplift curves can account for some of the above noted discrepancy. To improve on the original analysis, we apply a geophysical model of glacial isostatic adjustment calibrated to sea-level records from the Queen Elizabeth Islands and Greenland to calculate the influence of land...... in this selection is further complicated by the possible influence of Innuitian ice during the early Holocene (12-8 ka BP). Our results indicate that a more accurate treatment of the uplift correction leads to elevation histories that are, in general, shifted down relative to the original curves at GRIP, NGRIP, DYE...

  15. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

    Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documentation of diagnostic and therapeutic procedures in oncology patients and test its function. Material and methods: A file-name based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record and F2 is the specific software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations inputted 16591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems - when copied together, all files sort as expected by patient, and for each patient in chronological order, providing a digital cumulative patient record which contains heterogeneous file formats. Conclusion: The file-based approach for storing heterogeneous digital patient-related information is a reliable system, which can handle open-source, proprietary, general and custom file formats and seems to be easily scalable. Further development of software for automatic checks of the integrity and searching and indexing of the files is expected to produce a more user-friendly environment
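
    A minimal sketch of the idea behind the file-name convention: a per-patient, chronological view is built purely from file names. The underscore-separated layout and the example names below are assumptions for illustration; the study does not specify field widths or separators.

```python
# Minimal sketch: sorting heterogeneous record files per patient and chronologically,
# using only the information encoded in the file names (M1 = patient ID, M2 = date,
# M3 = author code, F2 = extension). The naming layout here is assumed.

from collections import defaultdict

filenames = [
    "P00017_2010-03-12_IVN.doc",
    "P00017_2009-11-02_NSA.jpg",
    "P00233_2010-01-05_IVN.pdf",
]

by_patient = defaultdict(list)
for name in filenames:
    stem, _, ext = name.rpartition(".")
    patient, date, author = stem.split("_")        # M1, M2, M3
    by_patient[patient].append((date, author, ext))

for patient, entries in sorted(by_patient.items()):
    for date, author, ext in sorted(entries):       # chronological per patient
        print(patient, date, author, ext)
```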

  16. Validity of recalled v. recorded birth weight: a systematic review and meta-analysis.

    Science.gov (United States)

    Shenkin, S D; Zhang, M G; Der, G; Mathur, S; Mina, T H; Reynolds, R M

    2017-04-01

    Low birth weight is associated with adverse health outcomes. If birth weight records are not available, studies may use recalled birth weight. It is unclear whether this is reliable. We performed a systematic review and meta-analysis of studies comparing recalled with recorded birth weights. We followed the Meta-Analyses of Observational Studies in Epidemiology (MOOSE) statement and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We searched MEDLINE, EMBASE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) to May 2015. We included studies that reported recalled birth weight and recorded birth weight. We excluded studies investigating a clinical population. Two reviewers independently reviewed citations, extracted data, and assessed risk of bias. Data were pooled in a random effects meta-analysis for correlation and mean difference. In total, 40 studies were eligible for qualitative synthesis (n=78,997 births from 78,196 parents). Agreement between recalled and recorded birth weight was high: pooled estimate of correlation in 23 samples from 19 studies (n=7406) was 0.90 [95% confidence interval (CI) 0.87-0.93]. The difference between recalled and recorded birth weight in 29 samples from 26 studies (n=29,293) was small [range -86 to 129 g; random effects estimate 1.4 g (95% CI -4.0 to 6.9 g)]. Studies were heterogeneous, with no evidence for an effect of time since birth, person reporting, recall bias, or birth order. In post-hoc subgroup analysis, recalled birth weight was higher than recorded birth weight by 80 g (95% CI 57-103 g) in low- and middle-income countries. In conclusion, there is high agreement between recalled and recorded birth weight. If birth weight is recalled, it is suitable for use in epidemiological studies, at least in high income countries.

  17. Computer analysis of sound recordings from two Anasazi sites in northwestern New Mexico

    Science.gov (United States)

    Loose, Richard

    2002-11-01

    Sound recordings were made at a natural outdoor amphitheater in Chaco Canyon and in a reconstructed great kiva at Aztec Ruins. Recordings included computer-generated tones and swept sine waves, classical concert flute, Native American flute, conch shell trumpet, and prerecorded music. Recording equipment included analog tape deck, digital minidisk recorder, and direct digital recording to a laptop computer disk. Microphones and geophones were used as transducers. The natural amphitheater lies between the ruins of Pueblo Bonito and Chetro Ketl. It is a semicircular arc in a sandstone cliff measuring 500 ft. wide and 75 ft. high. The radius of the arc was verified with aerial photography, and an acoustic ray trace was generated using CAD software. The arc is in an overhanging cliff face and brings distant sounds to a line focus. Along this line, there are unusual acoustic effects at conjugate foci. Time history analysis of recordings from both sites showed that a 60-dB reverb decay lasted from 1.8 to 2.0 s, nearly ideal for public performances of music. Echoes from the amphitheater were perceived to be upshifted in pitch, but this was not seen in FFT analysis. Geophones placed on the floor of the great kiva showed a resonance at 95 Hz.

  18. A Way to Understand Inpatients Based on the Electronic Medical Records in the Big Data Environment

    Directory of Open Access Journals (Sweden)

    Hongyi Mao

    2017-01-01

    Full Text Available In recent decades, information technology in healthcare, such as the Electronic Medical Record (EMR) system, has the potential to improve service quality and cost efficiency of the hospital. The continuous use of EMR systems has generated a great amount of data. However, hospitals tend to use these data to report their operational efficiency rather than to understand their patients. Based on a dataset of inpatients' medical records from a Chinese general public hospital, this study applies a configuration analysis from a managerial perspective and explains inpatient management in a different way. Four inpatient configurations (valued patients, managed patients, normal patients, and potential patients) are identified by the measure of the length of stay and the total hospital cost. The implications of the findings are discussed.

  19. 77 FR 27561 - Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking...

    Science.gov (United States)

    2012-05-11

    ...-Based Criminal History Records Checks for Individuals Seeking Unescorted Access to Non-Power Reactors... reactor (NPR) licensees to obtain fingerprint-based criminal history records checks before granting any... of the Energy Policy Act of 2005 (EPAct), which amended Section 149 of the Atomic Energy Act of 1954...

  20. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Science.gov (United States)

    2010-07-01

    ... records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests... and cellulose-acetate base film? (a) The nitrocellulose base, a substance akin to gun cotton, is chemically unstable and highly flammable. Agencies must handle nitrocellulose-base film (used in the...

  1. Laboratory-based recording of holographic fine structure in X-ray absorption anisotropy using polycapillary optics

    Energy Technology Data Exchange (ETDEWEB)

    Dabrowski, K.M. [Institute of Physics, Jagiellonian University, Reymonta 4, 30-059 Krakow (Poland); Korecki, P., E-mail: pawel.korecki@uj.edu.pl [Institute of Physics, Jagiellonian University, Reymonta 4, 30-059 Krakow (Poland)

    2012-08-15

    Highlights: • Holographic fine structures in X-ray absorption recorded using a tabletop setup. • Setup based on polycapillary collimating optics and an HOPG crystal. • Demonstration of element sensitivity by detection of X-ray fluorescence. • Potential of laboratory-based experiments for heavily doped crystals and thin films. - Abstract: A tabletop setup composed of a collimating polycapillary optics and a highly oriented pyrolytic graphite (HOPG) monochromator was characterized and used for recording two-dimensional maps of X-ray absorption anisotropy (XAA). XAA originates from interference of X-rays directly inside the sample. Depending on experimental conditions, fine structures in XAA can be interpreted in terms of X-ray holograms or X-ray standing waves and can be used for an element-selective atomic-resolved structural analysis. The implementation of polycapillary optics resulted in a two-order-of-magnitude gain in the radiant intensity (photons/s/solid angle) as compared to a system without optics and enabled efficient recording of XAA with a resolution of 0.15° for Mo Kα radiation. Element sensitivity was demonstrated by acquisition of distinct XAA signals for Ga and As atoms in a GaAs (1 1 1) wafer by using X-ray fluorescence as a secondary signal. These results indicate the possibility of performing laboratory-based XAA experiments for heavily doped single crystals or thin films. So far, because of the weak holographic modulation of XAA, such experiments could only be performed using synchrotron radiation.

  2. Computer-based video analysis identifies infants with absence of fidgety movements.

    Science.gov (United States)

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post-term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results Of 241 recordings from 150 infants, 48 (24.1%) were classified with absence of FMs or sporadic FMs using the GMA. The variation of the spatial center of motion (C_SD) during a recording was significantly lower in infants with normal (0.320; 95% confidence interval (CI) 0.309, 0.330) vs. absent or sporadic (0.380; 95% CI 0.361, 0.398) FMs (P<0.001). A triage model with C_SD thresholds chosen for a sensitivity of 90% and specificity of 80% gave a 40% referral rate for GMA. Conclusion Quantitative video analysis during the FMs' period can be used to triage infants at high risk of CP to early intervention or observational GMA.
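
    The quantity described above can be sketched as follows: the centre of motion is taken as the intensity-weighted centroid of the absolute difference between consecutive frames, and its variability over the recording is summarized by a standard deviation. The frames below are synthetic stand-ins for a real video; this is not the study's implementation.

```python
# Minimal sketch: variation of the spatial centre of motion from frame differences.

import numpy as np

def center_of_motion(prev_frame, frame):
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    total = diff.sum()
    if total == 0:
        return np.array([np.nan, np.nan])
    ys, xs = np.indices(diff.shape)
    return np.array([(ys * diff).sum() / total, (xs * diff).sum() / total])

rng = np.random.default_rng(0)
frames = rng.integers(0, 255, size=(50, 120, 160))       # hypothetical 50-frame clip
centers = np.array([center_of_motion(frames[i - 1], frames[i]) for i in range(1, len(frames))])
c_sd = np.nanstd(centers, axis=0)                         # variability of the centre of motion
print(c_sd)
```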

  3. Secure Management of Personal Health Records by Applying Attribute-Based Encryption

    NARCIS (Netherlands)

    Ibraimi, L.; Asim, Muhammad; Petkovic, M.

    2009-01-01

    The confidentiality of personal health records is a major problem when patients use commercial Web-based systems to store their health data. Traditional access control mechanisms, such as Role-Based Access Control, have several limitations with respect to enforcing access control policies and

  4. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis

    Science.gov (United States)

    Greensmith, David J.

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. PMID:24125908
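
    A minimal sketch (not the Excel program itself) of how a rate of change can be estimated from a Ca transient by fitting straight lines over short windows, a simple stand-in for the program's multiple, simultaneous regression step:

```python
# Minimal sketch: windowed least-squares slopes of a Ca signal as rate-of-change estimates.

import numpy as np

def windowed_slopes(t, ca, window=20):
    """Least-squares slope of ca vs t in each non-overlapping window."""
    slopes = []
    for start in range(0, len(t) - window + 1, window):
        seg = slice(start, start + window)
        slope, _ = np.polyfit(t[seg], ca[seg], 1)
        slopes.append(slope)
    return np.array(slopes)

t = np.linspace(0, 2, 400)                          # seconds
ca = 0.1 + 0.9 * np.exp(-((t - 0.5) ** 2) / 0.01)   # hypothetical Ca transient (a.u.)
print(windowed_slopes(t, ca)[:5])
```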

  5. Simplified Technique for Incorporating a Metal Mesh into Record Bases for Mandibular Implant Overdentures.

    Science.gov (United States)

    Godoy, Antonio; Siegel, Sharon C

    2015-12-01

    Mandibular implant-retained overdentures have become the standard of care for patients with mandibular complete edentulism. As part of the treatment, the mandibular implant-retained overdenture may require a metal mesh framework to be incorporated to strengthen the denture and avoid fracture of the prosthesis. Integrating the metal mesh framework as part of the acrylic record base and wax occlusion rim before the jaw relation procedure will avoid the distortion of the record base and will minimize the chances of processing errors. A simplified method to incorporate the mesh into the record base and occlusion rim is presented in this technique article. © 2015 by the American College of Prosthodontists.

  6. Validity of electronic diet recording nutrient estimates compared to dietitian analysis of diet records: A randomized controlled trial

    Science.gov (United States)

    Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...

  7. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems.

    Science.gov (United States)

    Seo, Hwa Jeong; Kim, Hye Hyeon; Kim, Ju Han

    2011-09-01

    Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. Through the use of SWOT analysis it can be shown that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and managed systematically. An owner of medical information only grants data access to the specific person who gave the authorization for backup based on the mutual trust between an individual and an organization. By employing SWOT analysis, we concluded that a linkage among medical organizations or between an individual and an organization can provide an efficient backup system.

  8. Results from a survey of national immunization programmes on home-based vaccination record practices in 2013.

    Science.gov (United States)

    Young, Stacy L; Gacic-Dobo, Marta; Brown, David W

    2015-07-01

    Data on home-based record (HBR) practices within national immunization programmes are non-existent, making it difficult to determine whether current efforts of immunization programmes related to basic recording of immunization services are appropriately focused. During January 2014, WHO and the United Nations Children's Fund sent a one-page questionnaire to 195 countries to obtain information on HBRs including type of record used, number of records printed, whether records were provided free-of-charge or required by schools, whether there was a stock-out and the duration of any stock-outs that occurred, as well as the total expenditure for printing HBRs during 2013. A total of 140 countries returned a completed HBR questionnaire. Two countries were excluded from analysis because they did not use a HBR during 2013. HBR types varied across countries (vaccination-only cards, 32/138 [23.1%]; vaccination plus growth monitoring records, 31/138 [22.4%]; child health books, 48/138 [34.7%]; combination of these, 27/138 [19.5%] countries). HBRs were provided free-of-charge in 124/138 (89.8%) respondent countries. HBRs were required for school entry in 62/138 (44.9%) countries. Nearly a quarter of countries reported HBR stock-outs during 2013. Computed printing cost per record was ... Work remains to improve forecasting where appropriate, to prevent HBR stock-outs, to identify and improve sustainable financing options and to explore viable market-shaping opportunities. © The Author 2015. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  9. The role of records management professionals in optical disk-based document imaging systems in the petroleum industry

    International Nuclear Information System (INIS)

    Cisco, S.L.

    1992-01-01

    Analyses of the data indicated that nearly one third of the 83 companies in this study had implemented one or more document imaging systems. Companies with imaging systems mostly were large (more than 1,001 employees), and mostly were international in scope. Although records management professionals traditionally were delegated responsibility for acquiring, designing, implementing, and maintaining paper-based information systems and the records therein, when records were converted to optical disks, responsibility for acquiring, designing, implementing, and maintaining optical disk-based information systems and the records therein, was delegated more frequently to end user departments and IS/MIS/DP professionals than to records professionals. Records management professionals assert that the need of an organization for a comprehensive records management program is not served best when individuals who are not professional records managers are responsible for the records stored in optical disk-based information systems

  10. Electronic system for recording proportional counter rare pulses with the pulse shape analysis

    International Nuclear Information System (INIS)

    Barabanov, I.R.; Gavrin, V.N.; Zakharov, Yu.I.; Tikhonov, A.A.

    1984-01-01

    The automated system for recording rare proportional counter pulses is described. The proportional counters are aimed at identification of 37 Ar and 71 Ge decays in chemical radiation detectors of solar neutrinos. In addition to two-parametric selection of events (measurement of the pulse amplitude in a slow channel and of the amplitude of the pulse differentiated with a time constant of about 10 ns in a parallel fast channel), the system performs pulse shape recording by means of a storage oscilloscope and a TV display. Pulse discrimination by front rise rate provides a six-fold background decrease in the 55 Fe range (5.9 keV); visual analysis of the recorded pulse shapes allows the background to be decreased by a further 25-30%. A background counting rate in the 55 Fe range of 1 pulse per 1.5 days is obtained with the installation described above, together with a 5 cm thick passive Pb shield and an active shield based on an anticoincidence NaI(Tl) detector, the counter cathode (5.6 mm in diameter) being made of Fe fabricated by zone melting. The installation described allows a background level of 0.6 pulses/day to be reached (total background attenuation coefficient of 400). A further background decrease is expected from locating the installation in the low-noise underground laboratory of the Baksan Neutrino Observatory

  11. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, Gabriele; Smith, James A.; Serinaldi, Francesco; Bales, Jerad; Bates, Paul D.; Krajewski, Witold F.

    2009-08-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01 annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adopted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades.
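
    A heavily simplified sketch of the nonstationary idea (not GAMLSS): letting the mean and spread of log annual peaks vary with time via a moving window and reading off a time-varying 0.01 annual-probability quantile under a lognormal assumption. The peak series below is synthetic.

```python
# Minimal sketch: time-varying flood quantiles from moving-window lognormal fits.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
years = np.arange(1925, 2008)
log_peaks = np.log(40 + 1.2 * (years - 1925)) + 0.4 * rng.standard_normal(len(years))

half = 10                                     # +/- 10-year moving window
q99 = []
for i in range(len(years)):
    win = slice(max(0, i - half), min(len(years), i + half + 1))
    mu, sigma = log_peaks[win].mean(), log_peaks[win].std(ddof=1)
    q99.append(np.exp(norm.ppf(0.99, loc=mu, scale=sigma)))   # 0.01 exceedance quantile

print(years[0], round(q99[0], 1), "->", years[-1], round(q99[-1], 1))
```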

  12. m2-ABKS: Attribute-Based Multi-Keyword Search over Encrypted Personal Health Records in Multi-Owner Setting.

    Science.gov (United States)

    Miao, Yinbin; Ma, Jianfeng; Liu, Ximeng; Wei, Fushan; Liu, Zhiquan; Wang, Xu An

    2016-11-01

    Online personal health record (PHR) is more inclined to shift data storage and search operations to cloud server so as to enjoy the elastic resources and lessen computational burden in cloud storage. As multiple patients' data is always stored in the cloud server simultaneously, it is a challenge to guarantee the confidentiality of PHR data and allow data users to search encrypted data in an efficient and privacy-preserving way. To this end, we design a secure cryptographic primitive called as attribute-based multi-keyword search over encrypted personal health records in multi-owner setting to support both fine-grained access control and multi-keyword search via Ciphertext-Policy Attribute-Based Encryption. Formal security analysis proves our scheme is selectively secure against chosen-keyword attack. As a further contribution, we conduct empirical experiments over real-world dataset to show its feasibility and practicality in a broad range of actual scenarios without incurring additional computational burden.

  13. Web-based online system for recording and examining of events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    The occurrence of events in power plants can result in serious drawbacks in power generation, which makes online recording and examining of events highly important. In this paper, an online web-based system is introduced which records and examines events in power plants. Throughout the paper, the procedures for design and implementation of this system, its features, and the results gained are explained. This system provides a predefined level of online access to all event data for all its users in power plants, dispatching, regional utilities, and top-level management. By implementation of an electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online system for recording and examining events offers the following advantages: - Online recording of events in power plants. - Examining of events in regional utilities. - Access to events' data. - Preparation of managerial reports

  14. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phase, duration of the double support phase, the magnitudes of reaction forces in extremes of measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
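
    A minimal sketch of the kind of variables described above, computed here with NumPy rather than the authors' MATLAB script: the extreme of Fz(t) and the braking and propulsive impulses from Fy(t) for a single synthetic stance phase.

```python
# Minimal sketch: extremes and impulses from synthetic stance-phase GRF signals (1 kHz).

import numpy as np

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 0.7, 1 / fs)                 # ~0.7 s stance phase
fz = 700 * np.sin(np.pi * t / 0.7) + 80 * np.sin(3 * np.pi * t / 0.7)  # vertical GRF (N)
fy = -150 * np.sin(2 * np.pi * t / 0.7)       # anteroposterior GRF: braking then propulsion (N)

fz_max = fz.max()
t_fz_max = t[fz.argmax()]
braking_impulse = np.trapz(np.clip(fy, None, 0), t)      # negative (braking) portion, N*s
propulsive_impulse = np.trapz(np.clip(fy, 0, None), t)   # positive (propulsive) portion, N*s

print(round(fz_max, 1), round(t_fz_max, 3),
      round(braking_impulse, 1), round(propulsive_impulse, 1))
```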

  15. Gridded sunshine duration climate data record for Germany based on combined satellite and in situ observations

    Science.gov (United States)

    Walawender, Jakub; Kothe, Steffen; Trentmann, Jörg; Pfeifroth, Uwe; Cremer, Roswitha

    2017-04-01

    The purpose of this study is to create a 1 km2 gridded daily sunshine duration data record for Germany covering the period from 1983 to 2015 (33 years) based on satellite estimates of direct normalised surface solar radiation and in situ sunshine duration observations using a geostatistical approach. The CM SAF SARAH direct normalized irradiance (DNI) satellite climate data record and in situ observations of sunshine duration from 121 weather stations operated by DWD are used as input datasets. The selected period of 33 years is associated with the availability of satellite data. The number of ground stations is limited to 121 as there are only time series with less than 10% of missing observations over the selected period included to keep the long-term consistency of the output sunshine duration data record. In the first step, DNI data record is used to derive sunshine hours by applying WMO threshold of 120 W/m2 (SDU = DNI ≥ 120 W/m2) and weighting of sunny slots to correct the sunshine length between two instantaneous image data due to cloud movement. In the second step, linear regression between SDU and in situ sunshine duration is calculated to adjust the satellite product to the ground observations and the output regression coefficients are applied to create a regression grid. In the last step regression residuals are interpolated with ordinary kriging and added to the regression grid. A comprehensive accuracy assessment of the gridded sunshine duration data record is performed by calculating prediction errors (cross-validation routine). "R" is used for data processing. A short analysis of the spatial distribution and temporal variability of sunshine duration over Germany based on the created dataset will be presented. The gridded sunshine duration data are useful for applications in various climate-related studies, agriculture and solar energy potential calculations.
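
    The first processing step described above can be sketched as follows: an instantaneous DNI slot counts as sunny when it meets the WMO 120 W/m2 threshold, and slot counts are converted into daily sunshine hours. The 30-minute slot spacing and the DNI values are assumptions; the study's slot weighting for cloud movement is not reproduced.

```python
# Minimal sketch: daily sunshine duration from thresholded instantaneous DNI slots.

import numpy as np

SLOT_HOURS = 0.5                         # assumed spacing of instantaneous satellite slots
dni = np.array([0, 35, 180, 240, 410, 520, 95, 130, 300, 60, 0, 0])  # W/m2, one day

sunny = dni >= 120.0                     # WMO sunshine threshold
sdu_hours = sunny.sum() * SLOT_HOURS     # daily sunshine duration estimate
print(sdu_hours, "h")
```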

  16. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.

  17. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.
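
    A minimal sketch of the natural visibility graph mapping underlying both records above: every sample becomes a node, and two samples are linked when the straight line between them stays above all intermediate samples. The random-walk series stands in for a stock-index segment.

```python
# Minimal sketch: natural visibility graph edges for a short time series.

import numpy as np

def visibility_edges(y):
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = y[i] + (y[j] - y[i]) * (ks - i) / (j - i)
            if np.all(y[ks] < line):               # no intermediate point blocks the view
                edges.append((i, j))
    return edges

rng = np.random.default_rng(7)
series = np.cumsum(rng.standard_normal(60))        # stand-in for a stock-index segment
edges = visibility_edges(series)
print(len(edges), "edges; mean degree:", 2 * len(edges) / len(series))
```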

  18. Procedure for the record, calculation and analysis of costs at the Post Company of Cuba.

    Directory of Open Access Journals (Sweden)

    María Luisa Lara Zayas

    2012-12-01

    Full Text Available The Cuban enterprise is immersed in important changes, which lead to a new economic model that requires increasing labor productivity and improving economic efficiency through the rational use of material, financial and human resources. In the present work, a procedure based on the application of cost techniques is proposed for the recording, calculation and analysis of the costs of activities at the Post Company of Cuba in Sancti Spiritus, with the objective of achieving greater efficiency from the rational use of resources.

  19. Social science and linguistic text analysis of nurses' records: a systematic review and critique.

    Science.gov (United States)

    Buus, Niels; Hamilton, Bridget Elizabeth

    2016-03-01

    The two aims of the paper were to systematically review and critique social science and linguistic text analyses of nursing records in order to inform future research in this emerging area. Systematic searches in reference databases and in citation indexes identified 12 articles that included analyses of the social and linguistic features of records and recording. Two reviewers extracted data using established criteria for the evaluation of qualitative research papers. A common characteristic of nursing records was the economical use of language with local meanings that conveyed little information to the uninitiated reader. Records were dominated by technocratic-medical discourse focused on patients' bodies, and they depicted only very limited aspects of nursing practice. Nurses made moral evaluations in their categorisation of patients, which reflected detailed surveillance of patients' disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More probing empirical research of nurses' records and a wider range of theoretical perspectives have the potential to expose the situated meanings of nursing work in healthcare organisations. © 2015 John Wiley & Sons Ltd.

  20. Automatic lameness detection based on consecutive 3D-video recordings

    NARCIS (Netherlands)

    Hertem, van T.; Viazzi, S.; Steensels, M.; Maltz, E.; Antler, A.; Alchanatis, V.; Schlageter-Tello, A.; Lokhorst, C.; Romanini, C.E.B.; Bahr, C.; Berckmans, D.; Halachmi, I.

    2014-01-01

    Manual locomotion scoring for lameness detection is a time-consuming and subjective procedure. Therefore, the objective of this study is to optimise the classification output of a computer vision based algorithm for automated lameness scoring. Cow gait recordings were made during four consecutive

  1. Exploring Type-and-Identity-Based Proxy Re-Encryption Scheme to Securely Manage Personal Health Records

    NARCIS (Netherlands)

    Ibraimi, L.; Gangopadhyay, Aryya; Tang, Qiang; Hartel, Pieter H.; Jonker, Willem

    2010-01-01

    Commercial Web-based Personal-Health Record (PHR) systems can help patients to share their personal health records (PHRs) anytime from anywhere. PHRs are very sensitive data and an inappropriate disclosure may cause serious problems to an individual. Therefore commercial Web-based PHR systems have

  2. PATH ANALYSIS OF RECORDING SYSTEM INNOVATION FACTORS AFFECTING ADOPTION OF GOAT FARMERS

    Directory of Open Access Journals (Sweden)

    S. Okkyla

    2014-09-01

    Full Text Available The objective of this study was to evaluate the path analysis of recording system innovation factors affecting adoption by goat farmers. This study was conducted from January to February 2014 in Pringapus District, Semarang Regency, using a survey method. Purposive sampling was used to determine the location, and the number of respondents was determined by quota sampling. A total of 146 farmers were randomly chosen as respondents. The data were descriptively and quantitatively analyzed by using path analysis in the Statistical Package for the Social Sciences (SPSS 16). Independent variables in this study were internal factor, motivation, innovation characteristics and information source, and the dependent variable was adoption. Analysis of linear regression showed that there was no significant effect of internal factor on adoption, so it was important to use the trimming method in path analysis. The result of path analysis showed that the influences of motivation, innovation characteristics and information source on adoption were 0.168, 0.720 and 0.09, respectively. Innovation characteristics had the greatest effect on adoption. In conclusion, improving the innovation characteristics of respondents through motivation and information source may significantly increase the adoption of recording systems by goat farmers.

  3. Transfer function for a superficial layer. Parametric analysis and relationship with SM records

    International Nuclear Information System (INIS)

    Sandi, H.; Stancu, O.

    2002-01-01

    The developments presented were aimed at providing analytical and computational support for a research project intended to examine the contribution of source mechanism and of local conditions to the features of ground motion due to Vrancea earthquakes. The project referred to is being developed jointly by the Academy of Technical Sciences of Romania, the Institute of Geodynamics, the Technical University of Civil Engineering, Bucharest, and GEOTEC, Bucharest. The modelling of the phenomenon of seismic oscillations of the ground was based on assumptions of physical and geometrical linearity. The dynamic systems considered were assumed to consist of a sequence of plane-parallel homogeneous geologic layers, accepting that the relevant physical characteristics (thickness, density, low-frequency S-wave velocity, rheological characteristic) are constant within a layer but may change from one layer to another. Alternative constitutive laws were considered (the laws referred to were of Kelvin-Voigt, Poynting and Sorokin types). The transfer function of a geological package is determined as a product of the transfer functions of the successive homogeneous layers. A first step of analysis corresponded to the consideration of a single homogeneous layer, for which full analytical solutions could be derived. A parametric analysis, aimed at determining the transfer function, was undertaken considering alternative (credible) values for the parameters characterizing the constitutive laws referred to. Considering alternative possible situations, it turned out that a strong amplification occurs (for any type of constitutive law) especially for the fundamental mode of the dynamic system, while the amplification is weaker for the upper normal modes. These results correlate well with the outcome of analysis of the spectral content of ground motion as obtained from the processing of strong motion records. The most striking fact is represented by the important modifications of the
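
    As a worked special case of the layer transfer functions discussed above, the hedged Python sketch below evaluates the textbook amplification function of a single homogeneous Kelvin-Voigt layer over rigid bedrock, |H(w)| = 1/|cos(wH/Vs*)| with complex S-wave velocity Vs* = Vs*sqrt(1 + 2i*xi); the layer thickness, velocity and damping values are assumed purely for illustration.

      import numpy as np

      H = 30.0     # layer thickness [m] (assumed value)
      Vs = 200.0   # low-strain S-wave velocity [m/s] (assumed value)
      xi = 0.05    # Kelvin-Voigt damping ratio (assumed value)

      f = np.linspace(0.1, 20.0, 2000)          # frequency [Hz]
      omega = 2.0 * np.pi * f
      Vs_star = Vs * np.sqrt(1.0 + 2.0j * xi)   # complex S-wave velocity of the damped layer
      amplification = 1.0 / np.abs(np.cos(omega * H / Vs_star))

      i_peak = int(np.argmax(amplification))
      print(f"peak amplification {amplification[i_peak]:.1f} near f = {f[i_peak]:.2f} Hz")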

  4. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
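
    One widely used form of geometric data perturbation combines a random orthogonal rotation, a random translation and small additive noise applied to the numeric attribute matrix before it is outsourced. The Python sketch below illustrates that general idea on invented toy data; it is not the authors' exact construction and it omits the key-management and query-processing aspects of the proposed scheme.

      import numpy as np

      rng = np.random.default_rng(42)

      def geometric_perturbation(data):
          """Perturb an (n_records, n_attributes) matrix of numeric values."""
          n, d = data.shape
          # Random orthogonal matrix from the QR decomposition of a Gaussian matrix
          q, _ = np.linalg.qr(rng.normal(size=(d, d)))
          translation = rng.normal(scale=1.0, size=d)
          noise = rng.normal(scale=0.01, size=(n, d))
          return data @ q + translation + noise, (q, translation)

      records = rng.normal(size=(5, 4))          # toy stand-in for numeric PHR attributes
      perturbed, key = geometric_perturbation(records)
      print(perturbed.round(2))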

  5. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Science.gov (United States)

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  6. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Directory of Open Access Journals (Sweden)

    S. Balasubramaniam

    2015-01-01

    Full Text Available Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  7. Development of an algorithm for heartbeats detection and classification in Holter records based on temporal and morphological features

    International Nuclear Information System (INIS)

    García, A; Romano, H; Laciar, E; Correa, R

    2011-01-01

    In this work a detection and classification algorithm for heartbeat analysis in Holter records was developed. First, a QRS complex detector was implemented and the temporal and morphological characteristics of the detected complexes were extracted. A vector was built with these features; this vector is the input to the classification module, which is based on discriminant analysis. The beats were classified into three groups: Premature Ventricular Contraction beat (PVC), Atrial Premature Contraction beat (APC) and Normal Beat (NB). These beat categories represent the most important groups in commercial Holter systems. The developed algorithms were evaluated on 76 ECG records from two validated open-access databases, the MIT-BIH Arrhythmia Database and the MIT-BIH Supraventricular Arrhythmia Database. A total of 166343 beats were detected and analyzed, where the QRS detection algorithm provided a sensitivity of 99.69% and a positive predictive value of 99.84%. The classification stage gave sensitivities of 97.17% for NB, 97.67% for PVC and 92.78% for APC.
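
    The classification stage described above can be sketched with an off-the-shelf linear discriminant classifier; the snippet below uses synthetic stand-ins for the temporal and morphological feature vectors and is only meant to illustrate the shape of the pipeline, not to reproduce the authors' implementation.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import classification_report

      rng = np.random.default_rng(1)
      # Synthetic 4-dimensional feature vectors (e.g. RR interval, QRS width, R amplitude, QRS area)
      class_means = ([0.0, 0.0, 0.0, 0.0], [1.5, 1.0, 0.5, 0.8], [0.8, -0.5, 1.0, 0.2])
      X = np.vstack([rng.normal(loc=m, scale=0.3, size=(300, 4)) for m in class_means])
      y = np.repeat(["NB", "PVC", "APC"], 300)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
      print(classification_report(y_test, clf.predict(X_test)))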

  8. A microcontroller-based portable electrocardiograph recorder.

    Science.gov (United States)

    Segura-Juárez, José J; Cuesta-Frau, David; Samblas-Pena, Luis; Aboy, Mateo

    2004-09-01

    We describe a low cost portable Holter design that can be implemented with off-the-shelf components. The recorder is battery powered and includes a graphical display and keyboard. The recorder is capable of acquiring up to 48 hours of continuous electrocardiogram data at a sample rate of up to 250 Hz.

  9. Analysis of deep brain stimulation electrode characteristics for neural recording

    Science.gov (United States)

    Kent, Alexander R.; Grill, Warren M.

    2014-08-01

    Objective. Closed-loop deep brain stimulation (DBS) systems have the potential to optimize treatment of movement disorders by enabling automatic adjustment of stimulation parameters based on a feedback signal. Evoked compound action potentials (ECAPs) and local field potentials (LFPs) recorded from the DBS electrode may serve as suitable closed-loop control signals. The objective of this study was to understand better the factors that influence ECAP and LFP recording, including the physical presence of the electrode, the geometrical dimensions of the electrode, and changes in the composition of the peri-electrode space across recording conditions. Approach. Coupled volume conductor-neuron models were used to calculate single-unit activity as well as ECAP responses and LFP activity from a population of model thalamic neurons. Main results. Comparing ECAPs and LFPs measured with and without the presence of the highly conductive recording contacts, we found that the presence of these contacts had a negligible effect on the magnitude of single-unit recordings, ECAPs (7% RMS difference between waveforms), and LFPs (5% change in signal magnitude). Spatial averaging across the contact surface decreased the ECAP magnitude in a phase-dependent manner (74% RMS difference), resulting from a differential effect of the contact on the contribution from nearby or distant elements, and decreased the LFP magnitude (25% change). Reductions in the electrode diameter or recording contact length increased signal energy and increased spatial sensitivity of single neuron recordings. Moreover, smaller diameter electrodes (500 µm) were more selective for recording from local cells over passing axons, with the opposite true for larger diameters (1500 µm). Changes in electrode dimensions had phase-dependent effects on ECAP characteristics, and generally had small effects on the LFP magnitude. ECAP signal energy and LFP magnitude decreased with tighter contact spacing (100 µm), compared to

  10. Single-intensity-recording optical encryption technique based on phase retrieval algorithm and QR code

    Science.gov (United States)

    Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi

    2014-12-01

    Based on a phase retrieval algorithm and the QR code, a new optical encryption technique that needs to record only one intensity distribution is proposed. In the encryption process, the QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encoding. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm; a priori information about the QR code is used as a support constraint in the input plane, which helps overcome the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process is digital. In addition, the security of the proposed optical encryption technique is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.

  11. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    Science.gov (United States)

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice on the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion and the variability of the spatial center of motion of the infant, respectively. In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs. These results demonstrate the test-retest reliability of computer-based video analysis of GMs, and a significant association between the computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
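
    For reference, the two reliability coefficients used above can be computed from an (infants x sessions) matrix of a movement variable with the Shrout and Fleiss formulas; the hedged sketch below uses synthetic data and assumes two recording sessions per infant.

      import numpy as np

      def icc_1_1_and_3_1(Y):
          """ICC(1,1) and ICC(3,1) for an (n_subjects x k_sessions) matrix (Shrout & Fleiss)."""
          n, k = Y.shape
          grand = Y.mean()
          ss_total = ((Y - grand) ** 2).sum()
          ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()   # between subjects
          ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()   # between sessions
          ss_within = ss_total - ss_rows                        # within subjects
          ss_error = ss_total - ss_rows - ss_cols               # residual

          bms = ss_rows / (n - 1)
          wms = ss_within / (n * (k - 1))
          ems = ss_error / ((n - 1) * (k - 1))

          icc11 = (bms - wms) / (bms + (k - 1) * wms)
          icc31 = (bms - ems) / (bms + (k - 1) * ems)
          return icc11, icc31

      rng = np.random.default_rng(3)
      subject_effect = rng.normal(size=(75, 1))                       # 75 infants
      measurements = subject_effect + rng.normal(scale=0.5, size=(75, 2))  # two recordings each
      print(icc_1_1_and_3_1(measurements))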

  12. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study a bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, together with their corresponding results. In the proposed technique the time-domain parameter QRSD and the most significant time-frequency index, EN QRS, are used as variables, and a bivariable index combining these parameters is defined. The proposed technique allows the risk of ventricular tachycardia in post-myocardial-infarction patients to be evaluated. The results show that the bivariable index discriminates between the patient population with ventricular tachycardia and the subjects of the control group. It was also found that the bivariable technique performs well as a diagnostic test. It is concluded that, comparatively, the performance of the bivariable technique as a diagnostic test is superior to that of the time-domain method and the time-frequency technique evaluated individually

  13. Thermal-Signature-Based Sleep Analysis Sensor

    Directory of Open Access Journals (Sweden)

    Ali Seba

    2017-10-01

    Full Text Available This paper addresses the development of a new technique in the sleep analysis domain. Sleep is defined as a periodic physiological state during which vigilance is suspended and reactivity to external stimulations diminished. We sleep on average between six and nine hours per night and our sleep is composed of four to six cycles of about 90 min each. Each of these cycles is composed of a succession of several stages of sleep that vary in depth. Analysis of sleep is usually done via polysomnography. This examination consists of recording, among other things, electrical cerebral activity by electroencephalography (EEG, ocular movements by electrooculography (EOG, and chin muscle tone by electromyography (EMG. Recordings are made mostly in a hospital, more specifically in a service for monitoring the pathologies related to sleep. The readings are then interpreted manually by an expert to generate a hypnogram, a curve showing the succession of sleep stages during the night in 30s epochs. The proposed method is based on the follow-up of the thermal signature that makes it possible to classify the activity into three classes: “awakening,” “calm sleep,” and “restless sleep”. The contribution of this non-invasive method is part of the screening of sleep disorders, to be validated by a more complete analysis of the sleep. The measure provided by this new system, based on temperature monitoring (patient and ambient, aims to be integrated into the tele-medicine platform developed within the framework of the Smart-EEG project by the SYEL–SYstèmes ELectroniques team. Analysis of the data collected during the first surveys carried out with this method showed a correlation between thermal signature and activity during sleep. The advantage of this method lies in its simplicity and the possibility of carrying out measurements of activity during sleep and without direct contact with the patient at home or hospitals.

  14. RECORDS REACHING RECORDING DATA TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    G. W. L. Gresik

    2013-07-01

    Full Text Available The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. The low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired by using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  15. Records Reaching Recording Data Technologies

    Science.gov (United States)

    Gresik, G. W. L.; Siebe, S.; Drewello, R.

    2013-07-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. The low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired by using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  16. Electronic Health Record Systems and Intent to Apply for Meaningful Use Incentives among Office-based Physician ...

    Science.gov (United States)

    In 2011, 57% of office-based physicians used electronic medical record/electronic health record (EMR/EHR) systems, ...

  17. Validity and practicability of smartphone-based photographic food records for estimating energy and nutrient intake.

    Science.gov (United States)

    Kong, Kaimeng; Zhang, Lulu; Huang, Lisu; Tao, Yexuan

    2017-05-01

    Image-assisted dietary assessment methods are frequently used to record individual eating habits. This study tested the validity of a smartphone-based photographic food recording approach by comparing the results obtained with those of a weighed food record. We also assessed the practicality of the method by using it to measure the energy and nutrient intake of college students. The experiment was implemented in two phases, each lasting 2 weeks. In the first phase, a labelled menu and a photograph database were constructed. The energy and nutrient content of 31 randomly selected dishes in three different portion sizes were then estimated by the photograph-based method and compared with a weighed food record. In the second phase, we combined the smartphone-based photographic method with the WeChat smartphone application and applied this to 120 randomly selected participants to record their energy and nutrient intake. The Pearson correlation coefficients for energy, protein, fat, and carbohydrate content between the weighed and the photographic food record were 0.997, 0.936, 0.996, and 0.999, respectively. Bland-Altman plots showed good agreement between the two methods. The estimated protein, fat, and carbohydrate intake by participants was in accordance with values in the Chinese Residents' Nutrition and Chronic Disease report (2015). Participants expressed satisfaction with the new approach and the compliance rate was 97.5%. The smartphone-based photographic dietary assessment method combined with the WeChat instant messaging application was effective and practical for use by young people.
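
    The two agreement measures reported above, the Pearson correlation and Bland-Altman limits of agreement, can be computed as in the following hedged sketch; the dish energy values here are simulated and are not the study's data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      weighed = rng.uniform(100, 900, size=31)            # energy per dish from the weighed record (e.g. kcal)
      photo = weighed + rng.normal(scale=25, size=31)     # photograph-based estimate

      r, p = stats.pearsonr(weighed, photo)
      diff = photo - weighed
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)
      print(f"Pearson r = {r:.3f} (p = {p:.3g})")
      print(f"Bland-Altman bias = {bias:.1f}, limits of agreement = [{bias - loa:.1f}, {bias + loa:.1f}]")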

  18. The present state of the medical record data base for the A-bomb survivors in Nagasaki University

    International Nuclear Information System (INIS)

    Mori, Hiroyuki; Mine, Mariko; Kondo, Hisayoshi; Okumura, Yutaka

    1992-01-01

    It has been 13 years since the operation of medical record data base for A-bomb survivors was started in the Scientific Data Center for Atomic Bomb Disaster at the Nagasaki University. This paper presents the basic data in handling the data base. The present data base consists of the following 6 items: (1) 'fundamental data' for approximately 120,000 A-bomb survivors having an A-bomb survivors' handbook who have been living in Nagasaki City; (2) 'Nagasaki Atomic Bomb Hospital's data', covering admission medical records in the ward of internal medicine; (3) 'pathological data', covering autopsy records in Nagasaki City; (4) 'household data reconstructed by the survey data'; (5) 'second generation A-bomb survivors data', including the results of mass screening since 1979, and (6) 'address data'. Based on the data, the number of A-bomb survivors, diagnosis records at the time of death, the number of A-bomb survivors' participants in health examination, tumor registration, records of admission to the internal ward in Nagasaki Atomic Bomb Hospital, autopsy records, and household records are tabulated in relation to annual changes, age at the time of A-bombing, distance from the hypocenter, or sex. (N.K.)

  19. K-Anonymity Based Privacy Risk Budgeting System for Interactive Record Linkage

    Directory of Open Access Journals (Sweden)

    Hye-Chung Kum

    2017-04-01

    The k-anonymity based privacy risk budgeting system provides a mechanism where we can concretely reason about the tradeoff between the privacy risks due to information disclosed, accuracy gained, and biases reduced during interactive record linkage.
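
    As a minimal illustration of the quantity being budgeted, the sketch below computes the k-anonymity of a released table as the size of the smallest group of records sharing the same quasi-identifier values; the toy table and column names are invented.

      from collections import Counter

      def k_anonymity(rows, quasi_identifiers):
          """rows: list of dicts; quasi_identifiers: list of column names."""
          groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
          return min(groups.values())

      release = [
          {"zip": "77840", "birth_year": 1980, "sex": "F", "diagnosis": "A"},
          {"zip": "77840", "birth_year": 1980, "sex": "F", "diagnosis": "B"},
          {"zip": "77845", "birth_year": 1975, "sex": "M", "diagnosis": "C"},
      ]
      print(k_anonymity(release, ["zip", "birth_year", "sex"]))   # -> 1, i.e. high re-identification risk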

  20. Probe-based recording technology

    International Nuclear Information System (INIS)

    Naberhuis, Steve

    2002-01-01

    The invention of the scanning tunneling microscope (STM) prompted researchers to contemplate whether such technology could be used as the basis for the storage and retrieval of information. With magnetic data storage technology facing limits in storage density due to the thermal instability of magnetic bits, the super-paramagnetic limit, the heir-apparent for information storage at higher densities appeared to be variants of the STM or similar probe-based storage techniques such as atomic force microscopy (AFM). Among the other techniques that could provide replacement technology for magnetic storage, near-field scanning optical microscopy (NSOM or SNOM) has also been investigated. Another alternative probe-based storage technology called atomic resolution storage (ARS) is also currently under development. An overview of these various technologies is herein presented, with an analysis of the advantages and disadvantages inherent in each, particularly with respect to reduced device dimensions. The role of micro-electro-mechanical systems (MEMS) is emphasized.

  1. Recording of electrohysterogram laplacian potential.

    Science.gov (United States)

    Alberola-Rubio, J; Garcia-Casado, J; Ye-Lin, Y; Prats-Boluda, G; Perales, A

    2011-01-01

    Preterm birth is the main cause of the neonatal morbidity. Noninvasive recording of uterine myoelectrical activity (electrohysterogram, EHG) could be an alternative to the monitoring of uterine dynamics which are currently based on tocodynamometers (TOCO). The analysis of uterine electromyogram characteristics could help the early diagnosis of preterm birth. Laplacian recordings of other bioelectrical signals have proved to enhance spatial selectivity and to reduce interferences in comparison to monopolar and bipolar surface recordings. The main objective of this paper is to check the feasibility of the noninvasive recording of uterine myoelectrical activity by means of laplacian techniques. Four bipolar EHG signals, discrete laplacian obtained from five monopolar electrodes and the signals picked up by two active concentric-ringed-electrodes were recorded on 5 women with spontaneous or induced labor. Intrauterine pressure (IUP) and TOCO were also simultaneously recorded. To evaluate the uterine contraction detectability of the different noninvasive methods in comparison to IUP the contractions consistency index (CCI) was calculated. Results show that TOCO is less consistent (83%) than most EHG bipolar recording channels (91%, 83%, 87%, and 76%) to detect the uterine contractions identified in IUP. Moreover laplacian EHG signals picked up by ringed-electrodes proved to be as consistent (91%) as the best bipolar recordings in addition to significantly reduce ECG interference.
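
    A common way to form the discrete Laplacian estimate mentioned above is to subtract the mean of four surrounding monopolar channels from the central channel; the sketch below assumes that five-electrode cross arrangement and uses synthetic signals, so the exact montage and scaling used in the study may differ.

      import numpy as np

      def discrete_laplacian(center, north, south, east, west):
          """All arguments are 1-D arrays of monopolar EHG samples of equal length."""
          surround = np.mean([north, south, east, west], axis=0)
          return center - surround

      fs = 20.0                                # assumed sampling rate [Hz]
      t = np.arange(0, 60, 1.0 / fs)
      # Five synthetic monopolar channels sharing a slow contraction-like oscillation plus noise
      monopolar = [np.sin(2 * np.pi * 0.4 * t) + 0.1 * np.random.randn(t.size) for _ in range(5)]
      lap = discrete_laplacian(*monopolar)
      print(lap[:5])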

  2. Earthquake response analysis of a base isolated building

    International Nuclear Information System (INIS)

    Mazda, T.; Shiojiri, H.; Sawada, Y.; Harada, O.; Kawai, N.; Ontsuka, S.

    1989-01-01

    Recently, seismic isolation has become a popular method in the design of important structures and equipment against earthquakes. However, it is desirable to accumulate demonstration data on the reliability of seismically isolated structures and to establish analysis methods for those structures. Based on the above recognition, vibration tests of a base-isolated building were carried out in Tsukuba Science City, and many earthquake records have since been obtained at the building. In order to examine the validity of the numerical models, earthquake response analyses have been executed using both a lumped-mass model and a finite element model

  3. An Enterprise Architecture Perspective to Electronic Health Record Based Care Governance.

    Science.gov (United States)

    Motoc, Bogdan

    2017-01-01

    This paper proposes an Enterprise Architecture viewpoint of Electronic Health Record (EHR) based care governance. The improvements expected are derived from the collaboration framework and the clinical health model proposed as foundation for the concept of EHR.

  4. Urban Noise Recorded by Stationary Monitoring Stations

    Science.gov (United States)

    Bąkowski, Andrzej; Radziszewski, Leszek; Dekýš, Vladimir

    2017-10-01

    The paper presents the analysis results of equivalent sound level recorded by two road traffic noise monitoring stations. The stations were located in Kielce (an example of a medium-size town in Poland) at the roads in the town in the direction of Łódź and Lublin. The measurements were carried out through stationary stations monitoring the noise and traffic of motor vehicles. The RMS values based on A-weighted sound level were recorded every 1 s in the buffer and the results were registered every 1 min over the period of investigations. The registered data were the basis for calculating the equivalent sound level for three time intervals: from 6:00 to 18:00, from 18:00 to 22:00 and from 22:00 to 6:00. Analysis included the values of the equivalent sound level recorded for different days of the week split into 24h periods, nights, days and evenings. The data analysed included recordings from 2013. The agreement of the distribution of the variable under analysis with normal distribution was evaluated. It was demonstrated that in most cases (for both roads) there was sufficient evidence to reject the null hypothesis at the significance level of 0.05. It was noted that compared with Łódź Road, in the case of Lublin Road data, more cases were recorded for which the null hypothesis could not be rejected. Uncertainties of the equivalent sound level measurements were compared within the periods under analysis. The standard deviation, coefficient of variation, the positional coefficient of variation, the quartile deviation was proposed for performing a comparative analysis of the obtained data scattering. The investigations indicated that the recorded data varied depending on the traffic routes and time intervals. The differences concerned the values of uncertainties and coefficients of variation of the equivalent sound levels.
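
    The equivalent sound level aggregated above follows the standard energetic average of the 1-second A-weighted levels, Leq = 10*log10((1/N)*sum(10^(Li/10))); a minimal sketch with synthetic data:

      import numpy as np

      def leq(levels_db):
          """Equivalent continuous sound level of a sequence of A-weighted levels [dB(A)]."""
          levels_db = np.asarray(levels_db, dtype=float)
          return 10.0 * np.log10(np.mean(10.0 ** (levels_db / 10.0)))

      # Synthetic 06:00-18:00 record of 1-second A-weighted levels
      one_second_levels = np.random.normal(65.0, 4.0, size=12 * 3600)
      print(f"Leq(06-18) = {leq(one_second_levels):.1f} dB(A)")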

  5. Validity of recalled v. recorded birth weight: a systematic review and meta-analysis

    OpenAIRE

    Shenkin, S.D.; Zhang, M.G.; Der, G.; Mathur, S.; Mina, T.H.; Reynolds, R.M.

    2017-01-01

    Low birth weight is associated with adverse health outcomes. If birth weight records are not available, studies may use recalled birth weight. It is unclear whether this is reliable. We performed a systematic review and meta-analysis of studies comparing recalled with recorded birth weights. We followed the Meta-Analyses of Observational Studies in Epidemiology (MOOSE) statement and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We searched MEDLINE, EM...

  6. High-efficient and high-content cytotoxic recording via dynamic and continuous cell-based impedance biosensor technology.

    Science.gov (United States)

    Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping

    2016-10-01

    Cell-based bioassays are an effective method for assessing compound toxicity through cell viability, but traditional label-based methods miss much information about cell growth because they rely on endpoint detection, and higher throughput is demanded to obtain dynamic information. Cell-based biosensor methods can monitor cell viability dynamically and continuously; however, the dynamic information is often ignored or seldom utilized in toxin and drug assessment. Here, we report a highly efficient, high-content cytotoxicity recording method based on dynamic and continuous cell-based impedance biosensor technology. The dynamic cell viability, inhibition ratio and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensor responds in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of the dynamic cell viability and cell growth status. Moreover, the throughput of dynamic cytotoxicity assessment was compared between cell-based biosensor methods and label-based endpoint methods. This cell-based impedance biosensor can provide a flexible, cost- and label-efficient platform for cell viability assessment in shellfish toxin screening.

  7. Natural Language Processing Based Instrument for Classification of Free Text Medical Records

    Directory of Open Access Journals (Sweden)

    Manana Khachidze

    2016-01-01

    Full Text Available According to the Ministry of Labor, Health and Social Affairs of Georgia, a new health management system has to be introduced in the near future. In this context arises the problem of structuring and classifying documents containing the entire history of medical services provided. The present work introduces an instrument for the classification of medical records in the Georgian language; it is the first attempt at such a classification of Georgian-language medical records. In total, 24,855 examination records were studied. The documents were classified into three main groups (ultrasonography, endoscopy, and X-ray) and 13 subgroups using two well-known methods: Support Vector Machine (SVM) and K-Nearest Neighbor (KNN). The results obtained demonstrated that both machine learning methods performed successfully, with SVM performing slightly better. In the process of classification a "shrink" method, based on feature selection, was introduced and applied. At the first stage of classification the results of the "shrink" case were better; however, at the second stage of classification into subclasses, 23% of all documents could not be linked to only one definite individual subclass (liver or biliary system) due to common features characterizing these subclasses. The overall results of the study were successful.
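
    The general shape of such a pipeline (bag-of-words features plus SVM and k-NN classifiers) can be sketched with scikit-learn as below; the toy documents and labels are invented English stand-ins, not the Georgian-language corpus or feature set used in the study.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.svm import LinearSVC
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline

      # Toy stand-ins for examination records and their modality labels
      docs = ["liver echotexture normal",
              "gastric mucosa inspected endoscopically",
              "chest radiograph shows no infiltrate",
              "gallbladder wall thickened on ultrasound"]
      labels = ["ultrasonography", "endoscopy", "x-ray", "ultrasonography"]

      svm = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(docs, labels)
      knn = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=1)).fit(docs, labels)
      print(svm.predict(["ultrasound of the liver"]), knn.predict(["endoscopy of the stomach"]))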

  8. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, G.; Smith, J.A.; Serinaldi, F.; Bales, J.; Bates, P.D.; Krajewski, W.F.

    2009-01-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01-annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adopted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades. © 2009 Elsevier Ltd.
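
    GAMLSS itself is an R framework; the hedged Python sketch below only illustrates the underlying idea of a nonstationary fit by estimating, via maximum likelihood, a Gumbel distribution whose location and log-scale vary linearly with time, and then evaluating a time-varying 0.01 annual-exceedance-probability quantile. The data are simulated, and the linear covariate stands in for the cubic splines used in the paper.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(11)
      years = np.arange(1925, 2008)
      t = (years - years.mean()) / years.std()
      peaks = rng.gumbel(loc=100 + 30 * t, scale=25, size=years.size)   # synthetic annual maxima

      def neg_log_lik(params):
          a, b, c, d = params
          mu = a + b * t
          sigma = np.exp(c + d * t)
          z = (peaks - mu) / sigma
          return np.sum(np.log(sigma) + z + np.exp(-z))   # Gumbel negative log-likelihood

      fit = minimize(neg_log_lik, x0=[peaks.mean(), 0.0, np.log(peaks.std()), 0.0], method="Nelder-Mead")
      a, b, c, d = fit.x
      # 0.01 annual exceedance probability quantile for the final year of record
      mu_T, sigma_T = a + b * t[-1], np.exp(c + d * t[-1])
      q100 = mu_T - sigma_T * np.log(-np.log(0.99))
      print(f"time-varying '100-year' peak in {years[-1]}: {q100:.1f}")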

  9. Secure management of personal health records by applying attribute-based encryption

    NARCIS (Netherlands)

    Ibraimi, L.; Asim, M.; Petkovic, M.

    2009-01-01

    The confidentiality of personal health records is a major problem when patients use commercial Web-based systems to store their health data. Traditional access control mechanisms have several limitations with respect to enforcing access control policies and ensuring data confidentiality. In

  10. Results from a survey of national immunization programmes on home-based vaccination record practices in 2013

    OpenAIRE

    Young, Stacy L.; Gacic-Dobo, Marta; Brown, David W.

    2015-01-01

    Background Data on home-based records (HBRs) practices within national immunization programmes are non-existent, making it difficult to determine whether current efforts of immunization programmes related to basic recording of immunization services are appropriately focused. Methods During January 2014, WHO and the United Nations Children's Fund sent a one-page questionnaire to 195 countries to obtain information on HBRs including type of record used, number of records printed, whether record...

  11. THE EVOLUTION OF THE FILM ANALYSIS OF INTERACTION RECORD (FAIR) FROM THE AMIDON-FLANDERS INTERACTION ANALYSIS. APPENDIX G.

    Science.gov (United States)

    BALDWIN, PATRICIA

    A detailed listing is given of the revisions that were made to the Amidon-Flanders Interaction Analysis scale while the Film Analysis of Interaction Record (FAIR) scale was being developed. Comments are given for guidance in the use of some of the ratings, along with some ground rules and guidelines for making a film rating. Related reports are AA…

  12. Time series analysis of soil Radon-222 recorded at Kutch region, Gujarat, India

    International Nuclear Information System (INIS)

    Madhusudan Rao, K.; Rastogi, B.K.; Barman, Chiranjib; Chaudhuri, Hirok

    2013-01-01

    The Kutch region in Gujarat lies in a seismically vulnerable zone (seismic zone V). After the devastating Bhuj earthquake (M 7.7) of January 26, 2001 in the Kutch region, several researchers focused their attention on monitoring geophysical and geochemical earthquake precursors in the region. In order to identify possible geochemical precursory signals of earthquake events, we monitored the radioactive gas radon-222 in subsurface soil gas in the Kutch region. We analysed the recorded soil radon-222 time series by means of nonlinear techniques such as FFT power spectral analysis, empirical mode decomposition and multifractal analysis, along with other linear statistical methods. Some notable results arising from the nonlinear analysis of the time series are discussed in the present paper. The overall analysis helped us to recognize the nature and pattern of the soil radon-222 emanation process. Moreover, the recording and the statistical and nonlinear analysis of soil radon data in the Kutch region will help us to understand the preparation phase of an imminent seismic event in the region. (author)
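
    The FFT power spectral analysis step can be illustrated with a simple periodogram of a synthetic daily radon series; the sampling interval and the seasonal structure below are assumptions made only for the example, not properties of the Kutch data set.

      import numpy as np
      from scipy import signal

      fs = 1.0                                          # one sample per day (assumed)
      days = np.arange(2 * 365)
      radon = 5.0 + 1.5 * np.sin(2 * np.pi * days / 365.0) + 0.3 * np.random.randn(days.size)

      freqs, psd = signal.periodogram(radon, fs=fs)
      dominant = freqs[np.argmax(psd[1:]) + 1]          # skip the zero-frequency (mean) bin
      print(f"dominant periodicity ~ {1.0 / dominant:.0f} days")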

  13. A preliminary analysis of the reactor-based plutonium disposition alternative deployment schedules

    International Nuclear Information System (INIS)

    Zurn, R.M.

    1997-09-01

    This paper discusses the preliminary analysis of the implementation schedules of the reactor-based plutonium disposition alternatives. These schedule analyses are a part of a larger process to examine the nine decision criteria used to determine the most appropriate method of disposing of U.S. surplus weapons plutonium. The preliminary analysis indicates that the mission durations for the reactor-based alternatives range from eleven years to eighteen years and the initial mission fuel assemblies containing surplus weapons-usable plutonium could be loaded into the reactors between nine and fourteen years after the Record of Decision

  14. Paper-Based Medical Records: the Challenges and Lessons Learned from Studying Obstetrics and Gynaecological Post-Operation Records in a Nigerian Hospital

    Directory of Open Access Journals (Sweden)

    Adekunle Yisau Abdulkadir

    2010-10-01

    Full Text Available AIM: With the background knowledge that auditing of Medical Records (MR) for adequacy and completeness is necessary if they are to be useful and reliable in continuing patient care; in protecting the legal interests of the patient, physicians, and the hospital; and in meeting requirements for research, we scrutinized theatre records of our hospital to identify routine omissions or deficiencies, and correctable errors, in our MR system. METHOD: Obstetrics and gynaecological post-operation theatre records between January 2006 and December 2008 were quantitatively and qualitatively analyzed for details that included: hospital number; patients' age; diagnosis; surgery performed; types and modes of anesthesia; date of surgery; patients' ward; anesthetists' names; surgeons' and attending nurses' names; and abbreviations used, with SPSS 15.0 for Windows. RESULTS: Hardly any of the 1270 surgeries during the study period were documented without an omission or an abbreviation. Hospital numbers and patients' age were not documented in 21.8% (n=277) and 59.1% (n=750) of cases, respectively. Diagnoses and surgeries were recorded with varying abbreviations in about 96% of instances. Surgical team names were mostly abbreviated or given as initials only. CONCLUSION: To improve the quality of the paper-based medical record, regular auditing, training and good orientation of medical personnel towards good record practices, and discouraging large-volume record books to reduce paper damage and sheet loss from handling, are necessary; else what we record today may neither be useful nor available tomorrow. [TAF Prev Med Bull 2010; 9(5): 427-432]

  15. Microscopic analysis on showers recorded as single core on X-ray films

    International Nuclear Information System (INIS)

    Amato, N.M.; Arata, N.; Maldonado, R.H.C.

    1983-01-01

    Cosmic-ray particles recorded as single dark spots on X-ray films are studied using the emulsion chamber data of the Brazil-Japan Collaboration. Some results of the microscopic analysis of such single-core-like showers on nuclear emulsion plates are reported. (Author) [pt

  16. 21 CFR 120.12 - Records.

    Science.gov (United States)

    2010-04-01

    ... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.12 Records. (a... Analysis and Critical Control Point (HACCP) system: (1) Records documenting the implementation of the... § 120.7; (3) The written HACCP plan required by § 120.8; (4) Records documenting the ongoing application...

  17. Deconvolution of the tree ring based delta13C record

    International Nuclear Information System (INIS)

    Peng, T.; Broecker, W.S.; Freyer, H.D.; Trumbore, S.

    1983-01-01

    We assumed the tree-ring-based 13C/12C record constructed by Freyer and Belacy (1983) to be representative of the fossil fuel and forest-soil induced 13C/12C change for atmospheric CO2. Through the use of a modification of the Oeschger et al. ocean model, we have computed the contribution of the combustion of coal, oil, and natural gas to this observed 13C/12C change. A large residual remains when the tree-ring-based record is corrected for the contribution of fossil fuel CO2. A deconvolution was performed on this residual to determine the time history and magnitude of the forest-soil reservoir changes over the past 150 years. Several important conclusions were reached. (1) The magnitude of the integrated CO2 input from these sources was about 1.6 times that from fossil fuels. (2) The forest-soil contribution reached a broad maximum centered at about 1900. (3) Over the two-decade period covered by the Mauna Loa atmospheric CO2 content record, the input from forests and soils was about 30% of that from fossil fuels. (4) The 13C/12C trend over the last 20 years was dominated by the input of fossil fuel CO2. (5) The forest-soil release did not contribute significantly to the secular increase in atmospheric CO2 observed over the last 20 years. (6) The pre-1850 atmospheric pCO2 values must have been in the range 245 to 270 x 10^-6 atmospheres

  18. Design of microcontroller-based EMG and the analysis of EMG signals.

    Science.gov (United States)

    Güler, Nihal Fatma; Hardalaç, Firat

    2002-04-01

    In this work, a microcontroller-based EMG system was designed and tested on 40 patients. With the patients at rest, fast Fourier transform (FFT) analysis was applied to EMG signals recorded from the right-leg peroneal region, and histograms were constructed from the results of the FFT analysis. The analysis results show that for 30 patients the amplitude of the muscle-fibre fibrillation potentials measured from the peroneal region was low and their duration short, indicating motor nerve degeneration; the remaining 10 patients were found to be healthy.

  19. The Private Communications of Magnetic Recording under Socialism (Retrospective Disco Analysis

    Directory of Open Access Journals (Sweden)

    Oleg Vladimir Sineokij

    2013-07-01

    Full Text Available The article analyzes the formation and development of a general model of rare sound recordings within the structure of social communication institutions. The author considers the psycho-communicative features of filophone communication as a special type of interaction in the field of entertainment, and studies the causes and conditions of the tape-recording subculture in the USSR. The dynamics of disco-communication under conditions of limited information are traced from socialism to modern high-tech conditions. At the end of the article the author argues, based on achievements in advanced technology systems, for an innovative revival of the music-recording industry. Hence, using innovative approaches in the study, the author sets out the basic concept of popular music recording as a special informational and legal institution, looking in retrospect at the theory and practice of future needs in the information society.

  20. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by distances of, on average, 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera version 9.1, set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G) in the stationary sensors, and a lens model AF-S ED 24 mm 1:1.4 in the mobile sensor. All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four RAMMER manual operation days in the campaigns of 2012 and 2013. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate some imprecision during the optical analysis; therefore this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  1. Variability Analysis of the Horizontal Geomagnetic Component: A Case Study Based on Records from Vassouras Observatory (Brazil)

    Science.gov (United States)

    Klausner, Virginia; Papa, Andres; Mendes, Odim; Oliveira Domingues, Margarete

    It is well known that any of the components of the magnetic field measured on the Earth's surface presents characteristic frequencies with 24-, 12-, 8- and 6-hour periods. These typical oscillations of the geomagnetic field are known as the solar quiet (Sq) variation and are primarily due to the global thermotidal wind systems that drive currents flowing in the "dynamo region" of the ionosphere, the E-region. In this study, the horizontal component amplitude observed by ground-based observatories belonging to the INTERMAGNET network has been used to analyze the global pattern variance of the Sq variation. In particular we focused our attention on Vassouras Observatory (VSS), Rio de Janeiro, Brazil, which has been active since 1915. In the coming years, a Brazilian network of magnetometers will be implemented, and VSS can be used as a reference. This work aims mainly to highlight and interpret these quiet daily variations over the Brazilian sector compared to the features from other magnetic stations reasonably distributed over the whole Earth's surface. The methodological approach is based on the wavelet cross-correlation technique, which is useful for isolating the periods of the spectral components of the geomagnetic field at each station and for correlating them, as a function of scale (period), between VSS and the other stations. The wavelet cross-correlation coefficient strongly depends on the scale. We study geomagnetically quiet days in equinox and solstice months during low and high solar activity. As preliminary remarks, the results show that the records at the magnetic stations have primarily a latitudinal dependence, affected by the time of year and the level of solar activity. On the other hand, records of magnetic stations located at the same dip latitude but at different longitudes presented some peculiarities. These results indicate that the winds driving the dynamo are very sensitive to the location of the geomagnetic station, i.e., its effects depend upon the direction
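
    A hedged sketch of a wavelet cross-correlation as a function of scale is given below, using PyWavelets' Morlet continuous wavelet transform as a stand-in for the authors' specific wavelet implementation; the two synthetic "station" series merely mimic the 24 h and 12 h Sq harmonics with an arbitrary phase shift.

      import numpy as np
      import pywt

      t = np.arange(0.0, 3 * 24 * 3600.0, 60.0)   # three days, one sample per minute (assumed)

      def sq(phase):
          # Synthetic 24 h and 12 h harmonics of the solar quiet variation
          return (10.0 * np.sin(2.0 * np.pi * t / 86400.0 + phase)
                  + 4.0 * np.sin(2.0 * np.pi * t / 43200.0 + phase))

      h_vss = sq(0.0) + np.random.randn(t.size)
      h_other = sq(0.3) + np.random.randn(t.size)

      scales = np.arange(2, 128)
      c1, _ = pywt.cwt(h_vss, scales, "morl", sampling_period=60.0)
      c2, _ = pywt.cwt(h_other, scales, "morl", sampling_period=60.0)

      # Correlation coefficient between the two stations at each scale
      corr = [np.corrcoef(a, b)[0, 1] for a, b in zip(c1, c2)]
      best = int(np.argmax(corr))
      print(f"max cross-correlation {corr[best]:.2f} at scale {scales[best]}")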

  2. Assessing the spatial representability of charcoal and PAH-based paleofire records with integrated GIS, modelling, and empirical approaches

    Science.gov (United States)

    Vachula, R. S.; Huang, Y.; Russell, J. M.

    2017-12-01

    Lake sediment-based fire reconstructions offer paleoenvironmental context in which to assess modern fires and predict future burning. However, despite their ubiquity, many uncertainties remain regarding the taphonomy of paleofire proxies and the spatial scales over which they record variations in fire history. Here we present down-core proxy analyses of polycyclic aromatic hydrocarbons (PAHs) and three size-fractions of charcoal (63-150, >150 and >250 μm) from Swamp Lake, California, an annually laminated lacustrine archive. Using a statewide historical GIS dataset of area burned, we assess the spatial scales for which these proxies are reliable recorders of fire history. We find that the coherence of observed and proxy-recorded fire history inherently depends upon spatial scale. Contrary to conventional thinking that charcoal mainly records local fires, our results indicate that macroscopic charcoal (>150 μm) may record spatially broader fire history, whereas the coarsest fraction (>250 μm) may be a more conservative proxy for local burning. We find that sub-macroscopic charcoal particles (63-150 μm) reliably record regional (up to 150 km) changes in fire history. These results indicate that charcoal-based fire reconstructions may represent spatially broader fire history than previously thought, which has major implications for our understanding of spatiotemporal paleofire variations. Our analyses of PAHs show that dispersal mobility is heterogeneous between compounds, but that PAH fluxes are reliable proxies of fire history within 25-50 km, which suggests PAHs may be a better spatially constrained paleofire proxy than sedimentary charcoal. Further, using a linear discriminant analysis model informed by modern emissions analyses, we show that PAH assemblages preserved in lake sediments can differentiate the vegetation type burned, and are thus promising paleoecological biomarkers warranting further research and implementation. In sum, our analyses offer new insight into the spatial dimensions of paleofire

  3. Client perceptions of the mental health engagement network: a qualitative analysis of an electronic personal health record.

    Science.gov (United States)

    Forchuk, Cheryl; Reiss, Jeffrey P; O'Regan, Tony; Ethridge, Paige; Donelle, Lorie; Rudnick, Abraham

    2015-10-14

    Information technologies such as websites, mobile phone applications, and virtual reality programs have been shown to deliver innovative and effective treatments for mental illness. Much of the research studying electronic mental health interventions focuses on symptom reduction; however, to facilitate the implementation of electronic interventions in usual mental health care, it is also important to investigate the perceptions of clients who will be using the technologies. To this end, a qualitative analysis of focus group discussions regarding the Mental Health Engagement Network, a web-based personal health record and smartphone intervention, is presented here. Individuals living in the community with a mood or psychotic disorder (n = 394) were provided with a smartphone and access to an electronic personal health record, the Lawson SMART Record, for 12 to 18 months to manage their mental health. This study employed a delayed-implementation design and obtained both quantitative and qualitative data through individual interviews and focus group sessions. Participants had the opportunity to participate in voluntary focus group sessions at three points throughout the study to discuss their perceptions of the technologies. Qualitative data from 95 focus group participants were analysed using a thematic analysis. Four overarching themes emerged from focus group discussions: 1) Versatile functionality of the Lawson SMART Record and smartphone facilitated use; 2) Aspects of the technologies as barriers to use; 3) Use of the Mental Health Engagement Network technologies resulted in perceived positive outcomes; 4) Future enhancement of the Lawson SMART Record and intervention is recommended. These qualitative data provide a valuable contribution to the understanding of how smart technologies can be integrated into usual mental health care. Smartphones are extremely portable and commonplace in society. Therefore, clients can use these devices to manage and track mental

  4. Health monitoring of Ceramic Matrix Composites from waveform-based analysis of Acoustic Emission

    Directory of Open Access Journals (Sweden)

    Maillet Emmanuel

    2015-01-01

    Full Text Available Ceramic Matrix Composites (CMCs are anticipated for use in the hot section of aircraft engines. Their implementation requires the understanding of the various damage modes that are involved and their relation to life expectancy. Acoustic Emission (AE has been shown to be an efficient technique for monitoring damage evolution in CMCs. However, only a waveform-based analysis of AE can offer the possibility to validate and precisely examine the recorded AE data with a view to damage localization and identification. The present work fully integrates wave initiation, propagation and acquisition in the analysis of Acoustic Emission waveforms recorded at various sensors, therefore providing more reliable information to assess the relation between Acoustic Emission and damage modes. The procedure allows selecting AE events originating from damage, accurate determination of their location as well as the characterization of effects of propagation on the recorded waveforms. This approach was developed using AE data recorded during tensile tests on carbon/carbon composites. It was then applied to melt-infiltrated SiC/SiC composites.

  5. A Satellite-Based Sunshine Duration Climate Data Record for Europe and Africa

    Directory of Open Access Journals (Sweden)

    Steffen Kothe

    2017-05-01

    Full Text Available Besides 2 m - temperature and precipitation, sunshine duration is one of the most important and commonly used parameter in climatology, with measured time series of partly more than 100 years in length. EUMETSAT’s Satellite Application Facility on Climate Monitoring (CM SAF presents a climate data record for daily and monthly sunshine duration (SDU for Europe and Africa. Basis for the advanced retrieval is a highly resolved satellite product of the direct solar radiation from measurements by Meteosat satellites 2 to 10. The data record covers the time period 1983 to 2015 with a spatial resolution of 0.05° × 0.05°. The comparison against ground-based data shows high agreement but also some regional differences. Sunshine duration is overestimated by the satellite-based data in many regions, compared to surface data. In West and Central Africa, low clouds seem to be the reason for a stronger overestimation of sunshine duration in this region (up to 20% for monthly sums. For most stations, the overestimation is low, with a bias below 7.5 h for monthly sums and below 0.4 h for daily sums. A high correlation of 0.91 for daily SDU and 0.96 for monthly SDU also proved the high agreement with station data. As SDU is based on a stable and homogeneous climate data record of more than 30 years length, it is highly suitable for climate applications, such as trend estimates.

  6. 40 CFR 141.571 - What records does subpart T require my system to keep?

    Science.gov (United States)

    2010-07-01

    ... records does subpart T require my system to keep? Your system must keep several types of records based on... Benchmarking(§§ 141.540-141.544) Benchmark (including raw data and analysis) Indefinitely. ...

  7. Holographic memory system based on projection recording of computer-generated 1D Fourier holograms.

    Science.gov (United States)

    Betin, A Yu; Bobrinev, V I; Donchenko, S S; Odinokov, S B; Evtikhiev, N N; Starikov, R S; Starikov, S N; Zlokazov, E Yu

    2014-10-01

    Utilization of computer-generated holographic structures significantly simplifies the optical scheme used to record microholograms in a holographic memory system. Digital holographic synthesis also makes it possible to account for the nonlinear errors of the recording system and thereby improve microhologram quality. Multiplexed recording of holograms is a widespread technique for increasing the data recording density. In this article we present a holographic memory system based on digital synthesis of amplitude one-dimensional (1D) Fourier transform holograms and the multiplexed recording of these holograms onto the holographic carrier using an optical projection scheme. 1D Fourier transform holograms are very sensitive to the orientation of the anamorphic optical element (cylindrical lens) that is required for reconstruction of the encoded data object. Multiplexed recording of several holograms with different orientations in an optical projection scheme allowed reconstruction of the data object from each hologram by rotating the cylindrical lens to the corresponding angle. We also discuss two optical schemes for reading out the recorded holograms: a full-page readout system and a line-by-line readout system. We consider the benefits of both systems and present the results of experimental modeling of non-multiplexed and multiplexed recording and reconstruction of 1D Fourier holograms.

  8. Gap analysis between provisional diagnosis and final diagnosis in government and private teaching hospitals: A record-linked comparative study

    Directory of Open Access Journals (Sweden)

    Sudeshna Chatterjee

    2016-01-01

    Full Text Available Aims: 1. To identify the extent of clinical gaps in the context of knowledge, practice and systems. 2. To formulate necessary intervention measures towards bridging the gaps. Settings and Design: Comparative, cross-sectional and non-interventional study. Methods and Material: This is a retrospective, record-based study conducted on inpatients (n = 200) from the major disciplines of two teaching hospitals. The major outcome variables were the matching and non-matching of final and provisional diagnoses using ICD-10 criteria. Statistical Analysis Used: Comparative analysis of specific and selective gaps was estimated in terms of percentages (%). Results: The pilot observation showed the existence of gaps between provisional and final diagnoses in both the private and the government institution. Both knowledge and skill gaps were evident in caregivers, and gaps in documentation were present in the medical records. Conclusions: The pilot data may be an eye-opener for public and private governance systems for understanding and revising the processes of service planning and service delivery. Necessary intervention measures may be contemplated towards enhancing the diagnostic skill of doctors for quality hospital care.

  9. Online Recorded Data-Based Composite Neural Control of Strict-Feedback Systems With Application to Hypersonic Flight Dynamics.

    Science.gov (United States)

    Xu, Bin; Yang, Daipeng; Shi, Zhongke; Pan, Yongping; Chen, Badong; Sun, Fuchun

    2017-09-25

    This paper investigates the online recorded data-based composite neural control of uncertain strict-feedback systems using the backstepping framework. In each step of the virtual control design, a neural network (NN) is employed for uncertainty approximation. In previous works, most designs aim directly at system stability, ignoring how the NN actually works as an approximator. In this paper, to enhance the learning ability, a novel prediction error signal is constructed to provide additional correction information for the NN weight update using online recorded data. In this way, the neural approximation precision is substantially improved, and the convergence speed is faster. Furthermore, a sliding mode differentiator is employed to approximate the derivative of the virtual control signal, so that the complicated analytic differentiation of the backstepping design can be avoided. The closed-loop stability is rigorously established, and the boundedness of the tracking error is guaranteed. Simulation of hypersonic flight dynamics shows that the proposed approach exhibits better tracking performance.
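
    A minimal sketch of the composite-learning idea described in this record follows, under assumptions that are not taken from the paper: a Gaussian radial-basis-function network approximating a scalar uncertainty, an Euler-discretized weight update, and arbitrary gains and recorded samples. It only illustrates how a prediction error computed from online recorded data can be added to the usual tracking-error-driven update; it is not the authors' control law.

        import numpy as np

        # Hypothetical setup: a Gaussian RBF network W'phi(x) approximates an unknown
        # scalar nonlinearity f(x); the update combines the tracking error with
        # prediction errors computed from previously recorded (state, measurement) pairs.
        def rbf_features(x, centers, width=0.5):
            """Gaussian radial-basis-function feature vector for a scalar state x."""
            return np.exp(-((x - centers) ** 2) / (2.0 * width ** 2))

        def composite_update(W, centers, x, tracking_err, recorded_x, recorded_f,
                             gamma=5.0, kp=1.0, dt=0.01):
            """One Euler step of W_dot = gamma * (phi(x)*e + kp * sum_r phi(x_r)*eps_r)."""
            phi = rbf_features(x, centers)
            correction = np.zeros_like(W)
            for xr, fr in zip(recorded_x, recorded_f):
                phi_r = rbf_features(xr, centers)
                correction += phi_r * (fr - phi_r @ W)   # prediction error on recorded data
            W_dot = gamma * (phi * tracking_err + kp * correction)
            return W + dt * W_dot

        centers = np.linspace(-2.0, 2.0, 9)              # RBF centers (assumed)
        W = np.zeros_like(centers)                       # initial NN weights
        recorded_x = np.array([-1.0, 0.0, 1.0])          # recorded states (assumed)
        recorded_f = np.sin(recorded_x)                  # recorded measurements of the uncertainty
        W = composite_update(W, centers, x=0.5, tracking_err=0.1,
                             recorded_x=recorded_x, recorded_f=recorded_f)
        print(W)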

  10. Utilization of independent component analysis for accurate pathological ripple detection in intracranial EEG recordings recorded extra- and intra-operatively.

    Science.gov (United States)

    Shimamoto, Shoichi; Waldman, Zachary J; Orosz, Iren; Song, Inkyung; Bragin, Anatol; Fried, Itzhak; Engel, Jerome; Staba, Richard; Sharan, Ashwini; Wu, Chengyuan; Sperling, Michael R; Weiss, Shennan A

    2018-01-01

    To develop and validate a detector that identifies ripple (80-200 Hz) events in intracranial EEG (iEEG) recordings in a referential montage and utilizes independent component analysis (ICA) to eliminate or reduce high-frequency artifact contamination, and to investigate the correspondence of detected ripples with the seizure onset zone (SOZ). iEEG recordings from 16 patients were first band-pass filtered (80-600 Hz) and Infomax ICA was then applied to derive the first independent component (IC1). IC1 was subsequently pruned, and an artifact index was derived to reduce the identification of high-frequency events introduced by the reference electrode signal. A Hilbert detector identified ripple events in the processed iEEG recordings using amplitude and duration criteria. The identified ripple events were further classified and characterized as true or false ripples on spikes, or ripples on oscillations, by applying a topographical analysis to their time-frequency plots, and confirmed by visual inspection. The signal-to-noise ratio was improved by pruning IC1. The precision of the detector for ripple events was 91.27 ± 4.3%, and the sensitivity of the detector was 79.4 ± 3.0% (N = 16 patients, 5842 ripple events). The sensitivity and precision of the detector were equivalent in iEEG recordings obtained during sleep or intra-operatively. Across all the patients, true ripple-on-spike rates, and also the rates of false ripples on spikes generated by filter ringing, classified the seizure onset zone (SOZ) with an area under the receiver operating curve (AUROC) of >76%. The magnitude and spectral content of true ripples on spikes generated in the SOZ were statistically distinct from those of the ripples generated in the NSOZ. Ripple rates and properties defined using this approach may accurately delineate the seizure onset zone. Strategies to improve the spatial resolution of intracranial EEG and reduce artifact can help improve the clinical utility of this approach.
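
    A simplified sketch of a Hilbert-envelope ripple detector in the spirit of the one described here (band-pass filtering plus amplitude and duration criteria) is given below; it omits the ICA pruning, artifact index and time-frequency classification steps, and the sampling rate, thresholds and test signal are assumptions rather than the authors' settings.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt, hilbert

        def detect_ripples(x, fs, band=(80.0, 200.0), n_sd=3.0, min_dur_ms=20.0):
            """Return (start, stop) times of segments whose ripple-band envelope stays above threshold."""
            sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
            envelope = np.abs(hilbert(sosfiltfilt(sos, x)))
            threshold = envelope.mean() + n_sd * envelope.std()
            above = envelope > threshold
            edges = np.diff(above.astype(int))
            starts = np.where(edges == 1)[0] + 1
            stops = np.where(edges == -1)[0] + 1
            if above[0]:
                starts = np.r_[0, starts]
            if above[-1]:
                stops = np.r_[stops, above.size]
            min_samples = int(min_dur_ms * 1e-3 * fs)
            return [(s / fs, e / fs) for s, e in zip(starts, stops) if e - s >= min_samples]

        fs = 2000.0                                    # sampling rate in Hz (assumed)
        t = np.arange(0, 10, 1 / fs)
        x = np.random.randn(t.size)                    # stand-in for a referential iEEG channel
        x[4000:4300] += 5 * np.sin(2 * np.pi * 120 * t[4000:4300])   # injected 120-Hz burst
        print(detect_ripples(x, fs))                   # expect one event near t = 2 s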

  11. Structure and performance of a real-time algorithm to detect tsunami or tsunami-like alert conditions based on sea-level records analysis

    Directory of Open Access Journals (Sweden)

    L. Bressan

    2011-05-01

    Full Text Available The goal of this paper is to present an original real-time algorithm devised for the detection of tsunami or tsunami-like waves, which we call TEDA (Tsunami Early Detection Algorithm), and to introduce a methodology to evaluate its performance. TEDA works on the sea level records of a single station and implements two distinct modules running concurrently: one to assess the presence of tsunami waves ("tsunami detection") and the other to identify high-amplitude long waves ("secure detection"). Both detection methods are based on continuously updated time functions depending on a number of parameters that can be varied according to the application. In order to select the most adequate parameter setting for a given station, a methodology to evaluate TEDA performance has been devised that is based on a number of indicators and is simple to use. In this paper an example of TEDA application is given using data from a tide gauge located at Adak Island in Alaska, USA, which proved quite suitable since it has recorded several tsunamis in recent years at a sampling rate of 1 min.
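
    The sketch below is loosely inspired by the single-station idea behind TEDA but is far simpler: it flags minutes whose sea-level change exceeds a multiple of the recent background variability of minute-to-minute changes. The window length, threshold and synthetic record are illustrative assumptions, not TEDA's actual time functions or parameters.

        import numpy as np

        def detect_tsunami_like(sea_level_cm, noise_win=120, k=6.0):
            """Return minute indices where the 1-min change exceeds k times its recent background std."""
            diff = np.diff(sea_level_cm)
            alerts = []
            for i in range(noise_win, diff.size):
                sigma = diff[i - noise_win:i].std()
                if sigma > 0 and abs(diff[i]) > k * sigma:
                    alerts.append(i + 1)
            return alerts

        minutes = np.arange(24 * 60)
        tide = 50.0 * np.sin(2 * np.pi * minutes / (12.42 * 60))   # synthetic semidiurnal tide (cm)
        record = tide + np.random.normal(0.0, 1.0, minutes.size)   # 1-min sea-level record
        record[900:930] += 40.0                                    # injected tsunami-like wave
        print(detect_tsunami_like(record))                         # expect alerts near minute 900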

  12. Source properties of the 1998 July 17 Papua New Guinea tsunami based on tide gauge records

    Science.gov (United States)

    Heidarzadeh, Mohammad; Satake, Kenji

    2015-07-01

    We analysed four newly retrieved tide gauge records of the 1998 July 17 Papua New Guinea (PNG) tsunami to study the statistical and spectral properties of this tsunami. The four tide gauge records were from the Lombrum (PNG), Rabaul (PNG), Malakal Island (Palau) and Yap Island (State of Yap) stations, located 600-1450 km from the source. The tsunami registered a maximum trough-to-crest wave height of 3-9 cm at these gauges. Spectral analysis showed two dominant peaks at period bands of 2-4 and 6-20 min, with a clear separation at a period of ~5 min. We interpreted these peak periods as belonging to the landslide and earthquake sources of the PNG tsunami, respectively. Analysis of the tsunami waveforms revealed a 12-17 min delay in landslide generation relative to the origin time of the main shock. Numerical simulations including this delay reasonably reproduced the observed tide gauge records. This is the first direct evidence of the delayed landslide source of the 1998 PNG tsunami, which was previously estimated only indirectly from acoustic T-phase records.
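
    A minimal illustration of the kind of spectral analysis mentioned here is sketched below: Welch's method is applied to a synthetic 1-min tide gauge record containing a 3-min and a 12-min component, roughly mimicking the landslide (2-4 min) and earthquake (6-20 min) bands reported for the PNG tsunami. The record, window length and noise level are assumptions, not the authors' data or processing.

        import numpy as np
        from scipy.signal import welch

        fs = 1.0 / 60.0                                       # one sample per minute, in Hz
        t = np.arange(0.0, 6 * 3600.0, 60.0)                  # six hours of record, in seconds
        record = (3.0 * np.sin(2 * np.pi * t / (3 * 60))      # 3-min component (landslide band)
                  + 5.0 * np.sin(2 * np.pi * t / (12 * 60))   # 12-min component (earthquake band)
                  + np.random.normal(0.0, 0.5, t.size))

        freqs, psd = welch(record, fs=fs, nperseg=128)
        freqs, psd = freqs[1:], psd[1:]                       # drop the zero-frequency bin
        dominant = freqs[np.argsort(psd)[::-1][:2]]           # two strongest spectral peaks
        print("dominant periods (min):", np.round(1.0 / dominant / 60.0, 1))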

  13. HHARP: The Historical Hospital Admission Records Project – a review

    Directory of Open Access Journals (Sweden)

    Cara Hirst

    2018-04-01

    Full Text Available Hospital records have frequently been used in epidemiological research (Kilgore et al. 2017; Rushton 2016), and in some cases in palaeopathological research. However, the availability of data is problematic, with written records requiring considerable time to interpret, digitise and analyse. In 2001, the Historical Hospital Admission Records Project (HHARP) began digitising over 140,000 hospital admission records from four hospitals in London and Glasgow, providing researchers with an online database of hospital records (Figure 1). I review the data available in the HHARP database and make a preliminary analysis of the hospital records from London and Glasgow between c.1852 and 1921, which illustrates the value of the HHARP database for understanding disease and medical care during this period.

  14. Fast optical detecting media based on semiconductor nanostructures for recording images obtained using charges of free photocarriers

    International Nuclear Information System (INIS)

    Kasherininov, P. G.; Tomasov, A. A.; Beregulin, E. V.

    2011-01-01

    Available published data on the properties of optical recording media based on semiconductor structures are reviewed. The principles of operation, structure, parameters, and range of application are described for optical recording media based on MIS structures formed of photorefractive crystals with a thick insulator layer and MIS structures with a liquid crystal as the insulator (the MIS LC modulators), as well as for media exploiting the effect of optical bistability in semiconductor structures (semiconductor MIS structures with a nanodimensionally thin insulator (TI) layer, M(TI)S nanostructures). Special attention is paid to recording media based on M(TI)S nanostructures, which are promising for fast processing of highly informative images and for the fabrication of optoelectronic image correlators for noncoherent light.

  15. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording

    Directory of Open Access Journals (Sweden)

    Marco Crescentini

    2016-05-01

    Full Text Available High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. A HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the range ±200 pA and ±20 nA, with 100 kHz bandwidth while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to prove both the platform and the current-to-digital converter.

  16. A Low-Noise Transimpedance Amplifier for BLM-Based Ion Channel Recording.

    Science.gov (United States)

    Crescentini, Marco; Bennati, Marco; Saha, Shimul Chandra; Ivica, Josip; de Planque, Maurits; Morgan, Hywel; Tartagni, Marco

    2016-05-19

    High-throughput screening (HTS) using ion channel recording is a powerful drug discovery technique in pharmacology. Ion channel recording with planar bilayer lipid membranes (BLM) is scalable and has very high sensitivity. A HTS system based on BLM ion channel recording faces three main challenges: (i) design of scalable microfluidic devices; (ii) design of compact ultra-low-noise transimpedance amplifiers able to detect currents in the pA range with bandwidth >10 kHz; (iii) design of compact, robust and scalable systems that integrate these two elements. This paper presents a low-noise transimpedance amplifier with integrated A/D conversion realized in CMOS 0.35 μm technology. The CMOS amplifier acquires currents in the range ±200 pA and ±20 nA, with 100 kHz bandwidth while dissipating 41 mW. An integrated digital offset compensation loop balances any voltage offsets from Ag/AgCl electrodes. The measured open-input input-referred noise current is as low as 4 fA/√Hz at ±200 pA range. The current amplifier is embedded in an integrated platform, together with a microfluidic device, for current recording from ion channels. Gramicidin-A, α-haemolysin and KcsA potassium channels have been used to prove both the platform and the current-to-digital converter.

  17. AN ANALYSIS OF OFF RECORD STRATEGIES REFLECTING POLITENESS IMPLICATURE IN “OPRAH WINFREY SHOW”

    Directory of Open Access Journals (Sweden)

    Rahma Yanti

    2017-08-01

    Full Text Available This thesis discusses off-record strategies that convey politeness implicatures in conversation. The off-record strategy is one of the five politeness strategies and concerns language that is not expressed in direct forms. The objects of research are the off-record strategies that convey politeness implicatures in a famous American talk show, the "Oprah Winfrey Show". The data were collected using observational methods with the non-participatory conversation technique, in which the author was not involved in the dialogue because the data were taken from the TV show, together with a recording technique using taped recordings. The author then used the CAAT technique, transcribing the talk show into orthographic transcription. The analysis uses a pragmatic equivalence method that looks at the role of external language factors, especially the interlocutor factor, in the selection of off-record strategies. The results show that the context of the situation and violations of conversational maxims influence the choice of off-record strategies. However, there are some cases in which this choice does not follow the rules, because of other factors that come into play in a conversation, such as intonation. Implicatures generally appear in the form of affirmations used politely. In one sentence, two or more off-record strategies selected by the speaker may be found.

  18. Feasibility of a web-based system for police crash report review and information recording.

    Science.gov (United States)

    2016-04-01

    Police crash reports include useful additional information that is not available in crash summary records. This information may include police sketches and narratives and is often needed for detailed site-specific safety analysis. In addition, so...

  19. Ex post power economic analysis of record of decision operational restrictions at Glen Canyon Dam.

    Energy Technology Data Exchange (ETDEWEB)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.; Loftin, S.; Osiek, B; Decision and Information Sciences; Western Area Power Administration

    2010-07-31

    On October 9, 1996, Bruce Babbitt, then-Secretary of the U.S. Department of the Interior signed the Record of Decision (ROD) on operating criteria for the Glen Canyon Dam (GCD). Criteria selected were based on the Modified Low Fluctuating Flow (MLFF) Alternative as described in the Operation of Glen Canyon Dam, Colorado River Storage Project, Arizona, Final Environmental Impact Statement (EIS) (Reclamation 1995). These restrictions reduced the operating flexibility of the hydroelectric power plant and therefore its economic value. The EIS provided impact information to support the ROD, including an analysis of operating criteria alternatives on power system economics. This ex post study reevaluates ROD power economic impacts and compares these results to the economic analysis performed prior (ex ante) to the ROD for the MLFF Alternative. On the basis of the methodology used in the ex ante analysis, anticipated annual economic impacts of the ROD were estimated to range from approximately $15.1 million to $44.2 million in terms of 1991 dollars ($1991). This ex post analysis incorporates historical events that took place between 1997 and 2005, including the evolution of power markets in the Western Electricity Coordinating Council as reflected in market prices for capacity and energy. Prompted by ROD operational restrictions, this analysis also incorporates a decision made by the Western Area Power Administration to modify commitments that it made to its customers. Simulated operations of GCD were based on the premise that hourly production patterns would maximize the economic value of the hydropower resource. On the basis of this assumption, it was estimated that economic impacts were on average $26.3 million in $1991, or $39 million in $2009.

  20. Development of Software for dose Records Data Base Access; Programacion para la consulta del Banco de Datos Dosimetricos

    Energy Technology Data Exchange (ETDEWEB)

    Amaro, M

    1990-07-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose is individual dose follow-up and data handling for epidemiological studies. Within the Data Base management scheme, software was developed to allow searching of individual dose records by external authorised users. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records of workers included in the Data Base. The report includes the User Guide for the authorised users and listings of the codes and subroutines developed. (Author) 2 refs.

  1. An Internet-Based Real-Time Audiovisual Link for Dual MEG Recordings.

    Directory of Open Access Journals (Sweden)

    Andrey Zhdanov

    Full Text Available Most neuroimaging studies of human social cognition have focused on brain activity of single subjects. More recently, "two-person neuroimaging" has been introduced, with simultaneous recordings of brain signals from two subjects involved in social interaction. These simultaneous "hyperscanning" recordings have already been carried out with a spectrum of neuroimaging modalities, such as functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and functional near-infrared spectroscopy (fNIRS). We have recently developed a setup for simultaneous magnetoencephalographic (MEG) recordings of two subjects that communicate in real time over an audio link between two geographically separated MEG laboratories. Here we present an extended version of the setup, where we have added a video connection and replaced the telephone-landline-based link with an Internet connection. Our setup enabled transmission of video and audio streams between the sites with a one-way communication latency of about 130 ms. Our software that allows reproducing the setup is publicly available. We demonstrate that the audiovisual Internet-based link can mediate real-time interaction between two subjects who try to mirror each other's hand movements that they can see via the video link. All nine pairs were able to synchronize their behavior. In addition to the video, we captured the subjects' movements with accelerometers attached to their index fingers; we determined from these signals that the average synchronization accuracy was 215 ms. In one subject pair we demonstrate inter-subject coherence patterns of the MEG signals that peak over the sensorimotor areas contralateral to the hand used in the task.
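
    One simple way to quantify the kind of behavioural synchronization accuracy reported here is to locate the peak of the cross-correlation between the two subjects' accelerometer traces; the sketch below does exactly that on synthetic signals. The sampling rate, signals and lag are assumptions, not data from the study.

        import numpy as np

        def estimate_lag(sig_a, sig_b, fs):
            """Lag (s) of sig_a relative to sig_b at the cross-correlation maximum."""
            a = sig_a - sig_a.mean()
            b = sig_b - sig_b.mean()
            xcorr = np.correlate(a, b, mode="full")
            lag_samples = np.argmax(xcorr) - (b.size - 1)
            return lag_samples / fs

        fs = 1000.0                                     # accelerometer sampling rate (assumed)
        t = np.arange(0, 5, 1 / fs)
        leader = np.sin(2 * np.pi * 1.0 * t)            # leader's finger movement
        follower = np.roll(leader, int(0.2 * fs))       # follower lags by 200 ms
        print(f"estimated lag: {estimate_lag(follower, leader, fs) * 1000:.0f} ms")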

  2. An ontology-based method for secondary use of electronic dental record data

    Science.gov (United States)

    Schleyer, Titus KL; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P.; Liu, Kaihong; Hernandez, Pedro

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance. PMID:24303273

  3. An ontology-based method for secondary use of electronic dental record data.

    Science.gov (United States)

    Schleyer, Titus Kl; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P; Liu, Kaihong; Hernandez, Pedro

    2013-01-01

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance.
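
    The sketch below illustrates the general pattern of querying an ontology-backed knowledge base with SPARQL, using rdflib and a tiny in-memory graph. The namespace, class and property names (Patient, Tooth, Restoration, isPartOf, restores, material) are invented for the example and are not the actual OHD identifiers or the authors' queries.

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF

        EX = Namespace("http://example.org/ohd-demo#")   # hypothetical namespace
        g = Graph()
        g.bind("ex", EX)

        # a tiny stand-in for the patient/tooth/restoration structure described above
        g.add((EX.patient1, RDF.type, EX.Patient))
        g.add((EX.tooth3, RDF.type, EX.Tooth))
        g.add((EX.tooth3, EX.isPartOf, EX.patient1))
        g.add((EX.resto1, RDF.type, EX.Restoration))
        g.add((EX.resto1, EX.restores, EX.tooth3))
        g.add((EX.resto1, EX.material, Literal("amalgam")))

        query = """
        PREFIX ex: <http://example.org/ohd-demo#>
        SELECT ?patient ?tooth ?material WHERE {
            ?tooth ex:isPartOf ?patient .
            ?resto ex:restores ?tooth ;
                   ex:material ?material .
        }
        """
        for row in g.query(query):
            print(row.patient, row.tooth, row.material)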

  4. Continuous recording and interobserver agreement algorithms reported in the Journal of Applied Behavior Analysis (1995-2005).

    Science.gov (United States)

    Mudford, Oliver C; Taylor, Sarah Ann; Martin, Neil T

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the Journal of Applied Behavior Analysis (JABA): Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement, block-by-block agreement, and time-window analysis) were employed in more than 10 of the articles that reported continuous recording. Having identified these currently popular agreement computation algorithms, we explain them to assist researchers, software writers, and other consumers of JABA articles.
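
    For readers unfamiliar with the agreement algorithms named in this record, the sketch below computes two of them (exact agreement and block-by-block agreement) from per-interval event counts of two observers; the interval counts are invented example data, and time-window analysis is omitted.

        import numpy as np

        def exact_agreement(obs1, obs2):
            """Percentage of intervals in which both observers recorded exactly the same count."""
            obs1, obs2 = np.asarray(obs1), np.asarray(obs2)
            return 100.0 * np.mean(obs1 == obs2)

        def block_by_block_agreement(obs1, obs2):
            """Mean smaller/larger count per interval; intervals where both counts are zero score 1."""
            obs1 = np.asarray(obs1, dtype=float)
            obs2 = np.asarray(obs2, dtype=float)
            smaller, larger = np.minimum(obs1, obs2), np.maximum(obs1, obs2)
            ratios = np.divide(smaller, larger, out=np.ones_like(larger), where=larger > 0)
            return 100.0 * ratios.mean()

        observer_a = [0, 2, 1, 3, 0, 1]      # counts per 10-s interval (example data)
        observer_b = [0, 2, 2, 3, 0, 0]
        print(exact_agreement(observer_a, observer_b))            # 66.7
        print(block_by_block_agreement(observer_a, observer_b))   # 75.0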

  5. Design of a system based on DSP and FPGA for video recording and replaying

    Science.gov (United States)

    Kang, Yan; Wang, Heng

    2013-08-01

    This paper brings forward a video recording and replaying system with the architecture of Digital Signal Processor (DSP) and Field Programmable Gate Array (FPGA). The system achieved encoding, recording, decoding and replaying of Video Graphics Array (VGA) signals which are displayed on a monitor during airplanes and ships' navigating. In the architecture, the DSP is a main processor which is used for a large amount of complicated calculation during digital signal processing. The FPGA is a coprocessor for preprocessing video signals and implementing logic control in the system. In the hardware design of the system, Peripheral Device Transfer (PDT) function of the External Memory Interface (EMIF) is utilized to implement seamless interface among the DSP, the synchronous dynamic RAM (SDRAM) and the First-In-First-Out (FIFO) in the system. This transfer mode can avoid the bottle-neck of the data transfer and simplify the circuit between the DSP and its peripheral chips. The DSP's EMIF and two level matching chips are used to implement Advanced Technology Attachment (ATA) protocol on physical layer of the interface of an Integrated Drive Electronics (IDE) Hard Disk (HD), which has a high speed in data access and does not rely on a computer. Main functions of the logic on the FPGA are described and the screenshots of the behavioral simulation are provided in this paper. In the design of program on the DSP, Enhanced Direct Memory Access (EDMA) channels are used to transfer data between the FIFO and the SDRAM to exert the CPU's high performance on computing without intervention by the CPU and save its time spending. JPEG2000 is implemented to obtain high fidelity in video recording and replaying. Ways and means of acquiring high performance for code are briefly present. The ability of data processing of the system is desirable. And smoothness of the replayed video is acceptable. By right of its design flexibility and reliable operation, the system based on DSP and FPGA

  6. "It's like texting at the dinner table": A qualitative analysis of the impact of electronic health records on patient-physician interaction in hospitals.

    Science.gov (United States)

    Pelland, Kimberly D; Baier, Rosa R; Gardner, Rebekah L

    2017-06-30

    BACKGROUND: Electronic health records (EHRs) may reduce medical errors and improve care, but can complicate clinical encounters. To describe hospital-based physicians' perceptions of the impact of EHRs on patient-physician interactions and contrast these findings against office-based physicians' perceptions. Methods: We performed a qualitative analysis of comments submitted in response to the 2014 Rhode Island Health Information Technology Survey. Office- and hospital-based physicians licensed in Rhode Island, in active practice, and located in Rhode Island or neighboring states completed the survey about their Electronic Health Record use. The survey's response rate was 68.3% and 2,236 (87.1%) respondents had EHRs. Among survey respondents, 27.3% of hospital-based and 37.8% of office-based physicians with EHRs responded to the question about patient interaction. Five main themes emerged for hospital-based physicians, with respondents generally perceiving EHRs as negatively altering patient interactions. We noted the same five themes among office-based physicians, but the rank order of the top two responses differed by setting: hospital-based physicians commented most frequently that they spend less time with patients because they have to spend more time on computers; office-based physicians commented most frequently on EHRs worsening the quality of their interactions and relationships with patients. In our analysis of a large sample of physicians, hospital-based physicians generally perceived EHRs as negatively altering patient interactions, although they emphasized different reasons than their office-based counterparts. These findings add to the prior literature, which focuses on outpatient physicians, and can shape interventions to improve how EHRs are used in inpatient settings.

  7. LabTrove: a lightweight, web based, laboratory "blog" as a route towards a marked up record of work in a bioscience research laboratory.

    Science.gov (United States)

    Milsted, Andrew J; Hale, Jennifer R; Frey, Jeremy G; Neylon, Cameron

    2013-01-01

    The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for reproducibility, reuse, repurposing, and redeployment.

  8. LabTrove: a lightweight, web based, laboratory "blog" as a route towards a marked up record of work in a bioscience research laboratory.

    Directory of Open Access Journals (Sweden)

    Andrew J Milsted

    Full Text Available The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for reproducibility, reuse, repurposing, and redeployment.

  9. Recording human cortical population spikes non-invasively--An EEG tutorial.

    Science.gov (United States)

    Waterstraat, Gunnar; Fedele, Tommaso; Burghoff, Martin; Scheer, Hans-Jürgen; Curio, Gabriel

    2015-07-30

    Non-invasively recorded somatosensory high-frequency oscillations (sHFOs) evoked by electric nerve stimulation are markers of human cortical population spikes. Previously, their analysis was based on massive averaging of EEG responses. Advanced neurotechnology and optimized off-line analysis can enhance the signal-to-noise ratio of sHFOs, eventually enabling single-trial analysis. The rationale for developing dedicated low-noise EEG technology for sHFOs is laid out. Detailed recording procedures and tailored analysis principles are explained step by step. Source code in Matlab and Python is provided as supplementary material online. Combining synergistic hardware and analysis improvements, evoked sHFOs at around 600 Hz ('σ-bursts') can be studied in single trials. Additionally, optimized spatial filters increase the signal-to-noise ratio of components at about 1 kHz ('κ-bursts'), enabling their detection in non-invasive surface EEG. sHFOs offer a unique possibility to record evoked human cortical population spikes non-invasively. The experimental approaches and algorithms presented here enable even non-specialized EEG laboratories to combine measurements of conventional low-frequency EEG with the analysis of concomitant cortical population spike responses. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Maximum Acceleration Recording Circuit

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1995-01-01

    Coarsely digitized maximum acceleration levels are recorded in blown fuses. The circuit feeds power to an accelerometer and makes a nonvolatile record of the maximum level to which the accelerometer output rises during the measurement interval. In comparison with inertia-type single-preset-trip-point mechanical maximum-acceleration-recording devices, the circuit weighs less, occupies less space, and records accelerations within narrower bands of uncertainty. In comparison with prior electronic data-acquisition systems designed for the same purpose, the circuit is simpler, less bulky, consumes less power, and costs less, and it avoids the playback and analysis of data recorded in magnetic or electronic memory devices. The circuit is used, for example, to record the accelerations to which commodities are subjected during transportation on trucks.

  11. Data book of the component failure rate stored in the RECORD

    International Nuclear Information System (INIS)

    Oikawa, Testukuni; Sasaki, Shinobu; Hikawa, Michihiro; Higuchi, Suminori.

    1989-04-01

    The Japan Atomic Energy Research Institute (JAERI) has developed a computerized component reliability data base and its retrieval system, RECORD, based on failure rates collected from the published literature, in order to promote the convenience and efficiency of systems reliability analysis in PSA (Probabilistic Safety Assessment). In order to represent the collected failure rates in a uniform format, codes are defined for component category, failure mode, data source, unit of failure rate and statistical parameter. Up to now, approximately 11,500 component failure rate entries from about 35 open publications have been stored in RECORD. This report provides the failure rates stored in the RECORD data base for use by systems analysts, as well as brief descriptions of the data base structure and how to use this data book. (author)

  12. Variability of road traffic noise recorded by stationary monitoring stations

    Science.gov (United States)

    Bąkowski, Andrzej; Radziszewski, Leszek

    2017-11-01

    The paper presents the results of an analysis of the equivalent sound level recorded by two road traffic noise monitoring stations. The stations were located in Kielce (an example of a medium-sized town in Poland) on the roads out of the town in the directions of Kraków and Warszawa. The measurements were carried out by stationary stations monitoring the noise and the traffic of motor vehicles. RMS values based on the A-weighted sound level were recorded every 1 s in a buffer and the results were registered every 1 min over the period of the investigations. The registered data were the basis for calculating the equivalent sound level for three time intervals: from 6:00 to 18:00, from 18:00 to 22:00 and from 22:00 to 6:00. The analysis covered the values of the equivalent sound level recorded on different days of the week, split into 24-h periods, nights, days and evenings. The data analysed included recordings from 2013. The coefficient of variation and the positional coefficient of variation were proposed for the comparative analysis of the scatter in the obtained data. The investigations indicated that the recorded data varied depending on the traffic route. The differences concerned the values of the coefficients of variation of the equivalent sound levels.
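
    As a concrete illustration of the quantities discussed here, the sketch below computes the equivalent sound level L_eq = 10*log10(mean(10^(L/10))) for the day (6:00-18:00), evening (18:00-22:00) and night (22:00-6:00) intervals from 1-min A-weighted levels, together with a coefficient of variation; the synthetic 1-min data are an assumption, not the Kielce measurements.

        import numpy as np

        def leq(levels_db):
            """Energetic mean of A-weighted levels (dB)."""
            levels_db = np.asarray(levels_db, dtype=float)
            return 10.0 * np.log10(np.mean(10.0 ** (levels_db / 10.0)))

        minutes = np.arange(24 * 60)
        hours = minutes / 60.0
        la_1min = np.random.normal(65.0, 4.0, minutes.size)     # synthetic 1-min levels, dB(A)

        intervals = {
            "day (6-18)": (hours >= 6) & (hours < 18),
            "evening (18-22)": (hours >= 18) & (hours < 22),
            "night (22-6)": (hours >= 22) | (hours < 6),
        }
        for name, mask in intervals.items():
            chunk = la_1min[mask]
            cv = 100.0 * chunk.std() / chunk.mean()             # coefficient of variation (%)
            print(f"{name}: Leq = {leq(chunk):.1f} dB(A), CV = {cv:.1f}%")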

  13. Continuous Record Laplace-based Inference about the Break Date in Structural Change Models

    OpenAIRE

    Casini, Alessandro; Perron, Pierre

    2018-01-01

    Building upon the continuous record asymptotic framework recently introduced by Casini and Perron (2017a) for inference in structural change models, we propose a Laplace-based (Quasi-Bayes) procedure for the construction of the estimate and confidence set for the date of a structural change. The procedure relies on a Laplace-type estimator defined by an integration-based rather than an optimization-based method. A transformation of the least-squares criterion function is evaluated in order to ...

  14. Electronic medical records for otolaryngology office-based practice.

    Science.gov (United States)

    Chernobilsky, Boris; Boruk, Marina

    2008-02-01

    Pressure is mounting on physicians to adopt electronic medical records. The field of health information technology is evolving rapidly, with innovations and policies often outpacing science. We sought to review research and discussions about electronic medical records from the past year to keep abreast of these changes. Original scientific research, especially from otolaryngologists, is lacking in this field. Adoption rates are slowly increasing, but more of the burden is shouldered by physicians despite policy efforts and the clear benefits to third-party payers. Scientific research from the past year suggests a lack of improvement, and even decreasing quality of healthcare, with electronic medical record adoption in the ambulatory care setting. The increasing prevalence and standardization of electronic medical record systems results in a new set of problems, including rising costs, audits, difficulties in transition and public concerns about security of information. As major players in healthcare continue to push for adoption, increased effort must be made to demonstrate actual improvements in patient care in the ambulatory care setting. More scientific studies are needed to demonstrate what features of electronic medical records actually improve patient care. Otolaryngologists should help each other by disseminating research about improvements in patient outcomes with their systems, since current adoption and outcomes policies do not apply to specialists.

  15. Head in the clouds: Re-imagining the experimental laboratory record for the web-based networked world.

    Science.gov (United States)

    Neylon, Cameron

    2009-10-29

    The means we use to record the process of carrying out research remain tied to the concept of a paginated paper notebook, despite the advances over the past decade in web-based communication and publication tools. The development of these tools offers an opportunity to re-imagine what the laboratory record would look like if it were re-built in a web-native form. In this paper I describe a distributed approach to the laboratory record which uses the most appropriate tool available to house and publish each specific object created during the research process, whether it be a physical sample, a digital data object, or the record of how one was created from another. I propose that the web-native laboratory record would act as a feed of relationships between these items. This approach can be seen as complementary to, rather than competitive with, integrative approaches that aim to aggregate relevant objects together to describe knowledge. The potential for the recent announcement of the Google Wave protocol to have a significant impact on realizing this vision is discussed, along with the issues of security and provenance that are raised by such an approach.

  16. Multi-scale dynamical analysis (MSDA) of sea level records versus PDO, AMO, and NAO indexes

    OpenAIRE

    Scafetta, Nicola

    2013-01-01

    Herein I propose a multi-scale dynamical analysis to facilitate the physical interpretation of tide gauge records. The technique uses graphical diagrams. It is applied to six secular-long tide gauge records representative of the world oceans: Sydney, Pacific coast of Australia; Fremantle, Indian Ocean coast of Australia; New York City, Atlantic coast of USA; Honolulu, U.S. state of Hawaii; San Diego, U.S. state of California; and Venice, Mediterranean Sea, Italy. For comparison, an equivalent...

  17. Single-chip microcomputer based protection, diagnostic and recording system for longwall shearers

    Energy Technology Data Exchange (ETDEWEB)

    Heyduk, A.; Krasucki, F. (Politechnika Slaska, Gliwice (Poland). Katedra Elektryfikacji i Automatyzacji Gornictwa)

    1993-05-01

    Presents a concept of microcomputer-aided operation, protection, diagnostics and recording for shearer loaders. A two-stage mathematical model is suggested and explained. The model represents the thermal processes that determine the overcurrent protection of the drive motors. Circuits for monitoring fuses, supply voltages, contacts, relays, contactors and electro-hydraulic distributors with the use of transoptors are shown. Recording of the characteristic operating parameters of a shearer loader during the 5 minutes before a failure is proposed. Protection, diagnosis and control functions are suggested as additional functions of the microcomputer-aided system of shearer loader control being developed at the Silesian Technical University. The system is based on the NEC µPD78310 microprocessor. 10 refs.

  18. Non-linear transient behavior during soil liquefaction based on re-evaluation of seismic records

    OpenAIRE

    Kamagata, S.; Takewaki, Izuru

    2015-01-01

    Focusing on soil liquefaction, the seismic records from the Niigata-ken earthquake in 1964, the southern Hyogo prefecture earthquake in 1995 and the 2011 off the Pacific coast of Tohoku earthquake are analyzed using non-stationary Fourier spectra. The shift of the dominant frequency in the seismic record of Kawagishi-cho during the Niigata-ken earthquake is evaluated based on the time-variant property of the dominant frequencies. The reduction ratio of the soil stiffness is evaluated from the shift of the dominant frequency.

  19. Child and adolescent abuse recorded at a national referral hospital, 2006-2011

    OpenAIRE

    Escalante-Romero, Lorena; Facultad de Medicina, Universidad de San Martin de Porres. Lima, Perú. Interno de Medicina.; Huamaní, Charles; Oficina General de Informática y Sistemas, Instituto Nacional de Salud. Lima, Perú. Médico cirujano.; Serpa, Hilda; Departamento de Investigación, Docencia y Atención en Salud Mental, Instituto Nacional de Salud del Niño. Lima, Perú. médico psiquiatra.; Urbano-Durand, Carlos; Departamento de Atención y Servicios al Paciente, Instituto Nacional de Salud del Niño. Lima, Perú. médico pediatra.; Farfán-Meza, Gaudy; Facultad de Medicina, Universidad de San Martin de Porres. Lima, Perú. Interno de Medicina.; Ferrer-Salas, Carolina; Facultad de Medicina, Universidad de San Martin de Porres. Lima, Perú. Interno de Medicina.; Granados-Chávez, Gilda; Facultad de Medicina, Universidad de San Martin de Porres. Lima, Perú. Interno de Medicina.

    2014-01-01

    Objectives. To describe the records of child and adolescent abuse at the Instituto Nacional de Salud del Niño (INSN) from January 2006 to September 2011, characterizing the victim and the perpetrator. Materials and methods. A secondary-source analysis was performed, based on the domestic violence and child abuse records, from forms administered by the Child Abuse and Adolescent Health Unit (MAMIS) at the INSN. The records include data on the victim, the offender and the characteristics of the aggression...

  20. Interpreting land records

    CERN Document Server

    Wilson, Donald A

    2014-01-01

    Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh

  1. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Science.gov (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
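
    The agreement statistic used in Experiment I can be computed directly: for two raters and two categories, the prevalence-adjusted bias-adjusted kappa reduces to twice the observed agreement minus one. The sketch below shows this on invented epoch-by-epoch labels, not the study's data.

        import numpy as np

        def pabak(rater1, rater2):
            """Prevalence-adjusted bias-adjusted kappa for two raters, two categories."""
            rater1, rater2 = np.asarray(rater1), np.asarray(rater2)
            observed_agreement = np.mean(rater1 == rater2)
            return 2.0 * observed_agreement - 1.0

        auto = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]     # automated analysis per 10-s epoch (example)
        manual = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]   # manual analysis per 10-s epoch (example)
        print(f"PABAK = {pabak(auto, manual):.2f}")   # 0.80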

  2. A stationary wavelet transform and a time-frequency based spike detection algorithm for extracellular recorded data.

    Science.gov (United States)

    Lieb, Florian; Stark, Hans-Georg; Thielemann, Christiane

    2017-06-01

    Spike detection from extracellular recordings is a crucial preprocessing step when analyzing neuronal activity. The decision whether a specific part of the signal is a spike or not is important for any subsequent preprocessing step, such as spike sorting or burst detection, in order to reduce the classification of erroneously identified spikes. Many spike detection algorithms have already been suggested, all working reasonably well whenever the signal-to-noise ratio is large enough. When the noise level is high, however, these algorithms have poor performance. In this paper we present two new spike detection algorithms. The first is based on a stationary wavelet energy operator and the second is based on the time-frequency representation of spikes. Both algorithms are more reliable than the most commonly used methods. The performance of the algorithms is confirmed by using simulated data resembling original data recorded from cortical neurons with multielectrode arrays. In order to demonstrate that the performance of the algorithms is not restricted to only one specific set of data, we also verify the performance using a simulated publicly available data set. We show that both proposed algorithms have the best performance among all tested methods, regardless of the signal-to-noise ratio, in both data sets. This contribution will benefit electrophysiological investigations of human cells. In particular, the spatial and temporal analysis of neural network communication is improved by using the proposed spike detection algorithms.
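
    A much simpler relative of the detectors described here is sketched below: the Teager-Kaiser nonlinear energy operator followed by amplitude thresholding and a refractory period. It only illustrates the energy-operator idea; the stationary wavelet and time-frequency detectors in the paper are more robust at low signal-to-noise ratios, and the threshold, refractory period and test signal below are assumptions.

        import numpy as np

        def teager_energy(x):
            """psi[n] = x[n]^2 - x[n-1]*x[n+1], zero at the edges."""
            psi = np.zeros_like(x)
            psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
            return psi

        def detect_spikes(x, fs, n_sd=5.0, refractory_ms=1.0):
            """Return spike times (s) where the energy exceeds mean + n_sd * std."""
            energy = teager_energy(x)
            threshold = energy.mean() + n_sd * energy.std()
            min_gap = int(refractory_ms * 1e-3 * fs)
            spikes, last = [], -min_gap
            for idx in np.where(energy > threshold)[0]:
                if idx - last >= min_gap:
                    spikes.append(idx / fs)
                    last = idx
            return spikes

        fs = 25000.0                                   # sampling rate in Hz (assumed)
        x = np.random.normal(0.0, 1.0, int(fs))        # one second of noise as a stand-in signal
        x[5000:5010] += 8.0                            # injected spike-like transient at 0.2 s
        print(detect_spikes(x, fs))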

  3. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    Science.gov (United States)

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional near-infrared spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fitting for the Automatic IDentification of functional Events (AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-design, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments, respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions respectively (condition 1: social, interact with a person; condition 2: non-social, interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2, respectively. The identified functional events were then matched to behavioural data from the video recordings of the movements and actions of the participant. Our results suggest that "brain-first" rather than "behaviour-first" analysis is feasible for real-world fNIRS neuroimaging.
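
    The sketch below gives an assumption-laden illustration of a GLM-based event search of the kind described here: a candidate onset is slid across the recording, an HRF-convolved boxcar regressor is built for each onset and fitted by least squares, and the onset with the smallest residual is kept. The haemodynamic response shape, event duration, sampling rate and data are all invented; this is not the AIDE algorithm itself.

        import numpy as np

        def canonical_hrf(fs, duration=25.0):
            """A simple gamma-like haemodynamic response (assumed shape)."""
            t = np.arange(0.0, duration, 1.0 / fs)
            hrf = (t ** 5) * np.exp(-t)
            return hrf / hrf.max()

        def best_onset(signal, fs, event_dur_s=10.0, step_s=1.0):
            """Return the candidate onset (s) whose GLM fit leaves the smallest residual."""
            hrf = canonical_hrf(fs)
            n = signal.size
            best = (None, np.inf)
            for onset in np.arange(0.0, n / fs - event_dur_s, step_s):
                boxcar = np.zeros(n)
                i0 = int(onset * fs)
                boxcar[i0:i0 + int(event_dur_s * fs)] = 1.0
                regressor = np.convolve(boxcar, hrf)[:n]
                X = np.column_stack([regressor, np.ones(n)])      # design matrix: regressor + intercept
                beta, residuals, *_ = np.linalg.lstsq(X, signal, rcond=None)
                rss = residuals[0] if residuals.size else np.sum((signal - X @ beta) ** 2)
                if rss < best[1]:
                    best = (onset, rss)
            return best[0]

        fs = 10.0                                       # fNIRS sampling rate in Hz (assumed)
        t = np.arange(0.0, 120.0, 1 / fs)
        true_box = ((t >= 40) & (t < 50)).astype(float)
        hbo = np.convolve(true_box, canonical_hrf(fs))[:t.size] + np.random.normal(0, 0.2, t.size)
        print("estimated onset (s):", best_onset(hbo, fs))        # expect a value near 40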

  4. How many records should be used in ASCE/SEI-7 ground motion scaling procedure?

    Science.gov (United States)

    Reyes, Juan C.; Kalkan, Erol

    2012-01-01

    U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of “true” structural responses. Based on elastic–perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. Compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.

  5. A Satellite-Based Surface Radiation Climatology Derived by Combining Climate Data Records and Near-Real-Time Data

    Directory of Open Access Journals (Sweden)

    Bodo Ahrens

    2013-09-01

    Full Text Available This study presents a method for adjusting long-term climate data records (CDRs) for integrated use with near-real-time data, using the example of surface incoming solar irradiance (SIS). Recently, a 23-year (1983–2005) continuous SIS CDR has been generated based on the visible channel (0.45–1 μm) of the MVIRI radiometers onboard the geostationary Meteosat First Generation platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF). Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions, consisting of MVIRI-derived SIS (1983–2005) and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards), were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present) was applied that combines the Standard Normal Homogeneity Test (SNHT) and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the ground station mean, in accordance with statistical significance. Several stations of the Baseline Surface Radiation Network (BSRN) and about 50 stations of the Global Energy Balance Archive (GEBA) over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994, probably due to artefacts in the raw data and instrument failures. After 2005 the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations. Homogenisation is performed by applying a mean-shift correction depending on the shift size of
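
    For illustration, the Standard Normal Homogeneity Test statistic mentioned here can be computed on a satellite-minus-station difference series as sketched below; the break is located where T(k) = k*mean(z_1)^2 + (n-k)*mean(z_2)^2 is largest. The series, shift size and break position are synthetic assumptions, and the sketch omits the penalized maximal T-test and visual inspection steps.

        import numpy as np

        def snht_statistic(diff_series):
            """SNHT statistic T(k) for a single shift in a standardized difference series."""
            z = (diff_series - diff_series.mean()) / diff_series.std(ddof=1)
            n = z.size
            t = np.empty(n - 1)
            for k in range(1, n):
                t[k - 1] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
            return t

        rng = np.random.default_rng(0)
        series = rng.normal(0.0, 1.0, 240)      # 20 years of monthly satellite-minus-station differences
        series[120:] += 1.5                     # artificial mean shift after 10 years
        t_stat = snht_statistic(series)
        print("most likely break at month index:", np.argmax(t_stat) + 1)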

  6. High-performance, polymer-based direct cellular interfaces for electrical stimulation and recording

    Science.gov (United States)

    Kim, Seong-Min; Kim, Nara; Kim, Youngseok; Baik, Min-Seo; Yoo, Minsu; Kim, Dongyoon; Lee, Won-June; Kang, Dong-Hee; Kim, Sohee; Lee, Kwanghee; Yoon, Myung-Han

    2018-04-01

    Due to the trade-off between their electrical/electrochemical performance and underwater stability, realizing polymer-based, high-performance direct cellular interfaces for electrical stimulation and recording has been very challenging. Herein, we developed transparent and conductive direct cellular interfaces based on a water-stable, high-performance poly(3,4-ethylenedioxythiophene):polystyrene sulfonate (PEDOT:PSS) film via solvent-assisted crystallization. The crystallized PEDOT:PSS on a polyethylene terephthalate (PET) substrate exhibited excellent electrical/electrochemical/optical characteristics, long-term underwater stability without film dissolution/delamination, and good viability for primarily cultured cardiomyocytes and neurons over several weeks. Furthermore, the highly crystallized, nanofibrillar PEDOT:PSS networks enabled dramatically enlarged surface areas and electrochemical activities, which were successfully employed to modulate cardiomyocyte beating via direct electrical stimulation. Finally, the high-performance PEDOT:PSS layer was seamlessly incorporated into transparent microelectrode arrays for efficient, real-time recording of cardiomyocyte action potentials with a high signal fidelity. All these results demonstrate the strong potential of crystallized PEDOT:PSS as a crucial component for a variety of versatile bioelectronic interfaces.

  7. LabTrove: A Lightweight, Web Based, Laboratory “Blog” as a Route towards a Marked Up Record of Work in a Bioscience Research Laboratory

    Science.gov (United States)

    Milsted, Andrew J.; Hale, Jennifer R.; Frey, Jeremy G.; Neylon, Cameron

    2013-01-01

    Background The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. Methodology/Principal Findings We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. Conclusions/Significance LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for reproducibility, reuse
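
    As a rough illustration of the one-item one-post model with extensible metadata described above (LabTrove itself is implemented in PHP over MySQL, which is not shown here), the following sketch persists each research object as its own post with key-value metadata; the table layout, field names, and sample data are invented for the example.

```python
import sqlite3

# Illustrative stand-in for a one-item one-post store: every research object
# (datum, sample, protocol) is its own post, and arbitrary key-value metadata
# can be attached to each post without changing the schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE posts (
        id      INTEGER PRIMARY KEY,
        title   TEXT NOT NULL,
        body    TEXT NOT NULL,
        created TEXT DEFAULT CURRENT_TIMESTAMP
    );
    CREATE TABLE metadata (
        post_id INTEGER REFERENCES posts(id),
        key     TEXT NOT NULL,
        value   TEXT NOT NULL
    );
""")

post_id = conn.execute(
    "INSERT INTO posts (title, body) VALUES (?, ?)",
    ("PCR run 12", "Thermocycler protocol and gel image for sample S-034."),
).lastrowid
conn.executemany(
    "INSERT INTO metadata (post_id, key, value) VALUES (?, ?, ?)",
    [(post_id, "type", "procedure"), (post_id, "sample", "S-034")],
)

# Lookup by metadata, as used for monitoring the use of materials and samples.
rows = conn.execute(
    "SELECT p.id, p.title FROM posts p JOIN metadata m ON m.post_id = p.id "
    "WHERE m.key = 'sample' AND m.value = 'S-034'"
).fetchall()
print(rows)
```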

  8. Investigations of anomalous gravity signals prior to 71 large earthquakes based on a 4-years long superconducting gravimeter records

    Directory of Open Access Journals (Sweden)

    Dijin Wang

    2017-09-01

    Full Text Available Using continuous 1-Hz sampling time-series recorded by an SG (superconducting gravimeter) at Hsinchu station, Taiwan of China, we investigate the anomalous gravity signals prior to 71 large earthquakes with moment magnitude larger than 7.0 (Mw7.0) that occurred between 1 Jan 2008 and 31 Dec 2011. We first evaluate the noise level of the SG records at Hsinchu (HS) station in the microseismic band from 0.05 Hz to 0.1 Hz by computing the PSD (power spectral density) of seismically quiet days selected based on the RMS of the records. Based on the analysis of the noise level and the spectral features of the seismically quiet SG records at HS station, we detect AGSs (anomalous gravity signals) prior to large earthquakes. We apply HHT (Hilbert-Huang transformation) to establish the TFEP (time-frequency-energy paradigms) and MS (marginal spectra) of the SG data before the large earthquakes, and the characteristics of the TFEP and MS of the SG data during the typhoon event are also analyzed. By comparison with the spectral characteristics of the SG data during seismically quiet periods, three types of AGSs are found, and the occurrence rate of AGSs before the 71 earthquakes is given for cases with different epicenter distances and different focal depths. The statistical results show that 56.3% of all the examined large earthquakes were preceded by AGSs; and if we constrain the epicenter distance to be smaller than 3500 km and the focal depth to less than 300 km, 75.3% of the examined large earthquakes can be associated with AGSs. Especially, we note that for all the large earthquakes that occurred in the Eurasian plate in the recent four years, precursory AGSs can always be found in the SG data recorded at HS station. Our investigations suggest that the AGSs prior to large earthquakes may be related to focal depth, epicenter distance and location.
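
    A minimal sketch of the quiet-day noise-level evaluation described above, using a Welch power spectral density estimate restricted to the 0.05–0.1 Hz microseismic band; the 1-Hz record, its units, and the function names are assumptions, not the authors' code.

```python
import numpy as np
from scipy.signal import welch

def microseism_noise_level(gravity, fs=1.0, band=(0.05, 0.1)):
    """Median PSD level (dB) of a 1-Hz gravimeter record in the microseismic band."""
    f, pxx = welch(gravity, fs=fs, nperseg=4096)
    sel = (f >= band[0]) & (f <= band[1])
    return 10.0 * np.log10(np.median(pxx[sel]))

# Invented quiet-day record: one day of 1-Hz samples, noise only (units illustrative).
rng = np.random.default_rng(1)
quiet_day = rng.normal(0.0, 1e-9, 86400)
print(f"quiet-day noise level: {microseism_noise_level(quiet_day):.1f} dB")
```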

  9. Communication Base Station Log Analysis Based on Hierarchical Clustering

    Directory of Open Access Journals (Sweden)

    Zhang Shao-Hua

    2017-01-01

    Full Text Available Communication base stations generate massive amounts of data every day, and these base station logs hold important value for mining business circles. This paper uses data mining techniques and a hierarchical clustering algorithm to group base stations by business circle based on their recorded data. By extracting features from the data of different business circles and comparing the characteristics of the different business circle categories, operators can choose suitable areas for commercial marketing.
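
    A minimal sketch of the clustering step, assuming per-station features have already been extracted from the logs; the feature set, Ward linkage, and the three-cluster cut are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Invented per-station features extracted from the logs, e.g.
# [daily traffic volume, peak-hour share, weekend/weekday ratio].
features = np.array([
    [5200, 0.31, 0.45],
    [4800, 0.29, 0.50],
    [1200, 0.18, 1.30],
    [1350, 0.20, 1.25],
    [9800, 0.40, 0.20],
])

# Standardize, agglomerate with Ward linkage, then cut into business-circle groups.
z = (features - features.mean(axis=0)) / features.std(axis=0)
tree = linkage(z, method="ward")
circles = fcluster(tree, t=3, criterion="maxclust")
print(circles)   # cluster label per base station
```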

  10. Model Adequacy Analysis of Matching Record Versions in Nosql Databases

    Directory of Open Access Journals (Sweden)

    E. V. Tsviashchenko

    2015-01-01

    Full Text Available The article investigates a model of matching record versions. The goal of this work is to analyse the model adequacy. This model allows estimating a user’s processing time distribution for the record versions and a distribution of the record versions count. The second variant of the model was used, according to which the time a client needs to process record versions depends explicitly on the number of updates performed by other users between the sequential updates performed by the current client. In order to verify the model adequacy, a real experiment was conducted in a cloud cluster. The cluster contains 10 virtual nodes provided by DigitalOcean. Ubuntu Server 14.04 was used as the operating system (OS). The NoSQL system Riak was chosen for the experiments. Riak versions 2.0 and later provide a "dotted version vectors" (DVV) option, which is an extension of the classic vector clock. Their use guarantees that the number of versions simultaneously stored in the database will not exceed the number of clients operating on a record in parallel. This is very important while conducting experiments. The application was developed using the Java library provided by Riak. The processes run directly on the nodes. Two records were used in the experiment: Z, the record whose versions are handled by the clients, and RZ, a service record that contains the record update counters. The application algorithm can be briefly described as follows: every client reads the versions of record Z, processes its updates using the RZ record counters, and saves the treated record in the database while old versions are deleted from the DB. Then, the client rereads the RZ record and increments the update counters for the other clients. After that, the client rereads the Z record, saves the necessary statistics, and evaluates the results of processing. In the case of a conflict emerging because of simultaneous updates of the RZ record, the client obtains all versions of that
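
    The client loop described above can be sketched as follows. This is an illustrative in-memory stand-in, not the authors' Java client against the 10-node Riak cluster, and all names and data are hypothetical.

```python
class MemoryDB:
    """Tiny in-memory stand-in for the key-value store; the real experiment
    used a Riak cluster with dotted version vectors (DVV)."""
    def __init__(self, n_clients):
        self.store = {"Z": [{"state": "initial"}], "RZ": [0] * n_clients}

    def get_all(self, key):          # all sibling versions of a record
        return list(self.store[key])

    def get(self, key):
        return self.store[key]

    def put(self, key, value):       # writing Z discards the old siblings
        self.store[key] = value if key == "RZ" else [value]


def client_step(db, client_id, n_clients):
    """One pass of the client loop sketched in the abstract above."""
    versions = db.get_all("Z")                    # read all versions of Z
    counters = db.get("RZ")                       # read the update counters
    merged = {"merged_from": len(versions),
              "updates_seen": counters[client_id]}
    db.put("Z", merged)                           # save the treated record
    counters = db.get("RZ")                       # reread RZ ...
    for other in range(n_clients):                # ... and increment the other
        if other != client_id:                    #     clients' update counters
            counters[other] += 1
    db.put("RZ", counters)
    return len(versions)                          # versions count statistic


db = MemoryDB(n_clients=2)
print(client_step(db, client_id=0, n_clients=2))
```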

  11. Submarine paleoseismology based on turbidite records.

    Science.gov (United States)

    Goldfinger, Chris

    2011-01-01

    Many of the largest earthquakes are generated at subduction zones or other plate boundary fault systems near enough to the coast that marine environments may record evidence of them. During and shortly after large earthquakes in the coastal and marine environments, a spectrum of evidence may be left behind, mirroring onshore paleoseismic evidence. Shaking or displacement of the seafloor can trigger processes such as turbidity currents, submarine landslides, tsunami (which may be recorded both onshore and offshore), and soft-sediment deformation. Marine sites may also share evidence of fault scarps, colluvial wedges, offset features, and liquefaction or fluid expulsion with their onshore counterparts. This article reviews the use of submarine turbidite deposits for paleoseismology, focuses on the dating and correlation techniques used to establish stratigraphic continuity of marine deposits, and outlines criteria for distinguishing earthquake deposits and the strategies used to acquire suitable samples and data for marine paleoseismology.

  12. Web application for recording learners’ mouse trajectories and retrieving their study logs for data analysis

    Directory of Open Access Journals (Sweden)

    Yoshinori Miyazaki

    2012-03-01

    Full Text Available With the accelerated implementation of e-learning systems in educational institutions, it has become possible to record learners’ study logs in recent years. However, little research has been conducted on the analysis of the study logs obtained. In addition, there is no software that traces the mouse movements of learners during their learning processes, which the authors believe would enable teachers to better understand their students’ behaviors. The objective of this study is to develop a Web application that records students’ study logs, including their mouse trajectories, and to devise an IR tool that can summarize such diversified data. The results of an experiment are also scrutinized to provide an analysis of the relationship between learners’ activities and their study logs.

  13. Development of Hospital-based Data Sets as a Vehicle for Implementation of a National Electronic Health Record.

    Science.gov (United States)

    Keikha, Leila; Farajollah, Seyede Sedigheh Seied; Safdari, Reza; Ghazisaeedi, Marjan; Mohammadzadeh, Niloofar

    2018-01-01

    In developing countries such as Iran, international standards offer good sources to survey and use for appropriate planning in the domain of electronic health records (EHRs). Therefore, in this study, HL7 and ASTM standards were considered as the main sources from which to extract EHR data. The objective of this study was to propose a hospital data set for a national EHR consisting of data classes and data elements by adjusting data sets extracted from the standards and paper-based records. This comparative study was carried out in 2017 by studying the contents of the paper-based records approved by the health ministry in Iran and the international ASTM and HL7 standards in order to extract a minimum hospital data set for a national EHR. As a result of studying the standards and paper-based records, a total of 526 data elements in 174 classes were extracted. An examination of the data indicated that the highest number of extracted data came from the free text elements, both in the paper-based records and in the standards related to the administrative data. The major sources of data extracted from ASTM and HL7 were the E1384 and Hl7V.x standards, respectively. In the paper-based records, data were extracted from 19 forms sporadically. By declaring the confidentiality of information, the ASTM standards acknowledge the issue of confidentiality of information as one of the main challenges of EHR development, and propose new types of admission, such as teleconference, tele-video, and home visit, which are inevitable with the advent of new technology for providing healthcare and treating diseases. Data related to finance and insurance, which were scattered in different categories by three organizations, emerged as the financial category. Documenting the role and responsibility of the provider by adding the authenticator/signature data element was deemed essential. Not only using well-defined and standardized data, but also adapting EHR systems to the local facilities and

  14. A technique to stabilize record bases for Gothic arch tracings in patients with implant-retained complete dentures.

    Science.gov (United States)

    Raigrodski, A J; Sadan, A; Carruth, P L

    1998-12-01

    Clinicians have long expressed concern about the accuracy of the Gothic arch tracing for recording centric relation in edentulous patients. With the use of dental implants to assist in retaining complete dentures, the problem of inaccurate recordings, made for patients without natural teeth, can be significantly reduced. This article presents a technique that uses healing abutments to stabilize the record bases so that an accurate Gothic arch tracing can be made.

  15. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    Science.gov (United States)

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software for the analysis of multi-site neuronal spike signals. The software package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio based on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT and text files, containing recorded or generated time series of spike signal data. SPICODYN processes such electrophysiological signals focusing on: spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented dealing with multiple time delays (temporal extension) and with multiple binary patterns (high order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing, in those specific cases, automated processing. The optimized implementation of the Delayed Transfer Entropy and the High-Order Transfer Entropy algorithms allows performing accurate and rapid analysis on multiple spike trains from thousands of electrodes.
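
    A minimal sketch of a delayed transfer entropy estimate for binary spike trains, in the spirit of the temporal extension mentioned above; this is a first-order plug-in estimator with a single delay, not the optimized SPICODYN implementation, and the spike trains are simulated.

```python
import numpy as np

def delayed_transfer_entropy(x, y, delay=1):
    """Plug-in estimate (bits) of transfer entropy from binary train x to y
    for a single delay: TE = sum p(yf, yp, xd) * log2[p(yf|yp, xd) / p(yf|yp)],
    with yf = y[t+1], yp = y[t] and xd = x[t+1-delay]."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    n = len(y)
    t = np.arange(delay - 1, n - 1)
    yf, yp, xd = y[t + 1], y[t], x[t + 1 - delay]
    codes = 4 * yf + 2 * yp + xd                         # 8 possible joint states
    p = np.bincount(codes, minlength=8).astype(float).reshape(2, 2, 2)
    p /= p.sum()                                         # joint p(yf, yp, xd)
    p_yp_xd, p_yf_yp, p_yp = p.sum(0), p.sum(2), p.sum((0, 2))
    te = 0.0
    for i in range(2):
        for j in range(2):
            for k in range(2):
                if p[i, j, k] > 0:
                    cond_full = p[i, j, k] / p_yp_xd[j, k]   # p(yf | yp, xd)
                    cond_past = p_yf_yp[i, j] / p_yp[j]      # p(yf | yp)
                    te += p[i, j, k] * np.log2(cond_full / cond_past)
    return te

# Simulated pair of spike trains in which y copies x with a 2-bin delay, so the
# transfer entropy should peak at delay=2 and be near zero at the other delays.
rng = np.random.default_rng(2)
x = (rng.random(20000) < 0.1).astype(int)
y = np.zeros_like(x)
y[2:] = x[:-2]
for d in (1, 2, 3):
    print(d, round(delayed_transfer_entropy(x, y, delay=d), 3))
```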

  16. Developing an integrated electronic nursing record based on standards.

    Science.gov (United States)

    van Grunsven, Arno; Bindels, Rianne; Coenen, Chel; de Bel, Ernst

    2006-01-01

    The Radboud University Nijmegen Medical Centre in the Netherlands is developing a multidisciplinary EHR (Electronic Health Record) based on the latest HL7 v3 (Health Level 7 version 3) D-MIM: Care Provision. As part of this process we are trying to establish which nursing diagnoses and activities are minimally required. These NMDS (Nursing Minimal Data Set) items are mapped or translated to the ICF (for diagnoses) and to CEN 1828 structures (for activities). The mappings will be the foundation for the development of user interfaces for the registration of nursing activities. A custom-made, Web-based configuration tool is used to exploit the possibilities of HL7 v3. This enables a quick launch of user interfaces that can accommodate the diversity of health care work processes. The first screens will be developed to support history taking for the nursing chart of the Neurology ward. The screens will contain both Dutch NMDS items and ward-specific information. This will be configured dynamically per (group of) ward(s).

  17. Digital Communication and Records in Service Provision and Supervision: Regulation and Practice.

    Science.gov (United States)

    Cavalari, Rachel N S; Gillis, Jennifer M; Kruser, Nathan; Romanczyk, Raymond G

    2015-10-01

    While the use of computer-based communication, video recordings, and other "electronic" records is commonplace in clinical service settings and research, management of digital records can become a great burden from both practical and regulatory perspectives. Three types of challenges commonly present themselves: regulatory requirements; storage, transmission, and access; and analysis for clinical and research decision-making. Unfortunately, few practitioners and organizations are well enough informed to set necessary policies and procedures in an effective, comprehensive manner. The three challenges are addressed using a demonstrative example of policies and procedural guidelines from an applied perspective, maintaining the unique emphasis behavior analysts place upon quantitative analysis. Specifically, we provide a brief review of federal requirements relevant to the use of video and electronic records in the USA; non-jargon pragmatic solutions to managing and storing video and electronic records; and last, specific methodologies to facilitate extraction of quantitative information in a cost-effective manner.

  18. Continuous Recording and Interobserver Agreement Algorithms Reported in The Journal of Applied Behavior Analysis (1995–2005)

    Science.gov (United States)

    Mudford, Oliver C; Taylor, Sarah Ann; Martin, Neil T

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the Journal of Applied Behavior Analysis (JABA): Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement, block-by-block agreement, and time-window analysis) were employed in more than 10 of the articles that reported continuous recording. Having identified these currently popular agreement computation algorithms, we explain them to assist researchers, software writers, and other consumers of JABA articles. PMID:19721737
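
    For orientation, the exact agreement and block-by-block (proportional) agreement algorithms, as commonly defined in this literature, can be computed as in the sketch below; the interval counts are invented and the time-window variant is omitted.

```python
def exact_agreement(obs1, obs2):
    """Exact-agreement IOA (%): proportion of intervals in which the two
    observers recorded exactly the same count of responses."""
    assert len(obs1) == len(obs2)
    matches = sum(a == b for a, b in zip(obs1, obs2))
    return 100.0 * matches / len(obs1)

def block_by_block_agreement(obs1, obs2):
    """Block-by-block IOA (%): per interval, the smaller count divided by the
    larger count (1.0 when both are zero), averaged across intervals."""
    ratios = []
    for a, b in zip(obs1, obs2):
        ratios.append(1.0 if a == b == 0 else min(a, b) / max(a, b))
    return 100.0 * sum(ratios) / len(ratios)

# Invented 10-s interval counts from two observers.
o1 = [0, 2, 1, 3, 0, 1]
o2 = [0, 2, 2, 3, 0, 0]
print(exact_agreement(o1, o2), round(block_by_block_agreement(o1, o2), 1))
```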

  19. Smart Card Based Integrated Electronic Health Record System For Clinical Practice

    OpenAIRE

    N. Anju Latha; B. Rama Murthy; U. Sunitha

    2012-01-01

    Smart cards are used in information technologies as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems has become popular due to their increased capacity and performance. Smart cards can be used as an Electronic Health Record (EHR), and their efficient use, with easy and fast data access, has led to particularly widespread implementation in hospitals. In this paper, a smart card based Integrated Electronic health Reco...

  20. Analysis of chaos attractors of MCG-recordings.

    Science.gov (United States)

    Jiang, Shiqin; Yang, Fan; Yi, Panke; Chen, Bo; Luo, Ming; Wang, Lemin

    2006-01-01

    By studying the chaos attractor of the cardiac magnetic induction strength B(z) generated by the electrical activity of the heart, we found that its projection in the reconstructed phase space has a shape similar to the map of the total current dipole vector. It is worth noting that the map of the total current dipole vector is computed from MCG recordings measured at 36 locations, whereas the chaos attractor of B(z) is generated from only one cardiac magnetic field recording on the measurement plane. We discuss only two subjects of different ages in this paper.
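
    Phase-space reconstruction from a single recording such as B(z) is typically done with a time-delay (Takens) embedding; the sketch below shows the generic construction with an invented surrogate signal, embedding dimension, and delay, not the authors' parameters.

```python
import numpy as np

def delay_embed(signal, dim=3, tau=10):
    """Time-delay (Takens) embedding used to reconstruct a phase-space
    attractor from a single scalar recording such as B_z(t)."""
    signal = np.asarray(signal, float)
    n = len(signal) - (dim - 1) * tau
    return np.column_stack([signal[i * tau:i * tau + n] for i in range(dim)])

# Invented stand-in for a single-channel cardiac magnetic trace (1 kHz, 10 s).
t = np.arange(0.0, 10.0, 0.001)
bz = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 2.4 * t + 0.8)
attractor = delay_embed(bz, dim=3, tau=50)
print(attractor.shape)     # points of the reconstructed attractor in R^3
```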

  1. Reconstruction of human brain spontaneous activity based on frequency-pattern analysis of magnetoencephalography data

    Directory of Open Access Journals (Sweden)

    Rodolfo R Llinas

    2015-10-01

    Full Text Available A new method for the analysis and localization of brain activity has been developed, based on multichannel magnetic field recordings, over minutes, superimposed on the MRI of the individual. Here, a high resolution Fourier Transform is obtained over the entire recording period, leading to a detailed multi-frequency spectrum. Further analysis implements a total decomposition of the frequency components into functionally invariant entities, each having an invariant field pattern localizable in recording space. The method, addressed as functional tomography, makes it possible to find the distribution of magnetic field sources in space. Here, the method is applied to the analysis of simulated data, to oscillating signals activating a physical current dipoles phantom, and to recordings of spontaneous brain activity in ten healthy adults. In the analysis of simulated data, 61 dipoles are localized with 0.7 mm precision. Concerning the physical phantom the method is able to localize three simultaneously activated current dipoles with 1 mm precision. Spatial resolution 3 mm was attained when localizing spontaneous alpha rhythm activity in ten healthy adults, where the alpha peak was specified for each subject individually. Co-registration of the functional tomograms with each subject’s head MRI localized alpha range activity to the occipital and/or posterior parietal brain region. This is the first application of this new functional tomography to human brain activity. The method successfully provides an overall view of brain electrical activity, a detailed spectral description and, combined with MRI, the localization of sources in anatomical brain space.

  2. Reconstruction of human brain spontaneous activity based on frequency-pattern analysis of magnetoencephalography data

    Science.gov (United States)

    Llinás, Rodolfo R.; Ustinin, Mikhail N.; Rykunov, Stanislav D.; Boyko, Anna I.; Sychev, Vyacheslav V.; Walton, Kerry D.; Rabello, Guilherme M.; Garcia, John

    2015-01-01

    A new method for the analysis and localization of brain activity has been developed, based on multichannel magnetic field recordings, over minutes, superimposed on the MRI of the individual. Here, a high resolution Fourier Transform is obtained over the entire recording period, leading to a detailed multi-frequency spectrum. Further analysis implements a total decomposition of the frequency components into functionally invariant entities, each having an invariant field pattern localizable in recording space. The method, addressed as functional tomography, makes it possible to find the distribution of magnetic field sources in space. Here, the method is applied to the analysis of simulated data, to oscillating signals activating a physical current dipoles phantom, and to recordings of spontaneous brain activity in 10 healthy adults. In the analysis of simulated data, 61 dipoles are localized with 0.7 mm precision. Concerning the physical phantom the method is able to localize three simultaneously activated current dipoles with 1 mm precision. Spatial resolution 3 mm was attained when localizing spontaneous alpha rhythm activity in 10 healthy adults, where the alpha peak was specified for each subject individually. Co-registration of the functional tomograms with each subject's head MRI localized alpha range activity to the occipital and/or posterior parietal brain region. This is the first application of this new functional tomography to human brain activity. The method successfully provides an overall view of brain electrical activity, a detailed spectral description and, combined with MRI, the localization of sources in anatomical brain space. PMID:26528119
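
    The first step of the frequency-pattern approach, a high-resolution Fourier spectrum over the whole recording with one across-channel pattern per frequency bin, can be sketched as follows; the sampling rate, channel count, and normalisation are illustrative assumptions, and the full decomposition into functionally invariant entities and the MRI co-registration are not reproduced here.

```python
import numpy as np

def multichannel_spectrum(data, fs):
    """High-resolution Fourier analysis of a long multichannel recording:
    one complex coefficient per channel and frequency bin, plus the normalized
    across-channel magnitude vector as a crude per-frequency field pattern."""
    n_samples, _ = data.shape
    coeffs = np.fft.rfft(data, axis=0)                 # resolution = fs / n_samples
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    mags = np.abs(coeffs)
    patterns = mags / np.linalg.norm(mags, axis=1, keepdims=True)
    return freqs, coeffs, patterns

# Invented 5-minute, 4-channel recording at 250 Hz containing a 10 Hz "alpha"
# component with a fixed spatial pattern across the channels.
fs = 250.0
t = np.arange(0.0, 300.0, 1.0 / fs)
spatial = np.array([1.0, 0.6, 0.3, 0.1])
data = np.outer(np.sin(2 * np.pi * 10.0 * t), spatial)
data += 0.1 * np.random.default_rng(3).normal(size=data.shape)
freqs, coeffs, patterns = multichannel_spectrum(data, fs)
alpha_bin = int(np.argmin(np.abs(freqs - 10.0)))
print(freqs[alpha_bin], np.round(patterns[alpha_bin], 2))
```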

  3. Effects of scanning and eliminating paper-based medical records on hospital physicians' clinical work practice.

    Science.gov (United States)

    Laerum, Hallvard; Karlsen, Tom H; Faxvaag, Arild

    2003-01-01

    It is not automatically given that the paper-based medical record can be eliminated after the introduction of an electronic medical record (EMR) in a hospital. Many keep and update the paper-based counterpart, and this limits the use of the EMR system. The authors have evaluated the physicians' clinical work practices and attitudes toward a system in a hospital that has eliminated the paper-based counterpart using scanning technology. Combined open-ended interviews (8 physicians) and cross-sectional survey (70 physicians) were conducted and compared with reference data from a previous national survey (69 physicians from six hospitals). The hospitals in the reference group were using the same EMR system without the scanning module. The questionnaire (English translation available as an online data supplement) covered frequency of use of the EMR system for 19 defined tasks, ease of performing them, and user satisfaction. The interviews were open-ended. The physicians routinely used the system for nine of 11 tasks regarding retrieval of patient data, which the majority of the physicians found more easily performed than before. However, 22% to 25% of the physicians found retrieval of patient data more difficult, particularly among internists (33%). Overall, the physicians were equally satisfied with the part of the system handling the regular electronic data as were the physicians in the reference group. They were, however, much less satisfied with the use of scanned document images than with regular electronic data, using the former less frequently than the latter. Scanning and elimination of the paper-based medical record is feasible, but the scanned document images should be considered an intermediate stage toward fully electronic medical records. To our knowledge, this is the first assessment from a hospital in the process of completing such a scanning project.

  4. The effect of electronic health record software design on resident documentation and compliance with evidence-based medicine.

    Science.gov (United States)

    Rodriguez Torres, Yasaira; Huang, Jordan; Mihlstin, Melanie; Juzych, Mark S; Kromrei, Heidi; Hwang, Frank S

    2017-01-01

    This study aimed to determine the role of electronic health record software in resident education by evaluating documentation of 30 elements extracted from the American Academy of Ophthalmology Dry Eye Syndrome Preferred Practice Pattern. The Kresge Eye Institute transitioned to using electronic health record software in June 2013. We evaluated the charts of 331 patients examined in the resident ophthalmology clinic between September 1, 2011, and March 31, 2014, for an initial evaluation for dry eye syndrome. We compared documentation rates for the 30 evidence-based elements between electronic health record chart note templates among the ophthalmology residents. Overall, significant changes in documentation occurred when transitioning to a new version of the electronic health record software, with average compliance ranging from 67.4% to 73.6%. Electronic Health Record A had high compliance (>90%) in 13 elements while Electronic Health Record B had high compliance (>90%) in 11 elements. The presence of dialog boxes was responsible for significant changes in documentation of adnexa, puncta, proptosis, skin examination, contact lens wear, and smoking exposure. Significant differences in documentation were correlated with electronic health record template design rather than individual resident or residents' year in training. Our results show that electronic health record template design influences documentation across all resident years. Decreased documentation likely results from "mouse click fatigue" as residents had to access multiple dialog boxes to complete documentation. These findings highlight the importance of EHR template design to improve resident documentation and integration of evidence-based medicine into their clinical notes.

  5. New operator's console recorder

    International Nuclear Information System (INIS)

    Anon.

    2009-01-01

    This article described a software module that automatically records images being shown on multiple HMI or SCADA operator's displays. Videos used for monitoring activities at industrial plants can be combined with the operator console videos and data from a process historian. This enables engineers, analysts or investigators to see what is occurring in the plant, what the operator is seeing on the HMI screen, and all relevant real-time data from an event. In the case of a leak at a pumping station, investigators could watch plant video taken at a remote site showing fuel oil creeping across the floor, along with real-time data being acquired from pumps, valves and the receiving tank while the leak is occurring. The video shows the operator's HMI screen as well as the alarm screen that signifies the leak detection. The Longwatch Operator's Console Recorder and Video Historian are used together to acquire data about actual plant management because they show everything that happens during an event. The Console Recorder automatically retrieves and replays operator displays by clicking on a time-based alarm or system message. Playback of the video feed is a valuable tool for training and analysis purposes, and can help mitigate insurance and regulatory issues by eliminating uncertainty and conjecture. 1 fig.

  6. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK), including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
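
    The agreement measure used above, PABAK, reduces to 2·Po − 1, with Po the observed proportion of agreeing epochs; a minimal sketch with invented 10-s epoch labels:

```python
def pabak(auto_labels, manual_labels):
    """Prevalence-adjusted bias-adjusted kappa for two raters over the same
    epochs: PABAK = 2 * Po - 1, with Po the observed proportion of agreement."""
    assert len(auto_labels) == len(manual_labels)
    po = sum(a == m for a, m in zip(auto_labels, manual_labels)) / len(auto_labels)
    return 2.0 * po - 1.0

# Invented movement / no-movement labels for consecutive 10-s epochs.
auto   = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
manual = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
print(pabak(auto, manual))   # 0.8 when 9 of 10 epochs match
```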

  7. Use of a Web-based physical activity record system to analyze behavior in a large population: cross-sectional study.

    Science.gov (United States)

    Namba, Hideyuki; Yamada, Yosuke; Ishida, Mika; Takase, Hideto; Kimura, Misaka

    2015-03-19

    The use of Web-based physical activity systems has been proposed as an easy method for collecting physical activity data. We have developed a system that has exhibited high accuracy as assessed by the doubly labeled water method. The purpose of this study was to collect behavioral data from a large population using our Web-based physical activity record system and assess the physical activity of the population based on these data. In this paper, we address the difference in physical activity for each urban scale. In total, 2046 participants (aged 30-59 years; 1105 men and 941 women) participated in the study. They were asked to complete data entry before bedtime using their personal computer on 1 weekday and 1 weekend day. Their residential information was categorized as urban, urban-rural, or rural. Participant responses expressed the intensity of each activity at 15-minute increments and were recorded on a Web server. Residential areas were compared and multiple regression analysis was performed. Most participants had a metabolic equivalent (MET) ranging from 1.4 to 1.8, and the mean MET was 1.60 (SD 0.28). The median value of moderate-to-vigorous physical activity (MVPA, ≥3 MET) was 7.92 MET-hours/day. A 1-way ANCOVA showed that total physical activity differed depending on the type of residential area (F2,2027=5.19, P=.006). The urban areas (n=950) had the lowest MET-hours/day (mean 37.8, SD, 6.0), followed by urban-rural areas (n=432; mean 38.6, SD 6.5; P=.04), and rural areas (n=664; mean 38.8, SD 7.4; P=.002). Two-way ANCOVA showed a significant interaction between sex and area of residence on the urban scale (F2,2036=4.53, P=.01). Men in urban areas had the lowest MET-hours/day (MVPA, ≥3 MET) at mean 7.9 (SD 8.7); men in rural areas had a MET-hours/day (MVPA, ≥3 MET) of mean 10.8 (SD 12.1, P=.002). No significant difference was noted in women among the 3 residential areas. Multiple regression analysis showed that physical activity consisting of
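
    The daily summary measures reported above can be illustrated with a small sketch that converts 15-minute MET entries into MET-hours per day and the moderate-to-vigorous (≥3 MET) share; the example day is invented.

```python
def met_hours(met_per_quarter_hour, mvpa_threshold=3.0):
    """Daily totals from 15-min MET entries: total MET-hours and MET-hours
    accumulated at moderate-to-vigorous intensity (>= 3 MET)."""
    total = sum(m * 0.25 for m in met_per_quarter_hour)
    mvpa = sum(m * 0.25 for m in met_per_quarter_hour if m >= mvpa_threshold)
    return total, mvpa

# Invented one-day record: 96 quarter-hour entries (sleep, desk work, a walk).
day = [0.9] * 32 + [1.5] * 52 + [3.5] * 8 + [4.0] * 4
print(met_hours(day))   # roughly (37.7, 11.0) MET-hours for this example
```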

  8. Gap analysis between provisional diagnosis and final diagnosis in government and private teaching hospitals: A record-linked comparative study.

    Science.gov (United States)

    Chatterjee, Sudeshna; Ray, Krishnangshu; Das, Anup Kumar

    2016-01-01

    1. To identify the extent of clinical gaps in the context of knowledge, practice and systems. 2. To formulate necessary intervention measures towards bridging the gaps. Comparative, cross-sectional and non-interventional study. It is a retrospective, record-based study conducted on inpatients (n = 200) of the major disciplines of two teaching hospitals. The major outcome variable was the matching or mismatching of the final and provisional diagnoses using ICD-10 criteria. Comparative analyses of specific and selective gaps were estimated in terms of percentage (%). The pilot observation showed the existence of gaps between provisional and final diagnoses in both the private and the government institution. Both knowledge and skill gaps were evident in caregivers, and gaps in documentation were evident in the medical records. The pilot data may be an eye-opener for public and private governance systems for understanding and revising the processes of service planning and service delivery. Necessary intervention measures may be contemplated towards enhancing the diagnostic skills of doctors for quality hospital care.

  9. Spectral analysis in overmodulated holographic reflection gratings recorded with BB640 ultrafine grain emulsion

    Science.gov (United States)

    Mas-Abellán, P.; Madrigal, R.; Fimia, A.

    2015-05-01

    Silver halide emulsions have been considered one of the most energy-sensitive materials for holographic applications. Nonlinear recording effects on holographic reflection gratings recorded on silver halide emulsions have been studied by different authors, obtaining excellent experimental results. In this communication we specifically focus our investigation on the effects of refractive index modulation, trying to reach high levels of overmodulation. We studied the influence of the grating thickness on the overmodulation and its effects on the transmission spectra for a wide exposure range by use of two ultrafine grain BB640 emulsions of different thickness, thin films (6 μm) and thick films (9 μm), exposed to single collimated beams using a red He-Ne laser (wavelength 632.8 nm) in a Denisyuk configuration, obtaining a spatial frequency of 4990 l/mm recorded on the emulsion. The experimental results show that high overmodulation levels of the refractive index could offer some benefits, such as high diffraction efficiency (reaching 90%), an increase of the grating bandwidth (close to 80 nm), making lighter holograms, or diffraction spectra deformation, transforming the spectrum from sinusoidal to an approximation of a square shape. Based on these results, we demonstrate that the spectra of holographic reflection gratings recorded with overmodulation of the refractive index are formed by the combination of several non-linear components due to very high overmodulation. This study is the first step in developing a new, easy multiplexing technique based on the use of high index modulation reflection gratings.

  10. Improvements in heavy water analysis using a ratio recording infrared spectrophotometer (Preprint No. CA-12)

    Energy Technology Data Exchange (ETDEWEB)

    Sutawane, U B; Alphonse, K P; Rathi, B N [Bhabha Atomic Research Centre, Bombay (India). Heavy Water Div.

    1989-04-01

    With a view to optimise existing analytical procedures for routine analyses of heavy water, studies were carried out using a ratio recording instrument with and without a reference beam attenuator in infrared spectrophotometric method. Absorbance difference as well as absorbance values with different path length cells were used for measurements. Due to various practical considerations, a method based on measurement of absorbance values rather than absorbance difference was found to be convenient for all routine work. However, scanning is essential since there is slight shifting of peak position. Measurements at fixed wave lengths should generally be avoided. Use of standards for calibration of instrument is essential and frequent check of calibration is recommended. Optimum conditions for analysis of heavy water in different ranges on the instrument used in this study are tabulated. (author). 6 refs., 1 tab.

  11. Improvements in heavy water analysis using a ratio recording infrared spectrophotometer (Preprint No. CA-12)

    International Nuclear Information System (INIS)

    Sutawane, U.B.; Alphonse, K.P.; Rathi, B.N.

    1989-04-01

    With a view to optimise existing analytical procedures for routine analyses of heavy water, studies were carried out using a ratio recording instrument with and without a reference beam attenuator in infrared spectrophotometric method. Absorbance difference as well as absorbance values with different path length cells were used for measurements. Due to various practical considerations, a method based on measurement of absorbance values rather than absorbance difference was found to be convenient for all routine work. However, scanning is essential since there is slight shifting of peak position. Measurements at fixed wave lengths should generally be avoided. Use of standards for calibration of instrument is essential and frequent check of calibration is recommended. Optimum conditions for analysis of heavy water in different ranges on the instrument used in this study are tabulated. (author). 6 refs., 1 tab

  12. Beyond the computer-based patient record: re-engineering with a vision.

    Science.gov (United States)

    Genn, B; Geukers, L

    1995-01-01

    In order to achieve real benefit from the potential offered by a Computer-Based Patient Record, the capabilities of the technology must be applied along with true re-engineering of healthcare delivery processes. University Hospital recognizes this and is using systems implementation projects as the catalyst for transforming the way we care for our patients. Integration is fundamental to the success of these initiatives and this must be explicitly planned against an organized systems architecture whose standards are market-driven. University Hospital also recognizes that Community Health Information Networks will offer improved quality of patient care at a reduced overall cost to the system. All of these implementation factors are considered up front as the hospital makes its initial decisions on how to computerize its patient records. This improves our chances for success and will provide a consistent vision to guide the hospital's development of new and better patient care.

  13. How to limit the burden of data collection for Quality Indicators based on medical records? The COMPAQH experience

    Directory of Open Access Journals (Sweden)

    Grenier Catherine

    2008-10-01

    Full Text Available Abstract Background Our objective was to limit the burden of data collection for Quality Indicators (QIs) based on medical records. Methods The study was supervised by the COMPAQH project. Four QIs based on medical records were tested: medical record conformity; traceability of pain assessment; screening for nutritional disorders; time elapsed before sending copy of discharge letter to the general practitioner. Data were collected by 6 Clinical Research Assistants (CRAs) in a panel of 36 volunteer hospitals and analyzed by COMPAQH. To limit the burden of data collection, we used the same sample of medical records for all 4 QIs, limited sample size to 80 medical records, and built a composite score of only 10 items to assess medical record completeness. We assessed QI feasibility by completing a grid of 19 potential problems and evaluating time spent. We assessed reliability (κ coefficient) as well as internal consistency (Cronbach α coefficient) in an inter-observer study, and discriminatory power by analysing QI variability among hospitals. Results Overall, 23 115 data items were collected for the 4 QIs and analyzed. The average time spent on data collection was 8.5 days per hospital. The most common feasibility problem was misunderstanding of the item by hospital staff. QI reliability was good (κ: 0.59–0.97 according to QI). The hospitals differed widely in their ability to meet the quality criteria (mean value: 19–85%). Conclusion These 4 QIs based on medical records can be used to compare the quality of record keeping among hospitals while limiting the burden of data collection, and can therefore be used for benchmarking purposes. The French National Health Directorate has included them in the new 2009 version of the accreditation procedure for healthcare organizations.

  14. “It’s like texting at the dinner table”: A qualitative analysis of the impact of electronic health records on patient-physician interaction in hospitals

    Directory of Open Access Journals (Sweden)

    Kimberly D Pelland

    2017-06-01

    Full Text Available Background: Electronic health records (EHRs) may reduce medical errors and improve care, but can complicate clinical encounters. Objective: To describe hospital-based physicians’ perceptions of the impact of EHRs on patient-physician interactions and contrast these findings against office-based physicians’ perceptions. Methods: We performed a qualitative analysis of comments submitted in response to the 2014 Rhode Island Health Information Technology Survey. Office- and hospital-based physicians licensed in Rhode Island, in active practice, and located in Rhode Island or neighboring states completed the survey about their Electronic Health Record use. Results: The survey’s response rate was 68.3% and 2,236 (87.1%) respondents had EHRs. Among survey respondents, 27.3% of hospital-based and 37.8% of office-based physicians with EHRs responded to the question about patient interaction. Five main themes emerged for hospital-based physicians, with respondents generally perceiving EHRs as negatively altering patient interactions. We noted the same five themes among office-based physicians, but the rank-order of the top two responses differed by setting: hospital-based physicians commented most frequently that they spend less time with patients because they have to spend more time on computers; office-based physicians commented most frequently on EHRs worsening the quality of their interactions and relationships with patients. Conclusion: In our analysis of a large sample of physicians, hospital-based physicians generally perceived EHRs as negatively altering patient interactions, although they emphasized different reasons than their office-based counterparts. These findings add to the prior literature, which focuses on outpatient physicians, and can shape interventions to improve how EHRs are used in inpatient settings.

  15. Development of clinical contents model markup language for electronic health records.

    Science.gov (United States)

    Yun, Ji-Hyun; Ahn, Sun-Ju; Kim, Yoon

    2012-09-01

    To develop a dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Based on an analysis of the structure and characteristics of CCM in the clinical domain, we manually designed an extensible markup language (XML) based CCM markup language (CCML) schema. CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied to existing electronic health record systems.

  16. A stationary wavelet transform and a time-frequency based spike detection algorithm for extracellular recorded data

    Science.gov (United States)

    Lieb, Florian; Stark, Hans-Georg; Thielemann, Christiane

    2017-06-01

    Objective. Spike detection from extracellular recordings is a crucial preprocessing step when analyzing neuronal activity. The decision whether a specific part of the signal is a spike or not is important for any kind of other subsequent preprocessing steps, like spike sorting or burst detection in order to reduce the classification of erroneously identified spikes. Many spike detection algorithms have already been suggested, all working reasonably well whenever the signal-to-noise ratio is large enough. When the noise level is high, however, these algorithms have a poor performance. Approach. In this paper we present two new spike detection algorithms. The first is based on a stationary wavelet energy operator and the second is based on the time-frequency representation of spikes. Both algorithms are more reliable than all of the most commonly used methods. Main results. The performance of the algorithms is confirmed by using simulated data, resembling original data recorded from cortical neurons with multielectrode arrays. In order to demonstrate that the performance of the algorithms is not restricted to only one specific set of data, we also verify the performance using a simulated publicly available data set. We show that both proposed algorithms have the best performance under all tested methods, regardless of the signal-to-noise ratio in both data sets. Significance. This contribution will redound to the benefit of electrophysiological investigations of human cells. Especially the spatial and temporal analysis of neural network communications is improved by using the proposed spike detection algorithms.
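
    A minimal sketch in the spirit of the first algorithm, thresholding a smoothed stationary-wavelet detail energy at a robust noise estimate; the wavelet, decomposition level, threshold factor, and simulated trace are assumptions, and this is not the authors' exact operator.

```python
import numpy as np
import pywt

def swt_energy_spike_times(signal, fs, wavelet="sym5", level=3, k=5.0):
    """Threshold a smoothed stationary-wavelet detail energy at k times a
    robust noise estimate and return the onset times of the crossings."""
    n = len(signal) - len(signal) % (2 ** level)   # SWT needs a multiple of 2**level
    coeffs = pywt.swt(np.asarray(signal[:n], float), wavelet, level=level)
    detail = coeffs[-2][1]                         # level-2 detail coefficients
    sigma = np.median(np.abs(detail)) / 0.6745     # robust noise scale
    energy = np.convolve(detail ** 2, np.ones(5) / 5, mode="same")
    above = energy > (k * sigma) ** 2
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return onsets / fs

# Simulated 2-s extracellular trace at 24 kHz: unit-variance noise with three
# large biphasic transients standing in for spikes.
rng = np.random.default_rng(4)
fs = 24000.0
trace = rng.normal(0.0, 1.0, 48000)
spike = 6.0 * np.hanning(24) * np.sign(np.sin(np.linspace(0, 2 * np.pi, 24)))
for start in (5000, 20000, 41000):
    trace[start:start + 24] += spike
print(swt_energy_spike_times(trace, fs))   # detected onset times in seconds
```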

  17. Identification of input variables for feature based artificial neural networks-saccade detection in EOG recordings.

    Science.gov (United States)

    Tigges, P; Kathmann, N; Engel, R R

    1997-07-01

    Though artificial neural networks (ANN) are excellent tools for pattern recognition problems when signal to noise ratio is low, the identification of decision relevant features for ANN input data is still a crucial issue. The experience of the ANN designer and the existing knowledge and understanding of the problem seem to be the only links for a specific construction. In the present study a backpropagation ANN based on modified raw data inputs showed encouraging results. Investigating the specific influences of prototypical input patterns on a specially designed ANN led to a new sparse and efficient input data presentation. This data coding obtained by a semiautomatic procedure combining existing expert knowledge and the internal representation structures of the raw data based ANN yielded a list of feature vectors, each representing the relevant information for saccade identification. The feature based ANN produced a reduction of the error rate of nearly 40% compared with the raw data ANN. An overall correct classification of 92% of so far unknown data was realized. The proposed method of extracting internal ANN knowledge for the production of a better input data representation is not restricted to EOG recordings, and could be used in various fields of signal analysis.

  18. Analysis of dose record and epidemiology for radiation workers in Korea

    International Nuclear Information System (INIS)

    Choi, S.Y.; Kim, T.H.

    2003-01-01

    This study presents data on the externally received doses and preliminary results of an epidemiological survey of radiation workers. The statistical analysis was carried out in order to better understand the occupational radiation doses in Korea. Records containing dose information from 1984 to 1999 for 64,518 persons were extracted from the National Dose Registry of Korea (the Korea Radioisotope Association's personal dose records). The total number of workers registered from 1984 to 1999 was 64,518. The number of workers steadily increased and the accumulated dose somewhat increased. The proportion of radiation workers by occupation was 38.4% for nuclear power plants, 20.3% for industrial organizations and 12.4% for the non-destructive testing industry. The collective annual dose of radiation workers was 31.72 man Sv in 1999. The mean annual dose by sex was 1.49 mSv for males and 0.56 mSv for females. The mean annual dose for workers was 1.41 mSv, with the highest mean dose being received in the non-destructive testing industry (3.53 mSv). Very few workers (0.8%) received more than 20 mSv and only one received more than 50 mSv, the legal limit for an annual dose. There has been a steady decline in the mean dose since 1984, showing a significant decrease in dose with time. The data showed that radiation protection in Korea was improving, though annual doses were still higher than in other countries. Nevertheless, this finding brings to light the necessity for workers to pay more careful attention to radiation protection procedures and practices, and suggests the need for continuous effort to implement such procedures. We are carrying out an epidemiological survey in order to evaluate radiation effects on Korean workers based on radiation dose data from the year 2000. Follow-up is being carried out in order to detect and measure directly the risks of cancer using the Korean Mortality Data, Cancer Registry and individual investigation

  19. Satellite-based climate data records of surface solar radiation from the CM SAF

    Science.gov (United States)

    Trentmann, Jörg; Cremer, Roswitha; Kothe, Steffen; Müller, Richard; Pfeifroth, Uwe

    2017-04-01

    The incoming surface solar radiation has been defined as an essential climate variable by GCOS. Long term monitoring of this part of the earth's energy budget is required to gain insights on the state and variability of the climate system. In addition, climate data sets of surface solar radiation have received increased attention over the recent years as an important source of information for solar energy assessments, for crop modeling, and for the validation of climate and weather models. The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) is deriving climate data records (CDRs) from geostationary and polar-orbiting satellite instruments. Within the CM SAF these CDRs are accompanied by operational data at a short time latency to be used for climate monitoring. All data from the CM SAF is freely available via www.cmsaf.eu. Here we present the regional and the global climate data records of surface solar radiation from the CM SAF. The regional climate data record SARAH (Surface Solar Radiation Dataset - Heliosat, doi: 10.5676/EUM_SAF_CM/SARAH/V002) is based on observations from the series of Meteosat satellites. SARAH provides 30-min, daily- and monthly-averaged data of the effective cloud albedo, the solar irradiance (incl. spectral information), the direct solar radiation (horizontal and normal), and the sunshine duration from 1983 to 2015 for the full view of the Meteosat satellite (i.e, Europe, Africa, parts of South America, and the Atlantic ocean). The data sets are generated with a high spatial resolution of 0.05° allowing for detailed regional studies. The global climate data record CLARA (CM SAF Clouds, Albedo and Radiation dataset from AVHRR data, doi: 10.5676/EUM_SAF_CM/CLARA_AVHRR/V002) is based on observations from the series of AVHRR satellite instruments. CLARA provides daily- and monthly-averaged global data of the solar irradiance (SIS) from 1982 to 2015 with a spatial resolution of 0.25°. In addition to the solar surface

  20. Design, fabrication and skin-electrode contact analysis of polymer microneedle-based ECG electrodes

    Science.gov (United States)

    O'Mahony, Conor; Grygoryev, Konstantin; Ciarlone, Antonio; Giannoni, Giuseppe; Kenthao, Anan; Galvin, Paul

    2016-08-01

    Microneedle-based ‘dry’ electrodes have immense potential for use in diagnostic procedures such as electrocardiography (ECG) analysis, as they eliminate several of the drawbacks associated with the conventional ‘wet’ electrodes currently used for physiological signal recording. To be commercially successful in such a competitive market, it is essential that dry electrodes are manufacturable in high volumes and at low cost. In addition, the topographical nature of these emerging devices means that electrode performance is likely to be highly dependent on the quality of the skin-electrode contact. This paper presents a low-cost, wafer-level micromoulding technology for the fabrication of polymeric ECG electrodes that use microneedle structures to make a direct electrical contact to the body. The double-sided moulding process can be used to eliminate post-process via creation and wafer dicing steps. In addition, measurement techniques have been developed to characterize the skin-electrode contact force. We perform the first analysis of signal-to-noise ratio dependency on contact force, and show that although microneedle-based electrodes can outperform conventional gel electrodes, the quality of ECG recordings is significantly dependent on temporal and mechanical aspects of the skin-electrode interface.

  1. Design, fabrication and skin-electrode contact analysis of polymer microneedle-based ECG electrodes

    International Nuclear Information System (INIS)

    O’Mahony, Conor; Grygoryev, Konstantin; Ciarlone, Antonio; Giannoni, Giuseppe; Kenthao, Anan; Galvin, Paul

    2016-01-01

    Microneedle-based ‘dry’ electrodes have immense potential for use in diagnostic procedures such as electrocardiography (ECG) analysis, as they eliminate several of the drawbacks associated with the conventional ‘wet’ electrodes currently used for physiological signal recording. To be commercially successful in such a competitive market, it is essential that dry electrodes are manufacturable in high volumes and at low cost. In addition, the topographical nature of these emerging devices means that electrode performance is likely to be highly dependent on the quality of the skin-electrode contact. This paper presents a low-cost, wafer-level micromoulding technology for the fabrication of polymeric ECG electrodes that use microneedle structures to make a direct electrical contact to the body. The double-sided moulding process can be used to eliminate post-process via creation and wafer dicing steps. In addition, measurement techniques have been developed to characterize the skin-electrode contact force. We perform the first analysis of signal-to-noise ratio dependency on contact force, and show that although microneedle-based electrodes can outperform conventional gel electrodes, the quality of ECG recordings is significantly dependent on temporal and mechanical aspects of the skin-electrode interface. (paper)

  2. AGGLOMERATIVE CLUSTERING OF SOUND RECORD SPEECH SEGMENTS BASED ON BAYESIAN INFORMATION CRITERION

    Directory of Open Access Journals (Sweden)

    O. Yu. Kydashev

    2013-01-01

    Full Text Available This paper presents a detailed description of the implementation of an agglomerative clustering system for speech segments based on the Bayesian information criterion. Numerical experiment results with different acoustic features, as well as with full and diagonal covariance matrices, are given. An error rate (DER) of 6.4% for audio records of radio «Svoboda» was achieved by means of the designed system.
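
    The merge decision in BIC-based agglomerative clustering is commonly taken from a delta-BIC between the merged and separate Gaussian models of two segments; the sketch below shows the full-covariance form with invented MFCC-like features, not the authors' implementation.

```python
import numpy as np

def delta_bic(seg1, seg2, lam=1.0):
    """Delta-BIC merge criterion for two feature segments (frames x dims) under
    full-covariance Gaussian models.  A negative value favours merging the two
    segments (same speaker); a positive value favours keeping them separate."""
    n1, d = seg1.shape
    n2, _ = seg2.shape
    both = np.vstack([seg1, seg2])
    n = n1 + n2
    logdet = lambda x: np.linalg.slogdet(np.cov(x, rowvar=False))[1]
    penalty = 0.5 * (d + 0.5 * d * (d + 1)) * np.log(n)
    return 0.5 * (n * logdet(both) - n1 * logdet(seg1) - n2 * logdet(seg2)) - lam * penalty

# Invented MFCC-like segments: two from the same "speaker", one different.
rng = np.random.default_rng(5)
a1 = rng.normal(0.0, 1.0, (300, 12))
a2 = rng.normal(0.0, 1.0, (300, 12))
b  = rng.normal(3.0, 1.5, (300, 12))
print(delta_bic(a1, a2) < 0, delta_bic(a1, b) < 0)   # expect: True, False
```

    In an agglomerative pass, the pair of clusters with the most negative delta-BIC would be merged first, and merging stops when no pair yields a negative value.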

  3. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    Science.gov (United States)

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. The new toolbox presented here implements fast and data-robust computations of the most relevant

  4. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion The new toolbox presented here implements fast

  5. Seismic Response Analysis and Design of Structure with Base Isolation

    International Nuclear Information System (INIS)

    Rosko, Peter

    2010-01-01

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components. The energy distribution in each story provides useful information for structural upgrading with the help of added devices. The objective is minimization of the structural displacement response. The application of the structural seismic response research is presented in a base-isolation example.

  6. Connectivity inference from neural recording data: Challenges, mathematical bases and research directions.

    Science.gov (United States)

    Magrans de Abril, Ildefons; Yoshimoto, Junichiro; Doya, Kenji

    2018-06-01

    This article presents a review of computational methods for connectivity inference from neural activity data derived from multi-electrode recordings or fluorescence imaging. We first identify biophysical and technical challenges in connectivity inference along the data processing pipeline. We then review connectivity inference methods based on two major mathematical foundations, namely, descriptive model-free approaches and generative model-based approaches. We investigate representative studies in both categories and clarify which challenges have been addressed by which method. We further identify critical open issues and possible research directions. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
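
    As a toy example of the descriptive, model-free end of the spectrum reviewed above (and not any specific method from the review), a pairwise-correlation connectivity estimate can be sketched as follows; the threshold and data layout are arbitrary illustrative choices.

        import numpy as np

        def correlation_connectivity(activity, threshold=0.3):
            # activity: array of shape (n_units, n_time_bins), e.g. binned spike
            # counts or fluorescence traces; returns the correlation matrix and a
            # thresholded binary adjacency matrix as a crude connectivity estimate
            corr = np.corrcoef(activity)
            np.fill_diagonal(corr, 0.0)
            adjacency = np.abs(corr) >= threshold
            return corr, adjacency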

  7. Archetype-based data warehouse environment to enable the reuse of electronic health record data.

    Science.gov (United States)

    Marco-Ruiz, Luis; Moner, David; Maldonado, José A; Kolstrup, Nils; Bellika, Johan G

    2015-09-01

    The reuse of data captured during health care delivery is essential to satisfy the demands of clinical research and clinical decision support systems. A main barrier to reuse is the existence of legacy data formats and the high granularity of the data when stored in an electronic health record (EHR) system. Thus, we need mechanisms to standardize, aggregate, and query data concealed in the EHRs, to allow their reuse whenever they are needed. To create a data warehouse infrastructure using archetype-based technologies, standards and query languages to enable the interoperability needed for data reuse. The work presented makes use of best-of-breed archetype-based data transformation and storage technologies to create a workflow for the modeling, extraction, transformation and load of EHR proprietary data into standardized data repositories. We converted legacy data and performed patient-centered aggregations via archetype-based transformations. Later, specific-purpose aggregations were performed at a query level for particular use cases. Laboratory test results of a population of 230,000 patients belonging to Troms and Finnmark counties in Norway requested between January 2013 and November 2014 have been standardized. Normalization of test records has been performed by defining transformation and aggregation functions between the laboratory records and an archetype. These mappings were used to automatically generate openEHR-compliant data. These data were loaded into an archetype-based data warehouse. Once loaded, we defined indicators linked to the data in the warehouse to monitor test activity of Salmonella and Pertussis using the archetype query language. Archetype-based standards and technologies can be used to create a data warehouse environment that enables data from EHR systems to be reused in clinical research and decision support systems. With this approach, existing EHR data becomes available in a standardized and interoperable format, thus opening a world

  8. Factors that influence the efficiency of beef and dairy cattle recording system in Kenya: A SWOT-AHP analysis.

    Science.gov (United States)

    Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J

    2011-01-01

    Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by strengths, weaknesses, opportunities and threats-analytic hierarchy process (SWOT-AHP) analysis, and priority scores to determine their relative importance to the system were calculated using the Eigenvalue method. For identification and registration, and evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strengths factors could not sustain the required efficiency of the system. Weaknesses of the system predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and development of institutions to support recording were feasible.
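
    For readers unfamiliar with the Eigenvalue method used to derive the priority scores, a minimal sketch is shown below: the priority vector is the normalized principal eigenvector of a pairwise-comparison matrix, and Saaty's consistency ratio checks the judgements (the 3x3 example matrix is invented, not the study's data).

        import numpy as np

        def ahp_priorities(pairwise):
            # priority scores from a reciprocal pairwise-comparison matrix
            a = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(a)
            k = np.argmax(eigvals.real)
            weights = np.abs(eigvecs[:, k].real)
            weights /= weights.sum()
            n = a.shape[0]
            consistency_index = (eigvals[k].real - n) / (n - 1)
            random_index = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}.get(n, 1.0)
            return weights, consistency_index / random_index

        # toy comparison of three factor groups (values chosen for illustration)
        weights, consistency_ratio = ahp_priorities([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
        print(weights, consistency_ratio)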

  9. Relationship between road traffic accidents and conflicts recorded by drive recorders.

    Science.gov (United States)

    Lu, Guangquan; Cheng, Bo; Kuzumaki, Seigo; Mei, Bingsong

    2011-08-01

    Road traffic conflicts can be used to estimate the probability of accident occurrence, assess road safety, or evaluate road safety programs if the relationship between road traffic accidents and conflicts is known. To this end, we propose a model for the relationship between road traffic accidents and conflicts recorded by drive recorders (DRs). DRs were installed in 50 cars in Beijing to collect records of traffic conflicts. Data containing 1366 conflicts were collected in 193 days. The hourly distributions of conflicts and accidents were used to model the relationship between accidents and conflicts. To eliminate time series and base number effects, we defined and used 2 parameters: average annual number of accidents per 10,000 vehicles per hour and average number of conflicts per 10,000 vehicles per hour. A model was developed to describe the relationship between the two parameters. If A(i) = average annual number of accidents per 10,000 vehicles per hour at hour i, and E(i) = average number of conflicts per 10,000 vehicles per hour at hour i, the relationship can be expressed as [Formula in text] (α>0, β>0). The average number of traffic accidents increases as the number of conflicts rises, but the rate of increase decelerates as the number of conflicts increases further. The proposed model can describe the relationship between road traffic accidents and conflicts in a simple manner. According to our analysis, the model fits the present data.
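
    The functional form of the model is not reproduced in this record ("[Formula in text]"). Purely as an illustration of fitting a concave accident-conflict relationship of the kind described, the sketch below fits one assumed saturating form to invented hourly data; neither the data nor the chosen form comes from the study.

        import numpy as np
        from scipy.optimize import curve_fit

        # hypothetical hourly values: E = conflicts per 10,000 vehicles,
        # A = average annual accidents per 10,000 vehicles
        E = np.array([2, 5, 9, 14, 20, 27, 35, 44], dtype=float)
        A = np.array([0.8, 1.6, 2.3, 2.9, 3.3, 3.6, 3.8, 3.9])

        def saturating(e, alpha, beta):
            # accidents rise with conflicts, but at a decreasing rate
            return alpha * (1.0 - np.exp(-beta * e))

        (alpha, beta), _ = curve_fit(saturating, E, A, p0=(4.0, 0.05))
        print(f"alpha={alpha:.2f}, beta={beta:.3f}")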

  10. Disturbance recording system

    International Nuclear Information System (INIS)

    Chandra, A.K.; Deshpande, S.V.; Mayya, A.; Vaidya, U.W.; Premraj, M.K.; Patil, N.B.

    1994-01-01

    A computerized system for disturbance monitoring, recording and display has been developed for use in nuclear power plants and is versatile enough to be used wherever a large number of parameters need to be recorded, e.g. conventional power plants, the chemical industry, etc. The Disturbance Recording System (DRS) has been designed to continuously monitor a process plant and record crucial parameters. The DRS provides a centralized facility to monitor and continuously record 64 process parameters scanned every 1 sec for 5 days. The system also provides a facility for storage of 64 parameters scanned every 200 msec during 2 minutes prior to and 3 minutes after a disturbance. In addition, the system can initiate, on demand, the recording of 8 parameters at a fast rate of every 5 msec for a period of 5 sec, and thus act as a visicorder. All this data is recorded in non-volatile memory and can be displayed, printed/plotted and used for subsequent analysis. Since data can be stored densely on floppy disks, the volume of space required for archival storage is also low. As a disturbance recorder, the DRS allows the operator to view the state of the plant prior to occurrence of the disturbance and helps in identifying the root cause. (author). 10 refs., 7 figs
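
    A minimal sketch of the pre/post-trigger capture idea described above: a rolling buffer keeps the most recent pre-disturbance samples, and once a disturbance is flagged those samples are frozen and a fixed number of post-disturbance samples is appended. The class interface and buffer sizes are hypothetical and do not describe the actual DRS design.

        from collections import deque

        class DisturbanceRecorder:
            def __init__(self, pre, post):
                self.history = deque(maxlen=pre)   # rolling pre-trigger window
                self.post = post
                self.record = None                 # frozen capture, None until triggered
                self.to_collect = 0

            def push(self, sample, disturbed=False):
                if self.record is None:
                    self.history.append(sample)
                    if disturbed:
                        self.record = list(self.history)
                        self.to_collect = self.post
                elif self.to_collect > 0:
                    self.record.append(sample)
                    self.to_collect -= 1

            def completed(self):
                # the stored record once the post-disturbance window is full
                return self.record if self.record is not None and self.to_collect == 0 else None

        # e.g. 2 minutes before and 3 minutes after a trigger at a 200 ms scan rate
        recorder = DisturbanceRecorder(pre=600, post=900)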

  11. The effect of electronic health record software design on resident documentation and compliance with evidence-based medicine.

    Directory of Open Access Journals (Sweden)

    Yasaira Rodriguez Torres

    Full Text Available This study aimed to determine the role of electronic health record software in resident education by evaluating documentation of 30 elements extracted from the American Academy of Ophthalmology Dry Eye Syndrome Preferred Practice Pattern. The Kresge Eye Institute transitioned to using electronic health record software in June 2013. We evaluated the charts of 331 patients examined in the resident ophthalmology clinic between September 1, 2011, and March 31, 2014, for an initial evaluation for dry eye syndrome. We compared documentation rates for the 30 evidence-based elements between electronic health record chart note templates among the ophthalmology residents. Overall, significant changes in documentation occurred when transitioning to a new version of the electronic health record software, with average compliance ranging from 67.4% to 73.6%. Electronic Health Record A had high compliance (>90%) in 13 elements, while Electronic Health Record B had high compliance (>90%) in 11 elements. The presence of dialog boxes was responsible for significant changes in documentation of adnexa, puncta, proptosis, skin examination, contact lens wear, and smoking exposure. Significant differences in documentation were correlated with electronic health record template design rather than individual resident or residents' year in training. Our results show that electronic health record template design influences documentation across all resident years. Decreased documentation likely results from "mouse click fatigue" as residents had to access multiple dialog boxes to complete documentation. These findings highlight the importance of EHR template design to improve resident documentation and integration of evidence-based medicine into their clinical notes.

  12. Audit of Medical Records of Shahid Madani Hospital

    Directory of Open Access Journals (Sweden)

    Mohammad farough-khosravi

    2016-12-01

    Full Text Available Background and Objectives: Evaluating the quality of services and provided care by comparing them with existing standards, in order to identify and prioritize problems and try to fix them, is an important step in the audit of clinical functions. This study aimed to improve the quality of medical record registration for patients admitted to Shahid Madani hospital and audits the records kept for them. Material and Methods: To perform this study, data were collected using a researcher checklist. Data from 30 medical records were gathered. We used the Minitab and SPSS software packages to develop statistical process control charts and to perform statistical analysis of the data, respectively. Results: By plotting control charts, we identified three special causes in the ADMISSION AND DISCHARGE SUMMARY SHEET, four in the SUMMARY SHEET, and three in the CONSULTATION REQUEST SHEET. Lack of the standard form (non-delivered copies of the summary form, 90%), lack of the main form in the patient's clinical record (83.3%), and lack of the patient's procedure (73.3%) ranked as the most frequent defects in the SUMMARY SHEET. In the CONSULTATION REQUEST SHEET, failure to comply with the doctor's stamp and signature standard (20%) had the highest percentage of defects. In the ADMISSION AND DISCHARGE SUMMARY SHEET, nonconformity with standard recording of the patient's duration of stay (100%), coding of diseases based on ICD (100%), recording of the patient number based on signs and symptoms (93.3%), and use of abbreviations to record the diagnoses (93.3%) had the highest percentages of defects, respectively. Conclusion: Based on the results of this study and noting that the studied standards are within the control limits of the statistical process control charts, the quality of standards and the documentation of the records
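
    As a sketch of the statistical process control analysis referred to above, the following computes p-chart control limits for the proportion of defective sheets per audited batch; the defect counts and batch size are invented for illustration.

        import numpy as np

        def p_chart_limits(defect_counts, sample_size):
            # centre line = pooled defect proportion; limits = +/- 3 sigma
            p = np.asarray(defect_counts, dtype=float) / sample_size
            p_bar = p.mean()
            sigma = np.sqrt(p_bar * (1.0 - p_bar) / sample_size)
            ucl = min(1.0, p_bar + 3 * sigma)
            lcl = max(0.0, p_bar - 3 * sigma)
            out_of_control = np.where((p > ucl) | (p < lcl))[0]
            return p_bar, lcl, ucl, out_of_control

        # e.g. defective summary sheets found in successive batches of 30 records
        print(p_chart_limits([5, 7, 4, 12, 6], sample_size=30))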

  13. Techniques for extracting single-trial activity patterns from large-scale neural recordings

    Science.gov (United States)

    Churchland, Mark M; Yu, Byron M; Sahani, Maneesh; Shenoy, Krishna V

    2008-01-01

    Summary Large, chronically-implanted arrays of microelectrodes are an increasingly common tool for recording from primate cortex, and can provide extracellular recordings from many (order of 100) neurons. While the desire for cortically-based motor prostheses has helped drive their development, such arrays also offer great potential to advance basic neuroscience research. Here we discuss the utility of array recording for the study of neural dynamics. Neural activity often has dynamics beyond that driven directly by the stimulus. While governed by those dynamics, neural responses may nevertheless unfold differently for nominally identical trials, rendering many traditional analysis methods ineffective. We review recent studies – some employing simultaneous recording, some not – indicating that such variability is indeed present both during movement generation, and during the preceding premotor computations. In such cases, large-scale simultaneous recordings have the potential to provide an unprecedented view of neural dynamics at the level of single trials. However, this enterprise will depend not only on techniques for simultaneous recording, but also on the use and further development of analysis techniques that can appropriately reduce the dimensionality of the data, and allow visualization of single-trial neural behavior. PMID:18093826
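
    As a hedged illustration of the dimensionality-reduction step discussed here (a plain PCA projection, not the specific techniques used in the cited studies), single-trial population activity can be visualized in a low-dimensional space as follows.

        import numpy as np

        def single_trial_pca(trials, n_components=3):
            # trials: array of shape (n_trials, n_timepoints, n_neurons);
            # project every timepoint of every trial onto the leading PCs
            n_trials, n_time, n_neurons = trials.shape
            flat = trials.reshape(-1, n_neurons)
            flat = flat - flat.mean(axis=0)
            _, _, vt = np.linalg.svd(flat, full_matrices=False)
            components = vt[:n_components]
            return (flat @ components.T).reshape(n_trials, n_time, n_components)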

  14. Controlled dissemination of Electronic Medical Records

    NARCIS (Netherlands)

    van 't Noordende, G.

    2011-01-01

    Building upon a security analysis of the Dutch electronic patient record system, this paper describes an approach to construct a fully decentralized patient record system, using controlled disclosure of references to medical records. This paper identifies several paths that can be used to disclose

  15. Prevalence and etiology of false normal aEEG recordings in neonatal hypoxic-ischaemic encephalopathy

    OpenAIRE

    Marics, Gábor; Csekő, Anna; Vásárhelyi, Barna; Zakariás, Dávid; Schuster, György; Szabó, Miklós

    2013-01-01

    Background Amplitude-integrated electroencephalography (aEEG) is a useful tool to determine the severity of neonatal hypoxic-ischemic encephalopathy (HIE). Our aim was to assess the prevalence and study the origin of false normal aEEG recordings based on 85 aEEG recordings registered before six hours of age. Methods Raw EEG recordings were reevaluated retrospectively with Fourier analysis to identify and describe the frequency patterns of the raw EEG signal, in cases with inconsistent aEEG re...
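
    A minimal sketch of the kind of Fourier analysis described (windowed FFT of a raw EEG epoch and the dominant frequency inside a band of interest); the band limits and windowing are assumptions, not the authors' exact procedure.

        import numpy as np

        def dominant_frequency(raw_eeg, fs, band=(0.5, 30.0)):
            # power spectrum of one raw EEG epoch and the peak frequency in-band
            spectrum = np.abs(np.fft.rfft(raw_eeg * np.hanning(len(raw_eeg)))) ** 2
            freqs = np.fft.rfftfreq(len(raw_eeg), d=1.0 / fs)
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            return freqs[in_band][np.argmax(spectrum[in_band])]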

  16. A knowledge-based taxonomy of critical factors for adopting electronic health record systems by physicians: a systematic literature review

    Directory of Open Access Journals (Sweden)

    Martínez-García Ana I

    2010-10-01

    Full Text Available Abstract Background The health care sector is an area of social and economic interest in several countries; therefore, there have been lots of efforts in the use of electronic health records. Nevertheless, there is evidence suggesting that these systems have not been adopted as expected, and although there are some proposals to support their adoption, the proposed support is not by means of information and communication technology which can provide automatic tools of support. The aim of this study is to identify the critical adoption factors for electronic health records by physicians and to use them as a guide to support their adoption process automatically. Methods This paper presents, based on the PRISMA statement, a systematic literature review in electronic databases with adoption studies of electronic health records published in English. Software applications that manage and process the data in the electronic health record have been considered, i.e.: computerized physician prescription, electronic medical records, and electronic capture of clinical data. Our review was conducted with the purpose of obtaining a taxonomy of the physicians' main barriers for adopting electronic health records that can be addressed by means of information and communication technology; in particular with the information technology roles of the knowledge management processes. This takes us to the question that we want to address in this work: "What are the critical adoption factors of electronic health records that can be supported by information and communication technology?". Reports from eight databases covering electronic health records adoption studies in the medical domain, in particular those focused on physicians, were analyzed. Results The review identifies two main issues: 1) a knowledge-based classification of critical factors for adopting electronic health records by physicians; and 2) the definition of a base for the design of a conceptual

  17. A knowledge-based taxonomy of critical factors for adopting electronic health record systems by physicians: a systematic literature review.

    Science.gov (United States)

    Castillo, Víctor H; Martínez-García, Ana I; Pulido, J R G

    2010-10-15

    The health care sector is an area of social and economic interest in several countries; therefore, there have been lots of efforts in the use of electronic health records. Nevertheless, there is evidence suggesting that these systems have not been adopted as expected, and although there are some proposals to support their adoption, the proposed support is not by means of information and communication technology which can provide automatic tools of support. The aim of this study is to identify the critical adoption factors for electronic health records by physicians and to use them as a guide to support their adoption process automatically. This paper presents, based on the PRISMA statement, a systematic literature review in electronic databases with adoption studies of electronic health records published in English. Software applications that manage and process the data in the electronic health record have been considered, i.e.: computerized physician prescription, electronic medical records, and electronic capture of clinical data. Our review was conducted with the purpose of obtaining a taxonomy of the physicians' main barriers for adopting electronic health records that can be addressed by means of information and communication technology; in particular with the information technology roles of the knowledge management processes. This takes us to the question that we want to address in this work: "What are the critical adoption factors of electronic health records that can be supported by information and communication technology?". Reports from eight databases covering electronic health records adoption studies in the medical domain, in particular those focused on physicians, were analyzed. The review identifies two main issues: 1) a knowledge-based classification of critical factors for adopting electronic health records by physicians; and 2) the definition of a base for the design of a conceptual framework for supporting the design of knowledge-based

  18. Based on records of Three Gorge Telemetric Seismic Network to analyze Vibration process of micro fracture of rock landslide

    Science.gov (United States)

    WANG, Q.

    2017-12-01

    We used the finite element analysis software GeoStudio to establish a vibration analysis model of the Qianjiangping landslide, which is located in the Three Gorges Reservoir area. In the QUAKE/W module, we chose appropriate dynamic elasticity moduli and Poisson's ratios for the soil layer and rock stratum. For loading, we selected waveform data recorded by the Three Gorge Telemetric Seismic Network as the input ground motion, comprising five rupture events recorded at the Lujiashan seismic station. In the dynamic simulation, we mainly focused on the sliding process while the earthquake record was applied. The simulation results show that the Qianjiangping landslide was not only affected by its own static forces, but also experienced the dynamic process of micro fracture-creep-slip rupture-creep-slip. This provides a new approach to assessing the feasibility of early warning for rock landslides in future research.

  19. One positive impact of health care reform to physicians: the computer-based patient record.

    Science.gov (United States)

    England, S P

    1993-11-01

    The health care industry is an information-dependent business that will require a new generation of health information systems if successful health care reform is to occur. We critically need integrated clinical management information systems to support the physician and related clinicians at the direct care level, which in turn will have linkages with secondary users of health information such as health payors, regulators, and researchers. The economic dependence of the health care industry on the CPR cannot be underestimated, says Jeffrey Ritter. He sees the U.S. health industry as about to enter a bold new age where our records are electronic, our computers are interconnected, and our money is nothing but pulses running across the telephone lines. Hence the United States is now in an age of electronic commerce. Clinical systems reform must begin with the community-based patient chart, which is located in the physician's office, the hospital, and other related health care provider offices. A community-based CPR and CPR system that integrates all providers within a managed care network is the most logical step since all health information begins with the creation of a patient record. Once a community-based CPR system is in place, the physician and his or her clinical associates will have a common patient record to which all direct providers have access for inputting and recording patient information. Once a community-level CPR system is in place with a community provider network, each physician will have available health information and data processing capability that will finally provide real savings in professional time and effort. Lost patient charts will no longer be a problem. Data input and storage of health information would occur electronically via transcribed text, voice, and document imaging. All electronic clinical information, voice, and graphics could be recalled at any time and transmitted to any terminal location within the health provider network. Hence

  20. Selecting a summation base in diffraction transformation of seismic recordings (in an example of Northern Sakhalin)

    Energy Technology Data Exchange (ETDEWEB)

    Bulatov, M.G.; Telegin, A.N.

    1984-01-01

    The effect of the dimensions of a processing base on the results of diffraction transformation of seismic recordings is examined. A formula is cited for rating the optimal summation base on the basis of a proposed range of slant angles of the reflecting boundaries. The recommendations for selecting a processing base are confirmed by factual material.

  1. Groundwater potentiality mapping using geoelectrical-based aquifer hydraulic parameters: A GIS-based multi-criteria decision analysis modeling approach

    Directory of Open Access Journals (Sweden)

    Kehinde Anthony Mogaji; Hwee San Lim

    2017-01-01

    Full Text Available This study conducted a robust analysis of acquired 2D resistivity imaging data and borehole pumping test (BPT) records to optimize groundwater potentiality mapping in Perak province, Malaysia, using derived aquifer hydraulic properties. The transverse resistance (TR) parameter was determined from the interpreted 2D resistivity imaging data by applying the Dar-Zarrouk parameter equation. Linear regression and GIS techniques were used to regress the estimated TR values against the aquifer transmissivity values extracted from the geospatially produced BPT records-based aquifer transmissivity map, to develop the aquifer transmissivity parameter predictive (ATPP) model. The ATPP model, whose reliability was evaluated using the Theil inequality coefficient measurement approach, was used to establish geoelectrical-based hydraulic parameter (GHP) modeling equations for the modeling of transmissivity (Tr), hydraulic conductivity (K), storativity (St), and hydraulic diffusivity (D) properties. The GHP modeling equations applied to the delineated aquifer media were used to produce aquifer potential conditioning factor maps for Tr, K, St, and D. The maps were combined to develop an aquifer potential mapping index (APMI) model by applying the multi-criteria decision analysis-analytic hierarchy process principle. The area groundwater reservoir productivity potential model map produced from the processed APMI model estimates in the GIS environment was found to be 71% accurate. This study establishes a good alternative approach for determining aquifer hydraulic parameters, even in areas where pumping test information is unavailable, using cost-effective geophysical data. The produced map can be explored for hydrological decision making.
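
    To make the Dar-Zarrouk step concrete, the sketch below computes the transverse resistance of a layered section (thickness-weighted sum of layer resistivities) and regresses it against pumping-test transmissivity; every layer value and transmissivity here is invented, and the simple least-squares fit stands in for the study's full GIS workflow.

        import numpy as np

        def transverse_resistance(thicknesses_m, resistivities_ohm_m):
            # Dar-Zarrouk transverse resistance TR = sum(h_i * rho_i)
            return float(np.dot(thicknesses_m, resistivities_ohm_m))

        # hypothetical sites: layer thicknesses/resistivities and measured transmissivity
        tr = np.array([transverse_resistance(h, r) for h, r in [
            ([12.0, 20.0], [85.0, 240.0]),
            ([8.0, 25.0], [60.0, 310.0]),
            ([15.0, 18.0], [95.0, 180.0]),
        ]])
        transmissivity = np.array([310.0, 420.0, 280.0])   # m^2/day, invented
        slope, intercept = np.polyfit(tr, transmissivity, 1)
        print(f"Tr ~= {slope:.4f} * TR + {intercept:.1f}")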

  2. Use and Characteristics of Electronic Health Record Systems among Office-Based Physician Practices: United States, ...

    Science.gov (United States)

    ... the National Technical Information Service NCHS Use and Characteristics of Electronic Health Record Systems Among Office-based ... physicians that collects information on physician and practice characteristics, including the adoption and use of EHR systems. ...

  3. BASE Temperature Data Record (TDR) from the SSM/I and SSMIS Sensors, CSU Version 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BASE Temperature Data Record (TDR) dataset from Colorado State University (CSU) is a collection of the raw unprocessed antenna temperature data that has been...

  4. Analysis on the Correlation of Traffic Flow in Hainan Province Based on Baidu Search

    Science.gov (United States)

    Chen, Caixia; Shi, Chun

    2018-03-01

    Internet search data record users' search attention and consumer demand, providing the necessary database for the Hainan traffic flow model. Based on the Baidu Index, and taking Hainan traffic flow as an example, this paper conducts both qualitative and quantitative analyses of the relationship between search keywords from the Baidu Index and actual Hainan tourist traffic flow, and builds a multiple regression model in SPSS.
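
    The record mentions a multiple regression model built in SPSS; an equivalent sketch in Python with statsmodels is shown below, on invented weekly search-index and traffic-flow figures (the keyword names and values are placeholders, not Baidu Index data).

        import numpy as np
        import statsmodels.api as sm

        search_kw1 = np.array([120, 140, 90, 200, 180, 160, 210, 170], dtype=float)
        search_kw2 = np.array([80, 95, 60, 150, 130, 110, 160, 120], dtype=float)
        traffic = np.array([10.2, 11.5, 8.1, 16.0, 14.8, 13.0, 16.9, 13.8])  # 10k trips

        # ordinary least squares: traffic flow regressed on the two search indices
        X = sm.add_constant(np.column_stack([search_kw1, search_kw2]))
        model = sm.OLS(traffic, X).fit()
        print(model.params, model.rsquared)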

  5. IMASIS computer-based medical record project: dealing with the human factor.

    Science.gov (United States)

    Martín-Baranera, M; Planas, I; Palau, J; Sanz, F

    1995-01-01

    level, problems to be solved in utilization of the system, errors detected in the systems' database, and the personal interest in participating in the IMASIS project. The questionnaire was also intended to be a tool to monitor IMASIS evolution. Our study showed that medical staff had a lack of information about the current HIS, leading to a poor utilization of some system options. Another major characteristic, related to the above, was the feeling that the project would negatively affect the organization of work at the hospitals. A computer-based medical record was feared to degrade the physician-patient relationship, introduce a supplementary administrative burden into clinicians' day-to-day work, unnecessarily slow history taking, and imply too-rigid patterns of work. The most frequent problems in using the current system could be classified into two groups: problems related to lack of agility and consistency in user interface design, and those derived from lack of a common patient identification number. Duplication of medical records was the most frequent error detected by physicians. Analysis of physicians' attitudes towards IMASIS revealed a lack of confidence globally. This was probably the consequence of two current features: a lack of complete information about IMASIS possibilities and problems faced when using the system. To deal with such factors, three types of measures have been planned. First, an effort is to be made to ensure that every physician is able to adequately use the current system and understands the long-term benefits of the project. This task will be better accomplished by personal interaction between clinicians and a physician from the Informatics Department than through formal teaching of IMASIS. Secondly, a protocol for evaluating the HIS is being developed and will be systematically applied to detect both database errors and the system's design pitfalls. Finally, the IMASIS project has to find a convenient point for starting, to offer short-term re

  6. Analyzing the reliability of volcanic and archeomagnetic data by comparison with historical records

    Science.gov (United States)

    Arneitz, Patrick; Egli, Ramon; Leonhardt, Roman

    2017-04-01

    Records of the past geomagnetic field are obtained from historical observations (direct records) on the one hand, and by the magnetization acquired by archeological artifacts, rocks and sediments (indirect records) on the other hand. Indirect records are generally less reliable than direct ones due to recording mechanisms that cannot be fully reproduced in the laboratory, age uncertainties and alteration problems. Therefore, geomagnetic field modeling approaches must deal with random and systematic errors of field values and age estimates that are hard to assess. Here, we present a new approach to investigate the reliability of volcanic and archeomagnetic data, which is based on comparisons with historical records. Temporal and spatial mismatches between data are handled by the implementation of weighting functions and error estimates derived from a stochastic model of secular variation. Furthermore, a new strategy is introduced for the statistical analysis of inhomogeneous and internally correlated data sets. Application of these new analysis tools to an extended database including direct and indirect records shows an overall good agreement between different record categories. Nevertheless, some biases exist between selected material categories, laboratory procedures, and quality checks/corrections (e.g., inclination shallowing of volcanic records). These findings can be used to obtain a better understanding of error sources affecting indirect records, thereby facilitating more reliable reconstructions of the geomagnetic past.

  7. Electronic tools for health information exchange: an evidence-based analysis.

    Science.gov (United States)

    2013-01-01

    As patients experience transitions in care, there is a need to share information between care providers in an accurate and timely manner. With the push towards electronic medical records and other electronic tools (eTools) (and away from paper-based health records) for health information exchange, there remains uncertainty around the impact of eTools as a form of communication. To examine the impact of eTools for health information exchange in the context of care coordination for individuals with chronic disease in the community. A literature search was performed on April 26, 2012, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published until April 26, 2012 (no start date limit was applied). A systematic literature search was conducted, and meta-analysis conducted where appropriate. Outcomes of interest fell into 4 categories: health services utilization, disease-specific clinical outcomes, process-of-care indicators, and measures of efficiency. The quality of the evidence was assessed individually for each outcome. Expert panels were assembled for stakeholder engagement and contextualization. Eleven articles were identified (4 randomized controlled trials and 7 observational studies). There was moderate quality evidence of a reduction in hospitalizations, hospital length of stay, and emergency department visits following the implementation of an electronically generated laboratory report with recommendations based on clinical guidelines. The evidence showed no difference in disease-specific outcomes; there was no evidence of a positive impact on process-of-care indicators or measures of efficiency. A limited body of research specifically examined eTools for health information exchange in the population and setting of interest. This evidence included a

  8. Improved Overpressure Recording and Modeling for Near-Surface Explosion Forensics

    Science.gov (United States)

    Kim, K.; Schnurr, J.; Garces, M. A.; Rodgers, A. J.

    2017-12-01

    The accurate recording and analysis of air-blast acoustic waveforms is a key component of the forensic analysis of explosive events. Smartphone apps can enhance traditional technologies by providing scalable, cost-effective ubiquitous sensor solutions for monitoring blasts, undeclared activities, and inaccessible facilities. During a series of near-surface chemical high explosive tests, iPhone 6 devices running the RedVox infrasound recorder app were co-located with high-fidelity Hyperion overpressure sensors, allowing for direct comparison of the resolution and frequency content of the devices. Data from the traditional sensors are used to characterize blast signatures and to determine relative iPhone microphone amplitude and phase responses. A Wiener-filter-based source deconvolution method is applied, using a parameterized source function estimated from traditional overpressure sensor data, to estimate system responses. In addition, progress on a new parameterized air-blast model is presented. The model is based on the analysis of a large set of overpressure waveforms from several surface explosion test series. An appropriate functional form with parameters determined empirically from modern air-blast and acoustic data will allow for better parameterization of signals and the improved characterization of explosive sources.
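
    A minimal frequency-domain sketch of the Wiener-filter deconvolution idea mentioned above; here the noise-to-signal ratio is a single assumed constant, rather than the parameterized source function estimated by the authors from the overpressure sensors.

        import numpy as np

        def wiener_deconvolve(recorded, system_response, noise_to_signal=1e-2):
            # recover an estimate of the source waveform from a recording,
            # given an estimate of the sensor/propagation impulse response
            n = len(recorded)
            R = np.fft.rfft(recorded, n)
            H = np.fft.rfft(system_response, n)
            G = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)   # Wiener filter
            return np.fft.irfft(R * G, n)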

  9. Rehabilitation System based on the Use of Biomechanical Analysis and Videogames through the Kinect Sensor

    Directory of Open Access Journals (Sweden)

    John E. Muñoz-Cardona

    2013-11-01

    Full Text Available This paper presents the development of a novel system for the physical rehabilitation of patients with multiple pathologies, through exercise videogames (exergames) and analysis of the patients' movements using developed software. The system is based on the use of the Kinect sensor for both purposes: amusing the patient in therapy through specialist exergames, and providing a tool to record and analyze MoCap data captured with the Kinect sensor and processed using biomechanical analysis through Euler angles. The whole interactive system is installed in a rehabilitation center and works with different pathologies (stroke, IMOC, cranioencephalic trauma, etc.); patients interact with the platform while the specialist records data for later analysis, which is performed by software designed for this purpose. The motion graphics are shown in the sagittal, frontal and rotational planes from 20 points distributed over the body. The final system is portable, non-invasive and inexpensive, offers natural interaction with the patient, and is easily implemented for medical purposes.
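
    The record describes biomechanical analysis of the Kinect MoCap data through Euler angles; as a simpler, hedged illustration (a single segment-to-segment joint angle rather than a full Euler-angle decomposition), an angle can be computed from three tracked 3-D joint positions as follows.

        import numpy as np

        def joint_angle(parent, joint, child):
            # angle (degrees) at `joint` between the segments joint->parent
            # and joint->child, using Kinect 3-D joint positions in metres
            u = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
            v = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
            cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

        # e.g. elbow flexion from hypothetical shoulder, elbow and wrist positions
        print(joint_angle([0.00, 1.40, 2.00], [0.25, 1.15, 2.00], [0.30, 0.90, 1.85]))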

  10. The environmental correlates of overall and neighborhood based recreational walking (a cross-sectional analysis of the RECORD Study).

    Science.gov (United States)

    Chaix, Basile; Simon, Chantal; Charreire, Hélène; Thomas, Frédérique; Kestens, Yan; Karusisi, Noëlla; Vallée, Julie; Oppert, Jean-Michel; Weber, Christiane; Pannier, Bruno

    2014-02-21

    Preliminary evidence suggests that recreational walking has different environmental determinants than utilitarian walking. However, previous studies are limited in their assessment of environmental exposures and recreational walking and in the applied modeling strategies. Accounting for individual sociodemographic profiles and weather over the walking assessment period, the study examined whether numerous street network-based neighborhood characteristics related to the sociodemographic, physical, service, social-interactional, and symbolic environments were associated with overall recreational walking and recreational walking in one's residential neighborhood and could explain their spatial distribution. Based on the RECORD Cohort Study (Paris region, France, n=7105, 2007-2008 data), multilevel-spatial regression analyses were conducted to investigate environmental factors associated with recreational walking (evaluated by questionnaire at baseline). A risk score approach was applied to quantify the overall disparities in recreational walking that were predicted by the environmental determinants. Sixty-nine percent of the participants reported recreational walking over the past 7 days. Their mean reported recreational walking time was 3 h 31 min. After individual-level adjustment, a higher neighborhood education, a higher density of destinations, green and open spaces of quality, and the absence of exposure to air traffic were associated with higher odds of recreational walking and/or a higher recreational walking time in one's residential neighborhood. In terms of the overall disparities predicted by these environmental factors, the odds of reporting recreational walking and the odds of a higher recreational walking time in one's neighborhood were, respectively, 1.59 [95% confidence interval (CI): 1.56, 1.62] times and 1.81 (95% CI: 1.73, 1.87) times higher in the most vs. the least supportive environments (based on the quartiles). Providing green/open spaces of

  11. Objectively recorded physical activity in early pregnancy: a multiethnic population-based study.

    Science.gov (United States)

    Berntsen, S; Richardsen, K R; Mørkrid, K; Sletner, L; Birkeland, K I; Jenum, A K

    2014-06-01

    This study aimed to compare objectively recorded physical activity (PA) levels and walking steps among pregnant women. Cross-sectional data were drawn from a multiethnic cohort (n = 823) of pregnant women, consisting of 44% from Western countries, 24% from South Asia, 14% from the Middle East, and 18% from other countries. PA and steps were recorded by the activity monitor SenseWear™ Pro3 Armband. A total of 678 women were included in the analysis. Western women walked significantly more steps and had higher moderate-to-vigorous-intensity physical activity (MVPA) levels compared with South Asian women per weekday and weekend day. Interaction terms (P = 0.008) between ethnicity (Western vs South Asian) and parity, and education, respectively, were identified: having ≥ 1 children was positively associated with steps during weekends in South Asians in contrast to Western women. Western women with children and South Asian women without children and with higher education may have an elevated risk for an inactive lifestyle during pregnancy. © 2012 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. GEPAS, a web-based tool for microarray data analysis and interpretation

    Science.gov (United States)

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state-of-the-art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module allows automating the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most quoted web tool in its field and it is extensively used by researchers of many countries and its records indicate an average usage rate of 500 experiments per day. GEPAS, is available at http://www.gepas.org. PMID:18508806

  13. What records have we been breaking?

    Science.gov (United States)

    Bartholow, J.M.; Milhous, R.

    2002-01-01

    "Today was another record-breaking day," the evening radio or television declares. High temperatures, low temperatures, floods, drought - take your choice. But how can we put these pronouncements in perspective? What do they really mean?We present two types of information in this article: 1) an analysis of daily air temperature and precipitation for Fort Collins and 2) an analysis of annual precipitation for Fort Collins. Each analysis provides a different meaning to the statement about a record-breaking day or year.

  14. Extended seizure detection algorithm for intracranial EEG recordings

    DEFF Research Database (Denmark)

    Kjaer, T. W.; Remvig, L. S.; Henriksen, J.

    2010-01-01

    Objective: We implemented and tested an existing seizure detection algorithm for scalp EEG (sEEG) with the purpose of adapting it to intracranial EEG (iEEG) recordings. Method: iEEG was obtained from 16 patients with focal epilepsy undergoing work-up for resective epilepsy surgery. Each patient had 4 or 5 recorded seizures, and 24 hours of non-ictal data were used for evaluation. Data from three electrodes placed at the ictal focus were used for the analysis. A wavelet-based feature extraction algorithm delivered input to a support vector machine (SVM) classifier for distinction between ictal and non-ictal iEEG. We compare our results to a method published by Shoeb in 2004. While the original method on sEEG was optimal with the use of only four subbands in the wavelet analysis, we found that better seizure detection could be made if all subbands were used for iEEG. Results: When using
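
    A rough sketch of the wavelet-feature/SVM pipeline described above, using the energy of all wavelet subbands as suggested for iEEG; the wavelet family, epoch length and the random training data are placeholders, not the authors' configuration.

        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def wavelet_features(epoch, wavelet="db4", level=5):
            # energy of every wavelet subband of one iEEG epoch
            coeffs = pywt.wavedec(epoch, wavelet, level=level)
            return np.array([np.sum(c ** 2) for c in coeffs])

        # placeholder training set: rows are epochs, labels 1 = ictal, 0 = non-ictal
        rng = np.random.default_rng(1)
        epochs = rng.normal(size=(40, 1024))
        labels = np.array([0, 1] * 20)
        X = np.vstack([wavelet_features(e) for e in epochs])
        clf = SVC(kernel="rbf").fit(X, labels)
        print(clf.predict(X[:5]))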

  15. Reclink: an application for database linkage implementing the probabilistic record linkage method

    Directory of Open Access Journals (Sweden)

    Kenneth R. de Camargo Jr.

    2000-06-01

    Full Text Available This paper presents a system for database linkage based on the probabilistic record linkage technique, developed in the C++ language with the Borland C++ Builder version 3.0 programming environment. The system was tested in the linkage of data sources of different sizes, evaluated both in terms of processing time and sensitivity for identifying true record pairs. Significantly less time was spent in record processing when the program was used, as compared to manual processing, especially in situations where larger databases were used. Manual and automatic processes had equivalent sensitivities in situations where we used databases with fewer records. However, as the number of records grew we noticed a clear reduction in the sensitivity of the manual process, but not in the automatic one. Although in its initial stage of development, the system showed good performance in terms of both speed and sensitivity. Although the performance of the algorithms used was satisfactory, the objective is to evaluate other routines, seeking to improve the system's performance.
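
    Reclink itself is implemented in C++; as a language-agnostic illustration of the probabilistic record linkage scoring it is based on, a Fellegi-Sunter-style match weight over a few comparison fields can be sketched as follows (the m- and u-probabilities are invented).

        import math

        def match_weight(agreements, m_probs, u_probs):
            # sum of log2(m/u) for agreeing fields and log2((1-m)/(1-u)) otherwise;
            # high scores suggest the two records refer to the same person
            weight = 0.0
            for agree, m, u in zip(agreements, m_probs, u_probs):
                weight += math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))
            return weight

        # e.g. name and birth date agree, address disagrees (illustrative values)
        print(match_weight([True, True, False],
                           m_probs=[0.95, 0.98, 0.90],
                           u_probs=[0.01, 0.002, 0.05]))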

  16. Using Web-Based Questionnaires and Obstetric Records to Assess General Health Characteristics Among Pregnant Women: A Validation Study.

    Science.gov (United States)

    van Gelder, Marleen M H J; Schouten, Naomi P E; Merkus, Peter J F M; Verhaak, Chris M; Roeleveld, Nel; Roukema, Jolt

    2015-06-16

    Self-reported medical history information is included in many studies. However, data on the validity of Web-based questionnaires assessing medical history are scarce. If proven to be valid, Web-based questionnaires may provide researchers with an efficient means to collect data on this parameter in large populations. The aim of this study was to assess the validity of a Web-based questionnaire on chronic medical conditions, allergies, and blood pressure readings against obstetric records and data from general practitioners. Self-reported questionnaire data were compared with obstetric records for 519 pregnant women participating in the Dutch PRegnancy and Infant DEvelopment (PRIDE) Study from July 2011 through November 2012. These women completed Web-based questionnaires around their first prenatal care visit and in gestational weeks 17 and 34. We calculated kappa statistics (κ) and the observed proportions of positive and negative agreement between the baseline questionnaire and obstetric records for chronic conditions and allergies. In case of inconsistencies between these 2 data sources, medical records from the woman's general practitioner were consulted as the reference standard. For systolic and diastolic blood pressure, intraclass correlation coefficients (ICCs) were calculated for multiple data points. Agreement between the baseline questionnaire and the obstetric record was substantial (κ=.61) for any chronic condition and moderate for any allergy (κ=.51). For specific conditions, we found high observed proportions of negative agreement (range 0.88-1.00) and on average moderate observed proportions of positive agreement with a wide range (range 0.19-0.90). Using the reference standard, the sensitivity of the Web-based questionnaire for chronic conditions and allergies was comparable to or even better than the sensitivity of the obstetric records, in particular for migraine (0.90 vs 0.40, P=.02), asthma (0.86 vs 0.61, P=.04), inhalation allergies (0
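
    For reference, the kappa statistic used in this validation can be computed from two binary sources as in the sketch below; the toy data are illustrative, not study values.

        def cohens_kappa(source_a, source_b):
            # source_a / source_b: 1 = condition reported present, 0 = absent
            n = len(source_a)
            observed = sum(x == y for x, y in zip(source_a, source_b)) / n
            p_yes = (sum(source_a) / n) * (sum(source_b) / n)
            p_no = (1 - sum(source_a) / n) * (1 - sum(source_b) / n)
            expected = p_yes + p_no
            return (observed - expected) / (1 - expected)

        # questionnaire vs obstetric record for six hypothetical women
        print(cohens_kappa([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 0, 0]))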

  17. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    Science.gov (United States)

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
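
    The NEMA integral uniformity compared here reduces to a simple expression over the pixel counts in the chosen field of view; a minimal sketch is shown below (the NEMA-prescribed smoothing and edge-pixel masking steps are omitted).

        import numpy as np

        def integral_uniformity(counts):
            # IU (%) = 100 * (max - min) / (max + min) over the FOV pixel counts
            c = np.asarray(counts, dtype=float)
            return 100.0 * (c.max() - c.min()) / (c.max() + c.min())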

  18. Anonymization of Electronic Medical Records to Support Clinical Analysis

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2013-01-01

    Anonymization of Electronic Medical Records to Support Clinical Analysis closely examines the privacy threats that may arise from medical data sharing, and surveys the state-of-the-art methods developed to safeguard data against these threats. To motivate the need for computational methods, the book first explores the main challenges facing the privacy-protection of medical data using the existing policies, practices and regulations. Then, it takes an in-depth look at the popular computational privacy-preserving methods that have been developed for demographic, clinical and genomic data sharing, and closely analyzes the privacy principles behind these methods, as well as the optimization and algorithmic strategies that they employ. Finally, through a series of in-depth case studies that highlight data from the US Census as well as the Vanderbilt University Medical Center, the book outlines a new, innovative class of privacy-preserving methods designed to ensure the integrity of transferred medical data for su...

  19. Acoustic analysis of snoring sounds recorded with a smartphone according to obstruction site in OSAS patients.

    Science.gov (United States)

    Koo, Soo Kweon; Kwon, Soon Bok; Kim, Yang Jae; Moon, J I Seung; Kim, Young Jun; Jung, Sung Hoon

    2017-03-01

    Snoring is a sign of increased upper airway resistance and is the most common symptom suggestive of obstructive sleep apnea. Acoustic analysis of snoring sounds is a non-invasive diagnostic technique and may provide a screening test that can determine the location of obstruction sites. We recorded snoring sounds according to obstruction level, measured by DISE, using a smartphone and focused on the analysis of formant frequencies. The study group comprised 32 male patients (mean age 42.9 years). The spectrogram pattern, intensity (dB), fundamental frequency (F0), and formant frequencies (F1, F2, and F3) of the snoring sounds were analyzed for each subject. On spectrographic analysis, retropalatal level obstruction tended to produce sharp and regular peaks, while retrolingual level obstruction tended to show peaks with a gradual onset and decay. On formant frequency analysis, F1 (retropalatal level vs. retrolingual level: 488.1 ± 125.8 vs. 634.7 ± 196.6 Hz) and F2 (retropalatal level vs. retrolingual level: 1267.3 ± 306.6 vs. 1723.7 ± 550.0 Hz) of retrolingual level obstructions showed significantly higher values than retropalatal level obstruction, suggesting that a smartphone can be effective for recording snoring sounds.
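
    As a rough illustration of formant-frequency estimation of the kind reported (not the authors' analysis software), linear predictive coding (LPC) roots of a snoring segment can be converted into candidate formants; the LPC order is an assumption.

        import numpy as np
        import librosa

        def estimate_formants(snore_segment, sample_rate, order=12):
            # angles of the LPC polynomial roots above the real axis give
            # candidate formant frequencies in Hz (lowest three returned)
            coeffs = librosa.lpc(np.asarray(snore_segment, dtype=float), order=order)
            roots = [r for r in np.roots(coeffs) if np.imag(r) > 0]
            freqs = sorted(np.angle(r) * sample_rate / (2 * np.pi) for r in roots)
            return freqs[:3]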

  20. Implementation of a next-generation electronic nursing records system based on detailed clinical models and integration of clinical practice guidelines.

    Science.gov (United States)

    Min, Yul Ha; Park, Hyeoun-Ae; Chung, Eunja; Lee, Hyunsook

    2013-12-01

    The purpose of this paper is to describe the components of a next-generation electronic nursing records system ensuring full semantic interoperability and integrating evidence into the nursing records system. A next-generation electronic nursing records system based on detailed clinical models and clinical practice guidelines was developed at Seoul National University Bundang Hospital in 2013. This system has two components, a terminology server and a nursing documentation system. The terminology server manages nursing narratives generated from entity-attribute-value triplets of detailed clinical models using a natural language generation system. The nursing documentation system provides nurses with a set of nursing narratives arranged around the recommendations extracted from clinical practice guidelines. An electronic nursing records system based on detailed clinical models and clinical practice guidelines was successfully implemented in a hospital in Korea. The next-generation electronic nursing records system can support nursing practice and nursing documentation, which in turn will improve data quality.

  1. Comparing a medical records-based and a claims-based index for measuring comorbidity in patients with lung or colon cancer.

    Science.gov (United States)

    Kehl, Kenneth L; Lamont, Elizabeth B; McNeil, Barbara J; Bozeman, Samuel R; Kelley, Michael J; Keating, Nancy L

    2015-05-01

    Ascertaining comorbid conditions in cancer patients is important for research and clinical quality measurement, and is particularly important for understanding care and outcomes for older patients and those with multi-morbidity. We compared the medical records-based ACE-27 index and the claims-based Charlson index in predicting receipt of therapy and survival for lung and colon cancer patients. We calculated the Charlson index using administrative data and the ACE-27 score using medical records for Veterans Affairs patients diagnosed with stage I/II non-small cell lung or stage III colon cancer from January 2003 to December 2004. We compared the proportion of patients identified by each index as having any comorbidity. We used multivariable logistic regression to ascertain the predictive power of each index regarding delivery of guideline-recommended therapies and two-year survival, comparing the c-statistic and the Akaike information criterion (AIC). Overall, 97.2% of lung and 90.9% of colon cancer patients had any comorbidity according to the ACE-27 index, versus 59.5% and 49.7%, respectively, according to the Charlson. Multivariable models including the ACE-27 index outperformed Charlson-based models when assessing receipt of guideline-recommended therapies, with higher c-statistics and lower AICs. Neither index was clearly superior in prediction of two-year survival. The ACE-27 index measured using medical records captured more comorbidity and outperformed the Charlson index measured using administrative data for predicting receipt of guideline-recommended therapies, demonstrating the potential value of more detailed comorbidity data. However, the two indices had relatively similar performance when predicting survival. Copyright © 2015 Elsevier Inc. All rights reserved.
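
    For concreteness, the comparison metrics mentioned here (the AIC and the c-statistic) can be obtained from a fitted logistic model as in the sketch below; the variable names are invented, and statsmodels/scikit-learn are assumptions, not the authors' software.

        import numpy as np
        import statsmodels.api as sm
        from sklearn.metrics import roc_auc_score

        def fit_and_score(comorbidity_index, received_therapy):
            # logistic model predicting receipt of guideline-recommended therapy
            X = sm.add_constant(np.asarray(comorbidity_index, dtype=float))
            model = sm.Logit(received_therapy, X).fit(disp=0)
            c_statistic = roc_auc_score(received_therapy, model.predict(X))
            return model.aic, c_statistic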

  2. Fetal movement detection based on QRS amplitude variations in abdominal ECG recordings.

    Science.gov (United States)

    Rooijakkers, M J; de Lau, H; Rabotti, C; Oei, S G; Bergmans, J W M; Mischi, M

    2014-01-01

    Evaluation of fetal motility can give insight into fetal health, as a strong decrease can be seen as a precursor to fetal death. Typically, the assessment of fetal health by fetal movement detection relies on the maternal perception of fetal activity. The percentage of detected movements is strongly subject-dependent and, even with the undivided attention of the mother, varies between 37% and 88%. Various methods to assist in fetal movement detection exist, based on a wide spectrum of measurement techniques. However, these are typically unsuitable for ambulatory or long-term observation. In this paper, a novel method for fetal motion detection is presented based on amplitude and shape changes in the abdominally recorded fetal ECG. The proposed method has a sensitivity and specificity of 0.67 and 0.90, respectively, outperforming alternative fetal ECG-based methods from the literature.

  3. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    Science.gov (United States)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  4. Pulling smarties out of a bag: a Headed Records analysis of children's recall of their own past beliefs.

    Science.gov (United States)

    Barreau, S; Morton, J

    1999-11-09

    The work reported provides an information processing account of young children's performance on the Smarties task (Perner, J., Leekam, S.R., & Wimmer, H. 1987, Three-year-olds' difficulty with false belief: the case for a conceptual deficit. British Journal of Developmental Psychology, 5, 125-137). In this task, a 3-year-old is shown a Smarties tube and asked about the supposed contents. The true contents, pencils, are then revealed, and the majority of 3-year-olds cannot recall their initial belief that the tube contained Smarties. The theoretical analysis, based on the Headed Records framework (Morton, J., Hammersley, R.J., & Bekerian, D.A. 1985, Headed records: a model for memory and its failures, Cognition, 20, 1-23), focuses on the computational conditions that are required to resolve the Smarties task; on the possible limitations in the developing memory system that may lead to a computational breakdown; and on ways of bypassing such limitations to ensure correct resolution. The design, motivated by this analysis, is a variation on Perner's Smarties task. Instead of revealing the tube's contents immediately after establishing the child's beliefs about it, these contents were transferred to a bag and a (false) belief about the bag's contents established. Only then were the true contents of the bag revealed. The same procedure (with different contents) was carried out a week later. As predicted, children's performance was better (a) in the 'tube' condition; and (b) on the second test. Consistent with the proposed analysis, the data show that when the computational demands imposed by the original task are reduced, young children can and do remember what they had thought about the contents of the tube even after its true contents are revealed.

  5. Analysis of condensed matter physics records in databases. Science and technology indicators in condensed matter physics

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1999-05-01

    An analysis of the literature on Condensed Matter Physics, with particular emphasis on High Temperature Superconductors, was performed on the contents of the bibliographic database International Nuclear Information System (INIS). Quantitative data were obtained on various characteristics of the relevant INIS records such as subject categories, language and country of publication, publication types, etc. The analysis opens up the possibility for further studies, e.g. on international research co-operation and on publication patterns. (author)

  6. Admission medical records made at night time have the same quality as day and evening time records.

    Science.gov (United States)

    Amirian, Ilda; Mortensen, Jacob F; Rosenberg, Jacob; Gögenur, Ismail

    2014-07-01

    A thorough and accurate admission medical record is an important tool in ensuring patient safety during the hospital stay. Surgeons' performance might be affected during night shifts due to sleep deprivation. The aim of the study was to assess the quality of admission medical records during day, evening and night time. A total of 1,000 admission medical records were collected from 2009 to 2013 based equally on four diagnoses: mechanical bowel obstruction, appendicitis, gallstone disease and gastrointestinal bleeding. The records were reviewed for errors by a pre-defined checklist based on Danish standards for admission medical records. The time of dictation for the medical record was registered. A total of 1,183 errors were found in 778 admission medical records made during day- and evening time, and 322 errors in 222 admission medical records from night time shifts. No significant overall difference in error was found in the admission medical records when day and evening values were compared to night values. Subgroup analyses made for all four diagnoses showed no difference in day and evening values compared with night time values. Night time deterioration was not seen in the quality of the medical records.

  7. Use of lecture recordings in dental education: assessment of status quo and recommendations.

    Science.gov (United States)

    Horvath, Zsuzsa; O'Donnell, Jean A; Johnson, Lynn A; Karimbux, Nadeem Y; Shuler, Charles F; Spallek, Heiko

    2013-11-01

    This research project was part of a planned initiative at the University of Pittsburgh School of Dental Medicine to incorporate lecture recordings as standard educational support technologies. The goals of an institutional survey were 1) to gather current data about how dental educators across the United States and Canada use lecture recordings; 2) to determine dental educators' perceived value and outcomes of using lecture recordings; and 3) to develop recommendations based on #1 and #2 for the dental education community. Of the sixty-six North American dental schools at the time of the study, forty-five schools responded to the survey, for a 68 percent response rate. Of the respondents, twenty-eight schools were found to currently conduct lecture recording; these comprised the study sample. This study focused on the dental schools' past experiences with lecture recording; thus, those not currently engaged in lecture recording were excluded from further analysis. The survey questions covered a wide range of topics, such as the scope of the lecture recording, logistics, instructional design considerations, outcomes related to student learning, evaluation and reception, barriers to lecture recording, and issues related to copyright and intellectual property. The literature review and results from the survey showed that no common guidelines for best practice were available regarding lecture recordings in dental education. The article concludes with some preliminary recommendations based on this study.

  8. SU-F-BRA-13: Knowledge-Based Treatment Planning for Prostate LDR Brachytherapy Based On Principal Component Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roper, J; Bradshaw, B; Godette, K; Schreibmann, E [Winship Cancer Institute of Emory University, Atlanta, GA (United States); Chanyavanich, V [Rocky Mountain Cancer Centers, Denver, CO (United States)

    2015-06-15

    Purpose: To create a knowledge-based algorithm for prostate LDR brachytherapy treatment planning that standardizes plan quality using seed arrangements tailored to individual physician preferences while being fast enough for real-time planning. Methods: A dataset of 130 prior cases was compiled for a physician with an active prostate seed implant practice. Ten cases were randomly selected to test the algorithm. Contours from the 120 library cases were registered to a common reference frame. Contour variations were characterized on a point-by-point basis using principal component analysis (PCA). A test case was converted to PCA vectors using the same process and then compared with each library case using a Mahalanobis distance to evaluate similarity. Rank order PCA scores were used to select the best-matched library case. The seed arrangement was extracted from the best-matched case and used as a starting point for planning the test case. Computational time was recorded. Any subsequent modifications were recorded that required input from a treatment planner to achieve an acceptable plan. Results: The computational time required to register contours from a test case and evaluate PCA similarity across the library was approximately 10s. Five of the ten test cases did not require any seed additions, deletions, or moves to obtain an acceptable plan. The remaining five test cases required on average 4.2 seed modifications. The time to complete manual plan modifications was less than 30s in all cases. Conclusion: A knowledge-based treatment planning algorithm was developed for prostate LDR brachytherapy based on principal component analysis. Initial results suggest that this approach can be used to quickly create treatment plans that require few if any modifications by the treatment planner. In general, test case plans have seed arrangements which are very similar to prior cases, and thus are inherently tailored to physician preferences.
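
    The matching step above (PCA scores compared with a Mahalanobis distance, then rank-ordered) can be sketched in a few lines. The sketch below is a generic illustration of that technique under the assumption that each registered contour has already been resampled to a fixed-length coordinate vector; it is not the authors' implementation.

```python
# Minimal sketch of PCA-based case matching with a Mahalanobis distance.
# The random arrays stand in for registered, resampled contour coordinates.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
library = rng.normal(size=(120, 300))        # 120 library cases x 300 contour coordinates
test_case = rng.normal(size=(1, 300))        # new case, registered to the same reference frame

pca = PCA(n_components=10).fit(library)      # characterise contour variation
lib_scores = pca.transform(library)          # PCA scores of the library cases
test_score = pca.transform(test_case)        # PCA scores of the test case

# Mahalanobis distance in PCA space; the score covariance is diagonal, so each
# component can simply be scaled by its variance.
var = lib_scores.var(axis=0)
d2 = ((lib_scores - test_score) ** 2 / var).sum(axis=1)

best_match = int(np.argmin(d2))              # the best-matched case; its seed arrangement
print("best-matched library case:", best_match)   # would seed the new plan
```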

  9. Mobile personal health records for pregnancy monitoring functionalities: Analysis and potential.

    Science.gov (United States)

    Bachiri, Mariam; Idri, Ali; Fernández-Alemán, José Luis; Toval, Ambrosio

    2016-10-01

    Personal Health Records (PHRs) are a rapidly growing area of health information technology. PHR users are able to manage their own health data and communicate with doctors in order to improve healthcare quality and efficiency. Mobile PHR (mPHR) applications for mobile devices have obtained an interesting market quota since the appearance of more powerful mobile devices. These devices allow users to gain access to applications that used to be available only for personal computers. This paper analyzes the functionalities of mobile PHRs that are specific to pregnancy monitoring. A well-known Systematic Literature Review (SLR) protocol was used in the analysis process. A questionnaire was developed for this task, based on the rigorous study of scientific literature concerning pregnancy and applications available on the market, with 9 data items and 35 quality assessments. The data items contain calendars, pregnancy information, health habits, counters, diaries, mobile features, security, backup, configuration and architectural design. A total of 33 mPHRs for pregnancy monitoring, available for iOS and Android, were selected from Apple App store and Google Play store, respectively. The results show that none of the mPHRs selected met 100% of the functionalities analyzed in this paper. The highest score achieved was 77%, while the lowest was 17%. In this paper, these features are discussed and possible paths for future development of similar applications are proposed, which may lead to a more efficient use of smartphone capabilities. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Identifying seizure onset zone from electrocorticographic recordings: A machine learning approach based on phase locking value.

    Science.gov (United States)

    Elahian, Bahareh; Yeasin, Mohammed; Mudigoudar, Basanagoud; Wheless, James W; Babajani-Feremi, Abbas

    2017-10-01

    Using a novel technique based on phase locking value (PLV), we investigated the potential for features extracted from electrocorticographic (ECoG) recordings to serve as biomarkers to identify the seizure onset zone (SOZ). We computed the PLV between the phase of the amplitude of high gamma activity (80-150Hz) and the phase of lower frequency rhythms (4-30Hz) from ECoG recordings obtained from 10 patients with epilepsy (21 seizures). We extracted five features from the PLV and used a machine learning approach based on logistic regression to build a model that classifies electrodes as SOZ or non-SOZ. More than 96% of electrodes identified as the SOZ by our algorithm were within the resected area in six seizure-free patients. In four non-seizure-free patients, more than 31% of the identified SOZ electrodes by our algorithm were outside the resected area. In addition, we observed that the seizure outcome in non-seizure-free patients correlated with the number of non-resected SOZ electrodes identified by our algorithm. This machine learning approach, based on features extracted from the PLV, effectively identified electrodes within the SOZ. The approach has the potential to assist clinicians in surgical decision-making when pre-surgical intracranial recordings are utilized. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
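
    The core feature above is a phase locking value (PLV) between the phase of the high-gamma amplitude envelope and the phase of a lower-frequency rhythm. A minimal sketch of that computation for one channel is shown below, assuming illustrative band edges, filter order and sampling rate rather than the study's exact settings.

```python
# Sketch of a PLV between the phase of the high-gamma amplitude envelope and the
# phase of a lower-frequency rhythm for one ECoG channel (band edges are assumptions).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv_high_gamma_low_freq(x, fs, low=(4, 30), gamma=(80, 150)):
    low_phase = np.angle(hilbert(bandpass(x, *low, fs)))          # phase of the 4-30 Hz rhythm
    gamma_env = np.abs(hilbert(bandpass(x, *gamma, fs)))          # high-gamma amplitude envelope
    env_phase = np.angle(hilbert(bandpass(gamma_env, *low, fs)))  # phase of that envelope
    return np.abs(np.mean(np.exp(1j * (env_phase - low_phase))))  # PLV in [0, 1]

# Usage on a synthetic coupled signal: 100 Hz activity amplitude-modulated by an 8 Hz rhythm.
fs = 1000
t = np.arange(0, 10, 1 / fs)
slow = np.sin(2 * np.pi * 8 * t)
x = slow + (1 + 0.5 * slow) * np.sin(2 * np.pi * 100 * t) + 0.1 * np.random.randn(t.size)
print(round(plv_high_gamma_low_freq(x, fs), 3))   # coupled signal -> PLV well above zero
```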

  11. Record Values of a Pareto Distribution.

    Science.gov (United States)

    Ahsanullah, M.

    The record values of the Pareto distribution, labelled Pareto (II) (alpha, beta, nu), are reviewed. The best linear unbiased estimates of the parameters in terms of the record values are provided. The prediction of the sth record value based on the first m (s>m) record values is obtained. A classical Pareto distribution provides reasonably…

  12. Connectivity maps based analysis of EEG for the advanced diagnosis of schizophrenia attributes.

    Directory of Open Access Journals (Sweden)

    Zack Dvey-Aharon

    Full Text Available This article presents a novel connectivity analysis method that is suitable for multi-node networks such as EEG, MEG or ECoG electrode recordings. Its diagnostic power and ability to interpret brain states in schizophrenia is demonstrated on a set of 50 subjects consisting of 25 healthy controls and 25 patients diagnosed with schizophrenia and treated with medication. The method can also be used for the automatic detection of schizophrenia; it exhibits higher sensitivity than state-of-the-art methods with no false positives. The detection is based on an analysis of a minute-long pattern-recognition computer task. Moreover, this connectivity analysis leads naturally to an optimal choice of electrodes and hence to highly statistically significant results that are based on data from only 3-5 electrodes. The method is general and can be used for the diagnosis of other psychiatric conditions, provided an appropriate computer task is devised.

  13. Analysis of Driver Evasive Maneuvering Prior to Intersection Crashes Using Event Data Recorders.

    Science.gov (United States)

    Scanlon, John M; Kusano, Kristofer D; Gabler, Hampton C

    2015-01-01

    Intersection crashes account for over 4,500 fatalities in the United States each year. Intersection Advanced Driver Assistance Systems (I-ADAS) are emerging vehicle-based active safety systems that have the potential to help drivers safely navigate across intersections and prevent intersection crashes and injuries. The performance of an I-ADAS is expected to be highly dependent upon driver evasive maneuvering prior to an intersection crash. Little has been published, however, on the detailed evasive kinematics followed by drivers prior to real-world intersection crashes. The objective of this study was to characterize the frequency, timing, and kinematics of driver evasive maneuvers prior to intersection crashes. Event data recorders (EDRs) downloaded from vehicles involved in intersection crashes were investigated as part of NASS-CDS years 2001 to 2013. A total of 135 EDRs with precrash vehicle speed and braking application were downloaded to investigate evasive braking. A smaller subset of 59 EDRs that collected vehicle yaw rate was additionally analyzed to investigate evasive steering. Each vehicle was assigned to one of 3 precrash movement classifiers (traveling through the intersection, completely stopped, or rolling stop) based on the vehicle's calculated acceleration and observed velocity profile. To ensure that any significant steering input observed was an attempted evasive maneuver, the analysis excluded vehicles at intersections that were turning, driving on a curved road, or performing a lane change. Braking application at the last EDR-recorded time point was assumed to indicate evasive braking. A vehicle yaw rate greater than 4° per second was assumed to indicate an evasive steering maneuver. Drivers executed crash avoidance maneuvers in four-fifths of intersection crashes. A more detailed analysis of evasive braking frequency by precrash maneuver revealed that drivers performing complete or rolling stops (61.3%) braked less often than drivers traveling through the intersection.

  14. Artificial earthquake record generation using cascade neural network

    Directory of Open Access Journals (Sweden)

    Bani-Hani Khaldoon A.

    2017-01-01

    Full Text Available This paper presents the results of using artificial neural networks (ANN) in an inverse mapping problem for earthquake accelerogram generation. This study comprises two parts. First, a 1-D site response analysis is performed for the Dubai Emirate (UAE), where eight earthquake records are selected and spectral matching is performed to match the Dubai response spectrum using SeismoMatch software. Site classification of Dubai soil is considered for two classes, C and D, based on the shear wave velocity of soil profiles. Amplification factors are estimated to quantify the Dubai soil effect. Dubai's design response spectra are developed for site classes C and D according to the International Building Code (IBC 2012). In the second part, ANN is employed to solve the inverse mapping problem and generate time history earthquake records. Thirty earthquake records and their design response spectra with 5% damping are used to train two cascade forward-backward neural networks (ANN1, ANN2). ANN1 is trained to map the design response spectrum to a time history, and ANN2 is trained to map time history records to the design response spectrum. Generalized time history earthquake records are generated using ANN1 for Dubai's site classes C and D, and ANN2 is used to evaluate the performance of ANN1.
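
    The inverse mapping idea (ANN1: design response spectrum in, accelerogram out) can be illustrated with a generic multilayer perceptron. The sketch below is a stand-in only: the study used cascade forward-backward networks, and the array sizes, layer sizes and random training data here are placeholders, not the paper's dataset.

```python
# Illustrative stand-in for ANN1: a generic MLP regressor mapping a 5%-damped response
# spectrum (input vector) to an acceleration time history (output vector).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_records, n_periods, n_samples = 30, 50, 2000
spectra = rng.random((n_records, n_periods))             # training response spectra (inputs)
histories = rng.standard_normal((n_records, n_samples))  # matched accelerograms (targets)

ann1 = MLPRegressor(hidden_layer_sizes=(128, 256), max_iter=500, random_state=0)
ann1.fit(spectra, histories)                             # spectrum -> time history

target_spectrum = rng.random((1, n_periods))             # e.g. a design spectrum for class C or D
generated_history = ann1.predict(target_spectrum)[0]     # synthetic accelerogram
print(generated_history.shape)
```

    In the same spirit, a second network (the paper's ANN2) would be trained in the reverse direction, time history to spectrum, and used to check how well the generated record reproduces the target spectrum.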

  15. Joint analysis of longitudinal feed intake and single recorded production traits in pigs using a novel horizontal model

    DEFF Research Database (Denmark)

    Shirali, M.; Strathe, A. B.; Mark, T.

    2017-01-01

    A novel Horizontal model is presented for multitrait analysis of longitudinal traits through random regression analysis combined with single recorded traits. Weekly ADFI on test for Danish Duroc, Landrace, and Yorkshire boars were available from the national test station and were collected from 30 to 100 kg BW. Single recorded production traits of ADG from birth to 30 kg BW (ADG30), ADG from 30 to 100 kg BW (ADG100), and lean meat percentage (LMP) were available from breeding herds or the national test station. The Horizontal model combined random regression analysis of feed intake (FI) ... - and first-order Legendre polynomials of age on test, respectively. The fixed effect and random residual variance were estimated for each weekly FI trait. Residual feed intake (RFI) was derived from the conditional distribution of FI given the breeding values of ADG100 and LMP. The heritability of FI varied...

  16. An iPad and Android-based Application for Digitally Recording Geologic Field Data

    Science.gov (United States)

    Malinconico, L. L.; Sunderlin, D.; Liew, C.; Ho, A. S.; Bekele, K. A.

    2011-12-01

    Field experience is a significant component in most geology courses, especially sed/strat and structural geology. Increasingly, the spatial presentation, analysis and interpretation of geologic data is done using digital methodologies (GIS, Google Earth, stereonet and spreadsheet programs). However, students and professionals continue to collect field data manually on paper maps and in the traditional "orange field notebooks". Upon returning from the field, data are then manually transferred into digital formats for processing, mapping and interpretation. The transfer process is both cumbersome and prone to transcription error. In conjunction with the computer science department, we are in the process of developing an application (App) for iOS (the iPad) and Android platforms that can be used to digitally record data measured in the field. This is not a mapping program, but rather a way of bypassing the field book step to acquire digital data directly that can then be used in various analysis and display programs. Currently, the application allows the user to select from five different structural data situations: contact, bedding, fault, joints and "other". The user can define a folder for the collection and separation of data for each project. Observations are stored as individual records of field measurements in each folder. The exact information gathered depends on the nature of the observation, but common to all pages is the ability to log date, time, and lat/long directly from the tablet. Information like strike and dip is entered using scroll wheels, and formation names are also entered using scroll wheels that access easy-to-modify lists of the area's stratigraphic units. This ensures uniformity in the creation of the digital records from day-to-day and across field teams. Pictures linked to each record can also be taken using the tablet's camera. Once the field collection is complete, the data (including images) can be easily exported to a .csv file.
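
    The record structure described above (timestamp, location, structural measurement, formation, linked photo, exported to .csv) can be sketched as a small data type plus an export step. The field names and layout below are hypothetical, not the app's actual schema.

```python
# Minimal sketch of a field-observation record and its .csv export (hypothetical schema).
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class FieldObservation:
    timestamp: str          # date and time read from the tablet clock
    latitude: float
    longitude: float
    observation_type: str   # contact, bedding, fault, joint or other
    strike_deg: int         # entered via scroll wheel, 0-359
    dip_deg: int            # 0-90
    formation: str          # picked from the project's stratigraphic-unit list
    photo_file: str = ""    # linked image taken with the tablet camera

records = [FieldObservation("2011-07-14T10:32", 40.69, -75.21, "bedding", 215, 34, "Formation A")]

with open("project_observations.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=[f.name for f in fields(FieldObservation)])
    writer.writeheader()
    writer.writerows(asdict(r) for r in records)
```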

  17. Interpretation of Coronary Angiograms Recorded Using Google Glass: A Comparative Analysis.

    Science.gov (United States)

    Duong, Thao; Wosik, Jedrek; Christakopoulos, Georgios E; Martínez Parachini, José Roberto; Karatasakis, Aris; Tarar, Muhammad Nauman Javed; Resendes, Erica; Rangan, Bavana V; Roesle, Michele; Grodin, Jerrold; Abdullah, Shuaib M; Banerjee, Subhash; Brilakis, Emmanouil S

    2015-10-01

    Google Glass (Google, Inc) is a voice-activated, hands-free, optical head-mounted display device capable of taking pictures, recording videos, and transmitting data via wi-fi. In the present study, we examined the accuracy of coronary angiogram interpretation, recorded using Google Glass. Google Glass was used to record 15 angiograms with 17 major findings and the participants were asked to interpret those recordings on: (1) an iPad (Apple, Inc); or (2) a desktop computer. Interpretation was compared with the original angiograms viewed on a desktop. Ten physicians (2 interventional cardiologists and 8 cardiology fellows) participated. One point was assigned for each correct finding, for a maximum of 17 points. The mean angiogram interpretation score for Google Glass angiogram recordings viewed on an iPad or a desktop vs the original angiograms viewed on a desktop was 14.9 ± 1.1, 15.2 ± 1.8, and 15.9 ± 1.1, respectively (P=.06 between the iPad and the original angiograms, P=.51 between the iPad and recordings viewed on a desktop, and P=.43 between the recordings viewed on a desktop and the original angiograms). In a post-study survey, one of the 10 physicians (10%) was "neutral" with the quality of the recordings using Google Glass, 6 physicians (60%) were "somewhat satisfied," and 3 physicians (30%) were "very satisfied." This small pilot study suggests that the quality of coronary angiogram video recordings obtained using Google Glass may be adequate for recognition of major findings, supporting its expanding use in telemedicine.

  18. Mathematical analysis of running performance and world running records.

    Science.gov (United States)

    Péronnet, F; Thibault, G

    1989-07-01

    The objective of this study was to develop an empirical model relating human running performance to some characteristics of metabolic energy-yielding processes, using A, the capacity of anaerobic metabolism (J/kg); MAP, the maximal aerobic power (W/kg); and E, the reduction in peak aerobic power with the natural logarithm of race duration T, when T > T_MAP = 420 s. Accordingly, the model developed describes the average power output P_T (W/kg) sustained over any T as P_T = (S/T)(1 - e^(-T/k2)) + (1/T) ∫_0^T [BMR + B(1 - e^(-t/k1))] dt, where S = A and B = MAP - BMR (basal metabolic rate) when T < T_MAP; and S = A + A f ln(T/T_MAP) and B = (MAP - BMR) + E ln(T/T_MAP) when T > T_MAP; k1 = 30 s and k2 = 20 s are time constants describing the kinetics of aerobic and anaerobic metabolism, respectively, at the beginning of exercise; f is a constant describing the reduction in the amount of energy provided from anaerobic metabolism with increasing T; and t is the time from the onset of the race. This model accurately estimates actual power outputs sustained over a wide range of events, e.g., the average absolute error between actual and estimated T for men's 1987 world records from 60 m to the marathon = 0.73%. In addition, satisfactory estimations of the metabolic characteristics of world-class male runners were made as follows: A = 1,658 J/kg; MAP = 83.5 ml O2·kg^-1·min^-1; 83.5% MAP sustained over the marathon distance. Application of the model to analysis of the evolution of A, MAP, and E, and of the progression of men's and women's world records over the years, is presented.
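
    Because the integral in the model has a closed form, the average power P_T can be evaluated directly. The sketch below implements the equation as written above; A and MAP loosely follow the abstract after a rough unit conversion to W/kg, while BMR, E and f are assumed placeholder values, not the fitted constants.

```python
# Numerical sketch of the average-power model P_T described above (placeholder parameters).
import numpy as np

K1, K2, T_MAP = 30.0, 20.0, 420.0   # time constants (s) and aerobic plateau limit (s)

def average_power(T, A=1658.0, MAP=29.0, BMR=1.2, E=-1.0, f=-0.2):
    """Average power output P_T (W/kg) sustainable over a race of duration T (s)."""
    if T > T_MAP:                                # beyond ~7 min both terms decay with ln(T/T_MAP)
        S = A + A * f * np.log(T / T_MAP)
        B = (MAP - BMR) + E * np.log(T / T_MAP)
    else:
        S, B = A, MAP - BMR
    anaerobic = (S / T) * (1.0 - np.exp(-T / K2))
    # closed form of (1/T) * integral_0^T [BMR + B(1 - exp(-t/K1))] dt
    aerobic = BMR + B - (B * K1 / T) * (1.0 - np.exp(-T / K1))
    return anaerobic + aerobic

for T in (60.0, 240.0, 7560.0):                  # roughly 400 m, 1500 m and marathon durations
    print(f"T = {T:6.0f} s  ->  P_T = {average_power(T):5.1f} W/kg")
```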

  19. Cost-benefit analysis of electronic medical record system at a tertiary care hospital.

    Science.gov (United States)

    Choi, Jong Soo; Lee, Woo Baik; Rhee, Poong-Lyul

    2013-09-01

    Although Electronic Medical Record (EMR) systems provide various benefits, there are both advantages and disadvantages regarding their cost-effectiveness. This study analyzed the economic effects of EMR systems using a cost-benefit analysis based on the differential costs of managerial accounting. Samsung Medical Center (SMC) is a general hospital in Korea that developed an EMR system for outpatients from 2006 to 2008. This study measured the total costs and benefits during an 8-year period after EMR adoption. The costs include the system costs of building the EMR and the costs incurred in smoothing its adoption. The benefits included cost reductions after its adoption and additional revenues from both remodeling of paper-chart storage areas and medical transcriptionists' contribution. The measured amounts were discounted by SMC's expected interest rate to calculate the net present value (NPV), benefit-cost ratio (BCR), and discounted payback period (DPP). During the analysis period, the cumulative NPV and the BCR were US$3,617 thousand and 1.23, respectively. The DPP was about 6.18 years. Although the adoption of an EMR resulted in overall growth in administrative costs, it is cost-effective since the cumulative NPV was positive. The positive NPV was attributed to both cost reductions and additional revenues. From a management perspective, however, EMR adoption is less attractive in that the DPP (6.18 years) is longer than 5 years and the BCR (1.23) is near 1. Nevertheless, an EMR is a worthwhile investment, seeing that this study did not include any qualitative benefits and that the paper-chart system was cost-centric.
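
    The three summary measures used above (NPV, BCR, DPP) are straightforward to compute from discounted yearly benefit and cost streams. The cash-flow figures in the sketch below are invented for demonstration and are not SMC's data.

```python
# Toy cost-benefit summary: net present value (NPV), benefit-cost ratio (BCR) and
# discounted payback period (DPP) from yearly benefit/cost streams and a discount rate.
def discounted(values, rate):
    return [v / (1 + rate) ** t for t, v in enumerate(values, start=1)]

def cba_summary(benefits, costs, rate):
    db, dc = discounted(benefits, rate), discounted(costs, rate)
    npv = sum(db) - sum(dc)
    bcr = sum(db) / sum(dc)
    cumulative, dpp = 0.0, None
    for year, net in enumerate((b - c for b, c in zip(db, dc)), start=1):
        cumulative += net
        if dpp is None and cumulative >= 0:
            dpp = year                 # first year the cumulative discounted net turns positive
    return npv, bcr, dpp

benefits = [200, 600, 900, 1000, 1000, 1000, 1000, 1000]   # hypothetical US$ thousand per year
costs = [1500, 800, 700, 300, 300, 300, 300, 300]
print(cba_summary(benefits, costs, rate=0.05))
```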

  20. Feasibility and performance evaluation of generating and recording visual evoked potentials using ambulatory Bluetooth based system.

    Science.gov (United States)

    Ellingson, Roger M; Oken, Barry

    2010-01-01

    This report contains the design overview and key performance measurements demonstrating the feasibility of generating and recording ambulatory visual stimulus evoked potentials using the previously reported custom Complementary and Alternative Medicine physiologic data collection and monitoring system, CAMAS. The methods used to generate visual stimuli on a PDA device and the design of an optical coupling device to convert the display to an electrical waveform which is recorded by the CAMAS base unit are presented. The optical sensor signal, synchronized to the visual stimulus, emulates the brain's synchronized EEG signal input to CAMAS normally reviewed for the evoked potential response. Most importantly, the PDA also sends a marker message over the wireless Bluetooth connection to the CAMAS base unit, synchronized to the visual stimulus, which is the critical averaging reference component for obtaining VEP results. Results show that the variance in the latency of the wireless marker messaging link is consistent enough to support the generation and recording of visual evoked potentials. The averaged sensor waveforms at multiple CPU speeds are presented and demonstrate the suitability of the Bluetooth interface for portable ambulatory visual evoked potential implementation on our CAMAS platform.

  1. Technological Change in a Small European Country: A Patent- Based Analysis for Greece

    OpenAIRE

    Maria Markatou

    2012-01-01

    The main aim of this paper is to describe the development of Greek technological change. The analysis is based on the examination and elaboration of patent records and relies on the study of their technological content and their economic direction. Results show that technological change focuses on producing new technologies for the ‘agricultural sector’, ‘food’, ‘pharmaceuticals’, ‘metal shaping-separation’, ‘rubber-plastic products’, ‘building-housing’, ‘instruments’ and ‘electric...

  2. Unsupervised Idealization of Ion Channel Recordings by Minimum Description Length

    DEFF Research Database (Denmark)

    Gnanasambandam, Radhakrishnan; Nielsen, Morten S; Nicolai, Christopher

    2017-01-01

    We present and characterize an idealization algorithm based on Rissanen's Minimum Description Length (MDL) Principle. This method uses minimal assumptions and idealizes ion channel recordings without requiring a detailed user input or a priori assumptions about channel conductance and kinetics. Furthermore, we demonstrate that correlation analysis of conductance steps can resolve properties of single ion channels in recordings contaminated by signals from multiple channels. We first validated our methods on simulated data defined with a range of different signal-to-noise levels, and then showed that our algorithm can recover channel currents and their substates from recordings with multiple channels, even under conditions of high noise. We then tested the MDL algorithm on real experimental data from human PIEZO1 channels and found that our method revealed the presence of substates with alternate conductances.

  3. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.

  4. A chironomid-based record of temperature variability during the past 4000 years in northern China and its possible societal implications

    Science.gov (United States)

    Wang, Haipeng; Chen, Jianhui; Zhang, Shengda; Zhang, David D.; Wang, Zongli; Xu, Qinghai; Chen, Shengqian; Wang, Shijin; Kang, Shichang; Chen, Fahu

    2018-03-01

    Long-term, high-resolution temperature records which combine an unambiguous proxy and precise dating are rare in China. In addition, the societal implications of past temperature change on a regional scale have not been sufficiently assessed. Here, based on the modern relationship between chironomids and temperature, we use fossil chironomid assemblages in a precisely dated sediment core from Gonghai Lake to explore temperature variability during the past 4000 years in northern China. Subsequently, we address the possible regional societal implications of temperature change through a statistical analysis of the occurrence of wars. Our results show the following. (1) The mean annual temperature (TANN) was relatively high during 4000-2700 cal yr BP, decreased gradually during 2700-1270 cal yr BP and then fluctuated during the last 1270 years. (2) A cold event in the Period of Disunity, the Sui-Tang Warm Period (STWP), the Medieval Warm Period (MWP) and the Little Ice Age (LIA) can all be recognized in the paleotemperature record, as well as in many other temperature reconstructions in China. This suggests that our chironomid-inferred temperature record for the Gonghai Lake region is representative. (3) Local wars in Shanxi Province, documented in the historical literature during the past 2700 years, are statistically significantly correlated with changes in temperature, and the relationship is a good example of the potential societal implications of temperature change on a regional scale.

  5. Gating treatment delivery QA based on a surrogate motion analysis

    International Nuclear Information System (INIS)

    Chojnowski, J.; Simpson, E.

    2011-01-01

    Full text: To develop a methodology to estimate intrafractional target position error during a phase-based gated treatment. Westmead Cancer Care Centre is using respiratory correlated phase-based gated beam delivery in the treatment of lung cancer. The gating technique is managed by the Varian Real-time Position Management (RPM) system, version 1.7.5. A 6-dot block is placed on the abdomen of the patient and acts as a surrogate for the target motion. During a treatment session, the motion of the surrogate can be recorded by the RPM application. Analysis of the surrogate motion file by in-house developed software allows the intrafractional error of the treatment session to be computed. To validate the computed error, a simple test that involves the introduction of deliberate errors was performed. Errors of up to 1.1 cm were introduced to a metal marker placed on a surrogate using the Varian Breathing Phantom. The moving marker was scanned in prospective mode using a GE Lightspeed 16 CT scanner. Using the CT images, the difference in the marker position with and without introduced errors was compared to the errors calculated from the surrogate motion. The average and standard deviation of the difference between the calculated target position errors and the measured artificial errors of the marker position are 0.02 cm and 0.07 cm, respectively. In conclusion, the calculated target positional error based on surrogate motion analysis provides a quantitative measure of intrafractional target positional errors during treatment. Routine QA for gated treatment using surrogate motion analysis is relatively quick and simple.
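
    The in-house software mentioned above is not publicly specified, but the general idea of deriving an intrafractional error from a surrogate trace can be sketched: take the surrogate positions sampled while the beam is on (within the gating phase window) and report their deviation from a reference position. Everything in the sketch below, including the column layout, gate definition and reference choice, is an assumption for illustration only.

```python
# Hedged sketch of estimating an intrafractional position error from a surrogate trace.
import numpy as np

def intrafractional_error(positions_cm, phases, gate=(0.4, 0.6), reference_cm=None):
    """Mean/max deviation of the surrogate from its reference during beam-on phases."""
    positions_cm, phases = np.asarray(positions_cm), np.asarray(phases)
    beam_on = (phases >= gate[0]) & (phases <= gate[1])   # phase-based gating window
    if reference_cm is None:
        reference_cm = positions_cm[beam_on].mean()        # proxy for the planned position
    deviation = np.abs(positions_cm[beam_on] - reference_cm)
    return deviation.mean(), deviation.max()

# Usage with a synthetic 4 s breathing trace sampled over one minute:
t = np.linspace(0, 60, 1500)
trace = 0.8 * np.cos(2 * np.pi * t / 4) + 0.05 * np.random.randn(t.size)   # cm
phase = (t % 4) / 4                                                         # 0..1 respiratory phase
print(intrafractional_error(trace, phase))
```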

  6. A Socio-Technical Analysis of Patient Accessible Electronic Health Records.

    Science.gov (United States)

    Hägglund, Maria; Scandurra, Isabella

    2017-01-01

    In Sweden, and internationally, there is a movement towards increased transparency in healthcare, including giving patients online access to their electronic health records (EHR). The purpose of this paper is to analyze the Swedish patient accessible EHR (PAEHR) service using a socio-technical framework, to increase the understanding of factors that influence the design, implementation, adoption and use of the service. Analyzing the Swedish PAEHR system and its context using the Sittig and Singh socio-technical framework indicated that there are many stakeholders engaged in these types of services, with different driving forces and incentives that may influence the adoption and usefulness of PAEHR services. The analysis was useful in highlighting important areas that need to be further explored in evaluations of PAEHR services, and can act as a guide when planning evaluations of any PAEHR service.

  7. Unbiased analysis of geomagnetic data sets and comparison of historical data with paleomagnetic and archeomagnetic records

    Science.gov (United States)

    Arneitz, Patrick; Egli, Ramon; Leonhardt, Roman

    2017-03-01

    Reconstructions of the past geomagnetic field provide fundamental constraints for understanding the dynamics of the Earth's interior, as well as serving as basis for magnetostratigraphic and archeomagnetic dating tools. Such reconstructions, when extending over epochs that precede the advent of instrumental measurements, rely exclusively on magnetic records from archeological artifacts, and, further in the past, from rocks and sediments. The most critical component of such indirect records is field intensity because of possible biases introduced by material properties and by laboratory protocols, which do not reproduce exactly the original field recording conditions. Large biases are usually avoided by the use of appropriate checking procedures; however, smaller ones can remain undetected in individual studies and might significantly affect field reconstructions. We introduce a new general approach for analyzing geomagnetic databases in order to investigate the reliability of indirect records. This approach is based on the comparison of historical records with archeomagnetic and volcanic data, considering temporal and spatial mismatches with adequate weighting functions and error estimation. A good overall agreement is found between indirect records and historical measurements, while for several subsets systematic bias is detected (e.g., inclination shallowing of lava records). We also demonstrate that simple approaches to analyzing highly inhomogeneous and internally correlated paleomagnetic data sets can lead to incorrect conclusions about the efficiency of quality checks and corrections. Consistent criteria for selecting and weighting data are presented in this review and can be used to improve current geomagnetic field modeling techniques.

  8. Digital recording system

    International Nuclear Information System (INIS)

    Chandra, A.K.; Deshpande, S.V.; Iyer, A.; Vaidya, U.W.

    1987-01-01

    A large number of critical process parameters in nuclear power plants have hitherto been monitored using electromechanical chart recorders. The reducing costs of electronics systems have led to a trend towards modernizing power plant control rooms by computerizing all the panel instrumentation. As a first step, it has been decided to develop a digital recording system to record the values of 48 process parameters. The system as developed and described in this report is more than a replacement for recorders; it offers substantial advantages in terms of lower overall system cost, excellent time resolution, accurate data and absolute synchronization for correlated signals. The system provides high speed recording of 48 process parameters, maintains historical records and permits retrieval and display of archival information on a colour monitor, a plotter and a printer. It is implemented using a front end data acquisition unit connected over a serial link to a PC-XT computer with a 20 MB Winchester disk. The system offers an extremely user friendly man machine interaction, based on a hierarchical paged menu driven scheme. Software development for this system has been carried out using the C language. (author). 9 figs

  9. Presentations and recorded keynotes of the First European Workshop on Latent Semantic Analysis in Technology Enhanced Learning

    NARCIS (Netherlands)

    Several

    2007-01-01

    Presentations and recorded keynotes at the 1st European Workshop on Latent Semantic Analysis in Technology-Enhanced Learning, March, 29-30, 2007. Heerlen, The Netherlands: The Open University of the Netherlands. Please see the conference website for more information:

  10. An integrated platform for simultaneous multi-well field potential recording and Fura-2-based calcium transient ratiometry in human induced pluripotent stem cell (hiPSC)-derived cardiomyocytes.

    Science.gov (United States)

    Rast, Georg; Weber, Jürgen; Disch, Christoph; Schuck, Elmar; Ittrich, Carina; Guth, Brian D

    2015-01-01

    Human induced pluripotent stem cell-derived cardiomyocytes are available from various sources and they are being evaluated for safety testing. Several platforms are available offering different assay principles and read-out parameters: patch-clamp and field potential recording, imaging or photometry, impedance measurement, and recording of contractile force. Routine use will establish which assay principle and which parameters best serve the intended purpose. We introduce a combination of field potential recording and calcium ratiometry from spontaneously beating cardiomyocytes as a novel assay providing a complementary read-out parameter set. Field potential recording is performed using a commercial multi-well multi-electrode array platform. Calcium ratiometry is performed using a fiber optic illumination and silicon avalanche photodetectors. Data condensation and statistical analysis are designed to enable statistical inference of differences and equivalence with regard to a solvent control. Simultaneous recording of field potentials and calcium transients from spontaneously beating monolayers was done in a nine-well format. Calcium channel blockers (e.g. nifedipine) and a blocker of calcium store release (ryanodine) can be recognized and discriminated based on the calcium transient signal. An agonist of L-type calcium channels, FPL 64176, increased and prolonged the calcium transient, whereas BAY K 8644, another L-type calcium channel agonist, had no effect. Both FPL 64176 and various calcium channel antagonists have chronotropic effects, which can be discriminated from typical "chronotropic" compounds, like (±)isoprenaline (positive) and arecaidine propargyl ester (negative), based on their effects on the calcium transient. Despite technical limitations in temporal resolution and exact matching of composite calcium transient with the field potential of a subset of cells, the combined recording platform enables a refined interpretation of the field potential

  11. Accuracy and Efficiency of Recording Pediatric Early Warning Scores Using an Electronic Physiological Surveillance System Compared With Traditional Paper-Based Documentation.

    Science.gov (United States)

    Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E D

    2017-05-01

    Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined "norm." Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team.
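
    As described above, an early warning score is an aggregate of weighted deviations of individual vital signs from predetermined norms, with higher totals triggering escalation. The scorer below is purely illustrative: the bands, weights and trigger level are invented for demonstration and must not be read as a validated PEWS chart.

```python
# Purely illustrative aggregate early-warning scorer (invented bands and trigger level).
def band_score(value, bands):
    """bands: list of (lower, upper, score); returns the score of the first matching band."""
    for lower, upper, score in bands:
        if lower <= value < upper:
            return score
    return 3                                   # outside all bands -> maximum concern

def pews(heart_rate, resp_rate, capillary_refill_s):
    score = 0
    score += band_score(heart_rate, [(90, 130, 0), (130, 150, 1), (150, 170, 2)])
    score += band_score(resp_rate, [(20, 40, 0), (40, 50, 1), (50, 60, 2)])
    score += band_score(capillary_refill_s, [(0, 2, 0), (2, 3, 1), (3, 4, 2)])
    return score

total = pews(heart_rate=155, resp_rate=45, capillary_refill_s=2.5)
print(total, "-> escalate" if total >= 5 else "-> continue routine monitoring")
```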

  12. CONTEXT BASED FOOD IMAGE ANALYSIS

    OpenAIRE

    He, Ye; Xu, Chang; Khanna, Nitin; Boushey, Carol J.; Delp, Edward J.

    2013-01-01

    We are developing a dietary assessment system that records daily food intake through the use of food images. Recognizing food in an image is difficult due to large visual variance with respect to eating or preparation conditions. This task becomes even more challenging when different foods have similar visual appearance. In this paper we propose to incorporate two types of contextual dietary information, food co-occurrence patterns and personalized learning models, in food image analysis to r...

  13. Polymer SU-8 Based Microprobes for Neural Recording and Drug Delivery

    Science.gov (United States)

    Altuna, Ane; Fernandez, Luis; Berganzo, Javier

    2015-06-01

    This manuscript offers a reflection on SU-8-based microprobes for neural activity recording and drug delivery. By taking advantage of improvements in microfabrication technologies and using polymer SU-8 as the only structural material, we developed several microprobe prototypes aimed to: a) minimize injury in neural tissue, b) obtain high-quality electrical signals and c) deliver drugs at a micrometer precision scale. Dedicated packaging tools have been developed in parallel to fulfill requirements concerning electric and fluidic connections, size and handling. Now that these advances have been experimentally proven in the brain using in vivo preparations, the technological concepts developed during the consecutive prototypes are discussed in depth.

  14. POLYMER SU-8 BASED MICROPROBES FOR NEURAL RECORDING AND DRUG DELIVERY

    Directory of Open Access Journals (Sweden)

    Ane eAltuna

    2015-06-01

    Full Text Available This manuscript offers a reflection on SU-8-based microprobes for neural activity recording and drug delivery. By taking advantage of improvements in microfabrication technologies and using polymer SU-8 as the only structural material, we developed several microprobe prototypes aimed to: a) minimize injury in neural tissue, b) obtain high-quality electrical signals and c) deliver drugs at a micrometer precision scale. Dedicated packaging tools have been developed in parallel to fulfill requirements concerning electric and fluidic connections, size and handling. Now that these advances have been experimentally proven in the brain using in vivo preparations, the technological concepts developed during the consecutive prototypes are discussed in depth.

  15. The Analysis and Suppression of the spike noise in vibrator record

    Science.gov (United States)

    Jia, H.; Jiang, T.; Xu, X.; Ge, L.; Lin, J.; Yang, Z.

    2013-12-01

    During seismic exploration with a vibrator, seismic recording systems are often affected by random spike noise in the background, which leads to strong data distortions as a result of the cross-correlation processing used in the vibrator method. Partial or total loss of the desired seismic information is possible if no automatic spike reduction is applied in the field prior to correlation of the field record. Generally speaking, the original vibrator record is uncorrelated data, in which the signal does not have a wavelet form. To obtain a seismic record similar to that of an explosive source, the signal in the uncorrelated data must be compressed into wavelet form by a correlation algorithm. As a result of the correlation process, spike interference in the correlated data is not only suppressed but also spread, so spike noise suppression is indispensable for vibrator data. According to numerical simulation results, the effect of spikes in the vibrator record is mainly determined by their amplitude and their proportion of samples in the uncorrelated record. When the spike noise ratio in the uncorrelated record reaches 1.5% and the average amplitude exceeds 200, the SNR (signal-to-noise ratio) of the correlated record drops below 0 dB, so that it is difficult to separate the signal. The amplitude and ratio are in turn determined by the intensity of the background noise. Therefore, when the noise level is strong, the uncorrelated vibrator record needs to be processed to suppress spike noise in order to improve the SNR of the seismic data. To reduce the influence of spike noise, detection and suppression must be applied to the uncorrelated record. Because the vibrator injects a sweep signal into the ground over a long duration, the peak and valley values of each trace should, ideally, change little. On the basis of the peak and valley values, a reference amplitude value can be obtained, and spikes can then be detected and suppressed.
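
    The reference-amplitude idea described above can be sketched in a few lines: estimate a per-trace reference level from the typical sample magnitude of the uncorrelated record, flag samples far above it as spikes, and attenuate them before correlation. The threshold factor and the replacement rule below are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch of spike detection/suppression in an uncorrelated vibrator trace.
import numpy as np

def suppress_spikes(trace, factor=4.0):
    trace = np.asarray(trace, dtype=float)
    reference = np.median(np.abs(trace))           # robust proxy for the normal peak/valley level
    spikes = np.abs(trace) > factor * reference    # samples far beyond the reference amplitude
    cleaned = trace.copy()
    cleaned[spikes] = np.sign(trace[spikes]) * factor * reference   # clip, keep polarity
    return cleaned, spikes

# Usage with a simple linear-sweep stand-in contaminated by three spikes:
fs = 500
t = np.arange(0, 10, 1 / fs)
sweep = np.sin(2 * np.pi * (8 + 4 * t) * t)
noisy = sweep.copy()
noisy[[800, 2400, 4100]] += np.array([30.0, -25.0, 40.0])   # injected spike noise
cleaned, spikes = suppress_spikes(noisy)
print("spikes flagged:", int(spikes.sum()))
```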

  16. Gait Recognition Using Wearable Motion Recording Sensors

    Directory of Open Access Journals (Sweden)

    Davrondzhon Gafurov

    2009-01-01

    Full Text Available This paper presents an alternative approach, in which gait is collected by sensors attached to the person's body. Such wearable sensors record the motion (e.g. acceleration) of body parts during walking. The recorded motion signals are then investigated for person recognition purposes. We analyzed acceleration signals from the foot, hip, pocket and arm. Applying various methods, the best EERs obtained for foot-, pocket-, arm- and hip-based user authentication were 5%, 7%, 10% and 13%, respectively. Furthermore, we present the results of our analysis on the security assessment of gait. Studying gait-based user authentication (in the case of hip motion) under three attack scenarios, we revealed that minimal-effort mimicking does not help to improve the acceptance chances of impostors. However, impostors who know their closest person in the database or the genders of the users can be a threat to gait-based authentication. We also provide some new insights toward the uniqueness of gait in the case of foot motion. In particular, we revealed the following: sideways motion of the foot provides the most discrimination, compared to the up-down or forward-backward directions; and different segments of the gait cycle provide different levels of discrimination.
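
    The equal error rate (EER) quoted above is the operating point where the false accept and false reject rates coincide. The sketch below shows a generic way to locate it from genuine and impostor match scores (here treated as distances, lower meaning a better match); the score arrays are synthetic and are not the study's data.

```python
# Sketch of an equal error rate (EER) computed from genuine and impostor distance scores.
import numpy as np

def eer(genuine, impostor):
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best = (1.0, None)
    for thr in thresholds:
        frr = np.mean(genuine > thr)      # genuine attempts rejected at this threshold
        far = np.mean(impostor <= thr)    # impostor attempts accepted at this threshold
        gap = abs(far - frr)
        if gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

rng = np.random.default_rng(2)
genuine_scores = rng.normal(1.0, 0.3, 500)    # distances for same-person comparisons
impostor_scores = rng.normal(2.0, 0.4, 500)   # distances for different-person comparisons
print(f"EER ~ {eer(genuine_scores, impostor_scores):.1%}")
```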

  17. A Real-Time Recording Model of Key Indicators for Energy Consumption and Carbon Emissions of Sustainable Buildings

    Directory of Open Access Journals (Sweden)

    Weiwei Wu

    2014-05-01

    Full Text Available Buildings’ sustainability is one of the crucial parts for achieving urban sustainability. Applied to buildings, life-cycle assessment encompasses the analysis and assessment of the environmental effects of building materials, components and assemblies throughout the entire life of the building construction, use and demolition. Estimate of carbon emissions is essential and crucial for an accurate and reasonable life-cycle assessment. Addressing the need for more research into integrating analysis of real-time and automatic recording of key indicators for a more accurate calculation and comparison, this paper aims to design a real-time recording model of these crucial indicators concerning the calculation and estimation of energy use and carbon emissions of buildings based on a Radio Frequency Identification (RFID-based system. The architecture of the RFID-based carbon emission recording/tracking system, which contains four functional layers including data record layer, data collection/update layer, data aggregation layer and data sharing/backup layer, is presented. Each of these layers is formed by RFID or network devices and sub-systems that operate at a specific level. In the end, a proof-of-concept system is developed to illustrate the implementation of the proposed architecture and demonstrate the feasibility of the design. This study would provide the technical solution for real-time recording system of building carbon emissions and thus is of great significance and importance to improve urban sustainability.

  18. A Real-Time Recording Model of Key Indicators for Energy Consumption and Carbon Emissions of Sustainable Buildings

    Science.gov (United States)

    Wu, Weiwei; Yang, Huanjia; Chew, David; Hou, Yanhong; Li, Qiming

    2014-01-01

    Buildings' sustainability is one of the crucial parts for achieving urban sustainability. Applied to buildings, life-cycle assessment encompasses the analysis and assessment of the environmental effects of building materials, components and assemblies throughout the entire life of the building construction, use and demolition. Estimate of carbon emissions is essential and crucial for an accurate and reasonable life-cycle assessment. Addressing the need for more research into integrating analysis of real-time and automatic recording of key indicators for a more accurate calculation and comparison, this paper aims to design a real-time recording model of these crucial indicators concerning the calculation and estimation of energy use and carbon emissions of buildings based on a Radio Frequency Identification (RFID)-based system. The architecture of the RFID-based carbon emission recording/tracking system, which contains four functional layers including data record layer, data collection/update layer, data aggregation layer and data sharing/backup layer, is presented. Each of these layers is formed by RFID or network devices and sub-systems that operate at a specific level. In the end, a proof-of-concept system is developed to illustrate the implementation of the proposed architecture and demonstrate the feasibility of the design. This study would provide the technical solution for real-time recording system of building carbon emissions and thus is of great significance and importance to improve urban sustainability. PMID:24831109

  19. Stakeholder analysis for adopting a personal health record standard in Korea.

    Science.gov (United States)

    Kang, Min-Jeoung; Jung, Chai Young; Kim, Soyoun; Boo, Yookyung; Lee, Yuri; Kim, Sundo

    Interest in health information exchanges (HIEs) is increasing. Several countries have adopted core health data standards with appropriate strategies. This study was conducted to determine the feasibility of a continuity of care record (CCR) as the standard for an electronic version of the official transfer note and the HIE in Korean healthcare. A technical review of the CCR standard and analysis of stakeholders' views were undertaken. Transfer notes were reviewed and matched with CCR standard categories. The standard for the Korean coding system was selected. Stakeholder analysis included an online survey of members of the Korean Society of Medical Informatics, a public hearing to derive opinions of consumers, doctors, vendors, academic societies and policy makers about the policy process, and a focus group meeting with EMR vendors to determine which HIE objects were technically applicable. Data objects in the official transfer note form matched CCR standards. Korean Classification of Diseases, Korean Standard Terminology of Medicine, Electronic Data Interchange code (EDI code), Logical Observation Identifiers Names and Codes, and Korean drug codes (KD code) were recommended as the Korean coding standard. 'Social history', 'payers', and 'encounters' were mostly marked as optional or unnecessary sections, and 'allergies', 'alerts', 'medication list', 'problems/diagnoses', 'results', and 'procedures' as mandatory. Unlike the US, 'social history' was considered optional and 'advance directives' mandatory. At the public hearing there was some objection from the Korean Medical Association to the HIE on legal grounds in terms of intellectual property and patients' personal information. Other groups showed positive or neutral responses. Focus group members divided CCR data objects into three phases based on predicted adoption time in CCR: (i) immediate adoption; (ii) short-term adoption ('alerts', 'family history'); and (iii) long-term adoption ('results', 'advanced directives

  20. Fetal QRS extraction from abdominal recordings via model-based signal processing and intelligent signal merging

    International Nuclear Information System (INIS)

    Haghpanahi, Masoumeh; Borkholder, David A

    2014-01-01

    Noninvasive fetal ECG (fECG) monitoring has potential applications in diagnosing congenital heart diseases in a timely manner and assisting clinicians to make more appropriate decisions during labor. However, despite advances in signal processing and machine learning techniques, the analysis of fECG signals has still remained in its preliminary stages. In this work, we describe an algorithm to automatically locate QRS complexes in noninvasive fECG signals obtained from a set of four electrodes placed on the mother’s abdomen. The algorithm is based on an iterative decomposition of the maternal and fetal subspaces and filtering of the maternal ECG (mECG) components from the fECG recordings. Once the maternal components are removed, a novel merging technique is applied to merge the signals and detect the fetal QRS (fQRS) complexes. The algorithm was trained and tested on the fECG datasets provided by the PhysioNet/CinC challenge 2013. The final results indicate that the algorithm is able to detect fetal peaks for a variety of signals with different morphologies and strength levels encountered in clinical practice. (paper)
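    The published algorithm detects fetal QRS complexes only after the maternal ECG components have been removed and the residual channels intelligently merged; those two steps are the paper's contribution and are not reproduced here. Purely as a hedged illustration of the final peak-picking stage, the sketch below applies a band-pass filter and SciPy's generic peak detector to a single residual channel; the filter band, refractory period and threshold are assumptions for this example, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_fqrs(residual: np.ndarray, fs: float) -> np.ndarray:
    """Toy fQRS detector on one residual abdominal channel (maternal ECG already removed).

    All parameter values are illustrative; the published method uses subspace
    decomposition and multi-channel merging rather than this single-channel rule.
    """
    # Band-pass around typical QRS energy (band limits are assumptions)
    b, a = butter(3, [10 / (fs / 2), 45 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, residual)
    envelope = np.abs(filtered)
    # Fetal heart rate is roughly 110-160 bpm, so enforce a ~0.3 s refractory period
    peaks, _ = find_peaks(envelope,
                          distance=int(0.3 * fs),
                          height=3 * np.median(envelope))
    return peaks
```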

  1. Monitoring sleep depth: analysis of bispectral index (BIS) based on polysomnographic recordings and sleep deprivation.

    Science.gov (United States)

    Giménez, Sandra; Romero, Sergio; Alonso, Joan Francesc; Mañanas, Miguel Ángel; Pujol, Anna; Baxarias, Pilar; Antonijoan, Rosa Maria

    2017-02-01

    The assessment and management of sleep are increasingly recommended in clinical practice. Polysomnography (PSG) is considered the gold standard test to monitor sleep objectively, but some practical and technical constraints exist due to environmental and patient considerations. Bispectral index (BIS) monitoring is commonly used in clinical practice for guiding anesthetic administration and provides an index based on relationships between EEG components. Due to similarities in EEG synchronization between anesthesia and sleep, several studies have assessed BIS as a sleep monitor, with contradictory results. The aim of this study was to evaluate objectively both the feasibility and reliability of BIS for sleep monitoring through a robust methodology, which included full PSG recordings at baseline and after 40 h of sleep deprivation. Results confirmed that the BIS index was highly correlated with the hypnogram (0.89 ± 0.02), showing a progressive decrease as sleep deepened and an increase during REM sleep (awake: 91.77 ± 8.42; stage N1: 83.95 ± 11.05; stage N2: 71.71 ± 11.99; stage N3: 42.41 ± 9.14; REM: 80.11 ± 8.73). Mean and median BIS values were lower in the post-deprivation night than in the baseline night, showing statistical differences for slow wave sleep (baseline: 42.41 ± 9.14 vs. post-deprivation: 39.49 ± 10.27; p = 0.02). BIS scores were able to discriminate properly between deep (N3) and light (N1, N2) sleep. BIS values during REM overlapped those of other sleep stages, although EMG activity provided by the BIS monitor could help to identify REM sleep if needed. In conclusion, BIS monitors could provide a useful measure of sleep depth in particular situations such as intensive care units, and they could be used as an alternative for sleep monitoring in order to reduce PSG-derived costs and to increase capacity in ambulatory care.

  2. Long-term neural recordings using MEMS based moveable microelectrodes in the brain

    Directory of Open Access Journals (Sweden)

    Nathan Jackson

    2010-06-01

    Full Text Available One of the critical requirements of the emerging class of neural prosthetic devices is to maintain good quality neural recordings over long time periods. We report here a novel MEMS (Micro-ElectroMechanical Systems) based technology that can move microelectrodes, in the event of deterioration in the neural signal, to sample a new set of neurons. Microscale electro-thermal actuators are used to controllably move microelectrodes post-implantation in steps of approximately 9 µm. In this study, a total of 12 moveable microelectrode chips were individually implanted in adult rats. Two of the 12 moveable microelectrode chips were not moved over a period of 3 weeks and were treated as control experiments. During the first three weeks of implantation, moving the microelectrodes led to an improvement in the average SNR from 14.61 ± 5.21 dB before movement to 18.13 ± 4.99 dB after movement across all microelectrodes and all days. However, the average RMS values of noise amplitudes were similar at 2.98 ± 1.22 µV and 3.01 ± 1.16 µV before and after microelectrode movement. Beyond three weeks, the primary observed failure mode was biological rejection of the PMMA (dental cement) based skull mount, resulting in the device loosening and eventually falling from the skull. Additionally, the average SNR for functioning devices beyond three weeks was 11.88 ± 2.02 dB before microelectrode movement and was significantly different (p<0.01) from the average SNR of 13.34 ± 0.919 dB after movement. The results of this study demonstrate that MEMS based technologies can move microelectrodes in rodent brains in long-term experiments, resulting in improvements in signal quality. Further improvements in packaging and surgical techniques will potentially enable movable microelectrodes to record cortical neuronal activity in chronic experiments.

  3. Integration of strategy experiential learning in e-module of electronic records management

    Directory of Open Access Journals (Sweden)

    S. Sutirman

    2018-01-01

    Full Text Available This study aims to determine the effectiveness of an e-module of electronic records management integrated with experiential learning strategies in improving student achievement in the cognitive, psychomotor, and affective domains. It is a research and development study. The research and development model used is Web-Based Instructional Design (WBID), developed by Davidson-Shivers and Rasmussen. The research and development steps were analysis, evaluation planning, concurrent design, implementation, and summative evaluation. The approach combined qualitative and quantitative methods. Data were collected using the Delphi technique, observation, documentation studies and tests, and were analysed both qualitatively and quantitatively. The effectiveness of the product was tested with a quasi-experimental pretest-posttest non-equivalent control group design. The results showed that the e-module of electronic records management integrated with experiential learning strategies can improve student achievement in the cognitive, psychomotor, and affective domains.

  4. Spatiotemporal climatic, hydrological, and environmental variations based on records of annually laminated lake sediments from northern Poland

    Science.gov (United States)

    Tylmann, W.; Blanke, L.; Kinder, M.; Loewe, T.; Mayr, C.; Ohlendorf, C.; Zolitschka, B.

    2009-12-01

    Northern Poland offers a unique opportunity to compare varved lake sediment records with distinct climatic trends along a 700 km long W-E transect. Annually laminated Holocene sediment sequences from Lake Lubinskie, Lake Suminko, Lake Lazduny, and Lake Szurpily were cored for high-resolution multiproxy climate and environmental reconstruction in the framework of the Polish-German project “Northern Polish Lake Research” (NORPOLAR). First results from a 139 cm long gravity core from Lake Lazduny (53°51.4’N, 21°57.3’E) document deposition of an organic (mean organic matter: 13.9%; mean biogenic opal: 9.8%) and highly carbonaceous gyttja (mean calcite content: 61.6%). The finely laminated sediment consists of biochemical varves. Pale spring/summer layers composed of autochthonous carbonates alternate with dark fall/winter layers made of organic and minerogenic detritus. The established chronology for the last 1500 calendar years is based on thin section analysis supported by independent radiometric dating (C-14, Pb-210). Sedimentological, geochemical and stable isotope analyses were carried out at decadal temporal resolution. Additionally, non-destructive and high-resolution XRF scanning data reveal a rhythmic variation in the Ca content that reflects seasonal calcite deposition. Redox-sensitive elements like Fe, Mn and S are interpreted as a response to mean winter temperatures: colder winter temperatures → extended lake ice cover → intensification of meromixis → increased Fe/Mn ratio. In turn, these parameters can be linked to NAO (North Atlantic Oscillation) variability, because a negative NAO is related to colder and drier conditions in northeastern Europe. Climate variability is also mirrored by the δ13C record of the endogenic calcite fraction. In mid-latitude lakes calcite precipitation is dominated by productivity-controlled consumption of the dissolved inorganic carbon (DIC) pool. Thus the δ13C record potentially provides a

  5. Dedicated data recording video system for Spacelab experiments

    Science.gov (United States)

    Fukuda, Toshiyuki; Tanaka, Shoji; Fujiwara, Shinji; Onozuka, Kuniharu

    1984-04-01

    A feasibility study of modifying a video tape recorder (VTR) to add data recording capability was conducted. The system is an on-board system to support Spacelab experiments as a dedicated video system and a dedicated data recording system, operating independently of the normal operation of the Orbiter, Spacelab and the other experiments. It continuously records the video image signals together with the acquired data, status and operator's voice on one cassette video tape. Items such as crew actions, animal behavior, microscopic views and materials melting in a furnace are recorded. Experimenters are therefore expected to be able to carry out an easy and convenient analysis of the synchronized video, voice and data signals in their post-flight analysis.

  6. Filtration of human EEG recordings from physiological artifacts with empirical mode method

    Science.gov (United States)

    Grubov, Vadim V.; Runnova, Anastasiya E.; Khramova, Marina V.

    2017-03-01

    In this paper we propose a new method for dealing with noise and physiological artifacts in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (Hilbert-Huang transform). We treat noise and physiological artifacts in the EEG as specific oscillatory patterns that cause problems during EEG analysis and that can be detected with the help of additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The algorithm consists of the following steps: empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of those modes, and reconstruction of the initial EEG signal. We test the method on the filtering of experimental human EEG signals contaminated by eye-movement artifacts and show its high efficiency.
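    A minimal sketch of the last three steps of this scheme is given below, assuming the intrinsic mode functions (IMFs) from step one have already been computed (for example with the PyEMD package) and that an EOG channel recorded alongside the EEG serves as the artifact reference. The correlation threshold and all variable names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def remove_artifact_modes(eeg, eog, imfs, corr_threshold=0.3):
    """Drop IMFs that correlate strongly with the artifact channel and rebuild the EEG.

    eeg, eog : 1-D arrays of equal length (EEG and simultaneously recorded EOG)
    imfs     : 2-D array of IMFs from empirical mode decomposition of `eeg`
    The 0.3 threshold is an assumption chosen for illustration only.
    """
    kept = []
    for imf in imfs:
        r = np.corrcoef(imf, eog)[0, 1]      # similarity to the artifact reference
        if abs(r) < corr_threshold:          # keep modes not dominated by the artifact
            kept.append(imf)
    # Reconstruction: the sum of the retained modes approximates the artifact-free EEG
    return np.sum(kept, axis=0) if kept else np.zeros_like(eeg)
```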

  7. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (JAVA code and executable file) was developed and tested to support analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting. The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Their careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
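    The Tool itself is distributed as a JAVA program; the toy sketch below only illustrates the distinction it draws between permutations (where the order of disease onset matters) and combinations (where it does not), using a few invented records.

```python
from collections import Counter

# Toy records: each tuple lists one patient's chronic-disease categories in order of onset.
# Data and category names are invented for illustration only.
records = [
    ("diabetes", "hypertension", "copd"),
    ("hypertension", "diabetes", "copd"),
    ("diabetes", "hypertension"),
]

# Permutations: onset order matters, so the first two records are distinct.
permutations = Counter(records)

# Combinations: order is ignored, so the first two records collapse into one cluster.
combinations = Counter(tuple(sorted(r)) for r in records)

print(len(permutations), "unique permutations")   # -> 3
print(len(combinations), "unique combinations")   # -> 2
```

    Counting both views of the same records is what allows a tool of this kind to report, as in the application above, more unique permutations than combinations.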

  8. Formal definition and dating of the GSSP (Global Stratotype Section and Point) for the base of the Holocene using the Greenland NGRIP ice core, and selected auxiliary records

    DEFF Research Database (Denmark)

    Walker, Mike; Johnsen, Sigfus Johann; Rasmussen, Sune Olander

    2009-01-01

    The Greenland ice core from NorthGRIP (NGRIP) contains a proxy climate record across the Pleistocene-Holocene boundary of unprecedented clarity and resolution. Analysis of an array of physical and chemical parameters within the ice enables the base of the Holocene, as reflected in the first signs...... the Global Stratotype Section and Point (GSSP) for the base of the Holocene Series/Epoch (Quaternary System/Period) has been ratified by the International Union of Geological Sciences. Five auxiliary stratotypes for the Pleistocene-Holocene boundary have also been recognised....

  9. Construction contract revenue recording comparison

    Directory of Open Access Journals (Sweden)

    Hana Bohušová

    2008-01-01

    Full Text Available Publicly traded companies prepare their consolidated accounts in conformity with the international accounting standards (IAS/IFRS) in accordance with Regulation No. 1606/2002. This is obligatory for all publicly traded joint-stock companies in the Czech Republic. Other companies prepare financial statements in accordance with national accounting standards, namely the Accounting Act No. 563/1991 Coll., Regulation No. 500/2002 Coll., and the Czech Accounting Standards. The two systems are based on different principles, so there are many differences. The Czech Accounting System (CAS) is based on rules, while IAS/IFRS is based on principles (Kovanicová, 2005). These differences are mainly caused by different underlying philosophies: CAS gives priority to fiscal policy over economic substance, while IAS/IFRS prefers economic substance. One of the most significant differences is in the field of revenue recording. In IAS/IFRS there are two standards concerning revenue recording (IAS 18 − Revenue, IAS 11 − Construction Contracts); in the Czech Republic, CAS 019 − Expenses and Revenue deals with revenue recording. The paper compares the methodical approaches to revenue recording used by IAS/IFRS and by CAS. The most important differences arise from the different treatment of revenue recording for long-term contracts (construction contracts, software development contracts).

  10. Disassociation for electronic health record privacy.

    Science.gov (United States)

    Loukides, Grigorios; Liagouris, John; Gkoulalas-Divanis, Aris; Terrovitis, Manolis

    2014-08-01

    The dissemination of Electronic Health Record (EHR) data, beyond the originating healthcare institutions, can enable large-scale, low-cost medical studies that have the potential to improve public health. Thus, funding bodies, such as the National Institutes of Health (NIH) in the U.S., encourage or require the dissemination of EHR data, and a growing number of innovative medical investigations are being performed using such data. However, simply disseminating EHR data, after removing identifying information, may risk privacy, as patients can still be linked with their record, based on diagnosis codes. This paper proposes the first approach that prevents this type of data linkage using disassociation, an operation that transforms records by splitting them into carefully selected subsets. Our approach preserves privacy with significantly lower data utility loss than existing methods and does not require data owners to specify diagnosis codes that may lead to identity disclosure, as these methods do. Consequently, it can be employed when data need to be shared broadly and be used in studies, beyond the intended ones. Through extensive experiments using EHR data, we demonstrate that our method can construct data that are highly useful for supporting various types of clinical case count studies and general medical analysis tasks. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Comparative Analysis of Music Recordings from Western and Non-Western traditions by Automatic Tonal Feature Extraction

    Directory of Open Access Journals (Sweden)

    Emilia Gómez

    2008-09-01

    Full Text Available The automatic analysis of large musical corpora by means of computational models overcomes some limitations of manual analysis, and the unavailability of scores for most existing music makes it necessary to work with audio recordings. Until now, research in this area has focused on music from the Western tradition. Nevertheless, we might ask whether the available methods are suitable for analyzing music from other cultures. We present an empirical approach to the comparative analysis of audio recordings, focusing on tonal features and data mining techniques. Tonal features are related to the pitch class distribution, pitch range and employed scale, gamut and tuning system. We provide our initial but promising results obtained when trying to automatically distinguish music from Western and non-Western traditions; we analyze which descriptors are most relevant and study their distribution over 1500 pieces from different traditions and styles. As a result, some feature distributions differ for Western and non-Western music, and the obtained classification accuracy is higher than 80% for different classification algorithms and an independent test set. These results show that automatic description of audio signals together with data mining techniques provides a means to characterize huge music collections from different traditions and complements musicological manual analyses.
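    As a hedged illustration of the kind of tonal descriptor discussed above, the sketch below computes a 12-bin pitch class profile for a single recording using the librosa library; this is a stand-in chosen for the example, not the descriptor set or toolchain used in the study.

```python
import numpy as np
import librosa

def pitch_class_distribution(path: str) -> np.ndarray:
    """Return a normalized 12-bin pitch-class profile for one audio recording.

    Uses librosa's chroma features as a simple proxy for the tonal descriptors
    discussed above; an illustrative choice, not the study's exact method.
    """
    y, sr = librosa.load(path, mono=True)
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)   # shape (12, n_frames)
    profile = chroma.mean(axis=1)                      # average over time
    return profile / profile.sum()
```

    Profiles extracted from many recordings could then be fed to a standard classifier to attempt the Western versus non-Western separation described in the abstract.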

  12. Nurses' Experiences of an Initial and Reimplemented Electronic Health Record Use.

    Science.gov (United States)

    Chang, Chi-Ping; Lee, Ting-Ting; Liu, Chia-Hui; Mills, Mary Etta

    2016-04-01

    The electronic health record is a key component of healthcare information systems. Currently, numerous hospitals have adopted electronic health records to replace paper-based records, to document care processes and to improve care quality. Integrating a healthcare information system into traditional daily nursing operations requires time and effort for nurses to become familiar with this new technology. In the stages of electronic health record implementation, smooth adoption can streamline clinical nursing activities. In order to explore the adoption process, a descriptive qualitative study design and focus group interviews were conducted 3 months after and 2 years after electronic health record system implementation (the system was aborted for 1 year in between) in one hospital located in southern Taiwan. Content analysis was performed on the interview data, and six main themes were derived. In the first stage: (1) liability, work stress, and anticipation regarding the electronic health record; (2) slow network speed and a design unfriendly to the learning process; (3) insufficient information technology/organization support. In the second stage: (4) getting used to the electronic health record and further system requirements; (5) benefits of the electronic health record in time saving and documentation; (6) unrealistic expectations of information technology competence and future use. It was concluded that user-friendly design and support from informatics technology and manpower backup would facilitate this adoption process.

  13. Coding and signal processing for magnetic recording systems

    CERN Document Server

    Vasic, Bane

    2004-01-01

    RECORDING SYSTEMS: A Brief History of Magnetic Storage, Dean Palmer; Physics of Longitudinal and Perpendicular Recording, Hong Zhou, Tom Roscamp, Roy Gustafson, Eric Boernern, and Roy Chantrell; The Physics of Optical Recording, William A. Challener and Terry W. McDaniel; Head Design Techniques for Recording Devices, Robert E. Rottmayer. COMMUNICATION AND INFORMATION THEORY OF MAGNETIC RECORDING CHANNELS: Modeling the Recording Channel, Jaekyun Moon; Signal and Noise Generation for Magnetic Recording Channel Simulations, Xueshi Yang and Erozan M. Kurtas; Statistical Analysis of Digital Signals and Systems, Dra

  14. Segmentation of heart sound recordings by a duration-dependent hidden Markov model

    International Nuclear Information System (INIS)

    Schmidt, S E; Graff, C; Toft, E; Struijk, J J; Holst-Hansen, C

    2010-01-01

    Digital stethoscopes offer new opportunities for computerized analysis of heart sounds. Segmentation of heart sound recordings into periods related to the first and second heart sound (S1 and S2) is fundamental in the analysis process. However, segmentation of heart sounds recorded with handheld stethoscopes in clinical environments is often complicated by background noise. A duration-dependent hidden Markov model (DHMM) is proposed for robust segmentation of heart sounds. The DHMM identifies the most likely sequence of physiological heart sounds, based on duration of the events, the amplitude of the signal envelope and a predefined model structure. The DHMM model was developed and tested with heart sounds recorded bedside with a commercially available handheld stethoscope from a population of patients referred for coronary arterioangiography. The DHMM identified 890 S1 and S2 sounds out of 901 which corresponds to 98.8% (CI: 97.8–99.3%) sensitivity in 73 test patients and 13 misplaced sounds out of 903 identified sounds which corresponds to 98.6% (CI: 97.6–99.1%) positive predictivity. These results indicate that the DHMM is an appropriate model of the heart cycle and suitable for segmentation of clinically recorded heart sounds

  15. [Method of recording impulses from an implanted cardiostimulator].

    Science.gov (United States)

    Vetkin, A N; Osipov, V P

    1976-01-01

    An analysis of the pulses from an implanted cardiostimulator, recorded from the surface of the patient's body, is one of the methods that permits assessment of its functioning. Because the recording electrodes may happen to lie on an equipotential line, an erroneous interpretation of the cardiostimulator's condition cannot be ruled out. It is recommended that the pulses be recorded, and subsequently analysed, in no fewer than two standard limb ECG leads.

  16. New high-resolution record of Holocene climate change in the Weddell Sea from combined biomarker analysis of the Patriot Hills blue ice area

    Science.gov (United States)

    Fogwill, Christopher; Turney, Chris; Baker, Andy; Ellis, Bethany; Cooper, Alan; Etheridge, David; Rubino, Mauro; Thornton, David; Fernando, Francisco; Bird, Michale; Munksgaard, Niels

    2017-04-01

    We report preliminary analysis of biomarkers (including dissolved organic matter (DOM) and DNA) from the Patriot Hills blue ice area (BIA) in the Ellsworth Mountains, Weddell Sea Embayment. Preliminary isotopic and multiple-gas analyses (CO2, CH4, N2O and CO) demonstrate that the Holocene comprises more than 50% of the 800 m long BIA record, and in combination the isotopic and biomarker analyses reveal a remarkable record of centennial variability through the Holocene in this sector of the Weddell Sea. Analysis using a Horiba Aqualog - which measures the fluorescence of DOM by producing a map of the fluorescence through an excitation-emission matrix (EEM) - identifies the presence of two marine protein-like components both in modern snow pit samples and within the Holocene part of the Patriot Hills BIA transect. Intriguingly, the modern seasonal trends in DOM, recorded in contemporary snow pits, have relatively low signals compared to those recorded in the mid-Holocene record, suggesting a reduction in the DOM signal in contemporary times. Given that the δD excess data suggest that the source of precipitation has remained constant through the Holocene, the biomarker signal must relate to multi-year marine productivity signals from the Weddell Sea. The marked variability in DOM between the mid-Holocene and contemporary times can only relate to periods of sustained, enhanced biological productivity in the Weddell Sea associated with shifts in the Southern Annular Mode, sea ice variability, or changes in ventilation or polynya activity. Here we discuss the possible drivers of these changes and describe how this approach at this BIA could benefit conventional ice core records regionally.

  17. Access control and privilege management in electronic health record: a systematic literature review.

    Science.gov (United States)

    Jayabalan, Manoj; O'Daniel, Thomas

    2016-12-01

    This study presents a systematic literature review of access control for electronic health record systems to protect patient's privacy. Articles from 2006 to 2016 were extracted from the ACM Digital Library, IEEE Xplore Digital Library, Science Direct, MEDLINE, and MetaPress using broad eligibility criteria, and chosen for inclusion based on analysis of ISO22600. Cryptographic standards and methods were left outside the scope of this review. Three broad classes of models are being actively investigated and developed: access control for electronic health records, access control for interoperability, and access control for risk analysis. Traditional role-based access control models are extended with spatial, temporal, probabilistic, dynamic, and semantic aspects to capture contextual information and provide granular access control. Maintenance of audit trails and facilities for overriding normal roles to allow full access in emergency cases are common features. Access privilege frameworks utilizing ontology-based knowledge representation for defining the rules have attracted considerable interest, due to the higher level of abstraction that makes it possible to model domain knowledge and validate access requests efficiently.

  18. 77 FR 5781 - Record of Decision for the Air Space Training Initiative Shaw Air Force Base, South Carolina...

    Science.gov (United States)

    2012-02-06

    ... DEPARTMENT OF DEFENSE Department of the Air Force Record of Decision for the Air Space Training Initiative Shaw Air Force Base, South Carolina Final Environmental Impact Statement ACTION: Notice of... signed the ROD for the Airspace Training Initiative Shaw Air Force Base, South Carolina Final...

  19. The buffer management scheme for the new Triumf VAX-based data acquisition and analysis system

    International Nuclear Information System (INIS)

    Ludgate, G.A.; Haley, B.; Lee, L.

    1987-01-01

    The new TRIUMF VAX-based DAAS requires data to be exchanged between acquisition, monitoring and analysis processes executing on a VAX. Data records are passed via a set of buffers contained in a region of memory shared by all processes. The responsibility for buffer management is distributed among the processes and synchronized access to the region is achieved by using the VAX self-relative queue instructions and common event flags

  20. 13 CFR 106.302 - What provisions must be set forth in a Fee Based Record?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What provisions must be set forth in a Fee Based Record? 106.302 Section 106.302 Business Credit and Assistance SMALL BUSINESS... does not constitute or imply an endorsement by SBA of the Donor or the Donor's products or services. ...

  1. Continuous Recording and Interobserver Agreement Algorithms Reported in the "Journal of Applied Behavior Analysis" (1995-2005)

    Science.gov (United States)

    Mudford, Oliver C.; Taylor, Sarah Ann; Martin, Neil T.

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the "Journal of Applied Behavior Analysis (JABA)": Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement,…

  2. Recording Information on Architectural Heritage Should Meet the Requirements for Conservation Digital Recording Practices at the Summer Palace

    Science.gov (United States)

    Zhang, L.; Cong, Y.; Wu, C.; Bai, C.; Wu, C.

    2017-08-01

    , structural distortion analysis, display, statue restoration and thematic research. Three points will be highlighted in our discussion: 1. Understanding of the heritage is more important than the particular technology used: Architectural heritage information collection and recording are based on an understanding of the value and nature of the architectural heritage. Understanding is the purpose, whereas information collection and recording are the means. 2. Demand determines technology: Collecting and recording architectural heritage information is to serve the needs of heritage research, conservation, management and display. These different needs determine the different technologies that we use. 3. Set the level of accuracy appropriately: For information recording, high accuracy is not the key criterion; rather an appropriate level of accuracy is key. There is considerable deviation between the nominal accuracy of any instrument and the accuracy of any particular measurement.

  3. Analysis of observational records of Dae-gyupyo in Joseon Dynasty

    Science.gov (United States)

    Mihn, Byeong-Hee; Lee, Ki-Won; Kim, Sang-Hyuk; Ahn, Young Sook; Lee, Yong Sam

    2012-09-01

    It is known that Dae-gyupyo (the Large Noon Gnomon) and So-gyupyo (the Small Noon Gnomon) were constructed in the reign of King Sejong (1418--1450) of the Joseon Dynasty. Gyupyo is an astronomical instrument for measuring the length of the shadow cast by a celestial body at the meridian passage time; it consists of two basic parts: a measuring scale and a vertical column. According to the Veritable Records of King Sejong and of King Myeongjong (1545--1567), the column of Dae-gyupyo was 40 Cheok (˜ 8 m) in height from the measuring scale and had a cross-bar, like the Guibiao of Shoujing Guo of the Yuan Dynasty in China. In the latter Veritable Records, three observations of the Sun on the date of the winter solstice and two of the full Moon on the first month in a luni-solar calendar are also recorded. In particular, the observational record of Dae-gyupyo for the Sun on Dec. 12, 1563 is ˜ 1 m shorter than the previous two records. To explain this, we investigated two possibilities: the vertical column was inclined, and the cross-bar was lowered. The cross-bar was attached to the column by a supporting arm; that should be installed at an angle of ˜ 36.9° to the north on the basis of a geometric structure inferred from the records of Yuanshi (History of the Yuan Dynasty). We found that it was possible that the vertical column was inclined ˜ 7.7° to the south or the supporting arm was tilted ˜ 58.3° downward. We suggest that the arm was tilted by ˜ 95° (= 36.9° + 58.3°).

  4. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework to describe phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes is now applied in biomedical research.

  5. [Desmoid fibromatosis in absorption infrared spectroscopy, emission spectral analysis and roentgen diffraction recording].

    Science.gov (United States)

    Zejkan, A; Bejcek, Z; Horejs, J; Vrbová, H; Bakosová, M; Macholda, F; Rykl, D

    1989-10-01

    The authors present the results of serial qualitative and quantitative microanalyses of bone and dental tissue samples in a patient with desmoid fibromatosis. Methods of absorption spectroscopy, emission spectral analysis and X-ray diffraction analysis, with follow-up X-ray examination, are tested. The above-mentioned methods function in an on-line system by means of a specially adjusted monitor unit controlled centrally by the computer processor system. The whole process of measurement is fully automated, and the data obtained are recorded and processed in a unified data structure classified into index-sequence blocks. Serial microanalyses offer exact data for the study of structural changes of dental and bone tissues which manifest themselves at the level of crystal-lattice shifts. They show that microanalyses open new possibilities in the detection and interpretation of chemical and structural changes of the apatite cell.

  6. A minimum operating system based on the SM5300.01 magnetic tape recorder for the Micro-8 computer

    International Nuclear Information System (INIS)

    Kartashov, S.V.

    1987-01-01

    An operating system (OS) for microcomputers based on the INTEL-8080 and 8085 microprocessors, oriented to the use of a magnetic tape recorder, is described. The system comprises a tape-recorder manager and a file structure organization system (the nucleus of the OS), a symbol text editor, a macroassembler, an interactive disassembler and a program for communication with an EC computer. The OS makes it possible to develop, debug, store and run programs written in INTEL-8085 assembly language.

  7. 9 CFR 417.5 - Records.

    Science.gov (United States)

    2010-01-01

    ... and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... records documenting the establishment's HACCP plan: (1) The written hazard analysis prescribed in § 417.2...

  8. Mapping the Early Language Environment Using All-Day Recordings and Automated Analysis.

    Science.gov (United States)

    Gilkerson, Jill; Richards, Jeffrey A; Warren, Steven F; Montgomery, Judith K; Greenwood, Charles R; Kimbrough Oller, D; Hansen, John H L; Paul, Terrance D

    2017-05-17

    This research provided a first-generation standardization of automated language environment estimates, validated these estimates against standard language assessments, and extended previous research reporting language behavior differences across socioeconomic groups. Typically developing children between 2 and 48 months of age completed monthly, daylong recordings in their natural language environments over a span of approximately 6-38 months. The resulting data set contained 3,213 12-hr recordings automatically analyzed by using the Language Environment Analysis (LENA) System to generate estimates of (a) the number of adult words in the child's environment, (b) the amount of caregiver-child interaction, and (c) the frequency of child vocal output. Child vocalization frequency and turn-taking increased with age, whereas adult word counts were age independent after early infancy. Child vocalization and conversational turn estimates predicted 7%-16% of the variance observed in child language assessment scores. Lower socioeconomic status (SES) children produced fewer vocalizations, engaged in fewer adult-child interactions, and were exposed to fewer daily adult words compared with their higher-SES peers, but within-group variability was high. The results offer new insight into the landscape of the early language environment, with clinical implications for identification of children at risk for impoverished language environments.

  9. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors.

    Science.gov (United States)

    Ratwani, Raj M; Fairbanks, Rollin J; Hettinger, A Zachary; Benda, Natalie C

    2015-11-01

    The usability of electronic health records (EHRs) continues to be a point of dissatisfaction for providers, despite certification requirements from the Office of the National Coordinator that require EHR vendors to employ a user-centered design (UCD) process. To better understand factors that contribute to poor usability, a research team visited 11 different EHR vendors in order to analyze their UCD processes and discover the specific challenges that vendors faced as they sought to integrate UCD with their EHR development. Our analysis demonstrates a diverse range of vendors' UCD practices that fall into 3 categories: well-developed UCD, basic UCD, and misconceptions of UCD. Specific challenges to practicing UCD include conducting contextually rich studies of clinical workflow, recruiting participants for usability studies, and having support from leadership within the vendor organization. The results of the study provide novel insights for how to improve usability practices of EHR vendors. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Wayside Bearing Fault Diagnosis Based on a Data-Driven Doppler Effect Eliminator and Transient Model Analysis

    Science.gov (United States)

    Liu, Fang; Shen, Changqing; He, Qingbo; Zhang, Ao; Liu, Yongbin; Kong, Fanrang

    2014-01-01

    A fault diagnosis strategy based on the wayside acoustic monitoring technique is investigated for locomotive bearing fault diagnosis. Inspired by the transient modeling analysis method based on correlation filtering analysis, a so-called Parametric-Mother-Doppler-Wavelet (PMDW) is constructed with six parameters, including a center characteristic frequency and five kinematic model parameters. A Doppler effect eliminator containing a PMDW generator, a correlation filtering analysis module, and a signal resampler is invented to eliminate the Doppler effect embedded in the acoustic signal of the recorded bearing. Through the Doppler effect eliminator, the five kinematic model parameters can be identified based on the signal itself. Then, the signal resampler is applied to eliminate the Doppler effect using the identified parameters. With the ability to detect early bearing faults, the transient model analysis method is employed to detect localized bearing faults after the embedded Doppler effect is eliminated. The effectiveness of the proposed fault diagnosis strategy is verified via simulation studies and applications to diagnose locomotive roller bearing defects. PMID:24803197

  11. Analysis of Human Mobility Based on Cellular Data

    Science.gov (United States)

    Arifiansyah, F.; Saptawati, G. A. P.

    2017-01-01

    Nowadays not only adults but even teenagers and children have their own mobile phones. This phenomenon indicates that the mobile phone has become an important part of everyday life. As a result, the amount of cellular data has also increased rapidly. Cellular data are defined as the data that record communication among mobile phone users. Cellular data are easy to obtain because the telecommunications company already keeps a record of the data for its billing system. Billing data keep a log of each user's cellular data usage over time, and information about communication between users can be obtained from these data. Through a data visualization process, interesting patterns can be seen in the raw cellular data, giving analysts prior knowledge before performing data analysis. Cellular data can then be processed using data mining to discover human mobility patterns in the existing data. In this paper, we use frequent pattern mining and association rule discovery to observe the relations between attributes in cellular data and then visualize them. We used the Weka toolkit to find the rules in the data mining stage. In general, the use of cellular data can provide supporting information for the decision-making process and supply the solutions and information needed by decision makers.
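    The paper reports using Weka for the mining step; as a rough equivalent sketch, the example below mines frequent itemsets and association rules from a few invented, one-hot-encoded call records with the mlxtend Python package. The column names, thresholds and data are assumptions made purely for illustration.

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy one-hot table: each row is a call-detail record with illustrative attributes
# (time band, cell area, call type). Values are invented for demonstration only.
cdr = pd.DataFrame({
    "evening":  [1, 1, 0, 1, 1],
    "downtown": [1, 1, 0, 1, 0],
    "voice":    [1, 0, 1, 1, 1],
    "sms":      [0, 1, 0, 0, 1],
}).astype(bool)

# Frequent itemsets above a (made-up) minimum support, then association rules
itemsets = apriori(cdr, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```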

  12. Charging for hospital pharmaceutical services: flat fee based on the medication record.

    Science.gov (United States)

    Wyatt, B K

    1979-03-01

    A 200-bed hospital's change in pricing drug products from a cost-plus-fee system to a flat fee per dose based on the medication administration record (MAR) is described. With the flat-fee system, drug charges are not recorded when the drug is dispensed by the pharmacy; data for charging doses are obtained directly from the MAR forms generated by the nursing staff. Charges are 55 cents per oral or suppository dose and $3.00 per injection dose. Drugs administered intravenously, topical drugs, injections costing more than $10.00 per dose, and miscellaneous nondrug items are still charged on a cost-plus-fee basis. Man-hours are saved in the pharmacy department because of the elimination of the pricing function and maintenance of price lists. The need for nursing staff to charge for any doses administered from emergency or Schedule II floor-stock supplies is eliminated. The workload for business office personnel is reduced because the number of individual charges is less than with the cost-plus charging system. The system is accepted by patients and third-party payers and has made a complete unit dose drug distribution system possible at lower cost.

  13. Statistical Metadata Analysis of the Variability of Latency, Device Transfer Time, and Coordinate Position from Smartphone-Recorded Infrasound Data

    Science.gov (United States)

    Garces, E. L.; Garces, M. A.; Christe, A.

    2017-12-01

    The RedVox infrasound recorder app uses microphones and barometers in smartphones to record infrasound, low-frequency sound below the threshold of human hearing. We study a device's metadata, which include position, latency time, the differences between the device's internal times and the server times, and the machine time, searching for patterns and possible errors or discontinuities in these scaled parameters. We highlight metadata variability through scaled multivariate displays (histograms, distribution curves, scatter plots), all created and organized through software development in Python. This project is helpful in ascertaining variability and honing the accuracy of smartphones, aiding the emergence of portable devices as viable geophysical data collection instruments. It can also improve the app and cloud service by increasing efficiency and accuracy, allowing better documentation and anticipation of drastic natural events such as tsunamis, earthquakes, volcanic eruptions, storms, rocket launches, and meteor impacts; recorded data can later be used for studies and analysis by a variety of professions. We expect our final results to produce insight into how to counteract problematic issues in data mining and improve the accuracy of smartphone data collection. By eliminating lurking variables and minimizing the effect of confounding variables, we hope to discover efficient processes to reduce superfluous precision, unnecessary errors, and data artifacts. These methods should conceivably be transferable to other areas of software development, data analytics, and statistics-based experiments, contributing a precedent of smartphone metadata studies from geophysical rather than societal data. The results should facilitate the rise of civilian-accessible, hand-held, data-gathering mobile sensor networks and yield more straightforward data mining techniques.
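    The abstract mentions that the displays are produced in Python; a minimal sketch of that kind of metadata summary is given below. The file name, column names and plot choices are assumptions for illustration and do not reflect the actual RedVox metadata schema.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical metadata export; columns are assumptions: device_id, latency_s, lat, lon
meta = pd.read_csv("device_metadata.csv")

# Per-device latency statistics, to spot drifting or faulty units
print(meta.groupby("device_id")["latency_s"].describe())

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
meta["latency_s"].hist(bins=50, ax=ax1)            # distribution of latency
ax1.set_xlabel("latency (s)")
meta.plot.scatter(x="lon", y="lat", ax=ax2, s=5)   # reported coordinate positions
plt.tight_layout()
plt.show()
```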

  14. Record reach : ExxonMobil extends its own world record

    Energy Technology Data Exchange (ETDEWEB)

    Wells, P.

    2008-06-15

    Extended reach drilling (ERD) records are now regularly being broken by ExxonMobil Corporation's Sakhalin project on Russia's east coast. In 2008, an oil well on the coast established a new record by achieving a measured depth of 11,680 meters. The well was punched out by a Texas-based drilling company using the world's largest land-based drilling rig. The use of ERD has reduced the capital and operating costs of the project in addition to reducing its environmental impacts. ERD has been used to drill from onshore locations to targets beneath the seafloor, eliminating the need for additional offshore structures and pipelines. The horizontal reach of the wells has improved productivity while avoiding disturbance of whale migrations in the region. The rig features a 1.5 million pound load capacity and 3000 horsepower draw-works. Top-drive drilling systems were used to transmit real-time data to external locations for further evaluation. Oil and gas are also produced from a gravity-based offshore platform. It was concluded that longer wellbores are now being developed by the corporation in order to drill under the Beaufort Sea. 2 figs.

  15. Home recording for musicians for dummies

    CERN Document Server

    Strong, Jeff

    2008-01-01

    Invaluable advice that will be music to your ears! Are you thinking of getting started in home recording? Do you want to know the latest home recording technologies? Home Recording For Musicians For Dummies will get you recording music at home in no time. It shows you how to set up a home studio, record and edit your music, master it, and even distribute your songs. With this guide, you'll learn how to compare studio-in-a-box, computer-based, and stand-alone recording systems and choose what you need. You'll gain the skills to manage your sound, take full advantage of MIDI, m

  16. Development of the electronic patient record system based on problem oriented system.

    Science.gov (United States)

    Uto, Yumiko; Iwaanakuchi, Takashi; Muranaga, Fuminori; Kumamoto, Ichiro

    2013-01-01

    In Japan, the POS (problem oriented system) is recommended in clinical guidelines; therefore, records are mainly written in the SOAP format. We developed a system whose main function enables staff members of all professions, including doctors, to enter patients' clinical information into a single shared record, regardless of whether they are outpatients or inpatients, and to view the contents chronologically. This electronic patient record system is called the "e-kanja recording system". With this system, all staff members in the medical team can share the same information. Moreover, because the contents can be reviewed and evaluated by colleagues, the quality of the records has improved.

  17. Knowledge and attitudes of nurses in community health centres about electronic medical records

    Directory of Open Access Journals (Sweden)

    Don O’Mahony

    2014-03-01

    Full Text Available Background: Nurses in primary healthcare record data for the monitoring and evaluation of diseases and services. Information and communications technology (ICT can improve quality in healthcare by providing quality medical records. However, worldwide, the majority of health ICT projects have failed. Individual user acceptance is a crucial factor in successful ICT implementation. Objectives: The aim of this study is to explore nurses’ knowledge, attitudes and perceptions regarding ICT so as to inform the future implementation of electronic medical record (EMR systems. Methods: A qualitative design was used. Semi-structured interviews were undertaken with nurses at three community health centres (CHCs in the King Sabata Dalyindyebo Local Municipality. The interview guide was informed by the literature on user acceptance of ICT. Interviews were recorded and analysed using content analysis. Results: Many nurses knew about health ICT and articulated clearly the potential benefits of an EMR such as fewer errors, more complete records, easier reporting and access to information. They thought that an EMR system would solve the challenges they identified with the current paper-based record system, including duplication of data, misfiling, lack of a chronological patient record, excessive time in recording and reduced time for patient care. For personal ICT needs, approximately half used cellphone Internet-based services and computers. Conclusions: In this study, nurses identified many challenges with the current recording methods. They thought that an EMR should be installed at CHCs. Their knowledge about EMR, positive attitudes to ICT and personal use of ICT devices increase the likelihood of successful EMR implementation at CHCs.

  18. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  19. Using recurrence plot for determinism analysis of EEG recordings in genetic absence epilepsy rats.

    Science.gov (United States)

    Ouyang, Gaoxiang; Li, Xiaoli; Dang, Chuangyin; Richards, Douglas A

    2008-08-01

    Understanding the transition of brain activity towards an absence seizure is a challenging task. In this paper, we use recurrence quantification analysis to characterize the deterministic dynamics of EEG series in the seizure-free, pre-seizure and seizure states of genetic absence epilepsy rats. The determinism measure, DET, based on the recurrence plot, was applied to analyse three EEG datasets, each containing 300 single-channel EEG epochs of 5-s duration. Statistical analysis of the DET values in each dataset was then carried out to determine whether their distributions over the three groups were significantly different. Furthermore, a surrogate technique was applied to calculate the significance level of the determinism measures in the EEG recordings. The mean (+/-SD) DET of the EEG was 0.177+/-0.045 in pre-seizure intervals, significantly higher than in seizure-free intervals (0.123+/-0.023). Significant determinism in EEG epochs was present in 25 of 300 (8.3%), 181 of 300 (60.3%) and 289 of 300 (96.3%) epochs in seizure-free, pre-seizure and seizure intervals, respectively. The results provide some first indications that EEG epochs during pre-seizure intervals exhibit a higher degree of determinism than seizure-free EEG epochs, but lower than seizure EEG epochs, in absence epilepsy. The proposed methods have the potential to detect the transition between normal brain activity and the absence seizure state, thus opening up the possibility of intervention, whether electrical or pharmacological, to prevent the oncoming seizure.
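    For readers unfamiliar with the DET measure, the sketch below computes a minimal version of it for a 1-D series: delay embedding, a thresholded distance matrix (the recurrence plot), and the fraction of recurrence points lying on diagonal lines of at least a minimum length. All parameter defaults are illustrative and are not the settings used in the study.

```python
import numpy as np

def determinism(x, dim=3, tau=1, eps=None, lmin=2):
    """Minimal recurrence-quantification DET for a 1-D series (illustrative defaults)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau                        # number of embedded points
    emb = np.array([x[i:i + n] for i in range(0, dim * tau, tau)]).T   # delay embedding
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)      # distance matrix
    if eps is None:
        eps = 0.1 * d.max()                              # recurrence threshold (assumption)
    rp = (d <= eps).astype(int)                          # recurrence plot

    rec_points = rp.sum()
    diag_points = 0
    for k in range(-(n - 1), n):                         # scan every diagonal of the plot
        run = 0
        for v in list(np.diagonal(rp, offset=k)) + [0]:  # count runs of consecutive ones
            if v:
                run += 1
            else:
                if run >= lmin:                          # only lines of length >= lmin count
                    diag_points += run
                run = 0
    return diag_points / rec_points if rec_points else 0.0

# Example call on a random series (a deterministic signal would score higher):
# det = determinism(np.random.randn(500))
```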

  20. Impact of a computerized system for evidence-based diabetes care on completeness of records: a before–after study

    Directory of Open Access Journals (Sweden)

    Roshanov Pavel S

    2012-07-01

    Full Text Available Abstract Background Physicians practicing in ambulatory care are adopting electronic health record (EHR) systems. Governments promote this adoption with financial incentives, some hinged on improvements in care. These systems can improve care, but most demonstrations of successful systems come from a few highly computerized academic environments. Those findings may not be generalizable to typical ambulatory settings, where evidence of success is largely anecdotal, with little or no use of rigorous methods. The purpose of our pilot study was to evaluate the impact of a diabetes-specific chronic disease management system (CDMS) on the recording of information pertinent to guideline-concordant diabetes care and to plan for larger, more conclusive studies. Methods Using a before–after study design we analyzed the medical records of approximately 10 patients from each of 3 diabetes specialists (total = 31) who were seen both before and after the implementation of a CDMS. We used a checklist of key clinical data to compare the completeness of information recorded in the CDMS record to both the clinical note sent to the primary care physician based on that same encounter and the clinical note sent to the primary care physician based on the visit that occurred prior to the implementation of the CDMS, accounting for provider effects with Generalized Estimating Equations. Results The CDMS record outperformed by a substantial margin the dictated notes created for the same encounter. Only 10.1% (95% CI, 7.7% to 12.3%) of the clinically important data were missing from the CDMS chart, compared to 25.8% (95% CI, 20.5% to 31.1%) from the clinical note prepared at the same time. Conclusions The CDMS chart captured information important for the management of diabetes more often than dictated notes created with or without its use, but we were unable to detect a difference in completeness between notes dictated in CDMS-associated and usual-care encounters. Our sample of

  1. A microcomputer-based daily living activity recording system.

    Science.gov (United States)

    Matsuoka, Shingo; Yonezawa, Yoshiharu; Maki, Hiromichi; Ogawa, Hidekuni; Hahn, Allen W; Thayer, Julian F; Caldwell, W Morton

    2003-01-01

    A new daily living activity recording system has been developed for monitoring health conditions and living patterns, such as respiration, posture, activity/rest ratios and general activity level. The system employs a piezoelectric sensor, a dual axis accelerometer, two low-power active filters, a low-power 8-bit single chip microcomputer and a 128 MB compact flash memory. The piezoelectric sensor, whose electrical polarization voltage is produced by mechanical strain, detects body movements. Its high-frequency output components reflect body movements produced by walking and running activities, while the low frequency components are mainly respiratory. The dual axis accelerometer detects, from body X and Y tilt angles, whether the patient is standing, sitting or lying down (prone, supine, left side or right side). The detected respiratory, behavior and posture signals are stored by the compact flash memory. After recording, these data are downloaded to a desktop computer and analyzed.
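    As a hedged illustration of the posture-detection idea described above (standing/sitting versus the four lying positions, inferred from the X and Y tilt angles of the dual-axis accelerometer), the sketch below uses simple angle thresholds. The axis conventions and the 30° threshold are assumptions for this example, not values from the paper.

```python
def classify_posture(tilt_x_deg: float, tilt_y_deg: float) -> str:
    """Classify body posture from X/Y tilt angles of a trunk-worn dual-axis accelerometer.

    Axis conventions and thresholds are illustrative assumptions only.
    """
    UPRIGHT_LIMIT = 30.0            # degrees from vertical still counted as upright
    if abs(tilt_x_deg) < UPRIGHT_LIMIT and abs(tilt_y_deg) < UPRIGHT_LIMIT:
        return "standing/sitting"   # separating the two would need extra information
    if tilt_x_deg >= UPRIGHT_LIMIT:
        return "lying prone"
    if tilt_x_deg <= -UPRIGHT_LIMIT:
        return "lying supine"
    return "lying on left side" if tilt_y_deg > 0 else "lying on right side"
```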

  2. Assessment of nursing records on cardiopulmonary resuscitation based on the utstein model

    Directory of Open Access Journals (Sweden)

    Daiane Lopes Grisante

    2014-01-01

    Full Text Available This cross-sectional study assessed the quality of nursing records on cardiopulmonary resuscitation. Forty-two patients' charts were reviewed in an intensive care unit, using the Utstein protocol. There was a predominance of men (54.8%), aged 21-70 years (38.1%), undergoing correction of acquired heart diseases (42.7%), with more than one pre-existing device (147). As the immediate cause of cardiac arrest, hypotension predominated (48.3%), and as the initial rhythm, bradycardia (37.5%). Only the time of death and the time of arrest were recorded in 100% of the sample. Professional training in Advanced Life Support was not recorded. The causes of arrest and the initial rhythm were recorded in 69% and 76.2% of the sample, respectively. Chest compressions, obtaining a patent airway, and defibrillation were recorded in less than 16%. The records were considered of low quality; this may expose professionals to legal sanctions and does not allow comparison of the effectiveness of the maneuvers with other centers.

  3. Medical record management systems: criticisms and new perspectives.

    Science.gov (United States)

    Frénot, S; Laforest, F

    1999-06-01

    The first generation of computerized medical records stored the data as text, but these records did not bring any improvement in information manipulation. The use of a relational database management system (DBMS) has largely solved this problem as it allows for data requests by using SQL. However, this requires data structuring which is not very appropriate to medicine. Moreover, the use of templates and icon user interfaces has introduced a deviation from the paper-based record (still existing). The arrival of hypertext user interfaces has proven to be of interest to fill the gap between the paper-based medical record and its electronic version. We think that further improvement can be accomplished by using a fully document-based system. We present the architecture, advantages and disadvantages of classical DBMS-based and Web/DBMS-based solutions. We also present a document-based solution and explain its advantages, which include communication, security, flexibility and genericity.

  4. A wind proxy based on migrating dunes at the Baltic coast: statistical analysis of the link between wind conditions and sand movement

    Science.gov (United States)

    Bierstedt, Svenja E.; Hünicke, Birgit; Zorita, Eduardo; Ludwig, Juliane

    2017-07-01

    We statistically analyse the relationship between the structure of migrating dunes in the southern Baltic and the driving wind conditions over the past 26 years, with the long-term aim of using migrating dunes as a proxy for past wind conditions at an interannual resolution. The present analysis is based on the dune record derived from geo-radar measurements by Ludwig et al. (2017). The dune system is located at the Baltic Sea coast of Poland and is migrating from west to east along the coast. The dunes present layers with different thicknesses that can be assigned to absolute dates at interannual timescales and put in relation to seasonal wind conditions. To statistically analyse this record and calibrate it as a wind proxy, we used a gridded regional meteorological reanalysis data set (coastDat2) covering recent decades. The identified link between the dune annual layers and wind conditions was additionally supported by the co-variability between dune layers and observed sea level variations in the southern Baltic Sea. We include precipitation and temperature into our analysis, in addition to wind, to learn more about the dependency between these three atmospheric factors and their common influence on the dune system. We set up a statistical linear model based on the correlation between the frequency of days with specific wind conditions in a given season and dune migration velocities derived for that season. To some extent, the dune records can be seen as analogous to tree-ring width records, and hence we use a proxy validation method usually applied in dendrochronology, cross-validation with the leave-one-out method, when the observational record is short. The correlations revealed between the wind record from the reanalysis and the wind record derived from the dune structure are in the range of 0.28 to 0.63, yielding statistical validation skill similar to that of dendroclimatological records.
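
    The calibration and validation described above, a linear model between the seasonal frequency of specific wind conditions and the dune migration velocity checked with leave-one-out cross-validation, can be sketched as follows. The data are synthetic placeholders; only the validation logic (fit on all seasons but one, predict the left-out season, correlate predictions with observations) follows the description in the abstract.

    ```python
    import numpy as np

    def leave_one_out_r(x, y):
        """Leave-one-out cross-validation of a simple linear proxy model y ~ a*x + b.
        Returns the correlation between LOO predictions and observations."""
        n = len(x)
        preds = np.empty(n)
        for i in range(n):
            keep = np.arange(n) != i
            a, b = np.polyfit(x[keep], y[keep], 1)   # fit without season i
            preds[i] = a * x[i] + b                  # predict the held-out season
        return np.corrcoef(preds, y)[0, 1]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # synthetic example: x = frequency of days with specific wind conditions per season,
        # y = dune migration velocity derived from annual layers (arbitrary units)
        x = rng.uniform(10, 40, size=26)             # 26 seasons, as in the 26-year record
        y = 0.8 * x + rng.normal(0, 5, size=26)
        print("LOO validation correlation:", round(leave_one_out_r(x, y), 2))
    ```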

  5. A compact self-recording pressure based sea level gauge suitable for deployments at harbour and offshore environments

    Digital Repository Service at National Institute of Oceanography (India)

    Desa, E.; Peshwe, V.B.; Joseph, A.; Mehra, P.; Naik, G.P.; Kumar, V.; Desa, E.S.; Desai, R.G.P.; Nagvekar, S.; Desai, S.P.

    A compact and lightweight self-recording pressure based sea level gauge has been designed to suit deployments from harbour and offshore environments. A novel hydraulic coupling device designed in-house was used to transfer the seawater pressure...

  6. Identifying FRBR Work-Level Data in MARC Bibliographic Records for Manifestations of Moving Images

    Directory of Open Access Journals (Sweden)

    Lynne Bisko

    2008-12-01

    Full Text Available The library metadata community is dealing with the challenge of implementing the conceptual model, Functional Requirements for Bibliographic Records (FRBR). In response, the Online Audiovisual Catalogers (OLAC) created a task force to study the issues related to creating and using FRBR-based work-level records for moving images. This article presents one part of the task force's work: it looks at the feasibility of creating provisional FRBR work-level records for moving images by extracting data from existing manifestation-level bibliographic records. Using a sample of 941 MARC records, a subgroup of the task force conducted a pilot project to look at five characteristics of moving image works. Here they discuss their methodology; analysis; selected results for two elements, original date (year) and director name; and conclude with some suggested changes to MARC coding and current cataloging policy.

  7. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
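
    A compact way to reproduce the statistical core of this protocol, treating observed mitosis times as a nonhomogeneous Poisson process whose rate increases exponentially, is to simulate event times by thinning and then examine the interevent-time statistics. This is an illustrative sketch; the rate parameters and the observation window are invented, not taken from the paper.

    ```python
    import numpy as np

    def simulate_nhpp(rate0, growth, t_end, rng):
        """Simulate event times of a nonhomogeneous Poisson process with
        rate lambda(t) = rate0 * exp(growth * t), using Lewis-Shedler thinning."""
        lam_max = rate0 * np.exp(growth * t_end)     # upper bound on the rate
        t, events = 0.0, []
        while True:
            t += rng.exponential(1.0 / lam_max)      # candidate from homogeneous process
            if t > t_end:
                return np.array(events)
            if rng.random() < rate0 * np.exp(growth * t) / lam_max:
                events.append(t)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        # e.g. mitoses per hour in one field of view over a 48 h observation window
        events = simulate_nhpp(rate0=0.5, growth=0.03, t_end=48.0, rng=rng)
        gaps = np.diff(events)                       # interevent (intermitotic) times
        print("number of mitotic events:", len(events))
        print("mean interevent time (h):", round(gaps.mean(), 2))
        # for a Poisson process the interevent coefficient of variation is close to 1
        print("interevent CV:", round(gaps.std() / gaps.mean(), 2))
    ```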

  8. Influence of mechanical scratch on the recorded magnetization stability of perpendicular recording media

    International Nuclear Information System (INIS)

    Nagano, Katsumasa; Sasaki, Syota; Futamoto, Masaaki

    2010-01-01

    Stability of recorded magnetization of hard disk drives (HDDs) is influenced by external environments, such as temperature, magnetic field, etc. Small scratches are frequently formed on the HDD medium surface upon contact with the magnetic head. The influence of temperature and mechanical scratches on the magnetization structure stability of perpendicular recording media was investigated by using a magnetic force microscope. The magnetic bit shape started to change at around 300 °C for an area with no scratches, whereas for the area near a shallow mechanical scratch it started to change at a lower temperature, around 250 °C. The variation of the magnetization structure and the recorded magnetization strength were analysed under the combined influence of temperature and mechanical scratching.

  9. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data.

    Directory of Open Access Journals (Sweden)

    Sergio Miranda Freire

    Full Text Available This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when

  10. Comparing the Performance of NoSQL Approaches for Managing Archetype-Based Electronic Health Record Data

    Science.gov (United States)

    Freire, Sergio Miranda; Teodoro, Douglas; Wei-Kleiner, Fang; Sundvall, Erik; Karlsson, Daniel; Lambrix, Patrick

    2016-01-01

    This study provides an experimental performance evaluation on population-based queries of NoSQL databases storing archetype-based Electronic Health Record (EHR) data. There are few published studies regarding the performance of persistence mechanisms for systems that use multilevel modelling approaches, especially when the focus is on population-based queries. A healthcare dataset with 4.2 million records stored in a relational database (MySQL) was used to generate XML and JSON documents based on the openEHR reference model. Six datasets with different sizes were created from these documents and imported into three single machine XML databases (BaseX, eXistdb and Berkeley DB XML) and into a distributed NoSQL database system based on the MapReduce approach, Couchbase, deployed in different cluster configurations of 1, 2, 4, 8 and 12 machines. Population-based queries were submitted to those databases and to the original relational database. Database size and query response times are presented. The XML databases were considerably slower and required much more space than Couchbase. Overall, Couchbase had better response times than MySQL, especially for larger datasets. However, Couchbase requires indexing for each differently formulated query and the indexing time increases with the size of the datasets. The performances of the clusters with 2, 4, 8 and 12 nodes were not better than the single node cluster in relation to the query response time, but the indexing time was reduced proportionally to the number of nodes. The tested XML databases had acceptable performance for openEHR-based data in some querying use cases and small datasets, but were generally much slower than Couchbase. Couchbase also outperformed the response times of the relational database, but required more disk space and had a much longer indexing time. Systems like Couchbase are thus interesting research targets for scalable storage and querying of archetype-based EHR data when population-based use

  12. First high speed imaging of lightning from summer thunderstorms over India: Preliminary results based on amateur recording using a digital camera

    Science.gov (United States)

    Narayanan, V. L.

    2017-12-01

    For the first time, high speed imaging of lightning from a few isolated tropical thunderstorms has been observed from India. The recordings were made from Tirupati (13.6°N, 79.4°E, 180 m above mean sea level) during the summer months with a digital camera capable of recording high speed videos at up to 480 fps. At 480 fps, each individual video file is recorded for 30 s, resulting in 14400 deinterlaced images per video file. An automatic processing algorithm was developed for quick identification and analysis of the lightning events and will be discussed in detail. Preliminary results indicating different types of phenomena associated with lightning, such as stepped leaders, dart leaders, luminous channels corresponding to continuing currents and M components, are discussed. While most of the examples show cloud-to-ground discharges, a few interesting cases of intra-cloud, inter-cloud and cloud-air discharges will also be displayed. This indicates that, although high speed cameras of a few thousand fps are preferred for a detailed study of lightning, moderate-range CMOS-sensor-based digital cameras can provide important information as well. The lightning imaging activity presented here was initiated as an amateur effort, and plans are currently underway to propose a suite of supporting instruments to conduct coordinated campaigns. The images discussed here were acquired from a normal residential area and indicate how frequent lightning strikes are in such tropical locations during thunderstorms, even though no towering structures are nearby. It is expected that popularizing such recordings made with affordable digital cameras will trigger more interest in lightning research and provide a possible data source from amateur observers, paving the way for citizen science.

  13. Influence of weather factors on population dynamics of two lagomorph species based on hunting bag records

    NARCIS (Netherlands)

    Rödel, H.; Dekker, J.J.A.

    2012-01-01

    Weather conditions can have a significant influence on short-term fluctuations of animal populations. In our study, which is based on time series of hunting bag records of up to 28 years from 26 counties of The Netherlands and Germany, we investigated the impact of different weather variables on

  14. 13 CFR 106.402 - What provisions must be set forth in a Non-Fee Based Record?

    Science.gov (United States)

    2010-01-01

    ... 13 CFR 106.402 (2010-01-01): What provisions must be set forth in a Non-Fee Based Record? Business Credit and Assistance, Small Business Administration. ... endorsement by SBA of the Donor, or the Donor's products or services. ...

  15. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach of using the Design Structure Matrix (DSM) modeling technique to improve the design of Electronic Medical Record (EMR) user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians’ time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.
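
    The heart of the matrix-based method, capturing pairwise interactions between interface elements in a Design Structure Matrix and grouping strongly coupled elements onto the same screen region, can be sketched with standard hierarchical clustering. The element names, interaction weights and clustering choices below are illustrative assumptions, not the authors' actual DSM.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    # Symmetric DSM: entry [i, j] scores how strongly UI elements i and j interact
    # (information exchange, spatial adjacency, similarity), on an invented 0-3 scale.
    elements = ["weight", "dose/kg", "total dose", "frequency", "route", "allergies"]
    dsm = np.array([
        [0, 3, 2, 0, 0, 1],
        [3, 0, 3, 1, 0, 0],
        [2, 3, 0, 2, 1, 0],
        [0, 1, 2, 0, 2, 0],
        [0, 0, 1, 2, 0, 1],
        [1, 0, 0, 0, 1, 0],
    ])

    # Convert interaction strength to a distance (strong interaction -> small distance)
    dist = dsm.max() - dsm
    np.fill_diagonal(dist, 0)
    clusters = fcluster(linkage(squareform(dist), method="average"),
                        t=2, criterion="maxclust")

    for c in sorted(set(clusters)):
        group = [e for e, k in zip(elements, clusters) if k == c]
        print(f"screen group {c}: {group}")
    ```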

  16. The role of MFM signal in mark size measurement in probe-based magnetic recording on CoNi/Pt multilayers

    NARCIS (Netherlands)

    Zhang, Li; Bain, James A.; Zhu, Jian-Gang; Abelmann, Leon; Onoue, T.

    2007-01-01

    A method of heat-assisted magnetic recording (HAMR) potentially suitable for probe-based storage systems is characterized. Magnetic marks were formed by a scanning tunneling microscopy (STM)-based thermal magnetic mechanism on a perpendicular CoNi/Pt multilayered film. Magnetic force microscopy

  17. Implementation of Electronic Health Records in US Nursing Homes.

    Science.gov (United States)

    Bjarnadottir, Ragnhildur I; Herzig, Carolyn T A; Travers, Jasmine L; Castle, Nicholas G; Stone, Patricia W

    2017-08-01

    While electronic health records have emerged as promising tools to help improve quality of care, nursing homes have lagged behind in implementation. This study assessed electronic health records implementation, associated facility characteristics, and potential impact on quality indicators in nursing homes. Using national Centers for Medicare & Medicaid Services and survey data for nursing homes, a cross-sectional analysis was conducted to identify variations between nursing homes that had and had not implemented electronic health records. A difference-in-differences analysis was used to estimate the longitudinal effect of electronic health records on commonly used quality indicators. Data from 927 nursing homes were examined, 49.1% of which had implemented electronic health records. Nursing homes with electronic health records were more likely to be nonprofit/government owned (P = .04) and had a lower percentage of Medicaid residents (P = .02) and higher certified nursing assistant and registered nurse staffing levels (P = .002 and .02, respectively). Difference-in-differences analysis showed greater quality improvements after implementation for five long-stay and two short-stay quality measures (P = .001 and .01, respectively) compared with those who did not implement electronic health records. Implementation rates in nursing homes are low compared with other settings, and better-resourced facilities are more likely to have implemented electronic health records. Consistent with other settings, electronic health records implementation improves quality in nursing homes, but further research is needed to better understand the mechanism for improvement and how it can best be supported.
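
    The difference-in-differences logic used here, comparing the change in a quality indicator before and after implementation between adopters and non-adopters, reduces to a simple contrast of group means. The numbers below are made up for illustration; only the estimator mirrors the analysis described.

    ```python
    import numpy as np

    def diff_in_diff(pre_treat, post_treat, pre_ctrl, post_ctrl):
        """DiD estimate: change among EHR adopters minus change among non-adopters."""
        return (np.mean(post_treat) - np.mean(pre_treat)) - \
               (np.mean(post_ctrl) - np.mean(pre_ctrl))

    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        # hypothetical long-stay quality measure (e.g. % of residents with pressure ulcers)
        pre_treat = rng.normal(8.0, 1.0, 450)   # adopters, before implementation
        post_treat = rng.normal(7.2, 1.0, 450)  # adopters, after
        pre_ctrl = rng.normal(8.1, 1.0, 470)    # non-adopters, same periods
        post_ctrl = rng.normal(8.0, 1.0, 470)
        print("DiD estimate (percentage points):",
              round(diff_in_diff(pre_treat, post_treat, pre_ctrl, post_ctrl), 2))
    ```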

  18. Predictive value of casual ECG-based resting heart rate compared with resting heart rate obtained from Holter recording

    DEFF Research Database (Denmark)

    Carlson, Nicholas; Dixen, Ulrik; Marott, Jacob L

    2014-01-01

    BACKGROUND: Elevated resting heart rate (RHR) is associated with cardiovascular mortality and morbidity. Assessment of heart rate (HR) from Holter recording may afford a more precise estimate of the effect of RHR on cardiovascular risk, as compared to casual RHR. Comparative analysis was carried ...

  19. The fossil record of the sixth extinction.

    Science.gov (United States)

    Plotnick, Roy E; Smith, Felisa A; Lyons, S Kathleen

    2016-05-01

    Comparing the magnitude of the current biodiversity crisis with those in the fossil record is difficult without an understanding of differential preservation. Integrating data from palaeontological databases with information on IUCN status, ecology and life history characteristics of contemporary mammals, we demonstrate that only a small and biased fraction of threatened species have a fossil record, compared with 20% of non-threatened species. We find strong taphonomic biases related to body size and geographic range. Modern species with a fossil record tend to be large and widespread and were described in the 19th century. The expected magnitude of the current extinction based only on species with a fossil record is about half of that based on all modern species; values for genera are similar. The record of ancient extinctions may be similarly biased, with many species having originated and gone extinct without leaving a tangible record. © 2016 John Wiley & Sons Ltd/CNRS.

  20. A technique to reduce motion artifact for externally triggered cine-MRI(EC-MRI) based on detecting the onset of the articulated word with spectral analysis

    International Nuclear Information System (INIS)

    Shimada, Yasuhiro; Nishimoto, Hironori; Kochiyama, Takanori; Fujimoto, Ichiro; Mano, Hiroaki; Masaki, Shinobu; Murase, Kenya

    2012-01-01

    One issue in externally triggered cine-magnetic resonance imaging (EC-MRI) for the dynamic observation of speech organs is motion artifact in the phase-encoding direction caused by unstable repetitions of speech during data acquisition. We propose a technique to reduce such artifact by rearranging the k-space data used to reconstruct MR images based on the analysis of recorded speech sounds. We recorded the subject's speech sounds during EC-MRI and used post hoc acoustical processing to reduce scanning noise and detect the onset of each utterance based on analysis of the recorded sounds. We selected each line of k-space from several data acquisition sessions and rearranged them to reconstruct a new series of dynamic MR images according to the analyzed time of utterance onset. Comparative evaluation showed significant reduction in motion artifact signal in the dynamic MR images reconstructed by the proposed method. The quality of the reconstructed images was sufficient to observe the dynamic aspects of speech production mechanisms. (author)
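
    The onset-detection step, finding the start of each utterance in the noise-reduced recording so that k-space lines from several sessions can be re-aligned, can be approximated with a short-time energy detector. This is a plausible stand-in rather than the authors' algorithm; the frame length, threshold and synthetic audio are assumptions.

    ```python
    import numpy as np

    def detect_onset(audio, fs, frame_ms=20.0, threshold_ratio=0.2):
        """Return the time (s) at which short-time RMS energy first exceeds a
        fraction of its maximum, a simple proxy for utterance onset."""
        frame = int(fs * frame_ms / 1000)
        n_frames = len(audio) // frame
        rms = np.array([np.sqrt(np.mean(audio[i * frame:(i + 1) * frame] ** 2))
                        for i in range(n_frames)])
        above = np.nonzero(rms > threshold_ratio * rms.max())[0]
        return above[0] * frame / fs if above.size else None

    if __name__ == "__main__":
        fs = 8000
        t = np.arange(0, 2.0, 1 / fs)
        # silence with low residual scanner noise, then a vowel-like tone from 0.8 s
        audio = 0.01 * np.random.default_rng(0).normal(size=t.size)
        audio[t >= 0.8] += 0.5 * np.sin(2 * np.pi * 220 * t[t >= 0.8])
        onset = detect_onset(audio, fs)
        print(f"estimated utterance onset: {onset:.3f} s")
        # k-space lines from each repetition could then be re-ordered relative to this onset
    ```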

  1. [Examination of safety improvement by failure record analysis that uses reliability engineering].

    Science.gov (United States)

    Kato, Kyoichi; Sato, Hisaya; Abe, Yoshihisa; Ishimori, Yoshiyuki; Hirano, Hiroshi; Higashimura, Kyoji; Amauchi, Hiroshi; Yanakita, Takashi; Kikuchi, Kei; Nakazawa, Yasuo

    2010-08-20

    This study verified how maintenance checks of medical systems, including start-of-work and end-of-work checks, contribute to preventive maintenance and safety improvement. Data on device failures in multiple facilities were collected, and the trouble repair records were analyzed using reliability engineering techniques. Data on the systems used in eight hospitals (8 general systems, 6 Angio systems, 11 CT systems, 8 MRI systems, 8 RI systems, and 9 radiation therapy systems) were analyzed. The data collection period was the nine months from April to December 2008. Seven items were analyzed, including: (1) mean time between failures (MTBF); (2) mean time to repair (MTTR); (3) mean down time (MDT); (4) number of failures found by the morning check; (5) failure occurrence time according to modality. The classification of breakdowns per device, their incidence, and their tendency could be understood by introducing reliability engineering. Analysis, evaluation, and feedback on the failure history are useful to keep downtime to a minimum and to ensure safety.
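
    The reliability quantities listed above can be computed directly from a repair log. The sketch below assumes a very simple record format (one entry per failure with the failure-detection time and the repair-completion time, plus the total observation period); the times and the treatment of MDT as equal to MTTR are illustrative assumptions.

    ```python
    from datetime import datetime, timedelta

    # One entry per failure of a single device: (failure detected, repair finished)
    failures = [
        (datetime(2008, 4, 10, 8, 30), datetime(2008, 4, 10, 15, 0)),
        (datetime(2008, 6, 2, 9, 0),  datetime(2008, 6, 3, 12, 0)),
        (datetime(2008, 9, 20, 7, 45), datetime(2008, 9, 20, 10, 15)),
    ]
    observation = datetime(2008, 12, 31) - datetime(2008, 4, 1)   # nine-month period

    down = sum(((end - start) for start, end in failures), timedelta())
    up = observation - down
    n = len(failures)

    mtbf = up / n       # mean time between failures (uptime per failure)
    mttr = down / n     # mean time to repair
    mdt = down / n      # mean down time (equal to MTTR here; it would differ if
                        # waiting and logistics delays were recorded separately)

    print(f"MTBF: {mtbf.total_seconds() / 3600:.1f} h")
    print(f"MTTR: {mttr.total_seconds() / 3600:.1f} h")
    print(f"MDT:  {mdt.total_seconds() / 3600:.1f} h")
    ```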

  2. A comparison between flexible electrogoniometers, inclinometers and three-dimensional video analysis system for recording neck movement.

    Science.gov (United States)

    Carnaz, Letícia; Moriguchi, Cristiane S; de Oliveira, Ana Beatriz; Santiago, Paulo R P; Caurin, Glauco A P; Hansson, Gert-Åke; Coury, Helenice J C Gil

    2013-11-01

    This study compared neck range of movement recording using three different methods: flexible electrogoniometers (EGM), inclinometers (INC), and a three-dimensional video analysis system (IMG), in simultaneous and synchronized data collection. Twelve females performed neck flexion-extension, lateral flexion, rotation and circumduction. The differences between EGM, INC, and IMG were calculated sample by sample. For flexion-extension movement, IMG underestimated the amplitude by 13%; moreover, EGM showed a crosstalk of about 20% for the lateral flexion and rotation axes. In lateral flexion movement, all systems showed similar amplitude and the inter-system differences were moderate (4-7%). For rotation movement, EGM showed a high crosstalk (13%) for the flexion-extension axis. During the circumduction movement, IMG underestimated the amplitude of flexion-extension movements by about 11%, and the inter-system differences were high (about 17%), except for INC-IMG regarding lateral flexion (7%) and EGM-INC regarding flexion-extension (10%). For application in the workplace, INC presents good results compared to IMG and EGM, though INC cannot record rotation. EGM should be improved in order to reduce its crosstalk errors and allow recording of the full neck range of movement. Due to non-optimal positioning of the cameras for recording flexion-extension, IMG underestimated the amplitude of these movements. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.

  3. Estimation of optimal hologram recording modes on photothermal materials

    Science.gov (United States)

    Dzhamankyzov, Nasipbek Kurmanalievich; Ismanov, Yusupzhan Khakimzhanovich; Zhumaliev, Kubanychbek Myrzabekovich; Alymkulov, Samsaly Amanovich

    2018-01-01

    A theoretical analysis of the hologram recording process on photothermal media is presented, with the aim of estimating the laser radiation power required for information recording as a function of spatial frequency and radiation exposure duration. Results of the analysis showed that materials with a low thermal diffusivity are necessary to increase the recording density in these media, and that the recording should be performed with short pulses to minimize the thermal diffusion length. A solution for the heat conduction equation for photothermal materials heated by an interference laser field was found. The solution obtained allows one to determine the required value of the recording temperature for given spatial frequencies, depending on the thermophysical parameters of the medium and on the power and duration of the heating radiation.

  4. Sensing and recording the vibration of a spinning rotor with NCDT and UV recorder

    International Nuclear Information System (INIS)

    Ahmed, Z.; Khan, N.

    1998-01-01

    One of the problems faced during commissioning of an ultracentrifuge, developed at Dr. A. Q. Khan Research Laboratories for the separation of heavy nuclei through a centrifugation process, was the unwanted mechanical vibration that developed in its fast-spinning rotor. These high-amplitude vibrations invariably resulted in the crash of the rotor, ending in operational failure. This paper describes a practical procedure adopted to sense these vibrations with the help of a non-contact displacement transducer (N.C.D.T.) and to record them with an ultraviolet (UV) recorder. Afterwards, analysis of these recordings guided the alterations/modifications required in the design/manufacturing process, thereby making the operation successful. (author)

  5. Documentation of Accounting Records in Light of Legislative Innovations

    Directory of Open Access Journals (Sweden)

    K. V. BEZVERKHIY

    2017-05-01

    Full Text Available Legislative reforms in accounting aim to simplify accounting records and the compilation of financial reports by business entities, thus improving the position of Ukraine in the global Doing Business ranking. This simplification is implied in the changes to the Regulation on Documentation of Accounting Records, entered into force by a Resolution of the Ukrainian Ministry of Finance. The objective of the study is to analyze the legislative innovations involved. A review of changes in the documentation of accounting records is made. A comparative analysis of changes in the Regulation on Documentation of Accounting Records is made by sections: (1) General; (2) Primary documents; (3) Accounting records; (4) Correction of errors in primary documents and accounting records; (5) Organization of document circulation; (6) Storage of documents. Methods of analysis and synthesis are used to identify the differences between the editions of the Regulation on Documentation of Accounting Records. The results of the study have theoretical and practical value for the domestic business enterprise sector.

  6. A novel high electrode count spike recording array using an 81,920 pixel transimpedance amplifier-based imaging chip.

    Science.gov (United States)

    Johnson, Lee J; Cohen, Ethan; Ilg, Doug; Klein, Richard; Skeath, Perry; Scribner, Dean A

    2012-04-15

    Microelectrode recording arrays of 60-100 electrodes are commonly used to record neuronal biopotentials, and these have aided our understanding of brain function, development and pathology. However, higher density microelectrode recording arrays of larger area are needed to study neuronal function over broader brain regions such as in cerebral cortex or hippocampal slices. Here, we present a novel design of a high electrode count picocurrent imaging array (PIA), based on an 81,920 pixel Indigo ISC9809 readout integrated circuit camera chip. While the chip was originally developed for interfacing to infrared photodetector arrays, we have adapted it for neuron recording by bonding it to microwire glass, resulting in an array with an inter-electrode pixel spacing of 30 μm. In a high density electrode array, the ability to selectively record neural regions at high speed and with good signal to noise ratio are both functionally important. A critical feature of our PIA is that each pixel contains a dedicated low noise transimpedance amplifier (∼0.32 pA rms) which allows recording high signal to noise ratio biocurrents comparable to single electrode voltage amplifier recordings. Using selective sampling of 256 pixel subarray regions, we recorded the extracellular biocurrents of rabbit retinal ganglion cell spikes at sampling rates up to 7.2 kHz. Full array local electroretinogram currents could also be recorded at frame rates up to 100 Hz. A PIA with a full complement of 4 readout circuits would span 1 cm and could acquire simultaneous data from selected regions of 1024 electrodes at sampling rates up to 9.3 kHz. Published by Elsevier B.V.

  7. Performance-based seismic assessment of vulnerability of dam using time history analysis

    Directory of Open Access Journals (Sweden)

    Elmrabet Oumnia

    2018-01-01

    Full Text Available The current performance-based seismic assessment procedure can be computationally intensive, as it requires many time history analyses (THA), each requiring time-intensive post-processing of results. Time history analysis is a part of structural analysis and is the calculation of the response of a structure to an earthquake. It is one of the main processes of structural design in regions where earthquakes are prevalent. The objective of this study is to evaluate the seismic performance of an embankment dam located on the Oued RHISS in the Province of AL HOCEIMA using the THA method. To monitor structural behavior, the seismic vulnerability of the structure is evaluated under real earthquake records, considering the soil-structure-fluid interaction. In this study, a simple assistant program is developed for implementing earthquake analyses of the structure with ANSYS; ground acceleration-time history data are used for seismic analysis, and dynamic numerical simulations were conducted to study and identify the total response of the soil-structure system.
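
    Time history analysis in its simplest form integrates the equation of motion of a structure under a recorded ground acceleration. The sketch below uses a single-degree-of-freedom idealization and the constant-average-acceleration Newmark method with a synthetic record; the dam model in the study is far more detailed (soil-structure-fluid interaction, ANSYS, real accelerograms), so this only illustrates the THA step itself.

    ```python
    import numpy as np

    def newmark_sdof(ag, dt, period=0.5, damping=0.05, beta=0.25, gamma=0.5):
        """Linear SDOF response (unit mass) to ground acceleration ag [m/s^2]
        using the constant-average-acceleration Newmark method.
        Returns the relative displacement history [m]."""
        m = 1.0
        wn = 2.0 * np.pi / period
        k, c = m * wn**2, 2.0 * damping * m * wn
        p = -m * ag                                   # effective earthquake force
        u = np.zeros_like(ag); v = np.zeros_like(ag); a = np.zeros_like(ag)
        a[0] = (p[0] - c * v[0] - k * u[0]) / m
        k_hat = k + gamma / (beta * dt) * c + m / (beta * dt**2)
        A = m / (beta * dt) + gamma / beta * c
        B = m / (2 * beta) + dt * (gamma / (2 * beta) - 1.0) * c
        for i in range(len(ag) - 1):
            dp = (p[i + 1] - p[i]) + A * v[i] + B * a[i]
            du = dp / k_hat
            dv = gamma / (beta * dt) * du - gamma / beta * v[i] \
                 + dt * (1.0 - gamma / (2 * beta)) * a[i]
            da = du / (beta * dt**2) - v[i] / (beta * dt) - a[i] / (2 * beta)
            u[i + 1] = u[i] + du
            v[i + 1] = v[i] + dv
            a[i + 1] = a[i] + da
        return u

    if __name__ == "__main__":
        dt = 0.01
        t = np.arange(0, 20, dt)
        # toy "record": decaying sine pulse standing in for a real accelerogram
        ag = 2.0 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)
        u = newmark_sdof(ag, dt, period=0.5, damping=0.05)
        print("peak relative displacement: %.4f m" % np.max(np.abs(u)))
    ```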

  8. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Base Height (CBH) Environmental Data Record (EDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of Cloud Base Heights (CBH) from the Visible Infrared Imaging Radiometer Suite...

  9. An analysis of electronic health record-related patient safety incidents.

    Science.gov (United States)

    Palojoki, Sari; Mäkelä, Matti; Lehtonen, Lasse; Saranto, Kaija

    2017-06-01

    The aim of this study was to analyse electronic health record-related patient safety incidents in the patient safety incident reporting database in fully digital hospitals in Finland. We compare Finnish data to similar international data and discuss their content with regard to the literature. We analysed the types of electronic health record-related patient safety incidents that occurred at 23 hospitals during a 2-year period. A procedure of taxonomy mapping served to allow comparisons. This study represents a rare examination of patient safety risks in a fully digital environment. The proportion of electronic health record-related incidents was markedly higher in our study than in previous studies with similar data. Human-computer interaction problems were the most frequently reported. The results show the possibility of error arising from the complex interaction between clinicians and computers.

  10. An analysis of concert saxophone vibrato through the examination of recordings by eight prominent soloists

    Science.gov (United States)

    Zinninger, Thomas

    This study examines concert saxophone vibrato through the analysis of several recordings of standard repertoire by prominent soloists. The vibrato of Vincent Abato, Arno Bornkamp, Claude Delangle, Jean-Marie Londeix, Marcel Mule, Otis Murphy, Sigurd Rascher, and Eugene Rousseau is analyzed with regards to rate, extent, shape, and discretionary use. Examination of these parameters was conducted through both general observation and precise measurements with the aid of a spectrogram. Statistical analyses of the results provide tendencies for overall vibrato use, as well as the effects of certain musical attributes (note length, tempo, dynamic, range) on vibrato. The results of this analysis are also compared among each soloist and against pre-existing theories or findings in vibrato research.

  11. Comparison of radon-daughter exposures calculated for US- underground uranium miners based on MSHA and company records

    International Nuclear Information System (INIS)

    Cooper, W.E.

    1981-01-01

    How accurate are past and present employee radon-daughter exposure records of underground uranium miners employed in the United States? This often-debated question is essential for future substantiation of safe exposure limits. An apparent discrepancy between company-reported exposures and Mining Enforcement and Safety Administration (MESA) projected exposures was detected in 1977. For these reasons, a need for an updated comparison of these exposure data was indicated. This paper gives some of the conclusions of the earlier study and compares more recent exposure records compiled by the Atomic Industrial Forum, Inc., with projected exposures based on sampling by Federal mine inspectors.

  12. Comparison of semiautomated bird song recognition with manual detection of recorded bird song samples

    Directory of Open Access Journals (Sweden)

    Lisa A. Venier

    2017-12-01

    Full Text Available Automated recording units are increasingly being used to sample wildlife populations. These devices can produce large amounts of data that are difficult to process manually. However, the information in the recordings can be summarized with semiautomated sound recognition software. Our objective was to assess the utility of semiautomated bird song recognizers to produce data useful for conservation and sustainable forest management applications. We compared detection data generated from expert-interpreted recordings of bird songs collected with automated recording units and data derived from a semiautomated recognition process. We recorded bird songs at 109 sites in boreal forest in 2013 and 2014 using automated recording units. We developed bird-song recognizers for 10 species using Song Scope software (Wildlife Acoustics), and each recognizer was used to scan a set of recordings that was also interpreted manually by an expert in birdsong identification. We used occupancy models to estimate the detection probability associated with each method. Based on these detection probability estimates we produced cumulative detection probability curves. In a second analysis we estimated the detection probability of bird song recognizers using multiple 10-minute recordings for a single station and visit (35 to 63 10-minute recordings in each of four one-week periods). Results show that the detection probability of most species from single 10-min recordings is substantially higher using expert-interpreted bird song recordings than using the song recognizer software. However, our results also indicate that detection probabilities for song recognizers can be significantly improved by using more than a single 10-minute recording, which can easily be done at little additional cost with the automated procedure. Based on these results we suggest that automated recording units and song recognizer software can be valuable tools to estimate detection probability and
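
    The key quantitative point, that a recognizer with a lower single-recording detection probability can approach expert-level detection when several 10-minute recordings per visit are pooled, follows from the cumulative-detection relation p* = 1 - (1 - p)^k under independence. The probabilities below are illustrative, not the study's estimates.

    ```python
    def cumulative_detection(p_single, k):
        """Probability of detecting a species at least once in k independent recordings."""
        return 1.0 - (1.0 - p_single) ** k

    # illustrative single-recording detection probabilities (not the study's values)
    p_expert, p_recognizer = 0.60, 0.35

    for k in (1, 3, 5, 10):
        print(f"{k:2d} recordings | expert: {cumulative_detection(p_expert, k):.2f}"
              f" | recognizer: {cumulative_detection(p_recognizer, k):.2f}")
    ```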

  13. Knowledge and attitudes of nurses in community health centres about electronic medical records

    Directory of Open Access Journals (Sweden)

    Don O'Mahony

    2014-02-01

    Full Text Available Background: Nurses in primary healthcare record data for the monitoring and evaluation of diseases and services. Information and communications technology (ICT) can improve quality in healthcare by providing quality medical records. However, worldwide, the majority of health ICT projects have failed. Individual user acceptance is a crucial factor in successful ICT implementation. Objectives: The aim of this study is to explore nurses’ knowledge, attitudes and perceptions regarding ICT so as to inform the future implementation of electronic medical record (EMR) systems. Methods: A qualitative design was used. Semi-structured interviews were undertaken with nurses at three community health centres (CHCs) in the King Sabata Dalyindyebo Local Municipality. The interview guide was informed by the literature on user acceptance of ICT. Interviews were recorded and analysed using content analysis. Results: Many nurses knew about health ICT and articulated clearly the potential benefits of an EMR such as fewer errors, more complete records, easier reporting and access to information. They thought that an EMR system would solve the challenges they identified with the current paper-based record system, including duplication of data, misfiling, lack of a chronological patient record, excessive time in recording and reduced time for patient care. For personal ICT needs, approximately half used cellphone Internet-based services and computers. Conclusions: In this study, nurses identified many challenges with the current recording methods. They thought that an EMR should be installed at CHCs. Their knowledge about EMR, positive attitudes to ICT and personal use of ICT devices increase the likelihood of successful EMR implementation at CHCs.

  14. Comparison of Nonlinear Model Results Using Modified Recorded and Synthetic Ground Motions

    International Nuclear Information System (INIS)

    Spears, Robert E.; Wilkins, J. Kevin

    2011-01-01

    A study has been performed that compares results of nonlinear model runs using two sets of earthquake ground motion time histories that have been modified to fit the same design response spectra. The time histories include applicable modified recorded earthquake ground motion time histories and synthetic ground motion time histories. The modified recorded time histories are derived from records selected on the basis of consistent magnitude and distance. The synthetic ground motion time histories are generated using appropriate Fourier amplitude spectra, Arias intensity, and drift correction. All of the time history modification is performed using the same algorithm to fit the design response spectra. The study provides data to demonstrate that properly managed synthetic ground motion time histories are reasonable for use in nonlinear seismic analysis.

  15. Proposal for a New Model for Highway Records in the Republic of Croatia

    Directory of Open Access Journals (Sweden)

    Rinaldo Paar

    2012-05-01

    Full Text Available Highways are public roads, whose function is to integrate Croatia in the European transport system, link the regions of Croatia and facilitate transit traffic. They are public property for general use, owned by the Republic of Croatia, and they cannot be the subject of acquisition or other proprietary rights of any kind. Today, there are two types of highway record-keeping. The first is conducted by leading companies authorised to manage highways in order to develop a highway database, or create a highway register. The second is conducted by land-management systems; the Cadastre and Land Registry. They are the official public registers for keeping records of land plots, buildings and other structures, and their ownership. Procedures that need to be implemented in the second type of record-keeping often get "stuck" in practice. Based on the problems identified in this model and an analysis of the state of record-keeping, a proposal for a new model for highway records in the Cadastre and Land Registry is given. Keywords: highways; building plot; expropriation; record; Cadastre; Land Registry

  16. An Ensemble Learning Based Framework for Traditional Chinese Medicine Data Analysis with ICD-10 Labels

    Directory of Open Access Journals (Sweden)

    Gang Zhang

    2015-01-01

    Full Text Available Objective. This study aims to establish a model to analyze the clinical experience of TCM veteran doctors. We propose an ensemble learning based framework to analyze clinical records with ICD-10 label information for effective diagnosis and acupoints recommendation. Methods. We propose an ensemble learning framework for the analysis task. A set of base learners composed of decision tree (DT) and support vector machine (SVM) classifiers are trained by bootstrapping the training dataset. The base learners are sorted by accuracy and diversity through a nondominated sort (NDS) algorithm and combined through a deep ensemble learning strategy. Results. We evaluate the proposed method with comparison to two currently successful methods on a clinical diagnosis dataset with manually labeled ICD-10 information. ICD-10 label annotation and acupoints recommendation are evaluated for the three methods. The proposed method achieves an accuracy rate of 88.2% ± 2.8% measured by zero-one loss for the first evaluation session and 79.6% ± 3.6% measured by Hamming loss, which are superior to the other two methods. Conclusion. The proposed ensemble model can effectively model the implied knowledge and experience in historic clinical data records. The computational cost of training a set of base learners is relatively low.
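
    The training scheme described (bootstrapping the training data, fitting decision-tree and SVM base learners, ranking them and combining them) can be sketched with scikit-learn. The toy single-label task, the ranking by plain validation accuracy instead of the paper's nondominated sort of accuracy and diversity, and the majority-vote combination are all simplifying assumptions.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=800, n_features=20, n_informative=8,
                               n_classes=4, random_state=0)  # stand-in for ICD-10-labelled records
    X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
    X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

    # bootstrap a pool of decision-tree and SVM base learners
    pool = []
    for i in range(20):
        idx = rng.integers(0, len(X_tr), len(X_tr))              # bootstrap sample
        model = (DecisionTreeClassifier(random_state=i) if i % 2 == 0
                 else SVC(kernel="rbf", random_state=i))
        model.fit(X_tr[idx], y_tr[idx])
        pool.append((accuracy_score(y_val, model.predict(X_val)), model))

    # keep the better half of the pool (stand-in for the accuracy/diversity sort)
    pool.sort(key=lambda t: t[0], reverse=True)
    selected = [m for _, m in pool[:10]]

    # combine the selected base learners by majority vote
    votes = np.stack([m.predict(X_te) for m in selected])
    ensemble_pred = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
    print("ensemble test accuracy:", round(accuracy_score(y_te, ensemble_pred), 3))
    ```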

  17. Required number of records for ASCE/SEI 7 ground-motion scaling procedure

    Science.gov (United States)

    Reyes, Juan C.; Kalkan, Erol

    2011-01-01

    The procedures and criteria in the 2006 IBC (International Council of Building Officials, 2006) and 2007 CBC (International Council of Building Officials, 2007) for the selection and scaling of ground motions for use in nonlinear response history analysis (RHA) of structures are based on ASCE/SEI 7 provisions (ASCE, 2005, 2010). According to ASCE/SEI 7, earthquake records should be selected from events of magnitudes, fault distance, and source mechanisms that comply with the maximum considered earthquake, and then scaled so that the average value of the 5-percent-damped response spectra for the set of scaled records is not less than the design response spectrum over the period range from 0.2Tn to 1.5Tn (where Tn is the fundamental vibration period of the structure). If at least seven ground motions are analyzed, the design values of engineering demand parameters (EDPs) are taken as the average of the EDPs determined from the analyses. If fewer than seven ground motions are analyzed, the design values of EDPs are taken as the maximum values of the EDPs. ASCE/SEI 7 requires a minimum of three ground motions. These limits on the number of records in the ASCE/SEI 7 procedure are based on engineering experience rather than on a comprehensive evaluation. This study statistically examines the required number of records for the ASCE/SEI 7 procedure, such that the scaled records provide accurate, efficient, and consistent estimates of "true" structural responses. Based on elastic-perfectly-plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI 7 scaling procedure is applied to 480 sets of ground motions. The number of records in these sets varies from three to ten. The records in each set were selected either (i) randomly, (ii) considering their spectral shapes, or (iii) considering their spectral shapes and design spectral-acceleration value, A(Tn). As compared to benchmark (that is, "true") responses from unscaled records using a larger catalog of ground
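
    The scaling criterion itself, finding a factor such that the average 5-percent-damped spectrum of the scaled record set is not less than the design spectrum anywhere between 0.2Tn and 1.5Tn, is straightforward to express once the response spectra are available. The spectra below are synthetic placeholders, and applying a single common factor to the whole set is a simplification of the provision.

    ```python
    import numpy as np

    def scale_factor_for_set(periods, record_spectra, design_spectrum, tn):
        """Smallest single factor applied to the whole record set so that the average
        scaled spectrum >= design spectrum for all periods in [0.2*Tn, 1.5*Tn]."""
        mask = (periods >= 0.2 * tn) & (periods <= 1.5 * tn)
        avg = record_spectra.mean(axis=0)             # average over the record set
        return np.max(design_spectrum[mask] / avg[mask])

    if __name__ == "__main__":
        periods = np.linspace(0.05, 3.0, 60)
        tn = 1.0                                      # fundamental period of the structure
        design = 1.0 / np.maximum(periods, 0.4)       # toy design spectrum (g)
        rng = np.random.default_rng(3)
        # seven synthetic record spectra scattered around a similar shape
        records = np.array([design * rng.uniform(0.5, 1.2, size=periods.size)
                            for _ in range(7)])
        sf = scale_factor_for_set(periods, records, design, tn)
        print("required scale factor:", round(sf, 2))
    ```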

  18. Analysis of clinical records of dental patients attending Jordan University Hospital: Documentation of drug prescriptions and local anesthetic injections

    Directory of Open Access Journals (Sweden)

    Najla Dar-Odeh

    2008-08-01

    Full Text Available Objectives: The aim of this study was to analyze clinical records of dental patients attending the Dental Department at the University of Jordan Hospital, a teaching hospital in Jordan. The analysis aimed at determining whether dental specialists properly documented the drug prescriptions and local anesthetic injections given to their patients. Methods: Dental records of the Dental Department at the Jordan University Hospital were reviewed during the period from April 3rd until April 26th 2007, along with the prescriptions issued during that period. Results: A total of 1000 records were reviewed, with a total of 53 prescriptions issued during that period. Thirty records documented the prescription by stating the category of the prescribed drug. Only 13 records stated the generic or the trade names of the prescribed drugs. Of these, 5 records contained the full elements of a prescription. As for local anesthetic injections, the term “LA used” was found in 22 records, while the names and quantities of the local anesthetics used were documented in only 13 records. Only 5 records documented the full elements of a local anesthetic injection. Conclusion: The essential data of drug prescriptions and local anesthetic injections were poorly documented by the investigated group of dental specialists. It is recommended that the hospital administration and the dental department implement clear and firm guidelines for dental practitioners, in particular to carry out the required documentation procedure. Keywords: dental records, documentation, prescriptions, local anesthesia

  19. Bat records from Malawi (Mammalia, Chiroptera)

    NARCIS (Netherlands)

    Bergmans, Wim; Jachmann, Hugo

    1983-01-01

    Five species of bats are recorded from Kasungu National Park, Malawi: Eidolon helvum (Kerr, 1792); Epomophorus anurus Heuglin, 1864; Epomophorus minor Dobson, 1880; Epomops dobsonii (Bocage, 1889); and Scotoecus hindei Thomas, 1901. Some other Malawian records of these species, based on literature

  20. Network Analysis of Foramen Ovale Electrode Recordings in Drug-resistant Temporal Lobe Epilepsy Patients

    Science.gov (United States)

    Sanz-García, Ancor; Vega-Zelaya, Lorena; Pastor, Jesús; Torres, Cristina V.; Sola, Rafael G.; Ortega, Guillermo J.

    2016-01-01

    Approximately 30% of epilepsy patients are refractory to antiepileptic drugs. In these cases, surgery is the only alternative to eliminate/control seizures. However, a significant minority of patients continues to exhibit post-operative seizures, even in those cases in which the suspected source of seizures has been correctly localized and resected. The protocol presented here combines a clinical procedure routinely employed during the pre-operative evaluation of temporal lobe epilepsy (TLE) patients with a novel technique for network analysis. The method allows for the evaluation of the temporal evolution of mesial network parameters. The bilateral insertion of foramen ovale electrodes (FOE) into the ambient cistern simultaneously records electrocortical activity at several mesial areas in the temporal lobe. Furthermore, network methodology applied to the recorded time series tracks the temporal evolution of the mesial networks both interictally and during the seizures. In this way, the presented protocol offers a unique way to visualize and quantify measures that considers the relationships between several mesial areas instead of a single area. PMID:28060326
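
    The network step, turning the simultaneously recorded FOE channels into a graph whose nodes are mesial contacts and whose edges reflect pairwise signal association, can be sketched as below. The synthetic signals, the correlation threshold and the choice of summary metrics are illustrative assumptions, not the protocol's actual connectivity measure.

    ```python
    import numpy as np
    import networkx as nx

    def window_network(signals, threshold=0.5):
        """Build a graph from one time window of multichannel recordings:
        nodes = channels, edges = |Pearson correlation| above threshold."""
        corr = np.corrcoef(signals)
        g = nx.Graph()
        n = signals.shape[0]
        g.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if abs(corr[i, j]) >= threshold:
                    g.add_edge(i, j, weight=abs(corr[i, j]))
        return g

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        fs, seconds, n_ch = 256, 4, 12                # e.g. 6 contacts per FOE, both sides
        common = rng.normal(size=fs * seconds)        # shared rhythm driving some channels
        signals = rng.normal(size=(n_ch, fs * seconds))
        signals[:6] += 0.8 * common                   # one side made more synchronized
        g = window_network(signals)
        print("edges:", g.number_of_edges(),
              "| density:", round(nx.density(g), 2),
              "| avg clustering:", round(nx.average_clustering(g), 2))
    ```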

  1. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
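
    One of the core signal-processing steps mentioned, extracting ERPs from the continuous recording by epoching around event markers, baseline-correcting and averaging, can be sketched in a few lines. The sampling rate, epoch window and synthetic data are assumptions for illustration only.

    ```python
    import numpy as np

    def average_erp(eeg, events, fs, pre=0.2, post=0.8):
        """Average baseline-corrected epochs of a single EEG channel around event samples."""
        n_pre, n_post = int(pre * fs), int(post * fs)
        epochs = []
        for ev in events:
            if ev - n_pre < 0 or ev + n_post > len(eeg):
                continue                                # skip events too close to the edges
            epoch = eeg[ev - n_pre: ev + n_post].copy()
            epoch -= epoch[:n_pre].mean()               # baseline correction (pre-stimulus mean)
            epochs.append(epoch)
        return np.mean(epochs, axis=0), np.arange(-n_pre, n_post) / fs

    if __name__ == "__main__":
        fs = 250
        rng = np.random.default_rng(5)
        eeg = rng.normal(0, 10, fs * 120)               # 2 minutes of noisy single-channel EEG (uV)
        events = np.arange(2 * fs, len(eeg) - fs, 2 * fs)   # one stimulus every 2 s
        # embed a small P300-like deflection 300 ms after every event
        for ev in events:
            eeg[ev + int(0.3 * fs): ev + int(0.5 * fs)] += 5.0
        erp, t = average_erp(eeg, events, fs)
        print("peak ERP amplitude (uV): %.1f at %.0f ms" %
              (erp.max(), t[erp.argmax()] * 1000))
    ```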

  2. Financial impact of inaccurate Adverse Event recording post Hip Fracture surgery: Addendum to 'Adverse event recording post hip fracture surgery'.

    Science.gov (United States)

    Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian

    2018-02-15

    A 2011 study (Doody et al., Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In the study, a single-centre University Hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month (August-September 2011) period, with 55 adverse events recorded prospectively, in contrast to the HIPE record of 13 (23.6%) adverse events. With the recent change in the Irish hospital funding model from a block grant to 'activity-based funding' on the basis of case load and case complexity, the hospital's financial allocation depends on accurate case complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' for 2 months was calculated when the ward-based, prospectively collected data were compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.

  3. Quality assurance records and records' system

    International Nuclear Information System (INIS)

    Link, M.; Martinek, J.

    1980-01-01

    For nuclear power plants, extensive proof of quality is required, which has to be documented reliably by quality records. With respect to paper volume, this is the most comprehensive 'curriculum vitae' of the technology. Traditional methods of information and recording are unsatisfactory for meeting regulatory requirements for maintaining the QA aspects of status reporting, completeness, traceability and retrieval. Therefore KWU has established a record (documentation) subsystem within the overall component qualification system. The general documentation requirements and the procedures and handling in accordance with this subsystem are described, using mechanical equipment as an example. Topics are: - National and international requirements - Definition of QA records - Modular and product orientated KWU record subsystem - Criteria for developing records - Record control, distribution, collection, storage - New documentation techniques (microfilm, data processing) - Education and training of personnel. (orig./RW)

  4. Towards Semantic Search and Inference in Electronic Medical Records

    Directory of Open Access Journals (Sweden)

    Bevan Koopman

    2012-09-01

    Full Text Available Background This paper presents a novel approach to searching electronic medical records that is based on concept matching rather than keyword matching. Aims The concept-based approach is intended to overcome specific challenges we identified in searching medical records. Method Queries and documents were transformed from their term-based originals into medical concepts as defined by the SNOMED-CT ontology. Results Evaluation on a real-world collection of medical records showed our concept-based approach outperformed a keyword baseline by 25% in Mean Average Precision. Conclusion The concept-based approach provides a framework for further development of inference based search systems for dealing with medical data.
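
    The central idea, replacing keyword overlap with overlap of medical concepts after both query and document have been mapped to an ontology, can be illustrated with a toy term-to-concept table. The mapping dictionary and concept identifiers below are invented stand-ins; a real system would map terms to SNOMED-CT concepts with a dedicated tool.

    ```python
    # toy term -> concept mapping (invented IDs standing in for SNOMED-CT concepts)
    TERM_TO_CONCEPT = {
        "heart attack": "C1", "myocardial infarction": "C1",
        "high blood pressure": "C2", "hypertension": "C2",
        "aspirin": "C3", "acetylsalicylic acid": "C3",
    }

    def to_concepts(text):
        """Map a free-text string to the set of concepts whose terms it contains."""
        text = text.lower()
        return {cid for term, cid in TERM_TO_CONCEPT.items() if term in text}

    def concept_overlap(query, document):
        """Score a document against a query by the fraction of query concepts it covers."""
        q, d = to_concepts(query), to_concepts(document)
        return len(q & d) / len(q) if q else 0.0

    if __name__ == "__main__":
        query = "history of myocardial infarction treated with aspirin"
        docs = [
            "Patient admitted after a heart attack; acetylsalicylic acid started.",
            "Long-standing hypertension, no cardiac events reported.",
        ]
        for doc in docs:
            print(round(concept_overlap(query, doc), 2), "-", doc)
    ```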

  5. Comparison between uroflowmetry and sonouroflowmetry in recording of urinary flow in healthy men.

    Science.gov (United States)

    Krhut, Jan; Gärtner, Marcel; Sýkora, Radek; Hurtík, Petr; Burda, Michal; Luňáček, Libor; Zvarová, Katarína; Zvara, Peter

    2015-08-01

    To evaluate the accuracy of sonouroflowmetry in recording urinary flow parameters and voided volume. A total of 25 healthy male volunteers (age 18-63 years) were included in the study. All participants were asked to carry out uroflowmetry synchronous with recording of the sound generated by the urine stream hitting the water level in the urine collection receptacle, using a dedicated cell phone. Of 188 recordings, 34 were excluded because of insufficient voided volume. Pearson's correlation coefficient was used to compare parameters recorded by uroflowmetry with those calculated based on sonouroflowmetry recordings. The flow pattern recorded by sonouroflowmetry showed a good correlation with the uroflowmetry trace. A strong correlation (Pearson's correlation coefficient 0.87) was documented between uroflowmetry-recorded flow time and duration of the sound signal recorded with sonouroflowmetry. A moderate correlation was observed in voided volume (Pearson's correlation coefficient 0.68) and average flow rate (Pearson's correlation coefficient 0.57). A weak correlation (Pearson's correlation coefficient 0.38) between maximum flow rate recorded using uroflowmetry and sonouroflowmetry-recorded peak sound intensity was documented. The present study shows that the basic concept utilizing sound analysis for estimation of urinary flow parameters and voided volume is valid. However, further development of this technology and standardization of the recording algorithm are required. © 2015 The Japanese Urological Association.

  6. Analysis of valve failures from the NUCLARR data base

    International Nuclear Information System (INIS)

    Moore, L.M.

    1997-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) contains data on component failures with categorical and qualifying information such as component design, normal operating state, system application and safety grade information which is important to the development of risk-based component surveillance testing requirements. This report presents descriptions and results of analyses of valve component failure data and covariate information available in the document Nuclear Computerized Library for Assessing Reactor Reliability Data Manual, Part 3: Hardware Component Failure Data (NUCLARR Data Manual). Although there are substantial records on valve performance, there are many categories of the corresponding descriptors and qualifying information for which specific values are missing. Consequently, this limits the data available for analysis of covariate effects. This report presents cross tabulations by different covariate categories and limited modeling of covariate effects for data subsets with substantive non-missing covariate information
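
    A cross tabulation of failure records by covariate categories, as described above, can be sketched as follows; the column names, categories and counts are illustrative assumptions, not values from the NUCLARR Data Manual.

```python
# Sketch: cross-tabulating valve failure records by covariate categories while
# keeping missing covariate values visible as an explicit category.
import pandas as pd

records = pd.DataFrame({
    "valve_design":    ["gate", "globe", None, "gate", "check", "globe"],
    "operating_state": ["open", "closed", "open", None, "closed", "open"],
    "failures":        [2, 1, 3, 1, 0, 2],
})

# Make missing covariates an explicit category so their prevalence stays visible.
for col in ("valve_design", "operating_state"):
    records[col] = records[col].fillna("missing")

# Total failures per design / operating-state combination.
table = pd.crosstab(records["valve_design"], records["operating_state"],
                    values=records["failures"], aggfunc="sum").fillna(0)
print(table)
```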

  7. Laser Based Color Film Recorder System With GaAs Microlaser

    Science.gov (United States)

    Difrancesco, David J.

    1989-07-01

    In 1984, Pixar's research and development group built a three-color laser-based film scanner/recorder system and applied it to the motion-picture arts at Lucasfilm's ILM facility. The digital film printer is capable of reading and writing 35mm film formats on a variety of film stocks. The system has been used in award-winning special-effects work and has been operated in a normal production environment since that time. The primary objective was to develop a full-color, high-resolution system, free from scan artifacts, enabling traditionally photographed motion-picture film to be inter-cut with digital raster image photography. It is applied to the art of blue-screen traveling-matte cinematography for motion-picture special effects. The system was designed using the Pixar Image Computer and conventional gas laser technology as the illumination source. This paper discusses recent experimental work in the application of GaAs microlaser technology to a digital film printing system of the future.

  8. Seismic demand evaluation based on actual earthquake records

    International Nuclear Information System (INIS)

    Jhaveri, D.P.; Czarnecki, R.M.; Kassawara, R.P.; Singh, A.

    1990-01-01

    Seismic input in the form of floor response spectra (FRS) are needed in seismic design and evaluation of equipment in nuclear power plants (NPPs). These are typically determined by analytical procedures using mathematical models of NPP structures and are known to be very conservative. Recorded earthquake data, in the form of acceleration response spectra computed from the recorded acceleration time histories, have been collected from NPP structures located in seismically active areas. Statistics of the ratios, or amplification factors, between the FRS at typical floors and the acceleration response spectra at the basemat or in the freefield, are obtained for typical NPP structures. These amplification factors are typically in terms of the peak spectral and zero period values, as well as a function of frequency. The average + 1σ values of these ratios, for those cases where enough data are available, are proposed to be used as limits to analytically calculated FRS, or for construction of simplified FRS for determining seismic input or demand in equipment qualification. (orig.)
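
    The amplification factors described above are ratios of floor response spectra to basemat (or free-field) spectra, summarized as mean + 1σ across recorded events. A minimal sketch of that calculation, with placeholder spectra rather than plant data, is given below.

```python
# Sketch: frequency-dependent amplification factors (floor response spectrum /
# basemat response spectrum) and their mean + 1 sigma envelope across events.
# Arrays are illustrative placeholders, not recorded plant data.
import numpy as np

freqs = np.array([1.0, 2.0, 5.0, 10.0, 33.0])     # Hz
floor_spectra = np.array([                         # one row per recorded event
    [0.30, 0.55, 0.90, 0.70, 0.40],
    [0.28, 0.60, 1.00, 0.65, 0.38],
    [0.35, 0.50, 0.85, 0.75, 0.42],
])
basemat_spectra = np.array([
    [0.20, 0.30, 0.45, 0.40, 0.30],
    [0.18, 0.33, 0.50, 0.38, 0.29],
    [0.22, 0.28, 0.43, 0.41, 0.31],
])

amplification = floor_spectra / basemat_spectra    # per event, per frequency
mean_plus_sigma = amplification.mean(axis=0) + amplification.std(axis=0, ddof=1)

for f, a in zip(freqs, mean_plus_sigma):
    print(f"{f:5.1f} Hz: mean + 1 sigma amplification = {a:.2f}")
```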

  9. 76 FR 75421 - Managing Government Records

    Science.gov (United States)

    2011-12-01

    ... redundant efforts, to save money, and to share knowledge within and across their organizations. In these... current plans for improving or maintaining its records management program, particularly with respect to managing electronic records, including email and social media, deploying cloud-based services or storage...

  10. In Pursuit of Reciprocity: Researchers, Teachers, and School Reformers Engaged in Collaborative Analysis of Video Records

    Science.gov (United States)

    Curry, Marnie W.

    2012-01-01

    In the ideal, reciprocity in qualitative inquiry occurs when there is give-and-take between researchers and the researched; however, the demands of the academy and resource constraints often make the pursuit of reciprocity difficult. Drawing on two video-based, qualitative studies in which researchers utilized video records as resources to enhance…

  11. Adoptive Parents' Attitudes Toward Open Birth Records.

    Science.gov (United States)

    Geissinger, Shirley

    1984-01-01

    Investigated adoptive parents' (N=42) attitudes toward the open birth record issues using a mail survey. Analysis indicated that parental fear was the most important variable. Most supported a measure allowing adult adoptees access to birth records, provided such access was agreeable to birth and adoptive parents. (JAC)

  12. α-Cut method based importance measure for criticality analysis in fuzzy probability-based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and the area defuzzification technique. •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability-based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applicable to nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results
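
    The abstract names α-cut multiplication and α-cut subtraction as building blocks of the proposed measure. The sketch below illustrates α-cut interval arithmetic on triangular fuzzy probabilities only; it is not the authors' importance measure, and the fuzzy numbers are invented.

```python
# Sketch: alpha-cut interval arithmetic on triangular fuzzy probabilities,
# the building block named in the abstract. Not the authors' full measure.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def interval_mul(x, y):
    products = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return (min(products), max(products))

def interval_sub(x, y):
    return (x[0] - y[1], x[1] - y[0])

p1 = (0.01, 0.02, 0.04)   # fuzzy probability of basic event 1 (illustrative)
p2 = (0.05, 0.08, 0.10)   # fuzzy probability of basic event 2 (illustrative)

for alpha in (0.0, 0.5, 1.0):
    c1, c2 = alpha_cut(p1, alpha), alpha_cut(p2, alpha)
    print(f"alpha={alpha}: AND-gate cut {interval_mul(c1, c2)}")
```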

  13. Definition of a Storage Accounting Record

    CERN Document Server

    Jensen, H. T.; Müller-Pfefferkorn, R.; Nilsen, J. K.; Zsolt, M.; Zappi, R.

    2011-01-01

    In this document a storage accounting record, StAR, is defined, reflecting practical, financial and legal requirements concerning storage location, usage, space, and data flow. The definition might form the basis for a standardized schema or an extension of an existing record such as the OGF UR.
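
    As a rough illustration of what such a record might carry, the sketch below encodes the aspects listed above (location, usage, space, data flow) as fields; the field names are assumptions chosen for illustration and do not reproduce the normative StAR schema.

```python
# Sketch of a storage accounting record, based only on the aspects the abstract
# lists (location, usage, space, data flow). Field names are assumptions.
from dataclasses import dataclass, asdict
import json

@dataclass
class StorageAccountingRecord:
    record_id: str
    storage_system: str            # where the data lives
    group: str                     # accountable user group
    resource_capacity_bytes: int   # total space of the resource
    logical_used_bytes: int        # space actually used
    bytes_in: int                  # data flow into the resource during the period
    bytes_out: int
    start_time: str                # accounting period (ISO 8601)
    end_time: str

record = StorageAccountingRecord(
    record_id="star-0001", storage_system="dcache.example.org",
    group="atlas", resource_capacity_bytes=10**15,
    logical_used_bytes=3 * 10**14, bytes_in=2 * 10**12, bytes_out=5 * 10**12,
    start_time="2011-01-01T00:00:00Z", end_time="2011-01-31T23:59:59Z",
)
print(json.dumps(asdict(record), indent=2))
```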

  14. Design of a cluster-randomized trial of electronic health record-based tools to address overweight and obesity in primary care.

    Science.gov (United States)

    Baer, Heather J; Wee, Christina C; DeVito, Katerina; Orav, E John; Frolkis, Joseph P; Williams, Deborah H; Wright, Adam; Bates, David W

    2015-08-01

    Primary care providers often fail to identify patients who are overweight or obese or discuss weight management with them. Electronic health record-based tools may help providers with the assessment and management of overweight and obesity. We describe the design of a trial to examine the effectiveness of electronic health record-based tools for the assessment and management of overweight and obesity among adult primary care patients, as well as the challenges we encountered. We developed several new features within the electronic health record used by primary care practices affiliated with Brigham and Women's Hospital in Boston, MA. These features included (1) reminders to measure height and weight, (2) an alert asking providers to add overweight or obesity to the problem list, (3) reminders with tailored management recommendations, and (4) a Weight Management screen. We then conducted a pragmatic, cluster-randomized controlled trial in 12 primary care practices. We randomized 23 clinical teams ("clinics") within the practices to the intervention group (n = 11) or the control group (n = 12). The new features were activated only for clinics in the intervention group. The intervention was implemented in two phases: the height and weight reminders went live on 15 December 2011 (Phase 1), and all of the other features went live on 11 June 2012 (Phase 2). Study enrollment went from December 2011 through December 2012, and follow-up ended in December 2013. The primary outcomes were 6-month and 12-month weight change among adult patients with body mass index ≥25 who had a visit at one of the primary care clinics during Phase 2. Secondary outcome measures included the proportion of patients with a recorded body mass index in the electronic health record, the proportion of patients with body mass index ≥25 who had a diagnosis of overweight or obesity on the electronic health record problem list, and the proportion of patients with body mass index ≥25 who had

  15. Spatiotemporal analysis of single-trial EEG of emotional pictures based on independent component analysis and source location

    Science.gov (United States)

    Liu, Jiangang; Tian, Jie

    2007-03-01

    The present study combined the Independent Component Analysis (ICA) and low-resolution brain electromagnetic tomography (LORETA) algorithms to identify the spatial distribution and time course of differences in single-trial EEG records between neural responses to emotional stimuli and to neutral ones. Single-trial multichannel (129-sensor) EEG records were collected from 21 healthy, right-handed subjects viewing emotional (pleasant/unpleasant) and neutral pictures selected from the International Affective Picture System (IAPS). For each subject, the single-trial EEG records for each emotional picture type were concatenated with the neutral ones, and a three-step analysis was applied to each in the same way. First, ICA was performed to decompose each concatenated single-trial EEG record into temporally independent and spatially fixed components, namely independent components (ICs). The ICs associated with artifacts were isolated. Second, clustering analysis classified, across subjects, the temporally and spatially similar ICs into the same clusters, in which a nonparametric permutation test for the Global Field Power (GFP) of IC projection scalp maps identified significantly different temporal segments for each emotional condition vs. the neutral. Third, the brain regions accounting for those significant segments were localized spatially with LORETA analysis. In each cluster, a voxel-by-voxel randomization test identified significantly different brain regions between each emotional condition and the neutral. Compared to the neutral, both types of emotional pictures elicited activation in the visual, temporal, ventromedial and dorsomedial prefrontal cortex and the anterior cingulate gyrus. In addition, the pleasant pictures activated the left middle prefrontal cortex and the posterior precuneus, while the unpleasant pictures activated the right orbitofrontal cortex, posterior cingulate gyrus and somatosensory region. Our results were well consistent with other functional imaging
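
    The first analysis step, ICA decomposition of the concatenated single-trial records, can be sketched as follows with a generic FastICA implementation; the data are random placeholders, and the artifact screening, clustering and LORETA steps are not shown.

```python
# Sketch: decomposing concatenated single-trial EEG into independent components
# with FastICA, the first step described in the abstract. Shapes are illustrative.
import numpy as np
from sklearn.decomposition import FastICA

n_channels, n_samples = 129, 5000
rng = np.random.default_rng(0)
eeg = rng.standard_normal((n_channels, n_samples))   # placeholder for real EEG

ica = FastICA(n_components=20, random_state=0, max_iter=1000)
sources = ica.fit_transform(eeg.T)    # (n_samples, n_components) IC time courses
mixing = ica.mixing_                  # (n_channels, n_components) scalp maps

print(sources.shape, mixing.shape)
```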

  16. [Design and Implementation of a Mobile Operating Room Information Management System Based on Electronic Medical Record].

    Science.gov (United States)

    Liu, Baozhen; Liu, Zhiguo; Wang, Xianwen

    2015-06-01

    A mobile operating room information management system with an electronic medical record (EMR) is designed to improve work efficiency and to enhance patient information sharing. In the operating room, the system acquires information from various medical devices through the Client/Server (C/S) pattern and automatically generates XML-based EMRs. Outside the operating room, the system provides information access services using the Browser/Server (B/S) pattern. Software testing shows that the system can correctly collect medical information from equipment and clearly display real-time waveforms. By producing higher-quality surgery records and sharing the information among mobile medical units, the system can effectively reduce doctors' workload and advance the informatization of the field hospital.

  17. Certificate-Based Encryption with Keyword Search: Enabling Secure Authorization in Electronic Health Record

    Directory of Open Access Journals (Sweden)

    Clémentine Gritti

    2016-11-01

    Full Text Available In an e-Health scenario, we study how practitioners are authorized when requesting access to medical documents containing sensitive information. Consider the following scenario. A clinician wants to access and retrieve a patient's Electronic Health Record (EHR), which means that the clinician must acquire sufficient access rights to access this document. As the EHR sits within a collection of many other patients' records, the clinician needs to specify some requirements (such as a keyword) which match the patient's record, as well as holding a valid access right. The complication begins when we do not want the server to learn anything from this query (as the server might be outsourced elsewhere). To encompass this situation, we define a new cryptographic primitive called Certificate-Based Encryption with Keyword Search (CBEKS), which is suitable in this scenario. We also specify the corresponding security models, namely computational consistency, indistinguishability against chosen keyword and ciphertext attacks, indistinguishability against keyword-guessing attacks, and collusion resistance. We provide a CBEKS construction that is proven secure in the standard model with respect to the aforementioned security models.

  18. Coral-based climate records from tropical South Atlantic

    DEFF Research Database (Denmark)

    Pereira, Natan S.; Sial, Alcides N.; Kikuchi, Ruy K.P.

    2015-01-01

    the two colonies are observed, yet both record the 2009/2010 El Niño event - a period of widespread coral bleaching - as anomalously negative δ18O values (up to −1 permil). δ13C is found to be measurably affected by the El Niño event in one colony, by more positive values (+0.39 ‰), and together...

  19. Analysis of the hologram recording on the novel chloride photo-thermo-refractive glass

    Science.gov (United States)

    Ivanov, S. A.; Nikonorov, N. V.; Dubrovin, V. D.; Krykova, V. A.

    2017-05-01

    In this research we present a new holographic material based on fluoride photo-thermo-refractive (PTR) glass: chloride PTR glass. One of the benefits of this type of PTR glass is its positive refractive index change. In this work, volume Bragg gratings were recorded in this material for the first time. The first experiments revealed that such gratings are mixed, i.e. they possess both absorption and phase components. Complex analysis shows that both the refractive index and the absorption coefficient are modulated inside the grating structure. We found that there is initially no strict dependence of the refractive index change on dosage, but as thermal treatment continues, a dependence appears. The influence of exposure on the refractive index change for this glass differs from that of the fluoride glass and shows a form of saturation after an exposure of 4-6 J/cm2. We distinguished the refractive index change from the absorption coefficient change and observed the behavior of both with increasing thermal treatment time. We found that increasing the thermal treatment time results in a significant refractive index change, while the absorption remains practically unchanged. The maximum modulation of the refractive index is comparable with that of fluoride PTR glass and reaches a value of 1600 ppm. The modulation of absorption is equal to the induced absorption caused by silver nanoparticles and depends on the reading wavelength. Our study shows that almost all of the absorption is modulated inside the grating.

  20. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
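
    The coverage figures above are simply the share of prescription volume whose product codes appear in a DKB mapping table. A minimal sketch of that computation, with invented codes and counts, is shown below.

```python
# Sketch: measuring what fraction of prescription volume a drug knowledge base
# (DKB) mapping table covers. Codes and counts are invented for illustration.
from collections import Counter

# Prescription volume per national product code (e.g., NDC), from one source.
rx_volume = Counter({"00093-1234-01": 500, "00071-0155-23": 300,
                     "LOCAL-INVENTED-1": 40, "00378-0018-01": 160})

# Product codes the DKB mapping table knows about.
dkb_codes = {"00093-1234-01", "00071-0155-23", "00378-0018-01"}

total = sum(rx_volume.values())
covered = sum(v for code, v in rx_volume.items() if code in dkb_codes)
print(f"DKB covers {100 * covered / total:.1f}% of prescription volume")
```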

  1. An Efficient Searchable Encryption Against Keyword Guessing Attacks for Sharable Electronic Medical Records in Cloud-based System.

    Science.gov (United States)

    Wu, Yilun; Lu, Xicheng; Su, Jinshu; Chen, Peixin

    2016-12-01

    Preserving the privacy of electronic medical records (EMRs) is extremely important especially when medical systems adopt cloud services to store patients' electronic medical records. Considering both the privacy and the utilization of EMRs, some medical systems apply searchable encryption to encrypt EMRs and enable authorized users to search over these encrypted records. Since individuals would like to share their EMRs with multiple persons, how to design an efficient searchable encryption for sharable EMRs is still a very challenge work. In this paper, we propose a cost-efficient secure channel free searchable encryption (SCF-PEKS) scheme for sharable EMRs. Comparing with existing SCF-PEKS solutions, our scheme reduces the storage overhead and achieves better computation performance. Moreover, our scheme can guard against keyword guessing attack, which is neglected by most of the existing schemes. Finally, we implement both our scheme and a latest medical-based scheme to evaluate the performance. The evaluation results show that our scheme performs much better performance than the latest one for sharable EMRs.

  2. Comparison of dementia recorded in routinely collected hospital admission data in England with dementia recorded in primary care.

    Science.gov (United States)

    Brown, Anna; Kirichek, Oksana; Balkwill, Angela; Reeves, Gillian; Beral, Valerie; Sudlow, Cathie; Gallacher, John; Green, Jane

    2016-01-01

    Electronic linkage of UK cohorts to routinely collected National Health Service (NHS) records provides virtually complete follow-up for cause-specific hospital admissions and deaths. The reliability of dementia diagnoses recorded in NHS hospital data is not well documented. For a sample of Million Women Study participants in England we compared dementia recorded in routinely collected NHS hospital data (Hospital Episode Statistics: HES) with dementia recorded in two separate sources of primary care information: a primary care database [Clinical Practice Research Datalink (CPRD), n = 340] and a survey of study participants' General Practitioners (GPs, n = 244). Dementia recorded in HES fully agreed both with CPRD and with GP survey data for 85% of women; it did not agree for 1% and 4%, respectively. Agreement was uncertain for the remaining 14% and 11%, respectively; and among those classified as having uncertain agreement in CPRD, non-specific terms compatible with dementia, such as 'memory loss', were recorded in the CPRD database for 79% of the women. Agreement was significantly better for dementia recorded in primary care (CPRD) than in hospital (HES) data. Age-specific rates of dementia based on the hospital admission data were lower than the rates based on the primary care data, but were similar once the delay in recording in HES was taken into account. Dementia recorded in routinely collected NHS hospital admission data for women in England agrees well with primary care records of dementia assessed separately from two different sources, and is sufficiently reliable for epidemiological research.

  3. SU-F-T-458: Tracking Trends of TG-142 Parameters Via Analysis of Data Recorded by 2D Chamber Array

    Energy Technology Data Exchange (ETDEWEB)

    Alexandrian, A; Kabat, C; Defoor, D; Saenz, D; Rasmussen, K; Kirby, N; Gutierrez, A; Papanikolaou, N; Stathakis, S [University of Texas HSC SA, San Antonio, TX (United States)]

    2016-06-15

    Purpose: With the increasing QA demands on medical physicists in clinical radiation oncology, the need for an effective method of tracking clinical data has become paramount. A tool was produced which scans through data automatically recorded by a 2D chamber array and extracts the relevant information recommended by TG-142. Using this extracted information, a timely and comprehensive analysis of QA parameters can easily be performed, enabling efficient monthly checks on multiple linear accelerators simultaneously. Methods: A PTW STARCHECK chamber array was used to record several months of beam outputs from two Varian 2100 series linear accelerators and a Varian NovalisTx. In conjunction with the chamber array, a beam quality phantom was used simultaneously to determine beam quality. A minimalist GUI was created in MATLAB that allows a user to set the file path of the data for each modality to be analyzed. These file paths are recorded to a MATLAB structure and then accessed by a script written in Python (version 3.5.1), which extracts the values required to perform monthly checks as outlined by the recommendations of TG-142. The script incorporates calculations to determine whether the values recorded by the chamber array fall within an acceptable threshold. Results: Values obtained by the script are written to a spreadsheet where results can be easily viewed, annotated with a "pass" or "fail", and saved for further analysis. In addition to providing a new scheme for reviewing monthly checks, this application allows data to be stored succinctly for follow-up analysis. Conclusion: By utilizing this tool, the parameters recommended by TG-142 for multiple linear accelerators can be rapidly obtained and analyzed, and used for the evaluation of monthly checks.
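
    The threshold check and spreadsheet output described in the Methods can be sketched as follows; the parameter names, baselines and tolerances are illustrative assumptions rather than the clinic's TG-142 configuration, and a plain CSV file stands in for the spreadsheet.

```python
# Sketch: comparing measured monthly-QA values against TG-142-style tolerances
# and writing a pass/fail row per parameter to a spreadsheet (CSV here).
import csv

# Baseline values and absolute tolerances per parameter (illustrative only).
baselines  = {"output_ratio_6MV": 1.000, "flatness_6MV_pct": 3.0, "symmetry_6MV_pct": 1.0}
tolerances = {"output_ratio_6MV": 0.02,  "flatness_6MV_pct": 1.0, "symmetry_6MV_pct": 1.0}
measured   = {"output_ratio_6MV": 1.012, "flatness_6MV_pct": 3.4, "symmetry_6MV_pct": 1.2}

with open("monthly_checks.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["parameter", "measured", "baseline", "difference", "result"])
    for name, baseline in baselines.items():
        diff = measured[name] - baseline
        result = "pass" if abs(diff) <= tolerances[name] else "fail"
        writer.writerow([name, measured[name], baseline, f"{diff:+.3f}", result])
```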

  4. A new ICA-based fingerprint method for the automatic removal of physiological artifacts from EEG recordings

    Science.gov (United States)

    Tamburro, Gabriella; Fiedler, Patrique; Stone, David; Haueisen, Jens

    2018-01-01

    Background EEG may be affected by artefacts hindering the analysis of brain signals. Data-driven methods like independent component analysis (ICA) are successful approaches to remove artefacts from the EEG. However, the ICA-based methods developed so far are often affected by limitations, such as: the need for visual inspection of the separated independent components (subjectivity problem) and, in some cases, for the independent and simultaneous recording of the inspected artefacts to identify the artefactual independent components; a potentially heavy manipulation of the EEG signals; the use of linear classification methods; the use of simulated artefacts to validate the methods; no testing in dry electrode or high-density EEG datasets; applications limited to specific conditions and electrode layouts. Methods Our fingerprint method automatically identifies EEG ICs containing eyeblinks, eye movements, myogenic artefacts and cardiac interference by evaluating 14 temporal, spatial, spectral, and statistical features composing the IC fingerprint. Sixty-two real EEG datasets containing cued artefacts are recorded with wet and dry electrodes (128 wet and 97 dry channels). For each artefact, 10 nonlinear SVM classifiers are trained on fingerprints of expert-classified ICs. Training groups include randomly chosen wet and dry datasets decomposed in 80 ICs. The classifiers are tested on the IC-fingerprints of different datasets decomposed into 20, 50, or 80 ICs. The SVM performance is assessed in terms of accuracy, False Omission Rate (FOR), Hit Rate (HR), False Alarm Rate (FAR), and sensitivity (p). For each artefact, the quality of the artefact-free EEG reconstructed using the classification of the best SVM is assessed by visual inspection and SNR. Results The best SVM classifier for each artefact type achieved average accuracy of 1 (eyeblink), 0.98 (cardiac interference), and 0.97 (eye movement and myogenic artefact). Average classification sensitivity (p) was 1
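
    The classification step, training a nonlinear SVM on expert-labelled IC fingerprints, can be sketched as follows; the 14 feature values and labels are random placeholders rather than the real fingerprint descriptors.

```python
# Sketch: a nonlinear (RBF) SVM trained on IC "fingerprint" feature vectors
# labelled by experts as artefactual or neural. Features are placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.standard_normal((400, 14))      # 400 ICs x 14 fingerprint features
y = rng.integers(0, 2, size=400)        # 1 = artefact IC, 0 = neural IC

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```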

  5. Analysis and correction of ballistocardiogram contamination of EEG recordings in MR

    International Nuclear Information System (INIS)

    Jaeger, L.; Hoffmann, A.; Reiser, M.F.; Werhahn, K.J.

    2005-01-01

    Purpose: To examine the influence of cardiac activity-related head movements and varying blood pulse frequencies on the shape of electroencephalography (EEG) recordings in a high magnetic field, and to implement a post-processing technique to eliminate cardiac activity-related artifacts. Material and methods: Respiratory thoracic movements, changes of blood pulse frequency and passive head movements of 20 healthy subjects were examined at rest, outside and inside an MR magnet, in a simultaneously recorded 21-channel surface EEG. An electrocardiogram (ECG) was recorded simultaneously. On the basis of the correlation of the left ventricular ejection time (LVET) with the heart rate, a post-processing, heart-rate-dependent subtraction of the cardiac activity-related artifacts of the EEG was developed. The quality of the post-processed EEG was tested by detecting alpha-activity in the pre- and post-processed EEGs. Results: Inside the magnet, passive head motion, but not respiratory thoracic movements, resulted in EEG artifacts that correlated strongly with the cardiac activity-related artifacts of the EEG. The blood pulse frequency influenced the appearance of the cardiac activity-related artifacts of the EEG. The removal of the cardiac activity-related artifacts of the EEG by the implemented post-processing algorithm resulted in an EEG of diagnostic quality with detectable alpha-activity. Conclusion: When recording an EEG in an MR environment, heart-rate-dependent subtraction of EEG artifacts caused by ballistocardiogram contamination is essential to obtain EEG recordings of diagnostic quality and reliability. (orig.)

  6. Detection of movement artifact in recorded pulse oximeter saturation.

    Science.gov (United States)

    Poets, C F; Stebbens, V A

    1997-10-01

    Movement artifact (MA) must be detected when analysing recordings of pulse oximeter saturation (SpO2). Visual analysis of individual pulse waveforms is the safest, but also the most tedious, method for this purpose. We wanted to test the reliability of a computer algorithm (Edentec Motion Annotation System), based on a comparison between pulse and heart rate, for MA detection. Ten 12-h recordings of SpO2, pulse waveforms and heart rate from ten preterm infants were analysed for the presence of MA on the pulse waveform signal. These data were used to determine the sensitivity and specificity of the computer algorithm, and of the oximeter itself, in detecting MA. Recordings were divided into segments of 2.5 s duration to compare the movement identification methods. Of the segments, 31% +/- 6% (mean +/- SD) contained MA. The computer algorithm identified 95% +/- 3% of these segments, the pulse oximeter itself only 18% +/- 11%. Specificity was 85% +/- 4% and 99% +/- 0%, respectively. During periods of apparently low SpO2, the pulse waveform signal showed MA, leaving a significant potential for erroneous identification of hypoxaemia. Recordings of SpO2 alone do not allow reliable identification of MA. Without additional information about movement artifact, a significant proportion of the recorded pulse oximeter signal may be regarded as demonstrating hypoxaemia which, in fact, simply reflects poor measurement conditions. The computer algorithm used in this study identified periods of movement artifact reliably.

  7. Near Field Communication-based telemonitoring with integrated ECG recordings.

    Science.gov (United States)

    Morak, J; Kumpusch, H; Hayn, D; Leitner, M; Scherr, D; Fruhwald, F M; Schreier, G

    2011-01-01

    Telemonitoring of vital signs is an established option in the treatment of patients with chronic heart failure (CHF). In order to allow early detection of atrial fibrillation (AF), which is highly prevalent in the CHF population, telemonitoring programs should include electrocardiogram (ECG) signals. The aim was therefore to extend our current home monitoring system, based on mobile phones and Near Field Communication (NFC) technology, to enable patients to acquire their ECG signals autonomously in an easy-to-use way. We prototypically developed a sensing device for the concurrent acquisition of blood pressure and ECG signals. The design of the device, equipped with NFC technology and Bluetooth, allowed intuitive interaction with a mobile phone-based patient terminal. This ECG monitoring system was evaluated in the course of a clinical pilot trial to assess the system's technical feasibility, usability and patients' adherence to twice-daily usage. 21 patients (4 female, 54 ± 14 years) suffering from CHF were included in the study and were asked to transmit two ECG recordings per day via the telemonitoring system autonomously over a monitoring period of seven days. One patient dropped out of the study. 211 data sets were transmitted over a cumulative monitoring period of 140 days (overall adherence rate 82.2%). 55% and 8% of the transmitted ECG signals were sufficient for ventricular and atrial rhythm assessment, respectively. Although the ECG signal quality has to be improved for better AF detection, the communication design developed by joining Bluetooth and NFC technology in our telemonitoring system allows ambulatory ECG acquisition with high adherence rates and good system usability in heart failure patients.

  8. Application of Entropy-Based Metrics to Identify Emotional Distress from Electroencephalographic Recordings

    Directory of Open Access Journals (Sweden)

    Beatriz García-Martínez

    2016-06-01

    Full Text Available Recognition of emotions is still an unresolved challenge, which could be helpful for improving current human-machine interfaces. Recently, nonlinear analysis of some physiological signals has been shown to play a more relevant role in this context than their traditional linear exploration. Thus, the present work introduces for the first time the application of three recent entropy-based metrics: sample entropy (SE), quadratic SE (QSE) and distribution entropy (DE) to discern between emotional states of calm and negative stress (also called distress). In the last few years, distress has received growing attention because it is a common negative factor in the modern lifestyle of people from developed countries and, moreover, it may lead to serious mental and physical health problems. Specifically, 279 segments of 32-channel electroencephalographic (EEG) recordings from 32 subjects, elicited to be calm or negatively stressed, have been analyzed. The results show that QSE is the first single metric presented to date with the ability to identify negative stress. Indeed, this metric reported a discriminant ability of around 70%, which is only slightly lower than that obtained by some previous works. Nonetheless, those previous discriminant models used dozens or even hundreds of features and advanced classifiers to yield diagnostic accuracies of about 80%. Moreover, in agreement with previous neuroanatomical findings, QSE also revealed notable differences across all brain regions in the neural activation triggered by the two emotions considered. Consequently, given these results, as well as the easy interpretation of QSE, this work opens a new standpoint in the detection of emotional distress, which may provide new insights into the brain's behavior under this negative emotion.
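
    Of the three metrics, sample entropy (SE) is the most commonly documented; a plain, unoptimized implementation is sketched below on a synthetic segment. QSE and DE, and the study's channel-wise processing, are not reproduced here.

```python
# Sketch: plain sample entropy, SampEn = -ln(A/B), with Chebyshev distance and
# no self-matches. Brute-force and unoptimized; the signal is a placeholder.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * x.std()

    def count_pairs(length):
        # Use the same number of templates (n - m) for both lengths.
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = count_pairs(m)        # matching pairs of length-m templates
    a = count_pairs(m + 1)    # matching pairs of length-(m+1) templates
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rng = np.random.default_rng(2)
segment = rng.standard_normal(500)     # stand-in for one EEG segment
print(f"SampEn(m=2, r=0.2*SD) = {sample_entropy(segment):.3f}")
```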

  9. iSpectra: An Open Source Toolbox For The Analysis of Spectral Images Recorded on Scanning Electron Microscopes.

    Science.gov (United States)

    Liebske, Christian

    2015-08-01

    iSpectra is an open source and system-independent toolbox for the analysis of spectral images (SIs) recorded on energy-dispersive spectroscopy (EDS) systems attached to scanning electron microscopes (SEMs). The aim of iSpectra is to assign pixels with similar spectral content to phases, accompanied by cumulative phase spectra with superior counting statistics for quantification. Pixel-to-phase assignment starts with a threshold-based pre-sorting of spectra to create groups of pixels with identical elemental budgets, similar to a method described by van Hoek (2014). Subsequent merging of groups and re-assignment of pixels using elemental or principal component histogram plots enables the user to generate chemically and texturally plausible phase maps. A variety of standard image processing algorithms can be applied to groups of pixels to optimize pixel-to-phase assignments, such as morphology operations to account for excitation volumes that overlap pixels located at phase boundaries. iSpectra supports batch processing and allows pixel-to-phase assignments to be applied to an unlimited number of SIs, thus enabling phase mapping of large-area samples such as petrographic thin sections.

  10. Parameter estimation of a nonlinear Burger's model using nanoindentation and finite element-based inverse analysis

    Science.gov (United States)

    Hamim, Salah Uddin Ahmed

    Nanoindentation involves probing a hard diamond tip into a material, where the load and the displacement experienced by the tip are recorded continuously. These load-displacement data are a direct function of the material's innate stress-strain behavior. Thus, it is theoretically possible to extract the mechanical properties of a material through nanoindentation. However, due to the various nonlinearities associated with nanoindentation, the process of interpreting load-displacement data into material properties is difficult. Although simple elastic behavior can be characterized easily, a method to characterize complicated material behavior, such as nonlinear viscoelasticity, is still lacking. In this study, a nanoindentation-based material characterization technique is developed to characterize soft materials exhibiting nonlinear viscoelasticity. The nanoindentation experiment was modeled in finite element analysis software (ABAQUS), where nonlinear viscoelastic behavior was incorporated using a user-defined subroutine (UMAT). The model parameters were calibrated using a process called inverse analysis. In this study, a surrogate model-based approach was used for the inverse analysis. The different factors affecting surrogate model performance are analyzed in order to optimize the performance with respect to computational cost.

  11. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Height (Top and Base) Environmental Data Record (EDR) from NDE

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a high quality operational Environmental Data Record (EDR) of cloud height (top and base) from the Visible Infrared Imaging Radiometer Suite...

  12. Recognition and pseudonymisation of medical records for secondary use.

    Science.gov (United States)

    Heurix, Johannes; Fenz, Stefan; Rella, Antonio; Neubauer, Thomas

    2016-03-01

    Health records rank among the most sensitive personal information existing today. An unwanted disclosure to unauthorised parties usually results in significant negative consequences for an individual. Therefore, health records must be adequately protected in order to ensure the individual's privacy. However, health records are also valuable resources for clinical studies and research activities. In order to make the records available for privacy-preserving secondary use, thorough de-personalisation is a crucial prerequisite to prevent re-identification. This paper introduces MEDSEC, a system which automatically converts paper-based health records into de-personalised and pseudonymised documents which can be accessed by secondary users without compromising the patients' privacy. The system converts the paper-based records into a standardised structure that facilitates automated processing and the search for useful information.

  13. Automated signal quality assessment of mobile phone-recorded heart sound signals.

    Science.gov (United States)

    Springer, David B; Brennan, Thomas; Ntusi, Ntobeko; Abdelrahman, Hassan Y; Zühlke, Liesl J; Mayosi, Bongani M; Tarassenko, Lionel; Clifford, Gari D

    Mobile phones, due to their audio processing capabilities, have the potential to facilitate the diagnosis of heart disease through automated auscultation. However, such a platform is likely to be used by non-experts, and hence, it is essential that such a device is able to automatically differentiate poor quality from diagnostically useful recordings since non-experts are more likely to make poor-quality recordings. This paper investigates the automated signal quality assessment of heart sound recordings performed using both mobile phone-based and commercial medical-grade electronic stethoscopes. The recordings, each 60 s long, were taken from 151 random adult individuals with varying diagnoses referred to a cardiac clinic and were professionally annotated by five experts. A mean voting procedure was used to compute a final quality label for each recording. Nine signal quality indices were defined and calculated for each recording. A logistic regression model for classifying binary quality was then trained and tested. The inter-rater agreement level for the stethoscope and mobile phone recordings was measured using Conger's kappa for multiclass sets and found to be 0.24 and 0.54, respectively. One-third of all the mobile phone-recorded phonocardiogram (PCG) signals were found to be of sufficient quality for analysis. The classifier was able to distinguish good- and poor-quality mobile phone recordings with 82.2% accuracy, and those made with the electronic stethoscope with an accuracy of 86.5%. We conclude that our classification approach provides a mechanism for substantially improving auscultation recordings by non-experts. This work is the first systematic evaluation of a PCG signal quality classification algorithm (using a separate test dataset) and assessment of the quality of PCG recordings captured by non-experts, using both a medical-grade digital stethoscope and a mobile phone.
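
    The quality classifier described above is a logistic regression over nine signal quality indices (SQIs). The sketch below mirrors that setup with random placeholder features and labels, purely to show the shape of the computation.

```python
# Sketch: logistic regression over signal quality indices (SQIs) to classify
# recordings as diagnostically usable or not. Features/labels are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
sqis = rng.standard_normal((151, 9))    # 151 recordings x 9 quality indices
labels = rng.integers(0, 2, size=151)   # 1 = good quality (expert vote)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, sqis, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```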

  14. Analysis of the steel braced frames equipped with ADAS devices under the far field records

    Directory of Open Access Journals (Sweden)

    Mahmoud Bayat

    Full Text Available The usefulness of supplementary energy dissipation devices is now quite well known in earthquake structural engineering for reducing the earthquake-induced response of structural systems. This study concerns the seismic behavior of structures with supplemental ADAS devices. In this paper, the ratio of the hysteretic energy to the input energy is compared in different structural systems. The main purpose of this paper is to evaluate, based on energy concepts, the behavior of structures equipped with yielding dampers (ADAS) located in far fields. In order to optimize their seismic behavior, codes and solutions are also presented. Three cases, comprising five-, ten- and fifteen-story three-bay Concentric Braced Frames (CBF) with and without ADAS, were selected. The PERFORM 3D.V4 software, along with three earthquake records (Northridge, Imperial Valley and Tabas), was used for nonlinear time history analysis, and the conclusions are drawn upon an energy criterion. The effects of PGA variation and frame height are also considered in the study. Finally, using ADAS dampers increases the energy damping capacity and reduces the destructive effects of an earthquake on the structure, so that a large amount of the induced energy is damped and destruction of the structure is prevented as far as possible.

  15. Signal analysis of accelerometry data using gravity-based modeling

    Science.gov (United States)

    Davey, Neil P.; James, Daniel A.; Anderson, Megan E.

    2004-03-01

    Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of the data is difficult due to interference sources, including the interaction of external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometry outputs were derived, thus giving an ideal, or noiseless, dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for the investigation of swimming. Model data were compared against recorded data and showed strong correlation. Comparison of recorded and modeled data can be used to identify changes in body movement; this is especially useful when cyclic patterns are present in the activity. The strong correlation between the data sets allowed the development of signal processing algorithms for swimming stroke analysis, which were developed first on the pure noiseless data set and then applied to performance data. Video analysis was also used to validate the study results and has shown potential to provide acceptable results.

  16. Analysis of workers' dose records from the Greek Dose Registry Information System

    International Nuclear Information System (INIS)

    Kamenopoulou, V.; Dimitriou, P.; Proukakis, Ch.

    1995-01-01

    The object of this work is the study of the individual film badge annual dose information of classified workers in Greece, monitored and assessed by the central dosimetry service of the Greek Atomic Energy Commission. Dose summaries were recorded and processed by the Dose Registry Information System. The statistical analysis refers to the years 1989-93 and deals with the distribution of individuals in the occupational groups, the mean annual dose, the collective dose, the distribution of the dose over the different specialties and the number of workers that have exceeded any of the established dose limits. Results concerning the annual dose summaries, demonstrate a year-by-year reduction in the mean individual dose to workers in the health sector. Conversely, exposures in the industrial sector did not show any decreasing tendency during the period under consideration. (Author)

  17. UMTRA Project Administrative Files Collection Records Management Program

    International Nuclear Information System (INIS)

    1994-09-01

    The UPAFC Records Management Plan is based on the life cycle of a record - the evolution of a record from creation until final disposition. There are three major phases in the life cycle of a record: (1) creation and receipt, (2) maintenance and use, and (3) disposition. Accordingly, the Records Management Plan is structured to follow each of those phases. During each of the three phases, some kind of control is mandatory. The Records Management Plan establishes appropriate standards, policies, and procedures to ensure adequate control is always maintained. It includes a plan for records management, a plan for records management training activities, and a plan for auditing and appraising the program

  18. Software for objective comparison of vocal acoustic features over weeks of audio recording: KLFromRecordingDays

    Science.gov (United States)

    Soderstrom, Ken; Alalawi, Ali

    KLFromRecordingDays allows measurement of Kullback-Leibler (KL) distances between 2D probability distributions of vocal acoustic features. Greater KL distance measures reflect increased phonological divergence across the vocalizations compared. The software has been used to compare *.wav file recordings made by Sound Analysis Recorder 2011 of songbird vocalizations pre- and post-drug and surgical manipulations. Recordings from individual animals in *.wav format are first organized into subdirectories by recording day and then segmented into individual syllables uttered and acoustic features of these syllables using Sound Analysis Pro 2011 (SAP). KLFromRecordingDays uses syllable acoustic feature data output by SAP to a MySQL table to generate and compare "template" (typically pre-treatment) and "target" (typically post-treatment) probability distributions. These distributions are a series of virtual 2D plots of the duration of each syllable (as x-axis) to each of 13 other acoustic features measured by SAP for that syllable (as y-axes). Differences between "template" and "target" probability distributions for each acoustic feature are determined by calculating KL distance, a measure of divergence of the target 2D distribution pattern from that of the template. KL distances and the mean KL distance across all acoustic features are calculated for each recording day and output to an Excel spreadsheet. Resulting data for individual subjects may then be pooled across treatment groups and graphically summarized and used for statistical comparisons. Because SAP-generated MySQL files are accessed directly, data limits associated with spreadsheet output are avoided, and the totality of vocal output over weeks may be objectively analyzed all at once. The software has been useful for measuring drug effects on songbird vocalizations and assessing recovery from damage to regions of vocal motor cortex. It may be useful in studies employing other species, and as part of speech
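
    The core quantity computed by the software is a KL distance between "template" and "target" 2D feature distributions. A minimal sketch of such a distance over shared histogram bins is given below; the feature samples are synthetic, and the 13-feature loop, MySQL access and Excel output are omitted.

```python
# Sketch: KL distance between "template" and "target" 2D feature distributions
# (e.g., syllable duration vs. one other acoustic feature). Samples are synthetic.
import numpy as np

def kl_distance_2d(template_xy, target_xy, bins=20, eps=1e-12):
    """KL(target || template) over a shared 2D histogram grid."""
    xy = np.vstack([template_xy, target_xy])          # common bin edges
    x_edges = np.linspace(xy[:, 0].min(), xy[:, 0].max(), bins + 1)
    y_edges = np.linspace(xy[:, 1].min(), xy[:, 1].max(), bins + 1)

    p, _, _ = np.histogram2d(target_xy[:, 0], target_xy[:, 1], bins=[x_edges, y_edges])
    q, _, _ = np.histogram2d(template_xy[:, 0], template_xy[:, 1], bins=[x_edges, y_edges])
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(4)
template = rng.normal([80, 3000], [15, 400], size=(500, 2))  # duration ms, pitch Hz
target = rng.normal([95, 2800], [20, 500], size=(500, 2))
print(f"KL distance = {kl_distance_2d(template, target):.3f}")
```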

  19. Software for objective comparison of vocal acoustic features over weeks of audio recording: KLFromRecordingDays

    Directory of Open Access Journals (Sweden)

    Ken Soderstrom

    2017-01-01

    Full Text Available KLFromRecordingDays allows measurement of Kullback–Leibler (KL) distances between 2D probability distributions of vocal acoustic features. Greater KL distance measures reflect increased phonological divergence across the vocalizations compared. The software has been used to compare *.wav file recordings made by Sound Analysis Recorder 2011 of songbird vocalizations pre- and post-drug and surgical manipulations. Recordings from individual animals in *.wav format are first organized into subdirectories by recording day and then segmented into individual syllables uttered and acoustic features of these syllables using Sound Analysis Pro 2011 (SAP). KLFromRecordingDays uses syllable acoustic feature data output by SAP to a MySQL table to generate and compare “template” (typically pre-treatment) and “target” (typically post-treatment) probability distributions. These distributions are a series of virtual 2D plots of the duration of each syllable (as the x-axis) to each of 13 other acoustic features measured by SAP for that syllable (as y-axes). Differences between “template” and “target” probability distributions for each acoustic feature are determined by calculating KL distance, a measure of divergence of the target 2D distribution pattern from that of the template. KL distances and the mean KL distance across all acoustic features are calculated for each recording day and output to an Excel spreadsheet. Resulting data for individual subjects may then be pooled across treatment groups and graphically summarized and used for statistical comparisons. Because SAP-generated MySQL files are accessed directly, data limits associated with spreadsheet output are avoided, and the totality of vocal output over weeks may be objectively analyzed all at once. The software has been useful for measuring drug effects on songbird vocalizations and assessing recovery from damage to regions of vocal motor cortex. It may be useful in studies employing other

  20. A record-driven growth process

    International Nuclear Information System (INIS)

    Godrèche, C; Luck, J M

    2008-01-01

    We introduce a novel stochastic growth process, the record-driven growth process, which originates from the analysis of a class of growing networks in a universal limiting regime. Nodes are added one by one to a network, each node possessing a quality. The new incoming node connects to the pre-existing node with best quality, that is, with record value for the quality. The emergent structure is that of a growing network, where groups are formed around record nodes (nodes endowed with the best intrinsic qualities). Special emphasis is put on the statistics of leaders (nodes whose degrees are the largest). The asymptotic probability for a node to be a leader is equal to the Golomb–Dickman constant ω = 0.624 329 ..., which arises in problems of combinatorial nature. This outcome solves the problem of the determination of the record breaking rate for the sequence of correlated inter-record intervals. The process exhibits temporal self-similarity in the late-time regime. Connections with the statistics of the cycles of random permutations, the statistical properties of randomly broken intervals, and the Kesten variable are given
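
    The growth rule described above is easy to simulate: each arriving node draws a random quality and attaches to the current record holder, which is replaced whenever the new quality beats the record. The sketch below is purely illustrative and does not reproduce the paper's analytical results.

```python
# Sketch: simulating the record-driven growth process. Each new node attaches
# to the pre-existing node holding the record quality; groups form around
# record nodes. Parameters are illustrative only.
import random

def simulate(n_nodes, seed=0):
    random.seed(seed)
    qualities = []
    attachments = {}            # record node index -> number of nodes attached to it
    record_node = None
    for node in range(n_nodes):
        quality = random.random()
        if record_node is not None:
            attachments[record_node] = attachments.get(record_node, 0) + 1
        qualities.append(quality)
        if record_node is None or quality > qualities[record_node]:
            record_node = node  # this node holds the record for future arrivals
    return attachments

groups = simulate(10_000)
print(f"{len(groups)} record nodes received attachments; "
      f"largest group: {max(groups.values())} nodes")
```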

  1. Evaluation of arterial propagation velocity based on the automated analysis of the Pulse Wave Shape

    International Nuclear Information System (INIS)

    Clara, F M; Scandurra, A G; Meschino, G J; Passoni, L I

    2011-01-01

    This paper proposes the automatic estimation of the arterial propagation velocity from raw pulse wave records measured in the region of the radial artery. A fully automatic process is proposed to select and analyze typical pulse cycles from the raw data. An adaptive neuro-fuzzy inference system, together with a heuristic search, is used to find a functional approximation of the pulse wave. The propagation velocity is then estimated through analysis of the functional approximation obtained with the fuzzy model. Analysis of the pulse wave records with the proposed methodology showed small differences compared with the method used so far, which relies on strong interaction with the user. To evaluate the proposed methodology, we estimated the propagation velocity in a population of healthy men across a wide range of ages. These studies found that the propagation velocity increases linearly with age and shows considerable dispersion of values in healthy individuals. We conclude that this process could be used to evaluate indirectly the propagation velocity of the aorta, which is related to physiological age in healthy individuals and to life expectancy in cardiovascular patients.

  2. Acoustic ambient noise recorder

    Digital Repository Service at National Institute of Oceanography (India)

    Saran, A.K.; Navelkar, G.S.; Almeida, A.M.; More, S.R.; Chodankar, P.V.; Murty, C.S.

    with a robust outfit that can withstand high pressures and chemically corrosion resistant materials. Keeping these considerations in view, a CMOS micro-controller-based marine acoustic ambient noise recorder has been developed with a real time clock...

  3. Privacy preserving interactive record linkage (PPIRL).

    Science.gov (United States)

    Kum, Hye-Chung; Krishnamurthy, Ashok; Machanavajjhala, Ashwin; Reiter, Michael K; Ahalt, Stanley

    2014-01-01

    Record linkage to integrate uncoordinated databases is critical in biomedical research using Big Data. Balancing privacy protection against the need for high quality record linkage requires a human-machine hybrid system to safely manage uncertainty in the ever changing streams of chaotic Big Data. In the computer science literature, private record linkage is the most published area. It investigates how to apply a known linkage function safely when linking two tables. However, in practice, the linkage function is rarely known. Thus, there are many data linkage centers whose main role is to be the trusted third party to determine the linkage function manually and link data for research via a master population list for a designated region. Recently, a more flexible computerized third-party linkage platform, Secure Decoupled Linkage (SDLink), has been proposed based on: (1) decoupling data via encryption, (2) obfuscation via chaffing (adding fake data) and universe manipulation; and (3) minimum information disclosure via recoding. We synthesize this literature to formalize a new framework for privacy preserving interactive record linkage (PPIRL) with tractable privacy and utility properties and then analyze the literature using this framework. Human-based third-party linkage centers for privacy preserving record linkage are the accepted norm internationally. We find that a computer-based third-party platform that can precisely control the information disclosed at the micro level and allow frequent human interaction during the linkage process, is an effective human-machine hybrid system that significantly improves on the linkage center model both in terms of privacy and utility.

  4. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yihb, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such a failure occurs, hazards may arise. Among the potential hazards, the competing effect between the controller and the controlled process is the most severe one, and it may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility.

  5. Conflict Detection Performance Analysis for Function Allocation Using Time-Shifted Recorded Traffic Data

    Science.gov (United States)

    Guerreiro, Nelson M.; Butler, Ricky W.; Maddalon, Jeffrey M.; Hagen, George E.; Lewis, Timothy A.

    2015-01-01

    The performance of the conflict detection function in a separation assurance system is dependent on the content and quality of the data available to perform that function. Specifically, data quality and data content available to the conflict detection function have a direct impact on the accuracy of the prediction of an aircraft's future state or trajectory, which, in turn, impacts the ability to successfully anticipate potential losses of separation (detect future conflicts). Consequently, other separation assurance functions that rely on the conflict detection function - namely, conflict resolution - are prone to negative performance impacts. The many possible allocations and implementations of the conflict detection function between centralized and distributed systems drive the need to understand the key relationships that impact conflict detection performance, with respect to differences in data available. This paper presents the preliminary results of an analysis technique developed to investigate the impacts of data quality and data content on conflict detection performance. Flight track data recorded from a day of the National Airspace System is time-shifted to create conflicts not present in the un-shifted data. A methodology is used to smooth and filter the recorded data to eliminate sensor fusion noise, data drop-outs and other anomalies in the data. The metrics used to characterize conflict detection performance are presented and a set of preliminary results is discussed.
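    A minimal sketch of the time-shifting idea: one recorded track is shifted in time and pairs of time-matched samples are checked against separation minima. The track format and the 5 NM / 1000 ft thresholds are assumptions for illustration, not the study's actual processing chain:

```python
import math

SEP_NM, SEP_FT = 5.0, 1000.0          # assumed en-route separation minima

def shift_track(track, dt):
    """Time-shift a recorded track; track is a list of (t_sec, lat, lon, alt_ft)."""
    return [(t + dt, lat, lon, alt) for t, lat, lon, alt in track]

def horizontal_nm(a, b):
    """Crude equirectangular horizontal distance in nautical miles (adequate for a sketch)."""
    dlat = (a[1] - b[1]) * 60.0
    dlon = (a[2] - b[2]) * 60.0 * math.cos(math.radians((a[1] + b[1]) / 2))
    return math.hypot(dlat, dlon)

def conflicts(track_a, track_b):
    """Return times at which the two tracks simultaneously violate both minima."""
    b_by_t = {int(t): (t, lat, lon, alt) for t, lat, lon, alt in track_b}
    hits = []
    for sample in track_a:
        other = b_by_t.get(int(sample[0]))
        if other and horizontal_nm(sample, other) < SEP_NM and abs(sample[3] - other[3]) < SEP_FT:
            hits.append(sample[0])
    return hits

# two straight, initially well-separated tracks; shifting one by 60 s induces a conflict
track1 = [(t, 40.0 + 0.001 * t, -75.0, 35000.0) for t in range(0, 300, 10)]
track2 = [(t, 40.3 - 0.001 * t, -75.0, 35000.0) for t in range(60, 360, 10)]
print(conflicts(track1, shift_track(track2, -60)))
```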

  6. Dealing with noise and physiological artifacts in human EEG recordings: empirical mode methods

    Science.gov (United States)

    Runnova, Anastasiya E.; Grubov, Vadim V.; Khramova, Marina V.; Hramov, Alexander E.

    2017-04-01

    In this paper we propose a new method for removing noise and physiological artifacts from human EEG recordings based on empirical mode decomposition (the Hilbert-Huang transform). As physiological artifacts we consider specific oscillatory patterns that cause problems during EEG analysis and that can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The proposed algorithm proceeds in the following steps: empirical mode decomposition of the EEG signal, identification of the empirical modes containing artifacts, removal of those modes, and reconstruction of the EEG signal. We demonstrate the efficiency of the method by filtering eye-movement artifacts from a human EEG signal.
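    A minimal sketch of this EMD-based cleaning procedure, assuming the third-party PyEMD package is available and using correlation with a simultaneously recorded EOG channel as one plausible (not necessarily the authors') criterion for choosing artifact modes:

```python
import numpy as np
from PyEMD import EMD   # assumed third-party package providing empirical mode decomposition

def remove_artifact_modes(eeg, reference, corr_thresh=0.5):
    """Decompose the EEG into intrinsic mode functions, drop modes that correlate
    strongly with the artifact reference (e.g. EOG), and reconstruct the signal."""
    imfs = EMD().emd(eeg)
    kept = [imf for imf in imfs if abs(np.corrcoef(imf, reference)[0, 1]) < corr_thresh]
    return np.sum(kept, axis=0) if kept else np.zeros_like(eeg)

# synthetic example: 10 Hz rhythm plus a blink-like transient tracked by the EOG channel
t = np.linspace(0, 10, 2000)
eog = np.exp(-20 * (t - 5.0) ** 2)
eeg = np.sin(2 * np.pi * 10 * t) + 3.0 * eog + 0.1 * np.random.randn(t.size)
clean = remove_artifact_modes(eeg, eog)
print(clean.shape)
```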

  7. Acoustic signature of thunder from seismic records

    Science.gov (United States)

    Kappus, Mary E.; Vernon, Frank L.

    1991-06-01

    Thunder, the sound wave through the air associated with lightning, transfers sufficient energy to the ground to trigger seismometers set to record regional earthquakes. The acoustic signature recorded on seismometers, in the form of ground velocity as a function of time, contains the same types of features as pressure variations recorded with microphones in air. At a seismic station in Kislovodsk, USSR, a nearly direct lightning strike caused electronic failure of borehole instruments while leaving a brief impulsive acoustic signature on the surface instruments. The peak frequency of 25-55 Hz is consistent with previously published values for cloud-to-ground lightning strikes, but spectra from this station are contaminated by very strong wind noise in this band. A thunderstorm near a similar station in Karasu triggered more than a dozen records of individual lightning strikes during a 2-hour period. The spectra for these events are fairly broadband, with peaks at low frequencies, varying from 6 to 13 Hz. The spectra were all computed by multitaper analysis, which deals appropriately with the nonstationary thunder signal. These independent measurements of low-frequency peaks corroborate occasional occurrences in traditional microphone records, but a theory concerning the physical mechanism that accounts for them remains in question. Examined separately, the individual claps in each record have similar frequency distributions, discounting the need for multiple mechanisms to explain different phases of the thunder sequence. Particle motion, determined from polarization analysis of the three-component records, is predominantly vertical downward, with smaller horizontal components indicative of the direction to the lightning bolt. In three of the records the azimuth to the lightning bolt changes with time, confirming a significant horizontal component to the lightning channel itself.
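    The multitaper estimate mentioned above can be sketched in a few lines using Slepian (DPSS) tapers; the sampling rate, time-bandwidth product, and synthetic burst below are illustrative assumptions:

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, nw=4.0, k=7):
    """Average the eigenspectra over k orthogonal DPSS tapers, a multitaper
    estimate suited to short, nonstationary signals such as thunder claps."""
    n = len(x)
    tapers = dpss(n, nw, Kmax=k)                    # shape (k, n)
    eigenspectra = [np.abs(np.fft.rfft(x * taper)) ** 2 for taper in tapers]
    psd = np.mean(eigenspectra, axis=0) / fs
    return np.fft.rfftfreq(n, d=1.0 / fs), psd

# synthetic low-frequency burst sampled at 100 Hz
fs = 100.0
t = np.arange(0, 20, 1 / fs)
x = np.exp(-0.3 * t) * np.sin(2 * np.pi * 10 * t) + 0.05 * np.random.randn(t.size)
freqs, psd = multitaper_psd(x, fs)
print(round(freqs[np.argmax(psd)], 1), "Hz peak")
```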

  8. Acute Precipitants of Physical Elder Abuse: Qualitative Analysis of Legal Records From Highly Adjudicated Cases.

    Science.gov (United States)

    Rosen, Tony; Bloemen, Elizabeth M; LoFaso, Veronica M; Clark, Sunday; Flomenbaum, Neal E; Breckman, Risa; Markarian, Arlene; Riffin, Catherine; Lachs, Mark S; Pillemer, Karl

    2016-08-01

    Elder abuse is a common phenomenon with potentially devastating consequences for older adults. Although researchers have begun to identify predisposing risk factors for elder abuse victims and abusers, little is known about the acute precipitants that lead to escalation to physical violence. We analyzed legal records from highly adjudicated cases to describe these acute precipitants for physical elder abuse. In collaboration with a large, urban district attorney's office, we qualitatively evaluated legal records from 87 successfully prosecuted physical elder abuse cases from 2003 to 2015. We transcribed and analyzed narratives of the events surrounding physical abuse within victim statements, police reports, and prosecutor records. We identified major themes using content analysis. We identified 10 categories of acute precipitants that commonly triggered physical elder abuse, including victim attempting to prevent the abuser from entering or demanding that he or she leave, victim threatening or attempting to leave/escape, threat or perception that the victim would involve the authorities, conflict about a romantic relationship, presence during/intervention in ongoing family violence, issues in multi-generational child rearing, conflict about the abuser's substance abuse, confrontation about financial exploitation, dispute over theft/destruction of property, and disputes over minor household issues. Common acute precipitants of physical elder abuse may be identified. Improved understanding of these acute precipitants for escalation to physical violence and their contribution to elder abuse may assist in the development of prevention and management strategies.

  9. Inverse Analysis of Pavement Structural Properties Based on Dynamic Finite Element Modeling and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Xiaochao Tang

    2013-03-01

    Full Text Available With the movement towards the implementation of the mechanistic-empirical pavement design guide (MEPDG), an accurate determination of pavement layer moduli is vital for predicting pavement critical mechanistic responses. A backcalculation procedure is commonly used to estimate the pavement layer moduli based on non-destructive falling weight deflectometer (FWD) tests. Backcalculation of flexible pavement layer properties is an inverse problem with known input and output signals, based upon which unknown parameters of the pavement system are evaluated. In this study, an inverse analysis procedure that combines finite element analysis and a population-based optimization technique, the Genetic Algorithm (GA), has been developed to determine the pavement layer structural properties. A lightweight deflectometer (LWD) was used to infer the moduli of instrumented three-layer scaled flexible pavement models. While common practice in backcalculating pavement layer properties still assumes a static FWD load and uses only peak values of the load and deflections, dynamic analysis was conducted here to simulate the impulse LWD load. The recorded time histories of the LWD load were used as the known inputs into the pavement system, while the measured time histories of surface central deflections and subgrade deflections measured with linear variable differential transformers (LVDTs) were considered as the outputs. As a result, consistent pavement layer moduli can be obtained through this inverse analysis procedure.
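    A schematic of the GA-based backcalculation loop: candidate layer moduli are scored by the mismatch between measured and simulated deflection histories. The forward model below is a cheap analytic placeholder (the study uses a dynamic finite element run), and the bounds and GA settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_deflections(moduli):
    """Placeholder forward model mapping layer moduli (MPa) to a deflection history;
    in the study this role is played by a dynamic finite element simulation."""
    e1, e2, e3 = moduli
    t = np.linspace(0.0, 0.03, 60)
    return 1e3 * np.exp(-t * e3 / 50.0) * np.sin(np.pi * t / 0.03) / (e1 + 0.5 * e2)

def fitness(moduli, measured):
    return -np.sqrt(np.mean((simulate_deflections(moduli) - measured) ** 2))

def backcalculate(measured, bounds, pop_size=40, generations=60):
    """Real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fit = np.array([fitness(ind, measured) for ind in pop])
        winners = [max(rng.integers(0, pop_size, 2), key=lambda i: fit[i])
                   for _ in range(pop_size)]
        parents = pop[winners]
        alpha = rng.random(parents.shape)
        children = alpha * parents + (1.0 - alpha) * parents[rng.permutation(pop_size)]
        children += rng.normal(0.0, 0.02 * (hi - lo), children.shape)   # mutation
        pop = np.clip(children, lo, hi)
    fit = np.array([fitness(ind, measured) for ind in pop])
    return pop[np.argmax(fit)]

# recover known moduli from synthetic "measured" deflections
true_moduli = (3000.0, 300.0, 80.0)
measured = simulate_deflections(true_moduli) + rng.normal(0.0, 0.5, 60)
print(backcalculate(measured, bounds=[(1000, 5000), (100, 600), (40, 150)]))
```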

  10. Eye of the storm: analysis of shelter treatment records of evacuees to Acadiana from Hurricanes Katrina and Rita.

    Science.gov (United States)

    Caillouet, L Philip; Paul, P Joseph; Sabatier, Steven M; Caillouet, Kevin A

    2012-01-01

    specific evaluation and care, no population-based experimental hypothesis was framed, nor was the effectiveness of any specific intervention researched at the time. This study reports experiential data collected without a particular preconceived hypothesis, because no specific outcome measures had been designed in advance. Data analysis revealed much about the origins and demographics of the evacuees, their hurricane-related risks and injuries, and the loss of continuity in their prior and ongoing healthcare. The authors believe that much can be learned from studying data collected in evacuee triage clinics, and that such insights may influence personal and official preparedness for future events. In the Katrina-Rita evacuations, only paper-based data collection mechanisms were used (and those with great inconsistency), and there was no predeployed mechanism for close-to-real-time collation of evacuee data. Deployment of simple electronic health record systems might well have allowed a better real-time understanding of events as they unfolded, upon arrival of evacuees in shelters. Information and communication technologies have advanced since 2005, but pre-disaster staging and training on such technologies is still lacking.

  11. TEPAPA: a novel in silico feature learning pipeline for mining prognostic and associative factors from text-based electronic medical records.

    Science.gov (United States)

    Lin, Frank Po-Yen; Pokorny, Adrian; Teng, Christina; Epstein, Richard J

    2017-07-31

    Vast amounts of clinically relevant text-based variables lie undiscovered and unexploited in electronic medical records (EMR). To exploit this untapped resource, and thus facilitate the discovery of informative covariates from unstructured clinical narratives, we have built a novel computational pipeline termed Text-based Exploratory Pattern Analyser for Prognosticator and Associator discovery (TEPAPA). This pipeline combines semantic-free natural language processing (NLP), regular expression induction, and statistical association testing to identify conserved text patterns associated with outcome variables of clinical interest. When we applied TEPAPA to a cohort of head and neck squamous cell carcinoma patients, plausible concepts known to be correlated with human papilloma virus (HPV) status were identified from the EMR text, including site of primary disease, tumour stage, pathologic characteristics, and treatment modalities. Similarly, correlates of other variables (including gender, nodal status, recurrent disease, smoking and alcohol status) were also reliably recovered. Using highly-associated patterns as covariates, a patient's HPV status was classifiable using a bootstrap analysis with a mean area under the ROC curve of 0.861, suggesting its predictive utility in supporting EMR-based phenotyping tasks. These data support using this integrative approach to efficiently identify disease-associated factors from unstructured EMR narratives, and thus to efficiently generate testable hypotheses.
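    The core association step can be sketched as a contingency-table test between the presence of an induced text pattern and a binary outcome; the regex, notes, and labels below are fabricated toy inputs, and Fisher's exact test stands in for TEPAPA's statistical testing stage:

```python
import re
from scipy.stats import fisher_exact

def pattern_association(notes, outcomes, pattern):
    """Test whether a text pattern co-occurs with a binary outcome.
    notes: list of free-text EMR strings; outcomes: list of 0/1 labels."""
    hit = [bool(re.search(pattern, n, flags=re.I)) for n in notes]
    a = sum(h and o for h, o in zip(hit, outcomes))            # pattern present, outcome positive
    b = sum(h and not o for h, o in zip(hit, outcomes))
    c = sum((not h) and o for h, o in zip(hit, outcomes))
    d = sum((not h) and (not o) for h, o in zip(hit, outcomes))
    odds, p = fisher_exact([[a, b], [c, d]])
    return odds, p

# toy usage with hypothetical notes and HPV status labels
notes = ["tonsillar primary, p16 positive", "oral tongue lesion, heavy smoker",
         "base of tongue mass, p16 positive", "laryngeal tumour, alcohol use"]
hpv = [1, 0, 1, 0]
print(pattern_association(notes, hpv, r"p16 positive"))
```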

  12. Electronic medical records: a developing and developed country analysis

    CSIR Research Space (South Africa)

    Sikhondze, NC

    2016-05-01

    Full Text Available of Electronic Medical Records (EMR) systems in developed and developing countries. There is a direct relationship between the income of the country and the use of electronic information and communication systems as part of healthcare systems hence the division...

  13. The cost of caring for end-stage kidney disease patients: an analysis based on hospital financial transaction records.

    Science.gov (United States)

    Bruns, F J; Seddon, P; Saul, M; Zeidel, M L

    1998-05-01

    The costs of care for end-stage renal disease patients continue to rise because of increased numbers of patients. Efforts to contain these costs have focused on the development of capitated payment schemes, in which all costs for the care of these patients are covered in a single payment. To determine the effect of a capitated reimbursement scheme on care of dialysis patients (both hemodialysis [HD] and peritoneal dialysis [PD]), complete financial records (all reimbursements for inpatient and outpatient care, as well as physician collections) of dialysis patients at a single medical center over 1 year were analyzed. For the period from July 1994 to July 1995, annualized cost per dialysis patient-year averaged $63,340, or 9.8% higher than the corrected estimate from the U.S. Renal Data Service (USRDS; $57,660). The "most expensive" 25% of patients engendered 44 to 48% of the total costs, and inpatient costs accounted for 37 to 40% of total costs. Nearly half of the inpatient costs resulted from only two categories (room charges and inpatient dialysis), whereas other categories each made up a small fraction of the inpatient costs. PD patients were far less expensive to care for than HD patients, due to reduced hospital days and lower cost of outpatient dialysis. Care for a university-based dialysis population was only slightly more expensive than estimates predicted from the USRDS. These results validate the USRDS spending data and suggest that they can be used effectively for setting capitated rates. Efforts to control costs without sacrificing quality of care must center on reducing inpatient costs, particularly room charges and the cost of inpatient dialysis.

  14. Análise econômica de esquemas alternativos de controle leiteiro Economic analysis of alternative schemes of milk recording

    Directory of Open Access Journals (Sweden)

    V.L. Cardoso

    2005-02-01

    Full Text Available The costs of six milk recording (CL) schemes were compared, with the schemes defined by the interval between recordings (including the possibility of a supervised recording every three months combined with use of the farm's own milk recording data), the number of milkings recorded, and whether qualitative analyses (milk composition and/or somatic cell counting) were performed. The percentages that total milk recording costs would represent of the monthly gross milk income (CL/RB) and of the monthly net income (CL/RL) were computed. The traditional scheme (monthly recording) presented the highest monthly cost and the highest CL/RB and CL/RL ratios. Schemes with longer intervals between recordings and sampling at alternate milkings reduced costs for travel allowances and mileage. The monthly cost of milk recording ranged from 0.68% to 1.8% of gross milk income and from 6.6% to 17.0% of net income.

  15. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries

    Science.gov (United States)

    Kannan, Vaishnavi; Fish, Jason C.; Willett, DuWayne L.

    2018-01-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system’s requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. “Agile Modeling” retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams. PMID:29750222

  16. Joint time-frequency analysis of EEG signals based on a phase-space interpretation of the recording process

    Science.gov (United States)

    Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.

    2012-10-01

    Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.
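    A small sketch contrasting the two fixed-versus-adaptive resolution tools mentioned above, using a hand-rolled Morlet transform so the scalogram does not depend on any particular wavelet library; the sampling rate, window lengths, and synthetic spike are assumptions:

```python
import numpy as np
from scipy import signal

def morlet_scalogram(x, fs, freqs, w=5.0):
    """Continuous wavelet transform with a complex Morlet wavelet, one centre frequency per row."""
    out = np.empty((len(freqs), len(x)), dtype=complex)
    for i, f0 in enumerate(freqs):
        s = w * fs / (2 * np.pi * f0)               # scale giving centre frequency f0
        m = int(10 * s) | 1                         # odd kernel length
        k = np.arange(m) - m // 2
        wavelet = np.exp(1j * w * k / s) * np.exp(-0.5 * (k / s) ** 2) / np.sqrt(s)
        out[i] = np.convolve(x, wavelet, mode="same")
    return np.abs(out) ** 2

fs = 256.0
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)
eeg[int(2 * fs):int(2 * fs) + 25] += 3.0            # a spike-like transient at t = 2 s

# spectrogram: fixed window, hence a fixed time-frequency trade-off
f_spec, t_spec, Sxx = signal.spectrogram(eeg, fs=fs, nperseg=128, noverlap=96)
# scalogram: better time resolution at high frequencies, useful for brief spikes
scal = morlet_scalogram(eeg, fs, freqs=np.arange(2, 40))
print(Sxx.shape, scal.shape)
```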

  17. The Likelihood of Recent Record Warmth.

    Science.gov (United States)

    Mann, Michael E; Rahmstorf, Stefan; Steinman, Byron A; Tingley, Martin; Miller, Sonya K

    2016-01-25

    2014 was nominally the warmest year on record for both the globe and the northern hemisphere, based on historical records spanning the past one and a half centuries. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one-in-650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely the observed temperature records may have been, both with and without human influence, is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing.
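    The flavour of such a calculation can be sketched with a toy Monte Carlo: estimate how often a run of consecutive record years occurs in an unforced, stationary AR(1) series. The series length, persistence, and run length below are placeholders, not the paper's semi-empirical model:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)

def run_of_records(series, k):
    """True if each of the last k values exceeds every value that came before it."""
    n = len(series)
    return all(series[i] > series[:i].max() for i in range(n - k, n))

def null_probability(n_years=150, k=3, phi=0.6, sims=20000):
    """Monte Carlo probability of ending on a k-year run of records under AR(1) noise only."""
    noise = rng.normal(size=(sims, n_years))
    series = lfilter([1.0], [1.0, -phi], noise, axis=1)   # AR(1) with persistence phi
    return sum(run_of_records(row, k) for row in series) / sims

print(null_probability())
```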

  18. Recording Approach of Heritage Sites Based on Merging Point Clouds from High Resolution Photogrammetry and Terrestrial Laser Scanning

    Science.gov (United States)

    Grussenmeyer, P.; Alby, E.; Landes, T.; Koehl, M.; Guillemin, S.; Hullo, J. F.; Assali, P.; Smigiel, E.

    2012-07-01

    Different approaches and tools are required in Cultural Heritage Documentation to deal with the complexity of monuments and sites. The documentation process has changed strongly in the last few years, always driven by technology. Accurate documentation is closely tied to advances in technology (imaging sensors, high-speed scanning, automation in recording and processing data) for the purposes of conservation works, management, appraisal, assessment of the structural condition, archiving, publication and research (Patias et al., 2008). In this paper we focus on the recording aspects of cultural heritage documentation, especially the generation of geometric and photorealistic 3D models for accurate reconstruction and visualization purposes. The selected approaches are based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and recent advances have changed the recording approach. The choice of the best workflow depends on the site configuration, the performance of the sensors, and criteria such as geometry, accuracy, resolution, georeferencing, texture, and of course processing time. TLS techniques (time-of-flight or phase-shift systems) are widely used for recording large and complex objects and sites. Point cloud generation from images by dense stereo or multi-view matching can be used as an alternative or as a complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a high-performance digital camera and a few accessories only. Indeed, the stereo or multi-view matching process offers a cheap, flexible and accurate solution to get 3D point clouds. Moreover, the captured images might also be used for texturing the models. Several software packages are available, whether web-based, open source or commercial. The main advantage of this photogrammetric or computer vision based technology is to get

  19. Similarities and differences between on-scalp and conventional in-helmet magnetoencephalography recordings.

    Directory of Open Access Journals (Sweden)

    Lau M Andersen

    Full Text Available The development of new magnetic sensor technologies that promise sensitivities approaching that of conventional MEG technology while operating at far lower operating temperatures has catalysed the growing field of on-scalp MEG. The feasibility of on-scalp MEG has been demonstrated by benchmarking new sensor technologies performing neuromagnetic recordings in close proximity to the head surface against state-of-the-art in-helmet MEG sensor technology. However, earlier work has provided little information about how these two approaches compare, or about the reliability of observed differences. Herein, we present such a comparison, based on recordings of the N20m component of the somatosensory evoked field as elicited by electric median nerve stimulation. As expected from the proximity differences between the on-scalp and in-helmet sensors, the magnitude of the N20m activation as recorded with the on-scalp sensor was higher than that of the in-helmet sensors. The dipole pattern of the on-scalp recordings was also more spatially confined than that of the conventional recordings. Our results furthermore revealed unexpected temporal differences in the peak of the N20m component. An analysis protocol was therefore developed for assessing the reliability of this observed difference. We used this protocol to examine our findings in terms of differences in sensor sensitivity between the two types of MEG recordings. The measurements and subsequent analysis highlighted that great care must be taken when measuring the field close to the zero-line crossing of the dipolar field, since it is heavily dependent on the orientation of the sensors. Taken together, our findings provide reliable evidence that on-scalp and in-helmet sensors measure neural sources in mostly similar ways.

  20. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Müller Meinard

    2007-01-01

    Full Text Available One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music in which the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music, such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis, which allows musically similar segments to be identified even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical feature to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.
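    One standard building block for this kind of structure analysis is a self-similarity matrix over smoothed feature sequences, in which repeated sections appear as stripes parallel to the main diagonal. The chroma-like features below are synthetic, and the moving-average smoothing is only a crude stand-in for the statistical features described in the paper:

```python
import numpy as np

def smooth(features, width=8):
    """Coarse temporal averaging, a crude stand-in for features that absorb micro-variations."""
    kernel = np.ones(width) / width
    return np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, features)

def self_similarity(features):
    """Cosine self-similarity matrix of a (frames x dims) feature sequence."""
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    return f @ f.T

# synthetic 12-dimensional chroma-like sequence with an A-B-A form
rng = np.random.default_rng(0)
part_a = rng.random((100, 12))
sequence = np.vstack([part_a, rng.random((50, 12)), part_a])
ssm = self_similarity(smooth(sequence))
print(ssm.shape)            # repeats of A show up as off-diagonal stripes in ssm
```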

  1. 77 FR 51910 - Privacy Act, Exempt Record System

    Science.gov (United States)

    2012-08-28

    ... unwilling to report possible research misconduct because of fear of retaliation (e.g., from an employer or... establish procedures for notification, access to records, amendment of records, or appeals of denials of... accessed at http://www.fda.gov/RegulatoryInformation/Guidances/ucm125166.htm . III. Analysis of Impacts HHS...

  2. Quantitative analysis of single muscle fibre action potentials recorded at known distances

    NARCIS (Netherlands)

    Albers, B.A.; Put, J.H.M.; Wallinga, W.; Wirtz, P.

    1989-01-01

    In vivo records of single fibre action potentials (SFAPs) have always been obtained at unknown distance from the active muscle fibre. A new experimental method has been developed enabling the derivation of the recording distance in animal experiments. A single fibre is stimulated with an

  3. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps better understand the characteristics of team-based care in the clinical practice as well as promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  4. Eielson Air Force Base operable unit 2 and other areas record of decision

    International Nuclear Information System (INIS)

    Lewis, R.E.; Smith, R.M.

    1994-10-01

    This decision document presents the selected remedial actions and no action decisions for Operable Unit 2 (OU2) at Eielson Air Force Base (AFB), Alaska, chosen in accordance with state and federal regulations. This document also presents the decision that no further action is required for 21 other source areas at Eielson AFB. This decision is based on the administrative record file for this site. OU2 addresses sites contaminated by leaks and spills of fuels. Soils contaminated with petroleum products occur at or near the source of contamination. Contaminated subsurface soil and groundwater occur in plumes on the top of a shallow groundwater table that fluctuates seasonally. These sites pose a risk to human health and the environment because of ingestion, inhalation, and dermal contact with contaminated groundwater. The purpose of this response is to prevent current or future exposure to the contaminated groundwater, to reduce further contaminant migration into the groundwater, and to remediate groundwater

  5. Characterising Record Flooding in the United Kingdom

    Science.gov (United States)

    Cox, A.; Bates, P. D.; Smith, J. A.

    2017-12-01

    Though the most notable floods in history have been carefully explained, there remains a lack of literature that explores the nature of record floods as a whole in the United Kingdom. We characterise the seasonality, statistical and spatial distribution, and meteorological causes of peak river flows for 521 gauging stations spread across the British Isles. We use annual maximum data from the National River Flow Archive, catchment descriptors from the Flood Estimation Handbook, and historical records of large floods. What we aim to find is in what ways, if any, the record flood for a station is different from more 'typical' floods. For each station, we calculate two indices: the seasonal anomaly and the flood index. Broadly, the seasonal anomaly is the degree to which a station's record flood happens at a different time of year compared to typical floods at that site, whilst the flood index is a station's record flood discharge divided by the discharge of the 1-in-10-year return period event. We find that while annual maximum peaks are dominated by winter frontal rainfall, record floods are disproportionately caused by summer convective rainfall. This analysis also shows that the larger the seasonal anomaly, the higher the flood index. Additionally, stations across the country have record floods that occur in the summer with no notable spatial pattern, yet the most seasonally anomalous record events are concentrated around the south and west of the British Isles. Catchment descriptors tell us little about the flood index at a particular station, but generally areas with lower mean annual precipitation have a higher flood index. The inclusion of case studies from recent and historical examples of notable floods across the UK supplements our analysis and gives insight into how typical these events are, both statistically and meteorologically. Ultimately, record floods in general happen at relatively unexpected times and with unpredictable magnitudes, which is a
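    The two indices defined above can be computed from an annual maximum series in a few lines; the Gumbel fit used for the 1-in-10-year discharge and the circular treatment of flood timing are reasonable choices for a sketch, not necessarily the authors' exact procedure, and the data are synthetic:

```python
import numpy as np
from scipy.stats import gumbel_r

def flood_index(amax):
    """Record flood discharge divided by the 1-in-10-year discharge (Gumbel fit to annual maxima)."""
    loc, scale = gumbel_r.fit(amax)
    q10 = gumbel_r.ppf(1 - 1 / 10, loc, scale)
    return amax.max() / q10

def seasonal_anomaly(days_of_year, record_day):
    """Circular distance (days) between the record flood's timing and the mean timing of annual maxima."""
    theta = 2 * np.pi * np.asarray(days_of_year) / 365.25
    mean_day = (np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
                * 365.25 / (2 * np.pi)) % 365.25
    d = abs(record_day - mean_day)
    return min(d, 365.25 - d)

# toy usage with a synthetic annual maximum series and winter-dominated flood timing
rng = np.random.default_rng(2)
amax = gumbel_r.rvs(loc=100, scale=30, size=50, random_state=rng)
winter_days = rng.integers(330, 420, 50) % 365
print(flood_index(amax), seasonal_anomaly(winter_days, record_day=200))
```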

  6. Gait Analysis Using Computer Vision Based on Cloud Platform and Mobile Device

    Directory of Open Access Journals (Sweden)

    Mario Nieto-Hidalgo

    2018-01-01

    Full Text Available Frailty and senility are syndromes that affect elderly people. The ageing process involves a decay of cognitive and motor functions that often affects the quality of life of elderly people. Some studies have linked this deterioration of cognitive and motor function to gait patterns. Thus, gait analysis can be a powerful tool to assess frailty and senility syndromes. In this paper, we propose a vision-based gait analysis approach performed on a smartphone with cloud computing assistance. Gait sequences recorded by a smartphone camera are processed by the smartphone itself to obtain spatiotemporal features. These features are uploaded to the cloud in order to analyse and compare them against a stored database to render a diagnosis. The feature extraction method presented can work with both frontal and sagittal gait sequences, although the sagittal view provides better classification, with an accuracy of 95%.

  7. A new global geomagnetic model based on archeomagnetic, volcanic and historical records

    Science.gov (United States)

    Arneitz, Patrick; Leonhardt, Roman; Fabian, Karl

    2016-04-01

    The major challenge of geomagnetic field reconstruction lies in the inhomogeneous spatio-temporal distribution of the available data and their highly variable quality. Paleo- and archeomagnetic records provide information about the ancient geomagnetic field beyond the historical period. Typically these data types have larger errors than their historical counterparts, and the investigated materials and applied experimental methods potentially bias field readings. Input data for the modelling approach were extracted from available collections of archeomagnetic, volcanic and historical records, which were integrated into a single database along with associated metadata. The iterative Bayesian inversion scheme used here targets reliable error treatment, which allows the different data types to be combined. The proposed model is scrutinized by carrying out tests with artificial records. Records are synthesized using a known field evolution generated by a geodynamo model with realistic energy characteristics. Using the artificial field, a synthetic data set is generated that exactly mirrors the existing measured records in all metadata, but provides the data that would have been observed if the artificial field had been real. After inversion of the synthetic data, the comparison of the known artificial Gauss coefficients with the modelled ones allows verification of the applied modelling strategy as well as examination of the potential and limits of the current data compilation.

  8. Steps and pips in the history of the cumulative recorder.

    OpenAIRE

    Lattal, Kennon A

    2004-01-01

    From its inception in the 1930s until very recent times, the cumulative recorder was the most widely used measurement instrument in the experimental analysis of behavior. It was an essential instrument in the discovery and analysis of schedules of reinforcement, providing the first real-time analysis of operant response rates and patterns. This review traces the evolution of the cumulative recorder from Skinner's early modified kymographs through various models developed by Skinner and his co...

  9. 77 FR 51949 - Privacy Act, Exempt Record System

    Science.gov (United States)

    2012-08-28

    ... because of fear of retaliation (e.g., from an employer or coworkers). Subsections (d)(2) through (d)(4... records, or appeals of denials of access to records, is appropriate because the procedures would serve no... accessed at: http://www.fda.gov/RegulatoryInformation/Guidances/ucm125166.htm . III. Analysis of Impacts...

  10. 32 CFR 989.21 - Record of decision (ROD).

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Record of decision (ROD). 989.21 Section 989.21 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ENVIRONMENTAL PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.21 Record of decision (ROD). (a) The proponent and the EPF...

  11. Design and Implementation of Track Record System Based on Android Platform

    Directory of Open Access Journals (Sweden)

    Zhang Jiachen

    2017-01-01

    Full Text Available To address the problem of vulnerable people getting lost or going missing, a track record system is designed. The Android mobile platform is used, and the positioning function of the Auto Navi Map Android SDK is employed to acquire positioning data from mobile terminals. Apache Tomcat Server and a MySQL database are used to build a server with a C/S (client/server) architecture. The mobile terminal interacts with the server through JSON data transmission based on the HTTP protocol, and the server saves the relevant information provided by the mobile terminal through JDBC into the corresponding tables in the database. The system can be used to monitor the whereabouts of family members and friends; compared with a PC-based solution, it is not only more flexible, convenient and fast, but also offers real-time operation and high efficiency. Testing showed that all functions work as intended.

  12. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    Science.gov (United States)

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

    A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
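    The central quantity, canonical correlations computed without individual-level data, can be obtained directly from the blocks of a correlation matrix; the toy data below only verify the linear algebra and do not reproduce metaCCA's shrinkage step:

```python
import numpy as np
from scipy.linalg import solve, eigvals

def canonical_correlations(c_xx, c_yy, c_xy):
    """Canonical correlations from correlation blocks alone, the quantity that can be
    recovered from summary statistics instead of individual-level records."""
    m = solve(c_xx, c_xy) @ solve(c_yy, c_xy.T)
    ev = np.sort(np.real(eigvals(m)))[::-1]
    return np.sqrt(np.clip(ev, 0.0, 1.0))

# toy check against individual-level data
rng = np.random.default_rng(3)
n = 5000
g = rng.normal(size=(n, 3))                                        # "genotypes"
p = (g @ rng.normal(size=(3, 4))) * 0.3 + rng.normal(size=(n, 4))  # "phenotypes"
c = np.corrcoef(np.hstack([g, p]).T)
print(canonical_correlations(c[:3, :3], c[3:, 3:], c[:3, 3:]))
```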

  13. Establishment of data base of regional seismic recordings from earthquakes, chemical explosions and nuclear explosions in the Former Soviet Union

    Energy Technology Data Exchange (ETDEWEB)

    Ermolenko, N.A.; Kopnichev, Yu.F.; Kunakov, V.G.; Kunakova, O.K.; Rakhmatullin, M.Kh.; Sokolova, I.N.; Vybornyy, Zh.I. [AN SSSR, Moscow (Russian Federation). Inst. Fiziki Zemli

    1995-06-01

    In this report results of work on establishment of a data base of regional seismic recordings from earthquakes, chemical explosions and nuclear explosions in the former Soviet Union are described. This work was carried out in the Complex Seismological Expedition (CSE) of the Joint Institute of Physics of the Earth of the Russian Academy of Sciences and Lawrence Livermore National Laboratory. The recording system, methods of investigations and primary data processing are described in detail. The largest number of digital records was received by the permanent seismic station Talgar, situated in the northern Tien Shan, 20 km to the east of Almaty city. More than half of the records are seismograms of underground nuclear explosions and chemical explosions. The nuclear explosions were recorded mainly from the Semipalatinsk test site. In addition, records of the explosions from the Chinese test site Lop Nor and industrial nuclear explosions from the West Siberia region were obtained. Four records of strong chemical explosions were picked out (two of them have been produced at the Semipalatinsk test site and two -- in Uzbekistan). We also obtained 16 records of crustal earthquakes, mainly from the Altai region, close to the Semipalatinsk test site, and also from the West China region, close to the Lop Nor test site. In addition, a small number of records of earthquakes and underground nuclear explosions, received by arrays of temporary stations, that have been working in the southern Kazakhstan region are included in this report. Parameters of the digital seismograms and file structure are described. Possible directions of future work on the digitizing of unique data archive are discussed.

  14. Attitudes toward inter-hospital electronic patient record exchange: discrepancies among physicians, medical record staff, and patients.

    Science.gov (United States)

    Wang, Jong-Yi; Ho, Hsiao-Yun; Chen, Jen-De; Chai, Sinkuo; Tai, Chih-Jaan; Chen, Yung-Fu

    2015-07-12

    In this era of ubiquitous information, patient record exchange among hospitals still has technological and individual barriers including resistance to information sharing. Most research on user attitudes has been limited to one type of user or aspect. Because few analyses of attitudes toward electronic patient records (EPRs) have been conducted, understanding the attitudes among different users in multiple aspects is crucial to user acceptance. This proof-of-concept study investigated the attitudes of users toward the inter-hospital EPR exchange system implemented nationwide and focused on discrepant behavioral intentions among three user groups. The system was designed by combining a Health Level 7-based protocol, object-relational mapping, and other medical informatics techniques to ensure interoperability in realizing patient-centered practices. After implementation, three user-specific questionnaires for physicians, medical record staff, and patients were administered, with a 70 % response rate. The instrument showed favorable convergent construct validity and internal consistency reliability. Two dependent variables were applied: the attitudes toward privacy and support. Independent variables comprised personal characteristics, work characteristics, human aspects, and technology aspects. Major statistical methods included exploratory factor analysis and general linear model. The results from 379 respondents indicated that the patients highly agreed with privacy protection by their consent and support for EPRs, whereas the physicians remained conservative toward both. Medical record staff was ranked in the middle among the three groups. The three user groups demonstrated discrepant intentions toward privacy protection and support. Experience of computer use, level of concerns, usefulness of functions, and specifically, reason to use electronic medical records and number of outpatient visits were significantly associated with the perceptions. Overall, four

  15. A 350 ka record of climate change from Lake El'gygytgyn, Far East Russian Arctic: refining the pattern of climate modes by means of cluster analysis

    Directory of Open Access Journals (Sweden)

    U. Frank

    2013-07-01

    Full Text Available Rock magnetic, biochemical and inorganic records of the sediment cores PG1351 and Lz1024 from Lake El'gygytgyn, Chukotka peninsula, Far East Russian Arctic, were subject to a hierarchical agglomerative cluster analysis in order to refine and extend the pattern of climate modes as defined by Melles et al. (2007). Cluster analysis of the data obtained from both cores yielded similar results, differentiating clearly between the four climate modes warm, peak warm, cold and dry, and cold and moist. In addition, two transitional phases were identified, representing the early stages of a cold phase and slightly colder conditions during a warm phase. The statistical approach can thus be used to resolve gradual changes in the sedimentary units as an indicator of available oxygen in the hypolimnion in greater detail. Based upon cluster analyses on core Lz1024, the published succession of climate modes in core PG1351, covering the last 250 ka, was modified and extended back to 350 ka. Comparison of the extended Lake El'gygytgyn parameter records of magnetic susceptibility (κLF), total organic carbon content (TOC) and the chemical index of alteration (CIA; Minyuk et al., 2007) with the marine oxygen isotope (δ18O) stack LR04 (Lisiecki and Raymo, 2005) and the summer insolation at 67.5° N revealed that all stages back to marine isotope stage (MIS) 10 and most of the substages are clearly reflected in the pattern derived from the cluster analysis.
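    The clustering step itself is standard and can be sketched with SciPy's hierarchical tools; the proxy matrix below is synthetic, and the choice of Ward linkage and four clusters mirrors the four climate modes only loosely:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# toy multiproxy matrix: rows = samples down-core, columns = proxies
# (e.g. magnetic susceptibility, TOC, chemical index of alteration); values are synthetic
rng = np.random.default_rng(4)
warm = rng.normal([0.2, 2.0, 60.0], 0.3, size=(40, 3))
cold = rng.normal([1.5, 0.3, 80.0], 0.3, size=(40, 3))
data = zscore(np.vstack([warm, cold, warm]), axis=0)   # standardize proxies before clustering

# Ward linkage on Euclidean distances, then cut the dendrogram into k clusters ("climate modes")
z = linkage(data, method="ward")
modes = fcluster(z, t=4, criterion="maxclust")
print(np.unique(modes, return_counts=True))
```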

  16. Cultural Heritage Recording Utilising Low-Cost Closerange Photogrammetry

    Directory of Open Access Journals (Sweden)

    Melanie Kirchhöfer

    2011-12-01

    Full Text Available Cultural heritage is under a constant threat of damage or even destruction, and comprehensive and accurate recording is necessary to attenuate the risk of losing heritage or to serve as a basis for reconstruction. Cost-effective and easy-to-use methods are required to record cultural heritage, particularly during a world recession, and close-range photogrammetry has proven potential in this area. Off-the-shelf digital cameras can be used to rapidly acquire data at low cost, allowing non-experts to become involved. Exterior orientation of the camera during exposure ideally needs to be established for every image, traditionally requiring known coordinated target points. Establishing these points is time consuming and costly, and using targets can often be undesirable on sensitive sites. MEMS-based sensors can assist in overcoming this problem by providing small-size and low-cost means to directly determine exterior orientation for close-range photogrammetry. This paper describes development of an image-based recording system, comprising an off-the-shelf digital SLR camera, a MEMS-based 3D orientation sensor and a GPS antenna. All system components were assembled in a compact and rigid frame that allows calibration of rotational and positional offsets between the components. The project involves collaboration between English Heritage and Loughborough University and the intention is to assess the system’s achievable accuracy and practicability in a heritage recording environment. Tests were conducted at Loughborough University and a case study at St. Catherine’s Oratory on the Isle of Wight, UK. These demonstrate that the data recorded by the system can indeed meet the accuracy requirements for heritage recording at medium accuracy (1-4 cm), with either a single or even no control points. As the recording system has been configured with a focus on low-cost and easy-to-use components, it is believed to be suitable for heritage recording by non

  17. Olivia Records: The Production of a Movement.

    Science.gov (United States)

    Morris, Bonnie

    2015-01-01

    This article looks at the early years of Olivia Records, setting the context for the historic release of the album Where Would I Be Without You. From its origins as a Washington, D.C.-based activist collective in 1973, Olivia became a hugely successful recording company, marketing radical lesbian recordings and performances that soon defined the "women's music" movement. Both artistically and politically, Olivia's woman-identified albums became the soundtrack for a generation awakening to lesbian activism. Pat Parker and Judy Grahn's 1976 spoken-word recording is a unique demonstration of Olivia's radical production values and expanding catalog.

  18. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed a Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment used were a patient monitor, the anaesthesia machine, and the bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, regional oximeter, and rapid infusion device were added as required. The automatic recording option was used exclusively and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.

  19. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly occurring terms in the Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
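    A minimal sketch of the descriptor-and-match idea, assuming the mahotas package for the Zernike moment computation; the mask, radius, degree, and enrollment templates are all illustrative:

```python
import numpy as np
import mahotas   # assumed available; provides a Zernike moment implementation

def hand_segment_descriptor(mask, degree=8):
    """Rotation-invariant Zernike moment descriptor of a binary segment (palm or finger) mask."""
    radius = mask.shape[0] // 2
    return mahotas.features.zernike_moments(mask.astype(np.uint8), radius, degree=degree)

def match(descriptor, enrollment_templates):
    """Nearest enrolled identity by Euclidean distance between fused descriptors."""
    dists = {name: np.linalg.norm(descriptor - t) for name, t in enrollment_templates.items()}
    return min(dists, key=dists.get)

# toy usage with a synthetic circular "palm" mask
yy, xx = np.mgrid[:128, :128]
palm = ((yy - 64) ** 2 + (xx - 64) ** 2) < 40 ** 2
d = hand_segment_descriptor(palm)
print(match(d, {"probe-self": d, "other": d + 0.3}))
```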

  20. Designing ETL Tools to Feed a Data Warehouse Based on Electronic Healthcare Record Infrastructure.

    Science.gov (United States)

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L

    2015-01-01

    The aim of this paper is to propose a methodology to design Extract, Transform and Load (ETL) tools in a clinical data warehouse architecture based on the Electronic Healthcare Record (EHR). This approach takes advantage of the use of this infrastructure as one of the main sources of information to feed the data warehouse, also taking into account that clinical documents produced by heterogeneous legacy systems are structured using the HL7 CDA standard. This paper describes the main activities to be performed to map the information collected in the different types of documents onto the dimensional model primitives.
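    A toy end-to-end sketch of such an ETL step: pull a few header fields out of an HL7 CDA document with standard-library XML parsing and load them into a patient dimension table. The XPath expressions below cover only this minimal example document and are not a complete CDA mapping:

```python
import sqlite3
import xml.etree.ElementTree as ET

NS = {"cda": "urn:hl7-org:v3"}   # HL7 CDA default namespace

def extract_patient(cda_xml: str) -> dict:
    """Extract a few demographic fields from a CDA header (illustrative, not a full mapping)."""
    root = ET.fromstring(cda_xml)
    pat = root.find(".//cda:recordTarget/cda:patientRole", NS)
    return {
        "patient_id": pat.find("cda:id", NS).get("extension"),
        "gender": pat.find("cda:patient/cda:administrativeGenderCode", NS).get("code"),
        "birth": pat.find("cda:patient/cda:birthTime", NS).get("value"),
    }

def load(rows, db=":memory:"):
    """Load extracted rows into a patient dimension table of a toy star schema."""
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS dim_patient "
                "(patient_id TEXT PRIMARY KEY, gender TEXT, birth TEXT)")
    con.executemany("INSERT OR REPLACE INTO dim_patient "
                    "VALUES (:patient_id, :gender, :birth)", rows)
    con.commit()
    return con

doc = """<ClinicalDocument xmlns="urn:hl7-org:v3"><recordTarget><patientRole>
<id extension="12345"/><patient><administrativeGenderCode code="F"/>
<birthTime value="19701231"/></patient></patientRole></recordTarget></ClinicalDocument>"""
con = load([extract_patient(doc)])
print(con.execute("SELECT * FROM dim_patient").fetchall())
```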

  1. Flood-flow analysis for Kabul river at Warsak on the basis of flow-records of Kabul river at Nowshera

    International Nuclear Information System (INIS)

    Khan, B.

    2007-01-01

    High flows and stream discharge have long been measured and used by engineers in the design of hydraulic structures and flood-protection works and in planning for flood-plain use. Probability analysis is the basis for the engineering design of many projects and for advance information in flood forecasting. High-flow analysis, or flood-frequency studies, interprets a past record of events to predict the future probability of occurrence. In many countries, including the author's country, the long-term flow data required for the design of hydraulic structures and flood-protection works are not available. In such cases, the only tool available to hydrologists is to extend the short-term flow data available at some other site in the region. The present study aims to obtain a reliable estimate of the maximum instantaneous flood at higher return periods for the Kabul River at Warsak weir. The Kabul River gaging station at Nowshera is used for this purpose, and regression analysis is performed to extend the instantaneous peak-flow record at Warsak to 29 years. The frequency curves of high flows are plotted on normal probability paper using different probability distributions. The Gumbel distribution provided the best fit to the observed data points and is used here to estimate floods for different return periods. (author)
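    The Gumbel-based estimation step can be sketched directly with SciPy; the synthetic 29-year peak-flow record below is illustrative, and the quantile relation Q_T = F^{-1}(1 - 1/T) is the standard return-period definition:

```python
import numpy as np
from scipy.stats import gumbel_r

def design_floods(peaks, return_periods=(10, 25, 50, 100)):
    """Fit a Gumbel (EV1) distribution to annual instantaneous peaks and return
    the design flood Q_T for each return period T, using Q_T = F^{-1}(1 - 1/T)."""
    loc, scale = gumbel_r.fit(peaks)
    return {T: float(gumbel_r.ppf(1 - 1 / T, loc, scale)) for T in return_periods}

# toy usage: a 29-year synthetic peak-flow record (m^3/s); values are illustrative only
rng = np.random.default_rng(5)
peaks = gumbel_r.rvs(loc=2500, scale=800, size=29, random_state=rng)
print(design_floods(peaks))
```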

  2. MultiElec: A MATLAB Based Application for MEA Data Analysis.

    Science.gov (United States)

    Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R

    2015-01-01

    We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.

  3. Comparisons of Portable Sleep Monitors of Different Modalities: Potential as Naturalistic Sleep Recorders

    Directory of Open Access Journals (Sweden)

    Masahiro Matsuo

    2016-07-01

    Full Text Available Background: Humans spend more than a fourth of their life sleeping, and sleep quality has been significantly linked to health. However, the objective examination of ambulatory sleep quality remains a challenge, since sleep is a state of unconsciousness, which limits the reliability of self-reports. Therefore, a non-invasive, continuous, and objective method for the recording and analysis of naturalistic sleep is required.Objective: Portable sleep recording devices provide a suitable solution for the ambulatory analysis of sleep quality. In this study, the performance of two activity-based sleep monitors (Actiwatch and MTN-210 and a single-channel EEG-based sleep monitor (SleepScope were compared in order to examine their reliability for the assessment of sleep quality.Methods: Twenty healthy adults were recruited for this study. First, data from daily activity recorded by Actiwatch and MTN-210 were compared to determine whether MTN-210, a more affordable device, could yield data similar to Actiwatch, the de-facto standard. In addition, sleep detection ability was examined using data obtained by polysomnography as reference. One simple analysis included comparing the sleep/wake detection ability of Actiwatch, MTN-210, and SleepScope. Furthermore, the fidelity of sleep stage determination was examined using SleepScope in finer time resolution. Results: The results indicate that MTN-210 demonstrates an activity pattern comparable to that of Actiwatch, although their sensitivity preferences were not identical. Moreover, MTN-210 provides assessment of sleep duration comparable to that of the wrist-worn Actiwatch when MTN-210 was attached to the body. SleepScope featured superior overall sleep detection performance among the three methods tested. Furthermore, SleepScope was able to provide information regarding sleep architecture, although systemic bias was found. Conclusion: The present results suggest that single-channel EEG-based sleep monitors are

  4. Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification

    Directory of Open Access Journals (Sweden)

    Yajie Zou

    2017-01-01

    Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in using the Empirical Bayes (EB) method to estimate safety from both observed crash records and predicted crash frequency based on similar sites. The performance of the EB method is highly dependent on the selection of a reference group of sites (i.e., roadway segments or intersections) similar to the target site, from which the safety performance functions (SPFs) used to predict crash frequency are developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select “similar” sites for building SPFs are developed. The performance of the clustering-based EB methods is then compared using real crash data. The HSID results, computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferred over the conventional statistical methods. Thus, properly classifying the road segments for heterogeneous crash data can further improve HSID accuracy.
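
    The workflow described, clustering sites, fitting a safety performance function per cluster, and then blending observed and predicted crashes with the EB weight w = 1/(1 + mu/k), can be sketched roughly as follows. This simplified Python illustration is not the authors' implementation: it uses K-means on two site attributes and a crude rate-based SPF, and the overdispersion parameter k is assumed.

        import numpy as np
        from sklearn.cluster import KMeans

        # Illustrative site data: AADT, segment length (km), observed crashes per year
        aadt     = np.array([4000, 4200, 9500, 10200, 9800, 3900, 10500, 4100])
        length   = np.array([1.2, 0.8, 1.5, 1.1, 0.9, 1.0, 1.3, 0.7])
        observed = np.array([2, 1, 9, 7, 11, 1, 8, 3])

        # 1) Cluster sites so each SPF is built from "similar" segments
        X = np.column_stack([aadt, length])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # 2) Crude per-cluster SPF: predicted crashes = cluster crash rate * exposure
        #    (a real SPF would be a negative binomial regression)
        exposure = aadt * length / 1e4
        mu = np.empty_like(observed, dtype=float)
        for c in np.unique(labels):
            m = labels == c
            mu[m] = observed[m].sum() / exposure[m].sum() * exposure[m]

        # 3) Empirical Bayes blend of prediction and observation
        k = 2.0                              # assumed NB overdispersion parameter
        w = 1.0 / (1.0 + mu / k)
        eb = w * mu + (1.0 - w) * observed
        print(np.argsort(eb)[::-1])          # site ranking for hotspot identification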

  5. High agreement between the new Mongolian electronic immunization register and written immunization records: a health centre based audit

    Directory of Open Access Journals (Sweden)

    Jocelyn Chan

    2017-09-01

    Introduction: Monitoring of vaccination coverage is vital for the prevention and control of vaccine-preventable diseases. Electronic immunization registers have been increasingly adopted to assist with the monitoring of vaccine coverage; however, there is limited literature about the use of electronic registers in low- and middle-income countries such as Mongolia. We aimed to determine the accuracy and completeness of the newly introduced electronic immunization register for calculating vaccination coverage and determining vaccine effectiveness within two districts in Mongolia, in comparison to written health provider records. Methods: We conducted a cross-sectional record review among children 2–23 months of age vaccinated at immunization clinics within the two districts. We linked data from written records with the electronic immunization register using the national identification number to determine the completeness and accuracy of the electronic register. Results: Both completeness (90.9%; 95% CI: 88.4–93.4) and accuracy (93.3%; 95% CI: 84.1–97.4) of the electronic immunization register were high when compared to written records. The increase in completeness over time indicated a delay in data entry. Conclusion: Through this audit, we have demonstrated concordance between a newly introduced electronic register and health provider records in a middle-income country setting. Based on this experience, we recommend that electronic registers be accompanied by routine quality assurance procedures for the monitoring of vaccination programmes in such settings.
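
    The audit logic, linking written and electronic records on the national identification number and then computing completeness (a written record found in the register) and accuracy (the linked entries agree), can be approximated in a few lines of pandas. The sketch below uses made-up column names and data and is not the study's analysis code.

        import pandas as pd

        # Written (reference) and electronic register extracts; columns are assumed
        written = pd.DataFrame({
            "national_id": [1, 2, 3, 4, 5],
            "dose":        ["DTP1", "DTP1", "DTP2", "MCV1", "DTP1"],
        })
        electronic = pd.DataFrame({
            "national_id": [1, 2, 4, 5],
            "dose":        ["DTP1", "DTP1", "MCV1", "DTP2"],
        })

        # Completeness: share of written doses that appear at all in the register
        linked = written.merge(electronic, on="national_id", how="left",
                               suffixes=("_paper", "_ehr"))
        completeness = linked["dose_ehr"].notna().mean()

        # Accuracy: among linked records, share where the recorded dose agrees
        matched = linked.dropna(subset=["dose_ehr"])
        accuracy = (matched["dose_paper"] == matched["dose_ehr"]).mean()

        print(f"completeness = {completeness:.1%}, accuracy = {accuracy:.1%}")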

  6. Rescaled range analysis of streamflow records in the São Francisco River Basin, Brazil

    Science.gov (United States)

    Araujo, Marcelo Vitor Oliveira; Celeste, Alcigeimes B.

    2018-01-01

    Hydrological time series are sometimes found to exhibit a distinctive behavior known as long-term persistence, in which successive values remain dependent on one another even over very large time scales. This implies multiyear runs of consecutive droughts or floods. Typical models used to generate synthetic hydrological scenarios, widely used in the planning and management of water resources, fail to preserve this kind of persistence in the generated data and may therefore have a major impact on projects whose design lives span long periods of time. This study evaluates long-term persistence in streamflow records by means of the rescaled range analysis proposed by the British engineer Harold E. Hurst, who first observed the phenomenon in the mid-twentieth century. In this paper, Hurst's procedure is enhanced by a strategy based on statistical hypothesis testing. The case study comprises the six main hydroelectric power plants located in the São Francisco River Basin, part of the Brazilian National Grid. Historical time series of inflows to the major reservoirs of the system are investigated, and five of the six sites show significant persistence, with values of the so-called Hurst exponent near or greater than 0.7, i.e., around 40% above the value of 0.5 that represents a white noise process. This suggests that decision makers should take long-term persistence into consideration when conducting water resources planning and management studies in the region.
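
    The rescaled range statistic behind the reported exponents is simple to compute: for windows of length n, divide the range of the cumulative mean-adjusted series by its standard deviation, and estimate H as the slope of log(R/S) against log(n). The following generic Python sketch illustrates that procedure (window sizes and the synthetic series are arbitrary); it is not the enhanced hypothesis-testing version used in the paper.

        import numpy as np

        def rescaled_range(x):
            """R/S statistic of a single window."""
            x = np.asarray(x, dtype=float)
            y = np.cumsum(x - x.mean())            # cumulative departures from the mean
            r = y.max() - y.min()                  # range of the adjusted partial sums
            s = x.std(ddof=1)                      # sample standard deviation
            return r / s

        def hurst_exponent(series, window_sizes):
            """Estimate H as the log-log slope of mean R/S versus window size n."""
            series = np.asarray(series, dtype=float)
            rs = []
            for n in window_sizes:
                chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
                rs.append(np.mean([rescaled_range(c) for c in chunks]))
            slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
            return slope

        rng = np.random.default_rng(1)
        flows = rng.normal(100, 20, 4096)          # white noise: H close to 0.5
        print(hurst_exponent(flows, [16, 32, 64, 128, 256]))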

  7. A stacking method and its applications to Lanzarote tide gauge records

    Science.gov (United States)

    Zhu, Ping; van Ruymbeke, Michel; Cadicheanu, Nicoleta

    2009-12-01

    A time-period analysis tool based on stacking is introduced in this paper. The original idea comes from the classical tidal analysis method: it is assumed that the period of each major tidal component is precisely determined from astronomical constants and does not change with time at a given point on the Earth. The tidal record is summed at the fixed center period T of a tidal component and then averaged. Stacking can significantly increase the signal-to-noise ratio (SNR) once a sufficient number of stacking cycles is reached. The stacking results are fitted with a sinusoidal function, and the amplitude and phase of the fitted curve are computed by least squares. The advantages of the method are that: (1) an individual periodic signal can be isolated by stacking; (2) a linear Stacking-Spectrum (SSP) can be constructed by varying the stacking period Ts; and (3) the time-period distribution of the singularity component can be approximated by a Sliding-Stacking approach. The shortcoming of the method is that, in order to isolate a low-energy frequency or to separate nearby frequencies, a sufficiently long series with a high sampling rate is needed. The method was tested on a numerical series and then applied, as an example, to 1788 days of Lanzarote tide gauge records.
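
    The two core operations, folding (stacking) the record at a trial period T and fitting a sinusoid to the stacked mean by least squares, can be sketched as follows. This is a generic Python illustration under simple assumptions (evenly sampled data, equal-width phase bins), not the authors' code.

        import numpy as np

        def stack_and_fit(t, x, period, n_bins=24):
            """Fold a series at a trial period, average per phase bin, then
            least-squares fit A*sin + B*cos to the stacked mean."""
            phase = (t % period) / period                     # phase in [0, 1)
            bins = np.floor(phase * n_bins).astype(int)
            stacked = np.array([x[bins == b].mean() for b in range(n_bins)])
            centers = (np.arange(n_bins) + 0.5) / n_bins

            # Linear least squares for x(phi) ~ A*sin(2*pi*phi) + B*cos(2*pi*phi) + C
            G = np.column_stack([np.sin(2 * np.pi * centers),
                                 np.cos(2 * np.pi * centers),
                                 np.ones(n_bins)])
            (a, b, c), *_ = np.linalg.lstsq(G, stacked, rcond=None)
            amplitude = np.hypot(a, b)
            phase_rad = np.arctan2(b, a)
            return amplitude, phase_rad

        # Synthetic 'tide gauge' record: M2-like signal (12.42 h period) plus noise
        rng = np.random.default_rng(2)
        t = np.arange(0, 60 * 24, 0.25)                       # hours, 60 days, 15-min step
        x = 0.8 * np.sin(2 * np.pi * t / 12.42) + rng.normal(0, 0.5, t.size)
        print(stack_and_fit(t, x, period=12.42))              # amplitude near 0.8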

  8. Developing an electronic health record (EHR) for methadone treatment recording and decision support

    Science.gov (United States)

    2011-01-01

    Background In this paper, we give an overview of methadone treatment in Ireland and outline the rationale for designing an electronic health record (EHR) with extensibility, interoperability and decision support functionality. Incorporating several international standards, a conceptual model applying a problem-orientated approach in a hierarchical structure has been proposed for building the EHR. Methods A set of archetypes has been designed in line with current best practice and the clinical guidelines that guide the information-gathering process. A web-based data entry system has been implemented, incorporating elements of the paper-based prescription form, while at the same time facilitating the decision support function. Results The use of archetypes was found to capture the ever-changing requirements of the healthcare domain and to externalise them in constrained data structures. The solution is extensible, enabling the EHR to cover medicine management in general, as per the programme of the HRB Centre for Primary Care Research. Conclusions The data collected via this Irish system can be aggregated into a larger dataset, if necessary, for analysis and evidence-gathering, since we adopted the openEHR standard. It will later be extended to include functionality for prescribing drugs other than methadone, in line with the research agenda of the HRB Centre for Primary Care Research in Ireland. PMID:21284849

  9. Developing an electronic health record (EHR) for methadone treatment recording and decision support

    LENUS (Irish Health Repository)

    Xiao, Liang

    2011-02-01

    Background In this paper, we give an overview of methadone treatment in Ireland and outline the rationale for designing an electronic health record (EHR) with extensibility, interoperability and decision support functionality. Incorporating several international standards, a conceptual model applying a problem-orientated approach in a hierarchical structure has been proposed for building the EHR. Methods A set of archetypes has been designed in line with current best practice and the clinical guidelines that guide the information-gathering process. A web-based data entry system has been implemented, incorporating elements of the paper-based prescription form, while at the same time facilitating the decision support function. Results The use of archetypes was found to capture the ever-changing requirements of the healthcare domain and to externalise them in constrained data structures. The solution is extensible, enabling the EHR to cover medicine management in general, as per the programme of the HRB Centre for Primary Care Research. Conclusions The data collected via this Irish system can be aggregated into a larger dataset, if necessary, for analysis and evidence-gathering, since we adopted the openEHR standard. It will later be extended to include functionality for prescribing drugs other than methadone, in line with the research agenda of the HRB Centre for Primary Care Research in Ireland.

  10. Record power, ultra-broadband supercontinuum source based on highly GeO2 doped silica fiber

    DEFF Research Database (Denmark)

    Jain, Deepak; Sidharthan, R.; Moselund, Peter M.

    2016-01-01

    We demonstrate highly germania-doped fibers for mid-infrared supercontinuum generation. Experiments yield a highest output power of 1.44 W for a broadest spectrum from 700 nm to 3200 nm, and 6.4 W for 800 nm to 2700 nm, from these fibers while being pumped by a broadband Erbium-Ytterbium-doped ... The results demonstrate the potential of a germania-based photonic crystal fiber or step-index fiber supercontinuum source for high-power, ultra-broadband emission when pumped by a 1060 nm or a 1550 nm laser source. To the best of our knowledge, this is the highest-power, ultra-broadband, all-fiberized supercontinuum light source based on silica and germania fiber demonstrated to date. (C) 2016 Optical Society of America

  11. A Recording-Based Method for Auralization of Rotorcraft Flyover Noise

    Science.gov (United States)

    Pera, Nicholas M.; Rizzi, Stephen A.; Krishnamurthy, Siddhartha; Fuller, Christopher R.; Christian, Andrew

    2018-01-01

    Rotorcraft noise is an active field of study as the sound produced by these vehicles is often found to be annoying. A means to auralize rotorcraft flyover noise is sought to help understand the factors leading to annoyance. Previous work by the authors focused on auralization of rotorcraft fly-in noise, in which a simplification was made that enabled the source noise synthesis to be based on a single emission angle. Here, the goal is to auralize a complete flyover event, so the source noise synthesis must be capable of traversing a range of emission angles. The synthesis uses a source noise definition process that yields periodic and aperiodic (modulation) components at a set of discrete emission angles. In this work, only the periodic components are used for the source noise synthesis for the flyover; the inclusion of modulation components is the subject of ongoing research. Propagation of the synthesized source noise to a ground observer is performed using the NASA Auralization Framework. The method is demonstrated using ground recordings from a flight test of the AS350 helicopter for the source noise definition.
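
    The requirement stated above, letting the periodic source-noise components sweep through a range of emission angles during the flyover, essentially amounts to interpolating harmonic amplitudes between the discrete angles at which they were defined and summing the harmonics at each instant. The sketch below is a strongly simplified Python illustration of that idea (made-up amplitude tables, two harmonics, an assumed blade-passage frequency, and no propagation or Doppler effects); it is not the NASA Auralization Framework.

        import numpy as np

        # Harmonic amplitude tables defined at discrete emission angles (illustrative)
        angles_deg = np.array([30.0, 60.0, 90.0, 120.0, 150.0])
        amp_h1 = np.array([0.2, 0.5, 1.0, 0.6, 0.3])       # 1st rotor harmonic
        amp_h2 = np.array([0.1, 0.3, 0.6, 0.4, 0.2])       # 2nd rotor harmonic

        def synthesize(emission_angle, duration, fs=44100, f0=17.0):
            """Synthesize periodic source noise for a time-varying emission angle;
            f0 is a nominal blade-passage frequency (assumed)."""
            t = np.arange(0, duration, 1.0 / fs)
            ang = emission_angle(t)                          # angle trajectory over the event
            a1 = np.interp(ang, angles_deg, amp_h1)          # interpolate amplitude tables
            a2 = np.interp(ang, angles_deg, amp_h2)
            return (a1 * np.sin(2 * np.pi * f0 * t) +
                    a2 * np.sin(2 * np.pi * 2 * f0 * t))

        # Flyover: emission angle sweeps from 30 deg (approach) to 150 deg (departure)
        signal = synthesize(lambda t: 30.0 + 120.0 * t / 10.0, duration=10.0)
        print(signal.shape)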

  12. An analysis of electronic health record-related patient safety concerns

    Science.gov (United States)

    Meeks, Derek W; Smith, Michael W; Taylor, Lesley; Sittig, Dean F; Scott, Jean M; Singh, Hardeep

    2014-01-01

    Objective A recent Institute of Medicine report called for attention to safety issues related to electronic health records (EHRs). We analyzed EHR-related safety concerns reported within a large, integrated healthcare system. Methods The Informatics Patient Safety Office of the Veterans Health Administration (VA) maintains a non-punitive, voluntary reporting system to collect and investigate EHR-related safety concerns (ie, adverse events, potential events, and near misses). We analyzed completed investigations using an eight-dimension sociotechnical conceptual model that accounted for both technical and non-technical dimensions of safety. Using the framework analysis approach to qualitative data, we identified emergent and recurring safety concerns common to multiple reports. Results We extracted 100 consecutive, unique, closed investigations between August 2009 and May 2013 from 344 reported incidents. Seventy-four involved unsafe technology and 25 involved unsafe use of technology. A majority (70%) involved two or more model dimensions. Most often, non-technical dimensions such as workflow, policies, and personnel interacted in a complex fashion with technical dimensions such as software/hardware, content, and user interface to produce safety concerns. Most (94%) safety concerns related to either unmet data-display needs in the EHR (ie, displayed information available to the end user failed to reduce uncertainty or led to increased potential for patient harm), software upgrades or modifications, data transmission between components of the EHR, or ‘hidden dependencies’ within the EHR. Discussion EHR-related safety concerns involving both unsafe technology and unsafe use of technology persist long after ‘go-live’ and despite the sophisticated EHR infrastructure represented in our data source. Currently, few healthcare institutions have reporting and analysis capabilities similar to the VA. Conclusions Because EHR-related safety concerns have complex

  13. Application of a net-based baseline correction scheme to strong-motion records of the 2011 Mw 9.0 Tohoku earthquake

    Science.gov (United States)

    Tu, Rui; Wang, Rongjiang; Zhang, Yong; Walter, Thomas R.

    2014-06-01

    The description of static displacements associated with earthquakes is traditionally achieved using GPS, EDM or InSAR data. In addition, displacement histories can be derived from strong-motion records, allowing an improvement of geodetic networks at a high sampling rate and a better physical understanding of earthquake processes. Strong-motion records require a correction procedure appropriate for baseline shifts that may be caused by rotational motion, tilting and other instrumental effects. Common methods use an empirical bilinear correction on the velocity seismograms integrated from the strong-motion records. In this study, we overcome the weaknesses of an empirically based bilinear baseline correction scheme by using a net-based criterion to select the timing parameters. This idea is based on the physical principle that low-frequency seismic waveforms at neighbouring stations are coherent if the interstation distance is much smaller than the distance to the seismic source. For a dense strong-motion network, it is plausible to select the timing parameters so that the correlation coefficient between the velocity seismograms of two neighbouring stations is maximized after the baseline correction. We applied this new concept to the KiK-Net and K-Net strong-motion data available for the 2011 Mw 9.0 Tohoku earthquake. We compared the derived coseismic static displacement with high-quality GPS data, and with the results obtained using empirical methods. The results show that the proposed net-based approach is feasible and more robust than the individual empirical approaches. The outliers caused by unknown problems in the measurement system can be easily detected and quantified.
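
    The selection criterion described, choosing the baseline-correction timing parameters that maximize the correlation between velocity seismograms at neighbouring stations, can be sketched in a few lines. The Python code below is a heavily simplified illustration of that idea (one timing parameter, a brute-force search, synthetic data), not the authors' processing chain.

        import numpy as np

        def bilinear_correct(vel, t, t1):
            """Simplified baseline correction: leave the trace untouched before t1
            and subtract a best-fit straight line from the portion after t1."""
            vel = vel.copy()
            late = t >= t1
            p = np.polyfit(t[late], vel[late], 1)         # linear drift after t1
            vel[late] -= np.polyval(p, t[late])
            return vel

        def best_t1(vel_a, vel_b, t, candidates):
            """Pick the timing parameter that maximizes the correlation between the
            corrected velocity traces of two neighbouring stations."""
            best, best_r = None, -np.inf
            for t1 in candidates:
                a = bilinear_correct(vel_a, t, t1)
                b = bilinear_correct(vel_b, t, t1)
                r = np.corrcoef(a, b)[0, 1]
                if r > best_r:
                    best, best_r = t1, r
            return best, best_r

        # Synthetic example: common low-frequency signal plus station-specific drift
        t = np.linspace(0, 200, 4001)
        sig = np.sin(2 * np.pi * t / 50.0)
        vel_a = sig + np.where(t > 80, 0.002 * (t - 80), 0.0)
        vel_b = sig + np.where(t > 95, -0.003 * (t - 95), 0.0)
        print(best_t1(vel_a, vel_b, t, candidates=np.arange(60, 140, 5)))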

  14. Mindfulness-based interventions for binge eating: a systematic review and meta-analysis.

    Science.gov (United States)

    Godfrey, Kathryn M; Gallo, Linda C; Afari, Niloofar

    2015-04-01

    Mindfulness-based interventions are increasingly used to treat binge eating. The effects of these interventions have not been reviewed comprehensively. This systematic review and meta-analysis sought to summarize the literature on mindfulness-based interventions and determine their impact on binge eating behavior. PubMed, Web of Science, and PsycINFO were searched using the keywords binge eating, overeating, objective bulimic episodes, acceptance and commitment therapy, dialectical behavior therapy, mindfulness, meditation, mindful eating. Of 151 records screened, 19 studies met inclusion criteria. Most studies showed effects of large magnitude. Results of random effects meta-analyses supported large or medium-large effects of these interventions on binge eating (within-group random effects mean Hedges' g = -1.12, 95% CI -1.67, -0.80, k = 18; between-group mean Hedges' g = -0.70, 95% CI -1.16, -0.24, k = 7). However, there was high statistical heterogeneity among the studies (within-group I(2) = 93%; between-group I(2) = 90%). Limitations and future research directions are discussed.
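
    The pooled within-group effects quoted here come from a random-effects model; the standard DerSimonian-Laird calculation behind such a pooled Hedges' g can be reproduced generically as below. The numbers in the example are invented and the sketch is not the authors' analysis.

        import numpy as np

        def random_effects_pool(effects, variances):
            """DerSimonian-Laird random-effects pooled effect, 95% CI and I^2."""
            g = np.asarray(effects, dtype=float)
            v = np.asarray(variances, dtype=float)
            w = 1.0 / v                                   # fixed-effect weights
            g_fixed = np.sum(w * g) / np.sum(w)
            q = np.sum(w * (g - g_fixed) ** 2)            # Cochran's Q
            df = len(g) - 1
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)                 # between-study variance
            w_re = 1.0 / (v + tau2)                       # random-effects weights
            g_re = np.sum(w_re * g) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
            return g_re, (g_re - 1.96 * se, g_re + 1.96 * se), i2

        # Illustrative within-group effect sizes (Hedges' g) and their variances
        effects = [-1.4, -0.9, -1.6, -0.7, -1.1]
        variances = [0.10, 0.08, 0.15, 0.05, 0.12]
        print(random_effects_pool(effects, variances))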

  15. On the use of binaural recordings for dynamic binaural reproduction

    DEFF Research Database (Denmark)

    Hoffmann, Pablo F.; Christensen, Flemming

    2011-01-01

    Binaural recordings are considered applicable only for static binaural reproduction. That is, playback of binaural recordings can only reproduce the sound field captured for the fixed position and orientation of the recording head. However, given some conditions it is possible to use binaural recordings for the reproduction of binaural signals that change according to the listener's actions, i.e. dynamic binaural reproduction. Here we examine the conditions that allow for such a dynamic recording/playback configuration and discuss advantages and disadvantages. Analysis and discussion focus on two...

  16. Collaborative Affordances of Hybrid Patient Record Technologies in Medical Work

    DEFF Research Database (Denmark)

    Houben, Steven; Frost, Mads; Bardram, Jakob E

    2015-01-01

    The medical record is a central artifact used to organize, communicate and coordinate information related to patient care. Despite recent deployments of electronic health records (EHR), paper medical records are still widely used because of the affordances of paper. Although a number of approaches have explored the integration of paper and digital technology, there is still a wide range of open issues in the design of technologies that integrate digital and paper-based medical records. This paper studies the use of one such novel technology, called the Hybrid Patient Record (HyPR), that is designed to digitally augment a paper medical record. We report on two studies: a field study in which we describe the benefits and challenges of using a combination of electronic and paper-based medical records in a large university hospital, and a deployment study in which we analyze how 8 clinicians used the HyPR...

  17. NMR-based urine analysis in rats: prediction of proximal tubule kidney toxicity and phospholipidosis.

    Science.gov (United States)

    Lienemann, Kai; Plötz, Thomas; Pestel, Sabine

    2008-01-01

    The aim of safety pharmacology is the early detection of compound-induced side effects. NMR-based urine analysis followed by multivariate data analysis (metabonomics) efficiently identifies differences between toxic and non-toxic compounds, but in most cases multiple administrations of the test compound are necessary. We tested the feasibility of detecting proximal tubule kidney toxicity and phospholipidosis with metabonomics techniques after a single compound administration, as an early safety pharmacology approach. Rats were treated orally, intravenously, by inhalation or intraperitoneally with different test compounds. Urine was collected at 0-8 h and 8-24 h after compound administration, and (1)H NMR patterns were recorded from the samples. Variation of post-processing and feature extraction methods led to different views of the data. Support Vector Machines were trained on these different data sets and then aggregated as experts in an ensemble. Finally, validity was monitored with a cross-validation study using training, validation, and test data sets. Proximal tubule kidney toxicity could be predicted with reasonable total classification accuracy (85%), specificity (88%) and sensitivity (78%). In comparison to alternative histological studies, results were obtained more quickly, less compound was required and, importantly, fewer animals were needed. In contrast, the induction of phospholipidosis by the test compounds could not be predicted using NMR-based urine analysis or the previously published biomarker PAG. NMR-based urine analysis was shown to effectively predict proximal tubule kidney toxicity after single compound administration in rats. Thus, this experimental design allows early detection of toxicity risks with relatively low amounts of compound in a reasonably short period of time.
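
    The classification strategy described, training Support Vector Machines on differently post-processed views of the same (1)H NMR data and aggregating them as experts in an ensemble, can be outlined as follows. This is an illustrative scikit-learn sketch with random stand-in data, not the authors' pipeline.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        # Stand-in data: two 'views' of the same samples (different feature extraction)
        n = 80
        y = rng.integers(0, 2, n)                          # 1 = proximal tubule toxic
        view_binned = rng.normal(0, 1, (n, 200)) + y[:, None] * 0.4
        view_peaks  = rng.normal(0, 1, (n, 40))  + y[:, None] * 0.6
        views = [view_binned, view_peaks]

        # Train one SVM expert per view, then aggregate by averaging class probabilities
        experts = [make_pipeline(StandardScaler(),
                                 SVC(kernel="rbf", probability=True)).fit(v, y)
                   for v in views]

        def ensemble_predict(view_samples):
            probs = np.mean([e.predict_proba(v)[:, 1]
                             for e, v in zip(experts, view_samples)], axis=0)
            return (probs >= 0.5).astype(int)

        print(ensemble_predict([view_binned[:5], view_peaks[:5]]))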

  18. Fibrous dysplasia of the cranial vault: quantitative analysis based on neural networks

    International Nuclear Information System (INIS)

    Arana, E.; Marti-Bonmati, L.; Paredes, R.; Molla, E.

    1998-01-01

    To assess the utility of statistical analysis and neural networks in the quantitative analysis of fibrous dysplasia of the cranial vault. Ten patients with fibrous dysplasia (six women and four men, with a mean age of 23.60±17.85 years) were selected from a series of 167 patients with lesions of the cranial vault evaluated by plain radiography and computed tomography (CT). Nineteen variables were taken from their medical records and radiological studies. Their characterization was based on statistical analysis and a neural network, and was validated by means of the leave-one-out method. The performance of the neural network was estimated by means of receiver operating characteristic (ROC) curves, using the area under the curve (Az) as the parameter. Bivariate analysis identified age, duration of symptoms, lytic and sclerotic patterns, sclerotic margin, ovoid shape, soft-tissue mass and periosteal reaction as significant variables. The area under the neural network curve was 0.9601±0.0435. The network selected the matrix and soft-tissue mass as the variables that were indispensable for diagnosis. The neural network shows high performance in the characterization of fibrous dysplasia of the cranial vault, disclosing occult interactions among the variables. (Author) 24 refs
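
    The validation scheme described, leave-one-out over a small patient series with performance summarized by the area under the ROC curve (Az), can be reproduced generically in a few lines of scikit-learn. The sketch below uses a small multilayer perceptron on random stand-in features; it is not the original network or data.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import LeaveOneOut
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(4)

        # Stand-in data: 30 lesions, 19 clinical/radiological variables,
        # 1 = fibrous dysplasia, 0 = other cranial vault lesion
        X = rng.normal(0, 1, (30, 19))
        y = rng.integers(0, 2, 30)
        X[y == 1] += 0.8                                   # make classes separable

        scores = np.empty(len(y))
        for train, test in LeaveOneOut().split(X):         # leave-one-out validation
            clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
            clf.fit(X[train], y[train])
            scores[test] = clf.predict_proba(X[test])[:, 1]

        print("Az =", roc_auc_score(y, scores))            # area under the ROC curve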

  19. Testing the correlation of fragmented pollen records of the middle and late Pleistocene temperate stages

    DEFF Research Database (Denmark)

    Kuneš, Petr; Odgaard, Bent Vad

    Quaternary temperate stages have long been described based on changing pollen abundances of various tree taxa in lacustrine sediments. Later, attempts have been made to assign such biostratigraphic units to distinct marine isotope stages (MIS). Existing continuous chronosequences from Southern ... records depends on site-to-site correlations. This comparison has often been performed on a visual basis, lacking clearly defined protocols and statements of underlying assumptions. Here I test the correlation of well and poorly known pollen records of the middle- and late-Pleistocene temperate stages from Northern-Central Europe and evaluate the usefulness of several numerical techniques. TWINSPAN analysis identifies groups of temperate stages based on presence/absence of their indicative taxa and may be useful for distinguishing between older and younger interglacials. Site-to-site sequence...

  20. Analysis of debris-flow recordings in an instrumented basin: confirmations and new findings

    Directory of Open Access Journals (Sweden)

    M. Arattano

    2012-03-01

    On 24 August 2006, a debris flow took place in the Moscardo Torrent, a basin of the Eastern Italian Alps instrumented for debris-flow monitoring. The debris flow was recorded by two seismic networks located in the lower part of the basin and on the alluvial fan, respectively. The event was also recorded by a pair of ultrasonic sensors installed on the fan, close to the lower seismic network. The comparison between the different recordings outlines particular features of the August 2006 debris flow, different from those of events recorded in previous years. A typical debris-flow wave was observed at the upper seismic network, with a main front abruptly appearing in the torrent, followed by a gradual decrease of flow height. On the alluvial fan, by contrast, the wave displayed an irregular pattern, with low flow depth and with the main peak occurring in the central part of the surge in both the seismic recording and the hydrographs. Recorded data and field evidence indicate that the surge observed on the alluvial fan was not a debris flow, and probably consisted of a water surge laden with fine to medium-sized sediment. The change in shape and characteristics of the wave can be ascribed to the attenuation of the surge caused by the torrent control works implemented in the lower basin in recent years.