WorldWideScience

Sample records for accurate telemetric recording

  1. Optimized surgical techniques and postoperative care improve survival rates and permit accurate telemetric recording in exercising mice

    Gassmann Max

    2009-08-01

    Full Text Available. Background: The laboratory mouse is commonly used as a sophisticated model in biomedical research. However, experiments requiring major surgery frequently lead to serious postoperative complications and death, particularly if genetically modified mice with anatomical and physiological abnormalities undergo extensive interventions such as transmitter implantation. Telemetric transmitters are used to study cardiovascular physiology and diseases. Telemetry yields reliable and accurate measurement of blood pressure in the free-roaming, unanaesthetized and unstressed mouse, but data recording is hampered substantially if measurements are made in an exercising mouse. Thus, we aimed to optimize transmitter implantation to improve telemetric signal recording in exercising mice as well as to establish a postoperative care regimen that promotes convalescence and survival of mice after major surgery in general. Results: We report an optimized telemetric transmitter implantation technique (fixation of the transmitter body on the back of the mouse with stainless steel wires) for subsequent measurement of arterial blood pressure during maximal exercise on a treadmill. This technique was used on normal (wildtype) mice and on transgenic mice with anatomical and physiological abnormalities due to constitutive overexpression of recombinant human erythropoietin. To promote convalescence of the animals after surgery, we established a regimen for postoperative intensive care: pain treatment (flunixine, 5 mg/kg bodyweight, subcutaneously, twice per day) and fluid therapy (600 μl, subcutaneously, twice per day) were administered for 7 days. In addition, warmth and free access to a high-energy liquid in a drinking bottle were provided for 14 days following transmitter implantation. This regimen led to a substantial decrease in overall morbidity and mortality. The refined postoperative care and surgical technique were particularly successful in genetically modified

  2. A digital programmable telemetric system for recording extracellular action potentials.

    Heredia-López, Francisco J; Bata-García, José L; Góngora-Alfaro, José L; Alvarez-Cervera, Fernando J; Azpiroz-Leehan, Joaquín

    2009-05-01

    This article describes the design and preliminary evaluation of a small-sized and low energy consumption wearable wireless telemetry system for the recording of extracellular neuronal activity, with the possibility of selecting one of four channels. The system comprises four radio frequency (RF) transceivers, three microcontrollers, and a digital amplifier and filter. This constitutes an innovative distributed processing approach. Gain, cutoff frequencies, and channel selection are remotely adjusted. Digital data transmission is used for both the bioelectrical signals and the control commands. This feature offers superior immunity to external RF interference. Real-time viewing of the acquired data allows the researcher to select only relevant data for storage. Simultaneous recordings of neuronal activity from the striatum of a freely moving rat, both with the wireless device and with a wired data acquisition system, are shown. PMID:19363175
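
    The record does not specify the command format used to adjust gain, cutoff frequency and channel selection remotely; the following Python sketch only illustrates how such settings might be packed into a small digital command packet for transmission. The field layout, gain table and packet width are assumptions, not the authors' protocol.

```python
import struct

# Hypothetical control-command layout (NOT the authors' protocol):
#   byte 0: channel index (0-3)
#   byte 1: gain code (index into a fixed gain table)
#   bytes 2-3: low-pass cutoff in Hz, unsigned 16-bit, little-endian
GAIN_TABLE = (100, 500, 1000, 5000)  # assumed selectable gains

def encode_command(channel: int, gain_code: int, cutoff_hz: int) -> bytes:
    """Pack a channel/gain/cutoff selection into a 4-byte command packet."""
    if not 0 <= channel <= 3:
        raise ValueError("channel must be 0-3")
    if not 0 <= gain_code < len(GAIN_TABLE):
        raise ValueError("unknown gain code")
    return struct.pack("<BBH", channel, gain_code, cutoff_hz)

def decode_command(packet: bytes) -> dict:
    """Recover the settings on the transmitter side."""
    channel, gain_code, cutoff_hz = struct.unpack("<BBH", packet)
    return {"channel": channel, "gain": GAIN_TABLE[gain_code], "cutoff_hz": cutoff_hz}

if __name__ == "__main__":
    pkt = encode_command(channel=2, gain_code=1, cutoff_hz=3000)
    print(pkt.hex(), decode_command(pkt))
```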

  3. Accurate blood pressure recording: Is it difficult?

    Bhalla A

    2005-11-01

    Full Text Available. BACKGROUND: Blood pressure (BP) measurement is a routine procedure, but errors are frequently committed during BP recording. AIMS AND SETTINGS: The aim of the study was to look at the prevalent practices in the institute regarding BP recording. The study was conducted in the Medicine Department at Government Medical College, Chandigarh, a teaching institute for MBBS students. METHODS: A prospective, observational study was performed amongst 80 doctors in a tertiary care hospital. All of them were observed by a single observer during the act of BP recording. The observer was well versed with the guidelines issued by the British Hypertension Society (BHS), and deviations from the standard set of guidelines issued by the BHS were noted. Errors were defined as deviations from these guidelines. STATISTICAL METHODS: The results were recorded as the percentage of doctors committing these errors. RESULTS: In our study, 90% used a mercury-type sphygmomanometer. Neither the zero error of the apparatus nor hand dominance was noted by anyone. Everyone used the standard BP cuff for recording BP. 70% did not let the patient rest before recording BP. 80% did not remove clothing from the arm. None of them recorded BP in both arms. In the outpatient setting, 80% recorded blood pressure in the sitting position and 14% in the supine position. In all patients whose BP was recorded in the sitting position, the BP apparatus was below the level of the heart, and 20% did not have the arm supported. 60% did not use the palpatory method to estimate systolic BP, and 70% did not raise the pressure 30-40 mm Hg above the systolic level before checking the BP by auscultation. 80% lowered the BP at a rate of more than 2 mm Hg/s, and 60% rounded off the BP to the nearest 5-10 mm Hg. 70% recorded BP only once, and 90% of the rest re-inflated the cuff without completely deflating it and allowing rest before a second reading was obtained. CONCLUSION: The practice of recording BP in our hospital varies from the standard

  4. Accurate blood pressure recording: Is it difficult?

    Bhalla A; Singh R; D'cruz S; Lehl S; Sachdev A

    2005-01-01

    BACKGROUND: Blood pressure (BP) measurement is a routine procedure but errors are frequently committed during BP recording. AIMS AND SETTINGS: The aim of the study was to look at the prevalent practices in the institute regarding BP recording. The study was conducted in the Medicine Department at Government Medical College, Chandigarh, a teaching institute for MBBS students. METHODS: A prospective, observational study was performed amongst the 80 doctors in a tertiary care hospital. All ...

  5. A Miniature Telemetric System for Freely Roaming Animals

    Zhan-Ping Wang; Chun-Peng Zhang; Guang-Zhan Fang; Yang Xia; Tie-Jun Liu; De-Zhong Yao

    2009-01-01

    Telemetric monitoring and control are two critical aspects of a robot-rat. Developed in this work is a telemetric system to record the electroencephalogram (EEG) from adult, freely roaming animals. The system consists of two separate components: the transmit-end system, which comprises the preamplifier, the LPF (low-pass filter) and the transmitter, and the receive-end system, which comprises the receiver and the interface between the receive-end and a PC. The transmit-end system, with light weight (10 g including battery) and small size (20 mm × 50 mm), is fastened to the back of the rat. The EEG signal is modulated at an RF frequency of 2.4 GHz by an nRF24E1 and transmitted by the antenna. The system can measure the EEG signal of a freely roaming rat over a wireless transmission distance of up to 8 m, and provides a new platform for behavioral and neurophysiological experiments.
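
    As a rough illustration of the low-pass filtering stage that precedes transmission in a system like this, the sketch below applies a digital Butterworth low-pass filter to a synthetic EEG-like trace. The sampling rate and cutoff frequency are assumed values; the analogue LPF characteristics of the published device are not given in this record.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Assumed values for illustration only; the record does not state the actual
# LPF corner frequency or sampling rate of the digitized EEG.
FS_HZ = 500.0      # assumed sampling rate
CUTOFF_HZ = 70.0   # assumed low-pass corner

def lowpass_eeg(samples: np.ndarray, fs: float = FS_HZ, cutoff: float = CUTOFF_HZ) -> np.ndarray:
    """Zero-phase 4th-order Butterworth low-pass, mimicking an anti-alias LPF stage."""
    b, a = butter(N=4, Wn=cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, samples)

if __name__ == "__main__":
    t = np.arange(0, 2.0, 1.0 / FS_HZ)
    eeg = np.sin(2 * np.pi * 8 * t) + 0.3 * np.sin(2 * np.pi * 180 * t)  # 8 Hz rhythm + HF noise
    clean = lowpass_eeg(eeg)
    print(f"peak before: {eeg.max():.2f}  peak after: {clean.max():.2f}")
```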

  6. Do foraminifera accurately record seawater neodymium isotope composition?

    Scrivner, Adam; Skinner, Luke; Vance, Derek

    2010-05-01

    Palaeoclimate studies involving the reconstruction of past Atlantic meridional overturning circulation increasingly employ isotopes of neodymium (Nd), measured on a variety of sample media (Frank, 2002). In the open ocean, Nd isotopes are a conservative tracer of water mass mixing and are unaffected by biological and low-temperature fractionation processes (Piepgras and Wasserburg, 1987; Lacan and Jeandel, 2005). For decades, benthic foraminifera have been widely utilised in stable isotope and geochemical studies, but have only recently begun to be exploited as a widely distributed, high-resolution Nd isotope archive (Klevenz et al., 2008), potentially circumventing the difficulties associated with other methods used to recover past deep-water Nd isotopes (Klevenz et al., 2008; Rutberg et al., 2000; Tachikawa et al., 2004). Thus far, a single pilot study (Klevenz et al., 2008) has indicated that core-top sedimentary benthic foraminifera record a Nd isotope composition in agreement with the nearest available bottom seawater data, and has suggested that this archive is potentially useful on both millennial and million-year timescales. Here we present seawater and proximal core-top foraminifer Nd isotope data for samples recovered during the 2008 "RETRO" cruise of the Marion Dufresne. The foraminifer samples comprise a depth-transect spanning 3000m of the water column in the Angola Basin and permit a direct comparison between high-resolution water column and core-top foraminiferal Nd isotope data. We use these data to assess the reliability of both planktonic and benthic foraminifera as recorders of water column neodymium isotope composition. Frank, M., 2002. Radiogenic isotopes: Tracers of past ocean circulation and erosional input, Rev. Geophys., 40 (1), 1001, doi:10.1029/2000RG000094. Klevenz, V., Vance, D., Schmidt, D.N., and Mezger, K., 2008. Neodymium isotopes in benthic foraminifera: Core-top systematics and a down-core record from the Neogene south Atlantic

  7. Telemetric Sensors for the Space Life Sciences

    Hines, John W.; Somps, Chris J.; Madou, Marc; Jeutter, Dean C.; Singh, Avtar; Connolly, John P. (Technical Monitor)

    1996-01-01

    Telemetric sensors for monitoring physiological changes in animal models in space are being developed by NASA's Sensors 2000! program. The sensors measure a variety of physiological measurands, including temperature, biopotentials, pressure, flow, acceleration, and chemical levels, and transmit these signals from the animals to a remote receiver via a wireless link. Thus physiologic information can be obtained continuously and automatically without animal handling, tethers, or percutaneous leads. We report here on NASA's development and testing of advanced wireless sensor systems for space life sciences research.

  8. The Telemetric and Holter ECG Warehouse (THEW) The first three years of development and Research

    Couderc, Jean-Philippe

    2012-01-01

    The Telemetric and Holter ECG Warehouse (THEW) hosts more than 3,700 digital 24-hour Holter ECG recordings from 13 independent studies. In addition to the ECGs, the repository includes patient information in a separate clinical database, with content varying according to the focus of each study. In its third year of activities, the THEW database has been accessed by researchers from 37 universities and 16 corporations located in 16 countries worldwide. Twenty publications have been released primarily focus...

  9. The Jalisco Seismic Telemetric Network (RESJAL)

    Nunez-Cornu, F. J.; Reyes-Davila, G.; Suarez-Plascencia, C.; Gonzalez-Ledezma, M.; Garcia-Puga, J.

    2001-12-01

    The Jalisco region is one of the most seismically active regions in Mexico; the main tectonic units in this region are the Jalisco Block and the Rivera Plate. The greatest earthquake (M = 8.2) to occur in Mexico in the twentieth century (1932) took place on the coast of Jalisco and was followed by another event (Ms = 7.8) fifteen days later. In 1995 an earthquake of magnitude 8.0 took place on the coast of Jalisco, but its rupture area covered only the southern half of the rupture area proposed for the 1932 earthquakes. These facts suggest the existence of an important seismic gap on the north coast of Jalisco, which includes the area of Bahía de Banderas. Subduction earthquakes are not the only large events in this region; there are also large inland earthquakes, such as the December 27, 1568 and February 11, 1872 events. There are also three active volcanoes: Sangangüey, Ceboruco and the most active volcano in Mexico, the Colima volcano. In spite of these facts and the risk associated with these processes, there was only one permanent seismological station, at Chamela on the coast of Jalisco, plus an analog telemetric network (RESCO) located on the Colima Volcano and the southern part of the Colima Rift Zone (CRZ). For these reasons, the Unidad Estatal de Protección Civil de Jalisco (Jalisco Civil Defense) began a project to install a digital telemetric network in the region in several phases; the project is being carried out jointly with SisVOc UdeG. Owing to the size of the area and the topography of the region, it is very difficult to obtain direct telemetric links, so the network is designed as cells with nodes, where the nodes are the different campuses of the University of Guadalajara located in the region; all campuses are linked by a computer network. The first phase started in August 2001 and includes the installation of six stations, each with a Kinemetrics Everest 24-bit datalogger, GPS time, and a Lennartz LE3Dlite 1 Hz sensor, using KNI NMS for control and data acquisition

  10. Metamaterial based telemetric strain sensing in different materials

    Melik, Rohat; Unal, Emre; Perkgoz, Nihan Kosku; Puttlitz, Christian; Demir, Hilmi Volkan

    2010-01-01

    We present telemetric sensing of surface strains on different industrial materials using split-ring-resonator based metamaterials. For wireless strain sensing, we utilize metamaterial array architectures for high sensitivity and low nonlinearity errors in strain sensing. In this work, telemetric strain measurements in three test materials, cast polyamide, Delrin and polyamide, are performed by observing the operating frequency shift under mechanical deformation, and these data are compared with c...

  11. One Unequal Error Control Method for Telemetric Data Transmission

    Hirner, Tomáš; Farkaš, Peter; Krile, Srečko

    2011-05-01

    In wireless sensor networks (WSN) it is necessary to use very simple codes for the transmission of information, since the nodes in these networks usually have only limited energy available, not only for transmission but also for processing. On the other hand, common codes usually do not take into account the fact that, in the case of telemetric information, the weights of the individual bit orders are not equal, and errors in different orders cause different deviations from the correct value. In this contribution, new, very simple codes for the transmission of telemetric information over WSN are presented which take the above-mentioned requirements into account. The resulting square deviation is used as a quality evaluation criterion.
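
    The codes themselves are not reproduced in this record, but the stated motivation, that errors in different bit orders cause different deviations, can be illustrated numerically. The sketch below assumes a 10-bit binary-coded telemetric sample (the word width is an assumption) and shows how the squared deviation caused by a single bit error grows with the bit position.

```python
# Illustrative only: why the weights of individual bit "orders" differ.
# Flipping bit k of a binary-coded telemetric sample changes its value by
# 2**k, so the squared deviation grows as 4**k; high-order bits therefore
# deserve stronger protection than low-order bits.
WORD_BITS = 10  # assumed sample width

def squared_deviation(bit_position: int) -> int:
    """Squared deviation caused by a single error in the given bit position."""
    return (1 << bit_position) ** 2

if __name__ == "__main__":
    per_bit = [squared_deviation(k) for k in range(WORD_BITS)]
    for k, d in enumerate(per_bit):
        print(f"bit {k}: squared deviation = {d}")
    # Mean square deviation if a single error is equally likely in any position:
    print("uniform single-error MSE =", sum(per_bit) / WORD_BITS)
```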

  12. Clinical experience with telemetric intracranial pressure monitoring in a Danish neurosurgical center

    Lilja, Alexander; Andresen, Morten; Hadi, Amer;

    2014-01-01

    ... procedure related risks of repeated transducer insertions. MATERIALS AND METHODS: We identified all patients in our clinic with an implanted Raumedic(®) telemetric ICP probe (NEUROVENT(®)-P-tel). For each patient we identified diagnosis, indication for implantation, surgical complications, duration of ICP reading, number of ICP recording sessions (in relation to symptoms of increased ICP) and their clinical consequence. RESULTS: We included 21 patients in the evaluation (11 female and 10 male). Median age was 28 (2-83) years and median duration of disease was 11 (0-30) years. Eleven patients had various ... recording session was 154 (8-433) days. In total, 86 recording sessions were performed; 29 resulted in surgical shunt revision, 30 in change of acetazolamide dose or programmable valve setting, 20 required no action and 5 resulted in a new recording session. No surgical complications occurred, except for ...

  13. Surgical implantation and functional assessment of an invasive telemetric system to measure autonomic responses in domestic pigs.

    Krause, A; Zebunke, M; Bellmann, O; Mohr, E; Langbein, J; Puppe, B

    2016-01-01

    The first aim of this study was to establish a surgical procedure to implant a new telemetric device for the continuous recording of the electrocardiogram (ECG) and blood pressure (BP) in freely moving pigs. A second aim was the functional assessment of cardiovascular parameters, including heart rate variability (HRV) and blood pressure variability (BPV), so that these data could be used as the basis for the objective evaluation of autonomic activity and balance in different behavioural contexts. Eleven domestic pigs (German Landrace) underwent surgery for the placement of a telemetric device. At day 15 after surgery, 512 consecutive inter-beat intervals and pressure waves were analysed using different detection methods (automatic and manually corrected) while the animals were resting or feeding, respectively. HRV and BPV were calculated. Incomplete datasets were found in four pigs due to missing ECG or BP signals. Technical and surgical issues concerning catheterisation and detachment of the negative ECG lead were continuously improved. In the remaining pigs, excellent signal quality (manually corrected data of 1%) was obtained during resting and acceptable signal quality during feeding in the ECG recordings. Sympathetic arousal with accompanying vagal withdrawal during feeding was documented. The established surgical implantation and functional assessment of the telemetric system, with reliable registration of cardiovascular parameters in freely moving pigs, could serve as a basis for future studies of autonomic regulation in the context of stress and animal welfare. PMID:26626089
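
    The record does not list the specific HRV parameters computed; the sketch below shows standard time-domain HRV metrics (SDNN, RMSSD) calculated from a synthetic series of 512 inter-beat intervals, as one plausible illustration of this analysis step. The interval values are invented.

```python
import numpy as np

def hrv_time_domain(ibi_ms: np.ndarray) -> dict:
    """Standard time-domain HRV metrics from a series of inter-beat intervals (ms)."""
    diffs = np.diff(ibi_ms)
    return {
        "mean_ibi_ms": float(np.mean(ibi_ms)),
        "sdnn_ms": float(np.std(ibi_ms, ddof=1)),         # overall variability
        "rmssd_ms": float(np.sqrt(np.mean(diffs ** 2))),   # short-term (vagal) variability
        "mean_hr_bpm": float(60_000.0 / np.mean(ibi_ms)),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 512 synthetic inter-beat intervals around 600 ms (~100 bpm), invented for the demo
    ibi = 600 + rng.normal(0, 20, size=512)
    print(hrv_time_domain(ibi))
```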

  14. Telemetric Technologies for the Assay of Gene Expression

    Paul, Anna-Lisa; Bamsey, Matthew; Berinstain, Alain; Neron, Philip; Graham, Thomas; Ferl, Robert

    Telemetric data collection has been widely used in spaceflight applications where human participation is limited (orbital mission payloads) or unfeasible (planetary landers, satellites, and probes). The transmission of digital data from electronic sensors of typical environmental parameters, growth patterns and physical properties of materials is routine telemetry, and even the collection and transmission of deep space images is a standard tool of astrophysics. But telemetric imaging for current biological payloads has thus far been limited to the collection of standard white-light photography that is largely confined to reporting the surface characteristics of the specimens involved. Advances in imaging technologies that facilitate the collection of a variety of light wavelengths will expand the science return on biological payloads to include evaluations of the molecular genetic response of organisms to the spaceflight or extraterrestrial environment, with minimal or no human intervention. Advanced imaging technology in combination with biologically engineered sensor organisms can create a system that can report via telemetry on the patterns of gene expression required to adapt to a novel environment. The utilization of genetically engineered plants as biosensors has made elegant strides in recent years, providing keen insights into the health of plants in general and particularly into the nature and cellular location of stress responses. Moreover, molecular responses to gravitational vectors have been elegantly analyzed with fluorescent tools. Green Fluorescent Protein (GFP) and other fluorophores have made it possible for analyses of gene expression and biological responses to occur telemetrically, with the information potentially delivered to the investigator over large distances as simple, preprocessed fluorescence images. Having previously deployed transgenic plant biosensors to evaluate responses to orbital spaceflight, we wish to develop both the plants

  15. Development of a telemetric gas system for the study of seismic signals

    A telemetric gas system is under development and adjustment. Diagrams are presented for measurements of gases from groundwater in the Eifel (Germany) seismogenic area. Two water traps for the protection of the sensors, together with sensors for helium, radon-222 and physical parameters such as temperature and pressure, are used. In this telemetric gas system the information is converted from analogue to digital, and vice versa, using electronic units, PC computers and modems. Keywords: eperm system, telemetry, carbon dioxide, multiparameter station, real time

  16. Chlorine nuclear quadrupole resonance spectrometer with accurate recording the resonant frequency

    A spectrometer with automatic frequency control (AFC) of the nuclear quadrupole resonance (NQR) detector has been developed for decreasing errors and automating the measurement of NQR frequencies. A parametric superregenerator is used as the signal detector in the 35Cl NQR spectrometer. A digital frequency meter and a band puncher are used for measuring and recording the values of the synthesizer frequency; the first derivative of the NQR signal is recorded by a two-coordinate recorder. The AFC circuit consists of an audio generator, an amplitude detector, a selective low-frequency amplifier, a low-frequency phase detector, a direct-current amplifier and a voltage adder. The recording of the first derivative of the 35Cl NQR signal in KClO3 at a modulation frequency of 250 Hz is given to illustrate the operation of the NQR spectrometer. The rms error of measurement of the 35Cl NQR frequency in KClO3 is ±1.5 Hz, which corresponds to possible sample temperature changes with an accuracy of ±0.0003 K

  17. Telemetric control of peripheral lipophagy by hypothalamic autophagy.

    Martinez-Lopez, Nuria; Singh, Rajat

    2016-08-01

    Autophagy maintains cellular quality control by degrading organelles, and cytosolic proteins and their aggregates, in lysosomes. Autophagy also degrades lipid droplets (LD) through a process termed lipophagy. During lipophagy, LD are sequestered within autophagosomes and degraded by lysosomal acid lipases to generate free fatty acids that are β-oxidized for energy. Lipophagy was discovered in hepatocytes, and since then has been shown to function in diverse cell types. Whether lipophagy degrades LD in the major fat-storing cell, the adipocyte, remained unclear. We have found that blocking autophagy in brown adipose tissues (BAT) by deleting the autophagy gene Atg7 in BAT MYF5 (myogenic factor 5)-positive progenitors increases basal lipid content in BAT and decreases lipid utilization during cold exposure, indicating that lipophagy contributes to lipohomeostasis in the adipose tissue. Surprisingly, knocking out Atg7 in hypothalamic proopiomelanocortin (POMC) neurons also blocks lipophagy in BAT and liver, suggesting that specific neurons within the central nervous system (CNS) exert telemetric control over lipophagy in BAT and liver. PMID:27341145

  18. Design and Optimization of a Telemetric system for appliance in earthquake prediction

    Bogdos, G.; Tassoulas, E.; Vereses, A.; Papapanagiotou, A.; Filippi, K.; Koulouras, G.; Nomicos, C.

    2009-04-01

    The aim of this project is to design a telemetric system able to collect data from a digitizer, transform it into an appropriate form, and transfer it to a central system where on-line data processing takes place. On-line mathematical processing (fractal analysis) of pre-seismic electromagnetic signals and instant display may lead to reliable earthquake prediction methodologies. Ad-hoc connections and heterogeneous topologies form the core network, while wired and wireless means cooperate for accurate and on-time transmission. The data are considered very sensitive in nature, so transmission needs to be instant. All stations are situated in rural places in order to prevent electromagnetic interference; this imposes continuous monitoring and the provision of backup data links. The central stations collect the data of every station and allocate them properly in a predefined database. Special software is designed to process the incoming data mathematically and export it graphically. The development work included digitizer design, workstation software design, transmission protocol study and simulation in OPNET, database programming, mathematical data processing and software development for graphical representation. The whole package was tested under laboratory conditions and then in real conditions. The main interest for the scientific community lies in the prospect of this platform eventually being implemented and installed on a large scale in the Greek countryside. The platform is designed in such a way that data mining techniques and mathematical processing are possible and any extension can be adapted. The particular feature of this project is that these mechanisms and mathematical transformations can be applied to live data, which allows rapid exploitation of the real meaning of the measured and stored data. The primary intention of this study is to help and simplify the analysis process
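
    The record does not state which fractal estimator is applied to the pre-seismic electromagnetic signals; the Higuchi fractal dimension shown below is one common choice and is included only to illustrate the kind of on-line mathematical processing described. The parameter k_max and the test signals are arbitrary.

```python
import numpy as np

def higuchi_fd(x: np.ndarray, k_max: int = 8) -> float:
    """Higuchi fractal dimension of a 1-D signal; one common choice of 'fractal
    analysis' for field signals (the record does not name the authors' estimator)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    log_k, log_l = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # curve length of the subsampled series, normalised per Higuchi (1988)
            lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((idx.size - 1) * k * k)
            lengths.append(lm)
        log_k.append(np.log(1.0 / k))
        log_l.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_k, log_l, 1)   # slope of log L(k) vs log(1/k) is the FD
    return float(slope)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    white_noise = rng.normal(size=4096)                 # FD close to 2
    smooth = np.sin(np.linspace(0, 20 * np.pi, 4096))   # FD close to 1
    print("noise FD ≈", round(higuchi_fd(white_noise), 2))
    print("sine  FD ≈", round(higuchi_fd(smooth), 2))
```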

  19. Combining Spatial and Telemetric Features for Learning Animal Movement Models

    Kapicioglu, Berk; Wikelski, Martin; Broderick, Tamara

    2012-01-01

    We introduce a new graphical model for tracking radio-tagged animals and learning their movement patterns. The model provides a principled way to combine radio telemetry data with an arbitrary set of user-defined, spatial features. We describe an efficient stochastic gradient algorithm for fitting model parameters to data and demonstrate its effectiveness via asymptotic analysis and synthetic experiments. We also apply our model to real datasets, and show that it outperforms the most popular radio telemetry software package used in ecology. We conclude that integration of different data sources under a single statistical framework, coupled with appropriate parameter and state estimation procedures, produces both accurate location estimates and an interpretable statistical model of animal movement.
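
    The authors' graphical model is not detailed in this record; the minimal sketch below fits a softmax step-selection model over user-defined spatial features by stochastic gradient ascent, only to illustrate how telemetry observations and spatial features can be combined and fitted with a stochastic gradient method. The synthetic data, feature dimensions and learning rate are assumptions, not the published model.

```python
import numpy as np

# Minimal sketch (not the authors' model): score each candidate next location by a
# linear combination of spatial features, turn the scores into a softmax probability,
# and fit the feature weights by stochastic gradient ascent on the log-likelihood of
# the observed (telemetry) locations.
rng = np.random.default_rng(0)

def fit_weights(features, chosen, n_epochs=50, lr=0.1):
    """features: (n_steps, n_candidates, n_features); chosen: index of the true location."""
    n_steps, n_cand, n_feat = features.shape
    w = np.zeros(n_feat)
    for _ in range(n_epochs):
        for t in rng.permutation(n_steps):
            scores = features[t] @ w
            p = np.exp(scores - scores.max())
            p /= p.sum()
            grad = features[t, chosen[t]] - p @ features[t]  # d log-likelihood / dw
            w += lr * grad
    return w

if __name__ == "__main__":
    true_w = np.array([1.5, -0.8])
    feats = rng.normal(size=(200, 10, 2))
    probs = np.exp(feats @ true_w)
    probs /= probs.sum(axis=1, keepdims=True)
    chosen = np.array([rng.choice(10, p=p) for p in probs])
    print("recovered weights ≈", np.round(fit_weights(feats, chosen), 2))
```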

  20. Home labour induction with retrievable prostaglandin pessary and continuous telemetric trans-abdominal fetal ECG monitoring.

    Zubair Rauf

    Full Text Available. OBJECTIVE: To evaluate the feasibility of continuous telemetric trans-abdominal fetal electrocardiogram (a-fECG) in women undergoing labour induction at home. STUDY DESIGN: Low risk women with singleton term pregnancy undergoing labour induction with retrievable, slow-release dinoprostone pessaries (n = 70) were allowed home for up to 24 hours, while a-fECG and uterine activity were monitored in hospital via wireless technology. Semi-structured diaries were analysed using a combined descriptive and interpretive approach. RESULTS: 62/70 women (89%) had successful home monitoring; 8 women (11%) were recalled because of signal loss. Home monitoring lasted between 2-22 hours (median 10 hours). Good quality signal was achieved most of the time (86%, SD 10%). 3 women were recalled back to hospital for suspicious a-fECG. In 2 cases suspicious a-fECG persisted, requiring Caesarean section after recall to hospital. 48/51 women who returned the diary coped well (94%); 46/51 were satisfied with home monitoring (90%). CONCLUSIONS: Continuous telemetric trans-abdominal fetal ECG monitoring of ambulatory women undergoing labour induction is feasible and acceptable to women.

  1. Telemetric Catheter-Based Pressure Sensor for Hemodynamic Monitoring: Experimental Experience

    The purpose of this study was to evaluate the technical and animal experimental feasibility of a percutaneously implantable pulmonary arterial implant for permanent hemodynamic monitoring. Two systems for measuring pulmonary artery pressure (PAP) as well as pulmonary artery occlusion pressure (PAOP) were developed by modifying a commercially available pulmonary artery catheter (PAC). First, a cable-bound catheter-based system was designed by implementation of a capacitive absolute-pressure sensor in the catheter tip. This system was developed further into a completely implantable telemetric system. The devices were tested in an acute setting in a total of 10 sheep. The implant was placed with its tip in the descending pulmonary artery via the right jugular approach. Results were compared with conventional PAC positioned in the contralateral pulmonary artery using Pearson's correlation coefficients and Bland-Altman plots. Implantation of the monitoring systems was uneventful in 10 animals. Data from two fully functional cable-bound and telemetric pressure monitoring systems were available, with a total of 18,506 measurements. There was an excellent correlation between reference data and the data obtained with the implants (r = 0.9944). Bland-Altman plots indicated a very good agreement between the techniques. We report the development and successful initial test of an implantable catheter-based device for long-term measurement of PAP and PAOP. Both devices may be applicable for hemodynamic monitoring. Further long-term studies for assessing reliability and durability of the device are warranted.
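
    The comparison statistics named in the record (Pearson correlation and Bland-Altman agreement) can be computed as in the sketch below; the synthetic pressure values are invented and serve only to show the calculation.

```python
import numpy as np

def bland_altman_stats(reference: np.ndarray, implant: np.ndarray) -> dict:
    """Pearson r plus Bland-Altman bias and 95% limits of agreement for paired
    pressure readings, the same summary statistics named in the record."""
    r = float(np.corrcoef(reference, implant)[0, 1])
    diff = implant - reference
    bias = float(np.mean(diff))
    sd = float(np.std(diff, ddof=1))
    return {"pearson_r": r, "bias": bias, "loa_low": bias - 1.96 * sd, "loa_high": bias + 1.96 * sd}

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pap_ref = rng.normal(25, 5, size=500)               # synthetic reference PAP (mmHg)
    pap_tel = pap_ref + rng.normal(0.3, 1.0, size=500)  # synthetic telemetric readings with small offset
    print(bland_altman_stats(pap_ref, pap_tel))
```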

  2. Design and Performance of a Low-Cost Telemetric Laparoscopic Tactile Grasper.

    Schostek, Sebastian; Zimmermann, Melanie; Schurr, Marc O; Prosst, Ruediger L

    2016-06-01

    Tactile feedback, which would provide information about tissue compliance, texture, structural features, and foreign bodies, is completely lost in laparoscopic surgery. We developed a system with artificial tactile feedback for laparoscopic surgery that consists of a telemetric tactile laparoscopic grasper, a remote PC with customized software, and a commercial video mixer. A standard, non-sensorized laparoscopic grasper was customized to allow the integration of a tactile sensor and its electronics. The tactile sensor and the electronics module were designed to be detachable from the instrument. These parts are lightweight and wireless, thus not impeding the use of the device as a surgical instrument. The remaining system components used to generate the visualization of the tactile data do not influence the workflow in the operating room. The overall system design of the described instrumentation allows for easy implementation in an operating room environment. The fabrication of the tactile sensor is relatively easy and the production costs are low. With this telemetric laparoscopic grasper instrument, systematic preclinical studies can be performed in which surgeons execute surgical tasks that are derived from clinical reality. The experience gained from these investigations could then be used to define the requirements for any further development of artificial tactile feedback systems. PMID:26546367

  3. Comparison between core temperatures measured telemetrically using the CorTemp® ingestible temperature sensor and rectal temperature in healthy Labrador retrievers

    Osinchuk, Stephanie; Taylor, Susan M.; Shmon, Cindy L.; Pharr, John; Campbell, John

    2014-01-01

    This study evaluated the CorTemp® ingestible telemetric core body temperature sensor in dogs, to establish the relationship between rectal temperature and telemetrically measured core body temperature at rest and during exercise, and to examine the effect of sensor location in the gastrointestinal (GI) tract on measured core temperature. CorTemp® sensors were administered orally to fasted Labrador retriever dogs and radiographs were taken to document sensor location. Core and rectal temperatu...

  4. Telemetric left ventricular monitoring using wireless telemetry in the rabbit model

    Zavala Diana L

    2011-09-01

    Full Text Available. Background: Heart failure is a critical condition that affects many people and often results from left ventricular dysfunction. Numerous studies investigating this condition have been performed using various model systems. To do so, investigators must be able to accurately measure myocardial performance in order to determine the degree of left ventricular function. In this model development study, we employ a wireless telemetry system purchased from Data Sciences International to continuously assess left ventricular function in the rabbit model. Findings: We surgically implanted pressure-sensitive catheters fitted to wireless radio-transmitters into the left ventricle of Dutch-belted rabbits. Following recovery of the animals, we continuously recorded indices of cardiac contractility and ventricular relaxation at baseline for a given time period. The telemetry system allowed us to continuously record baseline left ventricular parameters for the entire recording period. During this time, the animals were unrestrained and fully conscious. The values we recorded are similar to those obtained using other reported methods. Conclusions: The wireless telemetry system can continuously measure left ventricular pressure, cardiac contractility, and cardiac relaxation in the rabbit model. These results, which were obtained only as baseline levels, substantiate the need for further validation in this model system of left ventricular assessment.

  5. Telemetric assessment of referred vaginal hyperalgesia and the effect of Indomethacin in a rat model of endometriosis

    Natalia Dmitrieva

    2012-08-01

    Full Text Available. Symptoms of endometriosis include, among others, pelvic/abdominal and muscle pain. Non-steroidal anti-inflammatory agents are the first-line treatment for this pain. Similar to women, rats with surgically-induced endometriosis (ENDO), but not its surgical control, exhibit vaginal hyperalgesia, which in rats is evidenced by a decreased threshold for the visceromotor response (VMR) induced by vaginal distention. Here we assess the VMR in rats with implanted probes that telemetrically transmit EMG activity from the abdominal muscle. The feasibility and sensitivity of this technique for monitoring the VMR threshold across the estrous cycle and the influence of Indomethacin on ENDO-induced vaginal hyperalgesia were evaluated. VMR thresholds in response to vaginal distention with an infusion pump were measured in different estrous stages. Indomethacin (5 or 10 mg/kg i.p. or s.c.) was injected in proestrous rats and 40-60 min later the VMR threshold was measured. The VMR threshold varied across the estrous cycle only in ENDO rats, being lowest in proestrus. Indomethacin increased this threshold in proestrous ENDO rats. These results show that telemetric assessment of the VMR is a sensitive tool, suitable for long-term studies in conscious rats. The results with this technique also suggest that ENDO-associated vaginal hyperalgesia involves COX activity, a feature that also underlies inflammatory pain.

  6. Developing Tele-Operated Laboratories for Manufacturing Engineering Education. Platform for E-Learning and Telemetric Experimentation (PeTEX

    A. Erman Tekkaya

    2010-09-01

    Full Text Available. The aim of the PeTEX project is to establish an e-Learning platform for the development, implementation, and delivery of educational training programs in the field of manufacturing engineering. The PeTEX team designs both a technical platform for e-Learning based on "Moodle", including distributed tele-operated experimentation facilities, and the didactic and socio-technical requirements for a successful online learning community. User interfaces are deployed for remote access to instruments, data analysis and multiplexed data access via network protocols. Hence, the platform provides complex tools to support the educational process, from telemetric experimentation to virtual project groups for an entire community, for the purpose of domain-specific learning. This paper describes important steps of the interdisciplinary, participatory design and development of a remote-lab prototype in the field of manufacturing engineering.

  7. Equipo biomédico con telemetría diseñado para las áreas rurales

    David Arturo Gutiérrez Begovich; Raúl Ruiz Meza

    2006-01-01

    A system is presented that records electrocardiographic signals, stores them in files, and transmits them between cellular telephones. The purpose of communication via cell phone is to be able to provide telemedicine services in rural areas inexpensively and without depending on cables. One advantage of a biomedical system with telemetry is that specialist physicians do not have to travel to give a diagnosis; that is, it is a system that can be operated by technicians and the results can be ...

  8. Accurate wavelength measurements and modeling of FeXV to FeXIX spectra recorded in high density plasmas between 13.5 and 17 Å.

    May, M; Beiersdorfer, P; Dunn, J; Jordan, N; Osterheld, A; Faenov, A; Pikuz, T; Skobelev, I; Fora, F; Bollanti, S; Lazzaro, P D; Murra, D; Reale, A; Reale, L; Tomassetti, G; Ritucci, A; Francucci, M; Martellucci, S; Petrocelli, G

    2004-09-28

    Iron spectra have been recorded from plasmas created at three different laser plasma facilities, the Tor Vergata University laser in Rome (Italy), the Hercules laser at ENEA in Frascati (Italy), and the Compact Multipulse Terawatt (COMET) laser at LLNL in California (USA). The measurements provide a means of identifying dielectronic satellite lines from FeXVI and FeXV in the vicinity of the strong 2p → 3d transitions of FeXVII. About 80 Δn ≥ 1 lines of FeXV (Mg-like) to FeXIX (O-like) were recorded between 13.8 and 17.1 Å with a high spectral resolution (λ/Δλ ≈ 4000); about thirty of these lines are from FeXVI and FeXV. The laser produced plasmas had electron temperatures between 100 and 500 eV and electron densities between 10^20 and 10^22 cm^-3. The Hebrew University Lawrence Livermore Atomic Code (HULLAC) was used to calculate the atomic structure and atomic rates for FeXV to FeXIX. HULLAC was used to calculate synthetic line intensities at T_e = 200 eV and n_e = 10^21 cm^-3 for three different conditions to illustrate the role of opacity: optically thin plasmas with no excitation-autoionization/dielectronic recombination (EA/DR) contributions to the line intensities, optically thin plasmas that included EA/DR contributions to the line intensities, and optically thick plasmas (optical depth ≈ 200 µm) that included EA/DR contributions to the line intensities. The optically thick simulation best reproduced the recorded spectrum from the Hercules laser. However, some discrepancies between the modeling and the recorded spectra remain.

  9. Lake sediment multi-taxon DNA from North Greenland records early post-glacial appearance of vascular plants and accurately tracks environmental changes

    Epp, L. S.; Gussarova, C.; Boessenkool, S.;

    2015-01-01

    ... temperatures. Lake sediments contain DNA paleorecords of the surrounding ecosystems and can be used to retrieve a variety of organismal groups from a single sample. In this study, we analyzed vascular plant, bryophyte, algal (in particular diatom) and copepod DNA retrieved from a sediment core spanning the ... core. Our DNA record was stratigraphically coherent, with no indication of leaching between layers, and our cross-taxon comparisons were in accordance with previously inferred local ecosystem changes. Authentic ancient plant DNA was retrieved from nearly all layers, both from the marine and the limnic ... phases, and distinct temporal changes in plant presence were recovered. The plant DNA was mostly in agreement with expected vegetation history, but very early occurrences of vascular plants, including the woody Empetrum nigrum, document terrestrial vegetation very shortly after glacial retreat. Our study ...

  10. Lake sediment multi-taxon DNA from North Greenland records early post-glacial appearance of vascular plants and accurately tracks environmental changes

    Epp, L. S.; Gussarova, G.; Boessenkool, S.; Olsen, J.; Haile, J.; Schrøder-Nielsen, A.; Ludikova, A.; Hassel, K.; Stenøien, H. K.; Funder, S.; Willerslev, E.; Kjær, K.; Brochmann, C.

    2015-06-01

    High Arctic environments are particularly sensitive to climate changes, but retrieval of paleoecological data is challenging due to low productivity and biomass. At the same time, Arctic soils and sediments have proven exceptional for long-term DNA preservation due to their constantly low temperatures. Lake sediments contain DNA paleorecords of the surrounding ecosystems and can be used to retrieve a variety of organismal groups from a single sample. In this study, we analyzed vascular plant, bryophyte, algal (in particular diatom) and copepod DNA retrieved from a sediment core spanning the Holocene, taken from Bliss Lake on the northernmost coast of Greenland. A previous multi-proxy study including microscopic diatom analyses showed that this lake experienced changes between marine and lacustrine conditions. We inferred the same environmental changes from algal DNA preserved in the sediment core. Our DNA record was stratigraphically coherent, with no indication of leaching between layers, and our cross-taxon comparisons were in accordance with previously inferred local ecosystem changes. Authentic ancient plant DNA was retrieved from nearly all layers, both from the marine and the limnic phases, and distinct temporal changes in plant presence were recovered. The plant DNA was mostly in agreement with expected vegetation history, but very early occurrences of vascular plants, including the woody Empetrum nigrum, document terrestrial vegetation very shortly after glacial retreat. Our study shows that multi-taxon metabarcoding of sedimentary ancient DNA from lake cores is a valuable tool both for terrestrial and aquatic paleoecology, even in low-productivity ecosystems such as the High Arctic.

  11. Simultaneous telemetric monitoring of brain glucose and lactate and motion in freely moving rats.

    Rocchitta, Gaia; Secchi, Ottavio; Alvau, Maria Domenica; Farina, Donatella; Bazzu, Gianfranco; Calia, Giammario; Migheli, Rossana; Desole, Maria Speranza; O'Neill, Robert D; Serra, Pier A

    2013-11-01

    A new telemetry system for the simultaneous detection of extracellular brain glucose and lactate and of motion is presented. The device consists of a dual-channel, single-supply miniature potentiostat-I/V converter, a microcontroller unit, a signal transmitter, and a miniaturized microvibration sensor. Although based on simple and inexpensive components, the biotelemetry device has been used for accurate transduction of the anodic oxidation currents generated on the surface of implanted glucose and lactate biosensors, and of animal microvibrations. The device was characterized and validated in vitro before in vivo experiments. The biosensors were implanted in the striatum of freely moving animals and the biotelemetric device was fixed to the animal's head. Physiological and pharmacological stimulations were given in order to induce striatal neural activation and to modify motor behavior in awake, untethered animals. PMID:24102201

  12. Técnicas OCDMA con códigos ópticos aleatorios para redes de telecontrol / telemetría en entornos cerrados

    Poves Valdés, Enrique

    2010-01-01

    Wireless optical systems offer numerous advantages that make them interesting candidates for communications in indoor environments. In these environments, the deployment of remote-control and telemetry networks imposes particular requirements that must allow all nodes to communicate without affecting the transmission rate and the operating conditions. OCDMA-based systems are a good alternative for these environments because of their special characteristics. In this wor...

  13. Control of the functional state of organism of sportsmen-youths in educational training process with the use of telemetric system.

    Lebedinskiy V.Yu.

    2012-02-01

    Full Text Available. The purpose of this work is to improve the educational training process on the basis of complex operative control facilities. Eighteen ski racers took part in the research. A telemetric system was developed for estimating the functional state of the organism (heart rate, breathing frequency, body temperature). The algorithm of the training process is presented. A method is proposed for accounting for the individual reactions of the oxygen supply and oxygen utilization systems in the athlete's organism. Directions for individualizing the athlete's training are shown.

  14. Implementación de un sistema de radiocomunicaciones para la transmisión de la telemetría de una moto de carreras

    Pérez Muñoz, María Belén

    2015-01-01

    Motostudent is a competition between Spanish and European universities that poses the challenge of designing and developing a 125 cubic centimetre, two-stroke racing motorcycle prototype. This project is thus part of a multidisciplinary effort comprising different tasks related to building a telemetry system for a racing motorcycle. The report of this work focuses, in particular, on the configuration of the s...

  15. Expansión tecnológica en telemetría para operaciones de producción petrolera

    Isabel Cristina Salas Rivero

    2010-01-01

    The research revolves around a consultation of expert personnel within and outside PDVSA in order to determine the best alternatives for a project to migrate the carrier platform for high-bandwidth wireless transmissions. The general objective guiding this paper is to develop a plan for technological expansion in telemetry, leveraged by new technologies and current business trends, for operation...

  16. Simultaneous amperometric detection of ascorbic acid and antioxidant capacity in orange, blueberry and kiwi juice, by a telemetric system coupled with a fullerene- or nanotubes-modified ascorbate subtractive biosensor.

    Barberis, Antonio; Spissu, Ylenia; Fadda, Angela; Azara, Emanuela; Bazzu, Gianfranco; Marceddu, Salvatore; Angioni, Alberto; Sanna, Daniele; Schirra, Mario; Serra, Pier Andrea

    2015-05-15

    Four fullerene- or nanotube-modified graphite sensor-biosensor systems (SBs), coupled with a dual-channel telemetric device and based on an ascorbate oxidase (AOx) biosensor, were developed for the on-line simultaneous amperometric detection of ascorbic acid (AA) and antioxidant capacity in blueberry, kiwi and orange juice. Fullerene C60 (FC60), fullerene C70 (FC70), single-walled carbon nanotubes (SWCN) and multi-walled carbon nanotubes (MWCN) increased the sensitivity of graphite toward AA and phenols 1.2, 1.5, 5.1 and 5.1 times respectively. Fullerenes combined with AOx improved the selectivity toward AA more than nanotubes, being able to hold a higher number of AOx molecules on the biosensor surface. The SBs work at an applied potential of +500 mV, in a concentration range between the LOD and 20 μM, with a response time of two minutes. The LOD is 0.10, 0.13, 0.20 and 0.22 μM for SBs modified with FC60, FC70, SWCN and MWCN respectively. Biosensors register lower AA currents than the sensors due to the capability of the enzyme to oxidize AA before it reaches the transducer surface. Phenol currents registered by sensors and biosensors did not differ. Based on the difference between the currents recorded by the sensor and the biosensor, an AA selectivity index was developed as an indicator of specificity toward AA and of the capacity to distinguish between the AA and phenol contributions to the antioxidant capacity. This value is almost zero for fullerene-modified SBs, and 0.13 and 0.22 for SWCN- and MWCN-modified SBs respectively. The results of the juice analyses performed with the SBs were in accordance with reference methods. PMID:25155059
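
    As a hedged illustration of the subtractive sensor/biosensor principle described above, the sketch below estimates AA from the difference between the sensor and biosensor currents and computes a simple selectivity index; the calibration slope and the index definition are assumptions for illustration, not the published values.

```python
# Hedged sketch of the subtractive sensor/biosensor idea: the bare sensor responds to
# ascorbic acid (AA) plus phenols, while the AOx biosensor oxidizes AA before it reaches
# the transducer, so the sensor-minus-biosensor current mainly reflects AA. The slope
# and index definition below are assumptions, not the paper's values.
SENSOR_SLOPE_NA_PER_UM = 2.0  # assumed AA sensitivity of the bare sensor (nA/µM)

def ascorbic_acid_uM(i_sensor_nA: float, i_biosensor_nA: float) -> float:
    """Estimate AA concentration in a juice sample from the current difference."""
    return (i_sensor_nA - i_biosensor_nA) / SENSOR_SLOPE_NA_PER_UM

def aa_selectivity_index(i_sensor_nA: float, i_biosensor_nA: float) -> float:
    """Residual AA response of the biosensor relative to the bare sensor, measured on a
    pure AA standard; values near zero indicate a highly selective biosensor."""
    return i_biosensor_nA / i_sensor_nA

if __name__ == "__main__":
    print("AA in juice ≈", ascorbic_acid_uM(i_sensor_nA=20.0, i_biosensor_nA=4.0), "µM")
    print("selectivity index ≈", aa_selectivity_index(i_sensor_nA=10.0, i_biosensor_nA=0.5))
```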

  17. Desarrollo de un dispositivo de telemetría y geolocalización basado en la plataforma Arduino y Shield 3G+GPS

    López Jiménez, Pedro Celestino

    2015-01-01

    The main objective of this work is the design of an electronic device based on the Arduino platform capable of performing real-time telemetry over the internet using a shield with 3G+GPS technology. To do so, the first step is to design and program the acquisition stage for data coming from different sensors, both analogue and digital, as well as the geolocation. Next, a packaging procedure is designed and ...

  18. Detection of postharvest changes of ascorbic acid in fresh-cut melon, kiwi, and pineapple, by using a low cost telemetric system.

    Barberis, Antonio; Fadda, Angela; Schirra, Mario; Bazzu, Gianfranco; Serra, Pier Andrea

    2012-12-01

    The present paper deals with a novel telemetric device combined with a carbon amperometric sensor system to determine postharvest changes of ascorbic acid (AA) in fresh-cut fruits, without displacing products out of the storage rooms. The investigation was performed on kiwi, pineapple and melon, subjected to minimal processing, packaging, cold storage, and simulated shelf life. Results demonstrated that AA content of fresh-cut fruits of all species declines differently during storage. Cold storage notably reduced the degradation rate of AA in comparison with samples stored at 20°C. The cold-chain interruption resulted in a sharp AA content reduction when the optimal storage condition was not rapidly replaced. Unpredicted results showed a high activity of oxidative enzymes, which prevented AA detection in melon samples. Our sensor system allowed us to demonstrate that both ascorbate peroxidase and ascorbate oxidase affected the oxidative stability and the nutritional quality of fresh cut melon fruits. PMID:22953893

  19. Polish Experience with Advanced Digital Heritage Recording Methodology, including 3D Laser Scanning, CAD, and GIS Application, as the Most Accurate and Flexible Response for Archaeology and Conservation Needs at Jan III Sobieski's Residence in Wilanów

    Baranowski, P.; Czajkowski, K.; Gładki, M.; Morysiński, T.; Szambelan, R.; Rzonca, A.

    A review of recent critical points for the introduction of laser technology into the field of heritage documentation, management, conservation, and archaeology is presented. The benefit-versus-cost relationship of the 3D laser scanning technique for the complex, multitask heritage recording project at Wilanów is presented. Basic criteria for the successful use of such a detailed heritage record as laser scanning are defined.

  20. Development of the unmanned aerial vehicle flight recorder

    Walendziuk, Wojciech; Kwasniewski, Daniel

    2014-11-01

    This work presents a telemetric flight recorder that can be used in unmanned aerial vehicles. The device can store the GPS position and the altitude, measured with an HP03M pressure sensor, of a flying platform. The most important subassembly of the recorder is the H24 modem, an M2M-family device developed by Telit. The modem communicates over a UART interface using AT commands. Autonomous operation is provided by a microcontroller, the master component of the recorder; an ATmega644P-AU from the AVR family of microcontrollers developed by Atmel is used. The measurement system was designed so that the GSM module can send the current position to the base station on demand. The paper presents a general description of the device and the results of the tests performed.
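
    The record states that altitude is derived from the HP03M pressure sensor but does not give the conversion used; the sketch below applies the standard barometric (ISA) approximation as one plausible illustration of that processing step. The reference pressure is an assumption.

```python
# A common pressure-to-altitude conversion (international standard atmosphere
# approximation), shown only as an illustration; the recorder's actual conversion
# is not described in this record.
SEA_LEVEL_HPA = 1013.25  # assumed reference pressure

def pressure_to_altitude_m(pressure_hpa: float, p0_hpa: float = SEA_LEVEL_HPA) -> float:
    """Barometric altitude (metres) from static pressure via the ISA approximation."""
    return 44330.0 * (1.0 - (pressure_hpa / p0_hpa) ** (1.0 / 5.255))

if __name__ == "__main__":
    for p in (1013.25, 954.6, 898.7):
        print(f"{p:7.2f} hPa -> {pressure_to_altitude_m(p):7.1f} m")
```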

  1. Accurate Finite Difference Algorithms

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
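
    The paper's optimized schemes are not reproduced in this record; as a generic illustration of high-order explicit finite differencing, the sketch below applies the classical 6th-order central first-derivative stencil on a periodic grid and checks its convergence on a sine wave.

```python
import numpy as np

# Minimal illustration of high-order finite differencing: the classical 6th-order
# central stencil for a first derivative, checked against the exact derivative of a
# sine wave. These are the standard coefficients, not the paper's optimized schemes.
def d_dx_6th(u: np.ndarray, dx: float) -> np.ndarray:
    """6th-order central first derivative on a periodic grid."""
    c1, c2, c3 = 3.0 / 4.0, -3.0 / 20.0, 1.0 / 60.0
    return (c1 * (np.roll(u, -1) - np.roll(u, 1))
            + c2 * (np.roll(u, -2) - np.roll(u, 2))
            + c3 * (np.roll(u, -3) - np.roll(u, 3))) / dx

if __name__ == "__main__":
    for n in (16, 32, 64):
        x = np.linspace(0, 2 * np.pi, n, endpoint=False)
        err = np.max(np.abs(d_dx_6th(np.sin(x), x[1] - x[0]) - np.cos(x)))
        print(f"n = {n:3d}  max error = {err:.2e}")   # drops roughly 64x per grid doubling
```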

  2. Optimización de un sistema de radiocomunicaciones para la transmisión de la telemetría de una moto en un entorno real de un circuito

    Hernández Mejías, David

    2015-01-01

    Motostudent is a competition promoted by the Moto Engineering Foundation (MEF) among Spanish and European universities. It consists of designing and developing a 125 cubic centimetre, two-stroke racing motorcycle prototype and passing evaluation tests carried out at the facilities of the Ciudad del Motor de Aragón. The project is part of a multidisciplinary effort that begins with the collection of the telemetry data on a ...

  3. Computerized mega code recording.

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses. PMID:3354937

  4. Niche Genetic Algorithm with Accurate Optimization Performance

    LIU Jian-hua; YAN De-kun

    2005-01-01

    Based on a crowding mechanism, a novel niche genetic algorithm is proposed which can record the evolutionary direction dynamically during evolution. After evolution, the precision of the solutions can be greatly improved by means of a local search along the recorded direction. Simulation shows that this algorithm can not only keep population diversity but also find accurate solutions. Although this method takes more time than the standard GA, it is worth applying to cases that demand high solution precision.

  5. Interpreting land records

    Wilson, Donald A

    2014-01-01

    Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh

  6. Economic Records Antiquity States

    Michal Hora

    2008-01-01

    This article traces the historical development of accounting records. Their infancy dates back to the earliest days of human agriculture and civilization, to the Sumerians in Mesopotamia and other civilizations, when the need to maintain accurate records of the quantities and relative values of agricultural products first arose. Accounting records developed purely in response to the needs of the time, brought about by changes in the environment and societal demands. The view into history helps ex...

  7. Towards accurate emergency response behavior

    Nuclear reactor operator emergency response behavior has persisted as a training problem through lack of information. The industry needs an accurate definition of operator behavior in adverse stress conditions, and training methods which will produce the desired behavior. Newly assembled information from fifty years of research into human behavior in both high and low stress provides a more accurate definition of appropriate operator response, and supports training methods which will produce the needed control room behavior. The research indicates that operator response in emergencies is divided into two modes, conditioned behavior and knowledge based behavior. Methods which assure accurate conditioned behavior, and provide for the recovery of knowledge based behavior, are described in detail

  8. Accurate determination of antenna directivity

    Dich, Mikael

    1997-01-01

    The derivation of a formula for accurate estimation of the total radiated power from a transmitting antenna for which the radiated power density is known in a finite number of points on the far-field sphere is presented. The main application of the formula is determination of directivity from power...
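
    The derived formula is not quoted in this record; the sketch below shows the underlying computation in its simplest form, numerical integration of a sampled radiation intensity over the far-field sphere followed by D = 4πU_max/P_rad, using an ideal short-dipole pattern as a check. The grid and quadrature rule are arbitrary choices, not the paper's method.

```python
import numpy as np

# Sketch of the underlying computation (not the paper's specific formula): estimate the
# total radiated power by quadrature of the sampled radiation intensity over the
# far-field sphere, then directivity D = 4*pi*U_max / P_rad.
def directivity_dBi(u: np.ndarray, theta: np.ndarray, phi: np.ndarray) -> float:
    """u[i, j] = radiation intensity at (theta[i], phi[j]) on a regular angular grid."""
    d_theta = theta[1] - theta[0]
    d_phi = phi[1] - phi[0]
    # integrate U(theta, phi) * sin(theta) dtheta dphi with a simple rectangle-rule sum
    p_rad = np.sum(u * np.sin(theta)[:, None]) * d_theta * d_phi
    return float(10.0 * np.log10(4.0 * np.pi * u.max() / p_rad))

if __name__ == "__main__":
    theta = np.linspace(0, np.pi, 181)
    phi = np.linspace(0, 2 * np.pi, 361, endpoint=False)
    # ideal short-dipole pattern U ~ sin^2(theta); exact directivity is 1.5 (1.76 dBi)
    u = np.sin(theta)[:, None] ** 2 * np.ones_like(phi)[None, :]
    print(f"directivity ≈ {directivity_dBi(u, theta, phi):.2f} dBi")
```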

  9. A miniature bidirectional telemetry system for in vivo gastric slow wave recordings

    Stomach contractions are initiated and coordinated by an underlying electrical activity (slow waves), and electrical dysrhythmias accompany motility diseases. Electrical recordings taken directly from the stomach provide the most valuable data, but face technical constraints. Serosal or mucosal electrodes have cables that traverse the abdominal wall, or a natural orifice, causing discomfort and possible infection, and restricting mobility. These problems motivated the development of a wireless system. The bidirectional telemetric system constitutes a front-end transponder, a back-end receiver and a graphical user interface. The front-end module conditions the analogue signals, then digitizes and loads the data into a radio for transmission. Data receipt at the back-end is acknowledged via a transceiver function. The system was validated in a bench-top study, then validated in vivo using serosal electrodes connected simultaneously to a commercial wired system. The front-end module was 35 × 35 × 27 mm3 and weighed 20 g. Bench-top tests demonstrated reliable communication within a distance range of 30 m, power consumption of 13.5 mW, and 124 h operation when utilizing a 560 mAh, 3 V battery. In vivo, slow wave frequencies were recorded identically with the wireless and wired reference systems (2.4 cycles min−1), automated activation time detection was modestly better for the wireless system (5% versus 14% FP rate), and signal amplitudes were modestly higher via the wireless system (462 versus 386 µV; p < 0.001). This telemetric system for slow wave acquisition is reliable, power efficient, readily portable and potentially implantable. The device will enable chronic monitoring and evaluation of slow wave patterns in animals and patients. (note)
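
    As an illustration of one analysis step mentioned above, the sketch below estimates the dominant slow-wave frequency, in cycles per minute, from a recorded channel using an FFT; the sampling rate and band limits are assumptions rather than the device's documented processing.

```python
import numpy as np

# Illustrative estimate of the dominant slow-wave frequency from one channel.
# Sampling rate and search band are assumed values, not taken from the record.
FS_HZ = 30.0  # assumed serosal recording sample rate

def dominant_frequency_cpm(signal: np.ndarray, fs: float = FS_HZ) -> float:
    """Return the strongest spectral component between 1 and 10 cycles/min."""
    signal = signal - np.mean(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs) * 60.0   # Hz -> cycles per minute
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs >= 1.0) & (freqs <= 10.0)
    return float(freqs[band][np.argmax(spectrum[band])])

if __name__ == "__main__":
    t = np.arange(0, 300, 1.0 / FS_HZ)                 # 5 minutes of synthetic data
    slow_wave = np.sin(2 * np.pi * (2.4 / 60.0) * t)   # 2.4 cpm, the rate reported in the study
    print(f"dominant frequency ≈ {dominant_frequency_cpm(slow_wave):.1f} cpm")
```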

  10. Accurate ab initio spin densities

    Boguslawski, Katharina; Legeza, Örs; Reiher, Markus

    2012-01-01

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys. 2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CA...

  11. Accurate Modeling of Advanced Reflectarrays

    Zhou, Min

    of the incident field, the choice of basis functions, and the technique to calculate the far-field. Based on accurate reference measurements of two offset reflectarrays carried out at the DTU-ESA Spherical Near-Field Antenna Test Facility, it was concluded that the three latter factors are particularly important... to the conventional phase-only optimization technique (POT), the geometrical parameters of the array elements are directly optimized to fulfill the far-field requirements, thus maintaining a direct relation between optimization goals and optimization variables. As a result, better designs can be obtained compared... using the GDOT to demonstrate its capabilities. To verify the accuracy of the GDOT, two offset contoured beam reflectarrays that radiate a high-gain beam on a European coverage have been designed and manufactured, and subsequently measured at the DTU-ESA Spherical Near-Field Antenna Test Facility...

  12. Accurate thickness measurement of graphene

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  13. Records Management

    U.S. Environmental Protection Agency — All Federal Agencies are required to prescribe an appropriate records maintenance program so that complete records are filed or otherwise preserved, records can be...

  14. A More Accurate Fourier Transform

    Courtney, Elya

    2015-01-01

    Fourier transform methods are used to analyze functions and data sets to provide frequencies, amplitudes, and phases of underlying oscillatory components. Fast Fourier transform (FFT) methods offer speed advantages over evaluation of explicit integrals (EI) that define Fourier transforms. This paper compares frequency, amplitude, and phase accuracy of the two methods for well resolved peaks over a wide array of data sets including cosine series with and without random noise and a variety of physical data sets, including atmospheric $\\mathrm{CO_2}$ concentrations, tides, temperatures, sound waveforms, and atomic spectra. The FFT uses MIT's FFTW3 library. The EI method uses the rectangle method to compute the areas under the curve via complex math. Results support the hypothesis that EI methods are more accurate than FFT methods. Errors range from 5 to 10 times higher when determining peak frequency by FFT, 1.4 to 60 times higher for peak amplitude, and 6 to 10 times higher for phase under a peak. The ability t...
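
    The record above is only an abstract; as a hedged illustration of the comparison it describes, the following Python sketch (assuming numpy and a synthetic off-bin cosine signal, both choices mine) contrasts the peak frequency and amplitude recovered from an FFT bin with those from an explicit rectangle-rule Fourier integral evaluated on a finer frequency grid.

```python
import numpy as np

# Synthetic signal: single cosine whose frequency does not fall on an FFT bin.
fs, n = 100.0, 1000                      # sample rate (Hz), number of samples
t = np.arange(n) / fs
f_true, a_true = 7.33, 2.0
x = a_true * np.cos(2 * np.pi * f_true * t)

# FFT estimate: take the strongest bin (resolution limited to fs/n = 0.1 Hz).
spec = np.fft.rfft(x) / n
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
k = np.argmax(np.abs(spec))
fft_freq, fft_amp = freqs[k], 2 * np.abs(spec[k])

# Explicit-integral estimate: rectangle-rule Fourier integral evaluated on a
# fine trial-frequency grid around the peak, not limited to the bin spacing.
trial = np.linspace(freqs[k] - 0.5, freqs[k] + 0.5, 2001)
coeffs = np.array([np.sum(x * np.exp(-2j * np.pi * f * t)) / n for f in trial])
j = np.argmax(np.abs(coeffs))
ei_freq, ei_amp = trial[j], 2 * np.abs(coeffs[j])

print(f"FFT: f = {fft_freq:.3f} Hz, amplitude = {fft_amp:.3f}")
print(f"EI : f = {ei_freq:.3f} Hz, amplitude = {ei_amp:.3f}")
```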

  15. Localização de áreas de monitoramento telemétrico em ambientes aquáticos da Amazônia Location of telemetric monitoring sites in Amazon floodplain lakes

    Ivan Bergier Tavares de Lima

    2006-01-01

    Full Text Available. The present work illustrates the application of remote sensing imagery and digital image processing methods to define appropriate sites for installing buoy-moored telemetric systems for monitoring environmental variables in aquatic systems located in hard-to-access regions, in this case the surface of Amazon floodplain lakes, for long-term limnologic-micrometeorologic monitoring. The technique consists essentially of Boolean operations over maps of the Amazon River plume and of the inundated zones of Curuai Lake at distinct stages of the hydrologic cycle. The precise location of the long-term telemetric monitoring system is vital to the development of models of air-water trace gas exchange between the Amazon floodplain and the atmosphere.

  16. 38 CFR 4.46 - Accurate measurement.

    2010-07-01

    Title 38 (Pensions, Bonuses, and Veterans' Relief), Rating Disabilities, Disability Ratings, The Musculoskeletal System, § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  17. Quality records for nuclear power plants

    The purpose of collecting and maintaining quality records is to provide the plant operator with records which accurately describe the configuration and condition of the plant. The volume of these quality records presents retention and retrievability problems seldom encountered in normal business operations. This presentation addresses the planning and development of a quality records management system. Topics for discussion include: system purpose, requirements of standards and federal regulations, planning parameters, classification and indexing, and storage and retrieval of quality records

  18. Accurate paleointensities - the multi-method approach

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated and the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed, the pseudo-Thellier protocol, which shows great potential in both accuracy and efficiency, but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old; the actual field strength at the time of cooling is therefore reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method, but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units - an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.

  19. Audio Recording of Children with Dyslalia

    Stefan Gheorghe Pentiuc

    2008-01-01

    Full Text Available. In this paper we present our research on automatic parsing of audio recordings. These recordings are obtained from children with dyslalia and are necessary for an accurate identification of speech problems. We developed a software application that helps parse audio recordings in real time.

  20. The foundations of magnetic recording

    Mallinson, John C

    1993-01-01

    This expanded and updated new edition provides a comprehensive overview of the science and technology of magnetic recording. In the six years since the publication of the first edition, the magnetic recording and storage industry has burgeoned with the introduction of a host of new ideas and technologies. The book contains a discussion of almost every technologically important aspect of recording. * Contains complete coverage of the current technology of magnetic recording and storage * Written in a non-mathematical but scientifically accurate style * Permits intelligent evaluat...

  1. Digital recording system

    A large number of critical process parameters in nuclear power plants have hitherto been monitored using electromechanical chart recorders. The falling cost of electronic systems has led to a trend towards modernizing power plant control rooms by computerizing all the panel instrumentation. As a first step, it has been decided to develop a digital recording system to record the values of 48 process parameters. The system as developed and described in this report is more than a replacement for recorders; it offers substantial advantages in terms of lower overall system cost, excellent time resolution, accurate data and absolute synchronization for correlated signals. The system provides high speed recording of 48 process parameters, maintains historical records and permits retrieval and display of archival information on a colour monitor, a plotter and a printer. It is implemented using a front-end data acquisition unit connected on a serial link to a PC-XT computer with 20 MB Winchester. The system offers an extremely user-friendly man-machine interaction, based on a hierarchical, paged, menu-driven scheme. Software development for this system has been carried out using the C language. (author). 9 figs
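
    The record above describes the system only at a high level. Purely as an illustrative sketch, and not the actual implementation (which used a front-end acquisition unit on a serial link to a PC-XT and was written in C), the following Python fragment shows the general shape of such a recorder: scan a fixed set of process parameters, timestamp each scan once so correlated signals stay synchronized, and append the scan to an archival file. The read_parameters function is a hypothetical stand-in for the serial-link acquisition.

```python
import csv
import time
from datetime import datetime, timezone

NUM_CHANNELS = 48          # process parameters recorded by the system
SAMPLE_PERIOD_S = 1.0      # recording interval (illustrative value)

def read_parameters():
    """Hypothetical stand-in for the front-end data acquisition unit.

    In the real system this would read 48 values over the serial link;
    here it returns zeros so the sketch stays self-contained.
    """
    return [0.0] * NUM_CHANNELS

def record(path="history.csv", samples=10):
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(samples):
            values = read_parameters()
            # One timestamp per scan keeps correlated signals synchronized.
            writer.writerow([datetime.now(timezone.utc).isoformat()] + values)
            time.sleep(SAMPLE_PERIOD_S)

if __name__ == "__main__":
    record()
```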

  2. Robert Recorde

    Williams, Jack

    2011-01-01

    The 16th-Century intellectual Robert Recorde is chiefly remembered for introducing the equals sign into algebra, yet the greater significance and broader scope of his work is often overlooked. This book presents an authoritative and in-depth analysis of the man, his achievements and his historical importance. This scholarly yet accessible work examines the latest evidence on all aspects of Recorde's life, throwing new light on a character deserving of greater recognition. Topics and features: presents a concise chronology of Recorde's life; examines his published works; describes Recorde's pro

  3. Accurate characterization of OPVs: Device masking and different solar simulators

    Gevorgyan, Suren; Carlé, Jon Eggert; Søndergaard, Roar R.;

    2013-01-01

    One of the prime objectives of organic solar cell research has been to improve the power conversion efficiency. Unfortunately, the accurate determination of this property is not straightforward and has led to the recommendation that record devices be tested and certified at a few accredited laboratories following rigorous ASTM and IEC standards. This work tries to address some of the issues confronting the standard laboratory in this regard. Solar simulator lamps are investigated for their light field homogeneity and direct versus diffuse components, as well as the correct device area...

  4. Laboratory Building for Accurate Determination of Plutonium

    2008-01-01

    The accurate determination of plutonium is one of the most important assay techniques for nuclear fuel; it is also key to chemical measurement transfer and the basis of the nuclear material balance. An...

  5. Phenological Records

    National Oceanic and Atmospheric Administration, Department of Commerce — Phenology is the scientific study of periodic biological phenomena, such as flowering, breeding, and migration, in relation to climatic conditions. The few records...

  6. Invariant Image Watermarking Using Accurate Zernike Moments

    Ismail A. Ismail

    2010-01-01

    Full Text Available. Problem statement: Digital image watermarking is the most popular method for image authentication, copyright protection and content description. Zernike moments are the most widely used moments in image processing and pattern recognition. The magnitudes of Zernike moments are rotation invariant, so they can be used just as a watermark signal or be further modified to carry embedded data. Zernike moments computed in Cartesian coordinates are not accurate due to geometrical and numerical errors. Approach: In this study, we employed a robust image-watermarking algorithm using accurate Zernike moments. These moments are computed in polar coordinates, where both approximation and geometric errors are removed. Accurate Zernike moments are used in image watermarking and proved to be robust against different kinds of geometric attacks. The performance of the proposed algorithm is evaluated using standard images. Results: Experimental results show that accurate Zernike moments achieve a higher degree of robustness than approximated ones against rotation, scaling, flipping, shearing and affine transformation. Conclusion: By computing accurate Zernike moments, the embedded watermark bits can be extracted at a low error rate.
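
    Not part of the original article: a minimal numpy sketch of evaluating a Zernike moment by sampling the image on a polar grid over the unit disk, which is the kind of polar-coordinate computation the abstract refers to. The sampling densities, nearest-neighbour pixel mapping and test image are illustrative assumptions.

```python
import numpy as np
from math import factorial

def radial_poly(n, m, rho):
    """Zernike radial polynomial R_{n,|m|}(rho); n - |m| must be even and >= 0."""
    m = abs(m)
    r = np.zeros_like(rho)
    for s in range((n - m) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s) * factorial((n + m) // 2 - s)
                * factorial((n - m) // 2 - s)))
        r += c * rho ** (n - 2 * s)
    return r

def zernike_moment(img, n, m, n_rho=64, n_theta=256):
    """Zernike moment Z_{n,m} of a square image mapped onto the unit disk."""
    size = img.shape[0]
    rho = (np.arange(n_rho) + 0.5) / n_rho            # radial samples in (0, 1)
    theta = 2 * np.pi * np.arange(n_theta) / n_theta  # angular samples
    R, T = np.meshgrid(rho, theta, indexing="ij")
    # Map polar samples back to pixel coordinates (nearest neighbour).
    x = ((R * np.cos(T) + 1) / 2 * (size - 1)).round().astype(int)
    y = ((R * np.sin(T) + 1) / 2 * (size - 1)).round().astype(int)
    f = img[y, x]
    kernel = radial_poly(n, m, R) * np.exp(-1j * m * T)
    d_area = R * (1.0 / n_rho) * (2 * np.pi / n_theta)  # rho * drho * dtheta
    return (n + 1) / np.pi * np.sum(f * kernel * d_area)

# |Z_{n,m}| is rotation invariant and could serve as the watermark carrier.
img = np.random.rand(128, 128)
print(abs(zernike_moment(img, n=4, m=2)))
```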

  7. RECORDS REACHING RECORDING DATA TECHNOLOGIES

    Gresik, G. W. L.; Siebe, S.; Drewello, R.

    2013-01-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. The low-vibration measurement should be guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence...

  8. 25 CFR 700.267 - Disclosure of records.

    2010-04-01

    ...Information Act, reasonable efforts shall be made to assure that the records are accurate, complete, timely... Title 25 (Indians), Privacy Act, § 700.267 Disclosure of records. (a) Prohibition of disclosure. No record contained in...

  9. Accurate atomic data for industrial plasma applications

    Griesmann, U.; Bridges, J.M.; Roberts, J.R.; Wiese, W.L.; Fuhr, J.R. [National Inst. of Standards and Technology, Gaithersburg, MD (United States)

    1997-12-31

    Reliable branching fraction, transition probability and transition wavelength data for radiative dipole transitions of atoms and ions in plasma are important in many industrial applications. Optical plasma diagnostics and modeling of the radiation transport in electrical discharge plasmas (e.g. in electrical lighting) depend on accurate basic atomic data. NIST has an ongoing experimental research program to provide accurate atomic data for radiative transitions. The new NIST UV-vis-IR high resolution Fourier transform spectrometer has become an excellent tool for accurate and efficient measurements of numerous transition wavelengths and branching fractions in a wide wavelength range. Recently, the authors have also begun to employ photon counting techniques for very accurate measurements of branching fractions of weaker spectral lines with the intent to improve the overall accuracy for experimental branching fractions to better than 5%. They have now completed their studies of transition probabilities of Ne I and Ne II. The results agree well with recent calculations and for the first time provide reliable transition probabilities for many weak intercombination lines.

  10. More accurate picture of human body organs

    Computerized tomography and nuclear magnetic resonance tomography (NMRT) are revolutionary contributions to radiodiagnosis because they make it possible to obtain a more accurate image of human body organs. The principles of both methods are described. Attention is mainly devoted to NMRT, which has been in clinical use for only three years. It does not burden the organism with ionizing radiation. (Ha)

  11. Records Reaching Recording Data Technologies

    Gresik, G. W. L.; Siebe, S.; Drewello, R.

    2013-07-01

    The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of data. It is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. The low-vibration measurement should be guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.

  12. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  13. ATLAS Recordings

    Steven Goldfarb; Mitch McLachlan; Homer A. Neal

    Web Archives of ATLAS Plenary Sessions, Workshops, Meetings, and Tutorials from 2005 until this past month are available via the University of Michigan portal here. Most recent additions include the Trigger-Aware Analysis Tutorial by Monika Wielers on March 23 and the ROOT Workshop held at CERN on March 26-27. Viewing requires a standard web browser with RealPlayer plug-in (included in most browsers automatically) and works on any major platform. Lectures can be viewed directly over the web or downloaded locally. In addition, you will find access to a variety of general tutorials and events via the portal. Feedback welcome: Our group is making arrangements now to record plenary sessions, tutorials, and other important ATLAS events for 2007. Your suggestions for potential recording, as well as your feedback on existing archives, are always welcome. Please contact us at wlap@umich.edu. Thank you. Enjoy the Lectures!

  14. How Accurate is inv(A)*b?

    Druinsky, Alex

    2012-01-01

    Several widely-used textbooks lead the reader to believe that solving a linear system of equations Ax = b by multiplying the vector b by a computed inverse inv(A) is inaccurate. Virtually all other textbooks on numerical analysis and numerical linear algebra advise against using computed inverses without stating whether this is accurate or not. In fact, under reasonable assumptions on how the inverse is computed, x = inv(A)*b is as accurate as the solution computed by the best backward-stable solvers. This fact is not new, but obviously obscure. We review the literature on the accuracy of this computation and present a self-contained numerical analysis of it.
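
    Not taken from the paper: a small numpy experiment in the spirit of the claim above, comparing the relative residual of x = inv(A) @ b against that of a backward-stable LU solve for a well-conditioned random matrix (the matrix construction is an assumption).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)   # reasonably well-conditioned
b = rng.standard_normal(n)

x_inv = np.linalg.inv(A) @ b      # the "textbook-discouraged" route
x_solve = np.linalg.solve(A, b)   # LU-based backward-stable solver

for name, x in [("inv(A)@b", x_inv), ("solve", x_solve)]:
    # Normwise relative residual; both are typically of the same order here.
    rel_res = np.linalg.norm(A @ x - b) / (np.linalg.norm(A) * np.linalg.norm(x))
    print(f"{name:9s} relative residual = {rel_res:.2e}")
```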

  15. Accurate guitar tuning by cochlear implant musicians.

    Thomas Lu

    Full Text Available. Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  16. Accurate guitar tuning by cochlear implant musicians.

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  17. Accurate Finite Difference Methods for Option Pricing

    Persson, Jonas

    2006-01-01

    Stock options are priced numerically using space- and time-adaptive finite difference methods. European options on one and several underlying assets are considered. These are priced with adaptive numerical algorithms including a second order method and a more accurate method. For American options we use the adaptive technique to price options on one stock with and without stochastic volatility. In all these methods emphasis is put on the control of errors to fulfill predefined tolerance level...
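
    The thesis itself uses space- and time-adaptive, higher-order schemes; purely as a hedged illustration of the underlying finite-difference idea, the sketch below prices a European call under Black-Scholes with a basic explicit scheme on a uniform grid. Grid sizes and market parameters are illustrative, and the time step is taken small enough for the explicit scheme to remain stable.

```python
import numpy as np

def european_call_explicit_fd(s0, k, r, sigma, t, s_max=300.0, ns=300, nt=20000):
    """Price a European call with a basic explicit finite-difference scheme."""
    ds = s_max / ns
    dt = t / nt
    s = np.linspace(0.0, s_max, ns + 1)
    v = np.maximum(s - k, 0.0)                    # payoff at maturity

    i = np.arange(1, ns)                          # interior grid indices, S_i = i*ds
    a = 0.5 * dt * (sigma**2 * i**2 - r * i)
    b = 1.0 - dt * (sigma**2 * i**2 + r)
    c = 0.5 * dt * (sigma**2 * i**2 + r * i)

    for m in range(nt):                           # march backwards from maturity
        tau = (m + 1) * dt                        # time to maturity after this step
        v_new = v.copy()
        v_new[1:-1] = a * v[:-2] + b * v[1:-1] + c * v[2:]
        v_new[0] = 0.0                            # call is worthless at S = 0
        v_new[-1] = s_max - k * np.exp(-r * tau)  # deep in-the-money boundary
        v = v_new

    return np.interp(s0, s, v)

# Illustrative parameters; nt is large so the explicit scheme stays stable.
print(european_call_explicit_fd(s0=100, k=100, r=0.05, sigma=0.2, t=1.0))
```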

  18. Accurate, reproducible measurement of blood pressure.

    Campbell, N. R.; Chockalingam, A; Fodor, J. G.; McKay, D. W.

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine con...

  19. Accurate variational forms for multiskyrmion configurations

    Jackson, A.D.; Weiss, C.; Wirzba, A.; Lande, A.

    1989-04-17

    Simple variational forms are suggested for the fields of a single skyrmion on a hypersphere, $S_3(L)$, and of a face-centered cubic array of skyrmions in flat space, $R_3$. The resulting energies are accurate at the level of 0.2%. These approximate field configurations provide a useful alternative to brute-force solutions of the corresponding Euler equations.

  20. Efficient Accurate Context-Sensitive Anomaly Detection

    2007-01-01

    For program behavior-based anomaly detection, the only way to ensure accurate monitoring is to construct an efficient and precise program behavior model. A new program behavior-based anomaly detection model, called the combined pushdown automaton (CPDA) model, is proposed, based on static binary executable analysis. The CPDA model incorporates an optimized call stack walk and a code instrumentation technique to gain complete context information. The proposed method can thereby detect more attacks while retaining good performance.

  1. Towards accurate modeling of moving contact lines

    Holmgren, Hanna

    2015-01-01

    The present thesis treats the numerical simulation of immiscible incompressible two-phase flows with moving contact lines. The conventional Navier–Stokes equations combined with a no-slip boundary condition leads to a non-integrable stress singularity at the contact line. The singularity in the model can be avoided by allowing the contact line to slip. Implementing slip conditions in an accurate way is not straight-forward and different regularization techniques exist where ad-hoc procedures ...

  2. Accurate phase-shift velocimetry in rock

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R.; Holmes, William M.

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.
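
    Not part of the paper: in standard PFG phase-shift velocimetry the measured per-voxel phase is proportional to the mean molecular displacement, so under the narrow-pulse approximation the velocity follows as v = phi / (gamma * g * delta * Delta). The sketch below applies this conversion to a phase map; all gradient settings are illustrative assumptions, and severe phase wrapping would require unwrapping first.

```python
import numpy as np

# Standard PFG phase-shift relation (narrow-pulse approximation):
#   phi = gamma * g * delta * Delta * v   =>   v = phi / (gamma * g * delta * Delta)
GAMMA = 2.675e8        # 1H gyromagnetic ratio (rad s^-1 T^-1)
g = 0.05               # gradient strength (T/m), illustrative
delta = 1.0e-3         # gradient pulse duration (s), illustrative
Delta = 20.0e-3        # observation (flow-encoding) time (s), illustrative

def velocity_map(phase_map_rad):
    """Convert a per-voxel phase map (radians) to a velocity map (m/s)."""
    # Wrap phases into (-pi, pi]; heavily wrapped data would need unwrapping.
    phase = np.angle(np.exp(1j * phase_map_rad))
    return phase / (GAMMA * g * delta * Delta)

phase = np.full((4, 4), 0.3)            # toy phase map of 0.3 rad per voxel
print(velocity_map(phase)[0, 0])        # ~1.1e-3 m/s for these settings
```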

  3. Accurate characterisation of post moulding shrinkage of polymer parts

    Neves, L. C.; De Chiffre, L.; González-Madruga, D.;

    2015-01-01

    The work deals with experimental determination of the shrinkage of polymer parts after injection moulding. A fixture for length measurements on 8 parts at the same time was designed and manufactured in Invar, mounted with 8 electronic gauges, and provided with 3 temperature sensors. The fixture was used to record the length at a well-defined position on each part continuously, starting from approximately 10 minutes after moulding and covering a time period of 7 days. Two series of shrinkage curves were analysed and length values after stabilisation extracted and compared for all 16 parts. Values were compensated with respect to the effect from temperature variations during the measurements. Prediction of the length after stabilisation was carried out by fitting data at different stages of shrinkage. Uncertainty estimations were carried out and a procedure for the accurate characterisation of...

  4. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at an optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥ 40 points and ≥ 445 ms, respectively. In conclusion, 12-lead HF QRS ECG employing...

  5. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  6. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  7. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has increased exponentially. This quantitative research project used a pretest/posttest design and reviewed how an educational electronic documentation system helped nursing students to identify the accurate "related to" statement of the nursing diagnosis for the patient in the case…

  8. How accurately can we calculate thermal systems?

    The objective was to determine how accurately simple reactor lattice integral parameters can be determined, considering user input, differences in the methods, source data and the data processing procedures and assumptions. Three simple square lattice test cases with different fuel-to-moderator ratios were defined. The effect of the thermal scattering models was shown to be important and much bigger than the spread in the results. Nevertheless, differences of up to 0.4% in the K-eff calculated by continuous energy Monte Carlo codes were observed even when the same source data were used. (author)

  9. Accurate diagnosis is essential for amebiasis

    2004-01-01

    Amebiasis is one of the three most common causes of death from parasitic disease, and Entamoeba histolytica is one of the most widely distributed parasites in the world. In particular, Entamoeba histolytica infection in developing countries is a significant health problem in amebiasis-endemic areas, with a significant impact on infant mortality[1]. In recent years a worldwide increase in the number of patients with amebiasis has refocused attention on this important infection. On the other hand, improving the quality of parasitological methods and widespread use of accurate techniques have improved our knowledge about the disease.

  10. Investigations on Accurate Analysis of Microstrip Reflectarrays

    Zhou, Min; Sørensen, S. B.; Kim, Oleksiy S.;

    2011-01-01

    An investigation on accurate analysis of microstrip reflectarrays is presented. Sources of error in reflectarray analysis are examined and solutions to these issues are proposed. The focus is on two sources of error, namely the determination of the equivalent currents to calculate the radiation pattern, and the inaccurate mutual coupling between array elements due to the lack of periodicity. To serve as reference, two offset reflectarray antennas have been designed, manufactured and measured at the DTU-ESA Spherical Near-Field Antenna Test Facility. Comparisons of simulated and measured data are...

  11. Record club

    Record club

    2010-01-01

    Hello everyone, here are the 24 new DVDs for July, available for a few days now, not forgetting the 5 pop music CDs. Discover the saga of the terrorist Carlos, the life of Gainsbourg and the adventures of Lucky Luke; get your thrills with Paranormal Activity and escape to Pandora in the skin of an Avatar. All the new releases can be discovered directly at the club. For the complete list, as well as the rest of the Record Club collection, please visit our website: http://cern.ch/crc. All the latest additions are in the "Discs of the Month" section. Reminder: the club is open Mondays, Wednesdays and Fridays from 12:30 to 13:00 in Restaurant No. 2, Building 504. See you soon, dear Record Clubbers.

  12. Record Club

    Record Club

    2011-01-01

    http://cern.ch/Record.Club November Selections Just in time for the holiday season, we have added a number of new CDs and DVDs into the Club. You will find the full lists at http://cern.ch/record.club; select the "Discs of the Month" button on the left panel of the web page and then Nov 2011. New films include all 5 episodes of Fast and Furious, many of the most famous films starring Jean-Paul Belmondo and those of Louis de Funes, and some more recent films such as The Lincoln Lawyer and, according to some critics, Woody Allen's best film for years – Midnight in Paris. For the younger generation there are Cars 2 and Kung Fu Panda 2. New CDs include the latest releases by Adele, Coldplay and the Red Hot Chili Peppers. We have also added the new Duets II CD featuring Tony Bennett singing with some of today's pop stars including Lady Gaga, Amy Winehouse and Willie Nelson. The Club is now open every Monday, Wednesday and Friday ...

  13. ATLAS Recordings

    Jeremy Herr; Homer A. Neal; Mitch McLachlan

    The University of Michigan Web Archives for the 2006 ATLAS Week Plenary Sessions, as well as the first of 2007, are now online. In addition, there is a wide variety of Software and Physics Tutorial sessions, recorded over the past couple of years, to choose from. All ATLAS-specific archives are accessible here. Viewing requires a standard web browser with RealPlayer plug-in (included in most browsers automatically) and works on any major platform. Lectures can be viewed directly over the web or downloaded locally. In addition, you will find access to a variety of general tutorials and events via the portal. Shaping Collaboration 2006: The Michigan group is happy to announce a complete set of recordings from the Shaping Collaboration conference held last December at the CICG in Geneva. The event hosted a mix of Collaborative Tool experts and LHC Users, and featured presentations by the CERN Deputy Director General, Prof. Jos Engelen, the President of Internet2, and chief developers from VRVS/EVO, WLAP, and other tools...

  14. Record Club

    Record Club

    2011-01-01

    http://cern.ch/Record.Club June Selections We have put a significant number of new CDs and DVDs into the Club. You will find the full lists at http://cern.ch/record.club and select the «Discs of the Month» button on the left panel of the web page and then June 2011. New films include the latest Action, Suspense and Science Fiction film hits, general drama movies including the Oscar-winning The King's Speech, comedies including both chapters of Bridget Jones's Diary, seven films for children and a musical. Other highlights include the latest Harry Potter release and some movies from the past you may have missed, including the first in the Terminator series. New CDs include the latest releases by Michel Sardou, Mylene Farmer, Jennifer Lopez, Zucchero and Britney Spears. There is also a hits collection from NRJ. Don't forget that the Club is now open every Monday, Wednesday and Friday lunchtime from 12h30 to 13h00 in Restaurant 2, Building 504. (C...

  15. Record Club

    Record Club

    2011-01-01

    http://cern.ch/Record.Club Summer 2011 New Releases The CD and DVD rental club has just added a large number of discs for summer 2011. Among them, Le Discours d'un Roi (The King's Speech), winner of the 2011 Oscar for Best Picture, and Harry Potter and the Deathly Hallows (Part 1). No fewer than 48 new DVDs and 10 new CDs are available for rental, with something for every genre. So don't hesitate to visit our site http://cern.ch/record.club and see Disc Catalogue, Discs of the Month, for the complete list. The club is open every Monday, Wednesday and Friday from 12:30 to 13:00 in the Restaurant No. 2 building (cf. URL: http://www.cern.ch/map/building?bno=504). See you very soon.

  16. Accurate radiative transfer calculations for layered media.

    Selden, Adrian C

    2016-07-01

    Simple yet accurate results for radiative transfer in layered media with discontinuous refractive index are obtained by the method of K-integrals. These are certain weighted integrals applied to the angular intensity distribution at the refracting boundaries. The radiative intensity is expressed as the sum of the asymptotic angular intensity distribution valid in the depth of the scattering medium and a transient term valid near the boundary. Integrated boundary equations are obtained, yielding simple linear equations for the intensity coefficients, enabling the angular emission intensity and the diffuse reflectance (albedo) and transmittance of the scattering layer to be calculated without solving the radiative transfer equation directly. Examples are given of half-space, slab, interface, and double-layer calculations, and extensions to multilayer systems are indicated. The K-integral method is orders of magnitude more accurate than diffusion theory and can be applied to layered scattering media with a wide range of scattering albedos, with potential applications to biomedical and ocean optics. PMID:27409700

  17. Accurate basis set truncation for wavefunction embedding

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  18. Accurate pose estimation for forensic identification

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we will therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We will illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  19. Accurate determination of characteristic relative permeability curves

    Krause, Michael H.; Benson, Sally M.

    2015-09-01

    A recently developed technique to accurately characterize sub-core scale heterogeneity is applied to investigate the factors responsible for flowrate-dependent effective relative permeability curves measured on core samples in the laboratory. The dependency of laboratory measured relative permeability on flowrate has long been both supported and challenged by a number of investigators. Studies have shown that this apparent flowrate dependency is a result of both sub-core scale heterogeneity and outlet boundary effects. However this has only been demonstrated numerically for highly simplified models of porous media. In this paper, flowrate dependency of effective relative permeability is demonstrated using two rock cores, a Berea Sandstone and a heterogeneous sandstone from the Otway Basin Pilot Project in Australia. Numerical simulations of steady-state coreflooding experiments are conducted at a number of injection rates using a single set of input characteristic relative permeability curves. Effective relative permeability is then calculated from the simulation data using standard interpretation methods for calculating relative permeability from steady-state tests. Results show that simplified approaches may be used to determine flowrate-independent characteristic relative permeability provided flow rate is sufficiently high, and the core heterogeneity is relatively low. It is also shown that characteristic relative permeability can be determined at any typical flowrate, and even for geologically complex models, when using accurate three-dimensional models.
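
    Not from the paper: in the standard interpretation of steady-state corefloods mentioned above, each phase's effective relative permeability follows from Darcy's law applied to that phase at the measured pressure drop. A minimal sketch with illustrative SI values:

```python
def relative_permeability(q, mu, length, k_abs, area, dp):
    """Effective relative permeability of one phase from a steady-state test.

    Darcy's law per phase: q = k_abs * kr * area * dp / (mu * length),
    so kr = q * mu * length / (k_abs * area * dp).  All quantities in SI units.
    """
    return q * mu * length / (k_abs * area * dp)

# Illustrative values: water phase in a 5 cm long core of 2.5e-3 m^2 cross-section.
kr_w = relative_permeability(
    q=1.0e-8,        # water flow rate (m^3/s)
    mu=1.0e-3,       # water viscosity (Pa.s)
    length=0.05,     # core length (m)
    k_abs=1.0e-13,   # absolute permeability (m^2), roughly 100 mD
    area=2.5e-3,     # cross-sectional area (m^2)
    dp=2.0e4,        # pressure drop across the core (Pa)
)
print(kr_w)          # ~0.1 for these illustrative numbers
```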

  20. Accurate shear measurement with faint sources

    Zhang, Jun; Foucaud, Sebastien [Center for Astronomy and Astrophysics, Department of Physics and Astronomy, Shanghai Jiao Tong University, 955 Jianchuan road, Shanghai, 200240 (China); Luo, Wentao, E-mail: betajzhang@sjtu.edu.cn, E-mail: walt@shao.ac.cn, E-mail: foucaud@sjtu.edu.cn [Key Laboratory for Research in Galaxies and Cosmology, Shanghai Astronomical Observatory, Nandan Road 80, Shanghai, 200030 (China)

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  1. Record breakers

    Antonella Del Rosso

    2012-01-01

    In the sixties, CERN’s Fellows were but a handful of about 50 young experimentalists present on site to complete their training. Today, their number has increased to a record-breaking 500. They come from many different fields and are spread across CERN’s different activity areas.   “Diversifying the Fellowship programme has been the key theme in recent years,” comments James Purvis, Head of the Recruitment, Programmes and Monitoring group in the HR Department. “In particular, the 2005 five-yearly review introduced the notion of ‘senior’ and ‘junior’ Fellowships, broadening the target audience to include those with Bachelor-level qualifications.” Diversification made CERN’s Fellowship programme attractive to a wider audience but the number of Fellows on site could not have increased so much without the support of EU-funded projects, which were instrumental in the growth of the programme. ...

  2. RECORD CLUB

    Record Club

    2010-01-01

    DVD James Bond – Series Complete To all Record Club Members, to start the new year, we have taken advantage of a special offer to add copies of all the James Bond movies to date, from the very first - Dr. No - to the latest - Quantum of Solace. No matter which of the successive 007s you prefer (Sean Connery, George Lazenby, Roger Moore, Timothy Dalton, Pierce Brosnan or Daniel Craig), they are all there. Or perhaps you have a favourite Bond Girl, or even perhaps a favourite villain. Take your pick. You can find the full selection listed on the club web site http://cern.ch/crc; use the panel on the left of the page “Discs of the Month” and select Jan 2010. We remind you that we are open on Mondays, Wednesdays and Fridays from 12:30 to 13:00 in Restaurant 2 (Bldg 504).

  3. 21 CFR 312.57 - Recordkeeping and record retention.

    2010-04-01

    Title 21 (Food and Drugs), § 312.57 Recordkeeping and record retention. (a) A sponsor shall maintain adequate records showing the... batch or code mark of each such shipment. (b) A sponsor shall maintain complete and accurate...

  4. Accurate Telescope Mount Positioning with MEMS Accelerometers

    Mészáros, László; Pál, András; Csépány, Gergely

    2014-01-01

    This paper describes the advantages and challenges of applying microelectromechanical accelerometer systems (MEMS accelerometers) in order to attain precise, accurate and stateless positioning of telescope mounts. This provides a method completely independent of other forms of electronic, optical, mechanical or magnetic feedback or real-time astrometry. Our goal is to reach the sub-arcminute range, which is well below the field-of-view of conventional imaging telescope systems. Here we present how this sub-arcminute accuracy can be achieved with very cheap MEMS sensors, and we also detail how our procedures can be extended in order to attain even finer measurements. In addition, our paper discusses how a complete system design can be implemented as part of a telescope control system.
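
    Not from the paper: the basic principle is that a static three-axis MEMS accelerometer measures the gravity vector, from which the mount's pitch and roll follow directly. The sketch below shows only this step; the axis convention and the assumption that the mount is stationary are mine.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (degrees) from a static 3-axis accelerometer reading.

    Assumes x forward, y left, z up, and that the only acceleration acting
    on the sensor is gravity (i.e. the mount is not slewing).
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Example: sensor tilted so gravity has a small x component (readings in g).
print(tilt_from_accel(0.10, 0.02, 0.99))   # roughly (-5.8, 1.2) degrees
```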

  5. Accurate estimation of indoor travel times

    Prentow, Thor Siiger; Blunck, Henrik; Stisen, Allan;

    2014-01-01

    the InTraTime method for accurately estimating indoor travel times via mining of historical and real-time indoor position traces. The method learns during operation both travel routes, travel times and their respective likelihood, both for routes traveled as well as for sub-routes thereof. InTraTime allows specifying temporal and other query parameters, such as time-of-day, day-of-week or the identity of the traveling individual. As input the method is designed to take generic position traces and is thus interoperable with a variety of indoor positioning systems. The method's advantages include a minimal-effort setup and self-improving operations due to unsupervised learning, as it is able to adapt implicitly to factors influencing indoor travel times such as elevators, rotating doors or changes in building layout. We evaluate and compare the proposed InTraTime method to indoor adaptations...

  6. Accurate sky background modelling for ESO facilities

    Full text: Ground-based measurements such as high-resolution spectroscopy are heavily influenced by several physical processes. Among others, line absorption/emission, airglow from OH molecules, and scattering of photons within the Earth's atmosphere make observations a challenge, in particular from facilities like the future European Extremely Large Telescope. Additionally, emission from unresolved extrasolar objects, the zodiacal light, the Moon and even thermal emission from the telescope and the instrument contribute significantly to the broad-band background over a wide wavelength range. In our talk we review these influences and give an overview of how they can be accurately modeled to increase the overall precision of spectroscopic and imaging measurements. (author)

  7. Toward Accurate and Quantitative Comparative Metagenomics.

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  8. Accurate valence band width of diamond

    An accurate width is determined for the valence band of diamond by imaging photoelectron momentum distributions for a variety of initial- and final-state energies. The experimental result of 23.0±0.2 eV agrees well with first-principles quasiparticle calculations (23.0 and 22.88 eV) and significantly exceeds the local-density-functional width, 21.5±0.2 eV. This difference quantifies effects of creating an excited hole state (with associated many-body effects) in a band measurement vs studying ground-state properties treated by local-density-functional calculations. copyright 1997 The American Physical Society

  9. Accurate Weather Forecasting for Radio Astronomy

    Maddalena, Ronald J.

    2010-01-01

    The NRAO Green Bank Telescope routinely observes at wavelengths from 3 mm to 1 m. As with all mm-wave telescopes, observing conditions depend upon the variable atmospheric water content. The site provides over 100 days/yr when opacities are low enough for good observing at 3 mm, but winds on the open-air structure reduce the time suitable for 3-mm observing where pointing is critical. Thus, to maximize productivity, the observing wavelength needs to match weather conditions. For 6 years the telescope has used a dynamic scheduling system (recently upgraded; www.gb.nrao.edu/DSS) that requires accurate multi-day forecasts for winds and opacities. Since opacity forecasts are not provided by the National Weather Service (NWS), I have developed an automated system that takes available forecasts, derives forecasted opacities, and deploys the results on the web in user-friendly graphical overviews (www.gb.nrao.edu/ rmaddale/Weather). The system relies on the "North American Mesoscale" models, which are updated by the NWS every 6 hrs, have a 12 km horizontal resolution, 1 hr temporal resolution, run to 84 hrs, and have 60 vertical layers that extend to 20 km. Each forecast consists of a time series of ground conditions, cloud coverage, etc., and, most importantly, temperature, pressure and humidity as a function of height. I use Liebe's MWP model (Radio Science, 20, 1069, 1985) to determine the absorption in each layer for each hour for 30 observing wavelengths. Radiative transfer provides, for each hour and wavelength, the total opacity and the radio brightness of the atmosphere, which contributes substantially at some wavelengths to Tsys and the observational noise. Comparisons of measured and forecasted Tsys at 22.2 and 44 GHz imply that the forecasted opacities are good to about 0.01 nepers, which is sufficient for forecasting and accurate calibration. Reliability is high out to 2 days and degrades slowly for longer-range forecasts.
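
    A hedged sketch of the per-layer bookkeeping described above: given an absorption coefficient for each model layer (which in practice would come from an atmospheric absorption model such as Liebe's, driven by the forecast profiles), the zenith opacity and the atmospheric contribution to the system temperature follow from a simple plane-parallel radiative-transfer sum. The function and its toy inputs are illustrative, not the GBT pipeline.

```python
import numpy as np

def zenith_opacity_and_brightness(alpha, temperature, dz):
    """Integrate layer absorption upward through a plane-parallel atmosphere.

    alpha       : absorption coefficient per layer [1/km] at one frequency
    temperature : physical temperature per layer [K]
    dz          : geometric thickness per layer [km]

    Returns the total zenith opacity [nepers] and the downwelling
    atmospheric brightness temperature [K] seen from the ground
    (Rayleigh-Jeans approximation, no cosmic background term).
    """
    dtau = np.asarray(alpha) * np.asarray(dz)          # opacity of each layer
    tau_total = dtau.sum()
    # Opacity between each layer and the ground (layer 0 = lowest layer):
    tau_below = np.concatenate(([0.0], np.cumsum(dtau)[:-1]))
    # Each layer emits T*(1 - e^-dtau) and is attenuated by the air below it.
    t_sky = np.sum(temperature * (1.0 - np.exp(-dtau)) * np.exp(-tau_below))
    return tau_total, t_sky

# Toy 3-layer example (numbers are illustrative only).
tau, t_sky = zenith_opacity_and_brightness(
    alpha=[0.02, 0.01, 0.005], temperature=[280.0, 260.0, 230.0], dz=[1, 2, 5])
print(f"tau = {tau:.3f} Np, sky brightness = {t_sky:.1f} K")
```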

  10. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information brings negative effects, especially when the information is delayed. With accurate information travelers prefer the route reported to be in the best condition, but delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, which decreases capacity, increases oscillations and drives the system away from equilibrium. To avoid this negative effect, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, both routes are chosen with equal probability. Bounded rationality helps to improve efficiency in terms of capacity, oscillation and the gap from system equilibrium.
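
    The decision rule is simple enough to state in a few lines; the sketch below implements it inside a deliberately crude two-route flow model (all parameter values are invented) to illustrate how a non-zero threshold BR can damp the oscillations caused by delayed feedback.

```python
import random

def choose_route(reported, br):
    """Boundedly rational route choice on a two-route system: if the
    (possibly delayed) reported travel times differ by less than BR,
    the routes are treated as equivalent and one is picked at random."""
    t1, t2 = reported
    if abs(t1 - t2) < br:
        return random.randint(0, 1)
    return 0 if t1 < t2 else 1

def simulate(steps=2000, delay=20, br=5.0, seed=1):
    """Crude flow model: each route's travel time grows with its share of
    the traffic, and travelers only see information 'delay' steps old.
    Returns the mean absolute gap between the routes' travel times, which
    tends to shrink when BR damps the delayed-information oscillations."""
    random.seed(seed)
    counts = [0, 0]
    history = [(50.0, 50.0)]            # travel-time pairs over time
    gaps = []
    for step in range(steps):
        reported = history[max(0, len(history) - 1 - delay)]
        counts[choose_route(reported, br)] += 1
        share = counts[0] / (step + 1)
        current = (30.0 + 40.0 * share, 30.0 + 40.0 * (1.0 - share))
        history.append(current)
        gaps.append(abs(current[0] - current[1]))
    return sum(gaps) / len(gaps)

print("mean gap, accurate-choice baseline (BR=0):", round(simulate(br=0.0), 2))
print("mean gap, bounded rationality (BR=5):     ", round(simulate(br=5.0), 2))
```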

  11. Record Club

    Record Club

    2012-01-01

      March  Selections By the time this appears, we will have added a number of new CDs and DVDs into the Club. You will find the full lists at http://cern.ch/record.club; select the "Discs of the Month" button on the left panel of the web page and then Mar 2012. New films include recent releases such as Johnny English 2, Bad Teacher, Cowboys vs Aliens, and Super 8. We are also starting to acquire some of the classic films we missed when we initiated the DVD section of the club, such as appeared in a recent Best 100 Films published by a leading UK magazine; this month we have added Spielberg’s Jaws and Scorsese’s Goodfellas. If you have your own ideas on what we are missing, let us know. For children we have no less than 8 Tin-Tin DVDs. And if you like fast moving pop music, try the Beyonce concert DVD. New CDs include the latest releases from Paul McCartney, Rihanna and Amy Winehouse. There is a best of Mylene Farmer, a compilation from the NRJ 201...

  12. Direct conscious telemetry recordings demonstrate increased renal sympathetic nerve activity in rats with chronic kidney disease

    Ibrahim M Salman

    2015-08-01

    Full Text Available Chronic kidney disease (CKD) is associated with sympathetic hyperactivity and impaired blood pressure control reflex responses, yet direct evidence demonstrating these features of autonomic dysfunction in conscious animals is still lacking. Here we measured renal sympathetic nerve activity (RSNA) and mean arterial pressure (MAP) using telemetry-based recordings in a rat model of CKD, the Lewis Polycystic Kidney (LPK) rat, and assessed responses to chemoreflex activation and acute stress. Male LPK and Lewis control animals (total n=16) were instrumented for telemetric recording of RSNA and MAP. At 12–13 weeks of age, resting RSNA and MAP, and sympathetic and haemodynamic responses to both peripheral (hypoxia: 10% O2) and central chemoreflex (hypercapnia: 7% CO2) activation and acute stress (open-field exposure) were measured. As indicators of renal function, urinary protein (UPro) and creatinine (Ucr) levels were assessed. LPK rats had higher resting RSNA (1.2±0.1 vs. 0.6±0.1 µV, p<0.05) and MAP (151±8 vs. 97±2 mmHg, p<0.05) compared to Lewis. MAP was negatively correlated with Ucr (r=-0.80, p=0.002) and positively correlated with RSNA (r=0.66, p=0.014), with multiple linear regression modeling indicating the strongest correlation was with Ucr. RSNA and MAP responses to activation of the central chemoreflex and open-field stress were reduced in the LPK relative to the Lewis (all p<0.05). This is the first description of dual conscious telemetry recording of RSNA and MAP in a genetic rodent model of CKD. Elevated RSNA is likely a key contributor to the marked hypertension in this model, while attenuated RSNA and MAP responses to central chemoreflex activation and acute stress in the LPK indicate possible deficits in the neural processing of autonomic outflows evoked by these sympathoexcitatory pathways.

  13. Fast and Provably Accurate Bilateral Filtering.

    Chaudhury, Kunal N; Dabhade, Swapnil D

    2016-06-01

    The bilateral filter is a non-linear filter that uses a range filter along with a spatial filter to perform edge-preserving smoothing of images. A direct computation of the bilateral filter requires O(S) operations per pixel, where S is the size of the support of the spatial filter. In this paper, we present a fast and provably accurate algorithm for approximating the bilateral filter when the range kernel is Gaussian. In particular, for box and Gaussian spatial filters, the proposed algorithm can cut down the complexity to O(1) per pixel for any arbitrary S. The algorithm has a simple implementation involving N+1 spatial filterings, where N is the approximation order. We give a detailed analysis of the filtering accuracy that can be achieved by the proposed approximation in relation to the target bilateral filter. This allows us to estimate the order N required to obtain a given accuracy. We also present comprehensive numerical results to demonstrate that the proposed algorithm is competitive with the state-of-the-art methods in terms of speed and accuracy. PMID:27093722
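
    For illustration, the sketch below implements a shiftable (Fourier-cosine) approximation of the Gaussian range kernel, which reduces the bilateral filter to a handful of ordinary Gaussian filterings per image; it captures the spirit of the constant-time approach but is not the exact algorithm or error analysis of the cited paper, and all parameter values are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fast_bilateral(img, sigma_s, sigma_r, order=8, dynamic_range=255.0):
    """Approximate Gaussian-range bilateral filter via a shiftable
    (Fourier-cosine) expansion of the range kernel, so the whole filter
    reduces to O(order) Gaussian spatial filterings of auxiliary images.

    img              : 2-D float array with values in [0, dynamic_range]
    sigma_s, sigma_r : spatial and range standard deviations
    """
    T = dynamic_range
    w0 = np.pi / T
    num = np.zeros_like(img)
    den = np.zeros_like(img)
    for k in range(order + 1):
        # Fourier-cosine coefficient of exp(-t^2 / (2 sigma_r^2)) on [-T, T]
        # (closed form is accurate when sigma_r << T).
        c_k = ((1.0 if k == 0 else 2.0) / (2.0 * T)
               * np.sqrt(2.0 * np.pi) * sigma_r
               * np.exp(-0.5 * (k * w0 * sigma_r) ** 2))
        cos_f = np.cos(k * w0 * img)
        sin_f = np.sin(k * w0 * img)
        # cos(k w0 (f(q) - f(p))) expanded with the angle-difference identity
        # turns the data-dependent range weight into plain convolutions.
        num += c_k * (cos_f * gaussian_filter(cos_f * img, sigma_s)
                      + sin_f * gaussian_filter(sin_f * img, sigma_s))
        den += c_k * (cos_f * gaussian_filter(cos_f, sigma_s)
                      + sin_f * gaussian_filter(sin_f, sigma_s))
    return num / np.maximum(den, 1e-12)

# Example on a noisy step edge: the edge survives, the flat regions smooth out.
rng = np.random.default_rng(0)
img = np.zeros((64, 64)) + 40.0
img[:, 32:] = 200.0
noisy = np.clip(img + rng.normal(scale=10.0, size=img.shape), 0, 255)
out = fast_bilateral(noisy, sigma_s=3.0, sigma_r=30.0)
print(float(noisy[:, :30].std()), "->", float(out[:, :30].std()))
```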

  14. Accurate adiabatic correction in the hydrogen molecule

    Pachucki, Krzysztof, E-mail: krp@fuw.edu.pl [Faculty of Physics, University of Warsaw, Pasteura 5, 02-093 Warsaw (Poland); Komasa, Jacek, E-mail: komasa@man.poznan.pl [Faculty of Chemistry, Adam Mickiewicz University, Umultowska 89b, 61-614 Poznań (Poland)

    2014-12-14

    A new formalism for the accurate treatment of adiabatic effects in the hydrogen molecule is presented, in which the electronic wave function is expanded in the James-Coolidge basis functions. Systematic increase in the size of the basis set permits estimation of the accuracy. Numerical results for the adiabatic correction to the Born-Oppenheimer interaction energy reveal a relative precision of 10⁻¹² at an arbitrary internuclear distance. Such calculations have been performed for 88 internuclear distances in the range of 0 < R ⩽ 12 bohrs to construct the adiabatic correction potential and to solve the nuclear Schrödinger equation. Finally, the adiabatic correction to the dissociation energies of all rovibrational levels in H₂, HD, HT, D₂, DT, and T₂ has been determined. For the ground state of H₂ the estimated precision is 3 × 10⁻⁷ cm⁻¹, which is almost three orders of magnitude higher than that of the best previous result. The achieved accuracy removes the adiabatic contribution from the overall error budget of the present day theoretical predictions for the rovibrational levels.

  15. Accurate fission data for nuclear safety

    Solders, A; Jokinen, A; Kolhinen, V S; Lantz, M; Mattera, A; Penttila, H; Pomp, S; Rakopoulos, V; Rinta-Antila, S

    2013-01-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyvaskyla. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power of the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies of 1-30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons...

  16. Towards a more accurate concept of fuels

    Full text: The introduction of LEU in Atucha and the approval of CARA show an advancement of the Argentine power station fuels, which stimulates and indicates a direction to follow. In the first case, the use of enriched U fuel relaxes an important restriction related to neutronic economy; this means that it is possible to design less penalized fuels using more Zry. The second case allows a decrease in the linear power of the rods, enabling a better performance of the fuel in normal and also in accident conditions. In this work we wish to emphasize this last point, trying to find a design in which the surface power of the rod is diminished. Hence, in accident conditions owing to lack of coolant, the cladding tube will not reach temperatures that would produce oxidation (with the corresponding H2 formation), nor plasticity sufficient to form blisters that would obstruct reflooding, nor the hydration that produces fragility and rupture of the cladding tube, with the corresponding dispersion of radioactive material. This work is oriented towards finding rod designs with quasi-rectangular geometry that lower the surface power of the rods, in order to obtain a lower central temperature of the rod. Thus, critical temperatures will not be reached in case of loss of coolant. This design is becoming a reality after PPFAE's efforts towards the fabrication of cladding tubes with different circumferential profiles, rectangular in particular. This geometry, with an appropriate pellet design, can minimize the pellet-cladding interaction and, through an accurate choice of width, non-rectified pellets could be used. This means an important economy in pellet production, as well as an advance in the future fabrication of fuels in glove boxes and hot cells. The sequence used to determine the critical geometrical parameters is described and some rod arrangements are explored

  17. Accurate orbit propagation with planetary close encounters

    Baù, Giulio; Milani Comparetti, Andrea; Guerra, Francesca

    2015-08-01

    We tackle the problem of accurately propagating the motion of those small bodies that undergo close approaches with a planet. The literature is lacking on this topic and the reliability of the numerical results is not sufficiently discussed. The high-frequency components of the perturbation generated by a close encounter make the propagation particularly challenging, both from the point of view of the dynamical stability of the formulation and of the numerical stability of the integrator. In our approach a fixed step-size and order multistep integrator is combined with a regularized formulation of the perturbed two-body problem. When the propagated object enters the region of influence of a celestial body, the latter becomes the new primary body of attraction. Moreover, the formulation and the step-size will also be changed if necessary. We present: 1) the restarter procedure applied to the multistep integrator whenever the primary body is changed; 2) new analytical formulae for setting the step-size (given the order of the multistep, formulation and initial osculating orbit) in order to control the accumulation of the local truncation error and guarantee the numerical stability during the propagation; 3) a new definition of the region of influence in the phase space. We test the propagator with some real asteroids subject to the gravitational attraction of the planets, the Yarkovsky and relativistic perturbations. Our goal is to show that the proposed approach improves the performance of both the propagator implemented in the OrbFit software package (which is currently used by the NEODyS service) and of the propagator represented by a variable step-size and order multistep method combined with Cowell's formulation (i.e. direct integration of position and velocity in either the physical or a fictitious time).

  18. Towards Accurate Application Characterization for Exascale (APEX)

    Hammond, Simon David [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines and many large capability resources including ASCI Red and RedStorm were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams and to directly engage with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  19. Accurate hydrocarbon estimates attained with radioactive isotope

    To make accurate economic evaluations of new discoveries, an oil company needs to know how much gas and oil a reservoir contains. The porous rocks of these reservoirs are not completely filled with gas or oil, but contain a mixture of gas, oil and water. It is extremely important to know what volume percentage of this water--called connate water--is contained in the reservoir rock. The percentage of connate water can be calculated from electrical resistivity measurements made downhole. The accuracy of this method can be improved if a pure sample of connate water can be analyzed or if the chemistry of the water can be determined by conventional logging methods. Because of the similarity of the mud filtrate--the water in a water-based drilling fluid--and the connate water, this is not always possible. If the oil company cannot distinguish between connate water and mud filtrate, its oil-in-place calculations could be incorrect by ten percent or more. It is clear that unless an oil company can be sure that a sample of connate water is pure, or at the very least knows exactly how much mud filtrate it contains, its assessment of the reservoir's water content--and consequently its oil or gas content--will be distorted. The oil companies have opted for the Repeat Formation Tester (RFT) method. Label the drilling fluid with small doses of tritium--a radioactive isotope of hydrogen--and it will be easy to detect and quantify in the sample
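
    As a back-of-the-envelope illustration of why the tracer helps (assuming a simple two-end-member mix in which connate water carries no tritium; all numbers are invented), the filtrate fraction follows directly from the measured activities and can then be used to undo the dilution of the connate-water chemistry:

```python
def filtrate_fraction(activity_sample, activity_mud_filtrate):
    """Fraction of a recovered water sample that is mud filtrate, inferred
    from the tritium activity of the sample relative to the tritium-labelled
    drilling fluid (connate water is assumed to carry no tracer)."""
    return activity_sample / activity_mud_filtrate

def corrected_connate_salinity(measured_salinity, filtrate_salinity, f_filtrate):
    """Undo the dilution of the connate-water chemistry by the known
    fraction of filtrate (linear mixing assumption)."""
    f_connate = 1.0 - f_filtrate
    return (measured_salinity - f_filtrate * filtrate_salinity) / f_connate

# Illustrative numbers only: 15% of the sample turns out to be filtrate.
f = filtrate_fraction(activity_sample=150.0, activity_mud_filtrate=1000.0)
print(f"filtrate fraction: {f:.0%}")
print("corrected connate salinity:",
      corrected_connate_salinity(measured_salinity=38.0,
                                 filtrate_salinity=8.0, f_filtrate=f))
```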

  20. Fast, accurate standardless XRF analysis with IQ+

    Full text: Due to both chemical and physical effects, the most accurate XRF data are derived from calibrations set up using in-type standards, necessitating some prior knowledge of the samples being analysed. Whilst this is often the case for routine samples, particularly in production control, for completely unknown samples the identification and availability of in-type standards can be problematic. Under these circumstances standardless analysis can offer a viable solution. Successful analysis of completely unknown samples requires a complete chemical overview of the specimen together with the flexibility of a fundamental parameters (FP) algorithm to handle wide-ranging compositions. Although FP algorithms are improving all the time, most still require set-up samples to define the spectrometer response to a particular element. Whilst such materials may be referred to as standards, the emphasis in this kind of analysis is that only a single calibration point is required per element and that the standard chosen does not have to be in-type. The high sensitivities of modern XRF spectrometers, together with recent developments in detector counting electronics that possess a large dynamic range and high-speed data processing capacity, bring significant advances to fast, standardless analysis. Illustrated with a tantalite-columbite heavy-mineral concentrate grading use-case, this paper will present the philosophy behind the semi-quantitative IQ+ software and the required hardware. This combination can give a rapid scan-based overview and quantification of the sample in less than two minutes, together with the ability to define channels for specific elements of interest where higher accuracy and lower levels of quantification are required. The accuracy, precision and limitations of standardless analysis will be assessed using certified reference materials of widely differing chemical and physical composition. Copyright (2002) Australian X-ray Analytical Association Inc

  1. Urolithiasis: how accurate are plain radiographs?

    Chan, V.O.; Buckley, O.; Persaud, T.; Torreggiani, W.C. [Dept. of Radiology, The Adelaide and Meath Hospital, Tallaght, Dublin (Ireland)], E-mail: William.Torreggiani@amnch.ie

    2008-06-15

    To determine the value of the kidneys, ureters, and bladder radiograph (KUB) in the diagnosis of urolithiasis using unenhanced helical computerized tomography (UHCT) as the gold standard. A retrospective study was performed on 100 consecutive patients being investigated for suspected urolithiasis. All patients presented with acute renal colic and had a KUB and UHCT within a 3-hour period. UHCT and KUB pairs were assessed separately by 2 radiologists in consensus who were blinded to the clinical details of the patients and the results of the other tests and examinations. The presence, location, number, and size of stones were recorded. Each UHCT and KUB pair was then compared for concordance on a stone-by-stone basis. KUB was concordant with the gold standard UHCT in only 50% of patients (11 positive, 39 negative), giving a sensitivity of 18.6%, a specificity of 95.1%, a positive predictive value of 84.6%, and a negative predictive value of 44.8%. KUB has a very low sensitivity for the detection of urolithiasis, although specificity is acceptable. (author)
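
    The quoted accuracy figures follow from a standard 2x2 comparison against the gold standard; the sketch below reproduces them from counts back-calculated from the abstract (an inference for illustration, not a table taken from the paper).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-accuracy measures of an index test
    (here KUB) against a gold standard (here UHCT)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
    }

# Counts back-calculated from the figures quoted in the abstract
# (11 concordant positives, 39 concordant negatives, n = 100).
for name, value in diagnostic_metrics(tp=11, fp=2, fn=48, tn=39).items():
    print(f"{name}: {value:.1%}")
```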

  3. 49 CFR 379.7 - Preservation of records.

    2010-10-01

    ... Transportation Other Regulations Relating to Transportation (Continued) FEDERAL MOTOR CARRIER SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL MOTOR CARRIER SAFETY REGULATIONS PRESERVATION OF RECORDS... alteration, modification, or erasure of the underlying data and will enable production of an accurate...

  4. Medical records and record-keeping standards.

    Carpenter, Iain; Ram, Mala Bridgelal; Croft, Giles P; Williams, John G

    2007-08-01

    The structure of medical records becomes ever more critical with the advent of electronic records. The Health Informatics Unit (HIU) of the Royal College of Physicians has two work streams in this area. The Records Standards programme is developing generic standards for all entries into medical notes and standards for the content of admission, handover and discharge records. The Information Laboratory (iLab) focuses on hospital episode statistics and their use for monitoring clinician performance. Clinician endorsement of the work is achieved through extensive consultations. Generic medical record-keeping standards are now available. PMID:17882846

  5. CMS Records Schedule

    U.S. Department of Health & Human Services — The CMS Records Schedule provides disposition authorizations approved by the National Archives and Records Administration (NARA) for CMS program-related records...

  6. Presidential Electronic Records Library

    National Archives and Records Administration — PERL (Presidential Electronic Records Library) used to ingest and provide internal access to the Presidential electronic Records of the Reagan, Bush, and Clinton...

  7. Keeping the Records Straight.

    Clift, Phil; Keynes, Milton

    1982-01-01

    Guidelines are given regarding keeping and using educational records for exceptional children in Great Britain. Procedures related to anecdotal records, observation inventories, and rating scales are delineated. (CL)

  8. Accurate documentation in cultural heritage by merging TLS and high-resolution photogrammetric data

    Grussenmeyer, Pierre; Alby, Emmanuel; Assali, Pierre; Poitevin, Valentin; Hullo, Jean-François; Smigiel, Eddie

    2011-07-01

    Several recording techniques are used together in Cultural Heritage Documentation projects. The main purpose of the documentation and conservation works is usually to generate geometric and photorealistic 3D models for both accurate reconstruction and visualization purposes. The recording approach discussed in this paper is based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and criteria such as geometry, texture, accuracy, resolution, and recording and processing time are often compared. TLS techniques (time of flight or phase shift systems) are often used for the recording of large and complex objects or sites. Point cloud generation from images by dense stereo or multi-image matching can be used as an alternative or a complementary method to TLS. Compared to TLS, the photogrammetric solution is a low-cost one, as the acquisition system is limited to a digital camera and a few accessories only. Indeed, the stereo matching process offers a cheap, flexible and accurate solution to get 3D point clouds and textured models. The calibration of the camera allows the processing of distortion-free images, accurate orientation of the images, and matching at the subpixel level. The main advantage of this photogrammetric methodology is to get at the same time a point cloud (the resolution depends on the size of the pixel on the object), and therefore an accurate meshed object with its texture. After the matching and processing steps, we can use the resulting data in much the same way as a TLS point cloud, but with much better raster information for textures. The paper will address the automation of recording and processing steps, the assessment of the results, and the deliverables (e.g. PDF-3D files). Visualization aspects of the final 3D models are presented. Two case studies with merged photogrammetric and TLS data are finally presented: - the Gallo-Roman Theatre of Mandeure (France); - The

  9. Accurate Jones Matrix of the Practical Faraday Rotator

    王林斗; 祝昇翔; 李玉峰; 邢文烈; 魏景芝

    2003-01-01

    The Jones matrix of practical Faraday rotators is often used in the engineering calculation of non-reciprocal optical fields. Nevertheless, only the approximate Jones matrix of practical Faraday rotators has been presented until now. Based on the theory of polarized light, this paper presents the accurate Jones matrix of practical Faraday rotators. In addition, an experiment has been carried out to verify the validity of the accurate Jones matrix. This matrix accurately describes the optical characteristics of practical Faraday rotators, including rotation, loss and depolarization of the polarized light. The accurate Jones matrix can be used to obtain accurate results when a practical Faraday rotator transforms polarized light, which paves the way for the accurate analysis and calculation of practical Faraday rotators in relevant engineering applications.
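
    For reference, the commonly used approximate form that the paper improves upon is a pure rotation combined with a scalar loss; the sketch below builds that matrix (parameter values are illustrative). Depolarization, which the accurate treatment addresses, cannot be captured by such a simple 2x2 form.

```python
import numpy as np

def approx_faraday_jones(theta_rad, loss):
    """Commonly used *approximate* Jones matrix of a Faraday rotator:
    a pure rotation by theta with a polarization-independent amplitude
    loss.  The cited paper refines this description."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return np.sqrt(1.0 - loss) * np.array([[c, -s], [s, c]])

# 45-degree rotator with 2% power loss acting on horizontally polarized light.
J = approx_faraday_jones(np.pi / 4, loss=0.02)
e_out = J @ np.array([1.0, 0.0])
print(e_out, "output power:", float(np.vdot(e_out, e_out).real))
```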

  10. Surgical medical record

    Bulow, S.

    2008-01-01

    A medical record is presented on the basis of selected linguistic pearls collected over the years from surgical case records. Publication date: 2008/12/15.

  11. Your Medical Records

    What Are Medical Records? Each time you climb up on a ...

  12. Electronic Health Records

    ... t happen overnight, they are coming. Understanding EHRs: Electronic health records (EHR) — also called electronic medical records ( ...

  13. 19 CFR 201.29 - Commission disclosure of individual records, accounting of record disclosures, and requests for...

    2010-04-01

    ... disclosure required by 5 U.S.C. 552, the Privacy Act Officer shall keep an accurate accounting of: (1) The... individual requesting an accounting of disclosure of his or her records should make the request in writing to... and in the letter that it is a Privacy Act request for an accounting of disclosure of records....

  14. Biomimetic Approach for Accurate, Real-Time Aerodynamic Coefficients Project

    National Aeronautics and Space Administration — Aerodynamic and structural reliability and efficiency depends critically on the ability to accurately assess the aerodynamic loads and moments for each lifting...

  15. On the Record: The Philosophy of Recording

    Martin Newman

    2011-03-01

    Full Text Available At the Theoretical Archaeology Group (TAG) conference in Durham in December 2009 I organised a session on the philosophy of recording, and the three articles presented here originated in that discussion. The aim of the session was to consider some fundamental questions about the recording that archaeologists undertake but which are often overlooked, and think about these in a theoretical way. These questions included:
    ◦ Why do we choose to record the sites, monuments and artefacts that we do? Why do we select the units of information we choose to record about them? How have the things we record and the attributes recorded changed over time?
    ◦ How can the adoption of a reflexive approach enable us to assess the recording choices we make and inform those that will be made in the future?
    ◦ What do these recording choices tell us about archaeology and wider society over time?
    ◦ Can something as intangible as a database record or a digital photograph be considered as an artefact and studied as material culture?
    ◦ How are new technologies changing recording and adding to the material available for study, which will form the historic documents of the future?
    The first two of these questions are decisions that archaeologists have been making since the earliest origins of the discipline, often without a passing thought. It is not very often that we pause to analyse them at a theoretical level. As recording has developed we have been very good at categorizing, making inventories and constructing ontologies to describe the past around us, but less good about asking why we do so in the way we do.

  16. Mining time-dependent patient outcomes from hospital patient records.

    Rao, Bharat R.; Sandilya, Sathyakama; Niculescu, Radu; Germond, Colin; Goel, A.

    2002-01-01

    We describe REMIND, a data mining framework that accurately infers missing clinical information by reasoning over the entire patient record. Hospitals collect computerized patient records (CPR's) in structured (database tables) and unstructured (free text) formats. Structured clinical data in the CPR's is often poorly recorded, and information may be missing about key outcomes and processes. For instance, for a population of 344 colon cancer patients, important clinical outcomes, such as dise...

  17. Comparison of speech recognition with different speech coding strategies (SPEAK, CIS, and ACE) and their relationship to telemetric measures of compound action potentials in the nucleus CI 24M cochlear implant system.

    Kiefer, J; Hohl, S; Stürzebecher, E; Pfennigdorff, T; Gstöettner, W

    2001-01-01

    Speech understanding and subjective preference for three different speech coding strategies (spectral peak coding [SPEAK], continuous interleaved sampling [CIS], and advanced combination encoders [ACE]) were investigated in 11 post-lingually deaf adult subjects, using the Nucleus CI 24M cochlear implant system. Subjects were randomly assigned to two groups in a balanced crossover study design. The first group was initially fitted with SPEAK and the second group with CIS. The remaining strategies were tested sequentially over 8 to 10 weeks with systematic variations of number of channels and rate of stimulation. Following a further interval of 3 months, during which subjects were allowed to listen with their preferred strategy, they were tested again with all three strategies. Compound action potentials (CAPs) were recorded using neural response telemetry. Input/output functions in relation to increasing stimulus levels and inter-stimulus intervals between masker and probe were established to assess the physiological status of the cochlear nerve. Objective results and subjective rating showed significant differences in favour of the ACE strategy. Ten of the 11 subjects preferred the ACE strategy at the end of the study. The estimate of the refractory period based on the inter-stimulus interval correlated significantly with the overall performance with all three strategies, but CAP measures could not be related to individual preference of strategy or differences in performance between strategies. Based on these results, the ACE strategy can be recommended as an initial choice specifically for the Nucleus CI 24M cochlear implant system. Nevertheless, access to the other strategies may help to increase performance in individual patients. PMID:11296939

  18. Can structured data fields accurately measure quality of care? The example of falls

    David A. Ganz, MD, PhD

    2012-12-01

    Full Text Available By automating collection of data elements, electronic health records may simplify the process of measuring the quality of medical care. Using data from a quality improvement initiative in primary care medical groups, we sought to determine whether the quality of care for falls and fear of falling in outpatients aged 75 and older could be accurately measured solely from codable (non-free-text) data in a structured visit note. A traditional medical record review by trained abstractors served as the criterion standard. Among 215 patient records reviewed, we found a structured visit note in 54% of charts within 3 mo of the date patients had been identified as having falls or fear of falling. The reliability of an algorithm based on codable data was at least good (kappa of at least 0.61) compared with full medical record review for three care processes recommended for patients with two falls or one fall with injury in the past year: orthostatic vital signs, vision test/eye examination, and home safety evaluation. However, the automated algorithm routinely underestimated quality of care. Performance standards based on automated measurement of quality of care from electronic health records need to account for documentation occurring in nonstructured form.
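
    The agreement statistic referred to is Cohen's kappa; a minimal sketch of its computation for a binary care-process item is shown below, with invented counts rather than the study's data.

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two raters (e.g. the structured-data algorithm
    vs. full chart review) on a binary item.
    a = both say 'care delivered', d = both say 'not delivered',
    b, c = the two kinds of disagreement."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_rater1_yes, p_rater2_yes = (a + b) / n, (a + c) / n
    p_chance = (p_rater1_yes * p_rater2_yes
                + (1 - p_rater1_yes) * (1 - p_rater2_yes))
    return (p_observed - p_chance) / (1 - p_chance)

# Illustrative counts only (not taken from the study):
print(round(cohens_kappa(a=40, b=6, c=8, d=46), 2))   # -> 0.72
```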

  19. Admission medical records made at night time have the same quality as day and evening time records

    Amirian, Ilda; Mortensen, Jacob F; Rosenberg, Jacob;

    2014-01-01

    INTRODUCTION: A thorough and accurate admission medical record is an important tool in ensuring patient safety during the hospital stay. Surgeons' performance might be affected during night shifts due to sleep deprivation. The aim of the study was to assess the quality of admission medical records....... CONCLUSION: Night time deterioration was not seen in the quality of the medical records. FUNDING: The study was supported financially by the Tryg Foundation Denmark and The Danish Medical Association. TRIAL REGISTRATION: not relevant....

  20. Accurate formulas for the penalty caused by interferometric crosstalk

    Rasmussen, Christian Jørgen; Liu, Fenghai; Jeppesen, Palle

    2000-01-01

    New simple formulas for the penalty caused by interferometric crosstalk in PIN receiver systems and optically preamplified receiver systems are presented. They are more accurate than existing formulas.

  1. A new, accurate predictive model for incident hypertension

    Völzke, Henry; Fung, Glenn; Ittermann, Till; Yu, Shipeng; Baumeister, Sebastian E; Dörr, Marcus; Lieb, Wolfgang; Völker, Uwe; Linneberg, Allan; Jørgensen, Torben; Felix, Stephan B; Rettig, Rainer; Rao, Bharat; Kroemer, Heyo K

    2013-01-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures.

  2. 78 FR 34604 - Submitting Complete and Accurate Information

    2013-06-10

    ... COMMISSION 10 CFR Part 50 Submitting Complete and Accurate Information AGENCY: Nuclear Regulatory Commission... accurate information as would a licensee or an applicant for a license.'' DATES: Submit comments by August... may submit comments by any of the following methods (unless this document describes a different...

  3. How utilities can achieve more accurate decommissioning cost estimates

    The number of commercial nuclear power plants that are undergoing decommissioning coupled with the economic pressure of deregulation has increased the focus on adequate funding for decommissioning. The introduction of spent-fuel storage and disposal of low-level radioactive waste into the cost analysis raises even greater concern about the accuracy of the fund calculation basis. The size and adequacy of the decommissioning fund have also played a major part in the negotiations for transfer of plant ownership. For all of these reasons, it is important that the operating plant owner reduce the margin of error in the preparation of decommissioning cost estimates. To date, all of these estimates have been prepared via the building block method. That is, numerous individual calculations defining the planning, engineering, removal, and disposal of plant systems and structures are performed. These activity costs are supplemented by the period-dependent costs reflecting the administration, control, licensing, and permitting of the program. This method will continue to be used in the foreseeable future until adequate performance data are available. The accuracy of the activity cost calculation is directly related to the accuracy of the inventory of plant system components, piping and equipment, and plant structural composition. Typically, it is left up to the cost-estimating contractor to develop this plant inventory. The data are generated by searching and analyzing property asset records, plant databases, piping and instrumentation drawings, piping system isometric drawings, and component assembly drawings. However, experience has shown that these sources may not be up to date, discrepancies may exist, there may be missing data, and the level of detail may not be sufficient. Again, typically, the time constraints associated with the development of the cost estimate preclude perfect resolution of the inventory questions. Another problem area in achieving accurate cost

  4. Iraq Radiosonde Launch Records

    National Oceanic and Atmospheric Administration, Department of Commerce — Iraqi upper air records loaned to NCDC from the Air Force 14th Weather Squadron. Scanned notebooks containing upper air radiosonde launch records and data. Launches...

  5. Vessel Activity Record

    National Oceanic and Atmospheric Administration, Department of Commerce — The Vessel Activity Record is a bi-weekly spreadsheet that shows the status of fishing vessels. It records whether fishing vessels are fishing without an observer...

  6. Daily Weather Records

    National Oceanic and Atmospheric Administration, Department of Commerce — These daily weather records were compiled from a subset of stations in the Global Historical Climatological Network (GHCN)-Daily dataset. A weather record is...

  7. Climate Record Books

    National Oceanic and Atmospheric Administration, Department of Commerce — Climate Record Books contain daily, monthly, seasonal, and annual averages, extremes, or occurrences. Most data are sequential by period of record 1871-1910,...

  8. Accurate focal spot diagnostics based on a single shot coherent modulation imaging

    A single-shot method based on coherent modulation imaging is presented for the diagnostics of the focal spot of laser facilities. The laser beam to be measured first illuminates a highly random phase plate with a known structure and subsequently the intensity of the resulting diffraction pattern is recorded by a charge-coupled device positioned behind the phase plate. Intensity distribution at the focus of the laser beam is accurately reconstructed with the coherent modulation imaging method. The feasibility of this method is demonstrated with an experiment involving a He–Ne laser. (letter)
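
    A sketch of the forward model that such a reconstruction inverts is given below (propagation idealized as a single Fourier transform; array sizes and the Gaussian test beam are assumptions made for the example). The iterative phase-retrieval solver itself, which alternates between the measured detector modulus and the known modulator, is not reproduced here.

```python
import numpy as np

def simulate_cmi_pattern(focal_field, modulator_phase):
    """Forward model of the single-shot measurement described above
    (idealized): the beam at focus passes through a known random phase
    plate and propagates to the detector, here approximated by a single
    Fourier transform.  The coherent-modulation-imaging reconstruction
    iteratively inverts exactly this model, using the measured pattern
    and the known modulator as constraints."""
    exit_wave = focal_field * np.exp(1j * modulator_phase)
    detector_field = np.fft.fftshift(np.fft.fft2(exit_wave))
    return np.abs(detector_field) ** 2

# Toy example: Gaussian focal spot, uniformly random (but known) phase plate.
n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
focal_field = np.exp(-(x ** 2 + y ** 2) / (2 * 4.0 ** 2)).astype(complex)
plate = np.random.default_rng(0).uniform(0, 2 * np.pi, (n, n))
pattern = simulate_cmi_pattern(focal_field, plate)
print(pattern.shape, float(pattern.max()))
```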

  9. Generator maintenance electrical testing. The importance of trending and accurate interpretation. A case study

    In today's rapidly changing Power Generation Industry it is more critical than ever to acquire and maintain accurate records of previous and current electrical test data. Evaluation and trending of these data are essential to ensuring the reliable operation of the machine in the ever-changing world of extended maintenance outages and maintenance budget reductions. This paper presents a case study of a unique problem that had initiated as early as 1990 and was not properly diagnosed and corrected until 2004, at which time it had propagated to a condition of imminent failure. (author)

  10. Accurate calculation of diffraction-limited encircled and ensquared energy.

    Andersen, Torben B

    2015-09-01

    Mathematical properties of the encircled and ensquared energy functions for the diffraction-limited point-spread function (PSF) are presented. These include power series and a set of linear differential equations that facilitate the accurate calculation of these functions. Asymptotic expressions are derived that provide very accurate estimates for the relative amount of energy in the diffraction PSF that falls outside a large square or rectangular detector. Tables with accurate values of the encircled and ensquared energy functions are also presented. PMID:26368873
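
    As a concrete reference point (the classical closed form for a uniformly illuminated circular aperture, which the paper's series, differential equations and asymptotics generalize and tabulate), the encircled energy of the Airy pattern can be evaluated directly; the scaling of the radial variable is an assumption stated in the docstring.

```python
import numpy as np
from scipy.special import j0, j1

def encircled_energy_airy(v):
    """Fraction of the total energy of the diffraction-limited PSF of a
    uniformly illuminated circular aperture that falls inside radius r,
    expressed through the canonical variable v = (2*pi/lambda) * NA * r.
    Classical closed form: E(v) = 1 - J0(v)^2 - J1(v)^2."""
    v = np.asarray(v, dtype=float)
    return 1.0 - j0(v) ** 2 - j1(v) ** 2

# First dark ring of the Airy pattern encloses about 83.8% of the energy.
first_zero = 1.21967 * np.pi
print(f"energy inside first dark ring: {encircled_energy_airy(first_zero):.3f}")
```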

  11. Accurately bearing measurement in non-cooperative passive location system

    A non-cooperative passive location system based on an array is proposed. In the system, the target is detected by beamforming and Doppler matched filtering, and the bearing is measured by a long-baseline interferometer composed of widely separated sub-arrays. With a long baseline, the interferometer measures the bearing accurately but ambiguously. To realize unambiguous, accurate bearing measurement, beam-width and multiple-constraint adaptive beamforming techniques are used to resolve the azimuth ambiguity. Theory and simulation results show that this method is effective for accurate bearing measurement in a non-cooperative passive location system. (authors)
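
    A minimal sketch of the ambiguity-resolution step described above: the long-baseline phase pins the bearing down to a set of closely spaced candidates, and a coarse but unambiguous estimate (e.g. from beamforming) selects among them. Geometry and numbers are illustrative only.

```python
import numpy as np

def resolve_bearing(phase_meas, baseline, wavelength, coarse_bearing_deg):
    """Long-baseline interferometry gives a precise but ambiguous bearing:
    the measured phase (mod 2*pi) is consistent with every angle satisfying
    sin(theta) = (phase + 2*pi*k) * wavelength / (2*pi*baseline).
    A coarse but unambiguous estimate selects the correct integer k."""
    k_max = int(np.floor(baseline / wavelength)) + 1
    candidates = []
    for k in range(-k_max, k_max + 1):
        s = (phase_meas + 2.0 * np.pi * k) * wavelength / (2.0 * np.pi * baseline)
        if -1.0 <= s <= 1.0:
            candidates.append(np.degrees(np.arcsin(s)))
    candidates = np.asarray(candidates)
    return candidates[np.argmin(np.abs(candidates - coarse_bearing_deg))]

# Example: 20-wavelength baseline, true bearing 17.3 degrees.
wavelength, baseline, true_deg = 1.0, 20.0, 17.3
true_phase = 2.0 * np.pi * baseline * np.sin(np.radians(true_deg)) / wavelength
wrapped = np.angle(np.exp(1j * true_phase))            # what is actually measured
print(resolve_bearing(wrapped, baseline, wavelength, coarse_bearing_deg=16.0))
```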

  12. Automating aviation training records.

    Reinholt, Kurt B.

    2000-01-01

    Over the years with advances in computer technology, the navy has gradually transitioned into a paperless operation. Personnel training records have provided a standardized, documentable individual qualification record for Navy aviation maintenance personnel, however these records continue to be kept in folders, stored in file cabinets. In addition, paper records create a maintenance burden, in that the continued handling and possibility of errors made during data entry and normal wear and te...

  13. Using multiple survey vendors to collect health outcomes information: How accurate are the data?

    Haffer Samuel C

    2003-04-01

    Full Text Available Abstract Background To measure and assess health outcomes and quality of life at the national level, large-scale surveys using multiple vendors to gather health information are becoming the norm. This study evaluates the ability of multiple survey vendors to gather and report data collected as part of the 1998 Medicare Health Outcomes Survey (HOS). Method Four hundred randomly sampled completed mailed surveys were chosen from each of six certified vendors (N = 2397) participating in the 1998 HOS. The accuracy of the data gathered from the vendors was measured by creating a "gold standard" record for each survey and comparing it to the final record submitted by the vendor. Results Overall rates of agreement were calculated, and ranged from 97.0% to 99.8% across the vendors. Conclusion Researchers may be confident that using multiple vendors to gather health outcomes information will yield accurate data.

  14. Managing electronic records

    McLeod, Julie

    2005-01-01

    For records management courses, this book covers the theory and practice of managing electronic records as business and information assets. It focuses on the strategies, systems and procedures necessary to ensure that electronic records are appropriately created, captured, organized and retained over time to meet business and legal requirements.

  15. Dental records: An overview

    B K Charangowda

    2010-01-01

    Full Text Available Dental records consist of documents related to the history of present illness, clinical examination, diagnosis, treatment done, and the prognosis. A thorough knowledge of dental records is essential for the practicing dentist, as it not only has a forensic application, but also a legal implication with respect to insurance and consumerism. This article reviews the importance of dental records in forensics.

  17. Accurate backgrounds to Higgs production at the LHC

    Kauer, N

    2007-01-01

    Corrections of 10-30% for backgrounds to the H --> WW --> l^+ l^- + missing-pT search in vector boson and gluon fusion at the LHC are reviewed to make the case for precise and accurate theoretical background predictions.

  18. ACCURATE ESTIMATES OF CHARACTERISTIC EXPONENTS FOR SECOND ORDER DIFFERENTIAL EQUATION

    2009-01-01

    In this paper, a second order linear differential equation is considered, and an accurate method for estimating its characteristic exponents is presented. Finally, we give some examples to verify the feasibility of our result.

  19. Accurate wall thickness measurement using autointerference of circumferential Lamb wave

    In this paper, a method of accurately measuring pipe wall thickness using a noncontact air-coupled ultrasonic transducer (NAUT) is presented. In this method, accurate measurement of the angular wave number (AWN) is a key technique because the AWN changes minutely with the wall thickness. Autointerference of the circumferential (C-) Lamb wave was used for accurate measurements of the AWN. The principle of the method is first explained. A modified method for measuring the wall thickness near a butt weld line is also proposed, and its accuracy was evaluated to be within a 6 μm error. It is also shown in the paper that wall thickness measurement can be carried out accurately regardless of differences among the sensors by calibrating the frequency response of the sensors. (author)

  20. Highly Accurate Sensor for High-Purity Oxygen Determination Project

    National Aeronautics and Space Administration — In this STTR effort, Los Gatos Research (LGR) and the University of Wisconsin (UW) propose to develop a highly-accurate sensor for high-purity oxygen determination....

  1. Highly Resolved Paleoclimatic Aerosol Records

    Kettner, Ernesto

    to paleotemperatures. Impurities in the matrix comprise particulate and soluble aerosols, each carrying information on its source's activity and/or proximity. As opposed to gases and water isotopes, the seasonality of many aerosols is not smoothed out in the firn column, so that large concentration gradients...... with frequently changing signs are preserved. Therefore, these aerosol records can be used for dating by annual layer counting. However, with increasing depth the annual layer thickness decreases due to pressure and ice flow, and accurate dating is possible only as long as the rapid variations can be resolved...... experimentally. Over the last decades Continuous Flow Analysis (CFA) has become a well-established technique for aerosol quantification. In CFA, a piece of core is melted continuously and the melt water is analysed for an array of chemical impurities. When designing a CFA system, a trilemma between high sample...

  2. Record Statistics and Dynamics

    Sibani, Paolo; Jensen, Henrik J.

    2009-01-01

    The term record statistics covers the statistical properties of records within an ordered series of numerical data obtained from observations or measurements. A record within such series is simply a value larger (or smaller) than all preceding values. The mathematical properties of records strongly...... fluctuations of e. g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations....

  3. 42 CFR 422.118 - Confidentiality and accuracy of enrollee records.

    2010-10-01

    ... and disclosure of medical records, or other health and enrollment information. The MA organization... court orders or subpoenas. (c) Maintain the records and information in an accurate and timely manner. (d) Ensure timely access by enrollees to the records and information that pertain to them....

  4. Admission medical records made at night time have the same quality as day and evening time

    Amirian, Ilda; Mortensen, Jacob F; Rosenberg, Jacob;

    2014-01-01

    INTRODUCTION: A thorough and accurate admission medical record is an important tool in ensuring patient safety during the hospital stay. Surgeons' performance might be affected during night shifts due to sleep deprivation. The aim of the study was to assess the quality of admission medical records...... deterioration was not seen in the quality of the medical records....

  5. History and progress on accurate measurements of the Planck constant.

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10(-34) J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N(A). As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10(8) from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the

  8. Accurately measuring dynamic coefficient of friction in ultraform finishing

    Briggs, Dennis; Echaves, Samantha; Pidgeon, Brendan; Travis, Nathan; Ellis, Jonathan D.

    2013-09-01

    UltraForm Finishing (UFF) is a deterministic sub-aperture computer numerically controlled grinding and polishing platform designed by OptiPro Systems. UFF is used to grind and polish a variety of optics from simple spherical to fully freeform, and numerous materials from glasses to optical ceramics. The UFF system consists of an abrasive belt around a compliant wheel that rotates and contacts the part to remove material. This work aims to accurately measure the dynamic coefficient of friction (μ), how it changes as a function of belt wear, and how this ultimately affects material removal rates. The coefficient of friction has been examined in terms of contact mechanics and Preston's equation to determine accurate material removal rates. By accurately predicting changes in μ, polishing iterations can be more accurately predicted, reducing the total number of iterations required to meet specifications. We have established an experimental apparatus that can accurately measure μ by measuring triaxial forces during translating loading conditions or while manufacturing the removal spots used to calculate material removal rates. Using this system, we will demonstrate μ measurements for UFF belts during different states of their lifecycle and assess the material removal function from spot diagrams as a function of wear. Ultimately, we will use this system for qualifying belt-wheel-material combinations to develop a spot-morphing model to better predict instantaneous material removal functions.
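
    A quick way to make the removal-rate link concrete is Preston's equation, dz/dt = k_p · p · v, which relates removal rate to contact pressure and relative surface speed; a friction-dependent calibration of the kind described above would enter through the Preston coefficient. The following minimal Python sketch uses purely illustrative values, not data from the study.

    def preston_removal_rate(k_p, pressure, velocity):
        """Material removal rate (m/s) from Preston's equation:
        k_p (m^2/N) * contact pressure (Pa) * relative speed (m/s)."""
        return k_p * pressure * velocity

    # Illustrative (hypothetical) numbers only.
    rate = preston_removal_rate(k_p=1.0e-13, pressure=2.0e4, velocity=1.5)
    print(f"removal rate: {rate:.2e} m/s")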

  9. New Mediterranean Biodiversity Records

    Katsanevakis, S.; Acar, Ü.; Ammar, I.; Balci, B. A.; Bekas, P.; Belmonte, M.; Chintiroglou, C. C.; Consoli, P.; Dimiza, M.; Fryganiotis, K.; Gerovasileiou, V.; Gnisci, V.; Gülşahin, N.; Hoffman, R.; Issaris, Y.

    2014-01-01

    The Collective Article ‘New Mediterranean Biodiversity Records’ of the Mediterranean Marine Science journal offers the means to publish biodiversity records in the Mediterranean Sea. The current article is divided into two parts, for records of alien and native species respectively. The new records of alien species include: the red alga Asparagopsis taxiformis (Crete and Lakonicos Gulf) (Greece); the red alga Grateloupia turuturu (along the Israeli Mediterranean shore); the mantis shrimp Clorid...

  10. Modern recording techniques

    Huber, David Miles

    2013-01-01

    As the most popular and authoritative guide to recording, Modern Recording Techniques provides everything you need to master the tools and day-to-day practice of music recording and production. From room acoustics and running a session to mic placement and designing a studio, Modern Recording Techniques will give you a really good grounding in the theory and industry practice. Expanded to include the latest digital audio technology, the 7th edition now includes sections on podcasting, new surround sound formats, and HD audio. If you are just starting out or looking for a step up

  11. Simple and accurate analytical calculation of shortest path lengths

    Melnik, Sergey

    2016-01-01

    We present an analytical approach to calculating the distribution of shortest path lengths (also called intervertex distances, or geodesic paths) between nodes in unweighted undirected networks. We obtain very accurate results for synthetic random networks with specified degree distribution (the so-called configuration model networks). Our method allows us to accurately predict the distribution of shortest path lengths on real-world networks using their degree distribution, or joint degree-degree distribution. Compared to some other methods, our approach is simpler and yields more accurate results. In order to obtain the analytical results, we use the analogy between an infection reaching a node in $n$ discrete time steps (i.e., as in the susceptible-infected epidemic model) and that node being at a distance $n$ from the source of the infection.
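
    As a sanity check on any analytical prediction of this kind, the shortest-path-length distribution of a small test network can be computed directly by breadth-first search. The sketch below is a brute-force reference implementation in Python (not the analytical method of the paper) and uses a hypothetical toy graph.

    from collections import deque, Counter

    def shortest_path_length_distribution(adj):
        """adj: dict mapping node -> list of neighbours (undirected, unweighted)."""
        counts = Counter()
        for source in adj:
            dist = {source: 0}
            queue = deque([source])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            for target, d in dist.items():
                if target != source:
                    counts[d] += 1
        total = sum(counts.values())
        return {d: c / total for d, c in sorted(counts.items())}

    # 6-node cycle: distances 1, 2 and 3 occur with probabilities 0.4, 0.4, 0.2.
    ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    print(shortest_path_length_distribution(ring))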

  12. Accurate and Simple Calibration of DLP Projector Systems

    Wilm, Jakob; Olesen, Oline Vinter; Larsen, Rasmus

    2014-01-01

    Much work has been devoted to the calibration of optical cameras, and accurate and simple methods are now available which require only a small number of calibration targets. The problem of obtaining these parameters for light projectors has not been studied as extensively and most current methods require a camera and involve feature extraction from a known projected pattern. In this work we present a novel calibration technique for DLP projector systems based on phase shifting profilometry projection onto a printed calibration target. In contrast to most current methods, the one presented here does not rely on an initial camera calibration, and so does not carry over the error into projector calibration. A radial interpolation scheme is used to convert feature coordinates into projector space, thereby allowing for a very accurate procedure. This allows for highly accurate determination of...

  13. Accurate level set method for simulations of liquid atomization☆

    Changxiao Shao; Kun Luo; Jianshan Yang; Song Chen; Jianren Fan

    2015-01-01

    Computational fluid dynamics is an efficient numerical approach for spray atomization study, but it is challenging to accurately capture the gas–liquid interface. In this work, an accurate conservative level set method is introduced to accurately track the gas–liquid interfaces in liquid atomization. To validate the capability of this method, binary drop collision and drop impacting on liquid film are investigated. The results are in good agreement with experimental observations. In addition, primary atomization (swirling sheet atomization) is studied using this method. For the swirling sheet atomization, it is found that Rayleigh–Taylor instability in the azimuthal direction causes the primary breakup of the liquid sheet, and complex vortex structures are clustered around the rim of the liquid sheet. The effects of central gas velocity and liquid–gas density ratio on atomization are also investigated. This work lays a solid foundation for further studying the mechanism of spray atomization.

  14. Accurate nuclear radii and binding energies from a chiral interaction

    Ekstrom, A; Wendt, K A; Hagen, G; Papenbrock, T; Carlsson, B D; Forssen, C; Hjorth-Jensen, M; Navratil, P; Nazarewicz, W

    2015-01-01

    The accurate reproduction of nuclear radii and binding energies is a long-standing challenge in nuclear theory. To address this problem, two-nucleon and three-nucleon forces from chiral effective field theory are optimized simultaneously to low-energy nucleon-nucleon scattering data, as well as binding energies and radii of few-nucleon systems and selected isotopes of carbon and oxygen. Coupled-cluster calculations based on this interaction, named NNLOsat, yield accurate binding energies and radii of nuclei up to ⁴⁰Ca, and are consistent with the empirical saturation point of symmetric nuclear matter. In addition, the low-lying collective 3⁻ states in ¹⁶O and ⁴⁰Ca are described accurately, while spectra for selected p- and sd-shell nuclei are in reasonable agreement with experiment.

  15. Equivalent method for accurate solution to linear interval equations

    王冲; 邱志平

    2013-01-01

    Based on linear interval equations, an accurate interval finite element method for solving structural static problems with uncertain parameters in terms of optimization is discussed. On the premise of ensuring the consistency of solution sets, the original interval equations are equivalently transformed into deterministic inequalities. On this basis, calculating the structural displacement response with interval parameters is reduced to a number of deterministic linear optimization problems. The results are proved to be accurate to the interval governing equations. Finally, a numerical example is given to demonstrate the feasibility and efficiency of the proposed method.

  16. Accurate upwind-monotone (nonoscillatory) methods for conservation laws

    Huynh, Hung T.

    1992-01-01

    The well-known MUSCL scheme of Van Leer is constructed using a piecewise linear approximation. The MUSCL scheme is second-order accurate in the smooth part of the solution except at extrema, where the accuracy degenerates to first order due to the monotonicity constraint. To construct accurate schemes which are free from oscillations, the author introduces the concept of upwind monotonicity. Several classes of schemes, which are upwind monotone and of uniform second- or third-order accuracy, are then presented. Results for advection with constant speed are shown. It is also shown that the new scheme compares favorably with state-of-the-art methods.

  17. An Innovative Imputation and Classification Approach for Accurate Disease Prediction

    UshaRani, Yelipe; Sammulal, P.

    2016-01-01

    Imputation of missing attribute values in medical datasets, which is needed for extracting hidden knowledge from such datasets, is an interesting and challenging research topic. One cannot eliminate missing values in medical records: some tests may not have been conducted because of their cost, values may have been missed when conducting clinical trials, or values may simply not have been recorded, to name a few of the reasons. Data mining researchers have been proposing various approa...

  18. Record Keeping Guidelines

    American Psychologist, 2007

    2007-01-01

    These guidelines are designed to educate psychologists and provide a framework for making decisions regarding professional record keeping. State and federal laws, as well as the American Psychological Association's "Ethical Principles of Psychologists and Code of Conduct," generally require maintenance of appropriate records of psychological…

  19. Disturbance recording system

    A computerized system for disturbance monitoring, recording and display has been developed for use in nuclear power plants and is versatile enough to be used wherever a large number of parameters need to be recorded, e.g. conventional power plants, chemical industry etc. The Disturbance Recording System (DRS) has been designed to continuously monitor a process plant and record crucial parameters. The DRS provides a centralized facility to monitor and continuously record 64 process parameters scanned every 1 sec for 5 days. The system also provides a facility for storage of 64 parameters scanned every 200 msec during 2 minutes prior to and 3 minutes after a disturbance. In addition the system can initiate, on demand, the recording of 8 parameters at a fast rate of every 5 msec for a period of 5 sec and thus act as a visicorder. All this data is recorded in non-volatile memory and can be displayed, printed/plotted and used for subsequent analysis. Since data can be stored densely on floppy disks, the volume of space required for archival storage is also low. As a disturbance recorder, the DRS allows the operator to view the state of the plant prior to occurrence of the disturbance and helps in identifying the root cause. (author). 10 refs., 7 figs

  20. The Evolving Scholarly Record

    Lavoie, Brian; Childress, Eric; Erway, Ricky; Faniel, Ixchel; Malpas, Constance; Schaffner, Jennifer; van der Werf, Titia

    2014-01-01

    The ways and means of scholarly inquiry are experiencing fundamental change, with consequences for scholarly communication and ultimately, the scholarly record. The boundaries of the scholarly record are both expanding and blurring, driven by changes in research practices, as well as changing perceptions of the long-term value of certain forms of…

  1. A Simple and Accurate Model to Predict Responses to Multi-electrode Stimulation in the Retina.

    Maturana, Matias I; Apollo, Nicholas V; Hadjinicolaou, Alex E; Garrett, David J; Cloherty, Shaun L; Kameneva, Tatiana; Grayden, David B; Ibbotson, Michael R; Meffin, Hamish

    2016-04-01

    Implantable electrode arrays are widely used in therapeutic stimulation of the nervous system (e.g. cochlear, retinal, and cortical implants). Currently, most neural prostheses use serial stimulation (i.e. one electrode at a time) despite this severely limiting the repertoire of stimuli that can be applied. Methods to reliably predict the outcome of multi-electrode stimulation have not been available. Here, we demonstrate that a linear-nonlinear model accurately predicts neural responses to arbitrary patterns of stimulation using in vitro recordings from single retinal ganglion cells (RGCs) stimulated with a subretinal multi-electrode array. In the model, the stimulus is projected onto a low-dimensional subspace and then undergoes a nonlinear transformation to produce an estimate of spiking probability. The low-dimensional subspace is estimated using principal components analysis, which gives the neuron's electrical receptive field (ERF), i.e. the electrodes to which the neuron is most sensitive. Our model suggests that stimulation proportional to the ERF yields a higher efficacy given a fixed amount of power when compared to equal amplitude stimulation on up to three electrodes. We find that the model captures the responses of all the cells recorded in the study, suggesting that it will generalize to most cell types in the retina. The model is computationally efficient to evaluate and, therefore, appropriate for future real-time applications including stimulation strategies that make use of recorded neural activity to improve the stimulation strategy. PMID:27035143
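
    The linear-nonlinear structure described here is straightforward to express in code: the stimulus vector is projected onto a PCA-derived subspace (the electrical receptive field) and the projection is passed through a saturating nonlinearity to give a spiking probability. The Python sketch below is only an illustration of that structure; the filter, the logistic nonlinearity and all parameter values are assumptions, not the fitted model from the study.

    import numpy as np

    rng = np.random.default_rng(0)
    n_electrodes, n_trials = 20, 5000

    # Hypothetical stimulation amplitudes (one column per electrode).
    stimuli = rng.normal(size=(n_trials, n_electrodes))

    # One-dimensional subspace from the stimulus ensemble via PCA (SVD).
    centered = stimuli - stimuli.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    erf = vt[0]                      # leading principal component ~ "ERF"

    def spike_probability(stimulus, erf, gain=2.0, threshold=0.5):
        """Linear projection followed by an assumed logistic nonlinearity."""
        drive = stimulus @ erf
        return 1.0 / (1.0 + np.exp(-gain * (drive - threshold)))

    print("mean predicted spiking probability:", spike_probability(stimuli, erf).mean())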

  2. A Simple and Accurate Model to Predict Responses to Multi-electrode Stimulation in the Retina.

    Matias I Maturana

    2016-04-01

    Implantable electrode arrays are widely used in therapeutic stimulation of the nervous system (e.g. cochlear, retinal, and cortical implants). Currently, most neural prostheses use serial stimulation (i.e. one electrode at a time) despite this severely limiting the repertoire of stimuli that can be applied. Methods to reliably predict the outcome of multi-electrode stimulation have not been available. Here, we demonstrate that a linear-nonlinear model accurately predicts neural responses to arbitrary patterns of stimulation using in vitro recordings from single retinal ganglion cells (RGCs) stimulated with a subretinal multi-electrode array. In the model, the stimulus is projected onto a low-dimensional subspace and then undergoes a nonlinear transformation to produce an estimate of spiking probability. The low-dimensional subspace is estimated using principal components analysis, which gives the neuron's electrical receptive field (ERF), i.e. the electrodes to which the neuron is most sensitive. Our model suggests that stimulation proportional to the ERF yields a higher efficacy given a fixed amount of power when compared to equal amplitude stimulation on up to three electrodes. We find that the model captures the responses of all the cells recorded in the study, suggesting that it will generalize to most cell types in the retina. The model is computationally efficient to evaluate and, therefore, appropriate for future real-time applications including stimulation strategies that make use of recorded neural activity to improve the stimulation strategy.

  3. Palm computer demonstrates a fast and accurate means of burn data collection.

    Lal, S O; Smith, F W; Davis, J P; Castro, H Y; Smith, D W; Chinkes, D L; Barrow, R E

    2000-01-01

    Manual biomedical data collection and entry of the data into a personal computer is time-consuming and can be prone to errors. The purpose of this study was to compare data entry into a hand-held computer versus handwritten data followed by entry of the data into a personal computer. A Palm (3Com Palm IIIx, Santa Clara, Calif) computer with a custom menu-driven program was used for the entry and retrieval of burn-related variables. These variables were also used to create an identical sheet that was filled in by hand. Identical data were retrieved twice from 110 charts 48 hours apart and then used to create an Excel (Microsoft, Redmond, Wash) spreadsheet. One time the data were recorded by the Palm entry method, and the other time the data were handwritten. The method of retrieval was alternated between the Palm system and the handwritten system every 10 charts. The total time required to log data and to generate an Excel spreadsheet was recorded and used as a study endpoint. The total time for the Palm method of data collection and downloading to a personal computer was 23% faster than hand recording with the personal computer entry method (P < 0.05), and fewer errors were generated with the Palm method. The Palm is a faster and more accurate means of data collection than a handwritten technique. PMID:11194811

  4. A Simple and Accurate Model to Predict Responses to Multi-electrode Stimulation in the Retina

    Maturana, Matias I.; Apollo, Nicholas V.; Hadjinicolaou, Alex E.; Garrett, David J.; Cloherty, Shaun L.; Kameneva, Tatiana; Grayden, David B.; Ibbotson, Michael R.; Meffin, Hamish

    2016-01-01

    Implantable electrode arrays are widely used in therapeutic stimulation of the nervous system (e.g. cochlear, retinal, and cortical implants). Currently, most neural prostheses use serial stimulation (i.e. one electrode at a time) despite this severely limiting the repertoire of stimuli that can be applied. Methods to reliably predict the outcome of multi-electrode stimulation have not been available. Here, we demonstrate that a linear-nonlinear model accurately predicts neural responses to arbitrary patterns of stimulation using in vitro recordings from single retinal ganglion cells (RGCs) stimulated with a subretinal multi-electrode array. In the model, the stimulus is projected onto a low-dimensional subspace and then undergoes a nonlinear transformation to produce an estimate of spiking probability. The low-dimensional subspace is estimated using principal components analysis, which gives the neuron’s electrical receptive field (ERF), i.e. the electrodes to which the neuron is most sensitive. Our model suggests that stimulation proportional to the ERF yields a higher efficacy given a fixed amount of power when compared to equal amplitude stimulation on up to three electrodes. We find that the model captures the responses of all the cells recorded in the study, suggesting that it will generalize to most cell types in the retina. The model is computationally efficient to evaluate and, therefore, appropriate for future real-time applications including stimulation strategies that make use of recorded neural activity to improve the stimulation strategy. PMID:27035143

  5. Cultural Heritage Recording Utilising Low-Cost Closerange Photogrammetry

    Melanie Kirchhöfer; Jim Chandler; Rene Wackrow

    2011-01-01

    Cultural heritage is under a constant threat of damage or even destruction and comprehensive and accurate recording is necessary to attenuate the risk of losing heritage or serve as basis for reconstruction. Cost effective and easy to use methods are required to record cultural heritage, particularly during a world recession, and close-range photogrammetry has proven potential in this area. Off-the-shelf digital cameras can be used to rapidly acquire data at low cost, allowing non-experts to ...

  6. Is Expressive Language Disorder an Accurate Diagnostic Category?

    Leonard, Laurence B.

    2009-01-01

    Purpose: To propose that the diagnostic category of "expressive language disorder" as distinct from a disorder of both expressive and receptive language might not be accurate. Method: Evidence that casts doubt on a pure form of this disorder is reviewed from several sources, including the literature on genetic findings, theories of language…

  7. Accurate momentum transfer cross section for the attractive Yukawa potential

    Khrapak, S. A., E-mail: Sergey.Khrapak@dlr.de [Forschungsgruppe Komplexe Plasmen, Deutsches Zentrum für Luft- und Raumfahrt, Oberpfaffenhofen (Germany)

    2014-04-15

    An accurate expression for the momentum transfer cross section for the attractive Yukawa potential is proposed. This simple analytic expression agrees with the numerical results to better than ±2% in the regime relevant for ion-particle collisions in complex (dusty) plasmas.

  8. Is a Writing Sample Necessary for "Accurate Placement"?

    Sullivan, Patrick; Nielsen, David

    2009-01-01

    The scholarship about assessment for placement is extensive and notoriously ambiguous. Foremost among the questions that continue to be unresolved in this scholarship is this one: Is a writing sample necessary for "accurate placement"? Using a robust data sample of student assessment essays and ACCUPLACER test scores, we put this question to the…

  9. Accurately Detecting Students' Lies regarding Relational Aggression by Correctional Instructions

    Dickhauser, Oliver; Reinhard, Marc-Andre; Marksteiner, Tamara

    2012-01-01

    This study investigates the effect of correctional instructions when detecting lies about relational aggression. Based on models from the field of social psychology, we predict that correctional instruction will lead to a less pronounced lie bias and to more accurate lie detection. Seventy-five teachers received videotapes of students' true denial…

  10. Fast and Accurate Residential Fire Detection Using Wireless Sensor Networks

    Bahrepour, Majid; Meratnia, Nirvana; Havinga, Paul J.M.

    2010-01-01

    Prompt and accurate residential fire detection is important for on-time fire extinguishing and consequently reducing damages and life losses. To detect fire, sensors are needed to measure the environmental parameters and algorithms are required to decide about the occurrence of fire. Recently, wireless s

  11. Efficient and accurate sound propagation using adaptive rectangular decomposition.

    Raghuvanshi, Nikunj; Narain, Rahul; Lin, Ming C

    2009-01-01

    Accurate sound rendering can add significant realism to complement visual display in interactive applications, as well as facilitate acoustic predictions for many engineering applications, like accurate acoustic analysis for architectural design. Numerical simulation can provide this realism most naturally by modeling the underlying physics of wave propagation. However, wave simulation has traditionally posed a tough computational challenge. In this paper, we present a technique which relies on an adaptive rectangular decomposition of 3D scenes to enable efficient and accurate simulation of sound propagation in complex virtual environments. It exploits the known analytical solution of the Wave Equation in rectangular domains, and utilizes an efficient implementation of the Discrete Cosine Transform on Graphics Processors (GPU) to achieve at least a 100-fold performance gain compared to a standard Finite-Difference Time-Domain (FDTD) implementation with comparable accuracy, while also being 10-fold more memory efficient. Consequently, we are able to perform accurate numerical acoustic simulation on large, complex scenes in the kilohertz range. To the best of our knowledge, it was not previously possible to perform such simulations on a desktop computer. Our work thus enables acoustic analysis on large scenes and auditory display for complex virtual environments on commodity hardware. PMID:19590105

  12. Accurate Period Approximation for Any Simple Pendulum Amplitude

    XUE De-Sheng; ZHOU Zhao; GAO Mei-Zhen

    2012-01-01

    Accurate approximate analytical formulae of the pendulum period composed of a few elementary functions for any amplitude are constructed. Based on an approximation of the elliptic integral, two new logarithmic formulae for large amplitude close to 180° are obtained. Considering the trigonometric function modulation results from the dependence of relative error on the amplitude, we realize accurate approximation period expressions for any amplitude between 0 and 180°. A relative error less than 0.02% is achieved for any amplitude. This kind of modulation is also effective for other large-amplitude logarithmic approximation expressions.
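
    For reference, the exact pendulum period can be written with the complete elliptic integral of the first kind, T = 4·sqrt(L/g)·K(m) with m = sin²(θ₀/2) in SciPy's parameter convention, so any approximate formula can be checked numerically. The short Python sketch below compares the exact period with the small-angle value 2π·sqrt(L/g); it is a generic check, not the approximation scheme of the paper.

    import numpy as np
    from scipy.special import ellipk

    def pendulum_period(theta0, length=1.0, g=9.81):
        """Exact period T = 4*sqrt(L/g)*K(m), with m = sin^2(theta0/2)."""
        m = np.sin(theta0 / 2.0) ** 2
        return 4.0 * np.sqrt(length / g) * ellipk(m)

    T0 = 2.0 * np.pi * np.sqrt(1.0 / 9.81)   # small-angle period for L = 1 m
    for deg in (10, 60, 120, 170):
        T = pendulum_period(np.radians(deg))
        print(f"{deg:3d} deg: T = {T:.4f} s, small-angle error = {abs(T - T0) / T:.3%}")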

  13. Second-order accurate nonoscillatory schemes for scalar conservation laws

    Huynh, Hung T.

    1989-01-01

    Explicit finite difference schemes for the computation of weak solutions of nonlinear scalar conservation laws are presented and analyzed. These schemes are uniformly second-order accurate and nonoscillatory in the sense that the number of extrema of the discrete solution is not increasing in time.

  14. Accurate segmentation of dense nanoparticles by partially discrete electron tomography

    Roelandts, T., E-mail: tom.roelandts@ua.ac.be [IBBT-Vision Lab University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Batenburg, K.J. [IBBT-Vision Lab University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, 1098 XG Amsterdam (Netherlands); Biermans, E. [EMAT, University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Kuebel, C. [Institute of Nanotechnology, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Sijbers, J. [IBBT-Vision Lab University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium)

    2012-03-15

    Accurate segmentation of nanoparticles within various matrix materials is a difficult problem in electron tomography. Due to artifacts related to image series acquisition and reconstruction, global thresholding of reconstructions computed by established algorithms, such as weighted backprojection or SIRT, may result in unreliable and subjective segmentations. In this paper, we introduce the Partially Discrete Algebraic Reconstruction Technique (PDART) for computing accurate segmentations of dense nanoparticles of constant composition. The particles are segmented directly by the reconstruction algorithm, while the surrounding regions are reconstructed using continuously varying gray levels. As no properties are assumed for the other compositions of the sample, the technique can be applied to any sample where dense nanoparticles must be segmented, regardless of the surrounding compositions. For both experimental and simulated data, it is shown that PDART yields significantly more accurate segmentations than those obtained by optimal global thresholding of the SIRT reconstruction. Highlights: ► We present a novel reconstruction method for partially discrete electron tomography. ► It accurately segments dense nanoparticles directly during reconstruction. ► The gray level to use for the nanoparticles is determined objectively. ► The method expands the set of samples for which discrete tomography can be applied.

  15. Accurate momentum transfer cross section for the attractive Yukawa potential

    Khrapak, Sergey

    2014-01-01

    An accurate expression for the momentum transfer cross section for the attractive Yukawa potential is proposed. This simple analytic expression agrees with the numerical results to better than 2% in the regime relevant for ion-particle collisions in complex (dusty) plasmas.

  16. Accurate momentum transfer cross section for the attractive Yukawa potential

    Khrapak, S. A.

    2014-01-01

    An accurate expression for the momentum transfer cross section for the attractive Yukawa potential is proposed. This simple analytic expression agrees with the numerical results to better than $\pm 2\%$ in the regime relevant for ion-particle collisions in complex (dusty) plasmas.

  17. On the importance of having accurate data for astrophysical modelling

    Lique, Francois

    2016-06-01

    The Herschel telescope and the ALMA and NOEMA interferometers have opened new windows of observation for wavelengths ranging from far infrared to sub-millimeter with spatial and spectral resolutions previously unmatched. To make the most of these observations, an accurate knowledge of the physical and chemical processes occurring in the interstellar and circumstellar media is essential. In this presentation, I will discuss the current needs of astrophysics in terms of molecular data and I will show that accurate molecular data are crucial for the proper determination of the physical conditions in molecular clouds. First, I will focus on collisional excitation studies that are needed for molecular line modelling beyond the Local Thermodynamic Equilibrium (LTE) approach. In particular, I will show how new collisional data for the HCN and HNC isomers, two tracers of star forming conditions, have allowed solving the problem of their respective abundance in cold molecular clouds. I will also present the latest collisional data that have been computed in order to analyse new highly resolved observations provided by the ALMA interferometer. Then, I will present the calculation of accurate rate constants for the F+H2 → HF+H and Cl+H2 ↔ HCl+H reactions, which have allowed a more accurate determination of the physical conditions in diffuse molecular clouds. I will also present the recent work on ortho-para-H2 conversion due to hydrogen exchange, which allows a more accurate determination of the ortho-to-para-H2 ratio in the universe and implies a significant revision of the cooling mechanism in astrophysical media.

  18. Electronic Health Records

    Kierkegaard, Patrick

    2011-01-01

    that a centralised European health record system will become a reality even before 2020. However, the concept of a centralised supranational central server raises concern about storing electronic medical records in a central location. The privacy threat posed by a supranational network is a key concern. Cross-border and interoperable electronic health record systems make confidential data more easily and rapidly accessible to a wider audience and increase the risk that personal data concerning health could be accidentally exposed or easily distributed to unauthorised parties by enabling greater access to a compilation...

  19. Electronic Health Record

    Kierkegaard, Patrick

    2011-01-01

    that a centralised European health record system will become a reality even before 2020. However, the concept of a centralised supranational central server raises concern about storing electronic medical records in a central location. The privacy threat posed by a supranational network is a key concern. Cross-border and interoperable electronic health record systems make confidential data more easily and rapidly accessible to a wider audience and increase the risk that personal data concerning health could be accidentally exposed or easily distributed to unauthorised parties by enabling greater access to a compilation...

  20. Method for Accurately Calibrating a Spectrometer Using Broadband Light

    Simmons, Stephen; Youngquist, Robert

    2011-01-01

    A novel method has been developed for performing very fine calibration of a spectrometer. This process is particularly useful for modern miniature charge-coupled device (CCD) spectrometers where a typical factory wavelength calibration has been performed and a finer, more accurate calibration is desired. Typically, the factory calibration is done with a spectral line source that generates light at known wavelengths, allowing specific pixels in the CCD array to be assigned wavelength values. This method is good to about 1 nm across the spectrometer's wavelength range. This new method appears to be accurate to about 0.1 nm, a factor of ten improvement. White light is passed through an unbalanced Michelson interferometer, producing an optical signal with significant spectral variation. A simple theory can be developed to describe this spectral pattern, so by comparing the actual spectrometer output against this predicted pattern, errors in the wavelength assignment made by the spectrometer can be determined.
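
    The underlying idea can be illustrated with the standard two-beam interference result: white light passed through a Michelson interferometer with path imbalance ΔL produces a spectrum modulated as I(λ) ∝ 1 + cos(2πΔL/λ), so a wavelength-assignment error shows up as a shift between the predicted and recorded fringe pattern. The Python sketch below assumes this ideal model and hypothetical numbers; it is not the instrument-specific theory used in the method.

    import numpy as np

    delta = 20e-6                                    # assumed 20 micron path imbalance
    wavelengths = np.linspace(400e-9, 800e-9, 2048)  # nominal pixel wavelengths

    predicted = 0.5 * (1.0 + np.cos(2.0 * np.pi * delta / wavelengths))

    # A 0.2 nm miscalibration shifts the measured fringes relative to the prediction.
    measured = 0.5 * (1.0 + np.cos(2.0 * np.pi * delta / (wavelengths + 0.2e-9)))
    print("rms mismatch from a 0.2 nm offset:", np.sqrt(np.mean((predicted - measured) ** 2)))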

  1. Multimodal Spatial Calibration for Accurately Registering EEG Sensor Positions

    Jianhua Zhang

    2014-01-01

    This paper proposes a fast and accurate calibration method to calibrate multiple multimodal sensors using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on a human head and the multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multiple-view calibration process is implemented to obtain the transformations of multiple views. We first develop an efficient local repair algorithm to improve the depth map, and then a special calibration body is designed. Based on them, accurate and robust calibration results can be achieved. We evaluate the proposed method by corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method can achieve good performance, which can be further applied to EEG source localization applications on the human brain.

  2. Accurate multireference study of Si3 electronic manifold

    Goncalves, Cayo Emilio Monteiro; Braga, Joao Pedro

    2016-01-01

    Since it has been shown that the silicon trimer has a highly multi-reference character, accurate multi-reference configuration interaction calculations are performed to elucidate its electronic manifold. Emphasis is given to the long-range part of the potential, aiming to understand the dynamical aspects of atom-diatom collisions and to describe conical intersections and important saddle points along the reactive path. Analysis of the main features of the potential energy surface is performed for benchmarking, and highly accurate values for structures, vibrational constants and energy gaps are reported, as well as the previously unpublished spin-orbit coupling magnitude. The results predict that inter-system crossings will play an important role in dynamical simulations, especially in triplet state quenching, making the problem of constructing a precise potential energy surface more complicated and multi-layer dependent. The ground state is predicted to be the singlet one, but since the singlet-triplet gap is rather small (2.448 kJ/mol) bo...

  3. Simple and High-Accurate Schemes for Hyperbolic Conservation Laws

    Renzhong Feng

    2014-01-01

    The paper constructs a class of simple high-accurate schemes (SHA schemes) with third-order approximation accuracy in both space and time to solve linear hyperbolic equations, using linear data reconstruction and the Lax-Wendroff scheme. The schemes can be made even fourth-order accurate with a special choice of parameter. In order to avoid spurious oscillations in the vicinity of strong gradients, we make the SHA schemes total variation diminishing (TVD schemes for short) by setting a flux limiter in their numerical fluxes and then extend these schemes to solve the nonlinear Burgers’ equation and the Euler equations. The numerical examples show that these schemes give high order of accuracy and high-resolution results. The advantages of these schemes are their simplicity and high order of accuracy.
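
    The Lax-Wendroff building block the construction starts from is compact enough to show directly. The Python sketch below advances linear advection u_t + a·u_x = 0 on a periodic grid with the classic second-order Lax-Wendroff update; it is only the unlimited baseline scheme, not the third-order SHA scheme or its TVD variant.

    import numpy as np

    def lax_wendroff_step(u, c):
        """One update with Courant number c = a*dt/dx, periodic boundaries."""
        up = np.roll(u, -1)   # u_{i+1}
        um = np.roll(u, 1)    # u_{i-1}
        return u - 0.5 * c * (up - um) + 0.5 * c**2 * (up - 2.0 * u + um)

    n, c = 200, 0.5
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    u0 = np.exp(-200.0 * (x - 0.5) ** 2)    # smooth initial profile
    u = u0.copy()
    for _ in range(int(n / c)):             # advect once around the unit domain
        u = lax_wendroff_step(u, c)
    print("max error after one period:", np.abs(u - u0).max())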

  4. Fixed-Wing Micro Aerial Vehicle for Accurate Corridor Mapping

    Rehak, M.; Skaloud, J.

    2015-08-01

    In this study we present a Micro Aerial Vehicle (MAV) equipped with precise position and attitude sensors that together with a pre-calibrated camera enables accurate corridor mapping. The design of the platform is based on widely available model components to which we integrate an open-source autopilot, a customized mass-market camera and navigation sensors. We adapt the concepts of system calibration from larger mapping platforms to the MAV and evaluate them practically for their achievable accuracy. We present case studies for accurate mapping without ground control points: first for a block configuration, later for a narrow corridor. We evaluate the mapping accuracy with respect to checkpoints and a digital terrain model. We show that while it is possible to achieve pixel-level (3-5 cm) mapping accuracy in both cases, precise aerial position control is sufficient for the block configuration, whereas precise position and attitude control is required for corridor mapping.

  5. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  6. Accurate Load Modeling Based on Analytic Hierarchy Process

    Zhenshu Wang

    2016-01-01

    Establishing an accurate load model is a critical problem in power system modeling and is of significant importance for power system digital simulation and dynamic security analysis. The synthesis load model (SLM) considers the impact of the power distribution network and compensation capacitors, while the randomness of the power load is more precisely described by the traction power system load model (TPSLM). On the basis of these two load models, a load modeling method that combines synthesis load with traction power load is proposed in this paper. This method uses the analytic hierarchy process (AHP) to interact with the two load models. Weight coefficients of the two models can be calculated after formulating criteria and judgment matrixes and then establishing a synthesis model by weight coefficients. The effectiveness of the proposed method was examined through simulation. The results show that accurate load modeling based on AHP can effectively improve the accuracy of the load model and prove the validity of this method.

  7. Accurate adjoint design sensitivities for nano metal optics.

    Hansen, Paul; Hesselink, Lambertus

    2015-09-01

    We present a method for obtaining accurate numerical design sensitivities for metal-optical nanostructures. Adjoint design sensitivity analysis, long used in fluid mechanics and mechanical engineering for both optimization and structural analysis, is beginning to be used for nano-optics design, but it fails for sharp-cornered metal structures because the numerical error in electromagnetic simulations of metal structures is highest at sharp corners. These locations feature strong field enhancement and contribute strongly to design sensitivities. By using high-accuracy FEM calculations and rounding sharp features to a finite radius of curvature we obtain highly-accurate design sensitivities for 3D metal devices. To provide a bridge to the existing literature on adjoint methods in other fields, we derive the sensitivity equations for Maxwell's equations in the PDE framework widely used in fluid mechanics. PMID:26368483

  8. Efficient and Accurate Robustness Estimation for Large Complex Networks

    Wandelt, Sebastian

    2016-01-01

    Robustness estimation is critical for the design and maintenance of resilient networks, one of the global challenges of the 21st century. Existing studies exploit network metrics to generate attack strategies, which simulate intentional attacks in a network, and compute a metric-induced robustness estimation. While some metrics are easy to compute, e.g. degree centrality, other, more accurate, metrics require considerable computation effort, e.g. betweenness centrality. We propose a new algorithm for estimating the robustness of a network in sub-quadratic time, i.e., significantly faster than betweenness centrality. Experiments on real-world networks and random networks show that our algorithm estimates the robustness of networks close to or even better than betweenness centrality, while being orders of magnitude faster. Our work contributes towards scalable, yet accurate methods for robustness estimation of large complex networks.
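
    A common metric-induced robustness estimate of the kind referred to here removes nodes in order of a chosen centrality and records the surviving fraction of the largest connected component after each removal; the mean of that curve is the robustness value. The brute-force Python sketch below uses static degree as the attack metric on a hypothetical toy graph and is meant only to illustrate the quantity being estimated, not the sub-quadratic algorithm of the paper.

    from collections import deque

    def largest_component_fraction(adj, removed):
        seen, best = set(), 0
        for start in adj:
            if start in removed or start in seen:
                continue
            size, queue = 0, deque([start])
            seen.add(start)
            while queue:
                u = queue.popleft()
                size += 1
                for v in adj[u]:
                    if v not in removed and v not in seen:
                        seen.add(v)
                        queue.append(v)
            best = max(best, size)
        return best / len(adj)

    def degree_attack_robustness(adj):
        order = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
        removed, curve = set(), []
        for node in order:
            removed.add(node)
            curve.append(largest_component_fraction(adj, removed))
        return sum(curve) / len(curve)

    star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
    print(degree_attack_robustness(star))   # low: removing the hub fragments the star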

  9. The FLUKA code: An accurate simulation tool for particle therapy

    Battistoni, Giuseppe; Böhlen, Till T; Cerutti, Francesco; Chin, Mary Pik Wai; Dos Santos Augusto, Ricardo M; Ferrari, Alfredo; Garcia Ortega, Pablo; Kozlowska, Wioletta S; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically-based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in-vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with bot...

  10. Records Management Database

    US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  11. Herbicide application records

    US Fish and Wildlife Service, Department of the Interior — This document contains records of pesticide applications on Neal Smith National Wildlife Refuge (Walnut Creek National Wildlife Refuge) between 1995 and 2006.

  12. Environmental Review Records

    Department of Housing and Urban Development — HUD’s Environmental Review Records page houses environmental reviews made publicly available through the HUD Environmental Review Online System (HEROS). This...

  13. A novel automated image analysis method for accurate adipocyte quantification

    Osman, Osman S.; Selway, Joanne L; Kępczyńska, Małgorzata A; Stocker, Claire J.; O’Dowd, Jacqueline F; Cawthorne, Michael A.; Arch, Jonathan RS; Jassim, Sabah; Langlands, Kenneth

    2013-01-01

    Increased adipocyte size and number are associated with many of the adverse effects observed in metabolic disease states. While methods to quantify such changes in the adipocyte are of scientific and clinical interest, manual methods to determine adipocyte size are both laborious and intractable to large scale investigations. Moreover, existing computational methods are not fully automated. We, therefore, developed a novel automatic method to provide accurate measurements of the cross-section...

  14. Combinatorial Approaches to Accurate Identification of Orthologous Genes

    Shi, Guanqun

    2011-01-01

    The accurate identification of orthologous genes across different species is a critical and challenging problem in comparative genomics and has a wide spectrum of biological applications including gene function inference, evolutionary studies and systems biology. During the past several years, many methods have been proposed for ortholog assignment based on sequence similarity, phylogenetic approaches, synteny information, and genome rearrangement. Although these methods share many commonly a...

  15. Strategy Guideline. Accurate Heating and Cooling Load Calculations

    Burdick, Arlan [IBACOS, Inc., Pittsburgh, PA (United States)

    2011-06-01

    This guide presents the key criteria required to create accurate heating and cooling load calculations and offers examples of the implications when inaccurate adjustments are applied to the HVAC design process. The guide shows, through realistic examples, how various defaults and arbitrary safety factors can lead to significant increases in the load estimate. Emphasis is placed on the risks incurred from inaccurate adjustments or ignoring critical inputs of the load calculation.

  16. Strategy Guideline: Accurate Heating and Cooling Load Calculations

    Burdick, A.

    2011-06-01

    This guide presents the key criteria required to create accurate heating and cooling load calculations and offers examples of the implications when inaccurate adjustments are applied to the HVAC design process. The guide shows, through realistic examples, how various defaults and arbitrary safety factors can lead to significant increases in the load estimate. Emphasis is placed on the risks incurred from inaccurate adjustments or ignoring critical inputs of the load calculation.

  17. Evaluation of accurate eye corner detection methods for gaze estimation

    Bengoechea, Jose Javier; Cerrolaza, Juan J.; Villanueva, Arantxa; Cabeza, Rafael

    2014-01-01

    Accurate detection of iris center and eye corners appears to be a promising approach for low cost gaze estimation. In this paper we propose novel eye inner corner detection methods. Appearance and feature based segmentation approaches are suggested. All these methods are exhaustively tested on a realistic dataset containing images of subjects gazing at different points on a screen. We have demonstrated that a method based on a neural network presents the best performance even in light changin...

  18. Building with Drones: Accurate 3D Facade Reconstruction using MAVs

    Daftry, Shreyansh; Hoppe, Christof; Bischof, Horst

    2015-01-01

    Automatic reconstruction of 3D models from images using multi-view Structure-from-Motion methods has been one of the most fruitful outcomes of computer vision. These advances combined with the growing popularity of Micro Aerial Vehicles as an autonomous imaging platform, have made 3D vision tools ubiquitous for large number of Architecture, Engineering and Construction applications among audiences, mostly unskilled in computer vision. However, to obtain high-resolution and accurate reconstruc...

  19. Mouse models of human AML accurately predict chemotherapy response

    Zuber, Johannes; Radtke, Ina; Pardee, Timothy S.; Zhao, Zhen; Rappaport, Amy R.; Luo, Weijun; McCurrach, Mila E.; Yang, Miao-Miao; Dolan, M. Eileen; Kogan, Scott C.; Downing, James R.; Lowe, Scott W.

    2009-01-01

    The genetic heterogeneity of cancer influences the trajectory of tumor progression and may underlie clinical variation in therapy response. To model such heterogeneity, we produced genetically and pathologically accurate mouse models of common forms of human acute myeloid leukemia (AML) and developed methods to mimic standard induction chemotherapy and efficiently monitor therapy response. We see that murine AMLs harboring two common human AML genotypes show remarkably diverse responses to co...

  20. Accurate Multisteps Traffic Flow Prediction Based on SVM

    Zhang Mingheng; Zhen Yaobao; Hui Ganglong; Chen Gang

    2013-01-01

    Accurate traffic flow prediction is prerequisite and important for realizing intelligent traffic control and guidance, and it is also the objective requirement for intelligent traffic management. Due to the strong nonlinear, stochastic, time-varying characteristics of urban transport system, artificial intelligence methods such as support vector machine (SVM) are now receiving more and more attentions in this research field. Compared with the traditional single-step prediction method, the mul...
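
    A minimal recursive multi-step setup illustrates the idea: an SVM regressor is trained on lagged flow values and each prediction is fed back into the input window to produce the next step. The Python sketch below uses scikit-learn's SVR on synthetic data; the window length, kernel and parameters are assumptions, not those of the paper.

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    flow = 100 + 20 * np.sin(np.linspace(0, 20 * np.pi, 600)) + rng.normal(0, 2, 600)

    lag = 8
    X = np.array([flow[i:i + lag] for i in range(len(flow) - lag)])
    y = flow[lag:]
    model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X[:-50], y[:-50])

    # Recursive multi-step forecast: feed each prediction back into the window.
    window = list(flow[-50 - lag:-50])
    forecast = []
    for _ in range(10):
        nxt = float(model.predict(np.array(window[-lag:]).reshape(1, -1))[0])
        forecast.append(nxt)
        window.append(nxt)
    print("10-step forecast:", np.round(forecast, 1))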

  1. Accurate calibration of stereo cameras for machine vision

    Li, Liangfu; Feng, Zuren; Feng, Yuanjing

    2004-01-01

    Camera calibration is an important task for machine vision, whose goal is to obtain the internal and external parameters of each camera. With these parameters, the 3D positions of a scene point, which is identified and matched in two stereo images, can be determined by the triangulation theory. This paper presents a new accurate estimation of CCD camera parameters for machine vision. We present a fast technique to estimate the camera center with special arrangement of calibration target and t...

  2. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    Mark Shortis

    2015-01-01

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation a...

  3. Fast and Accurate Bilateral Filtering using Gauss-Polynomial Decomposition

    Chaudhury, Kunal N.

    2015-01-01

    The bilateral filter is a versatile non-linear filter that has found diverse applications in image processing, computer vision, computer graphics, and computational photography. A widely-used form of the filter is the Gaussian bilateral filter in which both the spatial and range kernels are Gaussian. A direct implementation of this filter requires $O(\sigma^2)$ operations per pixel, where $\sigma$ is the standard deviation of the spatial Gaussian. In this paper, we propose an accurate approxi...
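
    The cost being addressed is easy to see in a direct implementation: every output pixel needs a window of roughly (3σ)² neighbours, each weighted by a spatial and a range Gaussian. The Python sketch below is that naive baseline for a grayscale image, included only to make the per-pixel cost concrete; it does not reproduce the Gauss-polynomial acceleration.

    import numpy as np

    def bilateral_filter(img, sigma_s=3.0, sigma_r=0.1):
        """Direct Gaussian bilateral filter (slow, O(sigma_s^2) work per pixel)."""
        radius = int(3 * sigma_s)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))
        padded = np.pad(img, radius, mode="reflect")
        out = np.empty_like(img, dtype=float)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                range_w = np.exp(-((patch - img[i, j]) ** 2) / (2.0 * sigma_r ** 2))
                weights = spatial * range_w
                out[i, j] = np.sum(weights * patch) / np.sum(weights)
        return out

    noisy = np.clip(np.eye(32) + 0.05 * np.random.default_rng(0).normal(size=(32, 32)), 0, 1)
    print(bilateral_filter(noisy).shape)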

  4. Accurate Insertion Loss Measurements of the Juno Patch Array Antennas

    Chamberlain, Neil; Chen, Jacqueline; Hodges, Richard; Demas, John

    2010-01-01

    This paper describes two independent methods for estimating the insertion loss of patch array antennas that were developed for the Juno Microwave Radiometer instrument. One method is based principally on pattern measurements while the other method is based solely on network analyzer measurements. The methods are accurate to within 0.1 dB for the measured antennas and show good agreement (to within 0.1 dB) with separate radiometric measurements.

  5. Dejavu: An Accurate Energy-Efficient Outdoor Localization System

    Aly, Heba; Youssef, Moustafa

    2013-01-01

    We present Dejavu, a system that uses standard cell-phone sensors to provide accurate and energy-efficient outdoor localization suitable for car navigation. Our analysis shows that different road landmarks have a unique signature on cell-phone sensors; For example, going inside tunnels, moving over bumps, going up a bridge, and even potholes all affect the inertial sensors on the phone in a unique pattern. Dejavu employs a dead-reckoning localization approach and leverages these road landmark...

  6. Accurate Parameter Estimation for Unbalanced Three-Phase System

    Yuan Chen; Hing Cheung So

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the αβ-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newt...
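
    The first step mentioned, converting the three phase voltages into an orthogonal pair, is the Clarke (αβ) transformation. The Python sketch below uses the common amplitude-invariant form on a synthetic, slightly unbalanced waveform; it illustrates only this preprocessing step, not the nonlinear least squares estimator itself.

    import numpy as np

    def clarke_transform(va, vb, vc):
        """Map three-phase samples to the orthogonal alpha/beta pair (amplitude-invariant form)."""
        alpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)
        beta = (2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (vb - vc)
        return alpha, beta

    t = np.linspace(0, 0.04, 200)
    f = 50.0
    va = 1.00 * np.cos(2 * np.pi * f * t)
    vb = 0.95 * np.cos(2 * np.pi * f * t - 2 * np.pi / 3)   # slight unbalance (assumed)
    vc = 1.05 * np.cos(2 * np.pi * f * t + 2 * np.pi / 3)
    alpha, beta = clarke_transform(va, vb, vc)
    print("alpha/beta peak amplitudes:", alpha.max().round(3), beta.max().round(3))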

  7. Accurate, inexpensive testing of laser pointer power for safe operation

    An accurate, inexpensive test-bed for the measurement of optical power emitted from handheld lasers is described. The setup consists of a power meter, optical bandpass filters, an adjustable iris and self-centering lens mounts. We demonstrate this test-bed by evaluating the output power of 23 laser pointers with respect to the limits imposed by the US Code of Federal Regulations. We find a compliance rate of only 26%. A discussion of potential laser pointer hazards is included. (paper)

  8. DOMAC: an accurate, hybrid protein domain prediction server

    Cheng, Jianlin

    2007-01-01

    Protein domain prediction is important for protein structure prediction, structure determination, function annotation, mutagenesis analysis and protein engineering. Here we describe an accurate protein domain prediction server (DOMAC) combining both template-based and ab initio methods. The preliminary version of the server was ranked among the top domain prediction servers in the seventh edition of Critical Assessment of Techniques for Protein Structure Prediction (CASP7), 2006. DOMAC server...

  9. A multiple more accurate Hardy-Littlewood-Polya inequality

    Qiliang Huang

    2012-11-01

    By introducing multi-parameters and conjugate exponents and using the Euler-Maclaurin summation formula, we estimate the weight coefficient and prove a multiple more accurate Hardy-Littlewood-Polya (H-L-P) inequality, which is an extension of some earlier published results. We also prove that the constant factor in the new inequality is the best possible, and obtain its equivalent forms.

  10. Shock Emergence in Supernovae: Limiting Cases and Accurate Approximations

    Ro, Stephen

    2013-01-01

    We examine the dynamics of accelerating normal shocks in stratified planar atmospheres, providing accurate fitting formulae for the scaling index relating shock velocity to the initial density and for the post-shock acceleration factor as functions of the polytropic and adiabatic indices which parameterize the problem. In the limit of a uniform initial atmosphere there are analytical formulae for these quantities. In the opposite limit of a very steep density gradient the solutions match the outcome of shock acceleration in exponential atmospheres.

  11. Shock Emergence in Supernovae: Limiting Cases and Accurate Approximations

    Ro, Stephen; Matzner, Christopher D.

    2013-08-01

    We examine the dynamics of accelerating normal shocks in stratified planar atmospheres, providing accurate fitting formulae for the scaling index relating shock velocity to the initial density and for the post-shock acceleration factor as functions of the polytropic and adiabatic indices which parameterize the problem. In the limit of a uniform initial atmosphere, there are analytical formulae for these quantities. In the opposite limit of a very steep density gradient, the solutions match the outcome of shock acceleration in exponential atmospheres.

  12. SHOCK EMERGENCE IN SUPERNOVAE: LIMITING CASES AND ACCURATE APPROXIMATIONS

    Ro, Stephen; Matzner, Christopher D. [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George St., Toronto, ON M5S 3H4 (Canada)

    2013-08-10

    We examine the dynamics of accelerating normal shocks in stratified planar atmospheres, providing accurate fitting formulae for the scaling index relating shock velocity to the initial density and for the post-shock acceleration factor as functions of the polytropic and adiabatic indices which parameterize the problem. In the limit of a uniform initial atmosphere, there are analytical formulae for these quantities. In the opposite limit of a very steep density gradient, the solutions match the outcome of shock acceleration in exponential atmospheres.

  13. An accurate and robust gyroscope-based pedometer.

    Lim, Yoong P; Brown, Ian T; Khoo, Joshua C T

    2008-01-01

    Pedometers are known to have step-estimation issues, mainly attributable to their acceleration-based sensing. A pedometer based on a micro-machined gyroscope (with better immunity to acceleration) is proposed. Through syntactic recognition of a priori knowledge of the human shank's dynamics and temporally precise detection of heel strikes permitted by wavelet decomposition, an accurate and robust pedometer is obtained. PMID:19163737

  14. Accurate calculation of thermal noise in multilayer coating

    Gurkovsky, Alexey; Vyatchanin, Sergey

    2010-01-01

    We derive accurate formulas for thermal fluctuations in multilayer interferometric coating taking into account light propagation inside the coating. In particular, we calculate the reflected wave phase as a function of small displacements of the boundaries between the layers using transmission line model for interferometric coating and derive formula for spectral density of reflected phase in accordance with Fluctuation-Dissipation Theorem. We apply the developed approach for calculation of t...

  15. Novel multi-beam radiometers for accurate ocean surveillance

    Cappellin, C.; Pontoppidan, K.; Nielsen, P. H.;

    2014-01-01

    Novel antenna architectures for real aperture multi-beam radiometers providing high resolution and high sensitivity for accurate sea surface temperature (SST) and ocean vector wind (OVW) measurements are investigated. On the basis of the radiometer requirements set for future SST/OVW missions, conical scanners and push-broom antennas are compared. The comparison will cover reflector optics and focal plane array configuration.

  16. Strategy for accurate liver intervention by an optical tracking system

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Guan, Peifeng; Xiao, Weihu; Wu, Xiaoming

    2015-01-01

    Image-guided navigation for radiofrequency ablation of liver tumors requires the accurate guidance of needle insertion into a tumor target. The main challenge of image-guided navigation for radiofrequency ablation of liver tumors is the occurrence of liver deformations caused by respiratory motion. This study reports a strategy of real-time automatic registration to track custom fiducial markers glued onto the surface of a patient’s abdomen to find the respiratory phase, in which the static p...

  17. Efficient and Accurate Path Cost Estimation Using Trajectory Data

    Dai, Jian; Yang, Bin; Guo, Chenjuan; Jensen, Christian S.

    2015-01-01

    Using the growing volumes of vehicle trajectory data, it becomes increasingly possible to capture time-varying and uncertain travel costs in a road network, including travel time and fuel consumption. The current paradigm represents a road network as a graph, assigns weights to the graph's edges by fragmenting trajectories into small pieces that fit the underlying edges, and then applies a routing algorithm to the resulting graph. We propose a new paradigm that targets more accurate and more ...

  18. Accurate molecular classification of cancer using simple rules

    Gotoh Osamu; Wang Xiaosheng

    2009-01-01

    Background: One intractable problem with using microarray data analysis for cancer classification is how to reduce the extremely high-dimensional gene feature data to remove the effects of noise. Feature selection is often used to address this problem by selecting informative genes from among thousands or tens of thousands of genes. However, most of the existing methods of microarray-based cancer classification utilize too many genes to achieve accurate classification, which often ...

  19. Accurate Identification of Fear Facial Expressions Predicts Prosocial Behavior

    Marsh, Abigail A.; Kozak, Megan N.; Ambady, Nalini

    2007-01-01

    The fear facial expression is a distress cue that is associated with the provision of help and prosocial behavior. Prior psychiatric studies have found deficits in the recognition of this expression by individuals with antisocial tendencies. However, no prior study has shown accuracy for recognition of fear to predict actual prosocial or antisocial behavior in an experimental setting. In 3 studies, the authors tested the prediction that individuals who recognize fear more accurately will beha...

  20. Continuous glucose monitors prove highly accurate in critically ill children

    Bridges, Brian C.; Preissig, Catherine M; Maher, Kevin O.; Rigby, Mark R

    2010-01-01

    Introduction Hyperglycemia is associated with increased morbidity and mortality in critically ill patients and strict glycemic control has become standard care for adults. Recent studies have questioned the optimal targets for such management and reported increased rates of iatrogenic hypoglycemia in both critically ill children and adults. The ability to provide accurate, real-time continuous glucose monitoring would improve the efficacy and safety of this practice in critically ill patients...

  1. Accurate quantum state estimation via "Keeping the experimentalist honest"

    Blume-Kohout, R; Blume-Kohout, Robin; Hayden, Patrick

    2006-01-01

    In this article, we derive a unique procedure for quantum state estimation from a simple, self-evident principle: an experimentalist's estimate of the quantum state generated by an apparatus should be constrained by honesty. A skeptical observer should subject the estimate to a test that guarantees that a self-interested experimentalist will report the true state as accurately as possible. We also find a non-asymptotic, operational interpretation of the quantum relative entropy function.

  2. A highly accurate method to solve Fisher’s equation

    Mehdi Bastani; Davod Khojasteh Salkuyeh

    2012-03-01

    In this study, we present a new and very accurate numerical method to approximate Fisher-type equations. Firstly, the spatial derivative in the proposed equation is approximated by a sixth-order compact finite difference (CFD6) scheme. Secondly, we solve the obtained system of differential equations using a third-order total variation diminishing Runge–Kutta (TVD-RK3) scheme. Numerical examples are given to illustrate the efficiency of the proposed method.
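
    The following is a minimal sketch of the scheme combination named in the abstract, under several assumptions: periodic boundaries, the classical Lele coefficients for the sixth-order compact second derivative, and a simple front-like initial condition. The exact scheme and parameters used in the paper may differ.

        import numpy as np

        # Fisher's equation u_t = D*u_xx + r*u*(1-u), discretized with a
        # sixth-order compact finite difference (CFD6) in space and a
        # third-order TVD Runge-Kutta (Shu-Osher) scheme in time.

        def cfd6_second_derivative_matrices(n, h):
            """LHS (A) and RHS (B) matrices of the compact scheme
               alpha*f''_{i-1} + f''_i + alpha*f''_{i+1}
                 = a*(f_{i+1}-2f_i+f_{i-1})/h^2 + b*(f_{i+2}-2f_i+f_{i-2})/(4h^2)."""
            alpha, a, b = 2.0 / 11.0, 12.0 / 11.0, 3.0 / 11.0
            A = np.eye(n)
            B = np.zeros((n, n))
            for i in range(n):
                A[i, (i - 1) % n] = alpha
                A[i, (i + 1) % n] = alpha
                B[i, i] = -2.0 * a / h**2 - 2.0 * b / (4.0 * h**2)
                B[i, (i - 1) % n] += a / h**2
                B[i, (i + 1) % n] += a / h**2
                B[i, (i - 2) % n] += b / (4.0 * h**2)
                B[i, (i + 2) % n] += b / (4.0 * h**2)
            return A, B

        def rhs(u, D, r, A_inv_B):
            return D * (A_inv_B @ u) + r * u * (1.0 - u)

        def tvd_rk3_step(u, dt, D, r, A_inv_B):
            # Shu-Osher third-order TVD Runge-Kutta step.
            u1 = u + dt * rhs(u, D, r, A_inv_B)
            u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1, D, r, A_inv_B))
            return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2, D, r, A_inv_B))

        n, L, D, r = 200, 40.0, 1.0, 1.0
        h = L / n
        x = np.arange(n) * h
        A, B = cfd6_second_derivative_matrices(n, h)
        A_inv_B = np.linalg.solve(A, B)          # precompute A^{-1} B once
        u = 1.0 / (1.0 + np.exp(x - 10.0))       # smooth front as initial condition
        dt = 0.2 * h**2                          # conservative explicit time step
        for _ in range(2000):
            u = tvd_rk3_step(u, dt, D, r, A_inv_B)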

  3. Accurate Method for Determining Adhesion of Cantilever Beams

    Michalske, T.A.; de Boer, M.P.

    1999-01-08

    Using surface micromachined samples, we demonstrate the accurate measurement of cantilever beam adhesion by using test structures which are adhered over long attachment lengths. We show that this configuration has a deep energy well, such that a fracture equilibrium is easily reached. When compared to the commonly used method of determining the shortest attached beam, the present method is much less sensitive to variations in surface topography or to details of capillary drying.

  4. A robust and accurate formulation of molecular and colloidal electrostatics

    Sun, Qiang; Klaseboer, Evert; Chan, Derek Y. C.

    2016-08-01

    This paper presents a re-formulation of the boundary integral method for the Debye-Hückel model of molecular and colloidal electrostatics that removes the mathematical singularities that have to date been accepted as an intrinsic part of the conventional boundary integral equation method. The essence of the present boundary regularized integral equation formulation consists of subtracting a known solution from the conventional boundary integral method in such a way as to cancel out the singularities associated with the Green's function. This approach better reflects the non-singular physical behavior of the systems on boundaries with the benefits of the following: (i) the surface integrals can be evaluated accurately using quadrature without any need to devise special numerical integration procedures, (ii) being able to use quadratic or spline function surface elements to represent the surface more accurately and the variation of the functions within each element is represented to a consistent level of precision by appropriate interpolation functions, (iii) being able to calculate electric fields, even at boundaries, accurately and directly from the potential without having to solve hypersingular integral equations and this imparts high precision in calculating the Maxwell stress tensor and consequently, intermolecular or colloidal forces, (iv) a reliable way to handle geometric configurations in which different parts of the boundary can be very close together without being affected by numerical instabilities, therefore potentials, fields, and forces between surfaces can be found accurately at surface separations down to near contact, and (v) having the simplicity of a formulation that does not require complex algorithms to handle singularities will result in significant savings in coding effort and in the reduction of opportunities for coding errors. These advantages are illustrated using examples drawn from molecular and colloidal electrostatics.

  5. Robust Small Sample Accurate Inference in Moment Condition Models

    Serigne N. Lo; Elvezio Ronchetti

    2006-01-01

    Procedures based on the Generalized Method of Moments (GMM) (Hansen, 1982) are basic tools in modern econometrics. In most cases, the theory available for making inference with these procedures is based on first order asymptotic theory. It is well-known that the (first order) asymptotic distribution does not provide accurate p-values and confidence intervals in moderate to small samples. Moreover, in the presence of small deviations from the assumed model, p-values and confidence intervals ba...

  6. Recording information about immunizations

    Gadsby, Roger

    1980-01-01

    The recording of information on triple plus polio and rubella immunizations is reviewed and immunization rates determined for patients in a single-handed practice. Rates of triple plus polio immunizations are satisfactory but rates for rubella immunization are very poor. Immunization information is not exchanged between different sections of the Health Service in Stoke-on-Trent and so the general practitioner has no reliable immunization record for his patients.

  7. Is bioelectrical impedance accurate for use in large epidemiological studies?

    Merchant Anwar T

    2008-09-01

    Percentage of body fat is strongly associated with the risk of several chronic diseases but its accurate measurement is difficult. Bioelectrical impedance analysis (BIA) is a relatively simple, quick and non-invasive technique to measure body composition. It measures body fat accurately in controlled clinical conditions but its performance in the field is inconsistent. In large epidemiologic studies simpler surrogate techniques such as body mass index (BMI), waist circumference, and waist-hip ratio are frequently used instead of BIA to measure body fatness. We reviewed the rationale, theory, and technique of recently developed systems such as foot (or hand) to foot BIA measurement, and the elements that could influence its results in large epidemiologic studies. BIA results are influenced by factors such as the environment, ethnicity, phase of menstrual cycle, and underlying medical conditions. We concluded that BIA measurements validated for specific ethnic groups, populations and conditions can accurately measure body fat in those populations, but not others, and suggest that for large epidemiological studies with diverse populations BIA may not be the appropriate choice for body composition measurement unless specific calibration equations are developed for the different groups participating in the study.

  8. Accurate thermoelastic tensor and acoustic velocities of NaCl

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures are still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamics conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry

  9. Can blind persons accurately assess body size from the voice?

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. PMID:27095264

  10. An accurate determination of the flux within a slab

    During the past decade, several articles have been written concerning accurate solutions to the monoenergetic neutron transport equation in infinite and semi-infinite geometries. The numerical formulations found in these articles were based primarily on the extensive theoretical investigations performed by the "transport greats" such as Chandrasekhar, Busbridge, Sobolev, and Ivanov, to name a few. The development of numerical solutions in infinite and semi-infinite geometries represents an example of how mathematical transport theory can be utilized to provide highly accurate and efficient numerical transport solutions. These solutions, or analytical benchmarks, are useful as "industry standards," which provide guidance to code developers and promote learning in the classroom. The high accuracy of these benchmarks is directly attributable to the rapid advancement of the state of computing and computational methods. Transport calculations that were beyond the capability of the "supercomputers" of just a few years ago are now possible at one's desk. In this paper, we again build upon the past to tackle the slab problem, which is of the next level of difficulty in comparison to infinite media problems. The formulation is based on the monoenergetic Green's function, which is the most fundamental transport solution. This method of solution requires a fast and accurate evaluation of the Green's function, which, with today's computational power, is now readily available

  11. Accurate pose estimation using single marker single camera calibration system

    Pati, Sarthak; Erat, Okan; Wang, Lejing; Weidert, Simon; Euler, Ekkehard; Navab, Nassir; Fallavollita, Pascal

    2013-03-01

    Visual marker based tracking is one of the most widely used tracking techniques in Augmented Reality (AR) applications. Generally, multiple square markers are needed to perform robust and accurate tracking. Various marker based methods for calibrating relative marker poses have already been proposed. However, the calibration accuracy of these methods relies on the order of the image sequence and pre-evaluation of pose-estimation errors, making the method offline. Several studies have shown that the accuracy of pose estimation for an individual square marker depends on camera distance and viewing angle. We propose a method to accurately model the error in the estimated pose and translation of a camera using a single marker via an online method based on the Scaled Unscented Transform (SUT). Thus, the pose estimation for each marker can be estimated with highly accurate calibration results independent of the order of image sequences compared to cases when this knowledge is not used. This removes the need for having multiple markers and an offline estimation system to calculate camera pose in an AR application.
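
    As an illustration of the Scaled Unscented Transform mentioned above, the sketch below propagates a mean and covariance through a nonlinear function using deterministically chosen sigma points. The projection function, focal length and covariance values are placeholder assumptions, not the authors' camera/marker model.

        import numpy as np

        def scaled_unscented_transform(mean, cov, func, alpha=1e-3, beta=2.0, kappa=0.0):
            n = mean.size
            lam = alpha**2 * (n + kappa) - n
            sqrt_cov = np.linalg.cholesky((n + lam) * cov)

            # 2n+1 sigma points around the mean.
            sigma_points = [mean]
            for i in range(n):
                sigma_points.append(mean + sqrt_cov[:, i])
                sigma_points.append(mean - sqrt_cov[:, i])
            sigma_points = np.array(sigma_points)

            # Standard SUT weights for mean and covariance.
            wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
            wc = wm.copy()
            wm[0] = lam / (n + lam)
            wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)

            transformed = np.array([func(p) for p in sigma_points])
            mean_out = wm @ transformed
            diff = transformed - mean_out
            cov_out = (wc[:, None] * diff).T @ diff
            return mean_out, cov_out

        # Toy example: propagate uncertainty in a 3-D camera translation through
        # a pinhole projection with an assumed focal length of 800 px.
        def project(t, f=800.0):
            return np.array([f * t[0] / t[2], f * t[1] / t[2]])

        t_mean = np.array([0.05, -0.02, 1.5])     # metres, hypothetical pose
        t_cov = np.diag([1e-4, 1e-4, 4e-4])       # hypothetical pose covariance
        px_mean, px_cov = scaled_unscented_transform(t_mean, t_cov, project)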

  12. Accurate thermoelastic tensor and acoustic velocities of NaCl

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures are still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamics conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  13. Accurate thermoelastic tensor and acoustic velocities of NaCl

    Marcondes, Michel L., E-mail: michel@if.usp.br [Physics Institute, University of Sao Paulo, Sao Paulo, 05508-090 (Brazil); Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Shukla, Gaurav, E-mail: shukla@physics.umn.edu [School of Physics and Astronomy, University of Minnesota, Minneapolis, 55455 (United States); Minnesota supercomputer Institute, University of Minnesota, Minneapolis, 55455 (United States); Silveira, Pedro da [Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Wentzcovitch, Renata M., E-mail: wentz002@umn.edu [Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Minnesota supercomputer Institute, University of Minnesota, Minneapolis, 55455 (United States)

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures are still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamics conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  14. Interacting with image hierarchies for fast and accurate object segmentation

    Beard, David V.; Eberly, David H.; Hemminger, Bradley M.; Pizer, Stephen M.; Faith, R. E.; Kurak, Charles; Livingston, Mark

    1994-05-01

    Object definition is an increasingly important area of medical image research. Accurate and fairly rapid object definition is essential for measuring the size and, perhaps more importantly, the change in size of anatomical objects such as kidneys and tumors. Rapid and fairly accurate object definition is essential for 3D real-time visualization including both surgery planning and Radiation oncology treatment planning. One approach to object definition involves the use of 3D image hierarchies, such as Eberly's Ridge Flow. However, the image hierarchy segmentation approach requires user interaction in selecting regions and subtrees. Further, visualizing and comprehending the anatomy and the selected portions of the hierarchy can be problematic. In this paper we will describe the Magic Crayon tool which allows a user to define rapidly and accurately various anatomical objects by interacting with image hierarchies such as those generated with Eberly's Ridge Flow algorithm as well as other 3D image hierarchies. Preliminary results suggest that fairly complex anatomical objects can be segmented in under a minute with sufficient accuracy for 3D surgery planning, 3D radiation oncology treatment planning, and similar applications. Potential modifications to the approach for improved accuracy are summarized.

  15. Can clinicians accurately assess esophageal dilation without fluoroscopy?

    Bailey, A D; Goldner, F

    1990-01-01

    This study questioned whether clinicians could determine the success of esophageal dilation accurately without the aid of fluoroscopy. Twenty patients were enrolled with the diagnosis of distal esophageal stenosis, including benign peptic stricture (17), Schatzki's ring (2), and squamous cell carcinoma of the esophagus (1). Dilation attempts using only Maloney dilators were monitored fluoroscopically by the principal investigator, the physician and patient being unaware of the findings. Physicians then predicted whether or not their dilations were successful, and they examined various features to determine their usefulness in predicting successful dilation. They were able to predict successful dilation accurately in 97% of the cases studied; however, their predictions of unsuccessful dilation were correct only 60% of the time. Features helpful in predicting passage included easy passage of the dilator (98%) and the patient feeling the dilator in the stomach (95%). Excessive resistance suggesting unsuccessful passage was an unreliable feature and was often due to the dilator curling in the stomach. When Maloney dilators are used to dilate simple distal strictures, if the physician predicts successful passage, he is reliably accurate without the use of fluoroscopy; however, if unsuccessful passage is suspected, fluoroscopy must be used for confirmation. PMID:2210278

  16. Future Proof for Physics: Preserving the Record of SLAC

    Paper provides a brief introduction to SLAC, discusses the origins of the SLAC Archives and History Office, its present-day operations, and the present and future challenges it faces in attempting to preserve an accurate historical record of SLAC's activities

  17. 25 CFR 226.32 - Well records and reports.

    2010-04-01

    ... and character of oil, gas, or water in each formation, and the kind, weight, size, landed depth and... measure production of oil, gas, and water from individual wells at reasonably frequent intervals to the... keep accurate and complete records of the drilling, redrilling, deepening, repairing,...

  18. Accurate Full-Field Thermochromic Liquid Crystal Thermography for the Study of Instantaneous Turbulent Heat Transfer

    Sabatino, D. R.; Praisner, T. J.; Smith, C. R.

    1998-11-01

    The color change of thermochromic liquid crystals with temperature can be effectively utilized as full-field surface temperature sensors to investigate the fundamental structure of wall turbulence. In order to accurately quantify turbulent heat transfer behavior, a new technique has been developed for the calibration of wide-band micro-encapsulated thermochromic liquid crystals. Lighting/viewing arrangements are described and evaluated for ease of implementation and accuracy of the displayed color. This new technique employs images recorded in-situ with the test surface systematically exposed to a series of uniform temperature conditions spanning the bandwidth of the liquid crystals. This sequence of images is used to generate point-wise color/temperature calibration curves for the entire surface. Experimental results will be presented illustrating the application of the technique for assessment of spatial/temporal surface heat transfer behavior due to selected turbulent flows in a water channel
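
    A sketch of the point-wise calibration idea described above: a stack of images recorded in situ at known uniform surface temperatures is used to fit, for every pixel, a polynomial mapping colour to temperature. The use of hue as the colour variable, the cubic polynomial order and the array names are assumptions made for illustration.

        import numpy as np

        def fit_pointwise_calibration(hue_stack, temperatures, order=3):
            """hue_stack: (n_temps, H, W) hue values from the in-situ images.
               temperatures: (n_temps,) uniform temperatures of the test surface.
               Returns per-pixel polynomial coefficients, shape (order+1, H, W)."""
            n_temps, height, width = hue_stack.shape
            hues = hue_stack.reshape(n_temps, -1)              # (n_temps, H*W)
            coeffs = np.empty((order + 1, height * width))
            for p in range(height * width):
                # Least-squares fit T = poly(hue) for this pixel.
                coeffs[:, p] = np.polyfit(hues[:, p], temperatures, order)
            return coeffs.reshape(order + 1, height, width)

        def hue_to_temperature(hue_image, coeffs):
            """Evaluate the per-pixel polynomials on a new hue image."""
            order = coeffs.shape[0] - 1
            temp = np.zeros_like(hue_image, dtype=float)
            for k in range(order + 1):
                temp = temp * hue_image + coeffs[k]            # Horner evaluation
            return temp

        # Hypothetical usage: 8 calibration images spanning the crystal bandwidth.
        cal_temps = np.linspace(25.0, 33.0, 8)                 # degrees C
        cal_hues = np.random.rand(8, 64, 64)                   # placeholder data
        coeffs = fit_pointwise_calibration(cal_hues, cal_temps)
        surface_T = hue_to_temperature(np.random.rand(64, 64), coeffs)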

  19. Validation of a wrist monitor for accurate estimation of RR intervals during sleep.

    Renevey, Ph; Sola, J; Theurillat, P; Bertschi, M; Krauss, J; Andries, D; Sartori, C

    2013-01-01

    While the incidence of sleep disorders is continuously increasing in western societies, there is a clear demand for technologies to assess sleep-related parameters in ambulatory scenarios. The present study introduces a novel concept of an accurate sensor to measure RR intervals via the analysis of photo-plethysmographic signals recorded at the wrist. In a cohort of 26 subjects undergoing full night polysomnography, the wrist device provided RR interval estimates in agreement with RR intervals as measured from standard electrocardiographic time series. The study showed an overall agreement between both approaches of 0.05 ± 18 ms. The novel wrist sensor opens the door towards a new generation of comfortable and easy-to-use sleep monitors. PMID:24110980
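
    A minimal sketch of how beat-to-beat (RR-like) intervals can be derived from a wrist photo-plethysmographic waveform: band-pass filter the signal, detect systolic peaks, and take successive peak-to-peak times. The filter band, sampling rate and thresholds are assumptions; this is not the validated algorithm of the study.

        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        def ppg_rr_intervals(ppg, fs=64.0):
            # Band-pass around plausible heart-rate frequencies (0.5-4 Hz, ~30-240 bpm).
            b, a = butter(3, [0.5 / (fs / 2.0), 4.0 / (fs / 2.0)], btype="band")
            filtered = filtfilt(b, a, ppg)
            # Require peaks to be at least 0.3 s apart and reasonably prominent.
            peaks, _ = find_peaks(filtered,
                                  distance=int(0.3 * fs),
                                  prominence=0.5 * np.std(filtered))
            return np.diff(peaks) / fs        # seconds between successive beats

        # Hypothetical usage on a synthetic 60 bpm signal sampled at 64 Hz.
        fs = 64.0
        t = np.arange(0, 30.0, 1.0 / fs)
        ppg = np.sin(2 * np.pi * 1.0 * t) + 0.05 * np.random.randn(t.size)
        rr_seconds = ppg_rr_intervals(ppg, fs)   # expected to cluster around 1.0 s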

  20. Induced Dual-Nanospray: A Novel Internal Calibration Method for Convenient and Accurate Mass Measurement

    Li, Yafeng; Zhang, Ning; Zhou, Yueming; Wang, Jianing; Zhang, Yiming; Wang, Jiyun; Xiong, Caiqiao; Chen, Suming; Nie, Zongxiu

    2013-09-01

    Accurate mass information is of great importance in the determination of unknown compounds. An effective and easy-to-control internal mass calibration method will dramatically benefit accurate mass measurement. Here we reported a simple induced dual-nanospray internal calibration device which has the following three advantages: (1) the two sprayers are in the same alternating current field; thus both reference ions and sample ions can be simultaneously generated and recorded. (2) It is very simple and can be easily assembled. Just two metal tubes, two nanosprayers, and an alternating current power supply are included. (3) With the low-flow-rate character and the versatility of nanoESI, this calibration method is capable of calibrating various samples, even untreated complex samples such as urine and other biological samples with small sample volumes. The calibration errors are around 1 ppm in positive ion mode and 3 ppm in negative ion mode with good repeatability. This new internal calibration method opens up new possibilities in the determination of unknown compounds, and it has great potential for the broad applications in biological and chemical analysis.

  1. The accurate location of the injection- induced microearthquakes in German Continental Deep Drilling Program

    涂毅敏; 陈运泰

    2002-01-01

    From August 21, 2000 to October 20, 2000, a fluid injection-induced seismicity experiment was carried out in the KTB (German Continental Deep Drilling Program). The KTB seismic network recorded more than 2 700 events. Among them, 237 events were of high signal-to-noise ratio and were processed and accurately located. During location, non-KTB events were weeded out by Wadati's method. The standard deviation, mean and median were obtained by the jackknife technique, and the events were finally located accurately by Geiger's method, so that the mean error is about 0.1 km. No earthquakes with focal depth greater than 9.3 km, which is nearly at the bottom of the hole, were detected. One explanation is that at such depths the stress levels may not be close to the rock's frictional strength, so that failure could not be induced by the relatively small perturbation in pore pressure. Alternatively, at these depths there may be no permeable, well-oriented faults. This depth, close to the bottom of the hole, may be near the brittle-ductile transition, even in this relatively stable intraplate setting. This phenomenon is explained by the experimental results and geothermal data from the superdeep borehole.
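
    A simplified sketch of Geiger's method referenced above: starting from a trial hypocentre and origin time, travel-time residuals are linearized and the location is updated by least squares until convergence. A homogeneous P-wave velocity and the station geometry below are illustrative assumptions; the KTB study used a local velocity model.

        import numpy as np

        def geiger_locate(stations, arrivals, vp=6.0, x0=None, n_iter=20):
            """stations: (n, 3) coordinates in km; arrivals: (n,) observed times in s.
               Returns the model vector (x, y, z, origin_time)."""
            m = np.zeros(4) if x0 is None else np.asarray(x0, dtype=float)
            for _ in range(n_iter):
                d = stations - m[:3]                      # station-to-source offsets
                dist = np.linalg.norm(d, axis=1)
                pred = m[3] + dist / vp                   # predicted arrival times
                resid = arrivals - pred
                # Jacobian of predicted time w.r.t. (x, y, z, t0).
                G = np.column_stack([-d / (vp * dist[:, None]), np.ones(len(dist))])
                dm, *_ = np.linalg.lstsq(G, resid, rcond=None)
                m += dm
                if np.linalg.norm(dm[:3]) < 1e-4:         # ~0.1 m location change
                    break
            return m

        # Hypothetical example: four surface stations and a synthetic event at 5 km depth.
        stations = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                             [0.0, 10.0, 0.0], [10.0, 10.0, 0.0]])
        true_src = np.array([4.0, 6.0, 5.0, 0.2])         # x, y, z (km), t0 (s)
        arrivals = true_src[3] + np.linalg.norm(stations - true_src[:3], axis=1) / 6.0
        print(geiger_locate(stations, arrivals, x0=[5.0, 5.0, 3.0, 0.0]))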

  2. Classification of meteorological systems and comparison of radar-estimated precipitation with that measured by the telemetric network in the upper Tietê watershed

    Fabrício Daniel Dos Santos Silva

    2009-09-01

    Five types of morphological systems were identified: Isolated Convection (CI), Maritime Breeze (BM), Squall Lines (LI), Dispersed Bands (BD), and Cold Fronts (FF). Convective events dominate in spring and summer and stratiform events in autumn and winter. CI and BM occurred more frequently between October and March, while cold fronts occurred from April to September. Dispersed Bands occurred throughout the year, and squall lines were absent only in June and July. A comparison between the precipitation measured by the telemetric network and that estimated by the radar showed a positive bias in the radar accumulations for 10, 30 and 60 minutes in the majority of cases. To integrate the radar precipitation estimates with the telemetric network measurements by means of an objective statistical analysis, the spatial correlation structures for rain accumulations of 15, 30, 60 and 120 minutes were obtained from the radar precipitation fields for the five characterized system types. The average spatial correlation curves of all precipitation events of each system were fitted with a sixth-order polynomial function. The results indicate significant differences in the spatial correlation structures among the precipitation systems.

  3. Records Center Program Billing System

    National Archives and Records Administration — RCPBS supports the Records center programs (RCP) in producing invoices for the storage (NARS-5) and servicing of National Archives and Records Administration’s...

  4. Accurate LAI retrieval method based on PROBA/CHRIS data

    W. Fan

    2009-11-01

    Leaf area index (LAI) is one of the key structural variables in terrestrial vegetation ecosystems. Remote sensing offers a chance to derive LAI at regional scales accurately. Variations of background, atmospheric conditions and the anisotropy of canopy reflectance are three factors that can strongly restrain the accuracy of retrieved LAI. Based on the hybrid canopy reflectance model, a new hyperspectral directional second derivative method (DSD) is proposed in this paper. This method can estimate LAI accurately through analyzing the canopy anisotropy. The effect of the background can also be effectively removed. So the inversion precision and the dynamic range can be improved remarkably, which has been proved by numerical simulations. As the derivative method is very sensitive to random noise, we put forward an innovative filtering approach by which the data can be de-noised in the spectral and spatial dimensions synchronously. It shows that the filtering method can remove the random noise effectively; therefore, the method can be applied to remotely sensed hyperspectral images. The study region is situated in Zhangye, Gansu Province, China; the hyperspectral and multi-angular image of the study region was acquired from the Compact High-Resolution Imaging Spectrometer/Project for On-Board Autonomy (CHRIS/PROBA) on 4 and 14 June 2008. After the pre-processing procedures, the DSD method was applied, and the retrieved LAI was validated against the ground truth of 11 sites. It shows that, by applying the innovative filtering method, the new LAI inversion method is accurate and effective.

  5. Fast and accurate estimation for astrophysical problems in large databases

    Richards, Joseph W.

    2010-10-01

    A recent flood of astronomical data has created much demand for sophisticated statistical and machine learning tools that can rapidly draw accurate inferences from large databases of high-dimensional data. In this Ph.D. thesis, methods for statistical inference in such databases will be proposed, studied, and applied to real data. I use methods for low-dimensional parametrization of complex, high-dimensional data that are based on the notion of preserving the connectivity of data points in the context of a Markov random walk over the data set. I show how this simple parameterization of data can be exploited to: define appropriate prototypes for use in complex mixture models, determine data-driven eigenfunctions for accurate nonparametric regression, and find a set of suitable features to use in a statistical classifier. In this thesis, methods for each of these tasks are built up from simple principles, compared to existing methods in the literature, and applied to data from astronomical all-sky surveys. I examine several important problems in astrophysics, such as estimation of star formation history parameters for galaxies, prediction of redshifts of galaxies using photometric data, and classification of different types of supernovae based on their photometric light curves. Fast methods for high-dimensional data analysis are crucial in each of these problems because they all involve the analysis of complicated high-dimensional data in large, all-sky surveys. Specifically, I estimate the star formation history parameters for the nearly 800,000 galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 spectroscopic catalog, determine redshifts for over 300,000 galaxies in the SDSS photometric catalog, and estimate the types of 20,000 supernovae as part of the Supernova Photometric Classification Challenge. Accurate predictions and classifications are imperative in each of these examples because these estimates are utilized in broader inference problems

  6. Fast and Accurate Construction of Confidence Intervals for Heritability.

    Schweiger, Regev; Kaufman, Shachar; Laaksonen, Reijo; Kleber, Marcus E; März, Winfried; Eskin, Eleazar; Rosset, Saharon; Halperin, Eran

    2016-06-01

    Estimation of heritability is fundamental in genetic studies. Recently, heritability estimation using linear mixed models (LMMs) has gained popularity because these estimates can be obtained from unrelated individuals collected in genome-wide association studies. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. Existing methods for the construction of confidence intervals and estimators of SEs for REML rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals. Here, we show that the estimation of confidence intervals by state-of-the-art methods is inaccurate, especially when the true heritability is relatively low or relatively high. We further show that these inaccuracies occur in datasets including thousands of individuals. Such biases are present, for example, in estimates of heritability of gene expression in the Genotype-Tissue Expression project and of lipid profiles in the Ludwigshafen Risk and Cardiovascular Health study. We also show that often the probability that the genetic component is estimated as 0 is high even when the true heritability is bounded away from 0, emphasizing the need for accurate confidence intervals. We propose a computationally efficient method, ALBI (accurate LMM-based heritability bootstrap confidence intervals), for estimating the distribution of the heritability estimator and for constructing accurate confidence intervals. Our method can be used as an add-on to existing methods for estimating heritability and variance components, such as GCTA, FaST-LMM, GEMMA, or EMMAX. PMID:27259052
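
    The toy sketch below illustrates the general idea behind bootstrap confidence intervals for heritability: maximize the likelihood of y ~ N(0, sigma^2*(h2*K + (1-h2)*I)) over a grid of h2 using the eigendecomposition of the kinship matrix K, then repeat on data simulated at the estimate to form a percentile interval. This is a plain parametric percentile bootstrap with a maximum-likelihood (not REML) fit, for illustration only; the ALBI method constructs its intervals differently and far more efficiently.

        import numpy as np

        def estimate_h2(y_rot, eigvals, grid=np.linspace(0.0, 0.999, 200)):
            """y_rot = U.T @ y, where K = U diag(eigvals) U.T.
               The grid stays just inside [0, 1) to avoid a degenerate
               covariance when K is rank-deficient."""
            n = y_rot.size
            best_h2, best_ll = 0.0, -np.inf
            for h2 in grid:
                v = h2 * eigvals + (1.0 - h2)      # per-component variance / sigma^2
                sigma2 = np.mean(y_rot**2 / v)     # profiled-out scale
                ll = -0.5 * (n * np.log(sigma2) + np.sum(np.log(v)) + n)
                if ll > best_ll:
                    best_h2, best_ll = h2, ll
            return best_h2

        def bootstrap_ci(y, K, n_boot=500, level=0.95, seed=0):
            rng = np.random.default_rng(seed)
            eigvals, U = np.linalg.eigh(K)
            y_rot = U.T @ y
            h2_hat = estimate_h2(y_rot, eigvals)
            sigma2_hat = np.mean(y_rot**2 / (h2_hat * eigvals + 1.0 - h2_hat))
            sd = np.sqrt(sigma2_hat * (h2_hat * eigvals + 1.0 - h2_hat))
            boots = np.array([estimate_h2(rng.normal(0.0, sd), eigvals)
                              for _ in range(n_boot)])
            lo, hi = np.quantile(boots, [(1 - level) / 2, 1 - (1 - level) / 2])
            return h2_hat, (lo, hi)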

  7. An accurate RLGC circuit model for dual tapered TSV structure

    A fast RLGC circuit model with analytical expression is proposed for the dual tapered through-silicon via (TSV) structure in three-dimensional integrated circuits under different slope angles at the wide frequency region. By describing the electrical characteristics of the dual tapered TSV structure, the RLGC parameters are extracted based on the numerical integration method. The RLGC model includes metal resistance, metal inductance, substrate resistance, outer inductance with skin effect and eddy effect taken into account. The proposed analytical model is verified to be nearly as accurate as the Q3D extractor but more efficient. (semiconductor integrated circuits)

  8. Accurate strand-specific quantification of viral RNA.

    Nicole E Plaskon

    The presence of full-length complements of viral genomic RNA is a hallmark of RNA virus replication within an infected cell. As such, methods for detecting and measuring specific strands of viral RNA in infected cells and tissues are important in the study of RNA viruses. Strand-specific quantitative real-time PCR (ssqPCR) assays are increasingly being used for this purpose, but the accuracy of these assays depends on the assumption that the amount of cDNA measured during the quantitative PCR (qPCR) step accurately reflects the amounts of a specific viral RNA strand present in the RT reaction. To specifically test this assumption, we developed multiple ssqPCR assays for the positive-strand RNA virus o'nyong-nyong (ONNV) that were based upon the most prevalent ssqPCR assay design types in the literature. We then compared various parameters of the ONNV-specific assays. We found that an assay employing standard unmodified virus-specific primers failed to discern the difference between cDNAs generated from virus-specific primers and those generated through false priming. Further, we were unable to accurately measure levels of ONNV (-) strand RNA with this assay when higher levels of cDNA generated from the (+) strand were present. Taken together, these results suggest that assays of this type do not accurately quantify levels of the anti-genomic strand present during RNA virus infectious cycles. However, an assay permitting the use of a tag-specific primer was able to distinguish cDNAs transcribed from ONNV (-) strand RNA from other cDNAs present, thus allowing accurate quantification of the anti-genomic strand. We also report the sensitivities of two different detection strategies and chemistries, SYBR Green and DNA hydrolysis probes, used with our tagged ONNV-specific ssqPCR assays. Finally, we describe the development, design and validation of ssqPCR assays for chikungunya virus (CHIKV), the recent cause of large outbreaks of disease in the Indian Ocean

  9. Accurately Determining the Risks of Rising Sea Level

    Marbaix, Philippe; Nicholls, Robert J.

    2007-10-01

    With the highest density of people and the greatest concentration of economic activity located in the coastal regions, sea level rise is an important concern as the climate continues to warm. Subsequent flooding may potentially disrupt industries, populations, and livelihoods, particularly in the long term if the climate is not quickly stabilized [McGranahan et al., 2007; Tol et al., 2006]. To help policy makers understand these risks, a more accurate description of hazards posed by rising sea levels is needed at the global scale, even though the impacts in specific regions are better known.

  10. Calibration Techniques for Accurate Measurements by Underwater Camera Systems.

    Shortis, Mark

    2015-01-01

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems. PMID:26690172
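
    As a point of reference for the calibration approaches reviewed above, the sketch below shows a standard checkerboard calibration with OpenCV, as is often applied to underwater housings by letting the pinhole-plus-distortion model implicitly absorb the refraction of a flat port. The board size, square size and image path are assumptions; explicit refractive models of the kind discussed in the review require dedicated formulations.

        import glob
        import numpy as np
        import cv2

        pattern = (9, 6)                       # inner corners of the checkerboard
        square = 0.025                         # square size in metres
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

        obj_points, img_points = [], []
        for fname in glob.glob("underwater_calib/*.png"):
            img = cv2.imread(fname)
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                corners = cv2.cornerSubPix(
                    gray, corners, (11, 11), (-1, -1),
                    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
                obj_points.append(objp)
                img_points.append(corners)

        rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)
        print("RMS reprojection error (px):", rms)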

  11. Accurate analysis of EBSD data for phase identification

    Palizdar, Y; Cochrane, R C; Brydson, R; Leary, R; Scott, A J, E-mail: preyp@leeds.ac.u [Institute for Materials Research, University of Leeds, Leeds LS2 9JT UK (United Kingdom)

    2010-07-01

    This paper aims to investigate the reliability of software default settings in the analysis of EBSD results. To study the effect of software settings on the EBSD results, the presence of different phases in high-Al steel has been investigated by EBSD. The results show the importance of appropriate automated analysis parameters for valid and reliable phase discrimination. Specifically, the importance of the minimum number of indexed bands and the maximum solution error has been investigated, with values of 7-9 and 1.0-1.5° respectively found to be needed for accurate analysis.

  12. Accurate Excited State Geometries within Reduced Subspace TDDFT/TDA.

    Robinson, David

    2014-12-01

    A method for the calculation of TDDFT/TDA excited state geometries within a reduced subspace of Kohn-Sham orbitals has been implemented and tested. Accurate geometries are found for all of the fluorophore-like molecules tested, with at most all valence occupied orbitals and half of the virtual orbitals included but for some molecules even fewer orbitals. Efficiency gains of between 15 and 30% are found for essentially the same level of accuracy as a standard TDDFT/TDA excited state geometry optimization calculation. PMID:26583218

  13. Accurate method of modeling cluster scaling relations in modified gravity

    He, Jian-hua; Li, Baojiu

    2016-06-01

    We propose a new method to model cluster scaling relations in modified gravity. Using a suite of nonradiative hydrodynamical simulations, we show that the scaling relations of accumulated gas quantities, such as the Sunyaev-Zel'dovich effect (Compton-y parameter) and the x-ray Compton-y parameter, can be accurately predicted using the known results in the Λ CDM model with a precision of ˜3 % . This method provides a reliable way to analyze the gas physics in modified gravity using the less demanding and much more efficient pure cold dark matter simulations. Our results therefore have important theoretical and practical implications in constraining gravity using cluster surveys.

  14. Accurate Programming: Thinking about programs in terms of properties

    Walid Taha

    2011-09-01

    Accurate programming is a practical approach to producing high quality programs. It combines ideas from test automation, test-driven development, agile programming, and other state-of-the-art software development methods. In addition to building on approaches that have proven effective in practice, it emphasizes concepts that help programmers sharpen their understanding of both the problems they are solving and the solutions they come up with. This is achieved by encouraging programmers to think about programs in terms of properties.
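
    One concrete way of thinking about programs in terms of properties is property-based testing: instead of fixed examples, properties that must hold for all inputs are stated and a framework searches for counterexamples. The sketch below uses Python's hypothesis library purely as an illustration of this style; it is not the specific methodology of the paper.

        from hypothesis import given, strategies as st

        def run_length_encode(s):
            """Encode a string as (character, count) pairs."""
            out = []
            for ch in s:
                if out and out[-1][0] == ch:
                    out[-1] = (ch, out[-1][1] + 1)
                else:
                    out.append((ch, 1))
            return out

        def run_length_decode(pairs):
            return "".join(ch * n for ch, n in pairs)

        @given(st.text())
        def test_decode_inverts_encode(s):
            # Property: decoding an encoding returns the original string.
            assert run_length_decode(run_length_encode(s)) == s

        @given(st.text(min_size=1))
        def test_runs_are_positive_and_alternate(s):
            pairs = run_length_encode(s)
            assert all(n >= 1 for _, n in pairs)
            # Property: adjacent runs never share the same character.
            assert all(a[0] != b[0] for a, b in zip(pairs, pairs[1:]))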

  15. Accurate studies on dissociation energies of diatomic molecules

    SUN WeiGuo; FAN QunChao

    2007-01-01

    The molecular dissociation energies of some electronic states of hydride and N2 molecules were studied using a parameter-free analytical formula suggested in this study and the algebraic method (AM) proposed recently. The results show that the accurate AM dissociation energies De(AM) agree excellently with the experimental dissociation energies De(expt), and that the dissociation energy of an electronic state such as the 2³Δg state of ⁷Li₂, whose experimental value is not available, can be predicted using the new formula.

  16. Pink-Beam, Highly-Accurate Compact Water Cooled Slits

    Advanced Design Consulting, Inc. (ADC) has designed accurate compact slits for applications where high precision is required. The system consists of vertical and horizontal slit mechanisms, a vacuum vessel which houses them, water cooling lines with vacuum guards connected to the individual blades, stepper motors with linear encoders, limit (home position) switches and electrical connections including internal wiring for a drain current measurement system. The total slit size is adjustable from 0 to 15 mm both vertically and horizontally. Each of the four blades are individually controlled and motorized. In this paper, a summary of the design and Finite Element Analysis of the system are presented

  17. Accurate laboratory boresight alignment of transmitter/receiver optical axes

    Martinek, Stephen J.

    1986-01-01

    An apparatus and procedure for the boresight alignment of the transmitter and receiver optical axes of a laser radar system are described. This accurate technique is applicable to both shared and dual aperture systems. A laser autostigmatic cube interferometer (LACI) is utilized to align a paraboloid in autocollimation. The LACI pinhole located at the paraboloid center of curvature becomes the far field receiver track and transmit reference point when illuminated by the transmit beam via a fiber optic pick-off/delay line. Boresight alignment accuracy better than 20 microrad is achievable.

  18. Fast and accurate methods of independent component analysis: A survey

    Tichavský, Petr; Koldovský, Zbyněk

    2011-01-01

    Vol. 47, No. 3 (2011), pp. 426-438. ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords: blind source separation * artifact removal * electroencephalogram * audio signal processing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/tichavsky-fast and accurate methods of independent component analysis a survey.pdf

  19. Simple, Accurate, and Robust Nonparametric Blind Super-Resolution

    Shao, Wen-Ze; Elad, Michael

    2015-01-01

    This paper proposes a simple, accurate, and robust approach to single image nonparametric blind Super-Resolution (SR). This task is formulated as a functional to be minimized with respect to both an intermediate super-resolved image and a nonparametric blur-kernel. The proposed approach includes a convolution consistency constraint which uses a non-blind learning-based SR result to better guide the estimation process. Another key component is the unnatural bi-l0-l2-norm regularization imposed...

  20. Accurate Image Super-Resolution Using Very Deep Convolutional Networks

    Kim, Jiwon; Lee, Jung Kwon; Lee, Kyoung Mu

    2015-01-01

    We present a highly accurate single-image super-resolution (SR) method. Our method uses a very deep convolutional network inspired by VGG-net used for ImageNet classification \\cite{simonyan2015very}. We find increasing our network depth shows a significant improvement in accuracy. Our final model uses 20 weight layers. By cascading small filters many times in a deep network structure, contextual information over large image regions is exploited in an efficient way. With very deep networks, ho...
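
    A sketch of a VDSR-style network as described in the abstract: 20 stacked 3x3 convolutional layers with 64 channels and ReLU activations, predicting a residual that is added back to the interpolated low-resolution input. Depth and channel width follow the abstract; the padding choice, single-channel input and all training details are assumptions.

        import torch
        import torch.nn as nn

        class VDSRLike(nn.Module):
            def __init__(self, depth=20, channels=64):
                super().__init__()
                layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)]
                for _ in range(depth - 2):
                    layers += [nn.Conv2d(channels, channels, 3, padding=1),
                               nn.ReLU(inplace=True)]
                layers.append(nn.Conv2d(channels, 1, 3, padding=1))
                self.body = nn.Sequential(*layers)

            def forward(self, x):
                # Residual learning: the network predicts the high-frequency
                # detail missing from the interpolated low-resolution input.
                return x + self.body(x)

        # Hypothetical usage on a single-channel (luminance) patch.
        model = VDSRLike()
        upscaled = torch.randn(1, 1, 64, 64)     # bicubically upscaled input patch
        sr = model(upscaled)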

  1. Optimized pulse sequences for the accurate measurement of aortic compliance

    Aortic compliance is potentially an important cardiovascular diagnostic parameter by virtue of a proposed correlation with cardiovascular fitness. Measurement requires cross-sectional images of the ascending and descending aorta in systole and diastole for measurement of aortic lumen areas. Diastolic images have poor vessel-wall delineation due to signal from slow-flowing blood. A comparison has been carried out using presaturation (SAT) RF pulses, transparent RF pulses, and flow-compensated gradients in standard pulse sequences to improve vessel-wall delineation in diastole. Properly timed SAT pulses provide the most consistent vessel-wall delineation and the most accurate measurement of aortic compliance.

  2. Electronic surgical record management.

    Rockman, Justin

    2010-01-01

    This paper explores the challenges surgical practices face in coordinating surgeries and how the electronic surgical record management (ESRM) approach to surgical coordination can solve these problems and improve efficiency. Surgical practices continue to experience costly inefficiencies when managing surgical coordination. Application software like practice management and electronic health record systems have enabled practices to "go digital" for their administrative, financial, and clinical data. However, surgical coordination is still a manual and labor-intensive process. Surgical practices need to create a central and secure record of their surgeries. When surgical data are inputted once only and stored in a central repository, the data are transformed into active information that can be outputted to any form, letter, calendar, or report. ESRM is a new approach to surgical coordination. It enables surgical practices to automate and streamline their processes, reduce costs, and ensure that patients receive the best possible care. PMID:20480775

  3. Records via probability theory

    Ahsanullah, Mohammad

    2015-01-01

    Many statisticians, actuarial mathematicians, reliability engineers, meteorologists, hydrologists, economists, and business and sport analysts deal with records, which play important roles in various fields of statistics and its applications. This book enables a reader to check his or her level of understanding of the theory of record values. We give the basic formulae which are most important in the theory and present many examples which illustrate the theoretical statements. For a beginner in record statistics, as well as for graduate students, the study of this book requires only basic knowledge of the subject. A more advanced reader can use the book to polish his or her knowledge. An updated bibliography, which will help a reader to enrich theoretical knowledge and widen experience in dealing with ordered observations, is also given.

  4. Isomerism of Cyanomethanimine: Accurate Structural, Energetic, and Spectroscopic Characterization.

    Puzzarini, Cristina

    2015-11-25

    The structures, relative stabilities, and rotational and vibrational parameters of the Z-C-, E-C-, and N-cyanomethanimine isomers have been evaluated using state-of-the-art quantum-chemical approaches. Equilibrium geometries have been calculated by means of a composite scheme based on coupled-cluster calculations that accounts for the extrapolation to the complete basis set limit and core-correlation effects. The latter approach is proved to provide molecular structures with an accuracy of 0.001-0.002 Å and 0.05-0.1° for bond lengths and angles, respectively. Systematically extrapolated ab initio energies, accounting for electron correlation through coupled-cluster theory, including up to single, double, triple, and quadruple excitations, and corrected for core-electron correlation and anharmonic zero-point vibrational energy, have been used to accurately determine relative energies and the Z-E isomerization barrier with an accuracy of about 1 kJ/mol. Vibrational and rotational spectroscopic parameters have been investigated by means of hybrid schemes that allow us to obtain rotational constants accurate to about a few megahertz and vibrational frequencies with a mean absolute error of ∼1%. Where available, for all properties considered, a very good agreement with experimental data has been observed. PMID:26529434
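
    Composite schemes of the kind described above typically combine a CBS-extrapolated coupled-cluster result with additive corrections; a generic form (a sketch, not necessarily the exact recipe used in this work) is

        E \approx E_{\mathrm{CCSD(T)}}^{\mathrm{CBS}}
              + \Delta E_{\mathrm{core}}
              + \Delta E_{\mathrm{fT}}
              + \Delta E_{\mathrm{pQ}}
              + \Delta E_{\mathrm{ZPE}}^{\mathrm{anh}},
        \qquad
        r_e \approx r_e^{\mathrm{CCSD(T)/CBS}} + \Delta r_{\mathrm{core}},

    where the increments account, respectively, for core-electron correlation, full triple and perturbative quadruple excitations beyond CCSD(T), and the anharmonic zero-point vibrational energy.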

  5. Accurate phylogenetic classification of DNA fragments based onsequence composition

    McHardy, Alice C.; Garcia Martin, Hector; Tsirigos, Aristotelis; Hugenholtz, Philip; Rigoutsos, Isidore

    2006-05-01

    Metagenome studies have retrieved vast amounts of sequence out of a variety of environments, leading to novel discoveries and great insights into the uncultured microbial world. Except for very simple communities, diversity makes sequence assembly and analysis a very challenging problem. To understand the structure and function of microbial communities, a taxonomic characterization of the obtained sequence fragments is highly desirable, yet currently limited mostly to those sequences that contain phylogenetic marker genes. We show that for clades at the rank of domain down to genus, sequence composition allows the very accurate phylogenetic characterization of genomic sequence. We developed a composition-based classifier, PhyloPythia, for de novo phylogenetic sequence characterization and have trained it on a data set of 340 genomes. By extensive evaluation experiments we show that the method is accurate across all taxonomic ranks considered, even for sequences that originate from novel organisms and are as short as 1 kb. Application to two metagenome datasets obtained from samples of phosphorus-removing sludge showed that the method allows the accurate classification at genus level of most sequence fragments from the dominant populations, while at the same time correctly characterizing even larger parts of the samples at higher taxonomic levels.
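
    Composition-based classification in the spirit described above can be illustrated by mapping each DNA fragment to a normalized k-mer frequency vector and training a multi-class linear classifier on fragments of known origin. The choice of k = 5, the use of scikit-learn's LinearSVC and the placeholder data are assumptions; PhyloPythia's actual feature construction and training corpus differ.

        import itertools
        import numpy as np
        from sklearn.svm import LinearSVC

        K = 5
        KMERS = ["".join(p) for p in itertools.product("ACGT", repeat=K)]
        KMER_INDEX = {kmer: i for i, kmer in enumerate(KMERS)}

        def kmer_profile(seq):
            counts = np.zeros(len(KMERS))
            seq = seq.upper()
            for i in range(len(seq) - K + 1):
                idx = KMER_INDEX.get(seq[i:i + K])
                if idx is not None:             # skip k-mers with ambiguous bases
                    counts[idx] += 1
            total = counts.sum()
            return counts / total if total else counts

        def train_classifier(fragments, labels):
            """fragments: list of DNA strings; labels: taxonomic clade per fragment."""
            X = np.array([kmer_profile(f) for f in fragments])
            return LinearSVC().fit(X, labels)

        # Hypothetical usage with placeholder training fragments:
        # clf = train_classifier(train_fragments, train_clades)
        # predicted_clade = clf.predict([kmer_profile(novel_fragment)])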

  6. Canadian consumer issues in accurate and fair electricity metering

    The Public Interest Advocacy Centre (PIAC), located in Ottawa, participates in regulatory proceedings concerning electricity and natural gas to support public and consumer interest. PIAC provides legal representation, research and policy support and public advocacy. A study aimed toward the determination of the issues at stake for residential electricity consumers in the provision of fair and accurate electricity metering, was commissioned by Measurement Canada in consultation with Industry Canada's Consumer Affairs. The metering of electricity must be carried out in a fair and efficient manner for all residential consumers. The Electricity, Gas and Inspection Act was developed to ensure compliance with standards for measuring instrumentation. The accurate metering of electricity through the distribution systems for electricity in Canada represents the main focus of this study and report. The role played by Measurement Canada and the increased efficiencies of service delivery by Measurement Canada or the changing of electricity market conditions are of special interest. The role of Measurement Canada was explained, as were the concerns of residential consumers. A comparison was then made between the interests of residential consumers and those of commercial and industrial electricity consumers in electricity metering. Selected American and Commonwealth jurisdictions were reviewed in light of their electricity metering practices. A section on compliance and conflict resolution was included, in addition to a section on the use of voluntary codes for compliance and conflict resolution

  7. How accurately can 21cm tomography constrain cosmology?

    Mao, Yi; Tegmark, Max; McQuinn, Matthew; Zaldarriaga, Matias; Zahn, Oliver

    2008-07-01

    There is growing interest in using 3-dimensional neutral hydrogen mapping with the redshifted 21 cm line as a cosmological probe. However, its utility depends on many assumptions. To aid experimental planning and design, we quantify how the precision with which cosmological parameters can be measured depends on a broad range of assumptions, focusing on the 21 cm signal from 6noise, to uncertainties in the reionization history, and to the level of contamination from astrophysical foregrounds. We derive simple analytic estimates for how various assumptions affect an experiment’s sensitivity, and we find that the modeling of reionization is the most important, followed by the array layout. We present an accurate yet robust method for measuring cosmological parameters that exploits the fact that the ionization power spectra are rather smooth functions that can be accurately fit by 7 phenomenological parameters. We find that for future experiments, marginalizing over these nuisance parameters may provide constraints almost as tight on the cosmology as if 21 cm tomography measured the matter power spectrum directly. A future square kilometer array optimized for 21 cm tomography could improve the sensitivity to spatial curvature and neutrino masses by up to 2 orders of magnitude, to ΔΩk≈0.0002 and Δmν≈0.007eV, and give a 4σ detection of the spectral index running predicted by the simplest inflation models.

  8. Accurate measurement of streamwise vortices using dual-plane PIV

    Waldman, Rye M.; Breuer, Kenneth S. [Brown University, School of Engineering, Providence, RI (United States)

    2012-11-15

    Low Reynolds number aerodynamic experiments with flapping animals (such as bats and small birds) are of particular interest due to their application to micro air vehicles which operate in a similar parameter space. Previous PIV wake measurements described the structures left by bats and birds and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions based on said measurements. The highly three-dimensional and unsteady nature of the flows associated with flapping flight is a major challenge for accurate measurements. The challenge of animal flight measurements is finding small flow features in a large field of view at high speed with limited laser energy and camera resolution. Cross-stream measurement is further complicated by the predominantly out-of-plane flow that requires thick laser sheets and short inter-frame times, which increase noise and measurement uncertainty. Choosing appropriate experimental parameters requires compromise between the spatial and temporal resolution and the dynamic range of the measurement. To explore these challenges, we present a case study on the wake of a fixed wing. The fixed model simplifies the experiment and allows direct measurements of the aerodynamic forces via load cell. We present a detailed analysis of the wake measurements, discuss the criteria for making accurate measurements, and present a solution for making quantitative aerodynamic load measurements behind free-flyers. (orig.)

  9. Accurate 3D quantification of the bronchial parameters in MDCT

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of bronchial lumen and central axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchi contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically-modeled phantom of a bronchus-vessel pair which mimics the characteristics of real data in terms of gray-level distribution, caliber and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error less than 5.1%.

  10. Machine learning of parameters for accurate semiempirical quantum chemical calculations

    We investigate possible improvements in the accuracy of semiempirical quantum chemistry (SQC) methods through the use of machine learning (ML) models for the parameters. For a given class of compounds, ML techniques require sufficiently large training sets to develop ML models that can be used for adapting SQC parameters to reflect changes in molecular composition and geometry. The ML-SQC approach allows the automatic tuning of SQC parameters for individual molecules, thereby improving the accuracy without deteriorating transferability to molecules with molecular descriptors very different from those in the training set. The performance of this approach is demonstrated for the semiempirical OM2 method using a set of 6095 constitutional isomers C7H10O2, for which accurate ab initio atomization enthalpies are available. The ML-OM2 results show improved average accuracy and a much reduced error range compared with those of standard OM2 results, with mean absolute errors in atomization enthalpies dropping from 6.3 to 1.7 kcal/mol. They are also found to be superior to the results from specific OM2 reparameterizations (rOM2) for the same set of isomers. The ML-SQC approach thus holds promise for fast and reasonably accurate high-throughput screening of materials and molecules
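
    The study tunes the semiempirical parameters themselves with machine learning; as a simpler illustration of the same general idea, the sketch below learns a correction to semiempirical atomization enthalpies from molecular descriptors (a Δ-learning-style model). All data in the sketch are synthetic placeholders.

```python
# Schematic of ML-corrected semiempirical energetics ("delta learning" flavour):
# learn the difference between accurate reference atomization enthalpies and
# semiempirical ones as a function of molecular descriptors. Placeholder data.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
n_mol, n_feat = 200, 30
descriptors = rng.normal(size=(n_mol, n_feat))        # e.g. Coulomb-matrix-like features
h_semiempirical = rng.normal(size=n_mol)               # placeholder SQC predictions
h_reference = h_semiempirical + 0.5 * descriptors[:, 0]  # placeholder ab initio values

# Fit the correction (reference minus semiempirical) as a function of descriptors.
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)
model.fit(descriptors, h_reference - h_semiempirical)

h_corrected = h_semiempirical + model.predict(descriptors)
print("MAE before:", np.mean(np.abs(h_reference - h_semiempirical)))
print("MAE after :", np.mean(np.abs(h_reference - h_corrected)))
```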

  11. Cerebral fat embolism: Use of MR spectroscopy for accurate diagnosis

    Laxmi Kokatnur

    2015-01-01

    Full Text Available Cerebral fat embolism (CFE) is an uncommon but serious complication following orthopedic procedures. It usually presents with altered mental status, and can be a part of fat embolism syndrome (FES) if associated with cutaneous and respiratory manifestations. Because of the presence of other common factors affecting the mental status, particularly in the postoperative period, the diagnosis of CFE can be challenging. Magnetic resonance imaging (MRI) of the brain typically shows multiple lesions distributed predominantly in the subcortical region, which appear as hyperintense lesions on T2 and diffusion weighted images. Although the location offers a clue, the MRI findings are not specific for CFE. Watershed infarcts, hypoxic encephalopathy, disseminated infections, demyelinating disorders, and diffuse axonal injury can also show similar changes on MRI of the brain. The presence of fat in these hyperintense lesions, identified by MR spectroscopy as raised lipid peaks, will help in accurate diagnosis of CFE. Normal brain tissue or conditions producing similar MRI changes will not show any lipid peak on MR spectroscopy. We present a case of CFE initially misdiagnosed as brain stem stroke based on clinical presentation and cranial computed tomography (CT) scan, and later, MR spectroscopy elucidated the accurate diagnosis.

  12. The economic value of accurate wind power forecasting to utilities

    Watson, S.J. [Rutherford Appleton Lab., Oxfordshire (United Kingdom); Giebel, G.; Joensen, A. [Risoe National Lab., Dept. of Wind Energy and Atmospheric Physics, Roskilde (Denmark)

    1999-03-01

    With increasing penetrations of wind power, the need for accurate forecasting is becoming ever more important. Wind power is by its very nature intermittent. For utility schedulers this presents its own problems, particularly when the penetration of wind power capacity in a grid reaches a significant level (>20%). However, using accurate forecasts of wind power at wind farm sites, schedulers are able to plan the operation of conventional power capacity to accommodate the fluctuating demands of consumers and wind farm output. The results of a study to assess the value of forecasting at several potential wind farm sites in the UK and in the US state of Iowa using the Reading University/Rutherford Appleton Laboratory National Grid Model (NGM) are presented. The results are assessed for different types of wind power forecasting, namely: persistence, optimised numerical weather prediction or perfect forecasting. In particular, it will be shown how the NGM has been used to assess the value of numerical weather prediction forecasts from the Danish Meteorological Institute model, HIRLAM, and the US Nested Grid Model, which have been 'site tailored' by the use of the linearized flow model WAsP and by various Model Output Statistics (MOS) and autoregressive techniques. (au)
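
    One of the forecast types compared above, persistence, is simple enough to state in a few lines: the forecast for a future hour is just the most recent observation. The sketch below computes a persistence baseline and its mean absolute error on synthetic wind farm output; the data and forecast horizon are placeholders.

```python
# Persistence forecasting, the usual baseline against which NWP-based wind
# power forecasts are judged: the forecast for t+h equals the last observed
# value at t. Synthetic placeholder data.
import numpy as np

rng = np.random.default_rng(1)
power = np.clip(np.cumsum(rng.normal(0, 0.5, 48)) + 10, 0, None)  # hourly output (MW)

horizon = 6                                     # hours ahead
persistence_forecast = power[:-horizon]         # forecast value = current observation
actual = power[horizon:]

mae = np.mean(np.abs(actual - persistence_forecast))
print(f"{horizon}-hour persistence MAE: {mae:.2f} MW")
```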

  13. Atomic spectroscopy and highly accurate measurement: determination of fundamental constants

    This document reviews the theoretical and experimental achievements of the author concerning highly accurate atomic spectroscopy applied for the determination of fundamental constants. A pure optical frequency measurement of the 2S-12D 2-photon transitions in atomic hydrogen and deuterium has been performed. The experimental setup is described as well as the data analysis. Optimized values for the Rydberg constant and Lamb shifts have been deduced (R = 109737.31568516 (84) cm-1). An experiment devoted to the determination of the fine structure constant with an aimed relative uncertainty of 10⁻⁹ began in 1999. This experiment is based on the fact that Bloch oscillations in a frequency-chirped optical lattice are a powerful tool to transfer coherently many photon momenta to the atoms. We have used this method to measure accurately the ratio h/m(Rb). The measured value of the fine structure constant is α⁻¹ = 137.03599884 (91) with a relative uncertainty of 6.7 × 10⁻⁹. The future and perspectives of this experiment are presented. This document, presented before an academic board, will allow its author to manage research work and particularly to tutor thesis students. (A.C.)

  14. KFM: a homemade yet accurate and dependable fallout meter

    The KFM is a homemade fallout meter that can be made using only materials, tools, and skills found in millions of American homes. It is an accurate and dependable electroscope-capacitor. The KFM, in conjunction with its attached table and a watch, is designed for use as a rate meter. Its attached table relates observed differences in the separations of its two leaves (before and after exposures at the listed time intervals) to the dose rates during exposures of these time intervals. In this manner dose rates from 30 mR/hr up to 43 R/hr can be determined with an accuracy of ±25%. A KFM can be charged with any one of the three expedient electrostatic charging devices described. Due to the use of anhydrite (made by heating gypsum from wallboard) inside a KFM and the expedient 'dry-bucket' in which it can be charged when the air is very humid, this instrument always can be charged and used to obtain accurate measurements of gamma radiation no matter how high the relative humidity. The step-by-step illustrated instructions for making and using a KFM are presented. These instructions have been improved after each successive field test. The majority of the untrained test families, adequately motivated by cash bonuses offered for success and guided only by these written instructions, have succeeded in making and using a KFM

  15. Accurate location estimation of moving object In Wireless Sensor network

    Vinay Bhaskar Semwal

    2011-12-01

    Full Text Available One of the central issues in wireless sensor networks is tracking the location of a moving object, which carries the overhead of storing data and requires an accurate estimate of the target location under energy constraints. There is no built-in mechanism to control and maintain these data, and the wireless communication bandwidth is very limited. Fields that use this technique include flood and typhoon detection, forest fire detection, and temperature and humidity monitoring, where the collected information is fed back to a central air conditioning and ventilation system. In this research paper, we propose a protocol based on a prediction and adaptive algorithm that uses fewer sensor nodes by estimating the target location accurately. We show that our tracking method performs well in terms of energy saving regardless of the mobility pattern of the mobile target, and that it extends the lifetime of the network with fewer sensor nodes. Once a new object is detected, a mobile agent is initiated to track the roaming path of the object.

  16. Accurate interlaminar stress recovery from finite element analysis

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

    The accuracy and robustness of a two-dimensional smoothing methodology is examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing element discretization, producing superior accuracy strains and their first gradients. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are also obtained directly from the solution. The recovered strain gradients are subsequently employed in the integration of the equilibrium equations to obtain accurate interlaminar shear stresses. The problem considered is a simply-supported rectangular plate under a doubly sinusoidal load. The problem has an exact analytic solution which serves as a measure of goodness of the recovered interlaminar shear stresses. The method has the versatility of being applicable to the analysis of rather general and complex structures built of distinct components and materials, such as found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.
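
    The equilibrium integration referred to above is, in essence, integration of the in-plane stress gradients through the plate thickness. The standard form is shown below for orientation; the notation is generic and may differ from that of the paper.

```latex
% Interlaminar shear stresses recovered from the 3D equilibrium equations
% (body forces omitted), integrating from the bottom surface z = -h/2:
\tau_{xz}(z) = -\int_{-h/2}^{z}
  \left( \frac{\partial \sigma_{xx}}{\partial x}
       + \frac{\partial \tau_{xy}}{\partial y} \right)\mathrm{d}\zeta ,
\qquad
\tau_{yz}(z) = -\int_{-h/2}^{z}
  \left( \frac{\partial \tau_{xy}}{\partial x}
       + \frac{\partial \sigma_{yy}}{\partial y} \right)\mathrm{d}\zeta .
```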

  17. Accurate rest frequencies of methanol maser and dark cloud lines

    Müller, H S P; Maeder, H

    2004-01-01

    We report accurate laboratory measurements of selected methanol transition frequencies between 0.834 and 230 GHz in order to facilitate astronomical velocity analyses. New data have been obtained between 10 and 27 GHz and between 60 and 119 GHz. Emphasis has been put on known or potential interstellar maser lines as well as on transitions suitable for the investigation of cold dark clouds. Because of the narrow line widths (<0.5 km s-1) of maser lines and lines detected in dark molecular clouds, accurate frequencies are needed for comparison of the velocities of different methanol lines with each other as well as with lines from other species. In particular, frequencies for a comprehensive set of transitions are given which, because of their low level energies (< 20 cm-1 or 30 K), are potentially detectable in cold clouds. Global Hamiltonian fits generally do not yet yield the required accuracy. Additionally, we report transition frequencies for other lines that may be used to test and to improve existing...

  18. Accurate Multisteps Traffic Flow Prediction Based on SVM

    Zhang Mingheng

    2013-01-01

    Full Text Available Accurate traffic flow prediction is a prerequisite for realizing intelligent traffic control and guidance, and it is also an objective requirement of intelligent traffic management. Due to the strongly nonlinear, stochastic, time-varying characteristics of urban transport systems, artificial intelligence methods such as the support vector machine (SVM) are now receiving more and more attention in this research field. Compared with the traditional single-step prediction method, multi-step prediction can forecast traffic state trends over a certain period in the future; from the perspective of dynamic decision making, this is far more important than the current traffic condition alone. Thus, in this paper, an accurate multi-step traffic flow prediction model based on SVM was proposed, in which the input vectors were composed of actual traffic volumes; four different types of input vectors were compared to verify their prediction performance against each other. Finally, the model was verified with actual data in the empirical analysis phase, and the test results showed that the proposed SVM model had a good ability for traffic flow prediction, with the SVM-HPT model outperforming the other three models.
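
    As a concrete illustration of multi-step prediction with support vector regression, the sketch below trains an SVR on a sliding window of past flow values and applies it recursively for several steps ahead. It uses synthetic data and generic settings, not the paper's SVM-HPT model or its particular input-vector variants.

```python
# Minimal sketch of multi-step traffic flow prediction with support vector
# regression: the model is applied recursively, feeding each prediction back
# into the input window. Synthetic data; not the paper's SVM-HPT model.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
t = np.arange(500)
flow = 100 + 30 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 5, t.size)  # veh/5 min

window, steps_ahead = 12, 6
X = np.array([flow[i:i + window] for i in range(len(flow) - window)])
y = flow[window:]

model = SVR(kernel="rbf", C=10.0).fit(X, y)

# Recursive multi-step forecast from the last observed window.
history = list(flow[-window:])
forecast = []
for _ in range(steps_ahead):
    nxt = model.predict([history[-window:]])[0]
    forecast.append(nxt)
    history.append(nxt)
print(np.round(forecast, 1))
```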

  19. Automated Weather Observation Recorder

    Alaranta, Simo

    2016-01-01

    Many fields, including aeronautics and transportation, require accurate real-time weather data for predicting hazardous conditions. These fields utilize present weather information, since precipitation and reduced visibility affect their operational safety. Due to variation in the severity of the conditions arising from different precipitation types, it is vital to reliably identify the type of precipitation. Automatic systems have increasingly been used to classify precipitations, pa...

  20. 12 CFR 404.19 - Request for accounting of record disclosures.

    2010-01-01

    ... maintain an accurate accounting of the date, nature, and purpose of each external disclosure of a record... the accounting relates to a disclosure made: (1) To an employee within the agency; (2) Under the FOIA... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Request for accounting of record...

  1. 5 CFR 2413.7 - Transcripts, recordings or minutes of closed meeting; public availability; retention.

    2010-01-01

    ... meeting, or portion thereof, there shall also be maintained a complete transcript or electronic recording... lieu of a transcript or electronic recording, maintain a set of minutes fully and accurately...' vote on each rollcall vote. (b) The agency shall make promptly available to the public copies...

  2. A Miniature Recording Cardiotachometer

    Zsombor-Murray, Paul J; Vroomen, Louis J.; Hendriksen, Nils Thedin

    1981-01-01

    The design of a miniature, recording cardiotachometer is described. It is simple and can store digital data. Bench and field tests, using a hand-held display, are presented. Construction and principles of operation are discussed. Applications, with performing athlete subjects, are outlined.

  4. Chess endgame records

    Haworth, Guy

    2013-01-01

    This dataset is an evolving collection of chess endgame record scenarios illustrating the extremes of the game including the deepest positions in various metrics. Optimal lines in consonant strategies are given and annotated. The two attached files are (a) a pgn file of the chess positions and lines, and (b) an annotated version of the pgn file.

  5. Theory of Magnetic Recording

    Bertram, H. Neal

    1994-04-01

    This book is designed to give the student a fundamental, in-depth understanding of all the essential features of the magnetic recording process for both high density disk and tape recording. The book provides a thorough grounding in four basic areas of magnetic recording: structure and fields of heads and media, the replay process, the recording process, and medium noise analysis. Besides the fundamental issues, key systems questions of nonlinearities, overwrite, side track phenomena, error rate estimates as well as comparisons of MR and inductive heads will be discussed. The student will be able to use the information presented to design and analyze key experiments for head and medium evaluation as well as for overall system performance decisions. A parallel treatment of time and frequency response will enable the student to evaluate signal processing schemes. The book is intended either for senior-year undergraduates or first-year graduates. It assumes that the reader has had basic introductory electrical engineering or physics courses such as electricity and magnetism and applied mathematics.

  6. Governors Cite Education Records

    McNeil, Michele

    2007-01-01

    The three current presidential hopefuls with experience as state governors have records on education that offer voters an unusually detailed preview of what the nation's schools might expect if any of the three should win the White House next year. Those candidates--New Mexico Governor Bill Richardson, on the Democratic side, and former Governors…

  7. The KFM, A Homemade Yet Accurate and Dependable Fallout Meter

    Kearny, C.H.

    2001-11-20

    The KFM is a homemade fallout meter that can be made using only materials, tools, and skills found in millions of American homes. It is an accurate and dependable electroscope-capacitor. The KFM, in conjunction with its attached table and a watch, is designed for use as a rate meter. Its attached table relates observed differences in the separations of its two leaves (before and after exposures at the listed time intervals) to the dose rates during exposures of these time intervals. In this manner dose rates from 30 mR/hr up to 43 R/hr can be determined with an accuracy of ±25%. A KFM can be charged with any one of the three expedient electrostatic charging devices described. Due to the use of anhydrite (made by heating gypsum from wallboard) inside a KFM and the expedient 'dry-bucket' in which it can be charged when the air is very humid, this instrument always can be charged and used to obtain accurate measurements of gamma radiation no matter how high the relative humidity. The heart of this report is the step-by-step illustrated instructions for making and using a KFM. These instructions have been improved after each successive field test. The majority of the untrained test families, adequately motivated by cash bonuses offered for success and guided only by these written instructions, have succeeded in making and using a KFM. NOTE: 'The KFM, A Homemade Yet Accurate and Dependable Fallout Meter' was published as an Oak Ridge National Laboratory report in 1979. Some of the materials originally suggested for suspending the leaves of the Kearny Fallout Meter (KFM) are no longer available. Because of changes in the manufacturing process, other materials (e.g., sewing thread, unwaxed dental floss) may not have the insulating capability to work properly. Oak Ridge National Laboratory has not tested any of the suggestions provided in the preface of the report, but they have been used by other groups. When using these

  8. The importance of accurate meteorological input fields and accurate planetary boundary layer parameterizations, tested against ETEX-1

    Atmospheric transport of air pollutants is, in principle, a well understood process. If information about the state of the atmosphere is given in full detail (infinitely accurate information about wind speed, etc.) and infinitely fast computers are available, then the advection equation could in principle be solved exactly. This is, however, not the case: discretization of the equations and input data introduces some uncertainties and errors in the results. Therefore many different issues have to be carefully studied in order to diminish these uncertainties and to develop an accurate transport model. Some of these are e.g. the numerical treatment of the transport equation, accuracy of the mean meteorological input fields and parameterizations of sub-grid scale phenomena (as e.g. parameterizations of the 2nd- and higher-order turbulence terms in order to reach closure in the perturbation equation). A tracer model for studying transport and dispersion of air pollution caused by a single but strong source is under development. The model simulations from the first ETEX release illustrate the differences caused by using various analyzed fields directly in the tracer model or using a meteorological driver. Also different parameterizations of the mixing height and the vertical exchange are compared. (author)

  9. Accurate determination of phase arrival times using autoregressive likelihood estimation

    G. Kvaerna

    1994-01-01

    We have investigated the potential automatic use of an onset picker based on autoregressive likelihood estimation. Both a single component version and a three component version of this method have been tested on data from events located in the Khibiny Massif of the Kola peninsula, recorded at the Apatity array, the Apatity three component station and the ARCESS array. Using this method, we have been able to estimate onset times to an accuracy (standard deviation) of about 0.05 s for P-phases ...
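
    The onset-picking idea can be illustrated with a much simpler relative, the variance-based two-segment AIC picker: the onset is placed where splitting the trace into a noise segment and a signal segment minimizes the Akaike Information Criterion. This sketch is in the spirit of autoregressive likelihood picking but is not the exact method evaluated in the study; the trace is synthetic.

```python
# Simplified onset picking via the Akaike Information Criterion on a single
# trace: the onset is taken where the two-segment (noise/signal) model has
# minimum AIC. Illustrative only; not the autoregressive likelihood estimator
# tested in the study.
import numpy as np

def aic_onset(trace: np.ndarray) -> int:
    n = len(trace)
    aic = np.full(n, np.inf)
    for k in range(10, n - 10):                 # keep both segments non-trivial
        var1 = np.var(trace[:k]) + 1e-12
        var2 = np.var(trace[k:]) + 1e-12
        aic[k] = k * np.log(var1) + (n - k - 1) * np.log(var2)
    return int(np.argmin(aic))

rng = np.random.default_rng(3)
noise = rng.normal(0, 1.0, 400)
signal = rng.normal(0, 5.0, 200)                # stronger variance after onset
onset = aic_onset(np.concatenate([noise, signal]))
print("picked onset sample:", onset)            # expected near sample 400
```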

  10. A new accurate pill recognition system using imprint information

    Chen, Zhiyuan; Kamata, Sei-ichiro

    2013-12-01

    Great achievements in modern medicine benefit human beings. They have also brought about an explosive growth in the number of pharmaceuticals currently on the market. In daily life, pharmaceuticals sometimes confuse people when they are found unlabeled. In this paper, we propose an automatic pill recognition technique to solve this problem. It functions mainly based on the imprint feature of the pills, which is extracted by the proposed MSWT (modified stroke width transform) and described by WSC (weighted shape context). Experiments show that our proposed pill recognition method can reach an accuracy rate of up to 92.03% within the top 5 ranks when trying to classify more than 10 thousand query pill images into around 2000 categories.

  11. Spectropolarimetrically accurate magnetohydrostatic sunspot model for forward modelling in helioseismology

    Przybylski, D; Cally, P S

    2015-01-01

    We present a technique to construct a spectropolarimetrically accurate magneto-hydrostatic model of a large-scale solar magnetic field concentration, mimicking a sunspot. Using the constructed model we perform a simulation of acoustic wave propagation, conversion and absorption in the solar interior and photosphere with the sunspot embedded into it. With the 6173 Å magnetically sensitive photospheric absorption line of neutral iron, we calculate observable quantities such as continuum intensities, Doppler velocities, as well as full Stokes vector for the simulation at various positions at the solar disk, and analyse the influence of non-locality of radiative transport in the solar photosphere on helioseismic measurements. Bisector shapes were used to perform multi-height observations. The differences in acoustic power at different heights within the line formation region at different positions at the solar disk were simulated and characterised. An increase in acoustic power in the simulated observ...

  12. Analytical method to accurately predict LMFBR core flow distribution

    An accurate and detailed representation of the flow distribution in LMFBR cores is very important as the starting point and basis of the thermal and structural core design. Previous experience indicated that the steady state and transient core design is as good as the core orificing; thus, a new orificing philosophy satisfying a priori all design constraints was developed. However, optimized orificing is a necessary, but not sufficient condition for achieving the optimum core flow distribution, which is affected by the hydraulic characteristics of the remainder of the primary system. Consequently, an analytical model of the overall primary system was developed, resulting in the CATFISH computer code, which, even though specifically written for LMFBRs, can be used for any reactor employing ducted assemblies

  13. Accurate performance analysis of opportunistic decode-and-forward relaying

    Tourki, Kamel

    2011-07-01

    In this paper, we investigate an opportunistic relaying scheme where the selected relay assists the source-destination (direct) communication. In our study, we consider a regenerative opportunistic relaying scheme in which the direct path may be considered unusable, and the destination may use a selection combining technique. We first derive the exact statistics of each hop, in terms of probability density function (PDF). Then, the PDFs are used to determine accurate closed form expressions for end-to-end outage probability for a transmission rate R. Furthermore, we evaluate the asymptotical performance analysis and the diversity order is deduced. Finally, we validate our analysis by showing that performance simulation results coincide with our analytical results over different network architectures. © 2011 IEEE.

  14. Accurate numerical solution of compressible, linear stability equations

    Malik, M. R.; Chuang, S.; Hussaini, M. Y.

    1982-01-01

    The present investigation is concerned with a fourth order accurate finite difference method and its application to the study of the temporal and spatial stability of the three-dimensional compressible boundary layer flow on a swept wing. This method belongs to the class of compact two-point difference schemes discussed by White (1974) and Keller (1974). The method was apparently first used for solving the two-dimensional boundary layer equations. Attention is given to the governing equations, the solution technique, and the search for eigenvalues. A general purpose subroutine is employed for solving a block tridiagonal system of equations. The computer time can be reduced significantly by exploiting the special structure of two matrices.

  15. Accurate volume measurement system for plutonium nitrate solution

    An accurate volume measurement system for a large amount of plutonium nitrate solution stored in a reprocessing or a conversion plant has been developed at the Plutonium Conversion Development Facility (PCDF) in the Power Reactor and Nuclear Fuel Development Corp. (PNC) Tokai Works. A pair of differential digital quartz pressure transducers is utilized in the volume measurement system. To obtain high accuracy, it is important that the non-linearity of the transducer is minimized within the measurement range, the zero point is stabilized, and the damping property of the pneumatic line is designed to minimize pressure oscillation. The accuracy of the pressure measurement can always be within 2 Pa with re-calibration once a year. In the PCDF, the overall uncertainty of the volume measurement has been evaluated to be within 0.2%. This system has been successfully applied to the Japanese government's and IAEA's routine inspection since 1984. (author)

  16. Accurate bond dissociation energies (D0) for FHF- isotopologues

    Stein, Christopher; Oswald, Rainer; Sebald, Peter; Botschwina, Peter; Stoll, Hermann; Peterson, Kirk A.

    2013-09-01

    Accurate bond dissociation energies (D0) are determined for three isotopologues of the bifluoride ion (FHF-). While the zero-point vibrational contributions are taken from our previous work (P. Sebald, A. Bargholz, R. Oswald, C. Stein, P. Botschwina, J. Phys. Chem. A, DOI: 10.1021/jp3123677), the equilibrium dissociation energy (De) of the reaction ? was obtained by a composite method including frozen-core (fc) CCSD(T) calculations with basis sets up to cardinal number n = 7 followed by extrapolation to the complete basis set limit. Smaller terms beyond fc-CCSD(T) cancel each other almost completely. The D0 values of FHF-, FDF-, and FTF- are predicted to be 15,176, 15,191, and 15,198 cm-1, respectively, with an uncertainty of ca. 15 cm-1.

  17. Efficient and Accurate Indoor Localization Using Landmark Graphs

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we solve this problem by using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) by a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error less than 2.5 meters, which outperforms the other two DR-based methods.
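
    To make the dead-reckoning-plus-landmarks idea concrete, the sketch below propagates a position from step length and heading and snaps it back to a known landmark coordinate whenever one is detected. It illustrates the general DR-with-landmark-reset strategy, not the paper's landmark-graph algorithm; the map, step data, and detections are hypothetical.

```python
# Minimal sketch of pedestrian dead reckoning with landmark resets: positions
# are propagated from step length and heading, and corrected to a known
# landmark coordinate whenever one is detected. Hypothetical map and data.
import math

landmarks = {"door_A": (0.0, 0.0), "stairs_B": (12.0, 3.0)}  # hypothetical map

def dead_reckon(steps, detections):
    """steps: list of (step_length_m, heading_rad); detections: {step_index: landmark}."""
    x, y = landmarks["door_A"]                  # assume a known starting landmark
    track = [(x, y)]
    for i, (length, heading) in enumerate(steps):
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        if i in detections:                     # landmark observed: correct drift
            x, y = landmarks[detections[i]]
        track.append((x, y))
    return track

steps = [(0.7, 0.0)] * 20                       # 20 steps heading east
print(dead_reckon(steps, {15: "stairs_B"})[-1])
```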

  18. An Integrative Approach to Accurate Vehicle Logo Detection

    Hao Pan

    2013-01-01

    required for many applications in intelligent transportation systems and automatic surveillance. The task is challenging considering the small target of logos and the wide range of variability in shape, color, and illumination. A fast and reliable vehicle logo detection approach is proposed following the visual attention mechanism of human vision. Two pre-logo detection steps, that is, vehicle region detection and a small RoI segmentation, rapidly focalize a small logo target. An enhanced Adaboost algorithm, together with two types of features, Haar and HOG, is proposed to detect vehicles. An RoI that covers logos is segmented based on our prior knowledge about the logos’ position relative to license plates, which can be accurately localized from frontal vehicle images. A two-stage cascade classifier proceeds with the segmented RoI, using a hybrid of Gentle Adaboost and Support Vector Machine (SVM), resulting in precise logo positioning. Extensive experiments were conducted to verify the efficiency of the proposed scheme.

  19. Accurate finite difference methods for time-harmonic wave propagation

    Harari, Isaac; Turkel, Eli

    1994-01-01

    Finite difference methods for solving problems of time-harmonic acoustics are developed and analyzed. Multidimensional inhomogeneous problems with variable, possibly discontinuous, coefficients are considered, accounting for the effects of employing nonuniform grids. A weighted-average representation is less sensitive to transition in wave resolution (due to variable wave numbers or nonuniform grids) than the standard pointwise representation. Further enhancement in method performance is obtained by basing the stencils on generalizations of Pade approximation, or generalized definitions of the derivative, reducing spurious dispersion, anisotropy and reflection, and by improving the representation of source terms. The resulting schemes have fourth-order accurate local truncation error on uniform grids and third order in the nonuniform case. Guidelines for discretization pertaining to grid orientation and resolution are presented.

  20. Accurate Modeling of Buck Converters with Magnetic-Core Inductors

    Astorino, Antonio; Antonini, Giulio; Swaminathan, Madhavan

    2015-01-01

    In this paper, a modeling approach for buck converters with magnetic-core inductors is presented. Due to the high nonlinearity of magnetic materials, the frequency domain analysis of such circuits is not suitable for an accurate description of their behaviour. Hence, in this work, a time-domain model of buck converters with magnetic-core inductors in a Simulink® environment is proposed. As an example, the presented approach is used to simulate an eight-phase buck converter. The simulation results show that an unexpected system behaviour in terms of current ripple amplitude needs the inductor core...

  1. Accurate Parallel Algorithm for Adini Nonconforming Finite Element

    罗平; 周爱辉

    2003-01-01

    Multi-parameter asymptotic expansions are interesting since they justify the use of multi-parameter extrapolation, which can be implemented in parallel and has been well studied in many papers for conforming finite element methods. For nonconforming finite element methods, however, work on multi-parameter asymptotic expansions and extrapolation has seldom been reported in the literature. This paper considers the solution of the biharmonic equation using Adini nonconforming finite elements and reports new results for the multi-parameter asymptotic expansions and extrapolation. The Adini nonconforming finite element solution of the biharmonic equation is shown to have a multi-parameter asymptotic error expansion and extrapolation. This expansion and a multi-parameter extrapolation technique were used to develop an accurate parallel approximation algorithm for the biharmonic equation. Finally, numerical results have verified the extrapolation theory.
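
    As a schematic of why such expansions enable parallel extrapolation, consider the generic two-parameter form below: solutions computed independently on meshes with halved h1 or h2 can be combined to cancel both leading error terms. The expansion shown is illustrative; the Adini-element expansion in the paper has its own specific form.

```latex
% Illustrative two-parameter asymptotic expansion and the splitting
% extrapolation it enables (schematic form only):
u_{h_1,h_2} = u + C_1 h_1^{2} + C_2 h_2^{2} + O(h^{4}),
\qquad
u^{\mathrm{extrap}} =
\frac{4\bigl(u_{h_1/2,\,h_2} + u_{h_1,\,h_2/2}\bigr) - 5\,u_{h_1,h_2}}{3}
= u + O(h^{4}).
```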

  2. Accurate object tracking system by integrating texture and depth cues

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

    A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important to distinguish the object from a complicated background, is integrated. We propose two depth-based models that can complement texture information to cope with both appearance variations and background clutter. Moreover, in order to reduce the increased risk of drift associated with textureless depth templates, an update mechanism is proposed that selects more precise tracking results to avoid incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system can provide the best success rate and has more accurate tracking results than other well-known algorithms.

  3. Accurate Derivative Evaluation for any Grad-Shafranov Solver

    Ricketson, L F; Rachh, M; Freidberg, J P

    2015-01-01

    We present a numerical scheme that can be combined with any fixed boundary finite element based Poisson or Grad-Shafranov solver to compute the first and second partial derivatives of the solution to these equations with the same order of convergence as the solution itself. At the heart of our scheme is an efficient and accurate computation of the Dirichlet to Neumann map through the evaluation of a singular volume integral and the solution to a Fredholm integral equation of the second kind. Our numerical method is particularly useful for magnetic confinement fusion simulations, since it allows the evaluation of quantities such as the magnetic field, the parallel current density and the magnetic curvature with much higher accuracy than has been previously feasible on the affordable coarse grids that are usually implemented.

  4. Accurate derivative evaluation for any Grad-Shafranov solver

    Ricketson, L. F.; Cerfon, A. J.; Rachh, M.; Freidberg, J. P.

    2016-01-01

    We present a numerical scheme that can be combined with any fixed boundary finite element based Poisson or Grad-Shafranov solver to compute the first and second partial derivatives of the solution to these equations with the same order of convergence as the solution itself. At the heart of our scheme is an efficient and accurate computation of the Dirichlet to Neumann map through the evaluation of a singular volume integral and the solution to a Fredholm integral equation of the second kind. Our numerical method is particularly useful for magnetic confinement fusion simulations, since it allows the evaluation of quantities such as the magnetic field, the parallel current density and the magnetic curvature with much higher accuracy than has been previously feasible on the affordable coarse grids that are usually implemented.

  5. Accurate monitoring of large aligned objects with videometric techniques

    Klumb, F; Grussenmeyer, P

    1999-01-01

    This paper describes a new videometric technique designed to monitor the deformations and misalignments of large vital components in the centre of a future particle detector. It relies on a geometrical principle called "reciprocal collimation" of two CCD cameras: the combination of the video devices in pair gives rise to a network of well located reference lines that surround the object to be surveyed. Each observed point, which in practice is a bright point-like light- source, is accurately located with respect to this network of neighbouring axes. Adjustment calculations provide the three- dimensional position of the object fitted with various light-sources. An experimental test-bench, equipped with four cameras, has corroborated the precision predicted by previous simulations of the system. (11 refs).

  6. Methods for Accurate Free Flight Measurement of Drag Coefficients

    Courtney, Elya; Courtney, Michael

    2015-01-01

    This paper describes experimental methods for free flight measurement of drag coefficients to an accuracy of approximately 1%. There are two main methods of determining free flight drag coefficients, or equivalent ballistic coefficients: 1) measuring near and far velocities over a known distance and 2) measuring a near velocity and time of flight over a known distance. Atmospheric conditions must also be known and nearly constant over the flight path. A number of tradeoffs are important when designing experiments to accurately determine drag coefficients. The flight distance must be large enough so that the projectile's loss of velocity is significant compared with its initial velocity and much larger than the uncertainty in the near and/or far velocity measurements. On the other hand, since drag coefficients and ballistic coefficients both depend on velocity, the change in velocity over the flight path should be small enough that the average drag coefficient over the path (which is what is really determined)...
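
    For method (1), the near/far-velocity approach, a commonly used relation (assuming velocity-squared drag and a drag coefficient that is effectively constant over the small velocity change) gives the average drag coefficient directly from the two measured velocities. The snippet below applies that relation; the numerical values are placeholders, not data from the paper.

```python
# Near/far velocity over a known distance, assuming velocity-squared drag and
# near-constant C_d over the velocity change:
#   C_d = (2 m / (rho * A * d)) * ln(v_near / v_far)
# Numbers below are placeholders, not values from the paper.
import math

def drag_coefficient(m, rho, area, distance, v_near, v_far):
    """Average drag coefficient over the flight path."""
    return (2.0 * m / (rho * area * distance)) * math.log(v_near / v_far)

cd = drag_coefficient(m=0.01,          # projectile mass, kg
                      rho=1.225,       # air density, kg/m^3
                      area=3.0e-5,     # reference area, m^2
                      distance=100.0,  # flight distance, m
                      v_near=800.0, v_far=780.0)  # measured velocities, m/s
print(f"C_d ≈ {cd:.3f}")
```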

  7. Natural orbital expansions of highly accurate three-body wavefunctions

    Natural orbital expansions are considered for highly accurate three-body wavefunctions written in the relative coordinates r32, r31 and r21. Our present method is applied to the ground S(L = 0)-state wavefunctions of the Ps- and ∞H- ions. Our best variational energies computed herein for these systems are E(Ps-) = -0.262 005 070 232 980 107 7666 au and E(∞H-) = -0.5277510165443771965865 au, respectively. The variational wavefunctions determined for these systems contain between 2000 and 4200 exponential basis functions. In general, the natural orbital expansions of these functions are compact and rapidly convergent functions, which are represented as linear combinations of some relatively simple functions. The natural orbitals can be very useful in various applications, including photodetachment and scattering problems

  8. Fast and accurate automated cell boundary determination for fluorescence microscopy

    Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider

    2013-07-01

    Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.

  9. Interactive Isogeometric Volume Visualization with Pixel-Accurate Geometry.

    Fuchs, Franz G; Hjelmervik, Jon M

    2016-02-01

    A recent development, called isogeometric analysis, provides a unified approach for design, analysis and optimization of functional products in industry. Traditional volume rendering methods for inspecting the results from the numerical simulations cannot be applied directly to isogeometric models. We present a novel approach for interactive visualization of isogeometric analysis results, ensuring correct, i.e., pixel-accurate geometry of the volume including its bounding surfaces. The entire OpenGL pipeline is used in a multi-stage algorithm leveraging techniques from surface rendering, order-independent transparency, as well as theory and numerical methods for ordinary differential equations. We showcase the efficiency of our approach on different models relevant to industry, ranging from quality inspection of the parametrization of the geometry, to stress analysis in linear elasticity, to visualization of computational fluid dynamics results. PMID:26731454

  10. Accurate macroscale modelling of spatial dynamics in multiple dimensions

    Roberts, A ~J; Bunder, J ~E

    2011-01-01

    Developments in dynamical systems theory provide new support for the macroscale modelling of PDEs and other microscale systems such as Lattice Boltzmann, Monte Carlo or Molecular Dynamics simulators. By systematically resolving subgrid microscale dynamics the dynamical systems approach constructs accurate closures of macroscale discretisations of the microscale system. Here we specifically explore reaction-diffusion problems in two spatial dimensions as a prototype of generic systems in multiple dimensions. Our approach unifies into one the modelling of systems by a type of finite elements, and the 'equation-free' macroscale modelling of microscale simulators efficiently executing only on small patches of the spatial domain. Centre manifold theory ensures that a closed model exists on the macroscale grid, is emergent, and is systematically approximated. Dividing space either into overlapping finite elements or into spatially separated small patches, the specially crafted inter-element/patch coupling als...

  11. How accurately can digital images depict conventional radiographs

    The purpose of this paper is to investigate how accurately the video image of a digitized chest radiograph can depict normal anatomic configurations of thoracic organs seen on a conventional radiograph. These configurations are important to diagnosis of diseases of the chest. Chest radiographs of 50 individuals diagnosed as normal were analyzed. Three chest physicians and one radiologist reviewed 50 pairs of constructed digital images (digitized at 0.125-mm pixel size, 10-bit gray scale, displayed at 1,024 x 1,536, 8-bit gray scale) and conventional films. The visibility of eight structures (spinous process, trachea, right and left main bronchus, anterior tip of right fourth rib, vessels behind diaphragm and cardiac shadow, and descending aorta behind heart) was graded into five levels of confidence

  12. Accurate, fully-automated NMR spectral profiling for metabolomics.

    Siamak Ravanbakhsh

    Full Text Available Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra involving > 50 compounds show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of

  13. Can a surgeon drill accurately at a specified angle?

    Brioschi, Valentina; Cook, Jodie; Arthurs, Gareth I

    2016-01-01

    Objectives To investigate whether a surgeon can drill accurately a specified angle and whether surgeon experience, task repetition, drill bit size and perceived difficulty influence drilling angle accuracy. Methods The sample population consisted of final-year students (n=25), non-specialist veterinarians (n=22) and board-certified orthopaedic surgeons (n=8). Each participant drilled a hole twice in a horizontal oak plank at 30°, 45°, 60°, 80°, 85° and 90° angles with either a 2.5  or a 3.5 mm drill bit. Participants then rated the perceived difficulty to drill each angle. The true angle of each hole was measured using a digital goniometer. Results Greater drilling accuracy was achieved at angles closer to 90°. An error of ≤±4° was achieved by 84.5 per cent of participants drilling a 90° angle compared with approximately 20 per cent of participants drilling a 30–45° angle. There was no effect of surgeon experience, task repetition or drill bit size on the mean error for intended versus achieved angle. Increased perception of difficulty was associated with the more acute angles and decreased accuracy, but not experience level. Clinical significance This study shows that surgeon ability to drill accurately (within ±4° error) is limited, particularly at angles ≤60°. In situations where drill angle is critical, use of computer-assisted navigation or custom-made drill guides may be preferable. PMID:27547423

  14. Bayesian calibration of power plant models for accurate performance prediction

    Highlights: • Bayesian calibration is applied to power plant performance prediction. • Measurements from a plant in operation are used for model calibration. • A gas turbine performance model and steam cycle model are calibrated. • An integrated plant model is derived. • Part load efficiency is accurately predicted as a function of ambient conditions. - Abstract: Gas turbine combined cycles are expected to play an increasingly important role in the balancing of supply and demand in future energy markets. Thermodynamic modeling of these energy systems is frequently applied to assist in decision making processes related to the management of plant operation and maintenance. In most cases, model inputs, parameters and outputs are treated as deterministic quantities and plant operators make decisions with limited or no regard of uncertainties. As the steady integration of wind and solar energy into the energy market induces extra uncertainties, part load operation and reliability are becoming increasingly important. In the current study, methods are proposed to not only quantify various types of uncertainties in measurements and plant model parameters using measured data, but to also assess their effect on various aspects of performance prediction. The authors aim to account for model parameter and measurement uncertainty, and for systematic discrepancy of models with respect to reality. For this purpose, the Bayesian calibration framework of Kennedy and O’Hagan is used, which is especially suitable for high-dimensional industrial problems. The article derives a calibrated model of the plant efficiency as a function of ambient conditions and operational parameters, which is also accurate in part load. The article shows that complete statistical modeling of power plants not only enhances process models, but can also increase confidence in operational decisions
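
    For readers unfamiliar with the Kennedy and O'Hagan framework, its core statistical model relates each field observation to the simulator run at the unknown calibration parameters plus a systematic model discrepancy and measurement error (schematic form below; priors and implementation details vary by application).

```latex
% Kennedy-O'Hagan calibration model (schematic):
% observation = simulator at calibration inputs theta + discrepancy + noise,
% with eta and delta typically modelled as Gaussian processes.
z_i = \eta(x_i, \theta) + \delta(x_i) + \varepsilon_i ,
\qquad \varepsilon_i \sim \mathcal{N}(0, \sigma^2) .
```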

  15. Population variability complicates the accurate detection of climate change responses.

    McCain, Christy; Szewczyk, Tim; Bracy Knight, Kevin

    2016-06-01

    The rush to assess species' responses to anthropogenic climate change (CC) has underestimated the importance of interannual population variability (PV). Researchers assume sampling rigor alone will lead to an accurate detection of response regardless of the underlying population fluctuations of the species under consideration. Using population simulations across a realistic, empirically based gradient in PV, we show that moderate to high PV can lead to opposite and biased conclusions about CC responses. Between pre- and post-CC sampling bouts of modeled populations as in resurvey studies, there is: (i) A 50% probability of erroneously detecting the opposite trend in population abundance change and nearly zero probability of detecting no change. (ii) Across multiple years of sampling, it is nearly impossible to accurately detect any directional shift in population sizes with even moderate PV. (iii) There is up to 50% probability of detecting a population extirpation when the species is present, but in very low natural abundances. (iv) Under scenarios of moderate to high PV across a species' range or at the range edges, there is a bias toward erroneous detection of range shifts or contractions. Essentially, the frequency and magnitude of population peaks and troughs greatly impact the accuracy of our CC response measurements. Species with moderate to high PV (many small vertebrates, invertebrates, and annual plants) may be inaccurate 'canaries in the coal mine' for CC without pertinent demographic analyses and additional repeat sampling. Variation in PV may explain some idiosyncrasies in CC responses detected so far and urgently needs more careful consideration in design and analysis of CC responses. PMID:26725404
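
    The resurvey problem described above is easy to reproduce in a few lines: sample a population with no true trend once "before" and once "after" under lognormal interannual variability and count how often a spurious change would be inferred. The parameters below are illustrative, not those used in the study.

```python
# Schematic resurvey thought experiment: a stationary (no true trend)
# population sampled once before and once after, with lognormal interannual
# variability; count how often an apparent change would be inferred.
import numpy as np

rng = np.random.default_rng(4)
mean_abundance, cv = 100.0, 0.6          # cv controls population variability (PV)
sigma = np.sqrt(np.log(1 + cv**2))
mu = np.log(mean_abundance) - sigma**2 / 2

n_trials = 100_000
before = rng.lognormal(mu, sigma, n_trials)
after = rng.lognormal(mu, sigma, n_trials)

apparent_change = (after - before) / before
print("P(apparent decline > 20%):", np.mean(apparent_change < -0.2))
print("P(apparent increase > 20%):", np.mean(apparent_change > 0.2))
```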

  16. An accurate and portable solid state neutron rem meter

    Accurately resolving the ambient neutron dose equivalent spanning the thermal to 15 MeV energy range with a single configuration and lightweight instrument is desirable. This paper presents the design of a portable, high intrinsic efficiency, and accurate neutron rem meter whose energy-dependent response is electronically adjusted to a chosen neutron dose equivalent standard. The instrument may be classified as a moderating type neutron spectrometer, based on an adaptation to the classical Bonner sphere and position sensitive long counter, which, simultaneously counts thermalized neutrons by high thermal efficiency solid state neutron detectors. The use of multiple detectors and moderator arranged along an axis of symmetry (e.g., long axis of a cylinder) with known neutron-slowing properties allows for the construction of a linear combination of responses that approximate the ambient neutron dose equivalent. Variations on the detector configuration are investigated via Monte Carlo N-Particle simulations to minimize the total instrument mass while maintaining acceptable response accuracy—a dose error less than 15% for bare 252Cf, bare AmBe, an epi-thermal and mixed monoenergetic sources is found at less than 4.5 kg moderator mass in all studied cases. A comparison of the energy dependent dose equivalent response and resultant energy dependent dose equivalent error of the present dosimeter to commercially-available portable rem meters and the prior art are presented. Finally, the present design is assessed by comparison of the simulated output resulting from applications of several known neutron sources and dose rates
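
    The "linear combination of responses" idea mentioned above can be sketched as a least-squares fit: choose detector weights so that the weighted sum of energy-dependent responses approximates the target dose-equivalent conversion curve. The response matrix and target curve below are random placeholders standing in for Monte Carlo-computed values; a real design might also constrain the weights, e.g. to be non-negative.

```python
# Sketch of fitting detector weights w so that sum_i w_i * R_i(E) approximates
# the fluence-to-dose-equivalent curve h(E) across energies. Placeholder data.
import numpy as np

rng = np.random.default_rng(5)
n_energies, n_detectors = 60, 6
R = rng.uniform(0.1, 1.0, size=(n_energies, n_detectors))  # detector responses vs energy
h = rng.uniform(0.5, 2.0, size=n_energies)                  # target dose-equivalent curve

w, *_ = np.linalg.lstsq(R, h, rcond=None)                   # least-squares weights
relative_error = np.abs(R @ w - h) / h
print("max relative dose error:", relative_error.max())
```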

  17. Accurate thermodynamic characterization of a synthetic coal mine methane mixture

    Highlights: • Accurate density data of a 10-component synthetic coal mine methane mixture are presented. • Experimental data are compared with the densities calculated from the GERG-2008 equation of state. • Relative deviations in density were within a 0.2% band at temperatures above 275 K. • Densities at 250 K as well as at 275 K and pressures above 10 MPa showed higher deviations. -- Abstract: In the last few years, coal mine methane (CMM) has gained significance as a potential non-conventional gas fuel. The progressive depletion of common fossil fuel reserves and, on the other hand, the positive estimates of CMM resources as a by-product of mining promote this gas as a promising alternative fuel. The increasing importance of its exploitation makes it necessary to check the capability of the present-day models and equations of state for natural gas to predict the thermophysical properties of gases with a considerably different composition, like CMM. In this work, accurate density measurements of a synthetic CMM mixture are reported in the temperature range from (250 to 400) K and pressures up to 15 MPa, as part of the research project EMRP ENG01 of the European Metrology Research Program for the characterization of non-conventional energy gases. Experimental data were compared with the densities calculated with the GERG-2008 equation of state. Relative deviations between experimental and estimated densities were within a 0.2% band at temperatures above 275 K, while data at 250 K as well as at 275 K and pressures above 10 MPa showed higher deviations.

  18. Accurate molecular classification of cancer using simple rules

    Gotoh Osamu

    2009-10-01

    Full Text Available Abstract Background One intractable problem with using microarray data analysis for cancer classification is how to reduce the extremely high-dimensionality gene feature data to remove the effects of noise. Feature selection is often used to address this problem by selecting informative genes from among thousands or tens of thousands of genes. However, most of the existing methods of microarray-based cancer classification utilize too many genes to achieve accurate classification, which often hampers the interpretability of the models. For a better understanding of the classification results, it is desirable to develop simpler rule-based models with as few marker genes as possible. Methods We screened a small number of informative single genes and gene pairs on the basis of their depended degrees proposed in rough sets. Applying the decision rules induced by the selected genes or gene pairs, we constructed cancer classifiers. We tested the efficacy of the classifiers by leave-one-out cross-validation (LOOCV) of training sets and classification of independent test sets. Results We applied our methods to five cancerous gene expression datasets: leukemia (acute lymphoblastic leukemia [ALL] vs. acute myeloid leukemia [AML]), lung cancer, prostate cancer, breast cancer, and leukemia (ALL vs. mixed-lineage leukemia [MLL] vs. AML). Accurate classification outcomes were obtained by utilizing just one or two genes. Some genes that correlated closely with the pathogenesis of relevant cancers were identified. In terms of both classification performance and algorithm simplicity, our approach outperformed or at least matched existing methods. Conclusion In cancerous gene expression datasets, a small number of genes, even one or two if selected correctly, is capable of achieving an ideal cancer classification effect. This finding also means that very simple rules may perform well for cancerous class prediction.
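
    The rule-based idea (one informative gene, one threshold) can be sketched in a few lines of Python; this is a simplified stand-in using a midpoint threshold on synthetic data, not the authors' rough-set "depended degree" selection.

      import numpy as np
      from sklearn.model_selection import LeaveOneOut

      rng = np.random.default_rng(1)

      # Synthetic expression matrix: 40 samples x 200 genes, two classes; gene 0 is informative.
      X = rng.normal(size=(40, 200))
      y = np.array([0] * 20 + [1] * 20)
      X[y == 1, 0] += 2.0

      def best_single_gene_rule(X_tr, y_tr):
          """Pick the gene, threshold and orientation whose rule best separates the training set."""
          best_acc, best_rule = -1.0, None
          for g in range(X_tr.shape[1]):
              t = X_tr[:, g].mean()                              # simple midpoint threshold
              acc_hi = np.mean((X_tr[:, g] > t) == y_tr)         # "x > t" predicts class 1
              for acc, flip in ((acc_hi, False), (1.0 - acc_hi, True)):
                  if acc > best_acc:
                      best_acc, best_rule = acc, (g, t, flip)
          return best_rule

      correct = 0
      for train, test in LeaveOneOut().split(X):
          g, t, flip = best_single_gene_rule(X[train], y[train])
          pred = int(X[test[0], g] > t)
          correct += (1 - pred if flip else pred) == y[test[0]]
      print(f"LOOCV accuracy of the single-gene rule: {correct / len(y):.2f}")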

  19. NRC comprehensive records disposition schedule

    Effective January 1, 1982, NRC will institute records retention and disposal practices in accordance with the approved Comprehensive Records Disposition Schedule (CRDS). CRDS is comprised of NRC Schedules (NRCS) 1 to 4 which apply to the agency's program or substantive records and General Records Schedules (GRS) 1 to 22 which apply to housekeeping or facilitative records. The schedules are assembled functionally/organizationally to facilitate their use. Preceding the records descriptions and disposition instructions for both NRCS and GRS, there are brief statements on the organizational units which accumulate the records in each functional area, and other information regarding the schedules' applicability

  20. Boiling water heat transfer burnout in uniformly heated round tubes: A compilation of world data with accurate correlations

    Thompson, B.; Macbeth, R.V. [Reactor Development Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1964-07-15

    All available World burn-out data for vertical, uniformly heated round tubes, with liquid water inlet, have been compiled and are presented in systematic order. A total of 4,389 experimental results is recorded, covering a very extensive range of parameters. Detailed examination of these data over the years has indicated a number of inconsistencies and these are discussed. The majority of the data fall into four main pressure groups: 560, 1000, 1550 and 2000 p.s.i.a., and accurate correlations of these data are presented together with error distribution histograms and graphical aids to rapid calculation of burn-out flux. (author)

  1. Accurate location of nuclear explosions at Azgir, Kazakhstan, from satellite images and seismic data: Implications for monitoring decoupled explosions

    Sykes, L.R.; Deng, J. (Lamont-Doherty Earth Observatory, Palisades, NY (United States) Columbia Univ., New York, NY (United States)); Lyubomirskiy, P. (Lamont-Doherty Earth Observatory, Palisades, NY (United States))

    1993-09-15

    This paper reports on the accurate location of ten large tamped nuclear explosions near Azgir, Kazakhstan, conducted by the former Soviet Union in salt domes. The events are located from shot points on a SPOT satellite image, and from reconstructed seismic events recorded on seismographs scattered around the world, including recently released data from the Soviet Union. A concern behind the location of these events is the possibility that the caverns created by these shots might be used for seismically decoupled testing of nuclear explosions in the future.

  2. Optimization by record dynamics

    Barettin, Daniele; Sibani, Paolo

    2014-01-01

    Large dynamical changes in thermalizing glassy systems are triggered by trajectories crossing record-sized barriers, a behavior revealing the presence of a hierarchical structure in configuration space. The observation is here turned into a novel local search optimization algorithm dubbed record dynamics optimization (RDO); parallel tempering (PT) is applied to the same problem as a benchmark. RDO and PT turn out to produce solutions of similar quality for similar numerical effort, but RDO is simpler to program and additionally yields geometrical information on the system’s configuration space which is of interest in many applications. In particular, the effectiveness of RDO strongly indicates the presence of the above-mentioned hierarchically organized configuration space, with metastable regions indexed by the cost (or energy) of the transition states connecting them.

  3. Case record analysis

    Whitaker, Simon

    2009-01-01

    It is argued that the determinants of low frequency (less than once an hour) challenging behavior are likely to be more complex than those of high frequency behavior, involving setting events that may not be present when the behavior occurs. The analysis of case records is then examined as a method of identifying possible setting events for low frequency behaviours. It is suggested that time series analysis, correlational analysis and time lag sequential analysis may all be useful methods in th...

  4. Personal Health Records

    Kensing, F.

    2012-01-01

    in the distributed heterogeneous network of chronic patients and the healthcare professionals that take care of them. An interactive personal health record (PHR) has been designed as part of the project. As such it is part of a trend to find ways to include patients in their own care process. This has been motivated...... by expected health benefits for the patients as well as promises to lead to reduced costs for a burdened healthcare system....

  5. Radiation exposure records management

    Management of individual radiation exposure records begins at employment with the accumulation of data pertinent to the individual and any previous occupational radiation exposure. Appropriate radiation monitoring badges or devices are issued and accountability established. A computer master file is initiated to include the individual's name, payroll number, social security number, birth date, assigned department, and location. From this base, a radiation exposure history is accumulated to include external ionizing radiation exposure to skin and whole body, contributing neutron exposure, contributing tritium exposure, and extremity exposure. It is used also to schedule bioassay sampling and in-vivo counts and to provide other pertinent information. The file is used as a basis for providing periodic reports to management and monthly exposure summaries to departmental line supervision to assist in planning work so that individual annual exposures are kept as low as practical. Radiation exposure records management also includes documentation of radiation surveys performed by the health physicist to establish working rates and the individual estimating and recording his estimated exposure on a day-to-day basis. Exposure information is also available to contribute to Energy Research and Development Administration statistics and to the National Transuranium Registry.
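
    For illustration only, the master-file fields described above could be modelled with a simple record structure (field names are assumptions drawn from the description, not an actual agency schema):

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class DoseEntry:
          period: str                 # e.g. "1976-03", the monitoring period (assumed format)
          whole_body_msv: float
          skin_msv: float
          neutron_msv: float = 0.0
          tritium_msv: float = 0.0
          extremity_msv: float = 0.0

      @dataclass
      class ExposureRecord:
          name: str
          payroll_number: str
          birth_date: str
          department: str
          location: str
          history: List[DoseEntry] = field(default_factory=list)

          def annual_whole_body(self, year: str) -> float:
              """Accumulated whole-body dose for monitoring periods in the given year."""
              return sum(e.whole_body_msv for e in self.history if e.period.startswith(year))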

  6. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    Monitoring sulphur chemistry is thought to be of great importance for exoplanets. Doing this requires detailed knowledge of the spectroscopic properties of sulphur containing molecules such as hydrogen sulphide (H2S) [1], sulphur dioxide (SO2), and sulphur trioxide (SO3). Each of these molecules can be found in terrestrial environments, produced in volcano emissions on Earth, and analysis of their spectroscopic data can prove useful to the characterisation of exoplanets, as well as the study of planets in our own solar system, with both having a possible presence on Venus. A complete, high temperature list of line positions and intensities for H2 32S is presented. The DVR3D program suite is used to calculate the bound ro-vibration energy levels, wavefunctions, and dipole transition intensities using Radau coordinates. The calculations are based on a newly determined, spectroscopically refined potential energy surface (PES) and a new, high accuracy, ab initio dipole moment surface (DMS). Tests show that the PES enables us to calculate the line positions accurately and the DMS gives satisfactory results for line intensities. Comparisons with experiment as well as with previous theoretical spectra will be presented. The results of this study will form an important addition to the databases which are considered as sources of information for space applications; especially, in analysing the spectra of extrasolar planets, and remote sensing studies for Venus and Earth, as well as laboratory investigations and pollution studies. An ab initio line list for SO3 was previously computed using the variational nuclear motion program TROVE [2], and was suitable for modelling room temperature SO3 spectra. The calculations considered transitions in the region of 0-4000 cm-1 with rotational states up to J = 85, and include 174,674,257 transitions. A list of 10,878 experimental transitions had relative intensities placed on an absolute scale, and were provided in a form suitable

  7. A spectroscopic transfer standard for accurate atmospheric CO measurements

    Nwaboh, Javis A.; Li, Gang; Serdyukov, Anton; Werhahn, Olav; Ebert, Volker

    2016-04-01

    Atmospheric carbon monoxide (CO) is a precursor of essential climate variables and has an indirect effect on enhancing global warming. Accurate and reliable measurements of atmospheric CO concentration are becoming indispensable. WMO-GAW reports state a compatibility goal of ±2 ppb for atmospheric CO concentration measurements. Therefore, the EMRP-HIGHGAS (European metrology research program - high-impact greenhouse gases) project aims at developing spectroscopic transfer standards for CO concentration measurements to meet this goal. A spectroscopic transfer standard would provide results that are directly traceable to the SI, can be very useful for calibration of devices operating in the field, and could complement classical gas standards in the field where calibration gas mixtures in bottles often are not accurate, available or stable enough [1][2]. Here, we present our new direct tunable diode laser absorption spectroscopy (dTDLAS) sensor capable of performing absolute ("calibration free") CO concentration measurements, and of being operated as a spectroscopic transfer standard. To achieve the compatibility goal stated by WMO for CO concentration measurements and ensure the traceability of the final concentration results, traceable spectral line data, especially line intensities with appropriate uncertainties, are needed. Therefore, we utilize our new high-resolution Fourier-transform infrared (FTIR) spectroscopy CO line data for the 2-0 band, with significantly reduced uncertainties, for the dTDLAS data evaluation. Further, we demonstrate the capability of our sensor for atmospheric CO measurements, discuss uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) principles and show that CO concentrations derived using the sensor, based on the TILSAM (traceable infrared laser spectroscopic amount fraction measurement) method, are in excellent agreement with gravimetric values. Acknowledgement Parts of this work have been
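
    A hedged sketch of the calibration-free evaluation idea behind such a dTDLAS sensor, under idealized assumptions (a single isolated absorption line, ideal-gas behaviour, placeholder numbers rather than the instrument's data): the integrated line absorbance, a traceable line intensity, the optical path length, and the measured gas pressure and temperature give the amount fraction directly.

      K_B = 1.380649e-23          # Boltzmann constant, J/K

      def amount_fraction(area_cm1, line_intensity, path_cm, pressure_pa, temperature_k):
          """Beer-Lambert evaluation of a single line:
          number density n = A / (S(T) * L) in molecules/cm^3,
          amount fraction x = n * k_B * T / p (ideal gas)."""
          n_cm3 = area_cm1 / (line_intensity * path_cm)
          n_m3 = n_cm3 * 1e6
          return n_m3 * K_B * temperature_k / pressure_pa

      # Placeholder values (illustrative only), loosely representative of a weak CO 2-0 line.
      x = amount_fraction(area_cm1=3.0e-7,          # integrated absorbance, cm^-1
                          line_intensity=2.0e-23,   # S(T), cm^-1 / (molecule cm^-2)
                          path_cm=3000.0,           # assumed multi-pass cell path
                          pressure_pa=101325.0,
                          temperature_k=296.0)
      print(f"CO amount fraction ≈ {x * 1e9:.0f} ppb")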

  8. Accurate in-line CD metrology for nanometer semiconductor manufacturing

    Perng, Baw-Ching; Shieh, Jyu-Horng; Jang, S.-M.; Liang, M.-S.; Huang, Renee; Chen, Li-Chien; Hwang, Ruey-Lian; Hsu, Joe; Fong, David

    2006-03-01

    The need for absolute accuracy is increasing as semiconductor-manufacturing technologies advance to sub-65nm nodes, since device sizes are reducing to sub-50nm but offsets ranging from 5nm to 20nm are often encountered. While TEM is well-recognized as the most accurate CD metrology, direct comparison between the TEM data and in-line CD data might be misleading sometimes due to different statistical sampling and interferences from sidewall roughness. In this work we explore the capability of CD-AFM as an accurate in-line CD reference metrology. Being a member of scanning profiling metrology, CD-AFM has the advantages of avoiding e-beam damage and minimum sample damage induced CD changes, in addition to the capability of more statistical sampling than typical cross section metrologies. While AFM has already gained its reputation on the accuracy of depth measurement, not much data was reported on the accuracy of CD-AFM for CD measurement. Our main focus here is to prove the accuracy of CD-AFM and show its measuring capability for semiconductor related materials and patterns. In addition to the typical precision check, we spent an intensive effort on examining the bias performance of this CD metrology, which is defined as the difference between CD-AFM data and the best-known CD value of the prepared samples. We first examine line edge roughness (LER) behavior for line patterns of various materials, including polysilicon, photoresist, and a porous low k material. Based on the LER characteristics of each patterning, a method is proposed to reduce its influence on CD measurement. Application of our method to a VLSI nanoCD standard is then performed, and agreement of less than 1nm bias is achieved between the CD-AFM data and the standard's value. With very careful sample preparations and TEM tool calibration, we also obtained excellent correlation between CD-AFM and TEM for poly-CDs ranging from 70nm to 400nm. CD measurements of poly ADI and low k trenches are also

  9. Passive samplers accurately predict PAH levels in resident crayfish.

    Paulik, L Blair; Smith, Brian W; Bergmann, Alan J; Sower, Greg J; Forsberg, Norman D; Teeguarden, Justin G; Anderson, Kim A

    2016-02-15

    Contamination of resident aquatic organisms is a major concern for environmental risk assessors. However, collecting organisms to estimate risk is often prohibitively time- and resource-intensive. Passive sampling accurately estimates resident organism contamination, and it saves time and resources. This study used low density polyethylene (LDPE) passive water samplers to predict polycyclic aromatic hydrocarbon (PAH) levels in signal crayfish, Pacifastacus leniusculus. Resident crayfish were collected at 5 sites within and outside of the Portland Harbor Superfund Megasite (PHSM) in the Willamette River in Portland, Oregon. LDPE deployment was spatially and temporally paired with crayfish collection. Crayfish visceral and tail tissue, as well as water-deployed LDPE, were extracted and analyzed for 62 PAHs using GC-MS/MS. Freely-dissolved concentrations (Cfree) of PAHs in water were calculated from concentrations in LDPE. Carcinogenic risks were estimated for all crayfish tissues, using benzo[a]pyrene equivalent concentrations (BaPeq). ∑PAH were 5-20 times higher in viscera than in tails, and ∑BaPeq were 6-70 times higher in viscera than in tails. Eating only tail tissue of crayfish would therefore significantly reduce carcinogenic risk compared to also eating viscera. Additionally, PAH levels in crayfish were compared to levels in crayfish collected 10 years earlier. PAH levels in crayfish were higher upriver of the PHSM and unchanged within the PHSM after the 10-year period. Finally, a linear regression model predicted levels of 34 PAHs in crayfish viscera with an associated R-squared value of 0.52 (and a correlation coefficient of 0.72), using only the Cfree PAHs in water. On average, the model predicted PAH concentrations in crayfish tissue within a factor of 2.4±1.8 of measured concentrations. This affirms that passive water sampling accurately estimates PAH contamination in crayfish. Furthermore, the strong predictive ability of this simple model suggests
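
    A minimal sketch of the kind of predictive model described (ordinary least squares on log-transformed concentrations; synthetic numbers, not the study's measurements):

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic paired data: freely dissolved water concentrations (ng/L) from passive
      # samplers and corresponding tissue concentrations (ng/g) in resident organisms.
      c_free = 10 ** rng.uniform(-1, 2, size=40)
      tissue = 15.0 * c_free ** 0.9 * 10 ** rng.normal(0, 0.2, size=40)

      # Fit log10(tissue) = a + b * log10(c_free).
      b, a = np.polyfit(np.log10(c_free), np.log10(tissue), 1)
      pred = 10 ** (a + b * np.log10(c_free))

      fold = np.maximum(pred / tissue, tissue / pred)     # "within a factor of X" statistic
      r = np.corrcoef(np.log10(pred), np.log10(tissue))[0, 1]
      print(f"slope = {b:.2f}, intercept = {a:.2f}, r = {r:.2f}, "
            f"mean fold-difference = {fold.mean():.2f}")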

  10. Simple Mathematical Models Do Not Accurately Predict Early SIV Dynamics

    Cecilia Noecker

    2015-03-01

    Full Text Available Upon infection of a new host, human immunodeficiency virus (HIV) replicates in the mucosal tissues and is generally undetectable in circulation for 1–2 weeks post-infection. Several interventions against HIV including vaccines and antiretroviral prophylaxis target virus replication at this earliest stage of infection. Mathematical models have been used to understand how HIV spreads from mucosal tissues systemically and what impact vaccination and/or antiretroviral prophylaxis has on viral eradication. Because predictions of such models have been rarely compared to experimental data, it remains unclear which processes included in these models are critical for predicting early HIV dynamics. Here we modified the “standard” mathematical model of HIV infection to include two populations of infected cells: cells that are actively producing the virus and cells that are transitioning into virus production mode. We evaluated the effects of several poorly known parameters on infection outcomes in this model and compared model predictions to experimental data on infection of non-human primates with variable doses of simian immunodeficiency virus (SIV). First, we found that the mode of virus production by infected cells (budding vs. bursting) has a minimal impact on the early virus dynamics for a wide range of model parameters, as long as the parameters are constrained to provide the observed rate of SIV load increase in the blood of infected animals. Interestingly and in contrast with previous results, we found that the bursting mode of virus production generally results in a higher probability of viral extinction than the budding mode of virus production. Second, this mathematical model was not able to accurately describe the change in experimentally determined probability of host infection with increasing viral doses. Third and finally, the model was also unable to accurately explain the decline in the time to virus detection with increasing viral
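
    The two-population modification described (cells transitioning into virus production plus actively producing cells) corresponds to a standard eclipse-phase extension of the basic within-host model; a minimal sketch with assumed, illustrative parameter values (not fitted to the SIV data) is:

      from scipy.integrate import solve_ivp

      def eclipse_model(t, y, beta, k, delta, p, c):
          """Target cells T, eclipse-phase infected cells E, productively infected cells I, virus V."""
          T, E, I, V = y
          return [-beta * T * V,
                  beta * T * V - k * E,
                  k * E - delta * I,
                  p * I - c * V]

      params = (1e-7, 1.0, 0.5, 1e3, 10.0)        # beta, k, delta, p, c (assumed values)
      y0 = [1e6, 0.0, 0.0, 1e-3]                  # start from a tiny inoculum

      sol = solve_ivp(eclipse_model, (0.0, 30.0), y0, args=params,
                      dense_output=True, rtol=1e-8, atol=1e-10)
      for day in (0, 5, 10, 15, 20, 25, 30):
          print(f"day {day:2d}: viral load ≈ {sol.sol(day)[3]:.2e}")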

  11. Rapid and accurate pyrosequencing of angiosperm plastid genomes

    Farmerie William G

    2006-08-01

    Full Text Available Abstract Background Plastid genome sequence information is vital to several disciplines in plant biology, including phylogenetics and molecular biology. The past five years have witnessed a dramatic increase in the number of completely sequenced plastid genomes, fuelled largely by advances in conventional Sanger sequencing technology. Here we report a further significant reduction in time and cost for plastid genome sequencing through the successful use of a newly available pyrosequencing platform, the Genome Sequencer 20 (GS 20) System (454 Life Sciences Corporation), to rapidly and accurately sequence the whole plastid genomes of the basal eudicot angiosperms Nandina domestica (Berberidaceae) and Platanus occidentalis (Platanaceae). Results More than 99.75% of each plastid genome was simultaneously obtained during two GS 20 sequence runs, to an average depth of coverage of 24.6× in Nandina and 17.3× in Platanus. The Nandina and Platanus plastid genomes shared essentially identical gene complements and possessed the typical angiosperm plastid structure and gene arrangement. To assess the accuracy of the GS 20 sequence, over 45 kilobases of sequence were generated for each genome using conventional sequencing. Overall error rates of 0.043% and 0.031% were observed in GS 20 sequence for Nandina and Platanus, respectively. More than 97% of all observed errors were associated with homopolymer runs, with ~60% of all errors associated with homopolymer runs of 5 or more nucleotides and ~50% of all errors associated with regions of extensive homopolymer runs. No substitution errors were present in either genome. Error rates were generally higher in the single-copy and noncoding regions of both plastid genomes relative to the inverted repeat and coding regions. Conclusion Highly accurate and essentially complete sequence information was obtained for the Nandina and Platanus plastid genomes using the GS 20 System. More importantly, the high accuracy

  12. Automatic classification and accurate size measurement of blank mask defects

    Bhamidipati, Samir; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2015-07-01

    A blank mask and its preparation stages, such as cleaning or resist coating, play an important role in the eventual yield obtained by using it. Blank mask defects' impact analysis directly depends on the amount of available information such as the number of defects observed, their accurate locations and sizes. Mask usability qualification at the start of the preparation process, is crudely based on number of defects. Similarly, defect information such as size is sought to estimate eventual defect printability on the wafer. Tracking of defect characteristics, specifically size and shape, across multiple stages, can further be indicative of process related information such as cleaning or coating process efficiencies. At the first level, inspection machines address the requirement of defect characterization by detecting and reporting relevant defect information. The analysis of this information though is still largely a manual process. With advancing technology nodes and reducing half-pitch sizes, a large number of defects are observed; and the detailed knowledge associated, make manual defect review process an arduous task, in addition to adding sensitivity to human errors. Cases where defect information reported by inspection machine is not sufficient, mask shops rely on other tools. Use of CDSEM tools is one such option. However, these additional steps translate into increased costs. Calibre NxDAT based MDPAutoClassify tool provides an automated software alternative to the manual defect review process. Working on defect images generated by inspection machines, the tool extracts and reports additional information such as defect location, useful for defect avoidance[4][5]; defect size, useful in estimating defect printability; and, defect nature e.g. particle, scratch, resist void, etc., useful for process monitoring. The tool makes use of smart and elaborate post-processing algorithms to achieve this. Their elaborateness is a consequence of the variety and

  13. Standardized EEG interpretation accurately predicts prognosis after cardiac arrest

    Westhall, Erik; Rossetti, Andrea O; van Rootselaar, Anne-Fleur; Wesenberg Kjaer, Troels; Horn, Janneke; Ullén, Susann; Friberg, Hans; Nielsen, Niklas; Rosén, Ingmar; Åneman, Anders; Erlinge, David; Gasche, Yvan; Hassager, Christian; Hovdenes, Jan; Kjaergaard, Jesper; Kuiper, Michael; Pellis, Tommaso; Stammet, Pascal; Wanscher, Michael; Wetterslev, Jørn; Wise, Matt P; Cronberg, Tobias

    2016-01-01

    OBJECTIVE: To identify reliable predictors of outcome in comatose patients after cardiac arrest using a single routine EEG and standardized interpretation according to the terminology proposed by the American Clinical Neurophysiology Society. METHODS: In this cohort study, 4 EEG specialists, blinded to outcome, evaluated prospectively recorded EEGs in the Target Temperature Management trial (TTM trial) that randomized patients to 33°C vs 36°C. Routine EEG was performed in patients still comatose after rewarming. EEGs were classified into highly malignant (suppression, suppression with periodic discharges, burst-suppression), malignant (periodic or rhythmic patterns, pathological or nonreactive background), and benign EEG (absence of malignant features). Poor outcome was defined as best Cerebral Performance Category score 3-5 until 180 days. RESULTS: Eight TTM sites randomized 202...

  14. ACE-I Angioedema: Accurate Clinical Diagnosis May Prevent Epinephrine-Induced Harm

    Curtis, R. Mason; Felder, Sarah; Borici-Mazi, Rozita; Ball, Ian

    2016-01-01

    Introduction Upper airway angioedema is a life-threatening emergency department (ED) presentation with increasing incidence. Angiotensin-converting enzyme inhibitor induced angioedema (AAE) is a non-mast cell mediated etiology of angioedema. Accurate diagnosis by clinical examination can optimize patient management and reduce morbidity from inappropriate treatment with epinephrine. The aim of this study is to describe the incidence of angioedema subtypes and the management of AAE. We evaluate the appropriateness of treatments and highlight preventable iatrogenic morbidity. Methods We conducted a retrospective chart review of consecutive angioedema patients presenting to two tertiary care EDs between July 2007 and March 2012. Results Of 1,702 medical records screened, 527 were included. The cause of angioedema was identified in 48.8% (n=257) of cases. The most common identifiable etiology was AAE (33.1%, n=85), with a 60.0% male predominance. The most common AAE management strategies included diphenhydramine (63.5%, n=54), corticosteroids (50.6%, n=43) and ranitidine (31.8%, n=27). Epinephrine was administered in 21.2% (n=18) of AAE patients, five of whom received repeated doses. Four AAE patients required admission (4.7%) and one required endotracheal intubation. Epinephrine induced morbidity in two patients, causing myocardial ischemia or dysrhythmia shortly after administration. Conclusion AAE is the most common identifiable etiology of angioedema and can be accurately diagnosed by physical examination. It is easily confused with anaphylaxis and mismanaged with antihistamines, corticosteroids and epinephrine. There is little physiologic rationale for epinephrine use in AAE and much risk. Improved clinical differentiation of mast cell and non-mast cell mediated angioedema can optimize patient management. PMID:27330660

  15. Image Capture with Synchronized Multiple-Cameras for Extraction of Accurate Geometries

    Koehl, M.; Delacourt, T.; Boutry, C.

    2016-06-01

    This paper presents a project of recording and modelling tunnels, traffic circles and roads from multiple sensors. The aim is the representation and the accurate 3D modelling of a selection of road infrastructures as dense point clouds in order to extract profiles and metrics from them. Indeed, these models will be used for the sizing of infrastructures in order to simulate exceptional convoy truck routes. The objective is to extract directly from the point clouds the heights, widths and lengths of bridges and tunnels, the diameter of traffic circles, and to highlight potential obstacles for a convoy. Light, mobile and fast acquisition approaches based on images and videos from a set of synchronized sensors have been tested in order to obtain usable point clouds. The presented solution is based on a combination of multiple low-cost cameras mounted on an on-board device allowing dynamic captures. The experimental device containing GoPro Hero4 cameras has been set up and used for tests in static or mobile acquisitions. In this way, various configurations have been tested by using multiple synchronized cameras. These configurations are discussed in order to highlight the best operational configuration according to the shape of the acquired objects. As the precise calibration of each sensor and its optics is a major factor in the process of creating accurate dense point clouds, and in order to reach the best quality available from such cameras, the estimation of the internal parameters of the cameras' fisheye lenses has been performed. Reference measurements were also made using a 3D TLS (Faro Focus 3D) to allow the accuracy assessment.

  16. Misperception and accurate perception of close friend substance use in early adolescence: Developmental and intervention implications.

    Scalco, Matthew D; Meisel, Samuel N; Colder, Craig R

    2016-05-01

    Misperceptions of peer substance use (SU) are believed to be a robust correlate of adolescent SU; however, perceived peer SU is biased in the direction of an adolescent's own SU, raising questions about the validity of perceived peer SU (social norms; Henry, Kobus, & Schoeny, 2011). In addition, social norm theories emphasize inaccurate perceptions of peer SU while other theories emphasize actual peer behavior and selection of friends as motivators of adolescent SU. Furthermore, no theories consider the role of accurate perceptions, suggesting the need to more carefully consider the coevolution of perceived peer norms, actual peer behavior, and adolescent SU. To do this, we modeled the latent structure of accurate and inaccurate perceptions of peer SU while including an adolescent's own SU using latent class analysis and tested the natural evolution of the classes using latent transition analysis. The design included 3 annual assessments of peer SU and perceptions of peer SU and 6 assessments of adolescent SU (N = 765; age = 10-13 at Wave 1; female = 53%). Latent class analysis findings largely replicated Henry et al. (2011), suggesting that misperceptions of peer SU were biased by an adolescent's own SU. We also found 3 distinct pathways to a high risk class that predicted high levels of later adolescent SU, 2 in which adolescent and perceived peer SU preceded peer SU (age = 10-12 and 12-14) and another in which peer SU preceded adolescent SU and perceptions of peer SU (age = 12-14). Implications for peer influence theories are discussed. (PsycINFO Database Record) PMID:27214169

  17. A new automatic blood pressure kit auscultates for accurate reading with a smartphone

    Wu, Hongjun; Wang, Bingjian; Zhu, Xinpu; Chu, Guang; Zhang, Zhi

    2016-01-01

    Abstract The widely used oscillometric automated blood pressure (BP) monitor has been repeatedly questioned on its accuracy. A novel BP kit named Accutension, which adopts the Korotkoff auscultation method, was then devised. Accutension works with a miniature microphone, a pressure sensor, and a smartphone. The BP values are automatically displayed on the smartphone screen through the installed App. Data recorded in the phone could be played back and reconfirmed after measurement. They could also be uploaded and saved to the iCloud. The accuracy and consistency of this novel electronic auscultatory sphygmomanometer were preliminarily verified here. Thirty-two subjects were included and 82 qualified readings were obtained. The mean differences ± SD for systolic and diastolic BP readings between Accutension and mercury sphygmomanometer were 0.87 ± 2.86 and −0.94 ± 2.93 mm Hg. Agreements between Accutension and mercury sphygmomanometer were highly significant for systolic (ICC = 0.993, 95% confidence interval (CI): 0.989–0.995) and diastolic (ICC = 0.987, 95% CI: 0.979–0.991). In conclusion, Accutension worked accurately based on our pilot study data. The difference was acceptable. ICC and Bland–Altman plot charts showed good agreements with manual measurements. Systolic readings of Accutension were slightly higher than those of manual measurement, while diastolic readings were slightly lower. One possible reason was that Accutension captured the first and the last Korotkoff sound more sensitively than the human ear during manual measurement and avoided missing sounds, so that it might be more accurate than the traditional mercury sphygmomanometer. By documenting and analyzing the trend of BP values, Accutension helps in the management of hypertension and therefore contributes to mobile health services. PMID:27512876
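
    The agreement statistics quoted above (mean difference ± SD between the device and the mercury reference, Bland-Altman limits) can be reproduced on any set of paired readings; a short sketch on synthetic data (not the study's readings):

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic paired systolic readings (mm Hg): reference sphygmomanometer vs. test device,
      # generated with a small positive bias similar in size to the one reported.
      reference = rng.normal(125, 15, size=82)
      device = reference + rng.normal(0.9, 2.9, size=82)

      diff = device - reference
      bias = diff.mean()
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)      # Bland-Altman 95% limits of agreement

      print(f"mean difference = {bias:.2f} mm Hg, SD = {sd:.2f} mm Hg")
      print(f"95% limits of agreement: {loa[0]:.2f} to {loa[1]:.2f} mm Hg")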

  18. ACE-I Angioedema: Accurate Clinical Diagnosis May Prevent Epinephrine-Induced Harm

    R. Mason Curtis

    2016-06-01

    Full Text Available Introduction: Upper airway angioedema is a life-threatening emergency department (ED) presentation with increasing incidence. Angiotensin-converting enzyme inhibitor induced angioedema (AAE) is a non-mast cell mediated etiology of angioedema. Accurate diagnosis by clinical examination can optimize patient management and reduce morbidity from inappropriate treatment with epinephrine. The aim of this study is to describe the incidence of angioedema subtypes and the management of AAE. We evaluate the appropriateness of treatments and highlight preventable iatrogenic morbidity. Methods: We conducted a retrospective chart review of consecutive angioedema patients presenting to two tertiary care EDs between July 2007 and March 2012. Results: Of 1,702 medical records screened, 527 were included. The cause of angioedema was identified in 48.8% (n=257) of cases. The most common identifiable etiology was AAE (33.1%, n=85), with a 60.0% male predominance. The most common AAE management strategies included diphenhydramine (63.5%, n=54), corticosteroids (50.6%, n=43) and ranitidine (31.8%, n=27). Epinephrine was administered in 21.2% (n=18) of AAE patients, five of whom received repeated doses. Four AAE patients required admission (4.7%) and one required endotracheal intubation. Epinephrine induced morbidity in two patients, causing myocardial ischemia or dysrhythmia shortly after administration. Conclusion: AAE is the most common identifiable etiology of angioedema and can be accurately diagnosed by physical examination. It is easily confused with anaphylaxis and mismanaged with antihistamines, corticosteroids and epinephrine. There is little physiologic rationale for epinephrine use in AAE and much risk. Improved clinical differentiation of mast cell and non-mast cell mediated angioedema can optimize patient management.

  19. Standardized EEG interpretation accurately predicts prognosis after cardiac arrest

    Rossetti, Andrea O.; van Rootselaar, Anne-Fleur; Wesenberg Kjaer, Troels; Horn, Janneke; Ullén, Susann; Friberg, Hans; Nielsen, Niklas; Rosén, Ingmar; Åneman, Anders; Erlinge, David; Gasche, Yvan; Hassager, Christian; Hovdenes, Jan; Kjaergaard, Jesper; Kuiper, Michael; Pellis, Tommaso; Stammet, Pascal; Wanscher, Michael; Wetterslev, Jørn; Wise, Matt P.; Cronberg, Tobias

    2016-01-01

    Objective: To identify reliable predictors of outcome in comatose patients after cardiac arrest using a single routine EEG and standardized interpretation according to the terminology proposed by the American Clinical Neurophysiology Society. Methods: In this cohort study, 4 EEG specialists, blinded to outcome, evaluated prospectively recorded EEGs in the Target Temperature Management trial (TTM trial) that randomized patients to 33°C vs 36°C. Routine EEG was performed in patients still comatose after rewarming. EEGs were classified into highly malignant (suppression, suppression with periodic discharges, burst-suppression), malignant (periodic or rhythmic patterns, pathological or nonreactive background), and benign EEG (absence of malignant features). Poor outcome was defined as best Cerebral Performance Category score 3–5 until 180 days. Results: Eight TTM sites randomized 202 patients. EEGs were recorded in 103 patients at a median 77 hours after cardiac arrest; 37% had a highly malignant EEG and all had a poor outcome (specificity 100%, sensitivity 50%). Any malignant EEG feature had a low specificity to predict poor prognosis (48%) but if 2 malignant EEG features were present specificity increased to 96% (p < 0.001). Specificity and sensitivity were not significantly affected by targeted temperature or sedation. A benign EEG was found in 1% of the patients with a poor outcome. Conclusions: Highly malignant EEG after rewarming reliably predicted poor outcome in half of patients without false predictions. An isolated finding of a single malignant feature did not predict poor outcome whereas a benign EEG was highly predictive of a good outcome. PMID:26865516
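
    The headline numbers in this abstract follow from a 2x2 table of EEG pattern versus outcome; the sketch below shows the computation with placeholder counts chosen only to reproduce the reported 50% sensitivity and 100% specificity, not the trial's exact table.

      def sensitivity_specificity(tp, fn, tn, fp):
          """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
          return tp / (tp + fn), tn / (tn + fp)

      # Placeholder counts for "highly malignant EEG predicts poor outcome".
      tp, fn = 38, 38      # poor-outcome patients with / without a highly malignant EEG
      tn, fp = 27, 0       # good-outcome patients without / with a highly malignant EEG

      sens, spec = sensitivity_specificity(tp, fn, tn, fp)
      print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")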

  20. Automating occupational protection records systems

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs

  1. Accurate Complex Systems Design: Integrating Serious Games with Petri Nets

    Kirsten Sinclair

    2016-03-01

    Full Text Available Difficulty understanding the large number of interactions involved in complex systems makes their successful engineering a problem. Petri Nets are one graphical modelling technique used to describe and check proposed designs of complex systems thoroughly. While automatic analysis capabilities of Petri Nets are useful, their visual form is less so, particularly for communicating the design they represent. In engineering projects, this can lead to a gap in communications between people with different areas of expertise, negatively impacting achieving accurate designs. In contrast, although capable of representing a variety of real and imaginary objects effectively, behaviour of serious games can only be analysed manually through interactive simulation. This paper examines combining the complementary strengths of Petri Nets and serious games. The novel contribution of this work is a serious game prototype of a complex system design that has been checked thoroughly. Underpinned by Petri Net analysis, the serious game can be used as a high-level interface to communicate and refine the design. Improvement of a complex system design is demonstrated by applying the integration to a proof-of-concept case study.
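
    As an illustration of the modelling side only (not the paper's toolchain or its serious-game prototype), a place/transition Petri net with token firing can be written compactly:

      from typing import Dict, List, Tuple

      class PetriNet:
          """Minimal place/transition net: each transition consumes one token per input
          place and produces one token per output place."""

          def __init__(self, marking: Dict[str, int],
                       transitions: Dict[str, Tuple[List[str], List[str]]]):
              self.marking = dict(marking)
              self.transitions = transitions        # name -> (input places, output places)

          def enabled(self, name: str) -> bool:
              inputs, _ = self.transitions[name]
              return all(self.marking.get(p, 0) >= 1 for p in inputs)

          def fire(self, name: str) -> None:
              if not self.enabled(name):
                  raise ValueError(f"transition {name!r} is not enabled")
              inputs, outputs = self.transitions[name]
              for p in inputs:
                  self.marking[p] -= 1
              for p in outputs:
                  self.marking[p] = self.marking.get(p, 0) + 1

      # Toy producer/consumer net (illustrative structure only).
      net = PetriNet(
          marking={"idle": 1, "buffer": 0, "consumed": 0},
          transitions={
              "produce": (["idle"], ["buffer", "idle"]),
              "consume": (["buffer"], ["consumed"]),
          },
      )
      net.fire("produce")
      net.fire("consume")
      print(net.marking)    # {'idle': 1, 'buffer': 0, 'consumed': 1}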

  2. An accurate δf method for neoclassical transport calculation

    A δf method, solving drift kinetic equation, for neoclassical transport calculation is presented in detail. It is demonstrated that valid results essentially rely on the correct evaluation of marker density g in weight calculation. A general and accurate weighting scheme is developed without using some assumed g in weight equation for advancing particle weights, unlike the previous schemes. This scheme employs an additional weight function to directly solve g from its kinetic equation using the idea of δf method. Therefore the severe constraint that the real marker distribution must be consistent with the initially assumed g during a simulation is relaxed. An improved like-particle collision scheme is presented. By performing compensation for momentum, energy and particle losses arising from numerical errors, the conservations of all the three quantities are greatly improved during collisions. Ion neoclassical transport due to self-collisions is examined under finite banana case as well as zero banana limit. A solution with zero particle and zero energy flux (in case of no temperature gradient) over whole poloidal section is obtained. With the improvement in both like-particle collision scheme and weighting scheme, the δf simulation shows a significantly upgraded performance for neoclassical transport study. (author)

  3. Progress in Fast, Accurate Multi-scale Climate Simulations

    Collins, William D [Lawrence Berkeley National Laboratory (LBNL); Johansen, Hans [Lawrence Berkeley National Laboratory (LBNL); Evans, Katherine J [ORNL; Woodward, Carol S. [Lawrence Livermore National Laboratory (LLNL); Caldwell, Peter [Lawrence Livermore National Laboratory (LLNL)

    2015-01-01

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allow more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures, such as many-core processors and GPUs, so that these approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.

  4. Faster and More Accurate Sequence Alignment with SNAP

    Zaharia, Matei; Curtis, Kristal; Fox, Armando; Patterson, David; Shenker, Scott; Stoica, Ion; Karp, Richard M; Sittler, Taylor

    2011-01-01

    We present the Scalable Nucleotide Alignment Program (SNAP), a new short and long read aligner that is both more accurate (i.e., aligns more reads with fewer errors) and 10-100x faster than state-of-the-art tools such as BWA. Unlike recent aligners based on the Burrows-Wheeler transform, SNAP uses a simple hash index of short seed sequences from the genome, similar to BLAST's. However, SNAP greatly reduces the number and cost of local alignment checks performed through several measures: it uses longer seeds to reduce the false positive locations considered, leverages larger memory capacities to speed index lookup, and excludes most candidate locations without fully computing their edit distance to the read. The result is an algorithm that scales well for reads from one hundred to thousands of bases long and provides a rich error model that can match classes of mutations (e.g., longer indels) that today's fast aligners ignore. We calculate that SNAP can align a dataset with 30x coverage of a human genome in le...
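
    The hash-index idea described (short fixed-length seeds mapped to genome positions, followed by verification of candidate locations) can be illustrated with a toy sketch; this is not SNAP's implementation, and a simple mismatch count stands in for its edit-distance check.

      from collections import defaultdict

      def build_seed_index(genome: str, seed_len: int) -> dict:
          """Map every length-seed_len substring of the genome to its positions."""
          index = defaultdict(list)
          for i in range(len(genome) - seed_len + 1):
              index[genome[i:i + seed_len]].append(i)
          return index

      def align(read: str, genome: str, index: dict, seed_len: int, max_mismatches: int = 2):
          """Look up a few non-overlapping seeds from the read, then verify each candidate
          location by counting mismatches against the genome."""
          best = None
          for offset in range(0, len(read) - seed_len + 1, seed_len):
              for pos in index.get(read[offset:offset + seed_len], []):
                  start = pos - offset
                  if start < 0 or start + len(read) > len(genome):
                      continue
                  mism = sum(a != b for a, b in zip(read, genome[start:start + len(read)]))
                  if mism <= max_mismatches and (best is None or mism < best[1]):
                      best = (start, mism)
          return best

      genome = "ACGTACGTTGCAACGTTAGCCGTAACGTTGCA"
      index = build_seed_index(genome, seed_len=4)
      print(align("ACGTTAGCCGTA", genome, index, seed_len=4))    # (12, 0): exact hit at position 12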

  5. Accurate Detection of Rifampicin-Resistant Mycobacterium Tuberculosis Strains

    Keum-Soo Song

    2016-03-01

    Full Text Available In 2013 alone, the death rate among the 9.0 million people infected with Mycobacterium tuberculosis (TB) worldwide was around 14%, which is unacceptably high. An empiric treatment of patients infected with TB or drug-resistant Mycobacterium tuberculosis (MDR-TB) strain can also result in the spread of MDR-TB. The diagnostic tools which are rapid, reliable, and have simple experimental protocols can significantly help in decreasing the prevalence rate of MDR-TB strain. We report the evaluation of the 9G technology based 9G DNAChips that allow accurate detection and discrimination of TB and MDR-TB-RIF. One hundred and thirteen known cultured samples were used to evaluate the ability of 9G DNAChip in the detection and discrimination of TB and MDR-TB-RIF strains. Hybridization of immobilized probes with the PCR products of TB and MDR-TB-RIF strains allow their detection and discrimination. The accuracy of 9G DNAChip was determined by comparing its results with sequencing analysis and drug susceptibility testing. Sequencing analysis showed 100% agreement with the results of 9G DNAChip. The 9G DNAChip showed very high sensitivity (95.4%) and specificity (100%).

  6. The place of highly accurate methods by RNAA in metrology

    With the introduction of physical metrological concepts to chemical analysis, which require that the result should be accompanied by an uncertainty statement written down in terms of SI units, several researchers started to consider ID-MS as the only method fulfilling this requirement. However, recent publications revealed that in certain cases even some expert laboratories using ID-MS and analyzing the same material produced results for which their uncertainty statements did not overlap, which theoretically should not have taken place. This shows that no monopoly is good in science and it would be desirable to widen the set of methods acknowledged as primary in inorganic trace analysis. Moreover, ID-MS cannot be used for monoisotopic elements. The need for searching for other methods having similar metrological quality as ID-MS seems obvious. In this paper, our long-time experience on devising highly accurate ('definitive') methods by RNAA for the determination of selected trace elements in biological materials is reviewed. The general idea of definitive methods based on combination of neutron activation with the highly selective and quantitative isolation of the indicator radionuclide by column chromatography followed by gamma spectrometric measurement is recalled and illustrated by examples of the performance of such methods when determining Cd, Co, Mo, etc. It is demonstrated that such methods are able to provide very reliable results with very low levels of uncertainty traceable to SI units.

  7. Accurate quantification of cells recovered by bronchoalveolar lavage.

    Saltini, C; Hance, A J; Ferrans, V J; Basset, F; Bitterman, P B; Crystal, R G

    1984-10-01

    Quantification of the differential cell count and total number of cells recovered from the lower respiratory tract by bronchoalveolar lavage is a valuable technique for evaluating the alveolitis of patients with inflammatory disorders of the lower respiratory tract. The most commonly used technique for the evaluation of cells recovered by lavage has been to concentrate cells by centrifugation and then to determine total cell number using a hemocytometer and differential cell count from a Wright-Giemsa-stained cytocentrifuge preparation. However, we have noted that the percentage of small cells present in the original cell suspension recovered by lavage is greater than the percentage of lymphocytes identified on cytocentrifuge preparations. Therefore, we developed procedures for determining differential cell counts on lavage cells collected on Millipore filters and stained with hematoxylin-eosin (filter preparations) and compared the results of differential cell counts performed on filter preparations with those obtained using cytocentrifuge preparations. When cells recovered by lavage were collected on filter preparations, accurate differential cell counts were obtained, as confirmed by performing differential cell counts on cell mixtures of known composition, and by comparing differential cell counts obtained using filter preparations stained with hematoxylin-eosin with those obtained using filter preparations stained with a peroxidase cytochemical stain. The morphology of cells displayed on filter preparations was excellent, and interobserver variability in quantitating cell types recovered by lavage was less than 3%. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:6385789

  8. Accurate measurement of liquid transport through nanoscale conduits.

    Alibakhshi, Mohammad Amin; Xie, Quan; Li, Yinxiao; Duan, Chuanhua

    2016-01-01

    Nanoscale liquid transport governs the behaviour of a wide range of nanofluidic systems, yet remains poorly characterized and understood due to the enormous hydraulic resistance associated with the nanoconfinement and the resulting minuscule flow rates in such systems. To overcome this problem, here we present a new measurement technique based on capillary flow and a novel hybrid nanochannel design and use it to measure water transport through single 2-D hydrophilic silica nanochannels with heights down to 7 nm. Our results show that silica nanochannels exhibit increased mass flow resistance compared to the classical hydrodynamics prediction. This difference increases with decreasing channel height and reaches 45% in the case of 7 nm nanochannels. This resistance increase is attributed to the formation of a 7-angstrom-thick stagnant hydration layer on the hydrophilic surfaces. By avoiding use of any pressure and flow sensors or any theoretical estimations the hybrid nanochannel scheme enables facile and precise flow measurement through single nanochannels, nanotubes, or nanoporous media and opens the prospect for accurate characterization of both hydrophilic and hydrophobic nanofluidic systems. PMID:27112404

  9. Accurate reconstruction of hyperspectral images from compressive sensing measurements

    Greer, John B.; Flake, J. C.

    2013-05-01

    The emerging field of Compressive Sensing (CS) provides a new way to capture data by shifting the heaviest burden of data collection from the sensor to the computer on the user-end. This new means of sensing requires fewer measurements for a given amount of information than traditional sensors. We investigate the efficacy of CS for capturing HyperSpectral Imagery (HSI) remotely. We also introduce a new family of algorithms for constructing HSI from CS measurements with Split Bregman Iteration [Goldstein and Osher,2009]. These algorithms combine spatial Total Variation (TV) with smoothing in the spectral dimension. We examine models for three different CS sensors: the Coded Aperture Snapshot Spectral Imager-Single Disperser (CASSI-SD) [Wagadarikar et al.,2008] and Dual Disperser (CASSI-DD) [Gehm et al.,2007] cameras, and a hypothetical random sensing model closer to CS theory, but not necessarily implementable with existing technology. We simulate the capture of remotely sensed images by applying the sensor forward models to well-known HSI scenes - an AVIRIS image of Cuprite, Nevada and the HYMAP Urban image. To measure accuracy of the CS models, we compare the scenes constructed with our new algorithm to the original AVIRIS and HYMAP cubes. The results demonstrate the possibility of accurately sensing HSI remotely with significantly fewer measurements than standard hyperspectral cameras.

  10. Study of accurate volume measurement system for plutonium nitrate solution

    Hosoma, T. [Power Reactor and Nuclear Fuel Development Corp., Tokai, Ibaraki (Japan). Tokai Works

    1998-12-01

    It is important for effective safeguarding of nuclear materials to establish a technique for accurate volume measurement of plutonium nitrate solution in an accountancy tank. The volume of the solution can be estimated from two differential pressures between three dip-tubes, through which air is purged by a compressor. One of the differential pressures corresponds to the density of the solution, and the other corresponds to the surface level of the solution in the tank. The measurement of the differential pressure is subject to many sources of error, such as the precision of the pressure transducer, fluctuation of the back-pressure, generation of bubbles at the front of the dip-tubes, non-uniformity of temperature and density of the solution, pressure drop in the dip-tube, and so on. The various excess pressures in the volume measurement are discussed and corrected by a reasonable method. A high-precision differential pressure measurement system has been developed with a quartz-oscillation-type transducer which converts a differential pressure to a digital signal. The developed system is used for inspection by the government and IAEA. (M. Suetake)
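
    The two differential pressures described translate into density, level and (via a tank calibration) volume through simple hydrostatics; a sketch with assumed geometry, not the actual accountancy-tank calibration:

      G = 9.80665    # m/s^2

      def solution_density(dp_density_pa, tube_separation_m):
          """Density from the differential pressure between two submerged dip-tubes whose
          outlets are a known vertical distance apart: dP = rho * g * dz."""
          return dp_density_pa / (G * tube_separation_m)

      def liquid_level(dp_level_pa, density_kg_m3):
          """Level above the lower dip-tube outlet from the level differential pressure:
          dP = rho * g * h."""
          return dp_level_pa / (density_kg_m3 * G)

      def volume_from_level(level_m, tank_area_m2):
          """Illustrative calibration: a uniform cross-section. Real accountancy tanks use
          an empirically determined level-volume curve."""
          return tank_area_m2 * level_m

      dp_density = 3200.0      # Pa, assumed reading between the two submerged dip-tubes
      dp_level = 15000.0       # Pa, assumed reading referenced to the vapour-space tube
      rho = solution_density(dp_density, tube_separation_m=0.25)
      h = liquid_level(dp_level, rho)
      print(f"density ≈ {rho:.0f} kg/m3, level ≈ {h:.3f} m, "
            f"volume ≈ {volume_from_level(h, tank_area_m2=0.5) * 1e3:.0f} L")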

  11. An Accurate Projector Calibration Method Based on Polynomial Distortion Representation

    Miao Liu

    2015-10-01

    Full Text Available In structure light measurement systems or 3D printing systems, the errors caused by optical distortion of a digital projector always affect the precision performance and cannot be ignored. Existing methods to calibrate the projection distortion rely on calibration plate and photogrammetry, so the calibration performance is largely affected by the quality of the plate and the imaging system. This paper proposes a new projector calibration approach that makes use of photodiodes to directly detect the light emitted from a digital projector. By analyzing the output sequence of the photoelectric module, the pixel coordinates can be accurately obtained by the curve fitting method. A polynomial distortion representation is employed to reduce the residuals of the traditional distortion representation model. Experimental results and performance evaluation show that the proposed calibration method is able to avoid most of the disadvantages in traditional methods and achieves a higher accuracy. This proposed method is also practically applicable to evaluate the geometric optical performance of other optical projection system.
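
    A minimal sketch of fitting a polynomial distortion representation by least squares (synthetic grid points with a radial-like distortion, not the paper's photodiode measurements):

      import numpy as np

      rng = np.random.default_rng(4)

      def poly_terms(x, y):
          """Design matrix for a third-order bivariate polynomial in (x, y)."""
          return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2,
                                  x**2 * y, x * y**2, x**3, y**3])

      # Synthetic "detected" coordinates of an ideal projector pixel grid (normalized units).
      xi, yi = np.meshgrid(np.linspace(-1, 1, 15), np.linspace(-1, 1, 15))
      xi, yi = xi.ravel(), yi.ravel()
      r2 = xi**2 + yi**2
      xd = xi * (1 + 0.08 * r2) + rng.normal(0, 1e-3, xi.shape)
      yd = yi * (1 + 0.08 * r2) + rng.normal(0, 1e-3, yi.shape)

      # Fit the polynomial mapping from distorted to ideal coordinates, one model per axis.
      A = poly_terms(xd, yd)
      cx, *_ = np.linalg.lstsq(A, xi, rcond=None)
      cy, *_ = np.linalg.lstsq(A, yi, rcond=None)

      residual = np.hypot(A @ cx - xi, A @ cy - yi)
      print(f"max residual after correction: {residual.max():.2e} (normalized units)")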

  12. A New Path Generation Algorithm Based on Accurate NURBS Curves

    Sawssen Jalel

    2016-04-01

    Full Text Available The process of finding an optimum, smooth and feasible global path for mobile robot navigation usually involves determining the shortest polyline path, which will be subsequently smoothed to satisfy the requirements. Within this context, this paper deals with a novel roadmap algorithm for generating an optimal path in terms of Non-Uniform Rational B-Splines (NURBS) curves. The generated path is well constrained within the curvature limit by exploiting the influence of the weight parameter of NURBS and/or the control points’ locations. The novelty of this paper lies in the fact that NURBS curves are not used only as a means of smoothing, but they are also involved in meeting the system’s constraints via a suitable parameterization of the weights and locations of control points. The accurate parameterization of weights allows for a greater benefit to be derived from the influence and geometrical effect of this factor, which has not been well investigated in previous works. The effectiveness of the proposed algorithm is demonstrated through extensive MATLAB computer simulations.
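
    A small sketch of how a NURBS curve point is evaluated from control points, weights and a knot vector (Cox-de Boor recursion for the B-spline basis, then the rational combination); this is a generic evaluation routine, not the paper's path-generation algorithm.

      import numpy as np

      def bspline_basis(i, p, u, knots):
          """Cox-de Boor recursion for the i-th B-spline basis function of degree p."""
          if p == 0:
              return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
          left = right = 0.0
          if knots[i + p] != knots[i]:
              left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
          if knots[i + p + 1] != knots[i + 1]:
              right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                       * bspline_basis(i + 1, p - 1, u, knots))
          return left + right

      def nurbs_point(u, ctrl, weights, knots, degree):
          """C(u) = sum_i N_i,p(u) w_i P_i / sum_i N_i,p(u) w_i."""
          basis = np.array([bspline_basis(i, degree, u, knots) for i in range(len(ctrl))])
          wb = basis * weights
          return wb @ ctrl / wb.sum()

      # Clamped cubic NURBS with four control points; heavier middle weights pull the path inward.
      ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 2.5], [4.0, 0.0]])
      weights = np.array([1.0, 2.0, 2.0, 1.0])
      knots = [0, 0, 0, 0, 1, 1, 1, 1]

      for u in (0.0, 0.25, 0.5, 0.75, 0.999):       # stop short of 1.0: the basis here is half-open
          print(u, nurbs_point(u, ctrl, weights, knots, degree=3))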

  13. Accurate ab initio vibrational energies of methyl chloride

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH3 35Cl and CH3 37Cl. The respective PESs, CBS-35 HL and CBS-37 HL, are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY3Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35 HL and CBS-37 HL PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm−1, respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH3Cl without empirical refinement of the respective PESs.

  14. Accurate ab initio vibrational energies of methyl chloride

    Owens, Alec, E-mail: owens@mpi-muelheim.mpg.de [Max-Planck-Institut für Kohlenforschung, Kaiser-Wilhelm-Platz 1, 45470 Mülheim an der Ruhr (Germany); Department of Physics and Astronomy, University College London, Gower Street, WC1E 6BT London (United Kingdom); Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan [Department of Physics and Astronomy, University College London, Gower Street, WC1E 6BT London (United Kingdom); Thiel, Walter [Max-Planck-Institut für Kohlenforschung, Kaiser-Wilhelm-Platz 1, 45470 Mülheim an der Ruhr (Germany)

    2015-06-28

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH3^35Cl and CH3^37Cl. The respective PESs, CBS-35^HL and CBS-37^HL, are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY3Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35^HL and CBS-37^HL PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm^-1, respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH3Cl without empirical refinement of the respective PESs.

  15. An accurate δf method for neoclassical transport calculation

    Wang, W.X.; Nakajima, N.; Murakami, S.; Okamoto, M. [National Inst. for Fusion Science, Toki, Gifu (Japan)

    1999-03-01

    A δf method, solving the drift kinetic equation, for neoclassical transport calculation is presented in detail. It is demonstrated that valid results essentially rely on the correct evaluation of the marker density g in the weight calculation. A general and accurate weighting scheme is developed without using an assumed g in the weight equation for advancing particle weights, unlike previous schemes. This scheme employs an additional weight function to directly solve g from its kinetic equation using the idea of the δf method. Therefore the severe constraint that the real marker distribution must be consistent with the initially assumed g during a simulation is relaxed. An improved like-particle collision scheme is presented. By performing compensation for momentum, energy and particle losses arising from numerical errors, the conservation of all three quantities is greatly improved during collisions. Ion neoclassical transport due to self-collisions is examined in the finite banana case as well as in the zero banana limit. A solution with zero particle flux and zero energy flux (in the case of no temperature gradient) over the whole poloidal section is obtained. With the improvement in both the like-particle collision scheme and the weighting scheme, the δf simulation shows a significantly upgraded performance for neoclassical transport studies. (author)

  16. Iterative feature refinement for accurate undersampled MR image reconstruction

    Wang, Shanshan; Liu, Jianbo; Liu, Qiegen; Ying, Leslie; Liu, Xin; Zheng, Hairong; Liang, Dong

    2016-05-01

    Accelerating MR scanning is of great significance for clinical, research and advanced applications, and one main effort to achieve this is the utilization of compressed sensing (CS) theory. Nevertheless, the existing CSMRI approaches still have limitations such as fine structure loss or high computational complexity. This paper proposes a novel iterative feature refinement (IFR) module for accurate MR image reconstruction from undersampled k-space data. Integrating IFR with CSMRI which is equipped with fixed transforms, we develop an IFR-CS method to restore meaningful structures and details that would otherwise be discarded, without introducing too much additional complexity. Specifically, the proposed IFR-CS is realized with three iterative steps, namely sparsity-promoting denoising, feature refinement and Tikhonov regularization. Experimental results on both simulated and in vivo MR datasets have shown that the proposed module has a strong capability to capture image details, and that IFR-CS is comparable and even superior to other state-of-the-art reconstruction approaches.
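
    A rough sense of the three-step iteration described above can be conveyed with the toy loop below. The denoiser and the feature-refinement operator are simple stand-ins (Gaussian smoothing and a scaled residual add-back), not the operators of IFR-CS, and the closed-form data-consistency step only exploits the fact that Cartesian undersampling acts as a diagonal mask in Fourier space; everything here is an illustrative assumption.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def ifr_cs_sketch(y, mask, lam=0.05, n_iter=30, sigma=1.0):
            """Toy CS-MRI loop: denoise -> refine features -> Tikhonov data consistency.
            y    : undersampled k-space data (zeros where not sampled)
            mask : binary sampling mask, same shape as y
            """
            x = np.real(np.fft.ifft2(y))                      # zero-filled initial image
            for _ in range(n_iter):
                smoothed = gaussian_filter(x, sigma)          # stand-in "denoiser"
                feature = x - smoothed                        # residual carrying fine detail
                z = smoothed + 0.5 * feature                  # crude "feature refinement"
                # Tikhonov-regularized data consistency, closed form because the
                # sampling operator is a diagonal mask in Fourier space:
                X = (mask * y + lam * np.fft.fft2(z)) / (mask + lam)
                x = np.real(np.fft.ifft2(X))
            return x

        # Tiny demo on a synthetic image with ~30% random k-space sampling.
        rng = np.random.default_rng(1)
        img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0; img[28:36, 28:36] = 2.0
        mask = (rng.uniform(size=img.shape) < 0.3).astype(float)
        y = mask * np.fft.fft2(img)
        recon = ifr_cs_sketch(y, mask)
        print("relative reconstruction error:", np.linalg.norm(recon - img) / np.linalg.norm(img))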

  17. A Distributed Weighted Voting Approach for Accurate Eye Center Estimation

    Gagandeep Singh

    2013-05-01

    This paper proposes a novel approach for accurate estimation of the eye center in face images. A distributed voting based approach, in which every pixel votes, is adopted for potential eye center candidates. The votes are distributed over a subset of pixels which lie in a direction opposite to the gradient direction, and the weight of the votes is distributed according to a novel mechanism. First, the image is normalized to eliminate illumination variations and its edge map is generated using the Canny edge detector. Distributed voting is applied on the edge image to generate different eye center candidates. Morphological closing and local maxima search are used to reduce the number of candidates. A classifier based on spatial and intensity information is used to choose the correct candidates for the locations of the eye center. The proposed approach was tested on the BioID face database and resulted in a better iris detection rate than the state-of-the-art. The proposed approach is robust against illumination variation, small pose variations, presence of eye glasses and partial occlusion of the eyes. Defence Science Journal, 2013, 63(3), pp. 292-297, DOI: http://dx.doi.org/10.14429/dsj.63.2763
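
    The core voting idea, casting votes along the direction opposite to the local intensity gradient and keeping the strongest accumulator cell, can be sketched as follows. The step length, the 1/k vote weighting, the gradient threshold and the synthetic dark-disc test image are illustrative assumptions; the paper's edge map, morphological post-processing and candidate classifier are omitted.

        import numpy as np

        def eye_center_votes(gray, n_steps=15, step=1.0):
            """Accumulate votes along the direction opposite to the intensity gradient.
            A toy version of distributed gradient voting."""
            gy, gx = np.gradient(gray.astype(float))
            mag = np.hypot(gx, gy)
            votes = np.zeros_like(gray, dtype=float)
            ys, xs = np.nonzero(mag > 0.1 * mag.max())        # keep strong-gradient pixels only
            for y, x in zip(ys, xs):
                dx, dy = -gx[y, x] / mag[y, x], -gy[y, x] / mag[y, x]
                for k in range(1, n_steps + 1):               # distribute votes along the ray
                    vy, vx = int(round(y + k * step * dy)), int(round(x + k * step * dx))
                    if 0 <= vy < gray.shape[0] and 0 <= vx < gray.shape[1]:
                        votes[vy, vx] += mag[y, x] / k        # votes closer to the voter weigh more
            return np.unravel_index(np.argmax(votes), votes.shape)

        # Demo: a dark disc (an "iris") on a bright background, centered at (30, 45).
        yy, xx = np.mgrid[0:60, 0:80]
        img = np.where((yy - 30) ** 2 + (xx - 45) ** 2 < 100, 40, 200).astype(float)
        print("estimated center (row, col):", eye_center_votes(img))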

  18. Reusable, robust, and accurate laser-generated photonic nanosensor.

    Yetisen, Ali K; Montelongo, Yunuen; da Cruz Vasconcellos, Fernando; Martinez-Hurtado, J L; Neupane, Sankalpa; Butt, Haider; Qasim, Malik M; Blyth, Jeffrey; Burling, Keith; Carmody, J Bryan; Evans, Mark; Wilkinson, Timothy D; Kubota, Lauro T; Monteiro, Michael J; Lowe, Christopher R

    2014-06-11

    Developing noninvasive and accurate diagnostics that are easily manufactured, robust, and reusable will provide monitoring of high-risk individuals in any clinical or point-of-care environment. We have developed a clinically relevant optical glucose nanosensor that can be reused at least 400 times without a compromise in accuracy. The use of a single 6 ns laser (λ = 532 nm, 200 mJ) pulse rapidly produced off-axis Bragg diffraction gratings consisting of ordered silver nanoparticles embedded within a phenylboronic acid-functionalized hydrogel. This sensor exhibited reversible large wavelength shifts and diffracted the spectrum of narrow-band light over the wavelength range λpeak ≈ 510-1100 nm. The experimental sensitivity of the sensor permits diagnosis of glucosuria in the urine samples of diabetic patients with an improved performance compared to commercial high-throughput urinalysis devices. The sensor response was achieved within 5 min and reset to baseline in ∼10 s. It is anticipated that this sensing platform will have implications for the development of reusable, equipment-free colorimetric point-of-care diagnostic devices for diabetes screening. PMID:24844116

  19. An Accurate ANFIS-based MPPT for Solar PV System

    Ahmed Bin-Halabi

    2014-06-01

    It has been found from the literature review that ANFIS-based maximum power point tracking (MPPT) techniques are very fast and accurate in tracking the MPP under any weather conditions, and they have smaller power losses if trained well. Unfortunately, this is true in simulation, but in practice they do not work very well because they do not take aging of solar cells or the effect of dust and shading into account. In other words, the solar irradiance measured by the irradiance sensor is not always the same irradiance that influences the PV module. The main objective of this work is to design and practically implement an MPPT system for solar PV with high speed, high efficiency, and relatively easy implementation in order to improve the efficiency of solar energy conversion. This MPPT system is based on the ANFIS technique. The contribution of this research is eliminating the need for an irradiance sensor while having the same adequate performance obtained by the ANFIS with an irradiance sensor, both in simulation and in experimental implementation. The proposed technique has been validated by comparing the practical results of the implemented setup to simulations. Experimental results have shown good agreement with simulation results.

  20. AUTOMATED, HIGHLY ACCURATE VERIFICATION OF RELAP5-3D

    George L Mesina; David Aumiller; Francis Buschman

    2014-07-01

    Computer programs that analyze light water reactor safety solve complex systems of governing, closure and special process equations to model the underlying physics. In addition, these programs incorporate many other features and are quite large. RELAP5-3D[1] has over 300,000 lines of coding for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. Verification ensures that a program is built right by checking that it meets its design specifications. Recently, there has been increased emphasis on the development of automated verification processes that compare coding against its documented algorithms and equations and compare its calculations against analytical solutions and the method of manufactured solutions[2]. For the first time, the ability exists to ensure that the data transfer operations associated with timestep advancement/repeating and writing/reading a solution to a file have no unintended consequences. To ensure that the code performs as intended over its extensive list of applications, an automated and highly accurate verification method has been modified and applied to RELAP5-3D. Furthermore, mathematical analysis of the adequacy of the checks used in the comparisons is provided.
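
    The method of manufactured solutions mentioned above can be illustrated, independently of RELAP5-3D, with a toy verification of a 1-D finite-difference Poisson solver: a solution is chosen, the matching source term is derived analytically, and the observed order of accuracy is checked against the scheme's design order. The solver, the manufactured solution and the grid sizes are all illustrative assumptions.

        import numpy as np

        def solve_poisson(n, source, left, right):
            """Second-order finite-difference solve of -u'' = f on [0,1] with Dirichlet BCs."""
            h = 1.0 / (n + 1)
            x = np.linspace(h, 1 - h, n)
            A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)) / h**2
            b = source(x)
            b[0] += left / h**2      # move known boundary values to the right-hand side
            b[-1] += right / h**2
            return x, np.linalg.solve(A, b)

        # Manufactured solution u(x) = sin(pi x)  =>  source f(x) = pi^2 sin(pi x).
        u_exact = lambda x: np.sin(np.pi * x)
        f = lambda x: np.pi**2 * np.sin(np.pi * x)

        errors, hs = [], []
        for n in (16, 32, 64, 128):
            x, u = solve_poisson(n, f, u_exact(0.0), u_exact(1.0))
            errors.append(np.max(np.abs(u - u_exact(x))))
            hs.append(1.0 / (n + 1))
        orders = np.log(np.array(errors[:-1]) / errors[1:]) / np.log(np.array(hs[:-1]) / hs[1:])
        print("observed orders of accuracy:", orders)   # should approach 2 for this scheme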

  1. Accurate and efficient waveforms for compact binaries on eccentric orbits

    Huerta, E A; McWilliams, Sean T; O'Shaughnessy, Richard; Yunes, Nicolas

    2014-01-01

    Compact binaries that emit gravitational waves in the sensitivity band of ground-based detectors can have non-negligible eccentricities just prior to merger, depending on the formation scenario. We develop a purely analytic, frequency-domain model for gravitational waves emitted by compact binaries on orbits with small eccentricity, which reduces to the quasi-circular post-Newtonian approximant TaylorF2 at zero eccentricity and to the post-circular approximation of Yunes et al. (2009) at small eccentricity. Our model uses a spectral approximation to the (post-Newtonian) Kepler problem to model the orbital phase as a function of frequency, accounting for eccentricity effects up to $\mathcal{O}(e^8)$ at each post-Newtonian order. Our approach accurately reproduces an alternative time-domain eccentric waveform model for eccentricities $e \in [0, 0.4]$ and binaries with total mass less than 12 solar masses. As an application, we evaluate the signal amplitude that eccentric binaries produce in different networks of e...

  2. Accurate measurement of streamwise vortices in low speed aerodynamic flows

    Waldman, Rye M.; Kudo, Jun; Breuer, Kenneth S.

    2010-11-01

    Low Reynolds number experiments with flapping animals (such as bats and small birds) are of current interest both for understanding biological flight mechanics and for their application to Micro Air Vehicles (MAVs), which operate in a similar parameter space. Previous PIV wake measurements have described the structures left by bats and birds, and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions due to significant experimental challenges associated with the highly three-dimensional and unsteady nature of the flows, and the low wake velocities associated with lifting bodies that only weigh a few grams. This requires the high-speed resolution of small flow features in a large field of view using limited laser energy and finite camera resolution. Cross-stream measurements are further complicated by the high out-of-plane flow, which requires thick laser sheets and short interframe times. To quantify and address these challenges we present data from a model study on the wake behind a fixed wing at conditions comparable to those found in biological flight. We present a detailed analysis of the PIV wake measurements, discuss the criteria necessary for accurate measurements, and present a new dual-plane PIV configuration to resolve these issues.

  3. Accurate analysis of multicomponent fuel spray evaporation in turbulent flow

    Rauch, Bastian; Calabria, Raffaela; Chiariello, Fabio; Le Clercq, Patrick; Massoli, Patrizio; Rachner, Michael

    2012-04-01

    The aim of this paper is to perform an accurate analysis of the evaporation of single-component and binary-mixture fuel sprays in a hot, weakly turbulent pipe flow by means of experimental measurement and numerical simulation. This gives a deeper insight into the relationship between fuel composition and spray evaporation. The turbulence intensity in the test section is equal to 10%, and the integral length scale is three orders of magnitude larger than the droplet size, while the turbulence microscale (Kolmogorov scale) is of the same order as the droplet diameter. The spray produced by means of a calibrated droplet generator was injected into an electrically preheated gas flow. N-nonane, isopropanol, and their mixtures were used in the tests. The generalized scattering imaging technique was applied to simultaneously determine size, velocity, and spatial location of the droplets carried by the turbulent flow in the quartz tube. The spray evaporation was computed using a Lagrangian particle solver coupled to a gas-phase solver. Computations of spray mean diameter and droplet size distributions at different locations along the pipe compare very favorably with the measurement results. This combined research tool enabled further investigation of the parameters influencing the evaporation process, such as the turbulence, droplet internal mixing, and liquid-phase thermophysical properties.

  4. Accurate measurement of RF exposure from emerging wireless communication systems

    Isotropic broadband probes or spectrum analyzers (SAs) may be used for the measurement of rapidly varying electromagnetic fields generated by emerging wireless communication systems. In this paper this problem is investigated by comparing the responses measured by two different isotropic broadband probes typically used to perform electric field (E-field) evaluations. The broadband probes are submitted to signals with variable duty cycles (DC) and crest factors (CF), either with or without Orthogonal Frequency Division Multiplexing (OFDM) modulation but with the same root-mean-square (RMS) power. The two probes do not provide accurate enough results for deterministic signals such as Worldwide Interoperability for Microwave Access (WiMAX) or Long Term Evolution (LTE), nor for non-deterministic signals such as Wireless Fidelity (WiFi). The legacy measurement protocols should be adapted to cope with the emerging wireless communication technologies based on the OFDM modulation scheme. This is not easily achieved except when the statistics of the RF emission are well known. In this case the measurement errors are shown to be systematic and a correction factor or calibration can be applied to obtain a good approximation of the total RMS power.
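
    The systematic nature of the error, and the kind of correction factor mentioned at the end of the abstract, can be illustrated with a toy burst signal: for an on/off envelope with duty cycle DC, the true RMS level equals the on-state RMS scaled by sqrt(DC), so a reading that effectively reflects the on-state power overestimates the RMS power by a factor of 1/DC. The signal model and numbers below are illustrative assumptions, not the WiMAX/LTE/WiFi waveforms of the study.

        import numpy as np

        rng = np.random.default_rng(2)
        fs, duration, duty_cycle = 1e6, 0.1, 0.25             # 25% duty-cycle burst signal
        n = int(fs * duration)
        burst = (np.arange(n) % 1000) < (1000 * duty_cycle)   # periodic on/off envelope
        signal = burst * rng.normal(size=n)                   # noise-like (OFDM-ish) content when on

        rms_true = np.sqrt(np.mean(signal**2))
        rms_on = np.sqrt(np.mean(signal[burst]**2))           # what an "on-state" reading would give
        print("true RMS          :", rms_true)
        print("on-state RMS      :", rms_on)
        print("sqrt(DC) * on RMS :", np.sqrt(duty_cycle) * rms_on)   # matches the true RMS
        print("power overestimate if DC ignored: x%.1f" % (rms_on**2 / rms_true**2))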

  5. Accurate reading with sequential presentation of single letters

    Nicholas Seow Chiang Price

    2012-10-01

    Full Text Available Rapid, accurate reading is possible when isolated, single words from a sentence are sequentially presented at a fixed spatial location. We investigated if reading of words and sentences is possible when single letters are rapidly presented at the fovea under user-controlled or automatically-controlled rates. When tested with complete sentences, trained participants achieved reading rates of over 60 words/minute and accuracies of over 90% with the single letter reading (SLR method and naive participants achieved average reading rates over 30 wpm with >90% accuracy. Accuracy declined as individual letters were presented for shorter periods of time, even when the overall reading rate was maintained by increasing the duration of spaces between words. Words in the lexicon that occur more frequently were identified with higher accuracy and more quickly, demonstrating that trained participants have lexical access. In combination, our data strongly suggest that comprehension is possible and that SLR is a practicable form of reading under conditions in which normal scanning of text is not possible, or for scenarios with limited spatial and temporal resolution such as patients with low vision or prostheses.

  6. Accurate stereochemistry for two related 22,26-epiminocholestene derivatives

    Regioselective opening of ring E of solasodine under various conditions afforded (25R)-22,26-epiminocholesta-5,22(N)-diene-3β,16β-diyl diacetate (previously known as 3,16-diacetyl pseudosolasodine B), C31H47NO4, or (22S,25R)-16β-hydroxy-22,26-epiminocholesta-5-en-3β-yl acetate (a derivative of the naturally occurring alkaloid oblonginine), C29H47NO3. In both cases, the reactions are carried out with retention of chirality at the C16, C20 and C25 stereogenic centers, which are found to be S, S and R, respectively. Although pseudosolasodine was synthesized 50 years ago, these accurate assignments clarify some controversial points about the actual stereochemistry for these alkaloids. This is of particular importance in the case of oblonginine, since this compound is currently under consideration for the treatment of aphasia arising from apoplexy; the present study defines a diastereoisomerically pure compound for pharmacological studies.

  7. How Accurate Is Pierce's Theory of Traveling Wave Tube?

    Simon, D. H.; Chernin, D.; Wong, P.; Zhang, P.; Lau, Y. Y.; Dong, C. F.; Hoff, B.; Gilgenbach, R. M.

    2015-11-01

    This paper provides a rigorous test of the accuracy of Pierce's classical theory of traveling wave tubes (TWTs). The EXACT dispersion relation for a dielectric TWT is derived, from which the spatial amplification rate, k_i, is calculated. This k_i is compared with that obtained from Pierce's widely used 3-wave theory and his more general 4-wave theory (which includes the reverse propagating circuit mode). We have used various procedures to extract Pierce's gain parameter C and space charge parameter Q from the exact dispersion relation. We find that, in general, the 3-wave theory is a poor representation of the exact dispersion relation if C > 0.05. However, the 4-wave theory gives excellent agreement even for C as high as 0.12 and over more than 20 percent bandwidth, if the quantity (k2 × C3) is evaluated accurately as a function of frequency, and if Q is expanded to first order in the wavenumber k, where Q is the difference between the exact dispersion relation and its 4-wave representation in which Q is set to zero. Similar tests will be performed on the disk-on-rod slow wave TWT, for which the hot tube dispersion relation including all space harmonics has been obtained. Supported by AFOSR FA9550-14-1-0309, FA9550-15-1-0097, AFRL FA9451-14-1-0374, and L-3 Communications.

  8. Accurate measurement of RF exposure from emerging wireless communication systems

    Letertre, Thierry; Monebhurrun, Vikass; Toffano, Zeno

    2013-04-01

    Isotropic broadband probes or spectrum analyzers (SAs) may be used for the measurement of rapidly varying electromagnetic fields generated by emerging wireless communication systems. In this paper this problem is investigated by comparing the responses measured by two different isotropic broadband probes typically used to perform electric field (E-field) evaluations. The broadband probes are submitted to signals with variable duty cycles (DC) and crest factors (CF), either with or without Orthogonal Frequency Division Multiplexing (OFDM) modulation but with the same root-mean-square (RMS) power. The two probes do not provide accurate enough results for deterministic signals such as Worldwide Interoperability for Microwave Access (WiMAX) or Long Term Evolution (LTE), nor for non-deterministic signals such as Wireless Fidelity (WiFi). The legacy measurement protocols should be adapted to cope with the emerging wireless communication technologies based on the OFDM modulation scheme. This is not easily achieved except when the statistics of the RF emission are well known. In this case the measurement errors are shown to be systematic and a correction factor or calibration can be applied to obtain a good approximation of the total RMS power.

  9. Data fusion for accurate microscopic rough surface metrology.

    Chen, Yuhang

    2016-06-01

    Data fusion for rough surface measurement and evaluation was analyzed on simulated datasets, one with higher density (HD) but lower accuracy and the other with lower density (LD) but higher accuracy. Experimental verifications were then performed on laser scanning microscopy (LSM) and atomic force microscopy (AFM) characterizations of surface areal roughness artifacts. The results demonstrated that the fusion based on Gaussian process models is effective and robust under different measurement biases and noise strengths. All the amplitude, height distribution, and spatial characteristics of the original sample structure can be precisely recovered, with better metrological performance than any individual measurements. As for the influencing factors, the HD noise has a relatively weaker effect as compared with the LD noise. Furthermore, to enable an accurate fusion, the ratio of LD sampling interval to surface autocorrelation length should be smaller than a critical threshold. In general, data fusion is capable of enhancing the nanometrology of rough surfaces by combining efficient LSM measurement and down-sampled fast AFM scan. The accuracy, resolution, spatial coverage and efficiency can all be significantly improved. It is thus expected to have potential applications in development of hybrid microscopy and in surface metrology. PMID:27058888
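
    The Gaussian-process-based fusion mentioned above can be sketched generically: the dense but noisy dataset and the sparse but accurate dataset are combined in a single regression by giving each sample its own noise variance. The kernel, the noise levels and the synthetic "rough surface" below are illustrative assumptions, not the authors' model.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(3)
        surface = lambda x: 0.5 * np.sin(8 * x) + 0.2 * np.sin(23 * x)   # "true" rough profile

        # High-density but noisy measurement (e.g. a fast optical scan) ...
        x_hd = np.linspace(0, 1, 200); y_hd = surface(x_hd) + rng.normal(0, 0.08, x_hd.size)
        # ... fused with a low-density but accurate measurement (e.g. a down-sampled AFM scan).
        x_ld = np.linspace(0, 1, 15);  y_ld = surface(x_ld) + rng.normal(0, 0.005, x_ld.size)

        X = np.concatenate([x_hd, x_ld])[:, None]
        y = np.concatenate([y_hd, y_ld])
        noise = np.concatenate([np.full(x_hd.size, 0.08**2), np.full(x_ld.size, 0.005**2)])

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.05), alpha=noise, normalize_y=True)
        gp.fit(X, y)

        x_eval = np.linspace(0, 1, 500)[:, None]
        y_fused, y_std = gp.predict(x_eval, return_std=True)
        print("fused RMS error:", np.sqrt(np.mean((y_fused - surface(x_eval[:, 0]))**2)))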

  10. Downhole temperature tool accurately measures well bore profile

    This paper reports that an inexpensive temperature tool provides accurate temperature measurements during drilling operations for better design of cement jobs, workovers, well stimulation, and well bore hydraulics. Valid temperature data during specific wellbore operations can improve initial job design, fluid testing, and slurry placement, ultimately enhancing well bore performance. This improvement applies to cement slurries, breaker activation for slurries, breaker activation for stimulation and profile control, and fluid rheological properties for all downhole operations. The temperature tool has been run standalone mounted inside drill pipe, on slick wire line and braided cable, and as a free-fall tool. It has also been run piggyback on both directional surveys (slick line and free-fall) and standard logging runs. This temperature measuring system has been used extensively in field well bores to depths of 20,000 ft. The temperature tool is completely reusable in the field, similar to the standard directional survey tools used on many drilling rigs. The system includes a small, rugged, programmable temperature sensor, a standard body housing, various adapters for specific applications, and a personal computer (PC) interface.

  11. Accurate measurement of liquid transport through nanoscale conduits

    Alibakhshi, Mohammad Amin; Xie, Quan; Li, Yinxiao; Duan, Chuanhua

    2016-04-01

    Nanoscale liquid transport governs the behaviour of a wide range of nanofluidic systems, yet remains poorly characterized and understood due to the enormous hydraulic resistance associated with the nanoconfinement and the resulting minuscule flow rates in such systems. To overcome this problem, here we present a new measurement technique based on capillary flow and a novel hybrid nanochannel design and use it to measure water transport through single 2-D hydrophilic silica nanochannels with heights down to 7 nm. Our results show that silica nanochannels exhibit increased mass flow resistance compared to the classical hydrodynamics prediction. This difference increases with decreasing channel height and reaches 45% in the case of 7 nm nanochannels. This resistance increase is attributed to the formation of a 7-angstrom-thick stagnant hydration layer on the hydrophilic surfaces. By avoiding use of any pressure and flow sensors or any theoretical estimations the hybrid nanochannel scheme enables facile and precise flow measurement through single nanochannels, nanotubes, or nanoporous media and opens the prospect for accurate characterization of both hydrophilic and hydrophobic nanofluidic systems.

  12. Fast and Accurate Brain Image Retrieval Using Gabor Wavelet Algorithm

    J.Esther

    2014-01-01

    CBIR in medical image databases is used to assist physicians in diagnosing disease and also to aid diagnosis by identifying similar past cases. In order to retrieve similar images from a large data set quickly, accurately and effectively, a pre-processing step of brain extraction is applied. It removes unwanted non-brain areas such as the scalp, skull, neck, eyes and ears from MRI head scan images. After removing the non-brain regions, retrieval of similar images is much more effective. In this paper a brain extraction technique using fuzzy morphological operators is proposed. For the experiments, 1200 MRI images were taken from a scan centre and some brain images were collected from the web; these were also processed with the popular brain extraction algorithms Graph-Cut (GCUT) and Expectation Maximization (EMA). The experimental results show that the proposed fuzzy morphological operator algorithm (FMOA) gives the most promising results. The FMOA output is then used to retrieve brain images from the large collection of databases using the Gabor wavelet transform.
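
    As a generic illustration of Gabor-wavelet-based retrieval (not the FMOA pipeline itself), the sketch below builds a small Gabor filter-bank signature per image and ranks a database by Euclidean distance to the query signature. The filter frequencies, orientations, random test images and distance measure are all illustrative assumptions.

        import numpy as np
        from skimage.filters import gabor

        def gabor_signature(image, frequencies=(0.1, 0.2, 0.3), n_orient=4):
            """Mean/std of Gabor filter response magnitudes over a small filter bank."""
            feats = []
            for f in frequencies:
                for theta in np.linspace(0, np.pi, n_orient, endpoint=False):
                    real, imag = gabor(image, frequency=f, theta=theta)
                    mag = np.hypot(real, imag)
                    feats.extend([mag.mean(), mag.std()])
            return np.array(feats)

        def retrieve(query, database):
            """Rank database images by Euclidean distance between Gabor signatures."""
            q = gabor_signature(query)
            dists = [np.linalg.norm(q - gabor_signature(img)) for img in database]
            return np.argsort(dists)

        # Toy demo with random images; in practice these would be skull-stripped MRI slices.
        rng = np.random.default_rng(4)
        db = [rng.random((64, 64)) for _ in range(5)]
        print("ranking:", retrieve(db[2], db))   # index 2 should be ranked first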

  13. Fast, accurate, robust and Open Source Brain Extraction Tool (OSBET)

    Namias, R.; Donnelly Kehoe, P.; D'Amato, J. P.; Nagel, J.

    2015-12-01

    The removal of non-brain regions in neuroimaging is a critical task for favorable preprocessing. The skull-stripping depends on different factors, including the noise level in the image, the anatomy of the subject being scanned and the acquisition sequence. For these and other reasons, an ideal brain extraction method should be fast, accurate, user friendly, open-source and knowledge based (to allow interaction with the algorithm in case the expected outcome is not obtained), producing stable results and making it possible to automate the process for large datasets. There are already a large number of validated tools to perform this task, but none of them meets the desired characteristics. In this paper we introduce an open source brain extraction tool (OSBET), composed of four steps using simple well-known operations such as optimal thresholding, binary morphology, labeling and geometrical analysis, that aims to assemble all the desired features. We present an experiment comparing OSBET with six other state-of-the-art techniques against a publicly available dataset consisting of 40 T1-weighted 3D scans and their corresponding manually segmented images. OSBET gave both a short duration and excellent accuracy, obtaining the best Dice coefficient metric. Further validation should be performed, for instance, in an unhealthy population, to generalize its usage for clinical purposes.
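
    The four operations named above (optimal thresholding, binary morphology, labeling and geometrical analysis) can be sketched generically with scikit-image on a single 2-D slice; this is an illustrative stand-in, not the OSBET implementation, and the synthetic test image, closing radius and largest-component heuristic are assumptions.

        import numpy as np
        from skimage import filters, morphology, measure

        def simple_brain_mask(slice_2d, closing_radius=3):
            """Generic skull-stripping sketch: Otsu threshold -> binary morphology ->
            connected-component labeling -> keep the largest component."""
            thresh = filters.threshold_otsu(slice_2d)                              # "optimal thresholding"
            fg = slice_2d > thresh
            fg = morphology.binary_closing(fg, morphology.disk(closing_radius))    # "binary morphology"
            labels = measure.label(fg)                                             # "labeling"
            if labels.max() == 0:
                return fg
            props = measure.regionprops(labels)                                    # "geometrical analysis"
            largest = max(props, key=lambda p: p.area)
            return labels == largest.label

        # Toy demo on a synthetic slice: a bright ellipse ("brain") plus background noise.
        yy, xx = np.mgrid[0:128, 0:128]
        slice_2d = 0.1 * np.random.default_rng(5).random((128, 128))
        slice_2d += np.where(((yy - 64) / 40.0) ** 2 + ((xx - 64) / 30.0) ** 2 < 1, 0.8, 0.0)
        mask = simple_brain_mask(slice_2d)
        print("mask covers %.1f%% of the slice" % (100 * mask.mean()))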

  14. Accurate methodology for channel bow impact on CPR

    An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse, in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an enhanced CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set up for the analyses of Anticipated Operational Occurrences (AOOs) and accidents. In the Monte Carlo approach, a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. The enhanced CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from its introduction at KKL. (orig.)

  15. AN ACCURATE FLUX DENSITY SCALE FROM 1 TO 50 GHz

    Perley, R. A.; Butler, B. J., E-mail: RPerley@nrao.edu, E-mail: BButler@nrao.edu [National Radio Astronomy Observatory, P.O. Box O, Socorro, NM 87801 (United States)

    2013-02-15

    We develop an absolute flux density scale for centimeter-wavelength astronomy by combining accurate flux density ratios determined by the Very Large Array between the planet Mars and a set of potential calibrators with the Rudy thermophysical emission model of Mars, adjusted to the absolute scale established by the Wilkinson Microwave Anisotropy Probe. The radio sources 3C123, 3C196, 3C286, and 3C295 are found to be varying at a level of less than ~5% per century at all frequencies between 1 and 50 GHz, and hence are suitable as flux density standards. We present polynomial expressions for their spectral flux densities, valid from 1 to 50 GHz, with absolute accuracy estimated at 1%-3% depending on frequency. Of the four sources, 3C286 is the most compact and has the flattest spectral index, making it the most suitable object on which to establish the spectral flux density scale. The sources 3C48, 3C138, 3C147, NGC 7027, NGC 6542, and MWC 349 show significant variability on various timescales. Polynomial coefficients for the spectral flux density are developed for 3C48, 3C138, and 3C147 for each of the 17 observation dates, spanning 1983-2012. The planets Venus, Uranus, and Neptune are included in our observations, and we derive their brightness temperatures over the same frequency range.
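
    The polynomial expressions referred to above are conventionally written in log-log form; a generic version (the actual coefficients are given in the paper and are not reproduced here) is

        \log_{10} S\,[\mathrm{Jy}] \;=\; a_0 + a_1 \log_{10}\nu_G + a_2 \left(\log_{10}\nu_G\right)^2 + a_3 \left(\log_{10}\nu_G\right)^3, \qquad \nu_G \equiv \nu/\mathrm{GHz},

    so a calibrator's model flux density at any observing frequency in the 1-50 GHz range follows from a small set of coefficients.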

  16. Accurate calculations of bound rovibrational states for argon trimer

    Brandon, Drew; Poirier, Bill [Department of Chemistry and Biochemistry, and Department of Physics, Texas Tech University, Box 41061, Lubbock, Texas 79409-1061 (United States)

    2014-07-21

    This work presents a comprehensive quantum dynamics calculation of the bound rovibrational eigenstates of argon trimer (Ar3), using the ScalIT suite of parallel codes. The Ar3 rovibrational energy levels are computed to a very high level of accuracy (10^-3 cm^-1 or better), and up to the highest rotational and vibrational excitations for which bound states exist. For many of these rovibrational states, wavefunctions are also computed. Rare gas clusters such as Ar3 are interesting because the interatomic interactions manifest through long-range van der Waals forces, rather than through covalent chemical bonding. As a consequence, they exhibit strong Coriolis coupling between the rotational and vibrational degrees of freedom, as well as highly delocalized states, all of which renders accurate quantum dynamical calculation difficult. Moreover, with its (comparatively) deep potential well and heavy masses, Ar3 is an especially challenging rare gas trimer case. There are a great many rovibrational eigenstates to compute, and a very high density of states. Consequently, very few previous rovibrational state calculations for Ar3 may be found in the current literature—and only for the lowest-lying rotational excitations.

  17. Accurate calculations of bound rovibrational states for argon trimer

    This work presents a comprehensive quantum dynamics calculation of the bound rovibrational eigenstates of argon trimer (Ar3), using the ScalIT suite of parallel codes. The Ar3 rovibrational energy levels are computed to a very high level of accuracy (10^-3 cm^-1 or better), and up to the highest rotational and vibrational excitations for which bound states exist. For many of these rovibrational states, wavefunctions are also computed. Rare gas clusters such as Ar3 are interesting because the interatomic interactions manifest through long-range van der Waals forces, rather than through covalent chemical bonding. As a consequence, they exhibit strong Coriolis coupling between the rotational and vibrational degrees of freedom, as well as highly delocalized states, all of which renders accurate quantum dynamical calculation difficult. Moreover, with its (comparatively) deep potential well and heavy masses, Ar3 is an especially challenging rare gas trimer case. There are a great many rovibrational eigenstates to compute, and a very high density of states. Consequently, very few previous rovibrational state calculations for Ar3 may be found in the current literature—and only for the lowest-lying rotational excitations.

  18. Accurate measurement of oxygen consumption in children undergoing cardiac catheterization.

    Li, Jia

    2013-01-01

    Oxygen consumption (VO(2)) is an important component of the hemodynamic assessment made using the direct Fick principle in children undergoing cardiac catheterization. Accurate measurement of VO(2) is vital. Obviously, any error in the measurement of VO(2) will translate directly into an equivalent percentage under- or overestimation of blood flows and vascular resistances. It remains common practice to estimate VO(2) values from published predictive equations. Among these, the LaFarge equation is the most commonly used and gives the closest estimation with the least bias and limits of agreement. However, considerable errors are introduced by the LaFarge equation, particularly in children younger than 3 years of age. Respiratory mass spectrometry remains the "state-of-the-art" method, allowing highly sensitive, rapid and simultaneous measurement of multiple gas fractions. The AMIS 2000 quadrupole respiratory mass spectrometer system has been adapted to measure VO(2) in children under mechanical ventilation with pediatric ventilators during cardiac catheterization. The small sampling rate, fast response time and long tubes make the equipment a unique and powerful tool for bedside continuous measurement of VO(2) in cardiac catheterization for both clinical and research purposes. PMID:22488802
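
    The direct Fick principle referred to above, and the reason a VO2 error maps one-to-one onto the computed flow, can be stated in a single relation (the notation here is the usual one, not specific to this article):

        Q_s \;=\; \frac{\dot{V}\mathrm{O}_2}{C_{a\mathrm{O}_2} - C_{v\mathrm{O}_2}},

    where Q_s is systemic blood flow and C_aO2, C_vO2 are the arterial and mixed-venous oxygen contents; since VO2 enters the numerator linearly, a given percentage error in VO2 produces the same percentage error in the calculated flow and in the vascular resistances derived from it.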

  19. The Global Geodetic Infrastructure for Accurate Monitoring of Earth Systems

    Weston, Neil; Blackwell, Juliana; Wang, Yan; Willis, Zdenka

    2014-05-01

    The National Geodetic Survey (NGS) and the Integrated Ocean Observing System (IOOS), two Program Offices within the National Ocean Service, NOAA, routinely collect, analyze and disseminate observations and products from several of the 17 critical systems identified by the U.S. Group on Earth Observations. Gravity, sea level monitoring, coastal zone and ecosystem management, geo-hazards and deformation monitoring and ocean surface vector winds are the primary Earth systems that have active research and operational programs in NGS and IOOS. These Earth systems collect terrestrial data but most rely heavily on satellite-based sensors for analyzing impacts and monitoring global change. One fundamental component necessary for monitoring via satellites is having a stable, global geodetic infrastructure where an accurate reference frame is essential for consistent data collection and geo-referencing. This contribution will focus primarily on system monitoring, coastal zone management and global reference frames and how the scientific contributions from NGS and IOOS continue to advance our understanding of the Earth and the Global Geodetic Observing System.

  20. Symphony: A Framework for Accurate and Holistic WSN Simulation

    Laurynas Riliskis

    2015-02-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles.
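
    The clock emulator with different skew models mentioned above can be illustrated by the simplest such model: a node-local clock that runs with a constant parts-per-million skew, a fixed offset and Gaussian read jitter. The parameter values below are illustrative assumptions, not Symphony's defaults.

        import numpy as np

        def local_clock(t_true, skew_ppm=40.0, offset_s=0.002, jitter_s=5e-6, seed=6):
            """Toy sensor-node clock: linear skew (ppm), constant offset, Gaussian read jitter."""
            rng = np.random.default_rng(seed)
            return (1.0 + skew_ppm * 1e-6) * t_true + offset_s + rng.normal(0, jitter_s, t_true.shape)

        t = np.linspace(0, 3600, 3601)                 # one hour of "true" time, 1 s steps
        t_node = local_clock(t)
        print("drift after one hour: %.1f ms" % (1e3 * (t_node[-1] - t[-1] - 0.002)))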

  1. Symphony: a framework for accurate and holistic WSN simulation.

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  2. An accurately fast algorithm of calculating reflection/transmission coefficients

    CASTAGNA; J; P

    2008-01-01

    For the boundary between transversely isotropic media with a vertical axis of symmetry (VTI media), the interface between a liquid and a VTI medium, and the free surface of an elastic half-space of a VTI medium, an accurate and fast algorithm was presented for calculating reflection/transmission (R/T) coefficients. In particular, the case of post-critical-angle incidence was considered. Although we only performed the numerical calculation for models of VTI media, the calculated results can be extended to models of transversely isotropic media with a horizontal axis of rotation symmetry (HTI media). Compared to previous work, this algorithm can be used not only for the calculation of R/T coefficients at the boundary between ellipsoidally anisotropic media, but also for that between generally anisotropic media, and it is faster and more accurate. According to the anisotropic parameters of some rocks given in the published literature, we calculated R/T coefficients using this algorithm and analyzed the effect of rock anisotropy on the R/T coefficients. We used Snell’s law and the energy balance principle to verify the calculated results.

  3. Accurate transition rates for intercombination lines of singly ionized nitrogen

    The transition energies and rates for the 2s²2p² ³P₁,₂ - 2s2p³ ⁵S°₂ and 2s²2p3s - 2s²2p3p intercombination transitions have been calculated using term-dependent nonorthogonal orbitals in the multiconfiguration Hartree-Fock approach. Several sets of spectroscopic and correlation nonorthogonal functions have been chosen to describe adequately the term dependence of the wave functions and various correlation corrections. Special attention has been focused on the accurate representation of strong interactions between the 2s2p³ ¹,³P°₁ and 2s²2p3s ¹,³P°₁ levels. The relativistic corrections are included through the one-body mass correction, Darwin, and spin-orbit operators and the two-body spin-other-orbit and spin-spin operators in the Breit-Pauli Hamiltonian. The importance of core-valence correlation effects has been examined. The accuracy of the present transition rates is evaluated by the agreement between the length and velocity formulations combined with the agreement between the calculated and measured transition energies. The present results for transition probabilities, branching fractions, and lifetimes have been compared with previous calculations and experiments.

  4. An Accurate Flux Density Scale from 1 to 50 GHz

    Perley, Rick A

    2012-01-01

    We develop an absolute flux density scale for cm-wavelength astronomy by combining accurate flux density ratios determined by the VLA between the planet Mars and a set of potential calibrators with the Rudy thermophysical emission model of Mars, adjusted to the absolute scale established by WMAP. The radio sources 3C123, 3C196, 3C286 and 3C295 are found to be varying at a level of less than ~5% per century at all frequencies between 1 and 50 GHz, and hence are suitable as flux density standards. We present polynomial expressions for their spectral flux densities, valid from 1 to 50 GHz, with absolute accuracy estimated at 1-3% depending on frequency. Of the four sources, 3C286 is the most compact and has the flattest spectral index, making it the most suitable object on which to establish the spectral flux density scale. The sources 3C48, 3C138, 3C147, NGC7027, NGC6542, and MWC349 show significant variability on various timescales. Polynomial coefficients for the spectral flux density are developed for 3C48, ...

  5. Accurate Detection of Rifampicin-Resistant Mycobacterium Tuberculosis Strains.

    Song, Keum-Soo; Nimse, Satish Balasaheb; Kim, Hee Jin; Yang, Jeongseong; Kim, Taisun

    2016-01-01

    In 2013 alone, the death rate among the 9.0 million people infected with Mycobacterium tuberculosis (TB) worldwide was around 14%, which is unacceptably high. Empiric treatment of patients infected with a TB or multidrug-resistant Mycobacterium tuberculosis (MDR-TB) strain can also result in the spread of MDR-TB. Diagnostic tools that are rapid, reliable, and have simple experimental protocols can significantly help in decreasing the prevalence of the MDR-TB strain. We report the evaluation of the 9G technology based 9G DNAChips that allow accurate detection and discrimination of TB and MDR-TB-RIF. One hundred and thirteen known cultured samples were used to evaluate the ability of the 9G DNAChip to detect and discriminate TB and MDR-TB-RIF strains. Hybridization of immobilized probes with the PCR products of TB and MDR-TB-RIF strains allows their detection and discrimination. The accuracy of the 9G DNAChip was determined by comparing its results with sequencing analysis and drug susceptibility testing. Sequencing analysis showed 100% agreement with the results of the 9G DNAChip. The 9G DNAChip showed very high sensitivity (95.4%) and specificity (100%). PMID:26999135
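
    The reported sensitivity and specificity follow from the standard confusion-matrix definitions; the counts in the short sketch below are made-up illustrative numbers, not the study's data.

        # Hypothetical confusion matrix for a binary resistance test (illustrative numbers only).
        tp, fn = 20, 1      # resistant samples called resistant / missed
        tn, fp = 30, 0      # susceptible samples called susceptible / falsely flagged

        sensitivity = tp / (tp + fn)          # fraction of true positives detected
        specificity = tn / (tn + fp)          # fraction of true negatives correctly cleared
        print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")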

  6. Intracellular recording of action potentials by nanopillar electroporation

    Xie, Chong; Lin, Ziliang; Hanson, Lindsey; Cui, Yi; Cui, Bianxiao

    2012-03-01

    Action potentials have a central role in the nervous system and in many cellular processes, notably those involving ion channels. The accurate measurement of action potentials requires efficient coupling between the cell membrane and the measuring electrodes. Intracellular recording methods such as patch clamping involve measuring the voltage or current across the cell membrane by accessing the cell interior with an electrode, allowing both the amplitude and shape of the action potentials to be recorded faithfully with high signal-to-noise ratios. However, the invasive nature of intracellular methods usually limits the recording time to a few hours, and their complexity makes it difficult to simultaneously record more than a few cells. Extracellular recording methods, such as multielectrode arrays and multitransistor arrays, are non-invasive and allow long-term and multiplexed measurements. However, extracellular recording sacrifices the one-to-one correspondence between the cells and electrodes, and also suffers from significantly reduced signal strength and quality. Extracellular techniques are not, therefore, able to record action potentials with the accuracy needed to explore the properties of ion channels. As a result, the pharmacological screening of ion-channel drugs is usually performed by low-throughput intracellular recording methods. The use of nanowire transistors, nanotube-coupled transistors and micro gold-spine and related electrodes can significantly improve the signal strength of recorded action potentials. Here, we show that vertical nanopillar electrodes can record both the extracellular and intracellular action potentials of cultured cardiomyocytes over a long period of time with excellent signal strength and quality. Moreover, it is possible to repeatedly switch between extracellular and intracellular recording by nanoscale electroporation and resealing processes. Furthermore, vertical nanopillar electrodes can detect subtle changes in action potentials.

  7. Short-wavelength magnetic recording new methods and analyses

    Ruigrok, JJM

    2013-01-01

    Short-wavelength magnetic recording presents a series of practical solutions to a wide range of problems in the field of magnetic recording. It features many new and original results, all derived from fundamental principles as a result of up-to-date research. A special section is devoted to the playback process, including the calculations of head efficiency and head impedance, derived from new theorems. Features include: a simple and fast method for measuring efficiency; a simple method for the accurate separation of the read and write behaviour of magnetic heads; a new concept - the bandpass head.

  8. New pre-coded food record form validation

    Víctor Manuel Rodríguez; Ana Elbusto-Cabello; Mireia Alberdi-Albeniz; Amaia De la Presa-Donado; Francisco Gómez-Pérez de Mendiola; Maria Puy Portillo-Baquedano; Itziar Churruca-Ortega

    2014-01-01

    Introduction: For some research fields, simple and accurate food intake quantification tools are needed. The aim of the present work was to design a new self-administered and pre-coded food intake record form and assess its reliability and validity when quantifying the food intake of an adult population, in terms of food or food-group portions. Material and Methods: First of all, a new food-record form was designed, which included foods usually consumed and which sought to be easy-to-use, short, ...

  9. DVL Angular Velocity Recorder

    Liebe, Wolfgang

    1944-01-01

    In many studies, especially of nonstationary flight motion, it is necessary to determine the angular velocities at which the airplane rotates about its various axes. The three-component recorder is designed to serve this purpose. If the angular velocity for one flight attitude is known, other important quantities can be derived from its time rate of change, such as the angular acceleration by differentiation, or - by integration - the angles of position of the airplane - that is, the angles formed by the airplane axes with the axis directions present at the instant of the beginning of the motion that is to be investigated.

  10. Magnetoencephalography recording and analysis

    Jayabal Velmurugan

    2014-01-01

    Magnetoencephalography (MEG) non-invasively measures the magnetic field generated by the excitatory postsynaptic electrical activity of the apical dendritic pyramidal cells. Such a tiny magnetic field is measured with the help of biomagnetometer sensors coupled with a Superconducting Quantum Interference Device (SQUID) inside a magnetically shielded room (MSR). The subjects are usually screened for the presence of ferromagnetic materials, and then the head position indicator coils, electroencephalography (EEG) electrodes (if measured simultaneously), and fiducials are digitized using a 3D digitizer, which aids in movement correction and also in transferring the MEG data from the head coordinates to the device and voxel coordinates, thereby enabling more accurate co-registration and localization. MEG data pre-processing involves filtering the data for environmental and subject interferences, artefact identification, and rejection. Magnetic resonance imaging (MRI) is processed for correction and for identifying fiducials. After choosing and computing the appropriate head model (spherical or realistic; boundary/finite element model), the interictal/ictal epileptiform discharges are selected and modeled by an appropriate source modeling technique (clinically, the most commonly used is the single equivalent current dipole (ECD) model). The equivalent current dipole (ECD) source localization of the modeled interictal epileptiform discharge (IED) is considered physiologically valid or acceptable based on waveform morphology, isofield pattern, and dipole parameters (localization, dipole moment, confidence volume, goodness of fit). Thus, MEG source localization can aid clinicians in sublobar localization, lateralization, and grid placement by delineating the irritative/seizure onset zone. It also accurately localizes eloquent cortex such as the visual and language areas. MEG also aids in diagnosing and delineating multiple novel findings in other neuropsychiatric disorders.

  11. Radio Astronomers Set New Standard for Accurate Cosmic Distance Measurement

    1999-06-01

    A team of radio astronomers has used the National Science Foundation's Very Long Baseline Array (VLBA) to make the most accurate measurement ever made of the distance to a faraway galaxy. Their direct measurement calls into question the precision of distance determinations made by other techniques, including those announced last week by a team using the Hubble Space Telescope. The radio astronomers measured a distance of 23.5 million light-years to a galaxy called NGC 4258 in Ursa Major. "Ours is a direct measurement, using geometry, and is independent of all other methods of determining cosmic distances," said Jim Herrnstein, of the National Radio Astronomy Observatory (NRAO) in Socorro, NM. The team says their measurement is accurate to within less than a million light-years, or four percent. The galaxy is also known as Messier 106 and is visible with amateur telescopes. Herrnstein, along with James Moran and Lincoln Greenhill of the Harvard- Smithsonian Center for Astrophysics; Phillip Diamond, of the Merlin radio telescope facility at Jodrell Bank and the University of Manchester in England; Makato Inoue and Naomasa Nakai of Japan's Nobeyama Radio Observatory; Mikato Miyoshi of Japan's National Astronomical Observatory; Christian Henkel of Germany's Max Planck Institute for Radio Astronomy; and Adam Riess of the University of California at Berkeley, announced their findings at the American Astronomical Society's meeting in Chicago. "This is an incredible achievement to measure the distance to another galaxy with this precision," said Miller Goss, NRAO's Director of VLA/VLBA Operations. "This is the first time such a great distance has been measured this accurately. It took painstaking work on the part of the observing team, and it took a radio telescope the size of the Earth -- the VLBA -- to make it possible," Goss said. "Astronomers have sought to determine the Hubble Constant, the rate of expansion of the universe, for decades. This will in turn lead to an

  12. SNPdetector: A Software Tool for Sensitive and Accurate SNP Detection.

    2005-10-01

    Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebrafish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is available publicly (http://lpg.nci.nih.gov).

  13. TOWARDS MORE ACCURATE CLUSTERING METHOD BY USING DYNAMIC TIME WARPING

    Khadoudja Ghanem

    2013-03-01

    An intrinsic problem of classifiers based on machine learning (ML) methods is that their learning time grows as the size and complexity of the training dataset increases. For this reason, it is important to have efficient computational methods and algorithms that can be applied on large datasets, such that it is still possible to complete the machine learning tasks in reasonable time. In this context, we present in this paper a more accurate simple process to speed up ML methods. An unsupervised clustering algorithm is combined with the Expectation-Maximization (EM) algorithm to develop an efficient Hidden Markov Model (HMM) training. The idea of the proposed process consists of two steps. In the first step, training instances with similar inputs are clustered and a weight factor which represents the frequency of these instances is assigned to each representative cluster. The Dynamic Time Warping technique is used as a dissimilarity function to cluster similar examples. In the second step, all formulas in the classical HMM training algorithm (EM) associated with the number of training instances are modified to include the weight factor in the appropriate terms. This process significantly accelerates HMM training while maintaining the same initial, transition and emission probability matrices as those obtained with the classical HMM training algorithm. Accordingly, the classification accuracy is preserved. Depending on the size of the training set, speedups of up to 2200 times are possible when the size is about 100,000 instances. The proposed approach is not limited to training HMMs, but can be employed for a large variety of ML methods.
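
    The Dynamic Time Warping dissimilarity used for the clustering step can be written as a short dynamic program; the sketch below is the textbook O(nm) formulation with no window constraint and synthetic sequences, and it omits the cluster-weighting and HMM-training machinery of the paper.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping distance between 1-D sequences."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        # Two similar sequences (one a stretched version of the other) and one dissimilar sequence.
        x = np.sin(np.linspace(0, 2 * np.pi, 40))
        y = np.sin(np.linspace(0, 2 * np.pi, 60))       # same shape, different length
        z = np.cos(np.linspace(0, 6 * np.pi, 50))
        print("DTW(x, y) =", round(dtw_distance(x, y), 3))   # small: good cluster candidates
        print("DTW(x, z) =", round(dtw_distance(x, z), 3))   # large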

  14. Towards More Accurate Clustering Method by Using Dynamic Time Warping

    Khadoudja Ghanem

    2013-04-01

    Full Text Available An intrinsic problem of classifiers based on machine learning (ML) methods is that their learning time grows as the size and complexity of the training dataset increase. For this reason, it is important to have efficient computational methods and algorithms that can be applied to large datasets, such that it is still possible to complete the machine learning tasks in reasonable time. In this context, we present in this paper a simple and more accurate process to speed up ML methods. An unsupervised clustering algorithm is combined with the Expectation-Maximization (EM) algorithm to develop an efficient Hidden Markov Model (HMM) training. The idea of the proposed process consists of two steps. In the first step, training instances with similar inputs are clustered, and a weight factor representing the frequency of these instances is assigned to each representative cluster. The Dynamic Time Warping technique is used as a dissimilarity function to cluster similar examples. In the second step, all formulas in the classical HMM training algorithm (EM) associated with the number of training instances are modified to include the weight factor in the appropriate terms. This process significantly accelerates HMM training while maintaining the same initial, transition and emission probability matrices as those obtained with the classical HMM training algorithm. Accordingly, the classification accuracy is preserved. Depending on the size of the training set, speedups of up to 2,200 times are possible when the size is about 100,000 instances. The proposed approach is not limited to training HMMs; it can be employed for a large variety of ML methods.

  15. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  16. CT-Analyst: fast and accurate CBR emergency assessment

    Boris, Jay; Fulton, Jack E., Jr.; Obenschain, Keith; Patnaik, Gopal; Young, Theodore, Jr.

    2004-08-01

    An urban-oriented emergency assessment system for airborne Chemical, Biological, and Radiological (CBR) threats, called CT-Analyst and based on new principles, gives greater accuracy and much greater speed than possible with current alternatives. This paper explains how this has been done. The increased accuracy derives from detailed, three-dimensional CFD computations including solar heating, buoyancy, complete building geometry specification, trees, wind fluctuations, and particle and droplet distributions (as appropriate). This paper shows how a finite number of such computations for a given area can be extended to all wind directions and speeds, and all likely sources and source locations, using a new data structure called Dispersion Nomographs. Finally, we demonstrate a portable, entirely graphical software tool called CT-Analyst that embodies this entirely new, high-resolution technology and runs effectively on small personal computers. Real-time users do not have to wait for results because accurate answers are available with near-zero latency (that is, 10-20 scenarios per second). Entire sequences of cases (e.g. a continuously changing source location or wind direction) can be computed and displayed as continuous-action movies. Since the underlying database has been precomputed, the door is wide open for important new real-time, zero-latency functions such as sensor data fusion, backtracking to an unknown source location, and even evacuation route planning. Extensions of the technology to sensor location optimization, buildings, tunnels, and integration with other advanced technologies, e.g. micrometeorology or detailed wind field measurements, are discussed briefly here.

  17. Copeptin does not accurately predict disease severity in imported malaria

    van Wolfswinkel Marlies E

    2012-01-01

    Full Text Available Abstract Background Copeptin has recently been identified to be a stable surrogate marker for the unstable hormone arginine vasopressin (AVP). Copeptin has been shown to correlate with disease severity in leptospirosis and bacterial sepsis. Hyponatraemia is common in severe imported malaria and dysregulation of AVP release has been hypothesized as an underlying pathophysiological mechanism. The aim of the present study was to evaluate the performance of copeptin as a predictor of disease severity in imported malaria. Methods Copeptin was measured in stored serum samples of 204 patients with imported malaria that were admitted to our Institute for Tropical Diseases in Rotterdam in the period 1999-2010. The occurrence of WHO-defined severe malaria was the primary end-point. The diagnostic performance of copeptin was compared to that of the previously evaluated biomarkers C-reactive protein, procalcitonin, lactate and sodium. Results Of the 204 patients (141 Plasmodium falciparum, 63 non-falciparum infection), 25 had severe malaria. The area under the ROC curve of copeptin for severe disease (0.66 [95% confidence interval 0.59-0.72]) was comparable to that of lactate, sodium and procalcitonin. C-reactive protein (0.84 [95% CI 0.79-0.89]) had a significantly better performance as a biomarker for severe malaria than the other biomarkers. Conclusions C-reactive protein but not copeptin was found to be an accurate predictor for disease severity in imported malaria. The applicability of copeptin as a marker for severe malaria in clinical practice is limited to exclusion of severe malaria.

  18. Accurate calculation of (31)P NMR chemical shifts in polyoxometalates.

    Pascual-Borràs, Magda; López, Xavier; Poblet, Josep M

    2015-04-14

    We search for the best density functional theory strategy for the determination of (31)P nuclear magnetic resonance (NMR) chemical shifts, δ((31)P), in polyoxometalates. Among the variables governing the quality of the quantum modelling, we tackle herein the influence of the functional and the basis set. The spin-orbit and solvent effects were routinely included. To do so we analysed the family of structures α-[P2W18-xMxO62](n-) with M = Mo(VI), V(V) or Nb(V); [P2W17O62(M'R)](n-) with M' = Sn(IV), Ge(IV) and Ru(II); and [PW12-xMxO40](n-) with M = Pd(IV), Nb(V) and Ti(IV). The main results suggest that, to date, the best procedure for the accurate calculation of δ((31)P) in polyoxometalates is the combination of TZP/PBE//TZ2P/OPBE (for the NMR//optimization steps). The hybrid functionals (PBE0, B3LYP) tested herein for the NMR step, besides being more CPU-consuming, do not outperform pure GGA functionals. Although previous studies on (183)W NMR suggested that very large basis sets like QZ4P were needed for geometry optimization, the present results indicate that TZ2P suffices if the functional is optimal. Moreover, scaling corrections were applied to the results, providing low mean absolute errors below 1 ppm for δ((31)P), which is a step forward in confirming or predicting chemical shifts in polyoxometalates. Finally, via a simplified molecular model, we establish how the small variations in δ((31)P) arise from energy changes in the occupied and virtual orbitals of the PO4 group. PMID:25738630
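
    The record states that scaling corrections bring the mean absolute error below 1 ppm but does not give their form; a common and simple choice is a linear fit of experimental against computed shifts, sketched below with purely illustrative numbers (the δ values are invented, not taken from the study).

        import numpy as np

        # Hypothetical computed and experimental 31P shifts (ppm); values are illustrative only.
        delta_calc = np.array([-12.4, -10.1, -13.8, -11.2, -9.5])
        delta_exp  = np.array([-12.9, -10.6, -14.1, -11.8, -10.0])

        # Fit delta_exp ~ a * delta_calc + b and report the mean absolute error after scaling.
        a, b = np.polyfit(delta_calc, delta_exp, 1)
        delta_scaled = a * delta_calc + b
        mae = np.mean(np.abs(delta_scaled - delta_exp))
        print(f"slope={a:.3f}, intercept={b:.3f} ppm, MAE={mae:.2f} ppm")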

  19. Accurate mobile malware detection and classification in the cloud.

    Wang, Xiaolei; Yang, Yuexiang; Zeng, Yingzhi

    2015-01-01

    As the dominant player in the smartphone operating system market, Android has attracted the attention of malware authors and researchers alike. The number of types of Android malware is increasing rapidly despite the considerable number of proposed malware analysis systems. In this paper, by taking advantage of the low false-positive rate of misuse detection and the ability of anomaly detection to detect zero-day malware, we propose a novel hybrid detection system based on a new open-source framework, CuckooDroid, which enables the use of Cuckoo Sandbox's features to analyze Android malware through dynamic and static analysis. Our proposed system mainly consists of two parts: an anomaly detection engine performing abnormal-app detection through dynamic analysis, and a signature detection engine performing known-malware detection and classification with a combination of static and dynamic analysis. We evaluate our system using 5,560 malware samples and 6,000 benign samples. Experiments show that our anomaly detection engine with dynamic analysis is capable of detecting zero-day malware with a low false negative rate (1.16%) and an acceptable false positive rate (1.30%); it is worth noting that our signature detection engine with hybrid analysis can accurately classify malware samples with an average positive rate of 98.94%. Considering the intensive computing resources required by the static and dynamic analysis, our proposed detection system should be deployed off-device, such as in the cloud. The app store markets and ordinary users can access our detection system for malware detection through a cloud service. PMID:26543718
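
    Neither CuckooDroid's API nor the authors' feature set is reproduced in this record; the toy sketch below only illustrates the hybrid decision logic described above, trying the low-false-positive signature engine first and falling back to an anomaly score for possible zero-day samples. All class names, signatures, behaviours and the 0.5 threshold are invented for the example.

        from dataclasses import dataclass

        @dataclass
        class Verdict:
            label: str    # "known-malware:<family>", "suspicious", or "benign"
            score: float

        class SignatureEngine:
            """Toy signature engine: matches known permission/API patterns."""
            SIGNATURES = {frozenset({"SEND_SMS", "READ_CONTACTS"}): "sms-trojan"}
            def match(self, features):
                for pattern, family in self.SIGNATURES.items():
                    if pattern <= features:
                        return family
                return None

        class AnomalyEngine:
            """Toy anomaly engine: fraction of behaviours unseen in benign training apps."""
            BENIGN_BEHAVIOURS = {"INTERNET", "ACCESS_NETWORK_STATE", "READ_CONTACTS"}
            def score(self, behaviours):
                if not behaviours:
                    return 0.0
                return len(behaviours - self.BENIGN_BEHAVIOURS) / len(behaviours)

        def classify_app(features, behaviours,
                         sig=SignatureEngine(), ano=AnomalyEngine(), threshold=0.5):
            family = sig.match(features)      # misuse detection first: low false positives
            if family is not None:
                return Verdict(f"known-malware:{family}", 1.0)
            score = ano.score(behaviours)     # anomaly detection catches unknown samples
            return Verdict("suspicious" if score >= threshold else "benign", score)

        print(classify_app({"SEND_SMS", "READ_CONTACTS"}, {"SEND_SMS"}))
        print(classify_app({"INTERNET"}, {"INTERNET", "LOAD_DEX_FROM_NET"}))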

  20. Weighing Rain Gauge Recording Charts

    National Oceanic and Atmospheric Administration, Department of Commerce — Weighing rain gauge charts record the amount of precipitation that falls at a given location. The vast majority of the Weighing Rain Gauge Recording Charts...

  1. Quality assurance records system

    This Safety Guide was prepared as part of the Agency's programme, referred to as the NUSS programme, for establishing Codes of Practice and Safety Guides relating to nuclear power plants. It supplements the IAEA Code of Practice on Quality Assurance for Safety in Nuclear Power Plants (IAEA Safety Series No.50-C-QA), which requires that for each nuclear power plant a system for the generation, identification, collection, indexing, filing, storing, maintenance and disposition of quality assurance records shall be established and executed in accordance with written procedures and instructions. The purpose of this Safety Guide is to provide assistance in the establishment and operation of such a system. An orderly established and maintained records system is considered to be part of the means of providing a basis for an appropriate level of confidence that the activities which affect the quality of a nuclear power plant have been performed in accordance with the specific requirements and that the required quality has been achieved and is maintained

  2. Record prices [crude oil

    Crude oil prices climbed to new record levels on fears of a future loss of supplies from Iran as Washington stepped up its efforts to persuade Tehran to abandon its programme to produce nuclear fuel. IPE's December Brent contract set a new record for the exchange by trading at $75.80/bbl on 21st April. On the same day October WTI reached an all-time high of $77.30/bbl on Nymex. US product prices gained as refiners struggled to produce sufficient middle distillate. Alarmed by the rising retail price of gasoline, the US Senate debated a reduction in the already low US tax rate on motor spirit. The House of Representatives passed a measure to prohibit overcharging for petrol, diesel and heating oil, but Democrats rejected a Republican proposal to speed-up the process for approving new refineries. President George W Bush announced a temporary easing of new gasoline and diesel specifications (see 'Focus', March 2006) to allow more fuel to be produced. He also agreed to delay the repayment of some 2.1 mn bbl of crude oil lent to companies after last year's hurricanes from the Strategic Petroleum Reserve. California announced an inquiry into alleged overcharging for fuel by oil companies operating in the state. (author)

  3. Accurate determination of phase arrival times using autoregressive likelihood estimation

    G. Kvaerna

    1994-06-01

    Full Text Available We have investigated the potential automatic use of an onset picker based on autoregressive likelihood estimation. Both a single-component version and a three-component version of this method have been tested on data from events located in the Khibiny Massif of the Kola peninsula, recorded at the Apatity array, the Apatity three-component station and the ARCESS array. Using this method, we have been able to estimate onset times to an accuracy (standard deviation) of about 0.05 s for P phases and 0.15-0.20 s for S phases. These accuracies are as good as for analyst picks, and are considerably better than the accuracies of the current onset procedure used for processing of regional array data at NORSAR. In another application, we have developed a generic procedure to reestimate the onsets of all types of first-arriving P phases. By again applying the autoregressive likelihood technique, we have obtained automatic onset times of a quality such that 70% of the automatic picks are within 0.1 s of the best manual pick. For the onset time procedure currently used at NORSAR, the corresponding number is 28%. Clearly, automatic reestimation of first-arriving P onsets using the autoregressive likelihood technique has the potential of significantly reducing the retiming efforts of the analyst.
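
    The record does not spell out the estimator; a closely related and widely used variant is the two-segment AIC change-point picker, which treats the samples before and after a candidate onset as two stationary processes and picks the split that minimises the combined log-likelihood. The sketch below applies this simplified variance-based form to a synthetic trace; it is an illustration of the idea, not the authors' implementation.

        import numpy as np

        def aic_onset(x, eps=1e-12):
            """Pick the sample minimising the two-segment AIC
            AIC(k) = k*log(var(x[:k])) + (n-k)*log(var(x[k:])),
            i.e. the most likely change point between noise and signal."""
            n = len(x)
            aic = np.full(n, np.inf)
            for k in range(2, n - 2):
                aic[k] = (k * np.log(np.var(x[:k]) + eps)
                          + (n - k) * np.log(np.var(x[k:]) + eps))
            return int(np.argmin(aic))

        # Synthetic trace: 200 noise samples followed by a stronger arrival.
        rng = np.random.default_rng(0)
        trace = np.concatenate([0.1 * rng.standard_normal(200),
                                1.0 * rng.standard_normal(100)])
        print("estimated onset sample:", aic_onset(trace))   # close to 200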

  4. NRC comprehensive records disposition schedule

    Title 44 United States Code, ''Public Printing and Documents,'' regulations cited in the General Services Administration's (GSA) ''Federal Information Resources Management Regulations'' (FIRMR), Part 201-9, ''Creation, Maintenance, and Use of Records,'' and regulations issued by the National Archives and Records Administration (NARA) in 36 CFR Chapter XII, Subchapter B, ''Records Management,'' require each agency to prepare and issue a comprehensive records disposition schedule that contains the NARA-approved records disposition schedules for records unique to the agency and contains NARA's General Records Schedules for records common to several or all agencies. The approved records disposition schedules specify the appropriate duration of retention and the final disposition for records created or maintained by the NRC. NUREG-0910, Rev. 2, contains ''NRC's Comprehensive Records Disposition Schedule,'' and the original authorized approved citation numbers issued by NARA. Rev. 2 totally reorganizes the records schedules from a functional arrangement to an arrangement by the host office. A subject index and a conversion table have also been developed for the NRC schedules to allow staff to identify the new schedule numbers easily and to improve their ability to locate applicable schedules

  5. An accurate and efficient system model of iterative image reconstruction in high-resolution pinhole SPECT for small animal research

    Accurate modeling of the photon acquisition process in pinhole SPECT is essential for optimizing resolution. In this work, the authors develop an accurate system model in which the finite pinhole aperture and depth-dependent geometric sensitivity are explicitly included. To achieve high-resolution pinhole SPECT, the voxel size is usually set in the sub-millimeter range, so the total number of image voxels increases accordingly. It is inevitable that a system matrix modeling a variety of relevant physical factors will become extremely large and sophisticated. An efficient implementation of such an accurate system model is proposed in this research. We first use geometric symmetries to reduce redundant entries in the matrix. Because of the sparseness of the matrix, only non-zero terms are stored. A novel center-to-radius recording rule is also developed to effectively describe the relation between a voxel and its related detectors at every projection angle. The proposed system matrix is also suitable for multi-threaded computing. Finally, the accuracy and effectiveness of the proposed system model are evaluated on a workstation equipped with two quad-core Intel Xeon processors.
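
    The center-to-radius recording rule itself is not described here in enough detail to reproduce; the sketch below only shows the generic pattern the record relies on: store just the non-zero system-matrix entries in a sparse structure and reuse them in each pass of an iterative reconstruction, with a standard MLEM update as the example. The tiny 4x6 matrix and the test image are invented for illustration.

        import numpy as np
        from scipy.sparse import csr_matrix

        # Tiny illustrative system matrix: 4 detector bins x 6 image voxels,
        # with only the non-zero entries stored (rows, cols, values).
        rows = np.array([0, 0, 1, 1, 2, 2, 3, 3])
        cols = np.array([0, 1, 1, 2, 3, 4, 4, 5])
        vals = np.array([0.8, 0.2, 0.5, 0.5, 0.7, 0.3, 0.4, 0.6])
        A = csr_matrix((vals, (rows, cols)), shape=(4, 6))   # sparse storage only

        def mlem(A, projections, n_iter=20):
            """Standard MLEM update x <- x * A^T(y / Ax) / A^T 1 using sparse mat-vecs."""
            x = np.ones(A.shape[1])
            sens = np.asarray(A.sum(axis=0)).ravel() + 1e-12     # A^T 1
            for _ in range(n_iter):
                ratio = projections / (A @ x + 1e-12)
                x *= (A.T @ ratio) / sens
            return x

        true_image = np.array([1.0, 0.0, 2.0, 0.5, 1.5, 0.0])
        y = A @ true_image
        print(mlem(A, y))   # approximate reconstruction of the toy image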

  6. Enabling high grayscale resolution displays and accurate response time measurements on conventional computers.

    Li, Xiangrui; Lu, Zhong-Lin

    2012-01-01

    Display systems based on conventional computer graphics cards are capable of generating images with 8-bit gray level resolution. However, most experiments in vision research require displays with more than 12 bits of luminance resolution. Several solutions are available. Bit++ (1) and DataPixx (2) use the Digital Visual Interface (DVI) output from graphics cards and high resolution (14 or 16-bit) digital-to-analog converters to drive analog display devices. The VideoSwitcher (3) described here combines analog video signals from the red and blue channels of graphics cards with different weights using a passive resistor network (4) and an active circuit to deliver identical video signals to the three channels of color monitors. The method provides an inexpensive way to enable high-resolution monochromatic displays using conventional graphics cards and analog monitors. It can also provide trigger signals that can be used to mark stimulus onsets, making it easy to synchronize visual displays with physiological recordings or response time measurements. Although computer keyboards and mice are frequently used in measuring response times (RT), the accuracy of these measurements is quite low. The RTbox is a specialized hardware and software solution for accurate RT measurements. Connected to the host computer through a USB connection, the driver of the RTbox is compatible with all conventional operating systems. It uses a microprocessor and high-resolution clock to record the identities and timing of button events, which are buffered until the host computer retrieves them. The recorded button events are not affected by potential timing uncertainties or biases associated with data transmission and processing in the host computer. The asynchronous storage greatly simplifies the design of user programs. Several methods are available to synchronize the clocks of the RTbox and the host computer. The RTbox can also receive external triggers and be used to measure RT with respect
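
    The exact resistor weights of the VideoSwitcher are hardware-specific and are not given in this record; the sketch below only illustrates the underlying idea of splitting one high-resolution luminance value across two 8-bit channels that the analog network later sums with unequal weights. The 1/128 weight is an assumption made purely for the example.

        def encode_luminance(level, weight=1.0 / 128.0):
            """Split a high-resolution luminance level (integer 8-bit steps plus a
            fractional part in units of `weight`) into 8-bit blue (coarse) and red (fine) values."""
            blue = int(level)                       # coarse 8-bit step
            red = int(round((level - blue) / weight))
            red = max(0, min(255, red))             # clamp to the DAC range
            return blue, red

        def decode_luminance(blue, red, weight=1.0 / 128.0):
            """Model of the analog mixer: output = blue + weight * red."""
            return blue + weight * red

        b, r = encode_luminance(100.375)             # a level between two 8-bit steps
        print(b, r, decode_luminance(b, r))          # 100 48 100.375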

  7. Hospital discharge diagnostic and procedure codes for upper gastro-intestinal cancer: how accurate are they?

    Stavrou Efty

    2012-09-01

    Full Text Available Abstract Background Population-level health administrative datasets such as hospital discharge data are used increasingly to evaluate health services and outcomes of care. However, information about the accuracy of Australian discharge data in identifying cancer, associated procedures and comorbidity is limited. The Admitted Patients Data Collection (APDC) is a census of inpatient hospital discharges in the state of New South Wales (NSW). Our aim was to assess the accuracy of the APDC in identifying upper gastro-intestinal (upper GI) cancer cases, procedures for associated curative resection and comorbidities at the time of admission compared to data abstracted from medical records (the 'gold standard'). Methods We reviewed the medical records of 240 patients with an incident upper GI cancer diagnosis derived from a clinical database in one NSW area health service from July 2006 to June 2007. Extracted case record data was matched to APDC discharge data to determine sensitivity, positive predictive value (PPV) and agreement between the two data sources (κ-coefficient). Results The accuracy of the APDC diagnostic codes in identifying site-specific incident cancer ranged from 80-95% sensitivity. This was comparable to the accuracy of APDC procedure codes in identifying curative resection for upper GI cancer. PPV ranged from 42-80% for cancer diagnosis and 56-93% for curative surgery. Agreement between the data sources was >0.72 for most cancer diagnoses and curative resections. However, APDC discharge data was less accurate in reporting common comorbidities - for each condition, sensitivity ranged from 9-70%, whilst agreement ranged from κ = 0.64 for diabetes down to κ  Conclusions Identifying incident cases of upper GI cancer and curative resection from hospital administrative data is satisfactory but under-ascertained. Linkage of multiple population-health datasets is advisable to maximise case ascertainment and minimise false

  8. Variable impedance cardiography waveforms: how to evaluate the preejection period more accurately

    Ermishkin, V. V.; Kolesnikov, V. A.; Lukoshkova, E. V.; Mokh, V. P.; Sonina, R. S.; Dupik, N. V.; Boitsov, S. A.

    2012-12-01

    The impedance method has been successfully applied for the assessment of left ventricular function during functional tests. The preejection period (PEP), the interval between the Q peak in the ECG and a specific mark on the impedance cardiogram (ICG) which corresponds to aortic valve opening, is an important indicator of the contractility state and its neurogenic control. Accurate identification of ejection onset by ICG is often problematic, especially in cardiologic patients, due to peculiar waveforms. An essential obstacle is the variability of the shape of the ICG waveform during exercise and subsequent recovery. A promising solution is the introduction of an additional pulse sensor placed in a nearby region. We tested this idea in 28 healthy subjects and 6 cardiologic patients using a dual-channel impedance cardiograph for simultaneous recording from the aortic and neck regions, and an earlobe photoplethysmograph. Our findings suggest that the incidence of abnormal, complicated ICG waveforms increases with age. The combination of standard ICG with ear photoplethysmography and/or an additional impedance channel significantly improves the efficacy and accuracy of PEP estimation.
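
    Robustly locating the two fiducial points is exactly the problem the extra sensors address, and that detection step cannot be reproduced from this record alone; the sketch below assumes the ECG Q-peak sample and the ICG ejection-onset sample have already been identified and only shows the final PEP bookkeeping.

        def preejection_period(q_index, b_index, fs):
            """PEP in milliseconds from the ECG Q-peak sample and the ICG
            ejection-onset (B-point) sample, given the sampling rate fs (Hz)."""
            if b_index <= q_index:
                raise ValueError("ejection onset must follow the Q peak")
            return 1000.0 * (b_index - q_index) / fs

        # Example: Q peak at sample 1200, ejection onset at sample 1305, 1 kHz sampling.
        print(preejection_period(1200, 1305, fs=1000.0))   # 105.0 ms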

  9. Accurate acoustic and elastic beam migration without slant stack for complex topography

    Huang, Jianping; Yuan, Maolin; Liao, Wenyuan; Li, Zhenchun; Yue, Yubo

    2015-06-01

    Recent trends in seismic exploration have led to the collection of more surveys, often with multi-component recording, in onshore settings where both topography and subsurface targets are complex, leading to challenges for processing methods. Gaussian beam migration (GBM) is an alternative to single-arrival Kirchhoff migration, although there are some issues resulting in unsatisfactory GBM images. For example, static correction will give rise to the distortion of wavefields when near-surface elevation and velocity vary rapidly. Moreover, Green’s function compensated for phase changes from the beam center to receivers is inaccurate when receivers are not placed within some neighborhood of the beam center, that is, GBM is slightly inflexible for irregular acquisition system and complex topography. As a result, the differences of both the near-surface velocity and the surface slope from the beam center to the receivers and the poor spatial sampling of the land data lead to inaccuracy and aliasing of the slant stack, respectively. In order to improve the flexibility and accuracy of GBM, we propose accurate acoustic, PP and polarity-corrected PS beam migration without slant stack for complex topography. The applications of this method to one-component synthetic data from a 2D Canadian Foothills model and a Zhongyuan oilfield fault model, one-component field data and an unseparated multi-component synthetic data demonstrate that the method is effective for structural and relatively amplitude-preserved imaging, but significantly more time-consuming.

  19. Accurate diagnoses, evidence based drugs, and new devices (3 Ds) in heart failure

    Bambang B. Siswanto

    2012-02-01

    Full Text Available Heart failure has become a main problem in cardiology because of the increasing number of heart failure patients and rising rehospitalization, morbidity, and mortality rates. The main causes of the growing heart failure burden are: (1) Successful treatment of acute myocardial infarction can be life saving, but its sequelae can cause heart failure. (2) Increasing life expectancy brings with it more ageing-related heart failure. (3) The high prevalence of infection in Indonesia can cause rheumatic heart disease after beta-hemolytic streptococcal infection, as well as viral myocarditis, infective endocarditis, and tuberculous pericarditis. (4) Many risk factors for coronary heart disease are often found in heart failure patients, for example smoking, diabetes, hypercholesterolemia, hypertension, and obesity. Indonesia joined an international multicentered registry in 2006. The Acute Decompensated HEart failure REgistry is a web-based international registry to record patients with acute decompensated heart failure treated in the emergency room. It was found that heart failure patients in 5 big hospitals in Java and Bali that joined this registry are younger, sicker and late to seek treatment. The median hospital length of stay was 7 days and the in-hospital mortality rate was 6.7%. The aim of this article is to give a summary of the essential points in diagnosing and treating heart failure patients. The 3 Ds (accurate diagnoses, evidence-based drugs, and new devices) are the most important, but what to do and what not to do in dealing with heart failure is also useful for daily practice. (Med J Indones 2012;21:52-8) Keywords: devices, diagnostic, drugs, heart failure

  11. Accurate 3d Textured Models of Vessels for the Improvement of the Educational Tools of a Museum

    Soile, S.; Adam, K.; Ioannidis, C.; Georgopoulos, A.

    2013-02-01

    Besides the demonstration of their findings, modern museums organize educational programs which aim at experience and knowledge sharing combined with entertainment rather than pure learning. Toward that effort, 2D and 3D digital representations are gradually replacing the traditional recording of the findings through photos or drawings. The present paper refers to a project that aims to create 3D textured models of two lekythoi that are exhibited in the National Archaeological Museum of Athens in Greece; on the surfaces of these lekythoi, scenes of the adventures of Odysseus are depicted. The project is expected to support the production of an educational movie and other relevant interactive educational programs for the museum. The creation of accurate developments of the paintings and of accurate 3D models is the basis for the visualization of the adventures of the mythical hero. The data collection was carried out using a structured light scanner consisting of two machine vision cameras used for the determination of the geometry of the object, a high-resolution camera for the recording of the texture, and a DLP projector. The creation of the final accurate 3D textured model is a complicated and laborious procedure which includes the collection of geometric data, the creation of the surface, noise filtering, the merging of individual surfaces, the creation of a c-mesh, the creation of the UV map, the application of the texture and, finally, the general processing of the 3D textured object. For a better result, a combination of commercial and in-house software made for the automation of various steps of the procedure was used. The results derived from the above procedure were especially satisfactory in terms of accuracy and quality of the model. However, the procedure proved to be time consuming, while the use of various software packages presumes the services of a specialist.

  12. Accurate modelling of flow induced stresses in rigid colloidal aggregates

    Vanni, Marco

    2015-07-01

    A method has been developed to estimate the motion and the internal stresses induced by a fluid flow on a rigid aggregate. The approach couples Stokesian dynamics and structural mechanics in order to take into account accurately the effect of the complex geometry of the aggregates on hydrodynamic forces and the internal redistribution of stresses. The intrinsic error of the method, due to the low-order truncation of the multipole expansion of the Stokes solution, has been assessed by comparison with the analytical solution for the case of a doublet in a shear flow. In addition, it has been shown that the error becomes smaller as the number of primary particles in the aggregate increases and hence it is expected to be negligible for realistic reproductions of large aggregates. The evaluation of internal forces is performed by an adaptation of the matrix methods of structural mechanics to the geometric features of the aggregates and to the particular stress-strain relationship that occurs at intermonomer contacts. A preliminary investigation of the stress distribution in rigid aggregates and their mode of breakup has been performed by studying the response to an elongational flow of both realistic reproductions of colloidal aggregates (made of several hundred monomers) and highly simplified structures. A very different behaviour has been observed between low-density aggregates with isostatic or weakly hyperstatic structures and compact aggregates with highly hyperstatic configurations. In low-density clusters breakup is caused directly by the failure of the most stressed intermonomer contact, which is typically located in the inner region of the aggregate and hence gives rise to fragments of similar size. On the contrary, breakup of compact and highly cross-linked clusters is seldom caused by the failure of a single bond. When this happens, it proceeds through the removal of a tiny fragment from the external part of the structure. More commonly, however

  13. An accurate and simple quantum model for liquid water.

    Paesani, Francesco; Zhang, Wei; Case, David A; Cheatham, Thomas E; Voth, Gregory A

    2006-11-14

    The path-integral molecular dynamics and centroid molecular dynamics methods have been applied to investigate the behavior of liquid water at ambient conditions starting from a recently developed simple point charge/flexible (SPC/Fw) model. Several quantum structural, thermodynamic, and dynamical properties have been computed and compared to the corresponding classical values, as well as to the available experimental data. The path-integral molecular dynamics simulations show that the inclusion of quantum effects results in a less structured liquid with a reduced amount of hydrogen bonding in comparison to its classical analog. The nuclear quantization also leads to a smaller dielectric constant and a larger diffusion coefficient relative to the corresponding classical values. Collective and single molecule time correlation functions show a faster decay than their classical counterparts. Good agreement with the experimental measurements in the low-frequency region is obtained for the quantum infrared spectrum, which also shows a higher intensity and a redshift relative to its classical analog. A modification of the original parametrization of the SPC/Fw model is suggested and tested in order to construct an accurate quantum model, called q-SPC/Fw, for liquid water. The quantum results for several thermodynamic and dynamical properties computed with the new model are shown to be in a significantly better agreement with the experimental data. Finally, a force-matching approach was applied to the q-SPC/Fw model to derive an effective quantum force field for liquid water in which the effects due to the nuclear quantization are explicitly distinguished from those due to the underlying molecular interactions. Thermodynamic and dynamical properties computed using standard classical simulations with this effective quantum potential are found in excellent agreement with those obtained from significantly more computationally demanding full centroid molecular dynamics

  14. Fast and accurate line scanner based on white light interferometry

    Lambelet, Patrick; Moosburger, Rudolf

    2013-04-01

    White-light interferometry is a highly accurate technology for 3D measurements. The principle is widely utilized in surface metrology instruments but rarely adopted for in-line inspection systems. The main challenges for rolling out inspection systems based on white-light interferometry to the production floor are its sensitivity to environmental vibrations and relatively long measurement times: a large quantity of data needs to be acquired and processed in order to obtain a single topographic measurement. Heliotis developed a smart-pixel CMOS camera (lock-in camera) which is specially suited for white-light interferometry. The demodulation of the interference signal is treated at the level of the pixel, which typically reduces the acquisition data by one order of magnitude. Along with the high bandwidth of the dedicated lock-in camera, vertical scan speeds of more than 40 mm/s are achievable. The high scan speed allows for the realization of inspection systems that are rugged against external vibrations as present on the production floor. For many industrial applications, such as the inspection of wafer bumps, surfaces of mechanical parts and solar panels, large areas need to be measured. In this case either the instrument or the sample is displaced laterally and several measurements are stitched together. The cycle time of such a system is mostly limited by the stepping time for multiple lateral displacements. A line scanner based on white light interferometry would eliminate most of the stepping time while maintaining robustness and accuracy. A. Olszak proposed a simple geometry to realize such a lateral scanning interferometer. We demonstrate that such inclined interferometers can benefit significantly from the fast in-pixel demodulation capabilities of the lock-in camera. One drawback of an inclined observation perspective is that its application is limited to objects with scattering surfaces. We therefore propose an alternate geometry where the incident light is

  15. Fast and accurate predictions of covalent bonds in chemical space.

    Chang, K Y Samuel; Fias, Stijn; Ramakrishnan, Raghunathan; von Lilienfeld, O Anatole

    2016-05-01

    We assess the predictive accuracy of perturbation theory based estimates of changes in covalent bonding due to linear alchemical interpolations among molecules. We have investigated σ bonding to hydrogen, as well as σ and π bonding between main-group elements, occurring in small sets of iso-valence-electronic molecules with elements drawn from second to fourth rows in the p-block of the periodic table. Numerical evidence suggests that first order Taylor expansions of covalent bonding potentials can achieve high accuracy if (i) the alchemical interpolation is vertical (fixed geometry), (ii) it involves elements from the third and fourth rows of the periodic table, and (iii) an optimal reference geometry is used. This leads to near linear changes in the bonding potential, resulting in analytical predictions with chemical accuracy (∼1 kcal/mol). Second order estimates deteriorate the prediction. If initial and final molecules differ not only in composition but also in geometry, all estimates become substantially worse, with second order being slightly more accurate than first order. The independent particle approximation based second order perturbation theory performs poorly when compared to the coupled perturbed or finite difference approach. Taylor series expansions up to fourth order of the potential energy curve of highly symmetric systems indicate a finite radius of convergence, as illustrated for the alchemical stretching of H2 (+). Results are presented for (i) covalent bonds to hydrogen in 12 molecules with 8 valence electrons (CH4, NH3, H2O, HF, SiH4, PH3, H2S, HCl, GeH4, AsH3, H2Se, HBr); (ii) main-group single bonds in 9 molecules with 14 valence electrons (CH3F, CH3Cl, CH3Br, SiH3F, SiH3Cl, SiH3Br, GeH3F, GeH3Cl, GeH3Br); (iii) main-group double bonds in 9 molecules with 12 valence electrons (CH2O, CH2S, CH2Se, SiH2O, SiH2S, SiH2Se, GeH2O, GeH2S, GeH2Se); (iv) main-group triple bonds in 9 molecules with 10 valence electrons (HCN, HCP, HCAs, HSiN, HSi

  16. Fast and accurate predictions of covalent bonds in chemical space

    Chang, K. Y. Samuel; Fias, Stijn; Ramakrishnan, Raghunathan; von Lilienfeld, O. Anatole

    2016-05-01

    We assess the predictive accuracy of perturbation theory based estimates of changes in covalent bonding due to linear alchemical interpolations among molecules. We have investigated σ bonding to hydrogen, as well as σ and π bonding between main-group elements, occurring in small sets of iso-valence-electronic molecules with elements drawn from second to fourth rows in the p-block of the periodic table. Numerical evidence suggests that first order Taylor expansions of covalent bonding potentials can achieve high accuracy if (i) the alchemical interpolation is vertical (fixed geometry), (ii) it involves elements from the third and fourth rows of the periodic table, and (iii) an optimal reference geometry is used. This leads to near linear changes in the bonding potential, resulting in analytical predictions with chemical accuracy (˜1 kcal/mol). Second order estimates deteriorate the prediction. If initial and final molecules differ not only in composition but also in geometry, all estimates become substantially worse, with second order being slightly more accurate than first order. The independent particle approximation based second order perturbation theory performs poorly when compared to the coupled perturbed or finite difference approach. Taylor series expansions up to fourth order of the potential energy curve of highly symmetric systems indicate a finite radius of convergence, as illustrated for the alchemical stretching of H 2+ . Results are presented for (i) covalent bonds to hydrogen in 12 molecules with 8 valence electrons (CH4, NH3, H2O, HF, SiH4, PH3, H2S, HCl, GeH4, AsH3, H2Se, HBr); (ii) main-group single bonds in 9 molecules with 14 valence electrons (CH3F, CH3Cl, CH3Br, SiH3F, SiH3Cl, SiH3Br, GeH3F, GeH3Cl, GeH3Br); (iii) main-group double bonds in 9 molecules with 12 valence electrons (CH2O, CH2S, CH2Se, SiH2O, SiH2S, SiH2Se, GeH2O, GeH2S, GeH2Se); (iv) main-group triple bonds in 9 molecules with 10 valence electrons (HCN, HCP, HCAs, HSiN, HSi

  17. New simple method for fast and accurate measurement of volumes

    A new simple method is presented, which allows us to measure in just a few minutes but with reasonable accuracy (less than 1%) the volume confined inside a generic enclosure, regardless of the complexity of its shape. The technique proposed also allows us to measure the volume of any portion of a complex manifold, including, for instance, pipes and pipe fittings, valves, gauge heads, and so on, without disassembling the manifold at all. For this purpose an airtight variable volume is used, whose volume adjustment can be precisely measured; it has an overall capacity larger than that of the unknown volume. Such a variable volume is initially filled with a suitable test gas (for instance, air) at a known pressure, as carefully measured by means of a high precision capacitive gauge. By opening a valve, the test gas is allowed to expand into the previously evacuated unknown volume. A feedback control loop reacts to the resulting finite pressure drop, contracting the variable volume until the pressure exactly recovers its initial value. The overall reduction of the variable volume achieved at the end of this process gives a direct measurement of the unknown volume, and definitively gets rid of the problem of dead spaces. The method proposed does not require the test gas to be rigorously held at a constant temperature, resulting in a huge simplification as compared to the complex arrangements commonly used in metrology (gas expansion method), which can give extremely accurate measurements but requires rather expensive equipment and is time consuming, and is therefore impractical in most applications. A simple theoretical analysis of the thermodynamic cycle and the results of experimental tests are described, which demonstrate that, in spite of its simplicity, the method provides a measurement accuracy within 0.5%. The system requires just a few minutes to complete a single measurement, and is ready immediately at the end of the process. The
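
    For the idealised case of an isothermal expansion (the record stresses that strict temperature control is not actually needed in practice), the bookkeeping behind the method reduces to a one-line ideal-gas argument: restoring the starting pressure with the valve open means the total gas volume is back to its initial value, so the unknown volume equals the contraction of the variable volume. The numbers below are invented purely to illustrate the check.

        # Idealised, isothermal check of the variable-volume method (illustrative numbers).
        V_var = 2000.0      # cm^3, initial setting of the variable volume
        P0 = 101325.0       # Pa, starting pressure of the test gas
        V_unknown = 350.0   # cm^3, "unknown" enclosure used as ground truth for the check

        # Pressure right after opening the valve (isothermal expansion, constant amount of gas):
        P_expanded = P0 * V_var / (V_var + V_unknown)

        # Total volume that restores the pressure to P0 with the valve still open:
        V_total_final = P0 * V_var / P0            # = V_var
        dV = V_var + V_unknown - V_total_final     # contraction of the variable volume
        print(P_expanded, dV)                      # dV recovers the unknown 350 cm^3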

  18. Improved management of radiotherapy departments through accurate cost data

    Escalating health care expenses urge Governments towards cost containment. More accurate data on the precise costs of health care interventions are needed. We performed an aggregate cost calculation of radiation therapy departments and treatments and discussed the different cost components. The costs of a radiotherapy department were estimated, based on accreditation norms for radiotherapy departments set forth in the Belgian legislation. The major cost components of radiotherapy are the cost of buildings and facilities, equipment, medical and non-medical staff, materials and overhead. They respectively represent around 3, 30, 50, 4 and 13% of the total costs, irrespective of the department size. The average cost per patient lowers with increasing department size and optimal utilization of resources. Radiotherapy treatment costs vary in a stepwise fashion: minor variations of patient load do not affect the cost picture significantly due to a small impact of variable costs. With larger increases in patient load however, additional equipment and/or staff will become necessary, resulting in additional semi-fixed costs and an important increase in costs. A sensitivity analysis of these two major cost inputs shows that a decrease in total costs of 12-13% can be obtained by assuming a 20% less than full time availability of personnel; that due to evolving seniority levels, the annual increase in wage costs is estimated to be more than 1%; that by changing the clinical life-time of buildings and equipment with unchanged interest rate, a 5% reduction of total costs and cost per patient can be calculated. More sophisticated equipment will not have a very large impact on the cost (±4000 BEF/patient), provided that the additional equipment is adapted to the size of the department. That the recommendations we used, based on the Belgian legislation, are not outrageous is shown by replacing them by the USA Blue book recommendations. Depending on the department size, costs in

  19. Rosiglitazone RECORD study

    Home, P D; Jones, N P; Pocock, S J;

    2007-01-01

    AIMS: To compare glucose control over 18 months between rosiglitazone oral combination therapy and combination metformin and sulphonylurea in people with Type 2 diabetes. METHODS: RECORD, a multicentre, parallel-group study of cardiovascular outcomes, enrolled people with an HbA(1c) of 7.1-9.0% on ... maximum doses of metformin or sulphonylurea. If on metformin they were randomized to add-on rosiglitazone or sulphonylurea (open label) and if on sulphonylurea to rosiglitazone or metformin. HbA(1c) was managed to < or = 7.0% by dose titration. A prospectively defined analysis of glycaemic control on the ... when rosiglitazone or metformin was added to sulphonylurea [0.06 (-0.09, 0.20)%]. At 6 months, the effect on HbA(1c) was greater with add-on sulphonylurea, but was similar whether sulphonylurea was added to rosiglitazone or metformin. Differences in fasting plasma glucose were not statistically...

  20. Streamflows at record highs

    Streamflow was reported well above average in more than half the country during May, with flows at or near record levels for the month in 22 states, according to the U.S. Geological Survey (USGS), Department of the Interior.USGS hydrologists said that above average flow was reported at 98 of the 173 USGS key index gauging stations used in their monthly check on surface- and ground-water conditions. High flows were most prevalent in the Mississippi River basin states and in the east, with the exception of Maine, South Carolina, and Georgia. Below-average streamflow occurred in the Pacific northwest and in small scattered areas in Colorado, Kansas, Texas, and Minnesota.

  1. NRC comprehensive records disposition schedule

    Effective January 1, 1982, NRC will institute records retention and disposal practices in accordance with the approved Comprehensive Records Disposition Schedule (CRDS). CRDS is comprised of NRC Schedules (NRCS) 1 to 4 which apply to the agency's program or substantive records and General Records Schedules (GRS) 1 to 24 which apply to housekeeping or facilitative records. NRCS-I applies to records common to all or most NRC offices; NRCS-II applies to program records as found in the various offices of the Commission, Atomic Safety and Licensing Board Panel, and the Atomic Safety and Licensing Appeal Panel; NRCS-III applies to records accumulated by the Advisory Committee on Reactor Safeguards; and NRCS-IV applies to records accumulated in the various NRC offices under the Executive Director for Operations. The schedules are assembled functionally/organizationally to facilitate their use. Preceding the records descriptions and disposition instructions for both NRCS and GRS, there are brief statements on the organizational units which accumulate the records in each functional area, and other information regarding the schedules' applicability

  2. On the use of uavs in mining and archaeology - geo-accurate 3d reconstructions using various platforms and terrestrial views

    Tscharf, A.; Rumpler, M.; Fraundorfer, Friedrich; Mayer, G; Bischof, H.

    2015-01-01

    During the last decades photogrammetric computer vision systems have been well established in scientific and commercial applications. Especially the increasing affordability of unmanned aerial vehicles (UAVs) in conjunction with automated multi-view processing pipelines have resulted in an easy way of acquiring spatial data and creating realistic and accurate 3D models. With the use of multicopter UAVs, it is possible to record highly overlapping images from almost terrestrial camera posit...

  3. ON THE USE OF UAVS IN MINING AND ARCHAEOLOGY - GEO-ACCURATE 3D RECONSTRUCTIONS USING VARIOUS PLATFORMS AND TERRESTRIAL VIEWS

    Tscharf, A.; Rumpler, M.; F. Fraundorfer; Mayer, G; Bischof, H.

    2015-01-01

    During the last decades photogrammetric computer vision systems have been well established in scientific and commercial applications. Especially the increasing affordability of unmanned aerial vehicles (UAVs) in conjunction with automated multi-view processing pipelines have resulted in an easy way of acquiring spatial data and creating realistic and accurate 3D models. With the use of multicopter UAVs, it is possible to record highly overlapping images from almost terrestrial camera...

  4. The development of medical record services in Hong Kong public hospitals.

    Fung, V

    1994-12-01

    Medical record services in Hong Kong public hospitals have been developing at different levels. Since 1992, various improvements in medical record services have been carried out in public hospitals, e.g. professional management, record storage, organized medical records, computerization, completion of discharge summaries, and the introduction of a more precise coding system. The aim of the reform is to provide timely, accurate, organized and meaningful clinical information for end-users. Evolving from this reform, work has started on developing Patient Related Groups and Specialty Clinical Information Systems. PMID:10142476

  5. Quantifying Methane Fluxes Simply and Accurately: The Tracer Dilution Method

    Rella, Christopher; Crosson, Eric; Green, Roger; Hater, Gary; Dayton, Dave; Lafleur, Rick; Merrill, Ray; Tan, Sze; Thoma, Eben

    2010-05-01

    Methane is an important atmospheric constituent with a wide variety of sources, both natural and anthropogenic, including wetlands and other water bodies, permafrost, farms, landfills, and areas where significant petrochemical exploration, drilling, transport, processing, or refining occurs. Despite its importance to the carbon cycle, its significant impact as a greenhouse gas, and its ubiquity in modern life as a source of energy, its sources and sinks in marine and terrestrial ecosystems are only poorly understood. This is largely because high quality, quantitative measurements of methane fluxes in these different environments have not been available, due both to the lack of robust field-deployable instrumentation and to the fact that most significant sources of methane extend over large areas (from 10's to 1,000,000's of square meters) and are heterogeneous emitters - i.e., the methane is not emitted evenly over the area in question. Quantifying the total methane emissions from such sources becomes a tremendous challenge, compounded by the fact that atmospheric transport from emission point to detection point can be highly variable. In this presentation we describe a robust, accurate, and easy-to-deploy technique called the tracer dilution method, in which a known gas (such as acetylene, nitrous oxide, or sulfur hexafluoride) is released in the vicinity of the methane emissions. Measurements of methane and the tracer gas are then made downwind of the release point, in the so-called far field, where the area of methane emissions cannot be distinguished from a point source (i.e., the two gas plumes are well mixed). In this regime, the methane emissions are given by the ratio of the two measured concentrations, multiplied by the known tracer emission rate. The challenges associated with atmospheric variability and heterogeneous methane emissions are handled automatically by the transport and dispersion of the tracer. We present detailed methane flux
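
    A minimal numerical illustration of the far-field relation stated above - the emission rate equals the ratio of the above-background enhancements times the known tracer release rate - assuming both gases are measured as mole fractions and the tracer is N2O; every concentration value below is invented for the example.

        # Tracer dilution estimate of a methane emission rate (illustrative numbers).
        # In the well-mixed far field:  Q_CH4 = Q_tracer * (dC_CH4 / dC_tracer),
        # where dC are the above-background enhancements in mole fraction.

        M_CH4, M_N2O = 16.04, 44.01                # g/mol, tracer assumed to be N2O

        q_tracer_g_per_s = 0.50                    # known tracer release rate (g/s)
        ch4_downwind, ch4_background = 2.35, 1.95          # ppm
        tracer_downwind, tracer_background = 0.45, 0.33    # ppm

        d_ch4 = ch4_downwind - ch4_background
        d_tracer = tracer_downwind - tracer_background

        # Molar ratio of the plumes times the tracer molar release rate gives the CH4
        # molar rate; convert back to grams per second with the molar masses.
        q_ch4_g_per_s = q_tracer_g_per_s * (d_ch4 / d_tracer) * (M_CH4 / M_N2O)
        print(round(q_ch4_g_per_s, 3))             # ~0.607 g/s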

  6. Record occurrence and record values in daily and monthly temperatures

    Wergen, Gregor; Krug, Joachim

    2012-01-01

    We analyze the occurrence and the values of record-breaking temperatures in daily and monthly temperature observations. Our aim is to better understand and quantify the statistics of temperature records in the context of global warming. Similar to earlier work we employ a simple mathematical model of independent and identically distributed random variables with a linearly growing expectation value. This model proved to be useful in predicting the increase (decrease) in upper (lower) temperature records in a warming climate. Using both station and re-analysis data from Europe and the United States we further investigate the statistics of temperature records and the validity of this model. The most important new contribution in this article is an analysis of the statistics of record values for our simple model and European reanalysis data. We estimate how much the mean values and the distributions of record temperatures are affected by the large scale warming trend. In this context we consider both the values o...

  7. REVIEW ON METHODS OF RECORDING VERTICAL RELATION

    Naveen Raj

    2013-03-01

    Full Text Available INTRODUCTION: The accuracy of recording vertical dimension at occlusion in edentulous patients is always a prime consideration for any dentist. Though there are many advances in the techniques and materials employed in the field of prosthodontics for recording vertical dimension at occlusion, there is still no accurate method of assessing vertical dimension of occlusion in edentulous patients available to the dentist. In assessing this component for the fabrication of a complete denture, clinical judgment by the dentist plays a major role (1). Vertical dimension is defined as “the distance between two selected anatomic and marked points (usually one on the tip of the nose and the other upon the chin), one on a fixed and one on the movable member” – GPT 8. Vertical jaw relations are those established by the amount of separation of the maxillae and mandible under specified conditions, classified as vertical dimension of rest and vertical dimension of occlusion. The physiologic rest position of the mandible is not determined by the teeth; it is established by muscles and gravity. The position of the head is important; it must be held in an upright position by the patient and not supported by a headrest. Vertical dimension of occlusion is established by the natural teeth when they are present and in occlusion. In the denture wearer, it is established by the vertical height of the two dentures when the teeth are in contact

  8. Cloud-based Electronic Health Records for Real-time, Region-specific Influenza Surveillance.

    Santillana, M; Nguyen, A T; Louie, T; Zink, A; Gray, J; Sung, I; Brownstein, J S

    2016-01-01

    Accurate real-time monitoring systems of influenza outbreaks help public health officials make informed decisions that may help save lives. We show that information extracted from cloud-based electronic health records databases, in combination with machine learning techniques and historical epidemiological information, have the potential to accurately and reliably provide near real-time regional estimates of flu outbreaks in the United States. PMID:27165494

  9. Cloud-based Electronic Health Records for Real-time, Region-specific Influenza Surveillance

    Santillana, M.; Nguyen, A. T.; Louie, T.; Zink, A.; Gray, J.; Sung, I.; Brownstein, J. S.

    2016-01-01

    Accurate real-time monitoring systems of influenza outbreaks help public health officials make informed decisions that may help save lives. We show that information extracted from cloud-based electronic health records databases, in combination with machine learning techniques and historical epidemiological information, have the potential to accurately and reliably provide near real-time regional estimates of flu outbreaks in the United States. PMID:27165494

  10. NRPB TLD and dose record keeping service - further progress

    Various aspects of the National Radiological Protection Board's service are described. An increasing number of UK employers are transferring from film monitors, and record keeping is now provided for both large and small groups of workers. Data entry directly from punched cards prepared by the larger employers has reduced initial costs and therefore carries a reduced registration fee for these users. Computerized dose record keeping allows automatic retrieval of cumulative dose information from any NRPB record of previous employment, thus safeguarding itinerant workers. Warning Dose Reports are issued automatically when cumulative dose totals reach or exceed 60% of a limit, or when a dose rate greater than 0.1 rem per 4 weeks is recorded. Flexibility in wearing period results in dosemeter economy and reduces laboratory work load. High recorded doses can be checked by UV stimulation of both disks to confirm the accuracy of the previous measurement. Employers are provided with a comprehensive and accurate monitoring package, fulfilling HSE requirements and exempting employers from their former responsibility to keep their own comprehensive records. (UK)

  11. Records and record-keeping for the hospital compounding pharmacist.

    McElhiney, Linda F

    2007-01-01

    The United States Pharmacopeial Convention, Inc., is recognized by federal law and by most state boards of pharmacy as the official group for setting the standards for pharmaceuticals and pharmacy practice, including compounding. The standards of United States Pharmacopeia Chapter 795 require that a pharmacy maintain records on a compounded preparation, including the formulation record, and a Material Safety Data Sheets file. The American Society of Health-System Pharmacists' guidelines require that hospital pharmacy departments maintain at least four sets of records in the compounding area: (1) compounding formulas and procedures, (2) compounding logs of all compounded preparations, including batch records and sample batch labels, (3) equipment maintenance records, and (4) a record of ingredients purchased, including certificates of analysis and Material Safety Data Sheets. Hospital compounding records may be inspected by any of several outside organizations, including state boards of pharmacy, third-party payers, the Joint Commission on Accreditation of Healthcare Organizations, the Drug Enforcement Administration, and attorneys. With the existing standards and guidelines in place and the importance of documentation unquestionable, a record of pharmacy activities should be maintained in a compounding pharmacy so that preparations can be replicated consistently, the history of each ingredient traced, equipment maintenance and calibration verified, and compounding procedures evaluated easily. PMID:23974620

  12. Time-Accurate Computational Fluid Dynamics Simulation of a Pair of Moving Solid Rocket Boosters

    Strutzenberg, Louise L.; Williams, Brandon R.

    2011-01-01

    Since the Columbia accident, the threat to the Shuttle launch vehicle from debris during the liftoff timeframe has been assessed by the Liftoff Debris Team at NASA/MSFC. In addition to engineering methods of analysis, CFD-generated flow fields during the liftoff timeframe have been used in conjunction with 3-DOF debris transport methods to predict the motion of liftoff debris. Early models made use of a quasi-steady flow field approximation with the vehicle positioned at a fixed location relative to the ground; however, a moving overset mesh capability has recently been developed for the Loci/CHEM CFD software which enables higher-fidelity simulation of the Shuttle transient plume startup and liftoff environment. The present work details the simulation of the launch pad and mobile launch platform (MLP) with truncated solid rocket boosters (SRBs) moving in a prescribed liftoff trajectory derived from Shuttle flight measurements. Using Loci/CHEM, time-accurate RANS and hybrid RANS/LES simulations were performed for the timeframe T0+0 to T0+3.5 seconds, which consists of SRB startup to a vehicle altitude of approximately 90 feet above the MLP. Analysis of the transient flowfield focuses on the evolution of the SRB plumes in the MLP plume holes and the flame trench, impingement on the flame deflector, and especially impingement on the MLP deck, resulting in upward flow, which is a transport mechanism for debris. The results show excellent qualitative agreement with the visual record from past Shuttle flights, and comparisons to pressure measurements in the flame trench and on the MLP provide confidence in these simulation capabilities.

  13. An accurate and portable eye movement detector for studying sleep in small animals.

    Sánchez-López, Álvaro; Escudero, Miguel

    2015-08-01

    Although eye movements are a highly valuable variable in attempts to precisely identify different periods of the sleep-wake cycle, their indirect measurement by electrooculography is not good enough. The present article describes an accurate and portable scleral search coil that allows the detection of tonic and phasic characteristics of eye movements in free-moving animals. Six adult Wistar rats were prepared for chronic recording of electroencephalography, electromyography and eye movements using the scleral search coil technique. We developed a miniature magnetic field generator made with two coils, consisting of 35 turns and 15 mm diameter of insulated 0.2 mm copper wire, mounted in a frame of carbon fibre. This portable scleral search coil was fixed on the head of the animal, with each magnetic coil parallel to the eye coil and at 5 mm from each eye. Eye movements detected by the portable scleral search coil were compared with those measured by a commercial scleral search coil that requires immobilizing the head of the animal. No qualitative differences were found between the two scleral search coil systems in their capabilities to detect eye movements. This innovative portable scleral search coil system is an essential tool to detect slow changes in eye position and miniature rapid eye movements during sleep. The portable scleral search coil is much more suitable for detecting eye movements than any previously available system because of its precision and simplicity, and because it does not require immobilization of the animal's head. PMID:25590417

  14. Usage Reporting on Recorded Lectures

    Gorissen, Pierre; Bruggen, Jan van; Jochems, Wim

    2012-01-01

    This study analyses the interactions of students with the recorded lectures. We report on an analysis of students' use of recorded lectures at two Universities in the Netherlands. The data logged by the lecture capture system (LCS) is used and combined with collected survey data. We describe the pro

  15. 75 FR 2821 - Personnel Records

    2010-01-19

    ... OFFICE OF PERSONNEL MANAGEMENT 5 CFR Part 293 RIN 3206-AM05 Personnel Records AGENCY: U.S. Office of Personnel Management. ACTION: Proposed rule with request for comments. SUMMARY: The U.S. Office of Personnel... Bennett, Records Manager, Office of Chief Information Officer, Office of Personnel Management, 1900...

  16. 76 FR 55880 - Recording Assignments

    2011-09-09

    ..., depending on the date they were recorded. The public may also search patent and trademark assignment... United States Patent and Trademark Office Recording Assignments ACTION: Proposed collection; comment request. SUMMARY: The United States Patent and Trademark Office (USPTO), as part of its continuing...

  17. Liaison neurologists facilitate accurate neurological diagnosis and management, resulting in substantial savings in the cost of inpatient care.

    Costelloe, L

    2012-02-01

    BACKGROUND: Despite understaffing of neurology services in Ireland, the demand for liaison neurologist input into the care of hospital inpatients is increasing. This aspect of the workload of the neurologist is often under-recognised. AIMS/METHODS: We prospectively recorded data on referral and service delivery patterns to a liaison neurology service, the neurological conditions encountered, and the impact of neurology input on patient care. RESULTS: Over a 13-month period, 669 consults were audited. Of these, 79% of patients were seen within 48 h and 86% of patients were assessed by a consultant neurologist before discharge. Management was changed in 69% of cases, and discharge from hospital expedited in 50%. If adequate resources for neurological assessment had been available, 28% could have been seen as outpatients, with projected savings of 857 bed days. CONCLUSIONS: Investment in neurology services would facilitate early accurate diagnosis, efficient patient and bed management, with substantial savings.

  18. Simplifying ART cohort monitoring: Can pharmacy stocks provide accurate estimates of patients retained on antiretroviral therapy in Malawi?

    Tweya Hannock

    2012-07-01

    Full Text Available Abstract Background Routine monitoring of patients on antiretroviral therapy (ART) is crucial for measuring program success and accurate drug forecasting. However, compiling data from patient registers to measure retention in ART is labour-intensive. To address this challenge, we conducted a pilot study in Malawi to assess whether patient ART retention could be determined using pharmacy records as compared to estimates of retention based on standardized paper- or electronic-based cohort reports. Methods Twelve ART facilities were included in the study: six used paper-based registers and six used electronic data systems. One ART facility implemented an electronic data system in quarter three and was included as a paper-based system facility in quarter two only. Routine patient retention cohort reports, paper or electronic, were collected from facilities for both quarter two [April–June] and quarter three [July–September], 2010. Pharmacy stock data were also collected from the 12 ART facilities over the same period. Numbers of ART continuation bottles recorded on pharmacy stock cards at the beginning and end of each quarter were documented. These pharmacy data were used to calculate the total bottles dispensed to patients in each quarter with the intent to estimate the number of patients retained on ART. Information on the time required to determine ART retention was gathered through interviews with clinicians tasked with compiling the data. Results Among ART clinics with paper-based systems, three of six facilities in quarter two and four of five facilities in quarter three had similar numbers of patients retained on ART comparing cohort reports to pharmacy stock records. In ART clinics with electronic systems, five of six facilities in quarter two and five of seven facilities in quarter three had similar numbers of patients retained on ART when comparing retention numbers from electronically generated cohort reports to pharmacy stock records. Among
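
    The stock-card arithmetic described in the Methods section reduces to a simple balance: bottles dispensed in a quarter follow from opening stock plus receipts minus closing stock, and are then converted into an approximate head count. The conversion factor of one continuation bottle per patient per month used below is an assumption for illustration, not a figure from the study.

        # Estimate patients retained on ART from pharmacy stock-card entries.
        def bottles_dispensed(opening_stock, received, closing_stock):
            return opening_stock + received - closing_stock

        def estimated_patients_retained(opening_stock, received, closing_stock,
                                        bottles_per_patient_per_quarter=3):
            dispensed = bottles_dispensed(opening_stock, received, closing_stock)
            return dispensed // bottles_per_patient_per_quarter

        # Made-up stock card for one facility and one quarter.
        print(estimated_patients_retained(opening_stock=480, received=1500, closing_stock=330))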

  19. Ordinary kriging as a tool to estimate historical daily streamflow records

    Farmer, W. H.

    2016-01-01

    Efficient and responsible management of water resources relies on accurate streamflow records. However, many watersheds are ungaged, limiting the ability to assess and understand local hydrology. Several tools have been developed to alleviate this data scarcity, but few provide continuous daily streamflow records at individual streamgages within an entire region. Building on the history of hydrologic mapping, ordinary kriging was extended to predict daily streamflow time series on...
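
    Ordinary kriging, the interpolator named in this record, can be written down compactly for a single target site. The sketch below uses an assumed exponential semivariogram and plain 2-D gauge coordinates purely for illustration; the study's actual variogram, distance measure and extension to daily time series are not reproduced here.

        import numpy as np

        def gamma(h, sill=1.0, corr_range=50.0):
            # Assumed exponential semivariogram; a real application would fit this to data.
            return sill * (1.0 - np.exp(-h / corr_range))

        def ordinary_kriging(coords, values, target):
            """Ordinary-kriging prediction at `target` from observations at `coords`."""
            n = len(values)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            A = np.zeros((n + 1, n + 1))
            A[:n, :n] = gamma(d)              # variogram between observation sites
            A[:n, n] = 1.0                    # Lagrange-multiplier column ...
            A[n, :n] = 1.0                    # ... and row (weights sum to one)
            b = np.append(gamma(np.linalg.norm(coords - target, axis=1)), 1.0)
            weights = np.linalg.solve(A, b)[:n]
            return weights @ values

        gauges = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0]])   # hypothetical gauge locations
        flows = np.array([12.0, 20.0, 16.0])                         # same-day discharge at those gauges
        print(ordinary_kriging(gauges, flows, np.array([15.0, 15.0])))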

  20. A Method for the Automated, Reliable Retrieval of Publication-Citation Records

    Derek Ruths; Faiyaz Al Zamal

    2010-01-01

    BACKGROUND: Publication records and citation indices often are used to evaluate academic performance. For this reason, obtaining or computing them accurately is important. This can be difficult, largely due to a lack of complete knowledge of an individual's publication list and/or lack of time available to manually obtain or construct the publication-citation record. While online publication search engines have somewhat addressed these problems, using raw search results can yield inaccurate e...

  1. Revision of the records of shark and ray species from the Maltese Islands (Chordata : Chondrichthyes)

    Schembri, Titian; Schembri, Patrick J.; Fergusson, Ian K.

    2003-01-01

    Records of sharks and rays from Maltese waters published in the scientific literature were critically evaluated by examining and accurately identifying specimens caught by fishers, specimens seen by the authors, and those kept in museum collections. Photographs of caught specimens that were not preserved were also considered. Out of 37 species of sharks and 26 species of rays recorded from Malta, 24 sharks and 14 rays along with another two sharks, whose presence is a distinct probabili...

  2. COMPUTER VISION PHOTOGRAMMETRY FOR UNDERWATER ARCHAEOLOGICAL SITE RECORDING IN A LOW-VISIBILITY ENVIRONMENT

    Van Damme, T.

    2015-01-01

    Computer Vision Photogrammetry allows archaeologists to accurately record underwater sites in three dimensions using simple two-dimensional picture or video sequences, automatically processed in dedicated software. In this article, I share my experience in working with one such software package, namely PhotoScan, to record a Dutch shipwreck site. In order to demonstrate the method’s reliability and flexibility, the site in question is reconstructed from simple GoPro footage, captured ...

  3. The science of sound recording

    Kadis, Jay

    2012-01-01

    The Science of Sound Recording will provide you with more than just an introduction to sound and recording; it will allow you to dive right into some of the technical areas that often appear overwhelming to anyone without an electrical engineering or physics background. The Science of Sound Recording helps you build a basic foundation of scientific principles, explaining how recording really works. Packed with valuable must-know information, illustrations and examples of 'worked through' equations, this book introduces the theory behind sound recording practices in a logical and prac

  4. A comparative evaluation of dimensional stability of three types of interocclusal recording materials-an in-vitro multi-centre study

    Tejo Sampath; Kumar Anil G; Kattimani Vivekanand S; Desai Priti D; Nalla Sandeep; Chaitanya K Krishna

    2012-01-01

    Abstract Background The introduction of different interocclusal recording materials has put clinicians in a dilemma as to which material should be used in routine clinical practice for the precise recording and transfer of accurate existing occlusal records for articulation of the patient’s diagnostic or working casts in the fabrication of a good, satisfactory prosthesis. In the developing world of dentistry, different materials are introduced for interocclusal records under different brand nam...

  5. Highly accurate prediction of emotions surrounding the attacks of September 11, 2001 over 1-, 2-, and 7-year prediction intervals.

    Doré, Bruce P; Meksin, Robert; Mather, Mara; Hirst, William; Ochsner, Kevin N

    2016-06-01

    In the aftermath of a national tragedy, important decisions are predicated on judgments of the emotional significance of the tragedy in the present and future. Research in affective forecasting has largely focused on ways in which people fail to make accurate predictions about the nature and duration of feelings experienced in the aftermath of an event. Here we ask a related but understudied question: can people forecast how they will feel in the future about a tragic event that has already occurred? We found that people were strikingly accurate when predicting how they would feel about the September 11 attacks over 1-, 2-, and 7-year prediction intervals. Although people slightly under- or overestimated their future feelings at times, they nonetheless showed high accuracy in forecasting (a) the overall intensity of their future negative emotion, and (b) the relative degree of different types of negative emotion (i.e., sadness, fear, or anger). Using a path model, we found that the relationship between forecasted and actual future emotion was partially mediated by current emotion and remembered emotion. These results extend theories of affective forecasting by showing that emotional responses to an event of ongoing national significance can be predicted with high accuracy, and by identifying current and remembered feelings as independent sources of this accuracy. (PsycINFO Database Record PMID:27100309

  6. Acute social stress and cardiac electrical activity in rats

    Sgoifo, A; Stilli, D; de Boer, SF; Koolhaas, JM; Musso, E; Koolhaas, Jaap M.

    1998-01-01

    This paper summarizes the results of experiments aimed at describing electrocardiographic responses to different acute social stressors in healthy male rats. Electrocardiograms were telemetrically recorded during maternal aggression, social defeat, and psychosocial stimulation, as obtained using the

  7. Readout of delayline detectors with transient recorders

    The task of the present work consists of developing software to operate the transient recorders under the conditions of a real experiment. Another task was to find a proper method of analyzing the recorded signals. Several methods were tested with simulated and real detector signals. In the first chapter the motivation for this work and some theoretical background are presented. In the second chapter methods to analyze single pulses and trains of overlapping pulses are presented. The quality of the presented algorithms is assessed using signals created artificially in software. It is shown that the best temporal resolution - in the case of non-overlapping pulses - is obtained with a pulse fit. With this method a resolution of approximately 50 ps is achieved. In order to disentangle overlapping pulses, a minimum distance in time between the pulses of at least 5 ns to 7 ns peak to peak is necessary. The third chapter shows an application of the new recording system in which results of the new recording method are compared to those obtained with the traditional method. The next chapter presents another experiment in which an even larger number of fragments had to be detected. In order to cope with this requirement, further criteria to assign the measured signal to its originating particle are presented. In this chapter results from photoionization of helium and neon are shown. The next chapter consists of another application of the methods developed in this work to a real experiment: the reaction examined leads in many cases to two particles that need to be detected within a time of only 30 ns. It turns out that the dead time for real signals is comparable to that of simulated signals. The algorithms cannot accurately determine the timing of two signals that are separated in time by less than 10 ns. On the basis of the pulse-height distribution it can be shown that the detector used had a lower efficiency at its center. In the last chapter the quality of the

  8. A multiple regression analysis for accurate background subtraction in 99Tcm-DTPA renography

    A technique for accurate background subtraction in 99Tcm-DTPA renography is described. The technique is based on a multiple regression analysis of the renal curves and separate heart and soft tissue curves which together represent background activity. It is compared, in over 100 renograms, with a previously described linear regression technique. Results show that the method provides accurate background subtraction, even in very poorly functioning kidneys, thus enabling relative renal filtration and excretion to be accurately estimated. (author)
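
    One plausible reading of the regression described above is to model the background in the renal region of interest as a linear combination of the heart and soft-tissue curves and then subtract the fitted combination. The fitting window and the synthetic curves in the sketch below are assumptions; the abstract does not specify the exact regression design.

        import numpy as np

        def background_subtract(renal, heart, tissue, fit_frames=slice(0, 8)):
            """Fit background = a*heart + b*tissue over an early window, subtract everywhere."""
            X = np.column_stack([heart[fit_frames], tissue[fit_frames]])
            coef, *_ = np.linalg.lstsq(X, renal[fit_frames], rcond=None)
            background = coef[0] * heart + coef[1] * tissue
            return renal - background, coef

        t = np.arange(60)                                    # frame index (synthetic data)
        heart = 100.0 * np.exp(-t / 20.0)                    # vascular curve
        tissue = 40.0 * np.exp(-t / 35.0)                    # soft-tissue curve
        kidney_true = 50.0 * (1 - np.exp(-np.clip(t - 5, 0, None) / 10.0))
        renal_roi = kidney_true + 0.6 * heart + 0.8 * tissue

        corrected, coef = background_subtract(renal_roi, heart, tissue)
        print("fitted background coefficients:", np.round(coef, 2))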

  9. On accurate computations of bound state properties in three- and four-electron atomic systems

    Frolov, Alexei M

    2016-01-01

    Results of accurate computations of bound states in three- and four-electron atomic systems are discussed. Bound state properties of the four-electron lithium ion Li$^{-}$ in its ground $2^{2}S-$state are determined from the results of accurate, variational computations. We also consider a closely related problem of accurate numerical evaluation of the half-life of the beryllium-7 isotope. This problem is of paramount importance for modern radiochemistry.

  10. Persistence of random walk records

    We study records generated by Brownian particles in one dimension. Specifically, we investigate an ordinary random walk and define the record as the maximal position of the walk. We compare the record of an individual random walk with the mean record, obtained as an average over infinitely many realizations. We term the walk ‘superior’ if the record is always above average, and conversely, the walk is said to be ‘inferior’ if the record is always below average. We find that the fraction of superior walks, S, decays algebraically with time, S ∼ t^{−β}, in the limit t → ∞, and that the persistence exponent is nontrivial, β = 0.382 258…. The fraction of inferior walks, I, also decays as a power law, I ∼ t^{−α}, but the persistence exponent is smaller, α = 0.241 608…. Both exponents are roots of transcendental equations involving the parabolic cylinder function. To obtain these theoretical results, we analyze the joint density of superior walks with a given record and position, while for inferior walks it suffices to study the density as a function of position. (paper)
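
    The superior-walk fraction S defined in this record is easy to estimate by direct simulation, which also illustrates why the exponent is called a persistence exponent. The step distribution, sample sizes and the crude power-law fit below are choices made for this sketch, not the paper's analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        n_walks, n_steps = 5000, 1000

        steps = rng.choice([-1, 1], size=(n_walks, n_steps))
        positions = np.cumsum(steps, axis=1)
        records = np.maximum.accumulate(positions, axis=1)      # running maximum of each walk
        mean_record = records.mean(axis=0)                      # ensemble-averaged record

        # A walk is still "superior" at time t if its record has stayed strictly
        # above the mean record at every step up to t.
        superior = np.logical_and.accumulate(records > mean_record, axis=1)
        S = superior.mean(axis=0)

        t = np.arange(1, n_steps + 1)
        ok = (t > n_steps // 10) & (S > 0)                      # fit only the late, non-empty part
        beta = -np.polyfit(np.log(t[ok]), np.log(S[ok]), 1)[0]
        print(f"crude estimate of the persistence exponent: beta ~ {beta:.2f} (theory: 0.382...)")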

  11. 1993 Department of Energy Records Management Conference

    NONE

    1993-12-31

    This document consists of viewgraphs from the presentations at the conference. Topics included are: DOE records management overview, NIRMA and ARMA resources, NARA records management training, potential quality assurance records, filing systems, organizing and indexing technical records, DOE-HQ initiatives, IRM reviews, status of epidemiologic inventory, disposition of records and personal papers, inactive records storage, establishing administrative records, managing records at Hanford, electronic mail -- legal and records issues, NARA-GAO reports status, consultive selling, automated indexing, decentralized approach to scheduling at a DOE office, developing specific records management programs, storage and retrieval at Savannah River Plant, an optical disk case study, and special interest group reports.

  12. Computation of records of streamflow at control structures

    Collins, Dannie L.

    1977-01-01

    Traditional methods of computing streamflow records on large, low-gradient streams require a continuous record of water-surface slope over a natural channel reach. This slope must be of sufficient magnitude to be accurately measured with available stage measuring devices. On highly regulated streams, this slope approaches zero during periods of low flow and accurate measurement is difficult. Methods are described to calibrate multipurpose regulating control structures to more accurately compute streamflow records on highly regulated streams. Hydraulic theory, assuming steady, uniform flow during a computational interval, is described for five different types of flow control. The controls are: Tainter gates, hydraulic turbines, fixed spillways, navigation locks, and crest gates. Detailed calibration procedures are described for the five different controls as well as for several flow regimes for some of the controls. The instrumentation package and computer programs necessary to collect and process the field data are discussed. Two typical calibration procedures and measurement data are presented to illustrate the accuracy of the methods. (Woodard-USGS)
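
    As a flavour of the per-structure ratings that such a calibration produces, the sketch below evaluates a generic free-flow gate equation of the form Q = Cd * b * Go * sqrt(2 g H). This textbook form and all numbers are assumptions for illustration; the report's procedures for Tainter gates, turbines, spillways, locks and crest gates are structure-specific and considerably more detailed.

        import math

        G = 9.81  # gravitational acceleration, m/s^2

        def gate_discharge(cd, gate_width_m, gate_opening_m, upstream_head_m):
            """Discharge (m^3/s) through one gate bay under assumed free-flow conditions."""
            return cd * gate_width_m * gate_opening_m * math.sqrt(2.0 * G * upstream_head_m)

        # Hypothetical structure: three bays, 12 m wide, opened 0.8 m, 4.5 m of head, Cd = 0.65.
        q_total = 3 * gate_discharge(0.65, 12.0, 0.8, 4.5)
        print(f"estimated discharge: {q_total:.1f} m^3/s")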

  13. Trends in magnetic recording media

    Köster, E.

    1993-03-01

    The fifth M.R.M. Conference in Perugia presents an opportunity to analyse the market situation of magnetic recording media, the trend in future recording systems and the potential of present contenders for flexible media to meet future requirements. The main products, in both quantity and value, are either at the zenith of their product cycle or near it. Highly innovative systems are needed in order to stimulate new applications or markets. The most challenging application is digital recording of HDTV, for which particle-in-polymer-binder technology is not yet fully ruled out by thin-film ME technology. Improvements are expected from magnetic particle and binder, dispersion, and coating technology.

  14. Electronic health records for dummies

    Williams, Trenor

    2010-01-01

    The straight scoop on choosing and implementing an electronic health records (EHR) system Doctors, nurses, and hospital and clinic administrators are interested in learning the best ways to implement and use an electronic health records system so that they can be shared across different health care settings via a network-connected information system. This helpful, plain-English guide provides need-to-know information on how to choose the right system, assure patients of the security of their records, and implement an EHR in such a way that it causes minimal disruption to the daily demands of a

  15. Analysis of the neurotoxic plasticizer n-butylbenzenesulfonamide by gas chromatography combined with accurate mass selected ion monitoring.

    Duffield, P; Bourne, D; Tan, K; Garruto, R M; Duncan, M W

    1994-01-01

    The plasticizer, n-butylbenzenesulfonamide (NBBS), is reported to be neurotoxic when inoculated intracisternally or intraperitoneally into rabbits. Because NBBS is commonly used in the production of polyamide (nylon) plastics and is soluble in water, the disposal of NBBS-containing plastics in landfill sites could result in NBBS appearing in the leachate. Further, NBBS could also be leached from packaging materials into their contents. To allow us to examine the risks posed by NBBS in the environment, we have developed a quantitative assay for this compound. The assay employs a one-step extraction into dichloromethane followed by gas chromatography with accurate mass selected ion recording. The assay incorporates [13C6]NBBS as an internal standard to allow precise quantitation, and four separate ion chromatograms are recorded. NBBS was found in some Australian domestic solid-waste landfill leachate (from less than 0.3 to 94.6 ng/mL), but ground water in the vicinity of a landfill had only trace quantities of NBBS. NBBS was also quantitated in some bottled and cask wines, and levels varied from not detected to 2.17 ng/mL (n = 14). Additional studies are required to assess the public health risks associated with the use of NBBS as a plasticizer. PMID:7861748
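
    Quantitation against the [13C6]NBBS internal standard amounts to scaling the analyte-to-standard peak-area ratio by the spiked amount. The response factor, spike amount and peak areas in the sketch below are invented for illustration and are not values from the assay.

        # Hedged sketch of stable-isotope-dilution quantitation, assuming a linear
        # analyte/internal-standard response ratio.
        def nbbs_ng_per_ml(area_analyte, area_is, is_spike_ng, sample_volume_ml,
                           response_factor=1.0):
            amount_ng = (area_analyte / area_is) * is_spike_ng / response_factor
            return amount_ng / sample_volume_ml

        # Example: 10 mL of leachate spiked with 50 ng of [13C6]NBBS (made-up numbers).
        conc = nbbs_ng_per_ml(area_analyte=42000, area_is=35000, is_spike_ng=50, sample_volume_ml=10)
        print(f"{conc:.2f} ng/mL")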

  16. Completeness of patient records in community pharmacies post-discharge after in-patient medication reconciliation : a before-after study

    Karapinar-Çarkıt, Fatma; van Breukelen, Ben R L; Borgsteede, Sander D; Janssen, Marjo J A; Egberts, Antoine C G; van den Bemt, Patricia M L A

    2014-01-01

    BACKGROUND: Transfer of discharge medication-related information to community pharmacies could improve continuity of care. This requires community pharmacies to accurately update their patient records when new information is transferred. An instruction manual that specifies how to document infor

  17. Patient records and clinical overview

    Jensen, Lotte Groth

    2016-01-01

    investigate the creation of overview in daily clinical practice and further investigates how a paper-based patient record and an EPR, respectively, support the creation of clinical overview. That the change from paper-based patient record to EPR can cause difficulties, interruptions and challenges...... or rendered visible in detail. Studies have been conducted on the differences between using a paper-based patient record and an EPR, but few studies have focused on the creation of overview in daily clinical practice in connection with this transition. Therefore, the aim of this PhD dissertation...... of clinical overview. Considering the change from paper-based patient record to EPR, the dissertation also investigates how the two artefacts support the creation of clinical overview. The present dissertation is primarily based on ethnographically inspired observational studies and semi-structured interviews...

  18. Scaling in Athletic World Records

    Savaglio, Sandra; Carbone, Vincenzo

    2000-01-01

    World records in athletics provide a measure of physical as well as physiological human performance. Here we analyse running records and show that the mean speed as a function of race time can be described by two scaling laws that have a breakpoint at about 150-170 seconds (corresponding to the ~1,000 m race). We interpret this as being the transition time between anaerobic and aerobic energy expenditure by athletes.
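
    The two-regime description above can be reproduced on any table of (race time, mean speed) pairs by fitting two straight lines in log-log space and picking the split with the smallest residual. The data in the sketch below are synthetic and merely mimic the shape of the record curve; the exponents and prefactors are not the fitted values from the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        T = np.logspace(1, 4, 25)                                   # race times from 10 s to 10,000 s
        v = np.where(T < 160, 11.0 * T**-0.06, 20.0 * T**-0.18)     # synthetic two-regime mean speeds
        v = v * rng.lognormal(0.0, 0.01, T.size)                    # small multiplicative scatter

        logT, logv = np.log(T), np.log(v)

        def two_line_residual(split):
            # Total squared residual of two separate straight-line fits in log-log space.
            resid = 0.0
            for seg in (slice(0, split), slice(split, T.size)):
                coef = np.polyfit(logT[seg], logv[seg], 1)
                resid += np.sum((np.polyval(coef, logT[seg]) - logv[seg]) ** 2)
            return resid

        best = min(range(3, T.size - 3), key=two_line_residual)
        print(f"estimated breakpoint near T = {T[best]:.0f} s")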

  19. How accurate is accident data in road safety research? An application of vehicle black box data regarding pedestrian-to-taxi accidents in Korea.

    Chung, Younshik; Chang, IlJoon

    2015-11-01

    Recently, the introduction of vehicle black box systems, or in-vehicle video event data recorders, has enabled drivers to collect more accurate crash information, such as location, time, and the situation at the pre-crash and crash moments, which can be analyzed to identify crash causal factors more accurately. This study presents the vehicle black box system in brief and its application status in Korea. Based on the crash data obtained from the vehicle black box system, this study analyzes the accuracy of the crash data collected with the existing road crash data recording method, in which crashes are recorded by police officers based on the accident parties' statements or eyewitness accounts. The analysis results show that the crash data observed by the existing method have an average spatial difference of 84.48 m with a standard deviation of 157.75 m, as well as an average temporal error of 29.05 min with a standard deviation of 19.24 min. Additionally, the average and standard deviation of crash speed errors were found to be 9.03 km/h and 7.21 km/h, respectively. PMID:26298271

  20. Hospital Electronic Health Record Adoption and Its Influence on Postoperative Sepsis

    Fareed, Naleef

    2013-01-01

    Electronic Health Record (EHR) systems could make healthcare delivery safer by providing benefits such as timely access to accurate and complete patient information, advances in diagnosis and coordination of care, and enhancements for monitoring patient vitals. This study explored the nature of EHR adoption in U.S. hospitals and their patient…

  1. Geomagnetic polarity transitions of the Gilbert and Gauss chrons recorded in marine marls from Sicily

    van Hoof, A.A.M.

    1993-01-01

    One of the most fascinating phenomena of geophysics is the fact that in the geological past the Earth's magnetic field has frequently reversed its polarity. These polarity transitions are accurately established during at least the past 165 Myr - from their recording in the ocean floor: the marine ma

  2. Do GPs' medical records demonstrate a good recognition of depression? A new perspective on case extraction.

    Joling, K.J.; Marwijk, H.W.J. van; Piek, E.; Horst, H.E. van der; Penninx, B.W.; Verhaak, P.; Hout, H.P.J. van

    2011-01-01

    Background: Previous estimates of depression recognition in primary care are low and inconsistent. This may be due to registration artifacts and limited extraction efforts. This study investigated a) whether GPs' medical records demonstrate an accurate recognition of depression and b) which combinat

  3. Do GPs' medical records demonstrate a good recognition of depression? A new perspective on case extraction

    Joling, Karlijn J.; van Marwijk, Harm W. J.; Piek, Ellen; van der Horst, Henriette E.; Penninx, Brenda W.; Verhaak, Peter; van Hout, Hein P. J.

    2011-01-01

    Background: Previous estimates of depression recognition in primary care are low and inconsistent. This may be due to registration artifacts and limited extraction efforts. This study investigated a) whether GPs' medical records demonstrate an accurate recognition of depression and b) which combinat

  4. Usability of mobile phone food records to assess dietary intake in adolescents

    Mobile technologies are emerging as a valuable tool to collect and assess dietary intake. Adolescents readily accept and adopt new technologies, hence, a food record application (FRapp) may provide an accurate mechanism to monitor dietary intake. We examined the usability of a FRapp in 17 free-livin...

  5. Validation of a method for accurate and highly reproducible quantification of brain dopamine transporter SPECT studies

    Jensen, Peter S; Ziebell, Morten; Skouboe, Glenna; Khalid, Usman; de Nijs, Robin; Thomsen, Gerda; Knudsen, Gitte M; Svarer, Claus

    2011-01-01

    In nuclear medicine brain imaging, it is important to delineate regions of interest (ROIs) so that the outcome is both accurate and reproducible. The purpose of this study was to validate a new time-saving algorithm (DATquan) for accurate and reproducible quantification of the striatal dopamine...

  6. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm3) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm3, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm3, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0.28 ± 0.03 mm, and 1
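
    The volume-overlap metrics quoted above are straightforward to compute from a pair of binary masks. The voxel size and the toy masks in the sketch below are made up; they only show how VD and DSC are obtained, not the study's evaluation pipeline.

        import numpy as np

        def volume_difference_mm3(seg, ref, voxel_volume_mm3):
            return abs(int(seg.sum()) - int(ref.sum())) * voxel_volume_mm3

        def dice_coefficient(seg, ref):
            intersection = np.logical_and(seg, ref).sum()
            return 2.0 * intersection / (seg.sum() + ref.sum())

        # Toy 3-D masks: the "segmentation" misses one slice of the reference box.
        ref = np.zeros((40, 40, 40), dtype=bool)
        ref[10:30, 10:30, 10:30] = True
        seg = ref.copy()
        seg[10, :, :] = False

        print("VD :", volume_difference_mm3(seg, ref, voxel_volume_mm3=0.25**3), "mm3")
        print("DSC:", round(dice_coefficient(seg, ref), 3))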

  7. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    Gan, Yangzhou; Zhao, Qunfei [Department of Automation, Shanghai Jiao Tong University, and Key Laboratory of System Control and Information Processing, Ministry of Education of China, Shanghai 200240 (China); Xia, Zeyang, E-mail: zy.xia@siat.ac.cn, E-mail: jing.xiong@siat.ac.cn; Hu, Ying [Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, and The Chinese University of Hong Kong, Shenzhen 518055 (China); Xiong, Jing, E-mail: zy.xia@siat.ac.cn, E-mail: jing.xiong@siat.ac.cn [Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 510855 (China); Zhang, Jianwei [TAMS, Department of Informatics, University of Hamburg, Hamburg 22527 (Germany)

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm{sup 3}) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm{sup 3}, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm{sup 3}, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0

  8. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture.

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-01-01

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain. PMID:26402681
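
    A very reduced illustration of combining two depth sensors is per-joint, confidence-weighted averaging of skeletons that have already been registered into a common frame. This is not the learning-based method of the record above, which additionally exploits temporal information; the joint positions and confidences below are invented.

        import numpy as np

        def fuse_joints(joints_a, conf_a, joints_b, conf_b):
            """joints_*: (n_joints, 3) positions in a shared frame; conf_*: (n_joints,) in [0, 1]."""
            w_a = conf_a[:, None]
            w_b = conf_b[:, None]
            total = np.clip(w_a + w_b, 1e-6, None)          # avoid division by zero
            return (w_a * joints_a + w_b * joints_b) / total

        joints_a = np.array([[0.00, 1.50, 2.0], [0.1, 1.2, 2.0]])   # e.g. head, shoulder (sensor A)
        joints_b = np.array([[0.02, 1.48, 2.1], [0.3, 1.1, 2.2]])   # same joints seen by sensor B
        conf_a = np.array([0.9, 0.1])                               # joint 1 largely occluded for A
        conf_b = np.array([0.8, 0.9])
        print(fuse_joints(joints_a, conf_a, joints_b, conf_b))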

  10. Accurate location estimation of moving object with energy constraint & adaptive update algorithms to save data

    Semwal, Vijay Bhaskar; Bhaskar, Vinay S; Sati, Meenakshi

    2011-01-01

    In the research paper "Accurate estimation of the target location of object with energy constraint & Adaptive Update Algorithms to Save Data", one of the central issues addressed in sensor networks is tracking the location of a moving object, which carries the overhead of saving data, while estimating the target location accurately under an energy constraint. There is no general mechanism that controls and maintains these data, and the wireless communication bandwidth is also very limited. Fields that use this technique include flood and typhoon detection, forest fire detection, and temperature and humidity monitoring; once this information is available, it can be fed back to a central air conditioning and ventilation system. In this research paper, we propose a protocol based on prediction and an adaptive update algorithm, in which the number of sensor nodes required is reduced by an accurate estimation of the target location. A minimum of three sensor nodes is used to obtain an accurate position, and this can be extended to four or five to find a more accurate location ...
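
    The minimum of three sensor nodes mentioned in this record corresponds to classic trilateration: with three range measurements the circle equations can be linearised and solved for a 2-D position. The node layout and ranges below are invented; adding a fourth or fifth node simply adds rows to the same least-squares system.

        import numpy as np

        def trilaterate(anchors, ranges):
            """Least-squares 2-D position from ranges to known anchor nodes."""
            (x1, y1), r1 = anchors[0], ranges[0]
            rows, rhs = [], []
            for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
                # Subtracting the first circle equation linearises the problem.
                rows.append([2 * (xi - x1), 2 * (yi - y1)])
                rhs.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)
            sol, *_ = np.linalg.lstsq(np.array(rows, float), np.array(rhs, float), rcond=None)
            return sol

        anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]       # hypothetical sensor nodes
        true_pos = np.array([4.0, 7.0])
        ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
        print(trilaterate(anchors, ranges))                     # approximately [4. 7.]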

  11. 36 CFR 1290.3 - Sources of assassination records and additional records and information.

    2010-07-01

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION JFK ASSASSINATION RECORDS GUIDANCE FOR INTERPRETATION AND IMPLEMENTATION OF THE PRESIDENT JOHN F. KENNEDY ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) §...

  12. 27 CFR 24.317 - Sugar record.

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Sugar record. 24.317... OF THE TREASURY LIQUORS WINE Records and Reports § 24.317 Sugar record. A proprietor who receives, stores, or uses sugar shall maintain a record of receipt and use. The record will show the date...

  13. Problems in the Preservation of Electronic Records.

    Lin, Lim Siew; Ramaiah, Chennupati K.; Wal, Pitt Kuan

    2003-01-01

    Discusses issues related to the preservation of electronic records. Highlights include differences between physical and electronic records; volume of electronic records; physical media; authenticity; migration of electronic records; metadata; legal issues; improved storage media; and projects for preservation of electronic records. (LRW)

  14. Measuring directional characteristics of in-ear recording devices

    Christensen, Flemming; Hoffmann, Pablo F.; Hammershøi, Dorte

    2013-01-01

    With the availability of small in-ear headphones and miniature microphones it is possible to construct combined in-ear devices for binaural recording and playback. When mounting a microphone on the outside of an insert earphone, the microphone position deviates from ideal positions in the ear canal. The pinna, and thereby also the natural sound transmission, is altered by the inserted device. This paper presents a methodology for accurately measuring the direction-dependent transfer functions of such in-ear devices. Pilot measurements on a commercially available device are presented and...

  15. Cultural Heritage Recording Utilising Low-Cost Closerange Photogrammetry

    Melanie Kirchhöfer

    2011-12-01

    Full Text Available Cultural heritage is under a constant threat of damage or even destruction, and comprehensive and accurate recording is necessary to attenuate the risk of losing heritage or to serve as a basis for reconstruction. Cost-effective and easy-to-use methods are required to record cultural heritage, particularly during a world recession, and close-range photogrammetry has proven potential in this area. Off-the-shelf digital cameras can be used to rapidly acquire data at low cost, allowing non-experts to become involved. Exterior orientation of the camera during exposure ideally needs to be established for every image, traditionally requiring known coordinated target points. Establishing these points is time-consuming and costly, and using targets can often be undesirable on sensitive sites. MEMS-based sensors can assist in overcoming this problem by providing small-size and low-cost means to directly determine exterior orientation for close-range photogrammetry. This paper describes the development of an image-based recording system, comprising an off-the-shelf digital SLR camera, a MEMS-based 3D orientation sensor and a GPS antenna. All system components were assembled in a compact and rigid frame that allows calibration of rotational and positional offsets between the components. The project involves collaboration between English Heritage and Loughborough University and the intention is to assess the system’s achievable accuracy and practicability in a heritage recording environment. Tests were conducted at Loughborough University and a case study at St. Catherine’s Oratory on the Isle of Wight, UK. These demonstrate that the data recorded by the system can indeed meet the accuracy requirements for heritage recording at medium accuracy (1-4 cm), with either a single control point or even none. As the recording system has been configured with a focus on low-cost and easy-to-use components, it is believed to be suitable for heritage recording by non

  16. The use of automated bioacoustic recorders to replace human wildlife surveys: an example using nightjars.

    Mieke C Zwart

    Full Text Available To be able to monitor and protect endangered species, we need accurate information on their numbers and where they live. Survey methods using automated bioacoustic recorders offer significant promise, especially for species whose behaviour or ecology reduces their detectability during traditional surveys, such as the European nightjar. In this study we examined the utility of automated bioacoustic recorders and the associated classification software as a way to survey for wildlife, using the nightjar as an example. We compared traditional human surveys with results obtained from bioacoustic recorders. When we compared these two methods using the recordings made at the same time as the human surveys, we found that recorders were better at detecting nightjars. However, in practice fieldworkers are likely to deploy recorders for extended periods to make best use of them. Our comparison of this practical approach with human surveys revealed that recorders were significantly better at detecting nightjars than human surveyors: recorders detected nightjars during 19 of 22 survey periods, while surveyors detected nightjars on only six of these occasions. In addition, there was no correlation between the amount of vocalisation captured by the acoustic recorders and the abundance of nightjars as recorded by human surveyors. The data obtained from the recorders revealed that nightjars were most active just before dawn and just after dusk, and least active during the middle of the night. As a result, we found that recording at both dusk and dawn or only at dawn would give reasonably high levels of detection while significantly reducing recording time, preserving battery life. Our analyses suggest that automated bioacoustic recorders could increase the detection of other species, particularly those that are known to be difficult to detect using traditional survey methods. The accuracy of detection is especially important when the data are used to inform

  17. The use of automated bioacoustic recorders to replace human wildlife surveys: an example using nightjars.

    Zwart, Mieke C; Baker, Andrew; McGowan, Philip J K; Whittingham, Mark J

    2014-01-01

    To be able to monitor and protect endangered species, we need accurate information on their numbers and where they live. Survey methods using automated bioacoustic recorders offer significant promise, especially for species whose behaviour or ecology reduces their detectability during traditional surveys, such as the European nightjar. In this study we examined the utility of automated bioacoustic recorders and the associated classification software as a way to survey for wildlife, using the nightjar as an example. We compared traditional human surveys with results obtained from bioacoustic recorders. When we compared these two methods using the recordings made at the same time as the human surveys, we found that recorders were better at detecting nightjars. However, in practice fieldworkers are likely to deploy recorders for extended periods to make best use of them. Our comparison of this practical approach with human surveys revealed that recorders were significantly better at detecting nightjars than human surveyors: recorders detected nightjars during 19 of 22 survey periods, while surveyors detected nightjars on only six of these occasions. In addition, there was no correlation between the amount of vocalisation captured by the acoustic recorders and the abundance of nightjars as recorded by human surveyors. The data obtained from the recorders revealed that nightjars were most active just before dawn and just after dusk, and least active during the middle of the night. As a result, we found that recording at both dusk and dawn or only at dawn would give reasonably high levels of detection while significantly reducing recording time, preserving battery life. Our analyses suggest that automated bioacoustic recorders could increase the detection of other species, particularly those that are known to be difficult to detect using traditional survey methods. The accuracy of detection is especially important when the data are used to inform conservation. PMID

  18. Deciphering records of geomagnetic reversals

    Valet, Jean-Pierre; Fournier, Alexandre

    2016-06-01

    Polarity reversals of the geomagnetic field are a major feature of the Earth's dynamo. Questions remain regarding the dynamical processes that give rise to reversals and the properties of the geomagnetic field during a polarity transition. A large number of paleomagnetic reversal records have been acquired during the past 50 years in order to better constrain the structure and geometry of the transitional field. In addition, over the past two decades, numerical dynamo simulations have also provided insights into the reversal mechanism. Yet despite the large paleomagnetic database, controversial interpretations of records of the transitional field persist; they result from two characteristics inherent to all reversals, both of which are detrimental to an unambiguous analysis. On the one hand, the reversal process is rapid and requires adequate temporal resolution. On the other hand, weak field intensities during a reversal can affect the fidelity of magnetic recording in sedimentary records. This paper is aimed at reviewing critically the main reversal features derived from paleomagnetic records and at analyzing some of these features in light of numerical simulations. We discuss in detail the fidelity of the signal extracted from paleomagnetic records and pay special attention to their resolution with respect to the timing and mechanisms involved in the magnetization process. Records from marine sediments dominate the database. They give rise to transitional field models that often lead to overinterpretation of the data. Consequently, we attempt to separate robust results (and their subsequent interpretations) from those that do not stand on a strong observational footing. Finally, we discuss new avenues that should favor progress to better characterize and understand transitional field behavior.

  19. Wormholes record species history in space and time.

    Hedges, S Blair

    2013-02-23

    Genetic and fossil data often lack the spatial and temporal precision for tracing the recent biogeographic history of species. Data with finer resolution are needed for studying distributional changes during modern human history. Here, I show that printed wormholes in rare books and artwork are trace fossils of wood-boring species with unusually accurate locations and dates. Analyses of wormholes printed in western Europe since the fifteenth century document the detailed biogeographic history of two putative species of invasive wood-boring beetles. Their distributions now overlap broadly, as an outcome of twentieth century globalization. However, the wormhole record revealed, unexpectedly, that their original ranges were contiguous and formed a stable line across central Europe, apparently a result of competition. Extension of the wormhole record, globally, will probably reveal other species and evolutionary insights. These data also provide evidence for historians in determining the place of origin or movement of a woodblock, book, document or art print. PMID:23173192

  20. On the Reliability of Han Dynasty Solar Eclipse Records

    Pankenier, David W.

    2012-11-01

    The veracity of early Chinese records of astronomical observations has been questioned, principally based on two early studies from the 1950s, which suggested that political motives may have led scholar-officials at court to fabricate astral omens. Here I revisit the Han Dynasty (206 BCE-220 CE) solar eclipse reports to determine whether the charge has merit for those first four centuries of the imperial period. All 127 dated solar eclipses reported in the official sources are checked for accuracy against the "Five Millennium Catalog of Solar Eclipses" produced by Espenak and Meeus (2009). The Han Dynasty records prove remarkably accurate. Copyists' errors do occur, but there are only rare instances of totally erroneous reports, none of which is provably the result of politically-motivated manipulation.