WorldWideScience

Sample records for relevance accuracy timeliness

  1. Insider Trading B-side: relevance, timeliness and position influence

    Directory of Open Access Journals (Sweden)

    Luiz Felipe de A. Pontes Girão

    2015-12-01

    Full Text Available Objective – Our main objective is to analyze the impact of insider trading on stock investment decisions. Design/methodology/approach – We used an online survey, obtaining 271 valid answers. To analyze our data, we used parametric (t test and ANOVA F test) and non-parametric (Mann-Whitney and Kruskal-Wallis) techniques. Findings – We find that insider trades are relevant to investment decisions, and that timeliness also exerts an influence on this kind of decision, especially for abnormal trades. Practical implications – In practical terms, our results suggest that the Brazilian Securities and Exchange Commission (CVM) must update the Brazilian insider trading regulation to achieve its objective of protecting investors. From the investors' point of view, such an update could improve their ability to monitor insiders and follow their activities, as well as to mimic their trades. Originality/value – The originality of our paper is an analysis of relevance, timeliness and influence of position in a firm as "determinants" of investment decisions. We use these three specific characteristics to critique the Brazilian insider trading regulation.
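
    The group comparisons named in this abstract (t test, ANOVA F test, Mann-Whitney, Kruskal-Wallis) can all be run with SciPy; the sketch below applies them to made-up survey scores purely to illustrate the kind of analysis described, not the authors' actual data or code.

```python
# Hedged sketch: comparing hypothetical survey scores across respondent groups
# with the parametric and non-parametric tests named in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical 1-5 Likert scores for three respondent groups (not the study's data).
group_a = rng.integers(1, 6, size=90)
group_b = rng.integers(1, 6, size=90)
group_c = rng.integers(1, 6, size=91)

# Parametric: two-sample t test and one-way ANOVA (F test).
t_stat, t_p = stats.ttest_ind(group_a, group_b)
f_stat, f_p = stats.f_oneway(group_a, group_b, group_c)

# Non-parametric counterparts: Mann-Whitney U and Kruskal-Wallis H.
u_stat, u_p = stats.mannwhitneyu(group_a, group_b)
h_stat, h_p = stats.kruskal(group_a, group_b, group_c)

print(f"t: p={t_p:.3f}  ANOVA: p={f_p:.3f}  "
      f"Mann-Whitney: p={u_p:.3f}  Kruskal-Wallis: p={h_p:.3f}")
```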

  2. Whether Audit Committee Financial Expertise Is the Only Relevant Expertise: A Review of Audit Committee Expertise and Timeliness of Financial Reporting

    Directory of Open Access Journals (Sweden)

    Saeed Rabea Baatwah

    2013-06-01

    Full Text Available This study reviews the literature on audit committee expertise and financial reporting timeliness. Financial reporting timeliness and audit committee expertise are two areas of research gaining the attention of a large number of stakeholders because they contribute to the reliability and relevance of financial reporting. Indeed, the focus of this review is primarily on recent developments in the pertinent literature, in order to show the limitations of such research and encourage future research to overcome them. By also looking at the development of the audit committee expertise literature, this study concludes that (1) like most audit committee literature, the financial reporting timeliness literature continues to assume the absence of any contribution from expertise other than financial expertise, and ignores the role of the audit committee chair; and (2) most of this literature fails to find a significant effect because it ignores the interaction among corporate governance mechanisms. Accordingly, this study posits that if future research ignores the issues raised here, major mistakes could be made in reforms aimed at enhancing the quality of financial reporting.

  3. A Smartphone-Based Application Improves the Accuracy, Completeness, and Timeliness of Cattle Disease Reporting and Surveillance in Ethiopia

    Directory of Open Access Journals (Sweden)

    Tariku Jibat Beyene

    2018-01-01

    Full Text Available Accurate disease reporting, ideally in near real time, is a prerequisite for detecting disease outbreaks and implementing appropriate control measures. This study compared the performance of the traditional paper-based approach to animal disease reporting in Ethiopia with one using an application running on smartphones. In the traditional approach, the total number of cases for each disease or syndrome was aggregated by animal species and reported to each administrative level at monthly intervals; with the smartphone application, demographic information and a detailed list of presenting signs, in addition to the putative disease diagnosis, were immediately available to all administrative levels via a cloud-based server. While the smartphone-based approach resulted in much more timely reporting, there were delays due to limited connectivity; these ranged on average from 2 days (in well-connected areas) up to 13 days (in more rural locations). We outline the challenges that would likely be associated with any widespread rollout of a smartphone-based approach such as the one described in this study, but demonstrate that in the long run the approach offers significant benefits in terms of timeliness of disease reporting, improved data integrity and greatly improved animal disease surveillance.

  4. Physiologically-based, predictive analytics using the heart-rate-to-Systolic-Ratio significantly improves the timeliness and accuracy of sepsis prediction compared to SIRS.

    Science.gov (United States)

    Danner, Omar K; Hendren, Sandra; Santiago, Ethel; Nye, Brittany; Abraham, Prasad

    2017-04-01

    Enhancing the efficiency of diagnosis and treatment of severe sepsis by using physiologically based, predictive analytical strategies has not been fully explored. We hypothesized that assessment of the heart-rate-to-systolic ratio significantly increases the timeliness and accuracy of sepsis prediction after emergency department (ED) presentation. We evaluated the records of 53,313 ED patients from a large, urban teaching hospital between January and June 2015. The HR-to-systolic ratio was compared to SIRS criteria for sepsis prediction. There were 884 patients with discharge diagnoses of sepsis, severe sepsis, and/or septic shock. Variations in three presenting variables, heart rate, systolic BP and temperature, were determined to be primary early predictors of sepsis, with 74% (654/884) accuracy compared to 34% (304/884) using SIRS criteria (p < 0.0001) in confirmed septic patients. Physiologically based predictive analytics improved the accuracy and expediency of sepsis identification via detection of variations in the HR-to-systolic ratio. This approach may lead to earlier sepsis workup and life-saving interventions. Copyright © 2017 Elsevier Inc. All rights reserved.
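
    A minimal sketch of the kind of screening rule the abstract describes, comparing a heart-rate-to-systolic-BP ratio flag against a simplified two-of-four SIRS check; the vital signs, the 1.0 ratio threshold, and the class names are assumptions for illustration, not values from the study.

```python
# Hedged sketch: flagging hypothetical ED vitals using a heart-rate-to-systolic-BP
# ratio threshold (illustrative value, not the study's) next to a simplified SIRS check.
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: float       # beats/min
    systolic_bp: float      # mmHg
    temperature_c: float    # degrees Celsius
    resp_rate: float        # breaths/min
    wbc_k_per_ul: float     # x10^3 cells/uL

def hr_systolic_flag(v: Vitals, threshold: float = 1.0) -> bool:
    """Flag when HR / systolic BP exceeds an assumed screening threshold."""
    return v.heart_rate / v.systolic_bp >= threshold

def sirs_flag(v: Vitals) -> bool:
    """Simplified SIRS: at least two of the four classic criteria."""
    criteria = [
        v.temperature_c > 38.0 or v.temperature_c < 36.0,
        v.heart_rate > 90,
        v.resp_rate > 20,
        v.wbc_k_per_ul > 12 or v.wbc_k_per_ul < 4,
    ]
    return sum(criteria) >= 2

patient = Vitals(heart_rate=118, systolic_bp=95, temperature_c=38.4,
                 resp_rate=24, wbc_k_per_ul=13.5)
print("HR/SBP flag:", hr_systolic_flag(patient), "SIRS flag:", sirs_flag(patient))
```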

  5. Improving the timeliness and accuracy of injury severity data in road traffic accidents in an emerging economy setting.

    Science.gov (United States)

    Lam, Carlos; Chen, Chang-I; Chuang, Chia-Chang; Wu, Chia-Chieh; Yu, Shih-Hsiang; Chang, Kai-Kuo; Chiu, Wen-Ta

    2018-05-18

    Road traffic injuries (RTIs) are among the leading causes of injury and fatality worldwide. RTI casualties are continually increasing in Taiwan; however, because of a lack of an advanced method for classifying RTI severity data, as well as the fragmentation of data sources, road traffic safety and health agencies encounter difficulties in analyzing RTIs and their burden on the healthcare system and national resources. These difficulties lead to blind spots during policy-making for RTI prevention and control. After compiling classifications applied in various countries, we summarized data sources for RTI severity in Taiwan, through which we identified data fragmentation. Accordingly, we proposed a practical classification for RTI severity, as well as a feasible model for collecting and integrating these data nationwide. This model can provide timely relevant data recorded by medical professionals and is valuable to healthcare providers. The proposed model's pros and cons are also compared to those of other current models.

  6. Relevance of intracellular polarity to accuracy of eukaryotic chemotaxis

    International Nuclear Information System (INIS)

    Hiraiwa, Tetsuya; Nishikawa, Masatoshi; Shibata, Tatsuo; Nagamatsu, Akihiro; Akuzawa, Naohiro

    2014-01-01

    Eukaryotic chemotaxis is usually mediated by intracellular signals that tend to localize at the front or back of the cell. Such intracellular polarities frequently require no extracellular guidance cues, indicating that spontaneous polarization occurs in the signal network. Spontaneous polarization activity is considered relevant to the persistent motions in random cell migrations and chemotaxis. In this study, we propose a theoretical model that connects spontaneous intracellular polarity and motile ability in a chemoattractant solution. We demonstrate that the intracellular polarity can enhance the accuracy of chemotaxis. Chemotactic accuracy should also depend on chemoattractant concentration through the concentration-dependent correlation time in the polarity direction. Both the polarity correlation time and the chemotactic accuracy depend on the degree of responsiveness to the chemical gradient. We show that optimally accurate chemotaxis occurs at an intermediate responsiveness of intracellular polarity. Experimentally, we find that the persistence time of randomly migrating Dictyostelium cells depends on the chemoattractant concentration, as predicted by our theory. At the optimum responsiveness, this ameboid cell can enhance its chemotactic accuracy tenfold. (paper)

  7. Assessing effects of the e-Chasqui laboratory information system on accuracy and timeliness of bacteriology results in the Peruvian tuberculosis program.

    Science.gov (United States)

    Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin J A; Yale, Gloria; Suarez, Carmen; Asencios, Luis; Fraser, Hamish

    2007-10-11

    We created a web-based laboratory information system, e-Chasqui, to connect public laboratories to health centers in order to improve communication and analysis. After one year, we performed pre- and post-implementation assessments of communication delays and found that e-Chasqui maintained the average delay but eliminated delays of over 60 days. Adding digital verification maintained the average delay but should increase accuracy. We are currently performing a randomized evaluation of the impacts of e-Chasqui.

  8. Reference radiology in nephroblastoma: accuracy and relevance for preoperative chemotherapy

    International Nuclear Information System (INIS)

    Schenk, J.P.; Schrader, C.; Zieger, B.; Ley, S.; Troeger, J.; Furtwaengler, R.; Graf, N.; Leuschner, I.

    2006-01-01

    Purpose: A reference radiologic diagnosis was carried out for the purpose of quality control and in order to achieve high diagnostic accuracy in the ongoing trial and study SIOP 2001/GPOH for renal tumors during childhood. The aim of the present study is to evaluate the value of diagnostic imaging and the benefit of reference evaluation at a pediatric radiology center. Materials and Methods: In 2004 the imaging studies of 97 patients suspected of having a renal tumor were presented at the beginning of therapy. Diagnostic imaging was compared to the primary imaging results and the histological findings and was analyzed with regard to the therapeutic consequence (primary chemotherapy without prior histology). 77 MRI, 35 CT and 67 ultrasound examinations of 47 girls and 50 boys (mean age 4 years; one day to 15.87 years old) were analyzed. In addition to the histological findings, the reference pathological results were submitted in 86 cases. Results from the primary imaging corresponding to the histology and results from the reference radiology corresponding to the histology were statistically compared in a binomial test. Results: Of 76 reference-diagnosed Wilms' tumors, 67 were confirmed histologically. In 72 cases preoperative chemotherapy was initiated. In 5 cases neither a Wilms' tumor nor a nephroblastomatosis was found. 16 of 21 cases (76%) with reference-diagnosed non-Wilms' tumors were classified correctly. The results of the primary imaging corresponded to the histology in 71 cases, and those of the reference radiology in 82 cases. The statistical evaluation showed that the results of the reference radiology were significantly better (p=0.03971). (orig.)
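
    The abstract reports a binomial test comparing primary imaging (71/97 correct) with reference radiology (82/97 correct) against histology. One common way to run such a paired comparison is an exact sign test on the discordant cases, sketched below; the discordant split is hypothetical and only the totals come from the abstract, so this is not a reproduction of the study's analysis.

```python
# Hedged sketch: paired comparison of primary vs. reference radiology agreement with
# histology via an exact sign (binomial) test on discordant cases.
# The discordant split below is hypothetical; only the totals (71/97 vs. 82/97) are from the abstract.
from scipy.stats import binomtest

reference_only_correct = 13   # assumed: reference radiology right, primary imaging wrong
primary_only_correct = 2      # assumed: primary imaging right, reference radiology wrong

discordant = reference_only_correct + primary_only_correct
result = binomtest(reference_only_correct, n=discordant, p=0.5, alternative="two-sided")
print(f"discordant pairs: {discordant}, exact sign-test p = {result.pvalue:.4f}")
```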

  9. Clinical relevance of studies on the accuracy of visual inspection for detecting caries lesions

    DEFF Research Database (Denmark)

    Gimenez, Thais; Piovesan, Chaiana; Braga, Mariana M

    2015-01-01

    Although visual inspection is the most commonly used method for caries detection, and consequently the most investigated, studies have not been concerned about the clinical relevance of this procedure. Therefore, we conducted a systematic review in order to perform a critical evaluation considering...... the clinical relevance and methodological quality of studies on the accuracy of visual inspection for assessing caries lesions. Two independent reviewers searched several databases through July 2013 to identify papers/articles published in English. Other sources were checked to identify unpublished literature...... to clinical relevance and the methodological quality of the studies were evaluated. 96 of the 5,578 articles initially identified met the inclusion criteria. In general, most studies failed in considering some clinically relevant aspects: only 1 included study validated activity status of lesions, no study...

  10. The Importance of Accuracy, Stimulating Writing, and Relevance in Middle School Science Textbook Writing

    Science.gov (United States)

    Hubisz, John

    2004-05-01

    While accuracy in Middle School science texts is most important, the texts should also read well, stimulating the student to want to go on, and the material must be relevant to the subject at hand as the typical student is not yet prepared to ignore that which is irrelevant. We know that children will read if the material is of interest (witness The Lord of the Rings and the Harry Potter book sales) and so we must write in a way that stimulates the student to want to examine the subject further and eliminate that which adds nothing to the discipline. Examples of the good and the bad will be presented.

  11. The Relevance of Interoception in Chronic Tinnitus: Analyzing Interoceptive Sensibility and Accuracy

    Directory of Open Access Journals (Sweden)

    Pia Lau

    2015-01-01

    Full Text Available In order to better understand tinnitus and the distress associated with it, psychological variables such as emotional and cognitive processing are a central element in theoretical models of this debilitating condition. Interoception, that is, the perception of internal bodily processes, may be such a psychological factor relevant to tinnitus. Against this background, 20 participants suffering from chronic tinnitus and 20 matched healthy controls completed questionnaires assessing interoceptive sensibility and participated in two tasks assessing interoceptive accuracy: the Schandry task, a heartbeat estimation assignment, and a skin conductance fluctuation perception task assessing the participants' ability to perceive phasic increases in sympathetic activation. To test stress reactivity, a construct tightly connected to tinnitus onset, we also included a stress induction. No differences between the groups were found for interoceptive accuracy and sensibility. However, the tinnitus group tended to overestimate the occurrence of phasic activation. Loudness of the tinnitus was associated with reduced interoceptive performance under stress. Our results indicate that interoceptive sensibility and accuracy do not play a significant role in tinnitus. However, tinnitus might be associated with a tendency to overestimate physical changes.
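
    The Schandry heartbeat-counting task mentioned here is conventionally scored as an accuracy index based on the discrepancy between counted and ECG-recorded beats; the sketch below implements that widely used formula on invented counts, as an illustration of the scoring rather than of the study's data.

```python
# Hedged sketch: the commonly used Schandry heartbeat-perception accuracy index,
# i.e. the mean over counting intervals of 1 - |recorded - counted| / recorded.
# The counts below are invented for illustration.
def schandry_accuracy(recorded, counted):
    if len(recorded) != len(counted):
        raise ValueError("each counting interval needs one recorded and one counted value")
    scores = [1.0 - abs(r - c) / r for r, c in zip(recorded, counted)]
    return sum(scores) / len(scores)

recorded_beats = [38, 52, 64]   # ECG-recorded beats in three intervals (hypothetical)
counted_beats = [33, 47, 70]    # participant's silent counts (hypothetical)
print(f"interoceptive accuracy index: {schandry_accuracy(recorded_beats, counted_beats):.3f}")
```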

  12. The relevance of accuracy of heartbeat perception in noncardiac and cardiac chest pain.

    Science.gov (United States)

    Schroeder, Stefanie; Gerlach, Alexander L; Achenbach, Stephan; Martin, Alexandra

    2015-04-01

    The development and course of noncardiac chest pain are assumed to be influenced by interoceptive processes. We investigated whether heartbeat perception was enhanced in patients suffering from noncardiac chest pain and to what degree it was associated with self-reported cognitive-perceptual features and chest pain characteristics. A total of 42 patients with noncardiac chest pain (NCCP), 35 patients with cardiac chest pain, and 52 healthy controls were recruited. Heartbeat perception was assessed using the Schandry task and a modified Brener-Kluvitse task. Self-report measures assessed anxiety sensitivity, somatosensory amplification, heart-focused anxiety, and chest pain characteristics. Heartbeat perception was not more accurate in patients with NCCP compared to patients with cardiac chest pain and healthy controls. However, in patients with NCCP, the error score (Schandry task) was significantly associated with stronger chest pain impairment, and the response bias (Brener-Kluvitse task) was associated with lower chest pain intensity. Against the assumptions of current etiological models, heartbeat perception was not enhanced in patients with NCCP. Chest pain characteristics, and particularly their appraisal as threatening, might be more relevant to NCCP than the perceptual accuracy of cardiac sensations and should be the focus of psychological interventions. However, the associations with chest pain impairment suggest that cardiac interoception influences the course of NCCP.

  13. Quantifying reporting timeliness to improve outbreak control

    NARCIS (Netherlands)

    Bonačić Marinović, Axel; Swaan, Corien; van Steenbergen, Jim; Kretzschmar, MEE

    The extent to which reporting delays should be reduced to gain substantial improvement in outbreak control is unclear. We developed a model to quantitatively assess reporting timeliness. Using reporting speed data for 6 infectious diseases in the notification system in the Netherlands, we calculated

  14. A Method for The Assessing of Reliability Characteristics Relevant to an Assumed Position-Fixing Accuracy in Navigational Positioning Systems

    Directory of Open Access Journals (Sweden)

    Specht Cezary

    2016-09-01

    Full Text Available This paper presents a method which makes it possible to determine reliability characteristics of navigational positioning systems relevant to an assumed value of permissible position-fixing error. The method allows the calculation of availability, reliability and operational continuity of a position-fixing system for an assumed position-fixing accuracy determined on the basis of formal requirements, both worldwide and national. The proposed mathematical model makes it possible to assess whether any navigational positioning system satisfies not only the position-fixing accuracy requirements of a given navigational application (for air, sea or land traffic) but also the remaining characteristics associated with the technical serviceability of the system.
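
    A rough illustration of the characteristics the paper names: treating availability as the fraction of fixes within the permissible error and continuity as the chance that an in-tolerance fix is followed by another, computed on a synthetic error series. This is a simplification under stated assumptions, not the paper's formal reliability model.

```python
# Hedged sketch: empirical availability and operational continuity of a positioning
# system for an assumed permissible position-fixing error. The error series is synthetic
# and the definitions are simplified relative to the paper's model.
import numpy as np

rng = np.random.default_rng(1)
position_errors_m = rng.rayleigh(scale=2.0, size=10_000)  # synthetic 2D fix errors [m]
permissible_error_m = 5.0                                 # assumed accuracy requirement

within = position_errors_m <= permissible_error_m
availability = within.mean()

# Continuity: given the current fix meets the requirement, probability the next one does too.
continuity = within[1:][within[:-1]].mean()

print(f"availability: {availability:.3f}, continuity: {continuity:.3f}")
```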

  15. Task-relevant cognitive and motor functions are prioritized during prolonged speed-accuracy motor task performance.

    Science.gov (United States)

    Solianik, Rima; Satas, Andrius; Mickeviciene, Dalia; Cekanauskaite, Agne; Valanciene, Dovile; Majauskiene, Daiva; Skurvydas, Albertas

    2018-06-01

    This study aimed to explore the effect of a prolonged speed-accuracy motor task on indicators of psychological, cognitive, psychomotor and motor function. Ten young men aged 21.1 ± 1.0 years performed a fast and accurate reaching movement task and a control task. Both tasks were performed for 2 h. Despite decreased motivation and increased perception of effort as well as subjective feeling of fatigue, speed-accuracy motor task performance improved during the whole period of task execution. After the motor task, increased working memory function and prefrontal cortex oxygenation at rest and during conflict detection, and decreased efficiency of incorrect response inhibition and visuomotor tracking, were observed. The speed-accuracy motor task increased the amplitude of motor-evoked potentials, while grip strength was not affected. These findings demonstrate that to sustain performance of a 2-h speed-accuracy task under conditions of self-reported fatigue, task-relevant functions are maintained or even improved, whereas less critical functions are impaired.

  16. Timeliness “at a glance”: assessing the turnaround time through the six sigma metrics.

    Science.gov (United States)

    Ialongo, Cristiano; Bernardini, Sergio

    2016-01-01

    Almost thirty years of systematic analysis have proven the turnaround time to be a fundamental dimension for the clinical laboratory. Several indicators are available to assess and report quality with respect to timeliness, but they sometimes lack communicative immediacy and accuracy. Six Sigma is a paradigm developed within the industrial domain for assessing quality and addressing goals and issues. The sigma level computed through the Z-score method is a simple and straightforward tool which expresses quality on a universal dimensionless scale and can handle non-normal data. Herein we report our preliminary experience in using the sigma level to assess the change in urgent (STAT) test turnaround time due to the implementation of total automation. We found that the Z-score method is a valuable and easy-to-use tool for assessing and communicating the quality level of laboratory timeliness, providing good correspondence with the actual change in efficiency that was retrospectively observed.
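
    The Z-score/sigma-level idea can be outlined as follows: count STAT tests exceeding the acceptable turnaround time, convert the defect proportion to defects per million, and map it to a sigma level through the inverse normal distribution. The TAT values and 60-minute cutoff below are invented, and the 1.5-sigma long-term shift is the usual industrial convention rather than anything specified in the abstract.

```python
# Hedged sketch: converting turnaround-time (TAT) data into a Six Sigma level.
# TAT values and the 60-minute acceptability cutoff are invented; the 1.5-sigma
# long-term shift is the usual industrial convention and may be omitted.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
tat_minutes = rng.lognormal(mean=np.log(40), sigma=0.35, size=5_000)  # synthetic STAT TATs
acceptable_tat = 60.0                                                 # assumed cutoff [min]

defect_rate = np.mean(tat_minutes > acceptable_tat)
dpmo = defect_rate * 1_000_000
sigma_level = norm.ppf(1.0 - defect_rate) + 1.5  # conventional long-term shift

print(f"defects per million: {dpmo:,.0f}, sigma level: {sigma_level:.2f}")
```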

  17. Accuracy, Precision, Ease-Of-Use, and Cost of Methods to Test Ebola-Relevant Chlorine Solutions.

    Directory of Open Access Journals (Sweden)

    Emma Wells

    Full Text Available To prevent transmission in Ebola Virus Disease (EVD) outbreaks, it is recommended to disinfect living things (hands and people) with 0.05% chlorine solution and non-living things (surfaces, personal protective equipment, dead bodies) with 0.5% chlorine solution. In the current West African EVD outbreak, these solutions (manufactured from calcium hypochlorite (HTH), sodium dichloroisocyanurate (NaDCC), and sodium hypochlorite (NaOCl)) have been widely used in both Ebola Treatment Unit and community settings. To ensure solution quality, testing is necessary; however, test method appropriateness for these Ebola-relevant concentrations has not previously been evaluated. We identified fourteen commercially available methods to test Ebola-relevant chlorine solution concentrations, including two titration methods, four DPD dilution methods, and six test strips. We assessed these methods by: (1) determining accuracy and precision by measuring in quintuplicate five different 0.05% and 0.5% chlorine solutions manufactured from NaDCC, HTH, and NaOCl; (2) conducting volunteer testing to assess ease-of-use; and (3) determining costs. Accuracy was greatest in titration methods (reference to 12.4% error compared to the reference method), then DPD dilution methods (2.4-19% error), then test strips (5.2-48% error); precision followed this same trend. Two methods had an accuracy of <10% error across all five chlorine solutions with good precision: Hach digital titration for 0.05% and 0.5% solutions (recommended for contexts with trained personnel and financial resources), and Serim test strips for 0.05% solutions (recommended for contexts where rapid, inexpensive, and low-training-burden testing is needed). Measurement error from test methods not including pH adjustment varied significantly across the five chlorine solutions, which had pH values of 5-11. Volunteers found test strips easiest and titration hardest; costs per 100 tests were $14-37 for test strips and $33-609 for titration.
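
    The accuracy (% error against the reference method) and precision figures quoted above come from quintuplicate measurements; the sketch below shows the arithmetic on fabricated readings for a nominal 0.5% solution, not on the study's data.

```python
# Hedged sketch: % error vs. a reference measurement (accuracy) and coefficient of
# variation (precision) for quintuplicate chlorine readings. Values are fabricated.
import numpy as np

reference_pct = 0.50                                     # reference-method concentration (% chlorine)
readings_pct = np.array([0.47, 0.52, 0.49, 0.55, 0.51])  # five replicate readings of one test method

percent_error = 100.0 * abs(readings_pct.mean() - reference_pct) / reference_pct
cv_percent = 100.0 * readings_pct.std(ddof=1) / readings_pct.mean()

print(f"accuracy: {percent_error:.1f}% error, precision: {cv_percent:.1f}% CV")
```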

  18. SU-E-T-789: Validation of 3DVH Accuracy On Quantifying Delivery Errors Based On Clinical Relevant DVH Metrics

    International Nuclear Information System (INIS)

    Ma, T; Kumaraswamy, L

    2015-01-01

    Purpose: Detection of treatment delivery errors is important in radiation therapy. However, accurate quantification of delivery errors is also of great importance. This study aims to evaluate the 3DVH software's ability to accurately quantify delivery errors. Methods: Three VMAT plans (prostate, H&N and brain) were randomly chosen for this study. First, we evaluated whether delivery errors could be detected by gamma evaluation. Conventional per-beam IMRT QA was performed with the ArcCHECK diode detector for the original plans and for the following modified plans: (1) induced dose difference errors of up to ±4.0%, (2) control point (CP) deletion (3 to 10 CPs were deleted), and (3) gantry angle shift errors (3-degree uniform shift). 2D and 3D gamma evaluations were performed for all plans through SNC Patient and 3DVH, respectively. Subsequently, we investigated the accuracy of 3DVH analysis for all cases. This part evaluated, using the Eclipse TPS plans as the standard, whether 3DVH can accurately model the changes in clinically relevant metrics caused by the delivery errors. Results: 2D evaluation seemed to be more sensitive to delivery errors. The average differences between Eclipse-predicted and 3DVH results for each pair of specific DVH constraints were within 2% for all three types of error-induced treatment plans, illustrating that 3DVH is fairly accurate in quantifying the delivery errors. Another interesting observation was that even though the gamma pass rates for the error plans were high, the DVHs showed significant differences between the original plan and the error-induced plans in both the Eclipse and 3DVH analyses. Conclusion: The 3DVH software is shown to accurately quantify the error in delivered dose based on clinically relevant DVH metrics, where conventional gamma-based pre-treatment QA might not necessarily detect it.
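
    The clinically relevant DVH metrics compared here can be illustrated with a simple percentile calculation, for example D95, the dose covering 95% of a structure; the sketch below compares D95 between a "planned" and a perturbed dose array built from random numbers, only to show the kind of comparison described, not TPS or 3DVH output.

```python
# Hedged sketch: comparing a DVH metric (D95, the dose covering 95% of a structure)
# between a planned and a perturbed dose distribution. Dose arrays are random numbers,
# not TPS or 3DVH output.
import numpy as np

def d95(dose_voxels):
    """Dose received by at least 95% of the voxels = 5th percentile of voxel doses."""
    return np.percentile(dose_voxels, 5)

rng = np.random.default_rng(3)
planned_dose = rng.normal(loc=70.0, scale=1.5, size=50_000)  # Gy, synthetic target voxels
delivered_dose = planned_dose * (1.0 - 0.02)                 # assumed 2% delivery error

diff_pct = 100.0 * (d95(delivered_dose) - d95(planned_dose)) / d95(planned_dose)
print(f"planned D95: {d95(planned_dose):.2f} Gy, delivered D95: {d95(delivered_dose):.2f} Gy, "
      f"difference: {diff_pct:.1f}%")
```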

  19. IFRS Adoption, Firm Traits and Audit Timeliness: Evidence from Nigeria

    Directory of Open Access Journals (Sweden)

    Musa Inuwa Fodio

    2015-06-01

    Full Text Available Audit timeliness is an important ingredient of quality financial reporting. Stale information is of little benefit to stakeholders in their decision-making process. With the recent adoption of the International Financial Reporting Standards in Nigeria, the work of the auditor has seemingly become more complicated. The question then emerges whether such adoption affects the timeliness of audit reports. This study empirically investigates the impact of IFRS adoption and other associated explanatory variables on audit timeliness in Nigerian deposit money banks for the period 2010 to 2013. Panel regression analysis reveals a positive and significant impact of IFRS adoption on audit timeliness. Results also indicate that firm age, firm size and auditor firm type are significant predictors of audit timeliness in Nigerian deposit money banks. The study recommends that audit firms should make stringent efforts to acclimatize to the complexities of the IFRS transition process so as to reduce audit report delays. Reporting agencies should also come up with regulations, deadlines and benchmarks for the issuance of independent audit reports.

  20. Hans Joas & Daniel R. Huebner (eds.), The Timeliness of George Herbert Mead

    OpenAIRE

    Baggio, Guido

    2018-01-01

    The Timeliness of George Herbert Mead is a significant contribution to the recent “Mead renaissance.” It gathers some contributions first presented at the conference celebrating the 150th anniversary of the birth of George Herbert Mead held in April 2013 at the University of Chicago and organized by Hans Joas, Andrew Abbott, Daniel Huebner, and Christopher Takacs. The volume brings scholarship on G. H. Mead up to date highlighting Mead’s relevance for areas of research completely ignored by p...

  1. Vaccine Education During Pregnancy and Timeliness of Infant Immunization.

    Science.gov (United States)

    Veerasingam, Priya; Grant, Cameron C; Chelimo, Carol; Philipson, Kathryn; Gilchrist, Catherine A; Berry, Sarah; Carr, Polly Atatoa; Camargo, Carlos A; Morton, Susan

    2017-09-01

    Pregnant women routinely receive information in support of or opposing infant immunization. We aimed to describe future mothers' immunization information sources and determine whether receiving immunization information is associated with infant immunization timeliness. We analyzed data from a child cohort born 2009-2010 in New Zealand. Pregnant women (N = 6822) at a median gestation of 39 weeks described sources of information encouraging or discouraging infant immunization. Immunizations received by cohort infants were determined through linkage with the National Immunization Register (n = 6682 of 6853 [98%]). Independent associations of immunization information received with immunization timeliness were described by using adjusted odds ratios (ORs) and 95% confidence intervals (CIs). Immunization information sources were described by 6182 of 6822 (91%) women. Of these, 2416 (39%) received information encouraging immunization, 846 (14%) received discouraging information, and 565 (9%) received both encouraging and discouraging information. Compared with infants of women who received no immunization information (71% immunized on time), infants of women who received discouraging information only (57% immunized on time, OR = 0.49, 95% CI 0.38-0.64) or both encouraging and discouraging information (61% immunized on time, OR = 0.51, 95% CI 0.42-0.63) were at decreased odds of receiving all immunizations on time. Receipt of encouraging information only was not associated with infant immunization timeliness (73% immunized on time, OR = 1.00, 95% CI 0.87-1.15). Receipt, during pregnancy, of information against immunization was associated with delayed infant immunization regardless of receipt of information supporting immunization. In contrast, receipt of encouraging information was not associated with infant immunization timeliness. Copyright © 2017 by the American Academy of Pediatrics.
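
    Adjusted odds ratios with 95% CIs of the sort reported here are standard logistic regression output; the sketch below shows how such an estimate is obtained with the statsmodels package on a simulated cohort (the package choice, effect size and sample are assumptions, not the study's).

```python
# Hedged sketch: odds ratio with 95% CI from a logistic regression, as used for the
# associations in this abstract. The cohort below is simulated, not the study's data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 5_000
discouraging_info = rng.binomial(1, 0.2, size=n)
# Simulate lower on-time immunization odds when discouraging information was received.
logit = 0.9 - 0.7 * discouraging_info
on_time = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(discouraging_info)
fit = sm.Logit(on_time, X).fit(disp=False)

odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(f"OR for discouraging info: {odds_ratios[1]:.2f} "
      f"(95% CI {ci[1, 0]:.2f}-{ci[1, 1]:.2f})")
```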

  2. Detection of relevant colonic neoplasms with PET/CT: promising accuracy with minimal CT dose and a standardised PET cut-off

    Energy Technology Data Exchange (ETDEWEB)

    Luboldt, Wolfgang [Multiorgan Screening Foundation, Frankfurt (Germany); University Hospital Frankfurt, Department of Radiology, Frankfurt am Main (Germany); University Hospital Dresden, Clinic and Policlinic of Nuclear Medicine, Dresden (Germany); Volker, Teresa; Zoephel, Klaus; Kotzerke, Joerg [University Hospital Dresden, Clinic and Policlinic of Nuclear Medicine, Dresden (Germany); Wiedemann, Baerbel [University Hospital Dresden, Institute of Medical Informatics and Biometrics, Dresden (Germany); Wehrmann, Ursula [University Hospital Dresden, Clinic and Policlinic of Surgery, Dresden (Germany); Koch, Arne; Abolmaali, Nasreddin [University Hospital Dresden, Oncoray, Dresden (Germany); Toussaint, Todd; Luboldt, Hans-Joachim [Multiorgan Screening Foundation, Frankfurt (Germany); Middendorp, Markus; Gruenwald, Frank [University Hospital Frankfurt, Department of Nuclear Medicine, Frankfurt (Germany); Aust, Daniela [University Hospital Dresden, Department of Pathology, Dresden (Germany); Vogl, Thomas J. [University Hospital Frankfurt, Department of Radiology, Frankfurt am Main (Germany)

    2010-09-15

    To determine the performance of FDG-PET/CT in the detection of relevant colorectal neoplasms (adenomas ≥10 mm, with high-grade dysplasia, cancer) in relation to CT dose and contrast administration and to find a PET cut-off. 84 patients, who underwent PET/CT and colonoscopy (n=79)/sigmoidoscopy (n=5) for (79 x 6+5 x 2)=484 colonic segments, were included in a retrospective study. The accuracy of low-dose PET/CT in detecting mass-positive segments was evaluated by ROC analysis by two blinded independent reviewers relative to contrast-enhanced PET/CT. On a per-lesion basis characteristic PET values were tested as cut-offs. Low-dose PET/CT and contrast-enhanced PET/CT provide similar accuracies (area under the curve for the average ROC ratings 0.925 vs. 0.929, respectively). PET demonstrated all carcinomas (n=23) and 83% (30/36) of relevant adenomas. In all carcinomas and adenomas with high-grade dysplasia (n=10) the SUVmax was ≥5. This cut-off resulted in a better per-segment sensitivity and negative predictive value (NPV) than the average PET/CT reviews (sensitivity: 89% vs. 82%; NPV: 99% vs. 98%). All other tested cut-offs were inferior to the SUVmax. FDG-PET/CT provides promising accuracy for colorectal mass detection. Low dose and lack of iodine contrast in the CT component do not impact the accuracy. The PET cut-off SUVmax ≥5 improves the accuracy. (orig.)

  3. Detection of relevant colonic neoplasms with PET/CT: promising accuracy with minimal CT dose and a standardised PET cut-off

    International Nuclear Information System (INIS)

    Luboldt, Wolfgang; Volker, Teresa; Zoephel, Klaus; Kotzerke, Joerg; Wiedemann, Baerbel; Wehrmann, Ursula; Koch, Arne; Abolmaali, Nasreddin; Toussaint, Todd; Luboldt, Hans-Joachim; Middendorp, Markus; Gruenwald, Frank; Aust, Daniela; Vogl, Thomas J.

    2010-01-01

    To determine the performance of FDG-PET/CT in the detection of relevant colorectal neoplasms (adenomas ≥10 mm, with high-grade dysplasia, cancer) in relation to CT dose and contrast administration and to find a PET cut-off. 84 patients, who underwent PET/CT and colonoscopy (n=79)/sigmoidoscopy (n=5) for (79 x 6+5 x 2)=484 colonic segments, were included in a retrospective study. The accuracy of low-dose PET/CT in detecting mass-positive segments was evaluated by ROC analysis by two blinded independent reviewers relative to contrast-enhanced PET/CT. On a per-lesion basis characteristic PET values were tested as cut-offs. Low-dose PET/CT and contrast-enhanced PET/CT provide similar accuracies (area under the curve for the average ROC ratings 0.925 vs. 0.929, respectively). PET demonstrated all carcinomas (n=23) and 83% (30/36) of relevant adenomas. In all carcinomas and adenomas with high-grade dysplasia (n=10) the SUVmax was ≥5. This cut-off resulted in a better per-segment sensitivity and negative predictive value (NPV) than the average PET/CT reviews (sensitivity: 89% vs. 82%; NPV: 99% vs. 98%). All other tested cut-offs were inferior to the SUVmax. FDG-PET/CT provides promising accuracy for colorectal mass detection. Low dose and lack of iodine contrast in the CT component do not impact the accuracy. The PET cut-off SUVmax ≥ 5 improves the accuracy. (orig.)
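
    The per-segment sensitivity and negative predictive value quoted for the SUVmax ≥ 5 cutoff follow from a simple 2x2 tabulation of segments; the sketch below performs that tabulation on invented segment data, not the study's 484 segments.

```python
# Hedged sketch: per-segment sensitivity and negative predictive value for an
# SUVmax >= 5 cutoff. Segment SUVmax values and reference labels are invented.
import numpy as np

rng = np.random.default_rng(5)
n_segments = 484
has_relevant_lesion = rng.random(n_segments) < 0.1        # synthetic reference standard
suv_max = np.where(has_relevant_lesion,
                   rng.normal(8.0, 2.5, n_segments),      # lesion segments: higher uptake
                   rng.normal(2.5, 1.0, n_segments))      # normal segments

test_positive = suv_max >= 5.0
tp = np.sum(test_positive & has_relevant_lesion)
fn = np.sum(~test_positive & has_relevant_lesion)
tn = np.sum(~test_positive & ~has_relevant_lesion)

sensitivity = tp / (tp + fn)
npv = tn / (tn + fn)
print(f"sensitivity: {sensitivity:.2f}, NPV: {npv:.2f}")
```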

  4. Analysis of timeliness of infectious disease reporting in the Netherlands

    Directory of Open Access Journals (Sweden)

    Kretzschmar Mirjam EE

    2011-05-01

    Full Text Available Abstract Background Timely reporting of infectious disease cases to public health authorities is essential for an effective public health response. To evaluate the timeliness of reporting to the Dutch Municipal Health Services (MHS), we used as quantitative measures the intervals between onset of symptoms and MHS notification, and between laboratory diagnosis and notification, with regard to six notifiable diseases. Methods We retrieved reporting data from June 2003 to December 2008 from the Dutch national notification system for shigellosis, EHEC/STEC infection, typhoid fever, measles, meningococcal disease, and hepatitis A virus (HAV) infection. For each disease, median intervals between date of onset and MHS notification were calculated and compared with the median incubation period. The median interval between date of laboratory diagnosis and MHS notification was similarly analysed. For the year 2008, we also investigated whether timeliness is improved by MHS agreements with physicians and laboratories that allow direct laboratory reporting. Finally, we investigated whether reports made by post, fax, or e-mail were more timely. Results The percentage of infectious disease cases reported within one incubation period varied widely, between 0.4% for shigellosis and 90.3% for HAV infection. Not reported within two incubation periods were 97.1% of shigellosis cases, 76.2% of cases of EHEC/STEC infection, 13.3% of meningococcosis cases, 15.7% of measles cases, and 29.7% of typhoid fever cases. A substantial percentage of infectious disease cases was reported more than three days after laboratory diagnosis, varying between 12% for meningococcosis and 42% for shigellosis. MHS which had agreements with physicians and laboratories showed a significantly shorter notification time compared to MHS without such agreements. Conclusions Over the study period, many cases of the six notifiable diseases were not reported within two incubation periods, and many were
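
    The timeliness measure used here, the onset-to-notification interval compared with the incubation period, can be summarized as in the sketch below; the simulated delays and the 7-day incubation period are invented values for illustration.

```python
# Hedged sketch: summarizing onset-to-notification delays against a disease's
# incubation period, as in the abstract. Delays and incubation period are invented.
import numpy as np

rng = np.random.default_rng(6)
onset_to_notification_days = rng.gamma(shape=2.0, scale=4.0, size=2_000)  # synthetic delays
incubation_period_days = 7.0                                              # assumed median incubation

median_delay = np.median(onset_to_notification_days)
within_one = np.mean(onset_to_notification_days <= incubation_period_days)
within_two = np.mean(onset_to_notification_days <= 2 * incubation_period_days)

print(f"median delay: {median_delay:.1f} d, "
      f"reported within 1 incubation period: {within_one:.0%}, within 2: {within_two:.0%}")
```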

  5. Timeliness of earnings reported by Romanian listed companies

    Directory of Open Access Journals (Sweden)

    Mihai Carp

    2018-03-01

    Full Text Available The paper aims to analyze the quality of financial information by assessing the timeliness of earnings, using information specific to non-financial companies listed on the regulated section of the Bucharest Stock Exchange. The study also seeks to assess the symmetry of actions for the timely recognition of potential gains and losses (components of the economic income) and, if there is an asymmetry, to identify the direction of the temporary gap. The phenomenon is analysed in conjunction with a number of control factors such as the Romanian Accounting Standards (RAS), the International Financial Reporting Standards (IFRS), the degree of indebtedness or the entities' field of activity. Quantitative analysis performed through econometric models established in the field, such as Basu (1997) and Ball and Shivakumar (2005), reveals that the companies included in the study provide financial information that meets the quality criterion assessed, namely earnings timeliness. Deepening the analysis made it possible to identify timely recognition of both unrealised gains and potential losses across the whole sample, with economic losses being included in accounting income earlier than economic gains are recognised. The presence of disjunctive factors in the analysis generated a number of particular results. In the case of normally indebted companies that apply IFRS, timely recognition of economic gains and losses was noted, without the gap specific to conservatism.
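
    The Basu (1997) specification referred to in the abstract is conventionally written as a piecewise-linear regression of earnings on returns; a standard textbook form is given below, using the usual notation of this literature rather than necessarily the paper's own symbols.

```latex
% Standard Basu (1997) earnings-timeliness regression (conventional notation):
%   E_{it}  = earnings of firm i in year t scaled by lagged price,
%   R_{it}  = stock return over the fiscal year,
%   D_{it}  = 1 if R_{it} < 0 (bad news), 0 otherwise.
E_{it} = \beta_0 + \beta_1 D_{it} + \beta_2 R_{it} + \beta_3 D_{it} R_{it} + \varepsilon_{it}
% \beta_2 captures the timeliness of good-news recognition; a positive \beta_3 indicates
% that bad news (economic losses) is recognised in earnings more quickly than good news.
```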

  6. Improving the timeliness of procedures in a pediatric endoscopy suite.

    Science.gov (United States)

    Tomer, Gitit; Choi, Steven; Montalvo, Andrea; Sutton, Sheila; Thompson, John; Rivas, Yolanda

    2014-02-01

    Pediatric endoscopic procedures are essential in the evaluation and treatment of gastrointestinal diseases in children. Although pediatric endoscopists are greatly interested in increasing efficiency and through-put in pediatric endoscopy units, there is scarcely any literature on this critical process. The goal of this study was to improve the timeliness of pediatric endoscopy procedures at Children's Hospital at Montefiore. In June 2010, a pediatric endoscopy quality improvement initiative was formed at Children's Hospital at Montefiore. We identified patient-, equipment-, and physician-related causes for case delays. Pareto charts, cause and effect diagrams, process flow mapping, and statistical process control charts were used for analysis. From June 2010 to December 2012, we were able to significantly decrease the first case endoscopy delay from an average of 17 to 10 minutes (P < .001), second case delay from 39 to 25 minutes (P = .01), third case delay from 61 to 45 minutes (P = .05), and fourth case delay from 79 to 51 minutes (P = .05). Total delay time decreased from 196 to 131 minutes, resulting in a reduction of 65 minutes (P = .02). From June 2010 to August 2011 (preintervention period), an average of 36% of first endoscopy cases started within 5 minutes, 51% within 10 minutes, and 61% within 15 minutes of the scheduled time. From September 2011 to December 2012 (postintervention period), the percentage of cases starting within 5 minutes, 10 minutes, and 15 minutes increased to 47% (P = .07), 61% (P = .04), and 79% (P = .01), respectively. Applying quality improvement methods and tools helped improve pediatric endoscopy timeliness and significantly decreased total delays.

  7. An Architecture for Improving Timeliness and Relevance of Cyber Incident Notifications

    Science.gov (United States)

    2011-03-01

    the difference between a beginning chess player, an experienced amateur, and a grand master. The beginner sees what his opponent is doing, but is... supplemented sparingly with traditional flowcharts where additional detail is desired. These five are a Use Case diagram, a Class diagram... Figure 35 provides a flowchart example of this process.

  8. The Challenge To Tactical Reconnaissance: Timeliness Through Technology

    Science.gov (United States)

    Stromfors, Richard D.

    1984-12-01

    As you have no doubt gathered from Mr. Henkel's introduction, I have spent over 20 years of my Air Force career involved in the reconnaissance mission either as a tactical reconnaissance pilot, as a tactical reconnaissance inspector, as a writer and speaker on that subject while attending the Air Force Professional Military Education Schools, and currently as the Air Force's operational manager for reconnaissance aircraft. In all of those positions, I've been challenged many times over with what appeared, at first, to be insurmountable problems that upon closer examination weren't irresolvable after all. All of these problems pale, however, when viewed side-by-side with the one challenge that has faced me since I began my military career and, in fact, faces all of us as I talk with you today. That one challenge is the problem of timeliness. Better put: "Getting information to our customers firstest with the mostest." Together we must develop better platforms and sensors to cure this age-old "Achilles heel" in the reconnaissance cycle. Despite all of our best intentions, despite all of the emerging technologies that will be available, and despite all of the dollars that we've thrown at research and development, we in the reconnaissance business still haven't done a good job in this area. We must do better.

  9. Timeliness of notification systems for infectious diseases: A systematic literature review.

    NARCIS (Netherlands)

    Swaan, Corien; van den Broek, Anouk; Kretzschmar, Mirjam; Richardus, Jan Hendrik

    2018-01-01

    Timely notification of infectious diseases is crucial for prompt response by public health services. Adequate notification systems facilitate timely notification. A systematic literature review was performed to assess outcomes of studies on notification timeliness and to determine which aspects of

  10. TIMELINESS LAPORAN KEUANGAN DI INDONESIA (STUDI EMPIRIS TERHADAP EMITEN BURSA EFEK JAKARTA)

    OpenAIRE

    Michell Suharli; Sofyan S. Harahap

    2008-01-01

    This research examines variables that are predicted to influence the timeliness of financial statements in Indonesia. The factors predicted to influence timeliness in this research are four: firm scale, profitability, Big 4 worldwide accounting firm, and securities return. The research examines the financial statements of 30 companies listed on the Jakarta Stock Exchange for the periods ended December 31, 2002 until December 31, 2003. Data is collected from the Jakarta Stock Exchange and Indon...

  11. Timeliness Laporan Keuangan di Indonesia (Studi Empiris terhadap Emiten Bursa Efek Jakarta)

    OpenAIRE

    Suharli, Michell; Harahap, Sofyan S

    2008-01-01

    This research examines variables that are predicted to influence the timeliness of financial statements in Indonesia. The factors predicted to influence timeliness in this research are four: firm scale, profitability, Big 4 worldwide accounting firm, and securities return. The research examines the financial statements of 30 companies listed on the Jakarta Stock Exchange for the periods ended December 31, 2002 until December 31, 2003. Data is collected from the Jakarta Stock Exchange and Indon...

  12. Physician peer group characteristics and timeliness of breast cancer surgery.

    Science.gov (United States)

    Bachand, Jacqueline; Soulos, Pamela R; Herrin, Jeph; Pollack, Craig E; Xu, Xiao; Ma, Xiaomei; Gross, Cary P

    2018-04-24

    Little is known about how the structure of interdisciplinary groups of physicians affects the timeliness of the breast cancer surgery their patients receive. We used social network methods to examine variation in surgical delay across physician peer groups and the association of this delay with group characteristics. We used linked Surveillance, Epidemiology, and End Results-Medicare data to construct physician peer groups based on shared breast cancer patients. We used hierarchical generalized linear models to examine the association of three group characteristics, patient racial composition, provider density (the ratio of actual to potential connections between physicians), and provider transitivity (clustering of providers within groups), with delayed surgery. The study sample included 8338 women with breast cancer in 157 physician peer groups. Surgical delay varied widely across physician peer groups (interquartile range 28.2-50.0%). For every 10% increase in the percentage of black patients in a peer group, there was a 41% increase in the odds of delayed surgery for women in that peer group, regardless of a patient's own race [odds ratio (OR) 1.41, 95% confidence interval (CI) 1.15-1.73]. Women in physician peer groups with the highest provider density were less likely to receive delayed surgery than those in physician peer groups with the lowest provider density (OR 0.65, 95% CI 0.44-0.98). We did not find an association between provider transitivity and delayed surgery. The likelihood of surgical delay varied substantially across physician peer groups and was associated with provider density and patient racial composition.
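
    Provider density, the ratio of actual to potential connections among physicians in a peer group, is the standard graph density measure; the sketch below computes it with the networkx package (a library choice assumed here) for a small made-up patient-sharing network.

```python
# Hedged sketch: density of a physician peer group (actual / potential connections),
# using networkx on a made-up patient-sharing network.
import networkx as nx

peer_group = nx.Graph()
# Edges connect physicians who shared at least one breast cancer patient (hypothetical).
peer_group.add_edges_from([
    ("surgeon_A", "oncologist_B"),
    ("surgeon_A", "radiologist_C"),
    ("oncologist_B", "radiologist_C"),
    ("surgeon_D", "oncologist_B"),
])

n = peer_group.number_of_nodes()
potential_edges = n * (n - 1) / 2
print(f"actual edges: {peer_group.number_of_edges()}, potential: {potential_edges:.0f}, "
      f"density: {nx.density(peer_group):.2f}")
```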

  13. Pengaruh Faktor Internal dan Eksternal Perusahaan Terhadap Audit Delay dan Timeliness

    Directory of Open Access Journals (Sweden)

    Sistya Rachmawati

    2008-01-01

    Full Text Available The objective of this research is to investigate the influence of firm size, profitability, solvency, public accounting firm size and the existence of an internal audit division on audit delay and timeliness in manufacturing companies listed on the Jakarta Stock Exchange. The research sample was taken from fifty-nine companies listed on the Jakarta Stock Exchange, selected using the purposive sampling method. Hypotheses were analysed using multiple regression; before hypothesis testing, data normality was checked with a P-plot test. The multiple regression model shows that audit delay is influenced by firm size and public accounting firm size, and timeliness is influenced by firm size and solvency. This result is recommended for auditors seeking to increase the effectiveness and efficiency of their audit performance, and for existing studies contributing to the current literature on auditing. Abstract in Bahasa Indonesia (translated): This study aims to measure the influence of internal factors, namely profitability, solvency, internal auditor and firm size, and of an external factor, namely the size of the public accounting firm (KAP), on audit delay and timeliness in manufacturing companies listed on the Jakarta Stock Exchange. Samples were selected using the purposive sampling method. The multiple regression results for audit delay show a coefficient of determination (adjusted R2) of 0.123, meaning that the independent variables (profitability, solvency, internal auditor, firm size, and KAP) explain only 12.3% of the variation in the dependent variable (audit delay). For timeliness, the same independent variables explain 7.9% of the variation in the dependent variable (timeliness). The results of this study can help the public accounting profession improve the efficiency and effectiveness of the audit process

  14. Verification of Positional Accuracy of ZVS3003 Geodetic Control ...

    African Journals Online (AJOL)

    The International GPS Service (IGS) has provided GPS orbit products to the scientific community with increased precision and timeliness. Many users interested in geodetic positioning have adopted the IGS precise orbits to achieve centimeter level accuracy and ensure long-term reference frame stability. Positioning with ...

  15. Providing Mailing Cost Reimbursements: The Effect on Reporting Timeliness of Sexually Transmitted Diseases in Virginia.

    Science.gov (United States)

    Vasiliu, Oana E; Stover, Jeffrey A; Mays, Marissa J E; Bissette, Jennifer M; Dolan, Carrie B; Sirbu, Corina M

    2009-01-01

    We investigated the effect of providing mailing cost reimbursements to local health departments on the timeliness of the reporting of sexually transmitted diseases (STDs) in Virginia. The Division of Disease Prevention, Virginia Department of Health, provided mailing cost reimbursements to 31 Virginia health districts from October 2002 to December 2004. The difference (in days) between the diagnosis date (or the date the STD paperwork was initiated) and the date the case/STD report was entered into the STD surveillance database was used in a negative binomial regression model against time (divided into three periods: before, during, and after reimbursement) to estimate the effect of providing mailing cost reimbursements on reporting timeliness. We observed significant decreases in the number of days between diagnosis and reporting of a case, which were sustained after the reimbursement period ended, in 25 of the 31 health districts included in the analysis. We observed a significant initial decrease (during the reimbursement period) followed by a significant increase in the after-reimbursement phase in one health district. Two health districts had a significant initial decrease, while one health district had a significant decrease in reporting timeliness in the period after reimbursement. Two health districts showed no significant changes in the number of days to report to the central office. Providing reimbursements for mailing costs was statistically associated with improved STD reporting timeliness in almost all of Virginia's health districts. Sustained improvement after the reimbursement period ended is likely indicative of improved local health department reporting habits.
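
    A negative binomial regression of reporting delay on reporting period, in the spirit of the model described, can be sketched with statsmodels as below; the delays, period means, and package choice are illustrative assumptions, not the Virginia data.

```python
# Hedged sketch: negative binomial regression of reporting delay (days) on reporting
# period (before / during / after reimbursement). All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
periods = rng.choice(["before", "during", "after"], size=3_000)
base_mean = {"before": 14.0, "during": 9.0, "after": 10.0}       # assumed mean delays
delay_days = rng.poisson([base_mean[p] for p in periods])        # crude count outcome

df = pd.DataFrame({"delay": delay_days, "period": periods})
X = pd.get_dummies(df["period"], drop_first=True).astype(float)  # one period dropped as reference
X = sm.add_constant(X)
model = sm.GLM(df["delay"], X, family=sm.families.NegativeBinomial()).fit()
print(np.exp(model.params))   # rate ratios relative to the reference period
```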

  16. Timeliness of notification systems for infectious diseases: A systematic literature review.

    Science.gov (United States)

    Swaan, Corien; van den Broek, Anouk; Kretzschmar, Mirjam; Richardus, Jan Hendrik

    2018-01-01

    Timely notification of infectious diseases is crucial for prompt response by public health services. Adequate notification systems facilitate timely notification. A systematic literature review was performed to assess outcomes of studies on notification timeliness and to determine which aspects of notification systems are associated with timely notification. Articles reviewing timeliness of notifications published between 2000 and 2017 were searched in Pubmed and Scopus. Using a standardized notification chain, timeliness of the reporting system in each article was defined as either sufficient (≥ 80% of notifications in time), partly sufficient (≥ 50-80%), or insufficient (< 50%). Electronic reporting systems were compared with conventional methods (postal mail, fax, telephone, email) and mobile phone reporting. 48 articles were identified. In almost one third of the studies with a predefined timeframe (39), timeliness of notification systems was either sufficient or insufficient (11/39, 28% and 12/39, 31% resp.). Applying the standardized timeframe (45 studies) revealed similar outcomes (13/45, 29%, sufficient notification timeframe, vs 15/45, 33%, insufficient). The disease-specific timeframe was not met by any study. Systems involving reporting by laboratories most often complied sufficiently with predefined or standardized timeframes. Outcomes were not related to electronic or conventional notification systems or mobile phone reporting. Electronic systems were faster in comparative studies (10/13); this hardly resulted in sufficient timeliness, according to either predefined or standardized timeframes. A minority of notification systems meets either predefined, standardized or disease-specific timeframes. Systems including laboratory reporting are associated with timely notification. Electronic systems reduce reporting delay, but implementation needs considerable effort to comply with notification timeframes. During outbreak threats, patient, doctor and laboratory testing delays need to

  17. Analytical Performance Requirements for Systems for Self-Monitoring of Blood Glucose With Focus on System Accuracy: Relevant Differences Among ISO 15197:2003, ISO 15197:2013, and Current FDA Recommendations.

    Science.gov (United States)

    Freckmann, Guido; Schmid, Christina; Baumstark, Annette; Rutschmann, Malte; Haug, Cornelia; Heinemann, Lutz

    2015-07-01

    In the European Union (EU), the ISO (International Organization for Standardization) 15197 standard is applicable for the evaluation of systems for self-monitoring of blood glucose (SMBG) before the market approval. In 2013, a revised version of this standard was published. Relevant revisions in the analytical performance requirements are the inclusion of the evaluation of influence quantities, for example, hematocrit, and some changes in the testing procedures for measurement precision and system accuracy evaluation, for example, number of test strip lots. Regarding system accuracy evaluation, the most important change is the inclusion of more stringent accuracy criteria. In 2014, the Food and Drug Administration (FDA) in the United States published their own guidance document for the premarket evaluation of SMBG systems with even more stringent system accuracy criteria than stipulated by ISO 15197:2013. The establishment of strict accuracy criteria applicable for the premarket evaluation is a possible approach to further improve the measurement quality of SMBG systems. However, the system accuracy testing procedure is quite complex, and some critical aspects, for example, systematic measurement difference between the reference measurement procedure and a higher-order procedure, may potentially limit the apparent accuracy of a given system. Therefore, the implementation of a harmonized reference measurement procedure for which traceability to standards of higher order is verified through an unbroken, documented chain of calibrations is desirable. In addition, the establishment of regular and standardized post-marketing evaluations of distributed test strip lots should be considered as an approach toward an improved measurement quality of available SMBG systems. © 2015 Diabetes Technology Society.
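
    ISO 15197:2013 system accuracy is commonly summarized as the share of meter results within ±15 mg/dL of the comparison method for reference glucose below 100 mg/dL and within ±15% at or above 100 mg/dL, with at least 95% required to comply; the sketch below applies that check to fabricated paired readings (the limits are the commonly cited 2013 criteria, the data are invented).

```python
# Hedged sketch: ISO 15197:2013-style system accuracy check for SMBG results,
# with the commonly cited limits of +/-15 mg/dL below 100 mg/dL reference glucose
# and +/-15% at or above 100 mg/dL (>= 95% of results must comply).
# The paired readings are fabricated.
import numpy as np

rng = np.random.default_rng(8)
reference = rng.uniform(40, 400, size=600)            # comparison-method glucose [mg/dL]
meter = reference * rng.normal(1.0, 0.05, size=600)   # simulated meter readings

low = reference < 100
within = np.where(low,
                  np.abs(meter - reference) <= 15,                # absolute criterion below 100 mg/dL
                  np.abs(meter - reference) <= 0.15 * reference)  # relative criterion otherwise

print(f"{within.mean():.1%} of results within the accuracy limits "
      f"(requirement: at least 95%)")
```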

  18. Accuracy and precision of glucose monitoring are relevant to treatment decision-making and clinical outcome in hospitalized patients with diabetes.

    Science.gov (United States)

    Voulgari, Christina; Tentolouris, Nicholas

    2011-07-01

    The accuracy and precision of three blood glucose meters (BGMs) were evaluated in 600 hospitalized patients with type 1 (n = 200) or type 2 (n = 400) diabetes. Capillary blood glucose values were analyzed with Accu-Chek(®) Aviva [Roche (Hellas) S.A., Maroussi, Greece], Precision-Xceed(®) [Abbott Laboratories (Hellas) S.A., Alimos, Greece], and Glucocard X-Sensor(®) (Menarini Diagnostics S.A., Argyroupolis, Greece). At the same time plasma glucose was analyzed using the World Health Organization's glucose oxidase method. Median plasma glucose values (141.2 [range, 13-553] mg/dL) were significantly different from that produced by the BGMs (P diabetes patients. In all cases, the BGMs were unreliable in sensing hypoglycemia. Multivariate linear regression analysis demonstrated that low blood pressure and hematocrit significantly affected glucose measurements obtained with all three BGMs (P diabetes patients, all three frequently used BGMs undersensed hypoglycemia and oversensed hyperglycemia to some extent. Patients and caregivers should be aware of these restrictions of the BGMs.

  19. Vaccination coverage and timeliness in three South African areas: a prospective study

    Directory of Open Access Journals (Sweden)

    Sanders David

    2011-05-01

    Full Text Available Abstract Background Timely vaccination is important to induce adequate protective immunity. We measured vaccination timeliness and vaccination coverage in three geographical areas in South Africa. Methods This study used vaccination information from a community-based cluster-randomized trial promoting exclusive breastfeeding in three South African sites (Paarl in the Western Cape Province, and Umlazi and Rietvlei in KwaZulu-Natal) between 2006 and 2008. Five interview visits were carried out between birth and up to 2 years of age (median follow-up time 18 months), and 1137 children were included in the analysis. We used Kaplan-Meier time-to-event analysis to describe vaccination coverage and timeliness in line with the Expanded Program on Immunization for the first eight vaccines. These included Bacillus Calmette-Guérin (BCG), four oral polio vaccines and three doses of the pentavalent vaccine, which protects against diphtheria, pertussis, tetanus, hepatitis B and Haemophilus influenzae type B. Results The proportion receiving all eight recommended vaccines was 94% in Paarl (95% confidence interval [CI] 91-96), 62% in Rietvlei (95% CI 54-68) and 88% in Umlazi (95% CI 84-91). Slightly fewer children received all vaccines within the recommended time periods. The situation was worst for the last pentavalent and oral polio vaccines. The hazard ratio for incomplete vaccination was 7.2 (95% CI 4.7-11) for Rietvlei compared to Paarl. Conclusions There were large differences between the South African sites in terms of vaccination coverage and timeliness, with the poorer area of Rietvlei performing worse than the better-off area of Paarl. Vaccination coverage was lower for the vaccines given at an older age. There is a need for continued efforts to improve vaccination coverage and timeliness, in particular in rural areas. Trial registration number ClinicalTrials.gov: NCT00397150
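
    The Kaplan-Meier time-to-event analysis of age at vaccination used here can be outlined with the lifelines package as below; the simulated ages, censoring rule and the 6-week scheduled age are assumptions for illustration only.

```python
# Hedged sketch: Kaplan-Meier estimate of cumulative vaccination "coverage" by age,
# in the spirit of the time-to-event analysis described. All data are simulated.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(9)
n = 1_000
age_at_vaccination_weeks = rng.gamma(shape=3.0, scale=3.0, size=n)    # synthetic ages at first dose
vaccinated = rng.random(n) < 0.92                                     # some children never observed vaccinated
follow_up_weeks = np.where(vaccinated, age_at_vaccination_weeks, 78)  # censor unvaccinated at 18 months

kmf = KaplanMeierFitter()
kmf.fit(durations=follow_up_weeks, event_observed=vaccinated)

scheduled_age = 6.0  # assumed scheduled age for the first dose [weeks]
coverage_by_schedule = 1.0 - kmf.survival_function_at_times(scheduled_age).iloc[0]
print(f"estimated coverage by {scheduled_age:.0f} weeks: {coverage_by_schedule:.1%}")
```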

  20. Timeliness of Surveillance during Outbreak of Shiga Toxin–producing Escherichia coli Infection, Germany, 2011

    OpenAIRE

    Altmann, Mathias; Wadl, Maria; Altmann, Doris; Benzler, Justus; Eckmanns, Tim; Krause, Gérard; Spode, Anke; an der Heiden, Matthias

    2011-01-01

    In the context of a large outbreak of Shiga toxin–producing Escherichia coli O104:H4 in Germany, we quantified the timeliness of the German surveillance system for hemolytic uremic syndrome and Shiga toxin–producing E. coli notifiable diseases during 2003–2011. Although reporting occurred faster than required by law, potential for improvement exists at all levels of the information chain.

  1. Timeliness of Surveillance during Outbreak of Shiga Toxin–producing Escherichia coli Infection, Germany, 2011

    Science.gov (United States)

    Wadl, Maria; Altmann, Doris; Benzler, Justus; Eckmanns, Tim; Krause, Gérard; Spode, Anke; an der Heiden, Matthias

    2011-01-01

    In the context of a large outbreak of Shiga toxin–producing Escherichia coli O104:H4 in Germany, we quantified the timeliness of the German surveillance system for hemolytic uremic syndrome and Shiga toxin–producing E. coli notifiable diseases during 2003–2011. Although reporting occurred faster than required by law, potential for improvement exists at all levels of the information chain. PMID:22000368

  2. The effect of fair disclosure regulation on timeliness and informativeness of earnings announcements

    Directory of Open Access Journals (Sweden)

    Yeonhee Park

    2013-03-01

    Full Text Available This paper examines the effect of Korea’s fair disclosure regulation on the timeliness and informativeness of earnings announcements. The present regulation for Korean listed firms requires that if a company’s sales revenue, operating income (or loss) and net income (or loss) have changed by over 30% compared to the prior year, the firm must disclose this information through a preliminary financial report (PFR) even before the company is audited by external auditors. To analyze the effects of this policy, we first investigate the timeliness of preliminary financial report disclosures. We examine the extent to which Korean listed companies actually comply with the requirement for prompt notification of information concerning material changes in financial performance. Second, we investigate the informativeness of preliminary financial reports by analyzing differential stock market reactions to different timings of preliminary financial report disclosures. Our empirical results reveal that more than half of our sample firms release their preliminary financial reports after external audits are completed, thereby potentially invalidating the effectiveness of the regulation. In addition, we find that preliminary financial reports have information value only if they are disclosed prior to annual audit report dates. This finding supports the notion that timeliness increases the informativeness of preliminary financial report disclosure by curbing insiders’ ability to potentially profit from their information advantage.

  3. Validation of the Six Sigma Z-score for the quality assessment of clinical laboratory timeliness.

    Science.gov (United States)

    Ialongo, Cristiano; Bernardini, Sergio

    2018-03-28

    The International Federation of Clinical Chemistry and Laboratory Medicine has recently introduced the turnaround time (TAT) as a mandatory quality indicator for the postanalytical phase. Classic TAT indicators, namely the average, median, 90th percentile and proportion of acceptable tests (PAT), have been in use for almost 40 years and to date represent the mainstay for gauging laboratory timeliness. In this study, we investigated the performance of the Six Sigma Z-score, which was previously introduced as a device for the quantitative assessment of timeliness. A numerical simulation was obtained by modeling an actual TAT data set using the log-logistic probability density function (PDF). Five thousand replicates for each size of the artificial TAT random sample (n=20, 50, 250 and 1000) were generated, and different laboratory conditions were simulated by manipulating the PDF in order to generate more or less variable data. The Z-score and the classic TAT indicators were assessed for precision (%CV), robustness toward right-tailing (precision at different levels of sample variability), sensitivity and specificity. The Z-score showed sensitivity and specificity comparable to PAT (≈80% with n≥250), but superior precision, which remained within 20% even for moderately small samples (n≥50); furthermore, the Z-score was less affected by the value of the cutoff used for setting the acceptable TAT, as well as by the sample variability that was reflected in the magnitude of right-tailing. The Z-score was a valid indicator of laboratory timeliness and a suitable device to improve as well as to maintain the achieved quality level.
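
    The abstract does not give the exact Z-score formula, so the sketch below pairs the classic TAT indicators with one common Six Sigma-style conversion (Z as the inverse-normal transform of the proportion of acceptable tests) on synthetic log-logistic TAT data; the cutoff, sample size and distribution parameters are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic turnaround times (minutes); real TAT data are right-tailed, which
# the paper models with a log-logistic distribution ("fisk" in SciPy).
tat = stats.fisk.rvs(c=3.0, scale=45.0, size=250, random_state=rng)
cutoff = 60.0  # acceptable TAT, an assumed target for illustration

# Classic indicators.
mean_tat, median_tat = tat.mean(), np.median(tat)
p90 = np.percentile(tat, 90)
pat = (tat <= cutoff).mean()            # proportion of acceptable tests

# One common Six Sigma-style conversion (an assumption, not necessarily the
# paper's exact formula): Z-score = inverse normal CDF of the acceptable fraction.
z_score = stats.norm.ppf(pat)

print(f"mean={mean_tat:.1f}  median={median_tat:.1f}  P90={p90:.1f}  "
      f"PAT={pat:.1%}  Z={z_score:.2f}")
```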

  4. In the journalism, between timeliness and recurrence: a long term event

    Directory of Open Access Journals (Sweden)

    Angela Zamin

    2011-12-01

    Full Text Available The text presents an analysis exercise concerning the production of a long-term event which, through its presence over time, allows the observation of timeliness and recurrence. It examines what was produced by the Colombian reference newspaper El Tiempo, between March 2008 and March 2010, on the diplomatic crisis between Colombia and Ecuador triggered by the Colombian military incursion into Ecuadorian territory. The analysis also considers the problematic fields that emerge and the return of meaning frames provoked by events that succeed each other.

  5. A case for inherent geometric and geodetic accuracy in remotely sensed VNIR and SWIR imaging products

    Science.gov (United States)

    Driver, J. M.

    1982-01-01

    Significant aberrations can occur in acquired images which, unless compensated on board the spacecraft, can seriously impair throughput and timeliness for typical Earth observation missions. Conceptual compensation options are advanced to enable acquisition of images with inherent geometric and geodetic accuracy. Research needs are identified which, when implemented, can provide inherently accurate images. Aggressive pursuit of these research needs is recommended.

  6. A strategy for optimizing staffing to improve the timeliness of inpatient phlebotomy collections.

    Science.gov (United States)

    Morrison, Aileen P; Tanasijevic, Milenko J; Torrence-Hill, Joi N; Goonan, Ellen M; Gustafson, Michael L; Melanson, Stacy E F

    2011-12-01

    The timely availability of inpatient test results is a key to physician satisfaction with the clinical laboratory, and in an institution with a phlebotomy service may depend on the timeliness of blood collections. In response to safety reports filed for delayed phlebotomy collections, we applied Lean principles to the inpatient phlebotomy service at our institution. Our goal was to improve service without using additional resources by optimizing our staffing model. To evaluate the effect of a new phlebotomy staffing model on the timeliness of inpatient phlebotomy collections. We compared the median time of morning blood collections and average number of safety reports filed for delayed phlebotomy collections during a 6-month preimplementation period and 5-month postimplementation period. The median time of morning collections was 17 minutes earlier after implementation (7:42 am preimplementation; interquartile range, 6:27-8:48 am; versus 7:25 am postimplementation; interquartile range, 6:20-8:26 am). The frequency of safety reports filed for delayed collections decreased 80% from 10.6 per 30 days to 2.2 per 30 days. Reallocating staff to match the pattern of demand for phlebotomy collections throughout the day represents a strategy for improving the performance of an inpatient phlebotomy service.

  7. Application of EMCAS timeliness model to the safeguards/facility interface

    International Nuclear Information System (INIS)

    Eggers, R.F.; Giese, E.W.

    1987-01-01

    The Hanford operating contractor has developed a timeliness model for periodic mass balance tests (MBTs) for loss of special nuclear material (SNM). The purpose of the model is to compute the probability that an adversary will be detected by a periodic MBT before he could escape from a facility with stolen SNM using stealth and deceit to avoid detection. The model considers (a) the loss detection sensitivity of the MBT, (b) the time between MBTs, and (c) the statistical distribution of the total time required to complete stealth and deceit strategies. The model shows whether or not it is cost-effective to conduct frequent MBTs for loss and where improvements should be made. The Evaluation Methods for Material Control and Accountability Safeguards Systems (EMCAS) timeliness model computes the loss detection capability of periodic materials control and accounting (MC&A) tests in terms of (a) the ability of the test to detect the specified target quantity and (b) the probability that the MC&A test will occur before the adversary can complete the sequence of stealth and deceit strategies required to avoid detection.
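
    A hedged Monte Carlo sketch of the kind of timeliness calculation described: the probability that a periodic MBT detects a loss before the adversary completes a stealth-and-deceit strategy, combining per-test detection sensitivity, the MBT interval, and an assumed completion-time distribution. The lognormal parameters, 30-day interval and 0.9 sensitivity are hypothetical, and only the first post-theft test is modeled.

```python
import numpy as np
from scipy import stats

def detection_probability(p_detect, mbt_interval_days, completion_time_dist,
                          n=100_000, seed=1):
    """Monte Carlo sketch of a periodic-MBT timeliness calculation.

    p_detect             -- probability a single MBT detects the loss (sensitivity)
    mbt_interval_days    -- days between successive mass balance tests
    completion_time_dist -- frozen SciPy distribution of the adversary's total
                            stealth-and-deceit completion time (days)
    """
    rng = np.random.default_rng(seed)
    # Theft occurs at a random point within the MBT interval.
    time_to_next_test = rng.uniform(0, mbt_interval_days, size=n)
    completion_time = completion_time_dist.rvs(size=n, random_state=rng)
    # Detected if the next test occurs before the adversary finishes AND the
    # test flags the loss; later tests are ignored for simplicity.
    detected = (time_to_next_test < completion_time) & (rng.random(n) < p_detect)
    return detected.mean()

# Hypothetical numbers purely for illustration.
adversary_time = stats.lognorm(s=0.5, scale=20)   # median ~20 days to complete
print(detection_probability(p_detect=0.9, mbt_interval_days=30,
                            completion_time_dist=adversary_time))
```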

  8. THE TIMELINESS OF FINANCIAL REPORTING IN THE CONTEXT OF EUROPEAN UNION’S EMERGING ECONOMIES

    Directory of Open Access Journals (Sweden)

    Andra GAJEVSZKY

    2013-12-01

    Full Text Available Purpose- This research aims to investigate the timeliness of financial statements of companies across the European Union's emerging economies. Research Design- From the emerging economies of the European Union, the following sample was constituted: the companies listed on the Bucharest Stock Exchange, Warsaw Stock Exchange, Prague Stock Exchange and Budapest Stock Exchange, regardless of tier. The final sample, after eliminating financial institutions and entities which were not listed in all the studied years (2008-2012), consists of 37 companies. Findings- When comparing the results of this research with those from prior literature, a slight decrease in reporting delay can be noticed in the analyzed emerging economies. Moreover, consistent with other researchers' findings, companies audited by a Big 4 auditor and with a qualified opinion in the auditor's report publish their financial results later than entities which have a favourable audit opinion. Value/Practical Implications- This study highlights the importance of financial statements' timeliness in the context of four European Union emerging economies, which are known for their delay in publishing financial results compared to mature market economies.

  9. Inequity in Timeliness of MMR Vaccination in Children Living in the Suburbs of Iranian Cities.

    Science.gov (United States)

    Jadidi, Rahmatollah; Mohammadbeigi, Abolfazl; Mohammadsalehi, Narges; Ansari, Hossein; Ghaderi, Ebrahim

    2015-06-01

    High coverage of immunization is one of the indicators of good performance of a health system, but timely vaccination is another indicator, which is associated with the protective effect of vaccines. The present study aimed at evaluating inequity in timely vaccination, with a focus on inequities in timeliness by gender, birth order, parents' education and place of residence (rural or urban). A historical cohort study was conducted on children 24-47 months of age who were living in the suburbs of big cities in Iran and were selected through a stratified proportional sampling method. Only children who had vaccine cards - i.e., 3610 children - were included in the data analysis. The primary outcome was age-appropriate vaccination with MMR1. Inequity was measured by the Concentration Index (C) and the Relative Index of Inequity (RII). Inequity indexes were calculated according to the mother's and father's education, child birth order, child's sex and the family's place of residence at the time of vaccination. The overall on-time MMR1 vaccination was 70% and 54.4% for Iranians and non-Iranians, respectively. The C index of mother's and father's education for timely MMR vaccination was 0.023 and 0.029 in Iranian children, and 0.044 and 0.019 for non-Iranians, respectively. The C index according to birth order was 0.025 in Iranians and 0.078 in non-Iranians. For children who lived in cities, on-time vaccination was 0.36% and 0.29% higher than in rural areas, and in male children it was 0.12% and 0.14% higher than in female children, for Iranians and non-Iranians, respectively. Timeliness of MMR vaccination in Iranian children is higher than in non-Iranian children. Regarding the observed differences in timely vaccination rates in Iranian and non-Iranian children, no evidence of inequity was observed by parents' education, birth order, gender or place of residence. So, increasing timeliness of vaccination for enhancing the protective effect

  10. Novel combined patient instruction and discharge summary tool improves timeliness of documentation and outpatient provider satisfaction

    Directory of Open Access Journals (Sweden)

    Meredith Gilliam

    2017-03-01

    Full Text Available Background: Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. Objective: To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. Methods: In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. Results: The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). Conclusion: The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process.

  11. Timeliness vaccination of measles containing vaccine and barriers to vaccination among migrant children in East China.

    Directory of Open Access Journals (Sweden)

    Yu Hu

    Full Text Available BACKGROUND: The reported coverage rates of the first and second doses of measles containing vaccine (MCV) are almost 95% in China, while measles cases are constantly being reported. This study evaluated the vaccine coverage, timeliness, and barriers to immunization for MCV1 and MCV2 in children aged 8-48 months. METHODS: We assessed 718 children aged 8-48 months, of whom 499 were aged 18-48 months, in September 2011. Face-to-face interviews were administered with children's mothers to estimate MCV1 and MCV2 coverage rates, their timeliness and barriers to vaccine uptake. RESULTS: The coverage rates were 76.9% for MCV1 and 44.7% for MCV2 on average. Only 47.5% of surveyed children received MCV1 on time, with vaccination postponed by up to one month beyond the stipulated age of 8 months. Even if coverage thus improves with time, postponed vaccination adds to the pool of unprotected children in the population. Being unaware of the necessity for vaccination and its schedule, misunderstanding of vaccine side-effects, and the child being sick during the recommended vaccination period were significant factors preventing both MCV1 and MCV2 vaccination. Having multiple children, mother's education level, household income and having a working mother were significantly associated with delayed or missing MCV1 immunization. CONCLUSIONS: To avoid future outbreaks, it is crucial to attain high coverage levels through timely vaccination; thus, accurate information should be delivered and a systematic approach should be targeted to high-risk groups.

  12. Vaccine Hesitancy Among Caregivers and Association with Childhood Vaccination Timeliness in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Masters, Nina B; Tefera, Yemesrach A; Wagner, Abram L; Boulton, Matthew L

    2018-05-24

    Vaccines are vital to reducing childhood mortality, and prevent an estimated 2 to 3 million deaths annually which disproportionately occur in the developing world. Overall vaccine coverage is typically used as a metric to evaluate the adequacy of vaccine program performance, though it does not account for untimely administration, which may unnecessarily prolong children's susceptibility to disease. This study explored a hypothesized positive association between increasing vaccine hesitancy and untimeliness of immunizations administered under the Expanded Program on Immunization (EPI) in Addis Ababa, Ethiopia. This cross-sectional survey employed a multistage sampling design, randomly selecting one health center within five sub-cities of Addis Ababa. Caregivers of 3 to 12-month-old infants completed a questionnaire on vaccine hesitancy, and their infants' vaccination cards were examined to assess timeliness of received vaccinations. The sample comprised 350 caregivers. Overall, 82.3% of the surveyed children received all recommended vaccines, although only 55.9% of these vaccinations were timely. Few caregivers (3.4%) reported ever hesitating and 3.7% reported ever refusing a vaccine for their child. Vaccine hesitancy significantly increased the odds of untimely vaccination (AOR 1.94, 95% CI: 1.02, 3.71) in the adjusted analysis. This study found high vaccine coverage among a sample of 350 young children in Addis Ababa, though only half received all recommended vaccines on time. High vaccine hesitancy was strongly associated with infants' untimely vaccination, indicating that increased efforts to educate community members and providers about vaccines may have a beneficial impact on vaccine timeliness in Addis Ababa.

  13. Improving Timeliness of Winter Wheat Production Forecast in United States of America, Ukraine and China Using MODIS Data and NCAR Growing Degree Day

    Science.gov (United States)

    Vermote, E.; Franch, B.; Becker-Reshef, I.; Claverie, M.; Huang, J.; Zhang, J.; Sobrino, J. A.

    2014-12-01

    Wheat is the most important cereal crop traded on international markets and winter wheat constitutes approximately 80% of global wheat production. Thus, accurate and timely forecasts of its production are critical for informing agricultural policies and investments, as well as increasing market efficiency and stability. Becker-Reshef et al. (2010) used an empirical generalized model for forecasting winter wheat production. Their approach combined BRDF-corrected daily surface reflectance from the Moderate Resolution Imaging Spectroradiometer (MODIS) Climate Modeling Grid (CMG) with detailed official crop statistics and crop type masks. It is based on the relationship between the Normalized Difference Vegetation Index (NDVI) at the peak of the growing season, percent wheat within the CMG pixel, and the final yields. This method predicts the yield approximately one month to six weeks prior to harvest. In this study, we include Growing Degree Day (GDD) information extracted from NCEP/NCAR reanalysis data in order to improve the winter wheat production forecast by increasing the timeliness of the forecasts while conserving the accuracy of the original model. We apply this modified model to three major wheat-producing countries: the United States of America, Ukraine and China, from 2001 to 2012. We show that a reliable forecast can be made between one month and a month and a half prior to the peak NDVI (meaning two months to two and a half months prior to harvest) while conserving an accuracy of 10% in the production forecast.
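
    The forecasting approach described is an empirical regression linking peak-season NDVI and percent wheat within a CMG pixel to final yield; the sketch below shows that idea with entirely synthetic numbers and an assumed functional form (yield regressed on NDVI weighted by wheat fraction), not the calibrated model of Becker-Reshef et al. (2010).

```python
import numpy as np

# Hypothetical training data: peak-season NDVI, percent wheat in the CMG pixel,
# and reported yield (t/ha). Values are illustrative only.
peak_ndvi = np.array([0.55, 0.60, 0.48, 0.70, 0.65, 0.58])
pct_wheat = np.array([0.40, 0.55, 0.30, 0.70, 0.60, 0.45])
yield_tha = np.array([2.1, 2.6, 1.7, 3.2, 2.9, 2.3])

# Simple empirical model in the spirit of the described approach:
# regress yield on NDVI weighted by wheat fraction (an assumed functional form).
X = np.column_stack([np.ones_like(peak_ndvi), peak_ndvi * pct_wheat])
coef, *_ = np.linalg.lstsq(X, yield_tha, rcond=None)

# Forecast for a new pixel; production would be forecast yield x harvested area.
new_x = np.array([1.0, 0.62 * 0.50])
forecast_yield = new_x @ coef
print(f"forecast yield: {forecast_yield:.2f} t/ha")
```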

  14. Socio-economic determinants and inequities in coverage and timeliness of early childhood immunisation in rural Ghana

    NARCIS (Netherlands)

    Gram, Lu; Soremekun, Seyi; ten Asbroek, Augustinus; Manu, Alexander; O'Leary, Maureen; Hill, Zelee; Danso, Samuel; Amenga-Etego, Seeba; Owusu-Agyei, Seth; Kirkwood, Betty R.

    2014-01-01

    To assess the extent of socio-economic inequity in coverage and timeliness of key childhood immunisations in Ghana. Secondary analysis of vaccination card data collected from babies born between January 2008 and January 2010 who were registered in the surveillance system supporting the ObaapaVita

  15. Completeness and timeliness of Salmonella notifications in Ireland in 2008: a cross sectional study

    Directory of Open Access Journals (Sweden)

    Cormican Martin

    2010-09-01

    Full Text Available Abstract Background In Ireland, salmonellosis is the second most common cause of bacterial gastroenteritis. A new electronic system for reporting (Computerised Infectious Disease Reporting - CIDR) of Salmonella cases was established in 2004. It collates clinical (and/or laboratory) data on confirmed and probable Salmonella cases. The authors studied the completeness and the timeliness of Salmonella notifications in 2008. Methods This analysis was based upon laboratory confirmed cases of salmonella gastroenteritis. Using data contained in CIDR, we examined completeness for certain non-mandatory fields (country of infection, date of onset of illness, organism, outcome, patient type, and ethnicity). We matched the CIDR data with the dataset provided by the national Salmonella reference laboratory (NSRL) to which all Salmonella spp. isolates are referred for definitive typing. We calculated the main median time intervals in the flow of events of the notification process. Results In total, 416 laboratory confirmed Salmonella cases were captured by the national surveillance system and the NSRL and were included in the analysis. Completeness of non mandatory fields varied considerably. Organism was the most complete field (98.8%), ethnicity the least (11%). The median time interval between sample collection (first contact of the patient with the healthcare professional) to the first notification to the regional Department of Public Health (either a clinical or a laboratory notification) was 6 days (Interquartile 4-7 days). The median total identification time interval, time between sample collections to availability of serotyping and phage-typing results on the system, was 25 days (Interquartile 19-32 days). Timeliness varied with respect to Salmonella species. Clinical notifications occurred more rapidly than laboratory notifications. Conclusions Further feedback and education should be given to health care professionals to improve completeness of reporting of

  16. Timeliness of contact tracing among flight passengers for influenza A/H1N1 2009

    Directory of Open Access Journals (Sweden)

    Swaan Corien M

    2011-12-01

    Full Text Available Abstract Background During the initial containment phase of influenza A/H1N1 2009, close contacts of cases were traced to provide antiviral prophylaxis within 48 h after exposure and to alert them to signs of disease for early diagnosis and treatment. Passengers seated on the same row, or two rows in front of or behind a patient infectious for influenza, during a flight of ≥ 4 h were considered close contacts. This study evaluates the timeliness of flight-contact tracing (CT) as performed following national and international CT requests addressed to the Center of Infectious Disease Control (CIb/RIVM), and implemented by the Municipal Health Services of Schiphol Airport. Methods The number of days elapsed between the date of flight arrival and the date passenger lists became available (contact details identified - CI) was used as a proxy for timeliness of CT. In a retrospective study, dates of flight arrival, onset of illness, laboratory diagnosis, CT request and identification of contact details through passenger lists, following CT requests to the RIVM for flights landed at Schiphol Airport, were collected and analyzed. Results 24 requests for CT were identified. Three of these were declined as over 4 days had elapsed since flight arrival. In 17 out of 21 requests, contact details were obtained within 7 days after arrival (81%). The average delay between arrival and CI was 3.9 days (range 2-7), mainly caused by delay in diagnosis of the index patient after arrival (2.6 days). In four flights (19%), contacts were not identified or identified only after > 7 days. CI involving Dutch airlines was faster than for non-Dutch airlines (P < …). Conclusion CT for influenza A/H1N1 2009 among flight passengers was not successful for timely provision of prophylaxis. CT had little additional value for alerting passengers to disease symptoms, as this information was already provided during and after the flight. Public health authorities should take into account patient delays in seeking medical advice and

  17. Completeness and timeliness of Salmonella notifications in Ireland in 2008: a cross sectional study

    LENUS (Irish Health Repository)

    Nicolay, Nathalie

    2010-09-22

    Abstract Background In Ireland, salmonellosis is the second most common cause of bacterial gastroenteritis. A new electronic system for reporting (Computerised Infectious Disease Reporting - CIDR) of Salmonella cases was established in 2004. It collates clinical (and/or laboratory) data on confirmed and probable Salmonella cases. The authors studied the completeness and the timeliness of Salmonella notifications in 2008. Methods This analysis was based upon laboratory confirmed cases of salmonella gastroenteritis. Using data contained in CIDR, we examined completeness for certain non-mandatory fields (country of infection, date of onset of illness, organism, outcome, patient type, and ethnicity). We matched the CIDR data with the dataset provided by the national Salmonella reference laboratory (NSRL) to which all Salmonella spp. isolates are referred for definitive typing. We calculated the main median time intervals in the flow of events of the notification process. Results In total, 416 laboratory confirmed Salmonella cases were captured by the national surveillance system and the NSRL and were included in the analysis. Completeness of non mandatory fields varied considerably. Organism was the most complete field (98.8%), ethnicity the least (11%). The median time interval between sample collection (first contact of the patient with the healthcare professional) to the first notification to the regional Department of Public Health (either a clinical or a laboratory notification) was 6 days (Interquartile 4-7 days). The median total identification time interval, time between sample collections to availability of serotyping and phage-typing results on the system was 25 days (Interquartile 19-32 days). Timeliness varied with respect to Salmonella species. Clinical notifications occurred more rapidly than laboratory notifications. Conclusions Further feedback and education should be given to health care professionals to improve completeness of reporting of non

  18. Socio-economic determinants and inequities in coverage and timeliness of early childhood immunisation in rural Ghana.

    Science.gov (United States)

    Gram, Lu; Soremekun, Seyi; ten Asbroek, Augustinus; Manu, Alexander; O'Leary, Maureen; Hill, Zelee; Danso, Samuel; Amenga-Etego, Seeba; Owusu-Agyei, Seth; Kirkwood, Betty R

    2014-07-01

    To assess the extent of socio-economic inequity in coverage and timeliness of key childhood immunisations in Ghana. Secondary analysis of vaccination card data collected from babies born between January 2008 and January 2010 who were registered in the surveillance system supporting the ObaapaVita and Newhints Trials was carried out. 20 251 babies had 6 weeks' follow-up, 16 652 had 26 weeks' follow-up, and 5568 had 1 year's follow-up. We performed a descriptive analysis of coverage and timeliness of vaccinations by indicators for urban/rural status, wealth and educational attainment. The association of coverage with socio-economic indicators was tested using a chi-square test and the association with timeliness using Cox regression. Overall coverage at 1 year of age was high (>95%) for Bacillus Calmette-Guérin (BCG), all three pentavalent diphtheria-pertussis-tetanus-haemophilus influenzae B-hepatitis B (DPTHH) doses and all polio doses except polio at birth (63%). Coverage against measles and yellow fever was 85%. Median delay for BCG was 1.7 weeks. For polio at birth, the median delay was 5 days; all other vaccine doses had median delays of 2-4 weeks. We found substantial health inequity across all socio-economic indicators for all vaccines in terms of timeliness, but not coverage at 1 year. For example, for the last DPTHH dose, the proportion of children delayed more than 8 weeks was 27% for urban children and 31% for rural children (P < 0.001), 21% in the wealthiest quintile and 41% in the poorest quintile (P < 0.001), and 9% in the most educated group and 39% in the least educated group (P < 0.001). However, 1-year coverage of the same dose remained above 90% for all levels of all socio-economic indicators. Ghana has substantial health inequity across urban/rural, socio-economic and educational divides. While overall coverage was high, most vaccines suffered from poor timeliness. We suggest that countries achieving high coverage should include timeliness
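
    A small sketch of the descriptive timeliness comparison reported here: the proportion of children delayed more than 8 weeks for a vaccine dose, tabulated by a socio-economic indicator. The card records, education categories and the 8-week threshold are illustrative stand-ins for the study's data.

```python
import pandas as pd

# Hypothetical vaccination-card records: days between the recommended age and
# actual administration of the third DPTHH dose, with an education indicator.
df = pd.DataFrame({
    "education": ["none", "primary", "secondary+", "none", "primary",
                  "secondary+", "none", "primary", "secondary+", "none"],
    "delay_days": [70, 35, 10, 90, 60, 5, 40, 20, 15, 120],
})

df["delayed_8wk"] = df["delay_days"] > 56  # more than 8 weeks late

summary = (df.groupby("education")["delayed_8wk"]
             .agg(n="size", prop_delayed="mean")
             .sort_values("prop_delayed"))
print(summary)  # timeliness inequity shows up as a gradient across groups
```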

  19. Impact of electronic order management on the timeliness of antibiotic administration in critical care patients.

    Science.gov (United States)

    Cartmill, Randi S; Walker, James M; Blosky, Mary Ann; Brown, Roger L; Djurkovic, Svetolik; Dunham, Deborah B; Gardill, Debra; Haupt, Marilyn T; Parry, Dean; Wetterneck, Tosha B; Wood, Kenneth E; Carayon, Pascale

    2012-11-01

    To examine the effect of implementing electronic order management on the timely administration of antibiotics to critical-care patients. We used a prospective pre-post design, collecting data on first-dose IV antibiotic orders before and after the implementation of an integrated electronic medication-management system, which included computerized provider order entry (CPOE), pharmacy order processing and an electronic medication administration record (eMAR). The research was performed in a 24-bed adult medical/surgical ICU in a large, rural, tertiary medical center. Data on the time of ordering, pharmacy processing and administration were prospectively collected and time intervals for each stage and the overall process were calculated. The overall turnaround time from ordering to administration significantly decreased from a median of 100 min before order management implementation to a median of 64 min after implementation. The first part of the medication use process, i.e., from order entry to pharmacy processing, improved significantly whereas no change was observed in the phase from pharmacy processing to medication administration. The implementation of an electronic order-management system improved the timeliness of antibiotic administration to critical-care patients. Additional system changes are required to further decrease the turnaround time. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Perceived timeliness of referral to hospice palliative care among bereaved family members in Korea.

    Science.gov (United States)

    Jho, Hyun Jung; Chang, Yoon Jung; Song, Hye Young; Choi, Jin Young; Kim, Yeol; Park, Eun Jung; Paek, Soo Jin; Choi, Hee Jae

    2015-09-01

    We aimed to explore the perceived timeliness of referral to a hospice palliative care unit (HPCU) among bereaved family members in Korea and the factors associated therewith. A cross-sectional questionnaire survey was performed among bereaved family members of patients who utilized 40 designated HPCUs across Korea. The questionnaire assessed whether admission to the HPCU was "too late" or "appropriate" and included the Good Death Inventory (GDI). A total of 383 questionnaires were analyzed. Of the participants, 25.8% replied that admission to the HPCU was too late. Patients with hepatobiliary cancer, poor performance status, abnormal consciousness level, and unawareness of terminal status were significantly related to the "too late" perception. Younger family members and those who were a child of the patient were more frequently noted in the "too late" group. Ten out of 18 GDI scores were significantly lower in the "too late" group. Multiple logistic regression analysis revealed that patients' unawareness of terminal status, shorter stay in the HPCU, younger age of the bereaved family member, and lower scores for two GDI items (staying in a favored place, living without concern about death or disease) were significantly associated with the "too late" group. To promote timely HPCU utilization and better quality of end-of-life care, patients need to be informed of their terminal status and their preferences should be respected.

  1. The effect of nurse navigation on timeliness of breast cancer care at an academic comprehensive cancer center.

    Science.gov (United States)

    Basu, Mohua; Linebarger, Jared; Gabram, Sheryl G A; Patterson, Sharla Gayle; Amin, Miral; Ward, Kevin C

    2013-07-15

    A patient navigation process is required for accreditation by the National Accreditation Program for Breast Centers (NAPBC). Patient navigation has previously been shown to improve timely diagnosis in patients with breast cancer. This study sought to assess the effect of nurse navigation on timeliness of care following the diagnosis of breast cancer by comparing patients who were treated in a comprehensive cancer center with and without the assistance of nurse navigation. Navigation services were initiated at an NAPBC-accredited comprehensive breast center in July 2010. Two 9-month study intervals were chosen for comparison of timeliness of care: October 2009 through June 2010 and October 2010 through June 2011. All patients with breast cancer diagnosed in the cancer center with stage 0 to III disease during the 2 study periods were identified by retrospective cancer registry review. Time from diagnosis to initial oncology consultation was measured in business days, excluding holidays and weekends. Overall, 176 patients met inclusion criteria: 100 patients prior to and 76 patients following nurse navigation implementation. Nurse navigation was found to significantly shorten time to consultation for patients older than 60 years (B = -4.90, P = .0002). There was no change in timeliness for patients 31 to 60 years of age. Short-term analysis following navigation implementation showed decreased time to consultation for older patients, but not younger patients. Further studies are indicated to assess the long-term effects and durability of this quality improvement initiative. © 2013 American Cancer Society.
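
    Time to consultation in this study is measured in business days, excluding weekends and holidays; below is a minimal sketch of that interval calculation using NumPy's business-day counting, with hypothetical dates and an assumed holiday list.

```python
import numpy as np

# Hypothetical diagnosis and first-consultation dates for a few patients.
diagnosis = np.array(["2011-01-03", "2011-02-14", "2011-05-20"], dtype="datetime64[D]")
consult   = np.array(["2011-01-12", "2011-02-22", "2011-06-01"], dtype="datetime64[D]")

# Example holiday list (assumed; supply the actual institutional calendar).
holidays = np.array(["2011-05-30"], dtype="datetime64[D]")

# Business days between diagnosis and consultation, excluding weekends/holidays.
wait_days = np.busday_count(diagnosis, consult, holidays=holidays)
print(wait_days, "median:", np.median(wait_days))
```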

  2. The Effect of Ratio, Issuance of Stocks and Auditors’ Quality toward the Timeliness of Financial Reporting on the Internet by Consumer Goods Sector Companies in Indonesia

    Directory of Open Access Journals (Sweden)

    Lidiyawati Lidiyawati

    2015-11-01

    Full Text Available This study was conducted to analyze the factors that affect the timeliness of financial reporting on the Internet among Consumer Goods sector companies listed on the Indonesia Stock Exchange (IDX). The variables used were leverage, profitability, company size, the issuance of stock and the quality of auditors. The data analysis method used was logistic regression at the 0.05 level. The data were secondary data from a sample of Consumer Goods companies listed on the Indonesia Stock Exchange in 2010-2012. This study tested the effect of leverage, profitability, firm size, auditor quality and stock issuance on the timeliness of financial reporting on the Internet. The results of these tests show that auditor quality affects the timeliness of financial reporting on the Internet. However, the other variables, namely leverage, profitability, firm size and stock issuance, did not affect the timeliness of financial reporting on the Internet.

  3. Completeness and timeliness of vaccination and determinants for low and late uptake among young children in eastern China

    Science.gov (United States)

    Hu, Yu; Chen, Yaping; Guo, Jing; Tang, Xuewen; Shen, Lingzhi

    2014-01-01

    Background: We studied the completeness and timeliness of vaccination and the determinants of low and delayed uptake in children born between 2008 and 2009 in Zhejiang province in eastern China. Methods: We used data from a cross-sectional cluster survey conducted in 2011, which included 1146 children born from 1 Jan 2008 to 31 Dec 2009. Vaccination histories, socio-demographic factors, and caregivers' attitudes toward and satisfaction with immunization were collected with a standard questionnaire. We used the third dose of HepB, PV, and DPT (HepB3, PV3, and DPT3) as outcome variables for completeness of vaccination and the first dose of HepB, PV, DPT, and MCV (HepB1, PV1, DPT1, and MCV1) as outcome variables for timeliness of vaccination. The χ2 test and logistic regression analysis were applied to identify the determinants of completeness and timeliness of vaccination. Survival analysis by the Kaplan–Meier method was performed to present vaccination timeliness. Results: Coverage for HepB1, HepB3, PV1, PV3, DPT1, DPT3, and MCV1 was 93.22%, 90.15%, 96.42%, 91.63%, 95.80%, 90.16%, and 92.70%, respectively. Timely vaccination occurred in 501/1146 (43.72%) children for HepB1, 520/1146 (45.38%) for PV1, 511/1146 (44.59%) for DPT1, and 679/1146 (59.25%) for MCV1. Completeness of specific vaccines was associated with mother's age, immigration status, birth place of the child, maternal education level, maternal occupation status, socio-economic development level of the surveyed areas, satisfaction with the immunization service and distance from the house to the immunization clinic. Timeliness of vaccination for specific vaccines was associated with mother's age, maternal education level, immigration status, siblings, birth place, and distance from the house to the immunization clinic. Conclusion: Despite reasonably high vaccination coverage, we observed substantial vaccination delays. We found specific factors associated with low and/or delayed vaccine uptake. These findings

  4. Timeliness of abnormal screening and diagnostic mammography follow-up at facilities serving vulnerable women.

    Science.gov (United States)

    Goldman, L Elizabeth; Walker, Rod; Hubbard, Rebecca; Kerlikowske, Karla

    2013-04-01

    Whether timeliness of follow-up after abnormal mammography differs at facilities serving vulnerable populations, such as women with limited education or income, in rural areas, and racial/ethnic minorities is unknown. We examined receipt of diagnostic evaluation after abnormal mammography using 1998-2006 Breast Cancer Surveillance Consortium-linked Medicare claims. We compared whether time to recommended breast imaging or biopsy depended on whether women attended facilities serving vulnerable populations. We characterized a facility by the proportion of mammograms performed on women with limited education or income, in rural areas, or racial/ethnic minorities. We analyzed 30,874 abnormal screening examinations recommended for follow-up imaging across 142 facilities and 10,049 abnormal diagnostic examinations recommended for biopsy across 114 facilities. Women at facilities serving populations with less education or more racial/ethnic minorities had lower rates of follow-up imaging (4%-5% difference, P < …), and those at facilities serving more rural and low-income populations had lower rates of biopsy (4%-5% difference, P < …). Women at facilities serving vulnerable populations had longer times until biopsy than those at facilities serving nonvulnerable populations (21.6 vs. 15.6 d; 95% confidence interval for mean difference, 4.1-7.7). The proportion of women receiving recommended imaging within 11 months and biopsy within 3 months varied across facilities (interquartile range, 85.5%-96.5% for imaging and 79.4%-87.3% for biopsy). Among Medicare recipients, follow-up rates were slightly lower at facilities serving vulnerable populations, and among those women who returned for diagnostic evaluation, time to follow-up was slightly longer at facilities that served vulnerable populations. Interventions should target variability in follow-up rates across facilities, and evaluate effectiveness particularly at facilities serving vulnerable populations.

  5. Identifying and Prioritizing the Effective Parameters on Lack of Timeliness of Operations of Sugarcane Production using Analytical Hierarchy Process (AHP

    Directory of Open Access Journals (Sweden)

    N Monjezi

    2017-10-01

    Full Text Available Introduction Planning and scheduling of mechanized farming operations is very important. If an operation is not performed on time, yield will be reduced. For sugarcane in particular, any delay in crop planting and harvesting operations reduces the yield. The most useful priority-setting method for agricultural projects is the analytic hierarchy process (AHP). This article therefore presents an introductory application of the Analytical Hierarchy Process (AHP) as a common method of setting agricultural project priorities. The Analytic Hierarchy Process is a decision-making algorithm developed by Dr. Saaty in 1980. It has many applications as documented in the Decision Support System literature. Currently, this technique is widely used in complicated management decision making; AHP was preferred over other established methodologies because it does not demand prior knowledge of the utility function, it is based on a hierarchy of criteria and attributes reflecting the understanding of the problem, and it allows relative and absolute comparisons, making it a very robust tool. The purpose of this research is to identify and prioritize, using AHP, the parameters contributing to lack of timeliness of sugarcane production operations in Khuzestan province of Iran. Materials and Methods The parameters affecting lack of timeliness of operations were defined based on experts' opinions. A questionnaire and personal interviews formed the basis of this research. The study was applied to a panel of qualified informants made up of fourteen experts, drawn from the Sugarcane Development and By-products Company in 2013-2014. Then, using the Analytical Hierarchy Process, a questionnaire was designed for defining the weight and importance of the parameters affecting lack of timeliness of operations. For this method of evaluation, the three main criteria considered were yield criteria, cost criteria
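
    A brief sketch of the core AHP weighting step implied here: derive priority weights from a pairwise comparison matrix via its principal eigenvector and check consistency with Saaty's consistency ratio. The comparison values and criterion names are hypothetical, not the study's elicited judgments.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three criteria,
# e.g. yield loss, cost, and resource availability. Values are illustrative.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # AHP priority weights

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58                                   # Saaty's random index for n = 3
cr = ci / ri
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))  # CR < 0.1 acceptable
```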

  6. Utilizing distributional analytics and electronic records to assess timeliness of inpatient blood glucose monitoring in non-critical care wards

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2016-04-01

    Full Text Available Abstract Background Regular and timely monitoring of blood glucose (BG) levels in hospitalized patients with diabetes mellitus is crucial to optimizing inpatient glycaemic control. However, methods to quantify timeliness as a measurement of quality of care are lacking. We propose an analytical approach that utilizes BG measurements from electronic records to assess adherence to an inpatient BG monitoring protocol in hospital wards. Methods We applied our proposed analytical approach to electronic records obtained from 24 non-critical care wards in November and December 2013 from a tertiary care hospital in Singapore. We applied distributional analytics to evaluate daily adherence to BG monitoring timings. A one-sample Kolmogorov-Smirnov (1S-KS) test was performed to test daily BG timings against non-adherence represented by the uniform distribution. This test was performed among wards with high power, determined through simulation. The 1S-KS test was coupled with visualization via the cumulative distribution function (cdf) plot and a two-sample Kolmogorov-Smirnov (2S-KS) test, enabling comparison of the BG timing distributions between two consecutive days. We also applied mixture modelling to identify the key features in daily BG timings. Results We found that 11 out of the 24 wards had high power. Among these wards, 1S-KS test with cdf plots indicated adherence to BG monitoring protocols. Integrating both 1S-KS and 2S-KS information within a moving window consisting of two consecutive days did not suggest frequent potential change from or towards non-adherence to protocol. From mixture modelling among wards with high power, we consistently identified four components with high concentration of BG measurements taken before mealtimes and around bedtime. This agnostic analysis provided additional evidence that the wards were adherent to BG monitoring protocols. Conclusions We demonstrated the utility of our proposed analytical approach as a monitoring
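
    A compact sketch of the distributional tests described: a one-sample Kolmogorov-Smirnov test of daily BG measurement times against a uniform distribution (representing non-adherence) and a two-sample test comparing two consecutive days. The synthetic measurement times and protocol time points are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic BG measurement times (hours after midnight) for two ward-days.
# An adherent ward clusters around pre-meal and bedtime checks; values assumed.
protocol_times = np.array([7.0, 11.5, 17.5, 22.0])
day1 = np.clip(rng.choice(protocol_times, 60) + rng.normal(0, 0.5, 60), 0, 24)
day2 = np.clip(rng.choice(protocol_times, 60) + rng.normal(0, 0.5, 60), 0, 24)

# 1S-KS test against uniform(0, 24): rejecting uniformity is evidence against
# "measurements taken at arbitrary times", i.e. consistent with adherence.
d1, p1 = stats.kstest(day1, "uniform", args=(0, 24))
# 2S-KS test: has the daily timing distribution shifted between the two days?
d2, p2 = stats.ks_2samp(day1, day2)

print(f"1S-KS vs uniform: D={d1:.2f}, p={p1:.3g}")
print(f"2S-KS day1 vs day2: D={d2:.2f}, p={p2:.3g}")
```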

  7. Combined Loadings and Cross-Dimensional Loadings Timeliness of Presentation of Financial Statements of Local Government

    Science.gov (United States)

    Muda, I.; Dharsuky, A.; Siregar, H. S.; Sadalia, I.

    2017-03-01

    This study examines the pattern of timeliness of presentation of local government financial statements in North Sumatra, comparing a routine pattern of two (2) months after the fiscal year ends with a pattern of at least three (3) months after the fiscal year ends. This type of research is an explanatory survey with quantitative methods. The population and the sample consist of local government officials who prepare local government financial reports. Combined loadings and cross-loadings analyses are used, with WarpPLS as the statistical tool. The results showed varying patterns across the timeliness dimensions of local government financial statements in North Sumatra.

  8. The impact of system level factors on treatment timeliness: utilizing the Toyota Production System to implement direct intake scheduling in a semi-rural community mental health clinic.

    Science.gov (United States)

    Weaver, Addie; Greeno, Catherine G; Goughler, Donald H; Yarzebinski, Kathleen; Zimmerman, Tina; Anderson, Carol

    2013-07-01

    This study examined the effect of using the Toyota Production System (TPS) to change intake procedures on treatment timeliness within a semi-rural community mental health clinic. One hundred randomly selected cases opened the year before the change and 100 randomly selected cases opened the year after the change were reviewed. An analysis of covariance demonstrated that changing intake procedures significantly decreased the number of days consumers waited for appointments (F(1,160) = 4.9; p = .03) from an average of 11 to 8 days. The pattern of difference on treatment timeliness was significantly different between adult and child programs (F(1,160) = 4.2; p = .04), with children waiting an average of 4 days longer than adults for appointments. Findings suggest that small system level changes may elicit important changes and that TPS offers a valuable model to improve processes within community mental health settings. Results also indicate that different factors drive adult and children's treatment timeliness.

  9. Effect of the Adoption of IFRS on the Information Relevance of Accounting Profits in Brazil

    Directory of Open Access Journals (Sweden)

    Mateus Alexandre Costa dos Santos

    2014-12-01

    Full Text Available This study aimed to assess the effect of adopting the International Financial Reporting Standards (IFRS) in Brazil on the information relevance of accounting profits of publicly traded companies. International studies have shown that the adoption of IFRS improves the quality of accounting information compared with domestic accounting standards. Concurrent evidence is sparse in Brazil. Information relevance is understood herein as a multidimensional attribute that is closely related to the quality and usefulness of the information conveyed by accounting profits. The associative capacity and information timeliness of accounting profits in relation to share prices were examined. Furthermore, the level of conditional conservatism present in accounting profits was also analyzed because, according to Basu (1997), this aspect is related to timeliness. The study used pooled regressions and panel data models to analyze the quarterly accounting profits of 246 companies between the first quarter of 1999 and the first quarter of 2013, resulting in 9,558 quarter-company observations. The results indicated that the adoption of IFRS in Brazil (1) increased the associative capacity of accounting profits; (2) reduced information timeliness to non-significant levels; and (3) had no effect on conditional conservatism. The joint analysis of the empirical evidence from the present study precludes conclusively stating that the adoption of IFRS in Brazil contributed to an increase in the information relevance of accounting profits of publicly traded companies.
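
    Conditional conservatism in the sense of Basu (1997) is typically estimated with a reverse regression of earnings on returns, a negative-return dummy and their interaction; the sketch below runs that specification on simulated firm-quarter data. The variable names, coefficients and sample are invented for illustration and are not the study's Brazilian data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Synthetic firm-quarter panel: stock return R and earnings scaled by
# beginning-of-period price (E/P). Values are illustrative only.
n = 500
ret = rng.normal(0.02, 0.25, n)
neg = (ret < 0).astype(int)
# Build earnings so that bad news (negative returns) is reflected more strongly,
# i.e. conditional conservatism is present in the simulated data.
ep = 0.05 + 0.10 * ret + 0.25 * neg * ret + rng.normal(0, 0.03, n)
df = pd.DataFrame({"ep": ep, "ret": ret, "neg": neg})

# Basu (1997)-style regression: E/P = a0 + a1*D + b0*R + b1*D*R; b1 > 0
# indicates earnings are more timely for bad news than for good news.
model = smf.ols("ep ~ neg + ret + neg:ret", data=df).fit()
print(model.params)
```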

  10. Impact of two interventions on timeliness and data quality of an electronic disease surveillance system in a resource limited setting (Peru: a prospective evaluation

    Directory of Open Access Journals (Sweden)

    Quispe Jose A

    2009-03-01

    Full Text Available Abstract Background Timely detection of outbreaks through surveillance is needed in order to prevent future pandemics. However, current surveillance systems may not be prepared to accomplish this goal, especially in resource-limited settings. As data quality and timeliness are attributes that improve outbreak detection capacity, we assessed the effect of two interventions on such attributes in Alerta, an electronic disease surveillance system in the Peruvian Navy. Methods 40 Alerta reporting units (18 clinics and 22 ships) were included in a 12-week prospective evaluation project. After a short refresher course on the notification process, units were randomly assigned to either a phone, visit or control group. Phone group sites were called three hours before the biweekly reporting deadline if they had not sent their report. Visit group sites received supervision visits on weeks 4 & 8, but no phone calls. The control group sites were not contacted by phone or visited. Timeliness and data quality were assessed by calculating the percentage of reports sent on time and the percentage of errors per total number of reports, respectively. Results Timeliness improved in the phone group from 64.6% to 84% in clinics (+19.4 [95% CI, +10.3 to +28.6]; p < …). Conclusion Regular phone reminders significantly improved timeliness of reports in clinics and ships, whereas supervision visits led to improved data quality only among clinics. Further investigations are needed to establish the cost-effectiveness and optimal use of each of these strategies.
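
    A small sketch of how the pre/post change in the percentage of on-time reports could be compared with a two-proportion z-test and a Wald confidence interval; the report counts below are assumed, since the abstract gives only the percentages for the clinic phone group.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical report counts for clinics in the phone-reminder arm; the abstract
# reports an improvement from 64.6% to 84.0% on time, but the denominators
# below are assumed for illustration.
on_time = np.array([151, 108])   # [post-intervention, pre-intervention]
reports = np.array([180, 167])

stat, pval = proportions_ztest(on_time, reports)

# Wald 95% CI for the difference in proportions, computed by hand.
p = on_time / reports
se = np.sqrt(np.sum(p * (1 - p) / reports))
diff = p[0] - p[1]
print(f"diff={diff:+.3f} (95% CI {diff - 1.96*se:+.3f} to {diff + 1.96*se:+.3f}), "
      f"z={stat:.2f}, p={pval:.3g}")
```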

  11. Mechanized farming in the humid tropics with special reference to soil tillage, workability and timeliness of farm operations : a case study for the Zanderij area of Suriname

    NARCIS (Netherlands)

    Goense, D.

    1987-01-01

    The reported investigations concern aspects of mechanized farming for the production of rainfed crops on the loamy soils of the Zanderij formation in Suriname and in particular, the effect of tillage on crop yield and soil properties, workability of field operations and timeliness of field

  12. Diagnostic timeliness in adolescents and young adults with cancer: a cross-sectional analysis of the BRIGHTLIGHT cohort.

    Science.gov (United States)

    Herbert, Annie; Lyratzopoulos, Georgios; Whelan, Jeremy; Taylor, Rachel M; Barber, Julie; Gibson, Faith; Fern, Lorna A

    2018-03-01

    Adolescents and young adults (AYAs) are thought to experience prolonged intervals to cancer diagnosis, but evidence quantifying this hypothesis and identifying high-risk patient subgroups is insufficient. We aimed to investigate diagnostic timeliness in a cohort of AYAs with incident cancers and to identify factors associated with variation in timeliness. We did a cross-sectional analysis of the BRIGHTLIGHT cohort, which included AYAs aged 12-24 years recruited within an average of 6 months from new primary cancer diagnosis from 96 National Health Service hospitals across England between July 1, 2012, and April 30, 2015. Participants completed structured, face-to-face interviews to provide information on their diagnostic experience (eg, month and year of symptom onset, number of consultations before referral to specialist care); demographic information was extracted from case report forms and date of diagnosis and cancer type from the national cancer registry. We analysed these data to assess patient interval (time from symptom onset to first presentation to a general practitioner [GP] or emergency department), the number of prereferral GP consultations, and the symptom onset-to-diagnosis interval (time from symptom onset to diagnosis) by patient characteristic and cancer site, and examined associations using multivariable regression models. Of 1114 participants recruited to the BRIGHTLIGHT cohort, 830 completed a face-to-face interview. Among participants with available information, 204 (27%) of 748 had a patient interval of more than a month and 242 (35%) of 701 consulting a general practitioner had three or more prereferral consultations. The median symptom onset-to-diagnosis interval was 62 days (IQR 29-153). Compared with male AYAs, female AYAs were more likely to have three or more consultations (adjusted odds ratio [OR] 1·6 [95% CI 1·1-2·3], p=0·0093) and longer median symptom onset-to-diagnosis intervals (adjusted median interval longer by 24 days [95

  13. Predictors of Uptake and Timeliness of Newly Introduced Pneumococcal and Rotavirus Vaccines, and of Measles Vaccine in Rural Malawi: A Population Cohort Study.

    Directory of Open Access Journals (Sweden)

    Hazzie Mvula

    Full Text Available Malawi introduced pneumococcal conjugate vaccine (PCV13) and monovalent rotavirus vaccine (RV1) in 2011 and 2012 respectively, and is planning the introduction of a second-dose measles vaccine (MV). We assessed predictors of availability, uptake and timeliness of these vaccines in a rural Malawian setting. Commencing on the first date of PCV13 eligibility we conducted a prospective population-based birth cohort study of 2,616 children under demographic surveillance in Karonga District, northern Malawi who were eligible for PCV13, or from the date of RV1 introduction both PCV13 and RV1. Potential predictors of vaccine uptake and timeliness for PCV13, RV1 and MV were analysed respectively using robust Poisson and Cox regression. Vaccine coverage was high for all vaccines, ranging from 86.9% for RV1 dose 2 to 95.4% for PCV13 dose 1. Median time delay for PCV13 dose 1 was 17 days (IQR 7-36), 19 days (IQR 8-36) for RV1 dose 1 and 20 days (IQR 3-46) for MV. Infants born to lower educated or farming mothers and those living further away from the road or clinic were at greater risk of being not fully vaccinated and being vaccinated late. Delays in vaccination were also associated with non-facility birth. Vaccine stock-outs resulted in both a delay in vaccine timeliness and in a decrease in completion of schedule. Despite high vaccination coverage in this setting, delays in vaccination were common. We identified programmatic and socio-demographic risk factors for uptake and timeliness of vaccination. Understanding who remains most vulnerable to be unvaccinated allows for focussed delivery thereby increasing population coverage and maximising the equitable benefits of universal vaccination programmes.

  14. Analyzing Influential Factors Against Timeliness of Financial Reporting (Empirical Study of Automation and Components and Telecommunication Companies Listed on Indonesia Stock Exchange.

    Directory of Open Access Journals (Sweden)

    Joko Suryanto

    2016-12-01

    Full Text Available This research aims to examine the effect of firm size, profitability, solvency, public ownership, and the audit opinion on the timeliness of financial reporting. The dependent variable is the timeliness with which the company delivers its financial statements to the Stock Exchange. The independent variables are firm size, measured by the company's total assets; profitability, measured by the profit margin ratio; solvency, measured by the debt-to-equity ratio; public ownership, measured by the percentage of shares owned by the public; and the audit opinion, measured as unqualified versus other than unqualified. This study uses secondary data, with a population of automotive and component and telecommunication companies and their annual financial statements issued on the Stock Exchange in the period 2010-2012. From the analysis conducted in this study, it can be concluded that firm size significantly influences the timeliness of financial reporting, while profitability, solvency, public ownership, and the audit opinion do not affect the timeliness of financial reporting.

  15. The Impact of System Level Factors on Treatment Timeliness: Utilizing the Toyota Production System to Implement Direct Intake Scheduling in a Semi-Rural Community Mental Health Clinic

    Science.gov (United States)

    Weaver, A.; Greeno, C.G.; Goughler, D.H.; Yarzebinski, K.; Zimmerman, T.; Anderson, C.

    2013-01-01

    This study examined the effect of using the Toyota Production System (TPS) to change intake procedures on treatment timeliness within a semi-rural community mental health clinic. One hundred randomly selected cases opened the year before the change and one hundred randomly selected cases opened the year after the change were reviewed. An analysis of covariance (ANCOVA) demonstrated that changing intake procedures significantly decreased the number of days consumers waited for appointments (F(1,160)=4.9; p=.03) from an average of 11 days to 8 days. The pattern of difference on treatment timeliness was significantly different between adult and child programs (F(1,160)=4.2; p=.04), with children waiting an average of 4 days longer than adults for appointments. Findings suggest that small system level changes may elicit important changes and that TPS offers a valuable model to improve processes within community mental health settings. Results also indicate that different factors drive adult and children’s treatment timeliness. PMID:23576137
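    The comparison described in this record is an analysis of covariance with intake period (before vs. after the change) and program (adult vs. child) as factors. The following is only a generic sketch of that kind of model; the data frame, column names and covariate are hypothetical and not taken from the study.

```python
# Generic ANCOVA sketch: wait time by intake period and program, adjusting
# for a covariate.  Data frame and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "wait_days": [12, 9, 7, 15, 6, 8, 10, 5],
    "period":  ["pre", "pre", "post", "pre", "post", "post", "pre", "post"],
    "program": ["adult", "child", "adult", "child", "adult", "child", "adult", "child"],
    "age":     [34, 11, 29, 9, 41, 13, 37, 10],
})

# Main effects, period-by-program interaction, and a continuous covariate.
model = smf.ols("wait_days ~ C(period) * C(program) + age", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```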

  16. Timeliness of Operating Room Case Planning and Time Utilization: Influence of First and To-Follow Cases

    Directory of Open Access Journals (Sweden)

    Konrad Meissner

    2017-04-01

    Full Text Available Resource and cost constraints in hospitals demand thorough planning of operating room schedules. Ideally, exact start times and durations are known in advance for each case. However, aside from the first case's start, most factors are hard to predict. While the role of the start of the first case for optimal room utilization has been shown before, data for to-follow cases are lacking. The present study therefore aimed to analyze all elective surgery cases of a university hospital within 1 year in search of visible patterns. A total of 14,014 cases scheduled on 254 regular working days at a university hospital between September 2015 and August 2016 underwent screening. After eliminating 112 emergencies during regular working hours, 13,547 elective daytime cases were analyzed, out of which 4,346 ranked first, 3,723 second, and 5,478 third or higher in the daily schedule. Also, 36% of cases changed start times from the day before to 7:00 a.m., with half of these (52%) resulting in a delay of more than 15 min. After 7:00 a.m., 87% of cases started more than 10 min off schedule, with 26% being early and 74% late. Timeliness was 15 ± 72 min (mean ± SD) for first, 21 ± 84 min for second, and 25 ± 93 min for all to-follow cases, compared to preoperative day planning, and 21 ± 45, 23 ± 61, and 19 ± 74 min compared to 7:00 a.m. status. Start time deviations were also related to procedure duration, with cases of 61–90 min duration being most reliable (deviation 9.8 ± 67 min compared to 7:00 a.m.), regardless of order. In consequence, cases following after 61–90 min long cases had the shortest deviations of incision time from schedule (16 ± 66 min). Taken together, start times for elective surgery cases deviate substantially from schedule, with first and second cases falling into the highest mean deviation category. Second cases had the largest deviations from scheduled times compared to

  17. Sex Differences in Timeliness of Reperfusion in Young Patients With ST-Segment-Elevation Myocardial Infarction by Initial Electrocardiographic Characteristics.

    Science.gov (United States)

    Gupta, Aakriti; Barrabes, Jose A; Strait, Kelly; Bueno, Hector; Porta-Sánchez, Andreu; Acosta-Vélez, J Gabriel; Lidón, Rosa-Maria; Spatz, Erica; Geda, Mary; Dreyer, Rachel P; Lorenze, Nancy; Lichtman, Judith; D'Onofrio, Gail; Krumholz, Harlan M

    2018-03-07

    Young women with ST-segment-elevation myocardial infarction experience reperfusion delays more frequently than men. Our aim was to determine the electrocardiographic correlates of delay in reperfusion in young patients with ST-segment-elevation myocardial infarction. We examined sex differences in initial electrocardiographic characteristics among 1359 patients with ST-segment-elevation myocardial infarction in a prospective, observational, cohort study (2008-2012) of 3501 patients with acute myocardial infarction, 18 to 55 years of age, as part of the VIRGO (Variation in Recovery: Role of Gender on Outcomes of Young AMI Patients) study at 103 US and 24 Spanish hospitals enrolling in a 2:1 ratio for women/men. We created a multivariable logistic regression model to assess the relationship between reperfusion delay (door-to-balloon time >90 or >120 minutes for transfer or door-to-needle time >30 minutes) and electrocardiographic characteristics, adjusting for sex, sociodemographic characteristics, and clinical characteristics at presentation. In our study (834 women and 525 men), women were more likely to exceed reperfusion time guidelines than men (42.4% versus 31.5%). ST elevation in lateral leads was an inverse predictor of reperfusion delay. Sex disparities in timeliness to reperfusion in young patients with ST-segment-elevation myocardial infarction persisted, despite adjusting for initial electrocardiographic characteristics. Left ventricular hypertrophy by voltage criteria and absence of prehospital ECG are strongly positively correlated and ST elevation in lateral leads is negatively correlated with reperfusion delay. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  18. Evaluation of radiographers’ mammography screen-reading accuracy in Australia

    International Nuclear Information System (INIS)

    Debono, Josephine C; Poulos, Ann E; Houssami, Nehmat; Turner, Robin M; Boyages, John

    2015-01-01

    This study aimed to evaluate the accuracy of radiographers’ screen-reading mammograms. Currently, radiologist workforce shortages may be compromising the BreastScreen Australia screening program goal to detect early breast cancer. The solution to a similar problem in the United Kingdom has successfully encouraged radiographers to take on the role as one of two screen-readers. Prior to consideration of this strategy in Australia, educational and experiential differences between radiographers in the United Kingdom and Australia emphasise the need for an investigation of Australian radiographers’ screen-reading accuracy. Ten radiographers employed by the Westmead Breast Cancer Institute with a range of radiographic (median = 28 years), mammographic (median = 13 years) and BreastScreen (median = 8 years) experience were recruited to blindly and independently screen-read an image test set of 500 mammograms, without formal training. The radiographers indicated the presence of an abnormality using BI-RADS®. Accuracy was determined by comparison with the gold standard of known outcomes of pathology results, interval matching and client 6-year follow-up. Individual sensitivity and specificity levels ranged between 76.0% and 92.0%, and 74.8% and 96.2% respectively. Pooled screen-reader accuracy across the radiographers estimated sensitivity as 82.2% and specificity as 89.5%. Areas under the reading operating characteristic curve ranged between 0.842 and 0.923. This sample of radiographers in an Australian setting have adequate accuracy levels when screen-reading mammograms. It is expected that with formal screen-reading training, accuracy levels will improve, and with support, radiographers have the potential to be one of the two screen-readers in the BreastScreen Australia program, contributing to timeliness and improved program outcomes
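    The sensitivity and specificity figures quoted above follow from counts of true and false positives and negatives against the gold standard; pooled values aggregate those counts across readers. The sketch below illustrates the arithmetic only, with invented counts rather than the study's data.

```python
# Illustrative only: hypothetical counts per reader, not the study's data.
# Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP).

readers = {
    # reader_id: (true_pos, false_neg, true_neg, false_pos)
    "reader_01": (41, 9, 430, 20),
    "reader_02": (46, 4, 400, 50),
}

def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

for name, (tp, fn, tn, fp) in readers.items():
    se, sp = sens_spec(tp, fn, tn, fp)
    print(f"{name}: sensitivity={se:.1%}, specificity={sp:.1%}")

# Pooled estimates aggregate the raw counts across readers before dividing.
tp, fn, tn, fp = (sum(vals) for vals in zip(*readers.values()))
se, sp = sens_spec(tp, fn, tn, fp)
print(f"pooled: sensitivity={se:.1%}, specificity={sp:.1%}")
```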

  19. Evaluation of radiographers’ mammography screen-reading accuracy in Australia

    Energy Technology Data Exchange (ETDEWEB)

    Debono, Josephine C, E-mail: josephine.debono@bci.org.au [Westmead Breast Cancer Institute, Westmead, New South Wales (Australia); Poulos, Ann E [Discipline of Medical Radiation Sciences, Faculty of Health Sciences, University of Sydney, Lidcombe, New South Wales (Australia); Houssami, Nehmat [Screening and Test Evaluation Program, School of Public Health (A27), Sydney Medical School, University of Sydney, Sydney, New South Wales (Australia); Turner, Robin M [School of Public Health and Community Medicine, University of New South Wales, Sydney, New South Wales (Australia); Boyages, John [Macquarie University Cancer Institute, Macquarie University Hospital, Australian School of Advanced Medicine, Macquarie University, Sydney, New South Wales (Australia); Westmead Breast Cancer Institute, Westmead, New South Wales (Australia)

    2015-03-15

    This study aimed to evaluate the accuracy of radiographers’ screen-reading mammograms. Currently, radiologist workforce shortages may be compromising the BreastScreen Australia screening program goal to detect early breast cancer. The solution to a similar problem in the United Kingdom has successfully encouraged radiographers to take on the role as one of two screen-readers. Prior to consideration of this strategy in Australia, educational and experiential differences between radiographers in the United Kingdom and Australia emphasise the need for an investigation of Australian radiographers’ screen-reading accuracy. Ten radiographers employed by the Westmead Breast Cancer Institute with a range of radiographic (median = 28 years), mammographic (median = 13 years) and BreastScreen (median = 8 years) experience were recruited to blindly and independently screen-read an image test set of 500 mammograms, without formal training. The radiographers indicated the presence of an abnormality using BI-RADS®. Accuracy was determined by comparison with the gold standard of known outcomes of pathology results, interval matching and client 6-year follow-up. Individual sensitivity and specificity levels ranged between 76.0% and 92.0%, and 74.8% and 96.2% respectively. Pooled screen-reader accuracy across the radiographers estimated sensitivity as 82.2% and specificity as 89.5%. Areas under the reading operating characteristic curve ranged between 0.842 and 0.923. This sample of radiographers in an Australian setting have adequate accuracy levels when screen-reading mammograms. It is expected that with formal screen-reading training, accuracy levels will improve, and with support, radiographers have the potential to be one of the two screen-readers in the BreastScreen Australia program, contributing to timeliness and improved program outcomes.

  20. Why relevance theory is relevant for lexicography

    DEFF Research Database (Denmark)

    Bothma, Theo; Tarp, Sven

    2014-01-01

    This article starts by providing a brief summary of relevance theory in information science in relation to the function theory of lexicography, explaining the different types of relevance, viz. objective system relevance and the subjective types of relevance, i.e. topical, cognitive, situational...... that is very important for lexicography as well as for information science, viz. functional relevance. Since all lexicographic work is ultimately aimed at satisfying users’ information needs, the article then discusses why the lexicographer should take note of all these types of relevance when planning a new...... dictionary project, identifying new tasks and responsibilities of the modern lexicographer. The article furthermore discusses how relevance theory impacts on teaching dictionary culture and reference skills. By integrating insights from lexicography and information science, the article contributes to new...

  1. The Effect of Management Accounting Information System Characteristics (Broad Scope, Timeliness, Aggregation, and Integration) on the Managerial Performance of SMEs (A Study of SMEs in Wedoro Village, Sidoarjo Regency)

    Directory of Open Access Journals (Sweden)

    Susi Handayani

    2014-04-01

    Full Text Available SMEs need reliable information systems and competent entrepreneurial personalities, both of which have an impact on managerial performance. According to Chenhall and Morris (1986), a reliable management accounting information system is one that has the characteristics of broad scope, timeliness, aggregation and integration. Broad-scope information covers attention focus, quantification, and time horizon. The timeliness dimension has two sub-dimensions, namely frequency of reporting and speed of reporting. The aggregation dimension summarises information by function, time period, and decision model. Integrated information reflects the coordination between one segment and the other subunits within the organization. This research is quantitative and uses causal analysis, that is, how one variable affects changes in other variables. The data were analysed with Structural Equation Modeling (SEM) using the Partial Least Squares (PLS) approach, processed in the Warp PLS software. The results show that management accounting information systems with broad-scope, timeliness, integration and aggregation characteristics affect managerial performance, measured with a self-rating instrument reflected in four indicators: increased revenue, cost savings, improved customer satisfaction and increased asset utilization. This shows that although SMEs are not large businesses, they still require a wide range of timely, integrated and comprehensive information that can assist managers in making informed decisions that improve managerial performance, balancing cost efficiency with customer satisfaction and thereby increasing SME income under conditions of environmental uncertainty. Keywords: SME, SIAM characteristics, Managerial Performance

  2. Deep learning relevance

    DEFF Research Database (Denmark)

    Lioma, Christina; Larsen, Birger; Petersen, Casper

    2016-01-01

    train a Recurrent Neural Network (RNN) on existing relevant information to that query. We then use the RNN to "deep learn" a single, synthetic, and we assume, relevant document for that query. We design a crowdsourcing experiment to assess how relevant the "deep learned" document is, compared...... to existing relevant documents. Users are shown a query and four wordclouds (of three existing relevant documents and our deep learned synthetic document). The synthetic document is ranked on average most relevant of all....

  3. Target Price Accuracy

    Directory of Open Access Journals (Sweden)

    Alexander G. Kerl

    2011-04-01

    Full Text Available This study analyzes the accuracy of forecasted target prices within analysts’ reports. We compute a measure for target price forecast accuracy that evaluates the ability of analysts to exactly forecast the ex-ante (unknown) 12-month stock price. Furthermore, we determine factors that explain this accuracy. Target price accuracy is negatively related to analyst-specific optimism and stock-specific risk (measured by volatility and price-to-book ratio). However, target price accuracy is positively related to the level of detail of each report, company size and the reputation of the investment bank. The potential conflicts of interests between an analyst and a covered company do not bias forecast accuracy.
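    The record does not spell out the accuracy measure itself, so the sketch below shows one common way to operationalise target price accuracy: the absolute deviation of the forecast from the realised 12-month price, scaled by the realised price. It is an assumed proxy, not necessarily the authors' definition, and the numbers are hypothetical.

```python
# Hypothetical example of a target-price accuracy proxy; the paper's own
# measure may differ in detail.
def target_price_error(target_price, realized_price_12m):
    """Absolute forecast error as a fraction of the realized 12-month price."""
    return abs(target_price - realized_price_12m) / realized_price_12m

# An analyst target of 120 against a realized price of 100 gives a 20% error.
print(target_price_error(target_price=120.0, realized_price_12m=100.0))  # 0.20
```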

  4. Diagnosing Eyewitness Accuracy

    OpenAIRE

    Russ, Andrew

    2015-01-01

    Eyewitnesses frequently mistake innocent people for the perpetrator of an observed crime. Such misidentifications have led to the wrongful convictions of many people. Despite this, no reliable method yet exists to determine eyewitness accuracy. This thesis explored two new experimental methods for this purpose. Chapter 2 investigated whether repetition priming can measure prior exposure to a target and compared this with observers’ explicit eyewitness accuracy. Across three experiments slower...

  5. Timeliness and completeness of measles vaccination among children in rural areas of Guangxi, China: A stratified three-stage cluster survey.

    Science.gov (United States)

    Tang, Xianyan; Geater, Alan; McNeil, Edward; Zhou, Hongxia; Deng, Qiuyun; Dong, Aihu

    2017-07-01

    Large-scale outbreaks of measles occurred in 2013 and 2014 in rural Guangxi, a region in Southwest China with high coverage for measles-containing vaccine (MCV). This study aimed to estimate the timely vaccination coverage, the timely-and-complete vaccination coverage, and the median delay period for MCV among children aged 18-54 months in rural Guangxi. Based on quartiles of measles incidence during 2011-2013, a stratified three-stage cluster survey was conducted from June through August 2015. Using weighted estimation and finite population correction, vaccination coverage and 95% confidence intervals (CIs) were calculated. Weighted Kaplan-Meier analyses were used to estimate the median delay periods for the first (MCV1) and second (MCV2) doses of the vaccine. A total of 1216 children were surveyed. The timely vaccination coverage rate was 58.4% (95% CI, 54.9%-62.0%) for MCV1, and 76.9% (95% CI, 73.6%-80.0%) for MCV2. The timely-and-complete vaccination coverage rate was 47.4% (95% CI, 44.0%-51.0%). The median delay period was 32 (95% CI, 27-38) days for MCV1, and 159 (95% CI, 118-195) days for MCV2. The timeliness and completeness of measles vaccination was low, and the median delay period was long among children in rural Guangxi. Incorporating the timeliness and completeness into official routine vaccination coverage statistics may help appraise the coverage of vaccination in China. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
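    The median delay figures come from Kaplan–Meier estimation of the time between the recommended vaccination age and actual receipt of the dose. A minimal, unweighted sketch of that calculation is shown below; the study itself used survey-weighted estimation, and the delay values here are invented.

```python
# Sketch of a Kaplan-Meier median delay estimate with the lifelines package.
# Delays are hypothetical; the study's survey weights are omitted here.
from lifelines import KaplanMeierFitter

delays_days = [5, 12, 20, 32, 45, 60, 90]   # days past the recommended age
received = [1, 1, 1, 1, 1, 1, 0]            # 0 = still unvaccinated (censored)

kmf = KaplanMeierFitter()
kmf.fit(durations=delays_days, event_observed=received, label="MCV1 delay")
print(kmf.median_survival_time_)            # median delay in days
```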

  6. Timeliness and completeness of measles vaccination among children in rural areas of Guangxi, China: A stratified three-stage cluster survey

    Directory of Open Access Journals (Sweden)

    Xianyan Tang

    2017-07-01

    Full Text Available Background: Large-scale outbreaks of measles occurred in 2013 and 2014 in rural Guangxi, a region in Southwest China with high coverage for measles-containing vaccine (MCV). This study aimed to estimate the timely vaccination coverage, the timely-and-complete vaccination coverage, and the median delay period for MCV among children aged 18–54 months in rural Guangxi. Methods: Based on quartiles of measles incidence during 2011–2013, a stratified three-stage cluster survey was conducted from June through August 2015. Using weighted estimation and finite population correction, vaccination coverage and 95% confidence intervals (CIs) were calculated. Weighted Kaplan–Meier analyses were used to estimate the median delay periods for the first (MCV1) and second (MCV2) doses of the vaccine. Results: A total of 1216 children were surveyed. The timely vaccination coverage rate was 58.4% (95% CI, 54.9%–62.0%) for MCV1, and 76.9% (95% CI, 73.6%–80.0%) for MCV2. The timely-and-complete vaccination coverage rate was 47.4% (95% CI, 44.0%–51.0%). The median delay period was 32 (95% CI, 27–38) days for MCV1, and 159 (95% CI, 118–195) days for MCV2. Conclusions: The timeliness and completeness of measles vaccination was low, and the median delay period was long among children in rural Guangxi. Incorporating the timeliness and completeness into official routine vaccination coverage statistics may help appraise the coverage of vaccination in China.

  7. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in the last years which do not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but which also enable the interpretability of these operations by assigning explicit relevance weights...

  8. An email-based intervention to improve the number and timeliness of letters sent from the hospital outpatient clinic to the general practitioner : A pair-randomized controlled trial

    NARCIS (Netherlands)

    Medlock, Stephanie; Parlevliet, Juliette L.; Sent, Danielle; Eslami, Saeid; Askari, Marjan; Arts, Derk L.; Hoekstra, Joost B.; de Rooij, Sophia E.; Abu-Hanna, Ameen

    2017-01-01

    Objective: Letters from the hospital to the general practitioner are important for maintaining continuity of care. Although doctors feel letters are important, they are often not written on time. To improve the number and timeliness of letters sent from the hospital outpatient department to the

  9. An email-based intervention to improve the number and timeliness of letters sent from the hospital outpatient clinic to the general practitioner: A pair-randomized controlled trial

    NARCIS (Netherlands)

    Medlock, Stephanie; Parlevliet, Juliette L.; Sent, Danielle; Eslami, Saeid; Askari, Marjan; Arts, Derk L.; Hoekstra, Joost B.; de Rooij, Sophia E.; Abu-Hanna, Ameen

    2017-01-01

    Letters from the hospital to the general practitioner are important for maintaining continuity of care. Although doctors feel letters are important, they are often not written on time. To improve the number and timeliness of letters sent from the hospital outpatient department to the general

  10. Overlay accuracy fundamentals

    Science.gov (United States)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly based on random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to DBO (1st order diffraction based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than the sensitivity of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of measurement quality metric, results in optimal overlay accuracy.

  11. Improving shuffler assay accuracy

    International Nuclear Information System (INIS)

    Rinard, P.M.

    1995-01-01

    Drums of uranium waste should be disposed of in an economical and environmentally sound manner. The most accurate possible assays of the uranium masses in the drums are required for proper disposal. The accuracies of assays from a shuffler are affected by the type of matrix material in the drums. Non-hydrogenous matrices have little effect on neutron transport and accuracies are very good. If self-shielding is known to be a minor problem, good accuracies are also obtained with hydrogenous matrices when a polyethylene sleeve is placed around the drums. But for those cases where self-shielding may be a problem, matrices are hydrogenous, and uranium distributions are non-uniform throughout the drums, the accuracies are degraded. They can be greatly improved by determining the distributions of the uranium and then applying correction factors based on the distributions. This paper describes a technique for determining uranium distributions by using the neutron count rates in detector banks around the waste drum and solving a set of overdetermined linear equations. Other approaches were studied to determine the distributions and are described briefly. Implementation of this correction is anticipated on an existing shuffler next year
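    The distribution correction described here amounts to solving an overdetermined linear system: the count rate in each detector bank is modelled as a weighted sum of the unknown masses in a few drum sub-volumes. The least-squares sketch below illustrates that step only; the response matrix and count rates are invented, not taken from the shuffler.

```python
# Least-squares recovery of a source distribution from detector-bank count
# rates.  Response matrix and count rates are invented for illustration.
import numpy as np

# response[i, j] = count rate in detector bank i per unit mass in sub-volume j
response = np.array([
    [0.9, 0.4, 0.1],
    [0.5, 0.8, 0.5],
    [0.1, 0.4, 0.9],
    [0.6, 0.6, 0.2],   # four banks, three sub-volumes -> overdetermined
])
count_rates = np.array([12.0, 18.0, 14.0, 13.0])

masses, residuals, rank, _ = np.linalg.lstsq(response, count_rates, rcond=None)
print("estimated mass distribution:", masses)
```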

  12. Making Deferred Taxes Relevant

    NARCIS (Netherlands)

    Brouwer, Arjan; Naarding, Ewout

    2018-01-01

    We analyse the conceptual problems in current accounting for deferred taxes and provide solutions derived from the literature in order to make International Financial Reporting Standards (IFRS) deferred tax numbers value-relevant. In our view, the empirical results concerning the value relevance of

  13. Parsimonious relevance models

    NARCIS (Netherlands)

    Meij, E.; Weerkamp, W.; Balog, K.; de Rijke, M.; Myang, S.-H.; Oard, D.W.; Sebastiani, F.; Chua, T.-S.; Leong, M.-K.

    2008-01-01

    We describe a method for applying parsimonious language models to re-estimate the term probabilities assigned by relevance models. We apply our method to six topic sets from test collections in five different genres. Our parsimonious relevance models (i) improve retrieval effectiveness in terms of

  14. Right care, right place, right time: improving the timeliness of health care in New South Wales through a public-private hospital partnership.

    Science.gov (United States)

    Saunders, Carla; Carter, David J

    2017-10-01

    Objective The overall aim of the study was to investigate and assess the feasibility of improving the timeliness of public hospital care through a New South Wales (NSW)-wide public-private hospital partnership. Methods The study reviewed the academic and professional grey literature, and undertook exploratory analyses of secondary data acquired from two national health data repositories informing in-patient access and utilisation across NSW public and private hospitals. Results In 2014-15, the NSW public hospital system was unable to deliver care within the medically recommended time frame for over 27400 people who were awaiting elective surgery. Available information indicates that the annual commissioning of 15% of public in-patient rehabilitation bed days to the private hospital system would potentially free up enough capacity in the NSW public hospital system to enable elective surgery for all public patients within recommended time frames. Conclusions The findings of the study justify a strategic whole-of-health system approach to reducing public patient wait times in NSW and highlight the need for research efforts aimed at securing a better understanding of available hospital capacity across the public and private hospital systems, and identifying and testing workable models that improve the timeliness of public hospital care. What is known about the topic? There are very few studies available to inform public-private hospital service partnerships and the opportunities available to improve timely health care access through such partnerships. What does this paper add? This paper has the potential to open and prompt timely discussion and debate, and generate further fundamental investigation, on public-private hospital service partnerships in Australia where opportunity is available to address elective surgery wait times in a reliable and effective manner. What are the implications for practitioners? The NSW Ministry of Health and its Local Health Districts

  15. Geoid undulation accuracy

    Science.gov (United States)

    Rapp, Richard H.

    1993-01-01

    The determination of the geoid, an equipotential surface of the Earth's gravity field, has long been of interest to geodesists and oceanographers. The geoid provides a surface to which the actual ocean surface can be compared, with the differences implying information on the circulation patterns of the oceans. For use in oceanographic applications the geoid is ideally needed to a high accuracy and to a high resolution. There are applications that require geoid undulation information to an accuracy of +/- 10 cm with a resolution of 50 km. We are far from this goal today but substantial improvement in geoid determination has been made. In 1979 the cumulative geoid undulation error to spherical harmonic degree 20 was +/- 1.4 m for the GEM10 potential coefficient model. Today the corresponding value has been reduced to +/- 25 cm for GEM-T3 or +/- 11 cm for the OSU91A model. Similar improvements are noted by harmonic degree (wave-length) and in resolution. Potential coefficient models now exist to degree 360 based on a combination of data types. This paper discusses the accuracy changes that have taken place in the past 12 years in the determination of geoid undulations.

  16. Culturally Relevant Cyberbullying Prevention

    OpenAIRE

    Phillips, Gregory John

    2017-01-01

    In this action research study, I, along with a student intervention committee of 14 members, developed a cyberbullying intervention for a large urban high school on the west coast. This high school contained a predominantly African American student population. I aimed to discover culturally relevant cyberbullying prevention strategies for African American students. The intervention committee selected video safety messages featuring African American actors as the most culturally relevant cyber...

  17. Using novel computer-assisted linguistic analysis techniques to assess the timeliness and impact of FP7 Health’s research – a work in progress report

    Energy Technology Data Exchange (ETDEWEB)

    Stanciauskas, V.; Brozaitis, H.; Manola, N.; Metaxas, O.; Galsworthy, M.

    2016-07-01

    This paper presents the ongoing developments of the ex-post evaluation of the Health theme in FP7, which will be finalised in early 2017. The evaluation was launched by DG Research and Innovation, European Commission. Among other questions, the evaluation asked to assess the structuring effect of FP7 Health on the European Research Area and the timeliness of the research performed. To this end the evaluation team has applied two innovative computer-assisted linguistic analysis techniques to address these questions: dynamic topic modelling and network analysis of co-publications. The topic model built for this evaluation contributed to a comprehensive mapping of FP7 Health's research activities and the building of a dynamic topic model that had not been attempted in previous evaluations of the Framework Programmes. The network analysis of co-publications proved to be a powerful tool for determining the structuring effect of FP7 Health to a level of detail that was likewise not achieved in previous evaluations of EU-funded research programmes. (Author)
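    Network analysis of co-publications typically builds a graph whose nodes are organisations and whose edges link organisations appearing on the same publication, from which connectivity measures can be read off. The sketch below is a toy illustration of that idea, not the evaluation's pipeline or data.

```python
# Toy co-publication network; nodes are organisations, edges mean at least
# one joint publication.  Not the evaluation's actual data or code.
import itertools
import networkx as nx

publications = [
    ["Org_A", "Org_B", "Org_C"],
    ["Org_B", "Org_D"],
    ["Org_A", "Org_D"],
]

G = nx.Graph()
for authors in publications:
    for u, v in itertools.combinations(authors, 2):
        if G.has_edge(u, v):
            G[u][v]["weight"] += 1      # repeated collaborations add weight
        else:
            G.add_edge(u, v, weight=1)

print("density:", nx.density(G))
print("degree per organisation:", dict(G.degree()))
```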

  18. A pre-admission program for underrepresented minority and disadvantaged students: application, acceptance, graduation rates and timeliness of graduating from medical school.

    Science.gov (United States)

    Strayhorn, G

    2000-04-01

    To determine whether students' performances in a pre-admission program predicted whether participants would (1) apply to medical school, (2) get accepted, and (3) graduate. Using prospectively collected data from participants in the University of North Carolina at Chapel Hill's Medical Education Development Program (MEDP) and data from the Association of American Colleges Student and Applicant Information Management System, the author identified 371 underrepresented minority (URM) students who were full-time participants and completed the program between 1984 and 1989, prior to their acceptance into medical school. Logistic regression analysis was used to determine whether MEDP performance significantly predicted (after statistically controlling for traditional predictors of these outcomes) the proportions of URM participants who applied to medical school and were accepted, the timeliness of graduating, and the proportion graduating. Odds ratios with 95% confidence intervals were calculated to determine the associations between the independent and outcome variables. In separate logistic regression models, MEDP performance predicted the study's outcomes after statistically controlling for traditional predictors with 95% confidence intervals. Pre-admission programs with similar outcomes can improve the diversity of the physician workforce and the access to health care for underrepresented minority and economically disadvantaged populations.

  19. The impact of an early-morning radiologist work shift on the timeliness of communicating urgent imaging findings on portable chest radiography.

    Science.gov (United States)

    Kaewlai, Rathachai; Greene, Reginald E; Asrani, Ashwin V; Abujudeh, Hani H

    2010-09-01

    The aim of this study was to assess the potential impact of staggered radiologist work shifts on the timeliness of communicating urgent imaging findings that are detected on portable overnight chest radiography of hospitalized patients. The authors conducted a retrospective study that compared the interval between the acquisition and communication of urgent findings on portable overnight critical care chest radiography detected by an early-morning shift for radiologists (3 am to 11 am) with historical experience with a standard daytime shift (8 am to 5 pm) in the detection and communication of urgent findings in a similar patient population a year earlier. During a 4-month period, 6,448 portable chest radiographic studies were interpreted on the early-morning radiologist shift. Urgent findings requiring immediate communication were detected in 308 (4.8%) studies. The early-morning shift of radiologists, on average, communicated these findings 2 hours earlier than the historical control group, improving the timeliness of communicating urgent findings on portable overnight chest radiography of hospitalized patients. Published by Elsevier Inc.

  20. Assessing the impact of the introduction of an electronic hospital discharge system on the completeness and timeliness of discharge communication: a before and after study.

    Science.gov (United States)

    Mehta, Rajnikant L; Baxendale, Bryn; Roth, Katie; Caswell, Victoria; Le Jeune, Ivan; Hawkins, Jack; Zedan, Haya; Avery, Anthony J

    2017-09-05

    Hospital discharge summaries are a key communication tool ensuring continuity of care between primary and secondary care. Incomplete or untimely communication of information increases risk of hospital readmission and associated complications. The aim of this study was to evaluate whether the introduction of a new electronic discharge system (NewEDS) was associated with improvements in the completeness and timeliness of discharge information, in Nottingham University Hospitals NHS Trust, England. A before and after longitudinal study design was used. Data were collected using the gold standard auditing tool from the Royal College of Physicians (RCP). This tool contains a checklist of 57 items grouped into seven categories, 28 of which are classified as mandatory by RCP. Percentage completeness (out of the 28 mandatory items) was considered to be the primary outcome measure. Data from 773 patients discharged directly from the acute medical unit over eight-week long time periods (four before and four after the change to the NewEDS) from August 2010 to May 2012 were extracted and evaluated. Results were summarised by effect size on completeness before and after changeover to NewEDS respectively. The primary outcome variable was represented with percentage of completeness score and a non-parametric technique was used to compare pre-NewEDS and post-NewEDS scores. The changeover to the NewEDS resulted in an increased completeness of discharge summaries from 60.7% to 75.0%, an improvement in discharge communication.

  1. Ontology: ambiguity and accuracy

    Directory of Open Access Journals (Sweden)

    Marcelo Schiessl

    2012-08-01

    Full Text Available Ambiguity is a major obstacle to information retrieval and the source of much research in Information Science. Ontologies have been studied as a means of resolving problems related to ambiguity. Paradoxically, the term "ontology" is itself ambiguous and is understood differently according to the community using it. Philosophy and Computer Science appear to differ most sharply in their senses of the term: the former holds an undisputed tradition and authority, while the latter, despite being quite recent, holds an informal but pragmatic sense. Information Science ranges from philosophical to computational approaches in order to build organized collections that balance users' needs against the available information. The semantic web requires automation of the information cycle and demands studies related to ontologies. Consequently, revisiting the relevant approaches to the study of ontologies plays an important role in providing useful ideas to researchers while maintaining philosophical rigour and the convenience provided by computers.

  2. Relevant test set using feature selection algorithm for early detection ...

    African Journals Online (AJOL)

    The objective of feature selection is to find the most relevant features for classification. Thus, the dimensionality of the information will be reduced, which may improve classification accuracy. This paper proposes a minimum set of relevant questions that can be used for early detection of dyslexia. In this research, we ...
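    Feature selection of the kind described in this record scores each candidate feature (here, questionnaire items) against the class label and keeps the top-scoring subset. The sketch below uses synthetic data and a mutual-information scorer purely as an illustration; it is not the authors' method or data.

```python
# Illustrative feature selection: keep the k most relevant features by
# mutual information with the class label.  Synthetic data only.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                           random_state=0)

selector = SelectKBest(score_func=mutual_info_classif, k=5)
X_reduced = selector.fit_transform(X, y)

print("selected feature indices:", selector.get_support(indices=True))
print("reduced shape:", X_reduced.shape)
```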

  3. Evaluation of dynamic message signs and their potential impact on traffic flow : [research summary].

    Science.gov (United States)

    2013-04-01

    The objective of this research was to understand the potential impact of DMS messages on traffic flow and evaluate their accuracy, timeliness, relevance and usefulness. Additionally, Bluetooth sensors were used to track and analyze the diversion ...

  4. The Limits to Relevance

    Science.gov (United States)

    Averill, M.; Briggle, A.

    2006-12-01

    Science policy and knowledge production lately have taken a pragmatic turn. Funding agencies increasingly are requiring scientists to explain the relevance of their work to society. This stems in part from mounting critiques of the "linear model" of knowledge production in which scientists operating according to their own interests or disciplinary standards are presumed to automatically produce knowledge that is of relevance outside of their narrow communities. Many contend that funded scientific research should be linked more directly to societal goals, which implies a shift in the kind of research that will be funded. While both authors support the concept of useful science, we question the exact meaning of "relevance" and the wisdom of allowing it to control research agendas. We hope to contribute to the conversation by thinking more critically about the meaning and limits of the term "relevance" and the trade-offs implicit in a narrow utilitarian approach. The paper will consider which interests tend to be privileged by an emphasis on relevance and address issues such as whose goals ought to be pursued and why, and who gets to decide. We will consider how relevance, narrowly construed, may actually limit the ultimate utility of scientific research. The paper also will reflect on the worthiness of research goals themselves and their relationship to a broader view of what it means to be human and to live in society. Just as there is more to being human than the pragmatic demands of daily life, there is more at issue with knowledge production than finding the most efficient ways to satisfy consumer preferences or fix near-term policy problems. We will conclude by calling for a balanced approach to funding research that addresses society's most pressing needs but also supports innovative research with less immediately apparent application.

  5. Relevant Subspace Clustering

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2009-01-01

    Subspace clustering aims at detecting clusters in any subspace projection of a high dimensional space. As the number of possible subspace projections is exponential in the number of dimensions, the result is often tremendously large. Recent approaches fail to reduce results to relevant subspace...... clusters. Their results are typically highly redundant, i.e. many clusters are detected multiple times in several projections. In this work, we propose a novel model for relevant subspace clustering (RESCU). We present a global optimization which detects the most interesting non-redundant subspace clusters...... achieves top clustering quality while competing approaches show greatly varying performance....

  6. Is Information Still Relevant?

    Science.gov (United States)

    Ma, Lia

    2013-01-01

    Introduction: The term "information" in information science does not share the characteristics of those of a nomenclature: it does not bear a generally accepted definition and it does not serve as the bases and assumptions for research studies. As the data deluge has arrived, is the concept of information still relevant for information…

  7. Impact of investigations in general practice on timeliness of referral for patients subsequently diagnosed with cancer: analysis of national primary care audit data.

    Science.gov (United States)

    Rubin, G P; Saunders, C L; Abel, G A; McPhail, S; Lyratzopoulos, G; Neal, R D

    2015-02-17

    For patients with symptoms of possible cancer who do not fulfil the criteria for urgent referral, initial investigation in primary care has been advocated in the United Kingdom and supported by additional resources. The consequence of this strategy for the timeliness of diagnosis is unknown. We analysed data from the English National Audit of Cancer Diagnosis in Primary Care on patients with lung (1494), colorectal (2111), stomach (246), oesophagus (513), pancreas (327), and ovarian (345) cancer relating to the ordering of investigations by the General Practitioner and their nature. Presenting symptoms were categorised according to National Institute for Health and Care Excellence (NICE) guidance on referral for suspected cancer. We used linear regression to estimate the mean difference in primary-care interval by cancer, after adjustment for age, gender, and the symptomatic presentation category. Primary-care investigations were undertaken in 3198/5036 (64%) of cases. The median primary-care interval was 16 days (IQR 5-45) for patients undergoing investigation and 0 days (IQR 0-10) for those not investigated. Among patients whose symptoms mandated urgent referral to secondary care according to NICE guidelines, between 37% (oesophagus) and 75% (pancreas) were first investigated in primary care. In multivariable linear regression analyses stratified by cancer site, adjustment for age, sex, and NICE referral category explained little of the observed prolongation associated with investigation. For six specified cancers, investigation in primary care was associated with later referral for specialist assessment. This effect was independent of the nature of symptoms. Some patients for whom urgent referral is mandated by NICE guidance are nevertheless investigated before referral. Reducing the intervals between test order, test performance, and reporting can help reduce the prolongation of primary-care intervals associated with investigation use. Alternative models of

  8. Clinical Relevance of Adipokines

    Directory of Open Access Journals (Sweden)

    Matthias Blüher

    2012-10-01

    Full Text Available The incidence of obesity has increased dramatically during recent decades. Obesity increases the risk for metabolic and cardiovascular diseases and may therefore contribute to premature death. With increasing fat mass, secretion of adipose tissue derived bioactive molecules (adipokines changes towards a pro-inflammatory, diabetogenic and atherogenic pattern. Adipokines are involved in the regulation of appetite and satiety, energy expenditure, activity, endothelial function, hemostasis, blood pressure, insulin sensitivity, energy metabolism in insulin sensitive tissues, adipogenesis, fat distribution and insulin secretion in pancreatic β-cells. Therefore, adipokines are clinically relevant as biomarkers for fat distribution, adipose tissue function, liver fat content, insulin sensitivity, chronic inflammation and have the potential for future pharmacological treatment strategies for obesity and its related diseases. This review focuses on the clinical relevance of selected adipokines as markers or predictors of obesity related diseases and as potential therapeutic tools or targets in metabolic and cardiovascular diseases.

  9. Information Needs/Relevance

    OpenAIRE

    Wildemuth, Barbara M.

    2009-01-01

    A user's interaction with a DL is often initiated as the result of the user experiencing an information need of some kind. Aspects of that experience and how it might affect the user's interactions with the DL are discussed in this module. In addition, users continuously make decisions about and evaluations of the materials retrieved from a DL, relative to their information needs. Relevance judgments, and their relationship to the user's information needs, are discussed in this module. Draft

  10. Meditation experience predicts introspective accuracy.

    Directory of Open Access Journals (Sweden)

    Kieran C R Fox

    Full Text Available The accuracy of subjective reports, especially those involving introspection of one's own internal processes, remains unclear, and research has demonstrated large individual differences in introspective accuracy. It has been hypothesized that introspective accuracy may be heightened in persons who engage in meditation practices, due to the highly introspective nature of such practices. We undertook a preliminary exploration of this hypothesis, examining introspective accuracy in a cross-section of meditation practitioners (1-15,000 hrs experience). Introspective accuracy was assessed by comparing subjective reports of tactile sensitivity for each of 20 body regions during a 'body-scanning' meditation with averaged, objective measures of tactile sensitivity (mean size of body representation area in primary somatosensory cortex; two-point discrimination threshold) as reported in prior research. Expert meditators showed significantly better introspective accuracy than novices; overall meditation experience also significantly predicted individual introspective accuracy. These results suggest that long-term meditators provide more accurate introspective reports than novices.

  11. [Relevant public health enteropathogens].

    Science.gov (United States)

    Riveros, Maribel; Ochoa, Theresa J

    2015-01-01

    Diarrhea remains the third leading cause of death in children under five years, despite recent advances in the management and prevention of this disease. It is caused by multiple pathogens, however, the prevalence of each varies by age group, geographical area and the scenario where cases (community vs hospital) are recorded. The most relevant pathogens in public health are those associated with the highest burden of disease, severity, complications and mortality. In our country, norovirus, Campylobacter and diarrheagenic E. coli are the most prevalent pathogens at the community level in children. In this paper we review the local epidemiology and potential areas of development in five selected pathogens: rotavirus, norovirus, Shiga toxin-producing E. coli (STEC), Shigella and Salmonella. Of these, rotavirus is the most important in the pediatric population and the main agent responsible for child mortality from diarrhea. The introduction of rotavirus vaccination in Peru will have a significant impact on disease burden and mortality from diarrhea. However, surveillance studies are needed to determine the impact of vaccination and changes in the epidemiology of diarrhea in Peru following the introduction of new vaccines, as well as antibiotic resistance surveillance of clinical relevant bacteria.

  12. POC CD4 Testing Improves Linkage to HIV Care and Timeliness of ART Initiation in a Public Health Approach: A Systematic Review and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Lara Vojnov

    Full Text Available CD4 cell count is an important test in HIV programs for baseline risk assessment, monitoring of ART where viral load is not available, and, in many settings, antiretroviral therapy (ART) initiation decisions. However, access to CD4 testing is limited, in part due to the centralized conventional laboratory network. Point of care (POC) CD4 testing has the potential to address some of the challenges of centralized CD4 testing and delays in delivery of timely testing and ART initiation. We conducted a systematic review and meta-analysis to identify the extent to which POC improves linkages to HIV care and timeliness of ART initiation. We searched two databases and four conference sites between January 2005 and April 2015 for studies reporting test turnaround times, proportion of results returned, and retention associated with the use of point-of-care CD4. Random effects models were used to estimate pooled risk ratios, pooled proportions, and 95% confidence intervals. We identified 30 eligible studies, most of which were completed in Africa. Test turnaround times were reduced with the use of POC CD4. The time from HIV diagnosis to CD4 test was reduced from 10.5 days with conventional laboratory-based testing to 0.1 days with POC CD4 testing. Retention along several steps of the treatment initiation cascade was significantly higher with POC CD4 testing, notably from HIV testing to CD4 testing, receipt of results, and pre-CD4 test retention (all p<0.001). Furthermore, retention between CD4 testing and ART initiation increased with POC CD4 testing compared to conventional laboratory-based testing (p = 0.01). We also carried out a non-systematic review of the literature observing that POC CD4 increased the projected life expectancy, was cost-effective, and acceptable. POC CD4 technologies reduce the time and increase patient retention along the testing and treatment cascade compared to conventional laboratory-based testing. POC CD4 is, therefore, a useful tool
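    Random-effects pooling of risk ratios, as used in this review, is commonly done on the log scale with DerSimonian–Laird weights. The sketch below shows that calculation for hypothetical study-level risk ratios and standard errors; it is not the review's own code or data.

```python
# DerSimonian-Laird random-effects pooling of log risk ratios.
# Study inputs are hypothetical.
import numpy as np

log_rr = np.log(np.array([1.4, 1.2, 1.6]))   # per-study risk ratios
se = np.array([0.10, 0.15, 0.20])            # standard errors of log RR
v = se ** 2

# Fixed-effect weights and heterogeneity statistic Q.
w_fixed = 1 / v
y_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_rr - y_fixed) ** 2)
df = len(log_rr) - 1

# Between-study variance tau^2 (DerSimonian-Laird estimator).
tau2 = max(0.0, (Q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

# Random-effects weights and pooled estimate with 95% CI, back-transformed.
w_rand = 1 / (v + tau2)
pooled = np.sum(w_rand * log_rr) / np.sum(w_rand)
se_pooled = np.sqrt(1 / np.sum(w_rand))
ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
print("pooled RR:", np.exp(pooled), "95% CI:", ci)
```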

  13. Achieving Climate Change Absolute Accuracy in Orbit

    Science.gov (United States)

    Wielicki, Bruce A.; Young, D. F.; Mlynczak, M. G.; Thome, K. J; Leroy, S.; Corliss, J.; Anderson, J. G.; Ao, C. O.; Bantges, R.; Best, F.; hide

    2013-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission will provide a calibration laboratory in orbit for the purpose of accurately measuring and attributing climate change. CLARREO measurements establish new climate change benchmarks with high absolute radiometric accuracy and high statistical confidence across a wide range of essential climate variables. CLARREO's inherently high absolute accuracy will be verified and traceable on orbit to Système Internationale (SI) units. The benchmarks established by CLARREO will be critical for assessing changes in the Earth system and climate model predictive capabilities for decades into the future as society works to meet the challenge of optimizing strategies for mitigating and adapting to climate change. The CLARREO benchmarks are derived from measurements of the Earth's thermal infrared spectrum (5-50 micron), the spectrum of solar radiation reflected by the Earth and its atmosphere (320-2300 nm), and radio occultation refractivity from which accurate temperature profiles are derived. The mission has the ability to provide new spectral fingerprints of climate change, as well as to provide the first orbiting radiometer with accuracy sufficient to serve as the reference transfer standard for other space sensors, in essence serving as a "NIST [National Institute of Standards and Technology] in orbit." CLARREO will greatly improve the accuracy and relevance of a wide range of space-borne instruments for decadal climate change. Finally, CLARREO has developed new metrics and methods for determining the accuracy requirements of climate observations for a wide range of climate variables and uncertainty sources. These methods should be useful for improving our understanding of observing requirements for most climate change observations.

  14. Other relevant biological papers

    International Nuclear Information System (INIS)

    Shimizu, M.

    1989-01-01

    A considerable number of CRESP-relevant papers concerning deep-sea biology and radioecology have been published. It is the purpose of this study to call attention to them. They fall into three general categories. The first is papers of general interest. They are mentioned only briefly, and include text references to the global bibliography at the end of the volume. The second are papers that are not only mentioned and referenced, but for various reasons are described in abstract form. The last is a list of papers compiled by H.S.J. Roe specifically for this volume. They are listed in bibliographic form, and are also included in the global bibliography at the end of the volume

  15. Test Expectancy Affects Metacomprehension Accuracy

    Science.gov (United States)

    Thiede, Keith W.; Wiley, Jennifer; Griffin, Thomas D.

    2011-01-01

    Background: Theory suggests that the accuracy of metacognitive monitoring is affected by the cues used to judge learning. Researchers have improved monitoring accuracy by directing attention to more appropriate cues; however, this is the first study to more directly point students to more appropriate cues using instructions regarding tests and…

  16. Diagnostic accuracy of sonoelastography in different diseases

    Directory of Open Access Journals (Sweden)

    Iqra Manzoor

    2018-03-01

    Full Text Available The objective of this study was to evaluate the diagnostic accuracy of sonoelastography in patients of primary and secondary health care settings. Google scholar, PubMed, Medline, Medscape, Wikipedia and NCBI were searched in October 2017 for all original studies and review articles to identify the relevant material. Two reviewers independently selected articles for evaluation of the diagnostic accuracy of sonoelastography in different diseases based on titles and abstracts retrieved by the literature search. The accuracy of sonoelastography in different diseases was used as the index text, while B-mode sonography, micro pure imaging, surgery and histological findings were used as reference texts. Superficial lymph nodes, neck nodules, malignancy in thyroid nodules, benign and malignant cervical lymph nodes, thyroid nodules, prostate carcinoma, benign and malignant breast abnormalities, liver diseases, parotid and salivary gland masses, pancreatic masses, musculoskeletal diseases and renal disorders were target conditions. The data extracted by the two reviewers concerning selected study characteristics and results were presented in tables and figures. In total, 46 studies were found for breast masses, lymph nodes, prostate carcinoma, liver diseases, salivary and parotid gland diseases, pancreatic masses, musculoskeletal diseases and renal diseases, and the overall sensitivity of sonoelastography in diagnosing all these diseases was 83.14% while specificity was 81.41%. This literature review demonstrates that sonoelastography is characterized by high sensitivity and specificity in diagnosing different disorders of the body.

  17. The percentage of prostate-specific antigen (PSA) isoform [-2]proPSA and the Prostate Health Index improve the diagnostic accuracy for clinically relevant prostate cancer at initial and repeat biopsy compared with total PSA and percentage free PSA in men aged ≤65 years.

    Science.gov (United States)

    Boegemann, Martin; Stephan, Carsten; Cammann, Henning; Vincendeau, Sébastien; Houlgatte, Alain; Jung, Klaus; Blanchet, Jean-Sebastien; Semjonow, Axel

    2016-01-01

    To prospectively test the diagnostic accuracy of the percentage of prostate specific antigen (PSA) isoform [-2]proPSA (%p2PSA) and the Prostate Health Index (PHI), and to determine their role for discrimination between significant and insignificant prostate cancer at initial and repeat prostate biopsy in men aged ≤65 years. The diagnostic performance of %p2PSA and PHI was evaluated in a multicentre study. In all, 769 men aged ≤65 years scheduled for initial or repeat prostate biopsy were recruited in four sites based on a total PSA (t-PSA) level of 1.6-8.0 ng/mL World Health Organization (WHO) calibrated (2-10 ng/mL Hybritech-calibrated). Serum samples were measured for the concentration of t-PSA, free PSA (f-PSA) and p2PSA with Beckman Coulter immunoassays on Access-2 or DxI800 instruments. PHI was calculated as (p2PSA/f-PSA × √t-PSA). Uni- and multivariable logistic regression models and an artificial neural network (ANN) were complemented by decision curve analysis (DCA). In univariate analysis %p2PSA and PHI were the best predictors of prostate cancer detection in all patients (area under the curve [AUC] 0.72 and 0.73, respectively), at initial (AUC 0.67 and 0.69) and repeat biopsy (AUC 0.74 and 0.74). t-PSA and %f-PSA performed less accurately for all patients (AUC 0.54 and 0.62). For detection of significant prostate cancer (based on Prostate Cancer Research International Active Surveillance [PRIAS] criteria) the %p2PSA and PHI equally demonstrated best performance (AUC 0.70 and 0.73) compared with t-PSA and %f-PSA (AUC 0.54 and 0.59). In multivariate analysis PHI was added to a base model of age, prostate volume, digital rectal examination, t-PSA and %f-PSA. PHI was strongest in predicting prostate cancer in all patients, at initial and repeat biopsy and for significant prostate cancer (AUC 0.73, 0.68, 0.78 and 0.72, respectively). In DCA for all patients the ANN showed the broadest threshold probability and best net benefit. PHI as single parameter
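    The record gives the Prostate Health Index formula explicitly as PHI = (p2PSA / fPSA) × √tPSA. A one-function sketch of that calculation follows; the assay values are hypothetical, and unit conventions (p2PSA is usually reported in pg/mL, fPSA and tPSA in ng/mL) are deliberately not addressed here.

```python
import math

def phi(p2psa, f_psa, t_psa):
    """Prostate Health Index as given in the abstract: (p2PSA / fPSA) * sqrt(tPSA).

    Unit handling is left out of this sketch; inputs are assumed to already be
    in the conventional reporting units.
    """
    return (p2psa / f_psa) * math.sqrt(t_psa)

# Hypothetical assay values, for illustration only.
print(phi(p2psa=15.0, f_psa=0.8, t_psa=5.2))
```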

  18. ACCURACY DIMENSIONS IN REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    Á. Barsi

    2018-04-01

    selected, practice-oriented approaches are evaluated too; finally, widely used dimension metrics like the Root Mean Squared Error (RMSE) or the confusion matrix are discussed. The authors present data quality features of well-defined and poorly defined objects. The central part of the study is land cover mapping, describing its accuracy management model and presenting the relevance and uncertainty measures of its influencing quality dimensions. In the paper, the theory is supported by a case study in which remote sensing technology is used to support the area-based agricultural subsidies of the European Union in the Hungarian administration.
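    The two dimension metrics named in the abstract can be made concrete with a small sketch: RMSE for continuous errors and a confusion matrix with overall accuracy for categorical land cover labels. The class labels and values below are invented.

```python
import numpy as np

# Minimal sketch of the two dimension metrics named in the abstract: RMSE for
# continuous quantities (e.g., positional error) and a confusion matrix with
# overall accuracy for categorical land-cover labels. Data below are made up.

def rmse(predicted: np.ndarray, reference: np.ndarray) -> float:
    return float(np.sqrt(np.mean((predicted - reference) ** 2)))

def confusion_matrix(predicted: np.ndarray, reference: np.ndarray, n_classes: int) -> np.ndarray:
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for p, r in zip(predicted, reference):
        cm[r, p] += 1                      # rows: reference, columns: predicted
    return cm

pred = np.array([0, 1, 1, 2, 2, 0, 1])     # hypothetical land-cover classes
ref = np.array([0, 1, 2, 2, 2, 0, 0])
cm = confusion_matrix(pred, ref, n_classes=3)
overall_accuracy = np.trace(cm) / cm.sum()
print(cm, overall_accuracy)
print(rmse(np.array([1.2, 0.8, 1.1]), np.array([1.0, 1.0, 1.0])))
```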

  19. Accuracy Dimensions in Remote Sensing

    Science.gov (United States)

    Barsi, Á.; Kugler, Zs.; László, I.; Szabó, Gy.; Abdulmutalib, H. M.

    2018-04-01

    -oriented approaches are evaluated too; finally, widely used dimension metrics like the Root Mean Squared Error (RMSE) or the confusion matrix are discussed. The authors present data quality features of well-defined and poorly defined objects. The central part of the study is land cover mapping, describing its accuracy management model and presenting the relevance and uncertainty measures of its influencing quality dimensions. In the paper, the theory is supported by a case study in which remote sensing technology is used to support the area-based agricultural subsidies of the European Union in the Hungarian administration.

  20. User perspectives on relevance criteria

    DEFF Research Database (Denmark)

    Maglaughlin, Kelly L.; Sonnenwald, Diane H.

    2002-01-01

    This study investigates the use of criteria to assess relevant, partially relevant, and not-relevant documents. Study participants identified passages within 20 document representations that they used to make relevance judgments; judged each document representation as a whole to be relevant, partially relevant, or not relevant to their information need; and explained their decisions in an interview. Analysis revealed 29 criteria, discussed positively and negatively, that were used by the participants when selecting passages that contributed or detracted from a document's relevance ... matter, thought catalyst), full text (e.g., audience, novelty, type, possible content, utility), journal/publisher (e.g., novelty, main focus, perceived quality), and personal (e.g., competition, time requirements). Results further indicate that multiple criteria are used when making relevant, partially ...

  1. Test expectancy affects metacomprehension accuracy.

    Science.gov (United States)

    Thiede, Keith W; Wiley, Jennifer; Griffin, Thomas D

    2011-06-01

    Theory suggests that the accuracy of metacognitive monitoring is affected by the cues used to judge learning. Researchers have improved monitoring accuracy by directing attention to more appropriate cues; however, this is the first study to more directly point students to more appropriate cues using instructions regarding tests and practice tests. The purpose of the present study was to examine whether the accuracy of metacognitive monitoring was affected by the nature of the test expected. Students (N = 59) were randomly assigned to one of two test expectancy groups (memory vs. inference). Then, after reading texts and judging their learning, they completed both memory and inference tests. Test performance and monitoring accuracy were superior when students received the kind of test they had been led to expect rather than the unexpected test. Tests influence students' perceptions of what constitutes learning. Our findings suggest that this could affect how students prepare for tests and how they monitor their own learning. ©2010 The British Psychological Society.

  2. Forecast Accuracy Uncertainty and Momentum

    OpenAIRE

    Bing Han; Dong Hong; Mitch Warachka

    2009-01-01

    We demonstrate that stock price momentum and earnings momentum can result from uncertainty surrounding the accuracy of cash flow forecasts. Our model has multiple information sources issuing cash flow forecasts for a stock. The investor combines these forecasts into an aggregate cash flow estimate that has minimal mean-squared forecast error. This aggregate estimate weights each cash flow forecast by the estimated accuracy of its issuer, which is obtained from their past forecast errors. Mome...
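    A minimal sketch of the aggregation idea described above, assuming inverse mean-squared-error weighting of issuers; the forecasts and past errors are invented, and the weighting scheme is an illustration rather than the paper's exact model.

```python
import numpy as np

# Minimal sketch: combine several cash flow forecasts into one estimate by
# weighting each issuer with the inverse of its past mean-squared forecast
# error, so more accurate issuers get more weight. Numbers are illustrative.

def aggregate_forecasts(forecasts: np.ndarray, past_errors: np.ndarray) -> float:
    """forecasts: current forecasts, one per issuer.
    past_errors: matrix of past forecast errors, one row per issuer."""
    mse = np.mean(past_errors ** 2, axis=1)        # estimated (in)accuracy per issuer
    weights = (1.0 / mse) / np.sum(1.0 / mse)      # inverse-MSE weights, summing to 1
    return float(np.dot(weights, forecasts))

forecasts = np.array([2.1, 1.8, 2.5])              # current cash flow forecasts
past_errors = np.array([[0.10, -0.20, 0.15],       # issuer 1: small past errors
                        [0.40, 0.50, -0.60],       # issuer 2: large past errors
                        [0.20, -0.10, 0.25]])
print(aggregate_forecasts(forecasts, past_errors))
```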

  3. Social Power Increases Interoceptive Accuracy

    Directory of Open Access Journals (Sweden)

    Mehrad Moeini-Jazani

    2017-08-01

    Full Text Available Building on recent psychological research showing that power increases self-focused attention, we propose that having power increases accuracy in perception of bodily signals, a phenomenon known as interoceptive accuracy. Consistent with our proposition, participants in a high-power experimental condition outperformed those in the control and low-power conditions in the Schandry heartbeat-detection task. We demonstrate that the effect of power on interoceptive accuracy is not explained by participants’ physiological arousal, affective state, or general intention for accuracy. Rather, consistent with our reasoning that experiencing power shifts attentional resources inward, we show that the effect of power on interoceptive accuracy is dependent on individuals’ chronic tendency to focus on their internal sensations. Moreover, we demonstrate that individuals’ chronic sense of power also predicts interoceptive accuracy similar to, and independent of, how their situationally induced feeling of power does. We therefore provide further support on the relation between power and enhanced perception of bodily signals. Our findings offer a novel perspective–a psychophysiological account–on how power might affect judgments and behavior. We highlight and discuss some of these intriguing possibilities for future research.
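    The Schandry heartbeat-detection task mentioned above is commonly scored by comparing counted with recorded heartbeats; the sketch below implements one widely used scoring formula as an illustration. It is not taken from this paper, and the interval counts are invented.

```python
import numpy as np

# Minimal sketch of a commonly used heartbeat-counting accuracy score for the
# Schandry task (not taken from this paper): for each interval, compare the
# number of heartbeats the participant counted with the number actually
# recorded, and average 1 - |recorded - counted| / recorded across intervals.

def interoceptive_accuracy(recorded: np.ndarray, counted: np.ndarray) -> float:
    return float(np.mean(1.0 - np.abs(recorded - counted) / recorded))

recorded = np.array([32, 45, 58])   # heartbeats measured in three intervals
counted = np.array([30, 40, 50])    # heartbeats reported by the participant
print(interoceptive_accuracy(recorded, counted))   # 1.0 would be perfect accuracy
```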

  4. What do we mean by accuracy in geomagnetic measurements?

    Science.gov (United States)

    Green, A.W.

    1990-01-01

    High accuracy is what distinguishes measurements made at the world's magnetic observatories from other types of geomagnetic measurements. High accuracy in determining the absolute values of the components of the Earth's magnetic field is essential to studying geomagnetic secular variation and processes at the core-mantle boundary, as well as some magnetospheric processes. In some applications of geomagnetic data, precision (or resolution) of measurements may also be important. In addition to accuracy and resolution in the amplitude domain, it is necessary to consider these same quantities in the frequency and space domains. New developments in geomagnetic instruments and communications make real-time, high accuracy, global geomagnetic observatory data sets a real possibility. There is a growing realization in the scientific community of the unique relevance of geomagnetic observatory data to the principal contemporary problems in solid Earth and space physics. Together, these factors provide the promise of a 'renaissance' of the world's geomagnetic observatory system. © 1990.

  5. Trait Perception Accuracy and Acquaintance Within Groups: Tracking Accuracy Development.

    Science.gov (United States)

    Brown, Jill A; Bernieri, Frank

    2017-05-01

    Previous work on trait perception has evaluated accuracy at discrete stages of relationships (e.g., strangers, best friends). A relatively limited body of literature has investigated changes in accuracy as acquaintance within a dyad or group increases. Small groups of initially unacquainted individuals spent more than 30 hr participating in a wide range of activities designed to represent common interpersonal contexts (e.g., eating, traveling). We calculated how accurately each participant judged others in their group on the big five traits across three distinct points within the acquaintance process: zero acquaintance, after a getting-to-know-you conversation, and after 10 weeks of interaction and activity. Judgments of all five traits exhibited accuracy above chance levels after 10 weeks. An examination of the trait rating stability revealed that much of the revision in judgments occurred not over the course of the 10-week relationship as suspected, but between zero acquaintance and the getting-to-know-you conversation.

  6. Prognostic accuracy of electroencephalograms in preterm infants

    DEFF Research Database (Denmark)

    Fogtmann, Emilie Pi; Plomgaard, Anne Mette; Greisen, Gorm

    2017-01-01

    CONTEXT: Brain injury is common in preterm infants, and predictors of neurodevelopmental outcome are relevant. OBJECTIVE: To assess the prognostic test accuracy of the background activity of the EEG recorded as amplitude-integrated EEG (aEEG) or conventional EEG early in life in preterm infants for predicting neurodevelopmental outcome. DATA SOURCES: The Cochrane Library, PubMed, Embase, and the Cumulative Index to Nursing and Allied Health Literature. STUDY SELECTION: We included observational studies that had obtained an aEEG or EEG within 7 days of life in preterm infants and reported neurodevelopmental outcomes 1 to 10 years later. DATA EXTRACTION: Two reviewers independently performed data extraction with regard to participants, prognostic testing, and outcomes. RESULTS: Thirteen observational studies with a total of 1181 infants were included. A meta-analysis was performed based on 3 studies ...

  7. EOG feature relevance determination for microsleep detection

    Directory of Open Access Journals (Sweden)

    Golz Martin

    2017-09-01

    Full Text Available Automatic relevance determination (ARD) was applied to two-channel EOG recordings for microsleep event (MSE) recognition. The 10 s immediately before MSE and also before counterexamples of fatigued, but attentive driving were analysed. Two types of signal features were extracted: the maximum cross correlation (MaxCC) and logarithmic power spectral densities (PSD) averaged in spectral bands of 0.5 Hz width ranging between 0 and 8 Hz. Generalised learning vector quantisation (GRLVQ) was used as the ARD method to show the potential of feature reduction. This is compared to support-vector machines (SVM), in which feature reduction plays a much smaller role. Cross validation yielded mean normalised relevancies of PSD features in the range of 1.6–4.9 % and 1.9–10.4 % for horizontal and vertical EOG, respectively. MaxCC relevancies were 0.002–0.006 % and 0.002–0.06 %, respectively. This shows that PSD features of vertical EOG are indispensable, whereas MaxCC can be neglected. Mean classification accuracies were estimated at 86.6 ± 1.3 % and 92.3 ± 0.2 % for GRLVQ and SVM, respectively. GRLVQ permits objective feature reduction by inclusion of all processing stages, but is not as accurate as SVM.
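    A rough sketch of the two feature types described above (band-averaged logarithmic PSD between 0 and 8 Hz and the maximum cross correlation between the two EOG channels) is given below. The sampling rate, segment length and Welch settings are assumptions, and the random signal stands in for real EOG data.

```python
import numpy as np
from scipy.signal import welch, correlate

# Sketch under assumed sampling rate and segment length: logarithmic PSD
# averaged in 0.5 Hz wide bands between 0 and 8 Hz, plus the maximum cross
# correlation between the two EOG channels over a 10 s segment.

FS = 128  # Hz, assumed sampling rate

def log_psd_band_features(x: np.ndarray, fs: int = FS) -> np.ndarray:
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    feats = []
    for lo in np.arange(0.0, 8.0, 0.5):                    # 0.5 Hz wide bands
        band = (freqs >= lo) & (freqs < lo + 0.5)
        feats.append(np.log10(np.mean(psd[band]) + 1e-12))
    return np.array(feats)

def max_cross_correlation(x: np.ndarray, y: np.ndarray) -> float:
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    c = correlate(x, y, mode="full") / len(x)
    return float(np.max(np.abs(c)))

segment = np.random.randn(2, FS * 10)                      # stand-in for a 10 s two-channel EOG segment
features = np.concatenate([log_psd_band_features(segment[0]),
                           log_psd_band_features(segment[1]),
                           [max_cross_correlation(segment[0], segment[1])]])
print(features.shape)
```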

  8. EOG feature relevance determination for microsleep detection

    Directory of Open Access Journals (Sweden)

    Golz Martin

    2017-09-01

    Full Text Available Automatic relevance determination (ARD) was applied to two-channel EOG recordings for microsleep event (MSE) recognition. The 10 s immediately before MSE and also before counterexamples of fatigued, but attentive driving were analysed. Two types of signal features were extracted: the maximum cross correlation (MaxCC) and logarithmic power spectral densities (PSD) averaged in spectral bands of 0.5 Hz width ranging between 0 and 8 Hz. Generalised learning vector quantisation (GRLVQ) was used as the ARD method to show the potential of feature reduction. This is compared to support-vector machines (SVM), in which feature reduction plays a much smaller role. Cross validation yielded mean normalised relevancies of PSD features in the range of 1.6 - 4.9 % and 1.9 - 10.4 % for horizontal and vertical EOG, respectively. MaxCC relevancies were 0.002 - 0.006 % and 0.002 - 0.06 %, respectively. This shows that PSD features of vertical EOG are indispensable, whereas MaxCC can be neglected. Mean classification accuracies were estimated at 86.6 ± 1.3 % and 92.3 ± 0.2 % for GRLVQ and SVM, respectively. GRLVQ permits objective feature reduction by inclusion of all processing stages, but is not as accurate as SVM.

  9. Diagnostic accuracy in virtual dermatopathology

    DEFF Research Database (Denmark)

    Mooney, E.; Kempf, W.; Jemec, G.B.E.

    2012-01-01

    Background Virtual microscopy is used for teaching medical students and residents and for in-training and certification examinations in the United States. However, no existing studies compare diagnostic accuracy using virtual slides and photomicrographs. The objective of this study was to compare the diagnostic accuracy of dermatopathologists and pathologists using photomicrographs vs. digitized images, through a self-assessment examination, and to elucidate assessment of virtual dermatopathology. Methods Forty-five dermatopathologists and pathologists received a randomized combination of 15 virtual slides and photomicrographs with corresponding clinical photographs and information in a self-assessment examination format. Descriptive data analysis and comparison of groups were performed using a chi-square test. Results Diagnostic accuracy in dermatopathology using virtual dermatopathology ...

  10. Accuracy of measurement of pulmonary emphysema with computed tomography: relevant points

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Marchiori, Edson; Oliveira, Hugo

    2010-01-01

    Some technical aspects should be taken into consideration in order to guarantee the reliability of the assessment of pulmonary emphysema with lung computed tomography densitometry. Changes in lung density associated with variations in lung inspiratory and expiratory levels, computed tomography slice thickness, reconstruction algorithm and type of computed tomography apparatus make tomographic comparisons more difficult in follow-up studies of pulmonary emphysema. Nevertheless, quantitative computed tomography has replaced visual assessment, competing with pulmonary function tests as a sensitive method to measure pulmonary emphysema. The present review discusses technical variables of lung computed tomography and their influence on measurements of pulmonary emphysema. (author)

  11. Accuracy of measurement of pulmonary emphysema with computed tomography: relevant points

    Energy Technology Data Exchange (ETDEWEB)

    Hochhegger, Bruno, E-mail: brunohochhegger@googlemail.co [Hospital Moinhos de Vento, Porto Alegre, RS (Brazil); Marchiori, Edson [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Radiologia; Irion, Klaus L. [Liverpool Heart and Chest Hospital, Liverpool (United Kingdom); Oliveira, Hugo [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil)

    2010-07-15

    Some technical aspects should be taken into consideration in order to guarantee the reliability of the assessment of pulmonary emphysema with lung computed tomography densitometry. Changes in lung density associated with variations in lung inspiratory and expiratory levels, computed tomography slice thickness, reconstruction algorithm and type of computed tomography apparatus make tomographic comparisons more difficult in follow-up studies of pulmonary emphysema. Nevertheless, quantitative computed tomography has replaced visual assessment, competing with pulmonary function tests as a sensitive method to measure pulmonary emphysema. The present review discusses technical variables of lung computed tomography and their influence on measurements of pulmonary emphysema. (author)

  12. [Accuracy and relevance of CT volumetry in open ocular injuries with intraocular foreign bodies].

    Science.gov (United States)

    Maneschg, O A; Volek, E; Lohinai, Z; Resch, M D; Papp, A; Korom, C; Karlinger, K; Németh, J

    2015-04-01

    The aim of the study was to evaluate the volume of intraocular foreign bodies (IOFB) using computed tomography (CT) volumetry as a prognostic factor for clinical outcome in open ocular injuries. This study compared the volume of 11 IOFBs more than 5 mm³ in size based on CT volumetry with the real size determined by in vitro measurement. A retrospective evaluation of clinical data, visual acuity, complications and the relation of IOFB size to clinical outcome in 33 patients (mean age 41.0 ± 13.5 years) with open ocular injuries treated at our department between January 2005 and December 2010 was carried out. No significant differences were found between pairwise in vitro measurement and CT volumetric size (p = 0.07). All patients were surgically treated by pars plana vitrectomy. The mean follow-up time was 7.6 ± 6.2 months and the mean preoperative best corrected visual acuity (BCVA) was 0.063 ± 0.16 (logMAR 1.2 ± 0.79). Postoperatively, a mean BCVA of 0.25 ± 0.2 (logMAR 0.6 ± 0.69) could be achieved. Clinical outcomes were significantly better in injuries with small IOFBs. CT volumetry is an accurate method for the measurement of IOFBs. Exact data about size and volume are also an important factor for the prognosis of clinical outcome in open ocular injuries with IOFBs, and CT volumetry can also provide important information about the localization of IOFBs.

  13. Improving Accuracy and Relevance of Race/Ethnicity Data: Results of a Statewide Collaboration in Hawaii.

    Science.gov (United States)

    Pellegrin, Karen L; Miyamura, Jill B; Ma, Carolyn; Taniguchi, Ronald

    2016-01-01

    Current race/ethnicity categories established by the U.S. Office of Management and Budget are neither reliable nor valid for understanding health disparities or for tracking improvements in this area. In Hawaii, statewide hospitals have collaborated to collect race/ethnicity data using a standardized method consistent with recommended practices that overcome the problems with the federal categories. The purpose of this observational study was to determine the impact of this collaboration on key measures of race/ethnicity documentation. After this collaborative effort, the number of standardized categories available across hospitals increased from 6 to 34, and the percent of inpatients with documented race/ethnicity increased from 88 to 96%. This improved standardized methodology is now the foundation for tracking population health indicators statewide and focusing quality improvement efforts. The approach used in Hawaii can serve as a model for other states and regions. Ultimately, the ability to standardize data collection methodology across states and regions will be needed to track improvements nationally.

  14. Alpha power gates relevant information during working memory updating.

    Science.gov (United States)

    Manza, Peter; Hau, Chui Luen Vera; Leung, Hoi-Chung

    2014-04-23

    Human working memory (WM) is inherently limited, so we must filter out irrelevant information in our environment or our mind while retaining limited important relevant contents. Previous work suggests that neural oscillations in the alpha band (8-14 Hz) play an important role in inhibiting incoming distracting information during attention and selective encoding tasks. However, whether alpha power is involved in inhibiting no-longer-relevant content or in representing relevant WM content is still debated. To clarify this issue, we manipulated the amount of relevant/irrelevant information using a task requiring spatial WM updating while measuring neural oscillatory activity via EEG and localized current sources across the scalp using a surface Laplacian transform. An initial memory set of two, four, or six spatial locations was to be memorized over a delay until an updating cue was presented indicating that only one or three locations remained relevant for a subsequent recognition test. Alpha amplitude varied with memory maintenance and updating demands among a cluster of left frontocentral electrodes. Greater postcue alpha power was associated with the high relevant load conditions (six and four dots cued to reduce to three relevant) relative to the lower load conditions (four and two dots reduced to one). Across subjects, this difference in alpha power was correlated with condition differences in performance accuracy. In contrast, no significant effects of irrelevant load were observed. These findings demonstrate that, during WM updating, alpha power reflects maintenance of relevant memory contents rather than suppression of no-longer-relevant memory traces.

  15. RPAS ACCURACY TESTING FOR USING IT IN THE CADASTRE OF REAL ESTATES OF THE CZECH REPUBLIC

    Directory of Open Access Journals (Sweden)

    E. Housarová

    2016-06-01

    Full Text Available In the last few years, interest in the collection of data using remotely piloted aircraft systems (RPAS) has sharply risen. RPAS technology has a very wide area of use; its main advantages are accuracy, timeliness of data, frequency of data collection and low operating costs. In contrast with ordinary aerial photogrammetry, RPAS can be used for the mapping of small, dangerous and inaccessible areas. In the cadastre of real estates of the Czech Republic, it is possible to map areas using aerial photogrammetry, and this has been done in the past. However, this is a relatively expensive and complex technology, and therefore new alternatives are being sought. One alternative would be to use RPAS technology for data acquisition. The testing of the possibility of using RPAS for the cadastre of real estates of the Czech Republic is the subject of this paper. When evaluating the results, we compared point coordinates measured by the geodetic method, by GNSS technology and by RPAS technology.

  16. Cadastral Database Positional Accuracy Improvement

    Science.gov (United States)

    Hashim, N. M.; Omar, A. H.; Ramli, S. N. M.; Omar, K. M.; Din, N.

    2017-10-01

    Positional Accuracy Improvement (PAI) is the refining process of the geometry features in a geospatial dataset to improve their actual position. This actual position relates to the absolute position in a specific coordinate system and the relation to the neighborhood features. With the growth of spatially based technology, especially Geographical Information Systems (GIS) and Global Navigation Satellite Systems (GNSS), a PAI campaign is inevitable, especially for legacy cadastral databases. Integration of a legacy dataset with a higher-accuracy dataset such as GNSS observations is a potential solution for improving the legacy dataset. However, merely integrating both datasets will lead to a distortion of the relative geometry. The improved dataset should be further treated to minimize inherent errors and to fit the new accurate dataset. The main focus of this study is to describe a method of angular-based Least Squares Adjustment (LSA) for the PAI process of a legacy dataset. The existing high-accuracy dataset known as the National Digital Cadastral Database (NDCDB) is then used as a benchmark to validate the results. It was found that the proposed technique is highly suitable for positional accuracy improvement of legacy spatial datasets.
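    The paper applies an angular-based LSA; as a simpler illustration of the same adjustment idea, the sketch below estimates a 2D similarity (Helmert) transformation by least squares from points common to a legacy dataset and a reference such as the NDCDB, then applies it to the legacy coordinates. All coordinates are invented.

```python
import numpy as np

# Illustrative sketch only: estimate a 2D similarity (Helmert) transformation
# by least squares from points common to a legacy dataset and an accurate
# reference, then apply it to the legacy coordinates. This is a simplified
# stand-in for the angular-based LSA used in the paper; coordinates are made up.

def fit_similarity(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Solve dst ~ [a -b; b a] * src + [tx; ty] in a least-squares sense.
    Returns parameters (a, b, tx, ty)."""
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    L = dst.reshape(-1)
    A[0::2, 0], A[0::2, 1], A[0::2, 2] = src[:, 0], -src[:, 1], 1.0
    A[1::2, 0], A[1::2, 1], A[1::2, 3] = src[:, 1], src[:, 0], 1.0
    params, *_ = np.linalg.lstsq(A, L, rcond=None)
    return params

def apply_similarity(params: np.ndarray, pts: np.ndarray) -> np.ndarray:
    a, b, tx, ty = params
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([a * x - b * y + tx, b * x + a * y + ty])

legacy = np.array([[100.0, 200.0], [150.0, 220.0], [130.0, 260.0], [90.0, 250.0]])
ndcdb = legacy * 1.0002 + np.array([0.35, -0.20])    # pretend reference coordinates
params = fit_similarity(legacy, ndcdb)
print(apply_similarity(params, legacy) - ndcdb)      # residuals after adjustment
```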

  17. Classification Accuracy Is Not Enough

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2013-01-01

    A recent review of the research literature evaluating music genre recognition (MGR) systems over the past two decades shows that most works (81%) measure the capacity of a system to recognize genre by its classification accuracy. We show here, by implementing and testing three categorically...

  18. The hidden KPI registration accuracy.

    Science.gov (United States)

    Shorrosh, Paul

    2011-09-01

    Determining the registration accuracy rate is fundamental to improving revenue cycle key performance indicators. A registration quality assurance (QA) process allows errors to be corrected before bills are sent and helps registrars learn from their mistakes. Tools are available to help patient access staff who perform registration QA manually.

  19. The effect of pre-existing mental health comorbidities on the stage at diagnosis and timeliness of care of solid tumor malignances in a Veterans Affairs (VA) medical center

    International Nuclear Information System (INIS)

    Wadia, Roxanne J; Yao, Xiaopan; Deng, Yanhong; Li, Jia; Maron, Steven; Connery, Donna; Gunduz-Bruce, Handan; Rose, Michal G

    2015-01-01

    There are limited data on the impact of mental health comorbidities (MHC) on stage at diagnosis and timeliness of cancer care. Axis I MHC affect approximately 30% of Veterans receiving care within the Veterans Affairs (VA) system. The purpose of this study was to compare stage at diagnosis and timeliness of care of solid tumor malignancies among Veterans with and without MHC. We performed a retrospective analysis of 408 charts of Veterans with colorectal, urothelial, and head/neck cancer diagnosed and treated at VA Connecticut Health Care System (VACHS) between 2008 and 2011. We collected demographic data, stage at diagnosis, medical and mental health co-morbidities, treatments received, key time intervals, and number of appointments missed. The study was powered to assess for stage migration of 15–20% from Stage I/II to Stage III/IV. There was no significant change in stage distribution for patients with and without MHC in the entire study group (p = 0.9442) and in each individual tumor type. There were no significant differences in the time intervals from onset of symptoms to initiation of treatment between patients with and without MHC (p = 0.1135, 0.2042 and 0.2352, respectively). We conclude that at VACHS, stage at diagnosis for patients with colorectal, urothelial and head and neck cancers did not differ significantly between patients with and without MHC. Patients with MHC did not experience significant delays in care. Our study indicates that in a medical system in which mental health is integrated into routine care, patients with Axis I MHC do not experience delays in cancer care

  20. Accuracy in Optical Information Processing

    Science.gov (United States)

    Timucin, Dogan Aslan

    Low computational accuracy is an important obstacle for optical processors which blocks their way to becoming a practical reality and a serious challenger for classical computing paradigms. This research presents a comprehensive solution approach to the problem of accuracy enhancement in discrete analog optical information processing systems. Statistical analysis of a generic three-plane optical processor is carried out first, taking into account the effects of diffraction, interchannel crosstalk, and background radiation. Noise sources included in the analysis are photon, excitation, and emission fluctuations in the source array, transmission and polarization fluctuations in the modulator, and photoelectron, gain, dark, shot, and thermal noise in the detector array. Means and mutual coherence and probability density functions are derived for both optical and electrical output signals. Next, statistical models for a number of popular optoelectronic devices are studied. Specific devices considered here are light-emitting and laser diode sources, an ideal noiseless modulator and a Gaussian random-amplitude-transmittance modulator, p-i-n and avalanche photodiode detectors followed by electronic postprocessing, and ideal free-space geometrical-optics propagation and single-lens imaging systems. Output signal statistics are determined for various interesting device combinations by inserting these models into the general formalism. Finally, based on these special-case output statistics, results on accuracy limitations and enhancement in optical processors are presented. Here, starting with the formulation of the accuracy enhancement problem (1) as an optimal detection problem and (2) as a parameter estimation problem, the potential accuracy improvements achievable via the classical multiple-hypothesis-testing and maximum likelihood and Bayesian parameter estimation methods are demonstrated. Merits of using proper normalizing transforms which can potentially stabilize

  1. Collective animal decisions: preference conflict and decision accuracy.

    Science.gov (United States)

    Conradt, Larissa

    2013-12-06

    Social animals frequently share decisions that involve uncertainty and conflict. It has been suggested that conflict can enhance decision accuracy. In order to judge the practical relevance of such a suggestion, it is necessary to explore how general such findings are. Using a model, I examine whether conflicts between animals in a group with respect to preferences for avoiding false positives versus avoiding false negatives could, in principle, enhance the accuracy of collective decisions. I found that decision accuracy nearly always peaked when there was maximum conflict in groups in which individuals had different preferences. However, groups with no preferences were usually even more accurate. Furthermore, a relatively slight skew towards more animals with a preference for avoiding false negatives decreased the rate of expected false negatives versus false positives considerably (and vice versa), while resulting in only a small loss of decision accuracy. I conclude that in ecological situations in which decision accuracy is crucial for fitness and survival, animals cannot 'afford' preferences with respect to avoiding false positives versus false negatives. When decision accuracy is less crucial, animals might have such preferences. A slight skew in the number of animals with different preferences will result in the group avoiding the type of error that the majority of group members prefers to avoid. The model also indicated that knowing the average success rate ('base rate') of a decision option can be very misleading, and that animals should ignore such base rates unless further information is available.
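    A much simplified illustration of this kind of model (not the author's exact formulation) is sketched below: group members with different error preferences vote on whether a danger is present, the group follows the majority, and Monte Carlo simulation estimates how often the majority is correct for different preference mixes.

```python
import numpy as np

# Simplified illustration (not the author's exact model): each group member
# votes "danger present" or "absent"; members with a correct private cue vote
# the truth, and uninformed members fall back on their bias (avoid-false-
# negative members vote "present", avoid-false-positive members vote "absent").
# The group follows the majority; we estimate how often it is correct.

rng = np.random.default_rng(0)

def simulate(n_avoid_fn: int, n_avoid_fp: int, p_correct: float = 0.7, trials: int = 20000) -> float:
    n = n_avoid_fn + n_avoid_fp
    bias = np.r_[np.ones(n_avoid_fn), np.zeros(n_avoid_fp)].astype(bool)
    truth = rng.random(trials) < 0.5                 # danger present in half the trials
    correct = 0
    for t in range(trials):
        informed = rng.random(n) < p_correct         # members whose private cue is right
        votes = np.where(informed, truth[t], bias)
        group_says_present = votes.sum() > n / 2
        correct += (group_says_present == truth[t])
    return correct / trials

for split in [(10, 0), (7, 3), (5, 5), (0, 10)]:
    print(split, simulate(*split))
```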

  2. Profiles of Dialogue for Relevance

    Directory of Open Access Journals (Sweden)

    Douglas Walton

    2016-12-01

    Full Text Available This paper uses argument diagrams, argumentation schemes, and some tools from formal argumentation systems developed in artificial intelligence to build a graph-theoretic model of relevance shown to be applicable (with some extensions) as a practical method for helping a third party judge issues of relevance or irrelevance of an argument in real examples. Examples used to illustrate how the method works are drawn from disputes about relevance in natural language discourse, including a criminal trial and a parliamentary debate.

  3. Evaluation of Callable Bonds: Finite Difference Methods, Stability and Accuracy.

    OpenAIRE

    Buttler, Hans-Jurg

    1995-01-01

    The purpose of this paper is to evaluate numerically the semi-American callable bond by means of finite difference methods. This study implies three results. First, the numerical error is greater for the callable bond price than for the straight bond price, and too large for real applications. Secondly, the numerical accuracy of the callable bond price computed for the relevant range of interest rates depends entirely on the finite difference scheme which is chosen for the boundary points. Thi...

  4. High accuracy wavelength calibration for a scanning visible spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Scotti, Filippo; Bell, Ronald E. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)

    2010-10-15

    Spectroscopic applications for plasma velocity measurements often require wavelength accuracies ≤0.2 Å. An automated calibration, which is stable over time and environmental conditions without the need to recalibrate after each grating movement, was developed for a scanning spectrometer to achieve high wavelength accuracy over the visible spectrum. This method fits all relevant spectrometer parameters using multiple calibration spectra. With a stepping-motor controlled sine drive, an accuracy of ~0.25 Å has been demonstrated. With the addition of a high resolution (0.075 arc sec) optical encoder on the grating stage, greater precision (~0.005 Å) is possible, allowing absolute velocity measurements within ~0.3 km/s. This level of precision requires monitoring of atmospheric temperature and pressure and of grating bulk temperature to correct for changes in the refractive index of air and the groove density, respectively.
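    Because a sine drive makes the selected wavelength approximately a linear function of the drive position, the calibration can be illustrated by a least-squares fit of reference lamp lines against drive readings; the sketch below does exactly that. The drive positions are invented, and the lamp wavelengths serve only as example inputs.

```python
import numpy as np

# Hedged sketch of the calibration idea: with a sine drive, the diffracted
# wavelength is (to first order) linear in the drive position, so a small set
# of reference lamp lines can be fit by least squares and the residuals
# indicate the achievable wavelength accuracy. Drive positions are invented.

steps = np.array([1400.0, 10750.0, 20345.0, 30470.0, 43820.0])       # drive positions (counts)
ref_lines = np.array([4046.56, 4358.33, 4678.15, 5015.68, 5460.74])  # reference wavelengths (Angstrom)

design = np.column_stack([np.ones_like(steps), steps])               # lambda = c0 + c1 * steps
coeffs, *_ = np.linalg.lstsq(design, ref_lines, rcond=None)
fitted = design @ coeffs
residuals = ref_lines - fitted
print("coefficients:", coeffs)
print("rms residual (Angstrom): %.3f" % np.sqrt(np.mean(residuals ** 2)))
```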

  5. Relevance theory: pragmatics and cognition.

    Science.gov (United States)

    Wearing, Catherine J

    2015-01-01

    Relevance Theory is a cognitively oriented theory of pragmatics, i.e., a theory of language use. It builds on the seminal work of H.P. Grice(1) to develop a pragmatic theory which is at once philosophically sensitive and empirically plausible (in both psychological and evolutionary terms). This entry reviews the central commitments and chief contributions of Relevance Theory, including its Gricean commitment to the centrality of intention-reading and inference in communication; the cognitively grounded notion of relevance which provides the mechanism for explaining pragmatic interpretation as an intention-driven, inferential process; and several key applications of the theory (lexical pragmatics, metaphor and irony, procedural meaning). Relevance Theory is an important contribution to our understanding of the pragmatics of communication. © 2014 John Wiley & Sons, Ltd.

  6. Clinical relevance in anesthesia journals

    DEFF Research Database (Denmark)

    Lauritsen, Jakob; Møller, Ann M

    2006-01-01

    The purpose of this review is to present the latest knowledge and research on the definition and distribution of clinically relevant articles in anesthesia journals. It will also discuss the importance of the chosen methodology and outcome of articles.

  7. Classification Accuracy Increase Using Multisensor Data Fusion

    Science.gov (United States)

    Makarau, A.; Palubinskas, G.; Reinartz, P.

    2011-09-01

    The practical use of very high resolution visible and near-infrared (VNIR) data is still growing (IKONOS, Quickbird, GeoEye-1, etc.), but for classification purposes the number of bands is limited in comparison to full spectral imaging. These limitations may lead to the confusion of materials such as different roofs, pavements, roads, etc., and therefore may result in wrong interpretation and use of classification products. Employment of hyperspectral data is another solution, but their low spatial resolution (compared to multispectral data) restricts their usage for many applications. Another improvement can be achieved by fusion of multisensor data, since this may increase the quality of scene classification. Integration of Synthetic Aperture Radar (SAR) and optical data is widely performed for automatic classification, interpretation, and change detection. In this paper we present an approach for very high resolution SAR and multispectral data fusion for automatic classification in urban areas. Single polarization TerraSAR-X (SpotLight mode) and multispectral data are integrated using the INFOFUSE framework, consisting of feature extraction (information fission), unsupervised clustering (data representation on a finite domain and dimensionality reduction), and data aggregation (Bayesian or neural network). This framework allows a relevant way of multisource data combination following consensus theory. The classification is not influenced by the limitations of dimensionality, and the calculation complexity primarily depends on the step of dimensionality reduction. Fusion of single polarization TerraSAR-X, WorldView-2 (VNIR or full set), and Digital Surface Model (DSM) data allows different types of urban objects to be classified into predefined classes of interest with increased accuracy. The comparison to classification results of WorldView-2 multispectral data (8 spectral bands) is provided and the numerical evaluation of the method in comparison to

  8. 29 CFR 1611.14 - Exemptions-Office of Inspector General Files.

    Science.gov (United States)

    2010-07-01

    ... determine relevance or necessity of information in the early stages of an investigation. The value of such... its investigations attempting to resolve questions of accuracy, relevance, timeliness and completeness. (4) From subsection (e)(1), because it is often impossible to determine relevance or necessity of...

  9. Shippingport: A relevant decommissioning project

    International Nuclear Information System (INIS)

    Crimi, F.P.

    1988-01-01

    Because of Shippingport's low electrical power rating (72 MWe), there has been some misunderstanding on the relevancy of the Shippingport Station Decommissioning Project (SSDP) to a modern 1175 MWe commercial pressurized water reactor (PWR) power station. This paper provides a comparison of the major components of the reactor plant of the 72 MWe Shippingport Atomic Power Station and an 1175 MWe nuclear plant and the relevancy of the Shippingport decommissioning as a demonstration project for the nuclear industry. For the purpose of this comparison, Portland General Electric Company's 1175 MWe Trojan Nuclear Plant at Rainier, Oregon, has been used as the reference nuclear power plant. 2 refs., 2 figs., 1 tab

  10. Serum albumin: accuracy and clinical use.

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2013-04-18

    Albumin is the major plasma protein and its determination is used for the prognostic assessment of several diseases. Clinical guidelines call for monitoring of serum albumin with specific target cut-offs that are independent of the assay used. This requires accurate and equivalent results among different commercially available methods (i.e., result standardization) through a consistent definition and application of a reference measurement system. This should be associated with the definition of measurement uncertainty goals based on medical relevance of serum albumin to make results reliable for patient management. In this paper, we show that, in the current situation, if one applies analytical goals for serum albumin measurement derived from its biologic variation, the uncertainty budget derived from each step of the albumin traceability chain is probably too high to fulfil established quality levels for albumin measurement and to guarantee the accuracy needed for clinical usefulness of the test. The situation is further worsened if non-specific colorimetric methods are used for albumin measurement as they represent an additional random source of uncertainty. Copyright © 2013 Elsevier B.V. All rights reserved.
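    The uncertainty-budget argument can be illustrated with a short sketch that combines the standard uncertainties of the traceability chain by root sum of squares and compares the result with a goal derived from biological variation. All numbers are invented placeholders, not published values.

```python
import math

# Minimal sketch of a measurement uncertainty budget for serum albumin: the
# standard uncertainties of each step of the traceability chain are combined
# by root sum of squares and compared with a goal derived from biological
# variation. All numbers are invented placeholders, not published values.

budget = {
    "primary reference material": 0.8,     # % relative standard uncertainty
    "secondary (matrix) calibrator": 1.0,
    "manufacturer master calibration": 0.9,
    "routine method imprecision": 1.5,
}

combined = math.sqrt(sum(u ** 2 for u in budget.values()))
goal = 1.3   # hypothetical desirable goal (%) derived from biological variation

print(f"combined relative uncertainty: {combined:.2f}%")
print("meets goal" if combined <= goal else "exceeds goal, so result equivalence is at risk")
```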

  11. Dramatic lives and relevant becomings

    DEFF Research Database (Denmark)

    Henriksen, Ann-Karina; Miller, Jody

    2012-01-01

    of marginality into positions of relevance. The analysis builds on empirical data from Copenhagen, Denmark, gained through ethnographic fieldwork with the participation of 20 female informants aged 13–22. The theoretical contribution proposes viewing conflicts as multi-linear, multi-causal and non...

  12. Regularization in Matrix Relevance Learning

    NARCIS (Netherlands)

    Schneider, Petra; Bunte, Kerstin; Stiekema, Han; Hammer, Barbara; Villmann, Thomas; Biehl, Michael

    In this paper, we present a regularization technique to extend recently proposed matrix learning schemes in learning vector quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can
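    For orientation, the sketch below shows the ingredients usually involved in matrix relevance learning: an adaptive squared distance with a normalized relevance matrix and a regularization term that penalizes degenerate relevance matrices. It is an illustration of the general idea, not the authors' exact cost function or update rules.

```python
import numpy as np

# Hedged sketch of matrix relevance learning ingredients: an adaptive squared
# distance d(x, w) = (x - w)^T Lambda (x - w) with Lambda = Omega^T Omega
# normalized to trace 1, and a regularization term of the form
# -mu * ln det(Omega Omega^T) that penalizes rank-collapsing relevance
# matrices. Illustration only, not the authors' exact update rules.

def normalized_lambda(omega: np.ndarray) -> np.ndarray:
    lam = omega.T @ omega
    return lam / np.trace(lam)

def relevance_distance(x: np.ndarray, w: np.ndarray, omega: np.ndarray) -> float:
    lam = normalized_lambda(omega)
    d = x - w
    return float(d @ lam @ d)

def regularization_term(omega: np.ndarray, mu: float = 0.1) -> float:
    sign, logdet = np.linalg.slogdet(omega @ omega.T)
    return -mu * logdet            # grows large when Omega Omega^T approaches singularity

rng = np.random.default_rng(1)
omega = rng.standard_normal((4, 4))         # full-rank relevance transform
x, w = rng.standard_normal(4), rng.standard_normal(4)
print(relevance_distance(x, w, omega), regularization_term(omega))
```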

  13. Unfamiliar voice identification: Effect of post-event information on accuracy and voice ratings

    Directory of Open Access Journals (Sweden)

    Harriet Mary Jessica Smith

    2014-04-01

    Full Text Available This study addressed the effect of misleading post-event information (PEI) on voice ratings, identification accuracy, and confidence, as well as the link between verbal recall and accuracy. Participants listened to a dialogue between male and female targets, then read misleading information about voice pitch. Participants engaged in verbal recall, rated voices on a feature checklist, and made a lineup decision. Accuracy rates were low, especially on target-absent lineups. Confidence and accuracy were unrelated, but the number of facts recalled about the voice predicted later lineup accuracy. There was a main effect of misinformation on ratings of target voice pitch, but there was no effect on identification accuracy or confidence ratings. As voice lineup evidence from earwitnesses is used in courts, the findings have potential applied relevance.

  14. Data accuracy assessment using enterprise architecture

    Science.gov (United States)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  15. The Improved Relevance Voxel Machine

    DEFF Research Database (Denmark)

    Ganz, Melanie; Sabuncu, Mert; Van Leemput, Koen

    The concept of sparse Bayesian learning has received much attention in the machine learning literature as a means of achieving parsimonious representations of features used in regression and classification. It is an important family of algorithms for sparse signal recovery and compressed sensing and enables basis selection from overcomplete dictionaries. One of the trailblazers of Bayesian learning is MacKay, who already worked on the topic in his PhD thesis in 1992 [1]. Later on, Tipping and Bishop developed the concept of sparse Bayesian learning [2, 3] and Tipping published the Relevance Vector ... Hence in its current form it is reminiscent of a greedy forward feature selection algorithm. In this report, we aim to solve the problems of the original RVoxM algorithm in the spirit of [7] (FastRVM). We call the new algorithm Improved Relevance Voxel Machine (IRVoxM). Our contributions ...

  16. Comparative Dose Accuracy of Durable and Patch Insulin Infusion Pumps

    Science.gov (United States)

    Jahn, Luis G.; Capurro, Jorge J.; Levy, Brian L.

    2013-01-01

    Background: As all major insulin pump manufacturers comply with the international infusion pump standard EN 60601-2-24:1998, there may be a general assumption that all pumps are equal in insulin-delivery accuracy. This research investigates single-dose and averaged-dose accuracy of incremental basal deliveries for one patch model and three durable models of insulin pumps. Method: For each pump model, discrete single doses delivered during 0.5 U/h basal rate infusion over a 20 h period were measured using a time-stamped microgravimetric system. Dose accuracy was analyzed by comparing single doses and time-averaged doses to specific accuracy thresholds (±5% to ±30%). Results: The percentage of single doses delivered outside accuracy thresholds of ±5%, ±10%, and ±20% were as follows: Animas OneTouch® Ping® (43.2%, 14.3%, and 1.8%, respectively), Roche Accu-Chek® Combo (50.6%, 24.4%, and 5.5%), Medtronic Paradigm® RevelTM/VeoTM (54.2%, 26.7%, and 6.6%), and Insulet OmniPod® (79.1%, 60.5%, and 34.9%). For 30 min, 1 h, and 2 h averaging windows, the percentage of doses delivered outside a ±15% accuracy were as follows: OneTouch Ping (1.0%, 0.4%, and 0%, respectively), Accu-Chek Combo (4.2%, 3.5%, and 3.1%), Paradigm Revel/Veo (3.9%, 3.1%, and 2.2%), and OmniPod (33.9%, 19.9%, and 10.3%). Conclusions: This technical evaluation demonstrates significant differences in single-dose and averaged-dose accuracy among the insulin pumps tested. Differences in dose accuracy were most evident between the patch pump model and the group of durable pump models. Of the pumps studied, the Animas OneTouch Ping demonstrated the best single-dose and averaged-dose accuracy. Further research on the clinical relevance of these findings is warranted. PMID:23911184
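    The two accuracy summaries used in the study (the share of single doses outside a given error threshold, and the same check on window-averaged doses) can be reproduced on simulated data with the sketch below; the nominal dose per pulse, the pulse interval and the scatter are assumptions, since the study measured real doses microgravimetrically.

```python
import numpy as np

# Minimal sketch of the two accuracy summaries used in the study: the share of
# single doses falling outside a +/-X% error threshold, and the same check on
# doses averaged over a sliding window. Doses are simulated here; the nominal
# 0.025 U per pulse assumes 3-minute pulses at a 0.5 U/h basal rate.

rng = np.random.default_rng(42)
nominal = 0.025                                   # assumed U per pulse
doses = nominal * (1 + rng.normal(0, 0.12, 400))  # simulated single doses with 12% scatter

def pct_outside(d: np.ndarray, threshold: float) -> float:
    err = (d - nominal) / nominal
    return float(np.mean(np.abs(err) > threshold) * 100)

def windowed_pct_outside(d: np.ndarray, window: int, threshold: float) -> float:
    kernel = np.ones(window) / window
    averaged = np.convolve(d, kernel, mode="valid")
    err = (averaged - nominal) / nominal
    return float(np.mean(np.abs(err) > threshold) * 100)

for thr in (0.05, 0.10, 0.20):
    print(f"single doses outside ±{thr:.0%}: {pct_outside(doses, thr):.1f}%")
# 20 pulses of 3 minutes approximate a 1 h averaging window
print(f"1 h averages outside ±15%: {windowed_pct_outside(doses, 20, 0.15):.1f}%")
```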

  17. Diagnostic accuracy of sonography for pleural effusion: systematic review

    Directory of Open Access Journals (Sweden)

    Alexandre Grimberg

    Full Text Available CONTEXT AND OBJECTIVE: The initial method for evaluating the presence of pleural effusion was chest radiography. Isolated studies have shown that sonography has greater accuracy than radiography for this diagnosis; however, no systematic reviews on this matter are available in the literature. Thus, the aim of this study was to evaluate the accuracy of sonography in detecting pleural effusion, by means of a systematic review of the literature. DESIGN AND SETTING: This was a systematic review with meta-analysis of accuracy studies. This study was conducted in the Department of Diagnostic Imaging and in the Brazilian Cochrane Center, Discipline of Emergency Medicine and Evidence-Based Medicine, Department of Medicine, Universidade Federal de São Paulo (Unifesp), São Paulo, Brazil. METHOD: The following databases were searched: Cochrane Library, Medline, Web of Science, Embase and Literatura Latino-Americana e do Caribe em Ciências da Saúde (Lilacs). The references of relevant studies were also screened for additional citations of interest. Studies in which the accuracy of sonography for detecting pleural effusion was tested, with an acceptable reference standard (computed tomography or thoracic drainage), were included. RESULTS: Four studies were included. All of them showed that sonography had high sensitivity, specificity and accuracy for detecting pleural effusions. The mean sensitivity was 93% (95% confidence interval, CI: 89% to 96%), and specificity was 96% (95% CI: 95% to 98%). CONCLUSIONS: In different populations and clinical settings, sonography showed consistently high sensitivity, specificity and accuracy for detecting fluid in the pleural space.

  18. Technical accuracy in historical writing

    International Nuclear Information System (INIS)

    Taylor, L.S.

    1981-01-01

    A guest editorial is presented on the question of accuracy in the writing of radiation protection history. The author has written several books and articles dealing with various aspects of the development of radiation protection standards and philosophy; some of his own minor errors which have been picked up and frequently repeated are confessed. The author also outlines some of the general faults he has encountered in other articles on the subject. A common complaint is that many writers give source references without checking back to the original sources which leads to much carelessness and misunderstanding in technical writing. In addition, some writers all too frequently refer mainly to review articles which can be especially troublesome if the review is of the interpretative type. The limited outlook of some writers is also deplored in that the scope of the literature referred to is often limited to the author's country. A few glaring examples of factual errors encountered in various radiation protection articles are highlighted; these errors have since been repeated in subsequent review articles. (U.K.)

  19. Mammography: Technique and diagnostic accuracy

    International Nuclear Information System (INIS)

    Kim, Chung Ja; Bahk, Yong Whee; Lee, Don Young

    1974-01-01

    Mammography is now in worldwide use, but it has received rather scanty attention in Korea. The purposes of the present communication are twofold: (1) detailing of the technical and photographic aspects of mammography, and (2) an assessment of its diagnostic accuracy as experienced by us. The clinical material consisted of 88 cases of mammography performed at the Department of Radiology, St. Mary's Hospital, Catholic Medical College during the 2-year period from April 1972. We used nonscreen-type mammographic or industrial fine-grain films, and a special mammographic device that can be attached to any ordinary radiographic machine. Technical factors are shown in Table II. Of the 88 cases, 19 were operated on or biopsied. There were 7 cases of carcinoma, 8 cases of inflammatory disease, and 4 cases of benign tumor. The mammographic diagnosis was correct in 85.7% of carcinomas and 87.5% of inflammatory diseases. The one misdiagnosed case among the 7 carcinomas turned out to be cystosarcoma phylloides. Of the 4 benign tumors, 2 were correctly diagnosed, and the other 2 were mistaken for either inflammatory disease or a simple lactating breast. However, none of the benign conditions were diagnosed as a malignant process. We found that nonscreen-type mammographic or industrial fine-grain films and hand-processing were necessary to obtain mammograms of the desired quality.

  20. Accuracy of recumbent height measurement.

    Science.gov (United States)

    Gray, D S; Crider, J B; Kelley, C; Dickinson, L C

    1985-01-01

    Since many patients requiring specialized nutritional support are bedridden, measurement of height for purposes of nutritional assessment or prescription must often be done with the patient in bed. This study examined the accuracy of measuring body height in bed in the supine position. Two measurements were performed on 108 ambulatory inpatients: (1) standing height using a standard height-weight scale, and (2) bed height using a flexible tape. Patients were divided into four groups based on which of two researchers performed each of the two measurements. Each patient was also weighed and self-reported height, weight, sex, and age were recorded. Bed height was significantly longer than standing height by 3.68 cm, but the two measurements were equally precise. It was believed, however, that this 2% difference was probably not clinically significant in most circumstances. Bed height correlated highly with standing height (r = 0.95), and the regression equation was standing height = 13.82 +/- 0.09 bed height. Patients overestimated their heights. Heights recorded by nurses were more accurate when patients were measured than when asked about their heights, but the patients were more often asked than measured.

  1. IGS polar motion measurement accuracy

    Directory of Open Access Journals (Sweden)

    Jim Ray

    2017-11-01

    Full Text Available We elaborate an error budget for the long-term accuracy of IGS (International Global Navigation Satellite System Service) polar motion estimates, concluding that it is probably about 25–30 μas (1-sigma) overall, although it is not possible to quantify possible contributions (mainly annual) that might transfer directly from aliases of subdaily rotational tide errors. The leading sources are biases arising from the need to align daily, observed terrestrial frames, within which the pole coordinates are expressed and which are continuously deforming, to the secular, linear international reference frame. Such biases are largest over spans longer than about a year. Thanks to the very large number of IGS tracking stations, the formal covariance errors are much smaller, around 5 to 10 μas. Large networks also permit the systematic frame-related errors to be more effectively minimized but not eliminated. A number of periodic errors probably also influence polar motion results, mainly at annual, GPS (Global Positioning System) draconitic, and fortnightly periods, but their impact on the overall error budget is unlikely to be significant except possibly for annual tidal aliases. Nevertheless, caution should be exercised in interpreting geophysical excitations near any of the suspect periods.

  2. Accuracy limit of rigid 3-point water models

    Science.gov (United States)

    Izadi, Saeed; Onufriev, Alexey V.

    2016-08-01

    Classical 3-point rigid water models are most widely used due to their computational efficiency. Recently, we introduced a new approach to constructing classical rigid water models [S. Izadi et al., J. Phys. Chem. Lett. 5, 3863 (2014)], which permits a virtually exhaustive search for globally optimal model parameters in the sub-space that is most relevant to the electrostatic properties of the water molecule in liquid phase. Here we apply the approach to develop a 3-point Optimal Point Charge (OPC3) water model. OPC3 is significantly more accurate than the commonly used water models of same class (TIP3P and SPCE) in reproducing a comprehensive set of liquid bulk properties, over a wide range of temperatures. Beyond bulk properties, we show that OPC3 predicts the intrinsic charge hydration asymmetry (CHA) of water — a characteristic dependence of hydration free energy on the sign of the solute charge — in very close agreement with experiment. Two other recent 3-point rigid water models, TIP3PFB and H2ODC, each developed by its own, completely different optimization method, approach the global accuracy optimum represented by OPC3 in both the parameter space and accuracy of bulk properties. Thus, we argue that an accuracy limit of practical 3-point rigid non-polarizable models has effectively been reached; remaining accuracy issues are discussed.

  3. Enabling multi-level relevance feedback on PubMed by integrating rank learning into DBMS.

    Science.gov (United States)

    Yu, Hwanjo; Kim, Taehoon; Oh, Jinoh; Ko, Ilhwan; Kim, Sungchul; Han, Wook-Shin

    2010-04-16

    Finding relevant articles from PubMed is challenging because it is hard to express the user's specific intention in the given query interface, and a keyword query typically retrieves a large number of results. Researchers have applied machine learning techniques to find relevant articles by ranking the articles according to the learned relevance function. However, the process of learning and ranking is usually done offline without being integrated with the keyword queries, and the users have to provide a large amount of training documents to get a reasonable learning accuracy. This paper proposes a novel multi-level relevance feedback system for PubMed, called RefMed, which supports both ad-hoc keyword queries and multi-level relevance feedback in real time on PubMed. RefMed supports multi-level relevance feedback by using RankSVM as the learning method, and thus it achieves higher accuracy with less feedback. RefMed "tightly" integrates RankSVM into an RDBMS to support both keyword queries and multi-level relevance feedback in real time; the tight coupling of RankSVM and the DBMS substantially improves the processing time. An efficient parameter selection method for RankSVM is also proposed, which tunes the RankSVM parameter without performing validation. Thereby, RefMed achieves a high learning accuracy in real time without performing a validation process. RefMed is accessible at http://dm.postech.ac.kr/refmed. RefMed is the first multi-level relevance feedback system for PubMed, which achieves a high accuracy with less feedback. It effectively learns an accurate relevance function from the user's feedback and efficiently processes the function to return relevant articles in real time.
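    The ranking idea behind RankSVM can be illustrated with a small sketch that converts multi-level feedback into pairwise preferences and trains a linear SVM on feature differences; this is a generic illustration, not RefMed's actual implementation or its DBMS integration.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Hedged sketch of the ranking idea behind RankSVM (not RefMed's actual code):
# turn multi-level relevance feedback into pairwise preferences (article i is
# preferred over article j whenever its feedback level is higher) and train a
# linear SVM on feature differences; the learned weight vector scores articles.

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 8))                 # fake article feature vectors
levels = rng.integers(0, 4, size=30)             # multi-level feedback: 0 (irrelevant) .. 3 (highly relevant)

pairs, labels = [], []
for i in range(len(X)):
    for j in range(len(X)):
        if levels[i] > levels[j]:
            pairs.append(X[i] - X[j]); labels.append(1)
            pairs.append(X[j] - X[i]); labels.append(-1)

clf = LinearSVC(C=1.0, fit_intercept=False, max_iter=10000)
clf.fit(np.array(pairs), np.array(labels))

scores = X @ clf.coef_.ravel()                   # relevance scores for ranking
print(np.argsort(-scores)[:5])                   # indices of top-ranked articles
```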

  4. Microdosing: Concept, application and relevance

    Directory of Open Access Journals (Sweden)

    Tushar Tewari

    2010-01-01

    Full Text Available The use of microdose pharmacokinetic studies as an essential tool in drug development has yet to catch on. While this approach promises potential cost savings and a quantum leap in efficiencies of the drug development process, major hurdles still need to be overcome before the technique becomes commonplace and part of routine practice. Clear regulations in Europe and the USA have had an enabling effect. The lack of enabling provisions for microdosing studies in Indian regulation, despite low risk and manifest relevance for the local drug development industry, is inconsistent with the country's aspirations to be among the leaders in pharmaceutical research.

  5. Key Performance Indicators and Analysts' Earnings Forecast Accuracy: An Application of Content Analysis

    OpenAIRE

    Alireza Dorestani; Zabihollah Rezaee

    2011-01-01

    We examine the association between the extent of change in key performance indicator (KPI) disclosures and the accuracy of forecasts made by analysts. KPIs are regarded as improving both the transparency and relevancy of public financial information. The results of using linear regression models show that contrary to our prediction and the hypothesis of this paper, there is no significant association between the change in non-financial KPI disclosures and the accuracy of analysts' forecasts....

  6. Effects of accuracy motivation and anchoring on metacomprehension judgment and accuracy.

    Science.gov (United States)

    Zhao, Qin

    2012-01-01

    The current research investigates how accuracy motivation impacts anchoring and adjustment in metacomprehension judgment and how accuracy motivation and anchoring affect metacomprehension accuracy. Participants were randomly assigned to one of six conditions produced by the between-subjects factorial design involving accuracy motivation (incentive or none) and peer performance anchor (95%, 55%, or none). Two studies showed that accuracy motivation did not impact anchoring bias, but the adjustment-from-anchor process occurred. The accuracy incentive increased the anchor-judgment gap for the 95% anchor but not for the 55% anchor, which induced less certainty about the direction of adjustment. The findings offer support to the integrative theory of anchoring. Additionally, the two studies revealed a "power struggle" between accuracy motivation and anchoring in influencing metacomprehension accuracy. Accuracy motivation could improve metacomprehension accuracy in spite of the anchoring effect, but if the anchoring effect is too strong, it could overpower the motivation effect. The implications of the findings were discussed.
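
    For readers unfamiliar with the design, the following Python sketch shows how a 2 (motivation) x 3 (anchor) between-subjects factorial design of the kind described can be analyzed with a two-way ANOVA using statsmodels. The data, factor levels and column names are synthetic and purely illustrative; this is not the authors' analysis script.

        # Illustrative two-way between-subjects ANOVA for a 2 (motivation) x 3 (anchor)
        # design like the one described above. Data and column names are synthetic.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(1)
        n_per_cell = 20
        rows = []
        for motivation in ["incentive", "none"]:
            for anchor in ["95", "55", "none"]:
                judgments = rng.normal(loc=60, scale=10, size=n_per_cell)
                rows += [{"motivation": motivation, "anchor": anchor, "judgment": j}
                         for j in judgments]
        df = pd.DataFrame(rows)

        model = smf.ols("judgment ~ C(motivation) * C(anchor)", data=df).fit()
        print(anova_lm(model, typ=2))   # main effects and interaction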

  7. 24 CFR 2003.8 - General exemptions.

    Science.gov (United States)

    2010-04-01

    ... its investigations attempting to resolve questions of accuracy, relevance, timeliness and completeness. (4) From subsection (e)(1), because it is often impossible to determine relevance or necessity of information in the early stages of an investigation. The value of such information is a question of judgment...

  8. 24 CFR 2003.9 - Specific exemptions.

    Science.gov (United States)

    2010-04-01

    ... attempting to resolve questions of accuracy, relevance, timeliness and completeness. (4) From subsection (e)(1), because it is often impossible to determine relevance or necessity of information in the early stages of an investigation. The value of such information is a question of judgment and timing; what...

  9. 24 CFR 16.15 - Specific exemptions.

    Science.gov (United States)

    2010-04-01

    ... eligibility attempting to resolve questions of accuracy, relevance, timeliness and completeness. (4) From subsection (e)(1) because it is often impossible to determine relevance or necessity of information in pre-investigative early stages. The value of such information is a question of judgment and timing; what appears...

  10. Accuracies Of Optical Processors For Adaptive Optics

    Science.gov (United States)

    Downie, John D.; Goodman, Joseph W.

    1992-01-01

    Paper presents analysis of accuracies and requirements concerning accuracies of optical linear-algebra processors (OLAP's) in adaptive-optics imaging systems. Much faster than digital electronic processor and eliminate some residual distortion. Question whether errors introduced by analog processing of OLAP overcome advantage of greater speed. Paper addresses issue by presenting estimate of accuracy required in general OLAP that yields smaller average residual aberration of wave front than digital electronic processor computing at given speed.

  11. Other relevant numerical modelling papers

    International Nuclear Information System (INIS)

    Chartier, M.

    1989-01-01

    Ocean modelling is a rapidly evolving science and a large number of results have been published. Several categories of papers are of particular interest for this review: the papers published by the international atomic institutions, such as the NEA (for the CRESP or Subseabed Programs), the IAEA (for example the Safety Series, the Technical Report Series or the TECDOC), and the ICRP, and the papers concerned with more fundamental research, which are published in the specific scientific literature. This paper aims to list some of the most relevant publications for CRESP purposes. It is by no means meant to be exhaustive, but rather informative about the incontestable progress recently achieved in this field. One should note that some of these papers are so recent that their final version has not yet been published.

  12. Industrial relevance of thermophilic Archaea.

    Science.gov (United States)

    Egorova, Ksenia; Antranikian, Garabed

    2005-12-01

    The dramatic increase of newly isolated extremophilic microorganisms, analysis of their genomes and investigations of their enzymes by academic and industrial laboratories demonstrate the great potential of extremophiles in industrial (white) biotechnology. Enzymes derived from extremophiles (extremozymes) are superior to the traditional catalysts because they can perform industrial processes even under harsh conditions, under which conventional proteins are completely denatured. In particular, enzymes from thermophilic and hyperthermophilic Archaea have industrial relevance. Despite intensive investigations, our knowledge of the structure-function relationships of their enzymes is still limited. Information concerning the molecular properties of their enzymes and genes has to be obtained to be able to understand the mechanisms that are responsible for catalytic activity and stability at the boiling point of water.

  13. The Relevance of Hegel's Logic

    Directory of Open Access Journals (Sweden)

    John W Burbidge

    2007-12-01

    Full Text Available Hegel defines his Logic as the science that thinks about thinking. But when we interpret that work as outlining what happens when we reason, we are vulnerable to Frege's charge of psychologism. I use Hegel's tripartite distinction among understanding, dialectical and speculative reason as operations of pure thought to suggest how thinking can work with objective concepts. In the last analysis, however, our ability to move from the subjective contingency of representations and ideas to the pure concepts we think develops from mechanical memory, which separates sign from sense so that we can focus simply on the latter. By becoming aware of the connections that underlie our thinking processes we may be able both to move beyond the abstractions of symbolic logic and to clarify what informal logicians call relevance.

  14. Accuracy and precision of oscillometric blood pressure in standing conscious horses

    DEFF Research Database (Denmark)

    Olsen, Emil; Pedersen, Tilde Louise Skovgaard; Robinson, Rebecca

    2016-01-01

    from a teaching and research herd. HYPOTHESIS/OBJECTIVE: To evaluate the accuracy and precision of systolic arterial pressure (SAP), diastolic arterial pressure (DAP), and mean arterial pressure (MAP) in conscious horses obtained with an oscillometric NIBP device when compared to invasively measured...... administration. Agreement analysis with replicate measures was utilized to calculate bias (accuracy) and standard deviation (SD) of bias (precision). RESULTS: A total of 252 pairs of invasive arterial BP and NIBP measurements were analyzed. Compared to the direct BP measures, the NIBP MAP had an accuracy of -4...... mm Hg and precision of 10 mm Hg. SAP had an accuracy of -8 mm Hg and a precision of 17 mm Hg and DAP had an accuracy of -7 mm Hg and a precision of 14 mm Hg. CONCLUSIONS AND CLINICAL RELEVANCE: MAP from the evaluated NIBP monitor is accurate and precise in the adult horse across a range of BP...

  15. Quality of reporting of diagnostic accuracy studies

    NARCIS (Netherlands)

    Smidt, N.; Rutjes, A.W.; Windt - Mens, van der D.A.W.M.; Ostelo, R.W.J.G.; Reitsma, J.B.; Bouter, L.M.; Vet, de H.C.W.

    2005-01-01

    PURPOSE: To evaluate the quality of reporting in diagnostic accuracy articles published in 2000 in journals with an impact factor of at least 4, by using items of the Standards for Reporting of Diagnostic Accuracy (STARD) statement published later in 2003. MATERIALS AND METHODS: English-language articles on

  16. Accuracy of Parent Identification of Stuttering Occurrence

    Science.gov (United States)

    Einarsdottir, Johanna; Ingham, Roger

    2009-01-01

    Background: Clinicians rely on parents to provide information regarding the onset and development of stuttering in their own children. The accuracy and reliability of their judgments of stuttering are therefore important and are not well researched. Aim: To investigate the accuracy of parent judgments of stuttering in their own children's speech…

  17. Improving Accuracy of Processing Through Active Control

    Directory of Open Access Journals (Sweden)

    N. N. Barbashov

    2016-01-01

    Full Text Available An important task of modern mathematical statistics, with its methods based on probability theory, is the scientific evaluation of measurement results. Control carries certain costs, and ineffective control, where defective products reach the customer, costs significantly more because of parts recalls. When machining parts, errors shift the scatter of part dimensions towards the tolerance limit. Improving processing accuracy and avoiding defective products therefore requires reducing the components of machining error, i.e. improving the accuracy of the machine and tool, tool life, the rigidity of the system and the accuracy of the adjustment; at given intervals it is also necessary to re-adjust the machine. To improve accuracy and machining rate, in-process gaging devices and controlled machining, which uses adaptive control systems for process monitoring, are currently becoming widely popular. Improving accuracy in this case amounts to compensating the majority of technological errors. In-cycle measuring sensors (sensors of active control) allow processing accuracy to be improved by one or two quality grades and make it possible to operate several machines simultaneously. Efficient use of in-cycle measuring sensors requires the development of methods to control accuracy through appropriate adjustments. Methods based on the moving average appear to be the most promising for accuracy control, since they include data on the change in the last few measured values of the parameter under control.
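
    The moving-average idea mentioned at the end of the record can be sketched as a simple decision rule: track the moving average of the last few measured dimensions and issue a compensating tool offset when it drifts toward a tolerance limit. The Python sketch below is a hypothetical illustration; the nominal value, tolerance, guard band and window size are assumed, not taken from the article.

        # Sketch of a moving-average rule for active (in-cycle) dimensional control:
        # when the moving average of recent measurements drifts toward a tolerance
        # limit, issue a compensating tool offset. All thresholds are hypothetical.
        from collections import deque

        NOMINAL = 25.000      # nominal dimension, mm (assumed)
        UPPER_TOL = 25.020    # upper tolerance limit, mm (assumed)
        GUARD_BAND = 0.005    # act before the limit is actually reached (assumed)
        WINDOW = 5            # number of recent parts in the moving average

        def control_step(history: deque, measurement: float) -> float:
            """Return the tool offset to apply after this measurement (0 if none)."""
            history.append(measurement)
            if len(history) > WINDOW:
                history.popleft()
            moving_avg = sum(history) / len(history)
            if moving_avg > UPPER_TOL - GUARD_BAND:
                # Re-center the process on the nominal dimension.
                return NOMINAL - moving_avg
            return 0.0

        history = deque()
        for measured in [25.006, 25.010, 25.013, 25.016, 25.018, 25.020]:
            offset = control_step(history, measured)
            if offset:
                print(f"measured {measured:.3f} mm -> apply tool offset {offset:+.3f} mm")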

  18. Vygotsky's Crisis: Argument, context, relevance.

    Science.gov (United States)

    Hyman, Ludmila

    2012-06-01

    Vygotsky's The Historical Significance of the Crisis in Psychology (1926-1927) is an important text in the history and philosophy of psychology that only became available to scholars in 1982 in Russian and in 1997 in English. The goal of this paper is to introduce Vygotsky's conception of psychology to a wider audience. I argue that Vygotsky's argument about the "crisis" in psychology and its resolution can be fully understood only in the context of his social and political thinking. Vygotsky shared the enthusiasm, widespread among the Russian leftist intelligentsia in the 1920s, that Soviet society had launched an unprecedented social experiment: the socialist revolution opened the way for establishing social conditions that would let the individual flourish. For Vygotsky, this meant that "a new man" of the future would become "the first and only species in biology that would create itself." He envisioned psychology as a science that would serve this humanist teleology. I propose that The Crisis is relevant today insofar as it helps us define a fundamental problem: How can we systematically account for the development of knowledge in psychology? I evaluate how Vygotsky addresses this problem as a historian of the crisis. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Relevance of equilibrium in multifragmentation

    International Nuclear Information System (INIS)

    Furuta, Takuya; Ono, Akira

    2009-01-01

    The relevance of equilibrium in a multifragmentation reaction of very central ⁴⁰Ca + ⁴⁰Ca collisions at 35 MeV/nucleon is investigated by using simulations of antisymmetrized molecular dynamics (AMD). Two types of ensembles are compared. One is the reaction ensemble of the states at each reaction time t in collision events simulated by AMD, and the other is the equilibrium ensemble prepared by solving the AMD equation of motion for a many-nucleon system confined in a container for a long time. The comparison of the ensembles is performed for the fragment charge distribution and the excitation energies. Our calculations show that there exists an equilibrium ensemble that well reproduces the reaction ensemble at each reaction time t for the investigated period 80≤t≤300 fm/c. However, there are some other observables that show discrepancies between the reaction and equilibrium ensembles. These may be interpreted as dynamical effects in the reaction. The usual static equilibrium at each instant is not realized since any equilibrium ensemble with the same volume as that of the reaction system cannot reproduce the fragment observables.

  20. Can consumers trust web-based information about celiac disease? Accuracy, comprehensiveness, transparency, and readability of information on the internet.

    Science.gov (United States)

    McNally, Shawna L; Donohue, Michael C; Newton, Kimberly P; Ogletree, Sandra P; Conner, Kristen K; Ingegneri, Sarah E; Kagnoff, Martin F

    2012-04-04

    50% of the core celiac disease information that was considered important for inclusion on websites that provide general information about celiac disease. Academic websites were significantly less transparent (P = .005) than commercial websites in attributing authorship, timeliness of information, sources of information, and other important disclosures. The type of website publisher did not predict website accuracy, comprehensiveness, or overall website quality. Only 4 of 98 (4%) websites achieved an overall quality score of 80 or above, which a priori was set as the minimum score for a website to be judged trustworthy and reliable. The information on many websites addressing celiac disease was not sufficiently accurate, comprehensive, and transparent, or presented at an appropriate reading grade level, to be considered sufficiently trustworthy and reliable for patients, health care providers, celiac disease support groups, and the general public. This has the potential to adversely affect decision making about important aspects of celiac disease, including its appropriate and proper diagnosis, treatment, and management.

  1. Global discriminative learning for higher-accuracy computational gene prediction.

    Directory of Open Access Journals (Sweden)

    Axel Bernal

    2007-03-01

    Full Text Available Most ab initio gene predictors use a probabilistic sequence model, typically a hidden Markov model, to combine separately trained models of genomic signals and content. By combining separate models of relevant genomic features, such gene predictors can exploit small training sets and incomplete annotations, and can be trained fairly efficiently. However, that type of piecewise training does not optimize prediction accuracy and has difficulty in accounting for statistical dependencies among different parts of the gene model. With genomic information being created at an ever-increasing rate, it is worth investigating alternative approaches in which many different types of genomic evidence, with complex statistical dependencies, can be integrated by discriminative learning to maximize annotation accuracy. Among discriminative learning methods, large-margin classifiers have become prominent because of the success of support vector machines (SVM) in many classification tasks. We describe CRAIG, a new program for ab initio gene prediction based on a conditional random field model with semi-Markov structure that is trained with an online large-margin algorithm related to multiclass SVMs. Our experiments on benchmark vertebrate datasets and on regions from the ENCODE project show significant improvements in prediction accuracy over published gene predictors that use intrinsic features only, particularly at the gene level and on genes with long introns.

  2. On the automated assessment of nuclear reactor systems code accuracy

    International Nuclear Information System (INIS)

    Kunz, Robert F.; Kasmala, Gerald F.; Mahaffy, John H.; Murray, Christopher J.

    2002-01-01

    An automated code assessment program (ACAP) has been developed to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. The tool provides a suite of metrics for quality of fit to specific data sets, and the means to produce one or more figures of merit (FOM) for a code, based on weighted averages of results from the batch execution of a large number of code-experiment and code-code data comparisons. Accordingly, this tool has the potential to significantly streamline the verification and validation (V and V) processes in NRS code development environments which are characterized by rapidly evolving software, many contributing developers and a large and growing body of validation data. In this paper, a survey of data conditioning and analysis techniques is summarized which focuses on their relevance to NRS code accuracy assessment. A number of methods are considered for their applicability to the automated assessment of the accuracy of NRS code simulations. A variety of data types and computational modeling methods are considered from a spectrum of mathematical and engineering disciplines. The goal of the survey was to identify needs, issues and techniques to be considered in the development of an automated code assessment procedure, to be used in United States Nuclear Regulatory Commission (NRC) advanced thermal-hydraulic (T/H) code consolidation efforts. The ACAP software was designed based in large measure on the findings of this survey. An overview of this tool is summarized and several NRS data applications are provided. The paper is organized as follows: The motivation for this work is first provided by background discussion that summarizes the relevance of this subject matter to the nuclear reactor industry. Next, the spectrum of NRS data types is classified into categories, in order to provide a basis for assessing individual comparison methods. Then, a summary of the survey is provided, where each

  3. Timeliness of Creative Subjects in Architecture Education

    Science.gov (United States)

    Vargot, T.

    2017-11-01

    The following article addresses the problem of the insufficient number of drawing and painting lessons delivered in the course of architectural education. There is a comparison between the education of successful architects of the past and of modern times. The author argues for the importance of creative subjects as an essential part of the development and education of future architects. Skills acquired through the study of creative subjects will be used not only as a means of self-expression but as an instrument in the toolkit of a professional. Sergei Tchoban is taken as an example of a successful architect for whom hand-made drawing is very important. He organizes contests of architectural drawing for students, promoting creative development in this way. Nowadays, students tend to use computer programs to produce architectural projects, losing their individual approach. The creative process becomes a matter of scissors and paste, just a copy of something that already exists. The solution to the problem is to reconsider the department's curriculum and add extra hours for creative subjects.

  4. Timeliness of notification in infectious disease cases.

    OpenAIRE

    Domínguez, A; Coll, J J; Fuentes, M; Salleras, L

    1992-01-01

    Records of notification in cases of eight infectious diseases in the "Servei Territorial de Salut Publica" of the Province of Barcelona, Spain, between 1982 and 1986 were reviewed. Time from onset of symptoms to notification, time from notification to completion of data collection, and time from onset to completion of the case investigation were analyzed. For the period from onset to notification, the shortest mean was registered for meningococcal infection (6.31 days) and the longest was for...

  5. Statistical significance versus clinical relevance.

    Science.gov (United States)

    van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G

    2017-04-01

    In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
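
    To make the recommendation concrete, the short Python example below (using synthetic data) reports a 95% confidence interval for a mean difference alongside the p-value from a two-sample t-test, so that the magnitude and imprecision of the effect are visible rather than the p-value alone. The numbers are illustrative only.

        # Illustration (synthetic data): report a 95% confidence interval for a mean
        # difference alongside the p-value, rather than the p-value alone.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        treatment = rng.normal(loc=5.0, scale=2.0, size=40)
        control = rng.normal(loc=4.2, scale=2.0, size=40)

        t_stat, p_value = stats.ttest_ind(treatment, control)   # pooled-variance t-test

        diff = treatment.mean() - control.mean()
        n1, n2 = len(treatment), len(control)
        sp2 = ((n1 - 1) * treatment.var(ddof=1) + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2)
        se = np.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
        t_crit = stats.t.ppf(0.975, n1 + n2 - 2)
        ci = (diff - t_crit * se, diff + t_crit * se)

        print(f"mean difference = {diff:.2f}, p = {p_value:.3f}, "
              f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")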

  6. Effects of presentation modality on team awareness and choice accuracy in a simulated police team task

    NARCIS (Netherlands)

    Streefkerk, J.W.; Wiering, C.; Esch van-Bussemakers, M.; Neerincx, M.

    2008-01-01

    Team awareness is important when asking team members for assistance, for example in the police domain. This paper investigates how presentation modality (visual or auditory) of relevant team information and communication influences team awareness and choice accuracy in a collaborative team task. An

  7. Accuracy Limitations in Optical Linear Algebra Processors

    Science.gov (United States)

    Batsell, Stephen Gordon

    1990-01-01

    One of the limiting factors in applying optical linear algebra processors (OLAPs) to real-world problems has been the poor achievable accuracy of these processors. Little previous research has been done on determining noise sources from a systems perspective which would include noise generated in the multiplication and addition operations, noise from spatial variations across arrays, and from crosstalk. In this dissertation, we propose a second-order statistical model for an OLAP which incorporates all these system noise sources. We now apply this knowledge to determining upper and lower bounds on the achievable accuracy. This is accomplished by first translating the standard definition of accuracy used in electronic digital processors to analog optical processors. We then employ our second-order statistical model. Having determined a general accuracy equation, we consider limiting cases such as for ideal and noisy components. From the ideal case, we find the fundamental limitations on improving analog processor accuracy. From the noisy case, we determine the practical limitations based on both device and system noise sources. These bounds allow system trade-offs to be made both in the choice of architecture and in individual components in such a way as to maximize the accuracy of the processor. Finally, by determining the fundamental limitations, we show the system engineer when the accuracy desired can be achieved from hardware or architecture improvements and when it must come from signal pre-processing and/or post-processing techniques.

  8. Multiple sequence alignment accuracy and phylogenetic inference.

    Science.gov (United States)

    Ogden, T Heath; Rosenberg, Michael S

    2006-04-01

    Phylogenies are often thought to be more dependent upon the specifics of the sequence alignment rather than on the method of reconstruction. Simulation of sequences containing insertion and deletion events was performed in order to determine the role that alignment accuracy plays during phylogenetic inference. Data sets were simulated for pectinate, balanced, and random tree shapes under different conditions (ultrametric equal branch length, ultrametric random branch length, nonultrametric random branch length). Comparisons between hypothesized alignments and true alignments enabled determination of two measures of alignment accuracy, that of the total data set and that of individual branches. In general, our results indicate that as alignment error increases, topological accuracy decreases. This trend was much more pronounced for data sets derived from more pectinate topologies. In contrast, for balanced, ultrametric, equal branch length tree shapes, alignment inaccuracy had little average effect on tree reconstruction. These conclusions are based on average trends of many analyses under different conditions, and any one specific analysis, independent of the alignment accuracy, may recover very accurate or inaccurate topologies. Maximum likelihood and Bayesian, in general, outperformed neighbor joining and maximum parsimony in terms of tree reconstruction accuracy. Results also indicated that as the length of the branch and of the neighboring branches increase, alignment accuracy decreases, and the length of the neighboring branches is the major factor in topological accuracy. Thus, multiple-sequence alignment can be an important factor in downstream effects on topological reconstruction.
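
    One common way to quantify alignment accuracy of the kind discussed above is the fraction of residue pairs aligned in the true (simulated) alignment that are recovered by the hypothesized alignment. The Python sketch below implements that simple pairwise measure on toy sequences; it is an illustration, not necessarily the exact metric used in the cited study.

        # Simplified alignment accuracy: fraction of aligned residue pairs in the
        # true (simulated) alignment that are recovered by the hypothesized alignment.
        # This is one common measure, not necessarily the metric of the cited study.

        def aligned_pairs(seq1_aln: str, seq2_aln: str) -> set:
            """Return the set of (i, j) residue index pairs aligned to each other."""
            pairs, i, j = set(), 0, 0
            for a, b in zip(seq1_aln, seq2_aln):
                if a != "-" and b != "-":
                    pairs.add((i, j))
                if a != "-":
                    i += 1
                if b != "-":
                    j += 1
            return pairs

        # True (simulated) alignment vs. an alignment produced by some aligner.
        true_pairs = aligned_pairs("ACG-TAC", "ACGTTA-")
        test_pairs = aligned_pairs("ACGTAC-", "ACGTTA-")

        accuracy = len(true_pairs & test_pairs) / len(true_pairs)
        print(f"pairwise alignment accuracy: {accuracy:.2f}")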

  9. Systematic review of discharge coding accuracy

    Science.gov (United States)

    Burns, E.M.; Rigby, E.; Mamidanna, R.; Bottle, A.; Aylin, P.; Ziprin, P.; Faiz, O.D.

    2012-01-01

    Introduction Routinely collected data sets are increasingly used for research, financial reimbursement and health service planning. High quality data are necessary for reliable analysis. This study aims to assess the published accuracy of routinely collected data sets in Great Britain. Methods Systematic searches of the EMBASE, PUBMED, OVID and Cochrane databases were performed from 1989 to present using defined search terms. Included studies were those that compared routinely collected data sets with case or operative note review and those that compared routinely collected data with clinical registries. Results Thirty-two studies were included. Twenty-five studies compared routinely collected data with case or operation notes. Seven studies compared routinely collected data with clinical registries. The overall median accuracy (routinely collected data sets versus case notes) was 83.2% (IQR: 67.3–92.1%). The median diagnostic accuracy was 80.3% (IQR: 63.3–94.1%) with a median procedure accuracy of 84.2% (IQR: 68.7–88.7%). There was considerable variation in accuracy rates between studies (50.5–97.8%). Since the 2002 introduction of Payment by Results, accuracy has improved in some respects, for example primary diagnoses accuracy has improved from 73.8% (IQR: 59.3–92.1%) to 96.0% (IQR: 89.3–96.3), P= 0.020. Conclusion Accuracy rates are improving. Current levels of reported accuracy suggest that routinely collected data are sufficiently robust to support their use for research and managerial decision-making. PMID:21795302

  10. Does relevance matter in academic policy research?

    DEFF Research Database (Denmark)

    Dredge, Dianne

    2015-01-01

    A reflection on whether relevance matters in tourism policy research. A debate among tourism scholars.

  11. Accuracy and precision in thermoluminescence dosimetry

    International Nuclear Information System (INIS)

    Marshall, T.O.

    1984-01-01

    The question of accuracy and precision in thermoluminescent dosimetry, particularly in relation to lithium fluoride phosphor, is discussed. The more important sources of error, including those due to the detectors, the reader, annealing and dosemeter design, are identified and methods of reducing their effects on accuracy and precision to a minimum are given. Finally, the accuracy and precision achievable for three quite different applications are discussed, namely, for personal dosimetry, environmental monitoring and for the measurement of photon dose distributions in phantoms. (U.K.)

  12. A multiple relevance feedback strategy with positive and negative models.

    Directory of Open Access Journals (Sweden)

    Yunlong Ma

    Full Text Available A commonly used strategy to improve search accuracy is through feedback techniques. Most existing work on feedback relies on positive information, and this has been extensively studied in information retrieval. However, when a query topic is difficult and the results from the first-pass retrieval are very poor, it is impossible to extract enough useful terms from a few positive documents. Therefore, the positive feedback strategy is incapable of improving retrieval in this situation. Contrarily, there is a relatively large number of negative documents at the top of the result list, and several recent studies have confirmed that a negative feedback strategy is an important and useful way of adapting to this scenario. In this paper, we consider a scenario where the search results are so poor that there are at most three relevant documents in the top twenty documents. We then conduct a novel study of multiple strategies for relevance feedback, using both positive and negative examples from the first-pass retrieval to improve retrieval accuracy for such difficult queries. Experimental results on TREC collections show that the proposed language-model-based multiple-model feedback method is generally more effective than both the baseline method and the methods using only a positive or a negative model.
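
    The cited work builds its feedback models within a language-modelling framework; as a much simpler, generic stand-in for the idea of combining positive and negative feedback, the Python sketch below applies the classic Rocchio vector-space update (a different but related technique). The vectors are synthetic and the weights are conventional defaults, not taken from the paper.

        # Simplified illustration of combining positive and negative feedback.
        # NOTE: the cited work uses language-model-based feedback; this sketch uses
        # the classic Rocchio vector-space update instead, as a generic stand-in.
        import numpy as np

        def rocchio(query_vec, pos_docs, neg_docs, alpha=1.0, beta=0.75, gamma=0.15):
            """Return an updated query vector from positive and negative feedback."""
            updated = alpha * query_vec
            if len(pos_docs):
                updated = updated + beta * np.mean(pos_docs, axis=0)
            if len(neg_docs):
                updated = updated - gamma * np.mean(neg_docs, axis=0)
            # Negative term weights are usually clipped to zero in practice.
            return np.clip(updated, 0.0, None)

        rng = np.random.default_rng(7)
        query = rng.random(8)             # synthetic term-weight vector
        positives = rng.random((2, 8))    # only a few relevant documents (hard query)
        negatives = rng.random((10, 8))   # many non-relevant documents at the top

        new_query = rocchio(query, positives, negatives)
        print(new_query.round(3))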

  13. BRIEF REPORT: Beyond Clinical Experience: Features of Data Collection and Interpretation That Contribute to Diagnostic Accuracy

    Science.gov (United States)

    Nendaz, Mathieu R; Gut, Anne M; Perrier, Arnaud; Louis-Simonet, Martine; Blondon-Choa, Katherine; Herrmann, François R; Junod, Alain F; Vu, Nu V

    2006-01-01

    BACKGROUND Clinical experience, features of data collection process, or both, affect diagnostic accuracy, but their respective role is unclear. OBJECTIVE, DESIGN Prospective, observational study, to determine the respective contribution of clinical experience and data collection features to diagnostic accuracy. METHODS Six Internists, 6 second year internal medicine residents, and 6 senior medical students worked up the same 7 cases with a standardized patient. Each encounter was audiotaped and immediately assessed by the subjects who indicated the reasons underlying their data collection. We analyzed the encounters according to diagnostic accuracy, information collected, organ systems explored, diagnoses evaluated, and final decisions made, and we determined predictors of diagnostic accuracy by logistic regression models. RESULTS Several features significantly predicted diagnostic accuracy after correction for clinical experience: early exploration of correct diagnosis (odds ratio [OR] 24.35) or of relevant diagnostic hypotheses (OR 2.22) to frame clinical data collection, larger number of diagnostic hypotheses evaluated (OR 1.08), and collection of relevant clinical data (OR 1.19). CONCLUSION Some features of data collection and interpretation are related to diagnostic accuracy beyond clinical experience and should be explicitly included in clinical training and modeled by clinical teachers. Thoroughness in data collection should not be considered a privileged way to diagnostic success. PMID:17105525

  14. Science and the struggle for relevance

    NARCIS (Netherlands)

    Hessels, L.K.|info:eu-repo/dai/nl/304832863

    2010-01-01

    This thesis deals with struggles for relevance of university researchers, their efforts to make their work correspond with ruling standards of relevance and to influence these standards. Its general research question is: How to understand changes in the struggle for relevance of Dutch academic

  15. The Personal Relevance of the Social Studies.

    Science.gov (United States)

    VanSickle, Ronald L.

    1990-01-01

    Conceptualizes a personal-relevance framework derived from Ronald L. VanSickle's five areas of life integrated with four general motivating goals from Abraham Maslow's hierarchy of needs and Richard and Patricia Schmuck's social motivation theory. Illustrates ways to apply the personal relevance framework to make social studies more relevant to…

  16. The Development of Relevance in Information Retrieval

    Directory of Open Access Journals (Sweden)

    Mu-hsuan Huang

    1997-12-01

    Full Text Available This article attempts to investigate the notion of relevance in information retrieval. It discusses various definitions of relevance from historical viewpoints and the characteristics of relevance judgments. It also introduces empirical results of important related research. [Article content in Chinese]

  17. Speed-Accuracy Tradeoff in Olfaction

    National Research Council Canada - National Science Library

    Rinberg, Dmitry; Koulakov, ALexel; Gelperin, Alan

    2006-01-01

    The basic psychophysical principle of speed-accuracy tradeoff (SAT) has been used to understand key aspects of neuronal information processing in vision and audition, but the principle of SAT is still debated in olfaction...

  18. Social Security Administration Data for Enumeration Accuracy

    Data.gov (United States)

    Social Security Administration — This dataset provides data at the national level from federal fiscal year 2006 onwards for the accuracy of the assignment of Social Security numbers (SSN) based on...

  19. A Note on "Accuracy" and "Precision"

    Science.gov (United States)

    Stallings, William M.; Gillmore, Gerald M.

    1971-01-01

    Advocates the use of "precision" rather than "accuracy" in defining reliability. These terms are consistently differentiated in certain sciences. Review of psychological and measurement literature reveals, however, interchangeable usage of the terms in defining reliability. (Author/GS)

  20. Predictive accuracy of backpropagation neural network ...

    Indian Academy of Sciences (India)

    incorporated into the BP model for high accuracy management purpose of irrigation water, which relies on accurate values of ET ... as seen from the recent food crisis demonstration in most .... layers by using Geographical Information System.

  1. Strategies to Increase Accuracy in Text Classification

    NARCIS (Netherlands)

    D. Blommesteijn (Dennis)

    2014-01-01

    Text classification via supervised learning involves various steps from processing raw data, feature extraction to training and validating classifiers. Within these steps, implementation decisions are critical to the resulting classifier accuracy. This paper contains a report of the

  2. Accuracy Assessment of Different Digital Surface Models

    Directory of Open Access Journals (Sweden)

    Ugur Alganci

    2018-03-01

    Full Text Available Digital elevation models (DEMs), which can occur in the form of digital surface models (DSMs) or digital terrain models (DTMs), are widely used as important geospatial information sources for various remote sensing applications, including the precise orthorectification of high-resolution satellite images, 3D spatial analyses, multi-criteria decision support systems, and deformation monitoring. The accuracy of DEMs has direct impacts on specific calculations and process chains; therefore, it is important to select the most appropriate DEM by considering the aim, accuracy requirement, and scale of each study. In this research, DSMs obtained from a variety of satellite sensors were compared to analyze their accuracy and performance. For this purpose, freely available Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) 30 m, Shuttle Radar Topography Mission (SRTM) 30 m, and Advanced Land Observing Satellite (ALOS) 30 m resolution DSM data were obtained. Additionally, 3 m and 1 m resolution DSMs were produced from tri-stereo images from the SPOT 6 and Pleiades high-resolution (PHR) 1A satellites, respectively. Elevation reference data provided by the General Command of Mapping, the national mapping agency of Turkey, produced from 30 cm spatial resolution stereo aerial photos with a 5 m grid spacing and ±3 m or better overall vertical accuracy at the 90% confidence interval (CI), were used to perform accuracy assessments. Gross errors and water surfaces were removed from the reference DSM. The relative accuracies of the different DSMs were tested using a different number of checkpoints determined by different methods. In the first method, 25 checkpoints were selected from bare lands to evaluate the accuracies of the DSMs on terrain surfaces. In the second method, 1000 randomly selected checkpoints were used to evaluate the methods' accuracies for the whole study area. In addition to the control point approach, vertical cross
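
    Checkpoint-based vertical accuracy assessments of this kind typically report a bias (mean error), an RMSE and often a robust spread measure such as the NMAD. The Python sketch below computes these statistics for a handful of synthetic checkpoint heights; it is illustrative only and does not reproduce the cited study's processing chain.

        # Checkpoint-based vertical accuracy statistics commonly used for DSM/DEM
        # assessment (mean error, RMSE, and the robust NMAD). Values are synthetic;
        # this is not the cited study's exact processing chain.
        import numpy as np

        dsm_heights = np.array([102.4, 98.7, 110.2, 95.1, 101.0, 99.8])   # model, m
        ref_heights = np.array([101.9, 99.5, 109.0, 95.6, 100.2, 100.4])  # checkpoints, m

        errors = dsm_heights - ref_heights
        mean_error = errors.mean()                                      # vertical bias
        rmse = np.sqrt(np.mean(errors ** 2))
        nmad = 1.4826 * np.median(np.abs(errors - np.median(errors)))   # robust spread

        print(f"bias = {mean_error:+.2f} m, RMSE = {rmse:.2f} m, NMAD = {nmad:.2f} m")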

  3. Improving coding accuracy in an academic practice.

    Science.gov (United States)

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

    Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study populations: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small group case review, and large group discussion. Main outcome measures: overall coding accuracy (compliance) percentage and coding accuracy per year group for the subjects that were able to participate longitudinally. Statistical tests used: average coding accuracy for population; paired t test to assess improvement between 2 intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over the course of time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M)=26.4%, SD=10%) to accuracy rates after all educational interventions were complete (M=26.8%, SD=12%); t(24) = -0.127, P=.90. Didactic teaching and small group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.
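
    The paired pre/post comparison described above can be sketched in a few lines of Python with scipy: compute each provider's accuracy before and after the intervention and run a paired t-test on the differences. The accuracy rates below are synthetic and chosen only to illustrate the procedure.

        # Paired comparison of per-provider coding accuracy before and after an
        # educational intervention (synthetic rates, for illustration only).
        import numpy as np
        from scipy import stats

        pre = np.array([0.22, 0.31, 0.18, 0.27, 0.25, 0.30, 0.24, 0.29])    # baseline
        post = np.array([0.24, 0.30, 0.20, 0.26, 0.28, 0.31, 0.23, 0.30])   # after

        t_stat, p_value = stats.ttest_rel(post, pre)
        print(f"mean change = {np.mean(post - pre):+.3f}, t = {t_stat:.2f}, p = {p_value:.2f}")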

  4. Increasing of AC compensation method accuracy

    International Nuclear Information System (INIS)

    Havlicek, V.; Pokorny, M.

    2003-01-01

    The original MMF compensation method allows the magnetic properties of single sheets and strips to be measured in the same way as the closed specimen properties. The accuracy of the method is limited due to the finite gain of the feedback loop fulfilling the condition of its stability. Digitalisation of the compensation loop and appropriate processing of the error signal can rapidly improve the accuracy. The basic ideas of this new approach and the experimental results are described in this paper.

  5. Increasing of AC compensation method accuracy

    Science.gov (United States)

    Havlíček, V.; Pokorný, M.

    2003-01-01

    The original MMF compensation method allows the magnetic properties of single sheets and strips to be measured in the same way as the closed specimen properties. The accuracy of the method is limited due to the finite gain of the feedback loop fulfilling the condition of its stability. Digitalisation of the compensation loop and appropriate processing of the error signal can rapidly improve the accuracy. The basic ideas of this new approach and the experimental results are described in this paper.

  6. Nostalgia's place among self-relevant emotions.

    Science.gov (United States)

    van Tilburg, Wijnand A P; Wildschut, Tim; Sedikides, Constantine

    2017-07-24

    How is nostalgia positioned among self-relevant emotions? We tested, in six studies, which self-relevant emotions are perceived as most similar versus least similar to nostalgia, and what underlies these similarities/differences. We used multidimensional scaling to chart the perceived similarities/differences among self-relevant emotions, resulting in two-dimensional models. The results were revealing. Nostalgia is positioned among self-relevant emotions characterised by positive valence, an approach orientation, and low arousal. Nostalgia most resembles pride and self-compassion, and least resembles embarrassment and shame. Our research pioneered the integration of nostalgia among self-relevant emotions.
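
    For readers unfamiliar with the method, the Python sketch below runs a generic two-dimensional multidimensional scaling (MDS) of a small dissimilarity matrix with scikit-learn, producing the kind of two-dimensional map described. The emotion labels and dissimilarity values are invented for illustration and do not reflect the authors' data or exact procedure.

        # Generic two-dimensional multidimensional scaling (MDS) of a dissimilarity
        # matrix, as a stand-in for the kind of analysis described (labels and the
        # dissimilarities below are made up for illustration).
        import numpy as np
        from sklearn.manifold import MDS

        emotions = ["nostalgia", "pride", "self-compassion", "shame", "embarrassment"]
        # Symmetric dissimilarity matrix (0 = identical, larger = more different).
        D = np.array([
            [0.0, 1.0, 1.2, 3.5, 3.3],
            [1.0, 0.0, 1.5, 3.2, 3.0],
            [1.2, 1.5, 0.0, 3.0, 2.8],
            [3.5, 3.2, 3.0, 0.0, 0.9],
            [3.3, 3.0, 2.8, 0.9, 0.0],
        ])

        mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
        coords = mds.fit_transform(D)
        for name, (x, y) in zip(emotions, coords):
            print(f"{name:>16s}: ({x:+.2f}, {y:+.2f})")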

  7. Effects of Objective and Subjective Competence on the Reliability of Crowdsourced Relevance Judgments

    Science.gov (United States)

    Samimi, Parnia; Ravana, Sri Devi; Webber, William; Koh, Yun Sing

    2017-01-01

    Introduction: Despite the popularity of crowdsourcing, the reliability of crowdsourced output has been questioned since crowdsourced workers display varied degrees of attention, ability and accuracy. It is important, therefore, to understand the factors that affect the reliability of crowdsourcing. In the context of producing relevance judgments,…

  8. A Parallel Relational Database Management System Approach to Relevance Feedback in Information Retrieval.

    Science.gov (United States)

    Lundquist, Carol; Frieder, Ophir; Holmes, David O.; Grossman, David

    1999-01-01

    Describes a scalable, parallel, relational database-driven information retrieval engine. To support portability across a wide range of execution environments, all algorithms adhere to the SQL-92 standard. By incorporating relevance feedback algorithms, accuracy is enhanced over prior database-driven information retrieval efforts. Presents…

  9. Cultural relevance of a fruit and vegetable food frequency questionnaire.

    Science.gov (United States)

    Paisley, Judy; Greenberg, Marlene; Haines, Jess

    2005-01-01

    Canada's multicultural population poses challenges for culturally competent nutrition research and practice. In this qualitative study, the cultural relevance of a widely used semi-quantitative fruit and vegetable food frequency questionnaire (FFQ) was examined among convenience samples of adults from Toronto's Cantonese-, Mandarin-, Portuguese-, and Vietnamese-speaking communities. Eighty-nine participants were recruited through community-based organizations, programs, and advertisements to participate in semi-structured interviews moderated in their native language. Data from the interviews were translated into English and transcribed for analysis using the constant comparative approach. Four main themes emerged from the analysis: the cultural relevance of the foods listed on the FFQ, words with multiple meanings, the need for culturally appropriate portion-size prompts, and the telephone survey as a Western concept. This research highlights the importance of investing resources to develop culturally relevant dietary assessment tools that ensure dietary assessment accuracy and, more important, reduce ethnocentric biases in food and nutrition research and practice. The transferability of findings must be established through further research.

  10. Creation of reliable relevance judgments in information retrieval systems evaluation experimentation through crowdsourcing: a review.

    Science.gov (United States)

    Samimi, Parnia; Ravana, Sri Devi

    2014-01-01

    Test collection is used to evaluate the information retrieval systems in laboratory-based evaluation experimentation. In a classic setting, generating relevance judgments involves human assessors and is a costly and time consuming task. Researchers and practitioners are still being challenged in performing reliable and low-cost evaluation of retrieval systems. Crowdsourcing as a novel method of data acquisition is broadly used in many research fields. It has been proven that crowdsourcing is an inexpensive and quick solution as well as a reliable alternative for creating relevance judgments. One of the crowdsourcing applications in IR is judging the relevance of query-document pairs. In order to have a successful crowdsourcing experiment, the relevance judgment tasks should be designed precisely to emphasize quality control. This paper is intended to explore different factors that have an influence on the accuracy of relevance judgments accomplished by workers and how to intensify the reliability of judgments in a crowdsourcing experiment.

  11. Decision aids for improved accuracy and standardization of mammographic diagnosis

    International Nuclear Information System (INIS)

    D'Orsi, C.J.; Getty, D.J.; Swets, J.A.; Pickett, R.M.; Seltzer, S.E.; McNeil, B.J.

    1990-01-01

    This paper examines the gains in the accuracy of mammographic diagnosis of breast cancer achievable from a pair of decision aids. Twenty-three potentially relevant perceptual features of mammograms were identified through interviews, psychometric tests, and consensus meetings with mammography specialists. Statistical analyses determined the 12 independent features that were most informative diagnostically and assigned a weight to each according to its importance. Two decision aids were developed: a checklist that solicits a scale value from the radiologist for each feature and a computer program that merges those values optimally in an advisory estimate of the probability of malignancy. Six radiologists read a set of 150 cases, first in their usual way and later with the aids.

  12. Accuracy limits on rapid assessment of gently varying bathymetry

    Science.gov (United States)

    McDonald, B. Edward; Holland, Charles

    2002-05-01

    Accuracy limits for rapidly probing shallow water bathymetry are investigated as a function of bottom slope and other relevant parameters. The probe scheme [B. E. McDonald and Charles Holland, J. Acoust. Soc. Am. 110, 2767 (2001)] uses a time reversed mirror (TRM) to ensonify a thin annulus on the ocean bottom at ranges of a few km from a vertical send/ receive array. The annulus is shifted in range by variable bathymetry (perturbation theory shows that the focal annulus experiences a radial shift proportional to the integrated bathymetry along a given azimuth). The range shift implies an azimuth-dependent time of maximum reverberation. Thus the reverberant return contains information that might be inverted to give bathymetric parameters. The parameter range over which the perturbation result is accurate is explored using the RAM code for propagation in arbitrarily range-dependent environments. [Work supported by NRL.

  13. Accuracy Assessment and Analysis for GPT2

    Directory of Open Access Journals (Sweden)

    YAO Yibin

    2015-07-01

    Full Text Available GPT (global pressure and temperature) is a global empirical model usually used to provide temperature and pressure for the determination of tropospheric delay. There are some weaknesses in GPT; these have been addressed with a new empirical model named GPT2, which not only improves the accuracy of temperature and pressure, but also provides specific humidity, water vapor pressure, mapping function coefficients and other tropospheric parameters, and no accuracy analysis of GPT2 has been made until now. In this paper high-precision meteorological data from ECMWF and NOAA were used to test and analyze the accuracy of temperature, pressure and water vapor pressure expressed by GPT2. Testing results show that the mean bias of temperature is -0.59℃ and the average RMS is 3.82℃; the absolute values of the average bias of pressure and water vapor pressure are less than 1 mb; GPT2 pressure has an average RMS of 7 mb, and water vapor pressure no more than 3 mb; accuracy differs across latitudes, and all parameters show obvious seasonality. In conclusion, the GPT2 model has high accuracy and stability on a global scale.

  14. Reliability and accuracy of Crystaleye spectrophotometric system.

    Science.gov (United States)

    Chen, Li; Tan, Jian Guo; Zhou, Jian Feng; Yang, Xu; Du, Yang; Wang, Fang Ping

    2010-01-01

    To develop an in vitro shade-measuring model to evaluate the reliability and accuracy of the Crystaleye spectrophotometric system, a newly developed spectrophotometer. Four shade guides, VITA Classical, VITA 3D-Master, Chromascop and Vintage Halo NCC, were measured with the Crystaleye spectrophotometer in a standardised model, ten times for 107 shade tabs. The shade-matching results and the CIE L*a*b* values of the cervical, body and incisal regions for each measurement were automatically analysed using the supporting software. Reliability and accuracy were calculated for each shade tab both in percentage and in colour difference (ΔE). Differences were analysed by one-way ANOVA in the cervical, body and incisal regions. The range of reliability was 88.81% to 98.97% and 0.13 to 0.24 ΔE units, and that of accuracy was 44.05% to 91.25% and 1.03 to 1.89 ΔE units. Significant differences in reliability and accuracy were found between the body region and the cervical and incisal regions. Comparisons made among regions and shade guides revealed that evaluation in ΔE was prone to disclose the differences. Measurements with the Crystaleye spectrophotometer had similar, high reliability in different shade guides and regions, indicating predictable repeated measurements. Accuracy in the body region was high and less variable compared with the cervical and incisal regions.
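
    The ΔE values reported above are CIELAB colour differences; for reference, the classic CIE76 formula is simply the Euclidean distance between two L*a*b* triples, as in the short Python sketch below. The measurement values are made up for illustration.

        # CIELAB colour difference (Delta E*ab) between two measurements, the metric
        # used above to express reliability and accuracy. L*a*b* values are synthetic.
        import math

        def delta_e_ab(lab1, lab2):
            """Classic CIE76 colour difference between two CIELAB triples."""
            return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

        measurement = (72.4, 1.8, 18.6)   # repeated reading of a shade tab (made up)
        reference = (71.9, 2.1, 19.5)     # reference reading of the same tab (made up)

        print(f"Delta E*ab = {delta_e_ab(measurement, reference):.2f}")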

  15. Relevance: An Interdisciplinary and Information Science Perspective

    Directory of Open Access Journals (Sweden)

    Howard Greisdorf

    2000-01-01

    Full Text Available Although relevance has represented a key concept in the field of information science for evaluating information retrieval effectiveness, the broader context established by interdisciplinary frameworks could provide greater depth and breadth to on-going research in the field. This work provides an overview of the nature of relevance in the field of information science with a cursory view of how cross-disciplinary approaches to relevance could represent avenues for further investigation into the evaluative characteristics of relevance as a means for enhanced understanding of human information behavior.

  16. Coordinate metrology accuracy of systems and measurements

    CERN Document Server

    Sładek, Jerzy A

    2016-01-01

    This book focuses on effective methods for assessing the accuracy of both coordinate measuring systems and coordinate measurements. It mainly reports on original research work conducted by Sladek’s team at Cracow University of Technology’s Laboratory of Coordinate Metrology. The book describes the implementation of different methods, including artificial neural networks, the Matrix Method, the Monte Carlo method and the virtual CMM (Coordinate Measuring Machine), and demonstrates how these methods can be effectively used in practice to gauge the accuracy of coordinate measurements. Moreover, the book includes an introduction to the theory of measurement uncertainty and to key techniques for assessing measurement accuracy. All methods and tools are presented in detail, using suitable mathematical formulations and illustrated with numerous examples. The book fills an important gap in the literature, providing readers with an advanced text on a topic that has been rapidly developing in recent years. The book...

  17. Evaluating measurement accuracy a practical approach

    CERN Document Server

    Rabinovich, Semyon G

    2017-01-01

    This book presents a systematic and comprehensive exposition of the theory of measurement accuracy and provides solutions that fill significant and long-standing gaps in the classical theory. It eliminates the shortcomings of the classical theory by including methods for estimating accuracy of single measurements, the most common type of measurement. The book also develops methods of reduction and enumeration for indirect measurements, which do not require Taylor series and produce a precise solution to this problem. It produces grounded methods and recommendations for summation of errors. The monograph also analyzes and critiques two foundation metrological documents, the International Vocabulary of Metrology (VIM) and the Guide to the Expression of Uncertainty in Measurement (GUM), and discusses directions for their revision. This new edition adds a step-by-step guide on how to evaluate measurement accuracy and recommendations on how to calculate systematic error of multiple measurements. There is also an e...

  18. Social class, contextualism, and empathic accuracy.

    Science.gov (United States)

    Kraus, Michael W; Côté, Stéphane; Keltner, Dacher

    2010-11-01

    Recent research suggests that lower-class individuals favor explanations of personal and political outcomes that are oriented to features of the external environment. We extended this work by testing the hypothesis that, as a result, individuals of a lower social class are more empathically accurate in judging the emotions of other people. In three studies, lower-class individuals (compared with upper-class individuals) received higher scores on a test of empathic accuracy (Study 1), judged the emotions of an interaction partner more accurately (Study 2), and made more accurate inferences about emotion from static images of muscle movements in the eyes (Study 3). Moreover, the association between social class and empathic accuracy was explained by the tendency for lower-class individuals to explain social events in terms of features of the external environment. The implications of class-based patterns in empathic accuracy for well-being and relationship outcomes are discussed.

  19. On the Accuracy Potential in Underwater/Multimedia Photogrammetry.

    Science.gov (United States)

    Maas, Hans-Gerd

    2015-07-24

    Underwater applications of photogrammetric measurement techniques usually need to deal with multimedia photogrammetry aspects, which are characterized by the necessity of handling optical rays that are refracted at interfaces between optical media with different refractive indices according to Snell's Law. This so-called multimedia geometry has to be incorporated into geometric models in order to achieve correct measurement results. The paper shows a flexible yet strict geometric model for the handling of refraction effects on the optical path, which can be implemented as a module into photogrammetric standard tools such as spatial resection, spatial intersection, bundle adjustment or epipolar line computation. The module is especially well suited for applications, where an object in water is observed by cameras in air through one or more planar glass interfaces, as it allows for some simplifications here. In the second part of the paper, several aspects, which are relevant for an assessment of the accuracy potential in underwater/multimedia photogrammetry, are discussed. These aspects include network geometry and interface planarity issues as well as effects caused by refractive index variations and dispersion and diffusion under water. All these factors contribute to a rather significant degradation of the geometric accuracy potential in underwater/multimedia photogrammetry. In practical experiments, a degradation of the quality of results by a factor two could be determined under relatively favorable conditions.
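
    The refraction handling described above rests on Snell's Law applied at each interface; in vector form this is straightforward to implement, as in the Python sketch below for a single planar interface. The refractive indices, ray direction and interface normal are example values only, and the sketch is not the geometric model of the cited paper.

        # Vector form of Snell's Law at a planar interface: the basic building block
        # of multimedia (air/glass/water) ray tracing described above. The refractive
        # indices and ray direction below are example values only.
        import numpy as np

        def refract(d, n, n1, n2):
            """Refract unit direction d at a plane with unit normal n (pointing toward
            the incoming ray), going from refractive index n1 to n2. Returns None on
            total internal reflection."""
            d = d / np.linalg.norm(d)
            n = n / np.linalg.norm(n)
            eta = n1 / n2
            cos_i = -np.dot(n, d)
            sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
            if sin2_t > 1.0:
                return None  # total internal reflection
            cos_t = np.sqrt(1.0 - sin2_t)
            return eta * d + (eta * cos_i - cos_t) * n

        d_air = np.array([0.3, 0.0, -1.0])    # ray travelling downward in air
        normal = np.array([0.0, 0.0, 1.0])    # interface normal pointing up, into the air
        d_water = refract(d_air, normal, n1=1.000, n2=1.333)
        print(d_water)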

  20. Requisite accuracy for hot spot factors in fast reactors

    International Nuclear Information System (INIS)

    Miki, Kazuyoshi; Inoue, Kotaro

    1976-01-01

    In the thermal design of a fast reactor, it should be most effective to reduce hot spot factors to the lowest possible level compatible with safety considerations, in order to minimize the design margin for the temperature prevailing in the core. Hot spot factors account for probabilistic and statistic deviations from nominal value of fuel element temperatures, due to uncertainties in the data adopted for estimating various factors including the physical properties. Such temperature deviations necessitate the provision of correspondingly large design margins for temperatures in order to keep within permissible limits the probability of exceeding the allowable temperatures. Evaluation of the desired accuracy for hot spot factors is performed by a method of optimization, which permits determination of the degree of accuracy that should minimize the design margins, to give realistic results with consideration given not only to sensitivity coefficients but also to the present-day uncertainty levels in the data adopted in the calculations. A concept of ''degree of difficulty'' is introduced for the purpose of determining the hot spot factors to be given higher priority for reduction. Application of this method to the core of a prototype fast reactor leads to the conclusion that the hot spot factors to be given the highest priority are those relevant to the power distribution, the flow distribution, the fuel enrichment, the fuel-cladding gap conductance and the fuel thermal conductivity. (auth.)
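
    The abstract does not spell out how the individual subfactors are combined. Purely as an illustration of one common convention in hot spot analysis, direct (systematic) subfactors can be multiplied while statistical subfactors are combined in quadrature; the names below echo the quantities listed in the abstract, but all values are hypothetical.

```python
import math

# Hypothetical subfactors, each a multiplier on the nominal temperature rise.
direct_subfactors = [("inlet flow maldistribution", 1.03),
                     ("power level measurement", 1.02)]
statistical_subfactors = [("power distribution", 1.04),
                          ("flow distribution", 1.03),
                          ("fuel enrichment", 1.02),
                          ("fuel-cladding gap conductance", 1.05),
                          ("fuel thermal conductivity", 1.04)]

# Direct (systematic) subfactors are multiplied ...
f_direct = math.prod(f for _, f in direct_subfactors)
# ... while statistical subfactors are combined in quadrature on their deviations.
f_stat = 1.0 + math.sqrt(sum((f - 1.0) ** 2 for _, f in statistical_subfactors))

overall = f_direct * f_stat
print(f"direct = {f_direct:.3f}, statistical = {f_stat:.3f}, overall = {overall:.3f}")
```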

  1. The Difference between Right and Wrong: Accuracy of Older and Younger Adults’ Story Recall

    Science.gov (United States)

    Davis, Danielle K.; Alea, Nicole; Bluck, Susan

    2015-01-01

    Sharing stories is an important social activity in everyday life. This study used fine-grained content analysis to investigate the accuracy of recall of two central story elements: the gist and detail of socially-relevant stories. Younger (M age = 28.06) and older (M age = 75.03) American men and women (N = 63) recalled fictional stories that were coded for (i) accuracy of overall gist and specific gist categories and (ii) accuracy of overall detail and specific detail categories. Findings showed no age group differences in accuracy of overall gist or detail, but differences emerged for specific categories. Older adults more accurately recalled the gist of when the event occurred whereas younger adults more accurately recalled the gist of why the event occurred. These differences were related to episodic memory ability and education. For accuracy in recalling details, there were some age differences, but gender differences were more robust. Overall, women remembered details of these social stories more accurately than men, particularly time and perceptual details. Women were also more likely to accurately remember the gist of when the event occurred. The discussion focuses on how accurate recall of socially-relevant stories is not clearly age-dependent but is related to person characteristics such as gender and episodic memory ability/education. PMID:26404344

  2. The Difference between Right and Wrong: Accuracy of Older and Younger Adults' Story Recall.

    Science.gov (United States)

    Davis, Danielle K; Alea, Nicole; Bluck, Susan

    2015-09-02

    Sharing stories is an important social activity in everyday life. This study used fine-grained content analysis to investigate the accuracy of recall of two central story elements: the gist and detail of socially-relevant stories. Younger (M age = 28.06) and older (M age = 75.03) American men and women (N = 63) recalled fictional stories that were coded for (i) accuracy of overall gist and specific gist categories and (ii) accuracy of overall detail and specific detail categories. Findings showed no age group differences in accuracy of overall gist or detail, but differences emerged for specific categories. Older adults more accurately recalled the gist of when the event occurred whereas younger adults more accurately recalled the gist of why the event occurred. These differences were related to episodic memory ability and education. For accuracy in recalling details, there were some age differences, but gender differences were more robust. Overall, women remembered details of these social stories more accurately than men, particularly time and perceptual details. Women were also more likely to accurately remember the gist of when the event occurred. The discussion focuses on how accurate recall of socially-relevant stories is not clearly age-dependent but is related to person characteristics such as gender and episodic memory ability/education.

  3. The Difference between Right and Wrong: Accuracy of Older and Younger Adults’ Story Recall

    Directory of Open Access Journals (Sweden)

    Danielle K. Davis

    2015-09-01

    Full Text Available Sharing stories is an important social activity in everyday life. This study used fine-grained content analysis to investigate the accuracy of recall of two central story elements: the gist and detail of socially-relevant stories. Younger (M age = 28.06) and older (M age = 75.03) American men and women (N = 63) recalled fictional stories that were coded for (i) accuracy of overall gist and specific gist categories and (ii) accuracy of overall detail and specific detail categories. Findings showed no age group differences in accuracy of overall gist or detail, but differences emerged for specific categories. Older adults more accurately recalled the gist of when the event occurred whereas younger adults more accurately recalled the gist of why the event occurred. These differences were related to episodic memory ability and education. For accuracy in recalling details, there were some age differences, but gender differences were more robust. Overall, women remembered details of these social stories more accurately than men, particularly time and perceptual details. Women were also more likely to accurately remember the gist of when the event occurred. The discussion focuses on how accurate recall of socially-relevant stories is not clearly age-dependent but is related to person characteristics such as gender and episodic memory ability/education.

  4. Systematic reviews of diagnostic test accuracy

    DEFF Research Database (Denmark)

    Leeflang, Mariska M G; Deeks, Jonathan J; Gatsonis, Constantine

    2008-01-01

    More and more systematic reviews of diagnostic test accuracy studies are being published, but they can be methodologically challenging. In this paper, the authors present some of the recent developments in the methodology for conducting systematic reviews of diagnostic test accuracy studies. ... Restrictive electronic search filters are discouraged, as is the use of summary quality scores. Methods for meta-analysis should take into account the paired nature of the estimates and their dependence on threshold. Authors of these reviews are advised to use the hierarchical summary receiver...

  5. FIELD ACCURACY TEST OF RPAS PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    P. Barry

    2013-08-01

    Full Text Available Baseline Surveys Ltd is a company which specialises in the supply of accurate geospatial data, such as cadastral, topographic and engineering survey data, to commercial and government bodies. Baseline Surveys Ltd invested in aerial drone photogrammetric technology and had a requirement to establish the spatial accuracy of the geographic data derived from our unmanned aerial vehicle (UAV) photogrammetry before marketing our new aerial mapping service. Having supplied the construction industry with survey data for over 20 years, we felt that it was crucial for our clients to clearly understand the accuracy of our photogrammetry so they can safely make informed spatial decisions, within the known accuracy limitations of our data. This information would also inform us on how and where UAV photogrammetry can be utilised. What we wanted to find out was the actual accuracy that can be reliably achieved using a UAV to collect data under field conditions throughout a 2 Ha site. We flew a UAV over the test area in a "lawnmower track" pattern with an 80% front and 80% side overlap; we placed 45 ground markers as check points and surveyed them in using network Real Time Kinematic Global Positioning System (RTK GPS). We specifically designed the ground markers to meet our accuracy needs. We established 10 separate ground markers as control points and inputted these into our photo modelling software, Agisoft PhotoScan. The remaining GPS coordinated check point data were added later in ArcMap to the completed orthomosaic and digital elevation model so we could accurately compare the UAV photogrammetry XYZ data with the RTK GPS XYZ data at highly reliable common points. The accuracy we achieved throughout the 45 check points was 95% reliably within 41 mm horizontally and 68 mm vertically and with an 11.7 mm ground sample distance taken from a flight altitude above ground level of 90 m. The area covered by one image was 70.2 m × 46.4 m, which equals 0.325 Ha. This
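
    A typical way to summarise such a check-point comparison (not necessarily the authors' exact procedure) is to compute RMSE and 95th-percentile errors from the residuals between the photogrammetric and RTK GPS coordinates; the residuals below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic residuals (metres) at 45 check points: UAV-derived XYZ minus RTK GPS XYZ.
dx, dy, dz = (rng.normal(0.0, 0.02, 45) for _ in range(3))

horiz = np.hypot(dx, dy)                      # horizontal error per check point
rmse_h = np.sqrt(np.mean(horiz ** 2))         # horizontal RMSE
rmse_v = np.sqrt(np.mean(dz ** 2))            # vertical RMSE
p95_h = np.percentile(horiz, 95)              # 95% of points fall within this
p95_v = np.percentile(np.abs(dz), 95)

print(f"RMSE: {rmse_h * 1000:.1f} mm horizontal, {rmse_v * 1000:.1f} mm vertical")
print(f"95%:  {p95_h * 1000:.1f} mm horizontal, {p95_v * 1000:.1f} mm vertical")
```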

  6. Electron ray tracing with high accuracy

    International Nuclear Information System (INIS)

    Saito, K.; Okubo, T.; Takamoto, K.; Uno, Y.; Kondo, M.

    1986-01-01

    An electron ray tracing program is developed to investigate the overall geometrical and chromatic aberrations in electron optical systems. The program also computes aberrations due to manufacturing errors in lenses and deflectors. Computation accuracy is improved by (1) calculating electrostatic and magnetic scalar potentials using the finite element method with third-order isoparametric elements, and (2) solving the modified ray equation which the aberrations satisfy. Computation accuracy of 4 nm is achieved for calculating optical properties of the system with an electrostatic lens

  7. Revolutionizing radiographic diagnostic accuracy in periodontics

    Directory of Open Access Journals (Sweden)

    Brijesh Sharma

    2016-01-01

    Full Text Available Effective diagnostic accuracy has in some way been the missing link between periodontal diagnosis and treatment. Most of the clinicians rely on the conventional two-dimensional (2D radiographs. But being a 2D image, it has its own limitations. 2D images at times can give an incomplete picture about the severity or type of disease and can further affect the treatment plan. Cone beam computed tomography (CBCT has a better potential for detecting periodontal bone defects with accuracy. The purpose here is to describe how CBCT imaging is beneficial in accurate diagnosis and will lead to a precise treatment plan.

  8. ACCURACY ANALYSIS OF KINECT DEPTH DATA

    Directory of Open Access Journals (Sweden)

    K. Khoshelham

    2012-09-01

    Full Text Available This paper presents an investigation of the geometric quality of depth data obtained by the Kinect sensor. Based on the mathematical model of depth measurement by the sensor a theoretical error analysis is presented, which provides an insight into the factors influencing the accuracy of the data. Experimental results show that the random error of depth measurement increases with increasing distance to the sensor, and ranges from a few millimetres up to about 4 cm at the maximum range of the sensor. The accuracy of the data is also found to be influenced by the low resolution of the depth measurements.
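
    For this kind of triangulation-based sensor the random depth error grows roughly with the square of the distance. A toy sketch of such an error model is shown below; the coefficient is chosen only so that the error reaches about 4 cm near the maximum range and is not a calibrated constant from the paper.

```python
def depth_noise_std(z_m, k=0.04 / 5.0 ** 2):
    """Toy random-error model: standard deviation (m) grows with the square of
    the distance; k is a made-up coefficient, not a calibrated sensor constant."""
    return k * z_m ** 2

for z in (1, 2, 3, 4, 5):
    print(f"{z} m -> {depth_noise_std(z) * 1000:.1f} mm")
```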

  9. Final Technical Report: Increasing Prediction Accuracy.

    Energy Technology Data Exchange (ETDEWEB)

    King, Bruce Hardison [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hansen, Clifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stein, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    PV performance models are used to quantify the value of PV plants in a given location. They combine the performance characteristics of the system, the measured or predicted irradiance and weather at a site, and the system configuration and design into a prediction of the amount of energy that will be produced by a PV system. These predictions must be as accurate as possible in order for finance charges to be minimized. Higher accuracy equals lower project risk. The Increasing Prediction Accuracy project at Sandia focuses on quantifying and reducing uncertainties in PV system performance models.

  10. Sampling Molecular Conformers in Solution with Quantum Mechanical Accuracy at a Nearly Molecular-Mechanics Cost.

    Science.gov (United States)

    Rosa, Marta; Micciarelli, Marco; Laio, Alessandro; Baroni, Stefano

    2016-09-13

    We introduce a method to evaluate the relative populations of different conformers of molecular species in solution, aiming at quantum mechanical accuracy, while keeping the computational cost at a nearly molecular-mechanics level. This goal is achieved by combining long classical molecular-dynamics simulations to sample the free-energy landscape of the system, advanced clustering techniques to identify the most relevant conformers, and thermodynamic perturbation theory to correct the resulting populations, using quantum-mechanical energies from density functional theory. A quantitative criterion for assessing the accuracy thus achieved is proposed. The resulting methodology is demonstrated in the specific case of cyanin (cyanidin-3-glucoside) in water solution.
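
    The essential reweighting idea can be sketched in a few lines: cluster populations from the classical simulation are corrected by Boltzmann factors of the energy difference between the quantum-mechanical and classical descriptions. The populations and energy corrections below are hypothetical, and the paper's actual protocol is more elaborate.

```python
import numpy as np

kT = 0.593  # kcal/mol at roughly 298 K

# Hypothetical per-conformer data: populations from classical MD clustering and
# energy corrections dE = E_QM - E_MM (kcal/mol) on cluster representatives.
p_classical = np.array([0.50, 0.30, 0.15, 0.05])
dE = np.array([0.0, -0.4, 0.6, -1.0])

# Perturbative reweighting: w_i proportional to p_i * exp(-dE_i / kT), renormalised.
w = p_classical * np.exp(-dE / kT)
p_corrected = w / w.sum()
print(p_corrected.round(3))
```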

  11. 75 FR 57253 - Submission for OMB Review; Comment Request

    Science.gov (United States)

    2010-09-20

    ... and expanded data on the income and general economic and financial situation of the U.S. population... context of several goals--cost reduction and improved accuracy, relevance, timeliness, reduced burden on... incentive, a newsletter reporting findings from the 2008 SIPP Panel, or no contact between interview periods...

  12. Does relevance matter in academic policy research

    DEFF Research Database (Denmark)

    Dredge, Dianne

    2015-01-01

    A reflection on whether relevance matters in tourism policy research, and if so, to whom/what should it matter.

  13. Inoculating Relevance Feedback Against Poison Pills

    NARCIS (Netherlands)

    Dehghani, Mostafa; Azarbonyad, Hosein; Kamps, Jaap; Hiemstra, Djoerd; Marx, Maarten

    2016-01-01

    Relevance Feedback is a common approach for enriching queries, given a set of explicitly or implicitly judged documents to improve the performance of the retrieval. Although it has been shown that on average, the overall performance of retrieval will be improved after relevance feedback, for some

  14. Relevant cost information for order acceptance decisions

    NARCIS (Netherlands)

    Wouters, M.J.F.

    1997-01-01

    Some economic considerations for order acceptance decisions are discussed. The relevant economic considerations for order acceptance are widely discussed in the literature: only those costs are relevant which would be avoidable by not accepting the order, i.e. incremental costs plus opportunity costs.

  15. Android Smartphone Relevance to Military Weather Applications

    Science.gov (United States)

    2011-10-01

    ...lithium-ion battery that may be replaced by the user (unlike Apple iPod Touch devices), thus spare batteries can be carried. If there is only sporadic... (David Sauter, ARL-TR-5793, October 2011, Computational and Information Sciences Directorate, ARL)

  16. Using small XML elements to support relevance

    NARCIS (Netherlands)

    G. Ramirez Camps (Georgina); T.H.W. Westerveld (Thijs); A.P. de Vries (Arjen)

    2006-01-01

    Small XML elements are often estimated relevant by the retrieval model but they are not desirable retrieval units. This paper presents a generic model that exploits the information obtained from small elements. We identify relationships between small and relevant elements and use this

  17. Translation as secondary communication. The relevance theory ...

    African Journals Online (AJOL)

    Ernst-August Gutt started one of the greatest translation debates of the past ten years when he suggested that relevance theory holds the key to providing a unified account of translation. The bulk of the debate has been between practitioners of functional equivalence and advocates of a relevance theoretic approach to ...

  18. Evolutionary relevance facilitates visual information processing.

    Science.gov (United States)

    Jackson, Russell E; Calvillo, Dusti P

    2013-11-03

    Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  19. Evolutionary Relevance Facilitates Visual Information Processing

    Directory of Open Access Journals (Sweden)

    Russell E. Jackson

    2013-07-01

    Full Text Available Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  20. Accuracy Of Stereometry In Assessing Orthognathic Surgery

    Science.gov (United States)

    King, Geoffrey E.; Bays, R. A.

    1983-07-01

    An X-ray stereometric technique has been developed for the determination of 3-dimensional coordinates of spherical metallic markers previously implanted in monkey skulls. The accuracy of the technique is better than 0.5 mm, and it uses readily available demountable X-ray equipment. The technique is used to study the effects and stability of experimental orthognathic surgery.

  1. Accuracy of sampling during mushroom cultivation

    NARCIS (Netherlands)

    Baars, J.J.P.; Hendrickx, P.M.; Sonnenberg, A.S.M.

    2015-01-01

    Experiments described in this report were performed to increase the accuracy of the analysis of the biological efficiency of Agaricus bisporus strains. Biological efficiency is a measure of the efficiency with which the mushroom strains use dry matter in the compost to produce mushrooms (expressed

  2. Laser measuring scanners and their accuracy limits

    Science.gov (United States)

    Jablonski, Ryszard

    1993-09-01

    Scanning methods have gained in importance in recent years owing to their short measuring time and wide range of application in flexible manufacturing processes. This paper sums up the author's scientific work in the field of measuring scanners. The research conducted made it possible to elaborate optimal configurations of measuring systems based on the scanning method. An important part of the work was the analysis of a measuring scanner as a transducer of angular rotation into linear displacement, which resulted in much higher accuracy and finally in a measuring scanner that eliminates the use of an additional reference standard. The work concludes with an attempt to determine the attainable accuracy limit of scanning measurement of both length and angle. Using a high stability deflector and a corrected scanning lens one can obtain the angle determination over 30 (or 2 mm) to an accuracy 0 (or 0 tm) when the measuring rate is 1000 Hz or the range d60 (4 mm) with accuracy 0 " (0 jim) and measurement frequency 6 Hz.

  3. High Accuracy Transistor Compact Model Calibrations

    Energy Technology Data Exchange (ETDEWEB)

    Hembree, Charles E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Mar, Alan [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robertson, Perry J. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected, since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performances considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as to give an accurate description of a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.

  4. Accuracy of References in Five Entomology Journals.

    Science.gov (United States)

    Kristof, Cynthia

    In this paper, the bibliographical references in five core entomology journals are examined for citation accuracy in order to determine if the error rates are similar. Every reference printed in each journal's first issue of 1992 was examined, and these were compared to the original (cited) publications, if possible, in order to determine the…

  5. Accuracy of abdominal auscultation for bowel obstruction

    DEFF Research Database (Denmark)

    Breum, Birger Michael; Rud, Bo; Kirkegaard, Thomas

    2015-01-01

    AIM: To investigate the accuracy and inter-observer variation of bowel sound assessment in patients with clinically suspected bowel obstruction. METHODS: Bowel sounds were recorded in patients with suspected bowel obstruction using a Littmann(®) Electronic Stethoscope. The recordings were process...

  6. Positional Accuracy Assessment for Effective Shoreline Change ...

    African Journals Online (AJOL)

    Ghana Mining Journal ... Data quality may be expressed in terms of several indicators such as attributes, temporal or positional accuracies. ... It is concluded that for the purpose of shoreline change analysis, such as shoreline change trends, large scale data sources should be used where possible for accurate ...

  7. Accuracy assessment of an industrial actuator

    DEFF Research Database (Denmark)

    Dalla Costa, Giuseppe; Genta, Gianfranco; Barbato, Giulio

    2016-01-01

    A commercial linear actuator equipped with a 0.1 μm resolution encoder was used as a contact displacement sensor with adjustable force. The accuracy of the position reading of the actuator was evaluated from experimental data taking into account the uncertainty contributions. The tests consisted ...

  8. Diagnostic accuracy of organ electrodermal diagnostics | Szopinski ...

    African Journals Online (AJOL)

    Objective. To estimate the diagnostic accuracy as well as the scope of utilisation of a new bio-electronic method of organ diagnostics. Design. Double-blind comparative study of the diagnostic results obtained by means of organ electrodermal diagnostics (OED) and clinical diagnoses, as a criterion standard. Setting.

  9. Task Speed and Accuracy Decrease When Multitasking

    Science.gov (United States)

    Lin, Lin; Cockerham, Deborah; Chang, Zhengsi; Natividad, Gloria

    2016-01-01

    As new technologies increase the opportunities for multitasking, the need to understand human capacities for multitasking continues to grow stronger. Is multitasking helping us to be more efficient? This study investigated the multitasking abilities of 168 participants, ages 6-72, by measuring their task accuracy and completion time when they…

  10. Content in Context Improves Deception Detection Accuracy

    Science.gov (United States)

    Blair, J. Pete; Levine, Timothy R.; Shaw, Allison S.

    2010-01-01

    Past research has shown that people are only slightly better than chance at distinguishing truths from lies. Higher accuracy rates, however, are possible when contextual knowledge is used to judge the veracity of situated message content. The utility of content in context was shown in a series of experiments with students (N = 26, 45, 51, 25, 127)…

  11. On the accuracy of short read mapping

    DEFF Research Database (Denmark)

    Menzel, Karl Peter; Frellsen, Jes; Plass, Mireya

    2013-01-01

    …i.e., mapping the reads to a reference genome. In this new situation, conventional alignment tools are obsolete, as they cannot handle this huge amount of data in a reasonable amount of time. Thus, new mapping algorithms have been developed, which are fast at the expense of a small decrease in accuracy...

  12. Studies on the diagnostic accuracy of lymphography

    International Nuclear Information System (INIS)

    Luening, M.; Stargardt, A.; Abet, L.

    1979-01-01

    Contradictory reports in the literature on the reliability of lymphography stimulated the authors to test the diagnostic accuracy, employing methods which are approximately analogous to practice, using carcinoma of the cervix as the model on which the study was carried out. Using 21 observers it was found that there was no correlation between their experience and on-target accuracy of the diagnosis. Good observers obtained an accuracy of 85% with good proportions between sensitivity in the recognition of detail, specificity and readiness to arrive at a decision on the basis of discriminatory ability. With the help of the concept of the ROC curves, the position taken up by the observers in respect of diagnostic decisions, and a complex manner of assessing the various characteristic factors determining diagnostic accuracy, are demonstrated. This form of test, which permits manipulation of different variants of diagnosis, is recommended, among other things, for performance control at the end of training and continuing education courses in other fields of x-ray diagnosis as well. (orig.)

  13. Accuracy of Digital vs. Conventional Implant Impressions

    Science.gov (United States)

    Lee, Sang J.; Betensky, Rebecca A.; Gianneschi, Grace E.; Gallucci, German O.

    2015-01-01

    The accuracy of digital impressions greatly influences the clinical viability in implant restorations. The aim of this study is to compare the accuracy of gypsum models acquired from the conventional implant impression to digitally milled models created from direct digitalization by three-dimensional analysis. Thirty gypsum and 30 digitally milled models impressed directly from a reference model were prepared. The models were scanned by a laboratory scanner and 30 STL datasets from each group were imported to an inspection software. The datasets were aligned to the reference dataset by a repeated best fit algorithm and 10 specified contact locations of interest were measured in mean volumetric deviations. The areas were pooled by cusps, fossae, interproximal contacts, horizontal and vertical axes of implant position and angulation. The pooled areas were statistically analysed by comparing each group to the reference model to investigate the mean volumetric deviations accounting for accuracy and standard deviations for precision. Milled models from digital impressions had comparable accuracy to gypsum models from conventional impressions. However, differences in fossae and vertical displacement of the implant position from the gypsum and digitally milled models compared to the reference model, exhibited statistical significance (p<0.001, p=0.020 respectively). PMID:24720423

  14. Analyzing thematic maps and mapping for accuracy

    Science.gov (United States)

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given. classification category. The results of an accuracy test are presented in a contingency table sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors by commission, and the remaining elements of the columns represent the errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification. The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by
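
    The error-matrix bookkeeping described above can be made concrete with a small example: overall accuracy comes from the diagonal, commission error from the off-diagonal row elements and omission error from the off-diagonal column elements. The matrix values are invented.

```python
import numpy as np

# Invented classification error matrix: rows = interpretation (map),
# columns = verification (reference); the diagonal holds correct classifications.
cm = np.array([[50,  3,  2],
               [ 4, 40,  6],
               [ 1,  5, 39]])

overall_accuracy = np.trace(cm) / cm.sum()
commission_error = 1 - np.diag(cm) / cm.sum(axis=1)   # off-diagonal row elements
omission_error = 1 - np.diag(cm) / cm.sum(axis=0)     # off-diagonal column elements

print(f"overall accuracy = {overall_accuracy:.2%}")
print("commission error per category:", commission_error.round(3))
print("omission error per category:  ", omission_error.round(3))
```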

  15. T-ray relevant frequencies for osteosarcoma classification

    Science.gov (United States)

    Withayachumnankul, W.; Ferguson, B.; Rainsford, T.; Findlay, D.; Mickan, S. P.; Abbott, D.

    2006-01-01

    We investigate the classification of the T-ray response of normal human bone cells and human osteosarcoma cells, grown in culture. Given the magnitude and phase responses within a reliable spectral range as features for input vectors, a trained support vector machine can correctly classify the two cell types to some extent. Performance of the support vector machine is degraded by the curse of dimensionality, resulting from the comparatively large number of features in the input vectors. Feature subset selection methods are used to select only an optimal number of relevant features for inputs. As a result, an improvement in generalization performance is attainable, and the selected frequencies can be used for further describing the different mechanisms by which the cells respond to T-rays. We demonstrate a consistent classification accuracy of 89.6%, while only one fifth of the original features are retained in the data set.
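
    A minimal sketch of this kind of pipeline, feature subset selection followed by an SVM with cross-validated accuracy, is given below using synthetic data in place of the T-ray spectra; it illustrates the general approach, not the authors' implementation.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 100))        # 60 samples x 100 spectral features (synthetic)
y = rng.integers(0, 2, size=60)       # two cell classes
X[y == 1, :20] += 0.8                 # make the first 20 features informative

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=20),   # keep about 1/5 of the features
                    SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```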

  16. Relevance vector machine technique for the inverse scattering problem

    International Nuclear Information System (INIS)

    Wang Fang-Fang; Zhang Ye-Rong

    2012-01-01

    A novel method based on the relevance vector machine (RVM) for the inverse scattering problem is presented in this paper. The nonlinearity and the ill-posedness inherent in this problem are simultaneously considered. The nonlinearity is embodied in the relation between the scattered field and the target property, which can be obtained through the RVM training process. Besides, rather than utilizing regularization, the ill-posed nature of the inversion is naturally accounted for because the RVM can produce a probabilistic output. Simulation results reveal that the proposed RVM-based approach can provide comparable performance in terms of accuracy, convergence, robustness and generalization, and improved performance in terms of sparsity, in comparison with the support vector machine (SVM) based approach. (general)
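
    scikit-learn does not ship an RVM, but its ARDRegression estimator (automatic relevance determination) gives a similar sparse Bayesian flavour with probabilistic output, which makes it convenient for a rough illustration; the data below are synthetic and unrelated to inverse scattering.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 30))
true_w = np.zeros(30)
true_w[[2, 7, 11]] = [1.5, -2.0, 0.7]        # only a few features are relevant
y = X @ true_w + 0.1 * rng.normal(size=200)

model = ARDRegression().fit(X, y)
y_mean, y_std = model.predict(X[:5], return_std=True)   # probabilistic output
print(np.round(model.coef_, 2))   # most coefficients are driven toward zero (sparsity)
```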

  17. Using the Characteristics of Documents, Users and Tasks to Predict the Situational Relevance of Health Web Documents

    Directory of Open Access Journals (Sweden)

    Melinda Oroszlányová

    2017-09-01

    Full Text Available Relevance is usually estimated by search engines using document content, disregarding the user behind the search and the characteristics of the task. In this work, we look at relevance as framed in a situational context, calling it situational relevance, and analyze whether it is possible to predict it using document, user and task characteristics. Using an existing dataset composed of health web documents, relevance judgments for information needs, and user and task characteristics, we build a multivariate prediction model for situational relevance. Our model has an accuracy of 77.17%. Our findings provide insights into features that could improve the estimation of relevance by search engines, helping to reconcile the systemic and situational views of relevance. In the near future we will work on the automatic assessment of document, user and task characteristics.

  18. Diagnostic accuracy of general physician versus emergency medicine specialist in interpretation of chest X-ray suspected for iatrogenic pneumothorax: a brief report

    Directory of Open Access Journals (Sweden)

    Ghane Mohammad-reza

    2012-03-01

    Conclusion: These findings indicate that the diagnostic accuracy of emergency medicine specialists is significantly higher than that of general physicians. The diagnostic accuracy of both physician groups was higher than the values reported in similar studies, which signifies the role of the relevant training given in the emergency department of the hospital.

  19. Computed tomography angiogram. Accuracy in renal surgery

    International Nuclear Information System (INIS)

    Rabah, Danny M.; Al-Hathal, Naif; Al-Fuhaid, Turki; Raza, Sayed; Al-Yami, Fahad; Al-Taweel, Waleed; Alomar, Mohamed; Al-Nagshabandi, Nizar

    2009-01-01

    The objective of this study was to determine the sensitivity and specificity of computed tomography angiogram (CTA) in detecting the number and location of renal arteries and veins as well as crossing vessels causing uretero-pelvic junction obstruction (UPJO), and to determine if this can be used in decision-making algorithms for treatment of UPJO. A prospective study was carried out in patients undergoing open, laparoscopic and robotic renal surgery from April 2005 until October 2006. All patients were imaged using CTA with 1.25 collimation of arterial and venous phases. Each multi-detector CTA was then read by one radiologist and his results were compared prospectively with the actual intra-operative findings. Overall, 118 patients were included. CTA had 93% sensitivity, 77% specificity and 90% overall accuracy for detecting a single renal artery, and 76% sensitivity, 92% specificity and 90% overall accuracy for detecting two or more renal arteries (Pearson χ² = 0.001). There was 95% sensitivity, 84% specificity and 85% overall accuracy for detecting the number of renal veins. CTA had 100% overall accuracy in detecting an early dividing renal artery (defined as branching less than 1.5 cm from the origin), and 83.3% sensitivity, specificity and overall accuracy in detecting crossing vessels at the UPJ. The percentage of surgeons stating CTA to be helpful as a pre-operative diagnostic tool was 85%. Computed tomography angiogram is simple and quick and can provide an accurate pre-operative picture of the renal vascular anatomy in terms of the number and location of renal vessels, early dividing renal arteries and crossing vessels at the UPJ. (author)

  20. Diagnostic accuracy of MRCP in choledocholithiasis

    International Nuclear Information System (INIS)

    Guarise, Alessandro; Mainardi, Paride; Baltieri, Susanna; Faccioli, Niccolo'

    2005-01-01

    Purpose: To evaluate the accuracy of MRCP in diagnosing choledocholithiasis, considering Endoscopic Retrograde Cholangiopancreatography (ERCP) as the gold standard, and to compare the results achieved during the first two years of use (1999-2000) of Magnetic Resonance Cholangiopancreatography (MRCP) in patients with suspected choledocholithiasis with those achieved during the following two years (2001-2002), in order to establish the repeatability and objectivity of MRCP results. Materials and methods: One hundred and seventy consecutive patients underwent MRCP followed by ERCP within 72 h. In 22/170 (13%) patients ERCP was unsuccessful for different reasons. MRCP was performed using a 1.5 T magnet with both multi-slice HASTE sequences and a thick-slice projection technique. Choledocholithiasis was diagnosed in the presence of signal void images in the dependent portion of the duct surrounded by hyperintense bile and detected in at least two projections. The MRCP results, read independently from the ERCP results, were compared in two different and subsequent periods. Results: ERCP confirmed choledocholithiasis in 87 patients. In these cases the results of MRCP were the following: 78 true positives, 53 true negatives, 7 false positives, and 9 false negatives. The sensitivity, specificity and accuracy were 90%, 88% and 89%, respectively. After the exclusion of stones with diameters smaller than 6 mm, the sensitivity, specificity and accuracy were 100%, 99% and 99%, respectively. MRCP accuracy was related to the size of the stones. There was no significant statistical difference between the results obtained in the first two-year period and those obtained in the second period. Conclusions: MRCP is sufficiently accurate to replace ERCP in patients with suspected choledocholithiasis. The results are related to the size of the stones. The use of well-defined radiological signs allows good diagnostic accuracy independent of the learning curve.
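
    The quoted percentages follow directly from the stated counts (78 TP, 53 TN, 7 FP, 9 FN); a quick consistency check:

```python
tp, tn, fp, fn = 78, 53, 7, 9

sensitivity = tp / (tp + fn)                    # 0.897 -> ~90%
specificity = tn / (tn + fp)                    # 0.883 -> ~88%
accuracy = (tp + tn) / (tp + tn + fp + fn)      # 0.891 -> ~89%

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, accuracy {accuracy:.1%}")
```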

  1. Photon caliper to achieve submillimeter positioning accuracy

    Science.gov (United States)

    Gallagher, Kyle J.; Wong, Jennifer; Zhang, Junan

    2017-09-01

    The purpose of this study was to demonstrate the feasibility of using a commercial two-dimensional (2D) detector array with an inherent detector spacing of 5 mm to achieve submillimeter accuracy in localizing the radiation isocenter. This was accomplished by delivering the Vernier ‘dose’ caliper to a 2D detector array where the nominal scale was the 2D detector array and the non-nominal Vernier scale was the radiation dose strips produced by the high-definition (HD) multileaf collimators (MLCs) of the linear accelerator. Because the HD MLC sequence was similar to the picket fence test, we called this procedure the Vernier picket fence (VPF) test. We confirmed the accuracy of the VPF test by offsetting the HD MLC bank by known increments and comparing the known offset with the VPF test result. The VPF test was able to determine the known offset within 0.02 mm. We also cross-validated the accuracy of the VPF test in an evaluation of couch hysteresis. This was done by using both the VPF test and the ExacTrac optical tracking system to evaluate the couch position. We showed that the VPF test was in agreement with the ExacTrac optical tracking system within a root-mean-square value of 0.07 mm for both the lateral and longitudinal directions. In conclusion, we demonstrated the VPF test can determine the offset between a 2D detector array and the radiation isocenter with submillimeter accuracy. Until now, no method to locate the radiation isocenter using a 2D detector array has been able to achieve such accuracy.

  2. High accuracy FIONA-AFM hybrid imaging

    International Nuclear Information System (INIS)

    Fronczek, D.N.; Quammen, C.; Wang, H.; Kisker, C.; Superfine, R.; Taylor, R.; Erie, D.A.; Tessmer, I.

    2011-01-01

    Multi-protein complexes are ubiquitous and play essential roles in many biological mechanisms. Single molecule imaging techniques such as electron microscopy (EM) and atomic force microscopy (AFM) are powerful methods for characterizing the structural properties of multi-protein and multi-protein-DNA complexes. However, a significant limitation to these techniques is the ability to distinguish different proteins from one another. Here, we combine high resolution fluorescence microscopy and AFM (FIONA-AFM) to allow the identification of different proteins in such complexes. Using quantum dots as fiducial markers in addition to fluorescently labeled proteins, we are able to align fluorescence and AFM information to ≥8 nm accuracy. This accuracy is sufficient to identify individual fluorescently labeled proteins in most multi-protein complexes. We investigate the limitations of localization precision and accuracy in fluorescence and AFM images separately and their effects on the overall registration accuracy of FIONA-AFM hybrid images. This combination of the two orthogonal techniques (FIONA and AFM) opens a wide spectrum of possible applications to the study of protein interactions, because AFM can yield high resolution (5-10 nm) information about the conformational properties of multi-protein complexes and the fluorescence can indicate spatial relationships of the proteins in the complexes. Research highlights: integration of fluorescent signals in AFM topography with high (<10 nm) accuracy; investigation of limitations and quantitative analysis of fluorescence-AFM image registration using quantum dots; fluorescence center tracking and display as localization probability distributions in AFM topography (FIONA-AFM); application of FIONA-AFM to a biological sample containing damaged DNA and the DNA repair proteins UvrA and UvrB conjugated to quantum dots.

  3. Does filler database size influence identification accuracy?

    Science.gov (United States)

    Bergold, Amanda N; Heaton, Paul

    2018-06-01

    Police departments increasingly use large photo databases to select lineup fillers using facial recognition software, but this technological shift's implications have been largely unexplored in eyewitness research. Database use, particularly if coupled with facial matching software, could enable lineup constructors to increase filler-suspect similarity and thus enhance eyewitness accuracy (Fitzgerald, Oriet, Price, & Charman, 2013). However, with a large pool of potential fillers, such technologies might theoretically produce lineup fillers too similar to the suspect (Fitzgerald, Oriet, & Price, 2015; Luus & Wells, 1991; Wells, Rydell, & Seelau, 1993). This research proposes a new factor-filler database size-as a lineup feature affecting eyewitness accuracy. In a facial recognition experiment, we select lineup fillers in a legally realistic manner using facial matching software applied to filler databases of 5,000, 25,000, and 125,000 photos, and find that larger databases are associated with a higher objective similarity rating between suspects and fillers and lower overall identification accuracy. In target present lineups, witnesses viewing lineups created from the larger databases were less likely to make correct identifications and more likely to select known innocent fillers. When the target was absent, database size was associated with a lower rate of correct rejections and a higher rate of filler identifications. Higher algorithmic similarity ratings were also associated with decreases in eyewitness identification accuracy. The results suggest that using facial matching software to select fillers from large photograph databases may reduce identification accuracy, and provides support for filler database size as a meaningful system variable. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Audiovisual biofeedback improves motion prediction accuracy.

    Science.gov (United States)

    Pollock, Sean; Lee, Danny; Keall, Paul; Kim, Taeho

    2013-04-01

    The accuracy of motion prediction, utilized to overcome the system latency of motion management radiotherapy systems, is hampered by irregularities present in the patients' respiratory pattern. Audiovisual (AV) biofeedback has been shown to reduce respiratory irregularities. The aim of this study was to test the hypothesis that AV biofeedback improves the accuracy of motion prediction. An AV biofeedback system combined with real-time respiratory data acquisition and MR images was implemented in this project. One-dimensional respiratory data from (1) the abdominal wall (30 Hz) and (2) the thoracic diaphragm (5 Hz) were obtained from 15 healthy human subjects across 30 studies. The subjects were required to breathe with and without the guidance of AV biofeedback during each study. The obtained respiratory signals were then implemented in a kernel density estimation prediction algorithm. For each of the 30 studies, five different prediction times ranging from 50 to 1400 ms were tested (150 predictions performed). Prediction error was quantified as the root mean square error (RMSE); the RMSE was calculated from the difference between the real and predicted respiratory data. The statistical significance of the prediction results was determined by Student's t-test. Prediction accuracy was considerably improved by the implementation of AV biofeedback. Of the 150 respiratory predictions performed, prediction accuracy was improved 69% (103/150) of the time for abdominal wall data, and 78% (117/150) of the time for diaphragm data. The average reduction in RMSE due to AV biofeedback over unguided respiration was 26%. In conclusion, AV biofeedback improves prediction accuracy. This would result in increased efficiency of motion management techniques affected by system latencies used in radiotherapy.
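
    The RMSE metric used here is straightforward to reproduce; a small sketch with synthetic respiratory traces (a sinusoid and a lagged copy standing in for the real and predicted signals):

```python
import numpy as np

def rmse(real, predicted):
    """Root mean square error between real and predicted respiratory traces."""
    real, predicted = np.asarray(real), np.asarray(predicted)
    return np.sqrt(np.mean((real - predicted) ** 2))

t = np.linspace(0, 10, 300)                  # seconds
real = np.sin(2 * np.pi * t / 4)             # 4 s breathing period (synthetic)
pred = np.sin(2 * np.pi * (t - 0.2) / 4)     # prediction lagging by 200 ms
print(f"RMSE = {rmse(real, pred):.3f} (displacement units)")
```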

  5. Astrophysical relevance of γ transition energies

    International Nuclear Information System (INIS)

    Rauscher, Thomas

    2008-01-01

    The relevant γ energy range is explicitly identified where additional γ strength must be located to have an impact on astrophysically relevant reactions. It is shown that folding the energy dependences of the transmission coefficients and the level density leads to maximal contributions for γ energies of 2 ≤ Eγ ≤ 4 unless quantum selection rules allow isolated states to contribute. Under this condition, electric dipole transitions dominate. These findings allow us to more accurately judge the relevance of modifications of the γ strength for astrophysics

  6. Neutrophil programming dynamics and its disease relevance.

    Science.gov (United States)

    Ran, Taojing; Geng, Shuo; Li, Liwu

    2017-11-01

    Neutrophils are traditionally considered as first responders to infection and provide antimicrobial host defense. However, recent advances indicate that neutrophils are also critically involved in the modulation of host immune environments by dynamically adopting distinct functional states. Functionally diverse neutrophil subsets are increasingly recognized as critical components mediating host pathophysiology. Despite its emerging significance, molecular mechanisms as well as functional relevance of dynamically programmed neutrophils remain to be better defined. The increasing complexity of neutrophil functions may require integrative studies that address programming dynamics of neutrophils and their pathophysiological relevance. This review aims to provide an update on the emerging topics of neutrophil programming dynamics as well as their functional relevance in diseases.

  7. Evaluating measurement accuracy a practical approach

    CERN Document Server

    Rabinovich, Semyon G

    2013-01-01

    The goal of Evaluating Measurement Accuracy: A Practical Approach is to present methods for estimating the accuracy of measurements performed in industry, trade, and scientific research. From developing the theory of indirect measurements to proposing new methods of reduction, transformation, and enumeration, this work encompasses the full range of measurement data processing. It includes many examples that illustrate the application of general theory to typical problems encountered in measurement practice. As a result, the book serves as an inclusive reference work for data processing of all types of measurements: single and multiple, combined and simultaneous, direct (both linear and nonlinear), and indirect (both dependent and independent). It is a working tool for experimental scientists and engineers of all disciplines who work with instrumentation. It is also a good resource for natural science and engineering students and for technicians performing measurements in industry. A key feature of the book is...

  8. Improving the accuracy of dynamic mass calculation

    Directory of Open Access Journals (Sweden)

    Oleksandr F. Dashchenko

    2015-06-01

    Full Text Available With the acceleration of goods transport, cargo accounting plays an important role in today's global and complex environment. Weight is the most reliable indicator for materials control. Unlike many other variables that can be measured indirectly, weight can be measured directly and accurately. Using strain-gauge transducers, a weight value can be obtained within a few milliseconds; such values correspond to the momentary load acting on the sensor. Determination of the weight of moving transport is only possible by appropriate processing of the sensor signal. The aim of the research is to develop a methodology for weighing freight rolling stock that increases the accuracy of the measurement of dynamic mass, in particular of a wagon in motion. In addition to time-series methods, preliminary filtering is used to improve the accuracy of the calculation. The results of the simulation are presented.
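
    One simple form of the preliminary filtering mentioned above is zero-phase low-pass filtering of the strain-gauge trace before averaging; the sampling rate, load and filter cut-off below are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
true_load = 25000.0                           # kg, synthetic axle load
rng = np.random.default_rng(0)
raw = (true_load
       + 400.0 * np.sin(2 * np.pi * 3.0 * t)  # dynamic oscillation of the wagon
       + 150.0 * rng.normal(size=t.size))     # sensor noise

b, a = butter(4, 1.5 / (fs / 2), btype="low") # 4th-order low-pass at 1.5 Hz
smoothed = filtfilt(b, a, raw)                # zero-phase filtering

print(f"raw mean = {raw.mean():.1f} kg, filtered mean = {smoothed.mean():.1f} kg")
```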

  9. Teaching accuracy and reliability for student projects

    Science.gov (United States)

    Fisher, Nick

    2002-09-01

    Physics students at Rugby School follow the Salters Horners A-level course, which involves working on a two-week practical project of their own choosing. Pupils often misunderstand the concepts of accuracy and reliability, believing, for example, that repeating readings makes them more accurate and more reliable, whereas all it does is help to check repeatability. The course emphasizes the ideas of checking anomalous points, improving accuracy and making readings more sensitive. This article describes how we teach pupils in preparation for their projects. Based on many years of running such projects, much of this material is from a short booklet that we give out to pupils, when we train them in practical project skills.

  10. The foundation of the concept of relevance

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2010-01-01

    that what was regarded as the most fundamental view by Saracevic in 1975 has not since been considered (with very few exceptions). Other views, which are based on less fruitful assumptions, have dominated the discourse on relevance in information retrieval and information science. Many authors have...... reexamined the concept of relevance in information science, but have neglected the subject knowledge view, hence basic theoretical assumptions seem not to have been properly addressed. It is as urgent now as it was in 1975 seriously to consider “the subject knowledge view” of relevance (which may also...... be termed “the epistemological view”). The concept of relevance, like other basic concepts, is influenced by overall approaches to information science, such as the cognitive view and the domain-analytic view. There is today a trend toward a social paradigm for information science. This paper offers...

  11. Exploring Educational Quality and Relevance through Integrating ...

    African Journals Online (AJOL)

    Exploring Educational Quality and Relevance through Integrating Environmental and Social Issues in Science Education. ... However, the new contextualised concept of learning and teaching was applied only to one of them. A post-test was ...

  12. Has Financial Statement Information become Less Relevant?

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Damkier, Jesper

    This paper presents insights into the question of whether accounting information based on the EU's Accounting Directives has become less value-relevant to investors over time. The study is based on a research design first used by Francis and Schipper (1999), where value-relevance is measured ... The sample is based on non-financial companies listed on the Copenhagen Stock Exchange in the period 1984-2002. Our analyses show that all the applied accounting measures are value-relevant, as investment strategies based on the information earn positive market-adjusted returns in our sample period. The results provide some indication of a decline in the value-relevance of earnings information in the 1984-2001 period, and mixed, but not statistically reliable, evidence for accounting measures where book value information and asset values are also extracted from financial statements. The results seem

  13. Accuracy of clinical diagnosis in knee arthroscopy.

    OpenAIRE

    Brooks, Stuart; Morgan, Mamdouh

    2002-01-01

    A prospective study of 238 patients was performed in a district general hospital to assess current diagnostic accuracy rates and to ascertain the use and the effectiveness of magnetic resonance imaging (MRI) scanning in reducing the number of negative arthroscopies. The pre-operative diagnosis of patients listed for knee arthroscopy was medial meniscus tear 94 (40%) and osteoarthritis 59 (25%). MRI scans were requested in 57 patients (24%) with medial meniscus tear representing 65% (37 patien...

  14. High accuracy 3-D laser radar

    DEFF Research Database (Denmark)

    Busck, Jens; Heiselberg, Henning

    2004-01-01

    We have developed a mono-static staring 3-D laser radar based on gated viewing with range accuracy below 1 m at 10 m and 1 cm at 100 m. We use a high sensitivity, fast, intensified CCD camera, and a Nd:Yag passively Q-switched 32.4 kHz pulsed green laser at 532 nm. The CCD has 752x582 pixels. Camera...

  15. Accuracy of radiocarbon analyses at ANTARES

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E M; Fink, D; Hotchkis, M; Hua, Q; Jacobsen, G; Smith, A M; Tuniz, C [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accuracy in Accelerator Mass Spectroscopy (AMS) measurements, as distinct from precision, requires the application of a number of corrections. Most of these are well known except in extreme circumstances and AMS can deliver radiocarbon results which are both precise and accurate in the 0.5 to 1.0% range. The corrections involved in obtaining final radiocarbon ages are discussed. 3 refs., 1 tab.

  16. Measurement Accuracy Limitation Analysis on Synchrophasors

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Jiecheng [University of Tennessee (UT); Zhan, Lingwei [University of Tennessee (UT); Liu, Yilu [University of Tennessee (UTK) and Oak Ridge National Laboratory (ORNL); Qi, Hairong [University of Tennessee, Knoxville (UTK); Gracia, Jose R [ORNL; Ewing, Paul D [ORNL

    2015-01-01

    This paper analyzes the theoretical accuracy limitations of synchrophasor measurements of the phase angle and frequency of the power grid. Factors that cause the measurement error are analyzed, including error sources in the instruments and in the power grid signal. Different scenarios of these factors are evaluated according to the normal operation status of power grid measurement. Based on the evaluation and simulation, the errors in phase angle and frequency caused by each factor are calculated and discussed.

  17. Accuracy of radiocarbon analyses at ANTARES

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M.; Fink, D.; Hotchkis, M.; Hua, Q.; Jacobsen, G.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accuracy in Accelerator Mass Spectroscopy (AMS) measurements, as distinct from precision, requires the application of a number of corrections. Most of these are well known except in extreme circumstances and AMS can deliver radiocarbon results which are both precise and accurate in the 0.5 to 1.0% range. The corrections involved in obtaining final radiocarbon ages are discussed. 3 refs., 1 tab.

  18. Simulation analysis for hyperbola locating accuracy

    International Nuclear Information System (INIS)

    Wang Changli; Liu Daizhi

    2004-01-01

    In a hyperbola location system, the geometric arrangement of the detecting stations has an important influence on locating accuracy. This paper first simulates the hyperbola location process by computer, then analyzes the influence of the geometric arrangement on the locating errors and gives the computer simulation results, and finally discusses the problems that require attention when selecting the detecting stations. The conclusions are of practical use. (authors)

  19. On the Accuracy of Language Trees

    Science.gov (United States)

    Pompei, Simone; Loreto, Vittorio; Tria, Francesca

    2011-01-01

    Historical linguistics aims at inferring the most likely language phylogenetic tree starting from information concerning the evolutionary relatedness of languages. The available information are typically lists of homologous (lexical, phonological, syntactic) features or characters for many different languages: a set of parallel corpora whose compilation represents a paramount achievement in linguistics. From this perspective the reconstruction of language trees is an example of inverse problems: starting from present, incomplete and often noisy, information, one aims at inferring the most likely past evolutionary history. A fundamental issue in inverse problems is the evaluation of the inference made. A standard way of dealing with this question is to generate data with artificial models in order to have full access to the evolutionary process one is going to infer. This procedure presents an intrinsic limitation: when dealing with real data sets, one typically does not know which model of evolution is the most suitable for them. A possible way out is to compare algorithmic inference with expert classifications. This is the point of view we take here by conducting a thorough survey of the accuracy of reconstruction methods as compared with the Ethnologue expert classifications. We focus in particular on state-of-the-art distance-based methods for phylogeny reconstruction using worldwide linguistic databases. In order to assess the accuracy of the inferred trees we introduce and characterize two generalizations of standard definitions of distances between trees. Based on these scores we quantify the relative performances of the distance-based algorithms considered. Further we quantify how the completeness and the coverage of the available databases affect the accuracy of the reconstruction. Finally we draw some conclusions about where the accuracy of the reconstructions in historical linguistics stands and about the leading directions to improve it. PMID:21674034
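
    As background to the tree-distance scores discussed in this abstract, the following is a minimal sketch of the standard Robinson-Foulds distance on small rooted trees represented as nested tuples; the toy trees and labels are invented, and the paper's own generalized distances are not reproduced here.

```python
# Minimal sketch: standard Robinson-Foulds (RF) distance between two rooted
# trees, each reduced to its set of clades (frozensets of leaf labels). The
# paper generalizes such scores; this is only the textbook baseline.

def clades(tree):
    """Collect every clade (set of leaf labels) of a nested-tuple tree."""
    if not isinstance(tree, tuple):          # a bare label is a leaf
        return frozenset([tree]), {frozenset([tree])}
    leaves, all_clades = frozenset(), set()
    for child in tree:
        child_leaves, child_clades = clades(child)
        leaves |= child_leaves
        all_clades |= child_clades
    all_clades.add(leaves)
    return leaves, all_clades

def rf_distance(tree_a, tree_b):
    """Symmetric-difference (RF) distance: clades present in only one tree."""
    _, a = clades(tree_a)
    _, b = clades(tree_b)
    return len(a ^ b)

# Toy example with four hypothetical languages:
t1 = ((("A", "B"), "C"), "D")
t2 = ((("A", "C"), "B"), "D")
print(rf_distance(t1, t2))   # 2: the clades {A,B} and {A,C} disagree
```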

  20. Do Investors Learn About Analyst Accuracy?

    OpenAIRE

    Chang, Charles; Daouk, Hazem; Wang, Albert

    2008-01-01

    We study the impact of analyst forecasts on prices to determine whether investors learn about analyst accuracy. Our test market is the crude oil futures market. Prices rise when analysts forecast a decrease (increase) in crude supplies. In the 15 minutes following supply realizations, prices rise (fall) when forecasts have been too high (low). In both the initial price action relative to forecasts and in the subsequent reaction relative to realized forecast errors, the price response is stron...

  1. On the accuracy of language trees.

    Directory of Open Access Journals (Sweden)

    Simone Pompei

    Full Text Available Historical linguistics aims at inferring the most likely language phylogenetic tree starting from information concerning the evolutionary relatedness of languages. The available information are typically lists of homologous (lexical, phonological, syntactic features or characters for many different languages: a set of parallel corpora whose compilation represents a paramount achievement in linguistics. From this perspective the reconstruction of language trees is an example of inverse problems: starting from present, incomplete and often noisy, information, one aims at inferring the most likely past evolutionary history. A fundamental issue in inverse problems is the evaluation of the inference made. A standard way of dealing with this question is to generate data with artificial models in order to have full access to the evolutionary process one is going to infer. This procedure presents an intrinsic limitation: when dealing with real data sets, one typically does not know which model of evolution is the most suitable for them. A possible way out is to compare algorithmic inference with expert classifications. This is the point of view we take here by conducting a thorough survey of the accuracy of reconstruction methods as compared with the Ethnologue expert classifications. We focus in particular on state-of-the-art distance-based methods for phylogeny reconstruction using worldwide linguistic databases. In order to assess the accuracy of the inferred trees we introduce and characterize two generalizations of standard definitions of distances between trees. Based on these scores we quantify the relative performances of the distance-based algorithms considered. Further we quantify how the completeness and the coverage of the available databases affect the accuracy of the reconstruction. Finally we draw some conclusions about where the accuracy of the reconstructions in historical linguistics stands and about the leading directions to improve

  2. Speed-Accuracy Tradeoffs in Speech Production

    Science.gov (United States)

    2017-06-01

    capacity of discrete motor responses under different cognitive sets. Journal of Experimental Psychology, 71(4), 475. ... space defined by vocal tract constriction degree and location, as in Articulatory Phonology (Browman & Goldstein, 1992). These high-level spaces are ... relationship between speech gestures varies as a function of their positions within the syllable (Browman & Goldstein, 1995; Krakow, 1999; Byrd et al.)

  3. The dimensional accuracy of the sintered billets

    Directory of Open Access Journals (Sweden)

    Чингиз Ариф оглы Алиев

    2016-01-01

    Full Text Available The article presents the results of assessing how the behavioural stability of the components included in the compositions, and the process parameters of their production, affect the dimensional accuracy of workpieces. It was found that increasing the amount of oxide in the composition produces greater compaction of the sintered billet during heat treatment, and also increases the density of all components of the composition.

  4. QUALITY LOSS FUNCTION FOR MACHINING PROCESS ACCURACY

    Directory of Open Access Journals (Sweden)

    Adrian Stere PARIS

    2017-05-01

    Full Text Available The main goal of the paper is to propose new quality loss models for machining process accuracy in the classical “zero the best” case, of MMF and Harris type. In addition, a numerical example illustrates that the chosen regression functions are directly linked with the quality loss of the manufacturing process. The proposed models can be adapted for the “maximal the best” and “nominal the best” cases.

  5. Accuracy and precision in activation analysis: counting

    International Nuclear Information System (INIS)

    Becker, D.A.

    1974-01-01

    Accuracy and precision in activation analysis were investigated with regard to counting of induced radioactivity. The various parameters discussed include configuration, positioning, density, homogeneity, intensity, radioisotopic purity, peak integration, and nuclear constants. Experimental results are presented for many of these parameters. The results obtained indicate that counting errors often contribute significantly to the inaccuracy and imprecision of analyses. The magnitude of these errors ranges from less than 1 percent to 10 percent or more in many cases.

  6. THE RELEVANCE OF GOODWILL REPORTING IN AN ISLAMIC CONTEXT

    Directory of Open Access Journals (Sweden)

    Radu-Daniel LOGHIN

    2014-11-01

    Full Text Available In recent years global finance has seen the emergence of Islamic finance as an alternative to the western secular system. While the two systems possess largely similar concepts of social equity and well-being, the major divide between them rests in the distinction between divine and natural law as a source of protection for the downtrodden. As communication barriers between the Arabic and Anglo-European accounting systems start to blur, the question posed for practitioners as to what constitutes a source of equity becomes more and more relevant. Considering the case of Islamic countries, besides internally generated and acquired goodwill, Islamic instruments such as zakat also provide a source of social equity. For the purpose of this paper, two models pertaining to value relevance are tested for a sample of 56 companies in 6 accounting jurisdictions in order to identify the underlying sources of social equity, revealing that zakat disclosures marginally improve the accuracy of the model.

  7. Ecological Relevance Determines Task Priority in Older Adults' Multitasking.

    Science.gov (United States)

    Doumas, Michail; Krampe, Ralf Th

    2015-05-01

    Multitasking is a challenging aspect of human behavior, especially if the concurrently performed tasks are different in nature. Several studies demonstrated pronounced performance decrements (dual-task costs) in older adults for combinations of cognitive and motor tasks. However, patterns of costs among component tasks differed across studies and reasons for participants' resource allocation strategies remained elusive. We investigated young and older adults' multitasking of a working memory task and two sensorimotor tasks, one with low (finger force control) and one with high ecological relevance (postural control). The tasks were performed in single-, dual-, and triple-task contexts. Working memory accuracy was reduced in dual-task contexts with either sensorimotor task and deteriorated further under triple-task conditions. Postural and force performance deteriorated with age and task difficulty in dual-task contexts. However, in the triple-task context with its maximum resource demands, older adults prioritized postural control over both force control and memory. Our results identify ecological relevance as the key factor in older adults' multitasking. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. Accuracy of Environmental Monitoring in China: Exploring the Influence of Institutional, Political and Ideological Factors

    Directory of Open Access Journals (Sweden)

    Daniele Brombal

    2017-02-01

    Full Text Available Environmental monitoring data are essential to informing decision-making processes relevant to the management of the environment. Their accuracy is therefore of extreme importance. The credibility of Chinese environmental data has been long questioned by domestic and foreign observers. This paper explores the potential impact of institutional, political, and ideological factors on the accuracy of China’s environmental monitoring data. It contends that the bureaucratic incentive system, conflicting agency goals, particular interests, and ideological structures constitute potential sources of bias in processes of environmental monitoring in China. The current leadership has acknowledged the issue, implementing new measures to strengthen administrative coordination and reinforce the oversight of the central government over local authorities. However, the failure to address the deeper political roots of the problem and the ambivalence over the desirability of public participation to enhance transparency might jeopardize Beijing’s strive for environmental data accuracy.

  9. Radioactivity analysis of food and accuracy control

    International Nuclear Information System (INIS)

    Ota, Tomoko

    2013-01-01

    Because radioactive substances have been detected in foods such as agricultural, livestock and marine products following the accident at the Fukushima Daiichi Nuclear Power Station of Tokyo Electric Power Company, the Ministry of Health, Labour and Welfare stipulated new standards for radioactive cesium in general foods, replacing the previous interim standards. Various institutions began to measure radioactivity on the basis of this instruction, but a new challenge arose concerning the reliability of the data. Accuracy control, which objectively demonstrates that the quality of the data is maintained at an appropriate level, is therefore important. In order to implement quality management activities continuously, each inspection agency needs to build an accuracy control system. This paper introduces a support service, as a new initiative, for establishing such an accuracy control system. The service is offered jointly by three organizations: TUV Rheinland Japan Ltd., Japan Frozen Foods Inspection Corporation, and Japan Chemical Analysis Center. It consists of training for radioactivity measurement practitioners, proficiency tests for radioactive substance measurement, and certification of personnel. (O.A.)

  10. Ultra-wideband ranging precision and accuracy

    International Nuclear Information System (INIS)

    MacGougan, Glenn; O'Keefe, Kyle; Klukas, Richard

    2009-01-01

    This paper provides an overview of ultra-wideband (UWB) in the context of ranging applications and assesses the precision and accuracy of UWB ranging from both a theoretical perspective and a practical perspective using real data. The paper begins with a brief history of UWB technology and the most current definition of what constitutes an UWB signal. The potential precision of UWB ranging is assessed using Cramer–Rao lower bound analysis. UWB ranging methods are described and potential error sources are discussed. Two types of commercially available UWB ranging radios are introduced which are used in testing. Actual ranging accuracy is assessed from line-of-sight testing under benign signal conditions by comparison to high-accuracy electronic distance measurements and to ranges derived from GPS real-time kinematic positioning. Range measurements obtained in outdoor testing with line-of-sight obstructions and strong reflection sources are compared to ranges derived from classically surveyed positions. The paper concludes with a discussion of the potential applications for UWB ranging
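
    For context, the Cramer-Rao analysis mentioned above reduces, in its textbook time-of-arrival form, to a simple bound on range error; the sketch below evaluates that bound for assumed bandwidth and SNR values, which are illustrative only and not the paper's figures.

```python
# Minimal sketch (assumed textbook form): Cramer-Rao lower bound on the range
# error of a time-of-arrival estimate, sigma_d >= c / (2*sqrt(2)*pi*beta*sqrt(SNR)),
# where beta is the effective (RMS) signal bandwidth. Numbers are illustrative.
import math

C = 299_792_458.0  # speed of light, m/s

def range_crlb(snr_db: float, beta_hz: float) -> float:
    """Lower bound on the standard deviation of a ToA-based range estimate (m)."""
    snr = 10 ** (snr_db / 10.0)
    sigma_tau = 1.0 / (2.0 * math.sqrt(2.0) * math.pi * beta_hz * math.sqrt(snr))
    return C * sigma_tau

# Example: 500 MHz effective bandwidth (UWB-like signal) at 10 dB SNR
print(f"{range_crlb(10.0, 500e6) * 100:.2f} cm")
```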

  11. Accuracy of stereolithographic models of human anatomy

    International Nuclear Information System (INIS)

    Barker, T.M.; Earwaker, W.J.S.; Lisle, D.A.

    1994-01-01

    A study was undertaken to determine the dimensional accuracy of anatomical replicas derived from X-ray 3D computed tomography (CT) images and produced using the rapid prototyping technique of stereolithography (SLA). A dry bone skull and geometric phantom were scanned, and replicas were produced. Distance measurements were obtained to compare the original objects and the resulting replicas. Repeated measurements between anatomical landmarks were used for comparison of the original skull and replica. Results for the geometric phantom demonstrate a mean difference of +0.47mm, representing an accuracy of 97.7-99.12%. Measurements of the skull produced a range of absolute differences (maximum +4.62mm, minimum +0.1mm, mean +0.85mm). These results support the use of SLA models of human anatomical structures in such areas as pre-operative planning of complex surgical procedures. For applications where higher accuracy is required, improvements can be expected by utilizing smaller pixel resolution in the CT images. Stereolithographic models can now be confidently employed as accurate, three-dimensional replicas of complex, anatomical structures. 14 refs., 2 tabs., 8 figs

  12. Eye movement accuracy determines natural interception strategies.

    Science.gov (United States)

    Fooken, Jolande; Yeo, Sang-Hoon; Pai, Dinesh K; Spering, Miriam

    2016-11-01

    Eye movements aid visual perception and guide actions such as reaching or grasping. Most previous work on eye-hand coordination has focused on saccadic eye movements. Here we show that smooth pursuit eye movement accuracy strongly predicts both interception accuracy and the strategy used to intercept a moving object. We developed a naturalistic task in which participants (n = 42 varsity baseball players) intercepted a moving dot (a "2D fly ball") with their index finger in a designated "hit zone." Participants were instructed to track the ball with their eyes, but were only shown its initial launch (100-300 ms). Better smooth pursuit resulted in more accurate interceptions and determined the strategy used for interception, i.e., whether interception was early or late in the hit zone. Even though early and late interceptors showed equally accurate interceptions, they may have relied on distinct tactics: early interceptors used cognitive heuristics, whereas late interceptors' performance was best predicted by pursuit accuracy. Late interception may be beneficial in real-world tasks as it provides more time for decision and adjustment. Supporting this view, baseball players who were more senior were more likely to be late interceptors. Our findings suggest that interception strategies are optimally adapted to the proficiency of the pursuit system.

  13. 100% classification accuracy considered harmful: the normalized information transfer factor explains the accuracy paradox.

    Directory of Open Access Journals (Sweden)

    Francisco J Valverde-Albacete

    Full Text Available The most widely spread measure of performance, accuracy, suffers from a paradox: predictive models with a given level of accuracy may have greater predictive power than models with higher accuracy. Despite optimizing classification error rate, high accuracy models may fail to capture crucial information transfer in the classification task. We present evidence of this behavior by means of a combinatorial analysis where every possible contingency matrix of 2-, 3- and 4-class classifiers is depicted on the entropy triangle, a more reliable information-theoretic tool for classification assessment. Motivated by this, we develop from first principles a measure of classification performance that takes into consideration the information learned by classifiers. We are then able to obtain the entropy-modulated accuracy (EMA), a pessimistic estimate of the expected accuracy with the influence of the input distribution factored out, and the normalized information transfer factor (NIT), a measure of how efficiently information is transmitted from the input to the output set of classes. The EMA is a more natural measure of classification performance than accuracy when the heuristic to maximize is the transfer of information through the classifier instead of classification error count. The NIT factor measures the effectiveness of the learning process in classifiers and also makes it harder for them to "cheat" using techniques like specialization, while also promoting the interpretability of results. Their use is demonstrated in a mind reading task competition that aims at decoding the identity of a video stimulus based on magnetoencephalography recordings. We show how the EMA and the NIT factor reject rankings based on accuracy, choosing more meaningful and interpretable classifiers.
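
    For orientation, here is a minimal sketch of the information-theoretic ingredients behind the EMA and the NIT factor (entropies and mutual information of a confusion matrix); the exact EMA/NIT definitions follow the authors' entropy-triangle formulation and are not reproduced here, and the confusion matrix is a made-up example.

```python
# Minimal sketch of the quantities the EMA and NIT factor are built from:
# input entropy H(X), output entropy H(Y) and mutual information I(X;Y) of a
# confusion matrix. A "cheating" majority classifier illustrates the paradox:
# high accuracy, zero information transfer.
import numpy as np

def entropies(confusion: np.ndarray):
    p_xy = confusion / confusion.sum()          # joint distribution P(true, predicted)
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
    h_x, h_y, h_xy = h(p_x), h(p_y), h(p_xy)
    return h_x, h_y, h_x + h_y - h_xy           # H(X), H(Y), I(X;Y)

cm = np.array([[90, 0],
               [10, 0]])                        # always predicts class 0
h_x, h_y, mi = entropies(cm)
print(f"accuracy={cm.trace()/cm.sum():.2f}  H(X)={h_x:.3f}  I(X;Y)={mi:.3f} bits")
```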

  14. Seeking kinetic pathways relevant to the structural evolution of metal nanoparticles

    International Nuclear Information System (INIS)

    Haldar, Paramita; Chatterjee, Abhijit

    2015-01-01

    Understanding the kinetic pathways that cause metal nanoparticles to structurally evolve over time is essential for predicting their shape and size distributions and catalytic properties. Consequently, we need detailed kinetic models that can provide such information. Most kinetic Monte Carlo models used for metal systems contain a fixed catalogue of atomic moves; the catalogue is largely constructed based on our physical understanding of the material. In some situations, it is possible that an incorrect picture of the overall dynamics is obtained when kinetic pathways that are relevant to the dynamics are missing from the catalogue. Hence, a computational framework that can systematically determine the relevant pathways is required. This work intends to fulfil this requirement. Examples involving an Ag nanoparticle are studied to illustrate how molecular dynamics (MD) calculations can be employed to find the relevant pathways in a system. Since pathways that are unlikely to be selected at short timescales can become relevant at longer times, the accuracy of the catalogue is maintained by continually seeking these pathways using MD. We discuss various aspects of our approach, namely, defining the relevance of atomic moves to the dynamics and determining when additional MD is required to ensure the desired accuracy, as well as physical insights into the Ag nanoparticle. (paper)

  15. Kernel-Based Relevance Analysis with Enhanced Interpretability for Detection of Brain Activity Patterns

    Directory of Open Access Journals (Sweden)

    Andres M. Alvarez-Meza

    2017-10-01

    Full Text Available We introduce Enhanced Kernel-based Relevance Analysis (EKRA), which aims to support the automatic identification of brain activity patterns using electroencephalographic recordings. EKRA is a data-driven strategy that incorporates two kernel functions to take advantage of the available joint information, associating neural responses to a given stimulus condition. Regarding this, a Centered Kernel Alignment functional is adjusted to learn the linear projection that best discriminates the input feature set, optimizing the required free parameters automatically. Our approach is carried out in two scenarios: (i) feature selection by computing a relevance vector from extracted neural features to facilitate the physiological interpretation of a given brain activity task, and (ii) enhanced feature selection to perform an additional transformation of relevant features aiming to improve the overall identification accuracy. Accordingly, we provide an alternative feature relevance analysis strategy that improves system performance while favoring data interpretability. For validation purposes, EKRA is tested in two well-known tasks of brain activity: motor imagery discrimination and epileptic seizure detection. The obtained results show that the EKRA approach estimates a relevant representation space extracted from the provided supervised information, emphasizing the salient input features. As a result, our proposal outperforms the state-of-the-art methods regarding brain activity discrimination accuracy with the benefit of enhanced physiological interpretation about the task at hand.
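
    A minimal sketch of the Centered Kernel Alignment functional mentioned above, computed between a feature kernel and a label kernel; the kernel choices and toy data are placeholders, not the authors' EEG pipeline.

```python
# Minimal sketch of Centered Kernel Alignment (CKA) between two kernel matrices.
import numpy as np

def center(K: np.ndarray) -> np.ndarray:
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return H @ K @ H

def cka(K: np.ndarray, L: np.ndarray) -> float:
    Kc, Lc = center(K), center(L)
    return float(np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc)))

# Toy example: RBF kernel on features vs. an ideal label-based kernel
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = (X[:, 0] > 0).astype(int)                    # labels driven by feature 0
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / sq.mean())                      # RBF kernel on the features
L = (y[:, None] == y[None, :]).astype(float)     # target alignment kernel
print(f"CKA(features, labels) = {cka(K, L):.3f}")
```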

  16. Inferring relevance in a changing world

    Directory of Open Access Journals (Sweden)

    Robert C Wilson

    2012-01-01

    Full Text Available Reinforcement learning models of human and animal learning usually concentrate on how we learn the relationship between different stimuli or actions and rewards. However, in real world situations stimuli are ill-defined. On the one hand, our immediate environment is extremely multi-dimensional. On the other hand, in every decision-making scenario only a few aspects of the environment are relevant for obtaining reward, while most are irrelevant. Thus a key question is how do we learn these relevant dimensions, that is, how do we learn what to learn about? We investigated this process of representation learning experimentally, using a task in which one stimulus dimension was relevant for determining reward at each point in time. As in real life situations, in our task the relevant dimension can change without warning, adding ever-present uncertainty engendered by a constantly changing environment. We show that human performance on this task is better described by a suboptimal strategy based on selective attention and serial hypothesis testing rather than a normative strategy based on probabilistic inference. From this, we conjecture that the problem of inferring relevance in general scenarios is too computationally demanding for the brain to solve optimally. As a result the brain utilizes approximations, employing these even in simplified scenarios in which optimal representation learning is tractable, such as the one in our experiment.

  17. Accuracy of abdominal auscultation for bowel obstruction.

    Science.gov (United States)

    Breum, Birger Michael; Rud, Bo; Kirkegaard, Thomas; Nordentoft, Tyge

    2015-09-14

    To investigate the accuracy and inter-observer variation of bowel sound assessment in patients with clinically suspected bowel obstruction. Bowel sounds were recorded in patients with suspected bowel obstruction using a Littmann(®) Electronic Stethoscope. The recordings were processed to yield 25-s sound sequences in random order on PCs. Observers, recruited from doctors within the department, classified the sound sequences as either normal or pathological. The reference tests for bowel obstruction were intraoperative and endoscopic findings and clinical follow up. Sensitivity and specificity were calculated for each observer and compared between junior and senior doctors. Interobserver variation was measured using the Kappa statistic. Bowel sound sequences from 98 patients were assessed by 53 (33 junior and 20 senior) doctors. Laparotomy was performed in 47 patients, 35 of whom had bowel obstruction. Two patients underwent colorectal stenting due to large bowel obstruction. The median sensitivity and specificity was 0.42 (range: 0.19-0.64) and 0.78 (range: 0.35-0.98), respectively. There was no significant difference in accuracy between junior and senior doctors. The median frequency with which doctors classified bowel sounds as abnormal did not differ significantly between patients with and without bowel obstruction (26% vs 23%, P = 0.08). The 53 doctors made up 1378 unique pairs and the median Kappa value was 0.29 (range: -0.15-0.66). Accuracy and inter-observer agreement was generally low. Clinical decisions in patients with possible bowel obstruction should not be based on auscultatory assessment of bowel sounds.
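
    A minimal sketch of the two statistics reported in this study, sensitivity/specificity of an observer against the reference standard and Cohen's kappa between a pair of observers; the 0/1 vectors below are invented illustrations, not study data.

```python
# Minimal sketch: sensitivity/specificity against a reference, and Cohen's
# kappa for inter-observer agreement, on made-up binary ratings.
import numpy as np

def sens_spec(pred, truth):
    pred, truth = np.asarray(pred), np.asarray(truth)
    tp = np.sum((pred == 1) & (truth == 1))
    tn = np.sum((pred == 0) & (truth == 0))
    return tp / np.sum(truth == 1), tn / np.sum(truth == 0)

def cohens_kappa(a, b):
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                          # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in (0, 1))   # chance agreement
    return (po - pe) / (1 - pe)

truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # 1 = obstruction on reference test
obs_a = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0]   # observer A (1 = pathological sounds)
obs_b = [1, 1, 0, 0, 0, 1, 1, 0, 0, 0]   # observer B
print("sens/spec of A:", sens_spec(obs_a, truth))
print("kappa(A, B): %.2f" % cohens_kappa(obs_a, obs_b))
```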

  18. Improving calibration accuracy in gel dosimetry

    International Nuclear Information System (INIS)

    Oldham, M.; McJury, M.; Webb, S.; Baustert, I.B.; Leach, M.O.

    1998-01-01

    A new method of calibrating gel dosimeters (applicable to both Fricke and polyacrylamide gels) is presented which has intrinsically higher accuracy than current methods, and requires less gel. Two test-tubes of gel (inner diameter 2.5 cm, length 20 cm) are irradiated separately with a 10×10 cm² field end-on in a water bath, such that the characteristic depth-dose curve is recorded in the gel. The calibration is then determined by fitting the depth-dose measured in water, against the measured change in relaxivity with depth in the gel. Increased accuracy is achieved in this simple depth-dose geometry by averaging the relaxivity at each depth. A large number of calibration data points, each with relatively high accuracy, are obtained. Calibration data over the full range of dose (1.6-10 Gy) is obtained by irradiating one test-tube to 10 Gy at dose maximum (D_max), and the other to 4.5 Gy at D_max. The new calibration method is compared with a 'standard method' where five identical test-tubes of gel were irradiated to different known doses between 2 and 10 Gy. The percentage uncertainties in the slope and intercept of the calibration fit are found to be lower with the new method by a factor of about 4 and 10 respectively, when compared with the standard method and with published values. The gel was found to respond linearly within the error bars up to doses of 7 Gy, with a slope of 0.233 ± 0.001 s⁻¹Gy⁻¹ and an intercept of 1.106 ± 0.005 Gy. For higher doses, nonlinear behaviour was observed. (author)
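
    A minimal sketch of the calibration idea described above: pair the known depth-dose in water with the relaxivity measured at the same depths and fit a straight line. The depth, dose and relaxivity arrays are invented so that the fit roughly reproduces the slope and intercept quoted in the abstract; they are not the paper's data.

```python
# Minimal sketch: least-squares fit of measured relaxivity against the known
# depth-dose in water, one calibration point per depth (invented numbers).
import numpy as np

depth_cm   = np.array([1.5, 3.0, 5.0, 8.0, 12.0, 16.0, 20.0])
dose_gy    = np.array([10.0, 9.3, 8.3, 7.0, 5.5, 4.3, 3.4])    # depth-dose in water
relaxivity = np.array([3.43, 3.27, 3.03, 2.73, 2.38, 2.10, 1.89])  # R2 at same depths (1/s)

# In the real method the relaxivity at each depth is averaged over the tube
# cross-section, which is what drives the accuracy gain.
slope, intercept = np.polyfit(dose_gy, relaxivity, 1)
print(f"R2 ≈ {slope:.3f} 1/(s·Gy) * dose + {intercept:.3f} 1/s")
```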

  19. Climate Change Accuracy: Requirements and Economic Value

    Science.gov (United States)

    Wielicki, B. A.; Cooke, R.; Mlynczak, M. G.; Lukashin, C.; Thome, K. J.; Baize, R. R.

    2014-12-01

    Higher than normal accuracy is required to rigorously observe decadal climate change. But what level is needed? How can this be quantified? This presentation will summarize a new more rigorous and quantitative approach to determining the required accuracy for climate change observations (Wielicki et al., 2013, BAMS). Most current global satellite observations cannot meet this accuracy level. A proposed new satellite mission to resolve this challenge is CLARREO (Climate Absolute Radiance and Refractivity Observatory). CLARREO is designed to achieve advances of a factor of 10 for reflected solar spectra and a factor of 3 to 5 for thermal infrared spectra (Wielicki et al., Oct. 2013 BAMS). The CLARREO spectrometers are designed to serve as SI traceable benchmarks for the Global Satellite Intercalibration System (GSICS) and to greatly improve the utility of a wide range of LEO and GEO infrared and reflected solar passive satellite sensors for climate change observations (e.g. CERES, MODIS, VIIIRS, CrIS, IASI, Landsat, SPOT, etc). Providing more accurate decadal change trends can in turn lead to more rapid narrowing of key climate science uncertainties such as cloud feedback and climate sensitivity. A study has been carried out to quantify the economic benefits of such an advance as part of a rigorous and complete climate observing system. The study concludes that the economic value is $12 Trillion U.S. dollars in Net Present Value for a nominal discount rate of 3% (Cooke et al. 2013, J. Env. Sys. Dec.). A brief summary of these two studies and their implications for the future of climate science will be presented.

  20. Accuracy Requirements in Medical Radiation Dosimetry

    International Nuclear Information System (INIS)

    Andreo, P.

    2011-01-01

    The need for adopting unambiguous terminology on 'accuracy in medical radiation dosimetry' which is consistent with international recommendations for metrology is emphasized. Uncertainties attainable, or the need for improving their estimates, are analysed for the fields of radiotherapy, diagnostic radiology and nuclear medicine dosimetry. This review centres on uncertainties related to the first step of the dosimetry chain in the three fields, which in all cases involves the use of a detector calibrated by a standards laboratory to determine absorbed dose, air kerma or activity under reference conditions in a clinical environment. (author)

  1. Quantum chemistry by random walk: Higher accuracy

    International Nuclear Information System (INIS)

    Anderson, J.B.

    1980-01-01

    The random walk method of solving the Schroedinger equation is extended to allow the calculation of eigenvalues of atomic and molecular systems with higher accuracy. The combination of direct calculation of the difference δ between a true wave function ψ and a trial wave function ψ₀ with importance sampling greatly reduces systematic and statistical error. The method is illustrated with calculations for ground-state hydrogen and helium atoms using trial wave functions from variational calculations. The energies obtained are 20 to 100 times more accurate than those of the corresponding variational calculations.

  2. ArcticDEM Validation and Accuracy Assessment

    Science.gov (United States)

    Candela, S. G.; Howat, I.; Noh, M. J.; Porter, C. C.; Morin, P. J.

    2017-12-01

    ArcticDEM comprises a growing inventory of Digital Elevation Models (DEMs) covering all land above 60°N. As of August 2017, ArcticDEM had openly released 2-m resolution, individual DEMs covering over 51 million km², which includes areas of repeat coverage for change detection, as well as over 15 million km² of 5-m resolution seamless mosaics. By the end of the project, over 80 million km² of 2-m DEMs will be produced, averaging four repeats of the 20 million km² Arctic landmass. ArcticDEM is produced from sub-meter resolution, stereoscopic imagery using open source software (SETSM) on the NCSA Blue Waters supercomputer. These DEMs have known biases of several meters due to errors in the sensor models generated from satellite positioning. These systematic errors are removed through three-dimensional registration to high-precision Lidar or other control datasets. ArcticDEM is registered to seasonally-subsetted ICESat elevations due to its global coverage and high reported accuracy (~10 cm). The vertical accuracy of ArcticDEM is then obtained from the statistics of the fit to the ICESat point cloud, which averages -0.01 m ± 0.07 m. ICESat, however, has a relatively coarse measurement footprint (~70 m) which may impact the precision of the registration. Further, the ICESat data predates the ArcticDEM imagery by a decade, so that temporal changes in the surface may also impact the registration. Finally, biases may exist between the different sensors in the ArcticDEM constellation. Here we assess the accuracy of ArcticDEM and the ICESat registration through comparison to multiple high-resolution airborne lidar datasets that were acquired within one year of the imagery used in ArcticDEM. We find the ICESat dataset is performing as anticipated, introducing no systematic bias during the coregistration process, and reducing vertical errors to within the uncertainty of the airborne Lidars. Preliminary sensor comparisons show no significant difference post coregistration.

  3. Accuracy of prehospital transport time estimation.

    Science.gov (United States)

    Wallace, David J; Kahn, Jeremy M; Angus, Derek C; Martin-Gill, Christian; Callaway, Clifton W; Rea, Thomas D; Chhatwal, Jagpreet; Kurland, Kristen; Seymour, Christopher W

    2014-01-01

    Estimates of prehospital transport times are an important part of emergency care system research and planning; however, the accuracy of these estimates is unknown. The authors examined the accuracy of three estimation methods against observed transport times in a large cohort of prehospital patient transports. This was a validation study using prehospital records in King County, Washington, and southwestern Pennsylvania from 2002 to 2006 and 2005 to 2011, respectively. Transport time estimates were generated using three methods: linear arc distance, Google Maps, and ArcGIS Network Analyst. Estimation error, defined as the absolute difference between observed and estimated transport time, was assessed, as well as the proportion of estimated times that were within specified error thresholds. Based on the primary results, a regression estimate was used that incorporated population density, time of day, and season to assess improved accuracy. Finally, hospital catchment areas were compared using each method with a fixed drive time. The authors analyzed 29,935 prehospital transports to 44 hospitals. The mean (± standard deviation [±SD]) absolute error was 4.8 (±7.3) minutes using linear arc, 3.5 (±5.4) minutes using Google Maps, and 4.4 (±5.7) minutes using ArcGIS. All pairwise comparisons were statistically significant (p Google Maps, and 11.6 [±10.9] minutes for ArcGIS). Estimates were within 5 minutes of observed transport time for 79% of linear arc estimates, 86.6% of Google Maps estimates, and 81.3% of ArcGIS estimates. The regression-based approach did not substantially improve estimation. There were large differences in hospital catchment areas estimated by each method. Route-based transport time estimates demonstrate moderate accuracy. These methods can be valuable for informing a host of decisions related to the system organization and patient access to emergency medical care; however, they should be employed with sensitivity to their limitations.
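
    A minimal sketch of the simplest of the three estimators compared in this study: linear arc (great-circle) distance converted to a travel time with an assumed average speed, compared against an observed transport time. The coordinates, the 40 km/h speed and the observed time are illustrative assumptions, not study values.

```python
# Minimal sketch: linear-arc (haversine) distance -> crude transport-time
# estimate -> absolute error against an observed time. All numbers invented.
import math

def arc_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km (haversine formula)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def linear_arc_minutes(scene, hospital, speed_kmh=40.0):
    return 60.0 * arc_km(*scene, *hospital) / speed_kmh

scene, hospital = (47.61, -122.33), (47.66, -122.32)   # made-up pickup / ED locations
estimate = linear_arc_minutes(scene, hospital)
observed = 11.0                                        # minutes, hypothetical record
print(f"estimate {estimate:.1f} min, absolute error {abs(estimate - observed):.1f} min")
```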

  4. Accuracy in Robot Generated Image Data Sets

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Dahl, Anders Bjorholm

    2015-01-01

    In this paper we present a practical innovation concerning how to achieve high accuracy of camera positioning, when using a 6-axis industrial robot to generate high quality data sets for computer vision. This innovation is based on the realization that to a very large extent the robot's positioning...... error is deterministic, and can as such be calibrated away. We have successfully used this innovation in our efforts for creating data sets for computer vision. Since the use of this innovation has a significant effect on the data set quality, we here present it in some detail, to better aid others...

  5. Passage relevance models for genomics search

    Directory of Open Access Journals (Sweden)

    Frieder Ophir

    2009-03-01

    Full Text Available Abstract We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and document are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision using a large genomics literature corpus.

  6. Response moderation models for conditional dependence between response time and response accuracy.

    Science.gov (United States)

    Bolsinova, Maria; Tijmstra, Jesper; Molenaar, Dylan

    2017-05-01

    It is becoming more feasible and common to register response times in the application of psychometric tests. Researchers thus have the opportunity to jointly model response accuracy and response time, which provides users with more relevant information. The most common choice is to use the hierarchical model (van der Linden, 2007, Psychometrika, 72, 287), which assumes conditional independence between response time and accuracy, given a person's speed and ability. However, this assumption may be violated in practice if, for example, persons vary their speed or differ in their response strategies, leading to conditional dependence between response time and accuracy and confounding measurement. We propose six nested hierarchical models for response time and accuracy that allow for conditional dependence, and discuss their relationship to existing models. Unlike existing approaches, the proposed hierarchical models allow for various forms of conditional dependence in the model and allow the effect of continuous residual response time on response accuracy to be item-specific, person-specific, or both. Estimation procedures for the models are proposed, as well as two information criteria that can be used for model selection. Parameter recovery and usefulness of the information criteria are investigated using simulation, indicating that the procedure works well and is likely to select the appropriate model. Two empirical applications are discussed to illustrate the different types of conditional dependence that may occur in practice and how these can be captured using the proposed hierarchical models. © 2016 The British Psychological Society.
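
    The hierarchical structure is easier to see in a formula. Below is a minimal sketch of the per-response log-likelihood under conditional independence, using a 2PL accuracy model and a lognormal response-time model; this is a common parameterization, and the notation is not necessarily the one used by the authors.

```python
# Minimal sketch of the conditional-independence structure: given ability theta
# and speed tau, the log-likelihood of one response is the sum of a 2PL accuracy
# term and a lognormal response-time term. Parameter values are illustrative.
import math

def loglik_response(correct, rt, theta, tau, a, b, alpha, beta):
    # Accuracy part: 2PL item response model
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    ll_acc = math.log(p) if correct else math.log(1.0 - p)
    # Speed part: lognormal RT with time intensity beta and time discrimination alpha
    z = alpha * (math.log(rt) - (beta - tau))
    ll_rt = math.log(alpha) - 0.5 * math.log(2 * math.pi) - math.log(rt) - 0.5 * z * z
    # Conditional independence given (theta, tau): the two terms simply add
    return ll_acc + ll_rt

print(loglik_response(correct=1, rt=12.0, theta=0.3, tau=0.1,
                      a=1.2, b=0.0, alpha=1.5, beta=2.6))
```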

  7. Identity theory and personality theory: mutual relevance.

    Science.gov (United States)

    Stryker, Sheldon

    2007-12-01

    Some personality psychologists have found a structural symbolic interactionist frame and identity theory relevant to their work. This frame and theory, developed in sociology, are first reviewed. Emphasized in the review are a multiple identity conception of self, identities as internalized expectations derived from roles embedded in organized networks of social interaction, and a view of social structures as facilitators in bringing people into networks or constraints in keeping them out. Subsequently, attention turns to a discussion of the mutual relevance of structural symbolic interactionism/identity theory and personality theory, looking to extensions of the current literature on these topics.

  8. Accuracy of colonoscopy in localizing colonic cancer.

    Science.gov (United States)

    Stanciu, C; Trifan, Anca; Khder, Saad Alla

    2007-01-01

    It is important to establish the precise localization of colonic cancer preoperatively; while colonoscopy is regarded as the diagnostic gold standard for colorectal cancer, its ability to localize the tumor is less reliable. To define the accuracy of colonoscopy in identifying the location of colonic cancer. All of the patients who had a colorectal cancer diagnosed by colonoscopy at the Institute of Gastroenterology and Hepatology, Iaşi and subsequently received a surgical intervention at three teaching hospitals in Iaşi, between January 2001 and December 2005, were included in this study. Endoscopic records and operative notes were carefully reviewed, and tumor localization was recorded. There were 161 patients (89 men, 72 women, aged 61.3 +/- 12.8 years) who underwent conventional surgery for colon cancer detected by colonoscopy during the study period. Twenty-two patients (13.66%) had erroneous colonoscopic localization of the tumors. The overall accuracy of preoperative colonoscopic localization was 87.58%. Colonoscopy is an accurate, reliable method for locating colon cancer, although additional techniques (i.e., endoscopic tattooing) should be performed at least for small lesions.

  9. Accuracy requirements in radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Buzdar, S. A.; Afzal, M.; Nazir, A.; Gadhi, M. A.

    2013-01-01

    Radiation therapy attempts to deliver ionizing radiation to the tumour and can improve the survival chances and/or quality of life of patients. There are chances of errors and uncertainties in the entire process of radiotherapy that may affect the accuracy and precision of treatment management and decrease degree of conformation. All expected inaccuracies, like radiation dose determination, volume calculation, complete evaluation of the full extent of the tumour, biological behaviour of specific tumour types, organ motion during radiotherapy, imaging, biological/molecular uncertainties, sub-clinical diseases, microscopic spread of the disease, uncertainty in normal tissue responses and radiation morbidity need sound appreciation. Conformity can be increased by reduction of such inaccuracies. With the yearly increase in computing speed and advancement in other technologies the future will provide the opportunity to optimize a greater number of variables and reduce the errors in the treatment planning process. In multi-disciplined task of radiotherapy, efforts are needed to overcome the errors and uncertainty, not only by the physicists but also by radiologists, pathologists and oncologists to reduce molecular and biological uncertainties. The radiation therapy physics is advancing towards an optimal goal that is definitely to improve accuracy where necessary and to reduce uncertainty where possible. (author)

  10. High accuracy satellite drag model (HASDM)

    Science.gov (United States)

    Storz, Mark F.; Bowman, Bruce R.; Branson, Major James I.; Casali, Stephen J.; Tobiska, W. Kent

    The dominant error source in force models used to predict low-perigee satellite trajectories is atmospheric drag. Errors in operational thermospheric density models cause significant errors in predicted satellite positions, since these models do not account for dynamic changes in atmospheric drag for orbit predictions. The Air Force Space Battlelab's High Accuracy Satellite Drag Model (HASDM) estimates and predicts (out three days) a dynamically varying global density field. HASDM includes the Dynamic Calibration Atmosphere (DCA) algorithm that solves for the phases and amplitudes of the diurnal and semidiurnal variations of thermospheric density near real-time from the observed drag effects on a set of Low Earth Orbit (LEO) calibration satellites. The density correction is expressed as a function of latitude, local solar time and altitude. In HASDM, a time series prediction filter relates the extreme ultraviolet (EUV) energy index E10.7 and the geomagnetic storm index ap, to the DCA density correction parameters. The E10.7 index is generated by the SOLAR2000 model, the first full spectrum model of solar irradiance. The estimated and predicted density fields will be used operationally to significantly improve the accuracy of predicted trajectories for all low-perigee satellites.

  11. Fast and High Accuracy Wire Scanner

    CERN Document Server

    Koujili, M; Koopman, J; Ramos, D; Sapinski, M; De Freitas, J; Ait Amira, Y; Djerdir, A

    2009-01-01

    Scanning of a high intensity particle beam imposes challenging requirements on a Wire Scanner system. It is expected to reach a scanning speed of 20 m·s⁻¹ with a position accuracy of the order of 1 μm. In addition, a timing accuracy better than 1 millisecond is needed. The adopted solution consists of a fork holding a wire rotating by a maximum of 200°. Fork, rotor and angular position sensor are mounted on the same axis and located in a chamber connected to the beam vacuum. The requirements imply the design of a system with extremely low vibration, vacuum compatibility, radiation and temperature tolerance. The adopted solution consists of a rotary brushless synchronous motor with the permanent magnet rotor installed inside of the vacuum chamber and the stator installed outside. The accurate position sensor will be mounted on the rotary shaft inside the vacuum chamber and has to resist a bake-out temperature of 200°C and ionizing radiation up to a dozen kGy/year. A digital feedback controller allows maxi...

  12. Can Translation Improve EFL Students' Grammatical Accuracy?

    Directory of Open Access Journals (Sweden)

    Carol Ebbert-Hübner

    2018-01-01

    Full Text Available This report focuses on research results from a project completed at Trier University in December 2015 that provides insight into whether a monolingual group of learners can improve their grammatical accuracy and reduce interference mistakes in their English via contrastive analysis and translation instruction and activities. Contrastive analysis and translation (CAT instruction in this setting focusses on comparing grammatical differences between students’ dominant language (German and English, and practice activities where sentences or short texts are translated from German into English. The results of a pre- and post-test administered in the first and final week of a translation class were compared to two other class types: a grammar class which consisted of form-focused instruction but not translation, and a process-approach essay writing class where students received feedback on their written work throughout the semester. The results of our study indicate that with C1 level EAP students, more improvement in grammatical accuracy is seen through teaching with CAT than in explicit grammar instruction or through language feedback on written work alone. These results indicate that CAT does indeed have a place in modern language classes.

  13. Accuracy of magnetic resonance based susceptibility measurements

    Science.gov (United States)

    Erdevig, Hannah E.; Russek, Stephen E.; Carnicka, Slavka; Stupic, Karl F.; Keenan, Kathryn E.

    2017-05-01

    Magnetic Resonance Imaging (MRI) is increasingly used to map the magnetic susceptibility of tissue to identify cerebral microbleeds associated with traumatic brain injury and pathological iron deposits associated with neurodegenerative diseases such as Parkinson's and Alzheimer's disease. Accurate measurements of susceptibility are important for determining oxygen and iron content in blood vessels and brain tissue for use in noninvasive clinical diagnosis and treatment assessments. Induced magnetic fields with amplitude on the order of 100 nT, can be detected using MRI phase images. The induced field distributions can then be inverted to obtain quantitative susceptibility maps. The focus of this research was to determine the accuracy of MRI-based susceptibility measurements using simple phantom geometries and to compare the susceptibility measurements with magnetometry measurements where SI-traceable standards are available. The susceptibilities of paramagnetic salt solutions in cylindrical containers were measured as a function of orientation relative to the static MRI field. The observed induced fields as a function of orientation of the cylinder were in good agreement with simple models. The MRI susceptibility measurements were compared with SQUID magnetometry using NIST-traceable standards. MRI can accurately measure relative magnetic susceptibilities while SQUID magnetometry measures absolute magnetic susceptibility. Given the accuracy of moment measurements of tissue mimicking samples, and the need to look at small differences in tissue properties, the use of existing NIST standard reference materials to calibrate MRI reference structures is problematic and better reference materials are required.
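
    The orientation dependence described in this abstract is commonly modelled with the long-cylinder approximation; the sketch below evaluates that textbook formula for an assumed susceptibility difference and field strength, values that are illustrative and not taken from the study.

```python
# Minimal sketch of the standard long-cylinder approximation: the induced field
# offset inside an infinite cylinder with susceptibility difference dchi (SI),
# tilted by theta from B0, is dB = dchi * B0 * (3*cos^2(theta) - 1) / 6.
import math

def cylinder_field_offset_nT(dchi_ppm: float, b0_tesla: float, theta_deg: float) -> float:
    dchi = dchi_ppm * 1e-6
    theta = math.radians(theta_deg)
    return dchi * b0_tesla * (3 * math.cos(theta) ** 2 - 1) / 6 * 1e9  # in nT

# A hypothetical 1 ppm paramagnetic salt solution at 3 T, swept through orientations:
for theta in (0, 54.7, 90):          # 54.7 deg is the "magic angle" (offset ~ 0)
    print(f"theta={theta:5.1f} deg -> {cylinder_field_offset_nT(1.0, 3.0, theta):8.1f} nT")
```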

  14. Diatomic molecule vibrational potentials: Accuracy of representations

    International Nuclear Information System (INIS)

    Engelke, R.

    1978-01-01

    A method is presented for increasing the radius of convergence of certain representations of diatomic molecule vibrational potentials. The method relies on using knowledge of the analytic structure of such potentials to the maximum when attempting to approximate them. The known singular point (due to the centrifugal and/or Coulomb potentials) at zero internuclear separation should be included in its exact form in an approximate representation. The efficacy of this idea is tested [using Peek's "exact" numerical Born-Oppenheimer potential for the (1sσ_g) ²Σ⁺_g state of H₂⁺ as a test problem] when the representational form is the series of (1) Dunham, (2) Simons, Parr, and Finlan, (3) Thakkar, and (4) Ogilvie-Tipping, and also (5) when the form is a [2, 2] or a [3, 3] Pade approximant. Significant improvements in accuracy are obtained in some of these cases, particularly on the inner wall of the potential. A comparison of the effectiveness of the five methods is made both with and without the origin behavior being included exactly. This is useful in itself as no comprehensive accuracy comparison of the standard representations seems to have appeared in the literature. The Ogilvie-Tipping series, corrected at the origin for singular behavior, is the best representation presently available for states analogous to the (1sσ_g) ²Σ⁺_g state of H₂⁺.

  15. Dimensional accuracy of 3D printed vertebra

    Science.gov (United States)

    Ogden, Kent; Ordway, Nathaniel; Diallo, Dalanda; Tillapaugh-Fay, Gwen; Aslan, Can

    2014-03-01

    3D printer applications in the biomedical sciences and medical imaging are expanding and will have an increasing impact on the practice of medicine. Orthopedic and reconstructive surgery has been an obvious area for development of 3D printer applications as the segmentation of bony anatomy to generate printable models is relatively straightforward. There are important issues that should be addressed when using 3D printed models for applications that may affect patient care; in particular the dimensional accuracy of the printed parts needs to be high to avoid poor decisions being made prior to surgery or therapeutic procedures. In this work, the dimensional accuracy of 3D printed vertebral bodies derived from CT data for a cadaver spine is compared with direct measurements on the ex-vivo vertebra and with measurements made on the 3D rendered vertebra using commercial 3D image processing software. The vertebra was printed on a consumer grade 3D printer using an additive print process using PLA (polylactic acid) filament. Measurements were made for 15 different anatomic features of the vertebral body, including vertebral body height, endplate width and depth, pedicle height and width, and spinal canal width and depth, among others. It is shown that for the segmentation and printing process used, the results of measurements made on the 3D printed vertebral body are substantially the same as those produced by direct measurement on the vertebra and measurements made on the 3D rendered vertebra.

  16. Accuracy assessment of landslide prediction models

    International Nuclear Information System (INIS)

    Othman, A N; Mohd, W M N W; Noraini, S

    2014-01-01

    The increasing population and expansion of settlements over hilly areas have greatly increased the impact of natural disasters such as landslides. Therefore, it is important to develop models which could accurately predict landslide hazard zones. Over the years, various techniques and models have been developed to predict landslide hazard zones. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The development of these models is based on nine different landslide-inducing parameters, i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters used. Four (4) different models which consider different parameter combinations are developed by the authors. Results obtained are compared to landslide history, and the accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9% respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques to predict landslide hazard zones.
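
    A minimal sketch of one of the weighting techniques named above, AHP via a pairwise-comparison matrix: weights come from the principal eigenvector and a consistency ratio checks the judgements. The 3x3 matrix is a made-up example for three factors (e.g. slope, land use, lithology), not the authors' judgements.

```python
# Minimal sketch: AHP priority weights from a pairwise-comparison matrix via the
# principal eigenvector, with Saaty-style consistency ratio. Example values only.
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)                       # principal eigenvalue/eigenvector
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                   # normalised priority weights
    n = pairwise.shape[0]
    ci = (vals[k].real - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # random index (Saaty)
    return w, ci / ri                              # weights, consistency ratio

A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), " CR = %.3f" % cr)
```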

  17. Can Consumers Trust Web-Based Information About Celiac Disease? Accuracy, Comprehensiveness, Transparency, and Readability of Information on the Internet

    Science.gov (United States)

    McNally, Shawna L; Donohue, Michael C; Newton, Kimberly P; Ogletree, Sandra P; Conner, Kristen K; Ingegneri, Sarah E

    2012-01-01

    98 (52%) websites contained less than 50% of the core celiac disease information that was considered important for inclusion on websites that provide general information about celiac disease. Academic websites were significantly less transparent (P = .005) than commercial websites in attributing authorship, timeliness of information, sources of information, and other important disclosures. The type of website publisher did not predict website accuracy, comprehensiveness, or overall website quality. Only 4 of 98 (4%) websites achieved an overall quality score of 80 or above, which a priori was set as the minimum score for a website to be judged trustworthy and reliable. Conclusions The information on many websites addressing celiac disease was not sufficiently accurate, comprehensive, and transparent, or presented at an appropriate reading grade level, to be considered sufficiently trustworthy and reliable for patients, health care providers, celiac disease support groups, and the general public. This has the potential to adversely affect decision making about important aspects of celiac disease, including its appropriate and proper diagnosis, treatment, and management. PMID:23611901

  18. Bible Translation And Relevance Theory | Deist | Stellenbosch ...

    African Journals Online (AJOL)

    Stellenbosch Papers in Linguistics Plus, Vol 22 (1992). Bible Translation And Relevance Theory. F Deist ...

  19. Relevant Scatterers Characterization in SAR Images

    Science.gov (United States)

    Chaabouni, Houda; Datcu, Mihai

    2006-11-01

    Recognizing scenes in single-look, meter-resolution Synthetic Aperture Radar (SAR) images requires the capability to identify relevant signal signatures under conditions of variable image acquisition geometry and arbitrary object poses and configurations. Among the methods to detect relevant scatterers in SAR images, we can mention the internal coherence. The SAR spectrum split in azimuth generates a series of images which preserve high coherence only for particular object scattering. The detection of relevant scatterers can be done by correlation analysis or Independent Component Analysis (ICA) methods. The present article deals with the state of the art for SAR internal correlation analysis and proposes further extensions using elements of inference based on information theory applied to complex-valued signals. The set of azimuth-look images is analyzed using mutual information measures and an equivalent channel capacity is derived. The localization of the "target" requires analysis in a small image window, thus resulting in imprecise estimation of the second-order statistics of the signal. For better precision, a Hausdorff measure is introduced. The method is applied to detect and characterize relevant objects in urban areas.

  20. Seeking Relevance: American Political Science and America

    Science.gov (United States)

    Maranto, Robert; Woessner, Matthew C.

    2012-01-01

    In this article, the authors talk about the relevance of American political science and America. Political science has enormous strengths in its highly talented practitioners and sophisticated methods. However, its disconnection from its host society, while not so severe as for fields like English and sociology, nonetheless poses an existential…

  1. Is Enterprise Education Relevant to Social Enterprise?

    Science.gov (United States)

    Bridge, Simon

    2015-01-01

    Purpose: Both enterprise education and social enterprise have become fashionable but what, if any, should be the connections between them? The purpose of this paper is to explore those connections and to reflect on what relevance the two concepts might have for each other. Design/methodology/approach: Both enterprise education and social…

  2. The Relevant Physical Trace in Criminal Investigation

    Directory of Open Access Journals (Sweden)

    Durdica Hazard

    2016-01-01

    Full Text Available A criminal investigation requires the forensic scientist to search and to interpret vestiges of a criminal act that happened in the past. The forensic scientist is one of the many stakeholders who take part in the information quest within the criminal justice system. She reads the investigation scene in search of physical traces that should enable her to tell the story of the offense/crime that allegedly occurred. The challenge for any investigator is to detect and recognize relevant physical traces in order to provide clues for investigation and intelligence purposes, and that will constitute sound and relevant evidence for the court. This article shows how important it is to consider the relevancy of physical traces from the beginning of the investigation and what might influence the evaluation process. The exchange and management of information between the investigation stakeholders are important. Relevancy is a dimension that needs to be understood from the standpoints of law enforcement personnel and forensic scientists with the aim of strengthening investigation and ultimately the overall judicial process.

  3. Interpersonal communication: It's relevance to nursing practice ...

    African Journals Online (AJOL)

    This paper is aimed at highlighting how essential interpersonal communication is for establishing rapport, understanding the needs of patients and planning effective interventions to meet holistic health care needs. To remain continually relevant, nurses have to improve their communication skills to meet the ...

  4. The Relevance of Causal Social Construction

    Directory of Open Access Journals (Sweden)

    Marques Teresa

    2017-02-01

    Full Text Available Social constructionist claims are surprising and interesting when they entail that presumably natural kinds are in fact socially constructed. The claims are interesting because of their theoretical and political importance. Authors like Díaz-León argue that constitutive social construction is more relevant for achieving social justice than causal social construction. This paper challenges this claim. Assuming there are socially salient groups that are discriminated against, the paper presents a dilemma: if there were no constitutively constructed social kinds, the causes of the discrimination of existing social groups would have to be addressed, and understanding causal social construction would be relevant to achieve social justice. On the other hand, not all possible constitutively socially constructed kinds are actual social kinds. If an existing social group is constitutively constructed as a social kind K, the fact that it actually exists as a K has social causes. Again, causal social construction is relevant. The paper argues that (i) for any actual social kind X, if X is constitutively socially constructed as K, then it is also causally socially constructed; and (ii) causal social construction is at least as relevant as constitutive social construction for concerns of social justice. For illustration, I draw upon two phenomena that are presumed to contribute towards the discrimination of women: (i) the poor performance effects of stereotype threat, and (ii) the silencing effects of gendered language use.

  5. The economic lot size and relevant costs

    NARCIS (Netherlands)

    Corbeij, M.H.; Jansen, R.A.; Grübström, R.W.; Hinterhuber, H.H.; Lundquist, J.

    1993-01-01

    In many accounting textbooks it is strongly argued that decisions should always be evaluated on relevant costs; that is, variable costs and opportunity costs. Surprisingly, when it comes to Economic Order Quantities or Lot Sizes, some textbooks appear to be less straightforward. The question whether

  6. Bootstrapping Visual Categorization with Relevant Negatives

    NARCIS (Netherlands)

    Li, X.; Snoek, C.G.M.; Worring, M.; Koelma, D.; Smeulders, A.W.M.

    Learning classifiers for many visual concepts is important for image categorization and retrieval. As a classifier tends to misclassify negative examples which are visually similar to positive ones, the inclusion of such misclassified and thus relevant negatives should be stressed during learning.

  7. Why ritual plant use has ethnopharmacological relevance

    NARCIS (Netherlands)

    Quiroz, Diana; Sosef, Marc; Andel, Van Tinde

    2016-01-01

    Ethnopharmacological relevance Although ritual plant use is now recognised both for its socio-cultural importance and for its contribution to nature conservation, its potential pharmacological effects remain overlooked. Aim of the study Our objective was to see whether ritual plant use could have

  8. Bradford's Law and Its Relevance to Researchers

    Science.gov (United States)

    Shenton, Andrew K.; Hay-Gibson, Naomi V.

    2009-01-01

    Bradford's Law has been the subject of much discussion and analysis in library and information science since its formulation in the 1930s and remains frequently debated to this day. It has been applied to various practices within the discipline, especially with regard to collection development, but its relevance to researchers and the potential it…

  9. The relevance of cosmopolitanism for moral education

    NARCIS (Netherlands)

    Merry, M.S.; de Ruyter, D.J.

    2011-01-01

    In this article we defend a moral conception of cosmopolitanism and its relevance for moral education. Our moral conception of cosmopolitanism presumes that persons possess an inherent dignity in the Kantian sense and therefore they should be recognised as ends-in-themselves. We argue that

  10. Ranking Music Data by Relevance and Importance

    DEFF Research Database (Denmark)

    Ruxanda, Maria Magdalena; Nanopoulos, Alexandros; Jensen, Christian Søndergaard

    2008-01-01

    Due to the rapidly increasing availability of audio files on the Web, it is relevant to augment search engines with advanced audio search functionality. In this context, the ranking of the retrieved music is an important issue. This paper proposes a music ranking method capable of flexibly fusing...

  11. Fast multi-output relevance vector regression

    OpenAIRE

    Ha, Youngmin

    2017-01-01

    This paper aims to decrease the time complexity of multi-output relevance vector regression from O(VM^3) to O(V^3+M^3), where V is the number of output dimensions, M is the number of basis functions, and V

  12. Pragmatic inferences and self-relevant judgments

    DEFF Research Database (Denmark)

    Puente-Diaz, Rogelio; Cavazos Arroyo, Judith; Brem, Alexander

    2016-01-01

    Three studies examined the influence of type of scale on self-relevant judgments and the moderating role of age, prevention focus, and need for cognition. Participants were randomly assigned to a bipolar or a unipolar scale condition in all three studies. Results from study 1 with a representa...

  13. Making Chemistry Relevant to the Engineering Major

    Science.gov (United States)

    Basu-Dutt, Sharmistha; Slappey, Charles; Bartley, Julie K.

    2010-01-01

    As part of a campus-wide, externally funded project to increase performance in, enthusiasm for, and retention within STEM disciplines, we developed an interdisciplinary, team-taught first-year seminar course. The construction and delivery of this course was designed to show the relevance of selected general chemistry topics such as matter and…

  14. Contingent Attentional Capture by Conceptually Relevant Images

    Science.gov (United States)

    Wyble, Brad; Folk, Charles; Potter, Mary C.

    2013-01-01

    Attentional capture is an unintentional shift of visuospatial attention to the location of a distractor that is either highly salient, or relevant to the current task set. The latter situation is referred to as contingent capture, in that the effect is contingent on a match between characteristics of the stimuli and the task-defined…

  15. The Relevance Aura of Bibliographic Records.

    Science.gov (United States)

    Brooks, Terrence A.

    1997-01-01

    Analyzes relevance assessments of topical descriptors for bibliographic records for two dimensions: (1) a vertical conceptual hierarchy of broad to narrow descriptors, and (2) a horizontal linkage of related terms. The data were analyzed for a semantic distance and semantic direction effect as postulated by the Semantic Distance Model. (Author/LRW)

  16. The Relevance of Cosmopolitanism for Moral Education

    Science.gov (United States)

    Merry, Michael S.; de Ruyter, Doret J.

    2011-01-01

    In this article we defend a moral conception of cosmopolitanism and its relevance for moral education. Our moral conception of cosmopolitanism presumes that persons possess an inherent dignity in the Kantian sense and therefore they should be recognised as ends-in-themselves. We argue that cosmopolitan ideals can inspire moral educators to awaken…

  17. Forecasting space weather: Can new econometric methods improve accuracy?

    Science.gov (United States)

    Reikard, Gordon

    2011-06-01

    Space weather forecasts are currently used in areas ranging from navigation and communication to electric power system operations. The relevant forecast horizons can range from as little as 24 h to several days. This paper analyzes the predictability of two major space weather measures using new time series methods, many of them derived from econometrics. The data sets are the Ap geomagnetic index and the solar radio flux at 10.7 cm. The methods tested include nonlinear regressions, neural networks, frequency domain algorithms, GARCH models (which utilize the residual variance), state transition models, and models that combine elements of several techniques. While combined models are complex, they can be programmed using modern statistical software. The data frequency is daily, and forecasting experiments are run over horizons ranging from 1 to 7 days. Two major conclusions stand out. First, the frequency domain method forecasts the Ap index more accurately than any time domain model, including both regressions and neural networks. This finding is very robust, and holds for all forecast horizons. Combining the frequency domain method with other techniques yields a further small improvement in accuracy. Second, the neural network forecasts the solar flux more accurately than any other method, although at short horizons (2 days or less) the regression and the neural net yield similar results. The neural net does best when it includes measures of the long-term component in the data.
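
    The abstract does not specify the frequency domain algorithm, so the sketch below only illustrates the general idea with a generic harmonic extrapolation: detrend a daily series, keep its strongest Fourier components, and project them over the forecast horizon. The detrending, component count and toy series are assumptions, not the paper's method.

      # Generic harmonic-extrapolation sketch (not the paper's specific algorithm).
      import numpy as np

      def harmonic_forecast(y, horizon, k=5):
          n = len(y)
          t = np.arange(n)
          trend = np.polyfit(t, y, 1)              # remove a linear trend first
          resid = y - np.polyval(trend, t)
          spec = np.fft.rfft(resid)
          spec[np.argsort(np.abs(spec))[:-k]] = 0  # keep only the k strongest components
          freqs = np.fft.rfftfreq(n)               # cycles per sample
          t_future = np.arange(n, n + horizon)
          future = np.zeros(horizon)
          # Rebuild retained harmonics at future times (DC/Nyquist handling is
          # omitted for brevity since the series has been detrended).
          for i in np.flatnonzero(spec):
              c = spec[i]
              future += (2.0 / n) * np.abs(c) * np.cos(2 * np.pi * freqs[i] * t_future + np.angle(c))
          return np.polyval(trend, t_future) + future

      # Toy "Ap-like" daily series with a ~27-day recurrence plus noise.
      y = np.sin(2 * np.pi * np.arange(200) / 27.0) + 0.1 * np.random.randn(200)
      print(harmonic_forecast(y, horizon=7).round(2))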

  18. Perceptual Load Affects Eyewitness Accuracy & Susceptibility to Leading Questions

    Directory of Open Access Journals (Sweden)

    Gillian Murphy

    2016-08-01

    Full Text Available Load Theory (Lavie, 1995; 2005) states that the level of perceptual load in a task (i.e. the amount of information involved in processing task-relevant stimuli) determines the efficiency of selective attention. There is evidence that perceptual load affects distractor processing, with increased inattentional blindness under high load. Given that high load can result in individuals failing to report seeing obvious objects, it is conceivable that load may also impair memory for the scene. The current study is the first to assess the effect of perceptual load on eyewitness memory. Across three experiments (two video-based and one in a driving simulator), the effect of perceptual load on eyewitness memory was assessed. The results showed that eyewitnesses were less accurate under high load, in particular for peripheral details. For example, memory for the central character in the video was not affected by load but memory for a witness who passed by the window at the edge of the scene was significantly worse under high load. High load memories were also more open to suggestion, showing increased susceptibility to leading questions. High visual perceptual load also affected recall for auditory information, illustrating a possible cross-modal perceptual load effect on memory accuracy. These results have implications for eyewitness memory researchers and forensic professionals.

  19. Perceptual Load Affects Eyewitness Accuracy and Susceptibility to Leading Questions.

    Science.gov (United States)

    Murphy, Gillian; Greene, Ciara M

    2016-01-01

    Load Theory (Lavie, 1995, 2005) states that the level of perceptual load in a task (i.e., the amount of information involved in processing task-relevant stimuli) determines the efficiency of selective attention. There is evidence that perceptual load affects distractor processing, with increased inattentional blindness under high load. Given that high load can result in individuals failing to report seeing obvious objects, it is conceivable that load may also impair memory for the scene. The current study is the first to assess the effect of perceptual load on eyewitness memory. Across three experiments (two video-based and one in a driving simulator), the effect of perceptual load on eyewitness memory was assessed. The results showed that eyewitnesses were less accurate under high load, in particular for peripheral details. For example, memory for the central character in the video was not affected by load but memory for a witness who passed by the window at the edge of the scene was significantly worse under high load. High load memories were also more open to suggestion, showing increased susceptibility to leading questions. High visual perceptual load also affected recall for auditory information, illustrating a possible cross-modal perceptual load effect on memory accuracy. These results have implications for eyewitness memory researchers and forensic professionals.

  20. Diagnostic accuracy of patch test in children with food allergy.

    Science.gov (United States)

    Caglayan Sozmen, Sule; Povesi Dascola, Carlotta; Gioia, Edoardo; Mastrorilli, Carla; Rizzuti, Laura; Caffarelli, Carlo

    2015-08-01

    The gold standard test for confirming whether a child has clinical hypersensitivity reactions to foods is the oral food challenge (OFC). Therefore, there is increasing interest in simpler diagnostic markers of food allergy, especially in children, to avoid oral food challenge. The goal of this study was to assess the diagnostic accuracy of the atopy patch test in comparison with oral food challenge. We investigated 243 children (mean age, 51 months) referred for evaluation of suspected egg or cow's milk allergy. Skin prick test and atopy patch test were carried out, and after a 2-week elimination diet, oral food challenge was performed. Two hundred and forty-three children underwent OFC to the suspected food. We found clinically relevant food allergies in 40 (65%) children to egg and in 22 (35%) to cow's milk. The sensitivity of the skin prick test for both milk and egg was 92%, specificity 91%, positive predictive value 35%, and negative predictive value 93%. Sensitivity, specificity, positive predictive value, and negative predictive value of the atopy patch test for both milk and egg were 21%, 73%, 20%, and 74%, respectively. Our study suggests that there is insufficient evidence for the routine use of the atopy patch test for the evaluation of egg and cow's milk allergy. OFC remains the gold standard for the diagnosis of egg and milk allergy even in the presence of high costs in terms of both time and risks during application. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
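
    The reported indices follow from a standard 2x2 table of test result against the oral food challenge outcome, as in the minimal sketch below; the counts are invented for illustration and are not the study data.

      # 2x2 diagnostic accuracy indices; counts are illustrative only.
      def diagnostic_indices(tp, fp, fn, tn):
          sensitivity = tp / (tp + fn)   # true positives among those with allergy
          specificity = tn / (tn + fp)   # true negatives among those without allergy
          ppv = tp / (tp + fp)           # positive predictive value
          npv = tn / (tn + fn)           # negative predictive value
          return sensitivity, specificity, ppv, npv

      # Hypothetical skin-prick-test table against the oral food challenge.
      sens, spec, ppv, npv = diagnostic_indices(tp=36, fp=15, fn=4, tn=150)
      print("sensitivity %.2f  specificity %.2f  PPV %.2f  NPV %.2f" % (sens, spec, ppv, npv))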

  1. THE THIRD GRAVITATIONAL LENSING ACCURACY TESTING (GREAT3) CHALLENGE HANDBOOK

    International Nuclear Information System (INIS)

    Mandelbaum, Rachel; Kannawadi, Arun; Simet, Melanie; Rowe, Barnaby; Kacprzak, Tomasz; Bosch, James; Miyatake, Hironao; Chang, Chihway; Gill, Mandeep; Courbin, Frederic; Jarvis, Mike; Armstrong, Bob; Lackner, Claire; Leauthaud, Alexie; Nakajima, Reiko; Rhodes, Jason; Zuntz, Joe; Bridle, Sarah; Coupon, Jean; Dietrich, Jörg P.

    2014-01-01

    The GRavitational lEnsing Accuracy Testing 3 (GREAT3) challenge is the third in a series of image analysis challenges, with a goal of testing and facilitating the development of methods for analyzing astronomical images that will be used to measure weak gravitational lensing. This measurement requires extremely precise estimation of very small galaxy shape distortions, in the presence of far larger intrinsic galaxy shapes and distortions due to the blurring kernel caused by the atmosphere, telescope optics, and instrumental effects. The GREAT3 challenge is posed to the astronomy, machine learning, and statistics communities, and includes tests of three specific effects that are of immediate relevance to upcoming weak lensing surveys, two of which have never been tested in a community challenge before. These effects include realistically complex galaxy models based on high-resolution imaging from space; a spatially varying, physically motivated blurring kernel; and a combination of multiple different exposures. To facilitate entry by people new to the field, and for use as a diagnostic tool, the simulation software for the challenge is publicly available, though the exact parameters used for the challenge are blinded. Sample scripts to analyze the challenge data using existing methods will also be provided. See http://great3challenge.info and http://great3.projects.phys.ucl.ac.uk/leaderboard/ for more information.

  2. High accuracy mantle convection simulation through modern numerical methods

    KAUST Repository

    Kronbichler, Martin

    2012-08-21

    Numerical simulation of the processes in the Earth's mantle is a key piece in understanding its dynamics, composition, history and interaction with the lithosphere and the Earth's core. However, doing so presents many practical difficulties related to the numerical methods that can accurately represent these processes at relevant scales. This paper presents an overview of the state of the art in algorithms for high-Rayleigh number flows such as those in the Earth's mantle, and discusses their implementation in the Open Source code Aspect (Advanced Solver for Problems in Earth's ConvecTion). Specifically, we show how an interconnected set of methods for adaptive mesh refinement (AMR), higher order spatial and temporal discretizations, advection stabilization and efficient linear solvers can provide high accuracy at a numerical cost unachievable with traditional methods, and how these methods can be designed in a way so that they scale to large numbers of processors on compute clusters. Aspect relies on the numerical software packages deal.II and Trilinos, enabling us to focus on high level code and keeping our implementation compact. We present results from validation tests using widely used benchmarks for our code, as well as scaling results from parallel runs. © 2012 The Authors Geophysical Journal International © 2012 RAS.

  3. Accuracy of computer-assisted orthognathic surgery.

    Science.gov (United States)

    De Riu, Giacomo; Virdis, Paola Ilaria; Meloni, Silvio Mario; Lumbau, Aurea; Vaira, Luigi Angelo

    2018-02-01

    The purpose of this study was to retrospectively evaluate the difference between the planned and the actual movements of the jaws, using three-dimensional (3D) software for PC-assisted orthognathic surgery, to establish the accuracy of the procedure. A retrospective study was performed with 49 patients who had undergone PC-guided bimaxillary surgery. The accuracy of the protocol was determined by comparing planned movements of the jaws with the actual surgical movements, analysing frontal and lateral cephalometries. The overall results were deemed accurate, and differences among 12 of the 15 parameters were considered nonsignificant. Significant differences were reported for SNA (p = 0.008), SNB (p = 0.006), and anterior facial height (p = 0.033). The latter was significantly different in patients who had undergone genioplasty when compared with patients who had not. Virtual surgical planning presented a good degree of accuracy for most of the parameters assessed, with an average error of 1.98 mm for linear measures and 1.19° for angular measures. In general, a tendency towards under-projection in jaws was detected, probably due to imperfect condylar seating. A slight overcorrection of SNA and SNB during virtual planning (approximately 2°) could be beneficial. Further progress is required in the development of 3D simulation of the soft tissue, which currently does not allow an accurate management of the facial height and the chin position. Virtual planning cannot replace the need for constant intraoperative monitoring of the jaws' movements and real-time comparisons between planned and actual outcomes. It is therefore appropriate to leave some margin for correction of inaccuracies in the virtual planning. In this sense, it may be appropriate to use only the intermediate splint, and then use the planned occlusion and clinical measurements to guide repositioning of the second jaw and chin, respectively. Copyright © 2017 European Association for Cranio

  4. Accuracy of CNV Detection from GWAS Data.

    Directory of Open Access Journals (Sweden)

    Dandan Zhang

    2011-01-01

    Full Text Available Several computer programs are available for detecting copy number variants (CNVs) using genome-wide SNP arrays. We evaluated the performance of four CNV detection software suites--Birdsuite, Partek, HelixTree, and PennCNV-Affy--in the identification of both rare and common CNVs. Each program's performance was assessed in two ways. The first was its recovery rate, i.e., its ability to call 893 CNVs previously identified in eight HapMap samples by paired-end sequencing of whole-genome fosmid clones, and 51,440 CNVs identified by array Comparative Genome Hybridization (aCGH) followed by validation procedures, in 90 HapMap CEU samples. The second evaluation was program performance calling rare and common CNVs in the Bipolar Genome Study (BiGS) data set (1001 bipolar cases and 1033 controls, all of European ancestry) as measured by the Affymetrix SNP 6.0 array. Accuracy in calling rare CNVs was assessed by positive predictive value, based on the proportion of rare CNVs validated by quantitative real-time PCR (qPCR), while accuracy in calling common CNVs was assessed by false positive/false negative rates based on qPCR validation results from a subset of common CNVs. Birdsuite recovered the highest percentages of known HapMap CNVs containing >20 markers in two reference CNV datasets. The recovery rate increased with decreased CNV frequency. In the tested rare CNV data, Birdsuite and Partek had higher positive predictive values than the other software suites. In a test of three common CNVs in the BiGS dataset, Birdsuite's call was 98.8% consistent with qPCR quantification in one CNV region, but the other two regions showed an unacceptable degree of accuracy. We found relatively poor consistency between the two "gold standards," the sequence data of Kidd et al., and aCGH data of Conrad et al. Algorithms for calling CNVs, especially common ones, need substantial improvement, and a "gold standard" for detection of CNVs remains to be established.

  5. Accuracy of hazardous waste project estimates

    International Nuclear Information System (INIS)

    Hackney, J.W.

    1989-01-01

    The HAZRATE system has been developed to appraise the current state of definition of hazardous waste remedial projects. This is shown to have a high degree of correlation to the financial risk of such projects. The method employs a weighted checklist indicating the current degree of definition of some 150 significant project elements. It is based on the author's experience with a similar system for establishing the risk characteristics of process plant projects (Hackney, 1965 and 1989; 1985). In this paper definition ratings for 15 hazardous waste remedial projects have been correlated with the excesses of their actual costs over their base estimates, excluding any allowances for contingencies. Equations are presented, based on this study, for computation of the contingency allowance needed and estimate accuracy possible at a given stage of project development

  6. Quantum mechanical calculations to chemical accuracy

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.

    1991-01-01

    The accuracy of current molecular-structure calculations is illustrated with examples of quantum mechanical solutions for chemical problems. Two approaches are considered: (1) the coupled-cluster singles and doubles (CCSD) approach with a perturbational estimate of the contribution of connected triple excitations, or CCSD(T); and (2) the multireference configuration-interaction (MRCI) approach to the correlation problem. The MRCI approach gains greater applicability by means of size-extensive modifications such as the averaged-coupled pair functional approach. The examples of solutions to chemical problems include those for C-H bond energies, the vibrational frequencies of O3, identifying the ground state of Al2 and Si2, and the Lewis-Rayleigh afterglow and the Hermann IR system of N2. Accurate molecular-wave functions can be derived from a combination of basis-set saturation studies and full configuration-interaction calculations.

  7. Quantitative code accuracy evaluation of ISP33

    Energy Technology Data Exchange (ETDEWEB)

    Kalli, H.; Miwrrin, A. [Lappeenranta Univ. of Technology (Finland); Purhonen, H. [VTT Energy, Lappeenranta (Finland)] [and others

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper deals with a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, with 21 blind calculations and 20 post-test calculations; altogether 10 different thermal hydraulic codes and code versions were used. The results of the application of the methodology to nine selected measured quantities are summarized.
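
    The Pisa methodology is an FFT-based method in which accuracy is summarised by amplitude measures of the error spectrum. The sketch below computes one commonly used figure of merit, the average amplitude AA = Σ|FFT(calc − exp)| / Σ|FFT(exp)|, on a synthetic trace; the exact figures of merit and frequency ranges used for ISP33 may differ.

      # FFT-based accuracy figure of merit (average amplitude, AA); lower is better.
      # The signals below are synthetic stand-ins for a measured and a calculated trace.
      import numpy as np

      t = np.linspace(0.0, 1000.0, 2048)                 # s
      exp = 5.0 + 2.0 * np.exp(-t / 300.0)               # "measured" trace
      calc = 5.1 + 1.8 * np.exp(-t / 280.0)              # "calculated" trace

      error = calc - exp
      AA = np.sum(np.abs(np.fft.rfft(error))) / np.sum(np.abs(np.fft.rfft(exp)))
      print("average amplitude AA = %.3f" % AA)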

  8. Comparative diagnostic accuracy in virtual dermatopathology

    DEFF Research Database (Denmark)

    Mooney, E.; Hood, A.F.; Lampros, J.

    2011-01-01

    studies comparing the diagnostic accuracy and acceptability of virtual slides compared to traditional glass slides. Methods: Ten Nordic dermatopathologists and pathologists were given a randomized combination of 20 virtual and glass slides and asked to identify the diagnoses. They were then asked to give...... their impressions about the virtual images. Descriptive data analysis and comparison of groups using Fisher's exact test were performed. Objective: To compare the diagnostic ability of dermatopathologists and pathologists in two image formats: the traditional (glass) microscopic slides, and whole mount digitized...... of virtual or glass slides did not affect the percentage of questions answered correctly. Seven of nine participants completing the questionnaire, felt virtual microscopy is useful for both learning and testing. Conclusion: There was no significant difference in the participants' diagnostic ability using...

  9. The Accuracy of GBM GRB Localizations

    Science.gov (United States)

    Briggs, Michael Stephen; Connaughton, V.; Meegan, C.; Hurley, K.

    2010-03-01

    We report a study of the accuracy of GBM GRB localizations, analyzing three types of localizations: those produced automatically by the GBM Flight Software on board GBM, those produced automatically with ground software in near real time, and localizations produced with human guidance. The two types of automatic locations are distributed in near real time via GCN Notices; the human-guided locations are distributed on a timescale of many minutes or hours using GCN Circulars. This work uses a Bayesian analysis that models the distribution of the GBM total location error by comparing GBM locations to more accurate locations obtained with other instruments. Reference locations are obtained from Swift, Super-AGILE, the LAT, and with the IPN. We model the GBM total location errors as having systematic errors in addition to the statistical errors and use the Bayesian analysis to constrain the systematic errors.
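
    One simple way to constrain a systematic localization term, sketched below, is to assume the total error is the statistical error added in quadrature with a single systematic component, treat the GBM-to-reference angular separations as Rayleigh distributed, and scan for the maximum-likelihood systematic value. The published analysis uses a richer Bayesian model; all numbers here are invented.

      # Maximum-likelihood estimate of a single systematic localization error,
      # assuming separation_i ~ Rayleigh(sqrt(sigma_stat_i^2 + sigma_sys^2)).
      # Data values are made up for illustration.
      import numpy as np

      sep = np.array([3.2, 7.5, 2.1, 10.4, 5.8])        # deg, GBM vs reference location
      sigma_stat = np.array([1.5, 4.0, 1.0, 6.0, 2.5])  # deg, quoted statistical 1-sigma

      def log_like(sigma_sys):
          s2 = sigma_stat**2 + sigma_sys**2
          return np.sum(np.log(sep / s2) - sep**2 / (2.0 * s2))

      grid = np.linspace(0.0, 15.0, 1501)
      best = grid[np.argmax([log_like(s) for s in grid])]
      print("ML systematic error ~ %.1f deg" % best)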

  10. THE ACCURACY OF Hβ CCD PHOTOMETRY

    Directory of Open Access Journals (Sweden)

    C. Kim

    1994-12-01

    Full Text Available We have undertaken CCD observations of field standard stars with the Hβ photometric system to investigate the reliability of Hβ CCD photometry. Flat fielding with dome flat and sky flat for the Hβw and Hβn filters was compared with that of the B filter in the UBV system and, from these, we have not found any difference. It was confirmed that there is a good linear relationship between our Hβ values observed with the 2.3m reflector and standard values. However, Hβ values observed with the 60cm reflector at Sobaeksan Astronomy Observatory showed a very poor relationship. To investigate the accuracy of Hβ CCD photometry for fainter objects, the open cluster NGC 2437 was observed and reduced with DoPHOT, and the results were compared with those of the photoelectric photometry of Stetson (1981).

  11. High current high accuracy IGBT pulse generator

    International Nuclear Information System (INIS)

    Nesterov, V.V.; Donaldson, A.R.

    1995-05-01

    A solid state pulse generator capable of delivering high current triangular or trapezoidal pulses into an inductive load has been developed at SLAC. Energy stored in a capacitor bank of the pulse generator is switched to the load through a pair of insulated gate bipolar transistors (IGBT). The circuit can then recover the remaining energy and transfer it back to the capacitor bank without reversing the capacitor voltage. A third IGBT device is employed to control the initial charge to the capacitor bank, a command charging technique, and to compensate for pulse to pulse power losses. The rack mounted pulse generator contains a 525 μF capacitor bank. It can deliver 500 A at 900V into inductive loads up to 3 mH. The current amplitude and discharge time are controlled to 0.02% accuracy by a precision controller through the SLAC central computer system. This pulse generator drives a series pair of extraction dipoles

  12. Accuracy verification methods theory and algorithms

    CERN Document Server

    Mali, Olli; Repin, Sergey

    2014-01-01

    The importance of accuracy verification methods was understood at the very beginning of the development of numerical analysis. Recent decades have seen a rapid growth of results related to adaptive numerical methods and a posteriori estimates. However, in this important area there often exists a noticeable gap between mathematicians creating the theory and researchers developing applied algorithms that could be used in engineering and scientific computations for guaranteed and efficient error control.   The goals of the book are to (1) give a transparent explanation of the underlying mathematical theory in a style accessible not only to advanced numerical analysts but also to engineers and students; (2) present detailed step-by-step algorithms that follow from a theory; (3) discuss their advantages and drawbacks, areas of applicability, give recommendations and examples.

  13. Estimating the accuracy of geographical imputation

    Directory of Open Access Journals (Sweden)

    Boscoe Francis P

    2008-01-01

    Full Text Available Abstract Background To reduce the number of non-geocoded cases researchers and organizations sometimes include cases geocoded to postal code centroids along with cases geocoded with the greater precision of a full street address. Some analysts then use the postal code to assign information to the cases from finer-level geographies such as a census tract. Assignment is commonly completed using either a postal centroid or by a geographical imputation method which assigns a location by using both the demographic characteristics of the case and the population characteristics of the postal delivery area. To date no systematic evaluation of geographical imputation methods ("geo-imputation") has been completed. The objective of this study was to determine the accuracy of census tract assignment using geo-imputation. Methods Using a large dataset of breast, prostate and colorectal cancer cases reported to the New Jersey Cancer Registry, we determined how often cases were assigned to the correct census tract using alternate strategies of demographic based geo-imputation, and using assignments obtained from postal code centroids. Assignment accuracy was measured by comparing the tract assigned with the tract originally identified from the full street address. Results Assigning cases to census tracts using the race/ethnicity population distribution within a postal code resulted in more correctly assigned cases than when using postal code centroids. The addition of age characteristics increased the match rates even further. Match rates were highly dependent on both the geographic distribution of race/ethnicity groups and population density. Conclusion Geo-imputation appears to offer some advantages and no serious drawbacks as compared with the alternative of assigning cases to census tracts based on postal code centroids. For a specific analysis, researchers will still need to consider the potential impact of geocoding quality on their results and evaluate
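
    A minimal sketch of demographic-based geo-imputation within a single postal code: assign the case to the tract with the largest share of the case's race/ethnicity group, or draw a tract proportionally to those shares. Tract identifiers and population counts are hypothetical.

      # Demographic geo-imputation within one postal code (hypothetical data).
      import random

      # Population of each race/ethnicity group, by census tract, within the postal code.
      tract_pop = {
          "tract_A": {"white": 1200, "black": 300, "hispanic": 150},
          "tract_B": {"white": 400,  "black": 900, "hispanic": 250},
          "tract_C": {"white": 100,  "black": 100, "hispanic": 800},
      }

      def impute_tract(group, probabilistic=False, rng=random.Random(0)):
          weights = {t: pops.get(group, 0) for t, pops in tract_pop.items()}
          if probabilistic:
              tracts, w = zip(*weights.items())
              return rng.choices(tracts, weights=w, k=1)[0]   # proportional draw
          return max(weights, key=weights.get)                # most likely tract

      print(impute_tract("hispanic"))                  # -> tract_C
      print(impute_tract("black", probabilistic=True))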

  14. Multi-Accuracy-Level Burning Plasma Simulations

    International Nuclear Information System (INIS)

    Artaud, J. F.; Basiuk, V.; Garcia, J.; Giruzzi, G.; Huynh, P.; Huysmans, G.; Imbeaux, F.; Johner, J.; Scheider, M.

    2007-01-01

    The design of a reactor grade tokamak is based on a hierarchy of tools. We present here three codes that are presently used for the simulations of burning plasmas. At the first level there is a 0-dimensional code that allows a reasonable range of global parameters to be chosen; in our case the HELIOS code was used for this task. For the second level we have developed a mixed 0-D / 1-D code called METIS that allows the main properties of a burning plasma to be studied, including profiles and all heat and current sources, but always under the constraint of energy and other empirical scaling laws. METIS is a fast code that permits a large number of runs to be performed (a run takes about one minute) in order to design the main features of a scenario, or to validate the results of the 0-D code on a full time evolution. At the top level, we used the full 1D1/2 suite of codes CRONOS that gives access to a detailed study of the plasma profile evolution. CRONOS can use a variety of modules for source terms and transport coefficient computation with different levels of complexity and accuracy: from simple estimators to highly sophisticated physics calculations. Thus it is possible to vary the accuracy of burning plasma simulations, as a trade-off with computation time. A wide range of scenario studies can thus be made with CRONOS and then validated with post-processing tools like MHD stability analysis. We will present in this paper results of this multi-level analysis applied to the ITER hybrid scenario. This specific example will illustrate the importance of having several tools for the study of burning plasma scenarios, especially in a domain that present devices cannot access experimentally. (Author)

  15. Matters of Accuracy and Conventionality: Prior Accuracy Guides Children's Evaluations of Others' Actions

    Science.gov (United States)

    Scofield, Jason; Gilpin, Ansley Tullos; Pierucci, Jillian; Morgan, Reed

    2013-01-01

    Studies show that children trust previously reliable sources over previously unreliable ones (e.g., Koenig, Clement, & Harris, 2004). However, it is unclear from these studies whether children rely on accuracy or conventionality to determine the reliability and, ultimately, the trustworthiness of a particular source. In the current study, 3- and…

  16. Enhancing spoken connected-digit recognition accuracy by error ...

    Indian Academy of Sciences (India)

    ...nition systems have gained acceptable accuracy levels, the accuracy of recognition of current connected ... bar code and ISBN library code, to name a few. ... Kopec G, Bush M 1985 Network-based connected-digit recognition. IEEE Trans.

  17. Tomographic diagnosis and relevant aspects of otosclerosis

    International Nuclear Information System (INIS)

    Gaiotti, Juliana Oggioni; Gomes, Natalia Delage; Costa, Ana Maria Doffemond; Villela, Caroline Laurita Batista Couto; Moreira, Wanderval; Diniz, Renata Lopes Furletti Caldeira

    2013-01-01

    A literature review and pictorial essay were developed to discuss the importance of knowing the main findings and locations of otosclerosis at multidetector computed tomography (MDCT). The authors performed a retrospective review of cases of otosclerosis diagnosed in their institution by means of high resolution multidetector computed tomography. Otosclerosis corresponds to otic capsule dysplasia characterized by metabolic derangement of its endochondral layer. Such condition constitutes a relevant cause of sensorineural hearing loss, affecting about 7% to 10% of the general population. The diagnosis is usually clinical, but imaging methods play a significant role in the anatomical detailing, differential diagnosis, surgical planning and evaluation of postoperative complications. Among such methods, the relevance of MDCT is highlighted. Radiologists should be familiar with the MDCT findings of otosclerosis, as well as with the temporal bone anatomy to assist in the appropriate clinical management of this disease. (author)

  18. Tomographic diagnosis and relevant aspects of otosclerosis

    Energy Technology Data Exchange (ETDEWEB)

    Gaiotti, Juliana Oggioni; Gomes, Natalia Delage; Costa, Ana Maria Doffemond; Villela, Caroline Laurita Batista Couto; Moreira, Wanderval; Diniz, Renata Lopes Furletti Caldeira, E-mail: jugaiotti@gmail.com [Hospital Mater Dei-Mater Imagem, Belo Horizonte, MG (Brazil)

    2013-09-15

    A literature review and pictorial essay were developed to discuss the importance of knowing the main findings and locations of otosclerosis at multidetector computed tomography (MDCT). The authors performed a retrospective review of cases of otosclerosis diagnosed in their institution by means of high resolution multidetector computed tomography. Otosclerosis corresponds to otic capsule dysplasia characterized by metabolic derangement of its endochondral layer. Such condition constitutes a relevant cause of sensorineural hearing loss, affecting about 7% to 10% of the general population. The diagnosis is usually clinical, but imaging methods play a significant role in the anatomical detailing, differential diagnosis, surgical planning and evaluation of postoperative complications. Among such methods, the relevance of MDCT is highlighted. Radiologists should be familiar with the MDCT findings of otosclerosis, as well as with the temporal bone anatomy to assist in the appropriate clinical management of this disease. (author)

  19. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    International Nuclear Information System (INIS)

    Debono, Josephine C; Poulos, Ann E

    2015-01-01

    The aim of this study was first to evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies and a quality evaluation tool was constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and of Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating the diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  20. EOG feature relevance determination for microsleep detection

    OpenAIRE

    Golz Martin; Wollner Sebastian; Sommer David; Schnieder Sebastian

    2017-01-01

    Automatic relevance determination (ARD) was applied to two-channel EOG recordings for microsleep event (MSE) recognition. Segments of 10 s immediately before MSE and also before counterexamples of fatigued, but attentive driving were analysed. Two types of signal features were extracted: the maximum cross correlation (MaxCC) and logarithmic power spectral densities (PSD) averaged in spectral bands of 0.5 Hz width ranging between 0 and 8 Hz. Generalised learning vector quantisation (GRLVQ) was used as ARD...
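
    A sketch of the two feature types named above, computed from a synthetic two-channel 10 s EOG segment: the maximum of the normalised cross-correlation between channels, and log power spectral densities averaged in 0.5 Hz bands from 0 to 8 Hz. The sampling rate and the Welch PSD parameters are assumptions.

      # MaxCC and band-averaged log-PSD features from a synthetic 10 s EOG segment.
      import numpy as np
      from scipy.signal import welch

      fs = 128                                   # Hz (assumed)
      t = np.arange(0, 10, 1 / fs)
      eog_v = np.sin(2 * np.pi * 0.8 * t) + 0.2 * np.random.randn(t.size)   # vertical channel
      eog_h = np.roll(eog_v, 10) + 0.2 * np.random.randn(t.size)            # horizontal channel

      # Maximum of the normalised cross-correlation between the two channels.
      a = (eog_v - eog_v.mean()) / eog_v.std()
      b = (eog_h - eog_h.mean()) / eog_h.std()
      max_cc = np.max(np.correlate(a, b, mode="full")) / a.size

      # Log PSD averaged in 0.5 Hz bands between 0 and 8 Hz (0.25 Hz resolution).
      freqs, psd = welch(eog_v, fs=fs, nperseg=4 * fs)
      band_feats = [np.log(psd[(freqs >= lo) & (freqs < lo + 0.5)].mean())
                    for lo in np.arange(0.0, 8.0, 0.5)]

      features = [max_cc] + band_feats           # input vector for a classifier such as GRLVQ
      print(len(features), "features; MaxCC =", round(float(max_cc), 3))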

  1. Is QCD relevant to nuclear physics

    International Nuclear Information System (INIS)

    Thomas, A.W.

    1985-01-01

    A review is given of recent work on baryon structure in a number of QCD-motivated models. After establishing a prima facie case that the quark model should be relevant in a consistent description of the nucleus over a wide range of momentum transfer, the author looks for experimental confirmation. The discussion includes the search for exotic states, for a six quark component of the deuteron, and an up to date report on the interpretation of the EMC effect. (Auth.)

  2. Trade Off Relevance Dan Reliability: Isu Ifrs

    OpenAIRE

    Mahmudah, Hadi

    2013-01-01

    Financial reports contain qualitative characteristics that make them useful for their users. For a long time, a trade-off has been believed to exist between the qualitative characteristics of relevance and reliability. The trade-off arises from the use of historical cost and fair value measurement methods, and from the different interests served by the preparation of financial reports. For accountability purposes, historical cost measurement is still reliab...

  3. Hydrogen interaction with fusion-relevant materials

    International Nuclear Information System (INIS)

    Caorlin, M.

    1990-01-01

    This paper is an outline of the work carried out at JRC Ispra in the Tritium-materials Interaction Laboratory on the interaction of gaseous hydrogen with several materials of interest in the field of fusion technology. Experimental work is reported and a concise review of the relevant theoretical and numerical supporting activity is given as well. The work covers a period of about seven years, beginning in 1982. Current work and possible future extensions are also briefly mentioned. 11 figs., 18 refs

  4. Nanostructure symmetry: Relevance for physics and computing

    International Nuclear Information System (INIS)

    Dupertuis, Marc-André; Oberli, D. Y.; Karlsson, K. F.; Dalessi, S.; Gallinet, B.; Svendsen, G.

    2014-01-01

    We review the research done in recent years in our group on the effects of nanostructure symmetry, and outline its relevance both for nanostructure physics and for computations of their electronic and optical properties. The examples of C3v and C2v quantum dots are used. A number of surprises and non-trivial aspects are outlined, and a few symmetry-based tools for computing and analysis are briefly presented.

  5. Nanostructure symmetry: Relevance for physics and computing

    Energy Technology Data Exchange (ETDEWEB)

    Dupertuis, Marc-André; Oberli, D. Y. [Laboratory for Physics of Nanostructure, EPF Lausanne (Switzerland); Karlsson, K. F. [Department of Physics, Chemistry, and Biology (IFM), Linköping University (Sweden); Dalessi, S. [Computational Biology Group, Department of Medical Genetics, University of Lausanne (Switzerland); Gallinet, B. [Nanophotonics and Metrology Laboratory, EPF Lausanne (Switzerland); Svendsen, G. [Dept. of Electronics and Telecom., Norwegian University of Science and Technology, Trondheim (Norway)

    2014-03-31

    We review the research done in recent years in our group on the effects of nanostructure symmetry, and outline its relevance both for nanostructure physics and for computations of their electronic and optical properties. The examples of C3v and C2v quantum dots are used. A number of surprises and non-trivial aspects are outlined, and a few symmetry-based tools for computing and analysis are briefly presented.

  6. Monochloramine Cometabolism by Nitrifying Biofilm Relevant ...

    Science.gov (United States)

    Recently, biological monochloramine removal (i.e., cometabolism) by a pure-culture ammonia-oxidizing bacterium, Nitrosomonas europaea, and by a nitrifying mixed culture has been shown to increase monochloramine demand. Although important, these previous suspended-culture batch kinetic experiments were not representative of drinking water distribution systems, where bacteria grow predominantly as biofilm attached to pipe walls or sediments and physiological differences may exist between suspension and biofilm growth. Therefore, the current research was an important next step in extending the previous results to investigate monochloramine cometabolism by biofilm grown in annular reactors under drinking water relevant conditions. Estimated monochloramine cometabolism kinetics were similar to those of ammonia metabolism, and monochloramine cometabolism was a significant loss mechanism (25–40% of the observed monochloramine loss). These results demonstrated that monochloramine cometabolism occurred in drinking water relevant nitrifying biofilm; thus, cometabolism may be a significant contribution to monochloramine loss during nitrification episodes in distribution systems. The objective was to investigate whether or not nitrifying biofilm can biologically transform monochloramine under drinking water relevant conditions.

  7. Selection of relevant dietary indicators for health.

    Science.gov (United States)

    Steingrímsdóttir, L; Ovesen, L; Moreiras, O; Jacob, S

    2002-05-01

    To define a set of dietary components that are relevant determinants for health in Europe. The selected components are intended to serve as nutrition indicators for health in the European Health Monitoring Programme and, as such, must be limited in number, relevant to health in Europe and practical for all involved countries with respect to data gathering and comparability of data. Major nutrition factors were determined by reviewing relevant epidemiological and clinical literature in nutrition and health as well as referring to reports from international expert groups, including the report from the project Nutrition and Diet for Healthy Lifestyles in Europe. The selection of factors was also based on the relative ease and cost involved for participating countries to obtain comparable and valid data. The selected factors include foods or food groups as well as individual nutrients. Biomarkers are suggested for selected nutrients that pose the greatest difficulty in obtaining valid and comparable data from dietary studies. The following list of diet indicators for health monitoring in Europe was agreed upon by the EFCOSUM group in 2001, in order of priority: vegetables, fruit, bread, fish, saturated fatty acids as percentage of energy (%E), total fat as %E, and ethanol in grams per day. Biomarkers were suggested for the following nutrients: folate, vitamin D, iron, iodine and sodium. Energy has to be assessed in order to calculate %E from total fat and saturated fatty acids.

  8. The value relevance of environmental emissions

    Directory of Open Access Journals (Sweden)

    Melinda Lydia Nelwan

    2016-07-01

    Full Text Available This study examines whether environmental performance has value relevance by investigating the relations between environmental emissions and stock prices for U.S. public companies. Previous studies argued that the conjectured relations between accounting performance measures and environmental performance do not have a strong theoretical basis, and that the modeling of relations between market performance measures and environmental performance does not adequately consider the relevance of accounting performance to market value. Therefore, this study examines whether publicly reported environmental emissions provide incremental information to accounting earnings in pricing companies' stocks. It is done among the complete set of industries covered by Toxics Release Inventory (TRI) reporting for the period 2007 to 2010. Using the Ohlson model, modified to include different types of emissions, it is found that ground emissions (underground injection and land emissions) are value relevant but other emission types (air and water) and transferred-out emissions appear not to provide incremental information in the valuation model. The results of this study raise concerns that different types of emissions are assessed differently by the market, confirming that studies should not aggregate such measures.

  9. Valerian: No Evidence for Clinically Relevant Interactions

    Directory of Open Access Journals (Sweden)

    Olaf Kelber

    2014-01-01

    Full Text Available In recent popular publications as well as in widely used information websites directed to cancer patients, valerian is claimed to have a potential for adverse interactions with anticancer drugs. This questions its use as a safe replacement for, for example, benzodiazepines. A review on the interaction potential of preparations from valerian root (Valeriana officinalis L. root) was therefore conducted. A data base search and a search in a clinical drug interaction data base were conducted. Thereafter, a systematic assessment of publications was performed. Seven in vitro studies on six CYP 450 isoenzymes, on p-glycoprotein, and on two UGT isoenzymes were identified. However, the methodological assessment of these studies did not support their suitability for the prediction of clinically relevant interactions. In addition, clinical studies on various valerian preparations did not reveal any relevant interaction potential concerning CYP 1A2, 2D6, 2E1, and 3A4. Available animal and human pharmacodynamic studies did not verify any interaction potential. The interaction potential of valerian preparations therefore seems to be low and thereby without clinical relevance. We conclude that there is no specific evidence questioning their safety, also in cancer patients.

  10. Functional dyspepsia: Are psychosocial factors of relevance?

    Institute of Scientific and Technical Information of China (English)

    Sandra Barry; Timothy G Dinan

    2006-01-01

    The pathogenesis of Functional Dyspepsia (FD) remains unclear, appears diverse and is thus inadequately understood. Akin to other functional gastrointestinal disorders, research has demonstrated an association between this common diagnosis and psychosocial factors and psychiatric morbidity. Conceptualising the relevance of these factors within the syndrome of FD requires application of the biopsychosocial model of disease. Using this paradigm, dysregulation of the reciprocal communication between the brain and the gut is central to symptom generation, interpretation and exacerbation. Appreciation and understanding of the neurobiological correlates of various psychological states is also relevant. The view that psychosocial factors exert their influence in FD predominantly through motivation of health care seeking also persists. This appears too one-dimensional an assertion in light of the evidence available supporting a more intrinsic aetiological link. Evolving understanding of pathogenic mechanisms and the heterogeneous nature of the syndrome will facilitate effective management. Co-morbid psychiatric illness warrants treatment with conventional therapies. Acknowledging the relevance of psychosocial variables in FD, the degree of which is subject to variation, has implications for assessment and management. Available evidence suggests psychological therapies may benefit FD patients particularly those with chronic symptoms. The rationale for use of psychotropic medications in FD is apparent but the evidence base to support the use of antidepressant pharmacotherapy is to date limited.

  11. ACCURACY AND RELIABILITY AS CRITERIA OF INFORMATIVENESS IN THE NEWS STORY

    Directory of Open Access Journals (Sweden)

    Melnikova Ekaterina Aleksandrovna

    2014-12-01

    Full Text Available The article clarifies the meaning of the terms accuracy and reliability as applied to the news story and offers a research approach to obtaining objective data that help verify the linguistic means by which accuracy and reliability are present in the informative structure of the text. The accuracy of the news story is defined as a high degree of relevance in the reflection of an event through the language representation of its constituents; reliability is viewed as the originality of the news story, demonstrated by the introduction into the text of citations and of sources of information considered trustworthy. Basing the research on an event nominative density identification method, the author composed nominative charts of 115 news story texts collected from the web-sites of the BBC and CNN media corporations; distinguished qualitative and quantitative markers of accuracy and reliability in the news story text; and confirmed that the accuracy of the news story is achieved through terminological clarity in nominating event constituents, thematic binding between words, and the presence of onyms that help identify characteristics of the referent event in depth. The reliability of the text is found in eyewitness accounts, quotations, and references to sources considered trustworthy. A careful examination of the associations between accuracy, reliability and informing strategies in digital news networks allowed the author to identify two variants of information delivery that differ in their communicative and pragmatic functions: developing (which informs about major and minor details of an event) and truncated (which gives only some details, thus raising interest in the event and urging the reader to open the full story).

  12. Prognostic accuracy of antenatal neonatology consultation.

    Science.gov (United States)

    Kukora, S; Gollehon, N; Weiner, G; Laventhal, N

    2017-01-01

    Neonatologists provide antenatal counseling to support shared decision-making for complicated pregnancies. Poor or ambiguous prognostication can lead to inappropriate treatment and parental distress. We sought to evaluate the accuracy of antenatal prognostication. A retrospective cohort was assembled from a prospectively populated database of all outpatient neonatology consultations. On the basis of the written consultation, fetuses were characterized by diagnosis group (multiple anomalies or genetic disorders, single major anomaly, and obstetric complications), assigned to five prognostic categories (I=survivable, IIA=uncertain but likely survivable, II=uncertain, IIB=uncertain but likely non-survivable, III=non-survivable) and two final outcome categories (fetal demise/in-hospital neonatal death or survival to hospital discharge). When possible, status at last follow-up was recorded for those discharged from the hospital. Prognostic accuracy was assessed using unweighted, multi-level likelihood ratios (LRs). The final cohort included 143 fetuses/infants distributed nearly evenly among the three diagnosis groups. Over half (64%) were assigned an uncertain prognosis, but most of these could be divided into 'likely survivable' or 'likely non-survivable' subgroups. Overall survival for the entire cohort was 62% (89/143). All but one of the fetuses assigned a non-survivable prognosis suffered fetal demise or died before hospital discharge. The neonatologist's antenatal prognosis accurately predicted the probability of survival by prognosis group (LR I=4.56, LR IIA=10.53, LR II=4.71, LR IIB=0.099, LR III=0.040). The LRs clearly differentiated between fetuses with high and low probability of survival. Eleven fetuses (7.7%) had misalignment between the predicted prognosis and the outcome. Five died before discharge despite being given category I or IIA prognoses, whereas six infants with category IIB or III prognoses survived to discharge, though some of these were
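
    A minimal sketch of how unweighted multi-level likelihood ratios of this kind can be computed is given below; the category labels and outcomes are hypothetical and only illustrate the calculation, not the study data.

```python
# Hedged sketch: multi-level likelihood ratios for ordinal prognostic
# categories against a binary survival outcome (hypothetical data).
from collections import Counter

def multilevel_lrs(categories, survived):
    """LR_k = P(category k | survived) / P(category k | died)."""
    surv = Counter(c for c, s in zip(categories, survived) if s)
    died = Counter(c for c, s in zip(categories, survived) if not s)
    n_surv, n_died = sum(surv.values()), sum(died.values())
    return {k: (surv[k] / n_surv) / (died[k] / n_died)
            for k in set(categories) if died[k] > 0}

# Hypothetical cohort: prognostic category and survival to discharge
cats = ["I", "I", "IIA", "II", "IIB", "IIB", "III", "III"]
alive = [True, True, True, False, False, True, False, False]
print(multilevel_lrs(cats, alive))   # LR > 1 favours survival, LR < 1 favours death
```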

  13. Treatment accuracy of fractionated stereotactic radiotherapy

    International Nuclear Information System (INIS)

    Kumar, Shaleen; Burke, Kevin; Nalder, Colin; Jarrett, Paula; Mubata, Cephas; A'Hern, Roger; Humphreys, Mandy; Bidmead, Margaret; Brada, Michael

    2005-01-01

    Background and purpose: To assess the geometric accuracy of the delivery of fractionated stereotactic radiotherapy (FSRT) for brain tumours using the Gill-Thomas-Cosman (GTC) relocatable frame. Accuracy of treatment delivery was measured via portal images acquired with an amorphous silicon based electronic portal imager (EPI). Results were used to assess the existing verification process and to review the current margins used for the expansion of clinical target volume (CTV) to planning target volume (PTV). Patients and methods: Patients were immobilized in a GTC frame. Target volume definition was performed on localization CT and MRI scans and a CTV to PTV margin of 5 mm (based on initial experience) was introduced in 3D. A Brown-Roberts-Wells (BRW) fiducial system was used for stereotactic coordinate definition. The existing verification process consisted of an intercomparison of the coordinates of the isocentres and anatomy between the localization and verification CT scans. Treatment was delivered with 6 MV photons using four fixed non-coplanar conformal fields using a multi-leaf collimator. Portal imaging verification consisted of the acquisition of orthogonal images centred through the treatment isocentre. Digitally reconstructed radiographs (DRRs) created from the CT localization scans were used as reference images. Semi-automated matching software was used to quantify set up deviations (displacements and rotations) between reference and portal images. Results: One hundred and twenty six anterior and 123 lateral portal images were available for analysis for set up deviations. For displacements, the total errors in the cranial/caudal direction were shown to have the largest SD's of 1.2 mm, while systematic and random errors reached SD's of 1.0 and 0.7 mm, respectively, in the cranial/caudal direction. The corresponding data for rotational errors (the largest deviation was found in the sagittal plane) was 0.7 deg. SD (total error), 0.5 deg. (systematic) and 0

  14. [Clinical research IV. Relevancy of the statistical test chosen].

    Science.gov (United States)

    Talavera, Juan O; Rivas-Ruiz, Rodolfo

    2011-01-01

    When we look at the difference between two therapies or the association of a risk factor or prognostic indicator with its outcome, we need to evaluate the accuracy of the result. This assessment is based on a judgment that uses information about the study design and the statistical management of the information. This paper specifically addresses the relevance of the statistical test selected. Statistical tests are chosen mainly on the basis of two characteristics: the objective of the study and the type of variables. The objective can be divided into three test groups: a) those in which you want to show differences between groups or within a group before and after a maneuver, b) those that seek to show the relationship (correlation) between variables, and c) those that aim to predict an outcome. The types of variables are divided into two: quantitative (continuous and discontinuous) and qualitative (ordinal and dichotomous). For example, if we seek to demonstrate differences in age (quantitative variable) among patients with systemic lupus erythematosus (SLE) with and without neurological disease (two groups), the appropriate test is the "Student t test for independent samples." But if the comparison is about the frequency of females (binomial variable), then the appropriate statistical test is the χ(2).
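
    As a concrete illustration of matching the test to the variable type, the sketch below (synthetic data, hypothetical group sizes) applies a Student t test to a quantitative variable and a chi-square test to a dichotomous one using SciPy.

```python
# Illustrative sketch (synthetic data): test choice driven by variable type.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Quantitative variable (age) compared between two independent groups
age_with_nd = rng.normal(38, 9, 40)      # SLE patients with neurological disease
age_without_nd = rng.normal(35, 9, 55)   # SLE patients without neurological disease
t, p_t = stats.ttest_ind(age_with_nd, age_without_nd)

# Dichotomous variable (sex) compared between the same two groups
contingency = np.array([[28, 12],    # females / males, with neurological disease
                        [41, 14]])   # females / males, without neurological disease
chi2, p_chi2, dof, _ = stats.chi2_contingency(contingency)
print(f"t test p={p_t:.3f}, chi-square p={p_chi2:.3f}")
```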

  15. The accuracy of dynamic attitude propagation

    Science.gov (United States)

    Harvie, E.; Chu, D.; Woodard, M.

    1990-01-01

    Propagating attitude by integrating Euler's equation for rigid body motion has long been suggested for the Earth Radiation Budget Satellite (ERBS) but until now has not been implemented. Because of limited Sun visibility, propagation is necessary for yaw determination. With the deterioration of the gyros, dynamic propagation has become more attractive. Angular rates are derived by integrating Euler's equation with a stepsize of 1 second, using torques computed from telemetered control system data. The environmental torque model was quite basic; it included gravity gradient and unshadowed aerodynamic torques. Knowledge of control torques is critical to the accuracy of dynamic modeling. Because of its coarseness and sparsity, control actuator telemetry was smoothed before integration. The dynamic model was incorporated into existing ERBS attitude determination software. Modeled rates were then used for attitude propagation in the standard ERBS fine-attitude algorithm. In spite of the simplicity of the approach, the dynamically propagated attitude matched the gyro-propagated attitude well for roll and yaw but diverged by up to 3 degrees for pitch because of the very low resolution of the pitch momentum wheel telemetry. When control anomalies significantly perturb the nominal attitude, the effect of telemetry granularity is reduced and the dynamically propagated attitudes are accurate on all three axes.
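
    The sketch below shows the kind of integration described, stepping Euler's rigid-body equation at 1-second intervals; the inertia tensor, initial rates, and torque model are placeholder values, not ERBS parameters.

```python
# Hedged sketch: propagate body rates by integrating Euler's equation
# I*domega/dt = T - omega x (I*omega) with a 1 s step (placeholder values).
import numpy as np

I = np.diag([1000.0, 1200.0, 900.0])     # assumed principal inertias, kg m^2
I_inv = np.linalg.inv(I)

def torque(t):
    # placeholder for gravity-gradient, aerodynamic and control torques (N m)
    return np.array([1e-3 * np.sin(1e-3 * t), 2e-3, -1e-3])

omega = np.array([0.0, -1.06e-3, 0.0])   # assumed initial body rates, rad/s
dt = 1.0
for step in range(3600):                  # one hour of propagation
    t = step * dt
    omega_dot = I_inv @ (torque(t) - np.cross(omega, I @ omega))
    omega = omega + omega_dot * dt        # simple forward-Euler integration
print(omega)
```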

  16. Accuracy of crystal structure error estimates

    International Nuclear Information System (INIS)

    Taylor, R.; Kennard, O.

    1986-01-01

    A statistical analysis of 100 crystal structures retrieved from the Cambridge Structural Database is reported. Each structure has been determined independently by two different research groups. Comparison of the independent results leads to the following conclusions: (a) The e.s.d.'s of non-hydrogen-atom positional parameters are almost invariably too small. Typically, they are underestimated by a factor of 1.4-1.45. (b) The extent to which e.s.d.'s are underestimated varies significantly from structure to structure and from atom to atom within a structure. (c) Errors in the positional parameters of atoms belonging to the same chemical residue tend to be positively correlated. (d) The e.s.d.'s of heavy-atom positions are less reliable than those of light-atom positions. (e) Experimental errors in atomic positional parameters are normally, or approximately normally, distributed. (f) The e.s.d.'s of cell parameters are grossly underestimated, by an average factor of about 5 for cell lengths and 2.5 for cell angles. There is marginal evidence that the accuracy of atomic-coordinate e.s.d.'s also depends on diffractometer geometry, refinement procedure, whether or not the structure has a centre of symmetry, and the degree of precision attained in the structure determination. (orig.)

  17. Accuracy in activation analysis: count rate effects

    International Nuclear Information System (INIS)

    Lindstrom, R.M.; Fleming, R.F.

    1980-01-01

    The accuracy inherent in activation analysis is ultimately limited by the uncertainty of counting statistics. When careful attention is paid to detail, several workers have shown that all systematic errors can be reduced to an insignificant fraction of the total uncertainty, even when the statistical limit is well below one percent. A matter of particular importance is the reduction of errors due to high counting rate. The loss of counts due to random coincidence (pulse pileup) in the amplifier and to digitization time in the ADC may be treated as a series combination of extending and non-extending dead times, respectively. The two effects are experimentally distinct. Live timer circuits in commercial multi-channel analyzers compensate properly for ADC dead time for long-lived sources, but not for pileup. Several satisfactory solutions are available, including pileup rejection and dead time correction circuits, loss-free ADCs, and computed corrections in a calibrated system. These methods are sufficiently reliable and well understood that a decaying source can be measured routinely with acceptably small errors at a dead time as high as 20 percent
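
    The two dead-time models mentioned above lead to the standard corrections sketched below; the dead time and observed rate are illustrative values only, not calibration data from the cited work.

```python
# Hedged sketch: correcting an observed count rate m for dead time tau.
# Non-extending (non-paralyzable): n = m / (1 - m*tau).
# Extending (paralyzable): m = n * exp(-n*tau), solved numerically for n.
import numpy as np
from scipy.optimize import brentq

tau = 10e-6           # assumed effective dead time per pulse, s
m = 20_000.0          # observed count rate, counts/s (20 percent dead time regime)

n_nonextending = m / (1.0 - m * tau)
n_extending = brentq(lambda n: n * np.exp(-n * tau) - m, m, 5 * m)

print(f"non-extending: {n_nonextending:.0f} c/s, extending: {n_extending:.0f} c/s")
```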

  18. Accuracy of Spindle Units with Hydrostatic Bearings

    Directory of Open Access Journals (Sweden)

    Fedorynenko Dmytro

    2016-06-01

    Full Text Available The work is devoted to the research of precision regularities in a spindle unit based on the trajectory of a spindle installed on hydrostatic bearings. A mathematical model of spindle trajectories with lumped parameters has been developed that allows the position of the spindle to be defined with regard to the simultaneous influence of design parameters, geometrical deviations of form, temperature deformations of bearing surfaces, the random nature of operational parameters, and the technical loads of hydrostatic bearings. Based on the results of numerical modeling, the influence of shape errors of the bearing surface of a hydrostatic bearing on the statistical characteristics of the radius vector trajectories of the spindle has been determined for varying values of spindle rotational speed and oil pressure in the front hydrostatic bearing. The obtained statistical regularities of spindle unit precision have been confirmed experimentally. It has been shown that an effective way to increase the precision of spindle units is to regulate the size of the gap in hydrostatic spindle bearings. A new design of an adjustable hydrostatic bearing, which can improve the accuracy of gap size regulation, has been proposed.

  19. Needle placement accuracy during stereotactic localization mammography

    International Nuclear Information System (INIS)

    Green, D.H.

    2009-01-01

    Aim: To derive a mathematical model to describe the relationship between lesion position in the breast and measurements derived from the stereoradiographs, to enable more accurate sampling of a lesion during stereotactic mammographic needle placement. Materials and methods: The effect that registration errors have on the accuracy of needle placement when identifying the lesion on the stereoradiographs was investigated using the mathematical model. Results: The focus-to-film distance of the x-ray tube and the horizontal distance of the lesion from the centre of rotation have little effect on error. Registration errors for lesions lying at a greater perpendicular distance in the breast from the centre of rotation produce smaller localization errors when compared with lesions sited closer. Lesion registration errors during marking of the stereoradiographs are exacerbated by decreasing the angle of x-ray tube swing. Conclusions: When problems are encountered in making an accurate registration of the lesion on the stereoradiographs, consider the following error-reducing strategies: (1) employ an approach that places the lesion the maximum distance away from the film cassette; (2) avoid reducing the angle of tube swing; and (3) consider sampling superficial and deep to, as well as at, the location indicated. The possibility of erroneous tissue sampling should be borne in mind when reviewing the pathology report.

  20. Overlay accuracy with respect to device scaling

    Science.gov (United States)

    Leray, Philippe; Laidler, David; Cheng, Shaunee

    2012-03-01

    Overlay metrology performance is usually reported as repeatability, matching between tools or optics aberrations distorting the measurement (Tool induced shift or TIS). Over the last few years, improvement of these metrics by the tool suppliers has been impressive. But, what about accuracy? Using different target types, we have already reported small differences in the mean value as well as fingerprint [1]. These differences make the correctables questionable. Which target is correct and therefore which translation, scaling etc. values should be fed back to the scanner? In this paper we investigate the sources of these differences, using several approaches. First, we measure the response of different targets to offsets programmed in a test vehicle. Second, we check the response of the same overlay targets to overlay errors programmed into the scanner. We compare overlay target designs; what is the contribution of the size of the features that make up the target? We use different overlay measurement techniques; is DBO (Diffraction Based Overlay) more accurate than IBO (Image Based Overlay)? We measure overlay on several stacks; what is the stack contribution to inaccuracy? In conclusion, we offer an explanation for the observed differences and propose a solution to reduce them.

  1. The impact of positive, negative and topical relevance feedback

    NARCIS (Netherlands)

    Kaptein, Rianne; Kamps, Jaap; Hiemstra, Djoerd

    2008-01-01

    This document contains a description of experiments for the 2008 Relevance Feedback track. We experiment with different amounts of feedback, including negative relevance feedback. Feedback is implemented using massive weighted query expansion. Parsimonious query expansion using only relevant

  2. Joint modeling of genetically correlated diseases and functional annotations increases accuracy of polygenic risk prediction.

    Directory of Open Access Journals (Sweden)

    Yiming Hu

    2017-06-01

    Full Text Available Accurate prediction of disease risk based on genetic factors is an important goal in human genetics research and precision medicine. Advanced prediction models will lead to more effective disease prevention and treatment strategies. Despite the identification of thousands of disease-associated genetic variants through genome-wide association studies (GWAS) in the past decade, accuracy of genetic risk prediction remains moderate for most diseases, which is largely due to the challenges in both identifying all the functionally relevant variants and accurately estimating their effect sizes. In this work, we introduce PleioPred, a principled framework that leverages pleiotropy and functional annotations in genetic risk prediction for complex diseases. PleioPred uses GWAS summary statistics as its input, and jointly models multiple genetically correlated diseases and a variety of external information including linkage disequilibrium and diverse functional annotations to increase the accuracy of risk prediction. Through comprehensive simulations and real data analyses on Crohn's disease, celiac disease and type-II diabetes, we demonstrate that our approach can substantially increase the accuracy of polygenic risk prediction and risk population stratification, i.e. PleioPred can significantly better separate type-II diabetes patients with early and late onset ages, illustrating its potential clinical application. Furthermore, we show that the increment in prediction accuracy is significantly correlated with the genetic correlation between the predicted and jointly modeled diseases.

  3. The Quality and Accuracy of Mobile Apps to Prevent Driving After Drinking Alcohol.

    Science.gov (United States)

    Wilson, Hollie; Stoyanov, Stoyan R; Gandabhai, Shailen; Baldwin, Alexander

    2016-08-08

    Driving after the consumption of alcohol represents a significant problem globally. Individual prevention countermeasures, such as personalized mobile apps aimed at preventing such behavior, are widespread, but there is little research on their accuracy and evidence base. There has been no known assessment investigating the quality of such apps. This study aimed to determine the quality and accuracy of apps for drink driving prevention by conducting a review and evaluation of relevant mobile apps. A systematic app search was conducted following PRISMA guidelines. App quality was assessed using the Mobile App Rating Scale (MARS). Apps providing blood alcohol calculators (hereafter "calculators") were reviewed against current alcohol advice for accuracy. A total of 58 apps (30 iOS and 28 Android) met inclusion criteria and were included in the final analysis. Drink driving prevention apps had significantly lower engagement and overall quality scores than alcohol management apps. Most calculators provided conservative blood alcohol content (BAC) time-until-sober calculations. None of the apps had been evaluated to determine their efficacy in changing either drinking or driving behaviors. This novel study demonstrates that most drink driving prevention apps are not engaging and lack accuracy. They could be improved by increasing engagement features, such as gamification. Further research should examine the context and motivations for using apps to prevent driving after drinking in at-risk populations. Development of drink driving prevention apps should incorporate evidence-based information and guidance, which are lacking in current apps.

  4. PCA based feature reduction to improve the accuracy of decision tree c4.5 classification

    Science.gov (United States)

    Nasution, M. Z. F.; Sitompul, O. S.; Ramli, M.

    2018-03-01

    Splitting on an attribute is a major process in Decision Tree C4.5 classification. However, this process has no significant impact on the establishment of the decision tree in terms of removing irrelevant features. A major problem in the decision tree classification process is over-fitting, which results from noisy data and irrelevant features; in turn, over-fitting creates misclassification and data imbalance. Many algorithms have been proposed to overcome misclassification and over-fitting in Decision Tree C4.5 classification. Feature reduction is one of the important issues in building a classification model and is intended to remove irrelevant data in order to improve accuracy. The feature reduction framework is used to simplify high-dimensional data to low-dimensional data with non-correlated attributes. In this research, we propose a framework for selecting relevant and non-correlated feature subsets. We consider principal component analysis (PCA) for feature reduction to perform non-correlated feature selection and the Decision Tree C4.5 algorithm for the classification. From experiments conducted using the Cervical Cancer data set from the UCI repository, with 858 instances and 36 attributes, we evaluated the performance of our framework based on accuracy, specificity and precision. Experimental results show that our proposed framework is robust and enhances classification accuracy, with an accuracy rate of 90.70%.
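
    A compact sketch of the pipeline described above is shown below; scikit-learn's CART tree with the entropy criterion stands in for C4.5, and a bundled data set stands in for the cervical cancer data, so the numbers it prints are not those of the study.

```python
# Illustrative sketch: PCA feature reduction followed by an entropy-based
# decision tree (CART with criterion="entropy" as a stand-in for C4.5).
from sklearn.datasets import load_breast_cancer      # stand-in data set
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score, precision_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),                             # non-correlated components
    DecisionTreeClassifier(criterion="entropy", random_state=42),
)
model.fit(X_tr, y_tr)
y_hat = model.predict(X_te)
print(accuracy_score(y_te, y_hat), precision_score(y_te, y_hat))
```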

  5. Propagation of measurement accuracy to biomass soft-sensor estimation and control quality.

    Science.gov (United States)

    Steinwandter, Valentin; Zahel, Thomas; Sagmeister, Patrick; Herwig, Christoph

    2017-01-01

    In biopharmaceutical process development and manufacturing, the online measurement of biomass and derived specific turnover rates is a central task for physiologically monitoring and controlling the process. However, hard-type sensors such as dielectric spectroscopy, broth fluorescence, or permittivity measurement harbor various disadvantages. Therefore, soft-sensors, which use measurements of the off-gas stream and substrate feed to reconcile turnover rates and provide an online estimate of the biomass formation, are smart alternatives. For the reconciliation procedure, mass and energy balances are used together with accuracy estimations of measured conversion rates, which were so far arbitrarily chosen and static over the entire process. In this contribution, we present a novel strategy within the soft-sensor framework (named adaptive soft-sensor) to propagate uncertainties from measurements to conversion rates and demonstrate the benefits: for industrially relevant conditions, the error of the resulting estimated biomass formation rate and specific substrate consumption rate could thereby be decreased by 43 and 64%, respectively, compared to traditional soft-sensor approaches. Moreover, we present a generic workflow to determine the raw signal accuracy required to obtain predefined accuracies of soft-sensor estimations. Thereby, appropriate measurement devices and maintenance intervals can be selected. Furthermore, using this workflow, we demonstrate that the estimation accuracy of the soft-sensor can be additionally and substantially increased.
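
    As an illustration of propagating raw-signal uncertainty into a conversion rate, the sketch below pushes assumed uncertainties of an aeration-rate and off-gas CO2 signal through a simple CO2 evolution rate calculation by Monte Carlo; the formula, numbers, and uncertainties are illustrative assumptions, not the adaptive soft-sensor of the cited work.

```python
# Hedged sketch: Monte Carlo propagation of measurement uncertainty into a
# CO2 evolution rate (CER), using assumed nominal values and uncertainties.
import numpy as np

def cer(flow_l_min, co2_out_pct, co2_in_pct=0.04, v_molar=22.414):
    """Approximate CER in mol/h from aeration rate and off-gas CO2 fraction."""
    return flow_l_min * 60.0 / v_molar * (co2_out_pct - co2_in_pct) / 100.0

flow, sigma_flow = 10.0, 0.1        # L/min and its standard uncertainty (assumed)
co2_out, sigma_co2 = 2.00, 0.02     # % CO2 in off-gas and its uncertainty (assumed)

rng = np.random.default_rng(5)
samples = cer(rng.normal(flow, sigma_flow, 100_000),
              rng.normal(co2_out, sigma_co2, 100_000))
print(f"CER = {samples.mean():.3f} mol/h, std = {samples.std():.3f} mol/h")
```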

  6. Evaluation of the generality and accuracy of a new mesh morphing procedure for the human femur.

    Science.gov (United States)

    Grassi, Lorenzo; Hraiech, Najah; Schileo, Enrico; Ansaloni, Mauro; Rochette, Michel; Viceconti, Marco

    2011-01-01

    Various papers have described mesh morphing techniques for computational biomechanics, but none of them provided a quantitative assessment of generality, robustness, automation, and accuracy in predicting strains. This study aims to quantitatively evaluate the performance of a novel mesh-morphing algorithm. A mesh-morphing algorithm based on radial-basis functions and on manual selection of corresponding landmarks on template and target was developed. The periosteal geometries of 100 femurs were derived from a computed tomography scan database and used to test the algorithm's generality in producing finite element (FE) morphed meshes. A published benchmark, consisting of eight femurs for which in vitro strain measurements and standard FE model strain prediction accuracy were available, was used to assess the accuracy of morphed FE models in predicting strains. Relevant parameters were identified to test the algorithm's robustness to operative conditions. The time and effort needed were evaluated to define the algorithm's degree of automation. Morphing was successful for 95% of the specimens, with mesh quality indicators comparable to those of standard FE meshes. Accuracy of the morphed meshes in predicting strains was good (R(2)>0.9, RMSE%<10%); robustness was insensitive to the operative conditions tested (p>0.05) and partially sensitive to the number of landmarks used. Producing a morphed mesh starting from the triangularized geometry of the specimen requires on average 10 min. The proposed method is general, robust, automated, and accurate enough to be used in bone FE modelling from diagnostic data, and prospectively in applications such as statistical shape modelling. Copyright © 2010 IPEM. Published by Elsevier Ltd. All rights reserved.
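
    A minimal sketch of landmark-driven radial-basis-function morphing is given below; the landmark pairs and mesh nodes are random stand-ins, and SciPy's RBFInterpolator is used rather than the authors' implementation.

```python
# Hedged sketch: RBF morphing of a template mesh onto a target geometry,
# driven by manually selected landmark pairs (random stand-in data here).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
template_landmarks = rng.uniform(-50, 50, (20, 3))                   # mm
target_landmarks = template_landmarks + rng.normal(0, 2, (20, 3))    # same landmarks on target
template_nodes = rng.uniform(-50, 50, (5000, 3))                     # all FE nodes of the template

# One vector-valued RBF: the displacement field sampled at the landmarks
displacement = RBFInterpolator(template_landmarks,
                               target_landmarks - template_landmarks,
                               kernel="thin_plate_spline")
morphed_nodes = template_nodes + displacement(template_nodes)
print(morphed_nodes.shape)
```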

  7. Accuracy required and achievable in radiotherapy dosimetry: have modern technology and techniques changed our views?

    Science.gov (United States)

    Thwaites, David

    2013-06-01

    In this review of the accuracy required and achievable in radiotherapy dosimetry, older approaches and evidence-based estimates for 3DCRT have been reprised, summarising and drawing together the author's earlier evaluations where still relevant. Available evidence for IMRT uncertainties has been reviewed, selecting information from tolerances, QA, verification measurements, in vivo dosimetry and dose delivery audits, to consider whether achievable uncertainties increase or decrease for current advanced treatments and practice. Overall there is some evidence that they tend to increase, but that similar levels should be achievable. Thus it is concluded that those earlier estimates of achievable dosimetric accuracy are still applicable, despite the changes and advances in technology and techniques. The one exception is where there is significant lung involvement, where it is likely that uncertainties have now improved due to widespread use of more accurate heterogeneity models. Geometric uncertainties have improved with the wide availability of IGRT.

  8. Identifying Relevant Studies in Software Engineering

    DEFF Research Database (Denmark)

    Zhang, He; Ali Babar, Muhammad; Tell, Paolo

    2011-01-01

    Context: Systematic literature review (SLR) has become an important research methodology in software engineering since the introduction of evidence-based software engineering (EBSE) in 2004. One critical step in applying this methodology is to design and execute appropriate and effective search… Objective: The main objective of the research reported in this paper is to improve the search step of undertaking SLRs in software engineering (SE) by devising and evaluating systematic and practical approaches to identifying relevant studies in SE. Method: We have systematically selected and analytically…

  9. Happiness: origins, forms, and technical relevance.

    Science.gov (United States)

    Akhtar, Salman

    2010-09-01

    By critically reviewing Freud's views on happiness, and also those of Helene Deutsch, Bertram Lewin, Melanie Klein, and Heinz Kohut, the author evolves a complex and multilayered perspective on the phenomenon. He categorizes happiness into four related and occasionally overlapping varieties: pleasure-based happiness (elation), assertion-based happiness (joy), merger-based happiness (ecstasy), and fulfillment-based happiness (contentment). After entering some caveats and drawing from his clinical experience, the author then demonstrates the relevance of these ideas to the conduct of psychotherapy and psychoanalysis.

  10. Relevance of physics to the pharmacy major.

    Science.gov (United States)

    McCall, Richard P

    2007-08-15

    To offer a physics course that is relevant to pharmacy students, yet still contains many of the fundamental principles of physics. The course was modified over a period of several years to include activities and examples that were related to other courses in the curriculum. Course evaluations were given to assess student attitudes about the importance of physics in the pharmacy curriculum. Students' attitudes have changed over time to appreciate the role that physics plays in their studies. Students gained confidence in their ability to learn in other courses.

  11. Application of Multilabel Learning Using the Relevant Feature for Each Label in Chronic Gastritis Syndrome Diagnosis

    Science.gov (United States)

    Liu, Guo-Ping; Yan, Jian-Jun; Wang, Yi-Qin; Fu, Jing-Jing; Xu, Zhao-Xia; Guo, Rui; Qian, Peng

    2012-01-01

    Background. In Traditional Chinese Medicine (TCM), most of the algorithms are used to solve problems of syndrome diagnosis that only focus on one syndrome, that is, single label learning. However, in clinical practice, patients may simultaneously have more than one syndrome, which has its own symptoms (signs). Methods. We employed a multilabel learning using the relevant feature for each label (REAL) algorithm to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. REAL combines feature selection methods to select the significant symptoms (signs) of CG. The method was tested on 919 patients using the standard scale. Results. The highest prediction accuracy was achieved when 20 features were selected. The features selected with the information gain were more consistent with the TCM theory. The lowest average accuracy was 54% using multi-label neural networks (BP-MLL), whereas the highest was 82% using REAL for constructing the diagnostic model. For coverage, hamming loss, and ranking loss, the values obtained using the REAL algorithm were the lowest at 0.160, 0.142, and 0.177, respectively. Conclusion. REAL extracts the relevant symptoms (signs) for each syndrome and improves its recognition accuracy. Moreover, the studies will provide a reference for constructing syndrome diagnostic models and guide clinical practice. PMID:22719781

  12. Application of Multilabel Learning Using the Relevant Feature for Each Label in Chronic Gastritis Syndrome Diagnosis

    Directory of Open Access Journals (Sweden)

    Guo-Ping Liu

    2012-01-01

    Full Text Available Background. In Traditional Chinese Medicine (TCM), most of the algorithms are used to solve problems of syndrome diagnosis that only focus on one syndrome, that is, single label learning. However, in clinical practice, patients may simultaneously have more than one syndrome, which has its own symptoms (signs). Methods. We employed a multilabel learning using the relevant feature for each label (REAL) algorithm to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. REAL combines feature selection methods to select the significant symptoms (signs) of CG. The method was tested on 919 patients using the standard scale. Results. The highest prediction accuracy was achieved when 20 features were selected. The features selected with the information gain were more consistent with the TCM theory. The lowest average accuracy was 54% using multi-label neural networks (BP-MLL), whereas the highest was 82% using REAL for constructing the diagnostic model. For coverage, hamming loss, and ranking loss, the values obtained using the REAL algorithm were the lowest at 0.160, 0.142, and 0.177, respectively. Conclusion. REAL extracts the relevant symptoms (signs) for each syndrome and improves its recognition accuracy. Moreover, the studies will provide a reference for constructing syndrome diagnostic models and guide clinical practice.

  13. Analysis of accuracy in photogrammetric roughness measurements

    Science.gov (United States)

    Olkowicz, Marcin; Dąbrowski, Marcin; Pluymakers, Anne

    2017-04-01

    Regarding permeability, one of the most important features of shale gas reservoirs is the effective aperture of cracks opened during hydraulic fracturing, both propped and unpropped. In a propped fracture, the aperture is controlled mostly by proppant size and its embedment, and fracture surface roughness only has a minor influence. In contrast, in an unpropped fracture aperture is controlled by the fracture roughness and the wall displacement. To measure fracture surface roughness, we have used the photogrammetric method since it is time- and cost-efficient. To estimate the accuracy of this method we compare the photogrammetric measurements with reference measurements taken with a White Light Interferometer (WLI). Our photogrammetric setup is based on high resolution 50 Mpx camera combined with a focus stacking technique. The first step for photogrammetric measurements is to determine the optimal camera positions and lighting. We compare multiple scans of one sample, taken with different settings of lighting and camera positions, with the reference WLI measurement. The second step is to perform measurements of all studied fractures with the parameters that produced the best results in the first step. To compare photogrammetric and WLI measurements we regrid both data sets onto a regular 10 μm grid and determined the best fit, followed by a calculation of the difference between the measurements. The first results of the comparison show that for 90 % of measured points the absolute vertical distance between WLI and photogrammetry is less than 10 μm, while the mean absolute vertical distance is 5 μm. This proves that our setup can be used for fracture roughness measurements in shales.
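
    The regridding-and-differencing step described above can be sketched as follows; the two surfaces are synthetic stand-ins for the WLI and photogrammetric point clouds.

```python
# Hedged sketch: resample two roughness measurements onto a common 10 um grid
# and compute vertical differences (synthetic surfaces stand in for real data).
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(2)
pts_wli = rng.uniform(0, 1000, (20000, 2))                  # scattered (x, y) in um
z_wli = 20 * np.sin(pts_wli[:, 0] / 100) + rng.normal(0, 1, 20000)
pts_photo = rng.uniform(0, 1000, (20000, 2))
z_photo = 20 * np.sin(pts_photo[:, 0] / 100) + rng.normal(0, 4, 20000)

gx, gy = np.mgrid[0:1000:10, 0:1000:10]                     # regular 10 um grid
grid_wli = griddata(pts_wli, z_wli, (gx, gy), method="linear")
grid_photo = griddata(pts_photo, z_photo, (gx, gy), method="linear")

diff = np.abs(grid_photo - grid_wli)
print(f"mean |dz| = {np.nanmean(diff):.1f} um, "
      f"90th percentile = {np.nanpercentile(diff, 90):.1f} um")
```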

  14. Complete-arch accuracy of intraoral scanners.

    Science.gov (United States)

    Treesh, Joshua C; Liacouras, Peter C; Taft, Robert M; Brooks, Daniel I; Raiciulescu, Sorana; Ellert, Daniel O; Grant, Gerald T; Ye, Ling

    2018-04-30

    Intraoral scanners have shown varied results in complete-arch applications. The purpose of this in vitro study was to evaluate the complete-arch accuracy of 4 intraoral scanners based on trueness and precision measurements compared with a known reference (trueness) and with each other (precision). Four intraoral scanners were evaluated: CEREC Bluecam, CEREC Omnicam, TRIOS Color, and Carestream CS 3500. A complete-arch reference cast was created and printed using a 3-dimensional dental cast printer with photopolymer resin. The reference cast was digitized using a laboratory-based white light 3-dimensional scanner. The printed reference cast was scanned 10 times with each intraoral scanner. The digital standard tessellation language (STL) files from each scanner were then registered to the reference file and compared with differences in trueness and precision using a 3-dimensional modeling software. Additionally, scanning time was recorded for each scan performed. The Wilcoxon signed rank, Kruskal-Wallis, and Dunn tests were used to detect differences for trueness, precision, and scanning time (α=.05). Carestream CS 3500 had the lowest overall trueness and precision compared with Bluecam and TRIOS Color. The fourth scanner, Omnicam, had intermediate trueness and precision. All of the scanners tended to underestimate the size of the reference file, with exception of the Carestream CS 3500, which was more variable. Based on visual inspection of the color rendering of signed differences, the greatest amount of error tended to be in the posterior aspects of the arch, with local errors exceeding 100 μm for all scans. The single capture scanner Carestream CS 3500 had the overall longest scan times and was significantly slower than the continuous capture scanners TRIOS Color and Omnicam. Significant differences in both trueness and precision were found among the scanners. Scan times of the continuous capture scanners were faster than the single capture scanners

  15. Magnetoencephalographic accuracy profiles for the detection of auditory pathway sources.

    Science.gov (United States)

    Bauer, Martin; Trahms, Lutz; Sander, Tilmann

    2015-04-01

    The detection limits for cortical and brain stem sources associated with the auditory pathway are examined in order to analyse brain responses at the limits of the audible frequency range. The results obtained from this study are also relevant to other issues of auditory brain research. A complementary approach consisting of recordings of magnetoencephalographic (MEG) data and simulations of magnetic field distributions is presented in this work. A biomagnetic phantom consisting of a spherical volume filled with a saline solution and four current dipoles is built. The magnetic fields outside of the phantom generated by the current dipoles are then measured for a range of applied electric dipole moments with a planar multichannel SQUID magnetometer device and a helmet MEG gradiometer device; a magnetometer system is expected to be more sensitive to brain stem sources than a gradiometer system. The same electrical and geometrical configuration is simulated in a forward calculation. From both the measured and the simulated data, the dipole positions are estimated using an inverse calculation. Results are obtained for the reconstruction accuracy as a function of applied electric dipole moment and depth of the current dipole. We found that both systems can localize cortical and subcortical sources at physiological dipole strength, even for brain stem sources. Further, we found that a planar magnetometer system is more suitable if the position of the brain source can be restricted to a limited region of the brain. If this is not the case, a helmet-shaped sensor system offers more accurate source estimation.

  16. Analysis of Ion Composition Estimation Accuracy for Incoherent Scatter Radars

    Science.gov (United States)

    Martínez Ledesma, M.; Diaz, M. A.

    2017-12-01

    The Incoherent Scatter Radar (ISR) is one of the most powerful sounding methods developed to study the ionosphere. This radar system determines the plasma parameters by sending powerful electromagnetic pulses into the ionosphere and analyzing the received backscatter. This analysis provides information about parameters such as electron and ion temperatures, electron densities, ion composition, and ion drift velocities. Nevertheless, in some cases the ISR analysis has ambiguities in the determination of the plasma characteristics. Of particular relevance is the ion composition and temperature ambiguity obtained between the F1 and the lower F2 layers. In this case very similar signals are obtained with different mixtures of molecular ions (NO2+ and O2+) and atomic oxygen ions (O+), and consequently it is not possible to completely discriminate between them. The most common solution to this problem is the use of empirical or theoretical models of the ionosphere in the fitting of ambiguous data. More recent works make use of parameters estimated from the Plasma Line band of the radar to reduce the number of parameters to determine. In this work we propose to estimate the error of the ion composition ambiguity when using Plasma Line electron density measurements. The sensitivity of the ion composition estimation has also been calculated as a function of the accuracy of the ionospheric model, showing that correct estimation is highly dependent on the capacity of the model to approximate the real values. Monte Carlo simulations of data fitting at different signal-to-noise ratios (SNR) have been performed to obtain valid and invalid estimation probability curves. This analysis provides a method to determine the probability of erroneous estimation for different signal fluctuations. It can also be used as an empirical method to compare the efficiency of different algorithms and methods when solving the ion composition ambiguity.
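
    A toy sketch of building valid-estimation probability curves by Monte Carlo is shown below; a one-parameter exponential model stands in for the ISR spectral fit, so it only illustrates the procedure, not the physics.

```python
# Toy sketch: probability of a "valid" parameter estimate as a function of SNR,
# estimated by Monte Carlo with a stand-in one-parameter forward model.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 200)
true_p = 0.4
model = lambda p: np.exp(-x / p)              # stand-in forward model

def valid_fraction(snr, n_trials=500, tol=0.05):
    signal = model(true_p)
    sigma = signal.std() / snr
    p_grid = np.linspace(0.1, 1.0, 181)
    templates = np.array([model(p) for p in p_grid])
    hits = 0
    for _ in range(n_trials):
        y = signal + rng.normal(0.0, sigma, x.size)
        p_hat = p_grid[np.argmin(((templates - y) ** 2).sum(axis=1))]
        hits += abs(p_hat - true_p) < tol
    return hits / n_trials

for snr in (0.5, 1, 2, 5, 10):
    print(f"SNR={snr}: P(valid) = {valid_fraction(snr):.2f}")
```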

  17. Other relevant papers in physical oceanography

    International Nuclear Information System (INIS)

    Nyffeler, F.

    1989-01-01

    During the past few years, significant progress has occurred in the field of physical oceanography partly as a consequence of developing cooperation and international participation in well-coordinated ocean research programmes. Although these programs were not designed specifically to address CRESP problems, many have proved to be directly relevant to CRESP objectives. For example, MODE, POLYMODE, and Tourbillon were intensive site-specific experiments that included studies of dispersion processes throughout the water column. NOAMP and GME were also site specific, involved the entire water column, and even stressed near-bottom and suspended-sediment processes. Others, (e.g., WOCE) are larger in scope and include extensive observations of the general circulation of entire ocean basins. As a whole, they contribute immensely to improving the data base for exchange and transport processes and thereby for the verification and validation of both regional-scale and general-circulation ocean models. That, in turn, is directly relevant to radiological assessments. Selected papers deriving from experiments such as these are discussed and referenced below

  18. [Terbinafine : Relevant drug interactions and their management].

    Science.gov (United States)

    Dürrbeck, A; Nenoff, P

    2016-09-01

    The allylamine terbinafine is probably the most frequently prescribed systemic antifungal agent in Germany for the treatment of dermatomycoses and onychomycoses. According to the German drug law, terbinafine is approved for patients who are 18 years and older; however, this antifungal agent is increasingly used off-label for the treatment of onychomycoses and tinea capitis in children. Terbinafine is associated with only a few interactions with other drugs, which is why it can generally be used without problems in older and multimorbid patients. Nevertheless, some potential interactions of terbinafine with certain drug substances are known, including substances of the group of antidepressants/antipsychotics and some cardiovascular drugs. Decisive for the relevance of interactions is, along with the therapeutic index of the substrate and the possible alternative degradation pathways, the genetically determined type of metabolism. When combining terbinafine with tricyclic antidepressants or with selective serotonin reuptake inhibitors and serotonin/noradrenalin reuptake inhibitors, the clinical response and potential side effects must be monitored. Problematic is the use of terbinafine during simultaneous treatment with tamoxifen: the administration of potent CYP2D6 inhibitors leads to a diminished efficacy of tamoxifen because one of its most important active metabolites, endoxifen, is not sufficiently available. Therefore, the combination of tamoxifen and terbinafine should be avoided. In conclusion, the number of substances that can cause clinically relevant interactions when administered simultaneously with terbinafine is limited and should be manageable in the dermatological office with adequate monitoring.

  19. Relevance of randomised controlled trials in oncology.

    Science.gov (United States)

    Tannock, Ian F; Amir, Eitan; Booth, Christopher M; Niraula, Saroj; Ocana, Alberto; Seruga, Bostjan; Templeton, Arnoud J; Vera-Badillo, Francisco

    2016-12-01

    Well-designed randomised controlled trials (RCTs) can prevent bias in the comparison of treatments and provide a sound basis for changes in clinical practice. However, the design and reporting of many RCTs can render their results of little relevance to clinical practice. In this Personal View, we discuss the limitations of RCT data and suggest some ways to improve the clinical relevance of RCTs in the everyday management of patients with cancer. RCTs should ask questions of clinical rather than commercial interest, avoid non-validated surrogate endpoints in registration trials, and have entry criteria that allow inclusion of all patients who are fit to receive treatment. Furthermore, RCTs should be reported with complete accounting of frequency and management of toxicities, and with strict guidelines to ensure freedom from bias. Premature reporting of results should be avoided. The bar for clinical benefit should be raised for drug registration, which should require publication and review of mature data from RCTs, post-marketing health outcome studies, and value-based pricing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Quantitative accuracy assessment of thermalhydraulic code predictions with SARBM

    International Nuclear Information System (INIS)

    Prosek, A.

    2001-01-01

    In recent years, the nuclear reactor industry has focused significant attention on nuclear reactor systems code accuracy and uncertainty issues. A few methods suitable for quantifying the accuracy of thermalhydraulic code calculations have been proposed and applied in the past. In this study a Stochastic Approximation Ratio Based Method (SARBM) was adapted and proposed for accuracy quantification. The objective of the study was to qualify the SARBM. The study compares the accuracy obtained with the SARBM to the results obtained with the widely used Fast Fourier Transform Based Method (FFTBM). The methods were applied to RELAP5/MOD3.2 code calculations of various BETHSY experiments. The obtained results showed that the SARBM was able to satisfactorily predict the accuracy of the calculated trends when visually comparing plots and when comparing the results with the qualified FFTBM. The analysis also showed that the new figure of merit, called the accuracy factor (AF), is more convenient than the stochastic approximation ratio for combining single-variable accuracies into a total accuracy. The accuracy results obtained for the selected tests suggest that the acceptability factors for the SAR method were reasonably defined. The results also indicate that AF is a useful quantitative measure of accuracy. (author)
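
    For context, the sketch below computes the average-amplitude figure of merit commonly used in FFT-based accuracy quantification for a single variable; the trends are synthetic, and the SARBM accuracy factor itself is not reproduced here.

```python
# Hedged sketch: FFT-based average amplitude AA for one calculated vs
# experimental trend, AA = sum|FFT(calc - exp)| / sum|FFT(exp)| (synthetic data).
import numpy as np

def average_amplitude(exp, calc):
    err_spec = np.abs(np.fft.rfft(calc - exp))
    exp_spec = np.abs(np.fft.rfft(exp))
    return err_spec.sum() / exp_spec.sum()

t = np.linspace(0.0, 1000.0, 2048)                # s, synthetic transient
exp = 150 - 0.05 * t + 5 * np.sin(t / 40)         # "experimental" trend
calc = 150 - 0.048 * t + 4 * np.sin(t / 38)       # "calculated" trend
print(f"AA = {average_amplitude(exp, calc):.3f}   (smaller means more accurate)")
```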

  1. The mathematical model accuracy estimation of the oil storage tank foundation soil moistening

    Science.gov (United States)

    Gildebrandt, M. I.; Ivanov, R. N.; Gruzin, AV; Antropova, L. B.; Kononov, S. A.

    2018-04-01

    Improving the technologies used to prepare oil storage tank foundations is a relevant objective; achieving it would make it possible to reduce material costs and the time spent on foundation preparation while providing the required operational reliability. Laboratory research revealed the nature of the watering of a sandy soil layer with a given amount of water. The obtained data made it possible to develop a mathematical model of sandy soil layer moistening. The performed estimation of the accuracy of the oil storage tank foundation soil moistening mathematical model showed acceptable convergence between the experimental and theoretical results.

  2. Systematic Review of the Diagnostic Accuracy and Therapeutic Effectiveness of Sacroiliac Joint Interventions.

    Science.gov (United States)

    Simopoulos, Thomas T; Manchikanti, Laxmaiah; Gupta, Sanjeeva; Aydin, Steve M; Kim, Chong Hwan; Solanki, Daneshvari; Nampiaparampil, Devi E; Singh, Vijay; Staats, Peter S; Hirsch, Joshua A

    2015-01-01

    The sacroiliac joint is well known as a cause of low back and lower extremity pain. Prevalence estimates are 10% to 25% in patients with persistent axial low back pain without disc herniation, discogenic pain, or radiculitis based on multiple diagnostic studies and systematic reviews. However, at present there are no definitive management options for treating sacroiliac joint pain. To evaluate the diagnostic accuracy and therapeutic effectiveness of sacroiliac joint interventions. A systematic review of the diagnostic accuracy and therapeutic effectiveness of sacroiliac joint interventions. The available literature on diagnostic and therapeutic sacroiliac joint interventions was reviewed. The quality assessment criteria utilized were the Quality Appraisal of Reliability Studies (QAREL) checklist for diagnostic accuracy studies, Cochrane review criteria to assess sources of risk of bias, and Interventional Pain Management Techniques-Quality Appraisal of Reliability and Risk of Bias Assessment (IPM-QRB) criteria for randomized therapeutic trials and Interventional Pain Management Techniques-Quality Appraisal of Reliability and Risk of Bias Assessment for Nonrandomized Studies (IPM-QRBNR) for observational therapeutic assessments. The level of evidence was based on a best evidence synthesis with modified grading of qualitative evidence from Level I to Level V. Data sources included relevant literature published from 1966 through March 2015 that were identified through searches of PubMed and EMBASE, manual searches of the bibliographies of known primary and review articles, and all other sources. For the diagnostic accuracy assessment, and for the therapeutic modalities, the primary outcome measure of pain relief and improvement in functional status were utilized. A total of 11 diagnostic accuracy studies and 14 therapeutic studies were included. The evidence for diagnostic accuracy is Level II for dual diagnostic blocks with at least 70% pain relief as the criterion

  3. GPS and Electronic Fence Data Fusion for Positioning within Railway Worksite Scenarios

    DEFF Research Database (Denmark)

    Figueiras, Joao; Grønbæk, Lars Jesper; Ceccarelli, Andrea

    2012-01-01

    Context-dependent decisions in safety-critical applications require careful consideration of accuracy and timeliness of the underlying context information. Relevant examples include location-dependent actions in mobile distributed systems. This paper considers localization functions for positioning within railway worksite scenarios; a fusion of GPS data with information from electronic fences is developed and analyzed. Different accuracy metrics are proposed and the benefit obtained from the fusion with electronic fences is quantitatively analyzed in the scenarios of a single mobile entity: by having fence information, the correct zone estimation can increase by 30%, while false alarms can be reduced by one order of magnitude in the tested scenario.

  4. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis in many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible, when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located at sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, from which half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7

  5. Improving treatment planning accuracy through multimodality imaging

    International Nuclear Information System (INIS)

    Sailer, Scott L.; Rosenman, Julian G.; Soltys, Mitchel; Cullip, Tim J.; Chen, Jun

    1996-01-01

    the patient's initial fields and boost, respectively. Case illustrations are shown. Conclusions: We have successfully integrated multimodality imaging into our treatment-planning system, and its routine use is increasing. Multimodality imaging holds out the promise of improving treatment planning accuracy and, thus, takes maximum advantage of three dimensional treatment planning systems.

  6. Package Design Affects Accuracy Recognition for Medications.

    Science.gov (United States)

    Endestad, Tor; Wortinger, Laura A; Madsen, Steinar; Hortemo, Sigurd

    2016-12-01

    Our aim was to test whether highlighting and placement of the substance name on the medication package have the potential to reduce patient errors. Unintentional overdose of medication is a major health issue that might be linked to medication package design. In two experiments, the placement, background color, and labeling of the active ingredient on generic medication packages were manipulated according to human factors guidelines for reducing causes of labeling-related patient errors: we compared the original packaging with packages in which we varied the placement of the name, the dose, and the background of the active ingredient. Age-related differences and the effect of color on medication recognition error were tested. In Experiment 1, 59 volunteers (30 elderly and 29 young students) participated. In Experiment 2, 25 volunteers participated. The most common error was the inability to identify that two different packages contained the same active ingredient (young, 41%, and elderly, 68%). This kind of error decreased with the redesigned packages (young, 8%, and elderly, 16%). Confusion errors related to color design were reduced by two thirds in the redesigned packages compared with the original generic medications. Prominent placement of the substance name and dose with a band of high-contrast color supports recognition of the active substance in medications. A simple modification, highlighting and placing the name of the active ingredient in the upper right-hand corner of the package, helps users realize that two different packages can contain the same active substance, thus reducing the risk of inadvertent medication overdose. © 2016, Human Factors and Ergonomics Society.

  7. Identifying noncoding risk variants using disease-relevant gene regulatory networks.

    Science.gov (United States)

    Gao, Long; Uzun, Yasin; Gao, Peng; He, Bing; Ma, Xiaoke; Wang, Jiahui; Han, Shizhong; Tan, Kai

    2018-02-16

    Identifying noncoding risk variants remains a challenging task. Because noncoding variants exert their effects in the context of a gene regulatory network (GRN), we hypothesize that explicit use of disease-relevant GRNs can significantly improve the inference accuracy of noncoding risk variants. We describe Annotation of Regulatory Variants using Integrated Networks (ARVIN), a general computational framework for predicting causal noncoding variants. It employs a set of novel regulatory network-based features, combined with sequence-based features to infer noncoding risk variants. Using known causal variants in gene promoters and enhancers in a number of diseases, we show ARVIN outperforms state-of-the-art methods that use sequence-based features alone. Additional experimental validation using reporter assay further demonstrates the accuracy of ARVIN. Application of ARVIN to seven autoimmune diseases provides a holistic view of the gene subnetwork perturbed by the combinatorial action of the entire set of risk noncoding mutations.

  8. Application of Multilayer Perceptron with Automatic Relevance Determination on Weed Mapping Using UAV Multispectral Imagery.

    Science.gov (United States)

    Tamouridou, Afroditi A; Alexandridis, Thomas K; Pantazi, Xanthoula E; Lagopodi, Anastasia L; Kashefi, Javid; Kasampalis, Dimitris; Kontouris, Georgios; Moshou, Dimitrios

    2017-10-11

    Remote sensing techniques are routinely used in plant species discrimination and weed mapping. In the presented work, successful Silybum marianum detection and mapping using multilayer neural networks is demonstrated. A multispectral camera (green-red-near infrared) attached to a fixed-wing unmanned aerial vehicle (UAV) was utilized for the acquisition of high-resolution images (0.1 m resolution). The Multilayer Perceptron with Automatic Relevance Determination (MLP-ARD) was used to identify S. marianum among other vegetation, mostly Avena sterilis L. The three spectral bands of Red, Green, Near Infrared (NIR) and the texture layer resulting from local variance were used as input. The S. marianum identification rates using MLP-ARD reached an accuracy of 99.54%. The study had a one-year duration, meaning that the results are specific to that period, although the accuracy shows the interesting potential of S. marianum mapping with MLP-ARD on multispectral UAV imagery.
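
    The classification step can be sketched as below with synthetic pixel samples; a plain multilayer perceptron from scikit-learn stands in for MLP-ARD, whose automatic relevance determination prior is not implemented here.

```python
# Simplified sketch: classify weed vs other vegetation pixels from three bands
# plus a local-variance texture feature (synthetic data, plain MLP stand-in).
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
n = 4000
X_weed = rng.normal([0.10, 0.18, 0.55, 0.020], 0.02, (n, 4))   # G, R, NIR, texture
X_other = rng.normal([0.12, 0.22, 0.45, 0.012], 0.02, (n, 4))
X = np.vstack([X_weed, X_other])
y = np.r_[np.ones(n), np.zeros(n)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print(accuracy_score(y_te, clf.predict(X_te)))
```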

  9. Maximum relevance, minimum redundancy band selection based on neighborhood rough set for hyperspectral data classification

    International Nuclear Information System (INIS)

    Liu, Yao; Chen, Yuehua; Tan, Kezhu; Xie, Hong; Wang, Liguo; Xie, Wu; Yan, Xiaozhen; Xu, Zhen

    2016-01-01

    Band selection is considered to be an important processing step in handling hyperspectral data. In this work, we selected informative bands according to the maximal relevance minimal redundancy (MRMR) criterion based on neighborhood mutual information. Two measures, MRMR difference and MRMR quotient, were defined, and a forward greedy search for band selection was constructed. The performance of the proposed algorithm, along with a comparison with other methods (a neighborhood dependency measure based algorithm, a genetic algorithm and an uninformative variable elimination algorithm), was studied using the classification accuracy of extreme learning machine (ELM) and random forests (RF) classifiers on soybean hyperspectral datasets. The results show that the proposed MRMR algorithm leads to promising improvement in band selection and classification accuracy. (paper)
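
    To make the forward greedy search concrete, the sketch below implements the MRMR "difference" criterion on synthetic band data. Plain mutual-information estimates from scikit-learn stand in for the neighborhood mutual information used in the paper, so this illustrates only the search strategy, not the exact algorithm.

    # Forward greedy MRMR band selection ("difference" form) on toy data.
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

    def mrmr_difference(X, y, n_bands):
        relevance = mutual_info_classif(X, y, random_state=0)      # I(band; class label)
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < n_bands and remaining:
            best_band, best_score = None, -np.inf
            for j in remaining:
                # Redundancy: mean MI between candidate band j and the bands already chosen.
                redundancy = np.mean([
                    mutual_info_regression(X[:, [k]], X[:, j], random_state=0)[0]
                    for k in selected
                ]) if selected else 0.0
                score = relevance[j] - redundancy                   # MRMR "difference" criterion
                if score > best_score:
                    best_band, best_score = j, score
            selected.append(best_band)
            remaining.remove(best_band)
        return selected

    rng = np.random.default_rng(2)
    X = rng.random((200, 30))                 # 200 hypothetical samples x 30 bands
    y = rng.integers(0, 3, size=200)          # 3 hypothetical classes
    print(mrmr_difference(X, y, n_bands=5))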

  10. Constitutional relevance of atomic energy law

    International Nuclear Information System (INIS)

    Lettow, S.

    1980-01-01

    In a decision publicized on December 20, 1979 the German Federal Constitutional Court rejected a claim of unconstitutionality in connection with the licensing procedure of the Muelheim-Kaerlich Nuclear Power Station currently under construction. This constitutes confirmation, by the 1st Department of the Court, of a decision in 1978 by the 2nd Department about the Kalkar fast breeder power plant, in which the peaceful utilization of nuclear energy had been found to be constitutional. However, the new decision by the Federal Constitutional Court particularly emphasizes the constitutional relevance of the rules of procedure under the Atomic Energy Act and their function with respect to the protection of civil rights. (orig.) [de

  11. Mirror neurons and their clinical relevance.

    Science.gov (United States)

    Rizzolatti, Giacomo; Fabbri-Destro, Maddalena; Cattaneo, Luigi

    2009-01-01

    One of the most exciting events in neurosciences over the past few years has been the discovery of a mechanism that unifies action perception and action execution. The essence of this 'mirror' mechanism is as follows: whenever individuals observe an action being done by someone else, a set of neurons that code for that action is activated in the observers' motor system. Since the observers are aware of the outcome of their motor acts, they also understand what the other individual is doing without the need for intermediate cognitive mediation. In this Review, after discussing the most pertinent data concerning the mirror mechanism, we examine the clinical relevance of this mechanism. We first discuss the relationship between mirror mechanism impairment and some core symptoms of autism. We then outline the theoretical principles of neurorehabilitation strategies based on the mirror mechanism. We conclude by examining the relationship between the mirror mechanism and some features of the environmental dependency syndromes.

  12. The Integrin Receptor in Biologically Relevant Bilayers

    DEFF Research Database (Denmark)

    Kalli, Antreas C.; Róg, Tomasz; Vattulainen, Ilpo

    2017-01-01

    /talin complex was inserted in biologically relevant bilayers that resemble the cell plasma membrane containing zwitterionic and charged phospholipids, cholesterol and sphingolipids to study the dynamics of the integrin receptor and its effect on bilayer structure and dynamics. The results of this study...... demonstrate the dynamic nature of the integrin receptor and suggest that the presence of the integrin receptor alters the lipid organization between the two leaflets of the bilayer. In particular, our results suggest elevated density of cholesterol and of phosphatidylserine lipids around the integrin....../talin complex and a slowing down of lipids in an annulus of ~30 Å around the protein due to interactions between the lipids and the integrin/talin F2–F3 complex. This may in part regulate the interactions of integrins with other related proteins or integrin clustering thus facilitating signal transduction...

  13. PREDICTING RELEVANT EMPTY SPOTS IN SOCIAL INTERACTION

    Institute of Scientific and Technical Information of China (English)

    Yoshiharu MAENO; Yukio OHSAWA

    2008-01-01

    An empty spot refers to an empty, hard-to-fill space that can be found in the records of social interaction, and is a clue to the persons in the underlying social network who do not appear in those records. This contribution addresses the problem of predicting relevant empty spots in social interaction. Homogeneous and inhomogeneous networks are studied as models underlying the social interaction. A heuristic predictor-function method is presented as a new way to address the problem. A simulation experiment is demonstrated over a homogeneous network: a test data set in the form of market baskets is generated from the simulated communication, and the precision of predicting the empty spots is calculated to demonstrate the performance of the presented method.

  14. Mathematical Properties Relevant to Geomagnetic Field Modeling

    DEFF Research Database (Denmark)

    Sabaka, Terence J.; Hulot, Gauthier; Olsen, Nils

    2010-01-01

    be directly measured. In this chapter, the mathematical foundation of global (as opposed to regional) geomagnetic field modeling is reviewed, and the spatial modeling of the field in spherical coordinates is focussed. Time can be dealt with as an independent variable and is not explicitly considered......Geomagnetic field modeling consists in converting large numbers of magnetic observations into a linear combination of elementary mathematical functions that best describes those observations.The set of numerical coefficients defining this linear combination is then what one refers.......The relevant elementary mathematical functions are introduced, their properties are reviewed, and how they can be used to describe the magnetic field in a source-free (such as the Earth’s neutral atmosphere) or source-dense (such as the ionosphere) environment is explained. Completeness and uniqueness...

  15. Mathematical Properties Relevant to Geomagnetic Field Modeling

    DEFF Research Database (Denmark)

    Sabaka, Terence J.; Hulot, Gauthier; Olsen, Nils

    2014-01-01

    be directly measured. In this chapter, the mathematical foundation of global (as opposed to regional) geomagnetic field modeling is reviewed, and the spatial modeling of the field in spherical coordinates is focused. Time can be dealt with as an independent variable and is not explicitly considered......Geomagnetic field modeling consists in converting large numbers of magnetic observations into a linear combination of elementary mathematical functions that best describes those observations. The set of numerical coefficients defining this linear combination is then what one refers....... The relevant elementary mathematical functions are introduced, their properties are reviewed, and how they can be used to describe the magnetic field in a source-free (such as the Earth’s neutral atmosphere) or source-dense (such as the ionosphere) environment is explained. Completeness and uniqueness...

  16. Towards increased policy relevance in energy modeling

    Energy Technology Data Exchange (ETDEWEB)

    Worrell, Ernst; Ramesohl, Stephan; Boyd, Gale

    2003-07-29

    Historically, most energy models were reasonably equipped to assess the impact of a subsidy or change in taxation, but are often insufficient to assess the impact of more innovative policy instruments. We evaluate the models used to assess future energy use, focusing on industrial energy use. We explore approaches to engineering-economic analysis that could help improve the realism and policy relevance of engineering-economic modeling frameworks. We also explore solutions to strengthen the policy usefulness of engineering-economic analysis that can be built from a framework of multi-disciplinary cooperation. We focus on the so-called "engineering-economic" (or "bottom-up") models, as they include the amount of detail that is commonly needed to model policy scenarios. We identify research priorities for the modeling framework, technology representation in models, policy evaluation and modeling of decision-making behavior.

  17. Climate-relevant monitorings in Germany

    International Nuclear Information System (INIS)

    Metternich, P.

    1993-01-01

    This catalogue contains so-called meta-data; i.e. information on data. For each measuring programme or set of data, users find the address (postal address, telephone, fax-number) of the respective contact person at the beginning of the entry. The catalogue has three parts: Part A is a compilation of monitoring programmes using conventional methods adopted on the ground. Part B contains research programmes or sets of data from the field of remote sensing. In part C, data sets from time series of climate-relevant parameters are described. Section A was additionally structured according to the compartments of the climate system: atmosphere, hydrosphere, cryosphere, biosphere. (orig./KW) [de

  18. Bacteriophage lambda: early pioneer and still relevant

    Science.gov (United States)

    Casjens, Sherwood R.; Hendrix, Roger W.

    2015-01-01

    Molecular genetic research on bacteriophage lambda carried out during its golden age from the mid-1950s to the mid-1980s was critically important in the attainment of our current understanding of the sophisticated and complex mechanisms by which the expression of genes is controlled, of DNA virus assembly and of the molecular nature of lysogeny. The development of molecular cloning techniques, ironically instigated largely by phage lambda researchers, allowed many phage workers to switch their efforts to other biological systems. Nonetheless, since that time the ongoing study of lambda and its relatives has continued to give important new insights. In this review we give some relevant early history and describe recent developments in understanding the molecular biology of lambda's life cycle. PMID:25742714

  19. Reach and Relevance of Prison Research

    Directory of Open Access Journals (Sweden)

    Hilde Tubex

    2015-04-01

    Full Text Available In this contribution I reflect on the changes in the penal landscape and how they impact on prison research. I do this from my experiences as a prison researcher in a variety of roles, in both Europe and Australia. The growing dominance of managerialism has impacted on both corrective services and universities, in ways that have changed the relationship between current prison practices and academically oriented research. Therefore, academics have to question how their contemporary prison research can bridge the emerging gap: how they can not only produce research that adheres to the roots of criminology and provides a base for a rational penal policy, but also how they can develop strategies to get recognition of and funding for this broader contextual work which, although it might not produce results that are immediately identifiable, can be of relevance in indirect ways and in the longer term.

  20. Relevance of protection quantities in medical exposures

    International Nuclear Information System (INIS)

    Pradhan, A.S.

    2008-01-01

    The International Commission on Radiological Protection (ICRP) continues to classify exposures to radiation in three categories, namely (1) occupational exposure, (2) public exposure, and (3) medical exposure. Protection quantities are primarily meant for regulatory purposes in radiological protection, for controlling and limiting stochastic risks in occupational and public exposures. They are based on two basic assumptions: (1) a linear no-threshold dose-effect relationship (LNT) at low doses and (2) long-term additivity of low doses. Medical exposures are predominantly delivered to individuals (patients) undergoing diagnostic examinations, interventional procedures and radiation therapy, but also include individuals caring for or comforting patients incurring exposure and the volunteers of biomedical research programmes. Radiation protection is as relevant to occupational and public exposure as to medical exposures, except that the dose limits set for the former are not applicable to medical exposure; instead, reference levels and dose constraints are recommended for diagnostic and interventional medical procedures. In medical institutions, both occupational and medical exposure take place. Since the doses in diagnostic examinations are low, it has been observed that not only are the protection quantities often used in such cases, but they are also extended to estimate the number of cancer deaths due to such practices. One of the striking features of the new ICRP recommendations has been to elaborate the concepts of the dosimetric quantities. The limitations of the protection quantities (the effective dose, E = Σ_{T,R} D_{T,R}·w_T·w_R, and the equivalent dose, H_T = Σ_R D_{T,R}·w_R) have been brought out, and this has raised great concern and initiated debates on the use of these quantities in medical exposures. Consequently, ICRP has set up a task group to provide more details and recommendations. It has, therefore, become important to draw the attention of the medical physics community
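
    For reference, the two protection quantities named above can be written cleanly as a short math block; the one-line numerical example uses the ICRP 103 radiation and tissue weighting factors and is purely illustrative.

    % Equivalent dose to tissue T and effective dose (standard ICRP notation):
    \[
      H_T = \sum_R w_R \, D_{T,R}, \qquad
      E = \sum_T w_T \, H_T = \sum_T w_T \sum_R w_R \, D_{T,R}
    \]
    % Illustrative example: an absorbed lung dose of 2 mGy of photons (w_R = 1)
    % gives H_T = 2 mSv; with the lung tissue weighting factor w_T = 0.12, the
    % contribution to the effective dose E is 0.24 mSv.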

  1. Ecological principles relevant to nuclear war

    International Nuclear Information System (INIS)

    Hutchinson, T.C.; Cropper, W.P. Jr.; Grover, H.D.

    1985-01-01

    The ecological principles outlined are very basic ones; the authors anticipate a readership trained in a broad range of disciplines, including those unfamiliar with the academic discipline of ecology. The authors include substantial discussion on ecophysiology (i.e., the responses of organisms to their environment) because this is relevant to the new understanding of the potential climatic consequences of nuclear war. In particular, the physiological sensitivity of organisms to reduced levels of light and temperature is a key part of the analysis of the potential ecological effects and agricultural effects of nuclear war. Much of the ecological analysis has been organized around major biological units called biomes. The authors describe the biome concept and discuss some of the environmental-climatic factors that are believed to control biome distribution. Emphasis is given to plants because of their controlling influence on ecosystem functions through their role as primary producers. Future reports are needed to address more fully the potential effects on animals. Much more research needs to be done on both plant and animal responses to the types of perturbations possible in the aftermath of a nuclear war. Another important element for analysis of the potential ecological consequences of nuclear war concerns recovery processes. As the post-nuclear war environmental extremes ameliorate, ecological communities in devastated regions would begin to reorganize. It is not possible to predict the course of such a succession precisely, but some principles concerning post-perturbation replacement (such as seed banks and germination), relevant successional patterns, and organism strategies are discussed

  2. Thermochemical data for environmentally-relevant elements

    International Nuclear Information System (INIS)

    Markich, S.J.; Brown, P.L.

    1999-01-01

    This study provides an extensive stability constant (log K) database suitable for calculating the speciation of selected environmentally-relevant elements (H, Na, K, Ca, Mg, Fe, Mn, U, Al, Pb, Zn, Cu and Cd) in an aqueous system, where a model fulvic acid (comprising aspartic, citric, malonic, salicylic and tricarballylic acids) is used to simulate metal binding by dissolved organic material. Stability constants for inorganic metal complexes and minerals were selected primarily from critical literature compilations and/or reviews. In contrast, few critically evaluated data were available for metal complexes with aspartic, citric, malonic, salicylic and tricarballylic acids. Consequently, data from original research articles were carefully evaluated and compiled as part of the study, following defined selection criteria. To meet the objective of compiling a comprehensive and reliable database of stability constants, all relevant equilibria and species, ranging from simple binary metal complexes to more complex ternary and even quaternary metal complexes, were included where possible. In addition to the selection of stability constants from empirical sources, estimates of stability constants were made when this could be done reliably, based on the unified theory of metal ion complexation and/or linear free energy relationships. The stability constants are given as common logarithms (log K) in the form required by the HARPHRQ geochemical code and refer to the standard state, i.e. 298.15 K (25 deg C), 10^5 Pa (1 atm) and, for all species, infinite dilution (ionic strength = 0 mol L^-1). In addition to the compilation of stability constant data, an overview is given of geochemical speciation modelling in aqueous systems and of available conceptual models of metal binding by humic substances. (authors)

  3. Integration of genomic information into sport horse breeding programs for optimization of accuracy of selection.

    Science.gov (United States)

    Haberland, A M; König von Borstel, U; Simianer, H; König, S

    2012-09-01

    Reliable selection criteria are required for young riding horses to increase genetic gain by increasing accuracy of selection and decreasing generation intervals. In this study, selection strategies incorporating genomic breeding values (GEBVs) were evaluated. Relevant stages of selection in sport horse breeding programs were analyzed by applying selection index theory. Results in terms of accuracies of indices (r(TI) ) and relative selection response indicated that information on single nucleotide polymorphism (SNP) genotypes considerably increases the accuracy of breeding values estimated for young horses without own or progeny performance. In a first scenario, the correlation between the breeding value estimated from the SNP genotype and the true breeding value (= accuracy of GEBV) was fixed to a relatively low value of r(mg) = 0.5. For a low heritability trait (h(2) = 0.15), and an index for a young horse based only on information from both parents, additional genomic information doubles r(TI) from 0.27 to 0.54. Including the conventional information source 'own performance' into the before mentioned index, additional SNP information increases r(TI) by 40%. Thus, particularly with regard to traits of low heritability, genomic information can provide a tool for well-founded selection decisions early in life. In a further approach, different sources of breeding values (e.g. GEBV and estimated breeding values (EBVs) from different countries) were combined into an overall index when altering accuracies of EBVs and correlations between traits. In summary, we showed that genomic selection strategies have the potential to contribute to a substantial reduction in generation intervals in horse breeding programs.
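
    The gain reported above can be reproduced approximately with standard selection index theory. In the sketch below, all information sources are standardized, the index weights are b = P^-1 g, and the index accuracy is r_TI = sqrt(g' P^-1 g); the assumption that the parent-average index (accuracy 0.27) and the GEBV (accuracy 0.50) are correlated only through the true breeding value is ours, not the paper's.

    # Selection index accuracy when combining two information sources (illustrative).
    import numpy as np

    def index_accuracy(g, P):
        b = np.linalg.solve(P, g)        # index weights b = P^-1 g
        return float(np.sqrt(g @ b))     # r_TI = sqrt(g' P^-1 g) for standardized sources

    g = np.array([0.27, 0.50])                       # correlations of each source with the true breeding value
    P = np.array([[1.00, 0.27 * 0.50],               # assumed: sources correlated only through
                  [0.27 * 0.50, 1.00]])              # the true breeding value
    print(round(index_accuracy(g, P), 2))            # ~0.54, roughly double the 0.27 of parents alone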

  4. Identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis by using the Delphi Technique

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.

    2018-02-01

    This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially part of a larger research exercise to identify the significance of NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of NDCDB from the technical relevance standpoint. Three statements describing the relevant features of NDCDB for spatial analysis were established after three rounds of consensus building. These statements highlighted the NDCDB's characteristics, such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of NDCDB for spatial analysis in this study, practical applications of NDCDB for various analyses and purposes can be widely implemented.

  5. LEARNER AUTONOMY ON ESSAY WRITING ACCURACY

    Directory of Open Access Journals (Sweden)

    Mohammad Hafidz

    2018-01-01

    Full Text Available Abstract: Learner autonomy in writing is independent teaching and learning that gives students control to explore their knowledge and experiences in written language, and to find and evaluate their errors against the conceptual courses in order to write an accurate simple essay. The aim was to determine the effectiveness of learner autonomy on writing accuracy. This quantitative research was conducted with a one-group pretest and posttest design. The sample comprised 21 students in Bangkalan. The instruments were tests to obtain students' writing scores before and after the treatment. The researcher statistically analyzed the data in SPSS 23 by running a paired samples t-test. The results showed that the mean pretest score was 66.83 and the mean posttest score was 74.57; the paired samples correlation was 0.614 (a strong correlation); and the significance was 0.005, meaning that α (0.05) is greater than the p value (0.005), with a high variance of the mean value (14.091). As a result, the hypothesis (H1) was accepted: learner autonomy contributed effectively to learners organizing their own ideas (Ene, 2006), such as turning a topic map into several explainable sub-topics, writing down main and supporting ideas, clustering objects, and then editing; learners also accumulate selected vocabulary appropriate to the topics (Chengping W, 2008). Keywords: Learner Autonomy, Learning process, outcomes. Abstract: Autonomous learning is independent learning that gives students control to convey their ideas and experiences, and to record and evaluate the errors that occur when writing a simple essay, based on structured learning. The aim was to statistically determine the effectiveness of autonomous learning on writing accuracy. This research was conducted using a pretest and posttest design. The sample consisted of 21 students in Bangkalan. The instruments used were tests to determine students' score results before and after
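
    The analysis described above is a standard paired-samples t-test; the sketch below shows the same kind of comparison with SciPy instead of SPSS, using hypothetical score arrays in place of the 21 students' actual data.

    # Paired pretest/posttest comparison (illustrative data only).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    pretest = rng.normal(66.8, 8.0, size=21)             # hypothetical pretest scores
    posttest = pretest + rng.normal(7.7, 5.0, size=21)   # hypothetical gain after the treatment

    t_stat, p_value = stats.ttest_rel(posttest, pretest)
    r, _ = stats.pearsonr(pretest, posttest)             # paired-samples correlation

    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, r = {r:.2f}")
    # The effect is judged significant when p < alpha (0.05), as in the study.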

  6. Positioning accuracy of the neurotron 1000

    International Nuclear Information System (INIS)

    Cox, Richard S.; Murphy, Martin J.

    1995-01-01

    Purpose: The Neurotron 1000 is a novel treatment machine under development for frameless stereotaxic radiosurgery that consists of a compact X-band accelerator mounted on a robotic arm. The therapy beam is guided to the lesion by an imaging system, which includes two diagnostic x-ray cameras that view the patient during treatment. Patient position and motion are measured by the imaging system and appropriate corrections are communicated in real time to the robotic arm for beam targeting and motion tracking. The three tests reported here measured the pointing accuracy of the therapy beam and the present capability of the imaging guidance system. Materials and Methods: 1) The positioning and pointing test measured the ability of the robotic arm to direct the beam through a test isocenter from arbitrary arm positions. The test isocenter was marked by a small light-sensitive crystal and the beam axis was simulated by a laser. The robot was directed to move the linac to a variety of positions, aiming the laser at the crystal detector from each position. The distance of the beam axis from the crystal was measured for each robot position. 2) The imaging-guidance system was tested by moving phantoms about in the field of view of the camera by precisely known displacements and comparing the guidance system's measurement of the phantom position with its actual position. 3) The system's overall pointing and tracking capability was measured by an end-to-end test using a dosimetric phantom containing radiochromic film. The phantom was imaged by CT to locate the center of the cubical film package. A set of isocentric robotic paths was calculated to produce a spherical dose distribution at the center of the film package. The phantom with the film was then irradiated by the therapy beam as the robot executed the path, with the imaging system providing the beam targeting directions. Results: 1) The positioning and pointing test was performed for paths consisting of spirals and
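
    One simple way to picture the quantity measured in the first test (the distance of the beam axis from the crystal) is the perpendicular distance from a point to a line. The sketch below is illustrative geometry only, not the Neurotron's actual measurement procedure, and the pose numbers are hypothetical.

    # Perpendicular distance from the test isocenter to the beam axis:
    # d = |(p - s) x u|, with s the source position and u the unit beam direction.
    import numpy as np

    def pointing_error(source, direction, isocenter):
        u = np.asarray(direction, float)
        u = u / np.linalg.norm(u)
        return float(np.linalg.norm(np.cross(np.asarray(isocenter) - np.asarray(source), u)))

    source = np.array([0.0, 0.0, 800.0])        # hypothetical: source 800 mm above the isocenter
    direction = np.array([0.001, 0.0, -1.0])    # beam aimed slightly off vertical
    isocenter = np.array([0.0, 0.0, 0.0])
    print(f"{pointing_error(source, direction, isocenter):.2f} mm")   # ~0.80 mm offset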

  7. Self-organizing ontology of biochemically relevant small molecules.

    Science.gov (United States)

    Chepelev, Leonid L; Hastings, Janna; Ennis, Marcus; Steinbeck, Christoph; Dumontier, Michel

    2012-01-06

    The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. We conclude that the proposed methodology can ease the burden of chemical data annotators and
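
    As a toy illustration of structure-based class definitions that self-assemble into a hierarchy and then classify novel entities automatically, the pure-Python sketch below uses crude substring tests on SMILES-like strings. It is not the paper's Semantic Web framework: a real system would use OWL axioms and proper substructure (e.g. SMARTS) matching, and all compounds and class definitions here are hypothetical.

    # Toy self-assembling chemical "ontology": classes are predicates over a structure
    # string; A is inferred to be a subclass of B if every member of A (over a reference
    # compound set) is also a member of B; novel entities are then classified automatically.
    CLASS_DEFINITIONS = {
        "carboxylic acid": lambda s: "C(=O)O" in s,
        "amine":           lambda s: "N" in s,
        "amino acid":      lambda s: "C(=O)O" in s and "N" in s,
    }

    COMPOUNDS = {                      # hypothetical reference compounds (SMILES-like strings)
        "acetic acid": "CC(=O)O",
        "glycine":     "NCC(=O)O",
        "methylamine": "CN",
    }

    def members(cls):
        return {name for name, s in COMPOUNDS.items() if CLASS_DEFINITIONS[cls](s)}

    subclass_of = {
        (a, b)
        for a in CLASS_DEFINITIONS for b in CLASS_DEFINITIONS
        if a != b and members(a) and members(a) <= members(b)
    }
    print(sorted(subclass_of))         # [('amino acid', 'amine'), ('amino acid', 'carboxylic acid')]

    novel = "NC(C)C(=O)O"              # alanine-like toy structure to classify
    print([c for c, test in CLASS_DEFINITIONS.items() if test(novel)])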

  8. Self-organizing ontology of biochemically relevant small molecules

    Science.gov (United States)

    2012-01-01

    Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology can ease the burden of

  9. Self-organizing ontology of biochemically relevant small molecules

    Directory of Open Access Journals (Sweden)

    Chepelev Leonid L

    2012-01-01

    Full Text Available Abstract Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology

  10. Functionally relevant microsatellites in sugarcane unigenes

    Directory of Open Access Journals (Sweden)

    Singh Nagendra K

    2010-11-01

    Full Text Available Abstract Background Unigene sequences constitute a rich source of functionally relevant microsatellites. The present study was undertaken to mine the microsatellites in the available unigene sequences of sugarcane for understanding their constitution in the expressed genic component of its complex polyploid/aneuploid genome, assessing their functional significance in silico, determining the extent of allelic diversity at the microsatellite loci and for evaluating their utility in large-scale genotyping applications in sugarcane. Results The average frequency of perfect microsatellite was 1/10.9 kb, while it was 1/44.3 kb for the long and hypervariable class I repeats. GC-rich trinucleotides coding for alanine and the GA-rich dinucleotides were the most abundant microsatellite classes. Out of 15,594 unigenes mined in the study, 767 contained microsatellite repeats and for 672 of these putative functions were determined in silico. The microsatellite repeats were found in the functional domains of proteins encoded by 364 unigenes. Its significance was assessed by establishing the structure-function relationship for the beta-amylase and protein kinase encoding unigenes having repeats in the catalytic domains. A total of 726 allelic variants (7.42 alleles per locus with different repeat lengths were captured precisely for a set of 47 fluorescent dye labeled primers in 36 sugarcane genotypes and five cereal species using the automated fragment analysis system, which suggested the utility of designed primers for rapid, large-scale and high-throughput genotyping applications in sugarcane. Pair-wise similarity ranging from 0.33 to 0.84 with an average of 0.40 revealed a broad genetic base of the Indian varieties in respect of functionally relevant regions of the large and complex sugarcane genome. Conclusion Microsatellite repeats were present in 4.92% of sugarcane unigenes, for most (87.6% of which functions were determined in silico. High level of
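
    A minimal way to see how perfect microsatellites can be mined from unigene sequences is a backreference regular expression; the sketch below (not the study's pipeline) finds 2-3 bp motifs repeated at least five times in a hypothetical fragment and reports a per-kb frequency.

    # Toy perfect-microsatellite (SSR) miner: motif of 2-3 bp repeated >= 5 times.
    import re

    SSR_PATTERN = re.compile(r"(([ACGT]{2,3})\2{4,})")

    def find_ssrs(sequence):
        return [(m.start(), m.group(2), len(m.group(1)) // len(m.group(2)))
                for m in SSR_PATTERN.finditer(sequence)]

    unigene = "ATGCCGAGAGAGAGAGATTTGCCGCCGCCGCCGCCTAA"   # hypothetical fragment with (GA)6 and (GCC)5
    ssrs = find_ssrs(unigene)
    print(ssrs)                                          # [(5, 'GA', 6), (20, 'GCC', 5)]
    print(len(ssrs) / (len(unigene) / 1000), "SSRs per kb in this toy fragment")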

  11. Environmental biodosimetry: a biologically relevant tool for ecological risk assessment and biomonitoring

    Energy Technology Data Exchange (ETDEWEB)

    Ulsh, B. E-mail: ulshb@mcmaster.ca; Hinton, T.G.; Congdon, J.D.; Dugan, L.C.; Whicker, F.W.; Bedford, J.S

    2003-07-01

    Biodosimetry, the estimation of received doses by determining the frequency of radiation-induced chromosome aberrations, is widely applied in humans acutely exposed as a result of accidents or for clinical purposes, but biodosimetric techniques have not been utilized in organisms chronically exposed to radionuclides in contaminated environments. The application of biodosimetry to environmental exposure scenarios could greatly improve the accuracy, and reduce the uncertainties, of ecological risk assessments and biomonitoring studies, because no assumptions are required regarding external exposure rates and the movement of organisms into and out of contaminated areas. Furthermore, unlike residue analyses of environmental media, environmental biodosimetry provides a genetically relevant biomarker of cumulative lifetime exposure. Symmetrical chromosome translocations can impact reproductive success, and could therefore prove to be ecologically relevant as well. We describe our experience in studying aberrations in the yellow-bellied slider turtle as an example of environmental biodosimetry.

  12. Attitude importance and the accumulation of attitude-relevant knowledge in memory.

    Science.gov (United States)

    Holbrook, Allyson L; Berent, Matthew K; Krosnick, Jon A; Visser, Penny S; Boninger, David S

    2005-05-01

    People who attach personal importance to an attitude are especially knowledgeable about the attitude object. This article tests an explanation for this relation: that importance causes the accumulation of knowledge by inspiring selective exposure to and selective elaboration of relevant information. Nine studies showed that (a) after watching televised debates between presidential candidates, viewers were better able to remember the statements made on policy issues on which they had more personally important attitudes; (b) importance motivated selective exposure and selective elaboration: Greater personal importance was associated with better memory for relevant information encountered under controlled laboratory conditions, and manipulations eliminating opportunities for selective exposure and selective elaboration eliminated the importance-memory accuracy relation; and (c) people do not use perceptions of their knowledge volume to infer how important an attitude is to them, but importance does cause knowledge accumulation.

  13. Relevance of nonlinear effects of uncertainties in the input data on the calculational results

    International Nuclear Information System (INIS)

    Carvalho da Silva, F.; D'Angelo, A.; Gandini, A.; Rado, V.

    1982-01-01

    The second order sensitivity analysis relevant to neutron activations at the end of Fe and Na blocks shows that the discrepancy between the values obtained from the direct calculation and those which take into account the inaccuracy of the input data (average values) can be significant in cases of interest. It has been observed that, for a threshold detector response after a penetration larger than 50 cm in Fe blocks and 100 cm in Na blocks, the magnitude of this discrepancy (from 50% up to 100% of standard deviation) leads to the necessity of improving the existing accuracy of the inelastic cross-sections of Fe and Na. Moreover, the above discrepancy has been evaluated in terms of project parameters relevant to a power fast fission reactor, in particular, the Fe-displacement rate in the Fe/Na shield region and the Na-activation rate in the heat exchanger. (author)

  14. Early onset depression: the relevance of anxiety.

    Science.gov (United States)

    Parker, G; Wilhelm, K; Asghari, A

    1997-01-01

    The aim of this study was to determine risk factors that may differentiate early onset from late onset depression. A non-clinical cohort that had been assessed from 1978 to 1993 at 5-yearly intervals and that had a high prevalence rate of lifetime depression took part in the study. We established an appropriate age cut-off to distinguish early onset (i.e. before 26 years) of major and of minor depression, and examined the relevance of a number of possible determinants of early onset depression assessed over the life of the study. Despite several dimensional measures of depression, self-esteem and personality being considered, they generally failed (when assessed early in the study) to discriminate subsequent early onset depression, with the exception of low masculinity scores being a weak predictor of major and/or minor depression. Early onset depression was strongly predicted, however, by a lifetime episode of a major anxiety disorder, with generalised anxiety being a somewhat stronger and more consistent predictor than panic disorder, agoraphobia and minor anxiety disorders (i.e. social phobia, simple phobia). The possibility that anxiety may act as a key predispositional factor to early onset depression and to a greater number of depressive episodes is important in that clinical assessment and treatment of any existing anxiety disorder may be a more efficient and useful strategy than focussing primarily on the depressive disorder.

  15. Top studies relevant to primary care practice.

    Science.gov (United States)

    Perry, Danielle; Kolber, Michael R; Korownyk, Christina; Lindblad, Adrienne J; Ramji, Jamil; Ton, Joey; Allan, G Michael

    2018-04-01

    To summarize 10 high-quality studies from 2017 that have strong relevance to primary care practice. Study selection involved routine literature surveillance by a group of primary care health professionals. This included screening abstracts of important journals and Evidence Alerts, as well as searching the American College of Physicians Journal Club. Topics of the 2017 articles include whether treating subclinical hypothyroidism improves outcomes or symptoms; whether evolocumab reduces cardiovascular disease as well as low-density lipoprotein levels; whether lifestyle interventions reduce medication use in patients with diabetes; whether vitamin D prevents cardiovascular disease, cancer, or upper respiratory tract infections; whether canagliflozin reduces clinical events in patients with diabetes; how corticosteroid injections affect knee osteoarthritis; whether drained abscesses benefit from antibiotic treatment; whether patients with diabetes benefit from bariatric surgery; whether exenatide reduces clinical events in patients with diabetes; and whether tympanostomy tubes affect outcomes in recurrent acute otitis media or chronic otitis media. We provide brief summaries, context where needed, and final recommendations for 10 studies with potential effects on primary care. We also briefly review 5 "runner-up" studies. Research from 2017 produced several high-quality studies in diabetes management. These have demonstrated benefit for alternative therapies and offered evidence not previously available. This year's selection of studies also provided information on a variety of conditions and therapies that are, or might become, more common in primary care settings. Copyright© the College of Family Physicians of Canada.

  16. Clinically Relevant Anticancer Polymer Paclitaxel Therapeutics

    Directory of Open Access Journals (Sweden)

    Danbo Yang

    2010-12-01

    Full Text Available The concept of utilizing polymers in drug delivery has been extensively explored for improving the therapeutic index of small molecule drugs. In general, polymers can be used as polymer-drug conjugates or polymeric micelles. Each unique application mandates its own chemistry and controlled release of active drugs. Each polymer exhibits its own intrinsic issues providing the advantage of flexibility. However, none have as yet been approved by the U.S. Food and Drug Administration. General aspects of polymer and nano-particle therapeutics have been reviewed. Here we focus this review on specific clinically relevant anticancer polymer paclitaxel therapeutics. We emphasize their chemistry and formulation, in vitro activity on some human cancer cell lines, plasma pharmacokinetics and tumor accumulation, in vivo efficacy, and clinical outcomes. Furthermore, we include a short review of our recent developments of a novel poly(L-γ-glutamylglutamine)-paclitaxel nano-conjugate (PGG-PTX). PGG-PTX has its own unique property of forming nano-particles. It has also been shown to possess a favorable profile of pharmacokinetics and to exhibit efficacious potency. This review might shed light on designing new and better polymer paclitaxel therapeutics for potential anticancer applications in the clinic.

  17. Ligand Exchange Kinetics of Environmentally Relevant Metals

    Energy Technology Data Exchange (ETDEWEB)

    Panasci, Adele Frances [Univ. of California, Davis, CA (United States)

    2014-07-15

    The interactions of ground water with minerals and contaminants are of broad interest for geochemists but are not well understood. Experiments on the molecular scale can determine reaction parameters (i.e. rates of ligand exchange, activation enthalpy, activation entropy, and activation volume) that can be used in computations to gain insight into reactions that occur in natural groundwaters. Experiments to determine the rate of isotopic ligand exchange for three environmentally relevant metals, rhodium (Rh), iron (Fe), and neptunium (Np), are described. Many environmental transformations of metals (e.g. reduction) in soil occur at trivalent centers, Fe(III) in particular. Contaminant ions adsorb to mineral surfaces via ligand exchange, and the reversal of this reaction can be dangerous, releasing contaminants into the environment. Ferric iron is difficult to study spectroscopically because most of its complexes are paramagnetic and are generally reactive toward ligand exchange; therefore, Rh(III), which is diamagnetic and less reactive, was used to study substitution reactions that are analogous to those that occur on mineral oxide surfaces. Studies on both Np(V) and Np(VI) are important in their own right, as 237Np is a radioactive transuranic element with a half-life of 2 million years.

  18. Relevance of extracellular DNA in rhizosphere

    Science.gov (United States)

    Pietramellara, Giacomo; Ascher, Judith; Baraniya, Divyashri; Arfaioli, Paola; Ceccherini, Maria Teresa; Hawes, Martha

    2013-04-01

    One of the most promising areas for future development is the manipulation of the rhizosphere to produce sustainable and efficient agricultural production systems. Using omics approaches to define the distinctive features of eDNA systems and structures will facilitate progress in rhizo-enforcement and biocontrol studies. The relevance of these studies becomes clear when we consider the plethora of ecological functions in which eDNA is involved. This fraction can be actively extruded by living cells or discharged during cellular lysis and may exert a key role in the stability and variability of the soil bacterial genome; it is also a source of nitrogen and phosphorus for plants, owing to the roots' capacity to directly take up short DNA fragments. The adhesive properties of the DNA molecule confer to eDNA the capacity to inhibit or kill pathogenic bacteria by inducing cation limitation, and to facilitate the formation of biofilm and extracellular traps (ETs), which may protect microorganisms inhabiting biofilm and plant roots against pathogens and allelopathic substances. The ETs are actively extruded by root border cells when they are dispersed in the rhizosphere, conferring to plants the capacity to extend an endogenous pathogen defence system outside the organism. Moreover, eDNA could be involved in rhizoremediation in heavy metal polluted soil, acting as a bioflotation reagent.

  19. Relevance of tidal heating on large TNOs

    Science.gov (United States)

    Saxena, Prabal; Renaud, Joe P.; Henning, Wade G.; Jutzi, Martin; Hurford, Terry

    2018-03-01

    We examine the relevance of tidal heating for large Trans-Neptunian Objects, with a focus on its potential to melt and maintain layers of subsurface liquid water. Depending on their past orbital evolution, tidal heating may be an important part of the heat budget for a number of discovered and hypothetical TNO systems and may enable formation of, and increased access to, subsurface liquid water. Tidal heating induced by the process of despinning is found to be particularly able to compete with heating due to radionuclide decay in a number of different scenarios. In cases where radiogenic heating alone may establish subsurface conditions for liquid water, we focus on the extent to which tidal activity lifts the depth of such conditions closer to the surface. While it is common for strong tidal heating and long lived tides to be mutually exclusive, we find this is not always the case, and highlight when these two traits occur together. We find cases where TNO systems experience tidal heating that is a significant proportion of, or greater than, radiogenic heating for periods ranging from hundreds of millions to a billion years. For subsurface oceans that contain a small antifreeze component, tidal heating due to very high initial spin states may enable liquid water to be preserved right up to the present day. Of particular interest is the Eris-Dysnomia system, which in those cases may exhibit extant cryovolcanism.

  20. Extracellular vesicles: fundamentals and clinical relevance

    Directory of Open Access Journals (Sweden)

    Wael Nassar

    2015-01-01

    Full Text Available All types of cells of eukaryotic organisms produce and release small nanovesicles into their extracellular environment. Early studies have described these vesicles as 'garbage bags' only to remove obsolete cellular molecules. Valadi and colleagues, in 2007, were the first to discover the capability of circulating extracellular vesicles (EVs) to horizontally transfer functioning gene information between cells. These extracellular vesicles express components responsible for angiogenesis promotion, stromal remodeling, chemoresistance, genetic exchange, and signaling pathway activation through growth factor/receptor transfer. EVs represent an important mode of intercellular communication by serving as vehicles for transfer between cells of membrane and cytosolic proteins, lipids, signaling proteins, and RNAs. They contribute to physiology and pathology, and they have a myriad of potential clinical applications in health and disease. Moreover, vesicles can pass the blood-brain barrier and may perhaps even be considered as naturally occurring liposomes. These cell-derived EVs not only represent a central mediator of the disease microenvironment, but their presence in the peripheral circulation may serve as a surrogate for disease biopsies, enabling real-time diagnosis and disease monitoring. In this review, we'll be addressing the characteristics of different types of extracellular EVs, as well as their clinical relevance and potential as diagnostic markers, and also define therapeutic options.

  1. Human error theory: relevance to nurse management.

    Science.gov (United States)

    Armitage, Gerry

    2009-03-01

    Describe, discuss and critically appraise human error theory and consider its relevance for nurse managers. Healthcare errors are a persistent threat to patient safety. Effective risk management and clinical governance depends on understanding the nature of error. This paper draws upon a wide literature from published works, largely from the field of cognitive psychology and human factors. Although the content of this paper is pertinent to any healthcare professional; it is written primarily for nurse managers. Error is inevitable. Causation is often attributed to individuals, yet causation in complex environments such as healthcare is predominantly multi-factorial. Individual performance is affected by the tendency to develop prepacked solutions and attention deficits, which can in turn be related to local conditions and systems or latent failures. Blame is often inappropriate. Defences should be constructed in the light of these considerations and to promote error wisdom and organizational resilience. Managing and learning from error is seen as a priority in the British National Health Service (NHS), this can be better achieved with an understanding of the roots, nature and consequences of error. Such an understanding can provide a helpful framework for a range of risk management activities.

  2. Clinically Relevant Anticancer Polymer Paclitaxel Therapeutics

    International Nuclear Information System (INIS)

    Yang, Danbo; Yu, Lei; Van, Sang

    2010-01-01

    The concept of utilizing polymers in drug delivery has been extensively explored for improving the therapeutic index of small molecule drugs. In general, polymers can be used as polymer-drug conjugates or polymeric micelles. Each unique application mandates its own chemistry and controlled release of active drugs. Each polymer exhibits its own intrinsic issues providing the advantage of flexibility. However, none have as yet been approved by the U.S. Food and Drug Administration. General aspects of polymer and nano-particle therapeutics have been reviewed. Here we focus this review on specific clinically relevant anticancer polymer paclitaxel therapeutics. We emphasize their chemistry and formulation, in vitro activity on some human cancer cell lines, plasma pharmacokinetics and tumor accumulation, in vivo efficacy, and clinical outcomes. Furthermore, we include a short review of our recent developments of a novel poly(l-γ-glutamylglutamine)-paclitaxel nano-conjugate (PGG-PTX). PGG-PTX has its own unique property of forming nano-particles. It has also been shown to possess a favorable profile of pharmacokinetics and to exhibit efficacious potency. This review might shed light on designing new and better polymer paclitaxel therapeutics for potential anticancer applications in the clinic

  3. Accreditation - Its relevance for laboratories measuring radionuclides

    Energy Technology Data Exchange (ETDEWEB)

    Palsson, S E [Icelandic Radiation Protection Inst. (Iceland)

    2001-11-01

    Accreditation is an internationally recognised way for laboratories to demonstrate their competence. Obtaining and maintaining accreditation is, however, a costly and time-consuming procedure. The benefits of accreditation also depend on the role of the laboratory. Accreditation may be of limited relevance for a research laboratory, but essential for a laboratory associated with a national authority and issuing, for example, certificates. This report describes work done within the NKSBOK-1.1 sub-project on introducing accreditation to Nordic laboratories measuring radionuclides. Initially the focus was on the new standard ISO/IEC 17025, which was just in draft form at the time, but which now provides a new framework for accreditation of laboratories. Later the focus was widened to include a general introduction to accreditation and to provide, through seminars, a forum for exchanging views on the experience laboratories have had in this field. Copies of overheads from the last such seminar are included in the appendix to this report. (au)

  4. Perspective: Organizational professionalism: relevant competencies and behaviors.

    Science.gov (United States)

    Egener, Barry; McDonald, Walter; Rosof, Bernard; Gullen, David

    2012-05-01

    The professionalism behaviors of physicians have been extensively discussed and defined; however, the professionalism behaviors of health care organizations have not been systemically categorized or described. Defining organizational professionalism is important because the behaviors of a health care organization may substantially impact the behaviors of physicians and others within the organization as well as other institutions and the larger community. In this article, the authors discuss the following competencies of organizational professionalism, derived from ethical values: service, respect, fairness, integrity, accountability, mindfulness, and self-motivation. How nonprofit health care organizations can translate these competencies into behaviors is described. For example, incorporating metrics of population health into assessments of corporate success may increase collaboration among regional health care organizations while also benefiting the community. The unique responsibilities of leadership to model these competencies, promote them in the community, and develop relevant organizational strategies are clarified. These obligations elevate the importance of the executive leadership's capacity for self-reflection and the governing boards' responsibility for mapping operational activities to organizational mission. Lastly, the authors consider how medical organizations are currently addressing professionalism challenges. In an environment made turbulent by regulatory change and financial constraints, achieving proficiency in professionalism competencies can assist nonprofit health care organizations to promote population health and the well-being of their workforces.

  5. Media and mental illness: Relevance to India

    Directory of Open Access Journals (Sweden)

    S K Padhy

    2014-01-01

    Full Text Available Media has a complex interrelationship with mental illnesses. This narrative review takes a look at the various ways in which media and mental illnesses interact. Relevant scientific literature and electronic databases were searched, including PubMed and Google Scholar, to identify studies, viewpoints and recommendations using keywords related to media and mental illnesses. This review discusses both the positive and the negative portrayals of mental illnesses through the media. The portrayal of mental health professionals and psychiatric treatment is also discussed. The theories explaining how media influences attitudes and behavior are also discussed. Media has also been suggested to be a risk factor for the genesis or exacerbation of mental illnesses like eating disorders and substance use disorders. The potential use of media to understand the psychopathology and plight of those with psychiatric disorders is referred to. The manner in which media can be used as a tool for change to reduce the stigma surrounding mental illnesses is explored.

  6. EXTRACELLULAR VESICLES: CLASSIFICATION, FUNCTIONS AND CLINICAL RELEVANCE

    Directory of Open Access Journals (Sweden)

    A. V. Oberemko

    2014-12-01

    Full Text Available This review presents a generalized definition of vesicles as bilayer extracellular organelles of all cellular forms of life: not only eukaryotic, but also prokaryotic. The structure and composition of extracellular vesicles, the history of their research, their nomenclature, and their impact on life processes in health and disease are discussed. Moreover, vesicles may be useful as clinical instruments for biomarkers, and they are promising as biotechnological drugs. However, many questions in this area are still unresolved and need to be addressed in the future. From the point of view of practical health care, the most interesting direction is to study the effect of exosomes and microvesicles on the development and progression of a particular disease, and the possibility of modulating the pathological process by means of extracellular vesicles of a particular type acting as an active ingredient. Also relevant is the further elucidation of the role and importance of exosomes for the surrounding cells, tissues and organs at the molecular level, and of the prospects for the use of non-cellular vesicles as biomarkers of disease.

  7. Clinically Relevant Anticancer Polymer Paclitaxel Therapeutics

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Danbo [Biomedical Engineering and Technology Institute, Institutes for Advanced Interdisciplinary Research, East China Normal University, 3663 North Zhongshan Road, Shanghai, 200062 (China); Yu, Lei, E-mail: yu-lei@gg.nitto.co.jp [Biomedical Engineering and Technology Institute, Institutes for Advanced Interdisciplinary Research, East China Normal University, 3663 North Zhongshan Road, Shanghai, 200062 (China); Biomedical Group, Nitto Denko Technical Corporation, 501 Via Del Monte, Oceanside, CA 92058 (United States); Van, Sang [Biomedical Group, Nitto Denko Technical Corporation, 501 Via Del Monte, Oceanside, CA 92058 (United States)

    2010-12-23

    The concept of utilizing polymers in drug delivery has been extensively explored for improving the therapeutic index of small molecule drugs. In general, polymers can be used as polymer-drug conjugates or polymeric micelles. Each unique application mandates its own chemistry and controlled release of active drugs. Each polymer exhibits its own intrinsic issues providing the advantage of flexibility. However, none have as yet been approved by the U.S. Food and Drug Administration. General aspects of polymer and nano-particle therapeutics have been reviewed. Here we focus this review on specific clinically relevant anticancer polymer paclitaxel therapeutics. We emphasize their chemistry and formulation, in vitro activity on some human cancer cell lines, plasma pharmacokinetics and tumor accumulation, in vivo efficacy, and clinical outcomes. Furthermore, we include a short review of our recent developments of a novel poly(l-γ-glutamylglutamine)-paclitaxel nano-conjugate (PGG-PTX). PGG-PTX has its own unique property of forming nano-particles. It has also been shown to possess a favorable profile of pharmacokinetics and to exhibit efficacious potency. This review might shed light on designing new and better polymer paclitaxel therapeutics for potential anticancer applications in the clinic.

  8. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    Science.gov (United States)

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  9. Precision and accuracy of mechanistic-empirical pavement design

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-09-01

    Full Text Available are discussed in general. The effects of variability and error on the design accuracy and design risk are lastly illustrated at the hand of a simple mechanistic-empirical design problem, showing that the engineering models alone determine the accuracy...

  10. An Investigation to Improve Classifier Accuracy for Myo Collected Data

    Science.gov (United States)

    2017-02-01

    Report front matter (table of contents and figure list) excerpt: sections cover the effect of bad samples on classification accuracy for the Naïve Bayes (NB), Logistic Model Tree (LMT) and K-Nearest Neighbor classifiers; the listed figures (e.g., Fig. A-2) show the "come" gesture, pitch feature, for users 06 and 14, with all samples exhibiting reversed movement.

  11. Coorientational Accuracy and Differentiation in the Management of Conflict.

    Science.gov (United States)

    Papa, Michael J.; Pood, Elliott A.

    1988-01-01

    Investigates the relationship between coorientational accuracy and differentiation time and two dimensions of conflict (interaction satisfaction and assertiveness of influence strategies). Suggests that entering a conflict with high coorientational accuracy leads to less differentiation and fewer assertive strategies during the confrontation and…

  12. Assessment Of Accuracies Of Remote-Sensing Maps

    Science.gov (United States)

    Card, Don H.; Strong, Laurence L.

    1992-01-01

    Report describes study of accuracies of classifications of picture elements in map derived by digital processing of Landsat-multispectral-scanner imagery of coastal plain of Arctic National Wildlife Refuge. Accuracies of portions of map analyzed with help of statistical sampling procedure called "stratified plurality sampling", in which all picture elements in given cluster classified in stratum to which plurality of them belong.

  13. English Verb Accuracy of Bilingual Cantonese-English Preschoolers

    Science.gov (United States)

    Rezzonico, Stefano; Goldberg, Ahuva; Milburn, Trelani; Belletti, Adriana; Girolametto, Luigi

    2017-01-01

    Purpose: Knowledge of verb development in typically developing bilingual preschoolers may inform clinicians about verb accuracy rates during the 1st 2 years of English instruction. This study aimed to investigate tensed verb accuracy in 2 assessment contexts in 4- and 5-year-old Cantonese-English bilingual preschoolers. Method: The sample included…

  14. Accuracy assessment of airborne laser scanning strips using planar features

    NARCIS (Netherlands)

    Soudarissanane, S.S.; Van der Sande, C.J.; Khoshelham, K.

    2010-01-01

    Airborne Laser Scanning (ALS) is widely used in many applications for its high measurement accuracy, fast acquisition capability, and large spatial coverage. Accuracy assessment of the ALS data usually relies on comparing corresponding tie elements, often points or lines, in the overlapping strips.

  15. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Full Text Available Research on forecasting is effectively limited to forecasts that are expressed with clarity, which is to say that the forecasted event must be sufficiently well-defined so that it can be clearly resolved whether or not the event occurred, and that forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, then quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately most real world forecasts are not expressed clearly. This lack of clarity extends to both the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real world forecasts, and consequently the accuracy of the methods used to generate real world forecasts. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
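    As a hedged illustration of the quantitative metrics the abstract refers to (scoring rules and calibration), the sketch below computes a Brier score and a simple calibration table for a set of probabilistic forecasts; it is a generic example with invented numbers, not the paper's Inferred Probability Method.

```python
# Illustrative only: a generic Brier score and calibration table for
# probabilistic forecasts, not the paper's Inferred Probability Method.
from collections import defaultdict

def brier_score(probabilities, outcomes):
    """Mean squared difference between forecast probability and outcome (0 or 1)."""
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

def calibration_table(probabilities, outcomes, n_bins=5):
    """Group forecasts into equal-width probability bins and compare the mean
    forecast probability in each bin with the observed event frequency."""
    bins = defaultdict(list)
    for p, o in zip(probabilities, outcomes):
        idx = min(int(p * n_bins), n_bins - 1)      # p == 1.0 goes to the last bin
        bins[idx].append((p, o))
    rows = []
    for idx in sorted(bins):
        ps, os_ = zip(*bins[idx])
        rows.append((idx / n_bins, (idx + 1) / n_bins,
                     sum(ps) / len(ps), sum(os_) / len(os_), len(ps)))
    return rows

# Hypothetical forecast probabilities and outcomes (1 = the event occurred).
probs = [0.9, 0.7, 0.3, 0.8, 0.1, 0.6]
obs = [1, 1, 0, 1, 0, 0]
print("Brier score:", round(brier_score(probs, obs), 3))
for lo, hi, mean_p, freq, n in calibration_table(probs, obs):
    print(f"bin [{lo:.1f}, {hi:.1f}): forecast {mean_p:.2f}, observed {freq:.2f}, n={n}")
```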

  16. 12 CFR 740.2 - Accuracy of advertising.

    Science.gov (United States)

    2010-01-01

    Title 12 (Banks and Banking), Advertising and Notice of Insured Status, § 740.2 Accuracy of advertising (2010 edition): No insured credit union may use any advertising (which includes print, electronic, or broadcast media, displays and signs, stationery, and other...

  17. Does a Structured Data Collection Form Improve The Accuracy of ...

    African Journals Online (AJOL)

    and multiple etiologies for similar presentation. Standardized forms may harmonize the initial assessment, improve accuracy of diagnosis and enhance outcomes. Objectives: To determine the extent to which use of a structured data collection form (SDCF) affected the diagnostic accuracy of AAP. Methodology: A before and ...

  18. Quantifying the Accuracy of a Diagnostic Test or Marker

    NARCIS (Netherlands)

    Linnet, Kristian; Bossuyt, Patrick M. M.; Moons, Karel G. M.; Reitsma, Johannes B. R.

    2012-01-01

    BACKGROUND: In recent years, increasing focus has been directed to the methodology for evaluating (new) tests or biomarkers. A key step in the evaluation of a diagnostic test is the investigation into its accuracy. CONTENT: We reviewed the literature on how to assess the accuracy of diagnostic

  19. Concept Mapping Improves Metacomprehension Accuracy among 7th Graders

    Science.gov (United States)

    Redford, Joshua S.; Thiede, Keith W.; Wiley, Jennifer; Griffin, Thomas D.

    2012-01-01

    Two experiments explored concept map construction as a useful intervention to improve metacomprehension accuracy among 7th grade students. In the first experiment, metacomprehension was marginally better for a concept mapping group than for a rereading group. In the second experiment, metacomprehension accuracy was significantly greater for a…

  20. Diagnostic accuracy of postmortem imaging vs autopsy—A systematic review

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, Anders, E-mail: anders.eriksson@rmv.se [Section of Forensic Medicine, Dept of Community Medicine and Rehabilitation, Umeå University, PO Box 7016, SE-907 12 Umeå (Sweden); Gustafsson, Torfinn [Section of Forensic Medicine, Dept of Community Medicine and Rehabilitation, Umeå University, PO Box 7016, SE-907 12 Umeå (Sweden); Höistad, Malin; Hultcrantz, Monica [Swedish Agency for Health Technology Assessment and Assessment of Social Services, PO Box 3657, SE-103 59 Stockholm (Sweden); Department of Learning, Informatics, Management and Ethics, Karolinska Institutet, SE-171 77 Stockholm (Sweden); Jacobson, Stella; Mejare, Ingegerd [Swedish Agency for Health Technology Assessment and Assessment of Social Services, PO Box 3657, SE-103 59 Stockholm (Sweden); Persson, Anders [Department of Medical and Health Sciences, Center for Medical Image Science and Visualization (CMIV), Linköping University, SE-581 85, Linköping Sweden (Sweden)

    2017-04-15

    Highlights: • The search generated 340 possibly relevant publications, of which 49 were assessed as having high risk of bias and 22 as moderate risk. • Due to considerable heterogeneity of included studies it was impossible to estimate the diagnostic accuracy of the various findings. • Future studies need larger materials and improved planning and methodological quality, preferentially from multi-center studies. - Abstract: Background: Postmortem imaging has been used for more than a century as a complement to medico-legal autopsies. The technique has also emerged as a possible alternative to compensate for the continuous decline in the number of clinical autopsies. To evaluate the diagnostic accuracy of postmortem imaging for various types of findings, we performed this systematic literature review. Data sources: The literature search was performed in the databases PubMed, Embase and Cochrane Library through January 7, 2015. Relevant publications were assessed for risk of bias using the QUADAS tool and were classified as low, moderate or high risk of bias according to pre-defined criteria. Autopsy and/or histopathology were used as reference standard. Findings: The search generated 2600 abstracts, of which 340 were assessed as possibly relevant and read in full-text. After further evaluation 71 studies were finally included, of which 49 were assessed as having high risk of bias and 22 as moderate risk of bias. Due to considerable heterogeneity – in populations, techniques, analyses and reporting – of included studies it was impossible to combine data to get a summary estimate of the diagnostic accuracy of the various findings. Individual studies indicate, however, that imaging techniques might be useful for determining organ weights, and that the techniques seem superior to autopsy for detecting gas. Conclusions and Implications: In general, based on the current scientific literature, it was not possible to determine the diagnostic accuracy of postmortem

  1. Diagnostic accuracy of postmortem imaging vs autopsy—A systematic review

    International Nuclear Information System (INIS)

    Eriksson, Anders; Gustafsson, Torfinn; Höistad, Malin; Hultcrantz, Monica; Jacobson, Stella; Mejare, Ingegerd; Persson, Anders

    2017-01-01

    Highlights: • The search generated 340 possibly relevant publications, of which 49 were assessed as having high risk of bias and 22 as moderate risk. • Due to considerable heterogeneity of included studies it was impossible to estimate the diagnostic accuracy of the various findings. • Future studies need larger materials and improved planning and methodological quality, preferentially from multi-center studies. - Abstract: Background: Postmortem imaging has been used for more than a century as a complement to medico-legal autopsies. The technique has also emerged as a possible alternative to compensate for the continuous decline in the number of clinical autopsies. To evaluate the diagnostic accuracy of postmortem imaging for various types of findings, we performed this systematic literature review. Data sources: The literature search was performed in the databases PubMed, Embase and Cochrane Library through January 7, 2015. Relevant publications were assessed for risk of bias using the QUADAS tool and were classified as low, moderate or high risk of bias according to pre-defined criteria. Autopsy and/or histopathology were used as reference standard. Findings: The search generated 2600 abstracts, of which 340 were assessed as possibly relevant and read in full-text. After further evaluation 71 studies were finally included, of which 49 were assessed as having high risk of bias and 22 as moderate risk of bias. Due to considerable heterogeneity – in populations, techniques, analyses and reporting – of included studies it was impossible to combine data to get a summary estimate of the diagnostic accuracy of the various findings. Individual studies indicate, however, that imaging techniques might be useful for determining organ weights, and that the techniques seem superior to autopsy for detecting gas. Conclusions and Implications: In general, based on the current scientific literature, it was not possible to determine the diagnostic accuracy of postmortem

  2. High accuracy autonomous navigation using the global positioning system (GPS)

    Science.gov (United States)

    Truong, Son H.; Hart, Roger C.; Shoan, Wendy C.; Wood, Terri; Long, Anne C.; Oza, Dipak H.; Lee, Taesul

    1997-01-01

    The application of global positioning system (GPS) technology to the improvement of the accuracy and economy of spacecraft navigation is reported. High-accuracy autonomous navigation algorithms are currently being qualified in conjunction with the GPS attitude determination flyer (GADFLY) experiment for the small satellite technology initiative Lewis spacecraft. Preflight performance assessments indicated that these algorithms are able to provide a real time total position accuracy of better than 10 m and a velocity accuracy of better than 0.01 m/s, with selective availability at typical levels. It is expected that the position accuracy will be increased to 2 m if corrections are provided by the GPS wide area augmentation system.

  3. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    Full Text Available This study investigates automated data accuracy assessment as described in data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry for a 10-year publication period retrieved from the two bibliometric data sources, Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard) and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects truer accuracy scores. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method of bibliographic data accuracy as well as defining the impact of data accuracy on the citation matching process.
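    The distinction drawn above between per-field and per-record accuracy can be made concrete with a small sketch; the field names and records below are hypothetical, and the study's actual matching rules for Web of Science and Scopus records are more detailed.

```python
# A minimal sketch of field-level vs record-level accuracy scoring for
# bibliographic records; field names and record contents are hypothetical.

def field_accuracy(records, gold, fields):
    """Share of records whose value matches the gold standard, per field."""
    return {f: sum(r.get(f) == g.get(f) for r, g in zip(records, gold)) / len(records)
            for f in fields}

def record_accuracy(records, gold, fields):
    """Share of records in which *every* field matches the gold standard."""
    return sum(all(r.get(f) == g.get(f) for f in fields)
               for r, g in zip(records, gold)) / len(records)

fields = ["title", "year", "volume", "pages"]          # hypothetical fields
gold = [{"title": "a", "year": 2005, "volume": 12, "pages": "1-9"},
        {"title": "b", "year": 2007, "volume": 30, "pages": "55-60"}]
retrieved = [{"title": "a", "year": 2005, "volume": 12, "pages": "1-9"},
             {"title": "b", "year": 2007, "volume": 31, "pages": "55-60"}]

print(field_accuracy(retrieved, gold, fields))   # per-field scores are higher ...
print(record_accuracy(retrieved, gold, fields))  # ... than the all-fields-correct score
```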

  4. Accuracy optimization with wavelength tunability in overlay imaging technology

    Science.gov (United States)

    Lee, Honggoo; Kang, Yoonshik; Han, Sangjoon; Shim, Kyuchan; Hong, Minhyung; Kim, Seungyoung; Lee, Jieun; Lee, Dongyoung; Oh, Eungryong; Choi, Ahlin; Kim, Youngsik; Marciano, Tal; Klein, Dana; Hajaj, Eitan M.; Aharon, Sharon; Ben-Dov, Guy; Lilach, Saltoun; Serero, Dan; Golotsvan, Anna

    2018-03-01

    As semiconductor manufacturing technology progresses and the dimensions of integrated circuit elements shrink, overlay budget is accordingly being reduced. Overlay budget closely approaches the scale of measurement inaccuracies due to both optical imperfections of the measurement system and the interaction of light with geometrical asymmetries of the measured targets. Measurement inaccuracies can no longer be ignored due to their significant effect on the resulting device yield. In this paper we investigate a new approach for imaging based overlay (IBO) measurements by optimizing accuracy rather than contrast precision, including its effect over the total target performance, using wavelength tunable overlay imaging metrology. We present new accuracy metrics based on theoretical development and present their quality in identifying the measurement accuracy when compared to CD-SEM overlay measurements. The paper presents the theoretical considerations and simulation work, as well as measurement data, for which tunability combined with the new accuracy metrics is shown to improve accuracy performance.

  5. Accuracy of clinical coding for procedures in oral and maxillofacial surgery.

    Science.gov (United States)

    Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I

    2016-10-01

    Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients), and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32% - 33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments will improve accuracy. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. TECHNIQUE OF CONSTRUCTION AND ANALYSIS OF GLONASS FIELDS OF ACCURACY IN THE GIVEN ZONE OF AIRSPACE

    Directory of Open Access Journals (Sweden)

    O. N. Skrypnik

    2015-01-01

    Full Text Available Based on a LabVIEW program developed for modeling satellite orbital motion and selecting the working constellation, a methodology for constructing fields of potential GLONASS accuracy in a given volume of airspace is proposed. The method rests on estimating the values of the horizontal (HDOP) and vertical (VDOP) geometric factors at points chosen with a given latitude and longitude spacing within the airspace under study. After appropriate error handling, the areas where the HDOP and VDOP values lie within a given range are selected and matched cartographically. Expressions for calculating the geometric factors are given. The validity of the mathematical model and the accuracy of the results were evaluated by comparing real experimental data with semi-realistic simulation carried out with the CH-4312 aeronautical receiver and the CH-3803M simulator. Changes in the geometric factors at the initial and final points of a flight route, and also during the flight from Irkutsk to Moscow, were investigated. As an example, the GLONASS accuracy fields in the horizontal and vertical planes were constructed for the airspace between Irkutsk and Moscow at the times corresponding to the aircraft's take-off in Irkutsk and its landing in Moscow.
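    For readers unfamiliar with the geometric factors mentioned above, the sketch below shows the standard way HDOP and VDOP are obtained from the satellite geometry (unit line-of-sight vectors in a local east-north-up frame); it is a textbook computation and does not reproduce the paper's LabVIEW model or constellation-selection logic.

```python
# A minimal sketch of the standard HDOP/VDOP computation from satellite geometry.
import numpy as np

def dilution_of_precision(unit_vectors):
    """unit_vectors: (n, 3) receiver-to-satellite unit vectors in an ENU frame."""
    e = np.asarray(unit_vectors, dtype=float)
    G = np.hstack([-e, np.ones((e.shape[0], 1))])   # geometry (design) matrix
    Q = np.linalg.inv(G.T @ G)                      # cofactor matrix
    hdop = np.sqrt(Q[0, 0] + Q[1, 1])               # horizontal components
    vdop = np.sqrt(Q[2, 2])                         # vertical component
    return hdop, vdop

# Four hypothetical satellites: three at moderate elevation plus one near zenith.
sats = np.array([[0.87, 0.0, 0.5],
                 [-0.43, 0.75, 0.5],
                 [-0.43, -0.75, 0.5],
                 [0.0, 0.1, 0.995]])
sats /= np.linalg.norm(sats, axis=1, keepdims=True)
hdop, vdop = dilution_of_precision(sats)
print(f"HDOP = {hdop:.2f}, VDOP = {vdop:.2f}")
```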

  7. Evaluation of accuracy of intra operative imprint cytology for detection of breast lesions

    International Nuclear Information System (INIS)

    Mahmood, Z.; Shahbaz, A.; Qureshi, A.; Aziz, N.; Niazi, S.; Qureshi, S.; Bukhari, M.H.

    2010-01-01

    Objective: To determine the accuracy of imprint cytology as an intraoperative diagnostic procedure for breast lesions with histopathological correlation. Materials and Methods: This was a descriptive study on 40 cases of breast lesions comprising inflammatory, benign and malignant lesions, including their margins. It was conducted at King Edward Medical University, Lahore, in collaboration with all Surgical Departments of Mayo Hospital. Relevant clinical data were recorded in a proforma. Both touch and scrape imprints were prepared from all the lesions and stained with May-Grünwald Giemsa and Haematoxylin and Eosin stains. The imprints were subsequently compared with histopathology sections. Results: When we counted atypical cases as negative, both touch and scrape imprints gave sensitivity, specificity, positive predictive value, negative predictive value and accuracy of 100%. However, when we counted cases with atypia as positive, sensitivity and negative predictive value were 100% with both touch and scrape imprints. Specificity, positive predictive value and accuracy were 71%, 86% and 85.5%, respectively, with touch imprints and 78%, 89% and 89%, respectively, with scrape imprints. No diagnostic difference was noted between the results of the two stains. All the imprints correlated well with the histopathological diagnosis. Conclusion: Imprint cytology is an accurate and simple intraoperative method for diagnosing breast lesions. It can provide the surgeons with information regarding immediate clinical and surgical interventions. (author)
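    The reported measures can be reproduced from a 2×2 confusion table with the standard formulas; the sketch below uses hypothetical counts rather than the study's 40 cases.

```python
# Standard diagnostic-accuracy metrics from a 2x2 confusion table.
# The counts in the example are hypothetical.

def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and overall accuracy from raw counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Example: cytology result vs final histopathology (hypothetical counts).
print(diagnostic_metrics(tp=18, fp=2, tn=19, fn=1))
```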

  8. Accuracy of endoscopic ultrasonography for diagnosing ulcerative early gastric cancers

    Science.gov (United States)

    Park, Jin-Seok; Kim, Hyungkil; Bang, Byongwook; Kwon, Kyesook; Shin, Youngwoon

    2016-01-01

    Abstract Although endoscopic ultrasonography (EUS) is the first-choice imaging modality for predicting the invasion depth of early gastric cancer (EGC), the prediction accuracy of EUS is significantly decreased when EGC is combined with ulceration. The aim of the present study was to compare the accuracy of EUS and conventional endoscopy (CE) for determining the depth of EGC. In addition, the various clinicopathologic factors affecting the diagnostic accuracy of EUS, with a particular focus on endoscopic ulcer shapes, were evaluated. We retrospectively reviewed data from 236 consecutive patients with ulcerative EGC. All patients underwent EUS for estimating tumor invasion depth, followed by either curative surgery or endoscopic treatment. The diagnostic accuracy of EUS and CE was evaluated by comparison with the final histologic results of the resected specimens. The correlation between accuracy of EUS and characteristics of EGC (tumor size, histology, location in stomach, tumor invasion depth, and endoscopic ulcer shapes) was analyzed. Endoscopic ulcer shapes were classified into 3 groups: definite ulcer, superficial ulcer, and ill-defined ulcer. The overall accuracy of EUS and CE for predicting the invasion depth in ulcerative EGC was 68.6% and 55.5%, respectively. Of the 236 patients, 36 were classified as definite ulcers, 98 as superficial ulcers, and 102 as ill-defined ulcers. In univariate analysis, EUS accuracy was associated with invasion depth (P = 0.023), tumor size (P = 0.034), and endoscopic ulcer shapes (P = 0.001). In multivariate analysis, there was a significant association between superficial ulcer in CE and EUS accuracy (odds ratio: 2.977; 95% confidence interval: 1.255–7.064; P = 0.013). The accuracy of EUS for determining tumor invasion depth in ulcerative EGC was superior to that of CE. In addition, ulcer shape was an important factor that affected EUS accuracy. PMID:27472672

  9. Development of response inhibition in the context of relevant versus irrelevant emotions

    Directory of Open Access Journals (Sweden)

    Margot A Schel

    2013-07-01

    Full Text Available The present study examined the influence of relevant and irrelevant emotions on response inhibition from childhood to early adulthood. Ninety-four participants between 6 and 25 years of age performed two go/nogo tasks with emotional faces (neutral, happy, and fearful) as stimuli. In one go/nogo task emotion formed a relevant dimension of the task and in the other go/nogo task emotion was irrelevant and participants had to respond to the color of the faces instead. A special feature of the latter task, in which emotion was irrelevant, was the inclusion of free choice trials, in which participants could freely decide between acting and inhibiting. Results showed a linear increase in response inhibition performance with increasing age both in relevant and irrelevant affective contexts. Relevant emotions had a pronounced influence on performance across age, whereas irrelevant emotions did not. Overall, participants made more false alarms on trials with fearful faces than happy faces, and happy faces were associated with better performance on go trials (higher percentage correct and faster RTs) than fearful faces. The latter effect was stronger for young children in terms of accuracy. Finally, during the free choice trials participants did not base their decisions on affective context, confirming that irrelevant emotions do not have a strong impact on inhibition. Together, these findings suggest that across development relevant affective context has a larger influence on response inhibition than irrelevant affective context. When emotions are relevant, a context of positive emotions is associated with better performance compared to a context with negative emotions, especially in young children.

  10. Diagnostic accuracy of postmortem imaging vs autopsy-A systematic review.

    Science.gov (United States)

    Eriksson, Anders; Gustafsson, Torfinn; Höistad, Malin; Hultcrantz, Monica; Jacobson, Stella; Mejare, Ingegerd; Persson, Anders

    2017-04-01

    Background: Postmortem imaging has been used for more than a century as a complement to medico-legal autopsies. The technique has also emerged as a possible alternative to compensate for the continuous decline in the number of clinical autopsies. To evaluate the diagnostic accuracy of postmortem imaging for various types of findings, we performed this systematic literature review. Data sources: The literature search was performed in the databases PubMed, Embase and Cochrane Library through January 7, 2015. Relevant publications were assessed for risk of bias using the QUADAS tool and were classified as low, moderate or high risk of bias according to pre-defined criteria. Autopsy and/or histopathology were used as reference standard. Findings: The search generated 2600 abstracts, of which 340 were assessed as possibly relevant and read in full-text. After further evaluation 71 studies were finally included, of which 49 were assessed as having high risk of bias and 22 as moderate risk of bias. Due to considerable heterogeneity - in populations, techniques, analyses and reporting - of included studies it was impossible to combine data to get a summary estimate of the diagnostic accuracy of the various findings. Individual studies indicate, however, that imaging techniques might be useful for determining organ weights, and that the techniques seem superior to autopsy for detecting gas. Conclusions and Implications: In general, based on the current scientific literature, it was not possible to determine the diagnostic accuracy of postmortem imaging and its usefulness in conjunction with, or as an alternative to autopsy. To correctly determine the usefulness of postmortem imaging, future studies need improved planning, improved methodological quality and larger materials, preferentially obtained from multi-center studies. Copyright © 2016. Published by Elsevier B.V.

  11. Structural Model Error and Decision Relevancy

    Science.gov (United States)

    Goldsby, M.; Lusk, G.

    2017-12-01

    The extent to which climate models can underwrite specific climate policies has long been a contentious issue. Skeptics frequently deny that climate models are trustworthy in an attempt to undermine climate action, whereas policy makers often desire information that exceeds the capabilities of extant models. While not skeptics, a group of mathematicians and philosophers [Frigg et al. (2014)] recently argued that even tiny differences between the structure of a complex dynamical model and its target system can lead to dramatic predictive errors, possibly resulting in disastrous consequences when policy decisions are based upon those predictions. They call this result the Hawkmoth effect (HME), and seemingly use it to rebuke rightwing proposals to forgo mitigation in favor of adaptation. However, a vigorous debate has emerged between Frigg et al. on one side and another philosopher-mathematician pair [Winsberg and Goodwin (2016)] on the other. On one hand, Frigg et al. argue that their result shifts the burden to climate scientists to demonstrate that their models do not fall prey to the HME. On the other hand, Winsberg and Goodwin suggest that arguments like those asserted by Frigg et al. can be, if taken seriously, "dangerous": they fail to consider the variety of purposes for which models can be used, and thus too hastily undermine large swaths of climate science. They put the burden back on Frigg et al. to show their result has any effect on climate science. This paper seeks to attenuate this debate by establishing an irenic middle position; we find that there is more agreement between sides than it first seems. We distinguish a `decision standard' from a `burden of proof', which helps clarify the contributions to the debate from both sides. In making this distinction, we argue that scientists bear the burden of assessing the consequences of HME, but that the standard Frigg et al. adopt for decision relevancy is too strict.

  12. Intracranial Aneurysms of Neuro-Ophthalmologic Relevance.

    Science.gov (United States)

    Micieli, Jonathan A; Newman, Nancy J; Barrow, Daniel L; Biousse, Valérie

    2017-12-01

    Intracranial saccular aneurysms are acquired lesions that often present with neuro-ophthalmologic symptoms and signs. Recent advances in neurosurgical techniques, endovascular treatments, and neurocritical care have improved the optimal management of symptomatic unruptured aneurysms, but whether the chosen treatment has an impact on neuro-ophthalmologic outcomes remains debated. A review of the literature focused on neuro-ophthalmic manifestations and treatment of intracranial aneurysms with specific relevance to neuro-ophthalmologic outcomes was conducted using Ovid MEDLINE and EMBASE databases. Cavernous sinus aneurysms were not included in this review. Surgical clipping vs endovascular coiling for aneurysms causing third nerve palsies was compared in 13 retrospective studies representing 447 patients. Complete recovery was achieved in 78% of surgical patients compared with 44% of patients treated with endovascular coiling. However, the complication rate, hospital costs, and days spent in intensive care were reported as higher in surgically treated patients. Retrospective reviews of surgical clipping and endovascular coiling for all ocular motor nerve palsies (third, fourth, or sixth cranial nerves) revealed similar results of complete resolution in 76% and 49%, respectively. Improvement in visual deficits related to aneurysmal compression of the anterior visual pathways was also better among patients treated with clipping than with coiling. The time to treatment from onset of visual symptoms was a predictive factor of visual recovery in several studies. Few reports have specifically assessed the improvement of visual deficits after treatment with flow diverters. Decisions regarding the choice of therapy for intracranial aneurysms causing neuro-ophthalmologic signs ideally should be made at high-volume centers with access to both surgical and endovascular treatments. The status of the patient, location of the aneurysm, and experience of the treating physicians

  13. Patient-relevant treatment goals in psoriasis.

    Science.gov (United States)

    Blome, Christine; Gosau, Ramona; Radtke, Marc A; Reich, Kristian; Rustenbach, Stephan J; Spehr, Christina; Thaçi, Diamant; Augustin, Matthias

    2016-03-01

    Patient-oriented care requires therapeutic decisions to agree with the patients' treatment needs and goals. This study addressed the following questions: What is important to psoriasis patients starting systemic treatment? How stable are these preferences within the first year of treatment? Are treatment goals associated with age, gender, or treatment success? The importance of treatment goals was assessed for patients with moderate-to-severe psoriasis in the German Psoriasis Registry (PsoBest) at baseline (onset of a systemic treatment; n = 3066) and at a 1-year follow-up (n = 1444) using the Patient Benefit Index (PBI). Treatment success was measured with PBI global score and Psoriasis Area Severity Index (PASI). Patients with moderate-to-severe psoriasis pursued a wide range of different goals. The most general treatment goals were rated most relevant, including skin healing and quick skin improvement (94.8/94.5 % "quite" or "very" important), confidence in the therapy (93.0 %), control over the disease (92.3 %), and a clear diagnosis and therapy (89.6 %). Further important goals related to not being in fear of the disease getting worse (84.8 %), reduction in itching (83.9 %), burning (70.6 %), and pain (60.6 %) as well as attaining a normal everyday life (78.4 %) and low treatment burden (64.2-77.9 %). Goals were mostly not associated with sex and gender. Goal importance slightly increased with treatment success. In a substantial proportion of patients (30.3-54.7 %) goal importance changed within 1 year after onset of systemic treatment. We conclude that treatment goal importance should be assessed in clinical practice on a regular basis.

  14. Relevant optical properties for direct restorative materials.

    Science.gov (United States)

    Pecho, Oscar E; Ghinea, Razvan; do Amaral, Erika A Navarro; Cardona, Juan C; Della Bona, Alvaro; Pérez, María M

    2016-05-01

    To evaluate relevant optical properties of esthetic direct restorative materials focusing on whitened and translucent shades. Enamel (E), body (B), dentin (D), translucent (T) and whitened (Wh) shades for E (WhE) and B (WhB) from a restorative system (Filtek Supreme XTE, 3M ESPE) were evaluated. Samples (1 mm thick) were prepared. Spectral reflectance (R%) and color coordinates (L*, a*, b*, C* and h°) were measured against black and white backgrounds, using a spectroradiometer, in a viewing booth, with CIE D65 illuminant and d/0° geometry. Scattering (S) and absorption (K) coefficients and transmittance (T%) were calculated using Kubelka-Munk's equations. Translucency (TP) and opalescence (OP) parameters and whiteness index (W*) were obtained from differences of CIELAB color coordinates. R%, S, K and T% curves from all shades were compared using VAF (Variance Accounting For) coefficient with Cauchy-Schwarz inequality. Color coordinates and optical parameters were statistically analyzed using one-way ANOVA, Tukey's test with Bonferroni correction (α=0.0007). Spectral behavior of R% and S were different for T shades. In addition, T shades showed the lowest R%, S and K values, as well as the highest T%, TP an OP values. In most cases, WhB shades showed different color and optical properties (including TP and W*) than their corresponding B shades. WhE shades showed similar mean W* values and higher mean T% and TP values than E shades. When using whitened or translucent composites, the final color is influenced not only by the intraoral background but also by the color and optical properties of multilayers used in the esthetic restoration. Copyright © 2016 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
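    For reference, the translucency parameter (TP) mentioned above is commonly derived from CIELAB coordinates measured over black (B) and white (W) backgrounds; a standard formulation, which may differ in detail from the one used in the study, is:

```latex
TP = \sqrt{\left(L^{*}_{B}-L^{*}_{W}\right)^{2}+\left(a^{*}_{B}-a^{*}_{W}\right)^{2}+\left(b^{*}_{B}-b^{*}_{W}\right)^{2}}
```

    The opalescence parameter (OP) is commonly computed analogously from the a* and b* differences only, again under the same black/white-backing measurement scheme.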

  15. Development of an ITER relevant inspection robot

    Energy Technology Data Exchange (ETDEWEB)

    Gargiulo, L.; Bayetti, P.; Cordier, J.J.; Grisolia, C.; Hatchressian, J.C. [Association Euratom-CEA, Cadarache (France). Dept. de Recherche sur la Fusion Controlee; Friconneau, J.P.; Keller, D.; Perrot, Y. [CEA-LIST Robotics and Interactive Systems Unit, Fontenay aux Roses (France)

    2007-07-01

    Robotic operations are one of the major maintenance challenges for ITER and future fusion reactors. In particular, in vessel inspection operations without loss of conditioning could be very useful. Within this framework, the aim of the project called AIA (Articulated Inspection Arm) is to demonstrate the feasibility of a multi-purpose in-vessel Remote Handling inspection system using a long reach, limited payload carrier (up to 10 kg). It is composed of 5 segments with 11 degrees of freedom and a total range of 8 m. The project is currently developed by the CEA within the European workprogramme. Its first in situ tests are planned this summer on the Tore Supra tokamak at Cadarache (France). They will validate chosen concepts for operations under ITER relevant vacuum and temperature conditions. After qualification, the arm will constitute a promising tool for generic application. Several processes are already considered for ITER maintenance and will be demonstrated on the AIA robot carrier: - The first embedded process is the viewing system. It is currently being manufactured and will allow for close visual inspection of the complex Plasma Facing Components (limiters, neutralisers, RF antennae, diagnostic windows, etc.). - In situ localisation of leakage based on helium sniffer is also studied to improve maintenance operations. - Finally the laser ablation system for PFC detritiation, also developed in CEA laboratories, is being fitted to be implanted into the robot and put into operation in Tore Supra. This paper deals with the integration of the robot in the Tore Supra tokamak and the advances in the development of the listed processes. It also introduces the current test campaign aiming to qualify the robot performance and reliability under vacuum and temperature conditions. (orig.)

  16. Development of an ITER relevant inspection robot

    Energy Technology Data Exchange (ETDEWEB)

    Gargiulo, Laurent [Association Euratom-CEA, Departement de Recherche sur la Fusion Controlee, CE Cadarache 13108 (France)], E-mail: laurent.gargiulo@cea.fr; Bayetti, Pascal; Bruno, Vincent; Cordier, Jean-Jacques [Association Euratom-CEA, Departement de Recherche sur la Fusion Controlee, CE Cadarache 13108 (France); Friconneau, Jean-Pierre [CEA-LIST Robotics and Interactive Systems Unit, CE Fontenay Aux Roses (France); Grisolia, Christian; Hatchressian, Jean-Claude; Houry, Michael [Association Euratom-CEA, Departement de Recherche sur la Fusion Controlee, CE Cadarache 13108 (France); Keller, Delphine; Perrot, Yann [CEA-LIST Robotics and Interactive Systems Unit, CE Fontenay Aux Roses (France)

    2008-12-15

    Robotic operations are one of the major maintenance challenges for ITER and future fusion reactors. In particular, in-vessel inspection operations without loss of conditioning will be mandatory. In this context, an Articulated Inspection Arm (AIA) is currently developed by the CEA within the European work programme framework, which aims at demonstrating the feasibility of a multi-purpose in-vessel Remote Handling inspection system using a long reach, limited payload carrier (up to 10 kg). It is composed of 5 segments with 8 degrees of freedom and a total range of 8 m. The first in situ tests will take place by the end of 2007 on the Tore Supra Tokamak at Cadarache (France). They will validate concepts for operations under ITER relevant vacuum and temperature conditions. After qualification, the arm will constitute a promising tool for various applications. Several processes are already considered for ITER maintenance and will be demonstrated on the AIA robot carrier: - The first embedded process is the viewing system. It is already manufactured and will allow close visual inspection of the complex Plasma Facing Components (PFC) (limiters, neutralisers, RF antenna, diagnostic windows, etc.). - In situ localisation of water leakage based on a helium sniffing system is also being studied to improve and facilitate maintenance operations. - Finally a laser ablation system for PFC detritiation, developed in CEA laboratories, is being fitted to be implemented on the robot for future operation in Tore Supra. This paper deals with the integration of the robot into Tore Supra and the progress in the development of the processes listed above. It also describes the current test campaign aiming to qualify the robot performance and reliability under vacuum and temperature conditions.

  17. Development of an ITER relevant inspection robot

    International Nuclear Information System (INIS)

    Gargiulo, L.; Bayetti, P.; Cordier, J.J.; Grisolia, C.; Hatchressian, J.C.

    2007-01-01

    Robotic operations are one of the major maintenance challenges for ITER and future fusion reactors. In particular, in vessel inspection operations without loss of conditioning could be very useful. Within this framework, the aim of the project called AIA (Articulated Inspection Arm) is to demonstrate the feasibility of a multi-purpose in-vessel Remote Handling inspection system using a long reach, limited payload carrier (up to 10 kg). It is composed of 5 segments with 11 degrees of freedom and a total range of 8 m. The project is currently developed by the CEA within the European workprogramme. Its first in situ tests are planned this summer on the Tore Supra tokamak at Cadarache (France). They will validate chosen concepts for operations under ITER relevant vacuum and temperature conditions. After qualification, the arm will constitute a promising tool for generic application. Several processes are already considered for ITER maintenance and will be demonstrated on the AIA robot carrier: - The first embedded process is the viewing system. It is currently being manufactured and will allow for close visual inspection of the complex Plasma Facing Components (limiters, neutralisers, RF antennae, diagnostic windows, etc.). - In situ localisation of leakage based on helium sniffer is also studied to improve maintenance operations. - Finally the laser ablation system for PFC detritiation, also developed in CEA laboratories, is being fitted to be implanted into the robot and put into operation in Tore Supra. This paper deals with the integration of the robot in the Tore Supra tokamak and the advances in the development of the listed processes. It also introduces the current test campaign aiming to qualify the robot performance and reliability under vacuum and temperature conditions. (orig.)

  18. Development of an ITER relevant inspection robot

    International Nuclear Information System (INIS)

    Gargiulo, Laurent; Bayetti, Pascal; Bruno, Vincent; Cordier, Jean-Jacques; Friconneau, Jean-Pierre; Grisolia, Christian; Hatchressian, Jean-Claude; Houry, Michael; Keller, Delphine; Perrot, Yann

    2008-01-01

    Robotic operations are one of the major maintenance challenges for ITER and future fusion reactors. In particular, in-vessel inspection operations without loss of conditioning will be mandatory. In this context, an Articulated Inspection Arm (AIA) is currently developed by the CEA within the European work programme framework, which aims at demonstrating the feasibility of a multi-purpose in-vessel Remote Handling inspection system using a long reach, limited payload carrier (up to 10 kg). It is composed of 5 segments with 8 degrees of freedom and a total range of 8 m. The first in situ tests will take place by the end of 2007 on the Tore Supra Tokamak at Cadarache (France). They will validate concepts for operations under ITER relevant vacuum and temperature conditions. After qualification, the arm will constitute a promising tool for various applications. Several processes are already considered for ITER maintenance and will be demonstrated on the AIA robot carrier: - The first embedded process is the viewing system. It is already manufactured and will allow close visual inspection of the complex Plasma Facing Components (PFC) (limiters, neutralisers, RF antenna, diagnostic windows, etc.). - In situ localisation of water leakage based on a helium sniffing system is also being studied to improve and facilitate maintenance operations. - Finally a laser ablation system for PFC detritiation, developed in CEA laboratories, is being fitted to be implemented on the robot for future operation in Tore Supra. This paper deals with the integration of the robot into Tore Supra and the progress in the development of the processes listed above. It also describes the current test campaign aiming to qualify the robot performance and reliability under vacuum and temperature conditions

  19. Autonomia e relevância dos regimes The autonomy and relevance of regimes

    Directory of Open Access Journals (Sweden)

    Gustavo Seignemartin de Carvalho

    2005-12-01

    Full Text Available Institutionalist theories in the discipline of International Relations usually define regimes as a set of formal or informal norms and rules that allow the convergence of expectations or the standardization of participants' behavior in a given issue area, with the aim of solving coordination problems that would otherwise tend toward non-Pareto-efficient outcomes. Since these definitions, based merely on the "efficiency" of regimes, do not seem sufficient to explain their effectiveness, this article proposes a different definition of regimes: political arrangements that allow the gains from cooperation to be redistributed among the participants in a given issue area in a context of interdependence. Regimes derive their effectiveness from their autonomy and relevance, that is, from having an objective existence autonomous from that of their participants and from influencing participants' behavior and expectations in ways that cannot be reduced to the individual action of any one of them. The article begins with a brief discussion of the terminological difficulties associated with the study of regimes and with the definition of the concepts of autonomy and relevance. It then classifies the various authors taking part in the debate into two distinct perspectives, one that denies (non-autonomists) and one that attributes (autonomists) autonomy and relevance to regimes, and briefly reviews the authors and traditions most significant to the debate, concentrating on the autonomists and on the arguments that reinforce the hypothesis presented here. Finally, the article proposes an analytical decomposition of regimes into the four main elements that give them autonomy and relevance: normativity, actors, specificity of the issue area, and complex interdependence with the context.

  20. Experiments with positive, negative and topical relevance feedback

    NARCIS (Netherlands)

    Kaptein, R.; Kamps, J.; Li, R.; Hiemstra, D.

    2008-01-01

    This document contains a description of experiments for the 2008 Relevance Feedback track. We experiment with different amounts of feedback, including negative relevance feedback. Feedback is implemented using massive weighted query expansion. Parsimonious query expansion using Dirichlet smoothing
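    As a point of reference for how feedback terms can be folded into an expanded query, the sketch below implements a generic Rocchio-style weighting with positive and negative feedback documents; it is a textbook illustration only and is not the parsimonious, Dirichlet-smoothed expansion the track submission describes.

```python
# Generic Rocchio-style weighted query expansion with positive and negative
# relevance feedback; a textbook illustration, not the track submission's model.
from collections import Counter

def rocchio_expand(query_terms, relevant_docs, nonrelevant_docs,
                   alpha=1.0, beta=0.75, gamma=0.15, top_k=10):
    """Documents are lists of terms; returns the top_k expansion terms with weights."""
    weights = Counter({t: alpha for t in query_terms})
    for doc in relevant_docs:
        for t, c in Counter(doc).items():
            weights[t] += beta * c / max(len(relevant_docs), 1)
    for doc in nonrelevant_docs:
        for t, c in Counter(doc).items():
            weights[t] -= gamma * c / max(len(nonrelevant_docs), 1)
    return [(t, w) for t, w in weights.most_common(top_k) if w > 0]

# Hypothetical feedback documents.
rel = [["solar", "panel", "efficiency"], ["solar", "cell", "efficiency"]]
nonrel = [["solar", "eclipse"]]
print(rocchio_expand(["solar", "energy"], rel, nonrel))
```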

  1. Simulation assessment center in the service of the company as a factor in the accuracy and validity of the information about the employee

    OpenAIRE

    Borodai V.A.

    2017-01-01

    The article discusses the relevance of the assessment center method for personnel evaluation and the efficiency of the method in terms of the accuracy and validity of employee assessments. Positive factors in, and problematic aspects of, the use of assessment center technology in a service company are identified.

  2. Linear accuracy and reliability of volume data sets acquired by two CBCT-devices and an MSCT using virtual models : A comparative in-vitro study

    NARCIS (Netherlands)

    Wikner, Johannes; Hanken, Henning; Eulenburg, Christine; Heiland, Max; Groebe, Alexander; Assaf, Alexandre Thomas; Riecke, Bjoern; Friedrich, Reinhard E.

    2016-01-01

    Objective. To discriminate clinically relevant aberrance, the accuracy of linear measurements in three-dimensional (3D) reconstructed datasets was investigated. Materials and methods. Three partly edentulous human skulls were examined. Landmarks were defined prior to acquisition. Two CBCT-scanners

  3. ACCURACY ASSESSMENT OF COASTAL TOPOGRAPHY DERIVED FROM UAV IMAGES

    Directory of Open Access Journals (Sweden)

    N. Long

    2016-06-01

    Full Text Available To monitor coastal environments, an Unmanned Aerial Vehicle (UAV) is a low-cost and easy-to-use solution that enables data acquisition with high temporal frequency and spatial resolution. Compared to Light Detection And Ranging (LiDAR) or Terrestrial Laser Scanning (TLS), this solution produces a Digital Surface Model (DSM) with a similar accuracy. To evaluate the DSM accuracy in a coastal environment, a campaign was carried out with a flying wing (eBee) combined with a digital camera. Using the Photoscan software and the photogrammetric process (Structure From Motion algorithm), a DSM and an orthomosaic were produced. The DSM accuracy was estimated by comparison with GNSS surveys. Two parameters were tested: the influence of the methodology (number and distribution of Ground Control Points, GCPs) and the influence of spatial image resolution (4.6 cm vs 2 cm). The results show that this solution is able to reproduce the topography of a coastal area with a high vertical accuracy (< 10 cm). Georeferencing of the DSM requires a homogeneous distribution and a large number of GCPs. The accuracy is correlated with the number of GCPs (using 19 GCPs instead of 10 reduces the difference by 4 cm); the required accuracy should depend on the research question. Last, in this particular environment, the presence of very small water surfaces on the sand bank does not allow the accuracy to be improved when the spatial resolution of the images is decreased.
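    The comparison against GNSS surveys mentioned above amounts to sampling the DSM at each check point and summarizing the height differences; the sketch below shows one minimal way to do this (nearest-cell sampling, invented coordinates), whereas a production workflow would typically interpolate the DSM and follow the authors' exact protocol.

```python
# A minimal sketch of DSM vertical accuracy assessment against GNSS check points:
# sample the DSM at each check point and compute bias and RMSE.
# Array contents and the nearest-neighbour sampling are illustrative assumptions.
import numpy as np

def vertical_accuracy(dsm_z, dsm_xy, gnss_xyz):
    """dsm_xy: (m, 2) planimetric coordinates of DSM cells, dsm_z: (m,) heights,
    gnss_xyz: (n, 3) GNSS check points (x, y, z)."""
    errors = []
    for x, y, z in gnss_xyz:
        # nearest DSM cell (a real workflow would interpolate, e.g. bilinearly)
        i = np.argmin((dsm_xy[:, 0] - x) ** 2 + (dsm_xy[:, 1] - y) ** 2)
        errors.append(dsm_z[i] - z)
    errors = np.asarray(errors)
    return errors.mean(), np.sqrt((errors ** 2).mean())   # bias, RMSE

# Hypothetical data: a flat 10 cm-resolution patch and three check points.
xs, ys = np.meshgrid(np.arange(0, 1, 0.1), np.arange(0, 1, 0.1))
dsm_xy = np.column_stack([xs.ravel(), ys.ravel()])
dsm_z = np.full(dsm_xy.shape[0], 2.05)
gnss = np.array([[0.2, 0.3, 2.00], [0.5, 0.5, 2.10], [0.8, 0.1, 2.02]])
bias, rmse = vertical_accuracy(dsm_z, dsm_xy, gnss)
print(f"bias = {bias:.3f} m, RMSE = {rmse:.3f} m")
```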

  4. Systematic Calibration for Ultra-High Accuracy Inertial Measurement Units

    Directory of Open Access Journals (Sweden)

    Qingzhong Cai

    2016-06-01

    Full Text Available An inertial navigation system (INS) has been widely used in challenging GPS environments. With the rapid development of modern physics, an atomic gyroscope will come into use in the near future with a predicted accuracy of 5 × 10⁻⁶ °/h or better. However, existing calibration methods and devices cannot satisfy the accuracy requirements of future ultra-high accuracy inertial sensors. In this paper, an improved calibration model is established by introducing gyro g-sensitivity errors, accelerometer cross-coupling errors and lever arm errors. A systematic calibration method is proposed based on a 51-state Kalman filter and smoother. Simulation results show that the proposed calibration method can realize the estimation of all the parameters using a common dual-axis turntable. Laboratory and sailing tests prove that the position accuracy in a five-day inertial navigation run can be improved by about 8% by the proposed calibration method. The accuracy can be improved by at least 20% when the position accuracy of the atomic gyro INS reaches a level of 0.1 nautical miles/5 d. Compared with the existing calibration methods, the proposed method, with more error sources and high order small error parameters calibrated for ultra-high accuracy inertial measurement units (IMUs) using common turntables, has a great application potential in future atomic gyro INSs.
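    To make the filtering step concrete, the sketch below shows a minimal textbook linear Kalman filter applied to estimating a constant sensor bias; the paper's 51-state model, smoother and turntable manoeuvre design are far more elaborate and are not reproduced here.

```python
# A minimal textbook linear Kalman filter (predict/update), shown only to make
# the filtering step of systematic calibration concrete; not the 51-state model.
import numpy as np

def kf_predict(x, P, F, Q):
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)              # state correction from measurement z
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: estimate a constant sensor bias from noisy measurements.
x = np.zeros(1); P = np.eye(1)
F = np.eye(1); Q = 1e-8 * np.eye(1)      # bias modelled as (nearly) constant
H = np.eye(1); R = 0.04 * np.eye(1)
rng = np.random.default_rng(0)
for z in 0.1 + 0.2 * rng.standard_normal(50):
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, np.array([z]), H, R)
print("estimated bias:", float(x[0]))
```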

  5. On the accuracy potential of focused plenoptic camera range determination in long distance operation

    Science.gov (United States)

    Sardemann, Hannes; Maas, Hans-Gerd

    2016-04-01

    Plenoptic cameras have found increasing interest in optical 3D measurement techniques in recent years. While their basic principle is 100 years old, the development in digital photography, micro-lens fabrication technology and computer hardware has boosted development and led to several commercially available ready-to-use cameras. Beyond their popular option of a posteriori image focusing or total focus image generation, their basic ability of generating 3D information from single camera imagery depicts a very beneficial option for certain applications. The paper will first present some fundamentals on the design and history of plenoptic cameras and will describe depth determination from plenoptic camera image data. It will then present an analysis of the depth determination accuracy potential of plenoptic cameras. While most research on plenoptic camera accuracy so far has focused on close range applications, we will focus on mid and long ranges of up to 100 m. This range is especially relevant, if plenoptic cameras are discussed as potential mono-sensorial range imaging devices in (semi-)autonomous cars or in mobile robotics. The results show the expected deterioration of depth measurement accuracy with depth. At depths of 30-100 m, which may be considered typical in autonomous driving, depth errors in the order of 3% (with peaks up to 10-13 m) were obtained from processing small point clusters on an imaged target. Outliers much higher than these values were observed in single point analysis, stressing the necessity of spatial or spatio-temporal filtering of the plenoptic camera depth measurements. Despite these obviously large errors, a plenoptic camera may nevertheless be considered a valid option for the application fields of real-time robotics like autonomous driving or unmanned aerial and underwater vehicles, where the accuracy requirements decrease with distance.

  6. Do technical parameters affect the diagnostic accuracy of virtual bronchoscopy in patients with suspected airways stenosis?

    International Nuclear Information System (INIS)

    Jones, Catherine M.; Athanasiou, Thanos; Nair, Sujit; Aziz, Omer; Purkayastha, Sanjay; Konstantinos, Vlachos; Paraskeva, Paraskevas; Casula, Roberto; Glenville, Brian; Darzi, Ara

    2005-01-01

    Purpose: Virtual bronchoscopy has gained popularity over the past decade as an alternative investigation to conventional bronchoscopy in the diagnosis, grading and monitoring of airway disease. The effect of technical parameters on diagnostic outcome from virtual bronchoscopy has not been determined. This meta-analysis aims to estimate accuracy of virtual compared to conventional bronchoscopy in patients with suspected airway stenosis, and evaluate the influence of technical parameters. Materials and methods: A MEDLINE search was used to identify relevant published studies. The primary endpoint was the 'correct diagnosis' of stenotic lesions on virtual compared to conventional bronchoscopy. Secondary endpoints included the effects of the technical parameters (pitch, collimation, reconstruction interval, rendering method, and scanner type), and date of publication on the diagnostic accuracy of virtual bronchoscopy. Results: Thirteen studies containing 454 patients were identified. Meta-analysis showed good overall diagnostic performance with 85% calculated pooled sensitivity (95% CI 77-91%), 87% specificity (95% CI 81-92%) and area under the curve (AUC) of 0.947. Subgroups included collimation of 3 mm or more (AUC 0.948), pitch of 1 (AUC 0.955), surface rendering technique (AUC 0.935), and reconstruction interval of more than 1.25 mm (AUC 0.914). There was no significant difference in accuracy accounting for publication date, scanner type or any of the above variables. Weighted regression analysis confirmed none of these variables could significantly account for study heterogeneity. Conclusion: Virtual bronchoscopy performs well in the investigation of patients with suspected airway stenosis. Overall sensitivity and specificity and diagnostic odds ratio for diagnosis of airway stenosis were high. The effects of pitch, collimation, reconstruction interval, rendering technique, scanner type, and publication date on diagnostic accuracy were not significant

  7. Accuracy and Efficiency of a Coupled Neutronics and Thermal Hydraulics Model

    International Nuclear Information System (INIS)

    Pope, Michael A.; Mousseau, Vincent A.

    2009-01-01

    The accuracy requirements for modern nuclear reactor simulation are steadily increasing due to the cost and regulation of relevant experimental facilities. Because of the increase in the cost of experiments and the decrease in the cost of simulation, simulation will play a much larger role in the design and licensing of new nuclear reactors. Fortunately, as the workload of simulation increases, there are better physics models, new numerical techniques, and more powerful computer hardware that will enable modern simulation codes to handle this larger workload. This manuscript will discuss a numerical method where the six equations of two-phase flow, the solid conduction equations, and the two equations that describe neutron diffusion and precursor concentration are solved together in a tightly coupled, nonlinear fashion for a simplified model of a nuclear reactor core. This approach has two important advantages. The first advantage is a higher level of accuracy. Because the equations are solved together in a single nonlinear system, the solution is more accurate than the traditional 'operator split' approach, where the two-phase flow equations are solved first, the heat conduction is solved second and the neutron diffusion is solved third, limiting the temporal accuracy to first order because the nonlinear coupling between the physics is handled explicitly. The second advantage of the method described in this manuscript is that the time step control in the fully implicit system can be based on the timescale of the solution rather than a stability-based time step restriction such as the material Courant limit. Results are presented from a simulated control rod movement and a rod ejection that address temporal accuracy for the fully coupled solution and demonstrate how the fastest timescale of the problem can change between the state variables of neutronics, conduction and two-phase flow during the course of a transient.
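
    As a toy sketch of the tightly coupled idea (hypothetical coefficients, not the paper's six-equation model): a neutron-level equation with temperature feedback and a lumped fuel-temperature equation are advanced with backward Euler, and a Newton iteration solves the coupled residual so the feedback is evaluated implicitly. An operator-split variant would instead evaluate the feedback with the old temperature, which is what limits the split scheme to first-order accuracy in time.

```python
import numpy as np

# Tightly coupled, fully implicit step for a toy two-physics system
# (all constants assumed; illustrative only).
beta, Lam, alpha = 0.0065, 1.0e-4, -1.0e-5   # kinetics and feedback constants (assumed)
h, C, T_ref = 0.1, 1.0, 300.0                # lumped heat-transfer constants (assumed)

def coupled_step(n_old, T_old, dt, rho_ext, tol=1e-9, max_iter=20):
    """One backward-Euler step with Newton iteration on the coupled residual."""
    n, T = n_old, T_old
    for _ in range(max_iter):
        rho = rho_ext + alpha * (T - T_ref)          # feedback at the NEW state (implicit)
        r = np.array([(n - n_old) / dt - (rho - beta) / Lam * n,
                      (T - T_old) / dt - (n - h * (T - T_ref)) / C])
        if np.linalg.norm(r) < tol:
            break
        J = np.array([[1.0 / dt - (rho - beta) / Lam, -alpha / Lam * n],
                      [-1.0 / C,                       1.0 / dt + h / C]])
        dn, dT = np.linalg.solve(J, -r)
        n, T = n + dn, T + dT
    return n, T

n, T = 1.0, T_ref
for _ in range(100):                                 # 0.1 s of a mild transient
    n, T = coupled_step(n, T, dt=1e-3, rho_ext=0.001)
print(f"n = {n:.4f}, T = {T:.3f} K")
```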

  8. Accuracy and Efficiency of a Coupled Neutronics and Thermal Hydraulics Model

    International Nuclear Information System (INIS)

    Vincent A. Mousseau; Michael A. Pope

    2007-01-01

    The accuracy requirements for modern nuclear reactor simulation are steadily increasing due to the cost and regulation of relevant experimental facilities. Because of the increase in the cost of experiments and the decrease in the cost of simulation, simulation will play a much larger role in the design and licensing of new nuclear reactors. Fortunately, as the workload of simulation increases, there are better physics models, new numerical techniques, and more powerful computer hardware that will enable modern simulation codes to handle the larger workload. This manuscript will discuss a numerical method where the six equations of two-phase flow, the solid conduction equations, and the two equations that describe neutron diffusion and precursor concentration are solved together in a tightly coupled, nonlinear fashion for a simplified model of a nuclear reactor core. This approach has two important advantages. The first advantage is a higher level of accuracy. Because the equations are solved together in a single nonlinear system, the solution is more accurate than the traditional 'operator split' approach, where the two-phase flow equations are solved first, the heat conduction is solved second and the neutron diffusion is solved third, limiting the temporal accuracy to first order because the nonlinear coupling between the physics is handled explicitly. The second advantage of the method described in this manuscript is that the time step control in the fully implicit system can be based on the timescale of the solution rather than a stability-based time step restriction such as the material Courant limit. Results are presented from a simulated control rod movement and a rod ejection that address temporal accuracy for the fully coupled solution and demonstrate how the fastest timescale of the problem can change between the state variables of neutronics, conduction and two-phase flow during the course of a transient.

  9. 49 CFR 556.9 - Public inspection of relevant information.

    Science.gov (United States)

    2010-10-01

    Title 49 (Transportation), 2010-10-01 edition, § 556.9, Public inspection of relevant information. ... NONCOMPLIANCE § 556.9 Public inspection of relevant information. Information relevant to a petition under this ... Administration, 400 Seventh Street, SW., Washington, DC 20590. Copies of available information may be obtained in ...

  10. 46 CFR 560.5 - Receipt of relevant information.

    Science.gov (United States)

    2010-10-01

    Title 46 (Shipping), 2010-10-01 edition, § 560.5, Receipt of relevant information. ... FOREIGN PORTS § 560.5 Receipt of relevant information. (a) In making its decision on matters arising under ... submissions should be supported by affidavits of fact and memorandum of law. Relevant information may include ...

  11. Aspect-based Relevance Learning for Image Retrieval

    NARCIS (Netherlands)

    M.J. Huiskes (Mark)

    2005-01-01

    We analyze the special structure of the relevance feedback learning problem, focusing particularly on the effects of image selection by partial relevance on the clustering behavior of feedback examples. We propose a scheme, aspect-based relevance learning, which guarantees that feedback

  12. Determination of material irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Cerles, J.M.; Mas, P.

    1978-01-01

    In this paper, the author reviews the main methods used to determine the nuclear parameters of material irradiation in testing reactors (nuclear power, burn-up, fluxes, fluences, ...). The different methods (theoretical or experimental) are reviewed: neutronics measurements and calculations, gamma scanning, thermal balance, ... The required accuracies are reviewed: they are 3-5% for flux, fluences, nuclear power, burn-up and conversion factor. These required accuracies are compared with the accuracies actually available, which are at present of the order of 5-20% for these parameters.

  13. Surgical accuracy of three-dimensional virtual planning

    DEFF Research Database (Denmark)

    Stokbro, Kasper; Aagaard, Esben; Torkov, Peter

    2016-01-01

    This retrospective study evaluated the precision and positional accuracy of different orthognathic procedures following virtual surgical planning in 30 patients. To date, no studies of three-dimensional virtual surgical planning have evaluated the influence of segmentation on positional accuracy ... and transverse expansion. Furthermore, only a few have evaluated the precision and accuracy of genioplasty in placement of the chin segment. The virtual surgical plan was compared with the postsurgical outcome by using three linear and three rotational measurements. The influence of maxillary segmentation
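
    As an aside, a minimal sketch (hypothetical landmark coordinates, not the study's measurement protocol) of how linear and rotational deviations between a planned and an achieved segment position can be obtained from a rigid least-squares fit:

```python
import numpy as np

# Hypothetical landmarks (mm) on a segment in the virtual plan, and the same
# landmarks after surgery (here generated with a small known rotation + shift).
planned = np.array([[0.0, 0.0, 0.0],
                    [10.0, 0.0, 0.0],
                    [0.0, 8.0, 0.0],
                    [0.0, 0.0, 5.0]])
achieved = planned @ np.array([[0.999, -0.035, 0.0],
                               [0.035,  0.999, 0.0],
                               [0.0,    0.0,   1.0]]).T + np.array([1.2, -0.8, 0.5])

def rigid_deviation(P, Q):
    """Kabsch-style fit: rotation R and translation t mapping P onto Q in a least-squares sense."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

R, t = rigid_deviation(planned, achieved)
# Total rotation angle; Euler angles could be extracted from R for the three rotational measures.
angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)))
print("translation (mm):", np.round(t, 2), " rotation (deg):", round(float(angle), 2))
```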

  14. Determination of fuel irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Mas, P.

    1977-01-01

    This paper reports on the present status of the main methods used to determine the nuclear parameters of fuel irradiation in testing reactors (nuclear power, burn-up, ...). The different methods (theoretical or experimental) are reviewed: neutron measurements and calculations, gamma scanning, heat balance, ... The required accuracies are reviewed: they are 3-5% for flux, fluences, nuclear power, burn-up and conversion factor. These required accuracies are compared with the accuracies actually available, which are at present of the order of 5-20% for these parameters.

  15. Analysis of spatial distribution of land cover maps accuracy

    Science.gov (United States)

    Khatami, R.; Mountrakis, G.; Stehman, S. V.

    2017-12-01

    Land cover maps have become one of the most important products of remote sensing science. However, classification errors will exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations in accuracy affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach to map accuracy assessment based on an error matrix does not capture this spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. The spectral domain was used as an explanatory feature space for classification accuracy interpolation for the first time in this research. Performance of the prediction methods was evaluated using 26 test blocks, with 10 km × 10 km dimensions, dispersed throughout the United States. The performance of the predictions was evaluated using the area under the curve (AUC) of the receiver operating characteristic. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements in AUC of 0.15 or greater. Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domain
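
    A rough sketch of the interpolation idea (coordinates and outcomes are made up; only the spatial domain with a Gaussian kernel is shown, and scikit-learn's roc_auc_score stands in for the paper's AUC evaluation):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Predict per-pixel classification correctness by Gaussian-kernel interpolation
# of reference-sample outcomes in the spatial domain, then score with AUC.
rng = np.random.default_rng(0)

ref_xy = rng.uniform(0, 10_000, size=(200, 2))            # reference pixels (m) in a 10 km block
ref_correct = rng.integers(0, 2, size=200).astype(float)  # 1 = classified correctly (made up)

new_xy = rng.uniform(0, 10_000, size=(500, 2))            # pixels whose accuracy we predict
new_correct = rng.integers(0, 2, size=500)                # held-out truth (made up)

def gaussian_interpolate(query, support, values, bandwidth=1_500.0):
    d2 = ((query[:, None, :] - support[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (w * values).sum(1) / w.sum(1)

pred = gaussian_interpolate(new_xy, ref_xy, ref_correct)
print("AUC:", round(roc_auc_score(new_correct, pred), 3))   # ~0.5 here, since the data are random
```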

  16. Channelized relevance vector machine as a numerical observer for cardiac perfusion defect detection task

    Science.gov (United States)

    Kalayeh, Mahdi M.; Marin, Thibault; Pretorius, P. Hendrik; Wernick, Miles N.; Yang, Yongyi; Brankov, Jovan G.

    2011-03-01

    In this paper, we present a numerical observer for image quality assessment, aiming to predict human observer accuracy in a cardiac perfusion defect detection task for single-photon emission computed tomography (SPECT). In medical imaging, image quality should be assessed by evaluating the human observer accuracy for a specific diagnostic task. This approach is known as task-based assessment. Such evaluations are important for optimizing and testing imaging devices and algorithms. Unfortunately, human observer studies with expert readers are costly and time-consuming. To address this problem, numerical observers have been developed as a surrogate for human readers to predict human diagnostic performance. The channelized Hotelling observer (CHO) with an internal noise model has been found to predict human performance well in some situations, but does not always generalize well to unseen data. We have argued in the past that finding a model to predict human observers could be viewed as a machine learning problem. Following this approach, in this paper we propose a channelized relevance vector machine (CRVM) to predict human diagnostic scores in a detection task. We have previously used channelized support vector machines (CSVM) to predict human scores and have shown that this approach offers better and more robust predictions than the classical CHO method. The comparison of the proposed CRVM with our previously introduced CSVM method suggests that CRVM can achieve similar generalization accuracy, while dramatically reducing model complexity and computation time.
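
    A hedged sketch of the channelized-learning idea (images, channels and human scores are all made up; difference-of-Gaussians channels and scikit-learn's ARDRegression serve as a stand-in for the relevance vector machine):

```python
import numpy as np
from sklearn.linear_model import ARDRegression

def make_channels(size=64, n_channels=6):
    """Rotationally symmetric difference-of-Gaussians channels (a common CHO-style choice)."""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    r = np.hypot(x, y)
    widths = np.geomspace(2.0, size / 2.0, n_channels + 1)
    return np.stack([np.exp(-(r / widths[i + 1]) ** 2) - np.exp(-(r / widths[i]) ** 2)
                     for i in range(n_channels)])

rng = np.random.default_rng(1)
channels = make_channels()
images = rng.normal(size=(100, 64, 64))          # stand-in for reconstructed SPECT slices
human_scores = rng.uniform(0, 1, size=100)       # stand-in for human observer ratings

# Channelize: project each image onto each channel template
features = images.reshape(100, -1) @ channels.reshape(len(channels), -1).T

model = ARDRegression().fit(features, human_scores)   # sparse Bayesian (RVM-like) regression
print("predicted scores, first 5 images:", model.predict(features[:5]).round(2))
```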

  17. Evaluation of relevant information for optimal reflector modeling through data assimilation procedures

    International Nuclear Information System (INIS)

    Argaud, J.P.; Bouriquet, B.; Clerc, T.; Lucet-Sanchez, F.; Poncot, A.

    2015-01-01

    The goal of this study is to determine the amount of information needed to obtain a relevant parameter optimisation by data assimilation for physical models in neutronic diffusion calculations, and to determine which information best reaches the optimum accuracy at the lowest cost. To evaluate the quality of the optimisation, we study the covariance matrix that represents the accuracy of the optimised parameter. This matrix is a classical output of the data assimilation procedure, and it is the main source of information about the accuracy and sensitivity of the optimal parameter determination. We present results from the field of neutronic simulation for a PWR-type reactor. We seek to optimise the reflector parameters that characterise the neutronic reflector surrounding the whole reactive core. On the basis of the configurations studied, it has been shown that with data assimilation we can determine a global strategy to optimise the quality of the result with respect to the amount of information provided. The consequence is a cost reduction in terms of measurement and/or computing time compared with the basic approach. Another result is that using multi-campaign data rather than data from a single campaign significantly improves the efficiency of parameter optimisation.
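
    To illustrate the quantity being examined, here is a minimal linear-Gaussian sketch (all matrices hypothetical) of the analysis-error covariance A = (B^-1 + H^T R^-1 H)^-1; its diagonal shows how strongly a given set of measurements constrains the optimised parameters:

```python
import numpy as np

# Analysis-error covariance of a linear-Gaussian data assimilation step.
B = np.diag([0.10, 0.10])          # background covariance of two reflector parameters (assumed)
R = 0.05 * np.eye(3)               # observation-error covariance (assumed)
H = np.array([[1.0, 0.0],          # sensitivity of each measurement to the parameters (assumed)
              [0.0, 1.0],
              [0.5, 0.5]])

A = np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)
print("prior     std devs:", np.sqrt(np.diag(B)).round(3))
print("posterior std devs:", np.sqrt(np.diag(A)).round(3))
```

    Adding or removing rows of H (i.e., measurement campaigns) and recomputing A is one way to compare how much each configuration of information buys in optimisation accuracy.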

  18. Stochastic Optimized Relevance Feedback Particle Swarm Optimization for Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    2014-01-01

    One of the major challenges for CBIR is to bridge the gap between low-level features and high-level semantics according to the needs of the user. To overcome this gap, relevance feedback (RF) coupled with support vector machines (SVM) has been applied successfully. However, when the feedback sample is small, the performance of SVM-based RF is often poor. To improve the performance of RF, this paper proposes a new technique, PSO-SVM-RF, which combines SVM-based RF with particle swarm optimization (PSO). The aims of this technique are to enhance the performance of SVM-based RF and to minimize user interaction with the system by reducing the number of feedback rounds. PSO-SVM-RF was tested on the Corel photo gallery containing 10908 images. The results showed that the proposed PSO-SVM-RF achieved 100% accuracy in 8 feedback iterations for the top 10 retrievals and 80% accuracy in 6 iterations for the top 100 retrievals. This implies that the PSO-SVM-RF technique achieves a high accuracy rate within a small number of iterations.
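
    A rough reconstruction of the idea (not the authors' implementation; the feature vectors and relevance labels below are made up): particle swarm optimization searches the SVM hyperparameters so that the classifier separates the user's relevance feedback as well as possible.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(2)
feats = rng.normal(size=(40, 16))        # feature vectors of feedback images (made up)
labels = rng.integers(0, 2, size=40)     # 1 = marked relevant by the user (made up)

def fitness(p):
    """Cross-validated accuracy of an SVM with hyperparameters (log10 C, log10 gamma)."""
    clf = SVC(C=10.0 ** p[0], gamma=10.0 ** p[1])
    return cross_val_score(clf, feats, labels, cv=3).mean()

n_particles, dims = 10, 2
pos = rng.uniform(-2, 2, size=(n_particles, dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(15):                      # PSO iterations
    r1, r2 = rng.random((2, n_particles, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -3, 3)
    vals = np.array([fitness(p) for p in pos])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmax()].copy()

print("best (log10 C, log10 gamma):", gbest.round(2), " cv accuracy:", pbest_val.max().round(2))
```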

  19. Accuracy of Replicating Static Torque and its Effect on Shooting Accuracy in Young Basketball Players

    Directory of Open Access Journals (Sweden)

    Struzik Artur

    2014-12-01

    Purpose. Accurate shooting in basketball is a prerequisite for success. Coordination ability, one of the abilities that determine the repeatability of accurate shooting, is based on kinesthetic differentiation. The aim of the study was to evaluate the strength component of kinesthetic differentiation ability and determine its relationship with shooting accuracy. Methods. Peak muscle torque of the elbow extensors under static conditions was measured in 12 young basketball players. Participants then reproduced the same movement at a perceived magnitude of 25%, 50%, and 75% of static peak torque, with error scores calculated as a measure of kinesthetic differentiation. The results were compared with players’ field goal percentages calculated during game play in a regional championship. Results. No statistically significant relationships were found between the level of kinesthetic differentiation ability and field goal percentage. Additionally, no upper limb asymmetry was found in the sample. Conclusions. The relatively high levels of elbow static peak torque suggest the importance of upper limb strength in contemporary basketball. The lack of a statistically significant difference between the right and left limbs decreases the risk of suffering injury. It is likely that choosing other suitable tests would demonstrate the relationships between field goal percentage and kinesthetic differentiation ability.
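
    A small made-up example of how the torque-reproduction error score and its association with field goal percentage could be computed (all torques and percentages below are hypothetical):

```python
import numpy as np
from scipy.stats import spearmanr

peak = np.array([62.0, 55.0, 70.0, 58.0])            # static peak torque per player (N*m)
targets = np.array([0.25, 0.50, 0.75])               # required fractions of peak torque
reproduced = np.array([[17.0, 33.0, 49.0],           # reproduced torques (N*m)
                       [12.0, 30.0, 45.0],
                       [20.0, 38.0, 50.0],
                       [16.0, 27.0, 46.0]])
fg_pct = np.array([44.0, 39.0, 47.0, 41.0])          # field goal percentage per player

# Error score: mean absolute deviation (in percentage points) from the target fractions
error_score = np.abs(reproduced / peak[:, None] - targets).mean(axis=1) * 100

rho, p = spearmanr(error_score, fg_pct)
print("error scores (pp):", error_score.round(1),
      " Spearman rho:", round(float(rho), 2), " p:", round(float(p), 3))
```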

  20. Making academic research more relevant: A few suggestions

    Directory of Open Access Journals (Sweden)

    Abinash Panda

    2014-09-01

    Academic research in the domain of management scholarship, though steeped in scientific and methodological rigour, is generally found to be of little relevance to practice. The authors of this paper have revisited the rigour-relevance debate in light of recent developments and with special reference to the management research scenario in India. The central thesis of the argument is that the gulf between rigour and relevance needs to be bridged to make academic research more relevant to business organizations and practitioners. They have offered some suggestions to enhance the relevance of academic research to practice.