WorldWideScience

Sample records for methods computer readable

  1. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method further comprises determining a first connection element of the first construction element and a second connection element of the second construction element located in a predetermined proximity of each other; and retrieving connectivity information of the corresponding connection types of the first…
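
    The record above describes, in patent language, a data structure for construction elements plus a proximity-and-connection-type check. The sketch below is a hypothetical encoding of that idea; the class names, the "stud"/"tube" connection types, and the connectivity table are illustrative assumptions, not taken from the patent.

    ```python
    from dataclasses import dataclass
    from itertools import product

    # Hypothetical encoding of the abstract's data structures: each construction
    # element carries connection elements, each with a position and a predetermined
    # connection type; pairs closer than a proximity threshold are checked against
    # a connectivity table. Names and the rule table are illustrative only.
    @dataclass
    class ConnectionElement:
        kind: str        # predetermined connection type
        pos: tuple       # (x, y, z) in model coordinates

    @dataclass
    class ConstructionElement:
        connections: list        # list of ConnectionElement

    CONNECTIVITY = {("stud", "tube"): True, ("tube", "stud"): True}

    def compatible_pairs(a, b, proximity=1.0):
        """Yield connection-element pairs of a and b that lie within the
        proximity threshold and whose connection types may connect."""
        for ca, cb in product(a.connections, b.connections):
            dist = sum((p - q) ** 2 for p, q in zip(ca.pos, cb.pos)) ** 0.5
            if dist <= proximity and CONNECTIVITY.get((ca.kind, cb.kind), False):
                yield ca, cb

    brick1 = ConstructionElement([ConnectionElement("stud", (0.0, 0.0, 1.0))])
    brick2 = ConstructionElement([ConnectionElement("tube", (0.0, 0.0, 1.5))])
    print(list(compatible_pairs(brick1, brick2)))
    ```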

  2. Systems, computer-implemented methods, and tangible computer-readable storage media for wide-field interferometry

    Science.gov (United States)

    Lyon, Richard G. (Inventor); Leisawitz, David T. (Inventor); Rinehart, Stephen A. (Inventor); Memarsadeghi, Nargess (Inventor)

    2012-01-01

    Disclosed herein are systems, computer-implemented methods, and tangible computer-readable storage media for wide field imaging interferometry. The method includes for each point in a two dimensional detector array over a field of view of an image: gathering a first interferogram from a first detector and a second interferogram from a second detector, modulating a path-length for a signal from an image associated with the first interferogram in the first detector, overlaying first data from the modulated first detector and second data from the second detector, and tracking the modulating at every point in a two dimensional detector array comprising the first detector and the second detector over a field of view for the image. The method then generates a wide-field data cube based on the overlaid first data and second data for each point. The method can generate an image from the wide-field data cube.

  3. Methods to Measure Map Readability

    OpenAIRE

    Harrie, Lars

    2009-01-01

    Creation of maps in real-time web services introduces challenges concerning map readability. Therefore, we must introduce analytical measures to control readability. The aim of this study is to develop and evaluate analytical readability measures with the help of user tests.

  4. Computer-readable "Nuclear Data Sheets"

    International Nuclear Information System (INIS)

    Ewbank, W.B.

    1975-01-01

    The evaluated nuclear structure data contained in "Nuclear Data Sheets" are available in computer-readable form. Experimentally established properties of nuclear levels are included, as well as radiations from nuclear reactions and radioactive decay. Portions of the data can be selected for distribution in several formats on magnetic tape or computer cards. A variety of different listing and drawing formats are also available. 4 figures

  5. Systems, methods and computer-readable media for modeling cell performance fade of rechargeable electrochemical devices

    Science.gov (United States)

    Gering, Kevin L

    2013-08-27

    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the electrochemical cell, and analyzes the mechanistic level model to estimate performance fade characteristics over aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model is also based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing a second exchange current density.

  6. Systems, methods and computer-readable media to model kinetic performance of rechargeable electrochemical devices

    Science.gov (United States)

    Gering, Kevin L.

    2013-01-01

    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics. The computing system also analyzes the cell information of the electrochemical cell with a Butler-Volmer (BV) expression modified to determine exchange current density of the electrochemical cell by including kinetic performance information related to pulse-time dependence, electrode surface availability, or a combination thereof. A set of sigmoid-based expressions may be included with the modified-BV expression to determine kinetic performance as a function of pulse time. The determined exchange current density may be used with the modified-BV expression, with or without the sigmoid expressions, to analyze other characteristics of the electrochemical cell. Model parameters can be defined in terms of cell aging, making the overall kinetics model amenable to predictive estimates of cell kinetic performance along the aging timeline.
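
    The modified Butler-Volmer expression itself is not given in the abstract above. As background only, the sketch below implements the standard Butler-Volmer relation and a linearized recovery of exchange current density from small overpotential pulses; all parameter values are made up for illustration.

    ```python
    import numpy as np

    # Standard Butler-Volmer relation (context only; the patent's modified
    # expression is not given in the abstract):
    #   i = i0 * (exp(alpha_a*F*eta/(R*T)) - exp(-alpha_c*F*eta/(R*T)))
    F, R = 96485.332, 8.314462      # Faraday (C/mol), gas constant (J/(mol*K))

    def butler_volmer_current(eta, i0, alpha_a=0.5, alpha_c=0.5, T=298.15):
        """Current density for overpotential eta (V) and exchange current
        density i0 (same units as the returned current density)."""
        return i0 * (np.exp(alpha_a * F * eta / (R * T))
                     - np.exp(-alpha_c * F * eta / (R * T)))

    # Recover i0 from small pulses via the linearized form
    # i ~= i0 * (alpha_a + alpha_c) * F * eta / (R * T).
    eta = np.array([-0.01, -0.005, 0.005, 0.01])    # pulse overpotentials (V)
    i = butler_volmer_current(eta, i0=2.0)          # simulated measurements
    slope = np.polyfit(eta, i, 1)[0]
    print("estimated i0:", round(slope * R * 298.15 / F, 3))  # ~2.0
    ```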

  7. Method, system and computer-readable media for measuring impedance of an energy storage device

    Science.gov (United States)

    Morrison, John L.; Morrison, William H.; Christophersen, Jon P.; Motloch, Chester G.

    2016-01-26

    A real-time battery impedance spectrum is acquired using a one-time record. Fast Summation Transformation (FST) is a parallel method of acquiring the spectrum from that one-time record, enabling battery diagnostics. The excitation current to the battery is a sum of equal-amplitude sine waves at frequencies that are octave harmonics spread over the range of interest. The sample frequency is also octave- and harmonically related to all frequencies in the sum. The time profile of this sampled signal has a duration of a few periods of the lowest frequency. The voltage response of the battery, with its average removed, represents the impedance of the battery in the time domain. Since the excitation frequencies are known and octave-harmonically related, a simple algorithm, FST, processes the time profile by rectifying it relative to the sine and cosine of each frequency. A further algorithm then yields the real and imaginary components at each frequency.
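
    The following is not the patented FST implementation, only a simplified illustration of the scheme the abstract describes: a sum of equal-amplitude sines at octave-harmonic frequencies, a record a few periods of the lowest frequency long, average removal, and per-frequency correlation against sine and cosine to obtain real and imaginary impedance components. The toy impedance model and all numbers are assumptions.

    ```python
    import numpy as np

    # Simplified illustration (not the patented FST code): excite with a sum of
    # equal-amplitude sine waves at octave-harmonic frequencies, then recover the
    # response at each frequency by correlating with its sine and cosine.
    f0, n_octaves = 0.1, 6                      # lowest frequency (Hz), octave count
    freqs = f0 * 2.0 ** np.arange(n_octaves)    # 0.1, 0.2, 0.4, ... Hz
    fs = 4 * freqs[-1]                          # sample rate, octave-related
    t = np.arange(0, 3 / f0, 1 / fs)            # a few periods of the lowest frequency

    excitation = sum(np.sin(2 * np.pi * f * t) for f in freqs)  # drive current

    # Fake a battery-like voltage response: each excitation frequency is scaled
    # and phase-shifted by a toy impedance (ohms).
    z_true = 0.05 + 0.02 / (1 + 1j * freqs / 0.5)
    response = sum(abs(z) * np.sin(2 * np.pi * f * t + np.angle(z))
                   for f, z in zip(freqs, z_true))
    response -= response.mean()                 # "average deleted"

    for f, z in zip(freqs, z_true):
        re_part = 2 / len(t) * np.sum(response * np.sin(2 * np.pi * f * t))
        im_part = 2 / len(t) * np.sum(response * np.cos(2 * np.pi * f * t))
        print(f"{f:5.2f} Hz: Z ~ {re_part:+.4f}{im_part:+.4f}j (true {z:.4f})")
    ```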

  8. Laser readable thermoluminescent radiation dosimeters and methods for producing thereof

    International Nuclear Information System (INIS)

    Braunlich, P.F.; Tetzlaff, W.

    1989-01-01

    Thin layer thermoluminescent radiation dosimeters for use in laser readable dosimetry systems, and methods of fabricating such thin layer dosimeters are disclosed. The thin layer thermoluminescent radiation dosimeters include a thin substrate made from glass or other inorganic materials capable of withstanding high temperatures and high heating rates. A thin layer of a thermoluminescent phosphor material is heat bonded to the substrate using an inorganic binder such as glass. The dosimeters can be mounted in frames and cases for ease in handling. Methods of the invention include mixing a suitable phosphor composition and binder, both being in particulate or granular form. The mixture is then deposited onto a substrate such as by using mask printing techniques. The dosimeters are thereafter heated to fuse and bond the binder and phosphor to the substrate. 34 figs

  9. Informed consent recall and comprehension in orthodontics: traditional vs improved readability and processability methods.

    Science.gov (United States)

    Kang, Edith Y; Fields, Henry W; Kiyak, Asuman; Beck, F Michael; Firestone, Allen R

    2009-10-01

    Low general and health literacy in the United States means informed consent documents are not well understood by most adults. Methods to improve recall and comprehension of informed consent have not been tested in orthodontics. The purposes of this study were to evaluate (1) recall and comprehension among patients and parents by using the American Association of Orthodontists' (AAO) informed consent form and new forms incorporating improved readability and processability; (2) the association between reading ability, anxiety, and sociodemographic variables and recall and comprehension; and (3) how various domains (treatment, risk, and responsibility) of information are affected by the forms. Three treatment groups (30 patient-parent pairs in each) received an orthodontic case presentation and either the AAO form, an improved readability form (MIC), or an improved readability and processability (pairing audio and visual cues) form (MIC + SS). Structured interviews were transcribed and coded to evaluate recall and comprehension. Significant relationships among patient-related variables and recall and comprehension explained little of the variance. The MIC + SS form significantly improved patient recall and parent recall and comprehension. Recall was better than comprehension, and parents performed better than patients. The MIC + SS form significantly improved patient treatment comprehension and risk recall and parent treatment recall and comprehension. Patients and parents both overestimated their understanding of the materials. Improving the readability of consent materials made little difference, but combining improved readability and processability benefited both patients' recall and parents' recall and comprehension compared with the AAO form.

  10. Computer-Based Readability Testing of Information Booklets for German Cancer Patients.

    Science.gov (United States)

    Keinki, Christian; Zowalla, Richard; Pobiruchin, Monika; Huebner, Jutta; Wiesner, Martin

    2018-04-12

    Understandable health information is essential for treatment adherence and improved health outcomes. For readability testing, several instruments analyze the complexity of sentence structures, e.g., Flesch-Reading Ease (FRE) or Vienna-Formula (WSTF). Moreover, the vocabulary is of high relevance for readers. The aim of this study is to investigate the agreement of sentence structure and vocabulary-based (SVM) instruments. A total of 52 freely available German patient information booklets on cancer were collected from the Internet. The mean understandability level L was computed for 51 booklets. The resulting values of FRE, WSTF, and SVM were assessed pairwise for agreement with Bland-Altman plots and two-sided, paired t tests. For the pairwise comparison, the mean L values are L_FRE = 6.81, L_WSTF = 7.39, and L_SVM = 5.09. The sentence structure-based metrics gave significantly different scores (P < 0.001) for all assessed booklets, confirmed by the Bland-Altman analysis. The study findings suggest that vocabulary-based instruments cannot be interchanged with FRE/WSTF. However, both analytical aspects should be considered and checked by authors to linguistically refine texts with respect to the individual target group. Authors of health information can be supported by automated readability analysis. Health professionals can benefit by direct booklet comparisons allowing for time-effective selection of suitable booklets for patients.
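
    For orientation, the Bland-Altman agreement check used in this study reduces to a mean difference (bias) between two instruments' scores plus limits of agreement at about ±1.96 standard deviations; a minimal sketch with invented booklet scores:

    ```python
    import numpy as np

    # Bland-Altman agreement between two readability instruments; the booklet
    # scores below are invented for illustration.
    l_fre = np.array([6.5, 7.0, 6.8, 7.4, 6.2, 7.1])    # e.g. L_FRE per booklet
    l_wstf = np.array([7.1, 7.6, 7.3, 8.0, 6.9, 7.7])   # e.g. L_WSTF per booklet

    diff = l_fre - l_wstf
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)      # limits-of-agreement half-width
    print(f"bias = {bias:.2f}, limits of agreement = "
          f"[{bias - loa:.2f}, {bias + loa:.2f}]")
    ```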

  11. Lattice Boltzmann method fundamentals and engineering applications with computer codes

    CERN Document Server

    Mohamad, A A

    2014-01-01

    Introducing the Lattice Boltzmann Method in a readable manner, this book provides detailed examples with complete computer codes. It avoids the most complicated mathematics and physics without sacrificing the basic fundamentals of the method.

  12. A Practical Method to Increase the Frequency Readability for Vibration Signals

    Directory of Open Access Journals (Sweden)

    Jean Loius Ntakpe

    2016-10-01

    Damage detection and nondestructive evaluation of mechanical and civil engineering structures are nowadays very important to assess the integrity and ensure the reliability of structures. Thus, frequency evaluation becomes a crucial issue, since this modal parameter is mainly used in structural integrity assessment. The study presented here highlights the possibility of increasing the frequency readability by employing a simple and cost-effective method.

  13. Method and computer program product for maintenance and modernization backlogging

    Science.gov (United States)

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
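
    Since the stated relation is a plain sum of the three time-period-specific terms, a sketch is short; the function and variable names below are illustrative rather than taken from the patent.

    ```python
    # Minimal sketch of the stated relation; names are illustrative, not from
    # the patent: future facility condition = time-period-specific maintenance
    # cost + modernization factor + backlog factor.
    def future_facility_conditions(maintenance_cost: float,
                                   modernization_factor: float,
                                   backlog_factor: float) -> float:
        return maintenance_cost + modernization_factor + backlog_factor

    print(future_facility_conditions(120_000.0, 35_000.0, 18_500.0))  # 173500.0
    ```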

  14. Computational Text Analysis: A More Comprehensive Approach to Determine Readability of Reading Materials

    Science.gov (United States)

    Aziz, Anealka; Fook, Chan Yuen; Alsree, Zubaida

    2010-01-01

    Reading materials are considered to have high readability if readers are interested in reading the materials, understand the content of the materials, and are able to read the materials fluently. In contrast, reading materials with low readability discourage readers from reading the materials, create difficulties for readers to understand the content of…

  15. Constructing and validating readability models: the method of integrating multilevel linguistic features with machine learning.

    Science.gov (United States)

    Sung, Yao-Ting; Chen, Ju-Ling; Cha, Ji-Her; Tseng, Hou-Chiang; Chang, Tao-Hsing; Chang, Kuo-En

    2015-06-01

    Multilevel linguistic features have been proposed for discourse analysis, but there have been few applications of multilevel linguistic features to readability models and also few validations of such models. Most traditional readability formulae are based on generalized linear models (GLMs; e.g., discriminant analysis and multiple regression), but these models have to comply with certain statistical assumptions about data properties and include all of the data in formulae construction without pruning the outliers in advance. The use of such readability formulae tends to produce a low text classification accuracy, while using a support vector machine (SVM) in machine learning can enhance the classification outcome. The present study constructed readability models by integrating multilevel linguistic features with SVM, which is more appropriate for text classification. Taking the Chinese language as an example, this study developed 31 linguistic features as the predicting variables at the word, semantic, syntax, and cohesion levels, with grade levels of texts as the criterion variable. The study compared four types of readability models by integrating unilevel and multilevel linguistic features with GLMs and an SVM. The results indicate that adopting a multilevel approach in readability analysis provides a better representation of the complexities of both texts and the reading comprehension process.
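
    A minimal sketch of the general recipe, assuming scikit-learn: a feature matrix standing in for the multilevel linguistic features, grade-level labels, and a cross-validated comparison of an SVM against a GLM-style baseline. The study's 31 Chinese-specific features are not reproduced; the data here are synthetic placeholders.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for the study's setup: rows are texts, columns are
    # multilevel linguistic features, and labels are grade levels derived from
    # a hidden "difficulty" direction so the task is learnable.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 31))                    # 200 texts x 31 features
    w = rng.normal(size=31)                           # hidden difficulty direction
    y = np.digitize(X @ w, np.quantile(X @ w, [0.2, 0.4, 0.6, 0.8]))  # 5 grades

    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    glm = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean().round(3))
    print("GLM accuracy:", cross_val_score(glm, X, y, cv=5).mean().round(3))
    ```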

  16. The Readability of Principles of Macroeconomics Textbooks

    Science.gov (United States)

    Tinkler, Sarah; Woods, James

    2013-01-01

    The authors evaluated principles of macroeconomics textbooks for readability using Coh-Metrix, a computational linguistics tool. Additionally, they conducted an experiment on Amazon's Mechanical Turk Web site in which participants ranked the readability of text samples. There was a wide range of scores on readability indexes both among…

  17. Mobile computing with special reference to readability task under the impact of vibration, colour combination and gender.

    Science.gov (United States)

    Mallick, Zulquernain; Siddiquee, Arshad Noor; Haleem, Abid

    2008-12-01

    The last 20 years have seen a tremendous growth in the field of computing with special reference to mobile computing. Ergonomic issues pertaining to this theme remains unexplored. With special reference to readability in mobile computing, an experimental research was conducted to study the gender effect on human performance under the impact of vibration in a human computer interaction environment. Fourteen subjects (7 males and 7 females) participated in the study. Three independent variables, namely gender, level of vibration and screen text/background colour, were selected for the experimental investigation while the dependent variable was the number of characters read per minute. The data collected were analyzed statistically through an experimental design for repeated measures. Results indicated that gender as an organismic variable, the level of vibration and screen text/background colour revealed statistically significant differences. However, the second order interaction was found to be statistically non-significant. These findings are discussed in light of the previous studies undertaken on the topic.

  18. How Readability and Topic Incidence Relate to Performance on Mathematics Story Problems in Computer-Based Curricula

    Science.gov (United States)

    Walkington, Candace; Clinton, Virginia; Ritter, Steven N.; Nathan, Mitchell J.

    2015-01-01

    Solving mathematics story problems requires text comprehension skills. However, previous studies have found few connections between traditional measures of text readability and performance on story problems. We hypothesized that recently developed measures of readability and topic incidence measured by text-mining tools may illuminate associations…

  19. Emotioncy: A Potential Measure of Readability

    Science.gov (United States)

    Pishghadamn, Reza; Abbasnejad, Hannaheh

    2016-01-01

    Given the deficiencies of readability formulae as reliable tools for measuring text readability in educational settings, this study aims to offer a new measure to improve the current methods of testing the readability levels of texts through the incorporation of the newly-developed concept of emotioncy. To this end, a group of 221 students were…

  20. Readability versus Leveling.

    Science.gov (United States)

    Fry, Edward

    2002-01-01

    Shows some similarities and differences between readability formulas and leveling procedures and reports some current large-scale uses of readability formulas. Presents a dictionary definition of readability and leveling, and a history and background of readability and leveling. Discusses what goes into determining readability and leveling scores.…

  1. Readability of the written study information in pediatric research in France.

    Directory of Open Access Journals (Sweden)

    Véronique Ménoni

    BACKGROUND: The aim was to evaluate the readability of research information leaflets (RIL) for minors asked to participate in biomedical research studies and to assess the factors influencing this readability. METHODS AND FINDINGS: All the pediatric protocols from three French pediatric clinical research units were included (N = 104). Three criteria were used to evaluate readability: length of the text, Flesch's readability score, and presence of illustrations. We compared the readability of RIL to texts specifically written for children (school textbooks, school exams, or extracts from literary works). We assessed the effect of protocol characteristics on readability. The RIL had a median length of 608 words [350 words, 25th percentile; 1005 words, 75th percentile], corresponding to two pages. The readability of the RIL, with a median Flesch score of 40 [30; 47], was much poorer than that of pediatric reference texts, with a Flesch score of 67 [60; 73]. A small proportion of RIL (13/91; 14%) were illustrated. The RIL were longer (p<0.001), more readable (p<0.001), and more likely to be illustrated (p<0.009) for industrial than for institutional sponsors. CONCLUSION: Researchers should routinely compute the reading ease of study information sheets and make greater efforts to improve the readability of written documents for potential participants.
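
    Computing Flesch reading ease, as the conclusion recommends, is mechanical. The sketch below uses the classic English constants and a rough vowel-group syllable counter; the study above scored French texts, whose language-adapted constants are not shown here.

    ```python
    import re

    def count_syllables(word: str) -> int:
        """Very rough English syllable count via vowel groups."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        """Classic English Flesch formula:
        206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        if not words:
            return 0.0
        syllables = sum(count_syllables(w) for w in words)
        return (206.835 - 1.015 * (len(words) / sentences)
                - 84.6 * (syllables / len(words)))

    print(round(flesch_reading_ease(
        "The cat sat on the mat. It was warm. The sun was out."), 1))
    ```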

  2. The Principles of Readability

    Science.gov (United States)

    DuBay, William H.

    2004-01-01

    The principles of readability are in every style manual. Readability formulas are in every writing aid. What is missing is the research and theory on which they stand. This short review of readability research spans 100 years. The first part covers the history of adult literacy studies in the U.S., establishing the stratified nature of the adult…

  3. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed, but what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  4. Computer aided analysis of additional chromosome aberrations in Philadelphia chromosome positive acute lymphoblastic leukaemia using a simplified computer readable cytogenetic notation

    Directory of Open Access Journals (Sweden)

    Mohr Brigitte

    2003-01-01

    Abstract Background The analysis of complex cytogenetic databases of distinct leukaemia entities may help to detect rare recurring chromosome aberrations, minimal common regions of gains and losses, and also hot spots of genomic rearrangements. The patterns of the karyotype alterations may provide insights into the genetic pathways of disease progression. Results We developed a simplified computer readable cytogenetic notation (SCCN) by which chromosome findings are normalised at a resolution of 400 bands. Lost or gained chromosomes or chromosome segments are specified in detail, and ranges of chromosome breakpoint assignments are recorded. Software modules were written to summarise the recorded chromosome changes with regard to the respective chromosome involvement. To assess the degree of karyotype alterations, the ploidy levels and numbers of numerical and structural changes were recorded separately and summarised in a complex karyotype aberration score (CKAS). The SCCN and CKAS were used to analyse the extent and the spectrum of additional chromosome aberrations in 94 patients with Philadelphia chromosome positive (Ph-positive) acute lymphoblastic leukemia (ALL) and secondary chromosome anomalies. Dosage changes of chromosomal material represented 92.1% of all additional events. Recurring regions of chromosome losses were identified. Structural rearrangements affecting (peri)centromeric chromosome regions were recorded in 24.6% of the cases. Conclusions SCCN and CKAS provide unifying elements between karyotypes and computer processable data formats. They proved to be useful in the investigation of additional chromosome aberrations in Ph-positive ALL, and may represent a step towards full automation of the analysis of large and complex karyotype databases.

  5. Available Methods in Farsi-English Cross Language Information Retrieval Using Machine-readable, Bilingual Glossary

    Directory of Open Access Journals (Sweden)

    Hamid Alizadeh

    2009-12-01

    In this paper, the impact scope of Natural Language Processing (NLP) on translating search statements was determined by testing research hypotheses. The NLP techniques employed for search-statement processing included text parsing, identification of linguistic forms, stopword removal, morphological analysis, and tokenization. Examination of the hypotheses indicated that translating only the first equivalent term selected, versus selecting all equivalent terms, would contribute to increased retrieval efficiency; that while morphological analysis of terms not translated by the glossary would increase the retrieval precision cutoff, the absence of such analysis produced no significant difference; and that sentence translation, as opposed to term-by-term translation, would increase the efficiency of Farsi-English retrieval. Other findings are also presented.

  6. A Machine Learning Approach to Measurement of Text Readability for EFL Learners Using Various Linguistic Features

    Science.gov (United States)

    Kotani, Katsunori; Yoshimi, Takehiko; Isahara, Hitoshi

    2011-01-01

    The present paper introduces and evaluates a readability measurement method designed for learners of EFL (English as a foreign language). The proposed readability measurement method (a regression model) estimates the text readability based on linguistic features, such as lexical, syntactic and discourse features. Text readability refers to the…

  7. Analyzing readability of medicines information material in Slovenia

    Science.gov (United States)

    Kasesnik, Karin; Kline, Mihael

    2011-01-01

    Objective: Readability has been claimed to be an important factor for understanding texts describing health symptoms and medications. Such texts may indirectly affect the health of the population. Despite the expertise of physicians, the readability of information sources may be important for acquiring essential treatment information. The aim of this study was to measure the readability level of medicines promotion material in Slovenia. Methods: The Flesch readability formula was modified to suit Slovene texts. After the Slovene readability algorithm was determined, the reading ease and corresponding readability grade levels of different Slovene texts were established. To estimate how well the texts matched the recommended readability grade level of the targeted population, reference readability values of English texts were set. One-sample t-tests and standard deviations from the arithmetic mean were used as statistical tests. Results: The research showed low readability scores for the Slovene texts. Difficult readability values were seen in the different types of texts examined: patient information leaflets, summaries of product characteristics, promotional materials, descriptions of over-the-counter medications, and materials for creating disease awareness. Especially low readability values were found in promotional materials intended for physicians. None of the researched items, not even those for the general public, came close to primary-school-grade readability levels and therefore could not be described as easily readable. Conclusion: This study provides an understanding of the readability level of selected Slovene medicines information material. It was concluded that the health-related texts were compliant neither with general public nor with healthcare professional needs. PMID:23093886

  8. Validation Study of Waray Text Readability Instrument

    Science.gov (United States)

    Oyzon, Voltaire Q.; Corrales, Juven B.; Estardo, Wilfredo M., Jr.

    2015-01-01

    In 2012 the Leyte Normal University developed a computer software--modelled after the Spache Readability Formula (1953) made for English--made to help rank texts that can is used by teachers or research groups on selecting appropriate reading materials to support the DepEd's MTB-MLE program in Region VIII, in the Philippines. However,…

  9. Readability and Reading Ability.

    Science.gov (United States)

    Wright, Benjamin D.; Stenner, A. Jackson

    This document discusses the measurement of reading ability and the readability of books by application of the Lexile framework. It begins by stating the importance of uniform measures. It then discusses the history of reading ability testing, based on the assumption that no researcher has been able to measure more than one kind of reading ability.…

  10. Readability of Wikipedia

    NARCIS (Netherlands)

    Lucassen, T.; Dijkstra, Roald; Schraagen, Johannes Martinus Cornelis

    2012-01-01

    Wikipedia is becoming widely acknowledged as a reliable source of encyclopedic information. However, concerns have been expressed about its readability. Wikipedia articles might be written in a language too difficult to be understood by most of its visitors. In this study, we apply the Flesch

  11. Readability of Malaria Medicine Information Leaflets in Nigeria ...

    African Journals Online (AJOL)

    Purpose: To assess the readability of malaria medicines information leaflets available in Nigeria. Methods: Forty-five leaflets were assessed using the Simplified Measure of Gobbledygook (SMOG) readability test and by examining them for paper type, font size and type, use of symbols and pictograms, and bilingual information ...

  12. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
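
    To make the formulation concrete: a Nash equilibrium can be characterized as a global minimum, with value zero, of a nonnegative regret function. The sketch below builds that function for a two-strategy bimatrix game (matching pennies) and minimizes it with plain multistart Nelder-Mead, substituted for the paper's CMA-ES/PSO/differential evolution to keep the example dependency-light.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # A Nash equilibrium as the global minimum (value zero) of a nonnegative
    # regret function, here for matching pennies. Multistart Nelder-Mead is
    # substituted for the paper's CMA-ES/PSO/differential evolution.
    A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs
    B = -A                                     # column player's payoffs

    def regret(x):
        p, q = np.clip(x, 0.0, 1.0)
        row, col = np.array([p, 1 - p]), np.array([q, 1 - q])
        u1, u2 = row @ A @ col, row @ B @ col
        dev = [max(0.0, (A @ col)[k] - u1) for k in range(2)]   # row deviations
        dev += [max(0.0, (row @ B)[k] - u2) for k in range(2)]  # column deviations
        return sum(d * d for d in dev)         # zero exactly at an equilibrium

    rng = np.random.default_rng(1)
    best = min((minimize(regret, rng.random(2), method="Nelder-Mead")
                for _ in range(20)), key=lambda res: res.fun)
    print("approximate equilibrium (p, q):", np.clip(best.x, 0, 1).round(3))
    ```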

  13. Method for computed tomography

    International Nuclear Information System (INIS)

    Wagner, W.

    1980-01-01

    In transversal computed tomography apparatus in which the positioning zone, where the patient can be placed, is larger than the scanning zone, where a body slice can be scanned, reconstruction errors are liable to occur. These errors are caused by incomplete irradiation of the body during examination. They not only produce an incorrect image of the non-irradiated area but also adversely affect the image of the other, completely irradiated areas. The invention enables these errors to be reduced.

  14. Computational methods working group

    International Nuclear Information System (INIS)

    Gabriel, T.A.

    1997-09-01

    During the Cold Moderator Workshop several working groups were established including one to discuss calculational methods. The charge for this working group was to identify problems in theory, data, program execution, etc., and to suggest solutions considering both deterministic and stochastic methods including acceleration procedures.

  15. The readability of pediatric patient education materials on the World Wide Web.

    Science.gov (United States)

    D'Alessandro, D M; Kingsley, P; Johnson-West, J

    2001-07-01

    Literacy is a national and international problem. Studies have shown the readability of adult and pediatric patient education materials to be too high for average adults. Materials should be written at the 8th-grade level or lower. To determine the general readability of pediatric patient education materials designed for adults on the World Wide Web (WWW). GeneralPediatrics.com (http://www.generalpediatrics.com) is a digital library serving the medical information needs of pediatric health care providers, patients, and families. Documents from 100 different authoritative Web sites designed for laypersons were evaluated using a built-in computer software readability formula (Flesch Reading Ease and Flesch-Kincaid reading levels) and hand calculation methods (Fry Formula and SMOG methods). Analysis of variance and paired t tests determined significance. Eighty-nine documents constituted the final sample; they covered a wide spectrum of pediatric topics. The overall Flesch Reading Ease score was 57.0. The overall mean Fry Formula was 12.0 (12th grade, 0 months of schooling) and SMOG was 12.2. The overall Flesch-Kincaid grade level was significantly lower. Patient education materials on the WWW are not written at an appropriate reading level for the average adult. We propose that a practical reading level, and how it was determined, be included on all patient education materials on the WWW for general guidance in material selection. We discuss suggestions for improved readability of patient education materials.
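
    The SMOG hand calculation mentioned above is easy to approximate in code; this sketch uses the standard SMOG constants with a rough vowel-group heuristic for polysyllabic (three or more syllable) words, so scores are only indicative.

    ```python
    import math
    import re

    def smog_grade(text: str) -> float:
        """SMOG grade = 3.1291 + 1.043 * sqrt(polysyllables * 30 / sentences),
        with polysyllables (3+ syllables) counted by a rough vowel-group
        heuristic, so scores are approximate."""
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        poly = sum(1 for w in words
                   if len(re.findall(r"[aeiouy]+", w.lower())) >= 3)
        return 3.1291 + 1.043 * math.sqrt(poly * (30 / sentences))

    print(round(smog_grade("Patient education materials frequently contain "
                           "polysyllabic terminology. Simpler wording "
                           "improves comprehension."), 1))
    ```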

  16. Readability Approaches: Implications for Turkey

    Science.gov (United States)

    Ulusoy, Mustafa

    2006-01-01

    Finding the right fit between students' reading ability and textbooks is very important for comprehension. Readability studies aim to analyse texts to find the right fit between students and texts. In this literature review, readability studies are classified under quantitative, qualitative and combined quantitative-qualitative readability…

  17. Computational Methods in Medicine

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-01-01

    Artificial Intelligence requires Logic. But its Classical version shows too many insufficiencies, so it is absolutely necessary to introduce more sophisticated tools, such as Fuzzy Logic, Modal Logic, Non-Monotonic Logic, and so on [2]. Among the things that AI needs to represent are Categories, Objects, Properties, Relations between objects, Situations, States, Time, Events, Causes and effects, Knowledge about knowledge, and so on. The problems in AI can be classified into two general types [3, 4]: Search Problems and Representation Problems. There exist different ways to reach this objective, so we have [3] Logics, Rules, Frames, Associative Nets, Scripts, and so on, which are often interconnected. Also, in dealing with problems of uncertainty and causality, it is very useful to introduce Bayesian Networks and, in particular, a principal tool such as the Essential Graph. We attempt here to show the scope of application of such versatile methods, currently fundamental in Medicine.

  18. Numerical methods in matrix computations

    CERN Document Server

    Björck, Åke

    2015-01-01

    Matrix algorithms are at the core of scientific computing and are indispensable tools in most applications in engineering. This book offers a comprehensive and up-to-date treatment of modern methods in matrix computation. It uses a unified approach to direct and iterative methods for linear systems, least squares and eigenvalue problems. A thorough analysis of the stability, accuracy, and complexity of the treated methods is given. Numerical Methods in Matrix Computations is suitable for use in courses on scientific computing and applied technical areas at the advanced undergraduate and graduate level. A large bibliography is provided, which includes both historical and review papers as well as recent research papers. This makes the book useful also as a reference and guide to further study and research work. Åke Björck is a professor emeritus at the Department of Mathematics, Linköping University. He is a Fellow of the Society for Industrial and Applied Mathematics.

  19. Numerical computer methods part D

    CERN Document Server

    Johnson, Michael L

    2004-01-01

    The aim of this volume is to brief researchers on the importance of data analysis in enzymology and on the modern methods that have developed concomitantly with computer hardware. It also aims to validate researchers' computer programs with real and synthetic data to ascertain that the results produced are what they expected. Selected Contents: Prediction of protein structure; modeling and studying proteins with molecular dynamics; statistical error in isothermal titration calorimetry; analysis of circular dichroism data; model comparison methods.

  20. Computational Methods in Plasma Physics

    CERN Document Server

    Jardin, Stephen

    2010-01-01

    Assuming no prior knowledge of plasma physics or numerical methods, Computational Methods in Plasma Physics covers the computational mathematics and techniques needed to simulate magnetically confined plasmas in modern magnetic fusion experiments and future magnetic fusion reactors. Largely self-contained, the text presents the basic concepts necessary for the numerical solution of partial differential equations. Along with discussing numerical stability and accuracy, the author explores many of the algorithms used today in enough depth so that readers can analyze their stability, efficiency,

  1. The Readability of an Unreadable Text.

    Science.gov (United States)

    Gordon, Robert M.

    1980-01-01

    The Dale-Chall Readability Formula and the Fry Readability Graph were used to analyze passages of Plato's "Parmenides," a notoriously difficult literary piece. The readability levels of the text ranged from fourth to eighth grade (Dale-Chall) and from sixth to tenth grade (Fry), indicating the limitations of the readability tests. (DF)

  2. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  3. Pylinguistics: an open source library for readability assessment of texts written in Portuguese

    Directory of Open Access Journals (Sweden)

    Castilhos, S.

    2016-12-01

    Readability assessment is an important task in automatic text simplification that aims to identify text complexity by computing a set of metrics. In this paper, we present the development and assessment of an open-source library called Pylinguistics for readability assessment of texts written in Portuguese. Additionally, to illustrate the possibilities of our tool, this work also presents an empirical analysis of the readability of Brazilian scientific news dissemination.

  4. Methods for computing color anaglyphs

    Science.gov (United States)

    McAllister, David F.; Zhou, Ya; Sullivan, Sophia

    2010-02-01

    A new computational technique is presented for calculating pixel colors in anaglyph images. The method depends upon knowing the RGB spectral distributions of the display device and the transmission functions of the filters in the viewing glasses. It requires the solution of a nonlinear least-squares program for each pixel in a stereo pair and is based on minimizing color distances in the CIE L*a*b* uniform color space. The method is compared with several techniques for computing anaglyphs, including approximation in CIE space using the Euclidean and uniform metrics, the Photoshop method and its variants, and a method proposed by Peter Wimmer. We also discuss the methods of desaturation and gamma correction for reducing retinal rivalry.
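
    A compact per-pixel sketch of the idea: choose the anaglyph color whose appearance through each filter is closest, in CIE L*a*b*, to the intended left and right colors. Real use needs the display's RGB spectra and measured filter transmissions; here each filter is reduced to an assumed diagonal matrix on linear RGB, so this illustrates the optimization setup rather than the paper's exact method.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    M_RGB2XYZ = np.array([[0.4124, 0.3576, 0.1805],      # sRGB to XYZ, D65 white
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])
    WHITE = np.array([0.9505, 1.0, 1.089])

    def lab(rgb_linear):
        """Linear RGB -> CIE L*a*b* via XYZ."""
        xyz = (M_RGB2XYZ @ rgb_linear) / WHITE
        f = np.where(xyz > 216 / 24389, np.cbrt(xyz), (24389 / 27 * xyz + 16) / 116)
        return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

    LEFT = np.diag([0.9, 0.1, 0.05])    # assumed red-filter transmission
    RIGHT = np.diag([0.05, 0.8, 0.9])   # assumed cyan-filter transmission

    def residual(c, left_rgb, right_rgb):
        # Difference between what each eye sees through its filter and the
        # intended stereo-pair color, measured in L*a*b*.
        return np.concatenate([lab(LEFT @ c) - lab(left_rgb),
                               lab(RIGHT @ c) - lab(right_rgb)])

    left_px, right_px = np.array([0.8, 0.3, 0.2]), np.array([0.7, 0.35, 0.25])
    sol = least_squares(residual, x0=np.full(3, 0.5),
                        args=(left_px, right_px), bounds=(0, 1))
    print("anaglyph pixel (linear RGB):", sol.x.round(3))
    ```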

  5. Computational methods in drug discovery

    Directory of Open Access Journals (Sweden)

    Sumudu P. Leelananda

    2016-12-01

    The process of drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power, has made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  6. Combinatorial methods with computer applications

    CERN Document Server

    Gross, Jonathan L

    2007-01-01

    Combinatorial Methods with Computer Applications provides in-depth coverage of recurrences, generating functions, partitions, and permutations, along with some of the most interesting graph and network topics, design constructions, and finite geometries. Requiring only a foundation in discrete mathematics, it can serve as the textbook in a combinatorial methods course or in a combined graph theory and combinatorics course.After an introduction to combinatorics, the book explores six systematic approaches within a comprehensive framework: sequences, solving recurrences, evaluating summation exp

  7. Consent information leaflets – readable or unreadable?

    Science.gov (United States)

    Graham, Caroline; Reynard, John M; Turney, Benjamin W

    2016-01-01

    Objective The objective of this article is to assess the readability of leaflets about urological procedures provided by the British Association of Urological Surgeons (BAUS) to evaluate their suitability for providing information. Methods Information leaflets were assessed using three measures of readability: Flesch Reading Ease, Flesch-Kincaid and Simple Measure of Gobbledygook (SMOG) grade formulae. The scores were compared with national literacy statistics. Results Relatively good readability was demonstrated using the Flesch Reading Ease (53.4–60.1) and Flesch-Kincaid Grade Level (6.5–7.6) methods. However, the average SMOG index (14.0–15.0) for each category suggests that the majority of the leaflets are written above the reading level of an 18-year-old. Using national literacy statistics, at least 43% of the population will have significant difficulty understanding the majority of these leaflets. Conclusions The results suggest that comprehension of the leaflets provided by the BAUS is likely to be poor. These leaflets may be used as an adjunct to discussion but it is essential to ensure that all the information necessary to make an informed decision has been conveyed in a way that can be understood by the patient. PMID:27867520

  8. Computational methods for fluid dynamics

    CERN Document Server

    Ferziger, Joel H

    2002-01-01

    In its 3rd revised and extended edition the book offers an overview of the techniques used to solve problems in fluid mechanics on computers and describes in detail those most often used in practice. Included are advanced methods in computational fluid dynamics, like direct and large-eddy simulation of turbulence, multigrid methods, parallel computing, moving grids, structured, block-structured and unstructured boundary-fitted grids, free surface flows. The 3rd edition contains a new section dealing with grid quality and an extended description of discretization methods. The book shows common roots and basic principles for many different methods. The book also contains a great deal of practical advice for code developers and users, it is designed to be equally useful to beginners and experts. The issues of numerical accuracy, estimation and reduction of numerical errors are dealt with in detail, with many examples. A full-feature user-friendly demo-version of a commercial CFD software has been added, which ca...

  9. Readability of pediatric health materials for preventive dental care

    Directory of Open Access Journals (Sweden)

    Riedy Christine A

    2006-11-01

    Abstract Background This study examined the content and general readability of pediatric oral health education materials for parents of young children. Methods Twenty-seven pediatric oral health pamphlets or brochures from commercial, government, industry, and private nonprofit sources were analyzed for general readability ("usability") according to several parameters: readability (Flesch-Kincaid grade level, Flesch Reading Ease, and SMOG grade level); thoroughness (inclusion of topics important to young children's oral health); textual framework (frequency of complex phrases; use of pictures, diagrams, and bulleted text within materials); and terminology (frequency of difficult words and dental jargon). Results Readability of the written texts ranged from 2nd to 9th grade. The average Flesch-Kincaid grade level for government publications was equivalent to a grade 4 reading level (4.73; range, 2.4-6.6); F-K grade levels for commercial publications averaged 8.1 (range, 6.9-8.9); and industry-published materials read at an average Flesch-Kincaid grade level of 7.4 (range, 4.7-9.3). SMOG readability analysis, based on a count of polysyllabic words, consistently rated materials 2 to 3 grade levels higher than did the Flesch-Kincaid analysis. Government sources scored significantly lower than commercial and industry sources on both the Flesch-Kincaid grade level and the SMOG readability analysis. Content analysis found materials from commercial and industry sources more complex than government-sponsored publications, whereas commercial sources were more thorough in coverage of pediatric oral health topics. Different materials frequently contained conflicting information. Conclusion Pediatric oral health care materials are readily available, yet their quality and readability vary widely. In general, government publications are more readable than their commercial and industry counterparts. The criteria for usability and results of the analyses…

  10. A Software Application for Assessing Readability in the Japanese EFL Context

    Science.gov (United States)

    Ozasa, Toshiaki; Weir, George R. S.; Fukui, Masayasu

    2010-01-01

    We have been engaged in developing a readability index and its application software attuned for Japanese EFL learners. The index program, Ozasa-Fukui Year Level Program, Ver. 1.0, was used in developing the readability metric Ozasa-Fukui Year Level Index but tended to assume a high level of computer knowledge in its users. As a result, the…

  11. Consent information leaflets - readable or unreadable?

    Science.gov (United States)

    Graham, Caroline; Reynard, John M; Turney, Benjamin W

    2015-05-01

    The objective of this article is to assess the readability of leaflets about urological procedures provided by the British Association of Urological Surgeons (BAUS) to evaluate their suitability for providing information. Information leaflets were assessed using three measures of readability: Flesch Reading Ease, Flesch-Kincaid and Simple Measure of Gobbledygook (SMOG) grade formulae. The scores were compared with national literacy statistics. Relatively good readability was demonstrated using the Flesch Reading Ease (53.4-60.1) and Flesch-Kincaid Grade Level (6.5-7.6) methods. However, the average SMOG index (14.0-15.0) for each category suggests that the majority of the leaflets are written above the reading level of an 18-year-old. Using national literacy statistics, at least 43% of the population will have significant difficulty understanding the majority of these leaflets. The results suggest that comprehension of the leaflets provided by the BAUS is likely to be poor. These leaflets may be used as an adjunct to discussion but it is essential to ensure that all the information necessary to make an informed decision has been conveyed in a way that can be understood by the patient.

  12. Numerical computer methods part E

    CERN Document Server

    Johnson, Michael L

    2004-01-01

    The contributions in this volume emphasize analysis of experimental data and analytical biochemistry, with examples taken from biochemistry. They serve to inform biomedical researchers of the modern data analysis methods that have developed concomitantly with computer hardware. Selected Contents: A practical approach to interpretation of SVD results; modeling of oscillations in endocrine networks with feedback; quantifying asynchronous breathing; sample entropy; wavelet modeling and processing of nasal airflow traces.

  13. Computational methods for stellarator configurations

    International Nuclear Information System (INIS)

    Betancourt, O.

    1992-01-01

    This project had two main objectives. The first one was to continue to develop computational methods for the study of three dimensional magnetic confinement configurations. The second one was to collaborate and interact with researchers in the field who can use these techniques to study and design fusion experiments. The first objective has been achieved with the development of the spectral code BETAS and the formulation of a new variational approach for the study of magnetic island formation in a self-consistent fashion. The code can compute the correct island width corresponding to the saturated island, a result shown by comparing the computed island with the results of unstable tearing modes in Tokamaks and with experimental results in the IMS Stellarator. In addition to studying three dimensional nonlinear effects in Tokamak configurations, these self-consistent computed island equilibria will be used to study transport effects due to magnetic island formation and to nonlinearly bifurcated equilibria. The second objective was achieved through direct collaboration with Steve Hirshman at Oak Ridge, and with D. Anderson and R. Talmage at Wisconsin, as well as through participation in the Sherwood and APS meetings.

  14. Readability in reading materials selection and coursebook design for college English in China

    OpenAIRE

    Lu, Zhongshe

    2002-01-01

    This thesis studies the application of readability in reading materials selection and coursebook design for college English in an EFL context in China. Its aim is to develop rationales which coursebook writers can utilise in selecting materials as texts and as a basis for designing tasks. This study, through a combination of quantitative and qualitative research methods, argues that readability is applicable in the EFL Chinese context, and readability plays an important role in determining...

  15. Computational methods for molecular imaging

    CERN Document Server

    Shi, Kuangyu; Li, Shuo

    2015-01-01

    This volume contains original submissions on the development and application of molecular imaging computing. The editors invited authors to submit high-quality contributions on a wide range of topics including, but not limited to: • Image Synthesis & Reconstruction of Emission Tomography (PET, SPECT) and other Molecular Imaging Modalities • Molecular Imaging Enhancement • Data Analysis of Clinical & Pre-clinical Molecular Imaging • Multi-Modal Image Processing (PET/CT, PET/MR, SPECT/CT, etc.) • Machine Learning and Data Mining in Molecular Imaging. Molecular imaging is an evolving clinical and research discipline enabling the visualization, characterization and quantification of biological processes taking place at the cellular and subcellular levels within intact living subjects. Computational methods play an important role in the development of molecular imaging, from image synthesis to data analysis and from clinical diagnosis to therapy individualization. This work will bring readers fro...

  16. Computer-aided head film analysis: the University of California San Francisco method.

    Science.gov (United States)

    Baumrind, S; Miller, D M

    1980-07-01

    Computer technology is already assuming an important role in the management of orthodontic practices. The next 10 years are likely to see expansion in computer usage into the areas of diagnosis, treatment planning, and treatment-record keeping. In the areas of diagnosis and treatment planning, one of the first problems to be attacked will be the automation of head film analysis. The problems of constructing computer-aided systems for this purpose are considered herein in the light of the authors' 10 years of experience in developing a similar system for research purposes. The need for building in methods for automatic detection and correction of gross errors is discussed and the authors' method for doing so is presented. The construction of a rudimentary machine-readable data base for research and clinical purposes is described.

  17. Assessing the Readability of Medical Documents: A Ranking Approach.

    Science.gov (United States)

    Zheng, Jiaping; Yu, Hong

    2018-03-23

    The use of electronic health record (EHR) systems with patient engagement capabilities, including viewing, downloading, and transmitting health information, has recently grown tremendously. However, using these resources to engage patients in managing their own health remains challenging due to the complex and technical nature of the EHR narratives. Our objective was to develop a machine learning-based system to assess readability levels of complex documents such as EHR notes. We collected difficulty ratings of EHR notes and Wikipedia articles using crowdsourcing from 90 readers. We built a supervised model to assess readability based on relative orders of text difficulty using both surface text features and word embeddings. We evaluated system performance using the Kendall coefficient of concordance against human ratings. Our system achieved significantly higher concordance (.734) with human annotators than did a baseline using the Flesch-Kincaid Grade Level, a widely adopted readability formula (.531). The improvement was also consistent across different disease topics. This method's concordance with an individual human user's ratings was also higher than the concordance between different human annotators (.658). We explored methods to automatically assess the readability levels of clinical narratives. Our ranking-based system using simple textual features and easy-to-learn word embeddings outperformed a widely used readability formula. Our ranking-based method can predict relative difficulties of medical documents. It is not constrained to a predefined set of readability levels, a common design in many machine learning-based systems. Furthermore, the feature set does not rely on complex processing of the documents. One potential application of our readability ranking is personalization, allowing patients to better accommodate their own background knowledge.
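
    One common way to learn readability from relative difficulty judgments, loosely mirroring the ranking approach described above, is to classify differences of document feature vectors and rank by the learned score; the features below are random placeholders for the paper's surface features and word embeddings.

    ```python
    import numpy as np
    from scipy.stats import kendalltau
    from sklearn.linear_model import LogisticRegression

    # Learn "A is harder than B" from pairwise judgments by classifying the
    # difference of feature vectors, then rank documents by the learned score.
    rng = np.random.default_rng(0)
    docs = rng.normal(size=(100, 20))        # 100 docs x 20 placeholder features
    true_w = rng.normal(size=20)             # hidden difficulty direction
    difficulty = docs @ true_w

    pairs = rng.integers(0, 100, size=(500, 2))
    X = docs[pairs[:, 0]] - docs[pairs[:, 1]]
    y = (difficulty[pairs[:, 0]] > difficulty[pairs[:, 1]]).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    scores = docs @ model.coef_.ravel()      # induced difficulty ranking
    print("Kendall tau vs. truth:", round(kendalltau(scores, difficulty)[0], 3))
    ```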

  18. Readability of medicinal package leaflets: a systematic review.

    Science.gov (United States)

    Pires, Carla; Vigário, Marina; Cavaco, Afonso

    2015-01-01

    OBJECTIVE To review studies on the readability of package leaflets of medicinal products for human use. METHODS We conducted a systematic literature review between 2008 and 2013 using the keywords "Readability and Package Leaflet" and "Readability and Package Insert" in the academic search engine Biblioteca do Conhecimento Online, comprising different bibliographic resources/databases. The preferred reporting items for systematic reviews and meta-analyses (PRISMA) criteria were applied to prepare the draft of the report. Quantitative and qualitative original studies were included; opinion or review studies, and studies not written in English, Portuguese, Italian, French, or Spanish, were excluded. RESULTS We identified 202 studies, of which 180 were excluded and 22 were enrolled [2 enrolling healthcare professionals, 10 enrolling other types of participants (including patients), 3 focused on adverse reactions, and 7 descriptive studies]. The package leaflets presented various readability problems, such as complex and difficult-to-understand texts, small font size, or few illustrations. The main methods used to assess the readability of the package leaflets were usability tests and legibility formulae. Limitations of these methods included the reduced number of participants; the lack of readability formulas specifically validated for particular languages (e.g., Portuguese); and the absence of assessments of patients' literacy, health knowledge, cognitive skills, levels of satisfaction, and opinions. CONCLUSIONS Overall, the package leaflets presented various readability problems. In this review, some methodological limitations were identified, including the participation of a limited number of patients and healthcare professionals, the absence of prior assessment of participants' literacy, mood, or sense of satisfaction, and the predominance of studies not based on role-plays about the use of medicines. These limitations should be avoided in future studies and be considered when interpreting the results.

  19. Text Readability and Intuitive Simplification: A Comparison of Readability Formulas

    Science.gov (United States)

    Crossley, Scott A.; Allen, David B.; McNamara, Danielle S.

    2011-01-01

    Texts are routinely simplified for language learners with authors relying on a variety of approaches and materials to assist them in making the texts more comprehensible. Readability measures are one such tool that authors can use when evaluating text comprehensibility. This study compares the Coh-Metrix Second Language (L2) Reading Index, a…

  20. Computer methods in general relativity: algebraic computing

    CERN Document Server

    Araujo, M E; Skea, J E F; Koutras, A; Krasinski, A; Hobill, D; McLenaghan, R G; Christensen, S M

    1993-01-01

    Karlhede & MacCallum [1] gave a procedure for determining the Lie algebra of the isometry group of an arbitrary pseudo-Riemannian manifold, which they intended to implement using the symbolic manipulation package SHEEP but never did. We have recently finished making this procedure explicit by giving an algorithm suitable for implementation on a computer [2]. Specifically, we have written an algorithm for determining the isometry group of a spacetime (in four dimensions), and partially implemented this algorithm using the symbolic manipulation package CLASSI, which is an extension of SHEEP.

  1. Use of a New Set of Linguistic Features to Improve Automatic Assessment of Text Readability

    Science.gov (United States)

    Yoshimi, Takehiko; Kotani, Katsunori; Isahara, Hitoshi

    2012-01-01

    The present paper proposes and evaluates a readability assessment method designed for Japanese learners of EFL (English as a foreign language). The proposed readability assessment method is constructed by a regression algorithm using a new set of linguistic features that were employed separately in previous studies. The results showed that the…
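
    The regression construction described in this record can be sketched in a few lines: fit a linear model from per-text linguistic features to observed difficulty, then score new texts. The features and numbers below are invented for illustration and are not the feature set used in the paper:

```python
import numpy as np

# Each row: linguistic features of one text, e.g. [mean sentence length,
# mean word length, fraction of rare words]; y: difficulty judged by learners.
# All values are made up for illustration.
X = np.array([[12.0, 4.1, 0.05],
              [18.5, 4.8, 0.12],
              [25.0, 5.3, 0.21],
              [31.2, 5.9, 0.30]])
y = np.array([1.0, 2.0, 3.0, 4.0])  # difficulty ratings

# Fit y ~ w.X + b by ordinary least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])   # append an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

new_text = np.array([20.0, 5.0, 0.15, 1.0])    # features + intercept term
print(float(new_text @ coef))                  # predicted difficulty
```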

  2. How Readable Are Parenting Books?

    Science.gov (United States)

    Abram, Marie J.; Dowling, William D.

    1979-01-01

    The author's style of writing has implications for the ease with which the written material can be read. Using the Flesch Reading Ease Formula, the mean readability score, the standard deviation, and range are given for 50 parenting books. Discussion suggests how the list might be used by parent educators. (Author)
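
    The Flesch Reading Ease Formula applied in this study is simple enough to state directly. A minimal sketch follows; counting syllables, the hard part in practice, is assumed to be done already:

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease: higher scores mean easier text (90-100 ~ 5th grade)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# A 100-word passage with 6 sentences and 140 syllables:
print(round(flesch_reading_ease(100, 6, 140), 1))  # ~71.5, "fairly easy"
```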

  3. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method to solve the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed, and their efficiency for a typical fuel assembly calculation has been investigated. For both methods, a vector computation is 15 times faster than a scalar computation. Comparing the OES and ISS methods, the following is found: 1) there is only a small difference in computation speed, 2) the ISS method shows faster convergence, and 3) the ISS method saves about 80% of the computer memory required by the OES method. It is, therefore, concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method to reduce the computation time of the exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective as acceleration methods for the characteristics method; a combination of them saves 70-80% of outer iterations compared with free iteration. (author)
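
    The table-look-up idea mentioned at the end of the abstract — replacing calls to the exponential function with interpolation in a precomputed table — can be sketched as follows; the grid range and spacing are illustrative choices, not those of the paper:

```python
import numpy as np

# Precompute exp(-x) on a uniform grid and interpolate linearly between nodes.
X_MAX, N = 20.0, 4000
dx = X_MAX / N
table = np.exp(-np.arange(N + 1) * dx)

def exp_neg_lookup(x: np.ndarray) -> np.ndarray:
    """Approximate exp(-x) for 0 <= x <= X_MAX via table + linear interpolation."""
    idx = np.minimum((x / dx).astype(int), N - 1)
    frac = x / dx - idx
    return table[idx] * (1 - frac) + table[idx + 1] * frac

x = np.linspace(0.0, 10.0, 5)
print(np.max(np.abs(exp_neg_lookup(x) - np.exp(-x))))  # small interpolation error
```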

  4. Assessing the Accuracy and Readability of Online Health Information for Patients With Pancreatic Cancer.

    Science.gov (United States)

    Storino, Alessandra; Castillo-Angeles, Manuel; Watkins, Ammara A; Vargas, Christina; Mancias, Joseph D; Bullock, Andrea; Demirjian, Aram; Moser, A James; Kent, Tara S

    2016-09-01

    The degree to which patients are empowered by written educational materials depends on the text's readability level and the accuracy of the information provided. The association of a website's affiliation or focus on treatment modality with its readability and accuracy has yet to be thoroughly elucidated. This study compared the readability and accuracy of patient-oriented online resources for pancreatic cancer by treatment modality and website affiliation. An online search of 50 websites discussing 5 pancreatic cancer treatment modalities (alternative therapy, chemotherapy, clinical trials, radiation therapy, and surgery) was conducted, and each website's affiliation was identified. Readability was measured by 9 standardized tests, which were used to compute the median readability level of each website; median readability scores were then compared among treatment modality and affiliation categories. Accuracy was assessed by an expert panel of 2 medical specialists and 2 surgical specialists, who independently evaluated all websites belonging to the 5 treatment modalities. Websites discussing surgery (median readability level, 13.7; interquartile range [IQR], 11.9-15.6) were easier to read than those discussing radiotherapy (median readability level, 15.2 [IQR, 13.0-17.0]) (P = .003) and clinical trials (median readability level, 15.2 [IQR, 12.8-17.0]) (P = .002). Websites of nonprofit organizations (median readability level, 12.9 [IQR, 11.2-15.0]) were easier to read than media websites (median readability level, 16.0 [IQR, 13.4-17.0]) and than a further affiliation category (median readability level, 14.8 [IQR, 12.9-17.0]); websites with a median readability level of 14.0 (IQR, 12.1-16.1) were likewise easier to read than media websites (P = .001). Among treatment modalities, alternative therapy websites exhibited the...

  5. Readability of patient information and consent documents in rheumatological studies

    DEFF Research Database (Denmark)

    Hamnes, Bente; van Eijk-Hustings, Yvonne; Primdahl, Jette

    2016-01-01

    BACKGROUND: Before participation in medical research an informed consent must be obtained. This study investigates whether the readability of patient information and consent documents (PICDs) corresponds to the average educational level of participants in rheumatological studies in the Netherlands, Denmark, and Norway. METHODS: 24 PICDs from studies were collected and readability was assessed independently using Gunning's Fog Index (FOG) and the Simple Measure of Gobbledygook (SMOG) grading. RESULTS: The mean scores for the FOG and SMOG grades were 14.2 (9.0-19.0) and 14.2 (12-17), respectively. The mean FOG and SMOG grades were 12.7 and 13.3 in the Dutch studies, 15.0 and 14.9 in the Danish studies, and 14.6 and 14.3 in the Norwegian studies, respectively. Out of the 2865 participants, more than 57% had a lower educational level than the highest readability score calculated in the individual studies.

  6. Computational Methods and Function Theory

    CERN Document Server

    Saff, Edward; Salinas, Luis; Varga, Richard

    1990-01-01

    The volume is devoted to the interaction of modern scientific computation and classical function theory. Many problems in pure and more applied function theory can be tackled using modern computing facilities: numerically as well as in the sense of computer algebra. On the other hand, computer algorithms are often based on complex function theory, and dedicated research on their theoretical foundations can lead to great enhancements in performance. The contributions - original research articles, a survey and a collection of problems - cover a broad range of such problems.

  7. Computational methods for reversed-field equilibrium

    International Nuclear Information System (INIS)

    Boyd, J.K.; Auerbach, S.P.; Willmann, P.A.; Berk, H.L.; McNamara, B.

    1980-01-01

    Investigating the temporal evolution of reversed-field equilibrium caused by transport processes requires the solution of the Grad-Shafranov equation and computation of field-line-averaged quantities. The technique for field-line averaging and the computation of the Grad-Shafranov equation are presented. Application of Green's function to specify the Grad-Shafranov equation boundary condition is discussed. Hill's vortex formulas used to verify certain computations are detailed. Use of computer software to implement computational methods is described
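
    For reference, the Grad-Shafranov equation at the core of this record is the standard axisymmetric equilibrium equation for the poloidal flux ψ(R, Z); the form below is the textbook statement, not a reproduction of the paper's notation:

```latex
\Delta^{*}\psi \;\equiv\;
R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial\psi}{\partial R}\right)
+ \frac{\partial^{2}\psi}{\partial Z^{2}}
\;=\; -\,\mu_{0}R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi}
```

    Here p(ψ) is the pressure profile and F(ψ) = R B_φ is the poloidal current function; these two free profiles are what the transport evolution updates between successive equilibrium solves.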

  8. Internally readable identifying tag

    International Nuclear Information System (INIS)

    Jefferts, K.B.; Jefferts, E.R.

    1980-01-01

    A method of identifying non-metallic objects by means of X-ray equipment is described in detail. A small metal pin with a number of grooves cut in a pre-determined equi-spaced pattern is implanted into the non-metallic object and, by decoding the groove patterns using X-ray equipment, the object is uniquely identified. A specific example of such an application is in studying the migratory habits of fish. The pin inserted into the snout of the fish is 0.010 inch in diameter and 0.040 inch in length, with 8 possible positions for grooves if spaced 0.005 inch apart. With 6 of the groove positions available for data, the capacity is 2⁶, or 64, combinations; clearly longer pins would increase the data capacity. This method of identification is a major advance over previous techniques, which necessitated destruction of the fish in order to recover the identification tag. (UK)

  9. Readability Level of Spanish-Language Patient-Reported Outcome Measures in Audiology and Otolaryngology

    Science.gov (United States)

    Coco, Laura; Colina, Sonia; Atcherson, Samuel R.

    2017-01-01

    Purpose The purpose of this study was to examine the readability level of the Spanish versions of several audiology- and otolaryngology-related patient-reported outcome measures (PROMs) and include a readability analysis of 2 translation approaches when available—the published version and a “functionalist” version—using a team-based collaborative approach including community members. Method Readability levels were calculated using the Fry Graph adapted for Spanish, as well as the Fernandez-Huerta and the Spaulding formulae for several commonly used audiology- and otolaryngology-related PROMs. Results Readability calculations agreed with previous studies analyzing audiology-related PROMs in English and demonstrated many Spanish-language PROMs were beyond the 5th grade reading level suggested for health-related materials written for the average population. In addition, the functionalist versions of the PROMs yielded lower grade-level (improved) readability levels than the published versions. Conclusion Our results suggest many of the Spanish-language PROMs evaluated here are beyond the recommended readability levels and may be influenced by the approach to translation. Moreover, improved readability may be possible using a functionalist approach to translation. Future analysis of the suitability of outcome measures and the quality of their translations should move beyond readability and include an evaluation of the individual's comprehension of the written text. PMID:28892821

  10. Using readability, comprehensibility and lexical coverage to ...

    African Journals Online (AJOL)

    experience academic difficulty in technical subjects such as Accounting. Davison and … The readability of managerial accounting and financial management textbooks. … Principles and Practice in Second Language Acquisition.

  11. Assessing readability formula differences with written health information materials: application, results, and recommendations.

    Science.gov (United States)

    Wang, Lih-Wern; Miller, Michael J; Schmitt, Michael R; Wen, Frances K

    2013-01-01

    comprehension, use of more recent validation criteria for determining reading grade level estimates, and simplicity of use. To improve interpretation of readability results, reporting of reading grade level estimates from any formula should be accompanied by information about word sample size, location of word sampling in the text, formatting, and method of calculation. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Novel methods in computational finance

    CERN Document Server

    Günther, Michael; Maten, E

    2017-01-01

    This book discusses the state-of-the-art and open problems in computational finance. It presents a collection of research outcomes and reviews of the work from the STRIKE project, an FP7 Marie Curie Initial Training Network (ITN) project in which academic partners trained early-stage researchers in close cooperation with a broader range of associated partners, including from the private sector. The aim of the project was to arrive at a deeper understanding of complex (mostly nonlinear) financial models and to develop effective and robust numerical schemes for solving linear and nonlinear problems arising from the mathematical theory of pricing financial derivatives and related financial products. This was accomplished by means of financial modelling, mathematical analysis and numerical simulations, optimal control techniques and validation of models. In recent years the computational complexity of mathematical models employed in financial mathematics has witnessed tremendous growth. Advanced numerical techni...

  13. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    The article covers the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis, and the analysis of allelic associations. Software supporting the implementation of these methods was developed.

  14. Computational methods in drug discovery

    OpenAIRE

    Sumudu P. Leelananda; Steffen Lindert

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery project...

  15. Readability of Invasive Procedure Consent Forms.

    Science.gov (United States)

    Eltorai, Adam E M; Naqvi, Syed S; Ghanian, Soha; Eberson, Craig P; Weiss, Arnold-Peter C; Born, Christopher T; Daniels, Alan H

    2015-12-01

    Informed consent is a pillar of ethical medicine which requires patients to fully comprehend relevant issues, including the risks, benefits, and alternatives of an intervention. Given that the average reading skill of US adults is at the 8th grade level, the American Medical Association (AMA) and the National Institutes of Health (NIH) recommend that patient information materials should not exceed a 6th grade reading level. We hypothesized that text provided in invasive procedure consent forms would exceed recommended readability guidelines for medical information. To test this hypothesis, we gathered procedure consent forms from all surgical inpatient hospitals in the state of Rhode Island. For each consent form, readability was analyzed with the following measures: Flesch Reading Ease Formula, Flesch-Kincaid Grade Level, Fog Scale, SMOG Index, Coleman-Liau Index, Automated Readability Index, and Linsear Write Formula. These readability scores were used to calculate a composite Text Readability Consensus Grade Level. Invasive procedure consent forms were found to be written at an average of 15th grade level (i.e., third year of college), significantly higher than both the average US adult reading level of 8th grade and the recommended readability guideline for patient materials of 6th grade. Invasive procedure consent forms thus have readability levels that make comprehension difficult or impossible for many patients. Efforts to improve the readability of procedural consent forms should improve patient understanding regarding their healthcare decisions. © 2015 Wiley Periodicals, Inc.
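
    The composite "Text Readability Consensus Grade Level" idea — pooling several grade-level formulas — can be sketched with three of the formulas named above. The counts are invented, and the study's exact pooling procedure may differ from the simple median used here:

```python
import math
import statistics

def fkgl(words, sentences, syllables):
    """Flesch-Kincaid Grade Level."""
    return 0.39 * words / sentences + 11.8 * syllables / words - 15.59

def gunning_fog(words, sentences, complex_words):
    """Gunning Fog index; complex words have 3+ syllables."""
    return 0.4 * (words / sentences + 100.0 * complex_words / words)

def smog(sentences, polysyllables):
    """SMOG grade from a sample's sentence and polysyllable counts."""
    return 1.0430 * math.sqrt(polysyllables * 30.0 / sentences) + 3.1291

# Counts from a hypothetical consent-form excerpt:
w, s, syl, poly = 600, 24, 1020, 90   # poly = words of 3+ syllables
grades = [fkgl(w, s, syl), gunning_fog(w, s, poly), smog(s, poly)]
print([round(g, 1) for g in grades], round(statistics.median(grades), 1))
```

    With these counts all three formulas land in the 14-16 range, illustrating the kind of college-level scores the study reports for consent forms.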

  16. Evaluating four readability formulas for Afrikaans.

    NARCIS (Netherlands)

    Jansen, C. J. M.; Richards, Rose; Van Zyl, Liezl

    2017-01-01

    For almost a hundred years now, readability formulas have been used to measure how difficult it is to comprehend a given text. To date, four readability formulas have been developed for Afrikaans. Two such formulas were published by Van Rooyen (1986), one formula by McDermid Heyns (2007) and one

  17. Hybrid Monte Carlo methods in computational finance

    NARCIS (Netherlands)

    Leitao Rodriguez, A.

    2017-01-01

    Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the

  18. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend existing literature to the latest development in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency and time domain integral equations, and statistics methods in bio-electromagnetics.

  19. Sense and readability: participant information sheets for research studies.

    Science.gov (United States)

    Ennis, Liam; Wykes, Til

    2016-02-01

    Informed consent in research is partly achieved through the use of information sheets. There is a perception however that these information sheets are long and complex. The recommended reading level for patient information is grade 6, or 11-12 years old. To investigate whether the readability of participant information sheets has changed over time, whether particular study characteristics are related to poorer readability and whether readability and other study characteristics are related to successful study recruitment. Method: We obtained 522 information sheets from the UK National Institute for Health Research Clinical Research Network: Mental Health portfolio database and study principal investigators. Readability was assessed with the Flesch reading index and the Grade level test. Information sheets increased in length over the study period. The mean grade level across all information sheets was 9.8, or 15-16 years old. A high level of patient involvement was associated with more recruitment success and studies involving pharmaceutical or device interventions were the least successful. The complexity of information sheets had little bearing on successful recruitment. Information sheets are far more complex than the recommended reading level of grade 6 for patient information. The disparity may be exacerbated by an increasing focus on legal content. Researchers would benefit from clear guidance from ethics committees on writing succinctly and accessibly and how to balance the competing legal issues with the ability of participants to understand what a study entails. © The Royal College of Psychiatrists 2016.

  20. Computational Methods for Biomolecular Electrostatics

    Science.gov (United States)

    Dong, Feng; Olsen, Brett; Baker, Nathan A.

    2008-01-01

    An understanding of intermolecular interactions is essential for insight into how cells develop, operate, communicate and control their activities. Such interactions include several components: contributions from linear, angular, and torsional forces in covalent bonds, van der Waals forces, as well as electrostatics. Among the various components of molecular interactions, electrostatics are of special importance because of their long range and their influence on polar or charged molecules, including water, aqueous ions, and amino or nucleic acids, which are some of the primary components of living systems. Electrostatics, therefore, play important roles in determining the structure, motion and function of a wide range of biological molecules. This chapter presents a brief overview of electrostatic interactions in cellular systems with a particular focus on how computational tools can be used to investigate these types of interactions. PMID:17964951

  1. Readability assessment of online ophthalmic patient information.

    Science.gov (United States)

    Edmunds, Matthew R; Barry, Robert J; Denniston, Alastair K

    2013-12-01

    Patients increasingly use the Internet to access information related to their disease, but poor health literacy is known to impact negatively on medical outcomes. Multiple agencies have recommended that patient-oriented literature be written at a fourth- to sixth-grade (9-12 years of age) reading level to assist understanding. The readability of online patient-oriented materials related to ophthalmic diagnoses is not yet known. To assess the readability of online literature for a range of ophthalmic conditions, body text of the top 10 patient-oriented websites for 16 different ophthalmic diagnoses, covering the full range of ophthalmic subspecialties, was analyzed for readability, source (United Kingdom vs non-United Kingdom, not for profit vs commercial), and appropriateness for sight-impaired readers. Four validated readability formulas were used: Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), Simple Measure of Gobbledygook (SMOG), and Gunning Fog Index (GFOG). Data were compared with the Mann-Whitney test (for 2 groups) and the Kruskal-Wallis test (for more than 2 groups), and correlation was assessed by the Spearman r. None of the 160 webpages had readability scores within published guidelines, with 83% assessed as being of "difficult" readability. Not-for-profit webpages were of significantly greater length than commercial webpages (P = .02), and UK-based webpages had slightly superior readability scores compared with those of non-UK webpages (P = .004 to P < .001, depending on the readability formula used). Of all webpages evaluated, only 34% included a facility to adjust text size to assist visually impaired readers. To our knowledge, this is the first study to assess readability of patient-focused webpages for a range of ophthalmic diagnoses. In keeping with previous studies in other medical conditions, we determined that readability scores were inferior to those recommended, irrespective of the measure used. Although readability is only one...

  2. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  3. Computational methods for data evaluation and assimilation

    CERN Document Server

    Cacuci, Dan Gabriel

    2013-01-01

    Data evaluation and data combination require the use of a wide range of probability theory concepts and tools, from deductive statistics mainly concerning frequencies and sample tallies to inductive inference for assimilating non-frequency data and a priori knowledge. Computational Methods for Data Evaluation and Assimilation presents interdisciplinary methods for integrating experimental and computational information. This self-contained book shows how the methods can be applied in many scientific and engineering areas. After presenting the fundamentals underlying the evaluation of experiment

  4. Readability analysis of online resources related to lung cancer.

    Science.gov (United States)

    Weiss, Kathleen D; Vargas, Christina R; Ho, Olivia A; Chuang, Danielle J; Weiss, Jonathan; Lee, Bernard T

    2016-11-01

    Patients seeking health information commonly use the Internet as the first source for material. Studies show that well-informed patients have increased involvement, satisfaction, and healthcare outcomes. As one-third of Americans have only basic or below basic health literacy, the National Institutes of Health and American Medical Association recommend patient-directed health resources be written at a sixth-grade reading level. This study evaluates the readability of commonly accessed online resources on lung cancer. A search for "lung cancer" was performed using Google and Bing, and the top 10 websites were identified. Location services were disabled, and sponsored sites were excluded. Relevant articles (n = 109) with patient-directed content available directly from the main sites were downloaded. Readability was assessed using 10 established methods and analyzed with articles grouped by parent website. The average reading grade level across all sites was 11.2, with a range from 8.8 (New Fog Count) to 12.2 (Simple Measure of Gobbledygook). The average Flesch Reading Ease score was 52, corresponding with fairly difficult to read text. The readability varied when compared by individual website, ranging in grade level from 9.2 to 15.2. Only 10 articles (9%) were written below a sixth-grade level and these tended to discuss simpler topics. Patient-directed online information about lung cancer exceeds the recommended sixth-grade reading level. Readability varies between individual websites, allowing physicians to direct patients according to level of health literacy. Modifications to existing materials can significantly improve readability while maintaining content for patients with low health literacy. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Evaluation of the Readability of Dermatological Postoperative Patient Information Leaflets Across England.

    Science.gov (United States)

    Hunt, William T N; McGrath, Emily J

    2016-06-01

    Postoperative patient information leaflets (PILs) provide important guidance to patients after skin surgery. Readability assessment is a method of evaluating written information for ease of text comprehension. The recommended level for PIL readability is US grade ≤6. To evaluate the readability of publicly available English dermatological postoperative PILs, all dermatology departments in England were requested to provide their postoperative PILs. PILs were evaluated using Readability Studio (Oleander Software, Vandalia, OH). Two preselected parameters were also noted: whether the PIL was doctor- or nurse-written, and whether the PIL was Information Standard hallmarked. Eighty-five of 130 (65.4%) PILs were evaluated. Only 29.4% of the PILs were at grade level ≤6 with Flesch-Kincaid. The mean readability levels were 7.8 for Flesch-Kincaid, 67 for Flesch reading ease, 10.5 for Simple Measure of Gobbledygook (SMOG), 9.4 for Gunning-Fog, 8 for Fry, and 9.8 for FORCAST. No instrument demonstrated a significant difference between doctor-written (6) and nurse-written (7) PILs. Two instruments found that the 3 Information Standard hallmarked PILs had a higher (harder) readability than ordinary PILs (n = 82) (Gunning-Fog, p = .029; SMOG, p = .049). Most English postoperative dermatological PILs' readability levels exceed recommendations (US grade ≤6). Departmental PILs should be reviewed to ensure that they are comprehensible to their patients.

  6. Electromagnetic field computation by network methods

    CERN Document Server

    Felsen, Leopold B; Russer, Peter

    2009-01-01

    This monograph proposes a systematic and rigorous treatment of electromagnetic field representations in complex structures. The book presents new strong models by combining important computational methods. This is the last book of the late Leopold Felsen.

  7. Methods in computed angiotomography of the brain

    International Nuclear Information System (INIS)

    Yamamoto, Yuji; Asari, Shoji; Sadamoto, Kazuhiko.

    1985-01-01

    The authors introduce methods for computed angiotomography of the brain. The setting of the scan planes and levels and the minimum dose bolus (MinDB) injection of contrast medium are described in detail. These methods are easily and safely employed with already widespread CT scanners. Computed angiotomography is expected to find clinical application in many institutions because of its diagnostic value in screening for cerebrovascular lesions and in demonstrating the relationship between pathological lesions and cerebral vessels. (author)

  8. Methods and experimental techniques in computer engineering

    CERN Document Server

    Schiaffonati, Viola

    2014-01-01

    Computing and science reveal a synergic relationship. On the one hand, it is widely evident that computing plays an important role in the scientific endeavor. On the other hand, the role of scientific method in computing is getting increasingly important, especially in providing ways to experimentally evaluate the properties of complex computing systems. This book critically presents these issues from a unitary conceptual and methodological perspective by addressing specific case studies at the intersection between computing and science. The book originates from, and collects the experience of, a course for PhD students in Information Engineering held at the Politecnico di Milano. Following the structure of the course, the book features contributions from some researchers who are working at the intersection between computing and science.

  9. Computational techniques of the simplex method

    CERN Document Server

    Maros, István

    2003-01-01

    Computational Techniques of the Simplex Method is a systematic treatment focused on the computational issues of the simplex method. It provides a comprehensive coverage of the most important and successful algorithmic and implementation techniques of the simplex method. It is a unique source of essential, never discussed details of algorithmic elements and their implementation. On the basis of the book the reader will be able to create a highly advanced implementation of the simplex method which, in turn, can be used directly or as a building block in other solution algorithms.

  10. Computational and mathematical methods in brain atlasing.

    Science.gov (United States)

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  11. A survey of machine readable data bases

    Science.gov (United States)

    Matlock, P.

    1981-01-01

    Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  12. Readability assessment of online tracheostomy care resources.

    Science.gov (United States)

    Kong, Keonho Albert; Hu, Amanda

    2015-02-01

    To assess the readability of online tracheostomy care resources, a cross-sectional study was conducted at an academic center. A Google search was performed for "tracheostomy care" in January 2014. The top 50 results were categorized into major versus minor websites and patient-oriented versus professional-oriented resources. These websites were evaluated with the following readability tools: Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), Simple Measure of Gobbledygook (SMOG), and Gunning Frequency of Gobbledygook (GFOG). Readability scores for the websites were FRES 57.21 ± 16.71 (possible range = 0-100), FKGL 8.33 ± 2.84 (possible range = 3-12), SMOG 11.25 ± 2.49 (possible range = 3-19), and GFOG 11.43 ± 4.07 (possible range = 3-19). There was no significant difference in any of the 4 readability scores between major (n = 41) and minor (n = 9) websites. Professional-oriented websites (n = 19) had the following readability scores: FRES 40.77 ± 11.69, FKGL 10.93 ± 2.48, SMOG 13.29 ± 2.32, and GFOG 14.91 ± 3.98. Patient-oriented websites (n = 31) had the following readability scores: FRES 67.29 ± 9.91, FKGL 6.73 ± 1.61, SMOG 10.01 ± 1.64, and GFOG 9.30 ± 2.27. Professional-oriented websites had more difficult readability scores than patient-oriented websites on all 4 measures, whereas readability did not differ between major and minor websites. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2014.

  13. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods extend the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth...
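
    The core gPC construction can be stated compactly; the following is the standard textbook form of the expansion and its Galerkin projection (notation varies across references):

```latex
u(\xi) \;\approx\; \sum_{i=0}^{P} \hat{u}_{i}\,\Phi_{i}(\xi),
\qquad
\hat{u}_{i} \;=\; \frac{\langle u,\Phi_{i}\rangle}{\langle \Phi_{i},\Phi_{i}\rangle}
\;=\; \frac{1}{\langle \Phi_{i}^{2}\rangle}\int u(\xi)\,\Phi_{i}(\xi)\,\rho(\xi)\,d\xi
```

    Here the orthogonal polynomials Φ_i are matched to the density ρ of the random input ξ: Hermite polynomials for Gaussian inputs, Legendre for uniform, and so on, following the Wiener-Askey scheme.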

  14. Machine Readable Passports & The Visa Waiver Programme

    CERN Multimedia

    2003-01-01

    From 1 October 2003, all passengers intending to enter the USA on the Visa Waiver Programme (VWP) will be required to present a machine-readable passport (MRP). Passengers travelling to the USA with a non-machine readable passport will require a valid US entry visa. Applying for a US visa is a lengthy process, which can take several weeks or even months. Therefore it is strongly recommended that: • All Visa Waiver nationals who hold a non-machine readable passport should obtain a MRP before their next visit to the USA. • Children travelling on a parent's passport (be it machine readable or non-machine readable) cannot benefit from the Visa Waiver Programme and should obtain their own MRP prior to travelling to the USA or request a visa. What is a Machine Readable Passport (MRP)? A MRP has the holders' personal details, e.g. name, date of birth, nationality and their passport number contained in two lines of text at the base of the photo page. This text may be read by machine. These 2 lines ...
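
    The two machine-readable lines described above follow ICAO Document 9303, whose fields are protected by a simple check digit. A minimal sketch of that check-digit rule is shown below; the sample field is the specimen passport number used in ICAO documentation examples, not a real one:

```python
def mrz_check_digit(field: str) -> int:
    """ICAO 9303 check digit: weights 7, 3, 1 repeating over the field."""
    def value(c: str) -> int:
        if c.isdigit():
            return int(c)
        if c == "<":                   # filler character counts as zero
            return 0
        return ord(c) - ord("A") + 10  # A..Z map to 10..35
    weights = (7, 3, 1)
    return sum(value(c) * weights[i % 3] for i, c in enumerate(field)) % 10

# Specimen passport-number field from ICAO documentation-style examples:
print(mrz_check_digit("L898902C3"))  # -> 6, the digit appended after the field
```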

  15. Empirical evaluation methods in computer vision

    CERN Document Server

    Christensen, Henrik I

    2002-01-01

    This book provides comprehensive coverage of methods for the empirical evaluation of computer vision techniques. The practical use of computer vision requires empirical evaluation to ensure that the overall system has a guaranteed performance. The book contains articles that cover the design of experiments for evaluation, range image segmentation, the evaluation of face recognition and diffusion methods, image matching using correlation methods, and the performance of medical image processing algorithms.

  16. A computational method for sharp interface advection

    DEFF Research Database (Denmark)

    Roenby, Johan; Bredmose, Henrik; Jasak, Hrvoje

    2016-01-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volu...

  17. Computing discharge using the index velocity method

    Science.gov (United States)

    Levesque, Victor A.; Oberg, Kevin A.

    2012-01-01

    Application of the index velocity method for computing continuous records of discharge has become increasingly common, especially since the introduction of low-cost acoustic Doppler velocity meters (ADVMs) in 1997. Presently (2011), the index velocity method is being used to compute discharge records for approximately 470 gaging stations operated and maintained by the U.S. Geological Survey. The purpose of this report is to document and describe techniques for computing discharge records using the index velocity method. Computing discharge using the index velocity method differs from the traditional stage-discharge method by separating velocity and area into two ratings—the index velocity rating and the stage-area rating. The outputs from each of these ratings, mean channel velocity (V) and cross-sectional area (A), are then multiplied together to compute a discharge. For the index velocity method, V is a function of such parameters as streamwise velocity, stage, cross-stream velocity, and velocity head, and A is a function of stage and cross-section shape. The index velocity method can be used at locations where stage-discharge methods are used, but it is especially appropriate when more than one specific discharge can be measured for a specific stage. After the ADVM is selected, installed, and configured, the stage-area rating and the index velocity rating must be developed. A standard cross section is identified and surveyed in order to develop the stage-area rating. The standard cross section should be surveyed every year for the first 3 years of operation and thereafter at a lesser frequency, depending on the susceptibility of the cross section to change. Periodic measurements of discharge are used to calibrate and validate the index rating for the range of conditions experienced at the gaging station. Data from discharge measurements, ADVMs, and stage sensors are compiled for index-rating analysis. Index ratings are developed by means of regression
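
    At computation time, the two-rating structure described in this report reduces to Q = V × A. A minimal sketch under invented rating coefficients follows; real ratings come from the calibration measurements the report describes:

```python
# Index velocity method: discharge Q = V * A, with V from an index-velocity
# rating and A from a stage-area rating. The linear rating coefficients and
# the channel shape below are invented for illustration.

def mean_velocity(v_index: float) -> float:
    """Index-velocity rating: V = a + b * v_index (a, b from calibration)."""
    a, b = 0.05, 0.92
    return a + b * v_index

def area(stage: float) -> float:
    """Stage-area rating for the surveyed standard cross section (m^2)."""
    return 15.0 * stage + 1.2 * stage ** 2

v_adv, h = 0.84, 2.3          # ADVM index velocity (m/s), stage (m)
q = mean_velocity(v_adv) * area(h)
print(round(q, 1), "m^3/s")
```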

  18. Computational efficiency for the surface renewal method

    Science.gov (United States)

    Kelley, Jason; Higgins, Chad

    2018-04-01

    Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly; these were tested for sensitivity to the length of the flux averaging period, for the ability to measure over a large range of lag timescales, and for overall computational efficiency. The algorithms utilize signal processing techniques and algebraic simplifications that dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. The increased speed of computation grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
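
    A representative lag-dependent building block in SR analyses is the structure function of the high-frequency scalar record, evaluated at many candidate lags — the kind of repeated calculation this record targets. A minimal vectorized sketch on synthetic data (not the authors' algorithms) is:

```python
import numpy as np

def structure_function(x: np.ndarray, lag: int, order: int) -> float:
    """n-th order structure function of a scalar time series at a given lag."""
    d = x[lag:] - x[:-lag]
    return float(np.mean(d ** order))

# Synthetic 10 Hz temperature record; SR analyses commonly evaluate several
# orders over many candidate lags, which is where vectorization pays off.
rng = np.random.default_rng(0)
temp = np.cumsum(rng.normal(0.0, 0.05, 36000))  # 1 h of 10 Hz data
for n in (2, 3, 5):
    print(n, structure_function(temp, lag=5, order=n))
```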

  19. Computational methods in molecular imaging technologies

    CERN Document Server

    Gunjan, Vinit Kumar; Venkatesh, C; Amarnath, M

    2017-01-01

    This book highlights the experimental investigations that have been carried out on magnetic resonance imaging and computed tomography (MRI & CT) images using state-of-the-art Computational Image processing techniques, and tabulates the statistical values wherever necessary. In a very simple and straightforward way, it explains how image processing methods are used to improve the quality of medical images and facilitate analysis. It offers a valuable resource for researchers, engineers, medical doctors and bioinformatics experts alike.

  20. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M

    2005-01-01

    This authoritative text (the second part of a complete MSc course) provides mathematical methods required to describe images, image formation and different imaging systems, coupled with the principle techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. This book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples of current research

  1. Zonal methods and computational fluid dynamics

    International Nuclear Information System (INIS)

    Atta, E.H.

    1985-01-01

    Recent advances in developing numerical algorithms for solving fluid flow problems, and the continuing improvement in the speed and storage of large scale computers have made it feasible to compute the flow field about complex and realistic configurations. Current solution methods involve the use of a hierarchy of mathematical models ranging from the linearized potential equation to the Navier Stokes equations. Because of the increasing complexity of both the geometries and flowfields encountered in practical fluid flow simulation, there is a growing emphasis in computational fluid dynamics on the use of zonal methods. A zonal method is one that subdivides the total flow region into interconnected smaller regions or zones. The flow solutions in these zones are then patched together to establish the global flow field solution. Zonal methods are primarily used either to limit the complexity of the governing flow equations to a localized region or to alleviate the grid generation problems about geometrically complex and multicomponent configurations. This paper surveys the application of zonal methods for solving the flow field about two and three-dimensional configurations. Various factors affecting their accuracy and ease of implementation are also discussed. From the presented review it is concluded that zonal methods promise to be very effective for computing complex flowfields and configurations. Currently there are increasing efforts to improve their efficiency, versatility, and accuracy

  2. Domain decomposition methods and parallel computing

    International Nuclear Information System (INIS)

    Meurant, G.

    1991-01-01

    In this paper, we show how to efficiently solve large linear systems on parallel computers. These linear systems arise from the discretization of scientific computing problems described by systems of partial differential equations. We show how to obtain a discrete finite-dimensional system from the continuous problem, and the chosen conjugate gradient iterative algorithm is briefly described. Then, the different kinds of parallel architectures are reviewed and their advantages and deficiencies are emphasized. We sketch the problems encountered in programming the conjugate gradient method on parallel computers. For this algorithm to be efficient on parallel machines, domain decomposition techniques are introduced. We give results of numerical experiments showing that these techniques allow a good rate of convergence for the conjugate gradient algorithm as well as computational speeds in excess of a billion floating point operations per second. (author). 5 refs., 11 figs., 2 tabs., 1 inset
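
    The serial baseline underlying this record — plain conjugate gradient for a symmetric positive-definite system — is short enough to sketch. In practice, domain decomposition enters as a preconditioner applied to the residual; this minimal version omits preconditioning:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain CG for a symmetric positive-definite matrix A (no preconditioner)."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1-D Poisson matrix (tridiagonal [-1, 2, -1]) as a toy discretized PDE:
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))  # ~0: the residual after convergence
```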

  3. Computational and instrumental methods in EPR

    CERN Document Server

    Bender, Christopher J

    2006-01-01

    Computational and Instrumental Methods in EPR (C. J. Bender, Fordham University; Lawrence J. Berliner, University of Denver, eds.). Electron magnetic resonance has been greatly facilitated by the introduction of advances in instrumentation and better computational tools, such as the increasingly widespread use of the density matrix formalism. This volume is devoted to both the instrumentation and the computation aspects of EPR, while addressing applications such as spin relaxation time measurements, the measurement of hyperfine interaction parameters, and the recovery of Mn(II) spin Hamiltonian parameters via spectral simulation. Key features: Microwave Amplitude Modulation Technique to Measure Spin-Lattice (T1) and Spin-Spin (T2) Relaxation Times; Improvement in the Measurement of Spin-Lattice Relaxation Time in Electron Paramagnetic Resonance; Quantitative Measurement of Magnetic Hyperfine Parameters and the Physical Organic Chemistry of Supramolecular Systems; New Methods of Simulation of Mn(II) EPR Spectra: Single Cryst...

  4. Proceedings of computational methods in materials science

    International Nuclear Information System (INIS)

    Mark, J.E.; Glicksman, M.E.; Marsh, S.P.

    1992-01-01

    The Symposium on which this volume is based was conceived as a timely expression of some of the fast-paced developments occurring throughout materials science and engineering. It focuses particularly on those involving modern computational methods applied to model and predict the response of materials under a diverse range of physico-chemical conditions. The current easy access of many materials scientists in industry, government laboratories, and academe to high-performance computers has opened many new vistas for predicting the behavior of complex materials under realistic conditions. Some have even argued that modern computational methods in materials science and engineering are literally redefining the bounds of our knowledge from which we predict structure-property relationships, perhaps forever changing the historically descriptive character of the science and much of the engineering

  5. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  6. Readability Comparison of Pro- and Anti-Cancer Screening Online Messages in Japan

    Science.gov (United States)

    Okuhara, Tsuyoshi; Ishikawa, Hirono; Okada, Masahumi; Kato, Mio; Kiuchi, Takahiro

    2016-01-01

    Background: Cancer screening rates are lower in Japan than those in western countries. Health professionals publish pro-cancer screening messages on the internet to encourage audiences to undergo cancer screening. However, the information provided is often difficult to read for lay persons. Further, anti-cancer screening activists warn against cancer screening with messages on the Internet. We aimed to assess and compare the readability of pro- and anti-cancer screening online messages in Japan using a measure of readability. Methods: We conducted web searches at the beginning of September 2016 using two major Japanese search engines (Google.jp and Yahoo!.jp). The included websites were classified as “anti”, “pro”, or “neutral” depending on the claims, and “health professional” or “non-health professional” depending on the writers. Readability was determined using a validated measure of Japanese readability. Statistical analysis was conducted using two-way ANOVA. Results: Across the 159 websites analyzed, anti-cancer screening online messages were generally easier to read than pro-cancer screening online messages, and messages written by health professionals were more difficult to read than those written by non-health professionals. The claim × writer interaction was not significant. Conclusion: When health professionals prepare pro-cancer screening materials for publication online, we recommend they check for readability using readability assessment tools and improve the text for easy comprehension where necessary. PMID:28125867

  7. Readability Assessment of Online Patient Education Material on Congestive Heart Failure

    Science.gov (United States)

    2017-01-01

    Background Online health information is being used more ubiquitously by the general population. However, this information typically favors only a small percentage of readers, which can result in suboptimal medical outcomes for patients. Objective The readability of online patient education materials on the topic of congestive heart failure was assessed with six readability assessment tools. Methods The search phrase “congestive heart failure” was entered into the search engine Google. Of the first 100 websites, only 70 complied with the selection and exclusion criteria and were included. These were then assessed with six readability assessment tools. Results Only 5 of the 70 websites were within the limits of the recommended sixth-grade readability level. The mean readability scores were as follows: Flesch-Kincaid Grade Level (9.79), Gunning-Fog Score (11.95), Coleman-Liau Index (15.17), Simple Measure of Gobbledygook (SMOG) index (11.39), and Flesch Reading Ease (48.87). Conclusion Most of the analyzed websites were found to be above the sixth-grade readability level recommendations. Efforts need to be made to better tailor online patient education materials to the general population. PMID:28656111

  8. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  9. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  10. The asymptotic expansion method via symbolic computation

    OpenAIRE

    Navarro, Juan F.

    2012-01-01

    This paper describes an algorithm for implementing a perturbation method based on an asymptotic expansion of the solution to a second-order differential equation. We also introduce a new symbolic computation system which works with the so-called modified quasipolynomials, as well as an implementation of the algorithm on it.

  11. The Asymptotic Expansion Method via Symbolic Computation

    Directory of Open Access Journals (Sweden)

    Juan F. Navarro

    2012-01-01

    This paper describes an algorithm for implementing a perturbation method based on an asymptotic expansion of the solution to a second-order differential equation. We also introduce a new symbolic computation system which works with the so-called modified quasipolynomials, as well as an implementation of the algorithm on it.
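
    The order-by-order bookkeeping that such a symbolic system automates can be illustrated with a toy algebraic example in SymPy; this is a generic perturbation sketch, not the authors' system or their quasipolynomial algebra. Substitute a power-series ansatz, collect powers of the small parameter, and solve each order in turn:

```python
import sympy as sp

x, eps = sp.symbols("x epsilon")
a0, a1, a2 = sp.symbols("a0 a1 a2")

# Ansatz: x(eps) = a0 + a1*eps + a2*eps**2 + O(eps**3), for x**2 + eps*x - 1 = 0
ansatz = a0 + a1 * eps + a2 * eps**2
expr = sp.expand(ansatz**2 + eps * ansatz - 1)

# Collect powers of eps and solve order by order.
coeffs = sp.Poly(expr, eps).all_coeffs()[::-1]  # [eps^0, eps^1, eps^2, ...]
sol0 = sp.solve(coeffs[0], a0)[1]               # pick the root near +1
sol1 = sp.solve(coeffs[1].subs(a0, sol0), a1)[0]
sol2 = sp.solve(coeffs[2].subs({a0: sol0, a1: sol1}), a2)[0]
print(sol0, sol1, sol2)  # 1, -1/2, 1/8  ->  x ~ 1 - eps/2 + eps**2/8
```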

  12. Computationally efficient methods for digital control

    NARCIS (Netherlands)

    Guerreiro Tome Antunes, D.J.; Hespanha, J.P.; Silvestre, C.J.; Kataria, N.; Brewer, F.

    2008-01-01

    The problem of designing a digital controller is considered with the novelty of explicitly taking into account the computation cost of the controller implementation. A class of controller emulation methods inspired by numerical analysis is proposed. Through various examples it is shown that these

  13. [Readability of surgical informed consent in Spain].

    Science.gov (United States)

    San Norberto, Enrique María; Gómez-Alonso, Daniel; Trigueros, José M; Quiroga, Jorge; Gualis, Javier; Vaquero, Carlos

    2014-03-01

    To assess the readability of the informed consent documents (IC) of the different national surgical societies, we collected 504 IC protocols of different specialties during January 2012. To calculate readability parameters, the following criteria were assessed: number of words, syllables and phrases; syllables/word and words/phrase averages; the word correlation index; the Flesch-Szigriszt index; the Fernández-Huerta index; the Inflesz scale degree; and the Gunning-Fog index. The mean Flesch-Szigriszt index was 50.65 ± 6.72, so readability is considered normal. There are significant differences between specialties such as Urology (43.00 ± 4.17) and Angiology and Vascular Surgery (63.00 ± 3.26, P<.001). No IC would be appropriate for adult readability according to the Fernández-Huerta index (total mean 55.77 ± 6.57); the IC of Angiology and Vascular Surgery were the closest ones (67.85 ± 3.20). Considering the Inflesz scale degree (total mean of 2.84 ± 3.23), IC can be described as «somewhat difficult». There are significant differences between the IC of Angiology and Vascular Surgery (3.23 ± 0.47), which could be qualified as normal, or Cardiovascular Surgery (2.79 ± 0.43), as «nearly normal readability»; and others such as Urology (1.70 ± 0.46, P<.001) and Thoracic Surgery (1.90 ± 0.30, P<.001), with a readability between «very» and «somewhat» difficult. The IC documents developed by the scientific societies of the different surgical specialties do not have an adequate readability for patients. We recommend the use of readability indexes during the writing of these consent forms. The Gunning-Fog indexes are far from the readability for a general audience (total mean of 26.29 ± 10.89). Copyright © 2012 AEC. Published by Elsevier España. All rights reserved.

  14. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    Science.gov (United States)

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.
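
    As a loose illustration of what "machine-readable metadata" generation involves, the sketch below emits an XML record with Python's standard library. The element names are placeholders, not the actual XEML schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical structure; the real XEML dialect and Xeml Environment Ontology differ.
    experiment = ET.Element("experiment", id="exp-001")
    ET.SubElement(experiment, "genotype").text = "Arabidopsis thaliana Col-0"
    growth = ET.SubElement(experiment, "growthConditions")
    ET.SubElement(growth, "photoperiod", unit="h").text = "12"
    ET.SubElement(growth, "temperature", unit="C").text = "21"
    ET.SubElement(experiment, "samplingTimepoint", unit="h").text = "24"

    ET.ElementTree(experiment).write("metadata.xml", xml_declaration=True, encoding="utf-8")
    ```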

  15. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  16. Computational Methods in Stochastic Dynamics Volume 2

    CERN Document Server

    Stefanou, George; Papadopoulos, Vissarion

    2013-01-01

    The considerable influence of inherent uncertainties on structural behavior has led the engineering community to recognize the importance of a stochastic approach to structural problems. Issues related to uncertainty quantification and its influence on the reliability of the computational models are continuously gaining in significance. In particular, the problems of dynamic response analysis and reliability assessment of structures with uncertain system and excitation parameters have been the subject of continuous research over the last two decades as a result of the increasing availability of powerful computing resources and technology.   This book is a follow up of a previous book with the same subject (ISBN 978-90-481-9986-0) and focuses on advanced computational methods and software tools which can highly assist in tackling complex problems in stochastic dynamic/seismic analysis and design of structures. The selected chapters are authored by some of the most active scholars in their respective areas and...

  17. Readability of Online Materials for Rhinoplasty.

    Science.gov (United States)

    Santos, Pauline Joy F; Daar, David A; Paydar, Keyianoosh Z; Wirth, Garrett A

    2018-01-01

    Rhinoplasty is a popular aesthetic and reconstructive surgical procedure. However, little is known about the content and readability of online materials for patient education. The recommended grade level for educational materials is 7th to 8th grade according to the National Institutes of Health (NIH). This study aims to assess the readability of online patient resources for rhinoplasty. The largest public search engine, Google, was queried using the term "rhinoplasty" on February 26, 2016. Location filters were disabled and sponsored results excluded to avoid any inadvertent search bias. The 10 most popular websites were identified and all relevant, patient-directed information within one click from the original site was downloaded and saved as plain text. Readability was analyzed using five established analyses (Readability-score.com, Added Bytes, Ltd., Sussex, UK). Analysis of ten websites demonstrates an average grade level of at least 12th grade. No material was at the recommended 7th to 8th grade reading level (Flesch-Kincaid, 11.1; Gunning-Fog, 14.1; Coleman-Liau, 14.5; SMOG 10.4; Automated Readability, 10.7; Average Grade Level, 12.2). Overall Flesch-Kincaid Reading Ease Index was 43.5, which is rated as "difficult." Online materials available for rhinoplasty exceed NIH-recommended reading levels, which may prevent appropriate decision-making in patients considering these types of surgery. Outcomes of this study identify that Plastic Surgeons should be cognizant of available online patient materials and make efforts to develop and provide more appropriate materials. Readability results can also contribute to marketing strategy and attracting a more widespread interest in the procedure.
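
    The five tests named above are all implemented in the third-party textstat package, so a comparable batch analysis can be sketched in a few lines (the file name is hypothetical):

    ```python
    import textstat  # pip install textstat

    text = open("rhinoplasty_page.txt").read()   # saved plain-text page content

    for name, fn in [
        ("Flesch-Kincaid grade", textstat.flesch_kincaid_grade),
        ("Gunning-Fog", textstat.gunning_fog),
        ("Coleman-Liau", textstat.coleman_liau_index),
        ("SMOG", textstat.smog_index),
        ("Automated Readability", textstat.automated_readability_index),
        ("Flesch Reading Ease", textstat.flesch_reading_ease),
    ]:
        print(f"{name}: {fn(text):.1f}")
    ```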

  18. Computational methods for industrial radiation measurement applications

    International Nuclear Information System (INIS)

    Gardner, R.P.; Guo, P.; Ao, Q.

    1996-01-01

    Computational methods have been used with considerable success to complement radiation measurements in solving a wide range of industrial problems. The almost exponential growth of computer capability and applications in the last few years leads to a "black box" mentality for radiation measurement applications. If a black box is defined as any radiation measurement device that is capable of measuring the parameters of interest when a wide range of operating and sample conditions may occur, then the development of computational methods for industrial radiation measurement applications should now be focused on the black box approach and the deduction of properties of interest from the response with acceptable accuracy and reasonable efficiency. Nowadays, an increasingly better understanding of radiation physical processes, more accurate and complete fundamental physical data, and more advanced modeling and software/hardware techniques have made it possible to make giant strides in that direction with new ideas implemented in computer software. The Center for Engineering Applications of Radioisotopes (CEAR) at North Carolina State University has been working on a variety of projects in the area of radiation analyzers and gauges to accomplish this for quite some time, and these are discussed here with emphasis on current accomplishments

  19. BLUES function method in computational physics

    Science.gov (United States)

    Indekeu, Joseph O.; Müller-Nedebock, Kristian K.

    2018-04-01

    We introduce a computational method in physics that goes ‘beyond linear use of equation superposition’ (BLUES). A BLUES function is defined as a solution of a nonlinear differential equation (DE) with a delta source that is at the same time a Green’s function for a related linear DE. For an arbitrary source, the BLUES function can be used to construct an exact solution to the nonlinear DE with a different, but related source. Alternatively, the BLUES function can be used to construct an approximate piecewise analytical solution to the nonlinear DE with an arbitrary source. For this alternative use the related linear DE need not be known. The method is illustrated in a few examples using analytical calculations and numerical computations. Areas for further applications are suggested.
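
    A minimal numerical sketch of the underlying idea, for the ODE u' + u + εu² = f: the Green's function of the linear part is G(x) = e^(-x), and iterating u ← G∗(f − εu²) refines the linear solution. This is a plain Picard-type illustration under those assumptions, not the BLUES construction itself:

    ```python
    import numpy as np

    dx, n, eps = 0.01, 2000, 0.3
    x = np.arange(n) * dx
    G = np.exp(-x)                         # causal Green's function of u' + u
    f = np.exp(-2.0 * x)                   # an example source term

    def conv(a, b):
        return np.convolve(a, b)[:n] * dx  # causal convolution on the grid

    u = conv(G, f)                         # linear (zeroth) approximation
    for _ in range(20):                    # fixed-point refinement for the nonlinear DE
        u = conv(G, f - eps * u**2)

    print(u[:5])
    ```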

  20. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  1. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    Full Text Available The paper presents the main issues of a computer animation of a set of elastic macroscopic objects based on the particle method. The main assumption of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other. The movements and deformations of solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions and interactions with an optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (particle method and cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the methods of parallelization and considers problems of load balancing, collision detection, process synchronization and distributed control of the animation.
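
    One time step of such a particle scheme is easy to sketch: point masses joined by springs, advanced under gravity. Parameters and structure below are illustrative, not taken from the paper:

    ```python
    import numpy as np

    pos = np.array([[0.0, 0.0], [1.0, 0.0]])   # two particles
    vel = np.zeros_like(pos)
    springs = [(0, 1, 1.0, 50.0)]              # (i, j, rest length, stiffness)
    mass, g, dt = 1.0, np.array([0.0, -9.81]), 1e-3

    def step(pos, vel):
        force = np.tile(mass * g, (len(pos), 1))   # gravity on every particle
        for i, j, rest, k in springs:
            d = pos[j] - pos[i]
            length = np.linalg.norm(d)
            f = k * (length - rest) * d / length   # Hooke's law along the spring
            force[i] += f
            force[j] -= f
        vel = vel + dt * force / mass              # semi-implicit Euler update
        return pos + dt * vel, vel

    for _ in range(1000):
        pos, vel = step(pos, vel)
    print(pos)
    ```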

  2. Computational methods of electron/photon transport

    International Nuclear Information System (INIS)

    Mack, J.M.

    1983-01-01

    A review of computational methods simulating the non-plasma transport of electrons and their attendant cascades is presented. Remarks are mainly restricted to linearized formalisms at electron energies above 1 keV. The effectiveness of various methods is discussed, including moments, point-kernel, invariant imbedding, discrete-ordinates, and Monte Carlo. Future research directions and the potential impact on various aspects of science and engineering are indicated

  3. Mathematical optics classical, quantum, and computational methods

    CERN Document Server

    Lakshminarayanan, Vasudevan

    2012-01-01

    Going beyond standard introductory texts, Mathematical Optics: Classical, Quantum, and Computational Methods brings together many new mathematical techniques from optical science and engineering research. Profusely illustrated, the book makes the material accessible to students and newcomers to the field. Divided into six parts, the text presents state-of-the-art mathematical methods and applications in classical optics, quantum optics, and image processing. Part I describes the use of phase space concepts to characterize optical beams and the application of dynamic programming in optical wave…

  4. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

    The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken in Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies the following problems were considered: the collection of existing information for each galaxy; a critical approach to data aimed at the selection of the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  5. Delamination detection using methods of computational intelligence

    Science.gov (United States)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates, and hence such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates is presented in this paper that relies on methods based on computational intelligence. Typical changes in the observed vibration characteristics (i.e. changes in natural frequencies) are considered as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying tool for analysis turns out to be computationally expensive. A surrogate-assisted optimization approach is hence introduced to contain the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme, while an improved rate of convergence is achieved using a memetic algorithm. However, building ANN surrogate models usually requires large training datasets. K-means clustering is effectively employed to reduce the size of the datasets. The ANN is also used via inverse modeling to determine the position, size and location of delaminations from changes in measured natural frequencies. The results clearly highlight the efficiency and robustness of the approach.
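
    The surrogate pipeline can be sketched with scikit-learn on synthetic data: k-means condenses a large training set, and a neural network serves as the inverse model from frequency shifts to delamination parameters. scikit-learn has no Bayesian regularization, so an L2 penalty (alpha) stands in for it; everything below is illustrative:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 4))    # stand-in: shifts of 4 natural frequencies
    y = X @ rng.normal(size=(4, 2)) + 0.05 * rng.normal(size=(5000, 2))  # location, size

    # Condense 5000 samples to 200 k-means representatives.
    km = KMeans(n_clusters=200, n_init=10, random_state=0).fit(X)
    y_centers = np.array([y[km.labels_ == c].mean(axis=0) for c in range(200)])

    model = MLPRegressor(hidden_layer_sizes=(32, 32), alpha=1e-3,
                         max_iter=2000, random_state=0).fit(km.cluster_centers_, y_centers)
    print(model.predict(X[:3]))       # predicted delamination parameters
    ```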

  6. A readability assessment of online stroke information.

    Science.gov (United States)

    Sharma, Nikhil; Tridimas, Andreas; Fitzsimmons, Paul R

    2014-07-01

    Patients and carers increasingly access the Internet as a source of health information. Poor health literacy is extremely common and frequently limits patients' comprehension of health care literature. We aimed to assess the readability of online consumer-orientated stroke information using 2 validated readability measures. The 100 highest Google-ranked consumer-oriented stroke Web pages were assessed for reading difficulty using the Flesch-Kincaid and Simple Measure of Gobbledygook (SMOG) formulae. None of the included Web pages complied with current readability guidelines when readability was measured using the gold standard SMOG formula. Mean Flesch-Kincaid grade level was 10.4 (95% confidence interval [CI] 9.97-10.9) and mean SMOG grade 12.1 (95% CI 11.7-12.4). Over half of the Web pages were produced at graduate reading levels or above. Not-for-profit Web pages were significantly easier to read (P=.0006). The Flesch-Kincaid formula significantly underestimated reading difficulty, with a mean underestimation of 1.65 grades (95% CI 1.49-1.81). Consumer-oriented stroke information needs to comply with readability guidelines and to be comprehensible to the average patient. The Flesch-Kincaid formula significantly underestimates reading difficulty, and SMOG should be used as the measure of choice. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.
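
    The SMOG formula used as the gold standard here has a compact published form: grade = 1.0430·sqrt(polysyllables × 30 / sentences) + 3.1291, where "polysyllables" counts words of three or more syllables. A direct transcription:

    ```python
    import math

    def smog_grade(polysyllables: int, sentences: int) -> float:
        return 1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291

    print(round(smog_grade(polysyllables=42, sentences=30), 1))   # 9.9
    ```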

  7. Readability Revisited? The Implications of Text Complexity

    Science.gov (United States)

    Wray, David; Janan, Dahlia

    2013-01-01

    The concept of readability has had a variable history, moving from a position where it was considered as a very important topic for those responsible for producing texts and matching those texts to the abilities and needs of learners, to its current declining visibility in the education literature. Some important work has been coming from the USA…

  8. Readability of Early Intervention Program Literature

    Science.gov (United States)

    Pizur-Barnekow, Kris; Patrick, Timothy; Rhyner, Paula M.; Cashin, Susan; Rentmeester, Angela

    2011-01-01

    Accessibility of early intervention program literature was examined through readability analysis of documents given to families who have a child served by the Birth to 3 program. Nine agencies that serve families in Birth to 3 programs located in a county in the Midwest provided the (n = 94) documents. Documents were included in the analysis if…

  9. A Readability Analysis of Selected Introductory Economics.

    Science.gov (United States)

    Gallagher, Daniel J.; Thompson, G. Rodney

    1981-01-01

    To aid secondary school and college level economics teachers as they select textbooks for introductory economics courses, this article recounts how teachers can use the Flesch Reading Ease Test to measure readability. Data are presented on application of the Flesch Reading Ease Test to 15 introductory economics textbooks. (Author/DB)

  10. Measuring the Readability of Children's Trade Books.

    Science.gov (United States)

    Popp, Helen M.; Porter, Douglas

    In order to utilize interesting children's trade books in a systematic reading program, two readability formulas were devised based on a selection of children's trade books. Children's scores on selections from these books and judges' rankings were compared. The judges' decisions were considered to be highly credible and were used as the criterion…

  11. Efficient computation method of Jacobian matrix

    International Nuclear Information System (INIS)

    Sasaki, Shinobu

    1995-05-01

    As is well known, the elements of the Jacobian matrix are complex trigonometric functions of the joint angles, resulting in a matrix of staggering complexity when written out in full. This article shows that the difficulties of this subject are overcome by using a velocity representation. The main point is that its recursive algorithm and computer algebra technologies allow us to derive the analytical formulation with no human intervention. In particular, it is to be noted that, compared with previous results, the elements are greatly simplified through the effective use of frame transformations. Furthermore, in the case of a spherical wrist, it is shown that the present approach is computationally the most efficient. Due to such advantages, the proposed method is useful in studying kinematically peculiar properties such as singularity problems. (author)
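
    The report's recursive velocity-representation algorithm is not reproduced here, but the baseline it improves on, direct symbolic differentiation of the forward kinematics, is easy to sketch with SymPy for a planar two-link arm:

    ```python
    import sympy as sp

    q1, q2, l1, l2 = sp.symbols("q1 q2 l1 l2")
    x = l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2)   # end-effector forward kinematics
    y = l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)

    J = sp.Matrix([x, y]).jacobian([q1, q2])     # 2x2 Jacobian in the joint angles
    print(sp.simplify(J))
    ```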

  12. Computational method for free surface hydrodynamics

    International Nuclear Information System (INIS)

    Hirt, C.W.; Nichols, B.D.

    1980-01-01

    There are numerous flow phenomena in pressure vessel and piping systems that involve the dynamics of free fluid surfaces. For example, fluid interfaces must be considered during the draining or filling of tanks, in the formation and collapse of vapor bubbles, and in seismically shaken vessels that are partially filled. To aid in the analysis of these types of flow phenomena, a new technique has been developed for the computation of complicated free-surface motions. This technique is based on the concept of a local average volume of fluid (VOF) and is embodied in a computer program for two-dimensional, transient fluid flow called SOLA-VOF. The basic approach used in the VOF technique is briefly described, and compared to other free-surface methods. Specific capabilities of the SOLA-VOF program are illustrated by generic examples of bubble growth and collapse, flows of immiscible fluid mixtures, and the confinement of spilled liquids

  13. Soft Computing Methods for Disulfide Connectivity Prediction.

    Science.gov (United States)

    Márquez-Chamorro, Alfonso E; Aguilar-Ruiz, Jesús S

    2015-01-01

    The problem of protein structure prediction (PSP) is one of the main challenges in structural bioinformatics. To tackle this problem, PSP can be divided into several subproblems. One of these subproblems is the prediction of disulfide bonds. The disulfide connectivity prediction problem consists in identifying which nonadjacent cysteines would be cross-linked from all possible candidates. Determining the disulfide bond connectivity between the cysteines of a protein is desirable as a previous step of the 3D PSP, as the protein conformational search space is highly reduced. The most representative soft computing approaches for the disulfide bonds connectivity prediction problem of the last decade are summarized in this paper. Certain aspects, such as the different methodologies based on soft computing approaches (artificial neural network or support vector machine) or features of the algorithms, are used for the classification of these methods.
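
    Once a classifier (SVM or ANN) has scored each candidate cysteine pair, connectivity prediction reduces to picking a consistent set of bridges, a maximum-weight matching. A sketch with networkx and made-up scores:

    ```python
    import networkx as nx

    # Hypothetical bonding scores for cysteine pairs (higher = more likely bonded).
    scores = {(1, 3): 0.9, (1, 4): 0.2, (2, 4): 0.8,
              (2, 3): 0.3, (1, 2): 0.1, (3, 4): 0.15}

    G = nx.Graph()
    for (a, b), w in scores.items():
        G.add_edge(a, b, weight=w)

    # Maximum-weight matching selects a consistent set of disulfide bridges.
    print(nx.max_weight_matching(G, maxcardinality=True))   # {(1, 3), (2, 4)}, order may vary
    ```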

  14. Readability levels of health pamphlets distributed in hospitals and health centres in Athens, Greece.

    Science.gov (United States)

    Kondilis, B K; Akrivos, P D; Sardi, T A; Soteriades, E S; Falagas, M E

    2010-10-01

    Health literacy is important in the medical and social sciences due to its impact on behavioural and health outcomes. Nevertheless, little is known about it in Greece, including patients' level of understanding of health brochures and pamphlets distributed in Greek hospitals and clinics. Observational study in the greater metropolitan area of Athens, Greece. Pamphlets and brochures written in the Greek language were collected from 17 hospitals and healthcare centres between the spring and autumn of 2006. Readability of pamphlets was calculated using the Flesch-Kincaid, Simple Measure of Gobbledygook (SMOG) and Fog methods, based on Greek readability software. Out of 70 pamphlets collected from 17 hospitals, 37 pamphlets met the criteria for the study. The average readability level of all scanned pamphlets was ninth to 10th grade, corresponding to a readability level of 'average'. A highly significant difference was found between hospital types: pamphlets from private hospitals were one grade more difficult than those from public hospitals. Approximately 43.7% of the Greek population aged ≥20 years would not be able to comprehend the available pamphlets, which were found to have an average readability level of ninth to 10th grade. Further research examining readability levels in the context of health literacy in Greece is warranted. This effort paves the way for additional research in the field of readability levels of health pamphlets in the Greek language, the sources of health information, and the level of understanding of key health messages by the population. Copyright © 2010 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  15. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and that the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  16. Readability Assessment of Online Patient Abdominoplasty Resources.

    Science.gov (United States)

    Phillips, Nicole A; Vargas, Christina R; Chuang, Danielle J; Lee, Bernard T

    2015-02-01

    Limited functional health literacy is recognized as an important contributor to health disparities in the United States. As internet access becomes more universal, there is increasing concern about whether patients with poor or marginal literacy can access understandable healthcare information. As such, the National Institutes of Health and American Medical Association recommend that patient information be written at a sixth grade level. This study identifies the most popular online resources for patient information about abdominoplasty and evaluates their readability in the context of average American literacy. The two largest internet search engines were queried for "tummy tuck surgery" to simulate a patient search in lay terms. The ten most popular sites common to both search engines were identified, and all relevant articles from the main sites were downloaded. Sponsored results were excluded. Readability analysis of the articles was performed using ten established tests. Online information about abdominoplasty from the ten most popular publicly available websites had an overall average readability of 12th grade. Mean reading grade level scores among tests were: Coleman-Liau 11.9, Flesch-Kincaid 11.4, FORCAST 11.1, Fry 13, Gunning Fog 13.5, New Dale-Chall 11.8, New Fog Count 9.9, Raygor Estimate 12, and SMOG 13.4; the Flesch Reading Ease index score was 46. Online patient resources about abdominoplasty are uniformly above the recommended target readability level and are likely too difficult for many patients to understand. The range of readability identified among websites could allow surgeons to guide patients to more appropriate resources for their literacy skills.

  17. Evolutionary Computing Methods for Spectral Retrieval

    Science.gov (United States)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
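
    The retrieval loop itself, perturb model parameters and keep changes that reduce the misfit between observed and synthetic spectra, can be sketched with simulated annealing on a toy one-line Gaussian forward model (all parameters illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    wl = np.linspace(0.0, 1.0, 200)

    def synthetic(params):                   # toy forward model: one absorption line
        center, depth = params
        return 1.0 - depth * np.exp(-((wl - center) / 0.05) ** 2)

    observed = synthetic((0.42, 0.3)) + 0.01 * rng.normal(size=wl.size)

    def fitness(params):                     # misfit between observed and synthetic
        return np.sum((observed - synthetic(params)) ** 2)

    params, temp = np.array([0.5, 0.5]), 1.0
    for _ in range(5000):
        trial = params + 0.02 * rng.normal(size=2)
        delta = fitness(trial) - fitness(params)
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            params = trial                   # accept improving (or occasionally worse) moves
        temp *= 0.999                        # geometric cooling schedule

    print(params)                            # should approach (0.42, 0.3)
    ```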

  18. Geometric computations with interval and new robust methods applications in computer graphics, GIS and computational geometry

    CERN Document Server

    Ratschek, H

    2003-01-01

    This undergraduate and postgraduate text will familiarise readers with interval arithmetic and related tools to gain reliable and validated results and logically correct decisions for a variety of geometric computations, plus the means for alleviating the effects of the errors. It also considers computations on geometric point-sets, which are neither robust nor reliable in processing with standard methods. The authors provide two effective tools for obtaining correct results: (a) interval arithmetic, and (b) ESSA, the new powerful algorithm which improves many geometric computations and makes the…
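
    The core of interval arithmetic fits in a few lines: every operation returns an interval guaranteed to enclose the true result. A minimal sketch (a production implementation would also force outward rounding, omitted here):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            p = (self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi)
            return Interval(min(p), max(p))

    a, b = Interval(1.0, 2.0), Interval(-0.5, 0.5)
    print(a + b)   # Interval(lo=0.5, hi=2.5)
    print(a * b)   # Interval(lo=-1.0, hi=1.0)
    ```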

  19. A computational method for sharp interface advection

    Science.gov (United States)

    Roenby, Johan; Bredmose, Henrik; Jasak, Hrvoje

    2016-01-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volume of fluid (VOF) idea of calculating the volume of one of the fluids transported across the mesh faces during a time step. The novelty of the isoAdvector concept consists of two parts. First, we exploit an isosurface concept for modelling the interface inside cells in a geometric surface reconstruction step. Second, from the reconstructed surface, we model the motion of the face–interface intersection line for a general polygonal face to obtain the time evolution within a time step of the submerged face area. Integrating this submerged area over the time step leads to an accurate estimate for the total volume of fluid transported across the face. The method was tested on simple two-dimensional and three-dimensional interface advection problems on both structured and unstructured meshes. The results are very satisfactory in terms of volume conservation, boundedness, surface sharpness and efficiency. The isoAdvector method was implemented as an OpenFOAM® extension and is published as open source. PMID:28018619

  20. A computational method for sharp interface advection.

    Science.gov (United States)

    Roenby, Johan; Bredmose, Henrik; Jasak, Hrvoje

    2016-11-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volume of fluid (VOF) idea of calculating the volume of one of the fluids transported across the mesh faces during a time step. The novelty of the isoAdvector concept consists of two parts. First, we exploit an isosurface concept for modelling the interface inside cells in a geometric surface reconstruction step. Second, from the reconstructed surface, we model the motion of the face-interface intersection line for a general polygonal face to obtain the time evolution within a time step of the submerged face area. Integrating this submerged area over the time step leads to an accurate estimate for the total volume of fluid transported across the face. The method was tested on simple two-dimensional and three-dimensional interface advection problems on both structured and unstructured meshes. The results are very satisfactory in terms of volume conservation, boundedness, surface sharpness and efficiency. The isoAdvector method was implemented as an OpenFOAM ® extension and is published as open source.
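
    isoAdvector's geometric reconstruction on polyhedral meshes is well beyond a snippet, but the VOF bookkeeping it improves on can be shown in one dimension: advect a volume-fraction field with first-order upwind fluxes. Volume is conserved exactly while the interface smears, which is precisely what geometric schemes like isoAdvector counteract:

    ```python
    import numpy as np

    n = 200
    dx, u, dt = 1.0 / n, 1.0, 0.002        # CFL = u*dt/dx = 0.4
    alpha = np.zeros(n)
    alpha[40:80] = 1.0                     # initial slab of fluid 1

    for _ in range(100):
        flux = u * alpha * dt / dx         # fraction leaving each cell to the right
        alpha = alpha - flux + np.roll(flux, 1)   # periodic upwind update

    print(alpha.sum() * dx)                # total volume of fluid 1 is conserved
    ```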

  1. Computational electromagnetic methods for transcranial magnetic stimulation

    Science.gov (United States)

    Gomez, Luis J.

    Transcranial magnetic stimulation (TMS) is a noninvasive technique used both as a research tool for cognitive neuroscience and as an FDA-approved treatment for depression. During TMS, coils positioned near the scalp generate electric fields and activate targeted brain regions. In this thesis, several computational electromagnetics methods that improve the analysis, design, and uncertainty quantification of TMS systems were developed. Analysis: A new fast direct technique for solving the large and sparse linear systems of equations (LSEs) arising from the finite difference (FD) discretization of Maxwell's quasi-static equations was developed. Following a factorization step, the solver permits computation of TMS fields inside realistic brain models in seconds, allowing for patient-specific real-time usage during TMS. The solver is an alternative to iterative methods for solving FD LSEs, which often require run-times of minutes. A new integral equation (IE) method for analyzing TMS fields was developed. The human head is highly heterogeneous and characterized by high relative permittivities (~10^7). IE techniques for analyzing electromagnetic interactions with such media suffer from high-contrast and low-frequency breakdowns. A novel high-permittivity and low-frequency stable, internally combined volume-surface IE method was developed. The method not only applies to the analysis of high-permittivity objects, but is also the first IE tool that is stable when analyzing highly inhomogeneous negative-permittivity plasmas. Design: TMS applications call for electric fields to be sharply focused on regions that lie deep inside the brain. Unfortunately, fields generated by present-day figure-8 coils stimulate relatively large regions near the brain surface. An optimization method for designing single-feed TMS coil arrays capable of producing more localized and deeper stimulation was developed. Results show that the coil arrays stimulate 2.4 cm into the head while stimulating 3…

  2. Computational predictive methods for fracture and fatigue

    Science.gov (United States)

    Cordes, J.; Chang, A. T.; Nelson, N.; Kim, Y.

    1994-09-01

    The damage-tolerant design philosophy as used by aircraft industries enables aircraft components and aircraft structures to operate safely with minor damage, small cracks, and flaws. Maintenance and inspection procedures insure that damages developed during service remain below design values. When damage is found, repairs or design modifications are implemented and flight is resumed. Design and redesign guidelines, such as military specifications MIL-A-83444, have successfully reduced the incidence of damage and cracks. However, fatigue cracks continue to appear in aircraft well before the design life has expired. The F16 airplane, for instance, developed small cracks in the engine mount, wing support, bulk heads, the fuselage upper skin, the fuel shelf joints, and along the upper wings. Some cracks were found after 600 hours of the 8000 hour design service life and design modifications were required. Tests on the F16 plane showed that the design loading conditions were close to the predicted loading conditions. Improvements to analytic methods for predicting fatigue crack growth adjacent to holes, when multiple damage sites are present, and in corrosive environments would result in more cost-effective designs, fewer repairs, and fewer redesigns. The overall objective of the research described in this paper is to develop, verify, and extend the computational efficiency of analysis procedures necessary for damage tolerant design. This paper describes an elastic/plastic fracture method and an associated fatigue analysis method for damage tolerant design. Both methods are unique in that material parameters such as fracture toughness, R-curve data, and fatigue constants are not required. The methods are implemented with a general-purpose finite element package. Several proof-of-concept examples are given. With further development, the methods could be extended for analysis of multi-site damage, creep-fatigue, and corrosion fatigue problems.

  3. Modules and methods for all photonic computing

    Science.gov (United States)

    Schultz, David R.; Ma, Chao Hung

    2001-01-01

    A method for all photonic computing, comprising the steps of: encoding a first optical/electro-optical element with a two dimensional mathematical function representing input data; illuminating the first optical/electro-optical element with a collimated beam of light; illuminating a second optical/electro-optical element with light from the first optical/electro-optical element, the second optical/electro-optical element having a characteristic response corresponding to an iterative algorithm useful for solving a partial differential equation; iteratively recirculating the signal through the second optical/electro-optical element with light from the second optical/electro-optical element for a predetermined number of iterations; and, after the predetermined number of iterations, optically and/or electro-optically collecting output data representing an iterative optical solution from the second optical/electro-optical element.

  4. Optical design teaching by computing graphic methods

    Science.gov (United States)

    Vazquez-Molini, D.; Muñoz-Luna, J.; Fernandez-Balbuena, A. A.; Garcia-Botella, A.; Belloni, P.; Alda, J.

    2012-10-01

    One of the key challenges in the teaching of optics is that students need to know not only the mathematics of optical design but also, and more importantly, to grasp and understand the optics in three-dimensional space. Having a clear image of the problem to solve is the first step towards solving it. Students must therefore not only know the equation of the law of refraction but also understand how its main parameters interact. This should be a major goal of the teaching course. Optical graphic methods are a valuable tool in this regard, since they combine the advantage of visual information with the accuracy of computer calculation.
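
    The law of refraction the authors mention is a natural candidate for such a graphic treatment; even a few lines of code let students probe how the parameters interact, including the onset of total internal reflection (the function below is illustrative):

    ```python
    import math

    def refract(theta1_deg: float, n1: float, n2: float):
        """Snell's law n1*sin(t1) = n2*sin(t2); None means total internal reflection."""
        s = n1 * math.sin(math.radians(theta1_deg)) / n2
        if abs(s) > 1.0:
            return None
        return math.degrees(math.asin(s))

    print(refract(30.0, 1.0, 1.5))   # air -> glass: about 19.5 degrees
    print(refract(45.0, 1.5, 1.0))   # glass -> air at 45 degrees: None (TIR)
    ```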

  5. The readability of American Academy of Pediatrics patient education brochures.

    Science.gov (United States)

    Freda, Margaret Comerford

    2005-01-01

    The purpose of this study was to evaluate the readability of American Academy of Pediatrics (AAP) patient education brochures. Seventy-four brochures were analyzed using two readability formulas. Mean readability for all 74 brochures was grade 7.94 using the Flesch-Kincaid formula, and grade 10.1 with the SMOG formula (P = .001). Using the SMOG formula, no brochures were of acceptably low readability. Some AAP patient education brochures have acceptably low levels of readability, but at least half are written at higher than acceptable readability levels for the general public. This study also demonstrated statistically significant variability between the two different readability formulas; had only the SMOG formula been used, all of the brochures would have had unacceptably high readability levels. Readability is an essential concept for patient education materials. Professional associations that develop and market patient education materials should test for readability and publish those readability levels on each piece of patient education material so health care providers will know if the materials are appropriate for their patients.

  6. Computed tomography shielding methods: a literature review.

    Science.gov (United States)

    Curtis, Jessica Ryann

    2010-01-01

    To investigate available shielding methods in an effort to further awareness and understanding of existing preventive measures related to patient exposure in computed tomography (CT) scanning. Searches were conducted to locate literature discussing the effectiveness of commercially available shields. Literature containing information regarding breast, gonad, eye and thyroid shielding was identified. Because of rapidly advancing technology, the selection of articles was limited to those published within the past 5 years. The selected studies were examined using the following topics as guidelines: the effectiveness of the shield (percentage of dose reduction), the shield's effect on image quality, arguments for or against its use (including practicality) and overall recommendation for its use in clinical practice. Only a limited number of studies have been performed on the use of shields for the eyes, thyroid and gonads, but the evidence shows an overall benefit to their use. Breast shielding has been the most studied shielding method, with consistent agreement throughout the literature on its effectiveness at reducing radiation dose. The effect of shielding on image quality was not remarkable in a majority of studies. Although it is noted that more studies need to be conducted regarding the impact on image quality, the currently published literature stresses the importance of shielding in reducing dose. Commercially available shields for the breast, thyroid, eyes and gonads should be implemented in clinical practice. Further research is needed to ascertain the prevalence of shielding in the clinical setting.

  7. Computational methods in calculating superconducting current problems

    Science.gov (United States)

    Brown, David John, II

    Various computational problems in treating superconducting currents are examined. First, field inversion in spatial Fourier transform space is reviewed to obtain both one-dimensional transport currents flowing down a long thin tape and a localized two-dimensional current. The problems associated with spatial high-frequency noise, created by finite resolution and experimental equipment, are presented and resolved with a smooth Gaussian cutoff in spatial frequency space. Convergence of the Green's functions for the one-dimensional transport current densities is discussed, and particular attention is devoted to the negative effects of performing discrete Fourier transforms alone on fields asymptotically dropping like 1/r. Results of imaging simulated current densities are favorably compared to the original distributions after the resulting magnetic fields undergo the imaging procedure. The behavior of high-frequency spatial noise and of fields with a 1/r asymptote in the imaging procedure is analyzed in our simulations and compared to the treatment of these phenomena in the published literature. Next, we examine the calculation of Mathieu and spheroidal wave functions, solutions to the wave equation in elliptical cylindrical and in oblate and prolate spheroidal coordinates, respectively. These functions are also solutions to Schrödinger's equations with certain potential wells and are useful in solving time-varying superconducting problems. The Mathieu functions are Fourier expanded, and the spheroidal functions expanded in associated Legendre polynomials, to convert the defining differential equations into recursion relations. The infinite set of linear recursion equations is converted to an infinite matrix multiplied by a vector of expansion coefficients, thus becoming an eigenvalue problem. The eigenvalue problem is solved with root solvers, and the eigenvector problem is solved using a Jacobi-type iteration method, after preconditioning the…
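
    The smooth Gaussian cutoff described above is a one-line filter in Fourier space. A sketch on synthetic 1D data (the cutoff scale is an assumed parameter):

    ```python
    import numpy as np

    n, dx = 512, 0.01
    x = np.arange(n) * dx
    field = np.exp(-((x - 2.56) / 0.3) ** 2)              # smooth "measured" field
    noisy = field + 0.05 * np.random.default_rng(2).normal(size=n)

    k = np.fft.fftfreq(n, d=dx)                            # spatial frequencies
    k_cut = 5.0                                            # assumed cutoff scale
    filtered = np.fft.ifft(np.fft.fft(noisy) * np.exp(-(k / k_cut) ** 2)).real

    print(np.abs(filtered - field).max())                  # residual noise amplitude
    ```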

  8. Computational Studies of Protein Hydration Methods

    Science.gov (United States)

    Morozenko, Aleksandr

    It is widely appreciated that water plays a vital role in proteins' functions. Long-range proton transfer inside proteins is usually carried out by the Grotthuss mechanism and requires a chain of hydrogen bonds composed of internal water molecules and amino acid residues of the protein. In other cases, water molecules can facilitate enzymes' catalytic reactions by becoming a temporary proton donor/acceptor. Yet a reliable way of predicting water in the protein interior is still not available to the biophysics community. This thesis presents computational studies performed to gain insight into the problem of fast and accurate prediction of potential water sites inside internal cavities of a protein. Specifically, we focus on the task of attaining correspondence between results obtained from computational experiments and experimental data available from X-ray structures. An overview of existing methods of predicting water molecules in the interior of a protein, along with a discussion of the trustworthiness of these predictions, is a second major subject of this thesis. A description of the differences of water molecules in various media, particularly gas, liquid and the protein interior, and theoretical aspects of designing an adequate model of water for the protein environment are discussed in chapters 3 and 4. In chapter 5, we discuss recently developed methods of placing water molecules into internal cavities of a protein. We propose a new methodology based on the principle of docking water molecules to a protein body, which achieves a higher degree of match with experimental data reported in protein crystal structures than other techniques available in the world of biophysical software. The new methodology is tested on a set of high-resolution crystal structures of oligopeptide-binding protein (OppA) containing a large number of resolved internal water molecules, and applied to bovine heart cytochrome c oxidase in the fully…

  9. Informed consent and the readability of the written consent form.

    Science.gov (United States)

    Sivanadarajah, N; El-Daly, I; Mamarelis, G; Sohail, M Z; Bates, P

    2017-11-01

    Introduction The aim of this study was to objectively ascertain the level of readability of standardised consent forms for orthopaedic procedures. Methods Standardised consent forms (both in summary and detailed formats) endorsed by the British Orthopaedic Association (BOA) were retrieved from orthoconsent.com and assessed for readability. This involved using an online tool to calculate the validated Flesch reading ease score (FRES). This was compared with the FRES for the National Health Service (NHS) Consent Form 1. Data were analysed and interpreted according to the FRES grading table. Results The FRES for Consent Form 1 was 55.6, relating to the literacy expected of an A level student. The mean FRES for the BOA summary consent forms (n=27) was 63.6 (95% confidence interval [CI]: 61.2-66.0) while for the detailed consent forms (n=32), it was 68.9 (95% CI: 67.7-70.0). All BOA detailed forms scored >60, correlating to the literacy expected of a 13-15-year-old. The detailed forms had a higher FRES than the summary forms (p<0.001). Conclusions This study demonstrates that the BOA endorsed standardised consent forms are much easier to read and understand than the NHS Consent Form 1, with the detailed BOA forms being the easiest to read. Despite this, owing to varying literacy levels, a significant proportion of patients may struggle to give informed consent based on the written information provided to them.

  10. Readability Levels of Health-Based Websites: From Content to Comprehension

    Science.gov (United States)

    Schutten, Mary; McFarland, Allison

    2009-01-01

    Three of the national health education standards include decision-making, accessing information and analyzing influences. WebQuests are a popular inquiry-oriented method used by secondary teachers to help students achieve these content standards. While WebQuests support higher level thinking skills, the readability level of the information on the…

  11. Reconstructing Readability: Recent Developments and Recommendations in the Analysis of Text Difficulty

    Science.gov (United States)

    Benjamin, Rebekah George

    2012-01-01

    Largely due to technological advances, methods for analyzing readability have increased significantly in recent years. While past researchers designed hundreds of formulas to estimate the difficulty of texts for readers, controversy has surrounded their use for decades, with criticism stemming largely from their application in creating new texts…

  12. Aligning Theme and Information Structure to Improve the Readability of Technical Writing

    Science.gov (United States)

    Moore, N. A. J.

    2006-01-01

    The readability of technical writing, and technical manuals in particular, especially for second language readers, can be noticeably improved by pairing Theme with Given and Rheme with New. This allows for faster processing of text and easier access to the "method of development" of the text. Typical Theme-Rheme patterns are described, and the…

  13. Beyond Readability: Investigating Coherence of Clinical Text for Consumers

    Science.gov (United States)

    Hetzel, Scott; Dalrymple, Prudence; Keselman, Alla

    2011-01-01

    Background A basic tenet of consumer health informatics is that understandable health resources empower the public. Text comprehension holds great promise for helping to characterize consumer problems in understanding health texts. The need for efficient ways to assess consumer-oriented health texts and the availability of computationally supported tools led us to explore the effect of various text characteristics on readers' understanding of health texts, as well as to develop novel approaches to assessing these characteristics. Objective The goal of this study was to compare the impact of two different approaches to enhancing readability, and three interventions, on individuals' comprehension of short, complex passages of health text. Methods Participants were 80 university staff, faculty, or students. Each participant was asked to "retell" the content of two health texts: one a clinical trial in the domain of diabetes mellitus, and the other typical Visit Notes. These texts were transformed for the intervention arms of the study. Two interventions provided terminology support via (1) standard dictionary or (2) contextualized vocabulary definitions. The third intervention provided coherence improvement. We assessed participants' comprehension of the clinical texts through propositional analysis, an open-ended questionnaire, and analysis of the number of errors made. Results For the clinical trial text, the effect of text condition was not significant in any of the comparisons, suggesting no differences in recall, despite the varying levels of support (P = .84). For the Visit Note, however, the difference in the median total propositions recalled between the Coherent and the (Original + Dictionary) conditions was significant (P = .04). This suggests that participants in the Coherent condition recalled more of the original Visit Notes content than did participants in the Original and the Dictionary conditions combined. However, no difference was seen…

  14. Computing and physical methods to calculate Pu

    International Nuclear Information System (INIS)

    Mohamed, Ashraf Elsayed Mohamed

    2013-01-01

    The main limitations due to enhancement of the plutonium content are related to the coolant void effect: as the spectrum becomes faster, the neutron flux in the thermal region tends towards zero and is concentrated in the region from 10 keV to 1 MeV. Thus, all captures by 240Pu and 242Pu in the thermal and epithermal resonances disappear, and the 240Pu and 242Pu contributions to the void effect become positive. The higher the Pu content and the poorer the Pu quality, the larger the void effect. Regarding core control in nominal or transient conditions, Pu enrichment leads to a decrease in βeff and in the efficiency of soluble boron and control rods. Also, the Doppler effect tends to decrease when Pu replaces U, so that in case of transients the core could diverge again if the control is not effective enough. As for the voiding effect, plutonium degradation and 240Pu and 242Pu accumulation after multiple recycling lead to spectrum hardening and to a decrease in control. One solution would be to use enriched boron in soluble boron and shutdown rods. In this paper, I discuss and show advanced computing and physical methods to calculate Pu inside nuclear reactors and gloveboxes, and the different solutions to be used to overcome the difficulties that affect safety parameters and reactor performance, and I analyse the consequences of plutonium management on the whole fuel cycle, such as raw material savings and the fraction of nuclear electric power involved in Pu management. This is done through two types of scenario: one involving a low fraction of the nuclear park dedicated to plutonium management, the other involving dilution of the plutonium across the whole nuclear park. (author)

  15. Readability Levels of Dental Patient Education Brochures.

    Science.gov (United States)

    Boles, Catherine D; Liu, Ying; November-Rider, Debra

    2016-02-01

    The objective of this study was to evaluate dental patient education brochures produced since 2000 to determine if there has been any change in Flesch-Kincaid grade level readability. A convenience sample of 36 brochures was obtained for analysis of the readability of patient education material on multiple dental topics. Readability was measured using the Flesch-Kincaid Grade Level through Microsoft Word. Pearson's correlation was used to describe the relationship among the factors of interest. Backward model selection in a multiple linear regression model was used to investigate the relationship between Flesch-Kincaid Grade Level and a set of predictors included in this study. The convenience sample (n=36) of dental education brochures produced from 2000 to 2014 showed a mean Flesch-Kincaid reading grade level of 9.15. Weak to moderate correlations existed between word count and grade level (r=0.40) and character count and grade level (r=0.46); strong correlations were found between grade level and average words per sentence (r=0.70), average characters per word (r=0.85) and Flesch Reading Ease (r=-0.98). Only 1 brochure out of the sample met the recommended sixth grade reading level (Flesch-Kincaid Grade Level 5.7). Overall, the Flesch-Kincaid Grade Level of all brochures was significantly higher than the recommended sixth grade reading level, and the majority of the brochures analyzed are still testing above that level. Copyright © 2016 The American Dental Hygienists' Association.

  16. Computational methods in sequence and structure prediction

    Science.gov (United States)

    Lang, Caiyi

    This dissertation is organized into two parts. In the first part, we discuss three computational methods for cis-regulatory element recognition in three different gene regulatory networks, as follows: (a) Using a comprehensive "Phylogenetic Footprinting Comparison" method, we investigate the promoter sequence structures of three enzymes (PAL, CHS and DFR) that catalyze sequential steps in the pathway from phenylalanine to anthocyanins in plants. Our results show that there exists a putative cis-regulatory element "AC(C/G)TAC(C)" upstream of these enzyme genes. We propose this cis-regulatory element to be responsible for the genetic regulation of these three enzymes; this element might also be the binding site for the MYB-class transcription factor PAP1. (b) We investigate the role of the Arabidopsis gene glutamate receptor 1.1 (AtGLR1.1) in C and N metabolism by utilizing microarray data obtained from AtGLR1.1-deficient lines (antiAtGLR1.1). We focus our investigation on the putatively co-regulated transcript profile of 876 genes collected in antiAtGLR1.1 lines. By (a) scanning the occurrence of several groups of known abscisic acid (ABA)-related cis-regulatory elements in the upstream regions of the 876 Arabidopsis genes, and (b) exhaustively scanning all possible 6-10 bp motif occurrences in the upstream regions of the same set of genes, we are able to make a quantitative estimate of the enrichment level of each cis-regulatory element candidate. We conclude that one specific cis-regulatory element group, the "ABRE" elements, is statistically highly enriched within the 876-gene group as compared to its occurrence within the genome. (c) We introduce a new general-purpose algorithm, called "fuzzy REDUCE1", which we have developed recently for automated cis-regulatory element identification. In the second part, we discuss our newly devised protein design framework. With this framework we have developed…
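
    The motif scan in part (b) is conceptually simple; reading the element "AC(C/G)TAC(C)" as a degenerate middle base with an optional trailing C (an assumption about the notation), a count of its occurrences in upstream sequences is a one-regex job (the sequences below are made up):

    ```python
    import re

    motif = re.compile(r"AC[CG]TACC?")   # assumes "(C)" denotes an optional trailing C
    upstream = {
        "PAL": "TTACGTACCGGAAACCGTACG",
        "CHS": "GGGACCTACCTTTACGTAC",
        "DFR": "ACGTACGTACGTAAACCTA",
    }
    for gene, seq in upstream.items():
        print(gene, len(motif.findall(seq)))   # non-overlapping occurrence counts
    ```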

  17. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  18. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  19. Advanced Computational Methods in Bio-Mechanics.

    Science.gov (United States)

    Al Qahtani, Waleed M S; El-Anwar, Mohamed I

    2018-04-15

    A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with fewer traumas, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have a similar impact on surgery to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research efforts funded by billions of dollars are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.

  20. Augmented reality with image registration, vision correction and sunlight readability via liquid crystal devices

    OpenAIRE

    Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin

    2017-01-01

    Augmented reality (AR), which uses computer-aided projected information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges regarding the optical system in an AR system: registration, vision correction, and readability under strong ambient light. Here, we solve the three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated in optical-see-...

  1. Readability of patient information pamphlets in urogynecology.

    Science.gov (United States)

    Reagan, Krista M L; O'Sullivan, David M; Harvey, David P; Lasala, Christine A

    2015-01-01

    The purpose of this study was to determine the reading level of frequently used patient information pamphlets and documents in the field of urogynecology. Urogynecology pamphlets were identified from a variety of sources. Readability was determined using 4 different accepted formulas: the Flesch-Kincaid Grade Level, the Simple Measure of Gobbledygook (SMOG) Index, the Coleman-Liau Index, and the Gunning Fog Index. The scores were calculated using an online calculator (http://www.readability-score.com). Descriptive statistics were used for analysis. The average of the 4 scores was calculated for each pamphlet. Subsequently, z-scores were used to standardize the averages between the reading scales. Of the 40 documents reviewed, only a single pamphlet met the National Institutes of Health-recommended reading level. This document was developed by the American Urological Association and was specifically designated as a "Low-Literacy Brochure." The remainder of the patient education pamphlets, from both industry-sponsored and academic-sponsored sources, consistently rated above the recommended reading level for maximum comprehension. The majority of patient education pamphlets, from both industry-sponsored and academic-sponsored sources, are above the reading level recommended by the National Institutes of Health for maximum patient comprehension. Future work should be done to improve the educational resources available to patients by simplifying the verbiage in these documents.
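
    The standardization step described above can be sketched as follows: average the four formula scores for each pamphlet, then convert the averages to z-scores so pamphlets can be compared on a common scale. The values below are made-up placeholders, not data from the study.

        import numpy as np

        # Rows: pamphlets; columns: Flesch-Kincaid, SMOG, Coleman-Liau, Gunning Fog.
        scores = np.array([
            [ 7.2,  9.1,  8.4,  9.8],
            [11.5, 12.3, 11.0, 13.1],
            [ 5.7,  7.9,  6.8,  8.2],
        ])

        averages = scores.mean(axis=1)  # mean grade level per pamphlet
        z_scores = (averages - averages.mean()) / averages.std()
        print(averages, z_scores)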

  2. Readability and Understandability of Online Vocal Cord Paralysis Materials.

    Science.gov (United States)

    Balakrishnan, Vini; Chandy, Zachariah; Hseih, Amy; Bui, Thanh-Lan; Verma, Sunil P

    2016-03-01

    Patients use several online resources to learn about vocal cord paralysis (VCP). The objective of this study was to assess the readability and understandability of online VCP patient education materials (PEMs), with readability assessments and the Patient Education Materials Assessment Tool (PEMAT), respectively. The relationship between readability and understandability was then analyzed. Descriptive and correlational design. Online PEMs were identified by performing a Google search with the term "vocal cord paralysis." After scientific webpages, news articles, and information for medical professionals were excluded, 29 articles from the first 50 search results were considered. Readability analysis was performed with 6 formulas. Four individuals with different educational backgrounds conducted understandability analysis with the PEMAT. Fleiss's kappa interrater reliability analysis determined consistency among raters. Correlation between readability and understandability was determined with Pearson's correlation test. The reading level of the reviewed articles ranged from grades 9 to 17. Understandability ranged from 29% to 82%. Correlation analysis demonstrated a strong negative correlation between materials' readability and understandability (r = -0.462, P < .05). Online PEMs pertaining to VCP are written above the recommended reading levels. Overall, materials written at lower grade levels are more understandable. However, articles of identical grade levels had varying levels of understandability. The PEMAT may provide a more critical evaluation of the quality of a PEM when compared with readability formulas. Both readability and understandability should be used to evaluate PEMs. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.

  3. Readability Levels of the 1975 Third Grade Macmillan Basal Readers.

    Science.gov (United States)

    McKinney, Ernestine Williams

    1983-01-01

    Analysis of third-grade books in the New Macmillan Reading Program reveals that the books exceeded the publisher's designation of readability and did not progress in difficulty from easy to more difficult. Findings suggest the need for more complete and reliable information from publishers concerning textbook readability. (FL)

  4. Measuring the Readability of Elementary Algebra Using the Cloze Technique.

    Science.gov (United States)

    Kulm, Gerald

    The relationship to readability of ten variables characterizing structural properties of mathematical prose was investigated in elementary algebra textbooks. Readability was measured by algebra students' responses to two forms of cloze tests. Linear and curvilinear correlations were calculated between each structural variable and the cloze test.…

  5. Readability of Brochures Produced by State of Florida.

    Science.gov (United States)

    Christ, William G.; Pharr, Paula

    1980-01-01

    A study of the readability of governmental pamphlets produced by the State of Florida, based on the use of the Flesch Reading Ease Formula and the Dale-Chall Formula, suggests that if a seventh or eighth grade readability level is considered an appropriate standard for public information brochures, the brochures tested may be too complex…

  6. readability of comprehension passages in junior high school (jhs)

    African Journals Online (AJOL)

    CHARLES

    Key Words: readability formulas, comprehension passages, Junior High School.

  7. Textbooks in Management, Marketing and Finance: An Analysis of Readability.

    Science.gov (United States)

    Gallagher, Daniel J.; Thompson, G. Rodney

    1982-01-01

    Examines the readability of texts in basic junior level college courses in the fields of management, marketing, and finance. The readability model is described, along with its application and results. Specific texts and how they fared are listed in accompanying tables. (CT)

  8. Readability as a Factor in Magazine Ad Copy Recall.

    Science.gov (United States)

    Wesson, David A.

    1989-01-01

    Examines the relationship between advertising copy readability and advertising effectiveness. Finds that recall is improved when the copy style is either fairly easy or fairly hard to read. Suggests the value of considering copy readability as a potential contributor, though a minor one, to the success of magazine advertising. (RS)

  9. New or improved computational methods and advanced reactor design

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Takeda, Toshikazu; Ushio, Tadashi

    1997-01-01

    Nuclear computational methods have been studied continuously as a fundamental technology supporting nuclear development. At present, research on computational methods based on new theory, and on calculation methods previously thought too difficult to put into practice, continues actively, taking advantage of the dramatic improvement in computer performance. In Japan, many light water reactors are now in operation, new computational methods are being introduced for nuclear design, and much effort is concentrated on further improving economics and safety. In this paper, some new research results on nuclear computational methods and their application to reactor nuclear design are described, to introduce recent trends in reactor design: 1) advancement of computational methods, 2) reactor core design and management of light water reactors, and 3) nuclear design of fast reactors. (G.K.)

  10. A computer method for spectral classification

    International Nuclear Information System (INIS)

    Appenzeller, I.; Zekl, H.

    1978-01-01

    The authors describe the start of an attempt to improve the accuracy of spectroscopic parallaxes by evaluating spectroscopic temperature and luminosity criteria such as those of the MK classification. Spectrograms were analyzed automatically by means of a suitable computer program. (Auth.)

  11. Computational structural biology: methods and applications

    National Research Council Canada - National Science Library

    Schwede, Torsten; Peitsch, Manuel Claude

    2008-01-01

    ... sequencing reinforced the observation that structural information is needed to understand the detailed function and mechanism of biological molecules such as enzyme reactions and molecular recognition events. Furthermore, structures are obviously key to the design of molecules with new or improved functions. In this context, computational structural biology...

  12. Development of SMOG-Cro readability formula for healthcare communication and patient education.

    Science.gov (United States)

    Brangan, Sanja

    2015-03-01

    Effective communication shows a positive impact on patient satisfaction, compliance and medical outcomes, at the same time reducing healthcare costs. Written information for patients needs to correspond to the health literacy levels of the intended audiences. Readability formulas correlate well with reading and comprehension tests but are considered an easier and quicker method to estimate a text's difficulty. The SMOG readability formula designed for English needs to be modified if used for texts in other languages. The aim of this study was to develop a readability formula based on SMOG that could be used to estimate the text difficulty of written materials for patients in Croatian. Contrastive analysis of English and Croatian covering a corpus of almost 100,000 running words showed clear linguistic differences in the number of polysyllabic words. The new formula, named SMOG-Cro, is presented as the equation SMOG-Cro = 2 + √(4 + syllables), with the score showing the number of years of education a person needs to be able to understand a piece of writing. The presented methodology could help in the development of readability formulas for other languages. We hope the results of this study are soon put into practice for more effective healthcare communication and patient education, and for the development of a health literacy assessment tool in Croatian.
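
    For comparison, McLaughlin's original SMOG grade for English is a fixed function of the polysyllabic word count normalized to 30 sentences, and the SMOG-Cro equation above has the same square-root form. The sketch below implements both; the SMOG-Cro operand follows the equation as printed in the abstract, and the exact definition of the count should be checked against the original paper.

        import math

        def smog(polysyllables, sentences):
            # McLaughlin's SMOG grade for English text.
            return 1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291

        def smog_cro(polysyllable_count):
            # SMOG-Cro as printed above: 2 + sqrt(4 + count); operand assumed.
            return 2 + math.sqrt(4 + polysyllable_count)

        print(smog(30, 30), smog_cro(60))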

  13. Evaluation of Quality and Readability of Health Information Websites Identified through India's Major Search Engines.

    Science.gov (United States)

    Raj, S; Sharma, V L; Singh, A J; Goel, S

    2016-01-01

    Background. The available health information on websites should be reliable and accurate in order to make informed decisions by community. This study was done to assess the quality and readability of health information websites on World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words "Health" and "Information" were used on search engines "Google" and "Yahoo." Out of 50 websites (25 from each search engines), after exclusion, 32 websites were evaluated. LIDA tool was used to assess the quality whereas the readability was assessed using Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of websites (n = 13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n = 16) of websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on LIDA score. Only five had readability scores at recommended sixth-grade level. Conclusion. Most health information websites had average quality especially in terms of usability and reliability and were written at high readability levels. Efforts are needed to develop the health information websites which can help general population in informed decision making.

  14. Soft computing methods for geoidal height transformation

    Science.gov (United States)

    Akyilmaz, O.; Özlüdemir, M. T.; Ayan, T.; Çelik, R. N.

    2009-07-01

    Soft computing techniques, such as fuzzy logic and artificial neural network (ANN) approaches, have enabled researchers to create precise models for use in many scientific and engineering applications. Applications that can be employed in geodetic studies include the estimation of earth rotation parameters and the determination of mean sea level changes. Another important field of geodesy in which these computing techniques can be applied is geoidal height transformation. We report here our use of a conventional polynomial model, the Adaptive Network-based Fuzzy (or in some publications, Adaptive Neuro-Fuzzy) Inference System (ANFIS), an ANN and a modified ANN approach to approximate geoid heights. These approximation models have been tested on a number of test points. The results obtained through the transformation processes from ellipsoidal heights into local levelling heights have also been compared.
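
    Of the models compared above, the conventional polynomial surface is the simplest to sketch: fit N(φ, λ) = a₀ + a₁φ + a₂λ + a₃φλ + a₄φ² + a₅λ² to known geoid undulations at reference points by least squares, then evaluate it at new points. The coordinates and heights below are synthetic placeholders, not data from the study.

        import numpy as np

        rng = np.random.default_rng(2)
        lat = rng.uniform(40.0, 42.0, size=50)   # synthetic reference points
        lon = rng.uniform(28.0, 30.0, size=50)
        N_obs = 36.0 + 0.5 * lat - 0.2 * lon + rng.normal(0, 0.02, size=50)

        # Design matrix for a second-order polynomial surface.
        A = np.column_stack([np.ones_like(lat), lat, lon,
                             lat * lon, lat**2, lon**2])
        coef, *_ = np.linalg.lstsq(A, N_obs, rcond=None)

        rms = np.sqrt(np.mean((A @ coef - N_obs) ** 2))
        print(coef, rms)  # fitted coefficients and residual RMS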

  15. Soft Computing Methods in Design of Superalloys

    Science.gov (United States)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.

  16. Statistical methods and computing for big data

    Science.gov (United States)

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
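
    The "online updating for stream data" idea can be illustrated with a logistic regression whose coefficients are updated one observation at a time, so no record is stored after it is processed. This is a generic stochastic-gradient sketch on simulated data, not the authors' estimator or their airline-delay case study.

        import numpy as np

        rng = np.random.default_rng(0)
        w_true = np.array([-1.0, 2.0, -0.5])  # true model: [bias, x1, x2]
        w = np.zeros(3)                       # streaming estimate
        lr = 0.1

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        for _ in range(100_000):                # each record seen exactly once
            x = np.array([1.0, *rng.normal(size=2)])
            y = float(rng.random() < sigmoid(x @ w_true))
            w += lr * (y - sigmoid(x @ w)) * x  # single online gradient step
            lr *= 0.99999                       # slowly decaying step size

        print(w)  # drifts toward w_true as the stream is consumed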

  17. Tensor network method for reversible classical computation

    Science.gov (United States)

    Yang, Zhi-Cheng; Kourtis, Stefanos; Chamon, Claudio; Mucciolo, Eduardo R.; Ruckenstein, Andrei E.

    2018-03-01

    We develop a tensor network technique that can solve universal reversible classical computational problems, formulated as vertex models on a square lattice [Nat. Commun. 8, 15303 (2017), 10.1038/ncomms15303]. By encoding the truth table of each vertex constraint in a tensor, the total number of solutions compatible with partial inputs and outputs at the boundary can be represented as the full contraction of a tensor network. We introduce an iterative compression-decimation (ICD) scheme that performs this contraction efficiently. The ICD algorithm first propagates local constraints to longer ranges via repeated contraction-decomposition sweeps over all lattice bonds, thus achieving compression on a given length scale. It then decimates the lattice via coarse-graining tensor contractions. Repeated iterations of these two steps gradually collapse the tensor network and ultimately yield the exact tensor trace for large systems, without the need for manual control of tensor dimensions. Our protocol allows us to obtain the exact number of solutions for computations where a naive enumeration would take astronomically long times.
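
    A toy version of the counting idea: encode each vertex constraint's truth table as a 0/1 tensor, and the full contraction of the network counts the assignments consistent with the constraints. The two-tensor chain below is only meant to show the encoding, not the iterative compression-decimation scheme itself.

        import numpy as np

        # NOT-gate constraint: T[a, b] = 1 iff b == NOT a.
        T = np.array([[0, 1],
                      [1, 0]])

        # Chain x1 -> x2 -> x3 with x2 = NOT x1, x3 = NOT x2.
        # Contracting over all indices counts satisfying assignments.
        print(np.einsum("ab,bc->", T, T))  # 2 (x1 is free)

        # Fixing the boundary input x1 = 0 leaves a single solution.
        e0 = np.array([1, 0])
        print(np.einsum("a,ab,bc->", e0, T, T))  # 1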

  18. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  19. Computational methods for two-phase flow and particle transport

    CERN Document Server

    Lee, Wen Ho

    2013-01-01

    This book describes mathematical formulations and computational methods for solving two-phase flow problems with a computer code that calculates thermal hydraulic problems related to light water and fast breeder reactors. The physical model also handles the particle and gas flow problems that arise from coal gasification and fluidized beds. The second part of this book deals with the computational methods for particle transport.

  1. Reference depth for geostrophic computation - A new method

    Digital Repository Service at National Institute of Oceanography (India)

    Varkey, M.J.; Sastry, J.S.

    Various methods are available for the determination of reference depth for geostrophic computation. A new method based on the vertical profiles of mean and variance of the differences of mean specific volume anomaly (delta x 10) for different layers...

  2. An Augmented Fast Marching Method for Computing Skeletons and Centerlines

    NARCIS (Netherlands)

    Telea, Alexandru; Wijk, Jarke J. van

    2002-01-01

    We present a simple and robust method for computing skeletons for arbitrary planar objects and centerlines for 3D objects. We augment the Fast Marching Method (FMM) widely used in level set applications by computing the parameterized boundary location every pixel came from during the boundary propagation.
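
    The "where did this pixel's boundary information come from" bookkeeping can be mimicked with SciPy's exact Euclidean distance transform, which optionally returns each pixel's nearest boundary (feature) point. Marking pixels whose neighbors originate from well-separated boundary points yields a rough centerline; this is a simplification of the augmented FMM, not the algorithm itself.

        import numpy as np
        from scipy import ndimage

        img = np.zeros((40, 80), dtype=bool)  # binary object: filled rectangle
        img[10:30, 10:70] = True

        # Distance to the background, plus each pixel's nearest background pixel.
        dist, (fi, fj) = ndimage.distance_transform_edt(img, return_indices=True)

        skeleton = np.zeros_like(img)
        feats = np.stack([fi, fj]).astype(float)
        for axis in (1, 2):  # compare feature points of down and right neighbors
            nbr = np.roll(feats, -1, axis=axis)
            sep = np.hypot(*(feats - nbr))
            skeleton |= img & (sep > 1.5 * dist)  # far-apart origins => centerline

        print(np.argwhere(skeleton)[:5])  # sample of detected centerline pixels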

  3. Classical versus Computer Algebra Methods in Elementary Geometry

    Science.gov (United States)

    Pech, Pavel

    2005-01-01

    Computer algebra methods based on results of commutative algebra like Groebner bases of ideals and elimination of variables make it possible to solve complex, elementary and non-elementary problems of geometry, which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…

  4. Methods for teaching geometric modelling and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Rotkov, S.I.; Faitel'son, Yu. Ts.

    1992-05-01

    This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.

  5. Computational Methods for Conformational Sampling of Biomolecules

    DEFF Research Database (Denmark)

    Bottaro, Sandro

    Proteins play a fundamental role in virtually every process within living organisms. For example, some proteins act as enzymes, catalyzing a wide range of reactions necessary for life, others mediate the cell's interaction with the surrounding environment and still others have regulatory functions. First, we have developed a mathematical approach to a classic geometrical problem in protein simulations, and demonstrated its superiority compared to existing approaches. Secondly, we have constructed a more accurate implicit model of the aqueous environment, which is of fundamental importance in protein chemistry. This model is computationally much faster than models where water molecules are represented explicitly. Finally, in collaboration with the group of structural bioinformatics at the Department of Biology (KU), we have applied these techniques in the context of modeling of protein structure and flexibility from low…

  6. Computational Method for Atomistic-Continuum Homogenization

    National Research Council Canada - National Science Library

    Chung, Peter

    2002-01-01

    The homogenization method is used as a framework for developing a multiscale system of equations involving atoms at zero temperature at the small scale and continuum mechanics at the very large scale...

  7. The readability of psychosocial wellness patient resources: improving surgical outcomes.

    Science.gov (United States)

    Kugar, Meredith A; Cohen, Adam C; Wooden, William; Tholpady, Sunil S; Chu, Michael W

    2017-10-01

    Patient education is increasingly accessed with online resources and is essential for patient satisfaction and clinical outcomes. The average American adult reads at a seventh grade level, and the National Institutes of Health (NIH) and the American Medical Association (AMA) recommend that information be written at a sixth-grade reading level. Health literacy plays an important role in the disease course and outcomes of all patients, including those with depression and likely other psychiatric disorders, although this is an area in need of further study. The purpose of this study was to collect and analyze written, online mental health resources on the Veterans Health Administration (VA) website and other websites, using readability assessment instruments. An internet search was performed to identify written patient education information regarding mental health from the VA (the VA Mental Health Website) and top-rated psychiatric hospitals. Seven mental health topics were included in the analysis: generalized anxiety disorder, bipolar disorder, major depressive disorder, posttraumatic stress disorder, schizophrenia, substance abuse, and suicide. Readability analyses were performed using the Gunning Fog Index, the Flesch-Kincaid Grade Level, the Coleman-Liau Index, the SMOG Readability Formula, and the Automated Readability Index. These scores were then combined into a Readability Consensus score. A two-tailed t-test was used to compare the mean values, and statistical significance was set at P < .05. The VA website had a lower readability consensus than six of the top psychiatric hospitals (P < .05). The mean readability consensus for mental health information on all websites analyzed was 9.52. Online resources for mental health disorders are more complex than recommended by the NIH and AMA. Efforts to improve readability of mental health and psychosocial wellness resources could benefit patient understanding and outcomes, especially in patients with lower literacy. Surgical outcomes are correlated with patient mental health…

  8. Instrument design optimization with computational methods

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Michael H. [Old Dominion Univ., Norfolk, VA (United States)

    2017-08-01

    Using Finite Element Analysis to approximate the solution of differential equations, two different instruments in experimental Hall C at the Thomas Jefferson National Accelerator Facility are analyzed. The time dependence of density fluctuations from the liquid hydrogen (LH2) target used in the Qweak experiment (2011-2012) is studied with Computational Fluid Dynamics (CFD) and the simulation results compared to data from the experiment. The 2.5 kW liquid hydrogen target was the highest power LH2 target in the world and the first to be designed with CFD at Jefferson Lab. The first complete magnetic field simulation of the Super High Momentum Spectrometer (SHMS) is presented with a focus on primary electron beam deflection downstream of the target. The SHMS consists of a superconducting horizontal bending magnet (HB) and three superconducting quadrupole magnets. The HB allows particles scattered at an angle of 5.5 deg to the beam line to be steered into the quadrupole magnets which make up the optics of the spectrometer. Without mitigation, remnant fields from the SHMS may steer the unscattered beam outside of the acceptable envelope on the beam dump and limit beam operations at small scattering angles. A solution is proposed using optimal placement of a minimal amount of shielding iron around the beam line.

  9. Computer methods in physics 250 problems with guided solutions

    CERN Document Server

    Landau, Rubin H

    2018-01-01

    Our future scientists and professionals must be conversant in computational techniques. In order to facilitate integration of computer methods into existing physics courses, this textbook offers a large number of worked examples and problems with fully guided solutions in Python as well as other languages (Mathematica, Java, C, Fortran, and Maple). It’s also intended as a self-study guide for learning how to use computer methods in physics. The authors include an introductory chapter on numerical tools and indication of computational and physics difficulty level for each problem.

  10. Electromagnetic computation methods for lightning surge protection studies

    CERN Document Server

    Baba, Yoshihiro

    2016-01-01

    This book is the first to consolidate current research and to examine the theories of electromagnetic computation methods in relation to lightning surge protection. The authors introduce and compare existing electromagnetic computation methods such as the method of moments (MOM), the partial element equivalent circuit (PEEC), the finite element method (FEM), the transmission-line modeling (TLM) method, and the finite-difference time-domain (FDTD) method. The application of FDTD method to lightning protection studies is a topic that has matured through many practical applications in the past decade, and the authors explain the derivation of Maxwell's equations required by the FDTD, and modeling of various electrical components needed in computing lightning electromagnetic fields and surges with the FDTD method. The book describes the application of FDTD method to current and emerging problems of lightning surge protection of continuously more complex installations, particularly in critical infrastructures of e...

  11. Three-dimensional protein structure prediction: Methods and computational strategies.

    Science.gov (United States)

    Dorn, Márcio; E Silva, Mariel Barbachan; Buriol, Luciana S; Lamb, Luis C

    2014-10-12

    A long standing problem in structural bioinformatics is to determine the three-dimensional (3-D) structure of a protein when only a sequence of amino acid residues is given. Many computational methodologies and algorithms have been proposed as a solution to the 3-D Protein Structure Prediction (3-D-PSP) problem. These methods can be divided in four main classes: (a) first principle methods without database information; (b) first principle methods with database information; (c) fold recognition and threading methods; and (d) comparative modeling methods and sequence alignment strategies. Deterministic computational techniques, optimization techniques, data mining and machine learning approaches are typically used in the construction of computational solutions for the PSP problem. Our main goal with this work is to review the methods and computational strategies that are currently used in 3-D protein prediction. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Comparison of Five Computational Methods for Computing Q Factors in Photonic Crystal Membrane Cavities

    DEFF Research Database (Denmark)

    Novitsky, Andrey; de Lasson, Jakob Rosenkrantz; Frandsen, Lars Hagedorn

    2017-01-01

    Five state-of-the-art computational methods are benchmarked by computing quality factors and resonance wavelengths in photonic crystal membrane L5 and L9 line defect cavities. The convergence of the methods with respect to resolution, degrees of freedom and number of modes is investigated. Specia...

  13. How well are health information websites displayed on mobile phones? Implications for the readability of health information.

    Science.gov (United States)

    Cheng, Christina; Dunn, Matthew

    2017-03-01

    Issue addressed More than 87% of Australians own a mobile phone with Internet access and 82% of phone owners use their smartphones to search for health information, indicating that mobile phones may be a powerful tool for building health literacy. Yet, online health information has been found to be above the reading ability of the general population. As reading on a smaller screen may further complicate the readability of information, this study aimed to examine how health information is displayed on mobile phones and its implications for readability. Methods Using a cross-sectional design with convenience sampling, a sample of 270 mobile webpages with information on 12 common health conditions was generated for analysis, they were categorised based on design and position of information display. Results The results showed that 71.48% of webpages were mobile-friendly but only 15.93% were mobile-friendly webpages designed in a way to optimise readability, with a paging format and queried information displayed for immediate viewing. Conclusion With inadequate evidence and lack of consensus on how webpage design can best promote reading and comprehension, it is difficult to draw a conclusion on the effect of current mobile health information presentation on readability. So what? Building mobile-responsive websites should be a priority for health information providers and policy-makers. Research efforts are urgently required to identify how best to enhance readability of mobile health information and fully capture the capabilities of mobile phones as a useful device to increase health literacy.

  14. Computational Methods for Physicists Compendium for Students

    CERN Document Server

    Sirca, Simon

    2012-01-01

    This book helps advanced undergraduate, graduate and postdoctoral students in their daily work by offering them a compendium of numerical methods. The choice of methods pays significant attention to error estimates, stability and convergence issues, as well as to ways to optimize program execution speed. Many examples are given throughout the chapters, and each chapter is followed by at least a handful of more comprehensive problems which may be dealt with, for example, on a weekly basis in a one- or two-semester course. In these end-of-chapter problems the physics background is pronounced, and the main text preceding them is intended as an introduction or as a later reference. Less stress is given to the explanation of individual algorithms. The book aims to foster in the reader independent thinking and a certain amount of scepticism and scrutiny, rather than blind reliance on readily available commercial tools.

  15. Measurement method of cardiac computed tomography (CT)

    International Nuclear Information System (INIS)

    Watanabe, Shigeru; Yamamoto, Hironori; Yumura, Yasuo; Yoshida, Hideo; Morooka, Nobuhiro

    1980-01-01

    The CT was carried out in 126 cases consisting of 31 normals, 17 cases of mitral stenosis (MS), 8 cases of mitral regurgitation (MR), 11 cases of aortic stenosis (AS), 9 cases of aortic regurgitation (AR), 20 cases of myocardial infarction (MI), 8 cases of atrial septal defect (ASD) and 22 hypertensives. The 20-second scans were performed every 1.5 cm from the 2nd intercostal space to the 5th or 6th intercostal space. The computed tomograms obtained were classified into 8 levels by cross-sectional anatomy; levels of (1) the aortic arch, (2) just beneath the aortic arch, (3) the pulmonary artery bifurcation, (4) the right atrial appendage or the upper right atrium, (5) the aortic root, (6) the upper left ventricle, (7) the mid left ventricle, and (8) the lower left ventricle. The diameter (anteroposterior and transverse) and cross-sectional area were measured for the ascending aorta (Ao), descending aorta (AoD), superior vena cava (SVC), inferior vena cava (IVC), pulmonary artery branch (PA), main pulmonary artery (mPA), left atrium (LA), right atrium (RA), and right ventricular outflow tract (RVOT) on each level where they were clearly distinguished. However, it was difficult to separate the cardiac wall from the cardiac cavity because there was little difference in X-ray attenuation coefficient between the myocardium and blood. Therefore, at the mid-ventricular level, the diameter and area of the total cardiac shadow were measured, and the cardiac ratios to the thorax were then calculated. The normal range of these values is shown in a table, and abnormal characteristics in cardiac disease are exhibited in comparison with normal values. In MS, the diameter and area of the LA were significantly larger than normal. In MS and ASD, all components of the right cardiac system were larger than normal, especially RA and SVC in MS, and PA and RVOT in ASD. The diameter and area of the aortic root were larger than normal, in the order AR, AS and HT. (author)

  16. Three numerical methods for the computation of the electrostatic energy

    International Nuclear Information System (INIS)

    Poenaru, D.N.; Galeriu, D.

    1975-01-01

    The FORTRAN programs for computation of the electrostatic energy of a body with axial symmetry by the Lawrence, Hill-Wheeler and Beringer methods are presented in detail. The accuracy, time of computation and the required memory of these methods are tested at various deformations for two simple parametrisations: two overlapping identical spheres and a spheroid. On this basis the field of application of each method is recommended.
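
    Although none of the three methods is reproduced here, the quantity they compute is easy to cross-check for a simple shape: for a uniformly charged sphere with k = Q = R = 1, the electrostatic self-energy is 3/5, and a Monte Carlo estimate of the double integral U = (kQ²/2)·E[1/|r₁−r₂|] recovers it.

        import numpy as np

        rng = np.random.default_rng(1)

        def points_in_unit_sphere(n):
            # Rejection sampling of uniform points inside the unit sphere.
            pts = np.empty((0, 3))
            while len(pts) < n:
                p = rng.uniform(-1, 1, size=(n, 3))
                pts = np.vstack([pts, p[(p * p).sum(axis=1) <= 1.0]])
            return pts[:n]

        n = 200_000
        r1, r2 = points_in_unit_sphere(n), points_in_unit_sphere(n)
        inv_dist = 1.0 / np.linalg.norm(r1 - r2, axis=1)
        print(0.5 * inv_dist.mean())  # analytic value: 3/5 = 0.6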

  17. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  18. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets (http://compms.org/RefData) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  19. Developing a multimodal biometric authentication system using soft computing methods.

    Science.gov (United States)

    Malcangi, Mario

    2015-01-01

    Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometric offers several advantages, mainly in embedded system applications. Hard and soft multi-biometric, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize the applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision.

  20. Readability Formulas as Applied to College Economics Textbooks.

    Science.gov (United States)

    McConnell, Campbell R.

    1982-01-01

    Determines from empirical information on the application of four readability formulas to a group of widely used college economics textbooks that there is no consistency in the absolute reading levels or the rank orderings of these books. (AEA)

  1. Computational Simulations and the Scientific Method

    Science.gov (United States)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  2. Computer systems and methods for visualizing data

    Science.gov (United States)

    Stolte, Chris; Hanrahan, Patrick

    2013-01-29

    A method for forming a visual plot using a hierarchical structure of a dataset. The dataset comprises a measure and a dimension. The dimension consists of a plurality of levels. The plurality of levels form a dimension hierarchy. The visual plot is constructed based on a specification. A first level from the plurality of levels is represented by a first component of the visual plot. A second level from the plurality of levels is represented by a second component of the visual plot. The dataset is queried to retrieve data in accordance with the specification. The data includes all or a portion of the dimension and all or a portion of the measure. The visual plot is populated with the retrieved data in accordance with the specification.
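
    A sketch of the described workflow using pandas as a stand-in: the dataset is queried for the measure at two levels of the dimension hierarchy, and each level populates its own plot component. The table and column names are invented for illustration.

        import pandas as pd
        import matplotlib.pyplot as plt

        sales = pd.DataFrame({
            "region": ["East", "East", "West", "West"],  # hierarchy level 1
            "state":  ["NY", "MA", "CA", "WA"],          # hierarchy level 2
            "amount": [120, 80, 150, 60],                # the measure
        })

        # "Query" per the specification: aggregate the measure at each level.
        by_region = sales.groupby("region")["amount"].sum()
        by_state = sales.groupby(["region", "state"])["amount"].sum()

        fig, (ax1, ax2) = plt.subplots(1, 2)
        by_region.plot.bar(ax=ax1, title="Level 1: region")
        by_state.plot.bar(ax=ax2, title="Level 2: state")
        plt.tight_layout()
        plt.show()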

  3. Exploration on Automated Software Requirement Document Readability Approaches

    OpenAIRE

    Chen, Mingda; He, Yao

    2017-01-01

    Context. The requirements analysis phase, as the very beginning of software development process, has been identified as a quite important phase in the software development lifecycle. Software Requirement Specification (SRS) is the output of requirements analysis phase, whose quality factors play an important role in the evaluation work. Readability is a quite important SRS quality factor, but there are few available automated approaches for readability measurement, because of the tight depend...

  4. Control rod computer code IAMCOS: general theory and numerical methods

    International Nuclear Information System (INIS)

    West, G.

    1982-11-01

    IAMCOS is a computer code for the description of mechanical and thermal behavior of cylindrical control rods for fast breeders. This code version was applied, tested and modified from 1979 to 1981. In this report are described the basic model (02 version), theoretical definitions and computation methods [fr

  5. Computation of saddle-type slow manifolds using iterative methods

    DEFF Research Database (Denmark)

    Kristiansen, Kristian Uldall

    2015-01-01

    This paper presents an alternative approach for the computation of trajectory segments on slow manifolds of saddle type. This approach is based on iterative methods rather than collocation-type methods. Compared to collocation methods, which require mesh refinements to ensure uniform convergence with respect to the singular perturbation parameter, appropriate estimates are directly attainable using the method of this paper. The method is applied to several examples, including a model for a pair of neurons coupled by reciprocal inhibition with two slow and two fast variables, and the computation of homoclinic connections in the FitzHugh-Nagumo system.

  6. Readability of websites containing information on dental implants.

    Science.gov (United States)

    Jayaratne, Yasas S N; Anderson, Nina K; Zwahlen, Roger A

    2014-12-01

    It is recommended that health-related materials for patients be written at sixth grade level or below. Many websites oriented toward patient education about dental implants are available, but the readability of these sites has not been evaluated. To assess readability of patient-oriented online information on dental implants. Websites containing patient-oriented information on dental implants were retrieved using the Google search engine. Individual and mean readability/grade levels were calculated using standardized formulas. Readability of each website was classified as easy (≤ 6th-grade level) or difficult (≥ 10th-grade level). Thirty-nine websites with patient-oriented information on dental implants were found. The average readability grade level of these websites was 11.65 ± 1.36. No website scored at or below the recommended 6th grade level. Thirty-four of the 39 websites (87.18%) were difficult to read. The number of characters, words, and sentences on these sites varied widely. All patient-oriented websites on dental implants scored above the recommended grade level, and the majority of these sites were "difficult" in their readability. There is a dire need to create patient information websites on implants which the majority can read. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Readability of patient information and consent documents in rheumatological studies.

    Science.gov (United States)

    Hamnes, Bente; van Eijk-Hustings, Yvonne; Primdahl, Jette

    2016-07-16

    Before participation in medical research an informed consent must be obtained. This study investigates whether the readability of patient information and consent documents (PICDs) corresponds to the average educational level of participants in rheumatological studies in the Netherlands, Denmark, and Norway. 24 PICDs from studies were collected and readability was assessed independently using the Gunning's Fog Index (FOG) and Simple Measure of Gobbledygook (SMOG) grading. The mean score for the FOG and SMOG grades were 14.2 (9.0-19.0) and 14.2 (12-17) respectively. The mean FOG and SMOG grades were 12.7 and 13.3 in the Dutch studies, 15.0 and 14.9 in the Danish studies, and 14.6 and 14.3 in the Norwegian studies, respectively. Out of the 2865 participants, more than 57 % had a lower educational level than the highest readability score calculated in the individual study. As the readability level of the PICDs did not match the participants' educational level, consent may not have been valid, as the participants may have had a limited understanding of what they agreed to participate in. There should be more focus on the readability of PICDs. National guidelines for how to write clear and unambiguous PICDs in simple and easily understandable language could increase the focus on the readability of PICD.

  8. Discrete linear canonical transform computation by adaptive method.

    Science.gov (United States)

    Zhang, Feng; Tao, Ran; Wang, Yue

    2013-07-29

    The linear canonical transform (LCT) describes the effect of quadratic phase systems on a wavefield and generalizes many optical transforms. In this paper, the computation method for the discrete LCT using the adaptive least-mean-square (LMS) algorithm is presented. The computation approaches of the block-based discrete LCT and the stream-based discrete LCT using the LMS algorithm are derived, and the implementation structures of these approaches by the adaptive filter system are considered. The proposed computation approaches have the inherent parallel structures which make them suitable for efficient VLSI implementations, and are robust to the propagation of possible errors in the computation process.
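
    The LMS update at the core of the approach is the standard adaptive-filter recursion w ← w + 2µ·e·x. The sketch below uses it for a simpler task, identifying an unknown FIR filter, since the discrete-LCT-specific filter structure is not spelled out in the abstract.

        import numpy as np

        rng = np.random.default_rng(0)
        h_true = np.array([0.5, -0.3, 0.2])  # unknown system to identify
        w = np.zeros(3)                      # adaptive filter taps
        mu = 0.05                            # LMS step size

        x = rng.normal(size=5000)            # input signal
        for n in range(3, len(x)):
            xn = x[n-3:n][::-1]              # most recent three samples
            d = h_true @ xn                  # desired response
            e = d - w @ xn                   # instantaneous error
            w += 2 * mu * e * xn             # LMS weight update

        print(w)  # converges toward h_true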

  9. Platform-independent method for computer aided schematic drawings

    Science.gov (United States)

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  10. Simulating elastic light scattering using high performance computing methods

    NARCIS (Netherlands)

    Hoekstra, A.G.; Sloot, P.M.A.; Verbraeck, A.; Kerckhoffs, E.J.H.

    1993-01-01

    The Coupled Dipole method, as originally formulated by Purcell and Pennypacker, is a very powerful method to simulate the Elastic Light Scattering from arbitrary particles. This method, which is a particle simulation model for Computational Electromagnetics, has one major drawback: if the size of the

  11. Computational and experimental methods for enclosed natural convection

    International Nuclear Information System (INIS)

    Larson, D.W.; Gartling, D.K.; Schimmel, W.P. Jr.

    1977-10-01

    Two computational procedures and one optical experimental procedure for studying enclosed natural convection are described. The finite-difference and finite-element numerical methods are developed and several sample problems are solved. Results obtained from the two computational approaches are compared. A temperature-visualization scheme using laser holographic interferometry is described, and results from this experimental procedure are compared with results from both numerical methods

  12. Computer Anti-forensics Methods and their Impact on Computer Forensic Investigation

    OpenAIRE

    Pajek, Przemyslaw; Pimenidis, Elias

    2009-01-01

    Electronic crime is very difficult to investigate and prosecute, mainly due to the fact that investigators have to build their cases based on artefacts left on computer systems. Nowadays, computer criminals are aware of computer forensics methods and techniques and try to use countermeasure techniques to efficiently impede the investigation processes. In many cases investigation with such countermeasure techniques in place appears to be too expensive, or too time consuming…

  13. Fibonacci’s Computation Methods vs Modern Algorithms

    Directory of Open Access Journals (Sweden)

    Ernesto Burattini

    2013-12-01

    In this paper we discuss some computational procedures given by Leonardo Pisano Fibonacci in his famous Liber Abaci book, and we propose their translation into a modern language for computers (C++). Among others, we describe the method of “cross” multiplication, we evaluate its computational complexity in algorithmic terms, and we show the output of a C++ code that describes the development of the method applied to the product of two integers. In a similar way we show the operations performed on fractions introduced by Fibonacci. Thanks to the possibility of reproducing on a computer Fibonacci's different computational procedures, it was possible to identify some calculation errors present in the different versions of the original text.
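
    A sketch of the "cross" multiplication in Python rather than the paper's C++: digit k of the product collects every cross product of digits whose positions sum to k, after which the carries are propagated, giving the familiar O(n²) digit algorithm.

        def cross_multiply(a, b):
            da = [int(c) for c in str(a)[::-1]]  # digits, least significant first
            db = [int(c) for c in str(b)[::-1]]
            res = [0] * (len(da) + len(db))
            for i, x in enumerate(da):           # cross products: positions i + j
                for j, y in enumerate(db):
                    res[i + j] += x * y
            carry = 0
            for k in range(len(res)):            # propagate carries
                res[k] += carry
                carry, res[k] = divmod(res[k], 10)
            return int("".join(map(str, res[::-1])))

        print(cross_multiply(1234, 5678), 1234 * 5678)  # both 7006652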

  14. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

    SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented and an application for smart cars is also introduced. The book can serve as a valuable reference work for resea

  15. Determining Readability: How to Select and Apply Easy-to-Use Readability Formulas to Assess the Difficulty of Adult Literacy Materials

    Science.gov (United States)

    Burke, Victoria; Greenberg, Daphne

    2010-01-01

    There are many readability tools that instructors can use to help adult learners select reading materials. We describe and compare different types of readability tools: formulas calculated by hand, tools found on the Web, tools embedded in a word processing program, and readability tools found in a commercial software program. Practitioners do not…

  16. The role of readability in effective health communication: an experiment using a Japanese health information text on chronic suppurative otitis media.

    Science.gov (United States)

    Sakai, Yukiko

    2013-09-01

    This study identifies the most significant readability factors and examines ways of improving and evaluating Japanese health information text in terms of ease of reading and understanding. Six different Japanese texts were prepared based on an original short text written by a medical doctor for a hospital web site intended for laypersons regarding chronic suppurative otitis media. Four were each revised for a single readability factor (syntax, vocabulary, or text structure) and two were modified in all three factors. Using a web-based survey, 270 high school students read one of the seven texts, including the original, completed two kinds of comprehension tests, and answered questions on their impressions of the text's readability. Significantly higher comprehension test scores were shown in the true-or-false test for a mixed text that presented important information first for better text structure. They were also found in the cloze test for a text using common vocabulary and a cohesive mixed text. Vocabulary could be a critical single readability factor, presumably when combined with better text structure. Using multiple evaluation methods can help assess comprehensive readability. The findings on improvement and evaluation methods of readability can be applied to support effective health communication. © 2013 The authors. Health Information and Libraries Journal © 2013 Health Libraries Group Health Information and Libraries Journal.

  17. Computer science handbook. Vol. 13.3. Environmental computer science. Computer science methods for environmental protection and environmental research

    International Nuclear Information System (INIS)

    Page, B.; Hilty, L.M.

    1994-01-01

    Environmental computer science is a new partial discipline of applied computer science, which makes use of methods and techniques of information processing in environmental protection. Thanks to the inter-disciplinary nature of environmental problems, computer science acts as a mediator between numerous disciplines and institutions in this sector. The handbook reflects the broad spectrum of state-of-the art environmental computer science. The following important subjects are dealt with: Environmental databases and information systems, environmental monitoring, modelling and simulation, visualization of environmental data and knowledge-based systems in the environmental sector. (orig.) [de

  18. Readability of Educational Materials to Support Parent Sexual Communication With Their Children and Adolescents.

    Science.gov (United States)

    Ballonoff Suleiman, Ahna; Lin, Jessica S; Constantine, Norman A

    2016-05-01

    Sexual communication is a principal means of transmitting sexual values, expectations, and knowledge from parents to their children and adolescents. Many parents seek information and guidance to support talking with their children about sex and sexuality. Parent education materials can deliver this guidance but must use appropriate readability levels to facilitate comprehension and motivation. This study appraised the readability of educational materials to support parent sexual communication with their children. Fifty brochures, pamphlets, and booklets were analyzed using the Flesch-Kincaid, Gunning Fog, and Simple Measure of Gobbledygook (SMOG) index methods. Mean readability grade-level scores were 8.3 (range = 4.5-12.8), 9.7 (range = 5.5-14.9), and 10.1 (range = 6.7-13.9), respectively. Informed by National Institutes of Health-recommended 6th to 7th grade levels and American Medical Association-recommended 5th to 6th grade levels, percentages falling at or below the 7.0 grade level were calculated as 38%, 12%, and 2% and those falling at or below the 6.0 grade level were calculated as 12%, 2%, and 0% based on the Flesch-Kincaid, Gunning Fog, and SMOG methods, respectively. These analyses indicate that the majority of educational materials available online to support parents' communication with their children about sex and sexuality do not meet the needs of many or most parents. Efforts to improve the accessibility of these materials are warranted.
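    The three indices used in the study are simple functions of sentence, word, and syllable counts. As a rough sketch of how such scores are computed (the vowel-group syllable counter below is a crude stand-in for the dictionary-based counters real tools use):

    ```python
    import re

    def count_syllables(word):
        # Crude heuristic: count groups of consecutive vowels (an assumption, not exact).
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def readability_scores(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        n_words = max(1, len(words))
        syllables = sum(count_syllables(w) for w in words)
        complex_words = sum(1 for w in words if count_syllables(w) >= 3)

        flesch_kincaid = 0.39 * n_words / sentences + 11.8 * syllables / n_words - 15.59
        gunning_fog = 0.4 * (n_words / sentences + 100.0 * complex_words / n_words)
        smog = 1.0430 * (complex_words * 30.0 / sentences) ** 0.5 + 3.1291
        return flesch_kincaid, gunning_fog, smog

    print(readability_scores("Talk with your child early. Honest answers build trust."))
    ```

    Note that SMOG is calibrated for samples of 30 sentences or more, and the Gunning Fog definition excludes some polysyllabic words (proper nouns, familiar compounds) that this sketch counts.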

  19. Assessing Online Patient Education Readability for Spine Surgery Procedures.

    Science.gov (United States)

    Long, William W; Modi, Krishna D; Haws, Brittany E; Khechen, Benjamin; Massel, Dustin H; Mayo, Benjamin C; Singh, Kern

    2018-03-01

    Increased patient reliance on Internet-based health information has amplified the need for comprehensible online patient education articles. As suggested by the American Medical Association and the National Institutes of Health, spine fusion articles should be written at a 4th- to 6th-grade reading level to increase patient comprehension, which may improve postoperative outcomes. The purpose of this study is to determine the readability of online health care education information relating to anterior cervical discectomy and fusion (ACDF) and lumbar fusion procedures. Online health-education resource qualitative analysis. Three search engines were utilized to access patient education articles for common cervical and lumbar spine procedures. Relevant articles were analyzed for readability using Readability Studio Professional Edition software (Oleander Software Ltd). Articles were stratified by organization type as follows: General Medical Websites (GMW), Healthcare Network/Academic Institutions (HNAI), and Private Practices (PP). Thirteen common readability tests were performed, with the mean readability of each compared between subgroups using analysis of variance. ACDF and lumbar fusion articles were determined to have a mean readability of 10.7±1.5 and 11.3±1.6, respectively. The GMW, HNAI, and PP subgroups had a mean readability of 10.9±2.9, 10.7±2.8, and 10.7±2.5 for ACDF and 10.9±3.0, 10.8±2.9, and 11.6±2.7 for lumbar fusion articles. Of 310 total articles, only 6 (3 ACDF and 3 lumbar fusion) were written for comprehension below a 7th-grade reading level. Current online literature from medical websites containing information regarding ACDF and lumbar fusion procedures is written at a grade level higher than the suggested guidelines. Current patient education articles should therefore be revised to accommodate the average reading level in the United States, which may result in improved patient comprehension and postoperative outcomes.

  20. Further Issues in Determining the Readability of Self-Report Items: Comment on McHugh and Behar (2009)

    Science.gov (United States)

    Schinka, John A.

    2012-01-01

    Objective: Issues regarding the readability of self-report assessment instruments, methods for establishing the reading ability level of respondents, and guidelines for development of scales designed for marginal readers have been inconsistently addressed in the literature. A recent study by McHugh and Behar (2009) provided new findings relevant…

  1. Computational methods for protein identification from mass spectrometry data.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2008-02-01

    Full Text Available Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology.

  2. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, data have become super-large in scale, discrete, and unstructured or semi-structured, going far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical approach to massive data mining and can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and verifies it experimentally. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
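    The counting core of such a MapReduce association-rule miner can be imitated in-process: mappers emit (itemset, 1) pairs per transaction partition and reducers sum them. A minimal sketch with hypothetical transactions (a real deployment would run on Hadoop or Spark rather than in a single process):

    ```python
    from collections import Counter
    from itertools import combinations

    # Hypothetical transaction partitions, as they might be split across mapper nodes.
    partitions = [
        [{"milk", "bread"}, {"milk", "eggs", "bread"}],
        [{"bread", "eggs"}, {"milk", "bread", "eggs"}],
    ]

    def map_phase(transactions):
        # Emit (item_pair, 1) for every candidate 2-itemset in each transaction.
        for t in transactions:
            for pair in combinations(sorted(t), 2):
                yield pair, 1

    def reduce_phase(mapped):
        # Sum the counts per key, as the shuffle/reduce stage would.
        counts = Counter()
        for key, value in mapped:
            counts[key] += value
        return counts

    counts = reduce_phase(kv for p in partitions for kv in map_phase(p))
    min_support = 2
    print({pair: c for pair, c in counts.items() if c >= min_support})
    ```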

  3. A Krylov Subspace Method for Unstructured Mesh SN Transport Computation

    International Nuclear Information System (INIS)

    Yoo, Han Jong; Cho, Nam Zin; Kim, Jong Woon; Hong, Ser Gi; Lee, Young Ouk

    2010-01-01

    Hong et al. have developed the computer code MUST (Multi-group Unstructured geometry SN Transport) for neutral particle transport calculations in three-dimensional unstructured geometry. In this code, the discrete ordinates transport equation is solved by using the discontinuous finite element method (DFEM) or subcell balance methods with linear discontinuous expansion. In this paper, the conventional source iteration in the MUST code is replaced by a Krylov subspace method to reduce computing time, and the numerical test results are given
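    The switch from source iteration to a Krylov solver amounts to handing the fixed-point equation φ = Kφ + q to GMRES as the linear system (I − K)φ = q, where only the action of K (one transport sweep) is needed. A schematic sketch with a stand-in sweep operator, not the MUST discretization:

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    n = 200
    rng = np.random.default_rng(0)
    # Stand-in for one transport sweep applied to a flux vector (contractive operator).
    K = 0.5 * np.diag(np.full(n - 1, 0.9), -1) + 0.4 * np.eye(n)
    q = rng.random(n)

    def apply_A(phi):
        # Matrix-free action of (I - K): all a Krylov solver needs.
        return phi - K @ phi

    A = LinearOperator((n, n), matvec=apply_A)
    phi, info = gmres(A, q)
    assert info == 0
    print(np.linalg.norm(phi - (K @ phi + q)))   # small residual of the fixed point
    ```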

  4. Computational methods for high-energy source shielding

    International Nuclear Information System (INIS)

    Armstrong, T.W.; Cloth, P.; Filges, D.

    1983-01-01

    The computational methods for high-energy radiation transport related to shielding of the SNQ spallation source are outlined. The basic approach is to couple radiation-transport computer codes which use Monte Carlo methods and discrete ordinates methods. A code system is suggested that incorporates state-of-the-art radiation-transport techniques. The stepwise verification of that system is briefly summarized. The complexity of the resulting code system suggests a more straightforward code specially tailored for thick-shield calculations. A short guideline for the future development of such a Monte Carlo code is given

  5. Monte Carlo methods of PageRank computation

    NARCIS (Netherlands)

    Litvak, Nelli

    2004-01-01

    We describe and analyze an on-line Monte Carlo method of PageRank computation. The PageRank is estimated based on the results of a large number of short independent simulation runs initiated from each page that contains outgoing hyperlinks. The method does not require any storage of the hyperlink matrix
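    The scheme can be sketched as follows: from each page, run many short walks that continue with the damping probability c and terminate otherwise, then estimate PageRank from the frequency of terminal pages. A toy graph; this follows the general idea rather than the paper's exact estimator:

    ```python
    import random
    from collections import Counter

    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # toy hyperlink graph
    nodes, c, runs = list(links), 0.85, 2000       # c is the damping factor

    end_counts, total = Counter(), 0
    for start in nodes:
        for _ in range(runs):
            page = start
            while random.random() < c:
                # Follow a random outgoing link; jump anywhere from a dangling page.
                page = random.choice(links[page]) if links[page] else random.choice(nodes)
            end_counts[page] += 1
            total += 1

    print({p: end_counts[p] / total for p in nodes})   # PageRank estimates
    ```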

  6. Geometric optical transfer function and its computation method

    International Nuclear Information System (INIS)

    Wang Qi

    1992-01-01

    The geometric optical transfer function formula is derived after clarifying some points that are easily overlooked, and its computation method is given, using the zero-order Bessel function, numerical integration, and spline interpolation. The method has the advantage of ensuring accuracy while saving computation
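    For a rotationally symmetric point-spread function, the geometric OTF reduces to a Hankel transform, OTF(ν) = 2π ∫ PSF(r) J0(2πνr) r dr, which combines exactly the three ingredients named above. A sketch assuming a hypothetical sampled radial spot profile:

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.special import j0

    # Hypothetical radial spot profile, e.g. sampled from ray-trace data.
    r_samples = np.linspace(0.0, 1.0, 11)
    psf_samples = np.exp(-5.0 * r_samples**2)

    psf = CubicSpline(r_samples, psf_samples)   # spline interpolation of the profile
    r = np.linspace(0.0, 1.0, 2000)

    def otf(nu):
        # Zero-order Hankel transform evaluated by the trapezoidal rule.
        return 2.0 * np.pi * np.trapz(psf(r) * j0(2.0 * np.pi * nu * r) * r, r)

    norm = otf(0.0)   # normalize so that OTF(0) = 1
    for nu in (0.0, 0.5, 1.0, 2.0):
        print(nu, otf(nu) / norm)
    ```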

  7. Efficient Numerical Methods for Stochastic Differential Equations in Computational Finance

    KAUST Repository

    Happola, Juho

    2017-09-19

    Stochastic Differential Equations (SDE) offer a rich framework to model the probabilistic evolution of the state of a system. Numerical approximation methods are typically needed in evaluating relevant Quantities of Interest arising from such models. In this dissertation, we present novel effective methods for evaluating Quantities of Interest relevant to computational finance when the state of the system is described by an SDE.
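    A standard workhorse for such Quantities of Interest is Euler-Maruyama time stepping combined with Monte Carlo averaging. A minimal sketch estimating a discounted call payoff E[exp(-rT) max(S_T - K, 0)] under geometric Brownian motion (the parameters are illustrative, not taken from the dissertation):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    S0, r, sigma, T, K = 100.0, 0.05, 0.2, 1.0, 100.0
    n_steps, n_paths = 252, 100_000
    dt = T / n_steps

    S = np.full(n_paths, S0)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        S += r * S * dt + sigma * S * dW   # Euler-Maruyama step for dS = r S dt + sigma S dW

    payoff = np.exp(-r * T) * np.maximum(S - K, 0.0)
    print(payoff.mean(), payoff.std() / np.sqrt(n_paths))   # QoI and its standard error
    ```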

  8. Fully consistent CFD methods for incompressible flow computations

    DEFF Research Database (Denmark)

    Kolmogorov, Dmitry; Shen, Wen Zhong; Sørensen, Niels N.

    2014-01-01

    Nowadays collocated-grid-based CFD methods are among the most efficient tools for computations of the flows past wind turbines. To ensure the robustness of the methods, they require special attention to the well-known problem of pressure-velocity coupling. Many commercial codes to ensure the pressure…

  9. Efficient Numerical Methods for Stochastic Differential Equations in Computational Finance

    KAUST Repository

    Happola, Juho

    2017-01-01

    Stochastic Differential Equations (SDE) offer a rich framework to model the probabilistic evolution of the state of a system. Numerical approximation methods are typically needed in evaluating relevant Quantities of Interest arising from such models. In this dissertation, we present novel effective methods for evaluating Quantities of Interest relevant to computational finance when the state of the system is described by an SDE.

  10. Computational methods for structural load and resistance modeling

    Science.gov (United States)

    Thacker, B. H.; Millwater, H. R.; Harren, S. V.

    1991-01-01

    An automated capability for computing structural reliability considering uncertainties in both load and resistance variables is presented. The computations are carried out using an automated Advanced Mean Value iteration algorithm (AMV+) with performance functions involving load and resistance variables obtained by both explicit and implicit methods. A complete description of the procedures used is given, as well as several illustrative examples verified by Monte Carlo analysis. In particular, the computational methods described in the paper are shown to be quite accurate and efficient for a materially nonlinear structure considering material damage as a function of several primitive random variables. The results clearly show the effectiveness of the algorithms for computing the reliability of large-scale structural systems with a maximum number of resolutions.
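    The Monte Carlo verification mentioned above amounts to sampling the random variables and counting limit-state violations. A toy check for g = R − S with normal resistance and load (not the paper's structural model), where the exact reliability index is known:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000
    R = rng.normal(300.0, 30.0, n)   # resistance samples
    S = rng.normal(200.0, 40.0, n)   # load samples

    pf = np.mean(R - S < 0.0)                        # estimated failure probability
    beta = (300.0 - 200.0) / np.hypot(30.0, 40.0)    # analytic reliability index (= 2.0)
    print(pf, beta)   # pf should be close to Phi(-2), about 0.0228
    ```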

  11. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu/~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  12. Quality and readability of websites for patient information on tonsillectomy and sleep apnea.

    Science.gov (United States)

    Chi, Ethan; Jabbour, Noel; Aaronson, Nicole Leigh

    2017-07-01

    Tonsillectomy is a common treatment for obstructive sleep apnea (OSA). The Internet allows patients direct access to medical information. Since information on the Internet is largely unregulated, quality and readability are variable. This study evaluates the quality and readability of the most likely visited websites presenting information on sleep apnea and tonsillectomy. The three most popular search engines (Google, Bing, Yahoo) were queried with the phrase "sleep apnea AND tonsillectomy." The DISCERN instrument was used to assess quality of information. Readability was evaluated using the Flesch-Kincaid Reading Grade Level (FKGL) and Flesch Reading Ease Score (FRES). Out of a maximum of 80, the average DISCERN quality score for the websites was 55.1 (SD 12.3, median 60.5). The mean FRES score was 42.3 (SD 15.9, median 45.5), which falls in the range defined as difficult. No website was above the optimal score of 65. The mean FKGL score was a US grade level of 10.7 (SD 1.6, median 11.6). Only 4 (27%) websites were in the optimal range of 6-8. There was very weak correlation between FRES and DISCERN (r = 0.07) and between FKGL and DISCERN (r = 0.21). Tonsillectomy is one of the most common surgeries in the US. However, the Internet information readily available to patients varies in quality. Additionally, much of the information is above the recommended grade level for comprehension by the public. By being aware of what information patients are reading online, physicians can better explain treatments and address misunderstandings. Physicians may consider using similar methods to test the readability of their own resources for patient education. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was prepared in serial form. This is the fourth issue, showing an overview of scientific computational methods with an introduction to continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed following the process chain of binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method, and dislocation dynamics. (T. Tanaka)
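    Of the methods listed, kinetic Monte Carlo is the easiest to show compactly: an event is selected with probability proportional to its rate and the clock advances by an exponential waiting time. A generic residence-time step with placeholder rates (not tied to any particular materials model):

    ```python
    import math
    import random

    def kmc_step(rates):
        # Pick an event with probability proportional to its rate.
        total = sum(rates)
        pick, acc = random.random() * total, 0.0
        for event, rate in enumerate(rates):
            acc += rate
            if pick < acc:
                break
        dt = -math.log(1.0 - random.random()) / total   # exponential waiting time
        return event, dt

    t = 0.0
    rates = [1.0, 0.5, 0.1]   # e.g., vacancy hop, interstitial hop, recombination
    for _ in range(5):
        event, dt = kmc_step(rates)
        t += dt
        print(f"t = {t:.3f}, event = {event}")
    ```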

  14. Third Molars on the Internet: A Guide for Assessing Information Quality and Readability

    Science.gov (United States)

    Brennan, David; Sambrook, Paul; Armfield, Jason

    2015-01-01

    Background Directing patients suffering from third molar (TM) problems to high-quality online information is not only medically important, but could also enable better engagement in shared decision making. Objectives This study aimed to develop a scale that measures the scientific information quality (SIQ) of online information concerning wisdom tooth problems and to conduct a quality evaluation of online TM resources. In addition, the study evaluated whether a specific piece of readability software (Readability Studio Professional 2012) might be reliable in measuring information comprehension, and explored predictors for the SIQ Scale. Methods A cross-sectional sample of websites was retrieved using certain keywords and phrases such as “impacted wisdom tooth problems” using 3 popular search engines. The retrieved websites (n=150) were filtered. The retained 50 websites were evaluated to assess their characteristics, usability, accessibility, trust, readability, SIQ, and their credibility using DISCERN and Health on the Net Code (HoNCode). Results Websites’ mean scale scores varied significantly across website affiliation groups such as governmental, commercial, and treatment provider bodies. The SIQ Scale had good internal consistency (alpha=.85) and was significantly correlated with DISCERN (r=.82). The mean readability grade (10.3, SD 1.9) was above the recommended level and was significantly correlated with the Scientific Information Comprehension Scale (r=.45). Readability Studio software estimates were associated with scientific information comprehensiveness measures. PMID:26443470

  15. Polyadenylated Sequencing Primers Enable Complete Readability of PCR Amplicons Analyzed by Dideoxynucleotide Sequencing

    Directory of Open Access Journals (Sweden)

    Martin Beránek

    2012-01-01

    Full Text Available Dideoxynucleotide DNA sequencing is one of the principal procedures in molecular biology. Loss of an initial part of the nucleotides behind the 3' end of the sequencing primer limits the readability of sequenced amplicons. We present a method which extends the readability by using sequencing primers modified by polyadenylated tails attached to their 5' ends. Performing a polymerase chain reaction, we amplified eight amplicons of six human genes (AMELX, APOE, HFE, MBL2, SERPINA1, and TGFB1) ranging from 106 bp to 680 bp. Polyadenylation of the sequencing primers minimized the loss of bases in all amplicons. Complete sequences of the shorter products (AMELX 106 bp, SERPINA1 121 bp, HFE 208 bp, APOE 244 bp, MBL2 317 bp) were obtained. In addition, in the case of the TGFB1 products (366 bp, 432 bp, and 680 bp), the lengths of the sequencing readings were significantly longer when adenylated primers were used. Thus, single-strand dideoxynucleotide sequencing with adenylated primers enables complete or near-complete readability of short PCR amplicons.

  16. Class of reconstructed discontinuous Galerkin methods in computational fluid dynamics

    International Nuclear Information System (INIS)

    Luo, Hong; Xia, Yidong; Nourgaliev, Robert

    2011-01-01

    A class of reconstructed discontinuous Galerkin (DG) methods is presented to solve compressible flow problems on arbitrary grids. The idea is to combine the efficiency of the reconstruction methods in finite volume methods and the accuracy of the DG methods to obtain a better numerical algorithm in computational fluid dynamics. The beauty of the resulting reconstructed discontinuous Galerkin (RDG) methods is that they provide a unified formulation for both finite volume and DG methods, and contain both classical finite volume and standard DG methods as two special cases, thus allowing for a direct efficiency comparison. Both Green-Gauss and least-squares reconstruction methods and a least-squares recovery method are presented to obtain a quadratic polynomial representation of the underlying linear discontinuous Galerkin solution on each cell via a so-called in-cell reconstruction process. The devised in-cell reconstruction aims to augment the accuracy of the discontinuous Galerkin method by increasing the order of the underlying polynomial solution. These three reconstructed discontinuous Galerkin methods are used to compute a variety of compressible flow problems on arbitrary meshes to assess their accuracy. The numerical experiments demonstrate that all three reconstructed discontinuous Galerkin methods can significantly improve the accuracy of the underlying second-order DG method, with the least-squares reconstructed DG method providing the best performance in terms of accuracy, efficiency, and robustness. (author)
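    The in-cell reconstruction idea is easy to see in one dimension: given the means and slopes of a P1 DG solution on a uniform grid, a quadratic term can be recovered per cell by least-squares matching of the neighboring cell means. A sketch under exactly those assumptions (interior cells only; the papers work in multiple dimensions on arbitrary grids):

    ```python
    import numpy as np

    def reconstruct_quadratic(means, slopes):
        # P1 data per cell in local coordinate xi in [-1/2, 1/2]:
        #   u(xi) = mean + slope*xi + c*(xi**2 - 1/12),
        # where the last term preserves the cell mean. Choose c so the candidate
        # polynomial reproduces the two neighbor means in the least-squares sense.
        c = np.zeros_like(means)
        for i in range(1, len(means) - 1):
            A = np.array([[1.0], [1.0]])   # coefficient of c in each neighbor-mean equation
            b = np.array([means[i + 1] - means[i] - slopes[i],
                          means[i - 1] - means[i] + slopes[i]])
            c[i] = np.linalg.lstsq(A, b, rcond=None)[0][0]
        return c

    h = 0.1
    x = np.linspace(0.05, 0.95, 10)        # cell centers
    means = x**2 + h**2 / 12.0             # exact cell means of u(x) = x^2
    slopes = 2.0 * x * h                   # exact scaled slopes du/dxi = h du/dx
    print(reconstruct_quadratic(means, slopes))   # ~h^2 = 0.01 on interior cells
    ```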

  17. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  18. Multigrid methods for the computation of propagators in gauge fields

    International Nuclear Information System (INIS)

    Kalkreuter, T.

    1992-11-01

    In the present work generalizations of multigrid methods for propagators in gauge fields are investigated. We discuss proper averaging operations for bosons and for staggered fermions. An efficient algorithm for computing C numerically is presented. The averaging kernels C can be used not only in deterministic multigrid computations, but also in multigrid Monte Carlo simulations, and for the definition of block spins and blocked gauge fields in Monte Carlo renormalization group studies of gauge theories. Actual numerical computations of kernels and propagators are performed in compact four-dimensional SU(2) gauge fields. (orig./HSI)

  19. Water demand forecasting: review of soft computing methods.

    Science.gov (United States)

    Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R

    2017-07-01

    Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, while ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has much more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.
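    As a concrete instance of the ANN family discussed, a short-term forecaster can simply regress the next value on lagged consumption. A minimal sketch with scikit-learn on a synthetic weekly-cycle series (the data and hyperparameters are illustrative only):

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    t = np.arange(400)
    demand = 100 + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 1, t.size)

    lags = 7   # predict today's demand from the previous week
    X = np.array([demand[i - lags:i] for i in range(lags, t.size)])
    y = demand[lags:]

    split = 350
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    print(np.mean(np.abs(pred - y[split:])))   # mean absolute error on held-out days
    ```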

  20. Improving readability through extractive summarization for learners with reading difficulties

    Directory of Open Access Journals (Sweden)

    K. Nandhini

    2013-11-01

    Full Text Available In this paper, we describe the design and evaluation of an extractive summarization approach to assist learners with reading difficulties. Whereas existing summarization approaches inherently assign more weight to the important sentences, our approach predicts, with good accuracy, summary sentences that are both important and readable for the target audience. We used a supervised machine learning technique for summary extraction from science and social studies texts in educational material. Various independent features from the existing literature for predicting important sentences, and proposed learner-dependent features for predicting readable sentences, are extracted from the texts and used for automatic classification. We performed both extrinsic and intrinsic evaluation of this approach; the intrinsic evaluation is carried out using F-measure and readability analysis, while the extrinsic evaluation comprises learner feedback using a Likert scale and an ANOVA analysis of the effect of the assistive summary on improving readability for learners with reading difficulties. The results show significant improvement in readability for the target audience using the assistive summary.
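    The pipeline described (per-sentence features, a trained classifier, selection of predicted summary sentences) can be sketched as follows; the features, labels, and tiny corpus here are hypothetical stand-ins for the paper's importance and learner-dependent readability features:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def features(sentence, position, n_sentences, keywords):
        words = sentence.lower().split()
        return [
            position / n_sentences,                                      # relative position
            len(words),                                                  # sentence length
            sum(w.strip(".,") in keywords for w in words) / len(words),  # keyword density
            float(np.mean([len(w) for w in words])),                     # word length (readability proxy)
        ]

    # Tiny hypothetical training set: label 1 marks a summary-worthy sentence.
    sents = [
        "Plants make food from sunlight.", "This process is called photosynthesis.",
        "It was first studied centuries ago.", "Water evaporates and forms clouds.",
        "Rain returns the water to the ground.", "Many poems mention the rain.",
    ]
    labels = [1, 1, 0, 1, 1, 0]
    keywords = {"plants", "photosynthesis", "water", "rain"}

    X = np.array([features(s, i, len(sents), keywords) for i, s in enumerate(sents)])
    clf = LogisticRegression().fit(X, labels)

    test = ["Light powers the growth of plants.", "Poets often write about clouds."]
    Xt = np.array([features(s, i, len(test), keywords) for i, s in enumerate(test)])
    print(clf.predict_proba(Xt)[:, 1])   # probability each sentence enters the summary
    ```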

  1. Assessing readability of patient education materials: current role in orthopaedics.

    Science.gov (United States)

    Badarudeen, Sameer; Sabharwal, Sanjeev

    2010-10-01

    Health literacy is the single best predictor of an individual's health status. It is important to customize health-related education material to the individual patient's level of reading skills. Readability of a given text is the objective measurement of the reading skills one should possess to understand the written material. In this article, some of the commonly used readability assessment tools are discussed and guidelines to improve the comprehension of patient education handouts are provided. Where are we now? Several healthcare organizations have recommended the readability of patient education materials be no higher than sixth- to eighth-grade level. However, most of the patient education materials currently available on major orthopaedic Web sites are written at a reading level that may be too advanced for comprehension by a substantial proportion of the population. WHERE DO WE NEED TO GO?: There are several readily available and validated tools for assessing the readability of written materials. While use of audiovisual aids such as video clips, line drawings, models, and charts can enhance the comprehension of a health-related topic, standard readability tools cannot construe such enhancements. HOW DO WE GET THERE?: Given the variability in the capacity to comprehend health-related materials among individuals seeking orthopaedic care, stratifying the contents of patient education materials at different levels of complexity will likely improve health literacy and enhance patient-centered communication.

  2. Readability of Orthopedic Trauma Patient Education Materials on the Internet.

    Science.gov (United States)

    Mohan, Rohith; Yi, Paul H; Morshed, Saam

    In this study, we used the Flesch-Kincaid Readability Scale to determine the readability levels of orthopedic trauma patient education materials on the American Academy of Orthopaedic Surgeons (AAOS) website and to examine how subspecialty coauthorship affects readability level. Included articles from the AAOS online patient education library and the AAOS OrthoPortal website were categorized as trauma or broken bones and injuries on the AAOS online library or were screened by study authors for relevance to orthopedic trauma. Subsequently, the Flesch-Kincaid scale was used to determine each article's readability level, which was reported as a grade level. Subspecialty coauthorship was noted for each article. A total of 115 articles from the AAOS website were included in the study and reviewed. The mean reading level was grade 9.1 for all articles reviewed. Nineteen articles (16.5%) were found to be at or below the eighth-grade level, and only 1 article was at or below the sixth-grade level. In addition, there was no statistically significant difference between articles coauthored by the various orthopedic subspecialties and those authored exclusively by AAOS. Orthopedic trauma patient education materials on the AAOS website appear to be written at a reading comprehension level too high for the average patient to understand.

  3. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...

  4. A stochastic method for computing hadronic matrix elements

    Energy Technology Data Exchange (ETDEWEB)

    Alexandrou, Constantia [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; The Cyprus Institute, Nicosia (Cyprus). Computational-based Science and Technology Research Center; Dinter, Simon; Drach, Vincent [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Jansen, Karl [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hadjiyiannakou, Kyriakos [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Renner, Dru B. [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Collaboration: European Twisted Mass Collaboration

    2013-02-15

    We present a stochastic method for the calculation of baryon three-point functions that is more versatile than the typically used sequential method. We analyze the scaling of the error of the stochastically evaluated three-point function with the lattice volume and find a favorable signal-to-noise ratio, suggesting that our stochastic method can be used efficiently at large volumes to compute hadronic matrix elements.
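    The underlying trick, replacing exact all-to-all propagator contractions by averages over random noise vectors, is the same one used in stochastic trace estimation. A generic illustration of that principle (a Hutchinson-style estimator, not the collaboration's actual lattice code):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 500
    M = rng.normal(size=(n, n))
    A = M @ M.T / n   # stand-in for a large operator whose trace we want

    def stochastic_trace(A, n_noise):
        # Hutchinson estimator: E[eta^T A eta] = tr(A) for Z2 noise vectors eta.
        samples = []
        for _ in range(n_noise):
            eta = rng.choice([-1.0, 1.0], size=A.shape[0])
            samples.append(eta @ A @ eta)
        samples = np.array(samples)
        return samples.mean(), samples.std() / np.sqrt(n_noise)

    print(np.trace(A), stochastic_trace(A, 200))   # exact value vs. estimate and its error
    ```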

  5. The Direct Lighting Computation in Global Illumination Methods

    Science.gov (United States)

    Wang, Changyaw Allen

    1994-01-01

    Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-the-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem as an eight-dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection; on Monte Carlo sampling methods; and on light source simplification. Results include a new sample generation method, a framework for the prediction of the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment, which for the first time makes ray tracing feasible for highly complex environments.
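    In its simplest form, the direct-lighting term at a diffuse point is estimated by sampling positions on an area light and averaging the geometric contributions. A stripped-down estimator for a rectangular light (the geometry is hypothetical and the visibility test is omitted for brevity):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def direct_lighting(p, n_surf, corner, e0, e1, emitted, albedo, samples=256):
        # Monte Carlo estimate of once-reflected light at point p on a diffuse surface.
        normal = np.cross(e0, e1)
        area = np.linalg.norm(normal)
        normal /= area
        total = 0.0
        for _ in range(samples):
            q = corner + rng.random() * e0 + rng.random() * e1  # uniform point on light
            d = q - p
            r2 = d @ d
            w = d / np.sqrt(r2)
            cos_p = max(0.0, w @ n_surf)
            cos_q = abs(w @ normal)   # double-sided emitter; visibility test omitted
            total += emitted * cos_p * cos_q / r2
        return albedo / np.pi * area * total / samples

    print(direct_lighting(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                          np.array([-0.5, -0.5, 2.0]),
                          np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
                          emitted=10.0, albedo=0.8))
    ```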

  6. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  7. Integrating computational methods to retrofit enzymes to synthetic pathways.

    Science.gov (United States)

    Brunk, Elizabeth; Neri, Marilisa; Tavernelli, Ivano; Hatzimanikatis, Vassily; Rothlisberger, Ursula

    2012-02-01

    Microbial production of desired compounds provides an efficient framework for the development of renewable energy resources. To be competitive with traditional chemistry, one requirement is to utilize the full capacity of the microorganism to produce target compounds with high yields and turnover rates. We use integrated computational methods to generate and quantify the performance of novel biosynthetic routes that contain highly optimized catalysts. Engineering a novel reaction pathway entails addressing feasibility on multiple levels, which involves handling the complexity of large-scale biochemical networks while respecting the critical chemical phenomena at the atomistic scale. To pursue this multi-layer challenge, our strategy merges knowledge-based metabolic engineering methods with computational chemistry methods. By bridging multiple disciplines, we provide an integral computational framework that could accelerate the discovery and implementation of novel biosynthetic production routes. Using this approach, we have identified and optimized a novel biosynthetic route for the production of 3HP from pyruvate. Copyright © 2011 Wiley Periodicals, Inc.

  8. The Experiment Method for Manufacturing Grid Development on Single Computer

    Institute of Scientific and Technical Information of China (English)

    XIAO Youan; ZHOU Zude

    2006-01-01

    In this paper, an experiment method for Manufacturing Grid application system development in a single-personal-computer environment is proposed. The characteristic of the proposed method is that it constructs a full prototype Manufacturing Grid application system hosted on a single personal computer using virtual machine technology. First, it builds all the Manufacturing Grid physical resource nodes on an abstraction layer of a single personal computer with virtual machine technology. Second, all the virtual Manufacturing Grid resource nodes are connected with a virtual network and the application software is deployed on each Manufacturing Grid node. We can then obtain a prototype Manufacturing Grid application system running on a single personal computer and carry out experiments on this foundation. Compared with the known experiment methods for Manufacturing Grid application system development, the proposed method retains their advantages, such as low cost and simple operation, and can produce reliable experimental results easily. The Manufacturing Grid application system constructed with the proposed method has high scalability, stability, and reliability, and can be migrated to a real application environment rapidly.

  9. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  10. Hamiltonian lattice field theory: Computer calculations using variational methods

    International Nuclear Information System (INIS)

    Zako, R.L.

    1991-01-01

    I develop a variational method for systematic numerical computation of physical quantities -- bound state energies and scattering amplitudes -- in quantum field theory. An infinite-volume, continuum theory is approximated by a theory on a finite spatial lattice, which is amenable to numerical computation. I present an algorithm for computing approximate energy eigenvalues and eigenstates in the lattice theory and for bounding the resulting errors. I also show how to select basis states and choose variational parameters in order to minimize errors. The algorithm is based on the Rayleigh-Ritz principle and Kato's generalizations of Temple's formula. The algorithm could be adapted to systems such as atoms and molecules. I show how to compute Green's functions from energy eigenvalues and eigenstates in the lattice theory, and relate these to physical (renormalized) coupling constants, bound state energies and Green's functions. Thus one can compute approximate physical quantities in a lattice theory that approximates a quantum field theory with specified physical coupling constants. I discuss the errors in both approximations. In principle, the errors can be made arbitrarily small by increasing the size of the lattice, decreasing the lattice spacing and computing sufficiently long. Unfortunately, I do not understand the infinite-volume and continuum limits well enough to quantify errors due to the lattice approximation. Thus the method is currently incomplete. I apply the method to real scalar field theories using a Fock basis of free particle states. All needed quantities can be calculated efficiently with this basis. The generalization to more complicated theories is straightforward. I describe a computer implementation of the method and present numerical results for simple quantum mechanical systems
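    The Rayleigh-Ritz step itself, projecting the Hamiltonian onto a finite basis and solving the resulting generalized eigenproblem, is easy to demonstrate for the 1D harmonic oscillator with a small Gaussian basis, where all matrix elements are analytic. This is purely illustrative; the thesis applies the principle to lattice field theory with a Fock basis:

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # H = -1/2 d^2/dx^2 + 1/2 x^2 in the non-orthogonal basis phi_a(x) = exp(-a x^2).
    alphas = np.array([0.1, 0.3, 0.7, 1.5])
    a, b = alphas[:, None], alphas[None, :]

    S = np.sqrt(np.pi / (a + b))           # overlap matrix <phi_a|phi_b>
    T = (a * b / (a + b)) * S              # kinetic energy matrix
    V = S / (4.0 * (a + b))                # potential energy matrix

    E = eigh(T + V, S, eigvals_only=True)  # generalized eigenproblem H c = E S c
    print(E[0])   # variational upper bound on the ground state; exact value is 0.5
    ```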

  11. Hamiltonian lattice field theory: Computer calculations using variational methods

    International Nuclear Information System (INIS)

    Zako, R.L.

    1991-01-01

    A variational method is developed for systematic numerical computation of physical quantities-bound state energies and scattering amplitudes-in quantum field theory. An infinite-volume, continuum theory is approximated by a theory on a finite spatial lattice, which is amenable to numerical computation. An algorithm is presented for computing approximate energy eigenvalues and eigenstates in the lattice theory and for bounding the resulting errors. It is shown how to select basis states and choose variational parameters in order to minimize errors. The algorithm is based on the Rayleigh-Ritz principle and Kato's generalizations of Temple's formula. The algorithm could be adapted to systems such as atoms and molecules. It is shown how to compute Green's functions from energy eigenvalues and eigenstates in the lattice theory, and relate these to physical (renormalized) coupling constants, bound state energies and Green's functions. Thus one can compute approximate physical quantities in a lattice theory that approximates a quantum field theory with specified physical coupling constants. The author discusses the errors in both approximations. In principle, the errors can be made arbitrarily small by increasing the size of the lattice, decreasing the lattice spacing and computing sufficiently long. Unfortunately, the author does not understand the infinite-volume and continuum limits well enough to quantify errors due to the lattice approximation. Thus the method is currently incomplete. The method is applied to real scalar field theories using a Fock basis of free particle states. All needed quantities can be calculated efficiently with this basis. The generalization to more complicated theories is straightforward. The author describes a computer implementation of the method and present numerical results for simple quantum mechanical systems

  12. Application of statistical method for FBR plant transient computation

    International Nuclear Information System (INIS)

    Kikuchi, Norihiro; Mochizuki, Hiroyasu

    2014-01-01

    Highlights: • A statistical method with a large trial number, up to 10,000, is applied to the plant system analysis. • A turbine trip test conducted at the “Monju” reactor is selected as the plant transient. • A reduction method for the number of trials is discussed. • The result with a reduced trial number can express the base regions of the computed distribution. -- Abstract: It is obvious that design tolerances, errors included in operation, and statistical errors in empirical correlations affect the transient behavior. The purpose of the present study is to apply the above-mentioned statistical errors to a plant system computation in order to evaluate the statistical distribution contained in the transient evolution. The selected computation case is the turbine trip test conducted at 40% electric power of the prototype fast reactor “Monju”. All of the heat transport systems of “Monju” are modeled with the NETFLOW++ system code, which has been validated using the plant transient tests of the experimental fast reactor Joyo and of “Monju”. The effects of parameters on the upper plenum temperature are confirmed by sensitivity analyses, and dominant parameters are chosen. The statistical errors are applied to each computation deck by using pseudorandom numbers and the Monte Carlo method. The dSFMT (Double precision SIMD-oriented Fast Mersenne Twister), a refined version of the Mersenne Twister (MT), is adopted as the pseudorandom number generator. In the present study, uniform random numbers are generated by dSFMT, and these random numbers are transformed to the normal distribution by the Box–Muller method. Ten thousand different computations are performed at once. In every computation case, the steady-state calculation is performed for 12,000 s, and the transient calculation is performed for 4000 s. For the purpose of the present statistical computation, it is important that the base regions of the distribution functions be calculated precisely. A large number of…
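    The Box-Muller step named above maps pairs of uniform variates to independent standard normals. A self-contained sketch (Python's default generator stands in for dSFMT here):

    ```python
    import math
    import random

    def box_muller(n):
        # Transform uniform pairs (u1, u2) into pairs of standard normal variates.
        out = []
        for _ in range((n + 1) // 2):
            u1 = 1.0 - random.random()   # in (0, 1], avoids log(0)
            u2 = random.random()
            rad = math.sqrt(-2.0 * math.log(u1))
            out.append(rad * math.cos(2.0 * math.pi * u2))
            out.append(rad * math.sin(2.0 * math.pi * u2))
        return out[:n]

    samples = box_muller(10_000)
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    print(mean, var)   # should be close to 0 and 1
    ```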

  13. Electron beam treatment planning: A review of dose computation methods

    International Nuclear Information System (INIS)

    Mohan, R.; Riley, R.; Laughlin, J.S.

    1983-01-01

    Various methods of dose computations are reviewed. The equivalent path length methods used to account for body curvature and internal structure are not adequate because they ignore the lateral diffusion of electrons. The Monte Carlo method for the broad field three-dimensional situation in treatment planning is impractical because of the enormous computer time required. The pencil beam technique may represent a suitable compromise. The behavior of a pencil beam may be described by the multiple scattering theory or, alternatively, generated using the Monte Carlo method. Although nearly two orders of magnitude slower than the equivalent path length technique, the pencil beam method improves accuracy sufficiently to justify its use. It applies very well when accounting for the effect of surface irregularities; the formulation for handling inhomogeneous internal structure is yet to be developed
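    The pencil-beam compromise can be caricatured in one dimension: precompute the lateral spread of a narrow beam, then superpose kernels across the open field. A sketch with a Gaussian kernel whose width grows with depth (the parameters are illustrative, not clinical data):

    ```python
    import numpy as np

    def broad_beam_profile(x, field_edges, depth, sigma0=0.2, growth=0.05):
        # Superpose Gaussian pencil kernels across the open field at a given depth.
        sigma = sigma0 + growth * depth   # lateral diffusion grows with depth
        pencils = np.linspace(field_edges[0], field_edges[1], 200)
        dose = np.zeros_like(x)
        for x0 in pencils:
            dose += np.exp(-0.5 * ((x - x0) / sigma) ** 2)
        return dose / dose.max()

    x = np.linspace(-5.0, 5.0, 101)
    profile = broad_beam_profile(x, (-2.0, 2.0), depth=3.0)
    print(profile[::10])   # the penumbra extends beyond the geometric field edge
    ```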

  14. Readability of Online Sources Regarding Meniscal Tears.

    Science.gov (United States)

    Hodax, Jonathan D; Baird, Grayson L; McBride, Trevor; Owens, Brett D

    2017-09-01

    Meniscal injuries are extremely common, with an incidence of 8.3 per 1,000 person-years in young, active individuals. Patients often turn to the Internet to glean information about their injuries, and even to guide decision making about treatment. Much research has demonstrated that a reading level of eighth grade or lower is appropriate for accurately communicating written information to patients, yet medical practitioners often fail to meet this requirement. To better examine the information patients receive about meniscal injuries, we set out to evaluate the reading level and content of the results for three commonly used search terms on the three search engines with the largest market share. The authors examined the keywords "meniscus tear," "meniscus tear treatment," and "knee pain meniscus" on the three highest-market-share search engines. The top 10 results from each search were included, and redundancies identified. Unique Web sites were evaluated for source, word count, reading level, and content, including advertisements, diagrams, photographs, nonoperative and operative options, and accurate medical information. A total of 23 unique Web sites were identified in our search, including 13 public education sources, 6 academic institutions, and 4 private physicians/groups. Average grade levels of articles ranged from 9.4 to 14.2 (mean, 11.14; standard deviation [SD] 1.46), and Flesch-Kincaid reading ease scores ranged from 23.9 to 68.7 (mean, 55.31; SD, 10.11). Pages from public sources required the highest level of readability (11.6, 95% confidence interval [CI]: 9.8-13.2), which was significantly higher than private (11.0, 95% CI: 9.3-12.7) and academic (10.9, 95% CI: 8.9-12.9) sources, p = 0.007 and p = 0.002, respectively. Further efforts to make appropriate health information available to patients are needed.

  15. The readability and suitability of sexual health promotion leaflets.

    Science.gov (United States)

    Corcoran, Nova; Ahmad, Fatuma

    2016-02-01

    To investigate the readability and suitability of sexual health promotion leaflets. Application of SMOG, FRY and SAM tests to assess the readability and suitability of a selection of sexual health leaflets. SMOG and FRY scores illustrate an average reading level of grade 9. SAM scores indicate that 59% of leaflets are superior in design and 41% are average in design. Leaflets generally perform well in the categories of content, literacy demand, typography and layout. They perform poorly in use of graphics, learning stimulation/motivation and cultural appropriateness. Sexual health leaflets have a reading level that is too high. Leaflets perform well on the suitability scores indicating they are reasonably suitable. There are a number of areas where sexual health leaflets could improve their design. Numerous practical techniques are suggested for improving the readability and suitability of sexual health leaflets. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Initial Readability Assessment of Clinical Trial Eligibility Criteria

    Science.gov (United States)

    Kang, Tian; Elhadad, Noémie; Weng, Chunhua

    2015-01-01

    Various search engines are available to clinical trial seekers. However, it remains unknown how comprehensible the clinical trial eligibility criteria used for recruitment are to a lay audience. This study initially investigated this problem. Readability of eligibility criteria was assessed according to (i) shallow and lexical characteristics, through the use of an established, generic readability metric; (ii) syntactic characteristics, through natural language processing techniques; and (iii) health terminological characteristics, through an automated comparison to technical and lay health texts. We further stratified clinical trials according to various study characteristics (e.g., source country or study type) to understand potential factors influencing readability. Mainly because of the frequent use of technical jargon, a college reading level was found to be necessary to understand eligibility criteria text, a level much higher than the average literacy level of the general American population. The use of technical jargon should be minimized to simplify eligibility criteria text. PMID:26958204

  17. Readability Assessment of Patient Information about Lymphedema and Its Treatment.

    Science.gov (United States)

    Seth, Akhil K; Vargas, Christina R; Chuang, Danielle J; Lee, Bernard T

    2016-02-01

    Patient use of online resources for health information is increasing, and access to appropriately written information has been associated with improved patient satisfaction and overall outcomes. The American Medical Association and the National Institutes of Health recommend that patient materials be written at a sixth-grade reading level. In this study, the authors simulated a patient search of online educational content for lymphedema and evaluated its readability. An online search for the term "lymphedema" was performed, and the first 12 hits were identified. User and location filters were disabled and sponsored results were excluded. Patient information from each site was downloaded and formatted into plain text. Readability was assessed using established tests: Coleman-Liau, Flesch-Kincaid, Flesch Reading Ease Index, FORCAST Readability Formula, Fry Graph, Gunning Fog Index, New Dale-Chall Formula, New Fog Count, Raygor Readability Estimate, and Simple Measure of Gobbledygook Readability Formula. There were 152 patient articles downloaded; the overall mean reading level was 12.6. Individual website reading levels ranged from 9.4 (cancer.org) to 16.7 (wikipedia.org). There were 36 articles dedicated to conservative treatments for lymphedema; surgical treatment was mentioned in nine articles across four sites. The average reading level for conservative management was 12.7, compared with 15.6 for surgery, a statistically significant difference. Online lymphedema materials vary widely in readability, and surgeons should direct patients to sites appropriate for their reading level. There is limited information about surgical treatment available on the most popular sites, and this information is significantly harder to read than the sections on conservative measures.

  18. Readability assessment of internet-based consumer health information.

    Science.gov (United States)

    Walsh, Tiffany M; Volsko, Teresa A

    2008-10-01

    A substantial amount of consumer health-related information is available on the Internet. Studies suggest that consumer comprehension may be compromised if content exceeds a 7th-grade reading level, which is the average American reading level identified by the United States Department of Health and Human Services (USDHHS). To determine the readability of Internet-based consumer health information offered by organizations that represent the top 5 medical-related causes of death in America. We hypothesized that the average readability (reading grade level) of Internet-based consumer health information on heart disease, cancer, stroke, chronic obstructive pulmonary disease, and diabetes would exceed the USDHHS-recommended reading level. From the Web sites of the American Heart Association, American Cancer Society, American Lung Association, American Diabetes Association, and American Stroke Association we randomly gathered 100 consumer-health-information articles. We assessed each article with 3 readability-assessment tools: SMOG (Simple Measure of Gobbledygook), Gunning FOG (Frequency of Gobbledygook), and Flesch-Kincaid Grade Level. We also categorized the articles per the USDHHS readability categories: easy to read (below 6th-grade level), average difficulty (7th- to 9th-grade level), and difficult (above 9th-grade level). Most of the articles exceeded the 7th-grade reading level and were in the USDHHS "difficult" category. The mean +/- SD readability score ranges were: SMOG 11.80 +/- 2.44 to 14.40 +/- 1.47, Flesch-Kincaid 9.85 +/- 2.25 to 11.55 +/- 0.76, and Gunning FOG 13.10 +/- 3.42 to 16.05 +/- 2.31. The articles from the American Lung Association had the lowest reading-level scores with each of the readability-assessment tools. Our findings confirm that Web-based medical information intended for consumer use is written above USDHHS-recommended reading levels. Compliance with these recommendations may increase the likelihood of consumer comprehension.

  19. Computational methods for three-dimensional microscopy reconstruction

    CERN Document Server

    Frank, Joachim

    2014-01-01

    Approaches to the recovery of three-dimensional information on a biological object, which are often formulated or implemented initially in an intuitive way, are concisely described here based on physical models of the object and the image-formation process. Both three-dimensional electron microscopy and X-ray tomography can be captured in the same mathematical framework, leading to closely-related computational approaches, but the methodologies differ in detail and hence pose different challenges. The editors of this volume, Gabor T. Herman and Joachim Frank, are experts in the respective methodologies and present research at the forefront of biological imaging and structural biology.   Computational Methods for Three-Dimensional Microscopy Reconstruction will serve as a useful resource for scholars interested in the development of computational methods for structural biology and cell biology, particularly in the area of 3D imaging and modeling.

  20. The readability of scientific texts is decreasing over time

    Science.gov (United States)

    2017-01-01

    Clarity and accuracy of reporting are fundamental to the scientific process. Readability formulas can estimate how difficult a text is to read. Here, in a corpus consisting of 709,577 abstracts published between 1881 and 2015 from 123 scientific journals, we show that the readability of science is steadily decreasing. Our analyses show that this trend is indicative of a growing use of general scientific jargon. These results are concerning for scientists and for the wider public, as they impact both the reproducibility and accessibility of research findings. PMID:28873054

  1. A readability comparison of anti- versus pro-influenza vaccination online messages in Japan

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Okuhara

    2017-06-01

    When health professionals prepare pro-influenza vaccination materials for publication online, we recommend they check for readability using readability assessment tools and improve the text for easy reading if necessary.

  2. Computations of finite temperature QCD with the pseudofermion method

    International Nuclear Information System (INIS)

    Fucito, F.; Solomon, S.

    1985-01-01

    The authors discuss the phase diagram of finite-temperature QCD as obtained when the effects of dynamical quarks are included by the pseudofermion method. They compare their results with those obtained by other groups and comment on the current state of the art for this kind of computation

  3. Multiscale methods in computational fluid and solid mechanics

    NARCIS (Netherlands)

    Borst, de R.; Hulshoff, S.J.; Lenz, S.; Munts, E.A.; Brummelen, van E.H.; Wall, W.; Wesseling, P.; Onate, E.; Periaux, J.

    2006-01-01

    First, an attempt is made towards gaining a more systematic understanding of recent progress in multiscale modelling in computational solid and fluid mechanics. Subsequently, the discussion is focused on variational multiscale methods for the compressible and incompressible Navier-Stokes equations

  4. Advanced scientific computational methods and their applications to nuclear technologies. (1) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (1)

    International Nuclear Information System (INIS)

    Oka, Yoshiaki; Okuda, Hiroshi

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of weft connecting each realm of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This is the first issue, showing their overview and an introduction to continuum simulation methods. The finite element method, as one of their applications, is also reviewed. (T. Tanaka)

  5. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs. The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  6. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...
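
    To make the idea of rigorous computation concrete, here is a minimal interval-arithmetic sketch. A genuinely rigorous implementation must round interval endpoints outward in floating point; that detail is omitted here, so this shows only the structural idea.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Interval:
            lo: float
            hi: float

            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def __sub__(self, other):
                return Interval(self.lo - other.hi, self.hi - other.lo)

            def __mul__(self, other):
                p = (self.lo * other.lo, self.lo * other.hi,
                     self.hi * other.lo, self.hi * other.hi)
                return Interval(min(p), max(p))

        # Enclose the range of f(x) = x*x - 2x over x in [1, 2]:
        x = Interval(1.0, 2.0)
        two = Interval(2.0, 2.0)
        print(x * x - two * x)  # Interval(lo=-3.0, hi=2.0) encloses the true range [-1, 0]

    The overestimation in the example (the so-called dependency problem) is one reason rigorous methods combine interval arithmetic with topological arguments rather than using it naively.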

  7. Method and system for environmentally adaptive fault tolerant computing

    Science.gov (United States)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.
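
    The abstract describes a decision loop: measure an environmental condition, weigh it against the processing system's sensitivity, and reconfigure fault tolerance accordingly. The sketch below is purely hypothetical; the names, thresholds, and redundancy modes are illustrative assumptions, not the patented implementation.

        # Hypothetical sketch of the decision step described above.
        def choose_fault_tolerance(radiation_level: float, sensitivity: float) -> str:
            """Pick a redundancy mode from a measured environmental condition."""
            risk = radiation_level * sensitivity   # illustrative risk metric
            if risk > 0.75:
                return "triple-modular-redundancy"  # vote across three replicas
            if risk > 0.25:
                return "duplex-with-comparison"     # two replicas, compare outputs
            return "simplex"                        # no redundancy needed

        print(choose_fault_tolerance(radiation_level=0.9, sensitivity=0.9))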

  8. Numerical evaluation of methods for computing tomographic projections

    International Nuclear Information System (INIS)

    Zhuang, W.; Gopal, S.S.; Hebert, T.J.

    1994-01-01

    Methods for computing forward/back projections of 2-D images can be viewed as numerical integration techniques. The accuracy of any ray-driven projection method can be improved by increasing the number of ray-paths that are traced per projection bin. The accuracy of pixel-driven projection methods can be increased by dividing each pixel into a number of smaller sub-pixels and projecting each sub-pixel. The authors compared four competing methods of computing forward/back projections: bilinear interpolation, ray-tracing, pixel-driven projection based upon sub-pixels, and pixel-driven projection based upon circular, rather than square, pixels. This latter method is equivalent to a fast, bi-nonlinear interpolation. These methods and the choice of the number of ray-paths per projection bin or the number of sub-pixels per pixel present a trade-off between computational speed and accuracy. To solve the problem of assessing backprojection accuracy, the analytical inverse Fourier transform of the ramp filtered forward projection of the Shepp and Logan head phantom is derived
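
    As a concrete illustration of the sub-pixel scheme described above, the following sketch computes a parallel-beam forward projection in which each pixel is split into subdiv x subdiv sub-pixels whose values are deposited into the nearest detector bin; the geometry and nearest-bin accumulation are simplifying assumptions. Raising subdiv trades computation time for accuracy, which is exactly the trade-off the authors evaluate.

        import numpy as np

        def pixel_driven_projection(img, theta, n_bins, subdiv=1):
            """Each (sub-)pixel deposits its value into the nearest detector bin."""
            n = img.shape[0]
            c, s = np.cos(theta), np.sin(theta)
            offs = (np.arange(subdiv) + 0.5) / subdiv - 0.5  # sub-pixel centre offsets
            proj = np.zeros(n_bins)
            ys, xs = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
            for dy in offs:
                for dx in offs:
                    t = (xs + dx) * c + (ys + dy) * s  # signed detector coordinate
                    bins = np.clip(np.round(t + n_bins / 2).astype(int), 0, n_bins - 1)
                    np.add.at(proj, bins.ravel(), img.ravel() / subdiv**2)
            return proj

        img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0
        print(pixel_driven_projection(img, np.pi / 4, 96, subdiv=4).round(2))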

  9. Evaluation of Quality and Readability of Health Information Websites Identified through India’s Major Search Engines

    Directory of Open Access Journals (Sweden)

    S. Raj

    2016-01-01

    Full Text Available Background. The health information available on websites should be reliable and accurate in order for the community to make informed decisions. This study was done to assess the quality and readability of health information websites on the World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words “Health” and “Information” were used on the search engines “Google” and “Yahoo.” Out of 50 websites (25 from each search engine), after exclusion, 32 websites were evaluated. The LIDA tool was used to assess quality, whereas readability was assessed using the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of websites (n=13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n=16) of websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on the LIDA score. Only five had readability scores at the recommended sixth-grade level. Conclusion. Most health information websites had average quality, especially in terms of usability and reliability, and were written at high readability levels. Efforts are needed to develop health information websites which can help the general population in informed decision making.

  10. [Systematic Readability Analysis of Medical Texts on Websites of German University Clinics for General and Abdominal Surgery].

    Science.gov (United States)

    Esfahani, B Janghorban; Faron, A; Roth, K S; Grimminger, P P; Luers, J C

    2016-12-01

    Background: Besides functioning as one of the main contact points, websites of hospitals serve as medical information portals. As medical information texts should be understood by all patients, independent of their literacy skills and educational level, online texts should have an appropriate structure to ease understandability. Materials and Methods: Patient information texts on the websites of clinics for general surgery at German university hospitals (n = 36) were systematically analysed. For 9 different surgical topics, representative medical information texts were extracted from each website. Using common readability tools and 5 different readability indices, the texts were analysed concerning their readability and structure. The analysis was furthermore stratified in relation to geographical regions in Germany. Results: For the definite analysis the texts of 196 internet websites could be used. On average the texts consisted of 25 sentences and 368 words. The readability analysis tools consistently showed that all texts had rather low readability, demanding a high literacy level from the readers. Conclusion: Patient information texts on German university hospital websites are difficult to understand for most patients. To fulfil the ambition of informing the general population adequately about medical issues, a revision of most medical texts on the websites of German surgical hospitals is recommended.

  11. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand computation's role in the essentially Popperian scientific method. In this paper, some of the problems with computation (for example, the long-term unquantifiable presence of undiscovered defects, problems with programming languages, and process issues) will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging, in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  12. Readability, content, and quality of online patient education materials on preeclampsia.

    Science.gov (United States)

    Lange, Elizabeth M S; Shah, Anuj M; Braithwaite, Brian A; You, Whitney B; Wong, Cynthia A; Grobman, William A; Toledo, Paloma

    2015-01-01

    The objective of this study was to evaluate the readability, content, and quality of patient education materials addressing preeclampsia. Websites of U.S. obstetrics and gynecology residency programs were searched for patient education materials. Readability, content, and quality were assessed. A one-sample t-test was used to evaluate mean readability level compared with the recommended 6th-grade reading level. Mean readability levels were significantly higher than the recommended level using all indices; patient education materials addressing preeclampsia should be improved.

  13. Computational biology in the cloud: methods and new insights from computing at scale.

    Science.gov (United States)

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, and experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and provide easy reproducibility by making the datasets and computational methods easily available.

  14. Computational Methods for Modeling Aptamers and Designing Riboswitches

    Directory of Open Access Journals (Sweden)

    Sha Gong

    2017-11-01

    Full Text Available Riboswitches, which are located within certain noncoding RNA regions, function as genetic “switches”, regulating when and where genes are expressed in response to certain ligands. Understanding the numerous functions of riboswitches requires computational models to predict structures and structural changes of the aptamer domains. Although aptamers often form a complex structure, computational approaches, such as RNAComposer and Rosetta, have already been applied to model the tertiary (three-dimensional, 3D) structure of several aptamers. As structural changes in aptamers must be achieved within a certain time window for effective regulation, kinetics is another key point for understanding aptamer function in riboswitch-mediated gene regulation. The coarse-grained self-organized polymer (SOP) model using Langevin dynamics simulation has been successfully developed to investigate the folding kinetics of aptamers, while their co-transcriptional folding kinetics can be modeled by the helix-based computational method and the BarMap approach. Based on the known aptamers, the web server Riboswitch Calculator and other theoretical methods provide a new tool to design synthetic riboswitches. This review presents an overview of these computational methods for modeling the structure and kinetics of riboswitch aptamers and for designing riboswitches.

  15. Using Readability Tests to Improve the Accuracy of Evaluation Documents Intended for Low-Literate Participants

    Science.gov (United States)

    Kouame, Julien B.

    2010-01-01

    Background: Readability tests are indicators that measure how easily a document can be read and understood. Simple, but very often ignored, readability statistics can not only provide information about the level of difficulty of particular documents but can also increase an evaluator's credibility. Purpose: The purpose of this…

  16. A Study of Readability of Texts in Bangla through Machine Learning Approaches

    Science.gov (United States)

    Sinha, Manjira; Basu, Anupam

    2016-01-01

    In this work, we have investigated text readability in the Bangla language. Text readability is an indicator of the suitability of a given document with respect to a target reader group. Therefore, text readability has a huge impact on educational content preparation. The advances in the field of natural language processing have enabled the automatic…

  17. 6 CFR 37.19 - Machine readable technology on the driver's license or identification card.

    Science.gov (United States)

    2010-01-01

    § 37.19: For the machine readable portion of the REAL ID driver's license or identification card...

  18. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
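
    For readers new to the method, the core of FDTD is a leapfrog update of electric and magnetic fields on a staggered grid. Below is a minimal one-dimensional sketch in normalized units with a Gaussian pulse source; it is an illustrative toy, not code from the book.

        import numpy as np

        # Minimal 1-D FDTD (Yee leapfrog) in normalized units, Courant number 1/2.
        nz, nt = 200, 400
        ez = np.zeros(nz)        # electric field samples
        hy = np.zeros(nz - 1)    # magnetic field samples, staggered half a cell
        courant = 0.5

        for n in range(nt):
            hy += courant * (ez[1:] - ez[:-1])            # update H from curl of E
            ez[1:-1] += courant * (hy[1:] - hy[:-1])      # update E from curl of H
            ez[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # additive Gaussian source

        print("peak |Ez| =", np.abs(ez).max())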

  19. A Computationally Efficient Method for Polyphonic Pitch Estimation

    Directory of Open Access Journals (Sweden)

    Ruohua Zhou

    2009-01-01

    Full Text Available This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimation is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then the incorrect estimations are removed according to spectral irregularity and knowledge of the harmonic structures of the notes played on commonly used musical instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and results demonstrate the high performance and computational efficiency of the approach.
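
    The harmonic-grouping idea can be illustrated without the RTFI front end: the sketch below is a simplified FFT-based harmonic-summation stand-in, not the paper's method, and its frequency range and harmonic count are illustrative assumptions.

        import numpy as np

        def pitch_salience(signal, sr, f_min=55.0, f_max=880.0, n_harm=5):
            """Score candidate fundamentals by summing spectral energy at
            their harmonics (harmonic grouping); peaks suggest pitches."""
            spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
            freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
            candidates = np.arange(f_min, f_max, 1.0)
            salience = np.zeros(len(candidates))
            for i, f0 in enumerate(candidates):
                for h in range(1, n_harm + 1):
                    salience[i] += spec[np.argmin(np.abs(freqs - h * f0))]
            return candidates, salience

        sr = 8000
        t = np.arange(sr) / sr                                   # one second of audio
        tone = np.sin(2*np.pi*220*t) + np.sin(2*np.pi*330*t)     # two notes: A3 + E4
        cands, sal = pitch_salience(tone, sr)
        print("strongest candidates (Hz):", cands[np.argsort(sal)[-5:]])

    Note that subharmonics of the true pitches also score highly under plain harmonic summation, which is exactly why the method above needs its second stage to prune incorrect estimations.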

  20. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm for random generation of multivariate probability distributions, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
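
    As a concrete illustration of the family of methods surveyed, here is a tiny real-coded genetic algorithm (tournament selection, blend crossover, Gaussian mutation) minimizing the sphere function. All parameters are illustrative assumptions; production GAs typically add elitism and adaptive operators.

        import random

        def genetic_minimize(f, dim=2, pop_size=40, gens=100, sigma=0.3):
            """Tiny real-coded GA: tournament selection, blend crossover, mutation."""
            pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(gens):
                def tournament():
                    a, b = random.sample(pop, 2)
                    return a if f(a) < f(b) else b
                children = []
                for _ in range(pop_size):
                    p1, p2 = tournament(), tournament()
                    w = random.random()
                    child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]  # crossover
                    child = [x + random.gauss(0, sigma) for x in child]    # mutation
                    children.append(child)
                pop = children
            return min(pop, key=f)

        # Minimize the sphere function; the optimum is at the origin.
        best = genetic_minimize(lambda v: sum(x * x for x in v))
        print("best solution found:", [round(x, 3) for x in best])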

  1. Variational-moment method for computing magnetohydrodynamic equilibria

    International Nuclear Information System (INIS)

    Lao, L.L.

    1983-08-01

    A fast yet accurate method to compute magnetohydrodynamic equilibria is provided by the variational-moment method, which is similar to the classical Rayleigh-Ritz-Galerkin approximation. The equilibrium solution sought is decomposed into a spectral representation. The partial differential equations describing the equilibrium are then recast into their equivalent variational form and systematically reduced to an optimum finite set of coupled ordinary differential equations. An appropriate spectral decomposition can make the series representing the solution converge rapidly and hence substantially reduces the amount of computational time involved. The moment method was developed first to compute fixed-boundary inverse equilibria in axisymmetric toroidal geometry, and was demonstrated to be both efficient and accurate. The method since has been generalized to calculate free-boundary axisymmetric equilibria, to include toroidal plasma rotation and pressure anisotropy, and to treat three-dimensional toroidal geometry. In all these formulations, the flux surfaces are assumed to be smooth and nested so that the solutions can be decomposed in Fourier series in inverse coordinates. These recent developments and the advantages and limitations of the moment method are reviewed. The use of alternate coordinates for decomposition is discussed.

  2. Computer-aided methods of determining thyristor thermal transients

    International Nuclear Information System (INIS)

    Lu, E.; Bronner, G.

    1988-08-01

    An accurate tracing of the thyristor thermal response is investigated. This paper offers several alternatives for thermal modeling and analysis by using an electrical circuit analog: topological method, convolution integral method, etc. These methods are adaptable to numerical solutions and well suited to the use of the digital computer. The thermal analysis of thyristors was performed for the 1000 MVA converter system at the Princeton Plasma Physics Laboratory. Transient thermal impedance curves for individual thyristors in a given cooling arrangement were known from measurements and from manufacturer's data. The analysis pertains to almost any loading case, and the results are obtained in a numerical or a graphical format. 6 refs., 9 figs
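
    A minimal sketch of the convolution-integral idea mentioned above: by superposition, the junction temperature rise is the sum of each power step multiplied by the transient thermal impedance elapsed since that step. The Foster-network parameters and the load profile below are made-up illustrative values, not data from the PPPL converter.

        import numpy as np

        # Foster-network transient thermal impedance (illustrative values, K/W and s):
        R = np.array([0.02, 0.05, 0.08]); tau = np.array([0.001, 0.02, 0.5])
        dt = 1e-3
        t = np.arange(0, 2.0, dt)
        zth = (R[None, :] * (1 - np.exp(-t[:, None] / tau[None, :]))).sum(axis=1)

        # Pulsed power loss: 1 kW for 0.2 s every 0.5 s.
        p = np.where((t % 0.5) < 0.2, 1000.0, 0.0)

        # Superposition: temperature rise = sum over power *steps* of step * Zth.
        dp = np.diff(p, prepend=0.0)
        temp_rise = np.convolve(dp, zth)[: len(t)]
        print("peak junction temperature rise: %.1f K" % temp_rise.max())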

  3. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  4. Fast calculation method for computer-generated cylindrical holograms.

    Science.gov (United States)

    Yamaguchi, Takeshi; Fujii, Tomohiko; Yoshikawa, Hiroshi

    2008-07-01

    Since a general flat hologram has a limited viewable area, we usually cannot see the other side of a reconstructed object. There are some holograms that can solve this problem. A cylindrical hologram is well known to be viewable in 360 deg. Most cylindrical holograms are optical holograms, but there are few reports of computer-generated cylindrical holograms. Computer-generated cylindrical holograms are scarce because the spatial resolution of output devices is not great enough; one therefore has to make a large hologram or use a small object to fulfill the sampling theorem. In addition, in calculating the large fringe, the calculation amount increases in proportion to the hologram size. Therefore, we propose what we believe to be a new method for fast calculation. Then, we print these fringes with our prototype fringe printer. As a result, we obtain a good reconstructed image from a computer-generated cylindrical hologram.

  5. Computational methods in metabolic engineering for strain design.

    Science.gov (United States)

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve the runtime performances have also been developed, which allow for more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms.
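
    Many strain-design tools in this literature build on flux balance analysis, which the sketch below illustrates on a made-up three-reaction toy network: maximize product flux subject to the steady-state constraint S·v = 0 and flux bounds. The network, bounds, and objective are assumptions for illustration only.

        import numpy as np
        from scipy.optimize import linprog

        # Metabolites: A, B. Reactions: v0 uptake -> A, v1 A -> B, v2 B -> product.
        S = np.array([[ 1, -1,  0],    # A balance
                      [ 0,  1, -1]])   # B balance
        bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 units
        c = [0, 0, -1]                 # linprog minimizes, so negate product flux v2

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal fluxes:", res.x)  # expect [10, 10, 10]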

  6. Method of Computer-aided Instruction in Situation Control Systems

    Directory of Open Access Journals (Sweden)

    Anatoliy O. Kargin

    2013-01-01

    Full Text Available The article considers the problem of computer-aided instruction in a context-chain motivated situation control system of complex technical system behavior. The conceptual and formal models of situation control with practical instruction are considered. Acquisition of new behavior knowledge is presented as structural changes in system memory in the form of a situational agent set. The model and method of computer-aided instruction represent a formalization based on the nondistinct theories of physiologists and cognitive psychologists. The formal instruction model describes situation and reaction formation and their dependence on different parameters affecting education, such as the reinforcement value and the time between the stimulus, the action, and the reinforcement. The change of the contextual link between situational elements during use is formalized. Examples and results are given of computer instruction experiments with the robot device “LEGO MINDSTORMS NXT”, equipped with ultrasonic distance, touch, and light sensors.

  7. A new fault detection method for computer networks

    International Nuclear Information System (INIS)

    Lu, Lu; Xu, Zhengguo; Wang, Wenhai; Sun, Youxian

    2013-01-01

    Over the past few years, fault detection for computer networks has attracted extensive attention for its importance in network management. Most existing fault detection methods are based on active probing techniques, which can detect the occurrence of faults fast and precisely. But these methods suffer from the limitation of traffic overhead, especially in large scale networks. To relieve the traffic overhead induced by active probing based methods, a new fault detection method, whose key idea is to divide the detection process into multiple stages, is proposed in this paper. During each stage, only a small region of the network is detected by using a small set of probes. Meanwhile, it also ensures that the entire network can be covered after multiple detection stages. This method can guarantee that the traffic used by probes during each detection stage is sufficiently small that the network can operate without severe disturbance from probes. Several simulation results verify the effectiveness of the proposed method.

  8. Contents and readability of currently used surgical/ procedure ...

    African Journals Online (AJOL)

    Conclusion: The content of the majority of the informed consent forms used in Nigerian tertiary health institutions is poor, and their readability scores are no better than those used in developed parts of the world. Health institutions in Nigeria should revise their informed consent forms to improve their contents and do a usability ...

  9. Reading apps for children: Readability from the design perspective

    Science.gov (United States)

    Mohammed, Wasan Abdulwahab; Husni, Husniza

    2017-10-01

    Electronic reading for young children opens new avenues, especially with the advance of modern reading devices. The readability of mobile learning applications has received extensive attention from designers and developers. The reason for such concern is its importance in determining usability-related issues, especially in the design context for children. In many cases, children find it difficult to interact with mobile reading apps. This is because apps for reading and for entertainment require different features. As such, this study set out three objectives: 1) to evaluate five reading apps for young children from design perspectives; 2) to examine the readability of existing mobile reading apps; and 3) to propose and evaluate a mobile-app UI guideline for readability. Readability indices, observations, and interviews were conducted with students aged 6-8 years. The results showed that certain reading apps provide a better reading experience for children than others. The reasons are mostly related to the design characteristics embedded within the app. In addition, the use of animation was found to stimulate children's reading experience, as it is believed to offer the interactivity elements needed to gain their interest and willingness to read. These findings provide recommendations and insights for designers of reading apps for children.

  10. Readability of Healthcare Literature for Hepatitis B and C.

    Science.gov (United States)

    Meillier, Andrew; Patel, Shyam; Al-Osaimi, Abdullah M S

    2015-12-01

    Patients increasingly use the Internet for educational material concerning health and diseases. This information can be used to teach the population about hepatitis B and C if it is properly written at the grade level of the intended patient population. We explored the readability of online resources concerning hepatitis B and C. Google searches were performed for "Hepatitis B" and "Hepatitis C." The Internet resources that were intended for patient education were used, with specific exclusions. Articles were taken from 19 and 23 different websites focusing on the symptoms, diagnosis, and treatment of hepatitis B and C, respectively. The articles were analyzed using Readability Studio Professional Edition (Oleander Solutions, Vandalia, OH) using 10 different readability scales. The results were compared and averaged to identify the anticipated academic grade level required to understand the information. The average readability scores of the 10 scales had ranges of 9.7-16.4 for hepatitis B and 9.2-16.4 for hepatitis C. The average academic reading grade level for hepatitis B was 12.6 ± 2.1 and for hepatitis C was 12.7 ± 2.1. There was no significant discrepancy between the averaged grade levels of the hepatitis B and C Internet resources. The resources accessed by patients are written above the grade level previously determined as necessary for patients to properly understand the intended information. The American Medical Association recommends that material be simplified to below the sixth-grade level to benefit the greatest proportion of the patient population.

  11. Readability of Internet Information on Hearing: Systematic Literature Review.

    Science.gov (United States)

    Laplante-Lévesque, Ariane; Thorén, Elisabet Sundewall

    2015-09-01

    This systematic literature review asks the following question: “What is the readability of Internet information on hearing that people with hearing impairment and their significant others can access in the context of their hearing care?” Searches were completed in three databases: CINAHL, PubMed, and Scopus. Seventy-eight records were identified and systematically screened for eligibility: 8 records were included that contained data on the readability of Internet information on hearing that people with hearing impairment and their significant others can access in the context of their hearing care. Records reported mean readability levels from 9 to over 14. In other words, people with hearing impairment and their significant others need 9 to 14 years of education to read and understand Internet information on hearing that they access in the context of their hearing care. The poor readability of Internet information on hearing has been well documented; it is time to focus on valid and sustainable initiatives that address this problem.

  12. Tools for Assessing Readability of Statistics Teaching Materials

    Science.gov (United States)

    Lesser, Lawrence; Wagler, Amy

    2016-01-01

    This article provides tools and rationale for instructors in math and science to make their assessment and curriculum materials (more) readable for students. The tools discussed (MSWord, LexTutor, Coh-Metrix TEA) are readily available linguistic analysis applications that are grounded in current linguistic theory, but present output that can…

  13. Reassessing the Accuracy and Use of Readability Formulae

    Science.gov (United States)

    Janan, Dahlia; Wray, David

    2014-01-01

    Purpose: The purpose of the study is to review readability formulae and offer a critique, based on a comparison of the grading of a variety of texts given by six well-known formulae. Methodology: A total of 64 texts in English were selected either by or for native English speaking children aged between six and 11 years. Each text was assessed…

  14. Readability of Informed Consent Documents at University Counseling Centers

    Science.gov (United States)

    Lustgarten, Samuel D.; Elchert, Daniel M.; Cederberg, Charles; Garrison, Yunkyoung L.; Ho, Y. C. S.

    2017-01-01

    The extent to which clients understand the nature and anticipated course of therapy is referred to as informed consent. Counseling psychologists often provide informed consent documents to enhance the education of services and for liability purposes. Professionals in numerous health care settings have evaluated the readability of their informed…

  15. Readability of Malaria Medicine Information Leaflets in Nigeria

    African Journals Online (AJOL)

    Erah

    2010-12-18

    ... for malaria medicines information leaflets available in Nigeria was 13.69 ± 1.70. This value is equivalent ... Health promotion and behaviour change communication ... their rational use at the community level. Readability is the ...

  16. The Readability of Malaysian English Children Books: A Multilevel Analysis

    Directory of Open Access Journals (Sweden)

    Adlina Ismail

    2016-11-01

    Full Text Available These days, more English books for children are being published by local publishers in Malaysia. This is a positive development because the books will be more accessible to children. However, the books have never been studied and evaluated in depth. One important factor in assessing reading materials is readability. Readability determines whether a text is easy or difficult to understand, and a balanced mix of both can promote learning and language development. Various researchers have proposed a multilevel framework of discourse that any language assessment of a text should take into account. The proposed levels are word, syntax, textbase, situation model, and genre and rhetorical structure. Traditional readability measures such as the Flesch Reading Ease formula, the Gunning Readability Index, the Fog Count, and the Fry Grade Level cannot address these levels because they are based on shallow variables. In contrast, Coh-Metrix TERA provides five indices that are correlated with grade level and aligned to the multilevel framework. This study analyzed ten Malaysian English chapter books for children using Coh-Metrix TERA. The results revealed that the Malaysian English children's books were easy at the shallow levels, but there was possible difficulty at the textbase and situation-model levels because of a lack of cohesion. In conclusion, more attention should be given to deeper levels of text rather than just the word and syntax levels.

  17. An Analysis of the Readability of Financial Accounting Textbooks.

    Science.gov (United States)

    Smith, Gerald; And Others

    1981-01-01

    The Flesch formula was used to calculate the readability of 15 financial accounting textbooks. The 15 textbooks represented introductory, intermediate, and advanced levels and also were classified by five different publishers. Two-way analysis of variance and Tukey's post hoc analysis revealed some significant differences. (Author/CT)

  18. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  19. Computing homography with RANSAC algorithm: a novel method of registration

    Science.gov (United States)

    Li, Xiaowei; Liu, Yue; Wang, Yongtian; Yan, Dayuan

    2005-02-01

    An AR (Augmented Reality) system can integrate computer-generated objects with the image sequences of real world scenes in either an off-line or a real-time way. Registration, or camera pose estimation, is one of the key techniques that determine its performance. Registration methods can be classified as model-based and move-matching. The former approach can accomplish relatively accurate registration results, but it requires a precise model of the scene, which is hard to obtain. The latter approach carries out registration by computing the ego-motion of the camera. Because it does not require prior knowledge of the scene, its registration results sometimes turn out to be less accurate. When the model is as simple as a plane, a mixed method is introduced to take advantage of the virtues of the two methods mentioned above. Although unexpected objects often occlude this plane in an AR system, one can still try to detect corresponding points with a contract-expand method, though this introduces erroneous correspondences. Computing the homography with the RANSAC algorithm is used to overcome such shortcomings. Using the robustly estimated homography resulting from RANSAC, the camera projective matrix can be recovered and thus registration is accomplished even when the markers are lost in the scene.
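
    A minimal sketch of robust homography estimation in the spirit described above, using OpenCV. The synthetic image pair and the ORB detector are assumptions standing in for the paper's own correspondence step (the paper predates ORB); cv2.findHomography with the RANSAC flag performs the robust fit.

        import cv2
        import numpy as np

        # Synthetic test pair: a random texture and a perspective-warped copy,
        # so the sketch is self-contained (in practice these are camera frames).
        ref = (np.random.default_rng(0).random((480, 640)) * 255).astype(np.uint8)
        H_true = np.array([[1.0, 0.02, 15], [-0.01, 1.0, 8], [1e-5, 0, 1]])
        cur = cv2.warpPerspective(ref, H_true, (640, 480))

        orb = cv2.ORB_create(nfeatures=2000)
        k1, d1 = orb.detectAndCompute(ref, None)
        k2, d2 = orb.detectAndCompute(cur, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        # RANSAC repeatedly fits H to random 4-point samples and keeps the largest
        # consensus set; 'mask' flags the inlier correspondences that survive.
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
        print("inliers: %d / %d" % (int(mask.sum()), len(matches)))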

  20. Pair Programming as a Modern Method of Teaching Computer Science

    Directory of Open Access Journals (Sweden)

    Irena Nančovska Šerbec

    2008-10-01

    Full Text Available At the Faculty of Education, University of Ljubljana, we educate future computer science teachers. Besides didactical, pedagogical, mathematical and other interdisciplinary knowledge, students gain knowledge and skills of programming that are crucial for computer science teachers. For all courses, the main emphasis is the absorption of professional competences related to the teaching profession and the programming profile. The latter are selected according to the well-known document, the ACM Computing Curricula. The professional knowledge is therefore associated and combined with the teaching knowledge and skills. In the paper we present how to achieve competences related to programming by using different didactical models (semiotic ladder, cognitive objectives taxonomy, problem solving) and the modern teaching method “pair programming”. Pair programming differs from standard methods (individual work, seminars, projects, etc.). It belongs to extreme programming as a discipline of software development and is known to have positive effects on teaching a first programming language. We have experimentally observed pair programming in the introductory programming course. The paper presents and analyzes the results of using this method: the aspects of satisfaction during programming and the level of gained knowledge. The results are in general positive and demonstrate the promising usage of this teaching method.

  1. Applications of meshless methods for damage computations with finite strains

    International Nuclear Information System (INIS)

    Pan Xiaofei; Yuan Huang

    2009-01-01

    Material defects such as cavities have great effects on the damage process in ductile materials. Computations based on finite element methods (FEMs) often suffer from instability due to material failure as well as large distortions. To improve computational efficiency and robustness, the element-free Galerkin (EFG) method is applied with the micro-mechanical constitutive damage model proposed by Gurson and modified by Tvergaard and Needleman (the GTN damage model). The EFG algorithm is implemented in the general purpose finite element code ABAQUS via the user interface UEL. With the help of the EFG method, damage processes in uniaxial tension specimens and notched specimens are analyzed and verified with experimental data. Computational results reveal that damage which takes place in the interior of specimens extends to the exterior and causes fracture of the specimens; the damage process is fast relative to the whole tension process. The EFG method provides a more stable and robust numerical solution in comparison with the FEM analysis.

  2. Improved computation method in residual life estimation of structural components

    Directory of Open Access Journals (Sweden)

    Maksimović Stevan M.

    2013-01-01

    Full Text Available This work considers numerical computation methods and procedures for predicting fatigue crack growth in cracked, notched structural components. The computation method is based on fatigue life prediction using the strain energy density approach. Based on strain energy density (SED) theory, a fatigue crack growth model is developed to predict the lifetime of fatigue crack growth for single or mixed mode cracks. The model is based on an equation expressed in terms of low cycle fatigue parameters. Attention is focused on crack growth analysis of structural components under variable amplitude loads. Crack growth is largely influenced by the effect of the plastic zone at the front of the crack. To obtain an efficient computation model, the plasticity-induced crack closure phenomenon is considered during fatigue crack growth. The use of the strain energy density method is efficient for fatigue crack growth prediction under cyclic loading in damaged structural components. The strain energy density method is easy for engineering applications since it does not require any additional determination of fatigue parameters (those would need to be separately determined for the fatigue crack propagation phase); low cyclic fatigue parameters are used instead. Accurate determination of fatigue crack closure has been a complex task for years. The influence of this phenomenon can be considered by means of experimental and numerical methods. Both of these models are considered. Finite element analysis (FEA) has been shown to be a powerful and useful tool to analyze crack growth and crack closure effects. Computational results are compared with available experimental results. [Project of the Ministry of Science of the Republic of Serbia, no. OI 174001]

  3. Readability Trends of Online Information by the American Academy of Otolaryngology-Head and Neck Surgery Foundation.

    Science.gov (United States)

    Wong, Kevin; Levi, Jessica R

    2017-01-01

    Objective Previous studies have shown that patient education materials published by the American Academy of Otolaryngology-Head and Neck Surgery Foundation may be too difficult for the average reader to understand. The purpose of this study was to determine if current educational materials show improvements in readability. Study Design Cross-sectional analysis. Setting The Patient Health Information section of the American Academy of Otolaryngology-Head and Neck Surgery Foundation website. Subjects and Methods All patient education articles were extracted in plain text. Webpage navigation, references, author information, appointment information, acknowledgments, and disclaimers were removed. Follow-up editing was also performed to remove paragraph breaks, colons, semicolons, numbers, percentages, and bullets. Readability grade was calculated with the Flesch-Kincaid Grade Level, Flesch Reading Ease, Gunning-Fog Index, Coleman-Liau Index, Automated Readability Index, and Simple Measure of Gobbledygook. Intra- and interobserver reliability were assessed. Results A total of 126 articles from 7 topics were analyzed. Readability levels across all 6 tools showed that the difficulty of patient education materials exceeded the abilities of an average American. As compared with previous studies, current educational materials by the American Academy of Otolaryngology-Head and Neck Surgery Foundation have shown a decrease in difficulty. Intra- and interobserver reliability were both excellent, with intraclass coefficients of 0.99 and 0.96, respectively. Conclusion The improvement in readability is an encouraging finding, and one that is consistent with recent trends toward improved health literacy. Nevertheless, online patient educational material is still too difficult for the average reader. Revisions may be necessary for current materials to benefit a larger readership.

  4. Readability of Online Patient Education Materials Related to IR.

    Science.gov (United States)

    McEnteggart, Gregory E; Naeem, Muhammad; Skierkowski, Dorothy; Baird, Grayson L; Ahn, Sun H; Soares, Gregory

    2015-08-01

    To assess the readability of online patient education materials (OPEM) related to common diseases treated by and procedures performed by interventional radiology (IR). The following websites were chosen based on their average Google search return for each IR OPEM content area examined in this study: Society of Interventional Radiology (SIR), Cardiovascular and Interventional Radiological Society of Europe (CIRSE), National Library of Medicine, RadiologyInfo, Mayo Clinic, WebMD, and Wikipedia. IR OPEM content area was assessed for the following: peripheral arterial disease, central venous catheter, varicocele, uterine artery embolization, vertebroplasty, transjugular intrahepatic portosystemic shunt, and deep vein thrombosis. The following algorithms were used to estimate and compare readability levels: Flesch-Kincaid Grade Formula, Flesch Reading Ease Score, Gunning Frequency of Gobbledygook, Simple Measure of Gobbledygook, and Coleman-Liau Index. Data were analyzed using general mixed modeling. On average, online sources that required beyond high school grade-level readability were Wikipedia (15.0), SIR (14.2), and RadiologyInfo (12.4); sources that required high school grade-level readability were CIRSE (11.3), Mayo Clinic (11.0), WebMD (10.6), and National Library of Medicine (9.0). On average, OPEM on uterine artery embolization, vertebroplasty, varicocele, and peripheral arterial disease required the highest level of readability (12.5, 12.3, 12.3, and 12.2, respectively). The IR OPEM assessed in this study were written above the recommended sixth-grade reading level and the health literacy level of the average American adult. Many patients in the general public may not have the ability to read and understand health information in IR OPEM.

  5. The Readability of Online Resources for Mastopexy Surgery.

    Science.gov (United States)

    Vargas, Christina R; Chuang, Danielle J; Lee, Bernard T

    2016-01-01

    As more patients use Internet resources for health information, there is increasing interest in evaluating the readability of available online materials. The National Institutes of Health and American Medical Association recommend that patient educational content be written at a sixth-grade reading level. This study evaluates the most popular online resources for information about mastopexy relative to average adult literacy in the United States. The 12 most popular sites returned by the largest Internet search engine were identified using the search term "breast lift surgery." Relevant articles from the main sites were downloaded and formatted into text documents. Pictures, captions, links, and references were excluded. The readability of these 100 articles was analyzed overall and subsequently by site using 10 established readability tests. Subgroup analysis was performed for articles discussing the benefits of surgery and those focusing on risks. The overall average readability of online patient information was 13.3 (range, 11.1-15). There was a range of average readability scores overall across the 12 sites from 8.9 to 16.1, suggesting that some may be more appropriate than others for patient demographics with different health literacy levels. Subgroup analysis revealed that articles discussing the risks of mastopexy were significantly harder to read (mean, 14.1) than articles about benefits (11.6). Patient-directed articles from the most popular online resources for mastopexy information are uniformly above the recommended reading level and likely too difficult to be understood by a large number of patients in the United States.

  6. Evaluating the Readability of Radio Frequency Identification for Construction Materials

    Directory of Open Access Journals (Sweden)

    Younghan Jung

    2017-01-01

    Full Text Available Radio Frequency Identification (RFID), which was originally introduced to improve material handling and speed production as part of supply chain management, has become a globally accepted technology that is now applied on many construction sites to facilitate real-time information visibility and traceability. This paper describes a senior undergraduate project for a Construction Management (CM) program that was specifically designed to give the students a greater insight into technical research in the CM area. The students were asked to determine whether it would be possible to utilize an RFID system capable of tracking tagged equipment, personnel and materials across an entire construction site. This project required them to set up an experimental program, execute a series of experiments, analyze the results and summarize them in a report. The readability test was performed using an active Ultra-High Frequency (UHF, 433.92 MHz) RFID system with various construction materials, including metal, concrete, wood, plastic, and aluminum. Readability distances were measured for each of the six scenarios. The distance at which a tag was readable with no obstructions was found to be an average of 133.9 m based on three measurements, with a standard deviation of 3.9 m. This result confirms the manufacturer’s claimed distance of 137.2 m. The RFID tag embedded under 50.8 mm of concrete was readable for an average distance of only 12.2 m, the shortest readable distance of any of the scenarios tested. At the end of the semester, faculty advisors held an open discussion session to gather feedback and elicit the students’ reflections on their research experiences, revealing that the students’ overall impressions of their undergraduate research had positively affected their postgraduate education plans.

  7. Readability and quality of wikipedia pages on neurosurgical topics.

    Science.gov (United States)

    Modiri, Omeed; Guha, Daipayan; Alotaibi, Naif M; Ibrahim, George M; Lipsman, Nir; Fallah, Aria

    2018-03-01

    Wikipedia is the largest online encyclopedia, with over 40 million articles and 500 million visits per month. The aim of this study is to assess the readability and quality of Wikipedia pages on neurosurgery-related topics. We selected the neurosurgery-related Wikipedia pages based on the series of online patient information articles that are published by the American Association of Neurological Surgeons (AANS). We assessed the readability of Wikipedia pages using five different readability scales (Flesch Reading Ease, Flesch-Kincaid Grade Level, Gunning Fog Index, SMOG Grade Level, and Coleman-Liau Index). We used the Centers for Disease Control and Prevention (CDC) Clear Communication Index as well as the DISCERN instrument to evaluate the quality of each Wikipedia article. We identified a total of fifty-five Wikipedia articles that corresponded with patient information articles published by the AANS. This constitutes 77.46% of the AANS topics. The mean Flesch Reading Ease score for all of the Wikipedia articles we analyzed is 31.10, which indicates that a college-level education is necessary to understand them. In comparison to the readability analysis for the AANS articles, the Wikipedia articles were more difficult to read across every scale. None of the Wikipedia articles meet the CDC criterion for clear communication. Our analyses demonstrated that Wikipedia articles related to neurosurgical topics require higher reading grade levels and fall below the expected levels of clear communication for patients. Collaborative efforts from the neurosurgical community are needed to enhance the readability and quality of Wikipedia pages related to neurosurgery.

  8. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Contents: Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing physics.

  9. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron correlation in molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysis...

  10. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    Full Text Available We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of the PageRank problem to calculate a reduced system and get the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursive reordering procedure can offset the computational reduction brought by minimizing the dimension of the linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
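
    For context, here is the baseline power-iteration formulation of PageRank that reordered methods accelerate; the toy link matrix is an illustrative assumption, and dangling nodes are handled by the usual uniform-teleport convention.

        import numpy as np

        def pagerank_power(adj, alpha=0.85, tol=1e-10):
            """Baseline power iteration on a small dense link matrix; reordered
            methods instead solve a reduced system and recover the rest by
            forward substitution."""
            n = adj.shape[0]
            rowsum = adj.sum(axis=1, keepdims=True)
            # Row-stochastic transition matrix; dangling rows become uniform.
            P = np.divide(adj, rowsum, out=np.full_like(adj, 1.0 / n, dtype=float),
                          where=rowsum > 0)
            x = np.full(n, 1.0 / n)
            while True:
                x_new = alpha * (x @ P) + (1 - alpha) / n
                if np.abs(x_new - x).sum() < tol:
                    return x_new
                x = x_new

        links = np.array([[0, 1, 1],
                          [1, 0, 0],
                          [0, 1, 0]], dtype=float)
        print(pagerank_power(links).round(4))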

  11. Experiences using DAKOTA stochastic expansion methods in computational simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan; Ruthruff, Joseph R.

    2012-01-01

    Uncertainty quantification (UQ) methods bring rigorous statistical connections to the analysis of computational and experiment data, and provide a basis for probabilistically assessing margins associated with safety and reliability. The DAKOTA toolkit developed at Sandia National Laboratories implements a number of UQ methods, which are being increasingly adopted by modeling and simulation teams to facilitate these analyses. This report disseminates results on the performance of DAKOTA's stochastic expansion methods for UQ on a representative application. Our results provide a number of insights that may be of interest to future users of these methods, including the behavior of the methods in estimating responses at varying probability levels, and the expansion levels for the methodologies that may be needed to achieve convergence.

  12. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
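
    The three sampling rules compared in the study are easy to reproduce in outline. A small simulation sketch (illustrative, not the authors' program; all durations and the 0.1-s grid are assumptions) scoring momentary time sampling (MTS), partial-interval recording (PIR) and whole-interval recording (WIR) against the true proportion of time an event is occurring:

      # Illustrative comparison of three interval sampling methods on
      # randomly placed events (assumed parameters, not the study's).
      import numpy as np

      rng = np.random.default_rng(0)
      obs_period, interval, event_dur, n_events = 600.0, 10.0, 4.0, 20

      starts = rng.uniform(0, obs_period - event_dur, n_events)
      t = np.arange(0, obs_period, 0.1)          # 0.1-s time grid
      occurring = np.zeros_like(t, dtype=bool)
      for s in starts:
          occurring |= (t >= s) & (t < s + event_dur)

      edges = np.arange(0, obs_period + interval, interval)
      n_int = len(edges) - 1
      mts = pir = wir = 0
      for k in range(n_int):
          in_int = (t >= edges[k]) & (t < edges[k + 1])
          mts += occurring[in_int][-1]           # momentary: last sample
          pir += occurring[in_int].any()         # partial interval
          wir += occurring[in_int].all()         # whole interval

      print(f"true {occurring.mean():.3f}  MTS {mts/n_int:.3f}  "
            f"PIR {pir/n_int:.3f}  WIR {wir/n_int:.3f}")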

  13. Advanced soft computing diagnosis method for tumour grading.

    Science.gov (United States)

    Papageorgiou, E I; Spyridonos, P P; Stylios, C D; Ravazoula, P; Groumpos, P P; Nikiforidis, G N

    2006-01-01

    To develop an advanced diagnostic method for urinary bladder tumour grading. A novel soft computing modelling methodology based on the augmentation of fuzzy cognitive maps (FCMs) with the unsupervised active Hebbian learning (AHL) algorithm is applied. One hundred and twenty-eight cases of urinary bladder cancer were retrieved from the archives of the Department of Histopathology, University Hospital of Patras, Greece. All tumours had been characterized according to the classical World Health Organization (WHO) grading system. To design the FCM model for tumour grading, three expert histopathologists defined the main histopathological features (concepts) and their impact on grade characterization. The resulting FCM model consisted of nine concepts. Eight concepts represented the main histopathological features for tumour grading. The ninth concept represented the tumour grade. To increase the classification ability of the FCM model, the AHL algorithm was applied to adjust the weights of the FCM. The proposed FCM grading model achieved a classification accuracy of 72.5%, 74.42% and 95.55% for tumours of grades I, II and III, respectively. An advanced computerized method to support tumour grade diagnosis decision was proposed and developed. The novelty of the method is based on employing the soft computing method of FCMs to represent specialized knowledge on histopathology and on augmenting the FCMs' ability using an unsupervised learning algorithm, the AHL. The proposed method performs with reasonably high accuracy compared to other existing methods and at the same time meets the physicians' requirements for transparency and explicability.
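
    The core FCM iteration updates each concept from its weighted inputs through a squashing function, A_i <- f(A_i + sum_j A_j * w_ji). A minimal inference sketch (illustrative; the concept names and weights are invented, and the AHL weight-tuning step is omitted):

      # Minimal fuzzy-cognitive-map inference sketch (weights invented;
      # the AHL learning step that adjusts them is not reproduced here).
      import numpy as np

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      def fcm_infer(A0, W, n_steps=20):
          """A0: initial concept activations; W[j, i]: influence of j on i."""
          A = A0.copy()
          for _ in range(n_steps):
              A = sigmoid(A + A @ W)   # A_i <- f(A_i + sum_j A_j * W[j, i])
          return A

      # Three hypothetical feature concepts feeding one grade concept.
      W = np.array([[0.0, 0.0, 0.0,  0.7],
                    [0.0, 0.0, 0.0,  0.5],
                    [0.0, 0.0, 0.0, -0.4],
                    [0.0, 0.0, 0.0,  0.0]])
      A0 = np.array([0.8, 0.6, 0.2, 0.5])
      print(fcm_infer(A0, W))          # last entry ~ grade activation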

  14. Methods for monitoring multiple gene expression

    Energy Technology Data Exchange (ETDEWEB)

    Berka, Randy [Davis, CA; Bachkirova, Elena [Davis, CA; Rey, Michael [Davis, CA

    2012-05-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  15. Methods for monitoring multiple gene expression

    Energy Technology Data Exchange (ETDEWEB)

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2013-10-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  16. Splitting method for computing coupled hydrodynamic and structural response

    International Nuclear Information System (INIS)

    Ash, J.E.

    1977-01-01

    A numerical method is developed for application to unsteady fluid dynamics problems, in particular to the mechanics following a sudden release of high energy. Solution of the initial compressible flow phase provides input to a power-series method for the incompressible fluid motions. The system is split into spatial and time domains leading to the convergent computation of a sequence of elliptic equations. Two sample problems are solved, the first involving an underwater explosion and the second the response of a nuclear reactor containment shell structure to a hypothetical core accident. The solutions are correlated with experimental data

  17. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici...

  18. Computational methods for planning and evaluating geothermal energy projects

    International Nuclear Information System (INIS)

    Goumas, M.G.; Lygerou, V.A.; Papayannakis, L.E.

    1999-01-01

    In planning, designing and evaluating a geothermal energy project, a number of technical, economic, social and environmental parameters should be considered. The use of computational methods provides a rigorous analysis improving the decision-making process. This article demonstrates the application of decision-making methods developed in operational research for the optimum exploitation of geothermal resources. Two characteristic problems are considered: (1) the economic evaluation of a geothermal energy project under uncertain conditions using a stochastic analysis approach and (2) the evaluation of alternative exploitation schemes for optimum development of a low enthalpy geothermal field using a multicriteria decision-making procedure. (Author)

  19. Comparative Costs of Converting Shelf List Records to Machine Readable Form

    Directory of Open Access Journals (Sweden)

    Richard E. Chapin

    1968-03-01

    Full Text Available A study at Michigan State University Library compared the costs of three different methods of conversion: keypunching, paper-tape typewriting, and optical scanning by a service bureau. The record converted included call number, copy number, first 39 letters of the author's name, first 43 letters of the title, and date of publication. Source documents were all of the shelf list cards at the Library. The end products were a master book tape of the library collections and a machine readable book card for each volume to be used in an automated circulation system.

  20. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
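
    The general idea of propagating input uncertainties to code outputs can be sketched with plain Monte Carlo sampling (illustrative only; FRAP's actual uncertainty machinery is not reproduced, and the toy model and input distributions are assumptions):

      # Generic Monte Carlo uncertainty propagation sketch; the "model"
      # stands in for a code run mapping uncertain inputs to an output.
      import numpy as np

      rng = np.random.default_rng(1)

      def model(gap_conductance, power):
          # Hypothetical response: fuel temperature rise (arbitrary units).
          return 500.0 + 0.8 * power - 30.0 * np.log(gap_conductance)

      n = 10_000
      gap = rng.lognormal(mean=np.log(5.0), sigma=0.2, size=n)   # input pdf 1
      power = rng.normal(loc=400.0, scale=20.0, size=n)          # input pdf 2
      out = model(gap, power)
      print(f"mean {out.mean():.1f}, std {out.std():.1f}, 95% interval "
            f"[{np.percentile(out, 2.5):.1f}, {np.percentile(out, 97.5):.1f}]")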

  1. Comparison of different methods for shielding design in computed tomography

    International Nuclear Information System (INIS)

    Ciraj-Bjelac, O.; Arandjic, D.; Kosutic, D.

    2011-01-01

    The purpose of this work is to compare different methods for shielding calculation in computed tomography (CT). The BIR-IPEM (British Institute of Radiology and Institute of Physics and Engineering in Medicine) and NCRP (National Council on Radiation Protection) methods were used for shielding thickness calculation. Scattered dose levels and calculated barrier thicknesses were also compared with those obtained by scatter dose measurements in the vicinity of a dedicated CT unit. The minimal requirement for protective barriers based on the BIR-IPEM method ranged between 1.1 and 1.4 mm of lead, demonstrating underestimation of up to 20% and overestimation of up to 30% when compared with thicknesses based on measured dose levels. For the NCRP method, calculated thicknesses were 33% higher (27-42%). BIR-IPEM methodology-based results were comparable with values based on scattered dose measurements, while results obtained using the NCRP methodology demonstrated an overestimation of the minimal required barrier thickness. (authors)
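
    The arithmetic behind such barrier estimates can be sketched with a simple half-value-layer calculation (illustrative only; it follows neither the BIR-IPEM nor the NCRP formalism in full, and all numbers are assumptions):

      # Simplified barrier estimate from half-value layers (assumed data).
      import math

      scatter_dose = 2.0e-3      # mGy/week at barrier position, unshielded
      design_limit = 0.1e-3      # mGy/week behind barrier (assumed target)
      hvl_lead_mm = 0.3          # assumed HVL of lead for CT scatter

      transmission = design_limit / scatter_dose   # required transmission
      n_hvl = math.log2(1.0 / transmission)        # number of half-value layers
      print(f"transmission {transmission:.3f} -> {n_hvl:.1f} HVLs "
            f"-> {n_hvl * hvl_lead_mm:.2f} mm Pb")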

  2. Improving readability of informed consents for research at an academic medical institution.

    Science.gov (United States)

    Hadden, Kristie B; Prince, Latrina Y; Moore, Tina D; James, Laura P; Holland, Jennifer R; Trudeau, Christopher R

    2017-12-01

    The final rule for the protection of human subjects requires that informed consent be "in language understandable to the subject" and mandates that "the informed consent must be organized in such a way that facilitates comprehension." This study assessed the readability of Institutional Review Board-approved informed consent forms at our institution, implemented an intervention to improve the readability of consent forms, and measured the first-year impact of the intervention. Readability assessment was conducted on a sample of 217 Institutional Review Board-approved informed consents from 2013 to 2015. A plain-language informed consent template was developed and implemented, and readability was assessed again after 1 year. The mean readability of the baseline sample was 10th grade. The mean readability of the post-intervention sample (n=82) was seventh grade. Providing investigators with a plain-language informed consent template and training can promote improved readability of informed consents for research.
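
    The Flesch-Kincaid grade level used in such assessments is a closed formula: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59. A sketch with a naive syllable counter (illustrative; published tools use more careful tokenization and syllable rules):

      # Flesch-Kincaid grade level with a naive vowel-group syllable count.
      import re

      def count_syllables(word):
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def fk_grade(text):
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z']+", text)
          syllables = sum(count_syllables(w) for w in words)
          return (0.39 * len(words) / sentences
                  + 11.8 * syllables / len(words) - 15.59)

      consent = ("You are being asked to join a research study. "
                 "Taking part is your choice and you may stop at any time.")
      print(f"grade level: {fk_grade(consent):.1f}")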

  3. Readability of Online Health Information: A Meta-Narrative Systematic Review.

    Science.gov (United States)

    Daraz, Lubna; Morrow, Allison S; Ponce, Oscar J; Farah, Wigdan; Katabi, Abdulrahman; Majzoub, Abdul; Seisa, Mohamed O; Benkhadra, Raed; Alsawas, Mouaz; Prokop, Larry; Murad, M Hassan

    2018-01-01

    Online health information should meet the reading level for the general public (set at sixth-grade level). Readability is a key requirement for information to be helpful and improve quality of care. The authors conducted a systematic review to evaluate the readability of online health information in the United States and Canada. Out of 3743 references, the authors included 157 cross-sectional studies evaluating 7891 websites using 13 readability scales. The mean readability grade level across websites ranged from grade 10 to 15 based on the different scales. Stratification by specialty, health condition, and type of organization producing information revealed the same findings. In conclusion, online health information in the United States and Canada has a readability level that is inappropriate for general public use. Poor readability can lead to misinformation and may have a detrimental effect on health. Efforts are needed to improve readability and the content of online health information.

  4. Multiscale methods in turbulent combustion: strategies and computational challenges

    International Nuclear Information System (INIS)

    Echekki, Tarek

    2009-01-01

    A principal challenge in modeling turbulent combustion flows is associated with their complex, multiscale nature. Traditional paradigms in the modeling of these flows have attempted to address this nature through different strategies, including exploiting the separation of turbulence and combustion scales and a reduced description of the composition space. The resulting moment-based methods often yield reasonable predictions of flow and reactive scalars' statistics under certain conditions. However, these methods must constantly evolve to address combustion at different regimes, modes or with dominant chemistries. In recent years, alternative multiscale strategies have emerged, which although in part inspired by the traditional approaches, also draw upon basic tools from computational science, applied mathematics and the increasing availability of powerful computational resources. This review presents a general overview of different strategies adopted for multiscale solutions of turbulent combustion flows. Within these strategies, some specific models are discussed or outlined to illustrate their capabilities and underlying assumptions. These strategies may be classified under four different classes, including (i) closure models for atomistic processes, (ii) multigrid and multiresolution strategies, (iii) flame-embedding strategies and (iv) hybrid large-eddy simulation-low-dimensional strategies. A combination of these strategies and models can potentially represent a robust alternative strategy to moment-based models; but a significant challenge remains in the development of computational frameworks for these approaches as well as their underlying theories. (topical review)

  5. Mathematical modellings and computational methods for structural analysis of LMFBR's

    International Nuclear Information System (INIS)

    Liu, W.K.; Lam, D.

    1983-01-01

    In this paper, two aspects of nuclear reactor problems are discussed: modelling techniques and computational methods for large-scale linear and nonlinear analyses of LMFBRs. For nonlinear fluid-structure interaction problems with large deformation, an arbitrary Lagrangian-Eulerian description is applicable. For certain linear fluid-structure interaction problems, the structural response spectrum can be found via an 'added mass' approach. In a sense, the fluid inertia is accounted for by a mass matrix added to the structural mass. The fluid/structural modes of certain fluid-structure problems can be uncoupled to get the reduced added mass. The advantage of this approach is that it can account for the many repeated structures of a nuclear reactor. In regard to nonlinear dynamic problems, the coupled nonlinear fluid-structure equations usually have to be solved by direct time integration. The computation can be very expensive and time-consuming for nonlinear problems. Thus, it is desirable to optimize the accuracy and computational effort by using an implicit-explicit mixed time integration method. (orig.)

  6. Augmented reality with image registration, vision correction and sunlight readability via liquid crystal devices.

    Science.gov (United States)

    Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin

    2017-03-27

    Augmented reality (AR), which uses computer-aided projected information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges regarding the optical system in the AR system: registration, vision correction, and readability under strong ambient light. Here, we solve the three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated in an optical see-through AR system. One of the LC lenses is used to electrically adjust the position of the projected virtual image, a process called registration. The other LC lens, with larger aperture and polarization-independent characteristics, is in charge of vision correction, such as myopia and presbyopia. The linearity of the lens powers of the two LC lenses is also discussed. The readability of virtual images under strong ambient light is solved by the electrically switchable transmittance of the LC attenuator, originating from light scattering and light absorption. The concept demonstrated in this paper could be further extended to other electro-optical devices as long as the devices exhibit the capability of phase modulation and amplitude modulation.

  7. A numerical method to compute interior transmission eigenvalues

    International Nuclear Information System (INIS)

    Kleefeld, Andreas

    2013-01-01

    In this paper the numerical calculation of eigenvalues of the interior transmission problem arising in acoustic scattering for constant contrast in three dimensions is considered. From the computational point of view, existing methods are very expensive and are only able to show the existence of such transmission eigenvalues. Furthermore, they have trouble finding them if two or more eigenvalues are situated close together. We present a new method based on complex-valued contour integrals and the boundary integral equation method which is able to calculate highly accurate transmission eigenvalues. So far, this is the first paper providing such accurate values for various surfaces different from a sphere in three dimensions. Additionally, the computational cost is even lower than that of existing methods. Furthermore, the algorithm is capable of finding complex-valued eigenvalues for which no numerical results have been reported yet. Until now, the proof of existence of such eigenvalues is still open. Finally, highly accurate eigenvalues of the interior Dirichlet problem are provided and might serve as test cases to check newly derived Faber–Krahn type inequalities for larger transmission eigenvalues that are not yet available. (paper)

  8. The principles of computer hardware

    CERN Document Server

    Clements, Alan

    2000-01-01

    Principles of Computer Hardware, now in its third edition, provides a first course in computer architecture or computer organization for undergraduates. The book covers the core topics of such a course, including Boolean algebra and logic design; number bases and binary arithmetic; the CPU; assembly language; memory systems; and input/output methods and devices. It then goes on to cover the related topics of computer peripherals such as printers; the hardware aspects of the operating system; and data communications, and hence provides a broader overview of the subject. Its readable, tutorial-based approach makes it an accessible introduction to the subject. The book has extensive in-depth coverage of two microprocessors, one of which (the 68000) is widely used in education. All chapters in the new edition have been updated. Major updates include: powerful software simulations of digital systems to accompany the chapters on digital design; a tutorial-based introduction to assembly language, including many exam...

  9. Laboratory Sequence in Computational Methods for Introductory Chemistry

    Science.gov (United States)

    Cody, Jason A.; Wiser, Dawn C.

    2003-07-01

    A four-exercise laboratory sequence for introductory chemistry integrating hands-on, student-centered experience with computer modeling has been designed and implemented. The progression builds from exploration of molecular shapes to intermolecular forces and the impact of those forces on chemical separations made with gas chromatography and distillation. The sequence ends with an exploration of molecular orbitals. The students use the computers as a tool; they build the molecules, submit the calculations, and interpret the results. Because of the construction of the sequence and its placement spanning the semester break, good laboratory notebook practices are reinforced and the continuity of course content and methods between semesters is emphasized. The inclusion of these techniques in the first year of chemistry has had a positive impact on student perceptions and student learning.

  10. An analytical method for computing atomic contact areas in biomolecules.

    Science.gov (United States)

    Mach, Paul; Koehl, Patrice

    2013-01-15

    We propose a new analytical method for detecting and computing contacts between atoms in biomolecules. It is based on the alpha shape theory and proceeds in three steps. First, we compute the weighted Delaunay triangulation of the union of spheres representing the molecule. In the second step, the Delaunay complex is filtered to derive the dual complex. Finally, contacts between spheres are collected. In this approach, two atoms i and j are defined to be in contact if their centers are connected by an edge in the dual complex. The contact areas between atom i and its neighbors are computed based on the caps formed by these neighbors on the surface of i; the total area of all these caps is partitioned according to their spherical Laguerre Voronoi diagram on the surface of i. This method is analytical and its implementation in a new program BallContact is fast and robust. We have used BallContact to study contacts in a database of 1551 high resolution protein structures. We show that with this new definition of atomic contacts, we generate realistic representations of the environments of atoms and residues within a protein. In particular, we establish the importance of nonpolar contact areas that complement the information represented by the accessible surface areas. This new method bears similarity to the tessellation methods used to quantify atomic volumes and contacts, with the advantage that it does not require the presence of explicit solvent molecules if the surface of the protein is to be considered. Copyright © 2012 Wiley Periodicals, Inc.

  11. System and method for determining stability of a neural system

    Science.gov (United States)

    Curtis, Steven A. (Inventor)

    2011-01-01

    Disclosed are methods, systems, and computer-readable media for determining stability of a neural system. The method includes tracking a function world line of an N-element neural system within at least one behavioral space, determining whether the tracked function world line is approaching a psychological stability surface, and implementing a quantitative solution that corrects instability if the tracked function world line is approaching the psychological stability surface.

  12. An Accurate liver segmentation method using parallel computing algorithm

    International Nuclear Information System (INIS)

    Elbasher, Eiman Mohammed Khalied

    2014-12-01

    Computed Tomography (CT or CAT scan) is a noninvasive diagnostic imaging procedure that uses a combination of X-rays and computer technology to produce horizontal, or axial, images (often called slices) of the body. A CT scan shows detailed images of any part of the body, including the bones, muscles, fat and organs. CT scans are more detailed than standard X-rays. CT scans may be done with or without "contrast". Contrast refers to a substance taken by mouth and/or injected into an intravenous (IV) line that causes the particular organ or tissue under study to be seen more clearly. CT scans of the liver and biliary tract are used in the diagnosis of many diseases of the abdominal structures, particularly when another type of examination, such as X-rays, physical examination or ultrasound, is not conclusive. Unfortunately, the presence of noise and artifacts at the edges and fine details of CT images limits the contrast resolution and makes the diagnostic procedure more difficult. This experimental study was conducted at the College of Medical Radiological Science, Sudan University of Science and Technology and Fidel Specialist Hospital. The study sample included 50 patients. The main objective of this research was to study an accurate liver segmentation method using a parallel computing algorithm, and to segment the liver and adjacent organs using image processing techniques. The main segmentation technique used in this study was the watershed transform. The scope of image processing and analysis applied to medical applications is to improve the quality of the acquired image and extract quantitative information from medical image data in an efficient and accurate way. The results of this technique agreed with the results of Jarritt et al. (2010), Kratchwil et al. (2010), Jover et al. (2011), Yomamoto et al. (1996), Cai et al. (1999), and Saudha and Jayashree (2010), who used different segmentation filtering based on methods of enhancing computed tomography images. Another...
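
    The watershed step named above can be sketched with scikit-image (a minimal marker-based example on a stand-in image; a real pipeline adds denoising, anatomical marker selection, parallelization and post-processing):

      # Marker-based watershed sketch; the image and seed thresholds are
      # stand-ins, not the study's data or parameters.
      import numpy as np
      from skimage.filters import sobel
      from skimage.segmentation import watershed

      ct_slice = np.random.default_rng(2).normal(size=(256, 256))
      gradient = sobel(ct_slice)            # edge strength guides flooding

      markers = np.zeros_like(ct_slice, dtype=int)
      markers[ct_slice < -1.5] = 1          # assumed background seeds
      markers[ct_slice > 1.5] = 2           # assumed organ seeds

      labels = watershed(gradient, markers) # flood regions from the seeds
      print(np.unique(labels))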

  13. A discrete ordinate response matrix method for massively parallel computers

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Lewis, E.E.

    1991-01-01

    A discrete ordinate response matrix method is formulated for the solution of neutron transport problems on massively parallel computers. The response matrix formulation eliminates iteration on the scattering source. The nodal matrices which result from the diamond-differenced equations are utilized in a factored form which minimizes memory requirements and significantly reduces the required number of operations. The algorithm utilizes massive parallelism by assigning each spatial node to a processor. The algorithm is accelerated effectively by a synthetic method in which the low-order diffusion equations are also solved by massively parallel red/black iterations. The method has been implemented on a 16k Connection Machine-2, and S8 and S16 solutions have been obtained for fixed-source benchmark problems in X-Y geometry

  14. Review methods for image segmentation from computed tomography images

    International Nuclear Information System (INIS)

    Mamat, Nurwahidah; Rahman, Wan Eny Zarina Wan Abdul; Soh, Shaharuddin Cik; Mahmud, Rozi

    2014-01-01

    Image segmentation is a challenging process in which accuracy, automation and robustness must be achieved, especially in medical images. There exist many segmentation methods that can be applied to medical images, but not all methods are suitable. For medical purposes, the aims of image segmentation are to study the anatomical structure, identify the region of interest, measure tissue volume to track tumor growth, and help in treatment planning prior to radiation therapy. In this paper, we present a review of segmentation methods for Computed Tomography (CT) images. CT images have their own characteristics that affect the ability to visualize anatomic structures and pathologic features, such as blurring of the image and visual noise. The details of the methods, their strengths and the problems incurred with them will be defined and explained. It is necessary to know the suitable segmentation method in order to get accurate segmentation. This paper can be a guide for researchers in choosing the suitable segmentation method, especially for segmenting images from CT scans.

  15. GCP compliance and readability of informed consent forms from an emerging hub for clinical trials

    Directory of Open Access Journals (Sweden)

    Satish Chandrasekhar Nair

    2015-01-01

    Full Text Available Background: The rapid expansion of trials in emerging regions has raised valid concerns about research subject protection, particularly related to informed consent. The purpose of this study is to assess informed consent form (ICF) compliance with Good Clinical Practice (GCP) guidelines and the readability ease of the ICFs in Abu Dhabi, a potential destination for clinical trials in the UAE. Materials and Methods: A multicenter retrospective cross-sectional analysis of 140 ICFs from industry-sponsored and non-sponsored studies was conducted by comparing against a local standard ICF. The Flesch-Kincaid Reading Scale was used to assess the readability ease of the forms. Results: Non-sponsored studies had significantly lower overall GCP compliance of 55.8% when compared to 79.5% for industry-sponsored studies. Only 33% of sponsored and 16% of non-sponsored studies included basic information on the participants' rights and responsibilities. The Flesch-Kincaid Reading Ease score for the informed consent forms from industry-sponsored studies was significantly higher (48.9 ± 4.8) compared with 38.5 ± 8.0 for non-sponsored studies, though both were more complex than recommended. The Reading Grade Level score was also higher than expected, but scores for the ICFs from industry-sponsored studies (9.7 ± 0.7) were significantly lower compared with 12.2 ± 1.3 for non-sponsored studies. Conclusion: In spite of the undisputed benefits of conducting research in emerging markets, readability and comprehension issues and the lack of basic essential information call for improvements in the ICFs to protect the rights of future research subjects enrolled in clinical trials in the UAE.

  16. Readability and Content Assessment of Informed Consent Forms for Medical Procedures in Croatia

    Science.gov (United States)

    Vučemilo, Luka; Borovečki, Ana

    2015-01-01

    Background A high-quality informed consent form is essential for adequate information transfer between physicians and patients. The current status of medical procedure consent forms in clinical practice in Croatia, specifically in terms of readability and content, is unknown. The aim of this study was to assess the readability and the content of informed consent forms for diagnostic and therapeutic procedures used with patients in Croatia. Methods 52 informed consent forms from six Croatian hospitals at the secondary and tertiary health-care level were tested for reading difficulty using the Simple Measure of Gobbledygook (SMOG) formula adjusted for the Croatian language, and their content was analyzed qualitatively. Results The average SMOG grade of the analyzed informed consent forms was 13.25 (SD 1.59, range 10–19). Content analysis revealed that the informed consent forms included a description of risks in 96% of the cases, benefits in 81%, description of procedures in 78%, alternatives in 52%, risks and benefits of alternatives in 17%, and risks and benefits of not receiving treatment or undergoing procedures in 13%. Conclusions The readability of the evaluated informed consent forms is not appropriate for the general population in Croatia. In a high proportion of cases the forms failed to include a description of alternatives, the risks and benefits of alternatives, and the risks and benefits of not receiving treatments or undergoing procedures. Data obtained from this research could help in the development and improvement of informed consent forms in Croatia, especially now that Croatian hospitals are undergoing the process of accreditation. PMID:26376183

  17. A computer method for simulating the decay of radon daughters

    International Nuclear Information System (INIS)

    Hartley, B.M.

    1988-01-01

    The analytical equations representing the decay of a series of radioactive atoms through a number of daughter products are well known. These equations are for an idealized case in which the expectation value of the number of atoms which decay in a certain time can be represented by a smooth curve. The real curve of the total number of disintegrations from a radioactive species consists of a series of Heaviside step functions, with the steps occurring at the times of the disintegrations. The disintegration of radioactive atoms is said to be random, but this random behaviour is such that a single species forms an ensemble in which the times of disintegration follow a geometric distribution. Numbers which have a geometric distribution can be generated by computer and can be used to simulate the decay of one or more radioactive species. A computer method is described for simulating such decay of radioactive atoms, and this method is applied specifically to the decay of the short half-life daughters of radon-222 and the emission of alpha particles from polonium-218 and polonium-214. Repeating the simulation of the decay a number of times provides a method for investigating the statistical uncertainty inherent in methods for measurement of exposure to radon daughters. This statistical uncertainty is difficult to investigate analytically, since the time of decay of an atom of polonium-218 is not independent of the time of decay of the subsequent polonium-214. The method is currently being used to investigate the statistical uncertainties of a number of commonly used methods for the counting of alpha particles from radon daughters and the calculation of exposure
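
    The simulation idea can be sketched by drawing random lifetimes through the chain and counting alpha emissions in a fixed window (illustrative; the paper works with geometric distributions in discrete time, of which the exponential used here is the continuous analogue):

      # Sketch: simulate the short-lived Rn-222 daughter chain and count
      # alpha emissions (Po-218 and Po-214) in a counting window.
      import numpy as np

      rng = np.random.default_rng(3)
      LN2 = np.log(2.0)
      half_life_s = {"Po218": 183.0, "Pb214": 1608.0,
                     "Bi214": 1182.0, "Po214": 1.6e-4}

      def alpha_times(n_atoms, window_s):
          """Alpha emission times from n_atoms of Po-218 within window_s."""
          times = []
          t = rng.exponential(half_life_s["Po218"] / LN2, n_atoms)  # Po-218 alpha
          times.extend(t[t < window_s])
          for nuclide in ("Pb214", "Bi214", "Po214"):  # two betas, then alpha
              t = t + rng.exponential(half_life_s[nuclide] / LN2, n_atoms)
              if nuclide == "Po214":
                  times.extend(t[t < window_s])
          return np.sort(times)

      counts = [len(alpha_times(1000, 3600.0)) for _ in range(20)]
      print(np.mean(counts), np.std(counts))  # spread across repeated runs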

  18. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    Science.gov (United States)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real-time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from

  19. A new computational method for reactive power market clearing

    International Nuclear Information System (INIS)

    Zhang, T.; Elkasrawy, A.; Venkatesh, B.

    2009-01-01

    After deregulation of electricity markets, ancillary services such as reactive power supply are priced separately. However, unlike real power supply, procedures for costing and pricing reactive power supply are still evolving, and spot markets for reactive power do not exist as of now. Further, traditional formulations proposed for clearing reactive power markets use a non-linear mixed integer programming formulation that is difficult to solve. This paper proposes a new reactive power supply market clearing scheme. The novelty of this formulation lies in the pricing scheme that rewards transformers for tap shifting while participating in this market. The proposed model is a non-linear mixed integer programming challenge. A significant portion of the manuscript is devoted to the development of a new successive mixed integer linear programming (MILP) technique to solve this formulation. The successive MILP method is computationally robust and fast. The IEEE 6-bus and 300-bus systems are used to test the proposed method. These tests serve to demonstrate the computational speed and rigor of the proposed method. (author)

  20. Empirical method for simulation of water tables by digital computers

    International Nuclear Information System (INIS)

    Carnahan, C.L.; Fenske, P.R.

    1975-09-01

    An empirical method is described for computing a matrix of water-table elevations from a matrix of topographic elevations and a set of observed water-elevation control points which may be distributed randomly over the area of interest. The method is applicable to regions, such as the Great Basin, where the water table can be assumed to conform to a subdued image of overlying topography. A first approximation to the water table is computed by smoothing a matrix of topographic elevations and adjusting each node of the smoothed matrix according to a linear regression between observed water elevations and smoothed topographic elevations. Each observed control point is assumed to exert a radially decreasing influence on the first approximation surface. The first approximation is then adjusted further to conform to observed water-table elevations near control points. Outside the domain of control, the first approximation is assumed to represent the most probable configuration of the water table. The method has been applied to the Nevada Test Site and the Hot Creek Valley areas in Nevada
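
    The first-approximation step described above can be sketched as smoothing plus regression (illustrative only; the radial control-point adjustment that follows is omitted, and the grid and control data are synthetic):

      # Sketch of the first approximation: smooth the topography, then
      # regress observed water levels on the smoothed surface.
      import numpy as np
      from scipy.ndimage import uniform_filter

      rng = np.random.default_rng(4)
      topo = np.cumsum(rng.normal(size=(50, 50)), axis=0)  # stand-in terrain
      smooth = uniform_filter(topo, size=9)   # subdued image of topography

      # Control points: (row, col, observed water-table elevation).
      rows, cols = rng.integers(0, 50, 12), rng.integers(0, 50, 12)
      obs = 0.8 * smooth[rows, cols] - 2.0 + rng.normal(scale=0.3, size=12)

      slope, intercept = np.polyfit(smooth[rows, cols], obs, 1)
      water_table = slope * smooth + intercept   # first approximation
      print(slope, intercept)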

  1. A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data

    Science.gov (United States)

    Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.

    2011-01-01

    A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction, in addition to the volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and, under proper conditions, is fully automated, requiring no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author if further interested. This software differentiates itself from other possible re-slicing software solutions through its complete automation and advanced processing and analysis capabilities.
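
    The unwrap-and-re-slice idea can be sketched as a polar resampling of each slice (illustrative; the NASA tool adds automatic surface edge detection, and the axis position and radii here are assumptions):

      # Unwrap a cylindrical CT slice into a (theta, r) sheet by polar
      # resampling; the slice, axis and radii are stand-ins.
      import numpy as np
      from scipy.ndimage import map_coordinates

      slice2d = np.random.default_rng(5).normal(size=(512, 512))
      cx, cy = 255.5, 255.5                        # assumed cylinder axis
      thetas = np.linspace(0, 2 * np.pi, 720, endpoint=False)
      radii = np.arange(150, 250, 0.5)             # assumed wall radii

      T, R = np.meshgrid(thetas, radii, indexing="ij")
      rows = cy + R * np.sin(T)
      cols = cx + R * np.cos(T)
      unwrapped = map_coordinates(slice2d, [rows, cols], order=1)
      print(unwrapped.shape)  # one row per angle, one column per radius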

  2. Computer codes and methods for simulating accelerator driven systems

    International Nuclear Information System (INIS)

    Sartori, E.; Byung Chan Na

    2003-01-01

    A large set of computer codes and associated data libraries has been developed by nuclear research and industry over the past half century. A large number of them are in the public domain and can be obtained under agreed conditions from different Information Centres. The areas covered comprise: basic nuclear data and models, reactor spectra and cell calculations, static and dynamic reactor analysis, criticality, radiation shielding, dosimetry and material damage, fuel behaviour, safety and hazard analysis, heat conduction and fluid flow in reactor systems, spent fuel and waste management (handling, transportation, and storage), economics of fuel cycles, impact on the environment of nuclear activities etc. These codes and models have been developed mostly for critical systems used for research or power generation and other technological applications. Many of them have not been designed for accelerator driven systems (ADS), but with competent use, they can be used for studying such systems or can form the basis for adapting existing methods to the specific needs of ADSs. The present paper describes the types of methods, codes and associated data available and their role in the applications. It provides Web addresses for facilitating searches for such tools. Some indications are given on the effect of inappropriate or 'blind' use of existing tools on ADS. Reference is made to available experimental data that can be used for validating the use of these methods. Finally, some international activities linked to the different computational aspects are described briefly. (author)

  3. Methodics of computing the results of monitoring the exploratory gallery

    Directory of Open Access Journals (Sweden)

    Krúpa Víazoslav

    2000-09-01

    Full Text Available At the building site of the Višňové-Dubná skala motorway tunnel, priority is given to driving an exploration gallery that provides detailed geological, engineering-geological, hydrogeological and geotechnical research. This research gathers information for the intended use of a full-profile driving machine to drive the motorway tunnel. In the part of the exploration gallery driven by the TBM method, detailed information about the parameters of the driving process is gathered by a computer monitoring system mounted on the driving machine. The monitoring system is based on the industrial computer PC 104 and records four basic values of the driving process: the electromotor performance of the Voest-Alpine ATB 35HA driving machine, the speed of driving advance, the rotation speed of the TBM disintegrating head, and the total head pressure. The pressure force is evaluated from the pressure in the hydraulic cylinders of the machine. From these values, the rock mass strength, the angle of internal friction, etc. are calculated mathematically. These values characterize rock mass properties and their changes. To define the effectiveness of the driving process, the specific energy and the working ability of the driving head are used. The article defines the method of computing the gathered monitoring information, prepared for the Voest-Alpine ATB 35H driving machine at the Institute of Geotechnics SAS. It describes the input forms (protocols) of the developed method, created in an EXCEL program, and shows selected samples of the graphical elaboration of the first monitoring results obtained from the exploratory gallery driving process in the Višňové-Dubná skala motorway tunnel.

  4. Description of a method for computing fluid-structure interaction

    International Nuclear Information System (INIS)

    Gantenbein, F.

    1982-02-01

    A general formulation allowing computation of structure vibrations in a dense fluid is described. It is based on modelling the fluid with fluid finite elements. With each fluid node two variables are associated: the pressure p and a variable π defined by p = d²π/dt². Coupling between structure and fluid is introduced by surface elements. This method is easy to introduce into a general finite element code. Validation was obtained by analytical calculation and tests. It is widely used for vibrational and seismic studies of pipes and internals of nuclear reactors; some applications are presented. [fr]

  5. Computer Aided Flowsheet Design using Group Contribution Methods

    DEFF Research Database (Denmark)

    Bommareddy, Susilpa; Eden, Mario R.; Gani, Rafiqul

    2011-01-01

    In this paper, a systematic group contribution based framework is presented for synthesis of process flowsheets from a given set of input and output specifications. Analogous to the group contribution methods developed for molecular design, the framework employs process groups to represent...... information of each flowsheet to minimize the computational load and information storage. The design variables for the selected flowsheet(s) are identified through a reverse simulation approach and are used as initial estimates for rigorous simulation to verify the feasibility and performance of the design....

  6. COMPUTER-IMPLEMENTED METHOD OF PERFORMING A SEARCH USING SIGNATURES

    DEFF Research Database (Denmark)

    2017-01-01

    A computer-implemented method of processing a query vector and a data vector, comprising: generating a set of masks and a first set of multiple signatures and a second set of multiple signatures by applying the set of masks to the query vector and the data vector, respectively, and generating candidate pairs of a first signature and a second signature by identifying matches of a first signature and a second signature. The set of masks comprises a configuration of the elements that is a Hadamard code; a permutation of a Hadamard code; or a code that deviates from a Hadamard code...
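
    The Sylvester construction gives the Hadamard codes named in the claim. A minimal sketch (illustrative; the patent's exact mask layout is not reproduced):

      # Sylvester construction of a Hadamard matrix; its rows could serve
      # as mutually orthogonal mask configurations.
      import numpy as np

      def hadamard(n):
          """Sylvester Hadamard matrix of order n (n a power of two)."""
          H = np.array([[1]])
          while H.shape[0] < n:
              H = np.block([[H, H], [H, -H]])
          return H

      H = hadamard(8)
      print(H @ H.T)   # n * identity: rows are mutually orthogonal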

  7. Method and apparatus for managing transactions with connected computers

    Science.gov (United States)

    Goldsmith, Steven Y.; Phillips, Laurence R.; Spires, Shannon V.

    2003-01-01

    The present invention provides a method and apparatus that make use of existing computer and communication resources and that reduce the errors and delays common to complex transactions such as international shipping. The present invention comprises an agent-based collaborative work environment that assists geographically distributed commercial and government users in the management of complex transactions such as the transshipment of goods across the U.S.-Mexico border. Software agents can mediate the creation, validation and secure sharing of shipment information and regulatory documentation over the Internet, using the World-Wide Web to interface with human users.

  8. Numerical methods and computers used in elastohydrodynamic lubrication

    Science.gov (United States)

    Hamrock, B. J.; Tripp, J. H.

    1982-01-01

    Some of the methods of obtaining approximate numerical solutions to boundary value problems that arise in elastohydrodynamic lubrication are reviewed. The highlights of four general approaches (direct, inverse, quasi-inverse, and Newton-Raphson) are sketched. Advantages and disadvantages of these approaches are presented along with a flow chart showing some of the details of each. The basic question of numerical stability of the elastohydrodynamic lubrication solutions, especially in the pressure spike region, is considered. Computers used to solve this important class of lubrication problems are briefly described, with emphasis on supercomputers.

  9. Combined machine-readable and visually authenticated optical devices

    Science.gov (United States)

    Souparis, Hugues

    1996-03-01

    Optically variable devices (OVDs) are now widely used on documents of value. The most recent optical visual features, with high definition, animation, brightness and special color tones, provide excellent first and second levels of authentication. The human eye is the only instrument required to check authenticity. This is a major advantage of OVDs in many circumstances, such as currency exchange and street-level ID control. But under other circumstances, such as automatic payments with banknotes, high-volume ID controls at borders, or ID controls in shops, an automatic authentication is necessary or more reliable. When both visual and automated authentication are required, the combination, on the same security component, of a variable image and a machine-readable optical element is a very secure and cost-effective solution for the protection of documents. Several techniques are now available and can be selected depending upon the respective roles of machine readability and visual control.

  10. A hybrid method for the parallel computation of Green's functions

    International Nuclear Information System (INIS)

    Petersen, Dan Erik; Li Song; Stokbro, Kurt; Sorensen, Hans Henrik B.; Hansen, Per Christian; Skelboe, Stig; Darve, Eric

    2009-01-01

    Quantum transport models for nanodevices using the non-equilibrium Green's function method require the repeated calculation of the block tridiagonal part of the Green's and lesser Green's function matrices. This problem is related to the calculation of the inverse of a sparse matrix. Because of the large number of times this calculation needs to be performed, this is computationally very expensive even on supercomputers. The classical approach is based on recurrence formulas which cannot be efficiently parallelized. This practically prevents the solution of large problems with hundreds of thousands of atoms. We propose new recurrences for a general class of sparse matrices to calculate Green's and lesser Green's function matrices which extend formulas derived by Takahashi and others. We show that these recurrences may lead to a dramatically reduced computational cost because they only require computing a small number of entries of the inverse matrix. Then, we propose a parallelization strategy for block tridiagonal matrices which involves a combination of Schur complement calculations and cyclic reduction. It achieves good scalability even on problems of modest size.
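
    The classical serial recurrence that the paper generalizes computes the diagonal blocks of the inverse by a forward and a backward sweep over the block tridiagonal structure. A minimal sketch with a dense cross-check (illustrative; the paper's extended recurrences and parallel Schur-complement/cyclic-reduction scheme are not reproduced):

      # Classical recursive computation of the diagonal blocks of the
      # inverse of a block-tridiagonal matrix (serial baseline).
      import numpy as np

      def diag_blocks_of_inverse(D, U, L):
          """D: n diagonal blocks; U[i] = A[i, i+1]; L[i] = A[i+1, i]."""
          n = len(D)
          g = [np.linalg.inv(D[0])]
          for i in range(1, n):                  # forward (left-connected) sweep
              g.append(np.linalg.inv(D[i] - L[i-1] @ g[i-1] @ U[i-1]))
          G = [None] * n
          G[n-1] = g[n-1]
          for i in range(n - 2, -1, -1):         # backward sweep
              G[i] = g[i] + g[i] @ U[i] @ G[i+1] @ L[i] @ g[i]
          return G

      rng = np.random.default_rng(6)
      b, n = 2, 4
      D = [rng.normal(size=(b, b)) + 5 * np.eye(b) for _ in range(n)]
      U = [rng.normal(size=(b, b)) for _ in range(n - 1)]
      L = [rng.normal(size=(b, b)) for _ in range(n - 1)]
      G = diag_blocks_of_inverse(D, U, L)

      # Cross-check the first diagonal block against a dense inverse.
      A = np.zeros((b * n, b * n))
      for i in range(n):
          A[b*i:b*i+b, b*i:b*i+b] = D[i]
      for i in range(n - 1):
          A[b*i:b*i+b, b*(i+1):b*(i+1)+b] = U[i]
          A[b*(i+1):b*(i+1)+b, b*i:b*i+b] = L[i]
      print(np.allclose(G[0], np.linalg.inv(A)[:b, :b]))  # True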

  11. Multigrid Methods for the Computation of Propagators in Gauge Fields

    Science.gov (United States)

    Kalkreuter, Thomas

    Multigrid methods were invented for the solution of discretized partial differential equations in order to overcome the slowness of traditional algorithms by updates on various length scales. In the present work generalizations of multigrid methods for propagators in gauge fields are investigated. Gauge fields are incorporated in algorithms in a covariant way. The kernel C of the restriction operator which averages from one grid to the next coarser grid is defined by projection on the ground-state of a local Hamiltonian. The idea behind this definition is that the appropriate notion of smoothness depends on the dynamics. The ground-state projection choice of C can be used in arbitrary dimension and for arbitrary gauge group. We discuss proper averaging operations for bosons and for staggered fermions. The kernels C can also be used in multigrid Monte Carlo simulations, and for the definition of block spins and blocked gauge fields in Monte Carlo renormalization group studies. Actual numerical computations are performed in four-dimensional SU(2) gauge fields. We prove that our proposals for block spins are “good”, using renormalization group arguments. A central result is that the multigrid method works in arbitrarily disordered gauge fields, in principle. It is proved that computations of propagators in gauge fields without critical slowing down are possible when one uses an ideal interpolation kernel. Unfortunately, the idealized algorithm is not practical, but it was important to answer questions of principle. Practical methods are able to outperform the conjugate gradient algorithm in the case of bosons. The case of staggered fermions is harder. Multigrid methods give considerable speed-ups compared to conventional relaxation algorithms, but on lattices up to 18^4, conjugate gradient is superior.

  12. The quality and readability of internet information regarding clavicle fractures.

    Science.gov (United States)

    Zhang, Dafang; Schumacher, Charles; Harris, Mitchel Byron

    2016-03-01

    The internet has become a major source of health information for patients. However, there has been little scrutiny of health information available on the internet to the public. Our objectives were to evaluate the quality and readability of information available on the internet regarding clavicle fractures and whether they changed with academic affiliation of the website or with complexity of the search term. Through a prospective evaluation of 3 search engines using 3 different search terms of varying complexity ("broken collarbone," "collarbone fracture," and "clavicle fracture"), we evaluated 91 website hits for quality and readability. Websites were specifically analyzed by search term and by website type. Information quality was evaluated on a four-point scale, and information readability was assessed using the Flesch-Kincaid score for reading grade level. The average quality score for our website hits was low, and the average reading grade level was far above the recommended level. Academic websites offered significantly higher quality information, whereas commercial websites offered significantly lower quality information. The use of more complex search terms yielded information of higher reading grade level but not higher quality. Current internet information regarding clavicle fractures is of low quality and low readability. Higher quality information utilizing more accessible language on clavicle fractures is needed on the internet. It is important to be aware of the information accessible to patients prior to their presentation to our clinics. Patients should be advised to visit websites with academic affiliations and to avoid commercial websites. Copyright © 2015 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.

  13. Assessing the readability of ClinicalTrials.gov.

    Science.gov (United States)

    Wu, Danny T Y; Hanauer, David A; Mei, Qiaozhu; Clark, Patricia M; An, Lawrence C; Proulx, Joshua; Zeng, Qing T; Vydiswaran, V G Vinod; Collins-Thompson, Kevyn; Zheng, Kai

    2016-03-01

    ClinicalTrials.gov serves critical functions of disseminating trial information to the public and helping the trials recruit participants. This study assessed the readability of trial descriptions at ClinicalTrials.gov using multiple quantitative measures. The analysis included all 165,988 trials registered at ClinicalTrials.gov as of April 30, 2014. To obtain benchmarks, the authors also analyzed 2 other medical corpora: (1) all 955 Health Topics articles from MedlinePlus and (2) a random sample of 100,000 clinician notes retrieved from an electronic health records system intended for conveying internal communication among medical professionals. The authors characterized each of the corpora using 4 surface metrics, and then applied 5 different scoring algorithms to assess their readability. The authors hypothesized that clinician notes would be most difficult to read, followed by trial descriptions and MedlinePlus Health Topics articles. Trial descriptions have the longest average sentence length (26.1 words) across all corpora; 65% of the words used are not covered by a basic medical English dictionary. In comparison, the average sentence length of MedlinePlus Health Topics articles is 61% shorter, vocabulary size is 95% smaller, and dictionary coverage is 46% higher. All 5 scoring algorithms consistently rated ClinicalTrials.gov trial descriptions the most difficult corpus to read, even harder than clinician notes. On average, it requires 18 years of education to properly understand these trial descriptions according to the results generated by the readability assessment algorithms. Trial descriptions at ClinicalTrials.gov are extremely difficult to read. Significant work is warranted to improve their readability in order to achieve ClinicalTrials.gov's goal of facilitating information dissemination and subject recruitment. Published by Oxford University Press on behalf of the American Medical Informatics Association 2015. This work is written by US Government

  14. Online Patient Resources for Liposuction: A Comparative Analysis of Readability.

    Science.gov (United States)

    Vargas, Christina R; Ricci, Joseph A; Chuang, Danielle J; Lee, Bernard T

    2016-03-01

    As patients strive to become informed about health care, inadequate functional health literacy is a significant barrier. Nearly half of American adults have poor or marginal health literacy skills and the National Institutes of Health and American Medical Association have recommended that patient information should be written at a sixth grade level. The aim of this study is to identify the most commonly used online patient information about liposuction and to evaluate its readability relative to average American literacy. An internet search of "liposuction" was performed and the 10 most popular websites identified. User and location data were disabled and sponsored results excluded. All relevant, patient-directed articles were downloaded and formatted into plain text. Articles were then analyzed using 10 established readability tests. A comparison group was constructed to identify the most popular online consumer information about tattooing. Mean readability scores and specific article characteristics were compared. A total of 80 articles were collected from websites about liposuction. Readability analysis revealed an overall 13.6 grade reading level (range, 10-16 grade); all articles exceeded the target sixth grade level. Consumer websites about tattooing were significantly easier to read, with a mean 7.8 grade level. These sites contained significantly fewer characters per word and words per sentence, as well as a smaller proportion of complex, long, and unfamiliar words. Online patient resources about liposuction are potentially too difficult for a large number of Americans to understand. Liposuction websites are significantly harder to read than consumer websites about tattooing. Aesthetic surgeons are advised to discuss with patients resources they use and guide patients to appropriate information for their skill level.

  15. Reliability, Readability and Quality of Online Information about Femoroacetabular Impingement

    Directory of Open Access Journals (Sweden)

    Fatih Küçükdurmaz

    2015-07-01

    Conclusion: According to our results, the websites intended to attract patients searching for information regarding femoroacetabular impingement are providing a highly accessible, readable information source, but do not appear to apply a comparable amount of rigor to scientific literature or healthcare practitioner websites in regard to matters such as citing sources for information, supplying methodology and including a publication date. This indicates that while these resources are easily accessed by patients, there is potential for them to be a source of misinformation.

  16. Fluid history computation methods for reactor safeguards problems using MNODE computer program

    International Nuclear Information System (INIS)

    Huang, Y.S.; Savery, C.W.

    1976-10-01

    A method for predicting the pressure-temperature histories of air, liquid water, and vapor flowing in a zoned containment as a result of a high-energy pipe rupture is described. The computer code, MNODE, has been developed for 12 connected control volumes and 24 inertia flow paths. Predictions by the code are compared with the results of an analytical gas dynamics problem, semiscale blowdown experiments, full-scale MARVIKEN test results, and Battelle-Frankfurt model PWR containment test data. The MNODE solutions to NRC/AEC subcompartment benchmark problems are also compared with results predicted by other computer codes such as RELAP-3, FLASH-2, and CONTEMPT-PS. The analytical treatment is consistent with Section 6.2.1.2 of the Standard Format (Rev. 2) issued by the U.S. Nuclear Regulatory Commission in September 1975

  17. Readability assessment of online thyroid surgery patient education materials.

    Science.gov (United States)

    Patel, Chirag R; Cherla, Deepa V; Sanghvi, Saurin; Baredes, Soly; Eloy, Jean Anderson

    2013-10-01

    Published guidelines recommend written health information be written at or below the sixth-grade level. We evaluate the readability of online materials related to thyroid surgery. Thyroid surgery materials were evaluated using Flesch Reading Ease Score (FRES), Flesch Kincaid Grade Level (FKGL), Gunning Frequency of Gobbledygook (GFOG), and Simple Measure of Gobbledygook (SMOG). Thirty-one documents were evaluated. FRES scores ranged from 29.3 to 67.8 (possible range = 0 to 100), and averaged 50.5. FKGL ranged from 6.9 to 14.9 (possible range = 3 to 12), and averaged 10.4. SMOG scores ranged from 11.8 to 14.5 (possible range = 3 to 19), and averaged 13.0. GFOG scores ranged from 10.6 to 18.0 (possible range = 3 to 19), and averaged 13.5. Readability scores for online thyroid surgery materials are higher (i.e., more difficult) than the recommended levels. However, readability is only one aspect of comprehension. Written information should be designed with that fact in mind. Copyright © 2013 Wiley Periodicals, Inc.

  18. Readability of informed consent forms in vascular and interventional radiology

    International Nuclear Information System (INIS)

    Pinto, I.; Vigil, D.

    1998-01-01

    To evaluate the readability of the informed consent forms prepared for vascular and interventional radiology. The 18 informed consent forms were analyzed using the Gramática tool in Microsoft Word 97 for Windows, which reports legibility statistics in three sections: scores, averages and legibility (Flesch index, passive voice, sentence complexity and vocabulary complexity). For each form, the integrated readability index was also calculated manually. All the documents present a Flesch index of over 10; the sentence complexity indexes are less than or equal to 20, demonstrating that the sentences are not long or complicated in structure. Finally, the integrated readability index of all of them is well over 70. The forms possess acceptable legibility indexes, but their evaluation should be completed by an opinion poll of the patients for whom they are written. Moreover, it must be kept in mind that these documents, like the procedures performed, are changing continually. Thus, it is necessary to update and modify the information to be provided to the patients. (Author) 11 refs

  19. Quality and readability of online information resources on insomnia

    Institute of Scientific and Technical Information of China (English)

    Yan Ma; Albert C.Yang; Ying Duan; Ming Dong; Albert S.Yeung

    2017-01-01

    The internet is a major source of health information. An increasing number of people, including patients with insomnia, search for remedies online; however, little is known about the quality of such information. This study aimed to evaluate the quality and readability of insomnia-related online information. Google was used as the search engine, and the top websites on insomnia that met the inclusion criteria were evaluated for quality and readability. The analyzed websites belonged to nonprofit, commercial, or academic organizations and institutions such as hospitals and universities. Insomnia-related websites typically included definitions (85%), causes and risk factors (100%), symptoms (95%), and treatment options (90%). Cognitive behavioral therapy for insomnia (CBT-I) was the most commonly recommended approach for insomnia treatment, and sleep drugs were frequently mentioned. The overall quality of the websites on insomnia is moderate, but all the content exceeded the recommended reading ease levels. Concerns that must be addressed to increase the quality and trustworthiness of online health information include sharing metadata, such as authorship, time of creation and last update, and conflicts of interest; providing evidence for reliability; and increasing the readability for a lay audience.

  20. Readability assessment of online urology patient education materials.

    Science.gov (United States)

    Colaco, Marc; Svider, Peter F; Agarwal, Nitin; Eloy, Jean Anderson; Jackson, Imani M

    2013-03-01

    The National Institutes of Health, American Medical Association, and United States Department of Health and Human Services recommend that patient education materials be written at a fourth to sixth grade reading level to facilitate comprehension. We examined and compared the readability and difficulty of online patient education materials from the American Urological Association and academic urology departments in the Northeastern United States. We assessed the online patient education materials for difficulty level with 10 commonly used readability assessment tools, including the Flesch Reading Ease Score, Flesch-Kincaid Grade Level, Simple Measure of Gobbledygook, Gunning Frequency of Gobbledygook, New Dale-Chall Test, Coleman-Liau index, New Fog Count, Raygor Readability Estimate, FORCAST test and Fry score. Most patient education materials on the websites of these programs were written at or above the eleventh grade reading level. Urological online patient education materials are written above the recommended reading level. They may need to be simplified to facilitate better patient understanding of urological topics. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  1. Oligomerization of G protein-coupled receptors: computational methods.

    Science.gov (United States)

    Selent, J; Kaczor, A A

    2011-01-01

    Recent research has unveiled the complexity of the mechanisms involved in G protein-coupled receptor (GPCR) functioning, in which receptor dimerization/oligomerization may play an important role. Although the first high-resolution X-ray structure for a likely functional chemokine receptor dimer has been deposited in the Protein Data Bank, the interactions and mechanisms of dimer formation are not yet fully understood. In this respect, computational methods play a key role in predicting accurate GPCR complexes. This review outlines computational approaches, focusing on sequence- and structure-based methodologies, and discusses their advantages and limitations. Sequence-based approaches that search for possible protein-protein interfaces in GPCR complexes have been applied with success in several studies, but did not always yield consistent results. Structure-based methodologies are a potent complement to sequence-based approaches. For instance, protein-protein docking is a valuable method, especially when guided by experimental constraints. Some disadvantages, such as limited receptor flexibility and neglect of the membrane environment, have to be taken into account. Molecular dynamics simulation can overcome these drawbacks, giving a detailed description of conformational changes in a native-like membrane. Successful prediction of GPCR complexes using computational approaches combined with experimental efforts may help to understand the role of dimeric/oligomeric GPCR complexes in fine-tuning receptor signaling. Moreover, since such GPCR complexes have attracted interest as potential drug targets for diverse diseases, unveiling the molecular determinants of dimerization/oligomerization can provide important implications for drug discovery.

  2. Computing thermal Wigner densities with the phase integration method

    International Nuclear Information System (INIS)

    Beutier, J.; Borgis, D.; Vuilleumier, R.; Bonella, S.

    2014-01-01

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems

  3. Computing thermal Wigner densities with the phase integration method.

    Science.gov (United States)

    Beutier, J; Borgis, D; Vuilleumier, R; Bonella, S

    2014-08-28

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems.

  4. Computational methods for ab initio detection of microRNAs

    Directory of Open Access Journals (Sweden)

    Malik eYousef

    2012-10-01

    Full Text Available MicroRNAs are small RNA sequences of 18-24 nucleotides in length, which serve as templates to drive post-transcriptional gene silencing. The canonical microRNA pathway starts with transcription from DNA and is followed by processing via the Microprocessor complex, yielding a hairpin structure, which is then exported into the cytosol, where it is processed by Dicer and then incorporated into the RNA-induced silencing complex. All of these biogenesis steps add to the overall specificity of miRNA production and effect. Unfortunately, their modes of action are just beginning to be elucidated, and therefore computational prediction algorithms cannot model the process but are usually forced to employ machine learning approaches. This work focuses on ab initio prediction methods throughout; homology-based miRNA detection methods are therefore not discussed. Current ab initio prediction algorithms, their ties to data mining, and their prediction accuracy are detailed.
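
A minimal sketch of the structural side of ab initio detection: score a candidate hairpin by its maximum nested base-pair count using the classic Nussinov dynamic program. The sequence below is hypothetical, and real predictors use thermodynamic folding (e.g. RNAfold) plus many additional features; this only illustrates the kind of structure-derived feature fed to a machine learning stage.

```python
# Sketch: maximum nested Watson-Crick/wobble base pairs of a candidate
# pre-miRNA, via the Nussinov dynamic program (a deliberate simplification).

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov_pair_count(seq, min_loop=3):
    """Maximum number of nested base pairs, enforcing a minimum loop size."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                       # leave base i unpaired
            for k in range(i + min_loop + 1, j + 1):  # or pair i with k
                if (seq[i], seq[k]) in PAIRS:
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + dp[i + 1][k - 1] + right)
            dp[i][j] = best
    return dp[0][n - 1]

candidate = "GGGAAAUCCCUUUAGGGAUUUCCC"   # hypothetical sequence
pairs = nussinov_pair_count(candidate)
print(pairs, pairs / (len(candidate) / 2))  # raw count and pairing density
```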

  5. Data graphing methods, articles of manufacture, and computing devices

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Pak Chung; Mackey, Patrick S.; Cook, Kristin A.; Foote, Harlan P.; Whiting, Mark A.

    2016-12-13

    Data graphing methods, articles of manufacture, and computing devices are described. In one aspect, a method includes accessing a data set, displaying a graphical representation including data of the data set which is arranged according to a first of different hierarchical levels, wherein the first hierarchical level represents the data at a first of a plurality of different resolutions which respectively correspond to respective ones of the hierarchical levels, selecting a portion of the graphical representation wherein the data of the portion is arranged according to the first hierarchical level at the first resolution, modifying the graphical representation by arranging the data of the portion according to a second of the hierarchical levels at a second of the resolutions, and after the modifying, displaying the graphical representation wherein the data of the portion is arranged according to the second hierarchical level at the second resolution.
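
A minimal sketch of the idea the abstract describes: keep the data at several resolutions, show an overview at a coarse hierarchical level, and re-render a selected portion at a finer level. The block-mean downsampling and the variable names here are illustrative assumptions, not the patented method.

```python
import numpy as np

def downsample(data, factor):
    """Block-mean downsampling: one coarse point per `factor` raw points."""
    n = len(data) // factor * factor
    return data[:n].reshape(-1, factor).mean(axis=1)

rng = np.random.default_rng(0)
raw = rng.normal(size=10_000).cumsum()               # level-0 data (finest)
levels = {0: raw, 1: downsample(raw, 10), 2: downsample(raw, 100)}

# Display the whole series at the coarsest hierarchical level...
overview = levels[2]

# ...then "select a portion" and re-arrange it at a finer level/resolution.
lo, hi = 3000, 4000                                  # selection, raw coordinates
detail = levels[0][lo:hi]
print(len(overview), len(detail))
```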

  6. A finite element solution method for quadrics parallel computer

    International Nuclear Information System (INIS)

    Zucchini, A.

    1996-08-01

    A distributed preconditioned conjugate gradient method for finite element analysis has been developed and implemented on a parallel SIMD Quadrics computer. The main characteristic of the method is that it does not require any actual assembly of all element equations in a global system. The physical domain of the problem is partitioned into cells of n_p finite elements, and each cell element is assigned to a different node of an n_p-processor machine. Element stiffness matrices are stored in the data memory of the assigned processing node, and the solution process is executed entirely in parallel at the element level. Inter-element, and therefore inter-processor, communications are required once per iteration to perform local sums of vector quantities between neighbouring elements. A prototype implementation has been tested on an 8-node Quadrics machine on a simple 2D benchmark problem
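
The key idea, a matrix-free element-by-element solve, can be shown serially. The sketch below runs unpreconditioned conjugate gradient on a 1D Poisson problem where the matrix-vector product is accumulated element by element, so no global stiffness matrix is ever assembled; in the Quadrics implementation each element's contribution would live on its own processing node. This is an illustrative reconstruction, not the paper's code.

```python
import numpy as np

# 1D Poisson: -u'' = 1 on (0,1), u(0)=u(1)=0, linear finite elements.
n_el = 100
h = 1.0 / n_el
ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
fe = (h / 2.0) * np.array([1.0, 1.0])                  # element load
n_nodes = n_el + 1

def matvec(x_full):
    """K @ x accumulated element by element (no global assembly)."""
    y = np.zeros(n_nodes)
    for e in range(n_el):
        y[e:e + 2] += ke @ x_full[e:e + 2]
    return y

f = np.zeros(n_nodes)
for e in range(n_el):
    f[e:e + 2] += fe
inner = slice(1, n_nodes - 1)          # interior nodes (Dirichlet BCs)

def A(xi):
    x_full = np.zeros(n_nodes)
    x_full[inner] = xi
    return matvec(x_full)[inner]

# Plain conjugate gradient on the interior system.
x = np.zeros(n_nodes - 2)
r = f[inner] - A(x)
p = r.copy()
rs = r @ r
for _ in range(500):
    Ap = A(p)
    alpha = rs / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    if np.sqrt(rs_new) < 1e-10:
        break
    p = r + (rs_new / rs) * p
    rs = rs_new

xs = np.linspace(0, 1, n_nodes)[inner]
# Linear elements are nodally exact here: compare with u(x) = x(1-x)/2.
print(np.max(np.abs(x - xs * (1 - xs) / 2)))   # close to solver tolerance
```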

  7. A novel dual energy method for enhanced quantitative computed tomography

    Science.gov (United States)

    Emami, A.; Ghadiri, H.; Rahmim, A.; Ay, M. R.

    2018-01-01

    Accurate assessment of bone mineral density (BMD) is critically important in clinical practice, and conveniently enabled via quantitative computed tomography (QCT). Meanwhile, dual-energy QCT (DEQCT) enables enhanced detection of small changes in BMD relative to single-energy QCT (SEQCT). In the present study, we aimed to investigate the accuracy of QCT methods, with particular emphasis on a new dual-energy approach, in comparison to single-energy and conventional dual-energy techniques. We used a sinogram-based analytical CT simulator to model the complete chain of CT data acquisitions, and assessed performance of SEQCT and different DEQCT techniques in quantification of BMD. We demonstrate a 120% reduction in error when using a proposed dual-energy Simultaneous Equation by Constrained Least-squares method, enabling more accurate bone mineral measurements.
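
A sketch of the generic dual-energy decomposition step (not the paper's constrained least-squares variant): attenuation measured at two energies is modeled as a linear mix of two basis materials, and the material densities follow from a small least-squares solve. The attenuation coefficients below are placeholder numbers, not calibrated values.

```python
import numpy as np

# Hypothetical mass attenuation coefficients (cm^2/g) of two basis
# materials at the low and high tube energies -- placeholders only.
M = np.array([[0.60, 0.25],    # low energy:  [bone, soft tissue]
              [0.35, 0.20]])   # high energy: [bone, soft tissue]

mu_measured = np.array([0.33, 0.23])   # measured attenuation in one voxel

# Solve M @ densities = mu_measured; clip to keep densities non-negative.
densities, *_ = np.linalg.lstsq(M, mu_measured, rcond=None)
densities = np.clip(densities, 0.0, None)
print(dict(zip(["bone_g_cm3", "soft_g_cm3"], densities)))
```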

  8. Comparison of four computational methods for computing Q factors and resonance wavelengths in photonic crystal membrane cavities

    DEFF Research Database (Denmark)

    de Lasson, Jakob Rosenkrantz; Frandsen, Lars Hagedorn; Burger, Sven

    2016-01-01

    We benchmark four state-of-the-art computational methods by computing quality factors and resonance wavelengths in photonic crystal membrane L5 and L9 line defect cavities. The convergence of the methods with respect to resolution, degrees of freedom and number of modes is investigated. Special attention is paid to the influence of the size of the computational domain. Convergence is not obtained for some of the methods, indicating that some are more suitable than others for analyzing line defect cavities.

  9. Computer prediction of subsurface radionuclide transport: an adaptive numerical method

    International Nuclear Information System (INIS)

    Neuman, S.P.

    1983-01-01

    Radionuclide transport in the subsurface is often modeled with the aid of the advection-dispersion equation. A review of existing computer methods for the solution of this equation shows that there is need for improvement. To answer this need, a new adaptive numerical method is proposed based on an Eulerian-Lagrangian formulation. The method is based on a decomposition of the concentration field into two parts, one advective and one dispersive, in a rigorous manner that does not leave room for ambiguity. The advective component of steep concentration fronts is tracked forward with the aid of moving particles clustered around each front. Away from such fronts the advection problem is handled by an efficient modified method of characteristics called single-step reverse particle tracking. When a front dissipates with time, its forward tracking stops automatically and the corresponding cloud of particles is eliminated. The dispersion problem is solved by an unconventional Lagrangian finite element formulation on a fixed grid which involves only symmetric and diagonal matrices. Preliminary tests against analytical solutions of one- and two-dimensional dispersion in a uniform steady state velocity field suggest that the proposed adaptive method can handle the entire range of Peclet numbers from 0 to infinity, with Courant numbers well in excess of 1
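
The "single-step reverse particle tracking" used away from steep fronts can be sketched as a semi-Lagrangian update: the value at each grid point at the new time is obtained by tracing the characteristic backwards one step and interpolating. The sketch assumes a uniform 1D velocity, a periodic domain, and linear interpolation.

```python
import numpy as np

nx, L = 200, 1.0
x = np.linspace(0.0, L, nx)
v, dt = 1.0, 0.002                       # uniform velocity, time step
c = np.exp(-((x - 0.2) / 0.05) ** 2)     # initial concentration pulse

for _ in range(100):
    # Reverse tracking: the value now at x came from x - v*dt one step ago.
    depart = (x - v * dt) % L
    c = np.interp(depart, x, c, period=L)

print(x[np.argmax(c)])   # peak has advected by ~ v * 100 * dt = 0.2
```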

  10. Parallel computation of multigroup reactivity coefficient using iterative method

    Science.gov (United States)

    Susmikanti, Mike; Dewayatna, Winter

    2013-09-01

    One of the research activities supporting the commercial radioisotope production program is safety research on the irradiation of FPM (Fission Product Molybdenum) targets. An FPM target is a tube made of stainless steel in which layers of high-enriched uranium are superimposed. FPM tube irradiation is intended to obtain fission products; the fission material is widely used in the form of kits in nuclear medicine. Irradiating FPM tubes in the reactor core can interfere with core performance, one such disturbance being changes in flux or reactivity. It is therefore necessary to develop a method for calculating safety margins under the configuration changes occurring over the life of the reactor, and making the code faster became an absolute necessity. With the perturbation method, the neutron safety margin of the research reactor can be re-evaluated without modifying the reactivity calculation, which is an advantage of that approach. The criticality and flux in a multigroup diffusion model were calculated at various irradiation positions for several uranium contents. This model involves complex computation, and several parallel algorithms with iterative methods have been developed for solving large sparse matrix systems. The red-black Gauss-Seidel iteration and the parallel power iteration method can be used to solve the multigroup diffusion equation system and to calculate the criticality and the reactivity coefficient. In this research, a code for reactivity calculation, one component of the safety analysis, was developed with parallel processing; the calculation can be done more quickly and efficiently by utilizing the multiple cores of a multicore computer. The code was applied to the calculation of safety limits for irradiated FPM targets with increasing uranium content.
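
A serial sketch of the red-black Gauss-Seidel splitting named in the abstract, on a 2D Poisson model problem rather than the multigroup diffusion system: points of one checkerboard color only couple to points of the other color under the 5-point stencil, so each color sweep can run fully in parallel.

```python
import numpy as np

n = 64                          # interior grid is n x n, spacing h
h = 1.0 / (n + 1)
u = np.zeros((n + 2, n + 2))
f = np.ones((n + 2, n + 2))     # right-hand side of -lap u = 1

iy, ix = np.indices((n + 2, n + 2))
red = ((ix + iy) % 2 == 0)
black = ~red
interior = (ix > 0) & (ix < n + 1) & (iy > 0) & (iy < n + 1)

def sweep(mask):
    # Same-color points share no stencil neighbours, so this simultaneous
    # masked update is exactly the Gauss-Seidel sweep over that color.
    m = mask & interior
    u[m] = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                   + np.roll(u, 1, 1) + np.roll(u, -1, 1)
                   + h * h * f)[m]

for _ in range(2000):
    sweep(red)      # all red points update independently -> parallel
    sweep(black)    # then all black points, using the new red values

print(u.max())      # approaches ~0.0737, the center value of the solution
```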

  11. Face recognition system and method using face pattern words and face pattern bytes

    Science.gov (United States)

    Zheng, Yufeng

    2014-12-23

    The present invention provides a novel system and method for identifying individuals and for face recognition utilizing facial features for face identification. The system and method of the invention comprise creating facial features, or face patterns, called face pattern words and face pattern bytes for face identification. The invention also provides for pattern recognition in identification tasks other than face recognition. The invention further provides a means for identifying individuals based on visible and/or thermal images of those individuals, utilizing computer software implemented as instructions on a computer or computer system, and a computer-readable medium containing instructions for face recognition and identification.

  12. Simplifying the Reuse and Interoperability of Geoscience Data Sets and Models with Semantic Metadata that is Human-Readable and Machine-actionable

    Science.gov (United States)

    Peckham, S. D.

    2017-12-01

    Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, the governing equations and the numerical methods used to solve them, the discretization of space (the grid) and time (the time-stepping scheme), the state variables (input or output), and the model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins the model. A carefully constructed, unambiguous, rules-based schema that addresses this problem, called the Geoscience Standard Names ontology, will be presented; it utilizes Semantic Web best practices and technologies and has been designed to work across science domains and to be readable by both humans and machines.

  13. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Full Text Available Planaria is a member of the Phylum Platyhelminthes, the flatworms. Planarians possess the unique ability to regenerate from adult stem cells, or neoblasts, and serve as a model organism for regeneration and developmental studies. Although research is being actively carried out globally through conventional methods to understand the process of regeneration from neoblasts and the biology of development, neurobiology and immunology of Planaria, there are many thought-provoking questions related to stem cell plasticity and the uniqueness of regenerative potential in Planarians amongst other members of the Phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair and responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but its availability and analysis remain challenging. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, the application of algorithms in deciphering changes of morphology by RNA interference (RNAi) approaches, and the interpretation of regeneration experiments are a new venture in Planaria research that is helping researchers across the globe understand its biology. We highlight the applications of Hidden Markov Models (HMMs) in the design of computational tools and their application in decoding the complex biology of Planaria.
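
Since the abstract highlights Hidden Markov Models, here is a minimal HMM forward recursion (likelihood of an observation sequence under the model) with toy numbers; the probabilities are illustrative and nothing here is specific to the Planaria tools being reviewed.

```python
import numpy as np

# Toy 2-state HMM: transition matrix A, emission matrix B, initial pi.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],    # P(symbol | state 0)
              [0.1, 0.9]])   # P(symbol | state 1)
pi = np.array([0.5, 0.5])

def forward_likelihood(obs):
    """P(observations | model) via the forward recursion."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward_likelihood([0, 0, 1, 1, 1]))
```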

  14. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years, so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally, it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation-independent behaviour and supports the widely observed phenomenon that defects cluster.

  15. Computation of Hemagglutinin Free Energy Difference by the Confinement Method

    Science.gov (United States)

    2017-01-01

    Hemagglutinin (HA) mediates membrane fusion, a crucial step during influenza virus cell entry. How many HAs are needed for this process is still subject to debate. To aid in this discussion, the confinement free energy method was used to calculate the conformational free energy difference between the extended intermediate and postfusion state of HA. Special care was taken to comply with the general guidelines for free energy calculations, thereby obtaining convergence and demonstrating reliability of the results. The energy that one HA trimer contributes to fusion was found to be 34.2 ± 3.4 kBT, similar to the known contributions from other fusion proteins. Although computationally expensive, the technique used is a promising tool for the further energetic characterization of fusion protein mechanisms. Knowledge of the energetic contributions per protein, and of conserved residues that are crucial for fusion, aids in the development of fusion inhibitors for antiviral drugs. PMID:29151344

  16. Conference on Boundary and Interior Layers : Computational and Asymptotic Methods

    CERN Document Server

    Stynes, Martin; Zhang, Zhimin

    2017-01-01

    This volume collects papers associated with lectures that were presented at the BAIL 2016 conference, which was held from 14 to 19 August 2016 at Beijing Computational Science Research Center and Tsinghua University in Beijing, China. It showcases the variety and quality of current research into numerical and asymptotic methods for theoretical and practical problems whose solutions involve layer phenomena. The BAIL (Boundary And Interior Layers) conferences, held usually in even-numbered years, bring together mathematicians and engineers/physicists whose research involves layer phenomena, with the aim of promoting interaction between these often-separate disciplines. These layers appear as solutions of singularly perturbed differential equations of various types, and are common in physical problems, most notably in fluid dynamics. This book is of interest to current researchers from mathematics, engineering and physics whose work involves the accurate approximation of solutions of singularly perturbed differential equations.

  17. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-01-01

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community

  18. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT), an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often requires the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algorithms).
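
A compact example of one computational strategy the book covers: Monte Carlo simulation of the spatial prisoner's dilemma with Fermi-rule imitation. The lattice size, temptation parameter b and noise K below are illustrative choices; for small b a mixed population of cooperators and defectors typically survives.

```python
import numpy as np

rng = np.random.default_rng(1)
L, b, K = 50, 1.05, 0.1               # lattice size, temptation, noise
s = rng.integers(0, 2, size=(L, L))   # 1 = cooperator, 0 = defector

def site_payoff(i, j):
    """Payoff vs the 4 neighbours (weak PD: R=1, T=b, S=P=0)."""
    tot = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = s[(i + di) % L, (j + dj) % L]
        tot += nb if s[i, j] == 1 else b * nb
    return tot

neigh = ((1, 0), (-1, 0), (0, 1), (0, -1))
for _ in range(100 * L * L):          # Monte Carlo elementary steps
    i, j = rng.integers(0, L, size=2)
    di, dj = neigh[rng.integers(0, 4)]
    ni, nj = (i + di) % L, (j + dj) % L
    if s[ni, nj] == s[i, j]:
        continue
    # Fermi rule: adopt the neighbour's strategy with logistic probability.
    dp = site_payoff(i, j) - site_payoff(ni, nj)
    if rng.random() < 1.0 / (1.0 + np.exp(dp / K)):
        s[i, j] = s[ni, nj]

print("cooperator fraction:", s.mean())
```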

  19. Activation method for measuring the neutron spectra parameters. Computer software

    International Nuclear Information System (INIS)

    Efimov, B.V.; Ionov, V.S.; Konyaev, S.I.; Marin, S.V.

    2005-01-01

    A mathematical statement of the problem of determining the spectral characteristics of neutron fields using the unified activation detectors (UKD) developed at RRC KI is presented. The authors' method for processing the results of activation measurements and for calculating the parameters used to estimate the characteristics of neutron spectra is discussed. Features of the processing of the experimental data obtained in activation measurements with UKD are considered. UKD activation detectors contain several specially selected isotopes that, upon irradiation, give distinct activity peaks on the common activity scale of the spectrum. Computational processing of the measurement results is applied to determine spectrum parameters for nuclear reactor installations with thermal and near-thermal neutron power spectra. An example of the processing of measurement data obtained at the RRC KI research reactor F-1 is given [ru]
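
Underneath such activation methods sits an unfolding problem: measured activities are a response matrix times the group fluxes. A minimal sketch using non-negative least squares (SciPy assumed available) is shown below; the response matrix entries are placeholders, not UKD data.

```python
import numpy as np
from scipy.optimize import nnls

# Rows: detector isotopes; columns: neutron energy groups.
# Entries are hypothetical group-averaged activation cross sections.
R = np.array([[5.0, 1.0, 0.1],
              [0.5, 3.0, 0.4],
              [0.1, 0.8, 2.5],
              [1.0, 1.0, 1.0]])

phi_true = np.array([2.0, 1.0, 0.3])        # "unknown" group fluxes
activities = R @ phi_true                    # idealised measurements
activities *= 1 + 0.02 * np.random.default_rng(3).normal(size=4)

phi_est, residual = nnls(R, activities)      # non-negative unfolding
print(phi_est, residual)
```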

  20. Calisthenics with words: The effect of readability and investor sophistication on investors' performance judgment

    OpenAIRE

    Cui, Xiao Carol

    2016-01-01

    Since the 1990s, the SEC has advocated for financial disclosures to be in “plain English” so that they would be more readable and informative. Past research has shown that high readability is related to more extreme investor judgments of firm performance. Processing fluency is the prevalent theory to explain this: higher readability increases the investor’s subconscious reliance on the disclosure, so positive (negative) news leads to more positive (negative) judgments. The relationship may no...

  1. A computed microtomography method for understanding epiphyseal growth plate fusion

    Science.gov (United States)

    Staines, Katherine A.; Madi, Kamel; Javaheri, Behzad; Lee, Peter D.; Pitsillides, Andrew A.

    2017-12-01

    The epiphyseal growth plate is a developmental region responsible for linear bone growth, in which chondrocytes undertake a tightly regulated series of biological processes. Concomitant with the cessation of growth and sexual maturation, the human growth plate undergoes progressive narrowing, and ultimately disappears. Despite the crucial role of this growth plate fusion ‘bridging’ event, the precise mechanisms by which it is governed are complex and yet to be established. Progress is likely hindered by the current methods for growth plate visualisation; these are invasive and largely rely on histological procedures. Here we describe our non-invasive method utilising synchrotron x-ray computed microtomography for the examination of growth plate bridging, which ultimately leads to its closure coincident with termination of further longitudinal bone growth. We then apply this method to a dataset obtained from a benchtop microcomputed tomography scanner to highlight its potential for wide usage. Furthermore, we conduct finite element modelling at the micron-scale to reveal the effects of growth plate bridging on local tissue mechanics. Employment of these 3D analyses of growth plate bone bridging is likely to advance our understanding of the physiological mechanisms that control growth plate fusion.

  2. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and application experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify the group of most important input variables of a code that has many (tens or hundreds of) input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variables of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has first been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables
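
The codes' internals are not given here, but the generic screening-plus-response-surface idea can be sketched: run the model on sampled inputs, fit a linear response surface to the same runs, and rank inputs by standardized coefficients. The stand-in model and numbers below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x):
    """Stand-in safety-analysis code: x0 and x2 matter, x1 barely does."""
    return 5.0 * x[:, 0] + 0.1 * x[:, 1] + 2.0 * x[:, 2]

n, k = 200, 3
X = rng.normal(size=(n, k))          # sampled uncertain inputs
y = model(X)                         # one code run per sample

# Linear response surface y ~ a0 + a @ x, fitted to the same runs.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
importance = np.abs(coef[1:]) * X.std(axis=0) / y.std()
print("standardized importances:", importance.round(3))
```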

  3. Emerging Computational Methods for the Rational Discovery of Allosteric Drugs.

    Science.gov (United States)

    Wagner, Jeffrey R; Lee, Christopher T; Durrant, Jacob D; Malmstrom, Robert D; Feher, Victoria A; Amaro, Rommie E

    2016-06-08

    Allosteric drug development holds promise for delivering medicines that are more selective and less toxic than those that target orthosteric sites. To date, the discovery of allosteric binding sites and lead compounds has been mostly serendipitous, achieved through high-throughput screening. Over the past decade, structural data has become more readily available for larger protein systems and more membrane protein classes (e.g., GPCRs and ion channels), which are common allosteric drug targets. In parallel, improved simulation methods now provide better atomistic understanding of the protein dynamics and cooperative motions that are critical to allosteric mechanisms. As a result of these advances, the field of predictive allosteric drug development is now on the cusp of a new era of rational structure-based computational methods. Here, we review algorithms that predict allosteric sites based on sequence data and molecular dynamics simulations, describe tools that assess the druggability of these pockets, and discuss how Markov state models and topology analyses provide insight into the relationship between protein dynamics and allosteric drug binding. In each section, we first provide an overview of the various method classes before describing relevant algorithms and software packages.

  4. Computation of rectangular source integral by rational parameter polynomial method

    International Nuclear Information System (INIS)

    Prabha, Hem

    2001-01-01

    Hubbell et al. (J. Res. Natl. Bureau Standards 64C (1960) 121) have obtained a series expansion for the calculation of the radiation field generated by a plane isotropic rectangular source (plaque), in which the leading term is the integral H(a,b). In this paper another integral I(a,b), which is related to the integral H(a,b), is solved by the rational parameter polynomial method. From I(a,b), we compute H(a,b). Using this method, the integral I(a,b) is expressed in the form of a polynomial of a rational parameter. Generally, a function f(x) is expressed in terms of x; in this method it is expressed in terms of x/(1+x). In this way, the accuracy of the expression is good over a wide range of x compared with the earlier approach. The results for I(a,b) and H(a,b) are given for a sixth-degree polynomial and are found to be in good agreement with the results obtained by numerically integrating the integral. Accuracy could be increased either by increasing the degree of the polynomial or by dividing the range of integration. The results for H(a,b) and I(a,b) are given for values of b and a up to 2.0 and 20.0, respectively
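
The paper's key device, expanding in the rational parameter x/(1+x) rather than in x, is easy to demonstrate numerically: a polynomial in t = x/(1+x) remains well behaved as x grows, whereas a polynomial in x degrades over a wide range. The target function below is an arbitrary stand-in, not I(a,b).

```python
import numpy as np

x = np.linspace(0.0, 20.0, 400)     # wide range of x, as in the paper
f = np.arctan(x)                     # stand-in for the integral's behaviour

t = x / (1.0 + x)                    # rational parameter, confined to [0, 1)
p_t = np.polynomial.Polynomial.fit(t, f, deg=6)   # sixth-degree, as above
p_x = np.polynomial.Polynomial.fit(x, f, deg=6)

print("max |err|, polynomial in t:", np.abs(p_t(t) - f).max())
print("max |err|, polynomial in x:", np.abs(p_x(x) - f).max())
```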

  5. Designing and Evaluating Patient Education Pamphlets based on Readability Indexes and Comparison with Literacy Levels of Society

    Directory of Open Access Journals (Sweden)

    Mahdieh Arian

    2016-07-01

    Full Text Available Background: Hundreds of patient education materials, i.e. pamphlets, are published annually in healthcare systems following their design, correction, and revision. Aim: to design and evaluate patient education pamphlets based on readability indexes and to compare them with the literacy level of society. Method: The average literacy level among 500 patients admitted to two training hospitals in Bojnurd (northeastern Iran) was determined in 2014-2015. Afterwards, all patient education pamphlets in both hospitals (n=69) were collected and their readability level was determined. All the pamphlets were then re-designed according to the given standards and in line with the literacy level of society. The SPSS software (Version 20) was used to analyze the data. Results: The average level of literacy among the 500 patients in both hospitals was 6.72±4.34, corresponding to grades six and seven according to the guide to readability indexes. In line with McLaughlin's SMOG Readability Formula, the bulk of the pamphlets (91.3%) were at college level before corrections and revisions based on the given standards, but 23.2% were at a level lower than grade seven following corrections and revisions. Implications for Practice: Evaluation of patient education pamphlets plays an important role in promoting self-care among patients. Given the novelty of the present study in Iran, its results can help patient education researchers identify the strengths and weaknesses of patient education materials, i.e. pamphlets, based on scientific indices, and guide their revision and re-development.
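
McLaughlin's SMOG formula, used above, is simple to implement. The sketch below uses a crude vowel-group syllable counter, so scores can differ slightly from dictionary-based tools; the sample text is invented.

```python
import math
import re

def syllables(word):
    """Crude syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def smog_grade(text):
    """SMOG: 3.1291 + 1.0430 * sqrt(30 * polysyllables / sentences)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    poly = sum(1 for w in words if syllables(w) >= 3)
    return 3.1291 + 1.0430 * math.sqrt(poly * 30.0 / sentences)

sample = ("Take one tablet by mouth every morning. "
          "Contact your physician immediately if irritation develops.")
print(round(smog_grade(sample), 1))
```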

  6. A fast iterative method for computing particle beams penetrating matter

    International Nuclear Information System (INIS)

    Boergers, C.

    1997-01-01

    Beams of microscopic particles penetrating matter are important in several fields. The application motivating our parameter choices in this paper is electron beam cancer therapy. Mathematically, a steady particle beam penetrating matter, or a configuration of several such beams, is modeled by a boundary value problem for a Boltzmann equation. Grid-based discretization of this problem leads to a system of algebraic equations. This system is typically very large because of the large number of independent variables in the Boltzmann equation (six if time independence is the only dimension-reducing assumption). If grid-based methods are to be practical at all, it is therefore necessary to develop fast solvers for the discretized problems. This is the subject of the present paper. For two-dimensional, mono-energetic, linear particle beam problems, we describe an iterative domain decomposition algorithm based on overlapping decompositions of the set of particle directions and computationally demonstrate its rapid, grid independent convergence. There appears to be no fundamental obstacle to generalizing the method to three-dimensional, energy dependent problems. 34 refs., 15 figs., 6 tabs

  7. Global Seabed Materials and Habitats Mapped: The Computational Methods

    Science.gov (United States)

    Jenkins, C. J.

    2016-02-01

    What the seabed is made of has proven difficult to map on the scale of whole ocean basins. Direct sampling and observation can be augmented with proxy-parameter methods such as acoustics. Both avenues are essential to obtain enough detail and coverage, and also to validate the mapping methods. We focus on the direct observations such as samplings, photo and video, probes, diver and sub reports, and surveyed features. These are often in word-descriptive form: over 85% of the records for site materials are in this form, whether as sample/view descriptions or classifications, or described parameters such as consolidation, color, odor, structures and components. Descriptions are absolutely necessary for unusual materials and for processes; in other words, for research. The dbSEABED project not only holds the largest collection of seafloor materials data worldwide, but also uses advanced computational mathematics to obtain the best possible coverage and detail. Among those techniques are linguistic text analysis (e.g., Natural Language Processing, NLP), fuzzy set theory (FST), and machine learning (ML, e.g., Random Forest). These techniques allow efficient and accurate import of huge datasets, thereby optimizing the data that exist. They merge quantitative and qualitative types of data for rich parameter sets, and extrapolate where the data are sparse for the best map production. The dbSEABED data resources are now very widely used worldwide in oceanographic research, environmental management, the geosciences, engineering and surveying.

  8. Semi-coarsening multigrid methods for parallel computing

    Energy Technology Data Exchange (ETDEWEB)

    Jones, J.E.

    1996-12-31

    Standard multigrid methods are not well suited for problems with anisotropic coefficients which can occur, for example, on grids that are stretched to resolve a boundary layer. There are several different modifications of the standard multigrid algorithm that yield efficient methods for anisotropic problems. In the paper, we investigate the parallel performance of these multigrid algorithms. Multigrid algorithms which work well for anisotropic problems are based on line relaxation and/or semi-coarsening. In semi-coarsening multigrid algorithms a grid is coarsened in only one of the coordinate directions unlike standard or full-coarsening multigrid algorithms where a grid is coarsened in each of the coordinate directions. When both semi-coarsening and line relaxation are used, the resulting multigrid algorithm is robust and automatic in that it requires no knowledge of the nature of the anisotropy. This is the basic multigrid algorithm whose parallel performance we investigate in the paper. The algorithm is currently being implemented on an IBM SP2 and its performance is being analyzed. In addition to looking at the parallel performance of the basic semi-coarsening algorithm, we present algorithmic modifications with potentially better parallel efficiency. One modification reduces the amount of computational work done in relaxation at the expense of using multiple coarse grids. This modification is also being implemented with the aim of comparing its performance to that of the basic semi-coarsening algorithm.
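
The distinguishing operator of a semi-coarsening multigrid method is a grid transfer that coarsens one coordinate direction only. A minimal sketch of full-weighting restriction applied in x while the y grid is kept:

```python
import numpy as np

def semi_coarsen_x(u):
    """Full-weighting restriction in x only; y keeps its fine grid.

    u has shape (nx, ny) with nx odd so the coarse points align."""
    return 0.25 * u[0:-2:2, :] + 0.5 * u[1:-1:2, :] + 0.25 * u[2::2, :]

u_fine = np.random.default_rng(0).normal(size=(129, 65))
u_coarse = semi_coarsen_x(u_fine)
print(u_fine.shape, "->", u_coarse.shape)   # (129, 65) -> (64, 65)
```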

  9. Particular application of methods of AdaBoost and LBP to the problems of computer vision

    OpenAIRE

    Волошин, Микола Володимирович

    2012-01-01

    The application of the AdaBoost method and the local binary pattern (LBP) method to different areas of computer vision, such as person identification and computer iridology, is considered in this article. The goal of the research is to develop error-correcting methods and systems for applications of computer vision, and computer iridology in particular. The article also considers the problem of colour spaces, which are used as a filter and as a pre-processing step for images. The AdaBoost method...

  10. Non-unitary probabilistic quantum computing circuit and method

    Science.gov (United States)

    Williams, Colin P. (Inventor); Gingrich, Robert M. (Inventor)

    2009-01-01

    A quantum circuit performing quantum computation in a quantum computer. A chosen transformation of an initial n-qubit state is probabilistically obtained. The circuit comprises a unitary quantum operator obtained from a non-unitary quantum operator, operating on an n-qubit state and an ancilla state. When operation on the ancilla state provides a success condition, computation is stopped. When operation on the ancilla state provides a failure condition, computation is performed again on the ancilla state and the n-qubit state obtained in the previous computation, until a success condition is obtained.
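
The general trick behind such circuits can be illustrated numerically: embed a suitably scaled non-unitary operator M as the top-left block of a larger unitary acting on the state plus an ancilla, so that measuring the ancilla in |0> signals success and leaves a state proportional to M|psi>. This is the textbook unitary-dilation construction, shown as a sketch rather than the patented circuit; on failure, the patent's scheme reprocesses the failure branch and tries again.

```python
import numpy as np

def psd_sqrt(H):
    """Matrix square root of a Hermitian PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.conj().T

def dilate(M):
    """Unitary (Halmos) dilation of a contraction M (largest singular value <= 1)."""
    I = np.eye(M.shape[0])
    return np.block([[M, psd_sqrt(I - M @ M.conj().T)],
                     [psd_sqrt(I - M.conj().T @ M), -M.conj().T]])

M = np.array([[0.6, 0.3],
              [0.0, 0.5]])                 # non-unitary single-qubit operator
U = dilate(M)
print("U unitary:", np.allclose(U @ U.conj().T, np.eye(4)))

psi = np.array([1.0, 0.0])                 # input qubit state
out = U @ np.kron([1.0, 0.0], psi)         # ancilla prepared in |0>
p_success = np.linalg.norm(out[:2]) ** 2   # probability ancilla measures |0>
post = out[:2] / np.sqrt(p_success)        # collapsed state ~ M|psi>
print("success probability:", p_success)
print("post-measurement state:", post)
```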

  11. Nuclear power reactor analysis, methods, algorithms and computer programs

    International Nuclear Information System (INIS)

    Matausek, M.V

    1981-01-01

    Full text: For a developing country buying its first nuclear power plants from a foreign supplier, regardless of the type and scope of the contract, there is a certain number of activities which have to be performed by local staff and domestic organizations. This particularly applies to the choice of the nuclear fuel cycle strategy and the choice of the type and size of the reactors, to bid parameter specification, bid evaluation and final safety analysis report evaluation, as well as to in-core fuel management activities. In the Nuclear Engineering Department of the Boris Kidric Institute of Nuclear Sciences (NET IBK), continual work is going on related to the following topics: cross section and resonance integral calculations, spectrum calculations, generation of group constants, lattice and cell problems, criticality and global power distribution search, fuel burnup analysis, in-core fuel management procedures, cost analysis and power plant economics, safety and accident analysis, shielding problems and environmental impact studies, etc. The present paper gives the details of the methods developed and the results achieved, with particular emphasis on the NET IBK computer program package for the needs of planning, construction and operation of nuclear power plants. The main problems encountered so far were related to a small working team, the lack of large and powerful computers, the absence of reliable basic nuclear data and a shortage of experimental and empirical results for testing theoretical models. Some of these difficulties have been overcome thanks to bilateral and multilateral cooperation with developed countries, mostly through the IAEA. It is the authors' opinion, however, that mutual cooperation of developing countries having similar problems and similar goals could lead to significant results. Some activities of this kind are suggested and discussed. (author)

  12. Readability of Written Materials for CKD Patients: A Systematic Review.

    Science.gov (United States)

    Morony, Suzanne; Flynn, Michaela; McCaffery, Kirsten J; Jansen, Jesse; Webster, Angela C

    2015-06-01

    The "average" patient has a literacy level of US grade 8 (age 13-14 years), but this may be lower for people with chronic kidney disease (CKD). Current guidelines suggest that patient education materials should be pitched at a literacy level of around 5th grade (age 10-11 years). This study aims to evaluate the readability of written materials targeted at patients with CKD. Systematic review. Patient information materials aimed at adults with CKD and written in English. Patient education materials designed to be printed and read, sourced from practices in Australia and online at all known websites run by relevant international CKD organizations during March 2014. Quantitative analysis of readability using Lexile Analyzer and Flesch-Kincaid tools. We analyzed 80 materials. Both Lexile Analyzer and Flesch-Kincaid analyses suggested that most materials required a minimum of grade 9 (age 14-15 years) schooling to read them. Only 5% of materials were pitched at the recommended level (grade 5). Readability formulas have inherent limitations and do not account for visual information. We did not consider other media through which patients with CKD may access information. Although the study covered materials from the United States, United Kingdom, and Australia, all non-Internet materials were sourced locally, and it is possible that some international paper-based materials were missed. Generalizability may be limited due to exclusion of non-English materials. These findings suggest that patient information materials aimed at patients with CKD are pitched above the average patient's literacy level. This issue is compounded by cognitive decline in patients with CKD, who may have lower literacy than the average patient. It suggests that information providers need to consider their audience more carefully when preparing patient information materials, including user testing with a low-literacy patient population. Copyright © 2015 National Kidney Foundation, Inc. Published by

  13. Readability of online patient education materials for velopharyngeal insufficiency.

    Science.gov (United States)

    Xie, Deborah X; Wang, Ray Y; Chinnadurai, Sivakumar

    2018-01-01

    Evaluate the readability of online and mobile application health information about velopharyngeal insufficiency (VPI). Top website and mobile application results for the search terms "velopharyngeal insufficiency", "velopharyngeal dysfunction", "VPI", and "VPD" were analyzed. Readability was determined using 10 algorithms with Readability Studio Professional Edition (Oleander Software Ltd; Vandalia, OH). Subgroup analysis was performed based on search term and article source: academic hospital, general online resource, peer-reviewed journal, or professional organization. Eighteen unique articles were identified. The overall mean reading grade level was 12.89 ± 2.9. The highest reading level among these articles was 15.47, approximately the level of a college senior. Articles from "velopharyngeal dysfunction" had the highest mean reading level (13.73 ± 2.11), above "velopharyngeal insufficiency" (12.30 ± 1.56) and "VPI" (11.66 ± 1.70). Articles from peer-reviewed journals had the highest mean reading level (15.35 ± 2.79), while articles from academic hospitals had the lowest (12.81 ± 1.66). There were statistically significant differences in reading levels between the different search terms. Relative to recommended reading level guidelines, online patient education materials for VPI are disseminated with language too complex for most readers. There is also a lack of VPI-related mobile application data available for patients. Patients will benefit if future updates to websites and disseminated patient information are undertaken with health literacy in mind. Future studies will investigate patient comprehension of these materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Readability, complexity, and suitability analysis of online lymphedema resources.

    Science.gov (United States)

    Tran, Bao Ngoc N; Singh, Mansher; Lee, Bernard T; Rudd, Rima; Singhal, Dhruv

    2017-06-01

    Over 72% of Americans use online health information to assist in health care decision-making. Previous studies of the lymphedema literature have focused only on the reading level of patient-oriented materials online. Findings indicate they are too advanced for most patients to comprehend. This more comprehensive study expands the previous analysis to include critical elements of health materials beyond readability, using assessment tools to report on the complexity and density of data as well as text design, vocabulary, and organization. The top 10 highest-ranked websites on lymphedema were identified using the most popular search engine (Google). Website content was analyzed for readability, complexity, and suitability using the Simple Measure of Gobbledygook (SMOG), PMOSE/iKIRSCH, and the Suitability Assessment of Materials (SAM), respectively. PMOSE/iKIRSCH and SAM were performed by two independent raters. Fleiss' kappa score was calculated to ensure inter-rater reliability. Online lymphedema literature had a reading grade level of 14.0 (SMOG). The overall complexity score was 6.7 (PMOSE/iKIRSCH), corresponding to "low" complexity and requiring an 8th-12th grade education. Fleiss' kappa score was 80% (P = 0.04, "substantial" agreement). The overall suitability score was 45% (SAM), corresponding to the lowest level of "adequate" suitability. Fleiss' kappa score was 76% (P = 0.06, "substantial" agreement). Online resources for lymphedema are above the recommended levels for readability and complexity. The suitability level is barely adequate for the intended audience. Overall, these materials are too sophisticated for the average American adult, whose literacy skills are well documented. Further efforts to revise these materials are needed to improve patient comprehension and understanding. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Justification of computational methods to ensure information management systems

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Summary. Due to the diversity and complexity of the organizational management tasks of a large enterprise, the construction of an information management system requires the establishment of interconnected complexes of means implementing, in the most efficient way, the collection, transfer, accumulation and processing of the information needed by managers of different ranks in the governance process. The main trends in the construction of integrated management information systems can be considered to be: the creation of integrated data processing systems by centralizing the storage and processing of data arrays; the organization of computer systems that realize time-sharing; an aggregate-block principle for the integrated complex; and the use of a wide range of peripheral devices with unified information and hardware interfaces. Main attention is paid to the systematic study of the complex of technical support, in particular the definition of quality criteria for the operation of the technical complex, the development of methods for analyzing the information base of management information systems, the definition of requirements for the technical means, and methods for the structural synthesis of the major subsystems. Thus, the aim is to study the integrated management information system on the basis of a systematic approach, and to develop a number of methods of analysis and synthesis that are suitable for use in the practice of engineering systems design. The objective function of the complex management information system is the task of gathering, transmitting and processing specified amounts of information in regulated time intervals with the required degree of accuracy, while minimizing the reduced costs for the establishment and operation of the technical complex. Achieving this objective function requires a certain organization of the interaction of information

  16. 26 CFR 1.167(b)-0 - Methods of computing depreciation.

    Science.gov (United States)

    2010-04-01

    § 1.167(b)-0 Methods of computing depreciation. (a) In general. Any reasonable and consistently applied method of computing depreciation may be used or continued in use under section 167. Regardless of the...
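
For illustration only, two commonly used depreciation schedules are sketched below; the regulation paragraph above permits any reasonable, consistently applied method, and the 200% declining-balance rate shown is an assumption, not something mandated by this excerpt.

```python
def straight_line(cost, salvage, life):
    """Equal deduction each year over the asset's useful life."""
    return [(cost - salvage) / life] * life

def declining_balance(cost, salvage, life, rate=2.0):
    """Fixed percentage (here 200%) of the remaining basis each year,
    never depreciating below the salvage value."""
    basis, sched = cost, []
    for _ in range(life):
        d = min(basis * rate / life, max(basis - salvage, 0.0))
        sched.append(d)
        basis -= d
    return sched

print(straight_line(10_000, 1_000, 5))
print([round(d) for d in declining_balance(10_000, 1_000, 5)])
```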

  17. Use of digital computers for correction of gamma method and neutron-gamma method indications

    International Nuclear Information System (INIS)

    Lakhnyuk, V.M.

    1978-01-01

    The program for the NAIRI-S computer is described, which is intended for accounting for and eliminating the effect of by-processes when interpreting gamma and neutron-gamma logging indications. With slight modifications, the program can also serve as a mathematical basis for standardizing logging diagrams by the method of multidimensional regression analysis and for estimating rock reservoir properties

  18. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming a representation of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that the current trends toward strengthening the general-education and worldview functions of computer science call for additional research into the…

  19. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  20. Computational Methods for Physical Model Information Management: Opening the Aperture

    International Nuclear Information System (INIS)

    Moser, F.; Kirgoeze, R.; Gagne, D.; Calle, D.; Murray, J.; Crowley, J.

    2015-01-01

    The volume, velocity and diversity of data available to analysts are growing exponentially, increasing the demands on analysts to stay abreast of developments in their areas of investigation. In parallel to the growth in data, technologies have been developed to efficiently process, store, and effectively extract information suitable for the development of a knowledge base capable of supporting inferential (decision logic) reasoning over semantic spaces. These technologies and methodologies, in effect, allow for automated discovery and mapping of information to specific steps in the Physical Model (the Safeguards' standard reference of the Nuclear Fuel Cycle). This paper will describe and demonstrate an integrated service under development at the IAEA that utilizes machine learning techniques, computational natural language models, Bayesian methods and semantic/ontological reasoning capabilities to process large volumes of (streaming) information and associate relevant, discovered information to the appropriate process step in the Physical Model. The paper will detail how this capability will consume open source and controlled information sources and be integrated with other capabilities within the analysis environment, and provide the basis for a semantic knowledge base suitable for hosting future mission focused applications. (author)

  1. THE METHOD OF DESIGNING, ASSISTED ON COMPUTER, OF THE FOOTWEAR SOLES

    Directory of Open Access Journals (Sweden)

    LUCA Cornelia

    2015-05-01

    Full Text Available At the base of footwear sole design is the shoe last. Shoe lasts have irregular shapes, with various curves which cannot be represented by a simple mathematical function. In order to design footwear soles, it is necessary to take some base contours from the shoe last. These contours are obtained with high precision in a 3D CAD system. The paper presents a computer-assisted method of designing soles for footwear. The shoe last is copied using a 3D digitizer: for digitizing, the spatial shape of the shoe last is positioned on the data-gathering peripheral, which automatically follows the shoe last's surface. The wire network obtained through digitizing is numerically interpolated with interpolation functions in order to obtain the spatial numerical shape of the shoe last. The 3D design of the sole is realized on the numerical shape of the shoe last in the following steps: manufacture of the sole's surface; realization of the lateral surface of the sole's shape; obtaining the linking surface between the lateral and plantar sides of the sole, and the sole's margin; and designing the skid-proof area of the sole. The main advantages of the design method are its precision, the visualization of the sole in 3D space, and the possibility of making the best decision regarding the acceptance of a new sole pattern.

  2. A method of paralleling computer calculation for two-dimensional kinetic plasma model

    International Nuclear Information System (INIS)

    Brazhnik, V.A.; Demchenko, V.V.; Dem'yanov, V.G.; D'yakov, V.E.; Ol'shanskij, V.V.; Panchenko, V.I.

    1987-01-01

    A method for parallel computer calculation, and the OSIRIS program complex realizing it, designed for numerical plasma simulation by the macroparticle method, are described. The calculation can be carried out either with one computer or simultaneously with two BESM-6 computers, which is provided by a package of interacting programs running in each computer. Program interaction in each computer is based on the event techniques realized in OS DISPAK. Parallel calculation with two BESM-6 computers speeds up the computation by a factor of 1.5

  3. Readability Assessment of Online Uveitis Patient Education Materials.

    Science.gov (United States)

    Ayoub, Samantha; Tsui, Edmund; Mohammed, Taariq; Tseng, Joseph

    2017-12-29

    To evaluate the readability of online uveitis patient education materials. A Google search in November 2016 was completed using the search terms "uveitis" and "uveitis inflammation." The top 50 websites with patient-centered information were selected and analyzed for readability using the Flesch-Kincaid Grade Level (FKGL), Flesch Reading Ease Score (FRES), Gunning FOG Index (GFI), and Simple Measure of Gobbledygook (SMOG). Statistical analysis was performed with two-tailed t-tests. The top 50 websites had a mean word count of 1162.7 words and averaged 16.2 words per sentence. For these websites, the mean FRES was 38.0 (range 4-66, SD = 12.0), mean FKGL was 12.3 (range 6.8-19, SD = 2.4), mean SMOG score was 14.4 (range 9.8-19, SD = 1.8), and mean Gunning FOG index was 14.0 (range 8.6-19, SD = 2.0). The majority of online patient-directed uveitis materials are written at a higher reading level than that of the average American adult.
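
    Of the four indices this study applies, the Gunning FOG index is among the simplest to reproduce. Below is a minimal Python sketch, not the calculator the authors used: the "complex word" test is a bare three-syllable heuristic, whereas the official definition excludes proper nouns, familiar jargon, and common compounds.

    ```python
    import re

    def _syllables(word):
        # Rough vowel-group heuristic; real tools use dictionary lookups.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def gunning_fog(text):
        """GFI = 0.4 * (words/sentences + 100 * complex_words/words)."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        complex_words = sum(1 for w in words if _syllables(w) >= 3)
        return 0.4 * (len(words) / len(sentences) + 100 * complex_words / len(words))
    ```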

  4. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  5. Computational Fluid Dynamics Methods and Their Applications in Medical Science

    Directory of Open Access Journals (Sweden)

    Kowalewski Wojciech

    2016-12-01

    Full Text Available As defined by the National Institutes of Health: “Biomedical engineering integrates physical, chemical, mathematical, and computational sciences and engineering principles to study biology, medicine, behavior, and health”. Many issues in this area are closely related to fluid dynamics. This paper provides an overview of the basic concepts concerning Computational Fluid Dynamics and its applications in medicine.

  6. The Effect of Technology-Based Altered Readability Levels on Struggling Readers' Science Comprehension

    Science.gov (United States)

    Marino, Matthew T.; Coyne, Michael; Dunn, Michael

    2010-01-01

    This article reports findings from a study examining how altered readability levels affected struggling readers' (N = 288) comprehension of scientific concepts and vocabulary. Specifically, the researchers were interested in learning what effect altered readability levels have when low ability readers participate in a technology-based science…

  7. Readability of Questionnaires Assessing Listening Difficulties Associated with (Central) Auditory Processing Disorders

    Science.gov (United States)

    Atcherson, Samuel R.; Richburg, Cynthia M.; Zraick, Richard I.; George, Cassandra M.

    2013-01-01

    Purpose: Eight English-language, student- or parent proxy-administered questionnaires for (central) auditory processing disorders, or (C)APD, were analyzed for readability. For student questionnaires, readability levels were checked against the approximate reading grade levels by intended administration age per the questionnaires' developers. For…

  8. Can Readability Formulas Be Used to Successfully Gauge Difficulty of Reading Materials?

    Science.gov (United States)

    Begeny, John C.; Greene, Diana J.

    2014-01-01

    A grade level of reading material is commonly estimated using one or more readability formulas, which purport to measure text difficulty based on specified text characteristics. However, there is limited direction for teachers and publishers regarding which readability formulas (if any) are appropriate indicators of actual text difficulty. Because…

  9. Varying Readability of Science-Based Text in Elementary Readers: Challenges for Teachers

    Science.gov (United States)

    Gallagher, Tiffany L.; Fazio, Xavier; Gunning, Thomas G.

    2012-01-01

    This investigation compared readability formulae to publishers' identified reading levels in science-based elementary readers. Nine well-established readability indices were calculated and comparisons were made with the publishers' identified grade designations and between different genres of text. Results revealed considerable variance among the…

  10. A Comparison of Readability in Science-Based Texts: Implications for Elementary Teachers

    Science.gov (United States)

    Gallagher, Tiffany; Fazio, Xavier; Ciampa, Katia

    2017-01-01

    Science curriculum standards were mapped onto various texts (literacy readers, trade books, online articles). Statistical analyses highlighted the inconsistencies among readability formulae for Grades 2-6 levels of the standards. There was a lack of correlation among the readability measures, and also when comparing different text sources. Online…

  11. Measuring the readability of sustainability reports: : A corpus-based analysis through standard formulae and NLP

    NARCIS (Netherlands)

    Smeuninx, N.; De Clerck, B.; Aerts, Walter

    2016-01-01

    This study characterises and problematises the language of corporate reporting along region, industry, genre, and content lines by applying readability formulae and more advanced natural language processing (NLP)–based analysis to a manually assembled 2.75-million-word corpus. Readability formulae

  12. Readability of "Dear Patient" device advisory notification letters created by a device manufacturer.

    Science.gov (United States)

    Mueller, Luke A; Sharma, Arjun; Ottenberg, Abigale L; Mueller, Paul S

    2013-04-01

    In 2006, the Heart Rhythm Society (HRS) recommended that cardiovascular implantable electronic device (CIED) manufacturers use advisory notification letters to communicate with affected patients. To evaluate the readability of the HRS sample "patient device advisory notification" letter and those created by 1 CIED manufacturer. The HRS sample letter and 25 Boston Scientific Corporation letters dated from 2005 through 2011 were evaluated by using 6 readability tests. Readability (Flesch-Kincaid score) of the HRS sample letter was grade level 12.5, and median readability of the device manufacturer letters was grade level 12.8 (range 10.8-18.9). Similar results were obtained by using other readability scales. No letters had readability scores at the National Work Group on Literacy and Health's recommended reading level (fifth grade); the letters' readability exceeded this recommended level by an average of 7.7 grades (95% confidence interval 6.9-8.5). No letters had readability scores at the average reading level of US adults (eighth grade); the letters' readability exceeded this level by an average of 4.7 grades (95% confidence interval 3.9-5.5). The readability of the HRS sample letter and of those created by a CIED manufacturer significantly exceeded the recommended and average US adults' reading skill levels. Such letters are unlikely to be informative to many patients. CIED manufacturers should ensure that advisory letters are comprehensible to most affected patients. Copyright © 2013 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  13. Universally Designed Text on the Web: Towards Readability Criteria Based on Anti-Patterns.

    Science.gov (United States)

    Eika, Evelyn

    2016-01-01

    The readability of web texts affects accessibility. The Web Content Accessibility guidelines (WCAG) state that the recommended reading level should match that of someone who has completed basic schooling. However, WCAG does not give advice on what constitutes an appropriate reading level. Web authors need tools to help composing WCAG compliant texts, and specific criteria are needed. Classic readability metrics are generally based on lengths of words and sentences and have been criticized for being over-simplistic. Automatic measures and classifications of texts' reading levels employing more advanced constructs remain an unresolved problem. If such measures were feasible, what should these be? This work examines three language constructs not captured by current readability indices but believed to significantly affect actual readability, namely, relative clauses, garden path sentences, and left-branching structures. The goal is to see whether quantifications of these stylistic features reflect readability and how they correspond to common readability measures. Manual assessments of a set of authentic web texts for such uses were conducted. The results reveal that texts related to narratives such as children's stories, which are given the highest readability value, do not contain these constructs. The structures in question occur more frequently in expository texts that aim at educating or disseminating information such as strategy and journal articles. The results suggest that language anti-patterns hold potential for establishing a set of deeper readability criteria.

  14. Readability of Air Force Publications: A Criterion Referenced Evaluation. Final Report.

    Science.gov (United States)

    Hooke, Lydia R.; And Others

    In a study of the readability of Air Force regulations, the writer-estimated reading grade level (RGL) for each regulation was rechecked by using the FORCAST readability formula. In four of the seven cases, the regulation writers underestimated the RGL of their regulation by more than one grade level. None of the writers produced a document with…

  15. Readability of the web: a study on 1 billion web pages

    NARCIS (Netherlands)

    de Heus, Marije; Hiemstra, Djoerd

    We have performed a readability study on more than 1 billion web pages. The Automated Readability Index was used to determine the average grade level required to easily comprehend a website. Among the results are that a 16-year-old can easily understand 50% of the web and an 18-year-old can easily…
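
    The Automated Readability Index suits web-scale processing because it needs only character, word, and sentence counts, with no syllable estimation. A minimal Python sketch of the standard formula (not the authors' pipeline):

    ```python
    import re

    def automated_readability_index(text):
        """ARI = 4.71*(chars/words) + 0.5*(words/sentences) - 21.43."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = text.split()
        chars = sum(len(re.sub(r"[^A-Za-z0-9]", "", w)) for w in words)
        return 4.71 * chars / len(words) + 0.5 * len(words) / len(sentences) - 21.43
    ```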

  16. Quality and readability of English-language internet information for aphasia.

    Science.gov (United States)

    Azios, Jamie H; Bellon-Harn, Monica; Dockens, Ashley L; Manchaiah, Vinaya

    2017-08-14

    Little is known about the quality and readability of treatment information in specific neurogenic disorders, such as aphasia. The purpose of this study was to assess quality and readability of English-language Internet information available for aphasia treatment. Forty-three aphasia treatment websites were aggregated using five different country-specific search engines. Websites were then analysed using quality and readability assessments. Statistical calculations were employed to examine website ratings, differences between website origin and quality and readability scores, and correlations between readability instruments. Websites exhibited low quality with few websites obtaining Health On the Net (HON) certification or clear, thorough information as measured by the DISCERN. Regardless of website origin, readability scores were also poor. Approximate educational levels required to comprehend information on aphasia treatment websites ranged from 13 to 16 years of education. Significant differences were found between website origin and readability measures with higher levels of education required to understand information on websites of non-profit organisations. Current aphasia treatment websites were found to exhibit low levels of quality and readability, creating potential accessibility problems for people with aphasia and significant others. Websites including treatment information for aphasia must be improved in order to increase greater information accessibility.

  17. The Readability of Information Literacy Content on Academic Library Web Sites

    Science.gov (United States)

    Lim, Adriene

    2010-01-01

    This article reports on a study addressing the readability of content on academic libraries' Web sites, specifically content intended to improve users' information literacy skills. Results call for recognition of readability as an evaluative component of text in order to better meet the needs of diverse user populations. (Contains 8 tables.)

  18. Analysis of Readability and Interest of Marketing Education Textbooks: Implications for Special Needs Learners.

    Science.gov (United States)

    Jones, Karen H.; And Others

    1993-01-01

    The readability, reading ease, interest level, and writing style of 20 current textbooks in secondary marketing education were evaluated. Readability formulas consistently identified lower reading levels for special needs education, human interest scores were not very reliable information sources, and writing style was also a weak variable. (JOW)

  19. Readability of online patient education materials on adult reconstruction Web sites.

    Science.gov (United States)

    Polishchuk, Daniil L; Hashem, Jenifer; Sabharwal, Sanjeev

    2012-05-01

    Recommended readability of patient education materials is sixth-grade level or lower. Readability of 212 patient education materials pertaining to adult reconstruction topics available from the American Academy of Orthopaedic Surgeons, American Association of Hip and Knee Surgeons, and 3 other specialty and private practitioner Web sites was assessed using the Flesch-Kincaid grade formula. The mean Flesch-Kincaid score was 11.1 (range, 3-26.5). Only 5 (2%) articles had a readability level of sixth grade or lower. Readability of most of the articles for patient education on adult reconstruction Web sites evaluated may be too advanced for a substantial portion of patients. Further studies are needed to assess the optimal readability level of health information on the Internet. Copyright © 2012 Elsevier Inc. All rights reserved.
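
    The Flesch-Kincaid grade formula used throughout these orthopaedic readability studies is straightforward to reproduce. A minimal Python sketch (the syllable counter is a rough heuristic; word processors and dedicated tools implement refined counters):

    ```python
    import re

    def _syllables(word):
        # Rough vowel-group heuristic.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        """FKGL = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(_syllables(w) for w in words)
        return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59
    ```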

  20. Methods of defining ontologies, word disambiguation methods, computer systems, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Tratz, Stephen C [Richland, WA; Gregory, Michelle L [Richland, WA; Chappell, Alan R [Seattle, WA; Whitney, Paul D [Richland, WA; Posse, Christian [Seattle, WA; Baddeley, Robert L [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2011-10-11

    Methods of defining ontologies, word disambiguation methods, computer systems, and articles of manufacture are described according to some aspects. In one aspect, a word disambiguation method includes accessing textual content to be disambiguated, wherein the textual content comprises a plurality of words individually comprising a plurality of word senses, for an individual word of the textual content, identifying one of the word senses of the word as indicative of the meaning of the word in the textual content, for the individual word, selecting one of a plurality of event classes of a lexical database ontology using the identified word sense of the individual word, and for the individual word, associating the selected one of the event classes with the textual content to provide disambiguation of a meaning of the individual word in the textual content.
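
    The patented pipeline itself is not public code, but its core step (choosing a word sense from context against a lexical database) has a classic open baseline: Lesk's gloss-overlap algorithm over WordNet. A minimal sketch using NLTK, offered only as a generic stand-in for the patent's method (requires the `wordnet` and `punkt` corpora):

    ```python
    # pip install nltk; then nltk.download("wordnet") and nltk.download("punkt")
    from nltk.tokenize import word_tokenize
    from nltk.wsd import lesk

    sentence = "The bank can guarantee deposits will eventually cover future tuition costs."
    tokens = word_tokenize(sentence)
    sense = lesk(tokens, "bank", pos="n")   # WordNet Synset chosen by gloss overlap
    print(sense, "-", sense.definition() if sense else "no sense found")
    ```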

  1. Advanced scientific computational methods and their applications to nuclear technologies. (3) Introduction of continuum simulation methods and their applications (3)

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kunugi, Tomoaki

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of weft connecting each realm of nuclear engineering and then an introductory course of advanced scientific computational methods and their applications to nuclear technologies were prepared in serial form. This is the third issue showing the introduction of continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)

  2. A Simple Method for Dynamic Scheduling in a Heterogeneous Computing System

    OpenAIRE

    Žumer, Viljem; Brest, Janez

    2002-01-01

    A simple method for dynamic scheduling on a heterogeneous computing system is proposed in this paper. It was implemented to minimize the parallel program execution time. The proposed method decomposes the program workload into computationally homogeneous subtasks, which may be of different sizes depending on the current load of each machine in the heterogeneous computing system.
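
    The core of such a scheme, splitting a homogeneous workload in proportion to each machine's effective speed, fits in a few lines. A minimal sketch under the assumption that speed estimates come from runtime load monitoring (the numbers are illustrative):

    ```python
    def split_workload(total_units, speeds):
        """Divide homogeneous work units proportionally to per-machine speeds."""
        total_speed = sum(speeds)
        shares = [int(total_units * s / total_speed) for s in speeds]
        shares[-1] += total_units - sum(shares)  # hand the rounding remainder to the last machine
        return shares

    print(split_workload(1000, [1.0, 2.0, 0.5]))  # -> [285, 571, 144]
    ```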

  3. Readability of patient education materials on the American Orthopaedic Society for Sports Medicine website.

    Science.gov (United States)

    Eltorai, Adam E M; Han, Alex; Truntzer, Jeremy; Daniels, Alan H

    2014-11-01

    The recommended readability of patient education materials by the American Medical Association (AMA) and National Institutes of Health (NIH) should be no greater than a sixth-grade reading level. However, online resources may be too complex for some patients to understand, and poor health literacy predicts inferior health-related quality of life outcomes. This study evaluated whether the American Orthopaedic Society for Sports Medicine (AOSSM) website's patient education materials meet recommended readability guidelines for medical information. We hypothesized that the readability of these online materials would have a Flesch-Kincaid formula grade above the sixth grade. All 65 patient education entries of the AOSSM website were analyzed for grade level readability using the Flesch-Kincaid formula, a widely used and validated tool to evaluate the text reading level. The average (standard deviation) readability of all 65 articles was grade level 10.03 (1.44); 64 articles had a readability score above the sixth-grade level, which is the maximum level recommended by the AMA and NIH. Mean readability of the articles exceeded this level by 4.03 grade levels (95% CI, 3.7-4.4; P reading level of US adults. Mean readability of the articles exceeded this level by 2.03 grade levels (95% CI, 1.7-2.4; P online AOSSM patient education materials exceeds the readability level recommended by the AMA and NIH, and is above the average reading level of the majority of US adults. This online information may be of limited utility to most patients due to a lack of comprehension. Our study provides a clear example of the need to improve the readability of specific education material in order to maximize the efficacy of multimedia sources.

  4. [Systematic analysis of the readability of patient information on the websites of clinics for plastic surgery].

    Science.gov (United States)

    Esfahani, B Janghorban; Faron, A; Roth, K S; Schaller, H-E; Medved, F; Lüers, J-C

    2014-12-01

    The Internet is becoming increasing-ly important as a source of information for patients in medical issues. However, many patients have problems to adequately understand texts, especially with medical content. A basic requirement to understand a written text is the read-ability of a text. The aim of the present study was to examine texts on the websites of German -plastic-surgical hospitals with patient information regarding their readability. In this study, the read-ability of texts of 27 major departments of plastic and Hand surgery in Germany was systematically analysed using 5 recognised readability indices. First, texts were searched based on 20 representative key words and themes. Thereafter, texts were assigned to one of 3 major themes in order to enable statistical analysis. In addition to the 5 readability indices, further objective text parameters were also recorded. Overall, 288 texts were found for analyzation. Most articles were found on the topic of "handsurgery" (n=124), less were found for "facial plastic surgery" (n=80) and "flaps, breast and reconstructive surgery" (n=84). Consistently, all readability indices showed a poor readability for the vast majority of analysed texts with the text appearing readable only for readers with a higher educational level. No significant differences in readability were found between the 3 major themes. Especially in the communication of medical information, it is important to consider the knowledge and education of the addressee. The texts studied consistently showed a readability that is understandable only for academics. Thus, a large part of the intended target group is probably not reached. In order to adequately deliver online information material, a revision of the analysed internet texts appears to be recommendable. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Clearly written, easily comprehended? The readability of websites providing information on epilepsy.

    Science.gov (United States)

    Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele

    2015-03-01

    There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-language websites that provide information on epilepsy in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, was analyzed for its overall level of readability and the corresponding academic grade level needed to comprehend the published texts on the first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension, while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of the websites was indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This can be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted to patients and nonmedical users should be added to articles published in specialist websites and Wikipedia to ease readability. Copyright © 2014 Elsevier Inc. All rights reserved.
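
    The Flesch Reading Ease score that anchors this study maps text onto a 0-100 scale, where higher is easier; the reported 44.0 and 25.6 both fall in the "difficult" band. A minimal Python sketch with the usual rough syllable heuristic (not the study's instrument):

    ```python
    import re

    def _syllables(word):
        # Rough vowel-group heuristic.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        """FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syll = sum(_syllables(w) for w in words)
        return 206.835 - 1.015 * len(words) / len(sentences) - 84.6 * syll / len(words)
    ```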

  6. Modelling of elementary computer operations using the intellect method

    Energy Technology Data Exchange (ETDEWEB)

    Shabanov-kushnarenko, Yu P

    1982-01-01

    A formal apparatus of intellect theory is used to describe the functions of machine intelligence. A mathematical description of some simple computer operations is proposed, as well as their machine realization as switching networks. 5 references.

  7. Pair Programming as a Modern Method of Teaching Computer Science

    OpenAIRE

    Irena Nančovska Šerbec; Branko Kaučič; Jože Rugelj

    2008-01-01

    At the Faculty of Education, University of Ljubljana, we educate future computer science teachers. Besides didactic, pedagogical, mathematical and other interdisciplinary knowledge, students gain programming knowledge and skills that are crucial for computer science teachers. Across all courses, the main emphasis is on the acquisition of professional competences related to the teaching profession and the programming profile. The latter are selected according to the well-known document, the ACM C...

  8. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-01

    research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing

  9. High performance computing and quantum trajectory method in CPU and GPU systems

    International Nuclear Information System (INIS)

    Wiśniewska, Joanna; Sawerwain, Marek; Leoński, Wiesław

    2015-01-01

    Nowadays, dynamic progress in computational techniques allows for the development of various methods which offer significant speed-up of computations, especially those related to problems of quantum optics and quantum computing. In this work, we propose computational solutions which re-implement the quantum trajectory method (QTM) algorithm in modern parallel computation environments in which multi-core CPUs and modern many-core GPUs can be used. In consequence, new computational routines are developed in a more effective way than those applied in other commonly used packages, such as the Quantum Optics Toolbox (QOT) for Matlab or QuTiP for Python
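
    For readers unfamiliar with the quantum trajectory method being parallelized here, the serial kernel is short: evolve the state under a non-Hermitian effective Hamiltonian and apply stochastic jumps. A minimal NumPy sketch for a single decaying two-level atom (hbar = 1, first-order integration; real packages such as QuTiP use adaptive solvers):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    gamma, dt, steps, ntraj = 1.0, 0.001, 5000, 200
    C = np.sqrt(gamma) * np.array([[0, 1], [0, 0]], complex)  # lowering (jump) operator
    Heff = -0.5j * C.conj().T @ C                             # non-Hermitian effective Hamiltonian

    def trajectory():
        psi = np.array([0, 1], complex)                 # start in the excited state
        pop = np.empty(steps)
        for k in range(steps):
            dp = dt * np.real(psi.conj() @ (C.conj().T @ C) @ psi)  # jump probability
            if rng.random() < dp:
                psi = C @ psi                           # quantum jump: emit a photon
            else:
                psi = psi - 1j * dt * (Heff @ psi)      # deterministic no-jump step
            psi /= np.linalg.norm(psi)
            pop[k] = abs(psi[1]) ** 2
        return pop

    # Ensemble average over trajectories approximates the master-equation
    # result exp(-gamma * t) for the excited-state population.
    avg = np.mean([trajectory() for _ in range(ntraj)], axis=0)
    ```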

  10. Computational methods in several fields of radiation dosimetry

    International Nuclear Information System (INIS)

    Paretzke, Herwig G.

    2010-01-01

    Full text: Radiation dosimetry has to cope with a wide spectrum of applications and requirements in time and size. The ubiquitous presence of various radiation fields or radionuclides in the human home, working, urban or agricultural environment can lead to various dosimetric tasks, from radioecology, retrospective and predictive dosimetry, and personal dosimetry, up to measurements of radionuclide concentrations in environmental and food products and, finally, in persons and their excreta. In all these fields, measurements and computational models for the interpretation or understanding of observations are employed explicitly or implicitly. In this lecture some examples of the author's own computational models will be given from the various dosimetric fields, including a) radioecology (e.g. with the code systems based on ECOSYS, which was developed well before the Chernobyl reactor accident and tested thoroughly afterwards), b) internal dosimetry (improved metabolism models based on our own data), c) external dosimetry (with the new ICRU-ICRP voxel phantom developed by our lab), d) radiation therapy (with GEANT IV as applied to mixed reactor radiation incident on individualized voxel phantoms), and e) some aspects of nanodosimetric track structure computations (not dealt with in the other presentation of this author). Finally, some general remarks will be made on the high explicit or implicit importance of computational models in radiation protection and other research fields dealing with large systems, as well as on the good scientific practices which should generally be followed when developing and applying such computational models

  11. A fast computing method to distinguish the hyperbolic trajectory of a non-autonomous system

    Science.gov (United States)

    Jia, Meng; Fan, Yang-Yu; Tian, Wei-Jian

    2011-03-01

    Attempting to find a fast computing method for the DHT (distinguished hyperbolic trajectory), this study first proves that the errors of the stable DHT can be ignored in the normal direction when they are computed as the trajectories extend. This conclusion means that the stable flow with perturbation will approach the real trajectory as it extends over time. Based on this theory, and combined with the improved DHT computing method, this paper reports a new fast computing method for the DHT, which increases the DHT computing speed without decreasing its accuracy. Project supported by the National Natural Science Foundation of China (Grant No. 60872159).

  12. A fast computing method to distinguish the hyperbolic trajectory of a non-autonomous system

    International Nuclear Information System (INIS)

    Jia Meng; Fan Yang-Yu; Tian Wei-Jian

    2011-01-01

    Attempting to find a fast computing method for the DHT (distinguished hyperbolic trajectory), this study first proves that the errors of the stable DHT can be ignored in the normal direction when they are computed as the trajectories extend. This conclusion means that the stable flow with perturbation will approach the real trajectory as it extends over time. Based on this theory, and combined with the improved DHT computing method, this paper reports a new fast computing method for the DHT, which increases the DHT computing speed without decreasing its accuracy. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  13. Decomposition and Cross-Product-Based Method for Computing the Dynamic Equation of Robots

    Directory of Open Access Journals (Sweden)

    Ching-Long Shih

    2012-08-01

    Full Text Available This paper aims to demonstrate a clear relationship between the Lagrange and Newton-Euler equations regarding computational methods for robot dynamics, from which we derive a systematic method suitable for either symbolic or on-line numerical computation. Based on the decomposition approach and the cross-product operation, a computing method for robot dynamics can easily be developed. The advantages of this computing framework are that it can be used for both symbolic and on-line numeric computation purposes, and that it can also be applied to biped systems as well as some simple closed-chain robot systems.
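
    For orientation, both formalisms ultimately evaluate the same joint-space equation of motion; the sketch below states the standard relationship (notation is assumed, not taken from the paper):

    ```latex
    % Lagrangian form on the left, the joint-space dynamic equation both
    % formalisms yield on the right.
    \frac{d}{dt}\frac{\partial \mathcal{L}}{\partial \dot{q}}
      - \frac{\partial \mathcal{L}}{\partial q} = \tau ,
    \qquad \mathcal{L}(q,\dot{q}) = T(q,\dot{q}) - V(q)
    \quad\Longrightarrow\quad
    \tau = M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q)
    ```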

  14. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  15. Simplified method of computation for fatigue crack growth

    International Nuclear Information System (INIS)

    Stahlberg, R.

    1978-01-01

    A procedure is described for drastically reducing the computation time in calculating crack growth for variable-amplitude fatigue loading when the loading sequence is periodic. In the proposed procedure, the crack growth, r, per loading block is approximated as a smooth function and its reciprocal is integrated, rather than summing crack growth cycle by cycle. The savings in computation time result because only a few pointwise values of r must be computed to generate an accurate interpolation function for numerical integration. Further time savings can be achieved by selecting the stress intensity coefficient (stress intensity divided by load) as the argument of r. Once r has been obtained as a function of the stress intensity coefficient for a given material, environment, and loading sequence, it applies to any configuration of cracked structure. (orig.)
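
    The time saving comes from treating growth per loading block, r, as a smooth function of crack length and integrating its reciprocal instead of summing block by block. A minimal Python sketch with hypothetical r(a) sample points (the numbers are illustrative only, not from the report):

    ```python
    import numpy as np

    # Hypothetical growth-per-block values r (mm/block), each obtained from one
    # detailed cycle-by-cycle analysis of the periodic loading sequence.
    a_pts = np.array([5.0, 10.0, 20.0, 40.0])        # crack length (mm)
    r_pts = np.array([1e-4, 4e-4, 1.6e-3, 6.4e-3])   # growth per block (mm)

    a_grid = np.linspace(5.0, 40.0, 500)
    r_smooth = np.interp(a_grid, a_pts, r_pts)       # smooth interpolant of r(a)
    blocks = np.trapz(1.0 / r_smooth, a_grid)        # N = integral of da / r(a)
    print(f"approx. {blocks:.0f} loading blocks from a = 5 mm to 40 mm")
    ```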

  16. A hybrid method for the parallel computation of Green's functions

    DEFF Research Database (Denmark)

    Petersen, Dan Erik; Li, Song; Stokbro, Kurt

    2009-01-01

    Because of the large number of times this calculation needs to be performed, it is computationally very expensive even on supercomputers. The classical approach is based on recurrence formulas which cannot be efficiently parallelized. This practically prevents the solution of large problems with hundreds of thousands of atoms. We propose new recurrences for a general class of sparse matrices to calculate the Green's and lesser Green's function matrices, extending formulas derived by Takahashi and others. We show that these recurrences may lead to a dramatically reduced computational cost because they only require computing a small number of entries of the inverse matrix. We then propose a parallelization strategy for block-tridiagonal matrices which involves a combination of Schur complement calculations and cyclic reduction. It achieves good scalability even on problems of modest size.
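
    As background to what the paper parallelizes, the classical serial recurrences for the diagonal blocks of the inverse of a block-tridiagonal matrix look as follows. A minimal NumPy sketch with dense blocks for clarity; this is the textbook recursion, not the paper's Schur-complement/cyclic-reduction scheme:

    ```python
    import numpy as np

    def diag_blocks_of_inverse(D, L, U):
        """Diagonal blocks of inv(A) for block-tridiagonal A with diagonal blocks
        D[0..n-1], sub-diagonal blocks L[0..n-2], super-diagonal blocks U[0..n-2]."""
        n = len(D)
        gL = [np.linalg.inv(D[0])]             # left-connected Green's functions
        for i in range(1, n):
            gL.append(np.linalg.inv(D[i] - L[i-1] @ gL[i-1] @ U[i-1]))
        G = [None] * n
        G[n-1] = gL[n-1]
        for i in range(n - 2, -1, -1):         # backward sweep fills the diagonal
            G[i] = gL[i] + gL[i] @ U[i] @ G[i+1] @ L[i] @ gL[i]
        return G
    ```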

  17. Comparison of four classification methods for brain-computer interface

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Bobrov, P.

    2011-01-01

    Roč. 21, č. 2 (2011), s. 101-115 ISSN 1210-0552 R&D Projects: GA MŠk(CZ) 1M0567; GA ČR GA201/05/0079; GA ČR GAP202/10/0262 Institutional research plan: CEZ:AV0Z10300504 Keywords : brain computer interface * motor imagery * visual imagery * EEG pattern classification * Bayesian classification * Common Spatial Patterns * Common Tensor Discriminant Analysis Subject RIV: IN - Informatics, Computer Science Impact factor: 0.646, year: 2011

  18. Privacy Policies for Apps Targeted Toward Youth: Descriptive Analysis of Readability

    Science.gov (United States)

    Das, Gitanjali; Cheung, Cynthia; Nebeker, Camille; Bietz, Matthew

    2018-01-01

    Background Due to the growing availability of consumer information, the protection of personal data is of increasing concern. Objective We assessed readability metrics of privacy policies for apps that are either available to or targeted toward youth to inform strategies to educate and protect youth from unintentional sharing of personal data. Methods We reviewed the 1200 highest ranked apps from the Apple and Google Play Stores and systematically selected apps geared toward youth. After applying exclusion criteria, 99 highly ranked apps geared toward minors remained, 64 of which had a privacy policy. We obtained and analyzed these privacy policies using reading grade level (RGL) as a metric. Policies were further compared as a function of app category (free vs paid; entertainment vs social networking vs utility). Results Analysis of privacy policies for these 64 apps revealed an average RGL of 12.78, which is well above the average reading level (8.0) of adults in the United States. There was also a small but statistically significant difference in word count as a function of app category (entertainment: 2546 words, social networking: 3493 words, and utility: 1038 words; P=.02). Conclusions Although users must agree to privacy policies to access digital tools and products, readability analyses suggest that these agreements are not comprehensible to most adults, let alone youth. We propose that stakeholders, including pediatricians and other health care professionals, play a role in educating youth and their guardians about the use of Web-based services and potential privacy risks, including the unintentional sharing of personal data. PMID:29301737

  19. The Extrapolation-Accelerated Multilevel Aggregation Method in PageRank Computation

    Directory of Open Access Journals (Sweden)

    Bing-Yuan Pu

    2013-01-01

    Full Text Available An accelerated multilevel aggregation method is presented for calculating the stationary probability vector of an irreducible stochastic matrix in PageRank computation, where the vector extrapolation method is its accelerator. We show how to periodically combine the extrapolation method together with the multilevel aggregation method on the finest level for speeding up the PageRank computation. Detailed numerical results are given to illustrate the behavior of this method, and comparisons with the typical methods are also made.
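
    A minimal sketch of the idea: power iteration with a periodic componentwise Aitken delta-squared step standing in for the paper's vector extrapolation (the multilevel aggregation part is omitted; P is assumed column-stochastic):

    ```python
    import numpy as np

    def pagerank_extrapolated(P, alpha=0.85, tol=1e-10, every=10, max_iter=1000):
        """Power iteration for the PageRank vector, periodically accelerated."""
        n = P.shape[0]
        v = np.full(n, 1.0 / n)
        xs = [v.copy()]
        for k in range(max_iter):
            x = alpha * P @ xs[-1] + (1 - alpha) * v   # power-iteration step
            if k > 0 and k % every == 0 and len(xs) >= 2:
                x0, x1, x2 = xs[-2], xs[-1], x         # Aitken delta-squared step
                d = x2 - 2 * x1 + x0
                ok = np.abs(d) > 1e-14                 # avoid division by ~0
                x = x2.copy()
                x[ok] = x2[ok] - (x2[ok] - x1[ok]) ** 2 / d[ok]
                x = np.abs(x) / np.abs(x).sum()        # keep it a probability vector
            if np.abs(x - xs[-1]).sum() < tol:
                return x
            xs.append(x)
        return xs[-1]
    ```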

  20. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    B.Bavishna*1, Mrs.M.Agalya2 & Dr.G.Kavitha3

    2018-01-01

    A lot of research has been done in the field of cloud computing. For effective performance, a variety of algorithms has been proposed. The role of virtualization is significant, and performance depends on VM migration and allocation. Much energy is consumed in the cloud; therefore, numerous algorithms are needed for saving energy and enhancing efficiency. In the proposed work, a green algorithm has been considered with ...

  1. Readability of Trauma-Related Patient Education Materials From the American Academy of Orthopaedic Surgeons.

    Science.gov (United States)

    Eltorai, Adam E M; P Thomas, Nathan; Yang, Heejae; Daniels, Alan H; Born, Christopher T

    2016-02-01

    According to the American Medical Association (AMA) and the National Institutes of Health (NIH), the recommended readability of patient education materials should be no greater than a sixth-grade reading level. The online patient education information produced by the American Academy of Orthopaedic Surgeons (AAOS) may be too complicated for some patients to understand. This study evaluated whether the AAOS's online trauma-related patient education materials meet recommended readability guidelines for medical information. Ninety-nine articles from the "Broken Bones and Injuries" section of the AAOS-produced patient education website, orthoinfo.org, were analyzed for grade-level readability using the Flesch-Kincaid formula, a widely used and validated tool to evaluate text reading level. Results for each webpage were compared to the AMA/NIH-recommended sixth-grade reading level and the average reading level of U.S. adults (eighth grade). The mean (SD) grade-level readability for all patient education articles was 8.8 (1.1). All but three of the articles had a readability score above the sixth-grade level; the readability of the articles exceeded this level by an average of 2.8 grade levels (95% confidence interval, 2.6-3.0). The articles also exceeded the average reading skill level of U.S. adults (eighth grade) by nearly an entire grade level (95% confidence interval, 0.6-1.0). Articles on this patient education website thus have readability levels that may make comprehension difficult for a substantial portion of the patient population.

  2. Readability of Online Patient Education Materials From the AAOS Web Site

    Science.gov (United States)

    Badarudeen, Sameer; Unes Kunju, Shebna

    2008-01-01

    One of the goals of the American Academy of Orthopaedic Surgeons (AAOS) is to disseminate patient education materials that suit the readability skills of the patient population. According to standard guidelines from healthcare organizations, the readability of patient education materials should be no higher than the sixth-grade level. We hypothesized that the readability level of patient education materials available on the AAOS Web site would be higher than the recommended grade level, regardless of when the material became available online. Readability scores of all articles from the AAOS Internet-based patient information Web site, "Your Orthopaedic Connection," were determined using the Flesch-Kincaid grade formula. The mean Flesch-Kincaid grade level of the 426 unique articles was 10.43. Only 10 (2%) of the articles had the recommended readability level of sixth grade or lower. The readability of the articles did not change with time. Our findings suggest the majority of the patient education materials available on the AAOS Web site had readability scores that may be too difficult for comprehension by a substantial portion of the patient population. PMID:18324452

  3. Quality and Readability of English-Language Internet Information for Voice Disorders.

    Science.gov (United States)

    Dueppen, Abigail J; Bellon-Harn, Monica L; Radhakrishnan, Nandhakumar; Manchaiah, Vinaya

    2017-12-15

    The purpose of this study is to evaluate the readability and quality of English-language Internet information related to vocal hygiene, vocal health, and prevention of voice disorders. This study extends recent work because it evaluates readability, content quality, and website origin across broader search criteria than previous studies evaluating online voice material. Eighty-five websites were aggregated using five different country-specific search engines. Websites were then analyzed using quality and readability assessments. The entire web page was evaluated; however, no information or links beyond the first page was reviewed. Statistical calculations were employed to examine website ratings, differences between website origin and quality and readability scores, and correlations between readability instruments. Websites exhibited acceptable quality as measured by the DISCERN. However, only one website obtained the Health On the Net certification. Significant differences in quality were found among website origin, with government websites receiving higher quality ratings. Approximate educational levels required to comprehend information on the websites ranged from 8 to 9 years of education. Significant differences were found between website origin and readability measures with higher levels of education required to understand information on websites of nonprofit organizations. Current vocal hygiene, vocal health, and prevention of voice disorders websites were found to exhibit acceptable levels of quality and readability. However, highly rated Internet information related to voice care should be made more accessible to voice clients through Health On the Net certification. Published by Elsevier Inc.

  4. Readability evaluation of Internet-based patient education materials related to the anesthesiology field.

    Science.gov (United States)

    De Oliveira, Gildasio S; Jung, Michael; Mccaffery, Kirsten J; McCarthy, Robert J; Wolf, Michael S

    2015-08-01

    The main objective of the current investigation was to assess the readability of Internet-based patient education materials related to the field of anesthesiology. We hypothesized that the majority of patient education materials would not be written according to the currently recommended readability grade level. Online patient education materials describing procedures, risks, and management of anesthesia-related topics were identified using the search engine Google (available at www.google.com) with the terms anesthesia, anesthesiology, anesthesia risks, and anesthesia care. Cross-sectional evaluation. Assessments of content readability were performed using validated instruments (the Flesch-Kincaid Grade Formula, the Gunning Frequency of Gobbledygook, the New Dale-Chall Test, the Fry graph, and the Flesch Reading Ease score). Ninety-six Web sites containing Internet patient education materials (IPEMs) were evaluated. The median (interquartile range) readability grade level for all evaluated IPEMs was 13.5 (12.0-14.6). All the evaluated documents were classified at a greater readability level than the currently recommended readability grade. Internet-based patient education materials related to the field of anesthesiology are thus currently written far above the recommended readability grade level. The high complexity of written education materials likely limits access to information for millions of American patients. Redesign of the online content of Web sites that provide patient education material regarding anesthesia could be an important step in improving access to information for patients with poor health literacy. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Computational methods to dissect cis-regulatory transcriptional ...

    Indian Academy of Sciences (India)

    The formation of diverse cell types from an invariant set of genes is governed by biochemical and molecular processes that regulate gene activity. A complete understanding of the regulatory mechanisms of gene expression is a major goal of genomics. Computational genomics is a rapidly emerging area for ...

  6. Computer methods in designing tourist equipment for people with disabilities

    Science.gov (United States)

    Zuzda, Jolanta GraŻyna; Borkowski, Piotr; Popławska, Justyna; Latosiewicz, Robert; Moska, Eleonora

    2017-11-01

    Modern technologies enable disabled people to enjoy physical activity every day. Many new structures are created and matched individually for people who enjoy active tourism, giving them wider opportunities for active leisure. The process of creating this type of device, at every stage from initial design through assessment to validation, is assisted by various types of computer support software.

  7. New design methods for computer aided architectural design methodology teaching

    NARCIS (Netherlands)

    Achten, H.H.

    2003-01-01

    Architects and architectural students are exploring new ways of designing using Computer Aided Architectural Design software. This exploration is seldom backed up from a design methodological viewpoint. In this paper, a design methodological framework for reflection on innovative design processes by

  8. Computational methods for more fuel-efficient ship

    NARCIS (Netherlands)

    Koren, B.

    2008-01-01

    The flow of water around a ship powered by a combustion engine is a key factor in the ship's fuel consumption. The simulation of flow patterns around ship hulls is therefore an important aspect of ship design. While lengthy computations are required for such simulations, research by Jeroen Wackers

  9. New Methods of Mobile Computing: From Smartphones to Smart Education

    Science.gov (United States)

    Sykes, Edward R.

    2014-01-01

    Every aspect of our daily lives has been touched by the ubiquitous nature of mobile devices. We have experienced an exponential growth of mobile computing--a trend that seems to have no limit. This paper provides a report on the findings of a recent offering of an iPhone Application Development course at Sheridan College, Ontario, Canada. It…

  10. An affective music player: Methods and models for physiological computing

    NARCIS (Netherlands)

    Janssen, J.H.; Westerink, J.H.D.M.; van den Broek, Egon

    2009-01-01

    Affective computing is embraced by many to create more intelligent systems and smart environments. In this thesis, a specific affective application is envisioned: an affective physiological music player (APMP), which should be able to direct its user's mood. In a first study, the relationship

  11. Computer Facilitated Mathematical Methods in Chemical Engineering--Similarity Solution

    Science.gov (United States)

    Subramanian, Venkat R.

    2006-01-01

    High-performance computers coupled with highly efficient numerical schemes and user-friendly software packages have helped instructors to teach numerical solutions and analysis of various nonlinear models more efficiently in the classroom. One of the main objectives of a model is to provide insight about the system of interest. Analytical…

  12. All for One: Integrating Budgetary Methods by Computer.

    Science.gov (United States)

    Herman, Jerry J.

    1994-01-01

    With the advent of high speed and sophisticated computer programs, all budgetary systems can be combined in one fiscal management information system. Defines and provides examples for the four budgeting systems: (1) function/object; (2) planning, programming, budgeting system; (3) zero-based budgeting; and (4) site-based budgeting. (MLF)

  13. Method for quantitative assessment of nuclear safety computer codes

    International Nuclear Information System (INIS)

    Dearien, J.A.; Davis, C.B.; Matthews, L.J.

    1979-01-01

    A procedure has been developed for the quantitative assessment of nuclear safety computer codes and tested by comparison of RELAP4/MOD6 predictions with results from two Semiscale tests. This paper describes the developed procedure, the application of the procedure to the Semiscale tests, and the results obtained from the comparison

  14. A Parameter Estimation Method for Dynamic Computational Cognitive Models

    NARCIS (Netherlands)

    Thilakarathne, D.J.

    2015-01-01

    A dynamic computational cognitive model can be used to explore a selected complex cognitive phenomenon by providing some features or patterns over time. More specifically, it can be used to simulate, analyse and explain the behaviour of such a cognitive phenomenon. It generates output data in the

  15. Computed radiography imaging plates and associated methods of manufacture

    Science.gov (United States)

    Henry, Nathaniel F.; Moses, Alex K.

    2015-08-18

    Computed radiography imaging plates incorporating an intensifying material that is coupled to or intermixed with the phosphor layer, allowing electrons and/or low energy x-rays to impart their energy on the phosphor layer, while decreasing internal scattering and increasing resolution. The radiation needed to perform radiography can also be reduced as a result.

  16. Verifying a computational method for predicting extreme ground motion

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  17. A comparison of methods for the assessment of postural load and duration of computer use

    NARCIS (Netherlands)

    Heinrich, J.; Blatter, B.M.; Bongers, P.M.

    2004-01-01

Aim: To compare two different methods for assessment of postural load and duration of computer use in office workers. Methods: The study population consisted of 87 computer workers. Questionnaire data about exposure were compared with exposures measured by a standardised or objective method. Measuring

  18. Minimizing the Free Energy: A Computer Method for Teaching Chemical Equilibrium Concepts.

    Science.gov (United States)

    Heald, Emerson F.

    1978-01-01

    Presents a computer method for teaching chemical equilibrium concepts using material balance conditions and the minimization of the free energy. Method for the calculation of chemical equilibrium, the computer program used to solve equilibrium problems and applications of the method are also included. (HM)
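
    To make the idea concrete: a minimal sketch (Python with SciPy, not Heald's original program) of equilibrium by constrained free-energy minimization, here for the N2O4 ⇌ 2 NO2 system. The standard-state Gibbs energies are textbook values and the whole setup is illustrative only; the material balance enters as the linear constraint A·n = b, exactly the role it plays in the method described above.

```python
# A minimal sketch of equilibrium by constrained free-energy minimization,
# in the spirit of the method above (not Heald's original program).
# System: N2O4 <-> 2 NO2 at 1 atm, 298 K; mu0 are textbook values,
# for illustration only.
import numpy as np
from scipy.optimize import minimize

RT = 8.314e-3 * 298.15                      # kJ/mol
mu0 = np.array([97.9, 51.3]) / RT           # mu0/RT for [N2O4, NO2]
A = np.array([[2.0, 1.0]])                  # N-atom balance: 2 n(N2O4) + n(NO2)
b = np.array([2.0])                         # start from 1 mol N2O4

def gibbs(n):                               # G/RT for an ideal-gas mixture
    return float(n @ (mu0 + np.log(n / n.sum())))

res = minimize(gibbs, x0=np.array([0.5, 1.0]),
               bounds=[(1e-10, None)] * 2,
               constraints={"type": "eq", "fun": lambda n: A @ n - b})
print("equilibrium moles [N2O4, NO2]:", res.x)
```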

  19. Improving the readability of online foot and ankle patient education materials.

    Science.gov (United States)

    Sheppard, Evan D; Hyde, Zane; Florence, Mason N; McGwin, Gerald; Kirchner, John S; Ponce, Brent A

    2014-12-01

Previous studies have shown the need for improving the readability of many patient education materials to increase patient comprehension. This study's purpose was to determine the readability of foot and ankle patient education materials and to determine the extent to which readability can be improved. We hypothesized that the reading levels would be above the recommended guidelines and that decreasing sentence length would also decrease the reading level of these patient education materials. Patient education materials from online public sources were collected. The readability of these articles was assessed by a readability software program. The detailed instructions provided by the National Institutes of Health (NIH) were then used as a guideline for performing edits to help improve the readability of selected articles. The most quantitative guideline, shortening all sentences to fewer than 15 words, was chosen to show the effect of following the NIH recommendations. The reading levels of the sampled articles were above the sixth- to seventh-grade recommendations of the NIH. The MedlinePlus website, which is part of the NIH website, had the lowest reading level (8.1). The edited articles had an average reduction of 1.41 grade levels, with the smallest reduction (0.65) in the MedlinePlus articles. Providing detailed instructions to the authors writing these patient education articles and implementing editing techniques based on previous recommendations could lead to an improvement in the readability of patient education materials. This study provides authors of patient education materials with simple editing techniques that will allow for the improvement in the readability of online patient educational materials. The improvement in readability will provide patients with more comprehensible education materials that can strengthen patient awareness of medical problems and treatments. © The Author(s) 2014.
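
    The most quantitative edit rule above (sentences under 15 words) is easy to automate. A rough sketch, assuming a naive regex sentence splitter rather than the commercial software used in the study:

```python
# Flag every sentence of 15 or more words, per the NIH-derived edit rule
# above. The regex sentence splitter is a naive stand-in for the software
# used in the study.
import re

def long_sentences(text, limit=15):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [(len(s.split()), s) for s in sentences if len(s.split()) >= limit]

sample = ("Plantar fasciitis is an inflammation of the thick band of tissue "
          "that runs across the bottom of the foot and connects the heel "
          "bone to the toes. Rest helps.")
for n_words, s in long_sentences(sample):
    print(f"{n_words} words: {s}")
```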

  20. Readability assessment of package inserts of biological medicinal products from the European medicines agency website.

    Science.gov (United States)

    Piñero-López, Ma Ángeles; Modamio, Pilar; Lastra, Cecilia F; Mariño, Eduardo L

    2014-07-01

Package inserts that accompany medicines are a common source of information aimed at patients and should match patient abilities in terms of readability. Our objective was to determine the degree of readability of the package inserts for biological medicinal products commercially available in 2007 and compare them with the readability of the same package inserts in 2010. A total of 33 package inserts were selected and classified into five groups according to the type of medicine: monoclonal antibody-based products, cytokines, therapeutic enzymes, recombinant blood factors and other blood-related products, and recombinant hormones. The package inserts were downloaded from the European Medicines Agency website in 2007 and 2010. Readability was evaluated for the entire text of five of the six sections of the package inserts and for the 'Annex' when there was one. Three readability formulas were used: SMOG (Simple Measure of Gobbledygook) grade, Flesch-Kincaid grade level, and Szigriszt's perspicuity index. No significant differences were found between the readability results for the 2007 package inserts and those from 2010 according to any of the three readability indices studied (p>0.05). However, there were significant differences (p<0.05) between the readability scores of the sections of the package inserts in both 2007 and 2010. The readability of the package inserts was above the recommended sixth grade reading level (ages 11-12) and may lead to difficulties of understanding for people with limited literacy. All the sections should be easy to read and, therefore, the readability of the medicine package inserts studied should be improved.
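
    For reference, the three indices named above can be computed from aggregate text counts. A sketch using the standard published constants (worth verifying against the exact software the study used):

```python
# The three indices used in the study, computed from aggregate counts.
# Constants are the standard published ones; verify before relying on them.
from math import sqrt

def fkgl(words, sentences, syllables):
    """Flesch-Kincaid grade level (US school grade)."""
    return 0.39 * words / sentences + 11.8 * syllables / words - 15.59

def smog(sentences, polysyllables):
    """SMOG grade; polysyllables = count of words with 3+ syllables."""
    return 1.0430 * sqrt(polysyllables * 30.0 / sentences) + 3.1291

def szigriszt(words, sentences, syllables):
    """Szigriszt-Pazos perspicuity index (Spanish); higher = easier."""
    return 206.835 - 62.3 * syllables / words - words / sentences

print(fkgl(words=1200, sentences=60, syllables=2100))   # 12.86: college level
```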

  1. Readability of patient education materials in ophthalmology: a single-institution study and systematic review.

    Science.gov (United States)

    Williams, Andrew M; Muir, Kelly W; Rosdahl, Jullia A

    2016-08-03

Patient education materials should be written at a level that is understandable for patients with low health literacy. The aims of this study are (1) to review the literature on readability of ophthalmic patient education materials and (2) to evaluate and revise our institution's patient education materials about glaucoma using evidence-based guidelines on writing for patients with low health literacy. A systematic search was conducted on the PubMed/MEDLINE database for studies that have evaluated readability level of ophthalmic patient education materials, and the reported readability scores were assessed. Additionally, we collected evidence-based guidelines for writing easy-to-read patient education materials, and these recommendations were applied to revise 12 patient education handouts on various glaucoma topics at our institution. Readability measures, including Flesch-Kincaid Grade Level (FKGL), and word count were calculated for the original and revised documents. The original and revised versions of the handouts were then scored in random order by two glaucoma specialists using the Suitability Assessment of Materials (SAM) instrument, a grading scale used to evaluate suitability of health information materials for patients. Paired t test was used to analyze changes in readability measures, word count, and SAM score between original and revised handouts. Finally, five glaucoma patients were interviewed to discuss the revised materials, and patient feedback was analyzed qualitatively. Our literature search included 13 studies that evaluated a total of 950 educational materials. Among the mean FKGL readability scores reported in these studies, the median was 11 (representing an eleventh-grade reading level). At our institution, handouts' readability averaged a tenth-grade reading level (FKGL = 10.0 ± 1.6), but revising the handouts improved their readability to a sixth-grade reading level (FKGL = 6.4 ± 1.2) (p < 0.05), with improvements in both the readability and suitability of

  2. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed; Meier, Stuart Kurt

    2013-01-01

A number of studies have shown that functionally related genes are often co-expressed and that computationally based co-expression analysis can be used to accurately identify functional relationships between genes and, by inference, their encoded proteins. Here we describe how a computationally based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.
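
    A minimal sketch of the core co-expression step, assuming a genes-by-samples expression matrix (random stand-in data here) and ranking genes by Pearson correlation with the gene of interest:

```python
# Rank all genes by Pearson correlation with a gene of interest across
# samples; random stand-in data replace a real expression matrix.
import numpy as np

rng = np.random.default_rng(0)
expr = rng.normal(size=(5000, 40))            # genes x samples (stand-in)
target = 123                                  # index of the gene of interest

xc = expr - expr.mean(axis=1, keepdims=True)  # center each gene's profile
yc = xc[target]
r = (xc @ yc) / (np.linalg.norm(xc, axis=1) * np.linalg.norm(yc))

top = np.argsort(-r)[1:11]                    # 10 strongest co-expressed genes
print(top)
```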

  3. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed

    2013-09-03

A number of studies have shown that functionally related genes are often co-expressed and that computationally based co-expression analysis can be used to accurately identify functional relationships between genes and, by inference, their encoded proteins. Here we describe how a computationally based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.

  4. Computational Nuclear Physics and Post Hartree-Fock Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lietz, Justin [Michigan State University; Sam, Novario [Michigan State University; Hjorth-Jensen, M. [University of Oslo, Norway; Hagen, Gaute [ORNL; Jansen, Gustav R. [ORNL

    2017-05-01

    We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10 and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, allowing thereby an eventual reader to start writing her/his own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.

  5. Multi-Level iterative methods in computational plasma physics

    International Nuclear Information System (INIS)

    Knoll, D.A.; Barnes, D.C.; Brackbill, J.U.; Chacon, L.; Lapenta, G.

    1999-01-01

Plasma physics phenomena occur on a wide range of spatial scales and on a wide range of time scales. When attempting to model plasma physics problems numerically the authors are inevitably faced with the need for both fine spatial resolution (fine grids) and implicit time integration methods. Fine grids can tax the efficiency of iterative methods and large time steps can challenge the robustness of iterative methods. To meet these challenges they are developing a hybrid approach where multigrid methods are used as preconditioners to Krylov subspace based iterative methods such as conjugate gradients or GMRES. For nonlinear problems they apply multigrid preconditioning to a matrix-free Newton-GMRES method. Results are presented for application of these multilevel iterative methods to the field solves in implicit moment method PIC, multidimensional nonlinear Fokker-Planck problems, and their initial efforts in particle MHD
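
    As a small illustration of the matrix-free Newton-GMRES building block (without the multigrid preconditioner the authors add), SciPy's newton_krylov can solve a toy 1-D nonlinear diffusion problem:

```python
# Matrix-free Newton-GMRES on a toy 1-D nonlinear diffusion problem:
# -u'' + u**3 = 1 with zero Dirichlet boundary conditions.
import numpy as np
from scipy.optimize import newton_krylov

n = 64
h = 1.0 / (n + 1)

def residual(u):
    up = np.r_[0.0, u, 0.0]                       # pad with boundary values
    lap = (up[2:] - 2.0 * up[1:-1] + up[:-2]) / h**2
    return -lap + u**3 - 1.0

u = newton_krylov(residual, np.zeros(n), method="gmres")
print("max |residual| =", np.abs(residual(u)).max())
```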

  6. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  7. Easy computer assisted teaching method for undergraduate surgery

    OpenAIRE

    Agrawal, Vijay P

    2015-01-01

Use of computers to aid or support the education or training of people has become commonplace in medical education. Recent studies have shown that it can improve learning outcomes in diagnostic abilities, clinical skills and knowledge across different learner levels from undergraduate medical education to continuing medical education. It also enhances the educational process by increasing access to learning materials, standardising the educational process, providing opportunities for asynchron...

  8. The cell method a purely algebraic computational method in physics and engineering

    CERN Document Server

    Ferretti, Elena

    2014-01-01

    The Cell Method (CM) is a computational tool that maintains critical multidimensional attributes of physical phenomena in analysis. This information is neglected in the differential formulations of the classical approaches of finite element, boundary element, finite volume, and finite difference analysis, often leading to numerical instabilities and spurious results. This book highlights the central theoretical concepts of the CM that preserve a more accurate and precise representation of the geometric and topological features of variables for practical problem solving. Important applications occur in fields such as electromagnetics, electrodynamics, solid mechanics and fluids. CM addresses non-locality in continuum mechanics, an especially important circumstance in modeling heterogeneous materials. Professional engineers and scientists, as well as graduate students, are offered: A general overview of physics and its mathematical descriptions; Guidance on how to build direct, discrete formulations; Coverag...

  9. Methods and systems for identifying ligand-protein binding sites

    KAUST Repository

    Gao, Xin; Naveed, Hammad

    2016-01-01

The invention provides a novel integrated structure- and system-based approach for drug target prediction that enables the large-scale discovery of new targets for existing drugs. Novel computer-readable storage media and computer systems are also

  10. Stable numerical method in computation of stellar evolution

    International Nuclear Information System (INIS)

    Sugimoto, Daiichiro; Eriguchi, Yoshiharu; Nomoto, Ken-ichi.

    1982-01-01

To compute the stellar structure and evolution in different stages, such as (1) red-giant stars in which the density and density gradient change over quite wide ranges, (2) rapid evolution with neutrino loss or unstable nuclear flashes, (3) hydrodynamical stages of star formation or supernova explosion, (4) transition phases from quasi-static to dynamical evolutions, (5) mass-accreting or losing stars in binary-star systems, and (6) evolution of a stellar core whose mass is increasing by shell burning or decreasing by penetration of the convective envelope into the core, we face "multi-timescale problems" which can be treated neither by a simple-minded explicit scheme nor by an implicit one. This problem has been resolved by three prescriptions: one by introducing a hybrid scheme suitable for the multi-timescale problems of quasi-static evolution with heat transport, another by introducing a hybrid scheme suitable for the multi-timescale problems of hydrodynamic evolution, and the third by introducing the Eulerian or, in other words, the mass fraction coordinate for evolution with changing mass. When all of them are combined in a single computer code, we can compute numerically stably any phase of stellar evolution including transition phases, as far as the star is spherically symmetric. (author)

  11. Unconventional methods of imaging: computational microscopy and compact implementations

    Science.gov (United States)

    McLeod, Euan; Ozcan, Aydogan

    2016-07-01

    In the past two decades or so, there has been a renaissance of optical microscopy research and development. Much work has been done in an effort to improve the resolution and sensitivity of microscopes, while at the same time to introduce new imaging modalities, and make existing imaging systems more efficient and more accessible. In this review, we look at two particular aspects of this renaissance: computational imaging techniques and compact imaging platforms. In many cases, these aspects go hand-in-hand because the use of computational techniques can simplify the demands placed on optical hardware in obtaining a desired imaging performance. In the first main section, we cover lens-based computational imaging, in particular, light-field microscopy, structured illumination, synthetic aperture, Fourier ptychography, and compressive imaging. In the second main section, we review lensfree holographic on-chip imaging, including how images are reconstructed, phase recovery techniques, and integration with smart substrates for more advanced imaging tasks. In the third main section we describe how these and other microscopy modalities have been implemented in compact and field-portable devices, often based around smartphones. Finally, we conclude with some comments about opportunities and demand for better results, and where we believe the field is heading.

  12. Analysis of multigrid methods on massively parallel computers: Architectural implications

    Science.gov (United States)

    Matheson, Lesley R.; Tarjan, Robert E.

    1993-01-01

We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block structured grids of size 10^6 and 10^9, respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message passing, 'workstation size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation derived parameters. With the medium grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable length message capability, our analysis suggests the low diameter multistage networks provide little or no advantage over a simple single stage communications network.

  13. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Randal Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-22

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  14. An historical survey of computational methods in optimal control.

    Science.gov (United States)

    Polak, E.

    1973-01-01

    Review of some of the salient theoretical developments in the specific area of optimal control algorithms. The first algorithms for optimal control were aimed at unconstrained problems and were derived by using first- and second-variation methods of the calculus of variations. These methods have subsequently been recognized as gradient, Newton-Raphson, or Gauss-Newton methods in function space. A much more recent addition to the arsenal of unconstrained optimal control algorithms are several variations of conjugate-gradient methods. At first, constrained optimal control problems could only be solved by exterior penalty function methods. Later algorithms specifically designed for constrained problems have appeared. Among these are methods for solving the unconstrained linear quadratic regulator problem, as well as certain constrained minimum-time and minimum-energy problems. Differential-dynamic programming was developed from dynamic programming considerations. The conditional-gradient method, the gradient-projection method, and a couple of feasible directions methods were obtained as extensions or adaptations of related algorithms for finite-dimensional problems. Finally, the so-called epsilon-methods combine the Ritz method with penalty function techniques.

  15. Computation of Optimal Monotonicity Preserving General Linear Methods

    KAUST Repository

    Ketcheson, David I.

    2009-07-01

    Monotonicity preserving numerical methods for ordinary differential equations prevent the growth of propagated errors and preserve convex boundedness properties of the solution. We formulate the problem of finding optimal monotonicity preserving general linear methods for linear autonomous equations, and propose an efficient algorithm for its solution. This algorithm reliably finds optimal methods even among classes involving very high order accuracy and that use many steps and/or stages. The optimality of some recently proposed methods is verified, and many more efficient methods are found. We use similar algorithms to find optimal strong stability preserving linear multistep methods of both explicit and implicit type, including methods for hyperbolic PDEs that use downwind-biased operators.
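
    For orientation, the monotonicity-preservation framework that such optimization targets, in its standard textbook form (generic notation, not the paper's): if forward Euler is monotone under a step-size restriction, a method with SSP coefficient C inherits monotonicity under a C-times larger step, and the optimization problem is to maximize C over the coefficients of the method class.

```latex
% Forward-Euler monotonicity under a step-size restriction,
\| u + \Delta t \, F(u) \| \;\le\; \| u \|
  \qquad \text{for } 0 \le \Delta t \le \Delta t_{\mathrm{FE}},
% is inherited by a method with SSP coefficient $\mathcal{C}$:
\| u^{n+1} \| \;\le\; \| u^{n} \|
  \qquad \text{for } 0 \le \Delta t \le \mathcal{C} \, \Delta t_{\mathrm{FE}}.
```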

  16. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks.

  17. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  18. Improved methods for computing masses from numerical simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kronfeld, A.S.

    1989-11-22

An important advance in the computation of hadron and glueball masses has been the introduction of non-local operators. This talk summarizes the critical signal-to-noise ratio of glueball correlation functions in the continuum limit, and discusses the case of (q q̄ and qqq) hadrons in the chiral limit. A new strategy for extracting the masses of excited states is outlined and tested. The lessons learned here suggest that gauge-fixed momentum-space operators might be a suitable choice of interpolating operators. 15 refs., 2 tabs.
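
    A standard extraction step in this setting, sketched on synthetic data: the effective mass of a Euclidean two-point correlator plateaus at the ground-state mass once excited-state contamination dies off.

```python
# Effective mass from a Euclidean two-point correlator C(t):
# m_eff(t) = log(C(t)/C(t+1)) plateaus at the ground-state mass once the
# excited state has decayed. Synthetic two-state data, for illustration.
import numpy as np

t = np.arange(32)
C = 1.0 * np.exp(-0.25 * t) + 0.4 * np.exp(-0.60 * t)   # ground + excited

m_eff = np.log(C[:-1] / C[1:])
print(m_eff[-5:])        # approaches the ground-state mass 0.25
```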

  19. Advanced Computational Methods for Thermal Radiative Heat Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.,

    2016-10-01

Participating media radiation (PMR) calculations in weapon safety analyses for abnormal thermal environments are too costly to perform routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.

  20. The null-event method in computer simulation

    International Nuclear Information System (INIS)

    Lin, S.L.

    1978-01-01

The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperatures is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of the second particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation and reduces the likelihood of logical error. (Auth.)
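
    A minimal sketch of the null-event idea, with toy rates and dynamics (the majorant nu_max, the rate nu(v), the field strength, and the thermal-bath collision model are all assumptions for illustration):

```python
# Null-event sampling: pad the state-dependent collision rate nu(v) up to
# a constant majorant nu_max, draw waiting times from the simple
# exponential with rate nu_max, and accept each event as a real collision
# with probability nu(v)/nu_max; otherwise it is a null event that leaves
# the state unchanged. Rates, field, and bath model are toy assumptions.
import numpy as np

rng = np.random.default_rng(1)
nu_max = 5.0                         # constant majorant rate
nu = lambda v: 1.0 + abs(v)          # state-dependent true rate (toy)
E, t, v = 0.5, 0.0, 0.0              # field strength, time, ion velocity

for _ in range(100_000):
    dt = rng.exponential(1.0 / nu_max)
    t += dt
    v += E * dt                      # free flight in the external field
    if rng.random() < min(1.0, nu(v) / nu_max):
        v = rng.normal(0.0, 1.0)     # real collision: thermal-bath resample
    # else: null event, nothing to update

print(f"simulated time {t:.1f}, final velocity {v:.3f}")
```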

  1. Lattice QCD computations: Recent progress with modern Krylov subspace methods

    Energy Technology Data Exchange (ETDEWEB)

    Frommer, A. [Bergische Universitaet GH Wuppertal (Germany)

    1996-12-31

Quantum chromodynamics (QCD) is the fundamental theory of the strong interaction of matter. In order to compare the theory with results from experimental physics, the theory has to be reformulated as a discrete problem of lattice gauge theory using stochastic simulations. The computational challenge consists in solving several hundreds of very large linear systems with several right hand sides. A considerable part of the world's supercomputer time is spent in such QCD calculations. This paper presents results on solving systems for the Wilson fermions. Recent progress is reviewed on algorithms obtained in cooperation with partners from theoretical physics.

  2. Permeability computation on a REV with an immersed finite element method

    International Nuclear Information System (INIS)

    Laure, P.; Puaux, G.; Silva, L.; Vincent, M.

    2011-01-01

An efficient method to compute the permeability of fibrous media is presented. An immersed domain approach is used to represent the porous material at its microscopic scale and the flow motion is computed with a stabilized mixed finite element method. Therefore the Stokes equation is solved on the whole domain (including the solid part) using a penalty method. The accuracy is controlled by refining the mesh around the solid-fluid interface defined by a level set function. Using homogenisation techniques, the permeability of a representative elementary volume (REV) is computed. The computed permeabilities of regular fibre packings are compared to classical analytical relations found in the bibliography.
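
    The final homogenisation step reduces to Darcy's law once the averaged velocity is known. A sketch with placeholder numbers standing in for the outputs of the Stokes solve:

```python
# Darcy's law closes the homogenisation step: K = mu * <u> * L / dP.
# The numbers below are placeholders for outputs of the Stokes solve.
mu = 1.0e-3      # fluid viscosity [Pa.s] (assumed)
u_avg = 2.4e-4   # volume-averaged velocity through the REV [m/s]
L = 1.0e-3       # REV length in the flow direction [m]
dP = 10.0        # imposed pressure drop across the REV [Pa]

K = mu * u_avg * L / dP
print(f"permeability K = {K:.3e} m^2")
```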

  3. Higher-Order Integral Equation Methods in Computational Electromagnetics

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Meincke, Peter

    Higher-order integral equation methods have been investigated. The study has focused on improving the accuracy and efficiency of the Method of Moments (MoM) applied to electromagnetic problems. A new set of hierarchical Legendre basis functions of arbitrary order is developed. The new basis...

  4. A computational method for the solution of one-dimensional ...

    Indian Academy of Sciences (India)

embedding parameter p ∈ [0, 1], which is considered as a 'small parameter'. Considerable research work has recently been conducted in applying this method to a class of linear and nonlinear equations. This method was further developed and improved by He, and applied to nonlinear oscillators with discontinuities [1], ...

  5. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    Science.gov (United States)

    Beltran, Adriana; Salvador, James

    1997-01-01

In this paper, we show how methods developed for solving the theoretical computer science problem of graph isomorphism are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.
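
    As a rough modern analogue (not the Ulam index itself), graph-isomorphism machinery can flag distinct chemical skeletons. A sketch using networkx's Weisfeiler-Lehman hash, which distinguishes many but not all non-isomorphic graphs:

```python
# Hash molecular graphs with the Weisfeiler-Lehman procedure; differing
# hashes prove non-isomorphism, equal hashes only suggest a match (WL is
# not a complete isomorphism test). A stand-in for the Ulam index itself.
import networkx as nx

def skeleton_hash(bonds):
    return nx.weisfeiler_lehman_graph_hash(nx.Graph(bonds))

butane = [(0, 1), (1, 2), (2, 3)]        # linear C4 skeleton
isobutane = [(0, 1), (1, 2), (1, 3)]     # branched isomer

print(skeleton_hash(butane) == skeleton_hash(isobutane))   # False
```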

  6. Computation Method Comparison for Th Based Seed-Blanket Cores

    International Nuclear Information System (INIS)

    Kolesnikov, S.; Galperin, A.; Shwageraus, E.

    2004-01-01

This work compares two methods for calculating a given nuclear fuel cycle in the WASB configuration; both use the ELCOS Code System (2-D transport code BOXER and 3-D nodal code SILWER) [4]. In the first method, the cross-sections of the Seed and Blanket, needed for the 3-D nodal code, are generated separately for each region by the 2-D transport code. In the second method, the cross-sections of the Seed and Blanket are generated from Seed-Blanket Colorsets (Fig. 1) calculated by the 2-D transport code. The evaluation of the error introduced by the first method is the main objective of the present study

  7. Numerical computation of FCT equilibria by inverse equilibrium method

    International Nuclear Information System (INIS)

    Tokuda, Shinji; Tsunematsu, Toshihide; Takeda, Tatsuoki

    1986-11-01

FCT (Flux Conserving Tokamak) equilibria were obtained numerically by the inverse equilibrium method. The high-beta tokamak ordering was used to get the explicit boundary conditions for FCT equilibria. The partial differential equation was reduced to a set of simultaneous quasi-linear ordinary differential equations by using the moment method. The regularity conditions for solutions at the singular point of the equations can be expressed correctly by this reduction, and the problem to be solved becomes a tractable boundary value problem on the quasi-linear ordinary differential equations. This boundary value problem was solved by the method of quasi-linearization, one of the shooting methods. Test calculations show that this method provides high-beta tokamak equilibria with sufficiently high accuracy for MHD stability analysis. (author)
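
    A generic illustration of the shooting idea (a toy pendulum boundary value problem, not the FCT equations): integrate from one boundary with a guessed slope and root-find on the mismatch at the other boundary.

```python
# Shooting on a toy pendulum boundary value problem: u'' = -sin(u),
# u(0) = 0, u(1) = 1. Integrate from x = 0 with a guessed slope s and
# root-find on the mismatch at x = 1.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def mismatch(s):
    sol = solve_ivp(lambda x, y: [y[1], -np.sin(y[0])],
                    (0.0, 1.0), [0.0, s], rtol=1e-9)
    return sol.y[0, -1] - 1.0            # u(1) - 1

s_star = brentq(mismatch, 0.5, 2.0)      # bracket chosen by inspection
print("required initial slope u'(0) =", s_star)
```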

8. Classification of computer forensics methods using a graph theory approach

    Directory of Open Access Journals (Sweden)

    Anna Ravilyevna Smolina

    2016-06-01

Full Text Available A classification of computer forensics methods based on a graph theory approach is proposed. Using this classification, the search for a suitable method of computer forensic investigation can be accelerated, simplified, and automated.

9. Classification of computer forensics methods using a graph theory approach

    OpenAIRE

    Anna Ravilyevna Smolina; Alexander Alexandrovich Shelupanov

    2016-01-01

A classification of computer forensics methods based on a graph theory approach is proposed. Using this classification, the search for a suitable method of computer forensic investigation can be accelerated, simplified, and automated.

  10. Reference interval computation: which method (not) to choose?

    Science.gov (United States)

    Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C

    2012-07-11

When different methods are applied to reference interval (RI) calculation the results can sometimes be substantially different, especially for small reference groups. If there are no reliable RI data available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, results of all 3 methods were within 3% of the true reference value. For other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way of RI calculation, provided it satisfies a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
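
    A sketch of the two recommended approaches on synthetic log-normal data (n = 120), contrasting a percentile bootstrap of the central 95% interval with a Box-Cox-transformed parametric interval:

```python
# Reference intervals two ways on synthetic log-normal data (n = 120):
# a percentile bootstrap versus a Box-Cox-transformed parametric interval.
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(42)
x = rng.lognormal(mean=1.0, sigma=0.3, size=120)     # one marker (synthetic)

# Nonparametric bootstrap of the central 95% interval
boots = np.array([np.percentile(rng.choice(x, size=x.size), [2.5, 97.5])
                  for _ in range(2000)])
ri_boot = boots.mean(axis=0)

# Box-Cox transformed parametric interval, mapped back to original units
z, lam = stats.boxcox(x)
lo, hi = np.mean(z) + np.array([-1.96, 1.96]) * np.std(z, ddof=1)
ri_param = inv_boxcox(np.array([lo, hi]), lam)

print("bootstrap RI:", ri_boot, " transformed parametric RI:", ri_param)
```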

  11. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions

    International Nuclear Information System (INIS)

    Dragt, A.J.; Gluckstern, R.L.

    1992-11-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group carries out research in two broad areas: the computation of charged particle beam transport using Lie algebraic methods and advanced methods for the computation of electromagnetic fields and beam-cavity interactions. Important improvements in the state of the art are believed to be possible in both of these areas. In addition, applications of these methods are made to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. The Lie algebraic method of computing and analyzing beam transport handles both linear and nonlinear beam elements. Tests show this method to be superior to the earlier matrix or numerical integration methods. It has wide application to many areas including accelerator physics, intense particle beams, ion microprobes, high resolution electron microscopy, and light optics. With regard to the area of electromagnetic fields and beam cavity interactions, work is carried out on the theory of beam breakup in single pulses. Work is also done on the analysis of the high frequency behavior of longitudinal and transverse coupling impedances, including the examination of methods which may be used to measure these impedances. Finally, work is performed on the electromagnetic analysis of coupled cavities and on the coupling of cavities to waveguides

  12. Computational methods for constructing protein structure models from 3D electron microscopy maps.

    Science.gov (United States)

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2013-10-01

    Protein structure determination by cryo-electron microscopy (EM) has made significant progress in the past decades. Resolutions of EM maps have been improving as evidenced by recently reported structures that are solved at high resolutions close to 3Å. Computational methods play a key role in interpreting EM data. Among many computational procedures applied to an EM map to obtain protein structure information, in this article we focus on reviewing computational methods that model protein three-dimensional (3D) structures from a 3D EM density map that is constructed from two-dimensional (2D) maps. The computational methods we discuss range from de novo methods, which identify structural elements in an EM map, to structure fitting methods, where known high resolution structures are fit into a low-resolution EM map. A list of available computational tools is also provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Interpolation method by whole body computed tomography, Artronix 1120

    International Nuclear Information System (INIS)

    Fujii, Kyoichi; Koga, Issei; Tokunaga, Mitsuo

    1981-01-01

Reconstruction of whole body CT images by an interpolation method was investigated with rapid scanning. An Artronix 1120 with a fixed collimator was used to obtain CT images every 5 mm. The x-ray source was circularly movable to obtain a beam perpendicular to the detector. A length of 150 mm was scanned in about 15 min, with a slice width of 5 mm. The images were reproduced every 7.5 mm, which could be reduced to every 1.5 mm when necessary. Out of 420 inspections of the chest, abdomen, and pelvis, 5 representative cases for which this method was valuable were described. The cases were fibrous histiocytoma of the upper mediastinum, left adrenal adenoma, left ureter fibroma, recurrence of colon cancer in the pelvis, and abscess around the rectum. This method improved the image quality of lesions in the vicinity of the ureters, main artery, and rectum. The time required and exposure dose were reduced to 50% by this method. (Nakanishi, T.)

  14. A new method for computing the quark-gluon vertex

    International Nuclear Information System (INIS)

    Aguilar, A C

    2015-01-01

In this talk we present a new method for determining the nonperturbative quark-gluon vertex, which constitutes a crucial ingredient for a variety of theoretical and phenomenological studies. This new method relies heavily on the exact all-order relation connecting the conventional quark-gluon vertex with the corresponding vertex of the background field method, which is Abelian-like. The longitudinal part of this latter quantity is fixed using the standard gauge technique, whereas the transverse part is estimated with the help of the so-called transverse Ward identities. This method allows the approximate determination of the nonperturbative behavior of all twelve form factors comprising the quark-gluon vertex, for arbitrary values of the momenta. Numerical results are presented for the form factors in three special kinematical configurations (soft gluon and quark symmetric limits, zero quark momentum), and compared with the corresponding lattice data. (paper)

  15. Medical Data Probabilistic Analysis by Optical Computing Methods

    Directory of Open Access Journals (Sweden)

    Alexander LARKIN

    2014-06-01

Full Text Available The purpose of this article is to show that coherent laser photonics methods can be used for the classification of medical information. It is shown that holographic methods can be used not only for work with images: they can also process information presented in a universal multi-parametric form. Along with the usual correlation algorithm, a number of classification algorithms can be realized: searching for a precedent, Hamming-distance measurement, a Bayesian probability algorithm, and deterministic and "correspondence" algorithms. Significantly, this preserves all the advantages of the holographic method – speed, two-dimensionality, record-breaking memory capacity, flexibility of data processing and representation of results, and high radiation resistance in comparison with electronic equipment. As an example, the result of solving one problem of medical diagnostics is presented: a forecast of the state of an organism after massive traumatic lesions.
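
    One of the listed algorithms, nearest-precedent search by Hamming distance, in ordinary code rather than optics (the binary encodings below are made up):

```python
# Nearest-precedent classification by Hamming distance over binary-encoded
# multi-parametric records; the encodings below are made up.
import numpy as np

precedents = {
    "mild":     np.array([0, 0, 1, 0, 1, 0, 0, 1]),
    "moderate": np.array([0, 1, 1, 0, 1, 1, 0, 1]),
    "severe":   np.array([1, 1, 1, 1, 0, 1, 1, 1]),
}
patient = np.array([0, 1, 1, 0, 1, 1, 0, 0])

def hamming(a, b):
    return int(np.count_nonzero(a != b))

label = min(precedents, key=lambda k: hamming(precedents[k], patient))
print("closest precedent:", label)   # "moderate" (distance 1)
```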

  16. Readability of Online Patient Educational Resources Found on NCI-Designated Cancer Center Web Sites.

    Science.gov (United States)

    Rosenberg, Stephen A; Francis, David; Hullett, Craig R; Morris, Zachary S; Fisher, Michael M; Brower, Jeffrey V; Bradley, Kristin A; Anderson, Bethany M; Bassetti, Michael F; Kimple, Randall J

    2016-06-01

The NIH and Department of Health & Human Services recommend online patient information (OPI) be written at a sixth grade level. We used a panel of readability analyses to assess OPI from NCI-Designated Cancer Center (NCIDCC) Web sites. Cancer.gov was used to identify 68 NCIDCC Web sites from which we collected both general OPI and OPI specific to breast, prostate, lung, and colon cancers. This text was analyzed by 10 commonly used readability tests: the New Dale-Chall Readability Formula, Flesch Reading Ease scale, Flesch-Kincaid Grade Level, FORCAST scale, Fry Readability Graph, Simple Measure of Gobbledygook test, Gunning Frequency of Gobbledygook index, New Fog Count, Raygor Readability Estimate Graph, and Coleman-Liau Index. We tested the hypothesis that the readability of NCIDCC OPI was written at the sixth grade level. Secondary analyses were performed to compare readability of OPI between comprehensive and noncomprehensive centers, by region, and to OPI produced by the American Cancer Society (ACS). A mean of 30,507 words from 40 comprehensive and 18 noncomprehensive NCIDCCs was analyzed (7 nonclinical and 3 without appropriate OPI were excluded). Using a composite grade level score, the mean readability score of 12.46 (ie, college level; 95% CI, 12.13-12.79) was significantly greater than the target grade level of 6 (middle school; P<.05), as were the scores on all individual readability metrics (P<.05). ACS OPI provides easier language, at the seventh to ninth grade level, across all tests (P<.01). OPI from NCIDCC Web sites is more complex than recommended for the average patient. Copyright © 2016 by the National Comprehensive Cancer Network.

  17. Readability of Patient Education Materials in Hand Surgery and Health Literacy Best Practices for Improvement.

    Science.gov (United States)

    Hadden, Kristie; Prince, Latrina Y; Schnaekel, Asa; Couch, Cory G; Stephenson, John M; Wyrick, Theresa O

    2016-08-01

    This study aimed to update a portion of a 2008 study of patient education materials from the American Society for Surgery of the Hand Web site with new readability results, to compare the results to health literacy best practices, and to make recommendations to the field for improvement. A sample of 77 patient education documents were downloaded from the American Society for Surgery of the Hand Web site, handcare.org, and assessed for readability using 4 readability tools. Mean readability grade-level scores were derived. Best practices for plain language for written health materials were compiled from 3 government agency sources. The mean readability of the 77 patient education documents in the study was 9.3 grade level. This reading level is reduced from the previous study in 2008 in which the overall mean was 10.6; however, the current sample grade level still exceeds recommended readability according to best practices. Despite a small body of literature on the readability of patient education materials related to hand surgery and other orthopedic issues over the last 7 years, readability was not dramatically improved in our current sample. Using health literacy as a framework, improvements in hand surgery patient education may result in better understanding and better outcomes for patients seeing hand surgeons. Improved understanding of patient education materials related to hand surgery may improve preventable negative outcomes that are clinically significant as well as contribute to improved quality of life for patients. Copyright © 2016 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  18. Readability assessment of patient education materials on major otolaryngology association websites.

    Science.gov (United States)

    Eloy, Jean Anderson; Li, Shawn; Kasabwala, Khushabu; Agarwal, Nitin; Hansberry, David R; Baredes, Soly; Setzen, Michael

    2012-11-01

    Various otolaryngology associations provide Internet-based patient education material (IPEM) to the general public. However, this information may be written above the fourth- to sixth-grade reading level recommended by the American Medical Association (AMA) and National Institutes of Health (NIH). The purpose of this study was to assess the readability of otolaryngology-related IPEMs on various otolaryngology association websites and to determine whether they are above the recommended reading level for patient education materials. Analysis of patient education materials from 9 major otolaryngology association websites. The readability of 262 otolaryngology-related IPEMs was assessed with 8 numerical and 2 graphical readability tools. Averages were evaluated against national recommendations and between each source using analysis of variance (ANOVA) with post hoc Tukey's honestly significant difference (HSD) analysis. Mean readability scores for each otolaryngology association website were compared. Mean website readability scores using Flesch Reading Ease test, Flesch-Kincaid Grade Level, Coleman-Liau Index, SMOG grading, Gunning Fog Index, New Dale-Chall Readability Formula, FORCAST Formula, New Fog Count Test, Raygor Readability Estimate, and the Fry Readability Graph ranged from 20.0 to 57.8, 9.7 to 17.1, 10.7 to 15.9, 11.6 to 18.2, 10.9 to 15.0, 8.6 to 16.0, 10.4 to 12.1, 8.5 to 11.8, 10.5 to 17.0, and 10.0 to 17.0, respectively. ANOVA results indicate a significant difference (P < .05) between the websites for each individual assessment. The IPEMs found on all otolaryngology association websites exceed the recommended fourth- to sixth-grade reading level.

  19. How Readable Is BPH Treatment Information on the Internet? Assessing Barriers to Literacy in Prostate Health.

    Science.gov (United States)

    Koo, Kevin; Yap, Ronald L

    2017-03-01

Information about benign prostatic hyperplasia (BPH) has become increasingly accessible on the Internet. Though the ability to find such material is encouraging, its readability and impact on informing patient decision making are not known. To evaluate the readability of Internet-based information about BPH in the context of website ownership and Health on the Net certification, three search engines were queried daily for 1 month with BPH-related keywords. Website ownership data and Health on the Net certification status were verified. Three readability analyses were performed: SMOG test, Dale-Chall readability formula, and Fry readability graph. An adjusted SMOG calculation was performed to reduce overestimation from medical jargon. After a total of 270 searches, 52 websites met inclusion criteria. Mean SMOG grade was 10.6 (SD = 1.4) and 10.2 after adjustment. Mean Dale-Chall score was 9.1 (SD = 0.6), or Grades 13 to 15. Mean Fry graph coordinates (173 syllables, 5.1 sentences) corresponded to Grade 15. Seven sites (13%) were at or below the average adult reading level based on SMOG; none of the sites qualified based on the other tests. Readability was significantly poorer for academic versus commercial sites and for Health on the Net-certified versus noncertified sites. In conclusion, online information about BPH treatment markedly exceeds the reading comprehension of most U.S. adults. Websites maintained by academic institutions and certified by the Health on the Net standard have more difficult readability. Efforts to improve literacy with respect to urological health should target content readability independent of reliability.

  20. [Global analysis of the readability of the informed consent forms used in public hospitals of Spain].

    Science.gov (United States)

    Mariscal-Crespo, M I; Coronado-Vázquez, M V; Ramirez-Durán, M V

To analyse the readability of informed consent forms (ICF) used in public hospitals throughout Spain, with the aim of checking their function of providing comprehensible information to people who are making any health decision, no matter where they are in Spain. A descriptive study was performed on a total of 11,339 ICF received from all over Spanish territory, of which 1617 ICF were collected from 4 health portal web pages and the rest (9722) were received through email and/or telephone contact from March 2012 to February 2013. The readability level was studied using the Inflesz tool. A total of 372 ICF were selected and analysed using simple random sampling. The Inflesz scale and the Flesch-Szigriszt index were used to analyse the readability. The readability results showed that 62.4% of the ICF were rated as "a little difficult", 23.4% as "normal", and 13.4% as "very difficult". The highest readability means using the Flesch index were scored in Andalusia, with a mean of 56.99 (95% CI: 55.42-58.57), and Valencia, with a mean of 51.93 (95% CI: 48.4-55.52). The lowest readability means were in Galicia, with a mean of 40.77 (95% CI: 9.83-71.71), and Melilla, with a mean of 41.82 (95% CI: 35.5-48.14). The readability level of Spanish informed consent forms must be improved, because their scores on readability tools could not be classified as "normal". Furthermore, there was very wide variability among Spanish ICF, which shows a lack of equity in information access among Spanish citizens. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.